Compare commits

...

55 Commits

Author SHA1 Message Date
Leonardo de Moura
1d8dcd356c remove TODO 2026-01-29 02:24:23 +00:00
Leonardo de Moura
7ce66ec087 test: 2026-01-29 02:24:23 +00:00
Leonardo de Moura
d57870c02c implement +instances 2026-01-29 02:24:23 +00:00
Leonardo de Moura
4e8fb9d63a feat: add +instances
stage2 can be compiled without issues.
2026-01-29 02:24:23 +00:00
Leonardo de Moura
5ba29703ac chore: better stage2 fixes 2026-01-29 02:24:23 +00:00
Leonardo de Moura
2b99b6ab52 fix: stage2 2026-01-29 02:24:23 +00:00
Leonardo de Moura
7f679a7d60 a better fix 2026-01-29 02:24:23 +00:00
Leonardo de Moura
8080c0736f fix: stage2 2026-01-29 02:24:23 +00:00
Leonardo de Moura
b6644ddb03 chore: fix tests 2026-01-29 02:24:23 +00:00
Leonardo de Moura
08310eb4da feat: do not dsimp instances
This PR ensures `dsimp` does not "simplify" instances by default.
The old behavior can be retrieved by using
```
set_option backward.dsimp.instances true
```
Applying `dsimp` to instances creates non-standard instances, and this
creates all sorts of problems in Mathlib.
This modification is similar to
```
set_option backward.dsimp.proofs true
```
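A minimal sketch of opting back into the old behavior (the option name is taken from this PR; the surrounding context is illustrative):
```lean
set_option backward.dsimp.instances true
-- Below this point, `dsimp` behaves as before this change and may
-- "simplify" instances again.
```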
2026-01-29 02:24:22 +00:00
Leonardo de Moura
e4365abdf2 fix: use isClass? instead of binder annotation to identify instance parameters (#12172)
This PR fixes how we determine whether a function parameter is an
instance. Previously, we relied on binder annotations (e.g., `[Ring A]`
vs `{_ : Ring A}`) to make this determination. This is unreliable
because users legitimately use `{..}` binders for class types when the
instance is already available from context. For example:
```lean
structure OrdSet (α : Type) [Hashable α] [BEq α] where
  ...

def OrdSet.insert {_ : Hashable α} {_ : BEq α} (s : OrdSet α) (a : α) : OrdSet α :=
  ...
```

Here, `Hashable` and `BEq` are classes, but the `{..}` binder is
intentional: the instances come from `OrdSet`'s parameters, so type
class resolution is unnecessary.

The fix checks the parameter's *type* using `isClass?` rather than its
syntax, and caches this information in `FunInfo`. This affects several
subsystems:

- **Discrimination trees**: instance parameters should not be indexed
even if marked with `{..}`
- **Congruence lemma generation**: instances require special treatment
- **`grind` canonicalizer**: must ensure canonical instances

**Potential regressions**: automation may now behave differently in
cases where it previously misidentified instance parameters. For
example, a rewrite rule in `simp` that was not firing due to incorrect
indexing may now fire.

---------

Co-authored-by: Kim Morrison <kim@tqft.net>
Co-authored-by: Claude <noreply@anthropic.com>
2026-01-29 02:24:22 +00:00
Leonardo de Moura
9e18eea271 feat: add mkBackwardRuleFromExpr (#12205)
This PR adds `mkBackwardRuleFromExpr` to create backward rules from
expressions, complementing the existing `mkBackwardRuleFromDecl` which
only works with declaration names.

The new function enables creating backward rules from partially applied
terms. For example,
`` mkBackwardRuleFromExpr (mkApp (mkConst ``Exists.intro [1]) Nat.mkType) ``
creates a rule for `Exists.intro` with the type parameter fixed to
`Nat`, leaving only the witness and proof as subgoals.

The `levelParams` parameter supports universe polymorphism: when
creating a rule like `Prod.mk Nat` that should work at multiple universe
levels, the caller specifies which level parameters remain polymorphic.
The pattern's universe variables are then instantiated appropriately at
each application site.

Also refactors `Pattern.lean` to share code between declaration-based
and expression-based pattern creation, extracting `mkPatternFromType`
and `mkEqPatternFromType` as common helpers.
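A hedged sketch of calling the new function, based only on the description above (the result type, monad, exact signature, and level-list spelling are assumptions; the PR text writes the level list as `[1]`, spelled `[levelOne]` here so the sketch parses):
```lean
open Lean Meta in
-- Sketch: a rule for `Exists.intro` specialized to `Nat`. Treat
-- `mkBackwardRuleFromExpr` and its argument order as described in this
-- PR; everything else here is illustrative.
def natExistsIntroRule := do
  let e := mkApp (mkConst ``Exists.intro [levelOne]) (mkConst ``Nat)
  mkBackwardRuleFromExpr e (levelParams := [])
```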
2026-01-28 05:00:15 +00:00
Kim Morrison
fa4cd6d78c feat: add theorems relating find? with findIdx? and findFinIdx? (#12204)
This PR adds theorems showing the consistency between `find?` and the
various index-finding functions. The theorems establish bidirectional
relationships between finding elements and finding their indices.

**Forward direction** (find? in terms of index):
- `find?_eq_map_findFinIdx?_getElem`: `xs.find? p = (xs.findFinIdx? p).map (xs[·])`
- `find?_eq_bind_findIdx?_getElem?`: `xs.find? p = (xs.findIdx? p).bind (xs[·]?)`
- `find?_eq_getElem?_findIdx`: `xs.find? p = xs[xs.findIdx p]?`

**Reverse direction** (index in terms of find?):
- `findIdx?_eq_bind_find?_idxOf?`: `xs.findIdx? p = (xs.find? p).bind (xs.idxOf?)`
- `findFinIdx?_eq_bind_find?_finIdxOf?`: `xs.findFinIdx? p = (xs.find? p).bind (xs.finIdxOf?)`
- `findIdx_eq_getD_bind_find?_idxOf?`: `xs.findIdx p = ((xs.find? p).bind (xs.idxOf?)).getD xs.length`

All theorems are provided for `List`, `Array`, and `Vector` (where
applicable).
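As a small concrete check of the second equation on a specific list (evaluations only; they illustrate, not prove, the theorem):
```lean
#eval [1, 2, 3].find? (· > 1)                            -- some 2
#eval ([1, 2, 3].findIdx? (· > 1)).bind ([1, 2, 3][·]?)  -- some 2
```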

Requested at
https://leanprover.zulipchat.com/#narrow/channel/113488-general/topic/show.20that.20Array.2Efind.3F.20and.20Array.2EfindFinIdx.3F.20consistent/near/567340199

🤖 Prepared with Claude Code

Co-authored-by: Claude <noreply@anthropic.com>
2026-01-28 04:55:29 +00:00
Kim Morrison
e1b19198a9 feat: another grind_pattern for getElem?_pos (#11963)
This PR activates `getElem?_pos` more aggressively, triggered by `c[i]`.

- [x] depends on: #12176

🤖 Prepared with Claude Code

---------

Co-authored-by: Claude <noreply@anthropic.com>
2026-01-28 03:45:54 +00:00
Mac Malone
1590c9c9d9 chore: lake: fix tests on non-Linux platforms (#11955)
This PR fixes failures in the Lake tests on non-Linux platforms.
2026-01-28 03:32:49 +00:00
Kim Morrison
cd8280700d fix: create PR releases even when test suite fails (#12202)
This PR fixes an issue where PR releases were not created when the test
suite failed, even though the build artifacts were available. The
workflow now runs whenever a PR's CI completes, regardless of
success/failure, and relies on the artifact verification step to ensure
the necessary build artifacts exist before proceeding.

This allows developers to get PR toolchains and test against Mathlib
even when the Lean test suite has failures, as long as the build jobs
succeeded.

🤖 Prepared with Claude Code

Co-authored-by: Claude <noreply@anthropic.com>
2026-01-28 03:13:48 +00:00
Kim Morrison
2e779f79de fix: bump numInstances for delayed grind theorem instances (#12176)
This PR fixes a bug where delayed E-match theorem instances could cause
uniqueId collisions in the instance tracking map.

The `uniqueId` for theorem instances is generated using `numInstances`,
but this counter was only bumped for immediately activated instances
(`.ready` case), not for delayed instances (`.next` case). This caused
ID collisions:

1. Theorem A matches, becomes delayed, gets `uniqueId = N`
2. Counter isn't bumped (stays at N)
3. Theorem B matches next, gets `uniqueId = N` (same!)
4. B's entry overwrites A's entry in `instanceMap`
5. A's tracking is lost

This manifested as `grind?` and `finish?` producing `instantiate approx`
(meaning "we couldn't determine which theorems to use") instead of
proper `instantiate only [...]` with specific theorem lists.

The fix bumps `numInstances` for delayed instances too, ensuring each
theorem instance gets a truly unique ID.
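The bug pattern can be sketched abstractly in Lean (the structure and function here are illustrative, not the actual `grind` internals):
```lean
structure InstState where
  numInstances : Nat := 0

-- Sketch of the bug: the counter is only bumped on the `ready` path,
-- so a delayed instance and the next instance share the same ID.
-- The fix bumps `numInstances` on both paths.
def freshId (ready : Bool) (s : InstState) : Nat × InstState :=
  let id := s.numInstances
  if ready then
    (id, { s with numInstances := id + 1 })
  else
    (id, s)  -- buggy: the delayed case forgets to bump
```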

🤖 Prepared with Claude Code

Co-authored-by: Claude <noreply@anthropic.com>
2026-01-28 03:09:27 +00:00
Leonardo de Moura
50bbf101be test: delayed assignment performance issue (#12201)
This PR adds a benchmark that exposes a performance issue at
`instantiateMVars` when there are many nested delayed assignments.
2026-01-28 02:08:39 +00:00
Elazar Gershuni
90ba5f3f40 feat: add prefix and suffix map injectivity lemmas (#12108)
This PR adds `prefix_map_iff_of_injective` and
`suffix_map_iff_of_injective` lemmas to Init.Data.List.Nat.Sublist.

These lemmas establish that if a function `f` is injective, then the
prefix and suffix relations are preserved under mapping (e.g.,
`l₁.map f <+: l₂.map f ↔ l₁ <+: l₂`). These additions complement the
existing index-based lemmas in this file and allow for simpler
structural proofs without resorting to `take`, `drop`, or manual index
manipulation.
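A hedged sketch of what using one of the new lemmas might look like (the exact name, namespace, and argument form are assumptions based on the PR title):
```lean
example {α β : Type} {f : α → β} (hf : Function.Injective f)
    (l₁ l₂ : List α) : l₁.map f <+: l₂.map f ↔ l₁ <+: l₂ :=
  List.prefix_map_iff_of_injective hf  -- hypothetical exact form
```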
2026-01-27 22:54:16 +00:00
Lean stage0 autoupdater
cedc641fd5 chore: update stage0 2026-01-27 20:33:17 +00:00
Eric Wieser
afee8aa1c1 fix: avoid SIGFPE on x86_64 for ISize division overflow (#12110)
This PR fixes a SIGFPE crash on x86_64 when evaluating
`(ISize.minValue / -1 : ISize)`, filling an omission from #11624.

Closes #12097.
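The offending expression can be exercised directly; a sketch (the wrap-around result is an assumption based on two's-complement semantics, so no output is asserted here):
```lean
-- Before this fix, this evaluation could crash the process with SIGFPE
-- on x86_64, because hardware IDIV overflows for minValue / -1.
#eval (ISize.minValue / -1 : ISize)
```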
2026-01-27 19:36:07 +00:00
Henrik Böving
31e4eb62b7 perf: speed up compiler recompilation (#12196) 2026-01-27 18:50:58 +00:00
Marc Huisinga
4bcb3cea42 test: make interactive runner uri patching slightly more robust (#12194) 2026-01-27 16:23:46 +00:00
Joachim Breitner
36bae38e6b test: add big_struct_dep1 benchmark (#12191) 2026-01-27 14:36:09 +00:00
Markus Himmel
ba0e755adc feat: Std.Iter.first? (#12162)
This PR adds the function `Std.Iter.first?` and proves the specification
lemma `Std.Iter.first?_eq_match_step` under the assumption that the
iterator is productive.

The monadic variant on `Std.IterM` is also provided.

We use this new function to fix the default implementation for
`startsWith` and `dropPrefix` on `String` patterns, which used to fail
if the searcher returned a `skip` at the beginning. None of the patterns
we ship out of the box were affected by this, but user-defined patterns
were vulnerable.
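A hedged sketch of the new function on a simple iterator (the `Std.Iter` construction API shown here is an assumption, so no output is asserted):
```lean
open Std in
-- Sketch: `first?` returns the first value the iterator yields, if any.
#eval [1, 2, 3].iter.first?
```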

---------

Co-authored-by: Paul Reichert <6992158+datokrat@users.noreply.github.com>
2026-01-27 12:10:16 +00:00
Kim Morrison
596827c0e9 test: add regression test for structure-extends-class congr (#12187)
This PR adds regression tests that catch issues where structures/classes
with class-typed fields produce HEq goals in `congr` instead of handling
Prop fields automatically.

Both tests pass on v4.28.0-rc1 (before isInstance detection changes).

## Test 1: Structure extending classes (mirrors Mathlib's GroupTopology)

```lean
structure MyGroupTopology (α : Type) extends MyTopology α, IsContinuousMul α

theorem MyGroupTopology.toMyTopology_injective {α : Type} :
    Function.Injective (MyGroupTopology.toMyTopology : MyGroupTopology α → MyTopology α) := by
  intro f g h
  cases f
  cases g
  congr
```

**Failure mode:** `⊢ toIsContinuousMul✝¹ ≍ toIsContinuousMul✝`

## Test 2: Class with explicit class-typed field (mirrors Mathlib's
PseudoEMetricSpace)

```lean
class MyMetricSpace (α : Type) extends MyDist α where
  dist_self : ∀ x : α, dist x x = 0
  toMyUniformity : MyUniformity α  -- explicit class-typed field (NOT from extends)
  uniformity_dist : toMyUniformity.uniformity (fun x y => dist x y = 0)

protected theorem MyMetricSpace.ext {α : Type} {m m' : MyMetricSpace α}
    (h : m.toMyDist = m'.toMyDist) (hU : m.toMyUniformity = m'.toMyUniformity) : m = m' := by
  cases m
  cases m'
  congr 1 <;> assumption
```

**Failure mode:** `⊢ dist_self✝¹ ≍ dist_self✝` and
`⊢ uniformity_dist✝¹ ≍ uniformity_dist✝`

## Context

These tests are related to #12172, which changes instance parameter
detection from binder-based to `isClass?`-based. That change can affect
how structure fields are classified in congruence lemma generation.

🤖 Prepared with Claude Code

Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-27 11:57:18 +00:00
Marc Huisinga
3666544aad refactor: eliminate $/lean/ileanHeaderInfo (#12107) 2026-01-27 10:08:37 +00:00
Sebastian Graf
4ce04776b6 fix: do not assign synthetic opaque MVars in mspec (#12184)
This PR ensures that the `mspec` tactic does not assign synthetic opaque
MVars occurring in the goal, just like the `apply` tactic.
2026-01-27 10:05:20 +00:00
Marc Huisinga
621fdea272 refactor: eliminate FileIdent.mod (#12089) 2026-01-27 09:55:29 +00:00
Marc Huisinga
fb3aae7509 refactor: remove redundant calls to DocumentMeta.mod (#12085) 2026-01-27 09:02:20 +00:00
Kim Morrison
f11fffb27b doc: clarify release notes title format requirements (#12182)
This PR clarifies the release notes title format in the release
checklist documentation.

**Changes:**
- Add explicit section explaining title format for -rc1, subsequent RCs,
and stable releases
- Make it clear that titles should include both the RC suffix AND the
date (e.g., "Lean 4.7.0-rc1 (2024-03-15)")
- Update example to use realistic date format instead of YYYY-MM-DD
- Clarify that only content is written for -rc1, subsequent releases
just update the title

This addresses confusion about whether RC release notes should include
the date in the title.

---------

Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-27 08:08:47 +00:00
Sebastian Graf
7e1f9e24c4 chore: activate two mspec tests (#12183) 2026-01-27 07:52:56 +00:00
Kim Morrison
42a0e92453 doc: clarify release notes timing with reference-manual tags (#12171)
This PR documents an issue encountered during the v4.28.0-rc1 release:
if release notes are merged to the reference-manual repository AFTER the
version tag is created, the deployed documentation won't include them.

The fix is to either:
1. Include release notes in the same PR as the toolchain bump (or merge
before tagging)
2. Regenerate the tag after merging release notes

🤖 Prepared with Claude Code

Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-27 06:12:59 +00:00
Markus Himmel
d4c74b3566 fix: missing order instances for Int (#12181)
This PR adds two missing order instances for `Int`.

As reported on
[Zulip](https://leanprover.zulipchat.com/#narrow/channel/270676-lean4/topic/No.20Std.2EMaxOrEq.20Int.20instance.2C.20but.20yes.20Std.2EMinOrEq.20Int/near/570198709).
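Judging from the linked Zulip topic, the gap was presumably on the `Std.MaxOrEq Int` side; a sketch of checking both instances (instance names taken from the topic title, not verified):
```lean
#check (inferInstance : Std.MinOrEq Int)  -- already existed
#check (inferInstance : Std.MaxOrEq Int)  -- presumably added by this PR
```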
2026-01-27 05:42:30 +00:00
Kim Morrison
2e8afdf74d fix: use gh release create instead of action-gh-release (#12180)
This PR switches the PR release workflow from
`softprops/action-gh-release` to `gh release create`.

The `softprops/action-gh-release` action enumerates all releases to
check for existing ones, which fails when the repository has more than
10000 releases due to GitHub API pagination limits. The
`lean4-pr-releases` repository has accumulated over 10000 releases,
causing the PR release workflow to fail with:

```
Only the first 10000 results are available.
```

This is currently blocking all PR toolchain releases, including
https://github.com/leanprover/lean4/pull/12175.

🤖 Prepared with Claude Code

---------

Co-authored-by: Claude <noreply@anthropic.com>
2026-01-27 05:04:43 +00:00
Kim Morrison
c7f941076e fix: scope FamilyOut.fam_eq simp lemma to Lake namespace (#12178)
This PR scopes the `simp` attribute on `FamilyOut.fam_eq` to the `Lake`
namespace. The lemma has a very permissive discrimination tree key
(`_`), so when `Lake.Util.Family` is transitively imported into
downstream projects, it causes `simp` to attempt this lemma on every
goal, leading to timeouts.

See
https://leanprover.zulipchat.com/#narrow/channel/270676-lean4/topic/Lake.20.60FamilyOut.2Efam_eq.60.20leads.20to.20timeouts.3F

🤖 Prepared with Claude Code

---------

Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Mac Malone <tydeu@hatpress.net>
2026-01-27 04:24:08 +00:00
Kim Morrison
9185fd2a34 fix: correct comment about instance implicit arguments (#12173)
This PR fixes a comment that said "implicit arguments" when the code
actually checks `isInstImplicit`, which is specifically for instance
implicit arguments (`[...]` binders), not all implicit arguments.

🤖 Prepared with Claude Code

Co-authored-by: Claude <noreply@anthropic.com>
2026-01-27 01:33:55 +00:00
Kim Morrison
642863e8c5 chore: begin development cycle for v4.29.0 (#12169)
This PR bumps the version to 4.29.0 to begin the next development cycle
after v4.28.0-rc1.

🤖 Prepared with Claude Code

Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-26 23:31:37 +00:00
Leonardo de Moura
62d2688579 feat: eta-reduction support in SymM (#12168)
This PR adds support for eta-reduction in `SymM`.
2026-01-26 21:30:29 +00:00
Sebastian Graf
e8870da205 chore: improve performance of mpure_intro and mvcgen by avoiding whnfD (#12165)
New measurements:

```
goal_10: 181.910200 ms, kernel: 37.241050 ms
goal_20: 386.540215 ms, kernel: 83.497428 ms
goal_30: 648.282057 ms, kernel: 117.038447 ms
goal_40: 946.733191 ms, kernel: 168.369124 ms
goal_50: 1325.846873 ms, kernel: 223.838786 ms
goal_60: 1734.175705 ms, kernel: 285.594486 ms
goal_70: 2199.522317 ms, kernel: 351.659865 ms
goal_80: 2700.077802 ms, kernel: 428.303337 ms
goal_90: 3260.446641 ms, kernel: 515.976499 ms
goal_100: 3865.503733 ms, kernel: 600.229962 ms
```

Previously, goal_100 took 7.8s.
2026-01-26 17:58:33 +00:00
Lean stage0 autoupdater
a011c9c5dd chore: update stage0 2026-01-26 18:21:01 +00:00
Joachim Breitner
a6a3df8af0 perf: use .inj in proof of .injEq (#12164)
This PR uses the `.inj` theorem in the proof of one direction of the
`.injEq` theorem.
2026-01-26 14:50:32 +00:00
Sebastian Graf
b44c7e161c test: add two benchmarks for mvcgen in the style of SymM (#12163)
This PR adds two benchmarks for mvcgen in the style of Leo's SymM
benchmarks.

While performance on add_sub_cancel_StateM.lean is in the same order of
magnitude as the corresponding MetaM benchmark, add_if_sub_StateM.lean
is far slower.

Measurements for add_sub_cancel:
```
goal_10:   245.576221 ms, kernel: 134.134182 ms
goal_20:   613.945320 ms, kernel: 115.453811 ms
goal_30:  1074.053596 ms, kernel: 179.076070 ms
goal_40:  1680.678302 ms, kernel: 252.066677 ms
goal_50:  2457.209584 ms, kernel: 293.974096 ms
goal_60:  3271.773330 ms, kernel: 368.394386 ms
goal_70:  3981.247921 ms, kernel: 434.297822 ms
goal_80:  5077.300540 ms, kernel: 507.047772 ms
goal_90:  6486.990060 ms, kernel: 556.952095 ms
goal_100: 7791.399986 ms, kernel: 623.605163 ms
```

Measurements for add_if_sub:

```
goal_2: 89.762349 ms, kernel: 43.320205 ms
goal_3: 190.655546 ms, kernel: 38.888499 ms
goal_4: 434.461936 ms, kernel: 75.234581 ms
goal_5: 1110.295284 ms, kernel: 161.698707 ms
goal_6: 3241.383031 ms, kernel: 326.137173 ms
goal_7: 11675.609970 ms, kernel: 684.907188 ms
```

Much room for improvement.
2026-01-26 13:17:47 +00:00
Sebastian Graf
0bcac0d46c feat: add Option.of_wp_eq and Except.of_wp_eq (#12161)
This PR adds `Option.of_wp_eq` and `Except.of_wp_eq`, similar to the
existing `Except.of_wp`. `Except.of_wp` is deprecated because applying
it requires prior generalization, at which point it is more convenient
to use `Except.of_wp_eq`.
2026-01-26 12:50:23 +00:00
Lean stage0 autoupdater
1bf16f710e chore: update stage0 2026-01-26 12:17:07 +00:00
Henrik Böving
c3d753640a feat: use static initializers where possible (#12082)
This PR makes the compiler produce C code that statically initializes
closed terms when possible. This change reduces startup time, as the
terms are stored directly in the binary instead of being computed at
startup.

The set of terms currently supported by this mechanism is:
- string literals
- ctors called with other statically initializable arguments
- `Name.mkStrX` and other `Name` ctors, as they require special support
due to their computed field and occur frequently due to name literals.

In core there are currently 152,524 closed terms, and of these 103,929
(68%) get initialized statically with this PR. The remaining 48,585 are
not extracted because they use (potentially transitively) various
non-trivial pieces of code like `stringToMessageData` etc. We might
decide to add special support for these in the future, but for the
moment this feels like it's overfitting too much for core.
2026-01-26 11:22:12 +00:00
Joachim Breitner
e94ed002b5 perf: in FunInd, boldly do not check terms (#12160)
This PR removes calls to `check` that we expect to pass under normal
circumstances. This may be re-added later guarded by a `debug` option.
2026-01-26 11:22:00 +00:00
Sebastian Graf
7564329f06 fix: make Std.Do's post macro universe polymorphic (#12159)
This PR makes Std.Do's `post` macro universe polymorphic by expanding to
`PUnit.unit` instead of `()`.
2026-01-26 11:20:16 +00:00
Eric Wieser
0336a8385b chore: inline trace nodes (#11954)
This extracts a `postCallback` helper so that only the actual callback
is inlined.

Part of the motivation here is to exclude these tracing frames from
flame graph profiles.

---------

Co-authored-by: Sebastian Ullrich <sebasti@nullri.ch>
2026-01-26 08:51:25 +00:00
David Thrane Christiansen
c6e530a4f1 doc: add link to reference manual in stack overflow message (#12157)
This PR updates #12137 with a link to the Lean reference manual.

---------

Co-authored-by: Sebastian Ullrich <sebasti@nullri.ch>
2026-01-26 07:56:48 +00:00
Kim Morrison
97d427b32b doc: document release notes process and add guard check (#12158)
This PR documents the release notes writing process in detail and adds a
guard check to `release_checklist.py` to ensure release notes are
created for `-rc1` releases before proceeding with downstream repository
updates.

- **doc/dev/release_checklist.md**: Expanded "Writing the release notes"
section with detailed steps for generating, reviewing, and formatting
release notes in Verso format
- **script/release_checklist.py**: Added
`check_release_notes_file_exists()` to verify the release notes file
exists in reference-manual repository
- **.claude/commands/release.md**: Added "Release Notes" section
explaining the process for creating release notes during `-rc1` releases

🤖 Prepared with Claude Code

Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-26 07:16:01 +00:00
Kim Morrison
bc1a22cc22 chore: add plausible as verso dependency in release_repos.yml (#12155)
verso depends on plausible, but this wasn't recorded in
release_repos.yml. This caused the release checklist to not properly
track the dependency ordering.
2026-01-26 06:55:45 +00:00
Leonardo de Moura
0e28043ec6 feat: add simpTelescope simproc for simplifying binders before intro (#12154)
This PR adds `simpTelescope`, a simproc that simplifies telescope
binders (`have`-expression values and arrow hypotheses) but not the
final body. This is useful for simplifying targets before introducing
hypotheses.
2026-01-25 23:16:30 +00:00
Leonardo de Moura
45862d5486 feat: improves simpArrowTelescope simproc (#12153)
This PR improves the `simpArrowTelescope` simproc that simplifies
non-dependent arrow telescopes: `p₁ → p₂ → ... → q`.

The simproc now also applies telescope-specific simplifications:
- `False → q` to `True` (when `q : Prop`)
- `True → q` to `q` (when `q : Prop`)
- `p → True` to `True`
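The three rewrites above correspond to standard propositional facts, which can be checked independently of the simproc:
```lean
example (q : Prop) : (False → q) = True := by simp
example (q : Prop) : (True → q) = q := by simp
example (p : Prop) : (p → True) = True := by simp
```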
2026-01-25 22:29:38 +00:00
Leonardo de Moura
ba8c2ed4ee feat: add simpArrowTelescope for compact proofs of arrow simplification (#12152)
This PR adds `simpArrowTelescope`, a simproc that simplifies telescopes
of non-dependent arrows (p₁ → p₂ → ... → q) while avoiding quadratic
proof growth.

When using `Expr.forallE` to represent nested implications, each nesting
level bumps de Bruijn indices in subterms, destroying sharing even with
hash-consing. For example, a free variable `x` gets different de Bruijn
representations at each depth, causing proof terms to grow.

`simpArrowTelescope` works by:

- Converting arrows to `Arrow p q` (a definitional wrapper)
- Simplifying each component
- Converting back to `→` form

Since `Arrow` arguments are not under binders, subterms remain identical
across nesting levels and can be shared.

The `simp_4` benchmark demonstrates the improvement:

- With `forallE`: ~160ms, proof_size ≈ 173k
- With `Arrow`: ~43ms, proof_size ≈ 16k

Tradeoff: `simpArrowTelescope` misses simplifications that depend on
the arrow structure (e.g., `p → p` to `True`), since post-methods
aren't applied to intermediate arrows. Thus, it is not enabled by
default; to use it, one has to set `simpArrowTelescope` as a
`pre`-method.
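Opting in might look like the following sketch (`↓` is `simp`'s marker for pre-method application; using it for this simproc is an assumption from the description above):
```lean
-- Sketch only: run `simpArrowTelescope` as a pre-method in one call.
example (q : Prop) (h : q) : (0 + 1 = 1) → q := by
  simp only [↓simpArrowTelescope]
  sorry  -- sketch; the resulting goal shape is not verified here
```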
2026-01-25 20:43:59 +00:00
1953 changed files with 4464 additions and 1955 deletions


@@ -13,12 +13,54 @@ These comments explain the scripts' behavior, which repositories get special han
## Arguments
- `version`: The version to release (e.g., v4.24.0)
## Release Notes (Required for -rc1 releases)
For first release candidates (`-rc1`), you must create release notes BEFORE the reference-manual toolchain bump PR can be merged.
**Steps to create release notes:**
1. Generate the release notes:
```bash
cd /path/to/lean4
python3 script/release_notes.py --since <previous_version> > /tmp/release-notes-<version>.md
```
Replace `<previous_version>` with the last stable release (e.g., `v4.27.0` when releasing `v4.28.0-rc1`).
2. Review `/tmp/release-notes-<version>.md` for common issues:
- **Unterminated code blocks**: Look for code fences that aren't closed. Fetch original PR with `gh pr view <number>` to repair.
- **Truncated descriptions**: Some may end mid-sentence. Complete them from the original PR.
- **Markdown issues**: Other syntax problems that could cause parsing errors.
3. Create the release notes file in the reference-manual repository:
- File path: `Manual/Releases/v<version>.lean` (e.g., `v4_28_0.lean`)
- Use Verso format with proper imports and `#doc (Manual)` block
- **Use `#` for headers, not `##`** (Verso uses level 1 for subsections)
- **Use plain ` ``` ` not ` ```lean `** (the latter executes code)
- **Wrap underscore identifiers in backticks**: `` `bv_decide` `` not `bv_decide`
4. Update `Manual/Releases.lean`:
- Add import: `import Manual.Releases.«v4_28_0»`
- Add include: `{include 0 Manual.Releases.«v4_28_0»}`
5. Build to verify: `lake build Manual.Releases.v4_28_0`
6. Create a **separate PR** for release notes (not bundled with toolchain bump):
```bash
git checkout -b v<version>-release-notes
gh pr create --title "doc: add v<version> release notes"
```
For subsequent RCs (`-rc2`, etc.) and stable releases, just update the version number in the existing release notes file title.
See `doc/dev/release_checklist.md` section "Writing the release notes" for full details.
## Process
1. Run `script/release_checklist.py {version}` to check the current status
2. **CRITICAL: If preliminary lean4 checks fail, STOP immediately and alert the user**
- Check for: release branch exists, CMake version correct, tag exists, release page exists, release notes exist
- Check for: release branch exists, CMake version correct, tag exists, release page exists, release notes file exists
- **IMPORTANT**: The release page is created AUTOMATICALLY by CI after pushing the tag - DO NOT create it manually
- **IMPORTANT**: For -rc1 releases, release notes must be created before proceeding
- Do NOT create any PRs or proceed with repository updates if these checks fail
3. Create a todo list tracking all repositories that need updates
4. **CRITICAL RULE: You can ONLY run `release_steps.py` for a repository if `release_checklist.py` explicitly says to do so**


@@ -20,7 +20,9 @@ on:
jobs:
on-success:
runs-on: ubuntu-latest
if: github.event.workflow_run.conclusion == 'success' && github.event.workflow_run.event == 'pull_request' && github.repository == 'leanprover/lean4'
# Run even if CI fails, as long as build artifacts are available
# The "Verify release artifacts exist" step will fail if necessary artifacts are missing
if: github.event.workflow_run.event == 'pull_request' && github.repository == 'leanprover/lean4'
steps:
- name: Retrieve information about the original workflow
uses: potiuk/get-workflow-origin@v1_1 # https://github.com/marketplace/actions/get-workflow-origin
@@ -62,42 +64,56 @@ jobs:
git -C lean4.git remote add pr-releases https://foo:'${{ secrets.PR_RELEASES_TOKEN }}'@github.com/${{ github.repository_owner }}/lean4-pr-releases.git
git -C lean4.git push -f pr-releases pr-release-${{ steps.workflow-info.outputs.pullRequestNumber }}
git -C lean4.git push -f pr-releases pr-release-${{ steps.workflow-info.outputs.pullRequestNumber }}-"${SHORT_SHA}"
- name: Delete existing release if present
- name: Delete existing releases if present
if: ${{ steps.workflow-info.outputs.pullRequestNumber != '' }}
run: |
# Try to delete any existing release for the current PR (just the version without the SHA suffix).
# Delete any existing releases for this PR.
# The short format release is always recreated with the latest commit.
# The SHA-suffixed release should be unique per commit, but delete just in case.
gh release delete --repo ${{ github.repository_owner }}/lean4-pr-releases pr-release-${{ steps.workflow-info.outputs.pullRequestNumber }} -y || true
gh release delete --repo ${{ github.repository_owner }}/lean4-pr-releases pr-release-${{ steps.workflow-info.outputs.pullRequestNumber }}-${{ env.SHORT_SHA }} -y || true
env:
GH_TOKEN: ${{ secrets.PR_RELEASES_TOKEN }}
# Verify artifacts were downloaded (equivalent to fail_on_unmatched_files in the old action).
- name: Verify release artifacts exist
if: ${{ steps.workflow-info.outputs.pullRequestNumber != '' }}
run: |
shopt -s nullglob
files=(artifacts/*/*)
if [ ${#files[@]} -eq 0 ]; then
echo "::error::No artifacts found matching artifacts/*/*"
exit 1
fi
echo "Found ${#files[@]} artifacts to upload:"
printf '%s\n' "${files[@]}"
# We use `gh release create` instead of `softprops/action-gh-release` because
# the latter enumerates all releases to check for existing ones, which fails
# when the repository has more than 10000 releases (GitHub API pagination limit).
# Upstream fix: https://github.com/softprops/action-gh-release/pull/725
- name: Release (short format)
if: ${{ steps.workflow-info.outputs.pullRequestNumber != '' }}
uses: softprops/action-gh-release@a06a81a03ee405af7f2048a818ed3f03bbf83c7b
with:
name: Release for PR ${{ steps.workflow-info.outputs.pullRequestNumber }}
# There are coredumps files here as well, but all in deeper subdirectories.
files: artifacts/*/*
fail_on_unmatched_files: true
draft: false
tag_name: pr-release-${{ steps.workflow-info.outputs.pullRequestNumber }}
repository: ${{ github.repository_owner }}/lean4-pr-releases
run: |
# There are coredump files in deeper subdirectories; artifacts/*/* gets the release archives.
gh release create \
--repo ${{ github.repository_owner }}/lean4-pr-releases \
--title "Release for PR ${{ steps.workflow-info.outputs.pullRequestNumber }}" \
--notes "" \
pr-release-${{ steps.workflow-info.outputs.pullRequestNumber }} \
artifacts/*/*
env:
# The token used here must have `workflow` privileges.
GITHUB_TOKEN: ${{ secrets.PR_RELEASES_TOKEN }}
GH_TOKEN: ${{ secrets.PR_RELEASES_TOKEN }}
- name: Release (SHA-suffixed format)
if: ${{ steps.workflow-info.outputs.pullRequestNumber != '' }}
uses: softprops/action-gh-release@a06a81a03ee405af7f2048a818ed3f03bbf83c7b
with:
name: Release for PR ${{ steps.workflow-info.outputs.pullRequestNumber }} (${{ steps.workflow-info.outputs.sourceHeadSha }})
# There are coredumps files here as well, but all in deeper subdirectories.
files: artifacts/*/*
fail_on_unmatched_files: true
draft: false
tag_name: pr-release-${{ steps.workflow-info.outputs.pullRequestNumber }}-${{ env.SHORT_SHA }}
repository: ${{ github.repository_owner }}/lean4-pr-releases
run: |
gh release create \
--repo ${{ github.repository_owner }}/lean4-pr-releases \
--title "Release for PR ${{ steps.workflow-info.outputs.pullRequestNumber }} (${{ steps.workflow-info.outputs.sourceHeadSha }})" \
--notes "" \
pr-release-${{ steps.workflow-info.outputs.pullRequestNumber }}-${{ env.SHORT_SHA }} \
artifacts/*/*
env:
# The token used here must have `workflow` privileges.
GITHUB_TOKEN: ${{ secrets.PR_RELEASES_TOKEN }}
GH_TOKEN: ${{ secrets.PR_RELEASES_TOKEN }}
- name: Report release status (short format)
if: ${{ steps.workflow-info.outputs.pullRequestNumber != '' }}


@@ -218,6 +218,21 @@ Please read https://leanprover-community.github.io/contribute/tags_and_branches.
# Writing the release notes
Release notes content is only written for the first release candidate (`-rc1`). For subsequent RCs and stable releases,
just update the title in the existing release notes file (see "Release notes title format" below).
## Release notes title format
The title in the `#doc (Manual)` line must follow these formats:
- **For -rc1**: `"Lean 4.7.0-rc1 (2024-03-15)"` — Include the RC suffix and the release date
- **For -rc2, -rc3, etc.**: `"Lean 4.7.0-rc2 (2024-03-20)"` — Update the RC number and date
- **For stable release**: `"Lean 4.7.0 (2024-04-01)"` — Remove the RC suffix but keep the date
The date should be the actual date when the tag was pushed (or when CI completed and created the release page).
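The three title formats above can be sketched as a small helper (hypothetical; `release_notes_title` is not part of the repository's scripts):

```python
def release_notes_title(version, date, rc=None):
    """Build the title for the `#doc (Manual)` line.

    version: base version string, e.g. "4.7.0"
    date:    tag-push date, e.g. "2024-03-15"
    rc:      release-candidate number, or None for a stable release
    """
    suffix = f"-rc{rc}" if rc is not None else ""
    return f"Lean {version}{suffix} ({date})"
```

For example, `release_notes_title("4.7.0", "2024-04-01")` yields the stable-release title with the RC suffix removed but the date kept.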
## Generating the release notes
Release notes are automatically generated from the commit history, using `script/release_notes.py`.
Run this as `script/release_notes.py --since v4.6.0`, where `v4.6.0` is the *previous* release version.
@@ -232,4 +247,113 @@ Some judgement is required here: ignore commits which look minor,
but manually add items to the release notes for significant PRs that were rebase-merged.
There can also be pre-written entries in `./releases_drafts`, which should be all incorporated in the release notes and then deleted from the branch.
## Reviewing and fixing the generated markdown
Before adding the release notes to the reference manual, carefully review the generated markdown for these common issues:
1. **Unterminated code blocks**: PR descriptions sometimes have unclosed code fences. Look for code blocks
that don't have a closing ` ``` `. If found, fetch the original PR description with `gh pr view <number>`
and repair the code block with the complete content.
2. **Truncated descriptions**: Some PR descriptions may end abruptly mid-sentence. Review these and complete
the descriptions based on the original PR.
3. **Markdown syntax issues**: Check for other markdown problems that could cause parsing errors.
## Creating the release notes file
The release notes go in `Manual/Releases/v4_7_0.lean` in the reference-manual repository.
The file structure must follow the Verso format:
```lean
/-
Copyright (c) 2025 Lean FRO LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Author: <Your Name>
-/
import VersoManual
import Manual.Meta
import Manual.Meta.Markdown
open Manual
open Verso.Genre
open Verso.Genre.Manual
open Verso.Genre.Manual.InlineLean
#doc (Manual) "Lean 4.7.0-rc1 (2024-03-15)" =>
%%%
tag := "release-v4.7.0"
file := "v4.7.0"
%%%
<release notes content here>
```
**Important formatting rules for Verso:**
- Use `#` for section headers inside the document, not `##` (Verso uses header level 1 for subsections)
- Use plain ` ``` ` for code blocks, not ` ```lean ` (the latter will cause Lean to execute the code)
- Identifiers with underscores like `bv_decide` should be wrapped in backticks: `` `bv_decide` ``
(otherwise the underscore may be interpreted as markdown emphasis)
## Updating Manual/Releases.lean
After creating the release notes file, update `Manual/Releases.lean` to include it:
1. Add the import near the top with other version imports:
```lean
import Manual.Releases.«v4_7_0»
```
2. Add the include statement after the other includes:
```lean
{include 0 Manual.Releases.«v4_7_0»}
```
## Building and verifying
Build the release notes to check for errors:
```bash
lake build Manual.Releases.v4_7_0
```
Common errors and fixes:
- "Wrong header nesting - got ## but expected at most #": Change `##` to `#`
- "Tactic 'X' failed" or similar: Code is being executed; change ` ```lean ` to ` ``` `
- "'_'" errors: Underscore in identifier being parsed as emphasis; wrap in backticks
## Creating the PR
**Important: Timing with the reference-manual tag**
The reference-manual repository deploys documentation when a version tag is pushed. If you merge
release notes AFTER the tag is created, the deployed documentation won't include them.
You have two options:
1. **Preferred**: Include the release notes in the same PR as the toolchain bump (or merge the
release notes PR before creating the tag). This ensures the tag includes the release notes.
2. **If release notes are merged after the tag**: You must regenerate the tag to trigger a new deployment:
```bash
cd /path/to/reference-manual
git fetch origin
git tag -d v4.7.0-rc1 # Delete local tag
git tag v4.7.0-rc1 origin/main # Create tag at current main (which has release notes)
git push origin :refs/tags/v4.7.0-rc1 # Delete remote tag
git push origin v4.7.0-rc1 # Push new tag (triggers Deploy workflow)
```
If creating a separate PR for release notes:
```bash
git checkout -b v4.7.0-release-notes
git add Manual/Releases/v4_7_0.lean Manual/Releases.lean
git commit -m "doc: add v4.7.0 release notes"
git push -u origin v4.7.0-release-notes
gh pr create --title "doc: add v4.7.0 release notes" --body "This PR adds the release notes for Lean v4.7.0."
```
See `./releases_drafts/README.md` for more information about pre-written release note entries.


@@ -185,6 +185,30 @@ def get_release_notes(tag_name):
except Exception:
return None
def check_release_notes_file_exists(toolchain, github_token):
"""Check if the release notes file exists in the reference-manual repository.
For -rc1 releases, this checks that the release notes have been created.
For subsequent RCs and stable releases, release notes should already exist.
Returns tuple (exists: bool, is_rc1: bool) where is_rc1 indicates if this is
the first release candidate (when release notes need to be written).
"""
# Determine the release notes file path
# e.g., v4.28.0-rc1 -> Manual/Releases/v4_28_0.lean
base_version = strip_rc_suffix(toolchain.lstrip('v')) # "4.28.0"
file_name = f"v{base_version.replace('.', '_')}.lean" # "v4_28_0.lean"
file_path = f"Manual/Releases/{file_name}"
is_rc1 = toolchain.endswith("-rc1")
repo_url = "https://github.com/leanprover/reference-manual"
# Check if the file exists on main branch
content = get_branch_content(repo_url, "main", file_path, github_token)
return (content is not None, is_rc1)
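The toolchain-to-path derivation above can be exercised standalone; this is an illustrative re-implementation (the script's `strip_rc_suffix` helper is inlined here as a plain `split`):

```python
def release_notes_path(toolchain):
    """Map a toolchain tag to its release notes file path,
    mirroring the derivation in check_release_notes_file_exists."""
    base_version = toolchain.lstrip('v').split('-rc')[0]   # "v4.28.0-rc1" -> "4.28.0"
    file_name = f"v{base_version.replace('.', '_')}.lean"  # "v4_28_0.lean"
    return f"Manual/Releases/{file_name}"
```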
def get_branch_content(repo_url, branch, file_path, github_token):
api_url = repo_url.replace("https://github.com/", "https://api.github.com/repos/") + f"/contents/{file_path}?ref={branch}"
headers = {'Authorization': f'token {github_token}'} if github_token else {}
@@ -714,6 +738,27 @@ def main():
else:
print(f" ✅ Release notes page title looks good ('{actual_title}').")
# Check if release notes file exists in reference-manual repository
# For -rc1 releases, this is when release notes need to be written
# For subsequent RCs and stable releases, they should already exist
release_notes_exists, is_rc1 = check_release_notes_file_exists(toolchain, github_token)
base_version = strip_rc_suffix(toolchain.lstrip('v'))
release_notes_file = f"Manual/Releases/v{base_version.replace('.', '_')}.lean"
if not release_notes_exists:
if is_rc1:
print(f" ❌ Release notes file not found: {release_notes_file}")
print(f" This is an -rc1 release, so release notes need to be written.")
print(f" Run `script/release_notes.py --since <previous_version>` to generate them.")
print(f" See doc/dev/release_checklist.md section 'Writing the release notes' for details.")
lean4_success = False
else:
print(f" ❌ Release notes file not found: {release_notes_file}")
print(f" Release notes should have been created for -rc1. Check the reference-manual repository.")
lean4_success = False
else:
print(f" ✅ Release notes file exists: {release_notes_file}")
repo_status["lean4"] = lean4_success
# If the release page doesn't exist, skip repository checks and master branch checks


@@ -14,13 +14,6 @@ repositories:
bump-branch: true
dependencies: []
- name: verso
url: https://github.com/leanprover/verso
toolchain-tag: true
stable-branch: false
branch: main
dependencies: []
- name: lean4checker
url: https://github.com/leanprover/lean4checker
toolchain-tag: true
@@ -42,6 +35,14 @@ repositories:
branch: main
dependencies: []
- name: verso
url: https://github.com/leanprover/verso
toolchain-tag: true
stable-branch: false
branch: main
dependencies:
- plausible
- name: import-graph
url: https://github.com/leanprover-community/import-graph
toolchain-tag: true


@@ -10,7 +10,7 @@ endif()
include(ExternalProject)
project(LEAN CXX C)
set(LEAN_VERSION_MAJOR 4)
set(LEAN_VERSION_MINOR 28)
set(LEAN_VERSION_MINOR 29)
set(LEAN_VERSION_PATCH 0)
set(LEAN_VERSION_IS_RELEASE 0) # This number is 1 in the release revision, and 0 otherwise.
set(LEAN_SPECIAL_VERSION_DESC "" CACHE STRING "Additional version description like 'nightly-2018-03-11'")


@@ -469,13 +469,13 @@ namespace EStateM
instance : LawfulMonad (EStateM ε σ) := .mk'
(id_map := fun x => funext <| fun s => by
dsimp only [EStateM.instMonad, EStateM.map]
simp only [Functor.map, EStateM.map]
match x s with
| .ok _ _ => rfl
| .error _ _ => rfl)
(pure_bind := fun _ _ => by rfl)
(bind_assoc := fun x _ _ => funext <| fun s => by
dsimp only [EStateM.instMonad, EStateM.bind]
simp only [bind, EStateM.bind]
match x s with
| .ok _ _ => rfl
| .error _ _ => rfl)


@@ -674,6 +674,39 @@ theorem isNone_findFinIdx? {xs : Array α} {p : α → Bool} :
simp only [Option.map_map, Function.comp_def, Fin.cast_cast]
simp [Array.size]
/-! ### find? and findFinIdx? -/
theorem find?_eq_map_findFinIdx?_getElem {xs : Array α} {p : α → Bool} :
xs.find? p = (xs.findFinIdx? p).map (xs[·]) := by
cases xs
simp [List.find?_eq_map_findFinIdx?_getElem]
rfl
theorem find?_eq_bind_findIdx?_getElem? {xs : Array α} {p : α → Bool} :
xs.find? p = (xs.findIdx? p).bind (xs[·]?) := by
cases xs
simp [List.find?_eq_bind_findIdx?_getElem?]
theorem find?_eq_getElem?_findIdx {xs : Array α} {p : α → Bool} :
xs.find? p = xs[xs.findIdx p]? := by
cases xs
simp [List.find?_eq_getElem?_findIdx]
theorem findIdx?_eq_bind_find?_idxOf? [BEq α] [LawfulBEq α] {xs : Array α} {p : α → Bool} :
xs.findIdx? p = (xs.find? p).bind (xs.idxOf? ·) := by
cases xs
simp [List.findIdx?_eq_bind_find?_idxOf?]
theorem findFinIdx?_eq_bind_find?_finIdxOf? [BEq α] [LawfulBEq α] {xs : Array α} {p : α → Bool} :
xs.findFinIdx? p = (xs.find? p).bind (xs.finIdxOf? ·) := by
cases xs
simp [List.findFinIdx?_eq_bind_find?_finIdxOf?]
theorem findIdx_eq_getD_bind_find?_idxOf? [BEq α] [LawfulBEq α] {xs : Array α} {p : α → Bool} :
xs.findIdx p = ((xs.find? p).bind (xs.idxOf? ·)).getD xs.size := by
cases xs
simp [List.findIdx_eq_getD_bind_find?_idxOf?]
/-! ### idxOf
The verification API for `idxOf` is still incomplete.


@@ -1447,4 +1447,12 @@ instance : LawfulOrderLT Int where
lt_iff := by
simp [← Int.not_le, Decidable.imp_iff_not_or, Std.Total.total]
instance : LawfulOrderLeftLeaningMin Int where
min_eq_left _ _ := Int.min_eq_left
min_eq_right _ _ h := Int.min_eq_right (le_of_lt (not_le.1 h))
instance : LawfulOrderLeftLeaningMax Int where
max_eq_left _ _ := Int.max_eq_left
max_eq_right _ _ h := Int.max_eq_right (le_of_lt (not_le.1 h))
end Int


@@ -630,6 +630,43 @@ def Iter.Total.find? {α β : Type w} [Iterator α Id β] [IteratorLoop α Id Id
(it : Iter.Total (α := α) β) (f : β → Bool) : Option β :=
it.it.find? f
/--
Returns the first output of the iterator, or `none` if the iterator finishes without emitting an output.
`O(|it|)` since the iterator may skip an unknown number of times before returning a result.
Short-circuits upon encountering the first result. Only the first element of `it` is examined.
If the iterator is not productive, this function might run forever. The variant
`it.ensureTermination.first?` always terminates after finitely many steps.
Examples:
* `[7, 6].iter.first? = some 7`
* `[].iter.first? = none`
-/
@[inline]
def Iter.first? {α β : Type w} [Iterator α Id β] [IteratorLoop α Id Id]
(it : Iter (α := α) β) : Option β :=
it.toIterM.first?.run
/--
Returns the first output of the iterator, or `none` if the iterator finishes without emitting an output.
`O(|it|)` since the iterator may skip an unknown number of times before returning a result.
Short-circuits upon encountering the first result. The elements in `it` are examined in order of
iteration.
This variant terminates after finitely many steps and requires a proof that the iterator is
productive. If such a proof is not available, consider using `Iter.first?`.
Examples:
* `[7, 6].iter.first? = some 7`
* `[].iter.first? = none`
-/
@[inline]
def Iter.Total.first? {α β : Type w} [Iterator α Id β] [IteratorLoop α Id Id] [Productive α Id]
(it : Iter.Total (α := α) β) : Option β :=
it.it.first?
/--
Steps through the whole iterator, counting the number of outputs emitted.


@@ -185,8 +185,8 @@ instance instLawfulIteratorLoopDefaultImplementation (α : Type w) (m : Type w
constructor; simp
theorem IteratorLoop.wellFounded_of_finite {m : Type w → Type w'}
{α β : Type w} {γ : Type x} [Iterator α m β] [Finite α m] :
WellFounded α m (γ := γ) fun _ _ _ => True := by
{α β : Type w} {γ : Type x} [Iterator α m β] [Finite α m] {P : β → γ → ForInStep γ → Prop} :
WellFounded α m (γ := γ) P := by
apply Subrelation.wf
(r := InvImage IterM.TerminationMeasures.Finite.Rel (fun p => p.1.finitelyManySteps))
· intro p' p h
@@ -197,6 +197,16 @@ theorem IteratorLoop.wellFounded_of_finite {m : Type w → Type w'}
· apply InvImage.wf
exact WellFoundedRelation.wf
theorem IteratorLoop.wellFounded_of_productive {α β : Type w} {m : Type w → Type w'}
[Iterator α m β] [IteratorLoop α m m] [Productive α m] {P : β → γ → ForInStep γ → Prop}
(hp : ∀ {b g s}, P b g s → s matches ForInStep.done ..) :
WellFounded α m (γ := γ) P := by
rw [WellFounded]
unfold IteratorLoop.rel
have {b g q} : ¬ P b g (ForInStep.yield q) := fun h => by simpa using hp h
simp only [and_false, exists_false, false_or, this]
exact Subrelation.wf And.left (InvImage.wf Prod.fst Productive.wf)
/--
This `ForIn'`-style loop construct traverses a finite iterator using an `IteratorLoop` instance.
-/
@@ -902,6 +912,44 @@ def IterM.Total.find? {α β : Type w} {m : Type w → Type w'} [Monad m] [Itera
m (Option β) :=
it.it.find? f
/--
Returns the first output of the iterator, or `none` if the iterator finishes without emitting an output.
`O(|it|)` since the iterator may skip an unknown number of times before returning a result.
Short-circuits upon encountering the first result. Only the first element of `it` is examined.
If the iterator is not productive, this function might run forever. The variant
`it.ensureTermination.first?` always terminates after finitely many steps.
Examples:
* `([7, 6].iterM Id).first? = pure (some 7)`
* `([].iterM Id).first? = pure none`
-/
@[inline]
def IterM.first? {α β : Type w} {m : Type w → Type w'} [Monad m] [Iterator α m β]
[IteratorLoop α m m] (it : IterM (α := α) m β) : m (Option β) :=
IteratorLoop.forIn (fun _ _ => flip Bind.bind) _ (fun b _ s => s = ForInStep.done (some b)) it
none (fun b _ _ => pure ⟨ForInStep.done (some b), rfl⟩)
/--
Returns the first output of the iterator, or `none` if the iterator finishes without emitting an output.
`O(|it|)` since the iterator may skip an unknown number of times before returning a result.
Short-circuits upon encountering the first result. The elements in `it` are examined in order of
iteration.
This variant terminates after finitely many steps and requires a proof that the iterator is
productive. If such a proof is not available, consider using `IterM.first?`.
Examples:
* `([7, 6].iterM Id).first? = pure (some 7)`
* `([].iterM Id).first? = pure none`
-/
@[inline]
def IterM.Total.first? {α β : Type w} {m : Type w → Type w'} [Monad m] [Iterator α m β]
[IteratorLoop α m m] [Productive α m] (it : IterM.Total (α := α) m β) : m (Option β) :=
it.it.first?
section Count
/--


@@ -111,6 +111,11 @@ instance {n : Type u → Type w} [Monad n] [LawfulMonad n] :
liftBind_pure := by simp
liftBind_bind := by simp
instance {m : Type u → Type v} [Monad m] [LawfulMonad m] :
LawfulMonadLiftBindFunction (m := m) (n := m) (fun _ _ => flip Bind.bind) where
liftBind_pure := by simp [flip]
liftBind_bind := by simp [flip]
end LiftBind
end Std.Internal


@@ -915,4 +915,26 @@ theorem Iter.findM?_pure {α β : Type w} {m : Type w → Type w'} [Monad m]
· simp [ihs _]
· simp
theorem Iter.first?_eq_first?_toIterM {α β : Type w} [Iterator α Id β] [IteratorLoop α Id Id]
{it : Iter (α := α) β} :
it.first? = it.toIterM.first?.run := (rfl)
theorem Iter.first?_eq_match_step {α β : Type w} [Iterator α Id β] [IteratorLoop α Id Id]
[Productive α Id] [LawfulIteratorLoop α Id Id] {it : Iter (α := α) β} :
it.first? = match it.step.val with
| .yield _ out => some out
| .skip it' => it'.first?
| .done => none := by
rw [Iter.first?_eq_first?_toIterM, IterM.first?_eq_match_step]
simp only [Id.run_bind, step]
generalize it.toIterM.step.run.inflate = s
rcases s with ⟨_|_|_, _⟩ <;> simp [Iter.first?_eq_first?_toIterM]
theorem Iter.first?_eq_head?_toList {α β : Type w} [Iterator α Id β] [IteratorLoop α Id Id]
[Finite α Id] [LawfulIteratorLoop α Id Id] {it : Iter (α := α) β} :
it.first? = it.toList.head? := by
induction it using Iter.inductSteps with | step it ihy ihs
rw [first?_eq_match_step, toList_eq_match_step]
cases it.step using PlausibleIterStep.casesOn <;> simp [*]
end Std


@@ -15,6 +15,15 @@ public section
namespace Std
open Std.Iterators
theorem IterM.DefaultConsumers.forIn_eq {α β : Type w} {m : Type w → Type w'}
{n : Type x → Type x'} [Monad n] [Iterator α m β]
{lift : (γ : Type w) → (δ : Type x) → (γ → n δ) → m γ → n δ}
{plausible_forInStep : β → γ → ForInStep γ → Prop} {it : IterM (α := α) m β} {init : γ}
{f : (b : β) → it.IsPlausibleIndirectOutput b → (c : γ) → n (Subtype (plausible_forInStep b c))} :
letI : IteratorLoop α m n := .defaultImplementation
IteratorLoop.forIn lift γ plausible_forInStep it init f =
IterM.DefaultConsumers.forIn' lift γ plausible_forInStep it init _ (fun _ => id) f := rfl
theorem IterM.DefaultConsumers.forIn'_eq_match_step {α β : Type w} {m : Type w → Type w'}
[Iterator α m β] {n : Type x → Type x'} [Monad n] [LawfulMonad n]
{lift : ∀ γ δ, (γ → n δ) → m γ → n δ} {γ : Type x}
@@ -731,7 +740,7 @@ theorem IterM.findSomeM?_eq_match_step {α β γ : Type w} {m : Type w → Type
· simp
theorem IterM.findSome?_eq_findSomeM? {α β γ : Type w} {m : Type w → Type w'} [Monad m]
[Iterator α m β] [IteratorLoop α m m] [Finite α m]
[Iterator α m β] [IteratorLoop α m m]
{it : IterM (α := α) m β} {f : β → Option γ} :
it.findSome? f = it.findSomeM? (pure <| f ·) :=
(rfl)
@@ -832,4 +841,24 @@ theorem IterM.findM?_pure {α β : Type w} {m : Type w → Type w'} [Monad m]
· simp [ihs _]
· simp
theorem IterM.first?_eq_match_step {α β : Type w} {m : Type w → Type w'} [Monad m]
[Iterator α m β] [IteratorLoop α m m] [LawfulMonad m] [Productive α m]
[LawfulIteratorLoop α m m] {it : IterM (α := α) m β} :
it.first? = (do
match (← it.step).inflate.val with
| .yield _ out => return (some out)
| .skip it' => it'.first?
| .done => return none) := by
simp only [first?]
have := IteratorLoop.wellFounded_of_productive (α := α) (β := β) (m := m)
(P := fun b g s => s = ForInStep.done (some b)) (by simp)
simp only [LawfulIteratorLoop.lawful _ _ _ _ _ this]
rw [IterM.DefaultConsumers.forIn_eq, IterM.DefaultConsumers.forIn'_eq_match_step _ this]
simp only [flip, pure_bind]
congr
ext s
split <;> try (simp [*]; done)
simp only [DefaultConsumers.forIn_eq, *]
exact IterM.DefaultConsumers.forIn'_eq_forIn' _ this (by simp)
end Std


@@ -1077,6 +1077,37 @@ theorem isNone_findFinIdx? {l : List α} {p : α → Bool} :
simp [hf, findFinIdx?_cons]
split <;> simp [ih, Function.comp_def]
/-! ### find? and findFinIdx? -/
theorem find?_eq_map_findFinIdx?_getElem {xs : List α} {p : α → Bool} :
xs.find? p = (xs.findFinIdx? p).map (xs[·]) := by
induction xs with
| nil => simp
| cons x xs ih =>
simp only [find?_cons, findFinIdx?_cons]
split <;> rename_i h
· simp [h]
· simp [h, ih, Function.comp_def]
theorem find?_eq_bind_findIdx?_getElem? {xs : List α} {p : α → Bool} :
xs.find? p = (xs.findIdx? p).bind (xs[·]?) := by
induction xs with
| nil => simp
| cons x xs ih =>
simp only [find?_cons, findIdx?_cons]
split <;> rename_i h
· simp [h]
· simp [h, ih, Option.bind_map, Function.comp_def]
theorem find?_eq_getElem?_findIdx {xs : List α} {p : α → Bool} :
xs.find? p = xs[xs.findIdx p]? := by
induction xs with
| nil => simp
| cons x xs ih =>
simp only [find?_cons, findIdx_cons]
split <;> rename_i h
· simp [h]
· simp [h, ih]
/-! ### idxOf
@@ -1103,8 +1134,6 @@ theorem idxOf_append [BEq α] [LawfulBEq α] {l₁ l₂ : List α} {a : α} :
· rw [if_neg]
simpa using h
theorem idxOf_eq_length [BEq α] [LawfulBEq α] {l : List α} (h : a ∉ l) : l.idxOf a = l.length := by
induction l with
| nil => rfl
@@ -1113,8 +1142,6 @@ theorem idxOf_eq_length [BEq α] [LawfulBEq α] {l : List α} (h : a ∉ l) : l.
simp only [idxOf_cons, cond_eq_ite, beq_iff_eq]
split <;> simp_all
theorem idxOf_lt_length_of_mem [BEq α] [EquivBEq α] {l : List α} (h : a ∈ l) : l.idxOf a < l.length := by
induction l with
| nil => simp at h
@@ -1143,8 +1170,6 @@ theorem idxOf_lt_length_iff [BEq α] [LawfulBEq α] {l : List α} {a : α} :
grind_pattern idxOf_lt_length_iff => l.idxOf a, l.length
/-! ### finIdxOf?
The verification API for `finIdxOf?` is still incomplete.
@@ -1214,7 +1239,9 @@ The lemmas below should be made consistent with those for `findIdx?` (and proved
· rintro w x h rfl
contradiction
theorem idxOf?_eq_some_iff [BEq α] [LawfulBEq α] {l : List α} {a : α} {i : Nat} :
l.idxOf? a = some i ↔ ∃ (h : i < l.length), l[i] = a ∧ ∀ j (_ : j < i), ¬l[j] = a := by
simp [idxOf?, findIdx?_eq_some_iff_getElem]
@[simp, grind =]
theorem isSome_idxOf? [BEq α] [LawfulBEq α] {l : List α} {a : α} :
@@ -1230,6 +1257,56 @@ theorem isNone_idxOf? [BEq α] [LawfulBEq α] {l : List α} {a : α} :
(l.idxOf? a).isNone = ¬ a ∈ l := by
simp
theorem finIdxOf?_eq_pmap_idxOf? {l : List α} {a : α} [BEq α] [LawfulBEq α] :
l.finIdxOf? a =
(l.idxOf? a).pmap
(fun i h => ⟨i, (idxOf?_eq_some_iff.mp h).1⟩)
(fun _ h => h) := by
ext ⟨i, h⟩
simp only [finIdxOf?_eq_some_iff, Fin.getElem_fin, Fin.forall_iff, Fin.mk_lt_mk,
idxOf?_eq_some_iff, Option.pmap_eq_some_iff, Fin.mk.injEq, exists_and_left, exists_prop,
and_self_left, exists_eq_right', h, exists_true_left, and_congr_right_iff]
intro w
constructor
· intro w j h₁
apply w <;> omega
· intro w j h₁ h₂
apply w <;> omega
/-! ### find? and idxOf? -/
theorem findIdx?_eq_bind_find?_idxOf? [BEq α] [LawfulBEq α] {xs : List α} {p : α → Bool} :
xs.findIdx? p = (xs.find? p).bind (xs.idxOf?) := by
induction xs with
| nil => simp
| cons x xs ih =>
simp [findIdx?_cons, find?_cons]
split <;> rename_i h
· simp [h, idxOf?_cons]
· simp [h, ih, Function.comp_def, idxOf?_cons]
cases w : xs.find? p with
| none => simp
| some x' =>
simp
rintro rfl
have := find?_some w
contradiction
theorem findFinIdx?_eq_bind_find?_finIdxOf? [BEq α] [LawfulBEq α] {xs : List α} {p : α → Bool} :
xs.findFinIdx? p = (xs.find? p).bind (xs.finIdxOf?) := by
simp only [findFinIdx?_eq_pmap_findIdx?, findIdx?_eq_bind_find?_idxOf?, finIdxOf?_eq_pmap_idxOf?]
ext i
simp only [Option.bind_eq_some_iff, Option.pmap_eq_some_iff, exists_and_left, and_exists_self]
constructor
· rintro ⟨a, h₁, h₂, rfl⟩
exact ⟨h₁, by simp [h₂]⟩
· rintro ⟨h₁, h₂, a, h₃, rfl⟩
exact ⟨a, h₁, h₂, h₃, rfl⟩
theorem findIdx_eq_getD_bind_find?_idxOf? [BEq α] [LawfulBEq α] {xs : List α} {p : α → Bool} :
xs.findIdx p = ((xs.find? p).bind (xs.idxOf?)).getD xs.length := by
rw [findIdx_eq_getD_findIdx?, findIdx?_eq_bind_find?_idxOf?]
/-! ### lookup -/
section lookup


@@ -85,7 +85,7 @@ theorem cons_lex_cons_iff : Lex r (a :: l₁) (b :: l₂) ↔ r a b a = b
theorem cons_lt_cons_iff [LT α] {a b} {l₁ l₂ : List α} :
(a :: l₁) < (b :: l₂) ↔ a < b ∨ a = b ∧ l₁ < l₂ := by
dsimp only [instLT, List.lt]
simp only [LT.lt, List.lt]
simp [cons_lex_cons_iff]
@[simp] theorem cons_lt_cons_self [LT α] [i₀ : Std.Irrefl (· < · : α → α → Prop)] {l₁ l₂ : List α} :
@@ -101,7 +101,7 @@ theorem cons_le_cons_iff [LT α]
[i₂ : Std.Trichotomous (· < · : α → α → Prop)]
{a b} {l₁ l₂ : List α} :
(a :: l₁) ≤ (b :: l₂) ↔ a < b ∨ a = b ∧ l₁ ≤ l₂ := by
dsimp only [instLE, instLT, List.le, List.lt]
simp only [LE.le, LT.lt, List.le, List.lt]
open Classical in
simp only [not_cons_lex_cons_iff, ne_eq]
constructor


@@ -131,6 +131,14 @@ theorem suffix_iff_eq_drop : l₁ <:+ l₂ ↔ l₁ = drop (length l₂ - length
⟨fun h => append_cancel_left <| (suffix_iff_eq_append.1 h).trans (take_append_drop _ _).symm,
fun e => e.symm ▸ drop_suffix _ _⟩
theorem prefix_map_iff_of_injective {f : α → β} (hf : Function.Injective f) :
l₁.map f <+: l₂.map f ↔ l₁ <+: l₂ := by
simp [prefix_iff_eq_take, map_take, map_inj_right hf]
theorem suffix_map_iff_of_injective {f : α → β} (hf : Function.Injective f) :
l₁.map f <:+ l₂.map f ↔ l₁ <:+ l₂ := by
simp [suffix_iff_eq_drop, map_drop, map_inj_right hf]
@[grind =] theorem prefix_take_le_iff {xs : List α} (hm : i < xs.length) :
xs.take i <+: xs.take j ↔ i ≤ j := by
simp only [prefix_iff_eq_take, length_take]


@@ -8,6 +8,7 @@ module
prelude
public import Init.Data.String.Basic
public import Init.Data.Iterators.Basic
public import Init.Data.Iterators.Consumers.Loop
set_option doc.verso true
@@ -129,23 +130,24 @@ variable [∀ s, Std.Iterator (σ s) Id (SearchStep s)]
variable (pat : ρ) [ToForwardSearcher pat σ]
@[specialize pat]
def defaultStartsWith (s : Slice) : Bool :=
def defaultStartsWith (s : Slice) [Std.IteratorLoop (σ s) Id Id] : Bool :=
let searcher := ToForwardSearcher.toSearcher pat s
match searcher.step with
| .yield _ (.matched start ..) _ => s.startPos = start
match searcher.first? with
| some (.matched start ..) => s.startPos = start
| _ => false
@[specialize pat]
def defaultDropPrefix? (s : Slice) : Option s.Pos :=
def defaultDropPrefix? (s : Slice) [Std.IteratorLoop (σ s) Id Id] : Option s.Pos :=
let searcher := ToForwardSearcher.toSearcher pat s
match searcher.step with
| .yield _ (.matched _ endPos) _ => some endPos
match searcher.first? with
| some (.matched _ endPos) => some endPos
| _ => none
@[always_inline, inline]
def defaultImplementation {pat : ρ} [ToForwardSearcher pat σ] : ForwardPattern pat where
startsWith := defaultStartsWith pat
dropPrefix? := defaultDropPrefix? pat
def defaultImplementation {pat : ρ} [ToForwardSearcher pat σ] [∀ s, Std.IteratorLoop (σ s) Id Id] :
ForwardPattern pat where
startsWith s := defaultStartsWith pat s
dropPrefix? s := defaultDropPrefix? pat s
end ForwardPattern
@@ -188,23 +190,24 @@ variable [∀ s, Std.Iterator (σ s) Id (SearchStep s)]
variable (pat : ρ) [ToBackwardSearcher pat σ]
@[specialize pat]
def defaultEndsWith (s : Slice) : Bool :=
def defaultEndsWith (s : Slice) [Std.IteratorLoop (σ s) Id Id] : Bool :=
let searcher := ToBackwardSearcher.toSearcher pat s
match searcher.step with
| .yield _ (.matched _ endPos) _ => s.endPos = endPos
match searcher.first? with
| some (.matched _ endPos) => s.endPos = endPos
| _ => false
@[specialize pat]
def defaultDropSuffix? (s : Slice) : Option s.Pos :=
def defaultDropSuffix? (s : Slice) [Std.IteratorLoop (σ s) Id Id] : Option s.Pos :=
let searcher := ToBackwardSearcher.toSearcher pat s
match searcher.step with
| .yield _ (.matched startPos _) _ => some startPos
match searcher.first? with
| some (.matched startPos _) => some startPos
| _ => none
@[always_inline, inline]
def defaultImplementation {pat : ρ} [ToBackwardSearcher pat σ] : BackwardPattern pat where
endsWith := defaultEndsWith pat
dropSuffix? := defaultDropSuffix? pat
def defaultImplementation {pat : ρ} [ToBackwardSearcher pat σ] [∀ s, Std.IteratorLoop (σ s) Id Id] :
BackwardPattern pat where
endsWith s := defaultEndsWith pat s
dropSuffix? s := defaultDropSuffix? pat s
end ToBackwardSearcher


@@ -341,4 +341,16 @@ theorem isNone_findFinIdx? {xs : Vector α n} {p : α → Bool} :
rcases xs with ⟨xs, rfl⟩
simp [hf, Function.comp_def]
/-! ### find? and findFinIdx? -/
theorem find?_eq_map_findFinIdx?_getElem {xs : Vector α n} {p : α → Bool} :
xs.find? p = (xs.findFinIdx? p).map (xs[·]) := by
rcases xs with ⟨xs, rfl⟩
simp [Array.find?_eq_map_findFinIdx?_getElem]
theorem findFinIdx?_eq_bind_find?_finIdxOf? [BEq α] [LawfulBEq α] {xs : Vector α n} {p : α → Bool} :
xs.findFinIdx? p = (xs.find? p).bind (xs.finIdxOf? ·) := by
rcases xs with ⟨xs, rfl⟩
simp [Array.findFinIdx?_eq_bind_find?_finIdxOf?]
end Vector


@@ -172,6 +172,9 @@ instance (priority := low) [GetElem coll idx elem valid] [∀ xs i, Decidable (v
rw [getElem?_def]
exact dif_pos h
grind_pattern getElem?_pos => c[i] where
guard dom c i
@[simp, grind =] theorem getElem?_neg [GetElem? cont idx elem dom] [LawfulGetElem cont idx elem dom]
(c : cont) (i : idx) (h : ¬dom c i) : c[i]? = none := by
have : Decidable (dom c i) := .isFalse h


@@ -137,6 +137,11 @@ structure Config where
For local theorems, use `+suggestions` instead.
-/
locals : Bool := false
/--
If `instances` is `true`, `dsimp` will visit instance arguments.
If option `backward.dsimp.instances` is `true`, it overrides this field.
-/
instances : Bool := false
deriving Inhabited, BEq
end DSimp
@@ -308,6 +313,11 @@ structure Config where
For local theorems, use `+suggestions` instead.
-/
locals : Bool := false
/--
If `instances` is `true`, `dsimp` will visit instance arguments.
If option `backward.dsimp.instances` is `true`, it overrides this field.
-/
instances : Bool := false
deriving Inhabited, BEq
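A sketch of how the new field and its backward-compatibility option interact (hypothetical usage; the `+instances` spelling follows the usual simp-configuration syntax and the commit history of this PR):

```lean
-- By default, `dsimp` now leaves instance arguments untouched.
example (n : Nat) : n + 0 = n := by
  dsimp

-- Opt back in for a single call:
example (n : Nat) : n + 0 = n := by
  dsimp +instances

-- Or restore the old default globally:
set_option backward.dsimp.instances true
```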
-- Configuration object for `simp_all`


@@ -44,6 +44,61 @@ theorem implies_congr_left {p₁ p₂ : Sort u} {q : Sort v} (h : p₁ = p₂) :
theorem implies_congr_right {p : Sort u} {q₁ q₂ : Sort v} (h : q₁ = q₂) : (p → q₁) = (p → q₂) :=
h ▸ rfl
namespace Lean
/--
`Arrow α β` is definitionally equal to `α → β`, but represented as a function
application rather than `Expr.forallE`.
This representation is useful for proof automation that builds nested implications
like `pₙ → ... → p₂ → p₁`. With `Expr.forallE`, each nesting level introduces a
binder that bumps de Bruijn indices in subterms, destroying sharing even with
hash-consing. For example, if `p₁` contains `#20`, then at depth 2 it becomes `#21`,
at depth 3 it becomes `#22`, etc., causing quadratic proof growth.
With `arrow`, both arguments are explicit (not under binders), so subterms remain
identical across nesting levels and can be shared, yielding linear-sized proofs.
-/
def Arrow (α : Sort u) (β : Sort v) : Sort (imax u v) := α → β
theorem arrow_congr {p₁ p₂ : Sort u} {q₁ q₂ : Sort v} (h₁ : p₁ = p₂) (h₂ : q₁ = q₂) : Arrow p₁ q₁ = Arrow p₂ q₂ :=
h₁ ▸ h₂ ▸ rfl
theorem arrow_congr_left {p₁ p₂ : Sort u} {q : Sort v} (h : p₁ = p₂) : Arrow p₁ q = Arrow p₂ q :=
h ▸ rfl
theorem arrow_congr_right {p : Sort u} {q₁ q₂ : Sort v} (h : q₁ = q₂) : Arrow p q₁ = Arrow p q₂ :=
h ▸ rfl
theorem true_arrow (p : Prop) : Arrow True p = p := by
simp [Arrow]; constructor
next => intro h; exact h .intro
next => intros; assumption
theorem true_arrow_congr_left (p q : Prop) : p = True → Arrow p q = q := by
intros; subst p; apply true_arrow
theorem true_arrow_congr_right (q q' : Prop) : q = q' → Arrow True q = q' := by
intros; subst q; apply true_arrow
theorem true_arrow_congr (p q q' : Prop) : p = True → q = q' → Arrow p q = q' := by
intros; subst p q; apply true_arrow
theorem false_arrow (p : Prop) : Arrow False p = True := by
simp [Arrow]; constructor
next => intros; exact .intro
next => intros; contradiction
theorem false_arrow_congr (p q : Prop) : p = False → Arrow p q = True := by
intros; subst p; apply false_arrow
theorem arrow_true (α : Sort u) : Arrow α True = True := by
simp [Arrow]; constructor <;> intros <;> exact .intro
theorem arrow_true_congr (α : Sort u) (p : Prop) : p = True → Arrow α p = True := by
intros; subst p; apply arrow_true
end Lean
theorem iff_congr {p₁ p₂ q₁ q₂ : Prop} (h₁ : p₁ ↔ p₂) (h₂ : q₁ ↔ q₂) : (p₁ ↔ q₁) ↔ (p₂ ↔ q₂) :=
Iff.of_eq (propext h₁ ▸ propext h₂ ▸ rfl)
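A minimal sketch of using `Arrow` (hypothetical examples, not part of this diff): since `Arrow p q` is definitionally `p → q`, ordinary functions inhabit it, and nesting keeps both arguments closed.

```lean
open Lean in
example (p q : Prop) (hq : q) : Arrow p q :=
  fun _ => hq

open Lean in
-- At depth two, the subterm `p` is literally the same closed expression
-- in both positions, which is what enables sharing in generated proofs.
example (p q : Prop) (hq : q) : Arrow p (Arrow p q) :=
  fun _ _ => hq
```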


@@ -27,6 +27,7 @@ public import Lean.Compiler.IR.ToIR
public import Lean.Compiler.IR.ToIRType
public import Lean.Compiler.IR.Meta
public import Lean.Compiler.IR.Toposort
public import Lean.Compiler.IR.SimpleGroundExpr
-- The following imports are not required by the compiler. They are here to ensure that there
-- are no orphaned modules.
@@ -71,6 +72,7 @@ def compile (decls : Array Decl) : CompilerM (Array Decl) := do
logDecls `result decls
checkDecls decls
decls ← toposortDecls decls
decls.forM Decl.detectSimpleGround
addDecls decls
inferMeta decls
return decls


@@ -186,7 +186,7 @@ def getDecl (n : Name) : CompilerM Decl := do
def findLocalDecl (n : Name) : CompilerM (Option Decl) :=
return declMapExt.getState (← getEnv) |>.find? n
/-- Returns the list of IR declarations in declaration order. -/
/-- Returns the list of IR declarations in reverse declaration order. -/
def getDecls (env : Environment) : List Decl :=
declMapExt.getEntries env


@@ -12,6 +12,7 @@ public import Lean.Compiler.IR.NormIds
public import Lean.Compiler.IR.SimpCase
public import Lean.Compiler.IR.Boxing
public import Lean.Compiler.ModPkgExt
import Lean.Compiler.IR.SimpleGroundExpr
public section
@@ -76,6 +77,26 @@ def toCType : IRType → String
| IRType.struct _ _ => panic! "not implemented yet"
| IRType.union _ _ => panic! "not implemented yet"
def toHexDigit (c : Nat) : String :=
String.singleton c.digitChar
def quoteString (s : String) : String :=
let q := "\"";
let q := s.foldl
(fun q c => q ++
if c == '\n' then "\\n"
else if c == '\r' then "\\r"
else if c == '\t' then "\\t"
else if c == '\\' then "\\\\"
else if c == '\"' then "\\\""
else if c == '?' then "\\?" -- avoid trigraphs
else if c.toNat <= 31 then
"\\x" ++ toHexDigit (c.toNat / 16) ++ toHexDigit (c.toNat % 16)
-- TODO(Leo): we should use `\unnnn` for escaping unicode characters.
else String.singleton c)
q;
q ++ "\""
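For intuition (illustration only, not part of this diff), `quoteString` turns a Lean string into a C string literal: special characters become backslash escapes and control bytes become `\xNN` escapes.

```lean
-- Illustration only (not part of this diff):
#eval quoteString "a\nb" -- the newline is emitted as the two characters `\n`
#eval quoteString (String.singleton (Char.ofNat 7)) -- emitted as the escape `\x07`
```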
def throwInvalidExportName {α : Type} (n : Name) : M α :=
throw s!"invalid export name '{n}'"
@@ -101,30 +122,160 @@ def toCInitName (n : Name) : M String := do
def emitCInitName (n : Name) : M Unit :=
toCInitName n >>= emit
def ctorScalarSizeStr (usize : Nat) (ssize : Nat) : String :=
if usize == 0 then toString ssize
else if ssize == 0 then s!"sizeof(size_t)*{usize}"
else s!"sizeof(size_t)*{usize} + {ssize}"
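The three branches of `ctorScalarSizeStr` can be checked directly (illustrative examples, not part of this diff):

```lean
-- Illustration of the three branches (not part of this diff):
example : ctorScalarSizeStr 0 4 = "4" := by native_decide
example : ctorScalarSizeStr 2 0 = "sizeof(size_t)*2" := by native_decide
example : ctorScalarSizeStr 2 4 = "sizeof(size_t)*2 + 4" := by native_decide
```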
structure GroundState where
auxCounter : Nat := 0
abbrev GroundM := StateT GroundState M
partial def emitGroundDecl (decl : Decl) (cppBaseName : String) : M Unit := do
let some ground := getSimpleGroundExpr (← getEnv) decl.name | unreachable!
discard <| compileGround ground |>.run {}
where
compileGround (e : SimpleGroundExpr) : GroundM Unit := do
let valueName ← compileGroundToValue e
let declPrefix := if isClosedTermName (← getEnv) decl.name then "static" else "LEAN_EXPORT"
emitLn <| s!"{declPrefix} const lean_object* {cppBaseName} = (const lean_object*)&{valueName};"
compileGroundToValue (e : SimpleGroundExpr) : GroundM String := do
match e with
| .ctor cidx objArgs usizeArgs scalarArgs =>
let val ← compileCtor cidx objArgs usizeArgs scalarArgs
mkValueCLit "lean_ctor_object" val
| .string data =>
let leanStringTag := 249
let header := mkHeader 0 0 leanStringTag
let size := data.utf8ByteSize + 1 -- null byte
let length := data.length
let data : String := quoteString data
mkValueCLit
"lean_string_object"
s!"\{.m_header = {header}, .m_size = {size}, .m_capacity = {size}, .m_length = {length}, .m_data = {data}}"
| .pap func args =>
let numFixed := args.size
let leanClosureTag := 245
let header := mkHeader s!"sizeof(lean_closure_object) + sizeof(void*)*{numFixed}" 0 leanClosureTag
let funPtr := s!"(void*){← toCName func}"
let arity := (← getDecl func).params.size
let args ← args.mapM groundArgToCLit
let argArray := String.intercalate "," args.toList
mkValueCLit
"lean_closure_object"
s!"\{.m_header = {header}, .m_fun = {funPtr}, .m_arity = {arity}, .m_num_fixed = {numFixed}, .m_objs = \{{argArray}} }"
| .nameMkStr args =>
let obj ← groundNameMkStrToCLit args
mkValueCLit "lean_ctor_object" obj
| .reference refDecl => findValueDecl refDecl
mkValueName (name : String) : String :=
name ++ "_value"
mkAuxValueName (name : String) (idx : Nat) : String :=
mkValueName name ++ s!"_aux_{idx}"
mkAuxDecl (type value : String) : GroundM String := do
let idx ← modifyGet fun s => (s.auxCounter, { s with auxCounter := s.auxCounter + 1 })
let name := mkAuxValueName cppBaseName idx
emitLn <| s!"static const {type} {name} = {value};"
return name
mkValueCLit (type value : String) : GroundM String := do
let valueName := mkValueName cppBaseName
emitLn <| s!"static const {type} {valueName} = {value};"
return valueName
groundNameMkStrToCLit (args : Array (Name × UInt64)) : GroundM String := do
assert! args.size > 0
if args.size == 1 then
let (ref, hash) := args[0]!
let hash := uint64ToByteArrayLE hash
compileCtor 1 #[.tagged 0, .reference ref] #[] hash
else
let (ref, hash) := args.back!
let args := args.pop
let lit ← groundNameMkStrToCLit args
let auxName ← mkAuxDecl "lean_ctor_object" lit
let hash := uint64ToByteArrayLE hash
compileCtor 1 #[.rawReference auxName, .reference ref] #[] hash
groundArgToCLit (a : SimpleGroundArg) : GroundM String := do
match a with
| .tagged val => return s!"((lean_object*)(((size_t)({val}) << 1) | 1))"
| .reference decl => return s!"((lean_object*)&{← findValueDecl decl})"
| .rawReference decl => return s!"((lean_object*)&{decl})"
findValueDecl (decl : Name) : GroundM String := do
let mut decl := decl
while true do
if let some (.reference ref) := getSimpleGroundExpr (← getEnv) decl then
decl := ref
else
break
return mkValueName (← toCName decl)
compileCtor (cidx : Nat) (objArgs : Array SimpleGroundArg) (usizeArgs : Array USize)
(scalarArgs : Array UInt8) : GroundM String := do
let header := mkCtorHeader objArgs.size usizeArgs.size scalarArgs.size cidx
let objArgs ← objArgs.mapM groundArgToCLit
let usizeArgs : Array String := usizeArgs.map fun val => s!"(lean_object*)(size_t)({val}ULL)"
assert! scalarArgs.size % 8 == 0
let scalarArgs : Array String := Id.run do
let chunks := scalarArgs.size / 8
let mut packed := Array.emptyWithCapacity chunks
for idx in 0...chunks do
let b1 := scalarArgs[idx * 8]!
let b2 := scalarArgs[idx * 8 + 1]!
let b3 := scalarArgs[idx * 8 + 2]!
let b4 := scalarArgs[idx * 8 + 3]!
let b5 := scalarArgs[idx * 8 + 4]!
let b6 := scalarArgs[idx * 8 + 5]!
let b7 := scalarArgs[idx * 8 + 6]!
let b8 := scalarArgs[idx * 8 + 7]!
let lit := s!"LEAN_SCALAR_PTR_LITERAL({b1}, {b2}, {b3}, {b4}, {b5}, {b6}, {b7}, {b8})"
packed := packed.push lit
return packed
let argArray := String.intercalate "," (objArgs ++ usizeArgs ++ scalarArgs).toList
return s!"\{.m_header = {header}, .m_objs = \{{argArray}}}"
mkCtorHeader (numObjs : Nat) (usize : Nat) (ssize : Nat) (tag : Nat) : String :=
let size := s!"sizeof(lean_ctor_object) + sizeof(void*)*{numObjs} + {ctorScalarSizeStr usize ssize}"
mkHeader size numObjs tag
mkHeader {α : Type} [ToString α] (csSz : α) (other : Nat) (tag : Nat) : String :=
s!"\{.m_rc = 0, .m_cs_sz = {csSz}, .m_other = {other}, .m_tag = {tag}}"
def emitFnDeclAux (decl : Decl) (cppBaseName : String) (isExternal : Bool) : M Unit := do
let ps := decl.params
let env ← getEnv
if ps.isEmpty then
if isExternal then emit "extern "
else if isClosedTermName env decl.name then emit "static "
else emit "LEAN_EXPORT "
if isSimpleGroundDecl env decl.name then
emitGroundDecl decl cppBaseName
else
if !isExternal then emit "LEAN_EXPORT "
emit (toCType decl.resultType ++ " " ++ cppBaseName)
unless ps.isEmpty do
emit "("
-- We omit void parameters, note that they are guaranteed not to occur in boxed functions
let ps := ps.filter (fun p => !p.ty.isVoid)
-- We omit erased parameters for extern constants
let ps := if isExternC env decl.name then ps.filter (fun p => !p.ty.isErased) else ps
if ps.size > closureMaxArgs && isBoxedName decl.name then
emit "lean_object**"
if ps.isEmpty then
if isExternal then emit "extern "
else if isClosedTermName env decl.name then emit "static "
else emit "LEAN_EXPORT "
else
ps.size.forM fun i _ => do
if i > 0 then emit ", "
emit (toCType ps[i].ty)
emit ")"
emitLn ";"
if !isExternal then emit "LEAN_EXPORT "
emit (toCType decl.resultType ++ " " ++ cppBaseName)
unless ps.isEmpty do
emit "("
-- We omit void parameters, note that they are guaranteed not to occur in boxed functions
let ps := ps.filter (fun p => !p.ty.isVoid)
-- We omit erased parameters for extern constants
let ps := if isExternC env decl.name then ps.filter (fun p => !p.ty.isErased) else ps
if ps.size > closureMaxArgs && isBoxedName decl.name then
emit "lean_object**"
else
ps.size.forM fun i _ => do
if i > 0 then emit ", "
emit (toCType ps[i].ty)
emit ")"
emitLn ";"
def emitFnDecl (decl : Decl) (isExternal : Bool) : M Unit := do
let cppBaseName ← toCName decl.name
@@ -137,10 +288,9 @@ def emitExternDeclAux (decl : Decl) (cNameStr : String) : M Unit := do
def emitFnDecls : M Unit := do
let env ← getEnv
let decls := getDecls env
let decls := getDecls env |>.reverse
let modDecls : NameSet := decls.foldl (fun s d => s.insert d.name) {}
let usedDecls : NameSet := decls.foldl (fun s d => collectUsedDecls env d (s.insert d.name)) {}
let usedDecls := usedDecls.toList
let usedDecls := collectUsedDecls env decls
usedDecls.forM fun n => do
let decl ← getDecl n;
match getExternNameFor env `c decl.name with
@@ -353,10 +503,8 @@ def emitArgs (ys : Array Arg) : M Unit :=
if i > 0 then emit ", "
emitArg ys[i]
def emitCtorScalarSize (usize : Nat) (ssize : Nat) : M Unit := do
if usize == 0 then emit ssize
else if ssize == 0 then emit "sizeof(size_t)*"; emit usize
else emit "sizeof(size_t)*"; emit usize; emit " + "; emit ssize
def emitCtorScalarSize (usize : Nat) (ssize : Nat) : M Unit :=
emit <| ctorScalarSizeStr usize ssize
def emitAllocCtor (c : CtorInfo) : M Unit := do
emit "lean_alloc_ctor("; emit c.cidx; emit ", "; emit c.size; emit ", "
@@ -435,12 +583,18 @@ def emitExternCall (f : FunId) (ps : Array Param) (extData : ExternAttrData) (ys
| some (ExternEntry.inline _ pat) => do emit (expandExternPattern pat (toStringArgs ys)); emitLn ";"
| _ => throw s!"failed to emit extern application '{f}'"
def emitLeanFunReference (f : FunId) : M Unit := do
if isSimpleGroundDecl (← getEnv) f then
emit s!"((lean_object*)({← toCName f}))"
else
emitCName f
def emitFullApp (z : VarId) (f : FunId) (ys : Array Arg) : M Unit := do
emitLhs z
let decl ← getDecl f
match decl with
| .fdecl (xs := ps) .. | .extern (xs := ps) (ext := { entries := [.opaque], .. }) .. =>
emitCName f
emitLeanFunReference f
if ys.size > 0 then
let (ys, _) := ys.zip ps |>.filter (fun (_, p) => !p.ty.isVoid) |>.unzip
emit "("; emitArgs ys; emit ")"
@@ -482,26 +636,6 @@ def emitUnbox (z : VarId) (t : IRType) (x : VarId) : M Unit := do
def emitIsShared (z : VarId) (x : VarId) : M Unit := do
emitLhs z; emit "!lean_is_exclusive("; emit x; emitLn ");"
def toHexDigit (c : Nat) : String :=
String.singleton c.digitChar
def quoteString (s : String) : String :=
let q := "\"";
let q := s.foldl
(fun q c => q ++
if c == '\n' then "\\n"
else if c == '\r' then "\\r"
else if c == '\t' then "\\t"
else if c == '\\' then "\\\\"
else if c == '\"' then "\\\""
else if c == '?' then "\\?" -- avoid trigraphs
else if c.toNat <= 31 then
"\\x" ++ toHexDigit (c.toNat / 16) ++ toHexDigit (c.toNat % 16)
-- TODO(Leo): we should use `\unnnn` for escaping unicode characters.
else String.singleton c)
q;
q ++ "\""
def emitNumLit (t : IRType) (v : Nat) : M Unit := do
if t.isObj then
if v < UInt32.size then
@@ -670,7 +804,7 @@ def emitDeclAux (d : Decl) : M Unit := do
let env ← getEnv
let (_, jpMap) := mkVarJPMaps d
withReader (fun ctx => { ctx with jpMap := jpMap }) do
unless hasInitAttr env d.name do
unless hasInitAttr env d.name || isSimpleGroundDecl env d.name do
match d with
| .fdecl (f := f) (xs := xs) (type := t) (body := b) .. =>
let baseName ← toCName f;
@@ -749,7 +883,8 @@ def emitDeclInit (d : Decl) : M Unit := do
if getBuiltinInitFnNameFor? env d.name |>.isSome then
emit "}"
| _ =>
emitCName n; emit " = "; emitCInitName n; emitLn "();"; emitMarkPersistent d n
if !isSimpleGroundDecl env d.name then
emitCName n; emit " = "; emitCInitName n; emitLn "();"; emitMarkPersistent d n
def emitInitFn : M Unit := do
let env ← getEnv


@@ -31,6 +31,7 @@ time. These changes can likely be done similar to the ones in EmitC:
- function decls need to be fixed
- full applications need to be fixed
- tail calls need to be fixed
- closed term static initializers
-/
def leanMainFn := "_lean_main"
@@ -537,14 +538,12 @@ def emitFnDecls : M llvmctx Unit := do
let env ← getEnv
let decls := getDecls env
let modDecls : NameSet := decls.foldl (fun s d => s.insert d.name) {}
let usedDecls : NameSet := decls.foldl (fun s d => collectUsedDecls env d (s.insert d.name)) {}
let usedDecls := usedDecls.toList
for n in usedDecls do
let decl ← getDecl n
let usedDecls := collectUsedDecls env decls
usedDecls.forM fun n => do
let decl ← getDecl n;
match getExternNameFor env `c decl.name with
| some cName => emitExternDeclAux decl cName
| none => emitFnDecl decl (!modDecls.contains n)
return ()
def emitLhsSlot_ (x : VarId) : M llvmctx (LLVM.LLVMType llvmctx × LLVM.Value llvmctx) := do
let state get


@@ -25,10 +25,19 @@ def usesModuleFrom (env : Environment) (modulePrefix : Name) : Bool :=
namespace CollectUsedDecls
abbrev M := ReaderT Environment (StateM NameSet)
structure State where
set : NameSet := {}
order : Array Name := #[]
abbrev M := ReaderT Environment (StateM State)
@[inline] def collect (f : FunId) : M Unit :=
modify fun s => s.insert f
modify fun { set, order } =>
let (contained, set) := set.containsThenInsert f
if !contained then
{ set, order := order.push f }
else
{ set, order }
partial def collectFnBody : FnBody → M Unit
| .vdecl _ _ v b =>
@@ -46,14 +55,19 @@ def collectInitDecl (fn : Name) : M Unit := do
| some initFn => collect initFn
| _ => pure ()
def collectDecl : Decl → M NameSet
| .fdecl (f := f) (body := b) .. => collectInitDecl f *> CollectUsedDecls.collectFnBody b *> get
| .extern (f := f) .. => collectInitDecl f *> get
def collectDecl : Decl → M Unit
| .fdecl (f := f) (body := b) .. => collectInitDecl f *> CollectUsedDecls.collectFnBody b
| .extern (f := f) .. => collectInitDecl f
def collectDeclLoop (decls : List Decl) : M Unit := do
decls.forM fun decl => do
collectDecl decl
collect decl.name
end CollectUsedDecls
def collectUsedDecls (env : Environment) (decl : Decl) (used : NameSet := {}) : NameSet :=
(CollectUsedDecls.collectDecl decl env).run' used
def collectUsedDecls (env : Environment) (decls : List Decl) : Array Name :=
(CollectUsedDecls.collectDeclLoop decls env).run {} |>.snd.order
abbrev VarTypeMap := Std.HashMap VarId IRType
abbrev JPParamsMap := Std.HashMap JoinPointId (Array Param)


@@ -0,0 +1,355 @@
/-
Copyright (c) 2026 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Henrik Böving
-/
module
prelude
public import Lean.Compiler.IR.CompilerM
public import Lean.EnvExtension
import Lean.Compiler.ClosedTermCache
/-!
This module contains logic for detecting simple ground expressions that can be extracted into
statically initializable variables. To do this it attempts to compile declarations into
a simple language of expressions, `SimpleGroundExpr`. If this attempt succeeds it stores the result
in an environment extension, accessible through `getSimpleGroundExpr`. Later on the code emission
step can reference this environment extension to generate static initializers for the respective
declaration.
-/
namespace Lean
namespace IR
/--
An argument to a `SimpleGroundExpr`. They get compiled to `lean_object*` in various ways.
-/
public inductive SimpleGroundArg where
/--
A simple tagged literal.
-/
| tagged (val : Nat)
/--
A reference to another declaration that was marked as a simple ground expression. This gets
compiled to a reference to the mangled version of the name.
-/
| reference (n : Name)
/--
A reference directly to a raw C name. This gets compiled to a reference to the name directly.
-/
| rawReference (s : String)
deriving Inhabited
/--
A simple ground expression that can be turned into a static initializer.
-/
public inductive SimpleGroundExpr where
/--
Represents a `lean_ctor_object`. Crucially the `scalarArgs` array must have a size that is a
multiple of 8.
-/
| ctor (cidx : Nat) (objArgs : Array SimpleGroundArg) (usizeArgs : Array USize) (scalarArgs : Array UInt8)
/--
A string literal, represented by a `lean_string_object`.
-/
| string (data : String)
/--
A partial application, represented by a `lean_closure_object`.
-/
| pap (func : FunId) (args : Array SimpleGroundArg)
/--
An application of `Lean.Name.mkStrX`. This expression is represented separately to ensure that
long name literals get extracted into statically initializable constants. The arguments contain
both the name of the string literal it references as well as the hash of the name up to that
point. This is done to make emitting the literal as simple as possible.
-/
| nameMkStr (args : Array (Name × UInt64))
/--
A reference to another declaration that was marked as a simple ground expression. This gets
compiled to a reference to the mangled version of the name.
-/
| reference (n : Name)
deriving Inhabited
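For intuition, here are hypothetical declarations (not from this PR) together with the `SimpleGroundExpr` shapes the detection pass would plausibly produce for them:

```lean
-- Hypothetical examples (not part of this diff):
def greeting : String := "hello"
-- would compile to: .string "hello"

def point : Nat × Nat := (1, 2)
-- small literals are tagged scalars, so this would compile to:
-- .ctor 0 #[.tagged 1, .tagged 2] #[] #[]
```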
public structure SimpleGroundExtState where
constNames : PHashMap Name SimpleGroundExpr := {}
revNames : List Name := []
deriving Inhabited
builtin_initialize simpleGroundDeclExt : EnvExtension SimpleGroundExtState ←
registerEnvExtension (pure {}) (asyncMode := .sync)
(replay? := some fun oldState newState _ s =>
let newNames := newState.revNames.take (newState.revNames.length - oldState.revNames.length)
newNames.foldl (init := s) fun s n =>
let g := newState.constNames.find! n
{ s with constNames := s.constNames.insert n g, revNames := n :: s.revNames }
)
/--
Record `declName` as mapping to the simple ground expr `expr`.
-/
public def addSimpleGroundDecl (env : Environment) (declName : Name) (expr : SimpleGroundExpr) :
Environment :=
simpleGroundDeclExt.modifyState env fun s =>
{ s with constNames := s.constNames.insert declName expr, revNames := declName :: s.revNames }
/--
Attempt to fetch a `SimpleGroundExpr` associated with `declName` if it exists.
-/
public def getSimpleGroundExpr (env : Environment) (declName : Name) : Option SimpleGroundExpr :=
(simpleGroundDeclExt.getState env).constNames.find? declName
/--
Like `getSimpleGroundExpr` but recursively traverses `reference` exprs to get to actual ground
values.
-/
public def getSimpleGroundExprWithResolvedRefs (env : Environment) (declName : Name) :
Option SimpleGroundExpr := Id.run do
let mut declName := declName
while true do
let val := getSimpleGroundExpr env declName
match val with
| some (.reference ref) => declName := ref
| other => return other
return none
/--
Check if `declName` is recorded as being a `SimpleGroundExpr`.
-/
public def isSimpleGroundDecl (env : Environment) (declName : Name) : Bool :=
(simpleGroundDeclExt.getState env).constNames.contains declName
public def uint64ToByteArrayLE (n : UInt64) : Array UInt8 :=
#[
n.toUInt8,
(n >>> 0x08).toUInt8,
(n >>> 0x10).toUInt8,
(n >>> 0x18).toUInt8,
(n >>> 0x20).toUInt8,
(n >>> 0x28).toUInt8,
(n >>> 0x30).toUInt8,
(n >>> 0x38).toUInt8,
]
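`uint64ToByteArrayLE` emits the least significant byte first (illustrative check, not part of this diff):

```lean
-- Little-endian: the low byte 0x08 comes first (illustration, not in the diff).
example : uint64ToByteArrayLE 0x0102030405060708 =
    #[0x08, 0x07, 0x06, 0x05, 0x04, 0x03, 0x02, 0x01] := by native_decide
```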
inductive SimpleGroundValue where
| arg (arg : SimpleGroundArg)
| uint8 (val : UInt8)
| uint16 (val : UInt16)
| uint32 (val : UInt32)
| uint64 (val : UInt64)
| usize (val : USize)
deriving Inhabited
structure State where
groundMap : Std.HashMap VarId SimpleGroundValue := {}
abbrev M := StateRefT State $ OptionT CompilerM
/--
Attempt to compile `b` into a `SimpleGroundExpr`. If `b` is not compileable return `none`.
The compiler currently supports the following patterns:
- String literals
- Partial applications with other simple expressions
- Constructor calls with other simple expressions
- `Name.mkStrX`, `Name.str._override`, and `Name.num._override`
- references to other declarations marked as simple ground expressions
-/
partial def compileToSimpleGroundExpr (b : FnBody) : CompilerM (Option SimpleGroundExpr) :=
compileFnBody b |>.run' {} |>.run
where
compileFnBody (b : FnBody) : M SimpleGroundExpr := do
match b with
| .vdecl id _ expr (.ret (.var id')) =>
guard <| id == id'
compileFinalExpr expr
| .vdecl id ty expr b => compileNonFinalExpr id ty expr b
| _ => failure
@[inline]
record (id : VarId) (val : SimpleGroundValue) : M Unit :=
modify fun s => { s with groundMap := s.groundMap.insert id val }
compileNonFinalExpr (id : VarId) (ty : IRType) (expr : Expr) (b : FnBody) : M SimpleGroundExpr := do
match expr with
| .fap c #[] =>
guard <| isSimpleGroundDecl (← getEnv) c
record id (.arg (.reference c))
compileFnBody b
| .lit v =>
match v with
| .num v =>
match ty with
| .tagged =>
guard <| v < 2^31
record id (.arg (.tagged v))
| .uint8 => record id (.uint8 (.ofNat v))
| .uint16 => record id (.uint16 (.ofNat v))
| .uint32 => record id (.uint32 (.ofNat v))
| .uint64 => record id (.uint64 (.ofNat v))
| .usize => record id (.usize (.ofNat v))
| _ => failure
compileFnBody b
| .str .. => failure
| .ctor i objArgs =>
if i.isScalar then
record id (.arg (.tagged i.cidx))
compileFnBody b
else
let objArgs ← compileArgs objArgs
let usizeArgs := Array.replicate i.usize 0
-- Align to 8 bytes for alignment with lean_object*
let align (v a : Nat) : Nat :=
(v / a) * a + a * (if v % a != 0 then 1 else 0)
let alignedSsize := align i.ssize 8
let ssizeArgs := Array.replicate alignedSsize 0
compileSetChain id i objArgs usizeArgs ssizeArgs b
| _ => failure
compileSetChain (id : VarId) (info : CtorInfo) (objArgs : Array SimpleGroundArg) (usizeArgs : Array USize)
(scalarArgs : Array UInt8) (b : FnBody) : M SimpleGroundExpr := do
match b with
| .ret (.var id') =>
guard <| id == id'
return .ctor info.cidx objArgs usizeArgs scalarArgs
| .sset id' i offset y _ b =>
guard <| id == id'
let i := i - objArgs.size - usizeArgs.size
let offset := i * 8 + offset
let scalarArgs ←
match (← get).groundMap[y]! with
| .uint8 v =>
let scalarArgs := scalarArgs.set! offset v
pure scalarArgs
| .uint16 v =>
let scalarArgs := scalarArgs.set! offset v.toUInt8
let scalarArgs := scalarArgs.set! (offset + 1) (v >>> 0x08).toUInt8
pure scalarArgs
| .uint32 v =>
let scalarArgs := scalarArgs.set! offset v.toUInt8
let scalarArgs := scalarArgs.set! (offset + 1) (v >>> 0x08).toUInt8
let scalarArgs := scalarArgs.set! (offset + 2) (v >>> 0x10).toUInt8
let scalarArgs := scalarArgs.set! (offset + 3) (v >>> 0x18).toUInt8
pure scalarArgs
| .uint64 v =>
let scalarArgs := scalarArgs.set! offset v.toUInt8
let scalarArgs := scalarArgs.set! (offset + 1) (v >>> 0x08).toUInt8
let scalarArgs := scalarArgs.set! (offset + 2) (v >>> 0x10).toUInt8
let scalarArgs := scalarArgs.set! (offset + 3) (v >>> 0x18).toUInt8
let scalarArgs := scalarArgs.set! (offset + 4) (v >>> 0x20).toUInt8
let scalarArgs := scalarArgs.set! (offset + 5) (v >>> 0x28).toUInt8
let scalarArgs := scalarArgs.set! (offset + 6) (v >>> 0x30).toUInt8
let scalarArgs := scalarArgs.set! (offset + 7) (v >>> 0x38).toUInt8
pure scalarArgs
| _ => failure
compileSetChain id info objArgs usizeArgs scalarArgs b
| .uset id' i y b =>
guard <| id == id'
let i := i - objArgs.size
let .usize v := (← get).groundMap[y]! | failure
let usizeArgs := usizeArgs.set! i v
compileSetChain id info objArgs usizeArgs scalarArgs b
| _ => failure
compileFinalExpr (e : Expr) : M SimpleGroundExpr := do
match e with
| .lit v =>
match v with
| .str v => return .string v
| .num .. => failure
| .ctor i args =>
guard <| i.usize == 0 && i.ssize == 0 && !args.isEmpty
return .ctor i.cidx (← compileArgs args) #[] #[]
| .fap ``Name.num._override args =>
let pre ← compileArg args[0]!
let .tagged i ← compileArg args[1]! | failure
let name := Name.num (← interpNameLiteral pre) i
let hash := name.hash
return .ctor 2 #[pre, .tagged i] #[] (uint64ToByteArrayLE hash)
| .fap ``Name.str._override args =>
let pre ← compileArg args[0]!
let (ref, str) ← compileStrArg args[1]!
let name := Name.str (← interpNameLiteral pre) str
let hash := name.hash
return .ctor 1 #[pre, .reference ref] #[] (uint64ToByteArrayLE hash)
| .fap ``Name.mkStr1 args
| .fap ``Name.mkStr2 args
| .fap ``Name.mkStr3 args
| .fap ``Name.mkStr4 args
| .fap ``Name.mkStr5 args
| .fap ``Name.mkStr6 args
| .fap ``Name.mkStr7 args
| .fap ``Name.mkStr8 args =>
let mut nameAcc := Name.anonymous
let mut processedArgs := Array.emptyWithCapacity args.size
for arg in args do
let (ref, str) ← compileStrArg arg
nameAcc := .str nameAcc str
processedArgs := processedArgs.push (ref, nameAcc.hash)
return .nameMkStr processedArgs
| .pap c ys => return .pap c (← compileArgs ys)
| .fap c #[] =>
guard <| isSimpleGroundDecl (← getEnv) c
return .reference c
| _ => failure
compileArg (arg : Arg) : M SimpleGroundArg := do
match arg with
| .var var =>
let .arg arg := (← get).groundMap[var]! | failure
return arg
| .erased => return .tagged 0
compileArgs (args : Array Arg) : M (Array SimpleGroundArg) := do
args.mapM compileArg
compileStrArg (arg : Arg) : M (Name × String) := do
let .var var := arg | failure
let (.arg (.reference ref)) := (← get).groundMap[var]! | failure
let some (.string val) := getSimpleGroundExprWithResolvedRefs (← getEnv) ref | failure
return (ref, val)
interpStringLiteral (arg : SimpleGroundArg) : M String := do
let .reference ref := arg | failure
let some (.string val) := getSimpleGroundExprWithResolvedRefs (← getEnv) ref | failure
return val
interpNameLiteral (arg : SimpleGroundArg) : M Name := do
match arg with
| .tagged 0 => return .anonymous
| .reference ref =>
match getSimpleGroundExprWithResolvedRefs (← getEnv) ref with
| some (.ctor 1 #[pre, .reference ref] _ _) =>
let pre ← interpNameLiteral pre
let str ← interpStringLiteral (.reference ref)
return .str pre str
| some (.ctor 2 #[pre, .tagged i] _ _) =>
let pre ← interpNameLiteral pre
return .num pre i
| some (.nameMkStr args) =>
args.foldlM (init := .anonymous) fun acc (ref, _) => do
let part ← interpStringLiteral (.reference ref)
return .str acc part
| _ => failure
| _ => failure
/--
Detect whether `d` can be compiled to a `SimpleGroundExpr`. If it can record the associated
`SimpleGroundExpr` into the environment for later processing by code emission.
-/
public def Decl.detectSimpleGround (d : Decl) : CompilerM Unit := do
let .fdecl (body := body) (xs := params) (type := type) .. := d | return ()
if type.isPossibleRef && params.isEmpty then
if let some groundExpr ← compileToSimpleGroundExpr body then
trace[compiler.ir.simple_ground] m!"Marked {d.name} as simple ground expr"
modifyEnv fun env => addSimpleGroundDecl env d.name groundExpr
builtin_initialize registerTraceClass `compiler.ir.simple_ground (inherited := true)
end IR
end Lean


@@ -5,13 +5,15 @@ Authors: Leonardo de Moura
-/
module
prelude
public import Lean.Compiler.Options
public import Lean.Compiler.IR
public import Lean.Compiler.LCNF.Passes
public import Lean.Compiler.LCNF.ToDecl
public import Lean.Compiler.LCNF.Check
import Lean.Compiler.Options
import Lean.Compiler.IR
import Lean.Compiler.LCNF.Passes
import Lean.Compiler.LCNF.ToDecl
import Lean.Compiler.LCNF.Check
import Lean.Meta.Match.MatcherInfo
import Lean.Compiler.LCNF.SplitSCC
public import Lean.Compiler.IR.Basic
public import Lean.Compiler.LCNF.CompilerM
public section
namespace Lean.Compiler.LCNF
/--


@@ -293,26 +293,18 @@ instance : FromJson ModuleRefs where
node.foldlM (init := ∅) fun m k v =>
return m.insert (← RefIdent.fromJson? (← Json.parse k)) (← fromJson? v)
/--
Used in the `$/lean/ileanHeaderInfo` watchdog <- worker notifications.
Contains the direct imports of the file managed by a worker.
-/
structure LeanILeanHeaderInfoParams where
/-- Version of the file these imports are from. -/
version : Nat
/-- Direct imports of this file. -/
directImports : Array ImportInfo
deriving FromJson, ToJson
/--
Used in the `$/lean/ileanHeaderSetupInfo` watchdog <- worker notifications.
Contains status information about `lake setup-file`.
Contains status information about `lake setup-file` and the direct imports of the file managed by
a worker.
-/
structure LeanILeanHeaderSetupInfoParams where
/-- Version of the file these imports are from. -/
version : Nat
/-- Whether setting up the header using `lake setup-file` has failed. -/
isSetupFailure : Bool
/-- Direct imports of this file. -/
directImports : Array ImportInfo
deriving FromJson, ToJson
/--


@@ -122,7 +122,7 @@ instance : FromJson CompletionIdentifier where
| _ => .error "Expected string in `FromJson` instance of `CompletionIdentifier`."
structure ResolvableCompletionItemData where
mod : Name
uri : DocumentUri
pos : Position
/-- Position of the completion info that this completion item was created from. -/
cPos? : Option Nat := none
@@ -132,7 +132,7 @@ structure ResolvableCompletionItemData where
instance : ToJson ResolvableCompletionItemData where
toJson d := Id.run do
let mut arr : Array Json := #[
toJson d.mod,
toJson d.uri,
d.pos.line,
d.pos.character,
]
@@ -147,7 +147,7 @@ instance : FromJson ResolvableCompletionItemData where
| .arr elems => do
if elems.size < 3 then
.error "Expected array of size 3 in `FromJson` instance of `ResolvableCompletionItemData`"
let mod : Name ← fromJson? elems[0]!
let uri : DocumentUri ← fromJson? elems[0]!
let line : Nat ← fromJson? elems[1]!
let character : Nat ← fromJson? elems[2]!
let mut cPos? : Option Nat := none
@@ -162,7 +162,7 @@ instance : FromJson ResolvableCompletionItemData where
id? := some id
let pos := { line, character }
return {
mod
uri
pos
cPos?
id?


@@ -1272,9 +1272,6 @@ private def addProjections (params : Array Expr) (r : ElabHeaderResult) (fieldIn
for fieldInfo in fieldInfos do
if fieldInfo.kind.isSubobject then
addDeclarationRangesFromSyntax fieldInfo.declName r.view.ref fieldInfo.ref
for decl in projDecls do
-- projections may generate equation theorems
enableRealizationsForConst decl.projName
private def registerStructure (structName : Name) (infos : Array StructFieldInfo) : TermElabM Unit := do
let fields infos.filterMapM fun info => do
@@ -1563,6 +1560,11 @@ def elabStructureCommand : InductiveElabDescr where
checkResolutionOrder view.declName
return {
finalize := do
-- Enable realizations for projections here (after @[class] attribute is applied)
-- so that the realization context has class information available.
for fieldInfo in fieldInfos do
if fieldInfo.kind.isInCtor then
enableRealizationsForConst fieldInfo.declName
if view.isClass then
addParentInstances parentInfos
}


@@ -82,13 +82,27 @@ def elabMPureIntro : Tactic
replaceMainGoal [mv]
| _ => throwUnsupportedSyntax
private def extractPureProp (e : Expr) : MetaM (Option Expr) := do
let e ← instantiateMVarsIfMVarApp e
let some (_, e) := e.app2? ``ULift.down | return none
let f := e.getAppFn
unless f.isConstOf ``SPred.pure do return none
let args := e.getAppArgs
if args.size < 2 then return none
let σs := args[0]!
let n ← TypeList.length σs
unless n = args.size - 2 do return none
let p := args[1]!
return p
partial def _root_.Lean.MVarId.applyRflAndAndIntro (mvar : MVarId) : MetaM Unit := do
-- The target might look like `(⌜?n = n ∧ ?m = b⌝ s).down`, which we reduce to
-- `?n = n ∧ ?m = b` by `whnfD`.
-- The target might look like `(⌜n = ?n ∧ ?m = b⌝ s).down`, which we reduce to
-- `n = ?n ∧ ?m = b` with `extractPureProp`.
-- (Recall that `⌜s = 4⌝ s` is `SPred.pure (σs:=[Nat]) (s = 4) s` and `SPred.pure` is
-- semi-reducible.)
let ty ← whnfD (← mvar.getType)
trace[Elab.Tactic.Do.spec] "whnf: {ty}"
let ty ← mvar.getType >>= instantiateMVarsIfMVarApp
let ty ← (·.getD ty) <$> extractPureProp ty
trace[Elab.Tactic.Do.spec] "pure Prop: {ty}"
if ty.isAppOf ``True then
mvar.assign (mkConst ``True.intro)
else if let some (lhs, rhs) := ty.app2? ``And then
@@ -127,16 +141,3 @@ def MGoal.pureTrivial (goal : MGoal) : OptionT MetaM Expr := do
return ((), m)
return prf
catch _ => failure
/-
def MGoal.pureRfl (goal : MGoal) : OptionT MetaM Expr := do
let mv ← mkFreshExprMVar goal.toExpr
let ([], _) ← try runTactic mv.mvarId! (← `(tactic| apply $(mkIdent ``Std.Do.SPred.Tactic.Pure.intro); rfl)) catch _ => failure
| failure
return mv
def MGoal.pureRfl (goal : MGoal) : OptionT MetaM Expr := do
let mv ← mkFreshExprMVar goal.toExpr
let ([], _) ← try runTactic mv.mvarId! (← `(tactic| apply $(mkIdent ``Std.Do.SPred.Tactic.Pure.intro); rfl)) catch _ => failure
| failure
return mv
-/


@@ -199,7 +199,7 @@ public def mSpec (goal : MGoal) (elabSpecAtWP : Expr → n SpecTheorem) (goalTag
let_expr f@Triple m ps instWP α prog P Q := specTy
| liftMetaM <| throwError "target not a Triple application {specTy}"
let wp' := mkApp5 (mkConst ``WP.wp f.constLevels!) m ps instWP α prog
unless (← withAssignableSyntheticOpaque <| isDefEqGuarded wp wp') do
unless (← isDefEqGuarded wp wp') do
Term.throwTypeMismatchError none wp wp' spec
-- Try synthesizing synthetic MVars. We don't have the convenience of `TermElabM`, hence


@@ -209,8 +209,8 @@ def SuccessPoint.clause (p : SuccessPoint) : Expr :=
/-- The last syntactic element of a `FailureCond`. -/
inductive ExceptCondsDefault where
/-- `()`. This means we can suggest `post⟨...⟩`. -/
| unit
/-- `PUnit.unit`. This means we can suggest `post⟨...⟩`. -/
| punit
/-- `ExceptConds.false`. This means we can suggest `⇓ _ => _`. -/
| false
/-- `ExceptConds.true`. This means we can suggest `⇓? _ => _`. -/
@@ -229,7 +229,7 @@ When the default is not defeq to `ExceptConds.false`, we use it as the default.
-/
structure FailureCondHints where
points : Array Expr := #[]
default : ExceptCondsDefault := .unit
default : ExceptCondsDefault := .punit
/-- Look at how `inv` is used in the `vcs` and collect hints about how `inv` should be instantiated.
In case it succeeds, there will be
@@ -293,8 +293,8 @@ def collectInvariantHints (vcs : Array MVarId) (inv : MVarId) (xs : Expr) (letMu
-- Just overwrite the existing entry. Computing a join here is overkill for the few cases
-- where this is going to be used.
failureConds := { failureConds with points := points }
if conds.isConstOf ``Unit.unit then
failureConds := { failureConds with default := .unit }
if conds.isConstOf ``PUnit.unit then
failureConds := { failureConds with default := .punit }
else if conds.isAppOfArity ``ExceptConds.false 1 then
failureConds := { failureConds with default := .false }
else if conds.isAppOfArity ``ExceptConds.true 1 then
@@ -402,8 +402,8 @@ public def suggestInvariant (vcs : Array MVarId) (inv : MVarId) : TacticM Term :
-- 2. However, on early return we want to suggest something using `Invariant.withEarlyReturn`.
-- 3. When there are non-`False` failure conditions, we cannot suggest `⇓ ⟨xs, letMuts⟩ => ...`.
-- We might be able to suggest `⇓? ⟨xs, letMuts⟩ => ...` (`True` failure condition),
-- or `post⟨...⟩` (more than 0 failure handlers, but ending in `()`), and fall back to
-- `by exact ⟨...⟩` (not ending in `()`).
-- or `post⟨...⟩` (more than 0 failure handlers, but ending in `PUnit.unit`), and fall back to
-- `by exact ⟨...⟩` (not ending in `PUnit.unit`).
-- 4. Similarly for the `onExcept` argument of `Invariant.withEarlyReturn`.
-- Hence the spaghetti code.
--
@@ -429,7 +429,7 @@ public def suggestInvariant (vcs : Array MVarId) (inv : MVarId) : TacticM Term :
-- Now the configuration mess.
if failureConds.points.isEmpty then
match failureConds.default with
| .false | .unit =>
| .false | .punit =>
`(Invariant.withEarlyReturn (onReturn := fun r letMuts => $onReturn) (onContinue := fun xs letMuts => $onContinue))
-- we handle the following two cases here rather than through
-- `postCondWithMultipleConditions` below because that would insert a superfluous `by exact _`.
@@ -469,7 +469,7 @@ where
postCondWithMultipleConditions (handlers : Array Term) (default : ExceptCondsDefault) : MetaM Term := do
let handlers := Syntax.TSepArray.ofElems (sep := ",") handlers
match default with
| .unit => `(post$handlers,*)
| .punit => `(post$handlers,*)
-- See the comment in `post⟨_⟩` syntax for why we emit `by exact` here.
| .false => `(by exact $handlers,*, ExceptConds.false)
| .true => `(by exact $handlers,*, ExceptConds.true)


@@ -7,6 +7,7 @@ module
prelude
import Init.Grind.Lint
import Lean.Elab.Tactic.Grind.Lint
import Std
-- We allow these as grind lemmas even though they trigger >20 further instantiations.
-- See tests/lean/run/grind_lint_*.lean for more details.
@@ -16,8 +17,18 @@ import Lean.Elab.Tactic.Grind.Lint
#grind_lint skip List.Sublist.append
#grind_lint skip List.Sublist.middle
#grind_lint skip List.getLast?_pmap
#grind_lint skip List.getLast_attach
#grind_lint skip List.getLast_attachWith
#grind_lint skip List.head_attachWith
#grind_lint skip Array.back_singleton
#grind_lint skip Array.count_singleton
#grind_lint skip Array.foldl_empty
#grind_lint skip Array.foldr_empty
#grind_lint skip Array.getElem_zero_filter
#grind_lint skip Array.getElem_zero_filterMap
#grind_lint skip Std.ExtHashMap.getElem_filterMap'
#grind_lint skip Std.ExtTreeMap.getElem_filterMap'
#grind_lint skip Std.HashMap.getElem_filterMap'
#grind_lint skip Std.TreeMap.getElem_filterMap'
#grind_lint skip suffix sizeOf_spec


@@ -87,10 +87,14 @@ private def mkDischargeWrapper (optDischargeSyntax : Syntax) : TacticM Simp.Disc
`optConfig` is of the form `("(" "config" ":=" term ")")?`
-/
def elabSimpConfig (optConfig : Syntax) (kind : SimpKind) : TacticM Meta.Simp.Config := do
match kind with
| .simp => elabSimpConfigCore optConfig
| .simpAll => return (← elabSimpConfigCtxCore optConfig).toConfig
| .dsimp => return { (← elabDSimpConfigCore optConfig) with }
let cfg ← match kind with
| .simp => elabSimpConfigCore optConfig
| .simpAll => pure (← elabSimpConfigCtxCore optConfig).toConfig
| .dsimp => pure { (← elabDSimpConfigCore optConfig) with }
if Simp.backward.dsimp.instances.get (← getOptions) then
return { cfg with instances := true }
else
return cfg
inductive ResolveSimpIdResult where
| none


@@ -66,7 +66,7 @@ unsafe def fold {α : Type} (f : Name → α → MetaM α) (e : Expr) (acc : α)
| .app f a =>
let fi ← getFunInfo f (some 1)
if fi.paramInfo[0]!.isInstImplicit then
-- Don't visit implicit arguments.
-- Don't visit instance implicit arguments.
visit f acc
else
visit a (← visit f acc)


@@ -117,7 +117,7 @@ where
let infos ← getParamsInfo aFn aArgs.size
for i in *...infos.size do
-- We ignore instance implicit arguments during comparison
if !infos[i]!.isInstImplicit then
if !infos[i]!.isInstance then
if (← lt aArgs[i]! bArgs[i]!) then
return true
else if (← lt bArgs[i]! aArgs[i]!) then
@@ -155,7 +155,7 @@ where
let infos ← getParamsInfo f args.size
for i in *...infos.size do
-- We ignore instance implicit arguments during comparison
if !infos[i]!.isInstImplicit then
if !infos[i]!.isInstance then
if !(← lt args[i]! b) then
return false
for h : i in infos.size...args.size do


@@ -241,6 +241,8 @@ structure ParamInfo where
This information affects the generation of congruence theorems.
-/
isDecInst : Bool := false
/-- `isInstance` is true if the parameter type is a class instance. -/
isInstance : Bool := false
/--
`higherOrderOutParam` is true if this parameter is a higher-order output parameter
of local instance.


@@ -197,7 +197,7 @@ def getCongrSimpKinds (f : Expr) (info : FunInfo) : MetaM (Array CongrArgKind) :
result := result.push .fixed
else if info.paramInfo[i].isProp then
result := result.push .cast
else if info.paramInfo[i].isInstImplicit then
else if info.paramInfo[i].isInstance then
if let some mask := mask? then
if h2 : i < mask.size then
if mask[i] then
@@ -226,7 +226,7 @@ def getCongrSimpKindsForArgZero (info : FunInfo) : MetaM (Array CongrArgKind) :=
result := result.push .eq
else if info.paramInfo[i].isProp then
result := result.push .cast
else if info.paramInfo[i].isInstImplicit then
else if info.paramInfo[i].isInstance then
if shouldUseSubsingletonInst info result i then
result := result.push .subsingletonInst
else


@@ -104,7 +104,7 @@ private def tmpStar := mkMVar tmpMVarId
private def ignoreArg (a : Expr) (i : Nat) (infos : Array ParamInfo) : MetaM Bool := do
if h : i < infos.size then
let info := infos[i]
if info.isInstImplicit then
if info.isInstance then
return true
else if info.isImplicit || info.isStrictImplicit then
return !(← isType a)


@@ -300,7 +300,13 @@ private partial def isDefEqArgs (f : Expr) (args₁ args₂ : Array Expr) : Meta
let a₁ := args₁[i]!
let a₂ := args₂[i]!
let info := finfo.paramInfo[i]!
if info.isInstImplicit then
/-
**Note**: We use `binderInfo.isInstImplicit` (the `[..]` annotation) rather than
`info.isInstance` (whether the type is a class). Type class synthesis should only
be triggered for parameters explicitly marked for instance resolution, not merely
for parameters whose types happen to be class types.
-/
if info.binderInfo.isInstImplicit then
discard <| trySynthPending a₁
discard <| trySynthPending a₂
unless (← withInferTypeConfig <| Meta.isExprDefEqAux a₁ a₂) do
@@ -309,7 +315,7 @@ private partial def isDefEqArgs (f : Expr) (args₁ args₂ : Array Expr) : Meta
let a₁ := args₁[i]!
let a₂ := args₂[i]!
let info := finfo.paramInfo[i]!
if info.isInstImplicit then
if info.isInstance then
unless (← withInferTypeConfig <| Meta.isExprDefEqAux a₁ a₂) do
return false
else


@@ -85,26 +85,50 @@ private def getFunInfoAux (fn : Expr) (maxArgs? : Option Nat) : MetaM FunInfo :=
let dependsOnHigherOrderOutParam :=
!higherOrderOutParams.isEmpty
&& Option.isSome (decl.type.find? fun e => e.isFVar && higherOrderOutParams.contains e.fvarId!)
let className? ← isClass? decl.type
/-
**Note**: We use `isClass? decl.type` instead of relying solely on binder annotations
(`[...]`) because users sometimes use implicit binders for class types when the instance
is already available from context. For example:
```
structure OrdSet (α : Type) [Hashable α] [BEq α] where ...
def OrdSet.insert {_ : Hashable α} {_ : BEq α} (s : OrdSet α) (a : α) : OrdSet α := ...
```
Here, `Hashable` and `BEq` are classes, but implicit binders are used because the
instances come from `OrdSet`'s parameters.
However, we also require the binder to be non-explicit because structures extending
classes use explicit binders for their constructor parameters:
```
structure MyGroupTopology (α : Type) extends MyTopology α, IsContinuousMul α
-- Constructor type:
-- MyGroupTopology.mk (toMyTopology : MyTopology α) (toIsContinuousMul : IsContinuousMul α) : ...
```
These explicit parameters should not be treated as instances for automation purposes.
-/
let isInstance := className?.isSome && !decl.binderInfo.isExplicit
paramInfo := updateHasFwdDeps paramInfo backDeps
paramInfo := paramInfo.push {
backDeps, dependsOnHigherOrderOutParam
backDeps, dependsOnHigherOrderOutParam, isInstance
binderInfo := decl.binderInfo
isProp := (← isProp decl.type)
isDecInst := (← forallTelescopeReducing decl.type fun _ type => return type.isAppOf ``Decidable)
}
if decl.binderInfo == .instImplicit then
/- Collect higher order output parameters of this class -/
if let some className ← isClass? decl.type then
if let some outParamPositions := getOutParamPositions? (← getEnv) className then
unless outParamPositions.isEmpty do
let args := decl.type.getAppArgs
for h2 : i in *...args.size do
if outParamPositions.contains i then
let arg := args[i]
if let some idx := fvars.idxOf? arg then
if (← whnf (← inferType arg)).isForall then
paramInfo := paramInfo.modify idx fun info => { info with higherOrderOutParam := true }
higherOrderOutParams := higherOrderOutParams.insert arg.fvarId!
if isInstance then
/- Collect higher order output parameters of this class IF `isInstance` is `true` -/
let some className := className? | unreachable!
if let some outParamPositions := getOutParamPositions? (← getEnv) className then
unless outParamPositions.isEmpty do
let args := decl.type.getAppArgs
for h2 : i in *...args.size do
if outParamPositions.contains i then
let arg := args[i]
if let some idx := fvars.idxOf? arg then
if (← whnf (← inferType arg)).isForall then
paramInfo := paramInfo.modify idx fun info => { info with higherOrderOutParam := true }
higherOrderOutParams := higherOrderOutParams.insert arg.fvarId!
let resultDeps := collectDeps fvars type
paramInfo := updateHasFwdDeps paramInfo resultDeps
return { resultDeps, paramInfo }
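The binder/class mismatch handled above can be written out as ordinary Lean code; this sketch is adapted from the commit message (`OrdSet` is illustrative, and the body is a stand-in):

```lean
structure OrdSet (α : Type) [Hashable α] [BEq α] where
  data : List α

-- `{_ : Hashable α}` has a class type but is *not* an `[..]` binder:
-- `isClass?` reports a class, while `binderInfo.isInstImplicit` is false.
def OrdSet.insert {α : Type} {_ : Hashable α} {_ : BEq α}
    (s : OrdSet α) (a : α) : OrdSet α :=
  { s with data := a :: s.data }
```

Such parameters are now classified via `isInstance` for congruence and automation purposes, without triggering type class synthesis during unification.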


@@ -139,13 +139,14 @@ private partial def andProjections (e : Expr) : MetaM (Array Expr) := do
return acc.push e
go e (← inferType e) #[]
private def mkInjectiveEqTheoremValue (ctorName : Name) (targetType : Expr) : MetaM Expr := do
private def mkInjectiveEqTheoremValue (ctorVal : ConstructorVal) (targetType : Expr) : MetaM Expr := do
forallTelescopeReducing targetType fun xs type => do
let mvar ← mkFreshExprSyntheticOpaqueMVar type
let [mvarId₁, mvarId₂] ← mvar.mvarId!.apply (mkConst ``Eq.propIntro)
| throwError "unexpected number of subgoals when proving injective theorem for constructor `{ctorName}`"
let (h, mvarId₁) ← mvarId₁.intro1
solveEqOfCtorEq ctorName mvarId₁ h
| throwError "unexpected number of subgoals when proving injective theorem for constructor `{ctorVal.name}`"
let injPrf := mkConst (mkInjectiveTheoremNameFor ctorVal.name) (ctorVal.levelParams.map mkLevelParam)
let injPrf := mkAppN injPrf xs
mvarId₁.assign injPrf
let mut mvarId₂ := mvarId₂
while true do
let t ← mvarId₂.getType
@@ -158,7 +159,7 @@ private def mkInjectiveEqTheoremValue (ctorName : Name) (targetType : Expr) : Me
| _ => pure ()
let (h, mvarId₂') ← mvarId₂.intro1
(_, mvarId₂) ← substEq mvarId₂' h
try mvarId₂.refl catch _ => throwError (injTheoremFailureHeader ctorName)
try mvarId₂.refl catch _ => throwError (injTheoremFailureHeader ctorVal.name)
mkLambdaFVars xs mvar
private def mkInjectiveEqTheorem (ctorVal : ConstructorVal) : MetaM Unit := do
@@ -167,7 +168,7 @@ private def mkInjectiveEqTheorem (ctorVal : ConstructorVal) : MetaM Unit := do
let some type ← mkInjectiveEqTheoremType? ctorVal
| return ()
trace[Meta.injective] "type: {type}"
let value ← mkInjectiveEqTheoremValue ctorVal.name type
let value ← mkInjectiveEqTheoremValue ctorVal type
addDecl <| Declaration.thmDecl {
name
levelParams := ctorVal.levelParams


@@ -78,7 +78,7 @@ def tmpStar := mkMVar tmpMVarId
def ignoreArg (a : Expr) (i : Nat) (infos : Array ParamInfo) : MetaM Bool := do
if h : i < infos.size then
let info := infos[i]
if info.isInstImplicit then
if info.isInstance then
return true
else if info.isImplicit || info.isStrictImplicit then
return !(← isType a)


@@ -292,9 +292,8 @@ def transform
let aux1 := mkAppN (mkConst matcherApp.matcherName matcherLevels.toList) params'
let aux1 := mkApp aux1 motive'
let aux1 := mkAppN aux1 discrs'
unless (← isTypeCorrect aux1) do
prependError m!"failed to transform matcher, type error when constructing new pre-splitter motive:{indentExpr aux1}\nfailed with" do
check aux1
prependError m!"failed to transform matcher, type error when constructing new pre-splitter motive:{indentExpr aux1}\nfailed with" do
check aux1
let origAltTypes ← inferArgumentTypesN matcherApp.alts.size aux1
-- We replace the matcher with the splitter
@@ -304,9 +303,8 @@ def transform
let aux2 := mkAppN (mkConst splitter matcherLevels.toList) params'
let aux2 := mkApp aux2 motive'
let aux2 := mkAppN aux2 discrs'
unless (← isTypeCorrect aux2) do
prependError m!"failed to transform matcher, type error when constructing splitter motive:{indentExpr aux2}\nfailed with" do
check aux2
prependError m!"failed to transform matcher, type error when constructing splitter motive:{indentExpr aux2}\nfailed with" do
check aux2
let altTypes ← inferArgumentTypesN matcherApp.alts.size aux2
let mut alts' := #[]
@@ -359,8 +357,7 @@ def transform
let aux := mkAppN (mkConst matcherApp.matcherName matcherLevels.toList) params'
let aux := mkApp aux motive'
let aux := mkAppN aux discrs'
unless (← isTypeCorrect aux) do
logError m!"failed to transform matcher, type error when constructing new motive:{indentExpr aux}"
prependError m!"failed to transform matcher, type error when constructing new motive:{indentExpr aux}" do
check aux
let altTypes ← inferArgumentTypesN matcherApp.alts.size aux


@@ -23,6 +23,7 @@ public import Lean.Meta.Sym.Apply
public import Lean.Meta.Sym.InferType
public import Lean.Meta.Sym.Simp
public import Lean.Meta.Sym.Util
public import Lean.Meta.Sym.Eta
public import Lean.Meta.Sym.Grind
/-!


@@ -87,6 +87,21 @@ public def mkBackwardRuleFromDecl (declName : Name) (num? : Option Nat := none)
let resultPos := mkResultPos pattern
return { expr := mkConst declName, pattern, resultPos }
/--
Creates a `BackwardRule` from an expression.
Pass a non-empty `levelParams` if the expression is supposed to be
universe polymorphic.
The `num?` parameter optionally limits how many arguments are included in the pattern
(useful for partially applying theorems).
-/
public def mkBackwardRuleFromExpr (e : Expr) (levelParams : List Name := []) (num? : Option Nat := none) : MetaM BackwardRule := do
let pattern ← mkPatternFromExpr e levelParams num?
let resultPos := mkResultPos pattern
let e := e.instantiateLevelParams levelParams (pattern.levelParams.map mkLevelParam)
return { expr := e, pattern, resultPos }
/--
Creates a value to assign to input goal metavariable using unification result.


@@ -0,0 +1,53 @@
/-
Copyright (c) 2026 Amazon.com, Inc. or its affiliates. All Rights Reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Leonardo de Moura
-/
module
prelude
public import Lean.Meta.Sym.ExprPtr
public import Lean.Meta.Basic
import Lean.Meta.Transform
namespace Lean.Meta.Sym
/--
Checks if `body` is eta-expanded with `n` applications: `f (.bvar (n-1)) ... (.bvar 0)`.
Returns `f` if so and `f` has no loose bvars; otherwise returns `default`.
- `n`: number of remaining applications to check
- `i`: expected bvar index (starts at 0, increments with each application)
- `default`: returned when not eta-reducible (enables pointer equality check)
-/
def etaReduceAux (body : Expr) (n : Nat) (i : Nat) (default : Expr) : Expr := Id.run do
match n with
| 0 => if body.hasLooseBVars then default else body
| n+1 =>
let .app f (.bvar j) := body | default
if j == i then etaReduceAux f n (i+1) default else default
/--
If `e` is of the form `(fun x₁ ... xₙ => f x₁ ... xₙ)` and `f` does not contain `x₁`, ..., `xₙ`,
then returns `f`. Otherwise, returns `e`.
Returns the original expression when not reducible to enable pointer equality checks.
-/
public def etaReduce (e : Expr) : Expr :=
go e 0
where
go (body : Expr) (n : Nat) : Expr :=
match body with
| .lam _ _ b _ => go b (n+1)
| _ => if n == 0 then e else etaReduceAux body n 0 e
/-- Returns `true` if `e` can be eta-reduced. Uses pointer equality for efficiency. -/
public def isEtaReducible (e : Expr) : Bool :=
!isSameExpr e (etaReduce e)
/-- Applies `etaReduce` to all subexpressions. Returns `e` unchanged if no subexpression is eta-reducible. -/
public def etaReduceAll (e : Expr) : MetaM Expr := do
unless Option.isSome <| e.find? isEtaReducible do return e
let pre (e : Expr) : MetaM TransformStep := do
let e' := etaReduce e
if isSameExpr e e' then return .continue
else return .visit e'
Meta.transform e (pre := pre)
end Lean.Meta.Sym
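The behavior of `etaReduce` can be checked against hand-built expressions (a sketch; the raw `Expr` literals are for illustration only):

```lean
open Lean

-- `fun a => Nat.succ a` : the body is `Nat.succ (.bvar 0)`, so the whole
-- lambda eta-reduces to the constant `Nat.succ`.
def e₁ : Expr :=
  .lam `a (.const ``Nat []) (.app (.const ``Nat.succ []) (.bvar 0)) .default

-- `fun a => Nat.add a a` : `.bvar 0` also occurs in the function part,
-- so `etaReduce` must return the original expression (pointer-equal).
def e₂ : Expr :=
  .lam `a (.const ``Nat [])
    (mkApp2 (.const ``Nat.add []) (.bvar 0) (.bvar 0)) .default
```

On `e₁`, `Lean.Meta.Sym.etaReduce` should return the constant `Nat.succ`; on `e₂` it returns the input unchanged, which is exactly what `isEtaReducible`'s pointer-equality check relies on.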


@@ -18,6 +18,7 @@ import Lean.Meta.Sym.ProofInstInfo
import Lean.Meta.Sym.AlphaShareBuilder
import Lean.Meta.Sym.LitValues
import Lean.Meta.Sym.Offset
import Lean.Meta.Sym.Eta
namespace Lean.Meta.Sym
open Internal
@@ -98,7 +99,7 @@ def isUVar? (n : Name) : Option Nat := Id.run do
return some idx
/-- Helper function for implementing `mkPatternFromDecl` and `mkEqPatternFromDecl` -/
def preprocessPattern (declName : Name) : MetaM (List Name × Expr) := do
def preprocessDeclPattern (declName : Name) : MetaM (List Name × Expr) := do
let info ← getConstInfo declName
let levelParams := info.levelParams.mapIdx fun i _ => Name.num uvarPrefix i
let us := levelParams.map mkLevelParam
@@ -106,6 +107,14 @@ def preprocessPattern (declName : Name) : MetaM (List Name × Expr) := do
let type ← preprocessType type
return (levelParams, type)
def preprocessExprPattern (e : Expr) (levelParams₀ : List Name) : MetaM (List Name × Expr) := do
let type ← inferType e
let levelParams := levelParams₀.mapIdx fun i _ => Name.num uvarPrefix i
let us := levelParams.map mkLevelParam
let type := type.instantiateLevelParams levelParams₀ us
let type ← preprocessType type
return (levelParams, type)
/--
Creates a mask indicating which pattern variables require type checking during matching.
@@ -166,6 +175,16 @@ def mkPatternCore (type : Expr) (levelParams : List Name) (varTypes : Array Expr
mkProofInstArgInfo? xs
return { levelParams, varTypes, pattern, fnInfos, varInfos?, checkTypeMask? }
def mkPatternFromType (levelParams : List Name) (type : Expr) (num? : Option Nat) : MetaM Pattern := do
let hugeNumber := 10000000
let num := num?.getD hugeNumber
let rec go (i : Nat) (pattern : Expr) (varTypes : Array Expr) : MetaM Pattern := do
if i < num then
if let .forallE _ d b _ := pattern then
return (← go (i+1) b (varTypes.push d))
mkPatternCore type levelParams varTypes pattern
go 0 type #[]
/--
Creates a `Pattern` from the type of a theorem.
@@ -180,15 +199,22 @@ If `num?` is `some n`, at most `n` leading quantifiers are stripped.
If `num?` is `none`, all leading quantifiers are stripped.
-/
public def mkPatternFromDecl (declName : Name) (num? : Option Nat := none) : MetaM Pattern := do
let (levelParams, type) ← preprocessPattern declName
let hugeNumber := 10000000
let num := num?.getD hugeNumber
let rec go (i : Nat) (pattern : Expr) (varTypes : Array Expr) : MetaM Pattern := do
if i < num then
if let .forallE _ d b _ := pattern then
return (← go (i+1) b (varTypes.push d))
mkPatternCore type levelParams varTypes pattern
go 0 type #[]
let (levelParams, type) ← preprocessDeclPattern declName
mkPatternFromType levelParams type num?
public def mkPatternFromExpr (e : Expr) (levelParams : List Name := []) (num? : Option Nat := none) : MetaM Pattern := do
let (levelParams, type) ← preprocessExprPattern e levelParams
mkPatternFromType levelParams type num?
def mkEqPatternFromType (levelParams : List Name) (type : Expr) : MetaM (Pattern × Expr) := do
let rec go (pattern : Expr) (varTypes : Array Expr) : MetaM (Pattern × Expr) := do
if let .forallE _ d b _ := pattern then
return (← go b (varTypes.push d))
else
let_expr Eq _ lhs rhs := pattern | throwError "conclusion is not an equality{indentExpr type}"
let pattern ← mkPatternCore type levelParams varTypes lhs
return (pattern, rhs)
go type #[]
/--
Creates a `Pattern` from an equational theorem, using the left-hand side of the equation.
@@ -202,15 +228,8 @@ For a theorem `∀ x₁ ... xₙ, lhs = rhs`, returns a pattern matching `lhs` w
Throws an error if the theorem's conclusion is not an equality.
-/
public def mkEqPatternFromDecl (declName : Name) : MetaM (Pattern × Expr) := do
let (levelParams, type) ← preprocessPattern declName
let rec go (pattern : Expr) (varTypes : Array Expr) : MetaM (Pattern × Expr) := do
if let .forallE _ d b _ := pattern then
return (← go b (varTypes.push d))
else
let_expr Eq _ lhs rhs := pattern | throwError "resulting type for `{.ofConstName declName}` is not an equality"
let pattern ← mkPatternCore type levelParams varTypes lhs
return (pattern, rhs)
go type #[]
let (levelParams, type) ← preprocessDeclPattern declName
mkEqPatternFromType levelParams type
structure UnifyM.Context where
pattern : Pattern
@@ -323,7 +342,11 @@ def isAssignedMVar (e : Expr) : MetaM Bool :=
| _ => return false
partial def process (p : Expr) (e : Expr) : UnifyM Bool := do
match p with
let e' := etaReduce e
if !isSameExpr e e' then
-- **Note**: We eagerly eta reduce the term being matched
process p e'
else match p with
| .bvar bidx => assignExpr bidx e
| .mdata _ p => process p e
| .const declName us =>
@@ -723,7 +746,12 @@ def isDefEqApp (tFn : Expr) (t : Expr) (s : Expr) (_ : tFn = t.getAppFn) : DefEq
@[export lean_sym_def_eq]
def isDefEqMainImpl (t : Expr) (s : Expr) : DefEqM Bool := do
if isSameExpr t s then return true
match t, s with
-- **Note**: `etaReduce` is supposed to be fast, and does not allocate memory
let t' := etaReduce t
let s' := etaReduce s
if !isSameExpr t t' || !isSameExpr s s' then
isDefEqMain t' s'
else match t, s with
| .lit l₁, .lit l₂ => return l₁ == l₂
| .sort u, .sort v => isLevelDefEqS u v
| .lam .., .lam .. => isDefEqBindingS t s


@@ -9,6 +9,7 @@ public import Lean.Meta.Sym.SymM
import Lean.Meta.Sym.IsClass
import Lean.Meta.Sym.Util
import Lean.Meta.Transform
import Lean.Meta.Sym.Eta
namespace Lean.Meta.Sym
/--
@@ -17,7 +18,8 @@ Preprocesses types that used for pattern matching and unification.
public def preprocessType (type : Expr) : MetaM Expr := do
let type ← Sym.unfoldReducible type
let type ← Core.betaReduce type
zetaReduce type
let type ← zetaReduce type
etaReduceAll type
/--
Analyzes whether the given free variables (aka arguments) are proofs or instances.


@@ -22,3 +22,4 @@ public import Lean.Meta.Sym.Simp.EvalGround
public import Lean.Meta.Sym.Simp.Discharger
public import Lean.Meta.Sym.Simp.ControlFlow
public import Lean.Meta.Sym.Simp.Goal
public import Lean.Meta.Sym.Simp.Telescope


@@ -27,16 +27,16 @@ def simpIte : Simproc := fun e => do
let_expr f@ite α c _ a b := e | return .rfl
match (← simp c) with
| .rfl _ =>
if isSameExpr c (← getTrueExpr) then
if (← isTrueExpr c) then
return .step a <| mkApp3 (mkConst ``ite_true f.constLevels!) α a b
else if isSameExpr c (← getFalseExpr) then
else if (← isFalseExpr c) then
return .step b <| mkApp3 (mkConst ``ite_false f.constLevels!) α a b
else
return .rfl (done := true)
| .step c' h _ =>
if isSameExpr c' (← getTrueExpr) then
if (← isTrueExpr c') then
return .step a <| mkApp (e.replaceFn ``ite_cond_eq_true) h
else if isSameExpr c' (← getFalseExpr) then
else if (← isFalseExpr c') then
return .step b <| mkApp (e.replaceFn ``ite_cond_eq_false) h
return .step b <| mkApp (e.replaceFn ``ite_cond_eq_false) h
else
let .some inst' ← trySynthInstance (mkApp (mkConst ``Decidable) c') | return .rfl
@@ -56,20 +56,20 @@ def simpDIte : Simproc := fun e => do
let_expr f@dite α c _ a b := e | return .rfl
match (← simp c) with
| .rfl _ =>
if isSameExpr c (← getTrueExpr) then
if (← isTrueExpr c) then
let a' ← share <| a.betaRev #[mkConst ``True.intro]
return .step a' <| mkApp3 (mkConst ``dite_true f.constLevels!) α a b
else if isSameExpr c (← getFalseExpr) then
else if (← isFalseExpr c) then
let b' ← share <| b.betaRev #[mkConst ``not_false]
return .step b' <| mkApp3 (mkConst ``dite_false f.constLevels!) α a b
else
return .rfl (done := true)
| .step c' h _ =>
if isSameExpr c' (← getTrueExpr) then
if (← isTrueExpr c') then
let h' ← shareCommon <| mkOfEqTrueCore c h
let a ← share <| a.betaRev #[h']
return .step a <| mkApp (e.replaceFn ``dite_cond_eq_true) h
else if isSameExpr c' (← getFalseExpr) then
else if (← isFalseExpr c') then
let h' ← shareCommon <| mkOfEqFalseCore c h
let b ← share <| b.betaRev #[h']
return .step b <| mkApp (e.replaceFn ``dite_cond_eq_false) h
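Concretely, the rewrites performed by `simpIte` and `simpDIte` correspond to discharging goals like these (a sketch; proven here with `simp` rather than the explicitly constructed proof terms above):

```lean
example (a b : Nat) : (if True then a else b) = a := by simp
example (a b : Nat) : (if False then a else b) = b := by simp

-- When the condition merely *simplifies* to `True`, the `ite_cond_eq_true`
-- step applies, e.g. for a condition like `1 = 1`:
example (a b : Nat) : (if 1 = 1 then a else b) = a := by simp
```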


@@ -7,6 +7,8 @@ module
prelude
public import Lean.Meta.Sym.Simp.SimpM
import Lean.Meta.Sym.AlphaShareBuilder
import Lean.Meta.Sym.InferType
import Lean.Meta.Sym.Simp.Result
namespace Lean.Meta.Sym.Simp
/--
@@ -25,7 +27,7 @@ The proof uses the approach used in `mkFunextFor` followed by an `Eq.ndrec`.
def mkForallCongrFor (xs : Array Expr) : MetaM Expr := do
let prop := mkSort 0
let type ← mkForallFVars xs prop
let w ← getLevel type
let w ← Meta.getLevel type
withLocalDeclD `p type fun p =>
withLocalDeclD `q type fun q => do
let eq := mkApp3 (mkConst ``Eq [1]) prop (mkAppN p xs) (mkAppN q xs)
@@ -53,6 +55,119 @@ def mkForallCongrFor (xs : Array Expr) : MetaM Expr := do
open Internal
structure ArrowInfo where
binderName : Name
binderInfo : BinderInfo
u : Level
v : Level
structure ToArrowResult where
arrow : Expr
infos : List ArrowInfo
v : Level
def toArrow (e : Expr) : SymM ToArrowResult := do
if let .forallE n α β bi := e then
if !β.hasLooseBVars then
let { arrow, infos, v } ← toArrow β
let u ← getLevel α
let arrow ← mkAppS₂ (← mkConstS ``Arrow [u, v]) α arrow
let info := { binderName := n, binderInfo := bi, u, v }
return { arrow, v := mkLevelIMax' u v, infos := info :: infos }
return { arrow := e, infos := [], v := (← getLevel e) }
def toForall (e : Expr) (infos : List ArrowInfo) : SymM Expr := do
let { binderName, binderInfo, .. } :: infos := infos | return e
let_expr Arrow α β := e | return e
mkForallS binderName binderInfo α (← toForall β infos)
/--
Recursively simplifies an `Arrow` telescope, applying telescope-specific simplifications:
- **False hypothesis**: `False → q` simplifies to `True` (via `false_arrow`)
- **True hypothesis**: `True → q` simplifies to `q` (via `true_arrow`)
- **True conclusion**: `p → True` simplifies to `True` (via `arrow_true`)
The first two are applicable only if `q` is in `Prop` (checked via `info.v.isZero`).
Returns the simplified result paired with the remaining `ArrowInfo` list. When a telescope
collapses (e.g., to `True`), the returned `infos` list is empty, signaling to `toForall`
that no reconstruction is needed.
-/
partial def simpArrows (e : Expr) (infos : List ArrowInfo) (simpBody : Simproc) : SimpM (Result × List ArrowInfo) := do
match infos with
| [] => return ((← simpBody e), [])
| info :: infos' =>
let_expr f@Arrow p q := e | return ((← simpBody e), infos)
let p_r ← simp p
if (← isFalseExpr (p_r.getResultExpr p)) && info.v.isZero then
match p_r with
| .rfl _ => return (.step (← getTrueExpr) (mkApp (mkConst ``false_arrow) q), [])
| .step _ h _ => return (.step (← getTrueExpr) (mkApp3 (mkConst ``false_arrow_congr) p q h), [])
let (q_r, infos') ← simpArrows q infos' simpBody
if (← isTrueExpr (q_r.getResultExpr q)) then
match q_r with
| .rfl _ => return (.step (← getTrueExpr) (mkApp (mkConst ``arrow_true [info.u]) p), [])
| .step _ h _ => return (.step (← getTrueExpr) (mkApp3 (mkConst ``arrow_true_congr [info.u]) p q h), [])
match p_r, q_r with
| .rfl _, .rfl _ =>
if (← isTrueExpr p) && info.v.isZero then
return (.step q (mkApp (mkConst ``true_arrow) q), infos')
else
return (.rfl, infos)
| .step p' h _, .rfl _ =>
if (← isTrueExpr p') && info.v.isZero then
return (.step q (mkApp3 (mkConst ``true_arrow_congr_left) p q h), infos')
else
let e' ← mkAppS₂ f p' q
return (.step e' <| mkApp4 (mkConst ``arrow_congr_left f.constLevels!) p p' q h, info :: infos')
| .rfl _, .step q' h _ =>
if (← isTrueExpr p) && info.v.isZero then
return (.step q' (mkApp3 (mkConst ``true_arrow_congr_right) q q' h), infos')
else
let e' ← mkAppS₂ f p q'
return (.step e' <| mkApp4 (mkConst ``arrow_congr_right f.constLevels!) p q q' h, info :: infos')
| .step p' h₁ _, .step q' h₂ _ =>
if (← isTrueExpr p') && info.v.isZero then
return (.step q' (mkApp5 (mkConst ``true_arrow_congr) p q q' h₁ h₂), infos')
else
let e' ← mkAppS₂ f p' q'
return (.step e' <| mkApp6 (mkConst ``arrow_congr f.constLevels!) p p' q q' h₁ h₂, info :: infos')
/--
Simplifies a telescope of non-dependent arrows `p₁ → p₂ → ... → pₙ → q` by:
1. Converting to `Arrow p₁ (Arrow p₂ (... (Arrow pₙ q)))` (see `toArrow`)
2. Simplifying each `pᵢ` and `q` (see `simpArrows`)
3. Converting back to `→` form (see `toForall`)
Using `Arrow` (a definitional wrapper around `→`) avoids the quadratic proof growth that
occurs with `Expr.forallE`. With `forallE`, each nesting level bumps de Bruijn indices in
subterms, destroying sharing. For example, if each `pᵢ` contains a free variable `x`, the
de Bruijn representation of `x` differs at each depth, preventing hash-consing from
recognizing them as identical.
With `Arrow`, both arguments are explicit (not under binders), so subterms remain identical
across nesting levels and can be shared, yielding linear-sized proofs.
**Tradeoff**: This function simplifies each `pᵢ` and `q` individually, but misses
simplifications that depend on the arrow structure itself. For example, `q → p → p`
won't be simplified to `True` (when `p : Prop`) because the simplifier does not have
a chance to apply `post` methods to the intermediate arrow `p → p`.
Thus, this is a simproc that is meant to be used as a pre-method and marks the
result as fully simplified to prevent `simpArrow` from being applied.
-/
public def simpArrowTelescope (simpBody : Simproc := simp) : Simproc := fun e => do
unless e.isArrow do return .rfl -- not applicable
let { arrow, infos, v } ← toArrow e
let (.step arrow' h _, infos) ← simpArrows arrow infos simpBody | return .rfl (done := true)
let e' ← toForall arrow' infos
let α := mkSort v
let v1 := v.succ
let h := mkApp6 (mkConst ``Eq.trans [v1]) α e arrow arrow' (mkApp2 (mkConst ``Eq.refl [v1]) α arrow) h
let h := mkApp6 (mkConst ``Eq.trans [v1]) α e arrow' e' h (mkApp2 (mkConst ``Eq.refl [v1]) α e')
return .step e' h (done := true)
public def simpArrow (e : Expr) : SimpM Result := do
let p := e.bindingDomain!
let q := e.bindingBody!
@@ -75,7 +190,7 @@ public def simpArrow (e : Expr) : SimpM Result := do
let e' ← e.updateForallS! p' q'
return .step e' <| mkApp6 (mkConst ``implies_congr [u, v]) p p' q q' h₁ h₂
public def simpForall (e : Expr) : SimpM Result := do
public def simpForall' (simpArrow : Simproc) (simpBody : Simproc) (e : Expr) : SimpM Result := do
if e.isArrow then
simpArrow e
else if (← isProp e) then
@@ -86,7 +201,7 @@ public def simpForall (e : Expr) : SimpM Result := do
return .rfl
where
main (xs : Array Expr) (b : Expr) : SimpM Result := do
match (← simp b) with
match (← simpBody b) with
| .rfl _ => return .rfl
| .step b' h _ =>
let h ← mkLambdaFVars xs h
@@ -101,4 +216,7 @@ where
| .forallE _ _ b _ => if b.hasLooseBVar 0 then getForallTelescopeSize b (n+1) else n
| _ => n
public def simpForall : Simproc :=
simpForall' simpArrow simp
end Lean.Meta.Sym.Simp

View File

@@ -6,7 +6,7 @@ Authors: Leonardo de Moura
module
prelude
public import Lean.Meta.Sym.Simp.SimpM
import Lean.Meta.Sym.Simp.Lambda
public import Lean.Meta.Sym.Simp.Lambda
import Lean.Meta.Sym.AlphaShareBuilder
import Lean.Meta.Sym.InstantiateS
import Lean.Meta.Sym.ReplaceS
@@ -316,7 +316,8 @@ For each application `f a`:
- If only `a` changed: use `congrArg : a = a' → f a = f a'`
- If neither changed: return `.rfl`
-/
def simpBetaApp (e : Expr) (fType : Expr) (fnUnivs argUnivs : Array Level) : SimpM Result := do
def simpBetaApp (e : Expr) (fType : Expr) (fnUnivs argUnivs : Array Level)
(simpBody : Simproc) : SimpM Result := do
return (← go e 0).1
where
go (e : Expr) (i : Nat) : SimpM (Result × Expr) := do
@@ -339,7 +340,7 @@ where
let h := mkApp6 (← mkCongrPrefix ``congr fType i) f f' a a' hf ha
pure <| .step e' h
return (r, fType.bindingBody!)
| .lam .. => return (← simpLambda e, fType)
| .lam .. => return (← simpBody e, fType)
| _ => unreachable!
mkCongrPrefix (declName : Name) (fType : Expr) (i : Nat) : SymM Expr := do
@@ -375,12 +376,12 @@ e₃ = e₄ (by rfl, definitional equality from toHave)
e₁ = e₄ (by transitivity)
```
-/
def simpHaveCore (e : Expr) : SimpM SimpHaveResult := do
def simpHaveCore (e : Expr) (simpBody : Simproc) : SimpM SimpHaveResult := do
let e₁ := e
let r ← toBetaApp e₁
let e₂ := r.e
let { fnUnivs, argUnivs } ← getUnivs r.fType
match (← simpBetaApp e₂ r.fType fnUnivs argUnivs) with
match (← simpBetaApp e₂ r.fType fnUnivs argUnivs simpBody) with
| .rfl _ => return { result := .rfl, α := r.α, u := r.u }
| .step e₃ h _ =>
let h₁ := mkApp6 (mkConst ``Eq.trans [r.u]) r.α e₁ e₂ e₃ r.h h
@@ -397,8 +398,8 @@ Simplify a `have`-telescope.
This is the main entry point for `have`-telescope simplification in `Sym.simp`.
See module documentation for the algorithm overview.
-/
public def simpHave (e : Expr) : SimpM Result := do
return (← simpHaveCore e).result
public def simpHave (e : Expr) (simpBody : Simproc) : SimpM Result := do
return (← simpHaveCore e simpBody).result
/--
Simplify a `have`-telescope and eliminate unused bindings.
@@ -406,8 +407,8 @@ Simplify a `have`-telescope and eliminate unused bindings.
This combines simplification with dead variable elimination in a single pass,
avoiding quadratic behavior from multiple passes.
-/
public def simpHaveAndZetaUnused (e₁ : Expr) : SimpM Result := do
let r ← simpHaveCore e₁
public def simpHaveAndZetaUnused (e₁ : Expr) (simpBody : Simproc) : SimpM Result := do
let r ← simpHaveCore e₁ simpBody
match r.result with
| .rfl _ =>
let e₂ ← zetaUnused e₁
@@ -425,7 +426,7 @@ public def simpHaveAndZetaUnused (e₁ : Expr) : SimpM Result := do
(mkApp2 (mkConst ``Eq.refl [r.u]) r.α e₃)
return .step e₃ h
public def simpLet (e : Expr) : SimpM Result := do
public def simpLet' (simpBody : Simproc) (e : Expr) : SimpM Result := do
if !e.letNondep! then
/-
**Note**: We don't do anything if it is a dependent `let`.
@@ -433,6 +434,9 @@ public def simpLet (e : Expr) : SimpM Result := do
-/
return .rfl
else
simpHaveAndZetaUnused e
simpHaveAndZetaUnused e simpBody
public def simpLet : Simproc :=
simpLet' simpLambda
end Lean.Meta.Sym.Simp

View File

@@ -46,12 +46,12 @@ def mkFunextFor (xs : Array Expr) (β : Expr) : MetaM Expr := do
let result ← mkLambdaFVars #[f, g, h] result
return result
public def simpLambda (e : Expr) : SimpM Result := do
public def simpLambda' (simpBody : Simproc) (e : Expr) : SimpM Result := do
lambdaTelescope e fun xs b => withoutModifyingCacheIfNotWellBehaved do
main xs (← shareCommon b)
where
main (xs : Array Expr) (b : Expr) : SimpM Result := do
match (← simp b) with
match (← simpBody b) with
| .rfl _ => return .rfl
| .step b' h _ =>
let h ← mkLambdaFVars xs h
@@ -69,4 +69,7 @@ where
modify fun s => { s with funext := s.funext.insert { expr := key } h }
return h
public def simpLambda : Simproc :=
simpLambda' simp
end Lean.Meta.Sym.Simp

View File

@@ -26,4 +26,8 @@ public def Result.markAsDone : Result → Result
| .rfl _ => .rfl true
| .step e h _ => .step e h true
public def Result.getResultExpr : Expr → Result → Expr
| e, .rfl _ => e
| _, .step e _ _ => e
end Lean.Meta.Sym.Simp

View File

@@ -0,0 +1,25 @@
/-
Copyright (c) 2026 Amazon.com, Inc. or its affiliates. All Rights Reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Leonardo de Moura
-/
module
prelude
public import Lean.Meta.Sym.Simp.SimpM
import Lean.Meta.Sym.Simp.Have
import Lean.Meta.Sym.Simp.Forall
namespace Lean.Meta.Sym.Simp
/--
Simplify telescope binders (`have`-expression values and arrow hypotheses)
but not the final body. This simproc is useful for simplifying a goal's
target before introducing its hypotheses.
-/
public partial def simpTelescope : Simproc := fun e => do
match e with
| .letE .. =>
simpLet' (simpLambda' simpTelescope) e
| .forallE .. =>
simpForall' (simpArrow := simpArrowTelescope simpTelescope) (simpBody := simpLambda' simpTelescope) e
| _ => return .rfl
end Lean.Meta.Sym.Simp

View File

@@ -157,8 +157,12 @@ def getSharedExprs : SymM SharedExprs :=
/-- Returns the internalized `True` constant. -/
def getTrueExpr : SymM Expr := return (← getSharedExprs).trueExpr
/-- Returns `true` if `e` is the internalized `True` expression. -/
def isTrueExpr (e : Expr) : SymM Bool := return isSameExpr e (← getTrueExpr)
/-- Returns the internalized `False` constant. -/
def getFalseExpr : SymM Expr := return (← getSharedExprs).falseExpr
/-- Returns `true` if `e` is the internalized `False` expression. -/
def isFalseExpr (e : Expr) : SymM Bool := return isSameExpr e (← getFalseExpr)
/-- Returns the internalized `Bool.true`. -/
def getBoolTrueExpr : SymM Expr := return (← getSharedExprs).btrueExpr
/-- Returns the internalized `Bool.false`. -/

View File

@@ -658,7 +658,6 @@ partial def buildInductionBody (toErase toClear : Array FVarId) (goal : Expr)
return mkApp4 (mkConst ``Bool.dcond [u]) goal c' t' f'
| _ =>
-- Check for unreachable cases. We look for the kind of expressions that `by contradiction`
-- produces
if e.isAppOf ``False.elim && 1 < e.getAppNumArgs then
@@ -846,7 +845,7 @@ where doRealize (inductName : Name) := do
throwError "Function {name} defined via WellFounded.fix with unexpected arity {funBody.getAppNumArgs}:{indentExpr funBody}"
else
throwError "Function {name} not defined via WellFounded.fix:{indentExpr funBody}"
check e'
let (body', mvars) ← M2.run do
forallTelescope (← inferType e').bindingDomain! fun xs goal => do
if xs.size 2 then
@@ -876,10 +875,6 @@ where doRealize (inductName : Name) := do
let e' ← instantiateMVars e'
return (e', paramMask)
unless (← isTypeCorrect e') do
logError m!"failed to derive a type-correct induction principle:{indentExpr e'}"
check e'
let eTyp ← inferType e'
let eTyp ← elimTypeAnnotations eTyp
let eTyp ← letToHave eTyp
@@ -1066,13 +1061,9 @@ where doRealize inductName := do
let value ← mkLambdaFVars alts value
let value ← mkLambdaFVars motives value
let value ← mkLambdaFVars params value
check value
let value ← cleanPackedArgs eqnInfo value
return value
unless (← isTypeCorrect value) do
logError m!"final term is type incorrect:{indentExpr value}"
check value
let type ← inferType value
let type ← elimOptParam type
let type ← letToHave type
@@ -1302,10 +1293,6 @@ where doRealize inductName := do
trace[Meta.FunInd] "complete body of mutual induction principle:{indentExpr e'}"
pure (e', paramMask, motiveArities)
unless (← isTypeCorrect e') do
logError m!"constructed induction principle is not type correct:{indentExpr e'}"
check e'
let eTyp ← inferType e'
let eTyp ← elimTypeAnnotations eTyp
let eTyp ← letToHave eTyp
@@ -1444,9 +1431,6 @@ def deriveCases (unfolding : Bool) (name : Name) : MetaM Unit := do
let e' ← mkLambdaFVars #[motive] e'
mkLambdaFVarsMasked params e'
mapError (f := (m!"constructed functional cases principle is not type correct:{indentExpr e'}\n{indentD ·}")) do
check e'
let eTyp ← inferType e'
let eTyp ← elimTypeAnnotations eTyp
let eTyp ← letToHave eTyp

View File

@@ -74,9 +74,9 @@ public partial def getAnchor (e : Expr) : GrindM UInt64 := do
let arg := args[i]
if h : i < pinfos.size then
let info := pinfos[i]
-- **Note**: we ignore implicit instances when computing stable hash codes
-- **Note**: we ignore instances when computing stable hash codes
-- TODO: evaluate whether we should ignore regular implicit arguments too.
unless info.isInstImplicit do
unless info.isImplicit do
r := mix r (← getAnchor arg)
else
r := mix r (← getAnchor arg)

View File

@@ -174,7 +174,7 @@ See comments at `ShouldCanonResult`.
private def shouldCanon (pinfos : Array ParamInfo) (i : Nat) (arg : Expr) : MetaM ShouldCanonResult := do
if h : i < pinfos.size then
let pinfo := pinfos[i]
if pinfo.isInstImplicit then
if pinfo.isInstance then
return .canonInst
else if pinfo.isProp then
return .visit

View File

@@ -162,7 +162,7 @@ structure Context where
extensions : ExtensionStateArray := #[]
debug : Bool -- Cached `grind.debug (← getOptions)`
export Sym (getTrueExpr getFalseExpr getBoolTrueExpr getBoolFalseExpr getNatZeroExpr getOrderingEqExpr getIntExpr)
export Sym (getTrueExpr getFalseExpr getBoolTrueExpr getBoolFalseExpr getNatZeroExpr getOrderingEqExpr getIntExpr isTrueExpr isFalseExpr)
/-- Key for the congruence theorem cache. -/
structure CongrTheoremCacheKey where
@@ -379,14 +379,6 @@ Abstracts nested proofs in `e`. This is a preprocessing step performed before in
def abstractNestedProofs (e : Expr) : GrindM Expr :=
Meta.abstractNestedProofs e
/-- Returns `true` if `e` is the internalized `True` expression. -/
def isTrueExpr (e : Expr) : GrindM Bool :=
return isSameExpr e (← getTrueExpr)
/-- Returns `true` if `e` is the internalized `False` expression. -/
def isFalseExpr (e : Expr) : GrindM Bool :=
return isSameExpr e (← getFalseExpr)
/--
Creates a congruence theorem for a `f`-applications with `numArgs` arguments.
-/
@@ -1115,11 +1107,11 @@ def getGeneration (e : Expr) : GoalM Nat :=
/-- Returns `true` if `e` is in the equivalence class of `True`. -/
def isEqTrue (e : Expr) : GoalM Bool := do
return isSameExpr (← getENode e).root (← getTrueExpr)
return (← isTrueExpr (← getENode e).root)
/-- Returns `true` if `e` is in the equivalence class of `False`. -/
def isEqFalse (e : Expr) : GoalM Bool := do
return isSameExpr (← getENode e).root (← getFalseExpr)
return (← isFalseExpr (← getENode e).root)
/-- Returns `true` if `e` is in the equivalence class of `Bool.true`. -/
def isEqBoolTrue (e : Expr) : GoalM Bool := do
@@ -1687,6 +1679,9 @@ def addTheoremInstance (thm : EMatchTheorem) (proof : Expr) (prop : Expr) (gener
modify fun s => { s with
ematch.delayedThmInsts := s.ematch.delayedThmInsts.insert { expr := guard } thms
ematch.numDelayedInstances := s.ematch.numDelayedInstances + 1
-- Bump numInstances for delayed instances too, so that uniqueId generation
-- in EMatch.instantiateTheorem' doesn't produce collisions.
ematch.numInstances := s.ematch.numInstances + 1
}
def DelayedTheoremInstance.check (delayed : DelayedTheoremInstance) : GoalM Unit := do

View File

@@ -23,6 +23,11 @@ register_builtin_option backward.dsimp.proofs : Bool := {
descr := "Let `dsimp` simplify proof terms"
}
register_builtin_option backward.dsimp.instances : Bool := {
defValue := false
descr := "Let `dsimp` simplify instance terms"
}
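As the commit message notes, the previous behavior remains available behind this flag; a usage sketch (`myLemma` is a hypothetical placeholder declaration):
```lean
-- Globally restore the previous behavior (dsimp rewrites inside instance terms):
set_option backward.dsimp.instances true

-- Or scope the option to a single declaration:
set_option backward.dsimp.instances true in
theorem myLemma : True := trivial
```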
register_builtin_option debug.simp.check.have : Bool := {
defValue := false
descr := "(simp) enable consistency checks for `have` telescope simplification"
@@ -497,7 +502,11 @@ private partial def dsimpImpl (e : Expr) : SimpM Expr := do
let pre := m.dpre >> doNotVisitOfNat >> doNotVisitOfScientific >> doNotVisitCharLit >> doNotVisitProofs
let post := m.dpost >> dsimpReduce
withInDSimpWithCache fun cache => do
transformWithCache e cache (usedLetOnly := cfg.zeta || cfg.zetaUnused) (pre := pre) (post := post)
transformWithCache e cache
(usedLetOnly := cfg.zeta || cfg.zetaUnused)
(skipInstances := !cfg.instances)
(pre := pre)
(post := post)
def visitFn (e : Expr) : SimpM Result := do
let f := e.getAppFn

View File

@@ -627,7 +627,7 @@ def congrArgs (r : Result) (args : Array Expr) : SimpM Result := do
if h : i < infos.size then
trace[Debug.Meta.Tactic.simp] "app [{i}] {infos.size} {arg} hasFwdDeps: {infos[i].hasFwdDeps}"
let info := infos[i]
if cfg.ground && info.isInstImplicit then
if cfg.ground && info.isInstance then
-- We don't visit instance implicit arguments when we are reducing ground terms.
-- Motivation: many instance implicit arguments are ground, and it does not make sense
-- to reduce them if the parent term is not ground.
@@ -713,7 +713,7 @@ def simpAppUsingCongr (e : Expr) : SimpM Result := do
if h : i < infos.size then
let info := infos[i]
trace[Debug.Meta.Tactic.simp] "app [{i}] {infos.size} {a} hasFwdDeps: {info.hasFwdDeps}"
if cfg.ground && info.isInstImplicit then
if cfg.ground && info.isInstance then
-- We don't visit instance implicit arguments when we are reducing ground terms.
-- Motivation: many instance implicit arguments are ground, and it does not make sense
-- to reduce them if the parent term is not ground.
@@ -788,7 +788,7 @@ def tryAutoCongrTheorem? (e : Expr) : SimpM (Option Result) := do
let mut i := 0 -- index at args
for arg in args, kind in cgrThm.argKinds do
if h : config.ground ∧ i < infos.size then
if (infos[i]'h.2).isInstImplicit then
if (infos[i]'h.2).isInstance then
-- Do not visit instance implicit arguments when `ground := true`
-- See comment at `congrArgs`
argsNew := argsNew.push arg

View File

@@ -4,12 +4,10 @@ Released under Apache 2.0 license as described in the file LICENSE.
Authors: Leonardo de Moura
-/
module
prelude
public import Lean.Meta.Basic
public import Lean.Meta.FunInfo
public section
namespace Lean
inductive TransformStep where
@@ -30,16 +28,18 @@ inductive TransformStep where
namespace Core
/--
Transform the expression `input` using `pre` and `post`.
- First `pre` is invoked with the current expression and recursion is continued according to the `TransformStep` result.
In all cases, the expression contained in the result, if any, must be definitionally equal to the current expression.
- After recursion, if any, `post` is invoked on the resulting expression.
Recursively transforms `input` using `pre` and `post` callbacks.
The term `s` in both `pre s` and `post s` may contain loose bound variables. So, this method is not appropriate for
if one needs to apply operations (e.g., `whnf`, `inferType`) that do not handle loose bound variables.
Consider using `Meta.transform` to avoid loose bound variables.
For each subexpression:
1. `pre` is invoked first; recursion continues according to the `TransformStep` result.
2. After recursion (if any), `post` is invoked on the resulting expression.
This method is useful for applying transformations such as beta-reduction and delta-reduction.
The expressions passed to `pre` and `post` may contain loose bound variables.
Use `Meta.transform` instead if you need operations like `whnf` or `inferType`
that require expressions without loose bound variables.
Results are cached using pointer equality (`ExprStructEq`), so structurally
identical subexpressions are transformed only once.
-/
partial def transform {m} [Monad m] [MonadLiftT CoreM m] [MonadControlT CoreM m]
(input : Expr)
@@ -70,6 +70,7 @@ partial def transform {m} [Monad m] [MonadLiftT CoreM m] [MonadControlT CoreM m]
| _ => visitPost e
visit input |>.run
/-- Applies beta reduction to all beta-reducible subexpressions in `e`. -/
def betaReduce (e : Expr) : CoreM Expr :=
transform e (pre := fun e => return if e.isHeadBetaTarget then .visit e.headBeta else .continue)
@@ -78,12 +79,19 @@ end Core
namespace Meta
/--
Similar to `Meta.transform`, but allows the use of a pre-existing cache.
Like `Meta.transform`, but accepts and returns a cache for reuse across multiple calls.
Warnings:
- For the cache to be valid, it must always use the same `pre` and `post` functions.
- It is important that there are no other references to `cache` when it is passed to
`transformWithCache`, to avoid unnecessary copying of the hash map.
Parameters:
- `usedLetOnly`: when true, `mkLambdaFVars`/`mkForallFVars`/`mkLetFVars` only abstract
over variables that are actually used in the body.
- `skipConstInApp`: when true, constant heads in applications are not visited separately.
- `skipInstances`: when true, instance arguments (determined via `getFunInfo`) are not visited.
The `skipInstances` flag is used by `dsimp` to avoid rewriting instances.
**Warnings:**
- The cache is only valid when using the same `pre` and `post` functions.
- Ensure there are no other references to `cache` to avoid unnecessary hash map copying.
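A cache-reuse sketch under these constraints (hypothetical helper; the `pre` callback performs beta reduction as in `Core.betaReduce`):
```lean
-- Transform two expressions with the same `pre`, threading the cache so
-- subterms shared between `a` and `b` are visited only once:
def betaBoth (a b : Expr) : MetaM (Expr × Expr) := do
  let pre (e : Expr) : MetaM TransformStep :=
    return if e.isHeadBetaTarget then .visit e.headBeta else .continue
  let (a', cache) ← transformWithCache a {} (pre := pre)
  let (b', _) ← transformWithCache b cache (pre := pre)
  return (a', b')
```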
-/
@[inline]
partial def transformWithCache {m} [Monad m] [MonadLiftT MetaM m] [MonadControlT MetaM m]
@@ -93,6 +101,7 @@ partial def transformWithCache {m} [Monad m] [MonadLiftT MetaM m] [MonadControlT
(post : Expr → m TransformStep := fun e => return .done e)
(usedLetOnly := false)
(skipConstInApp := false)
(skipInstances := false)
: m (Expr × Std.HashMap ExprStructEq Expr) :=
let _ : STWorld IO.RealWorld m :=
let _ : MonadLiftT (ST IO.RealWorld) m := { monadLift := fun x => liftM (m := MetaM) (liftM (m := ST IO.RealWorld) x) }
@@ -123,10 +132,22 @@ partial def transformWithCache {m} [Monad m] [MonadLiftT MetaM m] [MonadControlT
| e => visitPost (← mkLetFVars (usedLetOnly := usedLetOnly) (generalizeNondepLet := false) fvars (← visit (e.instantiateRev fvars)))
let visitApp (e : Expr) : MonadCacheT ExprStructEq Expr m Expr :=
e.withApp fun f args => do
if skipConstInApp && f.isConst then
visitPost (mkAppN f (← args.mapM visit))
else
visitPost (mkAppN (← visit f) (← args.mapM visit))
let f ← if skipConstInApp && f.isConst then pure f else visit f
if skipInstances then
let infos := (← getFunInfoNArgs f args.size).paramInfo
let mut args := args.toVector
for h : i in *...args.size do
let arg := args[i]
if h : i < infos.size then
let info := infos[i]
if skipInstances && info.isInstance then
continue
args := args.set i (← visit arg)
else
args := args.set i (← visit arg)
visitPost (mkAppN f args.toArray)
else
visitPost (mkAppN f (← args.mapM visit))
match ( pre e) with
| .done e => pure e
| .visit e => visit e
@@ -163,6 +184,10 @@ def transform {m} [Monad m] [MonadLiftT MetaM m] [MonadControlT MetaM m]
let (e, _) ← transformWithCache input {} pre post usedLetOnly skipConstInApp
return e
/--
Replaces all free variables in `e` that have let-values with their values.
The substitution is applied recursively to the values themselves.
-/
-- TODO: add options to distinguish zeta and zetaDelta reduction
def zetaReduce (e : Expr) : MetaM Expr := do
let pre (e : Expr) : MetaM TransformStep := do
@@ -173,7 +198,8 @@ def zetaReduce (e : Expr) : MetaM Expr := do
transform e (pre := pre) (usedLetOnly := true)
/--
Zeta reduces only the provided fvars, beta reducing the substitutions.
Zeta-reduces only the specified free variables, applying beta reduction after substitution.
For example, if `x` has value `fun y => y + 1` and appears as `x 2`, the result is `2 + 1`.
-/
def zetaDeltaFVars (e : Expr) (fvars : Array FVarId) : MetaM Expr :=
let unfold? (fvarId : FVarId) : MetaM (Option Expr) := do
@@ -207,10 +233,10 @@ def unfoldDeclsFrom (biggerEnv : Environment) (e : Expr) : CoreM Expr := do
Core.transform e (pre := pre)
/--
Unfolds theorems that are applied to a `f x₁ .. xₙ` where `f` is in the given array, and
`n ≤ SectionVars`, i.e. an unsaturated application of `f`.
Unfolds theorems that are applied to `f x₁ .. xₙ` where `f` is in `fnNames` and
`n ≤ numSectionVars` (i.e., an unsaturated application of `f`).
This is used tounfoldIfArgIsAppOf undo proof abstraction for termination checking, as otherwise the bare
This is used to undo proof abstraction for termination checking, as otherwise the bare
occurrence of the recursive function prevents termination checking from succeeding.
Usually, the argument is just `f` (the constant), arising from `mkAuxTheorem` abstracting over the
@@ -249,9 +275,11 @@ def unfoldIfArgIsAppOf (fnNames : Array Name) (numSectionVars : Nat) (e : Expr)
af.isConst && fnNames.any fun f => af.constName! == f && axs.size ≤ numSectionVars
/-- Removes all `inaccessible` annotations from `e`. -/
def eraseInaccessibleAnnotations (e : Expr) : CoreM Expr :=
Core.transform e (post := fun e => return .done <| if let some e := inaccessible? e then e else e)
/-- Removes all `patternWithRef` annotations from `e`. -/
def erasePatternRefAnnotations (e : Expr) : CoreM Expr :=
Core.transform e (post := fun e => return .done <| if let some (_, e) := patternWithRef? e then e else e)

View File

@@ -36,12 +36,12 @@ structure CodeActionResolveData where
providerResultIndex : Nat
deriving ToJson, FromJson
def CodeAction.getFileSource! (ca : CodeAction) : FileIdent :=
let r : Except String FileIdent := do
def CodeAction.getFileSource! (ca : CodeAction) : DocumentUri :=
let r : Except String DocumentUri := do
let some data := ca.data?
| throw s!"no data param on code action {ca.title}"
let data : CodeActionResolveData ← fromJson? data
return .uri data.params.textDocument.uri
return data.params.textDocument.uri
match r with
| Except.ok uri => uri
| Except.error e => panic! e

View File

@@ -29,7 +29,7 @@ private def filterDuplicateCompletionItems
return r
partial def find?
(mod : Name)
(uri : DocumentUri)
(pos : Lsp.Position)
(fileMap : FileMap)
(hoverPos : String.Pos.Raw)
@@ -45,21 +45,21 @@ partial def find?
let completions : Array ScoredCompletionItem ←
match i.info with
| .id stx id danglingDot lctx .. =>
idCompletion mod pos completionInfoPos i.ctx lctx stx id i.hoverInfo danglingDot
idCompletion uri pos completionInfoPos i.ctx lctx stx id i.hoverInfo danglingDot
| .dot info .. =>
dotCompletion mod pos completionInfoPos i.ctx info
dotCompletion uri pos completionInfoPos i.ctx info
| .dotId _ id lctx expectedType? =>
dotIdCompletion mod pos completionInfoPos i.ctx lctx id expectedType?
dotIdCompletion uri pos completionInfoPos i.ctx lctx id expectedType?
| .fieldId _ id lctx structName =>
fieldIdCompletion mod pos completionInfoPos i.ctx lctx id structName
fieldIdCompletion uri pos completionInfoPos i.ctx lctx id structName
| .option stx =>
optionCompletion mod pos completionInfoPos i.ctx stx caps
optionCompletion uri pos completionInfoPos i.ctx stx caps
| .errorName _ partialId =>
errorNameCompletion mod pos completionInfoPos i.ctx partialId caps
errorNameCompletion uri pos completionInfoPos i.ctx partialId caps
| .endSection _ id? danglingDot scopeNames =>
endSectionCompletion mod pos completionInfoPos id? danglingDot scopeNames
endSectionCompletion uri pos completionInfoPos id? danglingDot scopeNames
| .tactic .. =>
tacticCompletion mod pos completionInfoPos i.ctx
tacticCompletion uri pos completionInfoPos i.ctx
| _ =>
pure #[]
allCompletions := allCompletions ++ completions

View File

@@ -23,7 +23,7 @@ open FuzzyMatching
section Infrastructure
private structure Context where
mod : Name
uri : DocumentUri
pos : Lsp.Position
completionInfoPos : Nat
@@ -46,7 +46,7 @@ section Infrastructure
: M Unit := do
let ctx ← read
let data := {
mod := ctx.mod,
uri := ctx.uri,
pos := ctx.pos,
cPos? := ctx.completionInfoPos,
id?
@@ -83,7 +83,7 @@ section Infrastructure
addItem item
private def runM
(mod : Name)
(uri : DocumentUri)
(pos : Lsp.Position)
(completionInfoPos : Nat)
(ctx : ContextInfo)
@@ -92,7 +92,7 @@ section Infrastructure
: CancellableM (Array ResolvableCompletionItem) := do
let tk ← read
let r ← ctx.runMetaM lctx do
x.run ⟨mod, pos, completionInfoPos⟩ |>.run {} |>.run tk
x.run ⟨uri, pos, completionInfoPos⟩ |>.run {} |>.run tk
match r with
| .error _ => throw .requestCancelled
| .ok (_, s) => return s.items
@@ -401,7 +401,7 @@ private def idCompletionCore
completeNamespaces ctx id danglingDot
def idCompletion
(mod : Name)
(uri : DocumentUri)
(pos : Lsp.Position)
(completionInfoPos : Nat)
(ctx : ContextInfo)
@@ -411,17 +411,17 @@ def idCompletion
(hoverInfo : HoverInfo)
(danglingDot : Bool)
: CancellableM (Array ResolvableCompletionItem) :=
runM mod pos completionInfoPos ctx lctx do
runM uri pos completionInfoPos ctx lctx do
idCompletionCore ctx stx id hoverInfo danglingDot
def dotCompletion
(mod : Name)
(uri : DocumentUri)
(pos : Lsp.Position)
(completionInfoPos : Nat)
(ctx : ContextInfo)
(info : TermInfo)
: CancellableM (Array ResolvableCompletionItem) :=
runM mod pos completionInfoPos ctx info.lctx do
runM uri pos completionInfoPos ctx info.lctx do
let nameSet ← try
getDotCompletionTypeNameSet (← instantiateMVars (← inferType info.expr))
catch _ =>
@@ -443,7 +443,7 @@ def dotCompletion
(← decl.kind) (← decl.tags)
def dotIdCompletion
(mod : Name)
(uri : DocumentUri)
(pos : Lsp.Position)
(completionInfoPos : Nat)
(ctx : ContextInfo)
@@ -451,7 +451,7 @@ def dotIdCompletion
(id : Name)
(expectedType? : Option Expr)
: CancellableM (Array ResolvableCompletionItem) :=
runM mod pos completionInfoPos ctx lctx do
runM uri pos completionInfoPos ctx lctx do
let some expectedType := expectedType?
| return ()
@@ -484,7 +484,7 @@ def dotIdCompletion
addUnresolvedCompletionItem label (.const c.name) kind tags
def fieldIdCompletion
(mod : Name)
(uri : DocumentUri)
(pos : Lsp.Position)
(completionInfoPos : Nat)
(ctx : ContextInfo)
@@ -492,7 +492,7 @@ def fieldIdCompletion
(id : Option Name)
(structName : Name)
: CancellableM (Array ResolvableCompletionItem) :=
runM mod pos completionInfoPos ctx lctx do
runM uri pos completionInfoPos ctx lctx do
let idStr := id.map (·.toString) |>.getD ""
let fieldNames := getStructureFieldsFlattened (← getEnv) structName (includeSubobjectFields := false)
for fieldName in fieldNames do
@@ -536,7 +536,7 @@ private def trailingDotCompletion [ForIn Id Coll (Name × α)]
return items
def optionCompletion
(mod : Name)
(uri : DocumentUri)
(pos : Lsp.Position)
(completionInfoPos : Nat)
(ctx : ContextInfo)
@@ -555,7 +555,7 @@ def optionCompletion
kind? := CompletionItemKind.property -- TODO: investigate whether this is the best kind for options.
textEdit?
data? := {
mod
uri
pos
cPos? := completionInfoPos
id? := none : ResolvableCompletionItemData
@@ -563,7 +563,7 @@ def optionCompletion
}
def errorNameCompletion
(mod : Name)
(uri : DocumentUri)
(pos : Lsp.Position)
(completionInfoPos : Nat)
(ctx : ContextInfo)
@@ -584,7 +584,7 @@ def errorNameCompletion
textEdit?
tags? := explan.metadata.removedVersion?.map fun _ => #[CompletionItemTag.deprecated]
data? := {
mod
uri
pos
cPos? := completionInfoPos
id? := none : ResolvableCompletionItemData
@@ -592,7 +592,7 @@ def errorNameCompletion
}
def tacticCompletion
(mod : Name)
(uri : DocumentUri)
(pos : Lsp.Position)
(completionInfoPos : Nat)
(ctx : ContextInfo)
@@ -605,7 +605,7 @@ def tacticCompletion
documentation? := tacticDoc.docString.map fun docString =>
{ value := docString, kind := MarkupKind.markdown : MarkupContent }
kind? := CompletionItemKind.keyword
data? := { mod, pos, cPos? := completionInfoPos, id? := none : ResolvableCompletionItemData }
data? := { uri, pos, cPos? := completionInfoPos, id? := none : ResolvableCompletionItemData }
}
return items
@@ -637,7 +637,7 @@ where
".".intercalate scopeNames[idx:].toList
def endSectionCompletion
(mod : Name)
(uri : DocumentUri)
(pos : Lsp.Position)
(completionInfoPos : Nat)
(id? : Option Name)
@@ -653,7 +653,7 @@ def endSectionCompletion
label := candidate
kind? := CompletionItemKind.module
data? := {
mod
uri
pos
cPos? := completionInfoPos
id? := none : ResolvableCompletionItemData

View File

@@ -14,7 +14,7 @@ namespace Lean.Lsp.ResolvableCompletionList
def compressItemDataFast (acc : String) (data : ResolvableCompletionItemData) :
String := Id.run do
let mut acc := acc ++ "["
acc := Json.renderString data.mod.toString acc
acc := Json.renderString data.uri acc
acc := acc ++ "," ++ data.pos.line.repr
acc := acc ++ "," ++ data.pos.character.repr
if let some cPos := data.cPos? then

View File

@@ -127,30 +127,30 @@ Sets the `data?` field of every `CompletionItem` in `completionList` using `para
`completionItem/resolve` requests can be routed to the correct file worker even for
`CompletionItem`s produced by the import completion.
-/
def addCompletionItemData (mod : Name) (pos : Lsp.Position) (completionList : CompletionList)
def addCompletionItemData (uri : DocumentUri) (pos : Lsp.Position) (completionList : CompletionList)
: CompletionList :=
let data := { mod, pos : Lean.Lsp.ResolvableCompletionItemData }
let data := { uri, pos : Lean.Lsp.ResolvableCompletionItemData }
{ completionList with items := completionList.items.map fun item =>
{ item with data? := some <| toJson data } }
def find (mod : Name) (pos : Lsp.Position) (text : FileMap) (headerStx : TSyntax ``Parser.Module.header) (availableImports : AvailableImports) : CompletionList :=
def find (uri : DocumentUri) (pos : Lsp.Position) (text : FileMap) (headerStx : TSyntax ``Parser.Module.header) (availableImports : AvailableImports) : CompletionList :=
let availableImports := availableImports.toImportTrie
let completionPos := text.lspPosToUtf8Pos pos
if isImportNameCompletionRequest headerStx completionPos then
let allAvailableImportNameCompletions := availableImports.toArray.map ({ label := toString · })
addCompletionItemData mod pos { isIncomplete := false, items := allAvailableImportNameCompletions }
addCompletionItemData uri pos { isIncomplete := false, items := allAvailableImportNameCompletions }
else if isImportCmdCompletionRequest headerStx completionPos then
let allAvailableFullImportCompletions := availableImports.toArray.map ({ label := s!"import {·}" })
addCompletionItemData mod pos { isIncomplete := false, items := allAvailableFullImportCompletions }
addCompletionItemData uri pos { isIncomplete := false, items := allAvailableFullImportCompletions }
else
let completionNames : Array Name := computePartialImportCompletions headerStx completionPos availableImports
let completions : Array CompletionItem := completionNames.map ({ label := toString · })
addCompletionItemData mod pos { isIncomplete := false, items := completions }
addCompletionItemData uri pos { isIncomplete := false, items := completions }
def computeCompletions (mod : Name) (pos : Lsp.Position) (text : FileMap) (headerStx : TSyntax ``Parser.Module.header)
def computeCompletions (uri : DocumentUri) (pos : Lsp.Position) (text : FileMap) (headerStx : TSyntax ``Parser.Module.header)
: IO CompletionList := do
let availableImports ← collectAvailableImports
let completionList := find mod pos text headerStx availableImports
return addCompletionItemData mod pos completionList
let completionList := find uri pos text headerStx availableImports
return addCompletionItemData uri pos completionList
end ImportCompletion

View File

@@ -13,34 +13,24 @@ public section
namespace Lean.Lsp
inductive FileIdent where
| uri (uri : DocumentUri)
| mod (mod : Name)
deriving Inhabited
instance : ToString FileIdent where
toString
| .uri uri => toString uri
| .mod mod => toString mod
class FileSource (α : Type) where
fileSource : α → FileIdent
fileSource : α → DocumentUri
export FileSource (fileSource)
instance : FileSource Location :=
⟨fun l => .uri l.uri⟩
⟨fun l => l.uri⟩
instance : FileSource TextDocumentIdentifier :=
⟨fun i => .uri i.uri⟩
⟨fun i => i.uri⟩
instance : FileSource VersionedTextDocumentIdentifier :=
⟨fun i => .uri i.uri⟩
⟨fun i => i.uri⟩
instance : FileSource TextDocumentEdit :=
⟨fun e => fileSource e.textDocument⟩
instance : FileSource TextDocumentItem :=
⟨fun i => .uri i.uri⟩
⟨fun i => i.uri⟩
instance : FileSource TextDocumentPositionParams :=
⟨fun p => fileSource p.textDocument⟩
@@ -76,7 +66,7 @@ instance : FileSource ReferenceParams :=
⟨fun h => fileSource h.toTextDocumentPositionParams⟩
instance : FileSource WaitForDiagnosticsParams :=
⟨fun p => .uri p.uri⟩
⟨fun p => p.uri⟩
instance : FileSource DocumentHighlightParams :=
⟨fun h => fileSource h.toTextDocumentPositionParams⟩
@@ -100,16 +90,16 @@ instance : FileSource PlainTermGoalParams where
fileSource p := fileSource p.textDocument
instance : FileSource RpcConnectParams where
fileSource p := .uri p.uri
fileSource p := p.uri
instance : FileSource RpcCallParams where
fileSource p := fileSource p.textDocument
instance : FileSource RpcReleaseParams where
fileSource p := .uri p.uri
fileSource p := p.uri
instance : FileSource RpcKeepAliveParams where
fileSource p := .uri p.uri
fileSource p := p.uri
instance : FileSource CodeActionParams where
fileSource p := fileSource p.textDocument
@@ -133,20 +123,20 @@ Since this function can panic and clients typically send `completionItem/resolve
selected completion item, all completion items returned by the server in `textDocument/completion`
requests must have a `data?` field that has a `mod` field.
-/
def CompletionItem.getFileSource! (item : CompletionItem) : FileIdent :=
let r : Except String FileIdent := do
def CompletionItem.getFileSource! (item : CompletionItem) : DocumentUri :=
let r : Except String DocumentUri := do
let some data := item.data?
| throw s!"no data param on completion item {item.label}"
match data with
| .obj _ =>
-- In the language server, `data` is always an array,
-- but we also support having `mod` as an object field for
-- but we also support having `uri` as an object field for
-- `chainLspRequestHandler` consumers.
let mod ← data.getObjValAs? Name "mod"
return .mod mod
let uri ← data.getObjValAs? DocumentUri "uri"
return uri
| .arr _ =>
let mod ← fromJson? <| data.getArrVal? 0
return .mod mod
let uri ← fromJson? <| data.getArrVal? 0
return uri
| _ =>
throw s!"unexpected completion item data: {data}"
match r with

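The `data?` convention that `getFileSource!` parses can be sketched as follows (a non-authoritative illustration; `mkItemData` is a name invented here):

```lean
-- Hedged sketch: the server stores the source `DocumentUri` as the first
-- element of a JSON array in `data?`; `chainLspRequestHandler` consumers may
-- instead provide it as a `uri` object field.
def mkItemData (uri : DocumentUri) : Json :=
  Json.arr #[toJson uri]
-- object form accepted on the resolve side:
-- Json.mkObj [("uri", toJson uri)]
```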
View File

@@ -155,13 +155,12 @@ section Elab
let param := { version := m.version, references, decls }
return { method, param }
private def mkIleanHeaderInfoNotification (m : DocumentMeta)
(directImports : Array ImportInfo) : JsonRpc.Notification Lsp.LeanILeanHeaderInfoParams :=
{ method := "$/lean/ileanHeaderInfo", param := { version := m.version, directImports } }
private def mkIleanHeaderSetupInfoNotification (m : DocumentMeta)
(isSetupFailure : Bool) : JsonRpc.Notification Lsp.LeanILeanHeaderSetupInfoParams :=
{ method := "$/lean/ileanHeaderSetupInfo", param := { version := m.version, isSetupFailure } }
(directImports : Array ImportInfo) (isSetupFailure : Bool) :
JsonRpc.Notification Lsp.LeanILeanHeaderSetupInfoParams := {
method := "$/lean/ileanHeaderSetupInfo"
param := { version := m.version, directImports, isSetupFailure }
}
private def mkIleanInfoUpdateNotification : DocumentMeta → Array Elab.InfoTree →
BaseIO (JsonRpc.Notification Lsp.LeanIleanInfoParams) :=
@@ -400,7 +399,6 @@ def setupImports
return .error { diagnostics := .empty, result? := none, metaSnap := default }
let header := stx.toModuleHeader
chanOut.sync.send <| .ofMsg <| mkIleanHeaderInfoNotification doc <| collectImports stx
let fileSetupResult ← setupFile doc header fun stderrLine => do
let progressDiagnostic := {
range := ⟨⟨0, 0⟩, ⟨1, 0⟩⟩
@@ -410,29 +408,33 @@ def setupImports
message := stderrLine
}
chanOut.sync.send <| .ofMsg <| mkPublishDiagnosticsNotification doc #[progressDiagnostic]
let isSetupError := fileSetupResult.kind matches .importsOutOfDate
|| fileSetupResult.kind matches .error ..
chanOut.sync.send <| .ofMsg <| mkIleanHeaderSetupInfoNotification doc isSetupError
let isSetupError := fileSetupResult matches .importsOutOfDate
|| fileSetupResult matches .error ..
chanOut.sync.send <| .ofMsg <|
mkIleanHeaderSetupInfoNotification doc (collectImports stx) isSetupError
-- clear progress notifications in the end
chanOut.sync.send <| .ofMsg <| mkPublishDiagnosticsNotification doc #[]
match fileSetupResult.kind with
| .importsOutOfDate =>
return .error {
diagnostics := (← Language.diagnosticsOfHeaderError
"Imports are out of date and must be rebuilt; \
use the \"Restart File\" command in your editor.")
result? := none
metaSnap := default
}
| .error msg =>
return .error {
diagnostics := (← diagnosticsOfHeaderError msg)
result? := none
metaSnap := default
}
| _ => pure ()
let setup := fileSetupResult.setup
let setup ← do
match fileSetupResult with
| .importsOutOfDate =>
return .error {
diagnostics := (← Language.diagnosticsOfHeaderError
"Imports are out of date and must be rebuilt; \
use the \"Restart File\" command in your editor.")
result? := none
metaSnap := default
: Language.Lean.HeaderProcessedSnapshot
}
| .error msg =>
return .error {
diagnostics := (← diagnosticsOfHeaderError msg)
result? := none
metaSnap := default
}
| .noLakefile =>
pure { name := doc.mod, isModule := header.isModule }
| .success setup =>
pure setup
-- override cmdline options with file options
let opts := cmdlineOpts.mergeBy (fun _ _ fileOpt => fileOpt) setup.options.toOptions
@@ -704,14 +706,14 @@ section MessageHandling
: WorkerM (ServerTask (Except Error AvailableImportsCache)) := do
let ctx ← read
let st ← get
let mod := st.doc.meta.mod
let uri := st.doc.meta.uri
let text := st.doc.meta.text
match st.importCachingTask? with
| none => ServerTask.IO.asTask do
let availableImports ← ImportCompletion.collectAvailableImports
let lastRequestTimestampMs ← IO.monoMsNow
let completions := ImportCompletion.find mod params.position text st.doc.initSnap.stx availableImports
let completions := ImportCompletion.find uri params.position text st.doc.initSnap.stx availableImports
ctx.chanOut.sync.send <| .ofMsg <| .response id (toJson completions)
pure { availableImports, lastRequestTimestampMs : AvailableImportsCache }
@@ -721,7 +723,7 @@ section MessageHandling
if timestampNowMs - lastRequestTimestampMs >= 10000 then
availableImports ← ImportCompletion.collectAvailableImports
lastRequestTimestampMs := timestampNowMs
let completions := ImportCompletion.find mod params.position text st.doc.initSnap.stx availableImports
let completions := ImportCompletion.find uri params.position text st.doc.initSnap.stx availableImports
ctx.chanOut.sync.send <| .ofMsg <| .response id (toJson completions)
pure { availableImports, lastRequestTimestampMs : AvailableImportsCache }

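The caching logic above can be sketched as a simple staleness check (a minimal sketch; `shouldRefresh` is an illustrative helper, not part of this diff):

```lean
-- Hedged sketch of the cache policy above: the cached available imports are
-- reused, and recomputed only when the previous completion request is at
-- least 10 seconds old.
def shouldRefresh (lastRequestTimestampMs timestampNowMs : Nat) : Bool :=
  timestampNowMs - lastRequestTimestampMs >= 10000
```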
View File

@@ -49,7 +49,7 @@ def handleCompletion (p : CompletionParams)
mapTaskCostly (findCompletionCmdDataAtPos doc pos) fun cmdData? => do
let some (cmdStx, infoTree) := cmdData?
| return { items := #[], isIncomplete := true }
Completion.find? doc.meta.mod p.position doc.meta.text pos cmdStx infoTree caps
Completion.find? doc.meta.uri p.position doc.meta.text pos cmdStx infoTree caps
/--
Handles `completionItem/resolve` requests that are sent by the client after the user selects

View File

@@ -56,10 +56,10 @@ partial def runLakeSetupFile
let exitCode ← lakeProc.wait
return ⟨spawnArgs, exitCode, stdout, stderr⟩
/-- Categorizes possible outcomes of running `lake setup-file`. -/
inductive FileSetupResultKind where
/-- Result of running `lake setup-file`. -/
inductive FileSetupResult where
/-- File configuration loaded and dependencies updated successfully. -/
| success
| success (setup : ModuleSetup)
/-- No Lake project found, no setup was done. -/
| noLakefile
/-- Imports must be rebuilt but `--no-build` was specified. -/
@@ -67,43 +67,16 @@ inductive FileSetupResultKind where
/-- Other error during Lake invocation. -/
| error (msg : String)
/-- Result of running `lake setup-file`. -/
structure FileSetupResult where
/-- Kind of outcome. -/
kind : FileSetupResultKind
/-- Configuration from a successful setup, or else the default. -/
setup : ModuleSetup
def FileSetupResult.ofSuccess (setup : ModuleSetup) : IO FileSetupResult := do return {
kind := FileSetupResultKind.success
setup
}
def FileSetupResult.ofNoLakefile (m : DocumentMeta) (header : ModuleHeader) : IO FileSetupResult := do return {
kind := FileSetupResultKind.noLakefile
setup := {name := m.mod, isModule := header.isModule}
}
def FileSetupResult.ofImportsOutOfDate (m : DocumentMeta) (header : ModuleHeader) : IO FileSetupResult := do return {
kind := FileSetupResultKind.importsOutOfDate
setup := {name := m.mod, isModule := header.isModule}
}
def FileSetupResult.ofError (m : DocumentMeta) (header : ModuleHeader) (msg : String) : IO FileSetupResult := do return {
kind := FileSetupResultKind.error msg
setup := {name := m.mod, isModule := header.isModule}
}
/-- Uses `lake setup-file` to compile dependencies on the fly and add them to `LEAN_PATH`.
Compilation progress is reported to `handleStderr`. Returns the search path for
source files and the options for the file. -/
partial def setupFile (m : DocumentMeta) (header : ModuleHeader) (handleStderr : String → IO Unit) : IO FileSetupResult := do
let some filePath := System.Uri.fileUriToPath? m.uri
| return FileSetupResult.ofNoLakefile m header -- untitled files have no lakefile
| return FileSetupResult.noLakefile -- untitled files have no lakefile
let lakePath ← determineLakePath
if !(← System.FilePath.pathExists lakePath) then
return FileSetupResult.ofNoLakefile m header
return FileSetupResult.noLakefile
let result ← runLakeSetupFile m lakePath filePath header handleStderr
@@ -112,12 +85,12 @@ partial def setupFile (m : DocumentMeta) (header : ModuleHeader) (handleStderr :
match result.exitCode with
| 0 =>
let Except.ok (setup : ModuleSetup) := Json.parse result.stdout >>= fromJson?
| return FileSetupResult.ofError m header s!"Invalid output from `{cmdStr}`:\n{result.stdout}\nstderr:\n{result.stderr}"
| return FileSetupResult.error s!"Invalid output from `{cmdStr}`:\n{result.stdout}\nstderr:\n{result.stderr}"
setup.dynlibs.forM loadDynlib
FileSetupResult.ofSuccess setup
return FileSetupResult.success setup
| 2 => -- exit code for lake reporting that there is no lakefile
FileSetupResult.ofNoLakefile m header
return FileSetupResult.noLakefile
| 3 => -- exit code for `--no-build`
FileSetupResult.ofImportsOutOfDate m header
return FileSetupResult.importsOutOfDate
| _ =>
FileSetupResult.ofError m header s!"`{cmdStr}` failed:\n{result.stdout}\nstderr:\n{result.stderr}"
return FileSetupResult.error s!"`{cmdStr}` failed:\n{result.stdout}\nstderr:\n{result.stderr}"

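With the `kind` field folded into the inductive itself, call sites match on the constructors directly; a minimal sketch (`describe` is illustrative only):

```lean
-- Hedged sketch: the flattened `FileSetupResult` carries the `ModuleSetup`
-- payload in its `success` constructor, so no separate `setup` field with a
-- default value is needed for the failure cases.
def describe : FileSetupResult → String
  | .success _        => "file configuration loaded"
  | .noLakefile       => "no Lake project found; default setup"
  | .importsOutOfDate => "imports must be rebuilt (--no-build)"
  | .error msg        => s!"lake setup-file failed: {msg}"
```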
View File

@@ -571,31 +571,6 @@ def removeIlean (self : References) (path : System.FilePath) : References :=
namesToRemove.foldl (init := self) fun self name _ =>
{ self with ileans := self.ileans.erase name }
/--
Replaces the direct imports of a worker for the module `name` in `self` with
a new set of direct imports.
-/
def updateWorkerImports
(self : References)
(name : Name)
(moduleUri : DocumentUri)
(version : Nat)
(directImports : Array ImportInfo)
: IO References := do
let directImports ← DirectImports.convertImportInfos directImports
let some workerRefs := self.workers[name]?
| return { self with workers := self.workers.insert name { moduleUri, version, directImports, isSetupFailure? := none, refs := ∅, decls := ∅ } }
match compare version workerRefs.version with
| .lt => return self
| .gt => return { self with workers := self.workers.insert name { moduleUri, version, directImports, isSetupFailure? := none, refs := ∅, decls := ∅ } }
| .eq =>
let isSetupFailure? := workerRefs.isSetupFailure?
let refs := workerRefs.refs
let decls := workerRefs.decls
return { self with
workers := self.workers.insert name { moduleUri, version, directImports, isSetupFailure?, refs, decls }
}
/--
Replaces the direct imports of a worker for the module `name` in `self` with
a new set of direct imports.
@@ -605,12 +580,13 @@ def updateWorkerSetupInfo
(name : Name)
(moduleUri : DocumentUri)
(version : Nat)
(directImports : Array ImportInfo)
(isSetupFailure : Bool)
: IO References := do
let directImports ← DirectImports.convertImportInfos directImports
let isSetupFailure? := some isSetupFailure
let some workerRefs := self.workers[name]?
| return { self with workers := self.workers.insert name { moduleUri, version, directImports := ∅, isSetupFailure?, refs := ∅, decls := ∅ } }
let directImports := workerRefs.directImports
| return { self with workers := self.workers.insert name { moduleUri, version, directImports, isSetupFailure?, refs := ∅, decls := ∅ } }
match compare version workerRefs.version with
| .lt => return self
| .gt => return { self with workers := self.workers.insert name { moduleUri, version, directImports, isSetupFailure?, refs := ∅, decls := ∅ } }
@@ -702,7 +678,8 @@ def allDirectImports (self : References) : AllDirectImportsMap := Id.run do
for (name, ilean) in self.ileans do
allDirectImports := allDirectImports.insert name (ilean.moduleUri, ilean.directImports)
for (name, worker) in self.workers do
allDirectImports := allDirectImports.insert name (worker.moduleUri, worker.directImports)
if worker.hasRefs then
allDirectImports := allDirectImports.insert name (worker.moduleUri, worker.directImports)
return allDirectImports
/--
@@ -723,7 +700,8 @@ The current imports in a file worker take precedence over those in .ilean files.
-/
def getDirectImports? (self : References) (mod : Name) : Option DirectImports := do
if let some worker := self.workers[mod]? then
return worker.directImports
if worker.hasRefs then
return worker.directImports
if let some ilean := self.ileans[mod]? then
return ilean.directImports
none

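The version handling in `updateWorkerSetupInfo` above follows a three-way policy; a minimal sketch (the `Entry` record and `applyUpdate` helper are illustrative, not part of this diff):

```lean
-- Hedged sketch of the version policy: stale updates are dropped, newer
-- ones replace the stored entry, and a same-version update merges the new
-- setup info into the existing entry.
structure Entry where
  version : Nat
  isSetupFailure? : Option Bool := none

def applyUpdate (stored : Entry) (version : Nat) (isSetupFailure : Bool) : Entry :=
  match compare version stored.version with
  | .lt => stored                                                 -- stale: ignore
  | .gt => { version, isSetupFailure? := some isSetupFailure }    -- replace
  | .eq => { stored with isSetupFailure? := some isSetupFailure } -- merge
```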
View File

@@ -485,7 +485,7 @@ def SerializedLspResponse.toSerializedMessage
++ "}"
structure RequestHandler where
fileSource : Json → Except RequestError FileIdent
fileSource : Json → Except RequestError DocumentUri
handle : Json → RequestM (RequestTask SerializedLspResponse)
builtin_initialize requestHandlers : IO.Ref (PersistentHashMap String RequestHandler)
@@ -597,7 +597,7 @@ inductive RequestHandlerCompleteness where
| «partial» (refreshMethod : String) (refreshIntervalMs : Nat)
structure StatefulRequestHandler where
fileSource : Json → Except RequestError FileIdent
fileSource : Json → Except RequestError DocumentUri
/--
`handle` with explicit state management for chaining request handlers.
This function is pure w.r.t. `lastTaskMutex` and `stateRef`, but not `RequestM`.
@@ -790,7 +790,7 @@ def handleLspRequest (method : String) (params : Json) : RequestM (RequestTask S
s!"request '{method}' routed through watchdog but unknown in worker; are both using the same plugins?"
| some rh => rh.handle params
def routeLspRequest (method : String) (params : Json) : IO (Except RequestError FileIdent) := do
def routeLspRequest (method : String) (params : Json) : IO (Except RequestError DocumentUri) := do
if isStatefulLspRequestMethod method then
match lookupStatefulLspRequestHandler method with
| none => return Except.error <| RequestError.methodNotFound method

View File

@@ -273,13 +273,13 @@ def patchUri (s : String) : IO String := do
catch _ =>
return s
let c := path.components.toArray
if let some srcIdx := c.findIdx? (· == "src") then
if let some (_, srcIdx) := c.zipIdx.filter (·.1 == "src") |>.back? then
if ! c[srcIdx + 1]?.any (fun dir => dir == "Init" || dir == "Lean" || dir == "Std") then
return s
let c := c.drop <| srcIdx
let path := System.mkFilePath c.toList
return System.Uri.pathToUri path
if let some testIdx := c.findIdx? (· == "tests") then
if let some (_, testIdx) := c.zipIdx.filter (·.1 == "tests") |>.back? then
let c := c.drop <| testIdx
let path := System.mkFilePath c.toList
return System.Uri.pathToUri path

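The change above switches from the first matching path component to the last one; a minimal sketch (`lastIdxOf` is an illustrative helper, not part of this diff):

```lean
-- Hedged sketch: `findIdx?` yields the *first* matching component, while
-- `zipIdx.filter … |>.back?` yields the *last* one, which matters for paths
-- that contain "src" (or "tests") more than once.
def lastIdxOf (c : Array String) (s : String) : Option Nat := do
  let (_, i) ← c.zipIdx.filter (·.1 == s) |>.back?
  return i
-- e.g. for #["home", "src", "proj", "src", "Lean"], `findIdx? (· == "src")`
-- is expected to give index 1, while `lastIdxOf` gives index 3.
```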
View File

@@ -265,31 +265,6 @@ section FileWorker
end FileWorker
section ServerM
structure FileWorkerMap where
fileWorkers : Std.TreeMap DocumentUri FileWorker := {}
uriByMod : Std.TreeMap Name DocumentUri Name.quickCmp := {}
def FileWorkerMap.getUri? (m : FileWorkerMap) (id : FileIdent) : Option DocumentUri :=
match id with
| .uri uri => uri
| .mod mod => m.uriByMod.get? mod
def FileWorkerMap.insert (m : FileWorkerMap) (uri : DocumentUri) (fw : FileWorker) :
FileWorkerMap where
fileWorkers := m.fileWorkers.insert uri fw
uriByMod := m.uriByMod.insert fw.doc.mod uri
def FileWorkerMap.erase (m : FileWorkerMap) (uri : DocumentUri) : FileWorkerMap := Id.run do
let some fw := m.fileWorkers.get? uri
| return m
return {
fileWorkers := m.fileWorkers.erase uri
uriByMod := m.uriByMod.erase fw.doc.mod
}
def FileWorkerMap.get? (m : FileWorkerMap) (uri : DocumentUri) : Option FileWorker := do
m.fileWorkers.get? uri
abbrev ImportMap := Std.TreeMap DocumentUri (Std.TreeSet DocumentUri)
/-- Global import data for all open files managed by this watchdog. -/
@@ -414,7 +389,7 @@ section ServerM
logData : LogData
/-- Command line arguments. -/
args : List String
fileWorkersRef : IO.Ref FileWorkerMap
fileWorkersRef : IO.Ref (Std.TreeMap DocumentUri FileWorker)
/-- We store these to pass them to workers. -/
initParams : InitializeParams
workerPath : System.FilePath
@@ -464,15 +439,10 @@ section ServerM
let rd ← rd.modifyReferencesM (m := IO) f
set rd
def getFileWorkerUri? (id : FileIdent) : ServerM (Option DocumentUri) :=
return (← (← read).fileWorkersRef.get).getUri? id
def getFileWorker? (uri : DocumentUri) : ServerM (Option FileWorker) :=
return (← (← read).fileWorkersRef.get).get? uri
def fileWorkerExists (id : FileIdent) : ServerM Bool := do
let some uri ← getFileWorkerUri? id
| return false
def fileWorkerExists (uri : DocumentUri) : ServerM Bool := do
return (← getFileWorker? uri).isSome
def eraseFileWorker (uri : DocumentUri) : ServerM Unit := do
@@ -528,15 +498,10 @@ section ServerM
let r? ← eraseGetPendingRequest uri id
return r?.isSome
def handleILeanHeaderInfo (fw : FileWorker) (params : LeanILeanHeaderInfoParams) : ServerM Unit := do
let module := fw.doc.mod
let uri := fw.doc.uri
modifyReferencesIO (·.updateWorkerImports module uri params.version params.directImports)
def handleILeanHeaderSetupInfo (fw : FileWorker) (params : LeanILeanHeaderSetupInfoParams) : ServerM Unit := do
let module := fw.doc.mod
let uri := fw.doc.uri
modifyReferencesIO (·.updateWorkerSetupInfo module uri params.version params.isSetupFailure)
modifyReferencesIO (·.updateWorkerSetupInfo module uri params.version params.directImports params.isSetupFailure)
def handleIleanInfoUpdate (fw : FileWorker) (params : LeanIleanInfoParams) : ServerM Unit := do
let module := fw.doc.mod
@@ -854,9 +819,6 @@ section ServerM
let globalID ← (← read).serverRequestData.modifyGet
(·.trackOutboundRequest fw.doc.uri id)
writeMessage (Message.request globalID method params?)
| .notification "$/lean/ileanHeaderInfo" =>
if let .ok params := parseNotificationParams? LeanILeanHeaderInfoParams msg then
handleILeanHeaderInfo fw params
| .notification "$/lean/ileanHeaderSetupInfo" =>
if let .ok params := parseNotificationParams? LeanILeanHeaderSetupInfoParams msg then
handleILeanHeaderSetupInfo fw params
@@ -949,11 +911,9 @@ section ServerM
pure ()
def tryWriteMessage
(id : FileIdent)
(uri : DocumentUri)
(msg : JsonRpc.Message)
: ServerM Unit := do
let some uri ← getFileWorkerUri? id
| return
let some fw ← getFileWorker? uri
| return
if let some req := JsonRpc.Request.ofMessage? msg then
@@ -986,7 +946,7 @@ section ServerM
staleDependency := staleDependency
: LeanStaleDependencyParams
}
tryWriteMessage (.uri uri) notification
tryWriteMessage uri notification
end ServerM
section RequestHandling
@@ -1294,7 +1254,7 @@ section NotificationHandling
}
updateFileWorkers { fw with doc := newDoc }
let notification := Notification.mk "textDocument/didChange" p
tryWriteMessage (.uri doc.uri) notification
tryWriteMessage doc.uri notification
/--
When a file is saved, notifies all file workers for files that depend on this file that this
@@ -1307,7 +1267,7 @@ section NotificationHandling
let importData ← s.importData.get
let dependents := importData.importedBy.getD p.textDocument.uri ∅
for uri, _ in fws.fileWorkers do
for uri, _ in fws do
if ! dependents.contains uri then
continue
notifyAboutStaleDependency uri p.textDocument.uri
@@ -1349,7 +1309,7 @@ section NotificationHandling
let ctx ← read
let some uri ← ctx.requestData.getUri? p.id
| return
tryWriteMessage (.uri uri) (Notification.mk "$/cancelRequest" p)
tryWriteMessage uri (Notification.mk "$/cancelRequest" p)
def forwardNotification {α : Type} [ToJson α] [FileSource α] (method : String) (params : α)
: ServerM Unit :=
@@ -1387,7 +1347,7 @@ section MessageHandling
def handleReferenceRequest α β [FromJson α] [ToJson β] (id : RequestID) (params : Json)
(handler : α → ReaderT ReferenceRequestContext IO β) : ServerM Unit := do
let ctx ← read
let fileWorkerMods := (← ctx.fileWorkersRef.get).fileWorkers.map fun _ fw => fw.doc.mod
let fileWorkerMods := (← ctx.fileWorkersRef.get).map fun _ fw => fw.doc.mod
let references ← getReferences
let _ ← ServerTask.IO.asTask <| ReaderT.run (ρ := ServerContext) (r := ctx) do
try
@@ -1498,7 +1458,7 @@ section MessageHandling
def handleResponse (id : RequestID) (result : Json) : ServerM Unit := do
let some translation ← (← read).serverRequestData.modifyGet (·.translateInboundResponse id)
| return
tryWriteMessage (.uri translation.sourceUri) (Response.mk translation.localID result)
tryWriteMessage translation.sourceUri (Response.mk translation.localID result)
def handleResponseError
(id : RequestID)
@@ -1508,15 +1468,15 @@ section MessageHandling
: ServerM Unit := do
let some translation ← (← read).serverRequestData.modifyGet (·.translateInboundResponse id)
| return
tryWriteMessage (.uri translation.sourceUri) (ResponseError.mk translation.localID code message data?)
tryWriteMessage translation.sourceUri (ResponseError.mk translation.localID code message data?)
end MessageHandling
section MainLoop
def shutdown : ServerM Unit := do
let fileWorkers ← (← read).fileWorkersRef.get
for uri, _ in fileWorkers.fileWorkers do
for uri, _ in fileWorkers do
try terminateFileWorker uri catch _ => pure ()
for _, fw in fileWorkers.fileWorkers do
for _, fw in fileWorkers do
-- TODO: Wait for process group to finish instead
try let _ ← fw.killProcAndWait catch _ => pure ()
@@ -1540,7 +1500,7 @@ section MainLoop
let st ← read
let workers ← st.fileWorkersRef.get
let mut workerTasks := #[]
for (_, fw) in workers.fileWorkers do
for (_, fw) in workers do
let some commTask := fw.commTask?
| continue
if (← getWorkerState fw) matches WorkerState.crashed then
@@ -1765,7 +1725,7 @@ def initAndRunWatchdog (args : List String) (i o : FS.Stream) : IO Unit := do
pendingWaitForILeanRequests := #[]
}
startLoadingReferences referenceData
let fileWorkersRef ← IO.mkRef ({} : FileWorkerMap)
let fileWorkersRef ← IO.mkRef ({} : Std.TreeMap DocumentUri FileWorker)
let serverRequestData ← IO.mkRef {
pendingServerRequests := Std.TreeMap.empty
freshServerRequestID := 0

View File

@@ -270,9 +270,9 @@ withTraceNode `isPosTrace (msg := (return m!"{ExceptToEmoji.toEmoji ·} checking
The `cls`, `collapsed`, and `tag` arguments are forwarded to the constructor of `TraceData`.
-/
@[inline]
def withTraceNode [always : MonadAlwaysExcept ε m] [MonadLiftT BaseIO m] (cls : Name)
(msg : Except ε α → m MessageData) (k : m α) (collapsed := true) (tag := "") : m α := do
let _ := always.except
let opts ← getOptions
if !opts.hasTrace then
return (← k)
@@ -280,21 +280,27 @@ def withTraceNode [always : MonadAlwaysExcept ε m] [MonadLiftT BaseIO m] (cls :
unless clsEnabled || trace.profiler.get opts do
return (← k)
let oldTraces ← getResetTraces
let (res, start, stop) ← withStartStop opts <| observing k
let aboveThresh := trace.profiler.get opts &&
stop - start > trace.profiler.threshold.unitAdjusted opts
unless clsEnabled || aboveThresh do
modifyTraces (oldTraces ++ ·)
return (← MonadExcept.ofExcept res)
let ref ← getRef
let mut m ← try msg res catch _ => pure m!"<exception thrown while producing trace node message>"
let mut data := { cls, collapsed, tag }
if trace.profiler.get opts then
data := { data with startTime := start, stopTime := stop }
addTraceNode oldTraces data ref m
MonadExcept.ofExcept res
let resStartStop ← withStartStop opts <| let _ := always.except; observing k
postCallback opts clsEnabled oldTraces msg resStartStop
where
postCallback (opts : Options) (clsEnabled oldTraces msg resStartStop) : m α := do
let _ := always.except
let (res, start, stop) := resStartStop
let aboveThresh := trace.profiler.get opts &&
stop - start > trace.profiler.threshold.unitAdjusted opts
unless clsEnabled || aboveThresh do
modifyTraces (oldTraces ++ ·)
return (← MonadExcept.ofExcept res)
let ref ← getRef
let mut m ← try msg res catch _ => pure m!"<exception thrown while producing trace node message>"
let mut data := { cls, collapsed, tag }
if trace.profiler.get opts then
data := { data with startTime := start, stopTime := stop }
addTraceNode oldTraces data ref m
MonadExcept.ofExcept res
/-- A version of `Lean.withTraceNode` which allows generating the message within the computation. -/
@[inline]
def withTraceNode' [MonadAlwaysExcept Exception m] [MonadLiftT BaseIO m] (cls : Name)
(k : m (α × MessageData)) (collapsed := true) (tag := "") : m α :=
let msg := fun
@@ -380,10 +386,10 @@ the result produced by `k` into an emoji (e.g., `💥️`, `✅️`, `❌️`).
TODO: find better name for this function.
-/
@[inline]
def withTraceNodeBefore [MonadRef m] [AddMessageContext m] [MonadOptions m]
[always : MonadAlwaysExcept ε m] [MonadLiftT BaseIO m] [ExceptToEmoji ε α] (cls : Name)
(msg : Unit → m MessageData) (k : m α) (collapsed := true) (tag := "") : m α := do
let _ := always.except
let opts ← getOptions
if !opts.hasTrace then
return (← k)
@@ -394,18 +400,23 @@ def withTraceNodeBefore [MonadRef m] [AddMessageContext m] [MonadOptions m]
let ref ← getRef
-- make sure to preserve context *before* running `k`
let msg ← withRef ref do addMessageContext (← msg ())
let (res, start, stop) ← withStartStop opts <| observing k
let aboveThresh := trace.profiler.get opts &&
stop - start > trace.profiler.threshold.unitAdjusted opts
unless clsEnabled || aboveThresh do
modifyTraces (oldTraces ++ ·)
return (← MonadExcept.ofExcept res)
let mut msg := m!"{ExceptToEmoji.toEmoji res} {msg}"
let mut data := { cls, collapsed, tag }
if trace.profiler.get opts then
data := { data with startTime := start, stopTime := stop }
addTraceNode oldTraces data ref msg
MonadExcept.ofExcept res
let resStartStop ← withStartStop opts <| let _ := always.except; observing k
postCallback opts clsEnabled oldTraces ref msg resStartStop
where
postCallback (opts : Options) (clsEnabled oldTraces ref msg resStartStop) : m α := do
let _ := always.except
let (res, start, stop) := resStartStop
let aboveThresh := trace.profiler.get opts &&
stop - start > trace.profiler.threshold.unitAdjusted opts
unless clsEnabled || aboveThresh do
modifyTraces (oldTraces ++ ·)
return (← MonadExcept.ofExcept res)
let mut msg := m!"{ExceptToEmoji.toEmoji res} {msg}"
let mut data := { cls, collapsed, tag }
if trace.profiler.get opts then
data := { data with startTime := start, stopTime := stop }
addTraceNode oldTraces data ref msg
MonadExcept.ofExcept res
def addTraceAsMessages [Monad m] [MonadRef m] [MonadLog m] [MonadTrace m] : m Unit := do
if trace.profiler.output.get? (← getOptions) |>.isSome then

View File

@@ -330,7 +330,7 @@ abbrev PostCond (α : Type u) (ps : PostShape.{u}) : Type u :=
@[inherit_doc PostCond]
scoped macro:max "post⟨" handlers:term,+,? "⟩" : term =>
`(by exact ⟨$handlers,*, ()⟩)
`(by exact ⟨$handlers,*, PUnit.unit⟩)
-- NB: Postponement through by exact is the entire point of this macro
-- until https://github.com/leanprover/lean4/pull/8074 lands

View File

@@ -143,12 +143,27 @@ you want to use `mvcgen` to reason about `prog`.
theorem ReaderM.of_wp_run_eq {α} {x : α} {prog : ReaderM ρ α} (h : ReaderT.run prog r = x) (P : α → Prop) :
(wp⟦prog⟧ (⇓ a _ => P a) r) → P x := h ▸ (· True.intro)
/--
Adequacy lemma for `Except`.
Useful if you want to prove a property about a complex expression `prog : Except ε α` that you have
generalized to a variable `x` and you want to use `mvcgen` to reason about `prog`.
-/
theorem Except.of_wp_eq {ε α : Type u} {x prog : Except ε α} (h : prog = x) (P : Except ε α → Prop) :
(wp⟦prog⟧ post⟨fun a => P (.ok a), fun e => P (.error e)⟩) → P x := by
subst h
intro hspec
simp only [wp, PredTrans.pushExcept_apply, PredTrans.pure_apply] at hspec
split at hspec
case h_1 a s' heq => rw [← heq] at hspec; exact hspec True.intro
case h_2 e s' heq => rw [← heq] at hspec; exact hspec True.intro
/--
Adequacy lemma for `Except`.
Useful if you want to prove a property about an expression `prog : Except ε α` and you want to use
`mvcgen` to reason about `prog`.
-/
theorem Except.of_wp {α} {prog : Except ε α} (P : Except ε α → Prop) :
@[deprecated Except.of_wp_eq (since := "2026-01-26")]
theorem Except.of_wp {ε α : Type u} {prog : Except ε α} (P : Except ε α → Prop) :
(wp⟦prog⟧ post⟨fun a => P (.ok a), fun e => P (.error e)⟩) → P prog := by
intro hspec
simp only [wp, PredTrans.pushExcept_apply, PredTrans.pure_apply] at hspec
@@ -156,6 +171,20 @@ theorem Except.of_wp {α} {prog : Except ε α} (P : Except ε α → Prop) :
case h_1 a s' heq => rw [← heq] at hspec; exact hspec True.intro
case h_2 e s' heq => rw [← heq] at hspec; exact hspec True.intro
/--
Adequacy lemma for `Option`.
Useful if you want to prove a property about a complex expression `prog : Option α` that you have
generalized to a variable `x` and you want to use `mvcgen` to reason about `prog`.
-/
theorem Option.of_wp_eq {α : Type u} {x prog : Option α} (h : prog = x) (P : Option α → Prop) :
(wp⟦prog⟧ post⟨fun a => P (some a), fun _ => P none⟩) → P x := by
subst h
intro hspec
simp only [wp, PredTrans.pushOption_apply, PredTrans.pure_apply] at hspec
split at hspec
case h_1 a s' heq => rw [← heq] at hspec; exact hspec True.intro
case h_2 s' heq => rw [← heq] at hspec; exact hspec True.intro
/--
Adequacy lemma for `EStateM.run`.
Useful if you want to prove a property about an expression `x` defined as `EStateM.run prog s` and

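The deprecated non-generalized lemma is recoverable from the new `_eq` variant; a minimal sketch (non-authoritative; instantiation shown for `Except`):

```lean
-- Hedged sketch: instantiating the generalization equation of `of_wp_eq`
-- with `rfl` specializes it back to the shape of the deprecated `of_wp`.
example {ε α : Type u} {prog : Except ε α} (P : Except ε α → Prop) :=
  Except.of_wp_eq (x := prog) (prog := prog) rfl P
```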
View File

@@ -2664,15 +2664,19 @@ static inline size_t lean_isize_mul(size_t a1, size_t a2) {
static inline size_t lean_isize_div(size_t a1, size_t a2) {
ptrdiff_t lhs = (ptrdiff_t)a1;
ptrdiff_t rhs = (ptrdiff_t)a2;
return (size_t)(rhs == 0 ? 0 : lhs / rhs);
if (rhs == 0) return 0;
// Check for overflow: PTRDIFF_MIN / -1 would trap on x86 idiv
if (lhs == PTRDIFF_MIN && rhs == -1) return (size_t)PTRDIFF_MIN;
return (size_t)(lhs / rhs);
}
static inline size_t lean_isize_mod(size_t a1, size_t a2) {
ptrdiff_t lhs = (ptrdiff_t)a1;
ptrdiff_t rhs = (ptrdiff_t)a2;
return (size_t)(rhs == 0 ? lhs : lhs % rhs);
if (rhs == 0) return (size_t)lhs;
// Check for overflow: PTRDIFF_MIN / -1 would trap on x86 idiv
if (lhs == PTRDIFF_MIN && rhs == -1) return 0;
return (size_t)(lhs % rhs);
}
static inline size_t lean_isize_land(size_t a1, size_t a2) {
@@ -3175,6 +3179,12 @@ static inline lean_obj_res lean_manual_get_root(lean_obj_arg _unit) {
return lean_mk_string(LEAN_MANUAL_ROOT);
}
#ifdef LEAN_EMSCRIPTEN
#define LEAN_SCALAR_PTR_LITERAL(b1, b2, b3, b4, b5, b6, b7, b8) (lean_object*)((uint32_t)b1 | ((uint32_t)b2 << 8) | ((uint32_t)b3 << 16) | ((uint32_t)b4 << 24)), (lean_object*)((uint32_t)b5 | ((uint32_t)b6 << 8) | ((uint32_t)b7 << 16) | ((uint32_t)b8 << 24))
#else
#define LEAN_SCALAR_PTR_LITERAL(b1, b2, b3, b4, b5, b6, b7, b8) (lean_object*)((uint64_t)b1 | ((uint64_t)b2 << 8) | ((uint64_t)b3 << 16) | ((uint64_t)b4 << 24) | ((uint64_t)b5 << 32) | ((uint64_t)b6 << 40) | ((uint64_t)b7 << 48) | ((uint64_t)b8 << 56))
#endif
#ifdef __cplusplus
}
#endif

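The guards above implement the total-division semantics of Lean's fixed-width signed integers; a minimal sketch from the Lean side (outputs are expectations under those semantics, shown here for `Int8`):

```lean
-- Hedged sketch: division on Lean's fixed-width signed integers is total,
-- with `x / 0 = 0`, `x % 0 = x`, and the `minValue / -1` overflow case
-- wrapping instead of trapping in the underlying `idiv` instruction.
#eval ((-7 : Int8) / 0)     -- expected: 0
#eval ((-7 : Int8) % 0)     -- expected: -7
#eval (Int8.minValue / -1)  -- expected: -128 (wraps rather than trapping)
```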
View File

@@ -151,8 +151,10 @@ public class FamilyDef {α : Type u} {β : Type v} (f : α → β) (a : α) (b :
public class FamilyOut {α : Type u} {β : Type v} (f : α → β) (a : α) (b : outParam β) : Prop where
fam_eq : f a = b
-- Simplifies proofs involving open type families
attribute [simp] FamilyOut.fam_eq
-- Simplifies proofs involving open type families.
-- Scoped to avoid slowing down `simp` in downstream projects (the discrimination
-- tree key is `_`, so it would be attempted on every goal).
attribute [scoped simp] FamilyOut.fam_eq
public instance [FamilyDef f a b] : FamilyOut f a b where
fam_eq := FamilyDef.fam_eq

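A scoped attribute is activated by opening the declaring namespace; a minimal sketch (the namespace name is illustrative, not part of this diff):

```lean
-- Hedged sketch: a `scoped simp` attribute only joins the default simp set
-- once the declaring namespace is opened, so unrelated projects do not pay
-- for the `_`-keyed lemma on every `simp` call.
open SomeFamilyNamespace in
example [FamilyOut f a b] : f a = b := by
  simp  -- closed by the now-in-scope `FamilyOut.fam_eq`
```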
View File

@@ -9,6 +9,7 @@ Author: Leonardo de Moura
#include "runtime/exception.h"
#include "runtime/thread.h"
#include "runtime/sstream.h"
#include <lean/version.h>
namespace lean {
throwable::throwable(char const * msg):m_msg(msg) {}
@@ -18,7 +19,7 @@ throwable::~throwable() noexcept {}
char const * throwable::what() const noexcept { return m_msg.c_str(); }
stack_space_exception::stack_space_exception(char const * component_name):
m_msg((sstream() << "deep recursion was detected at '" << component_name << "' (potential solution: increase stack space in your system)").str()) {
m_msg((sstream() << "deep recursion was detected at '" << component_name << "' (potential solution: increase elaboration stack size using the `lean --tstack` flag). This flag can be set in the `weakLeanArgs` field of the Lake configuration. Further details are available in the Lean reference manual at " << LEAN_MANUAL_ROOT << "find/?domain=Verso.Genre.Manual.section&name=lake-config-toml").str()) {
}
memory_exception::memory_exception(char const * component_name):

Binary file not shown.

BIN
stage0/stdlib/Init/Coe.c generated

Some files were not shown because too many files have changed in this diff