Compare commits

..

39 Commits

Author SHA1 Message Date
Sebastian Ullrich
d53b46a0f3 chore: update stage0 2026-04-10 15:08:43 +02:00
Sebastian Ullrich
5a9d3bc991 chore: update stage0 2026-04-10 15:08:43 +02:00
Sebastian Ullrich
8678c99b76 fix: respect module visibility in initialize/builtin_initialize
Previously, `elabInitialize` only checked for explicit `private` when
deciding whether to mangle `fullId`, ignoring the `module` system's
default-private semantics. It also overrode the user's visibility for
the generated `initFn` via `visibility.ofBool`.

Now, `elabInitialize` uses `elabVisibility` + `isInferredPublic` to
correctly handle all visibility contexts. The generated `initFn`
inherits the user's visibility rather than being forced public.

Also factors out `elabVisibility` from `elabModifiers` for reuse.
2026-04-10 15:08:43 +02:00
Henrik Böving
bc2da2dc74 perf: assorted compiler annotations (#13357)
This PR is based on a systematic review of all read-only operations on
the default containers in core. Where sensible it applies specialize
annotations on higher order operations that lack them or borrow
annotations on parameters that should morally be borrowed (e.g. the
container when iterating over it).
2026-04-10 11:47:40 +00:00
Kyle Miller
e0a29f43d2 feat: adjust deriving Inhabited to use structure field defaults (#9815)
This PR changes the `Inhabited` deriving handler for `structure` types
to use default field values when present; this ensures that `{}` and
`default` are interchangeable when all fields have default values. The
handler effectively uses `by refine' {..} <;> exact default` to
construct the inhabitant. (Note: when default field values cannot be
resolved, they are ignored, as usual for ellipsis mode.)

Implementation note: the handler now constructs the `Expr` directly and
adds it to the environment, though the `instance` is still added using
`elabCommand`.
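As an illustration of the described behavior (the structure `P` is made up for this sketch, not taken from the PR):

```lean
structure P where
  x : Nat := 1
  y : Bool := true
  deriving Inhabited

-- All fields have defaults, so `default` and `{}` should now agree
-- (assuming the derived value is definitionally the ellipsis-mode instance):
example : (default : P).x = 1 := rfl
example : (default : P) = {} := rfl
```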

Closes #9463
2026-04-09 18:54:24 +00:00
Wojciech Różowski
a07649a4c6 feat: add warning for non-portable module names (#13318)
This PR adds a check for OS-forbidden names and characters in module
names. This implements the functionality of the `modulesOSForbidden`
linter from mathlib.
2026-04-09 16:16:51 +00:00
Sebastian Ullrich
031bfa5989 fix: handle flattened inheritance in wrapInstance (#13302)
Instead of unconditionally wrapping the values of fields that were copied
from flattened parent structures, try finding an existing instance and
projecting it first.
2026-04-09 15:53:21 +00:00
Lean stage0 autoupdater
1d1c5c6e30 chore: update stage0 2026-04-09 15:26:42 +00:00
Sebastian Ullrich
c0fbddbf6f chore: scale nat_repr to a more reasonable runtime (#13347) 2026-04-09 13:54:56 +00:00
Kyle Miller
c60f97a3fa feat: allow field notation to use explicit universe levels (#13262)
This PR extends Lean's syntax to allow explicit universe levels in
expressions such as `e.f.{u,v}`, `(f e).g.{u}`, and `e |>.f.{u,v} x y
z`. It fixes a bug where universe levels would be attributed to the
wrong expression; for example `x.f.{u}` would be interpreted as
`x.{u}.f`. It also changes the syntax of top-level declarations to not
allow space between the identifier and the universe level list, and it
fixes a bug in the `checkWsBefore` parser where it would not detect
whitespace across `optional` parsers.
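A small sketch of the new surface syntax (the definition `swap'` is invented for illustration):

```lean
def Prod.swap'.{u, v} {α : Type u} {β : Type v} (p : α × β) : β × α :=
  (p.2, p.1)

-- Field notation may now carry an explicit universe level list,
-- attributed to `swap'` rather than to `p`:
example (p : Nat × Bool) : Bool × Nat := p.swap'.{0, 0}
```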

Closes #8743
2026-04-09 13:29:10 +00:00
Mac Malone
82bb27fd7d fix: lake: report bad imports from a library build (#13340)
This PR fixes a Lake issue where library builds would not produce
informative errors about bad imports (unlike module builds).
2026-04-09 04:03:52 +00:00
Mac Malone
ab0ec9ef95 chore: use weakLeanArgs for Lake plugin (#13335)
This PR changes the Lake core build to use `weakLeanArgs` for the Lake
plugin.

This fixes a trace mismatch between local builds and the CI caused by
the differing paths.
2026-04-09 00:02:39 +00:00
Sebastian Graf
f9b2f6b597 fix: use getDecLevel/isLevelDefEq for for loop mut var universe constraints (#13332)
This PR fixes universe unification for `for` loops with `mut` variables
whose types span multiple implicit universes. The old approach used
`ensureHasType (mkSort mi.u.succ)` per variable, which generated
constraints like `max (?u+1) (?v+1) =?= ?u+1` that the universe solver
cannot decompose. The new approach uses `getDecLevel`/`isLevelDefEq` on
the decremented level, producing `max ?u ?v =?= ?u` which `solveSelfMax`
handles directly.
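The surface shape of the affected code is roughly the following (illustrative only; the actual failures arise when the universes are metavariables inferred during elaboration):

```lean
-- two `mut` variables whose types live in distinct universes `u` and `v`
def collect {α : Type u} {β : Type v} (xs : List α) (b : β) : List α × β := Id.run do
  let mut acc : List α := []
  let mut last : β := b
  for x in xs do
    acc := x :: acc
  return (acc, last)
```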

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-04-08 16:34:51 +00:00
Sebastian Ullrich
a3cc301de5 fix: wrapInstance should not reduce non-constructor instances (#13327)
Otherwise this can break `Decidable` instances.
2026-04-08 16:31:28 +00:00
Lean stage0 autoupdater
3a8db01ce8 chore: update stage0 2026-04-08 15:28:03 +00:00
Joachim Breitner
06fb4bec52 feat: require indentation in commands, allow empty tactic sequences (#13229)
This PR wraps the top-level command parser with `withPosition` to
enforce indentation in `by` blocks, combined with an empty-by fallback
for better error messages.

This subsumes #3215 (which introduced `withPosition commandParser` but
without the empty-by fallback). It is also related to #9524, which
explores elaboration with empty tactic sequences — this PR reuses that
idea for the empty-by fallback, so that a `by` not followed by an
indented tactic produces an elaboration error (unsolved goals) rather
than a parse error.

**Changes:**
- `topLevelCommandParserFn` now uses `(withPosition commandParser).fn`,
setting the saved position at the start of each top-level command
- `tacticSeqIndentGt` gains an empty tactic sequence fallback
(`pushNone`) so that missing indentation produces an elaboration error
(unsolved goals) instead of a parse error
- `isEmptyBy` in `goalsAt?` removed: with strict `by` indentation, empty
`by` blocks parse successfully via `pushNone` (producing empty nodes)
rather than producing `.missing` syntax, making the `isEmptyBy` check
dead code. The `isEmpty` helper in `isSyntheticTacticCompletion`
continues to work correctly because it handles both `.missing` and empty
nodes from `pushNone` (via the vacuously-true `args.all isEmpty` on
`#[]`)
- Test files updated to indent `by` blocks and expression continuations
that were previously at column 0

**Behavior:**
- Top-level `by` blocks now require indentation (column > 0 for commands
at column 0)
- Commands indented inside `section` require proofs to be indented past
the command's column
- `#guard_msgs in example : True := by` works because tactic indentation
is checked against the outermost command's column
- Expression continuations (not just `by`) must also be indented past
the command, which is slightly more strict but more consistent
- `have : True := by` followed by a dedent now correctly puts `this` in
scope in the outer tactic block (the `have` is structurally complete
with an unsolved-goal error, rather than a parse error)

**Code changes observed in practice (lean4 test suite + Mathlib):**

- `by` blocks: top-level `theorem ... := by` / `decreasing_by` followed
by tactics at column 0 must be indented
- `variable` continuations: `variable {A : Type*} [Foo A]\n{B : Type*}`
where the second line starts at column 0 must be indented (most common
category in Mathlib)
- Expression continuations: `def f : T :=\nexpr` or `#synth Foo\n[args]`
where the body/arguments start at column 0
- Structure literals: `.symm\n{ toFun := ...` where the struct literal
starts at column 0
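Concretely, for a command starting at column 0 (a sketch of the new behavior):

```lean
-- Now rejected: the tactic sits at the command's column, so the `by` block
-- is parsed as empty and elaboration reports unsolved goals
theorem bad : True := by
trivial

-- OK: the tactic block is indented past the command
theorem good : True := by
  trivial
```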

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-04-08 14:05:47 +00:00
Sebastian Ullrich
35b4c7dbfc feat: implicit public meta import Init in non-prelude files (#13323)
Ensure metaprograms have implicit access to `Init` like everyone else.
Closes #13310.
2026-04-08 11:46:46 +00:00
Joachim Breitner
2398d2cc66 feat: no [defeq] attribute on sizeOf_spec lemmas (#13320)
This PR changes the auto-generated `sizeOf` definitions to no longer be
exposed and the `sizeOf_spec` theorems to no longer be marked `[defeq]`.
2026-04-08 11:10:50 +00:00
Kim Morrison
8353964e55 feat: wire PowIdentity into grind ring solver (#13088)
This PR wires the `PowIdentity` typeclass (from
https://github.com/leanprover/lean4/pull/13086) into the `grind` ring
solver's Groebner basis engine.

When a ring has a `PowIdentity α p` instance, the solver pushes `x ^ p =
x` as a new fact for each variable `x`, which becomes `x^p - x = 0` in
the Groebner basis. Since `p` is an `outParam`, instance discovery is
decoupled from `IsCharP` — the solver synthesizes `PowIdentity α ?p`
with a fresh metavar and lets instance search find both the instance and
the exponent.

This correctly handles non-prime finite fields: for `F_4` (char 2, 4
elements), Mathlib would provide `PowIdentity F_4 4` and the solver
would discover `p = 4`, not `p = 2`.

Note: the original motivating example `(x + y)^2 = x^128 + y^2` from
https://github.com/leanprover/lean4/issues/12842 does not yet work
because the `ToInt` module lifts `Fin 2` expressions to integers and
expands `x^128` via the binomial theorem before the ring solver can
reduce it. Addressing that is a separate deeper change.

🤖 Prepared with Claude Code

---------

Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-08 10:14:10 +00:00
Kim Morrison
334d9bd4f3 feat: add markMeta parameter to addAndCompile (#13311)
This PR adds an optional `markMeta : Bool := false` parameter to
`addAndCompile`, so that callers can propagate the `meta` marking
without manually splitting into `addDecl` + `markMeta` + `compileDecl`.

Also updates `ParserCompiler` to use the new parameter.

🤖 Prepared with Claude Code

---------

Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-08 09:09:01 +00:00
Kim Morrison
f7f5fc5ecd feat: add PowIdentity typeclass for grind ring solver (#13086)
This PR adds a `Lean.Grind.PowIdentity` typeclass stating that `x ^ p =
x` for all elements of a commutative semiring, with `p` as an
`outParam`.

The primary source of instances is Fermat's little theorem: for a finite
field with `q` elements, `x ^ q = x`. Since `p` is an `outParam`,
instance synthesis discovers the correct exponent automatically — the
solver does not need to know the characteristic or cardinality in
advance.

A concrete instance for `Fin 2` is provided. Mathlib can provide
instances for general finite fields via `FiniteField.pow_card`.

🤖 Prepared with Claude Code

Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-08 09:05:12 +00:00
Joachim Breitner
659db85510 fix: suggest (rfl) not id rfl in linter (#13319)
This PR amends #13317 to suggest `:= (rfl)` as the recommended way to
avoid a theorem being automatically marked `[defeq]`, for consistency
with existing documentation. Rationale: the special treatment of `:=
rfl` is based on syntax, not the proof term, so it’s appropriate to use
different syntax. And also I like the way it reads like a “muted whisper
of `rfl`”.
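That is (sketch):

```lean
-- automatically marked `[defeq]`: the proof is syntactically `rfl`
theorem one_add_one : 1 + 1 = 2 := rfl

-- not marked: the parentheses change the surface syntax, not the proof term
theorem one_add_one' : 1 + 1 = 2 := (rfl)
```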
2026-04-08 08:21:23 +00:00
Wojciech Różowski
91dd99165a feat: add warning when applying global attribute using in (#13223)
This PR adds a warning preventing a user from applying a global attribute
using `... in ...`, e.g.
```lean4
theorem a : True := trivial
attribute [simp] a in
def b : True := a
```
2026-04-08 06:20:34 +00:00
Lean stage0 autoupdater
e44351add9 chore: update stage0 2026-04-08 05:43:47 +00:00
Leonardo de Moura
fd2723d9c0 feat: add linter for rfl simp theorems at restricted transparency (#13317)
This PR adds an opt-in linter (`set_option simp.rfl.checkTransparency
true`) that warns when a `rfl` simp theorem's LHS and RHS are not
definitionally equal at `.instances` transparency. Bad rfl-simp theorems
— those that only hold at higher transparency — create problems
throughout the system because `simp` and `dsimp` operate at restricted
transparency. The linter suggests two fixes: use `id rfl` as the proof
(to remove the `rfl` status), or mark relevant constants as
`[implicit_reducible]`.

This is part of a broader effort to ensure `isDefEq` respects
transparency levels. The linter helps systematically identify
problematic rfl-simp theorems so they can be fixed incrementally.

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-04-08 04:49:07 +00:00
Kim Morrison
ad2105dc94 feat: add List.prod, Array.prod, and Vector.prod (#13200)
This PR adds `prod` (multiplicative fold) for `List`, `Array`, and
`Vector`, mirroring the existing `sum` API. Includes basic simp lemmas
(`prod_nil`, `prod_cons`, `prod_append`, `prod_singleton`,
`prod_reverse`, `prod_push`, `prod_eq_foldl`), Nat-specialized lemmas
(`prod_pos_iff_forall_pos_nat`, `prod_eq_zero_iff_exists_zero_nat`,
`prod_replicate_nat`), Int-specialized lemmas (`prod_replicate_int`),
cross-type lemmas (`prod_toArray`, `prod_toList`), and `Perm.prod_nat`
with grind patterns.

The min/max pigeonhole-style bounds from the `sum` Nat/Int files are
omitted as they don't have natural multiplicative analogues.
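For example, with the new definitions:

```lean
example : [1, 2, 5].prod = 10 := rfl
example : #[1, 2, 5].prod = 10 := rfl
example : (#[] : Array Nat).prod = 1 := rfl
-- a right fold with unit 1, mirroring `sum`:
example : #[2, 3].prod = 2 * (3 * 1) := rfl
```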

🤖 Prepared with Claude Code

Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-08 04:01:49 +00:00
Mac Malone
235aedfaf7 chore: use --wfail for core CI build (#13294)
This PR introduces the `WFAIL` CMake setting that uses `--wfail` for
Lake builds of core.

This also, by extension, fixes a mismatch in the trace of core CI builds
vs local core builds.
2026-04-08 02:17:39 +00:00
Kim Morrison
30dca7b545 fix: make delta-derived Prop-valued instances theorems (#13304)
This PR makes the delta-deriving handler create `theorem` declarations
instead of `def` declarations when the instance type is a `Prop`.
Previously, `deriving instance Nonempty for Foo` would always create a
`def`, which is inconsistent with the behavior of a handwritten
`instance` declaration.

For example, given:
```lean
def Foo (α : Type u) := List α
deriving instance Nonempty for Foo
```

Before: `@[implicit_reducible] def instNonemptyFoo ...`
After: `@[implicit_reducible] theorem instNonemptyFoo ...`

The implementation checks `isProp result.type` after constructing the
instance closure, and uses `mkThmOrUnsafeDef` for the Prop case (which
also handles the unsafe fallback correctly). The noncomputable check is
skipped for Prop-typed instances since theorems can freely reference
noncomputable constants.

Closes #13295

🤖 Prepared with Claude Code

---------

Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-08 01:19:48 +00:00
Kim Morrison
7e04970c58 fix: skip nightly-testing merge when branch does not exist (#13308)
This PR makes `release_steps.py` robust to release repos that have no
`nightly-testing` branch. Previously, `git merge origin/nightly-testing`
would fail with "not something we can merge" and the error handler
misinterpreted this as a merge conflict, then crashed trying to commit
with nothing to commit. Now we check for the branch with
`git ls-remote --heads` before attempting the merge.

🤖 Prepared with Claude Code

---------

Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-08 00:31:22 +00:00
Kim Morrison
0a6ee838df fix: update lean-toolchain in verso test-projects during release (#13309)
This PR updates `release_steps.py` to sync all `lean-toolchain` files in
verso's `test-projects/` subdirectories to match the root toolchain
during release bumps. The verso CI "SubVerso version consistency" check
requires these to match, and the script was only syncing
`lake-manifest.json` sub-manifests but not toolchain files.

🤖 Prepared with Claude Code

Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-08 00:29:38 +00:00
Sebastian Graf
ec72785927 chore: git ignore .claude/worktrees (#13299)
This PR ignores `.claude/worktrees` after I have had one instance too
many where I nuked all my worktrees due to a brain-dead `git clean
-fxd`.
2026-04-07 09:51:39 +00:00
Keith Seim
ba33c3daa4 fix: match stub signature of lean_uv_dns_get_info to real implementation (#13234)
This PR fixes a build issue when Lean is not linked against libuv.

## Problem

In `src/runtime/uv/dns.cpp`, the non-libuv stub of
`lean_uv_dns_get_info` (in the `#else` branch, compiled when building
without libuv) has a **4-parameter** signature:

```cpp
lean_uv_dns_get_info(b_obj_arg name, b_obj_arg service, uint8_t family, int8_t protocol)
```

But the real implementation above the `#else` has only **3 parameters**:

```cpp
lean_uv_dns_get_info(b_obj_arg name, b_obj_arg service, uint8_t family)
```

The Lean `@[extern]` declaration also expects 3 parameters. The stub has
an extra `int8_t protocol` parameter that the real function and the Lean
FFI caller do not use.

## Fix

Remove the extra `protocol` parameter from the stub so both branches
have the same signature.

## Evidence

Discovered while building Lean4 to WASM via Emscripten for a production
project ([specify-lean](https://github.com/kjsdesigns/specify)) since
v4.27.0. The stub branch is compiled in this configuration, and the
signature mismatch was caught at link time. The fix has been stable in
production across multiple Lean version bumps.

Related: [Zulip thread on WASM build
fixes](https://leanprover.zulipchat.com/#narrow/channel/270676-lean4/topic/WASM.20build.20fixes.3A.20libuv.20symbol.20leakage.20.28.236817.29.20and.20unique_lo/with/580836892)
(2026-03-21).

Co-authored-by: Keith Seim <keith@MacBook-Pro.local>
2026-04-07 09:39:53 +00:00
Keith Seim
db1e2ac34c fix: add missing release() and adopt_lock_t to single-threaded unique_lock stub (#13233)
This PR fixes runtime build issues when `LEAN_MULTI_THREAD` is not set.

## Problem

When building with `LEAN_MULTI_THREAD` undefined (required for
Emscripten/WASM targets), the stub `unique_lock<T>` in
`src/runtime/thread.h` is missing two members that the real
`std::unique_lock` provides:

1. **`release()`** — called by runtime code paths, causes a compile
error when the stub is active
2. **`unique_lock(T const &, std::adopt_lock_t)`** — required by code
that acquires a lock before constructing the `unique_lock`

The other stubs in this file (`mutex`, `lock_guard`,
`condition_variable`) are complete; only `unique_lock` is missing API
surface.

## Fix

Add the two missing members to the single-threaded `unique_lock` stub:

```cpp
unique_lock(T const &, std::adopt_lock_t) {}
T * release() { return nullptr; }
```

Both are no-ops, matching the semantics of a single-threaded
environment. `release()` returns `nullptr` (no mutex to release). The
`adopt_lock_t` constructor is a no-op (no lock to adopt).

## Evidence

I've been using this fix in a production project
([specify-lean](https://github.com/kjsdesigns/specify)) since v4.27.0 to
build the Lean4 runtime to WASM via Emscripten. The fix has been stable
across multiple Lean version bumps.

I posted about this on
[Zulip](https://leanprover.zulipchat.com/#narrow/channel/270676-lean4/topic/WASM.20build.20fixes.3A.20libuv.20symbol.20leakage.20.28.236817.29.20and.20unique_lo/with/580836892)
on 2026-03-21.

Co-authored-by: Keith Seim <keith@MacBook-Pro.local>
2026-04-07 09:30:13 +00:00
Jason Yuen
cb06946972 chore: fix typo in annotatedBorrows (#13276)
Fixes a public-facing typo from #13274.
2026-04-07 09:17:26 +00:00
Sebastian Ullrich
4f6bcc5ada chore: avoid segfault in stage2 (#13296)
Lake.Load must not define metaprograms under the current `--plugin`
setup
2026-04-07 09:08:04 +00:00
Jason Yuen
0650cbe0fa chore: fix typo in isInvalidContinuationByte (#13275)
Fixes a public-facing typo from #13274.
2026-04-07 09:01:19 +00:00
Kim Morrison
8bb07f336d chore: add leansqlite to release repos (#13293)
This PR adds `leansqlite` to `release_repos.yml` and lists it as a
dependency of `doc-gen4`, which requires it.

🤖 Prepared with Claude Code

Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-07 08:54:42 +00:00
Mac Malone
c16e88644c feat: Runtime.hold (#13270)
This PR adds `Runtime.hold`, which ensures its argument remains alive
until the callsite by holding a reference to it. This can be useful for
unsafe code (such as an FFI) that relies on a Lean object not being
freed until after some point in the program.

It is implemented via a `static inline` function in `lean.h` so that C
will optimize away the call.

---------

Co-authored-by: Henrik Böving <hargonix@gmail.com>
2026-04-07 08:44:35 +00:00
Mac Malone
96d502bd11 refactor: introduce LakefileConfig & well-formed workspaces (#13282)
This PR introduces `LakefileConfig`, which can be constructed from a
Lake configuration file without all the information required to
construct a full `Package`. Also, workspaces now have a well-formedness
property attached which ensures the workspace indices of its packages
match their index in the workspace. Finally, the facet configuration map
now has its own type: `FacetConfigMap`.

Created by splitting off the major self-contained (but overlapping)
refactors from #11662.
2026-04-06 23:34:48 +00:00
900 changed files with 2429 additions and 897 deletions


@@ -66,6 +66,8 @@ To rebuild individual stage 2 modules without a full `make stage2`, use Lake dir
cd build/release/stage2 && lake build Init.Prelude
```
To run tests in stage2, replace `-C build/release` from above with `-C build/release/stage2`.
## New features
When asked to implement new features:


@@ -131,7 +131,7 @@ jobs:
[ -d build ] || mkdir build
cd build
# arguments passed to `cmake`
OPTIONS=(-DLEAN_EXTRA_MAKE_OPTS=-DwarningAsError=true)
OPTIONS=(-DWFAIL=ON)
if [[ -n '${{ matrix.release }}' ]]; then
# this also enables githash embedding into stage 1 library, which prohibits reusing
# `.olean`s across commits, so we don't do it in the fast non-release CI


@@ -77,7 +77,7 @@ jobs:
# sync options with `Linux Lake` to ensure cache reuse
run: |
mkdir -p build
cmake --preset release -B build -DLEAN_EXTRA_MAKE_OPTS=-DwarningAsError=true
cmake --preset release -B build -DWFAIL=ON
shell: 'nix develop -c bash -euxo pipefail {0}'
- if: env.should_update_stage0 == 'yes'
run: |

.gitignore

@@ -34,3 +34,4 @@ wdErr.txt
wdIn.txt
wdOut.txt
downstream_releases/
.claude/worktrees/


@@ -28,6 +28,14 @@ repositories:
branch: main
dependencies: []
- name: leansqlite
url: https://github.com/leanprover/leansqlite
toolchain-tag: true
stable-branch: false
branch: main
dependencies:
- plausible
- name: verso
url: https://github.com/leanprover/verso
toolchain-tag: true
@@ -100,7 +108,7 @@ repositories:
toolchain-tag: true
stable-branch: false
branch: main
dependencies: [lean4-cli, BibtexQuery, mathlib4]
dependencies: [lean4-cli, BibtexQuery, mathlib4, leansqlite]
- name: cslib
url: https://github.com/leanprover/cslib


@@ -481,11 +481,9 @@ def execute_release_steps(repo, version, config):
run_command("lake update", cwd=repo_path, stream_output=True)
elif repo_name == "verso":
# verso has nested Lake projects in test-projects/ that each have their own
# lake-manifest.json with a subverso pin. After updating the root manifest via
# `lake update`, sync the de-modulized subverso rev into all sub-manifests.
# The sub-projects use an old toolchain (v4.21.0) that doesn't support module/prelude
# syntax, so they need the de-modulized version (tagged no-modules/<root-rev>).
# The "SubVerso version consistency" CI check accepts either the root or de-modulized rev.
# lake-manifest.json with a subverso pin and their own lean-toolchain.
# After updating the root manifest via `lake update`, sync the de-modulized
# subverso rev into all sub-manifests, and update their lean-toolchain files.
run_command("lake update", cwd=repo_path, stream_output=True)
print(blue("Syncing de-modulized subverso rev to test-project sub-manifests..."))
sync_script = (
@@ -498,6 +496,15 @@ def execute_release_steps(repo, version, config):
)
run_command(sync_script, cwd=repo_path)
print(green("Synced de-modulized subverso rev to all test-project sub-manifests"))
# Update all lean-toolchain files in test-projects/ to match the root
print(blue("Updating lean-toolchain files in test-projects/..."))
find_result = run_command("find test-projects -name lean-toolchain", cwd=repo_path)
for tc_path in find_result.stdout.strip().splitlines():
if tc_path:
tc_file = repo_path / tc_path
with open(tc_file, "w") as f:
f.write(f"leanprover/lean4:{version}\n")
print(green(f" Updated {tc_path}"))
elif dependencies:
run_command(f'perl -pi -e \'s/"v4\\.[0-9]+(\\.[0-9]+)?(-rc[0-9]+)?"/"' + version + '"/g\' lakefile.*', cwd=repo_path)
run_command("lake update", cwd=repo_path, stream_output=True)
@@ -659,56 +666,61 @@ def execute_release_steps(repo, version, config):
# Fetch latest changes to ensure we have the most up-to-date nightly-testing branch
print(blue("Fetching latest changes from origin..."))
run_command("git fetch origin", cwd=repo_path)
try:
print(blue("Merging origin/nightly-testing..."))
run_command("git merge origin/nightly-testing", cwd=repo_path)
print(green("Merge completed successfully"))
except subprocess.CalledProcessError:
# Merge failed due to conflicts - check which files are conflicted
print(blue("Merge conflicts detected, checking which files are affected..."))
# Get conflicted files using git status
status_result = run_command("git status --porcelain", cwd=repo_path)
conflicted_files = []
for line in status_result.stdout.splitlines():
if len(line) >= 2 and line[:2] in ['UU', 'AA', 'DD', 'AU', 'UA', 'DU', 'UD']:
# Extract filename (skip the first 3 characters which are status codes)
conflicted_files.append(line[3:])
# Filter out allowed files
allowed_patterns = ['lean-toolchain', 'lake-manifest.json']
problematic_files = []
for file in conflicted_files:
is_allowed = any(pattern in file for pattern in allowed_patterns)
if not is_allowed:
problematic_files.append(file)
if problematic_files:
# There are conflicts in non-allowed files - fail
print(red("❌ Merge failed!"))
print(red(f"Merging nightly-testing resulted in conflicts in:"))
for file in problematic_files:
print(red(f" - {file}"))
print(red("Please resolve these conflicts manually."))
return
else:
# Only allowed files are conflicted - resolve them automatically
print(green(f"✅ Only allowed files conflicted: {', '.join(conflicted_files)}"))
print(blue("Resolving conflicts automatically..."))
# For lean-toolchain and lake-manifest.json, keep our versions
# Check if nightly-testing branch exists on origin (use local ref after fetch for exact match)
nightly_check = run_command("git show-ref --verify --quiet refs/remotes/origin/nightly-testing", cwd=repo_path, check=False)
if nightly_check.returncode != 0:
print(yellow("No nightly-testing branch found on origin, skipping merge"))
else:
try:
print(blue("Merging origin/nightly-testing..."))
run_command("git merge origin/nightly-testing", cwd=repo_path)
print(green("Merge completed successfully"))
except subprocess.CalledProcessError:
# Merge failed due to conflicts - check which files are conflicted
print(blue("Merge conflicts detected, checking which files are affected..."))
# Get conflicted files using git status
status_result = run_command("git status --porcelain", cwd=repo_path)
conflicted_files = []
for line in status_result.stdout.splitlines():
if len(line) >= 2 and line[:2] in ['UU', 'AA', 'DD', 'AU', 'UA', 'DU', 'UD']:
# Extract filename (skip the first 3 characters which are status codes)
conflicted_files.append(line[3:])
# Filter out allowed files
allowed_patterns = ['lean-toolchain', 'lake-manifest.json']
problematic_files = []
for file in conflicted_files:
print(blue(f"Keeping our version of {file}"))
run_command(f"git checkout --ours {file}", cwd=repo_path)
# Complete the merge
run_command("git add .", cwd=repo_path)
run_command("git commit --no-edit", cwd=repo_path)
print(green("Merge completed successfully with automatic conflict resolution"))
is_allowed = any(pattern in file for pattern in allowed_patterns)
if not is_allowed:
problematic_files.append(file)
if problematic_files:
# There are conflicts in non-allowed files - fail
print(red("❌ Merge failed!"))
print(red(f"Merging nightly-testing resulted in conflicts in:"))
for file in problematic_files:
print(red(f" - {file}"))
print(red("Please resolve these conflicts manually."))
return
else:
# Only allowed files are conflicted - resolve them automatically
print(green(f"✅ Only allowed files conflicted: {', '.join(conflicted_files)}"))
print(blue("Resolving conflicts automatically..."))
# For lean-toolchain and lake-manifest.json, keep our versions
for file in conflicted_files:
print(blue(f"Keeping our version of {file}"))
run_command(f"git checkout --ours {file}", cwd=repo_path)
# Complete the merge
run_command("git add .", cwd=repo_path)
run_command("git commit --no-edit", cwd=repo_path)
print(green("✅ Merge completed successfully with automatic conflict resolution"))
# Build and test (skip for Mathlib)
if repo_name not in ["mathlib4"]:


@@ -116,11 +116,19 @@ option(CHECK_OLEAN_VERSION "Only load .olean files compiled with the current ver
option(USE_LAKE "Use Lake instead of lean.mk for building core libs from language server" ON)
option(USE_LAKE_CACHE "Use the Lake artifact cache for stage 1 builds (requires USE_LAKE)" OFF)
set(LEAN_EXTRA_MAKE_OPTS "" CACHE STRING "extra options to lean --make")
set(LEAN_EXTRA_OPTS "" CACHE STRING "extra options to lean (via lake or make)")
set(LEAN_EXTRA_MAKE_OPTS "" CACHE STRING "extra options to leanmake")
set(LEANC_CC ${CMAKE_C_COMPILER} CACHE STRING "C compiler to use in `leanc`")
# Temporary, core-only flags. Must be synced with stdlib_flags.h.
string(APPEND LEAN_EXTRA_MAKE_OPTS " -Dbackward.do.legacy=false")
string(APPEND LEAN_EXTRA_OPTS " -Dbackward.do.legacy=false")
# option used by the CI to fail on warnings
option(WFAIL "Fail build if warnings are emitted by Lean" ON)
if(WFAIL MATCHES "ON")
string(APPEND LAKE_EXTRA_ARGS " --wfail")
string(APPEND LEAN_EXTRA_MAKE_OPTS " -DwarningAsError=true")
endif()
if(LAZY_RC MATCHES "ON")
set(LEAN_LAZY_RC "#define LEAN_LAZY_RC")
@@ -198,7 +206,7 @@ set(CMAKE_ARCHIVE_OUTPUT_DIRECTORY "${CMAKE_BINARY_DIR}/lib/lean")
# OSX default thread stack size is very small. Moreover, in Debug mode, each new stack frame consumes a lot of extra memory.
if((MULTI_THREAD MATCHES "ON") AND (CMAKE_SYSTEM_NAME MATCHES "Darwin"))
string(APPEND LEAN_EXTRA_MAKE_OPTS " -s40000")
string(APPEND LEAN_EXTRA_OPTS " -s40000")
endif()
# We want explicit stack probes in huge Lean stack frames for robust stack overflow detection
@@ -670,6 +678,9 @@ else()
set(LEAN_PATH_SEPARATOR ":")
endif()
# inherit general options for lean.mk.in and stdlib.make.in
string(APPEND LEAN_EXTRA_MAKE_OPTS " ${LEAN_EXTRA_OPTS}")
# Version
configure_file("${LEAN_SOURCE_DIR}/version.h.in" "${LEAN_BINARY_DIR}/include/lean/version.h")
if(STAGE EQUAL 0)
@@ -1054,7 +1065,7 @@ string(REPLACE "ROOT" "${CMAKE_BINARY_DIR}" LEANC_CC "${LEANC_CC}")
string(REPLACE "ROOT" "${CMAKE_BINARY_DIR}" LEANC_INTERNAL_FLAGS "${LEANC_INTERNAL_FLAGS}")
string(REPLACE "ROOT" "${CMAKE_BINARY_DIR}" LEANC_INTERNAL_LINKER_FLAGS "${LEANC_INTERNAL_LINKER_FLAGS}")
toml_escape("${LEAN_EXTRA_MAKE_OPTS}" LEAN_EXTRA_OPTS_TOML)
toml_escape("${LEAN_EXTRA_OPTS}" LEAN_EXTRA_OPTS_TOML)
if(CMAKE_BUILD_TYPE MATCHES "Debug|Release|RelWithDebInfo|MinSizeRel")
set(CMAKE_BUILD_TYPE_TOML "${CMAKE_BUILD_TYPE}")


@@ -802,6 +802,7 @@ Examples:
def firstM {α : Type u} {m : Type v → Type w} [Alternative m] (f : α → m β) (as : Array α) : m β :=
go 0
where
@[specialize]
go (i : Nat) : m β :=
if hlt : i < as.size then
f as[i] <|> go (i+1)
@@ -1085,6 +1086,17 @@ Examples:
def sum {α} [Add α] [Zero α] : Array α → α :=
foldr (· + ·) 0
/--
Computes the product of the elements of an array.
Examples:
* `#[a, b, c].prod = a * (b * (c * 1))`
* `#[1, 2, 5].prod = 10`
-/
@[inline, expose]
def prod {α} [Mul α] [One α] : Array α → α :=
foldr (· * ·) 1
/--
Counts the number of elements in the array `as` that satisfy the Boolean predicate `p`.
@@ -1253,7 +1265,7 @@ Examples:
-/
@[inline, expose]
def findIdx? {α : Type u} (p : α → Bool) (as : Array α) : Option Nat :=
let rec loop (j : Nat) :=
let rec @[specialize] loop (j : Nat) :=
if h : j < as.size then
if p as[j] then some j else loop (j + 1)
else none
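The `@[specialize]` annotation on `loop` does not change behavior; `findIdx?` still returns the first index whose element satisfies the predicate, for example:

```lean
#eval #[7, 6, 5, 8].findIdx? (· < 6)   -- some 2
#eval #[7, 6, 5, 8].findIdx? (· > 9)   -- none
```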


@@ -7,6 +7,7 @@ module
prelude
public import Init.Data.List.Int.Sum
public import Init.Data.List.Int.Prod
public import Init.Data.Array.MinMax
import Init.Data.Int.Lemmas
@@ -74,4 +75,17 @@ theorem sum_div_length_le_max_of_max?_eq_some_int {xs : Array Int} (h : xs.max?
simpa [List.max?_toArray, List.sum_toArray] using
List.sum_div_length_le_max_of_max?_eq_some_int (by simpa using h)
@[simp] theorem prod_replicate_int {n : Nat} {a : Int} : (replicate n a).prod = a ^ n := by
rw [← List.toArray_replicate, List.prod_toArray]
simp
theorem prod_append_int {as₁ as₂ : Array Int} : (as₁ ++ as₂).prod = as₁.prod * as₂.prod := by
simp [prod_append]
theorem prod_reverse_int (xs : Array Int) : xs.reverse.prod = xs.prod := by
simp [prod_reverse]
theorem prod_eq_foldl_int {xs : Array Int} : xs.prod = xs.foldl (init := 1) (· * ·) := by
simp only [foldl_eq_foldr_reverse, Int.mul_comm, prod_eq_foldr, prod_reverse_int]
end Array


@@ -4380,6 +4380,47 @@ theorem sum_eq_foldl [Zero α] [Add α] [Std.Associative (α := α) (· + ·)]
xs.sum = xs.foldl (init := 0) (· + ·) := by
simp [← sum_toList, List.sum_eq_foldl]
/-! ### prod -/
@[simp, grind =] theorem prod_empty [Mul α] [One α] : (#[] : Array α).prod = 1 := rfl
theorem prod_eq_foldr [Mul α] [One α] {xs : Array α} :
xs.prod = xs.foldr (init := 1) (· * ·) :=
rfl
@[simp, grind =]
theorem prod_toList [Mul α] [One α] {as : Array α} : as.toList.prod = as.prod := by
cases as
simp [Array.prod, List.prod]
@[simp, grind =]
theorem prod_append [One α] [Mul α] [Std.Associative (α := α) (· * ·)]
[Std.LawfulLeftIdentity (α := α) (· * ·) 1]
{as₁ as₂ : Array α} : (as₁ ++ as₂).prod = as₁.prod * as₂.prod := by
simp [← prod_toList, List.prod_append]
@[simp, grind =]
theorem prod_singleton [Mul α] [One α] [Std.LawfulRightIdentity (· * ·) (1 : α)] {x : α} :
#[x].prod = x := by
simp [Array.prod_eq_foldr, Std.LawfulRightIdentity.right_id x]
@[simp, grind =]
theorem prod_push [Mul α] [One α] [Std.Associative (α := α) (· * ·)]
[Std.LawfulIdentity (· * ·) (1 : α)] {xs : Array α} {x : α} :
(xs.push x).prod = xs.prod * x := by
simp [Array.prod_eq_foldr, Std.LawfulRightIdentity.right_id, Std.LawfulLeftIdentity.left_id,
Array.foldr_assoc]
@[simp, grind =]
theorem prod_reverse [One α] [Mul α] [Std.Associative (α := α) (· * ·)]
[Std.Commutative (α := α) (· * ·)]
[Std.LawfulLeftIdentity (α := α) (· * ·) 1] (xs : Array α) : xs.reverse.prod = xs.prod := by
simp [← prod_toList, List.prod_reverse]
theorem prod_eq_foldl [One α] [Mul α] [Std.Associative (α := α) (· * ·)]
[Std.LawfulIdentity (· * ·) (1 : α)] {xs : Array α} :
xs.prod = xs.foldl (init := 1) (· * ·) := by
simp [← prod_toList, List.prod_eq_foldl]
theorem foldl_toList_eq_flatMap {l : List α} {acc : Array β}
{F : Array β → α → Array β} {G : α → List β}
(H : ∀ acc a, (F acc a).toList = acc.toList ++ G a) :


@@ -8,6 +8,7 @@ module
prelude
public import Init.Data.Array.MinMax
import Init.Data.List.Nat.Sum
import Init.Data.List.Nat.Prod
import Init.Data.Array.Lemmas
public section
@@ -81,4 +82,24 @@ theorem sum_div_length_le_max_of_max?_eq_some_nat {xs : Array Nat} (h : xs.max?
simpa [List.max?_toArray, List.sum_toArray] using
List.sum_div_length_le_max_of_max?_eq_some_nat (by simpa using h)
protected theorem prod_pos_iff_forall_pos_nat {xs : Array Nat} : 0 < xs.prod ↔ ∀ x ∈ xs, 0 < x := by
simp [← prod_toList, List.prod_pos_iff_forall_pos_nat]
protected theorem prod_eq_zero_iff_exists_zero_nat {xs : Array Nat} :
xs.prod = 0 ↔ ∃ x ∈ xs, x = 0 := by
simp [← prod_toList, List.prod_eq_zero_iff_exists_zero_nat]
@[simp] theorem prod_replicate_nat {n : Nat} {a : Nat} : (replicate n a).prod = a ^ n := by
rw [← List.toArray_replicate, List.prod_toArray]
simp
theorem prod_append_nat {as₁ as₂ : Array Nat} : (as₁ ++ as₂).prod = as₁.prod * as₂.prod := by
simp [prod_append]
theorem prod_reverse_nat (xs : Array Nat) : xs.reverse.prod = xs.prod := by
simp [prod_reverse]
theorem prod_eq_foldl_nat {xs : Array Nat} : xs.prod = xs.foldl (init := 1) (· * ·) := by
simp only [foldl_eq_foldr_reverse, Nat.mul_comm, prod_eq_foldr, prod_reverse_nat]
end Array


@@ -282,6 +282,7 @@ The lexicographic order with respect to `lt` is:
* `as.lex [] = false` is `false`
* `(a :: as).lex (b :: bs)` is true if `lt a b` or `a == b` and `lex lt as bs` is true.
-/
@[specialize]
def lex [BEq α] (l₁ l₂ : List α) (lt : α → α → Bool := by exact (· < ·)) : Bool :=
match l₁, l₂ with
| [], _ :: _ => true
@@ -1004,6 +1005,7 @@ Examples:
* `[8, 3, 2, 4, 2, 7, 4].dropWhile (· < 4) = [8, 3, 2, 4, 2, 7, 4]`
* `[8, 3, 2, 4, 2, 7, 4].dropWhile (· < 100) = []`
-/
@[specialize]
def dropWhile (p : α → Bool) : List α → List α
| [] => []
| a::l => match p a with
@@ -1460,9 +1462,11 @@ Examples:
["circle", "square", "triangle"]
```
-/
@[inline]
def modifyTailIdx (l : List α) (i : Nat) (f : List α → List α) : List α :=
go i l
where
@[specialize]
go : Nat → List α → List α
| 0, l => f l
| _+1, [] => []
@@ -1498,6 +1502,7 @@ Examples:
* `[1, 2, 3].modify 2 (· * 10) = [1, 2, 30]`
* `[1, 2, 3].modify 3 (· * 10) = [1, 2, 3]`
-/
@[inline]
def modify (l : List α) (i : Nat) (f : α → α) : List α :=
l.modifyTailIdx i (modifyHead f)
@@ -1604,6 +1609,7 @@ Examples:
* `[7, 6, 5, 8, 1, 2, 6].find? (· < 5) = some 1`
* `[7, 6, 5, 8, 1, 2, 6].find? (· < 1) = none`
-/
@[specialize]
def find? (p : α → Bool) : List α → Option α
| [] => none
| a::as => match p a with
@@ -1626,6 +1632,7 @@ Examples:
* `[7, 6, 5, 8, 1, 2, 6].findSome? (fun x => if x < 5 then some (10 * x) else none) = some 10`
* `[7, 6, 5, 8, 1, 2, 6].findSome? (fun x => if x < 1 then some (10 * x) else none) = none`
-/
@[specialize]
def findSome? (f : α → Option β) : List α → Option β
| [] => none
| a::as => match f a with
@@ -1649,6 +1656,7 @@ Examples:
* `[7, 6, 5, 8, 1, 2, 6].findRev? (· < 5) = some 2`
* `[7, 6, 5, 8, 1, 2, 6].findRev? (· < 1) = none`
-/
@[specialize]
def findRev? (p : α → Bool) : List α → Option α
| [] => none
| a::as => match findRev? p as with
@@ -1667,6 +1675,7 @@ Examples:
* `[7, 6, 5, 8, 1, 2, 6].findSomeRev? (fun x => if x < 5 then some (10 * x) else none) = some 20`
* `[7, 6, 5, 8, 1, 2, 6].findSomeRev? (fun x => if x < 1 then some (10 * x) else none) = none`
-/
@[specialize]
def findSomeRev? (f : α → Option β) : List α → Option β
| [] => none
| a::as => match findSomeRev? f as with
@@ -1717,9 +1726,11 @@ Examples:
* `[7, 6, 5, 8, 1, 2, 6].findIdx? (· < 5) = some 4`
* `[7, 6, 5, 8, 1, 2, 6].findIdx? (· < 1) = none`
-/
@[inline]
def findIdx? (p : α → Bool) (l : List α) : Option Nat :=
go l 0
where
@[specialize]
go : List α → Nat → Option Nat
| [], _ => none
| a :: l, i => if p a then some i else go l (i + 1)
@@ -1750,6 +1761,7 @@ Examples:
@[inline] def findFinIdx? (p : α → Bool) (l : List α) : Option (Fin l.length) :=
go l 0 (by simp)
where
@[specialize]
go : (l' : List α) → (i : Nat) → (h : l'.length + i = l.length) → Option (Fin l.length)
| [], _, _ => none
| a :: l, i, h =>
@@ -1886,7 +1898,7 @@ Examples:
* `[2, 4, 5, 6].any (· % 2 = 0) = true`
* `[2, 4, 5, 6].any (· % 2 = 1) = true`
-/
@[suggest_for List.some]
@[suggest_for List.some, specialize]
def any : (l : List α) → (p : α → Bool) → Bool
| [], _ => false
| h :: t, p => p h || any t p
@@ -1906,7 +1918,7 @@ Examples:
* `[2, 4, 6].all (· % 2 = 0) = true`
* `[2, 4, 5, 6].all (· % 2 = 0) = false`
-/
@[suggest_for List.every]
@[suggest_for List.every, specialize]
def all : List α → (α → Bool) → Bool
| [], _ => true
| h :: t, p => p h && all t p
@@ -2007,6 +2019,7 @@ Examples:
* `[1, 2, 3].zipWithAll Prod.mk [5, 6] = [(some 1, some 5), (some 2, some 6), (some 3, none)]`
* `[x₁, x₂].zipWithAll f [y] = [f (some x₁) (some y), f (some x₂) none]`
-/
@[specialize]
def zipWithAll (f : Option α → Option β → γ) : List α → List β → List γ
| [], bs => bs.map fun b => f none (some b)
| a :: as, [] => (a :: as).map fun a => f (some a) none
@@ -2056,6 +2069,20 @@ def sum {α} [Add α] [Zero α] : List α → α :=
@[simp, grind =] theorem sum_cons [Add α] [Zero α] {a : α} {l : List α} : (a::l).sum = a + l.sum := rfl
theorem sum_eq_foldr [Add α] [Zero α] {l : List α} : l.sum = l.foldr (· + ·) 0 := rfl
/--
Computes the product of the elements of a list.
Examples:
* `[a, b, c].prod = a * (b * (c * 1))`
* `[1, 2, 5].prod = 10`
-/
def prod {α} [Mul α] [One α] : List α → α :=
foldr (· * ·) 1
@[simp, grind =] theorem prod_nil [Mul α] [One α] : ([] : List α).prod = 1 := rfl
@[simp, grind =] theorem prod_cons [Mul α] [One α] {a : α} {l : List α} : (a::l).prod = a * l.prod := rfl
theorem prod_eq_foldr [Mul α] [One α] {l : List α} : l.prod = l.foldr (· * ·) 1 := rfl
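Concretely, `prod` unfolds definitionally on list literals, which is what the `simp` lemmas above record; a small sketch:

```lean
-- `prod` is a right fold with unit 1, so it unfolds by `rfl` on literals.
example {a b : Nat} : [a, b].prod = a * (b * 1) := rfl
#eval [1, 2, 5].prod  -- 10
```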
/-! ### range -/
/--


@@ -444,8 +444,8 @@ theorem findM?_eq_findSomeM? [Monad m] [LawfulMonad m] {p : α → m Bool} {as :
intro b
cases b <;> simp
@[inline, expose] protected def forIn' {α : Type u} {β : Type v} {m : Type v → Type w} [Monad m] (as : List α) (init : β) (f : (a : α) → a ∈ as → β → m (ForInStep β)) : m β :=
let rec @[specialize] loop : (as' : List α) → (b : β) → Exists (fun bs => bs ++ as' = as) → m β
@[inline, expose] protected def forIn' {α : Type u} {β : Type v} {m : Type v → Type w} [Monad m] (as : @& List α) (init : β) (f : (a : α) → a ∈ as → β → m (ForInStep β)) : m β :=
let rec @[specialize] loop : (as' : @& List α) → (b : β) → Exists (fun bs => bs ++ as' = as) → m β
| [], b, _ => pure b
| a::as', b, h => do
have : a ∈ as := by


@@ -7,3 +7,4 @@ module
prelude
public import Init.Data.List.Int.Sum
public import Init.Data.List.Int.Prod


@@ -0,0 +1,31 @@
/-
Copyright (c) 2026 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Kim Morrison
-/
module
prelude
import Init.Data.List.Lemmas
import Init.Data.Int.Lemmas
public import Init.Data.Int.Pow
public import Init.Data.List.Basic
public section
set_option linter.listVariables true -- Enforce naming conventions for `List`/`Array`/`Vector` variables.
set_option linter.indexVariables true -- Enforce naming conventions for index variables.
namespace List
@[simp]
theorem prod_replicate_int {n : Nat} {a : Int} : (replicate n a).prod = a ^ n := by
induction n <;> simp_all [replicate_succ, Int.pow_succ, Int.mul_comm]
theorem prod_append_int {l₁ l₂ : List Int} : (l₁ ++ l₂).prod = l₁.prod * l₂.prod := by
simp [prod_append]
theorem prod_reverse_int (xs : List Int) : xs.reverse.prod = xs.prod := by
simp [prod_reverse]
end List


@@ -1878,6 +1878,24 @@ theorem sum_reverse [Zero α] [Add α] [Std.Associative (α := α) (· + ·)]
simp_all [sum_append, Std.Commutative.comm (α := α) _ 0,
Std.LawfulLeftIdentity.left_id, Std.Commutative.comm]
@[simp, grind =]
theorem prod_append [Mul α] [One α] [Std.LawfulLeftIdentity (α := α) (· * ·) 1]
[Std.Associative (α := α) (· * ·)] {l₁ l₂ : List α} : (l₁ ++ l₂).prod = l₁.prod * l₂.prod := by
induction l₁ generalizing l₂ <;> simp_all [Std.Associative.assoc, Std.LawfulLeftIdentity.left_id]
@[simp, grind =]
theorem prod_singleton [Mul α] [One α] [Std.LawfulRightIdentity (· * ·) (1 : α)] {x : α} :
[x].prod = x := by
simp [List.prod_eq_foldr, Std.LawfulRightIdentity.right_id x]
@[simp, grind =]
theorem prod_reverse [One α] [Mul α] [Std.Associative (α := α) (· * ·)]
[Std.Commutative (α := α) (· * ·)]
[Std.LawfulLeftIdentity (α := α) (· * ·) 1] (xs : List α) : xs.reverse.prod = xs.prod := by
induction xs <;>
simp_all [prod_append, Std.Commutative.comm (α := α) _ 1,
Std.LawfulLeftIdentity.left_id, Std.Commutative.comm]
/-! ### concat
Note that `concat_eq_append` is a `@[simp]` lemma, so `concat` should usually not appear in goals.
@@ -2784,6 +2802,11 @@ theorem sum_eq_foldl [Zero α] [Add α] [Std.Associative (α := α) (· + ·)]
xs.sum = xs.foldl (init := 0) (· + ·) := by
simp [sum_eq_foldr, foldl_eq_apply_foldr, Std.LawfulLeftIdentity.left_id]
theorem prod_eq_foldl [One α] [Mul α] [Std.Associative (α := α) (· * ·)]
[Std.LawfulIdentity (· * ·) (1 : α)] {xs : List α} :
xs.prod = xs.foldl (init := 1) (· * ·) := by
simp [prod_eq_foldr, foldl_eq_apply_foldr, Std.LawfulLeftIdentity.left_id]
-- The argument `f : α₁ → α₂` is intentionally explicit, as it is sometimes not found by unification.
theorem foldl_hom (f : α₁ α₂) {g₁ : α₁ β α₁} {g₂ : α₂ β α₂} {l : List β} {init : α₁}
(H : x y, g₂ (f x) y = f (g₁ x y)) : l.foldl g₂ (f init) = f (l.foldl g₁ init) := by


@@ -13,6 +13,7 @@ public import Init.Data.List.Nat.Sublist
public import Init.Data.List.Nat.TakeDrop
public import Init.Data.List.Nat.Count
public import Init.Data.List.Nat.Sum
public import Init.Data.List.Nat.Prod
public import Init.Data.List.Nat.Erase
public import Init.Data.List.Nat.Find
public import Init.Data.List.Nat.BEq


@@ -0,0 +1,50 @@
/-
Copyright (c) 2026 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Kim Morrison
-/
module
prelude
import Init.Data.List.Lemmas
public import Init.BinderPredicates
public import Init.NotationExtra
import Init.Data.Nat.Lemmas
public section
set_option linter.listVariables true -- Enforce naming conventions for `List`/`Array`/`Vector` variables.
set_option linter.indexVariables true -- Enforce naming conventions for index variables.
namespace List
protected theorem prod_eq_zero_iff_exists_zero_nat {l : List Nat} : l.prod = 0 ↔ ∃ x ∈ l, x = 0 := by
induction l with
| nil => simp
| cons x xs ih =>
simp [Nat.mul_eq_zero, ih, eq_comm (a := (0 : Nat))]
protected theorem prod_pos_iff_forall_pos_nat {l : List Nat} : 0 < l.prod ↔ ∀ x ∈ l, 0 < x := by
induction l with
| nil => simp
| cons x xs ih =>
simp only [prod_cons, mem_cons, forall_eq_or_imp, ih]
constructor
· intro h
exact ⟨Nat.pos_of_mul_pos_right h, Nat.pos_of_mul_pos_left h⟩
· exact fun ⟨hx, hxs⟩ => Nat.mul_pos hx hxs
@[simp]
theorem prod_replicate_nat {n : Nat} {a : Nat} : (replicate n a).prod = a ^ n := by
induction n <;> simp_all [replicate_succ, Nat.pow_succ, Nat.mul_comm]
theorem prod_append_nat {l₁ l₂ : List Nat} : (l₁ ++ l₂).prod = l₁.prod * l₂.prod := by
simp [prod_append]
theorem prod_reverse_nat (xs : List Nat) : xs.reverse.prod = xs.prod := by
simp [prod_reverse]
theorem prod_eq_foldl_nat {xs : List Nat} : xs.prod = xs.foldl (init := 1) (· * ·) := by
simp only [foldl_eq_foldr_reverse, Nat.mul_comm, prod_eq_foldr, prod_reverse_nat]
end List


@@ -606,6 +606,13 @@ theorem sum_nat {l₁ l₂ : List Nat} (h : l₁ ~ l₂) : l₁.sum = l₂.sum :
| swap => simpa [List.sum_cons] using Nat.add_left_comm ..
| trans _ _ ih₁ ih₂ => simp [ih₁, ih₂]
theorem prod_nat {l₁ l₂ : List Nat} (h : l₁ ~ l₂) : l₁.prod = l₂.prod := by
induction h with
| nil => simp
| cons _ _ ih => simp [ih]
| swap => simpa [List.prod_cons] using Nat.mul_left_comm ..
| trans _ _ ih₁ ih₂ => simp [ih₁, ih₂]
theorem all_eq {l₁ l₂ : List α} {f : α Bool} (hp : l₁.Perm l₂) : l₁.all f = l₂.all f := by
rw [Bool.eq_iff_iff]; simp [hp.mem_iff]
@@ -615,6 +622,9 @@ theorem any_eq {l₁ l₂ : List α} {f : α → Bool} (hp : l₁.Perm l₂) : l
grind_pattern Perm.sum_nat => l₁ ~ l₂, l₁.sum
grind_pattern Perm.sum_nat => l₁ ~ l₂, l₂.sum
grind_pattern Perm.prod_nat => l₁ ~ l₂, l₁.prod
grind_pattern Perm.prod_nat => l₁ ~ l₂, l₂.prod
end Perm
end List


@@ -213,6 +213,9 @@ theorem forM_toArray [Monad m] (l : List α) (f : α → m PUnit) :
@[simp, grind =] theorem sum_toArray [Add α] [Zero α] (l : List α) : l.toArray.sum = l.sum := by
simp [Array.sum, List.sum]
@[simp, grind =] theorem prod_toArray [Mul α] [One α] (l : List α) : l.toArray.prod = l.prod := by
simp [Array.prod, List.prod]
@[simp, grind =] theorem append_toArray (l₁ l₂ : List α) :
l₁.toArray ++ l₂.toArray = (l₁ ++ l₂).toArray := by
apply ext'


@@ -60,7 +60,7 @@ theorem gcd_def (x y : Nat) : gcd x y = if x = 0 then y else gcd (y % x) x := by
cases n with
| zero => simp
| succ n =>
-- `simp [gcd_succ]` produces an invalid term unless `gcd_succ` is proved with `id rfl` instead
-- `simp [gcd_succ]` produces an invalid term unless `gcd_succ` is proved with `(rfl)` instead
rw [gcd_succ]
exact gcd_zero_left _
instance : Std.LawfulIdentity gcd 0 where


@@ -1718,7 +1718,7 @@ theorem toArray_roc_append_toArray_roc {l m n : Nat} (h : l ≤ m) (h' : m ≤ n
@[simp]
theorem getElem_toArray_roc {m n i : Nat} (_h : i < (m<...=n).toArray.size) :
(m<...=n).toArray[i]'_h = m + 1 + i := by
simp [toArray_roc_eq_toArray_rco]
simp [toArray_roc_eq_toArray_rco]
theorem getElem?_toArray_roc {m n i : Nat} :
(m<...=n).toArray[i]? = if i < n - m then some (m + 1 + i) else none := by


@@ -363,7 +363,7 @@ theorem toBitVec_eq_of_parseFirstByte_eq_threeMore {b : UInt8} (h : parseFirstBy
public def isInvalidContinuationByte (b : UInt8) : Bool :=
b &&& 0xc0 != 0x80
theorem isInvalidContinutationByte_eq_false_iff {b : UInt8} :
theorem isInvalidContinuationByte_eq_false_iff {b : UInt8} :
isInvalidContinuationByte b = false ↔ b &&& 0xc0 = 0x80 := by
simp [isInvalidContinuationByte]


@@ -506,6 +506,16 @@ Examples:
@[inline, expose] def sum [Add α] [Zero α] (xs : Vector α n) : α :=
xs.toArray.sum
/--
Computes the product of the elements of a vector.
Examples:
* `#v[a, b, c].prod = a * (b * (c * 1))`
* `#v[1, 2, 5].prod = 10`
-/
@[inline, expose] def prod [Mul α] [One α] (xs : Vector α n) : α :=
xs.toArray.prod
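A quick check of the new `Vector.prod`, which simply delegates to the `Array` version:

```lean
#eval (#v[1, 2, 5] : Vector Nat 3).prod  -- 10
```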
/--
Pad a vector on the left with a given element.


@@ -30,4 +30,16 @@ theorem sum_reverse_int (xs : Vector Int n) : xs.reverse.sum = xs.sum := by
theorem sum_eq_foldl_int {xs : Vector Int n} : xs.sum = xs.foldl (b := 0) (· + ·) := by
simp only [foldl_eq_foldr_reverse, Int.add_comm, sum_eq_foldr, sum_reverse_int]
@[simp] theorem prod_replicate_int {n : Nat} {a : Int} : (replicate n a).prod = a ^ n := by
simp [← prod_toArray, Array.prod_replicate_int]
theorem prod_append_int {as₁ as₂ : Vector Int n} : (as₁ ++ as₂).prod = as₁.prod * as₂.prod := by
simp [← prod_toArray]
theorem prod_reverse_int (xs : Vector Int n) : xs.reverse.prod = xs.prod := by
simp [prod_reverse]
theorem prod_eq_foldl_int {xs : Vector Int n} : xs.prod = xs.foldl (b := 1) (· * ·) := by
simp only [foldl_eq_foldr_reverse, Int.mul_comm, prod_eq_foldr, prod_reverse_int]
end Vector


@@ -278,6 +278,12 @@ theorem toArray_mk {xs : Array α} (h : xs.size = n) : (Vector.mk xs h).toArray
@[simp, grind =] theorem sum_toArray [Add α] [Zero α] {xs : Vector α n} :
xs.toArray.sum = xs.sum := rfl
@[simp] theorem prod_mk [Mul α] [One α] {xs : Array α} (h : xs.size = n) :
(Vector.mk xs h).prod = xs.prod := rfl
@[simp, grind =] theorem prod_toArray [Mul α] [One α] {xs : Vector α n} :
xs.toArray.prod = xs.prod := rfl
@[simp] theorem eq_mk : xs = Vector.mk as h ↔ xs.toArray = as := by
cases xs
simp
@@ -551,6 +557,10 @@ theorem toArray_toList {xs : Vector α n} : xs.toList.toArray = xs.toArray := rf
xs.toList.sum = xs.sum := by
rw [← toList_toArray, Array.sum_toList, sum_toArray]
@[simp, grind =] theorem prod_toList [Mul α] [One α] {xs : Vector α n} :
xs.toList.prod = xs.prod := by
rw [← toList_toArray, Array.prod_toList, prod_toArray]
@[simp] theorem getElem_toList {xs : Vector α n} {i : Nat} (h : i < xs.toList.length) :
xs.toList[i] = xs[i]'(by simpa using h) := by
cases xs
@@ -3134,3 +3144,39 @@ theorem sum_eq_foldl [Zero α] [Add α]
{xs : Vector α n} :
xs.sum = xs.foldl (b := 0) (· + ·) := by
simp [← sum_toList, List.sum_eq_foldl]
/-! ### prod -/
@[simp, grind =] theorem prod_empty [Mul α] [One α] : (#v[] : Vector α 0).prod = 1 := rfl
theorem prod_eq_foldr [Mul α] [One α] {xs : Vector α n} :
xs.prod = xs.foldr (b := 1) (· * ·) :=
rfl
@[simp, grind =]
theorem prod_append [One α] [Mul α] [Std.Associative (α := α) (· * ·)]
[Std.LeftIdentity (α := α) (· * ·) 1] [Std.LawfulLeftIdentity (α := α) (· * ·) 1]
{as₁ as₂ : Vector α n} : (as₁ ++ as₂).prod = as₁.prod * as₂.prod := by
simp [← prod_toList, List.prod_append]
@[simp, grind =]
theorem prod_singleton [Mul α] [One α] [Std.LawfulRightIdentity (· * ·) (1 : α)] {x : α} :
#v[x].prod = x := by
simp [← prod_toList, Std.LawfulRightIdentity.right_id x]
@[simp, grind =]
theorem prod_push [Mul α] [One α] [Std.Associative (α := α) (· * ·)]
[Std.LawfulIdentity (· * ·) (1 : α)] {xs : Vector α n} {x : α} :
(xs.push x).prod = xs.prod * x := by
simp [← prod_toArray]
@[simp, grind =]
theorem prod_reverse [One α] [Mul α] [Std.Associative (α := α) (· * ·)]
[Std.Commutative (α := α) (· * ·)]
[Std.LawfulLeftIdentity (α := α) (· * ·) 1] (xs : Vector α n) : xs.reverse.prod = xs.prod := by
simp [← prod_toList, List.prod_reverse]
theorem prod_eq_foldl [One α] [Mul α]
[Std.Associative (α := α) (· * ·)] [Std.LawfulIdentity (· * ·) (1 : α)]
{xs : Vector α n} :
xs.prod = xs.foldl (b := 1) (· * ·) := by
simp [← prod_toList, List.prod_eq_foldl]


@@ -37,4 +37,23 @@ theorem sum_reverse_nat (xs : Vector Nat n) : xs.reverse.sum = xs.sum := by
theorem sum_eq_foldl_nat {xs : Vector Nat n} : xs.sum = xs.foldl (b := 0) (· + ·) := by
simp only [foldl_eq_foldr_reverse, Nat.add_comm, sum_eq_foldr, sum_reverse_nat]
protected theorem prod_pos_iff_forall_pos_nat {xs : Vector Nat n} : 0 < xs.prod ↔ ∀ x ∈ xs, 0 < x := by
simp [← prod_toArray, Array.prod_pos_iff_forall_pos_nat]
protected theorem prod_eq_zero_iff_exists_zero_nat {xs : Vector Nat n} :
xs.prod = 0 ↔ ∃ x ∈ xs, x = 0 := by
simp [← prod_toArray, Array.prod_eq_zero_iff_exists_zero_nat]
@[simp] theorem prod_replicate_nat {n : Nat} {a : Nat} : (replicate n a).prod = a ^ n := by
simp [← prod_toArray, Array.prod_replicate_nat]
theorem prod_append_nat {as₁ as₂ : Vector Nat n} : (as₁ ++ as₂).prod = as₁.prod * as₂.prod := by
simp [← prod_toArray]
theorem prod_reverse_nat (xs : Vector Nat n) : xs.reverse.prod = xs.prod := by
simp [prod_reverse]
theorem prod_eq_foldl_nat {xs : Vector Nat n} : xs.prod = xs.foldl (b := 1) (· * ·) := by
simp only [foldl_eq_foldr_reverse, Nat.mul_comm, prod_eq_foldr, prod_reverse_nat]
end Vector


@@ -564,6 +564,28 @@ end Ring
end IsCharP
/--
`PowIdentity α p` states that `x ^ p = x` holds for all elements of `α`.
The primary source of instances is Fermat's little theorem: for a finite field with `q` elements,
`x ^ q = x` for every `x`. For `Fin p` or `ZMod p` with prime `p`, this gives `x ^ p = x`.
The `grind` ring solver uses this typeclass to add the relation `x ^ p - x = 0` to the
Groebner basis, which allows it to reduce high-degree polynomials. Mathlib can provide
instances for general finite fields via `FiniteField.pow_card`.
-/
class PowIdentity (α : Type u) [CommSemiring α] (p : outParam Nat) : Prop where
/-- Every element satisfies `x ^ p = x`. -/
pow_eq (x : α) : x ^ p = x
namespace PowIdentity
variable [CommSemiring α] [PowIdentity α p]
theorem pow (x : α) : x ^ p = x := pow_eq x
end PowIdentity
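A sketch of how the class is consumed, assuming the surrounding namespace as in this file: any `PowIdentity` instance supplies the rewrite `x ^ p = x`.

```lean
-- Any `PowIdentity α p` instance gives the reduction rule used by the solver.
example [CommSemiring α] [PowIdentity α p] (x : α) : x ^ p = x :=
  PowIdentity.pow x
```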
open AddCommGroup
theorem no_int_zero_divisors {α : Type u} [IntModule α] [NoNatZeroDivisors α] {k : Int} {a : α}


@@ -193,7 +193,7 @@ theorem mul_assoc (a b c : Q α) : mul (mul a b) c = mul a (mul b c) := by
simp [Semiring.left_distrib, Semiring.right_distrib]; refine ⟨0, ?_⟩; ac_rfl
theorem mul_one (a : Q α) : mul a (natCast 1) = a := by
obtain ⟨_, _⟩ := a; simp
obtain ⟨_, _⟩ := a; simp
theorem one_mul (a : Q α) : mul (natCast 1) a = a := by
obtain _, _ := a; simp


@@ -156,6 +156,12 @@ instance [i : NeZero n] : ToInt.Pow (Fin n) (.co 0 n) where
rw [pow_succ, ToInt.Mul.toInt_mul, ih, ToInt.wrap_toInt,
IntInterval.wrap_mul (by simp), Int.pow_succ, ToInt.wrap_toInt]
instance : PowIdentity (Fin 2) 2 where
pow_eq x := by
match x with
| ⟨0, _⟩ => rfl
| ⟨1, _⟩ => rfl
end Fin
end Lean.Grind


@@ -1880,3 +1880,12 @@ lead to undefined behavior.
-/
@[extern "lean_runtime_forget"]
def Runtime.forget (a : α) : BaseIO Unit := return
set_option linter.unusedVariables false in
/--
Ensures `a` remains alive at least until the call site by holding a reference to `a`. This can be useful
for unsafe code (such as an FFI) that relies on a Lean object not being freed until after some point
in the program. At runtime, this will be a no-op as the C compiler will optimize away this call.
-/
@[extern "lean_runtime_hold"]
def Runtime.hold (a : @& α) : BaseIO Unit := return
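A hypothetical usage sketch for `Runtime.hold`; the `useBuffer` definition and the FFI binding it alludes to are illustrative, not a real API:

```lean
-- Hypothetical: keep `buf` alive while C code borrows its memory, even though
-- Lean no longer syntactically references `buf` after the FFI call.
def useBuffer (buf : ByteArray) : IO Unit := do
  -- ... FFI work that borrows `buf`'s storage would go here ...
  Runtime.hold buf  -- `buf` is guaranteed to stay alive until at least this point
```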


@@ -9,6 +9,7 @@ prelude
public import Lean.Meta.Sorry
public import Lean.Util.CollectAxioms
public import Lean.OriginalConstKind
import Lean.Compiler.MetaAttr
import all Lean.OriginalConstKind -- for accessing `privateConstKindsExt`
public section
@@ -208,8 +209,12 @@ where
catch _ => pure ()
def addAndCompile (decl : Declaration) (logCompileErrors : Bool := true) : CoreM Unit := do
def addAndCompile (decl : Declaration) (logCompileErrors : Bool := true)
(markMeta : Bool := false) : CoreM Unit := do
addDecl decl
if markMeta then
for n in decl.getNames do
modifyEnv (Lean.markMeta · n)
compileDecl decl (logErrors := logCompileErrors)
end Lean


@@ -54,7 +54,7 @@ unsafe def registerInitAttrUnsafe (attrName : Name) (runAfterImport : Bool) (ref
descr := "initialization procedure for global references"
-- We want to run `[init]` in declaration order
preserveOrder := true
getParam := fun declName stx => do
getParam := fun declName stx => withoutExporting do
let decl ← getConstInfo declName
match (← Attribute.Builtin.getIdent? stx) with
| some initFnName =>
@@ -149,8 +149,6 @@ def setBuiltinInitAttr (env : Environment) (declName : Name) (initFnName : Name
def declareBuiltin (forDecl : Name) (value : Expr) : CoreM Unit :=
-- can always be private, not referenced directly except through emitted C code
withoutExporting do
-- TODO: needs an update-stage0 + prefer_native=true for breaking symbol name
withExporting do
let name ← mkAuxDeclName (kind := `_regBuiltin ++ forDecl)
let type := mkApp (mkConst `IO) (mkConst `Unit)
let decl := Declaration.defnDecl { name, levelParams := [], type, value, hints := ReducibilityHints.opaque,


@@ -67,7 +67,7 @@ structure ParamMap where
The set of fvars that were already annotated as borrowed before arriving at this pass. We try to
preserve the annotations here if possible.
-/
annoatedBorrows : Std.HashSet FVarId := {}
annotatedBorrows : Std.HashSet FVarId := {}
namespace ParamMap
@@ -95,7 +95,7 @@ where
modify fun m =>
{ m with
map := m.map.insert (.decl decl.name) (initParamsIfNotExported exported decl.params),
annoatedBorrows := decl.params.foldl (init := m.annoatedBorrows) fun acc p =>
annotatedBorrows := decl.params.foldl (init := m.annotatedBorrows) fun acc p =>
if p.borrow then acc.insert p.fvarId else acc
}
goCode decl.name code
@@ -116,7 +116,7 @@ where
modify fun m =>
{ m with
map := m.map.insert (.jp declName decl.fvarId) (initParams decl.params),
annoatedBorrows := decl.params.foldl (init := m.annoatedBorrows) fun acc p =>
annotatedBorrows := decl.params.foldl (init := m.annotatedBorrows) fun acc p =>
if p.borrow then acc.insert p.fvarId else acc
}
goCode declName decl.value
@@ -286,7 +286,7 @@ where
ownFVar (fvarId : FVarId) (reason : OwnReason) : InferM Unit := do
unless (← get).owned.contains fvarId do
if !reason.isForced && (← get).paramMap.annoatedBorrows.contains fvarId then
if !reason.isForced && (← get).paramMap.annotatedBorrows.contains fvarId then
trace[Compiler.inferBorrow] "user annotation blocked owning {← PP.run <| PP.ppFVar fvarId}: {← reason.toString}"
else
trace[Compiler.inferBorrow] "own {← PP.run <| PP.ppFVar fvarId}: {← reason.toString}"


@@ -446,24 +446,6 @@ Note that the value of `ctx.initHeartbeats` is ignored and replaced with `IO.get
@[inline] def CoreM.toIO' (x : CoreM α) (ctx : Context) (s : State) : IO α :=
(·.1) <$> x.toIO ctx s
/--
Record a check-in for the `LEAN_CHECK_SYSTEM_INTERVAL_MS` monitoring.
When that env var is set, warns on stderr if too much CPU time has elapsed since
the last check-in from either the C++ `check_system` or this function.
-/
@[extern "lean_check_system_interval"]
private opaque checkSystemIntervalImpl (componentName : @& String) : BaseIO Unit
@[extern "lean_check_system_interval_is_enabled"]
private opaque checkSystemIntervalIsEnabled : Unit → Bool
private builtin_initialize checkSystemIntervalEnabled : Bool ←
pure (checkSystemIntervalIsEnabled ())
@[inline] def checkSystemInterval (componentName : @& String) : BaseIO Unit := do
if checkSystemIntervalEnabled then
checkSystemIntervalImpl componentName
/--
Throws an internal interrupt exception if cancellation has been requested. The exception is not
caught by `try catch` but is intended to be caught by `Command.withLoggingExceptions` at the top
@@ -474,7 +456,6 @@ Like `checkSystem` but without the global heartbeat check, for callers that have
heartbeat tracking (e.g. `SynthInstance`).
-/
@[inline] def checkInterrupted : CoreM Unit := do
checkSystemInterval "Lean elaborator"
if let some tk := (← read).cancelTk? then
if (← tk.isSet) then
throwInterruptException


@@ -227,7 +227,7 @@ variable {β : Type v}
set_option linter.unusedVariables.funArgs false in
@[specialize]
partial def forInAux {α : Type u} {β : Type v} {m : Type v Type w} [Monad m] [inh : Inhabited β]
(f : α → β → m (ForInStep β)) (n : PersistentArrayNode α) (b : β) : m (ForInStep β) := do
(f : α → β → m (ForInStep β)) (n : @&PersistentArrayNode α) (b : β) : m (ForInStep β) := do
let mut b := b
match n with
| leaf vs =>
@@ -243,7 +243,7 @@ partial def forInAux {α : Type u} {β : Type v} {m : Type v → Type w} [Monad
| ForInStep.yield bNew => b := bNew
return ForInStep.yield b
@[specialize] protected def forIn (t : PersistentArray α) (init : β) (f : α → β → m (ForInStep β)) : m β := do
@[specialize] protected def forIn (t : @&PersistentArray α) (init : β) (f : α → β → m (ForInStep β)) : m β := do
match (← forInAux (inh := ⟨init⟩) f t.root init) with
| ForInStep.done b => pure b
| ForInStep.yield b =>


@@ -153,7 +153,7 @@ partial def findAtAux [BEq α] (keys : Array α) (vals : Array β) (heq : keys.s
else findAtAux keys vals heq (i+1) k
else none
partial def findAux [BEq α] : Node α β → USize → α → Option β
partial def findAux [BEq α] : @&Node α β → USize → α → Option β
| Node.entries entries, h, k =>
let j := (mod2Shift h shift).toNat
match entries[j]! with
@@ -162,7 +162,7 @@ partial def findAux [BEq α] : Node α β → USize → α → Option β
| Entry.entry k' v => if k == k' then some v else none
| Node.collision keys vals heq, _, k => findAtAux keys vals heq 0 k
def find? {_ : BEq α} {_ : Hashable α} : PersistentHashMap α β → α → Option β
def find? {_ : BEq α} {_ : Hashable α} : @&PersistentHashMap α β → α → Option β
| { root }, k => findAux root (hash k |>.toUSize) k
instance {_ : BEq α} {_ : Hashable α} : GetElem (PersistentHashMap α β) α (Option β) fun _ _ => True where
@@ -184,7 +184,7 @@ partial def findEntryAtAux [BEq α] (keys : Array α) (vals : Array β) (heq : k
else findEntryAtAux keys vals heq (i+1) k
else none
partial def findEntryAux [BEq α] : Node α β → USize → α → Option (α × β)
partial def findEntryAux [BEq α] : @&Node α β → USize → α → Option (α × β)
| Node.entries entries, h, k =>
let j := (mod2Shift h shift).toNat
match entries[j]! with
@@ -193,7 +193,7 @@ partial def findEntryAux [BEq α] : Node α β → USize → α → Option (α
| Entry.entry k' v => if k == k' then some (k', v) else none
| Node.collision keys vals heq, _, k => findEntryAtAux keys vals heq 0 k
def findEntry? {_ : BEq α} {_ : Hashable α} : PersistentHashMap α β → α → Option (α × β)
def findEntry? {_ : BEq α} {_ : Hashable α} : @&PersistentHashMap α β → α → Option (α × β)
| { root }, k => findEntryAux root (hash k |>.toUSize) k
partial def findKeyDAtAux [BEq α] (keys : Array α) (vals : Array β) (heq : keys.size = vals.size) (i : Nat) (k : α) (k₀ : α) : α :=
@@ -320,7 +320,7 @@ def foldl {_ : BEq α} {_ : Hashable α} (map : PersistentHashMap α β) (f : σ
Id.run <| map.foldlM (pure <| f · · ·) init
protected def forIn {_ : BEq α} {_ : Hashable α} [Monad m]
(map : PersistentHashMap α β) (init : σ) (f : α × β → σ → m (ForInStep σ)) : m σ := do
(map : @&PersistentHashMap α β) (init : σ) (f : α × β → σ → m (ForInStep σ)) : m σ := do
let intoError : ForInStep σ → Except σ σ
| .done s => .error s
| .yield s => .ok s


@@ -131,9 +131,9 @@ partial def find? (t : Trie α) (s : String) : Option α :=
loop 0 t
/-- Returns an `Array` of all values in the trie, in no particular order. -/
partial def values (t : Trie α) : Array α := go t |>.run #[] |>.2
partial def values (t : @&Trie α) : Array α := go t |>.run #[] |>.2
where
go : Trie α → StateM (Array α) Unit
go : @&Trie α → StateM (Array α) Unit
| leaf a? => do
if let some a := a? then
modify (·.push a)


@@ -13,6 +13,7 @@ public import Lean.IdentifierSuggestion
import all Lean.Elab.ErrorUtils
import Lean.Elab.DeprecatedArg
import Init.Omega
import Init.Data.List.MapIdx
public section
@@ -1299,13 +1300,13 @@ where
inductive LValResolution where
/-- When applied to `f`, effectively expands to `BaseStruct.fieldName (self := Struct.toBase f)`.
This is a special named argument where it suppresses any explicit arguments depending on it so that type parameters don't need to be supplied. -/
| projFn (baseStructName : Name) (structName : Name) (fieldName : Name)
| projFn (baseStructName : Name) (structName : Name) (fieldName : Name) (levels : List Level)
/-- Similar to `projFn`, but for extracting field indexed by `idx`. Works for one-constructor inductive types in general. -/
| projIdx (structName : Name) (idx : Nat)
/-- When applied to `f`, effectively expands to `constName ... (Struct.toBase f)`, with the argument placed in the correct
positional argument if possible, or otherwise as a named argument. The `Struct.toBase` is not present if `baseStructName == structName`,
in which case these do not need to be structures. Supports generalized field notation. -/
| const (baseStructName : Name) (structName : Name) (constName : Name)
| const (baseStructName : Name) (structName : Name) (constName : Name) (levels : List Level)
/-- Like `const`, but with `fvar` instead of `constName`.
The `baseName` is the base name of the type to search for in the parameter list. -/
| localRec (baseName : Name) (fvar : Expr)
@@ -1380,7 +1381,7 @@ private def reverseFieldLookup (env : Environment) (fieldName : String) :=
private def resolveLValAux (e : Expr) (eType : Expr) (lval : LVal) : TermElabM LValResolution := do
match eType.getAppFn, lval with
| .const structName _, LVal.fieldIdx ref idx =>
| .const structName _, LVal.fieldIdx ref idx levels =>
if idx == 0 then
throwError "Invalid projection: Index must be greater than 0"
let env ← getEnv
@@ -1393,10 +1394,14 @@ private def resolveLValAux (e : Expr) (eType : Expr) (lval : LVal) : TermElabM L
if idx - 1 < numFields then
if isStructure env structName then
let fieldNames := getStructureFields env structName
return LValResolution.projFn structName structName fieldNames[idx - 1]!
return LValResolution.projFn structName structName fieldNames[idx - 1]! levels
else
/- `structName` was declared using the `inductive` command,
so we don't have projection functions for it. Thus, we use `Expr.proj` -/
unless levels.isEmpty do
throwError "Invalid projection: Explicit universe levels are only supported for inductive types \
defined using the `structure` command. \
The expression{indentExpr e}\nhas type{inlineExpr eType}which is not a `structure`."
return LValResolution.projIdx structName (idx - 1)
else
if numFields == 0 then
@@ -1409,31 +1414,33 @@ private def resolveLValAux (e : Expr) (eType : Expr) (lval : LVal) : TermElabM L
++ MessageData.note m!"The expression{indentExpr e}\nhas type{inlineExpr eType}which has only \
{numFields} field{numFields.plural}"
++ tupleHint
| .const structName _, LVal.fieldName ref fieldName _ _ => withRef ref do
| .const structName _, LVal.fieldName ref fieldName levels _ _ => withRef ref do
let env ← getEnv
if isStructure env structName then
if let some baseStructName := findField? env structName (Name.mkSimple fieldName) then
return LValResolution.projFn baseStructName structName (Name.mkSimple fieldName)
return LValResolution.projFn baseStructName structName (Name.mkSimple fieldName) levels
-- Search the local context first
let fullName := Name.mkStr (privateToUserName structName) fieldName
for localDecl in (← getLCtx) do
if localDecl.isAuxDecl then
if let some localDeclFullName := (← getLCtx).auxDeclToFullName.get? localDecl.fvarId then
if fullName == privateToUserName localDeclFullName then
unless levels.isEmpty do
throwInvalidExplicitUniversesForLocal localDecl.toExpr
/- LVal notation is being used to make a "local" recursive call. -/
return LValResolution.localRec structName localDecl.toExpr
-- Then search the environment
if let some (baseStructName, fullName) ← findMethod? structName (.mkSimple fieldName) then
return LValResolution.const baseStructName structName fullName
return LValResolution.const baseStructName structName fullName levels
throwInvalidFieldAt ref fieldName fullName
-- Suggest a potential unreachable private name as hint. This does not cover structure
-- inheritance, nor `import all`.
(declHint := (mkPrivateName env structName).mkStr fieldName)
| .forallE .., LVal.fieldName ref fieldName suffix? fullRef =>
| .forallE .., LVal.fieldName ref fieldName levels suffix? fullRef =>
let fullName := Name.str `Function fieldName
if (← getEnv).contains fullName then
return LValResolution.const `Function `Function fullName
return LValResolution.const `Function `Function fullName levels
match e.getAppFn, suffix? with
| Expr.const c _, some suffix =>
throwUnknownNameWithSuggestions (idOrConst := "constant") (ref? := fullRef) (c ++ suffix)
@@ -1443,7 +1450,7 @@ private def resolveLValAux (e : Expr) (eType : Expr) (lval : LVal) : TermElabM L
throwError "Invalid projection: Projections cannot be used on functions, and{indentExpr e}\n\
has function type{inlineExprTrailing eType}"
| .mvar .., .fieldName _ fieldName _ _ =>
| .mvar .., .fieldName _ fieldName levels _ _ =>
let hint := match reverseFieldLookup (← getEnv) fieldName with
| #[] => MessageData.nil
| #[opt] => .hint' m!"Consider replacing the field projection `.{fieldName}` with a call to the function `{.ofConstName opt}`."
@@ -1451,13 +1458,13 @@ private def resolveLValAux (e : Expr) (eType : Expr) (lval : LVal) : TermElabM L
{MessageData.joinSep (opts.toList.map (indentD m!" `{.ofConstName ·}`")) .nil}"
throwNamedError lean.invalidField (m!"Invalid field notation: Type of{indentExpr e}\nis not \
known; cannot resolve field `{fieldName}`" ++ hint)
| .mvar .., .fieldIdx _ i =>
| .mvar .., .fieldIdx _ i _ =>
throwError m!"Invalid projection: Type of{indentExpr e}\nis not known; cannot resolve \
projection `{i}`"
| _, _ =>
match e.getAppFn, lval with
| Expr.const c _, .fieldName _ref _fieldName (some suffix) fullRef =>
| Expr.const c _, .fieldName _ref _fieldName _levels (some suffix) fullRef =>
throwUnknownNameWithSuggestions (idOrConst := "constant") (ref? := fullRef) (c ++ suffix)
| _, .fieldName .. =>
throwNamedError lean.invalidField m!"Invalid field notation: Field projection operates on \
@@ -1706,12 +1713,12 @@ private def elabAppLValsAux (namedArgs : Array NamedArg) (args : Array Arg) (exp
let f ← mkProjAndCheck structName idx f
let f ← addTermInfo lval.getRef f
loop f lvals
| LValResolution.projFn baseStructName structName fieldName =>
| LValResolution.projFn baseStructName structName fieldName levels =>
let f ← mkBaseProjections baseStructName structName f
let some info := getFieldInfo? (← getEnv) baseStructName fieldName | unreachable!
if (← isInaccessiblePrivateName info.projFn) then
throwError "Field `{fieldName}` from structure `{structName}` is private"
let projFn ← withRef lval.getRef <| mkConst info.projFn
let projFn ← withRef lval.getRef <| mkConst info.projFn levels
let projFn ← addProjTermInfo lval.getRef projFn
if lvals.isEmpty then
let namedArgs ← addNamedArg namedArgs { name := `self, val := Arg.expr f, suppressDeps := true }
@@ -1719,9 +1726,9 @@ private def elabAppLValsAux (namedArgs : Array NamedArg) (args : Array Arg) (exp
else
let f ← elabAppArgs projFn #[{ name := `self, val := Arg.expr f, suppressDeps := true }] #[] (expectedType? := none) (explicit := false) (ellipsis := false)
loop f lvals
| LValResolution.const baseStructName structName constName =>
| LValResolution.const baseStructName structName constName levels =>
let f ← if baseStructName != structName then mkBaseProjections baseStructName structName f else pure f
let projFn ← withRef lval.getRef <| mkConst constName
let projFn ← withRef lval.getRef <| mkConst constName levels
let projFn ← addProjTermInfo lval.getRef projFn
if lvals.isEmpty then
let (args, namedArgs) ← addLValArg baseStructName f args namedArgs projFn explicit
@@ -1772,15 +1779,19 @@ false, no elaboration function executed by `x` will reset it to
/--
Elaborates the resolutions of a function. The `fns` array is the output of `resolveName'`.
-/
private def elabAppFnResolutions (fRef : Syntax) (fns : List (Expr × Syntax × List Syntax)) (lvals : List LVal)
private def elabAppFnResolutions (fRef : Syntax) (fns : List (Expr × Syntax × List Syntax × List Level)) (lvals : List LVal)
(namedArgs : Array NamedArg) (args : Array Arg) (expectedType? : Option Expr) (explicit ellipsis overloaded : Bool)
(acc : Array (TermElabResult Expr)) (forceTermInfo : Bool := false) :
TermElabM (Array (TermElabResult Expr)) := do
let overloaded := overloaded || fns.length > 1
-- Set `errToSorry` to `false` if `fns` > 1. See comment above about the interaction between `errToSorry` and `observing`.
withReader (fun ctx => { ctx with errToSorry := fns.length == 1 && ctx.errToSorry }) do
fns.foldlM (init := acc) fun acc (f, fIdent, fields) => do
let lvals' := toLVals fields (first := true)
fns.foldlM (init := acc) fun acc (f, fIdent, fields, projLevels) => do
let lastIdx := fields.length - 1
let lvals' := fields.mapIdx fun idx field =>
let suffix? := if idx == 0 then some <| toName fields else none
let levels := if idx == lastIdx then projLevels else []
LVal.fieldName field field.getId.getString! levels suffix? fRef
let s ← observing do
checkDeprecated fIdent f
let f ← addTermInfo fIdent f expectedType? (force := forceTermInfo)
@@ -1794,11 +1805,6 @@ where
| field :: fields => .mkStr (go fields) field.getId.toString
go fields.reverse
toLVals : List Syntax → (first : Bool) → List LVal
| [], _ => []
| field::fields, true => .fieldName field field.getId.getString! (toName (field::fields)) fRef :: toLVals fields false
| field::fields, false => .fieldName field field.getId.getString! none fRef :: toLVals fields false
private def elabAppFnId (fIdent : Syntax) (fExplicitUnivs : List Level) (lvals : List LVal)
(namedArgs : Array NamedArg) (args : Array Arg) (expectedType? : Option Expr) (explicit ellipsis overloaded : Bool)
(acc : Array (TermElabResult Expr)) :
@@ -1832,7 +1838,7 @@ To infer a namespace from the expected type, we do the following operations:
- if the type is of the form `c x₁ ... xₙ` with `c` a constant, then try using `c` as the namespace,
and if that doesn't work, try unfolding the expression and continuing.
-/
private partial def resolveDottedIdentFn (idRef : Syntax) (id : Name) (explicitUnivs : List Level) (expectedType? : Option Expr) : TermElabM (List (Expr × Syntax × List Syntax)) := do
private partial def resolveDottedIdentFn (idRef : Syntax) (id : Name) (explicitUnivs : List Level) (expectedType? : Option Expr) : TermElabM (List (Expr × Syntax × List Syntax × List Level)) := do
unless id.isAtomic do
throwError "Invalid dotted identifier notation: The name `{id}` must be atomic"
tryPostponeIfNoneOrMVar expectedType?
@@ -1844,7 +1850,7 @@ private partial def resolveDottedIdentFn (idRef : Syntax) (id : Name) (explicitU
withForallBody expectedType fun resultType => do
go resultType expectedType #[]
where
throwNoExpectedType := do
throwNoExpectedType {α} : TermElabM α := do
let hint ← match reverseFieldLookup (← getEnv) (id.getString!) with
| #[] => pure MessageData.nil
| suggestions =>
@@ -1863,7 +1869,7 @@ where
withForallBody body k
else
k type
go (resultType : Expr) (expectedType : Expr) (previousExceptions : Array Exception) : TermElabM (List (Expr × Syntax × List Syntax)) := do
go (resultType : Expr) (expectedType : Expr) (previousExceptions : Array Exception) : TermElabM (List (Expr × Syntax × List Syntax × List Level)) := do
let resultType ← instantiateMVars resultType
let resultTypeFn := resultType.getAppFn
try
@@ -1880,11 +1886,11 @@ where
|>.filter (fun (_, fieldList) => fieldList.isEmpty)
|>.map Prod.fst
if !candidates.isEmpty then
candidates.mapM fun resolvedName => return (← mkConst resolvedName explicitUnivs, ← getRef, [])
candidates.mapM fun resolvedName => return (← mkConst resolvedName explicitUnivs, ← getRef, [], [])
else if let some (fvar, []) ← resolveLocalName fullName then
unless explicitUnivs.isEmpty do
throwInvalidExplicitUniversesForLocal fvar
return [(fvar, ← getRef, [])]
return [(fvar, ← getRef, [], [])]
else
throwUnknownIdentifierAt (← getRef) (declHint := fullName) <| m!"Unknown constant `{.ofConstName fullName}`"
++ .note m!"Inferred this name from the expected resulting type of `.{id}`:{indentExpr expectedType}"
@@ -1914,26 +1920,37 @@ private partial def elabAppFn (f : Syntax) (lvals : List LVal) (namedArgs : Arra
withReader (fun ctx => { ctx with errToSorry := false }) do
f.getArgs.foldlM (init := acc) fun acc f => elabAppFn f lvals namedArgs args expectedType? explicit ellipsis true acc
else
let elabFieldName (e field : Syntax) (explicit : Bool) := do
let newLVals := field.identComponents.map fun comp =>
-- We use `none` in `suffix?` since `field` can't be part of a composite name
LVal.fieldName comp comp.getId.getString! none f
let elabFieldName (e field : Syntax) (explicitUnivs : List Level) := do
let comps := field.identComponents
let lastIdx := comps.length - 1
let newLVals := comps.mapIdx fun idx comp =>
let levels := if idx = lastIdx then explicitUnivs else []
let suffix? := none -- We use `none` since the field can't be part of a composite name
LVal.fieldName comp comp.getId.getString! levels suffix? f
elabAppFn e (newLVals ++ lvals) namedArgs args expectedType? explicit ellipsis overloaded acc
let elabFieldIdx (e idxStx : Syntax) (explicit : Bool) := do
let elabFieldIdx (e idxStx : Syntax) (explicitUnivs : List Level) := do
let some idx := idxStx.isFieldIdx?
| throwError "Internal error: Unexpected field index syntax `{idxStx}`"
elabAppFn e (LVal.fieldIdx idxStx idx :: lvals) namedArgs args expectedType? explicit ellipsis overloaded acc
let elabDottedIdent (id : Syntax) (explicitUnivs : List Level) (explicit : Bool) : TermElabM (Array (TermElabResult Expr)) := do
elabAppFn e (LVal.fieldIdx idxStx idx explicitUnivs :: lvals) namedArgs args expectedType? explicit ellipsis overloaded acc
let elabDottedIdent (id : Syntax) (explicitUnivs : List Level) : TermElabM (Array (TermElabResult Expr)) := do
let res ← withRef f <| resolveDottedIdentFn id id.getId.eraseMacroScopes explicitUnivs expectedType?
-- Use (forceTermInfo := true) because we want to record the result of .ident resolution even in patterns
elabAppFnResolutions f res lvals namedArgs args expectedType? explicit ellipsis overloaded acc (forceTermInfo := true)
match f with
| `($(e).$idx:fieldIdx) => elabFieldIdx e idx explicit
| `($e |>.$idx:fieldIdx) => elabFieldIdx e idx explicit
| `($(e).$field:ident) => elabFieldName e field explicit
| `($e |>.$field:ident) => elabFieldName e field explicit
| `(@$(e).$idx:fieldIdx) => elabFieldIdx e idx (explicit := true)
| `(@$(e).$field:ident) => elabFieldName e field (explicit := true)
| `($(e).$idx:fieldIdx)
| `($e |>.$idx:fieldIdx) =>
elabFieldIdx e idx []
| `($(e).$idx:fieldIdx.{$us,*})
| `($e |>.$idx:fieldIdx.{$us,*}) =>
let us ← elabExplicitUnivs us
elabFieldIdx e idx us
| `($(e).$field:ident)
| `($e |>.$field:ident) =>
elabFieldName e field []
| `($(e).$field:ident.{$us,*})
| `($e |>.$field:ident.{$us,*}) =>
let us ← elabExplicitUnivs us
elabFieldName e field us
| `($_:ident@$_:term) =>
throwError m!"Expected a function, but found the named pattern{indentD f}"
++ .note m!"Named patterns `<identifier>@<term>` can only be used when pattern-matching"
@@ -1942,12 +1959,15 @@ private partial def elabAppFn (f : Syntax) (lvals : List LVal) (namedArgs : Arra
| `($id:ident.{$us,*}) => do
let us ← elabExplicitUnivs us
elabAppFnId id us lvals namedArgs args expectedType? explicit ellipsis overloaded acc
| `(.$id:ident) => elabDottedIdent id [] explicit
| `(.$id:ident) => elabDottedIdent id []
| `(.$id:ident.{$us,*}) =>
let us ← elabExplicitUnivs us
elabDottedIdent id us explicit
elabDottedIdent id us
| `(@$_:ident)
| `(@$_:ident.{$_us,*})
| `(@$(_).$_:fieldIdx)
| `(@$(_).$_:ident)
| `(@$(_).$_:ident.{$_us,*})
| `(@.$_:ident)
| `(@.$_:ident.{$_us,*}) =>
elabAppFn (f.getArg 1) lvals namedArgs args expectedType? (explicit := true) ellipsis overloaded acc
@@ -2084,10 +2104,10 @@ private def elabAtom : TermElab := fun stx expectedType? => do
@[builtin_term_elab dotIdent] def elabDotIdent : TermElab := elabAtom
@[builtin_term_elab explicitUniv] def elabExplicitUniv : TermElab := elabAtom
@[builtin_term_elab pipeProj] def elabPipeProj : TermElab
| `($e |>.%$tk$f $args*), expectedType? =>
| `($e |>.%$tk$f$[.{$us?,*}]? $args*), expectedType? =>
universeConstraintsCheckpoint do
let (namedArgs, args, ellipsis) ← expandArgs args
let mut stx ← `($e |>.%$tk$f)
let mut stx ← `($e |>.%$tk$f$[.{$us?,*}]?)
if let (some startPos, some stopPos) := (e.raw.getPos?, f.raw.getTailPos?) then
stx := stx.raw.setInfo <| .synthetic (canonical := true) startPos stopPos
elabAppAux stx namedArgs args (ellipsis := ellipsis) expectedType?
@@ -2095,15 +2115,16 @@ private def elabAtom : TermElab := fun stx expectedType? => do
@[builtin_term_elab explicit] def elabExplicit : TermElab := fun stx expectedType? =>
match stx with
| `(@$_:ident) => elabAtom stx expectedType? -- Recall that `elabApp` also has support for `@`
| `(@$_:ident.{$_us,*}) => elabAtom stx expectedType?
| `(@$(_).$_:fieldIdx) => elabAtom stx expectedType?
| `(@$(_).$_:ident) => elabAtom stx expectedType?
| `(@.$_:ident) => elabAtom stx expectedType?
| `(@.$_:ident.{$_us,*}) => elabAtom stx expectedType?
| `(@($t)) => elabTerm t expectedType? (implicitLambda := false) -- `@` is being used just to disable implicit lambdas
| `(@$t) => elabTerm t expectedType? (implicitLambda := false) -- `@` is being used just to disable implicit lambdas
| _ => throwUnsupportedSyntax
| `(@$_:ident) => elabAtom stx expectedType? -- Recall that `elabApp` also has support for `@`
| `(@$_:ident.{$_us,*}) => elabAtom stx expectedType?
| `(@$(_).$_:fieldIdx) => elabAtom stx expectedType?
| `(@$(_).$_:ident) => elabAtom stx expectedType?
| `(@$(_).$_:ident.{$_us,*}) => elabAtom stx expectedType?
| `(@.$_:ident) => elabAtom stx expectedType?
| `(@.$_:ident.{$_us,*}) => elabAtom stx expectedType?
| `(@($t)) => elabTerm t expectedType? (implicitLambda := false) -- `@` is being used just to disable implicit lambdas
| `(@$t) => elabTerm t expectedType? (implicitLambda := false) -- `@` is being used just to disable implicit lambdas
| _ => throwUnsupportedSyntax
@[builtin_term_elab choice] def elabChoice : TermElab := elabAtom
@[builtin_term_elab proj] def elabProj : TermElab := elabAtom

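The field-notation patterns added above (`$(e).$field:ident.{$us,*}` and the `|>.` variants) can be illustrated with a hypothetical usage sketch. `Box`, `b`, and the particular universe arguments are assumptions for illustration, not part of this diff:

```lean
-- Hypothetical usage of the explicit-universe field notation this diff adds.
-- `.{0}` instantiates the universe parameter of the projection `Box.val`:
structure Box.{u} (α : Type u) where
  val : α

def b : Box Nat := ⟨0⟩
#check b.val.{0}      -- projection with an explicit universe level
#check b |>.val.{0}   -- also accepted via the pipe-projection syntax
```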

@@ -111,8 +111,14 @@ open Lean.Meta
for x in loopMutVars do
let defn ← getLocalDeclFromUserName x.getId
Term.addTermInfo' x defn.toExpr
-- ForIn forces all mut vars into the same universe: that of the do block result type.
discard <| Term.ensureHasType (mkSort (mi.u.succ)) defn.type
-- ForIn forces the mut tuple into the universe mi.u: that of the do block result type.
-- If we don't do this, then we are stuck on solving constraints such as
-- `max ?u.46 ?u.47 =?= max (max ?u.22 ?u.46) ?u.47`
-- It's important we do this as a separate isLevelDefEq check on the decremented level because
-- otherwise (`ensureHasType (mkSort mi.u.succ)`) we are stuck on constraints like
-- `max (?u+1) (?v+1) =?= ?u+1`
let u ← getDecLevel defn.type
discard <| isLevelDefEq u mi.u
defs := defs.push defn.toExpr
if info.returnsEarly && loopMutVars.isEmpty then
defs := defs.push (mkConst ``Unit.unit)
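A minimal sketch of the code this bookkeeping supports — ordinary user code whose mut variables and do-block result are threaded through the same `ForInStep`, so they must fit in one universe (`mi.u` above); the universe unification happens behind the scenes:

```lean
-- `acc` and the result type `Nat` flow through the same `ForIn` loop,
-- which is why the elaborator forces them into a common universe:
def sumList (xs : List Nat) : Nat := Id.run do
  let mut acc := 0
  for x in xs do
    acc := acc + x
  return acc

#eval sumList [1, 2, 3]  -- 6
```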


@@ -65,9 +65,28 @@ def Visibility.isPublic : Visibility → Bool
| .public => true
| _ => false
/--
Returns whether the given visibility modifier should be interpreted as `public` in the current
environment.
NOTE: `Environment.isExporting` defaults to `false` when command elaborators are invoked, for
backward compatibility. It needs to be initialized appropriately before calling this function,
as is done e.g. in `elabDeclaration`.
-/
def Visibility.isInferredPublic (env : Environment) (v : Visibility) : Bool :=
if env.isExporting || !env.header.isModule then !v.isPrivate else v.isPublic
/-- Converts optional visibility syntax to a `Visibility` value. -/
def elabVisibility [Monad m] [MonadError m] (vis? : Option (TSyntax ``Parser.Command.visibility)) :
m Visibility :=
match vis? with
| none => pure .regular
| some v =>
match v with
| `(Parser.Command.visibility| private) => pure .private
| `(Parser.Command.visibility| public) => pure .public
| _ => throwErrorAt v "unexpected visibility modifier"
/-- Whether a declaration is default, partial or nonrec. -/
inductive RecKind where
| «partial» | «nonrec» | default
@@ -183,13 +202,7 @@ def elabModifiers (stx : TSyntax ``Parser.Command.declModifiers) : m Modifiers :
else
RecKind.nonrec
let docString? := docCommentStx.getOptional?.map (TSyntax.mk ·, doc.verso.get (← getOptions))
let visibility ← match visibilityStx.getOptional? with
| none => pure .regular
| some v =>
match v with
| `(Parser.Command.visibility| private) => pure .private
| `(Parser.Command.visibility| public) => pure .public
| _ => throwErrorAt v "unexpected visibility modifier"
let visibility ← elabVisibility (visibilityStx.getOptional?.map (·))
let isProtected := !protectedStx.isNone
let attrs ← match attrsStx.getOptional? with
| none => pure #[]


@@ -340,31 +340,29 @@ def elabMutual : CommandElab := fun stx => do
@[builtin_command_elab Lean.Parser.Command.«initialize»] def elabInitialize : CommandElab
| stx@`($declModifiers:declModifiers $kw:initializeKeyword $[$id? : $type? ]? $doSeq) => do
withExporting (isExporting := (← getScope).isPublic) do
let attrId := mkIdentFrom stx <| if kw.raw[0].isToken "initialize" then `init else `builtin_init
if let (some id, some type) := (id?, type?) then
let `(Parser.Command.declModifiersT| $[$doc?:docComment]? $[@[$attrs?,*]]? $(vis?)? $[meta%$meta?]? $[unsafe%$unsafe?]?) := stx[0]
| throwErrorAt declModifiers "invalid initialization command, unexpected modifiers"
let defStx ← `($[$doc?:docComment]? @[$attrId:ident initFn, $(attrs?.getD ),*] $(vis?)? $[meta%$meta?]? opaque $id : $type)
let mut fullId := (← getCurrNamespace) ++ id.getId
if vis?.any (·.raw.isOfKind ``Parser.Command.private) then
let visibility ← elabVisibility vis?
if !visibility.isInferredPublic (← getEnv) then
fullId := mkPrivateName (← getEnv) fullId
-- We need to add `id`'s ranges *before* elaborating `initFn` (and then `id` itself) as
-- otherwise the info context created by `with_decl_name` will be incomplete and break the
-- call hierarchy
addDeclarationRangesForBuiltin fullId defStx.raw[0] defStx.raw[1]
let vis := Parser.Command.visibility.ofBool (!isPrivateName fullId)
elabCommand (← `(
$vis:visibility $[meta%$meta?]? $[unsafe%$unsafe?]? def initFn : IO $type := with_decl_name% $(mkIdent fullId) do $doSeq
@[no_expose] private $[meta%$meta?]? $[unsafe%$unsafe?]? def initFn : IO $type := with_decl_name% $(mkIdent fullId) do $doSeq
$defStx:command))
else
let `(Parser.Command.declModifiersT| $[$doc?:docComment]? $[@[$attrs?,*]]? $(_)? $[meta%$meta?]? $[unsafe%$unsafe?]?) := declModifiers
| throwErrorAt declModifiers "invalid initialization command, unexpected modifiers"
let attrs := (attrs?.map (·.getElems)).getD #[]
let attrs := attrs.push (← `(Lean.Parser.Term.attrInstance| $attrId:ident))
-- `[builtin_init]` can be private as it is used for local codegen only but `[init]` must be
-- available for the interpreter.
let vis := Parser.Command.visibility.ofBool (attrId.getId == `init)
elabCommand (← `($[$doc?:docComment]? @[$[$attrs],*] $vis:visibility $[meta%$meta?]? $[unsafe%$unsafe?]? def initFn : IO Unit := do $doSeq))
elabCommand (← `($[$doc?:docComment]? @[no_expose, $[$attrs],*] private $[meta%$meta?]? $[unsafe%$unsafe?]? def initFn : IO Unit := do $doSeq))
| _ => throwUnsupportedSyntax
builtin_initialize
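The user-facing effect of this fix can be sketched as follows. This is a hedged illustration inferred from the `isInferredPublic` logic above; the exact declarations are assumptions, not taken from the diff:

```lean
module

-- With this fix, an `initialize` without an explicit visibility follows the
-- module default (private), instead of only honoring an explicit `private`:
initialize counter : IO.Ref Nat ← IO.mkRef 0        -- private by default

-- An explicit `public` still exports the declaration, and the generated
-- `initFn` now inherits this visibility rather than being forced public:
public initialize visible : IO.Ref Nat ← IO.mkRef 0
```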


@@ -233,27 +233,41 @@ def processDefDeriving (view : DerivingClassView) (decl : Expr) (isNoncomputable
finally
Core.setMessageLog (msgLog ++ (← Core.getMessageLog))
let env ← getEnv
let hints := ReducibilityHints.regular (getMaxHeight env result.value + 1)
let decl ← mkDefinitionValInferringUnsafe instName result.levelParams.toList result.type result.value hints
-- Pre-check: if the instance value depends on noncomputable definitions and the user didn't write
-- `noncomputable`, give an actionable error with a `Try this:` suggestion.
unless isNoncomputable || (← read).isNoncomputableSection || (← isProp result.type) do
let noncompRef? := preNormValue.foldConsts none fun n acc =>
acc <|> if Lean.isNoncomputable (asyncMode := .local) env n then some n else none
if let some noncompRef := noncompRef? then
if let some cmdRef := cmdRef? then
if let some origText := cmdRef.reprint then
let newText := (origText.replace "deriving instance " "deriving noncomputable instance ").trimAscii
logInfoAt cmdRef m!"Try this: {newText}"
throwError "failed to derive instance because it depends on \
`{.ofConstName noncompRef}`, which is noncomputable"
if isNoncomputable || (← read).isNoncomputableSection then
addDecl <| Declaration.defnDecl decl
modifyEnv (addNoncomputable · instName)
let isPropType ← isProp result.type
if isPropType then
let decl ← mkThmOrUnsafeDef {
name := instName, levelParams := result.levelParams.toList,
type := result.type, value := result.value
}
addDecl decl
else
addAndCompile <| Declaration.defnDecl decl
let hints := ReducibilityHints.regular (getMaxHeight env result.value + 1)
let decl ← mkDefinitionValInferringUnsafe instName result.levelParams.toList result.type result.value hints
-- Pre-check: if the instance value depends on noncomputable definitions and the user didn't write
-- `noncomputable`, give an actionable error with a `Try this:` suggestion.
unless isNoncomputable || (← read).isNoncomputableSection do
let noncompRef? := preNormValue.foldConsts none fun n acc =>
acc <|> if Lean.isNoncomputable (asyncMode := .local) env n then some n else none
if let some noncompRef := noncompRef? then
if let some cmdRef := cmdRef? then
if let some origText := cmdRef.reprint then
let newText := (origText.replace "deriving instance " "deriving noncomputable instance ").trimAscii
logInfoAt cmdRef m!"Try this: {newText}"
throwError "failed to derive instance because it depends on \
`{.ofConstName noncompRef}`, which is noncomputable"
if isNoncomputable || (← read).isNoncomputableSection then
addDecl <| Declaration.defnDecl decl
modifyEnv (addNoncomputable · instName)
else
addAndCompile <| Declaration.defnDecl decl
trace[Elab.Deriving] "Derived instance `{.ofConstName instName}`"
registerInstance instName AttributeKind.global (eval_prio default)
-- For Prop-typed instances (theorems), skip `implicit_reducible` since reducibility hints are
-- irrelevant for theorems. This matches the behavior of the handwritten `instance` command
-- (see `MutualDef.lean`).
if isPropType then
addInstance instName AttributeKind.global (eval_prio default)
else
registerInstance instName AttributeKind.global (eval_prio default)
addDeclarationRangesFromSyntax instName ( getRef)
end Term


@@ -25,25 +25,23 @@ private def mkInhabitedInstanceUsing (inductiveTypeName : Name) (ctorName : Name
| none =>
return false
where
addLocalInstancesForParamsAux {α} (k : LocalInst2Index → TermElabM α) : List Expr → Nat → LocalInst2Index → TermElabM α
| [], _, map => k map
| x::xs, i, map =>
addLocalInstancesForParamsAux {α} (k : Array Expr → LocalInst2Index → TermElabM α) : List Expr → Nat → Array Expr → LocalInst2Index → TermElabM α
| [], _, insts, map => k insts map
| x::xs, i, insts, map =>
try
let instType ← mkAppM `Inhabited #[x]
if (← isTypeCorrect instType) then
withLocalDeclD (← mkFreshUserName `inst) instType fun inst => do
trace[Elab.Deriving.inhabited] "adding local instance {instType}"
addLocalInstancesForParamsAux k xs (i+1) (map.insert inst.fvarId! i)
else
addLocalInstancesForParamsAux k xs (i+1) map
check instType
withLocalDecl (← mkFreshUserName `inst) .instImplicit instType fun inst => do
trace[Elab.Deriving.inhabited] "adding local instance {instType}"
addLocalInstancesForParamsAux k xs (i+1) (insts.push inst) (map.insert inst.fvarId! i)
catch _ =>
addLocalInstancesForParamsAux k xs (i+1) map
addLocalInstancesForParamsAux k xs (i+1) insts map
addLocalInstancesForParams {α} (xs : Array Expr) (k : LocalInst2Index → TermElabM α) : TermElabM α := do
addLocalInstancesForParams {α} (xs : Array Expr) (k : Array Expr → LocalInst2Index → TermElabM α) : TermElabM α := do
if addHypotheses then
addLocalInstancesForParamsAux k xs.toList 0 {}
addLocalInstancesForParamsAux k xs.toList 0 #[] {}
else
k {}
k #[] {}
collectUsedLocalsInsts (usedInstIdxs : IndexSet) (localInst2Index : LocalInst2Index) (e : Expr) : IndexSet :=
if localInst2Index.isEmpty then
@@ -58,58 +56,88 @@ where
runST (fun _ => visit |>.run usedInstIdxs) |>.2
/-- Create an `instance` command using the constructor `ctorName` with a hypothesis `Inhabited α` when `α` is one of the inductive type parameters
at position `i` and `i ∈ assumingParamIdxs`. -/
mkInstanceCmdWith (assumingParamIdxs : IndexSet) : TermElabM Syntax := do
let ctx ← Deriving.mkContext ``Inhabited "inhabited" inductiveTypeName
at position `i` and `i ∈ usedInstIdxs`. -/
mkInstanceCmdWith (instId : Ident) (usedInstIdxs : IndexSet) (auxFunId : Ident) : TermElabM Syntax := do
let indVal ← getConstInfoInduct inductiveTypeName
let ctorVal ← getConstInfoCtor ctorName
let mut indArgs := #[]
let mut binders := #[]
for i in *...indVal.numParams + indVal.numIndices do
let arg := mkIdent (← mkFreshUserName `a)
indArgs := indArgs.push arg
let binder ← `(bracketedBinderF| { $arg:ident })
binders := binders.push binder
if assumingParamIdxs.contains i then
let binder ← `(bracketedBinderF| [Inhabited $arg:ident ])
binders := binders.push binder
binders := binders.push <| ← `(bracketedBinderF| { $arg:ident })
if usedInstIdxs.contains i then
binders := binders.push <| ← `(bracketedBinderF| [Inhabited $arg:ident ])
let type ← `(@$(mkCIdent inductiveTypeName):ident $indArgs:ident*)
let mut ctorArgs := #[]
for _ in *...ctorVal.numParams do
ctorArgs := ctorArgs.push (← `(_))
for _ in *...ctorVal.numFields do
ctorArgs := ctorArgs.push (← ``(Inhabited.default))
let val ← `(@$(mkIdent ctorName):ident $ctorArgs*)
let ctx ← mkContext ``Inhabited "default" inductiveTypeName
let auxFunName := ctx.auxFunNames[0]!
`(def $(mkIdent auxFunName):ident $binders:bracketedBinder* : $type := $val
instance $(mkIdent ctx.instName):ident $binders:bracketedBinder* : Inhabited $type := $(mkIdent auxFunName))
`(instance $instId:ident $binders:bracketedBinder* : Inhabited $type := $auxFunId)
solveMVarsWithDefault (e : Expr) : TermElabM Unit := do
let mvarIds ← getMVarsNoDelayed e
mvarIds.forM fun mvarId => mvarId.withContext do
unless (← mvarId.isAssigned) do
let type ← mvarId.getType
withTraceNode `Elab.Deriving.inhabited (fun _ => return m!"synthesizing Inhabited instance for{inlineExprTrailing type}") do
let val ← mkDefault type
mvarId.assign val
trace[Elab.Deriving.inhabited] "value:{inlineExprTrailing val}"
mkInstanceCmd? : TermElabM (Option Syntax) := do
let ctorVal ← getConstInfoCtor ctorName
forallTelescopeReducing ctorVal.type fun xs _ =>
addLocalInstancesForParams xs[*...ctorVal.numParams] fun localInst2Index => do
let mut usedInstIdxs := {}
let mut ok := true
for h : i in ctorVal.numParams...xs.size do
let x := xs[i]
let instType ← mkAppM `Inhabited #[(← inferType x)]
trace[Elab.Deriving.inhabited] "checking {instType} for `{ctorName}`"
match (← trySynthInstance instType) with
| LOption.some e =>
usedInstIdxs := collectUsedLocalsInsts usedInstIdxs localInst2Index e
| _ =>
trace[Elab.Deriving.inhabited] "failed to generate instance using `{ctorName}` {if addHypotheses then "(assuming parameters are inhabited)" else ""} because of field with type{indentExpr (← inferType x)}"
ok := false
break
if !ok then
return none
mkDefaultValue (indVal : InductiveVal) : TermElabM (Expr × Expr × IndexSet) := do
let us := indVal.levelParams.map Level.param
forallTelescopeReducing indVal.type fun xs _ =>
withImplicitBinderInfos xs do
addLocalInstancesForParams xs[0...indVal.numParams] fun insts localInst2Index => do
let type := mkAppN (.const inductiveTypeName us) xs
let val ←
if isStructure (← getEnv) inductiveTypeName then
withTraceNode `Elab.Deriving.inhabited (fun _ => return m!"using structure instance elaborator") do
let stx ← `(structInst| {..})
withoutErrToSorry <| elabTermAndSynthesize stx type
else
trace[Elab.Deriving.inhabited] "inhabited instance using `{ctorName}` {if addHypotheses then "(assuming parameters are inhabited)" else ""} {usedInstIdxs.toList}"
let cmd ← mkInstanceCmdWith usedInstIdxs
trace[Elab.Deriving.inhabited] "\n{cmd}"
return some cmd
withTraceNode `Elab.Deriving.inhabited (fun _ => return m!"using constructor `{.ofConstName ctorName}`") do
let val := mkAppN (.const ctorName us) xs[0...indVal.numParams]
let (mvars, _, type') ← forallMetaTelescopeReducing (← inferType val)
unless (← isDefEq type type') do
throwError "cannot unify{indentExpr type}\nand type of constructor{indentExpr type'}"
pure <| mkAppN val mvars
solveMVarsWithDefault val
let val ← instantiateMVars val
if val.hasMVar then
throwError "default value contains metavariables{inlineExprTrailing val}"
let fvars := Lean.collectFVars {} val
let insts' := insts.filter fvars.visitedExpr.contains
let usedInstIdxs := collectUsedLocalsInsts {} localInst2Index val
assert! insts'.size == usedInstIdxs.size
trace[Elab.Deriving.inhabited] "inhabited instance using{inlineExpr val}{if insts'.isEmpty then m!"" else m!"(assuming parameters {insts'} are inhabited)"}"
let xs' := xs ++ insts'
let auxType ← mkForallFVars xs' type
let auxVal ← mkLambdaFVars xs' val
return (auxType, auxVal, usedInstIdxs)
mkInstanceCmd? : TermElabM (Option Syntax) :=
withExporting (isExporting := !isPrivateName ctorName) do
let ctx ← mkContext ``Inhabited "default" inductiveTypeName
let auxFunName := (← getCurrNamespace) ++ ctx.auxFunNames[0]!
let indVal ← getConstInfoInduct inductiveTypeName
let (auxType, auxVal, usedInstIdxs) ←
try
withDeclName auxFunName do mkDefaultValue indVal
catch e =>
trace[Elab.Deriving.inhabited] "error: {e.toMessageData}"
return none
addDecl <| .defnDecl <| mkDefinitionValInferringUnsafe
(name := auxFunName)
(levelParams := indVal.levelParams)
(type := auxType)
(value := auxVal)
(hints := ReducibilityHints.regular (getMaxHeight (← getEnv) auxVal + 1))
if isMarkedMeta (← getEnv) inductiveTypeName then
modifyEnv (markMeta · auxFunName)
unless (← read).isNoncomputableSection do
compileDecls #[auxFunName]
enableRealizationsForConst auxFunName
trace[Elab.Deriving.inhabited] "defined {.ofConstName auxFunName}"
let cmd ← mkInstanceCmdWith (mkIdent ctx.instName) usedInstIdxs (mkCIdent auxFunName)
trace[Elab.Deriving.inhabited] "\n{cmd}"
return some cmd
private def mkInhabitedInstance (declName : Name) : CommandElabM Unit := do
withoutExposeFromCtors declName do
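The commit message above notes that with this change `{}` and `default` become interchangeable when all fields have default values. A minimal sketch of the new behavior (the `Config` structure and its fields are hypothetical):

```lean
structure Config where
  retries : Nat := 3
  verbose : Bool := false
  deriving Inhabited

-- The derived instance now uses the field defaults, so `default` and `{}` agree:
example : (default : Config) = {} := rfl
example : (default : Config).retries = 3 := rfl
```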


@@ -10,6 +10,7 @@ public import Lean.Parser.Module
meta import Lean.Parser.Module
import Lean.Compiler.ModPkgExt
public import Lean.DeprecatedModule
import Init.Data.String.Modify
public section
@@ -28,7 +29,9 @@ def HeaderSyntax.isModule (header : HeaderSyntax) : Bool :=
def HeaderSyntax.imports (stx : HeaderSyntax) (includeInit : Bool := true) : Array Import :=
match stx with
| `(Parser.Module.header| $[module%$moduleTk]? $[prelude%$preludeTk]? $importsStx*) =>
let imports := if preludeTk.isNone && includeInit then #[{ module := `Init : Import }] else #[]
let imports := if preludeTk.isNone && includeInit then
#[{ module := `Init : Import }, { module := `Init, isMeta := true : Import }]
else #[]
imports ++ importsStx.map fun
| `(Parser.Module.import| $[public%$publicTk]? $[meta%$metaTk]? import $[all%$allTk]? $n) =>
{ module := n.getId, importAll := allTk.isSome
@@ -95,6 +98,43 @@ def checkDeprecatedImports
| none => messages
| none => messages
private def osForbiddenChars : Array Char :=
#['<', '>', '"', '|', '?', '*', '!']
private def osForbiddenNames : Array String :=
#["CON", "PRN", "AUX", "NUL",
"COM1", "COM2", "COM3", "COM4", "COM5", "COM6", "COM7", "COM8", "COM9",
"COM¹", "COM²", "COM³",
"LPT1", "LPT2", "LPT3", "LPT4", "LPT5", "LPT6", "LPT7", "LPT8", "LPT9",
"LPT¹", "LPT²", "LPT³"]
private def checkComponentPortability (comp : String) : Option String :=
if osForbiddenNames.contains comp.toUpper then
some s!"'{comp}' is a reserved file name on some operating systems"
else if let some c := osForbiddenChars.find? (comp.contains ·) then
some s!"contains character '{c}' which is forbidden on some operating systems"
else
none
def checkModuleNamePortability
(mainModule : Name) (inputCtx : Parser.InputContext) (startPos : String.Pos.Raw)
(messages : MessageLog) : MessageLog :=
go mainModule messages
where
go : Name → MessageLog → MessageLog
| .anonymous, messages => messages
| .str parent s, messages =>
let messages := match checkComponentPortability s with
| some reason => messages.add {
fileName := inputCtx.fileName
pos := inputCtx.fileMap.toPosition startPos
severity := .error
data := s!"module name '{mainModule}' is not portable: {reason}"
}
| none => messages
go parent messages
| .num parent _, messages => go parent messages
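The portability check walks each string component of the module name; a sketch of the diagnostics it would produce (module names hypothetical):

```lean
-- `Foo.AUX`     → "module name 'Foo.AUX' is not portable: 'AUX' is a reserved file name on some operating systems"
-- `Foo.Bar?Baz` → "module name 'Foo.Bar?Baz' is not portable: contains character '?' which is forbidden on some operating systems"
-- `Foo.Bar`     → no message
```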
def processHeaderCore
(startPos : String.Pos.Raw) (imports : Array Import) (isModule : Bool)
(opts : Options) (messages : MessageLog) (inputCtx : Parser.InputContext)
@@ -122,6 +162,7 @@ def processHeaderCore
pure (env, messages.add { fileName := inputCtx.fileName, data := toString e, pos := pos })
let env := env.setMainModule mainModule |>.setModulePackage package?
let messages := checkDeprecatedImports env imports opts inputCtx startPos messages headerStx? origHeaderStx?
let messages := checkModuleNamePortability mainModule inputCtx startPos messages
return (env, messages)
/--


@@ -233,7 +233,7 @@ def setImportAll : Parser := fun _ s =>
def main : Parser :=
keywordCore "module" (setIsModule false) (setIsModule true) >>
keywordCore "prelude" (fun _ s => s.pushImport `Init) skip >>
keywordCore "prelude" (fun _ s => (s.pushImport `Init).pushImport { module := `Init, isMeta := true }) skip >>
manyImports (atomic (keywordCore "public" skip setExported >>
keywordCore "meta" skip setMeta >>
keyword "import") >>


@@ -113,7 +113,7 @@ private def isSectionVariable (e : Expr) : TermElabM Bool := do
if (← read).quotLCtx.contains val then
return
let rs ← try resolveName stx val [] [] catch _ => pure []
for (e, _) in rs do
for (e, _, _) in rs do
match e with
| Expr.fvar _ .. =>
if quotPrecheck.allowSectionVars.get (← getOptions) && (← isSectionVariable e) then


@@ -232,7 +232,10 @@ structure TacticFinishedSnapshot extends Language.Snapshot where
state? : Option SavedState
/-- Untyped snapshots from `logSnapshotTask`, saved at this level for cancellation. -/
moreSnaps : Array (SnapshotTask SnapshotTree)
deriving Inhabited
instance : Inhabited TacticFinishedSnapshot where
default := { toSnapshot := default, state? := default, moreSnaps := default }
instance : ToSnapshotTree TacticFinishedSnapshot where
toSnapshotTree s := ⟨s.toSnapshot, s.moreSnaps⟩
@@ -246,7 +249,10 @@ structure TacticParsedSnapshot extends Language.Snapshot where
finished : SnapshotTask TacticFinishedSnapshot
/-- Tasks for subsequent, potentially parallel, tactic steps. -/
next : Array (SnapshotTask TacticParsedSnapshot) := #[]
deriving Inhabited
instance : Inhabited TacticParsedSnapshot where
default := { toSnapshot := default, stx := default, finished := default }
partial instance : ToSnapshotTree TacticParsedSnapshot where
toSnapshotTree := go where
go := fun s => ⟨s.toSnapshot,
@@ -627,13 +633,13 @@ builtin_initialize termElabAttribute : KeyedDeclsAttribute TermElab ← mkTermEl
`[LVal.fieldName "foo", LVal.fieldIdx 1]`.
-/
inductive LVal where
| fieldIdx (ref : Syntax) (i : Nat)
| fieldIdx (ref : Syntax) (i : Nat) (levels : List Level)
/-- Field `suffix?` is for producing better error messages because `x.y` may be a field access or a hierarchical/composite name.
`ref` is the syntax object representing the field. `fullRef` includes the LHS. -/
| fieldName (ref : Syntax) (name : String) (suffix? : Option Name) (fullRef : Syntax)
| fieldName (ref : Syntax) (name : String) (levels : List Level) (suffix? : Option Name) (fullRef : Syntax)
def LVal.getRef : LVal → Syntax
| .fieldIdx ref _ => ref
| .fieldIdx ref .. => ref
| .fieldName ref .. => ref
def LVal.isFieldName : LVal → Bool
@@ -642,8 +648,11 @@ def LVal.isFieldName : LVal → Bool
instance : ToString LVal where
toString
| .fieldIdx _ i => toString i
| .fieldName _ n .. => n
| .fieldIdx _ i levels .. => toString i ++ levelsToString levels
| .fieldName _ n levels .. => n ++ levelsToString levels
where
levelsToString levels :=
if levels.isEmpty then "" else ".{" ++ String.intercalate "," (levels.map toString) ++ "}"
/-- Return the name of the declaration being elaborated if available. -/
def getDeclName? : TermElabM (Option Name) := return ( read).declName?
@@ -2111,8 +2120,10 @@ def checkDeprecated (ref : Syntax) (e : Expr) : TermElabM Unit := do
@[inline] def withoutCheckDeprecated [MonadWithReaderOf Context m] : m α → m α :=
withTheReader Context (fun ctx => { ctx with checkDeprecated := false })
private def mkConsts (candidates : List (Name × List String)) (explicitLevels : List Level) : TermElabM (List (Expr × List String)) := do
private def mkConsts (candidates : List (Name × List String)) (explicitLevels : List Level) : TermElabM (List (Expr × List String × List Level)) := do
candidates.foldlM (init := []) fun result (declName, projs) => do
-- levels apply to the last projection, not the constant
let (constLevels, projLevels) := if projs.isEmpty then (explicitLevels, []) else ([], explicitLevels)
-- TODO: better support for `mkConst` failure. We may want to cache the failures, and report them if all candidates fail.
/-
We disable `checkDeprecated` here because there may be many overloaded symbols.
@@ -2121,25 +2132,38 @@ private def mkConsts (candidates : List (Name × List String)) (explicitLevels :
At `elabAppFnId`, we perform the check when converting the list returned by `resolveName'` into a list of
`TermElabResult`s.
-/
let const ← withoutCheckDeprecated <| mkConst declName explicitLevels
return (const, projs) :: result
let const ← withoutCheckDeprecated <| mkConst declName constLevels
return (const, projs, projLevels) :: result
def throwInvalidExplicitUniversesForLocal {α} (e : Expr) : TermElabM α :=
throwError "invalid use of explicit universe parameters, `{e}` is a local variable"
def resolveName (stx : Syntax) (n : Name) (preresolved : List Syntax.Preresolved) (explicitLevels : List Level) (expectedType? : Option Expr := none) : TermElabM (List (Expr × List String)) := do
/--
Gives all resolutions of the name `n`.
- `explicitLevels` provides a prefix of level parameters to the constant. For resolutions with a projection
component, the levels are not used, since they must apply to the last projection, not the constant.
In that case, the third component of the tuple is `explicitLevels`.
-/
def resolveName (stx : Syntax) (n : Name) (preresolved : List Syntax.Preresolved) (explicitLevels : List Level) (expectedType? : Option Expr := none) : TermElabM (List (Expr × List String × List Level)) := do
addCompletionInfo <| CompletionInfo.id stx stx.getId (danglingDot := false) (← getLCtx) expectedType?
let processLocal (e : Expr) (projs : List String) := do
if projs.isEmpty then
if explicitLevels.isEmpty then
return [(e, [], [])]
else
throwInvalidExplicitUniversesForLocal e
else
return [(e, projs, explicitLevels)]
if let some (e, projs) ← resolveLocalName n then
unless explicitLevels.isEmpty do
throwInvalidExplicitUniversesForLocal e
return [(e, projs)]
return processLocal e projs
let preresolved := preresolved.filterMap fun
| .decl n projs => some (n, projs)
| _ => none
-- check for section variable capture by a quotation
let ctx ← read
if let some (e, projs) := preresolved.findSome? fun (n, projs) => ctx.sectionFVars.find? n |>.map (·, projs) then
return [(e, projs)] -- section variables should shadow global decls
return processLocal e projs -- section variables should shadow global decls
if preresolved.isEmpty then
mkConsts (← realizeGlobalName n) explicitLevels
else
@@ -2148,14 +2172,17 @@ def resolveName (stx : Syntax) (n : Name) (preresolved : List Syntax.Preresolved
/--
Similar to `resolveName`, but creates identifiers for the main part and each projection with position information derived from `ident`.
Example: Assume resolveName `v.head.bla.boo` produces `(v.head, ["bla", "boo"])`, then this method produces
`(v.head, id, [f₁, f₂])` where `id` is an identifier for `v.head`, and `f₁` and `f₂` are identifiers for fields `"bla"` and `"boo"`. -/
def resolveName' (ident : Syntax) (explicitLevels : List Level) (expectedType? : Option Expr := none) : TermElabM (Name × List (Expr × Syntax × List Syntax)) := do
`(v.head, id, [f₁, f₂])` where `id` is an identifier for `v.head`, and `f₁` and `f₂` are identifiers for fields `"bla"` and `"boo"`.
See the comment there about `explicitLevels` and the meaning of the `List Level` component of the returned tuple.
-/
def resolveName' (ident : Syntax) (explicitLevels : List Level) (expectedType? : Option Expr := none) : TermElabM (Name × List (Expr × Syntax × List Syntax × List Level)) := do
let .ident _ _ n preresolved := ident
| throwError "identifier expected"
let r ← resolveName ident n preresolved explicitLevels expectedType?
let rc ← r.mapM fun (c, fields) => do
let rc ← r.mapM fun (c, fields, levels) => do
let ids := ident.identComponents (nFields? := fields.length)
return (c, ids.head!, ids.tail!)
return (c, ids.head!, ids.tail!, levels)
return (n, rc)
@@ -2163,7 +2190,7 @@ def resolveId? (stx : Syntax) (kind := "term") (withInfo := false) : TermElabM (
match stx with
| .ident _ _ val preresolved =>
let rs ← try resolveName stx val preresolved [] catch _ => pure []
let rs := rs.filter fun ⟨_, projs⟩ => projs.isEmpty
let rs := rs.filter fun ⟨_, projs, _⟩ => projs.isEmpty
let fs := rs.map fun (f, _) => f
match fs with
| [] => return none


@@ -616,7 +616,13 @@ structure Environment where
/--
Indicates whether the environment is being used in an exported context, i.e. whether it should
provide access to only the data to be imported by other modules participating in the module
system.
system. Apart from controlling access, some operations such as `mkAuxDeclName` may also change
their output based on this flag.
By default, `isExporting` is set to false when command elaborators are invoked such that they have
access to the full local environment. Use `with(out)Exporting` to modify based on context. For
example, `elabDeclaration` sets it based on `(← getScope).isPublic` on the top level, then
`elabMutualDef` may switch from public to private when e.g. entering the proof of a theorem.
-/
isExporting : Bool := false
deriving Nonempty


@@ -67,7 +67,9 @@ structure Snapshot where
`diagnostics`) occurred that prevents processing of the remainder of the file.
-/
isFatal := false
deriving Inhabited
instance : Inhabited Snapshot where
default := { desc := "", diagnostics := default }
/-- Range that is marked as being processed by the server while a task is running. -/
inductive SnapshotTask.ReportingRange where
@@ -236,7 +238,10 @@ partial def SnapshotTask.cancelRec [ToSnapshotTree α] (t : SnapshotTask α) : B
/-- Snapshot type without child nodes. -/
structure SnapshotLeaf extends Snapshot
deriving Inhabited, TypeName
deriving TypeName
instance : Inhabited SnapshotLeaf where
default := { toSnapshot := default }
instance : ToSnapshotTree SnapshotLeaf where
toSnapshotTree s := SnapshotTree.mk s.toSnapshot #[]


@@ -18,3 +18,4 @@ public import Lean.Linter.List
public import Lean.Linter.Sets
public import Lean.Linter.UnusedSimpArgs
public import Lean.Linter.Coe
public import Lean.Linter.GlobalAttributeIn


@@ -0,0 +1,59 @@
/-
Copyright (c) 2026 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Wojciech Różowski
-/
module
prelude
public import Lean.Elab.Command
public import Lean.Linter.Basic
namespace Lean.Linter
open Elab.Command
private structure TopDownSkipQuot where
stx : Syntax
def topDownSkipQuot (stx : Syntax) : TopDownSkipQuot := ⟨stx⟩
partial instance [Monad m] : ForIn m TopDownSkipQuot Syntax where
forIn := fun stx init f => do
let rec @[specialize] loop stx b [Inhabited (type_of% b)] := do
if stx.isQuot then return ForInStep.yield b
match (← f stx b) with
| ForInStep.yield b' =>
let mut b := b'
if let Syntax.node _ _ args := stx then
for arg in args do
match (← loop arg b) with
| ForInStep.yield b' => b := b'
| ForInStep.done b' => return ForInStep.done b'
return ForInStep.yield b
| ForInStep.done b => return ForInStep.done b
match (← @loop stx init ⟨init⟩) with
| ForInStep.yield b => return b
| ForInStep.done b => return b
def getGlobalAttributesIn? : Syntax → Option (Ident × Array (TSyntax `attr))
| `(attribute [$x,*] $id in $_) =>
let xs := x.getElems.filterMap fun a => match a.raw with
| `(Parser.Command.eraseAttr| -$_) => none
| `(Parser.Term.attrInstance| local $_attr:attr) => none
| `(Parser.Term.attrInstance| scoped $_attr:attr) => none
| `(attr| $a) => some a
(id, xs)
| _ => default
def globalAttributeIn : Linter where run := withSetOptionIn fun stx => do
for s in topDownSkipQuot stx do
if let some (id, nonScopedNorLocal) := getGlobalAttributesIn? s then
for attr in nonScopedNorLocal do
logErrorAt attr
m!"Despite the `in`, the attribute {attr} is added globally to {id}\n\
please remove the `in` or make this a `local {attr}`"
builtin_initialize addLinter globalAttributeIn
end Lean.Linter
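What the new linter flags, illustrated on a hypothetical declaration:

```lean
def foo : Nat := 1

-- Flagged: despite the `in`, `[simp]` is attached to `foo` globally.
attribute [simp] foo in
example : foo = 1 := by simp

-- Not flagged: `local` keeps the attribute scoped as the `in` suggests.
attribute [local simp] foo in
example : foo = 1 := by simp
```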


@@ -149,17 +149,15 @@ partial def mkSizeOfFn (recName : Name) (declName : Name): MetaM Unit := do
trace[Meta.sizeOf] "declName: {declName}"
trace[Meta.sizeOf] "type: {sizeOfType}"
trace[Meta.sizeOf] "val: {sizeOfValue}"
-- We expose the `sizeOf` functions so that the `spec` theorems can be publicly `defeq`
withExporting do
addDecl <| Declaration.defnDecl {
name := declName
levelParams := levelParams
type := sizeOfType
value := sizeOfValue
safety := DefinitionSafety.safe
hints := ReducibilityHints.abbrev
}
enableRealizationsForConst declName
addDecl <| Declaration.defnDecl {
name := declName
levelParams := levelParams
type := sizeOfType
value := sizeOfValue
safety := DefinitionSafety.safe
hints := ReducibilityHints.abbrev
}
enableRealizationsForConst declName
/--
Create `sizeOf` functions for all inductive datatypes in the mutual inductive declaration containing `typeName`
@@ -469,7 +467,6 @@ private def mkSizeOfSpecTheorem (indInfo : InductiveVal) (sizeOfFns : Array Name
type := thmType
value := thmValue
}
inferDefEqAttr thmName
simpAttr.add thmName default .global
grindAttr.add thmName grindAttrStx .global


@@ -112,6 +112,25 @@ private def processInv (e inst a : Expr) : RingM Unit := do
return ()
pushNewFact <| mkApp3 (mkConst ``Grind.CommRing.inv_split [ring.u]) ring.type fieldInst a
/--
For each new variable `x` in a ring with `PowIdentity α p`,
push the equation `x ^ p = x` as a new fact into grind.
-/
private def processPowIdentityVars : RingM Unit := do
let ring ← getCommRing
let some (powIdentityInst, csInst, p) := ring.powIdentityInst? | return ()
let startIdx := ring.powIdentityVarCount
let vars := ring.toRing.vars
if startIdx >= vars.size then return ()
for i in [startIdx:vars.size] do
let x := vars[i]!
trace_goal[grind.ring] "PowIdentity: pushing x^{p} = x for {x}"
-- Construct proof: @PowIdentity.pow_eq α csInst p powIdentityInst x
let proof := mkApp5 (mkConst ``Grind.PowIdentity.pow_eq [ring.u])
ring.type csInst (mkNatLit p) powIdentityInst x
pushNewFact proof
modifyCommRing fun s => { s with powIdentityVarCount := vars.size }
/-- Returns `true` if `e` is a term `a⁻¹`. -/
private def internalizeInv (e : Expr) : GoalM Bool := do
match_expr e with
@@ -138,6 +157,7 @@ def internalize (e : Expr) (parent? : Option Expr) : GoalM Unit := do
denote := s.denote.insert { expr := e } re
denoteEntries := s.denoteEntries.push (e, re)
}
processPowIdentityVars
else if let some semiringId ← getCommSemiringId? type then SemiringM.run semiringId do
let some re ← sreify? e | return ()
trace_goal[grind.ring.internalize] "semiring [{semiringId}]: {e}"


@@ -38,11 +38,13 @@ where
let noZeroDivInst? ← getNoZeroDivInst? u type
trace_goal[grind.ring] "NoNatZeroDivisors available: {noZeroDivInst?.isSome}"
let fieldInst? ← synthInstance? <| mkApp (mkConst ``Grind.Field [u]) type
let powIdentityInst? ← getPowIdentityInst? u type
trace_goal[grind.ring] "PowIdentity available: {powIdentityInst?.isSome}"
let semiringId? := none
let id := (← get').rings.size
let ring : CommRing := {
id, semiringId?, type, u, semiringInst, ringInst, commSemiringInst,
commRingInst, charInst?, noZeroDivInst?, fieldInst?,
commRingInst, charInst?, noZeroDivInst?, fieldInst?, powIdentityInst?,
}
modify' fun s => { s with rings := s.rings.push ring }
return some id


@@ -214,6 +214,8 @@ structure CommRing extends Ring where
noZeroDivInst? : Option Expr
/-- `Field` instance for `type` if available. -/
fieldInst? : Option Expr
/-- `PowIdentity` instance, the synthesized `CommSemiring` instance, and exponent `p` if available. -/
powIdentityInst? : Option (Expr × Expr × Nat) := none
/-- `denoteEntries` is `denote` as a `PArray` for deterministic traversal. -/
denoteEntries : PArray (Expr × RingExpr) := {}
/-- Next unique id for `EqCnstr`s. -/
@@ -238,6 +240,8 @@ structure CommRing extends Ring where
recheck : Bool := false
/-- Inverse theorems that have been already asserted. -/
invSet : PHashSet Expr := {}
/-- Number of variables for which `PowIdentity` equations have been pushed. -/
powIdentityVarCount : Nat := 0
/--
An equality of the form `c = 0`. It is used to simplify polynomial coefficients.
-/


@@ -19,6 +19,20 @@ def getIsCharInst? (u : Level) (type : Expr) (semiringInst : Expr) : GoalM (Opti
let some n ← evalNat? n | return none
return some (charInst, n)
def getPowIdentityInst? (u : Level) (type : Expr) : GoalM (Option (Expr × Expr × Nat)) := do withNewMCtxDepth do
-- We use a fresh metavar for `CommSemiring` (unlike `getIsCharInst?` which pins the semiring)
-- because `PowIdentity` instances may be declared against a canonical `CommSemiring` instance
-- that is not definitionally equal to `CommRing.toCommSemiring`. The synthesized `csInst` is
-- stored and used in proof terms to ensure type-correctness.
let csInst ← mkFreshExprMVar (mkApp (mkConst ``Grind.CommSemiring [u]) type)
let p ← mkFreshExprMVar (mkConst ``Nat)
let powIdentityType := mkApp3 (mkConst ``Grind.PowIdentity [u]) type csInst p
let some inst ← synthInstance? powIdentityType | return none
let csInst ← instantiateMVars csInst
let p ← instantiateMVars p
let some pVal ← evalNat? p | return none
return some (inst, csInst, pVal)
def getNoZeroDivInst? (u : Level) (type : Expr) : GoalM (Option Expr) := do
let natModuleType := mkApp (mkConst ``Grind.NatModule [u]) type
let some natModuleInst ← synthInstance? natModuleType | return none


@@ -242,7 +242,6 @@ def ppSimpTheorem [Monad m] [MonadEnv m] [MonadError m] (s : SimpTheorem) : m Me
instance : BEq SimpTheorem where
beq e₁ e₂ := e₁.proof == e₂.proof
/--
Configuration for `MetaM` used to process global simp theorems
-/
@@ -256,8 +255,6 @@ def simpGlobalConfig : ConfigWithKey :=
@[inline] def withSimpGlobalConfig : MetaM α → MetaM α :=
withConfigWithKey simpGlobalConfig
private partial def isPerm : Expr → Expr → MetaM Bool
| .app f₁ a₁, .app f₂ a₂ => isPerm f₁ f₂ <&&> isPerm a₁ a₂
| .mdata _ s, t => isPerm s t
@@ -370,11 +367,32 @@ private def mkSimpTheoremKeys (type : Expr) (noIndexAtArgs : Bool) : MetaM (Arra
| some (_, lhs, rhs) => pure (← DiscrTree.mkPath lhs noIndexAtArgs, ← isPerm lhs rhs)
| none => throwError "Unexpected kind of simp theorem{indentExpr type}"
private def mkSimpTheoremCore (origin : Origin) (e : Expr) (levelParams : Array Name) (proof : Expr) (post : Bool) (prio : Nat) (noIndexAtArgs : Bool) : MetaM SimpTheorem := do
register_builtin_option simp.rfl.checkTransparency: Bool := {
defValue := false
descr := "if true, Lean generates a warning if the left and right-hand sides of the `[simp]` equation are not definitionally equal at the restricted transparency level used by `simp` "
}
private def mkSimpTheoremCore (origin : Origin) (e : Expr) (levelParams : Array Name) (proof : Expr)
(post : Bool) (prio : Nat) (noIndexAtArgs : Bool) : MetaM SimpTheorem := do
assert! origin != .fvar .anonymous
let type ← instantiateMVars (← inferType e)
let (keys, perm) ← mkSimpTheoremKeys type noIndexAtArgs
return { origin, keys, perm, post, levelParams, proof, priority := prio, rfl := (← isRflProof proof) }
let rfl ← isRflProof proof
if rfl && simp.rfl.checkTransparency.get (← getOptions) then
forallTelescopeReducing type fun _ type => do
let checkDefEq (lhs rhs : Expr) := do
unless (← withTransparency .instances <| isDefEq lhs rhs) do
logWarning m!"`{origin.key}` is a `[defeq]` simp theorem, but its left-hand side{indentExpr lhs}\n\
is not definitionally equal to the right-hand side{indentExpr rhs}\n\
at `.instances` transparency. Possible solutions:\n\
1- use `(rfl)` as the proof\n\
2- mark constants occurring in the lhs and rhs as `[implicit_reducible]`"
match_expr type with
| Eq _ lhs rhs => checkDefEq lhs rhs
| Iff lhs rhs => checkDefEq lhs rhs
| _ =>
logWarning m!"'{origin.key}' is a 'rfl' simp theorem, unexpected resulting type{indentExpr type}"
return { origin, keys, perm, post, levelParams, proof, priority := prio, rfl }
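A hypothetical `[simp]` theorem that would trigger the new warning when the option is set:

```lean
set_option simp.rfl.checkTransparency true

def double (n : Nat) : Nat := n + n

-- Proved by `rfl` at default transparency, but `double` does not unfold at
-- the `.instances` transparency `simp` uses, so the warning would fire;
-- per the message, using `(rfl)` as the proof or marking `double` as
-- `[implicit_reducible]` avoids it.
@[simp] theorem double_eq (x : Nat) : double x = x + x := rfl
```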
/--
Creates a `SimpTheorem` from a global theorem.


@@ -9,8 +9,8 @@ prelude
public import Lean.Meta.Closure
public import Lean.Meta.SynthInstance
public import Lean.Meta.CtorRecognizer
public section
public import Lean.Meta.AppBuilder
import Lean.Structure
/-!
# Instance Wrapping
@@ -25,22 +25,30 @@ Given an instance `i : I` and expected type `I'` (where `I'` must be mvar-free),
`wrapInstance` constructs a result instance as follows, executing all steps at
`instances` transparency:
1. If `I'` is not a class, return `i` unchanged.
1. If `I'` is not a class application, return `i` unchanged.
2. If `I'` is a proposition, wrap `i` in an auxiliary theorem of type `I'` and return it
(controlled by `backward.inferInstanceAs.wrap.instances`).
3. Reduce `i` to whnf.
4. If `i` is not a constructor application: if the type of `i` is already defeq to `I'`,
4. If `i` is not a constructor application: if `I` is already defeq to `I'`,
return `i`; otherwise wrap it in an auxiliary definition of type `I'` and return it
(controlled by `backward.inferInstanceAs.wrap.instances`).
5. Otherwise, for `i = ctor a₁ ... aₙ` with `ctor : C ?p₁ ... ?pₙ`:
- Unify `C ?p₁ ... ?pₙ` with `I'`.
- Return a new application `ctor a₁' ... aₙ' : I'` where each `aᵢ'` is constructed as:
5. Otherwise, if `i` is an application of `ctor` of class `C`:
- Unify the conclusion of the type of `ctor` with `I'` to obtain adjusted field type `Fᵢ'` for
each field.
- Return a new application `ctor ... : I'` where the fields are adjusted as follows:
- If the field type is a proposition: assign directly if types are defeq, otherwise
wrap in an auxiliary theorem.
- If the field type is a class: first try to reuse an existing synthesized instance
for the target type (controlled by `backward.inferInstanceAs.wrap.reuseSubInstances`);
if that fails, recurse with source instance `aᵢ` and expected type `?pᵢ`.
- Otherwise (data field): assign directly if types are defeq, otherwise wrap in an
- If the field is a parent field (subobject) `p : P`: first try to reuse an existing
instance that can be synthesized for `P` (controlled by
`backward.inferInstanceAs.wrap.reuseSubInstances`) in order to preserve defeqs; if that
fails, recurse.
- If it is a field of a flattened parent class `C'` and
`backward.inferInstanceAs.wrap.reuseSubInstances` is true, try synthesizing an instance of
`C'` for `I'` and if successful, use the corresponding projection of the found instance in
order to preserve defeqs; otherwise, continue.
- Specifically, construct the chain of base projections from `C` to `C'` applied to `_ : I'`
and infer its type to obtain an appropriate application of `C'` for the instance search.
- Otherwise (non-inherited data field): assign directly if types are defeq, otherwise wrap in an
auxiliary definition to fix the type (controlled by `backward.inferInstanceAs.wrap.data`).
## Options
@@ -56,34 +64,36 @@ Given an instance `i : I` and expected type `I'` (where `I'` must be mvar-free),
namespace Lean.Meta
register_builtin_option backward.inferInstanceAs.wrap : Bool := {
public register_builtin_option backward.inferInstanceAs.wrap : Bool := {
defValue := true
descr := "wrap instance bodies in `inferInstanceAs` and the default `deriving` handler"
}
register_builtin_option backward.inferInstanceAs.wrap.reuseSubInstances : Bool := {
public register_builtin_option backward.inferInstanceAs.wrap.reuseSubInstances : Bool := {
defValue := true
descr := "when recursing into sub-instances, reuse existing instances for the target type instead of re-wrapping them, which can be important to avoid non-defeq instance diamonds"
}
register_builtin_option backward.inferInstanceAs.wrap.instances : Bool := {
public register_builtin_option backward.inferInstanceAs.wrap.instances : Bool := {
defValue := true
descr := "wrap non-reducible instances in auxiliary definitions to fix their types"
}
register_builtin_option backward.inferInstanceAs.wrap.data : Bool := {
public register_builtin_option backward.inferInstanceAs.wrap.data : Bool := {
defValue := true
descr := "wrap data fields in auxiliary definitions to fix their types"
}
builtin_initialize registerTraceClass `Meta.wrapInstance
open Meta
/--
Rebuild a type application with fresh synthetic metavariables for instance-implicit arguments.
Non-instance-implicit arguments are assigned from the original application's arguments.
If the function is over-applied, extra arguments are preserved.
-/
def abstractInstImplicitArgs (type : Expr) : MetaM Expr := do
public def abstractInstImplicitArgs (type : Expr) : MetaM Expr := do
let fn := type.getAppFn
let args := type.getAppArgs
let (mvars, bis, _) ← forallMetaTelescope (← inferType fn)
@@ -93,11 +103,38 @@ def abstractInstImplicitArgs (type : Expr) : MetaM Expr := do
let args := mvars ++ args.drop mvars.size
instantiateMVars (mkAppN fn args)
partial def getFieldOrigin (structName field : Name) : MetaM (Name × StructureFieldInfo) := do
let env ← getEnv
for parent in getStructureParentInfo env structName do
if (findField? env parent.structName field).isSome then
return ← getFieldOrigin parent.structName field
let some fi := getFieldInfo? env structName field
| throwError "no such field {field} in {structName}"
return (structName, fi)
/-- Projects application of a structure type to corresponding application of a parent structure. -/
def getParentStructType? (structName parentStructName : Name) (structType : Expr) : MetaM (Option Expr) := OptionT.run do
let env ← getEnv
let some path := getPathToBaseStructure? env parentStructName structName | failure
withLocalDeclD `self structType fun self => do
let proj ← path.foldlM (init := self) fun e projFn => do
let ty ← whnf (← inferType e)
let .const _ us := ty.getAppFn
| trace[Meta.wrapInstance] "could not reduce type `{ty}`"
failure
let params := ty.getAppArgs
pure <| mkApp (mkAppN (.const projFn us) params) e
let projTy ← whnf <| ← inferType proj
if projTy.containsFVar self.fvarId! then
trace[Meta.wrapInstance] "parent type depends on instance fields{indentExpr projTy}"
failure
return projTy
/--
Wrap an instance value so its type matches the expected type exactly.
See the module docstring for the full algorithm specification.
-/
partial def wrapInstance (inst expectedType : Expr) (compile : Bool := true)
public partial def wrapInstance (inst expectedType : Expr) (compile : Bool := true)
(logCompileErrors : Bool := true) (isMeta : Bool := false) : MetaM Expr := withTransparency .instances do
withTraceNode `Meta.wrapInstance
(fun _ => return m!"type: {expectedType}") do
@@ -112,8 +149,7 @@ partial def wrapInstance (inst expectedType : Expr) (compile : Bool := true)
return inst
-- Try to reduce it to a constructor.
let inst ← whnf inst
inst.withApp fun f args => do
( whnf inst).withApp fun f args => do
let some (.ctorInfo ci) ← f.constName?.mapM getConstInfo
| do
trace[Meta.wrapInstance] "did not reduce to constructor application, returning/wrapping as is: {inst}"
@@ -156,8 +192,10 @@ partial def wrapInstance (inst expectedType : Expr) (compile : Bool := true)
else
trace[Meta.wrapInstance] "proof field {i} does not have expected type {argExpectedType} but {argType}, wrapping in auxiliary theorem: {arg}"
mvarId.assign (← mkAuxTheorem argExpectedType arg (zetaDelta := true))
continue
-- Recurse into instance arguments of the constructor
else if (← isClass? argExpectedType).isSome then
if (← isClass? argExpectedType).isSome then
if backward.inferInstanceAs.wrap.reuseSubInstances.get (← getOptions) then
-- Reuse existing instance for the target type if any. This is especially important when recursing
-- as it guarantees subinstances of overlapping instances are defeq under more than just
@@ -171,22 +209,35 @@ partial def wrapInstance (inst expectedType : Expr) (compile : Bool := true)
mvarId.assign (← wrapInstance arg argExpectedType (compile := compile)
(logCompileErrors := logCompileErrors) (isMeta := isMeta))
else
-- For data fields, assign directly or wrap in aux def to fix types.
if backward.inferInstanceAs.wrap.data.get (← getOptions) then
let argType ← inferType arg
if ← isDefEq argExpectedType argType then
mvarId.assign arg
else
let name ← mkAuxDeclName
mvarId.assign (← mkAuxDefinition name argExpectedType arg (compile := false))
setInlineAttribute name
if isMeta then modifyEnv (markMeta · name)
if compile then
compileDecls (logErrors := logCompileErrors) #[name]
enableRealizationsForConst name
else
mvarId.assign arg
return mkAppN f (← mvars.mapM instantiateMVars)
continue
end Lean.Meta
if backward.inferInstanceAs.wrap.reuseSubInstances.get (← getOptions) then
let (baseClassName, fieldInfo) ← getFieldOrigin className mvarDecl.userName
if baseClassName != className then
trace[Meta.wrapInstance] "found inherited field `{mvarDecl.userName}` from parent `{baseClassName}`"
if let some baseClassType ← getParentStructType? className baseClassName expectedType then
try
if let .some existingBaseClassInst ← trySynthInstance baseClassType then
trace[Meta.wrapInstance] "using projection of existing instance `{existingBaseClassInst}`"
mvarId.assign (← mkProjection existingBaseClassInst fieldInfo.fieldName)
continue
trace[Meta.wrapInstance] "did not find existing instance for `{baseClassName}`"
catch e =>
trace[Meta.wrapInstance] "error when attempting to reuse existing instance for `{baseClassName}`: {e.toMessageData}"
-- For data fields, assign directly or wrap in aux def to fix types.
if backward.inferInstanceAs.wrap.data.get (← getOptions) then
let argType ← inferType arg
if ← isDefEq argExpectedType argType then
mvarId.assign arg
else
let name ← mkAuxDeclName
mvarId.assign (← mkAuxDefinition name argExpectedType arg (compile := false))
setInlineAttribute name
if isMeta then modifyEnv (markMeta · name)
if compile then
compileDecls (logErrors := logCompileErrors) #[name]
enableRealizationsForConst name
else
mvarId.assign arg
return mkAppN f (← mvars.mapM instantiateMVars)
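The sub-instance reuse path above can be pictured with a minimal sketch (hypothetical classes, not from this diff): when a field of the instance being wrapped is inherited from a parent class, the wrapper prefers projecting an instance that already exists for the parent type.

```lean
class A (α : Type) where
  a : α

class B (α : Type) extends A α where
  b : α

instance instANat : A Nat := ⟨0⟩
instance instBNat : B Nat := { a := 0, b := 1 }

-- With `backward.inferInstanceAs.wrap.reuseSubInstances` enabled, wrapping
-- `instBNat` fills its inherited `a` field by synthesizing `A Nat` and
-- projecting the result, morally the same as:
example : A Nat := instBNat.toA
```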


@@ -1115,11 +1115,6 @@ def symbolNoAntiquot (sym : String) : Parser :=
{ info := symbolInfo sym
fn := symbolFn sym }
def checkTailNoWs (prev : Syntax) : Bool :=
match prev.getTailInfo with
| .original _ _ trailing _ => trailing.stopPos == trailing.startPos
| _ => false
/-- Check if the following token is the symbol _or_ identifier `sym`. Useful for
parsing local tokens that have not been added to the token table (but may have
been so by some unrelated code).
@@ -1168,13 +1163,18 @@ partial def strAux (sym : String) (errorMsg : String) (j : String.Pos.Raw) :Pars
else parse (j.next' sym h₁) c (s.next' c i h₂)
parse j
private def pickNonNone (stack : SyntaxStack) : Syntax :=
match stack.toSubarray.findRev? fun stx => !stx.isNone with
| none => Syntax.missing
| some stx => stx
def checkTailWs (prev : Syntax) : Bool :=
match prev.getTailInfo with
| .original _ _ trailing _ => trailing.stopPos > trailing.startPos
| _ => false
def checkWsBeforeFn (errorMsg : String) : ParserFn := fun _ s =>
let prev := s.stxStack.back
let prev := pickNonNone s.stxStack
if checkTailWs prev then s else s.mkError errorMsg
/-- The `ws` parser requires that there is some whitespace at this location.
@@ -1202,10 +1202,10 @@ This parser has arity 0 - it does not capture anything. -/
info := epsilonInfo
fn := checkLinebreakBeforeFn errorMsg
private def pickNonNone (stack : SyntaxStack) : Syntax :=
match stack.toSubarray.findRev? fun stx => !stx.isNone with
| none => Syntax.missing
| some stx => stx
def checkTailNoWs (prev : Syntax) : Bool :=
match prev.getTailInfo with
| .original _ _ trailing _ => trailing.stopPos == trailing.startPos
| _ => false
def checkNoWsBeforeFn (errorMsg : String) : ParserFn := fun _ s =>
let prev := pickNonNone s.stxStack


@@ -122,7 +122,9 @@ def declModifiers (inline : Bool) := leading_parser
/-- `declId` matches `foo` or `foo.{u,v}`: an identifier possibly followed by a list of universe names -/
-- @[builtin_doc] -- FIXME: suppress the hover
def declId := leading_parser
ident >> optional (".{" >> sepBy1 (recover ident (skipUntil (fun c => c.isWhitespace || c ∈ [',', '}']))) ", " >> "}")
ident >>
optional (checkNoWsBefore "no space before '.{'" >> ".{" >>
sepBy1 (recover ident (skipUntil (fun c => c.isWhitespace || c ∈ [',', '}']))) ", " >> "}")
/-- `declSig` matches the signature of a declaration with required type: a list of binders and then `: type` -/
-- @[builtin_doc] -- FIXME: suppress the hover
def declSig := leading_parser
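The effect of the added `checkNoWsBefore` on declaration names can be seen in an illustrative snippet (hypothetical names, not part of this diff):

```lean
-- Universe binders must now be attached directly to the declaration name:
def idPoly.{u} : Type u → Type u := id   -- accepted
-- def idPoly' .{u} : Type u → Type u := id   -- rejected: "no space before '.{'"
```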


@@ -127,7 +127,10 @@ private def consumeInput (inputCtx : InputContext) (pmctx : ParserModuleContext)
| none => s.pos
def topLevelCommandParserFn : ParserFn :=
commandParser.fn
-- set position to enforce appropriate indentation of applications etc.
-- We don't do it for nested commands such as in quotations where
-- formatting might be less rigid.
(withPosition commandParser).fn
partial def parseCommand (inputCtx : InputContext) (pmctx : ParserModuleContext) (mps : ModuleParserState) (messages : MessageLog) : Syntax × ModuleParserState × MessageLog := Id.run do
let mut pos := mps.pos


@@ -889,14 +889,21 @@ def isIdent (stx : Syntax) : Bool :=
-- antiquotations should also be allowed where an identifier is expected
stx.isAntiquot || stx.isIdent
def isIdentOrDotIdent (stx : Syntax) : Bool :=
isIdent stx || stx.isOfKind ``dotIdent
/-- Predicate for what `explicitUniv` can follow. It is only meant to be used on an identifier
that becomes the head constant of an application. -/
def isIdentOrDotIdentOrProj (stx : Syntax) : Bool :=
isIdent stx || stx.isOfKind ``dotIdent || stx.isOfKind ``proj
/-- `x.{u, ...}` explicitly specifies the universes `u, ...` of the constant `x`. -/
@[builtin_term_parser] def explicitUniv : TrailingParser := trailing_parser
checkStackTop isIdentOrDotIdent "expected preceding identifier" >>
/-- Syntax for `.{u, ...}` itself. Generally the `explicitUniv` trailing parser suffices.
However, for `e |>.x.{u} a1 a2 a3` notation we need to be able to express explicit universes in the
middle of the syntax. -/
def explicitUnivSuffix : Parser :=
checkNoWsBefore "no space before '.{'" >> ".{" >>
sepBy1 levelParser ", " >> "}"
/-- `x.{u, ...}` explicitly specifies the universes `u, ...` of the constant `x`. -/
@[builtin_term_parser] def explicitUniv : TrailingParser := trailing_parser
checkStackTop isIdentOrDotIdentOrProj "expected preceding identifier" >>
explicitUnivSuffix
/-- `x@e` or `x@h:e` matches the pattern `e` and binds its value to the identifier `x`.
If present, the identifier `h` is bound to a proof of `x = e`. -/
@[builtin_term_parser] def namedPattern : TrailingParser := trailing_parser
@@ -909,7 +916,7 @@ If present, the identifier `h` is bound to a proof of `x = e`. -/
It is especially useful for avoiding parentheses with repeated applications.
-/
@[builtin_term_parser] def pipeProj := trailing_parser:minPrec
" |>." >> checkNoWsBefore >> (fieldIdx <|> rawIdent) >> many argument
" |>." >> checkNoWsBefore >> (fieldIdx <|> rawIdent) >> optional explicitUnivSuffix >> many argument
@[builtin_term_parser] def pipeCompletion := trailing_parser:minPrec
" |>."


@@ -84,11 +84,12 @@ Delimiter-free indentation is determined by the *first* tactic of the sequence.
tacticSeqBracketed <|> tacticSeq1Indented
/-- Same as [`tacticSeq`] but requires a delimiter-free tactic sequence to have strict indentation.
The strict indentation requirement only applies to *nested* `by`s, as top-level `by`s do not have a
position set. -/
Falls back to an empty tactic sequence when no appropriately indented content follows, producing
an elaboration error (unsolved goals) rather than a parse error. -/
@[builtin_doc, run_builtin_parser_attribute_hooks]
def tacticSeqIndentGt := withAntiquot (mkAntiquot "tacticSeq" ``tacticSeq) <| node ``tacticSeq <|
tacticSeqBracketed <|> (checkColGt "indented tactic sequence" >> tacticSeq1Indented)
tacticSeqBracketed <|> (checkColGt "indented tactic sequence" >> tacticSeq1Indented) <|>
node ``tacticSeq1Indented pushNone
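The `pushNone` fallback changes the failure mode for a `by` with no suitably indented tactics; roughly (illustrative, not from this diff):

```lean
-- Well-indented tactic blocks are unaffected:
example : True := by
  trivial

-- A `by` whose following line is not indented past it now parses as an empty
-- tactic sequence, so the failure is reported by the elaborator as
-- "unsolved goals" instead of by the parser:
--
--   example : True := by
--   trivial
```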
/- Raw sequence for quotation and grouping -/
@[run_builtin_parser_attribute_hooks]


@@ -110,9 +110,7 @@ partial def compileParserExpr (e : Expr) : MetaM Expr := do
}
-- usually `meta` is inferred during compilation for auxiliary definitions, but as
-- `ctx.combinatorAttr` may enforce correct use of the modifier, infer now.
if isMarkedMeta (← getEnv) c then
modifyEnv (markMeta · c')
addAndCompile decl
addAndCompile decl (markMeta := isMarkedMeta (← getEnv) c)
modifyEnv (ctx.combinatorAttr.setDeclFor · c c')
if cinfo.type.isConst then
if let some kind ← parserNodeKind? cinfo.value! then


@@ -461,8 +461,7 @@ partial def InfoTree.goalsAt? (text : FileMap) (t : InfoTree) (hoverPos : String
ctxInfo := ctx
tacticInfo := ti
useAfter := hoverPos > pos && !cs.any (hasNestedTactic pos tailPos)
-- consider every position unindented after an empty `by` to support "hanging" `by` uses
indented := (text.toPosition pos).column > (text.toPosition hoverPos).column && !isEmptyBy ti.stx
indented := (text.toPosition pos).column > (text.toPosition hoverPos).column
-- use goals just before cursor as fall-back only
-- thus for `(by foo)`, placing the cursor after `foo` shows its state as long
-- as there is no state on `)`
@@ -485,9 +484,6 @@ where
| InfoTree.node (Info.ofMacroExpansionInfo _) cs =>
cs.any (hasNestedTactic pos tailPos)
| _ => false
isEmptyBy (stx : Syntax) : Bool :=
-- there are multiple `by` kinds with the same structure
stx.getNumArgs == 2 && stx[0].isToken "by" && stx[1].getNumArgs == 1 && stx[1][0].isMissing
partial def InfoTree.termGoalAt? (t : InfoTree) (hoverPos : String.Pos.Raw) : Option InfoWithCtx :=


@@ -3168,6 +3168,10 @@ static inline lean_obj_res lean_manual_get_root(lean_obj_arg _unit) {
return lean_mk_string(LEAN_MANUAL_ROOT);
}
static inline lean_obj_res lean_runtime_hold(b_lean_obj_arg a) {
return lean_box(0);
}
#ifdef LEAN_EMSCRIPTEN
#define LEAN_SCALAR_PTR_LITERAL(b1, b2, b3, b4, b5, b6, b7, b8) (lean_object*)((uint32_t)b1 | ((uint32_t)b2 << 8) | ((uint32_t)b3 << 16) | ((uint32_t)b4 << 24)), (lean_object*)((uint32_t)b5 | ((uint32_t)b6 << 8) | ((uint32_t)b7 << 16) | ((uint32_t)b8 << 24))
#else


@@ -19,7 +19,7 @@ import Lake.Build.InputFile
namespace Lake
/-- The initial set of Lake facets. -/
public def initFacetConfigs : DNameMap FacetConfig :=
public def initFacetConfigs : FacetConfigMap :=
DNameMap.empty
|> insert Module.initFacetConfigs
|> insert Package.initFacetConfigs
@@ -30,4 +30,4 @@ public def initFacetConfigs : DNameMap FacetConfig :=
|> insert InputDir.initFacetConfigs
where
insert {k} (group : DNameMap (KFacetConfig k)) map :=
group.foldl (init := map) fun m k v => m.insert k v.toFacetConfig
group.foldl (init := map) fun m _ v => m.insert v.toFacetConfig


@@ -24,6 +24,11 @@ namespace Lake
/-! ## Build Lean & Static Lib -/
private structure ModuleCollection where
mods : Array Module := #[]
modSet : ModuleSet := {}
hasErrors : Bool := false
/--
Collect the local modules of a library.
That is, the modules from `getModuleArray` plus their local transitive imports.
@@ -31,23 +36,27 @@ That is, the modules from `getModuleArray` plus their local transitive imports.
private partial def LeanLib.recCollectLocalModules
(self : LeanLib) : FetchM (Job (Array Module))
:= ensureJob do
let mut mods := #[]
let mut modSet := ModuleSet.empty
let mut col : ModuleCollection := {}
for mod in (← self.getModuleArray) do
(mods, modSet) ← go mod mods modSet
return Job.pure mods
col ← go mod col
if col.hasErrors then
-- This is not considered a fatal error because we want the modules
-- built to provide better error categorization in the monitor.
logError s!"{self.name}: some modules have bad imports"
return Job.pure col.mods
where
go root mods modSet := do
let mut mods := mods
let mut modSet := modSet
unless modSet.contains root do
modSet := modSet.insert root
let imps ← (← root.imports.fetch).await
go root col := do
let mut col := col
unless col.modSet.contains root do
col := {col with modSet := col.modSet.insert root}
-- We discard errors here as they will be reported later when the module is built.
let some imps ← (← root.imports.fetch).wait?
| return {col with hasErrors := true}
for mod in imps do
if mod.lib.name = self.name then
(mods, modSet) ← go mod mods modSet
mods := mods.push root
return (mods, modSet)
col ← go mod col
col := {col with mods := col.mods.push root}
return col
/-- The `LibraryFacetConfig` for the builtin `modulesFacet`. -/
private def LeanLib.modulesFacetConfig : LibraryFacetConfig modulesFacet :=


@@ -463,6 +463,7 @@ def visitModule (pkgs : Array Name) (srcSearchPath : SearchPath)
if prelude?.isNone then
let j : Nat := s.env.getModuleIdx? `Init |>.get!
deps := deps.union .pub {j}
deps := deps.union .metaPub {j}
-- Accumulate `transDeps` which is the non-reflexive transitive closure of the still-live imports
let mut transDeps := Needs.empty


@@ -27,6 +27,20 @@ public structure FacetConfig (name : Name) : Type where
memoize : Bool := true
deriving Inhabited
/-- A mapping of facet names to the configuration for that name. -/
public abbrev FacetConfigMap := DNameMap FacetConfig
/--
Tries to retrieve the facet configuration for the given {lean}`name`,
returning {lean}`none` if no such mapping is present.
-/
public nonrec def FacetConfigMap.get? (name : Name) (self : FacetConfigMap) : Option (FacetConfig name) :=
self.get? name -- specializes `get?`
/-- Inserts the facet configuration {lean}`cfg` into the map (overwriting any existing configuration). -/
public nonrec def FacetConfigMap.insert {name} (cfg : FacetConfig name) (self : FacetConfigMap) : FacetConfigMap :=
self.insert name cfg -- specializes `insert`
public protected abbrev FacetConfig.name (_ : FacetConfig name) := name
public structure KFacetConfig (k : Name) (name : Name) extends FacetConfig name where
@@ -70,6 +84,9 @@ public structure NamedConfigDecl (β : Name → Type u) where
name : Name
config : β name
/-- A facet declaration from a configuration file. -/
public abbrev FacetDecl := NamedConfigDecl FacetConfig
/-- A module facet's declarative configuration. -/
public abbrev ModuleFacetConfig := KFacetConfig Module.facetKind
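The `nonrec` wrappers above pin the dependent value type at call sites. A self-contained analogue of the pattern, using `Std.DHashMap` with hypothetical names (not the actual Lake definitions):

```lean
import Std.Data.DHashMap

-- A value type indexed by the key it is stored under.
structure Cfg (n : Nat) where
  memoize : Bool := true

abbrev CfgMap := Std.DHashMap Nat Cfg

-- Wrappers that specialize the generic dependent-map operations, so callers
-- get `Option (Cfg n)` without a type ascription.
def CfgMap.getCfg? (n : Nat) (m : CfgMap) : Option (Cfg n) :=
  m.get? n

def CfgMap.insertCfg {n} (cfg : Cfg n) (m : CfgMap) : CfgMap :=
  m.insert n cfg
```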


@@ -0,0 +1,53 @@
/-
Copyright (c) 2025 Mac Malone. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Mac Malone
-/
module
prelude
public import Lake.Config.Script
public import Lake.Config.Dependency
public import Lake.Config.PackageConfig
public import Lake.Config.FacetConfig
public import Lake.Config.TargetConfig
set_option doc.verso true
open Lean
namespace Lake
/-- The configuration defined by a Lake configuration file (e.g., {lit}`lakefile.(lean|toml)`). -/
public structure LakefileConfig where
/-- The package declaration of the configuration file. -/
pkgDecl : PackageDecl
/-- Dependency configurations in the order declared by the configuration file. -/
depConfigs : Array Dependency := #[]
/-- Facet configurations in the order declared by the configuration file. -/
facetDecls : Array FacetDecl := {}
/-- Target configurations in the order declared by the configuration file. -/
targetDecls : Array (PConfigDecl pkgDecl.keyName) := #[]
/-- Name-declaration map of target configurations declared in the configuration file. -/
targetDeclMap : DNameMap (NConfigDecl pkgDecl.keyName) :=
targetDecls.foldl (fun m d => m.insert d.name (.mk d rfl)) {}
/-- The names of targets declared via {lit}`@[default_target]`. -/
defaultTargets : Array Name := #[]
/-- Scripts declared in the configuration file. -/
scripts : NameMap Script := {}
/-- The names of the scripts declared via {lit}`@[default_script]`. -/
defaultScripts : Array Script := #[]
/-- {lit}`post_update` hooks in the order declared by the configuration file. -/
postUpdateHooks : Array (OpaquePostUpdateHook pkgDecl.keyName) := #[]
/--
The name of the configuration file's test driver.
It is either declared via {lit}`@[test_driver]` or derived from {lean}`PackageConfig.testDriver`.
-/
testDriver : String := pkgDecl.config.testDriver
/--
The name of the configuration file's lint driver.
It is either declared via {lit}`@[lint_driver]` or derived from {lean}`PackageConfig.lintDriver`.
-/
lintDriver : String := pkgDecl.config.lintDriver


@@ -45,7 +45,7 @@ public structure Package where
/-- The path to the package's configuration file (relative to `dir`). -/
relConfigFile : FilePath
/-- The path to the package's JSON manifest of remote dependencies (relative to `dir`). -/
relManifestFile : FilePath := defaultManifestFile
relManifestFile : FilePath
/-- The package's scope (e.g., in Reservoir). -/
scope : String
/-- The URL to this package's Git remote. -/


@@ -336,11 +336,16 @@ public configuration PackageConfig (p : Name) (n : Name) extends WorkspaceConfig
deriving Inhabited
/-- The package's name as specified by the author. -/
@[deprecated "Deprecated without replacement" (since := "2025-12-10")]
public abbrev PackageConfig.origName (_ : PackageConfig p n) := n
/-- A package declaration from a configuration written in Lean. -/
public structure PackageDecl where
name : Name
baseName : Name
keyName : Name
origName : Name
config : PackageConfig name origName
config : PackageConfig keyName origName
deriving TypeName
@[deprecated PackageDecl.keyName (since := "2025-12-10")]
public abbrev PackageDecl.name := @keyName


@@ -22,15 +22,17 @@ open Lean (Name LeanOptions)
namespace Lake
/-- **For internal use only.** Computes the cache to use for the package based on the environment. -/
/--
**For internal use only.**
Computes the cache to use for the package based on the environment.
-/
public def computeLakeCache (pkg : Package) (lakeEnv : Lake.Env) : Cache :=
if pkg.bootstrap then
lakeEnv.lakeSystemCache?.getD pkg.lakeDir / "cache"
else
lakeEnv.lakeCache?.getD pkg.lakeDir / "cache"
/-- A Lake workspace -- the top-level package directory. -/
public structure Workspace : Type where
public structure Workspace.Raw : Type where
/-- The root package of the workspace. -/
root : Package
/-- The detected {lean}`Lake.Env` of the workspace. -/
@@ -49,14 +51,25 @@ public structure Workspace : Type where
The packages within the workspace
(in {lit}`require` declaration order with the root coming first).
-/
packages : Array Package := {}
packages : Array Package := #[]
/-- Name-package map of packages within the workspace. -/
packageMap : DNameMap NPackage := {}
/-- Configuration map of facets defined in the workspace. -/
facetConfigs : DNameMap FacetConfig := {}
facetConfigs : FacetConfigMap := {}
deriving Nonempty
public instance : Nonempty Workspace :=
by constructor <;> exact Classical.ofNonempty
public structure Workspace.Raw.WF (ws : Workspace.Raw) : Prop where
packages_wsIdx : ∀ {i} (h : i < ws.packages.size), (ws.packages[i]'h).wsIdx = i
/-- A Lake workspace -- the top-level package directory. -/
public structure Workspace extends raw : Workspace.Raw, wf : raw.WF
public instance : Nonempty Workspace := .intro {
lakeEnv := default
lakeConfig := Classical.ofNonempty
root := Classical.ofNonempty
packages_wsIdx h := by simp at h
}
public hydrate_opaque_type OpaqueWorkspace Workspace
@@ -175,9 +188,20 @@ This is configured through {lit}`cache.service` entries in the system Lake confi
@[inline] public def packageOverridesFile (self : Workspace) : FilePath :=
self.lakeDir / "package-overrides.json"
/-- **For internal use only.** Add a well-formed package to the workspace. -/
@[inline] public def addPackage' (pkg : Package) (self : Workspace) (h : pkg.wsIdx = self.packages.size) : Workspace :=
{self with
packages := self.packages.push pkg
packageMap := self.packageMap.insert pkg.keyName pkg
packages_wsIdx {i} i_lt := by
cases Nat.lt_add_one_iff_lt_or_eq.mp <| Array.size_push .. ▸ i_lt with
| inl i_lt => simpa [Array.getElem_push_lt i_lt] using self.packages_wsIdx i_lt
| inr i_eq => simpa [i_eq] using h
}
/-- Add a package to the workspace. -/
public def addPackage (pkg : Package) (self : Workspace) : Workspace :=
{self with packages := self.packages.push pkg, packageMap := self.packageMap.insert pkg.keyName pkg}
@[inline] public def addPackage (pkg : Package) (self : Workspace) : Workspace :=
self.addPackage' {pkg with wsIdx := self.packages.size} rfl
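The `Workspace` refactor above follows a raw-plus-proof refinement pattern; a stripped-down sketch with hypothetical names:

```lean
-- Raw data with no invariants attached.
structure Raw where
  xs : Array Nat := #[]

-- The well-formedness predicate, mirroring `Workspace.Raw.WF`: each stored
-- element records its own index.
structure Raw.WF (r : Raw) : Prop where
  idx_eq : ∀ {i} (h : i < r.xs.size), r.xs[i] = i

-- The refined type bundles the raw data with its proof, as `Workspace` does;
-- smart constructors like `addPackage'` must re-establish the invariant.
structure Checked extends raw : Raw, wf : raw.WF
```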
/-- Returns the unique package in the workspace (if any) that is identified by {lean}`keyName`. -/
@[inline] public protected def findPackageByKey? (keyName : Name) (self : Workspace) : Option (NPackage keyName) :=
@@ -251,15 +275,15 @@ public def findTargetDecl? (name : Name) (self : Workspace) : Option ((pkg : Pac
self.packages.findSome? fun pkg => pkg.findTargetDecl? name <&> (pkg, ·)
/-- Add a facet to the workspace. -/
public def addFacetConfig {name} (cfg : FacetConfig name) (self : Workspace) : Workspace :=
{self with facetConfigs := self.facetConfigs.insert name cfg}
@[inline] public def addFacetConfig {name} (cfg : FacetConfig name) (self : Workspace) : Workspace :=
{self with facetConfigs := self.facetConfigs.insert cfg}
/-- Try to find a facet configuration in the workspace with the given name. -/
public def findFacetConfig? (name : Name) (self : Workspace) : Option (FacetConfig name) :=
@[inline] public def findFacetConfig? (name : Name) (self : Workspace) : Option (FacetConfig name) :=
self.facetConfigs.get? name
/-- Add a module facet to the workspace. -/
public def addModuleFacetConfig (cfg : ModuleFacetConfig name) (self : Workspace) : Workspace :=
@[inline] public def addModuleFacetConfig (cfg : ModuleFacetConfig name) (self : Workspace) : Workspace :=
self.addFacetConfig cfg.toFacetConfig
/-- Try to find a module facet configuration in the workspace with the given name. -/
@@ -267,7 +291,7 @@ public def findModuleFacetConfig? (name : Name) (self : Workspace) : Option (Mod
self.findFacetConfig? name |>.bind (·.toKind? Module.facetKind)
/-- Add a package facet to the workspace. -/
public def addPackageFacetConfig (cfg : PackageFacetConfig name) (self : Workspace) : Workspace :=
@[inline] public def addPackageFacetConfig (cfg : PackageFacetConfig name) (self : Workspace) : Workspace :=
self.addFacetConfig cfg.toFacetConfig
/-- Try to find a package facet configuration in the workspace with the given name. -/
@@ -275,7 +299,7 @@ public def findPackageFacetConfig? (name : Name) (self : Workspace) : Option (Pa
self.findFacetConfig? name |>.bind (·.toKind? Package.facetKind)
/-- Add a library facet to the workspace. -/
public def addLibraryFacetConfig (cfg : LibraryFacetConfig name) (self : Workspace) : Workspace :=
@[inline] public def addLibraryFacetConfig (cfg : LibraryFacetConfig name) (self : Workspace) : Workspace :=
self.addFacetConfig cfg.toFacetConfig
/-- Try to find a library facet configuration in the workspace with the given name. -/


@@ -26,17 +26,18 @@ def elabPackageCommand : CommandElab := fun stx => do
let configId : Ident ← `(pkgConfig)
let id ← mkConfigDeclIdent nameStx?
let origName := Name.quoteFrom id id.getId
let ⟨idx, name⟩ := nameExt.getState (← getEnv)
let name := match name with
let ⟨wsIdx, name⟩ := nameExt.getState (← getEnv)
let baseName := match name with
| .anonymous => origName
| name => Name.quoteFrom id name
let name := Syntax.mkCApp ``Name.num #[name, quote idx]
let ty := Syntax.mkCApp ``PackageConfig #[name, origName]
let wsIdx := quote wsIdx
let keyName := Syntax.mkCApp ``Name.num #[baseName, wsIdx]
let ty := Syntax.mkCApp ``PackageConfig #[keyName, origName]
elabConfig ``PackageConfig configId ty cfg
let attr ← `(Term.attrInstance| «package»)
let attrs := #[attr] ++ expandAttrs attrs?
let id := mkIdentFrom id packageDeclName
let decl ← `({name := $name, origName := $origName, config := $configId})
let decl ← `({baseName := $baseName, origName := $origName, keyName := $keyName, config := $configId})
let cmd ← `($[$doc?]? @[$attrs,*] abbrev $id : PackageDecl := $decl)
withMacroExpansion stx cmd <| elabCommand cmd
let nameId := mkIdentFrom id <| packageDeclName.str "name"


@@ -7,6 +7,7 @@ module
prelude
public import Lake.Config.Env
public import Lake.Config.Lang
public import Lake.Config.LakeConfig
public import Lake.Load.Manifest
@@ -41,8 +42,16 @@ public structure LoadConfig where
pkgDir : FilePath := wsDir / relPkgDir
/-- The package's Lake configuration file (relative to its directory). -/
relConfigFile : FilePath := defaultConfigFile
/-- The full path to the loaded package's Lake configuration file. -/
/-- The full path to the loaded package's Lake configuration file. -/
configFile : FilePath := pkgDir / relConfigFile
/--
The format of the package's configuration file.
If {lean}`none`, the format will be determined by the file's extension.
-/
configLang? : Option ConfigLang := none
/-- The package's Lake manifest file (relative to its directory). -/
relManifestFile : FilePath := defaultManifestFile
/-- Additional package overrides for this workspace load. -/
packageOverrides : Array PackageEntry := #[]
/-- A set of key-value Lake configuration options (i.e., {lit}`-K` settings). -/
@@ -55,6 +64,7 @@ public structure LoadConfig where
updateDeps : Bool := false
/--
Whether to update the workspace's {lit}`lean-toolchain` when dependencies are updated.
If {lean}`true` and a toolchain update occurs, Lake will need to be restarted.
-/
updateToolchain : Bool := true


@@ -7,6 +7,7 @@ module
prelude
public import Lake.Config.Package
public import Lake.Config.LakefileConfig
public import Lake.Load.Config
import Lake.Load.Lean.Elab
import Lake.Load.Lean.Eval
@@ -21,22 +22,7 @@ open Lean
namespace Lake
/--
Elaborate a Lean configuration file into a `Package`.
The resulting package does not yet include any dependencies.
-/
public def loadLeanConfig (cfg : LoadConfig) : LogIO (Package × Environment) := do
/-- Elaborate a Lake configuration file written in Lean and extract the Lake configuration. -/
public def loadLeanConfig (cfg : LoadConfig) : LogIO LakefileConfig := do
let configEnv ← importConfigFile cfg
let ⟨keyName, origName, config⟩ ← IO.ofExcept <| PackageDecl.loadFromEnv configEnv cfg.leanOpts
let baseName := if cfg.pkgName.isAnonymous then origName else cfg.pkgName
let pkg : Package := {
wsIdx := cfg.pkgIdx
baseName, keyName, origName, config
dir := cfg.pkgDir
relDir := cfg.relPkgDir
configFile := cfg.configFile
relConfigFile := cfg.relConfigFile
scope := cfg.scope
remoteUrl := cfg.remoteUrl
}
return (← pkg.loadFromEnv configEnv cfg.leanOpts, configEnv)
LakefileConfig.loadFromEnv configEnv cfg.leanOpts


@@ -7,6 +7,7 @@ module
prelude
public import Lake.Config.Workspace
public import Lake.Config.LakefileConfig
import Lean.DocString
import Lake.DSL.AttributesCore
@@ -75,39 +76,38 @@ private def mkOrdTagMap
return map.insert declName <| f declName
/-- Load a `PackageDecl` from a configuration environment. -/
public def PackageDecl.loadFromEnv
private def PackageDecl.loadFromEnv
(env : Environment) (opts := Options.empty)
: Except String PackageDecl := do
let declName ←
match packageAttr.getAllEntries env |>.toList with
| [] => error s!"configuration file is missing a `package` declaration"
| [name] => pure name
| [keyName] => pure keyName
| _ => error s!"configuration file has multiple `package` declarations"
evalConstCheck env opts _ declName
/--
Load the optional elements of a `Package` from the Lean environment.
This is done after loading its core configuration but before resolving
its dependencies.
-/
public def Package.loadFromEnv
(self : Package) (env : Environment) (opts : Options)
: LogIO Package := do
/-- Load a Lake configuration from a configuration file's environment. -/
public def LakefileConfig.loadFromEnv
(env : Environment) (opts : Options)
: LogIO LakefileConfig := do
-- load package declaration
let pkgDecl IO.ofExcept <| PackageDecl.loadFromEnv env opts
let prettyName := pkgDecl.baseName.toString (escape := false)
let keyName := pkgDecl.keyName
-- load target, script, hook, and driver configurations
let constTargetMap ← IO.ofExcept <| mkOrdTagMap env targetAttr fun name => do
evalConstCheck env opts ConfigDecl name
let targetDecls ← constTargetMap.toArray.mapM fun decl => do
if h : decl.pkg = self.keyName then
if h : decl.pkg = keyName then
return .mk decl h
else
error s!"\
target '{decl.name}' was defined in package '{decl.pkg}', \
but registered under '{self.keyName}'"
error s!"target '{decl.name}' was defined in package '{decl.pkg}', \
but registered under '{keyName}'"
let targetDeclMap ← targetDecls.foldlM (init := {}) fun m decl => do
if let some orig := m.get? decl.name then
error s!"\
{self.prettyName}: target '{decl.name}' was already defined as a '{orig.kind}', \
error s!"{prettyName}: target '{decl.name}' was already defined as a '{orig.kind}', \
but then redefined as a '{decl.kind}'"
else
return m.insert decl.name (.mk decl rfl)
@@ -116,8 +116,7 @@ public def Package.loadFromEnv
if let some exeConfig := decl.config? LeanExe.configKind then
let root := exeConfig.root
if let some origExe := exeRoots.get? root then
error s!"\
{self.prettyName}: executable '{decl.name}' has the same root module '{root}' \
error s!"{prettyName}: executable '{decl.name}' has the same root module '{root}' \
as executable '{origExe}'"
else
return exeRoots.insert root decl.name
@@ -125,80 +124,78 @@ public def Package.loadFromEnv
return exeRoots
let defaultTargets ← defaultTargetAttr.getAllEntries env |>.mapM fun name =>
if let some decl := constTargetMap.find? name then pure decl.name else
error s!"{self.prettyName}: package is missing target '{name}' marked as a default"
error s!"{prettyName}: package is missing target '{name}' marked as a default"
let scripts ← mkTagMap env scriptAttr fun scriptName => do
let name := self.prettyName ++ "/" ++ scriptName.toString (escape := false)
let name := prettyName ++ "/" ++ scriptName.toString (escape := false)
let fn ← IO.ofExcept <| evalConstCheck env opts ScriptFn scriptName
return {name, fn, doc? := findDocString? env scriptName : Script}
let defaultScripts ← defaultScriptAttr.getAllEntries env |>.mapM fun name =>
if let some script := scripts.get? name then pure script else
error s!"{self.prettyName}: package is missing script '{name}' marked as a default"
error s!"{prettyName}: package is missing script '{name}' marked as a default"
let postUpdateHooks ← postUpdateAttr.getAllEntries env |>.mapM fun name =>
match evalConstCheck env opts PostUpdateHookDecl name with
| .ok decl =>
if h : decl.pkg = self.keyName then
return OpaquePostUpdateHook.mk <| cast (by rw [h, keyName]) decl.fn
if h : decl.pkg = keyName then
return OpaquePostUpdateHook.mk <| cast (by rw [h]) decl.fn
else
error s!"post-update hook was defined in '{decl.pkg}', but was registered in '{self.keyName}'"
error s!"post-update hook was defined in '{decl.pkg}', but was registered in '{keyName}'"
| .error e => error e
let depConfigs ← IO.ofExcept <| packageDepAttr.getAllEntries env |>.mapM fun name =>
evalConstCheck env opts Dependency name
let testDrivers ← testDriverAttr.getAllEntries env |>.mapM fun name =>
if let some decl := constTargetMap.find? name then
pure decl.name
return decl.name
else if scripts.contains name then
pure name
return name
else
error s!"{self.prettyName}: package is missing script or target '{name}' marked as a test driver"
let testDriver ←
error s!"{prettyName}: package is missing script or target '{name}' marked as a test driver"
let testDriver ← id do
if testDrivers.size > 1 then
error s!"{self.prettyName}: only one script, executable, or library can be tagged @[test_driver]"
error s!"{prettyName}: only one script, executable, or library can be tagged @[test_driver]"
else if h : testDrivers.size > 0 then
if self.config.testDriver.isEmpty then
pure (testDrivers[0]'h |>.toString)
if pkgDecl.config.testDriver.isEmpty then
return (testDrivers[0]'h |>.toString)
else
error s!"{self.prettyName}: cannot both set testDriver and use @[test_driver]"
error s!"{prettyName}: cannot both set testDriver and use @[test_driver]"
else
pure self.config.testDriver
return pkgDecl.config.testDriver
let lintDrivers ← lintDriverAttr.getAllEntries env |>.mapM fun name =>
if let some decl := constTargetMap.find? name then
pure decl.name
return decl.name
else if scripts.contains name then
pure name
return name
else
error s!"{self.prettyName}: package is missing script or target '{name}' marked as a lint driver"
let lintDriver ←
error s!"{prettyName}: package is missing script or target '{name}' marked as a lint driver"
let lintDriver ← id do
if lintDrivers.size > 1 then
error s!"{self.prettyName}: only one script or executable can be tagged @[lint_driver]"
error s!"{prettyName}: only one script or executable can be tagged @[lint_driver]"
else if h : lintDrivers.size > 0 then
if self.config.lintDriver.isEmpty then
pure (lintDrivers[0]'h |>.toString)
if pkgDecl.config.lintDriver.isEmpty then
return (lintDrivers[0]'h |>.toString)
else
error s!"{self.prettyName}: cannot both set lintDriver and use @[lint_driver]"
error s!"{prettyName}: cannot both set lintDriver and use @[lint_driver]"
else
pure self.config.lintDriver
return pkgDecl.config.lintDriver
-- load facets
let facetDecls ← IO.ofExcept do
let mut decls : Array FacetDecl := #[]
for name in moduleFacetAttr.getAllEntries env do
let decl ← evalConstCheck env opts ModuleFacetDecl name
decls := decls.push {decl with config := decl.config.toFacetConfig}
for name in packageFacetAttr.getAllEntries env do
let decl ← evalConstCheck env opts PackageFacetDecl name
decls := decls.push {decl with config := decl.config.toFacetConfig}
for name in libraryFacetAttr.getAllEntries env do
let decl ← evalConstCheck env opts LibraryFacetDecl name
decls := decls.push {decl with config := decl.config.toFacetConfig}
return decls
-- Fill in the Package
return {self with
depConfigs, targetDecls, targetDeclMap
defaultTargets, scripts, defaultScripts
testDriver, lintDriver, postUpdateHooks
return {
pkgDecl, depConfigs, facetDecls,
targetDecls, targetDeclMap, defaultTargets,
scripts, defaultScripts,
testDriver, lintDriver,
postUpdateHooks,
}
/--
Load module/package facets into a `Workspace` from a configuration environment.
-/
public def Workspace.addFacetsFromEnv
(env : Environment) (opts : Options) (self : Workspace)
: Except String Workspace := do
let mut ws := self
for name in moduleFacetAttr.getAllEntries env do
let decl ← evalConstCheck env opts ModuleFacetDecl name
ws := ws.addModuleFacetConfig decl.config
for name in packageFacetAttr.getAllEntries env do
let decl ← evalConstCheck env opts PackageFacetDecl name
ws := ws.addPackageFacetConfig decl.config
for name in libraryFacetAttr.getAllEntries env do
let decl ← evalConstCheck env opts LibraryFacetDecl name
ws := ws.addLibraryFacetConfig decl.config
return ws


@@ -94,6 +94,9 @@ public structure PackageEntry where
namespace PackageEntry
@[inline] public def prettyName (entry : PackageEntry) : String :=
entry.name.toString (escape := false)
public protected def toJson (entry : PackageEntry) : Json :=
let fields := [
("name", toJson entry.name),


@@ -10,6 +10,7 @@ public import Lake.Config.Env
public import Lake.Load.Manifest
public import Lake.Config.Package
import Lake.Util.Git
import Lake.Util.IO
import Lake.Reservoir
open System Lean
@@ -87,6 +88,8 @@ def materializeGitRepo
cloneGitPkg name repo url rev?
public structure MaterializedDep where
/-- Absolute path to the materialized package. -/
pkgDir : FilePath
/-- Path to the materialized package relative to the workspace's root directory. -/
relPkgDir : FilePath
/--
@@ -105,17 +108,32 @@ namespace MaterializedDep
@[inline] public def name (self : MaterializedDep) : Name :=
self.manifestEntry.name
@[inline] public def prettyName (self : MaterializedDep) : String :=
self.manifestEntry.name.toString (escape := false)
@[inline] public def scope (self : MaterializedDep) : String :=
self.manifestEntry.scope
/-- Path to the dependency's configuration file (relative to `relPkgDir`). -/
@[inline] public def manifestFile? (self : MaterializedDep) : Option FilePath :=
/-- Path to the dependency's manifest file (relative to `relPkgDir`). -/
@[inline] public def relManifestFile? (self : MaterializedDep) : Option FilePath :=
self.manifestEntry.manifestFile?
/-- Path to the dependency's manifest file (relative to `relPkgDir`). -/
@[inline] public def relManifestFile (self : MaterializedDep) : FilePath :=
self.relManifestFile?.getD defaultManifestFile
/-- Absolute path to the dependency's manifest file. -/
@[inline] public def manifestFile (self : MaterializedDep) : FilePath :=
self.pkgDir / self.relManifestFile
/-- Path to the dependency's configuration file (relative to `relPkgDir`). -/
@[inline] public def configFile (self : MaterializedDep) : FilePath :=
@[inline] public def relConfigFile (self : MaterializedDep) : FilePath :=
self.manifestEntry.configFile
/-- Absolute path to the dependency's configuration file. -/
@[inline] public def configFile (self : MaterializedDep) : FilePath :=
self.pkgDir / self.relConfigFile
public def fixedToolchain (self : MaterializedDep) : Bool :=
match self.manifest? with
| .ok manifest => manifest.fixedToolchain
@@ -157,10 +175,11 @@ public def Dependency.materialize
(lakeEnv : Env) (wsDir relPkgsDir relParentDir : FilePath)
: LoggerIO MaterializedDep := do
if let some src := dep.src? then
let sname := dep.name.toString (escape := false)
match src with
| .path dir =>
let relPkgDir := relParentDir / dir
mkDep (wsDir / relPkgDir) relPkgDir "" (.path relPkgDir)
mkDep sname relPkgDir "" (.path relPkgDir)
| .git url inputRev? subDir? => do
let sname := dep.name.toString (escape := false)
let repoUrl := Git.filterUrl? url |>.getD ""
@@ -208,16 +227,19 @@ public def Dependency.materialize
| _ => error s!"{pkg.fullName}: Git source not found on Reservoir"
where
materializeGit name relPkgDir gitUrl remoteUrl inputRev? subDir? : LoggerIO MaterializedDep := do
let pkgDir := wsDir / relPkgDir
let repo := GitRepo.mk pkgDir
let gitDir := wsDir / relPkgDir
let repo := GitRepo.mk gitDir
let gitUrl := lakeEnv.pkgUrlMap.find? dep.name |>.getD gitUrl
materializeGitRepo name repo gitUrl inputRev?
let rev ← repo.getHeadRevision
let relPkgDir := if let some subDir := subDir? then relPkgDir / subDir else relPkgDir
mkDep pkgDir relPkgDir remoteUrl <| .git gitUrl rev inputRev? subDir?
@[inline] mkDep pkgDir relPkgDir remoteUrl src : LoggerIO MaterializedDep := do
mkDep name relPkgDir remoteUrl <| .git gitUrl rev inputRev? subDir?
@[inline] mkDep name relPkgDir remoteUrl src : LoggerIO MaterializedDep := do
let pkgDir := wsDir / relPkgDir
let some pkgDir ← resolvePath? pkgDir
| error s!"{name}: package directory not found: {pkgDir}"
return {
relPkgDir, remoteUrl
pkgDir, relPkgDir, remoteUrl,
manifest? := ← Manifest.load (pkgDir / defaultManifestFile) |>.toBaseIO
manifestEntry := {name := dep.name, scope := dep.scope, inherited, src}
}
@@ -231,9 +253,9 @@ public def PackageEntry.materialize
: LoggerIO MaterializedDep :=
match manifestEntry.src with
| .path (dir := relPkgDir) .. =>
mkDep (wsDir / relPkgDir) relPkgDir ""
mkDep relPkgDir ""
| .git (url := url) (rev := rev) (subDir? := subDir?) .. => do
let sname := manifestEntry.name.toString (escape := false)
let sname := manifestEntry.prettyName
let relGitDir := relPkgsDir / sname
let gitDir := wsDir / relGitDir
let repo := GitRepo.mk gitDir
@@ -254,12 +276,15 @@ public def PackageEntry.materialize
let url := lakeEnv.pkgUrlMap.find? manifestEntry.name |>.getD url
cloneGitPkg sname repo url rev
let relPkgDir := match subDir? with | .some subDir => relGitDir / subDir | .none => relGitDir
mkDep gitDir relPkgDir (Git.filterUrl? url |>.getD "")
mkDep relPkgDir (Git.filterUrl? url |>.getD "")
where
@[inline] mkDep pkgDir relPkgDir remoteUrl : LoggerIO MaterializedDep := do
@[inline] mkDep relPkgDir remoteUrl : LoggerIO MaterializedDep := do
let pkgDir := wsDir / relPkgDir
let some pkgDir ← resolvePath? pkgDir
| error s!"{manifestEntry.prettyName}: package directory not found: {pkgDir}"
let manifest? ← id do
if let some manifestFile := manifestEntry.manifestFile? then
Manifest.load (pkgDir / manifestFile) |>.toBaseIO
else
return .error (.noFileOrDirectory "" 0 "")
return {relPkgDir, remoteUrl, manifest?, manifestEntry}
return {pkgDir, relPkgDir, remoteUrl, manifest?, manifestEntry}


@@ -8,10 +8,13 @@ module
prelude
public import Lake.Load.Config
public import Lake.Config.Package
public import Lake.Config.LakefileConfig
import Lake.Util.IO
import Lake.Load.Lean
import Lake.Load.Toml
set_option doc.verso true
/-! # Package Loader
This module contains the main definitions to load a package
@@ -23,6 +26,38 @@ open System (FilePath)
namespace Lake
/--
**For internal use only.**
Constructs a package from the configuration defined in its Lake configuration file
and the load configuration.
-/
public def mkPackage
(loadCfg : LoadConfig) (fileCfg : LakefileConfig) (wsIdx := loadCfg.pkgIdx)
: Package :=
let {
pkgDir := dir, relPkgDir := relDir,
configFile, relConfigFile, relManifestFile,
scope, remoteUrl, ..} := loadCfg
let {
depConfigs, defaultTargets, scripts, defaultScripts, testDriver, lintDriver
-- destructuring needed for type-correctness
pkgDecl, targetDecls, targetDeclMap, postUpdateHooks,
..} := fileCfg
let {baseName, keyName, origName, config} := pkgDecl
{
wsIdx, baseName, keyName, origName, config
dir, relDir, configFile, relConfigFile, relManifestFile
scope, remoteUrl
depConfigs
targetDecls, targetDeclMap, defaultTargets
scripts, defaultScripts
testDriver, lintDriver
postUpdateHooks
}
public theorem wsIdx_mkPackage : (mkPackage l f i).wsIdx = i := by rfl
/--
Return whether a configuration file with the given name
and/or a supported extension exists.
@@ -49,21 +84,25 @@ public def realConfigFile (cfgFile : FilePath) : BaseIO FilePath := do
resolvePath (cfgFile.addExtension "toml")
/--
Loads a Lake package configuration (either Lean or TOML).
The resulting package does not yet include any dependencies.
**For internal use only.**
Resolves a relative configuration file path in {lean}`cfg` and
detects its configuration language (if necessary).
-/
public def loadPackageCore
public def resolveConfigFile
(name : String) (cfg : LoadConfig)
: LogIO (Package × Option Environment) := do
if let some ext := cfg.relConfigFile.extension then
let cfg ←
if let some configFile ← resolvePath? cfg.configFile then
pure {cfg with configFile}
else error s!"{name}: configuration file not found: {cfg.configFile}"
: LogIO {cfg : LoadConfig // cfg.configLang?.isSome} := do
if h : cfg.configLang?.isSome then
let some configFile ← resolvePath? cfg.configFile
| error s!"{name}: configuration file not found: {cfg.configFile}"
return ⟨{cfg with configFile}, h⟩
else if let some ext := cfg.relConfigFile.extension then
let some configFile ← resolvePath? cfg.configFile
| error s!"{name}: configuration file not found: {cfg.configFile}"
match ext with
| "lean" => (·.map id some) <$> loadLeanConfig cfg
| "toml" => ((·,none)) <$> loadTomlConfig cfg
| _ => error s!"{name}: configuration has unsupported file extension: {cfg.configFile}"
| "lean" => return ⟨{cfg with configFile, configLang? := some .lean}, rfl⟩
| "toml" => return ⟨{cfg with configFile, configLang? := some .toml}, rfl⟩
| _ => error s!"{name}: configuration has unsupported file extension: {configFile}"
else
let relLeanFile := cfg.relConfigFile.addExtension "lean"
let relTomlFile := cfg.relConfigFile.addExtension "toml"
@@ -72,18 +111,28 @@ public def loadPackageCore
if let some configFile ← resolvePath? leanFile then
if (← tomlFile.pathExists) then
logInfo s!"{name}: {relLeanFile} and {relTomlFile} are both present; using {relLeanFile}"
let cfg := {cfg with configFile, relConfigFile := relLeanFile}
let (pkg, env) ← loadLeanConfig cfg
return (pkg, some env)
return ⟨{cfg with configFile, relConfigFile := relLeanFile, configLang? := some .lean}, rfl⟩
else if let some configFile ← resolvePath? tomlFile then
return ⟨{cfg with configFile, relConfigFile := relTomlFile, configLang? := some .toml}, rfl⟩
else
if let some configFile ← resolvePath? tomlFile then
let cfg := {cfg with configFile, relConfigFile := relTomlFile}
let pkg ← loadTomlConfig cfg
return (pkg, none)
else
error s!"{name}: no configuration file with a supported extension:\n{leanFile}\n{tomlFile}"
error s!"{name}: no configuration file with a supported extension:\n{leanFile}\n{tomlFile}"
/--
**For internal use only.**
Reads the configuration of a Lake configuration file.
The load configuration {lean}`cfg` is assumed to already have an absolute path in
{lean}`cfg.configFile` and the proper configuration language in
{lean}`cfg.configLang?`. This can be ensured through {lean}`resolveConfigFile`.
-/
public def loadConfigFile (cfg : LoadConfig) (h : cfg.configLang?.isSome) : LogIO LakefileConfig := do
match cfg.configLang?.get h with
| .lean => loadLeanConfig cfg
| .toml => loadTomlConfig cfg
/-- Loads a Lake package as a single independent object (without dependencies). -/
public def loadPackage (config : LoadConfig) : LogIO Package := do
Lean.searchPathRef.set config.lakeEnv.leanSearchPath
(·.1) <$> loadPackageCore "[root]" config
public def loadPackage (cfg : LoadConfig) : LogIO Package := do
Lean.searchPathRef.set cfg.lakeEnv.leanSearchPath
let ⟨cfg, h⟩ ← resolveConfigFile "[root]" cfg
let fileCfg ← loadConfigFile cfg h
return mkPackage cfg fileCfg


@@ -25,37 +25,43 @@ This module contains definitions for resolving the dependencies of a package.
namespace Lake
/-- Returns the load configuration of a materialized dependency. -/
@[inline] def mkDepLoadConfig
(ws : Workspace) (dep : MaterializedDep)
(lakeOpts : NameMap String) (leanOpts : Options) (reconfigure : Bool)
: LoadConfig where
lakeEnv := ws.lakeEnv
wsDir := ws.dir
pkgIdx := ws.packages.size
pkgName := dep.name
pkgDir := dep.pkgDir
relPkgDir := dep.relPkgDir
relConfigFile := dep.relConfigFile
relManifestFile := dep.relManifestFile
lakeOpts; leanOpts; reconfigure
scope := dep.scope
remoteUrl := dep.remoteUrl
def Workspace.addFacetDecls (decls : Array FacetDecl) (self : Workspace) : Workspace :=
decls.foldl (·.addFacetConfig ·.config) self
/--
Loads the package configuration of a materialized dependency.
Adds the facets defined in the package to the `Workspace`.
Adds the package and the facets defined within it to the `Workspace`.
-/
def loadDepPackage
(wsIdx : Nat)
def addDepPackage
(dep : MaterializedDep)
(lakeOpts : NameMap String)
(leanOpts : Options) (reconfigure : Bool)
: StateT Workspace LogIO Package := fun ws => do
let name := dep.name.toString (escape := false)
let pkgDir := ws.dir / dep.relPkgDir
let some pkgDir ← resolvePath? pkgDir
| error s!"{name}: package directory not found: {pkgDir}"
let (pkg, env?) ← loadPackageCore name {
lakeEnv := ws.lakeEnv
wsDir := ws.dir
pkgIdx := wsIdx
pkgName := dep.name
pkgDir
relPkgDir := dep.relPkgDir
relConfigFile := dep.configFile
lakeOpts, leanOpts, reconfigure
scope := dep.scope
remoteUrl := dep.remoteUrl
}
if let some env := env? then
let ws ← IO.ofExcept <| ws.addFacetsFromEnv env leanOpts
return (pkg, ws)
else
return (pkg, ws)
let wsIdx := ws.packages.size
let loadCfg := mkDepLoadConfig ws dep lakeOpts leanOpts reconfigure
let ⟨loadCfg, h⟩ ← resolveConfigFile dep.prettyName loadCfg
let fileCfg ← loadConfigFile loadCfg h
let pkg := mkPackage loadCfg fileCfg wsIdx
let ws := ws.addPackage' pkg wsIdx_mkPackage
let ws := ws.addFacetDecls fileCfg.facetDecls
return (pkg, ws)
/--
The resolver's call stack of dependencies.
@@ -100,7 +106,7 @@ abbrev ResolveT m := DepStackT <| StateT Workspace m
/-
Recursively visits each node in a package's dependency graph, starting from
the workspace package `root`. Each dependency missing from the workspace is
resolved using the `load` function and added into the workspace.
added to the workspace using the `resolve` function.
Recursion occurs breadth-first. Each direct dependency of a package is
resolved in reverse order before recursing to the dependencies' dependencies.
@@ -108,9 +114,10 @@ resolved in reverse order before recursing to the dependencies' dependencies.
See `Workspace.updateAndMaterializeCore` for more details.
-/
@[inline] private def Workspace.resolveDepsCore
[Monad m] [MonadError m] (ws : Workspace)
(load : Package → Dependency → Nat → StateT Workspace m Package)
[Monad m] [MonadError m] [MonadLiftT LogIO m] (ws : Workspace)
(resolve : Package → Dependency → Workspace → m MaterializedDep)
(root : Package := ws.root) (stack : DepStack := {})
(leanOpts : Options := {}) (reconfigure := true)
: m Workspace := do
ws.runResolveT go root stack
where
@@ -123,8 +130,8 @@ where
return -- already handled in another branch
if pkg.baseName = dep.name then
error s!"{pkg.prettyName}: package requires itself (or a package with the same name)"
let depPkg ← load pkg dep ws.packages.size
modifyThe Workspace (·.addPackage depPkg)
let matDep ← resolve pkg dep (← getWorkspace)
discard <| addDepPackage matDep dep.opts leanOpts reconfigure
-- Recursively load the dependencies' dependencies
(← getWorkspace).packages.forM recurse start
@@ -171,17 +178,17 @@ private def reuseManifest
logWarning s!"{rootName}: ignoring previous manifest because it failed to load: {e}"
/-- Add a package dependency's manifest entries to the update state. -/
private def addDependencyEntries (pkg : Package) (matDep : MaterializedDep) : UpdateT LoggerIO PUnit := do
match matDep.manifest? with
private def addDependencyEntries (dep : MaterializedDep) : UpdateT LoggerIO PUnit := do
match (← Manifest.load dep.manifestFile |>.toBaseIO) with
| .ok manifest =>
manifest.packages.forM fun entry => do
unless (← getThe (NameMap PackageEntry)).contains entry.name do
let entry := entry.setInherited.inDirectory pkg.relDir
let entry := entry.setInherited.inDirectory dep.relPkgDir
store entry.name entry
| .error (.noFileOrDirectory ..) =>
logWarning s!"{pkg.prettyName}: ignoring missing manifest '{pkg.manifestFile}'"
logWarning s!"{dep.prettyName}: ignoring missing manifest:\n {dep.manifestFile}"
| .error e =>
logWarning s!"{pkg.prettyName}: ignoring manifest because it failed to load: {e}"
logWarning s!"{dep.prettyName}: ignoring manifest because it failed to load: {e}"
/-- Materialize a single dependency, updating it if desired. -/
private def updateAndMaterializeDep
@@ -356,7 +363,6 @@ def Workspace.updateAndMaterializeCore
(updateToolchain := true)
: LoggerIO (Workspace × NameMap PackageEntry) := UpdateT.run do
reuseManifest ws toUpdate
let ws := ws.addPackage ws.root
if updateToolchain then
let deps := ws.root.depConfigs.reverse
let matDeps ← deps.mapM fun dep => do
@@ -365,21 +371,18 @@ def Workspace.updateAndMaterializeCore
ws.updateToolchain matDeps
let start := ws.packages.size
let ws ← (deps.zip matDeps).foldlM (init := ws) fun ws (dep, matDep) => do
let (depPkg, ws) ← loadUpdatedDep ws.packages.size dep matDep ws
let ws := ws.addPackage depPkg
addDependencyEntries matDep
let (_, ws) ← addDepPackage matDep dep.opts leanOpts true ws
return ws
ws.packages.foldlM (init := ws) (start := start) fun ws pkg =>
ws.resolveDepsCore (stack := [ws.root.baseName]) updateAndLoadDep pkg
ws.resolveDepsCore updateAndAddDep pkg [ws.root.baseName] leanOpts true
else
ws.resolveDepsCore updateAndLoadDep
ws.resolveDepsCore updateAndAddDep (leanOpts := leanOpts) (reconfigure := true)
where
@[inline] updateAndLoadDep pkg dep wsIdx := do
let matDep ← updateAndMaterializeDep (← getWorkspace) pkg dep
loadUpdatedDep wsIdx dep matDep
@[inline] loadUpdatedDep wsIdx dep matDep : StateT Workspace (UpdateT LoggerIO) Package := do
let depPkg ← loadDepPackage wsIdx matDep dep.opts leanOpts true
addDependencyEntries depPkg matDep
return depPkg
@[inline] updateAndAddDep pkg dep ws := do
let matDep ← updateAndMaterializeDep ws pkg dep
addDependencyEntries matDep
return matDep
/-- Write package entries to the workspace manifest. -/
def Workspace.writeManifest
@@ -471,12 +474,9 @@ public def Workspace.materializeDeps
if pkgEntries.isEmpty && !ws.root.depConfigs.isEmpty then
error "missing manifest; use `lake update` to generate one"
-- Materialize all dependencies
let ws := ws.addPackage ws.root
ws.resolveDepsCore fun pkg dep wsIdx => do
let ws ← getWorkspace
ws.resolveDepsCore (leanOpts := leanOpts) (reconfigure := reconfigure) fun pkg dep ws => do
if let some entry := pkgEntries.find? dep.name then
let result ← entry.materialize ws.lakeEnv ws.dir relPkgsDir
loadDepPackage wsIdx result dep.opts leanOpts reconfigure
entry.materialize ws.lakeEnv ws.dir relPkgsDir
else
if pkg.isRoot then
error <|


@@ -7,10 +7,12 @@ module
prelude
public import Lake.Config.Package
public import Lake.Config.LakefileConfig
public import Lake.Load.Config
public import Lake.Toml.Decode
import Lake.Toml.Load
import Lean.Parser.Extension
import Lake.Build.Infos
import Init.Omega
meta import Lake.Config.LakeConfig
meta import Lake.Config.InputFileConfig
@@ -424,14 +426,15 @@ private def decodeTargetDecls
(pkg : Name) (prettyName : String) (t : Table)
: DecodeM (Array (PConfigDecl pkg) × DNameMap (NConfigDecl pkg)) := do
let r : DecodeTargetState pkg := {}
let r ← go r LeanLib.keyword LeanLib.configKind LeanLibConfig.decodeToml
let r ← go r LeanExe.keyword LeanExe.configKind LeanExeConfig.decodeToml
let r ← go r InputFile.keyword InputFile.configKind InputFileConfig.decodeToml
let r ← go r InputDir.keyword InputDir.configKind InputDirConfig.decodeToml
let r ← go r LeanLib.keyword LeanLib.configKind LeanLibConfig.decodeToml (by simp)
let r ← go r LeanExe.keyword LeanExe.configKind LeanExeConfig.decodeToml (by simp)
let r ← go r InputFile.keyword InputFile.configKind InputFileConfig.decodeToml (by simp)
let r ← go r InputDir.keyword InputDir.configKind InputDirConfig.decodeToml (by simp)
return (r.decls, r.map)
where
go (r : DecodeTargetState pkg) kw kind
(decode : {n : Name} → Table → DecodeM (ConfigType kind pkg n)) := do
(decode : {n : Name} → Table → DecodeM (ConfigType kind pkg n))
(h : DataType kind = OpaqueConfigTarget kind) := do
let some tableArrayVal := t.find? kw | return r
let some vals ← tryDecode? tableArrayVal.decodeValueArray | return r
vals.foldlM (init := r) fun r val => do
@@ -446,8 +449,10 @@ where
else
let config ← @decode name t
let decl : NConfigDecl pkg name :=
-- Safety: By definition, config kind = facet kind for declarative configurations.
unsafe {pkg, name, kind, config, wf_data := lcProof}
-- Safety: By definition, for declarative configurations, the type of a package target
-- is its configuration's data kind (i.e., `CustomData pkg name = DataType kind`).
-- In the equivalent Lean configuration, this would hold by type family axiom.
unsafe {pkg, name, kind, config, wf_data := fun _ => lcProof, h}
-- Check that executables have distinct root module names
let exeRoots ← id do
if h : kind = LeanExe.configKind then
@@ -469,8 +474,8 @@ where
/-! ## Root Loader -/
/-- Load a `Package` from a Lake configuration file written in TOML. -/
public def loadTomlConfig (cfg: LoadConfig) : LogIO Package := do
/-- Load a Lake configuration from a file written in TOML. -/
public def loadTomlConfig (cfg : LoadConfig) : LogIO LakefileConfig := do
let input ← IO.FS.readFile cfg.configFile
let ictx := mkInputContext input cfg.relConfigFile.toString
match (← loadToml ictx |>.toBaseIO) with
@@ -482,21 +487,12 @@ public def loadTomlConfig (cfg: LoadConfig) : LogIO Package := do
let keyName := baseName.num wsIdx
let prettyName := baseName.toString (escape := false)
let config ← @PackageConfig.decodeToml keyName origName table
let pkgDecl := {baseName, keyName, origName, config : PackageDecl}
let (targetDecls, targetDeclMap) ← decodeTargetDecls keyName prettyName table
let defaultTargets ← table.tryDecodeD `defaultTargets #[]
let defaultTargets := defaultTargets.map stringToLegalOrSimpleName
let depConfigs ← table.tryDecodeD `require #[]
return {
wsIdx, baseName, keyName, origName
dir := cfg.pkgDir
relDir := cfg.relPkgDir
configFile := cfg.configFile
relConfigFile := cfg.relConfigFile
scope := cfg.scope
remoteUrl := cfg.remoteUrl
config, depConfigs, targetDecls, targetDeclMap
defaultTargets
}
return {pkgDecl, depConfigs, targetDecls, targetDeclMap, defaultTargets}
if errs.isEmpty then
return pkg
else


@@ -31,18 +31,21 @@ Does not resolve dependencies.
public def loadWorkspaceRoot (config : LoadConfig) : LogIO Workspace := do
Lean.searchPathRef.set config.lakeEnv.leanSearchPath
let lakeConfig ← loadLakeConfig config.lakeEnv
let (root, env?) ← loadPackageCore "[root]" {config with pkgIdx := 0}
let config := {config with pkgIdx := 0}
let ⟨config, h⟩ ← resolveConfigFile "[root]" config
let fileCfg ← loadConfigFile config h
let root := mkPackage config fileCfg 0
let facetConfigs := fileCfg.facetDecls.foldl (·.insert ·.config) initFacetConfigs
let ws : Workspace := {
root
lakeEnv := config.lakeEnv
lakeConfig
lakeArgs? := config.lakeArgs?
facetConfigs := initFacetConfigs
facetConfigs
packages_wsIdx h := by simp at h
}
if let some env := env? then
IO.ofExcept <| ws.addFacetsFromEnv env config.leanOpts
else
return ws
let ws := ws.addPackage' root wsIdx_mkPackage
return ws
/--
Load a `Workspace` for a Lake package by


@@ -41,7 +41,6 @@ leanLibDir = "lib/lean"
nativeLibDir = "lib/lean"
# Additional options derived from the CMake configuration
# For example, CI will set `-DwarningAsError=true` through this
moreLeanArgs = [${LEAN_EXTRA_OPTS_TOML}]
# Uncomment to limit number of reported errors further in case of overwhelming cmdline output
@@ -77,7 +76,7 @@ globs = ["Lake.*"]
defaultFacets = ["static", "static.export"]
# Load the previous stage's lake native code into lake's build process in order to prevent ABI
# breakages from affecting bootstrapping.
moreLeanArgs = ["--plugin", "${PREV_STAGE}/${CMAKE_RELATIVE_LIBRARY_OUTPUT_DIRECTORY}/libLake_shared${CMAKE_SHARED_LIBRARY_SUFFIX}"]
weakLeanArgs = ["--plugin", "${PREV_STAGE}/${CMAKE_RELATIVE_LIBRARY_OUTPUT_DIRECTORY}/libLake_shared${CMAKE_SHARED_LIBRARY_SUFFIX}"]
[[lean_lib]]
name = "LakeMain"


@@ -5,9 +5,6 @@ Released under Apache 2.0 license as described in the file LICENSE.
Author: Leonardo de Moura
*/
#include <limits>
#include <cstdlib>
#include <ctime>
#include <execinfo.h>
#include "runtime/thread.h"
#include "runtime/interrupt.h"
#include "runtime/exception.h"
@@ -19,68 +16,6 @@ namespace lean {
LEAN_THREAD_VALUE(size_t, g_max_heartbeat, 0);
LEAN_THREAD_VALUE(size_t, g_heartbeat, 0);
// --- check_system interval monitoring ---
//
// When LEAN_CHECK_SYSTEM_INTERVAL_MS=N is set, warn on stderr if check_system
// is not called within N milliseconds of CPU time on the current thread.
// Uses CLOCK_THREAD_CPUTIME_ID so that IO waits do not count towards the interval.
// Zero overhead when env var is unset.
// 0 = disabled
static unsigned g_check_system_interval_ms = 0;
static bool g_check_system_interval_initialized = false;
static unsigned get_check_system_interval_ms() {
if (!g_check_system_interval_initialized) {
g_check_system_interval_initialized = true;
if (const char * env = std::getenv("LEAN_CHECK_SYSTEM_INTERVAL_MS")) {
g_check_system_interval_ms = std::atoi(env);
}
}
return g_check_system_interval_ms;
}
static uint64_t thread_cpu_time_ns() {
struct timespec ts;
clock_gettime(CLOCK_THREAD_CPUTIME_ID, &ts);
return static_cast<uint64_t>(ts.tv_sec) * 1000000000ULL + static_cast<uint64_t>(ts.tv_nsec);
}
// Thread-local: last CPU time when check_system was called on this thread.
LEAN_THREAD_VALUE(uint64_t, g_last_check_system_ns, 0);
static void check_system_interval(char const * component_name) {
unsigned interval_ms = get_check_system_interval_ms();
if (interval_ms > 0) {
uint64_t now_ns = thread_cpu_time_ns();
uint64_t last_ns = g_last_check_system_ns;
g_last_check_system_ns = now_ns;
if (last_ns != 0) {
uint64_t elapsed_ms = (now_ns - last_ns) / 1000000;
if (elapsed_ms > interval_ms) {
fprintf(stderr,
"[check_system] WARNING: %llu ms CPU time since last check_system call "
"(component: %s)\n",
(unsigned long long)elapsed_ms, component_name);
void * bt_buf[64];
int nptrs = backtrace(bt_buf, 64);
backtrace_symbols_fd(bt_buf, nptrs, 2); // fd 2 = stderr
// Reset timer after printing to avoid backtrace overhead cascading
g_last_check_system_ns = thread_cpu_time_ns();
}
}
}
}
extern "C" LEAN_EXPORT obj_res lean_check_system_interval(b_lean_obj_arg component_name) {
check_system_interval(lean_string_cstr(component_name));
return lean_io_result_mk_ok(lean_box(0));
}
extern "C" LEAN_EXPORT uint8_t lean_check_system_interval_is_enabled(lean_obj_arg /* unit */) {
return get_check_system_interval_ms() > 0;
}
extern "C" LEAN_EXPORT obj_res lean_internal_get_default_max_heartbeat() {
#ifdef LEAN_DEFAULT_MAX_HEARTBEAT
return lean_box(LEAN_DEFAULT_MAX_HEARTBEAT);
@@ -136,7 +71,6 @@ void check_interrupted() {
}
void check_system(char const * component_name, bool do_check_interrupted) {
check_system_interval(component_name);
check_stack(component_name);
check_memory(component_name);
if (do_check_interrupted) {


@@ -181,9 +181,11 @@ public:
template<typename T> class unique_lock {
public:
unique_lock(T const &) {}
unique_lock(T const &, std::adopt_lock_t) {}
~unique_lock() {}
void lock() {}
void unlock() {}
T * release() { return nullptr; }
};
inline unsigned hardware_concurrency() { return 1; }
}


@@ -173,7 +173,7 @@ extern "C" LEAN_EXPORT lean_obj_res lean_uv_dns_get_name(b_obj_arg addr) {
#else
// Std.Internal.IO.Async.DNS.getAddrInfo (host service : @& String) (family : UInt8) : IO (IO.Promise (Except IO.Error (Array IPAddr)))
extern "C" LEAN_EXPORT lean_obj_res lean_uv_dns_get_info(b_obj_arg name, b_obj_arg service, uint8_t family, int8_t protocol) {
extern "C" LEAN_EXPORT lean_obj_res lean_uv_dns_get_info(b_obj_arg name, b_obj_arg service, uint8_t family) {
lean_always_assert(
false && ("Please build a version of Lean4 with libuv to invoke this.")
);
