Compare commits

...

42 Commits

Author SHA1 Message Date
Sebastian Ullrich
80cbab1642 chore: don't fail on running build bench on built stage3 (#13467) 2026-04-18 22:07:21 +00:00
Leonardo de Moura
c0a53ffe97 chore: minor tweaks to Sym.simp test and benchmark (#13468)
This PR applies two minor tweaks:
- `tests/bench/sym/simp_1.lean`: maximally share the proof term (via
`shareCommon`) before counting objects in `getProofSize`, so the reported
size reflects the shared representation.
- `tests/elab/sym_simp_3.lean`: use `>>` instead of `.andThen` when
composing `Sym.Simp` methods.

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-18 21:11:30 +00:00
Lean stage0 autoupdater
91aa82da7f chore: update stage0 2026-04-18 11:10:08 +00:00
Kim Morrison
3e9ed3c29d chore: add AGENTS.md symlink to CLAUDE.md (#13461)
This PR adds an `AGENTS.md` symlink to `.claude/CLAUDE.md` so
Codex-style repository instructions can resolve to the same checked-in
guidance Claude Code already uses.

This keeps the repository's agent-facing instructions in one place
instead of maintaining separate copies for different tools. Previous
Codex runs did not automatically pick up the existing
`.claude/CLAUDE.md` guidance, which caused avoidable drift in PR
formatting and workflow behavior.
2026-04-18 06:48:05 +00:00
Mac Malone
43e1e8285b refactor: lake: introduce GitRev (#13456)
This PR adds a type abbreviation `GitRev` to Lake, which is used for
`String` values that signify Git revisions. Such revisions may be a SHA1
commit hash, a branch name, or one of Git's more complex specifiers.

The PR also adds a number of additional Git primitives which are useful
for #11662.
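The described abbreviation is small; a minimal sketch is below (the actual Lake definition may carry documentation or live in a different namespace, and `isSha1Like` is a hypothetical helper added purely for illustration):

```lean
/-- A `String` denoting a Git revision: a SHA1 commit hash, a branch name,
or one of Git's more complex specifiers. (Sketch of the abbreviation
described above; the real Lake definition may differ.) -/
abbrev GitRev := String

-- Hypothetical helper showing how the alias documents intent in
-- signatures without changing behavior:
def isSha1Like (rev : GitRev) : Bool :=
  rev.length == 40 && rev.all fun c => "0123456789abcdef".contains c
```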
2026-04-18 03:44:26 +00:00
Mac Malone
d3c069593b refactor: introduce Package.depPkgs (#13445)
This PR adds `Package.depPkgs` for internal use (namely in #11662). This
field contains the list of the package's direct dependencies (as package
objects). This is much more efficient than going through the old package
`deps` facet.

As part of this refactor, the Workspace `root` is now derived from
`packages` instead of being its own independent field.
2026-04-18 03:44:21 +00:00
Mac Malone
0fa749f71b refactor: lake: introduce lowerHexUInt64 (#13455)
This PR adds a `lowerHexUInt64` utility to Lake which is used to
optimize `Hash.hex`.
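The PR description does not show the definition; a plausible sketch of such a utility follows (hypothetical body using core `Nat.digitChar`; the real Lake implementation may be more optimized):

```lean
-- Sketch: render a `UInt64` as exactly 16 lowercase hex digits,
-- peeling off one nibble per iteration from least to most significant.
def lowerHexUInt64 (n : UInt64) : String := Id.run do
  let mut out := ""
  let mut n := n
  for _ in [0:16] do
    out := String.singleton (Nat.digitChar (n % 16).toNat) ++ out
    n := n / 16
  return out

-- e.g. `lowerHexUInt64 255` yields "00000000000000ff" with this sketch
```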
2026-04-18 03:43:11 +00:00
Kyle Miller
592eb02bb2 feat: have level metavariable pretty printer instantiate level metavariables (#13438)
This PR makes the universe level pretty printer instantiate level
metavariables when `pp.instantiateMVars` is true.

Previously level metavariables were not instantiated.

The PR adjusts the tracing in the LevelDefEq module to create the trace
message using the original MetavarContext. It also adds
`Meta.isLevelDefEq.step` traces for when level metavariables are
assigned.
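The behavior is gated on the existing pretty-printer option, which can be set as usual:

```lean
-- With this PR, level metavariables are also instantiated when this
-- (already-existing) option is enabled:
set_option pp.instantiateMVars true
```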
2026-04-18 01:07:22 +00:00
Leonardo de Moura
70df9742f4 fix: kernel error in grind order module for Nat casts to non-Int types (#13453)
This PR fixes a kernel error in `grind` when propagating a `Nat`
equality to an order structure whose carrier type is not `Int` (e.g.
`Rat`). The auxiliary `Lean.Grind.Order.of_nat_eq` lemma was specialized
to `Int`, so the kernel rejected the application when the cast
destination differed.

We add a polymorphic `of_natCast_eq` lemma over `{α : Type u} [NatCast
α]` and cache the cast destination type in `TermMapEntry`.
`processNewEq` now uses the original `of_nat_eq` when the destination is
`Int` (the common case) and the new lemma otherwise. The symmetric
`nat_eq` propagation (deriving `Nat` equality from a derived cast
equality) is now guarded to fire only when the destination is `Int`,
since the `nat_eq` lemma is still specialized to `Int`.

Closes #13265.
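For reference, the new polymorphic lemma as it appears in the source diff on this page:

```lean
theorem of_natCast_eq {α : Type u} [NatCast α] (a b : Nat) (x y : α) :
    NatCast.natCast a = x → NatCast.natCast b = y → a = b → x = y := by
  intro h₁ h₂ h; subst h; exact h₁.symm.trans h₂
```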
2026-04-17 23:51:21 +00:00
Leonardo de Moura
9c245d5531 test: add regression test for Sym.simp eta-reduction (#13416) (#13452)
This PR adds a direct regression test for issue #13416. It exercises
`Std.HashMap.getElem_insert`, whose `dom` argument is a lambda closing
over pattern variables, and checks that the discrimination tree lookup
finds the theorem once the target's `dom` lambda is eta-reduced.

The underlying fix landed in #13448; this test pins the specific MWE
from the original issue so a regression would surface immediately.

Closes #13416

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-17 22:47:53 +00:00
Mac Malone
3c4440d3bc feat: lake: JobAction.reuse & .unpack (#13423)
This PR adds `JobAction.reuse` and `JobAction.unpack`, which provide more
informative captions about what a job is doing for the build monitor.
`reuse` is set when using an artifact from the Lake cache, `unpack` is
set when unpacking module `.ltar` archives and release (Reservoir or
GitHub) archives.
2026-04-17 22:34:04 +00:00
Leonardo de Moura
2964193af8 fix: avoid assigning mvar when Sym.intros produces no binders (#13451)
This PR fixes a bug in `Sym.introCore.finalize` where the original
metavariable was unconditionally assigned via a delayed assignment, even
when no binders were introduced. As a result, `Sym.intros` would return
`.failed` while the goal metavariable had already been silently
assigned, confusing downstream code that relies on `isAssigned` (e.g. VC
filters in `mvcgen'`).

The test and fix were suggested by Sebastian Graf (@sgraf812).

Co-authored-by: Sebastian Graf <sgraf1337@gmail.com>
Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-17 21:47:47 +00:00
Eric Wieser
43e96b119d fix: prevent a hang in acLt (#13367)
This PR removes some cases where `simp` would significantly overrun a
timeout.

This is a little tricky to test cleanly; using mathlib's
`#count_heartbeats` as
```lean4
#count_heartbeats in
set_option maxHeartbeats 200000 in
example (k : Nat) (a : Fin (1 + k + 1) → Nat) :
    0 ≤ sumRange (1 + k + 1) (fun i =>
        if h : i < 1 + k + 1 then a ⟨i, h⟩ else 0) := by
  simp only [Nat.add_comm, sumRange_add]
```
I see 200010 heartbeats with this PR, and 1873870 (9x the requested
limit) without.

This type of failure is wasteful in AI systems which try tactics with a
short timeout.
2026-04-17 21:46:29 +00:00
Leonardo de Moura
615f45ad7a chore: fix Sym benchmarks using stale run' API (#13450)
This PR updates two Sym benchmarks (`add_sub_cancel.lean` and
`meta_simp_1.lean`) to use the current `SymM.run` API. Both files still
referenced `run'`, which no longer exists, so they failed to elaborate.

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-17 21:31:57 +00:00
Sebastian Graf
8a9a5ebd5e refactor: promote repeat/while builtin parsers to default priority (#13447)
This PR removes the transitional `syntax` declarations for `repeat`,
`while`, and `repeat ... until` from `Init.While` and promotes the
corresponding `@[builtin_doElem_parser]` defs in `Lean.Parser.Do` from
`low` to default priority, making them the canonical parsers.

The `macro_rules` in `Init.While` are kept as a bootstrap: they expand
`repeat`/`while`/`until` directly to `for _ in Loop.mk do ...`, which is
what any `prelude` Init file needs. The `@[builtin_macro]` /
`@[builtin_doElem_elab]` in `Lean.Elab.BuiltinDo.Repeat` are only
visible once `Lean.Elab.*` is transitively imported, so they cannot
serve Init bootstrap. The duplication will be removed in a follow-up
after the next stage0 update.
2026-04-17 20:57:41 +00:00
Leonardo de Moura
1af697a44b fix: eta-reduce patterns containing loose pattern variables (#13448)
This PR fixes a regression in `Sym.simp` where rewrite rules whose LHS
contains a lambda over a pattern variable (e.g. `∃ x, a = x`) failed to
match targets with semantically equivalent structure.

`Sym.etaReduceAux` previously refused any eta-reduction whenever the
body had loose bound variables, but patterns produced by stripping outer
foralls always carry such loose bvars. The eta-reduction therefore
skipped patterns while still firing on the target, producing mismatched
discrimination tree keys and no match.

The fix narrows the check to loose bvars in the range `[0, n)` (those
that would actually refer to the peeled binders) and lowers any
remaining loose bvars by `n` so that pattern-variable references stay
consistent in the reduced expression. The discrimination tree now
classifies patterns like `exists_eq_True : (∃ x, a = x) = True` with
their full structure rather than falling back to `.other`.

Includes a regression test (`sym_simp_1.lean`) and Sebastian Graf's MWE
(`sym_eta_mwe.lean`).
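The narrowed check can be sketched with core `Lean.Expr` primitives (hypothetical helper, not the actual `Sym.etaReduceAux`):

```lean
open Lean in
/-- Sketch: after peeling `n` binders, eta-reduction of the body is blocked
only if the body still refers to one of those binders, i.e. has a loose
bvar with index in `[0, n)`. Any remaining loose bvars (pattern-variable
references) are lowered by `n` so they stay consistent in the reduced
expression. -/
def etaReduceBodySketch (n : Nat) (body : Expr) : Option Expr :=
  if (List.range n).any body.hasLooseBVar then
    none  -- body mentions a peeled binder; eta-reduction is unsound
  else
    some (body.lowerLooseBVars n n)
```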

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-17 20:49:21 +00:00
Lean stage0 autoupdater
234267b08a chore: update stage0 2026-04-17 16:13:04 +00:00
Sebastian Graf
704df340cb feat: make repeat and while syntax builtin (#13442)
This PR promotes the `repeat`, `while`, and `repeat ... until` parsers
from `syntax` declarations in `Init.While` to `@[builtin_doElem_parser]`
definitions in `Lean.Parser.Do`, alongside the other do-element parsers.
The `while` variants and `repeat ... until` get `@[builtin_macro]`
expansions; `repeat` itself gets a `@[builtin_doElem_elab]` so a
follow-up can extend it with an option-driven choice between `Loop.mk`
and a well-founded `Repeat.mk`.

The new builtin parsers are registered at `low` priority so that the
bootstrapping `syntax` declarations in `Init.While` (still needed for
stage0 compatibility) take precedence during the transition. After the
next stage0 update, the `Init.While` syntax and macros can be removed.
2026-04-17 15:19:59 +00:00
Henrik Böving
f180c9ce17 fix: handling of EmitC for small hex string literals (#13435)
This PR fixes a bug in EmitC triggered by string literals such as
`"\x01abc"` in Lean, which produced invalid C and caused a C compiler
error.

The error is as follows:
```
run.c:29:189: error: hex escape sequence out of range
   29 | static const lean_string_object l_badString___closed__0_value = {.m_header = {.m_rc = 0, .m_cs_sz = 0, .m_other = 0, .m_tag = 249}, .m_size = 5, .m_capacity = 5, .m_length = 4, .m_data = "\x01abc"};
      |                                                                                                                                                                                             ^~~~~~~
1 error generated.
```
This happens because C hex escape sequences can be arbitrarily long, while
Lean expects them to end after two characters. The C compiler therefore
parses the literal as one large hex escape sequence `\x01abc` and
subsequently rejects it as out of range.

Discovered by @datokrat
2026-04-17 15:16:28 +00:00
Lean stage0 autoupdater
040376ebd0 chore: update stage0 2026-04-17 10:09:17 +00:00
Sebastian Graf
ce998700e6 feat: add ControlInfo handler for doRepeat (#13437)
This PR adds a builtin `doElem_control_info` handler for `doRepeat`. It
is ineffective as long as we have the macro for `repeat`.
2026-04-17 09:17:52 +00:00
Sebastian Ullrich
b09d39a766 chore: build bench: replace warmup with selective build (#13432) 2026-04-17 08:01:17 +00:00
Lean stage0 autoupdater
348ed9b3b0 chore: update stage0 2026-04-17 08:05:59 +00:00
Sebastian Graf
f8b9610b74 feat: add Lean.doRepeat elaborators for repeat/while loops (#13434)
This PR names the `repeat` syntax (`doRepeat`) and installs dedicated
elaborators for it in both the legacy and new do-elaborators. Both
currently expand to `for _ in Loop.mk do ...`, identical to the existing
fallback macro in `Init.While`.

The elaborators are dead code today because that fallback macro fires
first. A follow-up PR will drop the macro (after this PR's stage0 update
lands) and extend `elabDoRepeat` to choose between `Loop.mk` and a
well-founded `Repeat.mk` based on a `backward.do.while` option.
2026-04-17 07:17:57 +00:00
Wojciech Różowski
3fc99eef10 feat: add instance validation checks in addInstance (#13389)
This PR adds two validation checks to `addInstance` that provide early
feedback for common mistakes in instance declarations:

1. **Non-class instance check**: errors when an instance target type is
not a type class. This catches the common mistake of writing `instance`
for a plain structure. Previously handled by the `nonClassInstance`
linter in Batteries (`Batteries.Tactic.Lint.TypeClass`), this is now
checked directly at declaration time.

2. **Impossible argument check**: errors when an instance has arguments
that cannot be inferred by instance synthesis. Specifically, it flags
arguments that are not instance-implicit and do not appear in any
subsequent instance-implicit argument or in the return type. Previously
such instances would be silently accepted but could never be
synthesised.

Supersedes #13237 and #13333.
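A sketch of the kind of declaration the first check now rejects (hypothetical example; `Config` is not from the PR):

```lean
structure Config where
  verbose : Bool

-- `Config` is a plain structure, not a type class, so instance synthesis
-- can never use this declaration. With the new check, `addInstance`
-- reports an error at declaration time instead of silently accepting it.
instance : Config := ⟨true⟩
```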
2026-04-16 17:48:16 +00:00
Wojciech Różowski
b99356ebcf chore: enable warning.simp.varHead (#13403)
This PR globally enables `warning.simp.varHead` (added in #13325) and
silences the warning in `Lake.Util.Family`. Mathlib adaptations were
already merged as part of #13325. This is a separate PR from #13325
because the warning appears when re-bootstrapping, so a `stage0` update
was needed before enabling this option.
2026-04-16 16:11:09 +00:00
Henrik Böving
7e8a710ca3 fix: two bugs in io.cpp (#13427)
This PR fixes two minor bugs in `io.cpp`:
1. A resource leak in a Windows error path of
`Std.Time.Database.Windows.getNextTransition`
2. A buffer overrun in `IO.appPath` on Linux when the executable is a
symlink at max path length.
2026-04-16 12:38:17 +00:00
Kim Morrison
621c558c13 fix: make delta-derived instances respect enclosing meta sections (#13315)
This PR fixes `processDefDeriving` to propagate the `meta` attribute to
instances derived via delta deriving, so that `deriving BEq` inside a
`public meta section` produces a meta instance. Previously the derived
`instBEqFoo` was not marked meta, and the LCNF visibility checker
rejected meta definitions that used `==` on the alias — this came up
while bumping verso to v4.30.0-rc1.

`processDefDeriving` now computes `isMeta` from two sources:
1. `(← read).isMetaSection` — true inside a `public meta section`,
covering the original issue #13313.
2. `isMarkedMeta (← getEnv) declName` — true when the type being derived
for was individually marked `meta` (e.g. `meta def Foo := Nat`), via
`elabMutualDef` in `src/Lean/Elab/MutualDef.lean`.

This value is passed to `wrapInstance` for aux declarations and to the
new `addAndCompile (markMeta := ...)` parameter from #13311, matching
how the regular command elaboration pipeline handles meta definitions.

Existing regression tests `tests/elab/13043.lean` and
`tests/elab/12897.lean` already cover meta-section + `wrapInstance` aux
def interaction. The new `tests/elab/13313.lean` specifically covers the
delta-derived `BEq` + LCNF-use case (the original issue) and an explicit
`meta def ... deriving BEq` outside a meta section (motivating the
second disjunct).

- [ ] depends on: #13311

Closes #13313

🤖 Prepared with Claude Code

---------

Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-16 09:18:54 +00:00
Sebastian Graf
490c79502b fix: improve result type mismatch errors and locations in new do elaborator (#13404)
This PR fixes #12846, where the new do elaborator produced confusing
errors when a do element's continuation had a mismatched monadic result
type. The errors were misleading both in location (e.g., pointing at the
value of `let x ← value` rather than the `let` keyword) and in content
(e.g., mentioning `PUnit.unit` which the user never wrote).

The fix introduces `DoElemCont.ensureUnitAt`/`ensureHasTypeAt`, which
check the continuation result type early and report mismatches with a
clear message ("The `do` element has monadic result type ... but the
rest of the `do` block has monadic result type ..."). Each do-element
elaborator (`let`, `have`, `let rec`, `for`, `unless`, `dbg_trace`,
`assert!`, `idbg`, etc.) now captures its keyword token via `%$tk` and
passes it to `ensureUnitAt` so that the error points at the do element
rather than at an internal elaboration artifact. The old ad-hoc type
check in `for` and the confusing `ensureHasType` call in
`continueWithUnit` are replaced by this uniform mechanism. Additionally,
`extractMonadInfo` now calls `instantiateMVars` on the expected type,
and `While.lean`/`If.lean` macros propagate token info through their
expansions.

Closes #12846

---------

Co-authored-by: Rob23oba <robin.arnez@web.de>
2026-04-16 09:16:27 +00:00
Wojciech Różowski
fed2f32651 chore: revert "feat: add lake builtin-lint" (#13422)
This PR reverts leanprover/lean4#13393.
2026-04-15 19:28:59 +00:00
Henrik Böving
5949ae8664 fix: expand reset reuse in the presence of double oproj (#13421)
This PR fixes an issue in the expand reset reuse pass that causes
segfaults in very rare situations.

This bug occurs in situations where two projections from the same field
happen right before a reset, for example:
```
let x.2 := oproj[0] _x.1;
inc x.2;
let x.3 := oproj[0] _x.1;
inc x.3;
let _x.4 := reset[1] _x.1;
```
When expanding reset/reuse, we optimize situations like this so that the
`inc` happens only on the cold path: on the hot path we keep the
projectees alive until at least the `reuse` simply by not `dec`-ing the
resetee. However, the algorithm assumed that we never project more than
once from the same field and thus removed both `inc x.2` and `inc x.3`,
which is too much.

The bug was masked, compared to the original reproducer in #13407 (which
was reproducible on 4.29), because the presented code relied on semantics
of global constants that changed in 4.30. The PR contains a modified (and
more consistent) reproducer.

Closes: #13407
Co-investigated with @Rob23oba
2026-04-15 19:16:22 +00:00
Wojciech Różowski
fe77e4d2d1 fix: coinductive syntax causing panic in macro scopes (#13420)
This PR fixes a panic when `coinductive` predicates are defined inside
macro quotations, where constructor names carry macro scopes. The existing
guard only checked the declaration name for macro scopes, missing the
case where constructor identifiers are generated inside a macro
quotation and thus carry macro scopes. This caused
`removeFunctorPostfixInCtor` to panic on `Name.num` components from
macro scope encoding.

Closes #13415
2026-04-15 18:50:31 +00:00
Wojciech Różowski
9b1426fd9c feat: add lake builtin-lint (#13393)
This PR adds basic support for the `lake builtin-lint` command, which runs
environment linters and will in the future be extended to handle the core
syntax linters.
2026-04-15 18:14:40 +00:00
Lean stage0 autoupdater
b6bfe019a1 chore: update stage0 2026-04-15 10:55:52 +00:00
Sebastian Graf
748783a5ac feat: add internal skip do-element parser (#13413)
This PR adds an internal `skip` syntax for do blocks, intended for use
by the `if` and `unless` elaborators to replace `pure PUnit.unit` in
implicit else branches. This gives the elaborator a dedicated syntax
node to attach better error messages and location info to, rather than
synthesizing `pure PUnit.unit` which leaks internal details into
user-facing errors.

Includes a stage0 trigger comment so that the new parser is available
during bootstrapping.

Co-authored-by: Rob23oba <robin.arnez@web.de>
2026-04-15 10:01:01 +00:00
Marc Huisinga
df23b79c90 fix: tactic completion in empty by blocks (#13348)
This PR fixes a bug where tactic auto-completion would produce tactic
completion items in the entire trailing whitespace of an empty tactic
block. Since #13229 further restricted top-level `by` blocks to be
indentation-sensitive, this PR adjusts the logic to only display
completion items at a "proper" indentation level.

Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-15 08:39:55 +00:00
Mac Malone
8156373037 fix: pass CMake Lake args to the Lake make build (#13410)
This PR fixes a bug in #13294 where the Lake arguments were not actually
passed to the Lake make build.
2026-04-14 22:07:29 +00:00
Sebastian Graf
75487a1bf8 fix: universe normalization in getDecLevel (#13391)
This PR adds level instantiation and normalization in `getDecLevel` and
`getDecLevel?` before calling `decLevel`.

`getLevel` can return levels with uninstantiated metavariables or
un-normalized structure, such as `max ?u ?v` where the metavariables
have already been assigned. After instantiation and normalization (via
`normalizeLevel`), a level like `max ?u ?v` (with `?u := 1`, `?v := 0`)
simplifies to `1` (i.e. `succ 0`), which `decLevel` can decrement. Without
this step, `decLevel` sees `max ?u ?v`, tries to decrement both arms,
fails on a zero-valued arm, and reports "invalid universe level".

Concretely, this fixes `for` loops with `mut` variables of
sort-polymorphic type (e.g. `PProd Nat True`) where the state tuple's
universe level ends up as an uninstantiated `max`.

The expected-output change in `doNotation1.lean` is because the `for`
loop's unit type now resolves to `Unit` instead of `PUnit` due to the
improved level handling.
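A sketch of the kind of code the fix enables, based on the description above (hypothetical reproducer; whether this exact snippet triggered the bug is an assumption):

```lean
-- A `for` loop with a `mut` variable of sort-polymorphic type: the state
-- tuple's universe level involves a `max` over level metavariables that
-- must be instantiated and normalized before `decLevel` can decrement it.
def countUp : Id Nat := do
  let mut p : PProd Nat True := ⟨0, trivial⟩
  for _ in [0:3] do
    p := ⟨p.fst + 1, trivial⟩
  return p.fst
```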
2026-04-14 21:27:22 +00:00
Henrik Böving
559f6c0ae7 perf: specialize qsort properly onto the lt function (#13409) 2026-04-14 19:57:30 +00:00
Henrik Böving
a0ee357152 chore: add io compute benchmark (#13406) 2026-04-14 18:33:32 +00:00
Henrik Böving
1884c3b2ed feat: make mimalloc security options available (#13401)
This PR adds the option `LEAN_MI_SECURE` to our CMake build. It can be
configured with values `0` through `4`. Every increment enables additional
memory safety mitigations in mimalloc, at the cost of 2%-20% instruction
count, depending on the benchmark. The option is disabled by default in
our release builds as most of our users do not use the Lean runtime in
security-sensitive situations. Distributors and organizations deploying
production Lean code should consider enabling the option as a hardening
measure. The effects of the various levels can be found at
https://github.com/microsoft/mimalloc/blob/v2.2.7/include/mimalloc/types.h#L56-L60.
2026-04-14 13:22:07 +00:00
Sebastian Ullrich
cdb6401c65 chore: don't waste time building CMake mimalloc that we don't use (#13402) 2026-04-14 09:49:31 +00:00
845 changed files with 1942 additions and 1138 deletions

View File

@@ -279,7 +279,8 @@ jobs:
"os": large ? "nscloud-ubuntu-24.04-amd64-8x16-with-cache" : "ubuntu-latest",
"enabled": true,
"check-rebootstrap": level >= 1,
"check-stage3": level >= 2,
// Done as part of test-bench
//"check-stage3": level >= 2,
"test": true,
// NOTE: `test-bench` currently seems to be broken on `ubuntu-latest`
"test-bench": large && level >= 2,
@@ -291,7 +292,8 @@ jobs:
"os": large ? "nscloud-ubuntu-24.04-amd64-8x16-with-cache" : "ubuntu-latest",
"enabled": true,
"check-rebootstrap": level >= 1,
"check-stage3": level >= 2,
// Done as part of test-bench
//"check-stage3": level >= 2,
"test": true,
"secondary": true,
// NOTE: `test-bench` currently seems to be broken on `ubuntu-latest`

1
AGENTS.md Symbolic link
View File

@@ -0,0 +1 @@
.claude/CLAUDE.md

View File

@@ -129,6 +129,7 @@ if(USE_MIMALLOC)
# cadical, it might be worth reorganizing the directory structure.
SOURCE_DIR
"${CMAKE_BINARY_DIR}/mimalloc/src/mimalloc"
EXCLUDE_FROM_ALL
)
FetchContent_MakeAvailable(mimalloc)
endif()

View File

@@ -110,6 +110,7 @@ option(RUNTIME_STATS "RUNTIME_STATS" OFF)
option(BSYMBOLIC "Link with -Bsymbolic to reduce call overhead in shared libraries (Linux)" ON)
option(USE_GMP "USE_GMP" ON)
option(USE_MIMALLOC "use mimalloc" ON)
set(LEAN_MI_SECURE 0 CACHE STRING "Configure mimalloc memory safety mitigations (https://github.com/microsoft/mimalloc/blob/v2.2.7/include/mimalloc/types.h#L56-L60)")
# development-specific options
option(CHECK_OLEAN_VERSION "Only load .olean files compiled with the current version of Lean" OFF)
@@ -117,6 +118,7 @@ option(USE_LAKE "Use Lake instead of lean.mk for building core libs from languag
option(USE_LAKE_CACHE "Use the Lake artifact cache for stage 1 builds (requires USE_LAKE)" OFF)
set(LEAN_EXTRA_OPTS "" CACHE STRING "extra options to lean (via lake or make)")
set(LEAN_EXTRA_LAKE_OPTS "" CACHE STRING "extra options to lake")
set(LEAN_EXTRA_MAKE_OPTS "" CACHE STRING "extra options to leanmake")
set(LEANC_CC ${CMAKE_C_COMPILER} CACHE STRING "C compiler to use in `leanc`")
@@ -126,7 +128,7 @@ string(APPEND LEAN_EXTRA_OPTS " -Dbackward.do.legacy=false")
# option used by the CI to fail on warnings
option(WFAIL "Fail build if warnings are emitted by Lean" ON)
if(WFAIL MATCHES "ON")
string(APPEND LAKE_EXTRA_ARGS " --wfail")
string(APPEND LEAN_EXTRA_LAKE_OPTS " --wfail")
string(APPEND LEAN_EXTRA_MAKE_OPTS " -DwarningAsError=true")
endif()

View File

@@ -33,6 +33,7 @@ if necessary so that the middle (pivot) element is at index `hi`.
We then iterate from `k = lo` to `k = hi`, with a pointer `i` starting at `lo`, and
swapping each element which is less than the pivot to position `i`, and then incrementing `i`.
-/
@[inline]
def qpartition {n} (as : Vector α n) (lt : α → α → Bool) (lo hi : Nat) (w : lo ≤ hi := by omega)
(hlo : lo < n := by omega) (hhi : hi < n := by omega) : {m : Nat // lo ≤ m ∧ m ≤ hi} × Vector α n :=
let mid := (lo + hi) / 2
@@ -44,7 +45,7 @@ def qpartition {n} (as : Vector α n) (lt : α → α → Bool) (lo hi : Nat) (w
-- elements in `[i, k)` are greater than or equal to `pivot`,
-- elements in `[k, hi)` are unexamined,
-- while `as[hi]` is (by definition) the pivot.
let rec loop (as : Vector α n) (i k : Nat)
let rec @[specialize] loop (as : Vector α n) (i k : Nat)
(ilo : lo ≤ i := by omega) (ik : i ≤ k := by omega) (w : k ≤ hi := by omega) :=
if h : k < hi then
if lt as[k] pivot then

View File

@@ -59,9 +59,9 @@ Examples:
* `Nat.repeat f 3 a = f <| f <| f <| a`
* `Nat.repeat (· ++ "!") 4 "Hello" = "Hello!!!!"`
-/
@[specialize, expose] def repeat {α : Type u} (f : α → α) : (n : Nat) → (a : α) → α
@[specialize, expose] def «repeat» {α : Type u} (f : α → α) : (n : Nat) → (a : α) → α
| 0, a => a
| succ n, a => f (repeat f n a)
| succ n, a => f («repeat» f n a)
/--
Applies a function to a starting value the specified number of times.
@@ -1221,9 +1221,9 @@ theorem not_lt_eq (a b : Nat) : (¬ (a < b)) = (b ≤ a) :=
theorem not_gt_eq (a b : Nat) : (¬ (a > b)) = (a ≤ b) :=
not_lt_eq b a
@[csimp] theorem repeat_eq_repeatTR : @repeat = @repeatTR :=
@[csimp] theorem repeat_eq_repeatTR : @«repeat» = @repeatTR :=
funext fun α => funext fun f => funext fun n => funext fun init =>
let rec go : ∀ m n, repeatTR.loop f m (repeat f n init) = repeat f (m + n) init
let rec go : ∀ m n, repeatTR.loop f m («repeat» f n init) = «repeat» f (m + n) init
| 0, n => by simp [repeatTR.loop]
| succ m, n => by rw [repeatTR.loop, succ_add]; exact go m (succ n)
(go n 0).symm

View File

@@ -87,7 +87,7 @@ public theorem IsLinearOrder.of_ord {α : Type u} [LE α] [Ord α] [LawfulOrderO
/--
This lemma derives a `LawfulOrderLT α` instance from a property involving an `Ord α` instance.
-/
public instance LawfulOrderLT.of_ord (α : Type u) [Ord α] [LT α] [LE α] [LawfulOrderOrd α]
public theorem LawfulOrderLT.of_ord (α : Type u) [Ord α] [LT α] [LE α] [LawfulOrderOrd α]
(lt_iff_compare_eq_lt : ∀ a b : α, a < b ↔ compare a b = .lt) :
LawfulOrderLT α where
lt_iff a b := by
@@ -96,7 +96,7 @@ public instance LawfulOrderLT.of_ord (α : Type u) [Ord α] [LT α] [LE α] [Law
/--
This lemma derives a `LawfulOrderBEq α` instance from a property involving an `Ord α` instance.
-/
public instance LawfulOrderBEq.of_ord (α : Type u) [Ord α] [BEq α] [LE α] [LawfulOrderOrd α]
public theorem LawfulOrderBEq.of_ord (α : Type u) [Ord α] [BEq α] [LE α] [LawfulOrderOrd α]
(beq_iff_compare_eq_eq : ∀ a b : α, a == b ↔ compare a b = .eq) :
LawfulOrderBEq α where
beq_iff_le_and_ge := by
@@ -105,7 +105,7 @@ public instance LawfulOrderBEq.of_ord (α : Type u) [Ord α] [BEq α] [LE α] [L
/--
This lemma derives a `LawfulOrderInf α` instance from a property involving an `Ord α` instance.
-/
public instance LawfulOrderInf.of_ord (α : Type u) [Ord α] [Min α] [LE α] [LawfulOrderOrd α]
public theorem LawfulOrderInf.of_ord (α : Type u) [Ord α] [Min α] [LE α] [LawfulOrderOrd α]
(compare_min_isLE_iff : ∀ a b c : α,
(compare a (min b c)).isLE ↔ (compare a b).isLE ∧ (compare a c).isLE) :
LawfulOrderInf α where
@@ -114,7 +114,7 @@ public instance LawfulOrderInf.of_ord (α : Type u) [Ord α] [Min α] [LE α] [L
/--
This lemma derives a `LawfulOrderMin α` instance from a property involving an `Ord α` instance.
-/
public instance LawfulOrderMin.of_ord (α : Type u) [Ord α] [Min α] [LE α] [LawfulOrderOrd α]
public theorem LawfulOrderMin.of_ord (α : Type u) [Ord α] [Min α] [LE α] [LawfulOrderOrd α]
(compare_min_isLE_iff : ∀ a b c : α,
(compare a (min b c)).isLE ↔ (compare a b).isLE ∧ (compare a c).isLE)
(min_eq_or : ∀ a b : α, min a b = a ∨ min a b = b) :
@@ -125,7 +125,7 @@ public instance LawfulOrderMin.of_ord (α : Type u) [Ord α] [Min α] [LE α] [L
/--
This lemma derives a `LawfulOrderSup α` instance from a property involving an `Ord α` instance.
-/
public instance LawfulOrderSup.of_ord (α : Type u) [Ord α] [Max α] [LE α] [LawfulOrderOrd α]
public theorem LawfulOrderSup.of_ord (α : Type u) [Ord α] [Max α] [LE α] [LawfulOrderOrd α]
(compare_max_isLE_iff : ∀ a b c : α,
(compare (max a b) c).isLE ↔ (compare a c).isLE ∧ (compare b c).isLE) :
LawfulOrderSup α where
@@ -134,7 +134,7 @@ public instance LawfulOrderSup.of_ord (α : Type u) [Ord α] [Max α] [LE α] [L
/--
This lemma derives a `LawfulOrderMax α` instance from a property involving an `Ord α` instance.
-/
public instance LawfulOrderMax.of_ord (α : Type u) [Ord α] [Max α] [LE α] [LawfulOrderOrd α]
public theorem LawfulOrderMax.of_ord (α : Type u) [Ord α] [Max α] [LE α] [LawfulOrderOrd α]
(compare_max_isLE_iff : ∀ a b c : α,
(compare (max a b) c).isLE ↔ (compare a c).isLE ∧ (compare b c).isLE)
(max_eq_or : ∀ a b : α, max a b = a ∨ max a b = b) :

View File

@@ -75,6 +75,9 @@ theorem nat_eq (a b : Nat) (x y : Int) : NatCast.natCast a = x → NatCast.natCa
theorem of_nat_eq (a b : Nat) (x y : Int) : NatCast.natCast a = x → NatCast.natCast b = y → a = b → x = y := by
intro _ _; subst x y; intro; simp [*]
theorem of_natCast_eq {α : Type u} [NatCast α] (a b : Nat) (x y : α) : NatCast.natCast a = x → NatCast.natCast b = y → a = b → x = y := by
intro h₁ h₂ h; subst h; exact h₁.symm.trans h₂
theorem le_of_not_le {α} [LE α] [Std.IsLinearPreorder α]
{a b : α} : ¬ a ≤ b → b ≤ a := by
intro h

View File

@@ -10,13 +10,14 @@ public import Init.Core
public section
/-!
# Notation for `while` and `repeat` loops.
-/
namespace Lean
/-! # `repeat` and `while` notation -/
/-!
# `while` and `repeat` loop support
The parsers for `repeat`, `while`, and `repeat ... until` are
`@[builtin_doElem_parser]` definitions in `Lean.Parser.Do`.
-/
inductive Loop where
| mk
@@ -32,24 +33,23 @@ partial def Loop.forIn {β : Type u} {m : Type u → Type v} [Monad m] (_ : Loop
instance [Monad m] : ForIn m Loop Unit where
forIn := Loop.forIn
syntax "repeat " doSeq : doElem
-- The canonical parsers for `repeat`/`while`/`repeat ... until` live in `Lean.Parser.Do`
-- as `@[builtin_doElem_parser]` definitions. We register the expansion macros here so
-- they are available to `prelude` files in `Init`, which do not import `Lean.Elab`.
macro_rules
| `(doElem| repeat $seq) => `(doElem| for _ in Loop.mk do $seq)
syntax "while " ident " : " termBeforeDo " do " doSeq : doElem
| `(doElem| repeat%$tk $seq) => `(doElem| for%$tk _ in Loop.mk do $seq)
macro_rules
| `(doElem| while $h : $cond do $seq) => `(doElem| repeat if $h:ident : $cond then $seq else break)
syntax "while " termBeforeDo " do " doSeq : doElem
| `(doElem| while%$tk $h : $cond do $seq) =>
`(doElem| repeat%$tk if $h:ident : $cond then $seq else break)
macro_rules
| `(doElem| while $cond do $seq) => `(doElem| repeat if $cond then $seq else break)
syntax "repeat " doSeq ppDedent(ppLine) "until " term : doElem
| `(doElem| while%$tk $cond do $seq) =>
`(doElem| repeat%$tk if $cond then $seq else break)
macro_rules
| `(doElem| repeat $seq until $cond) => `(doElem| repeat do $seq:doSeq; if $cond then break)
| `(doElem| repeat%$tk $seq until $cond) =>
`(doElem| repeat%$tk do $seq:doSeq; if $cond then break)
end Lean

View File

@@ -207,7 +207,7 @@ def emitLns [EmitToString α] (as : List α) : EmitM Unit :=
emitLn "}"
return ret
def toHexDigit (c : Nat) : String :=
def toDigit (c : Nat) : String :=
String.singleton c.digitChar
def quoteString (s : String) : String :=
@@ -221,7 +221,11 @@ def quoteString (s : String) : String :=
else if c == '\"' then "\\\""
else if c == '?' then "\\?" -- avoid trigraphs
else if c.toNat <= 31 then
"\\x" ++ toHexDigit (c.toNat / 16) ++ toHexDigit (c.toNat % 16)
-- Use octal escapes instead of hex escapes because C hex escapes are
-- greedy: "\x01abc" would be parsed as the single escape \x01abc rather
-- than \x01 followed by "abc". Octal escapes consume at most 3 digits.
let n := c.toNat
"\\" ++ toDigit (n / 64) ++ toDigit ((n / 8) % 8) ++ toDigit (n % 8)
-- TODO(Leo): we should use `\unnnn` for escaping unicode characters.
else String.singleton c)
q;
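
The greediness pitfall can be checked by computing an escape by hand; a minimal sketch using the `toDigit` helper introduced above:

```lean
def toDigit (c : Nat) : String := String.singleton c.digitChar

-- For the control character with code 1, the digits are 1 / 64 = 0,
-- (1 / 8) % 8 = 0, and 1 % 8 = 1, giving the four-character sequence \001.
-- Unlike "\x01", appending "abc" cannot extend this escape, because C octal
-- escapes consume at most three digits.
#eval "\\" ++ toDigit (1 / 64) ++ toDigit ((1 / 8) % 8) ++ toDigit (1 % 8)
```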

View File

@@ -90,6 +90,22 @@ partial def eraseProjIncFor (nFields : Nat) (targetId : FVarId) (ds : Array (Cod
| break
if !(w == z && targetId == x) then
break
if mask[i]!.isSome then
/-
Suppose we encounter a situation like
```
let x.1 := proj[0] y
inc x.1
let x.2 := proj[0] y
inc x.2
```
The `inc x.2` will already have been removed. If we don't perform this check we will also
remove `inc x.1` and have effectively removed two refcounts while only one was legal.
-/
keep := keep.push d
keep := keep.push d'
ds := ds.pop.pop
continue
/-
Found
```

View File

@@ -15,3 +15,4 @@ public import Lean.Elab.BuiltinDo.Jump
public import Lean.Elab.BuiltinDo.Misc
public import Lean.Elab.BuiltinDo.For
public import Lean.Elab.BuiltinDo.TryCatch
public import Lean.Elab.BuiltinDo.Repeat

View File

@@ -21,7 +21,8 @@ def elabDoIdDecl (x : Ident) (xType? : Option Term) (rhs : TSyntax `doElem) (k :
let xType ← Term.elabType (xType?.getD (mkHole x))
let lctx ← getLCtx
let ctx ← read
elabDoElem rhs <| .mk (kind := kind) x.getId xType do
let ref ← getRef -- store the surrounding reference for error messages in `k`
elabDoElem rhs <| .mk (kind := kind) x.getId xType do withRef ref do
withLCtxKeepingMutVarDefs lctx ctx x.getId do
Term.addLocalVarInfo x (← getFVarFromUserName x.getId)
k

View File

@@ -23,7 +23,7 @@ open Lean.Meta
| `(doFor| for $[$_ : ]? $_:ident in $_ do $_) =>
-- This is the target form of the expander, handled by `elabDoFor` below.
Macro.throwUnsupported
| `(doFor| for $decls:doForDecl,* do $body) =>
| `(doFor| for%$tk $decls:doForDecl,* do $body) =>
let decls := decls.getElems
let `(doForDecl| $[$h? : ]? $pattern in $xs) := decls[0]! | Macro.throwUnsupported
let mut doElems := #[]
@@ -74,12 +74,13 @@ open Lean.Meta
| some ($y, s') =>
$s:ident := s'
do $body)
doElems := doElems.push (← `(doSeqItem| for $[$h? : ]? $x:ident in $xs do $body))
doElems := doElems.push (← `(doSeqItem| for%$tk $[$h? : ]? $x:ident in $xs do $body))
`(doElem| do $doElems*)
| _ => Macro.throwUnsupported
@[builtin_doElem_elab Lean.Parser.Term.doFor] def elabDoFor : DoElab := fun stx dec => do
let `(doFor| for $[$h? : ]? $x:ident in $xs do $body) := stx | throwUnsupportedSyntax
let `(doFor| for%$tk $[$h? : ]? $x:ident in $xs do $body) := stx | throwUnsupportedSyntax
let dec ← dec.ensureUnitAt tk
checkMutVarsForShadowing #[x]
let uα ← mkFreshLevelMVar
let uρ ← mkFreshLevelMVar
@@ -124,9 +125,6 @@ open Lean.Meta
defs := defs.push (mkConst ``Unit.unit)
return defs
unless ← isDefEq dec.resultType (← mkPUnit) do
logError m!"Type mismatch. `for` loops have result type {← mkPUnit}, but the rest of the `do` sequence expected {dec.resultType}."
let (preS, σ) ← mkProdMkN (← useLoopMutVars none) mi.u
let (app, p?) ← match h? with

View File

@@ -17,6 +17,7 @@ namespace Lean.Elab.Do
open Lean.Parser.Term
open Lean.Meta
open InternalSyntax in
/--
If the given syntax is a `doIf`, return an equivalent `doIf` that has an `else` but no `else if`s or
`if let`s.
@@ -25,8 +26,8 @@ If the given syntax is a `doIf`, return an equivalent `doIf` that has an `else`
match stx with
| `(doElem|if $_:doIfProp then $_ else $_) =>
Macro.throwUnsupported
| `(doElem|if $cond:doIfCond then $t $[else if $conds:doIfCond then $ts]* $[else $e?]?) => do
let mut e : Syntax ← e?.getDM `(doSeq|pure PUnit.unit)
| `(doElem|if%$tk $cond:doIfCond then $t $[else if%$tks $conds:doIfCond then $ts]* $[else $e?]?) => do
let mut e : Syntax ← e?.getDM `(doSeq| skip%$tk)
let mut eIsSeq := true
for (cond, t) in Array.zip (conds.reverse.push cond) (ts.reverse.push t) do
e ← if eIsSeq then pure e else `(doSeq|$(e):doElem)

View File

@@ -88,17 +88,18 @@ private def checkLetConfigInDo (config : Term.LetConfig) : DoElabM Unit := do
throwError "`+generalize` is not supported in `do` blocks"
partial def elabDoLetOrReassign (config : Term.LetConfig) (letOrReassign : LetOrReassign) (decl : TSyntax ``letDecl)
(dec : DoElemCont) : DoElabM Expr := do
(tk : Syntax) (dec : DoElemCont) : DoElabM Expr := do
checkLetConfigInDo config
let vars ← getLetDeclVars decl
letOrReassign.checkMutVars vars
let dec ← dec.ensureUnitAt tk
-- Some decl preprocessing on the patterns and expected types:
let decl ← pushTypeIntoReassignment letOrReassign decl
let mγ ← mkMonadicType (← read).doBlockResultType
match decl with
| `(letDecl| $decl:letEqnsDecl) =>
let declNew ← `(letDecl| $(← liftMacroM <| Term.expandLetEqnsDecl decl):letIdDecl)
return Term.withMacroExpansion decl declNew <| elabDoLetOrReassign config letOrReassign declNew dec
return Term.withMacroExpansion decl declNew <| elabDoLetOrReassign config letOrReassign declNew tk dec
| `(letDecl| $pattern:term $[: $xType?]? := $rhs) =>
let rhs ← match xType? with | some xType => `(($rhs : $xType)) | none => pure rhs
let contElab : DoElabM Expr := elabWithReassignments letOrReassign vars dec.continueWithUnit
@@ -162,10 +163,11 @@ partial def elabDoLetOrReassign (config : Term.LetConfig) (letOrReassign : LetOr
mkLetFVars #[x, h'] body (usedLetOnly := config.usedOnly) (generalizeNondepLet := false)
| _ => throwUnsupportedSyntax
def elabDoArrow (letOrReassign : LetOrReassign) (stx : TSyntax [``doIdDecl, ``doPatDecl]) (dec : DoElemCont) : DoElabM Expr := do
def elabDoArrow (letOrReassign : LetOrReassign) (stx : TSyntax [``doIdDecl, ``doPatDecl]) (tk : Syntax) (dec : DoElemCont) : DoElabM Expr := do
match stx with
| `(doIdDecl| $x:ident $[: $xType?]? $rhs) =>
letOrReassign.checkMutVars #[x]
let dec ← dec.ensureUnitAt tk
-- For plain variable reassignment, we know the expected type of the reassigned variable and
-- propagate it eagerly via type ascription if the user hasn't provided one themselves:
let xType? ← match letOrReassign, xType? with
@@ -177,6 +179,7 @@ def elabDoArrow (letOrReassign : LetOrReassign) (stx : TSyntax [``doIdDecl, ``do
(kind := dec.kind)
| `(doPatDecl| _%$pattern $[: $patType?]? $rhs) =>
let x := mkIdentFrom pattern (← mkFreshUserName `__x)
let dec ← dec.ensureUnitAt tk
elabDoIdDecl x patType? rhs dec.continueWithUnit (kind := dec.kind)
| `(doPatDecl| $pattern:term $[: $patType?]? $rhs $[| $otherwise? $(rest?)?]?) =>
let rest? := rest?.join
@@ -205,17 +208,18 @@ private def getLetConfigAndCheckMut (letConfigStx : TSyntax ``Parser.Term.letCon
Term.mkLetConfig letConfigStx initConfig
@[builtin_doElem_elab Lean.Parser.Term.doLet] def elabDoLet : DoElab := fun stx dec => do
let `(doLet| let $[mut%$mutTk?]? $config:letConfig $decl:letDecl) := stx | throwUnsupportedSyntax
let `(doLet| let%$tk $[mut%$mutTk?]? $config:letConfig $decl:letDecl) := stx | throwUnsupportedSyntax
let config ← getLetConfigAndCheckMut config mutTk?
elabDoLetOrReassign config (.let mutTk?) decl dec
elabDoLetOrReassign config (.let mutTk?) decl tk dec
@[builtin_doElem_elab Lean.Parser.Term.doHave] def elabDoHave : DoElab := fun stx dec => do
let `(doHave| have $config:letConfig $decl:letDecl) := stx | throwUnsupportedSyntax
let `(doHave| have%$tk $config:letConfig $decl:letDecl) := stx | throwUnsupportedSyntax
let config ← Term.mkLetConfig config { nondep := true }
elabDoLetOrReassign config .have decl dec
elabDoLetOrReassign config .have decl tk dec
@[builtin_doElem_elab Lean.Parser.Term.doLetRec] def elabDoLetRec : DoElab := fun stx dec => do
let `(doLetRec| let rec $decls:letRecDecls) := stx | throwUnsupportedSyntax
let `(doLetRec| let%$tk rec $decls:letRecDecls) := stx | throwUnsupportedSyntax
let dec ← dec.ensureUnitAt tk
let vars ← getLetRecDeclsVars decls
let mγ ← mkMonadicType (← read).doBlockResultType
doElabToSyntax m!"let rec body of group {vars}" dec.continueWithUnit fun body => do
@@ -227,13 +231,13 @@ private def getLetConfigAndCheckMut (letConfigStx : TSyntax ``Parser.Term.letCon
@[builtin_doElem_elab Lean.Parser.Term.doReassign] def elabDoReassign : DoElab := fun stx dec => do
-- def doReassign := letIdDeclNoBinders <|> letPatDecl
match stx with
| `(doReassign| $x:ident $[: $xType?]? := $rhs) =>
| `(doReassign| $x:ident $[: $xType?]? :=%$tk $rhs) =>
let decl : TSyntax ``letIdDecl ← `(letIdDecl| $x:ident $[: $xType?]? := $rhs)
let decl : TSyntax ``letDecl := mkNode ``letDecl #[decl]
elabDoLetOrReassign {} .reassign decl dec
elabDoLetOrReassign {} .reassign decl tk dec
| `(doReassign| $decl:letPatDecl) =>
let decl : TSyntax ``letDecl := mkNode ``letDecl #[decl]
elabDoLetOrReassign {} .reassign decl dec
elabDoLetOrReassign {} .reassign decl decl dec
| _ => throwUnsupportedSyntax
@[builtin_doElem_elab Lean.Parser.Term.doLetElse] def elabDoLetElse : DoElab := fun stx dec => do
@@ -255,17 +259,17 @@ private def getLetConfigAndCheckMut (letConfigStx : TSyntax ``Parser.Term.letCon
elabDoElem (← `(doElem| match $rhs:term with | $pattern => $body:doSeqIndent | _ => $otherwise:doSeqIndent)) dec
@[builtin_doElem_elab Lean.Parser.Term.doLetArrow] def elabDoLetArrow : DoElab := fun stx dec => do
let `(doLetArrow| let $[mut%$mutTk?]? $cfg:letConfig $decl) := stx | throwUnsupportedSyntax
let `(doLetArrow| let%$tk $[mut%$mutTk?]? $cfg:letConfig $decl) := stx | throwUnsupportedSyntax
let config ← getLetConfigAndCheckMut cfg mutTk?
checkLetConfigInDo config
if config.nondep || config.usedOnly || config.zeta || config.eq?.isSome then
throwErrorAt cfg "configuration options are not supported with `←`"
elabDoArrow (.let mutTk?) decl dec
elabDoArrow (.let mutTk?) decl tk dec
@[builtin_doElem_elab Lean.Parser.Term.doReassignArrow] def elabDoReassignArrow : DoElab := fun stx dec => do
match stx with
| `(doReassignArrow| $decl:doIdDecl) =>
elabDoArrow .reassign decl dec
elabDoArrow .reassign decl decl dec
| `(doReassignArrow| $decl:doPatDecl) =>
elabDoArrow .reassign decl dec
elabDoArrow .reassign decl decl dec
| _ => throwUnsupportedSyntax

View File

@@ -16,6 +16,12 @@ namespace Lean.Elab.Do
open Lean.Parser.Term
open Lean.Meta
open InternalSyntax in
@[builtin_doElem_elab Lean.Parser.Term.InternalSyntax.doSkip] def elabDoSkip : DoElab := fun stx dec => do
let `(doSkip| skip%$tk) := stx | throwUnsupportedSyntax
let dec ← dec.ensureUnitAt tk
dec.continueWithUnit
@[builtin_doElem_elab Lean.Parser.Term.doExpr] def elabDoExpr : DoElab := fun stx dec => do
let `(doExpr| $e:term) := stx | throwUnsupportedSyntax
let mα ← mkMonadicType dec.resultType
@@ -26,24 +32,28 @@ open Lean.Meta
let `(doNested| do $doSeq) := stx | throwUnsupportedSyntax
elabDoSeq doSeq.raw dec
open InternalSyntax in
@[builtin_doElem_elab Lean.Parser.Term.doUnless] def elabDoUnless : DoElab := fun stx dec => do
let `(doUnless| unless $cond do $body) := stx | throwUnsupportedSyntax
elabDoElem (← `(doElem| if $cond then pure PUnit.unit else $body)) dec
let `(doUnless| unless%$tk $cond do $body) := stx | throwUnsupportedSyntax
elabDoElem (← `(doElem| if $cond then skip%$tk else $body)) dec
@[builtin_doElem_elab Lean.Parser.Term.doDbgTrace] def elabDoDbgTrace : DoElab := fun stx dec => do
let `(doDbgTrace| dbg_trace $msg:term) := stx | throwUnsupportedSyntax
let `(doDbgTrace| dbg_trace%$tk $msg:term) := stx | throwUnsupportedSyntax
let mγ ← mkMonadicType (← read).doBlockResultType
let dec ← dec.ensureUnitAt tk
doElabToSyntax "dbg_trace body" dec.continueWithUnit fun body => do
Term.elabTerm (← `(dbg_trace $msg; $body)) mγ
@[builtin_doElem_elab Lean.Parser.Term.doAssert] def elabDoAssert : DoElab := fun stx dec => do
let `(doAssert| assert! $cond) := stx | throwUnsupportedSyntax
let `(doAssert| assert!%$tk $cond) := stx | throwUnsupportedSyntax
let mγ ← mkMonadicType (← read).doBlockResultType
let dec ← dec.ensureUnitAt tk
doElabToSyntax "assert! body" dec.continueWithUnit fun body => do
Term.elabTerm (← `(assert! $cond; $body)) mγ
@[builtin_doElem_elab Lean.Parser.Term.doDebugAssert] def elabDoDebugAssert : DoElab := fun stx dec => do
let `(doDebugAssert| debug_assert! $cond) := stx | throwUnsupportedSyntax
let `(doDebugAssert| debug_assert!%$tk $cond) := stx | throwUnsupportedSyntax
let mγ ← mkMonadicType (← read).doBlockResultType
let dec ← dec.ensureUnitAt tk
doElabToSyntax "debug_assert! body" dec.continueWithUnit fun body => do
Term.elabTerm (← `(debug_assert! $cond; $body)) mγ

View File

@@ -0,0 +1,44 @@
/-
Copyright (c) 2026 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Sebastian Graf
-/
module
prelude
public import Lean.Elab.BuiltinDo.Basic
meta import Lean.Parser.Do
import Lean.Elab.BuiltinDo.For
public section
namespace Lean.Elab.Do
open Lean.Parser.Term
/--
Builtin do-element elaborator for `repeat` (syntax kind `Lean.Parser.Term.doRepeat`).
Expands to `for _ in Loop.mk do ...`. A follow-up change will extend this
elaborator to choose between `Loop.mk` and a well-founded `Repeat.mk` based on a
configuration option.
-/
@[builtin_doElem_elab Lean.Parser.Term.doRepeat] def elabDoRepeat : DoElab := fun stx dec => do
let `(doElem| repeat%$tk $seq) := stx | throwUnsupportedSyntax
let expanded ← `(doElem| for%$tk _ in Loop.mk do $seq)
Term.withMacroExpansion stx expanded <|
withRef expanded <| elabDoElem expanded dec
@[builtin_macro Lean.Parser.Term.doWhileH] def expandDoWhileH : Macro
| `(doElem| while%$tk $h : $cond do $seq) => `(doElem| repeat%$tk if $h:ident : $cond then $seq else break)
| _ => Macro.throwUnsupported
@[builtin_macro Lean.Parser.Term.doWhile] def expandDoWhile : Macro
| `(doElem| while%$tk $cond do $seq) => `(doElem| repeat%$tk if $cond then $seq else break)
| _ => Macro.throwUnsupported
@[builtin_macro Lean.Parser.Term.doRepeatUntil] def expandDoRepeatUntil : Macro
| `(doElem| repeat%$tk $seq until $cond) => `(doElem| repeat%$tk do $seq:doSeq; if $cond then break)
| _ => Macro.throwUnsupported
end Lean.Elab.Do
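
The `while h :` form registered above binds a proof of the condition inside the loop body; a hedged usage sketch (the function is illustrative, not from this diff):

```lean
-- `h : i < xs.size` is in scope inside the body, so `xs[i]` needs no
-- separate bounds proof.
def firstIndexOf (xs : Array Nat) (t : Nat) : Option Nat := Id.run do
  let mut i := 0
  while h : i < xs.size do
    if xs[i] == t then return some i
    i := i + 1
  return none
```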

View File

@@ -220,7 +220,7 @@ def processDefDeriving (view : DerivingClassView) (decl : Expr) (isNoncomputable
instName ← liftMacroM <| mkUnusedBaseName instName
if isPrivateName declName then
instName := mkPrivateName env instName
let isMeta := (← read).declName?.any (isMarkedMeta (← getEnv))
let isMeta := (← read).isMetaSection || isMarkedMeta (← getEnv) declName
let inst ← if backward.inferInstanceAs.wrap.get (← getOptions) then
withDeclNameForAuxNaming instName <| withNewMCtxDepth <|
wrapInstance result.instVal result.instType
@@ -255,11 +255,12 @@ def processDefDeriving (view : DerivingClassView) (decl : Expr) (isNoncomputable
logInfoAt cmdRef m!"Try this: {newText}"
throwError "failed to derive instance because it depends on \
`{.ofConstName noncompRef}`, which is noncomputable"
let isMeta := (← read).isMetaSection || isMarkedMeta (← getEnv) declName
if isNoncomputable || (← read).isNoncomputableSection then
addDecl <| Declaration.defnDecl decl
modifyEnv (addNoncomputable · instName)
else
addAndCompile <| Declaration.defnDecl decl
addAndCompile (Declaration.defnDecl decl) (markMeta := isMeta)
trace[Elab.Deriving] "Derived instance `{.ofConstName instName}`"
-- For Prop-typed instances (theorems), skip `implicit_reducible` since reducibility hints are
-- irrelevant for theorems. This matches the behavior of the handwritten `instance` command

View File

@@ -374,14 +374,60 @@ def withLCtxKeepingMutVarDefs (oldLCtx : LocalContext) (oldCtx : Context) (resul
mutVarDefs := oldMutVarDefs
}) k
def mkMonadicResultTypeMismatchError (contType : Expr) (elementType : Expr) : MessageData :=
m!"Type mismatch. The `do` element has monadic result type{indentExpr elementType}\n\
but the rest of the `do` block has monadic result type{indentExpr contType}"
/--
Given a continuation `dec`, a reference `ref`, and an element result type `elementType`, returns a
continuation derived from `dec` with result type `elementType`.
If `dec` already has result type `elementType`, simply returns `dec`.
Otherwise, an error is logged and a new continuation is returned that calls `dec` with `sorry` as a
result. The error is reported at `ref`.
-/
def DoElemCont.ensureHasTypeAt (dec : DoElemCont) (ref : Syntax) (elementType : Expr) : DoElabM DoElemCont := do
if isDefEqGuarded dec.resultType elementType then
return dec
let errMessage := mkMonadicResultTypeMismatchError dec.resultType elementType
unless ( readThe Term.Context).errToSorry do
throwErrorAt ref errMessage
logErrorAt ref errMessage
return {
resultName := mkFreshUserName `__r
resultType := elementType
k := do
mapLetDecl dec.resultName dec.resultType (← mkSorry dec.resultType true)
(nondep := true) (kind := .implDetail) fun _ => dec.k
kind := dec.kind
}
/--
Given a continuation `dec` and a reference `ref`, returns a continuation derived from `dec` with result type `PUnit`.
If `dec` already has result type `PUnit`, simply returns `dec`. Otherwise, an error is logged and a
new continuation is returned that calls `dec` with `sorry` as a result. The error is reported at `ref`.
-/
def DoElemCont.ensureUnitAt (dec : DoElemCont) (ref : Syntax) : DoElabM DoElemCont := do
dec.ensureHasTypeAt ref (← mkPUnit)
/--
Given a continuation `dec`, returns a continuation derived from `dec` with result type `PUnit`.
If `dec` already has result type `PUnit`, simply returns `dec`. Otherwise, an error is logged and a
new continuation is returned that calls `dec` with `sorry` as a result.
-/
def DoElemCont.ensureUnit (dec : DoElemCont) : DoElabM DoElemCont := do
dec.ensureUnitAt (← getRef)
/--
Return `$e >>= fun ($dec.resultName : $dec.resultType) => $(← dec.k)`, cancelling
the bind if `$(← dec.k)` is `pure $dec.resultName` or `e` is some `pure` computation.
-/
def DoElemCont.mkBindUnlessPure (dec : DoElemCont) (e : Expr) : DoElabM Expr := do
-- let eResultTy ← mkFreshResultType
-- let e ← Term.ensureHasType (← mkMonadicType eResultTy) e
-- let dec ← dec.ensureHasType eResultTy
let x := dec.resultName
let eResultTy := dec.resultType
let k := dec.k
let eResultTy := dec.resultType
-- The .ofBinderName below is mainly to interpret `__do_lift` binders as implementation details.
let declKind := .ofBinderName x
let kResultTy ← mkFreshResultType `kResultTy
@@ -421,9 +467,8 @@ Return `let $k.resultName : PUnit := PUnit.unit; $(← k.k)`, ensuring that the
is `PUnit` and then immediately zeta-reduce the `let`.
-/
def DoElemCont.continueWithUnit (dec : DoElemCont) : DoElabM Expr := do
let unit ← mkPUnitUnit
discard <| Term.ensureHasType dec.resultType unit
mapLetDeclZeta dec.resultName (← mkPUnit) unit (nondep := true) (kind := .ofBinderName dec.resultName) fun _ =>
let dec ← dec.ensureUnit
mapLetDeclZeta dec.resultName (← mkPUnit) (← mkPUnitUnit) (nondep := true) (kind := .ofBinderName dec.resultName) fun _ =>
dec.k
/-- Elaborate the `DoElemCont` with the `deadCode` flag set to `deadSyntactically` to emit warnings. -/
@@ -604,6 +649,7 @@ def enterFinally (resultType : Expr) (k : DoElabM Expr) : DoElabM Expr := do
/-- Extracts `MonadInfo` and monadic result type `α` from the expected type of a `do` block `m α`. -/
private partial def extractMonadInfo (expectedType? : Option Expr) : Term.TermElabM (MonadInfo × Expr) := do
let some expectedType := expectedType? | mkUnknownMonadResult
let expectedType ← instantiateMVars expectedType
let extractStep? (type : Expr) : Term.TermElabM (Option (MonadInfo × Expr)) := do
let .app m resultType := type.consumeMData | return none
unless isType resultType do return none

View File

@@ -79,6 +79,7 @@ builtin_initialize controlInfoElemAttribute : KeyedDeclsAttribute ControlInfoHan
namespace InferControlInfo
open InternalSyntax in
mutual
partial def ofElem (stx : TSyntax `doElem) : TermElabM ControlInfo := do
@@ -128,6 +129,7 @@ partial def ofElem (stx : TSyntax `doElem) : TermElabM ControlInfo := do
return thenInfo.alternative info
| `(doElem| unless $_ do $elseSeq) =>
ControlInfo.alternative {} <$> ofSeq elseSeq
-- For/Repeat
| `(doElem| for $[$[$_ :]? $_ in $_],* do $bodySeq) =>
let info ← ofSeq bodySeq
return { info with -- keep only reassigns and earlyReturn
@@ -135,6 +137,13 @@ partial def ofElem (stx : TSyntax `doElem) : TermElabM ControlInfo := do
continues := false,
breaks := false
}
| `(doRepeat| repeat $bodySeq) =>
let info ← ofSeq bodySeq
return { info with
numRegularExits := if info.breaks then 1 else 0,
continues := false,
breaks := false
}
-- Try
| `(doElem| try $trySeq:doSeq $[$catches]* $[finally $finSeq?]?) =>
let mut info ← ofSeq trySeq
@@ -152,6 +161,7 @@ partial def ofElem (stx : TSyntax `doElem) : TermElabM ControlInfo := do
let finInfo ← ofOptionSeq finSeq?
return info.sequence finInfo
-- Misc
| `(doElem| skip) => return .pure
| `(doElem| dbg_trace $_) => return .pure
| `(doElem| assert! $_) => return .pure
| `(doElem| debug_assert! $_) => return .pure

View File

@@ -1782,6 +1782,10 @@ mutual
doIfToCode doElem doElems
else if k == ``Parser.Term.doUnless then
doUnlessToCode doElem doElems
else if k == ``Parser.Term.doRepeat then
let seq := doElem[1]
let expanded ← `(doElem| for _ in Loop.mk do $seq)
doSeqToCode (expanded :: doElems)
else if k == ``Parser.Term.doFor then withFreshMacroScope do
doForToCode doElem doElems
else if k == ``Parser.Term.doMatch then
@@ -1815,6 +1819,13 @@ mutual
return mkTerminalAction term
else
return mkSeq term (← doSeqToCode doElems)
else if k == ``Parser.Term.InternalSyntax.doSkip then
-- In the legacy elaborator, `skip` is treated as `pure PUnit.unit`.
let term ← withRef doElem `(pure PUnit.unit)
if doElems.isEmpty then
return mkTerminalAction term
else
return mkSeq term (← doSeqToCode doElems)
else
throwError "unexpected do-element of kind {doElem.getKind}:\n{doElem}"
end

View File

@@ -364,8 +364,9 @@ def elabIdbgTerm : TermElab := fun stx expectedType? => do
@[builtin_doElem_elab Lean.Parser.Term.doIdbg]
def elabDoIdbg : DoElab := fun stx dec => do
let `(Lean.Parser.Term.doIdbg| idbg $e) := stx | throwUnsupportedSyntax
let `(Lean.Parser.Term.doIdbg| idbg%$tk $e) := stx | throwUnsupportedSyntax
let mγ ← mkMonadicType (← read).doBlockResultType
let dec ← dec.ensureUnitAt tk
doElabToSyntax "idbg body" dec.continueWithUnit fun body => do
elabIdbgCore (e := e) (body := body) (ref := stx) mγ

View File

@@ -73,6 +73,8 @@ private def inductiveSyntaxToView (modifiers : Modifiers) (decl : Syntax) (isCoi
throwError "Constructor cannot be `protected` because it is in a `private` inductive datatype"
checkValidCtorModifier ctorModifiers
let ctorName := ctor.getIdAt 3
if ctorName.hasMacroScopes && isCoinductive then
throwError "Coinductive predicates are not allowed inside of macro scopes"
let ctorName := declName ++ ctorName
let ctorName ← withRef ctor[3] <| applyVisibility ctorModifiers ctorName
let (binders, type?) := expandOptDeclSig ctor[4]

View File

@@ -175,6 +175,7 @@ where
return !(← allChildrenLt a b)
lpo (a b : Expr) : MetaM Bool := do
checkSystem "Lean.Meta.acLt"
-- Case 1: `a < b` if for some child `b_i` of `b`, we have `b_i >= a`
if (← someChildGe b a) then
return true

View File

@@ -69,12 +69,16 @@ def decLevel (u : Level) : MetaM Level := do
/-- This method is useful for inferring universe level parameters for function that take arguments such as `{α : Type u}`.
Recall that `Type u` is `Sort (u+1)` in Lean. Thus, given `α`, we must infer its universe level,
and then decrement 1 to obtain `u`. -/
instantiate and normalize it, and then decrement 1 to obtain `u`. -/
def getDecLevel (type : Expr) : MetaM Level := do
decLevel (← getLevel type)
let l ← getLevel type
let l ← normalizeLevel l
decLevel l
def getDecLevel? (type : Expr) : MetaM (Option Level) := do
decLevel? (← getLevel type)
let l ← getLevel type
let l ← normalizeLevel l
decLevel? l
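
The decrement step relies on `Type u` being notation for `Sort (u+1)`; a one-line check of that identity:

```lean
universe u
-- `Type u` unfolds to `Sort (u+1)`, so the level of a `Type u` is `u+1`
-- and `decLevel` recovers `u`.
example : Type u = Sort (u + 1) := rfl
```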
builtin_initialize
registerTraceClass `Meta.isLevelDefEq.step

View File

@@ -229,8 +229,33 @@ private partial def computeSynthOrder (inst : Expr) (projInfo? : Option Projecti
return synthed
def checkImpossibleInstance (c : Expr) : MetaM Unit := do
let cTy ← inferType c
forallTelescopeReducing cTy fun args ty => do
let argTys ← args.mapM inferType
let impossibleArgs ← args.zipIdx.filterMapM fun (arg, i) => do
let fv := arg.fvarId!
if (← fv.getDecl).binderInfo.isInstImplicit then return none
if ty.containsFVar fv then return none
if argTys[i+1:].any (·.containsFVar fv) then return none
return some m!"{arg} : {← inferType arg}"
if impossibleArgs.isEmpty then return
let impossibleArgs := MessageData.joinSep impossibleArgs.toList ", "
throwError m!"Instance {c} has arguments "
++ impossibleArgs
++ " that are impossible to infer. Those arguments are not instance-implicit and do not appear in another instance-implicit argument or the return type."
def checkNonClassInstance (declName : Name) (c : Expr) : MetaM Unit := do
let type ← inferType c
forallTelescopeReducing type fun _ target => do
unless (← isClass? target).isSome do
unless target.isSorry do
throwError m!"instance `{declName}` target `{target}` is not a type class."
def addInstance (declName : Name) (attrKind : AttributeKind) (prio : Nat) : MetaM Unit := do
let c ← mkConstWithLevelParams declName
checkImpossibleInstance c
checkNonClassInstance declName c
let keys ← mkInstanceKey c
let status ← getReducibilityStatus declName
unless status matches .reducible | .implicitReducible do
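
As an illustration of what the new check rejects, consider a hypothetical instance with a plain argument that instance synthesis could never determine:

```lean
-- `n` is not instance-implicit, does not occur in the target `Inhabited Bool`,
-- and does not occur in any later argument type, so it can never be inferred
-- during synthesis; `checkImpossibleInstance` reports exactly this situation.
instance badInst (n : Nat) : Inhabited Bool := ⟨n == 0⟩
```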

View File

@@ -38,7 +38,9 @@ and assigning `?m := max ?n v`
private def solveSelfMax (mvarId : LMVarId) (v : Level) : MetaM Unit := do
assert! v.isMax
let n ← mkFreshLevelMVar
assignLevelMVar mvarId <| mkMaxArgsDiff mvarId v n
let v' := mkMaxArgsDiff mvarId v n
trace[Meta.isLevelDefEq.step] "solveSelfMax: {mkLevelMVar mvarId} := {v'}"
assignLevelMVar mvarId v'
/--
Returns true if `v` is `max u ?m` (or variant). That is, we solve `u =?= max u ?m` as `?m := u`.
@@ -53,6 +55,7 @@ private def tryApproxSelfMax (u v : Level) : MetaM Bool := do
where
solve (v' : Level) (mvarId : LMVarId) : MetaM Bool := do
if u == v' then
trace[Meta.isLevelDefEq.step] "tryApproxSelfMax {mkLevelMVar mvarId} := {u}"
assignLevelMVar mvarId u
return true
else
@@ -71,8 +74,14 @@ private def tryApproxMaxMax (u v : Level) : MetaM Bool := do
| _, _ => return false
where
solve (u₁ u₂ v' : Level) (mvarId : LMVarId) : MetaM Bool := do
if u₁ == v' then assignLevelMVar mvarId u₂; return true
else if u₂ == v' then assignLevelMVar mvarId u₁; return true
if u₁ == v' then
trace[Meta.isLevelDefEq.step] "tryApproxMaxMax {mkLevelMVar mvarId} := {u₂}"
assignLevelMVar mvarId u₂
return true
else if u₂ == v' then
trace[Meta.isLevelDefEq.step] "tryApproxMaxMax {mkLevelMVar mvarId} := {u₁}"
assignLevelMVar mvarId u₁
return true
else return false
private def postponeIsLevelDefEq (lhs : Level) (rhs : Level) : MetaM Unit := do
@@ -97,9 +106,11 @@ mutual
else if (← isMVarWithGreaterDepth v mvarId) then
-- If both `u` and `v` are both metavariables, but depth of v is greater, then we assign `v := u`.
-- This can only happen when levelAssignDepth is set to a smaller value than depth (e.g. during TC synthesis)
trace[Meta.isLevelDefEq.step] "{v} := {u}"
assignLevelMVar v.mvarId! u
return LBool.true
else if !u.occurs v then
trace[Meta.isLevelDefEq.step] "{u} := {v}"
assignLevelMVar u.mvarId! v
return LBool.true
else if v.isMax && !strictOccursMax u v then
@@ -133,8 +144,9 @@ mutual
@[export lean_is_level_def_eq]
partial def isLevelDefEqAuxImpl : Level → Level → MetaM Bool
| Level.succ lhs, Level.succ rhs => isLevelDefEqAux lhs rhs
| lhs, rhs =>
withTraceNode `Meta.isLevelDefEq (fun _ => return m!"{lhs} =?= {rhs}") do
| lhs, rhs => do
withTraceNodeBefore `Meta.isLevelDefEq (fun _ =>
withOptions (·.set `pp.instantiateMVars false) do addMessageContext m!"{lhs} =?= {rhs}") do
if lhs.getLevelOffset == rhs.getLevelOffset then
return lhs.getOffset == rhs.getOffset
else

View File

@@ -9,19 +9,37 @@ public import Lean.Meta.Sym.ExprPtr
public import Lean.Meta.Basic
import Lean.Meta.Transform
namespace Lean.Meta.Sym
/--
Returns `true` if `e` contains a loose bound variable with index in `[0, n)`
This function assumes `n` is small. If this becomes a bottleneck, we should
implement a version of `lean_expr_has_loose_bvar` that checks the range in one traversal.
-/
def hasLooseBVarsInRange (e : Expr) (n : Nat) : Bool :=
e.hasLooseBVars && go n
where
go : Nat → Bool
| 0 => false
| i+1 => e.hasLooseBVar i || go i
/--
Checks if `body` is eta-expanded with `n` applications: `f (.bvar (n-1)) ... (.bvar 0)`.
Returns `f` if so and `f` has no loose bvars; otherwise returns `default`.
Returns `f` if so and `f` has no loose bvars with indices in the range `[0, n)`; otherwise returns `default`.
- `n`: number of remaining applications to check
- `i`: expected bvar index (starts at 0, increments with each application)
- `default`: returned when not eta-reducible (enables pointer equality check)
-/
def etaReduceAux (body : Expr) (n : Nat) (i : Nat) (default : Expr) : Expr := Id.run do
match n with
| 0 => if body.hasLooseBVars then default else body
| n+1 =>
let .app f (.bvar j) := body | default
if j == i then etaReduceAux f n (i+1) default else default
def etaReduceAux (body : Expr) (n : Nat) (i : Nat) (default : Expr) : Expr :=
go body n i
where
go (body : Expr) (m : Nat) (i : Nat) : Expr := Id.run do
match m with
| 0 =>
if hasLooseBVarsInRange body n then default
else body.lowerLooseBVars n n
| m+1 =>
let .app f (.bvar j) := body | default
if j == i then go f m (i+1) else default
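
`etaReduceAux` implements, at the `Expr` level, the definitional eta law of Lean 4; in surface syntax the law reads:

```lean
-- Eta is definitional in Lean 4: `fun x => f x` and `f` are judgmentally
-- equal, so both identities hold by `rfl`.
example (f : Nat → Nat) : (fun x => f x) = f := rfl
example (g : Nat → Nat → Nat) : (fun x y => g x y) = g := rfl
```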
/--
If `e` is of the form `(fun x₁ ... xₙ => f x₁ ... xₙ)` and `f` does not contain `x₁`, ..., `xₙ`,

View File

@@ -48,6 +48,8 @@ def introCore (mvarId : MVarId) (max : Nat) (names : Array Name) : SymM (Array F
assignDelayedMVar auxMVar.mvarId! fvars mvarId'
mvarId.assign val
let finalize (lctx : LocalContext) (localInsts : LocalInstances) (fvars : Array Expr) (type : Expr) : SymM (Array Expr × MVarId) := do
if fvars.isEmpty then
return (#[], mvarId)
let type ← instantiateRevS type fvars
let mvar' ← mkFreshExprMVarAt lctx localInsts type .syntheticOpaque mvarDecl.userName
let mvarId' := mvar'.mvarId!

View File

@@ -148,11 +148,14 @@ def propagatePending : OrderM Unit := do
- `h₁ : ↑ue' = ue`
- `h₂ : ↑ve' = ve`
- `h : ue = ve`
**Note**: We currently only support `Nat`. Thus `↑a` is actually
`NatCast.natCast a`. If we decide to support arbitrary semirings
in this module, we must adjust this code.
**Note**: We currently only support `Nat` originals. Thus `↑a` is actually
`NatCast.natCast a`. The lemma `nat_eq` is specialized to `Int`, so we
only invoke it when the cast destination is `Int`. For other types (e.g.
`Rat`), `pushEq ue ve h` above is sufficient and `grind` core can derive
the `Nat` equality via `norm_cast`/cast injectivity if needed.
-/
pushEq ue' ve' <| mkApp7 (mkConst ``Grind.Order.nat_eq) ue' ve' ue ve h₁ h₂ h
if (← inferType ue) == Int.mkType then
pushEq ue' ve' <| mkApp7 (mkConst ``Grind.Order.nat_eq) ue' ve' ue ve h₁ h₂ h
where
/--
If `e` is an auxiliary term used to represent some term `a`, returns
@@ -343,7 +346,7 @@ def getStructIdOf? (e : Expr) : GoalM (Option Nat) := do
return (← get').exprToStructId.find? { expr := e }
def propagateIneq (e : Expr) : GoalM Unit := do
if let some (e', he) := (← get').termMap.find? { expr := e } then
if let some { e := e', h := he, .. } := (← get').termMap.find? { expr := e } then
go e' (some he)
else
go e none
@@ -369,20 +372,27 @@ builtin_grind_propagator propagateLT ↓LT.lt := propagateIneq
public def processNewEq (a b : Expr) : GoalM Unit := do
unless isSameExpr a b do
let h ← mkEqProof a b
if let some (a', h₁) ← getAuxTerm? a then
let some (b', h₂) ← getAuxTerm? b | return ()
if let some { e := a', h := h₁, α } ← getAuxTerm? a then
let some { e := b', h := h₂, .. } ← getAuxTerm? b | return ()
/-
We have
- `h : a = b`
- `h₁ : ↑a = a'`
- `h₂ : ↑b = b'`
where `a'` and `b'` are `NatCast.natCast α inst _` for some type `α`.
-/
let h := mkApp7 (mkConst ``Grind.Order.of_nat_eq) a b a' b' h₁ h₂ h
go a' b' h
if α == Int.mkType then
let h := mkApp7 (mkConst ``Grind.Order.of_nat_eq) a b a' b' h₁ h₂ h
go a' b' h
else
let u ← getDecLevel α
let inst ← synthInstance (mkApp (mkConst ``NatCast [u]) α)
let h := mkApp9 (mkConst ``Grind.Order.of_natCast_eq [u]) α inst a b a' b' h₁ h₂ h
go a' b' h
else
go a b h
where
getAuxTerm? (e : Expr) : GoalM (Option (Expr × Expr)) := do
getAuxTerm? (e : Expr) : GoalM (Option TermMapEntry) := do
return (← get').termMap.find? { expr := e }
go (a b h : Expr) : GoalM Unit := do

@@ -166,9 +166,9 @@ def setStructId (e : Expr) : OrderM Unit := do
exprToStructId := s.exprToStructId.insert { expr := e } structId
}
def updateTermMap (e eNew h : Expr) : GoalM Unit := do
def updateTermMap (e eNew h α : Expr) : GoalM Unit := do
modify' fun s => { s with
termMap := s.termMap.insert { expr := e } (eNew, h)
termMap := s.termMap.insert { expr := e } { e := eNew, h, α }
termMapInv := s.termMapInv.insert { expr := eNew } (e, h)
}
@@ -198,9 +198,9 @@ where
getOriginal? (e : Expr) : GoalM (Option Expr) := do
if let some (e', _) := (← get').termMapInv.find? { expr := e } then
return some e'
let_expr NatCast.natCast _ _ a := e | return none
let_expr NatCast.natCast α _ a := e | return none
if (← alreadyInternalized a) then
updateTermMap a e (← mkEqRefl e)
updateTermMap a e (← mkEqRefl e) α
return some a
else
return none
@@ -290,7 +290,7 @@ def internalizeTerm (e : Expr) : OrderM Unit := do
open Arith.Cutsat in
def adaptNat (e : Expr) : GoalM Expr := do
if let some (eNew, _) := (← get').termMap.find? { expr := e } then
if let some { e := eNew, .. } := (← get').termMap.find? { expr := e } then
return eNew
else match_expr e with
| LE.le _ _ lhs rhs => adaptCnstr lhs rhs (isLT := false)
@@ -307,12 +307,12 @@ where
let h := mkApp6
(mkConst (if isLT then ``Nat.ToInt.lt_eq else ``Nat.ToInt.le_eq))
lhs rhs lhs' rhs' h₁ h₂
updateTermMap e eNew h
updateTermMap e eNew h (← getIntExpr)
return eNew
adaptTerm : GoalM Expr := do
let (eNew, h) ← natToInt e
updateTermMap e eNew h
updateTermMap e eNew h (← getIntExpr)
return eNew
def adapt (α : Expr) (e : Expr) : GoalM (Expr × Expr) := do

@@ -128,6 +128,13 @@ structure Struct where
propagate : List ToPropagate := []
deriving Inhabited
/-- Entry/Value for the map `termMap` in `State` -/
structure TermMapEntry where
e : Expr
h : Expr
α : Expr
deriving Inhabited
/-- State for all order types detected by `grind`. -/
structure State where
/-- Order structures detected. -/
@@ -143,7 +150,7 @@ structure State where
Example: given `x y : Nat`, `x ≤ y + 1` is mapped to `Int.ofNat x ≤ Int.ofNat y + 1`, and proof
of equivalence.
-/
termMap : PHashMap ExprPtr (Expr × Expr) := {}
termMap : PHashMap ExprPtr TermMapEntry := {}
/-- `termMap` inverse -/
termMapInv : PHashMap ExprPtr (Expr × Expr) := {}
deriving Inhabited

@@ -51,7 +51,7 @@ register_builtin_option debug.tactic.simp.checkDefEqAttr : Bool := {
}
register_builtin_option warning.simp.varHead : Bool := {
defValue := false
defValue := true
descr := "If true, warns when the head symbol of the left-hand side of a `@[simp]` theorem \
is a variable. Such lemmas are tried on every simp step, which can be slow."
}
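Since `warning.simp.varHead` now defaults to `true`, a `@[simp]` lemma whose left-hand side is headed by a variable warns out of the box. A contrived sketch of a lemma that trips the check:

```lean
-- The LHS `f x` is headed by the local variable `f`, so simp would have
-- to try this rewrite at every step; the option flags such lemmas.
@[simp] theorem apply_self_eq (f : Nat → Nat) (x : Nat) : f x = f x := rfl
```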

@@ -66,6 +66,14 @@ def notFollowedByRedefinedTermToken :=
"do" <|> "dbg_trace" <|> "idbg" <|> "assert!" <|> "debug_assert!" <|> "for" <|> "unless" <|> "return" <|> symbol "try")
"token at 'do' element"
namespace InternalSyntax
/--
Internal syntax used in the `if` and `unless` elaborators. Behaves like `pure PUnit.unit` but
uses `()` if possible and gives better error messages.
-/
scoped syntax (name := doSkip) "skip" : doElem
end InternalSyntax
@[builtin_doElem_parser] def doLet := leading_parser
"let " >> optional "mut " >> letConfig >> letDecl
@[builtin_doElem_parser] def doLetElse := leading_parser withPosition <|
@@ -265,6 +273,15 @@ with debug assertions enabled (see the `debugAssertions` option).
@[builtin_doElem_parser] def doDebugAssert := leading_parser:leadPrec
"debug_assert! " >> termParser
@[builtin_doElem_parser] def doRepeat := leading_parser
"repeat " >> doSeq
@[builtin_doElem_parser] def doWhileH := leading_parser
"while " >> ident >> " : " >> withForbidden "do" termParser >> " do " >> doSeq
@[builtin_doElem_parser] def doWhile := leading_parser
"while " >> withForbidden "do" termParser >> " do " >> doSeq
@[builtin_doElem_parser] def doRepeatUntil := leading_parser
"repeat " >> doSeq >> ppDedent ppLine >> "until " >> termParser
/-
We use `notFollowedBy` to avoid counterintuitive behavior.
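The `doRepeatUntil` production above parses a post-tested loop: the `doSeq` body runs at least once, then repeats until the `until` condition holds. A sketch under that reading (identifiers are illustrative):

```lean
def drain (n : Nat) : IO Unit := do
  let mut i := n
  repeat
    IO.println s!"remaining: {i}"
    i := i - 1
  until i == 0
```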

@@ -484,6 +484,7 @@ open SubExpr (Pos PosMap)
open Delaborator (OptionsPerPos topDownAnalyze DelabM getPPOption)
def delabLevel (l : Level) (prec : Nat) : MetaM Syntax.Level := do
let l ← if getPPInstantiateMVars (← getOptions) then instantiateLevelMVars l else pure l
let mvars := getPPMVarsLevels (← getOptions)
return Level.quote l prec (mvars := mvars) (lIndex? := (← getMCtx).findLevelIndex?)

@@ -117,12 +117,13 @@ private partial def isSyntheticTacticCompletion
(cmdStx : Syntax)
: Bool := Id.run do
let hoverFilePos := fileMap.toPosition hoverPos
go hoverFilePos cmdStx 0
go hoverFilePos cmdStx 0 none
where
go
(hoverFilePos : Position)
(stx : Syntax)
(leadingWs : Nat)
(hoverFilePos : Position)
(stx : Syntax)
(leadingWs : Nat)
(leadingTokenTailPos? : Option String.Pos.Raw)
: Bool := Id.run do
match stx.getPos?, stx.getTailPos? with
| some startPos, some endPos =>
@@ -132,8 +133,9 @@ where
if ! isCursorInCompletionRange then
return false
let mut wsBeforeArg := leadingWs
let mut lastArgTailPos? := leadingTokenTailPos?
for arg in stx.getArgs do
if go hoverFilePos arg wsBeforeArg then
if go hoverFilePos arg wsBeforeArg lastArgTailPos? then
return true
-- We must account for the whitespace before an argument because the syntax nodes we use
-- to identify tactic blocks only start *after* the whitespace following a `by`, and we
@@ -141,7 +143,12 @@ where
-- This method of computing whitespace assumes that there are no syntax nodes without tokens
-- after `by` and before the first proper tactic syntax.
wsBeforeArg := arg.getTrailingSize
return isCompletionInEmptyTacticBlock stx
-- Track the tail position of the most recent preceding sibling that has a position so
-- that empty tactic blocks (which lack positions) can locate their opening token (e.g.
-- the `by` keyword) for indentation checking. The tail position lets us distinguish
-- cursors before and after the opener on the opener's line.
lastArgTailPos? := arg.getTailPos? <|> lastArgTailPos?
return isCompletionInEmptyTacticBlock stx lastArgTailPos?
|| isCompletionAfterSemicolon stx
|| isCompletionOnTacticBlockIndentation hoverFilePos stx
| _, _ =>
@@ -150,7 +157,7 @@ where
-- tactic blocks always occur within other syntax with ranges that let us narrow down the
-- search to the degree that we can be sure that the cursor is indeed in this empty tactic
-- block.
return isCompletionInEmptyTacticBlock stx
return isCompletionInEmptyTacticBlock stx leadingTokenTailPos?
isCompletionOnTacticBlockIndentation
(hoverFilePos : Position)
@@ -190,8 +197,47 @@ where
else
none
isCompletionInEmptyTacticBlock (stx : Syntax) : Bool :=
isCursorInProperWhitespace fileMap hoverPos && isEmptyTacticBlock stx
isCompletionInEmptyTacticBlock (stx : Syntax) (leadingTokenTailPos? : Option String.Pos.Raw) : Bool := Id.run do
if ! isCursorInProperWhitespace fileMap hoverPos then
return false
if ! isEmptyTacticBlock stx then
return false
-- Bracketed tactic blocks `{ ... }` are delimited by the braces themselves, so tactics
-- inserted in an empty bracketed block can appear at any column between the braces
-- (`withoutPosition` disables indentation constraints inside `tacticSeqBracketed`).
if stx.getKind == ``Parser.Tactic.tacticSeqBracketed then
let some openEndPos := stx[0].getTailPos?
| return false
let some closeStartPos := stx[2].getPos?
| return false
return openEndPos.byteIdx <= hoverPos.byteIdx && hoverPos.byteIdx <= closeStartPos.byteIdx
return isAtExpectedTacticIndentation leadingTokenTailPos?
-- After an empty tactic opener like `by`, tactics on a subsequent line must be inserted at an
-- increased indentation level (two spaces past the indentation of the line containing the
-- opener token). We mirror that here so that tactic completions are not offered in positions
-- where a tactic could not actually be inserted. On the same line as the opener, completions
-- are allowed only in the trailing whitespace past the opener — cursors earlier on the line
-- belong to the surrounding term, not to the tactic block.
isAtExpectedTacticIndentation (leadingTokenTailPos? : Option String.Pos.Raw) : Bool := Id.run do
let some tokenTailPos := leadingTokenTailPos?
| return true
let hoverFilePos := fileMap.toPosition hoverPos
let tokenTailFilePos := fileMap.toPosition tokenTailPos
if hoverFilePos.line == tokenTailFilePos.line then
return hoverPos.byteIdx >= tokenTailPos.byteIdx
let expectedColumn := countLeadingSpaces (fileMap.lineStart tokenTailFilePos.line) + 2
return hoverFilePos.column == expectedColumn
countLeadingSpaces (pos : String.Pos.Raw) : Nat := Id.run do
let mut p := pos
let mut n : Nat := 0
while ! p.atEnd fileMap.source do
if p.get fileMap.source != ' ' then
break
p := p.next fileMap.source
n := n + 1
return n
isEmptyTacticBlock (stx : Syntax) : Bool :=
stx.getKind == ``Parser.Tactic.tacticSeq && isEmpty stx
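The `+2` in `isAtExpectedTacticIndentation` mirrors the standard layout of a tactic block whose opener ends a line; for example:

```lean
theorem t : True := by
  trivial  -- the `by` line starts at column 0, so tactics (and tactic
           -- completions) are expected at column 0 + 2
```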

@@ -18,7 +18,7 @@ open Std.DHashMap.Internal
namespace Std.DHashMap.Raw
instance instDecidableEquiv {α : Type u} {β : α → Type v} [BEq α] [LawfulBEq α] [Hashable α] [∀ k, BEq (β k)] [∀ k, LawfulBEq (β k)] {m₁ m₂ : Raw α β} (h₁ : m₁.WF) (h₂ : m₂.WF) : Decidable (m₁ ~m m₂) :=
def instDecidableEquiv {α : Type u} {β : α → Type v} [BEq α] [LawfulBEq α] [Hashable α] [∀ k, BEq (β k)] [∀ k, LawfulBEq (β k)] {m₁ m₂ : Raw α β} (h₁ : m₁.WF) (h₂ : m₂.WF) : Decidable (m₁ ~m m₂) :=
Raw₀.decidableEquiv ⟨m₁, h₁.size_buckets_pos⟩ ⟨m₂, h₂.size_buckets_pos⟩ h₁ h₂
end Std.DHashMap.Raw

@@ -19,7 +19,7 @@ open Std.DTreeMap.Internal.Impl
namespace Std.DTreeMap.Raw
instance instDecidableEquiv {α : Type u} {β : α → Type v} {cmp : α → α → Ordering} [TransCmp cmp] [LawfulEqCmp cmp] [∀ k, BEq (β k)] [∀ k, LawfulBEq (β k)] {t₁ t₂ : Raw α β cmp} (h₁ : t₁.WF) (h₂ : t₂.WF) : Decidable (t₁ ~m t₂) :=
def instDecidableEquiv {α : Type u} {β : α → Type v} {cmp : α → α → Ordering} [TransCmp cmp] [LawfulEqCmp cmp] [∀ k, BEq (β k)] [∀ k, LawfulBEq (β k)] {t₁ t₂ : Raw α β cmp} (h₁ : t₁.WF) (h₂ : t₂.WF) : Decidable (t₁ ~m t₂) :=
let _ : Ord α := ⟨cmp⟩;
let _ : Decidable (t₁.inner ~m t₂.inner) := decidableEquiv t₁.1 t₂.1 h₁ h₂;
decidable_of_iff _ ⟨fun h => ⟨h⟩, fun h => h.1⟩

@@ -19,7 +19,7 @@ open Std.DHashMap.Raw
namespace Std.HashMap.Raw
instance instDecidableEquiv {α : Type u} {β : Type v} [BEq α] [LawfulBEq α] [Hashable α] [BEq β] [LawfulBEq β] {m₁ m₂ : Raw α β} (h₁ : m₁.WF) (h₂ : m₂.WF) : Decidable (m₁ ~m m₂) :=
def instDecidableEquiv {α : Type u} {β : Type v} [BEq α] [LawfulBEq α] [Hashable α] [BEq β] [LawfulBEq β] {m₁ m₂ : Raw α β} (h₁ : m₁.WF) (h₂ : m₂.WF) : Decidable (m₁ ~m m₂) :=
let _ : Decidable (m₁.1 ~m m₂.1) := DHashMap.Raw.instDecidableEquiv h₁.out h₂.out;
decidable_of_iff _ ⟨fun h => ⟨h⟩, fun h => h.1⟩

@@ -19,7 +19,7 @@ open Std.HashMap.Raw
namespace Std.HashSet.Raw
instance instDecidableEquiv {α : Type u} [BEq α] [LawfulBEq α] [Hashable α] {m₁ m₂ : Raw α} (h₁ : m₁.WF) (h₂ : m₂.WF) : Decidable (m₁ ~m m₂) :=
def instDecidableEquiv {α : Type u} [BEq α] [LawfulBEq α] [Hashable α] {m₁ m₂ : Raw α} (h₁ : m₁.WF) (h₂ : m₂.WF) : Decidable (m₁ ~m m₂) :=
let _ : Decidable (m₁.1 ~m m₂.1) := HashMap.Raw.instDecidableEquiv h₁.out h₂.out;
decidable_of_iff _ ⟨fun h => ⟨h⟩, fun h => h.1⟩

@@ -20,7 +20,7 @@ open Std.DTreeMap.Raw
namespace Std.TreeMap.Raw
instance instDecidableEquiv {α : Type u} {β : Type v} {cmp : α → α → Ordering} [TransCmp cmp] [LawfulEqCmp cmp] [BEq β] [LawfulBEq β] {t₁ t₂ : Raw α β cmp} (h₁ : t₁.WF) (h₂ : t₂.WF) : Decidable (t₁ ~m t₂) :=
def instDecidableEquiv {α : Type u} {β : Type v} {cmp : α → α → Ordering} [TransCmp cmp] [LawfulEqCmp cmp] [BEq β] [LawfulBEq β] {t₁ t₂ : Raw α β cmp} (h₁ : t₁.WF) (h₂ : t₂.WF) : Decidable (t₁ ~m t₂) :=
let _ : Ord α := ⟨cmp⟩;
let _ : Decidable (t₁.inner ~m t₂.inner) := DTreeMap.Raw.instDecidableEquiv h₁ h₂;
decidable_of_iff _ ⟨fun h => ⟨h⟩, fun h => h.1⟩

@@ -20,7 +20,7 @@ open Std.TreeMap.Raw
namespace Std.TreeSet.Raw
instance instDecidableEquiv {α : Type u} {cmp : α → α → Ordering} [TransCmp cmp] [LawfulEqCmp cmp] {t₁ t₂ : Raw α cmp} (h₁ : t₁.WF) (h₂ : t₂.WF) : Decidable (t₁ ~m t₂) :=
def instDecidableEquiv {α : Type u} {cmp : α → α → Ordering} [TransCmp cmp] [LawfulEqCmp cmp] {t₁ t₂ : Raw α cmp} (h₁ : t₁.WF) (h₂ : t₂.WF) : Decidable (t₁ ~m t₂) :=
let _ : Ord α := ⟨cmp⟩;
let _ : Decidable (t₁.inner ~m t₂.inner) := TreeMap.Raw.instDecidableEquiv h₁ h₂;
decidable_of_iff _ ⟨fun h => ⟨h⟩, fun h => h.1⟩

@@ -235,7 +235,7 @@ public def checkHashUpToDate
: JobM Bool := (·.isUpToDate) <$> checkHashUpToDate' info depTrace depHash oldTrace
/--
**Ror internal use only.**
**For internal use only.**
Checks whether `info` is up-to-date with the trace.
If so, replays the log of the trace if available.
-/
@@ -271,20 +271,24 @@ Returns `true` if the saved trace exists and its hash matches `inputHash`.
If up-to-date, replays the saved log from the trace and sets the current
build action to `replay`. Otherwise, if the log is empty and trace is synthetic,
or if the trace is not up-to-date, the build action will be set to `fetch`.
or if the trace is not up-to-date, the build action will be set to `reuse`.
-/
public def SavedTrace.replayOrFetchIfUpToDate (inputHash : Hash) (self : SavedTrace) : JobM Bool := do
public def SavedTrace.replayCachedIfUpToDate (inputHash : Hash) (self : SavedTrace) : JobM Bool := do
if let .ok data := self then
if data.depHash == inputHash then
if data.synthetic && data.log.isEmpty then
updateAction .fetch
updateAction .reuse
else
updateAction .replay
data.log.replay
return true
updateAction .fetch
updateAction .reuse
return false
@[deprecated replayCachedIfUpToDate (since := "2026-04-15")]
public abbrev SavedTrace.replayOrFetchIfUpToDate (inputHash : Hash) (self : SavedTrace) : JobM Bool := do
self.replayCachedIfUpToDate inputHash
/-- **For internal use only.** -/
public class ToOutputJson (α : Type u) where
toOutputJson (arts : α) : Json
@@ -684,7 +688,7 @@ public def buildArtifactUnlessUpToDate
let fetchArt? restore := do
let some (art : XArtifact exe) ← getArtifacts? inputHash savedTrace pkg
| return none
unless (← savedTrace.replayOrFetchIfUpToDate inputHash) do
unless (← savedTrace.replayCachedIfUpToDate inputHash) do
removeFileIfExists file
writeFetchTrace traceFile inputHash (toJson art.descr)
if restore then

@@ -29,11 +29,18 @@ namespace Lake
public inductive JobAction
/-- No information about this job's action is available. -/
| unknown
/-- Tried to replay a cached build action (set by `buildFileUnlessUpToDate`) -/
/-- Tried to reuse a cached build (e.g., can be set by `replayCachedIfUpToDate`). -/
| reuse
/-- Tried to replay a completed build action (e.g., can be set by `replayIfUpToDate`). -/
| replay
/-- Tried to fetch a build from a store (can be set by `buildUnlessUpToDate?`) -/
/-- Tried to unpack a build from an archive (e.g., unpacking a module `ltar`). -/
| unpack
/--
Tried to fetch a build from a remote store (e.g., set when downloading an artifact
on-demand from a cache service in `buildArtifactUnlessUpToDate`).
-/
| fetch
/-- Tried to perform a build action (set by `buildUnlessUpToDate?`) -/
/-- Tried to perform a build action (e.g., set by `buildAction`). -/
| build
deriving Inhabited, Repr, DecidableEq, Ord
@@ -45,11 +52,13 @@ public instance : Min JobAction := minOfLe
public instance : Max JobAction := maxOfLe
public def merge (a b : JobAction) : JobAction :=
max a b
max a b -- inlines `max`
public def verb (failed : Bool) : JobAction → String
public def verb (failed : Bool) : (self : JobAction) → String
| .unknown => if failed then "Running" else "Ran"
| .reuse => if failed then "Reusing" else "Reused"
| .replay => if failed then "Replaying" else "Replayed"
| .unpack => if failed then "Unpacking" else "Unpacked"
| .fetch => if failed then "Fetching" else "Fetched"
| .build => if failed then "Building" else "Built"
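Given the match arms above, `verb` reads off directly; e.g. (assuming the `Lake` namespace is open):

```lean
#eval JobAction.build.verb (failed := false)  -- "Built"
#eval JobAction.reuse.verb (failed := true)   -- "Reusing"
```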

@@ -900,8 +900,9 @@ where
let inputHash := (← getTrace).hash
let some ltarOrArts ← getArtifacts? inputHash savedTrace mod.pkg
| return .inr savedTrace
match (ltarOrArts : ModuleOutputs) with
match (ltarOrArts : ModuleOutputs) with
| .ltar ltar =>
updateAction .unpack
mod.clearOutputArtifacts
mod.unpackLtar ltar.path
-- Note: This branch implies that only the ltar output is (validly) cached.
@@ -919,7 +920,7 @@ where
else
return .inr savedTrace
| .arts arts =>
unless (← savedTrace.replayOrFetchIfUpToDate inputHash) do
unless (← savedTrace.replayCachedIfUpToDate inputHash) do
mod.clearOutputArtifacts
writeFetchTrace mod.traceFile inputHash (toJson arts.descrs)
let arts

@@ -25,12 +25,8 @@ namespace Lake
open Lean (Name)
/-- Fetch the package's direct dependencies. -/
def Package.recFetchDeps (self : Package) : FetchM (Job (Array Package)) := ensureJob do
(pure ·) <$> self.depConfigs.mapM fun cfg => do
let some dep ← findPackageByName? cfg.name
| error s!"{self.prettyName}: package not found for dependency '{cfg.name}' \
(this is likely a bug in Lake)"
return dep
def Package.recFetchDeps (self : Package) : FetchM (Job (Array Package)) := do
return Job.pure self.depPkgs
/-- The `PackageFacetConfig` for the builtin `depsFacet`. -/
public def Package.depsFacetConfig : PackageFacetConfig depsFacet :=
@@ -38,10 +34,7 @@ public def Package.depsFacetConfig : PackageFacetConfig depsFacet :=
/-- Compute a topological ordering of the package's transitive dependencies. -/
def Package.recComputeTransDeps (self : Package) : FetchM (Job (Array Package)) := ensureJob do
(pure ·.toArray) <$> self.depConfigs.foldlM (init := OrdPackageSet.empty) fun deps cfg => do
let some dep ← findPackageByName? cfg.name
| error s!"{self.prettyName}: package not found for dependency '{cfg.name}' \
(this is likely a bug in Lake)"
(pure ·.toArray) <$> self.depPkgs.foldlM (init := OrdPackageSet.empty) fun deps dep => do
let depDeps ← (← fetch <| dep.transDeps).await
return depDeps.foldl (·.insert ·) deps |>.insert dep
@@ -153,7 +146,7 @@ def Package.fetchBuildArchive
let upToDate ← buildUnlessUpToDate? (action := .fetch) archiveFile depTrace traceFile do
download url archiveFile headers
unless upToDate && (← self.buildDir.pathExists) do
updateAction .fetch
updateAction .unpack
untar archiveFile self.buildDir
@[inline]

@@ -210,7 +210,7 @@ def mkMonitorContext (cfg : BuildConfig) (jobs : JobQueue) : BaseIO MonitorConte
let failLv := cfg.failLv
let isVerbose := cfg.verbosity = .verbose
let showProgress := cfg.showProgress
let minAction := if isVerbose then .unknown else .fetch
let minAction := if isVerbose then .unknown else .unpack
let showOptional := isVerbose
let showTime := isVerbose || !useAnsi
let updateFrequency := 100

@@ -9,7 +9,7 @@ prelude
public import Lean.Data.Json
import Init.Data.Nat.Fold
meta import Init.Data.Nat.Fold
import Lake.Util.String
public import Lake.Util.String
public import Init.Data.String.Search
public import Init.Data.String.Extra
import Init.Data.Option.Coe
@@ -141,8 +141,8 @@ public def ofHex? (s : String) : Option Hash :=
if s.utf8ByteSize = 16 && isHex s then ofHex s else none
/-- Returns the hash as 16-digit lowercase hex string. -/
public def hex (self : Hash) : String :=
lpad (String.ofList <| Nat.toDigits 16 self.val.toNat) '0' 16
@[inline] public def hex (self : Hash) : String :=
lowerHexUInt64 self.val
/-- Parse a hash from a string of decimal digits. -/
public def ofDecimal? (s : String) : Option Hash :=

@@ -69,7 +69,7 @@ public structure LakeOptions where
scope? : Option CacheServiceScope := none
platform? : Option CachePlatform := none
toolchain? : Option CacheToolchain := none
rev? : Option String := none
rev? : Option GitRev := none
maxRevs : Nat := 100
shake : Shake.Args := {}
@@ -563,7 +563,7 @@ private def computePackageRev (pkgDir : FilePath) : CliStateM String := do
repo.getHeadRevision
private def putCore
(rev : String) (outputs : FilePath) (artDir : FilePath)
(rev : GitRev) (outputs : FilePath) (artDir : FilePath)
(service : CacheService) (scope : CacheServiceScope)
(platform := CachePlatform.none) (toolchain := CacheToolchain.none)
: LoggerIO Unit := do

@@ -7,6 +7,7 @@ module
prelude
import Init.Control.Do
public import Lake.Util.Git
public import Lake.Util.Log
public import Lake.Util.Version
public import Lake.Config.Artifact
@@ -469,7 +470,7 @@ public def readOutputs? (cache : Cache) (scope : String) (inputHash : Hash) : Lo
cache.dir / "revisions"
/-- Returns path to the input-to-output mappings of a downloaded package revision. -/
@[inline] public def revisionPath (cache : Cache) (scope : String) (rev : String) : FilePath :=
@[inline] public def revisionPath (cache : Cache) (scope : String) (rev : GitRev) : FilePath :=
cache.revisionDir / scope / s!"{rev}.jsonl"
end Cache
@@ -942,7 +943,7 @@ public def uploadArtifacts
public def mapContentType : String := "application/vnd.reservoir.outputs+json-lines"
def s3RevisionUrl
(rev : String) (service : CacheService) (scope : CacheServiceScope)
(rev : GitRev) (service : CacheService) (scope : CacheServiceScope)
(platform := CachePlatform.none) (toolchain := CacheToolchain.none)
: String :=
match scope.impl with
@@ -956,7 +957,7 @@ def s3RevisionUrl
return s!"{url}/{rev}.jsonl"
public def revisionUrl
(rev : String) (service : CacheService) (scope : CacheServiceScope)
(rev : GitRev) (service : CacheService) (scope : CacheServiceScope)
(platform := CachePlatform.none) (toolchain := CacheToolchain.none)
: String :=
if service.isReservoir then Id.run do
@@ -974,7 +975,7 @@ public def revisionUrl
service.s3RevisionUrl rev scope platform toolchain
public def downloadRevisionOutputs?
(rev : String) (cache : Cache) (service : CacheService)
(rev : GitRev) (cache : Cache) (service : CacheService)
(localScope : String) (remoteScope : CacheServiceScope)
(platform := CachePlatform.none) (toolchain := CacheToolchain.none) (force := false)
: LoggerIO (Option CacheMap) := do
@@ -998,7 +999,7 @@ public def downloadRevisionOutputs?
CacheMap.load path platform.isNone
public def uploadRevisionOutputs
(rev : String) (outputs : FilePath) (service : CacheService) (scope : CacheServiceScope)
(rev : GitRev) (outputs : FilePath) (service : CacheService) (scope : CacheServiceScope)
(platform := CachePlatform.none) (toolchain := CacheToolchain.none)
: LoggerIO Unit := do
let url := service.s3RevisionUrl rev scope platform toolchain

@@ -9,6 +9,7 @@ prelude
public import Init.Dynamic
public import Init.System.FilePath
public import Lean.Data.NameMap.Basic
public import Lake.Util.Git
import Init.Data.ToString.Name
import Init.Data.ToString.Macro
@@ -30,7 +31,7 @@ public inductive DependencySrc where
/- A package located at a fixed path relative to the dependent package's directory. -/
| path (dir : FilePath)
/- A package cloned from a Git repository available at a fixed Git `url`. -/
| git (url : String) (rev : Option String) (subDir : Option FilePath)
| git (url : String) (rev : Option GitRev) (subDir : Option FilePath)
deriving Inhabited, Repr
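On the lakefile side, a `git` dependency source with an optional `GitRev` and `subDir` corresponds to the usual `require ... from git` syntax; a sketch with placeholder names and URL:

```lean
-- The string after `@` is the `GitRev` (a commit SHA, branch, or tag);
-- `/ "pkg"` selects a subdirectory of the repository.
require foo from git "https://example.com/foo.git" @ "main" / "pkg"
```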
/--

@@ -52,6 +52,8 @@ public structure Package where
remoteUrl : String
/-- Dependency configurations for the package. -/
depConfigs : Array Dependency := #[]
/-- **For internal use only.** Resolved direct dependencies of the package. -/
depPkgs : Array Package := #[]
/-- Target configurations in the order declared by the package. -/
targetDecls : Array (PConfigDecl keyName) := #[]
/-- Name-declaration map of target configurations in the package. -/

@@ -33,14 +33,12 @@ public def computeLakeCache (pkg : Package) (lakeEnv : Lake.Env) : Cache :=
lakeEnv.lakeCache?.getD pkg.lakeDir / "cache"
public structure Workspace.Raw : Type where
/-- The root package of the workspace. -/
root : Package
/-- The detected {lean}`Lake.Env` of the workspace. -/
lakeEnv : Lake.Env
/-- The Lake configuration from the system configuration file. -/
lakeConfig : LoadedLakeConfig
/-- The Lake cache. -/
lakeCache : Cache := private_decl% computeLakeCache root lakeEnv
lakeCache : Cache
/--
The CLI arguments Lake was run with.
Used by {lit}`lake update` to perform a restart of Lake on a toolchain update.
@@ -59,6 +57,7 @@ public structure Workspace.Raw : Type where
deriving Nonempty
public structure Workspace.Raw.WF (ws : Workspace.Raw) : Prop where
size_packages_pos : 0 < ws.packages.size
packages_wsIdx : ∀ (h : i < ws.packages.size), (ws.packages[i]'h).wsIdx = i
/-- A Lake workspace -- the top-level package directory. -/
@@ -67,8 +66,13 @@ public structure Workspace extends raw : Workspace.Raw, wf : raw.WF
public instance : Nonempty Workspace := .intro {
lakeEnv := default
lakeConfig := Classical.ofNonempty
root := Classical.ofNonempty
packages_wsIdx h := by simp at h
lakeCache := Classical.ofNonempty
packages := #[{(Classical.ofNonempty : Package) with wsIdx := 0}]
size_packages_pos := by simp
packages_wsIdx {i} h := by
cases i with
| zero => simp
| succ => simp at h
}
public hydrate_opaque_type OpaqueWorkspace Workspace
@@ -85,9 +89,13 @@ public def Package.defaultTargetRoots (self : Package) : Array Lean.Name :=
namespace Workspace
/-- The root package of the workspace. -/
@[inline] public def root (self : Workspace) : Package :=
self.packages[0]'self.size_packages_pos
/-- **For internal use.** Whether this workspace is Lean itself. -/
@[inline] def bootstrap (ws : Workspace) : Bool :=
ws.root.bootstrap
@[inline] def bootstrap (self : Workspace) : Bool :=
self.root.bootstrap
/-- The path to the workspace's directory (i.e., the directory of the root package). -/
@[inline] public def dir (self : Workspace) : FilePath :=
@@ -193,12 +201,18 @@ This is configured through {lit}`cache.service` entries in the system Lake confi
{self with
packages := self.packages.push pkg
packageMap := self.packageMap.insert pkg.keyName pkg
size_packages_pos := by simp
packages_wsIdx {i} i_lt := by
cases Nat.lt_add_one_iff_lt_or_eq.mp <| Array.size_push .. ▸ i_lt with
| inl i_lt => simpa [Array.getElem_push_lt i_lt] using self.packages_wsIdx i_lt
| inr i_eq => simpa [i_eq] using h
}
/-- **For internal use only.** -/
public theorem packages_addPackage' :
(addPackage' pkg ws h).packages = ws.packages.push pkg
:= by rfl
/-- Add a package to the workspace. -/
@[inline] public def addPackage (pkg : Package) (self : Workspace) : Workspace :=
self.addPackage' {pkg with wsIdx := self.packages.size} rfl
@@ -278,6 +292,11 @@ public def findTargetDecl? (name : Name) (self : Workspace) : Option ((pkg : Pac
@[inline] public def addFacetConfig {name} (cfg : FacetConfig name) (self : Workspace) : Workspace :=
{self with facetConfigs := self.facetConfigs.insert cfg}
/-- **For internal use only.** -/
public theorem packages_addFacetConfig :
(addFacetConfig cfg ws).packages = ws.packages
:= by rfl
/-- Try to find a facet configuration in the workspace with the given name. -/
@[inline] public def findFacetConfig? (name : Name) (self : Workspace) : Option (FacetConfig name) :=
self.facetConfigs.get? name

@@ -8,6 +8,7 @@ module
prelude
public import Lake.Util.Version
public import Lake.Config.Defaults
public import Lake.Util.Git
import Lake.Util.Error
public import Lake.Util.FilePath
import Lake.Util.JsonObject
@@ -75,8 +76,8 @@ public inductive PackageEntrySrc
/-- A remote Git package. -/
| git
(url : String)
(rev : String)
(inputRev? : Option String)
(rev : GitRev)
(inputRev? : Option GitRev)
(subDir? : Option FilePath)
deriving Inhabited

@@ -13,6 +13,8 @@ import Lake.Util.Git
import Lake.Util.IO
import Lake.Reservoir
set_option doc.verso true
open System Lean
/-! # Dependency Materialization
@@ -23,9 +25,12 @@ or resolve a local path dependency.
namespace Lake
/-- Update the Git package in `repo` to `rev` if not already at it. -/
/--
Update the Git package in {lean}`repo` to the revision {lean}`rev?` if not already at it.
If no revision is specified (i.e., {lean}`rev? = none`), the latest {lit}`master` is used.
-/
def updateGitPkg
(name : String) (repo : GitRepo) (rev? : Option String)
(name : String) (repo : GitRepo) (rev? : Option GitRev)
: LoggerIO PUnit := do
let rev ← repo.findRemoteRevision rev?
if (← repo.getHeadRevision) = rev then
@@ -40,9 +45,9 @@ def updateGitPkg
-- so stale ones from the previous revision cause incorrect trace computations.
repo.clean
/-- Clone the Git package as `repo`. -/
/-- Clone the Git package as {lean}`repo`. -/
def cloneGitPkg
(name : String) (repo : GitRepo) (url : String) (rev? : Option String)
(name : String) (repo : GitRepo) (url : String) (rev? : Option GitRev)
: LoggerIO PUnit := do
logInfo s!"{name}: cloning {url}"
repo.clone url
@@ -52,9 +57,9 @@ def cloneGitPkg
repo.checkoutDetach rev
/--
Update the Git repository from `url` in `repo` to `rev?`.
If `repo` is already from `url`, just checkout the new revision.
Otherwise, delete the local repository and clone a fresh copy from `url`.
Update the Git repository from {lean}`url` in {lean}`repo` to {lean}`rev?`.
If {lean}`repo` is already from {lean}`url`, just checkout the new revision.
Otherwise, delete the local repository and clone a fresh copy from {lean}`url`.
-/
def updateGitRepo
(name : String) (repo : GitRepo) (url : String) (rev? : Option String)
@@ -75,8 +80,9 @@ def updateGitRepo
IO.FS.removeDirAll repo.dir
cloneGitPkg name repo url rev?
/--
Materialize the Git repository from `url` into `repo` at `rev?`.
Materialize the Git repository from {lean}`url` into {lean}`repo` at {lean}`rev?`.
Clone it if no local copy exists, otherwise update it.
-/
def materializeGitRepo
@@ -114,11 +120,11 @@ namespace MaterializedDep
@[inline] public def scope (self : MaterializedDep) : String :=
self.manifestEntry.scope
/-- Path to the dependency's manifest file (relative to `relPkgDir`). -/
/-- Path to the dependency's manifest file (relative to {lean}`relPkgDir`). -/
@[inline] public def relManifestFile? (self : MaterializedDep) : Option FilePath :=
self.manifestEntry.manifestFile?
/-- Path to the dependency's manifest file (relative to `relPkgDir`). -/
/-- Path to the dependency's manifest file (relative to {lean}`relPkgDir`). -/
@[inline] public def relManifestFile (self : MaterializedDep) : FilePath :=
self.relManifestFile?.getD defaultManifestFile
@@ -126,7 +132,7 @@ namespace MaterializedDep
@[inline] public def manifestFile (self : MaterializedDep) : FilePath :=
self.pkgDir / self.relManifestFile
/-- Path to the dependency's configuration file (relative to `relPkgDir`). -/
/-- Path to the dependency's configuration file (relative to {lean}`relPkgDir`). -/
@[inline] public def relConfigFile (self : MaterializedDep) : FilePath :=
self.manifestEntry.configFile
@@ -143,7 +149,7 @@ end MaterializedDep
inductive InputVer
| none
| git (rev : String)
| git (rev : GitRev)
| ver (ver : VerRange)
def pkgNotIndexed (scope name : String) (ver : InputVer) : String :=

@@ -15,6 +15,8 @@ import Lake.Build.Topological
import Lake.Load.Materialize
import Lake.Load.Lean.Eval
import Lake.Load.Package
import Init.Data.Range.Polymorphic.Iterators
import Init.TacticsExtra
open System Lean
@@ -45,23 +47,29 @@ namespace Lake
def Workspace.addFacetDecls (decls : Array FacetDecl) (self : Workspace) : Workspace :=
decls.foldl (·.addFacetConfig ·.config) self
theorem Workspace.packages_addFacetDecls :
(addFacetDecls decls ws).packages = ws.packages
:= by
simp only [addFacetDecls]
apply Array.foldl_induction (fun _ (s : Workspace) => s.packages = ws.packages) rfl
intro i s h
simp only [packages_addFacetConfig, h]
/--
Loads the package configuration of a materialized dependency.
Adds the package and the facets defined within it to the `Workspace`.
-/
def addDepPackage
(dep : MaterializedDep)
(lakeOpts : NameMap String)
(leanOpts : Options) (reconfigure : Bool)
: StateT Workspace LogIO Package := fun ws => do
def Workspace.addDepPackage'
(ws : Workspace) (dep : MaterializedDep)
(lakeOpts : NameMap String) (leanOpts : Options) (reconfigure : Bool)
: LogIO {ws' : Workspace // ws'.packages.size = ws.packages.size + 1} := do
let wsIdx := ws.packages.size
let loadCfg := mkDepLoadConfig ws dep lakeOpts leanOpts reconfigure
let ⟨loadCfg, h⟩ ← resolveConfigFile dep.prettyName loadCfg
let fileCfg ← loadConfigFile loadCfg h
let pkg := mkPackage loadCfg fileCfg wsIdx
let ws := ws.addPackage' pkg wsIdx_mkPackage
let ws := ws.addFacetDecls fileCfg.facetDecls
return (pkg, ws)
let ws := ws.addPackage' pkg wsIdx_mkPackage |>.addFacetDecls fileCfg.facetDecls
return ⟨ws, by simp [ws, packages_addFacetDecls, packages_addPackage']⟩
/--
The resolver's call stack of dependencies.
@@ -103,6 +111,52 @@ abbrev ResolveT m := DepStackT <| StateT Workspace m
recFetchAcyclic (·.baseName) go root
return ws
def Workspace.setDepPkgs
(self : Workspace) (wsIdx : Nat) (depPkgs : Array Package)
: Workspace := {self with
packages := self.packages.modify wsIdx ({· with depPkgs})
size_packages_pos := by simp [self.size_packages_pos]
packages_wsIdx {i} := by
if h : wsIdx = i then
simp [h, Array.getElem_modify_self, self.packages_wsIdx]
else
simp [Array.getElem_modify_of_ne h, self.packages_wsIdx]
}
structure ResolveState where
ws : Workspace
depIdxs : Array Nat
lt_of_mem : ∀ i ∈ depIdxs, i < ws.packages.size
namespace ResolveState
@[inline] def init (ws : Workspace) (size : Nat) : ResolveState :=
{ws, depIdxs := Array.mkEmpty size, lt_of_mem := by simp}
@[inline] def reuseDep (s : ResolveState) (wsIdx : Fin s.ws.packages.size) : ResolveState :=
have lt_of_mem := by
intro i i_mem
cases Array.mem_push.mp i_mem with
| inl i_mem => exact s.lt_of_mem i i_mem
| inr i_eq => simp only [i_eq, wsIdx.isLt]
{s with depIdxs := s.depIdxs.push wsIdx, lt_of_mem}
@[inline] def newDep
(s : ResolveState) (dep : MaterializedDep)
(lakeOpts : NameMap String) (leanOpts : Options) (reconfigure : Bool)
: LogIO ResolveState := do
let ⟨ws, depIdxs, lt_of_mem⟩ := s
let wsIdx := ws.packages.size
let ⟨ws', h⟩ ← ws.addDepPackage' dep lakeOpts leanOpts reconfigure
have lt_of_mem := by
intro i i_mem
cases Array.mem_push.mp i_mem with
| inl i_mem => exact h ▸ Nat.lt_add_one_of_lt (lt_of_mem i i_mem)
| inr i_eq => simp only [wsIdx, i_eq, h, Nat.lt_add_one]
return ⟨ws', depIdxs.push wsIdx, lt_of_mem⟩
end ResolveState
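The `addDepPackage'` and `ResolveState` changes both thread a size invariant through the return type so that later array indexing can be proved in bounds. A minimal standalone sketch of that subtype-return pattern (hypothetical `push'`, unrelated to Lake's actual API):

```lean
-- Sketch: return the new array together with a proof about its size,
-- so callers can discharge index bounds without re-deriving the invariant.
def push' (xs : Array Nat) (x : Nat) :
    {ys : Array Nat // ys.size = xs.size + 1} :=
  ⟨xs.push x, by simp⟩

-- A caller can reuse an old bound `i < xs.size` on the grown array:
example (xs : Array Nat) (i : Nat) (h : i < xs.size) : Nat :=
  let ⟨ys, hsize⟩ := push' xs 0
  ys[i]'(by omega)  -- `omega` closes `i < ys.size` from `hsize` and `h`
```

This is the same shape as `addDepPackage'` returning `{ws' // ws'.packages.size = ws.packages.size + 1}`: the proof travels with the value instead of being reconstructed at each use site.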
/-
Recursively visits each node in a package's dependency graph, starting from
the workspace package `root`. Each dependency missing from the workspace is
@@ -121,19 +175,31 @@ See `Workspace.updateAndMaterializeCore` for more details.
: m Workspace := do
ws.runResolveT go root stack
where
@[specialize] go pkg recurse : ResolveT m Unit := do
let start := (← getWorkspace).packages.size
@[specialize] go pkg recurse : ResolveT m Unit := fun depStack ws => do
let start := ws.packages.size
-- Materialize and load the missing direct dependencies of `pkg`
pkg.depConfigs.forRevM fun dep => do
let ws ← getWorkspace
if ws.packages.any (·.baseName == dep.name) then
return -- already handled in another branch
let s := ResolveState.init ws pkg.depConfigs.size
let ⟨ws, depIdxs, lt_of_mem⟩ ← pkg.depConfigs.foldrM (m := m) (init := s) fun dep s => do
if let some wsIdx := s.ws.packages.findFinIdx? (·.baseName == dep.name) then
return s.reuseDep wsIdx -- already handled in another branch
if pkg.baseName = dep.name then
error s!"{pkg.prettyName}: package requires itself (or a package with the same name)"
let matDep ← resolve pkg dep (← getWorkspace)
discard <| addDepPackage matDep dep.opts leanOpts reconfigure
let matDep ← resolve pkg dep s.ws
s.newDep matDep dep.opts leanOpts reconfigure
let depsEnd := ws.packages.size
-- Recursively load the dependencies' dependencies
(← getWorkspace).packages.forM recurse start
let ws ← ws.packages.foldlM (start := start) (init := ws) fun ws pkg =>
(·.2) <$> recurse pkg depStack ws
-- Add the package's dependencies to the package
let ws :=
if h_le : depsEnd ≤ ws.packages.size then
ws.setDepPkgs pkg.wsIdx <| depIdxs.attach.map fun ⟨wsIdx, h_mem⟩ =>
ws.packages[wsIdx]'(Nat.lt_of_lt_of_le (lt_of_mem wsIdx h_mem) h_le)
else
have : Inhabited Workspace := ⟨ws⟩
panic! "workspace shrunk" -- should be unreachable
return ((), ws)
/--
Adds monad state used to update the manifest.
@@ -372,10 +438,14 @@ def Workspace.updateAndMaterializeCore
let start := ws.packages.size
let ws ← (deps.zip matDeps).foldlM (init := ws) fun ws (dep, matDep) => do
addDependencyEntries matDep
let (_, ws) ← addDepPackage matDep dep.opts leanOpts true ws
let ⟨ws, _⟩ ← ws.addDepPackage' matDep dep.opts leanOpts true
return ws
ws.packages.foldlM (init := ws) (start := start) fun ws pkg =>
let stop := ws.packages.size
let ws ← ws.packages.foldlM (init := ws) (start := start) fun ws pkg =>
ws.resolveDepsCore updateAndAddDep pkg [ws.root.baseName] leanOpts true
-- Set dependency packages after `resolveDepsCore` so
-- that the dependencies' dependencies are also properly set
return ws.setDepPkgs ws.root.wsIdx ws.packages[start...<stop]
else
ws.resolveDepsCore updateAndAddDep (leanOpts := leanOpts) (reconfigure := true)
where

View File

@@ -35,17 +35,22 @@ public def loadWorkspaceRoot (config : LoadConfig) : LogIO Workspace := do
let ⟨config, h⟩ ← resolveConfigFile "[root]" config
let fileCfg ← loadConfigFile config h
let root := mkPackage config fileCfg 0
have wsIdx_root : root.wsIdx = 0 := wsIdx_mkPackage
let facetConfigs := fileCfg.facetDecls.foldl (·.insert ·.config) initFacetConfigs
let ws : Workspace := {
root
return {
lakeEnv := config.lakeEnv
lakeCache := computeLakeCache root config.lakeEnv
lakeConfig
lakeArgs? := config.lakeArgs?
facetConfigs
packages_wsIdx h := by simp at h
packages := #[root]
packageMap := DNameMap.empty.insert root.keyName root
size_packages_pos := by simp
packages_wsIdx {i} h := by
cases i with
| zero => simp [wsIdx_root]
| succ => simp at h
}
let ws := ws.addPackage' root wsIdx_mkPackage
return ws
/--
Load a `Workspace` for a Lake package by

View File

@@ -154,6 +154,7 @@ public class FamilyOut {α : Type u} {β : Type v} (f : α → β) (a : α) (b :
-- Simplifies proofs involving open type families.
-- Scoped to avoid slowing down `simp` in downstream projects (the discrimination
-- tree key is `_`, so it would be attempted on every goal).
set_option warning.simp.varHead false in
attribute [scoped simp] FamilyOut.fam_eq
public instance [FamilyDef f a b] : FamilyOut f a b where

View File

@@ -10,6 +10,7 @@ public import Init.Data.ToString
public import Lake.Util.Proc
import Init.Data.String.TakeDrop
import Init.Data.String.Search
import Lake.Util.String
open System
namespace Lake
@@ -36,18 +37,48 @@ public def filterUrl? (url : String) : Option String :=
some url
public def isFullObjectName (rev : String) : Bool :=
rev.length == 40 && rev.all fun c => c.isDigit || ('a' <= c && c <= 'f')
rev.utf8ByteSize == 40 && isHex rev
end Git
public structure GitRepo where
dir : FilePath
public instance : Coe FilePath GitRepo := (·)
public instance : ToString GitRepo := (·.dir.toString)
/--
A commit-ish [Git revision][1].
This can be a SHA1 commit hash, a branch name, or one of Git's more complex specifiers.
[1]: https://git-scm.com/docs/gitrevisions#_specifying_revisions
-/
public abbrev GitRev := String
namespace GitRev
/-- The head revision (i.e., `HEAD`). -/
@[expose] public def head : GitRev := "HEAD"
/-- The revision fetched during the last `git fetch` (i.e., `FETCH_HEAD`). -/
@[expose] public def fetchHead : GitRev := "FETCH_HEAD"
/-- Returns whether this revision is a 40-digit hexadecimal (SHA1) commit hash. -/
public def isFullSha1 (rev : GitRev) : Bool :=
rev.utf8ByteSize == 40 && isHex rev
attribute [deprecated GitRev.isFullSha1 (since := "2026-04-17")] Git.isFullObjectName
/-- Scopes the revision by the remote. -/
@[inline] public def withRemote (remote : String) (rev : GitRev) : GitRev :=
s!"{remote}/{rev}"
end GitRev
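A small usage sketch of the new `GitRev` helpers, assuming the definitions above are in scope (`GitRev` is an abbreviation for `String`, so plain literals work as revisions too):

```lean
#eval GitRev.head                            -- "HEAD"
#eval GitRev.fetchHead.withRemote "origin"   -- "origin/FETCH_HEAD"
#eval GitRev.isFullSha1 "main"               -- false: not 40 hex digits
```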
namespace GitRepo
public instance : Coe FilePath GitRepo := (·)
public instance : ToString GitRepo := (·.dir.toString)
public def cwd : GitRepo := "."
@[inline] public def dirExists (repo : GitRepo) : BaseIO Bool :=
@@ -71,12 +102,18 @@ public def clone (url : String) (repo : GitRepo) : LogIO PUnit :=
public def quietInit (repo : GitRepo) : LogIO PUnit :=
repo.execGit #["init", "-q"]
public def bareInit (repo : GitRepo) : LogIO PUnit :=
repo.execGit #["init", "--bare", "-q"]
public def insideWorkTree (repo : GitRepo) : BaseIO Bool := do
repo.testGit #["rev-parse", "--is-inside-work-tree"]
public def fetch (repo : GitRepo) (remote := Git.defaultRemote) : LogIO PUnit :=
repo.execGit #["fetch", "--tags", "--force", remote]
public def addWorktreeDetach (path : FilePath) (rev : GitRev) (repo : GitRepo) : LogIO PUnit :=
repo.execGit #["worktree", "add", "--detach", path.toString, rev]
public def checkoutBranch (branch : String) (repo : GitRepo) : LogIO PUnit :=
repo.execGit #["checkout", "-B", branch]
@@ -87,60 +124,80 @@ public def checkoutDetach (hash : String) (repo : GitRepo) : LogIO PUnit :=
public def clean (repo : GitRepo) : LogIO PUnit :=
repo.execGit #["clean", "-xf"]
public def resolveRevision? (rev : String) (repo : GitRepo) : BaseIO (Option String) := do
/-- Resolves the revision to a Git object name (SHA1 hash) which may or may not exist in the repository. -/
public def resolveRevision? (rev : GitRev) (repo : GitRepo) : BaseIO (Option GitRev) := do
repo.captureGit? #["rev-parse", "--verify", "--end-of-options", rev]
public def resolveRevision (rev : String) (repo : GitRepo) : LogIO String := do
if Git.isFullObjectName rev then return rev
/-- Resolves the revision to a valid commit hash within the repository. -/
public def findCommit? (rev : GitRev) (repo : GitRepo) : BaseIO (Option GitRev) := do
repo.captureGit? #["rev-parse", "--verify", "--end-of-options", rev ++ "^{commit}"]
public def resolveRevision (rev : GitRev) (repo : GitRepo) : LogIO GitRev := do
if rev.isFullSha1 then return rev
if let some rev ← repo.resolveRevision? rev then return rev
error s!"{repo}: revision not found '{rev}'"
@[inline] public def getHeadRevision? (repo : GitRepo) : BaseIO (Option String) :=
repo.resolveRevision? "HEAD"
@[inline] public def getHeadRevision? (repo : GitRepo) : BaseIO (Option GitRev) :=
repo.resolveRevision? .head
public def getHeadRevision (repo : GitRepo) : LogIO String := do
public def getHeadRevision (repo : GitRepo) : LogIO GitRev := do
if let some rev ← repo.getHeadRevision? then return rev
error s!"{repo}: could not resolve 'HEAD' to a commit; \
the repository may be corrupt, so you may need to remove it and try again"
public def getHeadRevisions (repo : GitRepo) (n : Nat := 0) : LogIO (Array String) := do
public def fetchRevision? (repo : GitRepo) (remote : String) (rev : GitRev) : LogIO (Option GitRev) := do
if (← repo.testGit #["fetch", "--tags", "--force", "--refetch", "--filter=tree:0", remote, rev]) then
let some rev ← repo.resolveRevision? .fetchHead
| error s!"{repo}: could not resolve 'FETCH_HEAD' to a commit after fetching; \
this may be an issue with Lake; please report it"
return rev
else
return none
public def getHeadRevisions (repo : GitRepo) (n : Nat := 0) : LogIO (Array GitRev) := do
let args := #["rev-list", "HEAD"]
let args := if n != 0 then args ++ #["-n", toString n] else args
let revs ← repo.captureGit args
return revs.split '\n' |>.toStringArray
public def resolveRemoteRevision (rev : String) (remote := Git.defaultRemote) (repo : GitRepo) : LogIO String := do
if Git.isFullObjectName rev then return rev
if let some rev ← repo.resolveRevision? s!"{remote}/{rev}" then return rev
public def resolveRemoteRevision (rev : GitRev) (remote := Git.defaultRemote) (repo : GitRepo) : LogIO GitRev := do
if rev.isFullSha1 then return rev
if let some rev ← repo.resolveRevision? (rev.withRemote remote) then return rev
if let some rev ← repo.resolveRevision? rev then return rev
error s!"{repo}: revision not found '{rev}'"
public def findRemoteRevision (repo : GitRepo) (rev? : Option String := none) (remote := Git.defaultRemote) : LogIO String := do
public def findRemoteRevision (repo : GitRepo) (rev? : Option GitRev := none) (remote := Git.defaultRemote) : LogIO String := do
repo.fetch remote; repo.resolveRemoteRevision (rev?.getD Git.upstreamBranch) remote
public def branchExists (rev : String) (repo : GitRepo) : BaseIO Bool := do
public def branchExists (rev : GitRev) (repo : GitRepo) : BaseIO Bool := do
repo.testGit #["show-ref", "--verify", s!"refs/heads/{rev}"]
public def revisionExists (rev : String) (repo : GitRepo) : BaseIO Bool := do
public def revisionExists (rev : GitRev) (repo : GitRepo) : BaseIO Bool := do
repo.testGit #["rev-parse", "--verify", rev ++ "^{commit}"]
public def getTags (repo : GitRepo) : BaseIO (List String) := do
let some out ← repo.captureGit? #["tag"] | return []
return out.split '\n' |>.toStringList
public def findTag? (rev : String := "HEAD") (repo : GitRepo) : BaseIO (Option String) := do
public def findTag? (rev : GitRev := .head) (repo : GitRepo) : BaseIO (Option String) := do
repo.captureGit? #["describe", "--tags", "--exact-match", rev]
public def getRemoteUrl?
(remote := Git.defaultRemote) (repo : GitRepo)
: BaseIO (Option String) := do repo.captureGit? #["remote", "get-url", remote]
: BaseIO (Option String) := repo.captureGit? #["remote", "get-url", remote]
public def addRemote (remote : String) (url : String) (repo : GitRepo) : LogIO Unit :=
repo.execGit #["remote", "add", remote, url]
public def setRemoteUrl (remote : String) (url : String) (repo : GitRepo) : LogIO Unit :=
repo.execGit #["remote", "set-url", remote, url]
public def getFilteredRemoteUrl?
(remote := Git.defaultRemote) (repo : GitRepo)
: BaseIO (Option String) := OptionT.run do Git.filterUrl? (← repo.getRemoteUrl? remote)
public def hasNoDiff (repo : GitRepo) : BaseIO Bool := do
repo.testGit #["diff", "HEAD", "--exit-code"]
repo.testGit #["diff", "--exit-code", "HEAD"]
@[inline] public def hasDiff (repo : GitRepo) : BaseIO Bool := do
not <$> repo.hasNoDiff

View File

@@ -7,6 +7,7 @@ module
prelude
public import Init.Data.ToString.Basic
import Init.Data.UInt.Lemmas
import Init.Data.String.Basic
import Init.Data.Nat.Fold
@@ -33,3 +34,38 @@ public def isHex (s : String) : Bool :=
65 ≤ c -- 'A'
else
false
def lowerHexByte (n : UInt8) : UInt8 :=
if n ≤ 9 then
n + 48 -- + '0'
else
n + 87 -- + ('a' - 10)
theorem isValidChar_of_lt_256 (h : n < 256) : isValidChar n :=
Or.inl <| Nat.lt_trans h (by decide)
def lowerHexChar (n : UInt8) : Char :=
⟨lowerHexByte n |>.toUInt32, isValidChar_of_lt_256 <|
UInt32.lt_iff_toNat_lt.mpr <| (lowerHexByte n).toNat_lt⟩
public def lowerHexUInt64 (n : UInt64) : String :=
String.ofByteArray (ByteArray.emptyWithCapacity 16) ByteArray.isValidUTF8_empty
|>.push (lowerHexChar (n >>> 60 &&& 0xf).toUInt8)
|>.push (lowerHexChar (n >>> 56 &&& 0xf).toUInt8)
|>.push (lowerHexChar (n >>> 52 &&& 0xf).toUInt8)
|>.push (lowerHexChar (n >>> 48 &&& 0xf).toUInt8)
|>.push (lowerHexChar (n >>> 44 &&& 0xf).toUInt8)
|>.push (lowerHexChar (n >>> 40 &&& 0xf).toUInt8)
|>.push (lowerHexChar (n >>> 36 &&& 0xf).toUInt8)
|>.push (lowerHexChar (n >>> 32 &&& 0xf).toUInt8)
|>.push (lowerHexChar (n >>> 28 &&& 0xf).toUInt8)
|>.push (lowerHexChar (n >>> 24 &&& 0xf).toUInt8)
|>.push (lowerHexChar (n >>> 20 &&& 0xf).toUInt8)
|>.push (lowerHexChar (n >>> 16 &&& 0xf).toUInt8)
|>.push (lowerHexChar (n >>> 12 &&& 0xf).toUInt8)
|>.push (lowerHexChar (n >>> 8 &&& 0xf).toUInt8)
|>.push (lowerHexChar (n >>> 4 &&& 0xf).toUInt8)
|>.push (lowerHexChar (n &&& 0xf).toUInt8)
-- sanity check
example : "0123456789abcdef" = lowerHexUInt64 0x0123456789abcdef := by decide

View File

@@ -39,7 +39,7 @@ if(USE_MIMALLOC)
# Lean code includes it as `lean/mimalloc.h` but for compiling `static.c` itself, add original dir
include_directories(${LEAN_BINARY_DIR}/../mimalloc/src/mimalloc/include)
# make all symbols visible, always build with optimizations as otherwise Lean becomes too slow
set(MIMALLOC_FLAGS "-DMI_SHARED_LIB -DMI_SHARED_LIB_EXPORT -O3 -DNDEBUG -DMI_WIN_NOREDIRECT -Wno-unused-function")
set(MIMALLOC_FLAGS "-DMI_SHARED_LIB -DMI_SHARED_LIB_EXPORT -O3 -DNDEBUG -DMI_WIN_NOREDIRECT -DMI_SECURE=${LEAN_MI_SECURE} -Wno-unused-function")
if(CMAKE_CXX_COMPILER_ID MATCHES "AppleClang|Clang")
string(APPEND MIMALLOC_FLAGS " -Wno-deprecated")
endif()

View File

@@ -752,6 +752,7 @@ extern "C" LEAN_EXPORT obj_res lean_windows_get_next_transition(b_obj_arg timezo
u_strToUTF8(dst_name, sizeof(dst_name), &dst_name_len, tzID, tzIDLength, &status);
if (U_FAILURE(status)) {
ucal_close(cal);
return lean_io_result_mk_error(lean_mk_io_error_invalid_argument(EINVAL, mk_string("failed to convert DST name to UTF-8")));
}
@@ -1397,7 +1398,7 @@ extern "C" LEAN_EXPORT obj_res lean_io_app_path() {
memset(dest, 0, PATH_MAX);
pid_t pid = getpid();
snprintf(path, PATH_MAX, "/proc/%d/exe", pid);
if (readlink(path, dest, PATH_MAX) == -1) {
if (readlink(path, dest, PATH_MAX - 1) == -1) {
return io_result_mk_error("failed to locate application");
} else {
return io_result_mk_ok(mk_string(dest));

View File

@@ -54,7 +54,7 @@ ifeq "${USE_LAKE}" "ON"
# build in parallel
Init:
${PREV_STAGE}/bin/lake build -f ${CMAKE_BINARY_DIR}/lakefile.toml $(LAKE_EXTRA_ARGS)
${PREV_STAGE}/bin/lake build -f ${CMAKE_BINARY_DIR}/lakefile.toml ${LEAN_EXTRA_LAKE_OPTS} $(LAKE_EXTRA_ARGS)
Std Lean Lake Leanc LeanChecker LeanIR: Init

Binary file not shown.


Some files were not shown because too many files have changed in this diff Show More