Compare commits


19 Commits

Author SHA1 Message Date
Kim Morrison
3e9ed3c29d chore: add AGENTS.md symlink to CLAUDE.md (#13461)
This PR adds an `AGENTS.md` symlink to `.claude/CLAUDE.md` so
Codex-style repository instructions can resolve to the same checked-in
guidance Claude Code already uses.

This keeps the repository's agent-facing instructions in one place
instead of maintaining separate copies for different tools. Previous
Codex runs did not automatically pick up the existing
`.claude/CLAUDE.md` guidance, which caused avoidable drift in PR
formatting and workflow behavior.
2026-04-18 06:48:05 +00:00
Mac Malone
43e1e8285b refactor: lake: introduce GitRev (#13456)
This PR adds a type abbreviation `GitRev` to Lake, which is used for
`String` values that signify Git revisions. Such revisions may be a SHA1
commit hash, a branch name, or one of Git's more complex specifiers.

The PR also adds a number of additional Git primitives which are useful
for #11662.
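Since the PR describes `GitRev` as a type abbreviation over `String`, a minimal sketch of the shape (the exact definition and helpers in Lake may differ; `GitRev.isSha1` is a hypothetical example of such a primitive):

```lean
namespace Lake

/-- A `String` denoting a Git revision: a SHA1 commit hash, a branch
    name, or one of Git's more complex revision specifiers. -/
abbrev GitRev := String

/-- Hypothetical helper: does this revision look like a full SHA1 hash? -/
def GitRev.isSha1 (rev : GitRev) : Bool :=
  rev.length == 40 && rev.all fun c => c.isDigit || ('a' ≤ c && c ≤ 'f')

end Lake
```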
2026-04-18 03:44:26 +00:00
Mac Malone
d3c069593b refactor: introduce Package.depPkgs (#13445)
This PR adds `Package.depPkgs` for internal use (namely in #11662). This
field contains the list of the package's direct dependencies (as package
objects). This is much more efficient than going through the old package
`deps` facet.

As part of this refactor, the Workspace `root` is now derived from
`packages` instead of being its own independent field.
2026-04-18 03:44:21 +00:00
Mac Malone
0fa749f71b refactor: lake: introduce lowerHexUInt64 (#13455)
This PR adds a `lowerHexUInt64` utility to Lake which is used to
optimize `Hash.hex`.
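The PR does not show the definition here; a sketch of what a fixed-width lowercase hex formatter for `UInt64` could look like (name and signature assumed from the description, not the actual Lake implementation):

```lean
/-- Hypothetical sketch: format `n` as 16 lowercase hexadecimal digits. -/
def lowerHexUInt64 (n : UInt64) : String := Id.run do
  let mut s := ""
  let mut n := n
  for _ in [0:16] do
    -- `Nat.digitChar` yields '0'–'9' and lowercase 'a'–'f'
    s := String.singleton (n &&& 0xf).toNat.digitChar ++ s
    n := n >>> 4
  return s
```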
2026-04-18 03:43:11 +00:00
Kyle Miller
592eb02bb2 feat: have level metavariable pretty printer instantiate level metavariables (#13438)
This PR makes the universe level pretty printer instantiate level
metavariables when `pp.instantiateMVars` is true.

Previously level metavariables were not instantiated.

The PR adjusts the tracing in the LevelDefEq module to create the trace
message using the original MetavarContext. It also adds
`Meta.isLevelDefEq.step` traces for when level metavariables are
assigned.
2026-04-18 01:07:22 +00:00
Leonardo de Moura
70df9742f4 fix: kernel error in grind order module for Nat casts to non-Int types (#13453)
This PR fixes a kernel error in `grind` when propagating a `Nat`
equality to an order structure whose carrier type is not `Int` (e.g.
`Rat`). The auxiliary `Lean.Grind.Order.of_nat_eq` lemma was specialized
to `Int`, so the kernel rejected the application when the cast
destination differed.

We add a polymorphic `of_natCast_eq` lemma over `{α : Type u} [NatCast
α]` and cache the cast destination type in `TermMapEntry`.
`processNewEq` now uses the original `of_nat_eq` when the destination is
`Int` (the common case) and the new lemma otherwise. The symmetric
`nat_eq` propagation (deriving `Nat` equality from a derived cast
equality) is now guarded to fire only when the destination is `Int`,
since the `nat_eq` lemma is still specialized to `Int`.

Closes #13265.
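The polymorphic lemma is short; a restatement consistent with the description (the actual declaration lives in the `Lean.Grind.Order` namespace, so names and exact form may differ):

```lean
theorem of_natCast_eq {α : Type u} [NatCast α] (a b : Nat) (x y : α) :
    NatCast.natCast a = x → NatCast.natCast b = y → a = b → x = y := by
  intro h₁ h₂ h; subst h; exact h₁.symm.trans h₂
```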
2026-04-17 23:51:21 +00:00
Leonardo de Moura
9c245d5531 test: add regression test for Sym.simp eta-reduction (#13416) (#13452)
This PR adds a direct regression test for issue #13416. It exercises
`Std.HashMap.getElem_insert`, whose `dom` argument is a lambda closing
over pattern variables, and checks that the discrimination tree lookup
finds the theorem once the target's `dom` lambda is eta-reduced.

The underlying fix landed in #13448; this test pins the specific MWE
from the original issue so a regression would surface immediately.

Closes #13416

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-17 22:47:53 +00:00
Mac Malone
3c4440d3bc feat: lake: JobAction.reuse & .unpack (#13423)
This PR adds `JobAction.reuse` and `JobAction.unpack`, which provide more
informative captions for what a job is doing in the build monitor.
`reuse` is set when using an artifact from the Lake cache; `unpack` is
set when unpacking module `.ltar` archives and release (Reservoir or
GitHub) archives.
2026-04-17 22:34:04 +00:00
Leonardo de Moura
2964193af8 fix: avoid assigning mvar when Sym.intros produces no binders (#13451)
This PR fixes a bug in `Sym.introCore.finalize` where the original
metavariable was unconditionally assigned via a delayed assignment, even
when no binders were introduced. As a result, `Sym.intros` would return
`.failed` while the goal metavariable had already been silently
assigned, confusing downstream code that relies on `isAssigned` (e.g. VC
filters in `mvcgen'`).

The test and fix were suggested by Sebastian Graf (@sgraf812).

Co-authored-by: Sebastian Graf <sgraf1337@gmail.com>
Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-17 21:47:47 +00:00
Eric Wieser
43e96b119d fix: prevent a hang in acLt (#13367)
This PR removes some cases where `simp` would significantly overrun a
timeout.

This is a little tricky to test cleanly; using mathlib's
`#count_heartbeats` as
```lean4
#count_heartbeats in
set_option maxHeartbeats 200000 in
example (k : Nat) (a : Fin (1 + k + 1) → Nat) :
    0 ≤ sumRange (1 + k + 1) (fun i =>
        if h : i < 1 + k + 1 then a ⟨i, h⟩ else 0) := by
  simp only [Nat.add_comm, sumRange_add]
```
I see 200010 heartbeats with this PR, and 1873870 (9x the requested
limit) without.

This type of failure is wasteful in AI systems which try tactics with a
short timeout.
2026-04-17 21:46:29 +00:00
Leonardo de Moura
615f45ad7a chore: fix Sym benchmarks using stale run' API (#13450)
This PR updates two Sym benchmarks (`add_sub_cancel.lean` and
`meta_simp_1.lean`) to use the current `SymM.run` API. Both files still
referenced `run'`, which no longer exists, so they failed to elaborate.

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-17 21:31:57 +00:00
Sebastian Graf
8a9a5ebd5e refactor: promote repeat/while builtin parsers to default priority (#13447)
This PR removes the transitional `syntax` declarations for `repeat`,
`while`, and `repeat ... until` from `Init.While` and promotes the
corresponding `@[builtin_doElem_parser]` defs in `Lean.Parser.Do` from
`low` to default priority, making them the canonical parsers.

The `macro_rules` in `Init.While` are kept as a bootstrap: they expand
`repeat`/`while`/`until` directly to `for _ in Loop.mk do ...`, which is
what any `prelude` Init file needs. The `@[builtin_macro]` /
`@[builtin_doElem_elab]` in `Lean.Elab.BuiltinDo.Repeat` are only
visible once `Lean.Elab.*` is transitively imported, so they cannot
serve Init bootstrap. The duplication will be removed in a follow-up
after the next stage0 update.
2026-04-17 20:57:41 +00:00
Leonardo de Moura
1af697a44b fix: eta-reduce patterns containing loose pattern variables (#13448)
This PR fixes a regression in `Sym.simp` where rewrite rules whose LHS
contains a lambda over a pattern variable (e.g. `∃ x, a = x`) failed to
match targets with semantically equivalent structure.

`Sym.etaReduceAux` previously refused any eta-reduction whenever the
body had loose bound variables, but patterns produced by stripping outer
foralls always carry such loose bvars. The eta-reduction therefore
skipped patterns while still firing on the target, producing mismatched
discrimination tree keys and no match.

The fix narrows the check to loose bvars in the range `[0, n)` (those
that would actually refer to the peeled binders) and lowers any
remaining loose bvars by `n` so that pattern-variable references stay
consistent in the reduced expression. The discrimination tree now
classifies patterns like `exists_eq_True : (∃ x, a = x) = True` with
their full structure rather than falling back to `.other`.

Includes a regression test (`sym_simp_1.lean`) and Sebastian Graf's MWE
(`sym_eta_mwe.lean`).
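The affected lemma shape can be stated directly: the LHS `∃ x, a = x` desugars to `Exists (fun x => a = x)`, a lambda closing over the pattern variable `a`. A minimal standalone version (primed to avoid claiming the real declaration's exact form; uses core's `eq_true`):

```lean
/-- A rewrite rule whose LHS contains a lambda over the pattern variable
    `a` — the shape that previously fell back to `.other` in the
    discrimination tree. -/
theorem exists_eq_True' {α : Sort u} (a : α) : (∃ x, a = x) = True :=
  eq_true ⟨a, rfl⟩
```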

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-17 20:49:21 +00:00
Lean stage0 autoupdater
234267b08a chore: update stage0 2026-04-17 16:13:04 +00:00
Sebastian Graf
704df340cb feat: make repeat and while syntax builtin (#13442)
This PR promotes the `repeat`, `while`, and `repeat ... until` parsers
from `syntax` declarations in `Init.While` to `@[builtin_doElem_parser]`
definitions in `Lean.Parser.Do`, alongside the other do-element parsers.
The `while` variants and `repeat ... until` get `@[builtin_macro]`
expansions; `repeat` itself gets a `@[builtin_doElem_elab]` so a
follow-up can extend it with an option-driven choice between `Loop.mk`
and a well-founded `Repeat.mk`.

The new builtin parsers are registered at `low` priority so that the
bootstrapping `syntax` declarations in `Init.While` (still needed for
stage0 compatibility) take precedence during the transition. After the
next stage0 update, the `Init.While` syntax and macros can be removed.
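The transition mechanics can be sketched with parser priorities (a hedged illustration; the real declarations live in `Lean.Parser.Do` and `Init.While` and may differ in detail):

```lean
-- In Lean.Parser.Do: the builtin parser, registered at `low` priority
-- during the transition so the `Init.While` declaration wins.
@[builtin_doElem_parser low] def doRepeat := leading_parser
  "repeat " >> doSeq

-- In Init.While: the transitional declaration that currently takes
-- precedence; removable after the next stage0 update.
syntax (name := doRepeat) "repeat " doSeq : doElem
```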
2026-04-17 15:19:59 +00:00
Henrik Böving
f180c9ce17 fix: handling of EmitC for small hex string literals (#13435)
This PR fixes a bug in EmitC triggered by working with the string
literal `"\x01abc"` in Lean, which causes a C compiler error.

The error is as follows:
```
run.c:29:189: error: hex escape sequence out of range
   29 | static const lean_string_object l_badString___closed__0_value = {.m_header = {.m_rc = 0, .m_cs_sz = 0, .m_other = 0, .m_tag = 249}, .m_size = 5, .m_capacity = 5, .m_length = 4, .m_data = "\x01abc"};
      |                                                                                                                                                                                             ^~~~~~~
1 error generated.
```
This happens because C hex escape sequences can be arbitrarily long,
while Lean expects them to stop after two characters. The C compiler
therefore parses the string as one large hex escape sequence `\x01abc`
and subsequently notices that it is too large.
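The replacement encoding is easy to exercise in isolation; a standalone sketch mirroring the names used in the patch:

```lean
def toDigit (c : Nat) : String :=
  String.singleton c.digitChar

/-- Octal-escape a control character. Octal escapes consume at most
    three digits, so a following letter like 'a' cannot extend the
    escape the way a greedy `\x` hex escape can. -/
def octalEscape (c : Char) : String :=
  let n := c.toNat
  "\\" ++ toDigit (n / 64) ++ toDigit ((n / 8) % 8) ++ toDigit (n % 8)

#eval octalEscape '\x01' -- the four-character string \001
```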

Discovered by @datokrat
2026-04-17 15:16:28 +00:00
Lean stage0 autoupdater
040376ebd0 chore: update stage0 2026-04-17 10:09:17 +00:00
Sebastian Graf
ce998700e6 feat: add ControlInfo handler for doRepeat (#13437)
This PR adds a builtin `doElem_control_info` handler for `doRepeat`. It
is ineffective as long as we have the macro for `repeat`.
2026-04-17 09:17:52 +00:00
Sebastian Ullrich
b09d39a766 chore: build bench: replace warmup with selective build (#13432) 2026-04-17 08:01:17 +00:00
658 changed files with 657 additions and 178 deletions

AGENTS.md Symbolic link
View File

@@ -0,0 +1 @@
.claude/CLAUDE.md

View File

@@ -59,9 +59,9 @@ Examples:
* `Nat.repeat f 3 a = f <| f <| f <| a`
* `Nat.repeat (· ++ "!") 4 "Hello" = "Hello!!!!"`
-/
@[specialize, expose] def repeat {α : Type u} (f : α → α) : (n : Nat) → (a : α) → α
@[specialize, expose] def «repeat» {α : Type u} (f : α → α) : (n : Nat) → (a : α) → α
| 0, a => a
| succ n, a => f (repeat f n a)
| succ n, a => f («repeat» f n a)
/--
Applies a function to a starting value the specified number of times.
@@ -1221,9 +1221,9 @@ theorem not_lt_eq (a b : Nat) : (¬ (a < b)) = (b ≤ a) :=
theorem not_gt_eq (a b : Nat) : (¬ (a > b)) = (a ≤ b) :=
not_lt_eq b a
@[csimp] theorem repeat_eq_repeatTR : @repeat = @repeatTR :=
@[csimp] theorem repeat_eq_repeatTR : @«repeat» = @repeatTR :=
funext fun α => funext fun f => funext fun n => funext fun init =>
let rec go : ∀ m n, repeatTR.loop f m (repeat f n init) = repeat f (m + n) init
let rec go : ∀ m n, repeatTR.loop f m («repeat» f n init) = «repeat» f (m + n) init
| 0, n => by simp [repeatTR.loop]
| succ m, n => by rw [repeatTR.loop, succ_add]; exact go m (succ n)
(go n 0).symm

View File

@@ -75,6 +75,9 @@ theorem nat_eq (a b : Nat) (x y : Int) : NatCast.natCast a = x → NatCast.natCa
theorem of_nat_eq (a b : Nat) (x y : Int) : NatCast.natCast a = x → NatCast.natCast b = y → a = b → x = y := by
intro _ _; subst x y; intro; simp [*]
theorem of_natCast_eq {α : Type u} [NatCast α] (a b : Nat) (x y : α) : NatCast.natCast a = x → NatCast.natCast b = y → a = b → x = y := by
intro h₁ h₂ h; subst h; exact h₁.symm.trans h₂
theorem le_of_not_le {α} [LE α] [Std.IsLinearPreorder α]
{a b : α} : ¬ a ≤ b → b ≤ a := by
intro h

View File

@@ -10,13 +10,14 @@ public import Init.Core
public section
/-!
# Notation for `while` and `repeat` loops.
-/
namespace Lean
/-! # `repeat` and `while` notation -/
/-!
# `while` and `repeat` loop support
The parsers for `repeat`, `while`, and `repeat ... until` are
`@[builtin_doElem_parser]` definitions in `Lean.Parser.Do`.
-/
inductive Loop where
| mk
@@ -32,23 +33,20 @@ partial def Loop.forIn {β : Type u} {m : Type u → Type v} [Monad m] (_ : Loop
instance [Monad m] : ForIn m Loop Unit where
forIn := Loop.forIn
syntax (name := doRepeat) "repeat " doSeq : doElem
-- The canonical parsers for `repeat`/`while`/`repeat ... until` live in `Lean.Parser.Do`
-- as `@[builtin_doElem_parser]` definitions. We register the expansion macros here so
-- they are available to `prelude` files in `Init`, which do not import `Lean.Elab`.
macro_rules
| `(doElem| repeat%$tk $seq) => `(doElem| for%$tk _ in Loop.mk do $seq)
syntax "while " ident " : " termBeforeDo " do " doSeq : doElem
macro_rules
| `(doElem| while%$tk $h : $cond do $seq) =>
`(doElem| repeat%$tk if $h:ident : $cond then $seq else break)
syntax "while " termBeforeDo " do " doSeq : doElem
macro_rules
| `(doElem| while%$tk $cond do $seq) => `(doElem| repeat%$tk if $cond then $seq else break)
syntax "repeat " doSeq ppDedent(ppLine) "until " term : doElem
| `(doElem| while%$tk $cond do $seq) =>
`(doElem| repeat%$tk if $cond then $seq else break)
macro_rules
| `(doElem| repeat%$tk $seq until $cond) =>

View File

@@ -207,7 +207,7 @@ def emitLns [EmitToString α] (as : List α) : EmitM Unit :=
emitLn "}"
return ret
def toHexDigit (c : Nat) : String :=
def toDigit (c : Nat) : String :=
String.singleton c.digitChar
def quoteString (s : String) : String :=
@@ -221,7 +221,11 @@ def quoteString (s : String) : String :=
else if c == '\"' then "\\\""
else if c == '?' then "\\?" -- avoid trigraphs
else if c.toNat <= 31 then
"\\x" ++ toHexDigit (c.toNat / 16) ++ toHexDigit (c.toNat % 16)
-- Use octal escapes instead of hex escapes because C hex escapes are
-- greedy: "\x01abc" would be parsed as the single escape \x01abc rather
-- than \x01 followed by "abc". Octal escapes consume at most 3 digits.
let n := c.toNat
"\\" ++ toDigit (n / 64) ++ toDigit ((n / 8) % 8) ++ toDigit (n % 8)
-- TODO(Leo): we should use `\unnnn` for escaping unicode characters.
else String.singleton c)
q;

View File

@@ -17,17 +17,28 @@ namespace Lean.Elab.Do
open Lean.Parser.Term
/--
Builtin do-element elaborator for `repeat` (syntax kind `Lean.doRepeat`).
Builtin do-element elaborator for `repeat` (syntax kind `Lean.Parser.Term.doRepeat`).
Expands to `for _ in Loop.mk do ...`. A follow-up PR will drop the fallback macro
in `Init.While` (which currently preempts this elaborator) and extend this
Expands to `for _ in Loop.mk do ...`. A follow-up change will extend this
elaborator to choose between `Loop.mk` and a well-founded `Repeat.mk` based on a
configuration option.
-/
@[builtin_doElem_elab Lean.doRepeat] def elabDoRepeat : DoElab := fun stx dec => do
let `(doElem| repeat $seq) := stx | throwUnsupportedSyntax
let expanded ← `(doElem| for _ in Loop.mk do $seq)
@[builtin_doElem_elab Lean.Parser.Term.doRepeat] def elabDoRepeat : DoElab := fun stx dec => do
let `(doElem| repeat%$tk $seq) := stx | throwUnsupportedSyntax
let expanded ← `(doElem| for%$tk _ in Loop.mk do $seq)
Term.withMacroExpansion stx expanded <|
withRef expanded <| elabDoElem expanded dec
@[builtin_macro Lean.Parser.Term.doWhileH] def expandDoWhileH : Macro
| `(doElem| while%$tk $h : $cond do $seq) => `(doElem| repeat%$tk if $h:ident : $cond then $seq else break)
| _ => Macro.throwUnsupported
@[builtin_macro Lean.Parser.Term.doWhile] def expandDoWhile : Macro
| `(doElem| while%$tk $cond do $seq) => `(doElem| repeat%$tk if $cond then $seq else break)
| _ => Macro.throwUnsupported
@[builtin_macro Lean.Parser.Term.doRepeatUntil] def expandDoRepeatUntil : Macro
| `(doElem| repeat%$tk $seq until $cond) => `(doElem| repeat%$tk do $seq:doSeq; if $cond then break)
| _ => Macro.throwUnsupported
end Lean.Elab.Do

View File

@@ -129,6 +129,7 @@ partial def ofElem (stx : TSyntax `doElem) : TermElabM ControlInfo := do
return thenInfo.alternative info
| `(doElem| unless $_ do $elseSeq) =>
ControlInfo.alternative {} <$> ofSeq elseSeq
-- For/Repeat
| `(doElem| for $[$[$_ :]? $_ in $_],* do $bodySeq) =>
let info ← ofSeq bodySeq
return { info with -- keep only reassigns and earlyReturn
@@ -136,6 +137,13 @@ partial def ofElem (stx : TSyntax `doElem) : TermElabM ControlInfo := do
continues := false,
breaks := false
}
| `(doRepeat| repeat $bodySeq) =>
let info ← ofSeq bodySeq
return { info with
numRegularExits := if info.breaks then 1 else 0,
continues := false,
breaks := false
}
-- Try
| `(doElem| try $trySeq:doSeq $[$catches]* $[finally $finSeq?]?) =>
let mut info ← ofSeq trySeq

View File

@@ -1782,7 +1782,7 @@ mutual
doIfToCode doElem doElems
else if k == ``Parser.Term.doUnless then
doUnlessToCode doElem doElems
else if k == `Lean.doRepeat then
else if k == ``Parser.Term.doRepeat then
let seq := doElem[1]
let expanded ← `(doElem| for _ in Loop.mk do $seq)
doSeqToCode (expanded :: doElems)

View File

@@ -175,6 +175,7 @@ where
return !(← allChildrenLt a b)
lpo (a b : Expr) : MetaM Bool := do
checkSystem "Lean.Meta.acLt"
-- Case 1: `a < b` if for some child `b_i` of `b`, we have `b_i >= a`
if (← someChildGe b a) then
return true

View File

@@ -38,7 +38,9 @@ and assigning `?m := max ?n v`
private def solveSelfMax (mvarId : LMVarId) (v : Level) : MetaM Unit := do
assert! v.isMax
let n ← mkFreshLevelMVar
assignLevelMVar mvarId <| mkMaxArgsDiff mvarId v n
let v' := mkMaxArgsDiff mvarId v n
trace[Meta.isLevelDefEq.step] "solveSelfMax: {mkLevelMVar mvarId} := {v'}"
assignLevelMVar mvarId v'
/--
Returns true if `v` is `max u ?m` (or variant). That is, we solve `u =?= max u ?m` as `?m := u`.
@@ -53,6 +55,7 @@ private def tryApproxSelfMax (u v : Level) : MetaM Bool := do
where
solve (v' : Level) (mvarId : LMVarId) : MetaM Bool := do
if u == v' then
trace[Meta.isLevelDefEq.step] "tryApproxSelfMax {mkLevelMVar mvarId} := {u}"
assignLevelMVar mvarId u
return true
else
@@ -71,8 +74,14 @@ private def tryApproxMaxMax (u v : Level) : MetaM Bool := do
| _, _ => return false
where
solve (u₁ u₂ v' : Level) (mvarId : LMVarId) : MetaM Bool := do
if u₁ == v' then assignLevelMVar mvarId u₂; return true
else if u₂ == v' then assignLevelMVar mvarId u₁; return true
if u₁ == v' then
trace[Meta.isLevelDefEq.step] "tryApproxMaxMax {mkLevelMVar mvarId} := {u₂}"
assignLevelMVar mvarId u₂
return true
else if u₂ == v' then
trace[Meta.isLevelDefEq.step] "tryApproxMaxMax {mkLevelMVar mvarId} := {u₁}"
assignLevelMVar mvarId u₁
return true
else return false
private def postponeIsLevelDefEq (lhs : Level) (rhs : Level) : MetaM Unit := do
@@ -97,9 +106,11 @@ mutual
else if (← isMVarWithGreaterDepth v mvarId) then
-- If both `u` and `v` are both metavariables, but depth of v is greater, then we assign `v := u`.
-- This can only happen when levelAssignDepth is set to a smaller value than depth (e.g. during TC synthesis)
trace[Meta.isLevelDefEq.step] "{v} := {u}"
assignLevelMVar v.mvarId! u
return LBool.true
else if !u.occurs v then
trace[Meta.isLevelDefEq.step] "{u} := {v}"
assignLevelMVar u.mvarId! v
return LBool.true
else if v.isMax && !strictOccursMax u v then
@@ -133,8 +144,9 @@ mutual
@[export lean_is_level_def_eq]
partial def isLevelDefEqAuxImpl : Level → Level → MetaM Bool
| Level.succ lhs, Level.succ rhs => isLevelDefEqAux lhs rhs
| lhs, rhs =>
withTraceNode `Meta.isLevelDefEq (fun _ => return m!"{lhs} =?= {rhs}") do
| lhs, rhs => do
withTraceNodeBefore `Meta.isLevelDefEq (fun _ =>
withOptions (·.set `pp.instantiateMVars false) do addMessageContext m!"{lhs} =?= {rhs}") do
if lhs.getLevelOffset == rhs.getLevelOffset then
return lhs.getOffset == rhs.getOffset
else

View File

@@ -9,19 +9,37 @@ public import Lean.Meta.Sym.ExprPtr
public import Lean.Meta.Basic
import Lean.Meta.Transform
namespace Lean.Meta.Sym
/--
Returns `true` if `e` contains a loose bound variable with index in `[0, n)`
This function assumes `n` is small. If this becomes a bottleneck, we should
implement a version of `lean_expr_has_loose_bvar` that checks the range in one traversal.
-/
def hasLooseBVarsInRange (e : Expr) (n : Nat) : Bool :=
e.hasLooseBVars && go n
where
go : Nat → Bool
| 0 => false
| i+1 => e.hasLooseBVar i || go i
/--
Checks if `body` is eta-expanded with `n` applications: `f (.bvar (n-1)) ... (.bvar 0)`.
Returns `f` if so and `f` has no loose bvars; otherwise returns `default`.
Returns `f` if so and `f` has no loose bvars with indices in the range `[0, n)`; otherwise returns `default`.
- `n`: number of remaining applications to check
- `i`: expected bvar index (starts at 0, increments with each application)
- `default`: returned when not eta-reducible (enables pointer equality check)
-/
def etaReduceAux (body : Expr) (n : Nat) (i : Nat) (default : Expr) : Expr := Id.run do
match n with
| 0 => if body.hasLooseBVars then default else body
| n+1 =>
let .app f (.bvar j) := body | default
if j == i then etaReduceAux f n (i+1) default else default
def etaReduceAux (body : Expr) (n : Nat) (i : Nat) (default : Expr) : Expr :=
go body n i
where
go (body : Expr) (m : Nat) (i : Nat) : Expr := Id.run do
match m with
| 0 =>
if hasLooseBVarsInRange body n then default
else body.lowerLooseBVars n n
| m+1 =>
let .app f (.bvar j) := body | default
if j == i then go f m (i+1) else default
/--
If `e` is of the form `(fun x₁ ... xₙ => f x₁ ... xₙ)` and `f` does not contain `x₁`, ..., `xₙ`,

View File

@@ -48,6 +48,8 @@ def introCore (mvarId : MVarId) (max : Nat) (names : Array Name) : SymM (Array F
assignDelayedMVar auxMVar.mvarId! fvars mvarId'
mvarId.assign val
let finalize (lctx : LocalContext) (localInsts : LocalInstances) (fvars : Array Expr) (type : Expr) : SymM (Array Expr × MVarId) := do
if fvars.isEmpty then
return (#[], mvarId)
let type ← instantiateRevS type fvars
let mvar' ← mkFreshExprMVarAt lctx localInsts type .syntheticOpaque mvarDecl.userName
let mvarId' := mvar'.mvarId!

View File

@@ -148,11 +148,14 @@ def propagatePending : OrderM Unit := do
- `h₁ : ↑ue' = ue`
- `h₂ : ↑ve' = ve`
- `h : ue = ve`
**Note**: We currently only support `Nat`. Thus `↑a` is actually
`NatCast.natCast a`. If we decide to support arbitrary semirings
in this module, we must adjust this code.
**Note**: We currently only support `Nat` originals. Thus `↑a` is actually
`NatCast.natCast a`. The lemma `nat_eq` is specialized to `Int`, so we
only invoke it when the cast destination is `Int`. For other types (e.g.
`Rat`), `pushEq ue ve h` above is sufficient and `grind` core can derive
the `Nat` equality via `norm_cast`/cast injectivity if needed.
-/
pushEq ue' ve' <| mkApp7 (mkConst ``Grind.Order.nat_eq) ue' ve' ue ve h₁ h₂ h
if (← inferType ue) == Int.mkType then
pushEq ue' ve' <| mkApp7 (mkConst ``Grind.Order.nat_eq) ue' ve' ue ve h₁ h₂ h
where
/--
If `e` is an auxiliary term used to represent some term `a`, returns
@@ -343,7 +346,7 @@ def getStructIdOf? (e : Expr) : GoalM (Option Nat) := do
return (← get').exprToStructId.find? { expr := e }
def propagateIneq (e : Expr) : GoalM Unit := do
if let some (e', he) := (← get').termMap.find? { expr := e } then
if let some { e := e', h := he, .. } := (← get').termMap.find? { expr := e } then
go e' (some he)
else
go e none
@@ -369,20 +372,27 @@ builtin_grind_propagator propagateLT ↓LT.lt := propagateIneq
public def processNewEq (a b : Expr) : GoalM Unit := do
unless isSameExpr a b do
let h ← mkEqProof a b
if let some (a', h₁) ← getAuxTerm? a then
let some (b', h₂) ← getAuxTerm? b | return ()
if let some { e := a', h := h₁, α } ← getAuxTerm? a then
let some { e := b', h := h₂, .. } ← getAuxTerm? b | return ()
/-
We have
- `h : a = b`
- `h₁ : ↑a = a'`
- `h₂ : ↑b = b'`
where `a'` and `b'` are `NatCast.natCast α inst _` for some type `α`.
-/
let h := mkApp7 (mkConst ``Grind.Order.of_nat_eq) a b a' b' h₁ h₂ h
go a' b' h
if α == Int.mkType then
let h := mkApp7 (mkConst ``Grind.Order.of_nat_eq) a b a' b' h₁ h₂ h
go a' b' h
else
let u ← getDecLevel α
let inst ← synthInstance (mkApp (mkConst ``NatCast [u]) α)
let h := mkApp9 (mkConst ``Grind.Order.of_natCast_eq [u]) α inst a b a' b' h₁ h₂ h
go a' b' h
else
go a b h
where
getAuxTerm? (e : Expr) : GoalM (Option (Expr × Expr)) := do
getAuxTerm? (e : Expr) : GoalM (Option TermMapEntry) := do
return (← get').termMap.find? { expr := e }
go (a b h : Expr) : GoalM Unit := do

View File

@@ -166,9 +166,9 @@ def setStructId (e : Expr) : OrderM Unit := do
exprToStructId := s.exprToStructId.insert { expr := e } structId
}
def updateTermMap (e eNew h : Expr) : GoalM Unit := do
def updateTermMap (e eNew h α : Expr) : GoalM Unit := do
modify' fun s => { s with
termMap := s.termMap.insert { expr := e } (eNew, h)
termMap := s.termMap.insert { expr := e } { e := eNew, h, α }
termMapInv := s.termMapInv.insert { expr := eNew } (e, h)
}
@@ -198,9 +198,9 @@ where
getOriginal? (e : Expr) : GoalM (Option Expr) := do
if let some (e', _) := (← get').termMapInv.find? { expr := e } then
return some e'
let_expr NatCast.natCast _ _ a := e | return none
let_expr NatCast.natCast α _ a := e | return none
if (← alreadyInternalized a) then
updateTermMap a e (← mkEqRefl e)
updateTermMap a e (← mkEqRefl e) α
return some a
else
return none
@@ -290,7 +290,7 @@ def internalizeTerm (e : Expr) : OrderM Unit := do
open Arith.Cutsat in
def adaptNat (e : Expr) : GoalM Expr := do
if let some (eNew, _) := (← get').termMap.find? { expr := e } then
if let some { e := eNew, .. } := (← get').termMap.find? { expr := e } then
return eNew
else match_expr e with
| LE.le _ _ lhs rhs => adaptCnstr lhs rhs (isLT := false)
@@ -307,12 +307,12 @@ where
let h := mkApp6
(mkConst (if isLT then ``Nat.ToInt.lt_eq else ``Nat.ToInt.le_eq))
lhs rhs lhs' rhs' h₁ h₂
updateTermMap e eNew h
updateTermMap e eNew h (← getIntExpr)
return eNew
adaptTerm : GoalM Expr := do
let (eNew, h) ← natToInt e
updateTermMap e eNew h
updateTermMap e eNew h (← getIntExpr)
return eNew
def adapt (α : Expr) (e : Expr) : GoalM (Expr × Expr) := do

View File

@@ -128,6 +128,13 @@ structure Struct where
propagate : List ToPropagate := []
deriving Inhabited
/-- Entry/Value for the map `termMap` in `State` -/
structure TermMapEntry where
e : Expr
h : Expr
α : Expr
deriving Inhabited
/-- State for all order types detected by `grind`. -/
structure State where
/-- Order structures detected. -/
@@ -143,7 +150,7 @@ structure State where
Example: given `x y : Nat`, `x ≤ y + 1` is mapped to `Int.ofNat x ≤ Int.ofNat y + 1`, and proof
of equivalence.
-/
termMap : PHashMap ExprPtr (Expr × Expr) := {}
termMap : PHashMap ExprPtr TermMapEntry := {}
/-- `termMap` inverse -/
termMapInv : PHashMap ExprPtr (Expr × Expr) := {}
deriving Inhabited

View File

@@ -273,6 +273,15 @@ with debug assertions enabled (see the `debugAssertions` option).
@[builtin_doElem_parser] def doDebugAssert := leading_parser:leadPrec
"debug_assert! " >> termParser
@[builtin_doElem_parser] def doRepeat := leading_parser
"repeat " >> doSeq
@[builtin_doElem_parser] def doWhileH := leading_parser
"while " >> ident >> " : " >> withForbidden "do" termParser >> " do " >> doSeq
@[builtin_doElem_parser] def doWhile := leading_parser
"while " >> withForbidden "do" termParser >> " do " >> doSeq
@[builtin_doElem_parser] def doRepeatUntil := leading_parser
"repeat " >> doSeq >> ppDedent ppLine >> "until " >> termParser
/-
We use `notFollowedBy` to avoid counterintuitive behavior.

View File

@@ -484,6 +484,7 @@ open SubExpr (Pos PosMap)
open Delaborator (OptionsPerPos topDownAnalyze DelabM getPPOption)
def delabLevel (l : Level) (prec : Nat) : MetaM Syntax.Level := do
let l ← if getPPInstantiateMVars (← getOptions) then instantiateLevelMVars l else pure l
let mvars := getPPMVarsLevels (← getOptions)
return Level.quote l prec (mvars := mvars) (lIndex? := (← getMCtx).findLevelIndex?)

View File

@@ -235,7 +235,7 @@ public def checkHashUpToDate
: JobM Bool := (·.isUpToDate) <$> checkHashUpToDate' info depTrace depHash oldTrace
/--
**Ror internal use only.**
**For internal use only.**
Checks whether `info` is up-to-date with the trace.
If so, replays the log of the trace if available.
-/
@@ -271,20 +271,24 @@ Returns `true` if the saved trace exists and its hash matches `inputHash`.
If up-to-date, replays the saved log from the trace and sets the current
build action to `replay`. Otherwise, if the log is empty and trace is synthetic,
or if the trace is not up-to-date, the build action will be set to `fetch`.
or if the trace is not up-to-date, the build action will be set to `reuse`.
-/
public def SavedTrace.replayOrFetchIfUpToDate (inputHash : Hash) (self : SavedTrace) : JobM Bool := do
public def SavedTrace.replayCachedIfUpToDate (inputHash : Hash) (self : SavedTrace) : JobM Bool := do
if let .ok data := self then
if data.depHash == inputHash then
if data.synthetic && data.log.isEmpty then
updateAction .fetch
updateAction .reuse
else
updateAction .replay
data.log.replay
return true
updateAction .fetch
updateAction .reuse
return false
@[deprecated replayCachedIfUpToDate (since := "2026-04-15")]
public abbrev SavedTrace.replayOrFetchIfUpToDate (inputHash : Hash) (self : SavedTrace) : JobM Bool := do
self.replayCachedIfUpToDate inputHash
/-- **For internal use only.** -/
public class ToOutputJson (α : Type u) where
toOutputJson (arts : α) : Json
@@ -684,7 +688,7 @@ public def buildArtifactUnlessUpToDate
let fetchArt? restore := do
let some (art : XArtifact exe) ← getArtifacts? inputHash savedTrace pkg
| return none
unless (← savedTrace.replayOrFetchIfUpToDate inputHash) do
unless (← savedTrace.replayCachedIfUpToDate inputHash) do
removeFileIfExists file
writeFetchTrace traceFile inputHash (toJson art.descr)
if restore then

View File

@@ -29,11 +29,18 @@ namespace Lake
public inductive JobAction
/-- No information about this job's action is available. -/
| unknown
/-- Tried to replay a cached build action (set by `buildFileUnlessUpToDate`) -/
/-- Tried to reuse a cached build (e.g., can be set by `replayCachedIfUpToDate`). -/
| reuse
/-- Tried to replay a completed build action (e.g., can be set by `replayIfUpToDate`). -/
| replay
/-- Tried to fetch a build from a store (can be set by `buildUnlessUpToDate?`) -/
/-- Tried to unpack a build from an archive (e.g., unpacking a module `ltar`). -/
| unpack
/--
Tried to fetch a build from a remote store (e.g., set when downloading an artifact
on-demand from a cache service in `buildArtifactUnlessUpToDate`).
-/
| fetch
/-- Tried to perform a build action (set by `buildUnlessUpToDate?`) -/
/-- Tried to perform a build action (e.g., set by `buildAction`). -/
| build
deriving Inhabited, Repr, DecidableEq, Ord
@@ -45,11 +52,13 @@ public instance : Min JobAction := minOfLe
public instance : Max JobAction := maxOfLe
public def merge (a b : JobAction) : JobAction :=
max a b
max a b -- inlines `max`
public def verb (failed : Bool) : JobAction → String
public def verb (failed : Bool) : (self : JobAction) → String
| .unknown => if failed then "Running" else "Ran"
| .reuse => if failed then "Reusing" else "Reused"
| .replay => if failed then "Replaying" else "Replayed"
| .unpack => if failed then "Unpacking" else "Unpacked"
| .fetch => if failed then "Fetching" else "Fetched"
| .build => if failed then "Building" else "Built"

View File

@@ -900,8 +900,9 @@ where
let inputHash := (← getTrace).hash
let some ltarOrArts ← getArtifacts? inputHash savedTrace mod.pkg
| return .inr savedTrace
match (ltarOrArts : ModuleOutputs) with
match (ltarOrArts : ModuleOutputs) with
| .ltar ltar =>
updateAction .unpack
mod.clearOutputArtifacts
mod.unpackLtar ltar.path
-- Note: This branch implies that only the ltar output is (validly) cached.
@@ -919,7 +920,7 @@ where
else
return .inr savedTrace
| .arts arts =>
unless (← savedTrace.replayOrFetchIfUpToDate inputHash) do
unless (← savedTrace.replayCachedIfUpToDate inputHash) do
mod.clearOutputArtifacts
writeFetchTrace mod.traceFile inputHash (toJson arts.descrs)
let arts


@@ -25,12 +25,8 @@ namespace Lake
open Lean (Name)
/-- Fetch the package's direct dependencies. -/
def Package.recFetchDeps (self : Package) : FetchM (Job (Array Package)) := ensureJob do
(pure ·) <$> self.depConfigs.mapM fun cfg => do
let some dep ← findPackageByName? cfg.name
| error s!"{self.prettyName}: package not found for dependency '{cfg.name}' \
(this is likely a bug in Lake)"
return dep
def Package.recFetchDeps (self : Package) : FetchM (Job (Array Package)) := do
return Job.pure self.depPkgs
/-- The `PackageFacetConfig` for the builtin `depsFacet`. -/
public def Package.depsFacetConfig : PackageFacetConfig depsFacet :=
@@ -38,10 +34,7 @@ public def Package.depsFacetConfig : PackageFacetConfig depsFacet :=
/-- Compute a topological ordering of the package's transitive dependencies. -/
def Package.recComputeTransDeps (self : Package) : FetchM (Job (Array Package)) := ensureJob do
(pure ·.toArray) <$> self.depConfigs.foldlM (init := OrdPackageSet.empty) fun deps cfg => do
let some dep ← findPackageByName? cfg.name
| error s!"{self.prettyName}: package not found for dependency '{cfg.name}' \
(this is likely a bug in Lake)"
(pure ·.toArray) <$> self.depPkgs.foldlM (init := OrdPackageSet.empty) fun deps dep => do
let depDeps ← (← fetch <| dep.transDeps).await
return depDeps.foldl (·.insert ·) deps |>.insert dep
@@ -153,7 +146,7 @@ def Package.fetchBuildArchive
let upToDate ← buildUnlessUpToDate? (action := .fetch) archiveFile depTrace traceFile do
download url archiveFile headers
unless upToDate && (← self.buildDir.pathExists) do
updateAction .fetch
updateAction .unpack
untar archiveFile self.buildDir
@[inline]


@@ -210,7 +210,7 @@ def mkMonitorContext (cfg : BuildConfig) (jobs : JobQueue) : BaseIO MonitorConte
let failLv := cfg.failLv
let isVerbose := cfg.verbosity = .verbose
let showProgress := cfg.showProgress
let minAction := if isVerbose then .unknown else .fetch
let minAction := if isVerbose then .unknown else .unpack
let showOptional := isVerbose
let showTime := isVerbose || !useAnsi
let updateFrequency := 100


@@ -9,7 +9,7 @@ prelude
public import Lean.Data.Json
import Init.Data.Nat.Fold
meta import Init.Data.Nat.Fold
import Lake.Util.String
public import Lake.Util.String
public import Init.Data.String.Search
public import Init.Data.String.Extra
import Init.Data.Option.Coe
@@ -141,8 +141,8 @@ public def ofHex? (s : String) : Option Hash :=
if s.utf8ByteSize = 16 && isHex s then ofHex s else none
/-- Returns the hash as 16-digit lowercase hex string. -/
public def hex (self : Hash) : String :=
lpad (String.ofList <| Nat.toDigits 16 self.val.toNat) '0' 16
@[inline] public def hex (self : Hash) : String :=
lowerHexUInt64 self.val
/-- Parse a hash from a string of decimal digits. -/
public def ofDecimal? (s : String) : Option Hash :=


@@ -69,7 +69,7 @@ public structure LakeOptions where
scope? : Option CacheServiceScope := none
platform? : Option CachePlatform := none
toolchain? : Option CacheToolchain := none
rev? : Option String := none
rev? : Option GitRev := none
maxRevs : Nat := 100
shake : Shake.Args := {}
@@ -563,7 +563,7 @@ private def computePackageRev (pkgDir : FilePath) : CliStateM String := do
repo.getHeadRevision
private def putCore
(rev : String) (outputs : FilePath) (artDir : FilePath)
(rev : GitRev) (outputs : FilePath) (artDir : FilePath)
(service : CacheService) (scope : CacheServiceScope)
(platform := CachePlatform.none) (toolchain := CacheToolchain.none)
: LoggerIO Unit := do


@@ -7,6 +7,7 @@ module
prelude
import Init.Control.Do
public import Lake.Util.Git
public import Lake.Util.Log
public import Lake.Util.Version
public import Lake.Config.Artifact
@@ -469,7 +470,7 @@ public def readOutputs? (cache : Cache) (scope : String) (inputHash : Hash) : Lo
cache.dir / "revisions"
/-- Returns path to the input-to-output mappings of a downloaded package revision. -/
@[inline] public def revisionPath (cache : Cache) (scope : String) (rev : String) : FilePath :=
@[inline] public def revisionPath (cache : Cache) (scope : String) (rev : GitRev) : FilePath :=
cache.revisionDir / scope / s!"{rev}.jsonl"
end Cache
@@ -942,7 +943,7 @@ public def uploadArtifacts
public def mapContentType : String := "application/vnd.reservoir.outputs+json-lines"
def s3RevisionUrl
(rev : String) (service : CacheService) (scope : CacheServiceScope)
(rev : GitRev) (service : CacheService) (scope : CacheServiceScope)
(platform := CachePlatform.none) (toolchain := CacheToolchain.none)
: String :=
match scope.impl with
@@ -956,7 +957,7 @@ def s3RevisionUrl
return s!"{url}/{rev}.jsonl"
public def revisionUrl
(rev : String) (service : CacheService) (scope : CacheServiceScope)
(rev : GitRev) (service : CacheService) (scope : CacheServiceScope)
(platform := CachePlatform.none) (toolchain := CacheToolchain.none)
: String :=
if service.isReservoir then Id.run do
@@ -974,7 +975,7 @@ public def revisionUrl
service.s3RevisionUrl rev scope platform toolchain
public def downloadRevisionOutputs?
(rev : String) (cache : Cache) (service : CacheService)
(rev : GitRev) (cache : Cache) (service : CacheService)
(localScope : String) (remoteScope : CacheServiceScope)
(platform := CachePlatform.none) (toolchain := CacheToolchain.none) (force := false)
: LoggerIO (Option CacheMap) := do
@@ -998,7 +999,7 @@ public def downloadRevisionOutputs?
CacheMap.load path platform.isNone
public def uploadRevisionOutputs
(rev : String) (outputs : FilePath) (service : CacheService) (scope : CacheServiceScope)
(rev : GitRev) (outputs : FilePath) (service : CacheService) (scope : CacheServiceScope)
(platform := CachePlatform.none) (toolchain := CacheToolchain.none)
: LoggerIO Unit := do
let url := service.s3RevisionUrl rev scope platform toolchain


@@ -9,6 +9,7 @@ prelude
public import Init.Dynamic
public import Init.System.FilePath
public import Lean.Data.NameMap.Basic
public import Lake.Util.Git
import Init.Data.ToString.Name
import Init.Data.ToString.Macro
@@ -30,7 +31,7 @@ public inductive DependencySrc where
/- A package located at a fixed path relative to the dependent package's directory. -/
| path (dir : FilePath)
/- A package cloned from a Git repository available at a fixed Git `url`. -/
| git (url : String) (rev : Option String) (subDir : Option FilePath)
| git (url : String) (rev : Option GitRev) (subDir : Option FilePath)
deriving Inhabited, Repr
/--


@@ -52,6 +52,8 @@ public structure Package where
remoteUrl : String
/-- Dependency configurations for the package. -/
depConfigs : Array Dependency := #[]
/-- **For internal use only.** Resolved direct dependencies of the package. -/
depPkgs : Array Package := #[]
/-- Target configurations in the order declared by the package. -/
targetDecls : Array (PConfigDecl keyName) := #[]
/-- Name-declaration map of target configurations in the package. -/


@@ -33,14 +33,12 @@ public def computeLakeCache (pkg : Package) (lakeEnv : Lake.Env) : Cache :=
lakeEnv.lakeCache?.getD pkg.lakeDir / "cache"
public structure Workspace.Raw : Type where
/-- The root package of the workspace. -/
root : Package
/-- The detected {lean}`Lake.Env` of the workspace. -/
lakeEnv : Lake.Env
/-- The Lake configuration from the system configuration file. -/
lakeConfig : LoadedLakeConfig
/-- The Lake cache. -/
lakeCache : Cache := private_decl% computeLakeCache root lakeEnv
lakeCache : Cache
/--
The CLI arguments Lake was run with.
Used by {lit}`lake update` to perform a restart of Lake on a toolchain update.
@@ -59,6 +57,7 @@ public structure Workspace.Raw : Type where
deriving Nonempty
public structure Workspace.Raw.WF (ws : Workspace.Raw) : Prop where
size_packages_pos : 0 < ws.packages.size
packages_wsIdx : ∀ (h : i < ws.packages.size), (ws.packages[i]'h).wsIdx = i
/-- A Lake workspace -- the top-level package directory. -/
@@ -67,8 +66,13 @@ public structure Workspace extends raw : Workspace.Raw, wf : raw.WF
public instance : Nonempty Workspace := .intro {
lakeEnv := default
lakeConfig := Classical.ofNonempty
root := Classical.ofNonempty
packages_wsIdx h := by simp at h
lakeCache := Classical.ofNonempty
packages := #[{(Classical.ofNonempty : Package) with wsIdx := 0}]
size_packages_pos := by simp
packages_wsIdx {i} h := by
cases i with
| zero => simp
| succ => simp at h
}
public hydrate_opaque_type OpaqueWorkspace Workspace
@@ -85,9 +89,13 @@ public def Package.defaultTargetRoots (self : Package) : Array Lean.Name :=
namespace Workspace
/-- The root package of the workspace. -/
@[inline] public def root (self : Workspace) : Package :=
self.packages[0]'self.size_packages_pos
/-- **For internal use.** Whether this workspace is Lean itself. -/
@[inline] def bootstrap (ws : Workspace) : Bool :=
ws.root.bootstrap
@[inline] def bootstrap (self : Workspace) : Bool :=
self.root.bootstrap
/-- The path to the workspace's directory (i.e., the directory of the root package). -/
@[inline] public def dir (self : Workspace) : FilePath :=
@@ -193,12 +201,18 @@ This is configured through {lit}`cache.service` entries in the system Lake confi
{self with
packages := self.packages.push pkg
packageMap := self.packageMap.insert pkg.keyName pkg
size_packages_pos := by simp
packages_wsIdx {i} i_lt := by
cases Nat.lt_add_one_iff_lt_or_eq.mp <| Array.size_push .. ▸ i_lt with
| inl i_lt => simpa [Array.getElem_push_lt i_lt] using self.packages_wsIdx i_lt
| inr i_eq => simpa [i_eq] using h
}
/-- **For internal use only.** -/
public theorem packages_addPackage' :
(addPackage' pkg ws h).packages = ws.packages.push pkg
:= by rfl
/-- Add a package to the workspace. -/
@[inline] public def addPackage (pkg : Package) (self : Workspace) : Workspace :=
self.addPackage' {pkg with wsIdx := self.packages.size} rfl
@@ -278,6 +292,11 @@ public def findTargetDecl? (name : Name) (self : Workspace) : Option ((pkg : Pac
@[inline] public def addFacetConfig {name} (cfg : FacetConfig name) (self : Workspace) : Workspace :=
{self with facetConfigs := self.facetConfigs.insert cfg}
/-- **For internal use only.** -/
public theorem packages_addFacetConfig :
(addFacetConfig cfg ws).packages = ws.packages
:= by rfl
/-- Try to find a facet configuration in the workspace with the given name. -/
@[inline] public def findFacetConfig? (name : Name) (self : Workspace) : Option (FacetConfig name) :=
self.facetConfigs.get? name


@@ -8,6 +8,7 @@ module
prelude
public import Lake.Util.Version
public import Lake.Config.Defaults
public import Lake.Util.Git
import Lake.Util.Error
public import Lake.Util.FilePath
import Lake.Util.JsonObject
@@ -75,8 +76,8 @@ public inductive PackageEntrySrc
/-- A remote Git package. -/
| git
(url : String)
(rev : String)
(inputRev? : Option String)
(rev : GitRev)
(inputRev? : Option GitRev)
(subDir? : Option FilePath)
deriving Inhabited


@@ -13,6 +13,8 @@ import Lake.Util.Git
import Lake.Util.IO
import Lake.Reservoir
set_option doc.verso true
open System Lean
/-! # Dependency Materialization
@@ -23,9 +25,12 @@ or resolve a local path dependency.
namespace Lake
/-- Update the Git package in `repo` to `rev` if not already at it. -/
/--
Update the Git package in {lean}`repo` to the revision {lean}`rev?` if not already at it.
If no revision is specified (i.e., {lean}`rev? = none`), this uses the latest {lit}`master`.
-/
def updateGitPkg
(name : String) (repo : GitRepo) (rev? : Option String)
(name : String) (repo : GitRepo) (rev? : Option GitRev)
: LoggerIO PUnit := do
let rev ← repo.findRemoteRevision rev?
if (← repo.getHeadRevision) = rev then
@@ -40,9 +45,9 @@ def updateGitPkg
-- so stale ones from the previous revision cause incorrect trace computations.
repo.clean
/-- Clone the Git package as `repo`. -/
/-- Clone the Git package as {lean}`repo`. -/
def cloneGitPkg
(name : String) (repo : GitRepo) (url : String) (rev? : Option String)
(name : String) (repo : GitRepo) (url : String) (rev? : Option GitRev)
: LoggerIO PUnit := do
logInfo s!"{name}: cloning {url}"
repo.clone url
@@ -52,9 +57,9 @@ def cloneGitPkg
repo.checkoutDetach rev
/--
Update the Git repository from `url` in `repo` to `rev?`.
If `repo` is already from `url`, just checkout the new revision.
Otherwise, delete the local repository and clone a fresh copy from `url`.
Update the Git repository from {lean}`url` in {lean}`repo` to {lean}`rev?`.
If {lean}`repo` is already from {lean}`url`, just checkout the new revision.
Otherwise, delete the local repository and clone a fresh copy from {lean}`url`.
-/
def updateGitRepo
(name : String) (repo : GitRepo) (url : String) (rev? : Option String)
@@ -75,8 +80,9 @@ def updateGitRepo
IO.FS.removeDirAll repo.dir
cloneGitPkg name repo url rev?
/--
Materialize the Git repository from `url` into `repo` at `rev?`.
Materialize the Git repository from {lean}`url` into {lean}`repo` at {lean}`rev?`.
Clone it if no local copy exists, otherwise update it.
-/
def materializeGitRepo
@@ -114,11 +120,11 @@ namespace MaterializedDep
@[inline] public def scope (self : MaterializedDep) : String :=
self.manifestEntry.scope
/-- Path to the dependency's manifest file (relative to `relPkgDir`). -/
/-- Path to the dependency's manifest file (relative to {lean}`relPkgDir`). -/
@[inline] public def relManifestFile? (self : MaterializedDep) : Option FilePath :=
self.manifestEntry.manifestFile?
/-- Path to the dependency's manifest file (relative to `relPkgDir`). -/
/-- Path to the dependency's manifest file (relative to {lean}`relPkgDir`). -/
@[inline] public def relManifestFile (self : MaterializedDep) : FilePath :=
self.relManifestFile?.getD defaultManifestFile
@@ -126,7 +132,7 @@ namespace MaterializedDep
@[inline] public def manifestFile (self : MaterializedDep) : FilePath :=
self.pkgDir / self.relManifestFile
/-- Path to the dependency's configuration file (relative to `relPkgDir`). -/
/-- Path to the dependency's configuration file (relative to {lean}`relPkgDir`). -/
@[inline] public def relConfigFile (self : MaterializedDep) : FilePath :=
self.manifestEntry.configFile
@@ -143,7 +149,7 @@ end MaterializedDep
inductive InputVer
| none
| git (rev : String)
| git (rev : GitRev)
| ver (ver : VerRange)
def pkgNotIndexed (scope name : String) (ver : InputVer) : String :=


@@ -15,6 +15,8 @@ import Lake.Build.Topological
import Lake.Load.Materialize
import Lake.Load.Lean.Eval
import Lake.Load.Package
import Init.Data.Range.Polymorphic.Iterators
import Init.TacticsExtra
open System Lean
@@ -45,23 +47,29 @@ namespace Lake
def Workspace.addFacetDecls (decls : Array FacetDecl) (self : Workspace) : Workspace :=
decls.foldl (·.addFacetConfig ·.config) self
theorem Workspace.packages_addFacetDecls :
(addFacetDecls decls ws).packages = ws.packages
:= by
simp only [addFacetDecls]
apply Array.foldl_induction (fun _ (s : Workspace) => s.packages = ws.packages) rfl
intro i s h
simp only [packages_addFacetConfig, h]
/--
Loads the package configuration of a materialized dependency.
Adds the package and the facets defined within it to the `Workspace`.
-/
def addDepPackage
(dep : MaterializedDep)
(lakeOpts : NameMap String)
(leanOpts : Options) (reconfigure : Bool)
: StateT Workspace LogIO Package := fun ws => do
def Workspace.addDepPackage'
(ws : Workspace) (dep : MaterializedDep)
(lakeOpts : NameMap String) (leanOpts : Options) (reconfigure : Bool)
: LogIO {ws' : Workspace // ws'.packages.size = ws.packages.size + 1} := do
let wsIdx := ws.packages.size
let loadCfg := mkDepLoadConfig ws dep lakeOpts leanOpts reconfigure
let ⟨loadCfg, h⟩ ← resolveConfigFile dep.prettyName loadCfg
let fileCfg ← loadConfigFile loadCfg h
let pkg := mkPackage loadCfg fileCfg wsIdx
let ws := ws.addPackage' pkg wsIdx_mkPackage
let ws := ws.addFacetDecls fileCfg.facetDecls
return (pkg, ws)
let ws := ws.addPackage' pkg wsIdx_mkPackage |>.addFacetDecls fileCfg.facetDecls
return ⟨ws, by simp [ws, packages_addFacetDecls, packages_addPackage']⟩
/--
The resolver's call stack of dependencies.
@@ -103,6 +111,52 @@ abbrev ResolveT m := DepStackT <| StateT Workspace m
recFetchAcyclic (·.baseName) go root
return ws
def Workspace.setDepPkgs
(self : Workspace) (wsIdx : Nat) (depPkgs : Array Package)
: Workspace := {self with
packages := self.packages.modify wsIdx ({· with depPkgs})
size_packages_pos := by simp [self.size_packages_pos]
packages_wsIdx {i} := by
if h : wsIdx = i then
simp [h, Array.getElem_modify_self, self.packages_wsIdx]
else
simp [Array.getElem_modify_of_ne h, self.packages_wsIdx]
}
structure ResolveState where
ws : Workspace
depIdxs : Array Nat
lt_of_mem : ∀ i ∈ depIdxs, i < ws.packages.size
namespace ResolveState
@[inline] def init (ws : Workspace) (size : Nat) : ResolveState :=
{ws, depIdxs := Array.mkEmpty size, lt_of_mem := by simp}
@[inline] def reuseDep (s : ResolveState) (wsIdx : Fin s.ws.packages.size) : ResolveState :=
have lt_of_mem := by
intro i i_mem
cases Array.mem_push.mp i_mem with
| inl i_mem => exact s.lt_of_mem i i_mem
| inr i_eq => simp only [i_eq, wsIdx.isLt]
{s with depIdxs := s.depIdxs.push wsIdx, lt_of_mem}
@[inline] def newDep
(s : ResolveState) (dep : MaterializedDep)
(lakeOpts : NameMap String) (leanOpts : Options) (reconfigure : Bool)
: LogIO ResolveState := do
let ⟨ws, depIdxs, lt_of_mem⟩ := s
let wsIdx := ws.packages.size
let ⟨ws', h⟩ ← ws.addDepPackage' dep lakeOpts leanOpts reconfigure
have lt_of_mem := by
intro i i_mem
cases Array.mem_push.mp i_mem with
| inl i_mem => exact h ▸ Nat.lt_add_one_of_lt (lt_of_mem i i_mem)
| inr i_eq => simp only [wsIdx, i_eq, h, Nat.lt_add_one]
return ⟨ws', depIdxs.push wsIdx, lt_of_mem⟩
end ResolveState
/-
Recursively visits each node in a package's dependency graph, starting from
the workspace package `root`. Each dependency missing from the workspace is
@@ -121,19 +175,31 @@ See `Workspace.updateAndMaterializeCore` for more details.
: m Workspace := do
ws.runResolveT go root stack
where
@[specialize] go pkg recurse : ResolveT m Unit := do
let start := (← getWorkspace).packages.size
@[specialize] go pkg recurse : ResolveT m Unit := fun depStack ws => do
let start := ws.packages.size
-- Materialize and load the missing direct dependencies of `pkg`
pkg.depConfigs.forRevM fun dep => do
let ws ← getWorkspace
if ws.packages.any (·.baseName == dep.name) then
return -- already handled in another branch
let s := ResolveState.init ws pkg.depConfigs.size
let ⟨ws, depIdxs, lt_of_mem⟩ ← pkg.depConfigs.foldrM (m := m) (init := s) fun dep s => do
if let some wsIdx := s.ws.packages.findFinIdx? (·.baseName == dep.name) then
return s.reuseDep wsIdx -- already handled in another branch
if pkg.baseName = dep.name then
error s!"{pkg.prettyName}: package requires itself (or a package with the same name)"
let matDep ← resolve pkg dep (← getWorkspace)
discard <| addDepPackage matDep dep.opts leanOpts reconfigure
let matDep ← resolve pkg dep s.ws
s.newDep matDep dep.opts leanOpts reconfigure
let depsEnd := ws.packages.size
-- Recursively load the dependencies' dependencies
(← getWorkspace).packages.forM recurse start
let ws ← ws.packages.foldlM (start := start) (init := ws) fun ws pkg =>
(·.2) <$> recurse pkg depStack ws
-- Add the package's dependencies to the package
let ws :=
if h_le : depsEnd ws.packages.size then
ws.setDepPkgs pkg.wsIdx <| depIdxs.attach.map fun ⟨wsIdx, h_mem⟩ =>
ws.packages[wsIdx]'(Nat.lt_of_lt_of_le (lt_of_mem wsIdx h_mem) h_le)
else
have : Inhabited Workspace := ws
panic! "workspace shrunk" -- should be unreachable
return ((), ws)
/--
Adds monad state used to update the manifest.
@@ -372,10 +438,14 @@ def Workspace.updateAndMaterializeCore
let start := ws.packages.size
let ws ← (deps.zip matDeps).foldlM (init := ws) fun ws (dep, matDep) => do
addDependencyEntries matDep
let (_, ws) ← addDepPackage matDep dep.opts leanOpts true ws
let ⟨ws, _⟩ ← ws.addDepPackage' matDep dep.opts leanOpts true
return ws
ws.packages.foldlM (init := ws) (start := start) fun ws pkg =>
let stop := ws.packages.size
let ws ← ws.packages.foldlM (init := ws) (start := start) fun ws pkg =>
ws.resolveDepsCore updateAndAddDep pkg [ws.root.baseName] leanOpts true
-- Set dependency packages after `resolveDepsCore` so
-- that the dependencies' dependencies are also properly set
return ws.setDepPkgs ws.root.wsIdx ws.packages[start...<stop]
else
ws.resolveDepsCore updateAndAddDep (leanOpts := leanOpts) (reconfigure := true)
where


@@ -35,17 +35,22 @@ public def loadWorkspaceRoot (config : LoadConfig) : LogIO Workspace := do
let ⟨config, h⟩ ← resolveConfigFile "[root]" config
let fileCfg ← loadConfigFile config h
let root := mkPackage config fileCfg 0
have wsIdx_root : root.wsIdx = 0 := wsIdx_mkPackage
let facetConfigs := fileCfg.facetDecls.foldl (·.insert ·.config) initFacetConfigs
let ws : Workspace := {
root
return {
lakeEnv := config.lakeEnv
lakeCache := computeLakeCache root config.lakeEnv
lakeConfig
lakeArgs? := config.lakeArgs?
facetConfigs
packages_wsIdx h := by simp at h
packages := #[root]
packageMap := DNameMap.empty.insert root.keyName root
size_packages_pos := by simp
packages_wsIdx {i} h := by
cases i with
| zero => simp [wsIdx_root]
| succ => simp at h
}
let ws := ws.addPackage' root wsIdx_mkPackage
return ws
/--
Load a `Workspace` for a Lake package by


@@ -10,6 +10,7 @@ public import Init.Data.ToString
public import Lake.Util.Proc
import Init.Data.String.TakeDrop
import Init.Data.String.Search
import Lake.Util.String
open System
namespace Lake
@@ -36,18 +37,48 @@ public def filterUrl? (url : String) : Option String :=
some url
public def isFullObjectName (rev : String) : Bool :=
rev.length == 40 && rev.all fun c => c.isDigit || ('a' <= c && c <= 'f')
rev.utf8ByteSize == 40 && isHex rev
end Git
public structure GitRepo where
dir : FilePath
public instance : Coe FilePath GitRepo := (·)
public instance : ToString GitRepo := (·.dir.toString)
/--
A commit-ish [Git revision][1].
This can be a SHA1 commit hash, a branch name, or one of Git's more complex specifiers.
[1]: https://git-scm.com/docs/gitrevisions#_specifying_revisions
-/
public abbrev GitRev := String
namespace GitRev
/-- The head revision (i.e., `HEAD`). -/
@[expose] public def head : GitRev := "HEAD"
/-- The revision fetched during the last `git fetch` (i.e., `FETCH_HEAD`). -/
@[expose] public def fetchHead : GitRev := "FETCH_HEAD"
/-- Returns whether this revision is a 40-digit hexadecimal (SHA1) commit hash. -/
public def isFullSha1 (rev : GitRev) : Bool :=
rev.utf8ByteSize == 40 && isHex rev
attribute [deprecated GitRev.isFullSha1 (since := "2026-04-17")] Git.isFullObjectName
/-- Scopes the revision by the remote. -/
@[inline] public def withRemote (remote : String) (rev : GitRev) : GitRev :=
s!"{remote}/{rev}"
end GitRev
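A minimal sketch of how these helpers fit together (illustrative only; since `GitRev` is an abbreviation for `String`, plain string literals work as revisions too):

```lean
-- The named revisions are ordinary strings under the hood.
example : GitRev.head = "HEAD" := rfl
-- `withRemote` scopes a branch by a remote, as `resolveRemoteRevision`
-- does before falling back to a bare lookup.
#eval GitRev.withRemote "origin" "master"  -- "origin/master"
```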
namespace GitRepo
public instance : Coe FilePath GitRepo := (·)
public instance : ToString GitRepo := (·.dir.toString)
public def cwd : GitRepo := "."
@[inline] public def dirExists (repo : GitRepo) : BaseIO Bool :=
@@ -71,12 +102,18 @@ public def clone (url : String) (repo : GitRepo) : LogIO PUnit :=
public def quietInit (repo : GitRepo) : LogIO PUnit :=
repo.execGit #["init", "-q"]
public def bareInit (repo : GitRepo) : LogIO PUnit :=
repo.execGit #["init", "--bare", "-q"]
public def insideWorkTree (repo : GitRepo) : BaseIO Bool := do
repo.testGit #["rev-parse", "--is-inside-work-tree"]
public def fetch (repo : GitRepo) (remote := Git.defaultRemote) : LogIO PUnit :=
repo.execGit #["fetch", "--tags", "--force", remote]
public def addWorktreeDetach (path : FilePath) (rev : GitRev) (repo : GitRepo) : LogIO PUnit :=
repo.execGit #["worktree", "add", "--detach", path.toString, rev]
public def checkoutBranch (branch : String) (repo : GitRepo) : LogIO PUnit :=
repo.execGit #["checkout", "-B", branch]
@@ -87,60 +124,80 @@ public def checkoutDetach (hash : String) (repo : GitRepo) : LogIO PUnit :=
public def clean (repo : GitRepo) : LogIO PUnit :=
repo.execGit #["clean", "-xf"]
public def resolveRevision? (rev : String) (repo : GitRepo) : BaseIO (Option String) := do
/-- Resolves the revision to a Git object name (SHA1 hash) which may or may not exist in the repository. -/
public def resolveRevision? (rev : GitRev) (repo : GitRepo) : BaseIO (Option GitRev) := do
repo.captureGit? #["rev-parse", "--verify", "--end-of-options", rev]
public def resolveRevision (rev : String) (repo : GitRepo) : LogIO String := do
if Git.isFullObjectName rev then return rev
/-- Resolves the revision to a valid commit hash within the repository. -/
public def findCommit? (rev : GitRev) (repo : GitRepo) : BaseIO (Option GitRev) := do
repo.captureGit? #["rev-parse", "--verify", "--end-of-options", rev ++ "^{commit}"]
public def resolveRevision (rev : GitRev) (repo : GitRepo) : LogIO GitRev := do
if rev.isFullSha1 then return rev
if let some rev ← repo.resolveRevision? rev then return rev
error s!"{repo}: revision not found '{rev}'"
@[inline] public def getHeadRevision? (repo : GitRepo) : BaseIO (Option String) :=
repo.resolveRevision? "HEAD"
@[inline] public def getHeadRevision? (repo : GitRepo) : BaseIO (Option GitRev) :=
repo.resolveRevision? .head
public def getHeadRevision (repo : GitRepo) : LogIO String := do
public def getHeadRevision (repo : GitRepo) : LogIO GitRev := do
if let some rev ← repo.getHeadRevision? then return rev
error s!"{repo}: could not resolve 'HEAD' to a commit; \
the repository may be corrupt, so you may need to remove it and try again"
public def getHeadRevisions (repo : GitRepo) (n : Nat := 0) : LogIO (Array String) := do
public def fetchRevision? (repo : GitRepo) (remote : String) (rev : GitRev) : LogIO (Option GitRev) := do
if (← repo.testGit #["fetch", "--tags", "--force", "--refetch", "--filter=tree:0", remote, rev]) then
let some rev ← repo.resolveRevision? .fetchHead
| error s!"{repo}: could not resolve 'FETCH_HEAD' to a commit after fetching; \
this may be an issue with Lake; please report it"
return rev
else
return none
public def getHeadRevisions (repo : GitRepo) (n : Nat := 0) : LogIO (Array GitRev) := do
let args := #["rev-list", "HEAD"]
let args := if n != 0 then args ++ #["-n", toString n] else args
let revs ← repo.captureGit args
return revs.split '\n' |>.toStringArray
public def resolveRemoteRevision (rev : String) (remote := Git.defaultRemote) (repo : GitRepo) : LogIO String := do
if Git.isFullObjectName rev then return rev
if let some rev ← repo.resolveRevision? s!"{remote}/{rev}" then return rev
public def resolveRemoteRevision (rev : GitRev) (remote := Git.defaultRemote) (repo : GitRepo) : LogIO GitRev := do
if rev.isFullSha1 then return rev
if let some rev ← repo.resolveRevision? (rev.withRemote remote) then return rev
if let some rev ← repo.resolveRevision? rev then return rev
error s!"{repo}: revision not found '{rev}'"
public def findRemoteRevision (repo : GitRepo) (rev? : Option String := none) (remote := Git.defaultRemote) : LogIO String := do
public def findRemoteRevision (repo : GitRepo) (rev? : Option GitRev := none) (remote := Git.defaultRemote) : LogIO String := do
repo.fetch remote; repo.resolveRemoteRevision (rev?.getD Git.upstreamBranch) remote
public def branchExists (rev : String) (repo : GitRepo) : BaseIO Bool := do
public def branchExists (rev : GitRev) (repo : GitRepo) : BaseIO Bool := do
repo.testGit #["show-ref", "--verify", s!"refs/heads/{rev}"]
public def revisionExists (rev : String) (repo : GitRepo) : BaseIO Bool := do
public def revisionExists (rev : GitRev) (repo : GitRepo) : BaseIO Bool := do
repo.testGit #["rev-parse", "--verify", rev ++ "^{commit}"]
public def getTags (repo : GitRepo) : BaseIO (List String) := do
let some out ← repo.captureGit? #["tag"] | return []
return out.split '\n' |>.toStringList
public def findTag? (rev : String := "HEAD") (repo : GitRepo) : BaseIO (Option String) := do
public def findTag? (rev : GitRev := .head) (repo : GitRepo) : BaseIO (Option String) := do
repo.captureGit? #["describe", "--tags", "--exact-match", rev]
public def getRemoteUrl?
(remote := Git.defaultRemote) (repo : GitRepo)
: BaseIO (Option String) := do repo.captureGit? #["remote", "get-url", remote]
: BaseIO (Option String) := repo.captureGit? #["remote", "get-url", remote]
public def addRemote (remote : String) (url : String) (repo : GitRepo) : LogIO Unit :=
repo.execGit #["remote", "add", remote, url]
public def setRemoteUrl (remote : String) (url : String) (repo : GitRepo) : LogIO Unit :=
repo.execGit #["remote", "set-url", remote, url]
public def getFilteredRemoteUrl?
(remote := Git.defaultRemote) (repo : GitRepo)
: BaseIO (Option String) := OptionT.run do Git.filterUrl? (← repo.getRemoteUrl? remote)
public def hasNoDiff (repo : GitRepo) : BaseIO Bool := do
repo.testGit #["diff", "HEAD", "--exit-code"]
repo.testGit #["diff", "--exit-code", "HEAD"]
@[inline] public def hasDiff (repo : GitRepo) : BaseIO Bool := do
not <$> repo.hasNoDiff


@@ -7,6 +7,7 @@ module
prelude
public import Init.Data.ToString.Basic
import Init.Data.UInt.Lemmas
import Init.Data.String.Basic
import Init.Data.Nat.Fold
@@ -33,3 +34,38 @@ public def isHex (s : String) : Bool :=
65 ≤ c -- 'A'
else
false
def lowerHexByte (n : UInt8) : UInt8 :=
if n ≤ 9 then
n + 48 -- + '0'
else
n + 87 -- + ('a' - 10)
theorem isValidChar_of_lt_256 (h : n < 256) : isValidChar n :=
Or.inl <| Nat.lt_trans h (by decide)
def lowerHexChar (n : UInt8) : Char :=
⟨lowerHexByte n |>.toUInt32, isValidChar_of_lt_256 <|
UInt32.lt_iff_toNat_lt.mpr <| (lowerHexByte n).toNat_lt⟩
public def lowerHexUInt64 (n : UInt64) : String :=
String.ofByteArray (ByteArray.emptyWithCapacity 16) ByteArray.isValidUTF8_empty
|>.push (lowerHexChar (n >>> 60 &&& 0xf).toUInt8)
|>.push (lowerHexChar (n >>> 56 &&& 0xf).toUInt8)
|>.push (lowerHexChar (n >>> 52 &&& 0xf).toUInt8)
|>.push (lowerHexChar (n >>> 48 &&& 0xf).toUInt8)
|>.push (lowerHexChar (n >>> 44 &&& 0xf).toUInt8)
|>.push (lowerHexChar (n >>> 40 &&& 0xf).toUInt8)
|>.push (lowerHexChar (n >>> 36 &&& 0xf).toUInt8)
|>.push (lowerHexChar (n >>> 32 &&& 0xf).toUInt8)
|>.push (lowerHexChar (n >>> 28 &&& 0xf).toUInt8)
|>.push (lowerHexChar (n >>> 24 &&& 0xf).toUInt8)
|>.push (lowerHexChar (n >>> 20 &&& 0xf).toUInt8)
|>.push (lowerHexChar (n >>> 16 &&& 0xf).toUInt8)
|>.push (lowerHexChar (n >>> 12 &&& 0xf).toUInt8)
|>.push (lowerHexChar (n >>> 8 &&& 0xf).toUInt8)
|>.push (lowerHexChar (n >>> 4 &&& 0xf).toUInt8)
|>.push (lowerHexChar (n &&& 0xf).toUInt8)
-- sanity check
example : "0123456789abcdef" = lowerHexUInt64 0x0123456789abcdef := by decide

Binary file not shown.

Some files were not shown because too many files have changed in this diff.