Mirror of https://github.com/leanprover/lean4.git, synced 2026-04-04 19:24:09 +00:00

Compare commits: 1 commit, `sharecommo...sublist_sp` (a4f73120d9)
@@ -5,8 +5,7 @@ See below for the checklist for release candidates.
We'll use `v4.6.0` as the intended release version as a running example.
- One week before the planned release, ensure that (1) someone has written the release notes and (2) someone has written the first draft of the release blog post.
  If there is any material in `./releases_drafts/`, then the release notes are not done. (See the section "Writing the release notes".)
- One week before the planned release, ensure that someone has written the first draft of the release blog post
- `git checkout releases/v4.6.0`
  (This branch should already exist, from the release candidates.)
- `git pull`
@@ -14,6 +13,11 @@ We'll use `v4.6.0` as the intended release version as a running example.
  - `set(LEAN_VERSION_MINOR 6)` (for whichever `6` is appropriate)
  - `set(LEAN_VERSION_IS_RELEASE 1)`
  - (both of these should already be in place from the release candidates)
- In `RELEASES.md`, verify that the `v4.6.0` section has been completed during the release candidate cycle.
  It should be in bullet point form, with a point for every significant PR,
  and may have a paragraph describing each major new language feature.
  It should have a "breaking changes" section calling out changes that are specifically likely
  to cause problems for downstream users.
- `git tag v4.6.0`
- `git push $REMOTE v4.6.0`, where `$REMOTE` is the upstream Lean repository (e.g., `origin`, `upstream`)
- Now wait, while CI runs.
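The tag-and-push steps above can be sketched as a dry-run shell script. This is a hedged illustration, not part of the checklist: `VERSION` and `REMOTE` are placeholders you would substitute, and `run` only echoes each command instead of executing it.

```shell
# Dry-run sketch of the stable-release tagging steps; VERSION and REMOTE are assumptions.
VERSION="v4.6.0"
REMOTE="origin"
run() { echo "+ $*"; }   # echo each command instead of executing it

run git checkout "releases/$VERSION"   # branch already exists from the rc cycle
run git pull
run git tag "$VERSION"
run git push "$REMOTE" "$VERSION"
```

Dropping the `run` wrapper turns the sketch into the real commands.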
@@ -24,9 +28,8 @@ We'll use `v4.6.0` as the intended release version as a running example.
  you may want to start on the release candidate checklist now.
- Go to https://github.com/leanprover/lean4/releases and verify that the `v4.6.0` release appears.
- Edit the release notes on Github to select "Set as the latest release".
- Follow the instructions in creating a release candidate for the "GitHub release notes" step,
  now that we have a written `RELEASES.md` section.
  Do a quick sanity check.
- Copy and paste the Github release notes from the previous release candidate for this version
  (e.g. `v4.6.0-rc1`), and quickly sanity check.
- Next, we will move a curated list of downstream repos to the latest stable release.
- For each of the repositories listed below:
  - Make a PR to `master`/`main` changing the toolchain to `v4.6.0`
@@ -89,10 +92,6 @@ We'll use `v4.6.0` as the intended release version as a running example.
  - Toolchain bump PR including updated Lake manifest
  - Create and push the tag
  - Merge the tag into `stable`
- The `v4.6.0` section of `RELEASES.md` is out of sync between
  `releases/v4.6.0` and `master`. This should be reconciled:
  - Replace the `v4.6.0` section on `master` with the `v4.6.0` section on `releases/v4.6.0`
    and commit this to `master`.
- Merge the release announcement PR for the Lean website; it will be deployed automatically
- Finally, make an announcement!
  This should go in https://leanprover.zulipchat.com/#narrow/stream/113486-announce, with topic `v4.6.0`.
@@ -103,6 +102,7 @@ We'll use `v4.6.0` as the intended release version as a running example.

## Optimistic(?) time estimates:
- Initial checks and push the tag: 30 minutes.
  - Note that if `RELEASES.md` has discrepancies this could take longer!
- Waiting for the release: 60 minutes.
- Fixing release notes: 10 minutes.
- Bumping toolchains in downstream repositories, up to creating the Mathlib PR: 30 minutes.
@@ -129,26 +129,29 @@ We'll use `v4.7.0-rc1` as the intended release version in this example.
```
git checkout nightly-2024-02-29
git checkout -b releases/v4.7.0
```
- In `RELEASES.md` replace `Development in progress` in the `v4.7.0` section with `Release notes to be written.`
- We will rely on automatically generated release notes for release candidates,
  and the written release notes will be used for stable versions only.
  It is essential to choose the nightly that will become the release candidate as early as possible, to avoid confusion.
- In `RELEASES.md` remove `(development in progress)` from the `v4.7.0` section header.
- Our current goal is to have written release notes only about major language features or breaking changes,
  and to rely on automatically generated release notes for bugfixes and minor changes.
- Do not wait on `RELEASES.md` being perfect before creating the `release/v4.7.0` branch. It is essential to choose the nightly which will become the release candidate as early as possible, to avoid confusion.
- If there are major changes not reflected in `RELEASES.md` already, you may need to solicit help from the authors.
- Minor changes and bug fixes do not need to be documented in `RELEASES.md`: they will be added automatically on the Github release page.
- Commit your changes to `RELEASES.md`, and push.
- Remember that changes to `RELEASES.md` after you have branched `releases/v4.7.0` should also be cherry-picked back to `master`.
- In `src/CMakeLists.txt`,
  - verify that you see `set(LEAN_VERSION_MINOR 7)` (for whichever `7` is appropriate); this should already have been updated when the development cycle began.
  - `set(LEAN_VERSION_IS_RELEASE 1)` (this should be a change; on `master` and nightly releases it is always `0`).
- Commit your changes to `src/CMakeLists.txt`, and push.
- `git tag v4.7.0-rc1`
- `git push origin v4.7.0-rc1`
- Ping the FRO Zulip that release notes need to be written. The release notes do not block completing the rest of this checklist.
- Now wait, while CI runs.
  - You can monitor this at `https://github.com/leanprover/lean4/actions/workflows/ci.yml`, looking for the `v4.7.0-rc1` tag.
  - This step can take up to an hour.
- (GitHub release notes) Once the release appears at https://github.com/leanprover/lean4/releases/
- Once the release appears at https://github.com/leanprover/lean4/releases/
  - Edit the release notes on Github to select the "Set as a pre-release" box.
  - If release notes have been written already, copy the section of `RELEASES.md` for this version into the Github release notes
    and use the title "Changes since v4.6.0 (from RELEASES.md)".
  - Otherwise, in the "previous tag" dropdown, select `v4.6.0`, and click "Generate release notes".
    This will add a list of all the commits since the last stable version.
  - Copy the section of `RELEASES.md` for this version into the Github release notes.
  - Use the title "Changes since v4.6.0 (from RELEASES.md)"
  - Then in the "previous tag" dropdown, select `v4.6.0`, and click "Generate release notes".
  - This will add a list of all the commits since the last stable version.
  - Delete anything already mentioned in the hand-written release notes above.
  - Delete "update stage0" commits, and anything with a completely inscrutable commit message.
  - Briefly rearrange the remaining items by category (e.g. `simp`, `lake`, `bug fixes`),
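The release-candidate branch-and-tag steps can likewise be sketched as a dry run. This is a hedged illustration only: `NIGHTLY`, `RC`, and `BRANCH` are placeholders, and `run` echoes commands rather than executing them.

```shell
# Dry-run sketch of the rc branch/tag steps; NIGHTLY, RC, and BRANCH are assumptions.
NIGHTLY="nightly-2024-02-29"
RC="v4.7.0-rc1"
BRANCH="releases/v4.7.0"
run() { echo "+ $*"; }   # echo each command instead of executing it

run git checkout "$NIGHTLY"
run git checkout -b "$BRANCH"
# ...edit src/CMakeLists.txt to set(LEAN_VERSION_IS_RELEASE 1), commit, and push...
run git tag "$RC"
run git push origin "$RC"
```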
@@ -187,17 +190,12 @@ We'll use `v4.7.0-rc1` as the intended release version in this example.
  Please also make sure that whoever is handling social media knows the release is out.
- Begin the next development cycle (i.e. for `v4.8.0`) on the Lean repository, by making a PR that:
  - Updates `src/CMakeLists.txt` to say `set(LEAN_VERSION_MINOR 8)`
  - Replaces the "development in progress" in the `v4.7.0` section of `RELEASES.md` with
    ```
    Release candidate, release notes will be copied from branch `releases/v4.7.0` once completed.
    ```
    and inserts the following section before that section:
    ```
    v4.8.0
    ----------
    Development in progress.
    ```
  - Removes all the entries from the `./releases_drafts/` folder.
- In `RELEASES.md`, update the `v4.7.0` section to say:
  "Release candidate, release notes will be copied from branch `releases/v4.7.0` once completed."
  Make sure that whoever is preparing the release notes during this cycle knows that it is their job to do so!
- In `RELEASES.md`, update the `v4.8.0` section to say:
  "Development in progress".
- In `RELEASES.md`, verify that the old section `v4.6.0` has the full release notes from the `releases/v4.6.0` branch.

## Time estimates:
Slightly longer than the corresponding steps for a stable release.
@@ -231,18 +229,3 @@ Please read https://leanprover-community.github.io/contribute/tags_and_branches.
* It is always okay to merge in the following directions:
  `master` -> `bump/v4.7.0` -> `bump/nightly-2024-02-15` -> `nightly-testing`.
  Please remember to push any merges you make to intermediate steps!

# Writing the release notes

We are currently trying a system where release notes are compiled all at once by someone looking through the commit history.
The exact steps are a work in progress.
Here is the general idea:

* The work is done right on the `releases/v4.6.0` branch sometime after it is created but before the stable release is made.
  The release notes for `v4.6.0` will be copied to `master`.
* There can be material for release notes entries in commit messages.
* There can also be pre-written entries in `./releases_drafts`, which should all be incorporated in the release notes and then deleted from the branch.
  See `./releases_drafts/README.md` for more information.
* The release notes should be written from a downstream expert user's point of view.

This section will be updated when the next release notes are written (for `v4.10.0`).
@@ -1089,18 +1089,15 @@ def InvImage {α : Sort u} {β : Sort v} (r : β → β → Prop) (f : α → β
  fun a₁ a₂ => r (f a₁) (f a₂)

/--
The transitive closure `TransGen r` of a relation `r` is the smallest relation which is
transitive and contains `r`. `TransGen r a z` if and only if there exists a sequence
The transitive closure `r⁺` of a relation `r` is the smallest relation which is
transitive and contains `r`. `r⁺ a z` if and only if there exists a sequence
`a r b r ... r z` of length at least 1 connecting `a` to `z`.
-/
inductive Relation.TransGen {α : Sort u} (r : α → α → Prop) : α → α → Prop
  /-- If `r a b` then `TransGen r a b`. This is the base case of the transitive closure. -/
  | single {a b} : r a b → TransGen r a b
inductive TC {α : Sort u} (r : α → α → Prop) : α → α → Prop where
  /-- If `r a b` then `r⁺ a b`. This is the base case of the transitive closure. -/
  | base : ∀ a b, r a b → TC r a b
  /-- The transitive closure is transitive. -/
  | tail {a b c} : TransGen r a b → r b c → TransGen r a c

/-- Deprecated synonym for `Relation.TransGen`. -/
@[deprecated Relation.TransGen (since := "2024-07-16")] abbrev TC := @Relation.TransGen
  | trans : ∀ a b c, TC r a b → TC r b c → TC r a c

/-! # Subtype -/
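As a sanity check of the new constructors, here is a hedged usage sketch (assuming `Relation.TransGen` with `single` and `tail` as declared above): two steps of the successor relation chain into the transitive closure.

```lean
-- Illustrative only: r relates n to n + 1, so 0 and 2 are related by the closure.
example : Relation.TransGen (fun a b : Nat => b = a + 1) 0 2 :=
  .tail (.single rfl) rfl
```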
@@ -51,7 +51,7 @@ theorem foldlM_eq_foldlM_data.aux [Monad m]
    simp [foldlM_eq_foldlM_data.aux f arr i (j+1) H]
    rw (config := {occs := .pos [2]}) [← List.get_drop_eq_drop _ _ ‹_›]
    rfl
  · rw [List.drop_of_length_le (Nat.ge_of_not_lt ‹_›)]; rfl
  · rw [List.drop_length_le (Nat.ge_of_not_lt ‹_›)]; rfl

theorem foldlM_eq_foldlM_data [Monad m]
    (f : β → α → m β) (init : β) (arr : Array α) :

@@ -141,7 +141,7 @@ where
  · rw [← List.get_drop_eq_drop _ i ‹_›]
    simp only [aux (i + 1), map_eq_pure_bind, data_length, List.foldlM_cons, bind_assoc, pure_bind]
    rfl
  · rw [List.drop_of_length_le (Nat.ge_of_not_lt ‹_›)]; rfl
  · rw [List.drop_length_le (Nat.ge_of_not_lt ‹_›)]; rfl
termination_by arr.size - i
decreasing_by decreasing_trivial_pre_omega
@@ -101,13 +101,13 @@ Returns an undefined value if `x` is not finite.
instance : ToString Float where
  toString := Float.toString

@[extern "lean_uint64_to_float"] opaque UInt64.toFloat (n : UInt64) : Float

instance : Repr Float where
  reprPrec n prec := if n < UInt64.toFloat 0 then Repr.addAppParen (toString n) prec else toString n
  reprPrec n _ := Float.toString n

instance : ReprAtom Float := ⟨⟩

@[extern "lean_uint64_to_float"] opaque UInt64.toFloat (n : UInt64) : Float

@[extern "sin"] opaque Float.sin : Float → Float
@[extern "cos"] opaque Float.cos : Float → Float
@[extern "tan"] opaque Float.tan : Float → Float
@@ -22,7 +22,7 @@ along with `@[csimp]` lemmas,

In `Init.Data.List.Lemmas` we develop the full API for these functions.

Recall that `length`, `get`, `set`, `foldl`, and `concat` have already been defined in `Init.Prelude`.
Recall that `length`, `get`, `set`, `fold`, and `concat` have already been defined in `Init.Prelude`.

The operations are organized as follows:
* Equality: `beq`, `isEqv`.
@@ -33,7 +33,7 @@ The operations are organized as follows:
  and decidability for predicates quantifying over membership in a `List`.
* Sublists: `take`, `drop`, `takeWhile`, `dropWhile`, `partition`, `dropLast`,
  `isPrefixOf`, `isPrefixOf?`, `isSuffixOf`, `isSuffixOf?`, `Subset`, `Sublist`, `rotateLeft` and `rotateRight`.
* Manipulating elements: `replace`, `insert`, `erase`, `eraseP`, `eraseIdx`, `find?`, `findSome?`, and `lookup`.
* Manipulating elements: `replace`, `insert`, `erase`, `eraseIdx`, `find?`, `findSome?`, and `lookup`.
* Logic: `any`, `all`, `or`, and `and`.
* Zippers: `zipWith`, `zip`, `zipWithAll`, and `unzip`.
* Ranges and enumeration: `range`, `iota`, `enumFrom`, and `enum`.
@@ -942,55 +942,6 @@ def rotateRight (xs : List α) (n : Nat := 1) : List α :=

@[simp] theorem rotateRight_nil : ([] : List α).rotateRight n = [] := rfl

/-! ## Pairwise, Nodup -/

section Pairwise

variable (R : α → α → Prop)

/--
`Pairwise R l` means that all the elements with earlier indexes are
`R`-related to all the elements with later indexes.
```
Pairwise R [1, 2, 3] ↔ R 1 2 ∧ R 1 3 ∧ R 2 3
```
For example if `R = (·≠·)` then it asserts `l` has no duplicates,
and if `R = (·<·)` then it asserts that `l` is (strictly) sorted.
-/
inductive Pairwise : List α → Prop
  /-- All elements of the empty list are vacuously pairwise related. -/
  | nil : Pairwise []
  /-- `a :: l` is `Pairwise R` if `a` `R`-relates to every element of `l`,
  and `l` is `Pairwise R`. -/
  | cons : ∀ {a : α} {l : List α}, (∀ a', a' ∈ l → R a a') → Pairwise l → Pairwise (a :: l)

attribute [simp] Pairwise.nil

variable {R}

@[simp] theorem pairwise_cons : Pairwise R (a::l) ↔ (∀ a', a' ∈ l → R a a') ∧ Pairwise R l :=
  ⟨fun | .cons h₁ h₂ => ⟨h₁, h₂⟩, fun ⟨h₁, h₂⟩ => h₂.cons h₁⟩

instance instDecidablePairwise [DecidableRel R] :
    (l : List α) → Decidable (Pairwise R l)
  | [] => isTrue .nil
  | hd :: tl =>
    match instDecidablePairwise tl with
    | isTrue ht =>
      match decidableBAll (R hd) tl with
      | isFalse hf => isFalse fun hf' => hf (pairwise_cons.1 hf').1
      | isTrue ht' => isTrue <| pairwise_cons.mpr (And.intro ht' ht)
    | isFalse hf => isFalse fun | .cons _ ih => hf ih

end Pairwise

/-- `Nodup l` means that `l` has no duplicates, that is, any element appears at most
once in the List. It is defined as `Pairwise (≠)`. -/
def Nodup : List α → Prop := Pairwise (· ≠ ·)

instance nodupDecidable [DecidableEq α] : ∀ l : List α, Decidable (Nodup l) :=
  instDecidablePairwise

/-! ## Manipulating elements -/

/-! ### replace -/
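A brief, hedged usage sketch of the `Pairwise`/`Nodup` API above (it assumes the decidability instances shown make `decide` applicable on concrete lists):

```lean
example : List.Pairwise (· < ·) [1, 2, 3] := by decide   -- strictly sorted
example : List.Nodup [1, 2, 3] := by decide              -- no duplicates
```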
@@ -1036,11 +987,6 @@ theorem erase_cons [BEq α] (a b : α) (l : List α) :
    (b :: l).erase a = if b == a then l else b :: l.erase a := by
  simp only [List.erase]; split <;> simp_all

/-- `eraseP p l` removes the first element of `l` satisfying the predicate `p`. -/
def eraseP (p : α → Bool) : List α → List α
  | [] => []
  | a :: l => bif p a then l else a :: eraseP p l

/-! ### eraseIdx -/

/--
@@ -295,24 +295,6 @@ theorem replicateTR_loop_eq : ∀ n, replicateTR.loop a n acc = replicate n a ++
    · rw [IH] <;> simp_all
    · simp

/-- Tail-recursive version of `eraseP`. -/
@[inline] def erasePTR (p : α → Bool) (l : List α) : List α := go l #[] where
  /-- Auxiliary for `erasePTR`: `erasePTR.go p l xs acc = acc.toList ++ eraseP p xs`,
  unless `xs` does not contain any elements satisfying `p`, in which case it returns `l`. -/
  @[specialize] go : List α → Array α → List α
    | [], _ => l
    | a :: l, acc => bif p a then acc.toListAppend l else go l (acc.push a)

@[csimp] theorem eraseP_eq_erasePTR : @eraseP = @erasePTR := by
  funext α p l; simp [erasePTR]
  let rec go (acc) : ∀ xs, l = acc.data ++ xs →
      erasePTR.go p l xs acc = acc.data ++ xs.eraseP p
    | [] => fun h => by simp [erasePTR.go, eraseP, h]
    | x::xs => by
      simp [erasePTR.go, eraseP]; cases p x <;> simp
      · intro h; rw [go _ xs]; {simp}; simp [h]
  exact (go #[] _ rfl).symm

/-! ### eraseIdx -/

/-- Tail recursive version of `List.eraseIdx`. -/
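A small hedged check of `eraseP` on a concrete list (assuming the definition above; by the `@[csimp]` lemma, `erasePTR` computes the same function): only the first element satisfying the predicate is removed.

```lean
example : List.eraseP (· == 2) [1, 2, 3, 2] = [1, 3, 2] := rfl
```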
File diff suppressed because it is too large
@@ -120,43 +120,6 @@ theorem get?_take_eq_if {l : List α} {n m : Nat} :
    (l.take n).get? m = if m < n then l.get? m else none := by
  simp [getElem?_take_eq_if]

theorem head?_take {l : List α} {n : Nat} :
    (l.take n).head? = if n = 0 then none else l.head? := by
  simp [head?_eq_getElem?, getElem?_take_eq_if]
  split
  · rw [if_neg (by omega)]
  · rw [if_pos (by omega)]

theorem head_take {l : List α} {n : Nat} (h : l.take n ≠ []) :
    (l.take n).head h = l.head (by simp_all) := by
  apply Option.some_inj.1
  rw [← head?_eq_head, ← head?_eq_head, head?_take, if_neg]
  simp_all

theorem getLast?_take {l : List α} : (l.take n).getLast? = if n = 0 then none else l[n - 1]?.or l.getLast? := by
  rw [getLast?_eq_getElem?, getElem?_take_eq_if, length_take]
  split
  · rw [if_neg (by omega)]
    rw [Nat.min_def]
    split
    · rw [getElem?_eq_getElem (by omega)]
      simp
    · rw [← getLast?_eq_getElem?, getElem?_eq_none (by omega)]
      simp
  · rw [if_pos]
    omega

theorem getLast_take {l : List α} (h : l.take n ≠ []) :
    (l.take n).getLast h = l[n - 1]?.getD (l.getLast (by simp_all)) := by
  rw [getLast_eq_getElem, getElem_take']
  simp [length_take, Nat.min_def]
  simp at h
  split
  · rw [getElem?_eq_getElem (by omega)]
    simp
  · rw [getElem?_eq_none (by omega), getLast_eq_getElem]
    simp

@[simp]
theorem take_eq_take :
    ∀ {l : List α} {m n : Nat}, l.take m = l.take n ↔ min m l.length = min n l.length
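Concrete instances of what `head?_take` and `getLast?_take` describe (a hedged sketch, assuming core `List.take`, `head?`, and `getLast?`):

```lean
example : ([1, 2, 3].take 2).head? = some 1 := rfl
example : ([1, 2, 3].take 2).getLast? = some 2 := rfl
```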
@@ -282,31 +245,6 @@ theorem getElem?_drop (L : List α) (i j : Nat) : (L.drop i)[j]? = L[i + j]? :=
theorem get?_drop (L : List α) (i j : Nat) : get? (L.drop i) j = get? L (i + j) := by
  simp

theorem head?_drop (l : List α) (n : Nat) :
    (l.drop n).head? = l[n]? := by
  rw [head?_eq_getElem?, getElem?_drop, Nat.add_zero]

theorem head_drop {l : List α} {n : Nat} (h : l.drop n ≠ []) :
    (l.drop n).head h = l[n]'(by simp_all) := by
  have w : n < l.length := length_lt_of_drop_ne_nil h
  simpa [head?_eq_head, getElem?_eq_getElem, h, w] using head?_drop l n

theorem getLast?_drop {l : List α} : (l.drop n).getLast? = if l.length ≤ n then none else l.getLast? := by
  rw [getLast?_eq_getElem?, getElem?_drop]
  rw [length_drop]
  split
  · rw [getElem?_eq_none (by omega)]
  · rw [getLast?_eq_getElem?]
    congr
    omega

theorem getLast_drop {l : List α} (h : l.drop n ≠ []) :
    (l.drop n).getLast h = l.getLast (ne_nil_of_length_pos (by simp at h; omega)) := by
  simp only [ne_eq, drop_eq_nil_iff_le] at h
  apply Option.some_inj.1
  simp only [← getLast?_eq_getLast, getLast?_drop, ite_eq_right_iff]
  omega

theorem set_eq_take_append_cons_drop {l : List α} {n : Nat} {a : α} :
    l.set n a = if n < l.length then l.take n ++ a :: l.drop (n + 1) else l := by
  split <;> rename_i h
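And the `drop` counterparts, concretely (again a hedged sketch assuming core `List.drop`): `head?_drop` says the head after dropping `n` is the `n`-th element, and `getLast?_drop` says the last element is unchanged when anything remains.

```lean
example : ([1, 2, 3].drop 1).head? = some 2 := rfl
example : ([1, 2, 3].drop 1).getLast? = some 3 := rfl
```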
@@ -634,10 +634,6 @@ theorem succ_lt_succ_iff : succ a < succ b ↔ a < b := ⟨lt_of_succ_lt_succ, s

theorem add_one_inj : a + 1 = b + 1 ↔ a = b := succ_inj'

theorem ne_add_one (n : Nat) : n ≠ n + 1 := fun h => by cases h

theorem add_one_ne (n : Nat) : n + 1 ≠ n := fun h => by cases h

theorem add_one_le_add_one_iff : a + 1 ≤ b + 1 ↔ a ≤ b := succ_le_succ_iff

theorem add_one_lt_add_one_iff : a + 1 < b + 1 ↔ a < b := succ_lt_succ_iff

@@ -819,9 +815,6 @@ protected theorem pred_succ (n : Nat) : pred n.succ = n := rfl
@[simp] protected theorem zero_sub_one : 0 - 1 = 0 := rfl
@[simp] protected theorem add_one_sub_one (n : Nat) : n + 1 - 1 = n := rfl

theorem sub_one_eq_self (n : Nat) : n - 1 = n ↔ n = 0 := by cases n <;> simp [ne_add_one]
theorem eq_self_sub_one (n : Nat) : n = n - 1 ↔ n = 0 := by cases n <;> simp [add_one_ne]

theorem succ_pred {a : Nat} (h : a ≠ 0) : a.pred.succ = a := by
  induction a with
  | zero => contradiction
@@ -190,9 +190,6 @@ theorem comp_map (h : β → γ) (g : α → β) (x : Option α) : x.map (h ∘

theorem mem_map_of_mem (g : α → β) (h : a ∈ x) : g a ∈ Option.map g x := h.symm ▸ map_some' ..

@[simp] theorem filter_none (p : α → Bool) : none.filter p = none := rfl
theorem filter_some : Option.filter p (some a) = if p a then some a else none := rfl

theorem bind_map_comm {α β} {x : Option (Option α)} {f : α → β} :
    x.bind (Option.map f) = (x.map (Option.map f)).bind id := by cases x <;> simp
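`filter_some` says a `some` survives filtering exactly when the predicate holds; concretely (a hedged sketch assuming core `Option.filter`):

```lean
example : (some 2).filter (· == 2) = some 2 := rfl
example : (some 3).filter (· == 2) = none := rfl
example : (none : Option Nat).filter (· == 2) = none := rfl
```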
@@ -230,7 +230,7 @@ protected def Int.repr : Int → String
  | negSucc m => "-" ++ Nat.repr (succ m)

instance : Repr Int where
  reprPrec i prec := if i < 0 then Repr.addAppParen i.repr prec else i.repr
  reprPrec i _ := i.repr

def hexDigitRepr (n : Nat) : String :=
  String.singleton <| Nat.digitChar n
@@ -267,7 +267,6 @@ syntax (name := rawNatLit) "nat_lit " num : term

@[inherit_doc] infixr:90 " ∘ " => Function.comp
@[inherit_doc] infixr:35 " × " => Prod
@[inherit_doc] infixr:35 " ×' " => PProd

@[inherit_doc] infix:50 " ∣ " => Dvd.dvd
@[inherit_doc] infixl:55 " ||| " => HOr.hOr
@@ -704,28 +703,6 @@ syntax (name := checkSimp) "#check_simp " term "~>" term : command
-/
syntax (name := checkSimpFailure) "#check_simp " term "!~>" : command

/--
`#discr_tree_key t` prints the discrimination tree keys for a term `t` (or, if it is a single identifier, the type of that constant).
It uses the default configuration for generating keys.

For example,
```
#discr_tree_key (∀ {a n : Nat}, bar a (OfNat.ofNat n))
-- bar _ (@OfNat.ofNat Nat _ _)

#discr_tree_simp_key Nat.add_assoc
-- @HAdd.hAdd Nat Nat Nat _ (@HAdd.hAdd Nat Nat Nat _ _ _) _
```

`#discr_tree_simp_key` is similar to `#discr_tree_key`, but treats the underlying type
as one of a simp lemma, i.e. transforms it into an equality and produces the key of the
left-hand side.
-/
syntax (name := discrTreeKeyCmd) "#discr_tree_key " term : command

@[inherit_doc discrTreeKeyCmd]
syntax (name := discrTreeSimpKeyCmd) "#discr_tree_simp_key" term : command

/--
The `seal foo` command ensures that the definition of `foo` is sealed, meaning it is marked as `[irreducible]`.
This command is particularly useful in contexts where you want to prevent the reduction of `foo` in proofs.
@@ -488,9 +488,9 @@ attribute [unbox] Prod

/--
Similar to `Prod`, but `α` and `β` can be propositions.
You can use `α ×' β` as notation for `PProd α β`.
We use this type internally to automatically generate the `brecOn` recursor.
-/
@[pp_using_anonymous_constructor]
structure PProd (α : Sort u) (β : Sort v) where
  /-- The first projection out of a pair. If `p : PProd α β` then `p.1 : α`. -/
  fst : α
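Since `PProd` allows its components to live in any sort, a pair can mix data with a proof; a hedged sketch (assuming the declaration above and the `×'` notation it mentions):

```lean
example : PProd Nat True := ⟨0, trivial⟩
example : Nat ×' True := ⟨0, trivial⟩   -- same pair via the ×' notation
```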
@@ -3172,8 +3172,8 @@ class MonadStateOf (σ : semiOutParam (Type u)) (m : Type u → Type v) where
export MonadStateOf (set)

/--
Like `get`, but with `σ` explicit. This is useful if a monad supports
`MonadStateOf` for multiple different types `σ`.
Like `withReader`, but with `ρ` explicit. This is useful if a monad supports
`MonadWithReaderOf` for multiple different types `ρ`.
-/
abbrev getThe (σ : Type u) {m : Type u → Type v} [MonadStateOf σ m] : m σ :=
  MonadStateOf.get
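`getThe` in action (a hedged sketch, assuming `getThe` and the exported `set` as above): in a state monad over `Nat`, naming `σ` explicitly pins down which state is read when several could be in scope.

```lean
example : StateM Nat Nat := do
  let n ← getThe Nat   -- read the Nat-typed state explicitly
  set (n + 1)          -- update it
  pure n
```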
@@ -295,8 +295,6 @@ theorem not_forall_of_exists_not {p : α → Prop} : (∃ x, ¬p x) → ¬∀ x,

@[simp] theorem exists_eq_left' : (∃ a, a' = a ∧ p a) ↔ p a' := by simp [@eq_comm _ a']

@[simp] theorem exists_eq_right' : (∃ a, p a ∧ a' = a) ↔ p a' := by simp [@eq_comm _ a']

@[simp] theorem forall_eq_or_imp : (∀ a, a = a' ∨ q a → p a) ↔ p a' ∧ ∀ a, q a → p a := by
  simp only [or_imp, forall_and, forall_eq]
@@ -102,11 +102,3 @@ instance ShareCommonT.monadShareCommon [Monad m] : MonadShareCommon (ShareCommon

@[inline] def ShareCommonT.run [Monad m] (x : ShareCommonT σ m α) : m α := x.run' default
@[inline] def ShareCommonM.run (x : ShareCommonM σ α) : α := ShareCommonT.run x

/--
A more restrictive but efficient max sharing primitive.

Remark: it optimizes the number of RC operations, and the strategy for caching results.
-/
@[extern "lean_sharecommon_quick"]
def ShareCommon.shareCommon' (a : α) : α := a
@@ -148,26 +148,22 @@ end InvImage
  wf := InvImage.wf f h.wf

-- The transitive closure of a well-founded relation is well-founded
open Relation
namespace TC
variable {α : Sort u} {r : α → α → Prop}

theorem Acc.transGen (h : Acc r a) : Acc (TransGen r) a := by
  induction h with
  | intro x _ H =>
    refine Acc.intro x fun y hy ↦ ?_
    cases hy with
    | single hyx =>
      exact H y hyx
    | tail hyz hzx =>
      exact (H _ hzx).inv hyz
theorem accessible {z : α} (ac : Acc r z) : Acc (TC r) z := by
  induction ac with
  | intro x acx ih =>
    apply Acc.intro x
    intro y rel
    induction rel with
    | base a b rab => exact ih a rab
    | trans a b c rab _ _ ih₂ => apply Acc.inv (ih₂ acx ih) rab

theorem acc_transGen_iff : Acc (TransGen r) a ↔ Acc r a :=
  ⟨Subrelation.accessible TransGen.single, Acc.transGen⟩
theorem wf (h : WellFounded r) : WellFounded (TC r) :=
  ⟨fun a => accessible (apply h a)⟩
end TC

theorem WellFounded.transGen (h : WellFounded r) : WellFounded (TransGen r) :=
  ⟨fun a ↦ (h.apply a).transGen⟩

@[deprecated Acc.transGen (since := "2024-07-16")] abbrev TC.accessible := @Acc.transGen
@[deprecated WellFounded.transGen (since := "2024-07-16")] abbrev TC.wf := @WellFounded.transGen
namespace Nat

-- less-than is well-founded
@@ -94,7 +94,7 @@ def emitCInitName (n : Name) : M Unit :=

def shouldExport (n : Name) : Bool :=
  -- HACK: exclude symbols very unlikely to be used by the interpreter or other consumers of
  -- libleanshared to avoid Windows symbol limit
  !(`Lean.Compiler.LCNF).isPrefixOf n && !(`Lean.IR).isPrefixOf n && !(`Lean.Server).isPrefixOf n
  !(`Lean.Compiler.LCNF).isPrefixOf n

def emitFnDeclAux (decl : Decl) (cppBaseName : String) (isExternal : Bool) : M Unit := do
  let ps := decl.params
@@ -5,7 +5,6 @@ Authors: Leonardo de Moura
-/
prelude
import Lean.Compiler.Options
import Lean.Compiler.ExternAttr
import Lean.Compiler.LCNF.PassManager
import Lean.Compiler.LCNF.Passes
import Lean.Compiler.LCNF.PrettyPrinter

@@ -4,7 +4,7 @@ Released under Apache 2.0 license as described in the file LICENSE.
Authors: Leonardo de Moura
-/
prelude
import Lean.PrettyPrinter.Delaborator.Options
import Lean.PrettyPrinter
import Lean.Compiler.LCNF.CompilerM
import Lean.Compiler.LCNF.Internalize
@@ -286,7 +286,6 @@ def mkInductiveValEx (name : Name) (levelParams : List Name) (type : Expr) (numP

def InductiveVal.numCtors (v : InductiveVal) : Nat := v.ctors.length
def InductiveVal.isNested (v : InductiveVal) : Bool := v.numNested > 0
def InductiveVal.numTypeFormers (v : InductiveVal) : Nat := v.all.length + v.numNested

structure ConstructorVal extends ConstantVal where
  /-- Inductive type this constructor is a member of -/
@@ -1292,7 +1292,6 @@ private partial def elabAppFnId (fIdent : Syntax) (fExplicitUnivs : List Level)
  funLVals.foldlM (init := acc) fun acc (f, fIdent, fields) => do
    let lvals' := toLVals fields (first := true)
    let s ← observing do
      checkDeprecated fIdent f
      let f ← addTermInfo fIdent f expectedType?
      let e ← elabAppLVals f (lvals' ++ lvals) namedArgs args expectedType? explicit ellipsis
      if overloaded then ensureHasType expectedType? e else return e
@@ -11,6 +11,7 @@ import Lean.Elab.Eval
import Lean.Elab.Command
import Lean.Elab.Open
import Lean.Elab.SetOption
import Lean.PrettyPrinter

namespace Lean.Elab.Command
@@ -220,31 +220,6 @@ partial def mkPairs (elems : Array Term) : MacroM Term :=
|
||||
pure acc
|
||||
loop (elems.size - 1) elems.back
|
||||
|
||||
/-- Return syntax `PProd.mk elems[0] (PProd.mk elems[1] ... (PProd.mk elems[elems.size - 2] elems[elems.size - 1])))` -/
|
||||
partial def mkPPairs (elems : Array Term) : MacroM Term :=
|
||||
let rec loop (i : Nat) (acc : Term) := do
|
||||
if i > 0 then
|
||||
let i := i - 1
|
||||
let elem := elems[i]!
|
||||
let acc ← `(PProd.mk $elem $acc)
|
||||
loop i acc
|
||||
else
|
||||
pure acc
|
||||
loop (elems.size - 1) elems.back
|
||||
|
||||
/-- Return syntax `MProd.mk elems[0] (MProd.mk elems[1] ... (MProd.mk elems[elems.size - 2] elems[elems.size - 1])))` -/
|
||||
partial def mkMPairs (elems : Array Term) : MacroM Term :=
|
||||
let rec loop (i : Nat) (acc : Term) := do
|
||||
if i > 0 then
|
||||
let i := i - 1
|
||||
let elem := elems[i]!
|
||||
let acc ← `(MProd.mk $elem $acc)
|
||||
loop i acc
|
||||
else
|
||||
pure acc
|
||||
loop (elems.size - 1) elems.back
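For reference, the right-nested shape that `mkPPairs` builds as syntax can be written out directly. This is an illustration only (not part of the patch), using three arbitrary elements:

```lean
-- Illustration only: `mkPPairs #[a, b, c]` produces syntax for a term of this
-- shape, i.e. `PProd.mk a (PProd.mk b c)`, nested to the right; `mkMPairs`
-- does the same with `MProd.mk`.
example : PProd Nat (PProd Nat Nat) := PProd.mk 1 (PProd.mk 2 3)
```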


open Parser in
partial def hasCDot : Syntax → Bool
  | Syntax.node _ k args =>

@@ -316,7 +316,9 @@ private def mkSilentAnnotationIfHole (e : Expr) : TermElabM Expr := do
        return false
      return true
    if canClear then
      withErasedFVars #[fvarId] do elabTerm body expectedType?
      let lctx := (← getLCtx).erase fvarId
      let localInsts := (← getLocalInstances).filter (·.fvar.fvarId! != fvarId)
      withLCtx lctx localInsts do elabTerm body expectedType?
    else
      elabTerm body expectedType?


@@ -672,7 +672,8 @@ partial def main (patternVarDecls : Array PatternVarDecl) (ps : Array Expr) (mat
    throwError "invalid patterns, `{mkFVar explicit}` is an explicit pattern variable, but it only occurs in positions that are inaccessible to pattern matching{indentD (MessageData.joinSep (ps.toList.map (MessageData.ofExpr ·)) m!"\n\n")}"
  let packed ← pack patternVars ps matchType
  trace[Elab.match] "packed: {packed}"
  withErasedFVars explicitPatternVars do
  let lctx := explicitPatternVars.foldl (init := (← getLCtx)) fun lctx d => lctx.erase d
  withTheReader Meta.Context (fun ctx => { ctx with lctx := lctx }) do
    check packed
    unpack packed fun patternVars patterns matchType => do
      let localDecls ← patternVars.mapM fun x => x.fvarId!.getDecl

@@ -923,7 +923,6 @@ where
    trace[Elab.definition] "{preDef.declName} : {preDef.type} :=\n{preDef.value}"
  let preDefs ← withLevelNames allUserLevelNames <| levelMVarToParamPreDecls preDefs
  let preDefs ← instantiateMVarsAtPreDecls preDefs
  let preDefs ← shareCommonPreDefs preDefs
  let preDefs ← fixLevelParams preDefs scopeLevelNames allUserLevelNames
  for preDef in preDefs do
    trace[Elab.definition] "after eraseAuxDiscr, {preDef.declName} : {preDef.type} :=\n{preDef.value}"

@@ -4,7 +4,6 @@ Released under Apache 2.0 license as described in the file LICENSE.
Authors: Leonardo de Moura
-/
prelude
import Init.ShareCommon
import Lean.Compiler.NoncomputableAttr
import Lean.Util.CollectLevelParams
import Lean.Meta.AbstractNestedProofs
@@ -54,20 +53,18 @@ private def getLevelParamsPreDecls (preDefs : Array PreDefinition) (scopeLevelNa
  | Except.ok levelParams => pure levelParams

def fixLevelParams (preDefs : Array PreDefinition) (scopeLevelNames allUserLevelNames : List Name) : TermElabM (Array PreDefinition) := do
  profileitM Exception s!"fix level params" (← getOptions) do
  withTraceNode `Elab.def.fixLevelParams (fun _ => return m!"fix level params") do
    -- We used to use `shareCommon` here, but it was a bottleneck
    let levelParams ← getLevelParamsPreDecls preDefs scopeLevelNames allUserLevelNames
    let us := levelParams.map mkLevelParam
    let fixExpr (e : Expr) : Expr :=
      e.replace fun c => match c with
        | Expr.const declName _ => if preDefs.any fun preDef => preDef.declName == declName then some $ Lean.mkConst declName us else none
        | _ => none
    return preDefs.map fun preDef =>
      { preDef with
        type := fixExpr preDef.type,
        value := fixExpr preDef.value,
        levelParams := levelParams }
  -- We used to use `shareCommon` here, but it was a bottleneck
  let levelParams ← getLevelParamsPreDecls preDefs scopeLevelNames allUserLevelNames
  let us := levelParams.map mkLevelParam
  let fixExpr (e : Expr) : Expr :=
    e.replace fun c => match c with
      | Expr.const declName _ => if preDefs.any fun preDef => preDef.declName == declName then some $ Lean.mkConst declName us else none
      | _ => none
  return preDefs.map fun preDef =>
    { preDef with
      type := fixExpr preDef.type,
      value := fixExpr preDef.value,
      levelParams := levelParams }

def applyAttributesOf (preDefs : Array PreDefinition) (applicationTime : AttributeApplicationTime) : TermElabM Unit := do
  for preDef in preDefs do
@@ -213,17 +210,4 @@ def checkCodomainsLevel (preDefs : Array PreDefinition) : MetaM Unit := do
      m!"for `{preDefs[0]!.declName}` is{indentExpr type₀} : {← inferType type₀}\n" ++
      m!"and for `{preDefs[i]!.declName}` is{indentExpr typeᵢ} : {← inferType typeᵢ}"

def shareCommonPreDefs (preDefs : Array PreDefinition) : CoreM (Array PreDefinition) := do
  profileitM Exception "share common exprs" (← getOptions) do
  withTraceNode `Elab.def.maxSharing (fun _ => return m!"share common exprs") do
    let mut es := #[]
    for preDef in preDefs do
      es := es.push preDef.type |>.push preDef.value
    es := ShareCommon.shareCommon' es
    let mut result := #[]
    for h : i in [:preDefs.size] do
      let preDef := preDefs[i]
      result := result.push { preDef with type := es[2*i]!, value := es[2*i+1]! }
    return result
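The interleaving used by `shareCommonPreDefs` can be sketched in isolation (illustration only, with plain `Nat`s standing in for types and values): after pushing the type and value of each definition alternately, entry `2*i` is the type and `2*i+1` the value of definition `i`.

```lean
-- Illustration only: push two entries per definition, then read them back
-- at indices 2*i and 2*i+1.
def interleaved (types values : Array Nat) : Array Nat := Id.run do
  let mut es := #[]
  for t in types, v in values do
    es := es.push t |>.push v
  return es

#guard (interleaved #[10, 20] #[11, 21])[2*1]! = 20
#guard (interleaved #[10, 20] #[11, 21])[2*1+1]! = 21
```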

end Lean.Elab

@@ -111,7 +111,7 @@ def checkTerminationByHints (preDefs : Array PreDefinition) : CoreM Unit := do
    preDefWith.termination.terminationBy? matches some {structural := true, ..}
  for preDef in preDefs do
    if let .some termBy := preDef.termination.terminationBy? then
      if !structural && !preDefsWithout.isEmpty then
      if !preDefsWithout.isEmpty then
        let m := MessageData.andList (preDefsWithout.toList.map (m!"{·.declName}"))
        let doOrDoes := if preDefsWithout.size = 1 then "does" else "do"
        logErrorAt termBy.ref (m!"Incomplete set of `termination_by` annotations:\n"++
@@ -135,12 +135,13 @@ def checkTerminationByHints (preDefs : Array PreDefinition) : CoreM Unit := do
/--
Elaborates the `TerminationHint` in the clique to `TerminationArguments`
-/
def elabTerminationByHints (preDefs : Array PreDefinition) : TermElabM (Array (Option TerminationArgument)) := do
  preDefs.mapM fun preDef => do
def elabTerminationByHints (preDefs : Array PreDefinition) : TermElabM (Option TerminationArguments) := do
  let tas ← preDefs.mapM fun preDef => do
    let arity ← lambdaTelescope preDef.value fun xs _ => pure xs.size
    let hints := preDef.termination
    hints.terminationBy?.mapM
      (TerminationArgument.elab preDef.declName preDef.type arity hints.extraParams ·)
  return tas.sequenceMap id -- only return something if every function has a hint
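The `sequenceMap id` step above yields `some` only when every function in the clique has a hint. A minimal sketch of that all-or-nothing pattern (illustration only; `allSome` is a hypothetical helper, not the actual `Array.sequenceMap`):

```lean
-- Illustration only: collect an array of options into an optional array,
-- failing as soon as one entry is `none`.
def allSome (xs : Array (Option α)) : Option (Array α) :=
  xs.foldl (init := some #[]) fun acc? x? => do
    return (← acc?).push (← x?)

#guard allSome #[some 1, some 2] = some #[1, 2]
#guard allSome #[some 1, (none : Option Nat)] = none
```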

def shouldUseStructural (preDefs : Array PreDefinition) : Bool :=
  preDefs.any fun preDef =>
@@ -153,70 +154,68 @@ def shouldUseWF (preDefs : Array PreDefinition) : Bool :=


def addPreDefinitions (preDefs : Array PreDefinition) : TermElabM Unit := withLCtx {} {} do
  profileitM Exception "process pre-definitions" (← getOptions) do
  withTraceNode `Elab.def.processPreDef (fun _ => return m!"process pre-definitions") do
    for preDef in preDefs do
      trace[Elab.definition.body] "{preDef.declName} : {preDef.type} :=\n{preDef.value}"
    let preDefs ← preDefs.mapM ensureNoUnassignedMVarsAtPreDef
    let preDefs ← betaReduceLetRecApps preDefs
    let cliques := partitionPreDefs preDefs
    for preDefs in cliques do
      trace[Elab.definition.scc] "{preDefs.map (·.declName)}"
      if preDefs.size == 1 && isNonRecursive preDefs[0]! then
        /-
        We must erase `recApp` annotations even when `preDef` is not recursive
        because it may use another recursive declaration in the same mutual block.
        See issue #2321
        -/
        let preDef ← eraseRecAppSyntax preDefs[0]!
        ensureEqnReservedNamesAvailable preDef.declName
        if preDef.modifiers.isNoncomputable then
          addNonRec preDef
        else
          addAndCompileNonRec preDef
        preDef.termination.ensureNone "not recursive"
      else if preDefs.any (·.modifiers.isUnsafe) then
        addAndCompileUnsafe preDefs
        preDefs.forM (·.termination.ensureNone "unsafe")
      else if preDefs.any (·.modifiers.isPartial) then
        for preDef in preDefs do
      trace[Elab.definition.body] "{preDef.declName} : {preDef.type} :=\n{preDef.value}"
    let preDefs ← preDefs.mapM ensureNoUnassignedMVarsAtPreDef
    let preDefs ← betaReduceLetRecApps preDefs
    let cliques := partitionPreDefs preDefs
    for preDefs in cliques do
      trace[Elab.definition.scc] "{preDefs.map (·.declName)}"
      if preDefs.size == 1 && isNonRecursive preDefs[0]! then
        /-
        We must erase `recApp` annotations even when `preDef` is not recursive
        because it may use another recursive declaration in the same mutual block.
        See issue #2321
        -/
        let preDef ← eraseRecAppSyntax preDefs[0]!
        ensureEqnReservedNamesAvailable preDef.declName
        if preDef.modifiers.isNoncomputable then
          addNonRec preDef
        else
          addAndCompileNonRec preDef
        preDef.termination.ensureNone "not recursive"
      else if preDefs.any (·.modifiers.isUnsafe) then
        addAndCompileUnsafe preDefs
        preDefs.forM (·.termination.ensureNone "unsafe")
      else if preDefs.any (·.modifiers.isPartial) then
        for preDef in preDefs do
          if preDef.modifiers.isPartial && !(← whnfD preDef.type).isForall then
            withRef preDef.ref <| throwError "invalid use of 'partial', '{preDef.declName}' is not a function{indentExpr preDef.type}"
        addAndCompilePartial preDefs
        preDefs.forM (·.termination.ensureNone "partial")
          if preDef.modifiers.isPartial && !(← whnfD preDef.type).isForall then
            withRef preDef.ref <| throwError "invalid use of 'partial', '{preDef.declName}' is not a function{indentExpr preDef.type}"
        addAndCompilePartial preDefs
        preDefs.forM (·.termination.ensureNone "partial")
      else
        ensureFunIndReservedNamesAvailable preDefs
        try
          checkCodomainsLevel preDefs
          checkTerminationByHints preDefs
          let termArgs ← elabTerminationByHints preDefs
          if shouldUseStructural preDefs then
            structuralRecursion preDefs termArgs
          else if shouldUseWF preDefs then
            wfRecursion preDefs termArgs
          else
        ensureFunIndReservedNamesAvailable preDefs
        try
          checkCodomainsLevel preDefs
          checkTerminationByHints preDefs
          let termArg?s ← elabTerminationByHints preDefs
          if shouldUseStructural preDefs then
            structuralRecursion preDefs termArg?s
          else if shouldUseWF preDefs then
            wfRecursion preDefs termArg?s
          else
            withRef (preDefs[0]!.ref) <| mapError
              (orelseMergeErrors
                (structuralRecursion preDefs termArg?s)
                (wfRecursion preDefs termArg?s))
              (fun msg =>
                let preDefMsgs := preDefs.toList.map (MessageData.ofExpr $ mkConst ·.declName)
                m!"fail to show termination for{indentD (MessageData.joinSep preDefMsgs Format.line)}\nwith errors\n{msg}")
        catch ex =>
          logException ex
          let s ← saveState
            withRef (preDefs[0]!.ref) <| mapError
              (orelseMergeErrors
                (structuralRecursion preDefs termArgs)
                (wfRecursion preDefs termArgs))
              (fun msg =>
                let preDefMsgs := preDefs.toList.map (MessageData.ofExpr $ mkConst ·.declName)
                m!"fail to show termination for{indentD (MessageData.joinSep preDefMsgs Format.line)}\nwith errors\n{msg}")
        catch ex =>
          logException ex
          let s ← saveState
          try
            if preDefs.all fun preDef => preDef.kind == DefKind.def || preDefs.all fun preDef => preDef.kind == DefKind.abbrev then
              -- try to add as partial definition
              try
            if preDefs.all fun preDef => preDef.kind == DefKind.def || preDefs.all fun preDef => preDef.kind == DefKind.abbrev then
              -- try to add as partial definition
              try
                addAndCompilePartial preDefs (useSorry := true)
              catch _ =>
                -- Compilation failed try again just as axiom
                s.restore
                addAsAxioms preDefs
            else if preDefs.all fun preDef => preDef.kind == DefKind.theorem then
              addAsAxioms preDefs
          catch _ => s.restore
                addAndCompilePartial preDefs (useSorry := true)
              catch _ =>
                -- Compilation failed try again just as axiom
                s.restore
                addAsAxioms preDefs
            else if preDefs.all fun preDef => preDef.kind == DefKind.theorem then
              addAsAxioms preDefs
          catch _ => s.restore

builtin_initialize
  registerTraceClass `Elab.definition.body

@@ -10,7 +10,6 @@ import Lean.Elab.RecAppSyntax
import Lean.Elab.PreDefinition.Basic
import Lean.Elab.PreDefinition.Structural.Basic
import Lean.Elab.PreDefinition.Structural.FunPacker
import Lean.Elab.PreDefinition.Structural.RecArgInfo

namespace Lean.Elab.Structural
open Meta
@@ -18,63 +17,51 @@ open Meta
private def throwToBelowFailed : MetaM α :=
  throwError "toBelow failed"

partial def searchPProd (e : Expr) (F : Expr) (k : Expr → Expr → MetaM α) : MetaM α := do
  match (← whnf e) with
  | .app (.app (.const `PProd _) d1) d2 =>
    (do searchPProd d1 (← mkAppM ``PProd.fst #[F]) k)
    <|> (do searchPProd d2 (← mkAppM `PProd.snd #[F]) k)
  | .app (.app (.const `And _) d1) d2 =>
    (do searchPProd d1 (← mkAppM `And.left #[F]) k)
    <|> (do searchPProd d2 (← mkAppM `And.right #[F]) k)
  | .const `PUnit _
  | .const `True _ => throwToBelowFailed
  | _ => k e F
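`searchPProd` descends the `PProd`/`And` packing by wrapping `F` in `fst`/`snd` (resp. `left`/`right`) projections as it recurses into each side. The kind of projection chain it ends up building can be illustrated directly (illustration only, not part of the patch):

```lean
-- Illustration only: reaching the middle component of a right-nested packing
-- is a chain of `.snd`/`.fst` projections, which is exactly what
-- `searchPProd` accumulates on `F` while searching.
example (p : PProd Nat (PProd Bool Nat)) : Bool := p.snd.fst
```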

/-- See `toBelow` -/
private partial def toBelowAux (C : Expr) (belowDict : Expr) (arg : Expr) (F : Expr) : MetaM Expr := do
  trace[Elab.definition.structural] "belowDict start:{indentExpr belowDict}\narg:{indentExpr arg}"
  -- First search through the PProd packing of the different `brecOn` motives
  searchPProd belowDict F fun belowDict F => do
    trace[Elab.definition.structural] "belowDict step 1:{indentExpr belowDict}"
    -- Then instantiate parameters of a reflexive type, if needed
    forallTelescopeReducing belowDict fun xs belowDict => do
      let arg ← zetaReduce arg
      let argArgs := arg.getAppArgs
      unless argArgs.size >= xs.size do throwToBelowFailed
      let n := argArgs.size
      let argTailArgs := argArgs.extract (n - xs.size) n
      let belowDict := belowDict.replaceFVars xs argTailArgs
      -- And again search through the PProd packing due to multiple functions recursing on the
      -- same inductive data type
      -- (We could use the funIdx and the `positions` array to replace this search with more
      -- targeted indexing.)
      searchPProd belowDict (mkAppN F argTailArgs) fun belowDict F => do
        trace[Elab.definition.structural] "belowDict step 2:{indentExpr belowDict}"
        match belowDict with
        | .app belowDictFun belowDictArg =>
          unless belowDictFun.getAppFn == C do throwToBelowFailed
          unless ← isDefEq belowDictArg arg do throwToBelowFailed
          pure F
        | _ =>
          trace[Elab.definition.structural] "belowDict not an app:{indentExpr belowDict}"
          throwToBelowFailed
  let belowDict ← whnf belowDict
  trace[Elab.definition.structural] "belowDict: {belowDict}, arg: {arg}"
  match belowDict with
  | .app (.app (.const `PProd _) d1) d2 =>
    (do toBelowAux C d1 arg (← mkAppM `PProd.fst #[F]))
    <|>
    (do toBelowAux C d2 arg (← mkAppM `PProd.snd #[F]))
  | .app (.app (.const `And _) d1) d2 =>
    (do toBelowAux C d1 arg (← mkAppM `And.left #[F]))
    <|>
    (do toBelowAux C d2 arg (← mkAppM `And.right #[F]))
  | _ => forallTelescopeReducing belowDict fun xs belowDict => do
    let arg ← zetaReduce arg
    let argArgs := arg.getAppArgs
    unless argArgs.size >= xs.size do throwToBelowFailed
    let n := argArgs.size
    let argTailArgs := argArgs.extract (n - xs.size) n
    let belowDict := belowDict.replaceFVars xs argTailArgs
    match belowDict with
    | .app belowDictFun belowDictArg =>
      unless belowDictFun.getAppFn == C do throwToBelowFailed
      unless ← isDefEq belowDictArg arg do throwToBelowFailed
      pure (mkAppN F argTailArgs)
    | _ =>
      trace[Elab.definition.structural] "belowDict not an app: {belowDict}"
      throwToBelowFailed

/-- See `toBelow` -/
private def withBelowDict [Inhabited α] (below : Expr) (numIndParams : Nat)
    (positions : Positions) (k : Array Expr → Expr → MetaM α) : MetaM α := do
  let numTypeFormers := positions.size
  let numIndAll := positions.size
  let belowType ← inferType below
  trace[Elab.definition.structural] "belowType: {belowType}"
  unless (← isTypeCorrect below) do
    trace[Elab.definition.structural] "not type correct!"
  belowType.withApp fun f args => do
    unless numIndParams + numTypeFormers < args.size do
    unless numIndParams + numIndAll < args.size do
      trace[Elab.definition.structural] "unexpected 'below' type{indentExpr belowType}"
      throwToBelowFailed
    let params := args[:numIndParams]
    let finalArgs := args[numIndParams+numTypeFormers:]
    let finalArgs := args[numIndParams+numIndAll:]
    let pre := mkAppN f params
    let motiveTypes ← inferArgumentTypesN numTypeFormers pre
    let motiveTypes ← inferArgumentTypesN numIndAll pre
    let numMotives : Nat := positions.numIndices
    trace[Elab.definition.structural] "numMotives: {numMotives}"
    let mut CTypes := Array.mkArray numMotives (.sort 37) -- dummy value

@@ -146,16 +133,26 @@ private partial def replaceRecApps (recArgInfos : Array RecArgInfo) (positions :
  e.withApp fun f args => do
    if let .some fnIdx := recArgInfos.findIdx? (f.isConstOf ·.fnName) then
      let recArgInfo := recArgInfos[fnIdx]!
      let some recArg := args[recArgInfo.recArgPos]?
        | throwError "insufficient number of parameters at recursive application {indentExpr e}"
      let numFixed := recArgInfo.numFixed
      let recArgPos := recArgInfo.recArgPos
      if recArgPos >= args.size then
        throwError "insufficient number of parameters at recursive application {indentExpr e}"
      let recArg := args[recArgPos]!
      -- For reflexive type, we may have nested recursive applications in recArg
      let recArg ← loop below recArg
      let f ←
        try toBelow below recArgInfo.indGroupInst.params.size positions fnIdx recArg
        try toBelow below recArgInfo.indParams.size positions fnIdx recArg
        catch _ => throwError "failed to eliminate recursive application{indentExpr e}"
      -- We don't pass the fixed parameters, the indices and the major arg to `f`, only the rest
      let (_, fArgs) := recArgInfo.pickIndicesMajor args[recArgInfo.numFixed:]
      let fArgs ← fArgs.mapM (replaceRecApps recArgInfos positions below ·)
      -- Recall that the fixed parameters are not in the scope of the `brecOn`. So, we skip them.
      let argsNonFixed := args.extract numFixed args.size
      -- The function `f` does not explicitly take `recArg` and its indices as arguments. So, we skip them too.
      let mut fArgs := #[]
      for i in [:argsNonFixed.size] do
        let j := i + numFixed
        if recArgInfo.recArgPos != j && !recArgInfo.indicesPos.contains j then
          let arg := argsNonFixed[i]!
          let arg ← replaceRecApps recArgInfos positions below arg
          fArgs := fArgs.push arg
      return mkAppN f fArgs
    else
      return mkAppN (← loop below f) (← args.mapM (loop below))
@@ -228,39 +225,35 @@ def mkBRecOnF (recArgInfos : Array RecArgInfo) (positions : Positions)
  let valueNew ← replaceRecApps recArgInfos positions below value
  mkLambdaFVars (indexMajorArgs ++ #[below] ++ otherArgs) valueNew


/--
Given the `motives`, figures out whether to use `.brecOn` or `.binductionOn`, pass
the right universe levels, the parameters, and the motives.
It was already checked earlier in `checkCodomainsLevel` that the functions live in the same universe.
-/
def mkBRecOnConst (recArgInfos : Array RecArgInfo) (positions : Positions)
    (motives : Array Expr) : MetaM (Nat → Expr) := do
  let indGroup := recArgInfos[0]!.indGroupInst
    (motives : Array Expr) : MetaM (Name → Expr) := do
  -- For now, just look at the first
  let recArgInfo := recArgInfos[0]!
  let motive := motives[0]!
  let brecOnUniv ← lambdaTelescope motive fun _ type => getLevel type
  let indInfo ← getConstInfoInduct indGroup.all[0]!
  let indInfo ← getConstInfoInduct recArgInfo.indName
  let useBInductionOn := indInfo.isReflexive && brecOnUniv == levelZero
  let brecOnUniv ←
    if indInfo.isReflexive && brecOnUniv != levelZero then
      decLevel brecOnUniv
    else
      pure brecOnUniv
  let brecOnCons := fun idx =>
  let brecOnCons := fun n =>
    let brecOn :=
      if let .some n := indGroup.all[idx]? then
        if useBInductionOn then .const (mkBInductionOnName n) indGroup.levels
        else .const (mkBRecOnName n) (brecOnUniv :: indGroup.levels)
      else
        let n := indGroup.all[0]!
        let j := idx - indGroup.all.size + 1
        if useBInductionOn then .const (mkBInductionOnName n |>.appendIndexAfter j) indGroup.levels
        else .const (mkBRecOnName n |>.appendIndexAfter j) (brecOnUniv :: indGroup.levels)
    mkAppN brecOn indGroup.params
      if useBInductionOn then .const (mkBInductionOnName n) recArgInfo.indLevels
      else .const (mkBRecOnName n) (brecOnUniv :: recArgInfo.indLevels)
    mkAppN brecOn recArgInfo.indParams

  -- Pick one as a prototype
  let brecOnAux := brecOnCons 0
  let brecOnAux := brecOnCons recArgInfo.indName
  -- Infer the type of the packed motive arguments
  let packedMotiveTypes ← inferArgumentTypesN indGroup.numMotives brecOnAux
  let packedMotiveTypes ← inferArgumentTypesN recArgInfo.indAll.size brecOnAux
  let packedMotives ← positions.mapMwith packMotives packedMotiveTypes motives

  return fun n => mkAppN (brecOnCons n) packedMotives
@@ -272,18 +265,17 @@ combinators. This assumes that all `.brecOn` functions of a mutual inductive hav
It also undoes the permutation and packing done by `packMotives`
-/
def inferBRecOnFTypes (recArgInfos : Array RecArgInfo) (positions : Positions)
    (brecOnConst : Nat → Expr) : MetaM (Array Expr) := do
  let numTypeFormers := positions.size
    (brecOnConst : Name → Expr) : MetaM (Array Expr) := do
  let recArgInfo := recArgInfos[0]! -- pick an arbitrary one
  let brecOn := brecOnConst 0
  let brecOn := brecOnConst recArgInfo.indName
  check brecOn
  let brecOnType ← inferType brecOn
  -- Skip the indices and major argument
  let packedFTypes ← forallBoundedTelescope brecOnType (some (recArgInfo.indicesPos.size + 1)) fun _ brecOnType =>
    -- And return the types of the next arguments
    arrowDomainsN numTypeFormers brecOnType
    arrowDomainsN recArgInfo.indAll.size brecOnType

  let mut FTypes := Array.mkArray positions.numIndices (Expr.sort 0)
  let mut FTypes := Array.mkArray recArgInfos.size (Expr.sort 0)
  for packedFType in packedFTypes, poss in positions do
    for pos in poss do
      FTypes := FTypes.set! pos packedFType
@@ -293,11 +285,11 @@ def inferBRecOnFTypes (recArgInfos : Array RecArgInfo) (positions : Positions)
Completes the `.brecOn` for the given function.
The `value` is the function with (only) the fixed parameters moved into the context.
-/
def mkBrecOnApp (positions : Positions) (fnIdx : Nat) (brecOnConst : Nat → Expr)
def mkBrecOnApp (positions : Positions) (fnIdx : Nat) (brecOnConst : Name → Expr)
    (FArgs : Array Expr) (recArgInfo : RecArgInfo) (value : Expr) : MetaM Expr := do
  lambdaTelescope value fun ys _value => do
    let (indexMajorArgs, otherArgs) := recArgInfo.pickIndicesMajor ys
    let brecOn := brecOnConst recArgInfo.indIdx
    let brecOn := brecOnConst recArgInfo.indName
    let brecOn := mkAppN brecOn indexMajorArgs
    let packedFTypes ← inferArgumentTypesN positions.size brecOn
    let packedFArgs ← positions.mapMwith packFArgs packedFTypes FArgs

@@ -9,6 +9,46 @@ import Lean.Meta.ForEachExpr

namespace Lean.Elab.Structural

/--
Information about the argument of interest of a structurally recursive function.

The `Expr`s in this data structure expect the `fixedParams` to be in scope, but not the other
parameters of the function. This ensures that this data structure makes sense in the other functions
of a mutually recursive group.
-/
structure RecArgInfo where
  /-- the name of the recursive function -/
  fnName : Name
  /-- the fixed prefix of arguments of the function we are trying to justify termination using structural recursion. -/
  numFixed : Nat
  /-- position of the argument (counted including fixed prefix) we are recursing on -/
  recArgPos : Nat
  /-- positions of the indices (counted including fixed prefix) of the inductive datatype we are recursing on -/
  indicesPos : Array Nat
  /-- inductive datatype name of the argument we are recursing on -/
  indName : Name
  /-- inductive datatype universe levels of the argument we are recursing on -/
  indLevels : List Level
  /-- inductive datatype parameters of the argument we are recursing on -/
  indParams : Array Expr
  /-- The types mutually inductive with `indName` -/
  indAll : Array Name
  deriving Inhabited
/--
If `xs` are the parameters of the functions (excluding fixed prefix), partitions them
into indices and major arguments, and other parameters.
-/
def RecArgInfo.pickIndicesMajor (info : RecArgInfo) (xs : Array Expr) : (Array Expr × Array Expr) := Id.run do
  let mut indexMajorArgs := #[]
  let mut otherArgs := #[]
  for h : i in [:xs.size] do
    let j := i + info.numFixed
    if j = info.recArgPos || info.indicesPos.contains j then
      indexMajorArgs := indexMajorArgs.push xs[i]
    else
      otherArgs := otherArgs.push xs[i]
  return (indexMajorArgs, otherArgs)
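The partitioning pattern in `pickIndicesMajor` can be sketched on plain `Nat`s (illustration only; `partitionByPos` is a hypothetical helper): positions listed in `special` — the recursion argument and its indices in the real code — go left, everything else goes right.

```lean
-- Illustration only: split an array by whether each position is "special",
-- preserving the original order within each half.
def partitionByPos (xs : Array Nat) (special : Array Nat) : Array Nat × Array Nat := Id.run do
  let mut picked := #[]
  let mut others := #[]
  for h : i in [:xs.size] do
    if special.contains i then
      picked := picked.push xs[i]
    else
      others := others.push xs[i]
  return (picked, others)

#guard partitionByPos #[10, 11, 12, 13] #[1, 2] = (#[11, 12], #[10, 13])
```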

structure State where
  /-- As part of the inductive predicates case, we keep adding more and more discriminants from the
  local context and build up a bigger matcher application until we reach a fixed point.
@@ -51,11 +91,10 @@ and for each such type, keep track of the order of the functions.

We represent these positions as an `Array (Array Nat)`. We have that

* `positions.size = indInfo.numTypeFormers`
* `positions.size = indInfo.all.length`
* `positions.flatten` is a permutation of `[0:n]`, so each of the `n` functions has exactly one
  position, and each position refers to one of the `n` functions.
* if `k ∈ positions[i]` then the recursive argument of function `k` has type `indInfo.all[i]`
  (or a corresponding nested inductive type)

-/
abbrev Positions := Array (Array Nat)
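As a concrete (hypothetical) instance of the invariants above:

```lean
-- Hypothetical example: three functions over a group with two type formers.
-- Functions 0 and 2 recurse on the first type former, function 1 on the second.
def positionsExample : Array (Array Nat) := #[#[0, 2], #[1]]

#guard positionsExample.flatten = #[0, 2, 1]  -- a permutation of `[0:3]`
```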

@@ -88,6 +127,3 @@ def Positions.mapMwith {α β m} [Monad m] [Inhabited β] (f : α → Array β
  (Array.zip ys positions).mapM fun ⟨y, poss⟩ => f y (poss.map (xs[·]!))

end Lean.Elab.Structural

builtin_initialize
  Lean.registerTraceClass `Elab.definition.structural

@@ -4,31 +4,11 @@ Released under Apache 2.0 license as described in the file LICENSE.
Authors: Leonardo de Moura, Joachim Breitner
-/
prelude
import Lean.Elab.PreDefinition.TerminationArgument
import Lean.Elab.PreDefinition.Structural.Basic
import Lean.Elab.PreDefinition.Structural.RecArgInfo

namespace Lean.Elab.Structural
open Meta

def prettyParam (xs : Array Expr) (i : Nat) : MetaM MessageData := do
  let x := xs[i]!
  let n ← x.fvarId!.getUserName
  addMessageContextFull <| if n.hasMacroScopes then m!"#{i+1}" else m!"{x}"

def prettyRecArg (xs : Array Expr) (value : Expr) (recArgInfo : RecArgInfo) : MetaM MessageData := do
  lambdaTelescope value fun ys _ => prettyParam (xs ++ ys) recArgInfo.recArgPos

def prettyParameterSet (fnNames : Array Name) (xs : Array Expr) (values : Array Expr)
    (recArgInfos : Array RecArgInfo) : MetaM MessageData := do
  if fnNames.size = 1 then
    return m!"parameter " ++ (← prettyRecArg xs values[0]! recArgInfos[0]!)
  else
    let mut l := #[]
    for fnName in fnNames, value in values, recArgInfo in recArgInfos do
      l := l.push m!"{(← prettyRecArg xs value recArgInfo)} of {fnName}"
    return m!"parameters " ++ .andList l.toList

private def getIndexMinPos (xs : Array Expr) (indices : Array Expr) : Nat := Id.run do
  let mut minPos := xs.size
  for index in indices do
@@ -92,190 +72,60 @@ def getRecArgInfo (fnName : Name) (numFixed : Nat) (xs : Array Expr) (i : Nat) :
|
||||
| some (indParam, y) =>
|
||||
throwError "its type is an inductive datatype{indentExpr xType}\nand the datatype parameter{indentExpr indParam}\ndepends on the function parameter{indentExpr y}\nwhich does not come before the varying parameters and before the indices of the recursion parameter."
|
||||
| none =>
|
||||
let indAll := indInfo.all.toArray
|
||||
let .some indIdx := indAll.indexOf? indInfo.name | panic! "{indInfo.name} not in {indInfo.all}"
|
||||
let indicesPos := indIndices.map fun index => match xs.indexOf? index with | some i => i.val | none => unreachable!
|
||||
let indGroupInst := {
|
||||
IndGroupInfo.ofInductiveVal indInfo with
|
||||
levels := us
|
||||
params := indParams }
|
||||
return { fnName := fnName
|
||||
numFixed := numFixed
|
||||
recArgPos := i
|
||||
indicesPos := indicesPos
|
||||
indGroupInst := indGroupInst
|
||||
indIdx := indIdx }
|
||||
return { fnName := fnName
|
||||
numFixed := numFixed
|
||||
recArgPos := i
|
||||
indicesPos := indicesPos
|
||||
indName := indInfo.name
|
||||
indLevels := us
|
||||
indParams := indParams
|
||||
indAll := indInfo.all.toArray }
|
||||
else
|
||||
throwError "the index #{i+1} exceeds {xs.size}, the number of parameters"
|
||||
|
||||
/--
|
||||
Collects the `RecArgInfos` for one function, and returns a report for why the others were not
|
||||
considered.
|
||||
Runs `k` on all argument indices, until it succeeds.
|
||||
We use this argument to justify termination using the auxiliary `brecOn` construction.
|
||||
|
||||
The `xs` are the fixed parameters, `value` the body with the fixed prefix instantiated.
|
||||
We give preference for arguments that are *not* indices of inductive types of other arguments.
|
||||
See issue #837 for an example where we can show termination using the index of an inductive family, but
|
||||
we don't get the desired definitional equalities.
|
||||
|
||||
Takes the optional user annotations into account (`termArg?`). If this is given and the argument
|
||||
is unsuitable, throw an error.
|
||||
`value` is the function value (including fixed parameters)
|
||||
-/
|
||||
def getRecArgInfos (fnName : Name) (xs : Array Expr) (value : Expr)
    (termArg? : Option TerminationArgument) : MetaM (Array RecArgInfo × MessageData) := do
  lambdaTelescope value fun ys _ => do
    if let .some termArg := termArg? then
      -- User explicitly asked to use a certain argument, so throw errors eagerly
      let recArgInfo ← withRef termArg.ref do
        mapError (f := (m!"cannot use specified parameter for structural recursion:{indentD ·}")) do
          getRecArgInfo fnName xs.size (xs ++ ys) (← termArg.structuralArg)
      return (#[recArgInfo], m!"")
    else
      let mut recArgInfos := #[]
      let mut report : MessageData := m!""
      -- No `termination_by`, so try all, and remember the errors
      for idx in [:xs.size + ys.size] do
        try
          let recArgInfo ← getRecArgInfo fnName xs.size (xs ++ ys) idx
          recArgInfos := recArgInfos.push recArgInfo
        catch e =>
          report := report ++ (m!"Not considering parameter {← prettyParam (xs ++ ys) idx} of {fnName}:" ++
            indentD e.toMessageData) ++ "\n"
      trace[Elab.definition.structural] "getRecArgInfos report: {report}"
      return (recArgInfos, report)
partial def tryAllArgs (value : Expr) (k : Nat → M α) : M α := do
  -- It's important to keep the call to `k` outside the scope of `lambdaTelescope`:
  -- The tactics in the IndPred construction search the full local context, so we must not have
  -- extra FVars there
  let (indices, nonIndices) ← lambdaTelescope value fun xs _ => do
    let indicesRef : IO.Ref (Array Nat) ← IO.mkRef {}
    for x in xs do
      let xType ← inferType x
      /- Traverse all sub-expressions in the type of `x` -/
      forEachExpr xType fun e =>
        /- If `e` is an inductive family, we store in `indicesRef` all variables in `xs` that occur in "index positions". -/
        matchConstInduct e.getAppFn (fun _ => pure ()) fun info _ => do
          if info.numIndices > 0 && info.numParams + info.numIndices == e.getAppNumArgs then
            for arg in e.getAppArgs[info.numParams:] do
              forEachExpr arg fun e => do
                if let .some idx := xs.getIdx? e then
                  indicesRef.modify (·.push idx)
    let indices ← indicesRef.get
    let nonIndices := (Array.range xs.size).filter (fun i => !(indices.contains i))
    return (indices, nonIndices)

/--
Reorders the `RecArgInfos` of one function to put arguments that are indices of other arguments
last.
See issue #837 for an example where we can show termination using the index of an inductive family, but
we don't get the desired definitional equalities.
-/
def nonIndicesFirst (recArgInfos : Array RecArgInfo) : Array RecArgInfo := Id.run do
  let mut indicesPos : HashSet Nat := {}
  for recArgInfo in recArgInfos do
    for pos in recArgInfo.indicesPos do
      indicesPos := indicesPos.insert pos
  let (indices, nonIndices) := recArgInfos.partition (indicesPos.contains ·.recArgPos)
  return nonIndices ++ indices

private def dedup [Monad m] (eq : α → α → m Bool) (xs : Array α) : m (Array α) := do
  let mut ret := #[]
  for x in xs do
    unless (← ret.anyM (eq · x)) do
      ret := ret.push x
  return ret

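The `dedup` helper above keeps only the first occurrence of each element under a monadic equality test. A minimal pure sketch of the same behaviour (illustration only; `dedupId` is a hypothetical name, not part of this patch), specialized to `Id` with a `Bool` predicate:

```lean
-- Illustration only: the same first-occurrence-wins deduplication as `dedup`,
-- but with a pure equality test instead of a monadic one.
def dedupId (eq : α → α → Bool) (xs : Array α) : Array α := Id.run do
  let mut ret := #[]
  for x in xs do
    unless ret.any (eq · x) do
      ret := ret.push x
  return ret

#eval dedupId (· == ·) #[1, 2, 1, 3, 2]  -- #[1, 2, 3]
```

The monadic version is needed so that the equality test can be `IndGroupInst.isDefEq`, which runs in `MetaM`.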
/--
Given the `RecArgInfos` of all the recursive functions, find the inductive groups to consider.
-/
def inductiveGroups (recArgInfos : Array RecArgInfo) : MetaM (Array IndGroupInst) :=
  dedup IndGroupInst.isDefEq (recArgInfos.map (·.indGroupInst))

/--
Filters the `recArgInfos` by those that describe an argument that's part of the recursive inductive
group `group`.

Because of nested inductives this function has the ability to change the `recArgInfo`.
Consider
```
inductive Tree where | node : List Tree → Tree
```
then when we look for arguments whose type is part of the group `Tree`, we want to also consider
the argument of type `List Tree`, even though that argument's `RecArgInfo` refers initially to
`List`.
-/
def argsInGroup (group : IndGroupInst) (xs : Array Expr) (value : Expr)
    (recArgInfos : Array RecArgInfo) : MetaM (Array RecArgInfo) := do

  let nestedTypeFormers ← group.nestedTypeFormers

  recArgInfos.filterMapM fun recArgInfo => do
    -- Is this argument from the same mutual group of inductives?
    if (← group.isDefEq recArgInfo.indGroupInst) then
      return (.some recArgInfo)

    -- Can this argument be understood as the auxiliary type former of a nested inductive?
    if nestedTypeFormers.isEmpty then return .none
    lambdaTelescope value fun ys _ => do
      let x := (xs++ys)[recArgInfo.recArgPos]!
      for nestedTypeFormer in nestedTypeFormers, indIdx in [group.all.size : group.numMotives] do
        let xType ← whnfD (← inferType x)
        let (indIndices, _, type) ← forallMetaTelescope nestedTypeFormer
        if (← isDefEqGuarded type xType) then
          let indIndices ← indIndices.mapM instantiateMVars
          if !indIndices.all Expr.isFVar then
            -- throwError "indices are not variables{indentExpr xType}"
            continue
          if !indIndices.allDiff then
            -- throwError "indices are not pairwise distinct{indentExpr xType}"
            continue
          -- TODO: Do we have to worry about the indices ending up in the fixed prefix here?
          if let some (_index, _y) ← hasBadIndexDep? ys indIndices then
            -- throwError "its type {indInfo.name} is an inductive family{indentExpr xType}\nand index{indentExpr index}\ndepends on the non index{indentExpr y}"
            continue
          let indicesPos := indIndices.map fun index => match (xs++ys).indexOf? index with | some i => i.val | none => unreachable!
          return .some
            { fnName := recArgInfo.fnName
              numFixed := recArgInfo.numFixed
              recArgPos := recArgInfo.recArgPos
              indicesPos := indicesPos
              indGroupInst := group
              indIdx := indIdx }
      return .none

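The `Tree` situation from the docstring above can be made concrete. A hedged sketch (illustration only; these declarations are not part of this patch) of recursion that descends through the nested `List Tree` occurrence, which is exactly when `argsInGroup` must match an argument against the group's auxiliary type former:

```lean
-- Illustration only: a nested inductive and mutually recursive functions
-- where one recursive argument has type `List Tree`, i.e. it belongs to
-- `Tree`'s inductive group only via the auxiliary type former.
inductive Tree where
  | node : List Tree → Tree

mutual
  def Tree.size : Tree → Nat
    | .node ts => 1 + listSize ts

  def listSize : List Tree → Nat
    | [] => 0
    | t :: ts => Tree.size t + listSize ts
end
```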
def maxCombinationSize : Nat := 10

def allCombinations (xss : Array (Array α)) : Option (Array (Array α)) :=
  if xss.foldl (· * ·.size) 1 > maxCombinationSize then
    none
  else
    let rec go i acc : Array (Array α) :=
      if h : i < xss.size then
        xss[i].concatMap fun x => go (i + 1) (acc.push x)
      else
        #[acc]
    some (go 0 #[])

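`allCombinations` enumerates one candidate per function, i.e. the Cartesian product of the per-function candidate arrays, and gives up once the product size exceeds `maxCombinationSize`. A pure sketch of the product it builds (illustration only; `product` is a hypothetical name, not part of this patch):

```lean
-- Illustration only: the Cartesian product underlying `allCombinations`,
-- without the size cut-off.
def product (xss : Array (Array α)) : Array (Array α) :=
  xss.foldl (init := #[#[]]) fun acc xs =>
    acc.concatMap fun comb => xs.map (comb.push ·)

#eval product #[#[0, 1], #[2, 3]]
-- #[#[0, 2], #[0, 3], #[1, 2], #[1, 3]]
```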
def tryAllArgs (fnNames : Array Name) (xs : Array Expr) (values : Array Expr)
    (termArg?s : Array (Option TerminationArgument)) (k : Array RecArgInfo → M α) : M α := do
  let mut report := m!""
  -- Gather information on all possible recursive arguments
  let mut recArgInfoss := #[]
  for fnName in fnNames, value in values, termArg? in termArg?s do
    let (recArgInfos, thisReport) ← getRecArgInfos fnName xs value termArg?
    report := report ++ thisReport
    recArgInfoss := recArgInfoss.push recArgInfos
  -- Put non-indices first
  recArgInfoss := recArgInfoss.map nonIndicesFirst
  trace[Elab.definition.structural] "recArgInfoss: {recArgInfoss.map (·.map (·.recArgPos))}"
  -- Inductive groups to consider
  let groups ← inductiveGroups recArgInfoss.flatten
  trace[Elab.definition.structural] "inductive groups: {groups}"
  if groups.isEmpty then
    report := report ++ "no parameters suitable for structural recursion"
  -- Consider each group
  for group in groups do
    -- Select those RecArgInfos that are compatible with this inductive group
    let mut recArgInfoss' := #[]
    for value in values, recArgInfos in recArgInfoss do
      recArgInfoss' := recArgInfoss'.push (← argsInGroup group xs value recArgInfos)
    if let some idx := recArgInfoss'.findIdx? (·.isEmpty) then
      report := report ++ m!"Skipping arguments of type {group}, as {fnNames[idx]!} has no compatible argument.\n"
      continue
    if let some combs := allCombinations recArgInfoss' then
      for comb in combs do
        try
          -- TODO: Here we used to save and restore the state. But should the `try`-`catch`
          -- not suffice?
          let r ← k comb
          trace[Elab.definition.structural] "tryAllArgs report:\n{report}"
          return r
        catch e =>
          let m ← prettyParameterSet fnNames xs values comb
          report := report ++ m!"Cannot use {m}:{indentD e.toMessageData}\n"
    else
      report := report ++ m!"Too many possible combinations of parameters of type {group} (or " ++
        m!"please indicate the recursive argument explicitly using `termination_by structural`).\n"
  report := m!"failed to infer structural recursion:\n" ++ report
  trace[Elab.definition.structural] "tryAllArgs:\n{report}"
  throwError report
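When a definition carries a `termination_by structural` clause, `getRecArgInfos` above considers only the annotated argument, so the combination search in `tryAllArgs` degenerates to a single candidate per function. A sketch of the user-facing side (illustration only; this definition is not part of this patch):

```lean
-- Illustration only: pinning the recursive argument so no search is needed.
def countDown (fuel : Nat) (acc : List Nat) : List Nat :=
  match fuel with
  | 0 => acc
  | n + 1 => countDown n (n :: acc)
termination_by structural fuel
```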
  let mut errors : Array MessageData := Array.mkArray (indices.size + nonIndices.size) m!""
  let saveState ← get -- backtrack the state for each argument
  for i in id (nonIndices ++ indices) do
    trace[Elab.definition.structural] "findRecArg i: {i}"
    try
      set saveState
      return (← k i)
    catch e => errors := errors.set! i e.toMessageData
  throwError
    errors.foldl
      (init := m!"structural recursion cannot be used:")
      (f := (· ++ Format.line ++ Format.line ++ ·))

end Lean.Elab.Structural

@@ -1,93 +0,0 @@
/-
Copyright (c) 2024 Lean FRO. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Joachim Breitner
-/
prelude
import Lean.Meta.InferType

/-!
This module contains the types
* `IndGroupInfo`, a variant of `InductiveVal` with information that
  applies to a whole group of mutual inductives and
* `IndGroupInst` which extends `IndGroupInfo` with levels and parameters
  to indicate an instantiation of the group.

One purpose of this abstraction is to make it clear when a function operates on a group as
a whole, rather than a specific inductive within the group.
-/

namespace Lean.Elab.Structural
open Lean Meta

/--
A mutually inductive group, identified by the `all` array of the `InductiveVal` of its
constituents.
-/
structure IndGroupInfo where
  all : Array Name
  numNested : Nat
  deriving BEq, Inhabited

def IndGroupInfo.ofInductiveVal (indInfo : InductiveVal) : IndGroupInfo where
  all := indInfo.all.toArray
  numNested := indInfo.numNested

def IndGroupInfo.numMotives (group : IndGroupInfo) : Nat :=
  group.all.size + group.numNested

/--
An instance of a mutually inductive group of inductives, identified by the `all` array
and the level and expression parameters.

For example this distinguishes between `List α` and `List β` so that we will not even attempt
mutual structural recursion on such incompatible types.
-/
structure IndGroupInst extends IndGroupInfo where
  levels : List Level
  params : Array Expr
  deriving Inhabited

def IndGroupInst.toMessageData (igi : IndGroupInst) : MessageData :=
  mkAppN (.const igi.all[0]! igi.levels) igi.params

instance : ToMessageData IndGroupInst where
  toMessageData := IndGroupInst.toMessageData

def IndGroupInst.isDefEq (igi1 igi2 : IndGroupInst) : MetaM Bool := do
  unless igi1.toIndGroupInfo == igi2.toIndGroupInfo do return false
  unless igi1.levels.length = igi2.levels.length do return false
  unless (igi1.levels.zip igi2.levels).all (fun (l₁, l₂) => Level.isEquiv l₁ l₂) do return false
  unless igi1.params.size = igi2.params.size do return false
  unless (← (igi1.params.zip igi2.params).allM (fun (e₁, e₂) => Meta.isDefEqGuarded e₁ e₂)) do return false
  return true

/--
Figures out the nested type formers of an inductive group, with parameters instantiated
and indices still forall-abstracted.

For example given a nested inductive
```
inductive Tree α where | node : α → Vector (Tree α) n → Tree α
```
(where `n` is an index of `Vector`) and the instantiation `Tree Int` it will return
```
#[(n : Nat) → Vector (Tree Int) n]
```
-/
def IndGroupInst.nestedTypeFormers (igi : IndGroupInst) : MetaM (Array Expr) := do
  if igi.numNested = 0 then return #[]
  -- We extract this information from the motives of the recursor
  let recName := mkRecName igi.all[0]!
  let recInfo ← getConstInfoRec recName
  assert! recInfo.numMotives = igi.numMotives
  let aux := mkAppN (.const recName (0 :: igi.levels)) igi.params
  let motives ← inferArgumentTypesN recInfo.numMotives aux
  let auxMotives : Array Expr := motives[igi.all.size:]
  auxMotives.mapM fun motive =>
    forallTelescopeReducing motive fun xs _ => do
      assert! xs.size > 0
      mkForallFVars xs.pop (← inferType xs.back)

end Lean.Elab.Structural
@@ -7,7 +7,6 @@ prelude
import Lean.Meta.IndPredBelow
import Lean.Elab.PreDefinition.Basic
import Lean.Elab.PreDefinition.Structural.Basic
import Lean.Elab.PreDefinition.Structural.RecArgInfo

namespace Lean.Elab.Structural
open Meta
@@ -82,8 +81,8 @@ def mkIndPredBRecOn (recArgInfo : RecArgInfo) (value : Expr) : M Expr := do
    let motive ← mkForallFVars otherArgs type
    let motive ← mkLambdaFVars indexMajorArgs motive
    trace[Elab.definition.structural] "brecOn motive: {motive}"
    let brecOn := Lean.mkConst (mkBRecOnName recArgInfo.indName!) recArgInfo.indGroupInst.levels
    let brecOn := mkAppN brecOn recArgInfo.indGroupInst.params
    let brecOn := Lean.mkConst (mkBRecOnName recArgInfo.indName) recArgInfo.indLevels
    let brecOn := mkAppN brecOn recArgInfo.indParams
    let brecOn := mkApp brecOn motive
    let brecOn := mkAppN brecOn indexMajorArgs
    check brecOn

@@ -89,72 +89,87 @@ def getMutualFixedPrefix (preDefs : Array PreDefinition) : M Nat :=
      return true
    resultRef.get

private def elimMutualRecursion (preDefs : Array PreDefinition) (xs : Array Expr)
    (recArgInfos : Array RecArgInfo) : M (Array PreDefinition) := do
  let values ← preDefs.mapM (instantiateLambda ·.value xs)
  let indInfo ← getConstInfoInduct recArgInfos[0]!.indGroupInst.all[0]!
  if ← isInductivePredicate indInfo.name then
    -- Here we branch off to the IndPred construction, but only for non-mutual functions
    unless preDefs.size = 1 do
      throwError "structural mutual recursion over inductive predicates is not supported"
    trace[Elab.definition.structural] "Using mkIndPred construction"
    let preDef := preDefs[0]!
    let recArgInfo := recArgInfos[0]!
    let value := values[0]!
    let valueNew ← mkIndPredBRecOn recArgInfo value
    let valueNew ← mkLambdaFVars xs valueNew
    trace[Elab.definition.structural] "Nonrecursive value:{indentExpr valueNew}"
    check valueNew
    return #[{ preDef with value := valueNew }]
/-- Checks that all parameter types are mutually inductive -/
private def checkAllFromSameClique (recArgInfos : Array RecArgInfo) : MetaM Unit := do
  for recArgInfo in recArgInfos do
    unless recArgInfos[0]!.indAll.contains recArgInfo.indName do
      throwError m!"Cannot use structural mutual recursion: The recursive argument of " ++
        m!"{recArgInfos[0]!.fnName} is of type {recArgInfos[0]!.indName}, " ++
        m!"the recursive argument of {recArgInfo.fnName} is of type " ++
        m!"{recArgInfo.indName}, and these are not mutually recursive."

  -- Sort the (indices of the) definitions by their position in indInfo.all
  let positions : Positions := .groupAndSort (·.indIdx) recArgInfos (Array.range indInfo.numTypeFormers)
  trace[Elab.definition.structural] "positions: {positions}"

  -- Construct the common `.brecOn` arguments
  let motives ← (Array.zip recArgInfos values).mapM fun (r, v) => mkBRecOnMotive r v
  trace[Elab.definition.structural] "motives: {motives}"
  let brecOnConst ← mkBRecOnConst recArgInfos positions motives
  let FTypes ← inferBRecOnFTypes recArgInfos positions brecOnConst
  trace[Elab.definition.structural] "FTypes: {FTypes}"
  let FArgs ← (recArgInfos.zip (values.zip FTypes)).mapM fun (r, (v, t)) =>
    mkBRecOnF recArgInfos positions r v t
  trace[Elab.definition.structural] "FArgs: {FArgs}"
  -- Assemble the individual `.brecOn` applications
  let valuesNew ← (Array.zip recArgInfos values).mapIdxM fun i (r, v) =>
    mkBrecOnApp positions i brecOnConst FArgs r v
  -- Abstract over the fixed prefix
  let valuesNew ← valuesNew.mapM (mkLambdaFVars xs ·)
  return (Array.zip preDefs valuesNew).map fun ⟨preDef, valueNew⟩ => { preDef with value := valueNew }

private def inferRecArgPos (preDefs : Array PreDefinition) (termArg?s : Array (Option TerminationArgument)) :
    M (Array Nat × Array PreDefinition) := do
private def elimMutualRecursion (preDefs : Array PreDefinition) (recArgPoss : Array Nat) : M (Array PreDefinition) := do
  withoutModifyingEnv do
    preDefs.forM (addAsAxiom ·)
    let fnNames := preDefs.map (·.declName)
    let names := preDefs.map (·.declName)
    let preDefs ← preDefs.mapM fun preDef =>
      return { preDef with value := (← preprocess preDef.value fnNames) }
      return { preDef with value := (← preprocess preDef.value names) }

    -- The syntactically fixed arguments
    let maxNumFixed ← getMutualFixedPrefix preDefs

    lambdaBoundedTelescope preDefs[0]!.value maxNumFixed fun xs _ => do
    -- We do two passes to get the RecArgInfo values.
    -- From the first pass, we only keep the minimum of the `numFixed` reported.
    let numFixed ← lambdaBoundedTelescope preDefs[0]!.value maxNumFixed fun xs _ => do
      assert! xs.size = maxNumFixed
      let values ← preDefs.mapM (instantiateLambda ·.value xs)

      tryAllArgs fnNames xs values termArg?s fun recArgInfos => do
        let recArgPoss := recArgInfos.map (·.recArgPos)
        trace[Elab.definition.structural] "Trying argument set {recArgPoss}"
        let numFixed := recArgInfos.foldl (·.min ·.numFixed) maxNumFixed
        if numFixed < maxNumFixed then
          trace[Elab.definition.structural] "Reduced numFixed from {maxNumFixed} to {numFixed}"
        -- We may have decreased the number of arguments we consider fixed, so update
        -- the recArgInfos, remove the extra arguments from local environment, and recalculate value
        let recArgInfos := recArgInfos.map ({· with numFixed := numFixed })
        withErasedFVars (xs.extract numFixed xs.size |>.map (·.fvarId!)) do
          let xs := xs[:numFixed]
          let preDefs' ← elimMutualRecursion preDefs xs recArgInfos
          return (recArgPoss, preDefs')
      let recArgInfos ← preDefs.mapIdxM fun i preDef => do
        let recArgPos := recArgPoss[i]!
        let value := values[i]!
        lambdaTelescope value fun ys _value => do
          getRecArgInfo preDef.declName maxNumFixed (xs ++ ys) recArgPos

      return (recArgInfos.map (·.numFixed)).foldl Nat.min maxNumFixed

    if numFixed < maxNumFixed then
      trace[Elab.definition.structural] "Reduced numFixed from {maxNumFixed} to {numFixed}"

    -- Now we bring exactly those `numFixed` parameters into scope.
    lambdaBoundedTelescope preDefs[0]!.value numFixed fun xs _ => do
      assert! xs.size = numFixed
      let values ← preDefs.mapM (instantiateLambda ·.value xs)

      let recArgInfos ← preDefs.mapIdxM fun i preDef => do
        let recArgPos := recArgPoss[i]!
        let value := values[i]!
        lambdaTelescope value fun ys _value => do
          getRecArgInfo preDef.declName numFixed (xs ++ ys) recArgPos

      -- Two passes should suffice
      assert! recArgInfos.all (·.numFixed = numFixed)

      let indInfo ← getConstInfoInduct recArgInfos[0]!.indName
      if ← isInductivePredicate indInfo.name then
        -- Here we branch off to the IndPred construction, but only for non-mutual functions
        unless preDefs.size = 1 do
          throwError "structural mutual recursion over inductive predicates is not supported"
        trace[Elab.definition.structural] "Using mkIndPred construction"
        let preDef := preDefs[0]!
        let recArgInfo := recArgInfos[0]!
        let value := values[0]!
        let valueNew ← mkIndPredBRecOn recArgInfo value
        let valueNew ← mkLambdaFVars xs valueNew
        trace[Elab.definition.structural] "Nonrecursive value:{indentExpr valueNew}"
        check valueNew
        return #[{ preDef with value := valueNew }]

      checkAllFromSameClique recArgInfos
      -- Sort the (indices of the) definitions by their position in indInfo.all
      let positions : Positions := .groupAndSort (·.indName) recArgInfos indInfo.all.toArray

      -- Construct the common `.brecOn` arguments
      let motives ← (Array.zip recArgInfos values).mapM fun (r, v) => mkBRecOnMotive r v
      let brecOnConst ← mkBRecOnConst recArgInfos positions motives
      let FTypes ← inferBRecOnFTypes recArgInfos positions brecOnConst
      let FArgs ← (recArgInfos.zip (values.zip FTypes)).mapM fun (r, (v, t)) =>
        mkBRecOnF recArgInfos positions r v t
      -- Assemble the individual `.brecOn` applications
      let valuesNew ← (Array.zip recArgInfos values).mapIdxM fun i (r, v) =>
        mkBrecOnApp positions i brecOnConst FArgs r v
      -- Abstract over the fixed prefix
      let valuesNew ← valuesNew.mapM (mkLambdaFVars xs ·)
      return (Array.zip preDefs valuesNew).map fun ⟨preDef, valueNew⟩ => { preDef with value := valueNew }

def reportTermArg (preDef : PreDefinition) (recArgPos : Nat) : MetaM Unit := do
  if let some ref := preDef.termination.terminationBy?? then
@@ -164,20 +179,34 @@ def reportTermArg (preDef : PreDefinition) (recArgPos : Nat) : MetaM Unit := do
    let stx ← termArg.delab arity (extraParams := preDef.termination.extraParams)
    Tactic.TryThis.addSuggestion ref stx

private def inferRecArgPos (preDefs : Array PreDefinition)
    (termArgs? : Option TerminationArguments) : M (Array Nat × Array PreDefinition) := do
  withoutModifyingEnv do
    if let some termArgs := termArgs? then
      let recArgPoss ← termArgs.mapM (·.structuralArg)
      let preDefsNew ← elimMutualRecursion preDefs recArgPoss
      return (recArgPoss, preDefsNew)
    else
      let #[preDef] := preDefs
        | throwError "mutual structural recursion requires explicit `termination_by` clauses"
      -- Use termination_by annotation to find argument to recurse on, or just try all
      tryAllArgs preDef.value fun i =>
        mapError (f := fun msg => m!"argument #{i+1} cannot be used for structural recursion{indentD msg}") do
          let preDefsNew ← elimMutualRecursion #[preDef] #[i]
          return (#[i], preDefsNew)

def structuralRecursion (preDefs : Array PreDefinition) (termArg?s : Array (Option TerminationArgument)) : TermElabM Unit := do
  let names := preDefs.map (·.declName)
  let ((recArgPoss, preDefsNonRec), state) ← run <| inferRecArgPos preDefs termArg?s
def structuralRecursion (preDefs : Array PreDefinition) (termArgs? : Option TerminationArguments) : TermElabM Unit := do
  let ((recArgPoss, preDefsNonRec), state) ← run <| inferRecArgPos preDefs termArgs?
  for recArgPos in recArgPoss, preDef in preDefs do
    reportTermArg preDef recArgPos
  state.addMatchers.forM liftM
  preDefsNonRec.forM fun preDefNonRec => do
    let preDefNonRec ← eraseRecAppSyntax preDefNonRec
    -- state.addMatchers.forM liftM
    mapError (f := (m!"structural recursion failed, produced type incorrect term{indentD ·}")) do
      -- We create the `_unsafe_rec` before we abstract nested proofs.
      -- Reason: the nested proofs may be referring to the _unsafe_rec.
      addNonRec preDefNonRec (applyAttrAfterCompilation := false) (all := names.toList)
    mapError (addNonRec preDefNonRec (applyAttrAfterCompilation := false)) fun msg =>
      m!"structural recursion failed, produced type incorrect term{indentD msg}"
  -- We create the `_unsafe_rec` before we abstract nested proofs.
  -- Reason: the nested proofs may be referring to the _unsafe_rec.
  let preDefs ← preDefs.mapM (eraseRecAppSyntax ·)
  addAndCompilePartialRec preDefs
  for preDef in preDefs, recArgPos in recArgPoss do
@@ -195,6 +224,8 @@ def structuralRecursion (preDefs : Array PreDefinition) (termArg?s : Array (Opti
    markAsRecursive preDef.declName
  applyAttributesOf preDefsNonRec AttributeApplicationTime.afterCompilation

builtin_initialize
  registerTraceClass `Elab.definition.structural

end Structural

@@ -1,60 +0,0 @@
/-
Copyright (c) 2021 Microsoft Corporation. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Leonardo de Moura, Joachim Breitner
-/
prelude
import Lean.Meta.Basic
import Lean.Meta.ForEachExpr
import Lean.Elab.PreDefinition.Structural.IndGroupInfo

namespace Lean.Elab.Structural

/--
Information about the argument of interest of a structurally recursive function.

The `Expr`s in this data structure expect the `fixedParams` to be in scope, but not the other
parameters of the function. This ensures that this data structure makes sense in the other functions
of a mutually recursive group.
-/
structure RecArgInfo where
  /-- the name of the recursive function -/
  fnName : Name
  /-- the number of fixed-prefix arguments of the function whose termination we are trying to justify using structural recursion -/
  numFixed : Nat
  /-- position of the argument (counted including fixed prefix) we are recursing on -/
  recArgPos : Nat
  /-- positions (counted including fixed prefix) of the indices of the inductive datatype we are recursing on -/
  indicesPos : Array Nat
  /-- The inductive group (with parameters) of the argument's type -/
  indGroupInst : IndGroupInst
  /--
  index of the inductive datatype of the argument we are recursing on.
  If `< indAll.all`, a normal data type, else an auxiliary data type due to nested recursion
  -/
  indIdx : Nat
  deriving Inhabited

/--
If `xs` are the parameters of the functions (excluding fixed prefix), partitions them
into indices and major arguments, and other parameters.
-/
def RecArgInfo.pickIndicesMajor (info : RecArgInfo) (xs : Array Expr) : (Array Expr × Array Expr) := Id.run do
  let mut indexMajorArgs := #[]
  let mut otherArgs := #[]
  for h : i in [:xs.size] do
    let j := i + info.numFixed
    if j = info.recArgPos || info.indicesPos.contains j then
      indexMajorArgs := indexMajorArgs.push xs[i]
    else
      otherArgs := otherArgs.push xs[i]
  return (indexMajorArgs, otherArgs)

/--
Name of the recursive data type. Assumes that it is not one of the auxiliary ones.
-/
def RecArgInfo.indName! (info : RecArgInfo) : Name :=
  info.indGroupInst.all[info.indIdx]!

end Lean.Elab.Structural
@@ -10,7 +10,7 @@ import Lean.Elab.Term
import Lean.Elab.Binders
import Lean.Elab.SyntheticMVars
import Lean.Elab.PreDefinition.TerminationHint
import Lean.PrettyPrinter.Delaborator.Basic
import Lean.PrettyPrinter.Delaborator

/-!
This module contains
@@ -115,7 +115,7 @@ def TerminationArgument.delab (arity : Nat) (extraParams : Nat) (termArg : Termi
  -- any variable not mentioned syntactically (it may appear in the `Expr`, so do not just use
  -- `e.bindingBody!.hasLooseBVar`) should be delaborated as a hole.
  let vars : TSyntaxArray [`ident, `Lean.Parser.Term.hole] :=
    Array.map (fun (i : Ident) => if stxBody.raw.hasIdent i.getId then i else hole) vars
    Array.map (fun (i : Ident) => if hasIdent i.getId stxBody then i else hole) vars
  -- drop trailing underscores
  let mut vars := vars
  while ! vars.isEmpty && vars.back.raw.isOfKind ``hole do vars := vars.pop

@@ -86,8 +86,7 @@ def varyingVarNames (fixedPrefixSize : Nat) (preDef : PreDefinition) : MetaM (Ar
  let xs : Array Expr := xs[fixedPrefixSize:]
  xs.mapM (·.fvarId!.getUserName)

def wfRecursion (preDefs : Array PreDefinition) (termArg?s : Array (Option TerminationArgument)) : TermElabM Unit := do
  let termArgs? := termArg?s.sequenceMap id -- Either all or none, checked by `elabTerminationByHints`
def wfRecursion (preDefs : Array PreDefinition) (termArgs? : Option TerminationArguments) : TermElabM Unit := do
  let preDefs ← preDefs.mapM fun preDef =>
    return { preDef with value := (← preprocess preDef.value) }
  let (fixedPrefixSize, argsPacker, unaryPreDef) ← withoutModifyingEnv do

@@ -1,65 +0,0 @@
|
||||
/-
|
||||
Copyright (c) 2024 Matthew Robert Ballard. All rights reserved.
|
||||
Released under Apache 2.0 license as described in the file LICENSE.
|
||||
Authors: Tomas Skrivan, Matthew Robert Ballard
-/
prelude
import Init.Tactics
import Lean.Elab.Command
import Lean.Meta.Tactic.Simp.SimpTheorems

namespace Lean.Elab.Tactic.DiscrTreeKey

open Lean.Meta DiscrTree
open Lean.Elab.Tactic
open Lean.Elab.Command

private def mkKey (e : Expr) (simp : Bool) : MetaM (Array Key) := do
  let (_, _, type) ← withReducible <| forallMetaTelescopeReducing e
  let type ← whnfR type
  if simp then
    if let some (_, lhs, _) := type.eq? then
      mkPath lhs simpDtConfig
    else if let some (lhs, _) := type.iff? then
      mkPath lhs simpDtConfig
    else if let some (_, lhs, _) := type.ne? then
      mkPath lhs simpDtConfig
    else if let some p := type.not? then
      match p.eq? with
      | some (_, lhs, _) =>
        mkPath lhs simpDtConfig
      | _ => mkPath p simpDtConfig
    else
      mkPath type simpDtConfig
  else
    mkPath type {}

private def getType (t : TSyntax `term) : TermElabM Expr := do
  if let `($id:ident) := t then
    if let some ldecl := (← getLCtx).findFromUserName? id.getId then
      return ldecl.type
    else
      let info ← getConstInfo (← realizeGlobalConstNoOverloadWithInfo id)
      return info.type
  else
    Term.elabTerm t none

@[builtin_command_elab Lean.Parser.discrTreeKeyCmd]
def evalDiscrTreeKeyCmd : CommandElab := fun stx => do
  Command.liftTermElabM <| do
    match stx with
    | `(command| #discr_tree_key $t:term) => do
      let type ← getType t
      logInfo (← keysAsPattern <| ← mkKey type false)
    | _ => Elab.throwUnsupportedSyntax

@[builtin_command_elab Lean.Parser.discrTreeSimpKeyCmd]
def evalDiscrTreeSimpKeyCmd : CommandElab := fun stx => do
  Command.liftTermElabM <| do
    match stx with
    | `(command| #discr_tree_simp_key $t:term) => do
      let type ← getType t
      logInfo (← keysAsPattern <| ← mkKey type true)
    | _ => Elab.throwUnsupportedSyntax

end Lean.Elab.Tactic.DiscrTreeKey
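/-
Usage sketch (an illustrative addition, not part of the original file): once the
elaborators above are registered, these commands print the discrimination-tree
keys computed by `mkKey`. The lemma name `List.length_map` is only an example.

#discr_tree_key List.length_map       -- keys for the whole statement
#discr_tree_simp_key List.length_map  -- keys for the LHS, as `simp` would index it
-/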
@@ -359,8 +359,8 @@ def elabAsFVar (stx : Syntax) (userName? : Option Name := none) : TacticM FVarId
  | _ => throwUnsupportedSyntax

/--
Make sure `expectedType` does not contain free variables or metavariables.
It applies zeta- and zetaDelta-reduction to eliminate `let` free variables.
-/
private def preprocessPropToDecide (expectedType : Expr) : TermElabM Expr := do
  let mut expectedType ← instantiateMVars expectedType
@@ -370,28 +370,6 @@ private def preprocessPropToDecide (expectedType : Expr) : TermElabM Expr := do
    throwError "expected type must not contain free or meta variables{indentExpr expectedType}"
  return expectedType

/--
Given the decidable instance `inst`, reduces it and returns a decidable instance expression
in whnf that can be regarded as the reason for the failure of `inst` to fully reduce.
-/
private partial def blameDecideReductionFailure (inst : Expr) : MetaM Expr := do
  let inst ← whnf inst
  -- If it's the Decidable recursor, then blame the major premise.
  if inst.isAppOfArity ``Decidable.rec 5 then
    return ← blameDecideReductionFailure inst.appArg!
  -- If it is a matcher, look for a discriminant that's a Decidable instance to blame.
  if let .const c _ := inst.getAppFn then
    if let some info ← getMatcherInfo? c then
      if inst.getAppNumArgs == info.arity then
        let args := inst.getAppArgs
        for i in [0:info.numDiscrs] do
          let inst' := args[info.numParams + 1 + i]!
          if (← Meta.isClass? (← inferType inst')) == ``Decidable then
            let inst'' ← whnf inst'
            if !(inst''.isAppOf ``isTrue || inst''.isAppOf ``isFalse) then
              return ← blameDecideReductionFailure inst''
  return inst

@[builtin_tactic Lean.Parser.Tactic.decide] def evalDecide : Tactic := fun _ =>
  closeMainGoalUsing fun expectedType => do
    let expectedType ← preprocessPropToDecide expectedType
@@ -399,66 +377,24 @@ private partial def blameDecideReductionFailure (inst : Expr) : MetaM Expr := do
    let d ← instantiateMVars d
    -- Get instance from `d`
    let s := d.appArg!
    -- Reduce the instance rather than `d` itself for diagnostics purposes.
    let r ← withAtLeastTransparency .default <| whnf s
    if r.isAppOf ``isTrue then
      -- Success!
      -- While we have a proof from reduction, we do not embed it in the proof term,
      -- and instead we let the kernel recompute it during type checking from the following more efficient term.
      let rflPrf ← mkEqRefl (toExpr true)
      return mkApp3 (Lean.mkConst ``of_decide_eq_true) expectedType s rflPrf
    else
      -- Diagnose the failure, lazily so that there is no performance impact if `decide` isn't being used interactively.
      throwError MessageData.ofLazyM (es := #[expectedType]) do
        if r.isAppOf ``isFalse then
          return m!"\
            tactic 'decide' proved that the proposition\
            {indentExpr expectedType}\n\
            is false"
        -- Re-reduce the instance and collect diagnostics, to get all unfolded Decidable instances
        let (reason, unfoldedInsts) ← withoutModifyingState <| withOptions (fun opt => diagnostics.set opt true) do
          modifyDiag (fun _ => {})
          let reason ← withAtLeastTransparency .default <| blameDecideReductionFailure s
          let unfolded := (← get).diag.unfoldCounter.foldl (init := #[]) fun cs n _ => cs.push n
          let unfoldedInsts ← unfolded |>.qsort Name.lt |>.filterMapM fun n => do
            let e ← mkConstWithLevelParams n
            if (← Meta.isClass? (← inferType e)) == ``Decidable then
              return m!"'{MessageData.ofConst e}'"
            else
              return none
          return (reason, unfoldedInsts)
        let stuckMsg :=
          if unfoldedInsts.isEmpty then
            m!"Reduction got stuck at the '{MessageData.ofConstName ``Decidable}' instance{indentExpr reason}"
          else
            let instances := if unfoldedInsts.size == 1 then "instance" else "instances"
            m!"After unfolding the {instances} {MessageData.andList unfoldedInsts.toList}, \
              reduction got stuck at the '{MessageData.ofConstName ``Decidable}' instance{indentExpr reason}"
        let hint :=
          if reason.isAppOf ``Eq.rec then
            m!"\n\n\
              Hint: Reduction got stuck on '▸' ({MessageData.ofConstName ``Eq.rec}), \
              which suggests that one of the '{MessageData.ofConstName ``Decidable}' instances is defined using tactics such as 'rw' or 'simp'. \
              To avoid tactics, make use of functions such as \
              '{MessageData.ofConstName ``inferInstanceAs}' or '{MessageData.ofConstName ``decidable_of_decidable_of_iff}' \
              to alter a proposition."
          else if reason.isAppOf ``Classical.choice then
            m!"\n\n\
              Hint: Reduction got stuck on '{MessageData.ofConstName ``Classical.choice}', \
              which indicates that a '{MessageData.ofConstName ``Decidable}' instance \
              is defined using classical reasoning, proving an instance exists rather than giving a concrete construction. \
              The 'decide' tactic works by evaluating a decision procedure via reduction, and it cannot make progress with such instances. \
              This can occur due to the 'open scoped Classical' command, which enables the instance \
              '{MessageData.ofConstName ``Classical.propDecidable}'."
          else
            MessageData.nil
        return m!"\
          tactic 'decide' failed for proposition\
          {indentExpr expectedType}\n\
          since its '{MessageData.ofConstName ``Decidable}' instance\
          {indentExpr s}\n\
          did not reduce to '{MessageData.ofConstName ``isTrue}' or '{MessageData.ofConstName ``isFalse}'.\n\n\
          {stuckMsg}{hint}"
    -- Reduce the instance rather than `d` itself, since that gives a nicer error message on failure.
    let r ← withDefault <| whnf s
    if r.isAppOf ``isFalse then
      throwError "\
        tactic 'decide' proved that the proposition\
        {indentExpr expectedType}\n\
        is false"
    unless r.isAppOf ``isTrue do
      throwError "\
        tactic 'decide' failed for proposition\
        {indentExpr expectedType}\n\
        since its 'Decidable' instance reduced to\
        {indentExpr r}\n\
        rather than to the 'isTrue' constructor."
    -- While we have a proof from reduction, we do not embed it in the proof term,
    -- but rather we let the kernel recompute it during type checking from a more efficient term.
    let rflPrf ← mkEqRefl (toExpr true)
    return mkApp3 (Lean.mkConst ``of_decide_eq_true) expectedType s rflPrf
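/-
Behavior sketch (an illustrative addition, not part of the original file):
`decide` closes the goal exactly when the `Decidable` instance reduces to `isTrue`.

example : 2 + 2 = 4 := by decide
-- `by decide` on `2 + 2 = 5` instead reports that the proposition is false, and a
-- noncomputable instance (e.g. one obtained via `Classical.propDecidable`) gets
-- stuck on `Classical.choice`, which triggers the corresponding hint above.
-/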

private def mkNativeAuxDecl (baseName : Name) (type value : Expr) : TermElabM Name := do
  let auxName ← Term.mkAuxName baseName

@@ -85,12 +85,7 @@ def mkExtIffType (extThmName : Name) : MetaM Expr := withLCtx {} {} do
    if fvars.fvarSet.contains fvar.fvarId! then
      throwError "argument {fvar} is depended upon, which is not supported"
  let conj := mkAndN (← toRevert.mapM (inferType ·)).toList
  -- Make everything implicit except for inst implicits
  let mut newBis := #[]
  for fvar in args[0:startIdx] do
    if (← fvar.fvarId!.getBinderInfo) matches .default | .strictImplicit then
      newBis := newBis.push (fvar.fvarId!, .implicit)
  withNewBinderInfos newBis do
  withNewBinderInfos (args |>.extract 0 startIdx |>.map (·.fvarId!, .implicit)) do
    mkForallFVars args[:startIdx] <| mkIff ty conj

/--

@@ -564,7 +564,7 @@ def getInductiveValFromMajor (major : Expr) : TacticM InductiveVal :=

/--
Elaborates the term in the `using` clause. We want to allow parameters to be instantiated
(e.g. `using foo (p := …)`), but preserve other parameters, like the motives, as parameters,
(e.g. `using foo (p := …)`), but preserve other paramters, like the motives, as parameters,
without turning them into MVars. So this uses `abstractMVars` at the end. This is inspired by
`Lean.Elab.Tactic.addSimpTheorem`.


@@ -1827,13 +1827,9 @@ def isLetRecAuxMVar (mvarId : MVarId) : TermElabM Bool := do
/--
  Create an `Expr.const` using the given name and explicit levels.
  Remark: fresh universe metavariables are created if the constant has more universe
  parameters than `explicitLevels`.

  If `checkDeprecated := true`, then `Linter.checkDeprecated` is invoked.
-/
def mkConst (constName : Name) (explicitLevels : List Level := []) (checkDeprecated := true) : TermElabM Expr := do
  if checkDeprecated then
    Linter.checkDeprecated constName
  parameters than `explicitLevels`. -/
def mkConst (constName : Name) (explicitLevels : List Level := []) : TermElabM Expr := do
  Linter.checkDeprecated constName -- TODO: check is occurring too early if there are multiple alternatives. Fix if it is not ok in practice
  let cinfo ← getConstInfo constName
  if explicitLevels.length > cinfo.levelParams.length then
    throwError "too many explicit universe levels for '{constName}'"
@@ -1842,21 +1838,10 @@ def mkConst (constName : Name) (explicitLevels : List Level := []) (checkDepreca
    let us ← mkFreshLevelMVars numMissingLevels
    return Lean.mkConst constName (explicitLevels ++ us)

def checkDeprecated (ref : Syntax) (e : Expr) : TermElabM Unit := do
  if let .const declName _ := e.getAppFn then
    withRef ref do Linter.checkDeprecated declName

private def mkConsts (candidates : List (Name × List String)) (explicitLevels : List Level) : TermElabM (List (Expr × List String)) := do
  candidates.foldlM (init := []) fun result (declName, projs) => do
    -- TODO: better support for `mkConst` failure. We may want to cache the failures, and report them if all candidates fail.
    /-
    We disable `checkDeprecated` here because there may be many overloaded symbols.
    Note that this method and `resolveName` and `resolveName'` return a list of pairs instead of a list of `TermElabResult`s.
    We perform the `checkDeprecated` test at `resolveId?` and `elabAppFnId`.
    At `elabAppFnId`, we perform the check when converting the list returned by `resolveName'` into a list of
    `TermElabResult`s.
    -/
    let const ← mkConst declName explicitLevels (checkDeprecated := false)
    let const ← mkConst declName explicitLevels
    return (const, projs) :: result

def resolveName (stx : Syntax) (n : Name) (preresolved : List Syntax.Preresolved) (explicitLevels : List Level) (expectedType? : Option Expr := none) : TermElabM (List (Expr × List String)) := do
@@ -1910,11 +1895,11 @@ def resolveId? (stx : Syntax) (kind := "term") (withInfo := false) : TermElabM (
  | [] => return none
  | [f] =>
    let f ← if withInfo then addTermInfo stx f else pure f
    checkDeprecated stx f
    return some f
  | _ => throwError "ambiguous {kind}, use fully qualified name, possible interpretations {fs}"
  | _ => throwError "identifier expected"


def TermElabM.run (x : TermElabM α) (ctx : Context := {}) (s : State := {}) : MetaM (α × State) :=
  withConfig setElabConfig (x ctx |>.run s)


@@ -504,14 +504,6 @@ def mkArrayLit (type : Expr) (xs : List Expr) : MetaM Expr := do
  let listLit ← mkListLit type xs
  return mkApp (mkApp (mkConst ``List.toArray [u]) type) listLit

def mkNone (type : Expr) : MetaM Expr := do
  let u ← getDecLevel type
  return mkApp (mkConst ``Option.none [u]) type

def mkSome (type value : Expr) : MetaM Expr := do
  let u ← getDecLevel type
  return mkApp2 (mkConst ``Option.some [u]) type value

def mkSorry (type : Expr) (synthetic : Bool) : MetaM Expr := do
  let u ← getLevel type
  return mkApp2 (mkConst ``sorryAx [u]) type (toExpr synthetic)

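/-
Usage sketch (an illustrative addition, not part of the original file): inside
some `MetaM` computation, build the expression `some true : Option Bool` with
the helper above; `someTrueExpr` is a hypothetical name.

def someTrueExpr : MetaM Expr := do
  mkSome (mkConst ``Bool) (toExpr true)  -- builds `@Option.some Bool true`
-/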
@@ -1492,16 +1492,6 @@ private def withLocalContextImp (lctx : LocalContext) (localInsts : LocalInstanc
def withLCtx (lctx : LocalContext) (localInsts : LocalInstances) : n α → n α :=
  mapMetaM <| withLocalContextImp lctx localInsts

/--
Runs `k` in a local environment with the `fvarIds` erased.
-/
def withErasedFVars [MonadLCtx n] [MonadLiftT MetaM n] (fvarIds : Array FVarId) (k : n α) : n α := do
  let lctx ← getLCtx
  let localInsts ← getLocalInstances
  let lctx' := fvarIds.foldl (·.erase ·) lctx
  let localInsts' := localInsts.filter (!fvarIds.contains ·.fvar.fvarId!)
  withLCtx lctx' localInsts' k

private def withMVarContextImp (mvarId : MVarId) (x : MetaM α) : MetaM α := do
  let mvarDecl ← mvarId.getDecl
  withLocalContextImp mvarDecl.lctx mvarDecl.localInstances x

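/-
Usage sketch (an illustrative addition, not part of the original file; `fvarId`
and `k` are hypothetical names): run a continuation with a hypothesis hidden
from the local context.

withErasedFVars #[fvarId] do
  -- here `fvarId` is neither in the local context nor among the local instances
  k
-/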
@@ -85,8 +85,9 @@ of type
```
α → List α → Sort (max 1 u_1) → Sort (max 1 u_1)
```
The parameters `typeFormers` are the `motive`s.
-/
private def buildBelowMinorPremise (rlvl : Level) (motives : Array Expr) (minorType : Expr) : MetaM Expr :=
private def buildBelowMinorPremise (rlvl : Level) (typeFormers : Array Expr) (minorType : Expr) : MetaM Expr :=
  forallTelescope minorType fun minor_args _ => do go #[] minor_args.toList
where
  ibelow := rlvl matches .zero
@@ -95,7 +96,7 @@ where
    | arg::args => do
      let argType ← inferType arg
      forallTelescope argType fun arg_args arg_type => do
        if motives.contains arg_type.getAppFn then
        if typeFormers.contains arg_type.getAppFn then
          let name ← arg.fvarId!.getUserName
          let type' ← forallTelescope argType fun args _ => mkForallFVars args (.sort rlvl)
          withLocalDeclD name type' fun arg' => do
@@ -123,99 +124,80 @@ fun {α} {motive} t =>
  List.rec True (fun head tail tail_ih => (motive tail ∧ tail_ih) ∧ True) t
```
-/
private def mkBelowFromRec (recName : Name) (ibelow reflexive : Bool) (nParams : Nat)
    (belowName : Name) : MetaM Unit := do
  -- The construction follows the type of `ind.rec`
  let .recInfo recVal ← getConstInfo recName
    | throwError "{recName} not a .recInfo"
  let lvl::lvls := recVal.levelParams.map (Level.param ·)
    | throwError "recursor {recName} has no levelParams"
  let lvlParam := recVal.levelParams.head!

  let refType :=
    if ibelow then
      recVal.type.instantiateLevelParams [lvlParam] [0]
    else if reflexive then
      recVal.type.instantiateLevelParams [lvlParam] [lvl.succ]
    else
      recVal.type

  let decl ← forallTelescope refType fun refArgs _ => do
    assert! refArgs.size > nParams + recVal.numMotives + recVal.numMinors
    let params : Array Expr := refArgs[:nParams]
    let motives : Array Expr := refArgs[nParams:nParams + recVal.numMotives]
    let minors : Array Expr := refArgs[nParams + recVal.numMotives:nParams + recVal.numMotives + recVal.numMinors]
    let indices : Array Expr := refArgs[nParams + recVal.numMotives + recVal.numMinors:refArgs.size - 1]
    let major : Expr := refArgs[refArgs.size - 1]!

    -- universe parameter names of ibelow/below
    let blvls :=
      -- For ibelow we instantiate the first universe parameter of `.rec` to `.zero`
      if ibelow then recVal.levelParams.tail!
      else recVal.levelParams
    -- universe parameter of the type former.
    -- same as `typeFormerTypeLevel indVal.type`, but we want to infer it from the
    -- type of the recursor, to be more robust when facing nested induction
    let majorTypeType ← inferType (← inferType major)
    let .some ilvl ← typeFormerTypeLevel majorTypeType
      | throwError "type of type of major premise {major} not a type former"

    -- universe level of the resultant type
    let rlvl : Level :=
      if ibelow then
        0
      else if reflexive then
        if let .max 1 ilvl' := ilvl then
          mkLevelMax' (.succ lvl) ilvl'
        else
          mkLevelMax' (.succ lvl) ilvl
      else
        mkLevelMax' 1 lvl

    let mut val := .const recName (rlvl.succ :: lvls)
    -- add parameters
    val := mkAppN val params
    -- add type formers
    for motive in motives do
      let arg ← forallTelescope (← inferType motive) fun targs _ =>
        mkLambdaFVars targs (.sort rlvl)
      val := .app val arg
    -- add minor premises
    for minor in minors do
      let arg ← buildBelowMinorPremise rlvl motives (← inferType minor)
      val := .app val arg
    -- add indices and major premise
    val := mkAppN val indices
    val := mkApp val major

    -- All parameters of `.rec` besides the `minors` become parameters of `.below`
    let below_params := params ++ motives ++ indices ++ #[major]
    let type ← mkForallFVars below_params (.sort rlvl)
    val ← mkLambdaFVars below_params val

    mkDefinitionValInferrringUnsafe belowName blvls type val .abbrev

  addDecl (.defnDecl decl)
  setReducibleAttribute decl.name
  modifyEnv fun env => markAuxRecursor env decl.name
  modifyEnv fun env => addProtected env decl.name

private def mkBelowOrIBelow (indName : Name) (ibelow : Bool) : MetaM Unit := do
  let .inductInfo indVal ← getConstInfo indName | return
  unless indVal.isRec do return
  if ← isPropFormerType indVal.type then return

  let recName := mkRecName indName
  let belowName := if ibelow then mkIBelowName indName else mkBelowName indName
  mkBelowFromRec recName ibelow indVal.isReflexive indVal.numParams belowName
  -- The construction follows the type of `ind.rec`
  let .recInfo recVal ← getConstInfo recName
    | throwError "{recName} not a .recInfo"
  let lvl::lvls := recVal.levelParams.map (Level.param ·)
    | throwError "recursor {recName} has no levelParams"
  let lvlParam := recVal.levelParams.head!
  -- universe parameter names of ibelow/below
  let blvls :=
    -- For ibelow we instantiate the first universe parameter of `.rec` to `.zero`
    if ibelow then recVal.levelParams.tail!
    else recVal.levelParams
  let .some ilvl ← typeFormerTypeLevel indVal.type
    | throwError "type {indVal.type} of inductive {indVal.name} not a type former?"

  -- If this is the first inductive in a mutual group with nested inductives,
  -- generate the constructions for the nested inductives now
  if indVal.all[0]! = indName then
    for i in [:indVal.numNested] do
      let recName := recName.appendIndexAfter (i + 1)
      let belowName := belowName.appendIndexAfter (i + 1)
      mkBelowFromRec recName ibelow indVal.isReflexive indVal.numParams belowName
  -- universe level of the resultant type
  let rlvl : Level :=
    if ibelow then
      0
    else if indVal.isReflexive then
      if let .max 1 ilvl' := ilvl then
        mkLevelMax' (.succ lvl) ilvl'
      else
        mkLevelMax' (.succ lvl) ilvl
    else
      mkLevelMax' 1 lvl

  let refType :=
    if ibelow then
      recVal.type.instantiateLevelParams [lvlParam] [0]
    else if indVal.isReflexive then
      recVal.type.instantiateLevelParams [lvlParam] [lvl.succ]
    else
      recVal.type

  let decl ← forallTelescope refType fun refArgs _ => do
    assert! refArgs.size == indVal.numParams + recVal.numMotives + recVal.numMinors + indVal.numIndices + 1
    let params : Array Expr := refArgs[:indVal.numParams]
    let typeFormers : Array Expr := refArgs[indVal.numParams:indVal.numParams + recVal.numMotives]
    let minors : Array Expr := refArgs[indVal.numParams + recVal.numMotives:indVal.numParams + recVal.numMotives + recVal.numMinors]
    let remaining : Array Expr := refArgs[indVal.numParams + recVal.numMotives + recVal.numMinors:]

    let mut val := .const recName (rlvl.succ :: lvls)
    -- add parameters
    val := mkAppN val params
    -- add type formers
    for typeFormer in typeFormers do
      let arg ← forallTelescope (← inferType typeFormer) fun targs _ =>
        mkLambdaFVars targs (.sort rlvl)
      val := .app val arg
    -- add minor premises
    for minor in minors do
      let arg ← buildBelowMinorPremise rlvl typeFormers (← inferType minor)
      val := .app val arg
    -- add indices and major premise
    val := mkAppN val remaining

    -- All parameters of `.rec` besides the `minors` become parameters of `.below`
    let below_params := params ++ typeFormers ++ remaining
    let type ← mkForallFVars below_params (.sort rlvl)
    val ← mkLambdaFVars below_params val

    let name := if ibelow then mkIBelowName indName else mkBelowName indName
    mkDefinitionValInferrringUnsafe name blvls type val .abbrev

  addDecl (.defnDecl decl)
  setReducibleAttribute decl.name
  modifyEnv fun env => markAuxRecursor env decl.name
  modifyEnv fun env => addProtected env decl.name

def mkBelow (declName : Name) : MetaM Unit := mkBelowOrIBelow declName false
def mkIBelow (declName : Name) : MetaM Unit := mkBelowOrIBelow declName true
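/-
Illustration (matching the `List` examples in the doc comments above; the exact
universe names are schematic): the generated constructions have roughly the shapes

@List.below  : {α : Type u} → {motive : List α → Sort v} → List α → Sort (max 1 v)
@List.ibelow : {α : Type u} → {motive : List α → Prop} → List α → Prop

where `List.below t` packages the motive values for all structurally smaller
lists of `t`, and `ibelow` is the `Prop`-valued (`Sort 0`) variant.
-/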
@@ -237,21 +219,22 @@ of type
  PProd (motive tail) (List.below tail) →
  PProd (motive (head :: tail)) (PProd (PProd (motive tail) (List.below tail)) PUnit)
```
The parameters `typeFormers` are the `motive`s.
-/
private def buildBRecOnMinorPremise (rlvl : Level) (motives : Array Expr)
private def buildBRecOnMinorPremise (rlvl : Level) (typeFormers : Array Expr)
    (belows : Array Expr) (fs : Array Expr) (minorType : Expr) : MetaM Expr :=
  forallTelescope minorType fun minor_args minor_type => do
    let rec go (prods : Array Expr) : List Expr → MetaM Expr
      | [] => minor_type.withApp fun minor_type_fn minor_type_args => do
        let b ← mkNProdMk rlvl prods
        let .some ⟨idx, _⟩ := motives.indexOf? minor_type_fn
          | throwError m!"Did not find {minor_type} in {motives}"
        let .some ⟨idx, _⟩ := typeFormers.indexOf? minor_type_fn
          | throwError m!"Did not find {minor_type} in {typeFormers}"
        mkPProdMk (mkAppN fs[idx]! (minor_type_args.push b)) b
      | arg::args => do
        let argType ← inferType arg
        forallTelescope argType fun arg_args arg_type => do
          arg_type.withApp fun arg_type_fn arg_type_args => do
            if let .some idx := motives.indexOf? arg_type_fn then
            if let .some idx := typeFormers.indexOf? arg_type_fn then
              let name ← arg.fvarId!.getUserName
              let type' ← mkForallFVars arg_args
                (← mkPProd arg_type (mkAppN belows[idx]! arg_type_args) )
@@ -294,72 +277,81 @@ fun {α} {motive} t F_1 => (
  ).1
```
-/
private def mkBRecOnFromRec (recName : Name) (ind reflexive : Bool) (nParams : Nat)
    (all : Array Name) (brecOnName : Name) : MetaM Unit := do
def mkBRecOnOrBInductionOn (indName : Name) (ind : Bool) : MetaM Unit := do
  let .inductInfo indVal ← getConstInfo indName | return
  unless indVal.isRec do return
  if ← isPropFormerType indVal.type then return
  let recName := mkRecName indName
  let .recInfo recVal ← getConstInfo recName | return
  unless recVal.numMotives = indVal.all.length do
    /-
    The mutual declaration containing `declName` contains nested inductive datatypes.
    We don't support this kind of declaration here yet. We probably never will :)
    To support it, we will need to generate an auxiliary `below` for each nested inductive
    type since their default `below` is not good here. For example, at
    ```
    inductive Term
      | var : String -> Term
      | app : String -> List Term -> Term
    ```
    The `List.below` is not useful since it will not allow us to recurse over the nested terms.
    We need to generate another one using the auxiliary recursor `Term.rec_1` for `List Term`.
    -/
    return

  let lvl::lvls := recVal.levelParams.map (Level.param ·)
    | throwError "recursor {recName} has no levelParams"
  let lvlParam := recVal.levelParams.head!
  -- universe parameter names of brecOn/binductionOn
  let blps := if ind then recVal.levelParams.tail! else recVal.levelParams
  -- universe arguments of below/ibelow
  let blvls := if ind then lvls else lvl::lvls

  let .some ⟨idx, _⟩ := indVal.all.toArray.indexOf? indName
    | throwError m!"Did not find {indName} in {indVal.all}"

  let .some ilvl ← typeFormerTypeLevel indVal.type
    | throwError "type {indVal.type} of inductive {indVal.name} not a type former?"
  -- universe level of the resultant type
  let rlvl : Level :=
    if ind then
      0
    else if indVal.isReflexive then
      if let .max 1 ilvl' := ilvl then
        mkLevelMax' (.succ lvl) ilvl'
      else
        mkLevelMax' (.succ lvl) ilvl
    else
      mkLevelMax' 1 lvl

  let refType :=
    if ind then
      recVal.type.instantiateLevelParams [lvlParam] [0]
    else if reflexive then
    else if indVal.isReflexive then
      recVal.type.instantiateLevelParams [lvlParam] [lvl.succ]
    else
      recVal.type

  let decl ← forallTelescope refType fun refArgs refBody => do
    assert! refArgs.size > nParams + recVal.numMotives + recVal.numMinors
    let params : Array Expr := refArgs[:nParams]
    let motives : Array Expr := refArgs[nParams:nParams + recVal.numMotives]
    let minors : Array Expr := refArgs[nParams + recVal.numMotives:nParams + recVal.numMotives + recVal.numMinors]
    let indices : Array Expr := refArgs[nParams + recVal.numMotives + recVal.numMinors:refArgs.size - 1]
    let major : Expr := refArgs[refArgs.size - 1]!
  let decl ← forallTelescope refType fun refArgs _ => do
    assert! refArgs.size == indVal.numParams + recVal.numMotives + recVal.numMinors + indVal.numIndices + 1
    let params : Array Expr := refArgs[:indVal.numParams]
    let typeFormers : Array Expr := refArgs[indVal.numParams:indVal.numParams + recVal.numMotives]
    let minors : Array Expr := refArgs[indVal.numParams + recVal.numMotives:indVal.numParams + recVal.numMotives + recVal.numMinors]
    let remaining : Array Expr := refArgs[indVal.numParams + recVal.numMotives + recVal.numMinors:]

    let some idx := motives.indexOf? refBody.getAppFn
      | throwError "result type of {refType} is not one of {motives}"
    -- One `below` for each type former (same parameters)
    let belows := indVal.all.toArray.map fun n =>
      let belowName := if ind then mkIBelowName n else mkBelowName n
      mkAppN (.const belowName blvls) (params ++ typeFormers)

    -- universe parameter of the type former.
    -- same as `typeFormerTypeLevel indVal.type`, but we want to infer it from the
    -- type of the recursor, to be more robust when facing nested induction
    let majorTypeType ← inferType (← inferType major)
    let .some ilvl ← typeFormerTypeLevel majorTypeType
      | throwError "type of type of major premise {major} not a type former"

    -- universe level of the resultant type
    let rlvl : Level :=
      if ind then
        0
      else if reflexive then
        if let .max 1 ilvl' := ilvl then
          mkLevelMax' (.succ lvl) ilvl'
        else
          mkLevelMax' (.succ lvl) ilvl
      else
        mkLevelMax' 1 lvl

    -- One `below` for each motive, with the same motive parameters
    let blvls := if ind then lvls else lvl::lvls
    let belows := Array.ofFn (n := motives.size) fun ⟨i,_⟩ =>
      let belowName :=
        if let some n := all[i]? then
          if ind then mkIBelowName n else mkBelowName n
        else
          if ind then .str all[0]! s!"ibelow_{i-all.size+1}"
          else .str all[0]! s!"below_{i-all.size+1}"
      mkAppN (.const belowName blvls) (params ++ motives)

    -- create types of functionals (one for each motive)
    -- create types of functionals (one for each type former)
    -- (F_1 : (t : List α) → (f : List.below t) → motive t)
    -- and bring parameters of that type into scope
    let mut fDecls : Array (Name × (Array Expr -> MetaM Expr)) := #[]
    for motive in motives, below in belows, i in [:motives.size] do
      let fType ← forallTelescope (← inferType motive) fun targs _ => do
    for typeFormer in typeFormers, below in belows, i in [:typeFormers.size] do
      let fType ← forallTelescope (← inferType typeFormer) fun targs _ => do
        withLocalDeclD `f (mkAppN below targs) fun f =>
          mkForallFVars (targs.push f) (mkAppN motive targs)
          mkForallFVars (targs.push f) (mkAppN typeFormer targs)
      let fName := .mkSimple s!"F_{i + 1}"
      fDecls := fDecls.push (fName, fun _ => pure fType)
    withLocalDeclsD fDecls fun fs => do
@@ -367,53 +359,35 @@ private def mkBRecOnFromRec (recName : Name) (ind reflexive : Bool) (nParams : N
      -- add parameters
      val := mkAppN val params
      -- add type formers
      for motive in motives, below in belows do
      for typeFormer in typeFormers, below in belows do
        -- example: (motive := fun t => PProd (motive t) (@List.below α motive t))
        let arg ← forallTelescope (← inferType motive) fun targs _ => do
          let cType := mkAppN motive targs
        let arg ← forallTelescope (← inferType typeFormer) fun targs _ => do
          let cType := mkAppN typeFormer targs
          let belowType := mkAppN below targs
          let arg ← mkPProd cType belowType
          mkLambdaFVars targs arg
        val := .app val arg
      -- add minor premises
      for minor in minors do
        let arg ← buildBRecOnMinorPremise rlvl motives belows fs (← inferType minor)
        let arg ← buildBRecOnMinorPremise rlvl typeFormers belows fs (← inferType minor)
        val := .app val arg
      -- add indices and major premise
      val := mkAppN val indices
      val := mkApp val major
      val := mkAppN val remaining
      -- project out first component
      val ← mkPProdFst val

      -- All parameters of `.rec` besides the `minors` become parameters of `.bRecOn`, and the `fs`
      let below_params := params ++ motives ++ indices ++ #[major] ++ fs
      let type ← mkForallFVars below_params (mkAppN motives[idx]! (indices ++ #[major]))
      let below_params := params ++ typeFormers ++ remaining ++ fs
      let type ← mkForallFVars below_params (mkAppN typeFormers[idx]! remaining)
      val ← mkLambdaFVars below_params val

      mkDefinitionValInferrringUnsafe brecOnName blps type val .abbrev
      let name := if ind then mkBInductionOnName indName else mkBRecOnName indName
      mkDefinitionValInferrringUnsafe name blps type val .abbrev

  addDecl (.defnDecl decl)
  setReducibleAttribute decl.name
  modifyEnv fun env => markAuxRecursor env decl.name
  modifyEnv fun env => addProtected env decl.name

def mkBRecOnOrBInductionOn (indName : Name) (ind : Bool) : MetaM Unit := do
  let .inductInfo indVal ← getConstInfo indName | return
  unless indVal.isRec do return
  if ← isPropFormerType indVal.type then return

  let recName := mkRecName indName
  let brecOnName := if ind then mkBInductionOnName indName else mkBRecOnName indName
  mkBRecOnFromRec recName ind indVal.isReflexive indVal.numParams indVal.all.toArray brecOnName

  -- If this is the first inductive in a mutual group with nested inductives,
  -- generate the constructions for the nested inductives now.
  if indVal.all[0]! = indName then
    for i in [:indVal.numNested] do
      let recName := recName.appendIndexAfter (i + 1)
      let brecOnName := brecOnName.appendIndexAfter (i + 1)
      mkBRecOnFromRec recName ind indVal.isReflexive indVal.numParams indVal.all.toArray brecOnName


def mkBRecOn (declName : Name) : MetaM Unit := mkBRecOnOrBInductionOn declName false
def mkBInductionOn (declName : Name) : MetaM Unit := mkBRecOnOrBInductionOn declName true

@@ -176,48 +176,4 @@ def litToCtor (e : Expr) : MetaM Expr := do
return mkApp3 (mkConst ``Fin.mk) n i h
return e

/--
Check if an expression is a list literal (i.e. a nested chain of `List.cons`, ending at a `List.nil`),
where each element is "recognised" by a given function `f : Expr → MetaM (Option α)`,
and return the array of recognised values.
-/
partial def getListLitOf? (e : Expr) (f : Expr → MetaM (Option α)) : MetaM (Option (Array α)) := do
let mut e ← instantiateMVars e.consumeMData
let mut r := #[]
while true do
match_expr e with
| List.nil _ => break
| List.cons _ a as => do
let some a ← f a | return none
r := r.push a
e := as
| _ => return none
return some r

/--
Check if an expression is a list literal (i.e. a nested chain of `List.cons`, ending at a `List.nil`),
returning the array of `Expr` values.
-/
def getListLit? (e : Expr) : MetaM (Option (Array Expr)) := getListLitOf? e fun s => return some s

/--
Check if an expression is an array literal
(i.e. `List.toArray` applied to a nested chain of `List.cons`, ending at a `List.nil`),
where each element is "recognised" by a given function `f : Expr → MetaM (Option α)`,
and return the array of recognised values.
-/
def getArrayLitOf? (e : Expr) (f : Expr → MetaM (Option α)) : MetaM (Option (Array α)) := do
let e ← instantiateMVars e.consumeMData
match_expr e with
| List.toArray _ as => getListLitOf? as f
| _ => return none

/--
Check if an expression is an array literal
(i.e. `List.toArray` applied to a nested chain of `List.cons`, ending at a `List.nil`),
returning the array of `Expr` values.
-/
def getArrayLit? (e : Expr) : MetaM (Option (Array Expr)) := getArrayLitOf? e fun s => return some s


end Lean.Meta

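As a usage sketch (hypothetical helper, not part of this diff): `getArrayLitOf?` composes with an element recogniser such as `Nat.fromExpr?` (which the simprocs below also use) to read a whole literal at once.

```lean
-- Hypothetical sketch: recognise `#[1, 2, 3]` as an array of `Nat` values and
-- sum them, returning `none` if `e` is not a literal of `Nat` literals.
def sumNatArrayLit? (e : Expr) : MetaM (Option Nat) := do
  let some ns ← getArrayLitOf? e Nat.fromExpr? | return none
  return some (ns.foldl (· + ·) 0)
```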
@@ -294,7 +294,7 @@ def transform
altType in altTypes do
let alt' ← forallAltTelescope' origAltType (numParams - numDiscrEqs) 0 fun ys args => do
let altType ← instantiateForall altType ys
-- The splitter inserts its extra parameters after the first ys.size parameters, before
-- The splitter inserts its extra paramters after the first ys.size parameters, before
-- the parameters for the numDiscrEqs
forallBoundedTelescope altType (splitterNumParams - ys.size) fun ys2 altType => do
forallBoundedTelescope altType numDiscrEqs fun ys3 altType => do

@@ -164,6 +164,8 @@ partial def mkSizeOfFn (recName : Name) (declName : Name): MetaM Unit := do
-/
def mkSizeOfFns (typeName : Name) : MetaM (Array Name × NameMap Name) := do
let indInfo ← getConstInfoInduct typeName
let recInfo ← getConstInfoRec (mkRecName typeName)
let numExtra := recInfo.numMotives - indInfo.all.length -- numExtra > 0 for nested inductive types
let mut result := #[]
let baseName := indInfo.all.head! ++ `_sizeOf -- we use the first inductive type as the base name for `sizeOf` functions
let mut i := 1
@@ -175,7 +177,7 @@ def mkSizeOfFns (typeName : Name) : MetaM (Array Name × NameMap Name) := do
recMap := recMap.insert recName sizeOfName
result := result.push sizeOfName
i := i + 1
for j in [:indInfo.numNested] do
for j in [:numExtra] do
let recName := (mkRecName indInfo.all.head!).appendIndexAfter (j+1)
let sizeOfName := baseName.appendIndexAfter i
mkSizeOfFn recName sizeOfName

@@ -719,7 +719,7 @@ to the continuation
recursion and extra parameters passed to the recursor)
* the position of the motive/induction hypothesis in the body's arguments
* the body, as passed to the recursor. Expected to be a lambda that takes the
varying parameters and the motive
varying paramters and the motive
* a function to re-assemble the call with a new Motive. The resulting expression expects
the new body next, so that the expected type of the body can be inferred
* a function to finish assembling the call with the new body.
@@ -744,8 +744,8 @@ def findRecursor {α} (name : Name) (varNames : Array Name) (e : Expr)
-- Bail out on mutual or nested inductives
let .str indName _ := f.constName! | unreachable!
let indInfo ← getConstInfoInduct indName
if indInfo.numTypeFormers > 1 then
throwError "functional induction: cannot handle mutual or nested inductives"
if indInfo.all.length > 1 then
throwError "functional induction: cannot handle mutual inductives"

let elimInfo ← getElimExprInfo f
let targets : Array Expr := elimInfo.targetsPos.map (args[·]!)

@@ -13,4 +13,3 @@ import Lean.Meta.Tactic.Simp.BuiltinSimprocs.Char
import Lean.Meta.Tactic.Simp.BuiltinSimprocs.String
import Lean.Meta.Tactic.Simp.BuiltinSimprocs.BitVec
import Lean.Meta.Tactic.Simp.BuiltinSimprocs.List
import Lean.Meta.Tactic.Simp.BuiltinSimprocs.Array

@@ -1,36 +0,0 @@
/-
Copyright (c) 2024 Lean FRO. All Rights Reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Kim Morrison
-/
prelude
import Lean.Meta.LitValues
import Lean.Meta.Tactic.Simp.BuiltinSimprocs.Nat

namespace Array
open Lean Meta Simp

/-- Simplification procedure for `#[...][n]` for `n` a `Nat` literal. -/
builtin_dsimproc [simp, seval] reduceGetElem (@GetElem.getElem (Array _) Nat _ _ _ _ _ _) := fun e => do
let_expr GetElem.getElem _ _ _ _ _ xs n _ ← e | return .continue
let some n ← Nat.fromExpr? n | return .continue
let some xs ← getArrayLit? xs | return .continue
return .done <| xs[n]!

/-- Simplification procedure for `#[...][n]?` for `n` a `Nat` literal. -/
builtin_dsimproc [simp, seval] reduceGetElem? (@GetElem?.getElem? (Array _) Nat _ _ _ _ _) := fun e => do
let_expr GetElem?.getElem? _ _ α _ _ xs n ← e | return .continue
let some n ← Nat.fromExpr? n | return .continue
let some xs ← getArrayLit? xs | return .continue
let r ← if h : n < xs.size then mkSome α xs[n] else mkNone α
return .done r

/-- Simplification procedure for `#[...][n]!` for `n` a `Nat` literal. -/
builtin_dsimproc [simp, seval] reduceGetElem! (@GetElem?.getElem! (Array _) Nat _ _ _ _ _ _) := fun e => do
let_expr GetElem?.getElem! _ _ α _ _ I xs n ← e | return .continue
let some n ← Nat.fromExpr? n | return .continue
let some xs ← getArrayLit? xs | return .continue
let r ← if h : n < xs.size then pure xs[n] else mkDefault α
return .done r

end Array
@@ -142,8 +142,8 @@ def findModuleOf? [Monad m] [MonadEnv m] [MonadError m] (declName : Name) : m (O

def isEnumType [Monad m] [MonadEnv m] [MonadError m] (declName : Name) : m Bool := do
if let ConstantInfo.inductInfo info ← getConstInfo declName then
if !info.type.isProp && info.numTypeFormers == 1 && info.numIndices == 0 && info.numParams == 0
&& !info.ctors.isEmpty && !info.isRec && !info.isUnsafe then
if !info.type.isProp && info.all.length == 1 && info.numIndices == 0 && info.numParams == 0
&& !info.ctors.isEmpty && !info.isRec && !info.isNested && !info.isUnsafe then
info.ctors.allM fun ctorName => do
let ConstantInfo.ctorInfo info ← getConstInfo ctorName | return false
return info.numFields == 0

@@ -701,7 +701,7 @@ list, so it should be brief.
@[builtin_command_parser] def genInjectiveTheorems := leading_parser
"gen_injective_theorems% " >> ident

/-- No-op parser used as syntax kind for attaching remaining whitespace at the end of the input. -/
/-- No-op parser used as syntax kind for attaching remaining whitespace to at the end of the input. -/
@[run_builtin_parser_attribute_hooks] def eoi : Parser := leading_parser ""

builtin_initialize

@@ -174,11 +174,9 @@ do not yield the right result.
-/
@[builtin_term_parser] def typeAscription := leading_parser
"(" >> (withoutPosition (withoutForbidden (termParser >> " :" >> optional (ppSpace >> termParser)))) >> ")"

/-- Tuple notation; `()` is short for `Unit.unit`, `(a, b, c)` for `Prod.mk a (Prod.mk b c)`, etc. -/
@[builtin_term_parser] def tuple := leading_parser
"(" >> optional (withoutPosition (withoutForbidden (termParser >> ", " >> sepBy1 termParser ", " (allowTrailingSep := true)))) >> ")"

/--
Parentheses, used for grouping expressions (e.g., `a * (b + c)`).
Can also be used for creating simple functions when combined with `·`. Here are some examples:

@@ -128,12 +128,10 @@ def ofLazyM (f : MetaM MessageData) (es : Array Expr := #[]) : MessageData :=
instantiateMVarsCore mvarctxt a |>.1.hasSyntheticSorry
))

/--
Pretty print a const expression using `delabConst` and generate terminfo.
/-- Pretty print a const expression using `delabConst` and generate terminfo.
This function avoids inserting `@` if the constant is for a function whose first
argument is implicit, which is what the default `toMessageData` for `Expr` does.
Panics if `e` is not a constant.
-/
Panics if `e` is not a constant. -/
def ofConst (e : Expr) : MessageData :=
if e.isConst then
let delab : Delab := withOptionAtCurrPos `pp.tagAppFns true delabConst
@@ -141,19 +139,6 @@ def ofConst (e : Expr) : MessageData :=
else
panic! "not a constant"

/--
Pretty print a constant given its name, similar to `Lean.MessageData.ofConst`.
Uses the constant's universe level parameters when pretty printing.
If there is no such constant in the environment, the name is simply formatted.
-/
def ofConstName (constName : Name) : MessageData :=
.ofFormatWithInfosM do
if let some info := (← getEnv).find? constName then
let delab : Delab := withOptionAtCurrPos `pp.tagAppFns true delabConst
PrettyPrinter.ppExprWithInfos (delab := delab) (.const constName <| info.levelParams.map mkLevelParam)
else
return format constName

/-- Generates `MessageData` for a declaration `c` as `c.{<levels>} <params> : <type>`, with terminfo. -/
def signature (c : Name) : MessageData :=
.ofFormatWithInfosM (PrettyPrinter.ppSignature c)

@@ -199,10 +199,6 @@ def unexpandStructureInstance (stx : Syntax) : Delab := whenPPOption getPPStruct
let mut fields := #[]
guard $ fieldNames.size == stx[1].getNumArgs
if hasPPUsingAnonymousConstructorAttribute env s.induct then
/- Note that we don't flatten anonymous constructor notation. Only a complete such notation receives TermInfo,
and flattening would cause the flattened-in notation to lose its TermInfo.
Potentially it would be justified to flatten anonymous constructor notation when the terms are
from the same type family (think `Sigma`), but for now users can write a custom delaborator in such instances. -/
return ← withTypeAscription (cond := (← withType <| getPPOption getPPStructureInstanceType)) do
`(⟨$[$(stx[1].getArgs)],*⟩)
let args := e.getAppArgs
@@ -642,7 +638,7 @@ List.map.match_1 : {α : Type _} →
```
-/
@[builtin_delab app]
partial def delabAppMatch : Delab := whenNotPPOption getPPExplicit <| whenPPOption getPPNotation <| whenPPOption getPPMatch do
partial def delabAppMatch : Delab := whenPPOption getPPNotation <| whenPPOption getPPMatch do
-- Check that this is a matcher, and then set up overapplication.
let Expr.const c us := (← getExpr).getAppFn | failure
let some info ← getMatcherInfo? c | failure
@@ -773,6 +769,16 @@ def delabMData : Delab := do
else
withMDataOptions delab

/--
Check for a `Syntax.ident` of the given name anywhere in the tree.
This is usually a bad idea since it does not check for shadowing bindings,
but in the delaborator we assume that bindings are never shadowed.
-/
partial def hasIdent (id : Name) : Syntax → Bool
| Syntax.ident _ _ id' _ => id == id'
| Syntax.node _ _ args => args.any (hasIdent id)
| _ => false

/--
Return `true` iff current binder should be merged with the nested
binder, if any, into a single binder group:
@@ -818,7 +824,7 @@ def delabLam : Delab :=
let e ← getExpr
let stxT ← withBindingDomain delab
let ppTypes ← getPPOption getPPFunBinderTypes
let usedDownstream := curNames.any (fun n => stxBody.hasIdent n.getId)
let usedDownstream := curNames.any (fun n => hasIdent n.getId stxBody)

-- leave lambda implicit if possible
-- TODO: for now we just always block implicit lambdas when delaborating. We can revisit.
@@ -1129,24 +1135,6 @@ def delabSigma : Delab := delabSigmaCore (sigma := true)
@[builtin_delab app.PSigma]
def delabPSigma : Delab := delabSigmaCore (sigma := false)

-- PProd and MProd value delaborator
-- (like pp_using_anonymous_constructor but flattening nested tuples)

def delabPProdMkCore (mkName : Name) : Delab := whenNotPPOption getPPExplicit <| whenPPOption getPPNotation do
guard <| (← getExpr).getAppNumArgs == 4
let a ← withAppFn <| withAppArg delab
let b ← withAppArg <| delab
if (← getExpr).appArg!.isAppOfArity mkName 4 then
if let `(⟨$xs,*⟩) := b then
return ← `(⟨$a, $xs,*⟩)
`(⟨$a, $b⟩)

@[builtin_delab app.PProd.mk]
def delabPProdMk : Delab := delabPProdMkCore ``PProd.mk

@[builtin_delab app.MProd.mk]
def delabMProdMk : Delab := delabPProdMkCore ``MProd.mk

partial def delabDoElems : DelabM (List Syntax) := do
let e ← getExpr
if e.isAppOfArity ``Bind.bind 6 then

@@ -164,16 +164,6 @@ def asNode : Syntax → SyntaxNode
def getIdAt (stx : Syntax) (i : Nat) : Name :=
(stx.getArg i).getId

/--
Check for a `Syntax.ident` of the given name anywhere in the tree.
This is usually a bad idea since it does not check for shadowing bindings,
but in the delaborator we assume that bindings are never shadowed.
-/
partial def hasIdent (id : Name) : Syntax → Bool
| ident _ _ id' _ => id == id'
| node _ _ args => args.any (hasIdent id)
| _ => false

@[inline] def modifyArgs (stx : Syntax) (fn : Array Syntax → Array Syntax) : Syntax :=
match stx with
| node i k args => node i k (fn args)

@@ -31,4 +31,3 @@ import Lean.Util.FileSetupInfo
import Lean.Util.Heartbeats
import Lean.Util.SearchPath
import Lean.Util.SafeExponentiation
import Lean.Util.NumObjs

@@ -1,47 +0,0 @@
/-
Copyright (c) 2024 Amazon.com, Inc. or its affiliates. All Rights Reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Leonardo de Moura
-/
prelude
import Lean.Expr
import Lean.Util.PtrSet

namespace Lean.Expr
namespace NumObjs

unsafe structure State where
visited : PtrSet Expr := mkPtrSet
counter : Nat := 0

unsafe abbrev M := StateM State

unsafe def visit (e : Expr) : M Unit :=
unless (← get).visited.contains e do
modify fun { visited, counter } => { visited := visited.insert e, counter := counter + 1 }
match e with
| .forallE _ d b _ => visit d; visit b
| .lam _ d b _ => visit d; visit b
| .mdata _ b => visit b
| .letE _ t v b _ => visit t; visit v; visit b
| .app f a => visit f; visit a
| .proj _ _ b => visit b
| _ => return ()

unsafe def main (e : Expr) : Nat :=
let (_, s) := NumObjs.visit e |>.run {}
s.counter

end NumObjs

/--
Returns the number of allocated `Expr` objects in the given expression `e`.

This operation is performed in `IO` because the result depends on the memory representation of the object.

Note: Use this function primarily for diagnosing performance issues.
-/
def numObjs (e : Expr) : IO Nat :=
return unsafe NumObjs.main e

end Lean.Expr
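A usage sketch (hypothetical, not part of this diff): because `numObjs` counts distinct allocated objects rather than tree nodes, it is useful for spotting loss of sharing, e.g. by comparing counts before and after a transformation.

```lean
-- Hypothetical diagnostic helper: report how many `Expr` objects an
-- expression uses before and after instantiating metavariables.
def reportSharing (e : Expr) : MetaM Unit := do
  let before ← e.numObjs
  let after ← (← instantiateMVars e).numObjs
  trace[Meta.debug] "Expr objects: {before} before, {after} after instantiation"
```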
@@ -5,6 +5,7 @@ Released under Apache 2.0 license as described in the file LICENSE.
Authors: Wojciech Nawrocki
-/
prelude
import Lean.PrettyPrinter
import Lean.Server.Rpc.Basic
import Lean.Server.InfoUtils
import Lean.Widget.TaggedText

@@ -115,9 +115,9 @@ instance [BEq α] [Hashable α] {m : DHashMap α β} {a : α} : Decidable (a ∈
(a : α) (fallback : β a) : β a :=
Raw₀.getD ⟨m.1, m.2.size_buckets_pos⟩ a fallback

@[inline, inherit_doc Raw.erase] def erase [BEq α] [Hashable α] (m : DHashMap α β) (a : α) :
@[inline, inherit_doc Raw.remove] def remove [BEq α] [Hashable α] (m : DHashMap α β) (a : α) :
DHashMap α β :=
⟨Raw₀.erase ⟨m.1, m.2.size_buckets_pos⟩ a, .erase₀ m.2⟩
⟨Raw₀.remove ⟨m.1, m.2.size_buckets_pos⟩ a, .remove₀ m.2⟩

section


@@ -129,9 +129,9 @@ def replace [BEq α] (a : α) (b : β a) : AssocList α β → AssocList α β
| cons k v l => bif a == k then cons a b l else cons k v (replace a b l)

/-- Internal implementation detail of the hash map -/
def erase [BEq α] (a : α) : AssocList α β → AssocList α β
def remove [BEq α] (a : α) : AssocList α β → AssocList α β
| nil => nil
| cons k v l => bif a == k then l else cons k v (l.erase a)
| cons k v l => bif a == k then l else cons k v (l.remove a)

/-- Internal implementation detail of the hash map -/
@[inline] def filterMap (f : (a : α) → β a → Option (γ a)) :

@@ -119,11 +119,11 @@ theorem toList_replace [BEq α] {l : AssocList α β} {a : α} {b : β a} :
· next k v t ih => cases h : a == k <;> simp_all [replace, List.replaceEntry_cons]

@[simp]
theorem toList_erase [BEq α] {l : AssocList α β} {a : α} :
(l.erase a).toList = eraseKey a l.toList := by
theorem toList_remove [BEq α] {l : AssocList α β} {a : α} :
(l.remove a).toList = removeKey a l.toList := by
induction l
· simp [erase]
· next k v t ih => cases h : a == k <;> simp_all [erase, List.eraseKey_cons]
· simp [remove]
· next k v t ih => cases h : a == k <;> simp_all [remove, List.removeKey_cons]

theorem toList_filterMap {f : (a : α) → β a → Option (γ a)} {l : AssocList α β} :
Perm (l.filterMap f).toList (l.toList.filterMap fun p => (f p.1 p.2).map (⟨p.1, ·⟩)) := by

@@ -79,7 +79,7 @@ maintainable. To this end, we provide theorems `apply_bucket`, `apply_bucket_wit
`toListModel_updateBucket` and `toListModel_updateAllBuckets`, which do all of the heavy lifting in
a general way. The verification for each actual operation in `Internal.WF` is then extremely
straightforward, requiring only to plug in some results about lists. See for example the functions
`containsₘ_eq_containsKey` and the section on `eraseₘ` for prototypical examples of this technique.
`containsₘ_eq_containsKey` and the section on `removeₘ` for prototypical examples of this technique.

Here is a summary of the steps required to add and verify a new operation:
1. Write the executable implementation
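To illustrate the list model that the verification reduces to (hypothetical example, not part of this diff): `eraseKey` — renamed to `removeKey` in this change — drops only the first entry whose key is `==`-equal to the given key. With the constant family `β := fun _ => String`:

```lean
-- Sketch: only the first matching entry is removed; later duplicates survive.
-- Assumes the definitions from this file are in scope.
example :
    eraseKey 2 [⟨1, "a"⟩, ⟨2, "b"⟩, ⟨2, "c"⟩] = [⟨1, "a"⟩, ⟨2, "c"⟩] := by
  rfl
```

Since well-formed buckets have distinct keys (`DistinctKeys`), the remove-first behavior coincides with remove-all on the lists that actually arise.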
@@ -197,7 +197,7 @@ where
if h : i < source.size then
let idx : Fin source.size := ⟨i, h⟩
let es := source.get idx
-- We erase `es` from `source` to make sure we can reuse its memory cells
-- We remove `es` from `source` to make sure we can reuse its memory cells
-- when performing es.foldl
let source := source.set idx .nil
let target := es.foldl (reinsertAux hash) target
@@ -313,13 +313,13 @@ where
buckets[idx.1].getCast! a

/-- Internal implementation detail of the hash map -/
@[inline] def erase [BEq α] [Hashable α] (m : Raw₀ α β) (a : α) : Raw₀ α β :=
@[inline] def remove [BEq α] [Hashable α] (m : Raw₀ α β) (a : α) : Raw₀ α β :=
let ⟨⟨size, buckets⟩, hb⟩ := m
let ⟨i, h⟩ := mkIdx buckets.size hb (hash a)
let bkt := buckets[i]
if bkt.contains a then
let buckets' := buckets.uset i .nil h
⟨⟨size - 1, buckets'.uset i (bkt.erase a) (by simpa [buckets'])⟩, by simpa [buckets']⟩
⟨⟨size - 1, buckets'.uset i (bkt.remove a) (by simpa [buckets'])⟩, by simpa [buckets']⟩
else
⟨⟨size, buckets⟩, hb⟩


@@ -641,51 +641,51 @@ theorem containsKey_replaceEntry [BEq α] [PartialEquivBEq α] {l : List ((a :
|
||||
exact containsKey_of_beq h.1 (BEq.symm h.2)
|
||||
|
||||
/-- Internal implementation detail of the hash map -/
|
||||
def eraseKey [BEq α] (k : α) : List ((a : α) × β a) → List ((a : α) × β a)
|
||||
def removeKey [BEq α] (k : α) : List ((a : α) × β a) → List ((a : α) × β a)
|
||||
| [] => []
|
||||
| ⟨k', v'⟩ :: l => bif k == k' then l else ⟨k', v'⟩ :: eraseKey k l
|
||||
| ⟨k', v'⟩ :: l => bif k == k' then l else ⟨k', v'⟩ :: removeKey k l
|
||||
|
||||
@[simp] theorem eraseKey_nil [BEq α] {k : α} : eraseKey k ([] : List ((a : α) × β a)) = [] := rfl
|
||||
theorem eraseKey_cons [BEq α] {l : List ((a : α) × β a)} {k k' : α} {v' : β k'} :
|
||||
eraseKey k (⟨k', v'⟩ :: l) = bif k == k' then l else ⟨k', v'⟩ :: eraseKey k l := rfl
|
||||
@[simp] theorem removeKey_nil [BEq α] {k : α} : removeKey k ([] : List ((a : α) × β a)) = [] := rfl
|
||||
theorem removeKey_cons [BEq α] {l : List ((a : α) × β a)} {k k' : α} {v' : β k'} :
|
||||
removeKey k (⟨k', v'⟩ :: l) = bif k == k' then l else ⟨k', v'⟩ :: removeKey k l := rfl
|
||||
|
||||
theorem eraseKey_cons_of_beq [BEq α] {l : List ((a : α) × β a)} {k k' : α} {v' : β k'}
|
||||
(h : k == k') : eraseKey k (⟨k', v'⟩ :: l) = l :=
|
||||
by simp [eraseKey_cons, h]
|
||||
theorem removeKey_cons_of_beq [BEq α] {l : List ((a : α) × β a)} {k k' : α} {v' : β k'}
|
||||
(h : k == k') : removeKey k (⟨k', v'⟩ :: l) = l :=
|
||||
by simp [removeKey_cons, h]
|
||||
|
||||
@[simp]
|
||||
theorem eraseKey_cons_self [BEq α] [ReflBEq α] {l : List ((a : α) × β a)} {k : α} {v : β k} :
|
||||
eraseKey k (⟨k, v⟩ :: l) = l :=
|
||||
eraseKey_cons_of_beq BEq.refl
|
||||
theorem removeKey_cons_self [BEq α] [ReflBEq α] {l : List ((a : α) × β a)} {k : α} {v : β k} :
|
||||
removeKey k (⟨k, v⟩ :: l) = l :=
|
||||
removeKey_cons_of_beq BEq.refl
|
||||
|
||||
theorem eraseKey_cons_of_false [BEq α] {l : List ((a : α) × β a)} {k k' : α} {v' : β k'}
|
||||
(h : (k == k') = false) : eraseKey k (⟨k', v'⟩ :: l) = ⟨k', v'⟩ :: eraseKey k l := by
|
||||
simp [eraseKey_cons, h]
|
||||
theorem removeKey_cons_of_false [BEq α] {l : List ((a : α) × β a)} {k k' : α} {v' : β k'}
|
||||
(h : (k == k') = false) : removeKey k (⟨k', v'⟩ :: l) = ⟨k', v'⟩ :: removeKey k l := by
|
||||
simp [removeKey_cons, h]
|
||||
|
||||
theorem eraseKey_of_containsKey_eq_false [BEq α] {l : List ((a : α) × β a)} {k : α}
|
||||
(h : containsKey k l = false) : eraseKey k l = l := by
|
||||
theorem removeKey_of_containsKey_eq_false [BEq α] {l : List ((a : α) × β a)} {k : α}
|
||||
(h : containsKey k l = false) : removeKey k l = l := by
|
||||
induction l using assoc_induction
|
||||
· simp
|
||||
· next k' v' t ih =>
|
||||
simp only [containsKey_cons, Bool.or_eq_false_iff] at h
|
||||
rw [eraseKey_cons_of_false h.1, ih h.2]
|
||||
rw [removeKey_cons_of_false h.1, ih h.2]
|
||||
|
||||
theorem sublist_eraseKey [BEq α] {l : List ((a : α) × β a)} {k : α} :
|
||||
Sublist (eraseKey k l) l := by
|
||||
theorem sublist_removeKey [BEq α] {l : List ((a : α) × β a)} {k : α} :
|
||||
Sublist (removeKey k l) l := by
|
||||
induction l using assoc_induction
|
||||
· simp
|
||||
· next k' v' t ih =>
|
||||
rw [eraseKey_cons]
|
||||
rw [removeKey_cons]
|
||||
cases k == k'
|
||||
· simpa
|
||||
· simpa using Sublist.cons_right Sublist.refl
|
||||
|
||||
theorem length_eraseKey [BEq α] {l : List ((a : α) × β a)} {k : α} :
|
||||
(eraseKey k l).length = bif containsKey k l then l.length - 1 else l.length := by
|
||||
theorem length_removeKey [BEq α] {l : List ((a : α) × β a)} {k : α} :
|
||||
(removeKey k l).length = bif containsKey k l then l.length - 1 else l.length := by
|
||||
induction l using assoc_induction
|
||||
· simp
|
||||
· next k' v' t ih =>
|
||||
rw [eraseKey_cons, containsKey_cons]
|
||||
rw [removeKey_cons, containsKey_cons]
|
||||
cases k == k'
|
||||
· rw [cond_false, Bool.false_or, List.length_cons, ih]
|
||||
cases h : containsKey k t
|
||||
@@ -697,15 +697,15 @@ theorem length_eraseKey [BEq α] {l : List ((a : α) × β a)} {k : α} :
|
||||
· simp
|
||||
· simp
|
||||
|
||||
theorem length_eraseKey_le [BEq α] {l : List ((a : α) × β a)} {k : α} :
|
||||
(eraseKey k l).length ≤ l.length :=
|
||||
sublist_eraseKey.length_le
|
||||
theorem length_removeKey_le [BEq α] {l : List ((a : α) × β a)} {k : α} :
|
||||
(removeKey k l).length ≤ l.length :=
|
||||
sublist_removeKey.length_le
|
||||
|
||||
theorem isEmpty_eraseKey [BEq α] {l : List ((a : α) × β a)} {k : α} :
|
||||
(eraseKey k l).isEmpty = (l.isEmpty || (l.length == 1 && containsKey k l)) := by
|
||||
theorem isEmpty_removeKey [BEq α] {l : List ((a : α) × β a)} {k : α} :
|
||||
(removeKey k l).isEmpty = (l.isEmpty || (l.length == 1 && containsKey k l)) := by
|
||||
rw [Bool.eq_iff_iff]
|
||||
simp only [Bool.or_eq_true, Bool.and_eq_true, beq_iff_eq]
|
||||
rw [List.isEmpty_iff_length_eq_zero, length_eraseKey, List.isEmpty_iff_length_eq_zero]
|
||||
rw [List.isEmpty_iff_length_eq_zero, length_removeKey, List.isEmpty_iff_length_eq_zero]
|
||||
cases containsKey k l <;> cases l <;> simp
|
||||
|
||||
@[simp] theorem keys_nil : keys ([] : List ((a : α) × β a)) = [] := rfl
|
||||
@@ -1124,55 +1124,55 @@ theorem length_le_length_insertEntryIfNew [BEq α] {l : List ((a : α) × β a)}
|
||||
· simp
|
||||
|
||||
@[simp]
|
||||
theorem keys_eraseKey [BEq α] [PartialEquivBEq α] {l : List ((a : α) × β a)} {k : α} :
|
||||
keys (eraseKey k l) = (keys l).erase k := by
|
||||
theorem keys_removeKey [BEq α] [PartialEquivBEq α] {l : List ((a : α) × β a)} {k : α} :
|
||||
keys (removeKey k l) = (keys l).erase k := by
|
||||
induction l using assoc_induction
|
||||
· rfl
|
||||
· next k' v' l ih =>
|
||||
simp only [eraseKey_cons, keys_cons, List.erase_cons]
|
||||
simp only [removeKey_cons, keys_cons, List.erase_cons]
|
||||
rw [BEq.comm]
|
||||
cases k' == k <;> simp [ih]
|
||||
|
||||
theorem DistinctKeys.eraseKey [BEq α] [PartialEquivBEq α] {l : List ((a : α) × β a)} {k : α} :
|
||||
DistinctKeys l → DistinctKeys (eraseKey k l) := by
|
||||
theorem DistinctKeys.removeKey [BEq α] [PartialEquivBEq α] {l : List ((a : α) × β a)} {k : α} :
|
||||
DistinctKeys l → DistinctKeys (removeKey k l) := by
|
||||
apply distinctKeys_of_sublist_keys (by simpa using erase_sublist _ _)
|
||||
|
||||
theorem getEntry?_eraseKey_self [BEq α] [PartialEquivBEq α] {l : List ((a : α) × β a)} {k : α}
|
||||
(h : DistinctKeys l) : getEntry? k (eraseKey k l) = none := by
|
||||
theorem getEntry?_removeKey_self [BEq α] [PartialEquivBEq α] {l : List ((a : α) × β a)} {k : α}
|
||||
(h : DistinctKeys l) : getEntry? k (removeKey k l) = none := by
|
||||
induction l using assoc_induction
|
||||
· simp
|
||||
· next k' v' t ih =>
|
||||
cases h' : k == k'
|
||||
· rw [eraseKey_cons_of_false h', getEntry?_cons_of_false h']
|
||||
· rw [removeKey_cons_of_false h', getEntry?_cons_of_false h']
|
||||
exact ih h.tail
|
||||
· rw [eraseKey_cons_of_beq h', ← Option.not_isSome_iff_eq_none, Bool.not_eq_true,
|
||||
· rw [removeKey_cons_of_beq h', ← Option.not_isSome_iff_eq_none, Bool.not_eq_true,
|
||||
← containsKey_eq_isSome_getEntry?, ← containsKey_congr (BEq.symm h')]
|
||||
exact h.containsKey_eq_false
|
||||
|
||||
theorem getEntry?_eraseKey_of_beq [BEq α] [PartialEquivBEq α] {l : List ((a : α) × β a)} {k a : α}
|
||||
(hl : DistinctKeys l) (hka : a == k) : getEntry? a (eraseKey k l) = none := by
|
||||
rw [← getEntry?_congr (BEq.symm hka), getEntry?_eraseKey_self hl]
|
||||
theorem getEntry?_removeKey_of_beq [BEq α] [PartialEquivBEq α] {l : List ((a : α) × β a)} {k a : α}
|
||||
(hl : DistinctKeys l) (hka : a == k) : getEntry? a (removeKey k l) = none := by
|
||||
rw [← getEntry?_congr (BEq.symm hka), getEntry?_removeKey_self hl]
|
||||
|
||||
theorem getEntry?_eraseKey_of_false [BEq α] [PartialEquivBEq α] {l : List ((a : α) × β a)}
|
||||
{k a : α} (hka : (a == k) = false) : getEntry? a (eraseKey k l) = getEntry? a l := by
|
||||
theorem getEntry?_removeKey_of_false [BEq α] [PartialEquivBEq α] {l : List ((a : α) × β a)}
|
||||
{k a : α} (hka : (a == k) = false) : getEntry? a (removeKey k l) = getEntry? a l := by
|
||||
induction l using assoc_induction
|
||||
· simp
|
||||
· next k' v' t ih =>
|
||||
cases h' : k == k'
|
||||
· rw [eraseKey_cons_of_false h']
|
||||
· rw [removeKey_cons_of_false h']
|
||||
cases h'' : a == k'
|
||||
· rw [getEntry?_cons_of_false h'', ih, getEntry?_cons_of_false h'']
|
||||
· rw [getEntry?_cons_of_true h'', getEntry?_cons_of_true h'']
|
||||
· rw [eraseKey_cons_of_beq h']
|
||||
· rw [removeKey_cons_of_beq h']
|
||||
have hx : (a == k') = false := BEq.neq_of_neq_of_beq hka h'
|
||||
rw [getEntry?_cons_of_false hx]
|
||||
|
||||
theorem getEntry?_eraseKey [BEq α] [PartialEquivBEq α] {l : List ((a : α) × β a)} {k a : α}
|
||||
theorem getEntry?_removeKey [BEq α] [PartialEquivBEq α] {l : List ((a : α) × β a)} {k a : α}
|
||||
(hl : DistinctKeys l) :
|
||||
getEntry? a (eraseKey k l) = bif a == k then none else getEntry? a l := by
|
||||
getEntry? a (removeKey k l) = bif a == k then none else getEntry? a l := by
|
||||
cases h : a == k
|
||||
· simp [getEntry?_eraseKey_of_false h, h]
|
||||
· simp [getEntry?_eraseKey_of_beq hl h, h]
|
||||
· simp [getEntry?_removeKey_of_false h, h]
|
||||
· simp [getEntry?_removeKey_of_beq hl h, h]
|
||||
|
||||
theorem keys_filterMap [BEq α] {l : List ((a : α) × β a)} {f : (a : α) → β a → Option (γ a)} :
|
||||
keys (l.filterMap fun p => (f p.1 p.2).map (⟨p.1, ·⟩)) =
|
||||
@@ -1208,110 +1208,110 @@ section

 variable {β : Type v}

-theorem getValue?_eraseKey_self [BEq α] [PartialEquivBEq α] {l : List ((_ : α) × β)} {k : α}
-    (h : DistinctKeys l) : getValue? k (eraseKey k l) = none := by
-  simp [getValue?_eq_getEntry?, getEntry?_eraseKey_self h]
+theorem getValue?_removeKey_self [BEq α] [PartialEquivBEq α] {l : List ((_ : α) × β)} {k : α}
+    (h : DistinctKeys l) : getValue? k (removeKey k l) = none := by
+  simp [getValue?_eq_getEntry?, getEntry?_removeKey_self h]

-theorem getValue?_eraseKey_of_beq [BEq α] [PartialEquivBEq α] {l : List ((_ : α) × β)} {k a : α}
-    (hl : DistinctKeys l) (hka : a == k) : getValue? a (eraseKey k l) = none := by
-  simp [getValue?_eq_getEntry?, getEntry?_eraseKey_of_beq hl hka]
+theorem getValue?_removeKey_of_beq [BEq α] [PartialEquivBEq α] {l : List ((_ : α) × β)} {k a : α}
+    (hl : DistinctKeys l) (hka : a == k) : getValue? a (removeKey k l) = none := by
+  simp [getValue?_eq_getEntry?, getEntry?_removeKey_of_beq hl hka]

-theorem getValue?_eraseKey_of_false [BEq α] [PartialEquivBEq α] {l : List ((_ : α) × β)} {k a : α}
-    (hka : (a == k) = false) : getValue? a (eraseKey k l) = getValue? a l := by
-  simp [getValue?_eq_getEntry?, getEntry?_eraseKey_of_false hka]
+theorem getValue?_removeKey_of_false [BEq α] [PartialEquivBEq α] {l : List ((_ : α) × β)} {k a : α}
+    (hka : (a == k) = false) : getValue? a (removeKey k l) = getValue? a l := by
+  simp [getValue?_eq_getEntry?, getEntry?_removeKey_of_false hka]

-theorem getValue?_eraseKey [BEq α] [PartialEquivBEq α] {l : List ((_ : α) × β)} {k a : α}
+theorem getValue?_removeKey [BEq α] [PartialEquivBEq α] {l : List ((_ : α) × β)} {k a : α}
     (hl : DistinctKeys l) :
-    getValue? a (eraseKey k l) = bif a == k then none else getValue? a l := by
-  simp [getValue?_eq_getEntry?, getEntry?_eraseKey hl, Bool.apply_cond (Option.map _)]
+    getValue? a (removeKey k l) = bif a == k then none else getValue? a l := by
+  simp [getValue?_eq_getEntry?, getEntry?_removeKey hl, Bool.apply_cond (Option.map _)]

 end

-theorem containsKey_eraseKey_self [BEq α] [PartialEquivBEq α] {l : List ((a : α) × β a)} {k : α}
-    (h : DistinctKeys l) : containsKey k (eraseKey k l) = false := by
-  simp [containsKey_eq_isSome_getEntry?, getEntry?_eraseKey_self h]
+theorem containsKey_removeKey_self [BEq α] [PartialEquivBEq α] {l : List ((a : α) × β a)} {k : α}
+    (h : DistinctKeys l) : containsKey k (removeKey k l) = false := by
+  simp [containsKey_eq_isSome_getEntry?, getEntry?_removeKey_self h]

-theorem containsKey_eraseKey_of_beq [BEq α] [PartialEquivBEq α] {l : List ((a : α) × β a)}
-    {k a : α} (hl : DistinctKeys l) (hka : a == k) : containsKey a (eraseKey k l) = false := by
-  rw [containsKey_congr hka, containsKey_eraseKey_self hl]
+theorem containsKey_removeKey_of_beq [BEq α] [PartialEquivBEq α] {l : List ((a : α) × β a)}
+    {k a : α} (hl : DistinctKeys l) (hka : a == k) : containsKey a (removeKey k l) = false := by
+  rw [containsKey_congr hka, containsKey_removeKey_self hl]

-theorem containsKey_eraseKey_of_false [BEq α] [PartialEquivBEq α] {l : List ((a : α) × β a)}
-    {k a : α} (hka : (a == k) = false) : containsKey a (eraseKey k l) = containsKey a l := by
-  simp [containsKey_eq_isSome_getEntry?, getEntry?_eraseKey_of_false hka]
+theorem containsKey_removeKey_of_false [BEq α] [PartialEquivBEq α] {l : List ((a : α) × β a)}
+    {k a : α} (hka : (a == k) = false) : containsKey a (removeKey k l) = containsKey a l := by
+  simp [containsKey_eq_isSome_getEntry?, getEntry?_removeKey_of_false hka]

-theorem containsKey_eraseKey [BEq α] [PartialEquivBEq α] {l : List ((a : α) × β a)} {k a : α}
-    (hl : DistinctKeys l) : containsKey a (eraseKey k l) = (!(a == k) && containsKey a l) := by
-  simp [containsKey_eq_isSome_getEntry?, getEntry?_eraseKey hl, Bool.apply_cond]
+theorem containsKey_removeKey [BEq α] [PartialEquivBEq α] {l : List ((a : α) × β a)} {k a : α}
+    (hl : DistinctKeys l) : containsKey a (removeKey k l) = (!(a == k) && containsKey a l) := by
+  simp [containsKey_eq_isSome_getEntry?, getEntry?_removeKey hl, Bool.apply_cond]

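On the list model both names denote the same function, so a small concrete check illustrates the behavior the lemmas describe. A sketch (not part of the diff), using `Nat` keys and `String` values; `rfl` is expected to close these by computation:

```lean
-- Sketch: erasing drops the (first) entry whose key matches, and the key
-- is no longer contained afterwards.
example :
    eraseKey 2 ([⟨1, "a"⟩, ⟨2, "b"⟩, ⟨3, "c"⟩] : List ((_ : Nat) × String)) =
      [⟨1, "a"⟩, ⟨3, "c"⟩] := rfl

example :
    containsKey 2 (eraseKey 2 ([⟨1, "a"⟩, ⟨2, "b"⟩] : List ((_ : Nat) × String))) =
      false := rfl
```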
-theorem getValueCast?_eraseKey [BEq α] [LawfulBEq α] {l : List ((a : α) × β a)} {k a : α}
+theorem getValueCast?_removeKey [BEq α] [LawfulBEq α] {l : List ((a : α) × β a)} {k a : α}
     (hl : DistinctKeys l) :
-    getValueCast? a (eraseKey k l) = bif a == k then none else getValueCast? a l := by
-  rw [getValueCast?_eq_getEntry?, Option.dmap_congr (getEntry?_eraseKey hl)]
+    getValueCast? a (removeKey k l) = bif a == k then none else getValueCast? a l := by
+  rw [getValueCast?_eq_getEntry?, Option.dmap_congr (getEntry?_removeKey hl)]
   rcases Bool.eq_false_or_eq_true (a == k) with h|h
   · rw [Option.dmap_congr (Bool.cond_pos h), Option.dmap_none, Bool.cond_pos h]
   · rw [Option.dmap_congr (Bool.cond_neg h), getValueCast?_eq_getEntry?]
     exact (Bool.cond_neg h).symm

-theorem getValueCast?_eraseKey_self [BEq α] [LawfulBEq α] {l : List ((a : α) × β a)} {k : α}
-    (hl : DistinctKeys l) : getValueCast? k (eraseKey k l) = none := by
-  rw [getValueCast?_eraseKey hl, Bool.cond_pos BEq.refl]
+theorem getValueCast?_removeKey_self [BEq α] [LawfulBEq α] {l : List ((a : α) × β a)} {k : α}
+    (hl : DistinctKeys l) : getValueCast? k (removeKey k l) = none := by
+  rw [getValueCast?_removeKey hl, Bool.cond_pos BEq.refl]

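The complementary case of the `bif` form, where the erased key is unrelated to the queried one, is handled the same way with `cond_false`. A hypothetical sketch (not part of the diff), assuming `LawfulBEq`:

```lean
-- Sketch: erasing an unrelated key leaves dependent lookups unchanged.
example [BEq α] [LawfulBEq α] {l : List ((a : α) × β a)} {k a : α}
    (hl : DistinctKeys l) (hak : (a == k) = false) :
    getValueCast? a (eraseKey k l) = getValueCast? a l := by
  rw [getValueCast?_eraseKey hl, hak, cond_false]
```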
-theorem getValueCast!_eraseKey [BEq α] [LawfulBEq α] {l : List ((a : α) × β a)} {k a : α}
+theorem getValueCast!_removeKey [BEq α] [LawfulBEq α] {l : List ((a : α) × β a)} {k a : α}
     [Inhabited (β a)] (hl : DistinctKeys l) :
-    getValueCast! a (eraseKey k l) = bif a == k then default else getValueCast! a l := by
-  simp [getValueCast!_eq_getValueCast?, getValueCast?_eraseKey hl, Bool.apply_cond Option.get!]
+    getValueCast! a (removeKey k l) = bif a == k then default else getValueCast! a l := by
+  simp [getValueCast!_eq_getValueCast?, getValueCast?_removeKey hl, Bool.apply_cond Option.get!]

-theorem getValueCast!_eraseKey_self [BEq α] [LawfulBEq α] {l : List ((a : α) × β a)} {k : α}
-    [Inhabited (β k)] (hl : DistinctKeys l) : getValueCast! k (eraseKey k l) = default := by
-  simp [getValueCast!_eq_getValueCast?, getValueCast?_eraseKey_self hl]
+theorem getValueCast!_removeKey_self [BEq α] [LawfulBEq α] {l : List ((a : α) × β a)} {k : α}
+    [Inhabited (β k)] (hl : DistinctKeys l) : getValueCast! k (removeKey k l) = default := by
+  simp [getValueCast!_eq_getValueCast?, getValueCast?_removeKey_self hl]

-theorem getValueCastD_eraseKey [BEq α] [LawfulBEq α] {l : List ((a : α) × β a)} {k a : α}
-    {fallback : β a} (hl : DistinctKeys l) : getValueCastD a (eraseKey k l) fallback =
+theorem getValueCastD_removeKey [BEq α] [LawfulBEq α] {l : List ((a : α) × β a)} {k a : α}
+    {fallback : β a} (hl : DistinctKeys l) : getValueCastD a (removeKey k l) fallback =
       bif a == k then fallback else getValueCastD a l fallback := by
-  simp [getValueCastD_eq_getValueCast?, getValueCast?_eraseKey hl,
+  simp [getValueCastD_eq_getValueCast?, getValueCast?_removeKey hl,
     Bool.apply_cond (fun x => Option.getD x fallback)]

-theorem getValueCastD_eraseKey_self [BEq α] [LawfulBEq α] {l : List ((a : α) × β a)} {k : α}
+theorem getValueCastD_removeKey_self [BEq α] [LawfulBEq α] {l : List ((a : α) × β a)} {k : α}
     {fallback : β k} (hl : DistinctKeys l) :
-    getValueCastD k (eraseKey k l) fallback = fallback := by
-  simp [getValueCastD_eq_getValueCast?, getValueCast?_eraseKey_self hl]
+    getValueCastD k (removeKey k l) fallback = fallback := by
+  simp [getValueCastD_eq_getValueCast?, getValueCast?_removeKey_self hl]

-theorem getValue!_eraseKey {β : Type v} [BEq α] [PartialEquivBEq α] [Inhabited β]
+theorem getValue!_removeKey {β : Type v} [BEq α] [PartialEquivBEq α] [Inhabited β]
     {l : List ((_ : α) × β)} {k a : α} (hl : DistinctKeys l) :
-    getValue! a (eraseKey k l) = bif a == k then default else getValue! a l := by
-  simp [getValue!_eq_getValue?, getValue?_eraseKey hl, Bool.apply_cond Option.get!]
+    getValue! a (removeKey k l) = bif a == k then default else getValue! a l := by
+  simp [getValue!_eq_getValue?, getValue?_removeKey hl, Bool.apply_cond Option.get!]

-theorem getValue!_eraseKey_self {β : Type v} [BEq α] [PartialEquivBEq α] [Inhabited β]
+theorem getValue!_removeKey_self {β : Type v} [BEq α] [PartialEquivBEq α] [Inhabited β]
     {l : List ((_ : α) × β)} {k : α} (hl : DistinctKeys l) :
-    getValue! k (eraseKey k l) = default := by
-  simp [getValue!_eq_getValue?, getValue?_eraseKey_self hl]
+    getValue! k (removeKey k l) = default := by
+  simp [getValue!_eq_getValue?, getValue?_removeKey_self hl]

-theorem getValueD_eraseKey {β : Type v} [BEq α] [PartialEquivBEq α] {l : List ((_ : α) × β)}
-    {k a : α} {fallback : β} (hl : DistinctKeys l) : getValueD a (eraseKey k l) fallback =
+theorem getValueD_removeKey {β : Type v} [BEq α] [PartialEquivBEq α] {l : List ((_ : α) × β)}
+    {k a : α} {fallback : β} (hl : DistinctKeys l) : getValueD a (removeKey k l) fallback =
       bif a == k then fallback else getValueD a l fallback := by
-  simp [getValueD_eq_getValue?, getValue?_eraseKey hl, Bool.apply_cond (fun x => Option.getD x fallback)]
+  simp [getValueD_eq_getValue?, getValue?_removeKey hl, Bool.apply_cond (fun x => Option.getD x fallback)]

-theorem getValueD_eraseKey_self {β : Type v} [BEq α] [PartialEquivBEq α] {l : List ((_ : α) × β)}
+theorem getValueD_removeKey_self {β : Type v} [BEq α] [PartialEquivBEq α] {l : List ((_ : α) × β)}
     {k : α} {fallback : β} (hl : DistinctKeys l) :
-    getValueD k (eraseKey k l) fallback = fallback := by
-  simp [getValueD_eq_getValue?, getValue?_eraseKey_self hl]
+    getValueD k (removeKey k l) fallback = fallback := by
+  simp [getValueD_eq_getValue?, getValue?_removeKey_self hl]

-theorem containsKey_of_containsKey_eraseKey [BEq α] [PartialEquivBEq α] {l : List ((a : α) × β a)}
-    {k a : α} (hl : DistinctKeys l) : containsKey a (eraseKey k l) → containsKey a l := by
-  simp [containsKey_eraseKey hl]
+theorem containsKey_of_containsKey_removeKey [BEq α] [PartialEquivBEq α] {l : List ((a : α) × β a)}
-    {k a : α} (hl : DistinctKeys l) : containsKey a (removeKey k l) → containsKey a l := by
+  simp [containsKey_removeKey hl]

-theorem getValueCast_eraseKey [BEq α] [LawfulBEq α] {l : List ((a : α) × β a)} {k a : α} {h}
-    (hl : DistinctKeys l) : getValueCast a (eraseKey k l) h =
-      getValueCast a l (containsKey_of_containsKey_eraseKey hl h) := by
-  rw [containsKey_eraseKey hl, Bool.and_eq_true, Bool.not_eq_true'] at h
-  rw [← Option.some_inj, ← getValueCast?_eq_some_getValueCast, getValueCast?_eraseKey hl, h.1,
+theorem getValueCast_removeKey [BEq α] [LawfulBEq α] {l : List ((a : α) × β a)} {k a : α} {h}
+    (hl : DistinctKeys l) : getValueCast a (removeKey k l) h =
+      getValueCast a l (containsKey_of_containsKey_removeKey hl h) := by
+  rw [containsKey_removeKey hl, Bool.and_eq_true, Bool.not_eq_true'] at h
+  rw [← Option.some_inj, ← getValueCast?_eq_some_getValueCast, getValueCast?_removeKey hl, h.1,
     cond_false, ← getValueCast?_eq_some_getValueCast]

-theorem getValue_eraseKey {β : Type v} [BEq α] [PartialEquivBEq α] {l : List ((_ : α) × β)}
+theorem getValue_removeKey {β : Type v} [BEq α] [PartialEquivBEq α] {l : List ((_ : α) × β)}
     {k a : α} {h} (hl : DistinctKeys l) :
-    getValue a (eraseKey k l) h = getValue a l (containsKey_of_containsKey_eraseKey hl h) := by
-  rw [containsKey_eraseKey hl, Bool.and_eq_true, Bool.not_eq_true'] at h
-  rw [← Option.some_inj, ← getValue?_eq_some_getValue, getValue?_eraseKey hl, h.1, cond_false,
+    getValue a (removeKey k l) h = getValue a l (containsKey_of_containsKey_removeKey hl h) := by
+  rw [containsKey_removeKey hl, Bool.and_eq_true, Bool.not_eq_true'] at h
+  rw [← Option.some_inj, ← getValue?_eq_some_getValue, getValue?_removeKey hl, h.1, cond_false,
    ← getValue?_eq_some_getValue]

 theorem getEntry?_of_perm [BEq α] [PartialEquivBEq α] {l l' : List ((a : α) × β a)} {a : α}
@@ -1429,10 +1429,10 @@ theorem insertEntry_of_perm [BEq α] [EquivBEq α] {l l' : List ((a : α) × β
   apply getEntry?_ext hl.insertEntry (hl.perm h.symm).insertEntry
   simp [getEntry?_insertEntry, getEntry?_of_perm hl h]

-theorem eraseKey_of_perm [BEq α] [EquivBEq α] {l l' : List ((a : α) × β a)} {k : α}
-    (hl : DistinctKeys l) (h : Perm l l') : Perm (eraseKey k l) (eraseKey k l') := by
-  apply getEntry?_ext hl.eraseKey (hl.perm h.symm).eraseKey
-  simp [getEntry?_eraseKey hl, getEntry?_eraseKey (hl.perm h.symm), getEntry?_of_perm hl h]
+theorem removeKey_of_perm [BEq α] [EquivBEq α] {l l' : List ((a : α) × β a)} {k : α}
+    (hl : DistinctKeys l) (h : Perm l l') : Perm (removeKey k l) (removeKey k l') := by
+  apply getEntry?_ext hl.removeKey (hl.perm h.symm).removeKey
+  simp [getEntry?_removeKey hl, getEntry?_removeKey (hl.perm h.symm), getEntry?_of_perm hl h]

 @[simp]
 theorem getEntry?_append [BEq α] {l l' : List ((a : α) × β a)} {a : α} :
@@ -1529,12 +1529,12 @@ theorem insertEntry_append_of_not_contains_right [BEq α] {l l' : List ((a : α)
   · simp [insertEntry, containsKey_append, h, h']
   · simp [insertEntry, containsKey_append, h, h', replaceEntry_append_of_containsKey_left h]

-theorem eraseKey_append_of_containsKey_right_eq_false [BEq α] {l l' : List ((a : α) × β a)} {k : α}
-    (h : containsKey k l' = false) : eraseKey k (l ++ l') = eraseKey k l ++ l' := by
+theorem removeKey_append_of_containsKey_right_eq_false [BEq α] {l l' : List ((a : α) × β a)} {k : α}
+    (h : containsKey k l' = false) : removeKey k (l ++ l') = removeKey k l ++ l' := by
   induction l using assoc_induction
-  · simp [eraseKey_of_containsKey_eq_false h]
+  · simp [removeKey_of_containsKey_eq_false h]
   · next k' v' t ih =>
-    rw [List.cons_append, eraseKey_cons, eraseKey_cons]
+    rw [List.cons_append, removeKey_cons, removeKey_cons]
     cases k == k'
     · rw [cond_false, cond_false, ih, List.cons_append]
     · rw [cond_true, cond_true]

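Combined with `eraseKey_of_containsKey_eq_false` (used in the proof above), the append lemma yields a convenient no-op corollary. A sketch (not part of the diff); the renamed side is analogous:

```lean
-- Sketch: if neither part contains `k`, erasing from the concatenation is a no-op.
example [BEq α] {l l' : List ((a : α) × β a)} {k : α}
    (h : containsKey k l' = false) (h' : containsKey k l = false) :
    eraseKey k (l ++ l') = l ++ l' := by
  rw [eraseKey_append_of_containsKey_right_eq_false h, eraseKey_of_containsKey_eq_false h']
```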
@@ -286,12 +286,12 @@ def insertIfNewₘ [BEq α] [Hashable α] (m : Raw₀ α β) (a : α) (b : β a)
   if m.containsₘ a then m else Raw₀.expandIfNecessary (m.consₘ a b)

 /-- Internal implementation detail of the hash map -/
-def eraseₘaux [BEq α] [Hashable α] (m : Raw₀ α β) (a : α) : Raw₀ α β :=
-  ⟨⟨m.1.size - 1, updateBucket m.1.buckets m.2 a (fun l => l.erase a)⟩, by simpa using m.2⟩
+def removeₘaux [BEq α] [Hashable α] (m : Raw₀ α β) (a : α) : Raw₀ α β :=
+  ⟨⟨m.1.size - 1, updateBucket m.1.buckets m.2 a (fun l => l.remove a)⟩, by simpa using m.2⟩

 /-- Internal implementation detail of the hash map -/
-def eraseₘ [BEq α] [Hashable α] (m : Raw₀ α β) (a : α) : Raw₀ α β :=
-  if m.containsₘ a then m.eraseₘaux a else m
+def removeₘ [BEq α] [Hashable α] (m : Raw₀ α β) (a : α) : Raw₀ α β :=
+  if m.containsₘ a then m.removeₘaux a else m

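The split into `eraseₘaux` (which unconditionally decrements the cached size) and `eraseₘ` (which guards the call with `containsₘ`) keeps the stored size in sync with the buckets: the size is decremented exactly when the key is actually present. The same bookkeeping can be sketched on the list model (hypothetical name `eraseModel`, not part of the library):

```lean
-- Sketch (hypothetical): the guarded erase keeps the cached size equal to
-- the length of the underlying entry list.
def eraseModel [BEq α] (sz : Nat) (l : List ((a : α) × β a)) (k : α) :
    Nat × List ((a : α) × β a) :=
  if containsKey k l then (sz - 1, eraseKey k l) else (sz, l)
```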
 /-- Internal implementation detail of the hash map -/
 def filterMapₘ (m : Raw₀ α β) (f : (a : α) → β a → Option (δ a)) : Raw₀ α δ :=
@@ -405,12 +405,12 @@ theorem getThenInsertIfNew?_eq_get?ₘ [BEq α] [Hashable α] [LawfulBEq α] (m
   dsimp only [Array.ugetElem_eq_getElem, Array.uset]
   split <;> simp_all

-theorem erase_eq_eraseₘ [BEq α] [Hashable α] (m : Raw₀ α β) (a : α) :
-    m.erase a = m.eraseₘ a := by
-  rw [erase, eraseₘ, containsₘ, bucket]
+theorem remove_eq_removeₘ [BEq α] [Hashable α] (m : Raw₀ α β) (a : α) :
+    m.remove a = m.removeₘ a := by
+  rw [remove, removeₘ, containsₘ, bucket]
   dsimp only [Array.ugetElem_eq_getElem, Array.uset]
   split
-  · simp only [eraseₘaux, Subtype.mk.injEq, Raw.mk.injEq, true_and]
+  · simp only [removeₘaux, Subtype.mk.injEq, Raw.mk.injEq, true_and]
     rw [Array.set_set, updateBucket]
     simp only [Array.uset, Array.ugetElem_eq_getElem]
   · rfl

@@ -135,13 +135,13 @@ theorem get!_val [BEq α] [Hashable α] [LawfulBEq α] {m : Raw₀ α β} {a :
     m.val.get! a = m.get! a := by
   simp [Raw.get!, m.2]

-theorem erase_eq [BEq α] [Hashable α] {m : Raw α β} (h : m.WF) {a : α} :
-    m.erase a = Raw₀.erase ⟨m, h.size_buckets_pos⟩ a := by
-  simp [Raw.erase, h.size_buckets_pos]
+theorem remove_eq [BEq α] [Hashable α] {m : Raw α β} (h : m.WF) {a : α} :
+    m.remove a = Raw₀.remove ⟨m, h.size_buckets_pos⟩ a := by
+  simp [Raw.remove, h.size_buckets_pos]

-theorem erase_val [BEq α] [Hashable α] {m : Raw₀ α β} {a : α} :
-    m.val.erase a = m.erase a := by
-  simp [Raw.erase, m.2]
+theorem remove_val [BEq α] [Hashable α] {m : Raw₀ α β} {a : α} :
+    m.val.remove a = m.remove a := by
+  simp [Raw.remove, m.2]

 theorem filterMap_eq [BEq α] [Hashable α] {m : Raw α β} (h : m.WF)
     {f : (a : α) → β a → Option (δ a)} : m.filterMap f =

@@ -60,7 +60,7 @@ variable (m : Raw₀ α β) (h : m.1.WF)
 /-- Internal implementation detail of the hash map -/
 scoped macro "wf_trivial" : tactic => `(tactic|
   repeat (first
-    | apply Raw₀.wfImp_insert | apply Raw₀.wfImp_insertIfNew | apply Raw₀.wfImp_erase
+    | apply Raw₀.wfImp_insert | apply Raw₀.wfImp_insertIfNew | apply Raw₀.wfImp_remove
     | apply Raw.WF.out | assumption | apply Raw₀.wfImp_empty | apply Raw.WFImp.distinct
     | apply Raw.WF.empty₀))

@@ -76,7 +76,7 @@ private def queryNames : Array Name :=
     ``Const.get!_eq_getValue!, ``Const.getD_eq_getValueD]

 private def modifyNames : Array Name :=
-  #[``toListModel_insert, ``toListModel_erase, ``toListModel_insertIfNew]
+  #[``toListModel_insert, ``toListModel_remove, ``toListModel_insertIfNew]

 private def congrNames : MacroM (Array (TSyntax `term)) := do
   return #[← `(Std.DHashMap.Internal.List.Perm.isEmpty_eq), ← `(containsKey_of_perm),
@@ -153,28 +153,28 @@ theorem size_le_size_insert [EquivBEq α] [LawfulHashable α] {k : α} {v : β k
   simp_to_model using List.length_le_length_insertEntry

 @[simp]
-theorem erase_empty {k : α} {c : Nat} : (empty c : Raw₀ α β).erase k = empty c := by
-  simp [erase, empty]
+theorem remove_empty {k : α} {c : Nat} : (empty c : Raw₀ α β).remove k = empty c := by
+  simp [remove, empty]

-theorem isEmpty_erase [EquivBEq α] [LawfulHashable α] {k : α} :
-    (m.erase k).1.isEmpty = (m.1.isEmpty || (m.1.size == 1 && m.contains k)) := by
-  simp_to_model using List.isEmpty_eraseKey
+theorem isEmpty_remove [EquivBEq α] [LawfulHashable α] {k : α} :
+    (m.remove k).1.isEmpty = (m.1.isEmpty || (m.1.size == 1 && m.contains k)) := by
+  simp_to_model using List.isEmpty_removeKey

-theorem contains_erase [EquivBEq α] [LawfulHashable α] {k a : α} :
-    (m.erase k).contains a = (!(a == k) && m.contains a) := by
-  simp_to_model using List.containsKey_eraseKey
+theorem contains_remove [EquivBEq α] [LawfulHashable α] {k a : α} :
+    (m.remove k).contains a = (!(a == k) && m.contains a) := by
+  simp_to_model using List.containsKey_removeKey

-theorem contains_of_contains_erase [EquivBEq α] [LawfulHashable α] {k a : α} :
-    (m.erase k).contains a → m.contains a := by
-  simp_to_model using List.containsKey_of_containsKey_eraseKey
+theorem contains_of_contains_remove [EquivBEq α] [LawfulHashable α] {k a : α} :
+    (m.remove k).contains a → m.contains a := by
+  simp_to_model using List.containsKey_of_containsKey_removeKey

-theorem size_erase [EquivBEq α] [LawfulHashable α] {k : α} :
-    (m.erase k).1.size = bif m.contains k then m.1.size - 1 else m.1.size := by
-  simp_to_model using List.length_eraseKey
+theorem size_remove [EquivBEq α] [LawfulHashable α] {k : α} :
+    (m.remove k).1.size = bif m.contains k then m.1.size - 1 else m.1.size := by
+  simp_to_model using List.length_removeKey

-theorem size_erase_le [EquivBEq α] [LawfulHashable α] {k : α} :
-    (m.erase k).1.size ≤ m.1.size := by
-  simp_to_model using List.length_eraseKey_le
+theorem size_remove_le [EquivBEq α] [LawfulHashable α] {k : α} :
+    (m.remove k).1.size ≤ m.1.size := by
+  simp_to_model using List.length_removeKey_le

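A hypothetical corollary sketch (not part of the diff): when the key is absent, the size lemma degenerates to a no-op on sizes. The explicit-argument order `m h` is assumed from the section's `variable` declarations:

```lean
-- Sketch: erasing an absent key does not change the size.
example [EquivBEq α] [LawfulHashable α] {k : α} (hk : m.contains k = false) :
    (m.erase k).1.size = m.1.size := by
  rw [size_erase m h, hk, cond_false]
```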
 @[simp]
 theorem containsThenInsert_fst {k : α} {v : β k} : (m.containsThenInsert k v).1 = m.contains k := by
@@ -214,12 +214,12 @@ theorem contains_eq_isSome_get? [LawfulBEq α] {a : α} : m.contains a = (m.get?
 theorem get?_eq_none [LawfulBEq α] {a : α} : m.contains a = false → m.get? a = none := by
   simp_to_model using List.getValueCast?_eq_none

-theorem get?_erase [LawfulBEq α] {k a : α} :
-    (m.erase k).get? a = bif a == k then none else m.get? a := by
-  simp_to_model using List.getValueCast?_eraseKey
+theorem get?_remove [LawfulBEq α] {k a : α} :
+    (m.remove k).get? a = bif a == k then none else m.get? a := by
+  simp_to_model using List.getValueCast?_removeKey

-theorem get?_erase_self [LawfulBEq α] {k : α} : (m.erase k).get? k = none := by
-  simp_to_model using List.getValueCast?_eraseKey_self
+theorem get?_remove_self [LawfulBEq α] {k : α} : (m.remove k).get? k = none := by
+  simp_to_model using List.getValueCast?_removeKey_self

 namespace Const

@@ -249,13 +249,13 @@ theorem get?_eq_none [EquivBEq α] [LawfulHashable α] {a : α} :
     m.contains a = false → get? m a = none := by
   simp_to_model using List.getValue?_eq_none.2

-theorem get?_erase [EquivBEq α] [LawfulHashable α] {k a : α} :
-    Const.get? (m.erase k) a = bif a == k then none else get? m a := by
-  simp_to_model using List.getValue?_eraseKey
+theorem get?_remove [EquivBEq α] [LawfulHashable α] {k a : α} :
+    Const.get? (m.remove k) a = bif a == k then none else get? m a := by
+  simp_to_model using List.getValue?_removeKey

-theorem get?_erase_self [EquivBEq α] [LawfulHashable α] {k : α} :
-    get? (m.erase k) k = none := by
-  simp_to_model using List.getValue?_eraseKey_self
+theorem get?_remove_self [EquivBEq α] [LawfulHashable α] {k : α} :
+    get? (m.remove k) k = none := by
+  simp_to_model using List.getValue?_removeKey_self

 theorem get?_eq_get? [LawfulBEq α] {a : α} : get? m a = m.get? a := by
   simp_to_model using List.getValue?_eq_getValueCast?
@@ -279,9 +279,9 @@ theorem get_insert_self [LawfulBEq α] {k : α} {v : β k} :
   simp_to_model using List.getValueCast_insertEntry_self

 @[simp]
-theorem get_erase [LawfulBEq α] {k a : α} {h'} :
-    (m.erase k).get a h' = m.get a (contains_of_contains_erase _ h h') := by
-  simp_to_model using List.getValueCast_eraseKey
+theorem get_remove [LawfulBEq α] {k a : α} {h'} :
+    (m.remove k).get a h' = m.get a (contains_of_contains_remove _ h h') := by
+  simp_to_model using List.getValueCast_removeKey

 theorem get?_eq_some_get [LawfulBEq α] {a : α} {h} : m.get? a = some (m.get a h) := by
   simp_to_model using List.getValueCast?_eq_some_getValueCast
@@ -301,9 +301,9 @@ theorem get_insert_self [EquivBEq α] [LawfulHashable α] {k : α} {v : β} :
   simp_to_model using List.getValue_insertEntry_self

 @[simp]
-theorem get_erase [EquivBEq α] [LawfulHashable α] {k a : α} {h'} :
-    get (m.erase k) a h' = get m a (contains_of_contains_erase _ h h') := by
-  simp_to_model using List.getValue_eraseKey
+theorem get_remove [EquivBEq α] [LawfulHashable α] {k a : α} {h'} :
+    get (m.remove k) a h' = get m a (contains_of_contains_remove _ h h') := by
+  simp_to_model using List.getValue_removeKey

 theorem get?_eq_some_get [EquivBEq α] [LawfulHashable α] {a : α} {h} :
     get? m a = some (get m a h) := by
@@ -339,13 +339,13 @@ theorem get!_eq_default [LawfulBEq α] {a : α} [Inhabited (β a)] :
     m.contains a = false → m.get! a = default := by
   simp_to_model using List.getValueCast!_eq_default

-theorem get!_erase [LawfulBEq α] {k a : α} [Inhabited (β a)] :
-    (m.erase k).get! a = bif a == k then default else m.get! a := by
-  simp_to_model using List.getValueCast!_eraseKey
+theorem get!_remove [LawfulBEq α] {k a : α} [Inhabited (β a)] :
+    (m.remove k).get! a = bif a == k then default else m.get! a := by
+  simp_to_model using List.getValueCast!_removeKey

-theorem get!_erase_self [LawfulBEq α] {k : α} [Inhabited (β k)] :
-    (m.erase k).get! k = default := by
-  simp_to_model using List.getValueCast!_eraseKey_self
+theorem get!_remove_self [LawfulBEq α] {k : α} [Inhabited (β k)] :
+    (m.remove k).get! k = default := by
+  simp_to_model using List.getValueCast!_removeKey_self

 theorem get?_eq_some_get! [LawfulBEq α] {a : α} [Inhabited (β a)] :
     m.contains a = true → m.get? a = some (m.get! a) := by
@@ -383,13 +383,13 @@ theorem get!_eq_default [EquivBEq α] [LawfulHashable α] [Inhabited β] {a : α
     m.contains a = false → get! m a = default := by
   simp_to_model using List.getValue!_eq_default

-theorem get!_erase [EquivBEq α] [LawfulHashable α] [Inhabited β] {k a : α} :
-    get! (m.erase k) a = bif a == k then default else get! m a := by
-  simp_to_model using List.getValue!_eraseKey
+theorem get!_remove [EquivBEq α] [LawfulHashable α] [Inhabited β] {k a : α} :
+    get! (m.remove k) a = bif a == k then default else get! m a := by
+  simp_to_model using List.getValue!_removeKey

-theorem get!_erase_self [EquivBEq α] [LawfulHashable α] [Inhabited β] {k : α} :
-    get! (m.erase k) k = default := by
-  simp_to_model using List.getValue!_eraseKey_self
+theorem get!_remove_self [EquivBEq α] [LawfulHashable α] [Inhabited β] {k : α} :
+    get! (m.remove k) k = default := by
+  simp_to_model using List.getValue!_removeKey_self

 theorem get?_eq_some_get! [EquivBEq α] [LawfulHashable α] [Inhabited β] {a : α} :
     m.contains a = true → get? m a = some (get! m a) := by
@@ -434,13 +434,13 @@ theorem getD_eq_fallback [LawfulBEq α] {a : α} {fallback : β a} :
     m.contains a = false → m.getD a fallback = fallback := by
   simp_to_model using List.getValueCastD_eq_fallback

-theorem getD_erase [LawfulBEq α] {k a : α} {fallback : β a} :
-    (m.erase k).getD a fallback = bif a == k then fallback else m.getD a fallback := by
-  simp_to_model using List.getValueCastD_eraseKey
+theorem getD_remove [LawfulBEq α] {k a : α} {fallback : β a} :
+    (m.remove k).getD a fallback = bif a == k then fallback else m.getD a fallback := by
+  simp_to_model using List.getValueCastD_removeKey

-theorem getD_erase_self [LawfulBEq α] {k : α} {fallback : β k} :
-    (m.erase k).getD k fallback = fallback := by
-  simp_to_model using List.getValueCastD_eraseKey_self
+theorem getD_remove_self [LawfulBEq α] {k : α} {fallback : β k} :
+    (m.remove k).getD k fallback = fallback := by
+  simp_to_model using List.getValueCastD_removeKey_self

 theorem get?_eq_some_getD [LawfulBEq α] {a : α} {fallback : β a} :
     m.contains a = true → m.get? a = some (m.getD a fallback) := by
@@ -482,13 +482,13 @@ theorem getD_eq_fallback [EquivBEq α] [LawfulHashable α] {a : α} {fallback :
     m.contains a = false → getD m a fallback = fallback := by
   simp_to_model using List.getValueD_eq_fallback

-theorem getD_erase [EquivBEq α] [LawfulHashable α] {k a : α} {fallback : β} :
-    getD (m.erase k) a fallback = bif a == k then fallback else getD m a fallback := by
-  simp_to_model using List.getValueD_eraseKey
+theorem getD_remove [EquivBEq α] [LawfulHashable α] {k a : α} {fallback : β} :
+    getD (m.remove k) a fallback = bif a == k then fallback else getD m a fallback := by
+  simp_to_model using List.getValueD_removeKey

-theorem getD_erase_self [EquivBEq α] [LawfulHashable α] {k : α} {fallback : β} :
-    getD (m.erase k) k fallback = fallback := by
-  simp_to_model using List.getValueD_eraseKey_self
+theorem getD_remove_self [EquivBEq α] [LawfulHashable α] {k : α} {fallback : β} :
+    getD (m.remove k) k fallback = fallback := by
+  simp_to_model using List.getValueD_removeKey_self

 theorem get?_eq_some_getD [EquivBEq α] [LawfulHashable α] {a : α} {fallback : β} :
     m.contains a = true → get? m a = some (getD m a fallback) := by

@@ -453,63 +453,63 @@ theorem Const.wfImp_getThenInsertIfNew? {β : Type v} [BEq α] [Hashable α] [Eq
   rw [getThenInsertIfNew?_eq_insertIfNewₘ]
   exact wfImp_insertIfNewₘ h

-/-! # `eraseₘ` -/
+/-! # `removeₘ` -/

-theorem toListModel_eraseₘaux [BEq α] [Hashable α] [EquivBEq α] [LawfulHashable α] (m : Raw₀ α β)
+theorem toListModel_removeₘaux [BEq α] [Hashable α] [EquivBEq α] [LawfulHashable α] (m : Raw₀ α β)
     (a : α) (h : Raw.WFImp m.1) :
-    Perm (toListModel (m.eraseₘaux a).1.buckets) (eraseKey a (toListModel m.1.buckets)) :=
-  toListModel_updateBucket h AssocList.toList_erase List.eraseKey_of_perm
-    List.eraseKey_append_of_containsKey_right_eq_false
+    Perm (toListModel (m.removeₘaux a).1.buckets) (removeKey a (toListModel m.1.buckets)) :=
+  toListModel_updateBucket h AssocList.toList_remove List.removeKey_of_perm
+    List.removeKey_append_of_containsKey_right_eq_false

-theorem isHashSelf_eraseₘaux [BEq α] [Hashable α] [EquivBEq α] [LawfulHashable α] (m : Raw₀ α β)
-    (a : α) (h : Raw.WFImp m.1) : IsHashSelf (m.eraseₘaux a).1.buckets := by
+theorem isHashSelf_removeₘaux [BEq α] [Hashable α] [EquivBEq α] [LawfulHashable α] (m : Raw₀ α β)
+    (a : α) (h : Raw.WFImp m.1) : IsHashSelf (m.removeₘaux a).1.buckets := by
   apply h.buckets_hash_self.updateBucket (fun l p hp => ?_)
-  rw [AssocList.toList_erase] at hp
-  exact Or.inl (containsKey_of_mem ((sublist_eraseKey.mem hp)))
+  rw [AssocList.toList_remove] at hp
+  exact Or.inl (containsKey_of_mem ((sublist_removeKey.mem hp)))

-theorem wfImp_eraseₘaux [BEq α] [Hashable α] [EquivBEq α] [LawfulHashable α] (m : Raw₀ α β) (a : α)
-    (h : Raw.WFImp m.1) (h' : m.containsₘ a = true) : Raw.WFImp (m.eraseₘaux a).1 where
-  buckets_hash_self := isHashSelf_eraseₘaux m a h
+theorem wfImp_removeₘaux [BEq α] [Hashable α] [EquivBEq α] [LawfulHashable α] (m : Raw₀ α β) (a : α)
+    (h : Raw.WFImp m.1) (h' : m.containsₘ a = true) : Raw.WFImp (m.removeₘaux a).1 where
+  buckets_hash_self := isHashSelf_removeₘaux m a h
   size_eq := by
-    rw [(toListModel_eraseₘaux m a h).length_eq, eraseₘaux, length_eraseKey,
+    rw [(toListModel_removeₘaux m a h).length_eq, removeₘaux, length_removeKey,
      ← containsₘ_eq_containsKey h, h', cond_true, h.size_eq]
-  distinct := h.distinct.eraseKey.perm (toListModel_eraseₘaux m a h)
+  distinct := h.distinct.removeKey.perm (toListModel_removeₘaux m a h)

-theorem toListModel_perm_eraseKey_of_containsₘ_eq_false [BEq α] [Hashable α] [EquivBEq α]
+theorem toListModel_perm_removeKey_of_containsₘ_eq_false [BEq α] [Hashable α] [EquivBEq α]
     [LawfulHashable α] (m : Raw₀ α β) (a : α) (h : Raw.WFImp m.1) (h' : m.containsₘ a = false) :
-    Perm (toListModel m.1.buckets) (eraseKey a (toListModel m.1.buckets)) := by
-  rw [eraseKey_of_containsKey_eq_false]
+    Perm (toListModel m.1.buckets) (removeKey a (toListModel m.1.buckets)) := by
+  rw [removeKey_of_containsKey_eq_false]
   · exact Perm.refl _
   · rw [← containsₘ_eq_containsKey h, h']

-theorem toListModel_eraseₘ [BEq α] [Hashable α] [EquivBEq α] [LawfulHashable α] {m : Raw₀ α β}
+theorem toListModel_removeₘ [BEq α] [Hashable α] [EquivBEq α] [LawfulHashable α] {m : Raw₀ α β}
     {a : α} (h : Raw.WFImp m.1) :
-    Perm (toListModel (m.eraseₘ a).1.buckets) (eraseKey a (toListModel m.1.buckets)) := by
-  rw [eraseₘ]
+    Perm (toListModel (m.removeₘ a).1.buckets) (removeKey a (toListModel m.1.buckets)) := by
+  rw [removeₘ]
   split
-  · exact toListModel_eraseₘaux m a h
+  · exact toListModel_removeₘaux m a h
   · next h' =>
-    exact toListModel_perm_eraseKey_of_containsₘ_eq_false _ _ h (eq_false_of_ne_true h')
+    exact toListModel_perm_removeKey_of_containsₘ_eq_false _ _ h (eq_false_of_ne_true h')

-theorem wfImp_eraseₘ [BEq α] [Hashable α] [EquivBEq α] [LawfulHashable α] {m : Raw₀ α β} {a : α}
-    (h : Raw.WFImp m.1) : Raw.WFImp (m.eraseₘ a).1 := by
-  rw [eraseₘ]
+theorem wfImp_removeₘ [BEq α] [Hashable α] [EquivBEq α] [LawfulHashable α] {m : Raw₀ α β} {a : α}
+    (h : Raw.WFImp m.1) : Raw.WFImp (m.removeₘ a).1 := by
+  rw [removeₘ]
   split
-  · next h' => exact wfImp_eraseₘaux m a h h'
+  · next h' => exact wfImp_removeₘaux m a h h'
   · exact h

-/-! # `erase` -/
+/-! # `remove` -/

-theorem toListModel_erase [BEq α] [Hashable α] [EquivBEq α] [LawfulHashable α] {m : Raw₀ α β}
+theorem toListModel_remove [BEq α] [Hashable α] [EquivBEq α] [LawfulHashable α] {m : Raw₀ α β}
     {a : α} (h : Raw.WFImp m.1) :
-    Perm (toListModel (m.erase a).1.buckets) (eraseKey a (toListModel m.1.buckets)) := by
|
||||
rw [erase_eq_eraseₘ]
|
||||
exact toListModel_eraseₘ h
|
||||
Perm (toListModel (m.remove a).1.buckets) (removeKey a (toListModel m.1.buckets)) := by
|
||||
rw [remove_eq_removeₘ]
|
||||
exact toListModel_removeₘ h
|
||||
|
||||
theorem wfImp_erase [BEq α] [Hashable α] [EquivBEq α] [LawfulHashable α] {m : Raw₀ α β} {a : α}
|
||||
(h : Raw.WFImp m.1) : Raw.WFImp (m.erase a).1 := by
|
||||
rw [erase_eq_eraseₘ]
|
||||
exact wfImp_eraseₘ h
|
||||
theorem wfImp_remove [BEq α] [Hashable α] [EquivBEq α] [LawfulHashable α] {m : Raw₀ α β} {a : α}
|
||||
(h : Raw.WFImp m.1) : Raw.WFImp (m.remove a).1 := by
|
||||
rw [remove_eq_removeₘ]
|
||||
exact wfImp_removeₘ h
|
||||
|
||||
/-! # `filterMapₘ` -/
|
||||
|
||||
@@ -626,7 +626,7 @@ theorem WF.out [BEq α] [Hashable α] [i₁ : EquivBEq α] [i₂ : LawfulHashabl
  · next h => exact Raw₀.wfImp_insert (by apply h)
  · next h => exact Raw₀.wfImp_containsThenInsert (by apply h)
  · next h => exact Raw₀.wfImp_containsThenInsertIfNew (by apply h)
  · next h => exact Raw₀.wfImp_erase (by apply h)
  · next h => exact Raw₀.wfImp_remove (by apply h)
  · next h => exact Raw₀.wfImp_insertIfNew (by apply h)
  · next h => exact Raw₀.wfImp_getThenInsertIfNew? (by apply h)
  · next h => exact Raw₀.wfImp_filter (by apply h)

@@ -110,41 +110,41 @@ theorem size_le_size_insert [EquivBEq α] [LawfulHashable α] {k : α} {v : β k
  Raw₀.size_le_size_insert ⟨m.1, _⟩ m.2

@[simp]
theorem erase_empty {k : α} {c : Nat} : (empty c : DHashMap α β).erase k = empty c :=
  Subtype.eq (congrArg Subtype.val (Raw₀.erase_empty (k := k)) :) -- Lean code is happy
theorem remove_empty {k : α} {c : Nat} : (empty c : DHashMap α β).remove k = empty c :=
  Subtype.eq (congrArg Subtype.val (Raw₀.remove_empty (k := k)) :) -- Lean code is happy

@[simp]
theorem erase_emptyc {k : α} : (∅ : DHashMap α β).erase k = ∅ :=
  erase_empty
theorem remove_emptyc {k : α} : (∅ : DHashMap α β).remove k = ∅ :=
  remove_empty

@[simp]
theorem isEmpty_erase [EquivBEq α] [LawfulHashable α] {k : α} :
    (m.erase k).isEmpty = (m.isEmpty || (m.size == 1 && m.contains k)) :=
  Raw₀.isEmpty_erase _ m.2
theorem isEmpty_remove [EquivBEq α] [LawfulHashable α] {k : α} :
    (m.remove k).isEmpty = (m.isEmpty || (m.size == 1 && m.contains k)) :=
  Raw₀.isEmpty_remove _ m.2

@[simp]
theorem contains_erase [EquivBEq α] [LawfulHashable α] {k a : α} :
    (m.erase k).contains a = (!(a == k) && m.contains a) :=
  Raw₀.contains_erase ⟨m.1, _⟩ m.2
theorem contains_remove [EquivBEq α] [LawfulHashable α] {k a : α} :
    (m.remove k).contains a = (!(a == k) && m.contains a) :=
  Raw₀.contains_remove ⟨m.1, _⟩ m.2

@[simp]
theorem mem_erase [EquivBEq α] [LawfulHashable α] {k a : α} :
    a ∈ m.erase k ↔ (a == k) = false ∧ a ∈ m := by
  simp [mem_iff_contains, contains_erase]
theorem mem_remove [EquivBEq α] [LawfulHashable α] {k a : α} :
    a ∈ m.remove k ↔ (a == k) = false ∧ a ∈ m := by
  simp [mem_iff_contains, contains_remove]

theorem contains_of_contains_erase [EquivBEq α] [LawfulHashable α] {k a : α} :
    (m.erase k).contains a → m.contains a :=
  Raw₀.contains_of_contains_erase ⟨m.1, _⟩ m.2
theorem contains_of_contains_remove [EquivBEq α] [LawfulHashable α] {k a : α} :
    (m.remove k).contains a → m.contains a :=
  Raw₀.contains_of_contains_remove ⟨m.1, _⟩ m.2

theorem mem_of_mem_erase [EquivBEq α] [LawfulHashable α] {k a : α} : a ∈ m.erase k → a ∈ m := by
theorem mem_of_mem_remove [EquivBEq α] [LawfulHashable α] {k a : α} : a ∈ m.remove k → a ∈ m := by
  simp

theorem size_erase [EquivBEq α] [LawfulHashable α] {k : α} :
    (m.erase k).size = bif m.contains k then m.size - 1 else m.size :=
  Raw₀.size_erase _ m.2
theorem size_remove [EquivBEq α] [LawfulHashable α] {k : α} :
    (m.remove k).size = bif m.contains k then m.size - 1 else m.size :=
  Raw₀.size_remove _ m.2

theorem size_erase_le [EquivBEq α] [LawfulHashable α] {k : α} : (m.erase k).size ≤ m.size :=
  Raw₀.size_erase_le _ m.2
theorem size_remove_le [EquivBEq α] [LawfulHashable α] {k : α} : (m.remove k).size ≤ m.size :=
  Raw₀.size_remove_le _ m.2

@[simp]
theorem containsThenInsert_fst {k : α} {v : β k} : (m.containsThenInsert k v).1 = m.contains k :=
@@ -193,13 +193,13 @@ theorem get?_eq_none_of_contains_eq_false [LawfulBEq α] {a : α} :
theorem get?_eq_none [LawfulBEq α] {a : α} : ¬a ∈ m → m.get? a = none := by
  simpa [mem_iff_contains] using get?_eq_none_of_contains_eq_false

theorem get?_erase [LawfulBEq α] {k a : α} :
    (m.erase k).get? a = bif a == k then none else m.get? a :=
  Raw₀.get?_erase ⟨m.1, _⟩ m.2
theorem get?_remove [LawfulBEq α] {k a : α} :
    (m.remove k).get? a = bif a == k then none else m.get? a :=
  Raw₀.get?_remove ⟨m.1, _⟩ m.2

@[simp]
theorem get?_erase_self [LawfulBEq α] {k : α} : (m.erase k).get? k = none :=
  Raw₀.get?_erase_self ⟨m.1, _⟩ m.2
theorem get?_remove_self [LawfulBEq α] {k : α} : (m.remove k).get? k = none :=
  Raw₀.get?_remove_self ⟨m.1, _⟩ m.2

namespace Const

@@ -237,13 +237,13 @@ theorem get?_eq_none_of_contains_eq_false [EquivBEq α] [LawfulHashable α] {a :
theorem get?_eq_none [EquivBEq α] [LawfulHashable α] {a : α} : ¬a ∈ m → get? m a = none := by
  simpa [mem_iff_contains] using get?_eq_none_of_contains_eq_false

theorem get?_erase [EquivBEq α] [LawfulHashable α] {k a : α} :
    Const.get? (m.erase k) a = bif a == k then none else get? m a :=
  Raw₀.Const.get?_erase ⟨m.1, _⟩ m.2
theorem get?_remove [EquivBEq α] [LawfulHashable α] {k a : α} :
    Const.get? (m.remove k) a = bif a == k then none else get? m a :=
  Raw₀.Const.get?_remove ⟨m.1, _⟩ m.2

@[simp]
theorem get?_erase_self [EquivBEq α] [LawfulHashable α] {k : α} : get? (m.erase k) k = none :=
  Raw₀.Const.get?_erase_self ⟨m.1, _⟩ m.2
theorem get?_remove_self [EquivBEq α] [LawfulHashable α] {k : α} : get? (m.remove k) k = none :=
  Raw₀.Const.get?_remove_self ⟨m.1, _⟩ m.2

theorem get?_eq_get? [LawfulBEq α] {a : α} : get? m a = m.get? a :=
  Raw₀.Const.get?_eq_get? ⟨m.1, _⟩ m.2
@@ -267,9 +267,9 @@ theorem get_insert_self [LawfulBEq α] {k : α} {v : β k} :
  Raw₀.get_insert_self ⟨m.1, _⟩ m.2

@[simp]
theorem get_erase [LawfulBEq α] {k a : α} {h'} :
    (m.erase k).get a h' = m.get a (mem_of_mem_erase h') :=
  Raw₀.get_erase ⟨m.1, _⟩ m.2
theorem get_remove [LawfulBEq α] {k a : α} {h'} :
    (m.remove k).get a h' = m.get a (mem_of_mem_remove h') :=
  Raw₀.get_remove ⟨m.1, _⟩ m.2

theorem get?_eq_some_get [LawfulBEq α] {a : α} {h} : m.get? a = some (m.get a h) :=
  Raw₀.get?_eq_some_get ⟨m.1, _⟩ m.2
@@ -289,9 +289,9 @@ theorem get_insert_self [EquivBEq α] [LawfulHashable α] {k : α} {v : β} :
  Raw₀.Const.get_insert_self ⟨m.1, _⟩ m.2

@[simp]
theorem get_erase [EquivBEq α] [LawfulHashable α] {k a : α} {h'} :
    get (m.erase k) a h' = get m a (mem_of_mem_erase h') :=
  Raw₀.Const.get_erase ⟨m.1, _⟩ m.2
theorem get_remove [EquivBEq α] [LawfulHashable α] {k a : α} {h'} :
    get (m.remove k) a h' = get m a (mem_of_mem_remove h') :=
  Raw₀.Const.get_remove ⟨m.1, _⟩ m.2

theorem get?_eq_some_get [EquivBEq α] [LawfulHashable α] {a : α} {h} :
    get? m a = some (get m a h) :=
@@ -338,14 +338,14 @@ theorem get!_eq_default [LawfulBEq α] {a : α} [Inhabited (β a)] :
    ¬a ∈ m → m.get! a = default := by
  simpa [mem_iff_contains] using get!_eq_default_of_contains_eq_false

theorem get!_erase [LawfulBEq α] {k a : α} [Inhabited (β a)] :
    (m.erase k).get! a = bif a == k then default else m.get! a :=
  Raw₀.get!_erase ⟨m.1, _⟩ m.2
theorem get!_remove [LawfulBEq α] {k a : α} [Inhabited (β a)] :
    (m.remove k).get! a = bif a == k then default else m.get! a :=
  Raw₀.get!_remove ⟨m.1, _⟩ m.2

@[simp]
theorem get!_erase_self [LawfulBEq α] {k : α} [Inhabited (β k)] :
    (m.erase k).get! k = default :=
  Raw₀.get!_erase_self ⟨m.1, _⟩ m.2
theorem get!_remove_self [LawfulBEq α] {k : α} [Inhabited (β k)] :
    (m.remove k).get! k = default :=
  Raw₀.get!_remove_self ⟨m.1, _⟩ m.2

theorem get?_eq_some_get!_of_contains [LawfulBEq α] {a : α} [Inhabited (β a)] :
    m.contains a = true → m.get? a = some (m.get! a) :=
@@ -397,14 +397,14 @@ theorem get!_eq_default [EquivBEq α] [LawfulHashable α] [Inhabited β] {a : α
    ¬a ∈ m → get! m a = default := by
  simpa [mem_iff_contains] using get!_eq_default_of_contains_eq_false

theorem get!_erase [EquivBEq α] [LawfulHashable α] [Inhabited β] {k a : α} :
    get! (m.erase k) a = bif a == k then default else get! m a :=
  Raw₀.Const.get!_erase ⟨m.1, _⟩ m.2
theorem get!_remove [EquivBEq α] [LawfulHashable α] [Inhabited β] {k a : α} :
    get! (m.remove k) a = bif a == k then default else get! m a :=
  Raw₀.Const.get!_remove ⟨m.1, _⟩ m.2

@[simp]
theorem get!_erase_self [EquivBEq α] [LawfulHashable α] [Inhabited β] {k : α} :
    get! (m.erase k) k = default :=
  Raw₀.Const.get!_erase_self ⟨m.1, _⟩ m.2
theorem get!_remove_self [EquivBEq α] [LawfulHashable α] [Inhabited β] {k : α} :
    get! (m.remove k) k = default :=
  Raw₀.Const.get!_remove_self ⟨m.1, _⟩ m.2

theorem get?_eq_some_get!_of_contains [EquivBEq α] [LawfulHashable α] [Inhabited β] {a : α} :
    m.contains a = true → get? m a = some (get! m a) :=
@@ -464,14 +464,14 @@ theorem getD_eq_fallback [LawfulBEq α] {a : α} {fallback : β a} :
    ¬a ∈ m → m.getD a fallback = fallback := by
  simpa [mem_iff_contains] using getD_eq_fallback_of_contains_eq_false

theorem getD_erase [LawfulBEq α] {k a : α} {fallback : β a} :
    (m.erase k).getD a fallback = bif a == k then fallback else m.getD a fallback :=
  Raw₀.getD_erase ⟨m.1, _⟩ m.2
theorem getD_remove [LawfulBEq α] {k a : α} {fallback : β a} :
    (m.remove k).getD a fallback = bif a == k then fallback else m.getD a fallback :=
  Raw₀.getD_remove ⟨m.1, _⟩ m.2

@[simp]
theorem getD_erase_self [LawfulBEq α] {k : α} {fallback : β k} :
    (m.erase k).getD k fallback = fallback :=
  Raw₀.getD_erase_self ⟨m.1, _⟩ m.2
theorem getD_remove_self [LawfulBEq α] {k : α} {fallback : β k} :
    (m.remove k).getD k fallback = fallback :=
  Raw₀.getD_remove_self ⟨m.1, _⟩ m.2

theorem get?_eq_some_getD_of_contains [LawfulBEq α] {a : α} {fallback : β a} :
    m.contains a = true → m.get? a = some (m.getD a fallback) :=
@@ -528,14 +528,14 @@ theorem getD_eq_fallback [EquivBEq α] [LawfulHashable α] {a : α} {fallback :
    ¬a ∈ m → getD m a fallback = fallback := by
  simpa [mem_iff_contains] using getD_eq_fallback_of_contains_eq_false

theorem getD_erase [EquivBEq α] [LawfulHashable α] {k a : α} {fallback : β} :
    getD (m.erase k) a fallback = bif a == k then fallback else getD m a fallback :=
  Raw₀.Const.getD_erase ⟨m.1, _⟩ m.2
theorem getD_remove [EquivBEq α] [LawfulHashable α] {k a : α} {fallback : β} :
    getD (m.remove k) a fallback = bif a == k then fallback else getD m a fallback :=
  Raw₀.Const.getD_remove ⟨m.1, _⟩ m.2

@[simp]
theorem getD_erase_self [EquivBEq α] [LawfulHashable α] {k : α} {fallback : β} :
    getD (m.erase k) k fallback = fallback :=
  Raw₀.Const.getD_erase_self ⟨m.1, _⟩ m.2
theorem getD_remove_self [EquivBEq α] [LawfulHashable α] {k : α} {fallback : β} :
    getD (m.remove k) k fallback = fallback :=
  Raw₀.Const.getD_remove_self ⟨m.1, _⟩ m.2

theorem get?_eq_some_getD_of_contains [EquivBEq α] [LawfulHashable α] {a : α} {fallback : β} :
    m.contains a = true → get? m a = some (getD m a fallback) :=

@@ -180,9 +180,9 @@ Uses the `LawfulBEq` instance to cast the retrieved value to the correct type.
  else default -- will never happen for well-formed inputs

/-- Removes the mapping for the given key if it exists. -/
@[inline] def erase [BEq α] [Hashable α] (m : Raw α β) (a : α) : Raw α β :=
@[inline] def remove [BEq α] [Hashable α] (m : Raw α β) (a : α) : Raw α β :=
  if h : 0 < m.buckets.size then
    Raw₀.erase ⟨m, h⟩ a
    Raw₀.remove ⟨m, h⟩ a
  else m -- will never happen for well-formed inputs

section
@@ -416,7 +416,7 @@ inductive WF : {α : Type u} → {β : α → Type v} → [BEq α] → [Hashable
  | containsThenInsertIfNew₀ {α β} [BEq α] [Hashable α] {m : Raw α β} {h a b} :
      WF m → WF (Raw₀.containsThenInsertIfNew ⟨m, h⟩ a b).2.1
  /-- Internal implementation detail of the hash map -/
  | erase₀ {α β} [BEq α] [Hashable α] {m : Raw α β} {h a} : WF m → WF (Raw₀.erase ⟨m, h⟩ a).1
  | remove₀ {α β} [BEq α] [Hashable α] {m : Raw α β} {h a} : WF m → WF (Raw₀.remove ⟨m, h⟩ a).1
  /-- Internal implementation detail of the hash map -/
  | insertIfNew₀ {α β} [BEq α] [Hashable α] {m : Raw α β} {h a b} :
      WF m → WF (Raw₀.insertIfNew ⟨m, h⟩ a b).1
@@ -436,7 +436,7 @@ theorem WF.size_buckets_pos [BEq α] [Hashable α] (m : Raw α β) : WF m → 0
  | insert₀ _ => (Raw₀.insert ⟨_, _⟩ _ _).2
  | containsThenInsert₀ _ => (Raw₀.containsThenInsert ⟨_, _⟩ _ _).2.2
  | containsThenInsertIfNew₀ _ => (Raw₀.containsThenInsertIfNew ⟨_, _⟩ _ _).2.2
  | erase₀ _ => (Raw₀.erase ⟨_, _⟩ _).2
  | remove₀ _ => (Raw₀.remove ⟨_, _⟩ _).2
  | insertIfNew₀ _ => (Raw₀.insertIfNew ⟨_, _⟩ _ _).2
  | getThenInsertIfNew?₀ _ => (Raw₀.getThenInsertIfNew? ⟨_, _⟩ _ _).2.2
  | filter₀ _ => (Raw₀.filter _ ⟨_, _⟩).2
@@ -460,8 +460,8 @@ theorem WF.containsThenInsertIfNew [BEq α] [Hashable α] {m : Raw α β} {a :
    (m.containsThenInsertIfNew a b).2.WF := by
  simpa [Raw.containsThenInsertIfNew, h.size_buckets_pos] using .containsThenInsertIfNew₀ h

theorem WF.erase [BEq α] [Hashable α] {m : Raw α β} {a : α} (h : m.WF) : (m.erase a).WF := by
  simpa [Raw.erase, h.size_buckets_pos] using .erase₀ h
theorem WF.remove [BEq α] [Hashable α] {m : Raw α β} {a : α} (h : m.WF) : (m.remove a).WF := by
  simpa [Raw.remove, h.size_buckets_pos] using .remove₀ h

theorem WF.insertIfNew [BEq α] [Hashable α] {m : Raw α β} {a : α} {b : β a} (h : m.WF) :
    (m.insertIfNew a b).WF := by

@@ -39,7 +39,7 @@ private def baseNames : Array Name :=
    ``getThenInsertIfNew?_snd_eq, ``getThenInsertIfNew?_snd_val,
    ``map_eq, ``map_val,
    ``filter_eq, ``filter_val,
    ``erase_eq, ``erase_val,
    ``remove_eq, ``remove_val,
    ``filterMap_eq, ``filterMap_val,
    ``Const.getThenInsertIfNew?_snd_eq, ``Const.getThenInsertIfNew?_snd_val,
    ``containsThenInsert_fst_eq, ``containsThenInsert_fst_val,
@@ -158,42 +158,42 @@ theorem size_le_size_insert [EquivBEq α] [LawfulHashable α] {k : α} {v : β k
  simp_to_raw using Raw₀.size_le_size_insert ⟨m, _⟩ h

@[simp]
theorem erase_empty {k : α} {c : Nat} : (empty c : Raw α β).erase k = empty c := by
  rw [erase_eq (by wf_trivial)]
  exact congrArg Subtype.val Raw₀.erase_empty
theorem remove_empty {k : α} {c : Nat} : (empty c : Raw α β).remove k = empty c := by
  rw [remove_eq (by wf_trivial)]
  exact congrArg Subtype.val Raw₀.remove_empty

@[simp]
theorem erase_emptyc {k : α} : (∅ : Raw α β).erase k = ∅ :=
  erase_empty
theorem remove_emptyc {k : α} : (∅ : Raw α β).remove k = ∅ :=
  remove_empty

@[simp]
theorem isEmpty_erase [EquivBEq α] [LawfulHashable α] {k : α} :
    (m.erase k).isEmpty = (m.isEmpty || (m.size == 1 && m.contains k)) := by
  simp_to_raw using Raw₀.isEmpty_erase
theorem isEmpty_remove [EquivBEq α] [LawfulHashable α] {k : α} :
    (m.remove k).isEmpty = (m.isEmpty || (m.size == 1 && m.contains k)) := by
  simp_to_raw using Raw₀.isEmpty_remove

@[simp]
theorem contains_erase [EquivBEq α] [LawfulHashable α] {k a : α} :
    (m.erase k).contains a = (!(a == k) && m.contains a) := by
  simp_to_raw using Raw₀.contains_erase
theorem contains_remove [EquivBEq α] [LawfulHashable α] {k a : α} :
    (m.remove k).contains a = (!(a == k) && m.contains a) := by
  simp_to_raw using Raw₀.contains_remove

@[simp]
theorem mem_erase [EquivBEq α] [LawfulHashable α] {k a : α} :
    a ∈ m.erase k ↔ (a == k) = false ∧ a ∈ m := by
  simp [mem_iff_contains, contains_erase h]
theorem mem_remove [EquivBEq α] [LawfulHashable α] {k a : α} :
    a ∈ m.remove k ↔ (a == k) = false ∧ a ∈ m := by
  simp [mem_iff_contains, contains_remove h]

theorem contains_of_contains_erase [EquivBEq α] [LawfulHashable α] {k a : α} :
    (m.erase k).contains a → m.contains a := by
  simp_to_raw using Raw₀.contains_of_contains_erase
theorem contains_of_contains_remove [EquivBEq α] [LawfulHashable α] {k a : α} :
    (m.remove k).contains a → m.contains a := by
  simp_to_raw using Raw₀.contains_of_contains_remove

theorem mem_of_mem_erase [EquivBEq α] [LawfulHashable α] {k a : α} : a ∈ m.erase k → a ∈ m := by
  simpa [mem_iff_contains] using contains_of_contains_erase h
theorem mem_of_mem_remove [EquivBEq α] [LawfulHashable α] {k a : α} : a ∈ m.remove k → a ∈ m := by
  simpa [mem_iff_contains] using contains_of_contains_remove h

theorem size_erase [EquivBEq α] [LawfulHashable α] {k : α} :
    (m.erase k).size = bif m.contains k then m.size - 1 else m.size := by
  simp_to_raw using Raw₀.size_erase
theorem size_remove [EquivBEq α] [LawfulHashable α] {k : α} :
    (m.remove k).size = bif m.contains k then m.size - 1 else m.size := by
  simp_to_raw using Raw₀.size_remove

theorem size_erase_le [EquivBEq α] [LawfulHashable α] {k : α} : (m.erase k).size ≤ m.size := by
  simp_to_raw using Raw₀.size_erase_le
theorem size_remove_le [EquivBEq α] [LawfulHashable α] {k : α} : (m.remove k).size ≤ m.size := by
  simp_to_raw using Raw₀.size_remove_le

@[simp]
theorem containsThenInsert_fst {k : α} {v : β k} : (m.containsThenInsert k v).1 = m.contains k := by
@@ -242,13 +242,13 @@ theorem get?_eq_none_of_contains_eq_false [LawfulBEq α] {a : α} :
theorem get?_eq_none [LawfulBEq α] {a : α} : ¬a ∈ m → m.get? a = none := by
  simpa [mem_iff_contains] using get?_eq_none_of_contains_eq_false h

theorem get?_erase [LawfulBEq α] {k a : α} :
    (m.erase k).get? a = bif a == k then none else m.get? a := by
  simp_to_raw using Raw₀.get?_erase
theorem get?_remove [LawfulBEq α] {k a : α} :
    (m.remove k).get? a = bif a == k then none else m.get? a := by
  simp_to_raw using Raw₀.get?_remove

@[simp]
theorem get?_erase_self [LawfulBEq α] {k : α} : (m.erase k).get? k = none := by
  simp_to_raw using Raw₀.get?_erase_self
theorem get?_remove_self [LawfulBEq α] {k : α} : (m.remove k).get? k = none := by
  simp_to_raw using Raw₀.get?_remove_self

namespace Const

@@ -286,13 +286,13 @@ theorem get?_eq_none_of_contains_eq_false [EquivBEq α] [LawfulHashable α] {a :
theorem get?_eq_none [EquivBEq α] [LawfulHashable α] {a : α} : ¬a ∈ m → get? m a = none := by
  simpa [mem_iff_contains] using get?_eq_none_of_contains_eq_false h

theorem get?_erase [EquivBEq α] [LawfulHashable α] {k a : α} :
    Const.get? (m.erase k) a = bif a == k then none else get? m a := by
  simp_to_raw using Raw₀.Const.get?_erase
theorem get?_remove [EquivBEq α] [LawfulHashable α] {k a : α} :
    Const.get? (m.remove k) a = bif a == k then none else get? m a := by
  simp_to_raw using Raw₀.Const.get?_remove

@[simp]
theorem get?_erase_self [EquivBEq α] [LawfulHashable α] {k : α} : get? (m.erase k) k = none := by
  simp_to_raw using Raw₀.Const.get?_erase_self
theorem get?_remove_self [EquivBEq α] [LawfulHashable α] {k : α} : get? (m.remove k) k = none := by
  simp_to_raw using Raw₀.Const.get?_remove_self

theorem get?_eq_get? [LawfulBEq α] {a : α} : get? m a = m.get? a := by
  simp_to_raw using Raw₀.Const.get?_eq_get?
@@ -317,9 +317,9 @@ theorem get_insert_self [LawfulBEq α] {k : α} {v : β k} :
  simp_to_raw using Raw₀.get_insert_self ⟨m, _⟩

@[simp]
theorem get_erase [LawfulBEq α] {k a : α} {h'} :
    (m.erase a).get k h' = m.get k (mem_of_mem_erase h h') := by
  simp_to_raw using Raw₀.get_erase ⟨m, _⟩
theorem get_remove [LawfulBEq α] {k a : α} {h'} :
    (m.remove a).get k h' = m.get k (mem_of_mem_remove h h') := by
  simp_to_raw using Raw₀.get_remove ⟨m, _⟩

theorem get?_eq_some_get [LawfulBEq α] {a : α} {h} : m.get? a = some (m.get a h) := by
  simp_to_raw using Raw₀.get?_eq_some_get
@@ -339,9 +339,9 @@ theorem get_insert_self [EquivBEq α] [LawfulHashable α] {k : α} {v : β} :
  simp_to_raw using Raw₀.Const.get_insert_self ⟨m, _⟩

@[simp]
theorem get_erase [EquivBEq α] [LawfulHashable α] {k a : α} {h'} :
    get (m.erase k) a h' = get m a (mem_of_mem_erase h h') := by
  simp_to_raw using Raw₀.Const.get_erase ⟨m, _⟩
theorem get_remove [EquivBEq α] [LawfulHashable α] {k a : α} {h'} :
    get (m.remove k) a h' = get m a (mem_of_mem_remove h h') := by
  simp_to_raw using Raw₀.Const.get_remove ⟨m, _⟩

theorem get?_eq_some_get [EquivBEq α] [LawfulHashable α] {a : α} {h : a ∈ m} :
    get? m a = some (get m a h) := by
@@ -387,14 +387,14 @@ theorem get!_eq_default [LawfulBEq α] {a : α} [Inhabited (β a)] :
    ¬a ∈ m → m.get! a = default := by
  simpa [mem_iff_contains] using get!_eq_default_of_contains_eq_false h

theorem get!_erase [LawfulBEq α] {k a : α} [Inhabited (β a)] :
    (m.erase k).get! a = bif a == k then default else m.get! a := by
  simp_to_raw using Raw₀.get!_erase
theorem get!_remove [LawfulBEq α] {k a : α} [Inhabited (β a)] :
    (m.remove k).get! a = bif a == k then default else m.get! a := by
  simp_to_raw using Raw₀.get!_remove

@[simp]
theorem get!_erase_self [LawfulBEq α] {k : α} [Inhabited (β k)] :
    (m.erase k).get! k = default := by
  simp_to_raw using Raw₀.get!_erase_self
theorem get!_remove_self [LawfulBEq α] {k : α} [Inhabited (β k)] :
    (m.remove k).get! k = default := by
  simp_to_raw using Raw₀.get!_remove_self

theorem get?_eq_some_get!_of_contains [LawfulBEq α] {a : α} [Inhabited (β a)] :
    m.contains a = true → m.get? a = some (m.get! a) := by
@@ -445,14 +445,14 @@ theorem get!_eq_default [EquivBEq α] [LawfulHashable α] [Inhabited β] {a : α
    ¬a ∈ m → get! m a = default := by
  simpa [mem_iff_contains] using get!_eq_default_of_contains_eq_false h

theorem get!_erase [EquivBEq α] [LawfulHashable α] [Inhabited β] {k a : α} :
    get! (m.erase k) a = bif a == k then default else get! m a := by
  simp_to_raw using Raw₀.Const.get!_erase
theorem get!_remove [EquivBEq α] [LawfulHashable α] [Inhabited β] {k a : α} :
    get! (m.remove k) a = bif a == k then default else get! m a := by
  simp_to_raw using Raw₀.Const.get!_remove

@[simp]
theorem get!_erase_self [EquivBEq α] [LawfulHashable α] [Inhabited β] {k : α} :
    get! (m.erase k) k = default := by
  simp_to_raw using Raw₀.Const.get!_erase_self
theorem get!_remove_self [EquivBEq α] [LawfulHashable α] [Inhabited β] {k : α} :
    get! (m.remove k) k = default := by
  simp_to_raw using Raw₀.Const.get!_remove_self

theorem get?_eq_some_get!_of_contains [EquivBEq α] [LawfulHashable α] [Inhabited β] {a : α} :
    m.contains a = true → get? m a = some (get! m a) := by
@@ -512,14 +512,14 @@ theorem getD_eq_fallback [LawfulBEq α] {a : α} {fallback : β a} :
    ¬a ∈ m → m.getD a fallback = fallback := by
  simpa [mem_iff_contains] using getD_eq_fallback_of_contains_eq_false h

theorem getD_erase [LawfulBEq α] {k a : α} {fallback : β a} :
    (m.erase k).getD a fallback = bif a == k then fallback else m.getD a fallback := by
  simp_to_raw using Raw₀.getD_erase
theorem getD_remove [LawfulBEq α] {k a : α} {fallback : β a} :
    (m.remove k).getD a fallback = bif a == k then fallback else m.getD a fallback := by
  simp_to_raw using Raw₀.getD_remove

@[simp]
theorem getD_erase_self [LawfulBEq α] {k : α} {fallback : β k} :
    (m.erase k).getD k fallback = fallback := by
  simp_to_raw using Raw₀.getD_erase_self
theorem getD_remove_self [LawfulBEq α] {k : α} {fallback : β k} :
    (m.remove k).getD k fallback = fallback := by
  simp_to_raw using Raw₀.getD_remove_self

theorem get?_eq_some_getD_of_contains [LawfulBEq α] {a : α} {fallback : β a} :
    m.contains a = true → m.get? a = some (m.getD a fallback) := by
@@ -575,14 +575,14 @@ theorem getD_eq_fallback [EquivBEq α] [LawfulHashable α] {a : α} {fallback :
    ¬a ∈ m → getD m a fallback = fallback := by
  simpa [mem_iff_contains] using getD_eq_fallback_of_contains_eq_false h

theorem getD_erase [EquivBEq α] [LawfulHashable α] {k a : α} {fallback : β} :
    getD (m.erase k) a fallback = bif a == k then fallback else getD m a fallback := by
  simp_to_raw using Raw₀.Const.getD_erase
theorem getD_remove [EquivBEq α] [LawfulHashable α] {k a : α} {fallback : β} :
    getD (m.remove k) a fallback = bif a == k then fallback else getD m a fallback := by
  simp_to_raw using Raw₀.Const.getD_remove

@[simp]
theorem getD_erase_self [EquivBEq α] [LawfulHashable α] {k : α} {fallback : β} :
    getD (m.erase k) k fallback = fallback := by
  simp_to_raw using Raw₀.Const.getD_erase_self
theorem getD_remove_self [EquivBEq α] [LawfulHashable α] {k : α} {fallback : β} :
    getD (m.remove k) k fallback = fallback := by
  simp_to_raw using Raw₀.Const.getD_remove_self

theorem get?_eq_some_getD_of_contains [EquivBEq α] [LawfulHashable α] {a : α} {fallback : β} :
    m.contains a = true → get? m a = some (getD m a fallback) := by

@@ -145,9 +145,9 @@ instance [BEq α] [Hashable α] : GetElem? (HashMap α β) α β (fun m a => a
|
||||
getElem? m a := m.get? a
|
||||
getElem! m a := m.get! a
|
||||
|
||||
@[inline, inherit_doc DHashMap.erase] def erase [BEq α] [Hashable α] (m : HashMap α β) (a : α) :
|
||||
@[inline, inherit_doc DHashMap.remove] def remove [BEq α] [Hashable α] (m : HashMap α β) (a : α) :
|
||||
HashMap α β :=
|
||||
⟨m.inner.erase a⟩
|
||||
⟨m.inner.remove a⟩
|
||||
|
||||
@[inline, inherit_doc DHashMap.size] def size [BEq α] [Hashable α] (m : HashMap α β) : Nat :=
|
||||
m.inner.size
|
||||
|
||||
@@ -118,41 +118,41 @@ theorem size_le_size_insert [EquivBEq α] [LawfulHashable α] {k : α} {v : β}
|
||||
DHashMap.size_le_size_insert
|
||||
|
||||
@[simp]
|
||||
theorem erase_empty {a : α} {c : Nat} : (empty c : HashMap α β).erase a = empty c :=
|
||||
ext DHashMap.erase_empty
|
||||
theorem remove_empty {a : α} {c : Nat} : (empty c : HashMap α β).remove a = empty c :=
  ext DHashMap.remove_empty

@[simp]
theorem erase_emptyc {a : α} : (∅ : HashMap α β).erase a = ∅ :=
  ext DHashMap.erase_emptyc
theorem remove_emptyc {a : α} : (∅ : HashMap α β).remove a = ∅ :=
  ext DHashMap.remove_emptyc

@[simp]
theorem isEmpty_erase [EquivBEq α] [LawfulHashable α] {k : α} :
    (m.erase k).isEmpty = (m.isEmpty || (m.size == 1 && m.contains k)) :=
  DHashMap.isEmpty_erase
theorem isEmpty_remove [EquivBEq α] [LawfulHashable α] {k : α} :
    (m.remove k).isEmpty = (m.isEmpty || (m.size == 1 && m.contains k)) :=
  DHashMap.isEmpty_remove

@[simp]
theorem contains_erase [EquivBEq α] [LawfulHashable α] {k a : α} :
    (m.erase k).contains a = (!(a == k) && m.contains a) :=
  DHashMap.contains_erase
theorem contains_remove [EquivBEq α] [LawfulHashable α] {k a : α} :
    (m.remove k).contains a = (!(a == k) && m.contains a) :=
  DHashMap.contains_remove

@[simp]
theorem mem_erase [EquivBEq α] [LawfulHashable α] {k a : α} :
    a ∈ m.erase k ↔ (a == k) = false ∧ a ∈ m :=
  DHashMap.mem_erase
theorem mem_remove [EquivBEq α] [LawfulHashable α] {k a : α} :
    a ∈ m.remove k ↔ (a == k) = false ∧ a ∈ m :=
  DHashMap.mem_remove

theorem contains_of_contains_erase [EquivBEq α] [LawfulHashable α] {k a : α} :
    (m.erase k).contains a → m.contains a :=
  DHashMap.contains_of_contains_erase
theorem contains_of_contains_remove [EquivBEq α] [LawfulHashable α] {k a : α} :
    (m.remove k).contains a → m.contains a :=
  DHashMap.contains_of_contains_remove

theorem mem_of_mem_erase [EquivBEq α] [LawfulHashable α] {k a : α} : a ∈ m.erase k → a ∈ m :=
  DHashMap.mem_of_mem_erase
theorem mem_of_mem_remove [EquivBEq α] [LawfulHashable α] {k a : α} : a ∈ m.remove k → a ∈ m :=
  DHashMap.mem_of_mem_remove

theorem size_erase [EquivBEq α] [LawfulHashable α] {k : α} :
    (m.erase k).size = bif m.contains k then m.size - 1 else m.size :=
  DHashMap.size_erase
theorem size_remove [EquivBEq α] [LawfulHashable α] {k : α} :
    (m.remove k).size = bif m.contains k then m.size - 1 else m.size :=
  DHashMap.size_remove

theorem size_erase_le [EquivBEq α] [LawfulHashable α] {k : α} : (m.erase k).size ≤ m.size :=
  DHashMap.size_erase_le
theorem size_remove_le [EquivBEq α] [LawfulHashable α] {k : α} : (m.remove k).size ≤ m.size :=
  DHashMap.size_remove_le

@[simp]
theorem containsThenInsert_fst {k : α} {v : β} : (m.containsThenInsert k v).1 = m.contains k :=
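Not part of the diff: a hedged sketch of how the erase lemmas above are consumed, assuming the `Std.HashMap` names shown. The forward direction of `mem_erase` is exactly what `mem_of_mem_erase` packages:

```lean
-- Sketch only (assumes the lemma names from the hunk above).
example [BEq α] [Hashable α] [EquivBEq α] [LawfulHashable α]
    {m : Std.HashMap α β} {k a : α} (h : a ∈ m.erase k) : a ∈ m :=
  -- `mem_erase : a ∈ m.erase k ↔ (a == k) = false ∧ a ∈ m`
  (Std.HashMap.mem_erase.mp h).2
```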
@@ -204,13 +204,13 @@ theorem getElem?_eq_none_of_contains_eq_false [EquivBEq α] [LawfulHashable α]
theorem getElem?_eq_none [EquivBEq α] [LawfulHashable α] {a : α} : ¬a ∈ m → m[a]? = none :=
  DHashMap.Const.get?_eq_none

theorem getElem?_erase [EquivBEq α] [LawfulHashable α] {k a : α} :
    (m.erase k)[a]? = bif a == k then none else m[a]? :=
  DHashMap.Const.get?_erase
theorem getElem?_remove [EquivBEq α] [LawfulHashable α] {k a : α} :
    (m.remove k)[a]? = bif a == k then none else m[a]? :=
  DHashMap.Const.get?_remove

@[simp]
theorem getElem?_erase_self [EquivBEq α] [LawfulHashable α] {k : α} : (m.erase k)[k]? = none :=
  DHashMap.Const.get?_erase_self
theorem getElem?_remove_self [EquivBEq α] [LawfulHashable α] {k : α} : (m.remove k)[k]? = none :=
  DHashMap.Const.get?_remove_self

theorem getElem?_congr [EquivBEq α] [LawfulHashable α] {a b : α} (hab : a == b) : m[a]? = m[b]? :=
  DHashMap.Const.get?_congr hab
@@ -226,9 +226,9 @@ theorem getElem_insert_self [EquivBEq α] [LawfulHashable α] {k : α} {v : β}
  DHashMap.Const.get_insert_self

@[simp]
theorem getElem_erase [EquivBEq α] [LawfulHashable α] {k a : α} {h'} :
    (m.erase k)[a]'h' = m[a]'(mem_of_mem_erase h') :=
  DHashMap.Const.get_erase (h' := h')
theorem getElem_remove [EquivBEq α] [LawfulHashable α] {k a : α} {h'} :
    (m.remove k)[a]'h' = m[a]'(mem_of_mem_remove h') :=
  DHashMap.Const.get_remove (h' := h')

theorem getElem?_eq_some_getElem [EquivBEq α] [LawfulHashable α] {a : α} {h' : a ∈ m} :
    m[a]? = some (m[a]'h') :=
@@ -267,14 +267,14 @@ theorem getElem!_eq_default [EquivBEq α] [LawfulHashable α] [Inhabited β] {a
    ¬a ∈ m → m[a]! = default :=
  DHashMap.Const.get!_eq_default

theorem getElem!_erase [EquivBEq α] [LawfulHashable α] [Inhabited β] {k a : α} :
    (m.erase k)[a]! = bif a == k then default else m[a]! :=
  DHashMap.Const.get!_erase
theorem getElem!_remove [EquivBEq α] [LawfulHashable α] [Inhabited β] {k a : α} :
    (m.remove k)[a]! = bif a == k then default else m[a]! :=
  DHashMap.Const.get!_remove

@[simp]
theorem getElem!_erase_self [EquivBEq α] [LawfulHashable α] [Inhabited β] {k : α} :
    (m.erase k)[k]! = default :=
  DHashMap.Const.get!_erase_self
theorem getElem!_remove_self [EquivBEq α] [LawfulHashable α] [Inhabited β] {k : α} :
    (m.remove k)[k]! = default :=
  DHashMap.Const.get!_remove_self

theorem getElem?_eq_some_getElem!_of_contains [EquivBEq α] [LawfulHashable α] [Inhabited β]
    {a : α} : m.contains a = true → m[a]? = some m[a]! :=
@@ -326,14 +326,14 @@ theorem getD_eq_fallback [EquivBEq α] [LawfulHashable α] {a : α} {fallback :
    ¬a ∈ m → m.getD a fallback = fallback :=
  DHashMap.Const.getD_eq_fallback

theorem getD_erase [EquivBEq α] [LawfulHashable α] {k a : α} {fallback : β} :
    (m.erase k).getD a fallback = bif a == k then fallback else m.getD a fallback :=
  DHashMap.Const.getD_erase
theorem getD_remove [EquivBEq α] [LawfulHashable α] {k a : α} {fallback : β} :
    (m.remove k).getD a fallback = bif a == k then fallback else m.getD a fallback :=
  DHashMap.Const.getD_remove

@[simp]
theorem getD_erase_self [EquivBEq α] [LawfulHashable α] {k : α} {fallback : β} :
    (m.erase k).getD k fallback = fallback :=
  DHashMap.Const.getD_erase_self
theorem getD_remove_self [EquivBEq α] [LawfulHashable α] {k : α} {fallback : β} :
    (m.remove k).getD k fallback = fallback :=
  DHashMap.Const.getD_remove_self

theorem getElem?_eq_some_getD_of_contains [EquivBEq α] [LawfulHashable α] {a : α} {fallback : β} :
    m.contains a = true → m[a]? = some (m.getD a fallback) :=
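Not part of the diff: a hedged sketch of the lookup lemmas above in use. Since `getD_erase_self` is tagged `@[simp]`, goals of this shape should close by `simp` (assuming the `Std.HashMap` names shown):

```lean
-- Sketch only: `getD` yields the fallback once the queried key is erased.
example [BEq α] [Hashable α] [EquivBEq α] [LawfulHashable α]
    {m : Std.HashMap α β} {k : α} {fallback : β} :
    (m.erase k).getD k fallback = fallback := by
  simp
```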
@@ -133,9 +133,9 @@ instance [BEq α] [Hashable α] : GetElem? (Raw α β) α β (fun m a => a ∈ m
  getElem? m a := m.get? a
  getElem! m a := m.get! a

@[inline, inherit_doc DHashMap.Raw.erase] def erase [BEq α] [Hashable α] (m : Raw α β)
@[inline, inherit_doc DHashMap.Raw.remove] def remove [BEq α] [Hashable α] (m : Raw α β)
    (a : α) : Raw α β :=
  ⟨m.inner.erase a⟩
  ⟨m.inner.remove a⟩

@[inline, inherit_doc DHashMap.Raw.size] def size (m : Raw α β) : Nat :=
  m.inner.size
@@ -258,8 +258,8 @@ theorem WF.getThenInsertIfNew? [BEq α] [Hashable α] {m : Raw α β} {a : α} {
    (m.getThenInsertIfNew? a b).2.WF :=
  ⟨DHashMap.Raw.WF.Const.getThenInsertIfNew? h.out⟩

theorem WF.erase [BEq α] [Hashable α] {m : Raw α β} {a : α} (h : m.WF) : (m.erase a).WF :=
  ⟨DHashMap.Raw.WF.erase h.out⟩
theorem WF.remove [BEq α] [Hashable α] {m : Raw α β} {a : α} (h : m.WF) : (m.remove a).WF :=
  ⟨DHashMap.Raw.WF.remove h.out⟩

theorem WF.filter [BEq α] [Hashable α] {m : Raw α β} {f : α → β → Bool} (h : m.WF) :
    (m.filter f).WF :=
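Not part of the diff: a hedged sketch of how the `WF` lemmas above chain via dot notation (assuming the `Std.HashMap.Raw` names shown):

```lean
-- Sketch only: well-formedness is preserved across successive raw operations.
example [BEq α] [Hashable α] {m : Std.HashMap.Raw α β} {a b : α}
    (h : m.WF) : ((m.erase a).erase b).WF :=
  h.erase.erase
```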
@@ -117,41 +117,41 @@ theorem size_le_size_insert [EquivBEq α] [LawfulHashable α] {k : α} {v : β}
  DHashMap.Raw.size_le_size_insert h.out

@[simp]
theorem erase_empty {k : α} {c : Nat} : (empty c : Raw α β).erase k = empty c :=
  ext DHashMap.Raw.erase_empty
theorem remove_empty {k : α} {c : Nat} : (empty c : Raw α β).remove k = empty c :=
  ext DHashMap.Raw.remove_empty

@[simp]
theorem erase_emptyc {k : α} : (∅ : Raw α β).erase k = ∅ :=
  ext DHashMap.Raw.erase_emptyc
theorem remove_emptyc {k : α} : (∅ : Raw α β).remove k = ∅ :=
  ext DHashMap.Raw.remove_emptyc

@[simp]
theorem isEmpty_erase [EquivBEq α] [LawfulHashable α] {k : α} :
    (m.erase k).isEmpty = (m.isEmpty || (m.size == 1 && m.contains k)) :=
  DHashMap.Raw.isEmpty_erase h.out
theorem isEmpty_remove [EquivBEq α] [LawfulHashable α] {k : α} :
    (m.remove k).isEmpty = (m.isEmpty || (m.size == 1 && m.contains k)) :=
  DHashMap.Raw.isEmpty_remove h.out

@[simp]
theorem contains_erase [EquivBEq α] [LawfulHashable α] {k a : α} :
    (m.erase k).contains a = (!(a == k) && m.contains a) :=
  DHashMap.Raw.contains_erase h.out
theorem contains_remove [EquivBEq α] [LawfulHashable α] {k a : α} :
    (m.remove k).contains a = (!(a == k) && m.contains a) :=
  DHashMap.Raw.contains_remove h.out

@[simp]
theorem mem_erase [EquivBEq α] [LawfulHashable α] {k a : α} :
    a ∈ m.erase k ↔ (a == k) = false ∧ a ∈ m :=
  DHashMap.Raw.mem_erase h.out
theorem mem_remove [EquivBEq α] [LawfulHashable α] {k a : α} :
    a ∈ m.remove k ↔ (a == k) = false ∧ a ∈ m :=
  DHashMap.Raw.mem_remove h.out

theorem contains_of_contains_erase [EquivBEq α] [LawfulHashable α] {k a : α} :
    (m.erase k).contains a → m.contains a :=
  DHashMap.Raw.contains_of_contains_erase h.out
theorem contains_of_contains_remove [EquivBEq α] [LawfulHashable α] {k a : α} :
    (m.remove k).contains a → m.contains a :=
  DHashMap.Raw.contains_of_contains_remove h.out

theorem mem_of_mem_erase [EquivBEq α] [LawfulHashable α] {k a : α} : a ∈ m.erase k → a ∈ m :=
  DHashMap.Raw.mem_of_mem_erase h.out
theorem mem_of_mem_remove [EquivBEq α] [LawfulHashable α] {k a : α} : a ∈ m.remove k → a ∈ m :=
  DHashMap.Raw.mem_of_mem_remove h.out

theorem size_erase [EquivBEq α] [LawfulHashable α] {k : α} :
    (m.erase k).size = bif m.contains k then m.size - 1 else m.size :=
  DHashMap.Raw.size_erase h.out
theorem size_remove [EquivBEq α] [LawfulHashable α] {k : α} :
    (m.remove k).size = bif m.contains k then m.size - 1 else m.size :=
  DHashMap.Raw.size_remove h.out

theorem size_erase_le [EquivBEq α] [LawfulHashable α] {k : α} : (m.erase k).size ≤ m.size :=
  DHashMap.Raw.size_erase_le h.out
theorem size_remove_le [EquivBEq α] [LawfulHashable α] {k : α} : (m.remove k).size ≤ m.size :=
  DHashMap.Raw.size_remove_le h.out

@[simp]
theorem containsThenInsert_fst {k : α} {v : β} : (m.containsThenInsert k v).1 = m.contains k :=
@@ -203,13 +203,13 @@ theorem getElem?_eq_none_of_contains_eq_false [EquivBEq α] [LawfulHashable α]
theorem getElem?_eq_none [EquivBEq α] [LawfulHashable α] {a : α} : ¬a ∈ m → m[a]? = none :=
  DHashMap.Raw.Const.get?_eq_none h.out

theorem getElem?_erase [EquivBEq α] [LawfulHashable α] {k a : α} :
    (m.erase k)[a]? = bif a == k then none else m[a]? :=
  DHashMap.Raw.Const.get?_erase h.out
theorem getElem?_remove [EquivBEq α] [LawfulHashable α] {k a : α} :
    (m.remove k)[a]? = bif a == k then none else m[a]? :=
  DHashMap.Raw.Const.get?_remove h.out

@[simp]
theorem getElem?_erase_self [EquivBEq α] [LawfulHashable α] {k : α} : (m.erase k)[k]? = none :=
  DHashMap.Raw.Const.get?_erase_self h.out
theorem getElem?_remove_self [EquivBEq α] [LawfulHashable α] {k : α} : (m.remove k)[k]? = none :=
  DHashMap.Raw.Const.get?_remove_self h.out

theorem getElem?_congr [EquivBEq α] [LawfulHashable α] {a b : α} (hab : a == b) : m[a]? = m[b]? :=
  DHashMap.Raw.Const.get?_congr h.out hab
@@ -225,9 +225,9 @@ theorem getElem_insert_self [EquivBEq α] [LawfulHashable α] {k : α} {v : β}
  DHashMap.Raw.Const.get_insert_self h.out

@[simp]
theorem getElem_erase [EquivBEq α] [LawfulHashable α] {k a : α} {h'} :
    (m.erase k)[a]'h' = m[a]'(mem_of_mem_erase h h') :=
  DHashMap.Raw.Const.get_erase (h' := h') h.out
theorem getElem_remove [EquivBEq α] [LawfulHashable α] {k a : α} {h'} :
    (m.remove k)[a]'h' = m[a]'(mem_of_mem_remove h h') :=
  DHashMap.Raw.Const.get_remove (h' := h') h.out

theorem getElem?_eq_some_getElem [EquivBEq α] [LawfulHashable α] {a : α} {h' : a ∈ m} :
    m[a]? = some (m[a]'h') :=
@@ -266,14 +266,14 @@ theorem getElem!_eq_default [EquivBEq α] [LawfulHashable α] [Inhabited β] {a
    ¬a ∈ m → m[a]! = default :=
  DHashMap.Raw.Const.get!_eq_default h.out

theorem getElem!_erase [EquivBEq α] [LawfulHashable α] [Inhabited β] {k a : α} :
    (m.erase k)[a]! = bif a == k then default else m[a]! :=
  DHashMap.Raw.Const.get!_erase h.out
theorem getElem!_remove [EquivBEq α] [LawfulHashable α] [Inhabited β] {k a : α} :
    (m.remove k)[a]! = bif a == k then default else m[a]! :=
  DHashMap.Raw.Const.get!_remove h.out

@[simp]
theorem getElem!_erase_self [EquivBEq α] [LawfulHashable α] [Inhabited β] {k : α} :
    (m.erase k)[k]! = default :=
  DHashMap.Raw.Const.get!_erase_self h.out
theorem getElem!_remove_self [EquivBEq α] [LawfulHashable α] [Inhabited β] {k : α} :
    (m.remove k)[k]! = default :=
  DHashMap.Raw.Const.get!_remove_self h.out

theorem getElem?_eq_some_getElem!_of_contains [EquivBEq α] [LawfulHashable α] [Inhabited β]
    {a : α} : m.contains a = true → m[a]? = some m[a]! :=
@@ -324,14 +324,14 @@ theorem getD_eq_fallback [EquivBEq α] [LawfulHashable α] {a : α} {fallback :
    ¬a ∈ m → m.getD a fallback = fallback :=
  DHashMap.Raw.Const.getD_eq_fallback h.out

theorem getD_erase [EquivBEq α] [LawfulHashable α] {k a : α} {fallback : β} :
    (m.erase k).getD a fallback = bif a == k then fallback else m.getD a fallback :=
  DHashMap.Raw.Const.getD_erase h.out
theorem getD_remove [EquivBEq α] [LawfulHashable α] {k a : α} {fallback : β} :
    (m.remove k).getD a fallback = bif a == k then fallback else m.getD a fallback :=
  DHashMap.Raw.Const.getD_remove h.out

@[simp]
theorem getD_erase_self [EquivBEq α] [LawfulHashable α] {k : α} {fallback : β} :
    (m.erase k).getD k fallback = fallback :=
  DHashMap.Raw.Const.getD_erase_self h.out
theorem getD_remove_self [EquivBEq α] [LawfulHashable α] {k : α} {fallback : β} :
    (m.remove k).getD k fallback = fallback :=
  DHashMap.Raw.Const.getD_remove_self h.out

theorem getElem?_eq_some_getD_of_contains [EquivBEq α] [LawfulHashable α] {a : α} {fallback : β} :
    m.contains a = true → m[a]? = some (m.getD a fallback) :=
@@ -102,8 +102,8 @@ instance [BEq α] [Hashable α] {m : HashSet α} {a : α} : Decidable (a ∈ m)
  inferInstanceAs (Decidable (a ∈ m.inner))

/-- Removes the element if it exists. -/
@[inline] def erase [BEq α] [Hashable α] (m : HashSet α) (a : α) : HashSet α :=
  ⟨m.inner.erase a⟩
@[inline] def remove [BEq α] [Hashable α] (m : HashSet α) (a : α) : HashSet α :=
  ⟨m.inner.remove a⟩

/-- The number of elements present in the set -/
@[inline] def size [BEq α] [Hashable α] (m : HashSet α) : Nat :=

@@ -109,41 +109,41 @@ theorem size_le_size_insert [EquivBEq α] [LawfulHashable α] {k : α} : m.size
  HashMap.size_le_size_insertIfNew

@[simp]
theorem erase_empty {a : α} {c : Nat} : (empty c : HashSet α).erase a = empty c :=
  ext HashMap.erase_empty
theorem remove_empty {a : α} {c : Nat} : (empty c : HashSet α).remove a = empty c :=
  ext HashMap.remove_empty

@[simp]
theorem erase_emptyc {a : α} : (∅ : HashSet α).erase a = ∅ :=
  ext HashMap.erase_emptyc
theorem remove_emptyc {a : α} : (∅ : HashSet α).remove a = ∅ :=
  ext HashMap.remove_emptyc

@[simp]
theorem isEmpty_erase [EquivBEq α] [LawfulHashable α] {k : α} :
    (m.erase k).isEmpty = (m.isEmpty || (m.size == 1 && m.contains k)) :=
  HashMap.isEmpty_erase
theorem isEmpty_remove [EquivBEq α] [LawfulHashable α] {k : α} :
    (m.remove k).isEmpty = (m.isEmpty || (m.size == 1 && m.contains k)) :=
  HashMap.isEmpty_remove

@[simp]
theorem contains_erase [EquivBEq α] [LawfulHashable α] {k a : α} :
    (m.erase k).contains a = (!(a == k) && m.contains a) :=
  HashMap.contains_erase
theorem contains_remove [EquivBEq α] [LawfulHashable α] {k a : α} :
    (m.remove k).contains a = (!(a == k) && m.contains a) :=
  HashMap.contains_remove

@[simp]
theorem mem_erase [EquivBEq α] [LawfulHashable α] {k a : α} :
    a ∈ m.erase k ↔ (a == k) = false ∧ a ∈ m :=
  HashMap.mem_erase
theorem mem_remove [EquivBEq α] [LawfulHashable α] {k a : α} :
    a ∈ m.remove k ↔ (a == k) = false ∧ a ∈ m :=
  HashMap.mem_remove

theorem contains_of_contains_erase [EquivBEq α] [LawfulHashable α] {k a : α} :
    (m.erase k).contains a → m.contains a :=
  HashMap.contains_of_contains_erase
theorem contains_of_contains_remove [EquivBEq α] [LawfulHashable α] {k a : α} :
    (m.remove k).contains a → m.contains a :=
  HashMap.contains_of_contains_remove

theorem mem_of_mem_erase [EquivBEq α] [LawfulHashable α] {k a : α} : a ∈ m.erase k → a ∈ m :=
  HashMap.mem_of_mem_erase
theorem mem_of_mem_remove [EquivBEq α] [LawfulHashable α] {k a : α} : a ∈ m.remove k → a ∈ m :=
  HashMap.mem_of_mem_remove

theorem size_erase [EquivBEq α] [LawfulHashable α] {k : α} :
    (m.erase k).size = bif m.contains k then m.size - 1 else m.size :=
  HashMap.size_erase
theorem size_remove [EquivBEq α] [LawfulHashable α] {k : α} :
    (m.remove k).size = bif m.contains k then m.size - 1 else m.size :=
  HashMap.size_remove

theorem size_erase_le [EquivBEq α] [LawfulHashable α] {k : α} : (m.erase k).size ≤ m.size :=
  HashMap.size_erase_le
theorem size_remove_le [EquivBEq α] [LawfulHashable α] {k : α} : (m.remove k).size ≤ m.size :=
  HashMap.size_remove_le

@[simp]
theorem containsThenInsert_fst {k : α} : (m.containsThenInsert k).1 = m.contains k :=
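Not part of the diff: a hedged sketch showing that the `HashSet` lemmas above mirror the map-level API exactly (assuming the `Std.HashSet` names shown):

```lean
-- Sketch only: membership in a set survives erasing any key.
example [BEq α] [Hashable α] [EquivBEq α] [LawfulHashable α]
    {s : Std.HashSet α} {k a : α} (h : a ∈ s.erase k) : a ∈ s :=
  Std.HashSet.mem_of_mem_erase h
```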
@@ -103,8 +103,8 @@ instance [BEq α] [Hashable α] {m : Raw α} {a : α} : Decidable (a ∈ m) :=
  inferInstanceAs (Decidable (a ∈ m.inner))

/-- Removes the element if it exists. -/
@[inline] def erase [BEq α] [Hashable α] (m : Raw α) (a : α) : Raw α :=
  ⟨m.inner.erase a⟩
@[inline] def remove [BEq α] [Hashable α] (m : Raw α) (a : α) : Raw α :=
  ⟨m.inner.remove a⟩

/-- The number of elements present in the set -/
@[inline] def size (m : Raw α) : Nat :=
@@ -218,8 +218,8 @@ theorem WF.containsThenInsert [BEq α] [Hashable α] {m : Raw α} {a : α} (h :
    (m.containsThenInsert a).2.WF :=
  ⟨HashMap.Raw.WF.containsThenInsertIfNew h.out⟩

theorem WF.erase [BEq α] [Hashable α] {m : Raw α} {a : α} (h : m.WF) : (m.erase a).WF :=
  ⟨HashMap.Raw.WF.erase h.out⟩
theorem WF.remove [BEq α] [Hashable α] {m : Raw α} {a : α} (h : m.WF) : (m.remove a).WF :=
  ⟨HashMap.Raw.WF.remove h.out⟩

theorem WF.filter [BEq α] [Hashable α] {m : Raw α} {f : α → Bool} (h : m.WF) : (m.filter f).WF :=
  ⟨HashMap.Raw.WF.filter h.out⟩

@@ -109,41 +109,41 @@ theorem size_le_size_insert [EquivBEq α] [LawfulHashable α] {k : α} : m.size
  HashMap.Raw.size_le_size_insertIfNew h.out

@[simp]
theorem erase_empty {k : α} {c : Nat} : (empty c : Raw α).erase k = empty c :=
  ext HashMap.Raw.erase_empty
theorem remove_empty {k : α} {c : Nat} : (empty c : Raw α).remove k = empty c :=
  ext HashMap.Raw.remove_empty

@[simp]
theorem erase_emptyc {k : α} : (∅ : Raw α).erase k = ∅ :=
  ext HashMap.Raw.erase_emptyc
theorem remove_emptyc {k : α} : (∅ : Raw α).remove k = ∅ :=
  ext HashMap.Raw.remove_emptyc

@[simp]
theorem isEmpty_erase [EquivBEq α] [LawfulHashable α] {k : α} :
    (m.erase k).isEmpty = (m.isEmpty || (m.size == 1 && m.contains k)) :=
  HashMap.Raw.isEmpty_erase h.out
theorem isEmpty_remove [EquivBEq α] [LawfulHashable α] {k : α} :
    (m.remove k).isEmpty = (m.isEmpty || (m.size == 1 && m.contains k)) :=
  HashMap.Raw.isEmpty_remove h.out

@[simp]
theorem contains_erase [EquivBEq α] [LawfulHashable α] {k a : α} :
    (m.erase k).contains a = (!(a == k) && m.contains a) :=
  HashMap.Raw.contains_erase h.out
theorem contains_remove [EquivBEq α] [LawfulHashable α] {k a : α} :
    (m.remove k).contains a = (!(a == k) && m.contains a) :=
  HashMap.Raw.contains_remove h.out

@[simp]
theorem mem_erase [EquivBEq α] [LawfulHashable α] {k a : α} :
    a ∈ m.erase k ↔ (a == k) = false ∧ a ∈ m :=
  HashMap.Raw.mem_erase h.out
theorem mem_remove [EquivBEq α] [LawfulHashable α] {k a : α} :
    a ∈ m.remove k ↔ (a == k) = false ∧ a ∈ m :=
  HashMap.Raw.mem_remove h.out

theorem contains_of_contains_erase [EquivBEq α] [LawfulHashable α] {k a : α} :
    (m.erase k).contains a → m.contains a :=
  HashMap.Raw.contains_of_contains_erase h.out
theorem contains_of_contains_remove [EquivBEq α] [LawfulHashable α] {k a : α} :
    (m.remove k).contains a → m.contains a :=
  HashMap.Raw.contains_of_contains_remove h.out

theorem mem_of_mem_erase [EquivBEq α] [LawfulHashable α] {k a : α} : a ∈ m.erase k → a ∈ m :=
  HashMap.Raw.mem_of_mem_erase h.out
theorem mem_of_mem_remove [EquivBEq α] [LawfulHashable α] {k a : α} : a ∈ m.remove k → a ∈ m :=
  HashMap.Raw.mem_of_mem_remove h.out

theorem size_erase [EquivBEq α] [LawfulHashable α] {k : α} :
    (m.erase k).size = bif m.contains k then m.size - 1 else m.size :=
  HashMap.Raw.size_erase h.out
theorem size_remove [EquivBEq α] [LawfulHashable α] {k : α} :
    (m.remove k).size = bif m.contains k then m.size - 1 else m.size :=
  HashMap.Raw.size_remove h.out

theorem size_erase_le [EquivBEq α] [LawfulHashable α] {k : α} : (m.erase k).size ≤ m.size :=
  HashMap.Raw.size_erase_le h.out
theorem size_remove_le [EquivBEq α] [LawfulHashable α] {k : α} : (m.remove k).size ≤ m.size :=
  HashMap.Raw.size_remove_le h.out

@[simp]
theorem containsThenInsert_fst {k : α} : (m.containsThenInsert k).1 = m.contains k :=
@@ -448,6 +448,9 @@ static inline void lean_inc(lean_object * o) { if (!lean_is_scalar(o)) lean_inc_
static inline void lean_inc_n(lean_object * o, size_t n) { if (!lean_is_scalar(o)) lean_inc_ref_n(o, n); }
static inline void lean_dec(lean_object * o) { if (!lean_is_scalar(o)) lean_dec_ref(o); }

/* Just free memory */
LEAN_EXPORT void lean_dealloc(lean_object * o);

static inline bool lean_is_ctor(lean_object * o) { return lean_ptr_tag(o) <= LeanMaxCtorTag; }
static inline bool lean_is_closure(lean_object * o) { return lean_ptr_tag(o) == LeanClosure; }
static inline bool lean_is_array(lean_object * o) { return lean_ptr_tag(o) == LeanArray; }
@@ -51,24 +51,8 @@ Whether the build should show progress information.
def BuildConfig.showProgress (cfg : BuildConfig) : Bool :=
  (cfg.noBuild ∧ cfg.verbosity == .verbose) ∨ cfg.verbosity != .quiet

/-- The core structure of a Lake job. -/
structure JobCore (α : Type u) where
  /-- The Lean `Task` object for the job. -/
  task : α
  /--
  A caption for the job in Lake's build monitor.
  Will be formatted like `✔ [3/5] Ran <caption>`.
  -/
  caption : String
  /-- Whether this job is optional: its failure does not cause the whole build to fail. -/
  optional : Bool := false
  deriving Inhabited

/-- A Lake job task with an opaque value type in `Type`. -/
declare_opaque_type OpaqueJobTask

/-- A Lake job with an opaque value type in `Type`. -/
abbrev OpaqueJob := JobCore OpaqueJobTask
declare_opaque_type OpaqueJob

/-- A Lake context with a build configuration and additional build data. -/
structure BuildContext extends BuildConfig, Context where
@@ -118,15 +118,12 @@ module_data o.noexport : BuildJob FilePath

/--
A package's *optional* cloud build release.
Will not cause the whole build to fail if the release cannot be fetched.
Does not fail if the release cannot be fetched.
-/
abbrev Package.optReleaseFacet := `optRelease
package_data optRelease : BuildJob Bool

/--
A package's cloud build release.
Will cause the whole build to fail if the release cannot be fetched.
-/
/-- A package's cloud build release. -/
abbrev Package.releaseFacet := `release
package_data release : BuildJob Unit

@@ -104,8 +104,8 @@ def ensureJob (x : FetchM (Job α))
Registers the job for the top-level build monitor
(e.g., the Lake CLI progress UI), assigning it `caption`.
-/
def registerJob (caption : String) (job : Job α) (optional := false) : FetchM (Job α) := do
  let job : Job α := {job with caption, optional}
def registerJob (caption : String) (job : Job α) : FetchM (Job α) := do
  let job := job.setCaption caption
  (← getBuildContext).registeredJobs.modify (·.push job)
  return job.renew

@@ -116,10 +116,10 @@ Registers the produced job for the top-level build monitor
Stray I/O, logs, and errors produced by `x` will be wrapped into the job.
-/
def withRegisterJob
    (caption : String) (x : FetchM (Job α)) (optional := false)
    (caption : String) (x : FetchM (Job α))
    : FetchM (Job α) := do
  let job ← ensureJob x
  registerJob caption job optional
  registerJob caption job

/--
Registers the produced job for the top-level build monitor
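Not part of the diff: a hedged sketch of how a facet body would use `withRegisterJob` (the name `Package.myDemoJob` is invented for illustration; the fetch call mirrors the `extraDep.fetch` usage that appears later in this diff):

```lean
-- Hypothetical sketch: `withRegisterJob` runs the fetch, captions the
-- resulting job, and registers it with the build monitor via `registerJob`.
def Package.myDemoJob (self : Package) : FetchM (BuildJob Unit) :=
  withRegisterJob s!"{self.name}:demo" do
    self.extraDep.fetch
```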
@@ -101,32 +101,33 @@ abbrev SpawnM := BuildT BaseIO
@[deprecated SpawnM (since := "2024-05-21")] abbrev SchedulerM := SpawnM

/-- A Lake job. -/
abbrev Job (α : Type u) := JobCore (JobTask α)

structure BundledJobTask where
  {Result : Type u}
  task : JobTask Result
structure Job (α : Type u) where
  task : JobTask α
  caption : String
  deriving Inhabited

instance : CoeOut (JobTask α) BundledJobTask := ⟨.mk⟩
structure BundledJob where
  {type : Type u}
  job : Job type
  deriving Inhabited

hydrate_opaque_type OpaqueJobTask BundledJobTask
instance : CoeOut (Job α) BundledJob := ⟨.mk⟩

abbrev OpaqueJob.Result (job : OpaqueJob) : Type :=
  job.task.get.Result
hydrate_opaque_type OpaqueJob BundledJob

nonrec abbrev OpaqueJob.task (job : OpaqueJob) : JobTask job.Result :=
  job.task.get.task
abbrev OpaqueJob.type (job : OpaqueJob) : Type :=
  job.get.type

abbrev OpaqueJob.ofJob (job : Job α) : OpaqueJob :=
  {job with task := job.task}
abbrev OpaqueJob.toJob (job : OpaqueJob) : Job job.type :=
  job.get.job

instance : CoeOut (Job α) OpaqueJob := ⟨.ofJob⟩
abbrev OpaqueJob.task (job : OpaqueJob) : JobTask job.type :=
  job.toJob.task

abbrev OpaqueJob.toJob (job : OpaqueJob) : Job job.Result :=
  {job with task := job.task}
abbrev OpaqueJob.caption (job : OpaqueJob) : String :=
  job.toJob.caption

instance : CoeDep OpaqueJob job (Job job.Result) := ⟨job.toJob⟩
instance : CoeDep OpaqueJob job (Job job.type) := ⟨job.toJob⟩

namespace Job
@@ -24,27 +24,23 @@ def Package.recComputeDeps (self : Package) : FetchM (Array Package) := do
|
||||
def Package.depsFacetConfig : PackageFacetConfig depsFacet :=
|
||||
mkFacetConfig Package.recComputeDeps
|
||||
|
||||
/-- Tries to download and unpack the package's prebuilt release archive (from GitHub). -/
|
||||
def Package.fetchOptRelease (self : Package) : FetchM (BuildJob Unit) := do
|
||||
(← self.optRelease.fetch).bindSync fun success t => do
|
||||
unless success do
|
||||
logWarning s!"building from source; \
|
||||
failed to fetch cloud release (see '{self.name}:optRelease' for details)"
|
||||
return ((), t)
|
||||
|
||||
/--
|
||||
Build the `extraDepTargets` for the package and its transitive dependencies.
|
||||
Also fetch pre-built releases for the package's' dependencies.
|
||||
-/
|
||||
def Package.recBuildExtraDepTargets (self : Package) : FetchM (BuildJob Unit) :=
|
||||
withRegisterJob s!"{self.name}:extraDep" do
|
||||
def Package.recBuildExtraDepTargets (self : Package) : FetchM (BuildJob Unit) := do
|
||||
let mut job := BuildJob.nil
|
||||
-- Build dependencies' extra dep targets
|
||||
for dep in self.deps do
|
||||
job := job.mix <| ← dep.extraDep.fetch
|
||||
-- Fetch pre-built release if desired and this package is a dependency
|
||||
if self.name ≠ (← getWorkspace).root.name ∧ self.preferReleaseBuild then
|
||||
job := job.add <| ← self.fetchOptRelease
|
||||
job := job.add <| ←
|
||||
withRegisterJob s!"{self.name}:optRelease" do
|
||||
(← self.optRelease.fetch).bindSync fun success t => do
|
||||
unless success do
|
||||
logWarning "failed to fetch cloud release; falling back to local build"
|
||||
return ((), t)
|
||||
-- Build this package's extra dep targets
|
||||
for target in self.extraDepTargets do
|
||||
job := job.mix <| ← self.fetchTargetJob target
|
||||
@@ -54,19 +50,17 @@ def Package.recBuildExtraDepTargets (self : Package) : FetchM (BuildJob Unit) :=
|
||||
def Package.extraDepFacetConfig : PackageFacetConfig extraDepFacet :=
|
||||
mkFacetJobConfig Package.recBuildExtraDepTargets
|
||||
|
||||
/-- Tries to download and unpack the package's prebuilt release archive (from GitHub). -/
|
||||
def Package.fetchOptReleaseCore (self : Package) : FetchM (BuildJob Bool) :=
|
||||
withRegisterJob s!"{self.name}:optRelease" (optional := true) <| Job.async do
|
||||
/-- Download and unpack the package's prebuilt release archive (from GitHub). -/
|
||||
def Package.fetchOptRelease (self : Package) : FetchM (BuildJob Bool) := Job.async do
|
||||
let repo := GitRepo.mk self.dir
|
||||
let repoUrl? := self.releaseRepo? <|> self.remoteUrl?
|
||||
let some repoUrl := repoUrl? <|> (← repo.getFilteredRemoteUrl?)
|
||||
| logError s!"release repository URL not known; \
|
||||
the package may need to set 'releaseRepo'"
|
||||
| logInfo s!"{self.name}: wanted prebuilt release, \
|
||||
but repository URL not known; the package may need to set 'releaseRepo'"
|
||||
updateAction .fetch
|
||||
return (false, .nil)
|
||||
let some tag ← repo.findTag?
|
||||
| let rev ← if let some rev ← repo.getHeadRevision? then pure s!" '{rev}'" else pure ""
|
||||
logError s!"no release tag found for revision{rev}"
|
||||
| logInfo s!"{self.name}: wanted prebuilt release, but no tag found for revision"
|
||||
updateAction .fetch
|
||||
return (false, .nil)
let url := s!"{repoUrl}/releases/download/{tag}/{self.buildArchive}"

@@ -83,7 +77,7 @@ def Package.fetchOptReleaseCore (self : Package) : FetchM (BuildJob Bool) :=
/-- The `PackageFacetConfig` for the builtin `optReleaseFacet`. -/
def Package.optReleaseFacetConfig : PackageFacetConfig optReleaseFacet :=
  mkFacetJobConfig (·.fetchOptReleaseCore)
  mkFacetJobConfig (·.fetchOptRelease)

/-- The `PackageFacetConfig` for the builtin `releaseFacet`. -/
def Package.releaseFacetConfig : PackageFacetConfig releaseFacet :=

@@ -91,20 +85,20 @@ def Package.releaseFacetConfig : PackageFacetConfig releaseFacet :=
  withRegisterJob s!"{pkg.name}:release" do
  (← pkg.optRelease.fetch).bindSync fun success t => do
    unless success do
      error s!"failed to fetch cloud release (see '{pkg.name}:optRelease' for details)"
      error "failed to fetch cloud release"
    return ((), t)

/-- Perform a build job after first checking for an (optional) cloud release for the package. -/
/-- Perform a build job after first checking for a cloud release for the package. -/
def Package.afterReleaseAsync (self : Package) (build : SpawnM (Job α)) : FetchM (Job α) := do
  if self.preferReleaseBuild ∧ self.name ≠ (← getRootPackage).name then
    (← self.optRelease.fetch).bindAsync fun _ _ => build
    (← self.release.fetch).bindAsync fun _ _ => build
  else
    build

/-- Perform a build after first checking for an (optional) cloud release for the package. -/
/-- Perform a build after first checking for a cloud release for the package. -/
def Package.afterReleaseSync (self : Package) (build : JobM α) : FetchM (Job α) := do
  if self.preferReleaseBuild ∧ self.name ≠ (← getRootPackage).name then
    (← self.optRelease.fetch).bindSync fun _ _ => build
    (← self.release.fetch).bindSync fun _ _ => build
  else
    Job.async build
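The hunk above builds a release-archive URL of the form `{repoUrl}/releases/download/{tag}/{buildArchive}` and, in the `afterRelease*` helpers, falls back to a source build when the fetch fails. A standalone sketch of that fetch-or-fallback shape (not Lake itself; `repoUrl`, `tag`, and `buildArchive` are made-up illustration values):

```shell
# Hypothetical values standing in for a package's release configuration.
repoUrl="https://github.com/example-user/dep"
tag="v1.0.0"
buildArchive="release.tgz"
url="$repoUrl/releases/download/$tag/$buildArchive"
echo "$url"
# `curl -f` turns HTTP errors into a nonzero exit, so a missing release
# falls through to the warning branch instead of saving an error page.
if curl -fsSL --max-time 5 "$url" -o "$buildArchive" 2>/dev/null; then
  echo "fetched cloud release"
else
  echo "warning: failed to fetch cloud release; falling back to local build"
fi
```

The point of the sketch is the control flow: a failed download is a warning plus a local build, not a hard error, matching the `optRelease` semantics in the diff.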
@@ -99,18 +99,17 @@ def renderProgress (running unfinished : Array OpaqueJob) (h : 0 < unfinished.si
def reportJob (job : OpaqueJob) : MonitorM PUnit := do
  let {jobNo, ..} ← get
  let {totalJobs, failLv, outLv, out, useAnsi, showProgress, minAction, ..} ← read
  let {task, caption, optional} := job.toJob
  let {task, caption} := job.toJob
  let {log, action, ..} := task.get.state
  let maxLv := log.maxLv
  let failed := log.hasEntries ∧ maxLv ≥ failLv
  if failed ∧ ¬optional then
  if failed then
    modify fun s => {s with failures := s.failures.push caption}
  let hasOutput := failed ∨ (log.hasEntries ∧ maxLv ≥ outLv)
  if hasOutput ∨ (showProgress ∧ ¬ useAnsi ∧ action ≥ minAction) then
    let verb := action.verb failed
    let icon := if hasOutput then maxLv.icon else '✔'
    let opt := if optional then " (Optional)" else ""
    let caption := s!"{icon} [{jobNo}/{totalJobs}]{opt} {verb} {caption}"
    let caption := s!"{icon} [{jobNo}/{totalJobs}] {verb} {caption}"
    let caption :=
      if useAnsi then
        let color := if hasOutput then maxLv.ansiColor else "32"

@@ -236,7 +235,7 @@ def Workspace.runFetchM
      | error "top-level build failed"
    return a
  else
    print! out "Some required builds logged failures:\n"
    print! out "Some builds logged failures:\n"
    failures.forM (print! out s!"- {·}\n")
    flush out
    error "build failed"
@@ -47,6 +47,9 @@ def cwd : GitRepo := ⟨"."⟩
@[inline] def dirExists (repo : GitRepo) : BaseIO Bool :=
  repo.dir.isDir

@[inline] def captureGit (args : Array String) (repo : GitRepo) : LogIO String :=
  captureProc {cmd := "git", args, cwd := repo.dir}

@[inline] def captureGit? (args : Array String) (repo : GitRepo) : BaseIO (Option String) :=
  captureProc? {cmd := "git", args, cwd := repo.dir}

@@ -63,30 +66,31 @@ def cwd : GitRepo := ⟨"."⟩
  repo.execGit #["init", "-q"]

@[inline] def fetch (repo : GitRepo) (remote := Git.defaultRemote) : LogIO PUnit :=
  repo.execGit #["fetch", "--tags", "--force", remote]
  repo.execGit #["fetch", remote]

@[inline] def checkoutBranch (branch : String) (repo : GitRepo) : LogIO PUnit :=
  repo.execGit #["checkout", "-B", branch]

@[inline] def checkoutDetach (hash : String) (repo : GitRepo) : LogIO PUnit :=
  repo.execGit #["checkout", "--detach", hash, "--"]
  repo.execGit #["checkout", "--detach", hash]

@[inline] def resolveRevision? (rev : String) (repo : GitRepo) : BaseIO (Option String) := do
  repo.captureGit? #["rev-parse", "--verify", "--end-of-options", rev]
  repo.captureGit? #["rev-parse", "--verify", rev]

@[inline] def resolveRevision (rev : String) (repo : GitRepo) : LogIO String := do
  repo.captureGit #["rev-parse", "--verify", rev]

@[inline] def getHeadRevision (repo : GitRepo) : LogIO String :=
  repo.resolveRevision "HEAD"

@[inline] def getHeadRevision? (repo : GitRepo) : BaseIO (Option String) :=
  repo.resolveRevision? "HEAD"

def getHeadRevision (repo : GitRepo) : LogIO String := do
  if let some rev ← repo.getHeadRevision? then return rev
  error s!"{repo}: could not resolve 'HEAD' to a commit; \
    the repository may be corrupt, so you may need to remove it and try again"

def resolveRemoteRevision (rev : String) (remote := Git.defaultRemote) (repo : GitRepo) : LogIO String := do
  if Git.isFullObjectName rev then return rev
  if let some rev ← repo.resolveRevision? s!"{remote}/{rev}" then return rev
  if let some rev ← repo.resolveRevision? rev then return rev
  error s!"{repo}: revision not found '{rev}'"
  error s!"cannot find revision {rev} in repository {repo}"

def findRemoteRevision (repo : GitRepo) (rev? : Option String := none) (remote := Git.defaultRemote) : LogIO String := do
  repo.fetch remote; repo.resolveRemoteRevision (rev?.getD Git.upstreamBranch) remote
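The `GitRepo` helpers above wrap plain git invocations such as `rev-parse --verify --end-of-options <rev>`. A minimal demonstration of that command form in isolation (assumes a git new enough to know `--end-of-options`, roughly 2.24+; the repository path is a throwaway temp directory):

```shell
# Create a throwaway repository with one commit to resolve against.
repo="$(mktemp -d)"
git -C "$repo" init -q
git -C "$repo" -c user.name=test -c user.email=test@example.com \
  commit -q --allow-empty -m "initial commit"
# `--verify` makes an unresolvable revision a hard error (so callers get
# a clean success/failure signal), and `--end-of-options` prevents a
# revision name that happens to start with `-` from being parsed as a flag.
rev="$(git -C "$repo" rev-parse --verify --end-of-options HEAD)"
echo "${#rev}"
```

For a default (SHA-1) repository this prints `40`, the length of a full object name, which is what `resolveRevision?` captures.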
@@ -79,9 +79,10 @@ instance : MonadLift IO MainM := ⟨MonadError.runIO⟩
@[inline] def runLogIO (x : LogIO α)
  (minLv := LogLevel.info) (ansiMode := AnsiMode.auto) (out := OutStream.stderr)
: MainM α := do
  let logger ← out.getLogger minLv ansiMode
  match (← x {}) with
  | .ok a log => replay log (← out.getLogger minLv ansiMode); return a
  | .error _ log => replay log (← out.getLogger .trace ansiMode); exit 1
  | .ok a log => replay log logger; return a
  | .error _ log => replay log logger; exit 1
where
  -- avoid specialization of this call at each call site
  replay (log : Log) (logger : MonadLog BaseIO) : BaseIO Unit :=
@@ -2,7 +2,6 @@ import Lake
open Lake DSL

package test

require dep from git "dep" @ "release"
require dep from "dep"

lean_lib Test
@@ -19,51 +19,40 @@ git config user.name test
git config user.email test@example.com
git add --all
git commit -m "initial commit"
git tag release
INIT_REV=`git rev-parse release`
git commit --allow-empty -m "second commit"
popd

# Clone dependency
$LAKE update
# Remove the release tag from the local copy
git -C .lake/packages/dep tag -d release

# Test that a direct invocation of `lake build *:release` fails
REV_STR="'${INIT_REV}'"
($LAKE build dep:release && exit 1 || true) | diff -u --strip-trailing-cr <(cat << EOF
✖ [1/2] (Optional) Fetching dep:optRelease
error: no release tag found for revision ${REV_STR}
✖ [2/2] Running dep:release
error: failed to fetch cloud release (see 'dep:optRelease' for details)
Some required builds logged failures:
($LAKE build dep:release && exit 1 || true) | diff -u --strip-trailing-cr <(cat << 'EOF'
✖ [1/1] Fetching dep:release
info: dep: wanted prebuilt release, but no tag found for revision
error: failed to fetch cloud release
Some builds logged failures:
- dep:release
EOF
) -

# Test that an indirect fetch on the release does not cause the build to fail
$LAKE build Test | diff -u --strip-trailing-cr <(cat << EOF
✖ [1/5] (Optional) Fetching dep:optRelease
error: no release tag found for revision ${REV_STR}
⚠ [2/5] Ran dep:extraDep
warning: building from source; failed to fetch cloud release (see 'dep:optRelease' for details)
✔ [4/5] Built Test
$LAKE build Test | diff -u --strip-trailing-cr <(cat << 'EOF'
⚠ [1/3] Fetched dep:optRelease
info: dep: wanted prebuilt release, but no tag found for revision
warning: failed to fetch cloud release; falling back to local build
✔ [2/3] Built Test
Build completed successfully.
EOF
) -

# Test download failure
$LAKE update # re-fetch release tag
git -C dep tag release
($LAKE build dep:release && exit 1 || true) | grep --color "downloading"

# Test automatic cloud release unpacking
mkdir -p .lake/packages/dep/.lake/build
$LAKE -d .lake/packages/dep pack 2>&1 | grep --color "packing"
test -f .lake/packages/dep/.lake/release.tgz
echo 4225503363911572621 > .lake/packages/dep/.lake/release.tgz.trace
rmdir .lake/packages/dep/.lake/build
mkdir -p dep/.lake/build
$LAKE -d dep pack 2>&1 | grep --color "packing"
test -f dep/.lake/release.tgz
echo 4225503363911572621 > dep/.lake/release.tgz.trace
rmdir dep/.lake/build
$LAKE build dep:release -v | grep --color "unpacking"
test -d .lake/packages/dep/.lake/build
test -d dep/.lake/build

# Test that the job prints nothing if the archive is already fetched and unpacked
$LAKE build dep:release | diff -u --strip-trailing-cr <(cat << 'EOF'
@@ -77,15 +66,5 @@ Build completed successfully.
EOF
) -

# Test retagging
git -C dep tag -d release
git -C dep tag release
NEW_REV=`git -C dep rev-parse release`
test ! "$INIT_REV" = "$NEW_REV"
test `git -C .lake/packages/dep rev-parse HEAD` = "$INIT_REV"
$LAKE update
test `git -C .lake/packages/dep rev-parse HEAD` = "$NEW_REV"
$LAKE build dep:release

# Cleanup git repo
rm -rf dep/.git
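The test script above repeatedly compares a command's output against an inline expectation with `diff -u --strip-trailing-cr <(cat << 'EOF' … ) -`. That idiom in isolation, with dummy data (requires bash for the `<(...)` process substitution):

```shell
# Pipe a command's output into `diff` against a heredoc. The quoted
# delimiter ('EOF') disables variable expansion inside the heredoc, the
# trailing `-` tells diff to read its second input from stdin, and a
# nonzero exit from diff means the output drifted from the expectation.
printf 'alpha\nbeta\n' | diff -u --strip-trailing-cr <(cat << 'EOF'
alpha
beta
EOF
) - && echo "output matches"
```

This prints `output matches`; on a mismatch, diff instead emits a unified diff of expected versus actual, which is why the Lake tests use it for golden-output checks.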
@@ -6,8 +6,6 @@ Author: Leonardo de Moura
*/
#include <vector>
#include <cstring>
#include <unordered_map>
#include <unordered_set>
#include "runtime/object.h"
#include "runtime/hash.h"

@@ -270,177 +268,4 @@ public:
extern "C" LEAN_EXPORT obj_res lean_state_sharecommon(b_obj_arg tc, obj_arg s, obj_arg a) {
    return sharecommon_fn(tc, s)(a);
}

/*
A faster version of `sharecommon_fn` which only uses a local state.
It optimizes the number of RC operations, the strategy for caching results,
and uses a C++ hashmap.
*/
class sharecommon_quick_fn {
    struct set_hash {
        std::size_t operator()(lean_object * o) const { return lean_sharecommon_hash(o); }
    };
    struct set_eq {
        std::size_t operator()(lean_object * o1, lean_object * o2) const { return lean_sharecommon_eq(o1, o2); }
    };

    /*
    We use `m_cache` to ensure we do **not** traverse a DAG as a tree.
    We use pointer equality for this collection.
    */
    std::unordered_map<lean_object *, lean_object *> m_cache;
    /* Set of maximally shared terms. AKA hash-consing table. */
    std::unordered_set<lean_object *, set_hash, set_eq> m_set;

    /*
    We do not increment reference counters when inserting Lean objects into `m_cache` and `m_set`.
    This is correct because
    - The domain of `m_cache` contains only sub-objects of the `lean_sharecommon_quick` parameter,
      and we know the object referenced by this parameter will remain alive.
    - The range of `m_cache` contains only new objects that have been maximally shared, and these
      objects are sub-objects of the object returned by `lean_sharecommon_quick`.
    - `m_set` is like the range of `m_cache`.
    */

    lean_object * check_cache(lean_object * a) {
        if (!lean_is_exclusive(a)) {
            // We only check the cache if `a` is a shared object.
            auto it = m_cache.find(a);
            if (it != m_cache.end()) {
                // All objects stored in the range of `m_cache` are single-threaded.
                lean_assert(lean_is_st(it->second));
                // We increment the reference counter because this object
                // will be returned by `lean_sharecommon_quick` or stored into a new object.
                it->second->m_rc++;
                return it->second;
            }
        }
        return nullptr;
    }

    /*
    `new_a` is a new object that is equal to `a`, but its subobjects are maximally shared.
    */
    lean_object * save(lean_object * a, lean_object * new_a) {
        lean_assert(lean_is_st(new_a));
        lean_assert(new_a->m_rc == 1);
        auto it = m_set.find(new_a);
        lean_object * result;
        if (it == m_set.end()) {
            // `new_a` is a new object
            m_set.insert(new_a);
            result = new_a;
        } else {
            // We already have a maximally shared object that is equal to `new_a`
            result = *it;
            DEBUG_CODE({
                if (lean_is_ctor(new_a)) {
                    lean_assert(lean_is_ctor(result));
                    unsigned num_objs = lean_ctor_num_objs(new_a);
                    lean_assert(lean_ctor_num_objs(result) == num_objs);
                    for (unsigned i = 0; i < num_objs; i++) {
                        lean_assert(lean_ctor_get(result, i) == lean_ctor_get(new_a, i));
                    }
                }
            });
            lean_dec_ref(new_a); // delete `new_a`
            // All objects in `m_set` are single-threaded.
            lean_assert(lean_is_st(result));
            result->m_rc++;
            lean_assert(result->m_rc > 1);
        }
        if (!lean_is_exclusive(a)) {
            // We only cache the result if `a` is a shared object.
            m_cache.insert(std::make_pair(a, result));
        }
        lean_assert(result == new_a || result->m_rc > 1);
        lean_assert(result != new_a || result->m_rc == 1);
        return result;
    }

    // `sarray` and `string`
    lean_object * visit_terminal(lean_object * a) {
        auto it = m_set.find(a);
        if (it == m_set.end()) {
            m_set.insert(a);
        } else {
            a = *it;
        }
        lean_inc_ref(a);
        return a;
    }

    lean_object * visit_array(lean_object * a) {
        lean_object * r = check_cache(a);
        if (r != nullptr) { lean_assert(r->m_rc > 1); return r; }

        size_t sz = array_size(a);
        lean_array_object * new_a = (lean_array_object*)lean_alloc_array(sz, sz);
        for (size_t i = 0; i < sz; i++) {
            lean_array_set_core((lean_object*)new_a, i, visit(lean_array_get_core(a, i)));
        }
        return save(a, (lean_object*)new_a);
    }

    lean_object * visit_ctor(lean_object * a) {
        lean_object * r = check_cache(a);
        if (r != nullptr) { lean_assert(r->m_rc > 1); return r; }
        unsigned num_objs = lean_ctor_num_objs(a);
        unsigned tag = lean_ptr_tag(a);
        unsigned sz = lean_object_byte_size(a);
        unsigned scalar_offset = sizeof(lean_object) + num_objs*sizeof(void*);
        unsigned scalar_sz = sz - scalar_offset;
        lean_object * new_a = lean_alloc_ctor(tag, num_objs, scalar_sz);
        for (unsigned i = 0; i < num_objs; i++) {
            lean_ctor_set(new_a, i, visit(lean_ctor_get(a, i)));
        }
        if (scalar_sz > 0) {
            memcpy(reinterpret_cast<char*>(new_a) + scalar_offset, reinterpret_cast<char*>(a) + scalar_offset, scalar_sz);
        }
        return save(a, new_a);
    }

public:

    /*
    **TODO:** We did not implement stack overflow detection.
    We claim it is not needed in the current uses of `shareCommon'`.
    If this becomes an issue, we can use the following approach to address it without
    affecting performance.
    - Add an extra `depth` parameter.
    - In `operator()`, estimate the maximum depth based on the remaining stack space. See `check_stack`.
    - If the limit is reached, simply return `a`.
    */
    lean_object * visit(lean_object * a) {
        if (lean_is_scalar(a)) {
            return a;
        }
        switch (lean_ptr_tag(a)) {
            /*
            Similarly to `sharecommon_fn`, we only maximally share arrays, scalar arrays, strings, and
            constructor objects.
            */
            case LeanMPZ: lean_inc_ref(a); return a;
            case LeanClosure: lean_inc_ref(a); return a;
            case LeanThunk: lean_inc_ref(a); return a;
            case LeanTask: lean_inc_ref(a); return a;
            case LeanRef: lean_inc_ref(a); return a;
            case LeanExternal: lean_inc_ref(a); return a;
            case LeanReserved: lean_inc_ref(a); return a;
            case LeanScalarArray: return visit_terminal(a);
            case LeanString: return visit_terminal(a);
            case LeanArray: return visit_array(a);
            default: return visit_ctor(a);
        }
    }

    lean_object * operator()(lean_object * a) {
        return visit(a);
    }
};

// def ShareCommon.shareCommon' (a : A) : A := a
extern "C" LEAN_EXPORT obj_res lean_sharecommon_quick(obj_arg a) {
    return sharecommon_quick_fn()(a);
}
};
BIN stage0/stdlib/Init/Data/Float.c (generated, binary file not shown)
BIN stage0/stdlib/Init/Data/List/Basic.c (generated, binary file not shown)
BIN stage0/stdlib/Init/Data/List/Impl.c (generated, binary file not shown)
BIN stage0/stdlib/Init/Data/List/Lemmas.c (generated, binary file not shown)
BIN stage0/stdlib/Init/Data/Repr.c (generated, binary file not shown)
BIN stage0/stdlib/Init/Meta.c (generated, binary file not shown)
BIN stage0/stdlib/Init/Notation.c (generated, binary file not shown)
Some files were not shown because too many files have changed in this diff.