mirror of https://github.com/leanprover/lean4.git
synced 2026-03-31 01:04:07 +00:00

Compare commits: upstream_A ... pr-release (1 commit)

| Author | SHA1 | Date |
|---|---|---|
|  | c9979fb598 |  |

2 .github/workflows/copyright-header.yml (vendored)
@@ -10,7 +10,7 @@ jobs:
      - name: Verify .lean files start with a copyright header.
        run: |
          FILES=$(find ./src -type d \( -path "./src/lake/examples" -o -path "./src/lake/tests" \) -prune -o -type f -name "*.lean" -exec perl -ne 'BEGIN { $/ = undef; } print "$ARGV\n" if !m{\A/-\nCopyright}; exit;' {} \;)
          FILES=$(find . -type d \( -path "./tests" -o -path "./doc" -o -path "./src/lake/examples" -o -path "./src/lake/tests" -o -path "./build" -o -path "./nix" \) -prune -o -type f -name "*.lean" -exec perl -ne 'BEGIN { $/ = undef; } print "$ARGV\n" if !m{\A/-\nCopyright}; exit;' {} \;)
          if [ -n "$FILES" ]; then
            echo "Found .lean files which do not have a copyright header:"
            echo "$FILES"
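The header check in this hunk can be exercised locally. Below is a minimal, hypothetical sketch (directory and file names are invented for the demo) that runs the same `perl` one-liner over a throwaway tree and flags the file missing the header:

```shell
# Hypothetical standalone reproduction of the workflow's copyright check.
demo=$(mktemp -d)
mkdir -p "$demo/src"
# One file with the required `/-\nCopyright` opening block, one without.
printf '/-\nCopyright (c) 2024 Example.\n-/\ndef ok := 1\n' > "$demo/src/Good.lean"
printf 'def bad := 2\n' > "$demo/src/Bad.lean"
cd "$demo"
# Same one-liner as the workflow: print the file name unless the file
# starts with a copyright header.
FILES=$(find . -type f -name "*.lean" -exec perl -ne 'BEGIN { $/ = undef; } print "$ARGV\n" if !m{\A/-\nCopyright}; exit;' {} \;)
echo "$FILES"
```

Only `Bad.lean` should be reported; `Good.lean` passes the `\A/-\nCopyright` match.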
@@ -21,11 +21,6 @@
/src/Lean/Server/ @mhuisi
/src/Lean/Widget/ @Vtec234
/src/runtime/io.cpp @joehendrix
/src/Init/Data/ @semorrison
/src/Init/Data/Array/Lemmas.lean @digama0
/src/Init/Data/List/Lemmas.lean @digama0
/src/Init/Data/List/BasicAux.lean @digama0
/src/Init/Data/Array/Subarray.lean @david-christiansen
/src/Lean/Elab/Tactic/RCases.lean @digama0
/src/Init/RCases.lean @digama0
/src/Lean/Elab/Tactic/Ext.lean @digama0
@@ -44,4 +39,5 @@
/src/Lean/Elab/Tactic/Guard.lean @digama0
/src/Init/Guard.lean @digama0
/src/Lean/Server/CodeActions/ @digama0
/src/Init/Data/Array/Subarray.lean @david-christiansen
@@ -99,10 +99,6 @@ v4.8.0 (development in progress)
  and `#guard_msgs (ordering := sorted) in cmd` sorts the messages in lexicographic order before checking.
  PR [#3883](https://github.com/leanprover/lean4/pull/3883).

* The `#guard_msgs` command now supports showing a diff between the expected and actual outputs. This feature is currently
  disabled by default, but can be enabled with `set_option guard_msgs.diff true`. Depending on user feedback, this option
  may default to `true` in a future version of Lean.

Breaking changes:

* Automatically generated equational theorems are now named using suffix `.eq_<idx>` instead of `._eq_<idx>`, and `.def` instead of `._unfold`. Example:
@@ -136,8 +132,6 @@ fact.def :

* The change to the instance name algorithm (described above) can break projects that made use of the auto-generated names.

* `Option.toMonad` has been renamed to `Option.getM` and the unneeded `[Monad m]` instance argument has been removed.

v4.7.0
---------
@@ -21,7 +21,7 @@ We'll use `v4.6.0` as the intended release version as a running example.
- Reconcile discrepancies in the `v4.6.0` section,
  usually via copy and paste and a commit to `releases/v4.6.0`.
- `git tag v4.6.0`
- `git push $REMOTE v4.6.0`, where `$REMOTE` is the upstream Lean repository (e.g., `origin`, `upstream`)
- `git push origin v4.6.0`
- Now wait, while CI runs.
  - You can monitor this at `https://github.com/leanprover/lean4/actions/workflows/ci.yml`,
    looking for the `v4.6.0` tag.
@@ -34,76 +34,48 @@ We'll use `v4.6.0` as the intended release version as a running example.
  (e.g. `v4.6.0-rc1`), and quickly sanity check.
- Next, we will move a curated list of downstream repos to the latest stable release.
- For each of the repositories listed below:
  - Make a PR to `master`/`main` changing the toolchain to `v4.6.0`
    - Update the toolchain file
    - In the Lakefile, if there are dependencies on specific version tags of dependencies that you've already pushed as part of this process, update them to the new tag.
      If they depend on `main` or `master`, don't change this; you've just updated the dependency, so it will work and be saved in the manifest.
    - Run `lake update`
    - The PR title should be "chore: bump toolchain to v4.6.0".
  - Merge the PR once CI completes.
  - Create the tag `v4.6.0` from `master`/`main` and push it.
  - Merge the tag `v4.6.0` into the `stable` branch and push it.
  - Make a PR to `master`/`main` changing the toolchain to `v4.6.0`.
    The PR title should be "chore: bump toolchain to v4.6.0".
    Since the `v4.6.0` release should be functionally identical to the last release candidate,
    which the repository should already be on, this PR is a no-op besides changing the toolchain.
  - Once this is merged, create the tag `v4.6.0` from `master`/`main` and push it.
  - Merge the tag `v4.6.0` into the stable branch.
- We do this for the repositories:
  - [lean4checker](https://github.com/leanprover/lean4checker)
    - No dependencies
    - Note: `lean4checker` uses a different version tagging scheme: use `toolchain/v4.6.0` rather than `v4.6.0`.
    - Toolchain bump PR
    - Create and push the tag
    - Merge the tag into `stable`
  - [Std](https://github.com/leanprover-community/std4)
    - No dependencies
    - Toolchain bump PR
    - Create and push the tag
    - Merge the tag into `stable`
    - `lean4checker` uses a different version tagging scheme: use `toolchain/v4.6.0` rather than `v4.6.0`.
  - [Std](https://github.com/leanprover-community/repl)
  - [ProofWidgets4](https://github.com/leanprover-community/ProofWidgets4)
    - Dependencies: `Std`
    - Note on versions and branches:
      - `ProofWidgets` uses a sequential version tagging scheme, e.g. `v0.0.29`,
        which does not refer to the toolchain being used.
      - Make a new release in this sequence after merging the toolchain bump PR.
      - `ProofWidgets` does not maintain a `stable` branch.
    - Toolchain bump PR
    - Create and push the tag, following the version convention of the repository
      - `ProofWidgets` uses a sequential version tagging scheme, e.g. `v0.0.29`,
        which does not refer to the toolchain being used.
      - Make a new release in this sequence after merging the toolchain bump PR.
    - `ProofWidgets` does not maintain a `stable` branch.
  - [Aesop](https://github.com/leanprover-community/aesop)
    - Dependencies: `Std`
    - Toolchain bump PR including updated Lake manifest
    - Create and push the tag
    - Merge the tag into `stable`
  - [doc-gen4](https://github.com/leanprover/doc-gen4)
    - Dependencies: exist, but they're not part of the release workflow
    - Toolchain bump PR including updated Lake manifest
    - Create and push the tag
    - There is no `stable` branch; skip this step
  - [import-graph](https://github.com/leanprover-community/import-graph)
    - Toolchain bump PR including updated Lake manifest
    - Create and push the tag
    - There is no `stable` branch; skip this step
  - [Mathlib](https://github.com/leanprover-community/mathlib4)
    - Dependencies: `Aesop`, `ProofWidgets4`, `lean4checker`, `Std`, `doc-gen4`, `import-graph`
    - Toolchain bump PR notes:
      - In addition to updating the `lean-toolchain` and `lakefile.lean`,
        in `.github/workflows/build.yml.in` in the `lean4checker` section update the line
        `git checkout toolchain/v4.6.0` to the appropriate tag,
        and then run `.github/workflows/mk_build_yml.sh`. Coordinate with
        a Mathlib maintainer to get this merged.
      - Push the PR branch to the main Mathlib repository rather than a fork, or CI may not work reliably.
    - Create and push the tag
    - Create a new branch from the tag, push it, and open a pull request against `stable`.
      Coordinate with a Mathlib maintainer to get this merged.
      - In addition to updating the `lean-toolchain` and `lakefile.lean`,
        in `.github/workflows/build.yml.in` in the `lean4checker` section update the line
        `git checkout toolchain/v4.6.0` to the appropriate tag,
        and then run `.github/workflows/mk_build_yml.sh`.
  - [REPL](https://github.com/leanprover-community/repl)
    - Dependencies: `Mathlib` (for test code)
    - Note that there are two copies of `lean-toolchain`/`lakefile.lean`:
      in the root, and in `test/Mathlib/`. Edit both, and run `lake update` in both directories.
    - Toolchain bump PR including updated Lake manifest
    - Create and push the tag
    - Merge the tag into `stable`
      in the root, and in `test/Mathlib/`.
- Note that there are dependencies between these packages:
  you should update the lakefile so that you are using the `v4.6.0` tag of upstream repositories
  (or the sequential tag for `ProofWidgets4`), and run `lake update` before committing.
  - This means that this process is sequential; each repository must have its bump PR merged,
    and the new tag pushed, before you can make the PR for the downstream repositories.
  - `lean4checker` has no dependencies
  - `Std` has no dependencies
  - `Aesop` depends on `Std`
  - `ProofWidgets4` depends on `Std`
  - `Mathlib` depends on `Aesop`, `ProofWidgets4`, and `lean4checker` (and transitively on `Std`)
  - `REPL` depends on `Mathlib` (this dependency is only for testing).
- Merge the release announcement PR for the Lean website - it will be deployed automatically.
- Finally, make an announcement!
  This should go in https://leanprover.zulipchat.com/#narrow/stream/113486-announce, with topic `v4.6.0`.
  Please see previous announcements for suggested language.
  You will want a few bullet points for main topics from the release notes.
  Link to the blog post from the Zulip announcement.
- Make sure that whoever is handling social media knows the release is out.
  Please also make sure that whoever is handling social media knows the release is out.

## Optimistic(?) time estimates:
- Initial checks and push the tag: 30 minutes.
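The per-repository bump/tag/merge sequence above can be sketched as a local dry run. This is a hypothetical, throwaway-repo illustration (file contents and committer identity are invented); the real steps happen in each downstream repository with real pushes:

```shell
# Local sketch of: bump toolchain, commit, tag v4.6.0, merge the tag into stable.
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name demo
# The toolchain bump commit.
echo "leanprover/lean4:v4.6.0" > lean-toolchain
git add lean-toolchain
git commit -qm "chore: bump toolchain to v4.6.0"
# Create the release tag, then merge it into a `stable` branch.
git tag v4.6.0
git checkout -q -b stable
git merge -q --ff-only v4.6.0
git tag --list
```

In a real run, each `git tag` / branch update is followed by a push, and the next repository's PR waits until the upstream tag exists.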
@@ -1,28 +0,0 @@
import Lean.Util.Profiler

/-!
Usage:
```sh
lean --run ./script/collideProfiles.lean **/*.lean.json ... > merged.json
```

Merges multiple `trace.profiler.output` profiles into a single one while deduplicating samples with
the same stack. This is useful for building cumulative profiles of medium-to-large projects because
Firefox Profiler cannot handle hundreds of tracks and the deduplication will also ensure that the
profile is small enough for uploading.

As ordering of samples is not meaningful after this transformation, only "Call Tree" and "Flame
Graph" are useful for such profiles.
-/

open Lean

def main (args : List String) : IO Unit := do
  let profiles ← args.toArray.mapM fun path => do
    let json ← IO.FS.readFile ⟨path⟩
    let profile ← IO.ofExcept $ Json.parse json
    IO.ofExcept <| fromJson? profile
  -- NOTE: `collide` should not be interpreted
  let profile := Firefox.Profile.collide profiles
  IO.println <| Json.compress <| toJson profile
@@ -15,13 +15,6 @@ namespace Classical
noncomputable def indefiniteDescription {α : Sort u} (p : α → Prop) (h : ∃ x, p x) : {x // p x} :=
  choice <| let ⟨x, px⟩ := h; ⟨⟨x, px⟩⟩

/--
Given that there exists an element satisfying `p`, returns one such element.

This is a straightforward consequence of, and equivalent to, `Classical.choice`.

See also `choose_spec`, which asserts that the returned value has property `p`.
-/
noncomputable def choose {α : Sort u} {p : α → Prop} (h : ∃ x, p x) : α :=
  (indefiniteDescription p h).val
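The relationship between `choose` and `choose_spec` described in the docstring can be illustrated with a one-line example (a sketch; assumes these definitions are in scope as in Lean core):

```lean
-- `choose h` is some witness; `choose_spec h` proves it satisfies the predicate.
example (h : ∃ n : Nat, 3 < n) : 3 < Classical.choose h :=
  Classical.choose_spec h
```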
@@ -20,29 +20,8 @@ def Functor.discard {f : Type u → Type v} {α : Type u} [Functor f] (x : f α)

export Functor (discard)

/--
An `Alternative` functor is an `Applicative` functor that can "fail" or be "empty",
together with a binary operation `<|>` that "collects values" or finds the "left-most success".

Important instances include
* `Option`, where `failure := none` and `<|>` returns the left-most `some`.
* Parser combinators typically provide an `Alternative` instance for error-handling and
  backtracking.

Error recovery and state can interact subtly. For example, the implementation of `Alternative` for
`OptionT (StateT σ Id)` keeps modifications made to the state while recovering from failure, while
`StateT σ (OptionT Id)` discards them.
-/
-- NB: List instance is in mathlib. Once upstreamed, add
-- * `List`, where `failure` is the empty list and `<|>` concatenates.
class Alternative (f : Type u → Type v) extends Applicative f : Type (max (u+1) v) where
  /--
  Produces an empty collection or recoverable failure. The `<|>` operator collects values or
  recovers from failures. See `Alternative` for more details.
  -/
  failure : {α : Type u} → f α
  /--
  Depending on the `Alternative` instance, collects values or recovers from `failure`s by
  returning the leftmost success. Can be written using the `<|>` operator syntax.
  -/
  orElse : {α : Type u} → f α → (Unit → f α) → f α

instance (f : Type u → Type v) (α : Type u) [Alternative f] : OrElse (f α) := ⟨Alternative.orElse⟩
@@ -51,15 +30,9 @@ variable {f : Type u → Type v} [Alternative f] {α : Type u}

export Alternative (failure)

/--
If the proposition `p` is true, does nothing, else fails (using `failure`).
-/
@[always_inline, inline] def guard {f : Type → Type v} [Alternative f] (p : Prop) [Decidable p] : f Unit :=
  if p then pure () else failure

/--
Returns `some x` if `f` succeeds with value `x`, else returns `none`.
-/
@[always_inline, inline] def optional (x : f α) : f (Option α) :=
  some <$> x <|> pure none
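The behavior of `guard`, `optional`, and `<|>` on the `Option` instance can be sanity-checked with `#guard`. A small sketch (`positivePart` is a hypothetical helper, invented for illustration):

```lean
-- Hypothetical helper: succeeds only for positive inputs.
def positivePart (n : Int) : Option Int := do
  guard (0 < n)
  pure n

#guard positivePart 5 = some 5
#guard positivePart (-2) = none
-- `<|>` on Option returns the left-most `some`.
#guard (failure <|> some 3) = some 3
-- `optional` turns failure into `none` at the value level.
#guard (optional (none : Option Nat)) = some none
```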
@@ -12,15 +12,6 @@ open Function
@[simp] theorem monadLift_self [Monad m] (x : m α) : monadLift x = x :=
  rfl

/--
The `Functor` typeclass only contains the operations of a functor.
`LawfulFunctor` further asserts that these operations satisfy the laws of a functor,
including the preservation of the identity and composition laws:
```
id <$> x = x
(h ∘ g) <$> x = h <$> g <$> x
```
-/
class LawfulFunctor (f : Type u → Type v) [Functor f] : Prop where
  map_const : (Functor.mapConst : α → f β → f α) = Functor.map ∘ const β
  id_map (x : f α) : id <$> x = x
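The two laws quoted in the docstring above can be stated as examples (a sketch; assumes the core field names `id_map`/`comp_map`):

```lean
-- Identity law, via the `id_map'` simp lemma.
example [Functor f] [LawfulFunctor f] (x : f α) : (fun a => a) <$> x = x := by
  simp

-- Composition law, directly from the class field.
example [Functor f] [LawfulFunctor f] (g h : α → α) (x : f α) :
    (h ∘ g) <$> x = h <$> g <$> x :=
  LawfulFunctor.comp_map g h x
```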
@@ -33,16 +24,6 @@ attribute [simp] id_map
@[simp] theorem id_map' [Functor m] [LawfulFunctor m] (x : m α) : (fun a => a) <$> x = x :=
  id_map x

/--
The `Applicative` typeclass only contains the operations of an applicative functor.
`LawfulApplicative` further asserts that these operations satisfy the laws of an applicative functor:
```
pure id <*> v = v
pure (·∘·) <*> u <*> v <*> w = u <*> (v <*> w)
pure f <*> pure x = pure (f x)
u <*> pure y = pure (· y) <*> u
```
-/
class LawfulApplicative (f : Type u → Type v) [Applicative f] extends LawfulFunctor f : Prop where
  seqLeft_eq (x : f α) (y : f β) : x <* y = const β <$> x <*> y
  seqRight_eq (x : f α) (y : f β) : x *> y = const α id <$> x <*> y
@@ -61,18 +42,6 @@ attribute [simp] map_pure seq_pure
@[simp] theorem pure_id_seq [Applicative f] [LawfulApplicative f] (x : f α) : pure id <*> x = x := by
  simp [pure_seq]

/--
The `Monad` typeclass only contains the operations of a monad.
`LawfulMonad` further asserts that these operations satisfy the laws of a monad,
including associativity and identity laws for `bind`:
```
pure x >>= f = f x
x >>= pure = x
x >>= f >>= g = x >>= (fun x => f x >>= g)
```

`LawfulMonad.mk'` is an alternative constructor containing useful defaults for many fields.
-/
class LawfulMonad (m : Type u → Type v) [Monad m] extends LawfulApplicative m : Prop where
  bind_pure_comp (f : α → β) (x : m α) : x >>= (fun a => pure (f a)) = f <$> x
  bind_map {α β : Type u} (f : m (α → β)) (x : m α) : f >>= (. <$> x) = f <*> x
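The identity laws quoted in the `LawfulMonad` docstring can be checked as examples (a sketch; relies on core's simp lemmas for lawful monads):

```lean
-- Right identity: binding into `pure` is a no-op.
example [Monad m] [LawfulMonad m] (x : m α) : x >>= pure = x := by
  simp

-- Left identity: binding a `pure` value just applies the function.
example [Monad m] [LawfulMonad m] (a : α) (f : α → m β) : pure a >>= f = f a := by
  simp
```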
@@ -10,7 +10,7 @@ import Init.Control.Except

universe u v

instance : ToBool (Option α) := ⟨Option.isSome⟩
instance : ToBool (Option α) := ⟨Option.toBool⟩

def OptionT (m : Type u → Type v) (α : Type u) : Type v :=
  m (Option α)
@@ -14,7 +14,6 @@ import Init.Data.String
import Init.Data.List
import Init.Data.Int
import Init.Data.Array
import Init.Data.Array.Subarray.Split
import Init.Data.ByteArray
import Init.Data.FloatArray
import Init.Data.Fin
@@ -5,7 +5,6 @@ Authors: Mario Carneiro
-/
prelude
import Init.Data.Nat.MinMax
import Init.Data.Nat.Lemmas
import Init.Data.List.Lemmas
import Init.Data.Fin.Basic
import Init.Data.Array.Mem
@@ -188,8 +187,7 @@ theorem anyM_stop_le_start [Monad m] (p : α → m Bool) (as : Array α) (start
theorem mem_def (a : α) (as : Array α) : a ∈ as ↔ a ∈ as.data :=
  ⟨fun | .mk h => h, Array.Mem.mk⟩

/-! # get -/

/-- # get -/
@[simp] theorem get_eq_getElem (a : Array α) (i : Fin _) : a.get i = a[i.1] := rfl

theorem getElem?_lt
@@ -219,7 +217,7 @@ theorem get!_eq_getD [Inhabited α] (a : Array α) : a.get! n = a.getD n default
@[simp] theorem get!_eq_getElem? [Inhabited α] (a : Array α) (i : Nat) : a.get! i = (a.get? i).getD default := by
  by_cases p : i < a.size <;> simp [getD_get?, get!_eq_getD, p]

/-! # set -/
/-- # set -/

@[simp] theorem getElem_set_eq (a : Array α) (i : Fin a.size) (v : α) {j : Nat}
    (eq : i.val = j) (p : j < (a.set i v).size) :
@@ -242,7 +240,7 @@ theorem getElem_set (a : Array α) (i : Fin a.size) (v : α) (j : Nat)
    (ne : i.val ≠ j) : (a.set i v)[j]? = a[j]? := by
  by_cases h : j < a.size <;> simp [getElem?_lt, getElem?_ge, Nat.ge_of_not_lt, ne, h]

/-! # setD -/
/- # setD -/

@[simp] theorem set!_is_setD : @set! = @setD := rfl
@@ -268,44 +266,4 @@ theorem getElem?_setD_eq (a : Array α) {i : Nat} (p : i < a.size) (v : α) : (a
  by_cases h : i < a.size <;>
    simp [setD, Nat.not_lt_of_le, h, getD_get?]

/-! # ofFn -/

@[simp] theorem size_ofFn_go {n} (f : Fin n → α) (i acc) :
    (ofFn.go f i acc).size = acc.size + (n - i) := by
  if hin : i < n then
    unfold ofFn.go
    have : 1 + (n - (i + 1)) = n - i :=
      Nat.sub_sub .. ▸ Nat.add_sub_cancel' (Nat.le_sub_of_add_le (Nat.add_comm .. ▸ hin))
    rw [dif_pos hin, size_ofFn_go f (i+1), size_push, Nat.add_assoc, this]
  else
    have : n - i = 0 := Nat.sub_eq_zero_of_le (Nat.le_of_not_lt hin)
    unfold ofFn.go
    simp [hin, this]
termination_by n - i

@[simp] theorem size_ofFn (f : Fin n → α) : (ofFn f).size = n := by simp [ofFn]

theorem getElem_ofFn_go (f : Fin n → α) (i) {acc k}
    (hki : k < n) (hin : i ≤ n) (hi : i = acc.size)
    (hacc : ∀ j, ∀ hj : j < acc.size, acc[j] = f ⟨j, Nat.lt_of_lt_of_le hj (hi ▸ hin)⟩) :
    haveI : acc.size + (n - acc.size) = n := Nat.add_sub_cancel' (hi ▸ hin)
    (ofFn.go f i acc)[k]'(by simp [*]) = f ⟨k, hki⟩ := by
  unfold ofFn.go
  if hin : i < n then
    have : 1 + (n - (i + 1)) = n - i :=
      Nat.sub_sub .. ▸ Nat.add_sub_cancel' (Nat.le_sub_of_add_le (Nat.add_comm .. ▸ hin))
    simp only [dif_pos hin]
    rw [getElem_ofFn_go f (i+1) _ hin (by simp [*]) (fun j hj => ?hacc)]
    cases (Nat.lt_or_eq_of_le <| Nat.le_of_lt_succ (by simpa using hj)) with
    | inl hj => simp [get_push, hj, hacc j hj]
    | inr hj => simp [get_push, *]
  else
    simp [hin, hacc k (Nat.lt_of_lt_of_le hki (Nat.le_of_not_lt (hi ▸ hin)))]
termination_by n - i

@[simp] theorem getElem_ofFn (f : Fin n → α) (i : Nat) (h) :
    (ofFn f)[i] = f ⟨i, size_ofFn f ▸ h⟩ :=
  getElem_ofFn_go _ _ _ (by simp) (by simp) nofun

end Array
@@ -29,12 +29,6 @@ namespace Subarray
def size (s : Subarray α) : Nat :=
  s.stop - s.start

theorem size_le_array_size {s : Subarray α} : s.size ≤ s.array.size := by
  let {array, start, stop, start_le_stop, stop_le_array_size} := s
  simp [size]
  apply Nat.le_trans (Nat.sub_le stop start)
  assumption

def get (s : Subarray α) (i : Fin s.size) : α :=
  have : s.start + i.val < s.array.size := by
    apply Nat.lt_of_lt_of_le _ s.stop_le_array_size
@@ -1,71 +0,0 @@
/-
Copyright (c) 2024 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: David Thrane Christiansen
-/

prelude
import Init.Data.Array.Basic
import Init.Data.Array.Subarray
import Init.Omega

/-
This module contains splitting operations on subarrays that crucially rely on `omega` for proof
automation. Placing them in another module breaks an import cycle, because `omega` itself uses the
array library.
-/

namespace Subarray
/--
Splits a subarray into two parts.
-/
def split (s : Subarray α) (i : Fin s.size.succ) : (Subarray α × Subarray α) :=
  let ⟨i', isLt⟩ := i
  have := s.start_le_stop
  have := s.stop_le_array_size
  have : i' ≤ s.stop - s.start := Nat.lt_succ.mp isLt
  have : s.start + i' ≤ s.stop := by omega
  have : s.start + i' ≤ s.array.size := by omega
  have : s.start + i' ≤ s.stop := by
    simp only [size] at isLt
    omega
  let pre := {s with
    stop := s.start + i',
    start_le_stop := by omega,
    stop_le_array_size := by assumption
  }
  let post := {s with
    start := s.start + i'
    start_le_stop := by assumption
  }
  (pre, post)
/--
Removes the first `i` elements of the subarray. If there are `i` or fewer elements, the resulting
subarray is empty.
-/
def drop (arr : Subarray α) (i : Nat) : Subarray α where
  array := arr.array
  start := min (arr.start + i) arr.stop
  stop := arr.stop
  start_le_stop := by
    rw [Nat.min_def]
    split <;> simp only [Nat.le_refl, *]
  stop_le_array_size := arr.stop_le_array_size

/--
Keeps only the first `i` elements of the subarray. If there are `i` or fewer elements, the whole
subarray is returned.
-/
def take (arr : Subarray α) (i : Nat) : Subarray α where
  array := arr.array
  start := arr.start
  stop := min (arr.start + i) arr.stop
  start_le_stop := by
    have := arr.start_le_stop
    rw [Nat.min_def]
    split <;> omega
  stop_le_array_size := by
    have := arr.stop_le_array_size
    rw [Nat.min_def]
    split <;> omega
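The `min`-clamped bounds in `drop` and `take` can be checked on small inputs (a sketch; assumes `Subarray.drop`/`Subarray.take` as defined above are in scope, along with `Array.toSubarray`):

```lean
-- take keeps the first `i` elements; drop removes them; out-of-range
-- indices clamp to the subarray's bounds.
#guard (#[1, 2, 3, 4].toSubarray.take 2).size = 2
#guard (#[1, 2, 3, 4].toSubarray.drop 1).size = 3
#guard (#[1, 2, 3, 4].toSubarray.drop 9).size = 0
```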
@@ -1,7 +1,7 @@
/-
Copyright (c) 2024 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Joe Hendrix, Wojciech Nawrocki, Leonardo de Moura, Mario Carneiro, Alex Keizer, Harun Khan, Abdalrhman M Mohamed
Authors: Joe Hendrix, Wojciech Nawrocki, Leonardo de Moura, Mario Carneiro, Alex Keizer
-/
prelude
import Init.Data.Fin.Basic

@@ -1,7 +1,7 @@
/-
Copyright (c) 2023 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Joe Hendrix, Harun Khan, Alex Keizer, Abdalrhman M Mohamed,
Authors: Joe Hendrix
-/
prelude
import Init.Data.Bool
@@ -1085,9 +1085,6 @@ theorem bmod_mul_bmod : Int.bmod (Int.bmod x n * y) n = Int.bmod (x * y) n := by
@[simp] theorem mul_bmod_bmod : Int.bmod (x * Int.bmod y n) n = Int.bmod (x * y) n := by
  rw [Int.mul_comm x, bmod_mul_bmod, Int.mul_comm x]

theorem add_bmod (a b : Int) (n : Nat) : (a + b).bmod n = (a.bmod n + b.bmod n).bmod n := by
  simp

theorem emod_bmod {x : Int} {m : Nat} : bmod (x % m) m = bmod x m := by
  simp [bmod]
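For readers unfamiliar with balanced mod, `Int.bmod x m` reduces `x` into the balanced range around zero rather than `[0, m)`. A few concrete values (a sketch; assumes core's `Int.bmod`):

```lean
-- bmod picks the representative in the balanced interval (-m/2, m/2].
#guard (7 : Int).bmod 5 = 2
#guard (9 : Int).bmod 5 = -1
#guard (4 : Int).bmod 4 = 0
```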
@@ -5,7 +5,6 @@ Author: Leonardo de Moura
-/
prelude
import Init.Data.Nat.Linear
import Init.Ext

universe u

@@ -13,147 +12,60 @@ namespace List
/-! The following functions can't be defined at `Init.Data.List.Basic`, because they depend on `Init.Util`,
and `Init.Util` depends on `Init.Data.List.Basic`. -/
/--
Returns the `i`-th element in the list (zero-based).

If the index is out of bounds (`i ≥ as.length`), this function panics when executed, and returns
`default`. See `get?` and `getD` for safer alternatives.
-/
def get! [Inhabited α] : (as : List α) → (i : Nat) → α
def get! [Inhabited α] : List α → Nat → α
  | a::_, 0 => a
  | _::as, n+1 => get! as n
  | _, _ => panic! "invalid index"

/--
Returns the `i`-th element in the list (zero-based).

If the index is out of bounds (`i ≥ as.length`), this function returns `none`.
Also see `get`, `getD` and `get!`.
-/
def get? : (as : List α) → (i : Nat) → Option α
def get? : List α → Nat → Option α
  | a::_, 0 => some a
  | _::as, n+1 => get? as n
  | _, _ => none

/--
Returns the `i`-th element in the list (zero-based).

If the index is out of bounds (`i ≥ as.length`), this function returns `fallback`.
See also `get?` and `get!`.
-/
def getD (as : List α) (i : Nat) (fallback : α) : α :=
  (as.get? i).getD fallback

def getD (as : List α) (idx : Nat) (a₀ : α) : α :=
  (as.get? idx).getD a₀
@[ext] theorem ext : ∀ {l₁ l₂ : List α}, (∀ n, l₁.get? n = l₂.get? n) → l₁ = l₂
  | [], [], _ => rfl
  | a :: l₁, [], h => nomatch h 0
  | [], a' :: l₂, h => nomatch h 0
  | a :: l₁, a' :: l₂, h => by
    have h0 : some a = some a' := h 0
    injection h0 with aa; simp only [aa, ext fun n => h (n+1)]
/--
Returns the first element in the list.

If the list is empty, this function panics when executed, and returns `default`.
See `head` and `headD` for safer alternatives.
-/
def head! [Inhabited α] : List α → α
  | [] => panic! "empty list"
  | a::_ => a

/--
Returns the first element in the list.

If the list is empty, this function returns `none`.
Also see `headD` and `head!`.
-/
def head? : List α → Option α
  | [] => none
  | a::_ => some a

/--
Returns the first element in the list.

If the list is empty, this function returns `fallback`.
Also see `head?` and `head!`.
-/
def headD : (as : List α) → (fallback : α) → α
  | [], fallback => fallback
def headD : List α → α → α
  | [], a₀ => a₀
  | a::_, _ => a

/--
Returns the first element of a non-empty list.
-/
def head : (as : List α) → as ≠ [] → α
  | a::_, _ => a
/--
Drops the first element of the list.

If the list is empty, this function panics when executed, and returns the empty list.
See `tail` and `tailD` for safer alternatives.
-/
def tail! : List α → List α
  | [] => panic! "empty list"
  | _::as => as

/--
Drops the first element of the list.

If the list is empty, this function returns `none`.
Also see `tailD` and `tail!`.
-/
def tail? : List α → Option (List α)
  | [] => none
  | _::as => some as

/--
Drops the first element of the list.

If the list is empty, this function returns `fallback`.
Also see `tail?` and `tail!`.
-/
def tailD (list fallback : List α) : List α :=
  match list with
  | [] => fallback
  | _ :: tl => tl

def tailD : List α → List α → List α
  | [], as₀ => as₀
  | _::as, _ => as
/--
Returns the last element of a non-empty list.
-/
def getLast : ∀ (as : List α), as ≠ [] → α
  | [], h => absurd rfl h
  | [a], _ => a
  | _::b::as, _ => getLast (b::as) (fun h => List.noConfusion h)

/--
Returns the last element in the list.

If the list is empty, this function panics when executed, and returns `default`.
See `getLast` and `getLastD` for safer alternatives.
-/
def getLast! [Inhabited α] : List α → α
  | [] => panic! "empty list"
  | a::as => getLast (a::as) (fun h => List.noConfusion h)

/--
Returns the last element in the list.

If the list is empty, this function returns `none`.
Also see `getLastD` and `getLast!`.
-/
def getLast? : List α → Option α
  | [] => none
  | a::as => some (getLast (a::as) (fun h => List.noConfusion h))

/--
Returns the last element in the list.

If the list is empty, this function returns `fallback`.
Also see `getLast?` and `getLast!`.
-/
def getLastD : (as : List α) → (fallback : α) → α
def getLastD : List α → α → α
  | [], a₀ => a₀
  | a::as, _ => getLast (a::as) (fun h => List.noConfusion h)
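The accessor family documented above can be sanity-checked with `#guard` (a sketch; assumes these definitions are in scope as in Lean core):

```lean
#guard [1, 2, 3].get? 1 = some 2
#guard ([] : List Nat).getD 0 7 = 7
#guard [1, 2, 3].headD 0 = 1
#guard ([] : List Nat).tailD [9] = [9]
#guard [1, 2, 3].getLastD 0 = 3
```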
@@ -274,19 +274,6 @@ theorem get?_reverse {l : List α} (i) (h : i < length l) :

@[simp] theorem getD_cons_succ : getD (x :: xs) (n + 1) d = getD xs n d := rfl

theorem ext_get {l₁ l₂ : List α} (hl : length l₁ = length l₂)
    (h : ∀ n h₁ h₂, get l₁ ⟨n, h₁⟩ = get l₂ ⟨n, h₂⟩) : l₁ = l₂ :=
  ext fun n =>
    if h₁ : n < length l₁ then by
      rw [get?_eq_get, get?_eq_get, h n h₁ (by rwa [← hl])]
    else by
      have h₁ := Nat.le_of_not_lt h₁
      rw [get?_len_le h₁, get?_len_le]; rwa [← hl]

@[simp] theorem get_map (f : α → β) {l n} :
    get (map f l) n = f (get l ⟨n, length_map l f ▸ n.2⟩) :=
  Option.some.inj <| by rw [← get?_eq_get, get?_map, get?_eq_get]; rfl

/-! ### take and drop -/

@[simp] theorem take_append_drop : ∀ (n : Nat) (l : List α), take n l ++ drop n l = l
@@ -404,14 +391,6 @@ theorem foldr_eq_foldrM (f : α → β → β) (b) (l : List α) :
|
||||
|
||||
theorem foldr_self (l : List α) : l.foldr cons [] = l := by simp
|
||||
|
||||
theorem foldl_map (f : β₁ → β₂) (g : α → β₂ → α) (l : List β₁) (init : α) :
|
||||
(l.map f).foldl g init = l.foldl (fun x y => g x (f y)) init := by
|
||||
induction l generalizing init <;> simp [*]
|
||||
|
||||
theorem foldr_map (f : α₁ → α₂) (g : α₂ → β → β) (l : List α₁) (init : β) :
|
||||
(l.map f).foldr g init = l.foldr (fun x y => g (f x) y) init := by
|
||||
induction l generalizing init <;> simp [*]
|
||||
|
||||
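The `foldl_map`/`foldr_map` lemmas above fuse a `map` into the fold that follows it. A concrete instance of `foldl_map` (the binder names are mine, for illustration only):

```lean
example (l : List Nat) :
    (l.map (· + 1)).foldl (· + ·) 0 = l.foldl (fun acc x => acc + (x + 1)) 0 :=
  List.foldl_map (· + 1) (· + ·) l 0
```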
/-! ### mapM -/

/-- Alternate (non-tail-recursive) form of mapM for proofs. -/

@@ -13,21 +13,19 @@ namespace Option
deriving instance DecidableEq for Option
deriving instance BEq for Option

/-- Lifts an optional value to any `Alternative`, sending `none` to `failure`. -/
def getM [Alternative m] : Option α → m α
def toMonad [Monad m] [Alternative m] : Option α → m α
  | none => failure
  | some a => pure a

@[deprecated getM] def toMonad [Monad m] [Alternative m] : Option α → m α :=
  getM
@[inline] def toBool : Option α → Bool
  | some _ => true
  | none => false

/-- Returns `true` on `some x` and `false` on `none`. -/
@[inline] def isSome : Option α → Bool
  | some _ => true
  | none => false

@[deprecated isSome, inline] def toBool : Option α → Bool := isSome

/-- Returns `true` on `none` and `false` on `some x`. -/
@[inline] def isNone : Option α → Bool
  | some _ => false

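`getM` and the older name `toMonad` are the same lift of an `Option` into any `Alternative`; instantiated at `Option` itself it is the identity. A sketch using the older name, which appears on both sides of this diff:

```lean
-- `some` maps to `pure`, `none` maps to `failure` (which is `none` for `Option`).
example : (Option.toMonad (some 3) : Option Nat) = some 3 := rfl
example : (Option.toMonad (none : Option Nat) : Option Nat) = none := rfl
```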
@@ -182,13 +182,15 @@ instance [Ord α] : Ord (Option α) where

/-- The lexicographic order on pairs. -/
def lexOrd [Ord α] [Ord β] : Ord (α × β) where
  compare := compareLex (compareOn (·.1)) (compareOn (·.2))
  compare p1 p2 := match compare p1.1 p2.1 with
    | .eq => compare p1.2 p2.2
    | o => o

def ltOfOrd [Ord α] : LT α where
  lt a b := compare a b = Ordering.lt
  lt a b := compare a b == Ordering.lt

instance [Ord α] : DecidableRel (@LT.lt α ltOfOrd) :=
  inferInstanceAs (DecidableRel (fun a b => compare a b = Ordering.lt))
  inferInstanceAs (DecidableRel (fun a b => compare a b == Ordering.lt))

def leOfOrd [Ord α] : LE α where
  le a b := (compare a b).isLE

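The two `lexOrd` bodies above are equivalent: `compareLex` tries the first comparison and falls through to the second on `.eq`, exactly like the explicit `match`. For instance (my example, not from the diff):

```lean
-- First components tie, so the second components decide.
example : (lexOrd : Ord (Nat × Nat)).compare (1, 5) (1, 7) = .lt := rfl
```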
@@ -116,11 +116,6 @@ instance {p : α → Prop} [Repr α] : Repr (Subtype p) where

namespace Nat

/-
We have pure functions for calculating the decimal representation of a `Nat` (`toDigits`), but also
a fast variant that handles small numbers (`USize`) via C code (`lean_string_of_usize`).
-/

def digitChar (n : Nat) : Char :=
  if n = 0 then '0' else
  if n = 1 then '1' else
@@ -151,20 +146,6 @@ def toDigitsCore (base : Nat) : Nat → Nat → List Char → List Char
def toDigits (base : Nat) (n : Nat) : List Char :=
  toDigitsCore base (n+1) n []

@[extern "lean_string_of_usize"]
protected def _root_.USize.repr (n : @& USize) : String :=
  (toDigits 10 n.toNat).asString

/-- We statically allocate and memoize reprs for small natural numbers. -/
private def reprArray : Array String := Id.run do
  List.range 128 |>.map (·.toUSize.repr) |> Array.mk

private def reprFast (n : Nat) : String :=
  if h : n < 128 then Nat.reprArray.get ⟨n, h⟩ else
  if h : n < USize.size then (USize.ofNatCore n h).repr
  else (toDigits 10 n).asString

@[implemented_by reprFast]
protected def repr (n : Nat) : String :=
  (toDigits 10 n).asString

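`Nat.repr` is the pure reference implementation and `reprFast` the memoized/C-accelerated path; `@[implemented_by]` keeps them observationally equal. A small check against the reference definition (illustrative only):

```lean
-- Digits of 42 in base 10, rendered as a string.
example : Nat.repr 42 = "42" := rfl
```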
@@ -194,32 +175,6 @@ def toSuperDigits (n : Nat) : List Char :=
def toSuperscriptString (n : Nat) : String :=
  (toSuperDigits n).asString

def subDigitChar (n : Nat) : Char :=
  if n = 0 then '₀' else
  if n = 1 then '₁' else
  if n = 2 then '₂' else
  if n = 3 then '₃' else
  if n = 4 then '₄' else
  if n = 5 then '₅' else
  if n = 6 then '₆' else
  if n = 7 then '₇' else
  if n = 8 then '₈' else
  if n = 9 then '₉' else
  '*'

partial def toSubDigitsAux : Nat → List Char → List Char
  | n, ds =>
    let d := subDigitChar <| n % 10;
    let n' := n / 10;
    if n' = 0 then d::ds
    else toSubDigitsAux n' (d::ds)

def toSubDigits (n : Nat) : List Char :=
  toSubDigitsAux n []

def toSubscriptString (n : Nat) : String :=
  (toSubDigits n).asString

end Nat

instance : Repr Nat where

@@ -44,16 +44,6 @@ def append : String → (@& String) → String
def toList (s : String) : List Char :=
  s.data

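`toList` merely exposes the string's underlying character list:

```lean
example : "ab".toList = ['a', 'b'] := rfl
```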
/-- Returns true if `p` is a valid UTF-8 position in the string `s`, meaning that `p ≤ s.endPos`
and `p` lies on a UTF-8 character boundary. This has an O(1) implementation in the runtime. -/
@[extern "lean_string_is_valid_pos"]
def Pos.isValid (s : @& String) (p : @& Pos) : Bool :=
  go s.data 0
where
  go : List Char → Pos → Bool
    | [], i => i = p
    | c::cs, i => if i = p then true else go cs (i + c)

def utf8GetAux : List Char → Pos → Pos → Char
  | [], _, _ => default
  | c::cs, i, p => if i = p then c else utf8GetAux cs (i + c) p

@@ -17,116 +17,20 @@ def toNat! (s : String) : Nat :=
  else
    panic! "Nat expected"

def utf8DecodeChar? (a : ByteArray) (i : Nat) : Option Char := do
  let c ← a[i]?
  if c &&& 0x80 == 0 then
    some ⟨c.toUInt32, .inl (Nat.lt_trans c.1.2 (by decide))⟩
  else if c &&& 0xe0 == 0xc0 then
    let c1 ← a[i+1]?
    guard (c1 &&& 0xc0 == 0x80)
    let r := ((c &&& 0x1f).toUInt32 <<< 6) ||| (c1 &&& 0x3f).toUInt32
    guard (0x80 ≤ r)
    -- TODO: Prove h from the definition of r once we have the necessary lemmas
    if h : r < 0xd800 then some ⟨r, .inl h⟩ else none
  else if c &&& 0xf0 == 0xe0 then
    let c1 ← a[i+1]?
    let c2 ← a[i+2]?
    guard (c1 &&& 0xc0 == 0x80 && c2 &&& 0xc0 == 0x80)
    let r :=
      ((c &&& 0x0f).toUInt32 <<< 12) |||
      ((c1 &&& 0x3f).toUInt32 <<< 6) |||
      (c2 &&& 0x3f).toUInt32
    guard (0x800 ≤ r)
    -- TODO: Prove `r < 0x110000` from the definition of r once we have the necessary lemmas
    if h : r < 0xd800 ∨ 0xdfff < r ∧ r < 0x110000 then some ⟨r, h⟩ else none
  else if c &&& 0xf8 == 0xf0 then
    let c1 ← a[i+1]?
    let c2 ← a[i+2]?
    let c3 ← a[i+3]?
    guard (c1 &&& 0xc0 == 0x80 && c2 &&& 0xc0 == 0x80 && c3 &&& 0xc0 == 0x80)
    let r :=
      ((c &&& 0x07).toUInt32 <<< 18) |||
      ((c1 &&& 0x3f).toUInt32 <<< 12) |||
      ((c2 &&& 0x3f).toUInt32 <<< 6) |||
      (c3 &&& 0x3f).toUInt32
    if h : 0x10000 ≤ r ∧ r < 0x110000 then
      some ⟨r, .inr ⟨Nat.lt_of_lt_of_le (by decide) h.1, h.2⟩⟩
    else none
  else
    none
/--
Convert a [UTF-8](https://en.wikipedia.org/wiki/UTF-8) encoded `ByteArray` string to `String`.
The result is unspecified if `a` is not properly UTF-8 encoded.
-/
@[extern "lean_string_from_utf8_unchecked"]
opaque fromUTF8Unchecked (a : @& ByteArray) : String

/-- Returns true if the given byte array consists of valid UTF-8. -/
@[extern "lean_string_validate_utf8"]
def validateUTF8 (a : @& ByteArray) : Bool :=
  (loop 0).isSome
where
  loop (i : Nat) : Option Unit := do
    if i < a.size then
      let c ← utf8DecodeChar? a i
      loop (i + csize c)
    else pure ()
  termination_by a.size - i
  decreasing_by exact Nat.sub_lt_sub_left ‹_› (Nat.lt_add_of_pos_right (one_le_csize c))

/-- Converts a [UTF-8](https://en.wikipedia.org/wiki/UTF-8) encoded `ByteArray` string to `String`. -/
@[extern "lean_string_from_utf8"]
def fromUTF8 (a : @& ByteArray) (h : validateUTF8 a) : String :=
  loop 0 ""
where
  loop (i : Nat) (acc : String) : String :=
    if i < a.size then
      let c := (utf8DecodeChar? a i).getD default
      loop (i + csize c) (acc.push c)
    else acc
  termination_by a.size - i
  decreasing_by exact Nat.sub_lt_sub_left ‹_› (Nat.lt_add_of_pos_right (one_le_csize c))

/-- Converts a [UTF-8](https://en.wikipedia.org/wiki/UTF-8) encoded `ByteArray` string to `String`,
or returns `none` if `a` is not properly UTF-8 encoded. -/
@[inline] def fromUTF8? (a : ByteArray) : Option String :=
  if h : validateUTF8 a then fromUTF8 a h else none

/-- Converts a [UTF-8](https://en.wikipedia.org/wiki/UTF-8) encoded `ByteArray` string to `String`,
or panics if `a` is not properly UTF-8 encoded. -/
@[inline] def fromUTF8! (a : ByteArray) : String :=
  if h : validateUTF8 a then fromUTF8 a h else panic! "invalid UTF-8 string"

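The `fromUTF8?`/`fromUTF8!` wrappers above dispatch on `validateUTF8`. A sketch of both paths (my example; `native_decide` is used because `validateUTF8` is defined by well-founded recursion and does not reduce with `rfl`; the byte `0xff` can never start a UTF-8 sequence):

```lean
example : String.fromUTF8? (ByteArray.mk #[0xff]) = none := by native_decide
example : String.fromUTF8? "ok".toUTF8 = some "ok" := by native_decide
```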
def utf8EncodeChar (c : Char) : List UInt8 :=
  let v := c.val
  if v ≤ 0x7f then
    [v.toUInt8]
  else if v ≤ 0x7ff then
    [(v >>> 6).toUInt8 &&& 0x1f ||| 0xc0,
     v.toUInt8 &&& 0x3f ||| 0x80]
  else if v ≤ 0xffff then
    [(v >>> 12).toUInt8 &&& 0x0f ||| 0xe0,
     (v >>> 6).toUInt8 &&& 0x3f ||| 0x80,
     v.toUInt8 &&& 0x3f ||| 0x80]
  else
    [(v >>> 18).toUInt8 &&& 0x07 ||| 0xf0,
     (v >>> 12).toUInt8 &&& 0x3f ||| 0x80,
     (v >>> 6).toUInt8 &&& 0x3f ||| 0x80,
     v.toUInt8 &&& 0x3f ||| 0x80]

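`utf8EncodeChar` emits one to four bytes depending on the code point, following the standard UTF-8 ranges. A few spot checks (mine, not from the diff):

```lean
example : (String.utf8EncodeChar 'a').length = 1 := rfl  -- U+0061, ASCII
example : (String.utf8EncodeChar 'é').length = 2 := rfl  -- U+00E9
example : (String.utf8EncodeChar '€').length = 3 := rfl  -- U+20AC
```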
@[simp] theorem length_utf8EncodeChar (c : Char) : (utf8EncodeChar c).length = csize c := by
  simp [csize, utf8EncodeChar, Char.utf8Size]
  cases Decidable.em (c.val ≤ 0x7f) <;> simp [*]
  cases Decidable.em (c.val ≤ 0x7ff) <;> simp [*]
  cases Decidable.em (c.val ≤ 0xffff) <;> simp [*]

/-- Converts the given `String` to a [UTF-8](https://en.wikipedia.org/wiki/UTF-8) encoded byte array. -/
/-- Convert the given `String` to a [UTF-8](https://en.wikipedia.org/wiki/UTF-8) encoded byte array. -/
@[extern "lean_string_to_utf8"]
def toUTF8 (a : @& String) : ByteArray :=
  ⟨⟨a.data.bind utf8EncodeChar⟩⟩

@[simp] theorem size_toUTF8 (s : String) : s.toUTF8.size = s.utf8ByteSize := by
  simp [toUTF8, ByteArray.size, Array.size, utf8ByteSize, List.bind]
  induction s.data <;> simp [List.map, List.join, utf8ByteSize.go, Nat.add_comm, *]
opaque toUTF8 (a : @& String) : ByteArray

/-- Accesses a byte in the UTF-8 encoding of the `String`. O(1) -/
@[extern "lean_string_get_byte_fast"]
def getUtf8Byte (s : @& String) (n : Nat) (h : n < s.utf8ByteSize) : UInt8 :=
  (toUTF8 s).get ⟨n, size_toUTF8 _ ▸ h⟩
opaque getUtf8Byte (s : @& String) (n : Nat) (h : n < s.utf8ByteSize) : UInt8

theorem Iterator.sizeOf_next_lt_of_hasNext (i : String.Iterator) (h : i.hasNext) : sizeOf i.next < sizeOf i := by
  cases i; rename_i s pos; simp [Iterator.next, Iterator.sizeOf_eq]; simp [Iterator.hasNext] at h

@@ -103,7 +103,7 @@ def UInt16.shiftLeft (a b : UInt16) : UInt16 := ⟨a.val <<< (modn b 16).val⟩
@[extern "lean_uint16_to_uint8"]
def UInt16.toUInt8 (a : UInt16) : UInt8 := a.toNat.toUInt8
@[extern "lean_uint8_to_uint16"]
def UInt8.toUInt16 (a : UInt8) : UInt16 := ⟨a.val, Nat.lt_trans a.1.2 (by decide)⟩
def UInt8.toUInt16 (a : UInt8) : UInt16 := a.toNat.toUInt16
@[extern "lean_uint16_shift_right"]
def UInt16.shiftRight (a b : UInt16) : UInt16 := ⟨a.val >>> (modn b 16).val⟩
def UInt16.lt (a b : UInt16) : Prop := a.val < b.val
@@ -186,9 +186,9 @@ def UInt32.toUInt8 (a : UInt32) : UInt8 := a.toNat.toUInt8
@[extern "lean_uint32_to_uint16"]
def UInt32.toUInt16 (a : UInt32) : UInt16 := a.toNat.toUInt16
@[extern "lean_uint8_to_uint32"]
def UInt8.toUInt32 (a : UInt8) : UInt32 := ⟨a.val, Nat.lt_trans a.1.2 (by decide)⟩
def UInt8.toUInt32 (a : UInt8) : UInt32 := a.toNat.toUInt32
@[extern "lean_uint16_to_uint32"]
def UInt16.toUInt32 (a : UInt16) : UInt32 := ⟨a.val, Nat.lt_trans a.1.2 (by decide)⟩
def UInt16.toUInt32 (a : UInt16) : UInt32 := a.toNat.toUInt32

instance UInt32.instOfNat : OfNat UInt32 n := ⟨UInt32.ofNat n⟩
instance : Add UInt32 := ⟨UInt32.add⟩
@@ -244,11 +244,11 @@ def UInt64.toUInt16 (a : UInt64) : UInt16 := a.toNat.toUInt16
@[extern "lean_uint64_to_uint32"]
def UInt64.toUInt32 (a : UInt64) : UInt32 := a.toNat.toUInt32
@[extern "lean_uint8_to_uint64"]
def UInt8.toUInt64 (a : UInt8) : UInt64 := ⟨a.val, Nat.lt_trans a.1.2 (by decide)⟩
def UInt8.toUInt64 (a : UInt8) : UInt64 := a.toNat.toUInt64
@[extern "lean_uint16_to_uint64"]
def UInt16.toUInt64 (a : UInt16) : UInt64 := ⟨a.val, Nat.lt_trans a.1.2 (by decide)⟩
def UInt16.toUInt64 (a : UInt16) : UInt64 := a.toNat.toUInt64
@[extern "lean_uint32_to_uint64"]
def UInt32.toUInt64 (a : UInt32) : UInt64 := ⟨a.val, Nat.lt_trans a.1.2 (by decide)⟩
def UInt32.toUInt64 (a : UInt32) : UInt64 := a.toNat.toUInt64

instance UInt64.instOfNat : OfNat UInt64 n := ⟨UInt64.ofNat n⟩
instance : Add UInt64 := ⟨UInt64.add⟩
@@ -321,7 +321,7 @@ def USize.shiftLeft (a b : USize) : USize := ⟨a.val <<< (modn b System.Platfor
@[extern "lean_usize_shift_right"]
def USize.shiftRight (a b : USize) : USize := ⟨a.val >>> (modn b System.Platform.numBits).val⟩
@[extern "lean_uint32_to_usize"]
def UInt32.toUSize (a : UInt32) : USize := USize.ofNat32 a.val a.1.2
def UInt32.toUSize (a : UInt32) : USize := a.toNat.toUSize
@[extern "lean_usize_to_uint32"]
def USize.toUInt32 (a : USize) : UInt32 := a.toNat.toUInt32

@@ -311,8 +311,6 @@ Note that EOF does not actually close a stream, so further reads may block and r
-/
  getLine : IO String
  putStr : String → IO Unit
  /-- Returns true if a stream refers to a Windows console or Unix terminal. -/
  isTty : BaseIO Bool
  deriving Inhabited

open FS
@@ -362,9 +360,6 @@ Will succeed even if no lock has been acquired.
-/
@[extern "lean_io_prim_handle_unlock"] opaque unlock (h : @& Handle) : IO Unit

/-- Returns true if a handle refers to a Windows console or Unix terminal. -/
@[extern "lean_io_prim_handle_is_tty"] opaque isTty (h : @& Handle) : BaseIO Bool

@[extern "lean_io_prim_handle_flush"] opaque flush (h : @& Handle) : IO Unit
/-- Rewinds the read/write cursor to the beginning of the handle. -/
@[extern "lean_io_prim_handle_rewind"] opaque rewind (h : @& Handle) : IO Unit
@@ -748,41 +743,36 @@ namespace FS
namespace Stream

@[export lean_stream_of_handle]
def ofHandle (h : Handle) : Stream where
  flush := Handle.flush h
  read := Handle.read h
  write := Handle.write h
  getLine := Handle.getLine h
  putStr := Handle.putStr h
  isTty := Handle.isTty h
def ofHandle (h : Handle) : Stream := {
  flush := Handle.flush h,
  read := Handle.read h,
  write := Handle.write h,
  getLine := Handle.getLine h,
  putStr := Handle.putStr h,
}

structure Buffer where
  data : ByteArray := ByteArray.empty
  pos : Nat := 0

def ofBuffer (r : Ref Buffer) : Stream where
  flush := pure ()
def ofBuffer (r : Ref Buffer) : Stream := {
  flush := pure (),
  read := fun n => r.modifyGet fun b =>
    let data := b.data.extract b.pos (b.pos + n.toNat)
    (data, { b with pos := b.pos + data.size })
    (data, { b with pos := b.pos + data.size }),
  write := fun data => r.modify fun b =>
    -- set `exact` to `false` so that repeatedly writing to the stream does not impose quadratic run time
    { b with data := data.copySlice 0 b.data b.pos data.size false, pos := b.pos + data.size }
  getLine := do
    let buf ← r.modifyGet fun b =>
      let pos := match b.data.findIdx? (start := b.pos) fun u => u == 0 || u = '\n'.toNat.toUInt8 with
        -- include '\n', but not '\0'
        | some pos => if b.data.get! pos == 0 then pos else pos + 1
        | none => b.data.size
      (b.data.extract b.pos pos, { b with pos := pos })
    match String.fromUTF8? buf with
    | some str => pure str
    | none => throw (.userError "invalid UTF-8")
    { b with data := data.copySlice 0 b.data b.pos data.size false, pos := b.pos + data.size },
  getLine := r.modifyGet fun b =>
    let pos := match b.data.findIdx? (start := b.pos) fun u => u == 0 || u = '\n'.toNat.toUInt8 with
      -- include '\n', but not '\0'
      | some pos => if b.data.get! pos == 0 then pos else pos + 1
      | none => b.data.size
    (String.fromUTF8Unchecked <| b.data.extract b.pos pos, { b with pos := pos }),
  putStr := fun s => r.modify fun b =>
    let data := s.toUTF8
    { b with data := data.copySlice 0 b.data b.pos data.size false, pos := b.pos + data.size }
  isTty := pure false

  { b with data := data.copySlice 0 b.data b.pos data.size false, pos := b.pos + data.size },
}
end Stream

/-- Run action with `stdin` emptied and `stdout+stderr` captured into a `String`. -/
@@ -795,7 +785,7 @@ def withIsolatedStreams [Monad m] [MonadFinally m] [MonadLiftT BaseIO m] (x : m
      (if isolateStderr then withStderr (Stream.ofBuffer bOut) else id) <|
        x
  let bOut ← liftM (m := BaseIO) bOut.get
  let out := String.fromUTF8! bOut.data
  let out := String.fromUTF8Unchecked bOut.data
  pure (out, r)

end FS

@@ -50,7 +50,7 @@ def decodeUri (uri : String) : String := Id.run do
        ((decoded.push c).push h1, i + 2)
      else
        (decoded.push c, i + 1)
  return String.fromUTF8! decoded
  return String.fromUTF8Unchecked decoded
where hexDigitToUInt8? (c : UInt8) : Option UInt8 :=
  if zero ≤ c ∧ c ≤ nine then some (c - zero)
  else if lettera ≤ c ∧ c ≤ letterf then some (c - lettera + 10)

@@ -1125,14 +1125,11 @@ normalizes `h` with `norm_cast` and tries to use that to close the goal. -/
macro "assumption_mod_cast" : tactic => `(tactic| norm_cast0 at * <;> assumption)

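A minimal usage of the `_mod_cast` tactic family described in this file (my example, not part of the diff):

```lean
example (a b : Nat) (h : (a : Int) = b) : a = b := by exact_mod_cast h
example (a b : Nat) (h : a = b) : (a : Int) = b := by assumption_mod_cast
```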
/--
The `norm_cast` family of tactics is used to normalize certain coercions (*casts*) in expressions.
- `norm_cast` normalizes casts in the target.
- `norm_cast at h` normalizes casts in hypothesis `h`.

The tactic is basically a version of `simp` with a specific set of lemmas to move casts
The `norm_cast` family of tactics is used to normalize casts inside expressions.
It is basically a `simp` tactic with a specific set of lemmas to move casts
upwards in the expression.
Therefore even in situations where non-terminal `simp` calls are discouraged (because of fragility),
`norm_cast` is considered to be safe.
`norm_cast` is considered safe.
It also has special handling of numerals.

For instance, given an assumption
@@ -1140,22 +1137,22 @@ For instance, given an assumption
a b : ℤ
h : ↑a + ↑b < (10 : ℚ)
```

writing `norm_cast at h` will turn `h` into
```lean
h : a + b < 10
```

There are also variants of basic tactics that use `norm_cast` to normalize expressions during
their operation, to make them more flexible about the expressions they accept
(we say that it is a tactic *modulo* the effects of `norm_cast`):
- `exact_mod_cast` for `exact` and `apply_mod_cast` for `apply`.
  Writing `exact_mod_cast h` and `apply_mod_cast h` will normalize casts
  in the goal and `h` before using `exact h` or `apply h`.
- `rw_mod_cast` for `rw`. It applies `norm_cast` between rewrites.
- `assumption_mod_cast` for `assumption`.
  This is effectively `norm_cast at *; assumption`, but more efficient.
  It normalizes casts in the goal and, for every hypothesis `h` in the context,
  it will try to normalize casts in `h` and use `exact h`.
There are also variants of `exact`, `apply`, `rw`, and `assumption` that
work modulo `norm_cast` - in other words, they apply `norm_cast` to make
them more flexible. They are called `exact_mod_cast`, `apply_mod_cast`,
`rw_mod_cast`, and `assumption_mod_cast`, respectively.
Writing `exact_mod_cast h` and `apply_mod_cast h` will normalize casts
in the goal and `h` before using `exact h` or `apply h`.
Writing `assumption_mod_cast` will normalize casts in the goal and, for
every hypothesis `h` in the context, it will try to normalize casts in `h` and use
`exact h`.
`rw_mod_cast` acts like the `rw` tactic but it applies `norm_cast` between steps.

See also `push_cast`, which moves casts inwards rather than lifting them outwards.
-/
@@ -1163,37 +1160,22 @@ macro "norm_cast" loc:(location)? : tactic =>
  `(tactic| norm_cast0 $[$loc]? <;> try trivial)

/--
`push_cast` rewrites the goal to move certain coercions (*casts*) inward, toward the leaf nodes.
`push_cast` rewrites the goal to move casts inward, toward the leaf nodes.
This uses `norm_cast` lemmas in the forward direction.
For example, `↑(a + b)` will be written to `↑a + ↑b`.
- `push_cast` moves casts inward in the goal.
- `push_cast at h` moves casts inward in the hypothesis `h`.
It can be used with extra simp lemmas with, for example, `push_cast [Int.add_zero]`.
It is equivalent to `simp only with push_cast`.
It can also be used at hypotheses with `push_cast at h`
and with extra simp lemmas with `push_cast [int.add_zero]`.

Example:
```lean
example (a b : Nat)
    (h1 : ((a + b : Nat) : Int) = 10)
    (h2 : ((a + b + 0 : Nat) : Int) = 10) :
    ((a + b : Nat) : Int) = 10 := by
  /-
  h1 : ↑(a + b) = 10
  h2 : ↑(a + b + 0) = 10
  ⊢ ↑(a + b) = 10
  -/
  push_cast
  /- Now
  ⊢ ↑a + ↑b = 10
  -/
  push_cast at h1
  push_cast [Int.add_zero] at h2
  /- Now
  h1 h2 : ↑a + ↑b = 10
  -/
  exact h1
example (a b : ℕ) (h1 : ((a + b : ℕ) : ℤ) = 10) (h2 : ((a + b + 0 : ℕ) : ℤ) = 10) :
    ((a + b : ℕ) : ℤ) = 10 :=
begin
  push_cast,
  push_cast at h1,
  push_cast [int.add_zero] at h2,
end
```

See also `norm_cast`.
-/
syntax (name := pushCast) "push_cast" (config)? (discharger)? (&" only")?
  (" [" (simpStar <|> simpErase <|> simpLemma),* "]")? (location)? : tactic

@@ -1,27 +0,0 @@
/-
Copyright (c) 2024 Mario Carneiro. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Mario Carneiro
-/
prelude
import Lean.Compiler.InitAttr
import Lean.DocString

namespace Lean

def declareBuiltinDocStringAndRanges (declName : Name) : AttrM Unit := do
  if let some doc ← findDocString? (← getEnv) declName (includeBuiltin := false) then
    declareBuiltin (declName ++ `docString) (mkAppN (mkConst ``addBuiltinDocString) #[toExpr declName, toExpr doc])
  if let some declRanges ← findDeclarationRanges? declName then
    declareBuiltin (declName ++ `declRange) (mkAppN (mkConst ``addBuiltinDeclarationRanges) #[toExpr declName, toExpr declRanges])

builtin_initialize
  registerBuiltinAttribute {
    name := `builtin_doc
    descr := "make the docs and location of this declaration available as a builtin"
    add := fun decl stx _ => do
      Attribute.Builtin.ensureNoArgs stx
      declareBuiltinDocStringAndRanges decl
  }

end Lean
@@ -12,8 +12,7 @@ Run the code generation pipeline for all declarations in `declNames`
that fulfill the requirements of `shouldGenerateCode`.
-/
def compile (declNames : Array Name) : CoreM Unit := do profileitM Exception "compiler new" (← getOptions) do
  withTraceNode `Compiler (fun _ => return m!"compiling: {declNames}") do
    discard <| LCNF.compile declNames
  discard <| LCNF.compile declNames

builtin_initialize
  registerTraceClass `Compiler

@@ -177,13 +177,6 @@ instance : MonadTrace CoreM where
def restore (b : State) : CoreM Unit :=
  modify fun s => { s with env := b.env, messages := b.messages, infoState := b.infoState }

/--
Restores full state including sources for unique identifiers. Only intended for incremental reuse
between elaboration runs, not for backtracking within a single run.
-/
def restoreFull (b : State) : CoreM Unit :=
  set b

private def mkFreshNameImp (n : Name) : CoreM Name := do
  let fresh ← modifyGet fun s => (s.nextMacroScope, { s with nextMacroScope := s.nextMacroScope + 1 })
  return addMacroScope (← getEnv).mainModule n fresh
@@ -252,13 +245,6 @@ def resetMessageLog : CoreM Unit :=
def getMessageLog : CoreM MessageLog :=
  return (← get).messages

/--
Returns the current log and then resets its messages but does NOT reset `MessageLog.hadErrors`. Used
for incremental reporting during elaboration of a single command.
-/
def getAndEmptyMessageLog : CoreM MessageLog :=
  modifyGet fun log => ({ log with msgs := {} }, log)

instance : MonadLog CoreM where
  getRef := getRef
  getFileMap := return (← read).fileMap
@@ -344,13 +330,10 @@ opaque compileDeclsNew (declNames : List Name) : CoreM Unit

def compileDecl (decl : Declaration) : CoreM Unit := do
  let opts ← getOptions
  let decls := Compiler.getDeclNamesForCodeGen decl
  if compiler.enableNew.get opts then
    compileDeclsNew decls
  let res ← withTraceNode `compiler (fun _ => return m!"compiling old: {decls}") do
    return (← getEnv).compileDecl opts decl
  match res with
  | Except.ok env => setEnv env
    compileDeclsNew (Compiler.getDeclNamesForCodeGen decl)
  match (← getEnv).compileDecl opts decl with
  | Except.ok env => setEnv env
  | Except.error (KernelException.other msg) =>
    checkUnsupported decl -- Generate nicer error message for unsupported recursors and axioms
    throwError msg

@@ -18,7 +18,7 @@ open IO
/-- Consumes `nBytes` bytes from the stream, interprets the bytes as a UTF-8 string and the string as a valid JSON object. -/
def readJson (h : FS.Stream) (nBytes : Nat) : IO Json := do
  let bytes ← h.read (USize.ofNat nBytes)
  let some s := String.fromUTF8? bytes | throw (IO.userError "invalid UTF-8")
  let s := String.fromUTF8Unchecked bytes
  ofExcept (Json.parse s)

def writeJson (h : FS.Stream) (j : Json) : IO Unit := do

@@ -324,7 +324,7 @@ inductive SemanticTokenType where
  | decorator
  -- Extensions
  | leanSorryLike
  deriving ToJson, FromJson, BEq, Hashable
  deriving ToJson, FromJson

-- must be in the same order as the constructors
def SemanticTokenType.names : Array String :=

@@ -43,19 +43,11 @@ def fail (msg : String) : Parsec α := fun it =>
  error it msg

@[inline]
def tryCatch (p : Parsec α)
    (csuccess : α → Parsec β)
    (cerror : Unit → Parsec β)
    : Parsec β := fun it =>
def orElse (p : Parsec α) (q : Unit → Parsec α) : Parsec α := fun it =>
  match p it with
  | .success rem a => csuccess a rem
  | .error rem err =>
    -- We assume that it.s never changes as the `Parsec` monad only modifies `it.pos`.
    if it.pos = rem.pos then cerror () rem else .error rem err

@[inline]
def orElse (p : Parsec α) (q : Unit → Parsec α) : Parsec α :=
  tryCatch p pure q
  | success rem a => success rem a
  | error rem err =>
    if it = rem then q () it else error rem err

@[inline]
def attempt (p : Parsec α) : Parsec α := λ it =>
@@ -82,7 +74,8 @@ def eof : Parsec Unit := fun it =>

@[specialize]
partial def manyCore (p : Parsec α) (acc : Array α) : Parsec $ Array α :=
  tryCatch p (manyCore p <| acc.push ·) (fun _ => pure acc)
  (do manyCore p (acc.push $ ←p))
  <|> pure acc

@[inline]
def many (p : Parsec α) : Parsec $ Array α := manyCore p #[]
@@ -92,7 +85,8 @@ def many1 (p : Parsec α) : Parsec $ Array α := do manyCore p #[←p]

@[specialize]
partial def manyCharsCore (p : Parsec Char) (acc : String) : Parsec String :=
  tryCatch p (manyCharsCore p <| acc.push ·) (fun _ => pure acc)
  (do manyCharsCore p (acc.push $ ←p))
  <|> pure acc

@[inline]
def manyChars (p : Parsec Char) : Parsec String := manyCharsCore p ""

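Both formulations of `manyCore` retry `p` until it fails without consuming input, accumulating the results. A typical use (a sketch of mine; `pchar` and `run` come from the same `Parsec` library):

```lean
open Lean.Parsec in
#eval (many (pchar 'a')).run "aab"
```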
@@ -5,7 +5,6 @@ Authors: Leonardo de Moura, Sebastian Ullrich
-/
prelude
import Lean.Data.Format
import Lean.Data.Json
import Lean.ToExpr

namespace Lean
@@ -13,7 +12,7 @@ namespace Lean
structure Position where
  line : Nat
  column : Nat
  deriving Inhabited, DecidableEq, Repr, ToJson, FromJson
  deriving Inhabited, DecidableEq, Repr

namespace Position
protected def lt : Position → Position → Bool

@@ -49,26 +49,13 @@ def ReducibilityHints.getHeightEx (h : ReducibilityHints) : UInt32 :=

namespace ReducibilityHints

-- Recall that if `lt h₁ h₂`, we want to reduce the declaration associated with `h₁`.
def lt : ReducibilityHints → ReducibilityHints → Bool
  | .abbrev, .abbrev => false
  | .abbrev, _ => true
  | .regular d₁, .regular d₂ => d₁ > d₂
  | .regular d₁, .regular d₂ => d₁ < d₂
  | .regular _, .opaque => true
  | _, _ => false

protected def compare : ReducibilityHints → ReducibilityHints → Ordering
  | .abbrev, .abbrev => .eq
  | .abbrev, _ => .lt
  | .regular _, .abbrev => .gt
  | .regular d₁, .regular d₂ => Ord.compare d₂ d₁
  | .regular _, .opaque => .lt
  | .opaque, .opaque => .eq
  | .opaque, _ => .gt

instance : Ord ReducibilityHints where
  compare := ReducibilityHints.compare

def isAbbrev : ReducibilityHints → Bool
  | .abbrev => true
  | _ => false

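Note the substantive flip in the two-`.regular` case of `lt` above (`d₁ > d₂` on one side of the diff, `d₁ < d₂` on the other): the revisions disagree about whether the larger or smaller definitional height should unfold first. The remaining cases agree, e.g. (my examples):

```lean
-- `abbrev` hints always unfold before anything else; `opaque` never before `abbrev`.
example : ReducibilityHints.lt .abbrev (.regular 1) = true := rfl
example : ReducibilityHints.lt .opaque .abbrev = false := rfl
```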
@@ -1194,17 +1194,6 @@ private def addLValArg (baseName : Name) (fullName : Name) (e : Expr) (args : Ar
|
||||
argIdx := argIdx + 1
|
||||
throwError "invalid field notation, function '{fullName}' does not have argument with type ({baseName} ...) that can be used, it must be explicit or implicit with a unique name"
|
||||
|
||||
/-- Adds the `TermInfo` for the field of a projection. See `Lean.Parser.Term.identProjKind`. -/
|
||||
private def addProjTermInfo
|
||||
(stx : Syntax)
|
||||
(e : Expr)
|
||||
(expectedType? : Option Expr := none)
|
||||
(lctx? : Option LocalContext := none)
|
||||
(elaborator : Name := Name.anonymous)
|
||||
(isBinder force : Bool := false)
|
||||
: TermElabM Expr :=
|
||||
addTermInfo (Syntax.node .none Parser.Term.identProjKind #[stx]) e expectedType? lctx? elaborator isBinder force
|
||||
|
||||
private def elabAppLValsAux (namedArgs : Array NamedArg) (args : Array Arg) (expectedType? : Option Expr) (explicit ellipsis : Bool)
|
||||
(f : Expr) (lvals : List LVal) : TermElabM Expr :=
|
||||
let rec loop : Expr → List LVal → TermElabM Expr
|
||||
@@ -1225,7 +1214,7 @@ private def elabAppLValsAux (namedArgs : Array NamedArg) (args : Array Arg) (exp
|
||||
if isPrivateNameFromImportedModule (← getEnv) info.projFn then
|
||||
throwError "field '{fieldName}' from structure '{structName}' is private"
|
||||
let projFn ← mkConst info.projFn
|
||||
let projFn ← addProjTermInfo lval.getRef projFn
|
||||
let projFn ← addTermInfo lval.getRef projFn
|
||||
if lvals.isEmpty then
|
||||
let namedArgs ← addNamedArg namedArgs { name := `self, val := Arg.expr f }
|
||||
elabAppArgs projFn namedArgs args expectedType? explicit ellipsis
|
||||
@@ -1237,7 +1226,7 @@ private def elabAppLValsAux (namedArgs : Array NamedArg) (args : Array Arg) (exp
|
||||
| LValResolution.const baseStructName structName constName =>
|
||||
let f ← if baseStructName != structName then mkBaseProjections baseStructName structName f else pure f
|
||||
let projFn ← mkConst constName
|
||||
let projFn ← addProjTermInfo lval.getRef projFn
|
||||
let projFn ← addTermInfo lval.getRef projFn
|
||||
if lvals.isEmpty then
|
||||
let projFnType ← inferType projFn
|
||||
let (args, namedArgs) ← addLValArg baseStructName constName f args namedArgs projFnType
|
||||
@@ -1246,7 +1235,7 @@ private def elabAppLValsAux (namedArgs : Array NamedArg) (args : Array Arg) (exp
|
||||
let f ← elabAppArgs projFn #[] #[Arg.expr f] (expectedType? := none) (explicit := false) (ellipsis := false)
|
||||
loop f lvals
|
||||
| LValResolution.localRec baseName fullName fvar =>
|
||||
let fvar ← addProjTermInfo lval.getRef fvar
|
||||
let fvar ← addTermInfo lval.getRef fvar
|
||||
if lvals.isEmpty then
|
||||
let fvarType ← inferType fvar
|
||||
let (args, namedArgs) ← addLValArg baseName fullName f args namedArgs fvarType
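The `LValResolution.const` branch above implements generalized field notation: a name after the dot is looked up in the namespace of the head symbol's type, and the object is passed to the first suitable argument via `addLValArg`. A small illustration (a sketch; `Point` and `double` are hypothetical names, not part of this change):

```lean
structure Point where
  x : Nat

-- `double` lives in the `Point` namespace, so `p.double` resolves to
-- `Point.double p` through the `LValResolution.const` path
def Point.double (p : Point) : Point := ⟨p.x * 2⟩

#eval (Point.mk 3).double.x  -- 6
```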

@@ -119,6 +119,64 @@ private partial def elabChoiceAux (cmds : Array Syntax) (i : Nat) : CommandElabM
@[builtin_command_elab choice] def elabChoice : CommandElab := fun stx =>
  elabChoiceAux stx.getArgs 0

/-- Declares one or more universe variables.

`universe u v`

`Prop`, `Type`, `Type u` and `Sort u` are types that classify other types, also known as
*universes*. In `Type u` and `Sort u`, the variable `u` stands for the universe's *level*, and a
universe at level `u` can only classify universes that are at levels lower than `u`. For more
details on type universes, please refer to [the relevant chapter of Theorem Proving in Lean][tpil
universes].

Just as type arguments allow polymorphic definitions to be used at many different types, universe
parameters, represented by universe variables, allow a definition to be used at any required level.
While Lean mostly handles universe levels automatically, declaring them explicitly can provide more
control when writing signatures. The `universe` keyword allows the declared universe variables to be
used in a collection of definitions, and Lean will ensure that these definitions use them
consistently.

[tpil universes]: https://lean-lang.org/theorem_proving_in_lean4/dependent_type_theory.html#types-as-objects
(Type universes on Theorem Proving in Lean)

```lean
/- Explicit type-universe parameter. -/
def id₁.{u} (α : Type u) (a : α) := a

/- Implicit type-universe parameter, equivalent to `id₁`.
   Requires option `autoImplicit true`, which is the default. -/
def id₂ (α : Type u) (a : α) := a

/- Explicit standalone universe variable declaration, equivalent to `id₁` and `id₂`. -/
universe u
def id₃ (α : Type u) (a : α) := a
```

On a more technical note, using a universe variable only in the right-hand side of a definition
causes an error if the universe has not been declared previously.

```lean
def L₁.{u} := List (Type u)

-- def L₂ := List (Type u) -- error: `unknown universe level 'u'`

universe u
def L₃ := List (Type u)
```

## Examples

```lean
universe u v w

structure Pair (α : Type u) (β : Type v) : Type (max u v) where
  a : α
  b : β

#check Pair.{v, w}
-- Pair : Type v → Type w → Type (max v w)
```
-/
@[builtin_command_elab «universe»] def elabUniverse : CommandElab := fun n => do
  n[1].forArgsM addUnivLevel

@@ -127,6 +185,30 @@ private partial def elabChoiceAux (cmds : Array Syntax) (i : Nat) : CommandElabM
    | Except.ok env => setEnv env
    | Except.error ex => throwError (ex.toMessageData (← getOptions))

/-- Adds names from other namespaces to the current namespace.

The command `export Some.Namespace (name₁ name₂)` makes `name₁` and `name₂`:

- visible in the current namespace without prefix `Some.Namespace`, like `open`, and
- visible from outside the current namespace `N` as `N.name₁` and `N.name₂`.

## Examples

```lean
namespace Morning.Sky
  def star := "venus"
end Morning.Sky

namespace Evening.Sky
  export Morning.Sky (star)
  -- `star` is now in scope
  #check star
end Evening.Sky

-- `star` is visible in `Evening.Sky`
#check Evening.Sky.star
```
-/
@[builtin_command_elab «export»] def elabExport : CommandElab := fun stx => do
  let `(export $ns ($ids*)) := stx | throwUnsupportedSyntax
  let nss ← resolveNamespace ns
@@ -141,6 +223,118 @@ private partial def elabChoiceAux (cmds : Array Syntax) (i : Nat) : CommandElabM
      aliases := aliases.push (currNamespace ++ id, declName)
  modify fun s => { s with env := aliases.foldl (init := s.env) fun env p => addAlias env p.1 p.2 }

/-- Makes names from other namespaces visible without writing the namespace prefix.

Names that are made available with `open` are visible within the current `section` or `namespace`
block. This makes referring to (type) definitions and theorems easier, but note that it can also
make [scoped instances], notations, and attributes from a different namespace available.

The `open` command can be used in a few different ways:

* `open Some.Namespace.Path1 Some.Namespace.Path2` makes all non-protected names in
  `Some.Namespace.Path1` and `Some.Namespace.Path2` available without the prefix, so that
  `Some.Namespace.Path1.x` and `Some.Namespace.Path2.y` can be referred to by writing only `x` and
  `y`.

* `open Some.Namespace.Path hiding def1 def2` opens all non-protected names in `Some.Namespace.Path`
  except `def1` and `def2`.

* `open Some.Namespace.Path (def1 def2)` only makes `Some.Namespace.Path.def1` and
  `Some.Namespace.Path.def2` available without the full prefix, so `Some.Namespace.Path.def3` would
  be unaffected.

  This works even if `def1` and `def2` are `protected`.

* `open Some.Namespace.Path renaming def1 → def1', def2 → def2'` is the same as
  `open Some.Namespace.Path (def1 def2)`, but `def1`/`def2`'s names are changed to `def1'`/`def2'`.

  This works even if `def1` and `def2` are `protected`.

* `open scoped Some.Namespace.Path1 Some.Namespace.Path2` **only** opens [scoped instances],
  notations, and attributes from `Some.Namespace.Path1` and `Some.Namespace.Path2`; it does
  **not** make any other name available.

* `open <any of the open shapes above> in` makes the names `open`-ed visible only in the next
  command or expression.

[scoped instance]: https://lean-lang.org/theorem_proving_in_lean4/type_classes.html#scoped-instances
(Scoped instances in Theorem Proving in Lean)

## Examples

```lean
/-- SKI combinators https://en.wikipedia.org/wiki/SKI_combinator_calculus -/
namespace Combinator.Calculus
  def I (a : α) : α := a
  def K (a : α) : β → α := fun _ => a
  def S (x : α → β → γ) (y : α → β) (z : α) : γ := x z (y z)
end Combinator.Calculus

section
  -- open everything under `Combinator.Calculus`, *i.e.* `I`, `K` and `S`,
  -- until the section ends
  open Combinator.Calculus

  theorem SKx_eq_K : S K x = I := rfl
end

-- open everything under `Combinator.Calculus` only for the next command (the next `theorem`, here)
open Combinator.Calculus in
theorem SKx_eq_K' : S K x = I := rfl

section
  -- open only `S` and `K` under `Combinator.Calculus`
  open Combinator.Calculus (S K)

  theorem SKxy_eq_y : S K x y = y := rfl

  -- `I` is not in scope, we have to use its full path
  theorem SKxy_eq_Iy : S K x y = Combinator.Calculus.I y := rfl
end

section
  open Combinator.Calculus
    renaming
      I → identity,
      K → konstant

  #check identity
  #check konstant
end

section
  open Combinator.Calculus
    hiding S

  #check I
  #check K
end

section
  namespace Demo
    inductive MyType
      | val

    namespace N1
      scoped infix:68 " ≋ " => BEq.beq

      scoped instance : BEq MyType where
        beq _ _ := true

      def Alias := MyType
    end N1
  end Demo

  -- bring `≋` and the instance in scope, but not `Alias`
  open scoped Demo.N1

  #check Demo.MyType.val == Demo.MyType.val
  #check Demo.MyType.val ≋ Demo.MyType.val
  -- #check Alias -- unknown identifier 'Alias'
end
```
-/
@[builtin_command_elab «open»] def elabOpen : CommandElab
  | `(open $decl:openDecl) => do
    let openDecls ← elabOpenDecl decl
@@ -226,6 +420,102 @@ private def replaceBinderAnnotation (binder : TSyntax ``Parser.Term.bracketedBin
  else
    return #[binder]

/-- Declares one or more typed variables, or modifies whether already-declared variables are
implicit.

Introduces variables that can be used in definitions within the same `namespace` or `section` block.
When a definition mentions a variable, Lean will add it as an argument of the definition. The
`variable` command is also able to add typeclass parameters. This is useful in particular when
writing many definitions that have parameters in common (see below for an example).

Variable declarations have the same flexibility as regular function parameters. In particular they
can be [explicit, implicit][binder docs], or [instance implicit][tpil classes] (in which case they
can be anonymous). This can be changed; for instance, one can turn the explicit variable `x` into an
implicit one with `variable {x}`. Note that currently, you should avoid changing how variables are
bound and declare new variables at the same time; see [issue 2789] for more on this topic.

See [*Variables and Sections* from Theorem Proving in Lean][tpil vars] for a more detailed
discussion.

[tpil vars]: https://lean-lang.org/theorem_proving_in_lean4/dependent_type_theory.html#variables-and-sections
(Variables and Sections on Theorem Proving in Lean)
[tpil classes]: https://lean-lang.org/theorem_proving_in_lean4/type_classes.html
(Type classes on Theorem Proving in Lean)
[binder docs]: https://leanprover-community.github.io/mathlib4_docs/Lean/Expr.html#Lean.BinderInfo
(Documentation for the BinderInfo type)
[issue 2789]: https://github.com/leanprover/lean4/issues/2789
(Issue 2789 on github)

## Examples

```lean
section
  variable
    {α : Type u}      -- implicit
    (a : α)           -- explicit
    [instBEq : BEq α] -- instance implicit, named
    [Hashable α]      -- instance implicit, anonymous

  def isEqual (b : α) : Bool :=
    a == b

  #check isEqual
  -- isEqual.{u} {α : Type u} (a : α) [instBEq : BEq α] (b : α) : Bool

  variable
    {a} -- `a` is implicit now

  def eqComm {b : α} := a == b ↔ b == a

  #check eqComm
  -- eqComm.{u} {α : Type u} {a : α} [instBEq : BEq α] {b : α} : Prop
end
```

The following shows a typical use of `variable` to factor out definition arguments:

```lean
variable (Src : Type)

structure Logger where
  trace : List (Src × String)
#check Logger
-- Logger (Src : Type) : Type

namespace Logger
  -- switch `Src : Type` to be implicit until the `end Logger`
  variable {Src}

  def empty : Logger Src where
    trace := []
  #check empty
  -- Logger.empty {Src : Type} : Logger Src

  variable (log : Logger Src)

  def len :=
    log.trace.length
  #check len
  -- Logger.len {Src : Type} (log : Logger Src) : Nat

  variable (src : Src) [BEq Src]

  -- at this point `log`, `src`, `Src` and the `BEq` instance can all become arguments

  def filterSrc :=
    log.trace.filterMap
      fun (src', str') => if src' == src then some str' else none
  #check filterSrc
  -- Logger.filterSrc {Src : Type} (log : Logger Src) (src : Src) [inst✝ : BEq Src] : List String

  def lenSrc :=
    log.filterSrc src |>.length
  #check lenSrc
  -- Logger.lenSrc {Src : Type} (log : Logger Src) (src : Src) [inst✝ : BEq Src] : Nat
end Logger
```
-/
@[builtin_command_elab «variable»] def elabVariable : CommandElab
  | `(variable $binders*) => do
    -- Try to elaborate `binders` for sanity checking
@@ -248,7 +538,10 @@ def elabCheckCore (ignoreStuckTC : Bool) : CommandElab
    try
      for c in (← realizeGlobalConstWithInfos term) do
        addCompletionInfo <| .id term id.getId (danglingDot := false) {} none
        logInfoAt tk <| .signature c
        logInfoAt tk <| .ofPPFormat { pp := fun
          | some ctx => ctx.runMetaM <| PrettyPrinter.ppSignature c
          | none => return f!"{c}" -- should never happen
          }
        return
    catch _ => pure () -- identifier might not be a constant but constant + projection
    let e ← Term.elabTerm term none
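One side of this hunk reports `#check someConstant` through a dedicated `.signature` message, the other through `ofPPFormat` with `PrettyPrinter.ppSignature`; both print the constant's full signature rather than just its type. For reference (output shape is indicative only, not produced by this patch):

```lean
#check Nat.add
-- for a constant, `#check` prints the signature with named binders, e.g.
-- Nat.add (n m : Nat) : Nat
```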

@@ -7,7 +7,6 @@ prelude
import Lean.Elab.Binders
import Lean.Elab.SyntheticMVars
import Lean.Elab.SetOption
import Lean.Language.Basic

namespace Lean.Elab.Command

@@ -45,16 +44,6 @@ structure Context where
  currMacroScope : MacroScope := firstFrontendMacroScope
  ref : Syntax := Syntax.missing
  tacticCache? : Option (IO.Ref Tactic.Cache)
  /--
  Snapshot for incremental reuse and reporting of command elaboration. Currently unused in Lean
  itself.

  Definitely resolved in `Language.Lean.process.doElab`.

  Invariant: if the bundle's `old?` is set, the context and state at the beginning of current and
  old elaboration are identical.
  -/
  snap? : Option (Language.SnapshotBundle Language.DynamicSnapshot)

abbrev CommandElabCoreM (ε) := ReaderT Context $ StateRefT State $ EIO ε
abbrev CommandElabM := CommandElabCoreM Exception
@@ -157,13 +146,10 @@ private def addTraceAsMessagesCore (ctx : Context) (log : MessageLog) (traceStat

private def addTraceAsMessages : CommandElabM Unit := do
  let ctx ← read
  -- do not add trace messages if `trace.profiler.output` is set as it would be redundant and
  -- pretty printing the trace messages is expensive
  if trace.profiler.output.get? (← getOptions) |>.isNone then
    modify fun s => { s with
      messages := addTraceAsMessagesCore ctx s.messages s.traceState
      traceState.traces := {}
    }
  modify fun s => { s with
    messages := addTraceAsMessagesCore ctx s.messages s.traceState
    traceState.traces := {}
  }

def liftCoreM (x : CoreM α) : CommandElabM α := do
  let s ← get
@@ -220,8 +206,7 @@ def runLinters (stx : Syntax) : CommandElabM Unit := do
  let linters ← lintersRef.get
  unless linters.isEmpty do
    for linter in linters do
      withTraceNode `Elab.lint (fun _ => return m!"running linter: {linter.name}")
          (tag := linter.name.toString) do
      withTraceNode `Elab.lint (fun _ => return m!"running linter: {linter.name}") do
        let savedState ← get
        try
          linter.run stx
@@ -293,9 +278,7 @@ partial def elabCommand (stx : Syntax) : CommandElabM Unit := do
      -- list of commands => elaborate in order
      -- The parser will only ever return a single command at a time, but syntax quotations can return multiple ones
      args.forM elabCommand
    else withTraceNode `Elab.command (fun _ => return stx) (tag :=
        -- special case: show actual declaration kind for `declaration` commands
        (if stx.isOfKind ``Parser.Command.declaration then stx[1] else stx).getKind.toString) do
    else withTraceNode `Elab.command (fun _ => return stx) do
      let s ← get
      match (← liftMacroM <| expandMacroImpl? s.env stx) with
      | some (decl, stxNew?) =>
@@ -531,7 +514,6 @@ def liftCommandElabM (cmd : CommandElabM α) : CoreM α := do
    fileMap := ← getFileMap
    ref := ← getRef
    tacticCache? := none
    snap? := none
  } |>.run {
    env := ← getEnv
    maxRecDepth := ← getMaxRecDepth

@@ -51,8 +51,6 @@ Expressions can also be replaced by `.bvar 0` if they shouldn't be mentioned.
-/
private partial def winnowExpr (e : Expr) : MetaM Expr := do
  let rec visit (e : Expr) : MonadCacheT Expr Expr MetaM Expr := checkCache e fun _ => do
    if ← isProof e then
      return .bvar 0
    match e with
    | .app .. =>
      if let some e' ← getParentProjArg e then

@@ -7,7 +7,6 @@ prelude
import Lean.Language.Lean
import Lean.Util.Profile
import Lean.Server.References
import Lean.Util.Profiler

namespace Lean.Elab.Frontend

@@ -33,7 +32,6 @@ def setCommandState (commandState : Command.State) : FrontendM Unit :=
      fileName := ctx.inputCtx.fileName
      fileMap := ctx.inputCtx.fileMap
      tacticCache? := none
      snap? := none
    }
  match (← liftM <| EIO.toIO' <| (x cmdCtx).run s.commandState) with
  | Except.error e => throw <| IO.Error.userError s!"unexpected internal error: {← e.toMessageData.toString}"
@@ -109,9 +107,7 @@ def runFrontend
    (mainModuleName : Name)
    (trustLevel : UInt32 := 0)
    (ileanFileName? : Option String := none)
    (jsonOutput : Bool := false)
    : IO (Environment × Bool) := do
  let startTime := (← IO.monoNanosNow).toFloat / 1000000000
  let inputCtx := Parser.mkInputContext input fileName
  -- TODO: replace with `#lang` processing
  if /- Lean #lang? -/ true then
@@ -123,14 +119,14 @@ def runFrontend
    let (env, messages) ← processHeader (leakEnv := true) header opts messages inputCtx trustLevel
    let env := env.setMainModule mainModuleName
    let mut commandState := Command.mkState env messages opts
    let elabStartTime := (← IO.monoNanosNow).toFloat / 1000000000

    if ileanFileName?.isSome then
      -- Collect InfoTrees so we can later extract and export their info to the ilean file
      commandState := { commandState with infoState.enabled := true }

    let s ← IO.processCommands inputCtx parserState commandState
    Language.reportMessages s.commandState.messages opts jsonOutput
    for msg in s.commandState.messages.toList do
      IO.print (← msg.toString (includeEndPos := Language.printMessageEndPos.get opts))

    if let some ileanFileName := ileanFileName? then
      let trees := s.commandState.infoState.trees.toArray
@@ -139,26 +135,13 @@ def runFrontend
      let ilean := { module := mainModuleName, references : Lean.Server.Ilean }
      IO.FS.writeFile ileanFileName $ Json.compress $ toJson ilean

    if let some out := trace.profiler.output.get? opts then
      let traceState := s.commandState.traceState
      -- importing does not happen in an elaboration monad, add now
      let traceState := { traceState with
        traces := #[{
          ref := .missing,
          msg := .trace { cls := `Import, startTime, stopTime := elabStartTime }
            (.ofFormat "importing") #[]
        }].toPArray' ++ traceState.traces
      }
      let profile ← Firefox.Profile.export mainModuleName.toString startTime traceState opts
      IO.FS.writeFile ⟨out⟩ <| Json.compress <| toJson profile

    return (s.commandState.env, !s.commandState.messages.hasErrors)

  let ctx := { inputCtx with mainModuleName, opts, trustLevel }
  let processor := Language.Lean.process
  let snap ← processor none ctx
  let snaps := Language.toSnapshotTree snap
  snaps.runAndReport opts jsonOutput
  snaps.runAndReport opts
  if let some ileanFileName := ileanFileName? then
    let trees := snaps.getAll.concatMap (match ·.infoTree? with | some t => #[t] | _ => #[])
    let references := Lean.Server.findModuleRefs inputCtx.fileMap trees (localVars := false)

@@ -5,7 +5,6 @@ Authors: Kyle Miller
-/
prelude
import Lean.Elab.Notation
import Lean.Util.Diff
import Lean.Server.CodeActions.Attr

/-! `#guard_msgs` command for testing commands
@@ -16,12 +15,6 @@ See the docstring on the `#guard_msgs` command.

open Lean Parser.Tactic Elab Command

register_builtin_option guard_msgs.diff : Bool := {
  defValue := false
  descr := "When true, show a diff between expected and actual messages if they don't match. "
}

namespace Lean.Elab.Tactic.GuardMsgs

/-- Gives a string representation of a message without source position information.
@@ -158,12 +151,7 @@ def MessageOrdering.apply (mode : MessageOrdering) (msgs : List String) : List S
    else
      -- Failed. Put all the messages back on the message log and add an error
      modify fun st => { st with messages := initMsgs ++ msgs }
      let feedback :=
        if (← getOptions).getBool `guard_msgs.diff false then
          let diff := Diff.diff (expected.split (· == '\n')).toArray (res.split (· == '\n')).toArray
          Diff.linesToString diff
        else res
      logErrorAt tk m!"❌ Docstring on `#guard_msgs` does not match generated message:\n\n{feedback}"
      logErrorAt tk m!"❌ Docstring on `#guard_msgs` does not match generated message:\n\n{res}"
      pushInfoLeaf (.ofCustomInfo { stx := ← getRef, value := Dynamic.mk (GuardMsgFailure.mk res) })
  | _ => throwUnsupportedSyntax
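The `guard_msgs.diff` option in this hunk only changes how a mismatch is reported (as a line diff of expected vs. actual messages); a passing check is unaffected. A minimal usage sketch:

```lean
-- With the option enabled, a mismatching docstring would be reported as a
-- line diff instead of the full regenerated message. This check passes.
set_option guard_msgs.diff true in
/-- info: 2 -/
#guard_msgs in
#eval 1 + 1
```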

@@ -25,11 +25,6 @@ open Meta
builtin_initialize
  registerTraceClass `Elab.inductive

register_builtin_option inductive.autoPromoteIndices : Bool := {
  defValue := true
  descr := "Promote indices to parameters in inductive types whenever possible."
}

def checkValidInductiveModifier [Monad m] [MonadError m] (modifiers : Modifiers) : m Unit := do
  if modifiers.isNoncomputable then
    throwError "invalid use of 'noncomputable' in inductive declaration"
@@ -719,12 +714,10 @@ private def isDomainDefEq (arrowType : Expr) (type : Expr) : MetaM Bool := do
Convert fixed indices to parameters.
-/
private partial def fixedIndicesToParams (numParams : Nat) (indTypes : Array InductiveType) (indFVars : Array Expr) : MetaM Nat := do
  if !inductive.autoPromoteIndices.get (← getOptions) then
    return numParams
  let masks ← indTypes.mapM (computeFixedIndexBitMask numParams · indFVars)
  trace[Elab.inductive] "masks: {masks}"
  if masks.all fun mask => !mask.contains true then
    return numParams
  trace[Elab.inductive] "masks: {masks}"
  -- We process just a non-fixed prefix of the indices for now. Reason: we don't want to change the order.
  -- TODO: extend it in the future. For example, it should be reasonable to change
  -- the order of indices generated by the auto implicit feature.
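The `inductive.autoPromoteIndices` option gates the early-return in `fixedIndicesToParams` above. A usage sketch (`T` is a hypothetical declaration; whether promotion would actually fire depends on the fixed-index analysis):

```lean
-- Opt out of index-to-parameter promotion for the following declaration only:
set_option inductive.autoPromoteIndices false in
inductive T : Nat → Type
  | mk : (n : Nat) → T n
```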

@@ -147,7 +147,7 @@ partial def evalTactic (stx : Syntax) : TacticM Unit := do
      if k == nullKind then
        -- Macro writers create a sequence of tactics `t₁ ... tₙ` using `mkNullNode #[t₁, ..., tₙ]`
        stx.getArgs.forM evalTactic
      else withTraceNode `Elab.step (fun _ => return stx) (tag := stx.getKind.toString) do
      else withTraceNode `Elab.step (fun _ => return stx) do
        let evalFns := tacticElabAttribute.getEntries (← getEnv) stx.getKind
        let macros := macroAttribute.getEntries (← getEnv) stx.getKind
        if evalFns.isEmpty && macros.isEmpty then

@@ -58,7 +58,7 @@ partial def falseOrByContra (g : MVarId) (useClassical : Option Bool := none) :
    let [g] ← g.applyConst ``False.elim | panic! "expected one subgoal"
    pure g

@[builtin_tactic Lean.Parser.Tactic.falseOrByContra]
@[builtin_tactic falseOrByContra]
def elabFalseOrByContra : Tactic
  | `(tactic| false_or_by_contra) => do liftMetaTactic1 (falseOrByContra ·)
  | _ => no_error_if_unused% throwUnsupportedSyntax

@@ -526,82 +526,6 @@ def cases₂ (mvarId : MVarId) (p : Expr) (hName : Name := `h) :
    | throwError "'cases' tactic failed, unexpected new hypothesis"
  return ((s₁.mvarId, f₁), (s₂.mvarId, f₂))

/--
Helpful error message when omega cannot find a solution.
-/
def formatErrorMessage (p : Problem) : OmegaM MessageData := do
  if p.possible then
    if p.isEmpty then
      return m!"it is false"
    else
      let as ← atoms
      let mask ← mentioned p.constraints
      let names ← varNames mask
      return m!"a possible counterexample may satisfy the constraints\n" ++
        m!"{prettyConstraints names p.constraints}\nwhere\n{prettyAtoms names as mask}"
  else
    -- formatErrorMessage should not be used in this case
    return "it is trivially solvable"
where
  varNameOf (i : Nat) : String :=
    let c : Char := .ofNat ('a'.toNat + (i % 26))
    let suffix := if i < 26 then "" else s!"_{i / 26}"
    s!"{c}{suffix}"

  inScope (s : String) : MetaM Bool := do
    let n := .mkSimple s
    if (← resolveGlobalName n).isEmpty then
      if ((← getLCtx).findFromUserName? n).isNone then
        return false
    return true

  -- Assign ascending names a, b, c, …, z, a_1, … to all atoms mentioned according to the mask,
  -- but avoid names in the local or global scope
  varNames (mask : Array Bool) : MetaM (Array String) := do
    let mut names := #[]
    let mut next := 0
    for h : i in [:mask.size] do
      if mask[i] then
        while ← inScope (varNameOf next) do next := next + 1
        names := names.push (varNameOf next)
        next := next + 1
      else
        names := names.push "(masked)"
    return names

  prettyConstraints (names : Array String) (constraints : HashMap Coeffs Fact) : String :=
    constraints.toList
      |>.map (fun ⟨coeffs, ⟨_, cst, _⟩⟩ => " " ++ prettyConstraint (prettyCoeffs names coeffs) cst)
      |> "\n".intercalate

  prettyConstraint (e : String) : Constraint → String
    | ⟨none, none⟩ => s!"{e} is unconstrained" -- should not happen in error messages
    | ⟨none, some y⟩ => s!"{e} ≤ {y}"
    | ⟨some x, none⟩ => s!"{e} ≥ {x}"
    | ⟨some x, some y⟩ =>
      if y < x then "∅" else -- should not happen in error messages
        s!"{x} ≤ {e} ≤ {y}"

  prettyCoeffs (names : Array String) (coeffs : Coeffs) : String :=
    coeffs.toList.enum
      |>.filter (fun (_,c) => c ≠ 0)
      |>.enum
      |>.map (fun (j, (i,c)) =>
        (if j > 0 then if c > 0 then " + " else " - " else if c > 0 then "" else "- ") ++
        (if Int.natAbs c = 1 then names[i]! else s!"{c.natAbs}*{names[i]!}"))
      |> String.join

  mentioned (constraints : HashMap Coeffs Fact) : OmegaM (Array Bool) := do
    let initMask := Array.mkArray (← getThe State).atoms.size false
    return constraints.fold (init := initMask) fun mask coeffs _ =>
      coeffs.enum.foldl (init := mask) fun mask (i, c) =>
        if c = 0 then mask else mask.set! i true

  prettyAtoms (names : Array String) (atoms : Array Expr) (mask : Array Bool) : MessageData :=
    (Array.zip names atoms).toList.enum
      |>.filter (fun (i, _) => mask.getD i false)
      |>.map (fun (_, (n, a)) => m!" {n} := {a}")
      |> m!"\n".joinSep

mutual

@@ -611,7 +535,7 @@ call `omegaImpl` in both branches.
-/
partial def splitDisjunction (m : MetaProblem) (g : MVarId) : OmegaM Unit := g.withContext do
  match m.disjunctions with
  | [] => throwError "omega could not prove the goal:\n{← formatErrorMessage m.problem}"
  | [] => throwError "omega did not find a contradiction:\n{m.problem}"
  | h :: t =>
    trace[omega] "Case splitting on {← inferType h}"
    let ctx ← getMCtx
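The `varNameOf` naming scheme used by `formatErrorMessage` can be checked in isolation (reproduced here standalone; in the actual code it is a `where`-local helper):

```lean
-- Cycles through a…z, then appends an index for each further pass.
def varNameOf (i : Nat) : String :=
  let c : Char := .ofNat ('a'.toNat + (i % 26))
  let suffix := if i < 26 then "" else s!"_{i / 26}"
  s!"{c}{suffix}"

#eval varNameOf 0   -- "a"
#eval varNameOf 25  -- "z"
#eval varNameOf 26  -- "a_1"
```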

@@ -21,12 +21,12 @@ open Meta
        throwErrorAt stx[2] "'split' tactic failed, select a single target to split"
      if simplifyTarget then
        liftMetaTactic fun mvarId => do
          let some mvarIds ← splitTarget? mvarId | Meta.throwTacticEx `split mvarId
          let some mvarIds ← splitTarget? mvarId | Meta.throwTacticEx `split mvarId ""
          return mvarIds
      else
        let fvarId ← getFVarId hyps[0]!
        liftMetaTactic fun mvarId => do
          let some mvarIds ← splitLocalDecl? mvarId fvarId | Meta.throwTacticEx `split mvarId
          let some mvarIds ← splitLocalDecl? mvarId fvarId | Meta.throwTacticEx `split mvarId ""
          return mvarIds
    | Location.wildcard =>
      liftMetaTactic fun mvarId => do
@@ -34,7 +34,7 @@ open Meta
        for fvarId in fvarIds do
          if let some mvarIds ← splitLocalDecl? mvarId fvarId then
            return mvarIds
        let some mvarIds ← splitTarget? mvarId | Meta.throwTacticEx `split mvarId
        let some mvarIds ← splitTarget? mvarId | Meta.throwTacticEx `split mvarId ""
        return mvarIds

end Lean.Elab.Tactic
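The hunks above only change the message argument passed to `throwTacticEx`; the splitting behavior itself is unchanged. For reference, a typical `split` invocation on an `if` in the goal:

```lean
example (n : Nat) : (if n = 0 then 1 else 2) > 0 := by
  split
  · decide  -- case `n = 0`: goal is `1 > 0`
  · decide  -- otherwise:    goal is `2 > 0`
```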

@@ -261,14 +261,6 @@ def SavedState.restore (s : SavedState) (restoreInfo : Bool := false) : TermElab
  unless restoreInfo do
    setInfoState infoState

/--
Restores full state including sources for unique identifiers. Only intended for incremental reuse
between elaboration runs, not for backtracking within a single run.
-/
def SavedState.restoreFull (s : SavedState) : TermElabM Unit := do
  s.meta.restoreFull
  set s.elab

instance : MonadBacktrack SavedState TermElabM where
  saveState := Term.saveState
  restoreState b := b.restore
@@ -1387,8 +1379,7 @@ where
private partial def elabTermAux (expectedType? : Option Expr) (catchExPostpone : Bool) (implicitLambda : Bool) : Syntax → TermElabM Expr
  | .missing => mkSyntheticSorryFor expectedType?
  | stx => withFreshMacroScope <| withIncRecDepth do
    withTraceNode `Elab.step (fun _ => return m!"expected type: {expectedType?}, term\n{stx}")
        (tag := stx.getKind.toString) do
    withTraceNode `Elab.step (fun _ => return m!"expected type: {expectedType?}, term\n{stx}") do
      checkSystem "elaborator"
      let env ← getEnv
      let result ← match (← liftMacroM (expandMacroImpl? env stx)) with
@@ -1766,7 +1757,6 @@ builtin_initialize
  registerTraceClass `Elab.postpone
  registerTraceClass `Elab.coe
  registerTraceClass `Elab.debug
  registerTraceClass `Elab.reuse

export Term (TermElabM)
|
||||
|
||||
|
||||
@@ -114,7 +114,10 @@ unsafe def mkElabAttribute (γ) (attrBuiltinName attrName : Name) (parserNamespa
return kind
onAdded := fun builtin declName => do
if builtin then
declareBuiltinDocStringAndRanges declName
if let some doc ← findDocString? (← getEnv) declName (includeBuiltin := false) then
declareBuiltin (declName ++ `docString) (mkAppN (mkConst ``addBuiltinDocString) #[toExpr declName, toExpr doc])
if let some declRanges ← findDeclarationRanges? declName then
declareBuiltin (declName ++ `declRange) (mkAppN (mkConst ``addBuiltinDeclarationRanges) #[toExpr declName, toExpr declRanges])
} attrDeclName

unsafe def mkMacroAttributeUnsafe (ref : Name) : IO (KeyedDeclsAttribute Macro) :=

@@ -9,7 +9,6 @@ Authors: Sebastian Ullrich
-/

prelude
import Init.System.Promise
import Lean.Message
import Lean.Parser.Types

@@ -59,26 +58,23 @@ deriving Inhabited
-- cursor position. This may require starting the tasks suspended (e.g. in `Thunk`). The server may
-- also need more dependency information for this in order to avoid priority inversion.
structure SnapshotTask (α : Type) where
/--
Range that is marked as being processed by the server while the task is running. If `none`,
the range of the outer task if some or else the entire file is reported.
-/
range? : Option String.Range
/-- Range that is marked as being processed by the server while the task is running. -/
range : String.Range
/-- Underlying task producing the snapshot. -/
task : Task α
deriving Nonempty

/-- Creates a snapshot task from a reporting range and a `BaseIO` action. -/
def SnapshotTask.ofIO (range? : Option String.Range) (act : BaseIO α) : BaseIO (SnapshotTask α) := do
def SnapshotTask.ofIO (range : String.Range) (act : BaseIO α) : BaseIO (SnapshotTask α) := do
return {
range?
range
task := (← BaseIO.asTask act)
}

/-- Creates a finished snapshot task. -/
def SnapshotTask.pure (a : α) : SnapshotTask α where
-- irrelevant when already finished
range? := none
range := default
task := .pure a

/--
@@ -88,26 +84,23 @@ def SnapshotTask.cancel (t : SnapshotTask α) : BaseIO Unit :=
IO.cancel t.task

/-- Transforms a task's output without changing the reporting range. -/
def SnapshotTask.map (t : SnapshotTask α) (f : α → β) (range? : Option String.Range := t.range?)
def SnapshotTask.map (t : SnapshotTask α) (f : α → β) (range : String.Range := t.range)
(sync := false) : SnapshotTask β :=
{ range?, task := t.task.map (sync := sync) f }
{ range, task := t.task.map (sync := sync) f }

/--
Chains two snapshot tasks. The range is taken from the first task if not specified; the range of
the second task is discarded. -/
def SnapshotTask.bind (t : SnapshotTask α) (act : α → SnapshotTask β)
(range? : Option String.Range := t.range?) (sync := false) : SnapshotTask β :=
{ range?, task := t.task.bind (sync := sync) (act · |>.task) }
(range : String.Range := t.range) (sync := false) : SnapshotTask β :=
{ range, task := t.task.bind (sync := sync) (act · |>.task) }

/--
Chains two snapshot tasks. The range is taken from the first task if not specified; the range of
the second task is discarded. -/
def SnapshotTask.bindIO (t : SnapshotTask α) (act : α → BaseIO (SnapshotTask β))
(range? : Option String.Range := t.range?) (sync := false) : BaseIO (SnapshotTask β) :=
return {
range?
task := (← BaseIO.bindTask (sync := sync) t.task fun a => (·.task) <$> (act a))
}
(range : String.Range := t.range) (sync := false) : BaseIO (SnapshotTask β) :=
return { range, task := (← BaseIO.bindTask (sync := sync) t.task fun a => (·.task) <$> (act a)) }

/-- Synchronously waits on the result of the task. -/
def SnapshotTask.get (t : SnapshotTask α) : α :=
@@ -117,40 +110,6 @@ def SnapshotTask.get (t : SnapshotTask α) : α :=
def SnapshotTask.get? (t : SnapshotTask α) : BaseIO (Option α) :=
return if (← IO.hasFinished t.task) then some t.task.get else none

/--
Arbitrary value paired with a syntax that should be inspected when considering the value for reuse.
-/
structure SyntaxGuarded (α : Type) where
/-- Syntax to be inspected for reuse. -/
stx : Syntax
/-- Potentially reusable value. -/
val : α

/--
Pair of (optional) old snapshot task usable for incremental reuse and new snapshot promise for
incremental reporting. Inside the elaborator, we build snapshots by carrying such bundles and then
checking if we can reuse `old?` if set or else redoing the corresponding elaboration step. In either
case, we derive new bundles for nested snapshots, if any, and finally `resolve` `new` to the result.

Note that failing to `resolve` a created promise will block the language server indefinitely!
Corresponding `IO.Promise.new` calls should come with a "definitely resolved in ..." comment
explaining how this is avoided in each case.

In the future, the 1-element history `old?` may be replaced with a global cache indexed by strong
hashes but the promise will still need to be passed through the elaborator.
-/
structure SnapshotBundle (α : Type) where
/--
Snapshot task of corresponding elaboration in previous document version if any, paired with its
old syntax to be considered for reuse. Should be set to `none` as soon as reuse can be ruled out.
-/
old? : Option (SyntaxGuarded (SnapshotTask α))
/--
Promise of snapshot value for the current document. When resolved, the language server will
report its result even before the current elaborator invocation has finished.
-/
new : IO.Promise α

/--
Tree of snapshots where each snapshot comes with an array of asynchronous further subtrees. Used
for asynchronously collecting information about the entirety of snapshots in the language server.
@@ -159,7 +118,7 @@ structure SnapshotBundle (α : Type) where
inductive SnapshotTree where
/-- Creates a snapshot tree node. -/
| mk (element : Snapshot) (children : Array (SnapshotTask SnapshotTree))
deriving Inhabited
deriving Nonempty

/-- The immediately available element of the snapshot tree node. -/
abbrev SnapshotTree.element : SnapshotTree → Snapshot
@@ -176,49 +135,6 @@ class ToSnapshotTree (α : Type) where
toSnapshotTree : α → SnapshotTree
export ToSnapshotTree (toSnapshotTree)

instance [ToSnapshotTree α] : ToSnapshotTree (Option α) where
toSnapshotTree
| some a => toSnapshotTree a
| none => default

/-- Snapshot type without child nodes. -/
structure SnapshotLeaf extends Snapshot
deriving Nonempty, TypeName

instance : ToSnapshotTree SnapshotLeaf where
toSnapshotTree s := SnapshotTree.mk s.toSnapshot #[]

/-- Arbitrary snapshot type, used for extensibility. -/
structure DynamicSnapshot where
/-- Concrete snapshot value as `Dynamic`. -/
val : Dynamic
/-- Snapshot tree retrieved from `val` before erasure. -/
tree : SnapshotTree
deriving Nonempty

instance : ToSnapshotTree DynamicSnapshot where
toSnapshotTree s := s.tree

/-- Creates a `DynamicSnapshot` from a typed snapshot value. -/
def DynamicSnapshot.ofTyped [TypeName α] [ToSnapshotTree α] (val : α) : DynamicSnapshot where
val := .mk val
tree := ToSnapshotTree.toSnapshotTree val

/-- Returns the original snapshot value if it is of the given type. -/
def DynamicSnapshot.toTyped? (α : Type) [TypeName α] (snap : DynamicSnapshot) :
Option α :=
snap.val.get? α

/--
Runs a tree of snapshots to conclusion, incrementally performing `f` on each snapshot in tree
preorder. -/
@[specialize] partial def SnapshotTree.forM [Monad m] (s : SnapshotTree)
(f : Snapshot → m PUnit) : m PUnit := do
match s with
| mk element children =>
f element
children.forM (·.get.forM f)

/--
Option for printing end position of each message in addition to start position. Used for testing
message ranges in the test suite. -/
@@ -226,24 +142,25 @@ register_builtin_option printMessageEndPos : Bool := {
defValue := false, descr := "print end position of each message in addition to start position"
}

/-- Reports messages on stdout. If `json` is true, prints messages as JSON (one per line). -/
def reportMessages (msgLog : MessageLog) (opts : Options) (json := false) : IO Unit := do
if json then
msgLog.forM (·.toJson <&> (·.compress) >>= IO.println)
else
msgLog.forM (·.toString (includeEndPos := printMessageEndPos.get opts) >>= IO.print)

/--
Runs a tree of snapshots to conclusion and incrementally report messages on stdout. Messages are
reported in tree preorder.
This function is used by the cmdline driver; see `Lean.Server.FileWorker.reportSnapshots` for how
the language server reports snapshots asynchronously. -/
def SnapshotTree.runAndReport (s : SnapshotTree) (opts : Options) (json := false) : IO Unit := do
s.forM (reportMessages ·.diagnostics.msgLog opts json)
partial def SnapshotTree.runAndReport (s : SnapshotTree) (opts : Options) : IO Unit := do
s.element.diagnostics.msgLog.forM
(·.toString (includeEndPos := printMessageEndPos.get opts) >>= IO.print)
for t in s.children do
t.get.runAndReport opts

/-- Waits on and returns all snapshots in the tree. -/
def SnapshotTree.getAll (s : SnapshotTree) : Array Snapshot :=
s.forM (m := StateM _) (fun s => modify (·.push s)) |>.run #[] |>.2
partial def SnapshotTree.getAll (s : SnapshotTree) : Array Snapshot :=
go s |>.run #[] |>.2
where
go s : StateM (Array Snapshot) Unit := do
modify (·.push s.element)
for t in s.children do
go t.get

/-- Metadata that does not change during the lifetime of the language processing process. -/
structure ModuleProcessingContext where
@@ -270,7 +187,7 @@ Creates snapshot message log from non-interactive message log, also allocating a
that can be used by the server to memorize interactive diagnostics derived from the log.
-/
def Snapshot.Diagnostics.ofMessageLog (msgLog : Lean.MessageLog) :
BaseIO Snapshot.Diagnostics := do
ProcessingM Snapshot.Diagnostics := do
return { msgLog, interactiveDiagsRef? := some (← IO.mkRef none) }

/-- Creates diagnostics from a single error message that should span the whole file. -/
@@ -296,7 +213,7 @@ end Language
/--
Builds a function for processing a language using incremental snapshots by passing the previous
snapshot to `Language.process` on subsequent invocations. -/
def Language.mkIncrementalProcessor (process : Option InitSnap → ProcessingM InitSnap)
partial def Language.mkIncrementalProcessor (process : Option InitSnap → ProcessingM InitSnap)
(ctx : ModuleProcessingContext) : BaseIO (Parser.InputContext → BaseIO InitSnap) := do
let oldRef ← IO.mkRef none
return fun ictx => do

@@ -58,51 +58,6 @@ exist currently and likely it could at best be approximated by e.g. "furthest `t
we remain at "go two commands up" at this point.
-/

/-!
# Note [Incremental Command Elaboration]

Because of Lean's use of persistent data structures, incremental reuse of fully elaborated commands
is easy because we can simply snapshot the entire state after each command and then restart
elaboration using the stored state at the point of change. However, incrementality within
elaboration of a single command such as between tactic steps is much harder because we cannot simply
return from those points to the language processor in a way that we can later resume from there.
Instead, we exchange the need for continuations with some limited mutability: by allocating an
`IO.Promise` "cell" in the language processor, we can both pass it to the elaborator to eventually
fill it using `Promise.resolve` as well as convert it to a `Task` that will wait on that resolution
using `Promise.result` and return it as part of the command snapshot created by the language
processor. The elaborator can then create new promises itself and store their `result` when
resolving an outer promise to create an arbitrary tree of promise-backed snapshot tasks. Thus, we
can enable incremental reporting and reuse inside the elaborator using the same snapshot tree data
structures as outside without having to change the elaborator's control flow.

While ideally we would decide what can be reused during command elaboration using strong hashes over
the state and inputs, currently we rely on simpler syntactic checks: if all the syntax inspected up
to a certain point is unchanged, we can assume that the old state can be reused. The central
`SnapshotBundle` type passed inwards through the elaborator for this purpose combines the following
data:
* the `IO.Promise` to be resolved to an elaborator snapshot (whose type depends on the specific
elaborator part we're in, e.g. `)
* if there was a previous run:
* a `SnapshotTask` holding the corresponding snapshot of the run
* the relevant `Syntax` of the previous run to be compared before any reuse

Note that as we do not wait for the previous run to finish before starting to elaborate the next
one, the `SnapshotTask` task may not be finished yet. Indeed, if we do find that we can reuse the
contained state, we will want to explicitly wait for it instead of redoing the work. On the other
hand, the `Syntax` is not surrounded by a task so that we can immediately access it for comparisons,
even if the snapshot task may, eventually, give access to the same syntax tree.

TODO: tactic examples

While it is generally true that we can provide incremental reporting even without reuse, we
generally want to avoid that when it would be confusing/annoying, e.g. when a tactic block is run
multiple times because otherwise the progress bar would snap back and forth multiple times. For this
purpose, we can disable both incremental modes using `Term.withoutTacticIncrementality`, assuming we
opted into incrementality because of other parts of the combinator. `induction` is an example of
this because there are some induction alternatives that are run multiple times, so we disable all of
incrementality for them.
-/

set_option linter.missingDocs true

namespace Lean.Language.Lean
@@ -129,31 +84,34 @@ register_builtin_option showPartialSyntaxErrors : Bool := {

/-! The hierarchy of Lean snapshot types -/

/-- Snapshot after elaboration of the entire command. -/
structure CommandFinishedSnapshot extends Language.Snapshot where
/-- Final state of processing of a command. -/
structure CommandFinishedSnapshot extends Snapshot where
/-- Resulting elaboration state. -/
cmdState : Command.State
deriving Nonempty
instance : ToSnapshotTree CommandFinishedSnapshot where
toSnapshotTree s := ⟨s.toSnapshot, #[]⟩

/--
State after processing a command's signature and before executing its tactic body, if any. Other
commands should immediately proceed to `finished`. -/
-- TODO: tactics
structure CommandSignatureProcessedSnapshot extends Snapshot where
/-- State after processing is finished. -/
finishedSnap : SnapshotTask CommandFinishedSnapshot
deriving Nonempty
instance : ToSnapshotTree CommandSignatureProcessedSnapshot where
toSnapshotTree s := ⟨s.toSnapshot, #[s.finishedSnap.map (sync := true) toSnapshotTree]⟩

/-- State after a command has been parsed. -/
structure CommandParsedSnapshotData extends Snapshot where
/-- Syntax tree of the command. -/
stx : Syntax
/-- Resulting parser state. -/
parserState : Parser.ModuleParserState
/--
Snapshot for incremental reporting and reuse during elaboration, type dependent on specific
elaborator.
-/
elabSnap : SnapshotTask DynamicSnapshot
/-- State after processing is finished. -/
finishedSnap : SnapshotTask CommandFinishedSnapshot
/-- Cache for `save`; to be replaced with incrementality. -/
tacticCache : IO.Ref Tactic.Cache
/-- Signature processing task. -/
sigSnap : SnapshotTask CommandSignatureProcessedSnapshot
deriving Nonempty

/-- State after a command has been parsed. -/
-- workaround for lack of recursive structures
inductive CommandParsedSnapshot where
@@ -165,23 +123,22 @@ deriving Nonempty
abbrev CommandParsedSnapshot.data : CommandParsedSnapshot → CommandParsedSnapshotData
| mk data _ => data
/-- Next command, unless this is a terminal command. -/
-- It would be really nice to not make this depend on `sig.finished` where possible
abbrev CommandParsedSnapshot.next? : CommandParsedSnapshot →
Option (SnapshotTask CommandParsedSnapshot)
| mk _ next? => next?
partial instance : ToSnapshotTree CommandParsedSnapshot where
toSnapshotTree := go where
go s := ⟨s.data.toSnapshot,
#[s.data.elabSnap.map (sync := true) toSnapshotTree,
s.data.finishedSnap.map (sync := true) toSnapshotTree] |>
#[s.data.sigSnap.map (sync := true) toSnapshotTree] |>
pushOpt (s.next?.map (·.map (sync := true) go))⟩


/-- Cancels all significant computations from this snapshot onwards. -/
partial def CommandParsedSnapshot.cancel (snap : CommandParsedSnapshot) : BaseIO Unit := do
-- This is the only relevant computation right now, everything else is promises
-- TODO: cancel additional elaboration tasks (which will be tricky with `DynamicSnapshot`) if we
-- add them without switching to implicit cancellation
snap.data.finishedSnap.cancel
-- This is the only relevant computation right now
-- TODO: cancel additional elaboration tasks if we add them without switching to implicit
-- cancellation
snap.data.sigSnap.cancel
if let some next := snap.next? then
-- recurse on next command (which may have been spawned just before we cancelled above)
let _ ← IO.mapTask (sync := true) (·.cancel) next.task
@@ -351,7 +308,7 @@ where
processHeader (stx : Syntax) (parserState : Parser.ModuleParserState) :
LeanProcessingM (SnapshotTask HeaderProcessedSnapshot) := do
let ctx ← read
SnapshotTask.ofIO (some ⟨0, ctx.input.endPos⟩) <|
SnapshotTask.ofIO ⟨0, ctx.input.endPos⟩ <|
ReaderT.run (r := ctx) <| -- re-enter reader in new task
withHeaderExceptions (α := HeaderProcessedSnapshot) ({ · with result? := none }) do
let opts ← match (← setupImports stx) with
@@ -405,16 +362,16 @@ where
-- is not `Inhabited`
return .pure <| .mk (nextCmdSnap? := none) {
diagnostics := .empty, stx := .missing, parserState
elabSnap := .pure <| .ofTyped { diagnostics := .empty : SnapshotLeaf }
finishedSnap := .pure { diagnostics := .empty, cmdState }
tacticCache := (← IO.mkRef {})
}
sigSnap := .pure {
diagnostics := .empty
finishedSnap := .pure { diagnostics := .empty, cmdState } } }

let unchanged old : BaseIO CommandParsedSnapshot :=
-- when syntax is unchanged, reuse command processing task as is
if let some oldNext := old.next? then
return .mk (data := old.data)
(nextCmdSnap? := (← old.data.finishedSnap.bindIO (sync := true) fun oldFinished =>
(nextCmdSnap? := (← old.data.sigSnap.bindIO (sync := true) fun oldSig =>
oldSig.finishedSnap.bindIO (sync := true) fun oldFinished =>
-- also wait on old command parse snapshot as parsing is cheap and may allow for
-- elaboration reuse
oldNext.bindIO (sync := true) fun oldNext => do
@@ -427,7 +384,7 @@ where
if (← isBeforeEditPos nextCom.data.parserState.pos) then
return .pure (← unchanged old)

SnapshotTask.ofIO (some ⟨parserState.pos, ctx.input.endPos⟩) do
SnapshotTask.ofIO ⟨parserState.pos, ctx.input.endPos⟩ do
let beginPos := parserState.pos
let scope := cmdState.scopes.head!
let pmctx := {
@@ -444,31 +401,21 @@ where
-- on first change, make sure to cancel all further old tasks
old.cancel

-- definitely resolved in `doElab` task
let elabPromise ← IO.Promise.new
let tacticCache ← old?.map (·.data.tacticCache) |>.getDM (IO.mkRef {})
let finishedSnap ←
doElab stx cmdState msgLog.hasErrors beginPos
{ old? := old?.map fun old => ⟨old.data.stx, old.data.elabSnap⟩, new := elabPromise }
tacticCache
ctx

let sigSnap ← processCmdSignature stx cmdState msgLog.hasErrors beginPos ctx
let next? ← if Parser.isTerminalCommand stx then pure none
-- for now, wait on "command finished" snapshot before parsing next command
else some <$> finishedSnap.bindIO fun finished =>
else some <$> (sigSnap.bind (·.finishedSnap)).bindIO fun finished =>
parseCmd none parserState finished.cmdState ctx
return .mk (nextCmdSnap? := next?) {
diagnostics := (← Snapshot.Diagnostics.ofMessageLog msgLog)
diagnostics := (← Snapshot.Diagnostics.ofMessageLog msgLog ctx.toProcessingContext)
stx
parserState
elabSnap := { range? := finishedSnap.range?, task := elabPromise.result }
finishedSnap
tacticCache
sigSnap
}

doElab (stx : Syntax) (cmdState : Command.State) (hasParseError : Bool) (beginPos : String.Pos)
(snap : SnapshotBundle DynamicSnapshot) (tacticCache : IO.Ref Tactic.Cache) :
LeanProcessingM (SnapshotTask CommandFinishedSnapshot) := do
processCmdSignature (stx : Syntax) (cmdState : Command.State) (hasParseError : Bool)
(beginPos : String.Pos) :
LeanProcessingM (SnapshotTask CommandSignatureProcessedSnapshot) := do
let ctx ← read

-- signature elaboration task; for now, does full elaboration
@@ -476,26 +423,13 @@ where
SnapshotTask.ofIO (stx.getRange?.getD ⟨beginPos, beginPos⟩) do
let scope := cmdState.scopes.head!
let cmdStateRef ← IO.mkRef { cmdState with messages := .empty }
/-
The same snapshot may be executed by different tasks. So, to make sure `elabCommandTopLevel`
has exclusive access to the cache, we create a fresh reference here. Before this change, the
following `tacticCache.modify` would reset the tactic post cache while another snapshot was
still using it.
-/
let tacticCacheNew ← IO.mkRef (← tacticCache.get)
let cmdCtx : Elab.Command.Context := { ctx with
cmdPos := beginPos
tacticCache? := some tacticCacheNew
snap? := some snap
}
let cmdCtx : Elab.Command.Context := { ctx with cmdPos := beginPos, tacticCache? := none }
let (output, _) ←
IO.FS.withIsolatedStreams (isolateStderr := stderrAsMessages.get scope.opts) do
liftM (m := BaseIO) do
Elab.Command.catchExceptions
(getResetInfoTrees *> Elab.Command.elabCommandTopLevel stx)
cmdCtx cmdStateRef
let postNew := (← tacticCacheNew.get).post
tacticCache.modify fun _ => { pre := postNew, post := {} }
let cmdState ← cmdStateRef.get
let mut messages := cmdState.messages
-- `stx.hasMissing` should imply `hasParseError`, but the latter should be cheaper to check in
@@ -515,12 +449,14 @@ where
data := output
}
let cmdState := { cmdState with messages }
-- definitely resolve eventually
snap.new.resolve <| .ofTyped { diagnostics := .empty : SnapshotLeaf }
return {
diagnostics := (← Snapshot.Diagnostics.ofMessageLog cmdState.messages)
infoTree? := some cmdState.infoState.trees[0]!
cmdState
diagnostics := .empty
finishedSnap := .pure {
diagnostics :=
(← Snapshot.Diagnostics.ofMessageLog cmdState.messages ctx.toProcessingContext)
infoTree? := some cmdState.infoState.trees[0]!
cmdState
}
}

/-- Waits for and returns final environment, if importing was successful. -/
@@ -532,6 +468,6 @@ where goCmd snap :=
if let some next := snap.next? then
goCmd next.get
else
snap.data.finishedSnap.get.cmdState.env
snap.data.sigSnap.get.finishedSnap.get.cmdState.env

end Lean

@@ -255,10 +255,6 @@ builtin_initialize addBuiltinUnusedVariablesIgnoreFn (fun _ stack opts =>
(stx.isOfKind ``Lean.Parser.Term.matchAlt && pos == 1) ||
(stx.isOfKind ``Lean.Parser.Tactic.inductionAltLHS && pos == 2))

/-- `#guard_msgs in cmd` itself runs linters in `cmd` (via `elabCommandTopLevel`), so do not run them again. -/
builtin_initialize addBuiltinUnusedVariablesIgnoreFn (fun _ stack _ =>
stack.any fun (stx, _) => stx.isOfKind ``Lean.guardMsgsCmd)

/-- Get the current list of `IgnoreFunction`s. -/
def getUnusedVariablesIgnoreFns : CommandElabM (Array IgnoreFunction) := do
return (unusedVariablesIgnoreFnsExt.getState (← getEnv)).2

@@ -14,8 +14,7 @@ open Lean.Elab

def logLint [Monad m] [MonadLog m] [AddMessageContext m] [MonadOptions m]
(linterOption : Lean.Option Bool) (stx : Syntax) (msg : MessageData) : m Unit :=
let disable := m!"note: this linter can be disabled with `set_option {linterOption.name} false`"
logWarningAt stx (.tagged linterOption.name m!"{msg}\n{disable}")
logWarningAt stx (.tagged linterOption.name m!"{msg} [{linterOption.name}]")

/-- If `linterOption` is true, print a linter warning message at the position determined by `stx`.
-/

@@ -23,7 +23,7 @@ def mkErrorStringWithPos (fileName : String) (pos : Position) (msg : String) (en
|
||||
|
||||
inductive MessageSeverity where
|
||||
| information | warning | error
|
||||
deriving Inhabited, BEq, ToJson, FromJson
|
||||
deriving Inhabited, BEq
|
||||
|
||||
structure MessageDataContext where
|
||||
env : Environment
|
||||
@@ -46,18 +46,6 @@ structure PPFormat where
|
||||
/-- Searches for synthetic sorries in original input. Used to filter out certain messages. -/
|
||||
hasSyntheticSorry : MetavarContext → Bool := fun _ => false
|
||||
|
||||
structure TraceData where
|
||||
/-- Trace class, e.g. `Elab.step`. -/
|
||||
cls : Name
|
||||
/-- Start time in seconds; 0 if unknown to avoid `Option` allocation. -/
|
||||
startTime : Float := 0
|
||||
/-- Stop time in seconds; 0 if unknown to avoid `Option` allocation. -/
|
||||
stopTime : Float := startTime
|
||||
/-- Whether trace node defaults to collapsed in the infoview. -/
|
||||
collapsed : Bool := true
|
||||
/-- Optional tag shown in `trace.profiler.output` output after the trace class name. -/
|
||||
tag : String := ""
|
||||
|
||||
/-- Structured message data. We use it for reporting errors, trace messages, etc. -/
|
||||
inductive MessageData where
|
||||
/-- Eagerly formatted text. We inspect this in various hacks, so it is not immediately subsumed by `ofPPFormat`. -/
|
||||
@@ -77,11 +65,22 @@ inductive MessageData where
|
||||
/-- Tagged sections. `Name` should be viewed as a "kind", and is used by `MessageData` inspector functions.
|
||||
Example: an inspector that tries to find "definitional equality failures" may look for the tag "DefEqFailure". -/
|
||||
| tagged : Name → MessageData → MessageData
|
||||
| trace (data : TraceData) (msg : MessageData) (children : Array MessageData)
|
||||
| trace (cls : Name) (msg : MessageData) (children : Array MessageData) (collapsed : Bool)
|
||||
deriving Inhabited
|
||||
|
||||
namespace MessageData
|
||||
|
||||
/-- Determines whether the message contains any content. -/
|
||||
def isEmpty : MessageData → Bool
|
||||
| ofFormat f => f.isEmpty
|
||||
| withContext _ m => m.isEmpty
|
||||
| withNamingContext _ m => m.isEmpty
|
||||
| nest _ m => m.isEmpty
|
||||
| group m => m.isEmpty
|
||||
| compose m₁ m₂ => m₁.isEmpty && m₂.isEmpty
|
||||
| tagged _ m => m.isEmpty
|
||||
| _ => false
|
||||
|
||||
variable (p : Name → Bool) in
|
||||
/-- Returns true when the message contains a `MessageData.tagged tag ..` constructor where `p tag` is true. -/
|
||||
partial def hasTag : MessageData → Bool
|
||||
@@ -91,7 +90,7 @@ partial def hasTag : MessageData → Bool
|
||||
| group msg => hasTag msg
|
||||
| compose msg₁ msg₂ => hasTag msg₁ || hasTag msg₂
|
||||
| tagged n msg => p n || hasTag msg
|
||||
| trace data msg msgs => p data.cls || hasTag msg || msgs.any hasTag
|
||||
| trace cls msg msgs _ => p cls || hasTag msg || msgs.any hasTag
|
||||
| _ => false
|
||||
|
||||
/-- An empty message. -/
|
||||
@@ -134,7 +133,7 @@ where
|
||||
| group msg => visit mctx? msg
|
||||
| compose msg₁ msg₂ => visit mctx? msg₁ || visit mctx? msg₂
|
||||
| tagged _ msg => visit mctx? msg
|
||||
| trace _ msg msgs => visit mctx? msg || msgs.any (visit mctx?)
|
||||
| trace _ msg msgs _ => visit mctx? msg || msgs.any (visit mctx?)
|
||||
| _ => false
|
||||
|
||||
partial def formatAux : NamingContext → Option MessageDataContext → MessageData → IO Format
|
||||
@@ -148,11 +147,8 @@ partial def formatAux : NamingContext → Option MessageDataContext → MessageD
|
||||
| nCtx, ctx, nest n d => Format.nest n <$> formatAux nCtx ctx d
|
||||
| nCtx, ctx, compose d₁ d₂ => return (← formatAux nCtx ctx d₁) ++ (← formatAux nCtx ctx d₂)
|
||||
| nCtx, ctx, group d => Format.group <$> formatAux nCtx ctx d
|
||||
| nCtx, ctx, trace data header children => do
|
||||
let mut msg := f!"[{data.cls}]"
|
||||
if data.startTime != 0 then
|
||||
msg := f!"{msg} [{data.stopTime - data.startTime}]"
|
||||
msg := f!"{msg} {(← formatAux nCtx ctx header).nest 2}"
|
||||
| nCtx, ctx, trace cls header children _ => do
|
||||
let msg := f!"[{cls}] {(← formatAux nCtx ctx header).nest 2}"
|
||||
let children ← children.mapM (formatAux nCtx ctx)
|
||||
return .nest 2 (.joinSep (msg::children.toList) "\n")
|
||||
|
||||
@@ -209,15 +205,9 @@ instance : Coe (List Expr) MessageData := ⟨fun es => ofList <| es.map ofExpr
 
 end MessageData
 
-/--
-A `BaseMessage` is a richly formatted piece of information emitted by Lean.
-They are rendered by client editors in the infoview and in diagnostic windows.
-There are two varieties in the Lean core:
-* `Message`: Uses structured, effectful `MessageData` for formatting content.
-* `SerialMessage`: Stores pure `String` data. Obtained by running the effectful
-  `Message.serialize`.
--/
-structure BaseMessage (α : Type u) where
+/-- A `Message` is a richly formatted piece of information emitted by Lean.
+They are rendered by client editors in the infoview and in diagnostic windows. -/
+structure Message where
   fileName : String
   pos : Position
   endPos : Option Position := none
@@ -226,53 +216,24 @@ structure BaseMessage (α : Type u) where
   severity : MessageSeverity := MessageSeverity.error
   caption : String := ""
-  /-- The content of the message. -/
-  data : α
-  deriving Inhabited, ToJson, FromJson
+  data : MessageData
+  deriving Inhabited
 
-/-- A `Message` is a richly formatted piece of information emitted by Lean.
-They are rendered by client editors in the infoview and in diagnostic windows. -/
-abbrev Message := BaseMessage MessageData
+namespace Message
 
-/-- A `SerialMessage` is a `Message` whose `MessageData` has been eagerly
-serialized and is thus appropriate for use in pure contexts where the effectful
-`MessageData.toString` cannot be used. -/
-abbrev SerialMessage := BaseMessage String
-
-namespace SerialMessage
-
-@[inline] def toMessage (msg : SerialMessage) : Message :=
-  {msg with data := msg.data}
-
-protected def toString (msg : SerialMessage) (includeEndPos := false) : String := Id.run do
-  let mut str := msg.data
+protected def toString (msg : Message) (includeEndPos := false) : IO String := do
+  let mut str ← msg.data.toString
   let endPos := if includeEndPos then msg.endPos else none
   unless msg.caption == "" do
     str := msg.caption ++ ":\n" ++ str
   match msg.severity with
-  | .information => pure ()
-  | .warning => str := mkErrorStringWithPos msg.fileName msg.pos (endPos := endPos) "warning: " ++ str
-  | .error => str := mkErrorStringWithPos msg.fileName msg.pos (endPos := endPos) "error: " ++ str
+  | MessageSeverity.information => pure ()
+  | MessageSeverity.warning => str := mkErrorStringWithPos msg.fileName msg.pos (endPos := endPos) "warning: " ++ str
+  | MessageSeverity.error => str := mkErrorStringWithPos msg.fileName msg.pos (endPos := endPos) "error: " ++ str
   if str.isEmpty || str.back != '\n' then
     str := str ++ "\n"
   return str
 
-instance : ToString SerialMessage := ⟨SerialMessage.toString⟩
-
-end SerialMessage
-
-namespace Message
-
-@[inline] def serialize (msg : Message) : IO SerialMessage := do
-  return {msg with data := ← msg.data.toString}
-
-protected def toString (msg : Message) (includeEndPos := false) : IO String := do
-  -- Remark: The inline here avoids a new message allocation when `msg` is shared
-  return inline <| (← msg.serialize).toString includeEndPos
-
-protected def toJson (msg : Message) : IO Json := do
-  -- Remark: The inline here avoids a new message allocation when `msg` is shared
-  return inline <| toJson (← msg.serialize)
-
 end Message
 
 /-- A persistent array of messages. -/
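As a reading aid (not part of the diff): the hunks above replace the upstream `BaseMessage`/`SerialMessage` pair with the older single `Message` structure whose `data` is effectful `MessageData`. The sketch below contrasts how a caller would render a message under each API; the helper names `renderUpstream`/`renderOld` are hypothetical, and `msg : Message` is assumed to exist.

```lean
-- Hypothetical sketch of the two rendering paths in this hunk.
def renderUpstream (msg : Message) : IO Unit := do
  let smsg ← msg.serialize       -- upstream: serialize once into a pure `SerialMessage`
  IO.println (toString smsg)     -- then use the pure `ToString SerialMessage` instance

def renderOld (msg : Message) : IO Unit := do
  IO.println (← msg.toString)    -- pr-release: rendering stays in `IO` end to end
```

The upstream split lets pure consumers (e.g. JSON encoding) reuse one serialization instead of re-running the effectful `MessageData.toString`.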
@@ -164,7 +164,7 @@ def mkHEqTrans (h₁ h₂ : Expr) : MetaM Expr := do
   | none, _ => throwAppBuilderException ``HEq.trans ("heterogeneous equality proof expected" ++ hasTypeMsg h₁ hType₁)
   | _, none => throwAppBuilderException ``HEq.trans ("heterogeneous equality proof expected" ++ hasTypeMsg h₂ hType₂)
 
-/-- Given `h : HEq a b` where `a` and `b` have the same type, returns a proof of `Eq a b`. -/
+/-- Given `h : Eq a b`, returns a proof of `HEq a b`. -/
 def mkEqOfHEq (h : Expr) : MetaM Expr := do
   let hType ← infer h
   match hType.heq? with
@@ -174,7 +174,7 @@ def mkEqOfHEq (h : Expr) : MetaM Expr := do
     let u ← getLevel α
     return mkApp4 (mkConst ``eq_of_heq [u]) α a b h
   | _ =>
-    throwAppBuilderException ``eq_of_heq m!"heterogeneous equality proof expected{indentExpr h}"
+    throwAppBuilderException ``HEq.trans m!"heterogeneous equality proof expected{indentExpr h}"
 
 /--
   If `e` is `@Eq.refl α a`, return `a`.
@@ -189,7 +189,7 @@ def isRefl? (e : Expr) : Option Expr := do
   If `e` is `@congrArg α β a b f h`, return `α`, `f` and `h`.
   Also works if `e` can be turned into such an application (e.g. `congrFun`).
 -/
-def congrArg? (e : Expr) : MetaM (Option (Expr × Expr × Expr)) := do
+def congrArg? (e : Expr) : MetaM (Option (Expr × Expr × Expr )) := do
   if e.isAppOfArity ``congrArg 6 then
     let #[α, _β, _a, _b, f, h] := e.getAppArgs | unreachable!
     return some (α, f, h)
@@ -371,14 +371,6 @@ def SavedState.restore (b : SavedState) : MetaM Unit := do
   Core.restore b.core
   modify fun s => { s with mctx := b.meta.mctx, zetaDeltaFVarIds := b.meta.zetaDeltaFVarIds, postponed := b.meta.postponed }
 
-/--
-Restores full state including sources for unique identifiers. Only intended for incremental reuse
-between elaboration runs, not for backtracking within a single run.
--/
-def SavedState.restoreFull (b : SavedState) : MetaM Unit := do
-  Core.restoreFull b.core
-  set b.meta
-
 instance : MonadBacktrack SavedState MetaM where
   saveState := Meta.saveState
   restoreState s := s.restore
@@ -1173,35 +1173,6 @@ private def isDefEqLeftRight (fn : Name) (t s : Expr) : MetaM LBool := do
   trace[Meta.isDefEq.delta.unfoldLeftRight] fn
   toLBoolM <| Meta.isExprDefEqAux t s
 
-/-- Helper predicate for `tryHeuristic`. -/
-private def isNonTrivialRegular (info : DefinitionVal) : MetaM Bool := do
-  match info.hints with
-  | .regular d =>
-    if (← isProjectionFn info.name) then
-      -- All projections are considered trivial
-      return false
-    if d > 2 then
-      -- If definition depth is greater than 2, we claim it is not a trivial definition
-      return true
-    -- After consuming the lambda expressions, we consider a regular definition non-trivial if it is not "simple".
-    -- Where simple is a bvar/lit/sort/proj or a single application where all arguments are bvar/lit/sort/proj.
-    let val := consumeDefnPreamble info.value
-    return !isSimple val (allowApp := true)
-  | _ => return false
-where
-  consumeDefnPreamble (e : Expr) : Expr :=
-    match e with
-    | .mdata _ e => consumeDefnPreamble e
-    | .lam _ _ b _ => consumeDefnPreamble b
-    | _ => e
-  isSimple (e : Expr) (allowApp : Bool) : Bool :=
-    match e with
-    | .bvar .. | .sort .. | .lit .. | .fvar .. | .mvar .. => true
-    | .app f a => isSimple a false && isSimple f allowApp
-    | .proj _ _ b => isSimple b false
-    | .mdata _ b => isSimple b allowApp
-    | .lam .. | .letE .. | .forallE .. | .const .. => false
-
 /-- Try to solve `f a₁ ... aₙ =?= f b₁ ... bₙ` by solving `a₁ =?= b₁, ..., aₙ =?= bₙ`.
 
   Auxiliary method for isDefEqDelta -/
@@ -1210,32 +1181,19 @@ private def tryHeuristic (t s : Expr) : MetaM Bool := do
   let mut s := s
   let tFn := t.getAppFn
   let sFn := s.getAppFn
-  -- If `f` (i.e., `tFn`) is not a definition, we do not apply the heuristic.
-  let .defnInfo info ← getConstInfo tFn.constName! | return false
-  /-
-  We apply the heuristic in the following cases:
-  1- `f` is a non-trivial regular definition (see predicate `isNonTrivialRegular`)
-  2- `f` is `match` application.
-  3- `t` or `s` contain meta-variables.
-
-  The third case is important to make sure we can solve constraints such as
-  `S.proj ?x =?= S.proj t` without performing delta-reduction.
-
-  When the conditions 1&2&3 do not hold, we are assuming the heuristic implemented by this method is seldom effective
-  when `f` is not simple, `t` and `s` do not have metavariables, are not structurally equal.
-
-  Recall that auxiliary `match` definitions are marked as abbreviations, but we must use the heuristic on
-  them since they will not be unfolded when smartUnfolding is turned on. The abbreviation annotation in this
-  case is used to help the kernel type checker.
-
-  The `isNonTrivialRegular` predicate is also useful to avoid applying the heuristic to very simple definitions that
-  have not been marked as abbreviations by the user. Example:
-  ```
-  protected def Mem (a : α) (s : Set α) : Prop := s a
-  ```
-  at test 3807.lean
-  -/
-  unless (← isNonTrivialRegular info) || isMatcherCore (← getEnv) tFn.constName! do
+  let info ← getConstInfo tFn.constName!
+  /- We only use the heuristic when `f` is a regular definition or an auxiliary `match` application.
+     That is, it is not marked an abbreviation (e.g., a user-facing projection) or as opaque (e.g., proof).
+     We check whether terms contain metavariables to make sure we can solve constraints such
+     as `S.proj ?x =?= S.proj t` without performing delta-reduction.
+     That is, we are assuming the heuristic implemented by this method is seldom effective
+     when `t` and `s` do not have metavariables, are not structurally equal, and `f` is an abbreviation.
+     On the other hand, by unfolding `f`, we often produce smaller terms.
+
+     Recall that auxiliary `match` definitions are marked as abbreviations, but we must use the heuristic on
+     them since they will not be unfolded when smartUnfolding is turned on. The abbreviation annotation in this
+     case is used to help the kernel type checker. -/
+  unless info.hints.isRegular || isMatcherCore (← getEnv) tFn.constName! do
     unless t.hasExprMVar || s.hasExprMVar do
       return false
   withTraceNodeBefore `Meta.isDefEq.delta (return m!"{t} =?= {s}") do
@@ -1731,72 +1689,8 @@ private def isDefEqOnFailure (t s : Expr) : MetaM Bool := do
   unstuckMVar s (fun s => Meta.isExprDefEqAux t s) <|
     tryUnificationHints t s <||> tryUnificationHints s t
 
-/--
-Result type for `isDefEqDelta`
--/
-inductive DeltaStepResult where
-  | eq | unknown
-  | cont (t s : Expr)
-  | diff (t s : Expr)
-
-/--
-Perform one step of lazy delta reduction. This function decides whether to perform delta-reduction on `t`, `s`, or both.
-It is currently used to solve contraints of the form `(f a).i =?= (g a).i` where `i` is a numeral at `isDefEqProjDelta`.
-It is also a simpler version of `isDefEqDelta`. In the future, we may decide to combine these two functions like we do
-in the kernel.
--/
-private def isDefEqDeltaStep (t s : Expr) : MetaM DeltaStepResult := do
-  let tInfo? ← isDeltaCandidate? t
-  let sInfo? ← isDeltaCandidate? s
-  match tInfo?, sInfo? with
-  | none, none => return .unknown
-  | some _, none => unfold t (return .unknown) (k · s)
-  | none, some _ => unfold s (return .unknown) (k t ·)
-  | some tInfo, some sInfo =>
-    match compare tInfo.hints sInfo.hints with
-    | .lt => unfold t (return .unknown) (k · s)
-    | .gt => unfold s (return .unknown) (k t ·)
-    | .eq =>
-      unfold t
-        (unfold s (return .unknown) (k t ·))
-        (fun t => unfold s (k t s) (k t ·))
-where
-  k (t s : Expr) : MetaM DeltaStepResult := do
-    let t ← whnfCore t
-    let s ← whnfCore s
-    match (← isDefEqQuick t s) with
-    | .true => return .eq
-    | .false => return .diff t s
-    | .undef => return .cont t s
-
-/--
-Helper function for solving contraints of the form `t.i =?= s.i`.
--/
-private partial def isDefEqProjDelta (t s : Expr) (i : Nat) : MetaM Bool := do
-  let t ← whnfCore t
-  let s ← whnfCore s
-  match (← isDefEqQuick t s) with
-  | .true => return true
-  | .false | .undef => loop t s
-where
-  loop (t s : Expr) : MetaM Bool := do
-    match (← isDefEqDeltaStep t s) with
-    | .cont t s => loop t s
-    | .eq => return true
-    | .unknown => tryReduceProjs t s
-    | .diff t s => tryReduceProjs t s
-
-  tryReduceProjs (t s : Expr) : MetaM Bool := do
-    match (← projectCore? t i), (← projectCore? s i) with
-    | some t, some s => Meta.isExprDefEqAux t s
-    | _, _ => Meta.isExprDefEqAux t s
-
 private def isDefEqProj : Expr → Expr → MetaM Bool
-  | .proj m i t, .proj n j s =>
-    if i == j && m == n then
-      isDefEqProjDelta t s i
-    else
-      return false
+  | .proj m i t, .proj n j s => pure (i == j && m == n) <&&> Meta.isExprDefEqAux t s
   | .proj structName 0 s, v => isDefEqSingleton structName s v
   | v, .proj structName 0 s => isDefEqSingleton structName s v
   | _, _ => pure false
@@ -33,7 +33,7 @@ def assumptionCore (mvarId : MVarId) : MetaM Bool :=
 /-- Close goal `mvarId` using an assumption. Throw error message if failed. -/
 def _root_.Lean.MVarId.assumption (mvarId : MVarId) : MetaM Unit :=
   unless (← mvarId.assumptionCore) do
-    throwTacticEx `assumption mvarId
+    throwTacticEx `assumption mvarId ""
 
 @[deprecated MVarId.assumption]
 def assumption (mvarId : MVarId) : MetaM Unit :=
@@ -224,7 +224,7 @@ Throw exception if goal failed to be closed.
 -/
 def _root_.Lean.MVarId.contradiction (mvarId : MVarId) (config : Contradiction.Config := {}) : MetaM Unit :=
   unless (← mvarId.contradictionCore config) do
-    throwTacticEx `contradiction mvarId
+    throwTacticEx `contradiction mvarId ""
 
 @[deprecated MVarId.contradiction]
 def contradiction (mvarId : MVarId) (config : Contradiction.Config := {}) : MetaM Unit :=
@@ -943,17 +943,10 @@ def deriveInduction (name : Name) : MetaM Unit := do
 
 def isFunInductName (env : Environment) (name : Name) : Bool := Id.run do
   let .str p s := name | return false
-  match s with
-  | "induct" =>
-    if (WF.eqnInfoExt.find? env p).isSome then return true
-    if (Structural.eqnInfoExt.find? env p).isSome then return true
-    return false
-  | "mutual_induct" =>
-    if let some eqnInfo := WF.eqnInfoExt.find? env p then
-      if h : eqnInfo.declNames.size > 1 then
-        return eqnInfo.declNames[0] = p
-    return false
-  | _ => return false
+  unless s = "induct" do return false
+  if (WF.eqnInfoExt.find? env p).isSome then return true
+  if (Structural.eqnInfoExt.find? env p).isSome then return true
+  return false
 
 builtin_initialize
   registerReservedNamePredicate isFunInductName
@@ -67,6 +67,6 @@ Close given goal using `HEq.refl`.
 def _root_.Lean.MVarId.hrefl (mvarId : MVarId) : MetaM Unit := do
   mvarId.withContext do
     let some [] ← observing? do mvarId.apply (mkConst ``HEq.refl [← mkFreshLevelMVar])
-      | throwTacticEx `hrefl mvarId
+      | throwTacticEx `hrefl mvarId ""
 
 end Lean.Meta
@@ -241,17 +241,7 @@ def getSimpLetCase (n : Name) (t : Expr) (b : Expr) : MetaM SimpLetCase := do
   else
     return SimpLetCase.dep
 
-/--
-We use `withNewlemmas` whenever updating the local context.
-We use `withFreshCache` because the local context affects `simp` rewrites
-even when `contextual := false`.
-For example, the `discharger` may inspect the current local context. The default
-discharger does that when applying equational theorems, and the user may
-use `(discharger := assumption)` or `(discharger := omega)`.
-If the `wishFreshCache` introduces performance issues, we can design a better solution
-for the default discharger which is used most of the time.
--/
-def withNewLemmas {α} (xs : Array Expr) (f : SimpM α) : SimpM α := withFreshCache do
+def withNewLemmas {α} (xs : Array Expr) (f : SimpM α) : SimpM α := do
   if (← getConfig).contextual then
     let mut s ← getSimpTheorems
     let mut updated := false
@@ -260,7 +250,7 @@ def withNewLemmas {α} (xs : Array Expr) (f : SimpM α) : SimpM α := withFreshC
         s ← s.addTheorem (.fvar x.fvarId!) x
         updated := true
     if updated then
-      withTheReader Context (fun ctx => { ctx with simpTheorems := s }) f
+      withSimpTheorems s f
     else
      f
   else
@@ -314,27 +304,30 @@ def simpArrow (e : Expr) : SimpM Result := do
   trace[Debug.Meta.Tactic.simp] "arrow [{(← getConfig).contextual}] {p} [{← isProp p}] -> {q} [{← isProp q}]"
   if (← pure (← getConfig).contextual <&&> isProp p <&&> isProp q) then
     trace[Debug.Meta.Tactic.simp] "ctx arrow {rp.expr} -> {q}"
-    withLocalDeclD e.bindingName! rp.expr fun h => withNewLemmas #[h] do
-      let rq ← simp q
-      match rq.proof? with
-      | none => mkImpCongr e rp rq
-      | some hq =>
-        let hq ← mkLambdaFVars #[h] hq
-        /-
-        We use the default reducibility setting at `mkImpDepCongrCtx` and `mkImpCongrCtx` because they use the theorems
-        ```lean
-        @implies_dep_congr_ctx : ∀ {p₁ p₂ q₁ : Prop}, p₁ = p₂ → ∀ {q₂ : p₂ → Prop}, (∀ (h : p₂), q₁ = q₂ h) → (p₁ → q₁) = ∀ (h : p₂), q₂ h
-        @implies_congr_ctx : ∀ {p₁ p₂ q₁ q₂ : Prop}, p₁ = p₂ → (p₂ → q₁ = q₂) → (p₁ → q₁) = (p₂ → q₂)
-        ```
-        And the proofs may be from `rfl` theorems which are now omitted. Moreover, we cannot establish that the two
-        terms are definitionally equal using `withReducible`.
-        TODO (better solution): provide the problematic implicit arguments explicitly. It is more efficient and avoids this
-        problem.
-        -/
-        if rq.expr.containsFVar h.fvarId! then
-          return { expr := (← mkForallFVars #[h] rq.expr), proof? := (← withDefault <| mkImpDepCongrCtx (← rp.getProof) hq) }
-        else
-          return { expr := e.updateForallE! rp.expr rq.expr, proof? := (← withDefault <| mkImpCongrCtx (← rp.getProof) hq) }
+    withLocalDeclD e.bindingName! rp.expr fun h => do
+      let s ← getSimpTheorems
+      let s ← s.addTheorem (.fvar h.fvarId!) h
+      withSimpTheorems s do
+        let rq ← simp q
+        match rq.proof? with
+        | none => mkImpCongr e rp rq
+        | some hq =>
+          let hq ← mkLambdaFVars #[h] hq
+          /-
+          We use the default reducibility setting at `mkImpDepCongrCtx` and `mkImpCongrCtx` because they use the theorems
+          ```lean
+          @implies_dep_congr_ctx : ∀ {p₁ p₂ q₁ : Prop}, p₁ = p₂ → ∀ {q₂ : p₂ → Prop}, (∀ (h : p₂), q₁ = q₂ h) → (p₁ → q₁) = ∀ (h : p₂), q₂ h
+          @implies_congr_ctx : ∀ {p₁ p₂ q₁ q₂ : Prop}, p₁ = p₂ → (p₂ → q₁ = q₂) → (p₁ → q₁) = (p₂ → q₂)
+          ```
+          And the proofs may be from `rfl` theorems which are now omitted. Moreover, we cannot establish that the two
+          terms are definitionally equal using `withReducible`.
+          TODO (better solution): provide the problematic implicit arguments explicitly. It is more efficient and avoids this
+          problem.
+          -/
+          if rq.expr.containsFVar h.fvarId! then
+            return { expr := (← mkForallFVars #[h] rq.expr), proof? := (← withDefault <| mkImpDepCongrCtx (← rp.getProof) hq) }
+          else
+            return { expr := e.updateForallE! rp.expr rq.expr, proof? := (← withDefault <| mkImpCongrCtx (← rp.getProof) hq) }
   else
     mkImpCongr e rp (← simp q)
@@ -396,7 +389,7 @@ def simpLet (e : Expr) : SimpM Result := do
   | SimpLetCase.dep => return { expr := (← dsimp e) }
   | SimpLetCase.nondep =>
     let rv ← simp v
-    withLocalDeclD n t fun x => withNewLemmas #[x] do
+    withLocalDeclD n t fun x => do
      let bx := b.instantiate1 x
      let rbx ← simp bx
      let hb? ← match rbx.proof? with
@@ -409,7 +402,7 @@ def simpLet (e : Expr) : SimpM Result := do
     | _, some h => return { expr := e', proof? := some (← mkLetCongr (← rv.getProof) h) }
   | SimpLetCase.nondepDepVar =>
     let v' ← dsimp v
-    withLocalDeclD n t fun x => withNewLemmas #[x] do
+    withLocalDeclD n t fun x => do
      let bx := b.instantiate1 x
      let rbx ← simp bx
      let e' := mkLet n t v' (← rbx.expr.abstractM #[x])
@@ -21,11 +21,7 @@ structure Result where
   /-- A proof that `$e = $expr`, where the simplified expression is on the RHS.
   If `none`, the proof is assumed to be `refl`. -/
   proof? : Option Expr := none
-  /--
-  If `cache := true` the result is cached.
-  Warning: we will remove this field in the future. It is currently used by
-  `arith := true`, but we can now refactor the code to avoid the hack.
-  -/
+  /-- If `cache := true` the result is cached. -/
   cache : Bool := true
   deriving Inhabited
 
@@ -288,6 +284,9 @@ Save current cache, reset it, execute `x`, and then restore original cache.
   modify fun s => { s with cache := {} }
   try x finally modify fun s => { s with cache := cacheSaved }
 
+@[inline] def withSimpTheorems (s : SimpTheoremsArray) (x : SimpM α) : SimpM α := do
+  withFreshCache <| withTheReader Context (fun ctx => { ctx with simpTheorems := s }) x
+
 @[inline] def withDischarger (discharge? : Expr → SimpM (Option Expr)) (x : SimpM α) : SimpM α :=
   withFreshCache <| withReader (fun r => { MethodsRef.toMethods r with discharge? }.toMethodsRef) x
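As an annotation (not part of the diff): on the `pr-release` side, extending the simp set goes through `withSimpTheorems`, which resets the cache via `withFreshCache`; upstream inlines that combination into `withNewLemmas` itself. A minimal sketch of the pattern, assuming a simplified `SimpM` context with a `simpTheorems` field:

```lean
-- Hedged sketch: `withNewLemmasSketch` is a hypothetical name illustrating how the
-- cache reset and the reader update compose; it mirrors `withSimpTheorems` above.
def withNewLemmasSketch {α} (s : SimpTheoremsArray) (f : SimpM α) : SimpM α :=
  -- fresh cache first, because cached results are only valid for a fixed simp set
  withFreshCache <| withTheReader Context (fun ctx => { ctx with simpTheorems := s }) f
```

The fresh cache matters because cached rewrite results computed under one simp set (or local context) are not sound under another.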
@@ -68,7 +68,7 @@ We use this in `apply_rules` and `apply_assumption` where backtracking is not ne
 -/
 def applyFirst (cfg : ApplyConfig := {}) (transparency : TransparencyMode := .default)
     (lemmas : List Expr) (g : MVarId) : MetaM (List MVarId) := do
-  (← applyTactics cfg transparency lemmas g).head
+  (←applyTactics cfg transparency lemmas g).head
 
 open Lean.Meta.Tactic.Backtrack (BacktrackConfig backtrack)
@@ -168,7 +168,7 @@ applied to the instantiations of the original goals, fails or returns `false`.
 def testPartialSolutions (cfg : SolveByElimConfig := {}) (test : List Expr → MetaM Bool) : SolveByElimConfig :=
   { cfg with
     proc := fun orig goals => do
-      guard <| ← test (← orig.mapM fun m => m.withContext do instantiateMVars (.mvar m))
+      let .true ← test (← orig.mapM fun m => m.withContext do instantiateMVars (.mvar m)) | failure
      cfg.proc orig goals }
 
 /--
@@ -204,24 +204,22 @@ end SolveByElimConfig
 Elaborate a list of lemmas and local context.
 See `mkAssumptionSet` for an explanation of why this is needed.
 -/
-def elabContextLemmas (cfg : SolveByElimConfig) (g : MVarId)
-    (lemmas : List (TermElabM Expr)) (ctx : SolveByElimConfig → TermElabM (List Expr)) :
+def elabContextLemmas (g : MVarId) (lemmas : List (TermElabM Expr)) (ctx : TermElabM (List Expr)) :
     MetaM (List Expr) := do
-  g.withContext (Elab.Term.TermElabM.run' do pure ((← ctx cfg) ++ (← lemmas.mapM id)))
+  g.withContext (Elab.Term.TermElabM.run' do pure ((← ctx) ++ (← lemmas.mapM id)))
 
 /-- Returns the list of tactics corresponding to applying the available lemmas to the goal. -/
-def applyLemmas (cfg : SolveByElimConfig)
-    (lemmas : List (TermElabM Expr)) (ctx : SolveByElimConfig → TermElabM (List Expr))
-    (g : MVarId) : MetaM (Meta.Iterator (List MVarId)) := do
-  let es ← elabContextLemmas cfg g lemmas ctx
+def applyLemmas (cfg : SolveByElimConfig) (lemmas : List (TermElabM Expr)) (ctx : TermElabM (List Expr))
+    (g : MVarId)
+    : MetaM (Meta.Iterator (List MVarId)) := do
+  let es ← elabContextLemmas g lemmas ctx
   applyTactics cfg.toApplyConfig cfg.transparency es g
 
 /-- Applies the first possible lemma to the goal. -/
-def applyFirstLemma (cfg : SolveByElimConfig)
-    (lemmas : List (TermElabM Expr)) (ctx : SolveByElimConfig → TermElabM (List Expr))
+def applyFirstLemma (cfg : SolveByElimConfig) (lemmas : List (TermElabM Expr)) (ctx : TermElabM (List Expr))
     (g : MVarId) : MetaM (List MVarId) := do
-  let es ← elabContextLemmas cfg g lemmas ctx
-  applyFirst cfg.toApplyConfig cfg.transparency es g
+  let es ← elabContextLemmas g lemmas ctx
+  applyFirst cfg.toApplyConfig cfg.transparency es g
 
 /--
 Solve a collection of goals by repeatedly applying lemmas, backtracking as necessary.
@@ -239,16 +237,21 @@ By default `cfg.suspend` is `false,` `cfg.discharge` fails, and `cfg.failAtMaxDe
 and so the returned list is always empty.
 Custom wrappers (e.g. `apply_assumption` and `apply_rules`) may modify this behaviour.
 -/
-def solveByElim (cfg : SolveByElimConfig)
-    (lemmas : List (TermElabM Expr)) (ctx : SolveByElimConfig → TermElabM (List Expr))
+def solveByElim (cfg : SolveByElimConfig) (lemmas : List (TermElabM Expr)) (ctx : TermElabM (List Expr))
     (goals : List MVarId) : MetaM (List MVarId) := do
   let cfg := cfg.processOptions
+  -- We handle `cfg.symm` by saturating hypotheses of all goals using `symm`.
+  -- This has better performance that the mathlib3 approach.
+  let preprocessedGoals ← if cfg.symm then
+    goals.mapM fun g => g.symmSaturate
+  else
+    pure goals
   try
-    run cfg goals
+    run cfg preprocessedGoals
   catch e => do
     -- Implementation note: as with `cfg.symm`, this is different from the mathlib3 approach,
     -- for (not as severe) performance reasons.
-    match goals, cfg.exfalso with
+    match preprocessedGoals, cfg.exfalso with
     | [g], true =>
       withTraceNode `Meta.Tactic.solveByElim
         (fun _ => return m!"⏮️ starting over using `exfalso`") do
@@ -262,16 +265,6 @@ where
   else
     Lean.Meta.repeat1' (maxIters := cfg.maxDepth) (applyFirstLemma cfg lemmas ctx)
 
-/--
-If `symm` is `true`, then adds in symmetric versions of each hypothesis.
--/
-def saturateSymm (symm : Bool) (hyps : List Expr) : MetaM (List Expr) := do
-  if symm then
-    let extraHyps ← hyps.filterMapM fun hyp => try some <$> hyp.applySymm catch _ => pure none
-    return hyps ++ extraHyps
-  else
-    return hyps
-
 /--
 A `MetaM` analogue of the `apply_rules` user tactic.
@@ -283,9 +276,7 @@ If you need to remove particular local hypotheses, call `solveByElim` directly.
 -/
 def _root_.Lean.MVarId.applyRules (cfg : SolveByElimConfig) (lemmas : List (TermElabM Expr))
     (only : Bool := false) (g : MVarId) : MetaM (List MVarId) := do
-  let ctx (cfg : SolveByElimConfig) : TermElabM (List Expr) :=
-    if only then pure []
-    else do saturateSymm cfg.symm (← getLocalHyps).toList
+  let ctx : TermElabM (List Expr) := if only then pure [] else do pure (← getLocalHyps).toList
   solveByElim { cfg with backtracking := false } lemmas ctx [g]
 
 open Lean.Parser.Tactic
@@ -339,7 +330,7 @@ that have been explicitly removed via `only` or `[-h]`.)
 -- These `TermElabM`s must be run inside a suitable `g.withContext`,
 -- usually using `elabContextLemmas`.
 def mkAssumptionSet (noDefaults star : Bool) (add remove : List Term) (use : Array Ident) :
-    MetaM (List (TermElabM Expr) × (SolveByElimConfig → TermElabM (List Expr))) := do
+    MetaM (List (TermElabM Expr) × TermElabM (List Expr)) := do
   if star && !noDefaults then
     throwError "It doesn't make sense to use `*` without `only`."
@@ -354,12 +345,13 @@ def mkAssumptionSet (noDefaults star : Bool) (add remove : List Term) (use : Arr
 
   if !remove.isEmpty && noDefaults && !star then
     throwError "It doesn't make sense to remove local hypotheses when using `only` without `*`."
-  let locals (cfg : SolveByElimConfig) : TermElabM (List Expr) :=
-    if noDefaults && !star then do pure []
-    else do saturateSymm cfg.symm <| (← getLocalHyps).toList.removeAll (← remove.mapM elab')
+  let locals : TermElabM (List Expr) := if noDefaults && !star then do
+    pure []
+  else do
+    pure <| (← getLocalHyps).toList.removeAll (← remove.mapM elab')
 
   return (lemmas, locals)
-where
+where
   /-- Run `elabTerm`. -/
   elab' (t : Term) : TermElabM Expr := Elab.Term.elabTerm t.raw none
@@ -567,10 +567,9 @@ def addRewriteSuggestion (ref : Syntax) (rules : List (Expr × Bool))
   -- thus giving more information in the hovers.
   -- Perhaps in future we will have a better way to attach elaboration information to
   -- `Syntax` embedded in a `MessageData`.
-  let toMessageData (e : Expr) : MessageData := if e.isConst then .ofConst e else .ofExpr e
   let mut tacMsg :=
     let rulesMsg := MessageData.sbracket <| MessageData.joinSep
-      (rules.map fun ⟨e, symm⟩ => (if symm then "← " else "") ++ toMessageData e) ", "
+      (rules.map fun ⟨e, symm⟩ => (if symm then "← " else "") ++ m!"{e}") ", "
   if let some loc := loc? then
     m!"rw {rulesMsg} at {loc}"
   else
@@ -36,10 +36,11 @@ def appendTagSuffix (mvarId : MVarId) (suffix : Name) : MetaM Unit := do
 def mkFreshExprSyntheticOpaqueMVar (type : Expr) (tag : Name := Name.anonymous) : MetaM Expr :=
   mkFreshExprMVar type MetavarKind.syntheticOpaque tag
 
-def throwTacticEx (tacticName : Name) (mvarId : MVarId) (msg? : Option MessageData := none) : MetaM α :=
-  match msg? with
-  | none => throwError "tactic '{tacticName}' failed\n{mvarId}"
-  | some msg => throwError "tactic '{tacticName}' failed, {msg}\n{mvarId}"
+def throwTacticEx (tacticName : Name) (mvarId : MVarId) (msg : MessageData) : MetaM α :=
+  if msg.isEmpty then
+    throwError "tactic '{tacticName}' failed\n{mvarId}"
+  else
+    throwError "tactic '{tacticName}' failed, {msg}\n{mvarId}"
 
 def throwNestedTacticEx {α} (tacticName : Name) (ex : Exception) : MetaM α := do
   throwError "tactic '{tacticName}' failed, nested error:\n{ex.toMessageData}"
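As a reading aid (not part of the diff): the hunk above explains the `""` arguments added to `throwTacticEx` call sites elsewhere in this comparison. A hedged sketch of the call-site difference, with hypothetical invocations only:

```lean
-- Hypothetical call sites, contrasting the two signatures in the hunk above.
-- Upstream: the message is optional, so plain calls omit it entirely:
--   throwTacticEx `assumption mvarId
--   throwTacticEx `assumption mvarId m!"no usable hypothesis"
-- pr-release: callers always pass a `MessageData`; the empty string ""
-- coerces to an empty `MessageData`, which `msg.isEmpty` treats as "no message":
--   throwTacticEx `assumption mvarId ""
```

This is why hunks such as the `assumption`, `contradiction`, and `hrefl` ones differ only by a trailing `""`.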
@@ -529,7 +529,7 @@ def reduceMatcher? (e : Expr) : MetaM ReduceMatcherResult := do
     i := i + 1
   return ReduceMatcherResult.stuck auxApp
 
-def projectCore? (e : Expr) (i : Nat) : MetaM (Option Expr) := do
+private def projectCore? (e : Expr) (i : Nat) : MetaM (Option Expr) := do
   let e := e.toCtorIfLit
   matchConstCtor e.getAppFn (fun _ => pure none) fun ctorVal _ =>
     let numArgs := e.getAppNumArgs
@@ -620,22 +620,6 @@ def instantiateMVarsCore (mctx : MetavarContext) (e : Expr) : Expr × MetavarCon
     instantiateExprMVars e
   runST fun _ => instantiate e |>.run |>.run mctx
 
-/-
-Substitutes assigned metavariables in `e` with their assigned value according to the
-`MetavarContext`, recursively.
-
-Example:
-```
-run_meta do
-  let mvar1 ← mkFreshExprMVar (mkConst `Nat)
-  let e := (mkConst `Nat.succ).app mvar1
-  -- e is `Nat.succ ?m.773`, `?m.773` is unassigned
-  mvar1.mvarId!.assign (mkNatLit 42)
-  -- e is `Nat.succ ?m.773`, `?m.773` is assigned to `42`
-  let e' ← instantiateMVars e
-  -- e' is `Nat.succ 42`, `?m.773` is assigned to `42`
-```
--/
 def instantiateMVars [Monad m] [MonadMCtx m] (e : Expr) : m Expr := do
   if !e.hasMVar then
     return e
@@ -151,7 +151,7 @@ is an instance of a general family of type constructions known as inductive types.
 It is remarkable that it is possible to construct a substantial edifice of mathematics
 based on nothing more than the type universes, dependent arrow types, and inductive types;
 everything else follows from those.
-Intuitively, an inductive type is built up from a specified list of constructors.
+Intuitively, an inductive type is built up from a specified list of constructor.
 For example, `List α` is the list of elements of type `α`, and is defined as follows:
 ```
 inductive List (α : Type u) where
@@ -208,194 +208,14 @@ def «structure» := leading_parser
  "deriving " >> "instance " >> derivingClasses >> " for " >> sepBy1 (recover ident skip) ", "
@[builtin_command_parser] def noncomputableSection := leading_parser
  "noncomputable " >> "section" >> optional (ppSpace >> checkColGt >> ident)
/--
A `section`/`end` pair delimits the scope of `variable`, `open`, `set_option`, and `local` commands.
Sections can be nested. `section <id>` provides a label to the section that has to appear with the
matching `end`. In either case, the `end` can be omitted, in which case the section is closed at the
end of the file.
-/
@[builtin_command_parser] def «section» := leading_parser
  "section" >> optional (ppSpace >> checkColGt >> ident)
/--
`namespace <id>` opens a section with label `<id>` that influences naming and name resolution inside
the section:
* Declaration names are prefixed: `def seventeen : ℕ := 17` inside a namespace `Nat` is given the
  full name `Nat.seventeen`.
* Names introduced by `export` declarations are also prefixed by the identifier.
* All names starting with `<id>.` become available in the namespace without the prefix. These names
  are preferred over names introduced by outer namespaces or `open`.
* Within a namespace, declarations can be `protected`, which excludes them from the effects of
  opening the namespace.

As with `section`, namespaces can be nested and the scope of a namespace is terminated by a
corresponding `end <id>` or the end of the file.

`namespace` also acts like `section` in delimiting the scope of `variable`, `open`, and other scoped commands.
-/
@[builtin_command_parser] def «namespace» := leading_parser
  "namespace " >> checkColGt >> ident
/--
`end` closes a `section` or `namespace` scope. If the scope is named `<id>`, it has to be closed
with `end <id>`.
-/
@[builtin_command_parser] def «end» := leading_parser
  "end" >> optional (ppSpace >> checkColGt >> ident)
/-- Declares one or more typed variables, or modifies whether already-declared variables are
implicit.

Introduces variables that can be used in definitions within the same `namespace` or `section` block.
When a definition mentions a variable, Lean will add it as an argument of the definition. The
`variable` command is also able to add typeclass parameters. This is useful in particular when
writing many definitions that have parameters in common (see below for an example).

Variable declarations have the same flexibility as regular function parameters. In particular they
can be [explicit, implicit][binder docs], or [instance implicit][tpil classes] (in which case they
can be anonymous). This can be changed; for instance, one can turn the explicit variable `x` into an
implicit one with `variable {x}`. Note that currently, you should avoid changing how variables are
bound and declaring new variables at the same time; see [issue 2789] for more on this topic.

See [*Variables and Sections* from Theorem Proving in Lean][tpil vars] for a more detailed
discussion.

[tpil vars]: https://lean-lang.org/theorem_proving_in_lean4/dependent_type_theory.html#variables-and-sections
(Variables and Sections on Theorem Proving in Lean)
[tpil classes]: https://lean-lang.org/theorem_proving_in_lean4/type_classes.html
(Type classes on Theorem Proving in Lean)
[binder docs]: https://leanprover-community.github.io/mathlib4_docs/Lean/Expr.html#Lean.BinderInfo
(Documentation for the BinderInfo type)
[issue 2789]: https://github.com/leanprover/lean4/issues/2789
(Issue 2789 on GitHub)

## Examples

```lean
section
  variable
    {α : Type u}      -- implicit
    (a : α)           -- explicit
    [instBEq : BEq α] -- instance implicit, named
    [Hashable α]      -- instance implicit, anonymous

  def isEqual (b : α) : Bool :=
    a == b

  #check isEqual
  -- isEqual.{u} {α : Type u} (a : α) [instBEq : BEq α] (b : α) : Bool

  variable
    {a} -- `a` is implicit now

  def eqComm {b : α} := a == b ↔ b == a

  #check eqComm
  -- eqComm.{u} {α : Type u} {a : α} [instBEq : BEq α] {b : α} : Prop
end
```

The following shows a typical use of `variable` to factor out definition arguments:

```lean
variable (Src : Type)

structure Logger where
  trace : List (Src × String)
#check Logger
-- Logger (Src : Type) : Type

namespace Logger
  -- switch `Src : Type` to be implicit until the `end Logger`
  variable {Src}

  def empty : Logger Src where
    trace := []
  #check empty
  -- Logger.empty {Src : Type} : Logger Src

  variable (log : Logger Src)

  def len :=
    log.trace.length
  #check len
  -- Logger.len {Src : Type} (log : Logger Src) : Nat

  variable (src : Src) [BEq Src]

  -- at this point all of `log`, `src`, `Src` and the `BEq` instance can all become arguments

  def filterSrc :=
    log.trace.filterMap
      fun (src', str') => if src' == src then some str' else none
  #check filterSrc
  -- Logger.filterSrc {Src : Type} (log : Logger Src) (src : Src) [inst✝ : BEq Src] : List String

  def lenSrc :=
    log.filterSrc src |>.length
  #check lenSrc
  -- Logger.lenSrc {Src : Type} (log : Logger Src) (src : Src) [inst✝ : BEq Src] : Nat
end Logger
```

-/
@[builtin_command_parser] def «variable» := leading_parser
  "variable" >> many1 (ppSpace >> checkColGt >> Term.bracketedBinder)
/-- Declares one or more universe variables.

`universe u v`

`Prop`, `Type`, `Type u` and `Sort u` are types that classify other types, also known as
*universes*. In `Type u` and `Sort u`, the variable `u` stands for the universe's *level*, and a
universe at level `u` can only classify universes that are at levels lower than `u`. For more
details on type universes, please refer to [the relevant chapter of Theorem Proving in Lean][tpil universes].

Just as type arguments allow polymorphic definitions to be used at many different types, universe
parameters, represented by universe variables, allow a definition to be used at any required level.
While Lean mostly handles universe levels automatically, declaring them explicitly can provide more
control when writing signatures. The `universe` keyword allows the declared universe variables to be
used in a collection of definitions, and Lean will ensure that these definitions use them
consistently.

[tpil universes]: https://lean-lang.org/theorem_proving_in_lean4/dependent_type_theory.html#types-as-objects
(Type universes on Theorem Proving in Lean)

```lean
/- Explicit type-universe parameter. -/
def id₁.{u} (α : Type u) (a : α) := a

/- Implicit type-universe parameter, equivalent to `id₁`.
Requires option `autoImplicit true`, which is the default. -/
def id₂ (α : Type u) (a : α) := a

/- Explicit standalone universe variable declaration, equivalent to `id₁` and `id₂`. -/
universe u
def id₃ (α : Type u) (a : α) := a
```

On a more technical note, using a universe variable only in the right-hand side of a definition
causes an error if the universe has not been declared previously.

```lean
def L₁.{u} := List (Type u)

-- def L₂ := List (Type u) -- error: `unknown universe level 'u'`

universe u
def L₃ := List (Type u)
```

## Examples

```lean
universe u v w

structure Pair (α : Type u) (β : Type v) : Type (max u v) where
  a : α
  b : β

#check Pair.{v, w}
-- Pair : Type v → Type w → Type (max v w)
```
-/
@[builtin_command_parser] def «universe» := leading_parser
  "universe" >> many1 (ppSpace >> checkColGt >> ident)
@[builtin_command_parser] def check := leading_parser
@@ -419,21 +239,6 @@ structure Pair (α : Type u) (β : Type v) : Type (max u v) where
@[builtin_command_parser] def «init_quot» := leading_parser
  "init_quot"
def optionValue := nonReservedSymbol "true" <|> nonReservedSymbol "false" <|> strLit <|> numLit
/--
`set_option <id> <value>` sets the option `<id>` to `<value>`. Depending on the type of the option,
the value can be `true`, `false`, a string, or a numeral. Options are used to configure behavior of
Lean as well as user-defined extensions. The setting is active until the end of the current `section`
or `namespace` or the end of the file.
Auto-completion is available for `<id>` to list available options.

`set_option <id> <value> in <command>` sets the option for just a single command:
```
set_option pp.all true in
#check 1 + 1
```
Similarly, `set_option <id> <value> in` can also be used inside terms and tactics to set an option
only in a single term or tactic.
-/
@[builtin_command_parser] def «set_option» := leading_parser
  "set_option " >> identWithPartialTrailingDot >> ppSpace >> optionValue
def eraseAttr := leading_parser
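As an editorial aside, the docstring above mentions that `set_option ... in` also works inside tactic blocks; a minimal illustrative sketch of that usage (not part of the diff):

```lean
-- Scope an option to a single tactic step inside a proof:
example : 2 + 2 = 4 := by
  set_option pp.all true in rfl
```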
@@ -442,30 +247,6 @@ def eraseAttr := leading_parser
  "attribute " >> "[" >>
    withoutPosition (sepBy1 (eraseAttr <|> Term.attrInstance) ", ") >>
    "]" >> many1 (ppSpace >> ident)
/-- Adds names from other namespaces to the current namespace.

The command `export Some.Namespace (name₁ name₂)` makes `name₁` and `name₂`:

- visible in the current namespace without prefix `Some.Namespace`, like `open`, and
- visible from outside the current namespace `N` as `N.name₁` and `N.name₂`.

## Examples

```lean
namespace Morning.Sky
  def star := "venus"
end Morning.Sky

namespace Evening.Sky
  export Morning.Sky (star)
  -- `star` is now in scope
  #check star
end Evening.Sky

-- `star` is visible in `Evening.Sky`
#check Evening.Sky.star
```
-/
@[builtin_command_parser] def «export» := leading_parser
  "export " >> ident >> " (" >> many1 ident >> ")"
@[builtin_command_parser] def «import» := leading_parser
@@ -486,118 +267,6 @@ def openScoped := leading_parser
def openDecl :=
  withAntiquot (mkAntiquot "openDecl" `Lean.Parser.Command.openDecl (isPseudoKind := true)) <|
    openHiding <|> openRenaming <|> openOnly <|> openSimple <|> openScoped
/-- Makes names from other namespaces visible without writing the namespace prefix.

Names that are made available with `open` are visible within the current `section` or `namespace`
block. This makes referring to (type) definitions and theorems easier, but note that it can also
make [scoped instances], notations, and attributes from a different namespace available.

The `open` command can be used in a few different ways:

* `open Some.Namespace.Path1 Some.Namespace.Path2` makes all non-protected names in
  `Some.Namespace.Path1` and `Some.Namespace.Path2` available without the prefix, so that
  `Some.Namespace.Path1.x` and `Some.Namespace.Path2.y` can be referred to by writing only `x` and
  `y`.

* `open Some.Namespace.Path hiding def1 def2` opens all non-protected names in `Some.Namespace.Path`
  except `def1` and `def2`.

* `open Some.Namespace.Path (def1 def2)` only makes `Some.Namespace.Path.def1` and
  `Some.Namespace.Path.def2` available without the full prefix, so `Some.Namespace.Path.def3` would
  be unaffected.

  This works even if `def1` and `def2` are `protected`.

* `open Some.Namespace.Path renaming def1 → def1', def2 → def2'` same as `open Some.Namespace.Path
  (def1 def2)` but `def1`/`def2`'s names are changed to `def1'`/`def2'`.

  This works even if `def1` and `def2` are `protected`.

* `open scoped Some.Namespace.Path1 Some.Namespace.Path2` **only** opens [scoped instances],
  notations, and attributes from `Namespace1` and `Namespace2`; it does **not** make any other name
  available.

* `open <any of the open shapes above> in` makes the names `open`-ed visible only in the next
  command or expression.

[scoped instance]: https://lean-lang.org/theorem_proving_in_lean4/type_classes.html#scoped-instances
(Scoped instances in Theorem Proving in Lean)

## Examples

```lean
/-- SKI combinators https://en.wikipedia.org/wiki/SKI_combinator_calculus -/
namespace Combinator.Calculus
  def I (a : α) : α := a
  def K (a : α) : β → α := fun _ => a
  def S (x : α → β → γ) (y : α → β) (z : α) : γ := x z (y z)
end Combinator.Calculus

section
  -- open everything under `Combinator.Calculus`, *i.e.* `I`, `K` and `S`,
  -- until the section ends
  open Combinator.Calculus

  theorem SKx_eq_K : S K x = I := rfl
end

-- open everything under `Combinator.Calculus` only for the next command (the next `theorem`, here)
open Combinator.Calculus in
theorem SKx_eq_K' : S K x = I := rfl

section
  -- open only `S` and `K` under `Combinator.Calculus`
  open Combinator.Calculus (S K)

  theorem SKxy_eq_y : S K x y = y := rfl

  -- `I` is not in scope, we have to use its full path
  theorem SKxy_eq_Iy : S K x y = Combinator.Calculus.I y := rfl
end

section
  open Combinator.Calculus
    renaming
      I → identity,
      K → konstant

  #check identity
  #check konstant
end

section
  open Combinator.Calculus
    hiding S

  #check I
  #check K
end

section
  namespace Demo
    inductive MyType
      | val

    namespace N1
      scoped infix:68 " ≋ " => BEq.beq

      scoped instance : BEq MyType where
        beq _ _ := true

      def Alias := MyType
    end N1
  end Demo

  -- bring `≋` and the instance in scope, but not `Alias`
  open scoped Demo.N1

  #check Demo.MyType.val == Demo.MyType.val
  #check Demo.MyType.val ≋ Demo.MyType.val
  -- #check Alias -- unknown identifier 'Alias'
end
```
-/
@[builtin_command_parser] def «open» := leading_parser
  withPosition ("open" >> openDecl)

@@ -613,28 +282,6 @@ def initializeKeyword := leading_parser
@[builtin_command_parser] def «in» := trailing_parser
  withOpen (ppDedent (" in " >> commandParser))

/--
Adds a docstring to an existing declaration, replacing any existing docstring.
The provided docstring should be written as a docstring for the `add_decl_doc` command, as in
```
/-- My new docstring -/
add_decl_doc oldDeclaration
```

This is useful for auto-generated declarations
for which there is no place to write a docstring in the source code.

Parent projections in structures are an example of this:
```
structure Triple (α β γ : Type) extends Prod α β where
  thrd : γ

/-- Extracts the first two projections of a triple. -/
add_decl_doc Triple.toProd
```

Documentation can only be added to declarations in the same module.
-/
@[builtin_command_parser] def addDocString := leading_parser
  docComment >> "add_decl_doc " >> ident

@@ -5,8 +5,9 @@ Authors: Leonardo de Moura, Sebastian Ullrich
-/
prelude
import Lean.Parser.Basic
import Lean.Compiler.InitAttr
import Lean.ScopedEnvExtension
import Lean.BuiltinDocAttr
import Lean.DocString

/-! Extensible parsing via attributes -/

@@ -487,20 +488,23 @@ private def BuiltinParserAttribute.add (attrName : Name) (catName : Name)
  | Expr.const `Lean.Parser.Parser _ =>
    declareLeadingBuiltinParser catName declName prio
  | _ => throwError "unexpected parser type at '{declName}' (`Parser` or `TrailingParser` expected)"
  declareBuiltinDocStringAndRanges declName
  if let some doc ← findDocString? (← getEnv) declName (includeBuiltin := false) then
    declareBuiltin (declName ++ `docString) (mkAppN (mkConst ``addBuiltinDocString) #[toExpr declName, toExpr doc])
  if let some declRanges ← findDeclarationRanges? declName then
    declareBuiltin (declName ++ `declRange) (mkAppN (mkConst ``addBuiltinDeclarationRanges) #[toExpr declName, toExpr declRanges])
  runParserAttributeHooks catName declName (builtin := true)

/--
The parsing tables for builtin parsers are "stored" in the extracted source code.
-/
def registerBuiltinParserAttribute (attrName declName : Name)
    (behavior := LeadingIdentBehavior.default) (ref : Name := by exact decl_name%) : IO Unit := do
    (behavior := LeadingIdentBehavior.default) : IO Unit := do
  let .str ``Lean.Parser.Category s := declName
    | throw (IO.userError "`declName` should be in Lean.Parser.Category")
  let catName := Name.mkSimple s
  addBuiltinParserCategory catName declName behavior
  registerBuiltinAttribute {
    ref := ref
    ref := declName
    name := attrName
    descr := "Builtin parser"
    add := fun declName stx kind => liftM $ BuiltinParserAttribute.add attrName catName declName stx kind

@@ -266,49 +266,29 @@ open Lean.PrettyPrinter Parenthesizer Syntax.MonadTraverser in
    term.parenthesizer prec
  visitToken

/--
Explicit binder, like `(x y : A)` or `(x y)`.
Default values can be specified using `(x : A := v)` syntax, and tactics using `(x : A := by tac)`.
-/
def explicitBinder (requireType := false) := leading_parser ppGroup <|
def explicitBinder (requireType := false) := ppGroup $ leading_parser
  "(" >> withoutPosition (many1 binderIdent >> binderType requireType >> optional (binderTactic <|> binderDefault)) >> ")"
/--
Implicit binder, like `{x y : A}` or `{x y}`.
In regular applications, whenever all parameters before it have been specified,
then a `_` placeholder is automatically inserted for this parameter.
Implicit parameters should be able to be determined from the other arguments and the return type
by unification.

In `@` explicit mode, implicit binders behave like explicit binders.
Implicit binder. In regular applications without `@`, it is automatically inserted
and solved by unification whenever all explicit parameters before it are specified.
-/
def implicitBinder (requireType := false) := leading_parser ppGroup <|
def implicitBinder (requireType := false) := ppGroup $ leading_parser
  "{" >> withoutPosition (many1 binderIdent >> binderType requireType) >> "}"
def strictImplicitLeftBracket := atomic (group (symbol "{" >> "{")) <|> "⦃"
def strictImplicitRightBracket := atomic (group (symbol "}" >> "}")) <|> "⦄"
/--
Strict-implicit binder, like `⦃x y : A⦄` or `⦃x y⦄`.
In contrast to `{ ... }` implicit binders, strict-implicit binders do not automatically insert
a `_` placeholder until at least one subsequent explicit parameter is specified.
Do *not* use strict-implicit binders unless there is a subsequent explicit parameter.
Assuming this rule is followed, for fully applied expressions implicit and strict-implicit binders have the same behavior.

Example: If `h : ∀ ⦃x : A⦄, x ∈ s → p x` and `hs : y ∈ s`,
then `h` by itself elaborates to itself without inserting `_` for the `x : A` parameter,
and `h hs` has type `p y`.
In contrast, if `h' : ∀ {x : A}, x ∈ s → p x`, then `h'` by itself elaborates to have type `?m ∈ s → p ?m`
with `?m` a fresh metavariable.
Strict-implicit binder. In contrast to `{ ... }` regular implicit binders,
a strict-implicit binder is inserted automatically only when at least one subsequent
explicit parameter is specified.
-/
def strictImplicitBinder (requireType := false) := leading_parser ppGroup <|
def strictImplicitBinder (requireType := false) := ppGroup <| leading_parser
  strictImplicitLeftBracket >> many1 binderIdent >>
    binderType requireType >> strictImplicitRightBracket
/--
Instance-implicit binder, like `[C]` or `[inst : C]`.
In regular applications without `@` explicit mode, it is automatically inserted
and solved for by typeclass inference for the specified class `C`.
In `@` explicit mode, if `_` is used for an instance-implicit parameter, then it is still solved for by typeclass inference;
use `(_)` to inhibit this and have it be solved for by unification instead, like an implicit argument.
Instance-implicit binder. In regular applications without `@`, it is automatically inserted
and solved by typeclass inference of the specified class.
-/
def instBinder := leading_parser ppGroup <|
def instBinder := ppGroup <| leading_parser
  "[" >> withoutPosition (optIdent >> termParser) >> "]"
/-- A `bracketedBinder` matches any kind of binder group that uses some kind of brackets:
* An explicit binder like `(x y : A)`
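The binder docstrings in the hunk above can be seen side by side in one signature; a minimal editorial sketch (the names `demo`, `_p`, etc. are hypothetical, not from the diff):

```lean
-- All four bracketed binder kinds in one definition:
def demo (n : Nat)     -- explicit: caller must supply it
    {α : Type}         -- implicit: solved by unification
    ⦃_p : α → Prop⦄    -- strict-implicit: inserted only before a later explicit argument
    [BEq α]            -- instance-implicit: solved by typeclass inference
    (a b : α) : Bool :=
  a == b && n == n
```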
@@ -766,17 +746,6 @@ is short for accessing the `i`-th field (1-indexed) of `e` if it is of a structu
@[builtin_term_parser] def arrow := trailing_parser
  checkPrec 25 >> unicodeSymbol " → " " -> " >> termParser 25

/--
Syntax kind for syntax nodes representing the field of a projection in the `InfoTree`.
Specifically, the `InfoTree` node for a projection `s.f` contains a child `InfoTree` node
with syntax ``(Syntax.node .none identProjKind #[`f])``.

This is necessary because projection syntax cannot always be detected purely syntactically
(`s.f` may refer to either the identifier `s.f` or a projection `s.f` depending on
the available context).
-/
def identProjKind := `Lean.Parser.Term.identProj

def isIdent (stx : Syntax) : Bool :=
  -- antiquotations should also be allowed where an identifier is expected
  stx.isAntiquot || stx.isIdent

@@ -46,6 +46,9 @@ def ppExprWithInfos (e : Expr) (optsPerPos : Delaborator.OptionsPerPos := {}) (d
  let fmt ← ppTerm stx
  return ⟨fmt, infos⟩

def ppConst (e : Expr) : MetaM Format := do
  ppUsing e fun e => return (← delabCore e (delab := Delaborator.delabConst)).1

@[export lean_pp_expr]
def ppExprLegacy (env : Environment) (mctx : MetavarContext) (lctx : LocalContext) (opts : Options) (e : Expr) : IO Format :=
  Prod.fst <$> ((ppExpr e).run' { lctx := lctx } { mctx := mctx }).toIO { options := opts, fileName := "<PrettyPrinter>", fileMap := default } { env := env }
@@ -72,8 +75,8 @@ private partial def noContext : MessageData → MessageData
  | MessageData.group msg => MessageData.group (noContext msg)
  | MessageData.compose msg₁ msg₂ => MessageData.compose (noContext msg₁) (noContext msg₂)
  | MessageData.tagged tag msg => MessageData.tagged tag (noContext msg)
  | MessageData.trace data header children =>
    MessageData.trace data (noContext header) (children.map noContext)
  | MessageData.trace cls header children collapsed =>
    MessageData.trace cls (noContext header) (children.map noContext) collapsed
  | msg => msg

-- strip context (including environments with registered pretty printers) to prevent infinite recursion when pretty printing pretty printer error
@@ -98,37 +101,4 @@ unsafe def registerParserCompilers : IO Unit := do
  ParserCompiler.registerParserCompiler ⟨`formatter, formatterAttribute, combinatorFormatterAttribute⟩

end PrettyPrinter

namespace MessageData

open Lean PrettyPrinter Delaborator

/--
Turns a `MetaM FormatWithInfos` into a `MessageData` using `.ofPPFormat` and running the monadic value in the given context.
Uses the `pp.tagAppFns` option to annotate constants with terminfo, which is necessary for seeing the type on mouse hover.
-/
def ofFormatWithInfos
    (fmt : MetaM FormatWithInfos)
    (noContext : Unit → Format := fun _ => "<no context, could not generate MessageData>") : MessageData :=
  .ofPPFormat
    { pp := fun
      | some ctx => ctx.runMetaM <| withOptions (pp.tagAppFns.set · true) fmt
      | none => return noContext () }

/-- Pretty print a const expression using `delabConst` and generate terminfo.
This function avoids inserting `@` if the constant is for a function whose first
argument is implicit, which is what the default `toMessageData` for `Expr` does.
Panics if `e` is not a constant. -/
def ofConst (e : Expr) : MessageData :=
  if e.isConst then
    .ofFormatWithInfos (PrettyPrinter.ppExprWithInfos (delab := delabConst) e) fun _ => f!"{e}"
  else
    panic! "not a constant"

/-- Generates `MessageData` for a declaration `c` as `c.{<levels>} <params> : <type>`, with terminfo. -/
def signature (c : Name) : MessageData :=
  .ofFormatWithInfos (PrettyPrinter.ppSignature c) fun _ => f!"{c}"

end MessageData

end Lean

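A hypothetical usage sketch of the `MessageData` helpers discussed above, assuming a meta context via `run_meta` (editorial illustration, not part of the diff):

```lean
run_meta do
  -- renders `c.{<levels>} <params> : <type>` with hover info
  logInfo (MessageData.signature ``Nat.add)
  -- pretty-prints a constant without forcing `@`
  logInfo (MessageData.ofConst (mkConst ``Nat.add))
```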
@@ -67,18 +67,12 @@ def realizeGlobalConstNoOverloadCore (n : Name) : CoreM Name := do

/--
Similar to `resolveGlobalConst`, but also executes reserved name actions.

Consider using `realizeGlobalConstWithInfo` if you want the syntax to show the resulting name's info
on hover.
-/
def realizeGlobalConst (stx : Syntax) : CoreM (List Name) :=
  withRef stx do preprocessSyntaxAndResolve stx realizeGlobalConstCore

/--
Similar to `realizeGlobalConstNoOverload`, but also executes reserved name actions.

Consider using `realizeGlobalConstNoOverloadWithInfo` if you want the syntax to show the resulting
name's info on hover.
-/
def realizeGlobalConstNoOverload (id : Syntax) : CoreM Name := do
  ensureNonAmbiguous id (← realizeGlobalConst id)

@@ -289,7 +289,7 @@ def filterFieldList [Monad m] [MonadError m] (n : Name) (cs : List (Name × List
  if cs.isEmpty then throwUnknownConstant n
  return cs.map (·.1)

/-- Given a name `n`, returns a list of possible interpretations for global constants.
/-- Given a name `n`, return a list of possible interpretations for global constants.

Similar to `resolveGlobalName`, but discard any candidate whose `fieldList` is not empty.
For identifiers taken from syntax, use `resolveGlobalConst` instead, which respects preresolved names. -/
@@ -320,14 +320,11 @@ def preprocessSyntaxAndResolve [Monad m] [MonadResolveName m] [MonadEnv m] [Mona
  | _ => throwErrorAt stx s!"expected identifier"

/--
Interprets the syntax `n` as an identifier for a global constant, and returns a list of resolved
Interpret the syntax `n` as an identifier for a global constant, and return a list of resolved
constant names that it could be referring to based on the currently open namespaces.
This should be used instead of `resolveGlobalConstCore` for identifiers taken from syntax
because `Syntax` objects may have names that have already been resolved.

Consider using `realizeGlobalConst` if you need to handle reserved names, and
`realizeGlobalConstWithInfo` if you want the syntax to show the resulting name's info on hover.

## Example:
```
def Boo.x := 1
@@ -348,7 +345,7 @@ def resolveGlobalConst [Monad m] [MonadResolveName m] [MonadEnv m] [MonadError m
  preprocessSyntaxAndResolve stx resolveGlobalConstCore

/--
Given a list of names produced by `resolveGlobalConst`, throws an error if the list does not contain
Given a list of names produced by `resolveGlobalConst`, throw an error if the list does not contain
exactly one element.
Recall that `resolveGlobalConst` does not return empty lists.
-/
@@ -358,13 +355,9 @@ def ensureNonAmbiguous [Monad m] [MonadError m] (id : Syntax) (cs : List Name) :
  | [c] => pure c
  | _ => throwErrorAt id s!"ambiguous identifier '{id}', possible interpretations: {cs.map mkConst}"

/-- Interprets the syntax `n` as an identifier for a global constant, and return a resolved
/-- Interpret the syntax `n` as an identifier for a global constant, and return a resolved
constant name. If there are multiple possible interpretations it will throw.

Consider using `realizeGlobalConstNoOverload` if you need to handle reserved names, and
`realizeGlobalConstNoOverloadWithInfo` if you
want the syntax to show the resulting name's info on hover.

## Example:
```
def Boo.x := 1

@@ -73,11 +73,6 @@ structure WorkerContext where
|
||||
maxDocVersionRef : IO.Ref Int
|
||||
freshRequestIdRef : IO.Ref Int
|
||||
/--
|
||||
Channel that receives a message for every a `$/lean/fileProgress` notification, indicating whether
|
||||
the notification suggests that the file is currently being processed.
|
||||
-/
|
||||
chanIsProcessing : IO.Channel Bool
|
||||
/--
|
||||
Diagnostics that are included in every single `textDocument/publishDiagnostics` notification.
|
||||
-/
|
||||
stickyDiagnosticsRef : IO.Ref StickyDiagnostics
|
||||
@@ -228,8 +223,7 @@ This option can only be set on the command line, not in the lakefile or via `set
|
||||
| t::ts => do
|
||||
let mut st := st
|
||||
unless (← IO.hasFinished t.task) do
|
||||
if let some range := t.range? then
|
||||
ctx.chanOut.send <| mkFileProgressAtPosNotification doc.meta range.start
|
||||
ctx.chanOut.send <| mkFileProgressAtPosNotification doc.meta t.range.start
|
||||
if !st.hasBlocked then
|
||||
publishDiagnostics ctx doc
|
||||
st := { st with hasBlocked := true }
|
||||
@@ -320,9 +314,8 @@ section Initialization
    catch _ => pure ()
  let maxDocVersionRef ← IO.mkRef 0
  let freshRequestIdRef ← IO.mkRef 0
  let chanIsProcessing ← IO.Channel.new
  let stickyDiagnosticsRef ← IO.mkRef ∅
  let chanOut ← mkLspOutputChannel maxDocVersionRef chanIsProcessing
  let chanOut ← mkLspOutputChannel maxDocVersionRef
  let srcSearchPathPromise ← IO.Promise.new

  let processor := Language.Lean.process (setupImports meta chanOut srcSearchPathPromise)
@@ -340,7 +333,6 @@ section Initialization
    clientHasWidgets
    maxDocVersionRef
    freshRequestIdRef
    chanIsProcessing
    cmdlineOpts := opts
    stickyDiagnosticsRef
  }
@@ -363,7 +355,7 @@ section Initialization
  the output FS stream after discarding outdated notifications. This is the only component of
  the worker with access to the output stream, so we can synchronize messages from parallel
  elaboration tasks here. -/
  mkLspOutputChannel maxDocVersion chanIsProcessing : IO (IO.Channel JsonRpc.Message) := do
  mkLspOutputChannel maxDocVersion : IO (IO.Channel JsonRpc.Message) := do
    let chanOut ← IO.Channel.new
    let _ ← chanOut.forAsync (prio := .dedicated) fun msg => do
      -- discard outdated notifications; note that in contrast to responses, notifications can
@@ -382,9 +374,6 @@ section Initialization
      -- note that because of `server.reportDelayMs`, we cannot simply set `maxDocVersion` here
      -- as that would allow outdated messages to be reported until the delay is over
      o.writeLspMessage msg |>.catchExceptions (fun _ => pure ())
      if let .notification "$/lean/fileProgress" (some params) := msg then
        if let some (params : LeanFileProgressParams) := fromJson? (toJson params) |>.toOption then
          chanIsProcessing.send (! params.processing.isEmpty)
    return chanOut

  getImportClosure? (snap : Language.Lean.InitialSnapshot) : Array Name := Id.run do
@@ -678,13 +667,6 @@ def runRefreshTask : WorkerM (Task (Except IO.Error Unit)) := do
  let ctx ← read
  IO.asTask do
    while ! (← IO.checkCanceled) do
      let pastProcessingStates ← ctx.chanIsProcessing.recvAllCurrent
      if pastProcessingStates.isEmpty then
        -- Processing progress has not changed since we last sent out a refresh request,
        -- so do not send out another one for now; this avoids making the client spam
        -- semantic token requests while idle when it already has an up-to-date state.
        IO.sleep 1000
        continue
      sendServerRequest ctx "workspace/semanticTokens/refresh" (none : Option Nat)
      IO.sleep 2000


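The refresh loop above drains the progress channel and only issues a request when something arrived since the last tick. Below is a minimal standalone sketch of that drain-and-check pattern, using the same `IO.Channel` operations that appear in this diff (`new`, `send`, `recvAllCurrent`, `isEmpty`); the name `drainAndReport` is illustrative, not part of the server code:

```lean
-- Sketch of the debouncing pattern in `runRefreshTask`: drain every
-- progress update accumulated since the last tick and only act when
-- at least one arrived.
def drainAndReport (chan : IO.Channel Bool) : IO Unit := do
  let pending ← chan.recvAllCurrent
  if pending.isEmpty then
    IO.println "idle: skip refresh"
  else
    IO.println "changed: send refresh request"

#eval do
  let chan ← IO.Channel.new
  chan.send true
  drainAndReport chan  -- something arrived since last tick
  drainAndReport chan  -- nothing new: stays quiet
```

Sleeping between iterations (as the real loop does) bounds how often the client can be asked to re-request semantic tokens.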
@@ -419,9 +419,6 @@ where
    return toDocumentSymbols text stxs (syms.push sym) stack
  toDocumentSymbols text stxs syms stack

/--
`SyntaxNodeKind`s for which the syntax node and its children receive no semantic highlighting.
-/
def noHighlightKinds : Array SyntaxNodeKind := #[
  -- usually have special highlighting by the client
  ``Lean.Parser.Term.sorry,
@@ -432,8 +429,17 @@ def noHighlightKinds : Array SyntaxNodeKind := #[
  ``Lean.Parser.Command.docComment,
  ``Lean.Parser.Command.moduleDoc]

structure SemanticTokensContext where
  beginPos : String.Pos
  endPos? : Option String.Pos
  text : FileMap
  snap : Snapshot

structure SemanticTokensState where
  data : Array Nat
  lastLspPos : Lsp.Position

-- TODO: make extensible, or don't
/-- Keywords for which a specific semantic token is provided. -/
def keywordSemanticTokenMap : RBMap String SemanticTokenType compare :=
  RBMap.empty
    |>.insert "sorry" .leanSorryLike
@@ -441,112 +447,7 @@ def keywordSemanticTokenMap : RBMap String SemanticTokenType compare :=
    |>.insert "stop" .leanSorryLike
    |>.insert "#exit" .leanSorryLike

/-- Semantic token information for a given `Syntax`. -/
structure LeanSemanticToken where
  /-- Syntax of the semantic token. -/
  stx : Syntax
  /-- Type of the semantic token. -/
  type : SemanticTokenType

/-- Semantic token information with absolute LSP positions. -/
structure AbsoluteLspSemanticToken where
  /-- Start position of the semantic token. -/
  pos : Lsp.Position
  /-- End position of the semantic token. -/
  tailPos : Lsp.Position
  /-- Type of the semantic token. -/
  type : SemanticTokenType
  deriving BEq, Hashable, FromJson, ToJson

/--
Given a set of `LeanSemanticToken`, computes the `AbsoluteLspSemanticToken` with absolute
LSP position information for each token.
-/
def computeAbsoluteLspSemanticTokens
    (text : FileMap)
    (beginPos : String.Pos)
    (endPos? : Option String.Pos)
    (tokens : Array LeanSemanticToken)
    : Array AbsoluteLspSemanticToken :=
  tokens.filterMap fun ⟨stx, type⟩ => do
    let (pos, tailPos) := (← stx.getPos?, ← stx.getTailPos?)
    guard <| beginPos <= pos && endPos?.all (pos < ·)
    let (lspPos, lspTailPos) := (text.utf8PosToLspPos pos, text.utf8PosToLspPos tailPos)
    return ⟨lspPos, lspTailPos, type⟩

/-- Filters all duplicate semantic tokens with the same `pos`, `tailPos` and `type`. -/
def filterDuplicateSemanticTokens (tokens : Array AbsoluteLspSemanticToken) : Array AbsoluteLspSemanticToken :=
  tokens.groupByKey id |>.toArray.map (·.1)

/--
Given a set of `AbsoluteLspSemanticToken`, computes the LSP `SemanticTokens` data with
token-relative positioning.
See https://microsoft.github.io/language-server-protocol/specifications/lsp/3.17/specification/#textDocument_semanticTokens.
-/
def computeDeltaLspSemanticTokens (tokens : Array AbsoluteLspSemanticToken) : SemanticTokens := Id.run do
  let tokens := tokens.qsort fun ⟨pos1, tailPos1, _⟩ ⟨pos2, tailPos2, _⟩ =>
    pos1 < pos2 || pos1 == pos2 && tailPos1 <= tailPos2
  let mut data : Array Nat := Array.mkEmpty (5 * tokens.size)
  let mut lastPos : Lsp.Position := ⟨0, 0⟩
  for ⟨pos, tailPos, type⟩ in tokens do
    let deltaLine := pos.line - lastPos.line
    let deltaStart := pos.character - (if pos.line == lastPos.line then lastPos.character else 0)
    let length := tailPos.character - pos.character
    let tokenType := type.toNat
    let tokenModifiers := 0
    data := data ++ #[deltaLine, deltaStart, length, tokenType, tokenModifiers]
    lastPos := pos
  return { data }

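The delta encoding computed above can be seen on a concrete input. The following standalone sketch uses plain `Nat` tuples in place of `Lsp.Position` and token types; `deltaEncode` is an illustrative name, not part of the server code:

```lean
/-- A stripped-down model of the LSP `SemanticTokens` delta encoding:
each `(line, char, length, type)` token is emitted as five `Nat`s
relative to its predecessor. -/
def deltaEncode (tokens : List (Nat × Nat × Nat × Nat)) : List Nat := Id.run do
  let mut data := []
  let mut lastLine := 0
  let mut lastChar := 0
  for (line, char, len, ty) in tokens do
    let deltaLine := line - lastLine
    let deltaStart := char - (if line == lastLine then lastChar else 0)
    data := data ++ [deltaLine, deltaStart, len, ty, 0]
    lastLine := line
    lastChar := char
  return data

-- Two tokens on line 1 (columns 2 and 9), then one on line 3:
#eval deltaEncode [(1, 2, 5, 0), (1, 9, 3, 1), (3, 0, 4, 0)]
-- [1, 2, 5, 0, 0, 0, 7, 3, 1, 0, 2, 0, 4, 0, 0]
```

Note how the second token's start becomes `7` (its column relative to the previous token on the same line), which is what keeps the wire payload compact.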
/--
Collects all semantic tokens that can be deduced purely from `Syntax`
without elaboration information.
-/
partial def collectSyntaxBasedSemanticTokens : (stx : Syntax) → Array LeanSemanticToken
  | `($e.$id:ident) =>
    let tokens := collectSyntaxBasedSemanticTokens e
    tokens.push ⟨id, SemanticTokenType.property⟩
  | `($e |>.$field:ident) =>
    let tokens := collectSyntaxBasedSemanticTokens e
    tokens.push ⟨field, SemanticTokenType.property⟩
  | stx => Id.run do
    if noHighlightKinds.contains stx.getKind then
      return #[]
    let mut tokens :=
      if stx.isOfKind choiceKind then
        collectSyntaxBasedSemanticTokens stx[0]
      else
        stx.getArgs.map collectSyntaxBasedSemanticTokens |>.flatten
    let Syntax.atom _ val := stx
      | return tokens
    let isRegularKeyword := val.length > 0 && val.front.isAlpha
    let isHashKeyword := val.length > 1 && val.front == '#' && (val.get ⟨1⟩).isAlpha
    if ! isRegularKeyword && ! isHashKeyword then
      return tokens
    return tokens.push ⟨stx, keywordSemanticTokenMap.findD val .keyword⟩

/-- Collects all semantic tokens from the given `Elab.InfoTree`. -/
def collectInfoBasedSemanticTokens (i : Elab.InfoTree) : Array LeanSemanticToken :=
  List.toArray <| i.deepestNodes fun _ i _ => do
    let .ofTermInfo ti := i
      | none
    let .original .. := ti.stx.getHeadInfo
      | none
    if let `($_:ident) := ti.stx then
      if let Expr.fvar fvarId .. := ti.expr then
        if let some localDecl := ti.lctx.find? fvarId then
          -- Recall that `isAuxDecl` is an auxiliary declaration used to elaborate a recursive definition.
          if localDecl.isAuxDecl then
            if ti.isBinder then
              return ⟨ti.stx, SemanticTokenType.function⟩
          else if ! localDecl.isImplementationDetail then
            return ⟨ti.stx, SemanticTokenType.variable⟩
    if ti.stx.getKind == Parser.Term.identProjKind then
      return ⟨ti.stx, SemanticTokenType.property⟩
    none

/-- Computes the semantic tokens in the range [beginPos, endPos?). -/
def handleSemanticTokens (beginPos : String.Pos) (endPos? : Option String.Pos)
partial def handleSemanticTokens (beginPos : String.Pos) (endPos? : Option String.Pos)
    : RequestM (RequestTask SemanticTokens) := do
  let doc ← readDoc
  match endPos? with
@@ -561,25 +462,78 @@ def handleSemanticTokens (beginPos : String.Pos) (endPos? : Option String.Pos)
    let t := doc.cmdSnaps.waitUntil (·.endPos >= endPos)
    mapTask t fun (snaps, _) => run doc snaps
where
  run doc snaps : RequestM SemanticTokens := do
    let mut leanSemanticTokens := #[]
    for s in snaps do
      if s.endPos <= beginPos then
        continue
      let syntaxBasedSemanticTokens := collectSyntaxBasedSemanticTokens s.stx
      let infoBasedSemanticTokens := collectInfoBasedSemanticTokens s.infoTree
      leanSemanticTokens := leanSemanticTokens ++ syntaxBasedSemanticTokens ++ infoBasedSemanticTokens
    let absoluteLspSemanticTokens := computeAbsoluteLspSemanticTokens doc.meta.text beginPos endPos? leanSemanticTokens
    let absoluteLspSemanticTokens := filterDuplicateSemanticTokens absoluteLspSemanticTokens
    let semanticTokens := computeDeltaLspSemanticTokens absoluteLspSemanticTokens
    return semanticTokens
  run doc snaps : RequestM SemanticTokens :=
    StateT.run' (s := { data := #[], lastLspPos := ⟨0, 0⟩ : SemanticTokensState }) do
      for s in snaps do
        if s.endPos <= beginPos then
          continue
        ReaderT.run (r := SemanticTokensContext.mk beginPos endPos? doc.meta.text s) <|
          go s.stx
      return { data := (← get).data }
  go (stx : Syntax) := do
    match stx with
    | `($e.$id:ident) => go e; addToken id SemanticTokenType.property
    -- indistinguishable from next pattern
    --| `(level|$id:ident) => addToken id SemanticTokenType.variable
    | `($id:ident) => highlightId id
    | _ =>
      if !noHighlightKinds.contains stx.getKind then
        highlightKeyword stx
        if stx.isOfKind choiceKind then
          go stx[0]
        else
          stx.getArgs.forM go
  highlightId (stx : Syntax) : ReaderT SemanticTokensContext (StateT SemanticTokensState RequestM) _ := do
    if let some range := stx.getRange? then
      let mut lastPos := range.start
      for ti in (← read).snap.infoTree.deepestNodes (fun
          | _, i@(Elab.Info.ofTermInfo ti), _ => match i.pos? with
            | some ipos => if range.contains ipos then some ti else none
            | _ => none
          | _, _, _ => none) do
        let pos := ti.stx.getPos?.get!
        -- avoid reporting same position twice; the info node can occur multiple times if
        -- e.g. the term is elaborated multiple times
        if pos < lastPos then
          continue
        if let Expr.fvar fvarId .. := ti.expr then
          if let some localDecl := ti.lctx.find? fvarId then
            -- Recall that `isAuxDecl` is an auxiliary declaration used to elaborate a recursive definition.
            if localDecl.isAuxDecl then
              if ti.isBinder then
                addToken ti.stx SemanticTokenType.function
            else
              addToken ti.stx SemanticTokenType.variable
        else if ti.stx.getPos?.get! > lastPos then
          -- any info after the start position: must be projection notation
          addToken ti.stx SemanticTokenType.property
        lastPos := ti.stx.getPos?.get!
  highlightKeyword stx := do
    if let Syntax.atom _ val := stx then
      if (val.length > 0 && val.front.isAlpha) ||
          -- Support for keywords of the form `#<alpha>...`
          (val.length > 1 && val.front == '#' && (val.get ⟨1⟩).isAlpha) then
        addToken stx (keywordSemanticTokenMap.findD val .keyword)
  addToken stx type := do
    let ⟨beginPos, endPos?, text, _⟩ ← read
    if let (some pos, some tailPos) := (stx.getPos?, stx.getTailPos?) then
      if beginPos <= pos && endPos?.all (pos < ·) then
        let lspPos := (← get).lastLspPos
        let lspPos' := text.utf8PosToLspPos pos
        let deltaLine := lspPos'.line - lspPos.line
        let deltaStart := lspPos'.character - (if lspPos'.line == lspPos.line then lspPos.character else 0)
        let length := (text.utf8PosToLspPos tailPos).character - lspPos'.character
        let tokenType := type.toNat
        let tokenModifiers := 0
        modify fun st => {
          data := st.data ++ #[deltaLine, deltaStart, length, tokenType, tokenModifiers]
          lastLspPos := lspPos'
        }

/-- Computes all semantic tokens for the document. -/
def handleSemanticTokensFull (_ : SemanticTokensParams)
    : RequestM (RequestTask SemanticTokens) := do
  handleSemanticTokens 0 none

/-- Computes the semantic tokens in the range provided by `p`. -/
def handleSemanticTokensRange (p : SemanticTokensRangeParams)
    : RequestM (RequestTask SemanticTokens) := do
  let doc ← readDoc

@@ -42,9 +42,9 @@ private partial def mkCmdSnaps (initSnap : Language.Lean.InitialSnapshot) :
    mpState := headerParsed.parserState
    cmdState := headerSuccess.cmdState
  } <| .delayed <| headerSuccess.firstCmdSnap.task.bind go
where
  go cmdParsed :=
    cmdParsed.data.finishedSnap.task.map fun finished =>
where go cmdParsed :=
  cmdParsed.data.sigSnap.task.bind fun sig =>
    sig.finishedSnap.task.map fun finished =>
      .ok <| .cons {
        stx := cmdParsed.data.stx
        mpState := cmdParsed.data.parserState

@@ -58,7 +58,6 @@ def runCommandElabM (snap : Snapshot) (meta : DocumentMeta) (c : CommandElabM α
    fileName := meta.uri,
    fileMap := meta.text,
    tacticCache? := none
    snap? := none
  }
  c.run ctx |>.run' snap.cmdState


@@ -1,198 +0,0 @@
/-
Copyright (c) 2024 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: David Thrane Christiansen
-/
prelude
import Init.Data
import Lean.Data.HashMap
import Init.Omega

namespace Lean.Diff
/--
An action in an edit script to transform one source sequence into a target sequence.
-/
inductive Action where
  /-- Insert the item into the source -/
  | insert
  /-- Delete the item from the source -/
  | delete
  /-- Leave the item in the source -/
  | skip
deriving Repr, BEq, Hashable

instance : ToString Action where
  toString
    | .insert => "insert"
    | .delete => "delete"
    | .skip => "skip"

/--
Determines a prefix to show on a given line when printing diff results.

For `delete`, the prefix is `"-"`, for `insert`, it is `"+"`, and for `skip` it is `" "`.
-/
def Action.linePrefix : (action : Action) → String
  | .delete => "-"
  | .insert => "+"
  | .skip => " "

/--
A histogram entry consists of the number of times the element occurs in the left and right input
arrays along with an index at which the element can be found, if applicable.
-/
structure Histogram.Entry (α : Type u) (lsize rsize : Nat) where
  /-- How many times the element occurs in the left array -/
  leftCount : Nat
  /-- An index of the element in the left array, if applicable -/
  leftIndex : Option (Fin lsize)
  /-- Invariant: the count is non-zero iff there is a saved index -/
  leftWF : leftCount = 0 ↔ leftIndex = none
  /-- How many times the element occurs in the right array -/
  rightCount : Nat
  /-- An index of the element in the right array, if applicable -/
  rightIndex : Option (Fin rsize)
  /-- Invariant: the count is non-zero iff there is a saved index -/
  rightWF : rightCount = 0 ↔ rightIndex = none

/-- A histogram for arrays maps each element to a count and, if applicable, an index. -/
def Histogram (α : Type u) (lsize rsize : Nat) [BEq α] [Hashable α] :=
  Lean.HashMap α (Histogram.Entry α lsize rsize)


section

variable [BEq α] [Hashable α]

/-- Add an element from the left array to a histogram -/
def Histogram.addLeft (histogram : Histogram α lsize rsize) (index : Fin lsize) (val : α)
    : Histogram α lsize rsize :=
  match histogram.find? val with
  | none => histogram.insert val {
      leftCount := 1, leftIndex := some index,
      leftWF := by simp,
      rightCount := 0, rightIndex := none
      rightWF := by simp
    }
  | some x => histogram.insert val {x with
      leftCount := x.leftCount + 1, leftIndex := some index, leftWF := by simp
    }

/-- Add an element from the right array to a histogram -/
def Histogram.addRight (histogram : Histogram α lsize rsize) (index : Fin rsize) (val : α)
    : Histogram α lsize rsize :=
  match histogram.find? val with
  | none => histogram.insert val {
      leftCount := 0, leftIndex := none,
      leftWF := by simp,
      rightCount := 1, rightIndex := some index,
      rightWF := by simp
    }
  | some x => histogram.insert val {x with
      rightCount := x.rightCount + 1, rightIndex := some index,
      rightWF := by simp
    }

/-- Given two `Subarray`s, find their common prefix and return their differing suffixes -/
def matchPrefix (left right : Subarray α) : Array α × Subarray α × Subarray α :=
  let rec go (pref : Array α) : Array α × Subarray α × Subarray α :=
    let i := pref.size
    if _ : i < left.size ∧ i < right.size then
      if left[i]'(by omega) == right[i]'(by omega) then
        go <| pref.push <| left[i]'(by omega)
      else (pref, left.drop pref.size, right.drop pref.size)
    else (pref, left.drop pref.size, right.drop pref.size)
  termination_by left.size - pref.size
  go #[]


/-- Given two `Subarray`s, find their common suffix and return their differing prefixes -/
def matchSuffix (left right : Subarray α) : Subarray α × Subarray α × Array α :=
  let rec go (i : Nat) : Subarray α × Subarray α × Array α :=
    if _ : i < left.size ∧ i < right.size then
      if left[left.size - i - 1]'(by omega) == right[right.size - i - 1]'(by omega) then
        go i.succ
      else
        (left.take (left.size - i), right.take (right.size - i), (left.drop (left.size - i)).toArray)
    else
      (left.take (left.size - i), right.take (right.size - i), (left.drop (left.size - i)).toArray)
  termination_by left.size - i
  go 0

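The prefix trimming performed by `matchPrefix` can be illustrated on plain lists. A minimal sketch (the name `splitCommonPrefix` is made up for illustration; the real function works on `Subarray`s):

```lean
/-- Split off the longest common prefix of two lists, returning it
together with the two differing suffixes. -/
def splitCommonPrefix : List Nat → List Nat → List Nat × List Nat × List Nat
  | x :: xs, y :: ys =>
    if x == y then
      let (pref, l, r) := splitCommonPrefix xs ys
      (x :: pref, l, r)
    else ([], x :: xs, y :: ys)
  | xs, ys => ([], xs, ys)

#eval splitCommonPrefix [1, 2, 3, 4] [1, 2, 9, 4]
-- ([1, 2], [3, 4], [9, 4])
```

`matchSuffix` is the mirror image, walking from the ends of the inputs; trimming both sides first shrinks the region the histogram pass has to consider.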
/--
Finds the longest common subsequence of two arrays using a variant of jgit's histogram diff algorithm.

Unlike jgit's implementation, this one does not perform a cutoff on histogram bucket sizes, nor does
it fall back to the Myers diff algorithm. This means that it has quadratic worst-case complexity.
Empirically, however, even quadratic cases of this implementation can handle hundreds of megabytes
in a second or so, and it is intended to be used for short strings like error messages. Be
cautious when applying it to larger workloads.
-/
partial def lcs (left right : Subarray α) : Array α := Id.run do
  let (pref, left, right) := matchPrefix left right
  let (left, right, suff) := matchSuffix left right
  let mut hist : Histogram α left.size right.size := .empty
  for h : i in [0:left.size] do
    hist := hist.addLeft ⟨i, Membership.get_elem_helper h rfl⟩ left[i]
  for h : i in [0:right.size] do
    hist := hist.addRight ⟨i, Membership.get_elem_helper h rfl⟩ right[i]
  let mut best := none
  for (k, v) in hist.toList do
    if let {leftCount := lc, leftIndex := some li, rightCount := rc, rightIndex := some ri, ..} := v then
      match best with
      | none =>
        best := some ((lc + rc), k, li, ri)
      | some (count, _) =>
        if lc + rc < count then
          best := some ((lc + rc), k, li, ri)
  let some (_, v, li, ri) := best
    | return pref ++ suff

  let lefts := left.split ⟨li.val, by cases li; simp_arith; omega⟩
  let rights := right.split ⟨ri.val, by cases ri; simp_arith; omega⟩

  pref ++ lcs lefts.1 rights.1 ++ #[v] ++ lcs (lefts.2.drop 1) (rights.2.drop 1) ++ suff

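For intuition about what `lcs` computes, here is a naive reference LCS on lists. It is exponential and for illustration only (the name `naiveLcs` is hypothetical); the histogram variant above reaches the same answer by recursing around a low-frequency common element instead:

```lean
/-- Exponential reference LCS: at each mismatch, try dropping the head
of either list and keep the longer result. -/
partial def naiveLcs : List Nat → List Nat → List Nat
  | [], _ => []
  | _, [] => []
  | x :: xs, y :: ys =>
    if x == y then x :: naiveLcs xs ys
    else
      let a := naiveLcs (x :: xs) ys
      let b := naiveLcs xs (y :: ys)
      if a.length ≥ b.length then a else b

#eval naiveLcs [1, 2, 3, 4, 5] [2, 4, 1, 5]
-- [2, 4, 5]
```

Any diff built from an LCS emits exactly the non-LCS elements as inserts and deletes, so a longer common subsequence means a shorter edit script.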
/--
Computes an edit script to transform `original` into `edited`.
-/
def diff [Inhabited α] (original edited : Array α) : Array (Action × α) :=
  if h : ¬(0 < original.size) then
    edited.map (.insert, ·)
  else if h' : ¬(0 < edited.size) then
    original.map (.delete, ·)
  else Id.run do
    have : original.size > 0 := by omega
    have : edited.size > 0 := by omega
    let mut out := #[]
    let ds := lcs original.toSubarray edited.toSubarray
    let mut i := 0
    let mut j := 0
    for d in ds do
      while i < original.size && original[i]! != d do
        out := out.push <| (.delete, original[i]!)
        i := i.succ
      while j < edited.size && edited[j]! != d do
        out := out.push <| (.insert, edited[j]!)
        j := j.succ
      out := out.push <| (.skip, d)
      i := i.succ
      j := j.succ
    while h : i < original.size do
      out := out.push <| (.delete, original[i])
      i := i.succ
    while h : j < edited.size do
      out := out.push <| (.insert, edited[j])
      j := j.succ
    out

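The way `diff` turns an LCS into an edit script can be sketched on lists. This is a simplified model (the name `editScript` is illustrative, and the LCS is passed in explicitly rather than computed):

```lean
/-- Walk both inputs, emitting deletes/inserts until the next common
element, which is emitted as a skip; leftovers after the last common
element become trailing deletes/inserts. -/
def editScript (original edited lcs : List Nat) : List (String × Nat) :=
  match lcs with
  | [] =>
    original.map ("delete", ·) ++ edited.map ("insert", ·)
  | d :: ds =>
    let dels := original.takeWhile (· != d) |>.map ("delete", ·)
    let inss := edited.takeWhile (· != d) |>.map ("insert", ·)
    dels ++ inss ++ [("skip", d)] ++
      editScript (original.dropWhile (· != d) |>.drop 1)
                 (edited.dropWhile (· != d) |>.drop 1) ds

#eval editScript [1, 2, 3] [1, 3, 4] [1, 3]
-- [("skip", 1), ("delete", 2), ("skip", 3), ("insert", 4)]
```

This mirrors the two inner `while` loops of `diff`: everything before the next LCS element on the original side is a delete, everything before it on the edited side is an insert.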
/-- Shows a line-by-line edit script with markers for added and removed lines. -/
def linesToString [ToString α] (lines : Array (Action × α)) : String := Id.run do
  let mut out : String := ""
  for (act, line) in lines do
    let lineStr := toString line
    if lineStr == "" then
      out := out ++ s!"{act.linePrefix}\n"
    else
      out := out ++ s!"{act.linePrefix} {lineStr}\n"
  out
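The rendering performed by `linesToString` boils down to prefixing each line with `-`, `+`, or a space. A self-contained sketch with a local stand-in for `Action` (the names `Act` and `Act.pre` are illustrative):

```lean
inductive Act | ins | del | skip

/-- The same `-`/`+`/` ` markers as `Action.linePrefix`. -/
def Act.pre : Act → String
  | .ins => "+"
  | .del => "-"
  | .skip => " "

#eval String.join <| [(Act.skip, "foo"), (Act.del, "bar"), (Act.ins, "baz")].map
  fun (a, l) => s!"{a.pre} {l}\n"
-- " foo\n- bar\n+ baz\n"
```

This is the familiar unified-diff line format, which is why the output of `diff` composed with `linesToString` reads like `git diff` body lines.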
@@ -1,15 +1,19 @@
/-
Copyright (c) 2023 Microsoft Corporation. All rights reserved.
Copyright (c) 2020 Microsoft Corporation. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Author: Kim Morrison
Author: Leonardo de Moura
-/
prelude
import Lean.CoreM

/-!
# Heartbeats
# Lean.Heartbeats

This provides some utilities
This is the first file in the Lean import hierarchy. It is responsible for setting
up basic definitions, most of which Lean already has "built in knowledge" about,
so it is important that they be set up in exactly this way. (For example, Lean will
use `PUnit` in the desugaring of `do` notation, or in the pattern match compiler.)

Functions for interacting with the deterministic timeout heartbeat mechanism.
-/

namespace Lean

@@ -1,322 +0,0 @@
/-
Copyright (c) 2024 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Sebastian Ullrich
-/
prelude
import Lean.Util.Trace

/-! `trace.profiler.output` Firefox Profiler integration -/

namespace Lean.Firefox

/-! Definitions from https://github.com/firefox-devtools/profiler/blob/main/src/types/profile.js -/

abbrev Milliseconds := Float

structure Category where
  name : String
  color : String
  subcategories : Array String := #[]
  deriving FromJson, ToJson

structure ProfileMeta where
  interval : Milliseconds := 0 -- should be irrelevant with `tracing-ms`
  startTime : Milliseconds
  categories : Array Category := #[]
  processType : Nat := 0
  product : String := "lean"
  preprocessedProfileVersion : Nat := 48
  markerSchema : Array Json := #[]
  deriving FromJson, ToJson

structure StackTable where
  frame : Array Nat
  «prefix» : Array (Option Nat)
  category : Array Nat
  subcategory : Array Nat
  length : Nat
  deriving FromJson, ToJson

structure SamplesTable where
  stack : Array Nat
  time : Array Milliseconds
  weight : Array Milliseconds
  weightType : String := "tracing-ms"
  length : Nat
  deriving FromJson, ToJson

structure FuncTable where
  name : Array Nat
  isJS : Array Json := #[]
  relevantForJS : Array Json := #[]
  resource : Array Int
  fileName : Array (Option Nat)
  lineNumber : Array (Option Nat)
  columnNumber : Array (Option Nat)
  length : Nat
  deriving FromJson, ToJson

structure FrameTable where
  func : Array Nat
  inlineDepth : Array Json := #[]
  innerWindowID : Array Json := #[]
  length : Nat
  deriving FromJson, ToJson

structure RawMarkerTable where
  data : Array Json := #[]
  name : Array Json := #[]
  length : Nat := 0
  deriving FromJson, ToJson

structure ResourceTable where
  type : Array Json := #[]
  length : Nat := 0
  deriving FromJson, ToJson

structure Thread where
  name : String
  processType : String := "default"
  isMainThread : Bool := true
  samples : SamplesTable
  markers : RawMarkerTable := {}
  stackTable : StackTable
  frameTable : FrameTable
  resourceTable : ResourceTable := {}
  stringArray : Array String
  funcTable : FuncTable
  deriving FromJson, ToJson

structure Profile where
  meta : ProfileMeta
  libs : Array Json := #[]
  threads : Array Thread
  deriving FromJson, ToJson

/-- Thread with maps necessary for computing max sharing indices -/
structure ThreadWithMaps extends Thread where
  stringMap : HashMap String Nat := {}
  funcMap : HashMap Nat Nat := {}
  stackMap : HashMap (Nat × Option Nat) Nat := {}
  /-- Last timestamp encountered: stop time of preceding sibling, or else start time of parent. -/
  lastTime : Float := 0

-- TODO: add others, dynamically?
def categories : Array Category := #[
  { name := "Other", color := "gray" },
  { name := "Elab", color := "red" },
  { name := "Meta", color := "yellow" }
]

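The `ThreadWithMaps` maps above exist to intern strings, functions, and stacks: each entity is stored once in a flat array and referred to by index, which is what the Firefox Profiler format expects. A minimal standalone sketch of the lookup-or-append pattern used in `addTrace` (the name `intern` and the use of an association list instead of a `HashMap` are illustrative):

```lean
/-- Look up a string in the interning map; otherwise append it to the
array and record its index, exactly once per distinct string. -/
def intern (strings : Array String) (map : List (String × Nat)) (s : String) :
    Nat × Array String × List (String × Nat) :=
  match map.lookup s with
  | some idx => (idx, strings, map)
  | none => (strings.size, strings.push s, (s, strings.size) :: map)

#eval
  let (i, strs, _) := intern #["Elab"] [("Elab", 0)] "Meta"
  (i, strs)
-- (1, #["Elab", "Meta"])
```

The same shape appears three times in `addTrace` below, for `stringMap`, `funcMap`, and `stackMap` (the last keyed by `(frame, parent)` so identical call paths share one stack-table row).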
private partial def addTrace (pp : Bool) (thread : ThreadWithMaps) (trace : MessageData) :
    IO ThreadWithMaps :=
  (·.2) <$> StateT.run (go none trace) thread
where
  go parentStackIdx? : _ → StateT ThreadWithMaps IO Unit
    | .trace data msg children => do
      if data.startTime == 0 then
        return -- no time data, skip
      let mut funcName := data.cls.toString
      if !data.tag.isEmpty then
        funcName := s!"{funcName}: {data.tag}"
      if pp then
        funcName := s!"{funcName}: {← msg.format}"
      let strIdx ← modifyGet fun thread =>
        if let some idx := thread.stringMap.find? funcName then
          (idx, thread)
        else
          (thread.stringMap.size, { thread with
            stringArray := thread.stringArray.push funcName
            stringMap := thread.stringMap.insert funcName thread.stringMap.size })
      let category := categories.findIdx? (·.name == data.cls.getRoot.toString) |>.getD 0
      let funcIdx ← modifyGet fun thread =>
        if let some idx := thread.funcMap.find? strIdx then
          (idx, thread)
        else
          (thread.funcMap.size, { thread with
            funcTable := {
              name := thread.funcTable.name.push strIdx
              resource := thread.funcTable.resource.push (-1)
              -- the following fields could be inferred from `Syntax` in the message
              fileName := thread.funcTable.fileName.push none
              lineNumber := thread.funcTable.lineNumber.push none
              columnNumber := thread.funcTable.columnNumber.push none
              length := thread.funcTable.length + 1
            }
            frameTable := {
              func := thread.frameTable.func.push thread.funcMap.size
              length := thread.frameTable.length + 1
            }
            funcMap := thread.funcMap.insert strIdx thread.funcMap.size })
      let frameIdx := funcIdx
      let stackIdx ← modifyGet fun thread =>
        if let some idx := thread.stackMap.find? (frameIdx, parentStackIdx?) then
          (idx, thread)
        else
          (thread.stackMap.size, { thread with
            stackTable := {
              frame := thread.stackTable.frame.push frameIdx
              «prefix» := thread.stackTable.prefix.push parentStackIdx?
              category := thread.stackTable.category.push category
              subcategory := thread.stackTable.subcategory.push 0
              length := thread.stackTable.length + 1
            }
            stackMap := thread.stackMap.insert (frameIdx, parentStackIdx?) thread.stackMap.size })
      modify fun thread => { thread with lastTime := data.startTime }
      for c in children do
        if let some nextStart := getNextStart? c then
          -- add run time slice between children/before first child
          modify fun thread => { thread with samples := {
            stack := thread.samples.stack.push stackIdx
            time := thread.samples.time.push (thread.lastTime * 1000)
            weight := thread.samples.weight.push ((nextStart - thread.lastTime) * 1000)
            length := thread.samples.length + 1
          } }
        go (some stackIdx) c
      -- add remaining slice after last child
      modify fun thread => { thread with
        lastTime := data.stopTime
        samples := {
          stack := thread.samples.stack.push stackIdx
          time := thread.samples.time.push (thread.lastTime * 1000)
          weight := thread.samples.weight.push ((data.stopTime - thread.lastTime) * 1000)
          length := thread.samples.length + 1
        } }
    | .withContext _ msg => go parentStackIdx? msg
    | _ => return
  /-- Returns first `startTime` in the trace tree, if any. -/
  getNextStart?
    | .trace data _ children => do
      if data.startTime != 0 then
        return data.startTime
      if let some startTime := children.findSome? getNextStart? then
        return startTime
      none
    | .withContext _ msg => getNextStart? msg
    | _ => none

def Thread.new (name : String) : Thread := {
  name
  samples := { stack := #[], time := #[], weight := #[], length := 0 }
  stackTable := { frame := #[], «prefix» := #[], category := #[], subcategory := #[], length := 0 }
  frameTable := { func := #[], length := 0 }
  stringArray := #[]
  funcTable := {
    name := #[], resource := #[], fileName := #[], lineNumber := #[], columnNumber := #[],
    length := 0 }
}

def Profile.export (name : String) (startTime : Milliseconds) (traceState : TraceState)
    (opts : Options) : IO Profile := do
  let thread := Thread.new name
  -- wrap entire trace up to current time in `runFrontend` node
  let trace := .trace {
    cls := `runFrontend, startTime, stopTime := (← IO.monoNanosNow).toFloat / 1000000000,
    collapsed := true } "" (traceState.traces.toArray.map (·.msg))
  let thread ← addTrace (Lean.trace.profiler.output.pp.get opts) { thread with } trace
  return {
    meta := { startTime, categories }
    threads := #[thread.toThread]
  }

structure ThreadWithCollideMaps extends ThreadWithMaps where
|
||||
/-- Max sharing map for samples -/
|
||||
sampleMap : HashMap Nat Nat := {}
|
||||
|
||||
/--
|
||||
Adds samples from `add` to `thread`, increasing the weight of existing samples with identical stacks
|
||||
instead of pushing new samples.
|
||||
-/
|
||||
private partial def collideThreads (thread : ThreadWithCollideMaps) (add : Thread) :
|
||||
ThreadWithCollideMaps :=
|
||||
StateT.run collideSamples thread |>.2
|
||||
where
|
||||
collideSamples : StateM ThreadWithCollideMaps Unit := do
|
||||
for oldSampleIdx in [0:add.samples.length] do
|
||||
let oldStackIdx := add.samples.stack[oldSampleIdx]!
|
||||
let stackIdx ← collideStacks oldStackIdx
|
||||
modify fun thread =>
|
||||
if let some idx := thread.sampleMap.find? stackIdx then
|
||||
-- imperative to preserve linear use of arrays here!
|
||||
let ⟨⟨⟨t1, t2, t3, samples, t5, t6, t7, t8, t9, t10⟩, o2, o3, o4, o5⟩, o6⟩ := thread
|
||||
let ⟨s1, s2, weight, s3, s4⟩ := samples
|
||||
let weight := weight.set! idx <| weight[idx]! + add.samples.weight[oldSampleIdx]!
|
||||
let samples := ⟨s1, s2, weight, s3, s4⟩
|
||||
⟨⟨⟨t1, t2, t3, samples, t5, t6, t7, t8, t9, t10⟩, o2, o3, o4, o5⟩, o6⟩
|
||||
else
|
||||
-- imperative to preserve linear use of arrays here!
|
||||
let ⟨⟨⟨t1, t2, t3, samples, t5, t6, t7, t8, t9, t10⟩, o2, o3, o4, o5⟩, sampleMap⟩ :=
|
||||
thread
|
||||
let ⟨stack, time, weight, _, length⟩ := samples
|
||||
let samples := {
|
||||
stack := stack.push stackIdx
|
||||
time := time.push time.size.toFloat
|
||||
weight := weight.push add.samples.weight[oldSampleIdx]!
|
||||
length := length + 1
|
||||
}
|
||||
let sampleMap := sampleMap.insert stackIdx sampleMap.size
|
||||
⟨⟨⟨t1, t2, t3, samples, t5, t6, t7, t8, t9, t10⟩, o2, o3, o4, o5⟩, sampleMap⟩
|
||||
collideStacks oldStackIdx : StateM ThreadWithCollideMaps Nat := do
|
||||
let oldParentStackIdx? := add.stackTable.prefix[oldStackIdx]!
|
||||
let parentStackIdx? ← oldParentStackIdx?.mapM (collideStacks ·)
|
||||
let oldFrameIdx := add.stackTable.frame[oldStackIdx]!
|
||||
let oldFuncIdx := add.frameTable.func[oldFrameIdx]!
|
||||
let oldStrIdx := add.funcTable.name[oldFuncIdx]!
|
||||
let strIdx ← getStrIdx add.stringArray[oldStrIdx]!
|
||||
let funcIdx ← modifyGet fun thread =>
|
||||
if let some idx := thread.funcMap.find? strIdx then
|
||||
(idx, thread)
|
||||
else
|
||||
(thread.funcMap.size, { thread with
|
||||
funcTable := {
|
||||
name := thread.funcTable.name.push strIdx
|
||||
resource := thread.funcTable.resource.push (-1)
|
||||
fileName := thread.funcTable.fileName.push none
|
||||
lineNumber := thread.funcTable.lineNumber.push none
|
||||
columnNumber := thread.funcTable.columnNumber.push none
|
||||
length := thread.funcTable.length + 1
|
||||
}
|
||||
frameTable := {
|
||||
func := thread.frameTable.func.push thread.funcMap.size
|
||||
length := thread.frameTable.length + 1
|
||||
}
|
||||
funcMap := thread.funcMap.insert strIdx thread.funcMap.size })
|
||||
let frameIdx := funcIdx
|
||||
modifyGet fun thread =>
|
||||
if let some idx := thread.stackMap.find? (frameIdx, parentStackIdx?) then
|
||||
(idx, thread)
|
||||
else
|
||||
(thread.stackMap.size,
|
||||
let ⟨⟨⟨t1,t2, t3, t4, t5, stackTable, t7, t8, t9, t10⟩, o2, o3, stackMap, o5⟩, o6⟩ :=
|
||||
thread
|
||||
let { frame, «prefix», category, subcategory, length } := stackTable
|
||||
let stackTable := {
|
||||
frame := frame.push frameIdx
|
||||
«prefix» := prefix.push parentStackIdx?
|
||||
category := category.push add.stackTable.category[oldStackIdx]!
|
||||
subcategory := subcategory.push add.stackTable.subcategory[oldStackIdx]!
|
||||
length := length + 1
|
||||
}
|
||||
let stackMap := stackMap.insert (frameIdx, parentStackIdx?) stackMap.size
|
||||
⟨⟨⟨t1,t2, t3, t4, t5, stackTable, t7, t8, t9, t10⟩, o2, o3, stackMap, o5⟩, o6⟩)
|
||||
getStrIdx (s : String) :=
|
||||
modifyGet fun thread =>
|
||||
if let some idx := thread.stringMap.find? s then
|
||||
(idx, thread)
|
||||
else
|
||||
(thread.stringMap.size, { thread with
|
||||
stringArray := thread.stringArray.push s
|
||||
stringMap := thread.stringMap.insert s thread.stringMap.size })
|
||||
|
||||
/--
|
||||
Merges given profiles such that samples with identical stacks are deduplicated by adding up their
|
||||
weights. Minimizes profile size while preserving per-stack timing information.
|
||||
-/
|
||||
def Profile.collide (ps : Array Profile) : Option Profile := do
|
||||
let base ← ps[0]?
|
||||
let thread := Thread.new "collided"
|
||||
let thread := ps.map (·.threads) |>.flatten.foldl collideThreads { thread with }
|
||||
return { base with threads := #[thread.toThread] }
|
||||
|
||||
end Lean.Firefox
|
||||
@@ -129,7 +129,7 @@ def addRawTrace (msg : MessageData) : m Unit := do
def addTrace (cls : Name) (msg : MessageData) : m Unit := do
  let ref ← getRef
  let msg ← addMessageContext msg
  modifyTraces (·.push { ref, msg := .trace { collapsed := false, cls } msg #[] })
  modifyTraces (·.push { ref, msg := .trace (collapsed := false) cls msg #[] })

@[inline] def trace (cls : Name) (msg : Unit → MessageData) : m Unit := do
  if (← isTracingEnabledFor cls) then
@@ -141,18 +141,18 @@ def addTrace (cls : Name) (msg : MessageData) : m Unit := do
    addTrace cls msg

private def addTraceNode (oldTraces : PersistentArray TraceElem)
    (data : TraceData) (ref : Syntax) (msg : MessageData) : m Unit :=
    (cls : Name) (ref : Syntax) (msg : MessageData) (collapsed : Bool) : m Unit :=
  withRef ref do
    let msg := .trace data msg ((← getTraces).toArray.map (·.msg))
    let msg := .trace cls msg ((← getTraces).toArray.map (·.msg)) collapsed
    let msg ← addMessageContext msg
    modifyTraces fun _ =>
      oldTraces.push { ref, msg }

def withStartStopSeconds [Monad m] [MonadLiftT BaseIO m] (act : m α) : m (α × Float × Float) := do
def withSeconds [Monad m] [MonadLiftT BaseIO m] (act : m α) : m (α × Float) := do
  let start ← IO.monoNanosNow
  let a ← act
  let stop ← IO.monoNanosNow
  return (a, start.toFloat / 1000000000, stop.toFloat / 1000000000)
  return (a, (stop - start).toFloat / 1000000000)

register_builtin_option trace.profiler : Bool := {
  defValue := false
@@ -166,20 +166,6 @@ register_builtin_option trace.profiler.threshold : Nat := {
  descr := "threshold in milliseconds, traces below threshold will not be activated"
}

register_builtin_option trace.profiler.output : String := {
  defValue := ""
  group := "profiler"
  descr :=
    "output `trace.profiler` data in Firefox Profiler-compatible format to given file path"
}

register_builtin_option trace.profiler.output.pp : Bool := {
  defValue := false
  group := "profiler"
  descr :=
    "if false, limit text in exported trace nodes to trace class name and `TraceData.tag`, if any"
}

def trace.profiler.threshold.getSecs (o : Options) : Float :=
  (trace.profiler.threshold.get o).toFloat / 1000

@@ -222,32 +208,31 @@ instance [always : MonadAlwaysExcept ε m] [STWorld ω m] [BEq α] [Hashable α]
  except := let _ := always.except; inferInstance

def withTraceNode [always : MonadAlwaysExcept ε m] [MonadLiftT BaseIO m] (cls : Name)
    (msg : Except ε α → m MessageData) (k : m α) (collapsed := true) (tag := "") : m α := do
    (msg : Except ε α → m MessageData) (k : m α) (collapsed := true) : m α := do
  let _ := always.except
  let opts ← getOptions
  let clsEnabled ← isTracingEnabledFor cls
  unless clsEnabled || trace.profiler.get opts do
    return (← k)
  let oldTraces ← getResetTraces
  let (res, start, stop) ← withStartStopSeconds <| observing k
  let aboveThresh := trace.profiler.get opts && stop - start > trace.profiler.threshold.getSecs opts
  let (res, secs) ← withSeconds <| observing k
  let aboveThresh := trace.profiler.get opts && secs > trace.profiler.threshold.getSecs opts
  unless clsEnabled || aboveThresh do
    modifyTraces (oldTraces ++ ·)
    return (← MonadExcept.ofExcept res)
  let ref ← getRef
  let mut m ← try msg res catch _ => pure m!"<exception thrown while producing trace node message>"
  let mut data := { cls, collapsed, tag }
  if profiler.get opts || aboveThresh then
    data := { data with startTime := start, stopTime := stop }
  addTraceNode oldTraces data ref m
    m := m!"[{secs}s] {m}"
  addTraceNode oldTraces cls ref m collapsed
  MonadExcept.ofExcept res

def withTraceNode' [MonadAlwaysExcept Exception m] [MonadLiftT BaseIO m] (cls : Name)
    (k : m (α × MessageData)) (collapsed := true) (tag := "") : m α :=
    (k : m (α × MessageData)) (collapsed := true) : m α :=
  let msg := fun
    | .ok (_, msg) => return msg
    | .error err => return err.toMessageData
  Prod.fst <$> withTraceNode cls msg k collapsed tag
  Prod.fst <$> withTraceNode cls msg k collapsed

end

@@ -315,7 +300,7 @@ TODO: find better name for this function.
-/
def withTraceNodeBefore [MonadRef m] [AddMessageContext m] [MonadOptions m]
    [always : MonadAlwaysExcept ε m] [MonadLiftT BaseIO m] [ExceptToEmoji ε α] (cls : Name)
    (msg : m MessageData) (k : m α) (collapsed := true) (tag := "") : m α := do
    (msg : m MessageData) (k : m α) (collapsed := true) : m α := do
  let _ := always.except
  let opts ← getOptions
  let clsEnabled ← isTracingEnabledFor cls
@@ -325,16 +310,15 @@ def withTraceNodeBefore [MonadRef m] [AddMessageContext m] [MonadOptions m]
  let ref ← getRef
  -- make sure to preserve context *before* running `k`
  let msg ← withRef ref do addMessageContext (← msg)
  let (res, start, stop) ← withStartStopSeconds <| observing k
  let aboveThresh := trace.profiler.get opts && stop - start > trace.profiler.threshold.getSecs opts
  let (res, secs) ← withSeconds <| observing k
  let aboveThresh := trace.profiler.get opts && secs > trace.profiler.threshold.getSecs opts
  unless clsEnabled || aboveThresh do
    modifyTraces (oldTraces ++ ·)
    return (← MonadExcept.ofExcept res)
  let mut msg := m!"{ExceptToEmoji.toEmoji res} {msg}"
  let mut data := { cls, collapsed, tag }
  if profiler.get opts || aboveThresh then
    data := { data with startTime := start, stopTime := stop }
  addTraceNode oldTraces data ref msg
    msg := m!"[{secs}s] {msg}"
  addTraceNode oldTraces cls ref msg collapsed
  MonadExcept.ofExcept res

end Lean

@@ -143,10 +143,10 @@ where
  | ctx, nest n d => Format.nest n <$> go nCtx ctx d
  | ctx, compose d₁ d₂ => do let d₁ ← go nCtx ctx d₁; let d₂ ← go nCtx ctx d₂; pure $ d₁ ++ d₂
  | ctx, group d => Format.group <$> go nCtx ctx d
  | ctx, .trace data header children => do
  | ctx, .trace cls header children collapsed => do
    let header := (← go nCtx ctx header).nest 4
    let nodes ←
      if data.collapsed && !children.isEmpty then
      if collapsed && !children.isEmpty then
        let children := children.map fun child =>
          MessageData.withNamingContext nCtx <|
            match ctx with
@@ -154,11 +154,11 @@ where
            | none => child
        let blockSize := ctx.bind (infoview.maxTraceChildren.get? ·.opts)
          |>.getD infoview.maxTraceChildren.defValue
        let children := chopUpChildren data.cls blockSize children.toSubarray
        let children := chopUpChildren cls blockSize children.toSubarray
        pure (.lazy children)
      else
        pure (.strict (← children.mapM (go nCtx ctx)))
    let e := .trace data.cls header data.collapsed nodes
    let e := .trace cls header collapsed nodes
    return .tag (← pushEmbed e) ".\n"

/-- Recursively moves child nodes after the first `blockSize` into a new "more" node. -/
@@ -167,7 +167,7 @@ where
    if children.size > blockSize + 1 then -- + 1 to make idempotent
      let more := chopUpChildren cls blockSize children[blockSize:]
      children[:blockSize].toArray.push <|
        .trace { collapsed := true, cls }
        .trace (collapsed := true) cls
          f!"{children.size - blockSize} more entries..." more
    else children

@@ -1044,7 +1044,6 @@ LEAN_EXPORT bool lean_string_lt(b_lean_obj_arg s1, b_lean_obj_arg s2);
static inline uint8_t lean_string_dec_eq(b_lean_obj_arg s1, b_lean_obj_arg s2) { return lean_string_eq(s1, s2); }
static inline uint8_t lean_string_dec_lt(b_lean_obj_arg s1, b_lean_obj_arg s2) { return lean_string_lt(s1, s2); }
LEAN_EXPORT uint64_t lean_string_hash(b_lean_obj_arg);
LEAN_EXPORT lean_obj_res lean_string_of_usize(size_t);

/* Thunks */

@@ -30,6 +30,7 @@ int compare(reducibility_hints const & h1, reducibility_hints const & h2) {
            return -1; /* unfold f1 */
        else
            return 1; /* unfold f2 */
        return h1.get_height() > h2.get_height() ? -1 : 1;
    } else {
        return 0; /* reduce both */
    }
@@ -24,6 +24,32 @@ inductive TomlOutcome where
  | fail (log : MessageLog)
  | error (e : IO.Error)

@[inline] def Fin.allM [Monad m] (n) (f : Fin n → m Bool) : m Bool :=
  loop 0
where
  loop (i : Nat) : m Bool := do
    if h : i < n then
      if (← f ⟨i, h⟩) then loop (i+1) else pure false
    else
      pure true
  termination_by n - i

@[inline] def Fin.all (n) (f : Fin n → Bool) : Bool :=
  Id.run <| allM n f

def bytesBEq (a b : ByteArray) : Bool :=
  if h_size : a.size = b.size then
    Fin.all a.size fun i => a[i] = b[i]'(h_size ▸ i.isLt)
  else
    false

def String.fromUTF8 (bytes : ByteArray) : String :=
  String.fromUTF8Unchecked bytes |>.map id

@[inline] def String.fromUTF8? (bytes : ByteArray) : Option String :=
  let s := String.fromUTF8 bytes
  if bytesBEq s.toUTF8 bytes then some s else none

nonrec def loadToml (tomlFile : FilePath) : BaseIO TomlOutcome := do
  let fileName := tomlFile.fileName.getD tomlFile.toString
  let input ←
@@ -302,7 +302,11 @@ extern "C" LEAN_EXPORT obj_res lean_io_prim_handle_mk(b_obj_arg filename, uint8
#ifdef LEAN_WINDOWS

static inline HANDLE win_handle(FILE * fp) {
#ifdef q4_WCE
    return (HANDLE)_fileno(fp);
#else
    return (HANDLE)_get_osfhandle(_fileno(fp));
#endif
}

/* Handle.lock : (@& Handle) → (exclusive : Bool) → IO Unit */
@@ -387,41 +391,13 @@ extern "C" LEAN_EXPORT obj_res lean_io_prim_handle_unlock(b_obj_arg h, obj_arg /

#endif

/* Handle.isTty : (@& Handle) → BaseIO Bool */
extern "C" LEAN_EXPORT obj_res lean_io_prim_handle_is_tty(b_obj_arg h, obj_arg /* w */) {
    FILE * fp = io_get_handle(h);
#ifdef LEAN_WINDOWS
    /*
    On Windows, there are two approaches for detecting a console.
    1) _isatty(_fileno(fp)) != 0
       This checks whether the file descriptor is a *character device*,
       not just a terminal (unlike Unix's isatty). Thus, it produces a false
       positive in some edge cases (such as NUL).
       https://stackoverflow.com/q/3648711
    2) GetConsoleMode(win_handle(fp), &mode) != 0
       Errors if the handle is not a console. Unfortunately, this produces
       a false negative for a terminal emulator like MSYS/Cygwin's Mintty,
       which is not implemented as a Windows-recognized console on
       old Windows versions (e.g., pre-Windows 10, pre-ConPTY).
       https://github.com/msys2/MINGW-packages/issues/14087
    We choose to use GetConsoleMode as that seems like the more modern approach,
    and Lean does not support pre-Windows 10.
    */
    DWORD mode;
    return io_result_mk_ok(box(GetConsoleMode(win_handle(fp), &mode) != 0));
#else
    // We ignore errors for consistency with Windows.
    return io_result_mk_ok(box(isatty(fileno(fp))));
#endif
}

/* Handle.isEof : (@& Handle) → BaseIO Bool */
extern "C" LEAN_EXPORT obj_res lean_io_prim_handle_is_eof(b_obj_arg h, obj_arg /* w */) {
    FILE * fp = io_get_handle(h);
    return io_result_mk_ok(box(std::feof(fp) != 0));
}

/* Handle.flush : (@& Handle) → IO Unit */
/* Handle.flush : (@& Handle) → IO Bool */
extern "C" LEAN_EXPORT obj_res lean_io_prim_handle_flush(b_obj_arg h, obj_arg /* w */) {
    FILE * fp = io_get_handle(h);
    if (!std::fflush(fp)) {

@@ -1614,14 +1614,10 @@ extern "C" LEAN_EXPORT object * lean_mk_string(char const * s) {
    return lean_mk_string_from_bytes(s, strlen(s));
}

extern "C" LEAN_EXPORT obj_res lean_string_from_utf8(b_obj_arg a) {
extern "C" LEAN_EXPORT obj_res lean_string_from_utf8_unchecked(b_obj_arg a) {
    return lean_mk_string_from_bytes(reinterpret_cast<char *>(lean_sarray_cptr(a)), lean_sarray_size(a));
}

extern "C" LEAN_EXPORT uint8 lean_string_validate_utf8(b_obj_arg a) {
    return validate_utf8(lean_sarray_cptr(a), lean_sarray_size(a));
}

extern "C" LEAN_EXPORT obj_res lean_string_to_utf8(b_obj_arg s) {
    size_t sz = lean_string_size(s) - 1;
    obj_res r = lean_alloc_sarray(1, sz, sz);
@@ -1633,10 +1629,6 @@ object * mk_string(std::string const & s) {
    return lean_mk_string_from_bytes(s.data(), s.size());
}

object * mk_ascii_string(std::string const & s) {
    return lean_mk_string_core(s.data(), s.size(), s.size());
}

std::string string_to_std(b_obj_arg o) {
    lean_assert(string_size(o) > 0);
    return std::string(w_string_cstr(o), lean_string_size(o) - 1);
@@ -1745,38 +1737,38 @@ extern "C" LEAN_EXPORT obj_res lean_string_data(obj_arg s) {

static bool lean_string_utf8_get_core(char const * str, usize size, usize i, uint32 & result) {
    unsigned c = static_cast<unsigned char>(str[i]);
    /* zero continuation (0 to 0x7F) */
    /* zero continuation (0 to 127) */
    if ((c & 0x80) == 0) {
        result = c;
        return true;
    }

    /* one continuation (0x80 to 0x7FF) */
    /* one continuation (128 to 2047) */
    if ((c & 0xe0) == 0xc0 && i + 1 < size) {
        unsigned c1 = static_cast<unsigned char>(str[i+1]);
        result = ((c & 0x1f) << 6) | (c1 & 0x3f);
        if (result >= 0x80) {
        if (result >= 128) {
            return true;
        }
    }

    /* two continuations (0x800 to 0xD7FF and 0xE000 to 0xFFFF) */
    /* two continuations (2048 to 55295 and 57344 to 65535) */
    if ((c & 0xf0) == 0xe0 && i + 2 < size) {
        unsigned c1 = static_cast<unsigned char>(str[i+1]);
        unsigned c2 = static_cast<unsigned char>(str[i+2]);
        result = ((c & 0x0f) << 12) | ((c1 & 0x3f) << 6) | (c2 & 0x3f);
        if (result >= 0x800 && (result < 0xD800 || result > 0xDFFF)) {
        if (result >= 2048 && (result < 55296 || result > 57343)) {
            return true;
        }
    }

    /* three continuations (0x10000 to 0x10FFFF) */
    /* three continuations (65536 to 1114111) */
    if ((c & 0xf8) == 0xf0 && i + 3 < size) {
        unsigned c1 = static_cast<unsigned char>(str[i+1]);
        unsigned c2 = static_cast<unsigned char>(str[i+2]);
        unsigned c3 = static_cast<unsigned char>(str[i+3]);
        result = ((c & 0x07) << 18) | ((c1 & 0x3f) << 12) | ((c2 & 0x3f) << 6) | (c3 & 0x3f);
        if (result >= 0x10000 && result <= 0x10FFFF) {
        if (result >= 65536 && result <= 1114111) {
            return true;
        }
    }
@@ -1814,32 +1806,32 @@ extern "C" LEAN_EXPORT uint32 lean_string_utf8_get(b_obj_arg s, b_obj_arg i0) {
}

extern "C" LEAN_EXPORT uint32_t lean_string_utf8_get_fast_cold(char const * str, size_t i, size_t size, unsigned char c) {
    /* one continuation (0x80 to 0x7FF) */
    /* one continuation (128 to 2047) */
    if ((c & 0xe0) == 0xc0 && i + 1 < size) {
        unsigned c1 = static_cast<unsigned char>(str[i+1]);
        uint32_t result = ((c & 0x1f) << 6) | (c1 & 0x3f);
        if (result >= 0x80) {
        if (result >= 128) {
            return result;
        }
    }

    /* two continuations (0x800 to 0xD7FF and 0xE000 to 0xFFFF) */
    /* two continuations (2048 to 55295 and 57344 to 65535) */
    if ((c & 0xf0) == 0xe0 && i + 2 < size) {
        unsigned c1 = static_cast<unsigned char>(str[i+1]);
        unsigned c2 = static_cast<unsigned char>(str[i+2]);
        uint32_t result = ((c & 0x0f) << 12) | ((c1 & 0x3f) << 6) | (c2 & 0x3f);
        if (result >= 0x800 && (result < 0xD800 || result > 0xDFFF)) {
        if (result >= 2048 && (result < 55296 || result > 57343)) {
            return result;
        }
    }

    /* three continuations (0x10000 to 0x10FFFF) */
    /* three continuations (65536 to 1114111) */
    if ((c & 0xf8) == 0xf0 && i + 3 < size) {
        unsigned c1 = static_cast<unsigned char>(str[i+1]);
        unsigned c2 = static_cast<unsigned char>(str[i+2]);
        unsigned c3 = static_cast<unsigned char>(str[i+3]);
        uint32_t result = ((c & 0x07) << 18) | ((c1 & 0x3f) << 12) | ((c2 & 0x3f) << 6) | (c3 & 0x3f);
        if (result >= 0x10000 && result <= 0x10FFFF) {
        if (result >= 65536 && result <= 1114111) {
            return result;
        }
    }
@@ -1926,19 +1918,6 @@ static inline bool is_utf8_first_byte(unsigned char c) {
    return (c & 0x80) == 0 || (c & 0xe0) == 0xc0 || (c & 0xf0) == 0xe0 || (c & 0xf8) == 0xf0;
}

extern "C" LEAN_EXPORT uint8 lean_string_is_valid_pos(b_obj_arg s, b_obj_arg i0) {
    if (!lean_is_scalar(i0)) {
        /* See comment at string_utf8_get */
        return false;
    }
    usize i = lean_unbox(i0);
    usize sz = lean_string_size(s) - 1;
    if (i > sz) return false;
    if (i == sz) return true;
    char const * str = lean_string_cstr(s);
    return is_utf8_first_byte(str[i]);
}

extern "C" LEAN_EXPORT obj_res lean_string_utf8_extract(b_obj_arg s, b_obj_arg b0, b_obj_arg e0) {
    if (!lean_is_scalar(b0) || !lean_is_scalar(e0)) {
        /* See comment at string_utf8_get */
@@ -2020,10 +1999,6 @@ extern "C" LEAN_EXPORT uint64 lean_string_hash(b_obj_arg s) {
    return hash_str(sz, (unsigned char const *) str, 11);
}

extern "C" LEAN_EXPORT obj_res lean_string_of_usize(size_t n) {
    return mk_ascii_string(std::to_string(n));
}

// =======================================
// ByteArray & FloatArray

@@ -113,7 +113,7 @@ unsigned utf8_to_unicode(uchar const * begin, uchar const * end) {
    auto it = begin;
    unsigned c = *it;
    ++it;
    if (c < 0x80)
    if (c < 128)
        return c;
    unsigned mask = (1u << 6) -1;
    unsigned hmask = mask;
@@ -164,40 +164,40 @@ optional<unsigned> get_utf8_first_byte_opt(unsigned char c) {

unsigned next_utf8(char const * str, size_t size, size_t & i) {
    unsigned c = static_cast<unsigned char>(str[i]);
    /* zero continuation (0 to 0x7F) */
    /* zero continuation (0 to 127) */
    if ((c & 0x80) == 0) {
        i++;
        return c;
    }

    /* one continuation (0x80 to 0x7FF) */
    /* one continuation (128 to 2047) */
    if ((c & 0xe0) == 0xc0 && i + 1 < size) {
        unsigned c1 = static_cast<unsigned char>(str[i+1]);
        unsigned r = ((c & 0x1f) << 6) | (c1 & 0x3f);
        if (r >= 0x80) {
        if (r >= 128) {
            i += 2;
            return r;
        }
    }

    /* two continuations (0x800 to 0xD7FF and 0xE000 to 0xFFFF) */
    /* two continuations (2048 to 55295 and 57344 to 65535) */
    if ((c & 0xf0) == 0xe0 && i + 2 < size) {
        unsigned c1 = static_cast<unsigned char>(str[i+1]);
        unsigned c2 = static_cast<unsigned char>(str[i+2]);
        unsigned r = ((c & 0x0f) << 12) | ((c1 & 0x3f) << 6) | (c2 & 0x3f);
        if (r >= 0x800 && (r < 0xD800 || r > 0xDFFF)) {
        if (r >= 2048 && (r < 55296 || r > 57343)) {
            i += 3;
            return r;
        }
    }

    /* three continuations (0x10000 to 0x10FFFF) */
    /* three continuations (65536 to 1114111) */
    if ((c & 0xf8) == 0xf0 && i + 3 < size) {
        unsigned c1 = static_cast<unsigned char>(str[i+1]);
        unsigned c2 = static_cast<unsigned char>(str[i+2]);
        unsigned c3 = static_cast<unsigned char>(str[i+3]);
        unsigned r = ((c & 0x07) << 18) | ((c1 & 0x3f) << 12) | ((c2 & 0x3f) << 6) | (c3 & 0x3f);
        if (r >= 0x10000 && r <= 0x10FFFF) {
        if (r >= 65536 && r <= 1114111) {
            i += 4;
            return r;
        }
@@ -220,56 +220,6 @@ void utf8_decode(std::string const & str, std::vector<unsigned> & out) {
    }
}

bool validate_utf8(uint8_t const * str, size_t size) {
    size_t i = 0;
    while (i < size) {
        unsigned c = str[i];
        if ((c & 0x80) == 0) {
            /* zero continuation (0 to 0x7F) */
            i++;
        } else if ((c & 0xe0) == 0xc0) {
            /* one continuation (0x80 to 0x7FF) */
            if (i + 1 >= size) return false;

            unsigned c1 = str[i+1];
            if ((c1 & 0xc0) != 0x80) return false;

            unsigned r = ((c & 0x1f) << 6) | (c1 & 0x3f);
            if (r < 0x80) return false;

            i += 2;
        } else if ((c & 0xf0) == 0xe0) {
            /* two continuations (0x800 to 0xD7FF and 0xE000 to 0xFFFF) */
            if (i + 2 >= size) return false;

            unsigned c1 = str[i+1];
            unsigned c2 = str[i+2];
            if ((c1 & 0xc0) != 0x80 || (c2 & 0xc0) != 0x80) return false;

            unsigned r = ((c & 0x0f) << 12) | ((c1 & 0x3f) << 6) | (c2 & 0x3f);
            if (r < 0x800 || (r >= 0xD800 && r <= 0xDFFF)) return false;

            i += 3;
        } else if ((c & 0xf8) == 0xf0) {
            /* three continuations (0x10000 to 0x10FFFF) */
            if (i + 3 >= size) return false;

            unsigned c1 = str[i+1];
            unsigned c2 = str[i+2];
            unsigned c3 = str[i+3];
            if ((c1 & 0xc0) != 0x80 || (c2 & 0xc0) != 0x80 || (c3 & 0xc0) != 0x80) return false;

            unsigned r = ((c & 0x07) << 18) | ((c1 & 0x3f) << 12) | ((c2 & 0x3f) << 6) | (c3 & 0x3f);
            if (r < 0x10000 || r > 0x10FFFF) return false;

            i += 4;
        } else {
            return false;
        }
    }
    return true;
}

#define TAG_CONT static_cast<unsigned char>(0b10000000)
#define TAG_TWO_B static_cast<unsigned char>(0b11000000)
#define TAG_THREE_B static_cast<unsigned char>(0b11100000)

@@ -45,9 +45,6 @@ LEAN_EXPORT unsigned next_utf8(char const * str, size_t size, size_t & i);
/* Decode a UTF-8 encoded string `str` into unicode scalar values */
LEAN_EXPORT void utf8_decode(std::string const & str, std::vector<unsigned> & out);

/* Returns true if the provided string is valid UTF-8 */
LEAN_EXPORT bool validate_utf8(uint8_t const * str, size_t size);

/* Push a unicode scalar value into a utf-8 encoded string */
LEAN_EXPORT void push_unicode_scalar(std::string & s, unsigned code);

@@ -217,7 +217,6 @@ static void display_help(std::ostream & out) {
#endif
    std::cout << " --plugin=file load and initialize Lean shared library for registering linters etc.\n";
    std::cout << " --load-dynlib=file load shared library to make its symbols available to the interpreter\n";
    std::cout << " --json report Lean output (e.g., messages) as JSON (one per line)\n";
    std::cout << " --deps just print dependencies of a Lean input\n";
    std::cout << " --print-prefix print the installation prefix for Lean and exit\n";
    std::cout << " --print-libdir print the installation directory for Lean's built-in libraries and exit\n";
@@ -231,7 +230,6 @@ static void display_help(std::ostream & out) {

static int print_prefix = 0;
static int print_libdir = 0;
static int json_output = 0;

static struct option g_long_options[] = {
    {"version", no_argument, 0, 'v'},
@@ -262,7 +260,6 @@ static struct option g_long_options[] = {
#endif
    {"plugin", required_argument, 0, 'p'},
    {"load-dynlib", required_argument, 0, 'l'},
    {"json", no_argument, &json_output, 1},
    {"print-prefix", no_argument, &print_prefix, 1},
    {"print-libdir", no_argument, &print_libdir, 1},
#ifdef LEAN_DEBUG
@@ -349,7 +346,6 @@ extern "C" object * lean_run_frontend(
    object * main_module_name,
    uint32_t trust_level,
    object * ilean_filename,
    uint8_t json_output,
    object * w
);
pair_ref<environment, object_ref> run_new_frontend(
@@ -357,8 +353,7 @@ pair_ref<environment, object_ref> run_new_frontend(
    options const & opts, std::string const & file_name,
    name const & main_module_name,
    uint32_t trust_level,
    optional<std::string> const & ilean_file_name,
    uint8_t json
    optional<std::string> const & ilean_file_name
) {
    object * oilean_file_name = mk_option_none();
    if (ilean_file_name) {
@@ -371,7 +366,6 @@ pair_ref<environment, object_ref> run_new_frontend(
        main_module_name.to_obj_arg(),
        trust_level,
        oilean_file_name,
        json_output,
        io_mk_world()
    ));
}
@@ -722,7 +716,7 @@ extern "C" LEAN_EXPORT int lean_main(int argc, char ** argv) {

    if (!main_module_name)
        main_module_name = name("_stdin");
    pair_ref<environment, object_ref> r = run_new_frontend(contents, opts, mod_fn, *main_module_name, trust_lvl, ilean_fn, json_output);
    pair_ref<environment, object_ref> r = run_new_frontend(contents, opts, mod_fn, *main_module_name, trust_lvl, ilean_fn);
    env = r.fst();
    bool ok = unbox(r.snd().raw());
BIN stage0/src/include/lean/lean.h (generated; binary file not shown)
BIN stage0/src/runtime/io.cpp (generated; binary file not shown)
BIN stage0/src/runtime/object.cpp (generated; binary file not shown)
BIN stage0/stdlib/Init/Coe.c (generated; binary file not shown)
BIN stage0/stdlib/Init/Control/Basic.c (generated; binary file not shown)
BIN stage0/stdlib/Init/Control/EState.c (generated; binary file not shown)
BIN stage0/stdlib/Init/Control/Except.c (generated; binary file not shown)
BIN stage0/stdlib/Init/Control/ExceptCps.c (generated; binary file not shown)
BIN stage0/stdlib/Init/Control/Id.c (generated; binary file not shown)
BIN stage0/stdlib/Init/Control/Option.c (generated; binary file not shown)
Some files were not shown because too many files have changed in this diff.