Compare commits

...

53 Commits

Author SHA1 Message Date
Leonardo de Moura
29a37fea78 chore: move fixed tests 2025-04-03 09:36:23 -07:00
Leonardo de Moura
ef09b113f4 feat: function composition normalization rules 2025-04-03 09:35:39 -07:00
Sebastian Ullrich
83067d67d6 chore: avoid mimalloc warnings (#7800) 2025-04-03 13:52:16 +00:00
Sebastian Ullrich
314f6c73b7 chore: remove ctest timeout pending further testing 2025-04-03 14:05:07 +02:00
Kim Morrison
680cb0ad5a chore: more failing test cases for grind (#7801)
Adds more failing test cases for grind.
2025-04-03 10:12:47 +00:00
Luisa Cicolini
e59d070af1 feat: add BitVec.umulOverflow and BitVec.smulOverflow definitions and additional theorems (#7659)
This PR adds SMT-LIB operators to detect overflow
`BitVec.(umul_overflow, smul_overflow)`, according to the definitions
[here](https://github.com/SMT-LIB/SMT-LIB-2/blob/2.7/Theories/FixedSizeBitVectors.smt2),
and the theorems proving equivalence of such definitions with the
`BitVec` library functions (`umulOverflow_eq`, `smulOverflow_eq`).
Support theorems for these proofs are `BitVec.toInt_one_of_lt,
BitVec.toInt_mul_toInt_lt, BitVec.le_toInt_mul_toInt,
BitVec.toNat_mul_toNat_lt, BitVec.two_pow_le_toInt_mul_toInt_iff,
BitVec.toInt_mul_toInt_lt_neg_two_pow_iff` and `Int.neg_mul_le_mul,
Int.bmod_eq_self_of_le_mul_two, Int.mul_le_mul_of_natAbs_le,
Int.mul_le_mul_of_le_of_le_of_nonneg_of_nonpos, Int.pow_lt_pow`. The PR
also includes a set of tests.
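As a sanity check, the linked SMT-LIB definitions can be modeled over plain integers; this Python sketch illustrates the intended semantics and is not the Lean code:

```python
# Reference model of the SMT-LIB overflow predicates that the new
# BitVec.umulOverflow / BitVec.smulOverflow definitions follow.

def umul_overflow(x: int, y: int, w: int) -> bool:
    """Unsigned multiplication of two w-bit values overflows
    iff the exact product does not fit in w bits."""
    return x * y >= 1 << w

def to_int(x: int, w: int) -> int:
    """Interpret a w-bit value as a two's-complement integer."""
    return x - (1 << w) if x >= 1 << (w - 1) else x

def smul_overflow(x: int, y: int, w: int) -> bool:
    """Signed multiplication overflows iff the exact product of the
    two's-complement readings falls outside [-2^(w-1), 2^(w-1) - 1]."""
    p = to_int(x, w) * to_int(y, w)
    return not (-(1 << (w - 1)) <= p < 1 << (w - 1))

# 8-bit examples: 16 * 16 = 256 overflows unsigned;
# (-128) * (-1) = 128 overflows signed.
print(umul_overflow(16, 16, 8))        # True
print(smul_overflow(0x80, 0xFF, 8))    # True
print(smul_overflow(2, 3, 8))          # False
```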

Co-authored by @tobiasgrosser.

---------

Co-authored-by: Tobias Grosser <tobias@grosser.es>
Co-authored-by: Tobias Grosser <github@grosser.es>
Co-authored-by: Siddharth <siddu.druid@gmail.com>
2025-04-03 08:42:52 +00:00
Markus Himmel
bb6bfdba37 feat: Nat.lcm lemmas (#7791)
This PR adds lemmas about `Nat.lcm`.
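For illustration, standard `lcm` identities of the kind such a collection typically covers can be checked numerically (the PR message does not list the lemma names):

```python
import math

# Numeric sanity checks for standard lcm identities; the selection here
# is an assumption, since the PR message does not enumerate the lemmas.
for m in range(1, 20):
    for n in range(1, 20):
        assert math.lcm(m, n) == math.lcm(n, m)          # commutativity
        assert math.lcm(m, n) * math.gcd(m, n) == m * n  # gcd_mul_lcm
        assert math.lcm(m, m * n) == m * n               # lcm of a multiple
print("ok")
```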
2025-04-03 08:31:05 +00:00
Henrik Böving
626075ca34 feat: add Std.SharedMutex (#7770)
This PR adds a shared mutex (or read-write lock) as `Std.SharedMutex`.

To make migrating from `Std.Mutex` to `Std.SharedMutex` easy when
necessary, the functions for obtaining exclusive access share the same
names, so swapping the type is a correct drop-in change.
2025-04-03 08:30:54 +00:00
Rob23oba
5e13f81e30 feat: Option.pfilter and other lemmas (#7774)
This PR adds `Option.pfilter`, a variant of `Option.filter` and several
lemmas for it and other `Option` functions. These lemmas are split off
from #7400.
2025-04-03 08:30:38 +00:00
Kim Morrison
196d899c02 feat: grind internal CommRing class (#7797)
This PR adds a monolithic `CommRing` class, for internal use by `grind`,
and includes instances for `Int`/`BitVec`/`IntX`/`UIntX`.
2025-04-03 08:30:19 +00:00
Kim Morrison
6a22951e90 chore: begin development cycle for v4.20.0 (#7798) 2025-04-03 08:29:52 +00:00
Sebastian Ullrich
fd0d585916 chore: update test output 2025-04-03 10:27:54 +02:00
Lean stage0 autoupdater
bdd8d6fcac chore: update stage0 2025-04-03 03:26:05 +00:00
Mac Malone
01f3bbb2de fix: lake: Lean shared lib path before the workspace's (#7796)
This PR moves Lean's shared library path before the workspace's in
Lake's augmented environment (e.g., `lake env`).

Lean's comes first because Lean needs to load its own shared libraries
from this path; giving the workspace greater precedence can break this
(e.g., when bootstrapping). This change does not affect the shared
library path on Windows (i.e., `PATH`) because there the shared
libraries are already prioritized by being located next to the
executable.
2025-04-03 01:24:11 +00:00
Kim Morrison
12ec466aa6 feat: further release checklist automation (#7785)
This PR adds further automation to the release process, taking care of
tagging and of creating new `bump/v4.X.0` branches automatically, and
fixing some bugs.

---------

Co-authored-by: Johan Commelin <johan@commelin.net>
2025-04-03 00:02:07 +00:00
Sebastian Ullrich
c658648ee8 refactor: split Lean.EnvironmentExtension from Lean.Environment (#7794) 2025-04-02 16:19:12 +00:00
Sebastian Ullrich
9c87db2d77 fix: filter empty arguments from FFI flags (#7793)
This PR prevents compilation issues on some local dev configurations.
2025-04-02 15:16:41 +00:00
Sebastian Ullrich
33e456dd3c chore: CI: improve ccache (#7643) 2025-04-02 13:18:42 +00:00
Sebastian Ullrich
dedfbaf521 fix: lean --run should not parse and permute remaining arguments (#7789)
This PR fixes `lean` potentially changing or interpreting arguments
after `--run`.

**Breaking change**: The Lean file to run must now be passed directly
after `--run`; this was accidentally not enforced before.
2025-04-02 12:44:31 +00:00
Sebastian Ullrich
bd24ca3093 test: re-elaboration benchmarks (#7784)
Tests language server memory use by repeatedly re-elaborating a given file.
2025-04-02 10:10:46 +00:00
Siddharth
fe986b4533 feat: BitVec.add_shiftLeft_eq_or_shiftLeft (#7761)
This PR implements the core theorem for the Bitwuzla rewrites
[NORM_BV_NOT_OR_SHL](e09c50818b/src/rewrite/rewrites_bv.cpp (L1495-L1510))
and
[BV_ADD_SHL](e09c50818b/src/rewrite/rewrites_bv.cpp (L395-L401)),
which convert the mixed-boolean-arithmetic expression into a purely
arithmetic expression:

```lean
theorem add_shiftLeft_eq_or_shiftLeft {x y : BitVec w} :
    x + (y <<< x) = x ||| (y <<< x)
```
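The identity can be exhaustively checked at a small width with an integer model (this mirrors the statement, not the Lean proof): since `x < 2^x` for every natural `x`, all set bits of `x` lie below position `x`, while `y <<< x` is zero there, so the addition never carries and coincides with bitwise or.

```python
# Exhaustive check of x + (y <<< x) = x ||| (y <<< x) at width 6.
w = 6
mask = (1 << w) - 1
for x in range(1 << w):
    for y in range(1 << w):
        shifted = (y << x) & mask  # BitVec shift truncates to w bits
        assert (x + shifted) & mask == x | shifted
print("ok")
```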
2025-04-02 10:06:33 +00:00
Marc Huisinga
336b68ec20 feat: 'unknown identifier' code actions (#7665)
This PR adds support for code actions that resolve 'unknown identifier'
errors by either importing the missing declaration or by changing the
identifier to one from the environment.

<details>
<summary>Demo (Click to open)</summary>


![Demo](https://github.com/user-attachments/assets/ba575860-b76d-4213-8cd7-a5525cd60287)
</details>

Specifically, the following kinds of code actions are added by this PR,
all of which are triggered on 'unknown identifier' errors:
- A code action to import the module containing the identifier at the
text cursor position.
- A code action to change the identifier at the text cursor position to
one from the environment.
- A source action to import the modules for all unambiguous identifiers
in the file.

### Details
When clicking on an identifier with an 'unknown identifier' diagnostic,
after a debounce delay of 1000ms, the language server looks up the
(potentially partial) identifier at the position of the cursor in the
global reference data structure by fuzzy-matching against all
identifiers and collects the 10 closest matching entries. This search
accounts for open namespaces at the position of the cursor, including
the namespace of the type / expected type when using dot notation. The
10 closest matching entries are then offered to the user as code
actions:
- If the suggested identifier is not contained in the environment, a
code action that imports the module that the identifier is contained in
and changes the identifier to the suggested one is offered. The
suggestion is inserted in a "minimal" manner, i.e. by accounting for
open namespaces.
- If the suggested identifier is contained in the environment, a code
action that only changes the identifier to the suggested one is offered.
- If the suggested identifier is not contained in the environment and
the suggested identifier is a perfectly unambiguous match, a source
action to import the modules for all unambiguous identifiers in the
file is offered.

The source action to import all unambiguous identifiers can also always
be triggered by right-clicking in the document and selecting the 'Source
Action...' entry.

At the moment, for large projects, the search for closely matching
identifiers in the global reference data structure is still a bit slow.
I hope to optimize it next quarter.
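For illustration, the described fuzzy lookup can be sketched with Python's `difflib`; the identifier list, cutoff, and ranking below are assumptions for the sketch, not the server's actual data structure or algorithm:

```python
import difflib

# Sketch of the described lookup: fuzzy-match a (potentially partial)
# identifier against all known identifiers and keep the closest
# candidates, as the language server does with its 10 closest entries.
identifiers = ["List.map", "List.filterMap", "Array.map", "Option.map",
               "Nat.min", "Nat.max", "List.flatMap"]

def closest(partial: str, n: int = 10):
    # cutoff is a hypothetical threshold; the server's scoring differs.
    return difflib.get_close_matches(partial, identifiers, n=n, cutoff=0.3)

print(closest("List.mpa"))
```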

### Implementation notes
- Since the global reference data structure is in the watchdog process,
whereas the elaboration information is in the file worker process, this
PR implements support for file worker -> watchdog requests, including a
new `$/lean/queryModule` request that can be used by the file worker to
request global identifier information.
- To identify 'unknown identifier' errors, several 'unknown identifier'
errors in the elaborator are tagged with a new tag.
- The debounce delay of 1000ms is necessary because VS Code will
re-request code actions while editing an unknown identifier and also
while hovering over the identifier.
- We also implement cancellation for these 'unknown identifier' code
actions. Once the file worker responds to the request as having been
cancelled, the watchdog cancels its computation of all corresponding
file worker -> watchdog requests, too.
- Aliases (i.e. `export`) are currently not accounted for. I've found
that we currently don't handle them correctly in auto-completion either,
so we will likely add support for this later when fixing the
corresponding auto-completion issue.
- The new code actions added by this PR support incrementality.
2025-04-02 09:43:40 +00:00
Sebastian Ullrich
5df4e48dc9 feat: importModules without loading environment extensions (#6325)
This PR ensures that environments can be loaded repeatedly without
executing arbitrary code.
2025-04-02 08:37:11 +00:00
Kim Morrison
1ee7e1a9d8 chore: normalize URLs to the language reference in test results (#7782)
Links to the language reference include a version slug, either `latest`
or `v4.X.0`. These are included in hovers, which then get tested. To
avoid test breakages, in the testing framework we normalize all such URL
prefixes back to `REFERENCE`.
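The normalization described can be sketched as a regex rewrite; the exact URL prefix and pattern here are assumptions for illustration, not the testing framework's code:

```python
import re

# Hypothetical sketch: collapse versioned reference-manual URL prefixes
# (either `latest` or a `v4.X.0` slug) to the placeholder `REFERENCE`.
def normalize(s: str) -> str:
    return re.sub(r"https://lean-lang\.org/doc/reference/(latest|v4\.\d+\.0)",
                  "REFERENCE", s)

print(normalize("see https://lean-lang.org/doc/reference/latest/find"))
```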
2025-04-02 06:17:31 +00:00
Leonardo de Moura
85f94abe19 feat: helper theorems (#7783)
This PR adds helper theorems for equality propagation.
2025-04-02 01:43:14 +00:00
Leonardo de Moura
2979830120 fix: Bool disequality propagation in grind (#7781)
This PR adds a new propagation rule for `Bool` disequalities to `grind`.
It now propagates `x = true` (`x = false`) from the disequality `x =
false` (`x = true`). It ensures we don't have to perform case analysis
on `x` to learn this fact. See tests.
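A hedged sketch of the kind of goal the new rule enables `grind` to close without case analysis (the actual test contents are not shown in the message):

```lean
-- Plausible test shapes, assuming the propagation described above:
example (x : Bool) (h : x ≠ false) : x = true := by grind
example (x : Bool) (h : x ≠ true) : x = false := by grind
```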
2025-04-01 22:12:20 +00:00
Leonardo de Moura
27084f6646 fix: missing propagation rules for non decidable lawful BEq in grind (#7778)
This PR adds missing propagation rules for `LawfulBEq A` to `grind`.
They are needed in a context where the instance `DecidableEq A` is not
available. See new test.
2025-04-01 20:15:01 +00:00
Cameron Zwarich
cdc2731401 chore: derive more type classes for IR data structures (#7085) 2025-04-01 19:59:25 +00:00
Leonardo de Moura
6c42cb353a fix: prop local instances in grind (#7777)
This PR fixes the introduction procedure used in `grind`. It was not
registering local instances that are also propositions. See new test.
2025-04-01 18:51:45 +00:00
Leonardo de Moura
8ff05f9760 feat: improve grind equality proof discharger (#7776)
This PR improves the equality proof discharger used by the E-matching
procedure in `grind`.
2025-04-01 18:02:38 +00:00
Leonardo de Moura
73d08f663d feat: NatCast.natCast unexpander (#7775)
This PR adds an unexpander for `NatCast.natCast`. See new comment for
details.
2025-04-01 17:11:44 +00:00
Markus Himmel
b6f18e8e2f feat: Nat.gcd lemmas (#7756)
This PR adds lemmas about `Nat.gcd` (some of which are currently present
in mathlib).
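As a cross-check, `Nat.gcd`'s defining equations (`gcd 0 n = n` and `gcd (m+1) n = gcd (n % (m+1)) (m+1)`) can be mirrored in Python and compared against `math.gcd`:

```python
import math

# Mirror of Nat.gcd's structural recursion, checked against math.gcd.
def nat_gcd(m: int, n: int) -> int:
    if m == 0:
        return n
    return nat_gcd(n % m, m)

for m in range(50):
    for n in range(50):
        assert nat_gcd(m, n) == math.gcd(m, n)
print("ok")
```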
2025-04-01 17:05:42 +00:00
Sebastian Ullrich
8b1caa3bc2 fix: make new codegen async realization-compatible (#7316)
Follow-up to #7247
2025-04-01 15:55:14 +00:00
Henrik Böving
6a45bd5f77 feat: add Std.Barrier (#7771)
This PR adds a barrier primitive as `Std.Barrier`.

The implementation is modeled on [the Rust
one](https://github.com/rust-lang/rust/blob/b8ae372/library/std/src/sync/barrier.rs),
as C++14 does not have barriers yet.
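Python's standard library offers the same primitive, which makes the semantics easy to demonstrate; this is an analogy, not the `Std.Barrier` API:

```python
import threading

# threading.Barrier plays the role of the new Std.Barrier: each of the
# n participating threads blocks in wait() until all n have arrived.
n = 4
barrier = threading.Barrier(n)
results = []

def worker(i: int) -> None:
    barrier.wait()  # no thread proceeds until all 4 have arrived
    results.append(i)

threads = [threading.Thread(target=worker, args=(i,)) for i in range(n)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(sorted(results))  # [0, 1, 2, 3]
```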
2025-04-01 15:48:13 +00:00
Sebastian Ullrich
9c6c54107f doc: AsyncMode.mainOnly is the default (#7773) 2025-04-01 13:04:18 +00:00
Sebastian Ullrich
daa41939fe fix: sanitize build and mimalloc (#7772)
TODO: try `MI_TRACK_ASAN` instead
2025-04-01 12:57:24 +00:00
Kim Morrison
2063fd3976 feat: upgrades to release automation (#7769)
This PR fixes a number of bugs in the release automation scripts, adds a
script to merge tags into remote `stable` branches, and makes the main
`release_checklist.py` script give suggestions to call the
`merge_remote.py` and `release_steps.py` scripts when needed.

---------

Co-authored-by: Johan Commelin <johan@commelin.net>
2025-04-01 08:17:24 +00:00
Siddharth
55b0d390c6 feat: BitVec.append_add_append_eq_append (#7757)
This PR adds the Bitwuzla rewrite `NORM_BV_ADD_CONCAT` for symbolic
simplification of add-of-append.

---------

Co-authored-by: Tobias Grosser <github@grosser.es>
2025-04-01 07:47:18 +00:00
Henrik Böving
32cd701994 feat: add Std.RecursiveMutex (#7755)
This PR adds `Std.RecursiveMutex` as a recursive/reentrant equivalent to
`Std.Mutex`.
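The reentrancy being added can be demonstrated with Python's analogous primitive, `threading.RLock`; this illustrates the semantics, not the `Std.RecursiveMutex` API:

```python
import threading

# threading.RLock is the Python analogue of a recursive/reentrant mutex:
# unlike a plain Lock, the thread already holding it may re-acquire it.
lock = threading.RLock()

def inner() -> int:
    with lock:       # re-entry by the same thread: fine with RLock,
        return 41    # would deadlock with a plain threading.Lock

def outer() -> int:
    with lock:
        return inner() + 1

print(outer())  # 42
```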
2025-04-01 07:35:36 +00:00
Johan Commelin
911ea07a73 chore: add script to generate release steps (#7747)
This PR takes a step towards automating the release process.
Somewhat following the idea of

https://blog.danslimmon.com/2019/07/15/do-nothing-scripting-the-key-to-gradual-automation/
2025-04-01 04:25:57 +00:00
Kim Morrison
fcb0ab8490 chore: add List.head_singleton theorem (#7768) 2025-04-01 03:59:55 +00:00
Kim Morrison
50cec261fc chore: failing test cases for grind proving List lemmas (#7767) 2025-04-01 03:56:08 +00:00
Kim Morrison
cdedcf6b48 chore: fix statement of List/Array/Vector.all_filter (#7766) 2025-04-01 03:29:53 +00:00
Mac Malone
7fefa8660e chore: lake: rm excess -lstdcpp from FFI example (#7758)
This PR removes the `-lstdcpp` extra link argument from the FFI example.
It is not actually necessary.
2025-04-01 03:10:54 +00:00
Kyle Miller
34142685a9 fix: use more reduction when computing parent types (#7764)
This PR adds in more normalization for the routine that computes a
parent type. Some mathlib adaptations are the result of not reducing the
type parameters.
2025-04-01 02:48:17 +00:00
Leonardo de Moura
e7fc50acb1 feat: dependent implication introduction in grind (#7765)
This PR improves how `grind` normalizes dependent implications during
introduction.
Previously, `grind` would introduce a hypothesis `h : p` for a goal of
the form `.. ⊢ (h : p) → q h`, and then normalize and assert a
non-dependent copy of `p`. As a result, the local context would contain
both `h : p` and a separate `h' : p'`, where `p'` is the normal form of
`p`. Moreover, `q` would still depend on the original `h`.

After this commit, `grind` avoids creating a copy. The context will now
contain only `h : p'`, and the new goal becomes `.. ⊢ q (he.mpr_prop
h)`, where `he` is a proof of `p = p'`.
2025-04-01 02:38:13 +00:00
Mac Malone
c30c71a278 fix: lake: target kinds & keys (#7763)
This PR corrects build key fetches to produce jobs with the proper data
kinds and fixes a failed coercion from key literals to targets.
2025-04-01 01:28:07 +00:00
Leonardo de Moura
bb07a732e7 refactor: use mkAuxLemma in mkAuxTheorem (#7762)
cc @Kha

---------

Co-authored-by: Sebastian Ullrich <sebasti@nullri.ch>
2025-03-31 22:50:30 +00:00
Kyle Miller
d6303a8e7f refactor: factor out common code for structure default values (#7737)
This PR factors out a `Lean.Meta.instantiateStructDefaultValueFn?`
function for instantiating default values for fields.
2025-03-31 22:40:39 +00:00
Leonardo de Moura
1d47360099 fix: transparency setting when computing congruence lemmas in grind (#7760)
This PR ensures `grind` is using the default transparency setting when
computing auxiliary congruence lemmas.
2025-03-31 20:52:36 +00:00
Sofia Rodrigues
edb02104d2 fix: async task assertions in tests by replacing assert! with assertBEq (#7729)
This PR replaces `assert!` with `assertBEq` to fix issues where
assertions in a separate task did not cause `ctest` failures. This was
caused by panics not being caught in tasks, whereas IO errors are
handled by the `AsyncTask` when we use its `block` function.

---------

Co-authored-by: Henrik Böving <hargonix@gmail.com>
2025-03-31 17:49:29 +00:00
Henrik Böving
6faab78384 chore: delete unused invariant (#7759)
This PR deletes an unused invariant from the AIG to CNF conversion.
Interestingly, despite being listed in the AIGNET paper, it is not
actually used in the proof, so we can simply remove it.
2025-03-31 17:35:46 +00:00
Henrik Böving
1b5a52a5e9 feat: Std.BaseMutex.tryLock and Mutex.tryAtomically (#7751)
This PR adds `Std.BaseMutex.tryLock` and `Std.Mutex.tryAtomically` as
well as unit tests for our locking and condition variable primitives.

---------

Co-authored-by: Markus Himmel <markus@lean-fro.org>
2025-03-31 12:19:09 +00:00
472 changed files with 4855 additions and 1422 deletions

View File

@@ -1,3 +1,4 @@
# instantiated by ci.yml
name: build-template
on:
workflow_call:
@@ -45,7 +46,7 @@ jobs:
CCACHE_DIR: ${{ github.workspace }}/.ccache
CCACHE_COMPRESS: true
# current cache limit
CCACHE_MAXSIZE: 200M
CCACHE_MAXSIZE: 600M
# squelch error message about missing nixpkgs channel
NIX_BUILD_SHELL: bash
LSAN_OPTIONS: max_leaks=10
@@ -97,32 +98,22 @@ jobs:
sudo apt-get install -y gcc-multilib g++-multilib ccache libuv1-dev:i386 pkgconf:i386
if: matrix.cmultilib
- name: Cache
id: restore-cache
if: matrix.name != 'Linux Lake'
uses: actions/cache@v4
uses: actions/cache/restore@v4
with:
# NOTE: must be in sync with `save` below
path: |
.ccache
key: ${{ matrix.name }}-build-v3-${{ github.event.pull_request.head.sha }}
# fall back to (latest) previous cache
restore-keys: |
${{ matrix.name }}-build-v3
save-always: true
- name: Cache
if: matrix.name == 'Linux Lake'
uses: actions/cache@v4
with:
path: |
.ccache
build/stage1/**/*.trace
${{ matrix.name == 'Linux Lake' && 'build/stage1/**/*.trace
build/stage1/**/*.olean
build/stage1/**/*.ilean
build/stage1/**/*.c
build/stage1/**/*.c.o*
build/stage1/**/*.c.o*' || '' }}
key: ${{ matrix.name }}-build-v3-${{ github.event.pull_request.head.sha }}
# fall back to (latest) previous cache
restore-keys: |
${{ matrix.name }}-build-v3
save-always: true
# open nix-shell once for initial setup
- name: Setup
run: |
@@ -236,6 +227,7 @@ jobs:
make -C build update-stage0 && rm -rf build/stage* && make -C build -j$NPROC
if: matrix.name == 'Linux' && inputs.check-level >= 1
- name: CCache stats
if: always()
run: ccache -s
- name: Show stacktrace for coredumps
if: failure() && runner.os == 'Linux'
@@ -243,4 +235,17 @@ jobs:
for c in $(find . -name core); do
progbin="$(file $c | sed "s/.*execfn: '\([^']*\)'.*/\1/")"
echo bt | $GDB/bin/gdb -q $progbin $c || true
done
done
- name: Save Cache
if: always() && steps.restore-cache.outputs.cache-hit != 'true'
uses: actions/cache/save@v4
with:
# NOTE: must be in sync with `restore` above
path: |
.ccache
${{ matrix.name == 'Linux Lake' && 'build/stage1/**/*.trace
build/stage1/**/*.olean
build/stage1/**/*.ilean
build/stage1/**/*.c
build/stage1/**/*.c.o*' || '' }}
key: ${{ steps.restore-cache.outputs.cache-primary-key }}

View File

@@ -20,6 +20,9 @@ foreach(var ${vars})
elseif("${var}" MATCHES "LLVM*|PKG_CONFIG|USE_LAKE|USE_MIMALLOC")
list(APPEND STAGE0_ARGS "-D${var}=${${var}}")
endif()
elseif("${var}" MATCHES "USE_MIMALLOC")
list(APPEND CL_ARGS "-D${var}=${${var}}")
list(APPEND STAGE0_ARGS "-D${var}=${${var}}")
elseif(("${var}" MATCHES "CMAKE_.*") AND NOT ("${var}" MATCHES "CMAKE_BUILD_TYPE") AND NOT ("${var}" MATCHES "CMAKE_HOME_DIRECTORY"))
list(APPEND PLATFORM_ARGS "-D${var}=${${var}}")
endif()

View File

@@ -30,6 +30,7 @@
"LEANC_EXTRA_CC_FLAGS": "-fsanitize=address,undefined",
"LEAN_EXTRA_LINKER_FLAGS": "-fsanitize=address,undefined -fsanitize-link-c++-runtime",
"SMALL_ALLOCATOR": "OFF",
"USE_MIMALLOC": "OFF",
"BSYMBOLIC": "OFF",
"LEAN_TEST_VARS": "MAIN_STACK_SIZE=16000"
},

View File

@@ -42,6 +42,10 @@ We'll use `v4.6.0` as the intended release version as a running example.
- In order to have the access rights to push to these repositories and merge PRs,
you will need to be a member of the `lean-release-managers` team at both `leanprover-community` and `leanprover`.
Contact Kim Morrison (@kim-em) to arrange access.
- There is an experimental script that will guide you through the steps for each of the repositories below.
The script should be invoked as
`script/release_steps.py vx.y.z <repo>` where `<repo>` is a case-insensitive substring of the repo name.
For example: `script/release_steps.py v4.6.0 batt` will guide you through the steps for the Batteries repository.
- For each of the repositories listed below:
- Make a PR to `master`/`main` changing the toolchain to `v4.6.0`
- The usual branch name would be `bump_to_v4.6.0`.
@@ -224,20 +228,8 @@ We'll use `v4.7.0-rc1` as the intended release version in this example.
You will want a few bullet points for main topics from the release notes.
Please also make sure that whoever is handling social media knows the release is out.
- Begin the next development cycle (i.e. for `v4.8.0`) on the Lean repository, by making a PR that:
- Uses branch name `dev_cycle_v4.8`.
- Updates `src/CMakeLists.txt` to say `set(LEAN_VERSION_MINOR 8)`
- Replaces the "release notes will be copied" text in the `v4.6.0` section of `RELEASES.md` with the
finalized release notes from the `releases/v4.6.0` branch.
- Replaces the "development in progress" in the `v4.7.0` section of `RELEASES.md` with
```
Release candidate, release notes will be copied from the branch `releases/v4.7.0` once completed.
```
and inserts the following section before that section:
```
v4.8.0
----------
Development in progress.
```
- Removes all the entries from the `./releases_drafts/` folder.
- Titled "chore: begin development cycle for v4.8.0"

View File

@@ -0,0 +1,6 @@
**Breaking Changes**
* The functions `Lean.Environment.importModules` and `Lean.Environment.finalizeImport` have been extended with a new parameter `loadExts : Bool := false` that enables environment extension state loading.
Their previous behavior corresponds to setting the flag to `true` but is only safe to do in combination with `enableInitializersExecution`; see also the `importModules` docstring.
The new default value `false` ensures the functions can be used correctly multiple times within the same process when environment extension access is not needed.
The wrapper function `Lean.Environment.withImportModules` now always calls `importModules` with `loadExts := false` as it is incompatible with extension loading.

View File

@@ -0,0 +1,70 @@
import Lean.Data.Lsp
open Lean
open Lean.Lsp
open Lean.JsonRpc
/-!
Tests language server memory use by repeatedly re-elaborating a given file.
NOTE: only works on Linux for now.
HACK: The line that is to be prepended with a space is hard-coded below to be sufficiently far down
not to touch the imports for usual files.
-/

def main (args : List String) : IO Unit := do
  let leanCmd :: file :: iters :: args := args | panic! "usage: script <lean> <file> <#iterations> <server-args>..."
  let uri := s!"file:///{file}"
  Ipc.runWith leanCmd (#["--worker", "-DstderrAsMessages=false"] ++ args ++ #[uri]) do
    -- for use with heaptrack:
    --Ipc.runWith "heaptrack" (#[leanCmd, "--worker", "-DstderrAsMessages=false"] ++ args ++ #[uri]) do
    --  -- heaptrack has no quiet mode??
    --  let _ ← (← Ipc.stdout).getLine
    --  let _ ← (← Ipc.stdout).getLine
    let capabilities := {
      textDocument? := some {
        completion? := some {
          completionItem? := some {
            insertReplaceSupport? := true
          }
        }
      }
    }
    Ipc.writeRequest ⟨0, "initialize", { capabilities : InitializeParams }⟩
    let text ← IO.FS.readFile file
    let mut requestNo : Nat := 1
    let mut versionNo : Nat := 1
    Ipc.writeNotification ⟨"textDocument/didOpen", {
      textDocument := { uri := uri, languageId := "lean", version := 1, text := text } : DidOpenTextDocumentParams }⟩
    for i in [0:iters.toNat!] do
      if i > 0 then
        versionNo := versionNo + 1
        let pos := { line := 19, character := 0 }
        let params : DidChangeTextDocumentParams := {
          textDocument := {
            uri := uri
            version? := versionNo
          }
          contentChanges := #[TextDocumentContentChangeEvent.rangeChange {
            start := pos
            «end» := pos
          } " "]
        }
        let params := toJson params
        Ipc.writeNotification ⟨"textDocument/didChange", params⟩
      requestNo := requestNo + 1
      let diags ← Ipc.collectDiagnostics requestNo uri versionNo
      if let some diags := diags then
        for diag in diags.param.diagnostics do
          IO.eprintln diag.message
      requestNo := requestNo + 1
      let status ← IO.FS.readFile s!"/proc/{(← read).pid}/status"
      for line in status.splitOn "\n" |>.filter (·.startsWith "RssAnon") do
        IO.eprintln line
    let _ ← Ipc.collectDiagnostics requestNo uri versionNo
    (← Ipc.stdin).writeLspMessage (Message.notification "exit" none)
    discard <| Ipc.waitForExit

script/merge_remote.py Executable file
View File

@@ -0,0 +1,167 @@
#!/usr/bin/env python3
"""
Merge a tag into a branch on a GitHub repository.
This script checks if a specified tag can be merged cleanly into a branch and performs
the merge if possible. If the merge cannot be done cleanly, it prints a helpful message.
Usage:
python3 merge_remote.py <org/repo> <branch> <tag>
Arguments:
org/repo: GitHub repository in the format 'organization/repository'
branch: The target branch to merge into
tag: The tag to merge from
Example:
python3 merge_remote.py leanprover/mathlib4 stable v4.6.0
The script uses the GitHub CLI (`gh`), so make sure it's installed and authenticated.
"""
import argparse
import subprocess
import sys
import tempfile
import os
import shutil

def run_command(command, check=True, capture_output=True):
    """Run a shell command and return the result."""
    try:
        result = subprocess.run(
            command,
            check=check,
            shell=True,
            text=True,
            capture_output=capture_output
        )
        return result
    except subprocess.CalledProcessError as e:
        if capture_output:
            print(f"Command failed: {command}")
            print(f"Error: {e.stderr}")
        return e

def clone_repo(repo, temp_dir):
    """Clone the repository to a temporary directory using shallow clone."""
    print(f"Shallow cloning {repo}...")
    # Keep the shallow clone for efficiency
    clone_result = run_command(f"gh repo clone {repo} {temp_dir} -- --depth=1", check=False)
    if clone_result.returncode != 0:
        print(f"Failed to clone repository {repo}.")
        print(f"Error: {clone_result.stderr}")
        return False
    return True

def check_and_merge(repo, branch, tag, temp_dir):
    """Check if tag can be merged into branch and perform the merge if possible."""
    # Change to the temporary directory
    os.chdir(temp_dir)

    # First fetch the specific remote branch with its history
    print(f"Fetching branch '{branch}'...")
    fetch_branch = run_command(f"git fetch origin {branch}:refs/remotes/origin/{branch} --update-head-ok")
    if fetch_branch.returncode != 0:
        print(f"Error: Failed to fetch branch '{branch}'.")
        return False

    # Then fetch the specific tag
    print(f"Fetching tag '{tag}'...")
    fetch_tag = run_command(f"git fetch origin tag {tag}")
    if fetch_tag.returncode != 0:
        print(f"Error: Failed to fetch tag '{tag}'.")
        return False

    # Check if branch exists now that we've fetched it
    branch_check = run_command(f"git branch -r | grep origin/{branch}")
    if branch_check.returncode != 0:
        print(f"Error: Branch '{branch}' does not exist in repository.")
        return False

    # Check if tag exists
    tag_check = run_command(f"git tag -l {tag}")
    if tag_check.returncode != 0 or not tag_check.stdout.strip():
        print(f"Error: Tag '{tag}' does not exist in repository.")
        return False

    # Checkout the branch
    print(f"Checking out branch '{branch}'...")
    checkout_result = run_command(f"git checkout -b {branch} origin/{branch}")
    if checkout_result.returncode != 0:
        return False

    # Try merging the tag in a dry-run to check if it can be merged cleanly
    print(f"Checking if {tag} can be merged cleanly into {branch}...")
    merge_check = run_command(f"git merge --no-commit --no-ff {tag}", check=False)
    if merge_check.returncode != 0:
        print(f"Cannot merge {tag} cleanly into {branch}.")
        print("Merge conflicts would occur. Aborting merge.")
        run_command("git merge --abort")
        return False

    # Abort the test merge
    run_command("git reset --hard HEAD")

    # Now perform the actual merge and push to remote
    print(f"Merging {tag} into {branch}...")
    merge_result = run_command(f"git merge {tag} --no-edit")
    if merge_result.returncode != 0:
        print(f"Failed to merge {tag} into {branch}.")
        return False

    print(f"Pushing changes to remote...")
    push_result = run_command(f"git push origin {branch}")
    if push_result.returncode != 0:
        print(f"Failed to push changes to remote.")
        return False

    print(f"Successfully merged {tag} into {branch} and pushed to remote.")
    return True

def main():
    parser = argparse.ArgumentParser(
        description="Merge a tag into a branch on a GitHub repository.",
        formatter_class=argparse.RawDescriptionHelpFormatter,
        epilog="""
Examples:
  %(prog)s leanprover/mathlib4 stable v4.6.0    Merge tag v4.6.0 into stable branch

The script will:
  1. Clone the repository
  2. Check if the tag and branch exist
  3. Check if the tag can be merged cleanly into the branch
  4. Perform the merge and push to remote if possible
"""
    )
    parser.add_argument("repo", help="GitHub repository in the format 'organization/repository'")
    parser.add_argument("branch", help="The target branch to merge into")
    parser.add_argument("tag", help="The tag to merge from")
    args = parser.parse_args()

    # Create a temporary directory for the repository
    temp_dir = tempfile.mkdtemp()
    try:
        # Clone the repository
        if not clone_repo(args.repo, temp_dir):
            sys.exit(1)
        # Check if the tag can be merged and perform the merge
        if not check_and_merge(args.repo, args.branch, args.tag, temp_dir):
            sys.exit(1)
    finally:
        # Clean up the temporary directory
        print(f"Cleaning up temporary files...")
        shutil.rmtree(temp_dir)

if __name__ == "__main__":
    main()

View File

@@ -7,6 +7,13 @@ import base64
import subprocess
import sys
import os
# Import run_command from merge_remote.py
from merge_remote import run_command
def debug(verbose, message):
"""Print debug message if verbose mode is enabled."""
if verbose:
print(f" [DEBUG] {message}")
def parse_repos_config(file_path):
with open(file_path, "r") as f:
@@ -90,7 +97,7 @@ def is_version_gte(version1, version2):
return False
return parse_version(version1) >= parse_version(version2)
def is_merged_into_stable(repo_url, tag_name, stable_branch, github_token):
def is_merged_into_stable(repo_url, tag_name, stable_branch, github_token, verbose=False):
# First get the commit SHA for the tag
api_base = repo_url.replace("https://github.com/", "https://api.github.com/repos/")
headers = {'Authorization': f'token {github_token}'} if github_token else {}
@@ -98,6 +105,7 @@ def is_merged_into_stable(repo_url, tag_name, stable_branch, github_token):
# Get tag's commit SHA
tag_response = requests.get(f"{api_base}/git/refs/tags/{tag_name}", headers=headers)
if tag_response.status_code != 200:
debug(verbose, f"Could not fetch tag {tag_name}, status code: {tag_response.status_code}")
return False
# Handle both single object and array responses
@@ -106,22 +114,48 @@ def is_merged_into_stable(repo_url, tag_name, stable_branch, github_token):
# Find the exact matching tag in the list
matching_tags = [tag for tag in tag_data if tag['ref'] == f'refs/tags/{tag_name}']
if not matching_tags:
debug(verbose, f"No matching tag found for {tag_name} in response list")
return False
tag_sha = matching_tags[0]['object']['sha']
else:
tag_sha = tag_data['object']['sha']
# Check if the tag is an annotated tag and get the actual commit SHA
if tag_data.get('object', {}).get('type') == 'tag' or (
isinstance(tag_data, list) and
matching_tags and
matching_tags[0].get('object', {}).get('type') == 'tag'):
# Get the commit that this tag points to
tag_obj_response = requests.get(f"{api_base}/git/tags/{tag_sha}", headers=headers)
if tag_obj_response.status_code == 200:
tag_obj = tag_obj_response.json()
if 'object' in tag_obj and tag_obj['object']['type'] == 'commit':
commit_sha = tag_obj['object']['sha']
debug(verbose, f"Tag is annotated. Resolved commit SHA: {commit_sha}")
tag_sha = commit_sha # Use the actual commit SHA
# Get commits on stable branch containing this SHA
commits_response = requests.get(
f"{api_base}/commits?sha={stable_branch}&per_page=100",
headers=headers
)
if commits_response.status_code != 200:
debug(verbose, f"Could not fetch commits for branch {stable_branch}, status code: {commits_response.status_code}")
return False
# Check if any commit in stable's history matches our tag's SHA
stable_commits = [commit['sha'] for commit in commits_response.json()]
return tag_sha in stable_commits
is_merged = tag_sha in stable_commits
debug(verbose, f"Tag SHA: {tag_sha}")
debug(verbose, f"First 5 stable commits: {stable_commits[:5]}")
debug(verbose, f"Total stable commits fetched: {len(stable_commits)}")
if not is_merged:
debug(verbose, f"Tag SHA not found in first {len(stable_commits)} commits of stable branch")
return is_merged
def is_release_candidate(version):
return "-rc" in version
@@ -195,13 +229,17 @@ def pr_exists_with_title(repo_url, title, github_token):
return None
def main():
parser = argparse.ArgumentParser(description="Check release status of Lean4 repositories")
parser.add_argument("toolchain", help="The toolchain version to check (e.g., v4.6.0)")
parser.add_argument("--verbose", "-v", action="store_true", help="Enable verbose debugging output")
parser.add_argument("--dry-run", action="store_true", help="Dry run mode (no actions taken)")
args = parser.parse_args()
github_token = get_github_token()
if len(sys.argv) != 2:
print("Usage: python3 release_checklist.py <toolchain>")
sys.exit(1)
toolchain = sys.argv[1]
toolchain = args.toolchain
verbose = args.verbose
# dry_run = args.dry_run # Not used yet but available for future implementation
stripped_toolchain = strip_rc_suffix(toolchain)
lean_repo_url = "https://github.com/leanprover/lean4"
@@ -256,6 +294,7 @@ def main():
for repo in repos:
name = repo["name"]
url = repo["url"]
org_repo = extract_org_repo_from_url(url)
branch = repo["branch"]
check_stable = repo["stable-branch"]
check_tag = repo.get("toolchain-tag", True)
@@ -291,20 +330,38 @@ def main():
print(f" ✅ PR with title '{pr_title}' exists: #{pr_number} ({pr_url})")
else:
print(f" ❌ PR with title '{pr_title}' does not exist")
print(f" Run `script/release_steps.py {toolchain} {name}` to create it")
repo_status[name] = False
continue
print(f" ✅ On compatible toolchain (>= {toolchain})")
if check_tag:
if not tag_exists(url, toolchain, github_token):
print(f" ❌ Tag {toolchain} does not exist. Run `script/push_repo_release_tag.py {extract_org_repo_from_url(url)} {branch} {toolchain}`.")
repo_status[name] = False
continue
tag_exists_initially = tag_exists(url, toolchain, github_token)
if not tag_exists_initially:
if args.dry_run:
print(f" ❌ Tag {toolchain} does not exist. Run `script/push_repo_release_tag.py {org_repo} {branch} {toolchain}`.")
repo_status[name] = False
continue
else:
print(f" … Tag {toolchain} does not exist. Running `script/push_repo_release_tag.py {org_repo} {branch} {toolchain}`...")
# Run the script to create the tag
subprocess.run(["script/push_repo_release_tag.py", org_repo, branch, toolchain])
# Check again if the tag exists now
if not tag_exists(url, toolchain, github_token):
print(f" ❌ Manual intervention required.")
repo_status[name] = False
continue
# This will print in all successful cases - whether tag existed initially or was created successfully
print(f" ✅ Tag {toolchain} exists")
if check_stable and not is_release_candidate(toolchain):
if not is_merged_into_stable(url, toolchain, "stable", github_token):
if not is_merged_into_stable(url, toolchain, "stable", github_token, verbose):
org_repo = extract_org_repo_from_url(url)
print(f" ❌ Tag {toolchain} is not merged into stable")
print(f" Run `script/merge_remote.py {org_repo} stable {toolchain}` to merge it")
repo_status[name] = False
continue
print(f" ✅ Tag {toolchain} is merged into stable")
@@ -313,9 +370,16 @@ def main():
next_version = get_next_version(toolchain)
bump_branch = f"bump/{next_version}"
if not branch_exists(url, bump_branch, github_token):
print(f" ❌ Bump branch {bump_branch} does not exist")
repo_status[name] = False
continue
if args.dry_run:
print(f" ❌ Bump branch {bump_branch} does not exist. Run `gh api -X POST /repos/{org_repo}/git/refs -f ref=refs/heads/{bump_branch} -f sha=$(gh api /repos/{org_repo}/git/refs/heads/{branch} --jq .object.sha)` to create it.")
repo_status[name] = False
continue
print(f" … Bump branch {bump_branch} does not exist. Creating it...")
result = run_command(f"gh api -X POST /repos/{org_repo}/git/refs -f ref=refs/heads/{bump_branch} -f sha=$(gh api /repos/{org_repo}/git/refs/heads/{branch} --jq .object.sha)", check=False)
if result.returncode != 0:
print(f" ❌ Failed to create bump branch {bump_branch}")
repo_status[name] = False
continue
print(f" ✅ Bump branch {bump_branch} exists")
if not check_bump_branch_toolchain(url, bump_branch, github_token):
repo_status[name] = False

View File

@@ -63,7 +63,9 @@ repositories:
toolchain-tag: true
stable-branch: false
branch: main
dependencies: []
dependencies:
- Cli
- Batteries
- name: plausible
url: https://github.com/leanprover-community/plausible
@@ -85,6 +87,7 @@ repositories:
- Batteries
- doc-gen4
- import-graph
- plausible
- name: REPL
url: https://github.com/leanprover-community/repl

154
script/release_steps.py Executable file
View File

@@ -0,0 +1,154 @@
#!/usr/bin/env python3
"""
Generate release steps script for Lean4 repositories.
This script helps automate the release process for Lean4 and its dependent repositories
by generating step-by-step instructions for updating toolchains, creating tags,
and managing branches.
Usage:
python3 release_steps.py <version> <repo>
Arguments:
version: The version to set in the lean-toolchain file (e.g., v4.6.0)
repo: A substring of the repository name as specified in release_repos.yml
Example:
python3 release_steps.py v4.6.0 mathlib
python3 release_steps.py v4.6.0 batt
The script reads repository configurations from release_repos.yml in the same directory.
Each repository may have specific requirements for:
- Branch management
- Toolchain updates
- Dependency updates
- Tagging conventions
- Stable branch handling
"""
import argparse
import yaml
import os
import sys
import re
def load_repos_config(file_path):
with open(file_path, "r") as f:
return yaml.safe_load(f)["repositories"]
def find_repo(repo_substring, config):
pattern = re.compile(re.escape(repo_substring), re.IGNORECASE)
matching_repos = [r for r in config if pattern.search(r["name"])]
if not matching_repos:
print(f"Error: No repository matching '{repo_substring}' found in configuration.")
sys.exit(1)
if len(matching_repos) > 1:
print(f"Error: Multiple repositories matching '{repo_substring}' found in configuration: {', '.join(r['name'] for r in matching_repos)}")
sys.exit(1)
return matching_repos[0]
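The substring resolution used by `find_repo` can be illustrated with a small standalone sketch (the config entries here are hypothetical stand-ins for `release_repos.yml`; the matching rule, an escaped substring searched case-insensitively, is the same):

```python
import re

# Hypothetical repository names, standing in for release_repos.yml entries.
config = [{"name": "Batteries"}, {"name": "Mathlib"}, {"name": "REPL"}]

def matching_names(substring):
    # Same rule as find_repo: escape the substring, search case-insensitively.
    pattern = re.compile(re.escape(substring), re.IGNORECASE)
    return [r["name"] for r in config if pattern.search(r["name"])]

print(matching_names("batt"))  # unique match: ['Batteries']
print(matching_names("repl"))  # case-insensitive: ['REPL']
```

`find_repo` then errors out unless exactly one name matches, so short prefixes like `batt` or `mathlib` are the intended inputs.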
def generate_script(repo, version, config):
repo_config = find_repo(repo, config)
repo_name = repo_config['name']
repo_url = repo_config['url']
# Extract the last component of the URL, removing the .git extension if present
repo_dir = repo_url.split('/')[-1].replace('.git', '')
default_branch = repo_config.get("branch", "main")
dependencies = repo_config.get("dependencies", [])
requires_tagging = repo_config.get("toolchain-tag", True)
has_stable_branch = repo_config.get("stable-branch", True)
script_lines = [
f"cd {repo_dir}",
"git fetch",
f"git checkout {default_branch} && git pull",
f"git checkout -b bump_to_{version}",
f"echo leanprover/lean4:{version} > lean-toolchain",
]
# Special cases for specific repositories
if repo_name == "REPL":
script_lines.extend([
"lake update",
"cd test/Mathlib",
f"perl -pi -e 's/rev = \"v\\d+\\.\\d+\\.\\d+(-rc\\d+)?\"/rev = \"{version}\"/g' lakefile.toml",
f"echo leanprover/lean4:{version} > lean-toolchain",
"lake update",
"cd ../..",
"./test.sh"
])
elif dependencies:
script_lines.append('echo "Please update the dependencies in lakefile.{lean,toml}"')
script_lines.append("lake update")
script_lines.append("")
script_lines.extend([
f'git commit -am "chore: bump toolchain to {version}"',
""
])
if re.search(r'rc\d+$', version) and repo_name in ["Batteries", "Mathlib"]:
script_lines.extend([
"echo 'This repo has nightly-testing infrastructure'",
f"git merge origin/bump/{version.split('-rc')[0]}",
"echo 'Please resolve any conflicts.'",
""
])
if repo_name != "Mathlib":
script_lines.extend([
"lake build && if lake check-test; then lake test; fi",
""
])
script_lines.extend([
'gh pr create --title "chore: bump toolchain to ' + version + '" --body ""',
"echo 'Please review the PR and merge it.'",
""
])
# Special cases for specific repositories
if repo_name == "ProofWidgets4":
script_lines.append(f"echo 'Note: Follow the version convention of the repository for tagging.'")
elif requires_tagging:
script_lines.append(f"git checkout {default_branch} && git pull")
script_lines.append(f'[ "$(cat lean-toolchain)" = "leanprover/lean4:{version}" ] && git tag -a {version} -m \'Release {version}\' && git push origin --tags || echo "Error: lean-toolchain does not contain expected version {version}"')
if has_stable_branch:
script_lines.extend([
"git checkout stable && git pull",
f"git merge {version} --no-edit",
"git push origin stable"
])
return "\n".join(script_lines)
def main():
parser = argparse.ArgumentParser(
description="Generate release steps script for Lean4 repositories.",
formatter_class=argparse.RawDescriptionHelpFormatter,
epilog="""
Examples:
%(prog)s v4.6.0 mathlib Generate steps for updating Mathlib to v4.6.0
%(prog)s v4.6.0 batt Generate steps for updating Batteries to v4.6.0
The script will generate shell commands to:
1. Update the lean-toolchain file
2. Create appropriate branches and commits
3. Create pull requests
4. Create version tags
5. Update stable branches where applicable"""
)
parser.add_argument("version", help="The version to set in the lean-toolchain file (e.g., v4.6.0)")
parser.add_argument("repo", help="A substring of the repository name as specified in release_repos.yml")
args = parser.parse_args()
config_path = os.path.join(os.path.dirname(__file__), "release_repos.yml")
config = load_repos_config(config_path)
script = generate_script(args.repo, args.version, config)
print(script)
if __name__ == "__main__":
main()

View File

@@ -10,7 +10,7 @@ endif()
include(ExternalProject)
project(LEAN CXX C)
set(LEAN_VERSION_MAJOR 4)
set(LEAN_VERSION_MINOR 19)
set(LEAN_VERSION_MINOR 20)
set(LEAN_VERSION_PATCH 0)
set(LEAN_VERSION_IS_RELEASE 0) # This number is 1 in the release revision, and 0 otherwise.
set(LEAN_SPECIAL_VERSION_DESC "" CACHE STRING "Additional version description like 'nightly-2018-03-11'")

View File

@@ -3893,7 +3893,7 @@ theorem all_map {xs : Array α} {p : β → Bool} : (xs.map f).all p = xs.all (p
/-- Variant of `all_filter` with a side condition for the `stop` argument. -/
@[simp] theorem all_filter' {xs : Array α} {p q : α → Bool} (w : stop = (xs.filter p).size) :
(xs.filter p).all q 0 stop = xs.all fun a => p a → q a := by
(xs.filter p).all q 0 stop = xs.all fun a => !(p a) || q a := by
subst w
rcases xs with ⟨xs⟩
rw [List.filter_toArray]
@@ -3904,7 +3904,7 @@ theorem any_filter {xs : Array α} {p q : α → Bool} :
simp
theorem all_filter {xs : Array α} {p q : α → Bool} :
(xs.filter p).all q 0 = xs.all fun a => p a → q a := by
(xs.filter p).all q 0 = xs.all fun a => !(p a) || q a := by
simp
/-- Variant of `any_filterMap` with a side condition for the `stop` argument. -/

View File

@@ -760,4 +760,19 @@ def reverse : {w : Nat} → BitVec w → BitVec w
| 0, x => x
| w + 1, x => concat (reverse (x.truncate w)) (x.msb)
/-- `umulOverflow x y` returns `true` if multiplying `x` and `y` results in *unsigned* overflow.
SMT-Lib name: `bvumulo`.
-/
def umulOverflow {w : Nat} (x y : BitVec w) : Bool := x.toNat * y.toNat ≥ 2 ^ w
/-- `smulOverflow x y` returns `true` if multiplying `x` and `y` results in *signed* overflow,
treating `x` and `y` as 2's complement signed bitvectors.
SMT-Lib name: `bvsmulo`.
-/
def smulOverflow {w : Nat} (x y : BitVec w) : Bool :=
(x.toInt * y.toInt ≥ 2 ^ (w - 1)) || (x.toInt * y.toInt < - 2 ^ (w - 1))
end BitVec
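As an informal sanity check (not part of the PR), the two overflow predicates above can be modeled in Python and compared exhaustively, for small widths, against "the full product is not representable in `w` bits". The helper names `to_int`, `umul_overflow`, and `smul_overflow` are hypothetical stand-ins for `BitVec.toInt`, `BitVec.umulOverflow`, and `BitVec.smulOverflow`:

```python
def to_int(x, w):
    # Two's-complement reinterpretation of the w-bit pattern x (models BitVec.toInt).
    return x - (1 << w) if w > 0 and x >= (1 << (w - 1)) else x

def umul_overflow(x, y, w):
    # Models BitVec.umulOverflow: x.toNat * y.toNat >= 2 ^ w
    return x * y >= (1 << w)

def smul_overflow(x, y, w):
    # Models BitVec.smulOverflow: the signed product leaves [-2^(w-1), 2^(w-1) - 1]
    p = to_int(x, w) * to_int(y, w)
    return p >= (1 << (w - 1)) or p < -(1 << (w - 1))

# Overflow holds exactly when the full product is not representable in w bits.
for w in range(1, 6):
    for x in range(1 << w):
        for y in range(1 << w):
            assert umul_overflow(x, y, w) == (x * y != (x * y) % (1 << w))
            p = to_int(x, w) * to_int(y, w)
            assert smul_overflow(x, y, w) == (not -(1 << (w - 1)) <= p < (1 << (w - 1)))
```

This mirrors the content of `umulOverflow_eq` and `smulOverflow_eq` below, which restate the predicates via multiplication at the doubled width.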

View File

@@ -7,6 +7,7 @@ prelude
import Init.Data.BitVec.Folds
import Init.Data.Nat.Mod
import Init.Data.Int.LemmasAux
import Init.Data.BitVec.Lemmas
/-!
# Bit blasting of bitvectors
@@ -1358,6 +1359,31 @@ theorem negOverflow_eq {w : Nat} (x : BitVec w) :
simp only [toInt_intMin, Nat.add_one_sub_one, Int.ofNat_emod, Int.neg_inj]
rw_mod_cast [Nat.mod_eq_of_lt (by simp [Nat.pow_lt_pow_succ])]
theorem umulOverflow_eq {w : Nat} (x y : BitVec w) :
umulOverflow x y =
(0 < w && BitVec.twoPow (w * 2) w ≤ x.zeroExtend (w * 2) * y.zeroExtend (w * 2)) := by
simp only [umulOverflow, toNat_twoPow, le_def, toNat_mul, toNat_setWidth, mod_mul_mod]
rcases w with _|w
· simp [of_length_zero, toInt_zero, mul_mod_mod]
· simp only [ge_iff_le, show 0 < w + 1 by omega, decide_true, mul_mod_mod, Bool.true_and,
decide_eq_decide]
rw [Nat.mod_eq_of_lt BitVec.toNat_mul_toNat_lt, Nat.mod_eq_of_lt]
have := Nat.pow_lt_pow_of_lt (a := 2) (n := w + 1) (m := (w + 1) * 2)
omega
theorem smulOverflow_eq {w : Nat} (x y : BitVec w) :
smulOverflow x y =
(0 < w &&
((signExtend (w * 2) (intMax w)).slt (signExtend (w * 2) x * signExtend (w * 2) y) ||
(signExtend (w * 2) x * signExtend (w * 2) y).slt (signExtend (w * 2) (intMin w)))) := by
simp only [smulOverflow]
rcases w with _|w
· simp [of_length_zero, toInt_zero]
· have h₁ := BitVec.two_pow_le_toInt_mul_toInt_iff (x := x) (y := y)
have h₂ := BitVec.toInt_mul_toInt_lt_neg_two_pow_iff (x := x) (y := y)
simp only [Nat.add_one_sub_one] at h₁ h₂
simp [h₁, h₂]
/- ### umod -/
theorem getElem_umod {n d : BitVec w} (hi : i < w) :
@@ -1752,4 +1778,22 @@ theorem extractLsb'_mul {w len} {x y : BitVec w} (hlen : len ≤ w) :
(x * y).extractLsb' 0 len = (x.extractLsb' 0 len) * (y.extractLsb' 0 len) := by
simp [← setWidth_eq_extractLsb' hlen, setWidth_mul _ _ hlen]
/-- Adding bitvectors that are zero in complementary positions equals concatenation. -/
theorem append_add_append_eq_append {v w : Nat} {x : BitVec v} {y : BitVec w} :
(x ++ 0#w) + (0#v ++ y) = x ++ y := by
rw [add_eq_or_of_and_eq_zero] <;> ext i <;> simp
/-- Heuristically, `y <<< x` is much larger than `x`,
so the low bits of `y <<< x` are zero. Thus, `x + (y <<< x) = x ||| (y <<< x)`. -/
theorem add_shifLeft_eq_or_shiftLeft {x y : BitVec w} :
x + (y <<< x) = x ||| (y <<< x) := by
rw [add_eq_or_of_and_eq_zero]
ext i hi
simp only [shiftLeft_eq', getElem_and, getElem_shiftLeft, getElem_zero, and_eq_false_imp,
not_eq_eq_eq_not, Bool.not_true, decide_eq_false_iff_not, Nat.not_lt]
intros hxi hxval
have : 2^i ≤ x.toNat := two_pow_le_toNat_of_getElem_eq_true hi hxi
have : i < 2^i := by exact Nat.lt_two_pow_self
omega
end BitVec
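The identity proved above can also be brute-force checked for one small width in Python (a sketch; `W` is an arbitrary test width, and the shift amount is `x` itself, as in the Lean statement):

```python
W = 5
MASK = (1 << W) - 1
for x in range(1 << W):
    for y in range(1 << W):
        s = (y << x) & MASK  # y <<< x, truncated to W bits
        # every set bit i of x satisfies i < 2^i <= x, a position where s is
        # zero, so the addition produces no carries and agrees with bitwise or
        assert (x + s) & MASK == (x | s)
print("ok")
```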

View File

@@ -136,6 +136,12 @@ protected theorem toNat_lt_twoPow_of_le (h : m ≤ n) {x : BitVec m} :
theorem testBit_toNat (x : BitVec w) : x.toNat.testBit i = x.getLsbD i := rfl
theorem two_pow_le_toNat_of_getElem_eq_true {i : Nat} {x : BitVec w}
(hi : i < w) (hx : x[i] = true) : 2^i ≤ x.toNat := by
apply Nat.testBit_implies_ge
rw [← getElem_eq_testBit_toNat x i hi]
exact hx
theorem getMsb'_eq_getLsb' (x : BitVec w) (i : Fin w) :
x.getMsb' i = x.getLsb' ⟨w - 1 - i, by omega⟩ := by
simp only [getMsb', getLsb']
@@ -594,6 +600,14 @@ theorem toInt_nonneg_of_msb_false {x : BitVec w} (h : x.msb = false) : 0 ≤ x.t
have : 2 * x.toNat < 2 ^ w := msb_eq_false_iff_two_mul_lt.mp h
omega
@[simp] theorem toInt_one_of_lt {w : Nat} (h : 1 < w) : (1#w).toInt = 1 := by
rw [toInt_eq_msb_cond]
simp only [msb_one, show w ≠ 1 by omega, decide_false, Bool.false_eq_true, reduceIte,
toNat_ofNat, Int.ofNat_emod]
norm_cast
apply Nat.mod_eq_of_lt
apply Nat.one_lt_two_pow (by omega)
/-- Prove equality of bitvectors in terms of nat operations. -/
theorem eq_of_toInt_eq {x y : BitVec n} : x.toInt = y.toInt → x = y := by
intro eq
@@ -3339,6 +3353,14 @@ theorem sub_toAdd {n} (x y : BitVec n) : x - y = x + - y := by
simp only [toNat_sub, toNat_add, toNat_neg, Nat.add_mod_mod]
rw [Nat.add_comm]
theorem add_left_neg (x : BitVec w) : -x + x = 0#w := by
apply toInt_inj.mp
simp [toInt_neg, Int.add_left_neg]
theorem add_right_neg (x : BitVec w) : x + -x = 0#w := by
rw [BitVec.add_comm]
exact add_left_neg x
@[simp] theorem neg_zero (n:Nat) : -BitVec.ofNat n 0 = BitVec.ofNat n 0 := by apply eq_of_toNat_eq ; simp
theorem add_sub_cancel (x y : BitVec w) : x + y - y = x := by
@@ -4568,6 +4590,30 @@ theorem udiv_twoPow_eq_of_lt {w : Nat} {x : BitVec w} {k : Nat} (hk : k < w) : x
have : 2^k < 2^w := Nat.pow_lt_pow_of_lt (by decide) hk
simp [bitvec_to_nat, Nat.shiftRight_eq_div_pow, Nat.mod_eq_of_lt this]
theorem toInt_mul_toInt_le {x y : BitVec w} : x.toInt * y.toInt ≤ 2 ^ (w * 2 - 2) := by
rcases w with _|w
· simp [of_length_zero]
· have xlt := two_mul_toInt_lt (x := x); have xle := le_two_mul_toInt (x := x)
have ylt := two_mul_toInt_lt (x := y); have yle := le_two_mul_toInt (x := y)
have h : 2 ^ ((w + 1) * 2 - 2) = 2 ^ ((w + 1) - 1) * 2 ^ ((w + 1) - 1) := by
rw [← Nat.pow_add, Nat.mul_two, Nat.mul_comm (m := 2) (n := (w + 1) - 1),
Nat.mul_sub_one, Nat.mul_comm]
rw_mod_cast [h]
rw [← Nat.two_pow_pred_mul_two (by omega), Int.natCast_mul] at xlt ylt xle yle
exact Int.mul_le_mul_of_natAbs_le (by omega) (by omega)
theorem le_toInt_mul_toInt {x y : BitVec w} : - (2 ^ (w * 2 - 2)) ≤ x.toInt * y.toInt := by
rcases w with _|w
· simp [of_length_zero]
· have xlt := two_mul_toInt_lt (x := x); have xle := le_two_mul_toInt (x := x)
have ylt := two_mul_toInt_lt (x := y); have yle := le_two_mul_toInt (x := y)
have h : 2 ^ ((w + 1) * 2 - 2) = 2 ^ ((w + 1) - 1) * 2 ^ ((w + 1) - 1) := by
rw [← Nat.pow_add, Nat.mul_two, Nat.mul_comm (m := 2) (n := (w + 1) - 1),
Nat.mul_sub_one, Nat.mul_comm]
rw_mod_cast [h]
rw [← Nat.two_pow_pred_mul_two (by omega), Int.natCast_mul] at xlt ylt xle yle
exact Int.neg_mul_le_mul (by omega) (by omega) (by omega) (by omega)
theorem shiftLeft_neg {x : BitVec w} {y : Nat} :
(-x) <<< y = - (x <<< y) := by
rw [shiftLeft_eq_mul_twoPow, shiftLeft_eq_mul_twoPow, BitVec.neg_mul]
@@ -4931,6 +4977,10 @@ theorem toNat_mul_of_lt {w} {x y : BitVec w} (h : x.toNat * y.toNat < 2^w) :
(x * y).toNat = x.toNat * y.toNat := by
rw [BitVec.toNat_mul, Nat.mod_eq_of_lt h]
theorem toNat_mul_toNat_lt {x y : BitVec w} : x.toNat * y.toNat < 2 ^ (w * 2) := by
have := BitVec.isLt x; have := BitVec.isLt y
simp only [Nat.mul_two, Nat.pow_add]
exact Nat.mul_lt_mul_of_le_of_lt (by omega) (by omega) (by omega)
/--
`x ≤ y + z` if and only if `x - z ≤ y`
@@ -4955,6 +5005,41 @@ theorem sub_le_sub_iff_le {x y z : BitVec w} (hxz : z ≤ x) (hyz : z ≤ y) :
BitVec.toNat_sub_of_le (by rw [BitVec.le_def]; omega)]
omega
theorem two_pow_le_toInt_mul_toInt_iff {x y : BitVec w} :
2 ^ (w - 1) ≤ x.toInt * y.toInt ↔
(signExtend (w * 2) (intMax w)).slt (signExtend (w * 2) x * signExtend (w * 2) y) := by
rcases w with _|w
· simp [of_length_zero]
· have := Int.pow_lt_pow_of_lt (a := 2) (b := (w + 1) * 2 - 2) (c := (w + 1) * 2 - 1) (by omega)
have := @BitVec.le_toInt_mul_toInt (w + 1) x y
have := @BitVec.toInt_mul_toInt_le (w + 1) x y
simp only [Nat.add_one_sub_one, BitVec.slt, intMax, ofNat_eq_ofNat, toInt_mul,
decide_eq_true_eq]
repeat rw [BitVec.toInt_signExtend_of_le (by omega)]
simp only [show BitVec.twoPow (w + 1) w - 1#(w + 1) = BitVec.intMax (w + 1) by simp [intMax],
toInt_intMax, Nat.add_one_sub_one]
push_cast
rw [← Nat.two_pow_pred_add_two_pow_pred (by omega),
Int.bmod_eq_self_of_le_mul_two (by rw [← Nat.mul_two]; push_cast; omega)
(by rw [← Nat.mul_two]; push_cast; omega)]
omega
theorem toInt_mul_toInt_lt_neg_two_pow_iff {x y : BitVec w} :
x.toInt * y.toInt < - 2 ^ (w - 1) ↔
(signExtend (w * 2) x * signExtend (w * 2) y).slt (signExtend (w * 2) (intMin w)) := by
rcases w with _|w
· simp [of_length_zero]
· have := Int.pow_lt_pow_of_lt (a := 2) (b := (w + 1) * 2 - 2) (c := (w + 1) * 2 - 1) (by omega)
have := @BitVec.le_toInt_mul_toInt (w + 1) x y
have := @BitVec.toInt_mul_toInt_le (w + 1) x y
simp only [BitVec.slt, toInt_mul, intMin, Nat.add_one_sub_one, decide_eq_true_eq]
repeat rw [BitVec.toInt_signExtend_of_le (by omega)]
simp only [toInt_twoPow, show ¬w + 1 ≤ w by omega, reduceIte]
push_cast
rw [← Nat.two_pow_pred_add_two_pow_pred (by omega),
Int.bmod_eq_self_of_le_mul_two (by rw [← Nat.mul_two]; push_cast; omega)
(by rw [← Nat.mul_two]; push_cast; omega)]
/-! ### neg -/
theorem msb_eq_toInt {x : BitVec w}:

View File

@@ -151,4 +151,59 @@ theorem bmod_bmod_of_dvd {a : Int} {n m : Nat} (hnm : n ∣ m) :
obtain ⟨k, rfl⟩ := hnm
simp [Int.mul_assoc]
theorem bmod_eq_self_of_le_mul_two {x : Int} {y : Nat} (hle : -y ≤ x * 2) (hlt : x * 2 < y) :
x.bmod y = x := by
apply bmod_eq_self_of_le (by omega) (by omega)
theorem mul_le_mul_of_natAbs_le {x y : Int} {s t : Nat} (hx : x.natAbs ≤ s) (hy : y.natAbs ≤ t) :
x * y ≤ s * t := by
by_cases 0 < s ∧ 0 < t
· have := Nat.mul_pos (n := s) (m := t) (by omega) (by omega)
by_cases hx : 0 < x <;> by_cases hy : 0 < y
· apply Int.mul_le_mul <;> omega
· have : x * y ≤ 0 := Int.mul_nonpos_of_nonneg_of_nonpos (by omega) (by omega); omega
· have : x * y ≤ 0 := Int.mul_nonpos_of_nonpos_of_nonneg (by omega) (by omega); omega
· have : -x * -y ≤ s * t := Int.mul_le_mul (by omega) (by omega) (by omega) (by omega)
simp [Int.neg_mul_neg] at this
norm_cast
· have : (x = 0 ∨ y = 0) → x * y = 0 := by simp [Int.mul_eq_zero]
norm_cast
omega
/--
This is a generalization of `a ≤ c` and `b ≤ d` implying `a * b ≤ c * d` for natural numbers,
appropriately generalized to integers when `b` is nonnegative and `c` is nonpositive.
-/
theorem mul_le_mul_of_le_of_le_of_nonneg_of_nonpos {a b c d : Int}
(hac : a ≤ c) (hbd : d ≤ b) (hb : 0 ≤ b) (hc : c ≤ 0) : a * b ≤ c * d :=
Int.le_trans (Int.mul_le_mul_of_nonneg_right hac hb) (Int.mul_le_mul_of_nonpos_left hc hbd)
theorem mul_le_mul_of_le_of_le_of_nonneg_of_nonneg {a b c d : Int}
(hac : a ≤ c) (hbd : b ≤ d) (hb : 0 ≤ b) (hc : 0 ≤ c) : a * b ≤ c * d :=
Int.le_trans (Int.mul_le_mul_of_nonneg_right hac hb) (Int.mul_le_mul_of_nonneg_left hbd hc)
theorem mul_le_mul_of_le_of_le_of_nonpos_of_nonpos {a b c d : Int}
(hac : c ≤ a) (hbd : d ≤ b) (hb : b ≤ 0) (hc : c ≤ 0) : a * b ≤ c * d :=
Int.le_trans (Int.mul_le_mul_of_nonpos_right hac hb) (Int.mul_le_mul_of_nonpos_left hc hbd)
theorem mul_le_mul_of_le_of_le_of_nonpos_of_nonneg {a b c d : Int}
(hac : c ≤ a) (hbd : b ≤ d) (hb : b ≤ 0) (hc : 0 ≤ c) : a * b ≤ c * d :=
Int.le_trans (Int.mul_le_mul_of_nonpos_right hac hb) (Int.mul_le_mul_of_nonneg_left hbd hc)
/--
If `-s ≤ x < s` and `-t ≤ y < t`, then `-(s * t) ≤ x * y`.
-/
theorem neg_mul_le_mul {x y : Int} {s t : Nat} (lbx : -s ≤ x) (ubx : x < s) (lby : -t ≤ y) (uby : y < t) :
-(s * t) ≤ x * y := by
have := Nat.mul_pos (n := s) (m := t) (by omega) (by omega)
by_cases 0 ≤ x <;> by_cases 0 ≤ y
· have : 0 ≤ x * y := by apply Int.mul_nonneg <;> omega
norm_cast
omega
· rw [Int.mul_comm (a := x), Int.mul_comm (a := (s : Int)), Int.neg_mul]; apply Int.mul_le_mul_of_le_of_le_of_nonneg_of_nonpos <;> omega
· rw [← Int.neg_mul]; apply Int.mul_le_mul_of_le_of_le_of_nonneg_of_nonpos <;> omega
· have : 0 < x * y := by apply Int.mul_pos_of_neg_of_neg <;> omega
norm_cast
omega
end Int
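The two product bounds above (`mul_le_mul_of_natAbs_le` and `neg_mul_le_mul`) admit a quick exhaustive numeric check over a small range; this Python loop is a spot-check of the statements, not a proof:

```python
# mul_le_mul_of_natAbs_le: |x| <= s and |y| <= t imply x * y <= s * t
for s in range(1, 8):
    for t in range(1, 8):
        for x in range(-s, s + 1):
            for y in range(-t, t + 1):
                assert x * y <= s * t

# neg_mul_le_mul: -s <= x < s and -t <= y < t imply -(s * t) <= x * y
for s in range(1, 8):
    for t in range(1, 8):
        for x in range(-s, s):
            for y in range(-t, t):
                assert -(s * t) <= x * y
print("ok")
```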

View File

@@ -1796,6 +1796,29 @@ theorem of_not_dvd (a b : Int) : a != 0 → ¬ (a ∣ b) → b % a > 0 := by
simp [h₁] at h₂
assumption
def le_of_le_cert (p q : Poly) (k : Nat) : Bool :=
q == p.addConst (- k)
theorem le_of_le (ctx : Context) (p q : Poly) (k : Nat)
: le_of_le_cert p q k → p.denote' ctx ≤ 0 → q.denote' ctx ≤ 0 := by
simp [le_of_le_cert]; intro; subst q; simp
intro h
simp [Lean.Omega.Int.add_le_zero_iff_le_neg']
exact Int.le_trans h (Int.ofNat_zero_le _)
def not_le_of_le_cert (p q : Poly) (k : Nat) : Bool :=
q == (p.mul (-1)).addConst (1 + k)
theorem not_le_of_le (ctx : Context) (p q : Poly) (k : Nat)
: not_le_of_le_cert p q k → p.denote' ctx ≤ 0 → ¬ q.denote' ctx ≤ 0 := by
simp [not_le_of_le_cert]; intro; subst q
intro h
apply Int.pos_of_neg_neg
apply Int.lt_of_add_one_le
simp [Int.neg_add, Int.neg_sub]
rw [← Int.add_assoc, Int.add_assoc, Int.add_neg_cancel_right, Lean.Omega.Int.add_le_zero_iff_le_neg']
simp; exact Int.le_trans h (Int.ofNat_zero_le _)
end Int.Linear
theorem Int.not_le_eq (a b : Int) : (¬a ≤ b) = (b + 1 ≤ a) := by

View File

@@ -56,4 +56,11 @@ protected theorem two_pow_pred_sub_two_pow' {w : Nat} (h : 0 < w) :
rw [← Nat.two_pow_pred_add_two_pow_pred h]
simp [h]
theorem pow_lt_pow_of_lt {a : Int} {b c : Nat} (ha : 1 < a) (hbc : b < c) :
a ^ b < a ^ c := by
rw [← Int.toNat_of_nonneg (a := a) (by omega), ← Int.natCast_pow, ← Int.natCast_pow]
have := Nat.pow_lt_pow_of_lt (a := a.toNat) (m := c) (n := b)
simp only [Int.ofNat_lt]
omega
end Int

View File

@@ -105,7 +105,7 @@ abbrev length_eq_zero := @length_eq_zero_iff
theorem eq_nil_iff_length_eq_zero : l = [] ↔ length l = 0 :=
length_eq_zero_iff.symm
theorem length_pos_of_mem {a : α} : ∀ {l : List α}, a ∈ l → 0 < length l
@[grind] theorem length_pos_of_mem {a : α} : ∀ {l : List α}, a ∈ l → 0 < length l
| _::_, _ => Nat.zero_lt_succ _
theorem exists_mem_of_length_pos : ∀ {l : List α}, 0 < length l → ∃ a, a ∈ l
@@ -185,7 +185,8 @@ theorem singleton_inj {α : Type _} {a b : α} : [a] = [b] ↔ a = b := by
We simplify `l.get i` to `l[i.1]'i.2` and `l.get? i` to `l[i]?`.
-/
@[simp] theorem get_eq_getElem {l : List α} {i : Fin l.length} : l.get i = l[i.1]'i.2 := rfl
@[simp, grind]
theorem get_eq_getElem {l : List α} {i : Fin l.length} : l.get i = l[i.1]'i.2 := rfl
set_option linter.deprecated false in
@[deprecated "Use `a[i]?` instead." (since := "2025-02-12")]
@@ -224,7 +225,8 @@ theorem get?_eq_getElem? {l : List α} {i : Nat} : l.get? i = l[i]? := by
We simplify `l[i]!` to `(l[i]?).getD default`.
-/
@[simp] theorem getElem!_eq_getElem?_getD [Inhabited α] {l : List α} {i : Nat} :
@[simp, grind]
theorem getElem!_eq_getElem?_getD [Inhabited α] {l : List α} {i : Nat} :
l[i]! = (l[i]?).getD (default : α) := by
simp only [getElem!_def]
match l[i]? with
@@ -233,16 +235,16 @@ We simplify `l[i]!` to `(l[i]?).getD default`.
/-! ### getElem? and getElem -/
@[simp] theorem getElem?_nil {i : Nat} : ([] : List α)[i]? = none := rfl
@[simp, grind] theorem getElem?_nil {i : Nat} : ([] : List α)[i]? = none := rfl
theorem getElem_cons {l : List α} (w : i < (a :: l).length) :
(a :: l)[i] =
if h : i = 0 then a else l[i-1]'(match i, h with | i+1, _ => succ_lt_succ_iff.mp w) := by
cases i <;> simp
theorem getElem?_cons_zero {l : List α} : (a::l)[0]? = some a := rfl
@[grind] theorem getElem?_cons_zero {l : List α} : (a::l)[0]? = some a := rfl
@[simp] theorem getElem?_cons_succ {l : List α} : (a::l)[i+1]? = l[i]? := rfl
@[simp, grind] theorem getElem?_cons_succ {l : List α} : (a::l)[i+1]? = l[i]? := rfl
theorem getElem?_cons : (a :: l)[i]? = if i = 0 then some a else l[i-1]? := by
cases i <;> simp [getElem?_cons_zero]
@@ -335,7 +337,8 @@ We simplify away `getD`, replacing `getD l n a` with `(l[n]?).getD a`.
Because of this, there is only minimal API for `getD`.
-/
@[simp] theorem getD_eq_getElem?_getD {l : List α} {i : Nat} {a : α} : getD l i a = (l[i]?).getD a := by
@[simp, grind]
theorem getD_eq_getElem?_getD {l : List α} {i : Nat} {a : α} : getD l i a = (l[i]?).getD a := by
simp [getD]
theorem getD_cons_zero : getD (x :: xs) 0 d = x := by simp
@@ -362,7 +365,7 @@ theorem get!_eq_getElem! [Inhabited α] (l : List α) (i) : l.get! i = l[i]! :=
@[simp] theorem not_mem_nil {a : α} : ¬ a ∈ [] := nofun
@[simp] theorem mem_cons : a ∈ (b :: l) ↔ a = b ∨ a ∈ l :=
@[simp] theorem mem_cons : a ∈ b :: l ↔ a = b ∨ a ∈ l :=
⟨fun h => by cases h <;> simp [Membership.mem, *],
fun | Or.inl rfl => by constructor | Or.inr h => by constructor; assumption⟩
@@ -683,7 +686,7 @@ theorem set_eq_of_length_le {l : List α} {i : Nat} (h : l.length ≤ i) {a : α
induction l generalizing i with
| nil => simp_all
| cons a l ih =>
induction i
cases i
· simp_all
· simp only [set_cons_succ, cons.injEq, true_and]
rw [ih]
@@ -921,6 +924,8 @@ theorem head?_eq_getElem? : ∀ {l : List α}, l.head? = l[0]?
| [] => rfl
| a :: l => by simp
theorem head_singleton {a : α} : head [a] (by simp) = a := by simp
theorem head_eq_getElem {l : List α} (h : l []) : head l h = l[0]'(length_pos_iff.mpr h) := by
cases l with
| nil => simp at h
@@ -1053,6 +1058,9 @@ theorem getLast?_tail {l : List α} : (tail l).getLast? = if l.length = 1 then n
| nil => simp [List.map]
| cons _ as ih => simp [List.map, ih]
@[simp] theorem isEmpty_map {l : List α} {f : α → β} : (l.map f).isEmpty = l.isEmpty := by
cases l <;> simp
@[simp] theorem getElem?_map {f : α → β} : ∀ {l : List α} {i : Nat}, (map f l)[i]? = Option.map f l[i]?
| [], _ => rfl
| _ :: _, 0 => by simp
@@ -3359,7 +3367,7 @@ theorem all_eq_not_any_not {l : List α} {p : α → Bool} : l.all p = !l.any (!
split <;> simp_all
@[simp] theorem all_filter {l : List α} {p q : α → Bool} :
(filter p l).all q = l.all fun a => p a → q a := by
(filter p l).all q = l.all fun a => !(p a) || q a := by
induction l with
| nil => rfl
| cons h t ih =>

View File

@@ -777,31 +777,34 @@ protected theorem pow_succ (n m : Nat) : n^(succ m) = n^m * n :=
protected theorem pow_add_one (n m : Nat) : n^(m + 1) = n^m * n :=
rfl
protected theorem pow_zero (n : Nat) : n^0 = 1 := rfl
@[simp] protected theorem pow_zero (n : Nat) : n^0 = 1 := rfl
theorem pow_le_pow_left {n m : Nat} (h : n ≤ m) : ∀ (i : Nat), n^i ≤ m^i
@[simp] protected theorem pow_one (a : Nat) : a ^ 1 = a := by
simp [Nat.pow_succ]
protected theorem pow_le_pow_left {n m : Nat} (h : n ≤ m) : ∀ (i : Nat), n^i ≤ m^i
| 0 => Nat.le_refl _
| succ i => Nat.mul_le_mul (pow_le_pow_left h i) h
| succ i => Nat.mul_le_mul (Nat.pow_le_pow_left h i) h
theorem pow_le_pow_right {n : Nat} (hx : n > 0) {i : Nat} : ∀ {j}, i ≤ j → n^i ≤ n^j
protected theorem pow_le_pow_right {n : Nat} (hx : n > 0) {i : Nat} : ∀ {j}, i ≤ j → n^i ≤ n^j
| 0, h =>
have : i = 0 := eq_zero_of_le_zero h
this.symm ▸ Nat.le_refl _
| succ j, h =>
match le_or_eq_of_le_succ h with
| Or.inl h => show n^i ≤ n^j * n from
have : n^i * 1 ≤ n^j * n := Nat.mul_le_mul (pow_le_pow_right hx h) hx
have : n^i * 1 ≤ n^j * n := Nat.mul_le_mul (Nat.pow_le_pow_right hx h) hx
Nat.mul_one (n^i) ▸ this
| Or.inr h =>
h.symm ▸ Nat.le_refl _
set_option linter.missingDocs false in
@[deprecated Nat.pow_le_pow_left (since := "2025-02-17")]
abbrev pow_le_pow_of_le_left := @pow_le_pow_left
abbrev pow_le_pow_of_le_left := @Nat.pow_le_pow_left
set_option linter.missingDocs false in
@[deprecated Nat.pow_le_pow_right (since := "2025-02-17")]
abbrev pow_le_pow_of_le_right := @pow_le_pow_right
abbrev pow_le_pow_of_le_right := @Nat.pow_le_pow_right
protected theorem pow_pos (h : 0 < a) : 0 < a^n :=
match n with
@@ -822,6 +825,33 @@ protected theorem two_pow_pos (w : Nat) : 0 < 2^w := Nat.pow_pos (by decide)
instance {n m : Nat} [NeZero n] : NeZero (n^m) :=
Nat.ne_zero_iff_zero_lt.mpr (Nat.pow_pos (pos_of_neZero _))
protected theorem mul_pow (a b n : Nat) : (a * b) ^ n = a ^ n * b ^ n := by
induction n with
| zero => simp [Nat.pow_zero]
| succ n ih =>
rw [Nat.pow_succ, ih, Nat.pow_succ, Nat.pow_succ, Nat.mul_assoc, Nat.mul_assoc]
congr 1
rw [Nat.mul_assoc, Nat.mul_assoc, Nat.mul_comm _ a]
protected theorem pow_lt_pow_left {a b n : Nat} (hab : a < b) (h : n ≠ 0) : a ^ n < b ^ n := by
cases n with
| zero => simp at h
| succ n =>
clear h
induction n with
| zero => simpa
| succ n ih =>
rw [Nat.pow_succ a, Nat.pow_succ b]
exact Nat.lt_of_le_of_lt (Nat.mul_le_mul_left _ (Nat.le_of_lt hab))
(Nat.mul_lt_mul_of_pos_right ih (Nat.lt_of_le_of_lt (Nat.zero_le _) hab))
protected theorem pow_left_inj {a b n : Nat} (hn : n ≠ 0) : a ^ n = b ^ n ↔ a = b := by
refine ⟨fun h => ?_, (· ▸ rfl)⟩
match Nat.lt_trichotomy a b with
| Or.inl hab => exact False.elim (absurd h (ne_of_lt (Nat.pow_lt_pow_left hab hn)))
| Or.inr (Or.inl hab) => exact hab
| Or.inr (Or.inr hab) => exact False.elim (absurd h (Nat.ne_of_lt' (Nat.pow_lt_pow_left hab hn)))
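The strict monotonicity and injectivity facts just added (`pow_lt_pow_left` and `pow_left_inj`) can be spot-checked numerically over a small range (a sketch, not a proof):

```python
# For n != 0 over the naturals: a < b implies a^n < b^n,
# and therefore a^n = b^n holds exactly when a = b.
for n in range(1, 5):
    for a in range(6):
        for b in range(6):
            if a < b:
                assert a ** n < b ** n          # pow_lt_pow_left
            assert (a ** n == b ** n) == (a == b)  # pow_left_inj
print("ok")
```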
/-! # min/max -/
/--
@@ -1170,9 +1200,15 @@ protected theorem mul_sub_right_distrib (n m k : Nat) : (n - m) * k = n * k - m
| zero => simp
| succ m ih => rw [Nat.sub_succ, Nat.pred_mul, ih, succ_mul, Nat.sub_sub]; done
protected theorem sub_mul (n m k : Nat) : (n - m) * k = n * k - m * k :=
Nat.mul_sub_right_distrib n m k
protected theorem mul_sub_left_distrib (n m k : Nat) : n * (m - k) = n * m - n * k := by
rw [Nat.mul_comm, Nat.mul_sub_right_distrib, Nat.mul_comm m n, Nat.mul_comm n k]
protected theorem mul_sub (n m k : Nat) : n * (m - k) = n * m - n * k :=
Nat.mul_sub_left_distrib n m k
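Since `Nat` subtraction is truncated (monus), the distributivity lemmas above are not entirely obvious; they hold because truncation commutes with multiplication. A quick Python check over a small range (a sketch, with `monus` modeling `Nat` subtraction):

```python
monus = lambda a, b: max(a - b, 0)  # truncated Nat subtraction

for n in range(6):
    for m in range(6):
        for k in range(6):
            # mul_sub_left_distrib / mul_sub
            assert n * monus(m, k) == monus(n * m, n * k)
            # mul_sub_right_distrib / sub_mul
            assert monus(n, m) * k == monus(n * k, m * k)
print("ok")
```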
/-! # Helper normalization theorems -/
theorem not_le_eq (a b : Nat) : (¬ (a ≤ b)) = (b + 1 ≤ a) :=

View File

@@ -501,7 +501,7 @@ theorem and_lt_two_pow (x : Nat) {y n : Nat} (right : y < 2^n) : (x &&& y) < 2^n
have yf : testBit y i = false := by
apply Nat.testBit_lt_two_pow
apply Nat.lt_of_lt_of_le right
exact pow_le_pow_right Nat.zero_lt_two i_ge_n
exact Nat.pow_le_pow_right Nat.zero_lt_two i_ge_n
simp [testBit_and, yf]
@[simp] theorem and_two_pow_sub_one_eq_mod (x n : Nat) : x &&& 2^n - 1 = x % 2^n := by


@@ -21,6 +21,12 @@ protected theorem dvd_trans {a b c : Nat} (h₁ : a b) (h₂ : b c) : a
| ⟨d, (h₃ : b = a * d)⟩, ⟨e, (h₄ : c = b * e)⟩ =>
⟨d * e, show c = a * (d * e) by simp [h₃, h₄, Nat.mul_assoc]⟩
protected theorem dvd_mul_left_of_dvd {a b : Nat} (h : a ∣ b) (c : Nat) : a ∣ c * b :=
Nat.dvd_trans h (Nat.dvd_mul_left _ _)
protected theorem dvd_mul_right_of_dvd {a b : Nat} (h : a ∣ b) (c : Nat) : a ∣ b * c :=
Nat.dvd_trans h (Nat.dvd_mul_right _ _)
protected theorem eq_zero_of_zero_dvd {a : Nat} (h : 0 ∣ a) : a = 0 :=
let ⟨c, H'⟩ := h; H'.trans c.zero_mul
@@ -106,8 +112,26 @@ protected theorem dvd_of_mul_dvd_mul_left
protected theorem dvd_of_mul_dvd_mul_right (kpos : 0 < k) (H : m * k ∣ n * k) : m ∣ n := by
rw [Nat.mul_comm m k, Nat.mul_comm n k] at H; exact Nat.dvd_of_mul_dvd_mul_left kpos H
theorem dvd_sub {k m n : Nat} (H : n ≤ m) (h₁ : k ∣ m) (h₂ : k ∣ n) : k ∣ m - n :=
(Nat.dvd_add_iff_left h₂).2 <| by rwa [Nat.sub_add_cancel H]
theorem dvd_sub {k m n : Nat} (h₁ : k ∣ m) (h₂ : k ∣ n) : k ∣ m - n :=
if H : n ≤ m then
(Nat.dvd_add_iff_left h₂).2 <| by rwa [Nat.sub_add_cancel H]
else
Nat.sub_eq_zero_of_le (Nat.le_of_not_le H) ▸ Nat.dvd_zero k
theorem dvd_sub_iff_right {m n k : Nat} (hkn : k ≤ n) (h : m ∣ n) : m ∣ n - k ↔ m ∣ k := by
refine ⟨?_, dvd_sub h⟩
let ⟨x, hx⟩ := h
cases hx
intro hy
let ⟨y, hy⟩ := hy
have hk : k = m * (x - y) := by
rw [Nat.sub_eq_iff_eq_add hkn] at hy
rw [Nat.mul_sub, hy, Nat.add_comm, Nat.add_sub_cancel]
exact hk ▸ Nat.dvd_mul_right _ _
theorem dvd_sub_iff_left {m n k : Nat} (hkn : k ≤ n) (h : m ∣ k) : m ∣ n - k ↔ m ∣ n := by
rw (occs := [2]) [← Nat.sub_add_cancel hkn]
exact Nat.dvd_add_iff_left h
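The relaxed `dvd_sub` drops the `n ≤ m` side condition because truncated subtraction returns `0` when `n > m`, and everything divides `0`. A usage sketch against this branch (illustrative, not part of the diff):

```lean
-- Only the two divisibility hypotheses are needed; no ordering assumption.
example (k m n : Nat) (h₁ : k ∣ m) (h₂ : k ∣ n) : k ∣ m - n :=
  Nat.dvd_sub h₁ h₂

-- Even when the subtraction truncates (8 - 20 = 0), the statement holds.
example : (4 : Nat) ∣ 8 - 20 := Nat.dvd_sub ⟨2, rfl⟩ ⟨5, rfl⟩
```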
protected theorem mul_dvd_mul {a b c d : Nat} : a ∣ b → c ∣ d → a * c ∣ b * d
| ⟨e, he⟩, ⟨f, hf⟩ =>


@@ -1,7 +1,7 @@
/-
Copyright (c) 2021 Microsoft Corporation. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Jeremy Avigad, Leonardo de Moura, Mario Carneiro
Authors: Jeremy Avigad, Leonardo de Moura, Mario Carneiro, Markus Himmel
-/
prelude
import Init.Data.Nat.Dvd
@@ -106,11 +106,11 @@ theorem gcd_comm (m n : Nat) : gcd m n = gcd n m :=
(dvd_gcd (gcd_dvd_right n m) (gcd_dvd_left n m))
instance : Std.Commutative gcd := gcd_comm
theorem gcd_eq_left_iff_dvd : m ∣ n ↔ gcd m n = m :=
⟨fun h => by rw [gcd_rec, mod_eq_zero_of_dvd h, gcd_zero_left],
fun h => h ▸ gcd_dvd_right m n⟩
theorem gcd_eq_left_iff_dvd : gcd m n = m ↔ m ∣ n :=
⟨fun h => h ▸ gcd_dvd_right m n,
fun h => by rw [gcd_rec, mod_eq_zero_of_dvd h, gcd_zero_left]⟩
theorem gcd_eq_right_iff_dvd : m ∣ n ↔ gcd n m = m := by
theorem gcd_eq_right_iff_dvd : gcd n m = m ↔ m ∣ n := by
rw [gcd_comm]; exact gcd_eq_left_iff_dvd
theorem gcd_assoc (m n k : Nat) : gcd (gcd m n) k = gcd m (gcd n k) :=
@@ -174,12 +174,20 @@ theorem gcd_dvd_gcd_of_dvd_left {m k : Nat} (n : Nat) (H : m k) : gcd m n
theorem gcd_dvd_gcd_of_dvd_right {m k : Nat} (n : Nat) (H : m ∣ k) : gcd n m ∣ gcd n k :=
dvd_gcd (gcd_dvd_left n m) (Nat.dvd_trans (gcd_dvd_right n m) H)
theorem gcd_dvd_gcd_mul_left (m n k : Nat) : gcd m n ∣ gcd (k * m) n :=
theorem gcd_dvd_gcd_mul_left_left (m n k : Nat) : gcd m n ∣ gcd (k * m) n :=
gcd_dvd_gcd_of_dvd_left _ (Nat.dvd_mul_left _ _)
theorem gcd_dvd_gcd_mul_right (m n k : Nat) : gcd m n ∣ gcd (m * k) n :=
@[deprecated gcd_dvd_gcd_mul_left_left (since := "2025-04-01")]
theorem gcd_dvd_gcd_mul_left (m n k : Nat) : gcd m n ∣ gcd (k * m) n :=
gcd_dvd_gcd_mul_left_left m n k
theorem gcd_dvd_gcd_mul_right_left (m n k : Nat) : gcd m n ∣ gcd (m * k) n :=
gcd_dvd_gcd_of_dvd_left _ (Nat.dvd_mul_right _ _)
@[deprecated gcd_dvd_gcd_mul_right_left (since := "2025-04-01")]
theorem gcd_dvd_gcd_mul_right (m n k : Nat) : gcd m n ∣ gcd (m * k) n :=
gcd_dvd_gcd_mul_right_left m n k
theorem gcd_dvd_gcd_mul_left_right (m n k : Nat) : gcd m n ∣ gcd m (k * n) :=
gcd_dvd_gcd_of_dvd_right _ (Nat.dvd_mul_left _ _)
@@ -192,6 +200,16 @@ theorem gcd_eq_left {m n : Nat} (H : m n) : gcd m n = m :=
theorem gcd_eq_right {m n : Nat} (H : n m) : gcd m n = n := by
rw [gcd_comm, gcd_eq_left H]
theorem gcd_right_eq_iff {m n n' : Nat} : gcd m n = gcd m n' ↔ ∀ k, k ∣ m → (k ∣ n ↔ k ∣ n') := by
refine ⟨fun h k hkm => ⟨fun hkn => ?_, fun hkn' => ?_⟩, fun h => Nat.dvd_antisymm ?_ ?_⟩
· exact Nat.dvd_trans (h ▸ dvd_gcd hkm hkn) (Nat.gcd_dvd_right m n')
· exact Nat.dvd_trans (h ▸ dvd_gcd hkm hkn') (Nat.gcd_dvd_right m n)
· exact dvd_gcd_iff.2 ⟨gcd_dvd_left _ _, (h _ (gcd_dvd_left _ _)).1 (gcd_dvd_right _ _)⟩
· exact dvd_gcd_iff.2 ⟨gcd_dvd_left _ _, (h _ (gcd_dvd_left _ _)).2 (gcd_dvd_right _ _)⟩
theorem gcd_left_eq_iff {m m' n : Nat} : gcd m n = gcd m' n ↔ ∀ k, k ∣ n → (k ∣ m ↔ k ∣ m') := by
rw [gcd_comm m n, gcd_comm m' n, gcd_right_eq_iff]
@[simp] theorem gcd_mul_left_left (m n : Nat) : gcd (m * n) n = n :=
Nat.dvd_antisymm (gcd_dvd_right _ _) (dvd_gcd (Nat.dvd_mul_left _ _) (Nat.dvd_refl _))
@@ -216,10 +234,123 @@ theorem gcd_eq_right {m n : Nat} (H : n m) : gcd m n = n := by
@[simp] theorem gcd_gcd_self_left_left (m n : Nat) : gcd (gcd m n) m = gcd m n := by
rw [gcd_comm m n, gcd_gcd_self_left_right]
theorem gcd_add_mul_self (m n k : Nat) : gcd m (n + k * m) = gcd m n := by
@[simp] theorem gcd_add_mul_right_right (m n k : Nat) : gcd m (n + k * m) = gcd m n := by
simp [gcd_rec m (n + k * m), gcd_rec m n]
theorem gcd_eq_zero_iff {i j : Nat} : gcd i j = 0 ↔ i = 0 ∧ j = 0 :=
@[deprecated gcd_add_mul_right_right (since := "2025-03-31")]
theorem gcd_add_mul_self (m n k : Nat) : gcd m (n + k * m) = gcd m n :=
gcd_add_mul_right_right _ _ _
@[simp] theorem gcd_add_mul_left_right (m n k : Nat) : gcd m (n + m * k) = gcd m n := by
rw [Nat.mul_comm, gcd_add_mul_right_right]
@[simp] theorem gcd_mul_right_add_right (m n k : Nat) : gcd m (k * m + n) = gcd m n := by
rw [Nat.add_comm, gcd_add_mul_right_right]
@[simp] theorem gcd_mul_left_add_right (m n k : Nat) : gcd m (m * k + n) = gcd m n := by
rw [Nat.add_comm, gcd_add_mul_left_right]
@[simp] theorem gcd_add_mul_right_left (m n k : Nat) : gcd (n + k * m) m = gcd n m := by
rw [gcd_comm, gcd_add_mul_right_right, gcd_comm]
@[simp] theorem gcd_add_mul_left_left (m n k : Nat) : gcd (n + m * k) m = gcd n m := by
rw [Nat.mul_comm, gcd_add_mul_right_left]
@[simp] theorem gcd_mul_right_add_left (m n k : Nat) : gcd (k * m + n) m = gcd n m := by
rw [Nat.add_comm, gcd_add_mul_right_left]
@[simp] theorem gcd_mul_left_add_left (m n k : Nat) : gcd (m * k + n) m = gcd n m := by
rw [Nat.add_comm, gcd_add_mul_left_left]
@[simp] theorem gcd_add_self_right (m n : Nat) : gcd m (n + m) = gcd m n := by
simpa using gcd_add_mul_right_right _ _ 1
@[simp] theorem gcd_self_add_right (m n : Nat) : gcd m (m + n) = gcd m n := by
simpa using gcd_mul_right_add_right _ _ 1
@[simp] theorem gcd_add_self_left (m n : Nat) : gcd (n + m) m = gcd n m := by
simpa using gcd_add_mul_right_left _ _ 1
@[simp] theorem gcd_self_add_left (m n : Nat) : gcd (m + n) m = gcd n m := by
simpa using gcd_mul_right_add_left _ _ 1
@[simp] theorem gcd_add_left_left_of_dvd {m k : Nat} (n : Nat) :
m ∣ k → gcd (k + n) m = gcd n m := by
rintro ⟨l, rfl⟩; exact gcd_mul_left_add_left m n l
@[simp] theorem gcd_add_right_left_of_dvd {m k : Nat} (n : Nat) :
m ∣ k → gcd (n + k) m = gcd n m := by
rintro ⟨l, rfl⟩; exact gcd_add_mul_left_left m n l
@[simp] theorem gcd_add_left_right_of_dvd {n k : Nat} (m : Nat) :
n ∣ k → gcd n (k + m) = gcd n m := by
rintro ⟨l, rfl⟩; exact gcd_mul_left_add_right n m l
@[simp] theorem gcd_add_right_right_of_dvd {n k : Nat} (m : Nat) :
n ∣ k → gcd n (m + k) = gcd n m := by
rintro ⟨l, rfl⟩; exact gcd_add_mul_left_right n m l
@[simp] theorem gcd_sub_mul_right_right {m n k : Nat} (h : k * m ≤ n) :
gcd m (n - k * m) = gcd m n := by
rw [← gcd_add_mul_right_right m (n - k * m) k, Nat.sub_add_cancel h]
@[simp] theorem gcd_sub_mul_left_right {m n k : Nat} (h : m * k ≤ n) :
gcd m (n - m * k) = gcd m n := by
rw [← gcd_add_mul_left_right m (n - m * k) k, Nat.sub_add_cancel h]
@[simp] theorem gcd_mul_right_sub_right {m n k : Nat} (h : n ≤ k * m) :
gcd m (k * m - n) = gcd m n :=
gcd_right_eq_iff.2 fun _ hl => dvd_sub_iff_right h (Nat.dvd_mul_left_of_dvd hl _)
@[simp] theorem gcd_mul_left_sub_right {m n k : Nat} (h : n ≤ m * k) :
gcd m (m * k - n) = gcd m n := by
rw [Nat.mul_comm, gcd_mul_right_sub_right (Nat.mul_comm _ _ ▸ h)]
@[simp] theorem gcd_sub_mul_right_left {m n k : Nat} (h : k * m ≤ n) :
gcd (n - k * m) m = gcd n m := by
rw [gcd_comm, gcd_sub_mul_right_right h, gcd_comm]
@[simp] theorem gcd_sub_mul_left_left {m n k : Nat} (h : m * k ≤ n) :
gcd (n - m * k) m = gcd n m := by
rw [Nat.mul_comm, gcd_sub_mul_right_left (Nat.mul_comm _ _ ▸ h)]
@[simp] theorem gcd_mul_right_sub_left {m n k : Nat} (h : n ≤ k * m) :
gcd (k * m - n) m = gcd n m := by
rw [gcd_comm, gcd_mul_right_sub_right h, gcd_comm]
@[simp] theorem gcd_mul_left_sub_left {m n k : Nat} (h : n ≤ m * k) :
gcd (m * k - n) m = gcd n m := by
rw [Nat.mul_comm, gcd_mul_right_sub_left (Nat.mul_comm _ _ ▸ h)]
@[simp] theorem gcd_sub_self_right {m n : Nat} (h : m ≤ n) : gcd m (n - m) = gcd m n := by
simpa using gcd_sub_mul_right_right (k := 1) (by simpa using h)
@[simp] theorem gcd_self_sub_right {m n : Nat} (h : n ≤ m) : gcd m (m - n) = gcd m n := by
simpa using gcd_mul_right_sub_right (k := 1) (by simpa using h)
@[simp] theorem gcd_sub_self_left {m n : Nat} (h : m ≤ n) : gcd (n - m) m = gcd n m := by
simpa using gcd_sub_mul_right_left (k := 1) (by simpa using h)
@[simp] theorem gcd_self_sub_left {m n : Nat} (h : n ≤ m) : gcd (m - n) m = gcd n m := by
simpa using gcd_mul_right_sub_left (k := 1) (by simpa using h)
@[simp] theorem gcd_sub_left_left_of_dvd {n k : Nat} (m : Nat) (h : n ≤ k) :
m ∣ k → gcd (k - n) m = gcd n m := by
rintro ⟨l, rfl⟩; exact gcd_mul_left_sub_left h
@[simp] theorem gcd_sub_right_left_of_dvd {n k : Nat} (m : Nat) (h : k ≤ n) :
m ∣ k → gcd (n - k) m = gcd n m := by
rintro ⟨l, rfl⟩; exact gcd_sub_mul_left_left h
@[simp] theorem gcd_sub_left_right_of_dvd {m k : Nat} (n : Nat) (h : m ≤ k) :
n ∣ k → gcd n (k - m) = gcd n m := by
rintro ⟨l, rfl⟩; exact gcd_mul_left_sub_right h
@[simp] theorem gcd_sub_right_right_of_dvd {m k : Nat} (n : Nat) (h : k ≤ m) :
n ∣ k → gcd n (m - k) = gcd n m := by
rintro ⟨l, rfl⟩; exact gcd_sub_mul_left_right h
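Since the new absorption lemmas are marked `@[simp]`, goals of this shape should close mechanically on this branch (illustrative sketch, not part of the diff):

```lean
-- The add variants fire unconditionally under `simp`.
example (m n : Nat) : Nat.gcd m (n + 3 * m) = Nat.gcd m n := by simp

-- The sub variants carry an ordering side condition, so apply them directly.
example (m n : Nat) (h : 3 * m ≤ n) : Nat.gcd m (n - 3 * m) = Nat.gcd m n :=
  Nat.gcd_sub_mul_right_right h
```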
@[simp] theorem gcd_eq_zero_iff {i j : Nat} : gcd i j = 0 ↔ i = 0 ∧ j = 0 :=
⟨fun h => ⟨eq_zero_of_gcd_eq_zero_left h, eq_zero_of_gcd_eq_zero_right h⟩,
fun h => by simp [h]⟩
@@ -237,7 +368,7 @@ theorem gcd_eq_iff {a b : Nat} :
· exact Nat.dvd_gcd ha hb
/-- Represent a divisor of `m * n` as a product of a divisor of `m` and a divisor of `n`. -/
def prod_dvd_and_dvd_of_dvd_prod {k m n : Nat} (H : k ∣ m * n) :
def dvdProdDvdOfDvdProd {k m n : Nat} (h : k ∣ m * n) :
{d : {m' // m' ∣ m} × {n' // n' ∣ n} // k = d.1.val * d.2.val} :=
if h0 : gcd k m = 0 then
⟨⟨⟨0, eq_zero_of_gcd_eq_zero_right h0 ▸ Nat.dvd_refl 0⟩,
@@ -248,15 +379,97 @@ def prod_dvd_and_dvd_of_dvd_prod {k m n : Nat} (H : k m * n) :
refine ⟨⟨⟨gcd k m, gcd_dvd_right k m⟩, ⟨k / gcd k m, ?_⟩⟩, hd.symm⟩
apply Nat.dvd_of_mul_dvd_mul_left (Nat.pos_of_ne_zero h0)
rw [hd, gcd_mul_right]
exact Nat.dvd_gcd (Nat.dvd_mul_right _ _) H
exact Nat.dvd_gcd (Nat.dvd_mul_right _ _) h
theorem gcd_mul_dvd_mul_gcd (k m n : Nat) : gcd k (m * n) gcd k m * gcd k n := by
@[inherit_doc dvdProdDvdOfDvdProd, deprecated dvdProdDvdOfDvdProd (since := "2025-04-01")]
def prod_dvd_and_dvd_of_dvd_prod {k m n : Nat} (H : k ∣ m * n) :
{d : {m' // m' ∣ m} × {n' // n' ∣ n} // k = d.1.val * d.2.val} :=
dvdProdDvdOfDvdProd H
protected theorem dvd_mul {k m n : Nat} : k ∣ m * n ↔ ∃ k₁ k₂, k₁ ∣ m ∧ k₂ ∣ n ∧ k₁ * k₂ = k := by
refine ⟨fun h => ?_, ?_⟩
· obtain ⟨⟨⟨k₁, hk₁⟩, ⟨k₂, hk₂⟩⟩, rfl⟩ := dvdProdDvdOfDvdProd h
exact ⟨k₁, k₂, hk₁, hk₂, rfl⟩
· rintro ⟨k₁, k₂, hk₁, hk₂, rfl⟩
exact Nat.mul_dvd_mul hk₁ hk₂
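`Nat.dvd_mul` decomposes a divisor of a product into divisors of the factors; a concrete instance (illustrative, not part of the diff):

```lean
-- 6 ∣ 4 * 9 because 2 ∣ 4 (witness 2), 3 ∣ 9 (witness 3), and 2 * 3 = 6.
example : (6 : Nat) ∣ 4 * 9 :=
  Nat.dvd_mul.2 ⟨2, 3, ⟨2, rfl⟩, ⟨3, rfl⟩, rfl⟩
```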
theorem gcd_mul_right_dvd_mul_gcd (k m n : Nat) : gcd k (m * n) ∣ gcd k m * gcd k n := by
let ⟨⟨⟨m', hm'⟩, ⟨n', hn'⟩⟩, (h : gcd k (m * n) = m' * n')⟩ :=
prod_dvd_and_dvd_of_dvd_prod <| gcd_dvd_right k (m * n)
dvdProdDvdOfDvdProd <| gcd_dvd_right k (m * n)
rw [h]
have h' : m' * n' ∣ k := h ▸ gcd_dvd_left ..
exact Nat.mul_dvd_mul
(dvd_gcd (Nat.dvd_trans (Nat.dvd_mul_right m' n') h') hm')
(dvd_gcd (Nat.dvd_trans (Nat.dvd_mul_left n' m') h') hn')
@[deprecated gcd_mul_right_dvd_mul_gcd (since := "2025-04-02")]
theorem gcd_mul_dvd_mul_gcd (k m n : Nat) : gcd k (m * n) ∣ gcd k m * gcd k n :=
gcd_mul_right_dvd_mul_gcd k m n
theorem gcd_mul_left_dvd_mul_gcd (k m n : Nat) : gcd (m * n) k ∣ gcd m k * gcd n k := by
simpa [gcd_comm, Nat.mul_comm] using gcd_mul_right_dvd_mul_gcd _ _ _
theorem dvd_gcd_mul_iff_dvd_mul {k n m : Nat} : k ∣ gcd k n * m ↔ k ∣ n * m := by
refine ⟨(Nat.dvd_trans · <| Nat.mul_dvd_mul_right (k.gcd_dvd_right n) m), fun ⟨y, hy⟩ => ?_⟩
rw [← gcd_mul_right, hy, gcd_mul_left]
exact Nat.dvd_mul_right k (gcd m y)
theorem dvd_mul_gcd_iff_dvd_mul {k n m : Nat} : k ∣ n * gcd k m ↔ k ∣ n * m := by
rw [Nat.mul_comm, dvd_gcd_mul_iff_dvd_mul, Nat.mul_comm]
theorem dvd_gcd_mul_gcd_iff_dvd_mul {k n m : Nat} : k ∣ gcd k n * gcd k m ↔ k ∣ n * m := by
rw [dvd_gcd_mul_iff_dvd_mul, dvd_mul_gcd_iff_dvd_mul]
theorem gcd_eq_one_iff {m n : Nat} : gcd m n = 1 ↔ ∀ c, c ∣ m → c ∣ n → c = 1 := by
simp [gcd_eq_iff]
theorem gcd_mul_right_right_of_gcd_eq_one {n m k : Nat} : gcd n m = 1 → gcd n (m * k) = gcd n k := by
rw [gcd_right_eq_iff, gcd_eq_one_iff]
refine fun h l hl₁ => ⟨?_, fun a => Nat.dvd_mul_left_of_dvd a m⟩
rw [Nat.dvd_mul]
rintro ⟨k₁, k₂, hk₁, hk₂, rfl⟩
obtain rfl : k₁ = 1 := h _ (Nat.dvd_trans (Nat.dvd_mul_right k₁ k₂) hl₁) hk₁
simpa
theorem gcd_mul_left_right_of_gcd_eq_one {n m k : Nat} (h : gcd n m = 1) :
gcd n (k * m) = gcd n k := by
rw [Nat.mul_comm, gcd_mul_right_right_of_gcd_eq_one h]
theorem gcd_mul_right_left_of_gcd_eq_one {n m k : Nat} (h : gcd n m = 1) :
gcd (n * k) m = gcd k m := by
rw [gcd_comm, gcd_mul_right_right_of_gcd_eq_one (gcd_comm _ _ ▸ h), gcd_comm]
theorem gcd_mul_left_left_of_gcd_eq_one {n m k : Nat} (h : gcd n m = 1) :
gcd (k * n) m = gcd k m := by
rw [Nat.mul_comm, gcd_mul_right_left_of_gcd_eq_one h]
theorem gcd_pow_left_of_gcd_eq_one {k n m : Nat} (h : gcd n m = 1) : gcd (n ^ k) m = 1 := by
induction k with
| zero => simp [Nat.pow_zero]
| succ k ih => rw [Nat.pow_succ, gcd_mul_left_left_of_gcd_eq_one h, ih]
theorem gcd_pow_right_of_gcd_eq_one {k n m : Nat} (h : gcd n m = 1) : gcd n (m ^ k) = 1 := by
rw [gcd_comm, gcd_pow_left_of_gcd_eq_one (gcd_comm _ _ ▸ h)]
theorem pow_gcd_pow_of_gcd_eq_one {k l n m : Nat} (h : gcd n m = 1) : gcd (n ^ k) (m ^ l) = 1 :=
gcd_pow_left_of_gcd_eq_one (gcd_pow_right_of_gcd_eq_one h)
theorem gcd_div_gcd_div_gcd_of_pos_left {n m : Nat} (h : 0 < n) :
gcd (n / gcd n m) (m / gcd n m) = 1 := by
rw [gcd_div (gcd_dvd_left _ _) (gcd_dvd_right _ _), Nat.div_self (gcd_pos_of_pos_left _ h)]
theorem gcd_div_gcd_div_gcd_of_pos_right {n m : Nat} (h : 0 < m) :
gcd (n / gcd n m) (m / gcd n m) = 1 := by
rw [gcd_div (gcd_dvd_left _ _) (gcd_dvd_right _ _), Nat.div_self (gcd_pos_of_pos_right _ h)]
theorem pow_gcd_pow {k n m : Nat} : gcd (n ^ k) (m ^ k) = (gcd n m) ^ k := by
refine (Nat.eq_zero_or_pos n).elim (by rintro rfl; cases k <;> simp [Nat.pow_zero]) (fun hn => ?_)
conv => lhs; rw [← Nat.div_mul_cancel (gcd_dvd_left n m)]
conv => lhs; arg 2; rw [← Nat.div_mul_cancel (gcd_dvd_right n m)]
rw [Nat.mul_pow, Nat.mul_pow, gcd_mul_right, pow_gcd_pow_of_gcd_eq_one, Nat.one_mul]
exact gcd_div_gcd_div_gcd_of_pos_left hn
theorem pow_dvd_pow_iff {a b n : Nat} (h : n ≠ 0) : a ^ n ∣ b ^ n ↔ a ∣ b := by
rw [← gcd_eq_left_iff_dvd, ← gcd_eq_left_iff_dvd, pow_gcd_pow, Nat.pow_left_inj h]
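`pow_dvd_pow_iff` cancels a common nonzero exponent on both sides of a divisibility; a usage sketch (illustrative, not part of the diff):

```lean
-- Divisibility of squares descends to the bases.
example {a b : Nat} (h : a ^ 2 ∣ b ^ 2) : a ∣ b :=
  (Nat.pow_dvd_pow_iff (by decide)).1 h
```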
end Nat


@@ -1,7 +1,7 @@
/-
Copyright (c) 2014 Microsoft Corporation. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Jeremy Avigad, Leonardo de Moura, Mario Carneiro
Authors: Jeremy Avigad, Leonardo de Moura, Mario Carneiro, Markus Himmel
-/
prelude
import Init.Data.Nat.Gcd
@@ -10,9 +10,6 @@ import Init.Data.Nat.Lemmas
/-!
# Lemmas about `Nat.lcm`
## Future work:
Most of the material about `Nat.gcd` from `Init.Data.Nat.Gcd` has analogues for `Nat.lcm`
that should be added to this file.
-/
namespace Nat
@@ -29,17 +26,36 @@ Examples:
-/
def lcm (m n : Nat) : Nat := m * n / gcd m n
theorem lcm_eq_mul_div (m n : Nat) : lcm m n = m * n / gcd m n := rfl
@[simp] theorem gcd_mul_lcm (m n : Nat) : gcd m n * lcm m n = m * n := by
rw [lcm_eq_mul_div,
Nat.mul_div_cancel' (Nat.dvd_trans (gcd_dvd_left m n) (Nat.dvd_mul_right m n))]
@[simp] theorem lcm_mul_gcd (m n : Nat) : lcm m n * gcd m n = m * n := by
simp [Nat.mul_comm]
@[simp] theorem lcm_dvd_mul (m n : Nat) : lcm m n ∣ m * n := ⟨gcd m n, by simp⟩
@[simp] theorem gcd_dvd_mul (m n : Nat) : gcd m n ∣ m * n := ⟨lcm m n, by simp⟩
@[simp] theorem lcm_le_mul {m n : Nat} (hm : 0 < m) (hn : 0 < n) : lcm m n ≤ m * n :=
le_of_dvd (Nat.mul_pos hm hn) (lcm_dvd_mul _ _)
@[simp] theorem gcd_le_mul {m n : Nat} (hm : 0 < m) (hn : 0 < n) : gcd m n ≤ m * n :=
le_of_dvd (Nat.mul_pos hm hn) (gcd_dvd_mul _ _)
theorem lcm_comm (m n : Nat) : lcm m n = lcm n m := by
rw [lcm, lcm, Nat.mul_comm n m, gcd_comm n m]
rw [lcm_eq_mul_div, lcm_eq_mul_div, Nat.mul_comm n m, gcd_comm n m]
instance : Std.Commutative lcm := lcm_comm
@[simp] theorem lcm_zero_left (m : Nat) : lcm 0 m = 0 := by simp [lcm]
@[simp] theorem lcm_zero_left (m : Nat) : lcm 0 m = 0 := by simp [lcm_eq_mul_div]
@[simp] theorem lcm_zero_right (m : Nat) : lcm m 0 = 0 := by simp [lcm]
@[simp] theorem lcm_zero_right (m : Nat) : lcm m 0 = 0 := by simp [lcm_eq_mul_div]
@[simp] theorem lcm_one_left (m : Nat) : lcm 1 m = m := by simp [lcm]
@[simp] theorem lcm_one_left (m : Nat) : lcm 1 m = m := by simp [lcm_eq_mul_div]
@[simp] theorem lcm_one_right (m : Nat) : lcm m 1 = m := by simp [lcm]
@[simp] theorem lcm_one_right (m : Nat) : lcm m 1 = m := by simp [lcm_eq_mul_div]
instance : Std.LawfulIdentity lcm 1 where
left_id := lcm_one_left
right_id := lcm_one_right
@@ -47,16 +63,32 @@ instance : Std.LawfulIdentity lcm 1 where
@[simp] theorem lcm_self (m : Nat) : lcm m m = m := by
match eq_zero_or_pos m with
| .inl h => rw [h, lcm_zero_left]
| .inr h => simp [lcm, Nat.mul_div_cancel _ h]
| .inr h => simp [lcm_eq_mul_div, Nat.mul_div_cancel _ h]
instance : Std.IdempotentOp lcm := lcm_self
theorem dvd_lcm_left (m n : Nat) : m ∣ lcm m n :=
⟨n / gcd m n, by rw [← Nat.mul_div_assoc m (Nat.gcd_dvd_right m n)]; rfl⟩
⟨n / gcd m n, by rw [← Nat.mul_div_assoc m (Nat.gcd_dvd_right m n), lcm_eq_mul_div]⟩
theorem dvd_lcm_right (m n : Nat) : n ∣ lcm m n := lcm_comm n m ▸ dvd_lcm_left n m
theorem gcd_mul_lcm (m n : Nat) : gcd m n * lcm m n = m * n := by
rw [lcm, Nat.mul_div_cancel' (Nat.dvd_trans (gcd_dvd_left m n) (Nat.dvd_mul_right m n))]
theorem lcm_ne_zero (hm : m ≠ 0) (hn : n ≠ 0) : lcm m n ≠ 0 := by
intro h
have h1 := gcd_mul_lcm m n
rw [h, Nat.mul_zero] at h1
match mul_eq_zero.1 h1.symm with
| .inl hm1 => exact hm hm1
| .inr hn1 => exact hn hn1
theorem lcm_pos : 0 < m → 0 < n → 0 < lcm m n := by
simpa [← Nat.pos_iff_ne_zero] using lcm_ne_zero
theorem le_lcm_left (m : Nat) (hn : 0 < n) : m ≤ lcm m n :=
(Nat.eq_zero_or_pos m).elim (by rintro rfl; simp)
(fun hm => le_of_dvd (lcm_pos hm hn) (dvd_lcm_left m n))
theorem le_lcm_right (hm : 0 < m) (n : Nat) : n ≤ lcm m n :=
(Nat.eq_zero_or_pos n).elim (by rintro rfl; simp)
(fun hn => le_of_dvd (lcm_pos hm hn) (dvd_lcm_right m n))
theorem lcm_dvd {m n k : Nat} (H1 : m ∣ k) (H2 : n ∣ k) : lcm m n ∣ k := by
match eq_zero_or_pos k with
@@ -66,6 +98,18 @@ theorem lcm_dvd {m n k : Nat} (H1 : m k) (H2 : n k) : lcm m n k := b
rw [gcd_mul_lcm, gcd_mul_right, Nat.mul_comm n k]
exact dvd_gcd (Nat.mul_dvd_mul_left _ H2) (Nat.mul_dvd_mul_right H1 _)
theorem lcm_dvd_iff {m n k : Nat} : lcm m n ∣ k ↔ m ∣ k ∧ n ∣ k :=
⟨fun h => ⟨Nat.dvd_trans (dvd_lcm_left _ _) h, Nat.dvd_trans (dvd_lcm_right _ _) h⟩,
fun ⟨hm, hn⟩ => lcm_dvd hm hn⟩
theorem lcm_eq_left_iff_dvd : lcm m n = m ↔ n ∣ m := by
refine (Nat.eq_zero_or_pos m).elim (by rintro rfl; simp) (fun hm => ?_)
rw [lcm_eq_mul_div, Nat.div_eq_iff_eq_mul_left (gcd_pos_of_pos_left _ hm) (gcd_dvd_mul _ _),
Nat.mul_left_cancel_iff hm, Eq.comm, gcd_eq_right_iff_dvd]
theorem lcm_eq_right_iff_dvd : lcm m n = n ↔ m ∣ n := by
rw [lcm_comm, lcm_eq_left_iff_dvd]
theorem lcm_assoc (m n k : Nat) : lcm (lcm m n) k = lcm m (lcm n k) :=
Nat.dvd_antisymm
(lcm_dvd
@@ -78,12 +122,126 @@ Nat.dvd_antisymm
(dvd_lcm_right (lcm m n) k)))
instance : Std.Associative lcm := lcm_assoc
theorem lcm_ne_zero (hm : m ≠ 0) (hn : n ≠ 0) : lcm m n ≠ 0 := by
intro h
have h1 := gcd_mul_lcm m n
rw [h, Nat.mul_zero] at h1
match mul_eq_zero.1 h1.symm with
| .inl hm1 => exact hm hm1
| .inr hn1 => exact hn hn1
theorem lcm_mul_left (m n k : Nat) : lcm (m * n) (m * k) = m * lcm n k := by
refine (Nat.eq_zero_or_pos m).elim (by rintro rfl; simp) (fun hm => ?_)
rw [lcm_eq_mul_div, gcd_mul_left,
Nat.mul_div_assoc _ (Nat.mul_dvd_mul_left _ (gcd_dvd_right _ _)), Nat.mul_div_mul_left _ _ hm,
lcm_eq_mul_div, Nat.mul_div_assoc _ (gcd_dvd_right _ _), Nat.mul_assoc]
theorem lcm_mul_right (m n k : Nat) : lcm (m * n) (k * n) = lcm m k * n := by
rw [Nat.mul_comm _ n, Nat.mul_comm _ n, Nat.mul_comm _ n, lcm_mul_left]
theorem eq_zero_of_lcm_eq_zero (h : lcm m n = 0) : m = 0 ∨ n = 0 := by
cases m <;> cases n <;> simp [lcm_ne_zero] at *
@[simp] theorem lcm_eq_zero_iff : lcm m n = 0 ↔ m = 0 ∨ n = 0 := by
cases m <;> cases n <;> simp [lcm_ne_zero] at *
theorem lcm_eq_iff {n m l : Nat} :
lcm n m = l ↔ n ∣ l ∧ m ∣ l ∧ (∀ c, n ∣ c → m ∣ c → l ∣ c) := by
refine ⟨?_, fun ⟨hn, hm, hl⟩ => Nat.dvd_antisymm (lcm_dvd hn hm) ?_⟩
· rintro rfl
exact ⟨dvd_lcm_left _ _, dvd_lcm_right _ _, fun _ => Nat.lcm_dvd⟩
· exact hl _ (dvd_lcm_left _ _) (dvd_lcm_right _ _)
theorem lcm_div {m n k : Nat} (hkm : k ∣ m) (hkn : k ∣ n) : lcm (m / k) (n / k) = lcm m n / k := by
refine (Nat.eq_zero_or_pos k).elim (by rintro rfl; simp) (fun hk => lcm_eq_iff.2
⟨Nat.div_dvd_div hkm (dvd_lcm_left m n), Nat.div_dvd_div hkn (dvd_lcm_right m n),
fun c hc₁ hc₂ => ?_⟩)
rw [div_dvd_iff_dvd_mul _ hk] at hc₁ hc₂ ⊢
· exact lcm_dvd hc₁ hc₂
· exact Nat.dvd_trans hkm (dvd_lcm_left _ _)
· exact hkn
· exact hkm
theorem lcm_dvd_lcm_of_dvd_left {m k : Nat} (n : Nat) (h : m ∣ k) : lcm m n ∣ lcm k n :=
lcm_dvd (Nat.dvd_trans h (dvd_lcm_left _ _)) (dvd_lcm_right _ _)
theorem lcm_dvd_lcm_of_dvd_right {m k : Nat} (n : Nat) (h : m ∣ k) : lcm n m ∣ lcm n k :=
lcm_dvd (dvd_lcm_left _ _) (Nat.dvd_trans h (dvd_lcm_right _ _))
theorem lcm_dvd_lcm_mul_left_left (m n k : Nat) : lcm m n ∣ lcm (k * m) n :=
lcm_dvd_lcm_of_dvd_left _ (Nat.dvd_mul_left _ _)
theorem lcm_dvd_lcm_mul_right_left (m n k : Nat) : lcm m n ∣ lcm (m * k) n :=
lcm_dvd_lcm_of_dvd_left _ (Nat.dvd_mul_right _ _)
theorem lcm_dvd_lcm_mul_left_right (m n k : Nat) : lcm m n ∣ lcm m (k * n) :=
lcm_dvd_lcm_of_dvd_right _ (Nat.dvd_mul_left _ _)
theorem lcm_dvd_lcm_mul_right_right (m n k : Nat) : lcm m n ∣ lcm m (n * k) :=
lcm_dvd_lcm_of_dvd_right _ (Nat.dvd_mul_right _ _)
theorem lcm_eq_left {m n : Nat} (h : n ∣ m) : lcm m n = m :=
lcm_eq_left_iff_dvd.2 h
theorem lcm_eq_right {m n : Nat} (h : m ∣ n) : lcm m n = n :=
lcm_eq_right_iff_dvd.2 h
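`lcm_eq_left`/`lcm_eq_right` give the absorbing cases directly; a usage sketch (illustrative, not part of the diff):

```lean
-- When one argument divides the other, lcm collapses to the larger one.
example (m n : Nat) (h : m ∣ n) : Nat.lcm m n = n := Nat.lcm_eq_right h
example : Nat.lcm 12 4 = 12 := Nat.lcm_eq_left ⟨3, rfl⟩
```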
@[simp] theorem lcm_mul_left_left (m n : Nat) : lcm (m * n) n = m * n := by
simpa [lcm_eq_iff, Nat.dvd_mul_left] using fun _ h _ => h
@[simp] theorem lcm_mul_left_right (m n : Nat) : lcm n (m * n) = m * n := by
simp [lcm_eq_iff, Nat.dvd_mul_left]
@[simp] theorem lcm_mul_right_left (m n : Nat) : lcm (n * m) n = n * m := by
simpa [lcm_eq_iff, Nat.dvd_mul_right] using fun _ h _ => h
@[simp] theorem lcm_mul_right_right (m n : Nat) : lcm n (n * m) = n * m := by
simp [lcm_eq_iff, Nat.dvd_mul_right]
@[simp] theorem lcm_lcm_self_right_left (m n : Nat) : lcm m (lcm m n) = lcm m n := by
simp [← lcm_assoc]
@[simp] theorem lcm_lcm_self_right_right (m n : Nat) : lcm m (lcm n m) = lcm m n := by
rw [lcm_comm n m, lcm_lcm_self_right_left]
@[simp] theorem lcm_lcm_self_left_left (m n : Nat) : lcm (lcm m n) m = lcm n m := by
simp [lcm_comm]
@[simp] theorem lcm_lcm_self_left_right (m n : Nat) : lcm (lcm n m) m = lcm n m := by
simp [lcm_comm]
theorem lcm_eq_mul_iff {m n : Nat} : lcm m n = m * n ↔ m = 0 ∨ n = 0 ∨ gcd m n = 1 := by
rw [lcm_eq_mul_div, Nat.div_eq_self, Nat.mul_eq_zero, or_assoc]
@[simp] theorem lcm_eq_one_iff {m n : Nat} : lcm m n = 1 ↔ m = 1 ∧ n = 1 := by
refine ⟨fun h => ⟨?_, ?_⟩, by rintro ⟨rfl, rfl⟩; simp⟩ <;>
(apply Nat.eq_one_of_dvd_one; simp [← h, dvd_lcm_left, dvd_lcm_right])
theorem lcm_mul_right_dvd_mul_lcm (k m n : Nat) : lcm k (m * n) ∣ lcm k m * lcm k n :=
lcm_dvd (Nat.dvd_mul_left_of_dvd (dvd_lcm_left _ _) _)
(Nat.mul_dvd_mul (dvd_lcm_right _ _) (dvd_lcm_right _ _))
theorem lcm_mul_left_dvd_mul_lcm (k m n : Nat) : lcm (m * n) k ∣ lcm m k * lcm n k := by
simpa [lcm_comm, Nat.mul_comm] using lcm_mul_right_dvd_mul_lcm _ _ _
theorem lcm_dvd_mul_self_left_iff_dvd_mul {k n m : Nat} : lcm k n ∣ k * m ↔ n ∣ k * m :=
⟨fun h => Nat.dvd_trans (dvd_lcm_right _ _) h, fun h => lcm_dvd (Nat.dvd_mul_right k m) h⟩
theorem lcm_dvd_mul_self_right_iff_dvd_mul {k m n : Nat} : lcm n k ∣ m * k ↔ n ∣ m * k := by
rw [lcm_comm, Nat.mul_comm m, lcm_dvd_mul_self_left_iff_dvd_mul]
theorem lcm_mul_right_right_eq_mul_of_lcm_eq_mul {n m k : Nat} (h : lcm n m = n * m) :
lcm n (m * k) = lcm n k * m := by
rcases lcm_eq_mul_iff.1 h with (rfl|rfl|h) <;> try (simp; done)
rw [Nat.mul_comm _ m, lcm_eq_mul_div, gcd_mul_right_right_of_gcd_eq_one h, Nat.mul_comm,
Nat.mul_assoc, Nat.mul_comm k, Nat.mul_div_assoc _ (gcd_dvd_mul _ _), lcm_eq_mul_div]
theorem lcm_mul_left_right_eq_mul_of_lcm_eq_mul {n m k} (h : lcm n m = n * m) :
lcm n (k * m) = lcm n k * m := by
rw [Nat.mul_comm, lcm_mul_right_right_eq_mul_of_lcm_eq_mul h, Nat.mul_comm]
theorem lcm_mul_right_left_eq_mul_of_lcm_eq_mul {n m k} (h : lcm n m = n * m) :
lcm (n * k) m = n * lcm k m := by
rw [lcm_comm, lcm_mul_right_right_eq_mul_of_lcm_eq_mul, lcm_comm, Nat.mul_comm]
rwa [lcm_comm, Nat.mul_comm]
theorem lcm_mul_left_left_eq_mul_of_lcm_eq_mul {n m k} (h : lcm n m = n * m) :
lcm (k * n) m = n * lcm k m := by
rw [Nat.mul_comm, lcm_mul_right_left_eq_mul_of_lcm_eq_mul h]
theorem pow_lcm_pow {k n m : Nat} : lcm (n ^ k) (m ^ k) = (lcm n m) ^ k := by
rw [lcm_eq_mul_div, pow_gcd_pow, ← Nat.mul_pow, ← Nat.div_pow (gcd_dvd_mul _ _), ← lcm_eq_mul_div]
end Nat


@@ -532,6 +532,12 @@ protected theorem pos_of_lt_mul_right {a b c : Nat} (h : a < b * c) : 0 < b := b
replace h : 0 < b * c := by omega
exact Nat.pos_of_mul_pos_right h
protected theorem mul_dvd_mul_iff_left {a b c : Nat} (h : 0 < a) : a * b ∣ a * c ↔ b ∣ c :=
⟨fun ⟨k, hk⟩ => ⟨k, Nat.mul_left_cancel h (Nat.mul_assoc _ _ _ ▸ hk)⟩, Nat.mul_dvd_mul_left _⟩
protected theorem mul_dvd_mul_iff_right {a b c : Nat} (h : 0 < c) : a * c ∣ b * c ↔ a ∣ b := by
rw [Nat.mul_comm _ c, Nat.mul_comm _ c, Nat.mul_dvd_mul_iff_left h]
/-! ### div/mod -/
theorem mod_two_eq_zero_or_one (n : Nat) : n % 2 = 0 ∨ n % 2 = 1 :=
@@ -602,6 +608,27 @@ theorem mod_eq_sub (x w : Nat) : x % w = x - w * (x / w) := by
conv => rhs; congr; rw [← mod_add_div x w]
simp
theorem div_dvd_div {m n k : Nat} : k ∣ m → m ∣ n → m / k ∣ n / k := by
refine (Nat.eq_zero_or_pos k).elim (by rintro rfl; simp) (fun hk => ?_)
rintro ⟨a, rfl⟩ ⟨b, rfl⟩
rw [Nat.mul_comm, Nat.mul_div_cancel _ hk, Nat.mul_comm, Nat.mul_assoc, Nat.mul_div_cancel _ hk]
exact Nat.dvd_mul_left a b
@[simp] theorem div_dvd_iff_dvd_mul {a b c : Nat} (h : b ∣ a) (hb : 0 < b) :
a / b ∣ c ↔ a ∣ b * c := by
rcases h with ⟨k, rfl⟩
rw [Nat.mul_comm, Nat.mul_div_cancel _ hb, Nat.mul_comm, Nat.mul_dvd_mul_iff_left hb]
theorem div_eq_self {m n : Nat} : m / n = m ↔ m = 0 ∨ n = 1 := by
refine ⟨fun h => (Nat.eq_zero_or_pos m).elim Or.inl ?_, fun h => by cases h <;> simp_all⟩
refine fun hm => Or.inr ?_
rcases Nat.lt_trichotomy n 1 with (hn|hn|hn)
· obtain rfl : n = 0 := by rwa [lt_one_iff] at hn
obtain rfl : 0 = m := by simpa [Nat.div_zero] using h
simp at hm
· exact hn
· exact False.elim (absurd h (Nat.ne_of_lt (Nat.div_lt_self hm hn)))
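`div_eq_self` characterizes when division is the identity; one direction in use (illustrative, not part of the diff):

```lean
-- For a nonzero numerator, m / n = m forces n = 1.
example {m n : Nat} (h : m / n = m) (hm : m ≠ 0) : n = 1 :=
  (Nat.div_eq_self.1 h).resolve_left hm
```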
/-! ### pow -/
theorem pow_succ' {m n : Nat} : m ^ n.succ = m * m ^ n := by
@@ -626,9 +653,6 @@ protected theorem zero_pow {n : Nat} (H : 0 < n) : 0 ^ n = 0 := by
| zero => rfl
| succ _ ih => rw [Nat.pow_succ, Nat.mul_one, ih]
@[simp] protected theorem pow_one (a : Nat) : a ^ 1 = a := by
rw [Nat.pow_succ, Nat.pow_zero, Nat.one_mul]
protected theorem pow_two (a : Nat) : a ^ 2 = a * a := by rw [Nat.pow_succ, Nat.pow_one]
protected theorem pow_add (a m n : Nat) : a ^ (m + n) = a ^ m * a ^ n := by
@@ -650,11 +674,6 @@ protected theorem pow_mul' (a m n : Nat) : a ^ (m * n) = (a ^ n) ^ m := by
protected theorem pow_right_comm (a m n : Nat) : (a ^ m) ^ n = (a ^ n) ^ m := by
rw [ Nat.pow_mul, Nat.pow_mul']
protected theorem mul_pow (a b n : Nat) : (a * b) ^ n = a ^ n * b ^ n := by
induction n with
| zero => rw [Nat.pow_zero, Nat.pow_zero, Nat.pow_zero, Nat.mul_one]
| succ _ ih => rw [Nat.pow_succ, Nat.pow_succ, Nat.pow_succ, Nat.mul_mul_mul_comm, ih]
protected theorem one_lt_two_pow (h : n ≠ 0) : 1 < 2 ^ n :=
match n, h with
| n+1, _ => by
@@ -872,6 +891,18 @@ theorem dvd_of_pow_dvd {p k m : Nat} (hk : 1 ≤ k) (hpk : p ^ k m) : p
protected theorem pow_div {x m n : Nat} (h : n ≤ m) (hx : 0 < x) : x ^ m / x ^ n = x ^ (m - n) := by
rw [Nat.div_eq_iff_eq_mul_left (Nat.pow_pos hx) (Nat.pow_dvd_pow _ h), Nat.pow_sub_mul_pow _ h]
protected theorem div_pow {a b c : Nat} (h : a ∣ b) : (b / a) ^ c = b ^ c / a ^ c := by
refine (Nat.eq_zero_or_pos c).elim (by rintro rfl; simp) (fun hc => ?_)
refine (Nat.eq_zero_or_pos a).elim (by rintro rfl; simp [hc]) (fun ha => ?_)
rw [eq_comm, Nat.div_eq_iff_eq_mul_left (Nat.pow_pos ha)
((Nat.pow_dvd_pow_iff (Nat.pos_iff_ne_zero.1 hc)).2 h)]
clear hc
induction c with
| zero => simp
| succ c ih =>
rw [Nat.pow_succ (b / a), Nat.pow_succ a, Nat.mul_comm _ a, Nat.mul_assoc, Nat.mul_assoc _ a,
Nat.div_mul_cancel h, Nat.mul_comm b, Nat.mul_assoc, ih, Nat.pow_succ]
/-! ### shiftLeft and shiftRight -/
@[simp] theorem shiftLeft_zero : n <<< 0 = n := rfl


@@ -144,6 +144,13 @@ none
| none => b
| some a => f a rfl
/-- Partial filter. If `o : Option α`, `p : (a : α) → o = some a → Bool`, then `o.pfilter p` is
the same as `o.filter p` but `p` is passed the proof that `o = some a`. -/
@[inline] def pfilter (o : Option α) (p : (a : α) → o = some a → Bool) : Option α :=
match o with
| none => none
| some a => bif p a rfl then o else none
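On concrete values `pfilter` computes exactly like `filter`; the proof argument only matters when the predicate needs it. A quick sketch against this definition (illustrative, not part of the diff):

```lean
-- The proof `o = some a` is supplied as `rfl` and ignored by this predicate.
example : (some 5).pfilter (fun a _ => a > 3) = some 5 := rfl
example : (some 2).pfilter (fun a _ => a > 3) = none := rfl
example : (none : Option Nat).pfilter (fun a _ => a > 3) = none := rfl
```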
/--
Executes a monadic action on an optional value if it is present, or does nothing if there is no
value.


@@ -59,6 +59,15 @@ theorem get!_eq_getD [Inhabited α] (o : Option α) : o.get! = o.getD default :=
@[deprecated get!_eq_getD (since := "2024-11-18")] abbrev get!_eq_getD_default := @get!_eq_getD
theorem get_congr {o o' : Option α} {ho : o.isSome} (h : o = o') :
o.get ho = o'.get (h ▸ ho) := by
cases h; rfl
theorem get_inj {o1 o2 : Option α} {h1} {h2} :
o1.get h1 = o2.get h2 ↔ o1 = o2 := by
match o1, o2, h1, h2 with
| some a, some b, _, _ => simp only [Option.get_some, Option.some.injEq]
theorem mem_unique {o : Option α} {a b : α} (ha : a ∈ o) (hb : b ∈ o) : a = b :=
some.inj <| ha ▸ hb
@@ -75,6 +84,12 @@ theorem isSome_iff_exists : isSome x ↔ ∃ a, x = some a := by cases x <;> sim
theorem isSome_eq_isSome : (isSome x = isSome y) ↔ (x = none ↔ y = none) := by
cases x <;> cases y <;> simp
theorem isSome_of_mem {x : Option α} {y : α} (h : y ∈ x) : x.isSome := by
cases x <;> trivial
theorem isSome_of_eq_some {x : Option α} {y : α} (h : x = some y) : x.isSome := by
cases x <;> trivial
@[simp] theorem not_isSome : isSome a = false ↔ a.isNone = true := by
cases a <;> simp
@@ -142,6 +157,23 @@ theorem bind_congr {α β} {o : Option α} {f g : α → Option β} :
(h : ∀ a, o = some a → f a = g a) : o.bind f = o.bind g := by
cases o <;> simp
theorem isSome_bind {α β : Type _} (x : Option α) (f : α → Option β) :
(x.bind f).isSome = x.any (fun x => (f x).isSome) := by
cases x <;> rfl
theorem isSome_of_isSome_bind {α β : Type _} {x : Option α} {f : α → Option β}
(h : (x.bind f).isSome) : x.isSome := by
cases x <;> trivial
theorem isSome_apply_of_isSome_bind {α β : Type _} {x : Option α} {f : α → Option β}
(h : (x.bind f).isSome) : (f (x.get (isSome_of_isSome_bind h))).isSome := by
cases x <;> trivial
@[simp] theorem get_bind {α β : Type _} {x : Option α} {f : α → Option β} (h : (x.bind f).isSome) :
(x.bind f).get h = (f (x.get (isSome_of_isSome_bind h))).get
(isSome_apply_of_isSome_bind h) := by
cases x <;> trivial
theorem join_eq_some : x.join = some a ↔ x = some (some a) := by
simp [bind_eq_some]
@@ -221,11 +253,11 @@ theorem map_inj_right {f : α → β} {o o' : Option α} (w : ∀ x y, f x = f y
| none => simp
| some a' => simpa using ⟨fun h => w _ _ h, fun h => congrArg f h⟩
@[simp] theorem map_if {f : α → β} [Decidable c] :
@[simp] theorem map_if {f : α → β} {_ : Decidable c} :
(if c then some a else none).map f = if c then some (f a) else none := by
split <;> rfl
@[simp] theorem map_dif {f : α → β} [Decidable c] {a : c → α} :
@[simp] theorem map_dif {f : α → β} {_ : Decidable c} {a : c → α} :
(if h : c then some (a h) else none).map f = if h : c then some (f (a h)) else none := by
split <;> rfl
@@ -240,8 +272,8 @@ theorem isSome_of_isSome_filter (p : α → Bool) (o : Option α) (h : (o.filter
@[deprecated isSome_of_isSome_filter (since := "2025-03-18")]
abbrev isSome_filter_of_isSome := @isSome_of_isSome_filter
@[simp] theorem filter_eq_none {p : α → Bool} :
    o.filter p = none ↔ o = none ∨ ∀ a, a ∈ o → ¬ p a := by
@[simp] theorem filter_eq_none {o : Option α} {p : α → Bool} :
    o.filter p = none ↔ ∀ a, a ∈ o → ¬ p a := by
  cases o <;> simp [filter_some]
@[simp] theorem filter_eq_some {o : Option α} {p : α → Bool} :
@@ -262,6 +294,10 @@ theorem mem_filter_iff {p : α → Bool} {a : α} {o : Option α} :
    a ∈ o.filter p ↔ a ∈ o ∧ p a := by
simp
theorem filter_eq_bind (x : Option α) (p : α → Bool) :
    x.filter p = x.bind (Option.guard (fun a => p a)) := by
  cases x <;> rfl
@[simp] theorem all_guard (p : α → Prop) [DecidablePred p] (a : α) :
    Option.all q (guard p a) = (!p a || q a) := by
simp only [guard]
@@ -272,6 +308,45 @@ theorem mem_filter_iff {p : α → Bool} {a : α} {o : Option α} :
simp only [guard]
split <;> simp_all
theorem all_eq_true (p : α → Bool) (x : Option α) :
    x.all p = true ↔ ∀ y, x = some y → p y := by
  cases x <;> simp
theorem all_eq_true_iff_get (p : α → Bool) (x : Option α) :
    x.all p = true ↔ ∀ (h : x.isSome), p (x.get h) := by
  cases x <;> simp
theorem all_eq_false (p : α → Bool) (x : Option α) :
    x.all p = false ↔ ∃ y, x = some y ∧ p y = false := by
  cases x <;> simp
theorem all_eq_false_iff_get (p : α → Bool) (x : Option α) :
    x.all p = false ↔ ∃ h : x.isSome, p (x.get h) = false := by
  cases x <;> simp
theorem any_eq_true (p : α → Bool) (x : Option α) :
    x.any p = true ↔ ∃ y, x = some y ∧ p y := by
  cases x <;> simp
theorem any_eq_true_iff_get (p : α → Bool) (x : Option α) :
    x.any p = true ↔ ∃ h : x.isSome, p (x.get h) := by
  cases x <;> simp
theorem any_eq_false (p : α → Bool) (x : Option α) :
    x.any p = false ↔ ∀ y, x = some y → p y = false := by
  cases x <;> simp
theorem any_eq_false_iff_get (p : α → Bool) (x : Option α) :
    x.any p = false ↔ ∀ (h : x.isSome), p (x.get h) = false := by
  cases x <;> simp
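The boundary behaviour these characterizations encode can be checked directly by `rfl` (illustrative examples, not part of the diff):

```lean
-- `all` is vacuously true on `none`; `any` is vacuously false.
example : (none : Option Nat).all (· == 3) = true := rfl
example : (none : Option Nat).any (· == 3) = false := rfl
example : (some 3).all (· == 3) = true := rfl
example : (some 3).any (· == 4) = false := rfl
```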
theorem isSome_of_any {x : Option α} {p : α → Bool} (h : x.any p) : x.isSome := by
  cases x <;> trivial
theorem any_map {α β : Type _} {x : Option α} {f : α → β} {p : β → Bool} :
    (x.map f).any p = x.any (fun a => p (f a)) := by
  cases x <;> rfl
theorem bind_map_comm {α β} {x : Option (Option α)} {f : α → β} :
x.bind (Option.map f) = (x.map (Option.map f)).bind id := by cases x <;> simp
@@ -335,6 +410,18 @@ theorem guard_comp {p : α → Prop} [DecidablePred p] {f : β → α} :
ext1 b
simp [guard]
theorem bind_guard (x : Option α) (p : α → Prop) {_ : DecidablePred p} :
    x.bind (Option.guard p) = x.filter p := by
  simp only [Option.filter_eq_bind, decide_eq_true_eq]
theorem guard_eq_map (p : α → Prop) [DecidablePred p] :
    Option.guard p = fun x => Option.map (fun _ => x) (if p x then some x else none) := by
  funext x
  simp [Option.guard]
theorem guard_def (p : α → Prop) {_ : DecidablePred p} :
Option.guard p = fun x => if p x then some x else none := rfl
theorem liftOrGet_eq_or_eq {f : α → α → α} (h : ∀ a b, f a b = a ∨ f a b = b) :
    ∀ o₁ o₂, liftOrGet f o₁ o₂ = o₁ ∨ liftOrGet f o₁ o₂ = o₂
| none, none => .inl rfl
@@ -501,90 +588,104 @@ end beq
/-! ### ite -/
section ite
@[simp] theorem dite_none_left_eq_some {p : Prop} [Decidable p] {b : ¬p → Option β} :
@[simp] theorem dite_none_left_eq_some {p : Prop} {_ : Decidable p} {b : ¬p → Option β} :
    (if h : p then none else b h) = some a ↔ ∃ h, b h = some a := by
  split <;> simp_all
@[simp] theorem dite_none_right_eq_some {p : Prop} [Decidable p] {b : p → Option α} :
@[simp] theorem dite_none_right_eq_some {p : Prop} {_ : Decidable p} {b : p → Option α} :
    (if h : p then b h else none) = some a ↔ ∃ h, b h = some a := by
  split <;> simp_all
@[simp] theorem some_eq_dite_none_left {p : Prop} [Decidable p] {b : ¬p → Option β} :
@[simp] theorem some_eq_dite_none_left {p : Prop} {_ : Decidable p} {b : ¬p → Option β} :
    some a = (if h : p then none else b h) ↔ ∃ h, some a = b h := by
  split <;> simp_all
@[simp] theorem some_eq_dite_none_right {p : Prop} [Decidable p] {b : p → Option α} :
@[simp] theorem some_eq_dite_none_right {p : Prop} {_ : Decidable p} {b : p → Option α} :
    some a = (if h : p then b h else none) ↔ ∃ h, some a = b h := by
  split <;> simp_all
@[simp] theorem ite_none_left_eq_some {p : Prop} [Decidable p] {b : Option β} :
@[simp] theorem ite_none_left_eq_some {p : Prop} {_ : Decidable p} {b : Option β} :
    (if p then none else b) = some a ↔ ¬ p ∧ b = some a := by
  split <;> simp_all
@[simp] theorem ite_none_right_eq_some {p : Prop} [Decidable p] {b : Option α} :
@[simp] theorem ite_none_right_eq_some {p : Prop} {_ : Decidable p} {b : Option α} :
    (if p then b else none) = some a ↔ p ∧ b = some a := by
  split <;> simp_all
@[simp] theorem some_eq_ite_none_left {p : Prop} [Decidable p] {b : Option β} :
@[simp] theorem some_eq_ite_none_left {p : Prop} {_ : Decidable p} {b : Option β} :
    some a = (if p then none else b) ↔ ¬ p ∧ some a = b := by
  split <;> simp_all
@[simp] theorem some_eq_ite_none_right {p : Prop} [Decidable p] {b : Option α} :
@[simp] theorem some_eq_ite_none_right {p : Prop} {_ : Decidable p} {b : Option α} :
    some a = (if p then b else none) ↔ p ∧ some a = b := by
  split <;> simp_all
theorem mem_dite_none_left {x : α} [Decidable p] {l : ¬ p → Option α} :
theorem mem_dite_none_left {x : α} {_ : Decidable p} {l : ¬ p → Option α} :
    (x ∈ if h : p then none else l h) ↔ ∃ h : ¬ p, x ∈ l h := by
  simp
theorem mem_dite_none_right {x : α} [Decidable p] {l : p → Option α} :
theorem mem_dite_none_right {x : α} {_ : Decidable p} {l : p → Option α} :
    (x ∈ if h : p then l h else none) ↔ ∃ h : p, x ∈ l h := by
  simp
theorem mem_ite_none_left {x : α} [Decidable p] {l : Option α} :
theorem mem_ite_none_left {x : α} {_ : Decidable p} {l : Option α} :
    (x ∈ if p then none else l) ↔ ¬ p ∧ x ∈ l := by
  simp
theorem mem_ite_none_right {x : α} [Decidable p] {l : Option α} :
theorem mem_ite_none_right {x : α} {_ : Decidable p} {l : Option α} :
    (x ∈ if p then l else none) ↔ p ∧ x ∈ l := by
simp
@[simp] theorem isSome_dite {p : Prop} [Decidable p] {b : p → β} :
@[simp] theorem isSome_dite {p : Prop} {_ : Decidable p} {b : p → β} :
    (if h : p then some (b h) else none).isSome = true ↔ p := by
  split <;> simpa
@[simp] theorem isSome_ite {p : Prop} [Decidable p] :
@[simp] theorem isSome_ite {p : Prop} {_ : Decidable p} :
    (if p then some b else none).isSome = true ↔ p := by
  split <;> simpa
@[simp] theorem isSome_dite' {p : Prop} [Decidable p] {b : ¬ p → β} :
@[simp] theorem isSome_dite' {p : Prop} {_ : Decidable p} {b : ¬ p → β} :
    (if h : p then none else some (b h)).isSome = true ↔ ¬ p := by
  split <;> simpa
@[simp] theorem isSome_ite' {p : Prop} [Decidable p] :
@[simp] theorem isSome_ite' {p : Prop} {_ : Decidable p} :
    (if p then none else some b).isSome = true ↔ ¬ p := by
  split <;> simpa
@[simp] theorem get_dite {p : Prop} [Decidable p] (b : p → β) (w) :
@[simp] theorem get_dite {p : Prop} {_ : Decidable p} (b : p → β) (w) :
    (if h : p then some (b h) else none).get w = b (by simpa using w) := by
  split
  · simp
  · exfalso
    simp at w
    contradiction
@[simp] theorem get_ite {p : Prop} [Decidable p] (h) :
@[simp] theorem get_ite {p : Prop} {_ : Decidable p} (h) :
    (if p then some b else none).get h = b := by
  simpa using get_dite (p := p) (fun _ => b) (by simpa using h)
@[simp] theorem get_dite' {p : Prop} [Decidable p] (b : ¬ p → β) (w) :
@[simp] theorem get_dite' {p : Prop} {_ : Decidable p} (b : ¬ p → β) (w) :
    (if h : p then none else some (b h)).get w = b (by simpa using w) := by
  split
  · exfalso
    simp at w
    contradiction
  · simp
@[simp] theorem get_ite' {p : Prop} [Decidable p] (h) :
@[simp] theorem get_ite' {p : Prop} {_ : Decidable p} (h) :
    (if p then none else some b).get h = b := by
  simpa using get_dite' (p := p) (fun _ => b) (by simpa using h)
end ite
end ite
theorem isSome_filter {α : Type _} {x : Option α} {f : α → Bool} :
    (x.filter f).isSome = x.any f := by
  cases x
  · rfl
  · rw [Bool.eq_iff_iff]
    simp only [Option.any_some, Option.filter, Option.isSome_ite]
@[simp] theorem get_filter {α : Type _} {x : Option α} {f : α → Bool} (h : (x.filter f).isSome) :
    (x.filter f).get h = x.get (isSome_of_isSome_filter f x h) := by
  cases x
  · contradiction
  · unfold Option.filter
    simp only [Option.get_ite, Option.get_some]
/-! ### pbind -/
@[simp] theorem pbind_none : pbind none f = none := rfl
@@ -592,7 +693,16 @@ end ite
@[simp] theorem map_pbind {o : Option α} {f : (a : α) → a ∈ o → Option β} {g : β → γ} :
    (o.pbind f).map g = o.pbind (fun a h => (f a h).map g) := by
  cases o <;> simp
  cases o <;> rfl
@[simp] theorem pbind_map {α β γ : Type _} (o : Option α)
    (f : α → β) (g : (x : β) → o.map f = some x → Option γ) :
    (o.map f).pbind g = o.pbind (fun x h => g (f x) (h ▸ rfl)) := by
  cases o <;> rfl
@[simp] theorem pbind_eq_bind {α β : Type _} (o : Option α)
    (f : α → Option β) : o.pbind (fun x _ => f x) = o.bind f := by
  cases o <;> rfl
@[congr] theorem pbind_congr {o o' : Option α} (ho : o = o')
    {f : (a : α) → a ∈ o → Option β} {g : (a : α) → a ∈ o' → Option β}
@@ -602,39 +712,20 @@ end ite
theorem pbind_eq_none_iff {o : Option α} {f : (a : α) → a ∈ o → Option β} :
    o.pbind f = none ↔ o = none ∨ ∃ a h, f a h = none := by
  cases o with
  | none => simp
  | some a =>
    simp only [pbind_some, reduceCtorEq, mem_def, some.injEq, false_or]
    constructor
    · intro h
      exact ⟨a, rfl, h⟩
    · rintro ⟨a, rfl, h⟩
      exact h
  cases o <;> simp
theorem isSome_pbind_iff {o : Option α} {f : (a : α) → a ∈ o → Option β} :
    (o.pbind f).isSome ↔ ∃ a h, (f a h).isSome := by
  cases o <;> simp
@[deprecated "isSome_pbind_iff" (since := "2025-04-01")]
theorem pbind_isSome {o : Option α} {f : (a : α) → a ∈ o → Option β} :
    (o.pbind f).isSome = ∃ a h, (f a h).isSome := by
  cases o with
  | none => simp
  | some a =>
    simp only [pbind_some, mem_def, some.injEq, eq_iff_iff]
    constructor
    · intro h
      exact ⟨a, rfl, h⟩
    · rintro ⟨a, rfl, h⟩
      exact h
  exact propext isSome_pbind_iff
theorem pbind_eq_some_iff {o : Option α} {f : (a : α) → a ∈ o → Option β} {b : β} :
    o.pbind f = some b ↔ ∃ a h, f a h = some b := by
  cases o with
  | none => simp
  | some a =>
    simp only [pbind_some, mem_def, some.injEq]
    constructor
    · intro h
      exact ⟨a, rfl, h⟩
    · rintro ⟨a, rfl, h⟩
      exact h
  cases o <;> simp
/-! ### pmap -/
@@ -648,10 +739,12 @@ theorem pbind_eq_some_iff {o : Option α} {f : (a : α) → a ∈ o → Option
pmap f o h = none o = none := by
cases o <;> simp
@[simp] theorem pmap_isSome {p : α → Prop} {f : ∀ (a : α), p a → β} {o : Option α} {h} :
@[simp] theorem isSome_pmap {p : α → Prop} {f : ∀ (a : α), p a → β} {o : Option α} {h} :
    (pmap f o h).isSome = o.isSome := by
  cases o <;> simp
@[deprecated isSome_pmap (since := "2025-04-01")] abbrev pmap_isSome := @isSome_pmap
@[simp] theorem pmap_eq_some_iff {p : α → Prop} {f : ∀ (a : α), p a → β} {o : Option α} {h} :
    pmap f o h = some b ↔ ∃ (a : α) (h : p a), o = some a ∧ b = f a h := by
cases o with
@@ -677,6 +770,28 @@ theorem pmap_map (o : Option α) (f : α → β) {p : β → Prop} (g : ∀ b, p
pmap (fun a h => g (f a) h) o (fun a m => H (f a) (mem_map_of_mem f m)) := by
cases o <;> simp
theorem pmap_pred_congr {α : Type u}
    {p p' : α → Prop} (hp : ∀ x, p x ↔ p' x)
    {o o' : Option α} (ho : o = o')
    (h : ∀ x, x ∈ o → p x) : ∀ x, x ∈ o' → p' x := by
  intro y hy
  cases ho
  exact (hp y).mp (h y hy)
@[congr]
theorem pmap_congr {α : Type u} {β : Type v}
    {p p' : α → Prop} (hp : ∀ x, p x ↔ p' x)
    {f : (x : α) → p x → β} {f' : (x : α) → p' x → β}
    (hf : ∀ x h, f x ((hp x).mpr h) = f' x h)
    {o o' : Option α} (ho : o = o')
    {h : ∀ x, x ∈ o → p x} :
    Option.pmap f o h = Option.pmap f' o' (Option.pmap_pred_congr hp ho h) := by
  cases ho
  cases o
  · rfl
  · dsimp
    rw [hf]
/-! ### pelim -/
@[simp] theorem pelim_none : pelim none b f = b := rfl
@@ -691,6 +806,69 @@ theorem pmap_map (o : Option α) (f : α → β) {p : β → Prop} (g : ∀ b, p
o.pelim g (fun a h => g' (f a (H a h))) := by
cases o <;> simp
/-! ### pfilter -/
@[congr]
theorem pfilter_congr {α : Type u} {o o' : Option α} (ho : o = o')
    {f : (a : α) → o = some a → Bool} {g : (a : α) → o' = some a → Bool}
    (hf : ∀ a ha, f a (ho.trans ha) = g a ha) :
    o.pfilter f = o'.pfilter g := by
  cases ho
  congr; funext a ha
  exact hf a ha
@[simp] theorem pfilter_none {α : Type _} {p : (a : α) → none = some a → Bool} :
    none.pfilter p = none := by
  rfl
@[simp] theorem pfilter_some {α : Type _} {x : α} {p : (a : α) → some x = some a → Bool} :
    (some x).pfilter p = if p x rfl then some x else none := by
  simp only [pfilter, cond_eq_if]
theorem isSome_pfilter_iff {α : Type _} {o : Option α} {p : (a : α) → o = some a → Bool} :
    (o.pfilter p).isSome ↔ ∃ (a : α) (ha : o = some a), p a ha := by
  cases o <;> simp
theorem isSome_pfilter_iff_get {α : Type _} {o : Option α} {p : (a : α) → o = some a → Bool} :
    (o.pfilter p).isSome ↔ ∃ (h : o.isSome), p (o.get h) (get_mem h) := by
  cases o <;> simp
theorem isSome_of_isSome_pfilter {α : Type _} {o : Option α} {p : (a : α) → o = some a → Bool}
    (h : (o.pfilter p).isSome) : o.isSome :=
  (isSome_pfilter_iff_get.mp h).1
@[simp] theorem get_pfilter {α : Type _} {o : Option α} {p : (a : α) → o = some a → Bool}
    (h : (o.pfilter p).isSome) :
    (o.pfilter p).get h = o.get (isSome_of_isSome_pfilter h) := by
  cases o <;> simp
theorem pfilter_eq_none_iff {α : Type _} {o : Option α} {p : (a : α) → o = some a → Bool} :
    o.pfilter p = none ↔ o = none ∨ ∃ (a : α) (ha : o = some a), p a ha = false := by
  cases o <;> simp
theorem pfilter_eq_some_iff {α : Type _} {o : Option α} {p : (a : α) → o = some a → Bool}
    {a : α} : o.pfilter p = some a ↔ ∃ ha, p a ha = true := by
  simp only [eq_some_iff_get_eq, get_pfilter, isSome_pfilter_iff]
  constructor
  · rintro ⟨⟨b, hb, rfl⟩, hb', rfl⟩
    exact ⟨hb, rfl, hb'⟩
  · rintro ⟨h, rfl, h'⟩
    exact ⟨⟨o.get h, h, rfl⟩, h', rfl⟩
@[simp] theorem pfilter_eq_filter {α : Type _} {o : Option α} {p : α → Bool} :
    o.pfilter (fun a _ => p a) = o.filter p := by
  cases o with
  | none => rfl
  | some a =>
    simp only [pfilter, Option.filter, Bool.cond_eq_ite_iff]
theorem pfilter_eq_pbind_ite {α : Type _} {o : Option α}
    {p : (a : α) → o = some a → Bool} :
    o.pfilter p = o.pbind (fun a h => if p a h then some a else none) := by
  cases o
  · rfl
  · simp only [Option.pfilter, Bool.cond_eq_ite, Option.pbind_some]
/-! ### LT and LE -/
@[simp] theorem not_lt_none [LT α] {a : Option α} : ¬ a < none := by cases a <;> simp [LT.lt, Option.lt]

View File

@@ -2393,6 +2393,17 @@ instance : Std.LawfulIdentity (α := ISize) (· + ·) 0 where
@[simp] protected theorem Int64.sub_self (a : Int64) : a - a = 0 := Int64.toBitVec_inj.1 (BitVec.sub_self _)
@[simp] protected theorem ISize.sub_self (a : ISize) : a - a = 0 := ISize.toBitVec_inj.1 (BitVec.sub_self _)
protected theorem Int8.add_left_neg (a : Int8) : -a + a = 0 := Int8.toBitVec_inj.1 (BitVec.add_left_neg _)
protected theorem Int8.add_right_neg (a : Int8) : a + -a = 0 := Int8.toBitVec_inj.1 (BitVec.add_right_neg _)
protected theorem Int16.add_left_neg (a : Int16) : -a + a = 0 := Int16.toBitVec_inj.1 (BitVec.add_left_neg _)
protected theorem Int16.add_right_neg (a : Int16) : a + -a = 0 := Int16.toBitVec_inj.1 (BitVec.add_right_neg _)
protected theorem Int32.add_left_neg (a : Int32) : -a + a = 0 := Int32.toBitVec_inj.1 (BitVec.add_left_neg _)
protected theorem Int32.add_right_neg (a : Int32) : a + -a = 0 := Int32.toBitVec_inj.1 (BitVec.add_right_neg _)
protected theorem Int64.add_left_neg (a : Int64) : -a + a = 0 := Int64.toBitVec_inj.1 (BitVec.add_left_neg _)
protected theorem Int64.add_right_neg (a : Int64) : a + -a = 0 := Int64.toBitVec_inj.1 (BitVec.add_right_neg _)
protected theorem ISize.add_left_neg (a : ISize) : -a + a = 0 := ISize.toBitVec_inj.1 (BitVec.add_left_neg _)
protected theorem ISize.add_right_neg (a : ISize) : a + -a = 0 := ISize.toBitVec_inj.1 (BitVec.add_right_neg _)
@[simp] protected theorem Int8.sub_add_cancel (a b : Int8) : a - b + b = a :=
Int8.toBitVec_inj.1 (BitVec.sub_add_cancel _ _)
@[simp] protected theorem Int16.sub_add_cancel (a b : Int16) : a - b + b = a :=

View File

@@ -2376,6 +2376,17 @@ instance : Std.LawfulIdentity (α := USize) (· + ·) 0 where
@[simp] protected theorem UInt64.sub_self (a : UInt64) : a - a = 0 := UInt64.toBitVec_inj.1 (BitVec.sub_self _)
@[simp] protected theorem USize.sub_self (a : USize) : a - a = 0 := USize.toBitVec_inj.1 (BitVec.sub_self _)
protected theorem UInt8.add_left_neg (a : UInt8) : -a + a = 0 := UInt8.toBitVec_inj.1 (BitVec.add_left_neg _)
protected theorem UInt8.add_right_neg (a : UInt8) : a + -a = 0 := UInt8.toBitVec_inj.1 (BitVec.add_right_neg _)
protected theorem UInt16.add_left_neg (a : UInt16) : -a + a = 0 := UInt16.toBitVec_inj.1 (BitVec.add_left_neg _)
protected theorem UInt16.add_right_neg (a : UInt16) : a + -a = 0 := UInt16.toBitVec_inj.1 (BitVec.add_right_neg _)
protected theorem UInt32.add_left_neg (a : UInt32) : -a + a = 0 := UInt32.toBitVec_inj.1 (BitVec.add_left_neg _)
protected theorem UInt32.add_right_neg (a : UInt32) : a + -a = 0 := UInt32.toBitVec_inj.1 (BitVec.add_right_neg _)
protected theorem UInt64.add_left_neg (a : UInt64) : -a + a = 0 := UInt64.toBitVec_inj.1 (BitVec.add_left_neg _)
protected theorem UInt64.add_right_neg (a : UInt64) : a + -a = 0 := UInt64.toBitVec_inj.1 (BitVec.add_right_neg _)
protected theorem USize.add_left_neg (a : USize) : -a + a = 0 := USize.toBitVec_inj.1 (BitVec.add_left_neg _)
protected theorem USize.add_right_neg (a : USize) : a + -a = 0 := USize.toBitVec_inj.1 (BitVec.add_right_neg _)
@[simp] protected theorem UInt8.neg_zero : -(0 : UInt8) = 0 := rfl
@[simp] protected theorem UInt16.neg_zero : -(0 : UInt16) = 0 := rfl
@[simp] protected theorem UInt32.neg_zero : -(0 : UInt32) = 0 := rfl

View File

@@ -2783,7 +2783,7 @@ theorem any_eq_not_all_not {xs : Vector α n} {p : α → Bool} : xs.any p = !xs
simp
@[simp] theorem all_filter {xs : Vector α n} {p q : α → Bool} :
    (xs.filter p).all q = xs.all fun a => p a → q a := by
    (xs.filter p).all q = xs.all fun a => !(p a) || q a := by
  rcases xs with ⟨xs, rfl⟩
simp

View File

@@ -15,3 +15,13 @@ instance (priority := 300) Zero.toOfNat0 {α} [Zero α] : OfNat α (nat_lit 0) w
instance (priority := 200) Zero.ofOfNat0 {α} [OfNat α (nat_lit 0)] : Zero α where
zero := 0
/-!
Instances converting between `One α` and `OfNat α (nat_lit 1)`.
-/
instance (priority := 300) One.toOfNat1 {α} [One α] : OfNat α (nat_lit 1) where
  ofNat := ‹One α›.1
instance (priority := 200) One.ofOfNat1 {α} [OfNat α (nat_lit 1)] : One α where
one := 1
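A quick illustration of the round trip these instances provide (hypothetical example, not part of the change; `Wrap` is a made-up type):

```lean
-- With a `One Wrap` instance in scope, the literal `1` elaborates through
-- `One.toOfNat1`, so `(1 : Wrap)` is definitionally `One.one`.
structure Wrap where val : Nat
instance : One Wrap where one := ⟨1⟩
example : (1 : Wrap).val = 1 := rfl
```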

View File

@@ -236,10 +236,12 @@ namespace List
instance : GetElem (List α) Nat α fun as i => i < as.length where
  getElem as i h := as.get ⟨i, h⟩
@[simp] theorem getElem_cons_zero (a : α) (as : List α) (h : 0 < (a :: as).length) : getElem (a :: as) 0 h = a := by
@[simp, grind]
theorem getElem_cons_zero (a : α) (as : List α) (h : 0 < (a :: as).length) : getElem (a :: as) 0 h = a := by
rfl
@[simp] theorem getElem_cons_succ (a : α) (as : List α) (i : Nat) (h : i + 1 < (a :: as).length) : getElem (a :: as) (i+1) h = getElem as i (Nat.lt_of_succ_lt_succ h) := by
@[simp, grind]
theorem getElem_cons_succ (a : α) (as : List α) (i : Nat) (h : i + 1 < (a :: as).length) : getElem (a :: as) (i+1) h = getElem as i (Nat.lt_of_succ_lt_succ h) := by
rfl
@[simp] theorem getElem_mem : ∀ {l : List α} {n} (h : n < l.length), l[n]'h ∈ l

View File

@@ -12,3 +12,4 @@ import Init.Grind.Propagator
import Init.Grind.Util
import Init.Grind.Offset
import Init.Grind.PP
import Init.Grind.CommRing

View File

@@ -0,0 +1,11 @@
/-
Copyright (c) 2025 Lean FRO, LLC. or its affiliates. All Rights Reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Kim Morrison
-/
prelude
import Init.Grind.CommRing.Basic
import Init.Grind.CommRing.Int
import Init.Grind.CommRing.UInt
import Init.Grind.CommRing.SInt
import Init.Grind.CommRing.BitVec

View File

@@ -0,0 +1,47 @@
/-
Copyright (c) 2025 Lean FRO, LLC. or its affiliates. All Rights Reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Kim Morrison
-/
prelude
import Init.Data.Zero
/-!
# A monolithic commutative ring typeclass for internal use in `grind`.
-/
namespace Lean.Grind
class CommRing (α : Type u) extends Add α, Zero α, Mul α, One α, Neg α where
  add_assoc : ∀ a b c : α, a + b + c = a + (b + c)
  add_comm : ∀ a b : α, a + b = b + a
  add_zero : ∀ a : α, a + 0 = a
  neg_add_cancel : ∀ a : α, -a + a = 0
  mul_assoc : ∀ a b c : α, a * b * c = a * (b * c)
  mul_comm : ∀ a b : α, a * b = b * a
  mul_one : ∀ a : α, a * 1 = a
  left_distrib : ∀ a b c : α, a * (b + c) = a * b + a * c
  zero_mul : ∀ a : α, 0 * a = 0
namespace CommRing
variable {α : Type u} [CommRing α]
theorem zero_add (a : α) : 0 + a = a := by
rw [add_comm, add_zero]
theorem add_neg_cancel (a : α) : a + -a = 0 := by
rw [add_comm, neg_add_cancel]
theorem one_mul (a : α) : 1 * a = a := by
rw [mul_comm, mul_one]
theorem right_distrib (a b c : α) : (a + b) * c = a * c + b * c := by
rw [mul_comm, left_distrib, mul_comm c, mul_comm c]
theorem mul_zero (a : α) : a * 0 = 0 := by
rw [mul_comm, zero_mul]
end CommRing
end Lean.Grind
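The axioms and derived lemmas above already support small equational rewrites; a hypothetical usage sketch (not part of the diff):

```lean
-- Hypothetical example using the `CommRing` axioms from the file above.
open Lean.Grind.CommRing in
example {α : Type} [Lean.Grind.CommRing α] (a b : α) :
    (a + 0) * b = b * a := by
  rw [add_zero, mul_comm]
```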

View File

@@ -0,0 +1,23 @@
/-
Copyright (c) 2025 Lean FRO, LLC. or its affiliates. All Rights Reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Kim Morrison
-/
prelude
import Init.Grind.CommRing.Basic
import Init.Data.BitVec.Lemmas
namespace Lean.Grind
instance : CommRing (BitVec w) where
add_assoc := BitVec.add_assoc
add_comm := BitVec.add_comm
add_zero := BitVec.add_zero
neg_add_cancel := BitVec.add_left_neg
mul_assoc := BitVec.mul_assoc
mul_comm := BitVec.mul_comm
mul_one := BitVec.mul_one
left_distrib _ _ _ := BitVec.mul_add
zero_mul _ := BitVec.zero_mul
end Lean.Grind

View File

@@ -0,0 +1,23 @@
/-
Copyright (c) 2025 Lean FRO, LLC. or its affiliates. All Rights Reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Kim Morrison
-/
prelude
import Init.Grind.CommRing.Basic
import Init.Data.Int.Lemmas
namespace Lean.Grind
instance : CommRing Int where
add_assoc := Int.add_assoc
add_comm := Int.add_comm
add_zero := Int.add_zero
neg_add_cancel := Int.add_left_neg
mul_assoc := Int.mul_assoc
mul_comm := Int.mul_comm
mul_one := Int.mul_one
left_distrib := Int.mul_add
zero_mul := Int.zero_mul
end Lean.Grind

View File

@@ -0,0 +1,67 @@
/-
Copyright (c) 2025 Lean FRO, LLC. or its affiliates. All Rights Reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Kim Morrison
-/
prelude
import Init.Grind.CommRing.Basic
import Init.Data.SInt.Lemmas
namespace Lean.Grind
instance : CommRing Int8 where
add_assoc := Int8.add_assoc
add_comm := Int8.add_comm
add_zero := Int8.add_zero
neg_add_cancel := Int8.add_left_neg
mul_assoc := Int8.mul_assoc
mul_comm := Int8.mul_comm
mul_one := Int8.mul_one
left_distrib _ _ _ := Int8.mul_add
zero_mul _ := Int8.zero_mul
instance : CommRing Int16 where
add_assoc := Int16.add_assoc
add_comm := Int16.add_comm
add_zero := Int16.add_zero
neg_add_cancel := Int16.add_left_neg
mul_assoc := Int16.mul_assoc
mul_comm := Int16.mul_comm
mul_one := Int16.mul_one
left_distrib _ _ _ := Int16.mul_add
zero_mul _ := Int16.zero_mul
instance : CommRing Int32 where
add_assoc := Int32.add_assoc
add_comm := Int32.add_comm
add_zero := Int32.add_zero
neg_add_cancel := Int32.add_left_neg
mul_assoc := Int32.mul_assoc
mul_comm := Int32.mul_comm
mul_one := Int32.mul_one
left_distrib _ _ _ := Int32.mul_add
zero_mul _ := Int32.zero_mul
instance : CommRing Int64 where
add_assoc := Int64.add_assoc
add_comm := Int64.add_comm
add_zero := Int64.add_zero
neg_add_cancel := Int64.add_left_neg
mul_assoc := Int64.mul_assoc
mul_comm := Int64.mul_comm
mul_one := Int64.mul_one
left_distrib _ _ _ := Int64.mul_add
zero_mul _ := Int64.zero_mul
instance : CommRing ISize where
add_assoc := ISize.add_assoc
add_comm := ISize.add_comm
add_zero := ISize.add_zero
neg_add_cancel := ISize.add_left_neg
mul_assoc := ISize.mul_assoc
mul_comm := ISize.mul_comm
mul_one := ISize.mul_one
left_distrib _ _ _ := ISize.mul_add
zero_mul _ := ISize.zero_mul
end Lean.Grind

View File

@@ -0,0 +1,67 @@
/-
Copyright (c) 2025 Lean FRO, LLC. or its affiliates. All Rights Reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Kim Morrison
-/
prelude
import Init.Grind.CommRing.Basic
import Init.Data.UInt.Lemmas
namespace Lean.Grind
instance : CommRing UInt8 where
add_assoc := UInt8.add_assoc
add_comm := UInt8.add_comm
add_zero := UInt8.add_zero
neg_add_cancel := UInt8.add_left_neg
mul_assoc := UInt8.mul_assoc
mul_comm := UInt8.mul_comm
mul_one := UInt8.mul_one
left_distrib _ _ _ := UInt8.mul_add
zero_mul _ := UInt8.zero_mul
instance : CommRing UInt16 where
add_assoc := UInt16.add_assoc
add_comm := UInt16.add_comm
add_zero := UInt16.add_zero
neg_add_cancel := UInt16.add_left_neg
mul_assoc := UInt16.mul_assoc
mul_comm := UInt16.mul_comm
mul_one := UInt16.mul_one
left_distrib _ _ _ := UInt16.mul_add
zero_mul _ := UInt16.zero_mul
instance : CommRing UInt32 where
add_assoc := UInt32.add_assoc
add_comm := UInt32.add_comm
add_zero := UInt32.add_zero
neg_add_cancel := UInt32.add_left_neg
mul_assoc := UInt32.mul_assoc
mul_comm := UInt32.mul_comm
mul_one := UInt32.mul_one
left_distrib _ _ _ := UInt32.mul_add
zero_mul _ := UInt32.zero_mul
instance : CommRing UInt64 where
add_assoc := UInt64.add_assoc
add_comm := UInt64.add_comm
add_zero := UInt64.add_zero
neg_add_cancel := UInt64.add_left_neg
mul_assoc := UInt64.mul_assoc
mul_comm := UInt64.mul_comm
mul_one := UInt64.mul_one
left_distrib _ _ _ := UInt64.mul_add
zero_mul _ := UInt64.zero_mul
instance : CommRing USize where
add_assoc := USize.add_assoc
add_comm := USize.add_comm
add_zero := USize.add_zero
neg_add_cancel := USize.add_left_neg
mul_assoc := USize.mul_assoc
mul_comm := USize.mul_comm
mul_one := USize.mul_one
left_distrib _ _ _ := USize.mul_add
zero_mul _ := USize.zero_mul
end Lean.Grind

View File

@@ -18,6 +18,9 @@ theorem rfl_true : true = true :=
def intro_with_eq (p p' : Prop) (q : Sort u) (he : p = p') (h : p' → q) : p → q :=
  fun hp => h (he.mp hp)
def intro_with_eq' (p p' : Prop) (q : p → Sort u) (he : p = p') (h : (h : p') → q (he.mpr_prop h)) : (h : p) → q h :=
  fun hp => h (he.mp hp)
/-! And -/
theorem and_eq_of_eq_true_left {a b : Prop} (h : a = True) : (a ∧ b) = b := by simp [h]
@@ -74,6 +77,14 @@ theorem eq_congr' {α : Sort u} {a₁ b₁ a₂ b₂ : α} (h₁ : a₁ = b₂)
theorem ne_of_ne_of_eq_left {α : Sort u} {a b c : α} (h₁ : a = b) (h₂ : b ≠ c) : a ≠ c := by simp [*]
theorem ne_of_ne_of_eq_right {α : Sort u} {a b c : α} (h₁ : a = c) (h₂ : b ≠ c) : b ≠ a := by simp [*]
/-! BEq -/
theorem beq_eq_true_of_eq {α : Type u} {_ : BEq α} {_ : LawfulBEq α} {a b : α} (h : a = b) : (a == b) = true := by
  simp [*]
theorem beq_eq_false_of_diseq {α : Type u} {_ : BEq α} {_ : LawfulBEq α} {a b : α} (h : ¬ a = b) : (a == b) = false := by
  simp [*]
/-! Bool.and -/
theorem Bool.and_eq_of_eq_true_left {a b : Bool} (h : a = true) : (a && b) = b := by simp [h]
@@ -102,6 +113,9 @@ theorem Bool.not_eq_of_eq_false {a : Bool} (h : a = false) : (!a) = true := by s
theorem Bool.eq_false_of_not_eq_true {a : Bool} (h : (!a) = true) : a = false := by simp_all
theorem Bool.eq_true_of_not_eq_false {a : Bool} (h : (!a) = false) : a = true := by simp_all
theorem Bool.eq_false_of_not_eq_true' {a : Bool} (h : ¬ a = true) : a = false := by simp_all
theorem Bool.eq_true_of_not_eq_false' {a : Bool} (h : ¬ a = false) : a = true := by simp_all
theorem Bool.false_of_not_eq_self {a : Bool} (h : (!a) = a) : False := by
by_cases a <;> simp_all

View File

@@ -120,7 +120,7 @@ init_grind_norm
-- Bool not
Bool.not_not
-- beq
beq_iff_eq beq_eq_decide_eq
beq_iff_eq beq_eq_decide_eq beq_self_eq_true
-- bne
bne_iff_ne bne_eq_decide_not_eq
-- Bool not eq true/false
@@ -147,5 +147,8 @@ init_grind_norm
Int.Linear.sub_fold Int.Linear.neg_fold
-- Int divides
Int.one_dvd Int.zero_dvd
-- Function composition
Function.const_apply Function.comp_apply Function.const_comp
Function.comp_const Function.true_comp Function.false_comp
end Lean.Grind

View File

@@ -78,6 +78,19 @@ def offsetUnexpander : PrettyPrinter.Unexpander := fun stx => do
| `($_ $lhs:term $rhs:term) => `($lhs + $rhs)
| _ => throw ()
/-
Remark: `↑a` is notation for `Nat.cast a`. `Nat.cast` is an abbreviation
for `NatCast.natCast`. We added it because users wanted to use dot-notation (e.g., `a.cast`).
`grind` expands all reducible definitions. Thus, a `grind` failure state contains
many `NatCast.natCast` applications, which is too verbose. We add the following
unexpander to cope with this issue.
-/
@[app_unexpander NatCast.natCast]
def natCastUnexpander : PrettyPrinter.Unexpander := fun stx => do
match stx with
| `($_ $a:term) => `($a)
| _ => throw ()
/--
A marker to indicate that a proposition has already been normalized and should not
be processed again.

View File

@@ -488,7 +488,6 @@ instance instCCPOPProd [CCPO α] [CCPO β] : CCPO (α ×' β) where
  csup c := ⟨CCPO.csup (PProd.chain.fst c), CCPO.csup (PProd.chain.snd c)⟩
  csup_spec := by
    intro ⟨a, b⟩ c hchain
    try dsimp -- TODO(kmill) remove
    constructor
    next =>
      intro ⟨h₁, h₂⟩ ⟨a', b'⟩ cab

View File

@@ -1415,6 +1415,11 @@ class Zero (α : Type u) where
/-- The zero element of the type. -/
zero : α
/-- A type with a "one" element. -/
class One (α : Type u) where
/-- The "one" element of the type. -/
one : α
/-- The homogeneous version of `HAdd`: `a + b : α` where `a b : α`. -/
class Add (α : Type u) where
/-- `a + b` computes the sum of `a` and `b`. See `HAdd`. -/

View File

@@ -334,6 +334,13 @@ theorem not_forall_of_exists_not {p : α → Prop} : (∃ x, ¬p x) → ¬∀ x,
@[simp] theorem exists_eq_right' : (∃ a, p a ∧ a' = a) ↔ p a' := by simp [@eq_comm _ a']
@[simp] theorem exists_prop_eq {p : (a : α) → a = a' → Prop} :
    (∃ (a : α) (h : a = a'), p a h) ↔ p a' rfl :=
  ⟨fun ⟨_, e, h⟩ => e ▸ h, fun h => ⟨_, rfl, h⟩⟩
@[simp] theorem exists_prop_eq' {p : (a : α) → a' = a → Prop} :
    (∃ (a : α) (h : a' = a), p a h) ↔ p a' rfl := by simp [@eq_comm _ a']
@[simp] theorem forall_eq_or_imp : (∀ a, a = a' ∨ q a → p a) ↔ p a' ∧ ∀ a, q a → p a := by
simp only [or_imp, forall_and, forall_eq]

View File

@@ -1367,8 +1367,7 @@ A child process that was spawned with configuration `cfg`.
The configuration determines whether the child process's standard input, standard output, and
standard error are `IO.FS.Handle`s or `Unit`.
-/
-- TODO(Sebastian): constructor must be private
structure Child (cfg : StdioConfig) where
structure Child (cfg : StdioConfig) where private mk ::
/--
The child process's standard input handle, if it was configured as `IO.Process.Stdio.piped`, or
`()` otherwise.
@@ -1428,6 +1427,9 @@ standard input is exhausted.
@[extern "lean_io_process_child_take_stdin"] opaque Child.takeStdin {cfg : @& StdioConfig} : Child cfg →
IO (cfg.stdin.toHandleType × Child { cfg with stdin := Stdio.null })
/-- Returns the operating system process id of the child process. -/
@[extern "lean_io_process_child_pid"] opaque Child.pid {cfg : @& StdioConfig} : Child cfg → UInt32
/--
The result of running a process to completion.
-/

View File

@@ -40,3 +40,4 @@ import Lean.Replay
import Lean.PrivateName
import Lean.PremiseSelection
import Lean.Namespace
import Lean.EnvExtension

View File

@@ -4,7 +4,7 @@ Released under Apache 2.0 license as described in the file LICENSE.
Authors: Leonardo de Moura
-/
prelude
import Lean.Environment
import Lean.EnvExtension
namespace Lean

View File

@@ -14,29 +14,32 @@ namespace Lean.Compiler.FFI
@[extern "lean_get_leanc_extra_flags"]
private opaque getLeancExtraFlags : Unit → String
private def flagsStringToArray (s : String) : Array String :=
  s.splitOn.toArray |>.filter (· ≠ "")
/-- Return C compiler flags for including Lean's headers. -/
def getCFlags (leanSysroot : FilePath) : Array String :=
#["-I", (leanSysroot / "include").toString] ++ (getLeancExtraFlags ()).trim.splitOn
#["-I", (leanSysroot / "include").toString] ++ flagsStringToArray (getLeancExtraFlags ())
@[extern "lean_get_leanc_internal_flags"]
private opaque getLeancInternalFlags : Unit → String
/-- Return C compiler flags needed to use the C compiler bundled with the Lean toolchain. -/
def getInternalCFlags (leanSysroot : FilePath) : Array String :=
(getLeancInternalFlags ()).trim.splitOn.toArray.map (·.replace "ROOT" leanSysroot.toString)
flagsStringToArray (getLeancInternalFlags ()) |>.map (·.replace "ROOT" leanSysroot.toString)
@[extern "lean_get_linker_flags"]
private opaque getBuiltinLinkerFlags (linkStatic : Bool) : String
/-- Return linker flags for linking against Lean's libraries. -/
def getLinkerFlags (leanSysroot : FilePath) (linkStatic := true) : Array String :=
#["-L", (leanSysroot / "lib" / "lean").toString] ++ (getBuiltinLinkerFlags linkStatic).trim.splitOn
#["-L", (leanSysroot / "lib" / "lean").toString] ++ flagsStringToArray (getBuiltinLinkerFlags linkStatic)
@[extern "lean_get_internal_linker_flags"]
private opaque getBuiltinInternalLinkerFlags : Unit → String
/-- Return linker flags needed to use the linker bundled with the Lean toolchain. -/
def getInternalLinkerFlags (leanSysroot : FilePath) : Array String :=
(getBuiltinInternalLinkerFlags ()).trim.splitOn.toArray.map (·.replace "ROOT" leanSysroot.toString)
flagsStringToArray (getBuiltinInternalLinkerFlags ()) |>.map (·.replace "ROOT" leanSysroot.toString)
end Lean.Compiler.FFI
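The refactoring above routes all flag strings through `flagsStringToArray`, which drops the empty fragments that `splitOn` produces for repeated whitespace. The behavior can be sketched in isolation (`parseFlags` is an illustrative stand-in, not the private function itself):

```lean
def parseFlags (s : String) : Array String :=
  -- `splitOn` with the default separator " " yields "" for consecutive
  -- spaces; the filter removes those empty fragments.
  s.splitOn.toArray |>.filter (· ≠ "")

#eval parseFlags " -O2  -DNDEBUG "  -- #["-O2", "-DNDEBUG"]
```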

View File

@@ -180,7 +180,7 @@ structure CtorInfo where
size : Nat
usize : Nat
ssize : Nat
deriving Repr
deriving Inhabited, Repr
def CtorInfo.beq : CtorInfo → CtorInfo → Bool
| ⟨n₁, cidx₁, size₁, usize₁, ssize₁⟩, ⟨n₂, cidx₂, size₂, usize₂, ssize₂⟩ =>
@@ -223,6 +223,7 @@ inductive Expr where
| lit (v : LitVal)
/-- Return `1 : uint8` Iff `RC(x) > 1` -/
| isShared (x : VarId)
deriving Inhabited
@[export lean_ir_mk_ctor_expr] def mkCtorExpr (n : Name) (cidx : Nat) (size : Nat) (usize : Nat) (ssize : Nat) (ys : Array Arg) : Expr :=
Expr.ctor ⟨n, cidx, size, usize, ssize⟩ ys

View File

@@ -15,6 +15,7 @@ inductive CtorFieldInfo where
| object (i : Nat)
| usize (i : Nat)
| scalar (sz : Nat) (offset : Nat) (type : IRType)
deriving Inhabited
namespace CtorFieldInfo

View File

@@ -10,10 +10,7 @@ import Lean.Compiler.LCNF.Internalize
namespace Lean.Compiler.LCNF
abbrev AuxDeclCache := PHashMap Decl Name
builtin_initialize auxDeclCacheExt : EnvExtension AuxDeclCache ←
registerEnvExtension (pure {}) (asyncMode := .sync) -- compilation is non-parallel anyway
builtin_initialize auxDeclCacheExt : CacheExtension Decl Name ← CacheExtension.register
inductive CacheAuxDeclResult where
| new
@@ -22,11 +19,11 @@ inductive CacheAuxDeclResult where
def cacheAuxDecl (decl : Decl) : CompilerM CacheAuxDeclResult := do
let key := { decl with name := .anonymous }
let key ← normalizeFVarIds key
match auxDeclCacheExt.getState (← getEnv) |>.find? key with
match (← auxDeclCacheExt.find? key) with
| some declName =>
return .alreadyCached declName
| none =>
modifyEnv fun env => auxDeclCacheExt.modifyState env fun s => s.insert key decl.name
auxDeclCacheExt.insert key decl.name
return .new
end Lean.Compiler.LCNF

View File

@@ -14,21 +14,15 @@ State for the environment extension used to save the LCNF base phase type for de
that do not have code associated with them.
Example: constructors, inductive types, foreign functions.
-/
structure BaseTypeExtState where
/-- The LCNF type for the `base` phase. -/
base : PHashMap Name Expr := {}
deriving Inhabited
builtin_initialize baseTypeExt : EnvExtension BaseTypeExtState ←
registerEnvExtension (pure {}) (asyncMode := .sync) -- compilation is non-parallel anyway
builtin_initialize baseTypeExt : CacheExtension Name Expr ← CacheExtension.register
def getOtherDeclBaseType (declName : Name) (us : List Level) : CoreM Expr := do
let info ← getConstInfo declName
let type ← match baseTypeExt.getState (← getEnv) |>.base.find? declName with
let type ← match (← baseTypeExt.find? declName) with
| some type => pure type
| none =>
let type ← Meta.MetaM.run' <| toLCNFType info.type
modifyEnv fun env => baseTypeExt.modifyState env fun s => { s with base := s.base.insert declName type }
baseTypeExt.insert declName type
pure type
return type.instantiateLevelParamsNoCache info.levelParams us

View File

@@ -483,4 +483,26 @@ def getConfig : CompilerM ConfigOptions :=
def CompilerM.run (x : CompilerM α) (s : State := {}) (phase : Phase := .base) : CoreM α := do
x { phase, config := toConfigOptions (← getOptions) } |>.run' s
/-- Environment extension for local caching of key-value pairs, not persisted in .olean files. -/
structure CacheExtension (α β : Type) [BEq α] [Hashable α] extends EnvExtension (List α × PHashMap α β)
deriving Inhabited
namespace CacheExtension
def register [BEq α] [Hashable α] [Inhabited β] :
IO (CacheExtension α β) :=
CacheExtension.mk <$> registerEnvExtension (pure ([], {})) (asyncMode := .sync) -- compilation is non-parallel anyway
(replay? := some fun oldState newState _ s =>
let newEntries := newState.1.take (newState.1.length - oldState.1.length)
newEntries.foldl (init := s) fun s e =>
(e :: s.1, s.2.insert e (newState.2.find! e)))
def insert [BEq α] [Hashable α] [Inhabited β] (ext : CacheExtension α β) (a : α) (b : β) : CoreM Unit := do
modifyEnv (ext.modifyState · fun ⟨as, m⟩ => (a :: as, m.insert a b))
def find? [BEq α] [Hashable α] [Inhabited β] (ext : CacheExtension α β) (a : α) : CoreM (Option β) := do
return ext.toEnvExtension.getState (← getEnv) |>.2.find? a
end CacheExtension
end Lean.Compiler.LCNF
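The `replay?` handler registered for `CacheExtension` re-applies exactly the entries that a newer state gained over an older snapshot, using the insertion-order list to identify them. A simplified standalone model of that logic (the `Snapshot`/`replay` names are illustrative, not the real extension API):

```lean
import Std.Data.HashMap

/-- Simplified cache state: newest-first key list plus a lookup map. -/
structure Snapshot (α β : Type) [BEq α] [Hashable α] where
  keys : List α
  map  : Std.HashMap α β

/-- Replay onto `s` the entries `new` gained over `old`, mirroring the
`replay?` closure above: freshly added keys are the prefix of the
newest-first list, and their values are looked up in `new`. -/
def Snapshot.replay [BEq α] [Hashable α] [Inhabited β]
    (old new s : Snapshot α β) : Snapshot α β :=
  let newKeys := new.keys.take (new.keys.length - old.keys.length)
  newKeys.foldl (init := s) fun s k =>
    { keys := k :: s.keys, map := s.map.insert k (new.map.get! k) }
```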

View File

@@ -249,6 +249,7 @@ builtin_initialize functionSummariesExt : SimplePersistentEnvExtension (Name ×
addEntryFn := fun s ⟨e, n⟩ => s.insert e n
toArrayFn := fun s => s.toArray.qsort decLt
asyncMode := .sync -- compilation is non-parallel anyway
replay? := some <| SimplePersistentEnvExtension.replayOfFilter (!·.contains ·.1) (fun s ⟨e, n⟩ => s.insert e n)
}
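`SimplePersistentEnvExtension.replayOfFilter` derives a `replay?` function from a "not already present" filter plus the extension's entry-adding function. Its intended semantics can be sketched independently (hypothetical `replayFiltered`, not the real signature):

```lean
/-- Sketch: fold `newEntries` into `s`, adding only the entries the
filter `fresh` reports as not yet present in the state. -/
def replayFiltered (fresh : σ → ε → Bool) (add : σ → ε → σ)
    (newEntries : List ε) (s : σ) : σ :=
  newEntries.foldl (init := s) fun s e =>
    if fresh s e then add s e else s
```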
/--

View File

@@ -111,20 +111,14 @@ State for the environment extension used to save the LCNF mono phase type for de
that do not have code associated with them.
Example: constructors, inductive types, foreign functions.
-/
structure MonoTypeExtState where
/-- The LCNF type for the `mono` phase. -/
mono : PHashMap Name Expr := {}
deriving Inhabited
builtin_initialize monoTypeExt : EnvExtension MonoTypeExtState ←
registerEnvExtension (pure {}) (asyncMode := .sync) -- compilation is non-parallel anyway
builtin_initialize monoTypeExt : CacheExtension Name Expr ← CacheExtension.register
def getOtherDeclMonoType (declName : Name) : CoreM Expr := do
match monoTypeExt.getState (← getEnv) |>.mono.find? declName with
match (← monoTypeExt.find? declName) with
| some type => return type
| none =>
let type ← toMonoType (← getOtherDeclBaseType declName [])
modifyEnv fun env => monoTypeExt.modifyState env fun s => { s with mono := s.mono.insert declName type }
monoTypeExt.insert declName type
return type
end Lean.Compiler.LCNF

View File

@@ -94,7 +94,6 @@ builtin_initialize passManagerExt : PersistentEnvExtension Name (Name × PassMan
addImportedFn := fun ns => return ([], ← ImportM.runCoreM <| runImportedDecls ns)
addEntryFn := fun (installerDeclNames, _) (installerDeclName, managerNew) => (installerDeclName :: installerDeclNames, managerNew)
exportEntriesFn := fun s => s.1.reverse.toArray
asyncMode := .sync
}
def getPassManager : CoreM PassManager :=

View File

@@ -21,22 +21,21 @@ private abbrev findAtSorted? (decls : Array Decl) (declName : Name) : Option Dec
let tmpDecl := { tmpDecl with name := declName }
decls.binSearch tmpDecl declLt
abbrev DeclExt := PersistentEnvExtension Decl Decl DeclExtState
abbrev DeclExt := SimplePersistentEnvExtension Decl DeclExtState
def mkDeclExt (name : Name := by exact decl_name%) : IO DeclExt := do
registerPersistentEnvExtension {
registerSimplePersistentEnvExtension {
name := name
mkInitial := return {}
addImportedFn := fun _ => return {}
addImportedFn := fun _ => {}
addEntryFn := fun decls decl => decls.insert decl.name decl
exportEntriesFn := fun s =>
let decls := s.foldl (init := #[]) fun decls _ decl => decls.push decl
sortDecls decls
toArrayFn := (sortDecls ·.toArray)
asyncMode := .sync -- compilation is non-parallel anyway
replay? := some <| SimplePersistentEnvExtension.replayOfFilter
(fun s d => !s.contains d.name) (fun decls decl => decls.insert decl.name decl)
}
builtin_initialize baseExt : PersistentEnvExtension Decl Decl DeclExtState ← mkDeclExt
builtin_initialize monoExt : PersistentEnvExtension Decl Decl DeclExtState ← mkDeclExt
builtin_initialize baseExt : DeclExt ← mkDeclExt
builtin_initialize monoExt : DeclExt ← mkDeclExt
def getDeclCore? (env : Environment) (ext : DeclExt) (declName : Name) : Option Decl :=
match env.getModuleIdxFor? declName with

View File

@@ -397,7 +397,7 @@ structure FolderOleanEntry where
structure FolderEntry extends FolderOleanEntry where
folder : Folder
builtin_initialize folderExt : PersistentEnvExtension FolderOleanEntry FolderEntry (List FolderOleanEntry × SMap Name Folder) ←
builtin_initialize folderExt : PersistentEnvExtension FolderOleanEntry FolderEntry (List FolderEntry × SMap Name Folder) ←
registerPersistentEnvExtension {
mkInitial := return ([], builtinFolders)
addImportedFn := fun entriesArray => do
@@ -408,9 +408,12 @@ builtin_initialize folderExt : PersistentEnvExtension FolderOleanEntry FolderEnt
let folder ← IO.ofExcept <| getFolderCore ctx.env ctx.opts folderDeclName
folders := folders.insert declName folder
return ([], folders.switch)
addEntryFn := fun (entries, map) entry => (entry.toFolderOleanEntry :: entries, map.insert entry.declName entry.folder)
exportEntriesFn := fun (entries, _) => entries.reverse.toArray
addEntryFn := fun (entries, map) entry => (entry :: entries, map.insert entry.declName entry.folder)
exportEntriesFn := fun (entries, _) => entries.reverse.toArray.map (·.toFolderOleanEntry)
asyncMode := .sync
replay? := some fun oldState newState _ s =>
let newEntries := newState.1.take (newState.1.length - oldState.1.length)
(newEntries ++ s.1, newEntries.foldl (init := s.2) fun s e => s.insert e.declName (newState.2.find! e.declName))
}
def registerFolder (declName : Name) (folderDeclName : Name) : CoreM Unit := do

View File

@@ -86,6 +86,8 @@ builtin_initialize specExtension : SimplePersistentEnvExtension SpecEntry SpecSt
addImportedFn := fun _ => {}
toArrayFn := fun s => sortEntries s.toArray
asyncMode := .sync
replay? := some <| SimplePersistentEnvExtension.replayOfFilter
(!·.specInfo.contains ·.declName) SpecState.addEntry
}
/--

View File

@@ -33,6 +33,8 @@ builtin_initialize specCacheExt : SimplePersistentEnvExtension CacheEntry Cache
addEntryFn := addEntry
addImportedFn := fun es => (mkStateFromImportedEntries addEntry {} es).switch
asyncMode := .sync
replay? := some <| SimplePersistentEnvExtension.replayOfFilter
(!·.contains ·.key) addEntry
}
def cacheSpec (key : Expr) (declName : Name) : CoreM Unit :=

View File

@@ -4,7 +4,7 @@ Released under Apache 2.0 license as described in the file LICENSE.
Authors: Leonardo de Moura
-/
prelude
import Lean.Environment
import Lean.EnvExtension
namespace Lean

View File

@@ -145,34 +145,12 @@ structure DiagnosticWith (α : Type) where
/-- An array of related diagnostic information, e.g. when symbol-names within a scope collide all definitions can be marked via this property. -/
relatedInformation? : Option (Array DiagnosticRelatedInformation) := none
/-- A data entry field that is preserved between a `textDocument/publishDiagnostics` notification and `textDocument/codeAction` request. -/
data?: Option Json := none
data? : Option Json := none
deriving Inhabited, BEq, ToJson, FromJson
def DiagnosticWith.fullRange (d : DiagnosticWith α) : Range :=
d.fullRange?.getD d.range
attribute [local instance] Ord.arrayOrd in
/-- Restriction of `DiagnosticWith` to properties that are displayed to users in the InfoView. -/
private structure DiagnosticWith.UserVisible (α : Type) where
range : Range
fullRange? : Option Range
severity? : Option DiagnosticSeverity
code? : Option DiagnosticCode
source? : Option String
message : α
tags? : Option (Array DiagnosticTag)
relatedInformation? : Option (Array DiagnosticRelatedInformation)
deriving Ord
/-- Extracts user-visible properties from the given `DiagnosticWith`. -/
private def DiagnosticWith.UserVisible.ofDiagnostic (d : DiagnosticWith α)
: DiagnosticWith.UserVisible α :=
{ d with }
/-- Compares `DiagnosticWith` instances modulo non-user-facing properties. -/
def compareByUserVisible [Ord α] (a b : DiagnosticWith α) : Ordering :=
compare (DiagnosticWith.UserVisible.ofDiagnostic a) (DiagnosticWith.UserVisible.ofDiagnostic b)
abbrev Diagnostic := DiagnosticWith String
/-- Parameters for the [`textDocument/publishDiagnostics` notification](https://microsoft.github.io/language-server-protocol/specifications/lsp/3.17/specification/#textDocument_publishDiagnostics). -/

View File

@@ -7,6 +7,7 @@ Authors: Joscha Mennicken
prelude
import Lean.Expr
import Lean.Data.Lsp.Basic
import Lean.Data.JsonRpc
import Std.Data.TreeMap
set_option linter.missingDocs true -- keep it documented
@@ -201,4 +202,62 @@ structure LeanStaleDependencyParams where
staleDependency : DocumentUri
deriving FromJson, ToJson
/-- LSP type for `Lean.OpenDecl`. -/
inductive OpenNamespace
/-- All declarations in `«namespace»` are opened, except for `exceptions`. -/
| allExcept («namespace» : Name) (exceptions : Array Name)
/-- The declaration `«from»` is renamed to `to`. -/
| renamed («from» : Name) (to : Name)
deriving FromJson, ToJson
/-- Query in the `$/lean/queryModule` watchdog <- worker request. -/
structure LeanModuleQuery where
/-- Identifier (potentially partial) to query. -/
identifier : String
/--
Namespaces that are open at the position of `identifier`.
Used for accurately matching declarations against `identifier` in context.
-/
openNamespaces : Array OpenNamespace
deriving FromJson, ToJson
/--
Used in the `$/lean/queryModule` watchdog <- worker request, which is used by the worker to
extract information from the .ilean information in the watchdog.
-/
structure LeanQueryModuleParams where
/--
The request ID in the context of which this worker -> watchdog request was emitted.
Used for cancelling this request in the watchdog.
-/
sourceRequestID : JsonRpc.RequestID
/-- Module queries for extracting .ilean information in the watchdog. -/
queries : Array LeanModuleQuery
deriving FromJson, ToJson
/-- Result entry of a module query. -/
structure LeanIdentifier where
/-- Module that `decl` is defined in. -/
module : Name
/-- Full name of the declaration that matches the query. -/
decl : Name
/-- Whether this `decl` matched the query exactly. -/
isExactMatch : Bool
deriving FromJson, ToJson
/--
Result for a single module query.
Identifiers in the response are sorted descendingly by how well they match the query.
-/
abbrev LeanQueriedModule := Array LeanIdentifier
/-- Response for the `$/lean/queryModule` watchdog <- worker request. -/
structure LeanQueryModuleResponse where
/--
Results for each query in `LeanQueryModuleParams`.
Positions correspond to `queries` in the parameter of the request.
-/
queryResults : Array LeanQueriedModule
deriving FromJson, ToJson, Inhabited
end Lean.Lsp
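A worker-side value of these parameter types might be constructed as follows (illustrative identifiers and request id; the wire encoding comes from the derived `ToJson` instances):

```lean
open Lean Lean.Lsp in
def exampleQueryParams : LeanQueryModuleParams where
  -- Id of the originating request, used for cancellation in the watchdog.
  sourceRequestID := .str "example-request-1"
  queries := #[{
    -- Potentially partial identifier to look up in the .ilean data.
    identifier := "Filter.map",
    -- `Topology` and the `F` ↦ `Filter` renaming are made-up context.
    openNamespaces := #[.allExcept `Topology #[],
                        .renamed `F `Filter]
  }]
```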

View File

@@ -1222,8 +1222,8 @@ private def resolveLValAux (e : Expr) (eType : Expr) (lval : LVal) : TermElabM L
-- Then search the environment
if let some (baseStructName, fullName) ← findMethod? structName (.mkSimple fieldName) then
return LValResolution.const baseStructName structName fullName
throwLValError e eType
m!"invalid field '{fieldName}', the environment does not contain '{Name.mkStr structName fieldName}'"
let msg := mkUnknownIdentifierMessage m!"invalid field '{fieldName}', the environment does not contain '{Name.mkStr structName fieldName}'"
throwLValError e eType msg
| none, LVal.fieldName _ _ (some suffix) _ =>
if e.isConst then
throwUnknownConstant (e.constName! ++ suffix)
@@ -1502,7 +1502,7 @@ where
else if let some (fvar, []) ← resolveLocalName idNew then
return fvar
else
throwError "invalid dotted identifier notation, unknown identifier `{idNew}` from expected type{indentExpr expectedType}"
throwUnknownIdentifier m!"invalid dotted identifier notation, unknown identifier `{idNew}` from expected type{indentExpr expectedType}"
catch
| ex@(.error ..) =>
match (← unfoldDefinition? resultType) with
@@ -1550,7 +1550,7 @@ private partial def elabAppFn (f : Syntax) (lvals : List LVal) (namedArgs : Arra
| `(@$_) => throwUnsupportedSyntax -- invalid occurrence of `@`
| `(_) => throwError "placeholders '_' cannot be used where a function is expected"
| `(.$id:ident) =>
addCompletionInfo <| CompletionInfo.dotId f id.getId (← getLCtx) expectedType?
addCompletionInfo <| CompletionInfo.dotId id id.getId (← getLCtx) expectedType?
let fConst ← resolveDotName id expectedType?
let s ← observing do
-- Use (force := true) because we want to record the result of .ident resolution even in patterns

View File

@@ -22,7 +22,8 @@ def processHeader (header : Syntax) (opts : Options) (messages : MessageLog)
(plugins : Array System.FilePath := #[]) (leakEnv := false)
: IO (Environment × MessageLog) := do
try
let env ← importModules (leakEnv := leakEnv) (headerToImports header) opts trustLevel plugins
let env ←
importModules (leakEnv := leakEnv) (loadExts := true) (headerToImports header) opts trustLevel plugins
pure (env, messages)
catch e =>
let env ← mkEmptyEnvironment

View File

@@ -87,11 +87,11 @@ def applyAttributesOf (preDefs : Array PreDefinition) (applicationTime : Attribu
for preDef in preDefs do
applyAttributesAt preDef.declName preDef.modifiers.attrs applicationTime
def abstractNestedProofs (preDef : PreDefinition) : MetaM PreDefinition := withRef preDef.ref do
def abstractNestedProofs (preDef : PreDefinition) (cache := true) : MetaM PreDefinition := withRef preDef.ref do
if preDef.kind.isTheorem || preDef.kind.isExample then
pure preDef
else do
let value ← Meta.abstractNestedProofs preDef.declName preDef.value
let value ← Meta.abstractNestedProofs (cache := cache) preDef.declName preDef.value
pure { preDef with value := value }
/-- Auxiliary method for (temporarily) adding pre definition as an axiom -/
@@ -121,9 +121,9 @@ private def reportTheoremDiag (d : TheoremVal) : TermElabM Unit := do
-- let info
logInfo <| MessageData.trace { cls := `theorem } m!"{d.name}" (#[sizeMsg] ++ constOccsMsg)
private def addNonRecAux (preDef : PreDefinition) (compile : Bool) (all : List Name) (applyAttrAfterCompilation := true) : TermElabM Unit :=
private def addNonRecAux (preDef : PreDefinition) (compile : Bool) (all : List Name) (applyAttrAfterCompilation := true) (cacheProofs := true) : TermElabM Unit :=
withRef preDef.ref do
let preDef ← abstractNestedProofs preDef
let preDef ← abstractNestedProofs (cache := cacheProofs) preDef
let mkDefDecl : TermElabM Declaration :=
return Declaration.defnDecl {
name := preDef.declName, levelParams := preDef.levelParams, type := preDef.type, value := preDef.value
@@ -168,8 +168,8 @@ private def addNonRecAux (preDef : PreDefinition) (compile : Bool) (all : List N
def addAndCompileNonRec (preDef : PreDefinition) (all : List Name := [preDef.declName]) : TermElabM Unit := do
addNonRecAux preDef (compile := true) (all := all)
def addNonRec (preDef : PreDefinition) (applyAttrAfterCompilation := true) (all : List Name := [preDef.declName]) : TermElabM Unit := do
addNonRecAux preDef (compile := false) (applyAttrAfterCompilation := applyAttrAfterCompilation) (all := all)
def addNonRec (preDef : PreDefinition) (applyAttrAfterCompilation := true) (all : List Name := [preDef.declName]) (cacheProofs := true) : TermElabM Unit := do
addNonRecAux preDef (compile := false) (applyAttrAfterCompilation := applyAttrAfterCompilation) (all := all) (cacheProofs := cacheProofs)
/--
Eliminate recursive application annotations containing syntax. These annotations are used by the well-founded recursion module

View File

@@ -27,7 +27,7 @@ where
go (fvars.push x) (vals.map fun val => val.bindingBody!.instantiate1 x)
def addPreDefsFromUnary (preDefs : Array PreDefinition) (preDefsNonrec : Array PreDefinition)
(unaryPreDefNonRec : PreDefinition) : TermElabM Unit := do
(unaryPreDefNonRec : PreDefinition) (cacheProofs := true) : TermElabM Unit := do
/-
We must remove `implemented_by` attributes from the auxiliary application because
this attribute is only relevant for code that is compiled. Moreover, the `[implemented_by <decl>]`
@@ -41,21 +41,21 @@ def addPreDefsFromUnary (preDefs : Array PreDefinition) (preDefsNonrec : Array P
-- we recognize that below and then do not set @[irreducible]
withOptions (allowUnsafeReducibility.set · true) do
if unaryPreDefNonRec.declName = preDefs[0]!.declName then
addNonRec preDefNonRec (applyAttrAfterCompilation := false)
addNonRec preDefNonRec (applyAttrAfterCompilation := false) (cacheProofs := cacheProofs)
else
withEnableInfoTree false do
addNonRec preDefNonRec (applyAttrAfterCompilation := false)
preDefsNonrec.forM (addNonRec · (applyAttrAfterCompilation := false) (all := declNames))
addNonRec preDefNonRec (applyAttrAfterCompilation := false) (cacheProofs := cacheProofs)
preDefsNonrec.forM (addNonRec · (applyAttrAfterCompilation := false) (all := declNames) (cacheProofs := cacheProofs))
/--
Cleans the right-hand-sides of the predefinitions, to prepare for inclusion in the EqnInfos:
* Remove RecAppSyntax markers
* Abstracts nested proofs (and for that, add the `_unsafe_rec` definitions)
-/
def cleanPreDefs (preDefs : Array PreDefinition) : TermElabM (Array PreDefinition) := do
def cleanPreDefs (preDefs : Array PreDefinition) (cacheProofs := true) : TermElabM (Array PreDefinition) := do
addAndCompilePartialRec preDefs
let preDefs ← preDefs.mapM (eraseRecAppSyntax ·)
let preDefs ← preDefs.mapM (abstractNestedProofs ·)
let preDefs ← preDefs.mapM (abstractNestedProofs (cache := cacheProofs) ·)
return preDefs
/--

View File

@@ -66,8 +66,8 @@ def wfRecursion (preDefs : Array PreDefinition) (termMeasure?s : Array (Option T
trace[Elab.definition.wf] ">> {preDefNonRec.declName} :=\n{preDefNonRec.value}"
let preDefsNonrec ← preDefsFromUnaryNonRec fixedParamPerms argsPacker preDefs preDefNonRec
Mutual.addPreDefsFromUnary preDefs preDefsNonrec preDefNonRec
let preDefs ← Mutual.cleanPreDefs preDefs
Mutual.addPreDefsFromUnary (cacheProofs := false) preDefs preDefsNonrec preDefNonRec
let preDefs ← Mutual.cleanPreDefs (cacheProofs := false) preDefs
registerEqnsInfo preDefs preDefNonRec.declName fixedParamPerms argsPacker
for preDef in preDefs, wfPreprocessProof in wfPreprocessProofs do
unless preDef.kind.isTheorem do

View File

@@ -86,8 +86,9 @@ private partial def printStructure (id : Name) (levelParams : List Name) (numPar
let env ← getEnv
let kind := if isClass env id then "class" else "structure"
let header ← mkHeader' kind id levelParams type isUnsafe (sig := false)
let levels := levelParams.map Level.param
liftTermElabM <| forallTelescope (← getConstInfo id).type fun params _ =>
let s := Expr.const id (levelParams.map .param)
let s := Expr.const id levels
withLocalDeclD `self (mkAppN s params) fun self => do
let mut m : MessageData := header
-- Signature
@@ -100,15 +101,12 @@ private partial def printStructure (id : Name) (levelParams : List Name) (numPar
unless parents.isEmpty do
m := m ++ Format.line ++ "parents:"
for parent in parents do
let ptype ← inferType (mkApp (mkAppN (.const parent.projFn (levelParams.map .param)) params) self)
let ptype ← inferType (mkApp (mkAppN (.const parent.projFn levels) params) self)
m := m ++ indentD m!"{.ofConstName parent.projFn (fullNames := true)} : {ptype}"
-- Fields
-- Collect params in a map for default value processing
let paramMap : NameMap Expr ← params.foldlM (init := {}) fun paramMap param => do
pure <| paramMap.insert (← param.fvarId!.getUserName) param
-- Collect autoParam tactics, which are all on the flat constructor:
let flatCtorName := mkFlatCtorOfStructCtorName ctor
let flatCtorInfo ← try getConstInfo flatCtorName catch _ => getConstInfo (id ++ `_flat_ctor) -- TODO(kmill): remove catch
let flatCtorInfo ← getConstInfo flatCtorName
let autoParams : NameMap Syntax ← forallTelescope flatCtorInfo.type fun args _ =>
args[numParams:].foldlM (init := {}) fun set arg => do
let decl ← arg.fvarId!.getDecl
@@ -136,9 +134,7 @@ private partial def printStructure (id : Name) (levelParams : List Name) (numPar
let stx : TSyntax ``Parser.Tactic.tacticSeq := stx
pure m!" := by{indentD stx}"
else if let some defFn := getEffectiveDefaultFnForField? env id field then
let cinfo ← getConstInfo defFn
let defValue ← instantiateValueLevelParams cinfo (levelParams.map .param)
if let some val ← processDefaultValue paramMap fieldMap defValue then
if let some (_, val) ← instantiateStructDefaultValueFn? defFn levels params (pure fieldMap.find?) then
pure m!" :={indentExpr val}"
else
pure m!" := <error>"
@@ -157,24 +153,6 @@ private partial def printStructure (id : Name) (levelParams : List Name) (numPar
-- Omit proofs; the delaborator enables `pp.proofs` for non-constant proofs, but we don't want this for default values
withOptions (fun opts => opts.set pp.proofs.name false) do
logInfo m
where
processDefaultValue (paramMap : NameMap Expr) (fieldValues : NameMap Expr) : Expr → MetaM (Option Expr)
| .lam n d b c => do
if c.isExplicit then
let some val := fieldValues.find? n | return none
if ← isDefEq (← inferType val) d then
processDefaultValue paramMap fieldValues (b.instantiate1 val)
else
return none
else
let some param := paramMap.find? n | return none
if ← isDefEq (← inferType param) d then
processDefaultValue paramMap fieldValues (b.instantiate1 param)
else
return none
| e =>
let_expr id _ a := e | return some e
return some a
private def printIdCore (id : Name) : CommandElabM Unit := do
let env getEnv

View File

@@ -399,7 +399,7 @@ The `structureType?` is the expected type of the structure instance.
-/
private def mkCtorHeader (ctorVal : ConstructorVal) (structureType? : Option Expr) : TermElabM CtorHeaderResult := do
let flatCtorName := mkFlatCtorOfStructCtorName ctorVal.name
let cinfo ← try getConstInfo flatCtorName catch _ => getConstInfo (ctorVal.induct ++ `_flat_ctor) -- TODO(kmill): remove catch
let cinfo ← getConstInfo flatCtorName
let us ← mkFreshLevelMVars ctorVal.levelParams.length
let mut type ← instantiateTypeLevelParams cinfo.toConstantVal us
let mut params : Array Expr := #[]
@@ -726,36 +726,13 @@ After default values are resolved, then the one that is added to the environment
as an `_inherited_default` auxiliary function is normalized; we don't do those normalizations here.
-/
private partial def getFieldDefaultValue? (fieldName : Name) : StructInstM (NameSet × Option Expr) := do
match getEffectiveDefaultFnForField? (← getEnv) (← read).structName fieldName with
| none => return ({}, none)
| some defaultFn =>
trace[Elab.struct] "default fn for '{fieldName}' is '{.ofConstName defaultFn}'"
let cinfo ← getConstInfo defaultFn
let us := (← read).levels
go? {} <| (← instantiateValueLevelParams cinfo us).beta (← read).params
where
logFailure : StructInstM (NameSet × Option Expr) := do
logError m!"default value for field '{fieldName}' of structure '{.ofConstName (← read).structName}' could not be instantiated, ignoring"
return ({}, none)
go? (usedFields : NameSet) (e : Expr) : StructInstM (NameSet × Option Expr) := do
match e with
| Expr.lam n d b c =>
if c.isExplicit then
let some val := (← get).fieldMap.find? n
| trace[Elab.struct] "default value error: no value for field '{n}'"
logFailure
let valType ← inferType val
if (← isDefEq valType d) then
go? (usedFields.insert n) (b.instantiate1 val)
else
trace[Elab.struct] "default value error: {← mkHasTypeButIsExpectedMsg valType d}"
logFailure
else
trace[Elab.struct] "default value error: unexpected implicit argument{indentExpr e}"
logFailure
| e =>
let_expr id _ a := e | return (usedFields, some e)
return (usedFields, some a)
let some defFn := getEffectiveDefaultFnForField? (← getEnv) (← read).structName fieldName
| return ({}, none)
let fieldMap := (← get).fieldMap
let some (fields, val) ← instantiateStructDefaultValueFn? defFn (← read).levels (← read).params (pure fieldMap.find?)
| logError m!"default value for field '{fieldName}' of structure '{.ofConstName (← read).structName}' could not be instantiated, ignoring"
return ({}, none)
return (fields, val)
/--
Auxiliary type for `synthDefaultFields`
@@ -935,11 +912,14 @@ where
let ctor := getStructureCtor (← getEnv) parentName
unless params.size == ctor.numParams do return .done e
let flatCtorName := mkFlatCtorOfStructCtorName ctor.name
let cinfo ← try getConstInfo flatCtorName catch _ => getConstInfo (ctor.induct ++ `_flat_ctor) -- TODO(kmill): remove catch
let cinfo ← getConstInfo flatCtorName
let ctorVal ← instantiateValueLevelParams cinfo us
let fieldArgs := parentFields.map fieldMap.find!
-- Normalize the expressions since there might be some projections.
let params ← params.mapM normalizeExpr
let e' := (ctorVal.beta params).beta fieldArgs
return .done e'
-- Continue, since we need to reduce the parameters.
return .continue e'
private def getParentStructType? (parentStructName : Name) : StructInstM (Option (Expr × Option Name)) := do
let env ← getEnv
@@ -953,6 +933,7 @@ private def getParentStructType? (parentStructName : Name) : StructInstM (Option
let params := ty.getAppArgs
pure <| mkApp (mkAppN (.const projFn us) params) e
let projTy ← whnf <| ← inferType proj
let projTy ← normalizeExpr projTy
let projTy ← reduceSelfProjs self projTy
let projTy ← normalizeExpr projTy
if projTy.containsFVar self.fvarId! then

View File

@@ -529,46 +529,25 @@ private def fieldFromMsg (info : StructFieldInfo) : MessageData :=
m!"field '{info.name}'"
/--
Instantiates default value for field `fieldName` set at structure `structName`.
The arguments for the `_default` auxiliary function are provided by `fieldMap`.
Instantiates default value for field `fieldName` set at structure `structName`, using the field fvars in the `StructFieldInfo`s.
After default values are resolved, then the one that is added to the environment
as an `_inherited_default` auxiliary function is normalized; we don't do those normalizations here.
as an `_inherited_default` auxiliary function is normalized;
we don't do those normalizations here, since that could be wasted effort if this default isn't chosen.
-/
private partial def getFieldDefaultValue? (structName : Name) (paramMap : NameMap Expr) (fieldName : Name) : StructElabM (Option Expr) := do
match getDefaultFnForField? (← getEnv) structName fieldName with
| none => return none
| some defaultFn =>
let cinfo ← getConstInfo defaultFn
let us ← mkFreshLevelMVarsFor cinfo
go? (← instantiateValueLevelParams cinfo us)
where
failed : MetaM (Option Expr) := do
logWarning m!"ignoring default value for field '{fieldName}' defined at '{.ofConstName structName}'"
return none
private partial def getFieldDefaultValue? (structName : Name) (params : Array Expr) (fieldName : Name) : StructElabM (Option Expr) := do
let some defFn := getDefaultFnForField? (← getEnv) structName fieldName
| return none
let fieldVal? (n : Name) : StructElabM (Option Expr) := do
let some info ← findFieldInfo? n | return none
return info.fvar
let some (_, val) ← instantiateStructDefaultValueFn? defFn none params fieldVal?
| logWarning m!"default value for field '{fieldName}' of structure '{.ofConstName structName}' could not be instantiated, ignoring"
return none
return val
go? (e : Expr) : StructElabM (Option Expr) := do
match e with
| Expr.lam n d b c =>
if c.isExplicit then
let some info ← findFieldInfo? n | failed
let valType ← inferType info.fvar
if (← isDefEq valType d) then
go? (b.instantiate1 info.fvar)
else
failed
else
let some param := paramMap.find? n | return none
if ← isDefEq (← inferType param) d then
go? (b.instantiate1 param)
else
failed
| e =>
let r := if e.isAppOfArity ``id 2 then e.appArg! else e
return some (← reduceFieldProjs r)
private def getFieldDefault? (structName : Name) (paramMap : NameMap Expr) (fieldName : Name) :
private def getFieldDefault? (structName : Name) (params : Array Expr) (fieldName : Name) :
StructElabM (Option StructFieldDefault) := do
if let some val ← getFieldDefaultValue? structName (paramMap : NameMap Expr) fieldName then
if let some val ← getFieldDefaultValue? structName params fieldName then
-- Important: we use `getFieldDefaultValue?` because we want default value definitions, not *inherited* ones, to properly handle diamonds
trace[Elab.structure] "found default value for '{fieldName}' from '{.ofConstName structName}'{indentExpr val}"
return StructFieldDefault.optParam val
@@ -593,7 +572,7 @@ Adds `fieldName` of type `fieldType` from structure `structName`.
See `withStructFields` for meanings of other arguments.
-/
private partial def withStructField (view : StructView) (sourceStructNames : List Name) (inSubobject? : Option Expr)
(structName : Name) (paramMap : NameMap Expr) (fieldName : Name) (fieldType : Expr)
(structName : Name) (params : Array Expr) (fieldName : Name) (fieldType : Expr)
(k : Expr → StructElabM α) : StructElabM α := do
trace[Elab.structure] "withStructField '{.ofConstName structName}', field '{fieldName}'"
let fieldType ← instantiateMVars fieldType
@@ -612,7 +591,7 @@ private partial def withStructField (view : StructView) (sourceStructNames : Lis
let existingFieldType ← inferType existingField.fvar
unless (← isDefEq fieldType existingFieldType) do
throwError "field type mismatch, field '{fieldName}' from parent '{.ofConstName structName}' {← mkHasTypeButIsExpectedMsg fieldType existingFieldType}"
if let some d ← getFieldDefault? structName paramMap fieldName then
if let some d ← getFieldDefault? structName params fieldName then
addFieldInheritedDefault fieldName structName d
k existingField.fvar
else
@@ -639,7 +618,7 @@ private partial def withStructField (view : StructView) (sourceStructNames : Lis
binfo := fieldInfo.binderInfo
projFn? := fieldInfo.projFn
}
if let some d ← getFieldDefault? structName paramMap fieldName then
if let some d ← getFieldDefault? structName params fieldName then
addFieldInheritedDefault fieldName structName d
k fieldFVar
@@ -652,7 +631,7 @@ Does not add a parent field for the structure itself; that is done by `withStruc
- the continuation `k` is run with a constructor expression for this structure
-/
private partial def withStructFields (view : StructView) (sourceStructNames : List Name)
(structType : Expr) (inSubobject? : Option Expr) (paramMap : NameMap Expr)
(structType : Expr) (inSubobject? : Option Expr)
(k : Expr → StructElabM α) : StructElabM α := do
let structName ← getStructureName structType
let .const _ us := structType.getAppFn | unreachable!
@@ -688,7 +667,7 @@ private partial def withStructFields (view : StructView) (sourceStructNames : Li
let fieldName := fields[i]
let fieldMVar := fieldMVars[i]!
let fieldType ← inferType fieldMVar
withStructField view sourceStructNames inSubobject? structName paramMap fieldName fieldType fun fieldFVar => do
withStructField view sourceStructNames inSubobject? structName params fieldName fieldType fun fieldFVar => do
fieldMVar.mvarId!.assign fieldFVar
goFields (i + 1)
else
@@ -739,14 +718,7 @@ private partial def withStruct (view : StructView) (sourceStructNames : List Nam
let allFields := getStructureFieldsFlattened env structName (includeSubobjectFields := false)
let withStructFields' (kind : StructFieldKind) (inSubobject? : Option Expr) (k : StructFieldInfo → StructElabM α) : StructElabM α := do
-- Create a parameter map for default value processing
let info ← getConstInfoInduct structName
let paramMap : NameMap Expr ← forallTelescope info.type fun xs _ => do
let mut paramMap := {}
for param in params, x in xs do
paramMap := paramMap.insert (← x.fvarId!.getUserName) param
return paramMap
withStructFields view sourceStructNames structType inSubobject? paramMap fun structVal => do
withStructFields view sourceStructNames structType inSubobject? fun structVal => do
if let some _ ← findFieldInfo? structFieldName then
throwErrorAt projRef "field '{structFieldName}' has already been declared\n\n\
The 'toParent : P' syntax can be used to adjust the name for the parent projection"
@@ -755,7 +727,7 @@ private partial def withStruct (view : StructView) (sourceStructNames : List Nam
-- which for inherited fields might not have been seen yet.
-- Note: duplication is ok for now. We use a stable sort later.
for fieldName in allFields do
if let some d ← getFieldDefault? structName paramMap fieldName then
if let some d ← getFieldDefault? structName params fieldName then
addFieldInheritedDefault fieldName structName d
withLetDecl rawStructFieldName structType structVal fun structFVar => do
let info : StructFieldInfo := {
@@ -1227,8 +1199,8 @@ private def registerStructure (structName : Name) (infos : Array StructFieldInfo
fieldName := info.name
projFn := info.declName
binderInfo := info.binfo
autoParam? := if let some (.autoParam tactic) := info.resolvedDefault? then some tactic else none
subobject? := if let .subobject parentName := info.kind then parentName else none
autoParam? := none -- deprecated field
}
else
return none

View File

@@ -20,8 +20,6 @@ def elabAsAuxLemma : Lean.Elab.Tactic.Tactic
unless mvars.isEmpty do
throwError "Cannot abstract term into auxiliary lemma because there are open goals."
let e ← instantiateMVars (mkMVar mvarId)
let env ← getEnv
-- TODO: this likely should share name creation code with `mkAuxLemma`
let e ← mkAuxTheorem (← mkFreshUserName <| env.asyncPrefix?.getD env.mainModule ++ `_auxLemma) (← mvarId.getType) e
let e ← mkAuxTheorem (prefix? := (← Term.getDeclName?)) (← mvarId.getType) e
mvarId.assign e
| _ => throwError "Invalid as_aux_lemma syntax"

View File

@@ -141,11 +141,11 @@ def grind
let result ← Grind.main mvar'.mvarId! params mainDeclName fallback
if result.hasFailures then
throwError "`grind` failed\n{← result.toMessageData}"
let auxName ← Term.mkAuxName `grind
-- `grind` proofs are often big
let e ← if (← isProp type) then
mkAuxTheorem auxName type (← instantiateMVarsProfiling mvar') (zetaDelta := true)
mkAuxTheorem (prefix? := mainDeclName) type (← instantiateMVarsProfiling mvar') (zetaDelta := true)
else
let auxName ← Term.mkAuxName `grind
mkAuxDefinition auxName type (← instantiateMVarsProfiling mvar') (zetaDelta := true)
mvarId.assign e
return result.trace

View File

@@ -672,7 +672,7 @@ open Lean Elab Tactic Parser.Tactic
/-- The `omega` tactic, for resolving integer and natural linear arithmetic problems. -/
def omegaTactic (cfg : OmegaConfig) : TacticM Unit := do
let auxName ← Term.mkAuxName `omega
let declName? ← Term.getDeclName?
liftMetaFinishingTactic fun g => do
let some g ← g.falseOrByContra | return ()
g.withContext do
@@ -682,7 +682,7 @@ def omegaTactic (cfg : OmegaConfig) : TacticM Unit := do
trace[omega] "analyzing {hyps.length} hypotheses:\n{← hyps.mapM inferType}"
omega hyps g'.mvarId! cfg
-- Omega proofs are typically rather large, so hide them in a separate definition
let e ← mkAuxTheorem auxName type (← instantiateMVarsProfiling g') (zetaDelta := true)
let e ← mkAuxTheorem (prefix? := declName?) type (← instantiateMVarsProfiling g') (zetaDelta := true)
g.assign e

View File

@@ -1971,7 +1971,7 @@ where
isValidAutoBoundImplicitName n (relaxedAutoImplicit.get (← getOptions)) then
throwAutoBoundImplicitLocal n
else
throwError "unknown identifier '{Lean.mkConst n}'"
throwUnknownIdentifier m!"unknown identifier '{Lean.mkConst n}'"
mkConsts candidates explicitLevels
/--

src/Lean/EnvExtension.lean Normal file
View File

@@ -0,0 +1,161 @@
/-
Copyright (c) 2025 Lean FRO. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Leonardo de Moura, Sebastian Ullrich
-/
prelude
import Lean.Environment
/-! Further environment extension API; the primitives live in `Lean.Environment`. -/
namespace Lean
/-- Simple `PersistentEnvExtension` that implements `exportEntriesFn` using a list of entries. -/
def SimplePersistentEnvExtension (α σ : Type) := PersistentEnvExtension α α (List α × σ)
@[specialize] def mkStateFromImportedEntries {α σ : Type} (addEntryFn : σ → α → σ) (initState : σ) (as : Array (Array α)) : σ :=
as.foldl (fun r es => es.foldl (fun r e => addEntryFn r e) r) initState
structure SimplePersistentEnvExtensionDescr (α σ : Type) where
name : Name := by exact decl_name%
addEntryFn : σ → α → σ
addImportedFn : Array (Array α) → σ
toArrayFn : List α → Array α := fun es => es.toArray
asyncMode : EnvExtension.AsyncMode := .mainOnly
replay? : Option ((newEntries : List α) → (newState : σ) → σ → List α × σ) := none
/--
Returns a function suitable for `SimplePersistentEnvExtensionDescr.replay?` that replays all new
entries onto the state using `addEntryFn`. `p` should filter out entries that have already been
added to the state by a prior replay of the same realizable constant.
-/
def SimplePersistentEnvExtension.replayOfFilter (p : σ → α → Bool)
(addEntryFn : σ → α → σ) : List α → σ → σ → List α × σ :=
fun newEntries _ s =>
let newEntries := newEntries.filter (p s)
(newEntries, newEntries.foldl (init := s) addEntryFn)
def registerSimplePersistentEnvExtension {α σ : Type} [Inhabited σ] (descr : SimplePersistentEnvExtensionDescr α σ) : IO (SimplePersistentEnvExtension α σ) :=
registerPersistentEnvExtension {
name := descr.name,
mkInitial := pure ([], descr.addImportedFn #[]),
addImportedFn := fun as => pure ([], descr.addImportedFn as),
addEntryFn := fun s e => match s with
| (entries, s) => (e::entries, descr.addEntryFn s e),
exportEntriesFn := fun s => descr.toArrayFn s.1.reverse,
statsFn := fun s => format "number of local entries: " ++ format s.1.length
asyncMode := descr.asyncMode
replay? := descr.replay?.map fun replay oldState newState _ (entries, s) =>
let newEntries := newState.1.take (newState.1.length - oldState.1.length)
let (newEntries, s) := replay newEntries newState.2 s
(newEntries ++ entries, s)
}
namespace SimplePersistentEnvExtension
instance {α σ : Type} [Inhabited σ] : Inhabited (SimplePersistentEnvExtension α σ) :=
inferInstanceAs (Inhabited (PersistentEnvExtension α α (List α × σ)))
/-- Get the list of values used to update the state of the given
`SimplePersistentEnvExtension` in the current file. -/
def getEntries {α σ : Type} [Inhabited σ] (ext : SimplePersistentEnvExtension α σ) (env : Environment) : List α :=
(PersistentEnvExtension.getState ext env).1
/-- Get the current state of the given `SimplePersistentEnvExtension`. -/
def getState {α σ : Type} [Inhabited σ] (ext : SimplePersistentEnvExtension α σ) (env : Environment)
(asyncMode := ext.toEnvExtension.asyncMode) : σ :=
(PersistentEnvExtension.getState (asyncMode := asyncMode) ext env).2
/-- Set the current state of the given `SimplePersistentEnvExtension`. This change is *not* persisted across files. -/
def setState {α σ : Type} (ext : SimplePersistentEnvExtension α σ) (env : Environment) (s : σ) : Environment :=
PersistentEnvExtension.modifyState ext env (fun ⟨entries, _⟩ => (entries, s))
/-- Modify the state of the given extension in the given environment by applying the given function. This change is *not* persisted across files. -/
def modifyState {α σ : Type} (ext : SimplePersistentEnvExtension α σ) (env : Environment) (f : σ → σ) : Environment :=
PersistentEnvExtension.modifyState ext env (fun ⟨entries, s⟩ => (entries, f s))
@[inherit_doc PersistentEnvExtension.findStateAsync]
def findStateAsync {α σ : Type} [Inhabited σ] (ext : SimplePersistentEnvExtension α σ)
(env : Environment) (declPrefix : Name) : σ :=
PersistentEnvExtension.findStateAsync ext env declPrefix |>.2
end SimplePersistentEnvExtension
/-- Environment extension for tagging declarations.
Declarations must only be tagged in the module where they were declared. -/
def TagDeclarationExtension := SimplePersistentEnvExtension Name NameSet
def mkTagDeclarationExtension (name : Name := by exact decl_name%)
(asyncMode : EnvExtension.AsyncMode := .mainOnly) : IO TagDeclarationExtension :=
registerSimplePersistentEnvExtension {
name := name,
addImportedFn := fun _ => {},
addEntryFn := fun s n => s.insert n,
toArrayFn := fun es => es.toArray.qsort Name.quickLt
asyncMode
}
namespace TagDeclarationExtension
instance : Inhabited TagDeclarationExtension :=
inferInstanceAs (Inhabited (SimplePersistentEnvExtension Name NameSet))
def tag (ext : TagDeclarationExtension) (env : Environment) (declName : Name) : Environment :=
have : Inhabited Environment := env
assert! env.getModuleIdxFor? declName |>.isNone -- See comment at `TagDeclarationExtension`
assert! env.asyncMayContain declName
ext.addEntry env declName
def isTagged (ext : TagDeclarationExtension) (env : Environment) (declName : Name) : Bool :=
match env.getModuleIdxFor? declName with
| some modIdx => (ext.getModuleEntries env modIdx).binSearchContains declName Name.quickLt
| none => if ext.toEnvExtension.asyncMode matches .async then
(ext.findStateAsync env declName).contains declName
else
(ext.getState env).contains declName
end TagDeclarationExtension
/-- Environment extension for mapping declarations to values.
Declarations must only be inserted into the mapping in the module where they were declared. -/
structure MapDeclarationExtension (α : Type) extends PersistentEnvExtension (Name × α) (Name × α) (NameMap α)
deriving Inhabited
def mkMapDeclarationExtension (name : Name := by exact decl_name%) : IO (MapDeclarationExtension α) :=
.mk <$> registerPersistentEnvExtension {
name := name,
mkInitial := pure {}
addImportedFn := fun _ => pure {}
addEntryFn := fun s (n, v) => s.insert n v
exportEntriesFn := fun s => s.toArray
asyncMode := .async
replay? := some fun _ newState newConsts s =>
newConsts.foldl (init := s) fun s c =>
if let some a := newState.find? c then
s.insert c a
else s
}
namespace MapDeclarationExtension
def insert (ext : MapDeclarationExtension α) (env : Environment) (declName : Name) (val : α) : Environment :=
have : Inhabited Environment := env
assert! env.getModuleIdxFor? declName |>.isNone -- See comment at `MapDeclarationExtension`
assert! env.asyncMayContain declName
ext.addEntry env (declName, val)
def find? [Inhabited α] (ext : MapDeclarationExtension α) (env : Environment) (declName : Name) : Option α :=
match env.getModuleIdxFor? declName with
| some modIdx =>
match (ext.getModuleEntries env modIdx).binSearch (declName, default) (fun a b => Name.quickLt a.1 b.1) with
| some e => some e.2
| none => none
| none => (ext.findStateAsync env declName).find? declName
def contains [Inhabited α] (ext : MapDeclarationExtension α) (env : Environment) (declName : Name) : Bool :=
match env.getModuleIdxFor? declName with
| some modIdx => (ext.getModuleEntries env modIdx).binSearchContains (declName, default) (fun a b => Name.quickLt a.1 b.1)
| none => (ext.findStateAsync env declName).contains declName
end MapDeclarationExtension
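For orientation, here is a hedged sketch of how the tagging API in this new file might be used; the extension and declaration names (`myTagExt`, `Foo.bar`) are illustrative assumptions, not part of this change:

```lean
-- Hypothetical usage sketch; `myTagExt` and `Foo.bar` are made-up names.
initialize myTagExt : TagDeclarationExtension ← mkTagDeclarationExtension

-- Tagging is only valid in the module that declares `Foo.bar`
-- (see the comment at `TagDeclarationExtension`).
def markBar (env : Environment) : Environment :=
  myTagExt.tag env `Foo.bar

def barIsTagged (env : Environment) : Bool :=
  myTagExt.isTagged env `Foo.bar
```

`MapDeclarationExtension` follows the same pattern, with `insert`/`find?` in place of `tag`/`isTagged`.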

View File

@@ -1044,11 +1044,11 @@ def instantiateValueLevelParams! (c : ConstantInfo) (ls : List Level) : Expr :=
end ConstantInfo
/--
Async access mode for environment extensions used in `EnvironmentExtension.get/set/modifyState`.
Depending on their specific uses, extensions may opt out of the strict `sync` access mode in order
to avoid blocking parallel elaboration and/or to optimize accesses. The access mode is set at
environment extension registration time but can be overridden at `EnvironmentExtension.getState` in
order to weaken it for specific accesses.
Async access mode for environment extensions used in `EnvExtension.get/set/modifyState`.
When modified in concurrent contexts, extensions may need to switch to a different mode than the
default `mainOnly`, which will panic in such cases. The access mode is set at environment extension
registration time but can be overridden when calling the mentioned functions in order to weaken it
for specific accesses.
In all modes, the state stored into the `.olean` file for persistent environment extensions is the
result of `getState` called on the main environment branch at the end of the file, i.e. it
@@ -1056,15 +1056,15 @@ encompasses all modifications for all modes but `local`.
-/
inductive EnvExtension.AsyncMode where
/--
Default access mode, writing and reading the extension state to/from the full `checked`
Safest access mode, writes and reads the extension state to/from the full `checked`
environment. This mode ensures the observed state is identical independently of whether or how
parallel elaboration is used but `getState` will block on all prior environment branches by
waiting for `checked`. `setState` and `modifyState` do not block.
While a safe default, any extension that reasonably could be used in parallel elaboration contexts
should opt for a weaker mode to avoid blocking unless there is no way to access the correct state
without waiting for all prior environment branches, in which case its data management should be
restructured if at all possible.
While a safe fallback for when `mainOnly` is not sufficient, any extension that reasonably could
be used in parallel elaboration contexts should opt for a weaker mode to avoid blocking unless
there is no way to access the correct state without waiting for all prior environment branches, in
which case its data management should be restructured if at all possible.
-/
| sync
/--
@@ -1077,9 +1077,10 @@ inductive EnvExtension.AsyncMode where
-/
| local
/--
Like `local` but panics when trying to modify the state on anything but the main environment
branch. For extensions that fulfill this requirement, all modes functionally coincide but this
is the safest and most efficient choice in that case, preventing accidental misuse.
Default access mode. Like `local` but panics when trying to modify the state on anything but the
main environment branch. For extensions that fulfill this requirement, all modes functionally
coincide with `local` but this is the safest and most efficient choice in that case, preventing
accidental misuse.
This mode is suitable for extensions that are modified only at the command elaboration level
before any environment forks in the command, and in particular for extensions that are modified
@@ -1464,156 +1465,6 @@ unsafe def registerPersistentEnvExtensionUnsafe {α β σ : Type} [Inhabited σ]
@[implemented_by registerPersistentEnvExtensionUnsafe]
opaque registerPersistentEnvExtension {α β σ : Type} [Inhabited σ] (descr : PersistentEnvExtensionDescr α β σ) : IO (PersistentEnvExtension α β σ)
/-- Simple `PersistentEnvExtension` that implements `exportEntriesFn` using a list of entries. -/
def SimplePersistentEnvExtension (α σ : Type) := PersistentEnvExtension α α (List α × σ)
@[specialize] def mkStateFromImportedEntries {α σ : Type} (addEntryFn : σ → α → σ) (initState : σ) (as : Array (Array α)) : σ :=
as.foldl (fun r es => es.foldl (fun r e => addEntryFn r e) r) initState
structure SimplePersistentEnvExtensionDescr (α σ : Type) where
name : Name := by exact decl_name%
addEntryFn : σ → α → σ
addImportedFn : Array (Array α) → σ
toArrayFn : List α → Array α := fun es => es.toArray
asyncMode : EnvExtension.AsyncMode := .mainOnly
replay? : Option ((newEntries : List α) → (newState : σ) → σ → List α × σ) := none
/--
Returns a function suitable for `SimplePersistentEnvExtensionDescr.replay?` that replays all new
entries onto the state using `addEntryFn`. `p` should filter out entries that have already been
added to the state by a prior replay of the same realizable constant.
-/
def SimplePersistentEnvExtension.replayOfFilter (p : σ → α → Bool)
(addEntryFn : σ → α → σ) : List α → σ → σ → List α × σ :=
fun newEntries _ s =>
let newEntries := newEntries.filter (p s)
(newEntries, newEntries.foldl (init := s) addEntryFn)
def registerSimplePersistentEnvExtension {α σ : Type} [Inhabited σ] (descr : SimplePersistentEnvExtensionDescr α σ) : IO (SimplePersistentEnvExtension α σ) :=
registerPersistentEnvExtension {
name := descr.name,
mkInitial := pure ([], descr.addImportedFn #[]),
addImportedFn := fun as => pure ([], descr.addImportedFn as),
addEntryFn := fun s e => match s with
| (entries, s) => (e::entries, descr.addEntryFn s e),
exportEntriesFn := fun s => descr.toArrayFn s.1.reverse,
statsFn := fun s => format "number of local entries: " ++ format s.1.length
asyncMode := descr.asyncMode
replay? := descr.replay?.map fun replay oldState newState _ (entries, s) =>
let newEntries := newState.1.take (newState.1.length - oldState.1.length)
let (newEntries, s) := replay newEntries newState.2 s
(newEntries ++ entries, s)
}
namespace SimplePersistentEnvExtension
instance {α σ : Type} [Inhabited σ] : Inhabited (SimplePersistentEnvExtension α σ) :=
inferInstanceAs (Inhabited (PersistentEnvExtension α α (List α × σ)))
/-- Get the list of values used to update the state of the given
`SimplePersistentEnvExtension` in the current file. -/
def getEntries {α σ : Type} [Inhabited σ] (ext : SimplePersistentEnvExtension α σ) (env : Environment) : List α :=
(PersistentEnvExtension.getState ext env).1
/-- Get the current state of the given `SimplePersistentEnvExtension`. -/
def getState {α σ : Type} [Inhabited σ] (ext : SimplePersistentEnvExtension α σ) (env : Environment)
(asyncMode := ext.toEnvExtension.asyncMode) : σ :=
(PersistentEnvExtension.getState (asyncMode := asyncMode) ext env).2
/-- Set the current state of the given `SimplePersistentEnvExtension`. This change is *not* persisted across files. -/
def setState {α σ : Type} (ext : SimplePersistentEnvExtension α σ) (env : Environment) (s : σ) : Environment :=
PersistentEnvExtension.modifyState ext env (fun ⟨entries, _⟩ => (entries, s))
/-- Modify the state of the given extension in the given environment by applying the given function. This change is *not* persisted across files. -/
def modifyState {α σ : Type} (ext : SimplePersistentEnvExtension α σ) (env : Environment) (f : σ → σ) : Environment :=
PersistentEnvExtension.modifyState ext env (fun ⟨entries, s⟩ => (entries, f s))
@[inherit_doc PersistentEnvExtension.findStateAsync]
def findStateAsync {α σ : Type} [Inhabited σ] (ext : SimplePersistentEnvExtension α σ)
(env : Environment) (declPrefix : Name) : σ :=
PersistentEnvExtension.findStateAsync ext env declPrefix |>.2
end SimplePersistentEnvExtension
/-- Environment extension for tagging declarations.
Declarations must only be tagged in the module where they were declared. -/
def TagDeclarationExtension := SimplePersistentEnvExtension Name NameSet
def mkTagDeclarationExtension (name : Name := by exact decl_name%)
(asyncMode : EnvExtension.AsyncMode := .mainOnly) : IO TagDeclarationExtension :=
registerSimplePersistentEnvExtension {
name := name,
addImportedFn := fun _ => {},
addEntryFn := fun s n => s.insert n,
toArrayFn := fun es => es.toArray.qsort Name.quickLt
asyncMode
}
namespace TagDeclarationExtension
instance : Inhabited TagDeclarationExtension :=
inferInstanceAs (Inhabited (SimplePersistentEnvExtension Name NameSet))
def tag (ext : TagDeclarationExtension) (env : Environment) (declName : Name) : Environment :=
have : Inhabited Environment := env
assert! env.getModuleIdxFor? declName |>.isNone -- See comment at `TagDeclarationExtension`
assert! env.asyncMayContain declName
ext.addEntry env declName
def isTagged (ext : TagDeclarationExtension) (env : Environment) (declName : Name) : Bool :=
match env.getModuleIdxFor? declName with
| some modIdx => (ext.getModuleEntries env modIdx).binSearchContains declName Name.quickLt
| none => if ext.toEnvExtension.asyncMode matches .async then
(ext.findStateAsync env declName).contains declName
else
(ext.getState env).contains declName
end TagDeclarationExtension
/-- Environment extension for mapping declarations to values.
Declarations must only be inserted into the mapping in the module where they were declared. -/
structure MapDeclarationExtension (α : Type) extends PersistentEnvExtension (Name × α) (Name × α) (NameMap α)
deriving Inhabited
def mkMapDeclarationExtension (name : Name := by exact decl_name%) : IO (MapDeclarationExtension α) :=
.mk <$> registerPersistentEnvExtension {
name := name,
mkInitial := pure {}
addImportedFn := fun _ => pure {}
addEntryFn := fun s (n, v) => s.insert n v
exportEntriesFn := fun s => s.toArray
asyncMode := .async
replay? := some fun _ newState newConsts s =>
newConsts.foldl (init := s) fun s c =>
if let some a := newState.find? c then
s.insert c a
else s
}
namespace MapDeclarationExtension
def insert (ext : MapDeclarationExtension α) (env : Environment) (declName : Name) (val : α) : Environment :=
have : Inhabited Environment := env
assert! env.getModuleIdxFor? declName |>.isNone -- See comment at `MapDeclarationExtension`
assert! env.asyncMayContain declName
ext.addEntry env (declName, val)
def find? [Inhabited α] (ext : MapDeclarationExtension α) (env : Environment) (declName : Name) : Option α :=
match env.getModuleIdxFor? declName with
| some modIdx =>
match (ext.getModuleEntries env modIdx).binSearch (declName, default) (fun a b => Name.quickLt a.1 b.1) with
| some e => some e.2
| none => none
| none => (ext.findStateAsync env declName).find? declName
def contains [Inhabited α] (ext : MapDeclarationExtension α) (env : Environment) (declName : Name) : Bool :=
match env.getModuleIdxFor? declName with
| some modIdx => (ext.getModuleEntries env modIdx).binSearchContains (declName, default) (fun a b => Name.quickLt a.1 b.1)
| none => (ext.findStateAsync env declName).contains declName
end MapDeclarationExtension
@[extern "lean_save_module_data"]
opaque saveModuleData (fname : @& System.FilePath) (mod : @& Name) (data : @& ModuleData) : IO Unit
@[extern "lean_read_module_data"]
@@ -1800,16 +1651,12 @@ private def equivInfo (cinfo₁ cinfo₂ : ConstantInfo) : Bool := Id.run do
&& tval₁.all == tval₂.all
/--
Construct environment from `importModulesCore` results.
Constructs environment from `importModulesCore` results.
If `leakEnv` is true, we mark the environment as persistent, which means it
will not be freed. We set this when the object would survive until the end of
the process anyway. In exchange, RC updates are avoided, which is especially
important when they would be atomic because the environment is shared across
threads (potentially, storing it in an `IO.Ref` is sufficient for marking it
as such). -/
See also `importModules` for parameter documentation.
-/
def finalizeImport (s : ImportState) (imports : Array Import) (opts : Options) (trustLevel : UInt32 := 0)
(leakEnv := false) : IO Environment := do
(leakEnv loadExts : Bool) : IO Environment := do
let numConsts := s.moduleData.foldl (init := 0) fun numConsts mod =>
numConsts + mod.constants.size + mod.extraConstNames.size
let mut const2ModIdx : Std.HashMap Name ModuleIdx := Std.HashMap.emptyWithCapacity (capacity := numConsts)
@@ -1859,14 +1706,15 @@ def finalizeImport (s : ImportState) (imports : Array Import) (opts : Options) (
Safety: There are no concurrent accesses to `env` at this point. -/
env ← unsafe Runtime.markPersistent env
env ← finalizePersistentExtensions env s.moduleData opts
if leakEnv then
/- Ensure the final environment including environment extension states is
marked persistent as documented.
if loadExts then
env ← finalizePersistentExtensions env s.moduleData opts
if leakEnv then
/- Ensure the final environment including environment extension states is
marked persistent as documented.
Safety: There are no concurrent accesses to `env` at this point, assuming
extensions' `addImportFn`s did not spawn any unbound tasks. -/
env ← unsafe Runtime.markPersistent env
Safety: There are no concurrent accesses to `env` at this point, assuming
extensions' `addImportFn`s did not spawn any unbound tasks. -/
env ← unsafe Runtime.markPersistent env
return { env with realizedImportedConsts? := some {
-- safety: `RealizationContext` is private
env := unsafe unsafeCast env
@@ -1874,9 +1722,22 @@ def finalizeImport (s : ImportState) (imports : Array Import) (opts : Options) (
constsRef := (← IO.mkRef {})
} }
@[export lean_import_modules]
/--
Creates environment object from given imports.
If `leakEnv` is true, we mark the environment as persistent, which means it will not be freed. We
set this when the object would survive until the end of the process anyway. In exchange, RC updates
are avoided, which is especially important when they would be atomic because the environment is
shared across threads (potentially, storing it in an `IO.Ref` is sufficient for marking it as such).
If `loadExts` is true, we initialize the environment extensions using the imported data. Doing so
may use the interpreter and thus is only safe to do after calling `enableInitializersExecution`; see
also caveats there. If not set, every extension will have its initial value as its state. While the
environment's constant map can be accessed without `loadExts`, many functions that take
`Environment` or are in a monad carrying it such as `CoreM` may not function properly without it.
-/
def importModules (imports : Array Import) (opts : Options) (trustLevel : UInt32 := 0)
(plugins : Array System.FilePath := #[]) (leakEnv := false)
(plugins : Array System.FilePath := #[]) (leakEnv := false) (loadExts := false)
: IO Environment := profileitIO "import" opts do
for imp in imports do
if imp.module matches .anonymous then
@@ -1884,13 +1745,17 @@ def importModules (imports : Array Import) (opts : Options) (trustLevel : UInt32
withImporting do
plugins.forM Lean.loadPlugin
let (_, s) ← importModulesCore imports |>.run
finalizeImport (leakEnv := leakEnv) s imports opts trustLevel
finalizeImport (leakEnv := leakEnv) (loadExts := loadExts) s imports opts trustLevel
/--
Create environment object from imports and free compacted regions after calling `act`. No live references to the
environment object or imported objects may exist after `act` finishes. -/
unsafe def withImportModules {α : Type} (imports : Array Import) (opts : Options) (trustLevel : UInt32 := 0) (act : Environment → IO α) : IO α := do
let env ← importModules imports opts trustLevel
Creates environment object from imports and frees compacted regions after calling `act`. No live
references to the environment object or imported objects may exist after `act` finishes. As this
cannot be ruled out after loading environment extensions, this function always leaves
`importModules`'s `loadExts` unset.
-/
unsafe def withImportModules {α : Type} (imports : Array Import) (opts : Options)
(act : Environment → IO α) (trustLevel : UInt32 := 0) : IO α := do
let env ← importModules (loadExts := false) imports opts trustLevel
try act env finally env.freeRegions
@[inherit_doc Kernel.Environment.enableDiag]
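As a hedged illustration of the new `loadExts` flag, a standalone driver might look roughly like the following; the imported module and the printed statistic are assumptions for illustration, not part of this diff:

```lean
-- Sketch only: assumes a toolchain where `Init` is importable.
unsafe def main : IO Unit := do
  Lean.initSearchPath (← Lean.findSysroot)
  -- `loadExts := true` may run the interpreter, so initializer
  -- execution must be enabled first (see caveats there).
  Lean.enableInitializersExecution
  let env ← Lean.importModules #[{ module := `Init }] {}
    (leakEnv := true) (loadExts := true)
  IO.println s!"imported {env.header.moduleNames.size} modules"
```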

View File

@@ -69,9 +69,31 @@ protected def throwError [Monad m] [MonadError m] (msg : MessageData) : m α :=
let (ref, msg) ← AddErrorMessageContext.add ref msg
throw <| Exception.error ref msg
/--
Tag used for `unknown identifier` messages.
This tag is used by the 'import unknown identifier' code action to detect messages that should
prompt the code action.
-/
def unknownIdentifierMessageTag : Name := `unknownIdentifier
/--
Creates a `MessageData` that is tagged with `unknownIdentifierMessageTag`.
This tag is used by the 'import unknown identifier' code action to detect messages that should
prompt the code action.
-/
def mkUnknownIdentifierMessage (msg : MessageData) : MessageData :=
MessageData.tagged unknownIdentifierMessageTag msg
/--
Throw an unknown identifier error message that is tagged with `unknownIdentifierMessageTag`.
See also `mkUnknownIdentifierMessage`.
-/
def throwUnknownIdentifier [Monad m] [MonadError m] (msg : MessageData) : m α :=
Lean.throwError <| mkUnknownIdentifierMessage msg
/-- Throw an unknown constant error message. -/
def throwUnknownConstant [Monad m] [MonadError m] (constName : Name) : m α :=
Lean.throwError m!"unknown constant '{.ofConstName constName}'"
throwUnknownIdentifier m!"unknown constant '{.ofConstName constName}'"
/-- Throw an error exception using the given message data and reference syntax. -/
protected def throwErrorAt [Monad m] [MonadError m] (ref : Syntax) (msg : MessageData) : m α := do

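To make the tagging mechanism concrete, here is a hedged sketch of how a consumer (such as the code action mentioned above) might detect the tag; the helper name is hypothetical and only inspects the top-level tag, whereas a real consumer may need to traverse nested message data:

```lean
-- Hypothetical helper; checks only the outermost tag of a message.
def isUnknownIdentifierMessage (msg : MessageData) : Bool :=
  match msg with
  | .tagged tag _ => tag == unknownIdentifierMessageTag
  | _ => false
```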
View File

@@ -31,6 +31,7 @@ def isNonTrivialProof (e : Expr) : MetaM Bool := do
pure $ !f.isAtomic || args.any fun arg => !arg.isAtomic
structure Context where
cache : Bool
baseName : Name
structure State where
@@ -74,21 +75,19 @@ where
let type ← zetaReduce type
/- Ensure proofs nested in type are also abstracted -/
let type ← visit type
let lemmaName ← mkAuxName (ctx.baseName ++ `proof) (← get).nextIdx
modify fun s => { s with nextIdx := s.nextIdx + 1 }
/- We turn on zetaDelta-expansion to make sure we don't need to perform an expensive `check` step to
identify which let-decls can be abstracted. If we design a more efficient test, we can avoid the eager zetaDelta expansion step.
In a benchmark created by @selsam, the extra `check` step was a bottleneck. -/
mkAuxTheorem lemmaName type e (zetaDelta := true)
mkAuxTheorem (prefix? := ctx.baseName) (cache := ctx.cache) type e (zetaDelta := true)
end AbstractNestedProofs
/-- Replace proofs nested in `e` with new lemmas. The new lemmas have names of the form `mainDeclName.proof_<idx>` -/
def abstractNestedProofs (mainDeclName : Name) (e : Expr) : MetaM Expr := do
def abstractNestedProofs (mainDeclName : Name) (e : Expr) (cache := true) : MetaM Expr := do
if (← isProof e) then
-- `e` is a proof itself. So, we don't abstract nested proofs
return e
else
AbstractNestedProofs.visit e |>.run { baseName := mainDeclName } |>.run |>.run' { nextIdx := 1 }
AbstractNestedProofs.visit e |>.run { cache, baseName := mainDeclName } |>.run |>.run' { nextIdx := 1 }
end Lean.Meta
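For intuition, a sketch of what `abstractNestedProofs` accomplishes; the declaration and lemma names are illustrative, not the exact compiler output:

```lean
-- A definition whose value embeds a proof term:
def one : { n : Nat // n > 0 } := ⟨1, Nat.one_pos⟩
-- Abstraction conceptually extracts the nested proof into an auxiliary lemma
--   theorem one.proof_1 : 1 > 0 := Nat.one_pos
-- and replaces the embedded proof with a reference to `one.proof_1`, so large
-- proof terms are not duplicated inside elaborated values.
```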


@@ -10,6 +10,7 @@ import Lean.AddDecl
import Lean.Util.FoldConsts
import Lean.Meta.Basic
import Lean.Meta.Check
import Lean.Meta.Tactic.AuxLemma
/-!
@@ -391,36 +392,9 @@ def mkAuxDefinitionFor (name : Name) (value : Expr) (zetaDelta : Bool := false)
/--
Create an auxiliary theorem with the given name, type and value. It is similar to `mkAuxDefinition`.
-/
def mkAuxTheorem (name : Name) (type : Expr) (value : Expr) (zetaDelta : Bool := false) : MetaM Expr := do
def mkAuxTheorem (type : Expr) (value : Expr) (zetaDelta : Bool := false) (prefix? : Option Name) (cache := true) : MetaM Expr := do
let result ← Closure.mkValueTypeClosure type value zetaDelta
let env ← getEnv
let decl :=
if env.hasUnsafe result.type || env.hasUnsafe result.value then
-- `result` contains unsafe code, thus we cannot use a theorem.
Declaration.defnDecl {
name
levelParams := result.levelParams.toList
type := result.type
value := result.value
hints := ReducibilityHints.opaque
safety := DefinitionSafety.unsafe
}
else
Declaration.thmDecl {
name
levelParams := result.levelParams.toList
type := result.type
value := result.value
}
addDecl decl
let name ← mkAuxLemma (prefix? := prefix?) (cache := cache) result.levelParams.toList result.type result.value
return mkAppN (mkConst name result.levelArgs.toList) result.exprArgs
/--
Similar to `mkAuxTheorem`, but infers the type of `value`.
-/
def mkAuxTheoremFor (name : Name) (value : Expr) (zetaDelta : Bool := false) : MetaM Expr := do
let type ← inferType value
let type := type.headBeta
mkAuxTheorem name type value zetaDelta
end Lean.Meta


@@ -61,7 +61,7 @@ partial def mkHCongrWithArity (f : Expr) (numArgs : Nat) : MetaM CongrTheorem :=
forallBoundedTelescope fType numArgs (cleanupAnnotations := true) fun xs _ =>
forallBoundedTelescope fType numArgs (cleanupAnnotations := true) fun ys _ => do
if xs.size != numArgs then
throwError "failed to generate hcongr theorem, insufficient number of arguments"
throwError "failed to generate `hcongr` theorem: expected {numArgs} arguments, but got {xs.size} for{indentExpr f}"
else
let lctx := addPrimeToFVarUserNames ys (← getLCtx) |> setBinderInfosD ys |> setBinderInfosD xs
withLCtx lctx (← getLocalInstances) do


@@ -181,4 +181,42 @@ def etaStructReduce (e : Expr) (p : Name → Bool) : MetaM Expr := do
else
return .continue)
/--
Instantiates the default value given by the default value function `defaultFn`.
- `defaultFn` is the default value function returned by `Lean.getEffectiveDefaultFnForField?` or `Lean.getDefaultFnForField?`.
- `levels?` is the list of levels to use, and otherwise the levels are inferred.
- `params` is the list of structure parameters. These are assumed to be correct for the given levels.
- `fieldVal?` is a function for getting the values of fields, if they exist.
If successful, returns a set of fields used and the resulting default value.
Success is expected. Callers should do metacontext backtracking themselves if needed.
-/
partial def instantiateStructDefaultValueFn?
[Monad m] [MonadEnv m] [MonadError m] [MonadLiftT MetaM m] [MonadControlT MetaM m]
(defaultFn : Name) (levels? : Option (List Level)) (params : Array Expr)
(fieldVal? : Name → m (Option Expr)) : m (Option (NameSet × Expr)) := do
let cinfo ← getConstInfo defaultFn
let us ← levels?.getDM (mkFreshLevelMVarsFor cinfo)
assert! us.length == cinfo.levelParams.length
let mut val ← liftMetaM <| instantiateValueLevelParams cinfo us
for param in params do
let .lam _ t b _ := val | return none
-- If no levels given, need to unify to solve for level mvars.
if levels?.isNone then
unless (← isDefEq (← inferType param) t) do return none
val := b.instantiate1 param
go? {} val
where
go? (usedFields : NameSet) (e : Expr) : m (Option (NameSet × Expr)) := do
match e with
| .lam n d b _ =>
let some val ← fieldVal? n | return none
if (← isDefEq (← inferType val) d) then
go? (usedFields.insert n) (b.instantiate1 val)
else
return none
| e =>
let_expr id _ a := e | return some (usedFields, e)
return some (usedFields, a)
end Lean.Meta
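For intuition about the inputs of `instantiateStructDefaultValueFn?`: a default-value function abstracts over the structure parameters and any earlier fields the default mentions. A schematic sketch, not the exact generated code:

```lean
structure Pt (α : Type) where
  x : α
  y : α := x
-- The compiler derives a default-value function for `y`, schematically
--   fun (α : Type) (x : α) => x
-- `instantiateStructDefaultValueFn?` first applies the structure parameters
-- (`α` via `params`), then uses `fieldVal?` to supply earlier fields (`x`),
-- returning the instantiated default together with the set of fields it used.
```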


@@ -28,19 +28,32 @@ builtin_initialize auxLemmasExt : EnvExtension AuxLemmas ←
This method is useful for tactics (e.g., `simp`) that may perform preprocessing steps on lemmas provided by
users. For example, the `simp` preprocessor may convert a lemma into multiple ones.
-/
def mkAuxLemma (levelParams : List Name) (type : Expr) (value : Expr) : MetaM Name := do
def mkAuxLemma (levelParams : List Name) (type : Expr) (value : Expr) (prefix? : Option Name := none) (cache := true) : MetaM Name := do
let env ← getEnv
let s := auxLemmasExt.getState env
let mkNewAuxLemma := do
let auxName := Name.mkNum (env.asyncPrefix?.getD env.mainModule ++ `_auxLemma) s.idx
addDecl <| Declaration.thmDecl {
name := auxName
levelParams, type, value
}
let auxName := prefix?.getD (env.asyncPrefix?.getD (mkPrivateName env .anonymous)) ++ `_proof |>.appendIndexAfter s.idx
let decl :=
if env.hasUnsafe type || env.hasUnsafe value then
-- `result` contains unsafe code, thus we cannot use a theorem.
Declaration.defnDecl {
name := auxName
hints := ReducibilityHints.opaque
safety := DefinitionSafety.unsafe
levelParams, type, value
}
else
Declaration.thmDecl {
name := auxName
levelParams, type, value
}
addDecl decl
modifyEnv fun env => auxLemmasExt.modifyState env fun ⟨idx, lemmas⟩ => ⟨idx + 1, lemmas.insert type (auxName, levelParams)⟩
return auxName
match s.lemmas.find? type with
| some (name, levelParams') => if levelParams == levelParams' then return name else mkNewAuxLemma
| none => mkNewAuxLemma
if cache then
if let some (name, levelParams') := s.lemmas.find? type then
if levelParams == levelParams' then
return name
mkNewAuxLemma
end Lean.Meta
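With `cache := true`, a later call with a syntactically identical type and the same universe parameters returns the previously created declaration instead of minting a new one. A sketch of the intended behavior, not a test from this PR:

```lean
-- Sketch: both calls should yield the same auxiliary name, since the second
-- lookup hits `s.lemmas.find? type` before `mkNewAuxLemma` runs.
def demo : MetaM Unit := do
  let ty  := mkConst ``True
  let val := mkConst ``True.intro
  let n₁ ← mkAuxLemma [] ty val
  let n₂ ← mkAuxLemma [] ty val
  assert! n₁ == n₂
```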


@@ -72,5 +72,6 @@ builtin_initialize registerTraceClass `grind.debug.matchCond.lambda
builtin_initialize registerTraceClass `grind.debug.matchCond.proveFalse
builtin_initialize registerTraceClass `grind.debug.mbtc
builtin_initialize registerTraceClass `grind.debug.ematch
builtin_initialize registerTraceClass `grind.debug.proveEq
end Lean


@@ -101,56 +101,76 @@ private def intro1 : GoalM FVarId := do
private partial def introNext (goal : Goal) (generation : Nat) : GrindM IntroResult := do
Prod.fst <$> GoalM.run goal do
let target ← (← get).mvarId.getType
if target.isArrow then
if target.isForall then
let p := target.bindingDomain!
if !(← isProp p) then
let fvarId ← intro1
return .newLocal fvarId (← get)
else
let tag ← (← get).mvarId.getTag
let q := target.bindingBody!
let r ← preprocessHypothesis p
let fvarId ← mkFreshFVarId
let lctx := (← getLCtx).mkLocalDecl fvarId (← mkCleanName target.bindingName! r.expr) r.expr target.bindingInfo!
let mvarId := (← get).mvarId
let mvarNew ← mkFreshExprMVarAt lctx (← getLocalInstances) q .syntheticOpaque tag
let mvarIdNew := mvarNew.mvarId!
mvarIdNew.withContext do
let h ← mkLambdaFVars #[mkFVar fvarId] mvarNew
match r.proof? with
| some he =>
let tag ← mvarId.getTag
let qBase := target.bindingBody!
let fvarId ← mkFreshFVarId
let fvar := mkFVar fvarId
let r ← preprocessHypothesis p
let lctx := (← getLCtx).mkLocalDecl fvarId (← mkCleanName target.bindingName! r.expr) r.expr target.bindingInfo!
let mut localInsts ← getLocalInstances
if let some className ← isClass? r.expr then
localInsts := localInsts.push { className, fvar }
match r.proof? with
| some he =>
if target.isArrow then
let q := qBase
let u ← getLevel q
let hNew := mkAppN (mkConst ``Lean.Grind.intro_with_eq [u]) #[p, r.expr, q, he, h]
mvarId.assign hNew
return .newHyp fvarId { (← get) with mvarId := mvarIdNew }
| none =>
-- `p` and `p'` are definitionally equal
let mvarNew ← mkFreshExprMVarAt lctx localInsts q .syntheticOpaque tag
let mvarIdNew := mvarNew.mvarId!
mvarIdNew.withContext do
let h ← mkLambdaFVars #[fvar] mvarNew
let hNew := mkAppN (mkConst ``Lean.Grind.intro_with_eq [u]) #[p, r.expr, q, he, h]
mvarId.assign hNew
return .newHyp fvarId { (← get) with mvarId := mvarIdNew }
else
let q := mkLambda target.bindingName! target.bindingInfo! p qBase
let q' := qBase.instantiate1 (mkApp4 (mkConst ``Eq.mpr_prop) p r.expr he fvar)
let u ← getLevel q'
let mvarNew ← mkFreshExprMVarAt lctx localInsts q' .syntheticOpaque tag
let mvarIdNew := mvarNew.mvarId!
mvarIdNew.withContext do
let h ← mkLambdaFVars #[fvar] mvarNew
let hNew := mkAppN (mkConst ``Lean.Grind.intro_with_eq' [u]) #[p, r.expr, q, he, h]
mvarId.assign hNew
return .newHyp fvarId { (← get) with mvarId := mvarIdNew }
| none =>
-- `p` and `p'` are definitionally equal
let q := if target.isArrow then qBase else qBase.instantiate1 (mkFVar fvarId)
let mvarNew ← mkFreshExprMVarAt lctx localInsts q .syntheticOpaque tag
let mvarIdNew := mvarNew.mvarId!
mvarIdNew.withContext do
let h ← mkLambdaFVars #[mkFVar fvarId] mvarNew
mvarId.assign h
return .newHyp fvarId { (← get) with mvarId := mvarIdNew }
else if (← getConfig).zetaDelta && (target.isLet || target.isLetFun) then
let targetNew := expandLet target #[]
let mvarId := (← get).mvarId
mvarId.withContext do
let mvarNew ← mkFreshExprSyntheticOpaqueMVar targetNew (← mvarId.getTag)
mvarId.assign mvarNew
introNext { (← get) with mvarId := mvarNew.mvarId! } generation
else if target.isLet || target.isForall || target.isLetFun then
let fvarId ← intro1
(← get).mvarId.withContext do
let localDecl ← fvarId.getDecl
if (← isProp localDecl.type) then
-- Add a non-dependent copy
let mvarId ← (← get).mvarId.assert (← mkCleanName `h localDecl.type) localDecl.type (mkFVar fvarId)
return .newDepHyp { (← get) with mvarId }
else
if target.isLet || target.isLetFun then
else if target.isLet || target.isLetFun then
if (← getConfig).zetaDelta then
let targetNew := expandLet target #[]
let mvarId := (← get).mvarId
mvarId.withContext do
let mvarNew ← mkFreshExprSyntheticOpaqueMVar targetNew (← mvarId.getTag)
mvarId.assign mvarNew
introNext { (← get) with mvarId := mvarNew.mvarId! } generation
else
let fvarId ← intro1
(← get).mvarId.withContext do
let localDecl ← fvarId.getDecl
if (← isProp localDecl.type) then
-- Add a non-dependent copy
let mvarId ← (← get).mvarId.assert (← mkCleanName `h localDecl.type) localDecl.type (mkFVar fvarId)
return .newDepHyp { (← get) with mvarId }
else
let v := (← fvarId.getDecl).value
let r ← preprocessHypothesis v
let x ← shareCommon (mkFVar fvarId)
addNewEq x r.expr (← r.getProof) generation
return .newLocal fvarId (← get)
else
return .newLocal fvarId (← get)
else
return .done goal


@@ -119,15 +119,25 @@ builtin_grind_propagator propagateNotDown ↓Not := fun e => do
else if (← isEqv e a) then
closeGoal <| mkApp2 (mkConst ``Grind.false_of_not_eq_self) a (← mkEqProof e a)
def propagateBoolDiseq (a : Expr) : GoalM Unit := do
if let some h ← mkDiseqProof? a (← getBoolFalseExpr) then
pushEqBoolTrue a <| mkApp2 (mkConst ``Grind.Bool.eq_true_of_not_eq_false') a h
if let some h ← mkDiseqProof? a (← getBoolTrueExpr) then
pushEqBoolFalse a <| mkApp2 (mkConst ``Grind.Bool.eq_false_of_not_eq_true') a h
/-- Propagates `Eq` upwards -/
builtin_grind_propagator propagateEqUp ↑Eq := fun e => do
let_expr Eq _ a b := e | return ()
let_expr Eq α a b := e | return ()
if (← isEqTrue a) then
pushEq e b <| mkApp3 (mkConst ``Grind.eq_eq_of_eq_true_left) a b (← mkEqTrueProof a)
else if (← isEqTrue b) then
pushEq e a <| mkApp3 (mkConst ``Grind.eq_eq_of_eq_true_right) a b (← mkEqTrueProof b)
else if (← isEqv a b) then
pushEqTrue e <| mkEqTrueCore e (← mkEqProof a b)
if α.isConstOf ``Bool then
if (← isEqFalse e) then
propagateBoolDiseq a
propagateBoolDiseq b
let aRoot ← getRootENode a
let bRoot ← getRootENode b
if aRoot.ctor && bRoot.ctor && aRoot.self.getAppFn != bRoot.self.getAppFn then
@@ -146,6 +156,9 @@ builtin_grind_propagator propagateEqDown ↓Eq := fun e => do
pushEq a b <| mkOfEqTrueCore e (← mkEqTrueProof e)
else if (← isEqFalse e) then
let_expr Eq α lhs rhs := e | return ()
if α.isConstOf ``Bool then
propagateBoolDiseq lhs
propagateBoolDiseq rhs
propagateCutsatDiseq lhs rhs
let thms ← getExtTheorems α
if !thms.isEmpty then
@@ -159,6 +172,25 @@ builtin_grind_propagator propagateEqDown ↓Eq := fun e => do
for thm in (← getExtTheorems α) do
instantiateExtTheorem thm e
builtin_grind_propagator propagateBEqUp ↑BEq.beq := fun e => do
/-
`grind` uses the normalization rule `Bool.beq_eq_decide_eq`, but it is only applicable if
the type implements the instances `BEq`, `LawfulBEq`, **and** `DecidableEq α`.
However, we may be in a context where only `BEq` and `LawfulBEq` are available.
Thus, we have added this propagator as a backup.
-/
let_expr f@BEq.beq α binst a b := e | return ()
if (← isEqv a b) then
let u := f.constLevels!
let lawfulBEq := mkApp2 (mkConst ``LawfulBEq u) α binst
let .some linst ← trySynthInstance lawfulBEq | return ()
pushEqBoolTrue e <| mkApp6 (mkConst ``Grind.beq_eq_true_of_eq u) α binst linst a b (← mkEqProof a b)
else if let some h ← mkDiseqProof? a b then
let u := f.constLevels!
let lawfulBEq := mkApp2 (mkConst ``LawfulBEq u) α binst
let .some linst ← trySynthInstance lawfulBEq | return ()
pushEqBoolFalse e <| mkApp6 (mkConst ``Grind.beq_eq_false_of_diseq u) α binst linst a b h
/-- Propagates `EqMatch` downwards -/
builtin_grind_propagator propagateEqMatchDown ↓Grind.EqMatch := fun e => do
if (← isEqTrue e) then
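The situation the `propagateBEqUp` backup above targets is a goal over a type that has `BEq` and `LawfulBEq` instances but no `DecidableEq`. The statement below shows the intended shape; whether `grind` closes exactly this goal depends on the rest of the pipeline, so treat it as an illustration rather than a regression test:

```lean
example {α : Type} [BEq α] [LawfulBEq α] (a b : α) (h : a = b) :
    (a == b) = true := by
  grind
```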


@@ -22,15 +22,23 @@ private def withoutModifyingState (x : GoalM α) : GoalM α := do
set saved
/--
If `e` has not been internalized yet, simplify it, and internalize the result.
If `e` has not been internalized yet, instantiate metavariables, unfold reducible, canonicalize,
and internalize the result.
This is an auxiliary function used by `proveEq?` and `proveHEq?`.
-/
private def preprocessAndInternalize (e : Expr) (gen : Nat := 0) : GoalM Simp.Result := do
private def ensureInternalized (e : Expr) : GoalM Expr := do
if (← alreadyInternalized e) then
return { expr := e }
return e
else
let r ← preprocess e
internalize r.expr gen
return r
/-
It is important to expand reducible declarations. Otherwise, we cannot prove
`¬ a = []` and `b ≠ []` by congruence closure even when `a` and `b` are in the same
equivalence class.
-/
let e ← shareCommon (← canon (← unfoldReducible (← instantiateMVars e)))
internalize e 0
return e
/--
Try to construct a proof that `lhs = rhs` using the information in the
@@ -42,24 +50,26 @@ This function mainly relies on congruence closure, and constraint
propagation. It will not perform case analysis.
-/
def proveEq? (lhs rhs : Expr) : GoalM (Option Expr) := do
trace[grind.debug.proveEq] "({lhs}) = ({rhs})"
if (← alreadyInternalized lhs <&&> alreadyInternalized rhs) then
if (← isEqv lhs rhs) then
return some (← mkEqProof lhs rhs)
else
return none
else withoutModifyingState do
let lhs ← preprocessAndInternalize lhs
let rhs ← preprocessAndInternalize rhs
/-
We used to apply the `grind` normalizer, but it created unexpected failures.
Here is an example, suppose we are trying to prove `i < (a :: l).length` is equal to `0 < (a :: l).length`
when `i` and `0` are in the same equivalence class. This should hold by applying congruence closure.
However, if we apply the normalizer, we obtain `i+1 ≤ (a :: l).length` and `1 ≤ (a :: l).length`, and
the equality cannot be detected by congruence closure anymore.
-/
let lhs ← ensureInternalized lhs
let rhs ← ensureInternalized rhs
processNewFacts
unless (← isEqv lhs.expr rhs.expr) do return none
unless (← hasSameType lhs.expr rhs.expr) do return none
let h ← mkEqProof lhs.expr rhs.expr
let h ← match lhs.proof?, rhs.proof? with
| none, none => pure h
| none, some h₂ => mkEqTrans h (← mkEqSymm h₂)
| some h₁, none => mkEqTrans h₁ h
| some h₁, some h₂ => mkEqTrans (← mkEqTrans h₁ h) (← mkEqSymm h₂)
return some h
unless (← isEqv lhs rhs) do return none
unless (← hasSameType lhs rhs) do return none
mkEqProof lhs rhs
/-- Similar to `proveEq?`, but for heterogeneous equality. -/
def proveHEq? (lhs rhs : Expr) : GoalM (Option Expr) := do
@@ -69,16 +79,11 @@ def proveHEq? (lhs rhs : Expr) : GoalM (Option Expr) := do
else
return none
else withoutModifyingState do
let lhs ← preprocessAndInternalize lhs
let rhs ← preprocessAndInternalize rhs
-- See comment at `proveEq?`
let lhs ← ensureInternalized lhs
let rhs ← ensureInternalized rhs
processNewFacts
unless (← isEqv lhs.expr rhs.expr) do return none
let h ← mkHEqProof lhs.expr rhs.expr
let h ← match lhs.proof?, rhs.proof? with
| none, none => pure h
| none, some h₂ => mkHEqTrans h (← mkHEqSymm h₂)
| some h₁, none => mkHEqTrans h₁ h
| some h₁, some h₂ => mkHEqTrans (← mkHEqTrans h₁ h) (← mkHEqSymm h₂)
return some h
unless (← isEqv lhs rhs) do return none
mkHEqProof lhs rhs
end Lean.Meta.Grind
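The comment in `proveEq?` about skipping the normalizer can be made concrete. Once `i` and `0` are merged, congruence closure identifies the two propositions below directly; normalizing them to `i + 1 ≤ _` and `1 ≤ _` first would destroy the shared head symbol. Proved here with `rw` for illustration, not via the internal machinery:

```lean
example {α : Type} (i : Nat) (a : α) (l : List α) (h : i = 0) :
    (i < (a :: l).length) = (0 < (a :: l).length) := by
  rw [h]
```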

View File

@@ -199,7 +199,7 @@ Abstracts nested proofs in `e`. This is a preprocessing step performed before int
-/
def abstractNestedProofs (e : Expr) : GrindM Expr := do
let nextIdx := (← get).nextThmIdx
let (e, s') ← AbstractNestedProofs.visit e |>.run { baseName := (← getMainDeclName) } |>.run |>.run { nextIdx }
let (e, s') ← AbstractNestedProofs.visit e |>.run { cache := true, baseName := (← getMainDeclName) } |>.run |>.run { nextIdx }
modify fun s => { s with nextThmIdx := s'.nextIdx }
return e
@@ -229,10 +229,10 @@ def mkHCongrWithArity (f : Expr) (numArgs : Nat) : GrindM CongrTheorem := do
if let some result := (← get).congrThms.find? key then
return result
if let .const declName us := f then
if let some result ← mkHCongrWithArityForConst? declName us numArgs then
if let some result ← withDefault <| Meta.mkHCongrWithArityForConst? declName us numArgs then
modify fun s => { s with congrThms := s.congrThms.insert key result }
return result
let result ← Meta.mkHCongrWithArity f numArgs
let result ← withDefault <| Meta.mkHCongrWithArity f numArgs
modify fun s => { s with congrThms := s.congrThms.insert key result }
return result


@@ -4,7 +4,7 @@ Released under Apache 2.0 license as described in the file LICENSE.
Authors: Leonardo de Moura
-/
prelude
import Lean.Environment
import Lean.EnvExtension
import Lean.PrivateName
namespace Lean


@@ -4,7 +4,7 @@ Released under Apache 2.0 license as described in the file LICENSE.
Authors: Leonardo de Moura
-/
prelude
import Lean.Environment
import Lean.EnvExtension
namespace Lean


@@ -9,6 +9,7 @@ import Lean.PrettyPrinter.Delaborator.Basic
import Lean.PrettyPrinter.Delaborator.SubExpr
import Lean.PrettyPrinter.Delaborator.TopDownAnalyze
import Lean.Meta.CoeAttr
import Lean.Meta.Structure
namespace Lean.PrettyPrinter.Delaborator
open Lean.Meta
@@ -556,8 +557,7 @@ def delabDelayedAssignedMVar : Delab := whenNotPPOption getPPMVarsDelayed do
delabMVarAux decl.mvarIdPending
private partial def collectStructFields
(structName : Name)
(paramMap : NameMap Expr)
(structName : Name) (levels : List Level) (params : Array Expr)
(fields : Array (TSyntax ``Parser.Term.structInstField))
(fieldValues : NameMap Expr)
(s : ConstructorVal) :
@@ -575,14 +575,13 @@ private partial def collectStructFields
if getPPOption getPPStructureInstancesFlatten then
if let some s' ← isConstructorApp? (← getExpr) then
if s'.induct == parentName then
let (fieldValues, fields) ← collectStructFields structName paramMap fields fieldValues s'
let (fieldValues, fields) ← collectStructFields structName levels params fields fieldValues s'
return (i + 1, fieldValues, fields)
/- Does it have the default value, and should it be omitted? -/
/- Does this field have a default value? and if so, can we omit the field? -/
unless getPPOption getPPStructureInstancesDefaults do
if let some defFn := getEffectiveDefaultFnForField? ( getEnv) structName fieldName then
let cinfo ← getConstInfo defFn
let defValue := cinfo.instantiateValueLevelParams! (← mkFreshLevelMVarsFor cinfo)
if let some defValue ← withNewMCtxDepth <| processDefaultValue paramMap fieldValues defValue then
-- Use `withNewMCtxDepth` to prevent delaborator from solving metavariables.
if let some (_, defValue) ← withNewMCtxDepth <| instantiateStructDefaultValueFn? defFn levels params (pure ∘ fieldValues.find?) then
if ← withReducible <| withNewMCtxDepth <| isDefEq defValue (← getExpr) then
-- Default value matches, skip the field.
return (i + 1, fieldValues, fields)
@@ -596,24 +595,6 @@ private partial def collectStructFields
let field ← `(structInstField|$fieldId:ident := $value)
return (i + 1, fieldValues, fields.push field))
return (fieldValues, fields)
where
processDefaultValue (paramMap : NameMap Expr) (fieldValues : NameMap Expr) : Expr → MetaM (Option Expr)
| .lam n d b c => do
if c.isExplicit then
let some val := fieldValues.find? n | return none
if ← isDefEq (← inferType val) d then
processDefaultValue paramMap fieldValues (b.instantiate1 val)
else
return none
else
let some param := paramMap.find? n | return none
if ← isDefEq (← inferType param) d then
processDefaultValue paramMap fieldValues (b.instantiate1 param)
else
return none
| e =>
let_expr id _ a := e | return some e
return some a
/--
Delaborate structure constructor applications using structure instance notation or anonymous constructor notation.
@@ -663,14 +644,10 @@ def delabStructureInstance : Delab := do
If `pp.structureInstances.flatten` is true (and `pp.explicit` is false or the subobject has no parameters)
then subobjects are flattened.
-/
-- For default value handling, we need to create a map of type parameter names to expressions.
let .const _ levels := (← getExpr).getAppFn | failure
let args := (← getExpr).getAppArgs
let paramMap : NameMap Expr ← forallTelescope s.type fun xs _ => do
let mut paramMap := {}
for param in args[:s.numParams], x in xs do
paramMap := paramMap.insert (← x.fvarId!.getUserName) param
return paramMap
let (_, fields) ← collectStructFields s.induct paramMap #[] {} s
let params := args[0:s.numParams]
let (_, fields) ← collectStructFields s.induct levels params #[] {} s
let tyStx? : Option Term ← withType do
if getPPOption getPPStructureInstanceType then delab else pure none
`({ $fields,* $[: $tyStx?]? })


@@ -4,7 +4,7 @@ Released under Apache 2.0 license as described in the file LICENSE.
Authors: Leonardo de Moura
-/
prelude
import Lean.Environment
import Lean.EnvExtension
namespace Lean


@@ -80,7 +80,7 @@ partial def getFinishedPrefix : AsyncList ε α → BaseIO (List α × Option ε
else pure ⟨[], none, false⟩
partial def getFinishedPrefixWithTimeout (xs : AsyncList ε α) (timeoutMs : UInt32)
(cancelTk? : Option (ServerTask Unit) := none) : BaseIO (List α × Option ε × Bool) := do
(cancelTks : List (ServerTask Unit) := []) : BaseIO (List α × Option ε × Bool) := do
let timeoutTask : ServerTask (Unit ⊕ Except ε (AsyncList ε α)) ←
if timeoutMs == 0 then
pure <| ServerTask.pure (Sum.inl ())
@@ -100,21 +100,17 @@ where
| delayed tl =>
let tl : ServerTask (Except ε (AsyncList ε α)) := tl
let tl := tl.mapCheap .inr
let cancelTk? := do return (← cancelTk?).mapCheap .inl
let tasks : { t : List _ // t.length > 0 } :=
match cancelTk? with
| none => ⟨[tl, timeoutTask], by exact Nat.zero_lt_succ _⟩
| some cancelTk => ⟨[tl, cancelTk, timeoutTask], by exact Nat.zero_lt_succ _⟩
let r ← ServerTask.waitAny tasks.val (h := tasks.property)
let cancelTks := cancelTks.map (·.mapCheap .inl)
let r ← ServerTask.waitAny (tl :: cancelTks ++ [timeoutTask])
match r with
| .inl _ => return ⟨[], none, false⟩ -- Timeout or cancellation - stop waiting
| .inr (.ok tl) => go timeoutTask tl
| .inr (.error e) => return ⟨[], some e, true⟩
partial def getFinishedPrefixWithConsistentLatency (xs : AsyncList ε α) (latencyMs : UInt32)
(cancelTk? : Option (ServerTask Unit) := none) : BaseIO (List α × Option ε × Bool) := do
(cancelTks : List (ServerTask Unit) := []) : BaseIO (List α × Option ε × Bool) := do
let timestamp ← IO.monoMsNow
let r ← xs.getFinishedPrefixWithTimeout latencyMs cancelTk?
let r ← xs.getFinishedPrefixWithTimeout latencyMs cancelTks
let passedTimeMs := (← IO.monoMsNow) - timestamp
let remainingLatencyMs := (latencyMs.toNat - passedTimeMs).toUInt32
sleepWithCancellation remainingLatencyMs
@@ -123,14 +119,14 @@ where
sleepWithCancellation (sleepDurationMs : UInt32) : BaseIO Unit := do
if sleepDurationMs == 0 then
return
let some cancelTk := cancelTk?
| IO.sleep sleepDurationMs
return
if cancelTk.hasFinished then
if cancelTks.isEmpty then
IO.sleep sleepDurationMs
return
if ← cancelTks.anyM (·.hasFinished) then
return
let sleepTask ← Lean.Server.ServerTask.BaseIO.asTask do
IO.sleep sleepDurationMs
ServerTask.waitAny [sleepTask, cancelTk]
ServerTask.waitAny <| sleepTask :: cancelTks
end AsyncList
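The `cancelTks` generalization uses one pattern throughout: tag the payload task `.inr`, tag every cancellation task `.inl`, and wait for whichever finishes first. A standalone sketch of that race with plain tasks; `raceWithCancel` is illustrative and assumes `IO.waitAny`'s nonempty-list side condition is satisfied by the explicit cons:

```lean
-- `.inl ()` means a cancellation fired first; `.inr a` means the payload won.
def raceWithCancel (work : Task α) (cancelTks : List (Task Unit)) :
    BaseIO (Unit ⊕ α) :=
  IO.waitAny (work.map Sum.inr :: cancelTks.map (·.map Sum.inl))
```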


@@ -147,7 +147,7 @@ def handleCodeActionResolve (param : CodeAction) : RequestM (RequestTask CodeAct
let doc ← readDoc
let some data := param.data?
| throw (RequestError.invalidParams "Expected a data field on CodeAction.")
let data : CodeActionResolveData ← liftExcept <| Except.mapError RequestError.invalidParams <| fromJson? data
let data ← RequestM.parseRequestParams CodeActionResolveData data
let pos := doc.meta.text.lspPosToUtf8Pos data.params.range.end
withWaitFindSnap doc (fun s => s.endPos ≥ pos)
(notFoundX := throw <| RequestError.internalError "snapshot not found")


@@ -0,0 +1,341 @@
/-
Copyright (c) 2025 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Marc Huisinga
-/
prelude
import Lean.Server.FileWorker.Utils
import Lean.Data.Lsp.Internal
import Lean.Server.Requests
import Lean.Server.Completion.CompletionInfoSelection
import Lean.Server.CodeActions.Basic
import Lean.Server.Completion.CompletionUtils
namespace Lean.Server.FileWorker
open Lean.Lsp
open Lean.Server.Completion
structure UnknownIdentifierInfo where
paramsRange : String.Range
diagRange : String.Range
def waitUnknownIdentifierRanges (doc : EditableDocument) (requestedRange : String.Range)
: BaseIO (Array String.Range) := do
let text := doc.meta.text
let msgLog := Language.toSnapshotTree doc.initSnap |>.collectMessagesInRange requestedRange |>.get
let mut ranges := #[]
for msg in msgLog.reportedPlusUnreported do
if ! msg.data.hasTag (· == unknownIdentifierMessageTag) then
continue
let msgRange : String.Range := ⟨text.ofPosition msg.pos, text.ofPosition <| msg.endPos.getD msg.pos⟩
if ! msgRange.overlaps requestedRange
(includeFirstStop := true) (includeSecondStop := true) then
continue
ranges := ranges.push msgRange
return ranges
structure Insertion where
fullName : Name
edit : TextEdit
structure Query extends LeanModuleQuery where
env : Environment
determineInsertion : Name → Insertion
partial def collectOpenNamespaces (currentNamespace : Name) (openDecls : List OpenDecl)
: Array OpenNamespace := Id.run do
let mut openNamespaces : Array OpenNamespace := #[]
let mut currentNamespace := currentNamespace
while ! currentNamespace.isAnonymous do
openNamespaces := openNamespaces.push <| .allExcept currentNamespace #[]
currentNamespace := currentNamespace.getPrefix
let openDeclNamespaces := openDecls.map fun
| .simple ns except => .allExcept ns except.toArray
| .explicit id declName => .renamed declName id
openNamespaces := openNamespaces ++ openDeclNamespaces.toArray
return openNamespaces
def computeFallbackQuery?
(doc : EditableDocument)
(requestedRange : String.Range)
(unknownIdentifierRange : String.Range)
(infoTree : Elab.InfoTree)
: Option Query := do
let text := doc.meta.text
let info? := infoTree.smallestInfo? fun info => Id.run do
let some range := info.range?
| return false
return range.overlaps requestedRange (includeFirstStop := true) (includeSecondStop := true)
let some (ctx, _) := info?
| none
return {
identifier := text.source.extract unknownIdentifierRange.start unknownIdentifierRange.stop
openNamespaces := collectOpenNamespaces ctx.currNamespace ctx.openDecls
env := ctx.env
determineInsertion decl :=
let minimizedId := minimizeGlobalIdentifierInContext ctx.currNamespace ctx.openDecls decl
{
fullName := minimizedId
edit := {
range := text.utf8RangeToLspRange unknownIdentifierRange
newText := minimizedId.toString
}
}
}
def computeIdQuery?
(doc : EditableDocument)
(ctx : Elab.ContextInfo)
(stx : Syntax)
(id : Name)
: Option Query := do
let some pos := stx.getPos? (canonicalOnly := true)
| none
let some tailPos := stx.getTailPos? (canonicalOnly := true)
| none
return {
identifier := id.toString
openNamespaces := collectOpenNamespaces ctx.currNamespace ctx.openDecls
env := ctx.env
determineInsertion decl :=
let minimizedId := minimizeGlobalIdentifierInContext ctx.currNamespace ctx.openDecls decl
{
fullName := minimizedId
edit := {
range := doc.meta.text.utf8RangeToLspRange ⟨pos, tailPos⟩
newText := minimizedId.toString
}
}
}
def computeDotQuery?
(doc : EditableDocument)
(ctx : Elab.ContextInfo)
(ti : Elab.TermInfo)
: IO (Option Query) := do
let text := doc.meta.text
let some pos := ti.stx.getPos? (canonicalOnly := true)
| return none
let some tailPos := ti.stx.getTailPos? (canonicalOnly := true)
| return none
let typeNames? : Option (Array Name) ← ctx.runMetaM ti.lctx do
try
return some <| getDotCompletionTypeNames (← Lean.instantiateMVars (← Lean.Meta.inferType ti.expr))
catch _ =>
return none
let some typeNames := typeNames?
| return none
return some {
identifier := text.source.extract pos tailPos
openNamespaces := typeNames.map (.allExcept · #[])
env := ctx.env
determineInsertion decl :=
{
fullName := decl
edit := {
range := text.utf8RangeToLspRange ⟨pos, tailPos⟩
newText := decl.getString!
}
}
}
def computeDotIdQuery?
(doc : EditableDocument)
(ctx : Elab.ContextInfo)
(stx : Syntax)
(id : Name)
(lctx : LocalContext)
(expectedType? : Option Expr)
: IO (Option Query) := do
let some pos := stx.getPos? (canonicalOnly := true)
| return none
let some tailPos := stx.getTailPos? (canonicalOnly := true)
| return none
let some expectedType := expectedType?
| return none
let typeNames? : Option (Array Name) ← ctx.runMetaM lctx do
let resultTypeFn := (← instantiateMVars expectedType).cleanupAnnotations.getAppFn.cleanupAnnotations
let .const .. := resultTypeFn
| return none
try
return some <| getDotCompletionTypeNames resultTypeFn
catch _ =>
return none
let some typeNames := typeNames?
| return none
return some {
identifier := id.toString
openNamespaces := typeNames.map (.allExcept · #[])
env := ctx.env
determineInsertion decl :=
{
fullName := decl
edit := {
range := doc.meta.text.utf8RangeToLspRange ⟨pos, tailPos⟩
newText := decl.getString!
}
}
}
def computeQuery?
(doc : EditableDocument)
(requestedRange : String.Range)
(unknownIdentifierRange : String.Range)
: RequestM (Option Query) := do
let text := doc.meta.text
let some (stx, infoTree) := RequestM.findCmdDataAtPos doc unknownIdentifierRange.stop (includeStop := true) |>.get
| return none
let completionInfo? : Option ContextualizedCompletionInfo := do
let (completionPartitions, _) := findPrioritizedCompletionPartitionsAt text unknownIdentifierRange.stop stx infoTree
let highestPrioPartition ← completionPartitions[0]?
let (completionInfo, _) ← highestPrioPartition[0]?
return completionInfo
let some ⟨_, ctx, info⟩ := completionInfo?
| return computeFallbackQuery? doc requestedRange unknownIdentifierRange infoTree
match info with
| .id (stx := stx) (id := id) .. =>
return computeIdQuery? doc ctx stx id
| .dot (termInfo := ti) .. =>
return computeDotQuery? doc ctx ti
| .dotId stx id lctx expectedType? =>
return computeDotIdQuery? doc ctx stx id lctx expectedType?
| _ => return none
def importAllUnknownIdentifiersProvider : Name := `unknownIdentifiers
def importAllUnknownIdentifiersCodeAction (params : CodeActionParams) (kind : String) : CodeAction := {
title := "Import all unambiguous unknown identifiers"
kind? := kind
data? := some <| toJson {
params,
providerName := importAllUnknownIdentifiersProvider
providerResultIndex := 0
: CodeActionResolveData
}
}
def handleUnknownIdentifierCodeAction
(id : JsonRpc.RequestID)
(params : CodeActionParams)
(requestedRange : String.Range)
(unknownIdentifierRanges : Array String.Range)
: RequestM (Array CodeAction) := do
let rc ← read
let doc := rc.doc
let text := doc.meta.text
let queries ← unknownIdentifierRanges.filterMapM fun unknownIdentifierRange =>
computeQuery? doc requestedRange unknownIdentifierRange
if queries.isEmpty then
return #[]
let responseTask ← RequestM.sendServerRequest LeanQueryModuleParams LeanQueryModuleResponse "$/lean/queryModule" {
sourceRequestID := id
queries := queries.map (·.toLeanModuleQuery)
: LeanQueryModuleParams
}
let r ← ServerTask.waitAny [
responseTask.mapCheap Sum.inl,
rc.cancelTk.requestCancellationTask.mapCheap Sum.inr
]
let .inl (.success response) := r
| RequestM.checkCancelled
return #[]
let headerStx := doc.initSnap.stx
let importInsertionPos : Lsp.Position :=
match headerStx.getTailPos? with
| some headerTailPos => {
line := (text.utf8PosToLspPos headerTailPos |>.line) + 1
character := 0
}
| none => { line := 0, character := 0 }
let importInsertionRange : Lsp.Range := ⟨importInsertionPos, importInsertionPos⟩
let mut unknownIdentifierCodeActions := #[]
let mut hasUnambiguousImportCodeAction := false
for q in queries, result in response.queryResults do
for ⟨mod, decl, isExactMatch⟩ in result do
let isDeclInEnv := q.env.contains decl
if ! isDeclInEnv && mod == q.env.mainModule then
-- Don't offer any code actions for identifiers defined further down in the same file
continue
let insertion := q.determineInsertion decl
if ! isDeclInEnv then
unknownIdentifierCodeActions := unknownIdentifierCodeActions.push {
title := s!"Import {insertion.fullName} from {mod}"
kind? := "quickfix"
edit? := WorkspaceEdit.ofTextDocumentEdit {
textDocument := doc.versionedIdentifier
edits := #[
{
range := importInsertionRange
newText := s!"import {mod}\n"
},
insertion.edit
]
}
}
if isExactMatch then
hasUnambiguousImportCodeAction := true
else
unknownIdentifierCodeActions := unknownIdentifierCodeActions.push {
title := s!"Change to {insertion.fullName}"
kind? := "quickfix"
edit? := WorkspaceEdit.ofTextDocumentEdit {
textDocument := doc.versionedIdentifier
edits := #[insertion.edit]
}
}
if hasUnambiguousImportCodeAction then
unknownIdentifierCodeActions := unknownIdentifierCodeActions.push <|
importAllUnknownIdentifiersCodeAction params "quickfix"
return unknownIdentifierCodeActions
def handleResolveImportAllUnknownIdentifiersCodeAction?
(id : JsonRpc.RequestID)
(action : CodeAction)
(unknownIdentifierRanges : Array String.Range)
: RequestM (Option CodeAction) := do
let rc ← read
let doc := rc.doc
let text := doc.meta.text
let queries ← unknownIdentifierRanges.filterMapM fun unknownIdentifierRange =>
computeQuery? doc ⟨0, text.source.endPos⟩ unknownIdentifierRange
if queries.isEmpty then
return none
let responseTask ← RequestM.sendServerRequest LeanQueryModuleParams LeanQueryModuleResponse "$/lean/queryModule" {
sourceRequestID := id
queries := queries.map (·.toLeanModuleQuery)
: LeanQueryModuleParams
}
let .success response := responseTask.get
| return none
let headerStx := doc.initSnap.stx
let importInsertionPos : Lsp.Position :=
match headerStx.getTailPos? with
| some headerTailPos => {
line := (text.utf8PosToLspPos headerTailPos |>.line) + 1
character := 0
}
| none => { line := 0, character := 0 }
let importInsertionRange : Lsp.Range := ⟨importInsertionPos, importInsertionPos⟩
let mut edits : Array TextEdit := #[]
let mut imports : Std.HashSet Name := ∅
for q in queries, result in response.queryResults do
let some ⟨mod, decl, _⟩ := result.find? fun id =>
id.isExactMatch && ! q.env.contains id.decl
| continue
if mod == q.env.mainModule then
continue
let insertion := q.determineInsertion decl
if ! imports.contains mod then
edits := edits.push {
range := importInsertionRange
newText := s!"import {mod}\n"
}
edits := edits.push insertion.edit
imports := imports.insert mod
return some { action with
edit? := WorkspaceEdit.ofTextDocumentEdit {
textDocument := doc.versionedIdentifier
edits
}
}

View File

@@ -9,6 +9,7 @@ import Lean.Elab.Tactic.Doc
import Lean.Server.Completion.CompletionResolution
import Lean.Server.Completion.EligibleHeaderDecls
import Lean.Server.RequestCancellation
import Lean.Server.Completion.CompletionUtils
namespace Lean.Server.Completion
open Elab
@@ -268,9 +269,6 @@ end IdCompletionUtils
section DotCompletionUtils
private def unfoldeDefinitionGuarded? (e : Expr) : MetaM (Option Expr) :=
try unfoldDefinition? e catch _ => pure none
/-- Return `true` if `e` is a `declName`-application, or can be unfolded (delta-reduced) to one. -/
private partial def isDefEqToAppOf (e : Expr) (declName : Name) : MetaM Bool := do
let isConstOf := match e.getAppFn with
@@ -340,17 +338,11 @@ section DotCompletionUtils
Given a type, try to extract relevant type names for dot notation field completion.
We extract the type name, parent struct names, and unfold the type.
The process mimics the dot notation elaboration procedure at `App.lean` -/
private partial def getDotCompletionTypeNames (type : Expr) : MetaM NameSetModPrivate :=
return (← visit type |>.run RBTree.empty).2
where
visit (type : Expr) : StateRefT NameSetModPrivate MetaM Unit := do
let .const typeName _ := type.getAppFn | return ()
modify fun s => s.insert typeName
if isStructure (← getEnv) typeName then
for parentName in (← getAllParentStructures typeName) do
modify fun s => s.insert parentName
let some type ← unfoldeDefinitionGuarded? type | return ()
visit type
private def getDotCompletionTypeNameSet (type : Expr) : MetaM NameSetModPrivate := do
let mut set := .empty
for typeName in ← getDotCompletionTypeNames type do
set := set.insert typeName
return set
end DotCompletionUtils
@@ -478,7 +470,7 @@ def dotCompletion
: CancellableM (Array CompletionItem) :=
runM params completionInfoPos ctx info.lctx do
let nameSet ← try
getDotCompletionTypeNames (← instantiateMVars (← inferType info.expr))
getDotCompletionTypeNameSet (← instantiateMVars (← inferType info.expr))
catch _ =>
pure RBTree.empty
if nameSet.isEmpty then
@@ -513,7 +505,7 @@ def dotIdCompletion
| return ()
let nameSet ← try
getDotCompletionTypeNames resultTypeFn
getDotCompletionTypeNameSet resultTypeFn
catch _ =>
pure RBTree.empty

View File

@@ -5,7 +5,7 @@ Authors: Leonardo de Moura, Marc Huisinga
-/
prelude
import Init.Prelude
import Lean.Elab.InfoTree.Types
import Lean.Meta.WHNF
namespace Lean.Server.Completion
open Elab
@@ -19,4 +19,48 @@ structure ContextualizedCompletionInfo where
ctx : ContextInfo
info : CompletionInfo
partial def minimizeGlobalIdentifierInContext (currNamespace : Name) (openDecls : List OpenDecl) (id : Name)
: Name := Id.run do
let mut minimized := shortenIn id currNamespace
for openDecl in openDecls do
let candidate? := match openDecl with
| .simple ns except =>
let candidate := shortenIn id ns
if ! except.contains candidate then
some candidate
else
none
| .explicit alias declName =>
if declName == id then
some alias
else
none
if let some candidate := candidate? then
if candidate.getNumParts < minimized.getNumParts then
minimized := candidate
return minimized
where
shortenIn (id : Name) (contextNamespace : Name) : Name :=
if contextNamespace matches .anonymous then
id
else if contextNamespace.isPrefixOf id then
id.replacePrefix contextNamespace .anonymous
else
shortenIn id contextNamespace.getPrefix
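The `shortenIn` helper above strips the longest enclosing namespace prefix from a fully qualified name, falling back to parent namespaces. A standalone sketch of the same recursion, using only `Lean.Name` operations (`shortenInSketch` is an illustrative name, not part of the server API):

```lean
import Lean
open Lean

/-- Strip the longest applicable prefix of `ctxNs` (or a parent namespace)
from `id`, mirroring the `shortenIn` recursion above. -/
partial def shortenInSketch (id ctxNs : Name) : Name :=
  if ctxNs.isAnonymous then
    id
  else if ctxNs.isPrefixOf id then
    id.replacePrefix ctxNs .anonymous
  else
    shortenInSketch id ctxNs.getPrefix

#eval shortenInSketch `Foo.Bar.baz `Foo.Bar  -- baz
#eval shortenInSketch `Foo.qux `Foo.Bar      -- qux (shortened via parent `Foo`)
```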
def unfoldeDefinitionGuarded? (e : Expr) : MetaM (Option Expr) :=
try Lean.Meta.unfoldDefinition? e catch _ => pure none
partial def getDotCompletionTypeNames (type : Expr) : MetaM (Array Name) :=
return (← visit type |>.run #[]).2
where
visit (type : Expr) : StateRefT (Array Name) MetaM Unit := do
let .const typeName _ := type.getAppFn | return ()
modify fun s => s.push typeName
if isStructure (← getEnv) typeName then
for parentName in (← getAllParentStructures typeName) do
modify fun s => s.push parentName
let some type ← unfoldeDefinitionGuarded? type | return ()
visit type
end Lean.Server.Completion

View File

@@ -29,6 +29,7 @@ import Lean.Server.FileWorker.SetupFile
import Lean.Server.Rpc.Basic
import Lean.Widget.InteractiveDiagnostic
import Lean.Server.Completion.ImportCompletion
import Lean.Server.CodeActions.UnknownIdentifier
/-!
For general server architecture, see `README.md`. For details of IPC communication, see `Watchdog.lean`.
@@ -74,27 +75,28 @@ open Widget in
structure WorkerContext where
/-- Synchronized output channel for LSP messages. Notifications for outdated versions are
discarded on read. -/
chanOut : Std.Channel JsonRpc.Message
chanOut : Std.Channel JsonRpc.Message
/--
Latest document version received by the client, used for filtering out notifications from
previous versions.
-/
maxDocVersionRef : IO.Ref Int
freshRequestIdRef : IO.Ref Int
maxDocVersionRef : IO.Ref Int
freshRequestIdRef : IO.Ref Int
/--
Diagnostics that are included in every single `textDocument/publishDiagnostics` notification.
-/
stickyDiagnosticsRef : IO.Ref (Array InteractiveDiagnostic)
partialHandlersRef : IO.Ref (RBMap String PartialHandlerInfo compare)
hLog : FS.Stream
initParams : InitializeParams
processor : Parser.InputContext → BaseIO Lean.Language.Lean.InitialSnapshot
clientHasWidgets : Bool
stickyDiagnosticsRef : IO.Ref (Array InteractiveDiagnostic)
partialHandlersRef : IO.Ref (RBMap String PartialHandlerInfo compare)
pendingServerRequestsRef : IO.Ref (Std.TreeMap RequestID (IO.Promise (ServerRequestResponse Json)))
hLog : FS.Stream
initParams : InitializeParams
processor : Parser.InputContext → BaseIO Lean.Language.Lean.InitialSnapshot
clientHasWidgets : Bool
/--
Options defined on the worker cmdline (i.e. not including options from `setup-file`), used for
context-free tasks such as editing delay.
-/
cmdlineOpts : Options
cmdlineOpts : Options
def WorkerContext.modifyGetPartialHandler (ctx : WorkerContext) (method : String)
(f : PartialHandlerInfo → α × PartialHandlerInfo) : BaseIO α :=
@@ -113,6 +115,31 @@ def WorkerContext.modifyPartialHandler (ctx : WorkerContext) (method : String)
def WorkerContext.updateRequestsInFlight (ctx : WorkerContext) (method : String) (f : Nat → Nat) : BaseIO Unit :=
ctx.modifyPartialHandler method fun h => { h with requestsInFlight := f h.requestsInFlight }
def WorkerContext.initPendingServerRequest
responseType [FromJson responseType] [Inhabited responseType]
(ctx : WorkerContext) (id : RequestID) :
BaseIO (ServerTask (ServerRequestResponse responseType)) := do
let responsePromise ← IO.Promise.new
ctx.pendingServerRequestsRef.modify (·.insert id responsePromise)
let responseTask := responsePromise.result!.asServerTask
let responseTask := responseTask.mapCheap fun
| .success response =>
match fromJson? response with
| .ok response => .success response
| .error message => .failure .invalidParams message
| .failure code message => .failure code message
return responseTask
def WorkerContext.resolveServerRequestResponse (ctx : WorkerContext) (id : RequestID)
(response : ServerRequestResponse Json) : BaseIO Unit := do
let responsePromise? ← ctx.pendingServerRequestsRef.modifyGet fun pendingServerRequests =>
let responsePromise? := pendingServerRequests.get? id
let pendingServerRequests := pendingServerRequests.erase id
(responsePromise?, pendingServerRequests)
let some responsePromise := responsePromise?
| return
responsePromise.resolve response
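The two functions above correlate outgoing server requests with their eventual responses via a table of promises keyed by request id. The underlying `IO.Promise` mechanics in isolation, using only the operations the code above already relies on (`new`, `resolve`, `result!`; the `demo` name is illustrative):

```lean
def demo : IO Nat := do
  let p ← IO.Promise.new  -- pending response slot (keyed by RequestID in the real code)
  let t := p.result!      -- Task Nat; `t.get` blocks until the promise is resolved
  p.resolve 42            -- what `resolveServerRequestResponse` does on `handleResponse`
  return t.get
```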
/-! # Asynchronous snapshot elaboration -/
section Elab
@@ -409,6 +436,7 @@ section Initialization
let maxDocVersionRef ← IO.mkRef 0
let freshRequestIdRef ← IO.mkRef (0 : Int)
let stickyDiagnosticsRef ← IO.mkRef ∅
let pendingServerRequestsRef ← IO.mkRef ∅
let chanOut ← mkLspOutputChannel maxDocVersionRef
let srcSearchPathPromise ← IO.Promise.new
let timestamp ← IO.monoMsNow
@@ -434,6 +462,7 @@ section Initialization
processor
clientHasWidgets
partialHandlersRef
pendingServerRequestsRef
maxDocVersionRef
freshRequestIdRef
cmdlineOpts := opts
@@ -495,15 +524,26 @@ section Initialization
end Initialization
section ServerRequests
def sendServerRequest [ToJson α]
def sendServerRequest
paramType [ToJson paramType] responseType [FromJson responseType] [Inhabited responseType]
(ctx : WorkerContext)
(method : String)
(param : α)
: BaseIO Unit := do
(param : paramType)
: BaseIO (ServerTask (ServerRequestResponse responseType)) := do
let freshRequestId ← ctx.freshRequestIdRef.modifyGet fun freshRequestId =>
(freshRequestId, freshRequestId + 1)
let r : JsonRpc.Request α := ⟨freshRequestId, method, param⟩
let responseTask ← ctx.initPendingServerRequest responseType freshRequestId
let r : JsonRpc.Request paramType := ⟨freshRequestId, method, param⟩
ctx.chanOut.send r
return responseTask
def sendUntypedServerRequest
(ctx : WorkerContext)
(method : String)
(param : Json)
: BaseIO (ServerTask (ServerRequestResponse Json)) := do
sendServerRequest Json Json ctx method param
end ServerRequests
section Updates
@@ -544,6 +584,7 @@ section NotificationHandling
cancelTk
hLog := ctx.hLog
initParams := ctx.initParams
serverRequestEmitter := sendUntypedServerRequest ctx
}
RequestM.runInIO (handleOnDidChange p) rc
if ¬ changes.isEmpty then
@@ -634,32 +675,6 @@ section MessageHandling
: WorkerM Unit := do
updatePendingRequests (·.insert id r)
open Widget RequestM Language in
def handleGetInteractiveDiagnosticsRequest (params : GetInteractiveDiagnosticsParams) :
WorkerM (Array InteractiveDiagnostic) := do
let ctx ← read
let st ← get
-- NOTE: always uses latest document (which is the only one we can retrieve diagnostics for);
-- any race should be temporary as the client should re-request interactive diagnostics when
-- they receive the non-interactive diagnostics for the new document
let stickyDiags ← ctx.stickyDiagnosticsRef.get
let diags ← st.doc.diagnosticsRef.get
-- NOTE: does not wait for `lineRange?` to be fully elaborated, which would be problematic with
-- fine-grained incremental reporting anyway; instead, the client is obligated to resend the
-- request when the non-interactive diagnostics of this range have changed
return (stickyDiags ++ diags).filter fun diag =>
let r := diag.fullRange
let diagStartLine := r.start.line
let diagEndLine :=
if r.end.character == 0 then
r.end.line
else
r.end.line + 1
params.lineRange?.all fun ⟨s, e⟩ =>
-- does [s,e) intersect [diagStartLine,diagEndLine)?
s ≤ diagStartLine ∧ diagStartLine < e ∨
diagStartLine ≤ s ∧ s < diagEndLine
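The final filter above tests whether the requested line range `[s, e)` intersects the diagnostic's line range `[diagStartLine, diagEndLine)`. The same half-open overlap predicate as a standalone `Bool` sketch (illustrative names):

```lean
/-- Do the half-open intervals `[s, e)` and `[a, b)` overlap?
Assumes both intervals are nonempty. -/
def linesIntersect (s e a b : Nat) : Bool :=
  (s ≤ a && a < e) || (a ≤ s && s < b)

#eval linesIntersect 0 5 3 8  -- true: they share [3, 5)
#eval linesIntersect 0 3 3 8  -- false: 3 is excluded from [0, 3)
```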
def handleImportCompletionRequest (id : RequestID) (params : CompletionParams)
: WorkerM (ServerTask (Except Error AvailableImportsCache)) := do
let ctx read
@@ -684,17 +699,9 @@ section MessageHandling
ctx.chanOut.send <| .response id (toJson completions)
pure { availableImports, lastRequestTimestampMs : AvailableImportsCache }
def handleRequest (id : RequestID) (method : String) (params : Json)
: WorkerM Unit := do
def handleStatefulPreRequestSpecialCases (id : RequestID) (method : String) (params : Json) : WorkerM Bool := do
let ctx read
let st get
ctx.modifyPartialHandler method fun h => { h with
pendingRefreshInfo? := none
requestsInFlight := h.requestsInFlight + 1
}
-- special cases
try
match method with
-- needs access to `WorkerState.rpcSessions`
@@ -702,48 +709,128 @@ section MessageHandling
let ps ← parseParams RpcConnectParams params
let resp ← handleRpcConnect ps
ctx.chanOut.send <| .response id (toJson resp)
return
| "$/lean/rpc/call" =>
let params parseParams Lsp.RpcCallParams params
-- needs access to `EditableDocumentCore.diagnosticsRef`
if params.method == `Lean.Widget.getInteractiveDiagnostics then
let some seshRef := st.rpcSessions.find? params.sessionId
| ctx.chanOut.send <| .responseError id .rpcNeedsReconnect "Outdated RPC session" none
let params ← IO.ofExcept (fromJson? params.params)
let resp ← handleGetInteractiveDiagnosticsRequest params
let resp ← seshRef.modifyGet fun st =>
rpcEncode resp st.objects |>.map (·) ({st with objects := ·})
ctx.chanOut.send <| .response id resp
return
return true
| "textDocument/completion" =>
let params ← parseParams CompletionParams params
-- must not wait on import processing snapshot
if ImportCompletion.isImportCompletionRequest st.doc.meta.text st.doc.initSnap.stx params
then
let importCachingTask ← handleImportCompletionRequest id params
set { st with importCachingTask? := some importCachingTask }
return
| _ => pure ()
-- Must not wait on import processing snapshot
if ! ImportCompletion.isImportCompletionRequest st.doc.meta.text st.doc.initSnap.stx params then
return false
let importCachingTask ← handleImportCompletionRequest id params
set { st with importCachingTask? := some importCachingTask }
return true
| _ =>
return false
catch e =>
ctx.chanOut.send <| .responseError id .internalError (toString e) none
return
return true
let cancelTk ← RequestCancellationToken.new
-- TODO: move into language-specific request handling
let rc : RequestContext :=
{ rpcSessions := st.rpcSessions
srcSearchPathTask := st.srcSearchPathTask
doc := st.doc
cancelTk
hLog := ctx.hLog
initParams := ctx.initParams }
let requestTask? ← EIO.toIO' <| handleLspRequest method params rc
let requestTask ← match requestTask? with
| Except.error e =>
emitResponse ctx (isComplete := false) <| e.toLspResponseError id
pure <| ServerTask.pure <| .ok ()
| Except.ok requestTask => ServerTask.IO.mapTaskCheap (t := requestTask) fun
open Widget RequestM Language in
def handleGetInteractiveDiagnosticsRequest
(ctx : WorkerContext)
(params : GetInteractiveDiagnosticsParams)
: RequestM (Array InteractiveDiagnostic) := do
let doc ← readDoc
-- NOTE: always uses latest document (which is the only one we can retrieve diagnostics for);
-- any race should be temporary as the client should re-request interactive diagnostics when
-- they receive the non-interactive diagnostics for the new document
let stickyDiags ← ctx.stickyDiagnosticsRef.get
let diags ← doc.diagnosticsRef.get
-- NOTE: does not wait for `lineRange?` to be fully elaborated, which would be problematic with
-- fine-grained incremental reporting anyway; instead, the client is obligated to resend the
-- request when the non-interactive diagnostics of this range have changed
return (stickyDiags ++ diags).filter fun diag =>
let r := diag.fullRange
let diagStartLine := r.start.line
let diagEndLine :=
if r.end.character == 0 then
r.end.line
else
r.end.line + 1
params.lineRange?.all fun ⟨s, e⟩ =>
-- does [s,e) intersect [diagStartLine,diagEndLine)?
s ≤ diagStartLine ∧ diagStartLine < e ∨
diagStartLine ≤ s ∧ s < diagEndLine
def handlePreRequestSpecialCases? (ctx : WorkerContext) (st : WorkerState)
(id : RequestID) (method : String) (params : Json)
: RequestM (Option (RequestTask (LspResponse Json))) := do
match method with
| "$/lean/rpc/call" =>
let params ← RequestM.parseRequestParams Lsp.RpcCallParams params
if params.method != `Lean.Widget.getInteractiveDiagnostics then
return none
let some seshRef := st.rpcSessions.find? params.sessionId
| throw RequestError.rpcNeedsReconnect
let params ← RequestM.parseRequestParams Widget.GetInteractiveDiagnosticsParams params.params
let resp ← handleGetInteractiveDiagnosticsRequest ctx params
let resp ← seshRef.modifyGet fun st =>
rpcEncode resp st.objects |>.map (·) ({st with objects := ·})
return some <| .pure { response := resp, isComplete := true }
| "codeAction/resolve" =>
let params ← RequestM.parseRequestParams CodeAction params
let some data := params.data?
| throw (RequestError.invalidParams "Expected a data field on CodeAction.")
let data ← RequestM.parseRequestParams CodeActionResolveData data
if data.providerName != importAllUnknownIdentifiersProvider then
return none
return some <| RequestM.asTask do
let fileRange := ⟨0, st.doc.meta.text.source.endPos⟩
let unknownIdentifierRanges ← waitUnknownIdentifierRanges st.doc fileRange
if unknownIdentifierRanges.isEmpty then
return { response := toJson params, isComplete := true }
let action? ← handleResolveImportAllUnknownIdentifiersCodeAction? id params unknownIdentifierRanges
let action := action?.getD params
return { response := toJson action, isComplete := true }
| _ =>
return none
def handlePostRequestSpecialCases (id : RequestID) (method : String) (params : Json)
(task : RequestTask (LspResponse Json)) : RequestM (RequestTask (LspResponse Json)) := do
let doc ← RequestM.readDoc
match method with
| "textDocument/codeAction" =>
let .ok (params : CodeActionParams) := fromJson? params
| return task
RequestM.mapRequestTaskCostly task fun r => do
let isSourceAction := params.context.only?.any fun only =>
only.contains "source" || only.contains "source.organizeImports"
if isSourceAction then
let unknownIdentifierRanges ← waitUnknownIdentifierRanges doc ⟨0, doc.meta.text.source.endPos⟩
if unknownIdentifierRanges.isEmpty then
return r
let .ok (codeActions : Array CodeAction) := fromJson? r.response
| return r
return { r with response := toJson <| codeActions.push <| importAllUnknownIdentifiersCodeAction params "source.organizeImports" }
else
let requestedRange := doc.meta.text.lspRangeToUtf8Range params.range
let unknownIdentifierRanges ← waitUnknownIdentifierRanges doc requestedRange
if unknownIdentifierRanges.isEmpty then
return r
let .ok (codeActions : Array CodeAction) := fromJson? r.response
| return r
RequestM.checkCancelled
-- Since computing the unknown identifier code actions is *really* expensive,
-- we only do it when the user has stopped typing for a second.
IO.sleep 1000
RequestM.checkCancelled
let unknownIdentifierCodeActions ← handleUnknownIdentifierCodeAction id params requestedRange unknownIdentifierRanges
return { r with response := toJson <| codeActions ++ unknownIdentifierCodeActions }
| _ =>
return task
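The `textDocument/codeAction` branch above debounces the expensive unknown-identifier computation: sleep for a second, then re-check cancellation before doing the work, so a user who is still typing never pays for it. The pattern in isolation (a sketch with hypothetical names, not the server's API):

```lean
/-- Run `expensive` only after a quiet period; if the caller was cancelled
during the wait (e.g. by a newer edit), return `fallback` instead. -/
def debounced (wasCancelled : IO Bool) (expensive : IO α) (fallback : α) : IO α := do
  IO.sleep 1000            -- wait out a typing burst (milliseconds)
  if ← wasCancelled then   -- a newer request superseded us meanwhile
    return fallback
  expensive
```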
def emitRequestResponse
(requestTask? : Except RequestError (RequestTask (LspResponse Json)))
(cancelTk : RequestCancellationToken)
(id : RequestID)
(method : String)
: WorkerM (ServerTask (Except Error Unit)) := do
let ctx ← read
match requestTask? with
| Except.error e =>
emitResponse ctx (isComplete := false) <| e.toLspResponseError id
return ServerTask.pure <| .ok ()
| Except.ok requestTask =>
ServerTask.IO.mapTaskCheap (t := requestTask) fun
| Except.ok r => do
if ← cancelTk.wasCancelledByCancelRequest then
-- Try not to emit a partial response if this request was cancelled.
@@ -754,10 +841,7 @@ section MessageHandling
emitResponse ctx (isComplete := r.isComplete) <| .response id (toJson r.response)
| Except.error e =>
emitResponse ctx (isComplete := false) <| e.toLspResponseError id
queueRequest id { cancelTk, requestTask }
where
emitResponse (ctx : WorkerContext) (m : JsonRpc.Message) (isComplete : Bool) : IO Unit := do
ctx.chanOut.send m
let timestamp ← IO.monoMsNow
@@ -770,8 +854,47 @@ section MessageHandling
some { lastRefreshTimestamp := timestamp, successiveRefreshAttempts := 0 }
}
def handleResponse (_ : RequestID) (_ : Json) : WorkerM Unit :=
return -- The only response that we currently expect here is always empty
def handleRequest (id : RequestID) (method : String) (params : Json)
: WorkerM Unit := do
let ctx ← read
let st ← get
ctx.modifyPartialHandler method fun h => { h with
pendingRefreshInfo? := none
requestsInFlight := h.requestsInFlight + 1
}
let hasHandledSpecialCase ← handleStatefulPreRequestSpecialCases id method params
if hasHandledSpecialCase then
return
let cancelTk ← RequestCancellationToken.new
-- TODO: move into language-specific request handling
let rc : RequestContext := {
rpcSessions := st.rpcSessions
srcSearchPathTask := st.srcSearchPathTask
doc := st.doc
cancelTk
hLog := ctx.hLog
initParams := ctx.initParams
serverRequestEmitter := sendUntypedServerRequest ctx
}
let requestTask? ← EIO.toIO' <| RequestM.run (rc := rc) do
if let some response ← handlePreRequestSpecialCases? ctx st id method params then
return response
let task ← handleLspRequest method params
let task ← handlePostRequestSpecialCases id method params task
return task
let requestTask ← emitRequestResponse requestTask? cancelTk id method
queueRequest id { cancelTk, requestTask }
def handleResponse (id : RequestID) (response : Json) : WorkerM Unit := do
let ctx ← read
ctx.resolveServerRequestResponse id (.success response)
def handleResponseError (id : RequestID) (code : ErrorCode) (message : String) : WorkerM Unit := do
let ctx ← read
ctx.resolveServerRequestResponse id (.failure code message)
end MessageHandling
@@ -811,9 +934,8 @@ section MainLoop
| Message.response id result =>
handleResponse id result
mainLoop
| Message.responseError .. =>
-- Ignore all errors as we currently only handle a single request with an optional response
-- where failure is not an issue.
| Message.responseError id code message _ =>
handleResponseError id code message
mainLoop
| _ => throwServerError "Got invalid JSON-RPC message"
end MainLoop
@@ -871,7 +993,7 @@ def runRefreshTasks : WorkerM (Array (ServerTask Unit)) := do
if cancelled then
return
continue
sendServerRequest ctx refreshMethod (none : Option Nat)
let _ ← sendServerRequest (Option Nat) (Option Nat) ctx refreshMethod none
return tasks
where

View File

@@ -141,8 +141,8 @@ def handleInlayHints (p : InlayHintParams) (s : InlayHintState) :
| some lastEditTimestamp =>
let timeSinceLastEditMs := timestamp - lastEditTimestamp
inlayHintEditDelayMs - timeSinceLastEditMs
let (snaps, _, isComplete) ← ctx.doc.cmdSnaps.getFinishedPrefixWithConsistentLatency editDelayMs.toUInt32 (cancelTk? := ctx.cancelTk.cancellationTask)
if ← IO.hasFinished ctx.cancelTk.cancellationTask then
let (snaps, _, isComplete) ← ctx.doc.cmdSnaps.getFinishedPrefixWithConsistentLatency editDelayMs.toUInt32 (cancelTks := ctx.cancelTk.cancellationTasks)
if ← ctx.cancelTk.wasCancelled then
-- Inlay hint request has been cancelled, either by a cancellation request or another edit.
-- In the former case, we will simply discard the result and respond with a request error
-- denoting cancellation.

Some files were not shown because too many files have changed in this diff