Compare commits


42 Commits

Author SHA1 Message Date
Kim Morrison
c157dc673c fix: use exact ref check for nightly-testing branch existence
This PR switches from `git ls-remote --heads origin nightly-testing` (which
does substring pattern matching and could match e.g.
`refs/heads/foo/nightly-testing`) to `git show-ref --verify --quiet` on the
local tracking ref after fetch, which is an exact match.
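The over-matching risk can be modeled without a repository: `git ls-remote` patterns match trailing path components, while `git show-ref --verify` requires a fully qualified ref name. A small Python sketch of the two behaviors (ref names are hypothetical):

```python
# Model of the two matching behaviors described above.
refs = [
    "refs/heads/nightly-testing",
    "refs/heads/foo/nightly-testing",  # unintended match for ls-remote
]

def ls_remote_matches(pattern: str, ref: str) -> bool:
    # ls-remote patterns match the full ref name or a trailing /-delimited suffix
    return ref == pattern or ref.endswith("/" + pattern)

def show_ref_verify(full_ref: str, ref: str) -> bool:
    # `git show-ref --verify` only accepts an exact, fully qualified ref name
    return ref == full_ref

assert [r for r in refs if ls_remote_matches("nightly-testing", r)] == refs
assert [r for r in refs if show_ref_verify("refs/heads/nightly-testing", r)] == [
    "refs/heads/nightly-testing"
]
```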

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-08 10:10:36 +10:00
Kim Morrison
f8f6e27b05 fix: skip nightly-testing merge when branch does not exist
When a release repo has no nightly-testing branch, `git merge
origin/nightly-testing` fails with "not something we can merge" and the
error handler misinterprets this as a merge conflict, then crashes trying
to commit with nothing to commit. Check for the branch with
`git ls-remote` before attempting the merge.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-08 10:04:00 +10:00
Sofia Rodrigues
bf4f51e704 fix: windows build for signal handlers (#12955)
This PR fixes the windows build with signal handlers.
2026-03-17 23:02:01 +00:00
Lean stage0 autoupdater
4ba85acc46 chore: update stage0 2026-03-17 17:55:05 +00:00
Henrik Böving
32643234b5 fix: actively ignore borrow annotations for export functions (#12952)
This PR ensures that when a function is marked `export` its borrow
annotations (if present) are always ignored.

This was the behavior in the C++ version of this file, but it was
slightly modified when porting to the old IR and consequently carried
over to LCNF incorrectly as well.
2026-03-17 17:22:56 +00:00
Wojciech Nawrocki
147ce5ab18 chore: use IO.CancelToken in server (#12948)
This PR moves `RequestCancellationToken` from `IO.Ref` to
`IO.CancelToken`.

They consist of the same data, but the constructor of `CancelToken` is
private. Hence there is no way to take the `Ref` in a
`RequestCancellationToken` and turn it into a `CancelToken`. This in
turn means that we can't set `Core.Context.cancelTk?` to be the one in
`RequestContext` when launching `CoreM` tasks in request handlers.
2026-03-17 16:48:53 +00:00
Garmelon
6b7f0ad5fc chore: check test output before exit code in piles (#12947)
This improves the feedback when tests fail. Getting a diff is more
useful than a vague exit code.
2026-03-17 16:34:21 +00:00
Markus Himmel
5f5a450eb9 feat: forall lemmas (#12945)
This PR adds a few `forall` lemmas to the `simp` set.
2026-03-17 15:00:39 +00:00
Garmelon
7c011aa522 fix: use process signal numbers from correct architecture (#12900)
This PR fixes some process signals that were incorrectly numbered.

From what I can tell, the code used signals and signal numbers for
Alpha/SPARC, not x86/ARM. The test was also broken and always green,
hiding the mistake.
2026-03-17 13:33:13 +00:00
Wojciech Różowski
6160d17e2d feat: allow @[cbv_eval] to override @[cbv_opaque] (#12944)
This PR changes the interaction between the `@[cbv_opaque]` and
`@[cbv_eval]` attributes in the `cbv` tactic. Previously, `@[cbv_opaque]`
completely blocked all reduction, including `@[cbv_eval]` rewrite rules.
Now, `@[cbv_eval]` rules can fire on `@[cbv_opaque]` constants, allowing
users to provide custom rewrite rules without exposing the full
definition. Equation theorems, unfold theorems, and kernel reduction
remain suppressed for opaque constants.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-17 13:08:21 +00:00
Sofia Rodrigues
a0048bf703 feat: introduce Headers data type for HTTP (#12127)
This PR introduces the `Headers` data type, which provides a convenient
abstraction for parsing, querying, and encoding HTTP/1.1 headers.

This contains the same code as #10478, divided into separate pieces to
facilitate easier review.

The pieces of this feature are:
- Core data structures: #12126
- Headers: #12127
- URI:  #12128
- Body: #12144
- H1: #12146
- Server: #12151
- Client:

---------

Co-authored-by: Rob23oba <152706811+Rob23oba@users.noreply.github.com>
2026-03-17 12:25:01 +00:00
Lean stage0 autoupdater
72a97a747a chore: update stage0 2026-03-17 11:48:51 +00:00
Sebastian Ullrich
1127eefdca chore: consistent build flags between USE_LAKE ON and OFF (#12941)
Fixes the stage 2 build using USE_LAKE=OFF. We should not use
`lakefile.toml.in` for any semantically relevant flags.
2026-03-17 11:02:55 +00:00
Robin Arnez
aa18927d2e fix: segfault in idbgClientLoop (#12940)
This PR fixes a segfault when running `idbgClientLoop`. `@[extern]`
expects that the function doesn't include erased arguments in the
signature; however, `@[export]` exports the function with all arguments,
including erased ones. This causes a function signature mismatch between
`idbgClientLoopImpl` and `idbgClientLoop`, causing segfaults. Rather
than solving the deeper problem that `@[extern]`/`@[export]` pairs can
cause such mismatches, this PR removes the erased arguments from
`idbgClientLoopImpl` and replaces occurrences of `α` with `NonScalar`.
2026-03-17 10:55:54 +00:00
Henrik Böving
606c149cd6 chore: fix update-stage0 with make build (#12943)
This PR fixes the stage0 build with `-DUSE_LAKE=OFF`.
2026-03-17 10:28:51 +00:00
Sebastian Ullrich
3c32607020 fix: incorrect borrow annotation on demangleBtLinCStr leading to segfault on panic (#12939) 2026-03-17 09:24:57 +00:00
Eric Wieser
6714601ee4 fix: remove accidental type monomorphism in Id.run_seqLeft (#12936)
This PR fixes `Id.run_seqLeft` and `Id.run_seqRight` to apply when the
two monad results are different.
2026-03-17 06:43:51 +00:00
damiano
6b604625f2 fix: add missing pp-spaces in grind_pattern (#11686)
This PR adds a pretty-printed space in `grind_pattern`.

[#lean4 > Some pretty printing quirks @
💬](https://leanprover.zulipchat.com/#narrow/channel/270676-lean4/topic/Some.20pretty.20printing.20quirks/near/563848793)

Co-authored-by: Kim Morrison <kim@tqft.net>
2026-03-17 04:15:02 +00:00
Kim Morrison
e96b0ff39c fix: use response files on all platforms to avoid ARG_MAX (#12540)
This PR extends Lake's use of response files (`@file`) from Windows-only
to all platforms, avoiding `ARG_MAX` limits when invoking `clang`/`ar`
with many object files.

Lake already uses response files on Windows to avoid exceeding CLI
length limits. On macOS and Linux, linking Mathlib's ~15,000 object
files into a shared library can exceed macOS's `ARG_MAX` (262,144
bytes). Both `clang` and `gcc` support `@file` response files on all
platforms, so this is safe to enable unconditionally.
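The mechanism can be sketched in Python: write the object-file paths to a temporary response file and pass a single `@file` argument instead of thousands of paths (all paths here are hypothetical; the sketch does not actually invoke a linker):

```python
# Sketch: passing many object files via an @file response file instead of
# on the command line, to stay under ARG_MAX.
import os
import tempfile

objects = [f"Mathlib/File{i}.o" for i in range(15000)]

with tempfile.NamedTemporaryFile("w", suffix=".rsp", delete=False) as rsp:
    rsp.write("\n".join(objects))

# `clang @objs.rsp ...` reads arguments from the file; the command line itself
# now carries one short @file argument instead of ~15,000 paths.
cmd = ["clang", "-shared", f"@{rsp.name}", "-o", "libMathlib.so"]
assert len(" ".join(cmd)) < 262144  # comfortably under macOS's ARG_MAX

os.unlink(rsp.name)
```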

Reported as a macOS issue at
https://leanprover.zulipchat.com/#narrow/channel/270676-lean4/topic/The.20clang.20command.20line.20with.20all.20~15.2C000.20Mathlib.20.2Ec.2Eo.2Eexport/near/574369912:
the Mathlib cache ships Linux `.so` shared libs but not macOS `.dylib`
files, so `precompileModules` on macOS triggers a full re-link that
exceeds `ARG_MAX`.

🤖 Prepared with Claude Code

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-17 04:14:37 +00:00
Kim Morrison
50ee6dff0a chore: update leantar to v0.1.19 (#12938) 2026-03-17 03:55:21 +00:00
Mac Malone
9e0aa14b6f feat: lake: fixedToolchain package configuration (#12935)
This PR adds the `fixedToolchain` Lake package configuration option.
Setting this to `true` informs Lake that the package is only expected to
function on a single toolchain (like Mathlib). This causes Lake's
toolchain update procedure to prioritize its toolchain and avoids the
need to separate input-to-output mappings for the package by toolchain
version in the Lake cache.
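If the option follows Lake's usual TOML configuration surface, enabling it for a single-toolchain package might look like the following sketch (the key placement is an assumption, not taken from the PR):

```toml
# lakefile.toml — hypothetical sketch using the `fixedToolchain` option above
name = "mathlib"
fixedToolchain = true  # package is only expected to work on a single toolchain
```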
2026-03-17 02:37:55 +00:00
Garmelon
5c685465bd chore: handle absence of meld in fix_expected.py (#12934) 2026-03-16 19:07:44 +00:00
Garmelon
ef87f6b9ac chore: delete temp files before, not after tests (#12932) 2026-03-16 19:02:28 +00:00
Garmelon
49715fe63c chore: improve how test suite interacts with stages (#12913)
The tests need to run with certain environment variables set that only
CMake really knows and that differ between stages. CMake could just set
the variables directly when running the tests and benchmarks, but that
would leave no good way to manually run a single benchmark. So CMake
generates some stage-specific scripts instead that set the required
environment variables.

Previously, those scripts were sourced directly by the individual
`run_*` scripts, so the env scripts of different stages would overwrite
each other. This PR changes the setup so they can instead be generated
next to each other. This also simplifies the `run_*` scripts themselves
a bit, and makes `tests/bench/build` less of a hack.
2026-03-16 15:20:03 +00:00
Lean stage0 autoupdater
133fd016b4 chore: update stage0 2026-03-16 13:15:14 +00:00
Bhavik Mehta
76e593a52d fix: rename Int.sq_nonnneg to Int.sq_nonneg (#12909)
This PR fixes the typo in `Int.sq_nonnneg`.

Closes #12906.

---------

Co-authored-by: Claude <noreply@anthropic.com>
2026-03-16 10:52:57 +00:00
Jesse Alama
fa9a32b5c8 fix: correct swapped operands in Std.Time subtraction instances (#12919)
This PR fixes the `HSub PlainTime Duration` instance, which had its
operands reversed: it computed `duration - time` instead of `time -
duration`. For example, subtracting 2 minutes from `time("13:02:01")`
would give `time("10:57:59")` rather than the expected
`time("13:00:01")`. We also noticed that `HSub PlainDateTime
Millisecond.Offset` is similarly affected.
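As an analogy in Python's `datetime` module (not Std.Time), operand order is exactly what distinguishes `time - duration` from the reversed computation:

```python
from datetime import datetime, timedelta

t = datetime(2026, 3, 16, 13, 2, 1)
two_min = timedelta(minutes=2)

# Correct operand order: time - duration
assert t - two_min == datetime(2026, 3, 16, 13, 0, 1)

# The reversed order is rejected outright here, whereas the buggy Std.Time
# instance silently computed the reversed result.
try:
    two_min - t
    reversed_ok = True
except TypeError:
    reversed_ok = False
assert reversed_ok is False
```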

Closes #12918
2026-03-16 10:52:06 +00:00
Henrik Böving
2d999d7622 refactor: ignore borrow annotations at export/extern tricks (#12930)
This PR places `set_option compiler.ignoreBorrowAnnotation true in` on
all `export`/`extern` pairs. This is necessary because `export` forces
all arguments to be passed as owned, while `extern` respects borrow
annotations. The current approach to the `export`/`extern` trick was
always broken but never surfaced. However, with upcoming changes many
`export`/`extern` pairs are going to be affected by borrow annotations
and would have broken without this.
Sebastian Ullrich
ddd5c213c6 chore: CLAUDE.md: stage 2 build instructions (#12929) 2026-03-16 09:47:14 +00:00
Kim Morrison
c9ceba1784 fix: use null-safe while-read loop for subverso manifest sync (#12928)
This PR replaces `find -print0 | xargs -0 -I{} sh -c '...'` with
`find -print0 | while IFS= read -r -d '' f; do ... done` for the
subverso sub-manifest sync in release_steps.py. The original xargs
invocation had fragile nested shell quoting; the while-read loop is
both null-delimiter safe and more readable.
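Both pipelines rely on the same property, modeled below in Python: filenames are separated by NUL bytes, so spaces and newlines inside names survive intact. (The win of the while-read loop is avoiding the fragile nested quoting, which this sketch does not capture.)

```python
# Model of splitting `find -print0` output on NUL bytes.
raw = b"test-projects/a/lake-manifest.json\x00test-projects/b c/lake-manifest.json\x00"
files = [chunk.decode() for chunk in raw.split(b"\x00") if chunk]
assert files == [
    "test-projects/a/lake-manifest.json",
    "test-projects/b c/lake-manifest.json",  # space in the name survives
]
```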

🤖 Prepared with Claude Code

Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-16 08:17:32 +00:00
Mac Malone
57df23f27e feat: lake: cached compressed module artifacts (#12914)
This PR adds packing and unpacking of module artifacts into `.ltar`
archives using `leantar`.
2026-03-16 04:36:19 +00:00
Mac Malone
ea8fca2d9f refactor: lake: download arts by default in cache get (#12927)
This PR changes `lake cache get` to download artifacts by default. The
new `--mappings-only` option restricts the download to mappings, leaving
artifacts to be fetched on demand (`--download-arts` is now obsolete).

In the future, the plan is to have Lake download mappings when cloning
dependencies. Then, `lake cache get` will primarily be used to download
artifacts eagerly. Thus, it makes sense to have that as the default.
2026-03-16 02:29:44 +00:00
Paul Reichert
274997420a refactor: remove backward compatibility options from iterator/slice/range modules (#12925)
This PR removes `respectTransparency`, `reducibleClassField` and `simp
+instances` usages in the iterator/slice/range modules.
2026-03-15 14:03:51 +00:00
Wojciech Różowski
6631352136 fix: remove accidentally added code from Sym.Simp.Pattern (#12926)
This PR removes unused functions (`mkPatternCoreFromLambda`,
`mkPatternFromLambda`, `mkSimprocPatternFromExpr`) and the `import
Lean.Meta.AbstractMVars` that were added to `Lean.Meta.Sym.Pattern`
after merging #12597.
2026-03-15 10:30:26 +00:00
Leonardo de Moura
cfa8c5a036 fix: handle universe level commutativity in sym pattern matching (#12923)
This PR fixes a bug where `max u v` and `max v u` fail to match in
SymM's pattern matching. Both `processLevel` (Phase 1) and
`isLevelDefEqS` (Phase 2) treated `max` positionally, so `max u v ≠ max
v u` structurally even though they are semantically equal.

The fix has three parts:
- Eagerly normalize universe levels in patterns at creation time
(`preprocessDeclPattern`, `preprocessExprPattern`,
`mkSimprocPatternFromExpr`)
- Normalize the target level in `processLevel` before matching, using a
`where go` refactor
- Add `tryApproxMaxMax` to `processLevel` and `isLevelDefEqS`: when
positional `max/max` matching would fail, check if one argument from
each side matches structurally and match the remaining pair

Also moves `normalizeLevels` from `Grind.Util` to `Sym.Util` to avoid
code duplication, since both Sym and Grind need it.

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-15 01:06:16 +00:00
Leonardo de Moura
7120d9aef5 fix: eta-reduce expressions in sym discrimination tree lookup (#12920)
This PR adds eta reduction to the sym discrimination tree lookup
functions (`getMatch`, `getMatchWithExtra`, `getMatchLoop`). Without
this, expressions like `StateM Nat` that unfold to eta-expanded forms
`(fun α => StateT Nat Id α)` fail to match discrimination tree entries
for the eta-reduced form `(StateT Nat Id)`.

Also optimizes `etaReduce` with an early exit for non-lambda expressions
and removes a redundant `n == 0` check.
Includes a test verifying that `P (StateM Nat)` matches a disc tree
entry for `P (StateT Nat Id)`.
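Eta reduction itself is simple to state: `fun x => f x` reduces to `f` whenever `x` does not occur free in `f`. A toy sketch over a hypothetical tuple-based lambda AST (not Lean's `Expr`):

```python
# Toy eta reduction: terms are variable strings, ("lam", var, body), or
# ("app", fn, arg). Representation and names are illustrative only.
def free_in(v, e):
    if isinstance(e, str):
        return e == v
    return any(free_in(v, sub) for sub in e[1:])

def eta_reduce(e):
    # Early exit for non-lambda expressions, as in the optimization above.
    if not (isinstance(e, tuple) and e[0] == "lam"):
        return e
    _, v, body = e
    # (lam x (app f x)) reduces to f when x is not free in f
    if isinstance(body, tuple) and body[0] == "app" and body[2] == v \
            and not free_in(v, body[1]):
        return eta_reduce(body[1])
    return e

assert eta_reduce(("lam", "a", ("app", "f", "a"))) == "f"
assert eta_reduce(("lam", "a", ("app", "a", "a"))) == ("lam", "a", ("app", "a", "a"))
```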

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-14 16:57:10 +00:00
Joachim Breitner
c2d4079193 perf: optimize string literal equality simprocs for kernel efficiency (#12887)
This PR optimizes the `String.reduceEq`, `String.reduceNe`, and
`Sym.Simp` string equality simprocs to produce kernel-efficient proofs.
Previously, these used `String.decEq` which forced the kernel to run
UTF-8 encoding/decoding and byte array comparison, causing 86+ kernel
unfoldings on short strings.

The new approach reduces string inequality to `List Char` via
`String.ofList_injective`, then uses two strategies depending on the
difference:

- **Different characters at position `i`**: Projects to `Nat` via
`congrArg (fun l => (List.get!Internal l i).toNat)`, then uses
`Nat.ne_of_beq_eq_false rfl`. This avoids `Decidable` instances entirely
— the kernel only evaluates `Nat.beq` on two concrete natural numbers.

- **One string is a prefix of the other**: Uses `congrArg (List.drop n
·)` with `List.cons_ne_nil`, which is a definitional proof requiring no
`decide` step at all.

For equal strings, `eq_true rfl` avoids kernel evaluation entirely.

The shared proof construction is in `Lean.Meta.mkStringLitNeProof`
(`Lean/Meta/StringLitProof.lean`), used by both the standard simprocs
and the `Sym.Simp` ground evaluator.

Kernel max unfolds for `"hello" ≠ "foo"`: 86+ → 6.

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-14 10:30:31 +00:00
Wojciech Nawrocki
47b3be0524 feat: update RPC wire format (#12905)
This PR adjusts the JSON encoding of RPC references from `{"p": "n"}` to
`{"__rpcref": "n"}`. Existing clients will continue to work unchanged,
but should eventually move to the new format by advertising the
`rpcWireFormat` client capability.

- This came up in leanprover/vscode-lean4#712.
- The new encoding is far less likely to clash with real-world names,
and is now documented as a "reserved internal name".
- At 8 bytes vs. 1 byte, it incurs a ~5% increase in the JSON size of
interactive terms, e.g. from 868KiB to 903KiB on the
leanprover/vscode-lean4#500 test.
- Make `deriving RpcEncodable` throw an error when it encounters the
reserved name. We cannot easily guard against clashes in user-provided
JSON, however, so we just assume it does not clash.
- Add a notion of *RPC wire format* with corresponding `rpcWireFormat`
client and server capabilities. The format before this PR is now called
`v0`, whereas here we implement `v1`. Existing clients should eventually
implement compatibility with `v1` (because doing so fixes the above
bug), but will continue to work in the meantime. The format may be
revised again in the future (but we don't expect to revise it so often
that semver would be useful).
- Document everything.
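The per-reference overhead is easy to check concretely; a small sketch of the two encodings (the reference id `"1"` is illustrative):

```python
import json

# v0 encodes an RPC reference as {"p": "n"}; v1 uses the reserved key "__rpcref".
ref_v0 = json.dumps({"p": "1"})
ref_v1 = json.dumps({"__rpcref": "1"})

# The key grows from 1 byte to 8 bytes: 7 extra bytes per reference.
assert len(ref_v1) - len(ref_v0) == 7
```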


## Alternative designs (abandoned for now)

- Option 1. Add a method `$/lean/rpc/metadata` which, given the name of
an RPC method `foo`, returns metadata containing a description of where
the RPC refs in any return value of `foo` would be (essentially a
description of the structure of the return type).
- Option 2. Wrap every response to `$/lean/rpc/call` in such metadata.
This would be a different change to the wire format.
- To implement this in an extensible way, we extend `RpcEncodable` by a
`refPaths` field. But how does `refPaths` describe where the refs are?
- Option A. Emit the code of a JS method that extracts the refs. This is
maybe simplest, but it would leave non-JS clients (e.g. `lean.nvim`)
behind.
- Option B. Give the description in some query language. The query
language must be able to describe paths into arbitrary inductive types.
- The most popular option,
[JSONPath](https://www.rfc-editor.org/rfc/rfc9535), seemingly cannot
describe non-uniform paths (e.g. both the `a`s in `{a: 1, b: {a: 2}}`).
- [JMESPath](https://jmespath.org/) can describe non-uniform paths, and
has 'fully compliant' implementations in many languages, but doesn't
seem to handle recursive paths.
- The most expressive option is [jq](https://github.com/jqlang/jq), but
the most popular way to run it is via an Emscripten WASM blob in
[jq-web](https://github.com/fiatjaf/jq-web) which seems heavy. There is
[jqjs](https://github.com/mwh/jqjs) as well; I'm not sure how
production-ready that is.
2026-03-13 23:46:16 +00:00
Wojciech Różowski
de2b177423 fix: make cbv_opaque take precedence over cbv_eval (#12908)
This PR makes `@[cbv_opaque]` unconditionally block all evaluation of a
constant by `cbv`, including `@[cbv_eval]` rewrite rules. Previously,
`@[cbv_eval]` could bypass `@[cbv_opaque]`, and for bare constants (not
applications), `isOpaqueConst` could fall through to `handleConst`,
which would unfold the definition body.

The intended usage pattern is now: mark subterm-producing functions
(like `DHashMap.insert`) as `@[cbv_opaque]` to prevent unfolding, and
provide `@[cbv_eval]` theorems on the *consuming* function (like
`DHashMap.contains`) which pattern-matches against the opaque subterms.
2026-03-13 14:52:33 +00:00
Wojciech Różowski
a32173e6f6 feat: add tracing to cbv (#12896)
This PR adds basic tracing infrastructure to the `cbv` tactic.
2026-03-13 12:05:49 +00:00
Sebastian Graf
e6d9220eee test: add dite and match splitting to sym-based MVCGen (#12903)
This PR generalizes the sym MVCGen's match splitting from `ite`-only to
`ite`, `dite`, and arbitrary matchers. Previously, only `ite` was
supported; `dite` and match expressions were rejected with an error.

`mkBackwardRuleForSplit` uses `SplitInfo.splitWith` to build the
splitting proof. Hypothesis types are discovered via `rwIfOrMatcher`
inside the splitter telescope, and `TransformAltFVars.all` provides the
proper fvars for `mkForallFVars`. Subgoal type metavariables use
`mkFreshExprSyntheticOpaqueMVar` so that `rwIfOrMatcher`'s internal
`assumption` tactic cannot assign them.

Adds `DiteSplit`, `MatchSplit`, and `MatchSplitState` test cases and a
`vcgen_match_split` benchmark.

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-12 22:39:43 +00:00
Sebastian Graf
aae827cb4c refactor: replace flat Array Expr with TransformAltFVars in MatcherApp.transform (#12902)
This PR introduces a `TransformAltFVars` structure to replace the flat
`Array Expr`
parameter in the `onAlt` callback of `MatcherApp.transform`. The new
structure gives
callers structured access to the different kinds of fvars introduced in
matcher
alternative telescopes: constructor fields, overlap parameters,
discriminant equations,
and extra equations from `addEqualities`.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-12 21:48:08 +00:00
353 changed files with 4530 additions and 1315 deletions

View File

@@ -20,9 +20,24 @@ CTEST_PARALLEL_LEVEL="$(nproc)" CTEST_OUTPUT_ON_FAILURE=1 \
make -C build/release -j "$(nproc)" test ARGS='--rerun-failed'
# Single test from tests/foo/bar/ (quick check during development)
cd tests/foo/bar && ./run_test example_test.lean
CTEST_PARALLEL_LEVEL="$(nproc)" CTEST_OUTPUT_ON_FAILURE=1 \
make -C build/release -j "$(nproc)" test ARGS='-R testname'
```
## Testing stage 2
When requested to test stage 2, build it as follows:
```
make -C build/release stage2 -j$(nproc)
```
Stage 2 is *not* automatically invalidated by changes to `src/`, which allows for faster iteration
when fixing a specific file in the stage 2 build. However, to invalidate any files that already
passed the stage 2 build, as well as for final validation,
```
make -C build/release/stage2 clean-stdlib
```
must be run manually before building.
## New features
When asked to implement new features:

View File

@@ -79,7 +79,7 @@ if(NOT CMAKE_SYSTEM_NAME MATCHES "Emscripten")
endif()
find_program(LEANTAR leantar)
if(NOT LEANTAR)
set(LEANTAR_VERSION v0.1.18)
set(LEANTAR_VERSION v0.1.19)
if(CMAKE_SYSTEM_NAME MATCHES "Windows")
set(LEANTAR_ARCHIVE_SUFFIX .zip)
set(LEANTAR_TARGET x86_64-pc-windows-msvc)

View File

@@ -1,6 +1,3 @@
#!/usr/bin/env bash
source ../../../tests/env_test.sh
leanmake --always-make bin
capture ./build/bin/test hello world

doc/examples/run_test → doc/examples/run_test.sh Executable file → Normal file
View File

@@ -1,7 +1,4 @@
#!/usr/bin/env bash
source ../../tests/env_test.sh
capture_only "$1" \
lean -Dlinter.all=false "$1"
check_exit_is_success
check_out_file
check_exit_is_success

View File

@@ -492,8 +492,9 @@ def execute_release_steps(repo, version, config):
'ROOT_REV=$(jq -r \'.packages[] | select(.name == "subverso") | .rev\' lake-manifest.json); '
'SUBVERSO_URL=$(jq -r \'.packages[] | select(.name == "subverso") | .url\' lake-manifest.json); '
'DEMOD_REV=$(git ls-remote "$SUBVERSO_URL" "refs/tags/no-modules/$ROOT_REV" | awk \'{print $1}\'); '
'find test-projects -name lake-manifest.json -print0 | '
'xargs -0 -I{} sh -c \'jq --arg rev "$DEMOD_REV" \'.packages |= map(if .name == "subverso" then .rev = $rev else . end)\' "{}" > /tmp/lm_tmp.json && mv /tmp/lm_tmp.json "{}"\''
'find test-projects -name lake-manifest.json -print0 | while IFS= read -r -d \'\' f; do '
'jq --arg rev "$DEMOD_REV" \'.packages |= map(if .name == "subverso" then .rev = $rev else . end)\' "$f" > /tmp/lm_tmp.json && mv /tmp/lm_tmp.json "$f"; '
'done'
)
run_command(sync_script, cwd=repo_path)
print(green("Synced de-modulized subverso rev to all test-project sub-manifests"))
@@ -658,56 +659,61 @@ def execute_release_steps(repo, version, config):
# Fetch latest changes to ensure we have the most up-to-date nightly-testing branch
print(blue("Fetching latest changes from origin..."))
run_command("git fetch origin", cwd=repo_path)
try:
print(blue("Merging origin/nightly-testing..."))
run_command("git merge origin/nightly-testing", cwd=repo_path)
print(green("Merge completed successfully"))
except subprocess.CalledProcessError:
# Merge failed due to conflicts - check which files are conflicted
print(blue("Merge conflicts detected, checking which files are affected..."))
# Get conflicted files using git status
status_result = run_command("git status --porcelain", cwd=repo_path)
conflicted_files = []
for line in status_result.stdout.splitlines():
if len(line) >= 2 and line[:2] in ['UU', 'AA', 'DD', 'AU', 'UA', 'DU', 'UD']:
# Extract filename (skip the first 3 characters which are status codes)
conflicted_files.append(line[3:])
# Filter out allowed files
allowed_patterns = ['lean-toolchain', 'lake-manifest.json']
problematic_files = []
for file in conflicted_files:
is_allowed = any(pattern in file for pattern in allowed_patterns)
if not is_allowed:
problematic_files.append(file)
if problematic_files:
# There are conflicts in non-allowed files - fail
print(red("❌ Merge failed!"))
print(red(f"Merging nightly-testing resulted in conflicts in:"))
for file in problematic_files:
print(red(f" - {file}"))
print(red("Please resolve these conflicts manually."))
return
else:
# Only allowed files are conflicted - resolve them automatically
print(green(f"✅ Only allowed files conflicted: {', '.join(conflicted_files)}"))
print(blue("Resolving conflicts automatically..."))
# For lean-toolchain and lake-manifest.json, keep our versions
# Check if nightly-testing branch exists on origin (use local ref after fetch for exact match)
nightly_check = run_command("git show-ref --verify --quiet refs/remotes/origin/nightly-testing", cwd=repo_path, check=False)
if nightly_check.returncode != 0:
print(yellow("No nightly-testing branch found on origin, skipping merge"))
else:
try:
print(blue("Merging origin/nightly-testing..."))
run_command("git merge origin/nightly-testing", cwd=repo_path)
print(green("Merge completed successfully"))
except subprocess.CalledProcessError:
# Merge failed due to conflicts - check which files are conflicted
print(blue("Merge conflicts detected, checking which files are affected..."))
# Get conflicted files using git status
status_result = run_command("git status --porcelain", cwd=repo_path)
conflicted_files = []
for line in status_result.stdout.splitlines():
if len(line) >= 2 and line[:2] in ['UU', 'AA', 'DD', 'AU', 'UA', 'DU', 'UD']:
# Extract filename (skip the first 3 characters which are status codes)
conflicted_files.append(line[3:])
# Filter out allowed files
allowed_patterns = ['lean-toolchain', 'lake-manifest.json']
problematic_files = []
for file in conflicted_files:
print(blue(f"Keeping our version of {file}"))
run_command(f"git checkout --ours {file}", cwd=repo_path)
# Complete the merge
run_command("git add .", cwd=repo_path)
run_command("git commit --no-edit", cwd=repo_path)
print(green("Merge completed successfully with automatic conflict resolution"))
is_allowed = any(pattern in file for pattern in allowed_patterns)
if not is_allowed:
problematic_files.append(file)
if problematic_files:
# There are conflicts in non-allowed files - fail
print(red("❌ Merge failed!"))
print(red(f"Merging nightly-testing resulted in conflicts in:"))
for file in problematic_files:
print(red(f" - {file}"))
print(red("Please resolve these conflicts manually."))
return
else:
# Only allowed files are conflicted - resolve them automatically
print(green(f"✅ Only allowed files conflicted: {', '.join(conflicted_files)}"))
print(blue("Resolving conflicts automatically..."))
# For lean-toolchain and lake-manifest.json, keep our versions
for file in conflicted_files:
print(blue(f"Keeping our version of {file}"))
run_command(f"git checkout --ours {file}", cwd=repo_path)
# Complete the merge
run_command("git add .", cwd=repo_path)
run_command("git commit --no-edit", cwd=repo_path)
print(green("✅ Merge completed successfully with automatic conflict resolution"))
# Build and test (skip for Mathlib)
if repo_name not in ["mathlib4"]:

View File

@@ -118,6 +118,9 @@ option(USE_LAKE_CACHE "Use the Lake artifact cache for stage 1 builds (requires
set(LEAN_EXTRA_MAKE_OPTS "" CACHE STRING "extra options to lean --make")
set(LEANC_CC ${CMAKE_C_COMPILER} CACHE STRING "C compiler to use in `leanc`")
# Temporary, core-only flags. Must be synced with stdlib_flags.h.
string(APPEND LEAN_EXTRA_MAKE_OPTS " -Dbackward.do.legacy=false")
if(LAZY_RC MATCHES "ON")
set(LEAN_LAZY_RC "#define LEAN_LAZY_RC")
endif()

View File

@@ -254,8 +254,8 @@ instance : LawfulMonad Id := by
@[simp, grind =] theorem run_bind (x : Id α) (f : α → Id β) : (x >>= f).run = (f x.run).run := rfl
@[simp, grind =] theorem run_pure (a : α) : (pure a : Id α).run = a := rfl
@[simp, grind =] theorem pure_run (a : Id α) : pure a.run = a := rfl
@[simp] theorem run_seqRight (x y : Id α) : (x *> y).run = y.run := rfl
@[simp] theorem run_seqLeft (x y : Id α) : (x <* y).run = x.run := rfl
@[simp] theorem run_seqRight (x : Id α) (y : Id β) : (x *> y).run = y.run := rfl
@[simp] theorem run_seqLeft (x : Id α) (y : Id β) : (x <* y).run = x.run := rfl
@[simp] theorem run_seq (f : Id (α → β)) (x : Id α) : (f <*> x).run = f.run x.run := rfl
end Id

View File

@@ -98,7 +98,7 @@ well-founded recursion mechanism to prove that the function terminates.
@[simp] theorem pmap_push {P : α → Prop} (f : ∀ a, P a → β) (a : α) (xs : Array α) (h : ∀ b ∈ xs.push a, P b) :
pmap f (xs.push a) h =
(pmap f xs (fun a m => by simp at h; exact h a (.inl m))).push (f a (h a (by simp))) := by
(pmap f xs (fun a m => by simp at h; exact h.1 _ m)).push (f a (h a (by simp))) := by
simp [pmap]
@[simp] theorem attach_empty : (#[] : Array α).attach = #[] := rfl
@@ -153,7 +153,7 @@ theorem attachWith_congr {xs ys : Array α} (w : xs = ys) {P : α → Prop} {H :
@[simp] theorem attachWith_push {a : α} {xs : Array α} {P : α → Prop} {H : ∀ x ∈ xs.push a, P x} :
(xs.push a).attachWith P H =
(xs.attachWith P (fun x h => by simp at H; exact H x (.inl h))).push ⟨a, H a (by simp)⟩ := by
(xs.attachWith P (fun x h => by simp at H; exact H.1 _ h)).push ⟨a, H a (by simp)⟩ := by
cases xs
simp

View File

@@ -118,16 +118,19 @@ theorem toNat_pow_of_nonneg {x : Int} (h : 0 ≤ x) (k : Nat) : (x ^ k).toNat =
| succ k ih =>
rw [Int.pow_succ, Int.toNat_mul (Int.pow_nonneg h) h, ih, Nat.pow_succ]
protected theorem sq_nonnneg (m : Int) : 0 ≤ m ^ 2 := by
protected theorem sq_nonneg (m : Int) : 0 ≤ m ^ 2 := by
rw [Int.pow_succ, Int.pow_one]
cases m
· apply Int.mul_nonneg <;> simp
· apply Int.mul_nonneg_of_nonpos_of_nonpos <;> exact negSucc_le_zero _
@[deprecated Int.sq_nonneg (since := "2026-03-13")]
protected theorem sq_nonnneg (m : Int) : 0 ≤ m ^ 2 := Int.sq_nonneg m
protected theorem pow_nonneg_of_even {m : Int} {n : Nat} (h : n % 2 = 0) : 0 ≤ m ^ n := by
rw [← Nat.mod_add_div n 2, h, Nat.zero_add, Int.pow_mul]
apply Int.pow_nonneg
exact Int.sq_nonnneg m
exact Int.sq_nonneg m
protected theorem neg_pow {m : Int} {n : Nat} : (-m)^n = (-1)^(n % 2) * m^n := by
rw [Int.neg_eq_neg_one_mul, Int.mul_pow]

View File

@@ -435,8 +435,9 @@ theorem Iter.forIn_filterMapWithPostcondition
match (f out).run with
| some c => g c acc
| none => return .yield acc) := by
simp +instances [Iter.forIn_eq_forIn_toIterM, filterMapWithPostcondition, IterM.forIn_filterMapWithPostcondition,
instMonadLiftTOfMonadLift_instMonadLiftTOfPure]; rfl
simp only [filterMapWithPostcondition, IterM.forIn_filterMapWithPostcondition, forIn_eq_forIn_toIterM]
rw [instMonadLiftTOfMonadLift_instMonadLiftTOfPure]
rfl -- expressions are equal up to different matchers
theorem Iter.forIn_filterMapM
[Monad n] [LawfulMonad n] [Monad o] [LawfulMonad o]
@@ -448,8 +449,9 @@ theorem Iter.forIn_filterMapM
match f out with
| some c => g c acc
| none => return .yield acc) := by
simp +instances [filterMapM, forIn_eq_forIn_toIterM, IterM.forIn_filterMapM,
instMonadLiftTOfMonadLift_instMonadLiftTOfPure]; rfl
simp [filterMapM, forIn_eq_forIn_toIterM, IterM.forIn_filterMapM]
rw [instMonadLiftTOfMonadLift_instMonadLiftTOfPure]
rfl
theorem Iter.forIn_filterMap
[Monad n] [LawfulMonad n] [Finite α Id]
@@ -469,8 +471,8 @@ theorem Iter.forIn_mapWithPostcondition
{g : β₂ → γ → o (ForInStep γ)} :
forIn (it.mapWithPostcondition f) init g =
forIn it init (fun out acc => do g (← (f out).run) acc) := by
simp +instances [mapWithPostcondition, forIn_eq_forIn_toIterM, IterM.forIn_mapWithPostcondition,
instMonadLiftTOfMonadLift_instMonadLiftTOfPure]
simp only [mapWithPostcondition, forIn_eq_forIn_toIterM, IterM.forIn_mapWithPostcondition]
rw [instMonadLiftTOfMonadLift_instMonadLiftTOfPure]
theorem Iter.forIn_mapM
[Monad n] [LawfulMonad n] [Monad o] [LawfulMonad o]
@@ -498,8 +500,8 @@ theorem Iter.forIn_filterWithPostcondition
haveI : MonadLift n o := monadLift
forIn (it.filterWithPostcondition f) init g =
forIn it init (fun out acc => do if (← (f out).run).down then g out acc else return .yield acc) := by
simp +instances [filterWithPostcondition, forIn_eq_forIn_toIterM, IterM.forIn_filterWithPostcondition,
instMonadLiftTOfMonadLift_instMonadLiftTOfPure]
simp only [filterWithPostcondition, forIn_eq_forIn_toIterM, IterM.forIn_filterWithPostcondition]
rw [instMonadLiftTOfMonadLift_instMonadLiftTOfPure]
theorem Iter.forIn_filterM
[Monad n] [LawfulMonad n] [Monad o] [LawfulMonad o]
@@ -508,8 +510,8 @@ theorem Iter.forIn_filterM
[IteratorLoop α Id o] [LawfulIteratorLoop α Id o]
{it : Iter (α := α) β} {f : β → n (ULift Bool)} {init : γ} {g : β → γ → o (ForInStep γ)} :
forIn (it.filterM f) init g = forIn it init (fun out acc => do if (← f out).down then g out acc else return .yield acc) := by
simp +instances [filterM, forIn_eq_forIn_toIterM, IterM.forIn_filterM,
instMonadLiftTOfMonadLift_instMonadLiftTOfPure]
simp only [filterM, forIn_eq_forIn_toIterM, IterM.forIn_filterM]
rw [instMonadLiftTOfMonadLift_instMonadLiftTOfPure]
theorem Iter.forIn_filter
[Monad n] [LawfulMonad n]
@@ -550,8 +552,9 @@ theorem Iter.foldM_filterMapM {α β γ δ : Type w}
it.foldM (init := init) (fun d b => do
let some c ← f b | pure d
g d c) := by
simp +instances [filterMapM, IterM.foldM_filterMapM, foldM_eq_foldM_toIterM,
instMonadLiftTOfMonadLift_instMonadLiftTOfPure]; rfl
simp only [filterMapM, IterM.foldM_filterMapM, foldM_eq_foldM_toIterM]
rw [instMonadLiftTOfMonadLift_instMonadLiftTOfPure]
rfl
theorem Iter.foldM_mapWithPostcondition {α β γ δ : Type w}
{n : Type w → Type w''} {o : Type w → Type w'''}
@@ -563,8 +566,8 @@ theorem Iter.foldM_mapWithPostcondition {α β γ δ : Type w}
{f : β → PostconditionT n γ} {g : δ → γ → o δ} {init : δ} {it : Iter (α := α) β} :
(it.mapWithPostcondition f).foldM (init := init) g =
it.foldM (init := init) (fun d b => do let c ← (f b).run; g d c) := by
simp +instances [mapWithPostcondition, IterM.foldM_mapWithPostcondition, foldM_eq_foldM_toIterM,
instMonadLiftTOfMonadLift_instMonadLiftTOfPure]
simp only [mapWithPostcondition, IterM.foldM_mapWithPostcondition, foldM_eq_foldM_toIterM]
rw [instMonadLiftTOfMonadLift_instMonadLiftTOfPure]
theorem Iter.foldM_mapM {α β γ δ : Type w}
{n : Type w → Type w''} {o : Type w → Type w'''}
@@ -578,8 +581,8 @@ theorem Iter.foldM_mapM {α β γ δ : Type w}
haveI : MonadLift n o := MonadLiftT.monadLift
(it.mapM f).foldM (init := init) g =
it.foldM (init := init) (fun d b => do let c ← f b; g d c) := by
simp +instances [mapM, IterM.foldM_mapM, foldM_eq_foldM_toIterM,
instMonadLiftTOfMonadLift_instMonadLiftTOfPure]
simp only [mapM, IterM.foldM_mapM, foldM_eq_foldM_toIterM]
rw [instMonadLiftTOfMonadLift_instMonadLiftTOfPure]
theorem Iter.foldM_filterWithPostcondition {α β δ : Type w}
{n : Type w → Type w''} {o : Type w → Type w'''}
@@ -591,8 +594,8 @@ theorem Iter.foldM_filterWithPostcondition {α β δ : Type w}
{f : β → PostconditionT n (ULift Bool)} {g : δ → β → o δ} {init : δ} {it : Iter (α := α) β} :
(it.filterWithPostcondition f).foldM (init := init) g =
it.foldM (init := init) (fun d b => do if (← (f b).run).down then g d b else pure d) := by
simp +instances [filterWithPostcondition, IterM.foldM_filterWithPostcondition, foldM_eq_foldM_toIterM,
instMonadLiftTOfMonadLift_instMonadLiftTOfPure]
simp only [filterWithPostcondition, IterM.foldM_filterWithPostcondition, foldM_eq_foldM_toIterM]
rw [instMonadLiftTOfMonadLift_instMonadLiftTOfPure]
theorem Iter.foldM_filterM {α β δ : Type w}
{n : Type w → Type w''} {o : Type w → Type w'''}
@@ -605,8 +608,8 @@ theorem Iter.foldM_filterM {α β δ : Type w}
{f : β → n (ULift Bool)} {g : δ → β → o δ} {init : δ} {it : Iter (α := α) β} :
(it.filterM f).foldM (init := init) g =
it.foldM (init := init) (fun d b => do if (← f b).down then g d b else pure d) := by
simp +instances [filterM, IterM.foldM_filterM, foldM_eq_foldM_toIterM,
instMonadLiftTOfMonadLift_instMonadLiftTOfPure]
simp only [filterM, IterM.foldM_filterM, foldM_eq_foldM_toIterM]
rw [instMonadLiftTOfMonadLift_instMonadLiftTOfPure]
theorem Iter.foldM_filterMap {α β γ δ : Type w} {n : Type w → Type w''}
[Iterator α Id β] [Finite α Id] [Monad n] [LawfulMonad n]


@@ -232,7 +232,6 @@ public theorem Iter.toArray_flatMapM {α α₂ β γ : Type w} {m : Type w → T
(it₁.flatMapM f).toArray = Array.flatten <$> (it₁.mapM fun b => do (← f b).toArray).toArray := by
simp [flatMapM, toArray_flatMapAfterM]
set_option backward.isDefEq.respectTransparency false in
public theorem Iter.toList_flatMapAfter {α α₂ β γ : Type w} [Iterator α Id β] [Iterator α₂ Id γ]
[Finite α Id] [Finite α₂ Id]
{f : β → Iter (α := α₂) γ} {it₁ : Iter (α := α) β} {it₂ : Option (Iter (α := α₂) γ)} :
@@ -241,9 +240,9 @@ public theorem Iter.toList_flatMapAfter {α α₂ β γ : Type w} [Iterator α I
| some it₂ => it₂.toList ++
(it₁.map fun b => (f b).toList).toList.flatten := by
simp only [flatMapAfter, Iter.toList, toIterM_toIter, IterM.toList_flatMapAfter]
cases it₂ <;> simp [map, IterM.toList_map_eq_toList_mapM, - IterM.toList_map]
unfold Iter.toList
cases it₂ <;> simp [map]
set_option backward.isDefEq.respectTransparency false in
public theorem Iter.toArray_flatMapAfter {α α₂ β γ : Type w} [Iterator α Id β] [Iterator α₂ Id γ]
[Finite α Id] [Finite α₂ Id]
{f : β → Iter (α := α₂) γ} {it₁ : Iter (α := α) β} {it₂ : Option (Iter (α := α₂) γ)} :
@@ -252,6 +251,7 @@ public theorem Iter.toArray_flatMapAfter {α α₂ β γ : Type w} [Iterator α
| some it₂ => it₂.toArray ++
(it₁.map fun b => (f b).toArray).toArray.flatten := by
simp only [flatMapAfter, Iter.toArray, toIterM_toIter, IterM.toArray_flatMapAfter]
unfold Iter.toArray
cases it₂ <;> simp [map, IterM.toArray_map_eq_toArray_mapM, - IterM.toArray_map]
public theorem Iter.toList_flatMap {α α₂ β γ : Type w} [Iterator α Id β] [Iterator α₂ Id γ]


@@ -374,7 +374,6 @@ theorem IterM.toList_map_eq_toList_filterMapM {α β γ : Type w} {m : Type w
simp [toList_map_eq_toList_mapM, toList_mapM_eq_toList_filterMapM]
congr <;> simp
set_option backward.whnf.reducibleClassField false in
/--
Variant of `toList_filterMapWithPostcondition_filterMapWithPostcondition` that is intended to be
used with the `apply` tactic. Because neither the LHS nor the RHS determine all implicit parameters,
@@ -395,7 +394,7 @@ private theorem IterM.toList_filterMapWithPostcondition_filterMapWithPostconditi
(it.filterMapWithPostcondition (n := o) fg).toList := by
induction it using IterM.inductSteps with | step it ihy ihs
letI : MonadLift n o := monadLift
haveI : LawfulMonadLift n o := ⟨by simp +instances [this], by simp +instances [this]⟩
haveI : LawfulMonadLift n o := ⟨LawfulMonadLiftT.monadLift_pure, LawfulMonadLiftT.monadLift_bind⟩
rw [toList_eq_match_step, toList_eq_match_step, step_filterMapWithPostcondition,
bind_assoc, step_filterMapWithPostcondition, step_filterMapWithPostcondition]
simp only [bind_assoc, liftM_bind]
@@ -602,7 +601,6 @@ theorem IterM.toList_map_mapM {α β γ δ : Type w}
toList_filterMapM_mapM]
congr <;> simp
set_option backward.isDefEq.respectTransparency false in
@[simp]
theorem IterM.toList_filterMapWithPostcondition {α β γ : Type w} {m : Type w → Type w'}
[Monad m] [LawfulMonad m]
@@ -626,7 +624,6 @@ theorem IterM.toList_filterMapWithPostcondition {α β γ : Type w} {m : Type w
· simp [ihs _, heq]
· simp [heq]
set_option backward.isDefEq.respectTransparency false in
@[simp]
theorem IterM.toList_mapWithPostcondition {α β γ : Type w} {m : Type w → Type w'}
[Monad m] [LawfulMonad m] [Iterator α Id β] [Finite α Id]
@@ -647,25 +644,25 @@ theorem IterM.toList_mapWithPostcondition {α β γ : Type w} {m : Type w → Ty
· simp [ihs _, heq]
· simp [heq]
set_option backward.isDefEq.respectTransparency false in
@[simp]
theorem IterM.toList_filterMapM {α β γ : Type w} {m : Type w → Type w'}
[Monad m] [MonadAttach m] [LawfulMonad m] [WeaklyLawfulMonadAttach m]
[Iterator α Id β] [Finite α Id]
{f : β → m (Option γ)} (it : IterM (α := α) Id β) :
(it.filterMapM f).toList = it.toList.run.filterMapM f := by
simp [toList_filterMapM_eq_toList_filterMapWithPostcondition, toList_filterMapWithPostcondition,
PostconditionT.attachLift, PostconditionT.run_eq_map, WeaklyLawfulMonadAttach.map_attach]
simp only [toList_filterMapM_eq_toList_filterMapWithPostcondition,
toList_filterMapWithPostcondition, PostconditionT.run_eq_map]
simp [PostconditionT.attachLift, WeaklyLawfulMonadAttach.map_attach]
set_option backward.isDefEq.respectTransparency false in
@[simp]
theorem IterM.toList_mapM {α β γ : Type w} {m : Type w → Type w'}
[Monad m] [MonadAttach m] [LawfulMonad m] [WeaklyLawfulMonadAttach m]
[Iterator α Id β] [Finite α Id]
{f : β → m γ} (it : IterM (α := α) Id β) :
(it.mapM f).toList = it.toList.run.mapM f := by
simp [toList_mapM_eq_toList_mapWithPostcondition, toList_mapWithPostcondition,
PostconditionT.attachLift, PostconditionT.run_eq_map, WeaklyLawfulMonadAttach.map_attach]
simp only [toList_mapM_eq_toList_mapWithPostcondition, toList_mapWithPostcondition,
PostconditionT.run_eq_map]
simp [PostconditionT.attachLift, WeaklyLawfulMonadAttach.map_attach]
@[simp]
theorem IterM.toList_filterMap {α β γ : Type w} {m : Type w → Type w'}
@@ -1303,7 +1300,6 @@ theorem IterM.forIn_filterMap
rw [filterMap, forIn_filterMapWithPostcondition]
simp [PostconditionT.run_eq_map]
set_option backward.isDefEq.respectTransparency false in
theorem IterM.forIn_mapWithPostcondition
[Monad m] [LawfulMonad m] [Monad n] [LawfulMonad n] [Monad o] [LawfulMonad o]
[MonadLiftT m n] [LawfulMonadLiftT m n] [MonadLiftT n o] [LawfulMonadLiftT n o]
@@ -1314,9 +1310,9 @@ theorem IterM.forIn_mapWithPostcondition
haveI : MonadLift n o := monadLift
forIn (it.mapWithPostcondition f) init g =
forIn it init (fun out acc => do g (← (f out).run) acc) := by
rw [mapWithPostcondition, InternalCombinators.map, InternalCombinators.filterMap,
filterMapWithPostcondition, forIn_filterMapWithPostcondition]
simp [PostconditionT.run_eq_map]
unfold mapWithPostcondition InternalCombinators.map Map.instIterator Map.instIteratorLoop Map
rw [← InternalCombinators.filterMap, filterMapWithPostcondition, forIn_filterMapWithPostcondition]
simp
theorem IterM.forIn_mapM
[Monad m] [LawfulMonad m] [Monad n] [LawfulMonad n] [Monad o] [LawfulMonad o]
@@ -1480,7 +1476,7 @@ theorem IterM.foldM_filterM {α β δ : Type w}
simp [filterM, foldM_filterMapWithPostcondition, PostconditionT.run_attachLift]
congr 1; ext out acc
apply bind_congr; intro fx
cases fx.down <;> simp [PostconditionT.run_eq_map]
cases fx.down <;> simp
theorem IterM.foldM_filterMap {α β γ δ : Type w} {m : Type w → Type w'} {n : Type w → Type w''}
[Iterator α m β] [Finite α m] [Monad m] [Monad n] [LawfulMonad m] [LawfulMonad n]


@@ -32,11 +32,12 @@ theorem Iter.forIn'_eq {α β : Type w} [Iterator α Id β] [Finite α Id]
IterM.DefaultConsumers.forIn' (n := m) (fun _ _ f x => f x.run) γ (fun _ _ _ => True)
it.toIterM init _ (fun _ => id)
(fun out h acc => return ⟨f out (Iter.isPlausibleIndirectOutput_iff_isPlausibleIndirectOutput_toIterM.mpr h) acc, trivial⟩) := by
simp +instances only [ForIn'.forIn']
simp only [ForIn'.forIn']
have : ∀ a b c, f a b c = (Subtype.val <$> (⟨·, trivial⟩) <$> f a b c) := by simp
simp +singlePass only [this]
rw [hl.lawful (fun _ _ f x => f x.run) (wf := IteratorLoop.wellFounded_of_finite)]
simp +instances [IteratorLoop.defaultImplementation]
simp only [IteratorLoop.forIn, Functor.map_map, id_map',
bind_pure_comp]
theorem Iter.forIn_eq {α β : Type w} [Iterator α Id β] [Finite α Id]
{m : Type x → Type x'} [Monad m] [LawfulMonad m] [IteratorLoop α Id m]
@@ -81,7 +82,7 @@ theorem Iter.forIn'_eq_forIn'_toIterM {α β : Type w} [Iterator α Id β]
letI : ForIn' m (IterM (α := α) Id β) β _ := IterM.instForIn'
ForIn'.forIn' it.toIterM init
(fun out h acc => f out (isPlausibleIndirectOutput_iff_isPlausibleIndirectOutput_toIterM.mpr h) acc) := by
simp +instances [ForIn'.forIn', monadLift]
simp [ForIn'.forIn', monadLift]
theorem Iter.forIn_eq_forIn_toIterM {α β : Type w} [Iterator α Id β]
[Finite α Id] {m : Type w → Type w''} [Monad m] [LawfulMonad m]


@@ -109,10 +109,10 @@ theorem IterM.forIn'_eq {α β : Type w} {m : Type w → Type w'} [Iterator α m
letI : ForIn' n (IterM (α := α) m β) β _ := IterM.instForIn'
ForIn'.forIn' (α := β) (m := n) it init f = IterM.DefaultConsumers.forIn' (n := n)
(fun _ _ f x => monadLift x >>= f) γ (fun _ _ _ => True) it init _ (fun _ => id) (return ⟨f · · ·, trivial⟩) := by
simp +instances only [ForIn'.forIn']
simp only [ForIn'.forIn']
have : f = (Subtype.val <$> (⟨·, trivial⟩) <$> f · · ·) := by simp
rw [this, hl.lawful (fun _ _ f x => monadLift x >>= f) (wf := IteratorLoop.wellFounded_of_finite)]
simp +instances [IteratorLoop.defaultImplementation]
simp [IteratorLoop.forIn]
try rfl
theorem IterM.forIn_eq {α β : Type w} {m : Type w → Type w'} [Iterator α m β] [Finite α m]


@@ -272,6 +272,12 @@ theorem PostconditionT.run_bind' {m : Type w → Type w'} [Monad m] [LawfulMonad
(x >>= f).run = x.run >>= (f · |>.run) :=
run_bind
@[simp]
protected theorem PostconditionT.run_pure {m : Type w → Type w'} [Monad m] [LawfulMonad m]
{α : Type w} {x : α} :
(pure x : PostconditionT m α).run = pure x := by
simp [run_eq_map]
@[simp]
theorem PostconditionT.property_lift {m : Type w → Type w'} [Functor m] {α : Type w}
{x : m α} : (lift x : PostconditionT m α).Property = (fun _ => True) := by


@@ -867,7 +867,7 @@ theorem and_le_right {n m : Nat} : n &&& m ≤ m :=
le_of_testBit (by simp)
theorem left_le_or {n m : Nat} : n ≤ n ||| m :=
le_of_testBit (by simpa using fun i => Or.inl)
le_of_testBit (by simp)
theorem right_le_or {n m : Nat} : m ≤ n ||| m :=
le_of_testBit (by simpa using fun i => Or.inr)
le_of_testBit (by simp)


@@ -208,7 +208,7 @@ public instance LawfulOrderLT.of_lt {α : Type u} [LT α] [i : Asymm (α := α)
haveI := LE.ofLT α
LawfulOrderLT α :=
letI := LE.ofLT α
{ lt_iff a b := by simp +instances [LE.le]; apply Asymm.asymm }
{ lt_iff a b := by simp [LE.le]; apply Asymm.asymm }
/--
If an `LT α` instance is asymmetric and its negation is transitive, then `LE.ofLT α` represents a
@@ -253,8 +253,7 @@ public theorem LawfulOrderInf.of_lt {α : Type u} [Min α] [LT α]
letI := LE.ofLT α
{ le_min_iff a b c := by
open Classical in
simp +instances only [LE.le]
simp [← not_or, Decidable.not_iff_not]
simp only [LE.le, not_or, Decidable.not_iff_not]
simpa [Decidable.imp_iff_not_or] using min_lt_iff a b c }
/--
@@ -283,8 +282,7 @@ public theorem LawfulOrderSup.of_lt {α : Type u} [Max α] [LT α]
letI := LE.ofLT α
{ max_le_iff a b c := by
open Classical in
simp +instances only [LE.le]
simp [← not_or, Decidable.not_iff_not]
simp only [LE.le, not_or, Decidable.not_iff_not]
simpa [Decidable.imp_iff_not_or] using lt_max_iff a b c }
/--


@@ -287,7 +287,7 @@ scoped instance (priority := low) instLawfulOrderLTOpposite {il : LE α} {it : L
letI := il.opposite
letI := it.opposite
{ lt_iff a b := by
simp +instances only [LE.opposite, LT.opposite]
simp only [LE.le, LT.lt]
letI := il; letI := it
exact LawfulOrderLT.lt_iff b a }
@@ -297,7 +297,7 @@ scoped instance (priority := low) instLawfulOrderBEqOpposite {il : LE α} {ib :
LawfulOrderBEq α :=
letI := il.opposite
{ beq_iff_le_and_ge a b := by
simp +instances only [LE.opposite]
simp only [LE.le]
letI := il; letI := ib
rw [LawfulOrderBEq.beq_iff_le_and_ge]
exact and_comm }
@@ -310,7 +310,7 @@ scoped instance (priority := low) instLawfulOrderInfOpposite {il : LE α} {im :
letI := il.opposite
letI := im.oppositeMax
{ max_le_iff a b c := by
simp +instances only [LE.opposite, Min.oppositeMax]
simp only [LE.le, Max.max]
letI := il; letI := im
exact LawfulOrderInf.le_min_iff c a b }
@@ -322,11 +322,11 @@ scoped instance (priority := low) instLawfulOrderMinOpposite {il : LE α} {im :
letI := il.opposite
letI := im.oppositeMax
{ max_eq_or a b := by
simp +instances only [Min.oppositeMax]
simp only [Max.max]
letI := il; letI := im
exact MinEqOr.min_eq_or a b
max_le_iff a b c := by
simp +instances only [LE.opposite, Min.oppositeMax]
simp only [LE.le, Max.max]
letI := il; letI := im
exact LawfulOrderInf.le_min_iff c a b }
@@ -338,7 +338,7 @@ scoped instance (priority := low) instLawfulOrderSupOpposite {il : LE α} {im :
letI := il.opposite
letI := im.oppositeMin
{ le_min_iff a b c := by
simp +instances only [LE.opposite, Max.oppositeMin]
simp only [LE.le, Min.min]
letI := il; letI := im
exact LawfulOrderSup.max_le_iff b c a }
@@ -350,11 +350,11 @@ scoped instance (priority := low) instLawfulOrderMaxOpposite {il : LE α} {im :
letI := il.opposite
letI := im.oppositeMin
{ min_eq_or a b := by
simp +instances only [Max.oppositeMin]
simp only [Min.min]
letI := il; letI := im
exact MaxEqOr.max_eq_or a b
le_min_iff a b c := by
simp +instances only [LE.opposite, Max.oppositeMin]
simp only [LE.le, Min.min]
letI := il; letI := im
exact LawfulOrderSup.max_le_iff b c a }
@@ -366,11 +366,11 @@ scoped instance (priority := low) instLawfulOrderLeftLeaningMinOpposite {il : LE
letI := il.opposite
letI := im.oppositeMax
{ max_eq_left a b hab := by
simp +instances only [Min.oppositeMax]
simp only [Max.max]
letI := il; letI := im
exact LawfulOrderLeftLeaningMin.min_eq_left a b hab
max_eq_right a b hab := by
simp +instances only [Min.oppositeMax]
simp only [Max.max]
letI := il; letI := im
exact LawfulOrderLeftLeaningMin.min_eq_right a b hab }
@@ -382,11 +382,11 @@ scoped instance (priority := low) instLawfulOrderLeftLeaningMaxOpposite {il : LE
letI := il.opposite
letI := im.oppositeMin
{ min_eq_left a b hab := by
simp +instances only [Max.oppositeMin]
simp only [Min.min]
letI := il; letI := im
exact LawfulOrderLeftLeaningMax.max_eq_left a b hab
min_eq_right a b hab := by
simp +instances only [Max.oppositeMin]
simp only [Min.min]
letI := il; letI := im
exact LawfulOrderLeftLeaningMax.max_eq_right a b hab }


@@ -796,7 +796,6 @@ automatically. If it fails, it is necessary to provide some of the fields manual
@[inline, expose, implicit_reducible]
public def LinearOrderPackage.ofOrd (α : Type u)
(args : Packages.LinearOrderOfOrdArgs α := by exact {}) : LinearOrderPackage α :=
-- set_option backward.isDefEq.respectTransparency false in
letI := LinearPreorderPackage.ofOrd α args.toLinearPreorderOfOrdArgs
haveI : LawfulEqOrd α := args.eq_of_compare _ _
letI : Min α := args.min


@@ -597,7 +597,7 @@ instance Iterator.instLawfulIteratorLoop [UpwardEnumerable α] [LE α] [Decidabl
LawfulIteratorLoop (Rxc.Iterator α) Id n where
lawful := by
intro lift instLawfulMonadLiftFunction γ it init Pl wf f
simp +instances only [IteratorLoop.forIn, IterM.DefaultConsumers.forIn'_eq_wf Pl wf]
simp only [IteratorLoop.forIn, IterM.DefaultConsumers.forIn'_eq_wf Pl wf]
rw [IterM.DefaultConsumers.forIn'.wf]
split; rotate_left
· simp only [IterM.step_eq,
@@ -636,7 +636,7 @@ The pure function mapping a range iterator of type {name}`IterM` to the next ste
This function is prefixed with {lit}`Monadic` in order to disambiguate it from the version for iterators
of type {name}`Iter`.
-/
@[inline]
@[inline, implicit_reducible]
def Iterator.Monadic.step [UpwardEnumerable α] [LT α] [DecidableLT α]
(it : IterM (α := Rxo.Iterator α) Id α) :
IterStep (IterM (α := Rxo.Iterator α) Id α) α :=
@@ -1113,7 +1113,6 @@ private theorem Iterator.instIteratorLoop.loop_eq_wf [UpwardEnumerable α] [LT
· rw [WellFounded.fix_eq]
simp_all
set_option backward.isDefEq.respectTransparency false in
private theorem Iterator.instIteratorLoop.loopWf_eq [UpwardEnumerable α] [LT α] [DecidableLT α]
[LawfulUpwardEnumerable α] [LawfulUpwardEnumerableLT α]
{n : Type u → Type w} [Monad n] [LawfulMonad n] (γ : Type u)
@@ -1165,14 +1164,13 @@ termination_by IteratorLoop.WithWF.mk ⟨⟨some next, upperBound⟩⟩ acc (hwf
decreasing_by
simp [IteratorLoop.rel, Monadic.isPlausibleStep_iff, Monadic.step, *]
set_option backward.isDefEq.respectTransparency false in
instance Iterator.instLawfulIteratorLoop [UpwardEnumerable α] [LT α] [DecidableLT α]
[LawfulUpwardEnumerable α] [LawfulUpwardEnumerableLT α]
{n : Type u → Type w} [Monad n] [LawfulMonad n] :
LawfulIteratorLoop (Rxo.Iterator α) Id n where
lawful := by
intro lift instLawfulMonadLiftFunction γ it init Pl wf f
simp +instances only [IteratorLoop.forIn, IterM.DefaultConsumers.forIn'_eq_wf Pl wf]
simp only [IteratorLoop.forIn, IterM.DefaultConsumers.forIn'_eq_wf Pl wf]
rw [IterM.DefaultConsumers.forIn'.wf]
split; rotate_left
· simp [IterM.step_eq, Monadic.step, Internal.LawfulMonadLiftBindFunction.liftBind_pure (liftBind := lift)]
@@ -1637,7 +1635,7 @@ instance Iterator.instLawfulIteratorLoop [UpwardEnumerable α]
LawfulIteratorLoop (Rxi.Iterator α) Id n where
lawful := by
intro lift instLawfulMonadLiftFunction γ it init Pl wf f
simp +instances only [IteratorLoop.forIn, IterM.DefaultConsumers.forIn'_eq_wf Pl wf]
simp only [IteratorLoop.forIn, IterM.DefaultConsumers.forIn'_eq_wf Pl wf]
rw [IterM.DefaultConsumers.forIn'.wf]
split; rotate_left
· simp [Monadic.step_eq_step, Monadic.step, Internal.LawfulMonadLiftBindFunction.liftBind_pure]


@@ -248,7 +248,16 @@ instance : HasModel Int8 (BitVec 8) where
le_iff_encode_le := by simp [LE.le, Int8.le]
lt_iff_encode_lt := by simp [LT.lt, Int8.lt]
set_option backward.whnf.reducibleClassField false in
private theorem succ?_eq_minValueSealed {x : Int8} :
UpwardEnumerable.succ? x = if x + 1 = minValueSealed then none else some (x + 1) :=
(rfl)
private theorem succMany?_eq_maxValueSealed {i : Int8} :
UpwardEnumerable.succMany? n i =
have := i.minValue_le_toInt
if h : i.toInt + n ≤ maxValueSealed.toInt then some (.ofIntLE _ (by omega) (maxValueSealed_def ▸ h)) else none :=
(rfl)
theorem instUpwardEnumerable_eq :
instUpwardEnumerable = HasModel.instUpwardEnumerable := by
apply UpwardEnumerable.ext
@@ -256,16 +265,16 @@ theorem instUpwardEnumerable_eq :
apply HasModel.succ?_eq_of_technicalCondition
simp [HasModel.encode, succ?, Int8.toBitVec_inj, toBitVec_minValueSealed_eq_intMinSealed]
· ext
simp +instances [HasModel.succMany?_eq, instUpwardEnumerable, HasModel.encode, HasModel.decode,
simp [HasModel.succMany?_eq, succMany?_eq_maxValueSealed, HasModel.encode, HasModel.decode,
toInt_toBitVec, toBitVec_maxValueSealed_eq_intMaxSealed, ofIntLE_eq_ofInt]
instance : LawfulUpwardEnumerable Int8 := by
simp +instances only [instUpwardEnumerable_eq]
rw [instUpwardEnumerable_eq]
infer_instance
instance : LawfulUpwardEnumerableLE Int8 := by
simp +instances only [instUpwardEnumerable_eq]
rw [instUpwardEnumerable_eq]
infer_instance
public instance instRxcHasSize : Rxc.HasSize Int8 where
@@ -277,7 +286,7 @@ theorem instRxcHasSize_eq :
toInt_toBitVec, HasModel.toNat_toInt_add_one_sub_toInt (Nat.zero_lt_succ _)]
public instance instRxcLawfulHasSize : Rxc.LawfulHasSize Int8 := by
simp +instances only [instUpwardEnumerable_eq, instRxcHasSize_eq]
rw [instUpwardEnumerable_eq, instRxcHasSize_eq]
infer_instance
public instance : Rxc.IsAlwaysFinite Int8 := by exact inferInstance
@@ -294,7 +303,7 @@ theorem instRxiHasSize_eq :
HasModel.encode, HasModel.toNat_two_pow_sub_one_sub_toInt (show 8 > 0 by omega)]
public instance instRxiLawfulHasSize : Rxi.LawfulHasSize Int8 := by
simp +instances only [instUpwardEnumerable_eq, instRxiHasSize_eq]
rw [instUpwardEnumerable_eq, instRxiHasSize_eq]
infer_instance
public instance instRxiIsAlwaysFinite : Rxi.IsAlwaysFinite Int8 := by exact inferInstance
@@ -344,7 +353,6 @@ instance : HasModel Int16 (BitVec 16) where
le_iff_encode_le := by simp [LE.le, Int16.le]
lt_iff_encode_lt := by simp [LT.lt, Int16.lt]
set_option backward.whnf.reducibleClassField false in
theorem instUpwardEnumerable_eq :
instUpwardEnumerable = HasModel.instUpwardEnumerable := by
apply UpwardEnumerable.ext
@@ -440,7 +448,6 @@ instance : HasModel Int32 (BitVec 32) where
le_iff_encode_le := by simp [LE.le, Int32.le]
lt_iff_encode_lt := by simp [LT.lt, Int32.lt]
set_option backward.whnf.reducibleClassField false in
theorem instUpwardEnumerable_eq :
instUpwardEnumerable = HasModel.instUpwardEnumerable := by
apply UpwardEnumerable.ext
@@ -536,7 +543,6 @@ instance : HasModel Int64 (BitVec 64) where
le_iff_encode_le := by simp [LE.le, Int64.le]
lt_iff_encode_lt := by simp [LT.lt, Int64.lt]
set_option backward.whnf.reducibleClassField false in
theorem instUpwardEnumerable_eq :
instUpwardEnumerable = HasModel.instUpwardEnumerable := by
apply UpwardEnumerable.ext
@@ -637,7 +643,6 @@ instance : HasModel ISize (BitVec System.Platform.numBits) where
le_iff_encode_le := by simp [LE.le, ISize.le]
lt_iff_encode_lt := by simp [LT.lt, ISize.lt]
set_option backward.whnf.reducibleClassField false in
theorem instUpwardEnumerable_eq :
instUpwardEnumerable = HasModel.instUpwardEnumerable := by
apply UpwardEnumerable.ext


@@ -26,7 +26,7 @@ variable {shape : RangeShape} {α : Type u}
structure SubarrayIterator (α : Type u) where
xs : Subarray α
@[inline, expose]
@[inline, expose, implicit_reducible]
def SubarrayIterator.step :
IterM (α := SubarrayIterator α) Id α → IterStep (IterM (α := SubarrayIterator α) m α) α
| ⟨xs⟩ =>


@@ -28,7 +28,6 @@ open Std Std.Iterators Std.PRange Std.Slice
namespace SubarrayIterator
set_option backward.isDefEq.respectTransparency false in
theorem step_eq {it : Iter (α := SubarrayIterator α) α} :
it.step = if h : it.1.xs.start < it.1.xs.stop then
haveI := it.1.xs.start_le_stop
@@ -215,7 +214,6 @@ public theorem Array.stop_toSubarray {xs : Array α} {lo hi : Nat} :
(xs.toSubarray lo hi).stop = min hi xs.size := by
simp [toSubarray_eq_min, Subarray.stop]
set_option backward.whnf.reducibleClassField false in
public theorem Subarray.toList_eq {xs : Subarray α} :
xs.toList = (xs.array.extract xs.start xs.stop).toList := by
let aslice := xs


@@ -70,7 +70,6 @@ end ListSlice
namespace List
set_option backward.whnf.reducibleClassField false in
@[simp, grind =]
public theorem toList_mkSlice_rco {xs : List α} {lo hi : Nat} :
xs[lo...hi].toList = (xs.take hi).drop lo := by
@@ -78,9 +77,9 @@ public theorem toList_mkSlice_rco {xs : List α} {lo hi : Nat} :
simp only [Std.Rco.Sliceable.mkSlice, toSlice, ListSlice.toList_eq]
by_cases h : lo < hi
· have : lo ≤ hi := by omega
simp +instances [h, List.take_drop, Nat.add_sub_cancel' _, List.take_eq_take_min]
simp [h, List.take_drop, Nat.add_sub_cancel' _, List.take_eq_take_min]
· have : min hi xs.length ≤ lo := by omega
simp +instances [h, Nat.min_eq_right this]
simp [h, Nat.min_eq_right this]
@[simp, grind =]
public theorem toArray_mkSlice_rco {xs : List α} {lo hi : Nat} :
@@ -111,12 +110,11 @@ public theorem size_mkSlice_rcc {xs : List α} {lo hi : Nat} :
xs[lo...=hi].size = min (hi + 1) xs.length - lo := by
simp [← length_toList_eq_size]
set_option backward.whnf.reducibleClassField false in
@[simp, grind =]
public theorem toList_mkSlice_rci {xs : List α} {lo : Nat} :
xs[lo...*].toList = xs.drop lo := by
rw [List.drop_eq_drop_min]
simp +instances [ListSlice.toList_eq, Std.Rci.Sliceable.mkSlice, List.toUnboundedSlice]
simp [ListSlice.toList_eq, Std.Rci.Sliceable.mkSlice, List.toUnboundedSlice]
@[simp, grind =]
public theorem toArray_mkSlice_rci {xs : List α} {lo : Nat} :
@@ -290,11 +288,11 @@ section ListSubslices
namespace ListSlice
set_option backward.whnf.reducibleClassField false in
@[simp, grind =]
public theorem toList_mkSlice_rco {xs : ListSlice α} {lo hi : Nat} :
xs[lo...hi].toList = (xs.toList.take hi).drop lo := by
simp +instances only [instSliceableListSliceNat_1, List.toList_mkSlice_rco, ListSlice.toList_eq (xs := xs)]
rw [instSliceableListSliceNat_1]
simp only [List.toList_mkSlice_rco, ListSlice.toList_eq (xs := xs)]
obtain ⟨xs, stop⟩ := xs
cases stop
· simp
@@ -329,13 +327,13 @@ public theorem size_mkSlice_rcc {xs : ListSlice α} {lo hi : Nat} :
xs[lo...=hi].size = min (hi + 1) xs.size - lo := by
simp [← length_toList_eq_size]
set_option backward.whnf.reducibleClassField false in
@[simp, grind =]
public theorem toList_mkSlice_rci {xs : ListSlice α} {lo : Nat} :
xs[lo...*].toList = xs.toList.drop lo := by
simp +instances only [instSliceableListSliceNat_2, ListSlice.toList_eq (xs := xs)]
rw [instSliceableListSliceNat_2]
simp only [ListSlice.toList_eq (xs := xs)]
obtain ⟨xs, stop⟩ := xs
simp +instances only
simp only
split <;> simp
@[simp, grind =]


@@ -55,9 +55,11 @@ end String
namespace String.Internal
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_string_posof"]
opaque posOf (s : String) (c : Char) : Pos.Raw
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_string_offsetofpos"]
opaque offsetOfPos (s : String) (pos : Pos.Raw) : Nat
@@ -67,6 +69,7 @@ opaque extract : (@& String) → (@& Pos.Raw) → (@& Pos.Raw) → String
@[extern "lean_string_length"]
opaque length : (@& String) → Nat
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_string_pushn"]
opaque pushn (s : String) (c : Char) (n : Nat) : String
@@ -76,45 +79,57 @@ opaque append : String → (@& String) → String
@[extern "lean_string_utf8_next"]
opaque next (s : @& String) (p : @& Pos.Raw) : Pos.Raw
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_string_isempty"]
opaque isEmpty (s : String) : Bool
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_string_foldl"]
opaque foldl (f : String → Char → String) (init : String) (s : String) : String
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_string_isprefixof"]
opaque isPrefixOf (p : String) (s : String) : Bool
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_string_any"]
opaque any (s : String) (p : Char → Bool) : Bool
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_string_contains"]
opaque contains (s : String) (c : Char) : Bool
@[extern "lean_string_utf8_get"]
opaque get (s : @& String) (p : @& Pos.Raw) : Char
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_string_capitalize"]
opaque capitalize (s : String) : String
@[extern "lean_string_utf8_at_end"]
opaque atEnd : (@& String) → (@& Pos.Raw) → Bool
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_string_nextwhile"]
opaque nextWhile (s : String) (p : Char → Bool) (i : String.Pos.Raw) : String.Pos.Raw
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_string_trim"]
opaque trim (s : String) : String
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_string_intercalate"]
opaque intercalate (s : String) : List String → String
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_string_front"]
opaque front (s : String) : Char
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_string_drop"]
opaque drop (s : String) (n : Nat) : String
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_string_dropright"]
opaque dropRight (s : String) (n : Nat) : String
@@ -141,33 +156,43 @@ def List.asString (s : List Char) : String :=
namespace Substring.Raw.Internal
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_substring_tostring"]
opaque toString : Substring.Raw → String
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_substring_drop"]
opaque drop : Substring.Raw → Nat → Substring.Raw
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_substring_front"]
opaque front (s : Substring.Raw) : Char
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_substring_takewhile"]
opaque takeWhile : Substring.Raw → (Char → Bool) → Substring.Raw
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_substring_extract"]
opaque extract : Substring.Raw → String.Pos.Raw → String.Pos.Raw → Substring.Raw
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_substring_all"]
opaque all (s : Substring.Raw) (p : Char → Bool) : Bool
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_substring_beq"]
opaque beq (ss1 ss2 : Substring.Raw) : Bool
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_substring_isempty"]
opaque isEmpty (ss : Substring.Raw) : Bool
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_substring_get"]
opaque get : Substring.Raw → String.Pos.Raw → Char
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_substring_prev"]
opaque prev : Substring.Raw → String.Pos.Raw → String.Pos.Raw
@@ -175,9 +200,11 @@ end Substring.Raw.Internal
namespace String.Pos.Raw.Internal
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_string_pos_sub"]
opaque sub : String.Pos.Raw → String.Pos.Raw → String.Pos.Raw
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_string_pos_min"]
opaque min (p₁ p₂ : Pos.Raw) : Pos.Raw

View File

@@ -230,7 +230,7 @@ Examples:
* `"empty".isEmpty = false`
* `" ".isEmpty = false`
-/
@[inline] def isEmpty (s : String) : Bool :=
@[inline, expose] def isEmpty (s : String) : Bool :=
s.utf8ByteSize == 0
@[export lean_string_isempty]

View File

@@ -57,4 +57,14 @@ theorem length_map {f : Char → Char} {s : String} : (s.map f).length = s.lengt
theorem map_eq_empty {f : Char → Char} {s : String} : s.map f = "" ↔ s = "" := by
simp [← toList_eq_nil_iff]
@[simp]
theorem map_map {f g : Char → Char} {s : String} : String.map g (String.map f s) = String.map (g ∘ f) s := by
apply String.ext
simp [List.map_map]
@[simp]
theorem map_id {s : String} : String.map id s = s := by
apply String.ext
simp [List.map_id]
end String

View File

@@ -229,7 +229,7 @@ Examples:
* `"Orange".toLower = "orange"`
* `"ABc123".toLower = "abc123"`
-/
@[inline] def toLower (s : String) : String :=
@[inline, expose] def toLower (s : String) : String :=
s.map Char.toLower
/--

View File

@@ -100,7 +100,7 @@ Unsafe implementation of `attachWith`, taking advantage of the fact that the rep
@[simp] theorem pmap_push {P : α → Prop} {f : ∀ a, P a → β} {a : α} {xs : Vector α n} {h : ∀ b ∈ xs.push a, P b} :
pmap f (xs.push a) h =
(pmap f xs (fun a m => by simp at h; exact h a (.inl m))).push (f a (h a (by simp))) := by
(pmap f xs (fun a m => by simp at h; exact h.1 _ m)).push (f a (h a (by simp))) := by
simp [pmap]
@[simp] theorem attach_empty : (#v[] : Vector α 0).attach = #v[] := rfl
@@ -147,7 +147,7 @@ theorem attachWith_congr {xs ys : Vector α n} (w : xs = ys) {P : α → Prop} {
@[simp] theorem attachWith_push {a : α} {xs : Vector α n} {P : α → Prop} {H : ∀ x ∈ xs.push a, P x} :
(xs.push a).attachWith P H =
(xs.attachWith P (fun x h => by simp at H; exact H x (.inl h))).push ⟨a, H a (by simp)⟩ := by
(xs.attachWith P (fun x h => by simp at H; exact H.1 _ h)).push ⟨a, H a (by simp)⟩ := by
rcases xs with ⟨xs, rfl⟩
simp

View File

@@ -72,6 +72,12 @@ theorem and_and_right : (a ∧ b) ∧ c ↔ (a ∧ c) ∧ b ∧ c := by rw [and_
theorem and_iff_left (hb : b) : a ∧ b ↔ a := Iff.intro And.left (And.intro · hb)
theorem and_iff_right (ha : a) : a ∧ b ↔ b := Iff.intro And.right (And.intro ha ·)
@[simp] theorem and_imp_left_iff_true {P Q : Prop} : (P ∧ Q → P) ↔ True := by
simpa using And.left
@[simp] theorem and_imp_right_iff_true {P Q : Prop} : (P ∧ Q → Q) ↔ True := by
simp
/-! ## or -/
theorem or_self_iff : a ∨ a ↔ a := or_self _ ▸ .rfl
@@ -94,6 +100,12 @@ theorem or_rotate : a b c ↔ b c a := by simp only [or_left_com
theorem or_iff_left (hb : ¬b) : a ∨ b ↔ a := or_iff_left_iff_imp.mpr hb.elim
theorem or_iff_right (ha : ¬a) : a ∨ b ↔ b := or_iff_right_iff_imp.mpr ha.elim
@[simp] theorem imp_or_left_iff_true {P Q : Prop} : (P → P ∨ Q) ↔ True := by
simpa using Or.inl
@[simp] theorem imp_or_right_iff_true {P Q : Prop} : (Q → P ∨ Q) ↔ True := by
simpa using Or.inr
/-! ## distributivity -/
theorem not_imp_of_and_not : a ∧ ¬b → ¬(a → b)
@@ -344,12 +356,34 @@ theorem not_forall_of_exists_not {p : α → Prop} : (∃ x, ¬p x) → ¬∀ x,
@[simp] theorem exists_prop_eq' {p : (a : α) → a' = a → Prop} :
(∃ (a : α) (h : a' = a), p a h) ↔ p a' rfl := by simp [@eq_comm _ a']
@[simp] theorem forall_eq_or_imp : (∀ a, a = a' ∨ q a → p a) ↔ p a' ∧ ∀ a, q a → p a := by
@[simp] theorem forall_eq_or_imp {P Q : α → Prop} :
(∀ a, a = a' ∨ Q a → P a) ↔ P a' ∧ ∀ a, Q a → P a := by
simp only [or_imp, forall_and, forall_eq]
@[simp] theorem forall_eq_or_imp' {P Q : α → Prop} {a' : α} :
(∀ (a : α), a' = a ∨ Q a → P a) ↔ P a' ∧ ∀ (a : α), Q a → P a := by
simp only [or_imp, forall_and, forall_eq']
@[simp] theorem forall_or_eq_imp {P Q : α → Prop} :
(∀ a, Q a ∨ a = a' → P a) ↔ (∀ a, Q a → P a) ∧ P a' := by
simp only [or_imp, forall_and, forall_eq]
@[simp] theorem forall_or_eq_imp' {P Q : α → Prop} :
(∀ a, Q a ∨ a' = a → P a) ↔ (∀ a, Q a → P a) ∧ P a' := by
simp only [or_imp, forall_and, forall_eq']
@[simp] theorem exists_eq_or_imp : (∃ a, (a = a' ∨ q a) ∧ p a) ↔ p a' ∨ ∃ a, q a ∧ p a := by
simp only [or_and_right, exists_or, exists_eq_left]
@[simp] theorem exists_eq_or_imp' : (∃ a, (a' = a ∨ q a) ∧ p a) ↔ p a' ∨ ∃ a, q a ∧ p a := by
simp only [or_and_right, exists_or, exists_eq_left']
@[simp] theorem exists_or_eq_imp : (∃ a, (q a ∨ a = a') ∧ p a) ↔ (∃ a, q a ∧ p a) ∨ p a' := by
simp only [or_and_right, exists_or, exists_eq_left]
@[simp] theorem exists_or_eq_imp' : (∃ a, (q a ∨ a' = a) ∧ p a) ↔ (∃ a, q a ∧ p a) ∨ p a' := by
simp only [or_and_right, exists_or, exists_eq_left']
@[simp] theorem exists_eq_right_right : (∃ (a : α), p a ∧ q a ∧ a = a') ↔ p a' ∧ q a' := by
simp [← and_assoc]
@@ -404,6 +438,26 @@ theorem forall_prop_of_false {p : Prop} {q : p → Prop} (hn : ¬p) : (∀ h' :
@[simp] theorem forall_self_imp (P : Prop) (Q : P → Prop) : (∀ p : P, P → Q p) ↔ ∀ p, Q p :=
⟨fun h p => h p p, fun h _ p => h p⟩
@[simp]
theorem forall_or_imp_or_self_right_right {P Q R : α → Prop} :
(∀ a, P a ∨ Q a → R a ∨ Q a) ↔ (∀ a, P a → R a ∨ Q a) := by
simp only [or_imp, imp_or_right_iff_true, and_true]
@[simp]
theorem forall_or_imp_or_self_right_left {P Q R : α → Prop} :
(∀ a, P a ∨ Q a → Q a ∨ R a) ↔ (∀ a, P a → Q a ∨ R a) := by
simp only [or_imp, imp_or_left_iff_true, and_true]
@[simp]
theorem forall_or_imp_or_self_left_right {P Q R : α → Prop} :
(∀ a, Q a ∨ P a → R a ∨ Q a) ↔ (∀ a, P a → R a ∨ Q a) := by
simp only [or_imp, imp_or_right_iff_true, true_and]
@[simp]
theorem forall_or_imp_or_self_left_left {P Q R : α → Prop} :
(∀ a, Q a ∨ P a → Q a ∨ R a) ↔ (∀ a, P a → Q a ∨ R a) := by
simp only [or_imp, imp_or_left_iff_true, true_and]
end quantifiers
/-! ## membership -/
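The four `forall_or_imp_or_self_*` lemmas above share one pattern: when the disjunctive hypothesis and the disjunctive conclusion share a disjunct, that branch of the implication is trivially true and drops away. A minimal sketch of one of them firing (hypothetical example, assuming the lemma statements as given):

```lean
-- `simp only` rewrites the overlapping-disjunct form down to the reduced
-- form, which is exactly the hypothesis `h`.
example (P Q R : Nat → Prop) (h : ∀ a, P a → R a ∨ Q a) :
    ∀ a, P a ∨ Q a → R a ∨ Q a := by
  simp only [forall_or_imp_or_self_right_right]
  exact h
```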

View File

@@ -56,6 +56,7 @@ private def syntaxToExternAttrData (stx : Syntax) : AttrM ExternAttrData := do
return { entries := entries.toList }
-- Forward declaration
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_add_extern"]
opaque addExtern (declName : Name) (externAttrData : ExternAttrData) : CoreM Unit

View File

@@ -81,7 +81,10 @@ where
ps.map fun p => { p with borrow := p.type.isPossibleRef }
initParamsIfNotExported (exported : Bool) (ps : Array (Param .impure)) : Array (Param .impure) :=
if exported then ps else initParams ps
if exported then
ps.map ({ · with borrow := false })
else
initParams ps
goCode (declName : Name) (code : Code .impure) : InitM Unit := do
match code with

View File

@@ -333,11 +333,7 @@ public def demangleBtLine (line : String) : Option String := do
return pfx ++ demangled ++ sfx
@[export lean_demangle_bt_line_cstr]
def demangleBtLineCStr (line : @& String) : String :=
def demangleBtLineCStr (line : String) : String :=
(demangleBtLine line).getD ""
@[export lean_demangle_symbol_cstr]
def demangleSymbolCStr (symbol : @& String) : String :=
(demangleSymbol symbol).getD ""
end Lean.Name.Demangle

View File

@@ -69,6 +69,11 @@ structure LeanClientCapabilities where
If `none` or `false`, silent diagnostics will not be served to the client.
-/
silentDiagnosticSupport? : Option Bool := none
/--
The latest RPC wire format supported by the client.
Defaults to `v0` when `none`.
-/
rpcWireFormat? : Option RpcWireFormat := none
deriving ToJson, FromJson
structure ClientCapabilities where
@@ -86,6 +91,13 @@ def ClientCapabilities.silentDiagnosticSupport (c : ClientCapabilities) : Bool :
| return false
return silentDiagnosticSupport
def ClientCapabilities.rpcWireFormat (c : ClientCapabilities) : RpcWireFormat := Id.run do
let some lean := c.lean?
| return .v0
let some v := lean.rpcWireFormat?
| return .v0
return v
structure LeanServerCapabilities where
moduleHierarchyProvider? : Option ModuleHierarchyOptions
rpcProvider? : Option RpcOptions

View File

@@ -132,6 +132,11 @@ structure HighlightMatchesOptions where
structure RpcOptions where
highlightMatchesProvider? : Option HighlightMatchesOptions := none
/--
The latest RPC wire format supported by the server.
Defaults to `v0` when `none`.
-/
rpcWireFormat? : Option RpcWireFormat := none
deriving FromJson, ToJson
structure LeanModule where
@@ -208,11 +213,18 @@ structure RpcConnected where
/-- `$/lean/rpc/call` client->server request.
A request to execute a procedure bound for RPC. If an incorrect session ID is present, the server
errors with `RpcNeedsReconnect`.
A request to execute a procedure bound for RPC.
If an incorrect session ID is present, the server errors with `RpcNeedsReconnect`.
Extending TDPP is weird. But in Lean, symbols exist in the context of a position within a source
file. So we need this to refer to code in the environment at that position. -/
Extends `TextDocumentPositionParams` because in Lean,
symbols exist in the context of a position within a source file
(e.g., `foo` is defined below but not above `def foo`).
We use the position to resolve the set of defined constants,
registered RPC methods, etc.
Both the request and the response may contain `RpcEncodable` data.
It is serialized following the `RpcWireFormat`.
-/
structure RpcCallParams extends TextDocumentPositionParams where
sessionId : UInt64
/-- Procedure to invoke. Must be fully qualified. -/
@@ -227,7 +239,8 @@ A notification to release remote references. Should be sent by the client when i
structure RpcReleaseParams where
uri : DocumentUri
sessionId : UInt64
refs : Array RpcRef
/-- Array of RPC references to release. -/
refs : Array Json
deriving FromJson, ToJson
/-- `$/lean/rpc/keepAlive` client->server notification.

View File

@@ -235,8 +235,8 @@ def idbgCompileAndEval (α : Type) [Nonempty α]
| .error msg => throw (.userError s!"idbg evalConst failed: {msg}")
/-- Connect to the debug server, receive expressions, evaluate, send results. Loops forever. -/
@[nospecialize, export lean_idbg_client_loop] def idbgClientLoopImpl {α : Type} [Nonempty α]
(siteId : String) (imports : Array Import) (apply : α → String) : IO Unit := do
@[nospecialize, export lean_idbg_client_loop] def idbgClientLoopImpl
(siteId : String) (imports : Array Import) (apply : NonScalar → String) : IO Unit := do
let baseEnv ← idbgLoadEnv imports
let port := idbgPort siteId
let addr := SocketAddressV4.mk (.ofParts 127 0 0 1) port
@@ -249,7 +249,7 @@ def idbgCompileAndEval (α : Type) [Nonempty α]
let json ← IO.ofExcept (Json.parse msg)
let type ← IO.ofExcept (exprFromJson? (← IO.ofExcept (json.getObjVal? "type")))
let value ← IO.ofExcept (exprFromJson? (← IO.ofExcept (json.getObjVal? "value")))
let fnVal ← idbgCompileAndEval α baseEnv type value
let fnVal ← idbgCompileAndEval NonScalar baseEnv type value
let result := apply fnVal
sendMsg client result
let t ← client.shutdown |>.toIO

View File

@@ -192,7 +192,8 @@ where
-- Last resort: Split match
trace[Elab.Tactic.Do.vcgen] "split match: {e}"
burnOne
return info.splitWith goal.toExpr (useSplitter := true) fun altSuff expAltType idx params => do
return info.splitWith goal.toExpr (useSplitter := true) fun altSuff expAltType idx altFVars => do
let params := altFVars.altParams
burnOne
let some goal := parseMGoal? expAltType
| throwError "Bug in `mvcgen`: Expected alt type {expAltType} could not be parsed as an MGoal."
@@ -253,8 +254,8 @@ where
mkFreshExprSyntheticOpaqueMVar hypsTy (name.appendIndexAfter i)
let (joinPrf, joinGoal) ← forallBoundedTelescope joinTy numJoinParams fun joinParams _body => do
let φ ← info.splitWith (mkSort .zero) fun _suff _expAltType idx altParams =>
return mkAppN hypsMVars[idx]! (joinParams ++ altParams)
let φ ← info.splitWith (mkSort .zero) fun _suff _expAltType idx altFVars =>
return mkAppN hypsMVars[idx]! (joinParams ++ altFVars.altParams)
withLocalDecl (← mkFreshUserName `h) .default φ fun h => do
-- NB: `mkJoinGoal` is not quite `goal.withNewProg` because we only take 4 args and clear
-- the stateful hypothesis of the goal.

View File

@@ -51,30 +51,106 @@ def altInfos (info : SplitInfo) : Array (Nat × Expr) := match info with
| matcher matcherApp => matcherApp.altNumParams.mapIdx fun idx numParams =>
(numParams, matcherApp.alts[idx]!)
/-- The expression represented by a `SplitInfo`. -/
def expr : SplitInfo → Expr
| .ite e => e
| .dite e => e
| .matcher matcherApp => matcherApp.toExpr
/--
Introduces fvars for all varying parts of a `SplitInfo` and provides the abstract
`SplitInfo` and fvars to the continuation.
For `ite`/`dite`, introduces `c : Prop`, `dec : Decidable c`, `t : mα` (or `t : c → mα`),
`e : mα` (or `e : ¬c → mα`).
For `matcher`, introduces discriminant fvars and alternative fvars, builds a non-dependent
motive `fun _ ... _ => mα`, and adjusts matcher universe levels.
The abstract `SplitInfo` satisfies `abstractInfo.expr = abstractProgram`.
For `matcher`, the abstract `MatcherApp` stores eta-expanded fvar alts so that
`splitWith`/`matcherApp.transform` can `instantiateLambda` them directly (no patching needed).
Since eta-expanded alts like `fun n => alt n` can cause expensive higher-order unification in
backward rule patterns, callers building backward rules should eta-reduce them first (e.g.
via `Expr.eta` on the alt arguments of `abstractInfo.expr`).
-/
def withAbstract {n} {α} [MonadLiftT MetaM n] [MonadControlT MetaM n] [Monad n] [Inhabited α]
(info : SplitInfo) (resTy : Expr)
(k : (abstractInfo : SplitInfo) → (splitFVars : Array Expr) → n α) : n α :=
match info with
| .ite _ =>
withLocalDeclD `c (mkSort 0) fun c =>
withLocalDeclD `dec (mkApp (mkConst ``Decidable) c) fun dec =>
withLocalDeclD `t resTy fun t =>
withLocalDeclD `e resTy fun e => do
let u ← liftMetaM <| getLevel resTy
k (.ite <| mkApp5 (mkConst ``_root_.ite [u]) resTy c dec t e) #[c, dec, t, e]
| .dite _ =>
withLocalDeclD `c (mkSort 0) fun c =>
withLocalDeclD `dec (mkApp (mkConst ``Decidable) c) fun dec => do
let tTy ← liftMetaM <| mkArrow c resTy
let eTy ← liftMetaM <| mkArrow (mkNot c) resTy
withLocalDeclD `t tTy fun t =>
withLocalDeclD `e eTy fun e => do
let u ← liftMetaM <| getLevel resTy
k (.dite <| mkApp5 (mkConst ``_root_.dite [u]) resTy c dec t e) #[c, dec, t, e]
| .matcher matcherApp => do
let discrNamesTypes ← matcherApp.discrs.mapIdxM fun i discr => do
return ((`discr).appendIndexAfter (i+1), ← liftMetaM <| inferType discr)
withLocalDeclsDND discrNamesTypes fun discrs => do
-- Non-dependent motive: fun _ ... _ => mα
let motive ← liftMetaM <| lambdaTelescope matcherApp.motive fun motiveArgs _ =>
mkLambdaFVars motiveArgs resTy
-- The matcher's universe levels include a `uElimPos` slot for the elimination target level.
-- Our abstract motive `fun _ ... _ => mα` may target a different level than the original
-- dependent motive, so we update `matcherLevels[uElimPos]` to `getLevel mα`.
let matcherLevels ← match matcherApp.uElimPos? with
| none => pure matcherApp.matcherLevels
| some pos => do
let uElim ← liftMetaM <| getLevel resTy
pure <| matcherApp.matcherLevels.set! pos uElim
-- Build partial application to infer alt types
let matcherPartial := mkAppN (mkConst matcherApp.matcherName matcherLevels.toList) matcherApp.params
let matcherPartial := mkApp matcherPartial motive
let matcherPartial := mkAppN matcherPartial discrs
let origAltTypes ← liftMetaM <| inferArgumentTypesN matcherApp.alts.size matcherPartial
let altNamesTypes := origAltTypes.mapIdx fun i ty => ((`alt).appendIndexAfter (i+1), ty)
withLocalDeclsDND altNamesTypes fun alts => do
-- Eta-expand fvar alts so `splitWith`/`matcherApp.transform` can `instantiateLambda` them.
let abstractMatcherApp : MatcherApp := {
matcherApp with
matcherLevels := matcherLevels
discrs := discrs
motive := motive
alts := (← liftMetaM <| alts.mapM etaExpand)
remaining := #[]
}
k (.matcher abstractMatcherApp) (discrs ++ alts)
def splitWith
{n} [MonadLiftT MetaM n] [MonadControlT MetaM n] [Monad n] [MonadError n] [MonadEnv n] [MonadLog n]
[AddMessageContext n] [MonadOptions n]
(info : SplitInfo) (resTy : Expr) (onAlt : Name → Expr → Nat → Array Expr → n Expr) (useSplitter := false) : n Expr := match info with
(info : SplitInfo) (resTy : Expr) (onAlt : Name → Expr → Nat → MatcherApp.TransformAltFVars → n Expr) (useSplitter := false) : n Expr := match info with
| ite e => do
let u ← getLevel resTy
let c := e.getArg! 1
let h := e.getArg! 2
if useSplitter then -- dite is the "splitter" for ite
let n ← liftMetaM <| mkFreshUserName `h
let t ← withLocalDecl n .default c fun h => do mkLambdaFVars #[h] (← onAlt `isTrue resTy 0 #[])
let e ← withLocalDecl n .default (mkNot c) fun h => do mkLambdaFVars #[h] (← onAlt `isFalse resTy 1 #[])
let t ← withLocalDecl n .default c fun h => do mkLambdaFVars #[h] (← onAlt `isTrue resTy 0 { fields := #[h] })
let e ← withLocalDecl n .default (mkNot c) fun h => do mkLambdaFVars #[h] (← onAlt `isFalse resTy 1 { fields := #[h] })
return mkApp5 (mkConst ``_root_.dite [u]) resTy c h t e
else
let t ← onAlt `isTrue resTy 0 #[]
let e ← onAlt `isFalse resTy 1 #[]
let t ← onAlt `isTrue resTy 0 { fields := #[] }
let e ← onAlt `isFalse resTy 1 { fields := #[] }
return mkApp5 (mkConst ``_root_.ite [u]) resTy c h t e
| dite e => do
let u ← getLevel resTy
let c := e.getArg! 1
let h := e.getArg! 2
let n ← liftMetaM <| mkFreshUserName `h
let t ← withLocalDecl n .default c fun h => do mkLambdaFVars #[h] (← onAlt `isTrue resTy 0 #[h])
let e ← withLocalDecl n .default (mkNot c) fun h => do mkLambdaFVars #[h] (← onAlt `isFalse resTy 1 #[h])
let t ← withLocalDecl n .default c fun h => do mkLambdaFVars #[h] (← onAlt `isTrue resTy 0 { args := #[h], fields := #[h] })
let e ← withLocalDecl n .default (mkNot c) fun h => do mkLambdaFVars #[h] (← onAlt `isFalse resTy 1 { args := #[h], fields := #[h] })
return mkApp5 (mkConst ``_root_.dite [u]) resTy c h t e
| matcher matcherApp => do
let mask := matcherApp.discrs.map (·.isFVar)
@@ -83,8 +159,8 @@ def splitWith
(·.toExpr) <$> matcherApp.transform
(useSplitter := useSplitter) (addEqualities := useSplitter) -- (freshenNames := true)
(onMotive := fun xs _body => pure (absMotiveBody.instantiateRev (Array.mask mask xs)))
(onAlt := fun idx expAltType params _alt => do
onAlt ((`h).appendIndexAfter (idx+1)) expAltType idx params)
(onAlt := fun idx expAltType altFVars _alt => do
onAlt ((`h).appendIndexAfter (idx+1)) expAltType idx altFVars)
def simpDiscrs? (info : SplitInfo) (e : Expr) : SimpM (Option Simp.Result) := match info with
| dite _ | ite _ => return none -- Tricky because we need to simultaneously rewrite `[Decidable c]`

View File

@@ -623,6 +623,7 @@ private def evalSuggestSimpAllTrace : TryTactic := fun tac => do
| _ => throwUnsupportedSyntax
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_eval_suggest_tactic"] -- forward definition to avoid mutual block
opaque evalSuggest : TryTactic

View File

@@ -752,6 +752,7 @@ private def lakeAdd (env : Environment) (cinfo : ConstantInfo) : Environment :=
}
-- forward reference due to too many cyclic dependencies
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_is_reserved_name"]
private opaque isReservedName (env : Environment) (name : Name) : Bool
@@ -1768,6 +1769,7 @@ private def looksLikeOldCodegenName : Name → Bool
| .str _ s => s.startsWith "_cstage" || s.startsWith "_spec_" || s.startsWith "_elambda"
| _ => false
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_get_ir_extra_const_names"]
private opaque getIRExtraConstNames (env : Environment) (level : OLeanLevel) (includeDecls := false) : Array Name
@@ -1804,6 +1806,7 @@ def mkModuleData (env : Environment) (level : OLeanLevel := .private) : IO Modul
constNames, constants, entries
}
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_ir_export_entries"]
private opaque exportIREntries (env : Environment) : Array (Name × Array EnvExtensionEntry)
@@ -1862,6 +1865,7 @@ private def setImportedEntries (states : Array EnvExtensionState) (mods : Array
{ s with importedEntries := s.importedEntries.set! modIdx entries }
return states
set_option compiler.ignoreBorrowAnnotation true in
/--
"Forward declaration" needed for updating the attribute table with user-defined attributes.
User-defined attributes are declared using the `initialize` command. The `initialize` command is just syntax sugar for the `init` attribute.
@@ -1872,9 +1876,12 @@ private def setImportedEntries (states : Array EnvExtensionState) (mods : Array
Later, we set this method with code that adds the user-defined attributes that were imported after we initialized `attributeExtension`.
-/
@[extern "lean_update_env_attributes"] opaque updateEnvAttributes : Environment → IO Environment
set_option compiler.ignoreBorrowAnnotation true in
/-- "Forward declaration" for retrieving the number of builtin attributes. -/
@[extern "lean_get_num_attributes"] opaque getNumBuiltinAttributes : IO Nat
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_run_init_attrs"]
private opaque runInitAttrs (env : Environment) (opts : Options) : IO Unit
@@ -2399,6 +2406,7 @@ def displayStats (env : Environment) : IO Unit := do
@[extern "lean_eval_const"]
private unsafe opaque evalConstCore (α) (env : @& Environment) (opts : @& Options) (constName : @& Name) : Except String α
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_eval_check_meta"]
private opaque evalCheckMeta (env : Environment) (constName : Name) : Except String Unit

View File

@@ -760,6 +760,7 @@ have to hard-code the true arity of these definitions here, and make sure the C
We have used another hack based on `IO.Ref`s in the past, it was safer but less efficient.
-/
set_option compiler.ignoreBorrowAnnotation true in
/--
Reduces an expression to its *weak head normal form*.
This is when the "head" of the top-level expression has been fully reduced.
@@ -768,6 +769,7 @@ The result may contain subexpressions that have not been reduced.
See `Lean.Meta.whnfImp` for the implementation.
-/
@[extern "lean_whnf"] opaque whnf : Expr → MetaM Expr
set_option compiler.ignoreBorrowAnnotation true in
/--
Returns the inferred type of the given expression. Assumes the expression is type-correct.
@@ -822,8 +824,11 @@ def e3 : Expr := .app (.const ``Nat.zero []) (.const ``Nat.zero [])
See `Lean.Meta.inferTypeImp` for the implementation of `inferType`.
-/
@[extern "lean_infer_type"] opaque inferType : Expr → MetaM Expr
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_is_expr_def_eq"] opaque isExprDefEqAux : Expr → Expr → MetaM Bool
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_is_level_def_eq"] opaque isLevelDefEqAux : Level → Level → MetaM Bool
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_synth_pending"] protected opaque synthPending : MVarId → MetaM Bool
def whnfForall (e : Expr) : MetaM Expr := do
@@ -2498,6 +2503,7 @@ def isDefEqD (t s : Expr) : MetaM Bool :=
def isDefEqI (t s : Expr) : MetaM Bool :=
withReducibleAndInstances <| isDefEq t s
set_option compiler.ignoreBorrowAnnotation true in
/--
Returns `true` if `mvarId := val` was successfully assigned.
This method uses the same assignment validation performed by `isDefEq`, but it does not check whether the types match.

View File

@@ -46,12 +46,14 @@ Forward definition of `getEquationsForImpl`.
We want to use `getEquationsFor` in the simplifier, but
`getEquationsFor` depends on `mkEquationsFor` which uses the simplifier.
-/
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_get_match_equations_for"]
opaque getEquationsFor (matchDeclName : Name) : MetaM MatchEqns
/-
Forward definition of `genMatchCongrEqnsImpl`.
-/
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_get_congr_match_equations_for"]
opaque genMatchCongrEqns (matchDeclName : Name) : MetaM (Array Name)

View File

@@ -201,6 +201,47 @@ private def forallAltTelescope'
fun ys args _mask _bodyType => k ys args
) k
/--
Fvars/exprs introduced in the telescope of a matcher alternative during `transform`.
* `args` are the values passed to `instantiateLambda` on the original alt. They usually
coincide with `fields`, but may include non-fvar values (e.g. `Unit.unit` for thunked alts).
* `fields` are the constructor-field fvars (proper fvar subset of `args`).
* `overlaps` are overlap-parameter fvars (splitter path only, for non-`casesOn` splitters).
* `discrEqs` are discriminant-equation fvars from the matcher's own type (`numDiscrEqs`).
* `extraEqs` are equation fvars added by the `addEqualities` flag.
**Example.** `transform` with `addEqualities := true` on a `Nat.casesOn` application
`Nat.casesOn (motive := …) n alt₀ alt₁` opens alt telescopes:
```
Alt 0 (zero): (heq : n = Nat.zero) → motive Nat.zero
⟹ { args := #[], fields := #[], extraEqs := #[heq] }
Alt 1 (succ): (k : Nat) → (heq : n = Nat.succ k) → motive (Nat.succ k)
⟹ { args := #[k], fields := #[k], extraEqs := #[heq] }
```
-/
structure TransformAltFVars where
/-- Arguments for `instantiateLambda` on the original alternative (see example above).
May include non-fvar values like `Unit.unit` for thunked alternatives. -/
args : Array Expr := #[]
/-- Constructor field fvars, i.e. the proper fvar subset of `args` (see example above). -/
fields : Array Expr
/-- Overlap parameter fvars (non-casesOn splitters only). -/
overlaps : Array Expr := #[]
/-- Discriminant equation fvars from the matcher's own type (`numDiscrEqs`). -/
discrEqs : Array Expr := #[]
/-- Extra equation fvars added by `addEqualities` (see `heq` in the example above). -/
extraEqs : Array Expr := #[]
/-- The `altParams` that were used for `instantiateLambda alt altParams` inside `transform`. -/
def TransformAltFVars.altParams (fvars : TransformAltFVars) : Array Expr :=
fvars.args ++ fvars.discrEqs
/-- All proper fvars in binding order, matching the lambdas that `transform` wraps around the alt result. -/
def TransformAltFVars.all (fvars : TransformAltFVars) : Array Expr :=
fvars.fields ++ fvars.overlaps ++ fvars.discrEqs ++ fvars.extraEqs
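Continuing the `Nat.casesOn` example from the `TransformAltFVars` docstring, the two projections differ only in whether the `addEqualities` hypotheses are included (a sketch, not part of the diff):

```lean
-- Alt 1 (succ) with addEqualities := true gives
--   fvars = { args := #[k], fields := #[k], extraEqs := #[heq] }
-- so:
--   fvars.altParams = #[k]       -- args ++ discrEqs, as passed to instantiateLambda
--   fvars.all       = #[k, heq]  -- fields ++ overlaps ++ discrEqs ++ extraEqs
```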
/--
Performs a possibly type-changing transformation to a `MatcherApp`.
@@ -229,7 +270,7 @@ def transform
(addEqualities : Bool := false)
(onParams : Expr → n Expr := pure)
(onMotive : Array Expr → Expr → n Expr := fun _ e => pure e)
(onAlt : Nat → Expr → Array Expr → Expr → n Expr := fun _ _ _ e => pure e)
(onAlt : Nat → Expr → TransformAltFVars → Expr → n Expr := fun _ _ _ e => pure e)
(onRemaining : Array Expr → n (Array Expr) := pure) :
n MatcherApp := do
@@ -331,7 +372,7 @@ def transform
let altParams := args ++ ys3
let alt ← try instantiateLambda alt altParams
catch _ => throwError "unexpected matcher application, insufficient number of parameters in alternative"
let alt' ← onAlt altIdx altType altParams alt
let alt' ← onAlt altIdx altType { args, fields := ys, overlaps := ys2, discrEqs := ys3, extraEqs := ys4 } alt
mkLambdaFVars (ys ++ ys2 ++ ys3 ++ ys4) alt'
if splitterAltInfo.hasUnitThunk then
-- The splitter expects a thunked alternative, but we don't want the `x : Unit` to be in
@@ -372,7 +413,7 @@ def transform
let names ← lambdaTelescope alt fun xs _ => xs.mapM (·.fvarId!.getUserName)
withUserNames xs names do
let alt ← instantiateLambda alt xs
let alt' ← onAlt altIdx altType xs alt
let alt' ← onAlt altIdx altType { args := xs, fields := xs, extraEqs := ys4 } alt
mkLambdaFVars (xs ++ ys4) alt'
alts' := alts'.push alt'
@@ -446,7 +487,7 @@ def inferMatchType (matcherApp : MatcherApp) : MetaM MatcherApp := do
}
mkArrowN extraParams typeMatcherApp.toExpr
)
(onAlt := fun _altIdx expAltType _altParams alt => do
(onAlt := fun _altIdx expAltType _altFVars alt => do
let altType ← inferType alt
let eq ← mkEq expAltType altType
let proof ← mkFreshExprSyntheticOpaqueMVar eq

View File

@@ -0,0 +1,111 @@
/-
Copyright (c) 2026 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Joachim Breitner
-/
module
prelude
public import Lean.Meta.AppBuilder
public section
namespace Lean.Meta
/--
Builds a proof of `String.ofList l₁ ≠ String.ofList l₂` for two distinct string literals.
The proof avoids both `String.decEq` (expensive UTF-8 byte array comparison) and full
`List Char` comparison. It reduces to `List Char` via `String.ofList_injective`, then
proves the lists differ using one of two strategies depending on whether the strings
share a common prefix of the shorter string's full length:
- **Different characters at position `i`**: Uses
`congrArg (fun l => (List.get!Internal l i).toNat)` to project down to `Nat`, then
`Nat.ne_of_beq_eq_false rfl` for a single `Nat.beq` comparison. This avoids `Decidable`
instances entirely — the kernel only evaluates `Nat.beq` on two concrete natural numbers.
- **One string is a prefix of the other**: Uses `congrArg (List.drop n ·)` where `n` is the
shorter string's length, then `List.cons_ne_nil` to prove the non-empty tail differs from `[]`.
This avoids `decide` entirely since `cons_ne_nil` is a definitional proof.
-/
def mkStringLitNeProof (s₁ s₂ : String) : MetaM Expr := do
let l₁ := s₁.toList
let l₂ := s₂.toList
let l₁Expr := toExpr l₁
let l₂Expr := toExpr l₂
let charConst := mkConst ``Char
let listCharTy := mkApp (mkConst ``List [.zero]) charConst
-- Find the first differing character index
let minLen := min l₁.length l₂.length
let diffIdx := Id.run do
for i in [:minLen] do
if l₁[i]! ≠ l₂[i]! then return i
return minLen
-- Build the list inequality proof, dispatching on whether it's a character
-- difference or a prefix/length difference
let listEq := mkApp3 (mkConst ``Eq [.succ .zero]) listCharTy l₁Expr l₂Expr
let listNeProof ←
if diffIdx < minLen then
-- Both lists have a character at diffIdx: project to Nat via get!Internal + toNat,
-- then use Nat.ne_of_beq_eq_false rfl (avoids Decidable instances entirely)
let inhabCharExpr := mkConst ``Char.instInhabited
let natConst := mkConst ``Nat
let iExpr := toExpr diffIdx
-- f = fun l => (List.get!Internal l i).toNat
let projFn := mkLambda `l .default listCharTy
(mkApp (mkConst ``Char.toNat)
(mkApp4 (mkConst ``List.get!Internal [.zero]) charConst inhabCharExpr (mkBVar 0) iExpr))
let mkGetToNat (lExpr : Expr) : Expr :=
mkApp (mkConst ``Char.toNat)
(mkApp4 (mkConst ``List.get!Internal [.zero]) charConst inhabCharExpr lExpr iExpr)
let projL1 := mkGetToNat l₁Expr
let projL2 := mkGetToNat l₂Expr
let projEq := mkApp3 (mkConst ``Eq [.succ .zero]) natConst projL1 projL2
let congrArgFn := mkApp5 (mkConst ``congrArg [.succ .zero, .succ .zero])
listCharTy natConst l₁Expr l₂Expr projFn
-- Nat.ne_of_beq_eq_false rfl : n₁ ≠ n₂ (kernel evaluates Nat.beq on two literals)
let projNeProof := mkApp3 (mkConst ``Nat.ne_of_beq_eq_false)
projL1 projL2 (← mkEqRefl (mkConst ``false))
pure <| mkApp4 (mkConst ``mt) listEq projEq congrArgFn projNeProof
else
-- One list is a prefix of the other: use drop + cons_ne_nil
let nExpr := toExpr minLen
let dropFn := mkLambda `l .default listCharTy
(mkApp3 (mkConst ``List.drop [.zero]) charConst nExpr (mkBVar 0))
let dropL1 := mkApp3 (mkConst ``List.drop [.zero]) charConst nExpr l₁Expr
let dropL2 := mkApp3 (mkConst ``List.drop [.zero]) charConst nExpr l₂Expr
let dropEq := mkApp3 (mkConst ``Eq [.succ .zero]) listCharTy dropL1 dropL2
let congrArgFn := mkApp5 (mkConst ``congrArg [.succ .zero, .succ .zero])
listCharTy listCharTy l₁Expr l₂Expr dropFn
-- The longer list's drop has a head element; the shorter list's drop is []
let (hdChar, tlList) :=
if l₁.length ≤ l₂.length then
let dropped := l₂.drop minLen
(dropped.head!, dropped.tail!)
else
let dropped := l₁.drop minLen
(dropped.head!, dropped.tail!)
let hdExpr := toExpr hdChar
let tlExpr := toExpr tlList
-- cons_ne_nil hd tl : hd :: tl ≠ []
let consNeNil := mkApp3 (mkConst ``List.cons_ne_nil [.zero]) charConst hdExpr tlExpr
let dropNeProof :=
if l₁.length ≤ l₂.length then
-- l₁ is shorter: drop l₁ = [], drop l₂ = hd :: tl, need [] ≠ hd :: tl
mkApp4 (mkConst ``Ne.symm [.succ .zero]) listCharTy
(mkApp3 (mkConst ``List.cons [.zero]) charConst hdExpr tlExpr)
(mkApp (mkConst ``List.nil [.zero]) charConst)
consNeNil
else
-- l₂ is shorter: drop l₁ = hd :: tl, drop l₂ = [], need hd :: tl ≠ []
consNeNil
pure <| mkApp4 (mkConst ``mt) listEq dropEq congrArgFn dropNeProof
-- strNeProof : String.ofList l₁ ≠ String.ofList l₂ via mt ofList_injective listNeProof
let strType := mkConst ``String
let strEq := mkApp3 (mkConst ``Eq [.succ .zero]) strType
(mkApp (mkConst ``String.ofList) l₁Expr) (mkApp (mkConst ``String.ofList) l₂Expr)
return mkApp4 (mkConst ``mt) strEq listEq
(mkApp2 (mkConst ``String.ofList_injective) l₁Expr l₂Expr) listNeProof
end Lean.Meta
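For illustration, the term produced by the first strategy corresponds to a hand-written proof of roughly this shape (a sketch only, assuming the `String.ofList_injective`, `List.get!Internal`, and `Nat.ne_of_beq_eq_false` lemmas referenced above elaborate with these implicit arguments):

```lean
-- "ab" and "ac" differ at index 1 ('b'.toNat = 98 vs 'c'.toNat = 99):
-- project the list equality down to Nat so the kernel only evaluates Nat.beq.
example : String.ofList ['a', 'b'] ≠ String.ofList ['a', 'c'] :=
  mt (String.ofList_injective _ _) <|
    mt (congrArg fun l => (List.get!Internal l 1).toNat)
       (Nat.ne_of_beq_eq_false rfl)
```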

View File

@@ -30,12 +30,12 @@ then returns `f`. Otherwise, returns `e`.
Returns the original expression when not reducible to enable pointer equality checks.
-/
public def etaReduce (e : Expr) : Expr :=
go e 0
if e.isLambda then go e 0 else e
where
go (body : Expr) (n : Nat) : Expr :=
match body with
| .lam _ _ b _ => go b (n+1)
| _ => if n == 0 then e else etaReduceAux body n 0 e
| _ => etaReduceAux body n 0 e
/-- Returns `true` if `e` can be eta-reduced. Uses pointer equality for efficiency. -/
public def isEtaReducible (e : Expr) : Bool :=

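The pointer-equality contract described above relies on eta-reduction producing a term definitionally equal to the input; eta is definitional in Lean 4, as a quick sanity check shows:

```lean
-- Eta-expansion is definitionally transparent, so rewriting `fun x => f x`
-- to `f` is sound; non-lambdas are now returned unchanged, letting callers
-- detect "no change" via pointer equality.
example (f : Nat → Nat) : (fun x => f x) = f := rfl
```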
View File

@@ -17,7 +17,7 @@ import Lean.Meta.Sym.ProofInstInfo
import Lean.Meta.Sym.AlphaShareBuilder
import Lean.Meta.Sym.Offset
import Lean.Meta.Sym.Eta
import Lean.Meta.AbstractMVars
import Lean.Meta.Sym.Util
import Init.Data.List.MapIdx
import Init.Data.Nat.Linear
import Std.Do.Triple.Basic
@@ -31,7 +31,9 @@ framework (`Sym`). The design prioritizes performance by using a two-phase appro
# Phase 1 (Syntactic Matching)
- Patterns use de Bruijn indices for expression variables and renamed level params (`_uvar.0`, `_uvar.1`, ...) for universe variables
- Matching is purely structural after reducible definitions are unfolded during preprocessing
- Universe levels treat `max` and `imax` as uninterpreted functions (no AC reasoning)
- Universe levels are eagerly normalized in patterns and normalized on the target side during matching
- `tryApproxMaxMax` handles `max` commutativity when one argument matches structurally. It is relevant
for constraints such as `max ?u a =?= max a b`
- Binders and term metavariables are deferred to Phase 2
# Phase 2 (Pending Constraints)
@@ -107,6 +109,7 @@ def preprocessDeclPattern (declName : Name) : MetaM (List Name × Expr) := do
let us := levelParams.map mkLevelParam
let type ← instantiateTypeLevelParams info.toConstantVal us
let type ← preprocessType type
let type ← normalizeLevels type
return (levelParams, type)
def preprocessExprPattern (e : Expr) (levelParams₀ : List Name) : MetaM (List Name × Expr) := do
@@ -115,6 +118,7 @@ def preprocessExprPattern (e : Expr) (levelParams₀ : List Name) : MetaM (List
let us := levelParams.map mkLevelParam
let type := type.instantiateLevelParams levelParams₀ us
let type ← preprocessType type
let type ← normalizeLevels type
return (levelParams, type)
/--
@@ -261,45 +265,6 @@ public def mkEqPatternFromDecl (declName : Name) : MetaM (Pattern × Expr) := do
let_expr Eq _ lhs rhs := type | throwError "conclusion is not an equality{indentExpr type}"
return (lhs, rhs)
/--
Like `mkPatternCore` but takes a lambda expression instead of a forall type.
Uses `lambdaBoundedTelescope` to open binders and detect instance/proof arguments.
-/
def mkPatternCoreFromLambda (lam : Expr) (levelParams : List Name)
(varTypes : Array Expr) (pattern : Expr) : MetaM Pattern := do
let fnInfos ← mkProofInstInfoMapFor pattern
let checkTypeMask := mkCheckTypeMask pattern varTypes.size
let checkTypeMask? := if checkTypeMask.all (· == false) then none else some checkTypeMask
let varInfos? ← lambdaBoundedTelescope lam varTypes.size fun xs _ =>
mkProofInstArgInfo? xs
return { levelParams, varTypes, pattern, fnInfos, varInfos?, checkTypeMask? }
def mkPatternFromLambda (levelParams : List Name) (lam : Expr) : MetaM Pattern := do
let rec go (pattern : Expr) (varTypes : Array Expr) : MetaM Pattern := do
if let .lam _ d b _ := pattern then
return (← go b (varTypes.push d))
let pattern ← preprocessType pattern
mkPatternCoreFromLambda lam levelParams varTypes pattern
go lam #[]
/--
Creates a `Pattern` from an expression containing metavariables.
Metavariables in `e` become pattern variables (wildcards). For example,
`Nat.succ ?m` produces a pattern matching `Nat.succ _` with discrimination
tree keys `[Nat.succ, *]`.
This is used for user-registered simproc patterns where the user provides
an expression with underscores (elaborated as metavariables) to specify
what the simproc should match.
-/
public def mkSimprocPatternFromExpr (e : Expr) : MetaM Pattern := do
let result ← abstractMVars e
let levelParams := result.paramNames.mapIdx fun i _ => Name.num uvarPrefix i
let us := levelParams.toList.map mkLevelParam
let expr := result.expr.instantiateLevelParamsArray result.paramNames us.toArray
mkPatternFromLambda levelParams.toList expr
structure UnifyM.Context where
pattern : Pattern
unify : Bool := true
@@ -365,35 +330,42 @@ def assignLevel (uidx : Nat) (u : Level) : UnifyM Bool := do
modify fun s => { s with uAssignment := s.uAssignment.set! uidx (some u) }
return true
def processLevel (u : Level) (v : Level) : UnifyM Bool := do
match u, v with
| .zero, .zero => return true
| .succ u, .succ v => processLevel u v
| .zero, .succ _ => return false
| .succ _, .zero => return false
| .zero, .max v₁ v₂ => processLevel .zero v₁ <&&> processLevel .zero v₂
| .max u₁ u₂, .zero => processLevel u₁ .zero <&&> processLevel u₂ .zero
| .zero, .imax _ v => processLevel .zero v
| .imax _ u, .zero => processLevel u .zero
| .max u₁ u₂, .max v₁ v₂ => processLevel u₁ v₁ <&&> processLevel u₂ v₂
| .imax u₁ u₂, .imax v₁ v₂ => processLevel u₁ v₁ <&&> processLevel u₂ v₂
| .param uName, _ =>
if let some uidx := isUVar? uName then
assignLevel uidx v
else if u == v then
return true
else if v.isMVar && (← read).unify then
pushLevelPending u v
return true
else
return false
| .mvar _, _ | _, .mvar _ =>
if (← read).unify then
pushLevelPending u v
return true
else
return false
| _, _ => return false
def processLevel (u : Level) (v : Level) : UnifyM Bool :=
go u v.normalize
where
go (u : Level) (v : Level) : UnifyM Bool := do
match u, v with
| .zero, .zero => return true
| .succ u, .succ v => go u v
| .zero, .succ _ => return false
| .succ _, .zero => return false
| .zero, .max v₁ v₂ => go .zero v₁ <&&> go .zero v₂
| .max u₁ u₂, .zero => go u₁ .zero <&&> go u₂ .zero
| .zero, .imax _ v => go .zero v
| .imax _ u, .zero => go u .zero
| .max u₁ u₂, .max v₁ v₂ =>
-- tryApproxMaxMax: find a shared concrete arg between both sides
if u₂ == v₁ then go u₁ v₂
else if u₁ == v₂ then go u₂ v₁
else go u₁ v₁ <&&> go u₂ v₂
| .imax u₁ u₂, .imax v₁ v₂ => go u₁ v₁ <&&> go u₂ v₂
| .param uName, _ =>
if let some uidx := isUVar? uName then
assignLevel uidx v
else if u == v then
return true
else if v.isMVar && (← read).unify then
pushLevelPending u v
return true
else
return false
| .mvar _, _ | _, .mvar _ =>
if (← read).unify then
pushLevelPending u v
return true
else
return false
| _, _ => return false
def processLevels (us : List Level) (vs : List Level) : UnifyM Bool := do
match us, vs with
@@ -542,7 +514,7 @@ def tryAssignLevelMVar (u : Level) (v : Level) : MetaM Bool := do
/--
Structural definitional equality for universe levels.
Treats `max` and `imax` as uninterpreted functions (no AC reasoning).
Uses `tryApproxMaxMax` to handle `max` commutativity when one argument matches structurally.
Attempts metavariable assignment in both directions if structural matching fails.
-/
def isLevelDefEqS (u : Level) (v : Level) : MetaM Bool := do
@@ -556,7 +528,10 @@ def isLevelDefEqS (u : Level) (v : Level) : MetaM Bool := do
| .max u₁ u₂, .zero => isLevelDefEqS u₁ .zero <&&> isLevelDefEqS u₂ .zero
| .zero, .imax _ v => isLevelDefEqS .zero v
| .imax _ u, .zero => isLevelDefEqS u .zero
| .max u₁ u₂, .max v₁ v₂ => isLevelDefEqS u₁ v₁ <&&> isLevelDefEqS u₂ v₂
| .max u₁ u₂, .max v₁ v₂ =>
if u₂ == v₁ then isLevelDefEqS u₁ v₂
else if u₁ == v₂ then isLevelDefEqS u₂ v₁
else isLevelDefEqS u₁ v₁ <&&> isLevelDefEqS u₂ v₂
| .imax u₁ u₂, .imax v₁ v₂ => isLevelDefEqS u₁ v₁ <&&> isLevelDefEqS u₂ v₂
| _, _ => tryAssignLevelMVar u v <||> tryAssignLevelMVar v u
@@ -598,6 +573,7 @@ structure DefEqM.Context where
abbrev DefEqM := ReaderT DefEqM.Context SymM
set_option compiler.ignoreBorrowAnnotation true in
/--
Structural definitional equality. It is much cheaper than `isDefEq`.

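The `tryApproxMaxMax` cases above exploit that universe `max` is commutative up to definitional equality, which is what makes assigning `?u := b` for `max ?u a =?= max a b` sound. A minimal illustration:

```lean
universe u v
-- `max u v` and `max v u` normalize to the same level, so the sorts are
-- definitionally equal; matching `max` arguments cross-wise is sound.
example : Sort (max u v) = Sort (max v u) := rfl
```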
View File

@@ -8,6 +8,7 @@ prelude
public import Lean.Meta.Sym.Pattern
public import Lean.Meta.DiscrTree.Basic
import Lean.Meta.Sym.Offset
import Lean.Meta.Sym.Eta
import Init.Omega
namespace Lean.Meta.Sym
open DiscrTree
@@ -154,7 +155,7 @@ partial def getMatchLoop (todo : Array Expr) (c : Trie α) (result : Array α) :
else if h : csize = 0 then
result
else
let e := todo.back!
let e := etaReduce todo.back!
let todo := todo.pop
let first := cs[0] /- Recall that `Key.star` is the minimal key -/
if csize = 1 then
@@ -184,6 +185,7 @@ public def getMatch (d : DiscrTree α) (e : Expr) : Array α :=
let result := match d.root.find? .star with
| none => .mkEmpty initCapacity
| some (.node vs _) => vs
let e := etaReduce e
match d.root.find? (getKey e) with
| none => result
| some c => getMatchLoop (pushArgsTodo #[] e) c result
@@ -196,6 +198,7 @@ This is useful for rewriting: if a pattern matches `f x` but `e` is `f x y z`, w
still apply the rewrite and return `(value, 2)` indicating 2 extra arguments.
-/
public partial def getMatchWithExtra (d : DiscrTree α) (e : Expr) : Array (α × Nat) :=
let e := etaReduce e
let result := getMatch d e
let result := result.map (·, 0)
if !e.isApp then

View File

@@ -8,6 +8,7 @@ prelude
public import Lean.Meta.Sym.Simp.SimpM
import Init.Sym.Lemmas
import Lean.Meta.Sym.LitValues
import Lean.Meta.StringLitProof
namespace Lean.Meta.Sym.Simp
/-!
@@ -376,6 +377,25 @@ def evalFinPred (n : Expr) (trueThm falseThm : Expr) (op : {n : Nat} → Fin n
open Lean.Sym
/--
Evaluates `@Eq String a b` for string literal arguments, producing kernel-efficient proofs.
When equal, uses `eq_self` (no kernel evaluation needed). When different, uses
`mkStringLitNeProof` which finds the first differing character position and proves
inequality via `congrArg (List.get?Internal · i)`.
-/
private def evalStringEq (a b : Expr) : SimpM Result := do
let some va := getStringValue? a | return .rfl
let some vb := getStringValue? b | return .rfl
if va = vb then
let e ← getTrueExpr
return .step e (mkApp2 (mkConst ``eq_self [.succ .zero]) (mkConst ``String) a) (done := true)
else
let neProof ← mkStringLitNeProof va vb
let proof := mkApp2 (mkConst ``eq_false)
(mkApp3 (mkConst ``Eq [.succ .zero]) (mkConst ``String) a b) neProof
let e ← getFalseExpr
return .step e proof (done := true)
def evalLT (α : Expr) (a b : Expr) : SimpM Result :=
match_expr α with
| Nat => evalBinPred getNatValue? (mkConst ``Nat.lt_eq_true) (mkConst ``Nat.lt_eq_false) (. < .) a b
@@ -434,7 +454,7 @@ def evalEq (α : Expr) (a b : Expr) : SimpM Result :=
| Fin n => evalFinPred n (mkConst ``Fin.eq_eq_true) (mkConst ``Fin.eq_eq_false) (. = .) a b
| BitVec n => evalBitVecPred n (mkConst ``BitVec.eq_eq_true) (mkConst ``BitVec.eq_eq_false) (. = .) a b
| Char => evalBinPred getCharValue? (mkConst ``Char.eq_eq_true) (mkConst ``Char.eq_eq_false) (. = .) a b
| String => evalBinPred getStringValue? (mkConst ``String.eq_eq_true) (mkConst ``String.eq_eq_false) (. = .) a b
| String => evalStringEq a b
| _ => return .rfl
def evalDvd (α : Expr) (a b : Expr) : SimpM Result :=

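`evalStringEq` exists to avoid the kernel evaluating whole string comparisons. For contrast, the statements it targets can also be closed by brute-force `decide`, which performs exactly the kernel evaluation the positional proof avoids:

```lean
-- `decide` makes the kernel evaluate `"abc" = "abd"` by full decidable
-- instance reduction; the simproc above instead proves inequality from
-- the first differing character position.
example : "abc" ≠ "abd" := by decide
```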
View File

@@ -227,6 +227,7 @@ def SimpM.run' (x : SimpM α) (methods : Methods := {}) (config : Config := {})
let initialLCtxSize := (← getLCtx).decls.size
x methods.toMethodsRef { initialLCtxSize, config } |>.run' {}
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_sym_simp"] -- Forward declaration
opaque simp : Simproc

View File

@@ -109,4 +109,23 @@ where
public def _root_.Lean.MVarId.checkMaxShared (mvarId : MVarId) (msg := "") : SymM Unit := do
(← mvarId.getDecl).type.checkMaxShared msg
/-- Quick filter for checking whether we can skip `normalizeLevels`. -/
def levelsAlreadyNormalized (e : Expr) : Bool :=
Option.isNone <| e.find? fun
| .const _ us => us.any (! ·.isAlreadyNormalizedCheap)
| .sort u => !u.isAlreadyNormalizedCheap
| _ => false
/--
Normalizes universe levels in constants and sorts.
-/
public def normalizeLevels (e : Expr) : CoreM Expr := do
if levelsAlreadyNormalized e then return e
let pre (e : Expr) := do
match e with
| .sort u => return .done <| e.updateSort! u.normalize
| .const _ us => return .done <| e.updateConst! (us.map Level.normalize)
| _ => return .continue
Core.transform e (pre := pre)
end Lean.Meta.Sym

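Normalizing levels in place is sound because `Level.normalize` preserves definitional equality; for example `max u u` collapses to `u`:

```lean
universe u
-- `max u u` normalizes to `u`, so the two sorts are definitionally equal;
-- `normalizeLevels` may rewrite one to the other without changing meaning.
example : Sort (max u u) = Sort u := rfl
```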
View File

@@ -15,7 +15,11 @@ public section
namespace Lean
builtin_initialize registerTraceClass `Meta.Tactic.cbv
builtin_initialize registerTraceClass `Meta.Tactic.cbv.rewrite (inherited := true)
builtin_initialize registerTraceClass `Meta.Tactic.cbv.unfold (inherited := true)
builtin_initialize registerTraceClass `Meta.Tactic.cbv.controlFlow (inherited := true)
builtin_initialize registerTraceClass `Meta.Tactic.cbv.simprocs (inherited := true)
builtin_initialize registerTraceClass `Debug.Meta.Tactic.cbv
builtin_initialize registerTraceClass `Debug.Meta.Tactic.cbv.simprocs (inherited := true)
builtin_initialize registerTraceClass `Debug.Meta.Tactic.cbv.reduce (inherited := true)
end Lean

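The new trace classes can then be enabled per declaration. A sketch, assuming the `cbv` tactic frontend from this changeset (whether `cbv` closes the resulting trivial goal on its own is an assumption here):

```lean
set_option trace.Meta.Tactic.cbv.unfold true in
set_option trace.Meta.Tactic.cbv.simprocs true in
example : List.length [1, 2, 3] = 3 := by cbv
```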
View File

@@ -256,16 +256,20 @@ def cbvSimprocDispatch (tree : DiscrTree CbvSimprocEntry)
(erased : PHashSet Name) : Simproc := fun e => do
let candidates := Sym.getMatchWithExtra tree e
if candidates.isEmpty then
trace[Debug.Meta.Tactic.cbv.simprocs] "no cbv simprocs found for {e}"
return .rfl
for (entry, numExtra) in candidates do
unless erased.contains entry.declName do
let result ← if numExtra == 0 then
entry.proc e
else
simpOverApplied e numExtra entry.proc
let simprocName := (privateToUserName entry.declName).replacePrefix `Lean.Meta.Sym.Simp .anonymous |>.replacePrefix `Lean.Meta.Tactic.Cbv .anonymous
let result ← withTraceNode `Meta.Tactic.cbv.simprocs (fun
| .ok (Result.step e' ..) => return m!"simproc {simprocName}:{indentExpr e}\n==>{indentExpr e'}"
| .ok (Result.rfl true) => return m!"simproc {simprocName}: done{indentExpr e}"
| .ok _ => return m!"simproc {simprocName}: no change"
| .error err => return m!"simproc {simprocName}: {err.toMessageData}") do
if numExtra == 0 then
entry.proc e
else
simpOverApplied e numExtra entry.proc
if result matches .step _ _ _ then
trace[Debug.Meta.Tactic.cbv.simprocs] "cbv simproc `{entry.declName}` result {e} => {result.getResultExpr e}"
return result
if result matches .rfl (done := true) then
return result

View File

@@ -176,7 +176,7 @@ builtin_cbv_simproc ↓ simpDIteCbv (@dite _ _ _ _ _) := fun e => do
let b ← share <| b.betaRev #[h']
return .step b <| mkApp (e.replaceFn ``dite_cond_eq_false) h
else
-- If we get stuck after simplifying `p` to `p'`, then we try to evaluate the original instance
-- If we get stuck after simplifying `p` to `p'`, then we try to evaluate the original instance
simpAndMatchDIteDecidable f α c inst a b do
-- Otherwise, we make `Decidable c'` instance and try to evaluate it instead
let inst' := mkApp4 (mkConst ``decidable_of_decidable_of_eq) c c' inst h
@@ -286,6 +286,7 @@ builtin_cbv_simproc ↓ simpCbvCond (@cond _ _ _) := simpCond
public def reduceRecMatcher : Simproc := fun e => do
if let some e' withCbvOpaqueGuard <| reduceRecMatcher? e then
trace[Meta.Tactic.cbv.rewrite] "recMatcher:{indentExpr e}\n==>{indentExpr e'}"
return .step e' (← Sym.mkEqRefl e')
else
return .rfl
@@ -306,9 +307,12 @@ public def tryMatcher : Simproc := fun e => do
let some info ← getMatcherInfo? appFn | return .rfl
let start := info.numParams + 1
let stop := start + info.numDiscrs
(simpAppArgRange · start stop)
let result ← (simpAppArgRange · start stop)
>> tryMatchEquations appFn
<|> reduceRecMatcher
<| e
if let .step e' .. := result then
trace[Meta.Tactic.cbv.controlFlow] "match `{appFn}`:{indentExpr e}\n==>{indentExpr e'}"
return result
end Lean.Meta.Tactic.Cbv

View File

@@ -70,17 +70,20 @@ There are also places where we deviate from strict call-by-value semantics:
## Attributes
- `@[cbv_opaque]`: prevents `cbv` from unfolding a definition. The constant is
returned as-is without attempting any equation or unfold theorems.
- `@[cbv_opaque]`: prevents `cbv` from unfolding a definition. Equation theorems,
unfold theorems, and kernel reduction are all suppressed. However, `@[cbv_eval]`
rules can still fire on an `@[cbv_opaque]` constant, allowing users to provide
custom rewrite rules without exposing the full definition.
- `@[cbv_eval]`: registers a theorem as a custom rewrite rule for `cbv`. The
theorem must be an unconditional equality whose LHS is an application of a
constant. Use `@[cbv_eval ←]` to rewrite right-to-left. These rules are tried
before equation theorems, so they can be used together with `@[cbv_opaque]` to
replace the default unfolding behavior with a controlled set of evaluation rules.
before equation theorems and can override `@[cbv_opaque]`.
## Unfolding order
For a constant application, `handleApp` tries in order:
For a constant application, `handleApp` first checks `@[cbv_opaque]`. If the
constant is opaque, only `@[cbv_eval]` rewrite rules are attempted; the result
is marked done regardless of whether a rule fires. Otherwise it tries in order:
1. `@[cbv_eval]` rewrite rules
2. Equation theorems (e.g. `foo.eq_1`, `foo.eq_2`)
3. Unfold equations
@@ -112,70 +115,77 @@ def tryEquations : Simproc := fun e => do
return .rfl
let some appFn := e.getAppFn.constName? | return .rfl
let thms ← getEqnTheorems appFn
Simproc.tryCatch (thms.rewrite (d := dischargeNone)) e
let result ← Simproc.tryCatch (thms.rewrite (d := dischargeNone)) e
if let .step e' .. := result then
trace[Meta.Tactic.cbv.rewrite] "equation `{appFn}`:{indentExpr e}\n==>{indentExpr e'}"
return result
def tryUnfold : Simproc := fun e => do
unless e.isApp do
return .rfl
let some appFn := e.getAppFn.constName? | return .rfl
let some thm ← getUnfoldTheorem appFn | return .rfl
Simproc.tryCatch (fun e => Theorem.rewrite thm e) e
/-- Try equation theorems, then unfold equations. Skip `@[cbv_opaque]` constants. -/
def handleConstApp : Simproc := fun e => do
if (← isCbvOpaque e.getAppFn.constName!) then
return .rfl (done := true)
else
tryEquations <|> tryUnfold <| e
let result ← Simproc.tryCatch (fun e => Theorem.rewrite thm e) e
if let .step e' .. := result then
trace[Meta.Tactic.cbv.unfold] "unfold `{appFn}`:{indentExpr e}\n==>{indentExpr e'}"
return result
def betaReduce : Simproc := fun e => do
-- TODO: Improve term sharing
let new := e.headBeta
let new ← Sym.share new
trace[Debug.Meta.Tactic.cbv.reduce] "beta:{indentExpr e}\n==>{indentExpr new}"
return .step new (← Sym.mkEqRefl new)
def tryCbvTheorems : Simproc := fun e => do
let some fnName := e.getAppFn.constName? | return .rfl
let some evalLemmas ← getCbvEvalLemmas fnName | return .rfl
Simproc.tryCatch (Theorems.rewrite evalLemmas (d := dischargeNone)) e
let result ← Simproc.tryCatch (Theorems.rewrite evalLemmas (d := dischargeNone)) e
if let .step e' .. := result then
trace[Meta.Tactic.cbv.rewrite] "@[cbv_eval] `{fnName}`:{indentExpr e}\n==>{indentExpr e'}"
return result
/-- Try equation theorems, then unfold equations. -/
def handleConstApp : Simproc := fun e => do
tryEquations <|> tryUnfold <| e
/--
Post-pass handler for applications. For a constant-headed application, tries
`@[cbv_eval]` rules, then equation/unfold theorems, then `reduceRecMatcher`.
For a lambda-headed application, beta-reduces.
Post-pass handler for applications. For a constant-headed application, if the
constant is `@[cbv_opaque]`, only `@[cbv_eval]` rules are tried (and the result
is marked done). Otherwise tries `@[cbv_eval]` rules, equation/unfold theorems,
and `reduceRecMatcher`. For a lambda-headed application, beta-reduces.
-/
def handleApp : Simproc := fun e => do
unless e.isApp do return .rfl
let fn := e.getAppFn
match fn with
| .const constName _ =>
if (← isCbvOpaque constName) then
return (← tryCbvTheorems e).markAsDone
let info ← getConstInfo constName
tryCbvTheorems <|> (guardSimproc (fun _ => info.hasValue) handleConstApp) <|> reduceRecMatcher <| e
| .lam .. => betaReduce e
| _ => return .rfl
def isOpaqueConst : Simproc := fun e => do
def handleOpaqueConst : Simproc := fun e => do
let .const constName _ := e | return .rfl
let hasTheorems := (← getCbvEvalLemmas constName).isSome
if hasTheorems then
let res ← (tryCbvTheorems) e
match res with
| .rfl false =>
return .rfl
| _ => return res
else
return .rfl (← isCbvOpaque constName)
if (← isCbvOpaque constName) then
return (← tryCbvTheorems e).markAsDone
return .rfl
def foldLit : Simproc := fun e => do
let some n := e.rawNatLit? | return .rfl
-- TODO: check performance of sharing
return .step (← Sym.share <| mkNatLit n) (← Sym.mkEqRefl e)
let some n := e.rawNatLit? | return .rfl
-- TODO: check performance of sharing
let new ← Sym.share <| mkNatLit n
trace[Debug.Meta.Tactic.cbv.reduce] "foldLit: {e} ==> {new}"
return .step new (← Sym.mkEqRefl e)
def zetaReduce : Simproc := fun e => do
let .letE _ _ value body _ := e | return .rfl
let new := expandLet body #[value]
-- TODO: Improve sharing
let new ← Sym.share new
trace[Debug.Meta.Tactic.cbv.reduce] "zeta:{indentExpr e}\n==>{indentExpr new}"
return .step new (← Sym.mkEqRefl new)
/--
@@ -186,6 +196,11 @@ the original and rewritten struct are definitionally equal, falls back to `HCong
-/
def handleProj : Simproc := fun e => do
let Expr.proj typeName idx struct := e | return .rfl
withTraceNode `Debug.Meta.Tactic.cbv.reduce (fun
| .ok (Result.step e' ..) => return m!"proj `{typeName}`.{idx}:{indentExpr e}\n==>{indentExpr e'}"
| .ok (Result.rfl true) => return m!"proj `{typeName}`.{idx}: stuck{indentExpr e}"
| .ok _ => return m!"proj `{typeName}`.{idx}: no change"
| .error err => return m!"proj `{typeName}`.{idx}: {err.toMessageData}") do
-- We recursively simplify the projection
let res ← simp struct
match res with
@@ -241,6 +256,7 @@ def simplifyAppFn : Simproc := fun e => do
let congrArgFun := Lean.mkLambda `x .default newType (mkAppN (.bvar 0) e.getAppArgs)
let newValue ← mkAppNS e' e.getAppArgs
let newProof ← mkCongrArg congrArgFun proof
trace[Debug.Meta.Tactic.cbv.reduce] "simplifyAppFn:{indentExpr e}\n==>{indentExpr newValue}"
return .step newValue newProof
def handleConst : Simproc := fun e => do
@@ -253,6 +269,7 @@ def handleConst : Simproc := fun e => do
unless info.hasValue && info.levelParams.length == lvls.length do return .rfl
let fBody ← instantiateValueLevelParams info lvls
let eNew ← Sym.share fBody
trace[Meta.Tactic.cbv.unfold] "const `{n}`:{indentExpr e}\n==>{indentExpr eNew}"
return .step eNew (← Sym.mkEqRefl eNew)
/--
@@ -265,7 +282,7 @@ def cbvPreStep : Simproc := fun e => do
match e with
| .lit .. => foldLit e
| .proj .. => handleProj e
| .const .. => isOpaqueConst >> handleConst <| e
| .const .. => handleOpaqueConst >> (tryCbvTheorems <|> handleConst) <| e
| .app .. => tryMatcher <|> simplifyAppFn <| e
| .letE .. =>
if e.letNondep! then
@@ -296,7 +313,10 @@ def cbvCore (e : Expr) (config : Sym.Simp.Config := {}) : Sym.SymM Result := do
/-- Reduce a single expression. Unfolds reducibles, shares subterms, then runs the
simplifier with `cbvPre`/`cbvPost`. Used by `conv => cbv`. -/
public def cbvEntry (e : Expr) : MetaM Result := do
trace[Meta.Tactic.cbv] "Called cbv tactic to simplify {e}"
withTraceNode `Meta.Tactic.cbv (fun
| .ok (Result.step e' ..) => return m!"cbv:{indentExpr e}\n==>{indentExpr e'}"
| .ok (Result.rfl ..) => return m!"cbv: no change{indentExpr e}"
| .error err => return m!"cbv: {err.toMessageData}") do
let simprocs ← getCbvSimprocs
let config : Sym.Simp.Config := { maxSteps := cbv.maxSteps.get (← getOptions) }
let methods := mkCbvMethods simprocs
@@ -330,7 +350,11 @@ public def cbvGoal (mvarId : MVarId) (simplifyTarget : Bool := true) (fvarIdsToS
for fvarId in fvarIdsToSimp do
let localDecl ← fvarId.getDecl
let type := localDecl.type
let result ← cbvCore type config
let result ← withTraceNode `Meta.Tactic.cbv (fun
| .ok (Result.step type' ..) => return m!"hypothesis `{localDecl.userName}`:{indentExpr type}\n==>{indentExpr type'}"
| .ok (Result.rfl ..) => return m!"hypothesis `{localDecl.userName}`: no change"
| .error err => return m!"hypothesis `{localDecl.userName}`: {err.toMessageData}") do
cbvCore type config
match result with
| .rfl _ => pure ()
| .step type' proof _ =>
@@ -344,7 +368,11 @@ public def cbvGoal (mvarId : MVarId) (simplifyTarget : Bool := true) (fvarIdsToS
-- Process target
if simplifyTarget then
let target ← mvarIdNew.getType
let result ← cbvCore target config
let result ← withTraceNode `Meta.Tactic.cbv (fun
| .ok (Result.step target' ..) => return m!"target:{indentExpr target}\n==>{indentExpr target'}"
| .ok (Result.rfl ..) => return m!"target: no change"
| .error err => return m!"target: {err.toMessageData}") do
cbvCore target config
match result with
| .rfl _ => pure ()
| .step target' proof _ =>
@@ -370,6 +398,9 @@ Attempt to close a goal of the form `decide P = true` by reducing only the LHS u
- Otherwise, throws a user-friendly error showing where the reduction got stuck.
-/
public def cbvDecideGoal (m : MVarId) : MetaM Unit := do
withTraceNode `Meta.Tactic.cbv (fun
| .ok () => return m!"decide_cbv: closed goal"
| .error err => return m!"decide_cbv: {err.toMessageData}") do
let config : Sym.Simp.Config := { maxSteps := cbv.maxSteps.get (← getOptions) }
Sym.SymM.run do
let m ← Sym.preprocessMVar m
@@ -377,6 +408,7 @@ public def cbvDecideGoal (m : MVarId) : MetaM Unit := do
let some (_, lhs, _) := mType.eq? |
throwError "`decide_cbv`: expected goal of the form `decide _ = true`, got: {indentExpr mType}"
let result ← cbvCore lhs config
trace[Meta.Tactic.cbv] "decide_cbv:{indentExpr lhs}\n==>{indentExpr (result.getResultExpr lhs)}"
let checkResult (e : Expr) (onTrue : Sym.SymM Unit) : Sym.SymM Unit := do
if (← Sym.isBoolTrueExpr e) then
onTrue

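The interplay between `@[cbv_opaque]` and `@[cbv_eval]` documented above can be sketched as follows (`myOp` and `myOp_eval` are hypothetical names for illustration):

```lean
-- `cbv` never unfolds `myOp`'s body, because it is marked opaque...
@[cbv_opaque] def myOp (n : Nat) : Nat := n + 1

-- ...but this `@[cbv_eval]` rule still fires on `myOp` applications,
-- giving a controlled rewrite without exposing the definition.
@[cbv_eval] theorem myOp_eval (n : Nat) : myOp n = n + 1 := rfl
```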
View File

@@ -315,7 +315,7 @@ partial def foldAndCollect (oldIH newIH : FVarId) (isRecCall : Expr → Option E
-- statement and the inferred alt types
let dummyGoal := mkConst ``True []
mkArrow eTypeAbst dummyGoal)
(onAlt := fun _altIdx altType _altParams alt => do
(onAlt := fun _altIdx altType _altFVars alt => do
lambdaTelescope1 alt fun oldIH' alt => do
forallBoundedTelescope altType (some 1) fun newIH' _goal' => do
let #[newIH'] := newIH' | unreachable!
@@ -333,7 +333,7 @@ partial def foldAndCollect (oldIH newIH : FVarId) (isRecCall : Expr → Option E
(onMotive := fun _motiveArgs motiveBody => do
let some (_extra, body) := motiveBody.arrow? | throwError "motive not an arrow"
M.eval (foldAndCollect oldIH newIH isRecCall body))
(onAlt := fun _altIdx altType _altParams alt => do
(onAlt := fun _altIdx altType _altFVars alt => do
lambdaTelescope1 alt fun oldIH' alt => do
-- We don't have suitable newIH around here, but we don't care since
-- we just want to fold calls. So lets create a fake one.
@@ -691,7 +691,7 @@ partial def buildInductionBody (toErase toClear : Array FVarId) (goal : Expr)
(addEqualities := true)
(onParams := (foldAndCollect oldIH newIH isRecCall ·))
(onMotive := fun xs _body => pure (absMotiveBody.beta (Array.mask mask xs)))
(onAlt := fun altIdx expAltType _altParams alt => M2.branch do
(onAlt := fun altIdx expAltType _altFVars alt => M2.branch do
lambdaTelescope1 alt fun oldIH' alt => do
forallBoundedTelescope expAltType (some 1) fun newIH' goal' => do
let #[newIH'] := newIH' | unreachable!
@@ -714,7 +714,7 @@ partial def buildInductionBody (toErase toClear : Array FVarId) (goal : Expr)
(addEqualities := true)
(onParams := (foldAndCollect oldIH newIH isRecCall ·))
(onMotive := fun xs _body => pure (absMotiveBody.beta (Array.mask mask xs)))
(onAlt := fun altIdx expAltType _altParams alt => M2.branch do
(onAlt := fun altIdx expAltType _altFVars alt => M2.branch do
withRewrittenMotiveArg expAltType (rwMatcher altIdx) fun expAltType' =>
buildInductionBody toErase toClear expAltType' oldIH newIH isRecCall alt)
return matcherApp'.toExpr

View File

@@ -229,6 +229,7 @@ private inductive MulEqProof where
| mulVar (k : Int) (a : Expr) (h : Expr)
| none
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_cutsat_eq_cnstr_to_proof"] -- forward definition
private opaque EqCnstr.toExprProof (c' : EqCnstr) : ProofM Expr

View File

@@ -41,6 +41,7 @@ def inconsistent : GoalM Bool := do
if (← isInconsistent) then return true
return (← get').conflict?.isSome
set_option compiler.ignoreBorrowAnnotation true in
/-- Creates a new variable in the cutsat module. -/
@[extern "lean_grind_cutsat_mk_var"] -- forward definition
opaque mkVar (e : Expr) : GoalM Var
@@ -62,6 +63,7 @@ def isIntTerm (e : Expr) : GoalM Bool :=
def eliminated (x : Var) : GoalM Bool :=
return (← get').elimEqs[x]!.isSome
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_grind_cutsat_assert_eq"] -- forward definition
opaque EqCnstr.assert (c : EqCnstr) : GoalM Unit
@@ -114,6 +116,7 @@ def DiseqCnstr.throwUnexpected (c : DiseqCnstr) : GoalM α := do
def DiseqCnstr.denoteExpr (c : DiseqCnstr) : GoalM Expr := do
return mkNot (mkIntEq (← c.p.denoteExpr') (mkIntLit 0))
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_grind_cutsat_assert_le"] -- forward definition
opaque LeCnstr.assert (c : LeCnstr) : GoalM Unit

View File

@@ -12,6 +12,7 @@ import Lean.Meta.IntInstTesters
public section
namespace Lean.Meta.Grind.Arith.Cutsat
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_cutsat_propagate_nonlinear"]
opaque propagateNonlinearTerm (y : Var) (x : Var) : GoalM Bool

View File

@@ -109,7 +109,7 @@ where
let e ← eraseIrrelevantMData e
/- We must fold kernel projections like it is done in the preprocessor. -/
let e ← foldProjs e
normalizeLevels e
Sym.normalizeLevels e
private def markNestedProof (e : Expr) : M Expr := do
let prop ← inferType e

View File

@@ -138,7 +138,7 @@ example : f b = f c → a ≤ f a → f (f a) ≤ f (f (f a)) := by
```
-/
@[builtin_command_parser] def grindPattern := leading_parser
Term.attrKind >> "grind_pattern " >> optional ("[" >> ident >> "]") >> ident >> darrow >> sepBy1 termParser "," >> optional grindPatternCnstrs
Term.attrKind >> "grind_pattern " >> optional ("[" >> ident >> "]") >> ident >> darrow >> sepBy1 termParser ", " >> optional grindPatternCnstrs
@[builtin_command_parser] def initGrindNorm := leading_parser
"init_grind_norm " >> many ident >> "| " >> many ident

View File

@@ -63,7 +63,7 @@ def preprocessImpl (e : Expr) : GoalM Simp.Result := do
let e' ← markNestedSubsingletons e'
let e' ← eraseIrrelevantMData e'
let e' ← foldProjs e'
let e' ← normalizeLevels e'
let e' ← Sym.normalizeLevels e'
let r' ← eraseSimpMatchDiscrsOnly e'
let r ← r.mkEqTrans r'
let e' := r'.expr
@@ -98,6 +98,6 @@ but ensures assumptions made by `grind` are satisfied.
-/
def preprocessLight (e : Expr) : GoalM Expr := do
let e ← instantiateMVars e
shareCommon (← canon (← normalizeLevels (← foldProjs (← eraseIrrelevantMData (← markNestedSubsingletons (← Sym.unfoldReducible e))))))
shareCommon (← canon (← Sym.normalizeLevels (← foldProjs (← eraseIrrelevantMData (← markNestedSubsingletons (← Sym.unfoldReducible e))))))
end Lean.Meta.Grind

View File

@@ -1359,6 +1359,7 @@ partial def getCongrRoot (e : Expr) : GoalM Expr := do
def isInconsistent : GoalM Bool :=
return (← get).inconsistent
set_option compiler.ignoreBorrowAnnotation true in
/--
Returns a proof that `a = b`.
It assumes `a` and `b` are in the same equivalence class, and have the same type.
@@ -1367,6 +1368,7 @@ It assumes `a` and `b` are in the same equivalence class, and have the same type
@[extern "lean_grind_mk_eq_proof"]
opaque mkEqProof (a b : Expr) : GoalM Expr
set_option compiler.ignoreBorrowAnnotation true in
/--
Returns a proof that `a ≍ b`.
It assumes `a` and `b` are in the same equivalence class.
@@ -1376,14 +1378,17 @@ It assumes `a` and `b` are in the same equivalence class.
opaque mkHEqProof (a b : Expr) : GoalM Expr
-- Forward definition
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_grind_process_new_facts"]
opaque processNewFacts : GoalM Unit
-- Forward definition
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_grind_internalize"]
opaque internalize (e : Expr) (generation : Nat) (parent? : Option Expr := none) : GoalM Unit
-- Forward definition
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_grind_preprocess"]
opaque preprocess : Expr → GoalM Simp.Result
@@ -1729,6 +1734,7 @@ def withoutModifyingState (x : GoalM α) : GoalM α := do
finally
set saved
set_option compiler.ignoreBorrowAnnotation true in
/-- Canonicalizes nested types, type formers, and instances in `e`. -/
@[extern "lean_grind_canon"] -- Forward definition
opaque canon (e : Expr) : GoalM Expr

View File

@@ -153,25 +153,7 @@ def foldProjs (e : Expr) : MetaM Expr := do
return .done e
Meta.transform e (post := post)
/-- Quick filter for checking whether we can skip `normalizeLevels`. -/
private def levelsAlreadyNormalized (e : Expr) : Bool :=
Option.isNone <| e.find? fun
| .const _ us => us.any (! ·.isAlreadyNormalizedCheap)
| .sort u => !u.isAlreadyNormalizedCheap
| _ => false
/--
Normalizes universe levels in constants and sorts.
-/
def normalizeLevels (e : Expr) : CoreM Expr := do
if levelsAlreadyNormalized e then return e
let pre (e : Expr) := do
match e with
| .sort u => return .done <| e.updateSort! u.normalize
| .const _ us => return .done <| e.updateConst! (us.map Level.normalize)
| _ => return .continue
Core.transform e (pre := pre)
set_option compiler.ignoreBorrowAnnotation true in
/--
Normalizes the given expression using the `grind` simplification theorems and simprocs.
This function is used for normalizing E-matching patterns. Note that it does not return a proof.

View File

@@ -7,6 +7,7 @@ module
prelude
public import Lean.Meta.Tactic.Simp.BuiltinSimprocs.Char
import Lean.Meta.StringLitProof
public section
@@ -57,6 +58,7 @@ builtin_dsimproc [simp, seval] reduceSingleton (String.singleton _) := fun e =>
let some m ← fromExpr? e.appArg! | return .continue
evalPropStep e (op n m)
@[inline] def reduceBoolPred (declName : Name) (arity : Nat) (op : String → String → Bool) (e : Expr) : SimpM DStep := do
unless e.isAppOfArity declName arity do return .continue
let some n ← fromExpr? e.appFn!.appArg! | return .continue
@@ -67,8 +69,18 @@ builtin_simproc [simp, seval] reduceLT (( _ : String) < _) := reduceBinPred ``
builtin_simproc [simp, seval] reduceLE (( _ : String) ≤ _) := reduceBinPred ``LE.le 4 (. ≤ .)
builtin_simproc [simp, seval] reduceGT (( _ : String) > _) := reduceBinPred ``GT.gt 4 (. > .)
builtin_simproc [simp, seval] reduceGE (( _ : String) ≥ _) := reduceBinPred ``GE.ge 4 (. ≥ .)
builtin_simproc [simp, seval] reduceEq (( _ : String) = _) := reduceBinPred ``Eq 3 (. = .)
builtin_simproc [simp, seval] reduceNe (( _ : String) ≠ _) := reduceBinPred ``Ne 3 (. ≠ .)
builtin_simproc [simp, seval] reduceEq (( _ : String) = _) := fun e => do
unless e.isAppOfArity ``Eq 3 do return .continue
let some n ← fromExpr? e.appFn!.appArg! | return .continue
let some m ← fromExpr? e.appArg! | return .continue
evalEqPropStep e (n = m) (mkStringLitNeProof n m)
builtin_simproc [simp, seval] reduceNe (( _ : String) ≠ _) := fun e => do
unless e.isAppOfArity ``Ne 3 do return .continue
let some n ← fromExpr? e.appFn!.appArg! | return .continue
let some m ← fromExpr? e.appArg! | return .continue
evalNePropStep e (n ≠ m) (mkStringLitNeProof n m)
builtin_dsimproc [simp, seval] reduceBEq (( _ : String) == _) := reduceBoolPred ``BEq.beq 4 (. == .)
builtin_dsimproc [simp, seval] reduceBNe (( _ : String) != _) := reduceBoolPred ``bne 4 (. != .)

View File

@@ -24,4 +24,40 @@ def evalPropStep (p : Expr) (result : Bool) : SimpM Step := do
else
return .done { expr := mkConst ``False, proof? := mkAppN (mkConst ``eq_false_of_decide) #[p, d.appArg!, (← mkEqRefl (mkConst ``false))] }
/--
Like `evalPropStep`, but specialized for `@Eq.{u} α a b`. When the values are equal, uses
`eq_true rfl` which requires no kernel evaluation. When different, calls `mkNeProof` to build
a proof of `a ≠ b`.
-/
def evalEqPropStep (e : Expr) (eq : Bool) (mkNeProof : SimpM Expr) : SimpM Step := do
let α := e.appFn!.appFn!.appArg!
let a := e.appFn!.appArg!
let u := e.appFn!.appFn!.appFn!.constLevels!.head!
if eq then
let proof := mkApp2 (mkConst ``eq_true) e (mkApp2 (mkConst ``Eq.refl [u]) α a)
return .done { expr := mkConst ``True, proof? := proof }
else
let neProof ← mkNeProof
let proof := mkApp2 (mkConst ``eq_false) e neProof
return .done { expr := mkConst ``False, proof? := proof }
/--
Like `evalPropStep`, but specialized for `@Ne.{u} α a b`. When the values differ, calls
`mkNeProof` to build a proof of `a ≠ b`. When equal, uses `eq_false (not_not_intro rfl)`
which requires no kernel evaluation.
-/
def evalNePropStep (e : Expr) (ne : Bool) (mkNeProof : SimpM Expr) : SimpM Step := do
if ne then
let neProof ← mkNeProof
let proof := mkApp2 (mkConst ``eq_true) e neProof
return .done { expr := mkConst ``True, proof? := proof }
else
let α := e.appFn!.appFn!.appArg!
let a := e.appFn!.appArg!
let u := e.appFn!.appFn!.appFn!.constLevels!.head!
let eqProp := mkApp3 (mkConst ``Eq [u]) α a a
let rflExpr := mkApp2 (mkConst ``Eq.refl [u]) α a
let proof := mkApp2 (mkConst ``eq_false) e (mkApp2 (mkConst ``not_not_intro) eqProp rflExpr)
return .done { expr := mkConst ``False, proof? := proof }
end Lean.Meta.Simp
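As a sketch of the proof shapes the two new helpers build (illustrative, not part of the diff): the `eq` branch of `evalEqPropStep` rewrites a true literal equality to `True` via `eq_true rfl`, and the `eq` branch of `evalNePropStep` rewrites a false disequality to `False` via `eq_false (not_not_intro rfl)`, so neither needs kernel evaluation of `decide`:

```lean
-- `eq_true : a → a = True`, applied to `rfl : "abc" = "abc"`:
example : ("abc" = "abc") = True := eq_true rfl
-- `eq_false : ¬a → a = False`, applied to
-- `not_not_intro rfl : ¬("abc" ≠ "abc")` (since `Ne` unfolds to `¬(· = ·)`):
example : ("abc" ≠ "abc") = False := eq_false (not_not_intro rfl)
```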

View File

@@ -295,9 +295,11 @@ Executes `x` using a `MetaM` configuration for inferred from `Simp.Config`.
@[inline] def withSimpMetaConfig (x : SimpM α) : SimpM α := do
withConfigWithKey (← readThe Simp.Context).metaConfig x
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_simp"]
opaque simp (e : Expr) : SimpM Result
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_dsimp"]
opaque dsimp (e : Expr) : SimpM Expr

View File

@@ -36,6 +36,7 @@ namespace Lean.Meta
/-! # Smart unfolding support -/
-- ===========================
set_option compiler.ignoreBorrowAnnotation true in
/--
Forward declaration. It is defined in the module `src/Lean/Elab/PreDefinition/Structural/Eqns.lean`.
It is possible to avoid this hack if we move `Structural.EqnInfo` and `Structural.eqnInfoExt`

View File

@@ -174,6 +174,7 @@ See also `Lean.matchConstStructure` for a less restrictive version.
| _ => failK ()
| _ => failK ()
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_has_compile_error"]
opaque hasCompileError (env : Environment) (constName : Name) : Bool

View File

@@ -234,10 +234,12 @@ def recover'.formatter (fmt : PrettyPrinter.Formatter) := fmt
-- Note that there is a mutual recursion
-- `categoryParser -> mkAntiquot -> termParser -> categoryParser`, so we need to introduce an indirection somewhere
-- anyway.
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_mk_antiquot_formatter"]
opaque mkAntiquot.formatter' (name : String) (kind : SyntaxNodeKind) (anonymous := true) (isPseudoKind := false) : Formatter
-- break up big mutual recursion
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_pretty_printer_formatter_interpret_parser_descr"]
opaque interpretParserDescr' : ParserDescr → CoreM Formatter

View File

@@ -307,6 +307,7 @@ def recover'.parenthesizer (p : PrettyPrinter.Parenthesizer) : PrettyPrinter.Par
-- Note that there is a mutual recursion
-- `categoryParser -> mkAntiquot -> termParser -> categoryParser`, so we need to introduce an indirection somewhere
-- anyway.
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_mk_antiquot_parenthesizer"]
opaque mkAntiquot.parenthesizer' (name : String) (kind : SyntaxNodeKind) (anonymous := true) (isPseudoKind := false) : Parenthesizer
@@ -314,6 +315,7 @@ opaque mkAntiquot.parenthesizer' (name : String) (kind : SyntaxNodeKind) (anonym
liftM x
-- break up big mutual recursion
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_pretty_printer_parenthesizer_interpret_parser_descr"]
opaque interpretParserDescr' : ParserDescr → CoreM Parenthesizer

View File

@@ -654,15 +654,17 @@ section NotificationHandling
def handleRpcRelease (p : Lsp.RpcReleaseParams) : WorkerM Unit := do
-- NOTE(WN): when the worker restarts e.g. due to changed imports, we may receive `rpc/release`
-- for the previous RPC session. This is fine, just ignore.
let ctx ← read
if let some seshRef := (← get).rpcSessions.get? p.sessionId then
let monoMsNow ← IO.monoMsNow
let discardRefs : StateM RpcObjectStore Unit := do
for ref in p.refs do
discard do rpcReleaseRef ref
seshRef.modify fun st =>
let st := st.keptAlive monoMsNow
let ((), objects) := discardRefs.run st.objects
{ st with objects }
let wireFormat ← seshRef.modifyGet fun st =>
(st.objects.wireFormat, st.keptAlive monoMsNow)
for ref in p.refs do
let .ok p := ref.getObjVal? wireFormat.refFieldName >>= fromJson?
| ctx.hLog.putStrLn s!"malformed RPC ref (wire format {toJson wireFormat}): {ref.compress}"
seshRef.modify fun st =>
let (_, objects) := rpcReleaseRef p |>.run st.objects
{ st with objects }
def handleRpcKeepAlive (p : Lsp.RpcKeepAliveParams) : WorkerM Unit := do
match (← get).rpcSessions.get? p.sessionId with
@@ -676,7 +678,8 @@ end NotificationHandling
section RequestHandling
def handleRpcConnect (_ : RpcConnectParams) : WorkerM RpcConnected := do
let (newId, newSesh) ← RpcSession.new
let wireFormat := (← read).initParams.capabilities.rpcWireFormat
let (newId, newSesh) ← RpcSession.new wireFormat
let newSeshRef ← IO.mkRef newSesh
modify fun st => { st with rpcSessions := st.rpcSessions.insert newId newSeshRef }
return { sessionId := newId }

View File

@@ -83,12 +83,12 @@ namespace RpcSession
def keepAliveTimeMs : Nat :=
30000
def new : IO (UInt64 × RpcSession) := do
def new (wireFormat : Lsp.RpcWireFormat) : IO (UInt64 × RpcSession) := do
/- We generate a random ID to ensure that session IDs do not repeat across re-initializations
and worker restarts. Otherwise, the client may attempt to use outdated references. -/
let newId ← ByteArray.toUInt64LE! <$> IO.getRandomBytes 8
let newSesh := {
objects := {}
objects := { wireFormat }
expireTime := (← IO.monoMsNow) + keepAliveTimeMs
}
return (newId, newSesh)

View File

@@ -14,8 +14,8 @@ public section
namespace Lean.Server
structure RequestCancellationToken where
cancelledByCancelRequest : IO.Ref Bool
cancelledByEdit : IO.Ref Bool
cancelledByCancelRequest : IO.CancelToken
cancelledByEdit : IO.CancelToken
requestCancellationPromise : IO.Promise Unit
editCancellationPromise : IO.Promise Unit
@@ -23,18 +23,18 @@ namespace RequestCancellationToken
def new : BaseIO RequestCancellationToken := do
return {
cancelledByCancelRequest := ← IO.mkRef false
cancelledByEdit := ← IO.mkRef false
cancelledByCancelRequest := ← IO.CancelToken.new
cancelledByEdit := ← IO.CancelToken.new
requestCancellationPromise := ← IO.Promise.new
editCancellationPromise := ← IO.Promise.new
}
def cancelByCancelRequest (tk : RequestCancellationToken) : BaseIO Unit := do
tk.cancelledByCancelRequest.set true
tk.cancelledByCancelRequest.set
tk.requestCancellationPromise.resolve ()
def cancelByEdit (tk : RequestCancellationToken) : BaseIO Unit := do
tk.cancelledByEdit.set true
tk.cancelledByEdit.set
tk.editCancellationPromise.resolve ()
def requestCancellationTask (tk : RequestCancellationToken) : ServerTask Unit :=
@@ -47,10 +47,10 @@ def cancellationTasks (tk : RequestCancellationToken) : List (ServerTask Unit) :
[tk.requestCancellationTask, tk.editCancellationTask]
def wasCancelledByCancelRequest (tk : RequestCancellationToken) : BaseIO Bool :=
tk.cancelledByCancelRequest.get
tk.cancelledByCancelRequest.isSet
def wasCancelledByEdit (tk : RequestCancellationToken) : BaseIO Bool :=
tk.cancelledByEdit.get
tk.cancelledByEdit.isSet
def wasCancelled (tk : RequestCancellationToken) : BaseIO Bool := do
return (← tk.wasCancelledByCancelRequest) || (← tk.wasCancelledByEdit)
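A minimal sketch of the `IO.CancelToken` semantics the new fields rely on (assuming only `new`, `set`, and `isSet` from the public API): a fresh token is unset, `set` is a one-way, idempotent transition, and `isSet` observes it.

```lean
def cancelTokenDemo : BaseIO (Bool × Bool) := do
  let tk ← IO.CancelToken.new
  let before ← tk.isSet  -- a fresh token is not set
  tk.set                 -- one-way: a token cannot be unset
  let after ← tk.isSet
  return (before, after) -- (false, true)
```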

View File

@@ -24,28 +24,55 @@ first connect to the session using `$/lean/rpc/connect`. -/
namespace Lean.Lsp
/--
An object which RPC clients can refer to without marshalling.
The address of an object in an `RpcObjectStore` that clients can refer to.
Its JSON encoding depends on the RPC wire format.
The language server may serve the same `RpcRef` multiple times and maintains a reference count
to track how many times it has served the reference.
The language server may serve the same `RpcRef` multiple times.
It maintains a reference count to track how many times it has served the reference.
If clients want to release the object associated with an `RpcRef`,
they must release the reference as many times as they have received it from the server.
-/
structure RpcRef where
/- NOTE(WN): It is important for this to be a single-field structure
in order to deserialize as an `Object` on the JS side. -/
p : USize
deriving Inhabited, BEq, Hashable, FromJson, ToJson
deriving Inhabited, BEq, Hashable
instance : ToString RpcRef where
toString r := toString r.p
/-- The *RPC wire format* specifies how user data is encoded in RPC requests. -/
inductive RpcWireFormat where
/-- Version `0` uses JSON.
Serialized RPC data is stored directly in `RpcCallParams.params` and in `JsonRpc.Response.result`
(as opposed to being wrapped in additional metadata).
General types (except RPC references) are (de)serialized via `ToJson/FromJson`.
RPC references are serialized as `{"p": n}`. -/
| v0
/-- Version `1` is like `0`,
except that RPC references are serialized as `{"__rpcref": n}`. -/
| v1
deriving FromJson, ToJson
@[inline]
def RpcWireFormat.refFieldName : RpcWireFormat → String
| .v0 => "p"
| .v1 => "__rpcref"
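Concretely, the only difference between the two versions is the key under which an `RpcRef` pointer is stored; for a reference with `p = 5` (value illustrative), `v0` serializes it as `{"p": 5}` and `v1` as `{"__rpcref": 5}`:

```lean
#eval RpcWireFormat.refFieldName .v0  -- "p"
#eval RpcWireFormat.refFieldName .v1  -- "__rpcref"
```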
end Lean.Lsp
namespace Lean.Server
/--
Marks values to be encoded as opaque references in RPC packets.
`WithRpcRef α` marks values that RPC clients should see as opaque references.
This includes heavy objects such as `Lean.Environment`s
and non-serializable objects such as closures.
In the Lean server, `WithRpcRef α` is a structure containing a value of type `α`
and an associated `id`.
In an RPC client (e.g. the infoview), it is an opaque reference.
Thus, `WithRpcRef α` is cheap to transmit, but its data may only be accessed server-side.
In practice, this is used by client code to pass data
between various RPC methods provided by the server.
Two `WithRpcRef`s with the same `id` will yield the same client-side reference.
See also the docstring for `RpcEncodable`.
@@ -90,6 +117,8 @@ structure RpcObjectStore : Type where
Value to use for the next fresh `RpcRef`, monotonically increasing.
-/
nextRef : USize := 0
/-- The RPC wire format to follow. -/
wireFormat : Lsp.RpcWireFormat := .v1
def rpcStoreRef [TypeName α] (obj : WithRpcRef α) : StateM RpcObjectStore Lsp.RpcRef := do
let st ← get
@@ -137,29 +166,27 @@ def rpcReleaseRef (r : Lsp.RpcRef) : StateM RpcObjectStore Bool := do
return true
/--
`RpcEncodable α` means that `α` can be deserialized from and serialized into JSON
for the purpose of receiving arguments to and sending return values from
Remote Procedure Calls (RPCs).
`RpcEncodable α` means `α` can be used as an argument or return value
in Remote Procedure Call (RPC) methods.
Any type with `FromJson` and `ToJson` instances is `RpcEncodable`.
The following types are `RpcEncodable` by default:
- Any type with both `FromJson` and `ToJson` instances.
- Any type all of whose components
(meaning the fields of a `structure`, or the arguments to `inductive` constructors)
are `RpcEncodable`.
Use `deriving RpcEncodable` to construct an instance in this case.
- Certain common containers, if containing `RpcEncodable` data (see instances below).
- Any `WithRpcRef α` where `[TypeName α]`.
Furthermore, types that do not have these instances may still be `RpcEncodable`.
Use `deriving RpcEncodable` to automatically derive instances for such types.
Some names are reserved for internal use and must not appear
as the name of a field or constructor argument in any `RpcEncodable` type,
or as an object key in any nested JSON.
The following are currently reserved:
- `__rpcref`
This occurs when `α` contains data that should not or cannot be serialized:
for instance, heavy objects such as `Lean.Environment`, or closures.
For such data, we use the `WithRpcRef` marker.
Note that for `WithRpcRef α` to be `RpcEncodable`,
`α` must have a `TypeName` instance
On the server side, `WithRpcRef α` is a structure containing a value of type `α` and an associated
`id`.
On the client side, it is an opaque reference of (structural) type `Lsp.RpcRef`.
Thus, `WithRpcRef α` is cheap to transmit over the network
but may only be accessed on the server side.
In practice, it is used by the client to pass data
between various RPC methods provided by the server.
Two `WithRpcRef`s with the same `id` will yield the same client-side reference.
It is also possible (but discouraged for non-experts)
to implement custom `RpcEncodable` instances.
These must respect the `RpcObjectStore.wireFormat`.
-/
-- TODO(WN): for Lean.js, compile `WithRpcRef` to "opaque reference" on the client
class RpcEncodable (α : Type) where
@@ -200,7 +227,13 @@ instance [TypeName α] : RpcEncodable (WithRpcRef α) :=
{ rpcEncode, rpcDecode }
where
-- separate definitions to prevent inlining
rpcEncode r := toJson <$> rpcStoreRef r
rpcDecode j := do rpcGetRef α (← fromJson? j)
rpcEncode r := do
let ref ← rpcStoreRef r
let fieldName := (← get).wireFormat.refFieldName
return Json.mkObj [(fieldName, toJson ref.p)]
rpcDecode j := do
let fieldName := (← read).wireFormat.refFieldName
let p ← j.getObjValAs? USize fieldName
rpcGetRef α p
end Lean.Server
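Putting the pieces together (values illustrative): the first `WithRpcRef` stored in a fresh store gets `nextRef = 0`, so `rpcEncode` emits `{"p": 0}` under a `v0` store and `{"__rpcref": 0}` under the default `v1` store, and `rpcDecode` reads the pointer back from the matching field via `getObjValAs?`:

```lean
-- Wire shapes for the first stored reference, per store format:
#eval Json.mkObj [("p", toJson (0 : USize))]          -- v0
#eval Json.mkObj [("__rpcref", toJson (0 : USize))]   -- v1
```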

View File

@@ -30,6 +30,9 @@ private def deriveStructureInstance (indVal : InductiveVal) (params : Array Expr
let mut encInits := #[]
let mut decInits := #[]
for fieldName in fields do
if fieldName == `__rpcref then
throwError "'__rpcref' is reserved and cannot be used as a field name. \
See the `RpcEncodable` docstring."
let fid := mkIdent fieldName
fieldIds := fieldIds.push fid
if isOptField fieldName then
@@ -72,6 +75,9 @@ private def deriveInductiveInstance (indVal : InductiveVal) (params : Array Expr
-- create the constructor
let fieldStxs ← argVars.mapM fun arg => do
let name := (← getFVarLocalDecl arg).userName
if name == `__rpcref then
throwError "'__rpcref' is reserved and cannot be used as an argument name. \
See the `RpcEncodable` docstring."
`(bracketedBinderF| ($(mkIdent name) : Json))
let pktCtor ← `(Parser.Command.ctor|
| $ctorId:ident $[$fieldStxs]* : RpcEncodablePacket)

View File

@@ -46,6 +46,12 @@ def normalizedRef (ref : RpcRef) : NormalizeM RpcRef := do
})
return ptr
-- Test-only instances using the most recent version of the RPC wire format.
instance : ToJson RpcRef where
toJson r := Json.mkObj [("__rpcref", toJson r.p)]
instance : FromJson RpcRef where
fromJson? j := do return ⟨← j.getObjValAs? USize "__rpcref"⟩
structure SubexprInfo where
info : RpcRef
subexprPos : String
@@ -713,6 +719,7 @@ partial def main (args : List String) : IO Unit := do
}
lean? := some {
silentDiagnosticSupport? := some true
rpcWireFormat? := some .v1
}
}
Ipc.writeRequest ⟨0, "initialize", { initializationOptions?, capabilities : InitializeParams }⟩

View File

@@ -1602,6 +1602,7 @@ def mkLeanServerCapabilities : ServerCapabilities := {
moduleHierarchyProvider? := some {}
rpcProvider? := some {
highlightMatchesProvider? := some {}
rpcWireFormat? := some .v1
}
}
}

View File

@@ -1011,7 +1011,6 @@ theorem getEntryGT?_eq_getEntryGT?ₘ [Ord α] (k : α) (t : Impl α β) :
getEntryGT? k t = getEntryGT?ₘ k t := by
rw [getEntryGT?_eq_getEntryGT?ₘ', getEntryGT?ₘ', getEntryGT?ₘ, explore_eq_applyPartition] <;> simp
set_option backward.whnf.reducibleClassField false in
theorem getEntryLT?_eq_getEntryGT?_reverse [o : Ord α] [TransOrd α] {t : Impl α β} {k : α} :
getEntryLT? k t = @getEntryGT? α β o.opposite k t.reverse := by
rw [getEntryLT?, @getEntryGT?.eq_def, Ord.opposite]
@@ -1019,7 +1018,6 @@ theorem getEntryLT?_eq_getEntryGT?_reverse [o : Ord α] [TransOrd α] {t : Impl
simp only [*, getEntryLT?.go, reverse, getEntryGT?.go, OrientedCmp.eq_swap (b := k),
Ordering.swap]
set_option backward.whnf.reducibleClassField false in
theorem getEntryLE?_eq_getEntryGE?_reverse [o : Ord α] [TransOrd α] {t : Impl α β} {k : α} :
getEntryLE? k t = @getEntryGE? α β o.opposite k t.reverse := by
rw [getEntryLE?, @getEntryGE?.eq_def, Ord.opposite]

View File

@@ -623,7 +623,6 @@ theorem RxoIterator.step_cons_of_isLT_eq_false [Ord α] {upper : α} {h : (compa
rw [step, h]
simp only [Bool.false_eq_true, reduceIte]
set_option backward.isDefEq.respectTransparency false in
@[simp]
theorem RxoIterator.val_run_step_toIterM_iter [Ord α] {z : RxoIterator α β} : (z : Iter (α := RxoIterator α β) ((a : α) × β a)).toIterM.step.run.inflate.val = z.step := by
rw [IterM.step]

View File

@@ -1895,7 +1895,6 @@ theorem contains_of_contains_insertMany_list' [TransCmp cmp] [BEq α] [LawfulBEq
(h' : contains (insertMany t l) k = true)
(w : l.findSomeRev? (fun a, b => if cmp a k = .eq then some b else none) = none) :
contains t k = true :=
set_option backward.whnf.reducibleClassField false in
Impl.Const.contains_of_contains_insertMany_list' h
(by simpa [Impl.Const.insertMany_eq_insertMany!] using h')
(by simpa [compare_eq_iff_beq, BEq.comm] using w)

View File

@@ -18,25 +18,27 @@ namespace IO
namespace Async
/--
Unix style signals for Unix and Windows. SIGKILL and SIGSTOP are missing because they cannot be caught.
Unix style signals for Unix and Windows.
SIGKILL and SIGSTOP are missing because they cannot be caught.
SIGBUS, SIGFPE, SIGILL, and SIGSEGV are missing because they cannot be caught safely by libuv.
SIGPIPE is not present because the runtime ignores the signal.
-/
inductive Signal
/--
Hangup detected on controlling terminal or death of controlling process. SIGHUP is not
generated when terminal raw mode is enabled.
Hangup detected on controlling terminal or death of controlling process.
On Windows:
* SIGHUP is generated when the user closes the console window. The program is given ~10 seconds to
cleanup before Windows unconditionally terminates it.
- SIGHUP is generated when the user closes the console window. The program is given ~10 seconds to
perform cleanup before Windows unconditionally terminates it.
-/
| sighup
/--
Interrupt program.
* Normally delivered when the user presses CTRL+C. Not generated when terminal raw mode is enabled (like Unix).
Notes:
- Normally delivered when the user presses CTRL+C. Not generated when terminal raw mode is enabled.
-/
| sigint
@@ -60,14 +62,14 @@ inductive Signal
| sigabrt
/--
Emulate instruction executed
User-defined signal 1.
-/
| sigemt
| sigusr1
/--
Bad system call.
User-defined signal 2.
-/
| sigsys
| sigusr2
/--
Real-time timer expired.
@@ -83,14 +85,9 @@ inductive Signal
| sigterm
/--
Urgent condition on socket.
Child status has changed.
-/
| sigurg
/--
Stop typed at tty.
-/
| sigtstp
| sigchld
/--
Continue after stop.
@@ -98,9 +95,9 @@ inductive Signal
| sigcont
/--
Child status has changed.
Stop typed at terminal.
-/
| sigchld
| sigtstp
/--
Background read attempted from control terminal.
@@ -108,14 +105,14 @@ inductive Signal
| sigttin
/--
Background write attempted to control terminal
Background write attempted to control terminal.
-/
| sigttou
/--
I/O now possible.
Urgent condition on socket.
-/
| sigio
| sigurg
/--
CPU time limit exceeded.
@@ -148,52 +145,48 @@ inductive Signal
| sigwinch
/--
Status request from keyboard.
I/O now possible.
-/
| siginfo
| sigio
/--
User-defined signal 1.
Bad system call.
-/
| sigusr1
/--
User-defined signal 2.
-/
| sigusr2
| sigsys
deriving Repr, DecidableEq, BEq
namespace Signal
/--
Converts a `Signal` to its corresponding `Int32` value as defined in the libc `signal.h`.
Converts a `Signal` to its corresponding `Int32` value as defined in `man 7 signal`.
These values are then mapped to the underlying architecture's values in runtime/uv/signal.cpp,
so make sure to update that whenever you update this code.
-/
def toInt32 : Signal → Int32
private def toInt32 : Signal → Int32
| .sighup => 1
| .sigint => 2
| .sigquit => 3
| .sigtrap => 5
| .sigabrt => 6
| .sigemt => 7
| .sigsys => 12
| .sigusr1 => 10
| .sigusr2 => 12
| .sigalrm => 14
| .sigterm => 15
| .sigurg => 16
| .sigtstp => 18
| .sigcont => 19
| .sigchld => 20
| .sigchld => 17
| .sigcont => 18
| .sigtstp => 20
| .sigttin => 21
| .sigttou => 22
| .sigio => 23
| .sigurg => 23
| .sigxcpu => 24
| .sigxfsz => 25
| .sigvtalrm => 26
| .sigprof => 27
| .sigwinch => 28
| .siginfo => 29
| .sigusr1 => 30
| .sigusr2 => 31
| .sigio => 29
| .sigsys => 31
/--
`Signal.Waiter` can be used to handle a specific signal once.

View File

@@ -12,6 +12,7 @@ public import Std.Internal.Http.Data.Request
public import Std.Internal.Http.Data.Response
public import Std.Internal.Http.Data.Status
public import Std.Internal.Http.Data.Chunk
public import Std.Internal.Http.Data.Headers
/-!
# HTTP Data Types

View File

@@ -7,6 +7,7 @@ module
prelude
public import Std.Internal.Http.Internal
public import Std.Internal.Http.Data.Headers
public meta import Std.Internal.Http.Internal.String
public section
@@ -20,8 +21,7 @@ Reference: https://www.rfc-editor.org/rfc/rfc9112.html#section-7.1
-/
namespace Std.Http
open Internal
open Internal Char
set_option linter.all true
@@ -207,3 +207,114 @@ instance : Encode .v11 Chunk where
buffer.append #[size, exts.toUTF8, "\r\n".toUTF8, chunk.data, "\r\n".toUTF8]
end Chunk
/--
Trailer headers sent after the final chunk in HTTP/1.1 chunked transfer encoding.
Per RFC 9112 §7.1.2, trailers allow the sender to include additional metadata after
the message body, such as message integrity checks or digital signatures.
-/
structure Trailer where
/--
The trailer header fields as key-value pairs.
-/
headers : Headers
deriving Inhabited
namespace Trailer
/--
Creates an empty trailer with no headers.
-/
def empty : Trailer :=
{ headers := .empty }
/--
Inserts a trailer header field.
-/
@[inline]
def insert (trailer : Trailer) (name : Header.Name) (value : Header.Value) : Trailer :=
{ headers := trailer.headers.insert name value }
/--
Inserts a trailer header field from string name and value, panicking if either is invalid.
-/
@[inline]
def insert! (trailer : Trailer) (name : String) (value : String) : Trailer :=
{ headers := trailer.headers.insert! name value }
/--
Retrieves the first value for the given trailer header name.
Returns `none` if absent.
-/
@[inline]
def get? (trailer : Trailer) (name : Header.Name) : Option Header.Value :=
trailer.headers.get? name
/--
Retrieves all values for the given trailer header name.
Returns `none` if absent.
-/
@[inline]
def getAll? (trailer : Trailer) (name : Header.Name) : Option (Array Header.Value) :=
trailer.headers.getAll? name
/--
Checks if a trailer header with the given name exists.
-/
@[inline]
def contains (trailer : Trailer) (name : Header.Name) : Bool :=
trailer.headers.contains name
/--
Removes a trailer header with the given name.
-/
@[inline]
def erase (trailer : Trailer) (name : Header.Name) : Trailer :=
{ headers := trailer.headers.erase name }
/--
Gets the number of trailer headers.
-/
@[inline]
def size (trailer : Trailer) : Nat :=
trailer.headers.size
/--
Checks if the trailer has no headers.
-/
@[inline]
def isEmpty (trailer : Trailer) : Bool :=
trailer.headers.isEmpty
/--
Merges two trailers, accumulating values for duplicate keys from both.
-/
def merge (t1 t2 : Trailer) : Trailer :=
{ headers := t1.headers.merge t2.headers }
/--
Converts the trailer headers to a list of key-value pairs.
-/
def toList (trailer : Trailer) : List (Header.Name × Header.Value) :=
trailer.headers.toList
/--
Converts the trailer headers to an array of key-value pairs.
-/
def toArray (trailer : Trailer) : Array (Header.Name × Header.Value) :=
trailer.headers.toArray
/--
Folds over all key-value pairs in the trailer headers.
-/
def fold (trailer : Trailer) (init : α) (f : α → Header.Name → Header.Value → α) : α :=
trailer.headers.fold init f
instance : Encode .v11 Trailer where
encode buffer trailer :=
buffer.write "0\r\n".toUTF8
|> (Encode.encode .v11 · trailer.headers)
|>.write "\r\n".toUTF8
end Trailer
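Assuming the `Encode` instance above, a trailer with a single field (name and value illustrative) terminates a chunked body as the zero-size chunk line, the trailer fields, and a final CRLF:

```lean
-- Wire output sketch for this trailer:
--   0\r\n
--   X-Checksum: abc123\r\n
--   \r\n
def exampleTrailer : Trailer :=
  Trailer.empty.insert! "X-Checksum" "abc123"
```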

View File

@@ -0,0 +1,268 @@
/-
Copyright (c) 2025 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Sofia Rodrigues
-/
module
prelude
public import Std.Internal.Http.Data.Headers.Basic
public import Std.Internal.Http.Data.Headers.Name
public import Std.Internal.Http.Data.Headers.Value
public section
/-!
# Headers
This module defines the `Headers` type, which represents a collection of HTTP header name-value pairs.
Reference: https://www.rfc-editor.org/rfc/rfc9110.html#section-5
-/
namespace Std.Http
set_option linter.all true
open Std Internal
/--
A structure for managing HTTP headers as key-value pairs.
Reference: https://www.rfc-editor.org/rfc/rfc9110.html#section-5
-/
structure Headers where
/--
The underlying multimap that stores headers.
-/
map : IndexMultiMap Header.Name Header.Value
deriving Inhabited, Repr
instance : Membership Header.Name Headers where
mem headers name := name ∈ headers.map
instance (name : Header.Name) (h : Headers) : Decidable (name ∈ h) :=
inferInstanceAs (Decidable (name ∈ h.map))
namespace Headers
/--
Retrieves the first `Header.Value` for the given key.
-/
@[inline]
def get (headers : Headers) (name : Header.Name) (h : name ∈ headers) : Header.Value :=
headers.map.get name h
/--
Retrieves all `Header.Value` entries for the given key.
-/
@[inline]
def getAll (headers : Headers) (name : Header.Name) (h : name ∈ headers) : Array Header.Value :=
headers.map.getAll name h
/--
Like `getAll`, but returns `none` instead of requiring a membership proof.
Returns `none` if the header is absent.
-/
@[inline]
def getAll? (headers : Headers) (name : Header.Name) : Option (Array Header.Value) :=
headers.map.getAll? name
/--
Retrieves the first `Header.Value` for the given key.
Returns `none` if the header is absent.
-/
@[inline]
def get? (headers : Headers) (name : Header.Name) : Option Header.Value :=
headers.map.get? name
/--
Checks if the entry is present in the `Headers`.
-/
@[inline]
def hasEntry (headers : Headers) (name : Header.Name) (value : Header.Value) : Bool :=
headers.map.hasEntry name value
/--
Retrieves the last header value for the given key.
Returns `none` if the header is absent.
-/
@[inline]
def getLast? (headers : Headers) (name : Header.Name) : Option Header.Value :=
headers.map.getLast? name
/--
Like `get?`, but returns a default value if absent.
-/
@[inline]
def getD (headers : Headers) (name : Header.Name) (d : Header.Value) : Header.Value :=
headers.map.getD name d
/--
Like `get?`, but panics if absent.
-/
@[inline]
def get! (headers : Headers) (name : Header.Name) : Header.Value :=
headers.map.get! name
/--
Inserts a new key-value pair into the headers.
-/
@[inline]
def insert (headers : Headers) (key : Header.Name) (value : Header.Value) : Headers :=
{ map := headers.map.insert key value }
/--
Adds a header from string name and value, panicking if either is invalid.
-/
@[inline]
def insert! (headers : Headers) (name : String) (value : String) : Headers :=
headers.insert (Header.Name.ofString! name) (Header.Value.ofString! value)
/--
Adds a header from string name and value.
Returns `none` if either the header name or value is invalid.
-/
@[inline]
def insert? (headers : Headers) (name : String) (value : String) : Option Headers := do
let name ← Header.Name.ofString? name
let value ← Header.Value.ofString? value
pure <| headers.insert name value
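A small usage sketch of the insertion API above (names and values illustrative): duplicate names accumulate, `get?` returns the first value for a name, and `getAll?` returns all of them.

```lean
def exampleHeaders : Headers :=
  Headers.empty
    |>.insert! "Content-Type" "text/plain"
    |>.insert! "Accept" "text/html"
    |>.insert! "Accept" "application/json"
-- `insert?` is the non-panicking variant, returning `none`
-- when the header name or value fails validation.
```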
/--
Inserts a new key with an array of values.
-/
@[inline]
def insertMany (headers : Headers) (key : Header.Name) (values : Array Header.Value) : Headers :=
{ map := headers.map.insertMany key values }
/--
Creates empty headers.
-/
def empty : Headers :=
{ map := ∅ }
/--
Creates headers from a list of key-value pairs.
-/
def ofList (pairs : List (Header.Name × Header.Value)) : Headers :=
{ map := IndexMultiMap.ofList pairs }
/--
Checks if a header with the given name exists.
-/
@[inline]
def contains (headers : Headers) (name : Header.Name) : Bool :=
headers.map.contains name
/--
Removes a header with the given name.
-/
@[inline]
def erase (headers : Headers) (name : Header.Name) : Headers :=
{ map := headers.map.erase name }
/--
Gets the number of headers.
-/
@[inline]
def size (headers : Headers) : Nat :=
headers.map.size
/--
Checks if the headers are empty.
-/
@[inline]
def isEmpty (headers : Headers) : Bool :=
headers.map.isEmpty
/--
Merges two headers, accumulating values for duplicate keys from both.
-/
def merge (headers1 headers2 : Headers) : Headers :=
{ map := headers1.map ∪ headers2.map }
/--
Converts the headers to a list of key-value pairs (flattened). Each header with multiple values produces
multiple pairs.
-/
def toList (headers : Headers) : List (Header.Name × Header.Value) :=
headers.map.toList
/--
Converts the headers to an array of key-value pairs (flattened). Each header with multiple values
produces multiple pairs.
-/
def toArray (headers : Headers) : Array (Header.Name × Header.Value) :=
headers.map.toArray
/--
Folds over all key-value pairs in the headers.
-/
def fold (headers : Headers) (init : α) (f : α → Header.Name → Header.Value → α) : α :=
headers.map.toArray.foldl (fun acc (k, v) => f acc k v) init
/--
Maps a function over all header values, producing new headers.
-/
def mapValues (headers : Headers) (f : Header.Name → Header.Value → Header.Value) : Headers :=
let pairs := headers.map.toArray.map (fun (k, v) => (k, f k v))
{ map := pairs.foldl (fun acc (k, v) => acc.insert k v) IndexMultiMap.empty }
/--
Filters and maps over header key-value pairs. Returns only the pairs for which the function returns `some`.
-/
def filterMap (headers : Headers) (f : Header.Name → Header.Value → Option Header.Value) : Headers :=
let pairs := headers.map.toArray.filterMap (fun (k, v) =>
match f k v with
| some v' => some (k, v')
| none => none)
{ map := pairs.foldl (fun acc (k, v) => acc.insert k v) IndexMultiMap.empty }
/--
Filters header key-value pairs, keeping only those that satisfy the predicate.
-/
def filter (headers : Headers) (f : Header.Name → Header.Value → Bool) : Headers :=
headers.filterMap (fun k v => if f k v then some v else none)
/--
Updates all the values of a header if it exists.
-/
def update (headers : Headers) (name : Header.Name) (f : Header.Value → Header.Value) : Headers :=
{ map := headers.map.update name f }
/--
Replaces the last value for the given header name.
If the header is absent, returns the headers unchanged.
-/
@[inline]
def replaceLast (headers : Headers) (name : Header.Name) (value : Header.Value) : Headers :=
{ map := headers.map.replaceLast name value }
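A quick illustrative sketch (not part of the diff) of how the API above composes; it assumes `Header.Name.contentType`, `Header.Name.host`, and `Header.Value.ofString!` from the other files in this PR:

```lean
-- Illustrative only: build headers, then observe that duplicate keys accumulate.
def exampleHeaders : Headers :=
  empty
    |>.insert Header.Name.contentType (Header.Value.ofString! "text/plain")
    |>.insert Header.Name.host (Header.Value.ofString! "example.com")

-- `size` counts flattened entries, so inserting the same key again grows it:
#eval (exampleHeaders.insert Header.Name.host (Header.Value.ofString! "other")).size  -- 3
```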
instance : ToString Headers where
toString headers :=
let pairs := headers.map.toArray.map (fun (k, v) => s!"{k}: {v}")
String.intercalate "\r\n" pairs.toList
instance : Encode .v11 Headers where
encode buffer headers :=
headers.fold buffer (fun buf name value =>
buf.writeString s!"{name}: {value}\r\n")
instance : EmptyCollection Headers :=
Headers.empty
instance : Singleton (Header.Name × Header.Value) Headers :=
⟨fun (a, b) => (∅ : Headers).insert a b⟩
instance : Insert (Header.Name × Header.Value) Headers :=
⟨fun (a, b) s => s.insert a b⟩
instance : Union Headers :=
⟨merge⟩
instance [Monad m] : ForIn m Headers (Header.Name × Header.Value) where
forIn headers b f := forIn headers.map.entries b f
end Std.Http.Headers

View File

@@ -0,0 +1,217 @@
/-
Copyright (c) 2025 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Sofia Rodrigues
-/
module
prelude
public import Std.Internal.Http.Data.Headers.Name
public import Std.Internal.Http.Data.Headers.Value
public import Std.Internal.Parsec.Basic
public section
/-!
# Header Typeclass and Common Headers
This module defines the `Header` typeclass for typed HTTP headers and some common header parsers.
Reference: https://www.rfc-editor.org/rfc/rfc9110.html#name-representation-data-and-met
-/
namespace Std.Http
set_option linter.all true
open Internal
/--
Typeclass for typed HTTP headers that can be parsed from and serialized to header values.
-/
class Header (α : Type) where
/--
Parses a header value into the typed representation.
-/
parse : Header.Value → Option α
/--
Serializes the typed representation back to a name-value pair.
-/
serialize : α → Header.Name × Header.Value
instance [h : Header α] : Encode .v11 α where
encode buffer a :=
let (name, value) := h.serialize a
buffer.writeString s!"{name}: {value}\r\n"
namespace Header
private def parseTokenList (v : Value) : Option (Array String) := do
let rawParts := v.value.split (· == ',')
let parts := rawParts.map (·.trimAscii)
guard (parts.all (¬·.isEmpty))
return parts.toArray.map (fun s => s.toString.toLower)
/--
The `Content-Length` header, representing the size of the message body in bytes.
Parses only valid natural number values.
Reference: https://www.rfc-editor.org/rfc/rfc9110.html#section-8.6-2
-/
structure ContentLength where
/--
The content length in bytes.
-/
length : Nat
deriving BEq, Repr
namespace ContentLength
/--
Parses a content length header value.
-/
def parse (v : Value) : Option ContentLength :=
v.value.toNat?.map (.mk)
/--
Serializes a content length header back to a name-value pair.
-/
def serialize (h : ContentLength) : Name × Value :=
(Header.Name.contentLength, Value.ofString! (toString h.length))
instance : Header ContentLength := ⟨parse, serialize⟩
end ContentLength
/--
Validates the chunked placement rules for the Transfer Encoding header. Returns `false` if the
encoding list violates the constraints.
Reference: https://www.rfc-editor.org/rfc/rfc9112#section-6.1
-/
@[expose]
def TransferEncoding.Validate (codings : Array String) : Bool :=
if codings.isEmpty || codings.any (fun coding => !isToken coding) then
false
else
let chunkedCount := codings.filter (· == "chunked") |>.size
-- RFC 9112 §6.1: a sender MUST apply chunked as the final transfer coding, at most once
let lastIsChunked := codings.back? == some "chunked"
if chunkedCount > 1 then
false
else if chunkedCount == 1 && !lastIsChunked then
false
else
true
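As a sanity check, the placement rules above behave as follows (illustrative `#eval`s, not part of the diff):

```lean
-- Illustrative checks of the chunked-placement rules:
#eval TransferEncoding.Validate #["gzip", "chunked"]     -- true: chunked is final
#eval TransferEncoding.Validate #["chunked", "gzip"]     -- false: chunked must be last
#eval TransferEncoding.Validate #["chunked", "chunked"]  -- false: chunked at most once
```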
/--
The `Transfer-Encoding` header, representing the list of transfer codings applied to the message body.
Validation rules (RFC 9112 Section 6.1):
- "chunked" may appear at most once.
- If "chunked" is present, it must be the last encoding in the list.
Reference: https://www.rfc-editor.org/rfc/rfc9112#section-6.1
-/
structure TransferEncoding where
/--
The ordered list of transfer codings.
-/
codings : Array String
/--
Proof that the transfer codings satisfy the chunked placement rules.
-/
isValid : TransferEncoding.Validate codings = true
deriving Repr
namespace TransferEncoding
/--
Returns `true` if the transfer encoding ends with chunked.
-/
def isChunked (te : TransferEncoding) : Bool :=
te.codings.back? == some "chunked"
/--
Parses a comma-separated list of transfer codings from a header value, validating chunked placement.
-/
def parse (v : Value) : Option TransferEncoding := do
let codings ← parseTokenList v
if h : TransferEncoding.Validate codings then
some ⟨codings, h⟩
else
none
/--
Serializes a transfer encoding back to a comma-separated header value.
-/
def serialize (te : TransferEncoding) : Header.Name × Header.Value :=
let value := ",".intercalate (te.codings.toList)
(Header.Name.transferEncoding, Value.ofString! value)
instance : Header TransferEncoding := ⟨parse, serialize⟩
end TransferEncoding
/--
The `Connection` header, represented as a list of connection option tokens.
Reference: https://www.rfc-editor.org/rfc/rfc9110.html#name-connection
-/
structure Connection where
/--
The normalized connection-option tokens.
-/
tokens : Array String
/--
Proof that all tokens satisfy `isToken`.
-/
valid : tokens.all isToken = true
deriving Repr
namespace Connection
/--
Checks whether a specific token is present in the `Connection` header value.
-/
def containsToken (connection : Connection) (token : String) : Bool :=
let token := token.trimAscii.toString.toLower
connection.tokens.any (· == token)
/--
Checks whether the `Connection` header requests connection close semantics.
-/
def shouldClose (connection : Connection) : Bool :=
connection.containsToken "close"
/--
Parses a `Connection` header value into normalized tokens.
-/
def parse (v : Value) : Option Connection := do
let tokens ← parseTokenList v
if h : tokens.all isToken = true then
some ⟨tokens, h⟩
else
none
/--
Serializes a `Connection` header back to a comma-separated value.
-/
def serialize (connection : Connection) : Header.Name × Header.Value :=
let value := ",".intercalate connection.tokens.toList
(Header.Name.connection, Value.ofString! value)
instance : Header Connection := ⟨parse, serialize⟩
end Std.Http.Header.Connection

View File

@@ -0,0 +1,180 @@
/-
Copyright (c) 2025 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Sofia Rodrigues
-/
module
prelude
public import Init.Data.ToString
public import Std.Internal.Http.Internal
public section
/-!
# Header Names
This module defines the `Name` type, which represents validated HTTP header names that conform to
HTTP standards.
Reference: https://www.rfc-editor.org/rfc/rfc9110.html#section-5
-/
namespace Std.Http.Header
set_option linter.all true
open Internal Char
/--
Proposition asserting that a string is a valid HTTP header name: all characters are valid token
characters and the string is non-empty.
Reference: https://www.rfc-editor.org/rfc/rfc9110.html#name-field-names
-/
abbrev IsValidHeaderName (s : String) : Prop :=
isToken s
/--
A validated HTTP header name that ensures all characters conform to HTTP standards. Header names are
case-insensitive according to HTTP specifications.
Reference: https://www.rfc-editor.org/rfc/rfc9110.html#name-field-names
-/
@[ext]
structure Name where
/--
The lowercased normalized header name string.
-/
value : String
/--
The proof that it's a valid header name.
-/
isValidHeaderValue : IsValidHeaderName value := by decide
/--
The proof that the header name is stored as a lowercase string.
-/
isLowerCase : IsLowerCase value := by decide
deriving Repr, DecidableEq
namespace Name
instance : BEq Name where
beq a b := a.value = b.value
instance : Hashable Name where
hash x := Hashable.hash x.value
theorem Name.beq_eq {x y : Name} : (x == y) = (x.value == y.value) :=
rfl
instance : LawfulBEq Name where
rfl {x} := by simp [Name.beq_eq]
eq_of_beq {x y} := by grind [Name.beq_eq, Name.ext]
instance : LawfulHashable Name := inferInstance
instance : Inhabited Name where
default := ⟨"_", by decide, by decide⟩
/--
Attempts to create a `Name` from a `String`, returning `none` if the string contains invalid
characters for HTTP header names or is empty.
-/
def ofString? (s : String) : Option Name :=
let val := s.trimAscii.toString.toLower
if h : IsValidHeaderName val ∧ IsLowerCase val then
some ⟨val, h.left, h.right⟩
else
none
/--
Creates a `Name` from a string, panicking with an error message if the string contains invalid
characters for HTTP header names or is empty.
-/
def ofString! (s : String) : Name :=
match ofString? s with
| some res => res
| none => panic! s!"invalid header name: {s.quote}"
/--
Converts the header name to title case (e.g., "Content-Type").
Note: some well-known headers have unconventional casing (e.g., "WWW-Authenticate"),
but since HTTP header names are case-insensitive, this always uses simple capitalization.
-/
@[inline]
def toCanonical (name : Name) : String :=
let it := name.value.splitOn "-"
|>.map (·.capitalize)
String.intercalate "-" it
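A couple of illustrative examples (not part of the diff): `toCanonical` capitalizes each `-`-separated segment, so unconventionally cased headers normalize with simple capitalization, as noted in the docstring.

```lean
-- Illustrative only:
#eval (ofString! "content-type").toCanonical      -- "Content-Type"
#eval (ofString! "www-authenticate").toCanonical  -- "Www-Authenticate"
```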
/--
Performs a case-insensitive comparison between a `Name` and a `String`. Returns `true` if they match.
-/
@[expose]
def is (name : Name) (s : String) : Bool :=
name.value == s.toLower
instance : ToString Name where
toString name := name.toCanonical
/--
Standard Content-Type header name
-/
def contentType : Header.Name := .mk "content-type"
/--
Standard Content-Length header name
-/
def contentLength : Header.Name := .mk "content-length"
/--
Standard Host header name
-/
def host : Header.Name := .mk "host"
/--
Standard Authorization header name
-/
def authorization : Header.Name := .mk "authorization"
/--
Standard User-Agent header name
-/
def userAgent : Header.Name := .mk "user-agent"
/--
Standard Accept header name
-/
def accept : Header.Name := .mk "accept"
/--
Standard Connection header name
-/
def connection : Header.Name := .mk "connection"
/--
Standard Transfer-Encoding header name
-/
def transferEncoding : Header.Name := .mk "transfer-encoding"
/--
Standard Server header name
-/
def server : Header.Name := .mk "server"
/--
Standard Date header name
-/
def date : Header.Name := .mk "date"
/--
Standard Expect header name
-/
def expect : Header.Name := .mk "expect"
end Std.Http.Header.Name

View File

@@ -0,0 +1,103 @@
/-
Copyright (c) 2025 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Sofia Rodrigues
-/
module
prelude
public import Init.Data.ToString
public import Std.Internal.Http.Internal
public section
/-!
# Header Values
This module defines the `Value` type, which represents validated HTTP header values that conform to HTTP
standards.
Reference: https://www.rfc-editor.org/rfc/rfc9110.html#name-field-values
-/
namespace Std.Http.Header
set_option linter.all true
open Internal
/--
Proposition that asserts all characters in a string are valid for HTTP header values,
and that the first and last characters (if present) are `field-vchar` (not SP/HTAB).
field-value = *field-content
field-content = field-vchar
[ 1*( SP / HTAB / field-vchar ) field-vchar ]
Reference: https://www.rfc-editor.org/rfc/rfc9110.html#section-5.5
-/
abbrev IsValidHeaderValue (s : String) : Prop :=
let s := s.toList
s.all Char.fieldContent ∧
(s.head?.map Char.fieldVchar |>.getD true) ∧
(s.getLast?.map Char.fieldVchar |>.getD true)
/--
A validated HTTP header value that ensures all characters conform to HTTP standards.
-/
structure Value where
/--
The string data.
-/
value : String
/--
The proof that it's a valid header value.
-/
isValidHeaderValue : IsValidHeaderValue value := by decide
deriving BEq, DecidableEq, Repr
instance : Hashable Value where
hash := Hashable.hash ∘ Value.value
instance : Inhabited Value where
default := ⟨"", by decide⟩
namespace Value
/--
Attempts to create a `Value` from a `String`, returning `none` if the string contains invalid characters
for HTTP header values.
-/
@[expose]
def ofString? (s : String) : Option Value :=
-- A field value does not include leading or trailing whitespace.
let val := s.trimAscii.toString
if h : IsValidHeaderValue val then
some ⟨val, h⟩
else
none
/--
Creates a `Value` from a string, panicking with an error message if the string contains invalid
characters for HTTP header values.
-/
@[expose]
def ofString! (s : String) : Value :=
match ofString? s with
| some res => res
| none => panic! s!"invalid header value: {s.quote}"
/--
Performs a case-insensitive comparison between a `Value` and a `String`. Returns `true` if they match.
-/
@[expose]
def is (s : Value) (h : String) : Bool :=
s.value.toLower == h.toLower
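Illustrative behavior of the smart constructor (not part of the diff): `ofString?` trims leading/trailing whitespace and rejects values containing characters outside `field-content`.

```lean
-- Illustrative only:
#eval (ofString? "  text/html  ").map (·.value)  -- some "text/html"
#eval (ofString? "bad\nvalue").isSome            -- false: control character rejected
```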
instance : ToString Value where
toString v := v.value
end Std.Http.Header.Value

View File

@@ -9,6 +9,7 @@ prelude
public import Std.Internal.Http.Data.Extensions
public import Std.Internal.Http.Data.Method
public import Std.Internal.Http.Data.Version
public import Std.Internal.Http.Data.Headers
public section
@@ -50,6 +51,11 @@ structure Request.Head where
The raw request-target string (commonly origin-form path/query, `"*"`, or authority-form).
-/
uri : String
/--
Collection of HTTP headers for the request (Content-Type, Authorization, etc.).
-/
headers : Headers := .empty
deriving Inhabited, Repr
/--
@@ -93,6 +99,8 @@ instance : ToString Head where
toString req.method ++ " " ++
toString req.uri ++ " " ++
toString req.version ++
"\r\n" ++
toString req.headers ++
"\r\n"
open Internal in
@@ -103,6 +111,8 @@ instance : Encode .v11 Head where
let buffer := buffer.writeString req.uri
let buffer := buffer.writeChar ' '
let buffer := Encode.encode (v := .v11) buffer req.version
let buffer := buffer.writeString "\r\n"
let buffer := Encode.encode (v := .v11) buffer req.headers
buffer.writeString "\r\n"
/--
@@ -137,6 +147,43 @@ Sets the request target/URI for the request being built.
def uri (builder : Builder) (uri : String) : Builder :=
{ builder with line := { builder.line with uri := uri } }
/--
Sets the headers for the request being built.
-/
def headers (builder : Builder) (headers : Headers) : Builder :=
{ builder with line := { builder.line with headers } }
/--
Adds a single header to the request being built.
-/
def header (builder : Builder) (key : Header.Name) (value : Header.Value) : Builder :=
{ builder with line := { builder.line with headers := builder.line.headers.insert key value } }
/--
Adds a single header to the request being built, panics if the header is invalid.
-/
def header! (builder : Builder) (key : String) (value : String) : Builder :=
let key := Header.Name.ofString! key
let value := Header.Value.ofString! value
{ builder with line := { builder.line with headers := builder.line.headers.insert key value } }
/--
Adds a single header to the request being built.
Returns `none` if the header name or value is invalid.
-/
def header? (builder : Builder) (key : String) (value : String) : Option Builder := do
let key ← Header.Name.ofString? key
let value ← Header.Value.ofString? value
pure <| { builder with line := { builder.line with headers := builder.line.headers.insert key value } }
/--
Adds a header to the request being built only if the optional value is `some`.
-/
def headerOpt (builder : Builder) (key : Header.Name) (value : Option Header.Value) : Builder :=
match value with
| some v => builder.header key v
| none => builder
/--
Inserts a typed extension value into the request being built.
-/

View File

@@ -9,6 +9,7 @@ prelude
public import Std.Internal.Http.Data.Extensions
public import Std.Internal.Http.Data.Status
public import Std.Internal.Http.Data.Version
public import Std.Internal.Http.Data.Headers
public section
@@ -36,6 +37,12 @@ structure Response.Head where
The HTTP protocol version used in the response, e.g. `HTTP/1.1`.
-/
version : Version := .v11
/--
The set of response headers, providing metadata such as `Content-Type`,
`Content-Length`, and caching directives.
-/
headers : Headers := .empty
deriving Inhabited, Repr
/--
@@ -78,7 +85,9 @@ instance : ToString Head where
toString r :=
toString r.version ++ " " ++
toString r.status.toCode ++ " " ++
r.status.reasonPhrase ++ "\r\n"
r.status.reasonPhrase ++ "\r\n" ++
toString r.headers ++
"\r\n"
open Internal in
instance : Encode .v11 Head where
@@ -88,6 +97,8 @@ instance : Encode .v11 Head where
let buffer := buffer.writeString (toString r.status.toCode)
let buffer := buffer.writeChar ' '
let buffer := buffer.writeString r.status.reasonPhrase
let buffer := buffer.writeString "\r\n"
let buffer := Encode.encode (v := .v11) buffer r.headers
buffer.writeString "\r\n"
/--
@@ -108,6 +119,35 @@ Sets the HTTP status code for the response being built.
def status (builder : Builder) (status : Status) : Builder :=
{ builder with line := { builder.line with status := status } }
/--
Sets the headers for the response being built.
-/
def headers (builder : Builder) (headers : Headers) : Builder :=
{ builder with line := { builder.line with headers } }
/--
Adds a single header to the response being built.
-/
def header (builder : Builder) (key : Header.Name) (value : Header.Value) : Builder :=
{ builder with line := { builder.line with headers := builder.line.headers.insert key value } }
/--
Adds a single header to the response being built, panics if the header is invalid.
-/
def header! (builder : Builder) (key : String) (value : String) : Builder :=
let key := Header.Name.ofString! key
let value := Header.Value.ofString! value
{ builder with line := { builder.line with headers := builder.line.headers.insert key value } }
/--
Adds a single header to the response being built.
Returns `none` if the header name or value is invalid.
-/
def header? (builder : Builder) (key : String) (value : String) : Option Builder := do
let key ← Header.Name.ofString? key
let value ← Header.Value.ofString? value
pure <| { builder with line := { builder.line with headers := builder.line.headers.insert key value } }
/--
Inserts a typed extension value into the response being built.
-/

View File

@@ -6,7 +6,10 @@ Authors: Sofia Rodrigues
module
prelude
public import Std.Internal.Http.Internal.Char
public import Std.Internal.Http.Internal.ChunkedBuffer
public import Std.Internal.Http.Internal.LowerCase
public import Std.Internal.Http.Internal.IndexMultiMap
public import Std.Internal.Http.Internal.Encode
public import Std.Internal.Http.Internal.String
public import Std.Internal.Http.Internal.Char

View File

@@ -0,0 +1,281 @@
/-
Copyright (c) 2025 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Sofia Rodrigues
-/
module
prelude
public import Init.Grind
public import Init.Data.Int.OfNat
public import Std.Data.HashMap
public section
/-!
# IndexMultiMap
This module defines a generic `IndexMultiMap` type that maps keys to multiple values.
The implementation stores entries in a flat array for iteration and an index HashMap
for fast key lookups. Each key always has at least one associated value.
-/
namespace Std.Internal
open Std Internal
set_option linter.all true
/--
A structure for managing ordered key-value pairs where each key can have multiple values.
-/
structure IndexMultiMap (α : Type u) (β : Type v) [BEq α] [Hashable α] where
/--
Flat array of all key-value entries in insertion order.
-/
entries : Array (α × β)
/--
Maps each key to its indices in `entries`. Each array is non-empty.
-/
indexes : HashMap α (Array Nat)
/--
Invariant: every key in `indexes` maps to a non-empty array of valid indices into `entries`.
-/
validity : ∀ k : α, (p : k ∈ indexes) →
let idx := (indexes.get k p);
idx.size > 0 ∧ (∀ i ∈ idx, i < entries.size)
deriving Repr
instance [BEq α] [Hashable α] [Inhabited α] [Inhabited β] : Inhabited (IndexMultiMap α β) where
default := ⟨#[], .emptyWithCapacity, by intro h p; simp at p⟩
namespace IndexMultiMap
variable {α : Type u} {β : Type v} [BEq α] [Hashable α]
instance : Membership α (IndexMultiMap α β) where
mem map key := key ∈ map.indexes
instance (key : α) (map : IndexMultiMap α β) : Decidable (key ∈ map) :=
inferInstanceAs (Decidable (key ∈ map.indexes))
/--
Retrieves all values for the given key.
-/
@[inline]
def getAll (map : IndexMultiMap α β) (key : α) (h : key ∈ map) : Array β :=
let entries := map.indexes.get key h |>.mapFinIdx fun idx _ h₁ =>
let proof := map.validity key h |>.right _ (Array.getElem_mem h₁)
map.entries[(map.indexes.get key h)[idx]]'proof |>.snd
entries
/--
Retrieves the first value for the given key.
-/
@[inline]
def get (map : IndexMultiMap α β) (key : α) (h : key ∈ map) : β :=
let ⟨nonEmpty, isIn⟩ := map.validity key h
let entry := ((map.indexes.get key h)[0]'nonEmpty)
let proof := map.validity key h |>.right
entry
(by simp only [entry, HashMap.get_eq_getElem, Array.getElem_mem])
map.entries[entry]'proof |>.snd
/--
Retrieves all values for the given key, or `none` if the key is absent.
-/
@[inline]
def getAll? (map : IndexMultiMap α β) (key : α) : Option (Array β) :=
if h : key ∈ map then
some (map.getAll key h)
else
none
/--
Retrieves the first value for the given key, or `none` if the key is absent.
-/
@[inline]
def get? (map : IndexMultiMap α β) (key : α) : Option β :=
if h : key ∈ map then
some (map.get key h)
else
none
/--
Checks if the key-value pair is present in the map.
-/
@[inline]
def hasEntry (map : IndexMultiMap α β) [BEq β] (key : α) (value : β) : Bool :=
map.getAll? key
|>.bind (fun arr => arr.find? (· == value))
|>.isSome
/--
Retrieves the last value for the given key.
Returns `none` if the key is absent.
-/
@[inline]
def getLast? (map : IndexMultiMap α β) (key : α) : Option β :=
match map.getAll? key with
| none => none
| some idxs => idxs.back?
/--
Like `get?`, but returns a default value if absent.
-/
@[inline]
def getD (map : IndexMultiMap α β) (key : α) (d : β) : β :=
map.get? key |>.getD d
/--
Like `get?`, but panics if absent.
-/
@[inline]
def get! [Inhabited β] (map : IndexMultiMap α β) (key : α) : β :=
map.get? key |>.get!
/--
Inserts a new key-value pair into the map.
If the key already exists, appends the value to existing values.
-/
@[inline]
def insert [EquivBEq α] [LawfulHashable α] (map : IndexMultiMap α β) (key : α) (value : β) : IndexMultiMap α β :=
let i := map.entries.size
let entries := map.entries.push (key, value)
let f := fun
| some idxs => some (idxs.push i)
| none => some #[i]
let indexes := map.indexes.alter key f
{ entries, indexes, validity := ?_ }
where finally
have _ := map.validity
grind
/--
Inserts multiple values for a given key, appending to any existing values.
-/
@[inline]
def insertMany [EquivBEq α] [LawfulHashable α] (map : IndexMultiMap α β) (key : α) (values : Array β) : IndexMultiMap α β :=
values.foldl (insert · key) map
/--
Creates an empty multimap.
-/
def empty : IndexMultiMap α β :=
⟨#[], .emptyWithCapacity, by intro h p; simp at p⟩
/--
Creates a multimap from a list of key-value pairs.
-/
def ofList [EquivBEq α] [LawfulHashable α] (pairs : List (α × β)) : IndexMultiMap α β :=
pairs.foldl (fun acc (k, v) => acc.insert k v) empty
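A small illustrative example (not part of the diff): insertion order is preserved in `entries`, and duplicate keys accumulate their values.

```lean
-- Illustrative only:
private def sampleMap : IndexMultiMap String Nat :=
  ofList [("a", 1), ("b", 2), ("a", 3)]

#eval sampleMap.getAll? "a"   -- some #[1, 3]
#eval sampleMap.getLast? "a"  -- some 3
#eval sampleMap.toList        -- [("a", 1), ("b", 2), ("a", 3)]
```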
/--
Checks if a key exists in the map.
-/
@[inline]
def contains (map : IndexMultiMap α β) (key : α) : Bool :=
map.indexes.contains key
/--
Updates all values associated with `key` by applying `f` to each one.
If the key is absent, returns the map unchanged.
-/
@[inline]
def update [EquivBEq α] [LawfulHashable α] (map : IndexMultiMap α β) (key : α) (f : β → β) : IndexMultiMap α β :=
if key ∉ map then
map
else
map.entries.foldl (fun acc (k, v) => acc.insert k (if k == key then f v else v)) empty
/--
Replaces the last value associated with `key` with `value`.
If the key is absent, returns the map unchanged.
-/
@[inline]
def replaceLast (map : IndexMultiMap α β) (key : α) (value : β) : IndexMultiMap α β :=
if h : key ∈ map then
let idxs := map.indexes.get key h
let ⟨nonEmpty, isIn⟩ := map.validity key h
let lastPos : Fin idxs.size := ⟨idxs.size - 1, Nat.sub_lt nonEmpty (by omega)⟩
let lastIdx : Nat := idxs[lastPos]
have lastIdxValid : lastIdx < map.entries.size := isIn lastIdx (Array.getElem_mem lastPos.isLt)
let entries := map.entries.set (Fin.mk lastIdx lastIdxValid) (key, value)
{ map with entries, validity := ?_ }
else
map
where finally
have _ := map.validity
grind
/--
Removes a key and all its values from the map. This function rebuilds the entire
`entries` array and `indexes` map from scratch by filtering out all pairs whose
key matches, then re-inserting the survivors.
-/
@[inline]
def erase [EquivBEq α] [LawfulHashable α] (map : IndexMultiMap α β) (key : α) : IndexMultiMap α β :=
if key ∉ map then
map
else
map.entries.filter (fun (k, _) => !(k == key))
|>.foldl (fun acc (k, v) => acc.insert k v) empty
/--
Gets the number of entries in the map.
-/
@[inline]
def size (map : IndexMultiMap α β) : Nat :=
map.entries.size
/--
Checks if the map is empty.
-/
@[inline]
def isEmpty (map : IndexMultiMap α β) : Bool :=
map.entries.isEmpty
/--
Converts the multimap to an array of key-value pairs (flattened).
-/
def toArray (map : IndexMultiMap α β) : Array (α × β) :=
map.entries
/--
Converts the multimap to a list of key-value pairs (flattened).
-/
def toList (map : IndexMultiMap α β) : List (α × β) :=
map.entries.toList
/--
Merges two multimaps, combining values for shared keys.
-/
def merge [EquivBEq α] [LawfulHashable α] (m1 m2 : IndexMultiMap α β) : IndexMultiMap α β :=
m2.entries.foldl (fun acc (k, v) => acc.insert k v) m1
instance : EmptyCollection (IndexMultiMap α β) :=
IndexMultiMap.empty
instance [EquivBEq α] [LawfulHashable α] : Singleton (α × β) (IndexMultiMap α β) :=
⟨fun (a, b) => (∅ : IndexMultiMap α β).insert a b⟩
instance [EquivBEq α] [LawfulHashable α] : Insert (α × β) (IndexMultiMap α β) :=
⟨fun (a, b) m => m.insert a b⟩
instance [EquivBEq α] [LawfulHashable α] : Union (IndexMultiMap α β) :=
⟨merge⟩
instance [Monad m] : ForIn m (IndexMultiMap α β) (α × β) where
forIn map b f := forIn map.entries b f
end Std.Internal.IndexMultiMap

View File

@@ -0,0 +1,68 @@
/-
Copyright (c) 2025 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Sofia Rodrigues
-/
module
prelude
import Init.Grind
import Init.Data.Int.OfNat
import Init.Data.UInt.Lemmas
public import Init.Data.String
@[expose]
public section
/-!
# LowerCase
This module provides predicates and normalization functions for handling ASCII case-insensitivity.
It includes proofs of idempotency for lowercase transformations, as well as utilities for validating
the lowercase state of a String.
-/
namespace Std.Http.Internal
set_option linter.all true
/--
Predicate asserting that a string is in lowercase form.
-/
@[expose] def IsLowerCase (s : String) : Prop :=
s.toLower = s
private theorem Char.toLower_eq_self_iff {c : Char} : c.toLower = c ↔ c.isUpper = false := by
simp only [Char.toLower, Char.isUpper]
split <;> rename_i h <;> simpa [UInt32.le_iff_toNat_le, Char.ext_iff] using h
private theorem String.toLower_eq_self_iff {s : String} : s.toLower = s ↔ s.toList.any Char.isUpper = false := by
simp only [String.toLower, String.toList_inj, String.toList_map]
rw (occs := [2]) [← List.map_id s.toList]
rw [List.map_eq_map_iff]
simp [Char.toLower_eq_self_iff]
instance : Decidable (IsLowerCase s) :=
decidable_of_decidable_of_iff (p := s.toList.any Char.isUpper = false)
(by exact String.toLower_eq_self_iff.symm)
namespace IsLowerCase
private theorem Char.toLower_idempotent (c : Char) : c.toLower.toLower = c.toLower := by
grind [Char.toLower]
/--
Proof that applying `toLower` to any string results in a string that satisfies the `IsLowerCase`
predicate.
-/
theorem isLowerCase_toLower {s : String} : IsLowerCase s.toLower := by
unfold IsLowerCase String.toLower
rw [String.map_map, Function.comp_def]
simp [Char.toLower_idempotent]
theorem isLowerCase_empty : IsLowerCase "" := by
simp [IsLowerCase, String.toLower]
end IsLowerCase
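Since the predicate is decidable, membership proofs can be discharged by `decide` (illustrative, not part of the diff):

```lean
-- Illustrative only:
example : IsLowerCase "content-type" := by decide
example : ¬ IsLowerCase "Content-Type" := by decide
```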
end Std.Http.Internal

View File

@@ -566,7 +566,7 @@ instance : HAdd PlainDateTime Millisecond.Offset PlainDateTime where
hAdd := addMilliseconds
instance : HSub PlainDateTime Millisecond.Offset PlainDateTime where
hSub := addMilliseconds
hSub := subMilliseconds
instance : HAdd PlainDateTime Second.Offset PlainDateTime where
hAdd := addSeconds

View File

@@ -384,7 +384,7 @@ instance : HAdd PlainTime Duration PlainTime where
hAdd pt d := PlainTime.ofNanoseconds (d.toNanoseconds + pt.toNanoseconds)
instance : HSub PlainTime Duration PlainTime where
hSub pt d := PlainTime.ofNanoseconds (d.toNanoseconds - pt.toNanoseconds)
hSub pt d := PlainTime.ofNanoseconds (pt.toNanoseconds - d.toNanoseconds)
end Duration
end Time

View File

@@ -87,21 +87,20 @@ public def compileO
}
public def mkArgs (basePath : FilePath) (args : Array String) : LogIO (Array String) := do
if Platform.isWindows then
-- Use response file to avoid potentially exceeding CLI length limits.
let rspFile := basePath.addExtension "rsp"
let h ← IO.FS.Handle.mk rspFile .write
args.forM fun arg =>
-- Escape special characters
let arg := arg.foldl (init := "") fun s c =>
if c == '\\' || c == '"' then
s.push '\\' |>.push c
else
s.push c
h.putStr s!"\"{arg}\"\n"
return #[s!"@{rspFile}"]
else
return args
-- Use response file to avoid potentially exceeding CLI length limits.
-- On Windows this is always needed; on macOS/Linux this is needed for large
-- projects like Mathlib where the number of object files exceeds ARG_MAX.
let rspFile := basePath.addExtension "rsp"
let h ← IO.FS.Handle.mk rspFile .write
args.forM fun arg =>
-- Escape special characters
let arg := arg.foldl (init := "") fun s c =>
if c == '\\' || c == '"' then
s.push '\\' |>.push c
else
s.push c
h.putStr s!"\"{arg}\"\n"
return #[s!"@{rspFile}"]
public def compileStaticLib
(libFile : FilePath) (oFiles : Array FilePath)

View File

@@ -499,43 +499,71 @@ public def cacheArtifact
/-- **For internal use only.** -/
public class ResolveOutputs (α : Type) where
/-- **For internal use only.** -/
resolveOutputs (out : Json)
(service? : Option CacheServiceName) (scope? : Option CacheServiceScope) : JobM α
resolveOutputs (out : CacheOutput) : JobM α
open ResolveOutputs in
/--
Retrieve artifacts from the Lake cache using the outputs stored
in either the saved trace file or in the cached input-to-content mapping.
Retrieve artifacts from the Lake cache using only the outputs
stored in the cached input-to-content mapping.
**For internal use only.**
-/
@[specialize] public nonrec def getArtifacts?
[ResolveOutputs α] (inputHash : Hash) (savedTrace : SavedTrace) (pkg : Package)
@[specialize] private def getArtifactsUsingCache?
[ResolveOutputs α] (inputHash : Hash) (pkg : Package)
: JobM (Option α) := do
let cache ← getLakeCache
let updateCache ← pkg.isArtifactCacheWritable
if let some out ← cache.readOutputs? pkg.cacheScope inputHash then
if let some out ← (← getLakeCache).readOutputs? pkg.cacheScope inputHash then
try
return some (← resolveOutputs out.data out.service? out.scope?)
return some (← resolveOutputs out)
catch e =>
let log takeLogFrom e
let msg := s!"input '{inputHash.toString.take 7}' found in package artifact cache, \
but some output(s) have issues:"
let msg := log.entries.foldl (s!"{·}\n- {·.message}") msg
logWarning msg
return none
else
return none
open ResolveOutputs in
/--
Retrieve artifacts from the Lake cache using only the outputs stored in the saved trace file.
**For internal use only.**
-/
@[specialize] public def getArtifactsUsingTrace?
[ResolveOutputs α] (inputHash : Hash) (savedTrace : SavedTrace) (pkg : Package)
: JobM (Option α) := do
if let .ok data := savedTrace then
if data.depHash == inputHash then
if let some out := data.outputs? then
try
let arts ← resolveOutputs out none none
if updateCache then
if let .error e ← (cache.writeOutputs pkg.cacheScope inputHash out).toBaseIO then
let arts ← resolveOutputs (.ofData out)
if (← pkg.isArtifactCacheWritable) then
let act := (← getLakeCache).writeOutputs pkg.cacheScope inputHash out
if let .error e ← act.toBaseIO then
logWarning s!"could not write outputs to cache: {e}"
return some arts
catch e =>
dropLogFrom e
return none
open ResolveOutputs in
/--
Retrieve artifacts from the Lake cache using the outputs stored in either
the saved trace file or (unless `traceOnly` is `true`) in the cached input-to-content mapping.
**For internal use only.**
-/
@[inline] public nonrec def getArtifacts?
[ResolveOutputs α] (inputHash : Hash) (savedTrace : SavedTrace) (pkg : Package)
: JobM (Option α) := do
if let some a ← getArtifactsUsingCache? inputHash pkg then
return some a
if let some a ← getArtifactsUsingTrace? inputHash savedTrace pkg then
return some a
return none
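The combined lookup thus prefers the cached input-to-content map and falls back to the saved trace file. Assuming `JobM` carries the usual monad instances and both helpers are in scope, the same short-circuit could be phrased with `OptionT`'s `Alternative` instance; a hypothetical, behaviorally equivalent sketch (not part of the change):

```lean
-- Sketch only: `<|>` on `OptionT` runs the second lookup
-- exactly when the first returns `none`.
def getArtifacts?' [ResolveOutputs α]
    (inputHash : Hash) (savedTrace : SavedTrace) (pkg : Package)
    : JobM (Option α) :=
  OptionT.run <|
    OptionT.mk (getArtifactsUsingCache? inputHash pkg) <|>
    OptionT.mk (getArtifactsUsingTrace? inputHash savedTrace pkg)
```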
/-- **For internal use only.** -/
public def resolveArtifact
(descr : ArtifactDescr)
@@ -549,6 +577,7 @@ public def resolveArtifact
| .error (.noFileOrDirectory ..) =>
-- we redownload artifacts on any error
if let some service := service? then
updateAction .fetch
if let some service := ws.findCacheService? service.toString then
let some scope := scope?
| error s!"artifact with associated cache service but no scope"
@@ -573,19 +602,17 @@ public def resolveArtifact
| .error e =>
error s!"failed to retrieve artifact from cache: {e}"
def resolveArtifactOutput
(out : Json) (service? : Option CacheServiceName) (scope? : Option CacheServiceScope)
(exe := false)
: JobM Artifact := do
match fromJson? out with
| .ok descr => resolveArtifact descr service? scope? exe
| .error e => error s!"ill-formed artifact output:\n{out.render.pretty 80 2}\n{e}"
/-- **For internal use only.** -/
public def resolveArtifactOutput (out : CacheOutput) (exe := false) : JobM Artifact := do
match fromJson? out.data with
| .ok descr => resolveArtifact descr out.service? out.scope? exe
| .error e => error s!"ill-formed artifact output:\n{out.data.render.pretty 80 2}\n{e}"
set_option linter.unusedVariables false in
/-- An artifact equipped with information about whether it is executable. -/
def XArtifact (exe : Bool) := Artifact
instance : ResolveOutputs (XArtifact exe) := resolveArtifactOutput (exe := exe)
instance : ResolveOutputs (XArtifact exe) := (resolveArtifactOutput (exe := exe))
/--
Construct an artifact from a path outside the Lake artifact cache.
@@ -675,8 +702,9 @@ public def buildArtifactUnlessUpToDate
if let some art ← fetchArt? (restore := true) then
return art
doBuild depTrace traceFile
if let some outputsRef := pkg.outputsRef? then
outputsRef.insert inputHash art.descr
if pkg.isRoot then
if let some outputsRef := (← getBuildContext).outputsRef? then
outputsRef.insert inputHash art.descr
setTrace art.trace
setMTime art traceFile
else


@@ -6,6 +6,7 @@ Authors: Mac Malone
module
prelude
public import Lake.Config.Cache
public import Lake.Config.Context
public import Lake.Build.Job.Basic
@@ -48,6 +49,11 @@ public def JobQueue := IO.Ref (Array OpaqueJob)
public structure BuildContext extends BuildConfig, Context where
leanTrace : BuildTrace
registeredJobs : JobQueue
/--
Input-to-output(s) map for hashes of the root package's artifacts.
If `none`, tracking outputs is disabled for this build.
-/
outputsRef? : Option CacheRef := none
/-- A transformer to equip a monad with a `BuildContext`. -/
public abbrev BuildT := ReaderT BuildContext


@@ -126,6 +126,9 @@ Its trace just includes its dependencies.
-/
builtin_facet leanArts : Module => ModuleOutputArtifacts
/-- A compressed archive (produced via `leantar`) of the module's build artifacts. -/
builtin_facet ltar : Module => FilePath
/-- The `olean` file produced by `lean`. -/
builtin_facet olean : Module => FilePath


@@ -180,6 +180,9 @@ namespace Module
@[inherit_doc cFacet] public abbrev bc (self : Module) :=
self.facetCore bcFacet
@[inherit_doc ltarFacet] public abbrev ltar (self : Module) :=
self.facetCore ltarFacet
@[inherit_doc oFacet] public abbrev o (self : Module) :=
self.facetCore oFacet


@@ -562,6 +562,7 @@ public def Module.depsFacetConfig : ModuleFacetConfig depsFacet :=
/-- Remove all existing artifacts produced by the Lean build of the module. -/
public def Module.clearOutputArtifacts (mod : Module) : IO PUnit := do
try
removeFileIfExists mod.ltarFile
removeFileIfExists mod.oleanFile
removeFileIfExists mod.oleanServerFile
removeFileIfExists mod.oleanPrivateFile
@@ -575,6 +576,7 @@ public def Module.clearOutputArtifacts (mod : Module) : IO PUnit := do
/-- Remove any cached file hashes of the module build outputs (in `.hash` files). -/
public def Module.clearOutputHashes (mod : Module) : IO PUnit := do
try
clearFileHash mod.ltarFile
clearFileHash mod.oleanFile
clearFileHash mod.oleanServerFile
clearFileHash mod.oleanPrivateFile
@@ -587,6 +589,8 @@ public def Module.clearOutputHashes (mod : Module) : IO PUnit := do
/-- Cache the file hashes of the module build outputs in `.hash` files. -/
public def Module.cacheOutputHashes (mod : Module) : IO PUnit := do
if (← mod.ltarFile.pathExists) then
cacheFileHash mod.ltarFile
cacheFileHash mod.oleanFile
if (← mod.oleanServerFile.pathExists) then
cacheFileHash mod.oleanServerFile
@@ -599,37 +603,68 @@ public def Module.cacheOutputHashes (mod : Module) : IO PUnit := do
if Lean.Internal.hasLLVMBackend () then
cacheFileHash mod.bcFile
def resolveModuleOutputs
(out : Json) (service? : Option CacheServiceName) (scope? : Option CacheServiceScope)
def ModuleOutputDescrs.resolve
(descrs : ModuleOutputDescrs) (service? : Option CacheServiceName) (scope? : Option CacheServiceScope)
: JobM ModuleOutputArtifacts := do
match fromJson? out with
| .ok (descrs : ModuleOutputDescrs) => do
let arts : ModuleOutputArtifacts := {
olean := ← resolve descrs.olean
oleanServer? := ← descrs.oleanServer?.mapM resolve
oleanPrivate? := ← descrs.oleanPrivate?.mapM resolve
ir? := ← descrs.ir?.mapM resolve
ilean := ← resolve descrs.ilean
c := ← resolve descrs.c
bc? := none
}
if Lean.Internal.hasLLVMBackend () then
let some descr := descrs.bc?
| error "LLVM backend enabled but module outputs lack bitcode"
return {arts with bc? := some (← resolve descr)}
else
return arts
| .error e =>
error s!"ill-formed module outputs:\n{out.render.pretty 80 2}\n{e}"
let arts : ModuleOutputArtifacts := {
isModule := descrs.isModule
olean := ← resolve descrs.olean
oleanServer? := ← descrs.oleanServer?.mapM resolve
oleanPrivate? := ← descrs.oleanPrivate?.mapM resolve
ir? := ← descrs.ir?.mapM resolve
ilean := ← resolve descrs.ilean
c := ← resolve descrs.c
ltar? := ← descrs.ltar?.mapM resolve
}
if Lean.Internal.hasLLVMBackend () then
let some descr := descrs.bc?
| error "LLVM backend enabled but module outputs lack bitcode"
return {arts with bc? := some (← resolve descr)}
else
return arts
where @[inline] resolve descr := resolveArtifact descr service? scope?
instance : ResolveOutputs ModuleOutputArtifacts := resolveModuleOutputs
def resolveModuleOutputArtifacts (out : CacheOutput) : JobM ModuleOutputArtifacts := do
match fromJson? out.data with
| .ok (descrs : ModuleOutputDescrs) => descrs.resolve out.service? out.scope?
| .error e => error s!"ill-formed module outputs:\n{out.data.render.pretty 80 2}\n{e}"
instance : ResolveOutputs ModuleOutputArtifacts := resolveModuleOutputArtifacts
inductive ModuleOutputs
| ltar (art : Artifact)
| arts (arts : ModuleOutputArtifacts)
instance : Coe ModuleOutputArtifacts ModuleOutputs := .arts
def resolveModuleOutputs (out : CacheOutput) : JobM ModuleOutputs := do
if let .str descr := out.data then
match ArtifactDescr.ofFilePath? descr with
| .ok descr =>
return .ltar (← resolveArtifact descr out.service? out.scope?)
| .error e =>
error s!"ill-formed module archive output:\n{out.data.render.pretty 80 2}\n{e}"
else
match fromJson? out.data with
| .ok (descrs : ModuleOutputDescrs) =>
if let some descr := descrs.ltar? then
try
descrs.resolve out.service? out.scope?
catch e =>
dropLogFrom e
return .ltar (← resolveArtifact descr out.service? out.scope?)
else
descrs.resolve out.service? out.scope?
| .error e => error s!"ill-formed module outputs:\n{out.data.render.pretty 80 2}\n{e}"
instance : ResolveOutputs ModuleOutputs := resolveModuleOutputs
/-- Save module build artifacts to the local Lake cache. -/
private def Module.cacheOutputArtifacts
(mod : Module) (isModule : Bool) (useLocalFile : Bool)
: JobM ModuleOutputArtifacts := do
return {
isModule
olean := ← cache mod.oleanFile "olean"
oleanServer? := ← cacheIf? isModule mod.oleanServerFile "olean.server"
oleanPrivate? := ← cacheIf? isModule mod.oleanPrivateFile "olean.private"
@@ -637,6 +672,7 @@ private def Module.cacheOutputArtifacts
ilean := ← cache mod.ileanFile "ilean"
c := ← cache mod.cFile "c"
bc? := ← cacheIf? (Lean.Internal.hasLLVMBackend ()) mod.bcFile "bc"
ltar? := ← cacheIf? (← mod.ltarFile.pathExists) mod.ltarFile "ltar"
}
where
@[inline] cache file ext := do
@@ -664,11 +700,12 @@ private def Module.restoreAllArtifacts (mod : Module) (cached : ModuleOutputArti
ir? := ← restoreSome mod.irFile cached.ir?
c := ← restoreArtifact mod.cFile cached.c
bc? := ← restoreSome mod.bcFile cached.bc?
ltar? := ← restoreSome mod.ltarFile cached.ltar?
}
where
@[inline] restoreSome file art? := art?.mapM (restoreArtifact file ·)
public protected def Module.checkExists (self : Module) (isModule : Bool) : BaseIO Bool := do
public def Module.checkArtifactsExsist (self : Module) (isModule : Bool) : BaseIO Bool := do
unless ( self.oleanFile.pathExists) do return false
unless ( self.ileanFile.pathExists) do return false
unless ( self.cFile.pathExists) do return false
@@ -680,22 +717,30 @@ public protected def Module.checkExists (self : Module) (isModule : Bool) : Base
unless ( self.irFile.pathExists) do return false
return true
public protected def Module.checkExists (self : Module) (isModule : Bool) : BaseIO Bool := do
self.ltarFile.pathExists <||> self.checkArtifactsExsist isModule
@[deprecated Module.checkExists (since := "2025-03-04")]
public instance : CheckExists Module := Module.checkExists (isModule := false)
public protected def Module.getMTime (self : Module) (isModule : Bool) : IO MTime := do
let mut mtime :=
(← getMTime self.oleanFile)
|> max (← getMTime self.ileanFile)
|> max (← getMTime self.cFile)
if Lean.Internal.hasLLVMBackend () then
mtime := max mtime (← getMTime self.bcFile)
if isModule then
mtime := mtime
|> max (← getMTime self.oleanServerFile)
|> max (← getMTime self.oleanPrivateFile)
|> max (← getMTime self.irFile)
return mtime
try
let mut mtime :=
(← getMTime self.oleanFile)
|> max (← getMTime self.ileanFile)
|> max (← getMTime self.cFile)
if Lean.Internal.hasLLVMBackend () then
mtime := max mtime (← getMTime self.bcFile)
if isModule then
mtime := mtime
|> max (← getMTime self.oleanServerFile)
|> max (← getMTime self.oleanPrivateFile)
|> max (← getMTime self.irFile)
return mtime
catch e =>
try getMTime self.ltarFile catch
| .noFileOrDirectory .. => throw e
| e => throw e
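The rewritten `Module.getMTime` wraps the per-file queries in `try`/`catch` so that a module whose individual artifacts are missing can still report the archive's timestamp, and the original error is surfaced when the `ltar` file is absent as well. The pattern in isolation, simplified to rethrow the original error on any fallback failure (names hypothetical):

```lean
-- Sketch of the fallback pattern above: try the primary timestamp
-- computation; on failure consult the archive, and if that fails too,
-- rethrow the *original* error rather than the fallback's.
def mtimeWithArchiveFallback (primary archive : IO MTime) : IO MTime := do
  try primary
  catch e =>
    try archive catch _ => throw e
```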
@[deprecated Module.getMTime (since := "2025-03-04")]
public instance : GetMTime Module := Module.getMTime (isModule := false)
@@ -711,7 +756,6 @@ private def ModuleOutputArtifacts.setMTime (self : ModuleOutputArtifacts) (mtime
bc? := self.bc?.map ({· with mtime})
}
private def Module.mkArtifacts (mod : Module) (srcFile : FilePath) (isModule : Bool) : ModuleArtifacts where
lean? := srcFile
olean? := mod.oleanFile
@@ -724,6 +768,7 @@ private def Module.mkArtifacts (mod : Module) (srcFile : FilePath) (isModule : B
private def Module.computeArtifacts (mod : Module) (isModule : Bool) : FetchM ModuleOutputArtifacts :=
return {
isModule
olean := ← compute mod.oleanFile "olean"
oleanServer? := ← computeIf isModule mod.oleanServerFile "olean.server"
oleanPrivate? := ← computeIf isModule mod.oleanPrivateFile "olean.private"
@@ -741,6 +786,66 @@ where
instance : ToOutputJson ModuleOutputArtifacts := (toJson ·.descrs)
def Module.packLtar (self : Module) (arts : ModuleOutputArtifacts) : JobM Artifact := do
let arts ← id do
if (← self.checkArtifactsExsist arts.isModule) then
return arts
else self.restoreAllArtifacts arts
let args ← id do
let mut args := #[
"-C", self.leanLibDir.toString,
"-C", self.irDir.toString,
self.ltarFile.toString,
self.fileName "trace"
]
let addArt args idx art :=
args.push "-i" |>.push idx |>.push (self.fileName art.ext)
-- Note: oleans parts must be in the order of `.olean`, `.olean.server`, `.olean.private`
args := addArt args "0" arts.olean
if let some art := arts.oleanServer? then
args := addArt args "0" art
if let some art := arts.oleanPrivate? then
args := addArt args "0" art
args := addArt args "0" arts.ilean
if let some art := arts.ir? then
args := addArt args "0" art
args := addArt args "1" arts.c
if Lean.Internal.hasLLVMBackend () then
let some art := arts.bc?
| error "LLVM backend enabled but module outputs lack bitcode"
args := addArt args "1" art
return args
proc (quiet := true) {cmd := (← getLeantar).toString, args}
if (← self.pkg.isArtifactCacheWritable) then
cacheArtifact self.ltarFile "ltar" (← self.pkg.restoreAllArtifacts)
else
computeArtifact self.ltarFile "ltar"
def Module.unpackLtar (self : Module) (ltar : FilePath) : JobM Unit := do
let args := #[
"-C", self.leanLibDir.toString,
"-C", self.irDir.toString,
"-x", ltar.toString
]
proc (quiet := true) {cmd := (← getLeantar).toString, args}
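Taken together, `packLtar` and `unpackLtar` drive `leantar` with two `-C` base directories, so each `-i <index> <file>` pair addresses a file relative to either the Lean library directory (index `0`) or the IR directory (index `1`). A comment-only sketch of the resulting invocations, with illustrative paths:

```lean
-- pack:   leantar -C <leanLibDir> -C <irDir> <mod>.ltar <mod>.trace
--           -i 0 <mod>.olean [-i 0 <mod>.olean.server] … -i 0 <mod>.ilean
--           -i 1 <mod>.c [-i 1 <mod>.bc]
-- unpack: leantar -C <leanLibDir> -C <irDir> -x <mod>.ltar
```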
private def Module.recBuildLtar (self : Module) : FetchM (Job FilePath) := do
withRegisterJob s!"{self.name}:ltar" <| withCurrPackage self.pkg do
(← self.leanArts.fetch).mapM fun arts => do
let art ← arts.ltar?.getDM do
if (← getNoBuild) then
modify ({· with wantsRebuild := true})
error "archive does not exist and needs to be built"
else
self.packLtar arts
newTrace s!"{self.name.toString}:ltar"
addTrace art.trace
return art.path
/-- The `ModuleFacetConfig` for the builtin `ltarFacet`. -/
public def Module.ltarFacetConfig : ModuleFacetConfig ltarFacet :=
mkFacetJobConfig recBuildLtar
private def Module.buildLean
(mod : Module) (depTrace : BuildTrace) (srcFile : FilePath) (setup : ModuleSetup)
: JobM ModuleOutputArtifacts := buildAction depTrace mod.traceFile do
@@ -786,45 +891,100 @@ private def Module.recBuildLean (mod : Module) : FetchM (Job ModuleOutputArtifac
addPureTrace mod.pkg.id? "Package.id?"
addPureTrace mod.leanArgs "Module.leanArgs"
setTraceCaption s!"{mod.name.toString}:leanArts"
let depTrace ← getTrace
let inputHash := depTrace.hash
let savedTrace ← readTraceFile mod.traceFile
have : CheckExists Module := Module.checkExists (isModule := setup.isModule)
have : GetMTime Module := Module.getMTime (isModule := setup.isModule)
let fetchArtsFromCache? restoreAll := do
let some arts ← getArtifacts? inputHash savedTrace mod.pkg
| return none
let arts ← fetchCore setup srcFile srcTrace
let arts ← trackOutputsIfEnabled arts
let arts ← adjustMTime arts
return arts
where
fetchFromCache? setup savedTrace restoreAll : JobM (ModuleOutputArtifacts ⊕ SavedTrace) := do
let inputHash := (← getTrace).hash
let some ltarOrArts ← getArtifacts? inputHash savedTrace mod.pkg
| return .inr savedTrace
match (ltarOrArts : ModuleOutputs) with
| .ltar ltar =>
mod.clearOutputArtifacts
mod.unpackLtar ltar.path
-- Note: This branch implies that only the ltar output is (validly) cached.
-- Thus, we use only the new trace unpacked from the ltar to resolve further artifacts.
let savedTrace ← readTraceFile mod.traceFile
let arts? ← getArtifactsUsingTrace? inputHash savedTrace mod.pkg
if let some (arts : ModuleOutputArtifacts) := arts? then
-- on initial unpack from cache ensure all artifacts uniformly
-- end up in the build directory and, if writable, the cache
let arts ← mod.restoreAllArtifacts {arts with ltar? := some ltar}
if (← mod.pkg.isArtifactCacheWritable) then
.inl <$> mod.cacheOutputArtifacts setup.isModule restoreAll
else
return .inl arts
else
return .inr savedTrace
| .arts arts =>
unless (← savedTrace.replayOrFetchIfUpToDate inputHash) do
mod.clearOutputArtifacts
writeFetchTrace mod.traceFile inputHash (toJson arts.descrs)
if restoreAll then
some <$> mod.restoreAllArtifacts arts
else
some <$> mod.restoreNeededArtifacts arts
let arts ← id do
if (← mod.pkg.isArtifactCacheWritable) then
let restore ← mod.pkg.restoreAllArtifacts
if let some arts ← fetchArtsFromCache? restore then
let arts ←
if restoreAll then
mod.restoreAllArtifacts arts
else
mod.restoreNeededArtifacts arts
return .inl arts
fetchCore setup srcFile srcTrace : JobM ModuleOutputArtifacts := do
let depTrace ← getTrace
have : GetMTime Module := Module.getMTime (isModule := setup.isModule)
have : CheckExists Module := Module.checkExists (isModule := setup.isModule)
let savedTrace ← readTraceFile mod.traceFile
if (← mod.pkg.isArtifactCacheWritable) then
let restore ← mod.pkg.restoreAllArtifacts
match (← fetchFromCache? setup savedTrace restore) with
| .inl arts =>
return arts
| .inr savedTrace =>
let status ← savedTrace.replayIfUpToDate' (oldTrace := srcTrace.mtime) mod depTrace
if status.isUpToDate then
unless (← mod.checkArtifactsExsist setup.isModule) do
mod.unpackLtar mod.ltarFile
else
discard <| mod.buildLean depTrace srcFile setup
if status.isCacheable then
let arts ← mod.cacheOutputArtifacts setup.isModule restore
(← getLakeCache).writeOutputs mod.pkg.cacheScope depTrace.hash arts.descrs
return arts
else
let status ← savedTrace.replayIfUpToDate' (oldTrace := srcTrace.mtime) mod depTrace
unless status.isUpToDate do
discard <| mod.buildLean depTrace srcFile setup
if status.isCacheable then
let arts ← mod.cacheOutputArtifacts setup.isModule restore
(← getLakeCache).writeOutputs mod.pkg.cacheScope inputHash arts.descrs
return arts
else
mod.computeArtifacts setup.isModule
else if (← savedTrace.replayIfUpToDate (oldTrace := srcTrace.mtime) mod depTrace) then
mod.computeArtifacts setup.isModule
else
if (← savedTrace.replayIfUpToDate (oldTrace := srcTrace.mtime) mod depTrace) then
unless (← mod.checkArtifactsExsist setup.isModule) do
mod.unpackLtar mod.ltarFile
mod.computeArtifacts setup.isModule
else
if (← mod.pkg.isArtifactCacheReadable) then
if let some arts ← fetchArtsFromCache? true then
return arts
mod.buildLean depTrace srcFile setup
if let some ref := mod.pkg.outputsRef? then
ref.insert inputHash arts.descrs
match (← fetchFromCache? setup savedTrace true) with
| .inl arts =>
return arts
| .inr savedTrace =>
if (← savedTrace.replayIfUpToDate (oldTrace := srcTrace.mtime) mod depTrace) then
mod.computeArtifacts setup.isModule
else
mod.buildLean depTrace srcFile setup
else
mod.buildLean depTrace srcFile setup
trackOutputsIfEnabled arts : JobM ModuleOutputArtifacts := do
if mod.pkg.isRoot then
if let some ref := (← getBuildContext).outputsRef? then
let inputHash := (← getTrace).hash
if let some ltar := arts.ltar? then
ref.insert inputHash ltar.descr
return arts
else
let ltar ← id do
if (← mod.ltarFile.pathExists) then
computeArtifact mod.ltarFile "ltar"
else
mod.packLtar arts
ref.insert inputHash ltar.descr
return {arts with ltar? := some ltar}
return arts
adjustMTime arts : JobM ModuleOutputArtifacts := do
match (← getMTime mod.traceFile |>.toBaseIO) with
| .ok mtime =>
return arts.setMTime mtime
@@ -846,7 +1006,7 @@ public def Module.leanArtsFacetConfig : ModuleFacetConfig leanArtsFacet :=
/-
Avoid recompiling unchanged OLean files.
OLean files transitively include their imports.
THowever, imports are pre-resolved by Lake, so they are not included in their trace.
However, imports are pre-resolved by Lake, so they are not included in their trace.
-/
newTrace s!"{mod.name.toString}:{facet}"
addTrace art.trace
@@ -891,7 +1051,7 @@ public def Module.cFacetConfig : ModuleFacetConfig cFacet :=
let art := arts.c
/-
Avoid recompiling unchanged C files.
C files are assumed to incorporate their own content
C files are assumed to only incorporate their own content
and not transitively include their inputs (e.g., imports).
They do, however, include `lean/lean.h`.
Lean also produces LF-only C files, so no line ending normalization.
@@ -1031,6 +1191,7 @@ public def Module.initFacetConfigs : DNameMap ModuleFacetConfig :=
|>.insert importArtsFacet importArtsFacetConfig
|>.insert importAllArtsFacet importAllArtsFacetConfig
|>.insert exportInfoFacet exportInfoFacetConfig
|>.insert ltarFacet ltarFacetConfig
|>.insert oleanFacet oleanFacetConfig
|>.insert oleanServerFacet oleanServerFacetConfig
|>.insert oleanPrivateFacet oleanPrivateFacetConfig
