Mirror of https://github.com/leanprover/lean4.git
Synced 2026-04-15 00:24:07 +00:00

Comparing commits: preprocess...v4.28.1 (55 commits)
| SHA1 |
|---|
| 978f81d363 |
| 76dea4d656 |
| 1df9f3b862 |
| 7e01a1bf5c |
| e18f78acfb |
| 3b0f286219 |
| 9e241a4087 |
| e90f6f77db |
| 9deb9ab59d |
| 6de7100f69 |
| 6f409e0eea |
| 3de1cc54c5 |
| a3755fe0a5 |
| 4c1e4a77b4 |
| 896da85304 |
| 11cd55b4f1 |
| 88823b27a6 |
| c9facc8102 |
| 63d1b530ba |
| 3f09741fb9 |
| 9f9531fa13 |
| dae0d6fa05 |
| 4a3401f69a |
| 4526cdda5f |
| c4639150c1 |
| 37870c168b |
| 57003e5c79 |
| b2f485e352 |
| 5e29d7660a |
| 567cf74f1b |
| fa2ddf1c56 |
| f9af240bc4 |
| 3bfeb0bc1f |
| 8447586fea |
| 470e3b7fd0 |
| 0a0323734b |
| 69b058dc82 |
| 2c48ae7dfb |
| c81a8897a9 |
| 3bc63aefb7 |
| fa40491c78 |
| af438425d5 |
| 648e1b1877 |
| f84aa23d6d |
| 6bec8adf16 |
| 16873fb123 |
| 34d8eeb3be |
| f1cc85eb19 |
| 08e6f714ca |
| b8f8dde0b3 |
| b09e33f76b |
| a95227c7d7 |
| 8258cfe2a1 |
| 94e8fd4845 |
| 9063adbd51 |
@@ -13,12 +13,54 @@ These comments explain the scripts' behavior, which repositories get special han

## Arguments

- `version`: The version to release (e.g., v4.24.0)

## Release Notes (Required for -rc1 releases)

For first release candidates (`-rc1`), you must create release notes BEFORE the reference-manual toolchain bump PR can be merged.

**Steps to create release notes:**

1. Generate the release notes:
   ```bash
   cd /path/to/lean4
   python3 script/release_notes.py --since <previous_version> > /tmp/release-notes-<version>.md
   ```
   Replace `<previous_version>` with the last stable release (e.g., `v4.27.0` when releasing `v4.28.0-rc1`).

2. Review `/tmp/release-notes-<version>.md` for common issues:
   - **Unterminated code blocks**: Look for code fences that aren't closed. Fetch the original PR with `gh pr view <number>` to repair them.
   - **Truncated descriptions**: Some may end mid-sentence. Complete them from the original PR.
   - **Markdown issues**: Other syntax problems that could cause parsing errors.

3. Create the release notes file in the reference-manual repository:
   - File path: `Manual/Releases/v<version>.lean` (e.g., `v4_28_0.lean`)
   - Use Verso format with proper imports and a `#doc (Manual)` block
   - **Use `#` for headers, not `##`** (Verso uses level 1 for subsections)
   - **Use plain ` ``` ` not ` ```lean `** (the latter executes code)
   - **Wrap underscore identifiers in backticks**: `` `bv_decide` `` not `bv_decide`

4. Update `Manual/Releases.lean`:
   - Add import: `import Manual.Releases.«v4_28_0»`
   - Add include: `{include 0 Manual.Releases.«v4_28_0»}`

5. Build to verify: `lake build Manual.Releases.v4_28_0`

6. Create a **separate PR** for the release notes (not bundled with the toolchain bump):
   ```bash
   git checkout -b v<version>-release-notes
   gh pr create --title "doc: add v<version> release notes"
   ```

For subsequent RCs (`-rc2`, etc.) and stable releases, just update the version number in the existing release notes file's title.

See `doc/dev/release_checklist.md`, section "Writing the release notes", for full details.
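The file-path convention used in steps 3-5 above (version string to `Manual/Releases/v<underscored>.lean`) can be sketched as a small Python helper; `release_notes_path` is a hypothetical name for illustration, mirroring the mapping the checklist script performs:

```python
def release_notes_path(toolchain: str) -> str:
    """Map a toolchain version to its reference-manual release notes file.

    e.g. "v4.28.0-rc1" -> "Manual/Releases/v4_28_0.lean"
    """
    # Strip the leading 'v' and any "-rcN" suffix, keeping only the base version.
    base = toolchain.lstrip("v").split("-", 1)[0]
    return f"Manual/Releases/v{base.replace('.', '_')}.lean"

print(release_notes_path("v4.28.0-rc1"))  # Manual/Releases/v4_28_0.lean
print(release_notes_path("v4.27.0"))      # Manual/Releases/v4_27_0.lean
```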

## Process

1. Run `script/release_checklist.py {version}` to check the current status
2. **CRITICAL: If preliminary lean4 checks fail, STOP immediately and alert the user**
-   - Check for: release branch exists, CMake version correct, tag exists, release page exists, release notes exist
+   - Check for: release branch exists, CMake version correct, tag exists, release page exists, release notes file exists
   - **IMPORTANT**: The release page is created AUTOMATICALLY by CI after pushing the tag - DO NOT create it manually
   - **IMPORTANT**: For -rc1 releases, release notes must be created before proceeding
   - Do NOT create any PRs or proceed with repository updates if these checks fail
3. Create a todo list tracking all repositories that need updates
4. **CRITICAL RULE: You can ONLY run `release_steps.py` for a repository if `release_checklist.py` explicitly says to do so**
@@ -61,6 +103,15 @@ Every time you run `release_checklist.py`, you MUST:

This summary should be provided EVERY time you run the checklist, not just after creating new PRs.
The user needs to see the complete picture of what's waiting for review.

## Nightly Infrastructure

The nightly build system uses branches and tags across two repositories:

- `leanprover/lean4` has **branches** `nightly` and `nightly-with-mathlib` tracking the latest nightly builds
- `leanprover/lean4-nightly` has **dated tags** like `nightly-2026-01-23`

When a nightly succeeds with mathlib, all three should point to the same commit. Don't confuse these: the branches are in the main lean4 repo, the dated tags are in lean4-nightly.

## Error Handling

**CRITICAL**: If something goes wrong or a command fails:
6  .github/workflows/build-template.yml (vendored)
@@ -66,16 +66,10 @@ jobs:
        brew install ccache tree zstd coreutils gmp libuv
      if: runner.os == 'macOS'
    - name: Checkout
      if: (!endsWith(matrix.os, '-with-cache'))
      uses: actions/checkout@v6
      with:
        # the default is to use a virtual merge commit between the PR and master: just use the PR
        ref: ${{ github.event.pull_request.head.sha }}
    - name: Namespace Checkout
      if: endsWith(matrix.os, '-with-cache')
      uses: namespacelabs/nscloud-checkout-action@v8
      with:
        ref: ${{ github.event.pull_request.head.sha }}
    - name: Open Nix shell once
      run: true
      if: runner.os == 'Linux'
2  .github/workflows/ci.yml (vendored)
@@ -115,7 +115,7 @@ jobs:
      CMAKE_MAJOR=$(grep -E "^set\(LEAN_VERSION_MAJOR " src/CMakeLists.txt | grep -oE '[0-9]+')
      CMAKE_MINOR=$(grep -E "^set\(LEAN_VERSION_MINOR " src/CMakeLists.txt | grep -oE '[0-9]+')
      CMAKE_PATCH=$(grep -E "^set\(LEAN_VERSION_PATCH " src/CMakeLists.txt | grep -oE '[0-9]+')
-     CMAKE_IS_RELEASE=$(grep -m 1 -E "^set\(LEAN_VERSION_IS_RELEASE " src/CMakeLists.txt | grep -oE '[0-9]+')
+     CMAKE_IS_RELEASE=$(grep -m 1 -E "^set\(LEAN_VERSION_IS_RELEASE " src/CMakeLists.txt | sed -nE 's/^set\(LEAN_VERSION_IS_RELEASE ([0-9]+)\).*/\1/p')

      # Expected values from tag parsing
      TAG_MAJOR="${{ steps.set-release.outputs.LEAN_VERSION_MAJOR }}"
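The ci.yml change above swaps `grep -oE '[0-9]+'` for an anchored `sed` capture: the `LEAN_VERSION_IS_RELEASE` line carries a trailing comment that itself contains digits, so extracting "every run of digits" is ambiguous. A Python illustration of the two behaviors (an assumption: this mirrors the shell tools with `re`, it is not part of the workflow):

```python
import re

line = "set(LEAN_VERSION_IS_RELEASE 0) # This number is 1 in the release revision, and 0 otherwise."

# Old approach: like `grep -oE '[0-9]+'`, every run of digits on the line matches,
# including the digits inside the comment.
print(re.findall(r"[0-9]+", line))  # ['0', '1', '0']

# New approach: like the sed capture, only the value inside the set(...) call is extracted.
m = re.match(r"set\(LEAN_VERSION_IS_RELEASE ([0-9]+)\)", line)
print(m.group(1))  # 0
```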
@@ -218,6 +218,11 @@ Please read https://leanprover-community.github.io/contribute/tags_and_branches.

# Writing the release notes

Release notes are only needed for the first release candidate (`-rc1`). For subsequent RCs and stable releases,
just update the version number in the title of the existing release notes file.

## Generating the release notes

Release notes are automatically generated from the commit history, using `script/release_notes.py`.

Run this as `script/release_notes.py --since v4.6.0`, where `v4.6.0` is the *previous* release version.
@@ -232,4 +237,93 @@ Some judgement is required here: ignore commits which look minor,
but manually add items to the release notes for significant PRs that were rebase-merged.

There can also be pre-written entries in `./releases_drafts`, which should all be incorporated into the release notes and then deleted from the branch.

## Reviewing and fixing the generated markdown

Before adding the release notes to the reference manual, carefully review the generated markdown for these common issues:

1. **Unterminated code blocks**: PR descriptions sometimes have unclosed code fences. Look for code blocks
   that don't have a closing ` ``` `. If found, fetch the original PR description with `gh pr view <number>`
   and repair the code block with the complete content.

2. **Truncated descriptions**: Some PR descriptions may end abruptly mid-sentence. Review these and complete
   the descriptions based on the original PR.

3. **Markdown syntax issues**: Check for other markdown problems that could cause parsing errors.

## Creating the release notes file

The release notes go in `Manual/Releases/v4_7_0.lean` in the reference-manual repository.

The file structure must follow the Verso format:

```lean
/-
Copyright (c) 2025 Lean FRO LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Author: <Your Name>
-/

import VersoManual
import Manual.Meta
import Manual.Meta.Markdown

open Manual
open Verso.Genre
open Verso.Genre.Manual
open Verso.Genre.Manual.InlineLean

#doc (Manual) "Lean 4.7.0-rc1 (YYYY-MM-DD)" =>
%%%
tag := "release-v4.7.0"
file := "v4.7.0"
%%%

<release notes content here>
```

**Important formatting rules for Verso:**
- Use `#` for section headers inside the document, not `##` (Verso uses header level 1 for subsections)
- Use plain ` ``` ` for code blocks, not ` ```lean ` (the latter will cause Lean to execute the code)
- Identifiers with underscores like `bv_decide` should be wrapped in backticks: `` `bv_decide` ``
  (otherwise the underscore may be interpreted as markdown emphasis)

## Updating Manual/Releases.lean

After creating the release notes file, update `Manual/Releases.lean` to include it:

1. Add the import near the top with the other version imports:
   ```lean
   import Manual.Releases.«v4_7_0»
   ```

2. Add the include statement after the other includes:
   ```lean
   {include 0 Manual.Releases.«v4_7_0»}
   ```

## Building and verifying

Build the release notes to check for errors:
```bash
lake build Manual.Releases.v4_7_0
```

Common errors and fixes:
- "Wrong header nesting - got ## but expected at most #": Change `##` to `#`
- "Tactic 'X' failed" or similar: Code is being executed; change ` ```lean ` to ` ``` `
- "'_'" errors: An underscore in an identifier is being parsed as emphasis; wrap the identifier in backticks

## Creating the PR

Create a separate PR for the release notes (don't bundle it with the toolchain bump PR):
```bash
git checkout -b v4.7.0-release-notes
git add Manual/Releases/v4_7_0.lean Manual/Releases.lean
git commit -m "doc: add v4.7.0 release notes"
git push -u origin v4.7.0-release-notes
gh pr create --title "doc: add v4.7.0 release notes" --body "This PR adds the release notes for Lean v4.7.0."
```

-See `./releases_drafts/README.md` for more information about pre-written release note entries.
+See `./releases_drafts/README.md` for more information.
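The first review item under "Reviewing and fixing the generated markdown" (unterminated code fences) can be checked mechanically. A minimal sketch, assuming a simple toggle model of fences (`find_unterminated_fences` is a hypothetical helper, not part of the release scripts):

```python
def find_unterminated_fences(markdown: str) -> list[int]:
    """Return the line numbers of code fences that are never closed.

    Each ``` fence toggles the "inside a code block" state; if we end the
    document with a fence still open, report where it was opened.
    """
    open_line = None
    for lineno, line in enumerate(markdown.splitlines(), start=1):
        if line.lstrip().startswith("```"):
            open_line = None if open_line is not None else lineno
    return [] if open_line is None else [open_line]

good = "text\n```\ncode\n```\nmore"
bad = "text\n```\ncode with no closing fence"
print(find_unterminated_fences(good))  # []
print(find_unterminated_fences(bad))   # [2]
```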
@@ -29,7 +29,7 @@ def main (args : List String) : IO Unit := do
    if !msgs.toList.isEmpty then -- skip this file if there are parse errors
      msgs.forM fun msg => msg.toString >>= IO.println
      throw <| .userError "parse errors in file"
-   let `(header| $[module%$moduleTk?]? $imps:import*) := header
+   let `(header| $[module%$moduleTk?]? $[prelude%$preludeTk?]? $imps:import*) := header
      | throw <| .userError s!"unexpected header syntax of {path}"
    if moduleTk?.isSome then
      continue
@@ -38,11 +38,11 @@ def main (args : List String) : IO Unit := do
    let startPos := header.raw.getPos? |>.getD parserState.pos

    let dummyEnv ← mkEmptyEnvironment
-   let (initCmd, parserState', _) :=
+   let (initCmd, parserState', msgs') :=
      Parser.parseCommand inputCtx { env := dummyEnv, options := {} } parserState msgs

-   -- insert section if any trailing command
-   if !initCmd.isOfKind ``Parser.Command.eoi then
+   -- insert section if any trailing command (or error, which could be from an unknown command)
+   if !initCmd.isOfKind ``Parser.Command.eoi || msgs'.hasErrors then
      let insertPos? :=
        -- put below initial module docstring if any
        guard (initCmd.isOfKind ``Parser.Command.moduleDoc) *> initCmd.getTailPos? <|>
@@ -57,19 +57,21 @@ def main (args : List String) : IO Unit := do
        sec := "\n\n" ++ sec
      if insertPos?.isNone then
        sec := sec ++ "\n\n"
-     text := text.extract 0 insertPos ++ sec ++ text.extract insertPos text.rawEndPos
+     let insertPos := text.pos! insertPos
+     text := text.extract text.startPos insertPos ++ sec ++ text.extract insertPos text.endPos

    -- prepend each import with `public `
    for imp in imps.reverse do
      let insertPos := imp.raw.getPos?.get!
      let prfx := if doMeta then "public meta " else "public "
-     text := text.extract 0 insertPos ++ prfx ++ text.extract insertPos text.rawEndPos
+     let insertPos := text.pos! insertPos
+     text := text.extract text.startPos insertPos ++ prfx ++ text.extract insertPos text.endPos

    -- insert `module` header
-   let mut initText := text.extract 0 startPos
-   if !initText.trim.isEmpty then
+   let mut initText := text.extract text.startPos (text.pos! startPos)
+   if !initText.trimAscii.isEmpty then
      -- If there is a header comment, preserve it and put `module` in the line after
-     initText := initText.trimRight ++ "\n"
-     text := initText ++ "module\n\n" ++ text.extract startPos text.rawEndPos
+     initText := initText.trimAsciiEnd.toString ++ "\n"
+     text := initText ++ "module\n\n" ++ text.extract (text.pos! startPos) text.endPos

    IO.FS.writeFile path text
@@ -185,6 +185,30 @@ def get_release_notes(tag_name):
    except Exception:
        return None

def check_release_notes_file_exists(toolchain, github_token):
    """Check if the release notes file exists in the reference-manual repository.

    For -rc1 releases, this checks that the release notes have been created.
    For subsequent RCs and stable releases, the release notes should already exist.

    Returns a tuple (exists: bool, is_rc1: bool) where is_rc1 indicates whether this is
    the first release candidate (when the release notes need to be written).
    """
    # Determine the release notes file path,
    # e.g. v4.28.0-rc1 -> Manual/Releases/v4_28_0.lean
    base_version = strip_rc_suffix(toolchain.lstrip('v'))  # "4.28.0"
    file_name = f"v{base_version.replace('.', '_')}.lean"  # "v4_28_0.lean"
    file_path = f"Manual/Releases/{file_name}"

    is_rc1 = toolchain.endswith("-rc1")

    repo_url = "https://github.com/leanprover/reference-manual"

    # Check if the file exists on the main branch
    content = get_branch_content(repo_url, "main", file_path, github_token)

    return (content is not None, is_rc1)

def get_branch_content(repo_url, branch, file_path, github_token):
    api_url = repo_url.replace("https://github.com/", "https://api.github.com/repos/") + f"/contents/{file_path}?ref={branch}"
    headers = {'Authorization': f'token {github_token}'} if github_token else {}
@@ -501,6 +525,76 @@ def check_proofwidgets4_release(repo_url, target_toolchain, github_token):
        print(f"   You will need to create and push a tag v0.0.{next_version}")
    return False

def check_reference_manual_release_title(repo_url, toolchain, pr_branch, github_token):
    """Check if the reference-manual release notes title matches the release type.

    For RC releases (e.g., v4.27.0-rc1), the title should contain the exact RC suffix.
    For final releases (e.g., v4.27.0), the title should NOT contain any "-rc".

    Returns True if the check passes or is not applicable, False if the title needs updating.
    """
    is_rc = is_release_candidate(toolchain)

    # For RC releases, get the base version and RC suffix,
    # e.g. "v4.27.0-rc1" -> version="4.27.0", rc_suffix="-rc1"
    if is_rc:
        parts = toolchain.lstrip('v').split('-', 1)
        version = parts[0]
        rc_suffix = '-' + parts[1] if len(parts) > 1 else ''
    else:
        version = toolchain.lstrip('v')
        rc_suffix = ''

    # Construct the release notes file path (e.g., Manual/Releases/v4_27_0.lean for v4.27.0)
    file_name = f"v{version.replace('.', '_')}.lean"  # "v4_27_0.lean"
    file_path = f"Manual/Releases/{file_name}"

    # Try to get the file from the PR branch first, then fall back to the default branch
    content = get_branch_content(repo_url, pr_branch, file_path, github_token)
    if content is None:
        content = get_branch_content(repo_url, "main", file_path, github_token)

    if content is None:
        print(f"  ⚠️ Could not check release notes file: {file_path}")
        return True  # Don't block on this

    # Look for the #doc line with the title
    for line in content.splitlines():
        if line.strip().startswith('#doc') and 'Manual' in line:
            has_rc_in_title = '-rc' in line.lower()

            if is_rc:
                # For RC releases, the title should contain the exact RC suffix (e.g., "-rc1").
                # Match the suffix followed by a non-digit so -rc1 does not match inside -rc10.
                exact_match = re.search(rf'{re.escape(rc_suffix)}(?![0-9])', line, re.IGNORECASE)
                if exact_match:
                    print(f"  ✅ Release notes title correctly shows {rc_suffix}")
                    return True
                elif has_rc_in_title:
                    print(f"  ❌ Release notes title shows the wrong RC version (expected {rc_suffix})")
                    print(f"     Update {file_path} to use '{rc_suffix}' in the title")
                    return False
                else:
                    print(f"  ❌ Release notes title is missing the RC suffix")
                    print(f"     Update {file_path} to include '{rc_suffix}' in the title")
                    return False
            else:
                # For final releases, the title should NOT contain -rc
                if has_rc_in_title:
                    print(f"  ❌ Release notes title still shows an RC version")
                    print(f"     Update {file_path} to remove '-rcN' from the title")
                    return False
                else:
                    print(f"  ✅ Release notes title is updated for the final release")
                    return True

    # If we didn't find the #doc line, don't block
    print(f"  ⚠️ Could not find release notes title in {file_path}")
    return True
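The negative lookahead in `check_reference_manual_release_title` is the interesting detail: `(?![0-9])` after the escaped suffix keeps `-rc1` from matching inside `-rc10`. A standalone sketch of that matching logic (the `title_has_exact_rc` helper is named for illustration only):

```python
import re

def title_has_exact_rc(title: str, rc_suffix: str) -> bool:
    """True if the title contains rc_suffix NOT followed by another digit,
    so "-rc1" matches in "4.27.0-rc1" but not inside "4.27.0-rc10"."""
    return re.search(rf"{re.escape(rc_suffix)}(?![0-9])", title, re.IGNORECASE) is not None

print(title_has_exact_rc('#doc (Manual) "Lean 4.27.0-rc1 (2026-01-01)" =>', "-rc1"))   # True
print(title_has_exact_rc('#doc (Manual) "Lean 4.27.0-rc10 (2026-01-01)" =>', "-rc1"))  # False
```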
def run_mathlib_verify_version_tags(toolchain, verbose=False):
    """Run mathlib4's verify_version_tags.py script to validate the release tag.

@@ -644,6 +738,27 @@ def main():
            else:
                print(f"  ✅ Release notes page title looks good ('{actual_title}').")

            # Check if the release notes file exists in the reference-manual repository.
            # For -rc1 releases, this is when release notes need to be written;
            # for subsequent RCs and stable releases, they should already exist.
            release_notes_exists, is_rc1 = check_release_notes_file_exists(toolchain, github_token)
            base_version = strip_rc_suffix(toolchain.lstrip('v'))
            release_notes_file = f"Manual/Releases/v{base_version.replace('.', '_')}.lean"

            if not release_notes_exists:
                if is_rc1:
                    print(f"  ❌ Release notes file not found: {release_notes_file}")
                    print(f"     This is an -rc1 release, so release notes need to be written.")
                    print(f"     Run `script/release_notes.py --since <previous_version>` to generate them.")
                    print(f"     See doc/dev/release_checklist.md, section 'Writing the release notes', for details.")
                    lean4_success = False
                else:
                    print(f"  ❌ Release notes file not found: {release_notes_file}")
                    print(f"     Release notes should have been created for -rc1. Check the reference-manual repository.")
                    lean4_success = False
            else:
                print(f"  ✅ Release notes file exists: {release_notes_file}")

            repo_status["lean4"] = lean4_success

    # If the release page doesn't exist, skip the repository checks and master branch checks
@@ -709,6 +824,11 @@ def main():
                print(f"  ⚠️ CI: {ci_message}")
            else:
                print(f"  ❓ CI: {ci_message}")

            # For reference-manual, check that the release notes title has been updated
            if name == "reference-manual":
                pr_branch = f"bump_to_{toolchain}"
                check_reference_manual_release_title(url, toolchain, pr_branch, github_token)
        else:
            print(f"  ❌ PR with title '{pr_title}' does not exist")
            print(f"     Run `script/release_steps.py {toolchain} {name}` to create it")
@@ -14,13 +14,6 @@ repositories:
    bump-branch: true
    dependencies: []

-  - name: verso
-    url: https://github.com/leanprover/verso
-    toolchain-tag: true
-    stable-branch: false
-    branch: main
-    dependencies: []
-
  - name: lean4checker
    url: https://github.com/leanprover/lean4checker
    toolchain-tag: true
@@ -42,6 +35,14 @@ repositories:
    branch: main
    dependencies: []

+  - name: verso
+    url: https://github.com/leanprover/verso
+    toolchain-tag: true
+    stable-branch: false
+    branch: main
+    dependencies:
+      - plausible
+
  - name: import-graph
    url: https://github.com/leanprover-community/import-graph
    toolchain-tag: true
@@ -11,8 +11,8 @@ include(ExternalProject)
project(LEAN CXX C)
set(LEAN_VERSION_MAJOR 4)
set(LEAN_VERSION_MINOR 28)
-set(LEAN_VERSION_PATCH 0)
-set(LEAN_VERSION_IS_RELEASE 0) # This number is 1 in the release revision, and 0 otherwise.
+set(LEAN_VERSION_PATCH 1)
+set(LEAN_VERSION_IS_RELEASE 1) # This number is 1 in the release revision, and 0 otherwise.
set(LEAN_SPECIAL_VERSION_DESC "" CACHE STRING "Additional version description like 'nightly-2018-03-11'")
set(LEAN_VERSION_STRING "${LEAN_VERSION_MAJOR}.${LEAN_VERSION_MINOR}.${LEAN_VERSION_PATCH}")
if (LEAN_SPECIAL_VERSION_DESC)
@@ -9,3 +9,4 @@ prelude
public import Init.Data.Char.Basic
public import Init.Data.Char.Lemmas
public import Init.Data.Char.Order
+public import Init.Data.Char.Ordinal

242  src/Init/Data/Char/Ordinal.lean (new file)
@@ -0,0 +1,242 @@
/-
Copyright (c) 2026 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Author: Markus Himmel
-/
module

prelude
public import Init.Data.Fin.OverflowAware
public import Init.Data.UInt.Basic
public import Init.Data.Function
import Init.Data.Char.Lemmas
import Init.Data.Char.Order
import Init.Grind

/-!
# Bijection between `Char` and `Fin Char.numCodePoints`

In this file, we construct a bijection between `Char` and `Fin Char.numCodePoints` and show that
it is compatible with various operations. Since `Fin` is simpler than `Char` due to being based
on natural numbers instead of `UInt32` and not having a hole in the middle (surrogate code points),
this is sometimes useful to simplify reasoning about `Char`.

We use these declarations in the construction of `Char` ranges, see the module
`Init.Data.Range.Polymorphic.Char`.
-/

set_option doc.verso true

public section

namespace Char

/-- The number of surrogate code points. -/
abbrev numSurrogates : Nat :=
  -- 0xe000 - 0xd800
  2048

/-- The size of the {name}`Char` type. -/
abbrev numCodePoints : Nat :=
  -- 0x110000 - numSurrogates
  1112064

/--
Packs {name}`Char` bijectively into {lean}`Fin Char.numCodePoints` by shifting code points which are
greater than the surrogate code points by the number of surrogate code points.

The inverse of this function is called {name (scope := "Init.Data.Char.Ordinal")}`Char.ofOrdinal`.
-/
def ordinal (c : Char) : Fin Char.numCodePoints :=
  if h : c.val < 0xd800 then
    ⟨c.val.toNat, by grind [UInt32.lt_iff_toNat_lt]⟩
  else
    ⟨c.val.toNat - Char.numSurrogates, by grind [UInt32.lt_iff_toNat_lt]⟩

/--
Unpacks {lean}`Fin Char.numCodePoints` bijectively to {name}`Char` by shifting code points which are
greater than the surrogate code points by the number of surrogate code points.

The inverse of this function is called {name}`Char.ordinal`.
-/
def ofOrdinal (f : Fin Char.numCodePoints) : Char :=
  if h : (f : Nat) < 0xd800 then
    ⟨UInt32.ofNatLT f (by grind), by grind [UInt32.toNat_ofNatLT]⟩
  else
    ⟨UInt32.ofNatLT (f + Char.numSurrogates) (by grind), by grind [UInt32.toNat_ofNatLT]⟩
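The packing arithmetic of `ordinal` and `ofOrdinal` above can be mirrored in plain integer code. An illustrative Python sketch, not part of the Lean source (surrogates occupy 0xD800-0xDFFF, so values past the gap shift down by 2048):

```python
NUM_SURROGATES = 0xE000 - 0xD800             # 2048
NUM_CODE_POINTS = 0x110000 - NUM_SURROGATES  # 1112064

def ordinal(cp: int) -> int:
    """Pack a scalar value (a code point that is not a surrogate) into 0..NUM_CODE_POINTS-1."""
    return cp if cp < 0xD800 else cp - NUM_SURROGATES

def of_ordinal(n: int) -> int:
    """Inverse of ordinal: re-insert the surrogate gap when unpacking."""
    return n if n < 0xD800 else n + NUM_SURROGATES

assert ordinal(ord('a')) == 0x61
assert ordinal(0xE000) == 0xD800   # first char after the gap lands right at the gap start
assert ordinal(0x10FFFF) == NUM_CODE_POINTS - 1
assert all(of_ordinal(ordinal(c)) == c for c in [0, 0xD7FF, 0xE000, 0x10FFFF])
```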

/--
Computes the next {name}`Char`, skipping over surrogate code points (which are not valid
{name}`Char`s) as necessary.

This function is specified by its interaction with {name}`Char.ordinal`, see
{name (scope := "Init.Data.Char.Ordinal")}`Char.succ?_eq`.
-/
def succ? (c : Char) : Option Char :=
  if h₀ : c.val < 0xd7ff then
    some ⟨c.val + 1, by grind [UInt32.lt_iff_toNat_lt, UInt32.toNat_add]⟩
  else if h₁ : c.val = 0xd7ff then
    some ⟨0xe000, by decide⟩
  else if h₂ : c.val < 0x10ffff then
    some ⟨c.val + 1, by
      simp only [UInt32.lt_iff_toNat_lt, UInt32.reduceToNat, Nat.not_lt, ← UInt32.toNat_inj,
        UInt32.isValidChar, Nat.isValidChar, UInt32.toNat_add, Nat.reducePow] at *
      grind⟩
  else none

/--
Computes the {name}`m`-th next {name}`Char`, skipping over surrogate code points (which are not
valid {name}`Char`s) as necessary.

This function is specified by its interaction with {name}`Char.ordinal`, see
{name (scope := "Init.Data.Char.Ordinal")}`Char.succMany?_eq`.
-/
def succMany? (m : Nat) (c : Char) : Option Char :=
  c.ordinal.addNat? m |>.map Char.ofOrdinal
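The case split in `succ?` (increment, jump the surrogate gap, or stop at the top) also has a direct integer analogue. A Python sketch for illustration only; it works on raw scalar values rather than on a `Char` type:

```python
def succ(cp):
    """Next valid scalar value after cp, or None at the top; skips the surrogate range."""
    if cp == 0xD7FF:
        return 0xE000      # jump over the surrogate gap 0xD800-0xDFFF
    if cp == 0x10FFFF:
        return None        # the largest scalar value has no successor
    return cp + 1

def ordinal(cp):
    return cp if cp < 0xD800 else cp - 2048

assert succ(0x61) == 0x62
assert succ(0xD7FF) == 0xE000
assert succ(0x10FFFF) is None
# succ agrees with +1 on ordinals, matching the `succ?_eq` specification:
assert ordinal(succ(0xD7FF)) == ordinal(0xD7FF) + 1
```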
|
||||
|
||||
@[grind =]
|
||||
theorem coe_ordinal {c : Char} :
|
||||
(c.ordinal : Nat) =
|
||||
if c.val < 0xd800 then
|
||||
c.val.toNat
|
||||
else
|
||||
c.val.toNat - Char.numSurrogates := by
|
||||
grind [Char.ordinal]
|
||||
|
||||
@[simp]
|
||||
theorem ordinal_zero : '\x00'.ordinal = 0 := by
|
||||
ext
|
||||
simp [coe_ordinal]
|
||||
|
||||
@[grind =]
|
||||
theorem val_ofOrdinal {f : Fin Char.numCodePoints} :
|
||||
(Char.ofOrdinal f).val =
|
||||
if h : (f : Nat) < 0xd800 then
|
||||
UInt32.ofNatLT f (by grind)
|
||||
else
|
||||
UInt32.ofNatLT (f + Char.numSurrogates) (by grind) := by
|
||||
grind [Char.ofOrdinal]
|
||||
|
||||
@[simp]
|
||||
theorem ofOrdinal_ordinal {c : Char} : Char.ofOrdinal c.ordinal = c := by
|
||||
ext
|
||||
simp only [val_ofOrdinal, coe_ordinal, UInt32.ofNatLT_add]
|
||||
split
|
||||
· grind [UInt32.lt_iff_toNat_lt, UInt32.ofNatLT_toNat]
|
||||
· rw [dif_neg]
|
||||
· simp only [← UInt32.toNat_inj, UInt32.toNat_add, UInt32.toNat_ofNatLT, Nat.reducePow]
|
||||
grind [UInt32.toNat_lt, UInt32.lt_iff_toNat_lt]
|
||||
· grind [UInt32.lt_iff_toNat_lt]
|
||||
|
||||
@[simp]
|
||||
theorem ordinal_ofOrdinal {f : Fin Char.numCodePoints} : (Char.ofOrdinal f).ordinal = f := by
|
||||
ext
|
||||
simp [coe_ordinal, val_ofOrdinal]
|
||||
split
|
||||
· rw [if_pos, UInt32.toNat_ofNatLT]
|
||||
simpa [UInt32.lt_iff_toNat_lt]
|
||||
· rw [if_neg, UInt32.toNat_add, UInt32.toNat_ofNatLT, UInt32.toNat_ofNatLT, Nat.mod_eq_of_lt,
|
||||
Nat.add_sub_cancel]
|
||||
· grind
|
||||
· simp only [UInt32.lt_iff_toNat_lt, UInt32.toNat_add, UInt32.toNat_ofNatLT, Nat.reducePow,
|
||||
UInt32.reduceToNat, Nat.not_lt]
|
||||
grind
|
||||
|
||||
@[simp]
|
||||
theorem ordinal_comp_ofOrdinal : Char.ordinal ∘ Char.ofOrdinal = id := by
|
||||
ext; simp
|
||||
|
||||
@[simp]
|
||||
theorem ofOrdinal_comp_ordinal : Char.ofOrdinal ∘ Char.ordinal = id := by
|
||||
ext; simp
|
||||
|
||||
@[simp]
|
||||
theorem ordinal_inj {c d : Char} : c.ordinal = d.ordinal ↔ c = d :=
|
||||
⟨fun h => by simpa using congrArg Char.ofOrdinal h, (· ▸ rfl)⟩
|
||||
|
||||
theorem ordinal_injective : Function.Injective Char.ordinal :=
|
||||
fun _ _ => ordinal_inj.1
|
||||
|
||||
@[simp]
|
||||
theorem ofOrdinal_inj {f g : Fin Char.numCodePoints} :
|
||||
Char.ofOrdinal f = Char.ofOrdinal g ↔ f = g :=
|
||||
⟨fun h => by simpa using congrArg Char.ordinal h, (· ▸ rfl)⟩
|
||||
|
||||
theorem ofOrdinal_injective : Function.Injective Char.ofOrdinal :=
|
||||
fun _ _ => ofOrdinal_inj.1
|
||||
|
||||
theorem ordinal_le_of_le {c d : Char} (h : c ≤ d) : c.ordinal ≤ d.ordinal := by
|
||||
simp only [le_def, UInt32.le_iff_toNat_le] at h
|
||||
simp only [Fin.le_def, coe_ordinal, UInt32.lt_iff_toNat_lt, UInt32.reduceToNat]
|
||||
grind
|
||||
|
||||
theorem ofOrdinal_le_of_le {f g : Fin Char.numCodePoints} (h : f ≤ g) :
    Char.ofOrdinal f ≤ Char.ofOrdinal g := by
  simp only [Fin.le_def] at h
  simp only [le_def, val_ofOrdinal, UInt32.ofNatLT_add, UInt32.le_iff_toNat_le]
  split
  · simp only [UInt32.toNat_ofNatLT]
    split
    · simpa
    · simp only [UInt32.toNat_add, UInt32.toNat_ofNatLT, Nat.reducePow]
      grind
  · simp only [UInt32.toNat_add, UInt32.toNat_ofNatLT, Nat.reducePow]
    rw [dif_neg (by grind)]
    simp only [UInt32.toNat_add, UInt32.toNat_ofNatLT, Nat.reducePow]
    grind

theorem le_iff_ordinal_le {c d : Char} : c ≤ d ↔ c.ordinal ≤ d.ordinal :=
  ⟨ordinal_le_of_le, fun h => by simpa using ofOrdinal_le_of_le h⟩

theorem le_iff_ofOrdinal_le {f g : Fin Char.numCodePoints} :
    f ≤ g ↔ Char.ofOrdinal f ≤ Char.ofOrdinal g :=
  ⟨ofOrdinal_le_of_le, fun h => by simpa using ordinal_le_of_le h⟩

theorem lt_iff_ordinal_lt {c d : Char} : c < d ↔ c.ordinal < d.ordinal := by
  simp only [Std.lt_iff_le_and_not_ge, le_iff_ordinal_le]

theorem lt_iff_ofOrdinal_lt {f g : Fin Char.numCodePoints} :
    f < g ↔ Char.ofOrdinal f < Char.ofOrdinal g := by
  simp only [Std.lt_iff_le_and_not_ge, le_iff_ofOrdinal_le]

theorem succ?_eq {c : Char} : c.succ? = (c.ordinal.addNat? 1).map Char.ofOrdinal := by
  fun_cases Char.succ? with
  | case1 h =>
    rw [Fin.addNat?_eq_some]
    · simp only [coe_ordinal, Option.map_some, Option.some.injEq, Char.ext_iff, val_ofOrdinal,
        UInt32.ofNatLT_add, UInt32.reduceOfNatLT]
      split
      · simp only [UInt32.ofNatLT_toNat, dite_eq_ite, left_eq_ite_iff, Nat.not_lt,
          Nat.reduceLeDiff, UInt32.left_eq_add]
        grind [UInt32.lt_iff_toNat_lt]
      · grind
    · simp [coe_ordinal]
      grind [UInt32.lt_iff_toNat_lt]
  | case2 =>
    rw [Fin.addNat?_eq_some]
    · simp [coe_ordinal, *, Char.ext_iff, val_ofOrdinal, numSurrogates]
    · simp [coe_ordinal, *, numCodePoints]
  | case3 =>
    rw [Fin.addNat?_eq_some]
    · simp only [coe_ordinal, Option.map_some, Option.some.injEq, Char.ext_iff, val_ofOrdinal,
        UInt32.ofNatLT_add, UInt32.reduceOfNatLT]
      split
      · grind
      · rw [dif_neg]
        · simp only [← UInt32.toNat_inj, UInt32.toNat_add, UInt32.reduceToNat, Nat.reducePow,
            UInt32.toNat_ofNatLT, Nat.mod_add_mod]
          grind [UInt32.lt_iff_toNat_lt, UInt32.toNat_inj]
        · grind [UInt32.lt_iff_toNat_lt, UInt32.toNat_inj]
    · grind [UInt32.lt_iff_toNat_lt]
  | case4 =>
    rw [eq_comm]
    grind [UInt32.lt_iff_toNat_lt]

theorem map_ordinal_succ? {c : Char} : c.succ?.map ordinal = c.ordinal.addNat? 1 := by
  simp [succ?_eq]

theorem succMany?_eq {m : Nat} {c : Char} :
    c.succMany? m = (c.ordinal.addNat? m).map Char.ofOrdinal := by
  rfl

end Char
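A minimal sketch of how the transfer lemmas above are intended to be used (illustrative only; it assumes the `Char` declarations from this diff are in scope):

```lean
-- Illustrative only: comparisons on `Char` reduce to comparisons on ordinals,
-- per `Char.le_iff_ordinal_le` and `Char.lt_iff_ordinal_lt` above.
example {c d : Char} (h : c ≤ d) : c.ordinal ≤ d.ordinal :=
  Char.le_iff_ordinal_le.mp h

example {c d : Char} (h : c.ordinal < d.ordinal) : c < d :=
  Char.lt_iff_ordinal_lt.mpr h
```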
@@ -11,3 +11,4 @@ public import Init.Data.Fin.Log2
public import Init.Data.Fin.Iterate
public import Init.Data.Fin.Fold
public import Init.Data.Fin.Lemmas
public import Init.Data.Fin.OverflowAware

src/Init/Data/Fin/OverflowAware.lean (new file, 51 lines)
@@ -0,0 +1,51 @@
/-
Copyright (c) 2026 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Author: Markus Himmel
-/
module

prelude
public import Init.Data.Fin.Basic
import Init.Data.Fin.Lemmas

set_option doc.verso true

public section

namespace Fin

/--
Overflow-aware addition of a natural number to an element of {lean}`Fin n`.

Examples:
* {lean}`(2 : Fin 3).addNat? 1 = (none : Option (Fin 3))`
* {lean}`(2 : Fin 4).addNat? 1 = (some 3 : Option (Fin 4))`
-/
@[inline]
protected def addNat? (i : Fin n) (m : Nat) : Option (Fin n) :=
  if h : i + m < n then some ⟨i + m, h⟩ else none

theorem addNat?_eq_some {i : Fin n} (h : i + m < n) : i.addNat? m = some ⟨i + m, h⟩ := by
  simp [Fin.addNat?, h]

theorem addNat?_eq_some_iff {i : Fin n} :
    i.addNat? m = some j ↔ i + m < n ∧ j = i + m := by
  simp only [Fin.addNat?]
  split <;> simp [Fin.ext_iff, eq_comm, *]

@[simp]
theorem addNat?_eq_none_iff {i : Fin n} : i.addNat? m = none ↔ n ≤ i + m := by
  simp only [Fin.addNat?]
  split <;> simp_all [Nat.not_lt]

@[simp]
theorem addNat?_zero {i : Fin n} : i.addNat? 0 = some i := by
  simp [addNat?_eq_some_iff]

@[grind =]
theorem addNat?_eq_dif {i : Fin n} :
    i.addNat? m = if h : i + m < n then some ⟨i + m, h⟩ else none := by
  rfl

end Fin
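The boundary behaviour described in the docstring above can be checked directly (illustrative only; mirrors the docstring's own examples):

```lean
-- Illustrative only: `Fin.addNat?` returns `none` exactly when the sum
-- would leave the bound, per the docstring examples above.
example : (2 : Fin 3).addNat? 1 = none := by decide
example : (2 : Fin 4).addNat? 1 = some 3 := by decide
```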
@@ -15,3 +15,4 @@ public import Init.Data.Option.Attach
public import Init.Data.Option.List
public import Init.Data.Option.Monadic
public import Init.Data.Option.Array
public import Init.Data.Option.Function

src/Init/Data/Option/Function.lean (new file, 26 lines)
@@ -0,0 +1,26 @@
/-
Copyright (c) 2026 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Markus Himmel
-/
module

prelude
public import Init.Data.Function
import Init.Data.Option.Lemmas

public section

namespace Option

theorem map_injective {f : α → β} (hf : Function.Injective f) :
    Function.Injective (Option.map f) := by
  intros a b hab
  cases a <;> cases b
  · simp
  · simp at hab
  · simp at hab
  · simp only [map_some, some.injEq] at hab
    simpa using hf hab

end Option
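A short usage sketch for the new lemma (illustrative only; a direct application of `Option.map_injective` as stated above):

```lean
-- Illustrative only: cancelling `Option.map` of an injective function.
example {f : α → β} (hf : Function.Injective f) {a b : Option α}
    (h : a.map f = b.map f) : a = b :=
  Option.map_injective hf h
```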
@@ -307,12 +307,20 @@ theorem map_id' {x : Option α} : (x.map fun a => a) = x := congrFun map_id x

theorem map_id_apply' {α : Type u} {x : Option α} : Option.map (fun (a : α) => a) x = x := by simp

/-- See `Option.apply_get` for a version that can be rewritten in the reverse direction. -/
@[simp, grind =] theorem get_map {f : α → β} {o : Option α} {h : (o.map f).isSome} :
    (o.map f).get h = f (o.get (by simpa using h)) := by
  cases o with
  | none => simp at h
  | some a => simp

/-- See `Option.get_map` for a version that can be rewritten in the reverse direction. -/
theorem apply_get {f : α → β} {o : Option α} {h} :
    f (o.get h) = (o.map f).get (by simp [h]) := by
  cases o
  · simp at h
  · simp

@[simp] theorem map_map (h : β → γ) (g : α → β) (x : Option α) :
    (x.map g).map h = x.map (h ∘ g) := by
  cases x <;> simp only [map_none, map_some, ·∘·]
@@ -732,6 +740,11 @@ theorem get_merge {o o' : Option α} {f : α → α → α} {i : α} [Std.Lawful
theorem elim_guard : (guard p a).elim b f = if p a then f a else b := by
  cases h : p a <;> simp [*, guard]

@[simp]
theorem Option.elim_map {f : α → β} {g' : γ} {g : β → γ} (o : Option α) :
    (o.map f).elim g' g = o.elim g' (g ∘ f) := by
  cases o <;> simp

-- I don't see how to construct a good grind pattern to instantiate this.
@[simp] theorem getD_map (f : α → β) (x : α) (o : Option α) :
    (o.map f).getD (f x) = f (getD o x) := by cases o <;> rfl

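The two companion lemmas above rewrite in opposite directions; a hedged sketch of the `get_map` direction (illustrative only, relying on the `simp` lemmas introduced above):

```lean
-- Illustrative only: `get_map` pushes `get` inside `map`;
-- `apply_get` is the reverse rewrite.
example {o : Option Nat} (h : o.isSome) :
    (o.map (· + 1)).get (by simpa using h) = o.get h + 1 := by
  simp
```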
@@ -10,7 +10,10 @@ public import Init.Data.Range.Polymorphic.Basic
public import Init.Data.Range.Polymorphic.Iterators
public import Init.Data.Range.Polymorphic.Stream
public import Init.Data.Range.Polymorphic.Lemmas
public import Init.Data.Range.Polymorphic.Map

public import Init.Data.Range.Polymorphic.Fin
public import Init.Data.Range.Polymorphic.Char
public import Init.Data.Range.Polymorphic.Nat
public import Init.Data.Range.Polymorphic.Int
public import Init.Data.Range.Polymorphic.BitVec

src/Init/Data/Range/Polymorphic/Char.lean (new file, 79 lines)
@@ -0,0 +1,79 @@
/-
Copyright (c) 2026 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Author: Markus Himmel
-/
module

prelude
public import Init.Data.Char.Ordinal
public import Init.Data.Range.Polymorphic.Fin
import Init.Data.Range.Polymorphic.Lemmas
import Init.Data.Range.Polymorphic.Map
import Init.Data.Char.Order

open Std Std.PRange Std.PRange.UpwardEnumerable

namespace Char

public instance : UpwardEnumerable Char where
  succ?
  succMany?

@[simp]
public theorem pRangeSucc?_eq : PRange.succ? (α := Char) = Char.succ? := rfl

@[simp]
public theorem pRangeSuccMany?_eq : PRange.succMany? (α := Char) = Char.succMany? := rfl

public instance : Rxc.HasSize Char where
  size lo hi := Rxc.HasSize.size lo.ordinal hi.ordinal

public instance : Rxo.HasSize Char where
  size lo hi := Rxo.HasSize.size lo.ordinal hi.ordinal

public instance : Rxi.HasSize Char where
  size hi := Rxi.HasSize.size hi.ordinal

public instance : Least? Char where
  least? := some '\x00'

@[simp]
public theorem least?_eq : Least?.least? (α := Char) = some '\x00' := rfl

def map : Map Char (Fin Char.numCodePoints) where
  toFun := Char.ordinal
  injective := ordinal_injective
  succ?_toFun := by simp [succ?_eq]
  succMany?_toFun := by simp [succMany?_eq]

@[simp]
theorem toFun_map : map.toFun = Char.ordinal := rfl

instance : Map.PreservesLE map where
  le_iff := by simp [le_iff_ordinal_le]

instance : Map.PreservesRxcSize map where
  size_eq := rfl

instance : Map.PreservesRxoSize map where
  size_eq := rfl

instance : Map.PreservesRxiSize map where
  size_eq := rfl

instance : Map.PreservesLeast? map where
  map_least? := by simp

public instance : LawfulUpwardEnumerable Char := .ofMap map
public instance : LawfulUpwardEnumerableLE Char := .ofMap map
public instance : LawfulUpwardEnumerableLT Char := .ofMap map
public instance : LawfulUpwardEnumerableLeast? Char := .ofMap map
public instance : Rxc.LawfulHasSize Char := .ofMap map
public instance : Rxc.IsAlwaysFinite Char := .ofMap map
public instance : Rxo.LawfulHasSize Char := .ofMap map
public instance : Rxo.IsAlwaysFinite Char := .ofMap map
public instance : Rxi.LawfulHasSize Char := .ofMap map
public instance : Rxi.IsAlwaysFinite Char := .ofMap map

end Char
src/Init/Data/Range/Polymorphic/Fin.lean (new file, 92 lines)
@@ -0,0 +1,92 @@
/-
Copyright (c) 2026 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Markus Himmel
-/
module

prelude
public import Init.Data.Range.Polymorphic.Instances
public import Init.Data.Fin.OverflowAware
import Init.Grind

public section

open Std Std.PRange

namespace Fin

instance : UpwardEnumerable (Fin n) where
  succ? i := i.addNat? 1
  succMany? m i := i.addNat? m

@[simp, grind =]
theorem pRangeSucc?_eq : PRange.succ? (α := Fin n) = (·.addNat? 1) := rfl

@[simp, grind =]
theorem pRangeSuccMany?_eq : PRange.succMany? m (α := Fin n) = (·.addNat? m) :=
  rfl

instance : LawfulUpwardEnumerable (Fin n) where
  ne_of_lt a b := by grind [UpwardEnumerable.LT]
  succMany?_zero a := by simp
  succMany?_add_one m a := by grind

instance : LawfulUpwardEnumerableLE (Fin n) where
  le_iff x y := by
    simp only [le_def, UpwardEnumerable.LE, pRangeSuccMany?_eq, Fin.addNat?_eq_dif,
      Option.dite_none_right_eq_some, Option.some.injEq, ← val_inj, exists_prop]
    exact ⟨fun h => ⟨y - x, by grind⟩, by grind⟩

instance : Least? (Fin 0) where
  least? := none

instance : LawfulUpwardEnumerableLeast? (Fin 0) where
  least?_le a := False.elim (Nat.not_lt_zero _ a.isLt)

@[simp]
theorem least?_eq_of_zero : Least?.least? (α := Fin 0) = none := rfl

instance [NeZero n] : Least? (Fin n) where
  least? := some 0

instance [NeZero n] : LawfulUpwardEnumerableLeast? (Fin n) where
  least?_le a := ⟨0, rfl, (LawfulUpwardEnumerableLE.le_iff 0 a).1 (Fin.zero_le _)⟩

@[simp]
theorem least?_eq [NeZero n] : Least?.least? (α := Fin n) = some 0 := rfl

instance : LawfulUpwardEnumerableLT (Fin n) := inferInstance

instance : Rxc.HasSize (Fin n) where
  size lo hi := hi + 1 - lo

@[grind =]
theorem rxcHasSize_eq :
    Rxc.HasSize.size (α := Fin n) = fun (lo hi : Fin n) => (hi + 1 - lo : Nat) := rfl

instance : Rxc.LawfulHasSize (Fin n) where
  size_eq_zero_of_not_le bound x := by grind
  size_eq_one_of_succ?_eq_none lo hi := by grind
  size_eq_succ_of_succ?_eq_some lo hi x := by grind

instance : Rxc.IsAlwaysFinite (Fin n) := inferInstance

instance : Rxo.HasSize (Fin n) := .ofClosed
instance : Rxo.LawfulHasSize (Fin n) := inferInstance
instance : Rxo.IsAlwaysFinite (Fin n) := inferInstance

instance : Rxi.HasSize (Fin n) where
  size lo := n - lo

@[grind =]
theorem rxiHasSize_eq :
    Rxi.HasSize.size (α := Fin n) = fun (lo : Fin n) => (n - lo : Nat) := rfl

instance : Rxi.LawfulHasSize (Fin n) where
  size_eq_one_of_succ?_eq_none x := by grind
  size_eq_succ_of_succ?_eq_some lo lo' := by grind

instance : Rxi.IsAlwaysFinite (Fin n) := inferInstance

end Fin
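A hedged sanity check of the closed-range size just defined (illustrative only; assumes the instances above are in scope):

```lean
open Std Std.PRange in
-- Illustrative only: the closed range from 2 to 4 in `Fin 5` has
-- `hi + 1 - lo = 3` elements, per the `Rxc.HasSize` instance above.
example : Rxc.HasSize.size (2 : Fin 5) (4 : Fin 5) = 3 := by decide
```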
src/Init/Data/Range/Polymorphic/Map.lean (new file, 195 lines)
@@ -0,0 +1,195 @@
/-
Copyright (c) 2026 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Markus Himmel
-/
module

prelude
public import Init.Data.Range.Polymorphic.Instances
public import Init.Data.Function
import Init.Data.Order.Lemmas
import Init.Data.Option.Function

public section

/-!
# Mappings between `UpwardEnumerable` types

In this file we build machinery for pulling back lawfulness properties for `UpwardEnumerable` along
injective functions that commute with the relevant operations.
-/

namespace Std

namespace PRange

namespace UpwardEnumerable

/--
An injective mapping between two types implementing `UpwardEnumerable` that commutes with `succ?`
and `succMany?`.

Having such a mapping means that all of the `Prop`-valued lawfulness classes around
`UpwardEnumerable` can be pulled back.
-/
structure Map (α : Type u) (β : Type v) [UpwardEnumerable α] [UpwardEnumerable β] where
  toFun : α → β
  injective : Function.Injective toFun
  succ?_toFun (a : α) : succ? (toFun a) = (succ? a).map toFun
  succMany?_toFun (n : Nat) (a : α) : succMany? n (toFun a) = (succMany? n a).map toFun

namespace Map

variable [UpwardEnumerable α] [UpwardEnumerable β]

theorem succ?_eq_none_iff (f : Map α β) {a : α} :
    succ? a = none ↔ succ? (f.toFun a) = none := by
  rw [← (Option.map_injective f.injective).eq_iff, Option.map_none, ← f.succ?_toFun]

theorem succ?_eq_some_iff (f : Map α β) {a b : α} :
    succ? a = some b ↔ succ? (f.toFun a) = some (f.toFun b) := by
  rw [← (Option.map_injective f.injective).eq_iff, Option.map_some, ← f.succ?_toFun]

theorem le_iff (f : Map α β) {a b : α} :
    UpwardEnumerable.LE a b ↔ UpwardEnumerable.LE (f.toFun a) (f.toFun b) := by
  simp only [UpwardEnumerable.LE, f.succMany?_toFun, Option.map_eq_some_iff]
  refine ⟨fun ⟨n, hn⟩ => ⟨n, b, by simp [hn]⟩, fun ⟨n, c, hn⟩ => ⟨n, ?_⟩⟩
  rw [hn.1, Option.some_inj, f.injective hn.2]

theorem lt_iff (f : Map α β) {a b : α} :
    UpwardEnumerable.LT a b ↔ UpwardEnumerable.LT (f.toFun a) (f.toFun b) := by
  simp only [UpwardEnumerable.LT, f.succMany?_toFun, Option.map_eq_some_iff]
  refine ⟨fun ⟨n, hn⟩ => ⟨n, b, by simp [hn]⟩, fun ⟨n, c, hn⟩ => ⟨n, ?_⟩⟩
  rw [hn.1, Option.some_inj, f.injective hn.2]

theorem succ?_toFun' (f : Map α β) : succ? ∘ f.toFun = Option.map f.toFun ∘ succ? := by
  ext
  simp [f.succ?_toFun]

/-- Compatibility class for `Map` and `≤`. -/
class PreservesLE [LE α] [LE β] (f : Map α β) where
  le_iff : a ≤ b ↔ f.toFun a ≤ f.toFun b

/-- Compatibility class for `Map` and `<`. -/
class PreservesLT [LT α] [LT β] (f : Map α β) where
  lt_iff : a < b ↔ f.toFun a < f.toFun b

/-- Compatibility class for `Map` and `Rxc.HasSize`. -/
class PreservesRxcSize [Rxc.HasSize α] [Rxc.HasSize β] (f : Map α β) where
  size_eq : Rxc.HasSize.size a b = Rxc.HasSize.size (f.toFun a) (f.toFun b)

/-- Compatibility class for `Map` and `Rxo.HasSize`. -/
class PreservesRxoSize [Rxo.HasSize α] [Rxo.HasSize β] (f : Map α β) where
  size_eq : Rxo.HasSize.size a b = Rxo.HasSize.size (f.toFun a) (f.toFun b)

/-- Compatibility class for `Map` and `Rxi.HasSize`. -/
class PreservesRxiSize [Rxi.HasSize α] [Rxi.HasSize β] (f : Map α β) where
  size_eq : Rxi.HasSize.size b = Rxi.HasSize.size (f.toFun b)

/-- Compatibility class for `Map` and `Least?`. -/
class PreservesLeast? [Least? α] [Least? β] (f : Map α β) where
  map_least? : Least?.least?.map f.toFun = Least?.least?

end UpwardEnumerable.Map

open UpwardEnumerable

variable [UpwardEnumerable α] [UpwardEnumerable β]

theorem LawfulUpwardEnumerable.ofMap [LawfulUpwardEnumerable β] (f : Map α β) :
    LawfulUpwardEnumerable α where
  ne_of_lt a b := by
    simpa only [f.lt_iff, ← f.injective.ne_iff] using LawfulUpwardEnumerable.ne_of_lt _ _
  succMany?_zero a := by
    apply Option.map_injective f.injective
    simpa [← f.succMany?_toFun] using LawfulUpwardEnumerable.succMany?_zero _
  succMany?_add_one n a := by
    apply Option.map_injective f.injective
    rw [← f.succMany?_toFun, LawfulUpwardEnumerable.succMany?_add_one,
      f.succMany?_toFun, Option.bind_map, Map.succ?_toFun', Option.map_bind]

instance [LE α] [LT α] [LawfulOrderLT α] [LE β] [LT β] [LawfulOrderLT β] (f : Map α β)
    [f.PreservesLE] : f.PreservesLT where
  lt_iff := by simp [lt_iff_le_and_not_ge, Map.PreservesLE.le_iff (f := f)]

theorem LawfulUpwardEnumerableLE.ofMap [LE α] [LE β] [LawfulUpwardEnumerableLE β] (f : Map α β)
    [f.PreservesLE] : LawfulUpwardEnumerableLE α where
  le_iff := by simp [Map.PreservesLE.le_iff (f := f), f.le_iff, LawfulUpwardEnumerableLE.le_iff]

theorem LawfulUpwardEnumerableLT.ofMap [LT α] [LT β] [LawfulUpwardEnumerableLT β] (f : Map α β)
    [f.PreservesLT] : LawfulUpwardEnumerableLT α where
  lt_iff := by simp [Map.PreservesLT.lt_iff (f := f), f.lt_iff, LawfulUpwardEnumerableLT.lt_iff]

theorem LawfulUpwardEnumerableLeast?.ofMap [Least? α] [Least? β] [LawfulUpwardEnumerableLeast? β]
    (f : Map α β) [f.PreservesLeast?] : LawfulUpwardEnumerableLeast? α where
  least?_le a := by
    obtain ⟨l, hl, hl'⟩ := LawfulUpwardEnumerableLeast?.least?_le (f.toFun a)
    have : (Least?.least? (α := α)).isSome := by
      rw [← Option.isSome_map (f := f.toFun), Map.PreservesLeast?.map_least?,
        hl, Option.isSome_some]
    refine ⟨Option.get _ this, by simp, ?_⟩
    rw [f.le_iff, Option.apply_get (f := f.toFun)]
    simpa [Map.PreservesLeast?.map_least?, hl] using hl'

end PRange

open PRange PRange.UpwardEnumerable

variable [UpwardEnumerable α] [UpwardEnumerable β]

theorem Rxc.LawfulHasSize.ofMap [LE α] [LE β] [Rxc.HasSize α] [Rxc.HasSize β] [Rxc.LawfulHasSize β]
    (f : Map α β) [f.PreservesLE] [f.PreservesRxcSize] : Rxc.LawfulHasSize α where
  size_eq_zero_of_not_le a b := by
    simpa [Map.PreservesRxcSize.size_eq (f := f), Map.PreservesLE.le_iff (f := f)] using
      Rxc.LawfulHasSize.size_eq_zero_of_not_le _ _
  size_eq_one_of_succ?_eq_none lo hi := by
    simpa [Map.PreservesRxcSize.size_eq (f := f), Map.PreservesLE.le_iff (f := f),
      f.succ?_eq_none_iff] using
      Rxc.LawfulHasSize.size_eq_one_of_succ?_eq_none _ _
  size_eq_succ_of_succ?_eq_some lo hi lo' := by
    simpa [Map.PreservesRxcSize.size_eq (f := f), Map.PreservesLE.le_iff (f := f),
      f.succ?_eq_some_iff] using
      Rxc.LawfulHasSize.size_eq_succ_of_succ?_eq_some _ _ _

theorem Rxo.LawfulHasSize.ofMap [LT α] [LT β] [Rxo.HasSize α] [Rxo.HasSize β] [Rxo.LawfulHasSize β]
    (f : Map α β) [f.PreservesLT] [f.PreservesRxoSize] : Rxo.LawfulHasSize α where
  size_eq_zero_of_not_le a b := by
    simpa [Map.PreservesRxoSize.size_eq (f := f), Map.PreservesLT.lt_iff (f := f)] using
      Rxo.LawfulHasSize.size_eq_zero_of_not_le _ _
  size_eq_one_of_succ?_eq_none lo hi := by
    simpa [Map.PreservesRxoSize.size_eq (f := f), Map.PreservesLT.lt_iff (f := f),
      f.succ?_eq_none_iff] using
      Rxo.LawfulHasSize.size_eq_one_of_succ?_eq_none _ _
  size_eq_succ_of_succ?_eq_some lo hi lo' := by
    simpa [Map.PreservesRxoSize.size_eq (f := f), Map.PreservesLT.lt_iff (f := f),
      f.succ?_eq_some_iff] using
      Rxo.LawfulHasSize.size_eq_succ_of_succ?_eq_some _ _ _

theorem Rxi.LawfulHasSize.ofMap [Rxi.HasSize α] [Rxi.HasSize β] [Rxi.LawfulHasSize β]
    (f : Map α β) [f.PreservesRxiSize] : Rxi.LawfulHasSize α where
  size_eq_one_of_succ?_eq_none lo := by
    simpa [Map.PreservesRxiSize.size_eq (f := f), f.succ?_eq_none_iff] using
      Rxi.LawfulHasSize.size_eq_one_of_succ?_eq_none _
  size_eq_succ_of_succ?_eq_some lo lo' := by
    simpa [Map.PreservesRxiSize.size_eq (f := f), f.succ?_eq_some_iff] using
      Rxi.LawfulHasSize.size_eq_succ_of_succ?_eq_some _ _

theorem Rxc.IsAlwaysFinite.ofMap [LE α] [LE β] [Rxc.IsAlwaysFinite β] (f : Map α β)
    [f.PreservesLE] : Rxc.IsAlwaysFinite α where
  finite init hi := by
    obtain ⟨n, hn⟩ := Rxc.IsAlwaysFinite.finite (f.toFun init) (f.toFun hi)
    exact ⟨n, by simpa [f.succMany?_toFun, Map.PreservesLE.le_iff (f := f)] using hn⟩

theorem Rxo.IsAlwaysFinite.ofMap [LT α] [LT β] [Rxo.IsAlwaysFinite β] (f : Map α β)
    [f.PreservesLT] : Rxo.IsAlwaysFinite α where
  finite init hi := by
    obtain ⟨n, hn⟩ := Rxo.IsAlwaysFinite.finite (f.toFun init) (f.toFun hi)
    exact ⟨n, by simpa [f.succMany?_toFun, Map.PreservesLT.lt_iff (f := f)] using hn⟩

theorem Rxi.IsAlwaysFinite.ofMap [Rxi.IsAlwaysFinite β] (f : Map α β) : Rxi.IsAlwaysFinite α where
  finite init := by
    obtain ⟨n, hn⟩ := Rxi.IsAlwaysFinite.finite (f.toFun init)
    exact ⟨n, by simpa [f.succMany?_toFun] using hn⟩

end Std
@@ -157,7 +157,7 @@ Converts an 8-bit signed integer to a natural number, mapping all negative numbe

Use `Int8.toBitVec` to obtain the two's complement representation.
-/
-@[inline] def Int8.toNatClampNeg (i : Int8) : Nat := i.toInt.toNat
+@[suggest_for Int8.toNat, inline] def Int8.toNatClampNeg (i : Int8) : Nat := i.toInt.toNat

/-- Obtains the `Int8` whose 2's complement representation is the given `BitVec 8`. -/
@[inline] def Int8.ofBitVec (b : BitVec 8) : Int8 := ⟨⟨b⟩⟩
@@ -510,7 +510,7 @@ Converts a 16-bit signed integer to a natural number, mapping all negative numbe

Use `Int16.toBitVec` to obtain the two's complement representation.
-/
-@[inline] def Int16.toNatClampNeg (i : Int16) : Nat := i.toInt.toNat
+@[suggest_for Int16.toNat, inline] def Int16.toNatClampNeg (i : Int16) : Nat := i.toInt.toNat

/-- Obtains the `Int16` whose 2's complement representation is the given `BitVec 16`. -/
@[inline] def Int16.ofBitVec (b : BitVec 16) : Int16 := ⟨⟨b⟩⟩
@@ -880,7 +880,7 @@ Converts a 32-bit signed integer to a natural number, mapping all negative numbe

Use `Int32.toBitVec` to obtain the two's complement representation.
-/
-@[inline] def Int32.toNatClampNeg (i : Int32) : Nat := i.toInt.toNat
+@[suggest_for Int32.toNat, inline] def Int32.toNatClampNeg (i : Int32) : Nat := i.toInt.toNat

/-- Obtains the `Int32` whose 2's complement representation is the given `BitVec 32`. -/
@[inline] def Int32.ofBitVec (b : BitVec 32) : Int32 := ⟨⟨b⟩⟩
@@ -1270,7 +1270,7 @@ Converts a 64-bit signed integer to a natural number, mapping all negative numbe

Use `Int64.toBitVec` to obtain the two's complement representation.
-/
-@[inline] def Int64.toNatClampNeg (i : Int64) : Nat := i.toInt.toNat
+@[suggest_for Int64.toNat, inline] def Int64.toNatClampNeg (i : Int64) : Nat := i.toInt.toNat

/-- Obtains the `Int64` whose 2's complement representation is the given `BitVec 64`. -/
@[inline] def Int64.ofBitVec (b : BitVec 64) : Int64 := ⟨⟨b⟩⟩
@@ -1637,7 +1637,7 @@ Converts a word-sized signed integer to a natural number, mapping all negative n

Use `ISize.toBitVec` to obtain the two's complement representation.
-/
-@[inline] def ISize.toNatClampNeg (i : ISize) : Nat := i.toInt.toNat
+@[suggest_for ISize.toNat, inline] def ISize.toNatClampNeg (i : ISize) : Nat := i.toInt.toNat

/-- Obtains the `ISize` whose 2's complement representation is the given `BitVec`. -/
@[inline] def ISize.ofBitVec (b : BitVec System.Platform.numBits) : ISize := ⟨⟨b⟩⟩

@@ -4,12 +4,9 @@ Released under Apache 2.0 license as described in the file LICENSE.
Authors: Leonardo de Moura
-/
module

prelude
public import Init.Classical

public section

namespace Lean.Grind

/-- A helper gadget for annotating nested proofs in goals. -/

@@ -360,7 +360,7 @@ recommended_spelling "smul" for "•" in [HSMul.hSMul, «term_•_»]
recommended_spelling "append" for "++" in [HAppend.hAppend, «term_++_»]
/-- when used as a unary operator -/
recommended_spelling "neg" for "-" in [Neg.neg, «term-_»]
-recommended_spelling "inv" for "⁻¹" in [Inv.inv]
+recommended_spelling "inv" for "⁻¹" in [Inv.inv, «term_⁻¹»]
recommended_spelling "dvd" for "∣" in [Dvd.dvd, «term_∣_»]
recommended_spelling "shiftLeft" for "<<<" in [HShiftLeft.hShiftLeft, «term_<<<_»]
recommended_spelling "shiftRight" for ">>>" in [HShiftRight.hShiftRight, «term_>>>_»]

@@ -2810,6 +2810,8 @@ structure Char where
  /-- The value must be a legal scalar value. -/
  valid : val.isValidChar

grind_pattern Char.valid => self.val

private theorem isValidChar_UInt32 {n : Nat} (h : n.isValidChar) : LT.lt n UInt32.size :=
  match h with
  | Or.inl h => Nat.lt_trans h (of_decide_eq_true rfl)

@@ -14,6 +14,8 @@ public section
namespace Lean.Sym

theorem ne_self (a : α) : (a ≠ a) = False := by simp
theorem not_true_eq : (¬ True) = False := by simp
theorem not_false_eq : (¬ False) = True := by simp

theorem ite_cond_congr {α : Sort u} (c : Prop) {inst : Decidable c} (a b : α)
    (c' : Prop) {inst' : Decidable c'} (h : c = c') : @ite α c inst a b = @ite α c' inst' a b := by
@@ -46,6 +48,8 @@ theorem UInt32.lt_eq_true (a b : UInt32) (h : decide (a < b) = true) : (a < b) =
theorem UInt64.lt_eq_true (a b : UInt64) (h : decide (a < b) = true) : (a < b) = True := by simp_all
theorem Fin.lt_eq_true (a b : Fin n) (h : decide (a < b) = true) : (a < b) = True := by simp_all
theorem BitVec.lt_eq_true (a b : BitVec n) (h : decide (a < b) = true) : (a < b) = True := by simp_all
theorem String.lt_eq_true (a b : String) (h : decide (a < b) = true) : (a < b) = True := by simp_all
theorem Char.lt_eq_true (a b : Char) (h : decide (a < b) = true) : (a < b) = True := by simp_all

theorem Nat.lt_eq_false (a b : Nat) (h : decide (a < b) = false) : (a < b) = False := by simp_all
theorem Int.lt_eq_false (a b : Int) (h : decide (a < b) = false) : (a < b) = False := by simp_all
@@ -60,6 +64,8 @@ theorem UInt32.lt_eq_false (a b : UInt32) (h : decide (a < b) = false) : (a < b)
theorem UInt64.lt_eq_false (a b : UInt64) (h : decide (a < b) = false) : (a < b) = False := by simp_all
theorem Fin.lt_eq_false (a b : Fin n) (h : decide (a < b) = false) : (a < b) = False := by simp_all
theorem BitVec.lt_eq_false (a b : BitVec n) (h : decide (a < b) = false) : (a < b) = False := by simp_all
theorem String.lt_eq_false (a b : String) (h : decide (a < b) = false) : (a < b) = False := by simp_all
theorem Char.lt_eq_false (a b : Char) (h : decide (a < b) = false) : (a < b) = False := by simp_all

theorem Nat.le_eq_true (a b : Nat) (h : decide (a ≤ b) = true) : (a ≤ b) = True := by simp_all
theorem Int.le_eq_true (a b : Int) (h : decide (a ≤ b) = true) : (a ≤ b) = True := by simp_all
@@ -74,6 +80,8 @@ theorem UInt32.le_eq_true (a b : UInt32) (h : decide (a ≤ b) = true) : (a ≤
theorem UInt64.le_eq_true (a b : UInt64) (h : decide (a ≤ b) = true) : (a ≤ b) = True := by simp_all
theorem Fin.le_eq_true (a b : Fin n) (h : decide (a ≤ b) = true) : (a ≤ b) = True := by simp_all
theorem BitVec.le_eq_true (a b : BitVec n) (h : decide (a ≤ b) = true) : (a ≤ b) = True := by simp_all
theorem String.le_eq_true (a b : String) (h : decide (a ≤ b) = true) : (a ≤ b) = True := by simp_all
theorem Char.le_eq_true (a b : Char) (h : decide (a ≤ b) = true) : (a ≤ b) = True := by simp_all

theorem Nat.le_eq_false (a b : Nat) (h : decide (a ≤ b) = false) : (a ≤ b) = False := by simp_all
theorem Int.le_eq_false (a b : Int) (h : decide (a ≤ b) = false) : (a ≤ b) = False := by simp_all
@@ -88,62 +96,8 @@ theorem UInt32.le_eq_false (a b : UInt32) (h : decide (a ≤ b) = false) : (a
theorem UInt64.le_eq_false (a b : UInt64) (h : decide (a ≤ b) = false) : (a ≤ b) = False := by simp_all
theorem Fin.le_eq_false (a b : Fin n) (h : decide (a ≤ b) = false) : (a ≤ b) = False := by simp_all
theorem BitVec.le_eq_false (a b : BitVec n) (h : decide (a ≤ b) = false) : (a ≤ b) = False := by simp_all

theorem Nat.gt_eq_true (a b : Nat) (h : decide (a > b) = true) : (a > b) = True := by simp_all
theorem Int.gt_eq_true (a b : Int) (h : decide (a > b) = true) : (a > b) = True := by simp_all
theorem Rat.gt_eq_true (a b : Rat) (h : decide (a > b) = true) : (a > b) = True := by simp_all
theorem Int8.gt_eq_true (a b : Int8) (h : decide (a > b) = true) : (a > b) = True := by simp_all
theorem Int16.gt_eq_true (a b : Int16) (h : decide (a > b) = true) : (a > b) = True := by simp_all
theorem Int32.gt_eq_true (a b : Int32) (h : decide (a > b) = true) : (a > b) = True := by simp_all
theorem Int64.gt_eq_true (a b : Int64) (h : decide (a > b) = true) : (a > b) = True := by simp_all
theorem UInt8.gt_eq_true (a b : UInt8) (h : decide (a > b) = true) : (a > b) = True := by simp_all
theorem UInt16.gt_eq_true (a b : UInt16) (h : decide (a > b) = true) : (a > b) = True := by simp_all
theorem UInt32.gt_eq_true (a b : UInt32) (h : decide (a > b) = true) : (a > b) = True := by simp_all
theorem UInt64.gt_eq_true (a b : UInt64) (h : decide (a > b) = true) : (a > b) = True := by simp_all
theorem Fin.gt_eq_true (a b : Fin n) (h : decide (a > b) = true) : (a > b) = True := by simp_all
theorem BitVec.gt_eq_true (a b : BitVec n) (h : decide (a > b) = true) : (a > b) = True := by simp_all

theorem Nat.gt_eq_false (a b : Nat) (h : decide (a > b) = false) : (a > b) = False := by simp_all
theorem Int.gt_eq_false (a b : Int) (h : decide (a > b) = false) : (a > b) = False := by simp_all
theorem Rat.gt_eq_false (a b : Rat) (h : decide (a > b) = false) : (a > b) = False := by simp_all
theorem Int8.gt_eq_false (a b : Int8) (h : decide (a > b) = false) : (a > b) = False := by simp_all
theorem Int16.gt_eq_false (a b : Int16) (h : decide (a > b) = false) : (a > b) = False := by simp_all
theorem Int32.gt_eq_false (a b : Int32) (h : decide (a > b) = false) : (a > b) = False := by simp_all
theorem Int64.gt_eq_false (a b : Int64) (h : decide (a > b) = false) : (a > b) = False := by simp_all
theorem UInt8.gt_eq_false (a b : UInt8) (h : decide (a > b) = false) : (a > b) = False := by simp_all
theorem UInt16.gt_eq_false (a b : UInt16) (h : decide (a > b) = false) : (a > b) = False := by simp_all
theorem UInt32.gt_eq_false (a b : UInt32) (h : decide (a > b) = false) : (a > b) = False := by simp_all
theorem UInt64.gt_eq_false (a b : UInt64) (h : decide (a > b) = false) : (a > b) = False := by simp_all
theorem Fin.gt_eq_false (a b : Fin n) (h : decide (a > b) = false) : (a > b) = False := by simp_all
theorem BitVec.gt_eq_false (a b : BitVec n) (h : decide (a > b) = false) : (a > b) = False := by simp_all

theorem Nat.ge_eq_true (a b : Nat) (h : decide (a ≥ b) = true) : (a ≥ b) = True := by simp_all
theorem Int.ge_eq_true (a b : Int) (h : decide (a ≥ b) = true) : (a ≥ b) = True := by simp_all
theorem Rat.ge_eq_true (a b : Rat) (h : decide (a ≥ b) = true) : (a ≥ b) = True := by simp_all
theorem Int8.ge_eq_true (a b : Int8) (h : decide (a ≥ b) = true) : (a ≥ b) = True := by simp_all
theorem Int16.ge_eq_true (a b : Int16) (h : decide (a ≥ b) = true) : (a ≥ b) = True := by simp_all
theorem Int32.ge_eq_true (a b : Int32) (h : decide (a ≥ b) = true) : (a ≥ b) = True := by simp_all
theorem Int64.ge_eq_true (a b : Int64) (h : decide (a ≥ b) = true) : (a ≥ b) = True := by simp_all
theorem UInt8.ge_eq_true (a b : UInt8) (h : decide (a ≥ b) = true) : (a ≥ b) = True := by simp_all
theorem UInt16.ge_eq_true (a b : UInt16) (h : decide (a ≥ b) = true) : (a ≥ b) = True := by simp_all
theorem UInt32.ge_eq_true (a b : UInt32) (h : decide (a ≥ b) = true) : (a ≥ b) = True := by simp_all
theorem UInt64.ge_eq_true (a b : UInt64) (h : decide (a ≥ b) = true) : (a ≥ b) = True := by simp_all
theorem Fin.ge_eq_true (a b : Fin n) (h : decide (a ≥ b) = true) : (a ≥ b) = True := by simp_all
theorem BitVec.ge_eq_true (a b : BitVec n) (h : decide (a ≥ b) = true) : (a ≥ b) = True := by simp_all

theorem Nat.ge_eq_false (a b : Nat) (h : decide (a ≥ b) = false) : (a ≥ b) = False := by simp_all
theorem Int.ge_eq_false (a b : Int) (h : decide (a ≥ b) = false) : (a ≥ b) = False := by simp_all
|
||||
theorem Rat.ge_eq_false (a b : Rat) (h : decide (a ≥ b) = false) : (a ≥ b) = False := by simp_all
|
||||
theorem Int8.ge_eq_false (a b : Int8) (h : decide (a ≥ b) = false) : (a ≥ b) = False := by simp_all
|
||||
theorem Int16.ge_eq_false (a b : Int16) (h : decide (a ≥ b) = false) : (a ≥ b) = False := by simp_all
|
||||
theorem Int32.ge_eq_false (a b : Int32) (h : decide (a ≥ b) = false) : (a ≥ b) = False := by simp_all
|
||||
theorem Int64.ge_eq_false (a b : Int64) (h : decide (a ≥ b) = false) : (a ≥ b) = False := by simp_all
|
||||
theorem UInt8.ge_eq_false (a b : UInt8) (h : decide (a ≥ b) = false) : (a ≥ b) = False := by simp_all
|
||||
theorem UInt16.ge_eq_false (a b : UInt16) (h : decide (a ≥ b) = false) : (a ≥ b) = False := by simp_all
|
||||
theorem UInt32.ge_eq_false (a b : UInt32) (h : decide (a ≥ b) = false) : (a ≥ b) = False := by simp_all
|
||||
theorem UInt64.ge_eq_false (a b : UInt64) (h : decide (a ≥ b) = false) : (a ≥ b) = False := by simp_all
|
||||
theorem Fin.ge_eq_false (a b : Fin n) (h : decide (a ≥ b) = false) : (a ≥ b) = False := by simp_all
|
||||
theorem BitVec.ge_eq_false (a b : BitVec n) (h : decide (a ≥ b) = false) : (a ≥ b) = False := by simp_all
|
||||
theorem String.le_eq_false (a b : String) (h : decide (a ≤ b) = false) : (a ≤ b) = False := by simp_all
|
||||
theorem Char.le_eq_false (a b : Char) (h : decide (a ≤ b) = false) : (a ≤ b) = False := by simp_all
|
||||
|
||||
theorem Nat.eq_eq_true (a b : Nat) (h : decide (a = b) = true) : (a = b) = True := by simp_all
|
||||
theorem Int.eq_eq_true (a b : Int) (h : decide (a = b) = true) : (a = b) = True := by simp_all
|
||||
@@ -158,6 +112,8 @@ theorem UInt32.eq_eq_true (a b : UInt32) (h : decide (a = b) = true) : (a = b) =
|
||||
theorem UInt64.eq_eq_true (a b : UInt64) (h : decide (a = b) = true) : (a = b) = True := by simp_all
|
||||
theorem Fin.eq_eq_true (a b : Fin n) (h : decide (a = b) = true) : (a = b) = True := by simp_all
|
||||
theorem BitVec.eq_eq_true (a b : BitVec n) (h : decide (a = b) = true) : (a = b) = True := by simp_all
|
||||
theorem String.eq_eq_true (a b : String) (h : decide (a = b) = true) : (a = b) = True := by simp_all
|
||||
theorem Char.eq_eq_true (a b : Char) (h : decide (a = b) = true) : (a = b) = True := by simp_all
|
||||
|
||||
theorem Nat.eq_eq_false (a b : Nat) (h : decide (a = b) = false) : (a = b) = False := by simp_all
|
||||
theorem Int.eq_eq_false (a b : Int) (h : decide (a = b) = false) : (a = b) = False := by simp_all
|
||||
@@ -172,34 +128,8 @@ theorem UInt32.eq_eq_false (a b : UInt32) (h : decide (a = b) = false) : (a = b)
|
||||
theorem UInt64.eq_eq_false (a b : UInt64) (h : decide (a = b) = false) : (a = b) = False := by simp_all
|
||||
theorem Fin.eq_eq_false (a b : Fin n) (h : decide (a = b) = false) : (a = b) = False := by simp_all
|
||||
theorem BitVec.eq_eq_false (a b : BitVec n) (h : decide (a = b) = false) : (a = b) = False := by simp_all
|
||||
|
||||
theorem Nat.ne_eq_true (a b : Nat) (h : decide (a ≠ b) = true) : (a ≠ b) = True := by simp_all
|
||||
theorem Int.ne_eq_true (a b : Int) (h : decide (a ≠ b) = true) : (a ≠ b) = True := by simp_all
|
||||
theorem Rat.ne_eq_true (a b : Rat) (h : decide (a ≠ b) = true) : (a ≠ b) = True := by simp_all
|
||||
theorem Int8.ne_eq_true (a b : Int8) (h : decide (a ≠ b) = true) : (a ≠ b) = True := by simp_all
|
||||
theorem Int16.ne_eq_true (a b : Int16) (h : decide (a ≠ b) = true) : (a ≠ b) = True := by simp_all
|
||||
theorem Int32.ne_eq_true (a b : Int32) (h : decide (a ≠ b) = true) : (a ≠ b) = True := by simp_all
|
||||
theorem Int64.ne_eq_true (a b : Int64) (h : decide (a ≠ b) = true) : (a ≠ b) = True := by simp_all
|
||||
theorem UInt8.ne_eq_true (a b : UInt8) (h : decide (a ≠ b) = true) : (a ≠ b) = True := by simp_all
|
||||
theorem UInt16.ne_eq_true (a b : UInt16) (h : decide (a ≠ b) = true) : (a ≠ b) = True := by simp_all
|
||||
theorem UInt32.ne_eq_true (a b : UInt32) (h : decide (a ≠ b) = true) : (a ≠ b) = True := by simp_all
|
||||
theorem UInt64.ne_eq_true (a b : UInt64) (h : decide (a ≠ b) = true) : (a ≠ b) = True := by simp_all
|
||||
theorem Fin.ne_eq_true (a b : Fin n) (h : decide (a ≠ b) = true) : (a ≠ b) = True := by simp_all
|
||||
theorem BitVec.ne_eq_true (a b : BitVec n) (h : decide (a ≠ b) = true) : (a ≠ b) = True := by simp_all
|
||||
|
||||
theorem Nat.ne_eq_false (a b : Nat) (h : decide (a ≠ b) = false) : (a ≠ b) = False := by simp_all
|
||||
theorem Int.ne_eq_false (a b : Int) (h : decide (a ≠ b) = false) : (a ≠ b) = False := by simp_all
|
||||
theorem Rat.ne_eq_false (a b : Rat) (h : decide (a ≠ b) = false) : (a ≠ b) = False := by simp_all
|
||||
theorem Int8.ne_eq_false (a b : Int8) (h : decide (a ≠ b) = false) : (a ≠ b) = False := by simp_all
|
||||
theorem Int16.ne_eq_false (a b : Int16) (h : decide (a ≠ b) = false) : (a ≠ b) = False := by simp_all
|
||||
theorem Int32.ne_eq_false (a b : Int32) (h : decide (a ≠ b) = false) : (a ≠ b) = False := by simp_all
|
||||
theorem Int64.ne_eq_false (a b : Int64) (h : decide (a ≠ b) = false) : (a ≠ b) = False := by simp_all
|
||||
theorem UInt8.ne_eq_false (a b : UInt8) (h : decide (a ≠ b) = false) : (a ≠ b) = False := by simp_all
|
||||
theorem UInt16.ne_eq_false (a b : UInt16) (h : decide (a ≠ b) = false) : (a ≠ b) = False := by simp_all
|
||||
theorem UInt32.ne_eq_false (a b : UInt32) (h : decide (a ≠ b) = false) : (a ≠ b) = False := by simp_all
|
||||
theorem UInt64.ne_eq_false (a b : UInt64) (h : decide (a ≠ b) = false) : (a ≠ b) = False := by simp_all
|
||||
theorem Fin.ne_eq_false (a b : Fin n) (h : decide (a ≠ b) = false) : (a ≠ b) = False := by simp_all
|
||||
theorem BitVec.ne_eq_false (a b : BitVec n) (h : decide (a ≠ b) = false) : (a ≠ b) = False := by simp_all
|
||||
theorem String.eq_eq_false (a b : String) (h : decide (a = b) = false) : (a = b) = False := by simp_all
|
||||
theorem Char.eq_eq_false (a b : Char) (h : decide (a = b) = false) : (a = b) = False := by simp_all
|
||||
|
||||
theorem Nat.dvd_eq_true (a b : Nat) (h : decide (a ∣ b) = true) : (a ∣ b) = True := by simp_all
|
||||
theorem Int.dvd_eq_true (a b : Int) (h : decide (a ∣ b) = true) : (a ∣ b) = True := by simp_all
|
||||
|
||||
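All of the lemmas above instantiate one schema per type: once `decide` has evaluated a decidable comparison to `true` or `false`, the proposition itself can be rewritten to `True` or `False`. A minimal sketch of the pattern (not part of the diff; provable by the same `simp_all` call):

```lean
-- From `decide p = true` we may rewrite `p` to `True`, and dually for `false`.
example (a b : Nat) (h : decide (a > b) = true) : (a > b) = True := by
  simp_all

example (a b : Nat) (h : decide (a > b) = false) : (a > b) = False := by
  simp_all
```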
@@ -7,6 +7,7 @@ module

prelude
public import Lean.Attributes
import Lean.Meta.RecExt

public section

@@ -33,14 +34,8 @@ private def isValidMacroInline (declName : Name) : CoreM Bool := do
  unless info.all.length = 1 do
    -- We do not allow `[macro_inline]` attributes at mutual recursive definitions
    return false
  let env ← getEnv
  let isRec (declName' : Name) : Bool :=
    isBRecOnRecursor env declName' ||
    declName' == ``WellFounded.fix ||
    declName' == ``WellFounded.Nat.fix ||
    declName' == declName ++ `_unary -- Auxiliary declaration created by `WF` module
  if Option.isSome <| info.value.find? fun e => e.isConst && isRec e.constName! then
    -- It contains a `brecOn` or `WellFounded.fix` application. So, it should be recursive
  if (← Meta.isRecursiveDefinition declName) then
    -- It is recursive
    return false
  return true
@@ -56,9 +56,9 @@ public def Environment.getModulePackageByIdx? (env : Environment) (idx : ModuleI
Returns the standard base of the native symbol for the compiled constant {lean}`declName`.

For many constants, this is the full symbol. However, initializers have an additional prefix
(i.e., {lit}`_init_`) and boxed functions have an additional suffix (i.e., {lit}`___boxed`).
Furthermore, some constants do not use this stem at all (e.g., {lit}`main` and definitions
with {lit}`@[export]`).
(i.e., {lit}`_init_`) and boxed functions have an additional suffix
(see {name}`mkMangledBoxedName`). Furthermore, some constants do not use this stem at all
(e.g., {lit}`main` and definitions with {lit}`@[export]`).
-/
@[export lean_get_symbol_stem]
public def getSymbolStem (env : Environment) (declName : Name) : String :=
@@ -7,7 +7,7 @@ module

prelude
public import Lean.Setup
import Init.Data.String.Termination
import Init.Data.String.TakeDrop

namespace String

@@ -133,6 +133,18 @@ def Name.mangleAux : Name → String
public def Name.mangle (n : Name) (pre : String := "l_") : String :=
  pre ++ Name.mangleAux n

/--
Given `s = nm.mangle pre` for some `nm : Name` and `pre : String` with `nm != Name.anonymous`,
returns `(mkBoxedName nm).mangle pre`. This is used in the interpreter to find names of boxed
IR declarations.
-/
@[export lean_mk_mangled_boxed_name]
public def mkMangledBoxedName (s : String) : String :=
  if s.endsWith "__" then
    s ++ "_00__boxed"
  else
    s ++ "___boxed"

/--
The mangled name of the name used to create the module initialization function.
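The `__` case above avoids ambiguity: a mangled name may itself end in `__`, so a `_00` escape is inserted before appending the boxed marker. A hedged sketch of the resulting behavior (assuming the `mkMangledBoxedName` definition shown above is in scope):

```lean
-- Plain stem: just append `___boxed`.
#eval mkMangledBoxedName "l_Foo_bar"    -- "l_Foo_bar___boxed"
-- Stem already ends in `__`: escape with `_00` first.
#eval mkMangledBoxedName "l_Foo_bar__"  -- "l_Foo_bar___00__boxed"
```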
@@ -543,12 +543,10 @@ def logSnapshotTask (task : Language.SnapshotTask Language.SnapshotTree) : CoreM
/-- Wraps the given action for use in `EIO.asTask` etc., discarding its final monadic state. -/
def wrapAsync {α : Type} (act : α → CoreM β) (cancelTk? : Option IO.CancelToken) :
    CoreM (α → EIO Exception β) := do
  let (childNGen, parentNGen) := (← getNGen).mkChild
  setNGen parentNGen
  let (childDeclNGen, parentDeclNGen) := (← getDeclNGen).mkChild
  setDeclNGen parentDeclNGen
  let (childNGen, parentNGen) := (← getDeclNGen).mkChild
  setDeclNGen parentNGen
  let st ← get
  let st := { st with auxDeclNGen := childDeclNGen, ngen := childNGen }
  let st := { st with auxDeclNGen := childNGen }
  let ctx ← read
  let ctx := { ctx with cancelTk? }
  let heartbeats := (← IO.getNumHeartbeats) - ctx.initHeartbeats
@@ -125,7 +125,7 @@ Parses and elaborates a Verso module docstring.
def versoModDocString
    (range : DeclarationRange) (doc : TSyntax ``document) :
    TermElabM VersoModuleDocs.Snippet := do
  let level := getVersoModuleDocs (← getEnv) |>.terminalNesting |>.map (· + 1)
  let level := getMainVersoModuleDocs (← getEnv) |>.terminalNesting |>.map (· + 1)
  Doc.elabModSnippet range (doc.raw.getArgs.map (⟨·⟩)) (level.getD 0) |>.execForModule

@@ -409,11 +409,29 @@ private builtin_initialize versoModuleDocExt :
}

def getVersoModuleDocs (env : Environment) : VersoModuleDocs :=
/--
Returns the Verso module docs for the current main module.

During elaboration, this will return the module docs that have been added thus far, rather than
those for the entire module.
-/
def getMainVersoModuleDocs (env : Environment) : VersoModuleDocs :=
  versoModuleDocExt.getState env

@[deprecated getMainVersoModuleDocs (since := "2026-01-21")]
def getVersoModuleDocs := @getMainVersoModuleDocs

/--
Returns all snippets of the Verso module docs from the indicated module, if they exist.
-/
def getVersoModuleDoc? (env : Environment) (moduleName : Name) :
    Option (Array VersoModuleDocs.Snippet) :=
  env.getModuleIdx? moduleName |>.map fun modIdx =>
    versoModuleDocExt.getModuleEntries (level := .server) env modIdx

def addVersoModuleDocSnippet (env : Environment) (snippet : VersoModuleDocs.Snippet) : Except String Environment :=
  let docs := getVersoModuleDocs env
  let docs := getMainVersoModuleDocs env
  if docs.canAdd snippet then
    pure <| versoModuleDocExt.addEntry env snippet
  else throw s!"Can't add - incorrect nesting {docs.terminalNesting.map (s!"(expected at most {·})") |>.getD ""})"
@@ -21,7 +21,7 @@ namespace Lean.Elab.Command

match stx[1] with
| Syntax.atom _ val =>
  if getVersoModuleDocs (← getEnv) |>.isEmpty then
  if getMainVersoModuleDocs (← getEnv) |>.isEmpty then
    let doc := String.Pos.Raw.extract val 0 (val.rawEndPos.unoffsetBy ⟨2⟩)
    modifyEnv fun env => addMainModuleDoc env ⟨doc, range⟩
  else
@@ -274,12 +274,10 @@ def wrapAsync {α β : Type} (act : α → CommandElabM β) (cancelTk? : Option
    CommandElabM (α → EIO Exception β) := do
  let ctx ← read
  let ctx := { ctx with cancelTk? }
  let (childNGen, parentNGen) := (← get).ngen.mkChild
  modify fun s => { s with ngen := parentNGen }
  let (childDeclNGen, parentDeclNGen) := (← getDeclNGen).mkChild
  setDeclNGen parentDeclNGen
  let (childNGen, parentNGen) := (← getDeclNGen).mkChild
  setDeclNGen parentNGen
  let st ← get
  let st := { st with auxDeclNGen := childDeclNGen, ngen := childNGen }
  let st := { st with auxDeclNGen := childNGen }
  return (act · |>.run ctx |>.run' st)

open Language in
@@ -907,23 +907,26 @@ def lean (name : Option Ident := none) (error warning : flag false) («show» :
    (endPos := endPos) (endPos_valid := by simp only [endPos]; split <;> simp [*])
  let cctx : Command.Context := {fileName := ← getFileName, fileMap := text, snap? := none, cancelTk? := none}
  let scopes := (← get).scopes
  let mut cmdState : Command.State := { env, maxRecDepth := ← MonadRecDepth.getMaxRecDepth, scopes }
  let mut pstate : Parser.ModuleParserState := {pos := pos, recovering := false}
  let mut cmds := #[]
  repeat
    let scope := cmdState.scopes.head!
    let pmctx := { env := cmdState.env, options := scope.opts, currNamespace := scope.currNamespace, openDecls := scope.openDecls }
    let (cmd, ps', messages) := Parser.parseCommand ictx pmctx pstate cmdState.messages
    cmds := cmds.push cmd
    pstate := ps'
    cmdState := { cmdState with messages := messages }
    cmdState ← runCommand (Command.elabCommand cmd) cmd cctx cmdState
    if Parser.isTerminalCommand cmd then break
  setEnv cmdState.env
  modify fun st => { st with scopes := cmdState.scopes }
  let (cmds, cmdState, trees) ← withSaveInfoContext do
    let mut cmdState : Command.State := { env, maxRecDepth := ← MonadRecDepth.getMaxRecDepth, scopes }
    let mut pstate : Parser.ModuleParserState := {pos := pos, recovering := false}
    let mut cmds := #[]
    repeat
      let scope := cmdState.scopes.head!
      let pmctx := { env := cmdState.env, options := scope.opts, currNamespace := scope.currNamespace, openDecls := scope.openDecls }
      let (cmd, ps', messages) := Parser.parseCommand ictx pmctx pstate cmdState.messages
      cmds := cmds.push cmd
      pstate := ps'
      cmdState := { cmdState with messages := messages }
      cmdState ← runCommand (Command.elabCommand cmd) cmd cctx cmdState
      if Parser.isTerminalCommand cmd then break
    setEnv cmdState.env
    modify fun st => { st with scopes := cmdState.scopes }

    for t in cmdState.infoState.trees do
      pushInfoTree t
    for t in cmdState.infoState.trees do
      pushInfoTree t
    let trees := (← getInfoTrees)
    pure (cmds, cmdState, trees)

  let mut output := #[]
  for msg in cmdState.messages.toArray do
@@ -937,14 +940,13 @@ def lean (name : Option Ident := none) (error warning : flag false) («show» :
      let hint ← flagHint m!"The `+error` flag indicates that errors are expected:" #[" +error"]
      logErrorAt msgStx m!"Unexpected error:{indentD msg.data}{hint.getD m!""}"
    if msg.severity == .warning && !warning then
      let hint ← flagHint m!"The `+error` flag indicates that warnings are expected:" #[" +warning"]
      let hint ← flagHint m!"The `+warning` flag indicates that warnings are expected:" #[" +warning"]
      logErrorAt msgStx m!"Unexpected warning:{indentD msg.data}{hint.getD m!""}"
    else
      withRef msgStx <| log msg.data (severity := .information) (isSilent := true)
  if let some x := name then
    modifyEnv (leanOutputExt.modifyState · (·.insert x.getId output))
  if «show» then
    let trees := (← getInfoTrees)
    if h : trees.size > 0 then
      let hl := Data.LeanBlock.mk (← highlightSyntax trees (mkNullNode cmds))
      return .other {name := ``Data.LeanBlock, val := .mk hl} #[.code code.getString]
@@ -20,10 +20,12 @@ structure LetRecDeclView where
  declName : Name
  parentName? : Option Name
  binderIds : Array Syntax
  binders : Syntax -- binder syntax for docstring elaboration
  type : Expr
  mvar : Expr -- auxiliary metavariable used to lift the 'let rec'
  valStx : Syntax
  termination : TerminationHints
  docString? : Option (TSyntax ``Parser.Command.docComment × Bool) := none

structure LetRecView where
  decls : Array LetRecDeclView
@@ -32,8 +34,9 @@ structure LetRecView where
/- group ("let " >> nonReservedSymbol "rec ") >> sepBy1 (group (optional «attributes» >> letDecl)) ", " >> "; " >> termParser -/
private def mkLetRecDeclView (letRec : Syntax) : TermElabM LetRecView := do
  let mut decls : Array LetRecDeclView := #[]
  let isVerso := doc.verso.get (← getOptions)
  for attrDeclStx in letRec[1][0].getSepArgs do
    let docStr? := attrDeclStx[0].getOptional?.map TSyntax.mk
    let docStr? := attrDeclStx[0].getOptional?.map (TSyntax.mk ·, isVerso)
    let attrOptStx := attrDeclStx[1]
    let attrs ← if attrOptStx.isNone then pure #[] else elabDeclAttrs attrOptStx[0]
    let decl := attrDeclStx[2][0]
@@ -45,16 +48,21 @@ private def mkLetRecDeclView (letRec : Syntax) : TermElabM LetRecView := do
      throwErrorAt declId "'let rec' expressions must be named"
    let shortDeclName := declId.getId
    let parentName? ← getDeclName?
    let declName := parentName?.getD Name.anonymous ++ shortDeclName
    let mut declName := parentName?.getD Name.anonymous ++ shortDeclName
    let env ← getEnv
    if env.header.isModule && !env.isExporting then
      declName := mkPrivateName env declName
    if decls.any fun decl => decl.declName == declName then
      withRef declId do
        throwError "`{.ofConstName declName}` has already been declared"
    let binders := decl[1]
    let binderStx := decl[1]
    checkNotAlreadyDeclared declName
    applyAttributesAt declName attrs AttributeApplicationTime.beforeElaboration
    addDocString' declName binders docStr?
    -- Docstring processing is deferred until the declaration is added to the environment.
    -- This is necessary for Verso docstrings to work correctly, as they may reference the
    -- declaration being defined.
    addDeclarationRangesFromSyntax declName decl declId
    let binders := binders.getArgs
    let binders := binderStx.getArgs
    let typeStx := expandOptType declId decl[2]
    let (type, binderIds) ← elabBindersEx binders fun xs => do
      let type ← elabType typeStx
@@ -70,7 +78,7 @@ private def mkLetRecDeclView (letRec : Syntax) : TermElabM LetRecView := do
    let termination ← elabTerminationHints ⟨attrDeclStx[3]⟩
    decls := decls.push {
      ref := declId, attrs, shortDeclName, declName, parentName?,
      binderIds, type, mvar, valStx, termination
      binderIds, binders := binderStx, type, mvar, valStx, termination, docString? := docStr?
    }
  else
    throwUnsupportedSyntax
@@ -111,15 +119,12 @@ private def registerLetRecsToLift (views : Array LetRecDeclView) (fvars : Array
  let toLift ← views.mapIdxM fun i view => do
    let value := values[i]!
    let termination := view.termination.rememberExtraParams view.binderIds.size value
    let env ← getEnv
    pure {
      ref := view.ref
      fvarId := fvars[i]!.fvarId!
      attrs := view.attrs
      shortDeclName := view.shortDeclName
      declName :=
        if env.isExporting || !env.header.isModule then view.declName
        else mkPrivateName env view.declName
      declName := view.declName
      parentName? := view.parentName?
      lctx
      localInstances
@@ -127,6 +132,8 @@ private def registerLetRecsToLift (views : Array LetRecDeclView) (fvars : Array
      val := value
      mvarId := view.mvar.mvarId!
      termination
      binders := view.binders
      docString? := view.docString?
    }
  modify fun s => { s with letRecsToLift := toLift.toList ++ s.letRecsToLift }

@@ -1092,8 +1092,8 @@ def pushLetRecs (preDefs : Array PreDefinition) (letRecClosures : List LetRecClo
    ref := c.ref
    declName := c.toLift.declName
    levelParams := [] -- we set it later
    binders := mkNullNode -- No docstrings, so we don't need these
    modifiers := { modifiers with attrs := c.toLift.attrs }
    binders := c.toLift.binders
    modifiers := { modifiers with attrs := c.toLift.attrs, docString? := c.toLift.docString? }
    kind, type, value,
    termination := c.toLift.termination
  }
@@ -29,6 +29,10 @@ def addPreDefsFromUnary (docCtx : LocalContext × LocalInstances) (preDefs : Arr
  let preDefNonRec := unaryPreDefNonRec.filterAttrs fun attr => attr.name != `implemented_by
  let declNames := preDefs.toList.map (·.declName)

  preDefs.forM fun preDef =>
    unless preDef.kind.isTheorem do
      markAsRecursive preDef.declName

  -- Do not complain if the user sets @[semireducible], which usually is a noop,
  -- we recognize that below and then do not set @[irreducible]
  withOptions (allowUnsafeReducibility.set · true) do
@@ -53,8 +57,6 @@ def cleanPreDef (preDef : PreDefinition) (cacheProofs := true) : MetaM PreDefini
Assign final attributes to the definitions. Assumes the EqnInfos to be already present.
-/
def addPreDefAttributes (preDefs : Array PreDefinition) : TermElabM Unit := do
  for preDef in preDefs do
    markAsRecursive preDef.declName
  for preDef in preDefs.reverse do
    -- must happen before `generateEagerEqns`
    -- must happen in reverse order so that constants realized as part of the first decl

@@ -140,6 +140,8 @@ def structuralRecursion
  preDefsNonRec.forM fun preDefNonRec => do
    let preDefNonRec ← eraseRecAppSyntax preDefNonRec
    prependError m!"structural recursion failed, produced type incorrect term" do
      unless preDefNonRec.kind.isTheorem do
        markAsRecursive preDefNonRec.declName
      -- We create the `_unsafe_rec` before we abstract nested proofs.
      -- Reason: the nested proofs may be referring to the _unsafe_rec.
      addNonRec docCtx preDefNonRec (applyAttrAfterCompilation := false) (all := names.toList)
@@ -157,7 +159,6 @@
    -/
    registerEqnsInfo preDef (preDefs.map (·.declName)) recArgPos fixedParamPerms
    addSmartUnfoldingDef docCtx preDef recArgPos
    markAsRecursive preDef.declName
  for preDef in preDefs do
    -- must happen in separate loop so realizations can see eqnInfos of all other preDefs
    enableRealizationsForConst preDef.declName

@@ -16,6 +16,7 @@ open Meta

structure Context extends Tactic.Context where
  ctx : Meta.Grind.Context
  sctx : Meta.Sym.Context
  methods : Grind.Methods
  params : Grind.Params

@@ -289,7 +290,7 @@ open Grind
def liftGrindM (k : GrindM α) : GrindTacticM α := do
  let ctx ← read
  let s ← get
  let ((a, grindState), symState) ← liftMetaM <| StateRefT'.run ((Grind.withGTransparency k) ctx.methods.toMethodsRef ctx.ctx |>.run s.grindState) s.symState
  let ((a, grindState), symState) ← liftMetaM <| StateRefT'.run (((Grind.withGTransparency k) ctx.methods.toMethodsRef ctx.ctx |>.run s.grindState) ctx.sctx) s.symState
  modify fun s => { s with grindState, symState }
  return a

@@ -358,12 +359,13 @@ def mkEvalTactic' (elaborator : Name) (params : Params) : TermElabM (Goal → TS
  let eval (goal : Goal) (stx : TSyntax `grind) : GrindM (List Goal) := do
    let methods ← getMethods
    let grindCtx ← readThe Meta.Grind.Context
    let symCtx ← readThe Meta.Sym.Context
    let grindState ← get
    let symState ← getThe Sym.State
    -- **Note**: we discard changes to `Term.State`
    let (subgoals, grindState', symState') ← Term.TermElabM.run' (ctx := termCtx) (s := termState) do
      let (_, s) ← GrindTacticM.run
        (ctx := { recover := false, methods, ctx := grindCtx, params, elaborator })
        (ctx := { recover := false, methods, ctx := grindCtx, sctx := symCtx, params, elaborator })
        (s := { grindState, symState, goals := [goal] }) do
          evalGrindTactic stx.raw
          pruneSolvedGoals
@@ -383,7 +385,7 @@ def GrindTacticM.runAtGoal (mvarId : MVarId) (params : Params) (k : GrindTacticM
  Reconsider the option `useSorry`.
  -/
  let params' := { params with config.useSorry := false }
  let (methods, ctx, state) ← liftMetaM <| GrindM.runAtGoal mvarId params' (evalTactic? := some evalTactic) fun goal => do
  let (methods, ctx, sctx, state) ← liftMetaM <| GrindM.runAtGoal mvarId params' (evalTactic? := some evalTactic) fun goal => do
    let a : Action := Action.intros 0 >> Action.assertAll
    let goals ← match (← a.run goal) with
      | .closed _ => pure []
@@ -392,10 +394,11 @@ def GrindTacticM.runAtGoal (mvarId : MVarId) (params : Params) (k : GrindTacticM
    let ctx ← readThe Meta.Grind.Context
    /- Restore original config -/
    let ctx := { ctx with config := params.config }
    let sctx ← readThe Meta.Sym.Context
    let grindState ← get
    let symState ← getThe Sym.State
    return (methods, ctx, { grindState, symState, goals })
    return (methods, ctx, sctx, { grindState, symState, goals })
  let tctx ← read
  k { tctx with methods, ctx, params } |>.run state
  k { tctx with methods, ctx, sctx, params } |>.run state

end Lean.Elab.Tactic.Grind

@@ -167,6 +167,11 @@ structure LetRecToLift where
  val : Expr
  mvarId : MVarId
  termination : TerminationHints
  /-- The binders syntax for the declaration, used for docstring elaboration. -/
  binders : Syntax := .missing
  /-- The docstring, if present, and whether it's Verso. Docstring processing is deferred until the
  declaration is added to the environment (needed for Verso docstrings to work). -/
  docString? : Option (TSyntax ``Lean.Parser.Command.docComment × Bool) := none
  deriving Inhabited

/--
@@ -179,6 +179,13 @@ structure EnvironmentHeader where
  `ModuleIdx` for the same module.
  -/
  modules : Array EffectiveImport := #[]
  /-- For `getModuleIdx?` -/
  private moduleName2Idx : Std.HashMap Name ModuleIdx := Id.run do
    let mut m := {}
    for _h : idx in [0:modules.size] do
      let mod := modules[idx]
      m := m.insert mod.module idx
    return m
  /--
  Subset of `modules` for which `importAll` is `true`. This is assumed to be a much smaller set so
  we precompute it instead of iterating over all of `modules` multiple times. However, note that
@@ -267,7 +274,7 @@ structure Environment where
  -/
  private irBaseExts : Array EnvExtensionState
  /-- The header contains additional information that is set at import time. -/
  header : EnvironmentHeader := {}
  header : EnvironmentHeader := private_decl% {}
  deriving Nonempty

/-- Exceptions that can be raised by the kernel when type checking new declarations. -/
@@ -1174,7 +1181,7 @@ def isSafeDefinition (env : Environment) (declName : Name) : Bool :=
  | _ => false

def getModuleIdx? (env : Environment) (moduleName : Name) : Option ModuleIdx :=
  env.header.modules.findIdx? (·.module == moduleName)
  env.header.moduleName2Idx[moduleName]?

end Environment

@@ -8,6 +8,7 @@ module
prelude
public import Lean.Meta.Match.MatcherInfo
public import Lean.DefEqAttrib
public import Lean.Meta.RecExt
public import Lean.Meta.LetToHave
import Lean.Meta.AppBuilder

@@ -40,26 +41,6 @@ This is implemented by
-/
def eqnAffectingOptions : Array (Lean.Option Bool) := #[backward.eqns.nonrecursive, backward.eqns.deepRecursiveSplit]

/--
Environment extension for storing which declarations are recursive.
This information is populated by the `PreDefinition` module, but the simplifier
uses it when unfolding declarations.
-/
builtin_initialize recExt : TagDeclarationExtension ←
  mkTagDeclarationExtension `recExt (asyncMode := .async .asyncEnv)

/--
Marks the given declaration as recursive.
-/
def markAsRecursive (declName : Name) : CoreM Unit :=
  modifyEnv (recExt.tag · declName)

/--
Returns `true` if `declName` was defined using well-founded recursion, or structural recursion.
-/
def isRecursiveDefinition (declName : Name) : CoreM Bool :=
  return recExt.isTagged (← getEnv) declName

def eqnThmSuffixBase := "eq"
def eqnThmSuffixBasePrefix := eqnThmSuffixBase ++ "_"
def eqn1ThmSuffix := eqnThmSuffixBasePrefix ++ "1"

33
src/Lean/Meta/RecExt.lean
Normal file
@@ -0,0 +1,33 @@
+/-
+Copyright (c) 2021 Microsoft Corporation. All rights reserved.
+Released under Apache 2.0 license as described in the file LICENSE.
+Authors: Leonardo de Moura
+-/
+module
+
+prelude
+public import Lean.Attributes
+
+public section
+
+namespace Lean.Meta
+
+/--
+Environment extension for storing which declarations are recursive.
+This information is populated by the `PreDefinition` module, but the simplifier
+uses it when unfolding declarations.
+-/
+builtin_initialize recExt : TagDeclarationExtension ←
+  mkTagDeclarationExtension `recExt (asyncMode := .async .asyncEnv)
+
+/--
+Marks the given declaration as recursive.
+-/
+def markAsRecursive (declName : Name) : CoreM Unit :=
+  modifyEnv (recExt.tag · declName)
+
+/--
+Returns `true` if `declName` was defined using well-founded recursion or structural recursion.
+-/
+def isRecursiveDefinition (declName : Name) : CoreM Bool :=
+  return recExt.isTagged (← getEnv) declName
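The relocated `recExt` API is a plain tag extension. As a hedged usage sketch (the declaration name `MyLib.myFun` and the `demo` command are hypothetical, not part of this commit):

```lean
open Lean Lean.Meta in
def demo : CoreM Unit := do
  -- Record that a declaration was compiled via structural/well-founded recursion …
  markAsRecursive `MyLib.myFun
  -- … and query the tag later, e.g. before deciding whether to unfold it.
  if (← isRecursiveDefinition `MyLib.myFun) then
    logInfo "MyLib.myFun is recursive"
```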
@@ -23,13 +23,14 @@ public import Lean.Meta.Sym.Apply
 public import Lean.Meta.Sym.InferType
 public import Lean.Meta.Sym.Simp
 public import Lean.Meta.Sym.Util
+public import Lean.Meta.Sym.Grind
 
 /-!
-# Symbolic simulation support.
+# Symbolic computation support.
 
-This module provides `SymM`, a monad for implementing symbolic simulators (e.g., verification condition generators)
-using Lean. The monad addresses performance issues found in symbolic simulators built on top of user-facing
-tactics (e.g., `apply` and `intros`).
+This module provides `SymM`, a monad for implementing symbolic computation (e.g., decision procedures and
+verification condition generators) using Lean. The monad addresses performance issues found in symbolic
+computation engines built on top of user-facing tactics (e.g., `apply` and `intros`).
 
 ## Overview
 
@@ -65,14 +66,14 @@ whether `maxFVar[e]` is in `?m.lctx` — a single hash lookup, O(1).
 
 **The problem:** The `isDefEq` predicate in `MetaM` is designed for elaboration and user-facing tactics.
 It supports reduction, type-class resolution, and many other features that can be expensive or have
-unpredictable running time. For symbolic simulation, where pattern matching is called frequently on
+unpredictable running time. For symbolic computation, where pattern matching is called frequently on
 large ground terms, these features become performance bottlenecks.
 
 **The solution:** In `SymM`, pattern matching and definitional equality are restricted to a more syntactic,
 predictable subset. Key design choices:
 
 1. **Reducible declarations are abbreviations.** Reducible declarations are eagerly expanded when indexing
-   terms and when entering symbolic simulation mode. During matching, we assume abbreviations have already
+   terms and when entering symbolic computation mode. During matching, we assume abbreviations have already
    been expanded.
 
 **Why `MetaM` `simp` cannot make this assumption**: The simplifier in `MetaM` is designed for interactive use,
@@ -99,7 +100,7 @@ predictable subset. Key design choices:
 4. **Types must be indexed.** Unlike proofs and instances, types cannot be ignored: without indexing them,
    pattern matching produces too many candidates. Like other abbreviations, type abbreviations are expanded.
    Note that given `def Foo : Type := Bla`, the terms `Foo` and `Bla` are *not* considered structurally
-   equal in the symbolic simulator framework.
+   equal in the symbolic computation framework.
 
 ### Skipping type checks on assignment
 
@@ -117,7 +118,7 @@ so the check is almost always skipped.
 
 ### `GrindM` state
 
-**The problem:** In symbolic simulation, we often want to discharge many goals using proof automation such
+**The problem:** In symbolic computation, we often want to discharge many goals using proof automation such
 as `grind`. Many of these goals share very similar local contexts. If we invoke `grind` on each goal
 independently, we repeatedly reprocess the same hypotheses.
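Design choices 1 and 4 above draw a line between reducible and non-reducible definitions; a small illustrative Lean sketch (the declarations are hypothetical, not from this commit):

```lean
-- Reducible: expanded eagerly, so Sym patterns only ever see `Nat → Nat`.
abbrev MyFun := Nat → Nat

-- Non-reducible: NOT unfolded, so `Foo` and `Nat` remain structurally distinct
-- during Sym pattern matching, exactly as described for `def Foo : Type := Bla`.
def Foo : Type := Nat
```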
@@ -44,8 +44,11 @@ first because solving it often solves `?w`.
 def mkResultPos (pattern : Pattern) : List Nat := Id.run do
   let auxPrefix := `_sym_pre
   -- Initialize "found" mask with arguments that can be synthesized by type class resolution.
-  let mut found := pattern.isInstance
+  let numArgs := pattern.varTypes.size
+  let mut found := if let some varInfos := pattern.varInfos? then
+    varInfos.argsInfo.map fun info : ProofInstArgInfo => info.isInstance
+  else
+    Array.replicate numArgs false
   let auxVars := pattern.varTypes.mapIdx fun i _ => mkFVar ⟨.num auxPrefix i⟩
   -- Collect arguments that occur in the pattern
   for fvarId in collectFVars {} (pattern.pattern.instantiateRev auxVars) |>.fvarIds do
@@ -96,6 +99,10 @@ def mkValue (expr : Expr) (pattern : Pattern) (result : MatchUnifyResult) : Expr
   else
     mkAppN (expr.instantiateLevelParams pattern.levelParams result.us) result.args
 
+public inductive ApplyResult where
+  | failed
+  | goals (mvarIds : List MVarId)
+
 /--
 Applies a backward rule to a goal, returning new subgoals.
 
@@ -103,27 +110,23 @@ Applies a backward rule to a goal, returning new subgoals.
 2. Assigns the goal metavariable to the theorem application
 3. Returns new goals for unassigned arguments (per `resultPos`)
 
-Returns `none` if unification fails.
+Returns `.failed` if unification fails.
 -/
-public def BackwardRule.apply? (mvarId : MVarId) (rule : BackwardRule) : SymM (Option (List MVarId)) := mvarId.withContext do
+public def BackwardRule.apply (mvarId : MVarId) (rule : BackwardRule) : SymM ApplyResult := mvarId.withContext do
   let decl ← mvarId.getDecl
   if let some result ← rule.pattern.unify? decl.type then
     mvarId.assign (mkValue rule.expr rule.pattern result)
-    return some <| rule.resultPos.map fun i =>
+    return .goals <| rule.resultPos.map fun i =>
       result.args[i]!.mvarId!
   else
-    return none
+    return .failed
 
 /--
-Similar to `BackwardRule.apply?`, but throws an error if unification fails.
+Similar to `BackwardRule.apply`, but throws an error if unification fails.
 -/
-public def BackwardRule.apply (mvarId : MVarId) (rule : BackwardRule) : SymM (List MVarId) := mvarId.withContext do
-  let decl ← mvarId.getDecl
-  if let some result ← rule.pattern.unify? decl.type then
-    mvarId.assign (mkValue rule.expr rule.pattern result)
-    return rule.resultPos.map fun i =>
-      result.args[i]!.mvarId!
-  else
-    throwError "rule is not applicable to goal{mvarId}rule:{indentExpr rule.expr}"
+public def BackwardRule.apply' (mvarId : MVarId) (rule : BackwardRule) : SymM (List MVarId) := do
+  let .goals mvarIds ← rule.apply mvarId
+    | throwError "rule is not applicable to goal{mvarId}rule:{indentExpr rule.expr}"
+  return mvarIds
 
 end Lean.Meta.Sym
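The `Option`-returning `apply?` is gone; callers now branch on `ApplyResult`. A hedged sketch of the new calling convention (the `step` wrapper is hypothetical):

```lean
open Lean Lean.Meta.Sym in
def step (rule : BackwardRule) (goal : MVarId) : SymM (List MVarId) := do
  match (← rule.apply goal) with
  | .failed => return [goal]            -- rule did not unify; keep the goal as-is
  | .goals subgoals => return subgoals  -- continue with the generated subgoals
```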
src/Lean/Meta/Sym/Grind.lean (new file, 129 lines)
@@ -0,0 +1,129 @@
+/-
+Copyright (c) 2026 Amazon.com, Inc. or its affiliates. All Rights Reserved.
+Released under Apache 2.0 license as described in the file LICENSE.
+Authors: Leonardo de Moura
+-/
+module
+prelude
+public import Lean.Meta.Tactic.Grind.Types
+public import Lean.Meta.Sym.Simp.SimpM
+public import Lean.Meta.Sym.Apply
+import Lean.Meta.Tactic.Grind.Main
+import Lean.Meta.Sym.Simp.Goal
+import Lean.Meta.Sym.Intro
+import Lean.Meta.Sym.Util
+import Lean.Meta.Tactic.Grind.Solve
+import Lean.Meta.Tactic.Assumption
+namespace Lean.Meta.Grind
+
+/-!
+# Grind Goal API for Symbolic Simulation
+
+This module provides an API for building symbolic simulation engines and
+verification condition generators on top of `grind`. It wraps `Sym` operations
+to work with `grind`'s `Goal` type, enabling users to carry `grind` state
+through symbolic execution while using lightweight `Sym` operations for
+the main loop.
+
+## Typical usage pattern
+```
+let goal ← mkGoal mvarId
+let .goal xs goal ← goal.introN 2 | failure
+let .goal goal ← goal.simp methods | failure
+let goal ← goal.internalizeAll
+-- ... symbolic execution loop using goal.apply ...
+let .closed ← goal.grind | failure
+```
+
+## Design
+
+Operations like `introN`, `apply`, and `simp` run in `SymM` for performance.
+`internalize` and `grind` run in `GrindM` to access the E-graph.
+-/
+
+
+/--
+Creates a `Goal` from an `MVarId`, applying `Sym` preprocessing.
+Preprocessing ensures the goal is compatible with `Sym` operations.
+-/
+public def mkGoal (mvarId : MVarId) : GrindM Goal := do
+  let mvarId ← Sym.preprocessMVar mvarId
+  mkGoalCore mvarId
+
+open Sym (SymM)
+
+public inductive IntrosResult where
+  | failed
+  | goal (newDecls : Array FVarId) (goal : Goal)
+
+/-- Introduces `num` binders from the goal's target. -/
+public def Goal.introN (goal : Goal) (num : Nat) : SymM IntrosResult := do
+  let .goal xs mvarId ← Sym.introN goal.mvarId num | return .failed
+  return .goal xs { goal with mvarId }
+
+/-- Introduces binders with the specified names. -/
+public def Goal.intros (goal : Goal) (names : Array Name) : SymM IntrosResult := do
+  let .goal xs mvarId ← Sym.intros goal.mvarId names | return .failed
+  return .goal xs { goal with mvarId }
+
+public inductive ApplyResult where
+  | failed
+  | goals (subgoals : List Goal)
+
+/-- Applies a backward rule, returning subgoals on success. -/
+public def Goal.apply (goal : Goal) (rule : Sym.BackwardRule) : SymM ApplyResult := do
+  let .goals mvarIds ← rule.apply goal.mvarId | return .failed
+  return .goals <| mvarIds.map fun mvarId => { goal with mvarId }
+
+public inductive SimpGoalResult where
+  | noProgress
+  | closed
+  | goal (goal : Goal)
+
+/-- Simplifies the goal using the given methods. -/
+public def Goal.simp (goal : Goal) (methods : Sym.Simp.Methods := {}) (config : Sym.Simp.Config := {}) : SymM SimpGoalResult := do
+  match (← Sym.simpGoal goal.mvarId methods config) with
+  | .goal mvarId => return .goal { goal with mvarId }
+  | .noProgress => return .noProgress
+  | .closed => return .closed
+
+/-- Like `simp`, but returns the original goal unchanged when no progress is made. -/
+public def Goal.simpIgnoringNoProgress (goal : Goal) (methods : Sym.Simp.Methods := {}) (config : Sym.Simp.Config := {}) : SymM SimpGoalResult := do
+  match (← Sym.simpGoal goal.mvarId methods config) with
+  | .goal mvarId => return .goal { goal with mvarId }
+  | .noProgress => return .goal goal
+  | .closed => return .closed
+
+/--
+Internalizes the next `num` hypotheses from the local context into the `grind` state (e.g., its E-graph).
+-/
+public def Goal.internalize (goal : Goal) (num : Nat) : GrindM Goal := do
+  Grind.processHypotheses goal (some num)
+
+/-- Internalizes all (un-internalized) hypotheses from the local context into the `grind` state. -/
+public def Goal.internalizeAll (goal : Goal) : GrindM Goal := do
+  Grind.processHypotheses goal none
+
+public inductive GrindResult where
+  | failed (goal : Goal)
+  | closed
+
+/--
+Attempts to close the goal using `grind`.
+Returns `.closed` on success, or `.failed` with the first subgoal that failed to be closed.
+-/
+public def Goal.grind (goal : Goal) : GrindM GrindResult := do
+  if let some failure ← solve goal then
+    return .failed failure
+  else
+    return .closed
+
+/--
+Closes the goal if its target matches a hypothesis.
+Returns `true` on success.
+-/
+public def Goal.assumption (goal : Goal) : MetaM Bool := do
+  -- **TODO**: add indexing
+  goal.mvarId.assumptionCore
+
+end Lean.Meta.Grind
@@ -96,48 +96,39 @@ def introCore (mvarId : MVarId) (max : Nat) (names : Array Name) : SymM (Array F
 
 def hugeNat := 1000000
 
+public inductive IntrosResult where
+  | failed
+  | goal (newDecls : Array FVarId) (mvarId : MVarId)
+
 /--
 Introduces leading binders (universal quantifiers and let-expressions) from the goal's target type.
 
 If `names` is non-empty, introduces (at most) `names.size` binders using the provided names.
 If `names` is empty, introduces all leading binders using inaccessible names.
 
-Returns the introduced free variable Ids and the updated goal.
-
-Throws an error if the target type does not have a leading binder.
+Returns `.goal newDecls mvarId` with the newly introduced free variable IDs and the updated goal.
+Returns `.failed` if no new declaration was introduced.
 -/
-public def intros (mvarId : MVarId) (names : Array Name := #[]) : SymM (Array FVarId × MVarId) := do
+public def intros (mvarId : MVarId) (names : Array Name := #[]) : SymM IntrosResult := do
   let result ← if names.isEmpty then
     introCore mvarId hugeNat #[]
   else
    introCore mvarId names.size names
   if result.1.isEmpty then
-    throwError "`intros` failed, binder expected"
-  return result
-
-/--
-Introduces a single binder from the goal's target type with the given name.
-
-Returns the introduced free variable ID and the updated goal.
-Throws an error if the target type does not have a leading binder.
--/
-public def intro (mvarId : MVarId) (name : Name) : SymM (FVarId × MVarId) := do
-  let (fvarIds, goal') ← introCore mvarId 1 #[name]
-  if h : 0 < fvarIds.size then
-    return (fvarIds[0], goal')
-  else
-    throwError "`intro` failed, binder expected"
+    return .failed
+  return .goal result.1 result.2
 
 /--
 Introduces exactly `num` binders from the goal's target type.
 
-Returns the introduced free variable IDs and the updated goal.
-Throws an error if the target type has fewer than `num` leading binders.
+Returns `.goal newDecls mvarId` if successful, where `newDecls` are the introduced free variable IDs and
+`mvarId` is the updated goal.
+Returns `.failed` if it was not possible to introduce `num` new local declarations.
 -/
-public def introN (mvarId : MVarId) (num : Nat) : SymM (Array FVarId × MVarId) := do
+public def introN (mvarId : MVarId) (num : Nat) : SymM IntrosResult := do
   let result ← introCore mvarId num #[]
   unless result.1.size == num do
-    throwError "`introN` failed, insufficient number of binders"
-  return result
+    return .failed
+  return .goal result.1 result.2
 
 end Lean.Meta.Sym
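With `IntrosResult`, callers branch on a constructor instead of catching exceptions. A hedged sketch (the `introsOrId` wrapper is hypothetical):

```lean
open Lean Lean.Meta.Sym in
def introsOrId (mvarId : MVarId) : SymM (Array FVarId × MVarId) := do
  match (← intros mvarId) with
  | .failed => return (#[], mvarId)          -- target had no leading binder
  | .goal xs mvarId' => return (xs, mvarId')
```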
@@ -73,4 +73,14 @@ def getFinValue? (e : Expr) : OptionT Id FinValue := do
   let : NeZero n := ⟨h⟩
   return { n, val := Fin.ofNat n v }
 
+def getCharValue? (e : Expr) : OptionT Id Char := do
+  let_expr Char.ofNat n := e | failure
+  let .lit (.natVal n) := n | failure
+  return Char.ofNat n
+
+def getStringValue? (e : Expr) : Option String :=
+  match e with
+  | .lit (.strVal s) => some s
+  | _ => none
+
 end Lean.Meta.Sym
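The new extractors mirror the existing `get*Value?` helpers. An illustrative sketch of how a ground-evaluation step might combine them (the `evalStrAppend?` helper is hypothetical):

```lean
def evalStrAppend? (a b : Lean.Expr) : Option String := do
  let sa ← Lean.Meta.Sym.getStringValue? a
  let sb ← Lean.Meta.Sym.getStringValue? b
  return sa ++ sb
```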
@@ -64,7 +64,11 @@ def mkProofInstInfoMapFor (pattern : Expr) : MetaM (AssocList Name ProofInstInfo
 public structure Pattern where
   levelParams : List Name
   varTypes : Array Expr
-  isInstance : Array Bool
+  /--
+  If `some argsInfo`, `argsInfo` stores whether the pattern variables are instances/proofs.
+  It is `none` if no pattern variable is an instance/proof.
+  -/
+  varInfos? : Option ProofInstInfo
   pattern : Expr
   fnInfos : AssocList Name ProofInstInfo
   /--
@@ -78,6 +82,16 @@ public structure Pattern where
 
 def uvarPrefix : Name := `_uvar
 
+/-- Returns `true` if the `i`th argument / pattern variable is an instance. -/
+def Pattern.isInstance (p : Pattern) (i : Nat) : Bool := Id.run do
+  let some varInfos := p.varInfos? | return false
+  varInfos.argsInfo[i]!.isInstance
+
+/-- Returns `true` if the `i`th argument / pattern variable is a proof. -/
+def Pattern.isProof (p : Pattern) (i : Nat) : Bool := Id.run do
+  let some varInfos := p.varInfos? | return false
+  varInfos.argsInfo[i]!.isProof
+
 def isUVar? (n : Name) : Option Nat := Id.run do
   let .num p idx := n | return none
   unless p == uvarPrefix do return none
 
@@ -144,12 +158,13 @@ where
   else
     mask
 
-def mkPatternCore (levelParams : List Name) (varTypes : Array Expr) (isInstance : Array Bool)
-    (pattern : Expr) : MetaM Pattern := do
+def mkPatternCore (type : Expr) (levelParams : List Name) (varTypes : Array Expr) (pattern : Expr) : MetaM Pattern := do
   let fnInfos ← mkProofInstInfoMapFor pattern
   let checkTypeMask := mkCheckTypeMask pattern varTypes.size
   let checkTypeMask? := if checkTypeMask.all (· == false) then none else some checkTypeMask
-  return { levelParams, varTypes, isInstance, pattern, fnInfos, checkTypeMask? }
+  let varInfos? ← forallBoundedTelescope type varTypes.size fun xs _ =>
+    mkProofInstArgInfo? xs
+  return { levelParams, varTypes, pattern, fnInfos, varInfos?, checkTypeMask? }
 
 /--
 Creates a `Pattern` from the type of a theorem.
 
@@ -168,12 +183,12 @@ public def mkPatternFromDecl (declName : Name) (num? : Option Nat := none) : Met
   let (levelParams, type) ← preprocessPattern declName
   let hugeNumber := 10000000
   let num := num?.getD hugeNumber
-  let rec go (i : Nat) (type : Expr) (varTypes : Array Expr) (isInstance : Array Bool) : MetaM Pattern := do
+  let rec go (i : Nat) (pattern : Expr) (varTypes : Array Expr) : MetaM Pattern := do
     if i < num then
-      if let .forallE _ d b _ := type then
-        return (← go (i+1) b (varTypes.push d) (isInstance.push (isClass? (← getEnv) d).isSome))
-    mkPatternCore levelParams varTypes isInstance type
-  go 0 type #[] #[]
+      if let .forallE _ d b _ := pattern then
+        return (← go (i+1) b (varTypes.push d))
+    mkPatternCore type levelParams varTypes pattern
+  go 0 type #[]
 
 /--
 Creates a `Pattern` from an equational theorem, using the left-hand side of the equation.
 
@@ -188,14 +203,14 @@ Throws an error if the theorem's conclusion is not an equality.
 -/
 public def mkEqPatternFromDecl (declName : Name) : MetaM (Pattern × Expr) := do
   let (levelParams, type) ← preprocessPattern declName
-  let rec go (type : Expr) (varTypes : Array Expr) (isInstance : Array Bool) : MetaM (Pattern × Expr) := do
-    if let .forallE _ d b _ := type then
-      return (← go b (varTypes.push d) (isInstance.push (isClass? (← getEnv) d).isSome))
+  let rec go (pattern : Expr) (varTypes : Array Expr) : MetaM (Pattern × Expr) := do
+    if let .forallE _ d b _ := pattern then
+      return (← go b (varTypes.push d))
     else
-      let_expr Eq _ lhs rhs := type | throwError "resulting type for `{.ofConstName declName}` is not an equality"
-      let pattern ← mkPatternCore levelParams varTypes isInstance lhs
+      let_expr Eq _ lhs rhs := pattern | throwError "resulting type for `{.ofConstName declName}` is not an equality"
+      let pattern ← mkPatternCore type levelParams varTypes lhs
      return (pattern, rhs)
-  go type #[] #[]
+  go type #[]
 
 structure UnifyM.Context where
   pattern : Pattern
 
@@ -799,7 +814,6 @@ def mkPreResult : UnifyM MkPreResultResult := do
     | none => mkFreshLevelMVar
   let pattern := (← read).pattern
   let varTypes := pattern.varTypes
-  let isInstance := pattern.isInstance
   let eAssignment := (← get).eAssignment
   let tPending := (← get).tPending
   let mut args := #[]
@@ -820,7 +834,7 @@ def mkPreResult : UnifyM MkPreResultResult := do
     let type := varTypes[i]!
     let type ← instantiateLevelParamsS type pattern.levelParams us
     let type ← instantiateRevBetaS type args
-    if isInstance[i]! then
+    if pattern.isInstance i then
      if let .some val ← trySynthInstance type then
        args := args.push (← shareCommon val)
        continue
@@ -7,17 +7,38 @@ module
 prelude
 public import Lean.Meta.Sym.SymM
+import Lean.Meta.Sym.IsClass
 import Lean.Meta.Tactic.Grind.Util
 import Lean.Meta.Sym.Util
+import Lean.Meta.Transform
 namespace Lean.Meta.Sym
 
 /--
 Preprocesses types that are used for pattern matching and unification.
 -/
 public def preprocessType (type : Expr) : MetaM Expr := do
-  let type ← Grind.unfoldReducible type
+  let type ← Sym.unfoldReducible type
   let type ← Core.betaReduce type
   zetaReduce type
 
+/--
+Analyzes whether the given free variables (aka arguments) are proofs or instances.
+Returns `none` if no arguments are proofs or instances.
+-/
+public def mkProofInstArgInfo? (xs : Array Expr) : MetaM (Option ProofInstInfo) := do
+  let env ← getEnv
+  let mut argsInfo := #[]
+  let mut found := false
+  for x in xs do
+    let type ← Meta.inferType x
+    let isInstance := isClass? env type |>.isSome
+    let isProof ← isProp type
+    if isInstance || isProof then
+      found := true
+    argsInfo := argsInfo.push { isInstance, isProof }
+  if found then
+    return some { argsInfo }
+  else
+    return none
 
 /--
 Analyzes the type signature of `declName` and returns information about which arguments
 are proofs or instances. Returns `none` if no arguments are proofs or instances.
@@ -25,21 +46,7 @@ are proofs or instances. Returns `none` if no arguments are proofs or instances.
 public def mkProofInstInfo? (declName : Name) : MetaM (Option ProofInstInfo) := do
   let info ← getConstInfo declName
   let type ← preprocessType info.type
-  forallTelescopeReducing type fun xs _ => do
-    let env ← getEnv
-    let mut argsInfo := #[]
-    let mut found := false
-    for x in xs do
-      let type ← Meta.inferType x
-      let isInstance := isClass? env type |>.isSome
-      let isProof ← isProp type
-      if isInstance || isProof then
-        found := true
-      argsInfo := argsInfo.push { isInstance, isProof }
-    if found then
-      return some { argsInfo }
-    else
-      return none
+  forallTelescopeReducing type fun xs _ => mkProofInstArgInfo? xs
 
 /--
 Returns information about the type signature of `declName`. It contains information about which arguments
@@ -21,3 +21,4 @@ public import Lean.Meta.Sym.Simp.Debug
 public import Lean.Meta.Sym.Simp.EvalGround
 public import Lean.Meta.Sym.Simp.Discharger
 public import Lean.Meta.Sym.Simp.ControlFlow
+public import Lean.Meta.Sym.Simp.Goal
@@ -224,7 +224,7 @@ position. However, the type is only meaningful (non-`default`) when `Result` is
 `.step`, since we only need types for constructing congruence proofs. This avoids
 unnecessary type inference when no rewriting occurs.
 -/
-def simpFixedPrefix (e : Expr) (prefixSize : Nat) (suffixSize : Nat) : SimpM Result := do
+public def simpFixedPrefix (e : Expr) (prefixSize : Nat) (suffixSize : Nat) : SimpM Result := do
   let numArgs := e.getAppNumArgs
   if numArgs ≤ prefixSize then
     -- Nothing to be done
@@ -274,7 +274,7 @@ Uses `rewritable[i]` to determine whether argument `i` should be simplified.
 For rewritable arguments, calls `simp` and uses `congrFun'`, `congrArg`, and `congr`; for fixed arguments,
 uses `congrFun` to propagate changes from earlier arguments.
 -/
-def simpInterlaced (e : Expr) (rewritable : Array Bool) : SimpM Result := do
+public def simpInterlaced (e : Expr) (rewritable : Array Bool) : SimpM Result := do
   let numArgs := e.getAppNumArgs
   if h : numArgs = 0 then
     -- Nothing to be done
@@ -27,16 +27,16 @@ def simpIte : Simproc := fun e => do
   let_expr f@ite α c _ a b := e | return .rfl
   match (← simp c) with
   | .rfl _ =>
-    if c.isTrue then
+    if isSameExpr c (← getTrueExpr) then
       return .step a <| mkApp3 (mkConst ``ite_true f.constLevels!) α a b
-    else if c.isFalse then
+    else if isSameExpr c (← getFalseExpr) then
       return .step b <| mkApp3 (mkConst ``ite_false f.constLevels!) α a b
     else
       return .rfl (done := true)
   | .step c' h _ =>
-    if c'.isTrue then
+    if isSameExpr c' (← getTrueExpr) then
       return .step a <| mkApp (e.replaceFn ``ite_cond_eq_true) h
-    else if c'.isFalse then
+    else if isSameExpr c' (← getFalseExpr) then
       return .step b <| mkApp (e.replaceFn ``ite_cond_eq_false) h
     else
       let .some inst' ← trySynthInstance (mkApp (mkConst ``Decidable) c') | return .rfl
@@ -56,20 +56,20 @@ def simpDIte : Simproc := fun e => do
   let_expr f@dite α c _ a b := e | return .rfl
   match (← simp c) with
   | .rfl _ =>
-    if c.isTrue then
+    if isSameExpr c (← getTrueExpr) then
       let a' ← share <| a.betaRev #[mkConst ``True.intro]
       return .step a' <| mkApp3 (mkConst ``dite_true f.constLevels!) α a b
-    else if c.isFalse then
+    else if isSameExpr c (← getFalseExpr) then
       let b' ← share <| b.betaRev #[mkConst ``not_false]
       return .step b' <| mkApp3 (mkConst ``dite_false f.constLevels!) α a b
     else
       return .rfl (done := true)
   | .step c' h _ =>
-    if c'.isTrue then
+    if isSameExpr c' (← getTrueExpr) then
       let h' ← shareCommon <| mkOfEqTrueCore c h
       let a ← share <| a.betaRev #[h']
       return .step a <| mkApp (e.replaceFn ``dite_cond_eq_true) h
-    else if c'.isFalse then
+    else if isSameExpr c' (← getFalseExpr) then
       let h' ← shareCommon <| mkOfEqFalseCore c h
       let b ← share <| b.betaRev #[h']
       return .step b <| mkApp (e.replaceFn ``dite_cond_eq_false) h
@@ -94,16 +94,16 @@ def simpCond : Simproc := fun e => do
   let_expr f@cond α c a b := e | return .rfl
   match (← simp c) with
   | .rfl _ =>
-    if c.isConstOf ``true then
+    if isSameExpr c (← getBoolTrueExpr) then
       return .step a <| mkApp3 (mkConst ``cond_true f.constLevels!) α a b
-    else if c.isConstOf ``false then
+    else if isSameExpr c (← getBoolFalseExpr) then
       return .step b <| mkApp3 (mkConst ``cond_false f.constLevels!) α a b
     else
       return .rfl (done := true)
   | .step c' h _ =>
-    if c'.isConstOf ``true then
+    if isSameExpr c' (← getBoolTrueExpr) then
       return .step a <| mkApp (e.replaceFn ``Sym.cond_cond_eq_true) h
-    else if c'.isConstOf ``false then
+    else if isSameExpr c' (← getBoolFalseExpr) then
       return .step b <| mkApp (e.replaceFn ``Sym.cond_cond_eq_false) h
     else
       let e' := e.getBoundedAppFn 3
@@ -9,6 +9,7 @@ public import Lean.Meta.Sym.Simp.SimpM
 public import Lean.Meta.Sym.Simp.Discharger
 import Lean.Meta.Sym.Simp.Theorems
 import Lean.Meta.Sym.Simp.Rewrite
+import Lean.Meta.Sym.Simp.Goal
 import Lean.Meta.Sym.Util
 import Lean.Meta.Tactic.Util
 import Lean.Meta.AppBuilder
@@ -27,24 +28,9 @@ public def mkSimprocFor (declNames : Array Name) (d : Discharger := dischargeNon
 public def mkMethods (declNames : Array Name) : MetaM Methods := do
   return { post := (← mkSimprocFor declNames) }
 
-public def simpWith (k : Expr → SymM Result) (mvarId : MVarId) : MetaM (Option MVarId) := SymM.run do
-  let mvarId ← preprocessMVar mvarId
-  let decl ← mvarId.getDecl
-  let target := decl.type
-  match (← k target) with
-  | .rfl _ => throwError "`Sym.simp` made no progress"
-  | .step target' h _ =>
-    let mvarNew ← mkFreshExprSyntheticOpaqueMVar target' decl.userName
-    let h ← mkAppM ``Eq.mpr #[h, mvarNew]
-    mvarId.assign h
-    if target'.isTrue then
-      mvarNew.mvarId!.assign (mkConst ``True.intro)
-      return none
-    else
-      return some mvarNew.mvarId!
-
-public def simpGoal (declNames : Array Name) (mvarId : MVarId) : MetaM (Option MVarId) := SymM.run do mvarId.withContext do
+public def simpGoalUsing (declNames : Array Name) (mvarId : MVarId) : MetaM (Option MVarId) := SymM.run do
   let methods ← mkMethods declNames
-  simpWith (simp · methods) mvarId
+  let mvarId ← preprocessMVar mvarId
+  (← simpGoal mvarId methods).toOption
 
 end Lean.Meta.Sym
@@ -9,6 +9,7 @@ public import Lean.Meta.Sym.Simp.SimpM
|
||||
import Init.Sym.Lemmas
|
||||
import Init.Data.Int.Gcd
|
||||
import Lean.Meta.Sym.LitValues
|
||||
import Lean.Meta.Sym.AlphaShareBuilder
|
||||
namespace Lean.Meta.Sym.Simp
|
||||
|
||||
/-!
|
||||
@@ -343,10 +344,10 @@ abbrev evalBinPred (toValue? : Expr → Option α) (trueThm falseThm : Expr) (op
|
||||
let some va := toValue? a | return .rfl
|
||||
let some vb := toValue? b | return .rfl
|
||||
if op va vb then
|
||||
let e ← share <| mkConst ``True
|
||||
let e ← getTrueExpr
|
||||
return .step e (mkApp3 trueThm a b eagerReflBoolTrue) (done := true)
|
||||
else
|
||||
let e ← share <| mkConst ``False
|
||||
let e ← getFalseExpr
|
||||
return .step e (mkApp3 falseThm a b eagerReflBoolFalse) (done := true)
|
||||
|
||||
def evalBitVecPred (n : Expr) (trueThm falseThm : Expr) (op : {n : Nat} → BitVec n → BitVec n → Bool) (a b : Expr) : SimpM Result := do
|
||||
@@ -354,10 +355,10 @@ def evalBitVecPred (n : Expr) (trueThm falseThm : Expr) (op : {n : Nat} → BitV
|
||||
let some vb := getBitVecValue? b | return .rfl
|
||||
if h : va.n = vb.n then
|
||||
if op va.val (h ▸ vb.val) then
|
||||
let e ← share <| mkConst ``True
|
||||
let e ← getTrueExpr
|
||||
return .step e (mkApp4 trueThm n a b eagerReflBoolTrue) (done := true)
|
||||
else
|
||||
let e ← share <| mkConst ``False
|
||||
let e ← getFalseExpr
|
||||
return .step e (mkApp4 falseThm n a b eagerReflBoolFalse) (done := true)
|
||||
else
|
||||
return .rfl
|
||||
@@ -367,10 +368,10 @@ def evalFinPred (n : Expr) (trueThm falseThm : Expr) (op : {n : Nat} → Fin n
|
||||
let some vb := getFinValue? b | return .rfl
|
||||
if h : va.n = vb.n then
|
||||
if op va.val (h ▸ vb.val) then
|
||||
let e ← share <| mkConst ``True
|
||||
let e ← getTrueExpr
|
||||
return .step e (mkApp4 trueThm n a b eagerReflBoolTrue) (done := true)
|
||||
else
|
||||
let e ← share <| mkConst ``False
|
||||
let e ← getFalseExpr
|
||||
return .step e (mkApp4 falseThm n a b eagerReflBoolFalse) (done := true)
|
||||
else
|
||||
return .rfl
|
||||
@@ -392,6 +393,8 @@ def evalLT (α : Expr) (a b : Expr) : SimpM Result :=
   | UInt64 => evalBinPred getUInt64Value? (mkConst ``UInt64.lt_eq_true) (mkConst ``UInt64.lt_eq_false) (. < .) a b
   | Fin n => evalFinPred n (mkConst ``Fin.lt_eq_true) (mkConst ``Fin.lt_eq_false) (. < .) a b
   | BitVec n => evalBitVecPred n (mkConst ``BitVec.lt_eq_true) (mkConst ``BitVec.lt_eq_false) (. < .) a b
+  | String => evalBinPred getStringValue? (mkConst ``String.lt_eq_true) (mkConst ``String.lt_eq_false) (. < .) a b
+  | Char => evalBinPred getCharValue? (mkConst ``Char.lt_eq_true) (mkConst ``Char.lt_eq_false) (. < .) a b
   | _ => return .rfl

 def evalLE (α : Expr) (a b : Expr) : SimpM Result :=
@@ -409,45 +412,13 @@ def evalLE (α : Expr) (a b : Expr) : SimpM Result :=
   | UInt64 => evalBinPred getUInt64Value? (mkConst ``UInt64.le_eq_true) (mkConst ``UInt64.le_eq_false) (. ≤ .) a b
   | Fin n => evalFinPred n (mkConst ``Fin.le_eq_true) (mkConst ``Fin.le_eq_false) (. ≤ .) a b
   | BitVec n => evalBitVecPred n (mkConst ``BitVec.le_eq_true) (mkConst ``BitVec.le_eq_false) (. ≤ .) a b
-  | _ => return .rfl
-
-def evalGT (α : Expr) (a b : Expr) : SimpM Result :=
-  match_expr α with
-  | Nat => evalBinPred getNatValue? (mkConst ``Nat.gt_eq_true) (mkConst ``Nat.gt_eq_false) (. > .) a b
-  | Int => evalBinPred getIntValue? (mkConst ``Int.gt_eq_true) (mkConst ``Int.gt_eq_false) (. > .) a b
-  | Rat => evalBinPred getRatValue? (mkConst ``Rat.gt_eq_true) (mkConst ``Rat.gt_eq_false) (. > .) a b
-  | Int8 => evalBinPred getInt8Value? (mkConst ``Int8.gt_eq_true) (mkConst ``Int8.gt_eq_false) (. > .) a b
-  | Int16 => evalBinPred getInt16Value? (mkConst ``Int16.gt_eq_true) (mkConst ``Int16.gt_eq_false) (. > .) a b
-  | Int32 => evalBinPred getInt32Value? (mkConst ``Int32.gt_eq_true) (mkConst ``Int32.gt_eq_false) (. > .) a b
-  | Int64 => evalBinPred getInt64Value? (mkConst ``Int64.gt_eq_true) (mkConst ``Int64.gt_eq_false) (. > .) a b
-  | UInt8 => evalBinPred getUInt8Value? (mkConst ``UInt8.gt_eq_true) (mkConst ``UInt8.gt_eq_false) (. > .) a b
-  | UInt16 => evalBinPred getUInt16Value? (mkConst ``UInt16.gt_eq_true) (mkConst ``UInt16.gt_eq_false) (. > .) a b
-  | UInt32 => evalBinPred getUInt32Value? (mkConst ``UInt32.gt_eq_true) (mkConst ``UInt32.gt_eq_false) (. > .) a b
-  | UInt64 => evalBinPred getUInt64Value? (mkConst ``UInt64.gt_eq_true) (mkConst ``UInt64.gt_eq_false) (. > .) a b
-  | Fin n => evalFinPred n (mkConst ``Fin.gt_eq_true) (mkConst ``Fin.gt_eq_false) (. > .) a b
-  | BitVec n => evalBitVecPred n (mkConst ``BitVec.gt_eq_true) (mkConst ``BitVec.gt_eq_false) (. > .) a b
-  | _ => return .rfl
-
-def evalGE (α : Expr) (a b : Expr) : SimpM Result :=
-  match_expr α with
-  | Nat => evalBinPred getNatValue? (mkConst ``Nat.ge_eq_true) (mkConst ``Nat.ge_eq_false) (. ≥ .) a b
-  | Int => evalBinPred getIntValue? (mkConst ``Int.ge_eq_true) (mkConst ``Int.ge_eq_false) (. ≥ .) a b
-  | Rat => evalBinPred getRatValue? (mkConst ``Rat.ge_eq_true) (mkConst ``Rat.ge_eq_false) (. ≥ .) a b
-  | Int8 => evalBinPred getInt8Value? (mkConst ``Int8.ge_eq_true) (mkConst ``Int8.ge_eq_false) (. ≥ .) a b
-  | Int16 => evalBinPred getInt16Value? (mkConst ``Int16.ge_eq_true) (mkConst ``Int16.ge_eq_false) (. ≥ .) a b
-  | Int32 => evalBinPred getInt32Value? (mkConst ``Int32.ge_eq_true) (mkConst ``Int32.ge_eq_false) (. ≥ .) a b
-  | Int64 => evalBinPred getInt64Value? (mkConst ``Int64.ge_eq_true) (mkConst ``Int64.ge_eq_false) (. ≥ .) a b
-  | UInt8 => evalBinPred getUInt8Value? (mkConst ``UInt8.ge_eq_true) (mkConst ``UInt8.ge_eq_false) (. ≥ .) a b
-  | UInt16 => evalBinPred getUInt16Value? (mkConst ``UInt16.ge_eq_true) (mkConst ``UInt16.ge_eq_false) (. ≥ .) a b
-  | UInt32 => evalBinPred getUInt32Value? (mkConst ``UInt32.ge_eq_true) (mkConst ``UInt32.ge_eq_false) (. ≥ .) a b
-  | UInt64 => evalBinPred getUInt64Value? (mkConst ``UInt64.ge_eq_true) (mkConst ``UInt64.ge_eq_false) (. ≥ .) a b
-  | Fin n => evalFinPred n (mkConst ``Fin.ge_eq_true) (mkConst ``Fin.ge_eq_false) (. ≥ .) a b
-  | BitVec n => evalBitVecPred n (mkConst ``BitVec.ge_eq_true) (mkConst ``BitVec.ge_eq_false) (. ≥ .) a b
+  | String => evalBinPred getStringValue? (mkConst ``String.le_eq_true) (mkConst ``String.le_eq_false) (. ≤ .) a b
+  | Char => evalBinPred getCharValue? (mkConst ``Char.le_eq_true) (mkConst ``Char.le_eq_false) (. ≤ .) a b
   | _ => return .rfl

 def evalEq (α : Expr) (a b : Expr) : SimpM Result :=
   if isSameExpr a b then do
-    let e ← share <| mkConst ``True
+    let e ← getTrueExpr
     let u ← getLevel α
     return .step e (mkApp2 (mkConst ``eq_self [u]) α a) (done := true)
   else match_expr α with
@@ -464,27 +435,8 @@ def evalEq (α : Expr) (a b : Expr) : SimpM Result :=
   | UInt64 => evalBinPred getUInt64Value? (mkConst ``UInt64.eq_eq_true) (mkConst ``UInt64.eq_eq_false) (. = .) a b
   | Fin n => evalFinPred n (mkConst ``Fin.eq_eq_true) (mkConst ``Fin.eq_eq_false) (. = .) a b
   | BitVec n => evalBitVecPred n (mkConst ``BitVec.eq_eq_true) (mkConst ``BitVec.eq_eq_false) (. = .) a b
-  | _ => return .rfl
-
-def evalNe (α : Expr) (a b : Expr) : SimpM Result :=
-  if isSameExpr a b then do
-    let e ← share <| mkConst ``False
-    let u ← getLevel α
-    return .step e (mkApp2 (mkConst ``ne_self [u]) α a) (done := true)
-  else match_expr α with
-  | Nat => evalBinPred getNatValue? (mkConst ``Nat.ne_eq_true) (mkConst ``Nat.ne_eq_false) (. ≠ .) a b
-  | Int => evalBinPred getIntValue? (mkConst ``Int.ne_eq_true) (mkConst ``Int.ne_eq_false) (. ≠ .) a b
-  | Rat => evalBinPred getRatValue? (mkConst ``Rat.ne_eq_true) (mkConst ``Rat.ne_eq_false) (. ≠ .) a b
-  | Int8 => evalBinPred getInt8Value? (mkConst ``Int8.ne_eq_true) (mkConst ``Int8.ne_eq_false) (. ≠ .) a b
-  | Int16 => evalBinPred getInt16Value? (mkConst ``Int16.ne_eq_true) (mkConst ``Int16.ne_eq_false) (. ≠ .) a b
-  | Int32 => evalBinPred getInt32Value? (mkConst ``Int32.ne_eq_true) (mkConst ``Int32.ne_eq_false) (. ≠ .) a b
-  | Int64 => evalBinPred getInt64Value? (mkConst ``Int64.ne_eq_true) (mkConst ``Int64.ne_eq_false) (. ≠ .) a b
-  | UInt8 => evalBinPred getUInt8Value? (mkConst ``UInt8.ne_eq_true) (mkConst ``UInt8.ne_eq_false) (. ≠ .) a b
-  | UInt16 => evalBinPred getUInt16Value? (mkConst ``UInt16.ne_eq_true) (mkConst ``UInt16.ne_eq_false) (. ≠ .) a b
-  | UInt32 => evalBinPred getUInt32Value? (mkConst ``UInt32.ne_eq_true) (mkConst ``UInt32.ne_eq_false) (. ≠ .) a b
-  | UInt64 => evalBinPred getUInt64Value? (mkConst ``UInt64.ne_eq_true) (mkConst ``UInt64.ne_eq_false) (. ≠ .) a b
-  | Fin n => evalFinPred n (mkConst ``Fin.ne_eq_true) (mkConst ``Fin.ne_eq_false) (. ≠ .) a b
-  | BitVec n => evalBitVecPred n (mkConst ``BitVec.ne_eq_true) (mkConst ``BitVec.ne_eq_false) (. ≠ .) a b
+  | Char => evalBinPred getCharValue? (mkConst ``Char.eq_eq_true) (mkConst ``Char.eq_eq_false) (. = .) a b
+  | String => evalBinPred getStringValue? (mkConst ``String.eq_eq_true) (mkConst ``String.eq_eq_false) (. = .) a b
   | _ => return .rfl

 def evalDvd (α : Expr) (a b : Expr) : SimpM Result :=
@@ -554,6 +506,16 @@ macro "declare_eval_bin_bool_pred" id:ident op:term : command =>
 declare_eval_bin_bool_pred evalBEq (· == ·)
 declare_eval_bin_bool_pred evalBNe (· != ·)

+open Internal in
+def evalNot (a : Expr) : SimpM Result :=
+  /-
+  **Note**: We added `evalNot` because some abbreviations expanded into `Not`s.
+  -/
+  match_expr a with
+  | True => return .step (← getFalseExpr) (mkConst ``Sym.not_true_eq) (done := true)
+  | False => return .step (← getTrueExpr) (mkConst ``Sym.not_false_eq) (done := true)
+  | _ => return .rfl
+
 public structure EvalStepConfig where
   maxExponent := 255

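The two `evalNot` rewrites above correspond to provable propositional equations. A sketch in ordinary Lean (the commit's named lemmas are `Sym.not_true_eq`/`Sym.not_false_eq`; here `simp` is used instead):

```lean
-- Sketch only: the equations that `evalNot` installs as rewrite steps.
example : (¬ True) = False := by simp
example : (¬ False) = True := by simp
```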
@@ -594,14 +556,12 @@ public def evalGround (config : EvalStepConfig := {}) : Simproc := fun e =>
   | Int.fmod a b => evalBinInt Int.fmod a b
   | Int.bmod a b => evalIntBMod a b
   | LE.le α _ a b => evalLE α a b
-  | GE.ge α _ a b => evalGE α a b
   | LT.lt α _ a b => evalLT α a b
-  | GT.gt α _ a b => evalGT α a b
   | Dvd.dvd α _ a b => evalDvd α a b
   | Eq α a b => evalEq α a b
-  | Ne α a b => evalNe α a b
   | BEq.beq α _ a b => evalBEq α a b
   | bne α _ a b => evalBNe α a b
+  | Not a => evalNot a
   | _ => return .rfl

 end Lean.Meta.Sym.Simp

@@ -81,7 +81,7 @@ public def simpForall (e : Expr) : SimpM Result := do
   else if (← isProp e) then
     let n := getForallTelescopeSize e.bindingBody! 1
     forallBoundedTelescope e n fun xs b => withoutModifyingCacheIfNotWellBehaved do
-      main xs b
+      main xs (← shareCommon b)
   else
     return .rfl
 where
@@ -90,7 +90,7 @@ where
   | .rfl _ => return .rfl
   | .step b' h _ =>
     let h ← mkLambdaFVars xs h
-    let e' ← shareCommonInc (← mkForallFVars xs b')
+    let e' ← shareCommon (← mkForallFVars xs b')
    -- **Note**: consider caching the forall-congr theorems
    let hcongr ← mkForallCongrFor xs
    return .step e' (mkApp3 hcongr (← mkLambdaFVars xs b) (← mkLambdaFVars xs b') h)

src/Lean/Meta/Sym/Simp/Goal.lean (new file, 81 lines)
@@ -0,0 +1,81 @@
+/-
+Copyright (c) 2026 Amazon.com, Inc. or its affiliates. All Rights Reserved.
+Released under Apache 2.0 license as described in the file LICENSE.
+Authors: Leonardo de Moura
+-/
+module
+prelude
+public import Lean.Meta.Sym.Simp.SimpM
+import Lean.Meta.Tactic.Util
+import Lean.Meta.AppBuilder
+import Lean.Meta.Sym.InferType
+namespace Lean.Meta.Sym
+/-!
+# Goal simplification
+
+Applies `Sym.simp` to a goal's target type, producing a simplified goal or closing it if
+the result is `True`.
+-/
+
+/-- Result of simplifying a goal with `Sym.simp`. -/
+public inductive SimpGoalResult where
+  /-- No simplification was possible. -/
+  | noProgress
+  /-- The goal was closed (simplified to `True`). -/
+  | closed
+  /-- The goal was simplified to a new goal. -/
+  | goal (mvarId : MVarId)
+
+/--
+Converts a `SimpGoalResult` to an optional goal.
+Returns `none` if closed, `some mvarId` if simplified, or throws an error if no progress.
+-/
+public def SimpGoalResult.toOption : SimpGoalResult → CoreM (Option MVarId)
+  | .noProgress => throwError "`Sym.simp` made no progress"
+  | .closed => return none
+  | .goal mvarId => return some mvarId
+
+public def SimpGoalResult.ignoreNoProgress : SimpGoalResult → MVarId → SimpGoalResult
+  | .noProgress, mvarId => .goal mvarId
+  | r, _ => r
+
+/--
+Converts a `Simp.Result` value into `SimpGoalResult`.
+-/
+public def Simp.Result.toSimpGoalResult (result : Simp.Result) (mvarId : MVarId) : SymM SimpGoalResult := do
+  let decl ← mvarId.getDecl
+  match result with
+  | .rfl _ => return .noProgress
+  | .step target' h _ =>
+    let mvarNew ← mkFreshExprSyntheticOpaqueMVar target' decl.userName
+    let u ← getLevel decl.type
+    let h := mkApp4 (mkConst ``Eq.mpr [u]) decl.type target' h mvarNew
+    mvarId.assign h
+    if target'.isTrue then
+      mvarNew.mvarId!.assign (mkConst ``True.intro)
+      return .closed
+    else
+      return .goal mvarNew.mvarId!
+
+/--
+Simplifies the target of `mvarId` using `Sym.simp`.
+Returns `.closed` if the target simplifies to `True`, `.goal mvarId'` if simplified
+to a new goal, or `.noProgress` if no simplification occurred.
+
+This function assumes the input goal is a valid `Sym` goal (e.g., expressions are maximally shared).
+-/
+public def simpGoal (mvarId : MVarId) (methods : Simp.Methods := {}) (config : Simp.Config := {})
+    : SymM SimpGoalResult := mvarId.withContext do
+  let decl ← mvarId.getDecl
+  (← simp decl.type methods config).toSimpGoalResult mvarId
+
+/--
+Similar to `simpGoal`, but returns `.goal mvarId` if no progress was made.
+-/
+public def simpGoalIgnoringNoProgress (mvarId : MVarId) (methods : Simp.Methods := {}) (config : Simp.Config := {})
+    : SymM SimpGoalResult := do
+  match (← simpGoal mvarId methods config) with
+  | .noProgress => return .goal mvarId
+  | r => return r
+
+end Lean.Meta.Sym
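The `Eq.mpr` plumbing used by `toSimpGoalResult` above can be checked in isolation. A sketch, not code from the commit:

```lean
-- `Eq.mpr h hq : p` when `h : p = q` and `hq : q`; `toSimpGoalResult` assigns
-- exactly this shape (with a fresh metavariable in place of `hq`) to the old goal.
example (p q : Prop) (h : p = q) (hq : q) : p := Eq.mpr h hq
```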
@@ -48,14 +48,14 @@ def mkFunextFor (xs : Array Expr) (β : Expr) : MetaM Expr := do

 public def simpLambda (e : Expr) : SimpM Result := do
   lambdaTelescope e fun xs b => withoutModifyingCacheIfNotWellBehaved do
-    main xs b
+    main xs (← shareCommon b)
 where
   main (xs : Array Expr) (b : Expr) : SimpM Result := do
     match (← simp b) with
     | .rfl _ => return .rfl
     | .step b' h _ =>
       let h ← mkLambdaFVars xs h
-      let e' ← shareCommonInc (← mkLambdaFVars xs b')
+      let e' ← shareCommon (← mkLambdaFVars xs b')
       let funext ← getFunext xs b
       return .step e' (mkApp3 funext e e' h)

@@ -22,4 +22,8 @@ public abbrev mkEqTransResult (e₁ : Expr) (e₂ : Expr) (h₁ : Expr) (r₂ :
   | .rfl done => return .step e₂ h₁ done
   | .step e₃ h₂ done => return .step e₃ (← mkEqTrans e₁ e₂ h₁ e₃ h₂) done

+public def Result.markAsDone : Result → Result
+  | .rfl _ => .rfl true
+  | .step e h _ => .step e h true
+
 end Lean.Meta.Sym.Simp

@@ -11,6 +11,7 @@ public import Lean.Meta.Sym.Simp.Theorems
 public import Lean.Meta.Sym.Simp.App
 public import Lean.Meta.Sym.Simp.Discharger
 import Lean.Meta.Sym.InstantiateS
+import Lean.Meta.Sym.InstantiateMVarsS
 import Lean.Meta.Sym.Simp.DiscrTree
 namespace Lean.Meta.Sym.Simp
 open Grind
@@ -20,31 +21,48 @@ Creates proof term for a rewriting step.
 Handles both constant expressions (common case, avoids `instantiateLevelParams`)
 and general expressions.
 -/
-def mkValue (expr : Expr) (pattern : Pattern) (result : MatchUnifyResult) : Expr :=
+def mkValue (expr : Expr) (pattern : Pattern) (us : List Level) (args : Array Expr) : Expr :=
   if let .const declName [] := expr then
-    mkAppN (mkConst declName result.us) result.args
+    mkAppN (mkConst declName us) args
   else
-    mkAppN (expr.instantiateLevelParams pattern.levelParams result.us) result.args
+    mkAppN (expr.instantiateLevelParams pattern.levelParams us) args

 /--
 Tries to rewrite `e` using the given theorem.
 -/
-public def Theorem.rewrite (thm : Theorem) (e : Expr) (d : Discharger := dischargeNone) : SimpM Result := do
+public def Theorem.rewrite (thm : Theorem) (e : Expr) (d : Discharger := dischargeNone) : SimpM Result :=
+  /-
+  **Note**: We use `withNewMCtxDepth` to ensure auxiliary metavariables used during the `match?`
+  do not pollute the metavariable context.
+  Thus, we must ensure that all assigned variables have been instantiated.
+  -/
+  withNewMCtxDepth do
   if let some result ← thm.pattern.match? e then
     -- **Note**: Potential optimization: check whether pattern covers all variables.
-    for arg in result.args do
-      let .mvar mvarId := arg | pure ()
-      unless (← mvarId.isAssigned) do
-        let decl ← mvarId.getDecl
-        if let some val ← d decl.type then
-          mvarId.assign val
-        else
-          -- **Note**: Failed to discharge hypothesis.
-          return .rfl
-    let proof := mkValue thm.expr thm.pattern result
-    let rhs := thm.rhs.instantiateLevelParams thm.pattern.levelParams result.us
-    let rhs ← shareCommonInc rhs
-    let expr ← instantiateRevBetaS rhs result.args
+    let mut args := result.args.toVector
+    let us ← result.us.mapM instantiateLevelMVars
+    for h : i in *...args.size do
+      let arg := args[i]
+      if let .mvar mvarId := arg then
+        if (← mvarId.isAssigned) then
+          let arg ← instantiateMVarsS arg
+          args := args.set i arg
+        else
+          let decl ← mvarId.getDecl
+          if let some val ← d decl.type then
+            let val ← instantiateMVarsS val
+            mvarId.assign val
+            args := args.set i val
+          else
+            -- **Note**: Failed to discharge hypothesis.
+            return .rfl
+      else if arg.hasMVar then
+        let arg ← instantiateMVarsS arg
+        args := args.set i arg
+    let proof := mkValue thm.expr thm.pattern us args.toArray
+    let rhs := thm.rhs.instantiateLevelParams thm.pattern.levelParams us
+    let rhs ← share rhs
+    let expr ← instantiateRevBetaS rhs args.toArray
     if isSameExpr e expr then
       return .rfl
     else

@@ -101,7 +101,7 @@ invalidating the cache and causing O(2^n) behavior on conditional trees.
 /-- Configuration options for the structural simplifier. -/
 structure Config where
   /-- Maximum number of steps that can be performed by the simplifier. -/
-  maxSteps : Nat := 1000
+  maxSteps : Nat := 100_000
   /--
   Maximum depth of reentrant simplifier calls through dischargers.
   Prevents infinite loops when conditional rewrite rules trigger recursive discharge attempts.
@@ -173,16 +173,13 @@ abbrev Cache := PHashMap ExprPtr Result

 /-- Mutable state for the simplifier. -/
 structure State where
-  /-- Number of steps performed so far. -/
-  numSteps := 0
-  /--
-  Cache of previously simplified expressions to avoid redundant work.
-  **Note**: Consider moving to `SymM.State`
-  -/
+  /-- Cache of previously simplified expressions to avoid redundant work. -/
   cache : Cache := {}
   /-- Stack of free variables available for reuse when re-entering binders.
   Each entry is (type pointer, fvarId). -/
   binderStack : List (ExprPtr × FVarId) := []
+  /-- Number of steps performed so far. -/
+  numSteps := 0
   /-- Cache for generated funext theorems -/
   funext : PHashMap ExprPtr Expr := {}

@@ -221,8 +218,13 @@ opaque MethodsRef.toMethods (m : MethodsRef) : Methods
 def getMethods : SimpM Methods :=
   return MethodsRef.toMethods (← read)

+/-- Runs a `SimpM` computation with the given theorems, configuration, and initial state -/
+def SimpM.run (x : SimpM α) (methods : Methods := {}) (config : Config := {}) (s : State := {}) : SymM (α × State) := do
+  let initialLCtxSize := (← getLCtx).decls.size
+  x methods.toMethodsRef { initialLCtxSize, config } |>.run s
+
 /-- Runs a `SimpM` computation with the given theorems and configuration. -/
-def SimpM.run (x : SimpM α) (methods : Methods := {}) (config : Config := {}) : SymM α := do
+def SimpM.run' (x : SimpM α) (methods : Methods := {}) (config : Config := {}) : SymM α := do
   let initialLCtxSize := (← getLCtx).decls.size
   x methods.toMethodsRef { initialLCtxSize, config } |>.run' {}

@@ -243,7 +245,8 @@ abbrev post : Simproc := fun e => do

 abbrev withoutModifyingCache (k : SimpM α) : SimpM α := do
   let cache ← getCache
-  try k finally modify fun s => { s with cache }
+  let funext := (← get).funext
+  try k finally modify fun s => { s with cache, funext }

 abbrev withoutModifyingCacheIfNotWellBehaved (k : SimpM α) : SimpM α := do
   if (← getMethods).wellBehavedMethods then k else withoutModifyingCache k
@@ -251,6 +254,6 @@ abbrev withoutModifyingCacheIfNotWellBehaved (k : SimpM α) : SimpM α := do
 end Simp

 abbrev simp (e : Expr) (methods : Simp.Methods := {}) (config : Simp.Config := {}) : SymM Simp.Result := do
-  Simp.SimpM.run (Simp.simp e) methods config
+  Simp.SimpM.run' (Simp.simp e) methods config

 end Lean.Meta.Sym

@@ -83,7 +83,21 @@ inductive CongrInfo where
   -/
   congrTheorem (thm : CongrTheorem)

-/-- Mutable state for the symbolic simulator framework. -/
+/-- Pre-shared expressions for commonly used terms. -/
+structure SharedExprs where
+  trueExpr : Expr
+  falseExpr : Expr
+  natZExpr : Expr
+  btrueExpr : Expr
+  bfalseExpr : Expr
+  ordEqExpr : Expr
+  intExpr : Expr
+
+/-- Readonly context for the symbolic computation framework. -/
+structure Context where
+  sharedExprs : SharedExprs
+
+/-- Mutable state for the symbolic computation framework. -/
 structure State where
   /-- `ShareCommon` (aka `Hash-consing`) state. -/
   share : AlphaShareCommon.State := {}
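The idea behind `SharedExprs` (intern common constants once, then read them from a `ReaderT` context instead of rebuilding them at every use) can be sketched with hypothetical names; none of the identifiers below appear in the commit:

```lean
-- Hypothetical miniature: a pre-built value is stored in a reader context,
-- so later code fetches it rather than re-constructing/re-interning it.
structure MiniCtx where
  trueExpr : String  -- stands in for a maximally shared `Expr`

abbrev MiniM := ReaderT MiniCtx Id

def getMiniTrue : MiniM String := return (← read).trueExpr

#eval getMiniTrue.run { trueExpr := "True" }
```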
@@ -120,11 +134,41 @@ structure State where
   congrInfo : PHashMap ExprPtr CongrInfo := {}
   debug : Bool := false

-abbrev SymM := StateRefT State MetaM
+abbrev SymM := ReaderT Context <| StateRefT State MetaM
+
+private def mkSharedExprs : AlphaShareCommonM SharedExprs := do
+  let falseExpr ← shareCommonAlphaInc <| mkConst ``False
+  let trueExpr ← shareCommonAlphaInc <| mkConst ``True
+  let bfalseExpr ← shareCommonAlphaInc <| mkConst ``Bool.false
+  let btrueExpr ← shareCommonAlphaInc <| mkConst ``Bool.true
+  let natZExpr ← shareCommonAlphaInc <| mkNatLit 0
+  let ordEqExpr ← shareCommonAlphaInc <| mkConst ``Ordering.eq
+  let intExpr ← shareCommonAlphaInc <| Int.mkType
+  return { falseExpr, trueExpr, bfalseExpr, btrueExpr, natZExpr, ordEqExpr, intExpr }

 def SymM.run (x : SymM α) : MetaM α := do
+  let (sharedExprs, share) := mkSharedExprs |>.run {}
   let debug := sym.debug.get (← getOptions)
-  x |>.run' { debug }
+  x { sharedExprs } |>.run' { debug, share }
+
+/-- Returns maximally shared commonly used terms -/
+def getSharedExprs : SymM SharedExprs :=
+  return (← read).sharedExprs
+
+/-- Returns the internalized `True` constant. -/
+def getTrueExpr : SymM Expr := return (← getSharedExprs).trueExpr
+/-- Returns the internalized `False` constant. -/
+def getFalseExpr : SymM Expr := return (← getSharedExprs).falseExpr
+/-- Returns the internalized `Bool.true`. -/
+def getBoolTrueExpr : SymM Expr := return (← getSharedExprs).btrueExpr
+/-- Returns the internalized `Bool.false`. -/
+def getBoolFalseExpr : SymM Expr := return (← getSharedExprs).bfalseExpr
+/-- Returns the internalized `0 : Nat` numeral. -/
+def getNatZeroExpr : SymM Expr := return (← getSharedExprs).natZExpr
+/-- Returns the internalized `Ordering.eq`. -/
+def getOrderingEqExpr : SymM Expr := return (← getSharedExprs).ordEqExpr
+/-- Returns the internalized `Int`. -/
+def getIntExpr : SymM Expr := return (← getSharedExprs).intExpr

 /--
 Applies hash-consing to `e`. Recall that all expressions in a `grind` goal have
@@ -6,13 +6,56 @@ Authors: Leonardo de Moura
 module
 prelude
 public import Lean.Meta.Sym.SymM
+public import Lean.Meta.Transform
+import Init.Grind.Util
+import Lean.Meta.WHNF
+import Lean.Util.ForEachExpr
 namespace Lean.Meta.Sym
 open Grind

+/--
+Returns `true` if `declName` is the name of a grind helper declaration that
+should not be unfolded by `unfoldReducible`.
+-/
+def isGrindGadget (declName : Name) : Bool :=
+  declName == ``Grind.EqMatch
+
+/--
+Auxiliary function for implementing `unfoldReducible` and `unfoldReducibleSimproc`.
+Performs a single step.
+-/
+public def unfoldReducibleStep (e : Expr) : MetaM TransformStep := do
+  let .const declName _ := e.getAppFn | return .continue
+  unless (← isReducible declName) do return .continue
+  if isGrindGadget declName then return .continue
+  -- See comment at isUnfoldReducibleTarget.
+  if (← getEnv).isProjectionFn declName then return .continue
+  let some v ← unfoldDefinition? e | return .continue
+  return .visit v
+
+def isUnfoldReducibleTarget (e : Expr) : CoreM Bool := do
+  let env ← getEnv
+  return Option.isSome <| e.find? fun e => Id.run do
+    let .const declName _ := e | return false
+    if getReducibilityStatusCore env declName matches .reducible then
+      -- Remark: it is wasteful to unfold projection functions since
+      -- kernel projections are folded again in the `foldProjs` preprocessing step.
+      return !isGrindGadget declName && !env.isProjectionFn declName
+    else
+      return false
+
+/--
+Unfolds all `reducible` declarations occurring in `e`.
+This is meant as a preprocessing step. It does **not** guarantee maximally shared terms.
+-/
+public def unfoldReducible (e : Expr) : MetaM Expr := do
+  if !(← isUnfoldReducibleTarget e) then return e
+  Meta.transform e (pre := unfoldReducibleStep)
+
 /--
 Instantiates metavariables, unfolds reducible declarations, and applies `shareCommon`.
 -/
 def preprocessExpr (e : Expr) : SymM Expr := do
-  shareCommon (← instantiateMVars e)
+  shareCommon (← unfoldReducible (← instantiateMVars e))

 /--
 Helper function that removes gaps, instantiates metavariables, and applies `shareCommon`.
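What `unfoldReducible` targets can be demonstrated directly; `double` below is a made-up example, not a declaration from the commit:

```lean
-- Sketch only: `@[reducible]` definitions are the class of constants
-- `unfoldReducible` unfolds; they are transparent to `rfl`-style checks.
@[reducible] def double (n : Nat) : Nat := n + n

example : double 3 = 6 := rfl
```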
@@ -32,6 +75,7 @@ def preprocessLCtx (lctx : LocalContext) : SymM LocalContext := do
       let type ← preprocessExpr type
       let value ← preprocessExpr value
       pure <| LocalDecl.ldecl index fvarId userName type value nondep kind
+    index := index + 1
     decls := decls.push (some decl)
     fvarIdToDecl := fvarIdToDecl.insert decl.fvarId decl
   return { fvarIdToDecl, decls, auxDeclToFullName }
|
||||
mvarId.assign mvarNew
|
||||
return mvarNew.mvarId!
|
||||
|
||||
/-- Debug helper: throws if any subexpression of `e` is not in the table of maximally shared terms. -/
|
||||
public def _root_.Lean.Expr.checkMaxShared (e : Expr) (msg := "") : SymM Unit := do
|
||||
e.forEach fun e => do
|
||||
if let some prev := (← get).share.set.find? { expr := e } then
|
||||
unless isSameExpr prev.expr e do
|
||||
throwNotMaxShared e
|
||||
else
|
||||
throwNotMaxShared e
|
||||
where
|
||||
throwNotMaxShared (e : Expr) : SymM Unit := do
|
||||
let msg := if msg == "" then msg else s!"[{msg}] "
|
||||
throwError "{msg}term is not in the maximally shared table{indentExpr e}"
|
||||
|
||||
/-- Debug helper: throws if any subexpression of the goal's target type is not in the table of maximally shared. -/
|
||||
public def _root_.Lean.MVarId.checkMaxShared (mvarId : MVarId) (msg := "") : SymM Unit := do
|
||||
(← mvarId.getDecl).type.checkMaxShared msg
|
||||
|
||||
end Lean.Meta.Sym
|
||||
|
||||
@@ -11,6 +11,7 @@ import Lean.Util.ForEachExpr
 import Lean.Meta.Tactic.Grind.Util
 import Lean.Meta.Match.Basic
 import Lean.Meta.Tactic.TryThis
+import Lean.Meta.Sym.Util
 public section
 namespace Lean.Meta.Grind
 /-!
@@ -281,7 +282,7 @@ private theorem normConfig_zetaDelta : normConfig.zetaDelta = true := rfl

 def preprocessPattern (pat : Expr) (normalizePattern := true) : MetaM Expr := do
   let pat ← instantiateMVars pat
-  let pat ← unfoldReducible pat
+  let pat ← Sym.unfoldReducible pat
   let pat ← if normalizePattern then normalize pat normConfig else pure pat
   let pat ← detectOffsets pat
   let pat ← foldProjs pat
@@ -107,13 +107,6 @@ private def discharge? (e : Expr) : SimpM (Option Expr) := do
 open Sym

 def GrindM.run (x : GrindM α) (params : Params) (evalTactic? : Option EvalTactic := none) : MetaM α := Sym.SymM.run do
-  let falseExpr ← share <| mkConst ``False
-  let trueExpr ← share <| mkConst ``True
-  let bfalseExpr ← share <| mkConst ``Bool.false
-  let btrueExpr ← share <| mkConst ``Bool.true
-  let natZExpr ← share <| mkNatLit 0
-  let ordEqExpr ← share <| mkConst ``Ordering.eq
-  let intExpr ← share <| Int.mkType
   /- **Note**: Consider using `Sym.simp` in the future. -/
   let simprocs := params.normProcs
   let simpMethods := Simp.mkMethods simprocs discharge? (wellBehavedDischarge := true)
@@ -124,9 +117,7 @@ def GrindM.run (x : GrindM α) (params : Params) (evalTactic? : Option EvalTacti
   let anchorRefs? := params.anchorRefs?
   let debug := grind.debug.get (← getOptions)
   x (← mkMethods evalTactic?).toMethodsRef
-    { config, anchorRefs?, simpMethods, simp, extensions, symPrios
-      trueExpr, falseExpr, natZExpr, btrueExpr, bfalseExpr, ordEqExpr, intExpr
-      debug }
+    { config, anchorRefs?, simpMethods, simp, extensions, symPrios, debug }
     |>.run' {}

 private def mkCleanState (mvarId : MVarId) : GrindM Clean.State := mvarId.withContext do
@@ -155,7 +146,7 @@ private def initENodeCore (e : Expr) (interpreted ctor : Bool) : GoalM Unit := d
   mkENodeCore e interpreted ctor (generation := 0) (funCC := false)

 /-- Returns a new goal for the given metavariable. -/
-public def mkGoal (mvarId : MVarId) : GrindM Goal := do
+public def mkGoalCore (mvarId : MVarId) : GrindM Goal := do
   let config ← getConfig
   let mvarId ← if config.clean then mvarId.exposeNames else pure mvarId
   let trueExpr ← getTrueExpr
@@ -288,7 +279,7 @@ private def initCore (mvarId : MVarId) : GrindM Goal := do
   let mvarId ← mvarId.unfoldReducible
   let mvarId ← mvarId.betaReduce
   appendTagSuffix mvarId `grind
-  let goal ← mkGoal mvarId
+  let goal ← mkGoalCore mvarId
   if config.revert then
     return goal
   else

@@ -8,6 +8,7 @@ prelude
 public import Lean.Meta.Tactic.Grind.Types
 import Init.Grind.Util
 import Lean.Meta.Sym.ExprPtr
+import Lean.Meta.Sym.Util
 import Lean.Meta.Tactic.Grind.Util
 public section
 namespace Lean.Meta.Grind
@@ -103,7 +104,7 @@ where
 -/
   /- We must also apply beta-reduction to improve the effectiveness of the congruence closure procedure. -/
   let e ← Core.betaReduce e
-  let e ← unfoldReducible e
+  let e ← Sym.unfoldReducible e
   /- We must mask proofs occurring in `prop` too. -/
   let e ← visit e
   let e ← eraseIrrelevantMData e

@@ -11,6 +11,7 @@ public import Lean.Meta.Tactic.Grind.Types
 import Lean.Meta.Tactic.Grind.Util
 import Lean.Meta.Tactic.Grind.MatchDiscrOnly
 import Lean.Meta.Tactic.Grind.MarkNestedSubsingletons
+import Lean.Meta.Sym.Util
 public section
 namespace Lean.Meta.Grind

@@ -57,7 +58,7 @@ def preprocessImpl (e : Expr) : GoalM Simp.Result := do
   let e' ← instantiateMVars r.expr
   -- Remark: `simpCore` unfolds reducible constants, but it does not consistently visit all possible subterms.
   -- So, we must use the following `unfoldReducible` step. It is a no-op in most cases.
-  let e' ← unfoldReducible e'
+  let e' ← Sym.unfoldReducible e'
   let e' ← abstractNestedProofs e'
   let e' ← markNestedSubsingletons e'
   let e' ← eraseIrrelevantMData e'
@@ -97,6 +98,6 @@ but ensures assumptions made by `grind` are satisfied.
 -/
 def preprocessLight (e : Expr) : GoalM Expr := do
   let e ← instantiateMVars e
-  shareCommon (← canon (← normalizeLevels (← foldProjs (← eraseIrrelevantMData (← markNestedSubsingletons (← unfoldReducible e))))))
+  shareCommon (← canon (← normalizeLevels (← foldProjs (← eraseIrrelevantMData (← markNestedSubsingletons (← Sym.unfoldReducible e))))))

 end Lean.Meta.Grind

@@ -14,6 +14,7 @@ import Lean.Meta.Tactic.Grind.Arith.Simproc
|
||||
import Lean.Meta.Tactic.Simp.BuiltinSimprocs.List
|
||||
import Lean.Meta.Tactic.Simp.BuiltinSimprocs.Core
|
||||
import Lean.Meta.Tactic.Grind.Util
|
||||
import Lean.Meta.Sym.Util
|
||||
import Init.Grind.Norm
|
||||
public section
|
||||
namespace Lean.Meta.Grind
|
||||
@@ -136,7 +137,7 @@ builtin_simproc_decl reduceCtorEqCheap (_ = _) := fun e => do
|
||||
return .done { expr := mkConst ``False, proof? := (← withDefault <| mkEqFalse' (← mkLambdaFVars #[h] (← mkNoConfusion (mkConst ``False) h))) }
|
||||
|
||||
builtin_dsimproc_decl unfoldReducibleSimproc (_) := fun e => do
|
||||
unfoldReducibleStep e
|
||||
Sym.unfoldReducibleStep e
|
||||
|
||||
/-- Returns the array of simprocs used by `grind`. -/
|
||||
protected def getSimprocs : MetaM (Array Simprocs) := do
|
||||
|
||||
@@ -160,15 +160,10 @@ structure Context where
|
||||
/-- Symbol priorities for inferring E-matching patterns -/
|
||||
symPrios : SymbolPriorities
|
||||
extensions : ExtensionStateArray := #[]
|
||||
trueExpr : Expr
|
||||
falseExpr : Expr
|
||||
natZExpr : Expr
|
||||
btrueExpr : Expr
|
||||
bfalseExpr : Expr
|
||||
ordEqExpr : Expr -- `Ordering.eq`
|
||||
intExpr : Expr -- `Int`
|
||||
debug : Bool -- Cached `grind.debug (← getOptions)`
|
||||
|
||||
export Sym (getTrueExpr getFalseExpr getBoolTrueExpr getBoolFalseExpr getNatZeroExpr getOrderingEqExpr getIntExpr)
|
||||
|
||||
/-- Key for the congruence theorem cache. -/
|
||||
structure CongrTheoremCacheKey where
|
||||
f : Expr
|
||||
@@ -305,34 +300,6 @@ abbrev withGTransparency [MonadControlT MetaM n] [MonadLiftT GrindM n] [Monad n]
|
||||
let m := if (← getConfig).reducible then .reducible else .default
|
||||
withTransparency m k
|
||||
|
||||
/-- Returns the internalized `True` constant. -/
|
||||
def getTrueExpr : GrindM Expr := do
|
||||
return (← readThe Context).trueExpr
|
||||
|
||||
/-- Returns the internalized `False` constant. -/
|
||||
def getFalseExpr : GrindM Expr := do
|
||||
return (← readThe Context).falseExpr
|
||||
|
||||
/-- Returns the internalized `Bool.true`. -/
|
||||
def getBoolTrueExpr : GrindM Expr := do
|
||||
return (← readThe Context).btrueExpr
|
||||
|
||||
/-- Returns the internalized `Bool.false`. -/
|
||||
def getBoolFalseExpr : GrindM Expr := do
|
||||
return (← readThe Context).bfalseExpr
|
||||
|
||||
/-- Returns the internalized `0 : Nat` numeral. -/
|
||||
def getNatZeroExpr : GrindM Expr := do
|
||||
return (← readThe Context).natZExpr
|
||||
|
||||
/-- Returns the internalized `Ordering.eq`. -/
|
||||
def getOrderingEqExpr : GrindM Expr := do
|
||||
return (← readThe Context).ordEqExpr
|
||||
|
||||
/-- Returns the internalized `Int`. -/
|
||||
def getIntExpr : GrindM Expr := do
|
||||
return (← readThe Context).intExpr
|
||||
|
||||
/-- Returns the anchor references (if any) being used to restrict the search. -/
|
||||
def getAnchorRefs : GrindM (Option (Array AnchorRef)) := do
|
||||
return (← readThe Context).anchorRefs?
|
||||
|
||||
@@ -11,6 +11,7 @@ import Lean.ProjFns
|
||||
import Lean.Meta.WHNF
|
||||
import Lean.Meta.AbstractNestedProofs
|
||||
import Lean.Meta.Tactic.Clear
|
||||
import Lean.Meta.Sym.Util
|
||||
public section
|
||||
namespace Lean.Meta.Grind
|
||||
/--
|
||||
@@ -55,49 +56,11 @@ def _root_.Lean.MVarId.transformTarget (mvarId : MVarId) (f : Expr → MetaM Exp
|
||||
mvarId.assign mvarNew
|
||||
return mvarNew.mvarId!
|
||||
|
||||
/--
|
||||
Returns `true` if `declName` is the name of a grind helper declaration that
|
||||
should not be unfolded by `unfoldReducible`.
|
||||
-/
|
||||
def isGrindGadget (declName : Name) : Bool :=
|
||||
declName == ``Grind.EqMatch
|
||||
|
||||
def isUnfoldReducibleTarget (e : Expr) : CoreM Bool := do
|
||||
let env ← getEnv
|
||||
return Option.isSome <| e.find? fun e => Id.run do
|
||||
let .const declName _ := e | return false
|
||||
if getReducibilityStatusCore env declName matches .reducible then
|
||||
-- Remark: it is wasteful to unfold projection functions since
|
||||
-- kernel projections are folded again in the `foldProjs` preprocessing step.
|
||||
return !isGrindGadget declName && !env.isProjectionFn declName
|
||||
else
|
||||
return false
|
||||
|
||||
/--
|
||||
Auxiliary function for implementing `unfoldReducible` and `unfoldReducibleSimproc`.
|
||||
Performs a single step.
|
||||
-/
|
||||
def unfoldReducibleStep (e : Expr) : MetaM TransformStep := do
|
||||
let .const declName _ := e.getAppFn | return .continue
|
||||
unless (← isReducible declName) do return .continue
|
||||
if isGrindGadget declName then return .continue
|
||||
-- See comment at isUnfoldReducibleTarget.
|
||||
if (← getEnv).isProjectionFn declName then return .continue
|
||||
let some v ← unfoldDefinition? e | return .continue
|
||||
return .visit v
|
||||
|
||||
/--
|
||||
Unfolds all `reducible` declarations occurring in `e`.
|
||||
-/
|
||||
def unfoldReducible (e : Expr) : MetaM Expr := do
|
||||
if !(← isUnfoldReducibleTarget e) then return e
|
||||
Meta.transform e (pre := unfoldReducibleStep)
|
||||
|
||||
/--
|
||||
Unfolds all `reducible` declarations occurring in the goal's target.
|
||||
-/
|
||||
def _root_.Lean.MVarId.unfoldReducible (mvarId : MVarId) : MetaM MVarId :=
|
||||
mvarId.transformTarget Grind.unfoldReducible
|
||||
mvarId.transformTarget Sym.unfoldReducible
|
||||
|
||||
/--
|
||||
Beta-reduces the goal's target.
|
||||
|
||||
@@ -188,6 +188,12 @@ def applyEqLemma (e : Expr → EqResult) (lemmaName : Name) (args : Array Expr)
|
||||
return .some (e (mkAppN (mkConst lemmaName) args))
|
||||
|
||||
def reduceNatEqExpr (x y : Expr) : SimpM (Option EqResult):= do
|
||||
/-
|
||||
**TODO**: These proofs rely too much on definitional equality.
|
||||
Example:
|
||||
`x + 1 + 1 + ... + 1 = x + 1 + ... + 1`
|
||||
It will treat both sides as `x + n = x + n`.
|
||||
-/
|
||||
let some xno ← NatOffset.fromExpr? x | return none
|
||||
let some yno ← NatOffset.fromExpr? y | return none
|
||||
match xno, yno with
|
||||
|
@@ -324,6 +324,55 @@ LEAN_EXPORT LEAN_NORETURN void lean_internal_panic(char const * msg);
LEAN_EXPORT LEAN_NORETURN void lean_internal_panic_out_of_memory(void);
LEAN_EXPORT LEAN_NORETURN void lean_internal_panic_unreachable(void);
LEAN_EXPORT LEAN_NORETURN void lean_internal_panic_rc_overflow(void);
LEAN_EXPORT LEAN_NORETURN void lean_internal_panic_overflow(void);

static inline bool lean_usize_mul_would_overflow(size_t a, size_t b) {
#if defined(__GNUC__) || defined(__clang__)
size_t r;
return __builtin_mul_overflow(a, b, &r);
#else
return a != 0 && b > SIZE_MAX / a;
#endif
}

static inline bool lean_usize_add_would_overflow(size_t a, size_t b) {
#if defined(__GNUC__) || defined(__clang__)
size_t r;
return __builtin_add_overflow(a, b, &r);
#else
return a > SIZE_MAX - b;
#endif
}

static inline size_t lean_usize_mul_checked(size_t a, size_t b) {
#if defined(__GNUC__) || defined(__clang__)
size_t r;
if (LEAN_UNLIKELY(__builtin_mul_overflow(a, b, &r))) {
lean_internal_panic_overflow();
}
return r;
#else
if (a != 0 && b > SIZE_MAX / a) {
lean_internal_panic_overflow();
}
return a * b;
#endif
}

static inline size_t lean_usize_add_checked(size_t a, size_t b) {
#if defined(__GNUC__) || defined(__clang__)
size_t r;
if (LEAN_UNLIKELY(__builtin_add_overflow(a, b, &r))) {
lean_internal_panic_overflow();
}
return r;
#else
if (a > SIZE_MAX - b) {
lean_internal_panic_overflow();
}
return a + b;
#endif
}

static inline size_t lean_align(size_t v, size_t a) {
return (v / a)*a + a * (v % a != 0);
@@ -609,7 +658,7 @@ static inline uint8_t * lean_ctor_scalar_cptr(lean_object * o) {

static inline lean_object * lean_alloc_ctor(unsigned tag, unsigned num_objs, unsigned scalar_sz) {
assert(tag <= LeanMaxCtorTag && num_objs < LEAN_MAX_CTOR_FIELDS && scalar_sz < LEAN_MAX_CTOR_SCALARS_SIZE);
lean_object * o = lean_alloc_ctor_memory(sizeof(lean_ctor_object) + sizeof(void*)*num_objs + scalar_sz);
lean_object * o = lean_alloc_ctor_memory(lean_usize_add_checked(lean_usize_add_checked(sizeof(lean_ctor_object), lean_usize_mul_checked(sizeof(void*), num_objs)), scalar_sz));
lean_set_st_header(o, tag, num_objs);
return o;
}
@@ -715,7 +764,7 @@ static inline lean_object ** lean_closure_arg_cptr(lean_object * o) { return lea
static inline lean_obj_res lean_alloc_closure(void * fun, unsigned arity, unsigned num_fixed) {
assert(arity > 0);
assert(num_fixed < arity);
lean_closure_object * o = (lean_closure_object*)lean_alloc_object(sizeof(lean_closure_object) + sizeof(void*)*num_fixed);
lean_closure_object * o = (lean_closure_object*)lean_alloc_object(lean_usize_add_checked(sizeof(lean_closure_object), lean_usize_mul_checked(sizeof(void*), num_fixed)));
lean_set_st_header((lean_object*)o, LeanClosure, 0);
o->m_fun = fun;
o->m_arity = arity;
@@ -761,7 +810,7 @@ LEAN_EXPORT lean_object* lean_apply_m(lean_object* f, unsigned n, lean_object**

/* Arrays of objects (low level API) */
static inline lean_obj_res lean_alloc_array(size_t size, size_t capacity) {
lean_array_object * o = (lean_array_object*)lean_alloc_object(sizeof(lean_array_object) + sizeof(void*)*capacity);
lean_array_object * o = (lean_array_object*)lean_alloc_object(lean_usize_add_checked(sizeof(lean_array_object), lean_usize_mul_checked(sizeof(void*), capacity)));
lean_set_st_header((lean_object*)o, LeanArray, 0);
o->m_size = size;
o->m_capacity = capacity;
@@ -934,8 +983,18 @@ LEAN_EXPORT lean_object * lean_mk_array(lean_obj_arg n, lean_obj_arg v);

/* Array of scalars */

static inline bool lean_alloc_sarray_would_overflow(unsigned elem_size, size_t capacity) {
if (lean_usize_mul_would_overflow(elem_size, capacity)) {
return true;
}
if (lean_usize_add_would_overflow(sizeof(lean_sarray_object), elem_size * capacity)) {
return true;
}
return false;
}

static inline lean_obj_res lean_alloc_sarray(unsigned elem_size, size_t size, size_t capacity) {
lean_sarray_object * o = (lean_sarray_object*)lean_alloc_object(sizeof(lean_sarray_object) + elem_size*capacity);
lean_sarray_object * o = (lean_sarray_object*)lean_alloc_object(lean_usize_add_checked(sizeof(lean_sarray_object), lean_usize_mul_checked(elem_size, capacity)));
lean_set_st_header((lean_object*)o, LeanScalarArray, elem_size);
o->m_size = size;
o->m_capacity = capacity;
@@ -1090,7 +1149,7 @@ static inline lean_obj_res lean_float_array_set(lean_obj_arg a, b_lean_obj_arg i
/* Strings */

static inline lean_obj_res lean_alloc_string(size_t size, size_t capacity, size_t len) {
lean_string_object * o = (lean_string_object*)lean_alloc_object(sizeof(lean_string_object) + capacity);
lean_string_object * o = (lean_string_object*)lean_alloc_object(lean_usize_add_checked(sizeof(lean_string_object), capacity));
lean_set_st_header((lean_object*)o, LeanString, 0);
o->m_size = size;
o->m_capacity = capacity;
@@ -304,13 +304,13 @@ and log are saved to `traceFile`, if the build completes without a fatal error
(depTrace : BuildTrace) (traceFile : FilePath) (build : JobM α)
(action : JobAction := .build)
: JobM α := do
updateAction action
let noBuildTraceFile := traceFile.addExtension "nobuild"
if (← getNoBuild) then
updateAction .build
modify ({· with wantsRebuild := true})
writeBuildTrace noBuildTraceFile depTrace Json.null {}
error s!"target is out-of-date and needs to be rebuilt"
else
updateAction action
let startTime ← IO.monoMsNow
try
let iniPos ← getLogPos

@@ -154,7 +154,6 @@ public instance [h : FamilyDef CustomOut (p, t) α] : FamilyDef (CustomData p) t
/-! ## Build Data -/
--------------------------------------------------------------------------------

set_option linter.deprecated false in
/--
A mapping between a build key and its associated build data in the store.
It is a simple type function composed of the separate open type families for
@@ -178,7 +177,6 @@ public instance [FamilyOut DataType Module.facetKind α]
: FamilyDef BuildData (.packageModule p m) α where
fam_eq := by unfold BuildData; simp

set_option linter.deprecated false in
public instance [FamilyOut DataType Module.facetKind α]
: FamilyDef BuildData (.module k) α where
fam_eq := by unfold BuildData; simp

@@ -63,6 +63,8 @@ public structure JobState where
log : Log := {}
/-- Tracks whether this job performed any significant build action. -/
action : JobAction := .unknown
/-- Whether this job failed due to a request to rebuild for `--no-build`. -/
wantsRebuild : Bool := false
/-- Current trace of a build job. -/
trace : BuildTrace := .nil
/-- How long the job spent building (in milliseconds). -/
@@ -72,6 +74,7 @@ public structure JobState where
public def JobState.merge (a b : JobState) : JobState where
log := a.log ++ b.log
action := a.action.merge b.action
wantsRebuild := a.wantsRebuild || b.wantsRebuild
trace := mixTrace a.trace b.trace
buildTime := a.buildTime + b.buildTime

@@ -77,7 +77,11 @@ where
else
return .package (stringToLegalOrSimpleName pkg.copy)
else if target.startsWith "+" then
return .module (stringToLegalOrSimpleName (target.drop 1).copy)
let mod := target.drop 1
if mod.isEmpty then
throw "ill-formed target: expected module name after '+'"
else
return .module (stringToLegalOrSimpleName mod.copy)
else
parsePackageTarget .anonymous target
| [pkg, target] =>
@@ -241,5 +245,3 @@ public instance : Std.LawfulEqCmp quickCmp where
· simp only [quickCmp]
split <;> simp_all
· simp_all [quickCmp]

attribute [deprecated packageModule (since := "2025-11-13")] module

@@ -357,6 +357,7 @@ private def Package.discriminant (self : Package) :=
else
s!"{self.prettyName}@{self.version}"

set_option linter.unusedVariables.funArgs false in
private def fetchImportInfo
(fileName : String) (pkgName modName : Name) (header : ModuleHeader)
: FetchM (Job ModuleImportInfo) := do
@@ -381,22 +382,23 @@ private def fetchImportInfo
let importJob ← mod.exportInfo.fetch
return s.zipWith (sync := true) (·.addImport nonModule imp ·) importJob
else
let isImportable (mod) :=
mod.allowImportAll || pkgName == mod.pkg.keyName
let allImportable :=
if imp.importAll then
mods.all isImportable
else true
unless allImportable do
let msg := s!"{fileName}: cannot `import all` the module `{imp.module}` \
from the following packages:"
let msg := mods.foldl (init := msg) fun msg mod =>
if isImportable mod then
msg
else
s!"{msg}\n {mod.pkg.discriminant}"
logError msg
return .error
-- Remark: We've decided to disable this check for now
-- let isImportable (mod) :=
-- mod.allowImportAll || pkgName == mod.pkg.keyName
-- let allImportable :=
-- if imp.importAll then
-- mods.all isImportable
-- else true
-- unless allImportable do
-- let msg := s!"{fileName}: cannot `import all` the module `{imp.module}` \
-- from the following packages:"
-- let msg := mods.foldl (init := msg) fun msg mod =>
-- if isImportable mod then
-- msg
-- else
-- s!"{msg}\n {mod.pkg.discriminant}"
-- logError msg
-- return .error
let mods : Vector Module n := .mk mods rfl
let expInfosJob ← Job.collectVector <$> mods.mapM (·.exportInfo.fetch)
s.bindM (sync := true) fun impInfo => do
@@ -50,14 +50,14 @@ private structure MonitorContext where
/-- How often to poll jobs (in milliseconds). -/
updateFrequency : Nat

@[inline] def MonitorContext.logger (ctx : MonitorContext) : MonadLog IO :=
@[inline] def MonitorContext.logger (ctx : MonitorContext) : MonadLog BaseIO :=
.stream ctx.out ctx.outLv ctx.useAnsi

/-- State of the Lake build monitor. -/
private structure MonitorState where
jobNo : Nat := 0
totalJobs : Nat := 0
didBuild : Bool := false
wantsRebuild : Bool := false
failures : Array String
resetCtrl : String
lastUpdate : Nat
@@ -114,14 +114,14 @@ private def renderProgress (running unfinished : Array OpaqueJob) (h : 0 < unfin
flush

private def reportJob (job : OpaqueJob) : MonitorM PUnit := do
let {jobNo, totalJobs, didBuild, ..} ← get
let {jobNo, totalJobs, ..} ← get
let {failLv, outLv, showOptional, out, useAnsi, showProgress, minAction, showTime, ..} ← read
let {task, caption, optional, ..} := job
let {log, action, buildTime, ..} := task.get.state
let {log, action, wantsRebuild, buildTime, ..} := task.get.state
let maxLv := log.maxLv
let failed := strictAnd log.hasEntries (maxLv ≥ failLv)
if !didBuild && action = .build then
modify fun s => {s with didBuild := true}
if wantsRebuild then
modify fun s => if s.wantsRebuild then s else {s with wantsRebuild := true}
if failed && !optional then
modify fun s => {s with failures := s.failures.push caption}
let hasOutput := failed || (log.hasEntries && maxLv ≥ outLv)
@@ -195,7 +195,7 @@ end Monitor

/-- **For internal use only.** -/
public structure MonitorResult where
didBuild : Bool
wantsRebuild : Bool
failures : Array String
numJobs : Nat

@@ -233,11 +233,11 @@ def monitorJobs'
return {
failures := s.failures
numJobs := s.totalJobs
didBuild := s.didBuild
wantsRebuild := s.wantsRebuild
}

/-- The job monitor function. An auxiliary definition for `runFetchM`. -/
@[inline, deprecated "Deprecated without replacement" (since := "2025-01-08")]
@[inline, deprecated "Deprecated without replacement." (since := "2025-01-08")]
public def monitorJobs
(initJobs : Array OpaqueJob)
(jobs : IO.Ref (Array OpaqueJob))
@@ -259,9 +259,9 @@ public def monitorJobs
public def noBuildCode : ExitCode := 3

def Workspace.saveOutputs
[logger : MonadLog IO] (ws : Workspace)
[logger : MonadLog BaseIO] (ws : Workspace)
(out : IO.FS.Stream) (outputsFile : FilePath) (isVerbose : Bool)
: IO Unit := do
: BaseIO Unit := do
unless ws.isRootArtifactCacheEnabled do
logWarning s!"{ws.root.prettyName}: \
the artifact cache is not enabled for this package, so the artifacts described \
@@ -313,13 +313,20 @@ def monitorJob (ctx : MonitorContext) (job : Job α) : BaseIO (BuildResult α) :
else
return {toMonitorResult := result, out := .error "build failed"}

def monitorFetchM
(mctx : MonitorContext) (bctx : BuildContext) (build : FetchM α)
def mkBuildContext' (ws : Workspace) (cfg : BuildConfig) (jobs : JobQueue) : BuildContext where
opaqueWs := ws
toBuildConfig := cfg
registeredJobs := jobs
leanTrace := .ofHash (pureHash ws.lakeEnv.leanGithash)
s!"Lean {Lean.versionStringCore}, commit {ws.lakeEnv.leanGithash}"

def Workspace.startBuild
(ws : Workspace) (cfg : BuildConfig) (jobs : JobQueue) (build : FetchM α)
(caption := "job computation")
: BaseIO (BuildResult α) := do
: BaseIO (Job α) := do
let bctx := mkBuildContext' ws cfg jobs
let compute := Job.async build (caption := caption)
let job ← compute.run.run'.run bctx |>.run nilTrace
monitorJob mctx job
compute.run.run'.run bctx |>.run nilTrace

def Workspace.finalizeBuild
(ws : Workspace) (cfg : BuildConfig) (ctx : MonitorContext) (result : BuildResult α)
@@ -327,17 +334,14 @@ def Workspace.finalizeBuild
reportResult cfg ctx.out result
if let some outputsFile := cfg.outputsFile? then
ws.saveOutputs (logger := ctx.logger) ctx.out outputsFile (cfg.verbosity matches .verbose)
if cfg.noBuild && result.didBuild then
IO.Process.exit noBuildCode.toUInt8
else
IO.ofExcept result.out

def mkBuildContext' (ws : Workspace) (cfg : BuildConfig) (jobs : JobQueue) : BuildContext where
opaqueWs := ws
toBuildConfig := cfg
registeredJobs := jobs
leanTrace := .ofHash (pureHash ws.lakeEnv.leanGithash)
s!"Lean {Lean.versionStringCore}, commit {ws.lakeEnv.leanGithash}"
match result.out with
| .ok a =>
return a
| .error e =>
if cfg.noBuild && result.wantsRebuild then
IO.Process.exit noBuildCode.toUInt8
else
throw (IO.userError e)

/--
Run a build function in the Workspace's context using the provided configuration.
@@ -349,16 +353,13 @@ public def Workspace.runFetchM
: IO α := do
let jobs ← mkJobQueue
let mctx ← mkMonitorContext cfg jobs
let bctx := mkBuildContext' ws cfg jobs
let result ← monitorFetchM mctx bctx build caption
let job ← ws.startBuild cfg jobs build caption
let result ← monitorJob mctx job
ws.finalizeBuild cfg mctx result

def monitorBuild
(mctx : MonitorContext) (bctx : BuildContext) (build : FetchM (Job α))
(caption := "job computation")
: BaseIO (BuildResult α) := do
let result ← monitorFetchM mctx bctx build caption
match result.out with
def monitorBuild (mctx : MonitorContext) (job : Job (Job α)) : BaseIO (BuildResult α) := do
let result ← monitorJob mctx job
match result.out with
| .ok job =>
if let some a ← job.wait? then
return {result with out := .ok a}
@@ -374,24 +375,24 @@ Returns whether a build is needed to validate `build`. Does not report on the at

This is equivalent to checking whether `lake build --no-build` exits with code 0.
-/
@[inline] public def Workspace.checkNoBuild
public def Workspace.checkNoBuild
(ws : Workspace) (build : FetchM (Job α))
: BaseIO Bool := do
let jobs ← mkJobQueue
let cfg := {noBuild := true}
let mctx ← mkMonitorContext cfg jobs
let bctx := mkBuildContext' ws cfg jobs
let result ← monitorBuild mctx bctx build
return result.isOk && !result.didBuild
let job ← ws.startBuild cfg jobs build
let result ← monitorBuild mctx job
return result.isOk -- `isOk` means no failures, and thus no `--no-build` failures

/-- Run a build function in the Workspace's context and await the result. -/
@[inline] public def Workspace.runBuild
public def Workspace.runBuild
(ws : Workspace) (build : FetchM (Job α)) (cfg : BuildConfig := {})
: IO α := do
let jobs ← mkJobQueue
let mctx ← mkMonitorContext cfg jobs
let bctx := mkBuildContext' ws cfg jobs
let result ← monitorBuild mctx bctx build
let job ← ws.startBuild cfg jobs build
let result ← monitorBuild mctx job
ws.finalizeBuild cfg mctx result

/-- Produce a build job in the Lake monad's workspace and await the result. -/
@@ -15,8 +15,7 @@ open Lean

namespace Lake

set_option linter.deprecated false in
variable (defaultPkg : Package) (root : BuildKey) in
variable (defaultPkg : Package) (root : PartialBuildKey) in
private def PartialBuildKey.fetchInCoreAux
(self : PartialBuildKey) (facetless : Bool := false)
: FetchM ((key : BuildKey) × Job (BuildData key)) := do
@@ -24,7 +23,7 @@ private def PartialBuildKey.fetchInCoreAux
| .module modName =>
let some mod ← findModule? modName
| error s!"invalid target '{root}': module '{modName}' not found in workspace"
return ⟨.module modName, cast (by simp) <| Job.pure mod⟩
return ⟨.packageModule mod.pkg.keyName modName, cast (by simp) <| Job.pure mod⟩
| .package pkgName =>
let pkg ← resolveTargetPackageD pkgName
return ⟨.package pkg.keyName, cast (by simp) <| Job.pure pkg⟩
@@ -98,7 +97,6 @@ Fetches the target specified by this key, resolving gaps as needed.
@[inline] public def PartialBuildKey.fetchIn (defaultPkg : Package) (self : PartialBuildKey) : FetchM OpaqueJob :=
(·.2.toOpaque) <$> fetchInCore defaultPkg self

set_option linter.deprecated false in
variable (root : BuildKey) in
private def BuildKey.fetchCore
(self : BuildKey)
@@ -109,17 +107,17 @@ private def BuildKey.fetchCore
| error s!"invalid target '{root}': module '{modName}' not found in workspace"
return cast (by simp) <| Job.pure mod
| package pkgName =>
let some pkg ← findPackage? pkgName
let some pkg ← findPackageByKey? pkgName
| error s!"invalid target '{root}': package '{pkgName}' not found in workspace"
return cast (by simp) <| Job.pure pkg.toPackage
| packageModule pkgName modName =>
let some pkg ← findPackage? pkgName
let some pkg ← findPackageByKey? pkgName
| error s!"invalid target '{root}': package '{pkgName}' not found in workspace"
let some mod := pkg.findTargetModule? modName
| error s!"invalid target '{root}': module '{modName}' not found in package '{pkg.name}'"
| error s!"invalid target '{root}': module '{modName}' not found in package '{pkg.prettyName}'"
return cast (by simp) <| Job.pure mod
| packageTarget pkgName target =>
let some pkg ← findPackage? pkgName
let some pkg ← findPackageByKey? pkgName
| error s!"invalid target '{root}': package '{pkgName}' not found in workspace"
fetch <| pkg.target target
| facet target facetName =>

@@ -395,7 +395,7 @@ protected def get : CliM PUnit := do
if let some file := mappings? then liftM (m := LoggerIO) do
if opts.platform?.isSome || opts.toolchain?.isSome then
logWarning "the `--platform` and `--toolchain` options do nothing for `cache get` with a mappings file"
if .warning ≤ opts.failLv then
if opts.failLv ≤ .warning then
failure
let some remoteScope := opts.scope?
| error "to use `cache get` with a mappings file, `--scope` or `--repo` must be set"
@@ -470,7 +470,7 @@ where
if (← repo.hasDiff) then
logWarning s!"{pkg.prettyName}: package has changes; \
only artifacts for committed code will be downloaded"
if .warning ≤ opts.failLv then
if opts.failLv ≤ .warning then
failure
let n := opts.maxRevs
let revs ← repo.getHeadRevisions n
@@ -505,7 +505,7 @@ protected def put : CliM PUnit := do
if (← repo.hasDiff) then
logWarning s!"{pkg.prettyName}: package has changes; \
artifacts will be uploaded for the most recent commit"
if .warning ≤ opts.failLv then
if opts.failLv ≤ .warning then
exit 1
let rev ← repo.getHeadRevision
let map ← CacheMap.load file
@@ -221,7 +221,6 @@ protected def PathPatDescr.toLean? (p : PathPatDescr) : Option Term :=

instance : ToLean? PathPatDescr := ⟨PathPatDescr.toLean?⟩

set_option linter.deprecated false in
@[inline] protected def PartialBuildKey.toLean (k : PartialBuildKey) : Term :=
go k []
where

@@ -13,6 +13,7 @@ import Lake.Build.Actions
import Lake.Util.Url
import Lake.Util.Proc
import Lake.Util.Reservoir
import Lake.Util.JsonObject
import Lake.Util.IO
import Init.Data.String.Search
import Init.Data.String.Lemmas.Basic
@@ -305,15 +306,26 @@ end Cache
def uploadS3
(file : FilePath) (contentType : String) (url : String) (key : String)
: LoggerIO Unit := do
proc {
let out ← captureProc' {
cmd := "curl"
args := #[
"-s",
"-s", "-w", "%{stderr}%{json}\n",
"--aws-sigv4", "aws:amz:auto:s3", "--user", key,
"-X", "PUT", "-T", file.toString, url,
"-H", s!"Content-Type: {contentType}"
]
} (quiet := true)
}
match Json.parse out.stderr >>= JsonObject.fromJson? with
| .ok data =>
let code ← id do
match (data.get? "response_code" <|> data.get? "http_code") with
| .ok (some code) => return code
| .ok none => error s!"curl's JSON output did not contain a response code"
| .error e => error s!"curl's JSON output contained an invalid JSON response code: {e}"
unless code == 200 do
error s!"failed to upload artifact, error {code}; received:\n{out.stdout}"
| .error e =>
error s!"curl produced invalid JSON output: {e}"
||||
/--
|
||||
Configuration of a remote cache service (e.g., Reservoir or an S3 bucket).
|
||||
|
@@ -25,9 +25,15 @@ export ToText (toText)
 public instance (priority := 0) [ToString α] : ToText α := ⟨toString⟩
 
+@[inline] public def listToLines (as : List α) (f : α → String) : String :=
+  as.foldl (s!"{·}{f ·}\n") "" |>.dropEnd 1 |>.copy
+
+@[inline] public def arrayToLines (as : Array α) (f : α → String) : String :=
+  as.foldl (s!"{·}{f ·}\n") "" |>.dropEnd 1 |>.copy
+
 public instance : ToText Json := ⟨Json.compress⟩
-public instance [ToText α] : ToText (List α) := ⟨(·.foldl (s!"{·}{toText ·}\n") "" |>.dropEnd 1 |>.copy)⟩
-public instance [ToText α] : ToText (Array α) := ⟨(·.foldl (s!"{·}{toText ·}\n") "" |>.dropEnd 1 |>.copy)⟩
+public instance [ToText α] : ToText (List α) := ⟨(listToLines · toText)⟩
+public instance [ToText α] : ToText (Array α) := ⟨(arrayToLines · toText)⟩
 
 /-- Class used to format target output as text for `lake query`. -/
 public class QueryText (α : Type u) where
@@ -38,6 +44,8 @@ export QueryText (queryText)
 
 public instance (priority := 0) : QueryText α := ⟨fun _ => ""⟩
 public instance (priority := low) [ToText α] : QueryText α := ⟨toText⟩
+public instance [QueryText α] : QueryText (List α) := ⟨(listToLines · queryText)⟩
+public instance [QueryText α] : QueryText (Array α) := ⟨(arrayToLines · queryText)⟩
 public instance : QueryText Unit := ⟨fun _ => ""⟩
 
 attribute [deprecated QueryText (since := "2025-07-25")] ToText
@@ -51,6 +59,8 @@ export QueryJson (queryJson)
 
 public instance (priority := 0) : QueryJson α := ⟨fun _ => .null⟩
 public instance (priority := low) [ToJson α] : QueryJson α := ⟨toJson⟩
+public instance [QueryJson α] : QueryJson (List α) := ⟨(Json.arr <| ·.toArray.map queryJson)⟩
+public instance [QueryJson α] : QueryJson (Array α) := ⟨(Json.arr <| ·.map queryJson)⟩
 public instance : QueryJson Unit := ⟨fun _ => .null⟩
 
 /-- Class used to format target output for `lake query`. -/
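The new `listToLines`/`arrayToLines` helpers above fold each element into a newline-terminated string and then drop the trailing newline (`dropEnd 1`). A minimal C++ sketch of the same pattern; the function name here is illustrative, not part of Lake's API:

```cpp
#include <string>
#include <vector>
#include <functional>

// Append each formatted element followed by '\n', then drop the final
// newline -- the same shape as Lake's `listToLines` fold above.
std::string list_to_lines(const std::vector<std::string> & as,
                          const std::function<std::string(const std::string &)> & f) {
    std::string out;
    for (const auto & a : as)
        out += f(a) + "\n";
    if (!out.empty())
        out.pop_back(); // dropEnd 1: remove the trailing '\n'
    return out;
}
```

Factoring the fold into a named helper lets the `List` and `Array` instances share one definition instead of repeating the lambda inline.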
@@ -39,8 +39,7 @@ def expandModuleTargetKeyLit : Macro := fun stx => do
     | Macro.throwUnsupported
   withRef tk do
     let modLit := Name.quoteFrom mod mod.getId
-    let pkgLit := Name.quoteFrom tk Name.anonymous
-    let tgt ← `(BuildKey.packageModule $pkgLit $modLit)
+    let tgt ← `(BuildKey.module $modLit)
     let key ← expandFacets tgt facets
     `(PartialBuildKey.mk $key)
@@ -200,7 +200,6 @@ option_ref<decl> find_ir_decl_boxed(elab_environment const & env, name const & n
 extern "C" double lean_float_of_nat(lean_obj_arg a);
 extern "C" float lean_float32_of_nat(lean_obj_arg a);
 
-static string_ref * g_boxed_mangled_suffix = nullptr;
 static name * g_interpreter_prefer_native = nullptr;
 DEBUG_CODE(static name * g_interpreter_step = nullptr;)
 DEBUG_CODE(static name * g_interpreter_call = nullptr;)
@@ -216,6 +215,12 @@ string_ref get_symbol_stem(elab_environment const & env, name const & fn) {
     return string_ref(lean_get_symbol_stem(env.to_obj_arg(), fn.to_obj_arg()));
 }
 
+extern "C" obj_res lean_mk_mangled_boxed_name(obj_arg str);
+
+string_ref mk_mangled_boxed_name(string_ref const & str) {
+    return string_ref(lean_mk_mangled_boxed_name(str.to_obj_arg()));
+}
+
 extern "C" object * lean_ir_format_fn_body_head(object * b);
 std::string format_fn_body_head(fn_body const & b) {
     object_ref s(lean_ir_format_fn_body_head(b.to_obj_arg()));
@@ -843,7 +848,7 @@ private:
     symbol_cache_entry e_new { get_decl(fn), {nullptr, false} };
     if (m_prefer_native || decl_tag(e_new.m_decl) == decl_kind::Extern || has_init_attribute(m_env, fn)) {
         string_ref mangled = get_symbol_stem(m_env, fn);
-        string_ref boxed_mangled(string_append(mangled.to_obj_arg(), g_boxed_mangled_suffix->raw()));
+        string_ref boxed_mangled = mk_mangled_boxed_name(mangled);
         // check for boxed version first
         if (void *p_boxed = lookup_symbol_in_cur_exe(boxed_mangled.data())) {
             e_new.m_native.m_addr = p_boxed;
@@ -965,7 +970,7 @@ private:
     } else {
         if (decl_tag(e.m_decl) == decl_kind::Extern) {
             string_ref mangled = get_symbol_stem(m_env, fn);
-            string_ref boxed_mangled(string_append(mangled.to_obj_arg(), g_boxed_mangled_suffix->raw()));
+            string_ref boxed_mangled = mk_mangled_boxed_name(mangled);
             throw exception(sstream() << "Could not find native implementation of external declaration '" << fn
                 << "' (symbols '" << boxed_mangled.data() << "' or '" << mangled.data() << "').\n"
                 << "For declarations from `Init`, `Std`, or `Lean`, you need to set `supportInterpreter := true` "
@@ -1212,8 +1217,6 @@ extern "C" LEAN_EXPORT object * lean_run_init(object * env, object * opts, objec
 }
 
 void initialize_ir_interpreter() {
-    ir::g_boxed_mangled_suffix = new string_ref("___boxed");
-    mark_persistent(ir::g_boxed_mangled_suffix->raw());
     ir::g_interpreter_prefer_native = new name({"interpreter", "prefer_native"});
     ir::g_init_globals = new name_hash_map<object *>();
     register_bool_option(*ir::g_interpreter_prefer_native, LEAN_DEFAULT_INTERPRETER_PREFER_NATIVE, "(interpreter) whether to use precompiled code where available");
@@ -1236,6 +1239,5 @@ void finalize_ir_interpreter() {
     });
     delete ir::g_init_globals;
     delete ir::g_interpreter_prefer_native;
-    delete ir::g_boxed_mangled_suffix;
 }
 }
@@ -143,7 +143,7 @@ object * object_compactor::copy_object(object * o) {
 void object_compactor::insert_sarray(object * o) {
     size_t sz = lean_sarray_size(o);
     unsigned elem_sz = lean_sarray_elem_size(o);
-    size_t obj_sz = sizeof(lean_sarray_object) + elem_sz*sz;
+    size_t obj_sz = lean_usize_add_checked(sizeof(lean_sarray_object), lean_usize_mul_checked(elem_sz, sz));
     lean_sarray_object * new_o = (lean_sarray_object*)alloc(obj_sz);
     lean_set_non_heap_header_for_big((lean_object*)new_o, LeanScalarArray, elem_sz);
     new_o->m_size = sz;
@@ -155,7 +155,7 @@ void object_compactor::insert_sarray(object * o) {
 void object_compactor::insert_string(object * o) {
     size_t sz = lean_string_size(o);
     size_t len = lean_string_len(o);
-    size_t obj_sz = sizeof(lean_string_object) + sz;
+    size_t obj_sz = lean_usize_add_checked(sizeof(lean_string_object), sz);
     lean_string_object * new_o = (lean_string_object*)alloc(obj_sz);
     lean_set_non_heap_header_for_big((lean_object*)new_o, LeanString, 0);
     new_o->m_size = sz;
@@ -214,7 +214,7 @@ bool object_compactor::insert_array(object * o) {
     }
     if (missing_children)
         return false;
-    size_t obj_sz = sizeof(lean_array_object) + sizeof(void*)*sz;
+    size_t obj_sz = lean_usize_add_checked(sizeof(lean_array_object), lean_usize_mul_checked(sizeof(void*), sz));
     lean_array_object * new_o = (lean_array_object*)alloc(obj_sz);
     lean_set_non_heap_header_for_big((lean_object*)new_o, LeanArray, 0);
     new_o->m_size = sz;
@@ -274,8 +274,8 @@ bool object_compactor::insert_promise(object * o) {
 void object_compactor::insert_mpz(object * o) {
 #ifdef LEAN_USE_GMP
     size_t nlimbs = mpz_size(to_mpz(o)->m_value.m_val);
-    size_t data_sz = sizeof(mp_limb_t) * nlimbs;
-    size_t sz = sizeof(mpz_object) + data_sz;
+    size_t data_sz = lean_usize_mul_checked(sizeof(mp_limb_t), nlimbs);
+    size_t sz = lean_usize_add_checked(sizeof(mpz_object), data_sz);
     mpz_object * new_o = (mpz_object *)alloc(sz);
     memcpy(new_o, to_mpz(o), sizeof(mpz_object));
     lean_set_non_heap_header((lean_object*)new_o, sz, LeanMPZ, 0);
@@ -287,8 +287,8 @@ void object_compactor::insert_mpz(object * o) {
     m._mp_alloc = nlimbs;
     save(o, (lean_object*)new_o);
 #else
-    size_t data_sz = sizeof(mpn_digit) * to_mpz(o)->m_value.m_size;
-    size_t sz = sizeof(mpz_object) + data_sz;
+    size_t data_sz = lean_usize_mul_checked(sizeof(mpn_digit), to_mpz(o)->m_value.m_size);
+    size_t sz = lean_usize_add_checked(sizeof(mpz_object), data_sz);
     mpz_object * new_o = (mpz_object *)alloc(sz);
     // Manually copy the `mpz_object` to ensure `mpz` struct padding is left as
     // zero as prepared by `object_compactor::alloc`. `memcpy` would copy the
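The compactor hunks above replace raw `size_t` arithmetic with `lean_usize_add_checked`/`lean_usize_mul_checked`, so a corrupt or adversarial object size panics instead of silently wrapping into a too-small allocation. A sketch of that semantics under the assumption that the checked helpers panic on overflow (the real definitions live in the Lean runtime; this reimplementation uses the GCC/Clang `__builtin_*_overflow` intrinsics and is illustrative only):

```cpp
#include <cstddef>
#include <cstdio>
#include <cstdlib>

// Multiply two size_t values; panic instead of wrapping on overflow.
// Stand-in for the runtime's `lean_usize_mul_checked`.
static size_t usize_mul_checked(size_t a, size_t b) {
    size_t r;
    if (__builtin_mul_overflow(a, b, &r)) {
        fprintf(stderr, "integer overflow in runtime computation\n");
        abort();
    }
    return r;
}

// Add two size_t values with the same overflow-panics behavior.
// Stand-in for the runtime's `lean_usize_add_checked`.
static size_t usize_add_checked(size_t a, size_t b) {
    size_t r;
    if (__builtin_add_overflow(a, b, &r)) {
        fprintf(stderr, "integer overflow in runtime computation\n");
        abort();
    }
    return r;
}
```

An expression like `sizeof(header) + elem_sz * n` then becomes `usize_add_checked(sizeof(header), usize_mul_checked(elem_sz, n))`, which is exactly the shape of the rewritten size computations in the diff.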
@@ -583,6 +583,9 @@ extern "C" LEAN_EXPORT obj_res lean_io_prim_handle_truncate(b_obj_arg h) {
 /* Handle.read : (@& Handle) → USize → IO ByteArray */
 extern "C" LEAN_EXPORT obj_res lean_io_prim_handle_read(b_obj_arg h, usize nbytes) {
     FILE * fp = io_get_handle(h);
+    if (lean_alloc_sarray_would_overflow(1, nbytes)) {
+        return io_result_mk_error(decode_io_error(ENOMEM, NULL));
+    }
     obj_res res = lean_alloc_sarray(1, 0, nbytes);
     usize n = std::fread(lean_sarray_cptr(res), 1, nbytes, fp);
     if (n > 0) {
@@ -861,6 +864,9 @@ extern "C" LEAN_EXPORT obj_res lean_io_get_random_bytes (size_t nbytes) {
 }
 #endif
 
+    if (lean_alloc_sarray_would_overflow(1, nbytes)) {
+        return io_result_mk_error(decode_io_error(ENOMEM, NULL));
+    }
     obj_res res = lean_alloc_sarray(1, 0, nbytes);
     size_t remain = nbytes;
     uint8_t *dst = lean_sarray_cptr(res);
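Both IO hunks above follow the same pattern: a user-controlled byte count is validated with an overflow predicate before the scalar array is allocated, and an `ENOMEM` IO error is returned instead of attempting the allocation. A simplified stand-in (the predicate mirrors `lean_alloc_sarray_would_overflow`, but its definition, the header size, and `read_bytes` are all hypothetical here):

```cpp
#include <cerrno>
#include <cstddef>
#include <cstdint>
#include <vector>

// Would header_sz + elem_sz * n overflow size_t? Checked via division to
// avoid performing the overflowing multiply itself.
static bool alloc_would_overflow(size_t elem_sz, size_t n, size_t header_sz) {
    return n > (SIZE_MAX - header_sz) / elem_sz;
}

// Read up to nbytes from a source buffer, rejecting oversized requests up
// front. Returns 0 on success, ENOMEM if the request is refused.
static int read_bytes(const uint8_t * src, size_t src_len, size_t nbytes,
                      std::vector<uint8_t> & out) {
    if (alloc_would_overflow(1, nbytes, /* object header, illustrative */ 32))
        return ENOMEM;
    size_t n = nbytes < src_len ? nbytes : src_len; // like a short fread
    out.assign(src, src + n);
    return 0;
}
```

The key point is ordering: the guard runs before any allocation, so a pathological `nbytes` never reaches the allocator.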
@@ -340,7 +340,7 @@ static void mpz_dealloc(void *ptr, size_t size) {
 
 void mpz::allocate(size_t s) {
     m_size = s;
-    m_digits = static_cast<mpn_digit*>(mpz_alloc(s * sizeof(mpn_digit)));
+    m_digits = static_cast<mpn_digit*>(mpz_alloc(lean_usize_mul_checked(s, sizeof(mpn_digit))));
 }
 
 void mpz::init() {
@@ -409,8 +409,8 @@ void mpz::init_int64(int64 v) {
 void mpz::init_mpz(mpz const & v) {
     m_sign = v.m_sign;
     m_size = v.m_size;
-    m_digits = static_cast<mpn_digit*>(mpz_alloc(m_size * sizeof(mpn_digit)));
-    memcpy(m_digits, v.m_digits, m_size * sizeof(mpn_digit));
+    m_digits = static_cast<mpn_digit*>(mpz_alloc(lean_usize_mul_checked(m_size, sizeof(mpn_digit))));
+    memcpy(m_digits, v.m_digits, lean_usize_mul_checked(m_size, sizeof(mpn_digit)));
 }
 
 mpz::mpz() {
@@ -99,6 +99,10 @@ extern "C" LEAN_EXPORT void lean_internal_panic_rc_overflow() {
     lean_internal_panic("reference counter overflowed");
 }
 
+extern "C" LEAN_EXPORT void lean_internal_panic_overflow() {
+    lean_internal_panic("integer overflow in runtime computation");
+}
+
 bool g_exit_on_panic = false;
 bool g_panic_messages = true;
@@ -442,6 +442,13 @@ static obj_res spawn(string_ref const & proc_name, array_ref<string_ref> const &
     auto stdout_pipe = setup_stdio(stdout_mode);
     auto stderr_pipe = setup_stdio(stderr_mode);
 
+    // It is crucial to not allocate between `fork` and `execvp` for ASAN to work.
+    buffer<char *> pargs;
+    pargs.push_back(strdup(proc_name.data()));
+    for (auto & arg : args)
+        pargs.push_back(strdup(arg.data()));
+    pargs.push_back(NULL);
+
     int pid = fork();
 
     if (pid == 0) {
@@ -495,12 +502,6 @@ static obj_res spawn(string_ref const & proc_name, array_ref<string_ref> const &
         lean_always_assert(setsid() >= 0);
     }
 
-    buffer<char *> pargs;
-    pargs.push_back(strdup(proc_name.data()));
-    for (auto & arg : args)
-        pargs.push_back(strdup(arg.data()));
-    pargs.push_back(NULL);
-
     if (execvp(pargs[0], pargs.data()) < 0) {
         std::cerr << "could not execute external process '" << pargs[0] << "'" << std::endl;
         exit(-1);
@@ -509,6 +510,12 @@ static obj_res spawn(string_ref const & proc_name, array_ref<string_ref> const &
         throw errno;
     }
 
+    for (char* parg : pargs) {
+        if (parg != NULL) {
+            free(parg);
+        }
+    }
+
     object * parent_stdin = box(0);
     object * parent_stdout = box(0);
     object * parent_stderr = box(0);
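The spawn change above hoists the `strdup`-built argv out of the child branch so that no allocation happens between `fork` and `execvp` (as the added comment notes, this matters for ASAN), and frees the strings in the parent afterwards. A sketch of just that argv lifecycle, with the `fork`/`execvp` calls omitted and the helper names invented for illustration:

```cpp
#include <cstring>
#include <cstdlib>
#include <vector>
#include <string>

// Build a NULL-terminated argv array of strdup'd strings. In the real code
// this happens *before* fork, so the child can call execvp without
// allocating; allocating between fork and exec can trip sanitizers or
// deadlock on allocator locks held at fork time.
static std::vector<char *> build_argv(const std::string & prog,
                                      const std::vector<std::string> & args) {
    std::vector<char *> pargs;
    pargs.push_back(strdup(prog.c_str()));
    for (const auto & arg : args)
        pargs.push_back(strdup(arg.c_str()));
    pargs.push_back(nullptr); // execvp expects a NULL-terminated array
    return pargs;
}

// In the parent, after fork returns, the copies are no longer needed and
// must be released to avoid leaking once per spawned process.
static void free_argv(std::vector<char *> & pargs) {
    for (char * p : pargs)
        if (p != nullptr)
            free(p);
}
```

Before this change the argv was built inside the child branch only, so the parent never owned the strings; moving construction before `fork` is what makes the explicit free loop in the parent necessary.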
@@ -182,6 +182,9 @@ extern "C" LEAN_EXPORT lean_obj_res lean_uv_tcp_send(b_obj_arg socket, obj_arg d
     }
 
     // Allocate buffer array for uv_write
+    if (lean_usize_mul_would_overflow(array_len, sizeof(uv_buf_t))) {
+        return lean_io_result_mk_error(decode_io_error(ENOMEM, nullptr));
+    }
     uv_buf_t* bufs = (uv_buf_t*)malloc(array_len * sizeof(uv_buf_t));
 
     for (size_t i = 0; i < array_len; i++) {
@@ -140,6 +140,9 @@ extern "C" LEAN_EXPORT lean_obj_res lean_uv_udp_send(b_obj_arg socket, obj_arg d
         return lean_io_result_mk_ok(promise);
     }
 
+    if (lean_usize_mul_would_overflow(array_len, sizeof(uv_buf_t))) {
+        return lean_io_result_mk_error(decode_io_error(ENOMEM, nullptr));
+    }
     uv_buf_t* bufs = (uv_buf_t*)malloc(array_len * sizeof(uv_buf_t));
 
     for (size_t i = 0; i < array_len; i++) {
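The two libuv hunks guard the `malloc(array_len * sizeof(uv_buf_t))` with `lean_usize_mul_would_overflow`, turning a wrap-around multiply into an `ENOMEM`-style IO error. A sketch of that guard-then-allocate pattern, assuming a GCC/Clang intrinsic as the overflow check (the predicate name comes from the diff; this definition and `checked_buf_alloc` are illustrative stand-ins):

```cpp
#include <cstddef>
#include <cstdlib>

// Would a * b overflow size_t? Stand-in for `lean_usize_mul_would_overflow`.
static bool usize_mul_would_overflow(size_t a, size_t b) {
    size_t r;
    return __builtin_mul_overflow(a, b, &r);
}

// Allocate count * elem_sz bytes, refusing requests whose size computation
// would wrap. The caller maps nullptr to an ENOMEM-style IO error, as the
// uv send paths above do.
static void * checked_buf_alloc(size_t count, size_t elem_sz) {
    if (usize_mul_would_overflow(count, elem_sz))
        return nullptr;
    return malloc(count * elem_sz);
}
```

Unlike the compactor's checked helpers, which panic, these IO paths report the overflow as a recoverable error to the caller.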
BIN stage0/src/library/ir_interpreter.cpp (generated): Binary file not shown.
BIN stage0/stdlib/Init/Data/Char.c (generated): Binary file not shown.
BIN stage0/stdlib/Init/Data/Char/Ordinal.c (generated, new file): Binary file not shown.
BIN stage0/stdlib/Init/Data/Fin.c (generated): Binary file not shown.
BIN stage0/stdlib/Init/Data/Fin/OverflowAware.c (generated, new file): Binary file not shown.
BIN stage0/stdlib/Init/Data/Option.c (generated): Binary file not shown.
BIN stage0/stdlib/Init/Data/Option/Function.c (generated, new file): Binary file not shown.
BIN stage0/stdlib/Init/Data/Range/Polymorphic.c (generated): Binary file not shown.
BIN stage0/stdlib/Init/Data/Range/Polymorphic/Char.c (generated, new file): Binary file not shown.
Some files were not shown because too many files have changed in this diff.