Compare commits


28 Commits

Author SHA1 Message Date
Joachim Breitner
9832551928 Different delaboration in lazy context? oh well, fine with me 2025-06-30 17:29:45 +02:00
Joachim Breitner
d1f9f3cad5 Making the error message lazy seems to help 2025-06-30 17:12:54 +02:00
Joachim Breitner
b6abfa0741 Revert "Not even this helps?"
This reverts commit afaaa6ef4b.
2025-06-30 14:17:59 +02:00
Joachim Breitner
afaaa6ef4b Not even this helps? 2025-06-30 14:17:55 +02:00
Joachim Breitner
75cdb4aac4 Reduce diff 2025-06-30 14:00:02 +02:00
Joachim Breitner
41b01872f6 feat: prettier expected type mismatch error message
This PR improves the “expected type mismatch” error message by
omitting the types' types when they are defeq, and putting the types on
separate lines when they are not.

I found it rather tedious to parse the error message when the expected
type is long, because I had to find the `:` in the middle of a large
expression somewhere. Also, when both are of sort `Prop` or `Type` it
doesn't add much value to print the sort (and it’s only one hover away
anyway).
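For reference, a minimal term that triggers such a mismatch (the message layout, not this example, is what the change above reformats):
```lean
-- A `Nat` is supplied where a `String` is expected, producing the
-- "type mismatch" error whose rendering this change improves.
example : String := (3 : Nat)
```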
2025-06-30 13:55:30 +02:00
David Thrane Christiansen
9bbd2e64aa doc: add missing docstring for ToFormat.toFormat (#9093)
This PR adds a missing docstring for `ToFormat.toFormat`.
2025-06-30 06:59:12 +00:00
Kim Morrison
ae89b7ed43 feat: further release automation (#9092)
This PR further updates release automation. The per-repository update
script `script/release_steps.py` now actually performs the tests,
rather than outputting a script for the release manager to run line by
line. It's been tested on `v4.21.0` (i.e. the easy case of a stable
release), and we'll debug its behaviour on `v4.22.0-rc1` tonight.
2025-06-30 05:44:10 +00:00
David Thrane Christiansen
ede8a7e494 fix: error explanation needs updating (#9091)
This PR updates an error explanation to match the actual error.
2025-06-30 05:10:04 +00:00
Kyle Miller
044bfdb098 feat: eliminate letFun support, deprecate let_fun syntax (#9086)
This PR deprecates `let_fun` syntax in favor of `have` and removes
`letFun` support from WHNF and `simp`.
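A sketch of the migration this deprecation implies, using standard `have` term syntax in place of `let_fun`:
```lean
-- `let_fun x := e; b` is deprecated; `have` produces the same kind of
-- nondependent binding and is the preferred spelling.
example : Nat :=
  have x := 2
  x + x
```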
2025-06-30 02:10:18 +00:00
Mac Malone
5049a4893e refactor: lake: fix local artifact cache bugs, opt-in, & related cleanup (#9068)
This PR fixes some bugs with the local Lake artifact cache and cleans up
the surrounding API. It also adds the ability to opt-in to the cache on
packages without `enableArtifactCache` set using the
`LAKE_ARTIFACT_CACHE` environment variable.

Bug-wise, this fixes an issue where cached executables did not have the
right permissions to run on Unix systems, and a bug where cached
artifacts would not be invalidated on changes. Lake also now writes a
trace file to the local build directory if there is none when fetching
an artifact from the cache. This trace has a new `synthetic` field set
to `true` to distinguish it from traces produced by full builds.
2025-06-30 01:02:55 +00:00
Lean stage0 autoupdater
d6c5c8c880 chore: update stage0 2025-06-30 01:20:30 +00:00
Kyle Miller
32894e7349 feat: remove irreducible from letFun (#9087)
This PR removes the `irreducible` attribute from `letFun`, which is one
step toward removing special `letFun` support; part of #9086.

Removing the attribute seems to break some `module` tests in stage2.
2025-06-30 00:04:59 +00:00
Sebastian Ullrich
d53d4722cc chore: compile stage 0 against correct version string (#9085)
This ensures the correct version string is embedded into shipped
.oleans, though only the githash is relevant for the version check.
2025-06-29 20:52:46 +00:00
Kyle Miller
cb3174b1c6 feat: hovers when pp.oneline is true (#7954)
This PR improves `pp.oneline` so that it now preserves tags when
truncating formatted syntax to a single line. Note that the `[...]`
continuation does not yet have any functionality to enable seeing the
untruncated syntax. Closes #3681.
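A minimal way to observe the option (the exact truncated output is a sketch, not verbatim):
```lean
-- With `pp.oneline`, long formatted output is cut to one line ending in
-- `[...]`; after this change the pretty-printer tags survive truncation,
-- so hovers keep working on the visible part.
set_option pp.oneline true in
#check fun (x : Nat) => (x + 1, x + 2, x + 3)
```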
2025-06-29 20:06:24 +00:00
Kyle Miller
68c006a95b feat: transform nondependent lets into haves in declarations and equation lemmas (#8373)
This PR enables transforming nondependent `let`s into `have`s in a
number of contexts: the bodies of nonrecursive definitions, equation
lemmas, smart unfolding definitions, and types of theorems. A motivation
for this change is that when zeta reduction is disabled, `simp` can only
effectively rewrite `have` expressions (e.g. `split` uses `simp` with
zeta reduction disabled), and so we cache the nondependence calculations
by transforming `let`s to `have`s. The transformation can be disabled
using `set_option cleanup.letToHave false`.

Uses `Meta.letToHave`, introduced in #8954.
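A sketch of the situation the option controls; `double` is a hypothetical declaration, and `m` is nondependent because the body's type never mentions it:
```lean
-- With the cleanup enabled (the default), this nondependent `let` may be
-- turned into a `have` in the elaborated definition; the option below
-- disables the transformation for this declaration.
set_option cleanup.letToHave false in
def double (n : Nat) : Nat :=
  let m := n + 1
  m + m
```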
2025-06-29 19:45:45 +00:00
Kyle Miller
44c8b0df85 feat: warn.sorry option (#8662)
This PR adds a `warn.sorry` option (default true) that logs the
"declaration uses 'sorry'" warning when declarations contain `sorryAx`.
When false, the warning is not logged.

Closes #8611 (assuming that one would set `warn.sorry` as an extra flag
when building).

Other change: Uses `warn.sorry` when creating auxiliary declarations in
`structure` elaborator, to suppress irrelevant 'sorry' warnings.

We could include the sorries themselves in the message if they are
labeled, letting users "go to definition" to see where the sorries are
coming from.

In an earlier version, this PR added additional information to the
warning when it is a synthetic sorry, since these can be caused by
elaboration bugs and can also be caused by elaboration failures in
previous declarations. This idea needs some more work, so it's not
included.
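A sketch of the new option in use (`pending` is a hypothetical theorem name):
```lean
-- With `warn.sorry` set to false, the "declaration uses 'sorry'"
-- warning is not logged for this declaration.
set_option warn.sorry false in
theorem pending : 1 + 1 = 2 := by sorry
```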
2025-06-29 19:31:17 +00:00
Cameron Zwarich
85c45c409e chore: move lowerType to ToIRType and rename it (#9083) 2025-06-29 19:16:00 +00:00
Mac Malone
e0354cd856 fix: lake: module builds not appearing in job monitor (#9081)
This PR fixes a bug with Lake where the job monitor would sit on a
top-level build (e.g., `mathlib/Mathlib:default`) instead of reporting
module build progress.

The issue was actually simpler than it initially appeared. The wrong
portion of the module build was being registered with the job monitor.
Moving it to the right place fixes it; no job priorities were necessary.
2025-06-29 18:39:06 +00:00
Cameron Zwarich
5d8cd35471 chore: rename Lean.Compiler.IR.CtorLayout to ToIRType (#9082) 2025-06-29 18:36:55 +00:00
Cameron Zwarich
0b738e07b4 chore: move more functions to CtorLayout (#9080) 2025-06-29 17:31:41 +00:00
Sebastian Ullrich
f475d5a428 chore: module-ize Init.lean 2025-06-29 16:52:13 +02:00
Sebastian Ullrich
5aa1950c3f chore: update stage0 2025-06-29 16:15:49 +02:00
Sebastian Ullrich
8085d3c930 chore: allow use of the module system in all of core (#9078) 2025-06-29 13:49:27 +00:00
Kim Morrison
a35425b192 feat: support for ReflCmp in grind (#9073)
This PR copies #9069 to handle `ReflCmp` the same way; we need to call
this in propagateUp rather than propagateDown.
2025-06-29 11:36:39 +00:00
Leonardo de Moura
8b1d2fc2d5 feat: OfSemiring.toQ unexpander (#9076)
This PR adds an unexpander for `OfSemiring.toQ`. This is an auxiliary
function used by the `ring` module in `grind`, but we want to reduce the
clutter in the diagnostic information produced by `grind`. Example:
```
example [CommSemiring α] [AddRightCancel α] [IsCharP α 0] (x y : α)
    : x^2*y = 1 → x*y^2 = y → x + y = 2 → False := by
  grind
```
produces
```
  [ring] Ring `Ring.OfSemiring.Q α` ▼
    [basis] Basis ▼
      [_] ↑x + ↑y + -2 = 0
      [_] ↑y + -1 = 0
```
2025-06-29 11:22:24 +00:00
Kim Morrison
98e868e3d2 feat: BEq instances for ByteArray/FloatArray (#9075)
This PR adds `BEq` instances for `ByteArray` and `FloatArray` (also a
`DecidableEq` instance for `ByteArray`).
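A sketch of the new instances in use (element-wise comparison of equal contents):
```lean
#eval ByteArray.mk #[1, 2, 3] == ByteArray.mk #[1, 2, 3]  -- expected: true
#eval FloatArray.mk #[1.0, 2.0] == FloatArray.mk #[1.0, 2.0]
```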
2025-06-29 11:12:48 +00:00
Leonardo de Moura
b95b0069e7 feat: use comm ring module to normalize nonlinear polynomials in grind cutsat (#9074)
This PR uses the commutative ring module to normalize nonlinear
polynomials in `grind cutsat`. Examples:
```lean
example (a b : Nat) (h₁ : a + 1 ≠ a * b * a) (h₂ : a * a * b ≤ a + 1) : b * a^2 < a + 1 := by 
  grind

example (a b c : Int) (h₁ : a + 1 + c = b * a) (h₂ : c + 2*b*a = 0) : 6 * a * b - 2 * a ≤ 2 := by 
  grind
```
2025-06-29 11:09:29 +00:00
464 changed files with 2447 additions and 1107 deletions

.gitignore vendored

@@ -31,3 +31,4 @@ fwOut.txt
wdErr.txt
wdIn.txt
wdOut.txt
downstream_releases/


@@ -16,7 +16,7 @@ foreach(var ${vars})
    list(APPEND STAGE1_ARGS "-D${CMAKE_MATCH_1}=${${var}}")
  elseif("${currentHelpString}" MATCHES "No help, variable specified on the command line." OR "${currentHelpString}" STREQUAL "")
    list(APPEND CL_ARGS "-D${var}=${${var}}")
    if("${var}" MATCHES "USE_GMP|CHECK_OLEAN_VERSION")
    if("${var}" MATCHES "USE_GMP|CHECK_OLEAN_VERSION|LEAN_VERSION_.*|LEAN_SPECIAL_VERSION_DESC")
      # must forward options that generate incompatible .olean format
      list(APPEND STAGE0_ARGS "-D${var}=${${var}}")
  elseif("${var}" MATCHES "LLVM*|PKG_CONFIG|USE_LAKE|USE_MIMALLOC")


@@ -3,6 +3,32 @@ import sys
import subprocess
import requests


def check_gh_auth():
    """Check if GitHub CLI is properly authenticated."""
    try:
        result = subprocess.run(["gh", "auth", "status"], capture_output=True, text=True)
        if result.returncode != 0:
            return False, result.stderr
        return True, None
    except FileNotFoundError:
        return False, "GitHub CLI (gh) is not installed. Please install it first."
    except Exception as e:
        return False, f"Error checking authentication: {e}"


def handle_gh_error(error_output):
    """Handle GitHub CLI errors and provide helpful messages."""
    if "Not Found (HTTP 404)" in error_output:
        return "Repository not found or access denied. Please check:\n" \
               "1. The repository name is correct\n" \
               "2. You have access to the repository\n" \
               "3. Your GitHub CLI authentication is valid"
    elif "Bad credentials" in error_output or "invalid" in error_output.lower():
        return "Authentication failed. Please run 'gh auth login' to re-authenticate."
    elif "rate limit" in error_output.lower():
        return "GitHub API rate limit exceeded. Please try again later."
    else:
        return f"GitHub API error: {error_output}"


def main():
    if len(sys.argv) != 4:
        print("Usage: ./push_repo_release_tag.py <repo> <branch> <version_tag>")
@@ -14,6 +40,13 @@ def main():
        print(f"Error: Branch '{branch}' is not 'master' or 'main'.")
        sys.exit(1)

    # Check GitHub CLI authentication first
    auth_ok, auth_error = check_gh_auth()
    if not auth_ok:
        print(f"Authentication error: {auth_error}")
        print("\nTo fix this, run: gh auth login")
        sys.exit(1)

    # Get the `lean-toolchain` file content
    lean_toolchain_url = f"https://raw.githubusercontent.com/{repo}/{branch}/lean-toolchain"
    try:
@@ -43,12 +76,23 @@ def main():
            for tag in existing_tags:
                print(tag.replace("refs/tags/", ""))
            sys.exit(1)
        elif list_tags_output.returncode != 0:
            # Handle API errors when listing tags
            error_msg = handle_gh_error(list_tags_output.stderr)
            print(f"Error checking existing tags: {error_msg}")
            sys.exit(1)

        # Get the SHA of the branch
        get_sha_cmd = [
            "gh", "api", f"repos/{repo}/git/ref/heads/{branch}", "--jq", ".object.sha"
        ]
        sha_result = subprocess.run(get_sha_cmd, capture_output=True, text=True, check=True)
        sha_result = subprocess.run(get_sha_cmd, capture_output=True, text=True)
        if sha_result.returncode != 0:
            error_msg = handle_gh_error(sha_result.stderr)
            print(f"Error getting branch SHA: {error_msg}")
            sys.exit(1)
        sha = sha_result.stdout.strip()

        # Create the tag
@@ -58,11 +102,20 @@ def main():
            "-F", f"ref=refs/tags/{version_tag}",
            "-F", f"sha={sha}"
        ]
        subprocess.run(create_tag_cmd, capture_output=True, text=True, check=True)
        create_result = subprocess.run(create_tag_cmd, capture_output=True, text=True)
        if create_result.returncode != 0:
            error_msg = handle_gh_error(create_result.stderr)
            print(f"Error creating tag: {error_msg}")
            sys.exit(1)
        print(f"Successfully created and pushed tag '{version_tag}' to {repo}.")
    except subprocess.CalledProcessError as e:
        print(f"Error while creating/pushing tag: {e.stderr.strip() if e.stderr else e}")
        error_msg = handle_gh_error(e.stderr.strip() if e.stderr else str(e))
        print(f"Error while creating/pushing tag: {error_msg}")
        sys.exit(1)
    except Exception as e:
        print(f"Unexpected error: {e}")
        sys.exit(1)

if __name__ == "__main__":
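The core change in this diff replaces `check=True` with explicit `returncode` handling, so `gh` failures can be turned into friendly messages instead of uncaught exceptions. A minimal self-contained sketch of that pattern, with a `python3 -c` stand-in for the real `gh` command:

```python
import subprocess

# Run without check=True, then inspect returncode ourselves, mirroring the
# error-handling pattern introduced above (the failing command is a stand-in).
result = subprocess.run(
    ["python3", "-c", "import sys; sys.stderr.write('boom'); sys.exit(3)"],
    capture_output=True, text=True,
)
if result.returncode != 0:
    print(f"command failed (exit {result.returncode}): {result.stderr}")
```

With `check=True` the same failure would raise `subprocess.CalledProcessError` before any custom message could be printed.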


@@ -262,6 +262,61 @@ def pr_exists_with_title(repo_url, title, github_token):
            return pr['number'], pr['html_url']
    return None


def check_proofwidgets4_release(repo_url, target_toolchain, github_token):
    """Check if ProofWidgets4 has a release tag that uses the target toolchain."""
    api_base = repo_url.replace("https://github.com/", "https://api.github.com/repos/")
    headers = {'Authorization': f'token {github_token}'} if github_token else {}

    # Get all tags matching v0.0.* pattern
    response = requests.get(f"{api_base}/git/matching-refs/tags/v0.0.", headers=headers)
    if response.status_code != 200:
        print(f"  ❌ Could not fetch ProofWidgets4 tags")
        return False

    tags = response.json()
    if not tags:
        print(f"  ❌ No v0.0.* tags found for ProofWidgets4")
        return False

    # Extract tag names and sort by version number (descending)
    tag_names = []
    for tag in tags:
        ref = tag['ref']
        if ref.startswith('refs/tags/v0.0.'):
            tag_name = ref.replace('refs/tags/', '')
            try:
                # Extract the number after v0.0.
                version_num = int(tag_name.split('.')[-1])
                tag_names.append((version_num, tag_name))
            except (ValueError, IndexError):
                continue

    if not tag_names:
        print(f"  ❌ No valid v0.0.* tags found for ProofWidgets4")
        return False

    # Sort by version number (descending) and take the most recent 10
    tag_names.sort(reverse=True)
    recent_tags = tag_names[:10]

    # Check each recent tag to see if it uses the target toolchain
    for version_num, tag_name in recent_tags:
        toolchain_content = get_branch_content(repo_url, tag_name, "lean-toolchain", github_token)
        if toolchain_content is None:
            continue
        if is_version_gte(toolchain_content.strip(), target_toolchain):
            print(f"  ✅ Found release {tag_name} using compatible toolchain (>= {target_toolchain})")
            return True

    # If we get here, no recent release uses the target toolchain
    # Find the highest version number to suggest the next one
    highest_version = max(version_num for version_num, _ in recent_tags)
    next_version = highest_version + 1
    print(f"  ❌ No recent ProofWidgets4 release uses toolchain >= {target_toolchain}")
    print(f"  You will need to create and push a tag v0.0.{next_version}")
    return False


def main():
    parser = argparse.ArgumentParser(description="Check release status of Lean4 repositories")
    parser.add_argument("toolchain", help="The toolchain version to check (e.g., v4.6.0)")
@@ -386,6 +441,12 @@ def main():
            continue
        print(f"  ✅ On compatible toolchain (>= {toolchain})")

        # Special handling for ProofWidgets4
        if name == "ProofWidgets4":
            if not check_proofwidgets4_release(url, toolchain, github_token):
                repo_status[name] = False
                continue

        if check_tag:
            tag_exists_initially = tag_exists(url, toolchain, github_token)
            if not tag_exists_initially:
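The tag-selection logic above can be exercised offline; a sketch with hypothetical tag refs, mirroring how `check_proofwidgets4_release` extracts and orders `v0.0.*` tags (newest first) before probing their toolchains:

```python
# Hypothetical tag refs; non-matching refs are skipped, the rest are sorted
# descending by the numeric component after "v0.0.".
refs = ["refs/tags/v0.0.39", "refs/tags/v0.0.41", "refs/tags/v0.0.4", "refs/tags/other"]
tag_names = []
for ref in refs:
    if ref.startswith("refs/tags/v0.0."):
        tag_name = ref.replace("refs/tags/", "")
        try:
            tag_names.append((int(tag_name.split(".")[-1]), tag_name))
        except ValueError:
            continue
tag_names.sort(reverse=True)
print([name for _, name in tag_names])  # → ['v0.0.41', 'v0.0.39', 'v0.0.4']
```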


@@ -1,10 +1,10 @@
#!/usr/bin/env python3
"""
Generate release steps script for Lean4 repositories.
Execute release steps for Lean4 repositories.

This script helps automate the release process for Lean4 and its dependent repositories
by generating step-by-step instructions for updating toolchains, creating tags,
by actually executing the step-by-step instructions for updating toolchains, creating tags,
and managing branches.

Usage:
@@ -12,11 +12,11 @@ Usage:

Arguments:
    version: The version to set in the lean-toolchain file (e.g., v4.6.0)
    repo: A substring of the repository name as specified in release_repos.yml
    repo: The repository name as specified in release_repos.yml

Example:
    python3 release_steps.py v4.6.0 mathlib
    python3 release_steps.py v4.6.0 batt
    python3 release_steps.py v4.6.0 mathlib4
    python3 release_steps.py v4.6.0 batteries

The script reads repository configurations from release_repos.yml in the same directory.
Each repository may have specific requirements for:
@@ -32,23 +32,97 @@ import yaml
import os
import sys
import re
import subprocess
import shutil
import json
from pathlib import Path


# Color functions for terminal output
def blue(text):
    """Blue text for 'I'm doing something' messages."""
    return f"\033[94m{text}\033[0m"


def green(text):
    """Green text for 'successful step' messages."""
    return f"\033[92m{text}\033[0m"


def red(text):
    """Red text for 'this looks bad' messages."""
    return f"\033[91m{text}\033[0m"


def yellow(text):
    """Yellow text for warnings."""
    return f"\033[93m{text}\033[0m"


def run_command(cmd, cwd=None, check=True, stream_output=False):
    """Run a shell command and return the result."""
    print(blue(f"Running: {cmd}"))
    try:
        if stream_output:
            # Stream output in real-time for long-running commands
            result = subprocess.run(cmd, shell=True, cwd=cwd, check=check, text=True)
            return result
        else:
            # Capture output for commands where we need to process the result
            result = subprocess.run(cmd, shell=True, cwd=cwd, check=check,
                                    capture_output=True, text=True)
            if result.stdout:
                # Command output in plain white (default terminal color)
                print(result.stdout)
            return result
    except subprocess.CalledProcessError as e:
        print(red(f"Error running command: {cmd}"))
        print(red(f"Exit code: {e.returncode}"))
        if not stream_output:
            print(f"Stdout: {e.stdout}")
            print(f"Stderr: {e.stderr}")
        raise


def load_repos_config(file_path):
    with open(file_path, "r") as f:
        return yaml.safe_load(f)["repositories"]


def find_repo(repo_substring, config):
    pattern = re.compile(re.escape(repo_substring), re.IGNORECASE)
    matching_repos = [r for r in config if pattern.search(r["name"])]
def find_repo(repo_name, config):
    matching_repos = [r for r in config if r["name"] == repo_name]
    if not matching_repos:
        print(f"Error: No repository matching '{repo_substring}' found in configuration.")
        sys.exit(1)
    if len(matching_repos) > 1:
        print(f"Error: Multiple repositories matching '{repo_substring}' found in configuration: {', '.join(r['name'] for r in matching_repos)}")
        print(red(f"Error: No repository named '{repo_name}' found in configuration."))
        available_repos = [r["name"] for r in config]
        print(yellow(f"Available repositories: {', '.join(available_repos)}"))
        sys.exit(1)
    return matching_repos[0]
def generate_script(repo, version, config):
def setup_downstream_releases_dir():
    """Create the downstream_releases directory if it doesn't exist."""
    downstream_dir = Path("downstream_releases")
    if not downstream_dir.exists():
        print(blue(f"Creating {downstream_dir} directory..."))
        downstream_dir.mkdir()
        print(green(f"Created {downstream_dir} directory"))
    return downstream_dir


def clone_or_update_repo(repo_url, repo_dir, downstream_dir):
    """Clone the repository if it doesn't exist, or update it if it does."""
    repo_path = downstream_dir / repo_dir
    if repo_path.exists():
        print(blue(f"Repository {repo_dir} already exists, updating..."))
        run_command("git fetch", cwd=repo_path)
        print(green(f"Updated repository {repo_dir}"))
    else:
        print(blue(f"Cloning {repo_url} to {repo_path}..."))
        run_command(f"git clone {repo_url}", cwd=downstream_dir)
        print(green(f"Cloned repository {repo_dir}"))
    return repo_path


def get_remotes_for_repo(repo_name):
    """Get the appropriate remotes for bump and nightly-testing branches based on repository."""
    if repo_name == "mathlib4":
        return "nightly-testing", "nightly-testing"
    else:
        return "origin", "origin"


def execute_release_steps(repo, version, config):
    repo_config = find_repo(repo, config)
    repo_name = repo_config['name']
    repo_url = repo_config['url']
@@ -59,92 +133,313 @@ def generate_script(repo, version, config):
    requires_tagging = repo_config.get("toolchain-tag", True)
    has_stable_branch = repo_config.get("stable-branch", True)

    script_lines = [
        f"cd {repo_dir}",
        "git fetch",
        f"git checkout {default_branch} && git pull",
        f"git checkout -b bump_to_{version}",
        f"echo leanprover/lean4:{version} > lean-toolchain",
    ]
    # Setup downstream releases directory
    downstream_dir = setup_downstream_releases_dir()

    # Clone or update the repository
    repo_path = clone_or_update_repo(repo_url, repo_dir, downstream_dir)

    # Special remote setup for mathlib4
    if repo_name == "mathlib4":
        print(blue("Setting up special remotes for mathlib4..."))
        try:
            # Check if nightly-testing remote already exists
            result = run_command("git remote get-url nightly-testing", cwd=repo_path, check=False)
            if result.returncode != 0:
                # Add the nightly-testing remote
                run_command("git remote add nightly-testing https://github.com/leanprover-community/mathlib4-nightly-testing.git", cwd=repo_path)
                print(green("Added nightly-testing remote"))
            else:
                print(green("nightly-testing remote already exists"))

            # Fetch from the nightly-testing remote
            run_command("git fetch nightly-testing", cwd=repo_path)
            print(green("Fetched from nightly-testing remote"))
        except subprocess.CalledProcessError as e:
            print(red(f"Error setting up mathlib4 remotes: {e}"))
            print(yellow("Continuing with default remote setup..."))

    print(blue(f"\n=== Executing release steps for {repo_name} ==="))

    # Execute the release steps
    run_command(f"git checkout {default_branch} && git pull", cwd=repo_path)

    # Special rc1 safety check for batteries and mathlib4 (before creating any branches)
    if re.search(r'rc\d+$', version) and repo_name in ["batteries", "mathlib4"] and version.endswith('-rc1'):
        print(blue("This repo has nightly-testing infrastructure"))
        print(blue(f"Checking if nightly-testing can be safely merged into bump/{version.split('-rc')[0]}..."))

        # Get the base version (e.g., v4.6.0 from v4.6.0-rc1)
        base_version = version.split('-rc')[0]
        bump_branch = f"bump/{base_version}"

        # Determine which remote to use for bump and nightly-testing branches
        bump_remote, nightly_remote = get_remotes_for_repo(repo_name)

        try:
            # Fetch latest changes from the appropriate remote
            run_command(f"git fetch {bump_remote}", cwd=repo_path)

            # Check if the bump branch exists
            result = run_command(f"git show-ref --verify --quiet refs/remotes/{bump_remote}/{bump_branch}", cwd=repo_path, check=False)
            if result.returncode != 0:
                print(red(f"Bump branch {bump_remote}/{bump_branch} does not exist. Cannot perform safety check."))
                print(yellow("Please ensure the bump branch exists before proceeding."))
                return

            # Create a temporary branch for testing the merge
            temp_branch = f"temp-merge-test-{base_version}"
            run_command(f"git checkout -b {temp_branch} {bump_remote}/{bump_branch}", cwd=repo_path)

            # Try to merge nightly-testing
            try:
                run_command(f"git merge {nightly_remote}/nightly-testing", cwd=repo_path)

                # Check what files have changed compared to the bump branch
                changed_files = run_command(f"git diff --name-only {bump_remote}/{bump_branch}..HEAD", cwd=repo_path)

                # Filter out allowed changes
                allowed_patterns = ['lean-toolchain', 'lake-manifest.json']
                problematic_files = []
                for file in changed_files.stdout.strip().split('\n'):
                    if file.strip():  # Skip empty lines
                        is_allowed = any(pattern in file for pattern in allowed_patterns)
                        if not is_allowed:
                            problematic_files.append(file)

                # Clean up temporary branch and return to default branch
                run_command(f"git checkout {default_branch}", cwd=repo_path)
                run_command(f"git branch -D {temp_branch}", cwd=repo_path)

                if problematic_files:
                    print(red("❌ Safety check failed!"))
                    print(red(f"Merging nightly-testing into {bump_branch} would result in changes to:"))
                    for file in problematic_files:
                        print(red(f"  - {file}"))
                    print(yellow("\nYou need to make a PR merging the changes from nightly-testing into the bump branch first."))
                    print(yellow(f"Create a PR from nightly-testing targeting {bump_branch} to resolve these conflicts."))
                    return
                else:
                    print(green("✅ Safety check passed - only lean-toolchain and/or lake-manifest.json would change"))
            except subprocess.CalledProcessError:
                # Clean up temporary branch on merge failure
                run_command(f"git checkout {default_branch}", cwd=repo_path)
                run_command(f"git branch -D {temp_branch}", cwd=repo_path)
                print(red("❌ Safety check failed!"))
                print(red(f"Merging nightly-testing into {bump_branch} would result in merge conflicts."))
                print(yellow("\nYou need to make a PR merging the changes from nightly-testing into the bump branch first."))
                print(yellow(f"Create a PR from nightly-testing targeting {bump_branch} to resolve these conflicts."))
                return
        except subprocess.CalledProcessError as e:
            # Ensure we're back on the default branch even if setup failed
            try:
                run_command(f"git checkout {default_branch}", cwd=repo_path)
            except subprocess.CalledProcessError:
                print(red(f"Cannot return to {default_branch} branch. Repository is in an inconsistent state."))
                print(red("Please manually check the repository state and fix any issues."))
                return
            print(red(f"Error during safety check: {e}"))
            print(yellow("Skipping safety check and proceeding with normal merge..."))

    # Check if the branch already exists
    branch_name = f"bump_to_{version}"
    try:
        # Check if branch exists locally
        result = run_command(f"git show-ref --verify --quiet refs/heads/{branch_name}", cwd=repo_path, check=False)
        if result.returncode == 0:
            print(blue(f"Branch {branch_name} already exists, checking it out..."))
            run_command(f"git checkout {branch_name}", cwd=repo_path)
            print(green(f"Checked out existing branch {branch_name}"))
        else:
            print(blue(f"Creating new branch {branch_name}..."))
            run_command(f"git checkout -b {branch_name}", cwd=repo_path)
            print(green(f"Created new branch {branch_name}"))
    except subprocess.CalledProcessError:
        print(blue(f"Creating new branch {branch_name}..."))
        run_command(f"git checkout -b {branch_name}", cwd=repo_path)
        print(green(f"Created new branch {branch_name}"))

    # Update lean-toolchain
    print(blue("Updating lean-toolchain file..."))
    toolchain_file = repo_path / "lean-toolchain"
    with open(toolchain_file, "w") as f:
        f.write(f"leanprover/lean4:{version}\n")
    print(green(f"Updated lean-toolchain to leanprover/lean4:{version}"))
    # Special cases for specific repositories
    if repo_name == "repl":
        script_lines.extend([
            "lake update",
            "cd test/Mathlib",
            f"perl -pi -e 's/rev = \"v\\d+\\.\\d+\\.\\d+(-rc\\d+)?\"/rev = \"{version}\"/g' lakefile.toml",
            f"echo leanprover/lean4:{version} > lean-toolchain",
            "lake update",
            "cd ../..",
            "./test.sh"
        ])
        run_command("lake update", cwd=repo_path, stream_output=True)
        mathlib_test_dir = repo_path / "test" / "Mathlib"
        run_command(f'perl -pi -e \'s/rev = "v\\d+\\.\\d+\\.\\d+(-rc\\d+)?"/rev = "{version}"/g\' lakefile.toml', cwd=mathlib_test_dir)

        # Update lean-toolchain in test/Mathlib
        print(blue("Updating test/Mathlib/lean-toolchain..."))
        mathlib_toolchain = mathlib_test_dir / "lean-toolchain"
        with open(mathlib_toolchain, "w") as f:
            f.write(f"leanprover/lean4:{version}\n")
        print(green(f"Updated test/Mathlib/lean-toolchain to leanprover/lean4:{version}"))

        run_command("lake update", cwd=mathlib_test_dir, stream_output=True)

        try:
            result = run_command("./test.sh", cwd=repo_path, stream_output=True, check=False)
            if result.returncode == 0:
                print(green("Tests completed successfully"))
            else:
                print(red("Tests failed, but continuing with PR creation..."))
                print(red(f"Test exit code: {result.returncode}"))
        except subprocess.CalledProcessError as e:
            print(red("Tests failed, but continuing with PR creation..."))
            print(red(f"Test error: {e}"))
    elif dependencies:
        script_lines.append('perl -pi -e \'s/"v4\\.[0-9]+(\\.[0-9]+)?(-rc[0-9]+)?"/"' + version + '"/g\' lakefile.*')
        script_lines.append("lake update")
        run_command(f'perl -pi -e \'s/"v4\\.[0-9]+(\\.[0-9]+)?(-rc[0-9]+)?"/"' + version + '"/g\' lakefile.*', cwd=repo_path)
        run_command("lake update", cwd=repo_path, stream_output=True)

    script_lines.append("")
    script_lines.extend([
        f'git commit -am "chore: bump toolchain to {version}"',
        ""
    ])
    # Commit changes (only if there are changes)
    print(blue("Checking for changes to commit..."))
    try:
        # Check if there are any changes to commit (staged or unstaged)
        result = run_command("git status --porcelain", cwd=repo_path, check=False)
        if result.stdout.strip():  # There are changes
            print(blue("Committing changes..."))
            run_command(f'git commit -am "chore: bump toolchain to {version}"', cwd=repo_path)
            print(green(f"Committed changes: chore: bump toolchain to {version}"))
        else:
            print(green("No changes to commit - toolchain already up to date"))
    except subprocess.CalledProcessError:
        print(yellow("Failed to check for changes, attempting commit anyway..."))
        try:
            run_command(f'git commit -am "chore: bump toolchain to {version}"', cwd=repo_path)
        except subprocess.CalledProcessError as e:
            if "nothing to commit" in e.stderr:
                print(green("No changes to commit - toolchain already up to date"))
            else:
                raise
    # Handle special merging cases
    if re.search(r'rc\d+$', version) and repo_name in ["batteries", "mathlib4"]:
        script_lines.extend([
            "echo 'This repo has nightly-testing infrastructure'",
            f"git merge origin/bump/{version.split('-rc')[0]}",
            "echo 'Please resolve any conflicts.'",
            "grep nightly-testing lakefile.* && echo 'Please ensure the lakefile does not include nightly-testing versions.'",
            ""
        ])
        print(blue("This repo has nightly-testing infrastructure"))

        # Determine which remote to use for bump branches
        bump_remote, nightly_remote = get_remotes_for_repo(repo_name)

        try:
            print(blue(f"Merging {bump_remote}/bump/{version.split('-rc')[0]}..."))
            run_command(f"git merge {bump_remote}/bump/{version.split('-rc')[0]}", cwd=repo_path)
            print(green("Merge completed successfully"))
        except subprocess.CalledProcessError:
            print(red("Merge conflicts detected. Please resolve them manually."))
            return

    if re.search(r'rc\d+$', version) and repo_name in ["verso", "reference-manual"]:
        script_lines.extend([
            "echo 'This repo does development on nightly-testing: remember to rebase merge the PR.'",
            f"git merge origin/nightly-testing",
            "echo 'Please resolve any conflicts.'",
            ""
        ])
    if repo_name != "Mathlib":
        script_lines.extend([
            "lake build && if lake check-test; then lake test; fi",
            ""
        ])
        print(yellow("This repo does development on nightly-testing: remember to rebase merge the PR."))
        try:
            print(blue("Merging origin/nightly-testing..."))
            run_command("git merge origin/nightly-testing", cwd=repo_path)
            print(green("Merge completed successfully"))
        except subprocess.CalledProcessError:
            print(red("Merge conflicts detected. Please resolve them manually."))
            return

    script_lines.extend([
        'gh pr create --title "chore: bump toolchain to ' + version + '" --body ""',
        "echo 'Please review the PR and merge or rebase it.'",
        ""
    ])
    # Build and test (skip for Mathlib)
    if repo_name not in ["Mathlib", "mathlib4"]:
        print(blue("Building project..."))
        try:
            run_command("lake build", cwd=repo_path, stream_output=True)
            print(green("Build completed successfully"))
        except subprocess.CalledProcessError as e:
            print(red("Build failed, but continuing with PR creation..."))
            print(red(f"Build error: {e}"))

        # Check if lake check-test exists before running tests
        print(blue("Running tests..."))
        check_test_result = run_command("lake check-test", cwd=repo_path, check=False)
        if check_test_result.returncode == 0:
            try:
                run_command("lake test", cwd=repo_path, stream_output=True)
                print(green("Tests completed successfully"))
            except subprocess.CalledProcessError as e:
                print(red("Tests failed, but continuing with PR creation..."))
                print(red(f"Test error: {e}"))
        else:
            print(yellow("lake check-test reports that there is no test suite"))
    return "\n".join(script_lines)
# Push the branch to remote before creating PR
print(blue("Checking remote branch status..."))
try:
# Check if branch exists on remote
result = run_command(f"git ls-remote --heads origin {branch_name}", cwd=repo_path, check=False)
if not result.stdout.strip():
print(blue(f"Pushing branch {branch_name} to remote..."))
run_command(f"git push -u origin {branch_name}", cwd=repo_path)
print(green(f"Successfully pushed branch {branch_name} to remote"))
else:
print(blue(f"Branch {branch_name} already exists on remote, pushing any new commits..."))
run_command(f"git push", cwd=repo_path)
print(green("Successfully pushed commits to remote"))
except subprocess.CalledProcessError:
print(red("Failed to push branch to remote. Please check your permissions and network connection."))
print(yellow(f"You may need to run: git push -u origin {branch_name}"))
return
# Create pull request (only if one doesn't already exist)
print(blue("Checking for existing pull request..."))
try:
# Check if PR already exists for this branch
result = run_command(f'gh pr list --head {branch_name} --json number', cwd=repo_path, check=False)
if result.returncode == 0 and result.stdout.strip() != "[]":
print(green(f"Pull request already exists for branch {branch_name}"))
# Get the PR URL
pr_result = run_command(f'gh pr view {branch_name} --json url', cwd=repo_path, check=False)
if pr_result.returncode == 0:
pr_data = json.loads(pr_result.stdout)
print(green(f"PR URL: {pr_data.get('url', 'N/A')}"))
else:
# Create new PR
print(blue("Creating new pull request..."))
run_command(f'gh pr create --title "chore: bump toolchain to {version}" --body ""', cwd=repo_path)
print(green("Pull request created successfully!"))
except subprocess.CalledProcessError:
print(red("Failed to check for existing PR or create new PR."))
print(yellow("This could be due to:"))
print(yellow("1. GitHub CLI not authenticated"))
print(yellow("2. No push permissions to the repository"))
print(yellow("3. Network issues"))
print(f"Branch: {branch_name}")
print(f"Title: chore: bump toolchain to {version}")
print(yellow("Please create the PR manually if needed."))
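The existing-PR check above compares the raw `gh pr list --json number` output against the string `"[]"`. A slightly more robust variant (the helper name and the sample PR number are illustrative) decodes the JSON instead:

```python
import json

def pr_exists(gh_stdout):
    # `gh pr list --head <branch> --json number` prints a JSON array;
    # an empty array means there is no open PR for the branch.
    try:
        return len(json.loads(gh_stdout or "[]")) > 0
    except json.JSONDecodeError:
        # Treat unparseable output (e.g. an error message) as "no PR found".
        return False

print(pr_exists("[]"))
print(pr_exists('[{"number": 1234}]'))
```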
def main():
parser = argparse.ArgumentParser(
description="Generate release steps script for Lean4 repositories.",
description="Execute release steps for Lean4 repositories.",
formatter_class=argparse.RawDescriptionHelpFormatter,
epilog="""
Examples:
%(prog)s v4.6.0 mathlib Generate steps for updating Mathlib to v4.6.0
%(prog)s v4.6.0 batt Generate steps for updating Batteries to v4.6.0
%(prog)s v4.6.0 mathlib4 Execute steps for updating Mathlib to v4.6.0
%(prog)s v4.6.0 batteries Execute steps for updating Batteries to v4.6.0
The script will generate shell commands to:
1. Update the lean-toolchain file
2. Create appropriate branches and commits
3. Create pull requests
The script will:
1. Create a downstream_releases/ directory
2. Clone or update the target repository
3. Update the lean-toolchain file
4. Create appropriate branches and commits
5. Build and test the project
6. Create pull requests
(Note that the steps of creating toolchain version tags, and merging these into `stable` branches,
are handled by `script/release_checklist.py`.)
"""
)
parser.add_argument("version", help="The version to set in the lean-toolchain file (e.g., v4.6.0)")
parser.add_argument("repo", help="A substring of the repository name as specified in release_repos.yml")
parser.add_argument("repo", help="The repository name as specified in release_repos.yml")
args = parser.parse_args()
config_path = os.path.join(os.path.dirname(__file__), "release_repos.yml")
config = load_repos_config(config_path)
script = generate_script(args.repo, args.version, config)
print(script)
execute_release_steps(args.repo, args.version, config)
if __name__ == "__main__":
main()

View File

@@ -6,41 +6,41 @@ Authors: Leonardo de Moura
module
prelude
import Init.Prelude
import Init.Notation
import Init.Tactics
import Init.TacticsExtra
import Init.ByCases
import Init.RCases
import Init.Core
import Init.Control
import Init.Data.Basic
import Init.WF
import Init.WFTactics
import Init.Data
import Init.System
import Init.Util
import Init.Dynamic
import Init.ShareCommon
import Init.MetaTypes
import Init.Meta
import Init.NotationExtra
import Init.SimpLemmas
import Init.PropLemmas
import Init.Hints
import Init.Conv
import Init.Guard
import Init.Simproc
import Init.SizeOfLemmas
import Init.BinderPredicates
import Init.Ext
import Init.Omega
import Init.MacroTrace
import Init.Grind
import Init.GrindInstances
import Init.While
import Init.Syntax
import Init.Internal
import Init.Try
import Init.BinderNameHint
import Init.Task
public import Init.Prelude
public import Init.Notation
public import Init.Tactics
public import Init.TacticsExtra
public import Init.ByCases
public import Init.RCases
public import Init.Core
public import Init.Control
public import Init.Data.Basic
public import Init.WF
public import Init.WFTactics
public import Init.Data
public import Init.System
public import Init.Util
public import Init.Dynamic
public import Init.ShareCommon
public import Init.MetaTypes
public import Init.Meta
public import Init.NotationExtra
public import Init.SimpLemmas
public import Init.PropLemmas
public import Init.Hints
public import Init.Conv
public import Init.Guard
public import Init.Simproc
public import Init.SizeOfLemmas
public import Init.BinderPredicates
public import Init.Ext
public import Init.Omega
public import Init.MacroTrace
public import Init.Grind
public import Init.GrindInstances
public import Init.While
public import Init.Syntax
public import Init.Internal
public import Init.Try
public import Init.BinderNameHint
public import Init.Task

View File

@@ -323,7 +323,7 @@ macro_rules
| `(conv| repeat $seq) => `(conv| first | ($seq); repeat $seq | skip)
/--
Extracts `let` and `let_fun` expressions from within the target expression.
Extracts `let` and `have` expressions from within the target expression.
This is the conv mode version of the `extract_lets` tactic.
- `extract_lets` extracts all the lets from the target.
@@ -336,7 +336,7 @@ See also `lift_lets`, which does not extract lets as local declarations.
syntax (name := extractLets) "extract_lets " optConfig (ppSpace colGt (ident <|> hole))* : conv
/--
Lifts `let` and `let_fun` expressions within the target expression as far out as possible.
Lifts `let` and `have` expressions within the target expression as far out as possible.
This is the conv mode version of the `lift_lets` tactic.
-/
syntax (name := liftLets) "lift_lets " optConfig : conv

View File

@@ -7,6 +7,7 @@ module
prelude
public import Init.Data.Array.Basic
public import Init.Data.Array.DecidableEq
public import Init.Data.UInt.Basic
public import all Init.Data.UInt.BasicAux
public import Init.Data.Option.Basic
@@ -21,6 +22,14 @@ attribute [extern "lean_byte_array_mk"] ByteArray.mk
attribute [extern "lean_byte_array_data"] ByteArray.data
namespace ByteArray
deriving instance BEq for ByteArray
attribute [ext] ByteArray
instance : DecidableEq ByteArray :=
fun _ _ => decidable_of_decidable_of_iff ByteArray.ext_iff.symm
@[extern "lean_mk_empty_byte_array"]
def emptyWithCapacity (c : @& Nat) : ByteArray :=
{ data := #[] }

View File

@@ -9,6 +9,8 @@ prelude
public import Init.Data.Array.Basic
public import Init.Data.Float
public import Init.Data.Option.Basic
import Init.Ext
public import Init.Data.Array.DecidableEq
public section
universe u
@@ -20,6 +22,11 @@ attribute [extern "lean_float_array_mk"] FloatArray.mk
attribute [extern "lean_float_array_data"] FloatArray.data
namespace FloatArray
deriving instance BEq for FloatArray
attribute [ext] FloatArray
@[extern "lean_mk_empty_float_array"]
def emptyWithCapacity (c : @& Nat) : FloatArray :=
{ data := #[] }

View File

@@ -425,6 +425,10 @@ expectation that the resulting string is valid code.
The `Repr` class is similar, but the expectation is that instances produce valid Lean code.
-/
class ToFormat (α : Type u) where
/--
Converts a value to a `Format` object, with no expectation that the resulting string is valid
code.
-/
format : α → Format
export ToFormat (format)

View File

@@ -1524,7 +1524,7 @@ private theorem cooper_right_core
have d_pos : (0 : Int) < 1 := by decide
have h₃ : 1 ∣ 0*x + 0 := Int.one_dvd _
have h := cooper_dvd_right_core a_neg b_pos d_pos h₁ h₂ h₃
simp only [Int.mul_one, gcd_zero, Int.natAbs_of_nonneg (Int.le_of_lt b_pos),
simp only [Int.mul_one, gcd_zero, Int.natAbs_of_nonneg (Int.le_of_lt b_pos),
Int.ediv_self (Int.ne_of_gt b_pos), lcm_one,
Int.zero_mul, Int.mul_zero, Int.add_zero, Int.dvd_zero,
and_true, Int.neg_zero] at h
@@ -1915,6 +1915,11 @@ theorem eq_def (ctx : Context) (x : Var) (xPoly : Poly) (p : Poly)
simp [eq_def_cert]; intro _ h; subst p; simp [h]
rw [← Int.sub_eq_add_neg, Int.sub_self]
theorem eq_def_norm (ctx : Context) (x : Var) (xPoly xPoly' : Poly) (p : Poly)
: eq_def_cert x xPoly' p → x.denote ctx = xPoly.denote' ctx → xPoly.denote' ctx = xPoly'.denote' ctx → p.denote' ctx = 0 := by
simp [eq_def_cert]; intro _ h₁ h₂; subst p; simp [h₁, h₂]
rw [← Int.sub_eq_add_neg, Int.sub_self]
@[expose]
def eq_def'_cert (x : Var) (e : Expr) (p : Poly) : Bool :=
p == .add (-1) x e.norm
@@ -1924,6 +1929,27 @@ theorem eq_def' (ctx : Context) (x : Var) (e : Expr) (p : Poly)
simp [eq_def'_cert]; intro _ h; subst p; simp [h]
rw [← Int.sub_eq_add_neg, Int.sub_self]
@[expose]
def eq_def'_norm_cert (x : Var) (e : Expr) (ePoly ePoly' p : Poly) : Bool :=
ePoly == e.norm && p == .add (-1) x ePoly'
theorem eq_def'_norm (ctx : Context) (x : Var) (e : Expr) (ePoly ePoly' : Poly) (p : Poly)
: eq_def'_norm_cert x e ePoly ePoly' p → x.denote ctx = e.denote ctx → ePoly.denote' ctx = ePoly'.denote' ctx → p.denote' ctx = 0 := by
simp [eq_def'_norm_cert]; intro _ _ h₁ h₂; subst ePoly p; simp [h₁, h₂]
rw [← Int.sub_eq_add_neg, Int.sub_self]
theorem eq_norm_poly (ctx : Context) (p p' : Poly) : p.denote' ctx = p'.denote' ctx → p.denote' ctx = 0 → p'.denote' ctx = 0 := by
intro h; rw [h]; simp
theorem le_norm_poly (ctx : Context) (p p' : Poly) : p.denote' ctx = p'.denote' ctx → p.denote' ctx ≤ 0 → p'.denote' ctx ≤ 0 := by
intro h; rw [h]; simp
theorem diseq_norm_poly (ctx : Context) (p p' : Poly) : p.denote' ctx = p'.denote' ctx → p.denote' ctx ≠ 0 → p'.denote' ctx ≠ 0 := by
intro h; rw [h]; simp
theorem dvd_norm_poly (ctx : Context) (d : Int) (p p' : Poly) : p.denote' ctx = p'.denote' ctx → d ∣ p.denote' ctx → d ∣ p'.denote' ctx := by
intro h; rw [h]; simp
end Int.Linear
theorem Int.not_le_eq (a b : Int) : (¬a ≤ b) = (b + 1 ≤ a) := by

View File

@@ -503,6 +503,16 @@ def ofCommSemiring : CommRing (OfSemiring.Q α) :=
attribute [instance] ofCommSemiring
end OfCommSemiring
/-
Remark: `↑a` is notation for `OfSemiring.toQ a`.
We want to hide `OfSemiring.toQ` applications in the diagnostic information produced by
the `ring` procedure in `grind`.
-/
@[app_unexpander OfSemiring.toQ]
meta def toQUnexpander : PrettyPrinter.Unexpander := fun stx => do
match stx with
| `($_ $a:term) => `($a)
| _ => throw ()
end OfCommSemiring
end Lean.Grind.CommRing

View File

@@ -13,7 +13,7 @@ public import Init.Data.RArray
public import Init.Grind.Ring.Basic
public import Init.Grind.Ring.Field
public import Init.Grind.Ordered.Ring
public import Init.GrindInstances.Ring.Int
public section
namespace Lean.Grind
@@ -97,6 +97,17 @@ def Mon.denote {α} [Semiring α] (ctx : Context α) : Mon → α
| unit => 1
| .mult p m => p.denote ctx * denote ctx m
@[expose]
def Mon.denote' {α} [Semiring α] (ctx : Context α) (m : Mon) : α :=
match m with
| .unit => 1
| .mult pw m => go m (pw.denote ctx)
where
go (m : Mon) (acc : α) : α :=
match m with
| .unit => acc
| .mult pw m => go m (acc * (pw.denote ctx))
@[expose]
def Mon.ofVar (x : Var) : Mon :=
.mult { x, k := 1 } .unit
@@ -236,6 +247,24 @@ def Poly.denote [Ring α] (ctx : Context α) (p : Poly) : α :=
| .num k => Int.cast k
| .add k m p => Int.cast k * m.denote ctx + denote ctx p
@[expose]
def Poly.denote' [Ring α] (ctx : Context α) (p : Poly) : α :=
match p with
| .num k => Int.cast k
| .add k m p => go p (denoteTerm k m)
where
denoteTerm (k : Int) (m : Mon) : α :=
bif k == 1 then
m.denote' ctx
else
Int.cast k * m.denote' ctx
go (p : Poly) (acc : α) : α :=
match p with
| .num 0 => acc
| .num k => acc + Int.cast k
| .add k m p => go p (acc + denoteTerm k m)
@[expose]
def Poly.ofMon (m : Mon) : Poly :=
.add 1 m (.num 0)
@@ -576,6 +605,14 @@ theorem Power.denote_eq {α} [Semiring α] (ctx : Context α) (p : Power)
: p.denote ctx = p.x.denote ctx ^ p.k := by
cases p <;> simp [Power.denote] <;> split <;> simp [pow_zero, pow_succ, one_mul]
theorem Mon.denote'_eq_denote {α} [Semiring α] (ctx : Context α) (m : Mon) : m.denote' ctx = m.denote ctx := by
cases m <;> simp [denote', denote]
next pw m =>
generalize pw.denote ctx = acc
fun_induction denote'.go
next => simp [denote, Semiring.mul_one]
next acc pw m ih => simp [ih, denote, Semiring.mul_assoc]
theorem Mon.denote_ofVar {α} [Semiring α] (ctx : Context α) (x : Var)
: denote ctx (ofVar x) = x.denote ctx := by
simp [denote, ofVar, Power.denote_eq, pow_succ, pow_zero, one_mul, mul_one]
@@ -658,6 +695,16 @@ theorem Mon.eq_of_revlex {m₁ m₂ : Mon} : revlex m₁ m₂ = .eq → m₁ = m
theorem Mon.eq_of_grevlex {m₁ m₂ : Mon} : grevlex m₁ m₂ = .eq → m₁ = m₂ := by
simp [grevlex]; intro; apply eq_of_revlex
theorem Poly.denoteTerm_eq {α} [Ring α] (ctx : Context α) (k : Int) (m : Mon) : denote'.denoteTerm ctx k m = k * m.denote ctx := by
simp [denote'.denoteTerm, Mon.denote'_eq_denote, cond_eq_if]; intro; subst k; rw [Ring.intCast_one, Semiring.one_mul]
theorem Poly.denote'_eq_denote {α} [Ring α] (ctx : Context α) (p : Poly) : p.denote' ctx = p.denote ctx := by
cases p <;> simp [denote', denote, denoteTerm_eq]
next k m p =>
generalize k * m.denote ctx = acc
fun_induction denote'.go <;> simp [denote, *, Ring.intCast_zero, Semiring.add_zero, denoteTerm_eq]
next ih => simp [denoteTerm_eq] at ih; simp [ih, Semiring.add_assoc]
theorem Poly.denote_ofMon {α} [CommRing α] (ctx : Context α) (m : Mon)
: denote ctx (ofMon m) = m.denote ctx := by
simp [ofMon, denote, intCast_one, intCast_zero, one_mul, add_zero]
@@ -1379,6 +1426,7 @@ theorem Poly.normEq0_eq {α} [CommRing α] (ctx : Context α) (p : Poly) (c : Na
simp [denote, normEq0]; split <;> simp [denote, *]
next h' => rw [of_mod_eq_0 h h', Semiring.zero_mul, Semiring.zero_add]
@[expose]
def eq_normEq0_cert (c : Nat) (p₁ p₂ p : Poly) : Bool :=
p₁ == .num c && p == p₂.normEq0 c
@@ -1398,6 +1446,7 @@ theorem gcd_eq_0 [CommRing α] (g n m a b : Int) (h : g = a * n + b * m)
rw [← Ring.intCast_add, h₂, Semiring.zero_add, h] at h₁
rw [Ring.intCast_zero, h₁]
@[expose]
def eq_gcd_cert (a b : Int) (p₁ p₂ p : Poly) : Bool :=
match p₁ with
| .add .. => false
@@ -1415,6 +1464,7 @@ theorem eq_gcd {α} [CommRing α] (ctx : Context α) (a b : Int) (p₁ p₂ p :
next n m g =>
apply gcd_eq_0 g n m a b
@[expose]
def d_normEq0_cert (c : Nat) (p₁ p₂ p : Poly) : Bool :=
p₂ == .num c && p == p₁.normEq0 c
@@ -1423,5 +1473,11 @@ theorem d_normEq0 {α} [CommRing α] (ctx : Context α) (k : Int) (c : Nat) (ini
simp [d_normEq0_cert]; intro _ h₁ h₂; subst p p₂; simp [Poly.denote]
intro h; rw [p₁.normEq0_eq] <;> assumption
@[expose] def norm_int_cert (e : Expr) (p : Poly) : Bool :=
e.toPoly == p
theorem norm_int (ctx : Context Int) (e : Expr) (p : Poly) : norm_int_cert e p → e.denote ctx = p.denote' ctx := by
simp [norm_int_cert, Poly.denote'_eq_denote]; intro; subst p; simp [Expr.denote_toPoly]
end CommRing
end Lean.Grind

View File

@@ -544,12 +544,6 @@ end fun_order
section monotone_lemmas
theorem monotone_letFun
{α : Sort u} {β : Sort v} {γ : Sort w} [PartialOrder α] [PartialOrder β]
(v : γ) (k : α → γ → β)
(hmono : ∀ y, monotone (fun x => k x y)) :
monotone fun (x : α) => letFun v (k x) := hmono v
@[partial_fixpoint_monotone]
theorem monotone_ite
{α : Sort u} {β : Sort v} [PartialOrder α] [PartialOrder β]

View File

@@ -63,20 +63,14 @@ Examples:
fun _ => a
/--
The encoding of `let_fun x := v; b` is `letFun v (fun x => b)`.
`letFun v (fun x => b)` is a function version of `have x := v; b`.
This is equal to `(fun x => b) v`, so the value of `x` is not accessible to `b`.
This is in contrast to `let x := v; b`, where the value of `x` is accessible to `b`.
There is special support for `letFun`.
Both WHNF and `simp` are aware of `letFun` and can reduce it when zeta reduction is enabled,
despite the fact it is marked `irreducible`.
For metaprogramming, the function `Lean.Expr.letFun?` can be used to recognize a `let_fun` expression
to extract its parts as if it were a `let` expression.
This used to be the way `have`/`let_fun` syntax was encoded,
and there used to be special support for `letFun` in WHNF and `simp`.
-/
def letFun {α : Sort u} {β : α → Sort v} (v : α) (f : (x : α) → β x) : β v := f v
-- We need to export the body of `letFun`, which is suppressed if `[irreducible]` is set directly.
-- We can work around this rare case by applying the attribute after the fact.
attribute [irreducible] letFun
set_option checkBinderAnnotations false in
/--

View File

@@ -137,21 +137,6 @@ theorem have_body_congr' {α : Sort u} {β : Sort v} (a : α) {f f' : α → β}
(h : ∀ x, f x = f' x) : f a = f' a :=
h a
theorem letFun_unused {α : Sort u} {β : Sort v} (a : α) {b b' : β} (h : b = b') : @letFun α (fun _ => β) a (fun _ => b) = b' :=
h
theorem letFun_congr {α : Sort u} {β : Sort v} {a a' : α} {f f' : α → β} (h₁ : a = a') (h₂ : ∀ x, f x = f' x)
: @letFun α (fun _ => β) a f = @letFun α (fun _ => β) a' f' := by
rw [h₁, funext h₂]
theorem letFun_body_congr {α : Sort u} {β : Sort v} (a : α) {f f' : α → β} (h : ∀ x, f x = f' x)
: @letFun α (fun _ => β) a f = @letFun α (fun _ => β) a f' := by
rw [funext h]
theorem letFun_val_congr {α : Sort u} {β : Sort v} {a a' : α} {f : α → β} (h : a = a')
: @letFun α (fun _ => β) a f = @letFun α (fun _ => β) a' f := by
rw [h]
@[congr]
theorem ite_congr {x y u v : α} {s : Decidable b} [Decidable c]
(h₁ : b = c) (h₂ : c → x = u) (h₃ : ¬ c → y = v) : ite b x y = ite c u v := by

View File

@@ -544,7 +544,7 @@ performs the unification, and replaces the target with the unified version of `t
syntax (name := «show») "show " term : tactic
/--
Extracts `let` and `let_fun` expressions from within the target or a local hypothesis,
Extracts `let` and `have` expressions from within the target or a local hypothesis,
introducing new local definitions.
- `extract_lets` extracts all the lets from the target.
@@ -558,7 +558,7 @@ introduces a new local definition `z := v` and changes `h` to be `h : b z`.
syntax (name := extractLets) "extract_lets " optConfig (ppSpace colGt (ident <|> hole))* (location)? : tactic
/--
Lifts `let` and `let_fun` expressions within a term as far out as possible.
Lifts `let` and `have` expressions within a term as far out as possible.
It is like `extract_lets +lift`, but the top-level lets at the end of the procedure
are not extracted as local hypotheses.
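As a minimal illustrative sketch (not taken from this diff) of the renamed behavior, `extract_lets` turns a `let` in the target into a local definition; the binder name `z` here is arbitrary:

```lean
example : (let y := 2; y + y) = 4 := by
  extract_lets z
  -- `z : Nat := 2` is now a local definition, and the goal is `z + z = 4`
  rfl
```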
@@ -1281,10 +1281,10 @@ syntax (name := substEqs) "subst_eqs" : tactic
/-- The `run_tac doSeq` tactic executes code in `TacticM Unit`. -/
syntax (name := runTac) "run_tac " doSeq : tactic
/-- `haveI` behaves like `have`, but inlines the value instead of producing a `let_fun` term. -/
/-- `haveI` behaves like `have`, but inlines the value instead of producing a `have` term. -/
macro "haveI" c:letConfig d:letDecl : tactic => `(tactic| refine_lift haveI $c:letConfig $d:letDecl; ?_)
/-- `letI` behaves like `let`, but inlines the value instead of producing a `let_fun` term. -/
/-- `letI` behaves like `let`, but inlines the value instead of producing a `let` term. -/
macro "letI" c:letConfig d:letDecl : tactic => `(tactic| refine_lift letI $c:letConfig $d:letDecl; ?_)
/--

View File

@@ -70,6 +70,13 @@ private def looksLikeRelevantTheoremProofType (type : Expr) : Bool :=
else
type.isAppOfArity ``WellFounded 2
/-- If `warn.sorry` is set to true, then, so long as the message log does not already have any errors,
declarations with `sorryAx` generate the "declaration uses 'sorry'" warning. -/
register_builtin_option warn.sorry : Bool := {
defValue := true
descr := "warn about uses of `sorry` in declarations added to the environment"
}
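As a hedged sketch of how the new option can be used, it can be disabled for a single declaration with `set_option ... in` (the declaration name here is illustrative):

```lean
-- Silence the "declaration uses 'sorry'" warning for one scratch proof.
set_option warn.sorry false in
theorem scratch : True := by sorry
```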
def addDecl (decl : Declaration) : CoreM Unit := do
-- register namespaces for newly added constants; this used to be done by the kernel itself
-- but that is incompatible with moving it to a separate task
@@ -135,8 +142,9 @@ where
doAdd := do
profileitM Exception "type checking" (← getOptions) do
withTraceNode `Kernel (fun _ => return m!"typechecking declarations {decl.getTopLevelNames}") do
if !(← MonadLog.hasErrors) && decl.hasSorry then
logWarning <| .tagged `hasSorry m!"declaration uses 'sorry'"
if warn.sorry.get (← getOptions) then
if !(← MonadLog.hasErrors) && decl.hasSorry then
logWarning <| .tagged `hasSorry m!"declaration uses 'sorry'"
try
let env ← (← getEnv).addDeclAux (← getOptions) decl (← read).cancelTk?
|> ofExceptKernelException

View File

@@ -20,9 +20,9 @@ import Lean.Compiler.IR.ExpandResetReuse
import Lean.Compiler.IR.UnboxResult
import Lean.Compiler.IR.ElimDeadBranches
import Lean.Compiler.IR.EmitC
import Lean.Compiler.IR.CtorLayout
import Lean.Compiler.IR.Sorry
import Lean.Compiler.IR.ToIR
import Lean.Compiler.IR.ToIRType
-- The following imports are not required by the compiler. They are here to ensure that there
-- are no orphaned modules.

View File

@@ -1,43 +0,0 @@
/-
Copyright (c) 2019 Microsoft Corporation. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Leonardo de Moura
-/
prelude
import Lean.Environment
import Lean.Compiler.IR.Format
namespace Lean
namespace IR
inductive CtorFieldInfo where
| irrelevant
| object (i : Nat)
| usize (i : Nat)
| scalar (sz : Nat) (offset : Nat) (type : IRType)
deriving Inhabited
namespace CtorFieldInfo
def format : CtorFieldInfo → Format
| irrelevant => ""
| object i => f!"obj@{i}"
| usize i => f!"usize@{i}"
| scalar sz offset type => f!"scalar#{sz}@{offset}:{type}"
instance : ToFormat CtorFieldInfo := format
end CtorFieldInfo
structure CtorLayout where
cidx : Nat
fieldInfo : List CtorFieldInfo
numObjs : Nat
numUSize : Nat
scalarSize : Nat
@[extern "lean_ir_get_ctor_layout"]
opaque getCtorLayout (env : @& Environment) (ctorName : @& Name) : Except String CtorLayout
end IR
end Lean

View File

@@ -9,14 +9,14 @@ import Lean.Compiler.LCNF.CompilerM
import Lean.Compiler.LCNF.PhaseExt
import Lean.Compiler.IR.Basic
import Lean.Compiler.IR.CompilerM
import Lean.Compiler.IR.CtorLayout
import Lean.Compiler.IR.ToIRType
import Lean.CoreM
import Lean.Environment
namespace Lean.IR
open Lean.Compiler (LCNF.Alt LCNF.Arg LCNF.CacheExtension LCNF.Code LCNF.Decl LCNF.DeclValue
LCNF.LCtx LCNF.LetDecl LCNF.LetValue LCNF.LitValue LCNF.Param LCNF.getMonoDecl?)
open Lean.Compiler (LCNF.Alt LCNF.Arg LCNF.Code LCNF.Decl LCNF.DeclValue LCNF.LCtx LCNF.LetDecl
LCNF.LetValue LCNF.LitValue LCNF.Param LCNF.getMonoDecl?)
namespace ToIR
@@ -72,86 +72,6 @@ def lowerLitValue (v : LCNF.LitValue) : LitVal :=
| .uint32 v => .num (UInt32.toNat v)
| .uint64 v | .usize v => .num (UInt64.toNat v)
builtin_initialize scalarTypeExt : LCNF.CacheExtension Name (Option IRType) ←
LCNF.CacheExtension.register
def lowerEnumToScalarType? (name : Name) : CoreM (Option IRType) := do
match (← scalarTypeExt.find? name) with
| some info? => return info?
| none =>
let info? ← fillCache
scalarTypeExt.insert name info?
return info?
where fillCache : CoreM (Option IRType) := do
let env ← Lean.getEnv
let some (.inductInfo inductiveVal) := env.find? name | return none
let ctorNames := inductiveVal.ctors
let numCtors := ctorNames.length
for ctorName in ctorNames do
let some (.ctorInfo ctorVal) := env.find? ctorName | panic! "expected valid constructor name"
if ctorVal.type.isForall then return none
return if numCtors == 1 then
none
else if numCtors < Nat.pow 2 8 then
some .uint8
else if numCtors < Nat.pow 2 16 then
some .uint16
else if numCtors < Nat.pow 2 32 then
some .uint32
else
none
def lowerType (e : Lean.Expr) : CoreM IRType := do
match e with
| .const name .. =>
match name with
| ``UInt8 | ``Bool => return .uint8
| ``UInt16 => return .uint16
| ``UInt32 => return .uint32
| ``UInt64 => return .uint64
| ``USize => return .usize
| ``Float => return .float
| ``Float32 => return .float32
| ``lcErased => return .irrelevant
| _ =>
if let some scalarType ← lowerEnumToScalarType? name then
return scalarType
else
return .object
| .app f _ =>
-- All mono types are in headBeta form.
if let .const name _ := f then
if let some scalarType ← lowerEnumToScalarType? name then
return scalarType
else
return .object
else
return .object
| .forallE .. => return .object
| _ => panic! "invalid type"
builtin_initialize ctorInfoExt : LCNF.CacheExtension Name (CtorInfo × (Array CtorFieldInfo)) ←
LCNF.CacheExtension.register
def getCtorInfo (name : Name) : CoreM (CtorInfo × (Array CtorFieldInfo)) := do
match (← ctorInfoExt.find? name) with
| some info => return info
| none =>
let info ← fillCache
ctorInfoExt.insert name info
return info
where fillCache := do
match getCtorLayout (← Lean.getEnv) name with
| .ok ctorLayout =>
return {
name,
cidx := ctorLayout.cidx,
size := ctorLayout.numObjs,
usize := ctorLayout.numUSize,
ssize := ctorLayout.scalarSize
}, ctorLayout.fieldInfo.toArray
| .error .. => panic! "unrecognized constructor"
def lowerArg (a : LCNF.Arg) : M Arg := do
match a with
| .fvar fvarId =>
@@ -176,7 +96,7 @@ def lowerProj (base : VarId) (ctorInfo : CtorInfo) (field : CtorFieldInfo)
def lowerParam (p : LCNF.Param) : M Param := do
let x ← bindVar p.fvarId
let ty ← lowerType p.type
let ty ← toIRType p.type
return { x, borrow := p.borrow, ty }
mutual
@@ -198,7 +118,7 @@ partial def lowerCode (c : LCNF.Code) : M FnBody := do
| some (.var varId) =>
return .case cases.typeName
varId
(← lowerType cases.resultType)
(← toIRType cases.resultType)
(← cases.alts.mapM (lowerAlt varId))
| some (.joinPoint ..) | some .erased | none => panic! "unexpected value"
| .return fvarId =>
@@ -219,7 +139,7 @@ partial def lowerLet (decl : LCNF.LetDecl) (k : LCNF.Code) : M FnBody := do
let var ← bindVar decl.fvarId
let type ← match e with
| .ctor .. | .pap .. | .proj .. => pure <| .object
| _ => lowerType decl.type
| _ => toIRType decl.type
return .vdecl var type e ( lowerCode k)
let rec mkErased (_ : Unit) : M FnBody := do
bindErased decl.fvarId
@@ -229,7 +149,7 @@ partial def lowerLet (decl : LCNF.LetDecl) (k : LCNF.Code) : M FnBody := do
let tmpVar ← newVar
let type ← match e with
| .ctor .. | .pap .. | .proj .. => pure <| .object
| _ => lowerType decl.type
| _ => toIRType decl.type
return .vdecl tmpVar .object e (.vdecl var type (.ap tmpVar restArgs) ( lowerCode k))
let rec tryIrDecl? (name : Name) (args : Array Arg) : M (Option FnBody) := do
if let some decl ← LCNF.getMonoDecl? name then
@@ -401,7 +321,7 @@ partial def lowerAlt (discr : VarId) (a : LCNF.Alt) : M Alt := do
end
def lowerResultType (type : Lean.Expr) (arity : Nat) : M IRType :=
lowerType (resultTypeForArity type arity)
toIRType (resultTypeForArity type arity)
where resultTypeForArity (type : Lean.Expr) (arity : Nat) : Lean.Expr :=
if arity == 0 then
type

View File

@@ -0,0 +1,126 @@
/-
Copyright (c) 2019 Microsoft Corporation. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Leonardo de Moura
-/
prelude
import Lean.Environment
import Lean.Compiler.IR.Format
import Lean.Compiler.LCNF.CompilerM
namespace Lean
namespace IR
open Lean.Compiler (LCNF.CacheExtension)
builtin_initialize scalarTypeExt : LCNF.CacheExtension Name (Option IRType) ←
LCNF.CacheExtension.register
def lowerEnumToScalarType? (name : Name) : CoreM (Option IRType) := do
match (← scalarTypeExt.find? name) with
| some info? => return info?
| none =>
let info? ← fillCache
scalarTypeExt.insert name info?
return info?
where fillCache : CoreM (Option IRType) := do
let env ← Lean.getEnv
let some (.inductInfo inductiveVal) := env.find? name | return none
let ctorNames := inductiveVal.ctors
let numCtors := ctorNames.length
for ctorName in ctorNames do
let some (.ctorInfo ctorVal) := env.find? ctorName | panic! "expected valid constructor name"
if ctorVal.type.isForall then return none
return if numCtors == 1 then
none
else if numCtors < Nat.pow 2 8 then
some .uint8
else if numCtors < Nat.pow 2 16 then
some .uint16
else if numCtors < Nat.pow 2 32 then
some .uint32
else
none
def toIRType (e : Lean.Expr) : CoreM IRType := do
match e with
| .const name .. =>
match name with
| ``UInt8 | ``Bool => return .uint8
| ``UInt16 => return .uint16
| ``UInt32 => return .uint32
| ``UInt64 => return .uint64
| ``USize => return .usize
| ``Float => return .float
| ``Float32 => return .float32
| ``lcErased => return .irrelevant
| _ =>
if let some scalarType ← lowerEnumToScalarType? name then
return scalarType
else
return .object
| .app f _ =>
-- All mono types are in headBeta form.
if let .const name _ := f then
if let some scalarType ← lowerEnumToScalarType? name then
return scalarType
else
return .object
else
return .object
| .forallE .. => return .object
| _ => panic! "invalid type"
inductive CtorFieldInfo where
| irrelevant
| object (i : Nat)
| usize (i : Nat)
| scalar (sz : Nat) (offset : Nat) (type : IRType)
deriving Inhabited
namespace CtorFieldInfo
def format : CtorFieldInfo → Format
| irrelevant => ""
| object i => f!"obj@{i}"
| usize i => f!"usize@{i}"
| scalar sz offset type => f!"scalar#{sz}@{offset}:{type}"
instance : ToFormat CtorFieldInfo := format
end CtorFieldInfo
structure CtorLayout where
cidx : Nat
fieldInfo : List CtorFieldInfo
numObjs : Nat
numUSize : Nat
scalarSize : Nat
@[extern "lean_ir_get_ctor_layout"]
opaque getCtorLayout (env : @& Environment) (ctorName : @& Name) : Except String CtorLayout
builtin_initialize ctorInfoExt : LCNF.CacheExtension Name (CtorInfo × (Array CtorFieldInfo)) ←
LCNF.CacheExtension.register
def getCtorInfo (name : Name) : CoreM (CtorInfo × (Array CtorFieldInfo)) := do
match (← ctorInfoExt.find? name) with
| some info => return info
| none =>
let info ← fillCache
ctorInfoExt.insert name info
return info
where fillCache := do
match getCtorLayout (← Lean.getEnv) name with
| .ok ctorLayout =>
return {
name,
cidx := ctorLayout.cidx,
size := ctorLayout.numObjs,
usize := ctorLayout.numUSize,
ssize := ctorLayout.scalarSize
}, ctorLayout.fieldInfo.toArray
| .error .. => panic! "unrecognized constructor"
end IR
end Lean

View File

@@ -679,9 +679,7 @@ where
visit (f.beta e.getAppArgs)
visitApp (e : Expr) : M Arg := do
if let some (args, n, t, v, b) := e.letFunAppArgs? then
visitCore <| mkAppN (.letE n t v b (nondep := true)) args
else if let .const declName us := CSimp.replaceConstants (← getEnv) e.getAppFn then
if let .const declName us := CSimp.replaceConstants (← getEnv) e.getAppFn then
if declName == ``Quot.lift then
visitQuotLift e
else if declName == ``Quot.mk then

View File

@@ -932,7 +932,10 @@ def elabLetDeclCore (stx : Syntax) (expectedType? : Option Expr) (initConfig : L
fun stx expectedType? => elabLetDeclCore stx expectedType? { nondep := true }
@[builtin_term_elab «let_fun»] def elabLetFunDecl : TermElab :=
fun stx expectedType? => elabLetDeclCore stx expectedType? { nondep := true }
fun stx expectedType? => do
withRef stx <| Linter.logLintIf Linter.linter.deprecated stx[0]
"`let_fun` has been deprecated in favor of `have`"
elabLetDeclCore stx expectedType? { nondep := true }
@[builtin_term_elab «let_delayed»] def elabLetDelayedDecl : TermElab :=
fun stx expectedType? => elabLetDeclCore stx expectedType? { postponeValue := true }

View File

@@ -108,7 +108,7 @@ open Meta
Recall that we do not use the same approach used to elaborate type ascriptions.
For the `($val : $type)` notation, we just elaborate `val` using `type` and
ensure it has type `type`. This approach only ensures that the type of the resulting expression
is definitionally equal to `type`. For the `show` notation we use `let_fun` to ensure the type
is definitionally equal to `type`. For the `show` notation we use `have` to ensure the type
of the resulting expression is *structurally equal* to `type`. Structural equality is important,
for example, if the resulting expression is a `simp`/`rw` parameter. Here is an example:
```

View File

@@ -1156,6 +1156,7 @@ where
finishElab headers
processDeriving headers
elabAsync header view declId := do
assert! view.kind.isTheorem
let env ← getEnv
let async ← env.addConstAsync declId.declName .thm
(exportedKind? := guard (!isPrivateName declId.declName) *> some .axiom)
@@ -1178,6 +1179,12 @@ where
s := collectLevelParams s type
let scopeLevelNames ← getLevelNames
let levelParams ← IO.ofExcept <| sortDeclLevelParams scopeLevelNames allUserLevelNames s.params
let type ← if cleanup.letToHave.get (← getOptions) then
withRef header.declId <| Meta.letToHave type
else
pure type
async.commitSignature { name := header.declName, levelParams, type }
-- attributes should be applied on the main thread; see below

View File

@@ -13,6 +13,7 @@ import Lean.Util.NumApps
import Lean.Meta.AbstractNestedProofs
import Lean.Meta.ForEachExpr
import Lean.Meta.Eqns
import Lean.Meta.LetToHave
import Lean.Elab.RecAppSyntax
import Lean.Elab.DefView
import Lean.Elab.PreDefinition.TerminationHint
@@ -21,6 +22,11 @@ namespace Lean.Elab
open Meta
open Term
register_builtin_option cleanup.letToHave : Bool := {
defValue := true
descr := "Enables transforming `let`s to `have`s after elaborating declarations."
}
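For readers wanting to see the new option in action, here is a hedged sketch (declaration names are illustrative, and the printed form depends on the pretty printer):
```lean
-- With the cleanup enabled (the default), the elaborated value of `f`
-- should store the non-dependent binding as a `have` rather than a `let`.
def f (n : Nat) : Nat :=
  let m := n + 1
  m + m

-- Opting out keeps the original `let` in the stored value.
set_option cleanup.letToHave false in
def g (n : Nat) : Nat :=
  let m := n + 1
  m + m
```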
/--
A (potentially recursive) definition.
The elaborator converts it into Kernel definitions using many different strategies.
@@ -88,6 +94,31 @@ def applyAttributesOf (preDefs : Array PreDefinition) (applicationTime : Attribu
for preDef in preDefs do
applyAttributesAt preDef.declName preDef.modifiers.attrs applicationTime
/--
Applies `Meta.letToHave` to the values of defs, instances, and abbrevs.
Does not apply the transformation to values that are proofs, or to unsafe definitions.
-/
def letToHaveValue (preDef : PreDefinition) : MetaM PreDefinition := withRef preDef.ref do
if !cleanup.letToHave.get (← getOptions)
|| preDef.modifiers.isUnsafe
|| preDef.kind matches .theorem | .example | .opaque then
return preDef
else if ← Meta.isProp preDef.type then
return preDef
else
let value ← Meta.letToHave preDef.value
return { preDef with value }
/--
Applies `Meta.letToHave` to the type of the predef.
-/
def letToHaveType (preDef : PreDefinition) : MetaM PreDefinition := withRef preDef.ref do
if !cleanup.letToHave.get (← getOptions) || preDef.kind matches .example then
return preDef
else
let type ← Meta.letToHave type
return { preDef with type }
def abstractNestedProofs (preDef : PreDefinition) (cache := true) : MetaM PreDefinition := withRef preDef.ref do
if preDef.kind.isTheorem || preDef.kind.isExample then
pure preDef
@@ -138,9 +169,11 @@ private def checkMeta (preDef : PreDefinition) : TermElabM Unit := do
| _, _ => pure ()
return true
private def addNonRecAux (preDef : PreDefinition) (compile : Bool) (all : List Name) (applyAttrAfterCompilation := true) (cacheProofs := true) : TermElabM Unit :=
private def addNonRecAux (preDef : PreDefinition) (compile : Bool) (all : List Name) (applyAttrAfterCompilation := true) (cacheProofs := true) (cleanupValue := false) : TermElabM Unit :=
withRef preDef.ref do
let preDef ← abstractNestedProofs (cache := cacheProofs) preDef
let preDef ← letToHaveType preDef
let preDef ← if cleanupValue then letToHaveValue preDef else pure preDef
let mkDefDecl : TermElabM Declaration :=
return Declaration.defnDecl {
name := preDef.declName, levelParams := preDef.levelParams, type := preDef.type, value := preDef.value
@@ -185,11 +218,11 @@ private def addNonRecAux (preDef : PreDefinition) (compile : Bool) (all : List N
generateEagerEqns preDef.declName
applyAttributesOf #[preDef] AttributeApplicationTime.afterCompilation
def addAndCompileNonRec (preDef : PreDefinition) (all : List Name := [preDef.declName]) : TermElabM Unit := do
addNonRecAux preDef (compile := true) (all := all)
def addAndCompileNonRec (preDef : PreDefinition) (all : List Name := [preDef.declName]) (cleanupValue := false) : TermElabM Unit := do
addNonRecAux preDef (compile := true) (all := all) (cleanupValue := cleanupValue)
def addNonRec (preDef : PreDefinition) (applyAttrAfterCompilation := true) (all : List Name := [preDef.declName]) (cacheProofs := true) : TermElabM Unit := do
addNonRecAux preDef (compile := false) (applyAttrAfterCompilation := applyAttrAfterCompilation) (all := all) (cacheProofs := cacheProofs)
def addNonRec (preDef : PreDefinition) (applyAttrAfterCompilation := true) (all : List Name := [preDef.declName]) (cacheProofs := true) (cleanupValue := false) : TermElabM Unit := do
addNonRecAux preDef (compile := false) (applyAttrAfterCompilation := applyAttrAfterCompilation) (all := all) (cacheProofs := cacheProofs) (cleanupValue := cleanupValue)
/--
Eliminate recursive application annotations containing syntax. These annotations are used by the well-founded recursion module

View File

@@ -25,18 +25,14 @@ structure EqnInfoCore where
deriving Inhabited
/--
Zeta reduces `let` and `let_fun` while consuming metadata.
Zeta reduces `let` and `have` while consuming metadata.
Returns true if progress is made.
-/
partial def expand (progress : Bool) (e : Expr) : Bool × Expr :=
match e with
| Expr.letE _ _ v b _ => expand true (b.instantiate1 v)
| Expr.mdata _ b => expand true b
| e =>
if let some (_, _, v, b) := e.letFun? then
expand true (b.instantiate1 v)
else
(progress, e)
| e => (progress, e)
def expandRHS? (mvarId : MVarId) : MetaM (Option MVarId) := do
let target ← mvarId.getType'

View File

@@ -98,11 +98,7 @@ private partial def ensureNoUnassignedLevelMVarsAtPreDef (preDef : PreDefinition
| .proj _ _ b => withExpr e do visit b
| .sort u => visitLevel u (← read)
| .const _ us => (if head then id else withExpr e) <| us.forM (visitLevel · (← read))
| .app .. => withExpr e do
if let some (args, n, t, v, b) := e.letFunAppArgs? then
visit t; visit v; withLocalDeclD n t fun x => visit (b.instantiate1 x); args.forM visit
else
e.withApp fun f args => do visit f true; args.forM visit
| .app .. => withExpr e do e.withApp fun f args => do visit f true; args.forM visit
| _ => pure ()
try
visit preDef.value |>.run preDef.value |>.run {}
@@ -319,9 +315,9 @@ def addPreDefinitions (preDefs : Array PreDefinition) : TermElabM Unit := withLC
let preDef ← eraseRecAppSyntax preDefs[0]!
ensureEqnReservedNamesAvailable preDef.declName
if preDef.modifiers.isNoncomputable then
addNonRec preDef
addNonRec preDef (cleanupValue := true)
else
addAndCompileNonRec preDef
addAndCompileNonRec preDef (cleanupValue := true)
preDef.termination.ensureNone "not recursive"
else if preDefs.any (·.modifiers.isUnsafe) then
addAndCompileUnsafe preDefs

View File

@@ -97,6 +97,7 @@ where
catch e =>
throwError "failed to generate unfold theorem for '{declName}':\n{e.toMessageData}"
let type ← mkForallFVars xs type
let type ← letToHave type
let value ← mkLambdaFVars xs goal
addDecl <| Declaration.thmDecl {
name, type, value

View File

@@ -93,6 +93,7 @@ where
doRealize name type := withOptions (tactic.hygienic.set · false) do
let value ← mkProof info.declName type
let (type, value) ← removeUnusedEqnHypotheses type value
let type ← letToHave type
addDecl <| Declaration.thmDecl {
name, type, value
levelParams := info.levelParams
@@ -126,6 +127,7 @@ where
let goal ← mkFreshExprSyntheticOpaqueMVar type
mkUnfoldProof declName goal.mvarId!
let type ← mkForallFVars xs type
let type ← letToHave type
let value ← mkLambdaFVars xs (← instantiateMVars goal)
addDecl <| Declaration.thmDecl {
name, type, value

View File

@@ -66,6 +66,6 @@ partial def addSmartUnfoldingDef (preDef : PreDefinition) (recArgPos : Nat) : Te
else
withEnableInfoTree false do
let preDefSUnfold ← addSmartUnfoldingDefAux preDef recArgPos
addNonRec preDefSUnfold
addNonRec preDefSUnfold (cleanupValue := true)
end Lean.Elab.Structural

View File

@@ -23,6 +23,7 @@ Preprocesses the expressions to improve the effectiveness of `wfRecursion`.
Unlike `Lean.Elab.Structural.preprocess`, do _not_ beta-reduce, as it could
remove `let_fun`-lambdas that contain explicit termination proofs.
(Note(kmill): this last statement no longer affects `let_fun`/`have`.)
-/
def floatRecApp (e : Expr) : CoreM Expr :=
Core.transform e

View File

@@ -91,6 +91,7 @@ def mkUnfoldEq (preDef : PreDefinition) (unaryPreDefName : Name) (wfPreprocessPr
let value ← instantiateMVars main
let type ← mkForallFVars xs type
let type ← letToHave type
let value ← mkLambdaFVars xs value
addDecl <| Declaration.thmDecl {
name, type, value
@@ -123,6 +124,7 @@ def mkBinaryUnfoldEq (preDef : PreDefinition) (unaryPreDefName : Name) : MetaM U
let value ← instantiateMVars main
let type ← mkForallFVars xs type
let type ← letToHave type
let value ← mkLambdaFVars xs value
addDecl <| Declaration.thmDecl {
name, type, value

View File

@@ -1155,12 +1155,11 @@ private partial def mkFlatCtor (levelParams : List Name) (params : Array Expr) (
let ctor := getStructureCtor env structName
let val ← mkFlatCtorExpr levelParams params ctor replaceIndFVars
withLCtx {} {} do trace[Elab.structure] "created flat constructor:{indentExpr val}"
unless val.hasSyntheticSorry do
-- Note: flatCtorName will be private if the constructor is private
let flatCtorName := mkFlatCtorOfStructCtorName ctor.name
let valType ← replaceIndFVars (← instantiateMVars (← inferType val))
let valType := valType.inferImplicit params.size true
addDecl <| Declaration.defnDecl (← mkDefinitionValInferrringUnsafe flatCtorName levelParams valType val .abbrev)
-- Note: flatCtorName will be private if the constructor is private
let flatCtorName := mkFlatCtorOfStructCtorName ctor.name
let valType ← replaceIndFVars (← instantiateMVars (← inferType val))
let valType := valType.inferImplicit params.size true
addDecl <| Declaration.defnDecl (← mkDefinitionValInferrringUnsafe flatCtorName levelParams valType val .abbrev)
private partial def checkResultingUniversesForFields (fieldInfos : Array StructFieldInfo) (u : Level) : TermElabM Unit := do
for info in fieldInfos do
@@ -1442,13 +1441,16 @@ def elabStructureCommand : InductiveElabDescr where
finalizeTermElab := withLCtx lctx localInsts do checkDefaults fieldInfos
prefinalize := fun levelParams params replaceIndFVars => do
withLCtx lctx localInsts do
addProjections params r fieldInfos
withOptions (warn.sorry.set · false) do
addProjections params r fieldInfos
registerStructure view.declName fieldInfos
runStructElabM (init := state) do
mkFlatCtor levelParams params view.declName replaceIndFVars
withOptions (warn.sorry.set · false) do
mkFlatCtor levelParams params view.declName replaceIndFVars
addDefaults levelParams params replaceIndFVars
let parentInfos withLCtx lctx localInsts <| runStructElabM (init := state) do
mkRemainingProjections levelParams params view
withOptions (warn.sorry.set · false) do
mkRemainingProjections levelParams params view
setStructureParents view.declName parentInfos
withSaveInfoContext do -- save new env
for field in view.fields do

View File

@@ -111,7 +111,7 @@ def solveMonoStep (failK : ∀ {α}, Expr → Array Name → MetaM α := @defaul
return [goal]
match_expr type with
| monotone α inst_α β inst_β f =>
| monotone α inst_α _ _ f =>
-- Ensure f is not headed by a redex and headed by at least one lambda, and clean some
-- redexes left by some of the lemmas we tend to apply
let f ← instantiateMVars f
@@ -150,26 +150,6 @@ def solveMonoStep (failK : ∀ {α}, Expr → Array Name → MetaM α := @defaul
pure goal'
return [goal'.mvarId!]
-- Float `letFun` to the environment.
-- (cannot use `applyConst`, it tends to reduce the let redex)
match_expr e with
| letFun γ _ v b =>
if γ.hasLooseBVars || v.hasLooseBVars then
failK f #[]
let b' := f.updateLambdaE! f.bindingDomain! b
let p ← mkAppOptM ``monotone_letFun #[α, β, γ, inst_α, inst_β, v, b']
let new_goals ← prependError m!"Could not apply {p}:" do
goal.apply p
let [new_goal] := new_goals
| throwError "Unexpected number of goals after {.ofConstName ``monotone_letFun}."
let (_, new_goal) ←
if b.isLambda then
new_goal.intro b.bindingName!
else
new_goal.intro1
return [new_goal]
| _ => pure ()
-- Handle lambdas, preserving the name of the binder
if e.isLambda then
let [new_goal] ← applyConst goal ``monotone_of_monotone_apply

View File

@@ -25,7 +25,7 @@ def reverseDuplicate (xs : List α) :=
.reverse (xs ++ xs)
```
```output
Invalid dotted identifier notation: The type of `.reverse` could not be determined
Invalid dotted identifier notation: The expected type of `.reverse` could not be determined
```
```lean fixed
def reverseDuplicate (xs : List α) : List α :=

View File

@@ -1963,10 +1963,11 @@ def setAppPPExplicitForExposingMVars (e : Expr) : Expr :=
| _ => e
/--
Returns true if `e` is a `let_fun` expression, which is an expression of the form `letFun v f`.
Returns true if `e` is an expression of the form `letFun v f`.
Ideally `f` is a lambda, but we do not require that here.
Warning: if the `let_fun` is applied to additional arguments (such as in `(let_fun f := id; id) 1`), this function returns `false`.
-/
@[deprecated Expr.isHave (since := "2025-06-29")]
def isLetFun (e : Expr) : Bool := e.isAppOfArity ``letFun 4
/--
@@ -1979,6 +1980,7 @@ They can be created using `Lean.Meta.mkLetFun`.
If in the encoding of `let_fun` the last argument to `letFun` is eta reduced, this returns `Name.anonymous` for the binder name.
-/
@[deprecated Expr.isHave (since := "2025-06-29")]
def letFun? (e : Expr) : Option (Name × Expr × Expr × Expr) :=
match e with
| .app (.app (.app (.app (.const ``letFun _) t) _β) v) f =>
@@ -1991,6 +1993,7 @@ def letFun? (e : Expr) : Option (Name × Expr × Expr × Expr) :=
Like `Lean.Expr.letFun?`, but handles the case when the `let_fun` expression is possibly applied to additional arguments.
Returns those arguments in addition to the values returned by `letFun?`.
-/
@[deprecated Expr.isHave (since := "2025-06-29")]
def letFunAppArgs? (e : Expr) : Option (Array Expr × Name × Expr × Expr × Expr) := do
guard <| 4 ≤ e.getAppNumArgs
guard <| e.isAppOf ``letFun
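Since `isLetFun`, `letFun?`, and `letFunAppArgs?` are deprecated in favor of `Expr.isHave`, callers can migrate along these lines — a sketch assuming `have` is now encoded as a non-dependent `.letE` node (the helper name is invented):
```lean
open Lean

/-- Hypothetical replacement for `letFun?`: destructure a `have`-expression
    into its binder name, type, value, and body. -/
def haveParts? (e : Expr) : Option (Name × Expr × Expr × Expr) :=
  match e with
  | .letE n t v b _ => if e.isHave then some (n, t, v, b) else none
  | _ => none
```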

View File

@@ -484,7 +484,7 @@ where
let startTime := ( IO.monoNanosNow).toFloat / 1000000000
let mut opts := setup.opts
-- HACK: no better way to enable in core with `USE_LAKE` off
if (`Init).isPrefixOf setup.mainModuleName then
if setup.mainModuleName.getRoot ∈ [`Init, `Std, `Lean, `Lake] then
opts := experimental.module.setIfNotSet opts true
if !stx.raw[0].isNone && !experimental.module.get opts then
throw <| IO.Error.userError "`module` keyword is experimental and not enabled here"

View File

@@ -34,9 +34,10 @@ def mkExpectedTypeHint (e : Expr) (expectedType : Expr) : MetaM Expr := do
return mkExpectedTypeHintCore e expectedType u
/--
`mkLetFun x v e` creates the encoding for the `let_fun x := v; e` expression.
`mkLetFun x v e` creates `letFun v (fun x => e)`.
The expression `x` can either be a free variable or a metavariable, and the function suitably abstracts `x` in `e`.
-/
@[deprecated mkLetFVars (since := "2025-06-29")]
def mkLetFun (x : Expr) (v : Expr) (e : Expr) : MetaM Expr := do
-- If `x` is an `ldecl`, then the result of `mkLambdaFVars` is a let expression.
let ensureLambda : Expr → Expr

View File

@@ -2545,6 +2545,38 @@ where
end Meta
open Meta
namespace PPContext
def runCoreM {α : Type} (ppCtx : PPContext) (x : CoreM α) : IO α :=
Prod.fst <$> x.toIO { options := ppCtx.opts, currNamespace := ppCtx.currNamespace
openDecls := ppCtx.openDecls
fileName := "<PrettyPrinter>", fileMap := default
diag := getDiag ppCtx.opts }
{ env := ppCtx.env, ngen := { namePrefix := `_pp_uniq } }
def runMetaM {α : Type} (ppCtx : PPContext) (x : MetaM α) : IO α :=
ppCtx.runCoreM <| x.run' { lctx := ppCtx.lctx } { mctx := ppCtx.mctx }
end PPContext
/--
Turns a `MetaM MessageData` into a `MessageData.lazy` which will run the monadic value.
The optional array of expressions is used to set the `hasSyntheticSorry` fields, and should
comprise the expressions that are included in the message data.
-/
def MessageData.ofLazyM (f : MetaM MessageData) (es : Array Expr := #[]) : MessageData :=
.lazy
(f := fun ppctxt => do
match (← ppctxt.runMetaM f |>.toBaseIO) with
| .ok fmt => return fmt
| .error ex => return m!"[Error pretty printing: {ex}]"
)
(hasSyntheticSorry := fun mvarctxt => es.any (fun a =>
instantiateMVarsCore mvarctxt a |>.1.hasSyntheticSorry
))
builtin_initialize
registerTraceClass `Meta.isLevelDefEq.postponed
registerTraceClass `Meta.realizeConst
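The point of `MessageData.ofLazyM` is that the `MetaM` action — and hence any expensive pretty printing — only runs if the message is actually rendered. A minimal usage sketch (the function name is invented):
```lean
open Lean Lean.Meta

/-- Hypothetical: a message that defers `inferType` until display time.
    Passing `e` in `es` keeps `hasSyntheticSorry` accurate for the message. -/
def lazyTypeMsg (e : Expr) : MessageData :=
  MessageData.ofLazyM (es := #[e]) do
    let ty ← inferType e
    return m!"expression{indentExpr e}\nhas type{indentExpr ty}"
```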

View File

@@ -191,18 +191,26 @@ def throwLetTypeMismatchMessage {α} (fvarId : FVarId) : MetaM α := do
| _ => unreachable!
/--
Return error message "has type{givenType}\nbut is expected to have type{expectedType}"
Return error message "has type{givenType}\nbut is expected to have type{expectedType}"
Adds the types' types unless they are defeq.
-/
def mkHasTypeButIsExpectedMsg (givenType expectedType : Expr) : MetaM MessageData := do
try
let givenTypeType ← inferType givenType
let expectedTypeType ← inferType expectedType
let (givenType, expectedType) ← addPPExplicitToExposeDiff givenType expectedType
let (givenTypeType, expectedTypeType) ← addPPExplicitToExposeDiff givenTypeType expectedTypeType
return m!"has type{indentD m!"{givenType} : {givenTypeType}"}\nbut is expected to have type{indentD m!"{expectedType} : {expectedTypeType}"}"
catch _ =>
let (givenType, expectedType) ← addPPExplicitToExposeDiff givenType expectedType
return m!"has type{indentExpr givenType}\nbut is expected to have type{indentExpr expectedType}"
def mkHasTypeButIsExpectedMsg (givenType expectedType : Expr) : MetaM MessageData :=
return MessageData.ofLazyM (es := #[givenType, expectedType]) do
try
let givenTypeType ← inferType givenType
let expectedTypeType ← inferType expectedType
if (← isDefEqGuarded givenTypeType expectedTypeType) then
let (givenType, expectedType) ← addPPExplicitToExposeDiff givenType expectedType
return m!"has type{indentExpr givenType}\n\
but is expected to have type{indentExpr expectedType}"
else
let (givenType, expectedType) ← addPPExplicitToExposeDiff givenType expectedType
let (givenTypeType, expectedTypeType) ← addPPExplicitToExposeDiff givenTypeType expectedTypeType
return m!"has type{indentExpr givenType}\nof sort{inlineExpr givenTypeType}\
but is expected to have type{indentExpr expectedType}\nof sort{inlineExprTrailing expectedTypeType}"
catch _ =>
let (givenType, expectedType) ← addPPExplicitToExposeDiff givenType expectedType
return m!"has type{indentExpr givenType}\nbut is expected to have type{indentExpr expectedType}"
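As a rough illustration of the intended effect (exact wording may differ): for a mismatch where both types live in the same sort, the sorts are now omitted and each type gets its own line:
```lean
example : Nat := "hello"
/-
Roughly, instead of the old
  has type String : Type but is expected to have type Nat : Type
the message now reads
  has type
    String
  but is expected to have type
    Nat
with the `: Type` annotations dropped, since the two sorts are defeq.
-/
```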
def throwAppTypeMismatch (f a : Expr) : MetaM α := do
let (expectedType, binfo) ← getFunctionDomain f

View File

@@ -10,6 +10,7 @@ import Lean.Meta.Basic
import Lean.Meta.AppBuilder
import Lean.Meta.Match.MatcherInfo
import Lean.DefEqAttrib
import Lean.Meta.LetToHave
namespace Lean.Meta
@@ -178,6 +179,8 @@ where doRealize name info := do
lambdaTelescope (cleanupAnnotations := true) info.value fun xs body => do
let lhs := mkAppN (mkConst info.name <| info.levelParams.map mkLevelParam) xs
let type ← mkForallFVars xs (← mkEq lhs body)
-- Note: if this definition was added using `def`, then `letToHave` has already been applied to the body.
let type ← letToHave type
let value ← mkLambdaFVars xs (← mkEqRefl lhs)
addDecl <| Declaration.thmDecl {
name, type, value

View File

@@ -434,7 +434,7 @@ The `Meta.letToHave` trace class logs errors and messages.
def letToHave (e : Expr) : MetaM Expr := do
profileitM Exception "let-to-have transformation" (← getOptions) do
let e ← instantiateMVars e
LetToHave.main e
withoutExporting <| LetToHave.main e
builtin_initialize
registerTraceClass `Meta.letToHave

View File

@@ -290,14 +290,6 @@ partial def foldAndCollect (oldIH newIH : FVarId) (isRecCall : Expr → Option E
withTraceNode `Meta.FunInd (pure m!"{exceptEmoji ·} foldAndCollect ({mkFVar oldIH} → {mkFVar newIH})::{indentExpr e}") do
let e' ← id do
if let some (n, t, v, b) := e.letFun? then
let t' ← foldAndCollect oldIH newIH isRecCall t
let v' ← foldAndCollect oldIH newIH isRecCall v
return withLocalDeclD n t' fun x => do
M.localMapM (mkLetFun x v' ·) do
let b' ← foldAndCollect oldIH newIH isRecCall (b.instantiate1 x)
mkLetFun x v' b'
if let some matcherApp ← matchMatcherApp? e (alsoCasesOn := true) then
if matcherApp.remaining.size == 1 && matcherApp.remaining[0]!.isFVarOf oldIH then
-- We do different things to the matcher when folding recursive calls and when
@@ -673,12 +665,6 @@ def rwLetWith (h : Expr) (e : Expr) : MetaM Simp.Result := do
def rwMData (e : Expr) : MetaM Simp.Result := do
return { expr := e.consumeMData }
def rwHaveWith (h : Expr) (e : Expr) : MetaM Simp.Result := do
if let some (_n, t, _v, b) := e.letFun? then
if (← isDefEq t (← inferType h)) then
return { expr := b.instantiate1 h }
return { expr := e }
def rwFun (names : Array Name) (e : Expr) : MetaM Simp.Result := do
e.withApp fun f xs => do
if let some name := names.find? f.isConstOf then
@@ -910,14 +896,6 @@ partial def buildInductionBody (toErase toClear : Array FVarId) (goal : Expr)
buildInductionBody toErase toClear goal' oldIH newIH isRecCall (b.instantiate1 x)
mkLetFVars #[x] b'
if let some (n, t, v, b) := e.letFun? then
let t' ← foldAndCollect oldIH newIH isRecCall t
let v' ← foldAndCollect oldIH newIH isRecCall v
return withLetDecl n t' v' fun x => M2.branch do
let b' ← withRewrittenMotiveArg goal (rwHaveWith x) fun goal' =>
buildInductionBody toErase toClear goal' oldIH newIH isRecCall (b.instantiate1 x)
mkLetFVars #[x] b' (usedLetOnly := false)
-- Special case for traversing the PProded bodies in our encoding of structural mutual recursion
if let .lam n t b bi := e then
if goal.isForall then
@@ -1057,6 +1035,7 @@ where doRealize (inductName : Name) := do
let eTyp ← inferType e'
let eTyp ← elimTypeAnnotations eTyp
let eTyp ← letToHave eTyp
-- logInfo m!"eTyp: {eTyp}"
let levelParams := (collectLevelParams {} eTyp).params
-- Prune unused level parameters, preserving the original order
@@ -1090,6 +1069,7 @@ def projectMutualInduct (unfolding : Bool) (names : Array Name) (mutualInduct :
let value ← PProdN.projM names.size idx value
mkLambdaFVars xs value
let type ← inferType value
let type ← letToHave type
addDecl <| Declaration.thmDecl { name := inductName, levelParams, type, value }
if idx == 0 then finalizeFirstInd
@@ -1248,6 +1228,7 @@ where doRealize inductName := do
check value
let type ← inferType value
let type ← elimOptParam type
let type ← letToHave type
addDecl <| Declaration.thmDecl
{ name := inductName, levelParams := ci.levelParams, type, value }
@@ -1480,6 +1461,7 @@ where doRealize inductName := do
let eTyp ← inferType e'
let eTyp ← elimTypeAnnotations eTyp
let eTyp ← letToHave eTyp
-- logInfo m!"eTyp: {eTyp}"
let levelParams := (collectLevelParams {} eTyp).params
-- Prune unused level parameters, preserving the original order
@@ -1546,9 +1528,6 @@ where
modify (·.set! i true)
for alt in args[matchInfo.getFirstAltPos...matchInfo.arity] do
go xs alt
if f.isConstOf ``letFun then
for arg in args[3...4] do
go xs arg
if f.isConstOf ``ite || f.isConstOf ``dite then
for arg in args[3...5] do
go xs arg
@@ -1623,6 +1602,7 @@ def deriveCases (unfolding : Bool) (name : Name) : MetaM Unit := do
let eTyp ← inferType e'
let eTyp ← elimTypeAnnotations eTyp
let eTyp ← letToHave eTyp
-- logInfo m!"eTyp: {eTyp}"
let levelParams := (collectLevelParams {} eTyp).params
-- Prune unused level parameters, preserving the original order

View File

@@ -18,6 +18,7 @@ import Lean.Meta.Tactic.Grind.Arith.Cutsat.SearchM
import Lean.Meta.Tactic.Grind.Arith.Cutsat.Model
import Lean.Meta.Tactic.Grind.Arith.Cutsat.MBTC
import Lean.Meta.Tactic.Grind.Arith.Cutsat.Nat
import Lean.Meta.Tactic.Grind.Arith.Cutsat.CommRing
namespace Lean
@@ -27,6 +28,7 @@ builtin_initialize registerTraceClass `grind.cutsat.assert
builtin_initialize registerTraceClass `grind.cutsat.assert.trivial
builtin_initialize registerTraceClass `grind.cutsat.assert.unsat
builtin_initialize registerTraceClass `grind.cutsat.assert.store
builtin_initialize registerTraceClass `grind.cutsat.assert.nonlinear
builtin_initialize registerTraceClass `grind.debug.cutsat.subst
builtin_initialize registerTraceClass `grind.debug.cutsat.search

View File

@@ -0,0 +1,58 @@
/-
Copyright (c) 2025 Amazon.com, Inc. or its affiliates. All Rights Reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Leonardo de Moura
-/
prelude
import Lean.Meta.Tactic.Grind.ProveEq
import Lean.Meta.Tactic.Grind.Arith.CommRing.RingId
import Lean.Meta.Tactic.Grind.Arith.CommRing.Reify
import Lean.Meta.Tactic.Grind.Arith.CommRing.DenoteExpr
import Lean.Meta.Tactic.Grind.Arith.Cutsat.Var
/-!
CommRing interface for cutsat. We use it to normalize nonlinear polynomials.
-/
namespace Lean.Meta.Grind.Arith.Cutsat
/-- Returns `true` if `p` contains a nonlinear monomial. -/
def _root_.Int.Linear.Poly.isNonlinear (p : Poly) : GoalM Bool := do
let .add _ x p := p | return false
if (← getVar x).isAppOf ``HMul.hMul then return true
p.isNonlinear
def _root_.Int.Linear.Poly.getGeneration (p : Poly) : GoalM Nat := do
go p 0
where
go : Poly → Nat → GoalM Nat
| .num _, gen => return gen
| .add _ x p, gen => do go p (max (← Grind.getGeneration (← getVar x)) gen)
def getIntRingId? : GoalM (Option Nat) := do
CommRing.getRingId? (← getIntExpr)
/-- Normalize the polynomial using `CommRing`. -/
def _root_.Int.Linear.Poly.normCommRing? (p : Poly) : GoalM (Option (CommRing.RingExpr × CommRing.Poly × Poly)) := do
unless (← p.isNonlinear) do return none
let some ringId ← getIntRingId? | return none
CommRing.RingM.run ringId do
let e ← p.denoteExpr'
-- TODO: we can avoid the following statement if we construct the `Int` denotation using
-- internalized operators instead of `mkIntMul` and `mkIntAdd`
let e ← shareCommon (← canon e)
let gen ← p.getGeneration
let some re ← CommRing.reify? e (gen := gen) | return none
let p' := re.toPoly
let e' ← p'.denoteExpr
let e' ← preprocessLight e'
-- Remark: we are reusing the `IntModule` virtual parent.
-- TODO: Investigate whether we should have a custom virtual parent for cutsat
internalize e' gen (some getIntModuleVirtualParent)
let p'' ← toPoly e'
if p == p'' then return none
modify' fun s => { s with usedCommRing := true }
trace[grind.cutsat.assert.nonlinear] "{← p.pp} ===> {← p''.pp}"
return some (re, p', p'')
end Lean.Meta.Grind.Arith.Cutsat

View File

@@ -85,6 +85,14 @@ partial def DvdCnstr.assert (c : DvdCnstr) : GoalM Unit := withIncRecDepth do
c.p.updateOccs
modify' fun s => { s with dvds := s.dvds.set x (some c) }
/-- Asserts a constraint coming from the core. -/
private def DvdCnstr.assertCore (c : DvdCnstr) : GoalM Unit := do
if let some (re, rp, p) ← c.p.normCommRing? then
let c := { c with p, h := .commRingNorm c re rp : DvdCnstr}
c.assert
else
c.assert
def propagateIntDvd (e : Expr) : GoalM Unit := do
let_expr Dvd.dvd _ inst a b ← e | return ()
unless (← isInstDvdInt inst) do return ()
@@ -94,7 +102,7 @@ def propagateIntDvd (e : Expr) : GoalM Unit := do
if (← isEqTrue e) then
let p ← toPoly b
let c := { d, p, h := .core e : DvdCnstr }
c.assert
c.assertCore
else if (← isEqFalse e) then
pushNewFact <| mkApp4 (mkConst ``Int.Linear.of_not_dvd) a b reflBoolTrue (mkOfEqFalseCore e (← mkEqFalseProof e))
@@ -106,7 +114,7 @@ def propagateNatDvd (e : Expr) : GoalM Unit := do
let p := b'.norm
if (← isEqTrue e) then
let c := { d, p, h := .coreNat e d b b' : DvdCnstr }
c.assert
c.assertCore
else if (← isEqFalse e) then
let_expr Dvd.dvd _ _ a b ← e | return ()
pushNewFact <| mkApp3 (mkConst ``Nat.emod_pos_of_not_dvd) a b (mkOfEqFalseCore e (← mkEqFalseProof e))

View File

@@ -9,6 +9,7 @@ import Lean.Meta.Tactic.Grind.Arith.Cutsat.Var
import Lean.Meta.Tactic.Grind.Arith.Cutsat.DvdCnstr
import Lean.Meta.Tactic.Grind.Arith.Cutsat.LeCnstr
import Lean.Meta.Tactic.Grind.Arith.Cutsat.ToInt
import Lean.Meta.Tactic.Grind.Arith.Cutsat.CommRing
namespace Lean.Meta.Grind.Arith.Cutsat
@@ -223,9 +224,18 @@ private def exprAsPoly (a : Expr) : GoalM Poly := do
private def processNewIntEq (a b : Expr) : GoalM Unit := do
let p₁ ← exprAsPoly a
let p₂ ← exprAsPoly b
-- Remark: we don't need to use the comm ring normalizer here because `p` is always linear.
let p := p₁.combine (p₂.mul (-1))
{ p, h := .core a b p₁ p₂ : EqCnstr }.assert
/-- Asserts a constraint coming from the core. -/
private def EqCnstr.assertCore (c : EqCnstr) : GoalM Unit := do
if let some (re, rp, p) ← c.p.normCommRing? then
let c := { p, h := .commRingNorm c re rp : EqCnstr}
c.assert
else
c.assert
private def processNewNatEq (a b : Expr) : GoalM Unit := do
let (lhs, rhs) ← Int.OfNat.toIntEq a b
let gen ← getGeneration a
@@ -235,7 +245,7 @@ private def processNewNatEq (a b : Expr) : GoalM Unit := do
let p := lhs'.sub rhs' |>.norm
let c := { p, h := .coreNat a b lhs rhs lhs' rhs' : EqCnstr }
trace[grind.debug.cutsat.nat] "{← c.pp}"
c.assert
c.assertCore
private def processNewToIntEq (a b : Expr) : ToIntM Unit := do
let gen := max (← getGeneration a) (← getGeneration b)
@@ -246,7 +256,7 @@ private def processNewToIntEq (a b : Expr) : ToIntM Unit := do
let rhs ← toLinearExpr b' gen
let p := lhs.sub rhs |>.norm
let c := { p, h := .coreToInt a b thm lhs rhs : EqCnstr }
c.assert
c.assertCore
@[export lean_process_cutsat_eq]
def processNewEqImpl (a b : Expr) : GoalM Unit := do
@@ -262,6 +272,8 @@ def processNewEqImpl (a b : Expr) : GoalM Unit := do
ToIntM.run α do processNewToIntEq a b
private def processNewIntDiseq (a b : Expr) : GoalM Unit := do
-- Remark: we don't need to use comm ring to normalize these polynomials because they are
-- always linear.
let p₁ ← exprAsPoly a
let c ← if let some 0 ← getIntValue? b then
pure { p := p₁, h := .core0 a b : DiseqCnstr }
@@ -271,6 +283,14 @@ private def processNewIntDiseq (a b : Expr) : GoalM Unit := do
pure {p, h := .core a b p₁ p₂ : DiseqCnstr }
c.assert
/-- Asserts a constraint coming from the core. -/
private def DiseqCnstr.assertCore (c : DiseqCnstr) : GoalM Unit := do
if let some (re, rp, p) ← c.p.normCommRing? then
let c := { p, h := .commRingNorm c re rp : DiseqCnstr }
c.assert
else
c.assert
private def processNewNatDiseq (a b : Expr) : GoalM Unit := do
let (lhs, rhs) ← Int.OfNat.toIntEq a b
let gen ← getGeneration a
@@ -279,7 +299,7 @@ private def processNewNatDiseq (a b : Expr) : GoalM Unit := do
let rhs' ← toLinearExpr (← rhs.denoteAsIntExpr ctx) gen
let p := lhs'.sub rhs' |>.norm
let c := { p, h := .coreNat a b lhs rhs lhs' rhs' : DiseqCnstr }
c.assert
c.assertCore
private def processNewToIntDiseq (a b : Expr) : ToIntM Unit := do
let gen := max (← getGeneration a) (← getGeneration b)
@@ -290,7 +310,7 @@ private def processNewToIntDiseq (a b : Expr) : ToIntM Unit := do
let rhs ← toLinearExpr b' gen
let p := lhs.sub rhs |>.norm
let c := { p, h := .coreToInt a b thm lhs rhs : DiseqCnstr }
c.assert
c.assertCore
@[export lean_process_cutsat_diseq]
def processNewDiseqImpl (a b : Expr) : GoalM Unit := do
@@ -352,8 +372,12 @@ private def internalizeInt (e : Expr) : GoalM Unit := do
-- It is pointless to assert `x = x`
-- This can happen if `e` is a nonlinear term (e.g., `e` is `a*b`)
return
let c := { p := .add (-1) x p, h := .defn e p : EqCnstr }
c.assert
if let some (re, rp, p') ← p.normCommRing? then
let c := { p := .add (-1) x p', h := .defnCommRing e p re rp p' : EqCnstr }
c.assert
else
let c := { p := .add (-1) x p, h := .defn e p : EqCnstr }
c.assert
private def expandDivMod (a : Expr) (b : Int) : GoalM Unit := do
if b == 0 || b == 1 || b == -1 then
@@ -411,8 +435,12 @@ private def internalizeNat (e : Expr) : GoalM Unit := do
trace[grind.debug.cutsat.internalize] "{aquote natCast_e}:= {← p.pp}"
let x ← mkVar natCast_e
modify' fun s => { s with natDef := s.natDef.insert { expr := e } x }
let c := { p := .add (-1) x p, h := .defnNat e' x e'' : EqCnstr }
c.assert
if let some (re, rp, p') ← p.normCommRing? then
let c := { p := .add (-1) x p', h := .defnNatCommRing e' x e'' p re rp p' : EqCnstr }
c.assert
else
let c := { p := .add (-1) x p, h := .defnNat e' x e'' : EqCnstr }
c.assert
private def isToIntForbiddenParent (parent? : Option Expr) : Bool :=
if let some parent := parent? then

View File

@@ -12,6 +12,7 @@ import Lean.Meta.Tactic.Grind.Arith.Cutsat.Proof
import Lean.Meta.Tactic.Grind.Arith.Cutsat.Nat
import Lean.Meta.Tactic.Grind.Arith.Cutsat.Norm
import Lean.Meta.Tactic.Grind.Arith.Cutsat.ToInt
import Lean.Meta.Tactic.Grind.Arith.Cutsat.CommRing
namespace Lean.Meta.Grind.Arith.Cutsat
@@ -135,6 +136,14 @@ private def toPolyLe? (e : Expr) : GoalM (Option Poly) := do
reportNonNormalized e; return none
return some (← toPoly a)
/-- Asserts a constraint coming from the core. -/
private def LeCnstr.assertCore (c : LeCnstr) : GoalM Unit := do
if let some (re, rp, p) ← c.p.normCommRing? then
let c := { p, h := .commRingNorm c re rp : LeCnstr}
c.assert
else
c.assert
/--
Given an expression `e` that is in `True` (or `False` equivalence class), if `e` is an
integer inequality, asserts it to the cutsat state.
@@ -145,7 +154,7 @@ def propagateIntLe (e : Expr) (eqTrue : Bool) : GoalM Unit := do
pure { p, h := .core e : LeCnstr }
else
pure { p := p.mul (-1) |>.addConst 1, h := .coreNeg e p : LeCnstr }
c.assert
c.assertCore
def propagateNatLe (e : Expr) (eqTrue : Bool) : GoalM Unit := do
let some (lhs, rhs) ← Int.OfNat.toIntLe? e | return ()
@@ -158,7 +167,7 @@ def propagateNatLe (e : Expr) (eqTrue : Bool) : GoalM Unit := do
pure { p, h := .coreNat e lhs rhs lhs' rhs' : LeCnstr }
else
pure { p := p.mul (-1) |>.addConst 1, h := .coreNatNeg e lhs rhs lhs' rhs' : LeCnstr }
c.assert
c.assertCore
def propagateToIntLe (e : Expr) (eqTrue : Bool) : ToIntM Unit := do
let some thm ← if eqTrue then pure (← getInfo).ofLE? else pure (← getInfo).ofNotLE? | return ()
@@ -172,7 +181,7 @@ def propagateToIntLe (e : Expr) (eqTrue : Bool) : ToIntM Unit := do
let rhs ← toLinearExpr b' gen
let p := lhs.sub rhs |>.norm
let c := { p, h := .coreToInt e eqTrue thm lhs rhs : LeCnstr }
c.assert
c.assertCore
def propagateLe (e : Expr) (eqTrue : Bool) : GoalM Unit := do
unless (← getConfig).cutsat do return ()

View File

@@ -4,11 +4,15 @@ Released under Apache 2.0 license as described in the file LICENSE.
Authors: Leonardo de Moura
-/
prelude
import Init.Grind.Ring.Poly
import Lean.Meta.Tactic.Grind.Diseq
import Lean.Meta.Tactic.Grind.Arith.Util
import Lean.Meta.Tactic.Grind.Arith.ProofUtil
import Lean.Meta.Tactic.Grind.Arith.Cutsat.Util
import Lean.Meta.Tactic.Grind.Arith.Cutsat.Nat
import Lean.Meta.Tactic.Grind.Arith.Cutsat.CommRing
import Lean.Meta.Tactic.Grind.Arith.CommRing.Util
import Lean.Meta.Tactic.Grind.Arith.CommRing.Proof
namespace Lean.Meta.Grind.Arith.Cutsat
@@ -20,12 +24,15 @@ structure ProofM.State where
polyMap : Std.HashMap Poly Expr := {}
exprMap : Std.HashMap Int.Linear.Expr Expr := {}
natExprMap : Std.HashMap Int.OfNat.Expr Expr := {}
ringPolyMap : Std.HashMap CommRing.Poly Expr := {}
ringExprMap : Std.HashMap CommRing.RingExpr Expr := {}
structure ProofM.Context where
ctx : Expr
/-- Variables before reordering -/
ctx' : Expr
natCtx : Expr
ringCtx : Expr
/--
`unordered` is `true` if we entered a `.reorder c` justification. The variables in `c` and
its dependencies are unordered.
@@ -64,27 +71,41 @@ private abbrev caching (c : α) (k : ProofM Expr) : ProofM Expr := do
modify fun s => { s with cache := s.cache.insert addr h }
return h
def mkPolyDecl (p : Poly) : ProofM Expr := do
private def mkPolyDecl (p : Poly) : ProofM Expr := do
if let some x := (← get).polyMap[p]? then
return x
let x := mkFVar (← mkFreshFVarId)
modify fun s => { s with polyMap := s.polyMap.insert p x }
return x
def mkExprDecl (e : Int.Linear.Expr) : ProofM Expr := do
private def mkExprDecl (e : Int.Linear.Expr) : ProofM Expr := do
if let some x := (← get).exprMap[e]? then
return x
let x := mkFVar (← mkFreshFVarId)
modify fun s => { s with exprMap := s.exprMap.insert e x }
return x
def mkNatExprDecl (e : Int.OfNat.Expr) : ProofM Expr := do
private def mkNatExprDecl (e : Int.OfNat.Expr) : ProofM Expr := do
if let some x := (← get).natExprMap[e]? then
return x
let x := mkFVar (← mkFreshFVarId)
modify fun s => { s with natExprMap := s.natExprMap.insert e x }
return x
private def mkRingPolyDecl (p : CommRing.Poly) : ProofM Expr := do
if let some x := (← get).ringPolyMap[p]? then
return x
let x := mkFVar (← mkFreshFVarId)
modify fun s => { s with ringPolyMap := s.ringPolyMap.insert p x }
return x
private def mkRingExprDecl (e : CommRing.RingExpr) : ProofM Expr := do
if let some x := (← get).ringExprMap[e]? then
return x
let x := mkFVar (← mkFreshFVarId)
modify fun s => { s with ringExprMap := s.ringExprMap.insert e x }
return x
private def toContextExprCore (vars : PArray Expr) (type : Expr) : MetaM Expr :=
if h : 0 < vars.size then
RArray.toExpr type id (RArray.ofFn (vars[·]) h)
@@ -100,18 +121,34 @@ private def toContextExpr' : GoalM Expr := do
private def toNatContextExpr : GoalM Expr := do
toContextExprCore (← getNatVars) Nat.mkType
private def toRingContextExpr : GoalM Expr := do
if (← get').usedCommRing then
if let some ringId ← getIntRingId? then
return (← CommRing.RingM.run ringId do CommRing.toContextExpr)
RArray.toExpr Int.mkType id (RArray.leaf (mkIntLit 0))
private abbrev withProofContext (x : ProofM Expr) : GoalM Expr := do
withLetDecl `ctx (mkApp (mkConst ``RArray [levelZero]) Int.mkType) (← toContextExpr) fun ctx => do
withLetDecl `ctx' (mkApp (mkConst ``RArray [levelZero]) Int.mkType) (← toContextExpr') fun ctx' => do
withLetDecl `rctx (mkApp (mkConst ``RArray [levelZero]) Int.mkType) (← toRingContextExpr) fun ringCtx => do
withLetDecl `nctx (mkApp (mkConst ``RArray [levelZero]) Nat.mkType) (← toNatContextExpr) fun natCtx => do
go { ctx, ctx', natCtx } |>.run' {}
go { ctx, ctx', ringCtx, natCtx } |>.run' {}
where
go : ProofM Expr := do
let h ← x
let h ← mkLetOfMap (← get).polyMap h `p (mkConst ``Int.Linear.Poly) toExpr
let h ← mkLetOfMap (← get).exprMap h `e (mkConst ``Int.Linear.Expr) toExpr
let h ← mkLetOfMap (← get).natExprMap h `a (mkConst ``Int.OfNat.Expr) toExpr
mkLetFVars #[(← getContext), (← read).ctx', (← getNatContext)] h
let h ← mkLetOfMap (← get).ringPolyMap h `rp (mkConst ``Grind.CommRing.Poly) toExpr
let h ← mkLetOfMap (← get).ringExprMap h `re (mkConst ``Grind.CommRing.Expr) toExpr
mkLetFVars #[(← getContext), (← read).ctx', (← read).ringCtx, (← getNatContext)] h
/--
Returns a Lean expression representing the auxiliary `CommRing` variable context needed for normalizing
nonlinear polynomials.
-/
private abbrev getRingContext : ProofM Expr := do
return (← read).ringCtx
private def DvdCnstr.get_d_a (c : DvdCnstr) : GoalM (Int × Int) := do
let d := c.d
@@ -154,6 +191,20 @@ partial def EqCnstr.toExprProof (c' : EqCnstr) : ProofM Expr := caching c' do
(← getContext) (← mkPolyDecl c₁.p) (← mkPolyDecl c₂.p)
reflBoolTrue (← c₁.toExprProof) (← c₂.toExprProof)
| .reorder c => withUnordered <| c.toExprProof
| .commRingNorm c e p =>
let h := mkApp4 (mkConst ``Grind.CommRing.norm_int) (← getRingContext) (← mkRingExprDecl e) (← mkRingPolyDecl p) reflBoolTrue
return mkApp5 (mkConst ``Int.Linear.eq_norm_poly) (← getContext) (← mkPolyDecl c.p) (← mkPolyDecl c'.p) h (← c.toExprProof)
| .defnCommRing e p re rp p' =>
let h := mkApp4 (mkConst ``Grind.CommRing.norm_int) (← getRingContext) (← mkRingExprDecl re) (← mkRingPolyDecl rp) reflBoolTrue
let some x := (← getVarMap).find? { expr := e } | throwError "`grind` internal error, missing cutsat variable{indentExpr e}"
return mkApp8 (mkConst ``Int.Linear.eq_def_norm) (← getContext)
(toExpr x) (← mkPolyDecl p) (← mkPolyDecl p') (← mkPolyDecl c'.p)
reflBoolTrue (← mkEqRefl e) h
| .defnNatCommRing e x e' p re rp p' =>
let h₁ := mkApp2 (mkConst ``Int.OfNat.Expr.eq_denoteAsInt) (← getNatContext) (← mkNatExprDecl e)
let h₂ := mkApp4 (mkConst ``Grind.CommRing.norm_int) (← getRingContext) (← mkRingExprDecl re) (← mkRingPolyDecl rp) reflBoolTrue
return mkApp9 (mkConst ``Int.Linear.eq_def'_norm) (← getContext) (toExpr x) (← mkExprDecl e')
(← mkPolyDecl p) (← mkPolyDecl p') (← mkPolyDecl c'.p) reflBoolTrue h₁ h₂
partial def DvdCnstr.toExprProof (c' : DvdCnstr) : ProofM Expr := caching c' do
trace[grind.debug.cutsat.proof] "{← c'.pp}"
@@ -216,6 +267,9 @@ partial def DvdCnstr.toExprProof (c' : DvdCnstr) : ProofM Expr := caching c' do
(← getContext) (← mkPolyDecl p₁) (← mkPolyDecl p₂) (← mkPolyDecl c₃.p) (toExpr c₃.d) (toExpr s.k) (toExpr c'.d) (← mkPolyDecl c'.p)
(← s.toExprProof) reflBoolTrue
| .reorder c => withUnordered <| c.toExprProof
| .commRingNorm c e p =>
let h := mkApp4 (mkConst ``Grind.CommRing.norm_int) (← getRingContext) (← mkRingExprDecl e) (← mkRingPolyDecl p) reflBoolTrue
return mkApp6 (mkConst ``Int.Linear.dvd_norm_poly) (← getContext) (toExpr c.d) (← mkPolyDecl c.p) (← mkPolyDecl c'.p) h (← c.toExprProof)
partial def LeCnstr.toExprProof (c' : LeCnstr) : ProofM Expr := caching c' do
trace[grind.debug.cutsat.proof] "{← c'.pp}"
@@ -302,6 +356,9 @@ partial def LeCnstr.toExprProof (c' : LeCnstr) : ProofM Expr := caching c' do
return mkApp10 (mkConst thmName)
(← getContext) (← mkPolyDecl p₁) (← mkPolyDecl p₂) (← mkPolyDecl c₃.p) (toExpr c₃.d) (toExpr s.k) (toExpr coeff) (← mkPolyDecl c'.p) (← s.toExprProof) reflBoolTrue
| .reorder c => withUnordered <| c.toExprProof
| .commRingNorm c e p =>
let h := mkApp4 (mkConst ``Grind.CommRing.norm_int) (← getRingContext) (← mkRingExprDecl e) (← mkRingPolyDecl p) reflBoolTrue
return mkApp5 (mkConst ``Int.Linear.le_norm_poly) (← getContext) (← mkPolyDecl c.p) (← mkPolyDecl c'.p) h (← c.toExprProof)
partial def DiseqCnstr.toExprProof (c' : DiseqCnstr) : ProofM Expr := caching c' do
match c'.h with
@@ -329,6 +386,9 @@ partial def DiseqCnstr.toExprProof (c' : DiseqCnstr) : ProofM Expr := caching c'
(← getContext) (toExpr x) (← mkPolyDecl c₁.p) (← mkPolyDecl c₂.p) (← mkPolyDecl c'.p)
reflBoolTrue (← c₁.toExprProof) (← c₂.toExprProof)
| .reorder c => withUnordered <| c.toExprProof
| .commRingNorm c e p =>
let h := mkApp4 (mkConst ``Grind.CommRing.norm_int) (← getRingContext) (← mkRingExprDecl e) (← mkRingPolyDecl p) reflBoolTrue
return mkApp5 (mkConst ``Int.Linear.diseq_norm_poly) (← getContext) (← mkPolyDecl c.p) (← mkPolyDecl c'.p) h (← c.toExprProof)
partial def CooperSplit.toExprProof (s : CooperSplit) : ProofM Expr := caching s do
match s.h with
@@ -412,8 +472,9 @@ open CollectDecVars
mutual
partial def EqCnstr.collectDecVars (c' : EqCnstr) : CollectDecVarsM Unit := do unless (← alreadyVisited c') do
match c'.h with
| .core0 .. | .core .. | .coreNat .. | .defn .. | .defnNat .. | .coreToInt .. => return () -- Equalities coming from the core never contain cutsat decision variables
| .reorder c | .norm c | .divCoeffs c => c.collectDecVars
| .core0 .. | .core .. | .coreNat .. | .defn .. | .defnNat ..
| .defnCommRing .. | .defnNatCommRing .. | .coreToInt .. => return () -- Equalities coming from the core never contain cutsat decision variables
| .commRingNorm c .. | .reorder c | .norm c | .divCoeffs c => c.collectDecVars
| .subst _ c₁ c₂ | .ofLeGe c₁ c₂ => c₁.collectDecVars; c₂.collectDecVars
partial def CooperSplit.collectDecVars (s : CooperSplit) : CollectDecVarsM Unit := do unless (← alreadyVisited s) do
@@ -429,14 +490,15 @@ partial def DvdCnstr.collectDecVars (c' : DvdCnstr) : CollectDecVarsM Unit := do
match c'.h with
| .core _ | .coreNat .. => return ()
| .cooper₁ c | .cooper₂ c
| .reorder c | .norm c | .elim c | .divCoeffs c | .ofEq _ c => c.collectDecVars
| .commRingNorm c .. | .reorder c | .norm c | .elim c | .divCoeffs c | .ofEq _ c => c.collectDecVars
| .solveCombine c₁ c₂ | .solveElim c₁ c₂ | .subst _ c₁ c₂ => c₁.collectDecVars; c₂.collectDecVars
partial def LeCnstr.collectDecVars (c' : LeCnstr) : CollectDecVarsM Unit := do unless (← alreadyVisited c') do
match c'.h with
| .core .. | .coreNeg .. | .coreNat .. | .coreNatNeg .. | .coreToInt .. | .denoteAsIntNonneg .. | .bound .. => return ()
| .dec h => markAsFound h
| .reorder c | .cooper c | .norm c | .divCoeffs c => c.collectDecVars
| .commRingNorm c .. | .reorder c | .cooper c
| .norm c | .divCoeffs c => c.collectDecVars
| .dvdTight c₁ c₂ | .negDvdTight c₁ c₂
| .combine c₁ c₂ | .combineDivCoeffs c₁ c₂ _
| .subst _ c₁ c₂ | .ofLeDiseq c₁ c₂ => c₁.collectDecVars; c₂.collectDecVars
@@ -445,7 +507,7 @@ partial def LeCnstr.collectDecVars (c' : LeCnstr) : CollectDecVarsM Unit := do u
partial def DiseqCnstr.collectDecVars (c' : DiseqCnstr) : CollectDecVarsM Unit := do unless (← alreadyVisited c') do
match c'.h with
| .core0 .. | .core .. | .coreNat .. | .coreToInt .. => return () -- Disequalities coming from the core never contain cutsat decision variables
| .reorder c | .norm c | .divCoeffs c | .neg c => c.collectDecVars
| .commRingNorm c .. | .reorder c | .norm c | .divCoeffs c | .neg c => c.collectDecVars
| .subst _ c₁ c₂ => c₁.collectDecVars; c₂.collectDecVars
end

View File

@@ -10,6 +10,7 @@ import Lean.Data.PersistentArray
import Lean.Meta.Tactic.Grind.ExprPtr
import Lean.Meta.Tactic.Grind.Arith.Util
import Lean.Meta.Tactic.Grind.Arith.Cutsat.ToIntInfo
import Lean.Meta.Tactic.Grind.Arith.CommRing.Types
namespace Lean.Meta.Grind.Arith.Cutsat
@@ -95,6 +96,9 @@ inductive EqCnstrProof where
| subst (x : Var) (c₁ : EqCnstr) (c₂ : EqCnstr)
| ofLeGe (c₁ : LeCnstr) (c₂ : LeCnstr)
| reorder (c : EqCnstr)
| commRingNorm (c : EqCnstr) (e : CommRing.RingExpr) (p : CommRing.Poly)
| defnCommRing (e : Expr) (p : Poly) (re : CommRing.RingExpr) (rp : CommRing.Poly) (p' : Poly)
| defnNatCommRing (e : Int.OfNat.Expr) (x : Var) (e' : Int.Linear.Expr) (p : Poly) (re : CommRing.RingExpr) (rp : CommRing.Poly) (p' : Poly)
/-- A divisibility constraint and its justification/proof. -/
structure DvdCnstr where
@@ -156,6 +160,7 @@ inductive DvdCnstrProof where
/-- `c.c₃?` must be `some` -/
| cooper₂ (c : CooperSplit)
| reorder (c : DvdCnstr)
| commRingNorm (c : DvdCnstr) (e : CommRing.RingExpr) (p : CommRing.Poly)
/-- An inequality constraint and its justification/proof. -/
structure LeCnstr where
@@ -182,6 +187,7 @@ inductive LeCnstrProof where
| dvdTight (c₁ : DvdCnstr) (c₂ : LeCnstr)
| negDvdTight (c₁ : DvdCnstr) (c₂ : LeCnstr)
| reorder (c : LeCnstr)
| commRingNorm (c : LeCnstr) (e : CommRing.RingExpr) (p : CommRing.Poly)
/-- A disequality constraint and its justification/proof. -/
structure DiseqCnstr where
@@ -203,6 +209,7 @@ inductive DiseqCnstrProof where
| neg (c : DiseqCnstr)
| subst (x : Var) (c₁ : EqCnstr) (c₂ : DiseqCnstr)
| reorder (c : DiseqCnstr)
| commRingNorm (c : DiseqCnstr) (e : CommRing.RingExpr) (p : CommRing.Poly)
/--
A proof of `False`.
@@ -330,6 +337,10 @@ structure State where
from a α-term `e` to the pair `(toInt e, α)`.
-/
toIntTermMap : PHashMap ExprPtr ToIntTermInfo := {}
/--
`usedCommRing` is `true` if the `CommRing` has been used to normalize expressions.
-/
usedCommRing : Bool := false
deriving Inhabited
end Lean.Meta.Grind.Arith.Cutsat

View File

@@ -89,10 +89,7 @@ private def intro1 : GoalM FVarId := do
let (name, type) ← match target with
| .forallE n d .. => pure (n, d)
| .letE n d .. => pure (n, d)
| _ =>
let some (n, d, _) := target.letFun? |
throwError "`grind` internal error, binder expected"
pure (n, d)
| _ => throwError "`grind` internal error, binder expected"
let name ← mkCleanName name type
let (fvarId, mvarId) ← (← get).mvarId.intro name
modify fun s => { s with mvarId }
@@ -149,7 +146,7 @@ private partial def introNext (goal : Goal) (generation : Nat) : GrindM IntroRes
let h ← mkLambdaFVars #[mkFVar fvarId] mvarNew
mvarId.assign h
return .newHyp fvarId { (← get) with mvarId := mvarIdNew }
else if target.isLet || target.isLetFun then
else if target.isLet then
if (← getConfig).zetaDelta then
let targetNew := expandLet target #[]
let mvarId := (← get).mvarId

View File

@@ -77,11 +77,12 @@ def GrindM.run (x : GrindM α) (params : Params) (fallback : Fallback) : MetaM
let (btrueExpr, scState) := shareCommonAlpha (mkConst ``Bool.true) scState
let (natZExpr, scState) := shareCommonAlpha (mkNatLit 0) scState
let (ordEqExpr, scState) := shareCommonAlpha (mkConst ``Ordering.eq) scState
let (intExpr, scState) := shareCommonAlpha Int.mkType scState
let simprocs := params.normProcs
let simpMethods := Simp.mkMethods simprocs discharge? (wellBehavedDischarge := true)
let simp := params.norm
let config := params.config
x (← mkMethods fallback).toMethodsRef { config, simpMethods, simp, trueExpr, falseExpr, natZExpr, btrueExpr, bfalseExpr, ordEqExpr }
x (← mkMethods fallback).toMethodsRef { config, simpMethods, simp, trueExpr, falseExpr, natZExpr, btrueExpr, bfalseExpr, ordEqExpr, intExpr }
|>.run' { scState }
private def mkCleanState (mvarId : MVarId) (params : Params) : MetaM Clean.State := mvarId.withContext do

View File

@@ -112,6 +112,7 @@ structure Context where
btrueExpr : Expr
bfalseExpr : Expr
ordEqExpr : Expr -- `Ordering.eq`
intExpr : Expr -- `Int`
/-- Key for the congruence theorem cache. -/
structure CongrTheoremCacheKey where
@@ -251,6 +252,10 @@ def getNatZeroExpr : GrindM Expr := do
def getOrderingEqExpr : GrindM Expr := do
return (← readThe Context).ordEqExpr
/-- Returns the internalized `Int`. -/
def getIntExpr : GrindM Expr := do
return (← readThe Context).intExpr
def cheapCasesOnly : GrindM Bool :=
return (← readThe Context).cheapCases

View File

@@ -50,35 +50,24 @@ namespace Lean.Meta
let fvars := fvars.push fvar
loop i lctx fvars j s body
| i+1, type =>
if let some (n, type, val, body) := type.letFun? then
let type := type.instantiateRevRange j fvars.size fvars
let type := type.headBeta
let val := val.instantiateRevRange j fvars.size fvars
let fvarId ← mkFreshFVarId
let (n, s) ← mkName lctx n true s
let lctx := lctx.mkLetDecl fvarId n type val
let fvar := mkFVar fvarId
let fvars := fvars.push fvar
loop i lctx fvars j s body
else
let type := type.instantiateRevRange j fvars.size fvars
withLCtx' lctx do
withNewLocalInstances fvars j do
/- We used to use just `whnf`, but it produces counterintuitive behavior if
- `type` is a metavariable `?m` such that `?m := let x := v; b`, or
- `type` has `MData` or annotations such as `optParam` around a `let`-expression.
let type := type.instantiateRevRange j fvars.size fvars
withLCtx' lctx do
withNewLocalInstances fvars j do
/- We used to use just `whnf`, but it produces counterintuitive behavior if
- `type` is a metavariable `?m` such that `?m := let x := v; b`, or
- `type` has `MData` or annotations such as `optParam` around a `let`-expression.
`whnf` instantiates metavariables, and consumes `MData`, but it also expands the `let`.
-/
let newType := (← instantiateMVars type).cleanupAnnotations
if newType.isForall || newType.isLet || newType.isLetFun then
`whnf` instantiates metavariables, and consumes `MData`, but it also expands the `let`.
-/
let newType := (← instantiateMVars type).cleanupAnnotations
if newType.isForall || newType.isLet then
loop (i+1) lctx fvars fvars.size s newType
else
let newType ← whnf newType
if newType.isForall then
loop (i+1) lctx fvars fvars.size s newType
else
let newType ← whnf newType
if newType.isForall then
loop (i+1) lctx fvars fvars.size s newType
else
throwTacticEx `introN mvarId "insufficient number of binders"
throwTacticEx `introN mvarId "insufficient number of binders"
let (fvars, mvarId) ← loop n lctx #[] 0 s mvarType
return (fvars.map Expr.fvarId!, mvarId)
@@ -114,7 +103,7 @@ private def mkAuxNameImp (preserveBinderNames : Bool) (hygienic : Bool) (useName
mkAuxNameWithoutGivenName rest
where
mkAuxNameWithoutGivenName (rest : List Name) : MetaM (Name × List Name) := do
-- Use a nicer binder name than `[anonymous]`, which can appear in for example `letFun x f` when `f` is not a lambda expression.
-- Use a nicer binder name than `[anonymous]`.
-- In this case, we make sure the name is hygienic.
let binderName ← if binderName.isAnonymous then mkFreshUserName `a else pure binderName
if preserveBinderNames then
@@ -177,11 +166,7 @@ partial def getIntrosSize : Expr → Nat
| .forallE _ _ b _ => getIntrosSize b + 1
| .letE _ _ _ b _ => getIntrosSize b + 1
| .mdata _ b => getIntrosSize b
| e =>
if let some (_, _, _, b) := e.letFun? then
getIntrosSize b + 1
else
0
| _ => 0
/--
Introduce as many binders as possible without unfolding definitions.

View File

@@ -68,7 +68,7 @@ def nextNameForBinderName? (binderName : Name) : M (Option Name) := do
return n
else
if binderName.isAnonymous then
-- Use a nicer binder name than `[anonymous]`, which can appear for example in `letFun x f` when `f` is not a lambda expression.
-- Use a nicer binder name than `[anonymous]`.
mkFreshUserName `a
else if (← read).preserveBinderNames || n.hasMacroScopes then
return n
@@ -133,7 +133,7 @@ def withEnsuringDeclsInContext [Monad m] [MonadControlT MetaM m] [MonadLCtx m] (
withExistingLocalDecls decls.toList k
/--
Closes all the local declarations in `e`, creating `let` and `letFun` expressions.
Closes all the local declarations in `e`, creating `let` and `have` expressions.
Does not require that any of the declarations are in context.
Assumes that `e` contains no metavariables with local contexts that contain any of these metavariables
(the extraction procedure creates no new metavariables, so this is the case).
@@ -148,7 +148,7 @@ def mkLetDecls (decls : Array LocalDecl') (e : Expr) : Expr :=
/--
Makes sure the declaration for `fvarId` is marked with `isLet := true`.
Used in `lift + merge` mode to ensure that, after merging, if any version was a `let` then it's a `let` rather than a `letFun`.
Used in `lift + merge` mode to ensure that, after merging, if any version was a `let` then it's a `let` rather than a `have`.
-/
def ensureIsLet (fvarId : FVarId) : M Unit := do
modify fun s => { s with
@@ -185,12 +185,11 @@ def initializeValueMap : M Unit := do
modify fun s => { s with valueMap := s.valueMap.insert value decl.fvarId }
/--
Returns `true` if the expression contains a `let` expression or a `letFun`
(this does not verify that the `letFun`s are well-formed).
Returns `true` if the expression contains a `let` expression or a `have`.
Its purpose is to be a check for whether a subexpression can be skipped.
-/
def containsLet (e : Expr) : Bool :=
Option.isSome <| e.find? fun e' => e'.isLet || e'.isConstOf ``letFun
Option.isSome <| e.find? (·.isLet)
/--
Extracts lets from `e`.
@@ -214,7 +213,7 @@ partial def extractCore (fvars : List Expr) (e : Expr) (topLevel : Bool := false
if !containsLet e then
return e
-- Don't honor `proofs := false` or `types := false` for top-level lets, since it's confusing not having them be extracted.
unless topLevel && (e.isLet || e.isLetFun || e.isMData) do
unless topLevel && (e.isLet || e.isMData) do
if !cfg.proofs then
if ← isProof e then
return e
@@ -225,12 +224,8 @@ partial def extractCore (fvars : List Expr) (e : Expr) (topLevel : Bool := false
match e with
| .bvar .. | .fvar .. | .mvar .. | .sort .. | .const .. | .lit .. => unreachable!
| .mdata _ e' => return e.updateMData! (← extractCore fvars e' (topLevel := topLevel))
| .letE n t v b nondep => extractLetLike (!nondep) n t v b (fun t v b => pure <| e.updateLetE! t v b) (topLevel := topLevel)
| .app .. =>
if e.isLetFun then
extractLetFun e (topLevel := topLevel)
else
whenDescend do extractApp e.getAppFn e.getAppArgs
| .letE n t v b nondep => extractLetLike (!nondep) n t v b (topLevel := topLevel)
| .app .. => whenDescend do extractApp e.getAppFn e.getAppArgs
| .proj _ _ s => whenDescend do return e.updateProj! (← extractCore fvars s)
| .lam n t b i => whenDescend do extractBinder n t b i (fun t b => e.updateLambda! i t b)
| .forallE n t b i => whenDescend do extractBinder n t b i (fun t b => e.updateForall! i t b)
@@ -248,7 +243,7 @@ where
return mk t (b.abstract #[x])
else
return mk t b
extractLetLike (isLet : Bool) (n : Name) (t v b : Expr) (mk : Expr → Expr → Expr → M Expr) (topLevel : Bool) : M Expr := do
extractLetLike (isLet : Bool) (n : Name) (t v b : Expr) (topLevel : Bool) : M Expr := do
extractLetLike (isLet : Bool) (n : Name) (t v b : Expr) (topLevel : Bool) : M Expr := do
let cfg ← read
let t ← extractCore fvars t
let v ← extractCore fvars v
@@ -261,39 +256,26 @@ where
extractCore fvars (b.instantiate1 (.fvar fvarId)) (topLevel := topLevel)
let (extract, n) ← isExtractableLet fvars n t v
if !extract && (!cfg.underBinder || !cfg.descend) then
return mk t v b
return e.updateLetE! t v b
withLetDecl n t v fun x => do
if extract then
addDecl (← x.fvarId!.getDecl) isLet
extractCore fvars (b.instantiate1 x) (topLevel := topLevel)
else
let b ← extractCore (x :: fvars) (b.instantiate1 x)
mk t v (b.abstract #[x])
/-- `e` is the letFun expression -/
extractLetFun (e : Expr) (topLevel : Bool) : M Expr := do
let letFunE := e.getAppFn
let β := e.getArg! 1
let (n, t, v, b) := e.letFun?.get!
extractLetLike false n t v b (topLevel := topLevel)
(fun t v b =>
-- Strategy: construct letFun directly rather than use `mkLetFun`.
-- We don't update the `β` argument.
return mkApp4 letFunE t β v (.lam n t b .default))
return e.updateLetE! t v (b.abstract #[x])
extractApp (f : Expr) (args : Array Expr) : M Expr := do
let cfg ← read
if f.isConstOf ``letFun && args.size ≥ 4 then
extractApp (mkAppN f args[*...4]) args[4...*]
let f' ← extractCore fvars f
if cfg.implicits then
return mkAppN f' (← args.mapM (extractCore fvars))
else
let f' ← extractCore fvars f
if cfg.implicits then
return mkAppN f' (← args.mapM (extractCore fvars))
else
let (paramInfos, _) ← instantiateForallWithParamInfos (← inferType f) args
let mut args := args
for i in [0:args.size] do
if paramInfos[i]!.binderInfo.isExplicit then
args := args.set! i (← extractCore fvars args[i]!)
return mkAppN f' args
let (paramInfos, _) ← instantiateForallWithParamInfos (← inferType f) args
let mut args := args
for i in [0:args.size] do
if paramInfos[i]!.binderInfo.isExplicit then
args := args.set! i (← extractCore fvars args[i]!)
return mkAppN f' args
def extractTopLevel (e : Expr) : M Expr := do
let e ← instantiateMVars e

View File

@@ -210,12 +210,6 @@ private def reduceStep (e : Expr) : SimpM Expr := do
return expandLet b #[v] (zetaHave := cfg.zetaHave)
else if cfg.zetaUnused && !b.hasLooseBVars then
return consumeUnusedLet b
-- TODO(kmill): we are going to remove `letFun` support.
if let some (args, _, _, v, b) := e.letFunAppArgs? then
if cfg.zeta && cfg.zetaHave then
return mkAppN (expandLet b #[v] (zetaHave := cfg.zetaHave)) args
else if cfg.zetaUnused && !b.hasLooseBVars then
return mkAppN (consumeUnusedLet b) args
match (← unfold? e) with
| some e' =>
trace[Meta.Tactic.simp.rewrite] "unfold {.ofConst e.getAppFn}, {e} ==> {e'}"
@@ -986,12 +980,6 @@ def simpApp (e : Expr) : SimpM Result := do
if isOfNatNatLit e || isOfScientificLit e || isCharLit e then
-- Recall that we fold "orphan" kernel Nat literals `n` into `OfNat.ofNat n`
return { expr := e }
else if let some (args, n, t, v, b) := e.letFunAppArgs? then
/-
Note: we will be removing `letFun`, and we want everything to be in terms of `nondep := true` lets.
This will then use `simpLet`.
-/
return { expr := mkAppN (Expr.letE n t v b true) args }
else
congr e

View File

@@ -601,12 +601,6 @@ partial def expandLet (e : Expr) (vs : Array Expr) (zetaHave : Bool := true) : E
expandLet b (vs.push <| v.instantiateRev vs) zetaHave
else
e.instantiateRev vs
-- TODO(kmill): we are going to remove `letFun` support.
else if let some (_, _, v, b) := e.letFun? then
if zetaHave then
expandLet b (vs.push <| v.instantiateRev vs) zetaHave
else
e.instantiateRev vs
else
e.instantiateRev vs
@@ -620,13 +614,8 @@ In the case of `isDefEqQuick`, it is also used when `zeta` is set.
-/
partial def consumeUnusedLet (e : Expr) (consumeNondep : Bool := false) : Expr :=
match e with
| e@(.letE _ _ _ b nondep) => if b.hasLooseBVars || (nondep && !consumeNondep) then e else consumeUnusedLet b consumeNondep
| e =>
-- TODO(kmill): we are going to remove `letFun` support.
if let some (_, _, _, b) := e.letFun? then
if b.hasLooseBVars || !consumeNondep then e else consumeUnusedLet b consumeNondep
else
e
| .letE _ _ _ b nondep => if b.hasLooseBVars || (nondep && !consumeNondep) then e else consumeUnusedLet b consumeNondep
| _ => e
/--
Apply beta-reduction, zeta-reduction (i.e., unfold let local-decls), iota-reduction,
@@ -650,13 +639,6 @@ where
return e
| .app f .. =>
let cfg ← getConfig
-- TODO(kmill): we are going to remove `letFun` support.
if let some (args, _, _, v, b) := e.letFunAppArgs? then
-- When zeta reducing enabled, always reduce `letFun` no matter the current reducibility level
if cfg.zeta && cfg.zetaHave then
return (← go <| mkAppN (expandLet b #[v] (zetaHave := cfg.zetaHave)) args)
else if cfg.zetaUnused && !b.hasLooseBVars then
return (← go <| mkAppN (consumeUnusedLet b) args)
let f := f.getAppFn
let f' ← go f
/-

View File

@@ -692,9 +692,7 @@ creating a *nondependent* let expression.
@[builtin_term_parser] def «have» := leading_parser:leadPrec
withPosition ("have" >> letConfig >> letDecl) >> optSemicolon termParser
/--
`let_fun x := v; b` is syntax sugar for `letFun v (fun x => b)`.
It is very similar to `let x := v; b`, but they are not equivalent.
In `let_fun`, the value `v` has been abstracted away and cannot be accessed in `b`.
`let_fun x := v; b` is deprecated syntax sugar for `have x := v; b`.
-/
@[builtin_term_parser] def «let_fun» := leading_parser:leadPrec
withPosition ((symbol "let_fun " <|> "let_λ ") >> letDecl) >> optSemicolon termParser

View File

@@ -37,8 +37,8 @@ partial def parserNodeKind? (e : Expr) : MetaM (Option Name) := do
let reduceEval? e : MetaM (Option Name) := do
try pure <| some (← reduceEval e) catch _ => pure none
let e ← whnfCore e
if e matches Expr.lam .. then
lambdaLetTelescope e fun _ e => parserNodeKind? e
if e matches Expr.lam .. | Expr.letE .. then
lambdaLetTelescope (preserveNondepLet := false) e fun _ e => parserNodeKind? e
else if e.isAppOfArity ``leadingNode 3 || e.isAppOfArity ``trailingNode 4 || e.isAppOfArity ``node 2 then
reduceEval? (e.getArg! 0)
else if e.isAppOfArity ``withAntiquot 2 then
@@ -61,7 +61,7 @@ variable {α} (ctx : Context α) (builtin : Bool) (force : Bool) in
partial def compileParserExpr (e : Expr) : MetaM Expr := do
let e ← whnfCore e
match e with
| .lam .. => mapLambdaLetTelescope e fun _ b => compileParserExpr b
| .lam .. | .letE .. => mapLambdaLetTelescope (preserveNondepLet := false) e fun _ b => compileParserExpr b
| .fvar .. => return e
| _ => do
let fn := e.getAppFn

View File

@@ -11,19 +11,8 @@ import Lean.Parser.Module
import Lean.ParserCompiler
import Lean.Util.NumObjs
import Lean.Util.ShareCommon
namespace Lean
def PPContext.runCoreM {α : Type} (ppCtx : PPContext) (x : CoreM α) : IO α :=
Prod.fst <$> x.toIO { options := ppCtx.opts, currNamespace := ppCtx.currNamespace
openDecls := ppCtx.openDecls
fileName := "<PrettyPrinter>", fileMap := default
diag := getDiag ppCtx.opts }
{ env := ppCtx.env, ngen := { namePrefix := `_pp_uniq } }
def PPContext.runMetaM {α : Type} (ppCtx : PPContext) (x : MetaM α) : IO α :=
ppCtx.runCoreM <| x.run' { lctx := ppCtx.lctx } { mctx := ppCtx.mctx }
namespace PrettyPrinter
namespace Lean.PrettyPrinter
def ppCategory (cat : Name) (stx : Syntax) : CoreM Format := do
let opts ← getOptions
@@ -147,22 +136,6 @@ def ofFormatWithInfosM (fmt : MetaM FormatWithInfos) : MessageData :=
| .ok fmt => return .ofFormatWithInfos fmt
| .error ex => return m!"[Error pretty printing: {ex}]"
/--
Turns a `MetaM MessageData` into a `MessageData.lazy` which will run the monadic value.
The optional array of expressions is used to set the `hasSyntheticSorry` fields, and should
comprise the expressions that are included in the message data.
-/
def ofLazyM (f : MetaM MessageData) (es : Array Expr := #[]) : MessageData :=
.lazy
(f := fun ppctxt => do
match (← ppctxt.runMetaM f |>.toBaseIO) with
| .ok fmt => return fmt
| .error ex => return m!"[Error pretty printing: {ex}]"
)
(hasSyntheticSorry := fun mvarctxt => es.any (fun a =>
instantiateMVarsCore mvarctxt a |>.1.hasSyntheticSorry
))
/--
Pretty print a const expression using `delabConst` and generate terminfo.
This function avoids inserting `@` if the constant is for a function whose first

View File

@@ -836,23 +836,6 @@ where
else
x
/--
Delaborates applications of the form `letFun v (fun x => b)` as `have x := v; b`.
-/
@[builtin_delab app.letFun]
def delabLetFun : Delab := whenPPOption getPPNotation <| withOverApp 4 do
let e ← getExpr
guard <| e.getAppNumArgs == 4
let Expr.lam n _ b _ := e.appArg! | failure
let n ← getUnusedName n b
let stxV ← withAppFn <| withAppArg delab
let (stxN, stxB) ← withAppArg <| withBindingBody' n (mkAnnotatedIdent n) fun stxN => return (stxN, ← delab)
if ← getPPOption getPPLetVarTypes <||> getPPOption getPPAnalysisLetVarType then
let stxT ← SubExpr.withNaryArg 0 delab
`(have $stxN : $stxT := $stxV; $stxB)
else
`(have $stxN := $stxV; $stxB)
@[builtin_delab mdata]
def delabMData : Delab := do
if let some _ := inaccessible? (← getExpr) then
@@ -1312,16 +1295,6 @@ partial def delabDoElems : DelabM (List Syntax) := do
prependAndRec `(doElem|have $(mkIdent n) : $stxT := $stxV)
else
prependAndRec `(doElem|let $(mkIdent n) : $stxT := $stxV)
else if e.isLetFun then
-- letFun.{u, v} : {α : Sort u} → {β : α → Sort v} → (v : α) → ((x : α) → β x) → β v
let stxT ← withNaryArg 0 delab
let stxV ← withNaryArg 2 delab
withAppArg do
match (← getExpr) with
| Expr.lam .. =>
withBindingBodyUnusedName fun n => do
prependAndRec `(doElem|have $n:term : $stxT := $stxV)
| _ => failure
else
let stx ← delab
return [← `(doElem|$stx:term)]

View File

@@ -570,6 +570,52 @@ register_builtin_option pp.oneline : Bool := {
descr := "(pretty printer) elide all but first line of pretty printer output"
}
namespace OneLine
/-!
The `OneLine.pretty` function takes the first line of a `Format` while preserving tags.
-/
structure State where
line : Format := .nil
column : Nat := 0
tags : List (Nat × Format) := []
/-- We use `EStateM` to be able to abort pretty printing once we get to the end of a line. -/
abbrev M := EStateM Unit State
def continuation : String := " [...]"
instance : Std.Format.MonadPrettyFormat M where
pushOutput s := do
let lineEnd := s.find (· == '\n')
if lineEnd < s.endPos then
let s := (s.extract 0 lineEnd).trimRight ++ continuation
modify fun st => { st with line := st.line.append s }
throw ()
else
modify fun st => { st with line := st.line.append s, column := st.column + s.length }
pushNewline _ := do
modify fun st => { st with line := st.line.append continuation }
throw ()
currColumn := return (← get).column
startTag tag := modify fun st => { st with line := .nil, tags := (tag, st.line) :: st.tags }
endTags num := do
let tags := (← get).tags
let ended := tags.take num
let acc := ended.foldl (init := (← get).line) fun acc (t, line') => line'.append (acc.tag t)
modify fun st => { st with line := acc, tags := tags.drop num }
/--
Pretty prints `f` and creates a new `Format` of its first line. Preserves tags.
-/
def pretty (f : Format) (width : Nat) : Format :=
let m : M Unit := Std.Format.prettyM f width
match m.run {} with
| .ok _ s | .error _ s =>
s.tags.foldl (init := s.line) fun acc (t, line') => line'.append (acc.tag t)
end OneLine
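
As a quick illustration of the behavior described above (a hypothetical usage sketch, assuming the `OneLine` namespace is open as written and that an ungrouped `Format.line` renders as a line break):

```lean
open Std in
-- A two-line Format: "first line\nsecond line" when pretty printed normally.
def twoLines : Format := "first line" ++ Format.line ++ "second line"

-- `OneLine.pretty` should cut at the first line break and append " [...]",
-- yielding a Format that renders roughly as: first line [...]
#eval toString (OneLine.pretty twoLines 120)
```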
def format (formatter : Formatter) (stx : Syntax) : CoreM Format := do
trace[PrettyPrinter.format.input] "{Std.format stx}"
let options ← getOptions
@@ -579,12 +625,7 @@ def format (formatter : Formatter) (stx : Syntax) : CoreM Format := do
let (_, st) ← (concat formatter { table, options }).run { stxTrav := .fromSyntax stx }
let mut f := st.stack[0]!
if pp.oneline.get options then
let mut s := f.pretty' options |>.trim
let lineEnd := s.find (· == '\n')
if lineEnd < s.endPos then
s := s.extract 0 lineEnd ++ " [...]"
-- TODO: preserve `Format.tag`s?
f := s
f := OneLine.pretty f (Std.Format.format.width.get options)
return .fill f)
(fun _ => throwError "format: uncaught backtrack exception")


@@ -33,6 +33,9 @@ def Expr.hasNonSyntheticSorry (e : Expr) : Bool :=
def Declaration.hasSorry (d : Declaration) : Bool := Id.run do
d.foldExprM (fun r e => r || e.hasSorry) false
def Declaration.hasSyntheticSorry (d : Declaration) : Bool := Id.run do
d.foldExprM (fun r e => r || e.hasSyntheticSorry) false
def Declaration.hasNonSyntheticSorry (d : Declaration) : Bool := Id.run do
d.foldExprM (fun r e => r || e.hasNonSyntheticSorry) false


@@ -60,46 +60,43 @@ which stores information about a (successful) build.
structure BuildMetadata where
depHash : Hash
inputs : Array (String × Json)
outputs? : Option Json := none
outputs? : Option Json
log : Log
/-- A trace file that was created from fetching an artifact from the cache. -/
synthetic : Bool
deriving ToJson
def BuildMetadata.ofHash (h : Hash) : BuildMetadata :=
{depHash := h, inputs := #[], log := {}}
def BuildMetadata.fromJson? (json : Json) : Except String BuildMetadata := do
let obj ← JsonObject.fromJson? json
let depHash ← obj.get "depHash"
let inputs ← obj.getD "inputs" {}
let outputs? ← obj.getD "outputs" none
let log ← obj.getD "log" {}
return {depHash, inputs, outputs?, log}
let synthetic ← obj.getD "synthetic" false
return {depHash, inputs, outputs?, log, synthetic}
instance : FromJson BuildMetadata := BuildMetadata.fromJson?
/-- The state of the trace file data saved on the file system. -/
inductive SavedTrace
| missing
| invalid
| ok (data : BuildMetadata)
/--
Construct build metadata from a trace stub.
That is, the old version of the trace file format that just contained a hash.
-/
def BuildMetadata.ofStub (hash : Hash) : BuildMetadata :=
{depHash := hash, inputs := #[], outputs? := none, log := {}, synthetic := false}
/-- Read persistent trace data from a file. -/
def readTraceFile (path : FilePath) : LogIO SavedTrace := do
match (← IO.FS.readFile path |>.toBaseIO) with
| .ok contents =>
if let some hash := Hash.ofString? contents.trim then
return .ok <| .ofHash hash
else
match Json.parse contents >>= fromJson? with
| .ok contents => return .ok contents
| .error e => logVerbose s!"{path}: invalid trace file: {e}"; return .invalid
| .error (.noFileOrDirectory ..) => return .missing
| .error e => logWarning s!"{path}: read failed: {e}"; return .invalid
@[deprecated ofStub (since := "2025-06-28")]
abbrev BuildMetadata.ofHash := @ofStub
/-- Read persistent trace data from a file. -/
@[inline, deprecated readTraceFile (since := "2025-06-26")]
def readTraceFile? (path : FilePath) : LogIO (Option BuildMetadata) := do
if let .ok data ← readTraceFile path then return some data else return none
/-- Parse build metadata from a trace file's contents. -/
def BuildMetadata.parse (contents : String) : Except String BuildMetadata :=
if let some hash := Hash.ofString? contents.trim then
return .ofStub hash
else
Json.parse contents >>= fromJson?
/-- Construct build metadata from a cached input-to-output mapping. -/
def BuildMetadata.ofFetch (inputHash : Hash) (outputs : Json) : BuildMetadata :=
{depHash := inputHash, outputs? := outputs, synthetic := true, inputs := #[], log := {}}
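To make the two on-disk formats concrete, here is a hedged sketch of what `BuildMetadata.parse` accepts (the literal values below are assumptions for illustration; the exact stub encoding depends on `Hash.ofString?`):

```lean
-- Legacy stub format: the trace file holds only a hash string, which
-- `Hash.ofString?` recognizes and `BuildMetadata.parse` turns into
-- `BuildMetadata.ofStub` (no inputs, no outputs, empty log, not synthetic).
#eval BuildMetadata.parse "1234567890"

-- Newer format: a JSON object with the full metadata, decoded via
-- `Json.parse` and the `FromJson BuildMetadata` instance; the "synthetic"
-- field defaults to `false` when absent.
#eval BuildMetadata.parse
  "{\"depHash\": \"1234567890\", \"inputs\": [], \"log\": []}"
```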
private partial def serializeInputs (inputs : Array BuildTrace) : Array (String × Json) :=
inputs.foldl (init := {}) fun r trace =>
@@ -110,21 +107,63 @@ private partial def serializeInputs (inputs : Array BuildTrace) : Array (String
toJson (serializeInputs trace.inputs)
r.push (trace.caption, val)
/-- Write persistent trace data to a file. -/
@[inline] def writeTraceFile
private def BuildMetadata.ofBuildCore
(depTrace : BuildTrace) (outputs : Json) (log : Log)
: BuildMetadata where
inputs := serializeInputs depTrace.inputs
depHash := depTrace.hash
outputs? := outputs
synthetic := false
log
/-- Construct trace file contents from a build's trace, outputs, and log. -/
@[inline] def BuildMetadata.ofBuild
[ToJson α] (depTrace : BuildTrace) (outputs : α) (log : Log)
:= BuildMetadata.ofBuildCore depTrace (toJson outputs) log
/-- The state of the trace file data saved on the file system. -/
inductive SavedTrace
| missing
| invalid
| ok (data : BuildMetadata)
/--
Try to read data from a trace file.
Logs if the read failed or the contents were invalid.
-/
def readTraceFile (path : FilePath) : LogIO SavedTrace := do
match (← IO.FS.readFile path |>.toBaseIO) with
| .ok contents =>
match BuildMetadata.parse contents with
| .ok data => return .ok data
| .error e => logVerbose s!"{path}: invalid trace file: {e}"; return .invalid
| .error (.noFileOrDirectory ..) => return .missing
| .error e => logWarning s!"{path}: read failed: {e}"; return .invalid
/--
Tries to read data from a trace file. On failure, returns `none`.
Logs if the read failed or the contents were invalid.
-/
@[inline, deprecated readTraceFile (since := "2025-06-26")]
def readTraceFile? (path : FilePath) : LogIO (Option BuildMetadata) := do
if let .ok data ← readTraceFile path then return some data else return none
/-- Write a trace file containing the metadata. -/
def BuildMetadata.writeFile (path : FilePath) (data : BuildMetadata) : IO Unit := do
createParentDirs path
IO.FS.writeFile path (toJson data).pretty
/-- Write a trace file containing metadata on an artifact fetched from a cache. -/
@[inline] def writeFetchTrace (path : FilePath) (inputHash : Hash) (outputs : Json) : IO Unit :=
BuildMetadata.writeFile path (.ofFetch inputHash outputs)
/-- Write a trace file containing metadata about a build. -/
@[inline] def writeBuildTrace
[ToJson α] (path : FilePath) (depTrace : BuildTrace) (outputs : α) (log : Log)
: IO Unit :=
go path depTrace (toJson outputs) log
where
go path depTrace outputs log := do
createParentDirs path
let data : BuildMetadata := {
inputs := serializeInputs depTrace.inputs
depHash := depTrace.hash
outputs? := outputs
log
}
IO.FS.writeFile path (toJson data).pretty
: IO Unit := BuildMetadata.writeFile path (.ofBuild depTrace outputs log)
@[deprecated writeBuildTrace (since := "2025-06-28")]
abbrev writeTraceFile := @writeBuildTrace
/--
Checks if the `info` is up-to-date by comparing `depTrace` with `depHash`.
@@ -162,6 +201,23 @@ as the point of comparison instead.
| .missing =>
depTrace.checkAgainstTime info
/--
Replays the saved log from the trace if it exists and is not synthetic.
Otherwise, writes a new synthetic trace file recording the fetch of the artifact from the cache.
-/
def SavedTrace.replayOrFetch
(traceFile : FilePath) (inputHash : Hash) (outputs : Json) (savedTrace : SavedTrace)
: JobM Unit := do
if let .ok data := savedTrace then
if data.synthetic && data.log.isEmpty then
updateAction .fetch
else
updateAction .replay
data.log.replay
else
updateAction .fetch
writeFetchTrace traceFile inputHash outputs
/--
Runs `build` as a build action of kind `action`.
@@ -180,7 +236,7 @@ and log are saved to `traceFile`, if the build completes without a fatal error
let iniPos ← getLogPos
let outputs ← build -- fatal errors will abort here
let log := (← getLog).takeFrom iniPos
writeTraceFile traceFile depTrace outputs log
writeBuildTrace traceFile depTrace outputs log
return outputs
/--
@@ -314,7 +370,7 @@ If `text := true`, `file` contents are hashed as a text file rather than a binar
If the Lake cache is disabled, the behavior of this function is undefined.
-/
def Cache.saveArtifact
(cache : Cache) (file : FilePath) (ext := "art") (text := false)
(cache : Cache) (file : FilePath) (ext := "art") (text := false) (exe := false)
: IO Artifact := do
if text then
let contents ← IO.FS.readFile file
@@ -332,6 +388,9 @@ def Cache.saveArtifact
let path := cache.artifactPath hash ext
createParentDirs path
IO.FS.writeBinFile path contents
if exe then
let r := ⟨true, true, true⟩
IO.setAccessRights path ⟨r, r, r⟩ -- 777
writeFileHash file hash
let mtime := (← getMTime path |>.toBaseIO).toOption.getD 0
return {name := file.toString, path, mtime, hash}
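
The permission handling above leans on anonymous constructors; spelled out with the core `IO` types (a sketch of what `⟨true, true, true⟩` and the `-- 777` comment mean, not new functionality):

```lean
-- `IO.AccessRight` bundles the read/write/execute flags for one class of user.
def rwx : IO.AccessRight := { read := true, write := true, execution := true }

-- `IO.FileRight` groups the rights for user, group, and other, so granting
-- `rwx` to all three corresponds to `chmod 777`.
def all777 : IO.FileRight := { user := rwx, group := rwx, other := rwx }

-- Equivalent to the calls in the diff:
-- IO.setAccessRights path all777
```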
@@ -339,8 +398,8 @@ def Cache.saveArtifact
@[inline, inherit_doc Cache.saveArtifact]
def cacheArtifact
[MonadLakeEnv m] [MonadLiftT IO m] [Monad m]
(file : FilePath) (ext := "art") (text := false)
: m Artifact := do (← getLakeCache).saveArtifact file ext text
(file : FilePath) (ext := "art") (text := false) (exe := false)
: m Artifact := do (← getLakeCache).saveArtifact file ext text exe
/--
Computes the trace of a cached artifact.
@@ -352,14 +411,6 @@ def computeArtifactTrace
let mtime := (← getMTime art |>.toBaseIO).toOption.getD 0
return {caption := buildFile.toString, mtime, hash := contentHash}
/-- Replays the saved log from the trace if it exists. -/
def SavedTrace.replayIfExists (savedTrace : SavedTrace) : JobM Unit := do
if let .ok data := savedTrace then
updateAction .replay
data.log.replay
else
updateAction .fetch
class ResolveArtifacts (m : Type v → Type w) (α : Type u) (β : outParam $ Type v) where
resolveArtifacts? (contentHashes : α) : m (Option β)
@@ -370,11 +421,12 @@ in either the saved trace file or in the cached input-to-content mapping.
-/
@[specialize] def resolveArtifactsUsing?
(α : Type u) [FromJson α] [ResolveArtifacts JobM α β]
(inputHash : Hash) (savedTrace : SavedTrace) (cache : CacheRef)
(inputHash : Hash) (traceFile : FilePath) (savedTrace : SavedTrace) (cache : CacheRef)
: JobM (Option β) := do
if let some out ← cache.get? inputHash then
if let .ok (hashes : α) := fromJson? out then
if let some arts ← resolveArtifacts? hashes then
savedTrace.replayOrFetch traceFile inputHash out
return some arts
else
logWarning s!"\
@@ -386,11 +438,13 @@ in either the saved trace file or in the cached input-to-content mapping.
input '{inputHash.toString.take 7}' found in package artifact cache, \
but output(s) were in an unexpected format"
if let .ok data := savedTrace then
if let some out := data.outputs? then
if let .ok (hashes : α) := fromJson? out then
if let some arts ← resolveArtifacts? hashes then
cache.insert inputHash out
return some arts
if data.depHash == inputHash then
if let some out := data.outputs? then
if let .ok (hashes : α) := fromJson? out then
if let some arts ← resolveArtifacts? hashes then
cache.insert inputHash out
savedTrace.replayOrFetch traceFile inputHash out
return some arts
return none
/-- The content hash of an artifact which is stored with extension `ext` in the Lake cache. -/
@@ -428,7 +482,7 @@ than the path to the cached artifact.
-/
def buildArtifactUnlessUpToDate
(file : FilePath) (build : JobM PUnit)
(text := false) (ext := "art") (restore := false)
(text := false) (ext := "art") (restore := false) (exe := false)
: JobM FilePath := do
let depTrace ← getTrace
let traceFile := FilePath.mk <| file.toString ++ ".trace"
@@ -436,18 +490,20 @@ def buildArtifactUnlessUpToDate
if let some pkg ← getCurrPackage? then
let inputHash := depTrace.hash
if let some cache := pkg.cacheRef? then
let art? ← resolveArtifactsUsing? (FileOutputHash ext) inputHash savedTrace cache
let art? ← resolveArtifactsUsing? (FileOutputHash ext) inputHash traceFile savedTrace cache
if let some art := art? then
if restore && !(← file.pathExists) then
logVerbose s!"restored artifact from cache to: {file}"
copyFile art.path file
if exe then
let r := ⟨true, true, true⟩
IO.setAccessRights file ⟨r, r, r⟩ -- 777
writeFileHash file art.hash
savedTrace.replayIfExists
setTrace art.trace
return if restore then file else art.path
unless (← savedTrace.replayIfUpToDate file depTrace) do
discard <| doBuild depTrace traceFile
let art ← cacheArtifact file ext text
let art ← cacheArtifact file ext text exe
cache.insert inputHash art.hash
setTrace art.trace
return if restore then file else art.path
@@ -710,7 +766,7 @@ def buildLeanExe
addLeanTrace
addPureTrace traceArgs "traceArgs"
addPlatformTrace -- executables are platform-dependent artifacts
buildArtifactUnlessUpToDate exeFile (ext := FilePath.exeExtension) do
buildArtifactUnlessUpToDate exeFile (ext := FilePath.exeExtension) (exe := true) do
let lean ← getLeanInstall
let libs ← mkLinkOrder libs
let args := mkLinkObjArgs objs libs ++ weakArgs ++ traceArgs ++


@@ -375,11 +375,11 @@ Fetch its dependencies and then elaborate the Lean source file, producing
all possible artifacts (e.g., `.olean`, `.ilean`, `.c`, `.bc`).
-/
def Module.recBuildLean (mod : Module) : FetchM (Job ModuleOutputArtifacts) := do
withRegisterJob mod.name.toString do
let setupJob ← mod.setup.fetch
let leanJob ← mod.lean.fetch
leanJob.bindM fun srcFile => do
leanJob.bindM (sync := true) fun srcFile => do
let srcTrace ← getTrace
withRegisterJob mod.name.toString do
setupJob.mapM fun setup => do
addLeanTrace
addTrace <| traceOptions setup.options "options"
@@ -398,10 +398,8 @@ def Module.recBuildLean (mod : Module) : FetchM (Job ModuleOutputArtifacts) := d
let inputHash := depTrace.hash
let savedTrace ← readTraceFile mod.traceFile
if let some ref := mod.pkg.cacheRef? then
if let some arts ← resolveArtifactsUsing? ModuleOutputHashes inputHash savedTrace ref then
let arts ← mod.restoreArtifacts arts
savedTrace.replayIfExists
return arts
if let some arts ← resolveArtifactsUsing? ModuleOutputHashes inputHash mod.traceFile savedTrace ref then
return ← mod.restoreArtifacts arts
let upToDate ← savedTrace.replayIfUpToDate (oldTrace := srcTrace.mtime) mod depTrace
unless upToDate do
discard <| buildAction depTrace mod.traceFile do


@@ -35,7 +35,7 @@ protected def ModuleOutputHashes.toJson (hashes : ModuleOutputHashes) : Json :=
obj := obj.insert "i" hashes.ilean
obj := obj.insert "c" hashes.c
if let some bc := hashes.bc? then
obj := obj.insert "bc" bc
obj := obj.insert "b" bc
return obj
instance : ToJson ModuleOutputHashes := ModuleOutputHashes.toJson


@@ -39,7 +39,9 @@ structure Env where
Can be overridden on a per-command basis with `--try-cache`.
-/
noCache : Bool
/-- The directory of the Lake cache. If `none`, the cache is disabled. -/
/-- Whether the Lake artifact cache should be enabled by default (i.e., `LAKE_ARTIFACT_CACHE`). -/
enableArtifactCache : Bool
/-- The directory of the Lake cache. Set by `LAKE_CACHE_DIR`. If `none`, the cache is disabled. -/
lakeCache : Cache
/-- The initial Elan toolchain of the environment (i.e., `ELAN_TOOLCHAIN`). -/
initToolchain : String
@@ -72,17 +74,32 @@ def getUserHome? : BaseIO (Option FilePath) := do
else
return none
/-- Returns the system directory that can be used to store the Lake artifact cache. -/
/-- Returns the system directory that can be used to store caches (if one exists). -/
def getSystemCacheHome? : BaseIO (Option FilePath) := do
if let some cacheHome ← IO.getEnv "XDG_CACHE_HOME" then
return FilePath.mk cacheHome / "lake"
return FilePath.mk cacheHome
else if let some userHome ← getUserHome? then
return userHome / ".cache" / "lake"
return userHome / ".cache"
else
return none
namespace Env
/-- Compute the Lean toolchain string used by Lake from the process environment. -/
def computeToolchain : BaseIO String := do
return (← IO.getEnv "ELAN_TOOLCHAIN").getD Lean.toolchain
/-- Compute the cache location used by Lake from the process environment. May be disabled. -/
def computeCache (elan? : Option ElanInstall) (toolchain : String) : BaseIO Cache := do
if let some cacheDir ← IO.getEnv "LAKE_CACHE_DIR" then
return cacheDir
else if let some elan := elan? then
return elan.toolchainDir toolchain / "lake" / "cache"
else if let some cacheHome ← getSystemCacheHome? then
return cacheHome / "lake"
else
return ""
/--
Compute a `Lake.Env` object from the given installs
and the set environment variables.
@@ -92,14 +109,16 @@ def compute
(noCache : Option Bool := none)
: EIO String Env := do
let reservoirBaseUrl ← getUrlD "RESERVOIR_API_BASE_URL" "https://reservoir.lean-lang.org/api"
let initToolchain := (← IO.getEnv "ELAN_TOOLCHAIN").getD ""
let toolchain := if initToolchain.isEmpty then Lean.toolchain else initToolchain
let elanToolchain? ← IO.getEnv "ELAN_TOOLCHAIN"
let initToolchain := elanToolchain?.getD ""
let toolchain := elanToolchain?.getD Lean.toolchain
return {
lake, lean, elan?,
pkgUrlMap := computePkgUrlMap
reservoirApiUrl := getUrlD "RESERVOIR_API_URL" s!"{reservoirBaseUrl}/v1"
noCache := (noCache <|> (← IO.getEnv "LAKE_NO_CACHE").bind envToBool?).getD false
lakeCache := ← getCache? toolchain
enableArtifactCache := (← IO.getEnv "LAKE_ARTIFACT_CACHE").bind envToBool? |>.getD false
lakeCache := ← computeCache elan? toolchain
githashOverride := (← IO.getEnv "LEAN_GITHASH").getD ""
toolchain
initToolchain
@@ -109,15 +128,6 @@ def compute
initPath := getSearchPath "PATH"
}
where
getCache? toolchain := do
if let some cacheDir ← IO.getEnv "LAKE_CACHE_DIR" then
return cacheDir
else if let some elan := elan? then
return elan.toolchainDir toolchain / "lake" / "cache"
else if let some cacheHome ← getSystemCacheHome? then
return cacheHome
else
return ""
computePkgUrlMap := do
let some urlMapStr ← IO.getEnv "LAKE_PKG_URL_MAP" | return {}
match Json.parse urlMapStr |>.bind fromJson? with
@@ -186,7 +196,6 @@ def noToolchainVars : Array (String × Option String) :=
("LAKE", none),
("LAKE_OVERRIDE_LEAN", none),
("LAKE_HOME", none),
("LAKE_CACHE_DIR", none),
("LEAN", none),
("LEAN_GITHASH", none),
("LEAN_SYSROOT", none),
@@ -202,6 +211,8 @@ def baseVars (env : Env) : Array (String × Option String) :=
("LAKE", env.lake.lake.toString),
("LAKE_HOME", env.lake.home.toString),
("LAKE_PKG_URL_MAP", toJson env.pkgUrlMap |>.compress),
("LAKE_NO_CACHE", toString env.noCache),
("LAKE_ARTIFACT_CACHE", toString env.enableArtifactCache),
("LAKE_CACHE_DIR", if env.lakeCache.isDisabled then none else env.lakeCache.dir.toString),
("LEAN", env.lean.lean.toString),
("LEAN_GITHASH", env.leanGithash),


@@ -285,7 +285,7 @@ configuration PackageConfig (name : Name) extends WorkspaceConfig, LeanConfig wh
-/
reservoir : Bool := true
/-
Enables Lake's local, offline artifact cache for the package.
Whether to enable Lake's local, offline artifact cache for the package.
Artifacts (i.e., build products) of packages will be shared across
local copies by storing them in a cache associated with the Lean toolchain.
@@ -295,8 +295,11 @@ configuration PackageConfig (name : Name) extends WorkspaceConfig, LeanConfig wh
As a caveat, build targets which support the artifact cache will not be stored
in their usual location within the build directory. Thus, projects with custom build
scripts that rely on specific location of artifacts may wish to disable this feature.
If `none`, the cache will be disabled by default unless the `LAKE_ARTIFACT_CACHE`
environment variable is set to true.
-/
enableArtifactCache : Bool := false
enableArtifactCache?, enableArtifactCache : Option Bool := none
instance : EmptyCollection (PackageConfig n) := {}
@@ -622,9 +625,9 @@ def nativeLibDir (self : Package) : FilePath :=
@[inline] def irDir (self : Package) : FilePath :=
self.buildDir / self.config.irDir.normalize
/-- The package's `enableArtifactCache` configuration. -/
@[inline] def enableArtifactCache (self : Package) : Bool :=
self.config.enableArtifactCache
/-- The package's `enableArtifactCache?` configuration. -/
@[inline] def enableArtifactCache? (self : Package) : Option Bool :=
self.config.enableArtifactCache?
/-- The file where the package's input-to-content mapping is stored in the Lake cache. -/
def inputsFileIn (cache : Cache) (self : Package) : FilePath :=


@@ -84,9 +84,9 @@ def loadPackage (config : LoadConfig) : LogIO Package := do
(·.1) <$> loadPackageCore "[root]" config
/-- Load the package's input-to-content mapping from the Lake cache (if enabled). -/
protected def Package.loadInputsFrom (cache : Cache) (pkg : Package) : LogIO Package := do
if !pkg.enableArtifactCache || cache.isDisabled then
protected def Package.loadInputsFrom (env : Env) (pkg : Package) : LogIO Package := do
if env.lakeCache.isDisabled || !(pkg.enableArtifactCache?.getD env.enableArtifactCache) then
return pkg
else
let inputs ← CacheMap.load (pkg.inputsFileIn cache)
let inputs ← CacheMap.load (pkg.inputsFileIn env.lakeCache)
return {pkg with cacheRef? := IO.mkRef inputs}


@@ -56,7 +56,7 @@ def loadDepPackage
scope := dep.scope
remoteUrl := dep.remoteUrl
}
let pkg ← pkg.loadInputsFrom ws.lakeCache
let pkg ← pkg.loadInputsFrom ws.lakeEnv
if let some env := env? then
let ws ← IO.ofExcept <| ws.addFacetsFromEnv env leanOpts
return (pkg, ws)


@@ -23,7 +23,7 @@ Does not resolve dependencies.
def loadWorkspaceRoot (config : LoadConfig) : LogIO Workspace := do
Lean.searchPathRef.set config.lakeEnv.leanSearchPath
let (root, env?) ← loadPackageCore "[root]" config
let root ← root.loadInputsFrom config.lakeEnv.lakeCache
let root ← root.loadInputsFrom config.lakeEnv
let ws : Workspace := {
root
lakeEnv := config.lakeEnv

src/lake/tests/cache/.gitignore (new file)

@@ -0,0 +1 @@
Ignored.lean


@@ -1 +1 @@
rm -rf .lake lake-manifest.json produced.out
rm -rf .lake lake-manifest.json produced.out Ignored.lean


@@ -1,4 +1,5 @@
name = "test"
enableArtifactCache = false
[[lean_lib]]
name = "Test"


@@ -4,6 +4,9 @@ enableArtifactCache = true
[[lean_lib]]
name = "Test"
[[lean_lib]]
name = "Ignored"
[[lean_exe]]
name = "test"
root = "Main"


@@ -8,9 +8,23 @@ source ../common.sh
TEST_DIR="$(cd -- "$(dirname -- "${BASH_SOURCE[0]}")" &> /dev/null && pwd)"
CACHE_DIR="$TEST_DIR/.lake/cache"
# Ensure packages without `enableArtifactCache` do not use the cache
LAKE_CACHE_DIR="$CACHE_DIR" test_run build -f disabled.toml Test:static
# Verify packages without `enableArtifactCache` do not use the cache by default
LAKE_CACHE_DIR="$CACHE_DIR" test_run build -f unset.toml Test:static
test_exp ! -d "$CACHE_DIR"
# Verify they also do not if `LAKE_ARTIFACT_CACHE` is set to `false`
LAKE_CACHE_DIR="$CACHE_DIR" LAKE_ARTIFACT_CACHE=false \
test_run build -f unset.toml Test:static
test_exp ! -d "$CACHE_DIR"
# Verify packages with `enableArtifactCache` set to `false`
# also do not use the cache if `LAKE_ARTIFACT_CACHE` is set to `true`
LAKE_CACHE_DIR="$CACHE_DIR" LAKE_ARTIFACT_CACHE=true \
test_run build -f disabled.toml Test:static
test_exp ! -d "$CACHE_DIR"
# Verify that packages without `enableArtifactCache` do use the cache
# when `LAKE_ARTIFACT_CACHE` is set to `true`
LAKE_CACHE_DIR="$CACHE_DIR" LAKE_ARTIFACT_CACHE=true \
test_run build -f unset.toml Test:static
test_exp -d "$CACHE_DIR"
./clean.sh
# Ensure a build runs properly with some artifacts cached
@@ -63,6 +77,19 @@ test_cached +Test:c
# Verify no `.hash` files end up in the cache directory
check_diff /dev/null <(ls -1 "$CACHE_DIR/*.hash" 2>/dev/null)
# Verify that the executable has the right permissions to be run
LAKE_CACHE_DIR="$CACHE_DIR" test_run exe test
# Verify that fetching from the cache creates a trace file that does not replay
touch Ignored.lean
LAKE_CACHE_DIR="$CACHE_DIR" test_out "Fetched Ignored" -v build +Ignored
test_exp -f .lake/build/lib/lean/Ignored.trace
LAKE_CACHE_DIR="$CACHE_DIR" test_out "Fetched Ignored" -v build +Ignored
# Verify that modifications invalidate the cache
echo "def foo := ()" > Ignored.lean
LAKE_CACHE_DIR="$CACHE_DIR" test_out "Built Ignored" -v build +Ignored
# Verify module oleans and ileans are restored from the cache
LAKE_CACHE_DIR="$CACHE_DIR" test_run build +Test --no-build
test_cmd rm .lake/build/lib/lean/Test.olean .lake/build/lib/lean/Test.ilean
@@ -94,4 +121,4 @@ test_cmd rm "$CACHE_DIR/inputs/test.jsonl" .lake/build/ir/Test.c
LAKE_CACHE_DIR="$CACHE_DIR" test_run -v build +Test:c --no-build
# Cleanup
rm -f produced.out
rm -f produced.out Ignored.lean

src/lake/tests/cache/unset.toml (new file)

@@ -0,0 +1,4 @@
name = "test"
[[lean_lib]]
name = "Test"


@@ -25,6 +25,11 @@ test_out "lake" env printenv LEAN_SRC_PATH
test_out "hello" -d ../../examples/hello env printenv LEAN_SRC_PATH
test_out "hello" -d ../../examples/hello env printenv PATH
# Test other variables are set
test_eq "false" env printenv LAKE_NO_CACHE
test_eq "false" env printenv LAKE_ARTIFACT_CACHE
test_run env printenv LAKE_CACHE_DIR
# Test that `env` preserves the input environment for certain variables
echo "# TEST: Setting variables for lake env"
test_eq "foo" env env ELAN_TOOLCHAIN=foo $LAKE env printenv ELAN_TOOLCHAIN
@@ -33,6 +38,11 @@ LEAN_GITHASH=foo test_eq "foo" env printenv LEAN_GITHASH
LEAN_AR=foo test_eq "foo" env printenv LEAN_AR
LEAN_CC=foo test_eq "foo" env printenv LEAN_CC
# Test `LAKE_ARTIFACT_CACHE` setting and default
LAKE_ARTIFACT_CACHE=true test_eq "true" env printenv LAKE_ARTIFACT_CACHE
LAKE_ARTIFACT_CACHE=false test_eq "false" env printenv LAKE_ARTIFACT_CACHE
LAKE_ARTIFACT_CACHE= test_eq "false" env printenv LAKE_ARTIFACT_CACHE
# Test `LAKE_PKG_URL_MAP` setting and errors
echo "# TEST: LAKE_PKG_URL_MAP"
LAKE_PKG_URL_MAP='{"a":"a"}' test_eq '{"a":"a"}' env printenv LAKE_PKG_URL_MAP


@@ -25,6 +25,8 @@ leanLibDir = "lib/lean"
# Destination for static libraries
nativeLibDir = "lib/lean"
leanOptions = { experimental.module = true }
# Additional options derived from the CMake configuration
# For example, CI will set `-DwarningAsError=true` through this
moreLeanArgs = [${LEAN_EXTRA_OPTS_TOML}]

Binary file not shown.

Some files were not shown because too many files have changed in this diff.