Compare commits

...

12 Commits

Author SHA1 Message Date
Kim Morrison
b86e2e5824 chore: set(LEAN_VERSION_PATCH 2) 2025-11-25 21:47:27 +11:00
Sebastian Ullrich
2be355fd1d fix: avoid storing reference to environment in realization result to prevent promise cycle (#11328)
This PR fixes freeing memory accidentally retained for each document
version in the language server on certain elaboration workloads. The
issue must have existed since 4.18.0.
2025-11-25 21:47:19 +11:00
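The retention pattern this commit removes can be illustrated outside Lean. The sketch below is illustrative Python with hypothetical names (`Environment`, `RealizationResult` — not the compiler's actual data structures): a cached result object that keeps a back-reference to the environment it was computed from holds that environment in memory for as long as the cache entry lives.

```python
import gc
import weakref

class Environment:
    """Stands in for a large per-document elaboration environment."""
    pass

class RealizationResult:
    """A cached result; storing `env` retains the whole environment."""
    def __init__(self, value, env=None):
        self.value = value
        self.env = env

env = Environment()
probe = weakref.ref(env)

leaky = RealizationResult(42, env=env)
del env
gc.collect()
print(probe() is not None)  # True: the cached result keeps it alive

leaky.env = None  # the shape of the fix: drop the back-reference
gc.collect()
print(probe() is None)      # True: the environment can now be freed
```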
Kim Morrison
e82306f6ad chore: set(LEAN_VERSION_PATCH 1) 2025-11-25 21:47:18 +11:00
Mac Malone
259135d089 fix: lake: improper uses of computeArtifact w/o text (#11216)
This PR ensures that the `text` argument of `computeArtifact` is always
provided in Lake code, fixing a hashing bug with
`buildArtifactUnlessUpToDate` in the process.

Closes #11209
2025-11-18 03:29:21 +01:00
Joachim Breitner
cdd38ac511 fix: Meta.Closure: topologically sort abstracted vars (#10926)
This PR topologically sorts abstracted vars in
`Meta.Closure.mkValueTypeClosure` if MVars are being abstracted.
Fixes #10705

---------

Co-authored-by: Eric Wieser <efw@google.com>
(cherry picked from commit a6d50a61b3)
2025-11-03 12:04:38 +11:00
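Why abstracted variables need a topological order can be seen with a small dependency graph. This is an illustrative Python sketch (the graph and names are hypothetical, not Lean's internal representation): each binder must appear after every variable its type mentions.

```python
from graphlib import TopologicalSorter

# Hypothetical dependency graph for abstracted variables: each entry
# maps a variable to the variables its type mentions. A binder is only
# well-formed if all of its dependencies are bound first.
deps = {
    "x": set(),         # x : Nat
    "h": {"x"},         # h : x > 0
    "y": {"x", "h"},    # y : SomeType x h
}

order = list(TopologicalSorter(deps).static_order())
print(order)  # dependencies come first: ['x', 'h', 'y']
```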
Markus Himmel
ba0f86db9c fix: KMP implementation (#10998)
This PR fixes the KMP implementation, which did incorrect bookkeeping of
the backtracking process, leading to incorrect starting ranges of
matches.

The new implementation does not require `partial` anywhere.
2025-10-29 09:01:43 +01:00
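For reference, the standard KMP bookkeeping can be sketched in Python (an illustrative textbook version, not the Lean code): on a mismatch or after a full match, the matcher falls back along the prefix table, and the reported start of a match is derived from the current haystack index — which is exactly where incorrect bookkeeping yields wrong starting ranges.

```python
def build_table(pat: str) -> list[int]:
    # table[i] = length of the longest proper prefix of pat[:i+1]
    # that is also a suffix of pat[:i+1]
    table = [0] * len(pat)
    k = 0
    for i in range(1, len(pat)):
        while k > 0 and pat[i] != pat[k]:
            k = table[k - 1]  # backtrack along the prefix table
        if pat[i] == pat[k]:
            k += 1
        table[i] = k
    return table

def find_all(hay: str, pat: str) -> list[tuple[int, int]]:
    """(start, end) ranges of all matches; assumes a nonempty pattern."""
    table, out, k = build_table(pat), [], 0
    for i, c in enumerate(hay):
        while k > 0 and c != pat[k]:
            k = table[k - 1]
        if c == pat[k]:
            k += 1
        if k == len(pat):
            # the start index must account for how far we backtracked
            out.append((i - len(pat) + 1, i + 1))
            k = table[k - 1]  # continue, allowing overlapping matches
    return out

print(find_all("abababa", "aba"))  # [(0, 3), (2, 5), (4, 7)]
```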
Markus Himmel
edc75a6cee fix: search for empty string (#10985)
This PR ensures that searching for an empty string returns the expected
pattern of alternating size-zero matches and size-one rejects.

In particular, splitting by an empty string returns an array formed of
the empty string, all of the string's characters as singleton strings,
followed by another empty string. This matches the [Rust
behavior](https://doc.rust-lang.org/std/primitive.str.html#method.split),
for example.
2025-10-29 09:01:43 +01:00
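The expected behavior can be mimicked in Python. The helpers below are hypothetical illustrations (and character-based, whereas the Lean implementation works on byte positions): a size-zero match before every character, a size-one reject covering that character, and one final size-zero match at the end of the string.

```python
def search_steps_empty(s: str):
    # The alternating pattern of size-zero matches and size-one rejects
    # described in the commit message.
    steps = []
    for i in range(len(s)):
        steps.append(("matched", i, i))
        steps.append(("rejected", i, i + 1))
    steps.append(("matched", len(s), len(s)))
    return steps

def split_by_empty(s: str):
    # The split derived from those matches; mirrors Rust's "abc".split("").
    return [""] + list(s) + [""]

print(split_by_empty("abc"))  # ['', 'a', 'b', 'c', '']
```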
Kim Morrison
744f98064b chore: release_steps runs lake exe cache get when needed 2025-10-22 09:43:19 +11:00
github-actions[bot]
e0b813b94a chore: more module system fixes and refinements for finishing batteries port (#10873)
Backport 37b78bd53d from #10819.

Co-authored-by: Sebastian Ullrich <sebasti@nullri.ch>
2025-10-21 16:00:27 +02:00
Markus Himmel
74fea5d8e2 fix: logic error in String.Slice.takeWhile (#10868)
This PR fixes a bug in `String.Slice.takeWhile` which caused it to get
its bookkeeping wrong and panic. The new version only uses safe
operations on `String.Slice.Pos`.

(cherry picked from commit 196d50156a)
2025-10-21 12:39:39 +02:00
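The intended semantics of a pattern-based `takeWhile` can be sketched in Python (an illustrative helper, not the Lean code): the index advances only by whole confirmed matches, so it can never point past the end of the string — the kind of safe-position discipline the fix adopts.

```python
def take_while_pattern(s: str, pat: str) -> str:
    # Longest prefix of s made of whole copies of pat. The index i only
    # moves forward by len(pat) while a match is confirmed, so it can
    # never run past the end of the string.
    i = 0
    while pat and s.startswith(pat, i):
        i += len(pat)
    return s[:i]

print(take_while_pattern("ababx", "ab"))  # 'abab'
```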
Kim Morrison
7133f1a8a3 feat: improvements to release automation for v4.25.0-rc1 (#10860)
This PR improves the release automation. We link to CI output for
building the release tag, don't give instructions for bumping downstream
repositories until the release is ready, and improve documentation and
prompts.
2025-10-21 04:05:17 +02:00
Kim Morrison
66466a7b5d chore: set LEAN_VERSION_IS_RELEASE to 1 for v4.25.0-rc1 2025-10-21 01:55:24 +02:00
36 changed files with 653 additions and 266 deletions


@@ -16,14 +16,18 @@ These comments explain the scripts' behavior, which repositories get special han
 ## Process
 1. Run `script/release_checklist.py {version}` to check the current status
-2. Create a todo list tracking all repositories that need updates
-3. For each repository that needs updating:
+2. **CRITICAL: If preliminary lean4 checks fail, STOP immediately and alert the user**
+   - Check for: release branch exists, CMake version correct, tag exists, release page exists, release notes exist
+   - **IMPORTANT**: The release page is created AUTOMATICALLY by CI after pushing the tag - DO NOT create it manually
+   - Do NOT create any PRs or proceed with repository updates if these checks fail
+3. Create a todo list tracking all repositories that need updates
+4. For each repository that needs updating:
    - Run `script/release_steps.py {version} {repo_name}` to create the PR
    - Mark it complete when the PR is created
-4. After creating PRs, notify the user which PRs need review and merging
-5. Continuously rerun `script/release_checklist.py {version}` to check progress
-6. As PRs are merged, dependent repositories will become ready - create PRs for those as well
-7. Continue until all repositories are updated and the release is complete
+5. After creating PRs, notify the user which PRs need review and merging
+6. Continuously rerun `script/release_checklist.py {version}` to check progress
+7. As PRs are merged, dependent repositories will become ready - create PRs for those as well
+8. Continue until all repositories are updated and the release is complete
 ## Important Notes


@@ -13,8 +13,15 @@ What this script does:
    - Checks that the release branch (releases/vX.Y.0) exists
    - Verifies CMake version settings are correct
    - Confirms the release tag exists
-   - Validates the release page exists on GitHub
-   - Checks the release notes page on lean-lang.org
+   - Validates the release page exists on GitHub (created automatically by CI after tag push)
+   - Checks the release notes page on lean-lang.org (updated while bumping the `reference-manual` repository)
+
+   **IMPORTANT: If the release page doesn't exist, the script will skip checking
+   downstream repositories and the master branch configuration. The preliminary
+   infrastructure must be in place before the release process can proceed.**
+
+   **NOTE: The GitHub release page is created AUTOMATICALLY by CI after the tag is pushed.
+   DO NOT create it manually. Wait for CI to complete after pushing the tag.**
 2. For each downstream repository (batteries, mathlib4, etc.):
    - Checks if dependencies are ready (e.g., mathlib4 depends on batteries)
- Checks if dependencies are ready (e.g., mathlib4 depends on batteries)
@@ -122,6 +129,39 @@ def release_page_exists(repo_url, tag_name, github_token):
     response = requests.get(api_url, headers=headers)
     return response.status_code == 200

+def get_tag_workflow_status(repo_url, tag_name, github_token):
+    """Get the status of CI workflows running for a specific tag."""
+    api_base = repo_url.replace("https://github.com/", "https://api.github.com/repos/")
+    headers = {'Authorization': f'token {github_token}'} if github_token else {}
+
+    # Get workflow runs for the tag
+    # GitHub's workflow runs API uses the branch/tag name in the 'head_branch' field
+    api_url = f"{api_base}/actions/runs?event=push&head_branch={tag_name}"
+    response = requests.get(api_url, headers=headers)
+    if response.status_code != 200:
+        return None
+
+    data = response.json()
+    workflow_runs = data.get('workflow_runs', [])
+    if not workflow_runs:
+        return None
+
+    # Get the most recent workflow run for this tag
+    run = workflow_runs[0]
+    status = run.get('status')
+    conclusion = run.get('conclusion')
+    workflow_name = run.get('name', 'CI')
+    run_id = run.get('id')
+
+    return {
+        'status': status,
+        'conclusion': conclusion,
+        'workflow_name': workflow_name,
+        'run_id': run_id
+    }
+
 def get_release_notes(tag_name):
     """Fetch release notes page title from lean-lang.org."""
     # Strip -rcX suffix if present for the URL
@@ -130,20 +170,17 @@ def get_release_notes(tag_name):
     try:
         response = requests.get(reference_url)
         response.raise_for_status()  # Raise HTTPError for bad responses (4xx or 5xx)

         # Extract title using regex
         match = re.search(r"<title>(.*?)</title>", response.text, re.IGNORECASE | re.DOTALL)
         if match:
             return match.group(1).strip()
         else:
             print(f" ⚠️ Could not find <title> tag in {reference_url}")
             return None
-    except requests.exceptions.RequestException as e:
-        print(f" ❌ Error fetching release notes from {reference_url}: {e}")
+    except requests.exceptions.RequestException:
         return None
-    except Exception as e:
-        print(f" ❌ An unexpected error occurred while processing release notes: {e}")
+    except Exception:
         return None
def get_branch_content(repo_url, branch, file_path, github_token):
@@ -512,30 +549,57 @@ def main():
         print(f" ❌ Short commit hash {commit_hash[:SHORT_HASH_LENGTH]} is numeric and starts with 0, causing issues for version parsing. Try regenerating the last commit to get a new hash.")
         lean4_success = False

-    if not release_page_exists(lean_repo_url, toolchain, github_token):
-        print(f" ❌ Release page for {toolchain} does not exist")
+    release_page_ready = release_page_exists(lean_repo_url, toolchain, github_token)
+    if not release_page_ready:
+        print(f" ❌ Release page for {toolchain} does not exist (This will be created by CI.)")
+
+        # Check CI workflow status
+        workflow_status = get_tag_workflow_status(lean_repo_url, toolchain, github_token)
+        if workflow_status:
+            status = workflow_status['status']
+            conclusion = workflow_status['conclusion']
+            workflow_name = workflow_status['workflow_name']
+            run_id = workflow_status['run_id']
+            workflow_url = f"{lean_repo_url}/actions/runs/{run_id}"
+            if status == 'in_progress' or status == 'queued':
+                print(f" 🔄 {workflow_name} workflow is {status}: {workflow_url}")
+            elif status == 'completed':
+                if conclusion == 'success':
+                    print(f"{workflow_name} workflow completed successfully: {workflow_url}")
+                elif conclusion == 'failure':
+                    print(f"{workflow_name} workflow failed: {workflow_url}")
+                else:
+                    print(f" ⚠️ {workflow_name} workflow completed with status: {conclusion}: {workflow_url}")
+            else:
+                print(f" {workflow_name} workflow status: {status}: {workflow_url}")
         lean4_success = False
     else:
         print(f" ✅ Release page for {toolchain} exists")

-    # Check the actual release notes page title
+    # Check the actual release notes page title (informational only - does not block)
     actual_title = get_release_notes(toolchain)
     expected_title_prefix = f"Lean {toolchain.lstrip('v')}"  # e.g., "Lean 4.19.0" or "Lean 4.19.0-rc1"
+    base_tag = toolchain.split('-')[0]
+    release_notes_url = f"https://lean-lang.org/doc/reference/latest/releases/{base_tag}/"
     if actual_title is None:
-        # Error already printed by get_release_notes
-        lean4_success = False
+        print(f" ⚠️ Release notes not found at {release_notes_url} (this will be fixed while updating the reference-manual repository)")
     elif not actual_title.startswith(expected_title_prefix):
-        # Construct URL for the error message (using the base tag)
-        base_tag = toolchain.split('-')[0]
-        check_url = f"https://lean-lang.org/doc/reference/latest/releases/{base_tag}/"
-        print(f" ❌ Release notes page title mismatch. Expected prefix '{expected_title_prefix}', got '{actual_title}'. Check {check_url}")
-        lean4_success = False
+        print(f" ⚠️ Release notes page title mismatch. Expected prefix '{expected_title_prefix}', got '{actual_title}'. Check {release_notes_url}")
     else:
         print(f" ✅ Release notes page title looks good ('{actual_title}').")

     repo_status["lean4"] = lean4_success

+    # If the release page doesn't exist, skip repository checks and master branch checks
+    # The preliminary infrastructure must be in place first
+    if not release_page_exists(lean_repo_url, toolchain, github_token):
+        print("\n⚠️ Release process blocked: preliminary Lean4 infrastructure incomplete.")
+        print(" Complete the steps above, then rerun this script to proceed with downstream repositories.")
+        return
+
     # Load repositories and perform further checks
     print("\nChecking repositories...")


@@ -589,8 +589,19 @@ def execute_release_steps(repo, version, config):
     # Clean lake cache for a fresh build
     print(blue("Cleaning lake cache..."))
     run_command("rm -rf .lake", cwd=repo_path)
     run_command("lake clean", cwd=repo_path)
+
+    # Check if downstream of Mathlib and get cache if so
+    mathlib_package_dir = repo_path / ".lake" / "packages" / "mathlib"
+    if mathlib_package_dir.exists():
+        print(blue("Project is downstream of Mathlib, fetching cache..."))
+        try:
+            run_command("lake exe cache get", cwd=repo_path, stream_output=True)
+            print(green("Cache fetched successfully"))
+        except subprocess.CalledProcessError as e:
+            print(yellow("Failed to fetch cache, continuing anyway..."))
+            print(yellow(f"Cache fetch error: {e}"))
+
     try:
         run_command("lake build", cwd=repo_path, stream_output=True)
         print(green("Build completed successfully"))


@@ -11,8 +11,8 @@ include(ExternalProject)
 project(LEAN CXX C)
 set(LEAN_VERSION_MAJOR 4)
 set(LEAN_VERSION_MINOR 25)
-set(LEAN_VERSION_PATCH 0)
-set(LEAN_VERSION_IS_RELEASE 0) # This number is 1 in the release revision, and 0 otherwise.
+set(LEAN_VERSION_PATCH 2)
+set(LEAN_VERSION_IS_RELEASE 1) # This number is 1 in the release revision, and 0 otherwise.
set(LEAN_SPECIAL_VERSION_DESC "" CACHE STRING "Additional version description like 'nightly-2018-03-11'")
 set(LEAN_VERSION_STRING "${LEAN_VERSION_MAJOR}.${LEAN_VERSION_MINOR}.${LEAN_VERSION_PATCH}")
 if (LEAN_SPECIAL_VERSION_DESC)


@@ -1870,7 +1870,7 @@ theorem Pos.Raw.unoffsetBy_offsetBy {p q : Pos.Raw} : (p.offsetBy q).unoffsetBy
 /-- Given a position in `s` that is at least `p₀`, obtain the corresponding position in
 `s.replaceStart p₀`. -/
 @[inline]
-def Slice.Pos.toReplaceStart {s : Slice} (p₀ : s.Pos) (pos : s.Pos) (h : p₀.offset ≤ pos.offset) :
+def Slice.Pos.toReplaceStart {s : Slice} (p₀ : s.Pos) (pos : s.Pos) (h : p₀ ≤ pos) :
     (s.replaceStart p₀).Pos where
   offset := pos.offset.unoffsetBy p₀.offset
   isValidForSlice := Pos.Raw.isValidForSlice_replaceStart.2 (by
@@ -1904,6 +1904,27 @@ theorem Slice.Pos.get_eq_get_ofReplaceStart {s : Slice} {p₀ : s.Pos} {pos : (s
     pos.get h = (ofReplaceStart pos).get (by rwa [← ofReplaceStart_endPos, ne_eq, ofReplaceStart_inj]) := by
   simp [Slice.Pos.get, Nat.add_assoc]

+/-- Given a position in `s.replaceEnd p₀`, obtain the corresponding position in `s`. -/
+@[inline]
+def Slice.Pos.ofReplaceEnd {s : Slice} {p₀ : s.Pos} (pos : (s.replaceEnd p₀).Pos) : s.Pos where
+  offset := pos.offset
+  isValidForSlice := (Pos.Raw.isValidForSlice_replaceEnd.1 pos.isValidForSlice).2
+
+@[simp]
+theorem Slice.Pos.offset_ofReplaceEnd {s : Slice} {p₀ : s.Pos} {pos : (s.replaceEnd p₀).Pos} :
+    (ofReplaceEnd pos).offset = pos.offset := rfl
+
+/-- Given a position in `s` that is at most `p₀`, obtain the corresponding position in `s.replaceEnd p₀`. -/
+@[inline]
+def Slice.Pos.toReplaceEnd {s : Slice} (p₀ : s.Pos) (pos : s.Pos) (h : pos ≤ p₀) :
+    (s.replaceEnd p₀).Pos where
+  offset := pos.offset
+  isValidForSlice := Pos.Raw.isValidForSlice_replaceEnd.2 ⟨h, pos.isValidForSlice⟩
+
+@[simp]
+theorem Slice.Pos.offset_toReplaceEnd {s : Slice} {p₀ : s.Pos} {pos : s.Pos} {h : pos ≤ p₀} :
+    (toReplaceEnd p₀ pos h).offset = pos.offset := rfl
+
 theorem Slice.Pos.copy_eq_append_get {s : Slice} {pos : s.Pos} (h : pos ≠ s.endPos) :
     ∃ t₁ t₂ : String, s.copy = t₁ ++ singleton (pos.get h) ++ t₂ ∧ t₁.utf8ByteSize = pos.offset.byteIdx := by
   obtain ⟨t₂, ht₂⟩ := (s.replaceStart pos).copy.eq_singleton_append (by simpa [← ValidPos.ofCopy_inj, ofReplaceStart_inj])


@@ -38,7 +38,7 @@ inductive SearchStep (s : Slice) where
   The subslice starting at {name}`startPos` and ending at {name}`endPos` did not match the pattern.
   -/
   | matched (startPos endPos : s.Pos)
-deriving Inhabited
+deriving Inhabited, BEq

 /--
 Provides a conversion from a pattern to an iterator of {name}`SearchStep` that searches for matches
@@ -70,7 +70,7 @@ class ForwardPattern (ρ : Type) where
   Checks whether the slice starts with the pattern. If it does, the slice is returned with the
   prefix removed; otherwise the result is {name}`none`.
   -/
-  dropPrefix? : Slice → ρ → Option Slice
+  dropPrefix? : (s : Slice) → ρ → Option s.Pos

 namespace Internal
@@ -153,10 +153,10 @@ def defaultStartsWith (s : Slice) (pat : ρ) : Bool :=
   | _ => false

 @[specialize pat]
-def defaultDropPrefix? (s : Slice) (pat : ρ) : Option Slice :=
+def defaultDropPrefix? (s : Slice) (pat : ρ) : Option s.Pos :=
   let searcher := ToForwardSearcher.toSearcher s pat
   match searcher.step with
-  | .yield _ (.matched _ endPos) _ => some (s.replaceStart endPos)
+  | .yield _ (.matched _ endPos) _ => some endPos
   | _ => none

 @[always_inline, inline]
@@ -196,7 +196,7 @@ class BackwardPattern (ρ : Type) where
   Checks whether the slice ends with the pattern. If it does, the slice is returned with the
   suffix removed; otherwise the result is {name}`none`.
   -/
-  dropSuffix? : Slice → ρ → Option Slice
+  dropSuffix? : (s : Slice) → ρ → Option s.Pos

 namespace ToBackwardSearcher
@@ -212,10 +212,10 @@ def defaultEndsWith (s : Slice) (pat : ρ) : Bool :=
   | _ => false

 @[specialize pat]
-def defaultDropSuffix? (s : Slice) (pat : ρ) : Option Slice :=
+def defaultDropSuffix? (s : Slice) (pat : ρ) : Option s.Pos :=
   let searcher := ToBackwardSearcher.toSearcher s pat
   match searcher.step with
-  | .yield _ (.matched startPos _) _ => some (s.replaceEnd startPos)
+  | .yield _ (.matched startPos _) _ => some startPos
   | _ => none

 @[always_inline, inline]


@@ -9,6 +9,7 @@ prelude
public import Init.Data.String.Pattern.Basic
public import Init.Data.Iterators.Internal.Termination
public import Init.Data.Iterators.Consumers.Monadic.Loop
public import Init.Data.Vector.Basic
set_option doc.verso true
@@ -21,142 +22,202 @@ public section
namespace String.Slice.Pattern
inductive ForwardSliceSearcher (s : Slice) where
| empty (pos : s.Pos)
| proper (needle : Slice) (table : Array String.Pos.Raw) (stackPos : String.Pos.Raw) (needlePos : String.Pos.Raw)
namespace ForwardSliceSearcher
def buildTable (pat : Slice) : Vector Nat pat.utf8ByteSize :=
if h : pat.utf8ByteSize = 0 then
#v[].cast h.symm
else
let arr := Array.emptyWithCapacity pat.utf8ByteSize
let arr' := arr.push 0
go arr' (by simp [arr']) (by simp [arr', arr]; omega) (by simp [arr', arr])
where
go (table : Array Nat) (ht₀ : 0 < table.size) (ht : table.size ≤ pat.utf8ByteSize) (h : ∀ (i : Nat) hi, table[i]'hi ≤ i) :
Vector Nat pat.utf8ByteSize :=
if hs : table.size < pat.utf8ByteSize then
let patByte := pat.getUTF8Byte table.size hs
let dist := computeDistance patByte table ht h (table[table.size - 1])
(by have := h (table.size - 1) (by omega); omega)
let dist' := if pat.getUTF8Byte dist.1 (by simp [Pos.Raw.lt_iff]; omega) = patByte then dist.1 + 1 else dist
go (table.push dist') (by simp) (by simp; omega) (by
intro i hi
by_cases hi' : i = table.size
· subst hi'
simp [dist']
have := dist.2
split <;> omega
· rw [Array.getElem_push_lt]
· apply h
· simp at hi
omega)
else
Vector.mk table (by omega)
computeDistance (patByte : UInt8) (table : Array Nat)
(ht : table.size ≤ pat.utf8ByteSize)
(h : ∀ (i : Nat) hi, table[i]'hi ≤ i) (guess : Nat) (hg : guess < table.size) :
{ n : Nat // n < table.size } :=
if h' : guess = 0 ∨ pat.getUTF8Byte guess (by simp [Pos.Raw.lt_iff]; omega) = patByte then
⟨guess, hg⟩
else
have : table[guess - 1] < guess := by have := h (guess - 1) (by omega); omega
computeDistance patByte table ht h table[guess - 1] (by omega)
theorem getElem_buildTable_le (pat : Slice) (i : Nat) (hi) : (buildTable pat)[i]'hi ≤ i := by
rw [buildTable]
split <;> rename_i h
· simp [h] at hi
· simp only [Array.emptyWithCapacity_eq, List.push_toArray, List.nil_append]
suffices ∀ pat' table ht₀ ht h (i : Nat) hi, (buildTable.go pat' table ht₀ ht h)[i]'hi ≤ i from this ..
intro pat' table ht₀ ht h i hi
fun_induction buildTable.go with
| case1 => assumption
| case2 table ht₀ ht ht' ht'' => apply ht'
inductive _root_.String.Slice.Pattern.ForwardSliceSearcher (s : Slice) where
| emptyBefore (pos : s.Pos)
| emptyAt (pos : s.Pos) (h : pos ≠ s.endPos)
| proper (needle : Slice) (table : Vector Nat needle.utf8ByteSize) (ht : table = buildTable needle)
(stackPos : String.Pos.Raw) (needlePos : String.Pos.Raw) (hn : needlePos < needle.rawEndPos)
| atEnd
deriving Inhabited
namespace ForwardSliceSearcher
partial def buildTable (pat : Slice) : Array String.Pos.Raw :=
if pat.utf8ByteSize == 0 then
#[]
else
let arr := Array.emptyWithCapacity pat.utf8ByteSize
let arr := arr.push 0
go 1 arr
where
go (pos : String.Pos.Raw) (table : Array String.Pos.Raw) :=
if h : pos < pat.rawEndPos then
let patByte := pat.getUTF8Byte pos h
let distance := computeDistance table[table.size - 1]! patByte table
let distance := if patByte = pat.getUTF8Byte! distance then distance.inc else distance
go pos.inc (table.push distance)
else
table
computeDistance (distance : String.Pos.Raw) (patByte : UInt8) (table : Array String.Pos.Raw) :
String.Pos.Raw :=
if distance > 0 && patByte != pat.getUTF8Byte! distance then
computeDistance table[distance.byteIdx - 1]! patByte table
else
distance
@[inline]
def iter (s : Slice) (pat : Slice) : Std.Iter (α := ForwardSliceSearcher s) (SearchStep s) :=
if pat.utf8ByteSize == 0 then
{ internalState := .empty s.startPos }
if h : pat.utf8ByteSize = 0 then
{ internalState := .emptyBefore s.startPos }
else
{ internalState := .proper pat (buildTable pat) s.startPos.offset pat.startPos.offset }
partial def backtrackIfNecessary (pat : Slice) (table : Array String.Pos.Raw) (stackByte : UInt8)
(needlePos : String.Pos.Raw) : String.Pos.Raw :=
if needlePos != 0 && stackByte != pat.getUTF8Byte! needlePos then
backtrackIfNecessary pat table stackByte table[needlePos.byteIdx - 1]!
else
needlePos
{ internalState := .proper pat (buildTable pat) rfl s.startPos.offset pat.startPos.offset
(by simp [Pos.Raw.lt_iff]; omega) }
instance (s : Slice) : Std.Iterators.Iterator (ForwardSliceSearcher s) Id (SearchStep s) where
IsPlausibleStep it
| .yield it' out =>
match it.internalState with
| .empty pos =>
(∃ newPos, pos < newPos ∧ it'.internalState = .empty newPos) ∨
it'.internalState = .atEnd
| .proper needle table stackPos needlePos =>
(∃ newStackPos newNeedlePos,
stackPos < newStackPos ∧
newStackPos ≤ s.rawEndPos ∧
it'.internalState = .proper needle table newStackPos newNeedlePos)
| .yield it' out | .skip it' =>
match it.internalState with
| .emptyBefore pos => (∃ h, it'.internalState = .emptyAt pos h) ∨ it'.internalState = .atEnd
| .emptyAt pos h => ∃ newPos, pos < newPos ∧ it'.internalState = .emptyBefore newPos
| .proper needle table ht stackPos needlePos hn =>
(∃ newStackPos newNeedlePos hn,
it'.internalState = .proper needle table ht newStackPos newNeedlePos hn ∧
((s.utf8ByteSize - newStackPos.byteIdx < s.utf8ByteSize - stackPos.byteIdx) ∨
(newStackPos = stackPos ∧ newNeedlePos < needlePos))) ∨
it'.internalState = .atEnd
| .atEnd => False
| .skip _ => False
| .done => True
step := fun iter =>
match iter with
| .empty pos =>
| .emptyBefore pos =>
let res := .matched pos pos
if h : pos ≠ s.endPos then
pure (.deflate .yield .empty (pos.next h) res, by simp)
pure (.deflate .yield .emptyAt pos h res, by simp [h])
else
pure (.deflate .yield .atEnd res, by simp)
| .proper needle table stackPos needlePos =>
let rec findNext (startPos : String.Pos.Raw)
(currStackPos : String.Pos.Raw) (needlePos : String.Pos.Raw) (h : stackPos ≤ currStackPos) :=
if h1 : currStackPos < s.rawEndPos then
let stackByte := s.getUTF8Byte currStackPos h1
let needlePos := backtrackIfNecessary needle table stackByte needlePos
let patByte := needle.getUTF8Byte! needlePos
if stackByte != patByte then
let nextStackPos := s.findNextPos currStackPos h1 |>.offset
let res := .rejected (s.pos! startPos) (s.pos! nextStackPos)
have hiter := by
left
exists nextStackPos
have haux := lt_offset_findNextPos h1
simp only [String.Pos.Raw.lt_iff, proper.injEq, true_and, exists_and_left, exists_eq', and_true,
nextStackPos]
constructor
· simp [String.Pos.Raw.le_iff, String.Pos.Raw.lt_iff] at h haux
omega
· apply Pos.Raw.IsValidForSlice.le_utf8ByteSize
apply Pos.isValidForSlice
⟨.deflate .yield .proper needle table nextStackPos needlePos res, hiter⟩
| .emptyAt pos h =>
let res := .rejected pos (pos.next h)
pure (.deflate .yield .emptyBefore (pos.next h) res, by simp)
| .proper needle table htable stackPos needlePos hn =>
-- **Invariant 1:** we have already covered everything up until `stackPos - needlePos` (exclusive),
-- with matches and rejections.
-- **Invariant 2:** `stackPos - needlePos` is a valid position
-- **Invariant 3:** the range from from `stackPos - needlePos` to `stackPos` (exclusive) is a
-- prefix of the pattern.
if h₁ : stackPos < s.rawEndPos then
let stackByte := s.getUTF8Byte stackPos h₁
let patByte := needle.getUTF8Byte needlePos hn
if stackByte = patByte then
let nextStackPos := stackPos.inc
let nextNeedlePos := needlePos.inc
if h : nextNeedlePos = needle.rawEndPos then
-- Safety: the section from `nextStackPos.decreaseBy needle.utf8ByteSize` to `nextStackPos`
-- (exclusive) is exactly the needle, so it must represent a valid range.
let res := .matched (s.pos! (nextStackPos.decreaseBy needle.utf8ByteSize)) (s.pos! nextStackPos)
-- Invariants still satisfied
pure (.deflate .yield .proper needle table htable nextStackPos 0
(by simp [Pos.Raw.lt_iff] at hn ; omega) res,
by simpa using _, _, rfl, rfl, by simp [Pos.Raw.lt_iff] at hn ; omega,
Or.inl (by simp [nextStackPos, Pos.Raw.lt_iff] at h₁ ; omega))
else
let needlePos := needlePos.inc
if needlePos == needle.rawEndPos then
let nextStackPos := currStackPos.inc
let res := .matched (s.pos! startPos) (s.pos! nextStackPos)
have hiter := by
left
exists nextStackPos
simp only [Pos.Raw.byteIdx_inc, proper.injEq, true_and, exists_and_left,
exists_eq', and_true, nextStackPos, String.Pos.Raw.lt_iff]
constructor
· simp [String.Pos.Raw.le_iff] at h
omega
· simp [String.Pos.Raw.le_iff, String.Pos.Raw.lt_iff] at h1
omega
⟨.deflate .yield .proper needle table nextStackPos 0 res, hiter⟩
else
have hinv := by
simp [String.Pos.Raw.le_iff] at h
omega
findNext startPos currStackPos.inc needlePos hinv
-- Invariants still satisfied
pure (.deflate .skip .proper needle table htable nextStackPos nextNeedlePos
(by simp [Pos.Raw.lt_iff, nextNeedlePos, Pos.Raw.ext_iff] at h hn ; omega),
by simpa using _, _, rfl, rfl, by simp [nextNeedlePos, Pos.Raw.lt_iff, Pos.Raw.ext_iff] at h hn ; omega,
Or.inl (by simp [nextStackPos, Pos.Raw.lt_iff] at h₁ ; omega))
else
if startPos != s.rawEndPos then
let res := .rejected (s.pos! startPos) (s.pos! currStackPos)
⟨.deflate .yield .atEnd res, by simp⟩
if hnp : needlePos.byteIdx = 0 then
-- Safety: by invariant 2
let basePos := s.pos! stackPos
-- Since we report (mis)matches by code point and not by byte, missing in the first byte
-- means that we should skip ahead to the next code point.
let nextStackPos := s.findNextPos stackPos h₁
let res := .rejected basePos nextStackPos
-- Invariants still satisfied
pure (.deflate .yield .proper needle table htable nextStackPos.offset 0
(by simp [Pos.Raw.lt_iff] at hn ; omega) res,
by simpa using _, _, rfl, rfl, by simp [Pos.Raw.lt_iff] at hn ; omega,
Or.inl (by
have := lt_offset_findNextPos h₁
have t₀ := (findNextPos _ _ h₁).isValidForSlice.le_utf8ByteSize
simp [nextStackPos, Pos.Raw.lt_iff, Pos.Raw.le_iff] at this t₀ ; omega))
else
⟨.deflate .done, by simp⟩
termination_by s.utf8ByteSize - currStackPos.byteIdx
decreasing_by
simp [String.Pos.Raw.lt_iff] at h1
omega
findNext stackPos stackPos needlePos (by simp)
let newNeedlePos := table[needlePos.byteIdx - 1]'(by simp [Pos.Raw.lt_iff] at hn; omega)
if newNeedlePos = 0 then
-- Safety: by invariant 2
let basePos := s.pos! (stackPos.unoffsetBy needlePos)
-- Since we report (mis)matches by code point and not by byte, missing in the first byte
-- means that we should skip ahead to the next code point.
let nextStackPos := (s.pos? stackPos).getD (s.findNextPos stackPos h₁)
let res := .rejected basePos nextStackPos
-- Invariants still satisfied
pure (.deflate .yield .proper needle table htable nextStackPos.offset 0
(by simp [Pos.Raw.lt_iff] at hn ; omega) res,
by simpa using _, _, rfl, rfl, by simp [Pos.Raw.lt_iff] at hn ; omega, by
simp only [pos?, Pos.Raw.isValidForSlice_eq_true_iff, nextStackPos]
split
· exact Or.inr (by simp [Pos.Raw.lt_iff]; omega)
· refine Or.inl ?_
have := lt_offset_findNextPos h₁
have t₀ := (findNextPos _ _ h₁).isValidForSlice.le_utf8ByteSize
simp [Pos.Raw.lt_iff, Pos.Raw.le_iff] at this t₀ ; omega)
else
let oldBasePos := s.pos! (stackPos.decreaseBy needlePos.byteIdx)
let newBasePos := s.pos! (stackPos.decreaseBy newNeedlePos)
let res := .rejected oldBasePos newBasePos
-- Invariants still satisfied by definition of the prefix table
pure (.deflate .yield .proper needle table htable stackPos newNeedlePos
(by
subst htable
have := getElem_buildTable_le needle (needlePos.byteIdx - 1) (by simp [Pos.Raw.lt_iff] at hn; omega)
simp [newNeedlePos, Pos.Raw.lt_iff] at hn
omega) res,
by
simp only [proper.injEq, heq_eq_eq, true_and, exists_and_left, exists_prop,
reduceCtorEq, or_false]
refine ⟨_, _, rfl, rfl, ?_, Or.inr rfl, ?_⟩
all_goals
subst htable
have := getElem_buildTable_le needle (needlePos.byteIdx - 1) (by simp [Pos.Raw.lt_iff] at hn; omega)
simp [newNeedlePos, Pos.Raw.lt_iff] at hn
omega)
else
if 0 < needlePos then
let basePos := stackPos.unoffsetBy needlePos
let res := .rejected (s.pos! basePos) s.endPos
pure (.deflate .yield .atEnd res, by simp)
else
pure (.deflate .done, by simp)
| .atEnd => pure (.deflate .done, by simp)
private def toPair : ForwardSliceSearcher s → (Nat × Nat)
| .empty pos => (1, s.utf8ByteSize - pos.offset.byteIdx)
| .proper _ _ sp _ => (1, s.utf8ByteSize - sp.byteIdx)
| .atEnd => (0, 0)
private def toOption : ForwardSliceSearcher s → Option (Nat × Nat)
| .emptyBefore pos => some (s.utf8ByteSize - pos.offset.byteIdx, 1)
| .emptyAt pos _ => some (s.utf8ByteSize - pos.offset.byteIdx, 0)
| .proper _ _ _ sp np _ => some (s.utf8ByteSize - sp.byteIdx, np.byteIdx)
| .atEnd => none
private instance : WellFoundedRelation (ForwardSliceSearcher s) where
rel s1 s2 := Prod.Lex (· < ·) (· < ·) s1.toPair s2.toPair
rel := InvImage (Option.lt (Prod.Lex (· < ·) (· < ·))) ForwardSliceSearcher.toOption
wf := by
apply InvImage.wf
apply Option.wellFounded_lt
apply (Prod.lex _ _).wf
private def finitenessRelation :
@@ -167,38 +228,35 @@ private def finitenessRelation :
simp_wf
obtain ⟨step, h, h'⟩ := h
cases step
· cases h
simp only [Std.Iterators.IterM.IsPlausibleStep, Std.Iterators.Iterator.IsPlausibleStep] at h'
split at h'
· next heq =>
rw [heq]
rcases h' with ⟨np, h1', h2'⟩ | h'
· rw [h2']
apply Prod.Lex.right'
· simp
· have haux := np.isValidForSlice.le_utf8ByteSize
simp [Slice.Pos.lt_iff, String.Pos.Raw.le_iff, String.Pos.Raw.lt_iff] at h1' haux
omega
· apply Prod.Lex.left
simp [h']
· next heq =>
rw [heq]
rcases h' with ⟨np, sp, h1', h2', h3'⟩ | h'
· rw [h3']
apply Prod.Lex.right'
· simp
· simp [String.Pos.Raw.le_iff, String.Pos.Raw.lt_iff] at h1' h2'
omega
· apply Prod.Lex.left
simp [h']
· contradiction
· cases h'
all_goals try
cases h
revert h'
simp only [Std.Iterators.IterM.IsPlausibleStep, Std.Iterators.Iterator.IsPlausibleStep]
match it.internalState with
| .emptyBefore pos =>
rintro (⟨h, h'⟩|h') <;> simp [h', ForwardSliceSearcher.toOption, Option.lt, Prod.lex_def]
| .emptyAt pos h =>
simp only [forall_exists_index, and_imp]
intro x hx h
have := x.isValidForSlice.le_utf8ByteSize
simp [h, ForwardSliceSearcher.toOption, Option.lt, Prod.lex_def, Pos.lt_iff,
Pos.Raw.lt_iff, Pos.Raw.le_iff] at hx this
omega
| .proper .. =>
rintro (⟨newStackPos, newNeedlePos, h₁, h₂, (h | ⟨rfl, h⟩)⟩ | h)
· simp [h₂, ForwardSliceSearcher.toOption, Option.lt, Prod.lex_def, h]
· simpa [h, ForwardSliceSearcher.toOption, Option.lt, Prod.lex_def, Pos.Raw.lt_iff]
· simp [h, ForwardSliceSearcher.toOption, Option.lt]
| .atEnd .. => simp
· cases h
@[no_expose]
instance : Std.Iterators.Finite (ForwardSliceSearcher s) Id :=
.of_finitenessRelation finitenessRelation
instance : Std.Iterators.IteratorCollect (ForwardSliceSearcher s) Id Id :=
.defaultImplementation
instance : Std.Iterators.IteratorLoop (ForwardSliceSearcher s) Id Id :=
.defaultImplementation
@@ -218,9 +276,9 @@ def startsWith (s : Slice) (pat : Slice) : Bool :=
false
@[inline]
def dropPrefix? (s : Slice) (pat : Slice) : Option Slice :=
def dropPrefix? (s : Slice) (pat : Slice) : Option s.Pos :=
if startsWith s pat then
some <| s.replaceStart <| s.pos! <| pat.rawEndPos.offsetBy s.startPos.offset
some <| s.pos! <| pat.rawEndPos.offsetBy s.startPos.offset
else
none
@@ -254,9 +312,9 @@ def endsWith (s : Slice) (pat : Slice) : Bool :=
false
@[inline]
def dropSuffix? (s : Slice) (pat : Slice) : Option Slice :=
def dropSuffix? (s : Slice) (pat : Slice) : Option s.Pos :=
if endsWith s pat then
some <| s.replaceEnd <| s.pos! <| s.endPos.offset.unoffsetBy pat.rawEndPos
some <| s.pos! <| s.endPos.offset.unoffsetBy pat.rawEndPos
else
none


@@ -251,7 +251,7 @@ Examples:
-/
@[inline]
def dropPrefix? [ForwardPattern ρ] (s : Slice) (pat : ρ) : Option Slice :=
ForwardPattern.dropPrefix? s pat
(ForwardPattern.dropPrefix? s pat).map s.replaceStart
/--
If {name}`pat` matches a prefix of {name}`s`, returns the remainder. Returns {name}`s` unmodified
@@ -357,18 +357,18 @@ Examples:
-/
@[inline]
partial def takeWhile [ForwardPattern ρ] (s : Slice) (pat : ρ) : Slice :=
go s
go s.startPos
where
@[specialize pat]
go (curr : Slice) : Slice :=
if let some nextCurr := dropPrefix? curr pat then
if curr.startInclusive.offset < nextCurr.startInclusive.offset then
go (curr : s.Pos) : Slice :=
if let some nextCurr := ForwardPattern.dropPrefix? (s.replaceStart curr) pat then
if (s.replaceStart curr).startPos < nextCurr then
-- TODO: need lawful patterns to show this terminates
go nextCurr
go (Pos.ofReplaceStart nextCurr)
else
s.replaceEnd <| s.pos! <| curr.startInclusive.offset
s.replaceEnd curr
else
s.replaceEnd <| s.pos! <| curr.startInclusive.offset
s.replaceEnd curr
/--
Finds the position of the first match of the pattern {name}`pat` in a slice {name}`s`. If there
@@ -524,7 +524,7 @@ Examples:
-/
@[inline]
def dropSuffix? [BackwardPattern ρ] (s : Slice) (pat : ρ) : Option Slice :=
BackwardPattern.dropSuffix? s pat
(BackwardPattern.dropSuffix? s pat).map s.replaceEnd
/--
If {name}`pat` matches a suffix of {name}`s`, returns the remainder. Returns {name}`s` unmodified
@@ -629,18 +629,18 @@ Examples:
-/
@[inline]
partial def takeEndWhile [BackwardPattern ρ] (s : Slice) (pat : ρ) : Slice :=
go s
go s.endPos
where
@[specialize pat]
go (curr : Slice) : Slice :=
if let some nextCurr := dropSuffix? curr pat then
if nextCurr.endExclusive.offset < curr.endExclusive.offset then
go (curr : s.Pos) : Slice :=
if let some nextCurr := BackwardPattern.dropSuffix? (s.replaceEnd curr) pat then
if nextCurr < (s.replaceEnd curr).endPos then
-- TODO: need lawful patterns to show this terminates
go nextCurr
go (Pos.ofReplaceEnd nextCurr)
else
s.replaceStart <| s.pos! <| curr.endExclusive.offset
s.replaceStart curr
else
s.replaceStart <| s.pos! <| curr.endExclusive.offset
s.replaceStart curr
/--
Finds the position of the first match of the pattern {name}`pat` in a slice {name}`s`, starting


@@ -96,13 +96,17 @@ def warnIfUsesSorry (decl : Declaration) : CoreM Unit := do
-- This case should not happen, but it ensures a warning will get logged no matter what.
logWarning <| .tagged `hasSorry m!"declaration uses 'sorry'"
builtin_initialize
registerTraceClass `addDecl
/--
Adds the given declaration to the environment's private scope, deriving a suitable presentation in
the public scope if under the module system and if the declaration is not private. If `forceExpose`
is true, exposes the declaration body, i.e. preserves the full representation in the public scope,
independently of `Environment.isExporting` and even for theorems.
-/
def addDecl (decl : Declaration) (forceExpose := false) : CoreM Unit := do
def addDecl (decl : Declaration) (forceExpose := false) : CoreM Unit :=
withTraceNode `addDecl (fun _ => return m!"adding declarations {decl.getNames}") do
-- register namespaces for newly added constants; this used to be done by the kernel itself
-- but that is incompatible with moving it to a separate task
-- NOTE: we do not use `getTopLevelNames` here so that inductive types are registered as
@@ -115,26 +119,37 @@ def addDecl (decl : Declaration) (forceExpose := false) : CoreM Unit := do
let (name, info, kind) ← match decl with
| .thmDecl thm =>
if !forceExpose && (← getEnv).header.isModule then
trace[addDecl] "exporting theorem {thm.name} as axiom"
exportedInfo? := some <| .axiomInfo { thm with isUnsafe := false }
pure (thm.name, .thmInfo thm, .thm)
| .defnDecl defn | .mutualDefnDecl [defn] =>
if !forceExpose && (← getEnv).header.isModule && !(← getEnv).isExporting then
trace[addDecl] "exporting definition {defn.name} as axiom"
exportedInfo? := some <| .axiomInfo { defn with isUnsafe := defn.safety == .unsafe }
pure (defn.name, .defnInfo defn, .defn)
| .opaqueDecl op =>
if !forceExpose && (← getEnv).header.isModule && !(← getEnv).isExporting then
trace[addDecl] "exporting opaque {op.name} as axiom"
exportedInfo? := some <| .axiomInfo { op with }
pure (op.name, .opaqueInfo op, .opaque)
| .axiomDecl ax => pure (ax.name, .axiomInfo ax, .axiom)
| _ => return (← doAdd)
| _ =>
trace[addDecl] "no matching async adding rules, adding synchronously"
return (← doAdd)
if decl.getTopLevelNames.all isPrivateName && !(← ResolveName.backward.privateInPublic.getM) then
exportedInfo? := none
if decl.getTopLevelNames.all isPrivateName then
if (← ResolveName.backward.privateInPublic.getM) then
trace[addDecl] "private decl under `privateInPublic`, exporting as is"
exportedInfo? := some info
else
trace[addDecl] "not exporting private declaration at all"
exportedInfo? := none
else
-- preserve original constant kind in extension if different from exported one
if exportedInfo?.isSome then
modifyEnv (privateConstKindsExt.insert · name kind)
else
trace[addDecl] "no matching exporting rules, exporting as is"
exportedInfo? := some info
-- no environment extension changes to report after kernel checking; ensures we do not


@@ -148,11 +148,14 @@ def throwAttrDeclNotOfExpectedType (attrName declName : Name) (givenType expecte
throwError m!"Cannot add attribute `[{attrName}]`: Declaration `{.ofConstName declName}` has type{indentExpr givenType}\n\
but `[{attrName}]` can only be added to declarations of type{indentExpr expectedType}"
def ensureAttrDeclIsPublic [MonadEnv m] (attrName declName : Name) (attrKind : AttributeKind) : m Unit := do
if (← getEnv).header.isModule && attrKind != .local && !((← getEnv).setExporting true).contains declName then
throwError m!"Cannot add attribute `[{attrName}]`: Declaration `{.ofConstName declName}` must be public"
def ensureAttrDeclIsPublic (attrName declName : Name) (attrKind : AttributeKind) : AttrM Unit := do
if (← getEnv).header.isModule && attrKind != .local then
withExporting do
checkPrivateInPublic declName
if !(← hasConst declName) then
throwError m!"Cannot add attribute `[{attrName}]`: Declaration `{.ofConstName declName}` must be public"
def ensureAttrDeclIsMeta [MonadEnv m] (attrName declName : Name) (attrKind : AttributeKind) : m Unit := do
def ensureAttrDeclIsMeta (attrName declName : Name) (attrKind : AttributeKind) : AttrM Unit := do
if (← getEnv).header.isModule && !isMeta (← getEnv) declName then
throwError m!"Cannot add attribute `[{attrName}]`: Declaration `{.ofConstName declName}` must be marked as `meta`"
-- Make sure attributed decls can't refer to private meta imports, which is already checked for


@@ -195,6 +195,10 @@ where
visitApp (f : Expr) (args : Array Expr) := do
let fNew ← match f with
| .const declName us =>
if (← getEnv).isExporting && isPrivateName declName then
-- This branch can happen under `backward.privateInPublic`; restore original behavior of
-- failing here, which is caught and ignored above by `observing`.
throwError "internal compiler error: private in public"
let .inductInfo _ ← getConstInfo declName | return anyExpr
pure <| .const declName us
| .fvar .. => pure f


@@ -1616,7 +1616,7 @@ private def elabAppLValsAux (namedArgs : Array NamedArg) (args : Array Arg) (exp
| LValResolution.projFn baseStructName structName fieldName =>
let f ← mkBaseProjections baseStructName structName f
let some info := getFieldInfo? (← getEnv) baseStructName fieldName | unreachable!
if isInaccessiblePrivateName (← getEnv) info.projFn then
if (← isInaccessiblePrivateName info.projFn) then
throwError "Field `{fieldName}` from structure `{structName}` is private"
let projFn ← mkConst info.projFn
let projFn ← addProjTermInfo lval.getRef projFn
@@ -1750,7 +1750,7 @@ where
match resultTypeFn with
| .const declName .. =>
let env ← getEnv
if isInaccessiblePrivateName env declName then
if (← isInaccessiblePrivateName declName) then
throwError "The private declaration `{.ofConstName declName}` is not accessible in the current context"
-- Recall that the namespace for private declarations is non-private.
let fullName := privateToUserName declName ++ id


@@ -56,7 +56,7 @@ open Meta
(fun ival _ => do
match ival.ctors with
| [ctor] =>
if isInaccessiblePrivateName (← getEnv) ctor then
if (← isInaccessiblePrivateName ctor) then
throwError "Invalid `⟨...⟩` notation: Constructor for `{ival.name}` is marked as private"
let cinfo ← getConstInfoCtor ctor
let numExplicitFields ← forallTelescopeReducing cinfo.type fun xs _ => do


@@ -164,7 +164,7 @@ private def getMVarFromUserName (ident : Syntax) : MetaM Expr := do
-- `by` switches from an exported to a private context, so we must disallow unassigned
-- metavariables in the goal in this case as they could otherwise leak private data back into
-- the exported context.
mkTacticMVar expectedType stx .term (delayOnMVars := (← getEnv).isExporting)
mkTacticMVar expectedType stx .term (delayOnMVars := (← getEnv).isExporting && !(← backward.proofsInPublic.getM))
| none =>
tryPostpone
throwError ("invalid 'by' tactic, expected type has not been provided")


@@ -1222,7 +1222,9 @@ where
assert! view.kind.isTheorem
let env ← getEnv
let async ← env.addConstAsync declId.declName .thm
(exportedKind? := guard (!isPrivateName declId.declName) *> some .axiom)
(exportedKind? :=
guard (!isPrivateName declId.declName || (← ResolveName.backward.privateInPublic.getM)) *>
some .axiom)
setEnv async.mainEnv
-- TODO: parallelize header elaboration as well? Would have to refactor auto implicits catch,


@@ -135,7 +135,10 @@ private def shouldGenCodeFor (preDef : PreDefinition) : Bool :=
!preDef.kind.isTheorem && !preDef.modifiers.isNoncomputable
private def compileDecl (decl : Declaration) : TermElabM Unit := do
Lean.compileDecl (logErrors := !(← read).isNoncomputableSection) decl
Lean.compileDecl
-- `meta` should disregard `noncomputable section`
(logErrors := !(← read).isNoncomputableSection || decl.getTopLevelNames.any (isMeta (← getEnv)))
decl
register_builtin_option diagnostics.threshold.proofSize : Nat := {
defValue := 16384


@@ -1190,7 +1190,7 @@ private def elabStructInstView (s : StructInstView) (structName : Name) (structT
TermElabM Expr := withRef s.ref do
let env ← getEnv
let ctorVal := getStructureCtor env structName
if isInaccessiblePrivateName env ctorVal.name then
if (← isInaccessiblePrivateName ctorVal.name) then
throwError "invalid \{...} notation, constructor for `{.ofConstName structName}` is marked as private"
let { ctorFn, ctorFnType, structType, levels, params } ← mkCtorHeader ctorVal structType?
let (_, fields) ← expandFields structName s.fields (recover := (← read).errToSorry)


@@ -340,6 +340,13 @@ private def TacticMVarKind.maybeWithoutRecovery (kind : TacticMVarKind) (m : Tac
else
m
register_builtin_option backward.proofsInPublic : Bool := {
defValue := false
descr := "(module system) Do not abstract proofs used in the public scope into auxiliary \
theorems. Enabling this option may lead to failures or, when `backward.privateInPublic` and \
its `warn` sub-option are enabled, additional warnings from private accesses."
}
mutual
/--
@@ -352,7 +359,7 @@ mutual
partial def runTactic (mvarId : MVarId) (tacticCode : Syntax) (kind : TacticMVarKind) (report := true) : TermElabM Unit := withoutAutoBoundImplicit do
let wasExporting := (← getEnv).isExporting
-- exit exporting context if entering proof
let isNoLongerExporting ← pure wasExporting <&&> do
let isNoLongerExporting ← pure (wasExporting && !(← backward.proofsInPublic.getM)) <&&> do
mvarId.withContext do
isProp (← mvarId.getType)
instantiateMVarDeclMVars mvarId


@@ -112,7 +112,7 @@ def realizeExtTheorem (structName : Name) (flat : Bool) : Elab.Command.CommandEl
try
Elab.Command.liftTermElabM <| withoutErrToSorry <| withDeclName extName do
let type ← mkExtType structName flat
let pf ← withSynthesize do
let pf ← withoutExporting <| withSynthesize do
let indVal ← getConstInfoInduct structName
let params := Array.replicate indVal.numParams (← `(_))
Elab.Term.elabTermEnsuringType (expectedType? := type) (implicitLambda := false)
@@ -149,7 +149,7 @@ def realizeExtIffTheorem (extName : Name) : Elab.Command.CommandElabM Name := do
let info ← getConstInfo extName
Elab.Command.liftTermElabM <| withoutErrToSorry <| withDeclName extIffName do
let type ← mkExtIffType extName
let pf ← withSynthesize do
let pf ← withoutExporting <| withSynthesize do
Elab.Term.elabTermEnsuringType (expectedType? := type) <| `(by
intros
refine ⟨?_, ?_⟩


@@ -2443,13 +2443,17 @@ def replayConsts (dest : Environment) (oldEnv newEnv : Environment) (skipExistin
else
consts.add c
}
checked := dest.checked.map fun kenv => replayKernel exts newPrivateConsts kenv |>.toOption.getD kenv
checked := dest.checked.map fun kenv =>
replayKernel
oldEnv.toKernelEnv.extensions newEnv.toKernelEnv.extensions exts newPrivateConsts kenv
|>.toOption.getD kenv
allRealizations := dest.allRealizations.map (sync := true) fun allRealizations =>
newPrivateConsts.foldl (init := allRealizations) fun allRealizations c =>
allRealizations.insert c.constInfo.name c
}
where
replayKernel (exts : Array (EnvExtension EnvExtensionState)) (consts : List AsyncConst)
replayKernel (oldState newState : Array EnvExtensionState)
(exts : Array (EnvExtension EnvExtensionState)) (consts : List AsyncConst)
(kenv : Kernel.Environment) : Except Kernel.Exception Kernel.Environment := do
let mut kenv := kenv
-- replay extensions first in case kernel checking needs them (`IR.declMapExt`)
@@ -2459,8 +2463,8 @@ where
-- safety: like in `modifyState`, but that one takes an elab env instead of a kernel env
extensions := unsafe (ext.modifyStateImpl kenv.extensions <|
replay
(ext.getStateImpl oldEnv.toKernelEnv.extensions)
(ext.getStateImpl newEnv.toKernelEnv.extensions)
(ext.getStateImpl oldState)
(ext.getStateImpl newState)
(consts.map (·.constInfo.name))) }
for c in consts do
if skipExisting && (kenv.find? c.constInfo.name).isSome then
@@ -2602,7 +2606,11 @@ def realizeConst (env : Environment) (forConst : Name) (constName : Name)
{ c with exts? := some <| .pure realizeEnv'.base.private.extensions }
else c
let exts ← EnvExtension.envExtensionsRef.get
let replayKernel := replayConsts.replayKernel (skipExisting := true) realizeEnv realizeEnv' exts newPrivateConsts
-- NOTE: We must ensure that `realizeEnv.localRealizationCtxMap` is not reachable via `res`
-- (such as by storing `realizeEnv` or `realizeEnv'` in a field or the closure) as `res` will be
-- stored in a promise in there, creating a cycle.
let replayKernel := replayConsts.replayKernel (skipExisting := true)
realizeEnv.toKernelEnv.extensions realizeEnv'.toKernelEnv.extensions exts newPrivateConsts
let res : RealizeConstResult := { newConsts.private := newPrivateConsts, newConsts.public := newPublicConsts, replayKernel, dyn }
pure (.mk res)
let some res := res.get? RealizeConstResult | unreachable!


@@ -2524,6 +2524,9 @@ generated diagnostics is deterministic). Note that, as `realize` is run using th
declaration time of `forConst`, trace options must be set prior to that (or, for imported constants,
on the cmdline) in order to be active. If `realize` throws an exception, it is rethrown at all
callers.
CAVEAT: `realize` MUST NOT reference the current environment (the result of `getEnv`) in its result
to avoid creating an un-collectable promise cycle.
-/
def realizeValue [BEq α] [Hashable α] [TypeName α] [TypeName β] (forConst : Name) (key : α) (realize : MetaM β) :
MetaM β := do


@@ -194,6 +194,24 @@ def throwLetTypeMismatchMessage {α} (fvarId : FVarId) : MetaM α := do
throwError "invalid {declKind} declaration, term{indentExpr v}\nhas type{indentExpr vType}\nbut is expected to have type{indentExpr t}"
| _ => unreachable!
/-- Adds note about definitions not unfolded because of the module system, if any. -/
def mkUnfoldAxiomsNote (givenType expectedType : Expr) : MetaM MessageData := do
let env ← getEnv
if env.header.isModule then
let origDiag := (← get).diag
try
let _ ← observing <| withOptions (diagnostics.set · true) <| isDefEq givenType expectedType
let blocked := (← get).diag.unfoldAxiomCounter.toList.filterMap fun (n, count) => do
let count := count - origDiag.unfoldAxiomCounter.findD n 0
guard <| count > 0 && getOriginalConstKind? env n matches some .defn
return m!"{.ofConstName n} ↦ {count}"
if !blocked.isEmpty then
return MessageData.note m!"The following definitions were not unfolded because \
their definition is not exposed:{indentD <| .joinSep blocked Format.line}"
finally
modify ({ · with diag := origDiag })
return .nil
/--
Return error message "has type{givenType}\nbut is expected to have type{expectedType}"
Adds the types unless they are defeq.
@@ -226,19 +244,7 @@ def mkHasTypeButIsExpectedMsg (givenType expectedType : Expr)
let (givenType, expectedType) ← addPPExplicitToExposeDiff givenType expectedType
let trailing := trailing?.map (m!"\n" ++ ·) |>.getD .nil
pure m!"has type{indentExpr givenType}\nbut is expected to have type{indentExpr expectedType}{trailing}")
let env ← getEnv
if env.header.isModule then
let origDiag := (← get).diag
let _ ← observing <| withOptions (diagnostics.set · true) <| isDefEq givenType expectedType
let blocked := (← get).diag.unfoldAxiomCounter.toList.filterMap fun (n, count) => do
let count := count - origDiag.unfoldAxiomCounter.findD n 0
guard <| count > 0 && getOriginalConstKind? env n matches some .defn
return m!"{.ofConstName n} ↦ {count}"
if !blocked.isEmpty then
msg := msg ++ MessageData.note m!"The following definitions were not unfolded because \
their definition is not exposed:{indentD <| .joinSep blocked Format.line}"
modify ({ · with diag := origDiag })
return msg
return msg ++ (← mkUnfoldAxiomsNote givenType expectedType)
def throwAppTypeMismatch (f a : Expr) : MetaM α := do
-- Clarify that `a` is "last" only if it may be confused with some preceding argument; otherwise,


@@ -8,6 +8,7 @@ module
prelude
public import Lean.Meta.Check
public import Lean.Meta.Tactic.AuxLemma
import Lean.Util.ForEachExpr
public section
@@ -349,9 +350,67 @@ def mkValueTypeClosureAux (type : Expr) (value : Expr) : ClosureM (Expr × Expr)
process
pure (type, value)
private structure TopoSort where
tempMark : FVarIdHashSet := {}
doneMark : FVarIdHashSet := {}
newDecls : Array LocalDecl := #[]
newArgs : Array Expr := #[]
/--
By construction, the `newLocalDecls` for fvars are in dependency order, but those for MVars may not be,
and need to be interleaved appropriately. Thus we do a “topological insertion sort” of these.
We care about efficiency for the common case of many fvars and no mvars.
-/
private partial def sortDecls (sortedDecls : Array LocalDecl) (sortedArgs : Array Expr)
(toSortDecls : Array LocalDecl) (toSortArgs : Array Expr) : CoreM (Array LocalDecl × Array Expr) := do
assert! sortedDecls.size = sortedArgs.size
assert! toSortDecls.size = toSortArgs.size
if toSortDecls.isEmpty then
return (sortedDecls, sortedArgs)
trace[Meta.Closure] "MVars to abstract, topologically sorting the abstracted variables"
let mut m : Std.HashMap FVarId (LocalDecl × Expr) := {}
for decl in sortedDecls, arg in sortedArgs do
m := m.insert decl.fvarId (decl, arg)
for decl in toSortDecls, arg in toSortArgs do
m := m.insert decl.fvarId (decl, arg)
let rec visit (fvarId : FVarId) : StateT TopoSort CoreM Unit := do
let some (decl, arg) := m.get? fvarId | return
if (← get).doneMark.contains decl.fvarId then
return ()
trace[Meta.Closure] "Sorting decl {mkFVar decl.fvarId} : {decl.type}"
if (← get).tempMark.contains decl.fvarId then
throwError "cycle detected in sorting abstracted variables"
assert! !decl.isLet (allowNondep := true) -- should all be cdecls
modify fun s => { s with tempMark := s.tempMark.insert decl.fvarId }
let type := decl.type
type.forEach' fun e => do
if e.hasFVar then
if e.isFVar then
visit e.fvarId!
return true
else
return false
modify fun s => { s with
newDecls := s.newDecls.push decl
newArgs := s.newArgs.push arg
doneMark := s.doneMark.insert decl.fvarId
}
let s₀ := { newDecls := .emptyWithCapacity m.size, newArgs := .emptyWithCapacity m.size }
StateT.run' (s := s₀) do
for decl in sortedDecls do
visit decl.fvarId
for decl in toSortDecls do
visit decl.fvarId
let {newDecls, newArgs, .. } ← get
trace[Meta.Closure] "Sorted fvars: {newDecls.map (mkFVar ·.fvarId)}"
return (newDecls, newArgs)
def mkValueTypeClosure (type : Expr) (value : Expr) (zetaDelta : Bool) : MetaM MkValueTypeClosureResult := do
let ((type, value), s) ← ((mkValueTypeClosureAux type value).run { zetaDelta }).run {}
let newLocalDecls := s.newLocalDecls.reverse ++ s.newLocalDeclsForMVars
let (newLocalDecls, newArgs) ← sortDecls s.newLocalDecls.reverse s.exprFVarArgs.reverse
s.newLocalDeclsForMVars s.exprMVarArgs
let newLetDecls := s.newLetDecls.reverse
let type := mkForall newLocalDecls (mkForall newLetDecls type)
let value := mkLambda newLocalDecls (mkLambda newLetDecls value)
@@ -360,7 +419,7 @@ def mkValueTypeClosure (type : Expr) (value : Expr) (zetaDelta : Bool) : MetaM M
value := value,
levelParams := s.levelParams,
levelArgs := s.levelArgs,
exprArgs := s.exprFVarArgs.reverse ++ s.exprMVarArgs
exprArgs := newArgs
}
end Closure
@@ -396,4 +455,7 @@ def mkAuxTheorem (type : Expr) (value : Expr) (zetaDelta : Bool := false) (kind?
let name ← mkAuxLemma (kind? := kind?) (cache := cache) result.levelParams.toList result.type result.value
return mkAppN (mkConst name result.levelArgs.toList) result.exprArgs
builtin_initialize
registerTraceClass `Meta.Closure
end Lean.Meta
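As an aside, the three-mark DFS used by `sortDecls` above (`tempMark` for in-progress nodes to detect cycles, `doneMark` for finished ones) is the classic depth-first topological sort. A minimal standalone sketch over a toy graph of `Nat` nodes, with hypothetical names unrelated to this diff:

```lean
-- Minimal model of the visit scheme in `sortDecls`: a node is emitted only
-- after all of its dependencies, and seeing a node that is still "in
-- progress" (on the `temp` list) signals a cycle (`none`).
partial def visit (adj : Nat → List Nat) (n : Nat) (temp : List Nat)
    (st : List Nat × List Nat) : Option (List Nat × List Nat) :=
  let (done, out) := st
  if done.contains n then some st
  else if temp.contains n then none -- cycle detected
  else do
    let (done, out) ← (adj n).foldlM (fun acc m => visit adj m (n :: temp) acc) (done, out)
    some (n :: done, out ++ [n])

def topoSort (adj : Nat → List Nat) (nodes : List Nat) : Option (List Nat) := do
  let (_, out) ← nodes.foldlM (fun acc n => visit adj n [] acc) ([], [])
  some out

-- 2 depends on 1, 1 depends on 0; dependencies are emitted first:
#guard topoSort (fun | 2 => [1] | 1 => [0] | _ => []) [2, 0] = some [0, 1, 2]
```

The real code additionally interleaves an already-sorted prefix for efficiency, since the common case has many fvars and no mvars.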


@@ -34,7 +34,7 @@ private def throwApplyError {α} (mvarId : MVarId)
let (conclusionType, targetType) ← addPPExplicitToExposeDiff conclusionType targetType
let conclusion := if conclusionType?.isNone then "type" else "conclusion"
return m!"could not unify the {conclusion} of {term?.getD "the term"}{indentExpr conclusionType}\n\
with the goal{indentExpr targetType}{note}"
with the goal{indentExpr targetType}{note}{← mkUnfoldAxiomsNote conclusionType targetType}"
def synthAppInstances (tacticName : Name) (mvarId : MVarId) (mvarsNew : Array Expr) (binderInfos : Array BinderInfo)
(synthAssignedInstances : Bool) (allowSynthFailures : Bool) : MetaM Unit := do


@@ -283,7 +283,7 @@ improve the errors, for example by passing down a flag whether we expect the sam
occurrences of `newIH`), or whether we are in “supple mode”, and catch it earlier if the rewriting
fails.
-/
partial def foldAndCollect (oldIH newIH : FVarId) (isRecCall : Expr → Option Expr) (e : Expr) : M Expr := do
partial def foldAndCollect (oldIH newIH : FVarId) (isRecCall : Expr → Option Expr) (e : Expr) : M Expr := withoutExporting do
unless e.containsFVar oldIH do
return e
withTraceNode `Meta.FunInd (pure m!"{exceptEmoji ·} foldAndCollect ({mkFVar oldIH} → {mkFVar newIH})::{indentExpr e}") do


@@ -25,19 +25,4 @@ def mkPrivateName (env : Environment) (n : Name) : Name :=
-- is private to *this* module.
mkPrivateNameCore env.mainModule <| privateToUserName n
def isInaccessiblePrivateName (env : Environment) (n : Name) : Bool := Id.run do
if !isPrivateName n then
return false
-- All private names are inaccessible from the public scope
if env.isExporting then
return true
-- In the private scope, ...
match env.getModuleIdxFor? n with
| some modIdx =>
-- ... allow access through `import all`
!env.header.isModule || !env.header.modules[modIdx]?.any (·.importAll)
| none =>
-- ... allow all accesses in the current module
false
end Lean


@@ -45,22 +45,22 @@ private def projInfo (c : Name) : MetaM (Option (Name × Nat × Bool × Bool ×
/--
Checks that `e` is an application of a constant that equals `baseName`, taking into consideration private name mangling.
-/
private def isAppOfBaseName (env : Environment) (e : Expr) (baseName : Name) : Bool :=
private def isAppOfBaseName (e : Expr) (baseName : Name) : MetaM Bool := do
if let some c := e.cleanupAnnotations.getAppFn.constName? then
privateToUserName c == baseName && !isInaccessiblePrivateName env c
return privateToUserName c == baseName && !(← isInaccessiblePrivateName c)
else
false
return false
/--
Like `Lean.Elab.Term.typeMatchesBaseName` but does not use `Function` for pi types.
-/
private partial def typeMatchesBaseName (type : Expr) (baseName : Name) : MetaM Bool := do
withReducibleAndInstances do
if isAppOfBaseName (← getEnv) type baseName then
if (← isAppOfBaseName type baseName) then
return true
else
let type ← whnfCore type
if isAppOfBaseName (← getEnv) type baseName then
if (← isAppOfBaseName type baseName) then
return true
else
match unfoldDefinition? type with
@@ -94,7 +94,7 @@ private def generalizedFieldInfo (c : Name) (args : Array Expr) : MetaM (Name ×
-- We require an exact match for the base name.
-- While `Lean.Elab.Term.resolveLValLoop` is able to unfold the type and iterate, we do not attempt to exploit this feature.
-- (To get it right, we would need to check that each relevant namespace does not contain a declaration named `field`.)
guard <| isAppOfBaseName (← getEnv) (← instantiateMVars <| ← inferType args[i]!) baseName
guard (← isAppOfBaseName (← instantiateMVars <| ← inferType args[i]!) baseName)
return (field, i)
else
-- We only use the first explicit argument for field notation.
@@ -114,7 +114,7 @@ returns the field name to use and the argument index for the object of the field
def fieldNotationCandidate? (f : Expr) (args : Array Expr) (useGeneralizedFieldNotation : Bool) : MetaM (Option (Name × Nat)) := do
let env ← getEnv
let .const c .. := f.consumeMData | return none
if isInaccessiblePrivateName (← getEnv) c then
if (← isInaccessiblePrivateName c) then
return none
if c.getPrefix.isAnonymous then return none
-- If there is `pp_nodot` on this function, then don't use field notation for it.


@@ -275,6 +275,23 @@ def checkPrivateInPublic (id : Name) : m Unit := do
this is allowed only because the `backward.privateInPublic` option is enabled. \n\n\
Disable `backward.privateInPublic.warn` to silence this warning."
def isInaccessiblePrivateName [Monad m] [MonadEnv m] [MonadOptions m] (n : Name) : m Bool := do
if !isPrivateName n then
return false
let env ← getEnv
-- All private names are inaccessible from the public scope
if env.isExporting && !(← ResolveName.backward.privateInPublic.getM) then
return true
checkPrivateInPublic n
-- In the private scope, ...
match env.getModuleIdxFor? n with
| some modIdx =>
-- ... allow access through `import all`
return !env.header.isModule || !env.header.modules[modIdx]?.any (·.importAll)
| none =>
-- ... allow all accesses in the current module
return false
/--
Given a name `n`, return a list of possible interpretations.
Each interpretation is a pair `(declName, fieldList)`, where `declName`


@@ -4,12 +4,13 @@ Released under Apache 2.0 license as described in the file LICENSE.
Authors: Sebastian Ullrich
-/
module
prelude
import Std.Data
import Std.Do
import Std.Sat
import Std.Sync
import Std.Time
import Std.Tactic
import Std.Internal
import Std.Net
public import Std.Data
public import Std.Do
public import Std.Sat
public import Std.Sync
public import Std.Time
public import Std.Tactic
public import Std.Internal
public import Std.Net


@@ -604,7 +604,7 @@ public def buildArtifactUnlessUpToDate
else
computeArtifact file ext text
else if (← savedTrace.replayIfUpToDate file depTrace) then
computeArtifact file ext
computeArtifact file ext text
else if let some art ← fetchArt? (restore := true) then
return art
else
@@ -627,7 +627,7 @@ where
build
clearFileHash file
removeFileIfExists traceFile
computeArtifact file ext
computeArtifact file ext text
/--
Build `file` using `build` after `dep` completes if the dependency's


@@ -647,7 +647,8 @@ private def Module.computeArtifacts (mod : Module) (isModule : Bool) : FetchM Mo
}
where
@[inline] compute file ext := do
computeArtifact file ext
-- Note: Lean produces LF-only line endings for `.c` and `.ilean`, so no normalization.
computeArtifact file ext (text := false)
computeIf c file ext := do
if c then return some (← compute file ext) else return none


@@ -0,0 +1,21 @@
import Lean
open Lean Meta
-- set_option trace.Meta.Closure true
/--
info: before: h.2
---
info: after: _proof_2 ?m.1 h
-/
#guard_msgs(pass trace, all) in
run_meta do
have l := ← Lean.Meta.mkFreshExprMVar (mkConst ``True) (kind := .syntheticOpaque)
let ty := mkAnd (mkConst ``True) (.letE `foo (mkConst ``True) l (mkConst ``True) false)
withLocalDecl `h default ty fun x => do
let e := mkProj ``And 1 x
Lean.logInfo m!"before: {e}"
-- works fine without this line
let e ← Lean.Meta.mkAuxTheorem (mkConst ``True) e (zetaDelta := true) -- or false, not really relevant
Lean.logInfo m!"after: {e}"


@@ -0,0 +1,39 @@
module
inductive S where
| m (b e : Nat)
| r (b e : Nat)
deriving Repr, BEq, DecidableEq
def run (s pat : String) : List S :=
String.Slice.Pattern.ForwardSliceSearcher.iter s.toSlice pat.toSlice
|>.map (fun | .matched b e => S.m b.offset.byteIdx e.offset.byteIdx | .rejected b e => S.r b.offset.byteIdx e.offset.byteIdx)
|>.toList
-- 𝔸 is [240,157,148,184]
-- 𝕸 is [240,157,149,184]
#guard run "aababaab" "a" = [.m 0 1, .m 1 2, .r 2 3, .m 3 4, .r 4 5, .m 5 6, .m 6 7, .r 7 8]
#guard run "aab" "ab" = [.r 0 1, .m 1 3]
#guard run "aababacab" "ab" = [.r 0 1, .m 1 3, .m 3 5, .r 5 6, .r 6 7, .m 7 9]
#guard run "aaab" "aab" = [.r 0 1, .m 1 4]
#guard run "aaaaa" "aa" = [.m 0 2, .m 2 4, .r 4 5]
#guard run "abcabd" "abd" = [.r 0 2, .r 2 3, .m 3 6]
#guard run "αβ" "β" = [.r 0 2, .m 2 4]
#guard run "𝔸" "𝕸" = [.r 0 4]
#guard run "𝔸𝕸" "𝕸" = [.r 0 4, .m 4 8]
#guard run "α𝔸€α𝔸₭" "α𝔸₭" = [.r 0 9, .m 9 18]
#guard run "α𝔸𝕸α𝔸₭" "α𝔸₭" = [.r 0 6, .r 6 10, .m 10 19]
#guard run "𝕸𝔸𝕸𝔸₭" "𝕸𝔸₭" = [.r 0 8, .m 8 19]
#guard run "𝕸𝔸𝕸β₭" "𝕸𝔸₭" = [.r 0 8, .r 8 12, .r 12 14, .r 14 17]
#guard run "𝔸𝔸𝔸𝔸𝕸𝔸𝔸𝔸𝕸" "𝔸𝔸𝕸" = [.r 0 4, .r 4 8, .m 8 20, .r 20 24, .m 24 36]
#guard run "𝔸b" "𝕸" = [.r 0 4, .r 4 5]
#guard run "𝔸bb𝕸β" "𝕸" = [.r 0 4, .r 4 5, .r 5 6, .m 6 10, .r 10 12]
#guard run "𝔸bbββαβαββββ𝕸β" "ββ𝕸" = [.r 0 4, .r 4 5, .r 5 6, .r 6 8, .r 8 10, .r 10 12, .r 12 14, .r 14 16, .r 16 18, .r 18 20, .m 20 28, .r 28 30]
#guard run "𝔸β𝕸" "𝕸" = [.r 0 4, .r 4 6, .m 6 10]
#guard run "𝔸b𝕸xu∅" "𝕸x" = [.r 0 4, .r 4 5, .m 5 10, .r 10 11, .r 11 14]
#guard run "é" "ù" = [.r 0 2]
#guard run "éB" "ù" = [.r 0 2, .r 2 3]
#guard run "abcabdabcabcabcabe" "abcabdabcabe" = [.r 0 6, .r 6 9, .r 9 12, .r 12 15, .r 15 17, .r 17 18]
#guard run "abcabdabcxabcabdabcabe" "abcabdabcabe" = [.r 0 6, .r 6 9, .r 9 10, .m 10 22]
#guard run "€α𝕸€α𝔸€α𝕸€α𝕸€α𝕸€αù" "€α𝕸€α𝔸€α𝕸€αù" = [.r 0 18, .r 18 27, .r 27 36, .r 36 45, .r 45 50, .r 50 52]
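The backtracking that the tests above exercise is driven by the KMP failure (prefix) function. A minimal, self-contained sketch of that table, independent of the library's searcher (the helper name is hypothetical):

```lean
-- `fail[i]` = length of the longest proper prefix of `pat` that is also a
-- suffix of `pat[0..i]`; on a mismatch the search falls back to `fail[k-1]`
-- instead of restarting from scratch, which is the bookkeeping the fix corrects.
def kmpFailure (pat : Array Char) : Array Nat := Id.run do
  let mut fail : Array Nat := Array.replicate pat.size 0
  let mut k := 0
  for i in [1:pat.size] do
    while k > 0 && pat[i]! != pat[k]! do
      k := fail[k - 1]!
    if pat[i]! == pat[k]! then
      k := k + 1
    fail := fail.set! i k
  return fail

#guard kmpFailure "abcabd".toList.toArray = #[0, 0, 0, 1, 2, 0]
```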


@@ -55,6 +55,11 @@ Tests for `String.Slice` functions
#guard "red red green blue".toSlice.takeWhile "red " == "red red ".toSlice
#guard "red green blue".toSlice.takeWhile (fun (_ : Char) => true) == "red green blue".toSlice
#guard (" ".toSlice.dropWhile ' ' |>.takeWhile Char.isLower) == "".toSlice
#guard (" ".toSlice.dropWhile ' ' |>.takeEndWhile Char.isLower) == "".toSlice
#guard ("∃a∃".toSlice.drop 1 |>.takeWhile Char.isLower) == "a".toSlice
#guard ("∃a∃".toSlice.dropEnd 1 |>.takeEndWhile Char.isLower) == "a".toSlice
#guard "red green blue".toSlice.dropPrefix? "red " == some "green blue".toSlice
#guard "red green blue".toSlice.dropPrefix? "reed " == none
#guard "red green blue".toSlice.dropPrefix? 'r' == some "ed green blue".toSlice
@@ -204,3 +209,25 @@ Tests for `String.Slice` functions
#guard "abc".toSlice.back = 'c'
#guard "".toSlice.back = (default : Char)
section
open String.Slice.Pattern
instance [Monad n] {s : String.Slice} : Std.Iterators.IteratorCollect (ForwardSliceSearcher s) Id n :=
.defaultImplementation
#guard (ToForwardSearcher.toSearcher "".toSlice "").toList == [.matched "".toSlice.startPos "".toSlice.startPos]
#guard (ToForwardSearcher.toSearcher "abc".toSlice "").toList == [
.matched ("abc".toSlice.pos 0 (by decide)) ("abc".toSlice.pos 0 (by decide)),
.rejected ("abc".toSlice.pos 0 (by decide)) ("abc".toSlice.pos 1 (by decide)),
.matched ("abc".toSlice.pos 1 (by decide)) ("abc".toSlice.pos 1 (by decide)),
.rejected ("abc".toSlice.pos 1 (by decide)) ("abc".toSlice.pos 2 (by decide)),
.matched ("abc".toSlice.pos 2 (by decide)) ("abc".toSlice.pos 2 (by decide)),
.rejected ("abc".toSlice.pos 2 (by decide)) ("abc".toSlice.pos 3 (by decide)),
.matched ("abc".toSlice.pos 3 (by decide)) ("abc".toSlice.pos 3 (by decide)),
]
end
#guard ("".toSlice.split "").allowNontermination.toList == ["".toSlice, "".toSlice]
#guard ("abc".toSlice.split "").allowNontermination.toList == ["".toSlice, "a".toSlice, "b".toSlice, "c".toSlice, "".toSlice]
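The expected shape of the two `split` guards above can be stated as a pure model: splitting by the empty pattern yields a leading empty slice, each character as a singleton, and a trailing empty slice, matching Rust's `str::split("")`. A hypothetical standalone check (not the library's implementation):

```lean
-- Pure model of splitting by the empty pattern (illustrative only).
def splitEmptyModel (s : String) : List String :=
  "" :: s.toList.map String.singleton ++ [""]

#guard splitEmptyModel "" = ["", ""]
#guard splitEmptyModel "abc" = ["", "a", "b", "c", ""]
```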


@@ -496,3 +496,11 @@ Note: A private declaration `S.s` (from the current module) exists but would nee
-/
#guard_msgs in
@[expose] public def useS (s : S) := s.s
/- `meta` should trump `noncomputable`. -/
noncomputable section
/-- error: Invalid `meta` definition `m`, `S.s` not marked `meta` -/
#guard_msgs in
meta def m := S.s
end


@@ -27,6 +27,23 @@ Note: The following definitions were not unfolded because their definition is no
#guard_msgs in
example : f = 1 := rfl
/--
error: Tactic `apply` failed: could not unify the conclusion of `@rfl`
?a = ?a
with the goal
f = 1
Note: The full type of `@rfl` is
∀ {α : Sort ?u.115} {a : α}, a = a
Note: The following definitions were not unfolded because their definition is not exposed:
f ↦ 1
⊢ f = 1
-/
#guard_msgs in
example : f = 1 := by apply rfl
/-! Theorems should be exported without their bodies -/
/--