Compare commits

..

24 Commits

Author SHA1 Message Date
Kim Morrison
22ca91c0b9 chore: remove unneeded Batteries reference in repeat doc-string 2024-06-04 10:46:42 +10:00
Joachim Breitner
f65e3ae985 feat: simp to still work even if one simp arg does not work (#4177)
This fixes a usability paper cut that has annoyed me. When editing a
larger simp proof, I usually want to see the goal state after the simp,
and that is what I see while the `simp` command is complete. But when
I start typing, and necessarily type incomplete lemma names, that
error makes `simp` do nothing again, so I see the original goal state.
In fact, if a prefix of the simp theorem name I am typing is a valid
identifier, the goal state jumps around even more.

With this PR, using `logException`, I still get the red squiggly lines
for the unknown identifier, but `simp` just ignores that argument and
still shows me the final goal. Much nicer.

I also demoted the message for `[-foo]` when `foo` isn’t a `simp` lemma to a
warning and gave it the correct `ref`.

See it in action here (in the middle, when you suddenly see the
terminal, I am switching Lean versions):


https://github.com/leanprover/lean4/assets/148037/8cb3c563-1354-4c2d-bcee-26dfa1005ae0
2024-06-03 14:21:31 +00:00
Siddharth
81f5b07215 feat: getLsb_sshiftRight (#4179)
In the course of the development, I grabbed facts about right shifting
over integers [from
`mathlib4`](https://github.com/leanprover-community/mathlib4/blob/master/Mathlib/Data/Int/Bitwise.lean).

The core proof strategy is to perform a case analysis of the msb:
- If `msb = false`, then `sshiftRight = ushiftRight`.
- If `msb = true`, then `x >>>s i = ~~~((~~~x) >>>u i)`. The double
negation preserves the shifted bits while introducing the high `1` bits
that one expects of the arithmetic shift.
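In Lean terms, the case split amounts to two statement-level sketches, using the lemma names `sshiftRight_eq_of_msb_false` and `sshiftRight_eq_of_msb_true` that appear in this PR's diff:

```lean
example (x : BitVec w) (s : Nat) (h : x.msb = false) :
    x.sshiftRight s = x >>> s :=
  sshiftRight_eq_of_msb_false h

example (x : BitVec w) (s : Nat) (h : x.msb = true) :
    x.sshiftRight s = ~~~((~~~x) >>> s) :=
  sshiftRight_eq_of_msb_true h
```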

---------

Co-authored-by: Kim Morrison <scott@tqft.net>
2024-06-01 16:43:11 +00:00
Siddharth
9a597aeb2e feat: getLsb_{rotateLeft, rotateRight} (#4257)
These will be used by LeanSAT for bitblasting rotations by constant
distances.

We first reduce the case when the rotation amount is larger than the
width to the case where the rotation amount is less than the width
(`x.rotateLeft/Right r = x.rotateLeft/Right (r%w)`).

Then, we case analyze on the low bits versus the high bits of the
rotation, where we prove equality by extensionality.
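The reduction step can be sketched as a Lean statement, using the lemma name `rotateLeft_mod_eq_rotateLeft` from this PR's diff (`rotateRight` has the analogous reduction):

```lean
-- Rotation distances wrap modulo the bitwidth.
example (x : BitVec w) (r : Nat) :
    x.rotateLeft (r % w) = x.rotateLeft r :=
  rotateLeft_mod_eq_rotateLeft
```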

---------

Co-authored-by: Alex Keizer <alex@keizer.dev>
Co-authored-by: Tobias Grosser <github@grosser.es>
2024-06-01 16:42:10 +00:00
Tobias Grosser
ff116dae5f feat: add BitVec _assoc lemmas (#4299) 2024-06-01 16:24:18 +00:00
John Tristan
0dff5701af doc: updated build instructions for mac os X (#4317)
Edit to the platform-specific setup for macOS: the Homebrew installation of
llvm no longer accepts the options `--with-clang` and `--with-asan`.
2024-06-01 16:23:17 +00:00
Kim Morrison
299cb9a806 chore: remove @[simp] from bind_eq_some (#4314) 2024-06-01 16:04:02 +00:00
Leonardo de Moura
b53a74d6fd fix: miscompilation in constant folding (#4311)
closes #4306
2024-05-31 04:24:45 +00:00
Lean stage0 autoupdater
007b423006 chore: update stage0 2024-05-30 09:57:02 +00:00
Sebastian Ullrich
6c63c9c716 feat: quotations for parser aliases (#4307)
Another papercut towards incremental `have`
2024-05-30 09:22:22 +00:00
Kim Morrison
8bbb015a97 chore: add namespace in Init/Data/Fin/Fold (#4304) 2024-05-29 16:40:55 +00:00
Alex Keizer
9133470243 feat: upstream BitVec.toFin_ofNat and BitVec.toFin_neg (#4298)
These lemmas are morally equivalent to Mathlib lemmas which are proposed
to be deleted from Mathlib in
[#13286](https://github.com/leanprover-community/mathlib4/pull/13286).

It is only morally equivalent, because the Mathlib lemmas are stated in
terms of Mathlib-defined things: `toFin_natCast` uses a coercion from
`Nat` to `Fin (2^w)` which relies on `NeZero` machinery available only
in Mathlib. Thus, I've rephrased the RHS in terms of the def-eq
`Fin.ofNat'` with an explicit proof that `2^w` is non-zero.

Similarly, the RHS of `toFin_neg` was phrased in terms of negation on
`Fin`s, which is only defined in Mathlib, so I've unfolded the
definition.
2024-05-29 08:25:51 +00:00
Markus Himmel
d07b316804 fix: incorrect docstring for named pattern syntax (#4294)
---
2024-05-29 08:23:15 +00:00
Wojciech Nawrocki
ec59e7a2c0 feat: widget messages (#4254)
Allows embedding user widgets in structured messages. Companion PR is
leanprover/vscode-lean4#449.

Some technical choices:
- The `MessageData.ofWidget` constructor might not be strictly necessary
as we already have `MessageData.ofFormatWithInfos`, and there is
`Info.ofUserWidget`. However, `.ofUserWidget` also requires a `Syntax`
object (as it is normally produced when widgets are saved at a piece of
syntax during elaboration) which we do not have in this case. More
generally, it continues to be a bit cursed that `Elab.Info` nodes are
used both for elaboration and delaboration (pretty-printing), so
entrenching that approach seems wrong. The better approach would be to
have a separate notion of pretty-printer annotation; but such a refactor
would not be clearly beneficial right now.
- To support non-JS-based environments such as
https://github.com/Julian/lean.nvim, `.ofWidget` requires also providing
another message which approximates the widget in a textual form.
However, in practice these environments might still want to support a
few specific user widgets such as "Try this".

---

Closes #2064.
2024-05-29 06:37:42 +00:00
Sebastian Ullrich
cc33c39cb0 chore: bootstrap fixes 2024-05-28 23:04:19 +02:00
Sebastian Ullrich
8c7364ee64 chore: update stage0 2024-05-28 23:04:19 +02:00
Sebastian Ullrich
26b6718422 chore: haveId node kind 2024-05-28 23:04:19 +02:00
Sebastian Ullrich
66777670e8 fix: stray tokens in tactic block should not inhibit incrementality (#4268) 2024-05-27 07:36:13 +00:00
Sebastian Ullrich
f05a82799a chore: CI: restore macOS aarch64 install sufix 2024-05-26 14:29:56 +02:00
Leonardo de Moura
8eee5ff27f fix: do not include internal match equational theorems at simp trace (#4274)
closes #4251
2024-05-25 17:16:19 +00:00
Mac Malone
fe17b82096 refactor: lake: ensure job actions can be lifted to FetchM (#4273)
In `v4.8.0-rc2`, due to additional build refactor changes, `JobM` no
longer cleanly lifts into `FetchM`. Generally, a `JobM` action should not
be run in `FetchM` directly but spawned asynchronously as a job (e.g., via
`Job.async`). However, there may be some edge cases where this is
necessary, and it is a backwards-compatibility break, so this change adds
back the lift. This change also includes an `example` definition to
ensure the lift works, in order to prevent similar accidental breakages
in the future.

This breakage was first reported by Mario on
[Zulip](https://leanprover.zulipchat.com/#narrow/stream/113488-general/topic/v4.2E8.2E0-rc2.20discussion/near/440407037).
2024-05-25 02:20:04 +00:00
Mac Malone
def00d3920 refactor: lake: manifest semver & code cleanup (#4083)
Switches the manifest format to use `major.minor.patch` semantic
versions. Major version increments indicate breaking changes (e.g., new
required fields and semantic changes to existing fields). Minor version
increments (after `0.x`) indicate backwards-compatible extensions (e.g.,
adding optional fields, removing fields). This change is backwards
compatible: Lake will still successfully read old manifests with numeric
versions, treating the numeric version `N` as semantic version
`0.N.0`. Lake will also accept manifest versions with `-` suffixes
(e.g., `x.y.z-foo`) and ignore the suffix.
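The normalization rule described above can be sketched as a small Python helper. This is a hypothetical illustration of the stated rules, not Lake's actual implementation:

```python
def normalize_manifest_version(v):
    """Normalize a manifest version per the rules described above.

    A bare numeric version N is treated as semantic version 0.N.0,
    and a trailing '-' suffix (e.g. 'x.y.z-foo') is ignored.
    Hypothetical helper, not Lake's code.
    """
    if isinstance(v, int):
        return (0, v, 0)
    core = v.split('-', 1)[0]  # drop any '-' suffix
    major, minor, patch = (int(p) for p in core.split('.'))
    return (major, minor, patch)
```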

This change also includes the general cleanup/refactoring of the
manifest code and data structures that was part of #3174.
2024-05-24 21:32:41 +00:00
Mac Malone
cd16975946 feat: lake pack / lake unpack (#4270)
Adds two new Lake commands, `lake pack` and `lake unpack`, which
respectively pack and unpack Lake build artifacts to and from an
archive. If a path argument is given, the archive at that path is
used; otherwise, the package's `buildArchive` configuration provides
the default.

The pack command will be used by Reservoir to prepare crate-style build
archives for packages. In the future, the command will also be
extensible through configuration file hooks.
2024-05-24 21:32:07 +00:00
Mac Malone
0448e3f4ea feat: lake test improvements & lake lint (#4261)
Extends the functionality of `lake test` and adds a parallel command,
`lake lint`.

* Rename `@[test_runner]` / `testRunner` to `@[test_driver]` /
`testDriver`. The old names are kept as deprecated aliases.
* Extend the help page for `lake test` and add one for `lake check-test`.
* Add `lake lint`, its parallel tag `@[lint_driver]` setting
`lintDriver`, and the checker `lake check-lint`.
* Add support for specifying test / lint drivers from dependencies. 
* Add `testDriverArgs` / `lintDriverArgs` for fixing additional
arguments to the invocation of a driver script or executable.
* Add support for library test drivers (but not library lint drivers). 
* `lake check-test` / `lake check-lint` only load the package (without
dependencies), not the whole workspace.

Closes #4116. Closes #4121. Closes #4142.
2024-05-24 21:31:41 +00:00
405 changed files with 1651 additions and 864 deletions

View File

@@ -192,6 +192,7 @@ jobs:
{
"name": "macOS aarch64",
"os": "macos-14",
"CMAKE_OPTIONS": "-DLEAN_INSTALL_SUFFIX=-darwin_aarch64",
"release": true,
"check-level": 1,
"shell": "bash -euxo pipefail {0}",

View File

@@ -1,4 +1,4 @@
# Install Packages on OS X 10.9
# Install Packages on OS X 14.5
We assume that you are using [homebrew][homebrew] as a package manager.
@@ -22,7 +22,7 @@ brew install gcc
```
To install clang++-3.5 via homebrew, please execute:
```bash
brew install llvm --with-clang --with-asan
brew install llvm
```
To use compilers other than the default one (Apple's clang++), you
need to use `-DCMAKE_CXX_COMPILER` option to specify the compiler

View File

@@ -1,180 +0,0 @@
#!/usr/bin/env python3
import subprocess
import sys
import json
from datetime import datetime, timedelta
from urllib.parse import urlencode
import argparse
import calendar
import time
import statistics
# Reminder: Ensure you have `gh` CLI installed and authorized before running this script.
# Follow instructions from https://cli.github.com/ to set up `gh` and ensure it is authorized.
LABELS = ["bug", "feature", "RFC", "new-user-papercuts", "Lake"]
def get_items(query):
items = []
page = 1
base_url = 'https://api.github.com/search/issues'
retries = 0
max_retries = 5
while True:
params = {'q': query, 'per_page': 100, 'page': page}
url = f"{base_url}?{urlencode(params)}"
# print(f"Fetching page {page} from URL: {url}")
try:
result = subprocess.run(['gh', 'api', url], capture_output=True, text=True)
data = json.loads(result.stdout)
if 'items' in data:
items.extend(data['items'])
elif 'message' in data and 'rate limit' in data['message'].lower():
if retries < max_retries:
wait_time = (2 ** retries) * 60 # Exponential backoff
time.sleep(wait_time)
retries += 1
continue
else:
print("Max retries exceeded. Exiting.")
break
else:
print(f"Error fetching data: {data}")
break
if len(data['items']) < 100:
break
page += 1
except Exception as e:
print(f"Error fetching data: {e}")
print(result.stdout) # Print the JSON output for debugging
break
return items
def get_fro_team_members():
try:
result = subprocess.run(['gh', 'api', '-H', 'Accept: application/vnd.github.v3+json', '/orgs/leanprover/teams/fro/members'], capture_output=True, text=True)
members = json.loads(result.stdout)
return [member['login'] for member in members]
except Exception as e:
print(f"Error fetching team members: {e}")
return []
def calculate_average_time_to_close(closed_items):
times_to_close = [(datetime.strptime(item['closed_at'], '%Y-%m-%dT%H:%M:%SZ') - datetime.strptime(item['created_at'], '%Y-%m-%dT%H:%M:%SZ')).days for item in closed_items]
average_time_to_close = sum(times_to_close) / len(times_to_close) if times_to_close else 0
return average_time_to_close
def parse_dates(date_args):
if len(date_args) == 2:
start_date = date_args[0]
end_date = date_args[1]
elif len(date_args) == 1:
if len(date_args[0]) == 7: # YYYY-MM format
year, month = map(int, date_args[0].split('-'))
start_date = f"{year}-{month:02d}-01"
end_date = f"{year}-{month:02d}-{calendar.monthrange(year, month)[1]}"
elif len(date_args[0]) == 4: # YYYY format
year = int(date_args[0])
start_date = f"{year}-07-01"
end_date = f"{year+1}-06-30"
elif len(date_args[0]) == 6 and date_args[0][4] == 'Q': # YYYYQn format
year = int(date_args[0][:4])
quarter = int(date_args[0][5])
if quarter == 1:
start_date = f"{year}-01-01"
end_date = f"{year}-03-31"
elif quarter == 2:
start_date = f"{year}-04-01"
end_date = f"{year}-06-30"
elif quarter == 3:
start_date = f"{year}-07-01"
end_date = f"{year}-09-30"
elif quarter == 4:
start_date = f"{year}-10-01"
end_date = f"{year}-12-31"
else:
raise ValueError("Invalid quarter format")
else:
raise ValueError("Invalid date format")
else:
raise ValueError("Invalid number of date arguments")
return start_date, end_date
def split_date_range(start_date, end_date):
start = datetime.strptime(start_date, '%Y-%m-%d')
end = datetime.strptime(end_date, '%Y-%m-%d')
date_ranges = []
# Splitting into month-long windows to work around the GitHub search 1000 result limit.
while start <= end:
month_end = start + timedelta(days=calendar.monthrange(start.year, start.month)[1] - start.day)
month_end = min(month_end, end)
date_ranges.append((start.strftime('%Y-%m-%d'), month_end.strftime('%Y-%m-%d')))
start = month_end + timedelta(days=1)
return date_ranges
def main():
parser = argparse.ArgumentParser(description="Fetch and count GitHub issues assigned to fro team members between two dates.")
parser.add_argument("dates", type=str, nargs='+', help="Start and end dates in YYYY-MM-DD, YYYY-MM, YYYY-Qn, or YYYY format")
args = parser.parse_args()
start_date, end_date = parse_dates(args.dates)
repo = "leanprover/lean4"
date_ranges = split_date_range(start_date, end_date)
fro_members = get_fro_team_members()
fro_members.append("unassigned") # Add "unassigned" for issues with no assignee
label_headers = ", ".join([f"MTTR ({label})" for label in LABELS])
print(f"# username, open issues, new issues, closed issues, MTTR (all), {label_headers}")
for member in fro_members:
open_issues_count = 0
new_issues_count = 0
closed_issues_count = 0
total_time_to_close_issues = 0
closed_issues = []
label_times = {label: [] for label in LABELS}
for start, end in date_ranges:
if member == "unassigned":
open_issues_query1 = f'repo:{repo} is:issue no:assignee state:open created:<={end}'
open_issues_query2 = f'repo:{repo} is:issue no:assignee state:closed created:<={end} closed:>{end}'
new_issues_query = f'repo:{repo} is:issue no:assignee created:{start}..{end}'
closed_issues_query = f'repo:{repo} is:issue no:assignee closed:{start}..{end}'
else:
open_issues_query1 = f'repo:{repo} is:issue assignee:{member} state:open created:<={end}'
open_issues_query2 = f'repo:{repo} is:issue assignee:{member} state:closed created:<={end} closed:>{end}'
new_issues_query = f'repo:{repo} is:issue assignee:{member} created:{start}..{end}'
closed_issues_query = f'repo:{repo} is:issue assignee:{member} closed:{start}..{end}'
open_issues1 = get_items(open_issues_query1)
open_issues2 = get_items(open_issues_query2)
new_issues = get_items(new_issues_query)
closed_issues_period = get_items(closed_issues_query)
open_issues_count = len(open_issues1) + len(open_issues2)
new_issues_count += len(new_issues)
closed_issues_count += len(closed_issues_period)
closed_issues.extend(closed_issues_period)
for issue in closed_issues_period:
time_to_close = (datetime.strptime(issue['closed_at'], '%Y-%m-%dT%H:%M:%SZ') - datetime.strptime(issue['created_at'], '%Y-%m-%dT%H:%M:%SZ')).days
total_time_to_close_issues += time_to_close
for label in LABELS:
if label in [l['name'] for l in issue['labels']]:
label_times[label].append(time_to_close)
average_time_to_close_issues = total_time_to_close_issues / closed_issues_count if closed_issues_count else 0
label_averages = {label: (sum(times) / len(times)) if times else 0 for label, times in label_times.items()}
label_averages_str = ", ".join([f"{label_averages[label]:.2f}" for label in LABELS])
print(f"{member},{open_issues_count},{new_issues_count},{closed_issues_count},{average_time_to_close_issues:.2f},{label_averages_str}")
if __name__ == "__main__":
main()
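The rate-limit handling in the deleted script above uses exponential backoff: on each rate-limit hit it waits `(2 ** retries) * 60` seconds, for up to five retries. That schedule can be isolated as a standalone sketch:

```python
def backoff_schedule(max_retries=5, base_seconds=60):
    """Wait times used by get_items above: (2 ** r) * 60 seconds,
    doubling on each successive rate-limit hit."""
    return [(2 ** r) * base_seconds for r in range(max_retries)]
```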

View File

@@ -1,163 +0,0 @@
#!/usr/bin/env python3
import subprocess
import sys
import json
from datetime import datetime, timedelta
from urllib.parse import urlencode
import argparse
import calendar
import time
# Reminder: Ensure you have `gh` CLI installed and authorized before running this script.
# Follow instructions from https://cli.github.com/ to set up `gh` and ensure it is authorized.
def get_items(query):
items = []
page = 1
base_url = 'https://api.github.com/search/issues'
retries = 0
max_retries = 5
while True:
params = {'q': query, 'per_page': 100, 'page': page}
url = f"{base_url}?{urlencode(params)}"
# print(f"Fetching page {page} from URL: {url}")
try:
result = subprocess.run(['gh', 'api', url], capture_output=True, text=True)
data = json.loads(result.stdout)
if 'items' in data:
items.extend(data['items'])
elif 'message' in data and 'rate limit' in data['message'].lower():
if retries < max_retries:
wait_time = (2 ** retries) * 60 # Exponential backoff
print(f"Rate limit exceeded. Retrying in {wait_time} seconds...")
time.sleep(wait_time)
retries += 1
continue
else:
print("Max retries exceeded. Exiting.")
break
else:
print(f"Error fetching data: {data}")
break
if len(data['items']) < 100:
break
page += 1
except Exception as e:
print(f"Error fetching data: {e}")
print(result.stdout) # Print the JSON output for debugging
break
return items
def calculate_average_time_to_close(closed_items):
times_to_close = [(datetime.strptime(item['closed_at'], '%Y-%m-%dT%H:%M:%SZ') - datetime.strptime(item['created_at'], '%Y-%m-%dT%H:%M:%SZ')).days for item in closed_items]
average_time_to_close = sum(times_to_close) / len(times_to_close) if times_to_close else 0
return average_time_to_close
def parse_dates(date_args):
if len(date_args) == 2:
start_date = date_args[0]
end_date = date_args[1]
elif len(date_args) == 1:
if len(date_args[0]) == 7: # YYYY-MM format
year, month = map(int, date_args[0].split('-'))
start_date = f"{year}-{month:02d}-01"
end_date = f"{year}-{month:02d}-{calendar.monthrange(year, month)[1]}"
elif len(date_args[0]) == 4: # YYYY format
year = int(date_args[0])
start_date = f"{year}-07-01"
end_date = f"{year+1}-06-30"
elif len(date_args[0]) == 6 and date_args[0][4] == 'Q': # YYYYQn format
year = int(date_args[0][:4])
quarter = int(date_args[0][5])
if quarter == 1:
start_date = f"{year}-01-01"
end_date = f"{year}-03-31"
elif quarter == 2:
start_date = f"{year}-04-01"
end_date = f"{year}-06-30"
elif quarter == 3:
start_date = f"{year}-07-01"
end_date = f"{year}-09-30"
elif quarter == 4:
start_date = f"{year}-10-01"
end_date = f"{year}-12-31"
else:
raise ValueError("Invalid quarter format")
else:
raise ValueError("Invalid date format")
else:
raise ValueError("Invalid number of date arguments")
return start_date, end_date
def split_date_range(start_date, end_date):
start = datetime.strptime(start_date, '%Y-%m-%d')
end = datetime.strptime(end_date, '%Y-%m-%d')
date_ranges = []
# Splitting into month-long windows to work around the GitHub search 1000 result limit.
while start <= end:
month_end = start + timedelta(days=calendar.monthrange(start.year, start.month)[1] - start.day)
month_end = min(month_end, end)
date_ranges.append((start.strftime('%Y-%m-%d'), month_end.strftime('%Y-%m-%d')))
start = month_end + timedelta(days=1)
return date_ranges
def main():
parser = argparse.ArgumentParser(description="Fetch and count GitHub issues and pull requests between two dates.")
parser.add_argument("dates", type=str, nargs='+', help="Start and end dates in YYYY-MM-DD, YYYY-MM, YYYY-Qn, or YYYY format")
args = parser.parse_args()
start_date, end_date = parse_dates(args.dates)
repo = "leanprover/lean4"
date_ranges = split_date_range(start_date, end_date)
open_issues_count = 0
opened_issues_count = 0
closed_issues_count = 0
total_time_to_close_issues = 0
open_prs_count = 0
closed_but_not_merged_prs_count = 0
merged_prs_count = 0
for start, end in date_ranges:
open_issues_query1 = f'repo:{repo} is:issue state:open created:<={end}'
open_issues_query2 = f'repo:{repo} is:issue state:closed created:<={end} closed:>{end}'
opened_issues_query = f'repo:{repo} is:issue created:{start}..{end}'
closed_issues_query = f'repo:{repo} is:issue closed:{start}..{end}'
open_prs_query1 = f'repo:{repo} is:pr state:open created:<={end}'
open_prs_query2 = f'repo:{repo} is:pr state:closed created:<={end} closed:>{end}'
closed_but_not_merged_prs_query = f'repo:{repo} is:pr state:closed is:unmerged closed:{start}..{end}'
merged_prs_query = f'repo:{repo} is:pr is:merged closed:{start}..{end}'
open_issues1 = get_items(open_issues_query1)
open_issues2 = get_items(open_issues_query2)
opened_issues = get_items(opened_issues_query)
closed_issues = get_items(closed_issues_query)
open_prs1 = get_items(open_prs_query1)
open_prs2 = get_items(open_prs_query2)
closed_but_not_merged_prs = get_items(closed_but_not_merged_prs_query)
merged_prs = get_items(merged_prs_query)
open_issues_count = len(open_issues1) + len(open_issues2)
opened_issues_count += len(opened_issues)
closed_issues_count += len(closed_issues)
total_time_to_close_issues += sum((datetime.strptime(item['closed_at'], '%Y-%m-%dT%H:%M:%SZ') - datetime.strptime(item['created_at'], '%Y-%m-%dT%H:%M:%SZ')).days for item in closed_issues)
open_prs_count = len(open_prs1) + len(open_prs2)
closed_but_not_merged_prs_count += len(closed_but_not_merged_prs)
merged_prs_count += len(merged_prs)
average_time_to_close_issues = total_time_to_close_issues / closed_issues_count if closed_issues_count else 0
print("# open issues, opened issues, closed issues, average age of closed issues, open PRs, closed PRs, merged PRs")
print(f"{open_issues_count},{opened_issues_count},{closed_issues_count},{average_time_to_close_issues:.2f},{open_prs_count},{closed_but_not_merged_prs_count},{merged_prs_count}")
if __name__ == "__main__":
main()
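The month-long windowing used by `split_date_range` above (to stay under GitHub's 1000-result search cap) behaves like this self-contained sketch:

```python
import calendar
from datetime import datetime, timedelta

def split_date_range(start_date, end_date):
    # Split [start_date, end_date] into month-long windows, as in the
    # script above, to work around GitHub's 1000-result search limit.
    start = datetime.strptime(start_date, '%Y-%m-%d')
    end = datetime.strptime(end_date, '%Y-%m-%d')
    ranges = []
    while start <= end:
        month_end = start + timedelta(
            days=calendar.monthrange(start.year, start.month)[1] - start.day)
        month_end = min(month_end, end)
        ranges.append((start.strftime('%Y-%m-%d'),
                       month_end.strftime('%Y-%m-%d')))
        start = month_end + timedelta(days=1)
    return ranges
```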

View File

@@ -10,6 +10,7 @@ import Init.Data.BitVec.Basic
import Init.Data.Fin.Lemmas
import Init.Data.Nat.Lemmas
import Init.Data.Nat.Mod
import Init.Data.Int.Bitwise.Lemmas
namespace BitVec
@@ -141,6 +142,8 @@ theorem ofBool_eq_iff_eq : ∀(b b' : Bool), BitVec.ofBool b = BitVec.ofBool b'
@[simp, bv_toNat] theorem toNat_ofNat (x w : Nat) : (x#w).toNat = x % 2^w := by
simp [BitVec.toNat, BitVec.ofNat, Fin.ofNat']
@[simp] theorem toFin_ofNat (x : Nat) : toFin x#w = Fin.ofNat' x (Nat.two_pow_pos w) := rfl
-- Remark: we don't use `[simp]` here because a `simproc` subsumes it for literals.
-- If `x` and `n` are not literals, applying this theorem eagerly may not be a good idea.
theorem getLsb_ofNat (n : Nat) (x : Nat) (i : Nat) :
@@ -279,6 +282,9 @@ theorem toInt_ofNat {n : Nat} (x : Nat) :
have p : 0 ≤ i % (2^n : Nat) := by omega
simp [toInt_eq_toNat_bmod, Int.toNat_of_nonneg p]
@[simp] theorem ofInt_natCast (w n : Nat) :
BitVec.ofInt w (n : Int) = BitVec.ofNat w n := rfl
/-! ### zeroExtend and truncate -/
@[simp, bv_toNat] theorem toNat_zeroExtend' {m n : Nat} (p : m ≤ n) (x : BitVec m) :
@@ -461,6 +467,11 @@ protected theorem extractLsb_ofNat (x n : Nat) (hi lo : Nat) :
ext
simp
theorem or_assoc (x y z : BitVec w) :
x ||| y ||| z = x ||| (y ||| z) := by
ext i
simp [Bool.or_assoc]
/-! ### and -/
@[simp] theorem toNat_and (x y : BitVec v) :
@@ -487,6 +498,11 @@ protected theorem extractLsb_ofNat (x n : Nat) (hi lo : Nat) :
ext
simp
theorem and_assoc (x y z : BitVec w) :
x &&& y &&& z = x &&& (y &&& z) := by
ext i
simp [Bool.and_assoc]
/-! ### xor -/
@[simp] theorem toNat_xor (x y : BitVec v) :
@@ -507,6 +523,11 @@ protected theorem extractLsb_ofNat (x n : Nat) (hi lo : Nat) :
ext
simp
theorem xor_assoc (x y z : BitVec w) :
x ^^^ y ^^^ z = x ^^^ (y ^^^ z) := by
ext i
simp [Bool.xor_assoc]
/-! ### not -/
theorem not_def {x : BitVec v} : ~~~x = allOnes v ^^^ x := rfl
@@ -641,6 +662,70 @@ theorem shiftLeft_shiftLeft {w : Nat} (x : BitVec w) (n m : Nat) :
getLsb (x >>> i) j = getLsb x (i+j) := by
unfold getLsb ; simp
/-! ### sshiftRight -/
theorem sshiftRight_eq {x : BitVec n} {i : Nat} :
x.sshiftRight i = BitVec.ofInt n (x.toInt >>> i) := by
apply BitVec.eq_of_toInt_eq
simp [BitVec.sshiftRight]
/-- If the msb is `false`, the arithmetic shift right equals the logical shift right. -/
theorem sshiftRight_eq_of_msb_false {x : BitVec w} {s : Nat} (h : x.msb = false) :
(x.sshiftRight s) = x >>> s := by
apply BitVec.eq_of_toNat_eq
rw [BitVec.sshiftRight_eq, BitVec.toInt_eq_toNat_cond]
have hxbound : 2 * x.toNat < 2 ^ w := (BitVec.msb_eq_false_iff_two_mul_lt x).mp h
simp only [hxbound, reduceIte, Int.natCast_shiftRight, Int.ofNat_eq_coe, ofInt_natCast,
toNat_ofNat, toNat_ushiftRight]
replace hxbound : x.toNat >>> s < 2 ^ w := by
rw [Nat.shiftRight_eq_div_pow]
exact Nat.lt_of_le_of_lt (Nat.div_le_self ..) x.isLt
apply Nat.mod_eq_of_lt hxbound
/--
If the msb is `true`, the arithmetic shift right equals negating,
then logical shifting right, then negating again.
The double negation preserves the lower bits that have been shifted,
and the outer negation ensures that the high bits are '1'. -/
theorem sshiftRight_eq_of_msb_true {x : BitVec w} {s : Nat} (h : x.msb = true) :
(x.sshiftRight s) = ~~~((~~~x) >>> s) := by
apply BitVec.eq_of_toNat_eq
rcases w with rfl | w
· simp
· rw [BitVec.sshiftRight_eq, BitVec.toInt_eq_toNat_cond]
have hxbound : (2 * x.toNat ≥ 2 ^ (w + 1)) := (BitVec.msb_eq_true_iff_two_mul_ge x).mp h
replace hxbound : ¬ (2 * x.toNat < 2 ^ (w + 1)) := by omega
simp only [hxbound, reduceIte, toNat_ofInt, toNat_not, toNat_ushiftRight]
rw [ Int.subNatNat_eq_coe, Int.subNatNat_of_lt (by omega),
Nat.pred_eq_sub_one, Int.negSucc_shiftRight,
Int.emod_negSucc, Int.natAbs_ofNat, Nat.succ_eq_add_one,
Int.subNatNat_of_le (by omega), Int.toNat_ofNat, Nat.mod_eq_of_lt,
Nat.sub_right_comm]
omega
· rw [Nat.shiftRight_eq_div_pow]
apply Nat.lt_of_le_of_lt (Nat.div_le_self _ _) (by omega)
theorem getLsb_sshiftRight (x : BitVec w) (s i : Nat) :
getLsb (x.sshiftRight s) i =
(!decide (w ≤ i) && if s + i < w then x.getLsb (s + i) else x.msb) := by
rcases hmsb : x.msb with rfl | rfl
· simp only [sshiftRight_eq_of_msb_false hmsb, getLsb_ushiftRight, Bool.if_false_right]
by_cases hi : i ≥ w
· simp only [hi, decide_True, Bool.not_true, Bool.false_and]
apply getLsb_ge
omega
· simp only [hi, decide_False, Bool.not_false, Bool.true_and, Bool.iff_and_self,
decide_eq_true_eq]
intros hlsb
apply BitVec.lt_of_getLsb _ _ hlsb
· by_cases hi : i ≥ w
· simp [hi]
· simp only [sshiftRight_eq_of_msb_true hmsb, getLsb_not, getLsb_ushiftRight, Bool.not_and,
Bool.not_not, hi, decide_False, Bool.not_false, Bool.if_true_right, Bool.true_and,
Bool.and_iff_right_iff_imp, Bool.or_eq_true, Bool.not_eq_true', decide_eq_false_iff_not,
Nat.not_lt, decide_eq_true_eq]
omega
/-! ### append -/
theorem append_def (x : BitVec v) (y : BitVec w) :
@@ -924,6 +1009,10 @@ theorem ofNat_sub_ofNat {n} (x y : Nat) : x#n - y#n = .ofNat n (x + (2^n - y % 2
@[simp, bv_toNat] theorem toNat_neg (x : BitVec n) : (- x).toNat = (2^n - x.toNat) % 2^n := by
simp [Neg.neg, BitVec.neg]
@[simp] theorem toFin_neg (x : BitVec n) :
(-x).toFin = Fin.ofNat' (2^n - x.toNat) (Nat.two_pow_pos _) :=
rfl
theorem sub_toAdd {n} (x y : BitVec n) : x - y = x + - y := by
apply eq_of_toNat_eq
simp
@@ -1069,8 +1158,126 @@ theorem rotateLeft_eq_rotateLeftAux_of_lt {x : BitVec w} {r : Nat} (hr : r < w)
x.rotateLeft r = x.rotateLeftAux r := by
simp only [rotateLeft, Nat.mod_eq_of_lt hr]
/--
Accessing the bits of `x.rotateLeft r` in the range `[0, r)` is equal to
accessing the bits of `x` in the range `[w - r, w)`.
Proof by example:
Let x := <6 5 4 3 2 1 0> : BitVec 7.
x.rotateLeft 2 = (<6 5 | 4 3 2 1 0>).rotateLeft 2 = <4 3 2 1 0 | 6 5>
(x.rotateLeft 2).getLsb ⟨i, i < 2⟩
= <4 3 2 1 0 | 6 5>.getLsb ⟨i, i < 2⟩
= <6 5>[i]
= <6 5 | 4 3 2 1 0>[i + len(<4 3 2 1 0>)]
= <6 5 | 4 3 2 1 0>[i + 7 - 2]
-/
theorem getLsb_rotateLeftAux_of_le {x : BitVec w} {r : Nat} {i : Nat} (hi : i < r) :
(x.rotateLeftAux r).getLsb i = x.getLsb (w - r + i) := by
rw [rotateLeftAux, getLsb_or, getLsb_ushiftRight]
suffices (x <<< r).getLsb i = false by
simp; omega
simp only [getLsb_shiftLeft, Bool.and_eq_false_imp, Bool.and_eq_true, decide_eq_true_eq,
Bool.not_eq_true', decide_eq_false_iff_not, Nat.not_lt, and_imp]
omega
/--
Accessing the bits of `x.rotateLeft r` in the range `[r, w)` is equal to
accessing the bits of `x` in the range `[0, w - r)`.
Proof by example:
Let x := <6 5 4 3 2 1 0> : BitVec 7.
x.rotateLeft 2 = (<6 5 | 4 3 2 1 0>).rotateLeft 2 = <4 3 2 1 0 | 6 5>
(x.rotateLeft 2).getLsb ⟨i, i ≥ 2⟩
= <4 3 2 1 0 | 6 5>.getLsb ⟨i, i ≥ 2⟩
= <4 3 2 1 0>[i - 2]
= <6 5 | 4 3 2 1 0>[i - 2]
Intuitively, grab the full width (7), then move the marker `|` by `r` to the right `(-2)`.
Then, access the bit at `i` from the right `(+i)`.
-/
theorem getLsb_rotateLeftAux_of_geq {x : BitVec w} {r : Nat} {i : Nat} (hi : i ≥ r) :
(x.rotateLeftAux r).getLsb i = (decide (i < w) && x.getLsb (i - r)) := by
rw [rotateLeftAux, getLsb_or]
suffices (x >>> (w - r)).getLsb i = false by
have hiltr : decide (i < r) = false := by
simp [hi]
simp [getLsb_shiftLeft, Bool.or_false, hi, hiltr, this]
simp only [getLsb_ushiftRight]
apply getLsb_ge
omega
/-- When `r < w`, we give a formula for `(x.rotateLeft r).getLsb i`. -/
theorem getLsb_rotateLeft_of_le {x : BitVec w} {r i : Nat} (hr: r < w) :
(x.rotateLeft r).getLsb i =
cond (i < r)
(x.getLsb (w - r + i))
(decide (i < w) && x.getLsb (i - r)) := by
· rw [rotateLeft_eq_rotateLeftAux_of_lt hr]
by_cases h : i < r
· simp [h, getLsb_rotateLeftAux_of_le h]
· simp [h, getLsb_rotateLeftAux_of_geq <| Nat.ge_of_not_lt h]
@[simp]
theorem getLsb_rotateLeft {x : BitVec w} {r i : Nat} :
(x.rotateLeft r).getLsb i =
cond (i < r % w)
(x.getLsb (w - (r % w) + i))
(decide (i < w) && x.getLsb (i - (r % w))) := by
rcases w with rfl | w
· simp
· rw [ rotateLeft_mod_eq_rotateLeft, getLsb_rotateLeft_of_le (Nat.mod_lt _ (by omega))]
/-! ## Rotate Right -/
/--
Accessing the bits of `x.rotateRight r` in the range `[0, w-r)` is equal to
accessing the bits of `x` in the range `[r, w)`.
Proof by example:
Let x := <6 5 4 3 2 1 0> : BitVec 7.
x.rotateRight 2 = (<6 5 4 3 2 | 1 0>).rotateRight 2 = <1 0 | 6 5 4 3 2>
(x.rotateRight 2).getLsb ⟨i, i < 7 - 2⟩
= <1 0 | 6 5 4 3 2>.getLsb ⟨i, i < 7 - 2⟩
= <6 5 4 3 2>.getLsb i
= <6 5 4 3 2 | 1 0>[i + 2]
-/
theorem getLsb_rotateRightAux_of_le {x : BitVec w} {r : Nat} {i : Nat} (hi : i < w - r) :
(x.rotateRightAux r).getLsb i = x.getLsb (r + i) := by
rw [rotateRightAux, getLsb_or, getLsb_ushiftRight]
suffices (x <<< (w - r)).getLsb i = false by
simp only [this, Bool.or_false]
simp only [getLsb_shiftLeft, Bool.and_eq_false_imp, Bool.and_eq_true, decide_eq_true_eq,
Bool.not_eq_true', decide_eq_false_iff_not, Nat.not_lt, and_imp]
omega
/--
Accessing the bits of `x.rotateRight r` in the range `[w-r, w)` is equal to
accessing the bits of `x` in the range `[0, r)`.
Proof by example:
Let x := <6 5 4 3 2 1 0> : BitVec 7.
x.rotateRight 2 = (<6 5 4 3 2 | 1 0>).rotateRight 2 = <1 0 | 6 5 4 3 2>
(x.rotateRight 2).getLsb ⟨i, i ≥ 7 - 2⟩
= <1 0 | 6 5 4 3 2>.getLsb ⟨i, i ≥ 7 - 2⟩
= <1 0>.getLsb (i - len(<6 5 4 3 2>))
= <6 5 4 3 2 | 1 0>[i - len(<6 5 4 3 2>)]
-/
theorem getLsb_rotateRightAux_of_geq {x : BitVec w} {r : Nat} {i : Nat} (hi : i ≥ w - r) :
(x.rotateRightAux r).getLsb i = (decide (i < w) && x.getLsb (i - (w - r))) := by
rw [rotateRightAux, getLsb_or]
suffices (x >>> r).getLsb i = false by
simp only [this, getLsb_shiftLeft, Bool.false_or]
by_cases hiw : i < w
<;> simp [hiw, hi]
simp only [getLsb_ushiftRight]
apply getLsb_ge
omega
/-- `rotateRight` equals the bit fiddling definition of `rotateRightAux` when the rotation amount is
smaller than the bitwidth. -/
theorem rotateRight_eq_rotateRightAux_of_lt {x : BitVec w} {r : Nat} (hr : r < w) :
@@ -1083,4 +1290,25 @@ theorem rotateRight_mod_eq_rotateRight {x : BitVec w} {r : Nat} :
x.rotateRight (r % w) = x.rotateRight r := by
simp only [rotateRight, Nat.mod_mod]
/-- When `r < w`, we give a formula for `(x.rotateRight r).getLsb i`. -/
theorem getLsb_rotateRight_of_le {x : BitVec w} {r i : Nat} (hr: r < w) :
(x.rotateRight r).getLsb i =
cond (i < w - r)
(x.getLsb (r + i))
(decide (i < w) && x.getLsb (i - (w - r))) := by
· rw [rotateRight_eq_rotateRightAux_of_lt hr]
by_cases h : i < w - r
· simp [h, getLsb_rotateRightAux_of_le h]
· simp [h, getLsb_rotateRightAux_of_geq <| Nat.le_of_not_lt h]
@[simp]
theorem getLsb_rotateRight {x : BitVec w} {r i : Nat} :
(x.rotateRight r).getLsb i =
cond (i < w - (r % w))
(x.getLsb ((r % w) + i))
(decide (i < w) && x.getLsb (i - (w - (r % w)))) := by
rcases w with rfl | w
· simp
· rw [← rotateRight_mod_eq_rotateRight, getLsb_rotateRight_of_le (Nat.mod_lt _ (by omega))]
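As an illustrative check of the `r % w` reduction (not part of the patch; the literal values are assumptions):

```lean
-- 9 % 7 = 2, so rotating a 7-bit vector by 9 equals rotating by 2
example : (0b1100101#7).rotateRight 9 = (0b1100101#7).rotateRight 2 := rfl
```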
end BitVec

View File

@@ -6,6 +6,8 @@ Authors: François G. Dorais
prelude
import Init.Data.Nat.Linear
namespace Fin
/-- Folds over `Fin n` from the left: `foldl 3 f x = f (f (f x 0) 1) 2`. -/
@[inline] def foldl (n) (f : α → Fin n → α) (init : α) : α := loop init 0 where
/-- Inner loop for `Fin.foldl`. `Fin.foldl.loop n f x i = f (f (f x i) ...) (n-1)` -/
@@ -20,3 +22,5 @@ import Init.Data.Nat.Linear
loop : {i // i ≤ n} → α → α
| ⟨0, _⟩, x => x
| ⟨i+1, h⟩, x => loop ⟨i, Nat.le_of_lt h⟩ (f ⟨i, h⟩ x)
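The docstring's unfolding can be checked on a small instance (illustrative examples, not part of the patch):

```lean
-- foldl 3 f x = f (f (f x 0) 1) 2
example : Fin.foldl 3 (fun s i => s ++ toString i.val) "" = "012" := rfl
-- foldr folds from the right: foldr 3 f x = f 0 (f 1 (f 2 x))
example : Fin.foldr 3 (fun i s => toString i.val ++ s) "" = "012" := rfl
```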
end Fin

View File

@@ -0,0 +1,37 @@
/-
Copyright (c) 2023 Siddharth Bhat. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Siddharth Bhat, Jeremy Avigad
-/
prelude
import Init.Data.Nat.Bitwise.Lemmas
import Init.Data.Int.Bitwise
namespace Int
theorem shiftRight_eq (n : Int) (s : Nat) : n >>> s = Int.shiftRight n s := rfl
@[simp]
theorem natCast_shiftRight (n s : Nat) : (n : Int) >>> s = n >>> s := rfl
@[simp]
theorem negSucc_shiftRight (m n : Nat) :
-[m+1] >>> n = -[m >>> n +1] := rfl
theorem shiftRight_add (i : Int) (m n : Nat) :
i >>> (m + n) = i >>> m >>> n := by
simp only [shiftRight_eq, Int.shiftRight]
cases i <;> simp [Nat.shiftRight_add]
theorem shiftRight_eq_div_pow (m : Int) (n : Nat) :
m >>> n = m / ((2 ^ n) : Nat) := by
simp only [shiftRight_eq, Int.shiftRight, Nat.shiftRight_eq_div_pow]
split
· simp
· rw [negSucc_ediv _ (by norm_cast; exact Nat.pow_pos (Nat.zero_lt_two))]
rfl
@[simp]
theorem zero_shiftRight (n : Nat) : (0 : Int) >>> n = 0 := by
simp [Int.shiftRight_eq_div_pow]
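A quick illustrative check of the division characterization on a negative operand (not part of the patch; it relies on core's `/` on `Int` being Euclidean division, as the use of `negSucc_ediv` above indicates):

```lean
-- -5 = -[4+1], so (-5) >>> 1 = -[4 >>> 1 + 1] = -3,
-- matching Euclidean division: -5 / 2 = -3
example : (-5 : Int) >>> 1 = -3 := rfl
example : (-5 : Int) >>> 1 = (-5 : Int) / ((2 ^ 1 : Nat) : Int) := rfl
```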
end Int

View File

@@ -420,6 +420,9 @@ theorem negSucc_emod (m : Nat) {b : Int} (bpos : 0 < b) : -[m+1] % b = b - 1 - m
match b, eq_succ_of_zero_lt bpos with
| _, ⟨n, rfl⟩ => rfl
theorem emod_negSucc (m : Nat) (n : Int) :
(Int.negSucc m) % n = Int.subNatNat (Int.natAbs n) (Nat.succ (m % Int.natAbs n)) := rfl
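An illustrative instance of `emod_negSucc` (not part of the patch; the concrete values are assumptions):

```lean
-- -[5+1] = -6 and natAbs 4 = 4: subNatNat 4 (Nat.succ (5 % 4)) = 4 - 2 = 2
example : Int.emod (Int.negSucc 5) 4 = Int.subNatNat 4 (Nat.succ (5 % 4)) := rfl
example : (-6 : Int) % 4 = 2 := rfl
```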
theorem ofNat_mod_ofNat (m n : Nat) : (m % n : Int) = (m % n) := rfl
theorem emod_nonneg : ∀ (a : Int) {b : Int}, b ≠ 0 → 0 ≤ a % b

View File

@@ -101,7 +101,7 @@ theorem ball_ne_none {p : Option α → Prop} : (∀ x (_ : x ≠ none), p x)
@[simp] theorem bind_none (x : Option α) : x.bind (fun _ => none (α := β)) = none := by
cases x <;> rfl
@[simp] theorem bind_eq_some : x.bind f = some b ↔ ∃ a, x = some a ∧ f a = some b := by
theorem bind_eq_some : x.bind f = some b ↔ ∃ a, x = some a ∧ f a = some b := by
cases x <;> simp
@[simp] theorem bind_eq_none {o : Option α} {f : α → Option β} :
@@ -119,7 +119,7 @@ theorem bind_assoc (x : Option α) (f : α → Option β) (g : β → Option γ)
(x.bind f).bind g = x.bind fun y => (f y).bind g := by cases x <;> rfl
theorem join_eq_some : x.join = some a ↔ x = some (some a) := by
simp
simp [bind_eq_some]
theorem join_ne_none : x.join ≠ none ↔ ∃ z, x = some (some z) := by
simp only [ne_none_iff_exists', join_eq_some, iff_self]

View File

@@ -835,7 +835,8 @@ syntax (name := renameI) "rename_i" (ppSpace colGt binderIdent)+ : tactic
/--
`repeat tac` repeatedly applies `tac` to the main goal until it fails.
That is, if `tac` produces multiple subgoals, only subgoals up to the first failure will be visited.
The `Batteries` library provides `repeat'` which repeats separately in each subgoal.
See also the tactic `repeat'` which repeats separately in each subgoal.
-/
syntax "repeat " tacticSeq : tactic
macro_rules
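A small illustrative usage of the documented behavior (not part of the patch):

```lean
-- `constructor` splits each `∧` and closes each `=` goal via `Eq.refl`,
-- so `repeat` finishes the whole proof
example : 1 = 1 ∧ 2 = 2 ∧ 3 = 3 := by repeat constructor
```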

View File

@@ -193,12 +193,13 @@ def foldCharOfNat (beforeErasure : Bool) (a : Expr) : Option Expr := do
else
return mkUInt32Lit 0
def foldToNat (_ : Bool) (a : Expr) : Option Expr := do
def foldToNat (size : Nat) (_ : Bool) (a : Expr) : Option Expr := do
let n ← getNumLit a
return mkRawNatLit n
return mkRawNatLit (n % size)
def uintFoldToNatFns : List (Name × UnFoldFn) :=
numScalarTypes.foldl (fun r info => (info.toNatFn, foldToNat) :: r) []
numScalarTypes.foldl (fun r info => (info.toNatFn, foldToNat info.size) :: r) []
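The new `% size` reduction mirrors how out-of-range literals wrap at runtime; an illustrative check (not part of the patch):

```lean
-- 300 does not fit in `UInt8`; the literal wraps to 300 % 256 = 44,
-- so constant folding `UInt8.toNat` must reduce modulo the size
example : (300 : UInt8).toNat = 44 := rfl
```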
def unFoldFns : List (Name × UnFoldFn) :=
[(``Nat.succ, foldNatSucc),

View File

@@ -182,7 +182,8 @@ def mkDecEqEnum (declName : Name) : CommandElabM Unit := do
fun x y =>
if h : x.toCtorIdx = y.toCtorIdx then
-- We use `rfl` in the following proof because the first script fails for unit-like datatypes due to etaStruct.
isTrue (by first | have aux := congrArg $ofNatIdent h; rw [$auxThmIdent:ident, $auxThmIdent:ident] at aux; assumption | rfl)
-- Temporarily avoiding tactic `have` for bootstrapping
isTrue (by first | refine_lift have aux := congrArg $ofNatIdent h; ?_; rw [$auxThmIdent:ident, $auxThmIdent:ident] at aux; assumption | rfl)
else
isFalse fun h => by subst h; contradiction
)

View File

@@ -688,27 +688,15 @@ def getDoLetVars (doLet : Syntax) : TermElabM (Array Var) :=
-- leading_parser "let " >> optional "mut " >> letDecl
getLetDeclVars doLet[2]
def getHaveIdLhsVar (optIdent : Syntax) : Var :=
if optIdent.getKind == hygieneInfoKind then
HygieneInfo.mkIdent optIdent[0] `this
else
optIdent
def getDoHaveVars (doHave : Syntax) : TermElabM (Array Var) := do
-- doHave := leading_parser "have " >> Term.haveDecl
-- haveDecl := leading_parser haveIdDecl <|> letPatDecl <|> haveEqnsDecl
let arg := doHave[1][0]
if arg.getKind == ``Parser.Term.haveIdDecl then
-- haveIdDecl := leading_parser atomic (haveIdLhs >> " := ") >> termParser
-- haveIdLhs := (binderIdent <|> hygieneInfo) >> many letIdBinder >> optType
return #[getHaveIdLhsVar arg[0]]
else if arg.getKind == ``Parser.Term.letPatDecl then
getLetPatDeclVars arg
else if arg.getKind == ``Parser.Term.haveEqnsDecl then
-- haveEqnsDecl := leading_parser haveIdLhs >> matchAlts
return #[getHaveIdLhsVar arg[0]]
else
throwError "unexpected kind of have declaration"
def getDoHaveVars : Syntax → TermElabM (Array Var)
-- NOTE: `hygieneInfo` case should come first as `id` will match anything else
| `(doElem| have $info:hygieneInfo $_params* $[$_:typeSpec]? := $_val)
| `(doElem| have $info:hygieneInfo $_params* $[$_:typeSpec]? $_eqns:matchAlts) =>
return #[HygieneInfo.mkIdent info `this]
| `(doElem| have $id $_params* $[$_:typeSpec]? := $_val)
| `(doElem| have $id $_params* $[$_:typeSpec]? $_eqns:matchAlts) => return #[id]
| `(doElem| have $pat:letPatDecl) => getLetPatDeclVars pat
| _ => throwError "unexpected kind of have declaration"
def getDoLetRecVars (doLetRec : Syntax) : TermElabM (Array Var) := do
-- letRecDecls is an array of `(group (optional attributes >> letDecl))`

View File

@@ -56,13 +56,11 @@ where
return Syntax.mkAntiquotNode kind term
| some (.category cat) =>
return Syntax.mkAntiquotNode cat term (isPseudoKind := true)
| none =>
| some (.alias _) =>
let id := id.getId.eraseMacroScopes
if (← Parser.isParserAlias id) then
let kind := (← Parser.getSyntaxKindOfParserAlias? id).getD Name.anonymous
return Syntax.mkAntiquotNode kind term
else
throwError "unknown parser declaration/category/alias '{id}'"
let kind := (← Parser.getSyntaxKindOfParserAlias? id).getD Name.anonymous
return Syntax.mkAntiquotNode kind term
| _ => throwError "unknown parser declaration/category/alias '{id}'"
| stx, term => do
-- can't match against `` `(stx| ($stxs*)) `` as `*` is interpreted as the `stx` operator
if stx.raw.isOfKind ``Parser.Syntax.paren then

View File

@@ -223,9 +223,12 @@ def getQuotKind (stx : Syntax) : TermElabM SyntaxNodeKind := do
| ``Parser.Tactic.quot => addNamedQuotInfo stx `tactic
| ``Parser.Tactic.quotSeq => addNamedQuotInfo stx `tactic.seq
| .str kind "quot" => addNamedQuotInfo stx kind
| ``dynamicQuot => match (← elabParserName stx[1]) with
| ``dynamicQuot =>
let id := stx[1]
match (← elabParserName id) with
| .parser n _ => return n
| .category c => return c
| .alias _ => return (← Parser.getSyntaxKindOfParserAlias? id.getId.eraseMacroScopes).get!
| k => throwError "unexpected quotation kind {k}"
def mkSyntaxQuotation (stx : Syntax) (kind : Name) : TermElabM Syntax := do

View File

@@ -80,7 +80,7 @@ def checkLeftRec (stx : Syntax) : ToParserDescrM Bool := do
markAsTrailingParser (prec?.getD 0)
return true
def elabParserName? (stx : Syntax.Ident) : TermElabM (Option Parser.ParserName) := do
def elabParserName? (stx : Syntax.Ident) : TermElabM (Option Parser.ParserResolution) := do
match Parser.resolveParserName stx with
| [n@(.category cat)] =>
addCategoryInfo stx cat
@@ -88,10 +88,12 @@ def elabParserName? (stx : Syntax.Ident) : TermElabM (Option Parser.ParserName)
| [n@(.parser parser _)] =>
addTermInfo' stx (Lean.mkConst parser)
return n
| [n@(.alias _)] =>
return n
| _::_::_ => throwErrorAt stx "ambiguous parser {stx}"
| [] => return none
def elabParserName (stx : Syntax.Ident) : TermElabM Parser.ParserName := do
def elabParserName (stx : Syntax.Ident) : TermElabM Parser.ParserResolution := do
match (← elabParserName? stx) with
| some n => return n
| none => throwErrorAt stx "unknown parser {stx}"
@@ -194,12 +196,6 @@ where
processNullaryOrCat (stx : Syntax) := do
let ident := stx[0]
let id := ident.getId.eraseMacroScopes
-- run when parser is neither a decl nor a cat
let default := do
if (← Parser.isParserAlias id) then
ensureNoPrec stx
return (← processAlias ident #[])
throwError "unknown parser declaration/category/alias '{id}'"
match (← elabParserName? ident) with
| some (.parser c (isDescr := true)) =>
ensureNoPrec stx
@@ -209,14 +205,18 @@ where
| some (.parser c (isDescr := false)) =>
if (← Parser.getParserAliasInfo id).declName == c then
-- prefer parser alias over base declaration because it has more metadata, #2249
return (← default)
ensureNoPrec stx
return (← processAlias ident #[])
ensureNoPrec stx
-- as usual, we assume that people using `Parser` know what they are doing
let stackSz := 1
return (← `(ParserDescr.parser $(quote c)), stackSz)
| some (.category _) =>
processParserCategory stx
| none => default
| some (.alias _) =>
ensureNoPrec stx
processAlias ident #[]
| none => throwError "unknown parser declaration/category/alias '{id}'"
processSepBy (stx : Syntax) := do
let p ← ensureUnaryOutput <$> withNestedParser do process stx[1]

View File

@@ -571,7 +571,7 @@ where
match stx with
| `(tactic| replace $decl:haveDecl) =>
withMainContext do
let vars ← Elab.Term.Do.getDoHaveVars <| mkNullNode #[.missing, decl]
let vars ← Elab.Term.Do.getDoHaveVars (← `(doElem| have $decl:haveDecl))
let origLCtx ← getLCtx
evalTactic $ ← `(tactic| have $decl:haveDecl)
let mut toClear := #[]

View File

@@ -153,6 +153,7 @@ inductive ResolveSimpIdResult where
Elaborate extra simp theorems provided to `simp`. `stx` is of the form `"[" simpTheorem,* "]"`
If `eraseLocal == true`, then we consider local declarations when resolving names for erased theorems (`- id`),
this option only makes sense when `simp_all` or `*` is used.
Try to recover from errors as much as possible so that users keep seeing the current goal.
-/
def elabSimpArgs (stx : Syntax) (ctx : Simp.Context) (simprocs : Simp.SimprocsArray) (eraseLocal : Bool) (kind : SimpKind) : TacticM ElabSimpArgsResult := do
if stx.isNone then
@@ -171,56 +172,58 @@ def elabSimpArgs (stx : Syntax) (ctx : Simp.Context) (simprocs : Simp.SimprocsAr
let mut simprocs := simprocs
let mut starArg := false
for arg in stx[1].getSepArgs do
if arg.getKind == ``Lean.Parser.Tactic.simpErase then
let fvar ← if eraseLocal || starArg then Term.isLocalIdent? arg[1] else pure none
if let some fvar := fvar then
-- We use `eraseCore` because the simp theorem for the hypothesis was not added yet
thms := thms.eraseCore (.fvar fvar.fvarId!)
try -- like withLogging, but compatible with do-notation
if arg.getKind == ``Lean.Parser.Tactic.simpErase then
let fvar? ← if eraseLocal || starArg then Term.isLocalIdent? arg[1] else pure none
if let some fvar := fvar? then
-- We use `eraseCore` because the simp theorem for the hypothesis was not added yet
thms := thms.eraseCore (.fvar fvar.fvarId!)
else
let id := arg[1]
if let .ok declName ← observing (realizeGlobalConstNoOverloadWithInfo id) then
if (← Simp.isSimproc declName) then
simprocs := simprocs.erase declName
else if ctx.config.autoUnfold then
thms := thms.eraseCore (.decl declName)
else
thms ← withRef id <| thms.erase (.decl declName)
else
-- If `id` could not be resolved, we should check whether it is a builtin simproc.
-- before returning error.
let name := id.getId.eraseMacroScopes
if (← Simp.isBuiltinSimproc name) then
simprocs := simprocs.erase name
else
withRef id <| throwUnknownConstant name
else if arg.getKind == ``Lean.Parser.Tactic.simpLemma then
let post :=
if arg[0].isNone then
true
else
arg[0][0].getKind == ``Parser.Tactic.simpPost
let inv := !arg[1].isNone
let term := arg[2]
match (← resolveSimpIdTheorem? term) with
| .expr e =>
let name ← mkFreshId
thms ← addDeclToUnfoldOrTheorem thms (.stx name arg) e post inv kind
| .simproc declName =>
simprocs ← simprocs.add declName post
| .ext (some ext₁) (some ext₂) _ =>
thmsArray := thmsArray.push (← ext₁.getTheorems)
simprocs := simprocs.push (← ext₂.getSimprocs)
| .ext (some ext₁) none _ =>
thmsArray := thmsArray.push (← ext₁.getTheorems)
| .ext none (some ext₂) _ =>
simprocs := simprocs.push (← ext₂.getSimprocs)
| .none =>
let name ← mkFreshId
thms ← addSimpTheorem thms (.stx name arg) term post inv
else if arg.getKind == ``Lean.Parser.Tactic.simpStar then
starArg := true
else
let id := arg[1]
if let .ok declName ← observing (realizeGlobalConstNoOverloadWithInfo id) then
if (← Simp.isSimproc declName) then
simprocs := simprocs.erase declName
else if ctx.config.autoUnfold then
thms := thms.eraseCore (.decl declName)
else
thms ← thms.erase (.decl declName)
else
-- If `id` could not be resolved, we should check whether it is a builtin simproc.
-- before returning error.
let name := id.getId.eraseMacroScopes
if (← Simp.isBuiltinSimproc name) then
simprocs := simprocs.erase name
else
throwUnknownConstant name
else if arg.getKind == ``Lean.Parser.Tactic.simpLemma then
let post :=
if arg[0].isNone then
true
else
arg[0][0].getKind == ``Parser.Tactic.simpPost
let inv := !arg[1].isNone
let term := arg[2]
match (← resolveSimpIdTheorem? term) with
| .expr e =>
let name ← mkFreshId
thms ← addDeclToUnfoldOrTheorem thms (.stx name arg) e post inv kind
| .simproc declName =>
simprocs ← simprocs.add declName post
| .ext (some ext₁) (some ext₂) _ =>
thmsArray := thmsArray.push (← ext₁.getTheorems)
simprocs := simprocs.push (← ext₂.getSimprocs)
| .ext (some ext₁) none _ =>
thmsArray := thmsArray.push (← ext₁.getTheorems)
| .ext none (some ext₂) _ =>
simprocs := simprocs.push (← ext₂.getSimprocs)
| .none =>
let name ← mkFreshId
thms ← addSimpTheorem thms (.stx name arg) term post inv
else if arg.getKind == ``Lean.Parser.Tactic.simpStar then
starArg := true
else
throwUnsupportedSyntax
throwUnsupportedSyntax
catch ex => logException ex
return { ctx := { ctx with simpTheorems := thmsArray.set! 0 thms }, simprocs, starArg }
where
isSimproc? (e : Expr) : MetaM (Option Name) := do
@@ -336,7 +339,9 @@ def mkSimpOnly (stx : Syntax) (usedSimps : Simp.UsedSimps) : MetaM Syntax := do
for (thm, _) in usedSimps.toArray.qsort (·.2 < ·.2) do
match thm with
| .decl declName post inv => -- global definitions in the environment
if env.contains declName && (inv || !simpOnlyBuiltins.contains declName) then
if env.contains declName
&& (inv || !simpOnlyBuiltins.contains declName)
&& !Match.isMatchEqnTheorem env declName then
let decl : Term ← `($(mkIdent (← unresolveNameGlobal declName)):ident)
let arg ← match post, inv with
| true, true => `(Parser.Tactic.simpLemma| $decl:term)

View File

@@ -209,6 +209,9 @@ def logException [Monad m] [MonadLog m] [AddMessageContext m] [MonadOptions m] [
let name ← id.getName
logError m!"internal exception: {name}"
/--
If `x` throws an exception, catch it and turn it into a log message (using `logException`).
-/
def withLogging [Monad m] [MonadLog m] [MonadExcept Exception m] [AddMessageContext m] [MonadOptions m] [MonadLiftT IO m]
(x : m Unit) : m Unit := do
try x catch ex => logException ex

View File

@@ -57,23 +57,33 @@ inductive MessageData where
This constructor is inspected in various hacks. -/
| ofFormatWithInfos : FormatWithInfos → MessageData
| ofGoal : MVarId → MessageData
/-- A widget instance.
In `ofWidget wi alt`,
the nested message `alt` should approximate the contents of the widget
without itself using `ofWidget wi _`.
This is used as fallback in environments that cannot display user widgets.
`alt` may nest any structured message,
for example `ofGoal` to approximate a tactic state widget,
and, if necessary, even other widget instances
(for which approximations are computed recursively). -/
| ofWidget : Widget.WidgetInstance → MessageData → MessageData
/-- `withContext ctx d` specifies the pretty printing context `(env, mctx, lctx, opts)` for the nested expressions in `d`. -/
| withContext : MessageDataContext → MessageData → MessageData
| withNamingContext : NamingContext → MessageData → MessageData
/-- Lifted `Format.nest` -/
| nest : Nat → MessageData → MessageData
| nest : Nat → MessageData → MessageData
/-- Lifted `Format.group` -/
| group : MessageData → MessageData
| group : MessageData → MessageData
/-- Lifted `Format.compose` -/
| compose : MessageData → MessageData → MessageData
| compose : MessageData → MessageData → MessageData
/-- Tagged sections. `Name` should be viewed as a "kind", and is used by `MessageData` inspector functions.
Example: an inspector that tries to find "definitional equality failures" may look for the tag "DefEqFailure". -/
| tagged : Name → MessageData → MessageData
| trace (data : TraceData) (msg : MessageData) (children : Array MessageData)
/-- A lazy message.
The provided thunk will not be run until it is about to be displayed.
This can save computation in cases where the message may never be seen,
e.g. when nested inside a collapsed trace.
This can save computation in cases where the message may never be seen.
The `Dynamic` value is expected to be a `MessageData`,
which is a workaround for the positivity restriction.
@@ -160,6 +170,7 @@ partial def formatAux : NamingContext → Option MessageDataContext → MessageD
| _, _, ofFormatWithInfos fmt => return fmt.1
| _, none, ofGoal mvarId => return "goal " ++ format (mkMVar mvarId)
| nCtx, some ctx, ofGoal mvarId => ppGoal (mkPPContext nCtx ctx) mvarId
| nCtx, ctx, ofWidget _ d => formatAux nCtx ctx d
| nCtx, _, withContext ctx d => formatAux nCtx ctx d
| _, ctx, withNamingContext nCtx d => formatAux nCtx ctx d
| nCtx, ctx, tagged _ d => formatAux nCtx ctx d

View File

@@ -19,14 +19,18 @@ def MatchEqns.size (e : MatchEqns) : Nat :=
structure MatchEqnsExtState where
map : PHashMap Name MatchEqns := {}
eqns : PHashSet Name := {}
deriving Inhabited
/- We generate the equations and splitter on demand, and do not save them on .olean files. -/
builtin_initialize matchEqnsExt : EnvExtension MatchEqnsExtState ←
registerEnvExtension (pure {})
def registerMatchEqns (matchDeclName : Name) (matchEqns : MatchEqns) : CoreM Unit :=
modifyEnv fun env => matchEqnsExt.modifyState env fun s => { s with map := s.map.insert matchDeclName matchEqns }
def registerMatchEqns (matchDeclName : Name) (matchEqns : MatchEqns) : CoreM Unit := do
modifyEnv fun env => matchEqnsExt.modifyState env fun { map, eqns } => {
eqns := matchEqns.eqnNames.foldl (init := eqns) fun eqns eqn => eqns.insert eqn
map := map.insert matchDeclName matchEqns
}
/-
Forward definition. We want to use `getEquationsFor` in the simplifier,
@@ -34,4 +38,10 @@ def registerMatchEqns (matchDeclName : Name) (matchEqns : MatchEqns) : CoreM Uni
@[extern "lean_get_match_equations_for"]
opaque getEquationsFor (matchDeclName : Name) : MetaM MatchEqns
/--
Returns `true` if `declName` is the name of a `match` equational theorem.
-/
def isMatchEqnTheorem (env : Environment) (declName : Name) : Bool :=
matchEqnsExt.getState env |>.eqns.contains declName
end Lean.Meta.Match

View File

@@ -221,13 +221,14 @@ partial def SimpTheorems.eraseCore (d : SimpTheorems) (thmId : Origin) : SimpThe
else
d
def SimpTheorems.erase [Monad m] [MonadError m] (d : SimpTheorems) (thmId : Origin) : m SimpTheorems := do
def SimpTheorems.erase [Monad m] [MonadLog m] [AddMessageContext m] [MonadOptions m]
(d : SimpTheorems) (thmId : Origin) : m SimpTheorems := do
unless d.isLemma thmId ||
match thmId with
| .decl declName .. => d.isDeclToUnfold declName || d.toUnfoldThms.contains declName
| _ => false
do
throwError "'{thmId.key}' does not have [simp] attribute"
logWarning m!"'{thmId.key}' does not have [simp] attribute"
return d.eraseCore thmId
private partial def isPerm : Expr → Expr → MetaM Bool

View File

@@ -94,8 +94,41 @@ def declSig := leading_parser
/-- `optDeclSig` matches the signature of a declaration with optional type: a list of binders and then possibly `: type` -/
def optDeclSig := leading_parser
many (ppSpace >> (Term.binderIdent <|> Term.bracketedBinder)) >> Term.optType
/-- Right-hand side of a `:=` in a declaration, a term. -/
def declBody : Parser :=
/-
We want to make sure that bodies starting with `by` in fact create a single `by` node instead of
accidentally parsing some remnants after it as well. This can especially happen when starting to
type comments inside tactic blocks where
```
by
sleep 2000
unfold f
-starting comment here
```
is parsed as
```
(by
sleep 2000
unfold f
) - (starting comment here)
```
where the new nesting will discard incrementality data. By using `byTactic`'s precedence, the
stray `-` will be flagged as an unexpected token and will not disturb the syntax tree up to it. We
do not call `byTactic` directly to avoid differences in pretty printing or behavior or error
reporting between the two branches.
-/
lookahead (setExpected [] "by") >> termParser leadPrec <|>
termParser
-- As the pretty printer ignores `lookahead`, we need a custom parenthesizer to choose the correct
-- precedence
open PrettyPrinter in
@[combinator_parenthesizer declBody] def declBody.parenthesizer : Parenthesizer :=
Parenthesizer.categoryParser.parenthesizer `term 0
def declValSimple := leading_parser
" :=" >> ppHardLineUnlessUngrouped >> termParser >> Termination.suffix >> optional Term.whereDecls
" :=" >> ppHardLineUnlessUngrouped >> declBody >> Termination.suffix >> optional Term.whereDecls
def declValEqns := leading_parser
Term.matchAltsWhereDecls
def whereStructField := leading_parser

View File

@@ -615,15 +615,30 @@ def withOpenDeclFn (p : ParserFn) : ParserFn := fun c s =>
@[inline] def withOpenDecl : Parser Parser := withFn withOpenDeclFn
inductive ParserName
/--
Helper environment extension that gives us access to built-in aliases in pure parser functions.
-/
builtin_initialize aliasExtension : EnvExtension (NameMap ParserAliasValue) ←
registerEnvExtension parserAliasesRef.get
/-- Result of resolving a parser name. -/
inductive ParserResolution where
/-- Reference to a category. -/
| category (cat : Name)
/--
Reference to a parser declaration in the environment. A `(Trailing)ParserDescr` if `isDescr` is
true.
-/
| parser (decl : Name) (isDescr : Bool)
-- TODO(gabriel): add parser aliases (this is blocked on doing IO in parsers)
deriving Repr
/--
Reference to a parser alias. Note that as aliases are built-in, a corresponding declaration may
not be in the environment (yet).
-/
| alias (p : ParserAliasValue)
/-- Resolve the given parser name and return a list of candidates. -/
def resolveParserNameCore (env : Environment) (currNamespace : Name)
(openDecls : List OpenDecl) (ident : Ident) : List ParserName := Id.run do
(openDecls : List OpenDecl) (ident : Ident) : List ParserResolution := Id.run do
let .ident (val := val) (preresolved := pre) .. := ident | return []
let rec isParser (name : Name) : Option Bool :=
@@ -643,16 +658,24 @@ def resolveParserNameCore (env : Environment) (currNamespace : Name)
if isParserCategory env erased then
return [.category erased]
ResolveName.resolveGlobalName env currNamespace openDecls val |>.filterMap fun
| (name, []) => (isParser name).map fun isDescr => .parser name isDescr
| _ => none
let resolved := ResolveName.resolveGlobalName env currNamespace openDecls val |>.filterMap fun
| (name, []) => (isParser name).map fun isDescr => .parser name isDescr
| _ => none
unless resolved.isEmpty do
return resolved
-- Aliases are considered global declarations and so should be tried after scope-aware resolution
if let some alias := aliasExtension.getState env |>.find? erased then
return [.alias alias]
return []
/-- Resolve the given parser name and return a list of candidates. -/
def ParserContext.resolveParserName (ctx : ParserContext) (id : Ident) : List ParserName :=
def ParserContext.resolveParserName (ctx : ParserContext) (id : Ident) : List ParserResolution :=
Parser.resolveParserNameCore ctx.env ctx.currNamespace ctx.openDecls id
/-- Resolve the given parser name and return a list of candidates. -/
def resolveParserName (id : Ident) : CoreM (List ParserName) :=
def resolveParserName (id : Ident) : CoreM (List ParserResolution) :=
return resolveParserNameCore (← getEnv) (← getCurrNamespace) (← getOpenDecls) id
def parserOfStackFn (offset : Nat) : ParserFn := fun ctx s => Id.run do
@@ -661,23 +684,27 @@ def parserOfStackFn (offset : Nat) : ParserFn := fun ctx s => Id.run do
return s.mkUnexpectedError ("failed to determine parser using syntax stack, stack is too small")
let parserName@(.ident ..) := stack.get! (stack.size - offset - 1)
| s.mkUnexpectedError ("failed to determine parser using syntax stack, the specified element on the stack is not an identifier")
match ctx.resolveParserName parserName with
| [.category cat] =>
categoryParserFn cat ctx s
| [.parser parserName _] =>
let iniSz := s.stackSize
let s := adaptUncacheableContextFn (fun ctx =>
if !internal.parseQuotWithCurrentStage.get ctx.options then
-- static quotations such as `(e) do not use the interpreter unless the above option is set,
-- so for consistency neither should dynamic quotations using this function
{ ctx with options := ctx.options.setBool `interpreter.prefer_native true }
else ctx) (evalParserConst parserName) ctx s
if !s.hasError && s.stackSize != iniSz + 1 then
s.mkUnexpectedError "expected parser to return exactly one syntax object"
else
s
| _::_::_ => s.mkUnexpectedError s!"ambiguous parser name {parserName}"
| [] => s.mkUnexpectedError s!"unknown parser {parserName}"
let iniSz := s.stackSize
let s ← match ctx.resolveParserName parserName with
| [.category cat] =>
categoryParserFn cat ctx s
| [.parser parserName _] =>
adaptUncacheableContextFn (fun ctx =>
if !internal.parseQuotWithCurrentStage.get ctx.options then
-- static quotations such as `(e) do not use the interpreter unless the above option is set,
-- so for consistency neither should dynamic quotations using this function
{ ctx with options := ctx.options.setBool `interpreter.prefer_native true }
else ctx) (evalParserConst parserName) ctx s
| [.alias alias] =>
match alias with
| .const p => p.fn ctx s
| _ =>
return s.mkUnexpectedError s!"parser alias {parserName}, must not take parameters"
| _::_::_ => return s.mkUnexpectedError s!"ambiguous parser name {parserName}"
| [] => return s.mkUnexpectedError s!"unknown parser {parserName}"
if !s.hasError && s.stackSize != iniSz + 1 then
return s.mkUnexpectedError "expected parser to return exactly one syntax object"
s
def parserOfStack (offset : Nat) (prec : Nat := 0) : Parser where
fn := adaptCacheableContextFn ({ · with prec }) (parserOfStackFn offset)

View File

@@ -542,8 +542,11 @@ It is often used when building macros.
@[builtin_term_parser] def «let_tmp» := leading_parser:leadPrec
withPosition ("let_tmp " >> letDecl) >> optSemicolon termParser
def haveId := leading_parser (withAnonymousAntiquot := false)
(ppSpace >> binderIdent) <|> hygieneInfo
/- like `let_fun` but with optional name -/
def haveIdLhs := ((ppSpace >> binderIdent) <|> hygieneInfo) >> many (ppSpace >> letIdBinder) >> optType
def haveIdLhs :=
haveId >> many (ppSpace >> letIdBinder) >> optType
def haveIdDecl := leading_parser (withAnonymousAntiquot := false)
atomic (haveIdLhs >> " := ") >> termParser
def haveEqnsDecl := leading_parser (withAnonymousAntiquot := false)
@@ -786,7 +789,7 @@ def isIdent (stx : Syntax) : Bool :=
checkStackTop isIdent "expected preceding identifier" >>
checkNoWsBefore "no space before '.{'" >> ".{" >>
sepBy1 levelParser ", " >> "}"
/-- `x@e` or `x:h@e` matches the pattern `e` and binds its value to the identifier `x`.
/-- `x@e` or `x@h:e` matches the pattern `e` and binds its value to the identifier `x`.
If present, the identifier `h` is bound to a proof of `x = e`. -/
@[builtin_term_parser] def namedPattern : TrailingParser := trailing_parser
checkStackTop isIdent "expected preceding identifier" >>

View File

@@ -28,15 +28,25 @@ inductive MsgEmbed where
| expr : CodeWithInfos → MsgEmbed
/-- An interactive goal display. -/
| goal : InteractiveGoal → MsgEmbed
/-- Some messages (in particular, traces) are too costly to print eagerly. Instead, we allow
the user to expand sub-traces interactively. -/
/-- A widget instance.
`alt` is a fallback rendering of the widget
that can be shown in standard, non-interactive LSP diagnostics,
as well as when user widgets are not supported by the client. -/
| widget (wi : Widget.WidgetInstance) (alt : TaggedText MsgEmbed)
/-- Some messages (in particular, traces) are too costly to print eagerly.
Instead, we allow the user to expand sub-traces interactively. -/
| trace (indent : Nat) (cls : Name) (msg : TaggedText MsgEmbed) (collapsed : Bool)
(children : StrictOrLazy (Array (TaggedText MsgEmbed)) (WithRpcRef LazyTraceChildren))
deriving Inhabited, RpcEncodable
/-- The `message` field is the text of a message possibly containing interactive *embeds* of type
`MsgEmbed`. We maintain the invariant that embeds are stored in `.tag`s with empty `.text` subtrees,
i.e. `.tag embed (.text "")`, because a `MsgEmbed` display involve more than just text. -/
/-- The `message` field is the text of a message
possibly containing interactive *embeds* of type `MsgEmbed`.
We maintain the invariant that embeds are stored in `.tag`s with empty `.text` subtrees,
i.e., `.tag embed (.text "")`.
Client-side display algorithms render tags in a custom way,
ignoring the nested text. -/
abbrev InteractiveDiagnostic := Lsp.DiagnosticWith (TaggedText MsgEmbed)
deriving instance RpcEncodable for Lsp.DiagnosticWith
@@ -44,14 +54,15 @@ deriving instance RpcEncodable for Lsp.DiagnosticWith
namespace InteractiveDiagnostic
open MsgEmbed
def toDiagnostic (diag : InteractiveDiagnostic) : Lsp.Diagnostic :=
partial def toDiagnostic (diag : InteractiveDiagnostic) : Lsp.Diagnostic :=
{ diag with message := prettyTt diag.message }
where
prettyTt (tt : TaggedText MsgEmbed) : String :=
let tt : TaggedText MsgEmbed := tt.rewrite fun
| .expr tt, _ => .text tt.stripTags
| .goal g, _ => .text (toString g.pretty)
| .trace .., _ => .text "(trace)"
| .expr tt, _ => .text tt.stripTags
| .goal g, _ => .text (toString g.pretty)
| .widget _ alt, _ => .text $ prettyTt alt
| .trace .., _ => .text "(trace)"
tt.stripTags
/-- Compares interactive diagnostics modulo `TaggedText` tags and traces. -/
@@ -88,6 +99,8 @@ private inductive EmbedFmt
/-- Nested text is ignored. -/
| goal (ctx : Elab.ContextInfo) (lctx : LocalContext) (g : MVarId)
/-- Nested text is ignored. -/
| widget (wi : WidgetInstance) (alt : Format)
/-- Nested text is ignored. -/
| trace (cls : Name) (msg : Format) (collapsed : Bool)
(children : StrictOrLazy (Array Format) (Array MessageData))
/-- Nested tags are ignored, show nested text as-is. -/
@@ -128,20 +141,23 @@ where
}
go (nCtx : NamingContext) : Option MessageDataContext → MessageData → MsgFmtM Format
| none, ofFormatWithInfos fmt, _ => withIgnoreTags fmt
| some ctx, ofFormatWithInfos fmt, infos => do
| none, ofFormatWithInfos fmt, _ => withIgnoreTags fmt
| some ctx, ofFormatWithInfos fmt, infos => do
let t ← pushEmbed <| EmbedFmt.code (mkContextInfo nCtx ctx) infos
return Format.tag t fmt
| none, ofGoal mvarId => pure $ "goal " ++ format (mkMVar mvarId)
| some ctx, ofGoal mvarId =>
return .tag (← pushEmbed (.goal (mkContextInfo nCtx ctx) ctx.lctx mvarId)) "\n"
| _, withContext ctx d => go nCtx ctx d
| ctx, withNamingContext nCtx d => go nCtx ctx d
| ctx, tagged _ d => go nCtx ctx d
| ctx, nest n d => Format.nest n <$> go nCtx ctx d
| ctx, compose d₁ d₂ => do let d₁ ← go nCtx ctx d₁; let d₂ ← go nCtx ctx d₂; pure $ d₁ ++ d₂
| ctx, group d => Format.group <$> go nCtx ctx d
| ctx, .trace data header children => do
| none, ofGoal mvarId => pure $ "goal " ++ format (mkMVar mvarId)
| some ctx, ofGoal mvarId =>
return .tag (← pushEmbed (.goal (mkContextInfo nCtx ctx) ctx.lctx mvarId)) default
| ctx, ofWidget wi d => do
let t ← pushEmbed <| EmbedFmt.widget wi (← go nCtx ctx d)
return Format.tag t default
| _, withContext ctx d => go nCtx ctx d
| ctx, withNamingContext nCtx d => go nCtx ctx d
| ctx, tagged _ d => go nCtx ctx d
| ctx, nest n d => Format.nest n <$> go nCtx ctx d
| ctx, compose d₁ d₂ => do let d₁ ← go nCtx ctx d₁; let d₂ ← go nCtx ctx d₂; pure $ d₁ ++ d₂
| ctx, group d => Format.group <$> go nCtx ctx d
| ctx, .trace data header children => do
let mut header := (← go nCtx ctx header).nest 4
if data.startTime != 0 then
header := f!"[{data.stopTime - data.startTime}] {header}"
@@ -159,7 +175,7 @@ where
else
pure (.strict (← children.mapM (go nCtx ctx)))
let e := .trace data.cls header data.collapsed nodes
return .tag (← pushEmbed e) ".\n"
return .tag (← pushEmbed e) default
| ctx?, ofLazy f _ => do
let dyn ← f (ctx?.map (mkPPContext nCtx))
let some msg := dyn.get? MessageData
@@ -188,6 +204,8 @@ partial def msgToInteractive (msgData : MessageData) (hasWidgets : Bool) (indent
| .goal ctx lctx g =>
ctx.runMetaM lctx do
return .tag (.goal (← goalToInteractive g)) default
| .widget wi alt =>
return .tag (.widget wi (← fmtToTT alt col)) default
| .trace cls msg collapsed children => do
let col := col + tt.stripTags.length - 2
let children


@@ -26,5 +26,6 @@ structure WidgetInstance where
so must be stored as a computation
with access to the RPC object store. -/
props : StateM Server.RpcObjectStore Json
deriving Server.RpcEncodable
end Lean.Widget


@@ -214,17 +214,28 @@ def addPanelWidgetLocal [Monad m] [MonadEnv m] (wi : WidgetInstance) : m Unit :=
def erasePanelWidget [Monad m] [MonadEnv m] (h : UInt64) : m Unit := do
modifyEnv fun env => panelWidgetsExt.modifyState env fun st => st.erase h
/-- Save the data of a panel widget which will be displayed whenever the text cursor is on `stx`.
/-- Construct a widget instance by finding a widget module
in the current environment.
`hash` must be `hash (toModule c).javascript`
where `c` is some global constant annotated with `@[widget_module]`. -/
def savePanelWidgetInfo (hash : UInt64) (props : StateM Server.RpcObjectStore Json) (stx : Syntax) :
CoreM Unit := do
where `c` is some global constant annotated with `@[widget_module]`,
or the name of a builtin widget module. -/
def WidgetInstance.ofHash (hash : UInt64) (props : StateM Server.RpcObjectStore Json) :
CoreM WidgetInstance := do
let env ← getEnv
let builtins ← builtinModulesRef.get
let some id :=
(builtins.find? hash |>.map (·.1)) <|> (moduleRegistry.getState env |>.find? hash |>.map (·.1))
| throwError s!"No widget module with hash {hash} registered"
pushInfoLeaf <| .ofUserWidgetInfo { id, javascriptHash := hash, props, stx }
return { id, javascriptHash := hash, props }
/-- Save the data of a panel widget which will be displayed whenever the text cursor is on `stx`.
`hash` must be as in `WidgetInstance.ofHash`. -/
def savePanelWidgetInfo (hash : UInt64) (props : StateM Server.RpcObjectStore Json) (stx : Syntax) :
CoreM Unit := do
let wi ← WidgetInstance.ofHash hash props
pushInfoLeaf <| .ofUserWidgetInfo { wi with stx }
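As a hedged sketch of how the refactored API composes, the two definitions above can be used together like this (the widget constant `MyWidget` and its `@[widget_module]` registration are assumed, not part of this change):

```lean
-- Sketch: `WidgetInstance.ofHash` resolves a registered module by the
-- hash of its JavaScript source; `savePanelWidgetInfo` attaches the
-- resulting instance to a piece of syntax. `MyWidget` is hypothetical.
open Lean Widget in
def showMyWidget (stx : Syntax) : CoreM Unit :=
  savePanelWidgetInfo (hash MyWidget.javascript) (return .null) stx
```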
/-! ## `show_panel_widgets` command -/
@@ -372,8 +383,6 @@ opaque evalUserWidgetDefinition [Monad m] [MonadEnv m] [MonadOptions m] [MonadEr
/-! ## Retrieving panel widget instances -/
deriving instance Server.RpcEncodable for WidgetInstance
/-- Retrieve all the `UserWidgetInfo`s that intersect a given line. -/
def widgetInfosAt? (text : FileMap) (t : InfoTree) (hoverLine : Nat) : List UserWidgetInfo :=
t.deepestNodes fun


@@ -5,7 +5,7 @@ Authors: Gabriel Ebner, Sebastian Ullrich, Mac Malone, Siddharth Bhat
-/
import Lake.Util.Proc
import Lake.Util.NativeLib
import Lake.Build.Job
import Lake.Util.IO
/-! # Common Build Actions
Low level actions to build common Lean artifacts via the Lean toolchain.
@@ -22,7 +22,7 @@ def compileLeanModule
(leanPath : SearchPath := []) (rootDir : FilePath := ".")
(dynlibs : Array FilePath := #[]) (dynlibPath : SearchPath := {})
(leanArgs : Array String := #[]) (lean : FilePath := "lean")
: JobM Unit := do
: LogIO Unit := do
let mut args := leanArgs ++
#[leanFile.toString, "-R", rootDir.toString]
if let some oleanFile := oleanFile? then
@@ -70,7 +70,7 @@ def compileLeanModule
def compileO
(oFile srcFile : FilePath)
(moreArgs : Array String := #[]) (compiler : FilePath := "cc")
: JobM Unit := do
: LogIO Unit := do
createParentDirs oFile
proc {
cmd := compiler.toString
@@ -80,7 +80,7 @@ def compileO
def compileStaticLib
(libFile : FilePath) (oFiles : Array FilePath)
(ar : FilePath := "ar")
: JobM Unit := do
: LogIO Unit := do
createParentDirs libFile
proc {
cmd := ar.toString
@@ -90,7 +90,7 @@ def compileStaticLib
def compileSharedLib
(libFile : FilePath) (linkArgs : Array String)
(linker : FilePath := "cc")
: JobM Unit := do
: LogIO Unit := do
createParentDirs libFile
proc {
cmd := linker.toString
@@ -100,7 +100,7 @@ def compileSharedLib
def compileExe
(binFile : FilePath) (linkFiles : Array FilePath)
(linkArgs : Array String := #[]) (linker : FilePath := "cc")
: JobM Unit := do
: LogIO Unit := do
createParentDirs binFile
proc {
cmd := linker.toString


@@ -95,3 +95,13 @@ abbrev MonadBuild (m : Type → Type u) :=
/-- The internal core monad of Lake builds. Not intended for user use. -/
abbrev CoreBuildM := BuildT LogIO
/--
Logs a build step with `message`.
**Deprecated:** Build steps are now managed by a top-level build monitor.
As a result, this no longer functions the way it used to. It now just logs the
`message` via `logVerbose`.
-/
@[deprecated, inline] def logStep [Monad m] [MonadLog m] (message : String) : m Unit := do
logVerbose message


@@ -25,6 +25,19 @@ abbrev RecBuildM :=
instance : MonadLift LogIO RecBuildM := ELogT.takeAndRun
/--
Run a `JobM` action in `RecBuildM` (and thus `FetchM`).
Generally, this should not be done, and instead a job action
should be run asynchronously in a Job (e.g., via `Job.async`).
-/
@[inline] def RecBuildM.runJobM (x : JobM α) : RecBuildM α := fun _ ctx log store => do
match (← x ctx {log}) with
| .ok a s => return (.ok a s.log, store)
| .error e s => return (.error e s.log, store)
instance : MonadLift JobM RecBuildM := RecBuildM.runJobM
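Per the doc-string above, lifting `JobM` directly is a last resort; the preferred shape, sketched here with illustrative names, wraps the job action asynchronously:

```lean
-- Hypothetical sketch: run the `JobM` work inside `Job.async` rather
-- than lifting it into the surrounding `FetchM` computation. `SpawnM`
-- (the monad of `Job.async`) lifts into `FetchM`.
def fetchSomething : FetchM (Job Unit) :=
  Job.async do
    logVerbose "doing job work asynchronously"
```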
/-- Run a recursive build. -/
@[inline] def RecBuildM.run
(stack : CallStack BuildKey) (store : BuildStore) (build : RecBuildM α)
@@ -55,6 +68,12 @@ abbrev IndexT (m : Type → Type v) := EquipT (IndexBuildFn m) m
/-- The top-level monad for Lake build functions. -/
abbrev FetchM := IndexT RecBuildM
/-- Ensures that `JobM` lifts into `FetchM`. -/
example : MonadLiftT JobM FetchM := inferInstance
/-- Ensures that `SpawnM` lifts into `FetchM`. -/
example : MonadLiftT SpawnM FetchM := inferInstance
/-- The top-level monad for Lake build functions. **Renamed `FetchM`.** -/
@[deprecated FetchM] abbrev IndexBuildM := FetchM


@@ -66,7 +66,13 @@ abbrev JobResult α := EResult Log.Pos JobState α
/-- The `Task` of a Lake job. -/
abbrev JobTask α := BaseIOTask (JobResult α)
/-- The monad of asynchronous Lake jobs. Lifts into `FetchM`. -/
/--
The monad of asynchronous Lake jobs.
While this can be lifted into `FetchM`, job action should generally
be wrapped into an asynchronous job (e.g., via `Job.async`) instead of being
run directly in `FetchM`.
-/
abbrev JobM := BuildT <| EStateT Log.Pos JobState BaseIO
instance [Pure m] : MonadLift LakeM (BuildT m) where
@@ -94,16 +100,6 @@ abbrev SpawnM := BuildT BaseIO
/-- The monad used to spawn asynchronous Lake build jobs. **Replaced by `SpawnM`.** -/
@[deprecated SpawnM] abbrev SchedulerM := SpawnM
/--
Logs a build step with `message`.
**Deprecated:** Build steps are now managed by a top-level build monitor.
As a result, this no longer functions the way it used to. It now just logs the
`message` via `logVerbose`.
-/
@[deprecated] def logStep (message : String) : JobM Unit := do
logVerbose message
/-- A Lake job. -/
structure Job (α : Type u) where
task : JobTask α


@@ -71,7 +71,7 @@ def Package.fetchOptRelease (self : Package) : FetchM (BuildJob Bool) := Job.asy
download url self.buildArchiveFile
unless upToDate && (← self.buildDir.pathExists) do
updateAction .fetch
logVerbose s!"unpacking {self.name}/{tag}/{self.buildArchive}"
logVerbose s!"unpacking {self.name}:{tag}:{self.buildArchive}"
untar self.buildArchiveFile self.buildDir
return (true, .nil)


@@ -3,18 +3,16 @@ Copyright (c) 2021 Mac Malone. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Mac Malone
-/
import Lake.Util.IO
open System
namespace Lake
--------------------------------------------------------------------------------
/-! # Utilities -/
--------------------------------------------------------------------------------
/-- Creates any missing parent directories of `path`. -/
@[inline] def createParentDirs (path : FilePath) : IO Unit := do
if let some dir := path.parent then IO.FS.createDirAll dir
class CheckExists.{u} (i : Type u) where
/-- Check whether there already exists an artifact for the given target info. -/
checkExists : i → BaseIO Bool


@@ -5,6 +5,7 @@ Authors: Mac Malone
-/
import Lake.Build.Run
import Lake.Build.Targets
import Lake.CLI.Build
namespace Lake
@@ -18,24 +19,69 @@ def exe (name : Name) (args : Array String := #[]) (buildConfig : BuildConfig :
let exeFile ← ws.runBuild exe.fetch buildConfig
env exeFile.toString args
def uploadRelease (pkg : Package) (tag : String) : LogIO Unit := do
def Package.pack (pkg : Package) (file : FilePath := pkg.buildArchiveFile) : LogIO Unit := do
logInfo s!"packing {file}"
tar pkg.buildDir file
def Package.unpack (pkg : Package) (file : FilePath := pkg.buildArchiveFile) : LogIO Unit := do
logInfo s!"unpacking {file}"
untar file pkg.buildDir
def Package.uploadRelease (pkg : Package) (tag : String) : LogIO Unit := do
pkg.pack
logInfo s!"uploading {tag}:{pkg.buildArchive}"
let mut args :=
#["release", "upload", tag, pkg.buildArchiveFile.toString, "--clobber"]
if let some repo := pkg.releaseRepo? then
args := args.append #["-R", repo]
logInfo s!"packing {pkg.buildArchive}"
tar pkg.buildDir pkg.buildArchiveFile
logInfo s!"uploading {tag}/{pkg.buildArchive}"
proc {cmd := "gh", args}
def Package.test (pkg : Package) (args : List String := []) (buildConfig : BuildConfig := {}) : LakeT IO UInt32 := do
def Package.resolveDriver
(pkg : Package) (kind : String) (driver : String)
: LakeT IO (Package × String) := do
let pkgName := pkg.name.toString (escape := false)
if pkg.testRunner.isAnonymous then
error s!"{pkgName}: no test runner script or executable"
else if let some script := pkg.scripts.find? pkg.testRunner then
script.run args
else if let some exe := pkg.findLeanExe? pkg.testRunner then
let exeFile ← runBuild exe.fetch buildConfig
env exeFile.toString args.toArray
if driver.isEmpty then
error s!"{pkgName}: no {kind} driver configured"
else
error s!"{pkgName}: invalid test runner: unknown script or executable `{pkg.testRunner}`"
match driver.split (· == '/') with
| [pkg, driver] =>
let some pkg ← findPackage? pkg.toName
| error s!"{pkgName}: unknown {kind} driver package '{pkg}'"
return (pkg, driver)
| [driver] =>
return (pkg, driver)
| _ =>
error s!"{pkgName}: invalid {kind} driver '{driver}' (too many '/')"
def Package.test (pkg : Package) (args : List String := []) (buildConfig : BuildConfig := {}) : LakeT IO UInt32 := do
let cfgArgs := pkg.testDriverArgs
let (pkg, driver) ← pkg.resolveDriver "test" pkg.testDriver
let pkgName := pkg.name.toString (escape := false)
if let some script := pkg.scripts.find? driver.toName then
script.run (cfgArgs.toList ++ args)
else if let some exe := pkg.findLeanExe? driver.toName then
let exeFile ← runBuild exe.fetch buildConfig
env exeFile.toString (cfgArgs ++ args.toArray)
else if let some lib := pkg.findLeanLib? driver.toName then
unless cfgArgs.isEmpty && args.isEmpty do
error s!"{pkgName}: arguments cannot be passed to a library test driver"
match resolveLibTarget (← getWorkspace) lib with
| .ok specs =>
runBuild (buildSpecs specs) {buildConfig with out := .stdout}
return 0
| .error e =>
error s!"{pkgName}: invalid test driver: {e}"
else
error s!"{pkgName}: invalid test driver: unknown script, executable, or library '{driver}'"
def Package.lint (pkg : Package) (args : List String := []) (buildConfig : BuildConfig := {}) : LakeT IO UInt32 := do
let cfgArgs := pkg.lintDriverArgs
let (pkg, driver) ← pkg.resolveDriver "lint" pkg.lintDriver
if let some script := pkg.scripts.find? driver.toName then
script.run (cfgArgs.data ++ args)
else if let some exe := pkg.findLeanExe? driver.toName then
let exeFile ← runBuild exe.fetch buildConfig
env exeFile.toString (cfgArgs ++ args.toArray)
else
let pkgName := pkg.name.toString (escape := false)
error s!"{pkgName}: invalid lint driver: unknown script or executable '{driver}'"


@@ -15,15 +15,18 @@ structure BuildSpec where
getBuildJob : BuildData info.key → BuildJob Unit
@[inline] def BuildData.toBuildJob
[FamilyOut BuildData k (BuildJob α)] (data : BuildData k) : BuildJob Unit :=
[FamilyOut BuildData k (BuildJob α)] (data : BuildData k)
: BuildJob Unit :=
discard <| ofFamily data
@[inline] def mkBuildSpec (info : BuildInfo)
[FamilyOut BuildData info.key (BuildJob α)] : BuildSpec :=
@[inline] def mkBuildSpec
(info : BuildInfo) [FamilyOut BuildData info.key (BuildJob α)]
: BuildSpec :=
{info, getBuildJob := BuildData.toBuildJob}
@[inline] def mkConfigBuildSpec (facetType : String)
(info : BuildInfo) (config : FacetConfig Fam ι facet) (h : BuildData info.key = Fam facet)
@[inline] def mkConfigBuildSpec
(facetType : String) (info : BuildInfo)
(config : FacetConfig Fam ι facet) (h : BuildData info.key = Fam facet)
: Except CliError BuildSpec := do
let some getJob := config.getJob?
| throw <| CliError.nonCliFacet facetType facet
@@ -47,7 +50,9 @@ def parsePackageSpec (ws : Workspace) (spec : String) : Except CliError Package
| none => throw <| CliError.unknownPackage spec
open Module in
def resolveModuleTarget (ws : Workspace) (mod : Module) (facet : Name) : Except CliError BuildSpec :=
def resolveModuleTarget
(ws : Workspace) (mod : Module) (facet : Name := .anonymous)
: Except CliError BuildSpec :=
if facet.isAnonymous then
return mkBuildSpec <| mod.facet leanArtsFacet
else if let some config := ws.findModuleFacetConfig? facet then do
@@ -55,7 +60,9 @@ def resolveModuleTarget (ws : Workspace) (mod : Module) (facet : Name) : Except
else
throw <| CliError.unknownFacet "module" facet
def resolveLibTarget (ws : Workspace) (lib : LeanLib) (facet : Name) : Except CliError (Array BuildSpec) :=
def resolveLibTarget
(ws : Workspace) (lib : LeanLib) (facet : Name := .anonymous)
: Except CliError (Array BuildSpec) :=
if facet.isAnonymous then
lib.defaultFacets.mapM (resolveFacet ·)
else


@@ -18,11 +18,16 @@ COMMANDS:
init <name> <temp> create a Lean package in the current directory
build <targets>... build targets
exe <exe> <args>... build an exe and run it in Lake's environment
test run the workspace's test script or executable
test test the package using the configured test driver
check-test check if there is a properly configured test driver
lint lint the package using the configured lint driver
check-lint check if there is a properly configured lint driver
clean remove build outputs
env <cmd> <args>... execute a command in Lake's environment
lean <file> elaborate a Lean file in Lake's context
update update dependencies and save them to the manifest
pack pack build artifacts into an archive for distribution
unpack unpack build artifacts from a distributed archive
upload <tag> upload build artifacts to a GitHub release
script manage and run workspace scripts
scripts shorthand for `lake script list`
@@ -142,13 +147,84 @@ of the same package, the version materialized is undefined.
A bare `lake update` will upgrade all dependencies."
def helpTest :=
"Run the workspace's test script or executable
"Test the workspace's root package using its configured test driver
USAGE:
lake test [-- <args>...]
Looks for a script or executable tagged `@[test_runner]` in the workspace's
root package and executes it with `args`. "
A test driver can be configured by either setting the 'testDriver'
package configuration option or by tagging a script, executable, or library
`@[test_driver]`. A definition in a dependency can be used as a test driver
by using the `<pkg>/<name>` syntax for the 'testDriver' configuration option.
A script test driver will be run with the package configuration's
`testDriverArgs` plus the CLI `args`. An executable test driver will be
built and then run like a script. A library test driver will just be built.
"
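A hedged lakefile sketch of the configuration the help text describes (all names here are invented for illustration):

```lean
import Lake
open Lake DSL

package mypkg where
  -- Name the driver explicitly (an alternative to tagging a
  -- definition with `@[test_driver]`):
  testDriver := "tests"
  testDriverArgs := #["--quick"]

-- `lake test` builds this executable, then runs it with
-- `testDriverArgs` followed by any CLI `-- <args>...`.
lean_exe tests where
  root := `Tests.Main
```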
def helpCheckTest :=
"Check if there is a properly configured test driver
USAGE:
lake check-test
Exits with code 0 if the workspace's root package has a properly
configured test driver. Errors (with code 1) otherwise.
Does NOT verify that the configured test driver actually exists in the
package or its dependencies. It merely verifies that one is specified.
"
def helpLint :=
"Lint the workspace's root package using its configured lint driver
USAGE:
lake lint [-- <args>...]
A lint driver can be configured by either setting the `lintDriver` package
configuration option or by tagging a script or executable `@[lint_driver]`.
A definition in a dependency can be used as a lint driver by using the
`<pkg>/<name>` syntax for the 'lintDriver' configuration option.
A script lint driver will be run with the package configuration's
`lintDriverArgs` plus the CLI `args`. An executable lint driver will be
built and then run like a script.
"
def helpCheckLint :=
"Check if there is a properly configured lint driver
USAGE:
lake check-lint
Exits with code 0 if the workspace's root package has a properly
configured lint driver. Errors (with code 1) otherwise.
Does NOT verify that the configured lint driver actually exists in the
package or its dependencies. It merely verifies that one is specified.
"
def helpPack :=
"Pack build artifacts into an archive for distribution
USAGE:
lake pack [<file.tgz>]
Packs the root package's `buildDir` into a gzip tar archive using `tar`.
If a path for the archive is not specified, creates an archive in the package's
Lake directory (`.lake`) named according to its `buildArchive` setting.
Does NOT build any artifacts. It just packs the existing ones."
def helpUnpack :=
"Unpack build artifacts from a distributed archive
USAGE:
lake unpack [<file.tgz>]
Unpack build artifacts from the gzip tar archive `file.tgz` into the root
package's `buildDir`. If a path for the archive is not specified, uses
the package's `buildArchive` in its Lake directory (`.lake`)."
def helpUpload :=
"Upload build artifacts to a GitHub release
@@ -300,8 +376,13 @@ def help : (cmd : String) → String
| "init" => helpInit
| "build" => helpBuild
| "update" | "upgrade" => helpUpdate
| "pack" => helpPack
| "unpack" => helpUnpack
| "upload" => helpUpload
| "test" => helpTest
| "check-test" => helpCheckTest
| "lint" => helpLint
| "check-lint" => helpCheckLint
| "clean" => helpClean
| "script" => helpScriptCli
| "scripts" => helpScriptList


@@ -326,13 +326,28 @@ protected def update : CliM PUnit := do
let toUpdate := (← getArgs).foldl (·.insert <| stringToLegalOrSimpleName ·) {}
updateManifest config toUpdate
protected def pack : CliM PUnit := do
processOptions lakeOption
let file? ← takeArg?
noArgsRem do
let ws ← loadWorkspace (← mkLoadConfig (← getThe LakeOptions))
let file := (FilePath.mk <$> file?).getD ws.root.buildArchiveFile
ws.root.pack file
protected def unpack : CliM PUnit := do
processOptions lakeOption
let file? ← takeArg?
noArgsRem do
let ws ← loadWorkspace (← mkLoadConfig (← getThe LakeOptions))
let file := (FilePath.mk <$> file?).getD ws.root.buildArchiveFile
ws.root.unpack file
protected def upload : CliM PUnit := do
processOptions lakeOption
let tag ← takeArg "release tag"
let opts ← getThe LakeOptions
let config ← mkLoadConfig opts
let ws ← loadWorkspace config
uploadRelease ws.root tag
noArgsRem do
let ws ← loadWorkspace (← mkLoadConfig (← getThe LakeOptions))
ws.root.uploadRelease tag
protected def setupFile : CliM PUnit := do
processOptions lakeOption
@@ -346,16 +361,28 @@ protected def setupFile : CliM PUnit := do
protected def test : CliM PUnit := do
processOptions lakeOption
let opts ← getThe LakeOptions
let config ← mkLoadConfig opts
let ws ← loadWorkspace config
let ws ← loadWorkspace (← mkLoadConfig opts)
noArgsRem do
let x := ws.root.test opts.subArgs (mkBuildConfig opts)
exit <| x.run (mkLakeContext ws)
protected def checkTest : CliM PUnit := do
processOptions lakeOption
let ws ← loadWorkspace (← mkLoadConfig (← getThe LakeOptions))
noArgsRem do exit <| if ws.root.testRunner.isAnonymous then 1 else 0
let pkg ← loadPackage (← mkLoadConfig (← getThe LakeOptions))
noArgsRem do exit <| if pkg.testDriver.isEmpty then 1 else 0
protected def lint : CliM PUnit := do
processOptions lakeOption
let opts getThe LakeOptions
let ws ← loadWorkspace (← mkLoadConfig opts)
noArgsRem do
let x := ws.root.lint opts.subArgs (mkBuildConfig opts)
exit <| x.run (mkLakeContext ws)
protected def checkLint : CliM PUnit := do
processOptions lakeOption
let pkg ← loadPackage (← mkLoadConfig (← getThe LakeOptions))
noArgsRem do exit <| if pkg.lintDriver.isEmpty then 1 else 0
protected def clean : CliM PUnit := do
processOptions lakeOption
@@ -440,8 +467,7 @@ protected def translateConfig : CliM PUnit := do
let lang ← parseLangSpec (← takeArg "configuration language")
let outFile? := (← takeArg?).map FilePath.mk
noArgsRem do
Lean.searchPathRef.set cfg.lakeEnv.leanSearchPath
let (pkg, _) ← loadPackage "[root]" cfg
let pkg ← loadPackage cfg
let outFile := outFile?.getD <| pkg.configFile.withExtension lang.fileExtension
if (← outFile.pathExists) then
throw (.outputConfigExists outFile)
@@ -464,10 +490,14 @@ def lakeCli : (cmd : String) → CliM PUnit
| "build" => lake.build
| "update" | "upgrade" => lake.update
| "resolve-deps" => lake.resolveDeps
| "pack" => lake.pack
| "unpack" => lake.unpack
| "upload" => lake.upload
| "setup-file" => lake.setupFile
| "test" => lake.test
| "check-test" => lake.checkTest
| "lint" => lake.lint
| "check-lint" => lake.checkLint
| "clean" => lake.clean
| "script" => lake.script
| "scripts" => lake.script.list


@@ -89,8 +89,10 @@ def LeanConfig.addDeclFields (cfg : LeanConfig) (fs : Array DeclField) : Array D
@[inline] def mkDeclValWhere? (fields : Array DeclField) : Option (TSyntax ``declValWhere) :=
if fields.isEmpty then none else Unhygienic.run `(declValWhere|where $fields*)
def PackageConfig.mkSyntax (cfg : PackageConfig) : PackageDecl := Unhygienic.run do
let declVal? := mkDeclValWhere? <|Array.empty
def PackageConfig.mkSyntax (cfg : PackageConfig)
(testDriver := cfg.testDriver) (lintDriver := cfg.lintDriver)
: PackageDecl := Unhygienic.run do
let declVal? := mkDeclValWhere? <| Array.empty
|> addDeclFieldD `precompileModules cfg.precompileModules false
|> addDeclFieldD `moreGlobalServerArgs cfg.moreGlobalServerArgs #[]
|> addDeclFieldD `srcDir cfg.srcDir "."
@@ -102,6 +104,10 @@ def PackageConfig.mkSyntax (cfg : PackageConfig) : PackageDecl := Unhygienic.run
|> addDeclField? `releaseRepo (cfg.releaseRepo <|> cfg.releaseRepo?)
|> addDeclFieldD `buildArchive (cfg.buildArchive?.getD cfg.buildArchive) (defaultBuildArchive cfg.name)
|> addDeclFieldD `preferReleaseBuild cfg.preferReleaseBuild false
|> addDeclFieldD `testDriver testDriver ""
|> addDeclFieldD `testDriverArgs cfg.testDriverArgs #[]
|> addDeclFieldD `lintDriver lintDriver ""
|> addDeclFieldD `lintDriverArgs cfg.lintDriverArgs #[]
|> cfg.toWorkspaceConfig.addDeclFields
|> cfg.toLeanConfig.addDeclFields
`(packageDecl|package $(mkIdent cfg.name) $[$declVal?]?)
@@ -145,7 +151,7 @@ protected def LeanLibConfig.mkSyntax
`(leanLibDecl|$[$attrs?:attributes]? lean_lib $(mkIdent cfg.name) $[$declVal?]?)
protected def LeanExeConfig.mkSyntax
(cfg : LeanExeConfig) (defaultTarget := false) (testRunner := false)
(cfg : LeanExeConfig) (defaultTarget := false)
: LeanExeDecl := Unhygienic.run do
let declVal? := mkDeclValWhere? <| Array.empty
|> addDeclFieldD `srcDir cfg.srcDir "."
@@ -153,11 +159,7 @@ protected def LeanExeConfig.mkSyntax
|> addDeclFieldD `exeName cfg.exeName (cfg.name.toStringWithSep "-" (escape := false))
|> addDeclFieldD `supportInterpreter cfg.supportInterpreter false
|> cfg.toLeanConfig.addDeclFields
let attrs? ← id do
let mut attrs := #[]
if testRunner then attrs := attrs.push <| `(Term.attrInstance|test_runner)
if defaultTarget then attrs := attrs.push <| `(Term.attrInstance|default_target)
if attrs.isEmpty then pure none else some <$> `(Term.attributes|@[$attrs,*])
let attrs? ← if defaultTarget then some <$> `(Term.attributes|@[default_target]) else pure none
`(leanExeDecl|$[$attrs?:attributes]? lean_exe $(mkIdent cfg.name) $[$declVal?]?)
protected def Dependency.mkSyntax (cfg : Dependency) : RequireDecl := Unhygienic.run do
@@ -175,14 +177,13 @@ protected def Dependency.mkSyntax (cfg : Dependency) : RequireDecl := Unhygienic
/-- Create a Lean module that encodes the declarative configuration of the package. -/
def Package.mkLeanConfig (pkg : Package) : TSyntax ``module := Unhygienic.run do
let testRunner := pkg.testRunner
let defaultTargets := pkg.defaultTargets.foldl NameSet.insert NameSet.empty
let pkgConfig := pkg.config.mkSyntax
let pkgConfig := pkg.config.mkSyntax pkg.testDriver pkg.lintDriver
let requires := pkg.depConfigs.map (·.mkSyntax)
let leanLibs := pkg.leanLibConfigs.toArray.map fun cfg =>
cfg.mkSyntax (defaultTargets.contains cfg.name)
let leanExes := pkg.leanExeConfigs.toArray.map fun cfg =>
cfg.mkSyntax (defaultTargets.contains cfg.name) (cfg.name == testRunner)
cfg.mkSyntax (defaultTargets.contains cfg.name)
`(module|
import $(mkIdent `Lake)
open $(mkIdent `System) $(mkIdent `Lake) $(mkIdent `DSL)


@@ -122,7 +122,10 @@ instance : ToToml Dependency := ⟨(toToml ·.toToml)⟩
/-- Create a TOML table that encodes the declarative configuration of the package. -/
def Package.mkTomlConfig (pkg : Package) (t : Table := {}) : Table :=
pkg.config.toToml t
|>.insertD `testRunner pkg.testRunner .anonymous
|>.smartInsert `testDriver pkg.testDriver
|>.smartInsert `testDriverArgs pkg.testDriverArgs
|>.smartInsert `lintDriver pkg.lintDriver
|>.smartInsert `lintDriverArgs pkg.lintDriverArgs
|>.smartInsert `defaultTargets pkg.defaultTargets
|>.smartInsert `require pkg.depConfigs
|>.smartInsert `lean_lib pkg.leanLibConfigs.toArray


@@ -154,6 +154,44 @@ structure PackageConfig extends WorkspaceConfig, LeanConfig where
-/
preferReleaseBuild : Bool := false
/--
The name of the script, executable, or library used by `lake test` when
this package is the workspace root. To point to a definition in another
package, use the syntax `<pkg>/<def>`.
A script driver will be run by `lake test` with the arguments
configured in `testDriverArgs` followed by any specified on the CLI
(e.g., via `lake test -- <args>...`). An executable driver will be built
and then run like a script. A library will just be built.
-/
testDriver : String := ""
/--
Arguments to pass to the package's test driver.
These arguments will come before those passed on the command line via
`lake test -- <args>...`.
-/
testDriverArgs : Array String := #[]
/--
The name of the script or executable used by `lake lint` when this package
is the workspace root. To point to a definition in another package, use the
syntax `<pkg>/<def>`.
A script driver will be run by `lake lint` with the arguments
configured in `lintDriverArgs` followed by any specified on the CLI
(e.g., via `lake lint -- <args>...`). An executable driver will be built
and then run like a script.
-/
lintDriver : String := ""
/--
Arguments to pass to the package's linter.
These arguments will come before those passed on the command line via
`lake lint -- <args>...`.
-/
lintDriverArgs : Array String := #[]
deriving Inhabited
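To illustrate the `<pkg>/<def>` form mentioned in the doc-strings above, a sketch with invented names (including the URL):

```lean
import Lake
open Lake DSL

-- Hypothetical root package that reuses drivers defined in a
-- dependency named `checkkit`.
package consumer where
  testDriver := "checkkit/runTests"
  lintDriver := "checkkit/runLints"

require checkkit from git
  "https://example.com/checkkit" @ "main"
```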
--------------------------------------------------------------------------------
@@ -203,8 +241,11 @@ structure Package where
defaultScripts : Array Script := #[]
/-- Post-`lake update` hooks for the package. -/
postUpdateHooks : Array (OpaquePostUpdateHook config.name) := #[]
/-- Name of the package's test runner script or executable (if any). -/
testRunner : Name := .anonymous
/-- The driver used for `lake test` when this package is the workspace root. -/
testDriver : String := config.testDriver
/-- The driver used for `lake lint` when this package is the workspace root. -/
lintDriver : String := config.lintDriver
instance : Nonempty Package :=
have : Inhabited Environment := Classical.inhabited_of_nonempty inferInstance
@@ -287,6 +328,14 @@ namespace Package
@[inline] def buildDir (self : Package) : FilePath :=
self.dir / self.config.buildDir
/-- The package's `testDriverArgs` configuration. -/
@[inline] def testDriverArgs (self : Package) : Array String :=
self.config.testDriverArgs
/-- The package's `lintDriverArgs` configuration. -/
@[inline] def lintDriverArgs (self : Package) : Array String :=
self.config.lintDriverArgs
/-- The package's `extraDepTargets` configuration. -/
@[inline] def extraDepTargets (self : Package) : Array Name :=
self.config.extraDepTargets


@@ -1,68 +1,22 @@
/-
Copyright (c) 2021 Mac Malone. All rights reserved.
Copyright (c) 2024 Mac Malone. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Mac Malone
-/
import Lake.Util.OrderedTagAttribute
import Lake.DSL.AttributesCore
open Lean
namespace Lake
initialize packageAttr : OrderedTagAttribute ←
registerOrderedTagAttribute `package "mark a definition as a Lake package configuration"
initialize packageDepAttr : OrderedTagAttribute ←
registerOrderedTagAttribute `package_dep "mark a definition as a Lake package dependency"
initialize postUpdateAttr : OrderedTagAttribute ←
registerOrderedTagAttribute `post_update "mark a definition as a Lake package post-update hook"
initialize scriptAttr : OrderedTagAttribute ←
registerOrderedTagAttribute `script "mark a definition as a Lake script"
initialize defaultScriptAttr : OrderedTagAttribute ←
registerOrderedTagAttribute `default_script "mark a Lake script as the package's default"
fun name => do
unless (← getEnv <&> (scriptAttr.hasTag · name)) do
throwError "attribute `default_script` can only be used on a `script`"
initialize leanLibAttr : OrderedTagAttribute ←
registerOrderedTagAttribute `lean_lib "mark a definition as a Lake Lean library target configuration"
initialize leanExeAttr : OrderedTagAttribute ←
registerOrderedTagAttribute `lean_exe "mark a definition as a Lake Lean executable target configuration"
initialize externLibAttr : OrderedTagAttribute ←
registerOrderedTagAttribute `extern_lib "mark a definition as a Lake external library target"
initialize targetAttr : OrderedTagAttribute ←
registerOrderedTagAttribute `target "mark a definition as a custom Lake target"
initialize defaultTargetAttr : OrderedTagAttribute ←
registerOrderedTagAttribute `default_target "mark a Lake target as the package's default"
fun name => do
let valid ← getEnv <&> fun env =>
leanLibAttr.hasTag env name ||
leanExeAttr.hasTag env name ||
externLibAttr.hasTag env name ||
targetAttr.hasTag env name
unless valid do
throwError "attribute `default_target` can only be used on a target (e.g., `lean_lib`, `lean_exe`)"
initialize testRunnerAttr : OrderedTagAttribute
registerOrderedTagAttribute `test_runner "mark a Lake script or executable as the package's test runner"
fun name => do
let valid getEnv <&> fun env =>
scriptAttr.hasTag env name ||
leanExeAttr.hasTag env name
unless valid do
throwError "attribute `test_runner` can only be used on a `script` or `lean_exe`"
initialize moduleFacetAttr : OrderedTagAttribute
registerOrderedTagAttribute `module_facet "mark a definition as a Lake module facet"
initialize packageFacetAttr : OrderedTagAttribute
registerOrderedTagAttribute `package_facet "mark a definition as a Lake package facet"
initialize libraryFacetAttr : OrderedTagAttribute
registerOrderedTagAttribute `library_facet "mark a definition as a Lake library facet"
initialize
registerBuiltinAttribute {
ref := by exact decl_name%
name := `test_runner
descr := testDriverAttr.attr.descr
add := fun decl stx attrKind => do
logWarningAt stx "@[test_runner] has been deprecated, use @[test_driver] instead"
testDriverAttr.attr.add decl stx attrKind
applicationTime := testDriverAttr.attr.applicationTime
erase := fun decl => testDriverAttr.attr.erase decl
}

View File

@@ -0,0 +1,78 @@
/-
Copyright (c) 2021 Mac Malone. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Mac Malone
-/
import Lake.Util.OrderedTagAttribute
open Lean
namespace Lake
initialize packageAttr : OrderedTagAttribute ←
registerOrderedTagAttribute `package "mark a definition as a Lake package configuration"
initialize packageDepAttr : OrderedTagAttribute ←
registerOrderedTagAttribute `package_dep "mark a definition as a Lake package dependency"
initialize postUpdateAttr : OrderedTagAttribute ←
registerOrderedTagAttribute `post_update "mark a definition as a Lake package post-update hook"
initialize scriptAttr : OrderedTagAttribute ←
registerOrderedTagAttribute `script "mark a definition as a Lake script"
initialize defaultScriptAttr : OrderedTagAttribute ←
registerOrderedTagAttribute `default_script "mark a Lake script as the package's default"
fun name => do
unless (← getEnv <&> (scriptAttr.hasTag · name)) do
throwError "attribute `default_script` can only be used on a `script`"
initialize leanLibAttr : OrderedTagAttribute ←
registerOrderedTagAttribute `lean_lib "mark a definition as a Lake Lean library target configuration"
initialize leanExeAttr : OrderedTagAttribute ←
registerOrderedTagAttribute `lean_exe "mark a definition as a Lake Lean executable target configuration"
initialize externLibAttr : OrderedTagAttribute ←
registerOrderedTagAttribute `extern_lib "mark a definition as a Lake external library target"
initialize targetAttr : OrderedTagAttribute ←
registerOrderedTagAttribute `target "mark a definition as a custom Lake target"
initialize defaultTargetAttr : OrderedTagAttribute ←
registerOrderedTagAttribute `default_target "mark a Lake target as the package's default"
fun name => do
let valid ← getEnv <&> fun env =>
leanLibAttr.hasTag env name ||
leanExeAttr.hasTag env name ||
externLibAttr.hasTag env name ||
targetAttr.hasTag env name
unless valid do
throwError "attribute `default_target` can only be used on a target (e.g., `lean_lib`, `lean_exe`)"
initialize testDriverAttr : OrderedTagAttribute ←
registerOrderedTagAttribute `test_driver "mark a Lake script, executable, or library as package's test driver"
fun name => do
let valid ← getEnv <&> fun env =>
scriptAttr.hasTag env name ||
leanExeAttr.hasTag env name ||
leanLibAttr.hasTag env name
unless valid do
throwError "attribute `test_driver` can only be used on a `script`, `lean_exe`, or `lean_lib`"
initialize lintDriverAttr : OrderedTagAttribute ←
registerOrderedTagAttribute `lint_driver "mark a Lake script or executable as package's linter"
fun name => do
let valid ← getEnv <&> fun env =>
scriptAttr.hasTag env name ||
leanExeAttr.hasTag env name
unless valid do
throwError "attribute `lint_driver` can only be used on a `script` or `lean_exe`"
initialize moduleFacetAttr : OrderedTagAttribute ←
registerOrderedTagAttribute `module_facet "mark a definition as a Lake module facet"
initialize packageFacetAttr : OrderedTagAttribute ←
registerOrderedTagAttribute `package_facet "mark a definition as a Lake package facet"
initialize libraryFacetAttr : OrderedTagAttribute ←
registerOrderedTagAttribute `library_facet "mark a definition as a Lake library facet"

View File

@@ -133,7 +133,8 @@ where
|>.insert ``externLibAttr
|>.insert ``targetAttr
|>.insert ``defaultTargetAttr
|>.insert ``testRunnerAttr
|>.insert ``testDriverAttr
|>.insert ``lintDriverAttr
|>.insert ``moduleFacetAttr
|>.insert ``packageFacetAttr
|>.insert ``libraryFacetAttr

View File

@@ -53,7 +53,7 @@ def configFileExists (cfgFile : FilePath) : BaseIO Bool :=
leanFile.pathExists <||> tomlFile.pathExists
/-- Loads a Lake package configuration (either Lean or TOML). -/
def loadPackage
def loadPackageCore
(name : String) (cfg : LoadConfig)
: LogIO (Package × Option Environment) := do
if let some ext := cfg.relConfigFile.extension then
@@ -88,7 +88,7 @@ def loadDepPackage
(dep : MaterializedDep) (leanOpts : Options) (reconfigure : Bool)
: StateT Workspace LogIO Package := fun ws => do
let name := dep.name.toString (escape := false)
let (pkg, env?) ← loadPackage name {
let (pkg, env?) ← loadPackageCore name {
lakeEnv := ws.lakeEnv
wsDir := ws.dir
relPkgDir := dep.relPkgDir
@@ -110,7 +110,7 @@ Does not resolve dependencies.
-/
def loadWorkspaceRoot (config : LoadConfig) : LogIO Workspace := do
Lean.searchPathRef.set config.lakeEnv.leanSearchPath
let (root, env?) ← loadPackage "[root]" config
let (root, env?) ← loadPackageCore "[root]" config
let ws : Workspace := {
root, lakeEnv := config.lakeEnv
moduleFacetConfigs := initModuleFacetConfigs
@@ -122,6 +122,11 @@ def loadWorkspaceRoot (config : LoadConfig) : LogIO Workspace := do
else
return ws
/-- Loads a Lake package as a single independent object (without dependencies). -/
def loadPackage (config : LoadConfig) : LogIO Package := do
Lean.searchPathRef.set config.lakeEnv.leanSearchPath
(·.1) <$> loadPackageCore "[root]" config
/-- Recursively visits a package dependency graph, avoiding cycles. -/
private def resolveDepsAcyclic
[Monad m] [MonadError m]
@@ -208,7 +213,7 @@ def Workspace.updateAndMaterialize
if depPkg.name ≠ dep.name then
if dep.name = .mkSimple "std" then
let rev :=
match dep.manifestEntry with
match dep.manifestEntry.src with
| .git (inputRev? := some rev) .. => s!" @ {repr rev}"
| _ => ""
logError (stdMismatchError depPkg.name.toString rev)
@@ -285,7 +290,7 @@ def Workspace.materializeDeps
s!"manifest out of date: {what} of dependency '{dep.name}' changed; " ++
s!"use `lake update {dep.name}` to update it"
if let .some entry := pkgEntries.find? dep.name then
match dep.src, entry with
match dep.src, entry.src with
| .git (url := url) (rev := rev) .., .git (url := url') (inputRev? := rev') .. =>
if url ≠ url' then warnOutOfDate "git url"
if rev ≠ rev' then warnOutOfDate "git revision"

View File

@@ -6,6 +6,8 @@ Authors: Mac Malone, Gabriel Ebner
import Lake.Util.Log
import Lake.Util.Name
import Lake.Util.FilePath
import Lake.Util.JsonObject
import Lake.Util.Version
import Lake.Load.Config
import Lake.Config.Workspace
@@ -18,139 +20,145 @@ to create, modify, serialize, and deserialize it.
namespace Lake
/-- Current version of the manifest format. -/
def Manifest.version : Nat := 7
/--
The current version of the manifest format.
Three-part semantic versions were introduced in `v1.0.0`.
Major version increments indicate breaking changes
(e.g., new required fields and semantic changes to existing fields).
Minor version increments (after `0.x`) indicate backwards-compatible extensions
(e.g., adding optional fields, removing fields).
Lake supports reading manifests with versions that have `-` suffixes.
When checking for version compatibility, Lake expects a manifest with version
`x.y.z-foo` to have all the features of the official manifest version `x.y.z`.
That is, Lake ignores the `-` suffix.
**VERSION HISTORY**
**v0.x.0** (versioned by a natural number)
- `1`: First version
- `2`: Adds optional `inputRev` package entry field
- `3`: Changes entry to inductive (with `path`/`git`)
- `4`: Adds required `packagesDir` manifest field
- `5`: Adds optional `inherited` package entry field (and removed `opts`)
- `6`: Adds optional package root `name` manifest field
- `7`: `type` refactor, custom to/fromJson
**v1.x.x** (versioned by a string)
- `"1.0.0"`: Switches to a semantic versioning scheme
-/
@[inline] def Manifest.version : LeanVer := v!"1.0.0"
/-- Manifest version `0.6.0` package entry. For backwards compatibility. -/
inductive PackageEntryV6
| path (name : Name) (opts : NameMap String) (inherited : Bool) (dir : FilePath)
| git (name : Name) (opts : NameMap String) (inherited : Bool) (url : String) (rev : String)
(inputRev? : Option String) (subDir? : Option FilePath)
deriving FromJson, ToJson, Inhabited
/-- An entry for a package stored in the manifest. -/
inductive PackageEntry
/--
The package source for an entry in the manifest.
Describes exactly how Lake should materialize the package.
-/
inductive PackageEntrySrc
/--
A local filesystem package. `dir` is relative to the package directory
of the package containing the manifest.
-/
| path
(name : Name)
(inherited : Bool)
(configFile : FilePath)
(manifestFile? : Option FilePath)
(dir : FilePath)
/-- A remote Git package. -/
| git
(name : Name)
(inherited : Bool)
(configFile : FilePath)
(manifestFile? : Option FilePath)
(url : String)
(rev : String)
(inputRev? : Option String)
(subDir? : Option FilePath)
deriving Inhabited
/-- An entry for a package stored in the manifest. -/
structure PackageEntry where
name : Name
inherited : Bool
configFile : FilePath := defaultConfigFile
manifestFile? : Option FilePath := none
src : PackageEntrySrc
deriving Inhabited
namespace PackageEntry
protected def toJson : PackageEntry → Json
| .path name inherited configFile manifestFile? dir => Json.mkObj [
("type", "path"),
("inherited", toJson inherited),
("name", toJson name),
("configFile" , toJson configFile),
("manifestFile", toJson manifestFile?),
("inherited", toJson inherited),
("dir", toJson dir)
]
| .git name inherited configFile manifestFile? url rev inputRev? subDir? => Json.mkObj [
("type", "git"),
("inherited", toJson inherited),
("name", toJson name),
("configFile" , toJson configFile),
("manifestFile", toJson manifestFile?),
("url", toJson url),
("rev", toJson rev),
("inputRev", toJson inputRev?),
("subDir", toJson subDir?)
]
protected def toJson (entry : PackageEntry) : Json :=
let fields := [
("name", toJson entry.name),
("configFile" , toJson entry.configFile),
("manifestFile", toJson entry.manifestFile?),
("inherited", toJson entry.inherited),
]
let fields :=
match entry.src with
| .path dir =>
("type", "path") :: fields.append [
("dir", toJson dir),
]
| .git url rev inputRev? subDir? =>
("type", "git") :: fields.append [
("url", toJson url),
("rev", toJson rev),
("inputRev", toJson inputRev?),
("subDir", toJson subDir?),
]
Json.mkObj fields
instance : ToJson PackageEntry := PackageEntry.toJson
protected def fromJson? (json : Json) : Except String PackageEntry := do
let obj ← json.getObj?
let type ← get obj "type"
let name ← get obj "name"
let inherited ← get obj "inherited"
let configFile ← getD obj "configFile" defaultConfigFile
let manifestFile ← getD obj "manifestFile" defaultManifestFile
match type with
| "path" =>
let dir ← get obj "dir"
return .path name inherited configFile manifestFile dir
| "git" =>
let url ← get obj "url"
let rev ← get obj "rev"
let inputRev? ← get? obj "inputRev"
let subDir? ← get? obj "subDir"
return .git name inherited configFile manifestFile url rev inputRev? subDir?
| _ =>
throw s!"unknown package entry type '{type}'"
where
get {α} [FromJson α] obj prop : Except String α :=
match obj.find compare prop with
| none => throw s!"package entry missing required property '{prop}'"
| some val => fromJson? val |>.mapError (s!"in package entry property '{prop}': {·}")
get? {α} [FromJson α] obj prop : Except String (Option α) :=
match obj.find compare prop with
| none => pure none
| some val => fromJson? val |>.mapError (s!"in package entry property '{prop}': {·}")
getD {α} [FromJson α] obj prop (default : α) : Except String α :=
(Option.getD · default) <$> get? obj prop
let obj ← JsonObject.fromJson? json |>.mapError (s!"package entry: {·}")
let name ← obj.get "name" |>.mapError (s!"package entry: {·}")
try
let type ← obj.get "type"
let inherited ← obj.get "inherited"
let configFile ← obj.getD "configFile" defaultConfigFile
let manifestFile ← obj.getD "manifestFile" defaultManifestFile
let src : PackageEntrySrc ← id do
match type with
| "path" =>
let dir ← obj.get "dir"
return .path dir
| "git" =>
let url ← obj.get "url"
let rev ← obj.get "rev"
let inputRev? ← obj.get? "inputRev"
let subDir? ← obj.get? "subDir"
return .git url rev inputRev? subDir?
| _ =>
throw s!"unknown package entry type '{type}'"
return {
name, inherited,
configFile, manifestFile? := manifestFile, src
: PackageEntry
}
catch e =>
throw s!"package entry '{name}': {e}"
instance : FromJson PackageEntry := PackageEntry.fromJson?
@[inline] protected def name : PackageEntry → Name
| .path (name := name) .. | .git (name := name) .. => name
@[inline] def setInherited (entry : PackageEntry) : PackageEntry :=
{entry with inherited := true}
@[inline] protected def inherited : PackageEntry → Bool
| .path (inherited := inherited) .. | .git (inherited := inherited) .. => inherited
@[inline] def setConfigFile (path : FilePath) (entry : PackageEntry) : PackageEntry :=
{entry with configFile := path}
def setInherited : PackageEntry → PackageEntry
| .path name _ configFile manifestFile? dir =>
.path name true configFile manifestFile? dir
| .git name _ configFile manifestFile? url rev inputRev? subDir? =>
.git name true configFile manifestFile? url rev inputRev? subDir?
@[inline] def setManifestFile (path? : Option FilePath) (entry : PackageEntry) : PackageEntry :=
{entry with manifestFile? := path?}
@[inline] protected def configFile : PackageEntry → FilePath
| .path (configFile := configFile) .. | .git (configFile := configFile) .. => configFile
@[inline] protected def manifestFile? : PackageEntry → Option FilePath
| .path (manifestFile? := manifestFile?) .. | .git (manifestFile? := manifestFile?) .. => manifestFile?
def setConfigFile (path : FilePath) : PackageEntry → PackageEntry
| .path name inherited _ manifestFile? dir =>
.path name inherited path manifestFile? dir
| .git name inherited _ manifestFile? url rev inputRev? subDir? =>
.git name inherited path manifestFile? url rev inputRev? subDir?
def setManifestFile (path? : Option FilePath) : PackageEntry → PackageEntry
| .path name inherited configFile _ dir =>
.path name inherited configFile path? dir
| .git name inherited configFile _ url rev inputRev? subDir? =>
.git name inherited configFile path? url rev inputRev? subDir?
def inDirectory (pkgDir : FilePath) : PackageEntry → PackageEntry
| .path name inherited configFile manifestFile? dir =>
.path name inherited configFile manifestFile? (pkgDir / dir)
| entry => entry
@[inline] def inDirectory (pkgDir : FilePath) (entry : PackageEntry) : PackageEntry :=
{entry with src := match entry.src with | .path dir => .path (pkgDir / dir) | s => s}
def ofV6 : PackageEntryV6 → PackageEntry
| .path name _opts inherited dir =>
.path name inherited defaultConfigFile none dir
{name, inherited, src := .path dir}
| .git name _opts inherited url rev inputRev? subDir? =>
.git name inherited defaultConfigFile none url rev inputRev? subDir?
{name, inherited, src := .git url rev inputRev? subDir?}
end PackageEntry
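With the refactor above, the shared metadata (`name`, `inherited`, `configFile`, `manifestFile?`) lives on the structure and only the source-specific fields move into `src`. A path entry can then be built like this (a sketch with illustrative values, assuming Lake's definitions are in scope):

```lean
open Lake in
example : PackageEntry :=
  { name := `mathlib                     -- illustrative dependency name
    inherited := false                   -- a direct (non-transitive) dependency
    src := .path ⟨"packages/mathlib"⟩ }  -- `configFile`/`manifestFile?` keep their defaults
```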
@@ -169,50 +177,38 @@ def addPackage (entry : PackageEntry) (self : Manifest) : Manifest :=
protected def toJson (self : Manifest) : Json :=
Json.mkObj [
("version", version),
("version", toJson version),
("name", toJson self.name),
("lakeDir", toJson self.lakeDir),
("packagesDir", toJson self.packagesDir?),
("packages", toJson self.packages)
("packages", toJson self.packages),
]
instance : ToJson Manifest := Manifest.toJson
protected def fromJson? (json : Json) : Except String Manifest := do
let .ok obj := json.getObj?
| throw "manifest not a JSON object"
let ver : Json ← get obj "version"
let .ok ver := ver.getNat?
| throw s!"unknown manifest version '{ver}'"
if ver < 5 then
let obj ← JsonObject.fromJson? json
let ver : SemVerCore ←
match (← obj.get "version" : Json) with
| (n : Nat) => pure {minor := n}
| (s : String) => LeanVer.parse s
| ver => throw s!"unknown manifest version format '{ver}'; \
you may need to update your 'lean-toolchain'"
if ver.major > 1 then
throw s!"manifest version '{ver}' is of a higher major version than this \
Lake's '{Manifest.version}'; you may need to update your 'lean-toolchain'"
else if ver < v!"0.5.0" then
throw s!"incompatible manifest version '{ver}'"
else if ver ≤ 6 then
let name ← getD obj "name" Name.anonymous
let lakeDir ← getD obj "lakeDir" defaultLakeDir
let packagesDir? ← get? obj "packagesDir"
let pkgs : Array PackageEntryV6 ← getD obj "packages" #[]
return {name, lakeDir, packagesDir?, packages := pkgs.map PackageEntry.ofV6}
else if ver = 7 then
let name ← getD obj "name" Name.anonymous
let lakeDir ← get obj "lakeDir"
let packagesDir ← get obj "packagesDir"
let packages : Array PackageEntry ← getD obj "packages" #[]
return {name, lakeDir, packagesDir? := packagesDir, packages}
else
throw <|
s!"manifest version `{ver}` is higher than this Lake's '{Manifest.version}'; " ++
"you may need to update your `lean-toolchain`"
where
get {α} [FromJson α] obj prop : Except String α :=
match obj.find compare prop with
| none => throw s!"manifest missing required property '{prop}'"
| some val => fromJson? val |>.mapError (s!"in manifest property '{prop}': {·}")
get? {α} [FromJson α] obj prop : Except String (Option α) :=
match obj.find compare prop with
| none => pure none
| some val => fromJson? val |>.mapError (s!"in manifest property '{prop}': {·}")
getD {α} [FromJson α] obj prop (default : α) : Except String α :=
(Option.getD · default) <$> get? obj prop
let name ← obj.getD "name" Name.anonymous
let lakeDir ← obj.getD "lakeDir" defaultLakeDir
let packagesDir? ← obj.get? "packagesDir"
let packages ←
if ver < v!"0.7.0" then
(·.map PackageEntry.ofV6) <$> obj.getD "packages" #[]
else
obj.getD "packages" #[]
return {name, lakeDir, packagesDir?, packages}
instance : FromJson Manifest := Manifest.fromJson?

View File

@@ -113,7 +113,7 @@ def Dependency.materialize (dep : Dependency) (inherited : Bool)
return {
relPkgDir
remoteUrl? := none
manifestEntry := .path dep.name inherited defaultConfigFile none relPkgDir
manifestEntry := mkEntry <| .path relPkgDir
configDep := dep
}
| .git url inputRev? subDir? => do
@@ -127,9 +127,11 @@ def Dependency.materialize (dep : Dependency) (inherited : Bool)
return {
relPkgDir
remoteUrl? := Git.filterUrl? url
manifestEntry := .git dep.name inherited defaultConfigFile none url rev inputRev? subDir?
manifestEntry := mkEntry <| .git url rev inputRev? subDir?
configDep := dep
}
where
mkEntry src : PackageEntry := {name := dep.name, inherited, src}
/--
Materializes a manifest package entry, cloning and/or checking it out as necessary.
@@ -137,7 +139,7 @@ Materializes a manifest package entry, cloning and/or checking it out as necessa
def PackageEntry.materialize (manifestEntry : PackageEntry)
(configDep : Dependency) (wsDir relPkgsDir : FilePath) (pkgUrlMap : NameMap String)
: LogIO MaterializedDep :=
match manifestEntry with
match manifestEntry.src with
| .path (dir := relPkgDir) .. =>
return {
relPkgDir
@@ -145,8 +147,8 @@ def PackageEntry.materialize (manifestEntry : PackageEntry)
manifestEntry
configDep
}
| .git name (url := url) (rev := rev) (subDir? := subDir?) .. => do
let sname := name.toString (escape := false)
| .git (url := url) (rev := rev) (subDir? := subDir?) .. => do
let sname := manifestEntry.name.toString (escape := false)
let relGitDir := relPkgsDir / sname
let gitDir := wsDir / relGitDir
let repo := GitRepo.mk gitDir
@@ -161,10 +163,10 @@ def PackageEntry.materialize (manifestEntry : PackageEntry)
if (← repo.hasDiff) then
logWarning s!"{sname}: repository '{repo.dir}' has local changes"
else
let url := pkgUrlMap.find? name |>.getD url
let url := pkgUrlMap.find? manifestEntry.name |>.getD url
updateGitRepo sname repo url rev
else
let url := pkgUrlMap.find? name |>.getD url
let url := pkgUrlMap.find? manifestEntry.name |>.getD url
cloneGitPkg sname repo url rev
let relPkgDir := match subDir? with | .some subDir => relGitDir / subDir | .none => relGitDir
return {

View File

@@ -109,14 +109,28 @@ def Package.loadFromEnv
| .error e => error e
let depConfigs ← IO.ofExcept <| packageDepAttr.getAllEntries env |>.mapM fun name =>
evalConstCheck env opts Dependency ``Dependency name
let testRunners := testRunnerAttr.getAllEntries env
let testRunner ←
if testRunners.size > 1 then
error s!"{self.name}: only one script or executable can be tagged `@[test_runner]`"
else if h : testRunners.size > 0 then
pure (testRunners[0]'h)
let testDrivers := testDriverAttr.getAllEntries env
let testDriver ←
if testDrivers.size > 1 then
error s!"{self.name}: only one script, executable, or library can be tagged @[test_driver]"
else if h : testDrivers.size > 0 then
if self.config.testDriver.isEmpty then
pure (testDrivers[0]'h |>.toString)
else
error s!"{self.name}: cannot both set testDriver and use @[test_driver]"
else
pure .anonymous
pure self.config.testDriver
let lintDrivers := lintDriverAttr.getAllEntries env
let lintDriver ←
if lintDrivers.size > 1 then
error s!"{self.name}: only one script or executable can be tagged @[linter]"
else if h : lintDrivers.size > 0 then
if self.config.lintDriver.isEmpty then
pure (lintDrivers[0]'h |>.toString)
else
error s!"{self.name}: cannot both set linter and use @[linter]"
else
pure self.config.lintDriver
-- Deprecation warnings
unless self.config.manifestFile.isNone do
@@ -127,8 +141,8 @@ def Package.loadFromEnv
-- Fill in the Package
return {self with
depConfigs, leanLibConfigs, leanExeConfigs, externLibConfigs
opaqueTargetConfigs, defaultTargets, scripts, defaultScripts, testRunner
postUpdateHooks
opaqueTargetConfigs, defaultTargets, scripts, defaultScripts
testDriver, lintDriver, postUpdateHooks
}
/--

View File

@@ -175,12 +175,19 @@ protected def PackageConfig.decodeToml (t : Table) (ref := Syntax.missing) : Exc
let releaseRepo ← t.tryDecode? `releaseRepo
let buildArchive? ← t.tryDecode? `buildArchive
let preferReleaseBuild ← t.tryDecodeD `preferReleaseBuild false
let testRunner ← t.tryDecodeD `testRunner ""
let testDriver ← t.tryDecodeD `testDriver ""
let testDriver := if ¬testRunner.isEmpty ∧ testDriver.isEmpty then testRunner else testDriver
let testDriverArgs ← t.tryDecodeD `testDriverArgs #[]
let lintDriver ← t.tryDecodeD `lintDriver ""
let lintDriverArgs ← t.tryDecodeD `lintDriverArgs #[]
let toLeanConfig ← tryDecode <| LeanConfig.decodeToml t
let toWorkspaceConfig ← tryDecode <| WorkspaceConfig.decodeToml t
return {
name, precompileModules, moreGlobalServerArgs,
srcDir, buildDir, leanLibDir, nativeLibDir, binDir, irDir,
releaseRepo, buildArchive?, preferReleaseBuild
testDriver, testDriverArgs, lintDriver, lintDriverArgs
toLeanConfig, toWorkspaceConfig
}
@@ -259,12 +266,11 @@ def loadTomlConfig (dir relDir relConfigFile : FilePath) : LogIO Package := do
let leanLibConfigs ← mkRBArray (·.name) <$> table.tryDecodeD `lean_lib #[]
let leanExeConfigs ← mkRBArray (·.name) <$> table.tryDecodeD `lean_exe #[]
let defaultTargets ← table.tryDecodeD `defaultTargets #[]
let testRunner ← table.tryDecodeD `testRunner .anonymous
let depConfigs ← table.tryDecodeD `require #[]
return {
dir, relDir, relConfigFile
config, depConfigs, leanLibConfigs, leanExeConfigs
defaultTargets, testRunner
defaultTargets
}
if errs.isEmpty then
return pkg

View File

@@ -43,8 +43,6 @@ namespace Toml.Table
@[inline] nonrec def insert [enc : ToToml α] (k : Name) (v : α) (t : Table) : Table :=
t.insert k (enc.toToml v)
instance (priority := low) [ToToml α] : SmartInsert α := Table.insert
/-- If the value is not `none`, inserts the encoded value into the table. -/
@[inline] nonrec def insertSome [enc : ToToml α] (k : Name) (v? : Option α) (t : Table) : Table :=
t.insertSome k (v?.map enc.toToml)
@@ -61,6 +59,9 @@ instance : SmartInsert Table where
instance [ToToml (Array α)] : SmartInsert (Array α) where
smartInsert k v t := t.insertUnless v.isEmpty k (toToml v)
instance : SmartInsert String where
smartInsert k v t := t.insertUnless v.isEmpty k (toToml v)
/-- Insert a value into the table if `p` is `true`. -/
@[inline] nonrec def insertIf [enc : ToToml α] (p : Bool) (k : Name) (v : α) (t : Table) : Table :=
t.insertIf p k (enc.toToml v)

View File

@@ -0,0 +1,11 @@
/-
Copyright (c) 2021 Mac Malone. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Mac Malone
-/
namespace Lake
/-- Creates any missing parent directories of `path`. -/
@[inline] def createParentDirs (path : System.FilePath) : IO Unit := do
if let some dir := path.parent then IO.FS.createDirAll dir
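A typical use of this helper, sketched assuming `Lake.createParentDirs` is in scope (the log-file path is illustrative, not part of Lake):

```lean
open Lake in
def saveBuildLog (contents : String) : IO Unit := do
  let path : System.FilePath := ⟨".lake/logs/build.log"⟩ -- illustrative path
  createParentDirs path -- creates `.lake/logs` if it does not exist yet
  IO.FS.writeFile path contents
```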

View File

@@ -0,0 +1,50 @@
/-
Copyright (c) 2024 Mac Malone. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Mac Malone
-/
import Lean.Data.Json
open Lean
/-! # JSON Object
Defines a convenient wrapper type for JSON object data that makes
indexing fields more convenient.
-/
namespace Lake
/-- A JSON object (`Json.obj` data). -/
abbrev JsonObject :=
RBNode String (fun _ => Json)
namespace JsonObject
@[inline] protected def toJson (obj : JsonObject) : Json :=
.obj obj
instance : ToJson JsonObject := JsonObject.toJson
@[inline] protected def fromJson? (json : Json) : Except String JsonObject :=
json.getObj?
instance : FromJson JsonObject := JsonObject.fromJson?
@[inline] nonrec def erase (obj : JsonObject) (prop : String) : JsonObject :=
obj.erase compare prop
@[inline] def getJson? (obj : JsonObject) (prop : String) : Option Json :=
obj.find compare prop
@[inline] def get [FromJson α] (obj : JsonObject) (prop : String) : Except String α :=
match obj.getJson? prop with
| none => throw s!"property not found: {prop}"
| some val => fromJson? val |>.mapError (s!"{prop}: {·}")
@[inline] def get? [FromJson α] (obj : JsonObject) (prop : String) : Except String (Option α) :=
match obj.getJson? prop with
| none => pure none
| some val => fromJson? val |>.mapError (s!"{prop}: {·}")
@[inline] def getD [FromJson α] (obj : JsonObject) (prop : String) (default : α) : Except String α :=
(Option.getD · default) <$> obj.get? prop
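As a usage sketch (assuming the definitions above are in scope), a field can be decoded with a fallback via `getD`; `readEntryName` here is a hypothetical helper:

```lean
open Lake Lean in
def readEntryName (s : String) : Except String String := do
  let json ← Json.parse s            -- parse the raw JSON text
  let obj : JsonObject ← fromJson? json
  obj.getD "name" "unnamed"          -- fall back when `name` is absent
```

A missing `name` property yields the default rather than an error, while a present but mistyped one still fails with a property-prefixed message.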

View File

@@ -0,0 +1,152 @@
/-
Copyright (c) 2024 Mac Malone. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Mac Malone
-/
import Lean.Elab.Eval
/-! # Version
This module contains useful definitions for manipulating versions.
It also defines a `v!"<ver>"` syntax for version literals.
-/
open Lean
namespace Lake
/-- The major-minor-patch triple of a [SemVer](https://semver.org/). -/
structure SemVerCore where
major : Nat := 0
minor : Nat := 0
patch : Nat := 0
deriving Inhabited, Repr, DecidableEq, Ord
instance : LT SemVerCore := ltOfOrd
instance : LE SemVerCore := leOfOrd
instance : Min SemVerCore := minOfLe
instance : Max SemVerCore := maxOfLe
def SemVerCore.parse (ver : String) : Except String SemVerCore := do
try
match ver.split (· == '.') with
| [major, minor, patch] =>
let parseNat (v : String) (what : String) := do
let some v := v.toNat?
| throw s!"expect numeral {what} version, got '{v}'"
return v
return {
major := ← parseNat major "major"
minor := ← parseNat minor "minor"
patch := ← parseNat patch "patch"
}
| ps => throw s!"incorrect number of components: got {ps.length}, expected 3"
catch e =>
throw s!"invalid version core: {e}"
protected def SemVerCore.toString (ver : SemVerCore) : String :=
s!"{ver.major}.{ver.minor}.{ver.patch}"
instance : ToString SemVerCore := SemVerCore.toString
instance : ToJson SemVerCore := ⟨(·.toString)⟩
instance : FromJson SemVerCore := ⟨(do SemVerCore.parse <| ← fromJson? ·)⟩
instance : ToExpr SemVerCore where
toExpr ver := mkAppN (mkConst ``SemVerCore.mk)
#[toExpr ver.major, toExpr ver.minor, toExpr ver.patch]
toTypeExpr := mkConst ``SemVerCore
/--
A Lean-style semantic version.
A major-minor-patch triple with an optional arbitrary `-` suffix.
-/
structure LeanVer extends SemVerCore where
specialDescr : String := ""
deriving Inhabited, Repr, DecidableEq
instance : Coe LeanVer SemVerCore := LeanVer.toSemVerCore
@[inline] protected def LeanVer.ofSemVerCore (ver : SemVerCore) : LeanVer :=
{toSemVerCore := ver, specialDescr := ""}
instance : Coe SemVerCore LeanVer := LeanVer.ofSemVerCore
protected def LeanVer.compare (a b : LeanVer) : Ordering :=
match compare a.toSemVerCore b.toSemVerCore with
| .eq =>
match a.specialDescr, b.specialDescr with
| "", "" => .eq
| _, "" => .lt
| "", _ => .gt
| a, b => compare a b
| ord => ord
instance : Ord LeanVer := LeanVer.compare
instance : LT LeanVer := ltOfOrd
instance : LE LeanVer := leOfOrd
instance : Min LeanVer := minOfLe
instance : Max LeanVer := maxOfLe
def LeanVer.parse (ver : String) : Except String LeanVer := do
let sepPos := ver.find (· == '-')
if h : ver.atEnd sepPos then
SemVerCore.parse ver
else
let core ← SemVerCore.parse <| ver.extract 0 sepPos
let specialDescr := ver.extract (ver.next' sepPos h) ver.endPos
if specialDescr.isEmpty then
throw "invalid Lean version: '-' suffix cannot be empty"
return {toSemVerCore := core, specialDescr}
protected def LeanVer.toString (ver : LeanVer) : String :=
if ver.specialDescr.isEmpty then
ver.toSemVerCore.toString
else
s!"{ver.toSemVerCore}-{ver.specialDescr}"
instance : ToString LeanVer := LeanVer.toString
instance : ToJson LeanVer := ⟨(·.toString)⟩
instance : FromJson LeanVer := ⟨(do LeanVer.parse <| ← fromJson? ·)⟩
instance : ToExpr LeanVer where
toExpr ver := mkAppN (mkConst ``LeanVer.mk)
#[toExpr ver.toSemVerCore, toExpr ver.specialDescr]
toTypeExpr := mkConst ``LeanVer
/-! ## Version Literals
Defines the `v!"<ver>"` syntax for version literals.
The elaborator attempts to synthesize an instance of `ToVersion` for the
expected type and then applies it to the string literal.
-/
open Elab Term Meta
/-- Parses a version from a string. -/
class DecodeVersion (α : Type u) where
decodeVersion : String → Except String α
export DecodeVersion (decodeVersion)
instance : DecodeVersion SemVerCore := SemVerCore.parse
instance : DecodeVersion LeanVer := LeanVer.parse
private def toResultExpr [ToExpr α] (x : Except String α) : Except String Expr :=
Functor.map toExpr x
/-- A Lake version literal. -/
scoped syntax:max (name := verLit) "v!" noWs interpolatedStr(term) : term
@[term_elab verLit] def elabVerLit : TermElab := fun stx expectedType? => do
let `(v!$v) := stx | throwUnsupportedSyntax
tryPostponeIfNoneOrMVar expectedType?
let some expectedType := expectedType?
| throwError "expected type is not known"
let ofVerT? ← mkAppM ``Except #[mkConst ``String, expectedType]
let ofVerE ← elabTermEnsuringType (← ``(decodeVersion s!$v)) ofVerT?
let resT ← mkAppM ``Except #[mkConst ``String, mkConst ``Expr]
let resE ← mkAppM ``toResultExpr #[ofVerE]
match (← unsafe evalExpr (Except String Expr) resT resE) with
| .ok ver => return ver
| .error e => throwError e
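For example, with the `Lake` namespace opened so the scoped `v!` syntax is available, version literals are checked during elaboration (a sketch):

```lean
open Lake

-- A well-formed literal elaborates to a value of the expected type:
example : LeanVer := v!"1.0.0-rc1"
example : SemVerCore := v!"0.7.0"

-- A malformed literal such as v!"1.0" is rejected at compile time,
-- because `decodeVersion` runs while the literal is elaborated.
```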

View File

@@ -15,6 +15,7 @@ Each `lakefile.lean` includes a `package` declaration (akin to `main`) which def
* [Package Configuration Options](#package-configuration-options)
+ [Layout](#layout)
+ [Build & Run](#build--run)
+ [Test & Lint](#test--lint)
+ [Cloud Releases](#cloud-releases)
* [Defining Build Targets](#defining-build-targets)
+ [Lean Libraries](#lean-libraries)
@@ -164,8 +165,10 @@ Lake provides a large assortment of configuration options for packages.
### Layout
These options control the top-level directory layout of the package and its build directory. Further paths specified by libraries, executables, and targets within the package are relative to these directories.
* `packagesDir`: The directory to which Lake should download remote dependencies. Defaults to `.lake/packages`.
* `srcDir`: The directory containing the package's Lean source files. Defaults to the package's directory. (This will be passed to `lean` as the `-R` option.)
* `srcDir`: The directory containing the package's Lean source files. Defaults to the package's directory.
* `buildDir`: The directory to which Lake should output the package's build results. Defaults to `build`.
* `leanLibDir`: The build subdirectory to which Lake should output the package's binary Lean libraries (e.g., `.olean`, `.ilean` files). Defaults to `lib`.
* `nativeLibDir`: The build subdirectory to which Lake should output the package's native libraries (e.g., `.a`, `.so`, `.dll` files). Defaults to `lib`.
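
The layout options above can be set directly in a `lakefile.lean`; a minimal sketch (the package name `demo` and the chosen directories are illustrative):

```lean
import Lake
open Lake DSL

package demo where
  packagesDir := ".lake/packages" -- where remote dependencies are cloned
  srcDir := "src"                 -- Lean sources live under ./src
  buildDir := "out"               -- build results go to ./out
  leanLibDir := "lib"             -- subdirectory for .olean/.ilean files
```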
@@ -174,6 +177,8 @@ Lake provides a large assortment of configuration options for packages.
### Build & Run
These options configure how code is built and run in the package. Libraries, executables, and other targets within a package can further add to parts of this configuration.
* `platformIndependent`: Asserts whether Lake should assume Lean modules are platform-independent. That is, whether Lake should include the platform and platform-dependent elements in a module's trace. See the docstring of `Lake.LeanConfig.platformIndependent` for more details. Defaults to `none`.
* `precompileModules`: Whether to compile each module into a native shared library that is loaded whenever the module is imported. This speeds up the evaluation of metaprograms and enables the interpreter to run functions marked `@[extern]`. Defaults to `false`.
* `moreServerOptions`: An `Array` of additional options to pass to the Lean language server (i.e., `lean --server`) launched by `lake serve`.
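
As a sketch, the build & run options above combine in a package declaration like so (option names are taken from the list above; the package name `demo` and the option values are illustrative):

```lean
import Lake
open Lake DSL

package demo where
  platformIndependent := some true -- treat module traces as platform-independent
  precompileModules := true        -- build a native shared library per module
  moreServerOptions := #[⟨`pp.unicode.fun, true⟩] -- extra `lean --server` options
```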
@@ -188,8 +193,21 @@ Lake provides a large assortment of configuration options for packages.
* `weakLinkArgs`: An `Array` of additional arguments to pass to `leanc` when linking (e.g., binary executables or shared libraries). Unlike `moreLinkArgs`, these arguments do not affect the trace of the build result, so they can be changed without triggering a rebuild. They come *before* `moreLinkArgs`.
* `extraDepTargets`: An `Array` of [target](#custom-targets) names that the package should always build before anything else.
### Test & Lint
The CLI commands `lake test` and `lake lint` use definitions configured by the workspace's root package to perform testing and linting (this is referred to as the test or lint *driver*). In Lean configuration files, these can be specified by applying the `@[test_driver]` or `@[lint_driver]` attribute to a `script`, `lean_exe`, or `lean_lib`. They can also be configured (in Lean or TOML format) via the following options on the package.
* `testDriver`: The name of the script, executable, or library to drive `lake test`.
* `testDriverArgs`: An `Array` of arguments to pass to the package's test driver.
* `lintDriver`: The name of the script or executable used by `lake lint`. Libraries cannot be lint drivers.
* `lintDriverArgs`: An `Array` of arguments to pass to the package's lint driver.
You can specify a definition from a dependency as a package's test or lint driver by using the syntax `<pkg>/<name>`. An executable driver will be built and then run, a script driver will just be run, and a library driver will just be built. A script or executable driver is run with any arguments configured by the package (e.g., via `testDriverArgs`) followed by any specified on the CLI (e.g., via `lake lint -- <args>...`).
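
Putting the above together, a root package can mark one script as both drivers via the attribute form (the package and script names here are illustrative):

```lean
import Lake
open Lake DSL

package demo

-- A single script serving as both the test and the lint driver.
@[test_driver, lint_driver]
script driver args do
  IO.println s!"driver args: {args}"
  return 0
```

Running `lake test -- hello` would then invoke `driver` with any configured arguments followed by `hello`.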
### Cloud Releases
These options define a cloud release for the package. See the section on [GitHub Release Builds](#github-release-builds) for more information.
* `releaseRepo`: The URL of the GitHub repository to upload and download releases of this package. If `none` (the default), for downloads, Lake uses the URL the package was downloaded from (if it is a dependency) and, for uploads, uses `gh`'s default.
* `buildArchive`: The name of the build archive for the GitHub cloud release.
Defaults to `{(pkg-)name}-{System.Platform.target}.tar.gz`.
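
As a sketch, a package can declare its cloud release via these options (the repository URL and fixed archive name are illustrative placeholders):

```lean
import Lake
open Lake DSL

package demo where
  releaseRepo := "https://github.com/user/demo" -- where releases are fetched/uploaded
  buildArchive := "demo-linux.tar.gz"           -- overrides the default archive name
```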


@@ -1,4 +1,4 @@
{"version": 7,
{"version": "1.0.0",
"packagesDir": ".lake/packages",
"packages":
[{"type": "path",


@@ -1,4 +1,5 @@
set -ex
#!/usr/bin/env bash
set -euxo pipefail
LAKE=${LAKE:-../../.lake/build/bin/lake}
@@ -15,6 +16,16 @@ $LAKE -q build hello 2>&1 | diff - /dev/null
# Related: https://github.com/leanprover/lean4/issues/2549
test -f lake-manifest.json
# Test pack/unpack
$LAKE pack .lake/build.tgz 2>&1 | grep --color "packing"
rm -rf .lake/build
$LAKE unpack .lake/build.tgz 2>&1 | grep --color "unpacking"
.lake/build/bin/hello
$LAKE pack 2>&1 | grep --color "packing"
rm -rf .lake/build
$LAKE unpack 2>&1 | grep --color "unpacking"
.lake/build/bin/hello
./clean.sh
$LAKE -f lakefile.toml exe hello


@@ -0,0 +1,6 @@
import Lake
open Lake DSL
package test where
testDriver := "dep/driver/foo"
lintDriver := "dep/driver/foo"


@@ -0,0 +1,3 @@
name = "test"
testDriver = "dep/driver/foo"
lintDriver = "dep/driver/foo"


@@ -0,0 +1,6 @@
import Lake
open Lake DSL
package test where
testDriver := "dep/driver"
lintDriver := "dep/driver"


@@ -0,0 +1,3 @@
name = "test"
testDriver = "dep/driver"
lintDriver = "dep/driver"


@@ -0,0 +1,8 @@
import Lake
open Lake DSL
package test where
testDriver := "dep/driver"
lintDriver := "dep/driver"
require dep from "dep"


@@ -0,0 +1,7 @@
name = "test"
testDriver = "dep/driver"
lintDriver = "dep/driver"
[[require]]
name = "dep"
path = "dep"


@@ -0,0 +1,9 @@
import Lake
open System Lake DSL
package dep
@[test_driver, lint_driver]
script driver args do
IO.println s!"dep: {args}"
return 0


@@ -0,0 +1,7 @@
import Lake
open System Lake DSL
package test
@[test_driver, lint_driver]
lean_exe driver


@@ -0,0 +1,6 @@
name = "test"
testDriver = "driver"
lintDriver = "driver"
[[lean_exe]]
name = "driver"


@@ -0,0 +1,7 @@
import Lake
open Lake DSL
package test
@[test_driver]
lean_lib Test


@@ -0,0 +1,5 @@
name = "test"
testDriver = "Test"
[[lean_lib]]
name = "Test"


@@ -4,4 +4,4 @@ open System Lake DSL
package test
@[test_runner]
lean_exe test
lean_exe driver


@@ -0,0 +1,5 @@
name = "test"
testRunner = "driver"
[[lean_exe]]
name = "driver"


@@ -3,7 +3,7 @@ open System Lake DSL
package test
@[test_runner]
script test args do
@[test_driver, lint_driver]
script driver args do
IO.println s!"script: {args}"
return 0

src/lake/tests/driver/test.sh Executable file

@@ -0,0 +1,86 @@
#!/usr/bin/env bash
set -euxo pipefail
LAKE=${LAKE:-../../.lake/build/bin/lake}
./clean.sh
# Script driver
$LAKE -f script.lean test | grep --color -F "script: []"
$LAKE -f script.lean lint | grep --color -F "script: []"
$LAKE -f script.lean test -- hello | grep --color "hello"
$LAKE -f script.lean lint -- hello | grep --color "hello"
# Executable driver
$LAKE -f exe.lean test | grep --color -F "exe: []"
$LAKE -f exe.toml test | grep --color -F "exe: []"
$LAKE -f exe.lean lint | grep --color -F "exe: []"
$LAKE -f exe.toml lint | grep --color -F "exe: []"
$LAKE -f exe.lean test -- hello | grep --color "hello"
$LAKE -f exe.toml test -- hello | grep --color "hello"
$LAKE -f exe.lean lint -- hello | grep --color "hello"
$LAKE -f exe.toml lint -- hello | grep --color "hello"
# Library test driver
rm -f .lake/build/lib/Test.olean
$LAKE -f lib.lean test | grep --color -F "Built Test"
rm -f .lake/build/lib/Test.olean
$LAKE -f lib.toml test | grep --color -F "Built Test"
($LAKE -f lib.lean test -- hello 2>&1 && exit 1 || true) | grep --color "arguments cannot be passed"
($LAKE -f lib.toml test -- hello 2>&1 && exit 1 || true) | grep --color "arguments cannot be passed"
# Upstream driver
rm -f lake-manifest.json
$LAKE -f dep.lean test | grep --color -F "dep: []"
$LAKE -f dep.toml test | grep --color -F "dep: []"
$LAKE -f dep.lean lint | grep --color -F "dep: []"
$LAKE -f dep.toml lint | grep --color -F "dep: []"
$LAKE -f dep.lean test -- hello | grep --color "hello"
$LAKE -f dep.toml test -- hello | grep --color "hello"
$LAKE -f dep.lean lint -- hello | grep --color "hello"
$LAKE -f dep.toml lint -- hello | grep --color "hello"
# Test runner
$LAKE -f runner.lean test 2>&1 | grep --color -F " @[test_runner] has been deprecated"
$LAKE -f runner.toml test
$LAKE -f runner.lean check-test
$LAKE -f runner.toml check-test
# Driver validation
($LAKE -f two.lean test 2>&1 && exit 1 || true) | grep --color "only one"
($LAKE -f none.lean test 2>&1 && exit 1 || true) | grep --color "no test driver"
($LAKE -f none.toml test 2>&1 && exit 1 || true) | grep --color "no test driver"
($LAKE -f unknown.lean test 2>&1 && exit 1 || true) | grep --color "invalid test driver: unknown"
($LAKE -f unknown.toml test 2>&1 && exit 1 || true) | grep --color "invalid test driver: unknown"
($LAKE -f dep-unknown.lean test 2>&1 && exit 1 || true) | grep --color "unknown test driver package"
($LAKE -f dep-unknown.toml test 2>&1 && exit 1 || true) | grep --color "unknown test driver package"
($LAKE -f dep-invalid.lean test 2>&1 && exit 1 || true) | grep --color "invalid test driver"
($LAKE -f dep-invalid.toml test 2>&1 && exit 1 || true) | grep --color "invalid test driver"
($LAKE -f two.lean lint 2>&1 && exit 1 || true) | grep --color "only one"
($LAKE -f none.lean lint 2>&1 && exit 1 || true) | grep --color "no lint driver"
($LAKE -f none.toml lint 2>&1 && exit 1 || true) | grep --color "no lint driver"
($LAKE -f unknown.lean lint 2>&1 && exit 1 || true) | grep --color "invalid lint driver: unknown"
($LAKE -f unknown.toml lint 2>&1 && exit 1 || true) | grep --color "invalid lint driver: unknown"
($LAKE -f dep-unknown.lean lint 2>&1 && exit 1 || true) | grep --color "unknown lint driver package"
($LAKE -f dep-unknown.toml lint 2>&1 && exit 1 || true) | grep --color "unknown lint driver package"
($LAKE -f dep-invalid.lean lint 2>&1 && exit 1 || true) | grep --color "invalid lint driver"
($LAKE -f dep-invalid.toml lint 2>&1 && exit 1 || true) | grep --color "invalid lint driver"
# Driver checker
$LAKE -f exe.lean check-test
$LAKE -f exe.toml check-test
$LAKE -f exe.lean check-lint
$LAKE -f exe.toml check-lint
$LAKE -f dep.lean check-test
$LAKE -f dep.toml check-test
$LAKE -f dep.lean check-lint
$LAKE -f dep.toml check-lint
$LAKE -f script.lean check-test
$LAKE -f script.lean check-lint
$LAKE -f lib.lean check-test
$LAKE -f two.lean check-test && exit 1 || true
$LAKE -f two.lean check-lint && exit 1 || true
$LAKE -f none.lean check-test && exit 1 || true
$LAKE -f none.toml check-test && exit 1 || true
$LAKE -f none.lean check-lint && exit 1 || true
$LAKE -f none.toml check-lint && exit 1 || true


@@ -3,8 +3,8 @@ open System Lake DSL
package test
@[test_runner]
@[test_driver, lint_driver]
lean_exe testExe
@[test_runner]
@[test_driver, lint_driver]
script testScript do return 1


@@ -0,0 +1,6 @@
import Lake
open System Lake DSL
package test where
testDriver := "invalid"
lintDriver := "invalid"


@@ -0,0 +1,3 @@
name = "test"
testDriver = "invalid"
lintDriver = "invalid"


@@ -0,0 +1,20 @@
{"version": "1.0.0",
"packagesDir": ".lake/packages",
"packages":
[{"type": "path",
"name": "foo",
"manifestFile": "lake-manifest.json",
"inherited": false,
"dir": "./foo",
"configFile": "lakefile.lean"},
{"url": "bar",
"type": "git",
"subDir": null,
"rev": "0538596b94a0510f55dc820cabd3bde41ad93c3e",
"name": "bar",
"manifestFile": "lake-manifest.json",
"inputRev": null,
"inherited": false,
"configFile": "lakefile.lean"}],
"name": "«foo-bar»",
"lakeDir": ".lake"}


@@ -4,7 +4,7 @@
[{"git":
{"url": "bar",
"subDir?": null,
"rev": "253735aaee71d8bb0f29ae5cfc3ce086a4b9e64f",
"rev": "0538596b94a0510f55dc820cabd3bde41ad93c3e",
"name": "bar",
"inputRev?": null}},
{"path": {"opts": {}, "name": "foo", "inherited": false, "dir": "./foo"}}]}


@@ -4,7 +4,7 @@
[{"git":
{"url": "bar",
"subDir?": null,
"rev": "253735aaee71d8bb0f29ae5cfc3ce086a4b9e64f",
"rev": "0538596b94a0510f55dc820cabd3bde41ad93c3e",
"opts": {},
"name": "bar",
"inputRev?": null,


@@ -5,7 +5,7 @@
{"git":
{"url": "bar",
"subDir?": null,
"rev": "dab525a78710d185f3d23622b143bdd837e44ab0",
"rev": "0538596b94a0510f55dc820cabd3bde41ad93c3e",
"opts": {},
"name": "bar",
"inputRev?": null,


@@ -25,7 +25,7 @@ git commit -m "initial commit"
GIT_REV=`git rev-parse HEAD`
popd
LATEST_VER=v7
LATEST_VER=v1.0.0
LOCKED_REV='0538596b94a0510f55dc820cabd3bde41ad93c3e'
# Test an update produces the expected manifest of the latest version
@@ -41,10 +41,10 @@ test_update() {
# Test loading of a V4 manifest fails
cp lake-manifest-v4.json lake-manifest.json
($LAKE resolve-deps 2>&1 && exit 1 || true) | grep --color "incompatible manifest version '4'"
($LAKE resolve-deps 2>&1 && exit 1 || true) | grep --color "incompatible manifest version '0.4.0'"
# Test package update fails as well
($LAKE update bar 2>&1 && exit 1 || true) | grep --color "incompatible manifest version '4'"
($LAKE update bar 2>&1 && exit 1 || true) | grep --color "incompatible manifest version '0.4.0'"
# Test bare update works
test_update
@@ -57,11 +57,12 @@ rm -rf .lake
# Test successful load & update of a supported manifest version
test_manifest() {
cp lake-manifest-$1.json lake-manifest.json
sed_i "s/$2/$GIT_REV/g" lake-manifest.json
sed_i "s/$LOCKED_REV/$GIT_REV/g" lake-manifest.json
$LAKE resolve-deps
test_update
}
test_manifest v5 253735aaee71d8bb0f29ae5cfc3ce086a4b9e64f
test_manifest v6 dab525a78710d185f3d23622b143bdd837e44ab0
test_manifest v7 0538596b94a0510f55dc820cabd3bde41ad93c3e
test_manifest v5
test_manifest v6
test_manifest v7
test_manifest v1.0.0


@@ -45,9 +45,10 @@ EOF
git -C dep tag release
($LAKE build dep:release && exit 1 || true) | grep --color "downloading"
# Test unpacking
# Test automatic cloud release unpacking
mkdir -p dep/.lake/build
tar -cz -f dep/.lake/release.tgz -C dep/.lake/build .
$LAKE -d dep pack 2>&1 | grep --color "packing"
test -f dep/.lake/release.tgz
echo 4225503363911572621 > dep/.lake/release.tgz.trace
rmdir dep/.lake/build
$LAKE build dep:release -v | grep --color "unpacking"


@@ -1,5 +0,0 @@
name = "test"
testRunner = "test"
[[lean_exe]]
name = "test"


@@ -1,27 +0,0 @@
#!/usr/bin/env bash
set -euxo pipefail
LAKE=${LAKE:-../../.lake/build/bin/lake}
# Script test runner
$LAKE -f script.lean test | grep --color -F "script: []"
$LAKE -f script.lean test -- hello | grep --color "hello"
# Executable test runner
$LAKE -f exe.lean test | grep --color -F "exe: []"
$LAKE -f exe.lean test -- hello | grep --color "hello"
$LAKE -f exe.toml test | grep --color -F "exe: []"
$LAKE -f exe.toml test -- hello | grep --color "hello"
# Test runner validation
($LAKE -f two.lean test 2>&1 && exit 1 || true) | grep --color "only one"
($LAKE -f none.lean test 2>&1 && exit 1 || true) | grep --color "no test runner"
($LAKE -f none.toml test 2>&1 && exit 1 || true) | grep --color "no test runner"
# Test runner checker
$LAKE -f exe.lean check-test
$LAKE -f exe.toml check-test
$LAKE -f script.lean check-test
($LAKE -f two.lean check-test && exit 1 || true)
($LAKE -f none.lean check-test && exit 1 || true)
($LAKE -f none.toml check-test && exit 1 || true)


@@ -5,6 +5,8 @@ open System Lake DSL
package test where
releaseRepo := ""
buildArchive := ""
testDriver := "b"
lintDriver := "b"
platformIndependent := true
require foo from "-" with Lake.NameMap.empty |>.insert `foo "bar"
@@ -13,4 +15,4 @@ require bar from git "https://example.com"@"abc"/"sub/dir"
@[default_target] lean_lib A
@[test_runner] lean_exe b
lean_exe b


@@ -1,5 +1,6 @@
name = "test"
testRunner = "b"
testDriver = "b"
lintDriver = "b"
defaultTargets = ["A"]
[leanOptions]


@@ -30,6 +30,7 @@ package test where
preferReleaseBuild := false
packagesDir := defaultPackagesDir
leanOptions := #[`pp.unicode.fun, true]
lintDriver := "b"
require foo from "dir" with NameMap.empty.insert `foo "bar"
require bar from git "https://example.com" @ "abc" / "sub" / "dir"
@@ -45,7 +46,7 @@ lean_lib A where
nativeFacets := fun _ => #[]
toLeanConfig := testLeanConfig
@[test_runner]
@[test_driver]
lean_exe b where
srcDir := "."
root := `b


@@ -1,4 +1,5 @@
testRunner = "b"
testDriver = "b"
lintDriver = "b"
defaultTargets = ['A']
name = "test"
