Compare commits

...

99 Commits

Author SHA1 Message Date
Henrik Böving
f41b9d2e55 perf: inline a few trivial Array functions 2026-03-03 22:33:48 +00:00
Garmelon
530925c69b chore: fix test suite on macOS (#12780)
macOS uses a very old version of bash where `"${FOO[@]}"` fails if `set
-u` is enabled and `FOO` is undefined. Newer versions of bash expand
this to zero arguments instead.

Also, `lint.py` used the shebang `#!/usr/bin/env python` instead of
`python3`, which fails on some systems.

In CI, all macOS tests run on nscloud runners. Presumably, they have
newer versions of various software installed, which is why this didn't
break in CI.
2026-03-03 20:59:08 +00:00
Copilot
73640d3758 fix: preserve @[implicit_reducible] for WF-recursive definitions (#12776)
This PR fixes `@[implicit_reducible]` on well-founded recursive
definitions.

`addPreDefAttributes` sets WF-recursive definitions as `@[irreducible]`
by default, skipping this only when the user explicitly wrote
`@[reducible]` or `@[semireducible]`. It was missing
`@[instance_reducible]` and `@[implicit_reducible]`, causing those
attributes to be silently overridden.

Add `instance_reducible` and `implicit_reducible` to the check in
`src/Lean/Elab/PreDefinition/Mutual.lean` that guards against overriding
user-specified reducibility attributes, and add regression tests in
`tests/elab/wfirred.lean`.

## Example

```lean
-- Before fix: printed @[irreducible] def f : List Nat → Nat
-- After fix:  printed @[implicit_reducible] def f : List Nat → Nat
@[instance_reducible] def f : ∀ _l : List Nat, Nat
  | [] => 0
  | [_x] => 1
  | x :: y :: l => if h : x = y then f (x :: l) else f l + 2
termination_by l => sizeOf l

#print sig f
```

Fixes #12775

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: nomeata <148037+nomeata@users.noreply.github.com>
2026-03-03 18:57:55 +00:00
Markus Himmel
e14f2c8c93 feat: model for string patterns (#12779)
This PR provides a `ForwardPatternModel` for string patterns and deduces
theorems and lawfulness instances from the corresponding results for
slice patterns.
2026-03-03 18:42:25 +00:00
Leonardo de Moura
df61abb08f fix: normalize instance argument in getStuckMVar? for class projections (#12778)
This PR fixes an inconsistency in `getStuckMVar?` where the instance
argument to class projection functions and auxiliary parent projections
was not whnf-normalized before checking for stuck metavariables. Every
other case in `getStuckMVar?` (recursors, quotient recursors, `.proj`
nodes) normalizes the major argument via `whnf` before recursing — class
projection functions and aux parent projections were the exception.

This bug was identified by Matthew Jasper. When the instance parameter
to a class projection is not normalized, `getStuckMVar?` may fail to
detect stuck metavariables that would be revealed by whnf, or conversely
may report stuckness for expressions that would reduce to constructors.
This caused issues with `OfNat` and `Zero` at
`with_reducible_and_instances` transparency.

Note: PR #12701 (already merged) is also required to fix the original
Mathlib examples.
2026-03-03 18:31:39 +00:00
Markus Himmel
dc63bb0b70 feat: lemmas about String.find? and String.contains (#12777)
This PR adds lemmas about `String.find?` and `String.contains`.
2026-03-03 16:30:34 +00:00
Wojciech Różowski
7ca47aad7d feat: add cbv at location syntax (#12773)
This PR adds `at` location syntax to the `cbv` tactic, matching the
interface of `simp at`. Previously `cbv` could only reduce the goal
target; now it supports `cbv at h`, `cbv at h |-`, and `cbv at *`.
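
A minimal sketch of the new surface syntax (the exact reduced form of the hypothesis is hedged; `trivial` closes the goal either way):

```lean
-- Sketch: reduce a hypothesis in place with the new `at` syntax.
example (h : 2 + 2 = 4) : True := by
  cbv at h  -- `h` is reduced (e.g. to `4 = 4`)
  trivial
```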

`cbvGoal` is rewritten to use `Sym.preprocessMVar` followed by `cbvCore`
within a single `SymM` context, sharing the term table across all
hypotheses and the target. The old `cbvGoalCore` (which reduced one side
of an equation goal at a time) is replaced by a general approach that
reduces arbitrary goal types and hypothesis types, with special handling
for `True` targets and `False` hypotheses. `cbvDecideGoal` is updated to
use the extracted `cbvCore` as well.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 16:12:07 +00:00
Wojciech Różowski
1f04bf4fd1 feat: add simpDecideCbv simproc for cbv decide (#12766)
This PR adds a dedicated cbv simproc for `Decidable.decide` that
directly matches on `isTrue`/`isFalse` instances, producing simpler
proof terms and avoiding unnecessary unfolding through `Decidable.rec`.

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 14:24:14 +00:00
Markus Himmel
03a5db34c7 feat: generalize String.Slice.Pos.cast (#12771)
This PR generalizes `String.Slice.Pos.cast`, which turns an `s.Pos` into
a `t.Pos`, to no longer require `s = t`, but merely `s.copy = t.copy`.

This is a breaking change, but one that is easy to adapt to, by
replacing `proof` with `congrArg Slice.copy proof` where required.
2026-03-03 09:23:51 +00:00
Kim Morrison
f4bbf748df feat: add deriving noncomputable instance syntax (#12756)
This PR adds `deriving noncomputable instance Foo for Bar` syntax so
that delta-derived instances can be marked noncomputable. Previously,
when the underlying instance was noncomputable, `deriving instance`
would fail with an opaque async compilation error.

Now:
- `deriving noncomputable instance Foo for Bar` marks the generated
instance as noncomputable (using `addDecl` + `addNoncomputable` instead
of `addAndCompile`)
- `deriving instance Foo for Bar` pre-checks for noncomputable
dependencies and gives an actionable error with a "Try this:" suggestion
pointing to the noncomputable variant
- For handler-based deriving (inductives/structures), `noncomputable`
sets `isNoncomputable` on the scope

The `optDefDeriving` and `optDeriving` trailing parsers are updated with
`notSymbol "noncomputable"` to prevent them from stealing the parse of
`deriving noncomputable instance ...`.
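
A minimal sketch of the new surface syntax, using `Repr` purely for illustration (the derived instance need not actually be noncomputable for the syntax to apply):

```lean
structure Bar where
  n : Nat

-- Marks the derived instance as noncomputable:
deriving noncomputable instance Repr for Bar
```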

🤖 Prepared with Claude Code

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 06:42:41 +00:00
Mac Malone
46fe37290e feat: lake: download artifacts on demand (#12634)
This PR enables Lake to download artifacts from a remote cache service
on demand as part of a `lake build`. It also refactors much of the cache
API to be more type safe.

The newly documented `lake cache add` command loads input-to-output
mappings from a file and stores them in the cache with optional
information about which cache service and what scope they come from.
With this information, Lake can now download artifacts on demand during
a `lake build`.

The `lake cache get` command has also changed its default behavior to
download just the input-to-outputs mapping and then lazily fetch
artifacts from Reservoir as part of a `lake build`. The original eager
behavior can be forced via the new `--download-arts` option.
2026-03-03 03:48:56 +00:00
Kim Morrison
dd710dd1bd feat: use StateT.run instead of function application (#5121)
This PR uses `StateT.run` rather than the "defeq abuse" of function
application. There remain many places where we still use function
application for `ReaderT`, but I've updated this in the touched files.

(To really solve this, we would make `StateT` irreducible, but that is
not happening here.)
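
A minimal sketch of the pattern being replaced, using `Id` for concreteness:

```lean
-- Before: applies the `StateT` value directly, relying on it being
-- definitionally a function ("defeq abuse").
def before (x : StateT Nat Id Unit) : Id (Unit × Nat) := x 0

-- After: goes through the intended API.
def after (x : StateT Nat Id Unit) : Id (Unit × Nat) := x.run 0
```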

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 03:12:26 +00:00
Kim Morrison
9a841125e7 chore: add HACK banner to isNonTrivialRegular transparency check (#12769)
This PR adds a HACK comment to the transparency restriction in
`isNonTrivialRegular` (from
https://github.com/leanprover/lean4/pull/12650) so it's not forgotten.

🤖 Prepared with Claude Code

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 00:40:08 +00:00
Kim Morrison
2daaa50afb chore: constructorNameAsVariable linter respects linter.all (#4966)
This PR ensures `linter.all` disables `constructorNameAsVariable`.

The issue was discovered by @eric-wieser while investigating a quote4
issue.

This seems like an easy mistake to make when setting up a new linter,
and perhaps we need a better structure to make it easy to do the right
thing.

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 00:20:21 +00:00
Lean stage0 autoupdater
145a121048 chore: update stage0 2026-03-02 22:42:13 +00:00
Leonardo de Moura
584d92d302 refactor: replace isImplicitReducible with Meta.isInstance in shouldInline (#12759)
This PR replaces the `isImplicitReducible` check with `Meta.isInstance`
in the `shouldInline` function within `inlineCandidate?`.

At the base phase, we skip inlining instances tagged with
`[inline]`/`[always_inline]`/`[inline_if_reduce]` because their local
functions will be lambda lifted during the base phase. The goal is to
keep instance code compact so the lambda lifter can extract
cheap-to-inline declarations. Inlining instances prematurely expands the
code and creates extra work for the lambda lifter — producing many
additional lambda-lifted closures.

The previous check used `isImplicitReducible`, which does not capture
the original intent: some `instanceReducible` declarations are not
instances. `Meta.isInstance` correctly targets only actual type class
instances. Although `Meta.isInstance` depends on the scoped extension
state, this is safe because `shouldInline` runs during LCNF compilation
at `addDecl` time — any instance referenced in the code was resolved
during elaboration when the scope was active, and LCNF compilation
occurs before the scope changes.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 21:49:46 +00:00
Wojciech Różowski
d66aaebca6 perf: simplify cbv ite/dite simprocs by reducing Decidable instance directly (#12677)
This PR changes the approach in `simpIteCbv` and `simpDIteCbv`,
replacing the call to `Decidable.decide` with direct reduction of and
pattern matching on the `Decidable` instance for `isTrue`/`isFalse`.
This produces simpler proof terms.

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 17:11:48 +00:00
Henrik Böving
4ac7ea4aab perf: fixup BitVec.cpop termination proof performance (#12764) 2026-03-02 16:53:45 +00:00
Wojciech Różowski
6bebf9c529 feat: add short-circuit evaluation for Or and And in cbv (#12763)
This PR adds pre-pass simprocs `simpOr` and `simpAnd` to the `cbv`
tactic that evaluate only the left argument of `Or`/`And` first,
short-circuiting when the result is determined without evaluating the
right side. Previously, `cbv` processed `Or`/`And` via congruence, which
always evaluated both arguments. For expressions like `decide (m < n ∨
expensive)`, when `m < n` is true, the expensive right side is now
skipped entirely.
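
A hedged sketch of the effect, assuming the `decide_cbv` tactic mentioned elsewhere in this log; the right disjunct is chosen to be expensive to evaluate:

```lean
-- The left disjunct already decides the disjunction, so with
-- short-circuiting the right side is never evaluated.
example : 1 < 2 ∨ 2 ^ 64 < 3 := by decide_cbv
```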

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 13:47:04 +00:00
Luisa Cicolini
df74c80973 feat: add bitblasting circuit for BitVec.cpop (#12433)
This PR adds a bitblasting circuit for `BitVec.cpop` with a
divide-and-conquer for a parallel-prefix-sum.

This is the [most efficient circuit we could
find](https://docs.google.com/spreadsheets/d/1dJ5uUY4-eWIQmMjIui3H4U-wBxBxy-qYuqJZFZD1xvA/edit?usp=sharing),
after comparing with Kernighan's algorithm and with the intuitive
addition circuit.

---------

Co-authored-by: Henrik Böving <hargonix@gmail.com>
2026-03-02 13:38:04 +00:00
Paul Reichert
292b423a17 feat: injectivity lemmas for getElem(?) on List and Option (#12435)
This PR provides injectivity lemmas for `List.getElem`, `List.getElem?`,
`List.getElem!` and `List.getD` as well as for `Option`. Note: This
introduces a breaking change, changing the signature of
`Option.getElem?_inj`.
2026-03-02 09:44:45 +00:00
Kim Morrison
cda84702e9 doc: add guidance on waiting for CI/merges in release command (#12755)
This PR adds a section to the /release command explaining how to use `gh
pr checks --watch` to wait for CI or merges without polling.

🤖 Prepared with Claude Code

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 02:49:34 +00:00
Kim Morrison
ec565f3bf7 fix: use _fvar._ instead of _ for anonymous fvars (#12745)
This PR fixes `pp.fvars.anonymous` to display loose free variables as
`_fvar._` instead of `_` when the option is set to `false`. This was the
intended behavior in https://github.com/leanprover/lean4/pull/12688 but
the fix was committed locally and not pushed before that PR was merged.

🤖 Prepared with Claude Code

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 09:59:13 +00:00
Kim Morrison
feea8a7611 fix: use pull_request_target for label-triggered workflows (#12638)
This PR switches four lightweight workflows from `pull_request` to
`pull_request_target` to stop GitHub from requiring manual approval when
the `mathlib-lean-pr-testing[bot]` app triggers label events (e.g.
adding `builds-mathlib`). Since the bot never lands commits on master,
it is perpetually treated as a "first-time contributor" and every
`pull_request` event it triggers requires approval.
`pull_request_target` events always run without approval because they
execute trusted code from the base branch.

This is safe for all four workflows because none check out or execute
code from the PR branch — they only read labels, the PR body, and file
lists from the event payload and API:

- `awaiting-mathlib.yml` — checks label combinations
- `awaiting-manual.yml` — checks label combinations
- `pr-body.yml` — checks PR body formatting
- `check-stdlib-flags.yml` — checks whether `stdlib_flags.h` was modified, via the API

Also adds explicit `permissions: pull-requests: read` to each workflow
as a least-privilege hardening measure, since `pull_request_target` has
access to secrets.

Addresses the issue reported by Sebastian:

https://lean-fro.zulipchat.com/#narrow/channel/398861-general/topic/mathlib.20pr-testing.20breakage.3F/near/575084348

🤖 Prepared with Claude Code

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 19:20:56 +11:00
Kim Morrison
6d305096e5 chore: fix profiler shebang and add profiling skill (#12519)
This PR changes the shebang in `lean_profile.sh` from `#!/bin/bash` to
`#!/usr/bin/env bash` so the script works on NixOS and other systems
where bash is not at `/bin/bash`, and adds a Claude Code skill pointing
to the profiler documentation.

🤖 Prepared with Claude Code

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 07:09:33 +00:00
Kim Morrison
235b0eb987 feat: add Meta.synthInstance.apply trace class (#12699)
This PR gives the `generate` function's "apply @Foo to Goal" trace nodes
their own trace sub-class `Meta.synthInstance.apply` instead of sharing
the parent `Meta.synthInstance` class.

This allows metaprograms that walk synthesis traces to distinguish
instance application attempts from other synthesis nodes by checking
`td.cls` rather than string-matching on the header text.

The new class is registered with `inherited := true`, so `set_option
trace.Meta.synthInstance true` continues to show these nodes.
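
In practice (a sketch; both forms surface the apply nodes, given `inherited := true`):

```lean
-- Shows all synthesis nodes, including the new apply sub-class:
set_option trace.Meta.synthInstance true
-- Or target just the instance application attempts; metaprograms can
-- instead match on the class name via `td.cls`.
set_option trace.Meta.synthInstance.apply true
```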

Motivated by mathlib's `#defeq_abuse` diagnostic tactic
(https://github.com/leanprover-community/mathlib4/pull/35750) which
currently checks `headerStr.contains "apply"` to identify these nodes.
See
https://leanprover.zulipchat.com/#narrow/channel/113488-general/topic/backward.2EisDefEq.2ErespectTransparency

🤖 Prepared with Claude Code

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 07:06:56 +00:00
Kim Morrison
5dd8d570fd feat: add pp.fvars.anonymous option (#12688)
This PR adds a `pp.fvars.anonymous` option (default `true`) that
controls the display of loose free variables (fvars not in the local
context).

- When `true` (default), loose fvars display their internal name like
`_fvar.42`
- When `false`, they display as `_fvar._`

This is analogous to `pp.mvars.anonymous` for metavariables. It's useful
for stabilizing output in `#guard_msgs` when messages contain fvar IDs
that vary between runs — for example, in diagnostic tools that report
`isDefEq` failures from trace output where the local context is not
available.
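
A sketch of the intended use in a test file:

```lean
-- With the option disabled, loose fvars print as `_fvar._` rather than
-- a run-dependent `_fvar.42`, stabilizing `#guard_msgs` output:
set_option pp.fvars.anonymous false
```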

🤖 Prepared with Claude Code

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 06:43:14 +00:00
Kim Morrison
3ea59e15b8 fix: set implicitReducible on grandparent subobject projections (#12701)
This PR fixes a gap in how `@[implicit_reducible]` is assigned to parent
projections during structure elaboration.

When `class C extends P₁, P₂` has diamond inheritance, some ancestor
structures become constructor subobject fields even though they aren't
direct parents. For example, in `Monoid extends Semigroup, MulOneClass`,
`One` becomes a constructor subobject of `Monoid` — its field `one`
doesn't overlap with `Semigroup`'s fields, and `inSubobject?` is `none`
during `MulOneClass` flattening.

`mkProjections` creates the projection `Monoid.toOne` but defers
reducibility to `addParentInstances` (guarded by `if !instImplicit`).
However, `addParentInstances` only processes direct parents from the
`extends` clause. Grandparent subobject projections fall through the gap
and stay `semireducible`.

This causes defeq failures when `backward.isDefEq.respectTransparency`
is enabled (#12179): at `.instances` transparency, the semireducible
grandparent projection can't unfold, so two paths to the same ancestor
structure aren't recognized as definitionally equal.

Fix: before `addParentInstances`, iterate over all `.subobject` fields
and set `implicitReducible` on those whose parent is a class.

🤖 Prepared with Claude Code

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 06:39:17 +00:00
Kim Morrison
d59f229b74 fix: mark levelZero, levelOne, and Level.ofNat as implicit_reducible (#12719)
This PR marks `levelZero` and `Level.ofNat` as `@[implicit_reducible]`
so that `Level.ofNat 0 =?= Level.zero` succeeds when the definitional
equality checker respects transparency annotations. Without this,
coercions between structures with implicit `Level` parameters fail, as
reported by @FLDutchmann on
[Zulip](https://leanprover.zulipchat.com/#narrow/channel/113488-general/topic/backward.2EisDefEq.2ErespectTransparency/near/576131374).

🤖 Prepared with Claude Code

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 06:37:54 +00:00
Garmelon
a364595111 chore: fix ci after new linter was added (#12733)
The linter was running in parallel with other tests, which were creating
and deleting files. Since the linter was iterating over some files and
directories at the time, it crashed.
2026-02-28 03:05:07 +00:00
Garmelon
08ab8bf7c3 chore: fix ci for new test suite (#12704) 2026-02-27 23:25:37 +00:00
Lean stage0 autoupdater
54df5173d2 chore: update stage0 2026-02-27 21:05:46 +00:00
Garmelon
36ffba4b57 chore: ensure test names differ by more than just case (#12729)
These tests may lead to issues on case-insensitive file systems.
2026-02-27 19:03:22 +00:00
Henrik Böving
2e9e5db408 feat: extract simple array literals as static initializers (#12724)
This PR implements support for extracting simple ground array literals
into statically initialized data.
2026-02-27 18:42:21 +00:00
Henrik Böving
81a5eb55d5 feat: boxed simple ground literal extraction (#12727)
This PR implements simple ground literal extraction for boxed scalar
values.
2026-02-27 16:15:14 +00:00
Markus Himmel
b4f768b67f feat: lemmas about splitting the empty string/slice (#12725)
This PR shows that lawful searchers split the empty string to `[""]`.
2026-02-27 11:04:17 +00:00
Markus Himmel
9843794e3f feat: lemmas for String.split by a character or character predicate (#12723)
This PR relates `String.split` to `List.splitOn` and `List.splitOnP`,
provided that we are splitting by a character or character predicate.

Also included: some more lemmas about `List.splitOn`, and a refactor of
the generic `split` verification to get rid of the awkward `SlicesFrom`
construct.
2026-02-27 09:46:58 +00:00
Markus Himmel
9bd4dfb696 chore: prefer cons_cons over cons₂ in names (#12710)
This PR deprecates the handful of names in core involving the component
`cons₂` in favor of `cons_cons`.
2026-02-27 08:58:08 +00:00
Henrik Böving
b1db0d2798 perf: non quadratic closed term initialization for closed array literals (#12715)
This PR ensures the compiler extracts `Array`/`ByteArray`/`FloatArray`
literals as one big closed term to avoid quadratic overhead at closed
term initialization time.
2026-02-27 08:37:12 +00:00
Sebastian Graf
4cd7a85334 test: speed up Sym mvcgen by doing fewer redundant program matches (#12712)
This PR changes the spec lookup procedure in Sym-based mvcgen so that

1. Spec candidates are sorted first before being filtered
2. Instead of filtering the whole set of candidates using
`spec.pattern.match?`, we take the first match with the highest
priority.

The second point means we will do a lot fewer matches when the
highest-priority spec matches immediately. In that case, the one match
is still partially redundant with the final backward rule application;
it would be great if we could somehow specialize the backward rule after
it has been created. Still, this yields some welcome speedups. Before
and after timings for each benchmark:

```
vcgen_add_sub_cancel:
goal_1000: 865 ms, 1 VCs by grind: 228 ms, kernel: 435 ms
goal_1000: 540 ms, 1 VCs by grind: 229 ms, kernel: 426 ms

vcgen_ping_pong:
goal_1000: 458 ms, 0 VCs, kernel: 431 ms
goal_1000: 454 ms, 0 VCs, kernel: 443 ms (unchanged, because there is only ever one candidate spec)

vcgen_deep_add_sub_cancel:
goal_1000: 986 ms, 1 VCs by grind: 234 ms, kernel: 735 ms
goal_1000: 728 ms, 1 VCs by grind: 231 ms, kernel: 708 ms

vcgen_reader_state:
goal_1000: 746 ms, 1 VCs by sorry: 1 ms, kernel: 803 ms
goal_1000: 525 ms, 1 VCs by sorry: 1 ms, kernel: 840 ms
```
2026-02-27 03:24:34 +00:00
Sebastian Graf
6cf1c4a1be chore: simplify a proof in mvcgen test cases and remove duplicate (#12547) 2026-02-27 01:18:06 +00:00
Sebastian Graf
e7aa785822 chore: tighten a do match elaborator test case to prevent global defaulting (#12675)
This PR enshrines that the do `match` elaborator does not globally
default instances, in contrast to the term `match` elaborator.
2026-02-27 01:17:27 +00:00
Sebastian Graf
668f07039c chore: do not use Sym.inferType in mvcgen if inputs are not shared (#12713) 2026-02-27 01:15:09 +00:00
Kyle Miller
005f6ae7cd fix: let Meta.zetaReduce zeta reduce have expressions (#12695)
This PR fixes a bug in `Meta.zetaReduce` where `have` expressions were
not being zeta reduced. It also adds a feature where applications of
local functions are beta reduced, and another where zeta-delta reduction
can be disabled. These are all controllable by flags:
- `zetaDelta` (default: true) enables unfolding local definitions
- `zetaHave` (default: true) enables zeta reducing `have` expressions
- `beta` (default: true) enables beta reducing applications of local
definitions

Closes #10850
2026-02-27 00:37:52 +00:00
Henrik Böving
738688efee chore: cleanup after closed term extraction by removing dead values (#12717) 2026-02-26 22:33:08 +00:00
Garmelon
adf3e5e661 chore: stop using cached namespace.so checkout (#12714)
The namespace cache volumes were running out of space and preventing CI
from running.
2026-02-26 17:18:52 +00:00
Sebastian Graf
38682c4d4a fix: heartbeat limit in mvcgen due to withDefault rfl (#12696)
This PR fixes a test case reported by Alexander Bentkamp that runs into
a heartbeat limit due to daring use of `withDefault` `rfl` in `mvcgen`.
2026-02-26 16:40:42 +00:00
Sebastian Graf
f2438a1830 test: support postcondition VCs in Sym VCGen (#12711)
This PR adds support for generating and discharging postcondition VCs in
Sym-based `mvcgen`. It also adds a new benchmark case
`vcgen_ping_pong.lean` that tests this functionality. This benchmark
required a more diligent approach to maintain maximal sharing in goal
preprocessing. Goal preprocessing was subsequently merged into the main
VC generation function.
2026-02-26 16:34:15 +00:00
Markus Himmel
48c37f6588 feat: assorted string lemmas (#12709)
This PR adds various `String` lemmas that will be useful for deriving
high-level theorems about `String.split`.
2026-02-26 16:10:52 +00:00
Sebastian Graf
8273df0d0b fix: quantify over α before ps in PostCond definitions (#12708)
This PR changes the order of implicit parameters `α` and `ps` such that
`α` consistently comes before `ps` in `PostCond.noThrow`,
`PostCond.mayThrow`, `PostCond.entails`, `PostCond.and`, `PostCond.imp`
and theorems.
2026-02-26 16:00:00 +00:00
Henrik Böving
f83a8b4cd5 refactor: port simple ground expr extraction from IR to LCNF (#12705)
This PR ports the simple ground expression extraction pass from IR to
LCNF.

I locally confirmed that this produces no diff between stage1/stage2 at
the C level (apart from the changed compiler files), so this should be
essentially binary equivalent.
2026-02-26 15:10:01 +00:00
Markus Himmel
fedfc22c53 feat: lemmas for String.intercalate (#12707)
This PR adds lemmas about `String.intercalate` and
`String.Slice.intercalate`.
2026-02-26 15:05:41 +00:00
Markus Himmel
a91fb93eee feat: simproc for String.singleton (#12706)
This PR adds a dsimproc which evaluates `String.singleton ' '` to `" "`.
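
A sketch of the simproc in action (assuming it is registered in the default simp set):

```lean
example : String.singleton 'a' = "a" := by simp
```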
2026-02-26 14:41:56 +00:00
Sebastian Graf
b3b4867d6c feat: add two unfolding theorems to Std.Do (#12697)
This PR adds two new unfolding theorems to Std.Do: `PostCond.entails.mk`
and `Triple.of_entails_wp`.
2026-02-26 14:31:07 +00:00
Markus Himmel
1e4894b431 feat: upstream List.splitOn(P) (#12702)
This PR upstreams `List.splitOn` and `List.splitOnP` from
Batteries/mathlib.

The function `splitOnP.go` is factored out to `splitOnPPrepend`, because
it is useful to state induction hypotheses in terms of
`splitOnPPrepend`.
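
A behavior sketch, following the Batteries semantics of `splitOn` (split at each occurrence of the separator, dropping it):

```lean
#eval [1, 2, 0, 3].splitOn 0     -- [[1, 2], [3]]
#eval [1, 2, 3].splitOnP (· ≥ 2) -- splits wherever the predicate holds
```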
2026-02-26 13:45:34 +00:00
Lean stage0 autoupdater
846420daba chore: update stage0 2026-02-26 10:20:57 +00:00
Henrik Böving
d88ac25bd1 feat: non exponential codegen for reset-reuse (#12665)
This PR ports the expand reset/reuse pass from IR to LCNF. In addition,
unlike the old pass, it prevents exponential code generation. This
results in a ~15% decrease in binary size and slight speedups across the
board.

The change also removes the "is this reset actually used" syntactic
approximation, since the previous passes guarantee (at the moment) that
all uses are in the continuation and will thus be caught by this pass.
2026-02-26 09:35:45 +00:00
Lean stage0 autoupdater
805060c0a8 chore: update stage0 2026-02-26 08:58:17 +00:00
Sebastian Ullrich
b1a991eee0 perf: separate meta and non-meta initializers (#12016)
This PR enables the module system, in cooperation with the linker, to
separate meta and non-meta code in native binaries. In particular, this
ensures tactics merely used in proofs do not make it into the final
binary. A simple example using `meta import Lean` has its binary size
reduced from 130MB to 1.7MB.
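
The example mentioned above might look like this (a sketch; assumes the module system is enabled for the file):

```lean
module

meta import Lean  -- meta-only: stripped from the native binary

def main : IO Unit :=
  IO.println "hello"
```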

# Breaking change

`importModules (loadExts := true)` must now be preceded by
`enableInitializersExecution`. This was always necessary for correct
importing, but is now enforced and checked eagerly.
2026-02-26 08:05:19 +00:00
Sebastian Ullrich
65a0c61806 chore: idbg refinements (#12691) 2026-02-26 07:49:47 +00:00
Wojciech Różowski
d4b560ec4a test: add cbv tests adapted from LNSym (#12694)
This PR adds two `decide_cbv` stress tests extracted from LNSym (an
ARMv8 symbolic simulator, Apache 2.0). `cbv_aes.lean` tests a full
AES-128 encryption on large bitvector computations. `cbv_arm_ldst.lean`
tests ARMv8 load/store instruction decoding and execution with nested
pattern matching over bitvectors.

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-25 17:08:24 +00:00
Wojciech Różowski
7390024170 test: add cbv test for Collatz conjecture verification (#12692)
This PR adds a `cbv` tactic test based on a minimized example extracted
from verifying the Collatz conjecture for small numbers, suggested by
Bhavik Mehta (@b-mehta).

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
Co-authored-by: Bhavik Mehta <bhavikmehta8@gmail.com>
2026-02-25 17:05:51 +00:00
Henrik Böving
805012fb84 chore: revert "perf: improve over-applied cases in ToLCNF (#12284)" (#12693)
This PR reverts commit 9b7a8eb7c8. After some more contemplation on the
implications of these changes, I think this is not the direction we want
to move in.
2026-02-25 15:23:24 +00:00
Garmelon
dc760cf54a chore: fail build on non-make generators (#12690)
At the moment, the build relies on Make and will fail with other CMake
generators. This explicit check (as suggested by @LecrisUT in
https://github.com/leanprover/lean4/pull/12577#discussion_r2832295132)
should help prevent confusion like in #12575.
2026-02-25 13:59:40 +00:00
Garmelon
08eb78a5b2 chore: switch to new test/bench suite (#12590)
This PR sets up the new integrated test/bench suite. It then migrates
all benchmarks and some related tests to the new suite. There's also
some documentation and some linting.

For now, a lot of the old tests are left alone so this PR doesn't become
even larger than it already is. Eventually, though, all tests should be
migrated to the new suite so there isn't a confusing mix of two systems.
2026-02-25 13:51:53 +00:00
Kyle Miller
bd0c6a42c8 fix: copied 11940 fix for structure command (#12680)
This PR fixes an issue where `mutual public structure` would have a
private constructor. It copies the fix from #11940.

Closes #10067. Also recloses duplicate issue #11116 (its test case is
added to the test suite).
2026-02-25 13:50:04 +00:00
Paul Reichert
c86f82161a feat: upstream List/Array/Vector lemmas from human-eval-lean (#12405)
This PR adds several useful lemmas for `List`, `Array` and `Vector`
whenever they were missing, improving API coverage and consistency among
these types.
- `size_singleton`/`sum_singleton`/`sum_push`
- `foldlM_toArray`/`foldlM_toList`/`foldl_toArray`/`foldl_toList`/`foldrM_toArray`/`foldrM_toList`/`foldr_toList`
- `toArray_toList`
- `foldl_eq_apply_foldr`/`foldr_eq_apply_foldl`, `foldr_eq_foldl`:
relates `foldl` and `foldr` for associative operations with identity
- `sum_eq_foldl`: relates sum to `foldl` for associative operations with
identity
- `Perm.pairwise_iff`/`Perm.pairwise`: pairwise properties are preserved
under permutations of arrays
2026-02-25 12:50:31 +00:00
Paul Reichert
b548cf38b6 feat: enable partial termination proofs about WellFounded.extrinsicFix (#12430)
This PR provides `WellFounded.partialExtrinsicFix`, which makes it
possible to implement and verify partially terminating functions, safely
building on top of the seemingly less general `extrinsicFix` (which is
now called `totalExtrinsicFix`). A proof of termination is only
necessary in order to formally verify the behavior of
`partialExtrinsicFix`.
2026-02-25 12:43:39 +00:00
Henrik Böving
e96d969d59 feat: support for del, isShared, oset and setTag (#12687)
This PR implements the LCNF instructions required for the expand
reset/reuse pass.
2026-02-25 10:43:15 +00:00
Sebastian Ullrich
532310313f feat: lake shake --only (#12682)
This PR extends `lake shake` with a flag for minimizing only a specific
module.
2026-02-25 10:24:50 +00:00
Marc Huisinga
168c125cf5 chore: relative lean-toolchains (#12652)
This PR changes all `lean-toolchain` files to use relative toolchain
paths instead of the `lean4` and `lean4-stage0` identifiers, which
removes the need for manually linking toolchains via Elan.

After this PR, at least Elan 4.2.0 and version 0.0.224 of the Lean VS
Code extension will be needed to edit core.

Co-authored-by: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-02-25 10:23:35 +00:00
Sebastian Ullrich
54be382b2f chore: fix core after rebootstrap 2026-02-25 11:40:02 +01:00
Sebastian Ullrich
fa31b285df chore: update stage0 2026-02-25 11:40:02 +01:00
Sebastian Ullrich
1fd9adc693 fix: update-stage0 under the Lake cache 2026-02-25 11:40:02 +01:00
Sebastian Ullrich
423671a6c0 feat: strengthen evalConst meta check 2026-02-25 11:40:02 +01:00
Markus Himmel
1e0bfe931f feat: more lemmas about String.Slice.Pos.ofSlice(From|To)? (#12685)
This PR adds some missing material about transferring positions across
the subslicing operations `slice`, `sliceFrom`, `sliceTo`.
2026-02-25 09:39:59 +00:00
Henrik Böving
1bf43863e6 fix: better LCNF pretty printing (#12684) 2026-02-25 09:30:23 +00:00
Markus Himmel
87ec768a50 fix: ensure that tail-recursive List.flatten is used everywhere (#12678)
This PR marks `List.flatten`, `List.flatMap`, `List.intercalate` as
noncomputable to ensure that their `csimp` variants are used everywhere.

We also mark `List.flatMapM` as noncomputable and provide a
tail-recursive implementation, and mark `List.utf8Encode` as
noncomputable, which only exists for specification purposes anyway (at
this point).

Closes #12676.
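The `csimp` mechanism referred to above can be sketched as follows; this is a hedged illustration with made-up names `mySum`/`mySumTR`, not the actual `List.flatten` change:

```lean
-- Hedged sketch of the `csimp` pattern: the reference definition is marked
-- `noncomputable`, a tail-recursive variant is provided, and a `@[csimp]`
-- equality theorem makes the compiler use the variant everywhere.
noncomputable def mySum : List Nat → Nat
  | [] => 0
  | x :: xs => x + mySum xs

def mySumTR (l : List Nat) : Nat := go 0 l where
  go (acc : Nat) : List Nat → Nat
    | [] => acc
    | x :: xs => go (acc + x) xs

@[csimp] theorem mySum_eq_mySumTR : @mySum = @mySumTR := by
  sorry -- proof elided in this sketch
```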
2026-02-25 06:24:15 +00:00
Kyle Miller
de65af8318 feat: overriding binder kinds of parameters in inductive constructors (#12603)
This PR adds a feature where `inductive` constructors can override the
binder kinds of the type's parameters, like in #9480 for `structure`.
For example, it's possible to make `x` explicit in the constructor
`Eq.refl`, rather than implicit:
```lean
inductive Eq {α : Type u} (x : α) : α → Prop where
  | refl (x) : Eq x x
```
In the Prelude, this is currently accomplished by taking advantage of
auto-promotion of indices to parameters.

**Breaking change.** Inductive types with a constructor that starts with
typeless binders may need to be rewritten, e.g. changing `(x)` to `(x : _)`
if there is a `variable` with that name or if it is meant to shadow one of
the inductive type's parameters.
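The migration described in the breaking change can be sketched like this (hypothetical type `P`; treat the exact elaboration behavior as an assumption):

```lean
variable (n : Nat)

-- Before this change, `| mk (n) : P n` introduced a fresh typeless binder
-- `n`; now a bare `(n)` may resolve against the `variable` above or against
-- a type parameter. Writing `(n : _)` keeps it an explicitly fresh binder.
inductive P : Nat → Prop where
  | mk (n : _) : P n
```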
2026-02-25 02:30:12 +00:00
Kyle Miller
c032af2f51 fix: make tactic .. at * save info contexts (#12607)
This PR fixes an issue where `withLocation` wasn't saving the info
context, which meant that tactics that use `at *` location syntax and do
term elaboration would save infotrees but revert the metacontext,
leading to Infoview messages like "Error updating: Error fetching goals:
Rpc error: InternalError: unknown metavariable" if the tactic failed at
some locations but succeeded at others.

Closes #10898
2026-02-25 01:59:50 +00:00
Kyle Miller
48a715993d fix: pretty printing of constants should consider accessibility of names (#12654)
This PR fixes two aspects of pretty printing of private names.
1. Name unresolution. Now private names are not special cased: the
private prefix is stripped off and the `_root_` prefix is added, then it
tries resolving all suffixes of the result. This is sufficient to handle
imported private names in the new module system. (Additionally,
unresolution takes macro scopes into account now.)
2. Delaboration. Inaccessible private names use a deterministic
algorithm to convert private prefixes into macro scopes. The effect is
that the same private name appearing multiple times in the same
delaborated expression will now have the same `✝` suffix each time. It
used to use fresh macro scopes per occurrence.

Note: There is currently a small hack to support pretty printing in the
compiler's trace messages, which print constants that do not exist (e.g.
`obj`, `tobj`, and auxiliary definitions being compiled). Even though
these names are inaccessible (for the stronger reason that they don't
exist), we make sure that the pretty printer won't add macro scopes. It
also does some analysis of private names to see whether they belong to
the current module.

Closes #10771, closes #10772, and closes #10773
2026-02-25 00:01:19 +00:00
Wojciech Różowski
f31f50836d fix: withNamespace now correctly calls popScopes after running (#12647)
This PR adds the missing `popScopes` call to `withNamespace`, which
previously only dropped scopes from the elaborator's `Command.State` but
did not pop the environment's `ScopedEnvExtension` state stacks. This
caused scoped syntax declarations to leak keywords outside their
namespace when `withNamespace` had been called.

Closes #12630

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-24 15:24:58 +00:00
Lean stage0 autoupdater
c1ab1668b2 chore: update stage0 2026-02-24 15:19:57 +00:00
Sebastian Graf
7517f768f9 feat: lightweight dependent match motive for do match (#12673)
This PR allows for a lightweight version of dependent `match` in the
new `do` elaborator: discriminant types get abstracted over previous
discriminants. The match result type and the local context still are not
considered for abstraction. For example, if both `i : Nat` and `h : i <
len` are discrminants, then if an alternative matches `i` with `0`, we
also have `h : 0 < len`:

```lean
example {α : Type u} {β : Type v} {m : Type v → Type w} [Monad m] (as : Array α) (b : β) (f : (a : α) → a ∈ as → β → m (ForInStep β)) : m β :=
  let rec loop (i : Nat) (h : i ≤ as.size) (b : β) : m β := do
    match i, h with
    | 0,   _ => pure b
    | i+1, h =>
      have h' : i < as.size            := Nat.lt_of_lt_of_le (Nat.lt_succ_self i) h
      have : as.size - 1 < as.size     := Nat.sub_lt (Nat.zero_lt_of_lt h') (by decide)
      have : as.size - 1 - i < as.size := Nat.lt_of_le_of_lt (Nat.sub_le (as.size - 1) i) this
      match (← f as[as.size - 1 - i] (Array.getElem_mem this) b) with
      | ForInStep.done b  => pure b
      | ForInStep.yield b => loop i (Nat.le_of_lt h') b
  loop as.size (Nat.le_refl _) b
```

This feature turns out to be enough to save quite a few adaptations
(6/16) during bootstrap.
2026-02-24 14:29:29 +00:00
Sebastian Graf
96cd6909ea doc: fix comment referring to elabElem instead of elabDoElem (#12674) 2026-02-24 14:23:58 +00:00
Sebastian Graf
bb8d8da1af test: add benchmark vcgen_reader_state (#12671)
This PR adds the benchmark vcgen_reader_state that is a variant of
vcgen_add_sub_cancel that takes the value to subtract from a `ReaderT`
layer. Measurements:
```
goal_100: 201 ms, 1 VCs by sorry: 0 ms, kernel: 52 ms
goal_500: 382 ms, 1 VCs by sorry: 0 ms, kernel: 327 ms
goal_1000: 674 ms, 1 VCs by sorry: 1 ms, kernel: 741 ms
```
This suggests that it scales linearly. The generated VC triggers superlinear
behavior in `grind`, though, hence it is discharged by `sorry`.
2026-02-24 13:19:15 +00:00
Sebastian Graf
8916246be5 test: speed up vcgen_get_throw_set.lean by partially evaluating specs (#12670)
This PR speeds up the vcgen_get_throw_set benchmark by a factor of 4 by
partially evaluating specs.
2026-02-24 13:10:42 +00:00
Wojciech Różowski
65f112a165 chore: rename prime filter benchmark and fix the merge sort benchmark (#12669)
This PR renames the "Eratosthenes' sieve" benchmark description to
"prime filter" in the speedcenter config (following the discussion in
https://leanprover.zulipchat.com/#narrow/channel/270676-lean4/topic/sieve.20of.20Eratosthenes.20benchmark/with/575310824),
and adds the missing `#eval runBenchmarks` call to the merge sort
benchmark so it actually executes.
2026-02-24 10:57:47 +00:00
Markus Himmel
75b083d20a chore: API to prepare for String.split API (#12668)
This PR adds lemmas about string positions and patterns that will be
useful for providing high-level API lemmas for `String.split` and
friends.
2026-02-24 10:03:00 +00:00
Sebastian Ullrich
c595413fcc test: robustify but also CI-disable idbg test for now (#12667) 2026-02-24 09:19:53 +00:00
Kyle Miller
cd7f55b6c9 feat: pp.mdata (#12606)
This PR adds the pretty printer option `pp.mdata`, which causes the
pretty printer to annotate terms with any metadata that is present. For
example,
```lean
set_option pp.mdata true
/-- info: [mdata noindex:true] 2 : Nat -/
#guard_msgs in #check no_index 2
```
The `[mdata ...] e` syntax is only for pretty printing.

Thanks to @Rob23oba for an initial version.

Closes #10929
2026-02-24 04:30:26 +00:00
Kyle Miller
673d1a038c feat: clean up binder annotations inside of let rec definitions (#12608)
This PR continues #9674, cleaning up binder annotations inside the
bodies of `let rec` and `where` definitions.

Closes #11025
2026-02-24 04:24:47 +00:00
Lean stage0 autoupdater
66ce282364 chore: update stage0 2026-02-24 00:40:29 +00:00
Sebastian Graf
cdbed919ec fix: preserve TermInfo for do-match discriminant variables (#12666)
This PR fixes spurious unused variable warnings for variables used in
non-atomic match discriminants in `do` notation. For example, in `match
Json.parse s >>= fromJson? with`, the variable `s` would be reported as
unused.

The root cause is that `expandNonAtomicDiscrs?` eagerly elaborates the
discriminant via `Term.elabTerm`, which creates TermInfo for variable
references. The result is then passed to `elabDoElem` for further
elaboration. When the match elaboration is postponed (e.g. because the
discriminant type contains an mvar from `fromJson?`), the result is a
postponed synthetic mvar. The `withTermInfoContext'` wrapper in
`elabDoElemFns` checks `isTacticOrPostponedHole?` on this result,
detects a postponed mvar, and replaces the info subtree with a `hole`
node — discarding all the TermInfo that was accumulated during
discriminant elaboration.

The fix applies `mkSaveInfoAnnotation` to the result, which prevents
`isTacticOrPostponedHole?` from recognizing it as a hole. This is the
same mechanism that `elabLetMVar` uses to preserve info trees when the
body is a metavariable.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 23:54:17 +00:00
Sebastian Ullrich
6d86c8372a perf: shake Lean.Elab.Idbg (#12664) 2026-02-23 21:59:55 +00:00
Lean stage0 autoupdater
5c23579f93 chore: update stage0 2026-02-23 20:33:27 +00:00
Sebastian Ullrich
d0f8eb7bd6 fix: @[nospecialize] is never template-like (#12663)
This PR avoids false-positive error messages on specialization
restrictions under the module system when the declaration is explicitly
marked as not specializable. It could also provide some minor public
size and rebuild savings.
2026-02-23 20:00:36 +00:00
Sebastian Graf
65e5053008 fix: add TermInfo for mut vars in ControlStack.stateT.runInBase (#12661)
This PR fixes false-positive "unused variable" warnings for mutable
variables reassigned inside `try`/`catch` blocks with the new do
elaborator.

The root cause was that `ControlStack.stateT.runInBase` packed mutable
variables into a state tuple without calling `Term.addTermInfo'`, so the
unused variable linter could not see that the variables were used. The
fix mirrors how the `for` loop elaborator handles the same pattern in
`useLoopMutVars`.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 19:21:40 +00:00
Sebastian Ullrich
8f80881c2f feat: idbg interactive debug expression evaluator (#12648)
This PR adds the experimental `idbg e`, a new do-element (and term)
syntax for live debugging between the language server and a running
compiled Lean program.

When placed in a `do` block, `idbg` captures all local variables in
scope and expression `e`, then:

- **In the language server**: starts a TCP server on localhost waiting for the
  running program to connect; the editor will mark this part of the program as
  "in progress" during this wait, but that will not block `lake build` of the
  project.
- **In the compiled program**: on first execution of the `idbg` call site,
  connects to the server, receives the expression, compiles and evaluates it
  using the program's actual runtime values, and sends the `repr` result back.

The result is displayed as an info diagnostic on the `idbg` keyword. The
expression `e` can be edited while the program is running; each edit triggers
re-elaboration of `e`, a new TCP exchange, and an updated result. This makes
`idbg` a live REPL for inspecting and experimenting with program state at a
specific point in execution. Only when `idbg` is inserted, moved, or removed
does the program need to be recompiled and restarted.

# Known Limitations

* The program will poll for the server for up to 10 minutes and needs to be
  killed manually otherwise.
* Use of multiple `idbg` at once is untested; likely too much overhead from
  overlapping imports without further changes.
* `LEAN_PATH` must be properly set up so the compiled program can import its
  origin module.
* Untested on Windows and macOS.
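A hypothetical call site, to make the workflow above concrete (the program and all names are illustrative; only the `idbg e` do-element syntax is from this PR):

```lean
def sumUpTo (n : Nat) : IO Nat := do
  let mut acc := 0
  for i in [0:n] do
    acc := acc + i
    -- While the compiled program runs, edit the expression below in the
    -- editor to live-inspect the captured locals at this point.
    idbg (i, acc)
  return acc
```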
2026-02-23 17:22:44 +00:00
7087 changed files with 30778 additions and 5265 deletions

View File

@@ -4,29 +4,25 @@ To build Lean you should use `make -j$(nproc) -C build/release`.
## Running Tests
See `doc/dev/testing.md` for full documentation. Quick reference:
See `tests/README.md` for full documentation. Quick reference:
```bash
# Full test suite (use after builds to verify correctness)
make -j$(nproc) -C build/release test ARGS="-j$(nproc)"
CTEST_PARALLEL_LEVEL="$(nproc)" CTEST_OUTPUT_ON_FAILURE=1 \
make -C build/release -j "$(nproc)" test
# Specific test by name (supports regex via ctest -R)
make -j$(nproc) -C build/release test ARGS='-R grind_ematch --output-on-failure'
CTEST_PARALLEL_LEVEL="$(nproc)" CTEST_OUTPUT_ON_FAILURE=1 \
make -C build/release -j "$(nproc)" test ARGS='-R grind_ematch'
# Rerun only previously failed tests
make -j$(nproc) -C build/release test ARGS='--rerun-failed --output-on-failure'
CTEST_PARALLEL_LEVEL="$(nproc)" CTEST_OUTPUT_ON_FAILURE=1 \
make -C build/release -j "$(nproc)" test ARGS='--rerun-failed'
# Single test from tests/lean/run/ (quick check during development)
cd tests/lean/run && ./test_single.sh example_test.lean
# ctest directly (from stage1 build dir)
cd build/release/stage1 && ctest -j$(nproc) --output-on-failure --timeout 300
# Single test from tests/foo/bar/ (quick check during development)
cd tests/foo/bar && ./run_test example_test.lean
```
The full test suite includes `tests/lean/`, `tests/lean/run/`, `tests/lean/interactive/`,
`tests/compiler/`, `tests/pkg/`, Lake tests, and more. Using `make test` or `ctest` runs
all of them; `test_single.sh` in `tests/lean/run/` only covers that one directory.
## New features
When asked to implement new features:
@@ -34,8 +30,6 @@ When asked to implement new features:
* write comprehensive tests first (expecting that these will initially fail)
* and then iterate on the implementation until the tests pass.
All new tests should go in `tests/lean/run/`. These tests don't have expected output; we just check there are no errors. You should use `#guard_msgs` to check for specific messages.
## Success Criteria
*Never* report success on a task unless you have verified both a clean build without errors, and that the relevant tests pass.

View File

@@ -121,6 +121,20 @@ The nightly build system uses branches and tags across two repositories:
When a nightly succeeds with mathlib, all three should point to the same commit. Don't confuse these: branches are in the main lean4 repo, dated tags are in lean4-nightly.
## Waiting for CI or Merges
Use `gh pr checks --watch` to block until a PR's CI checks complete (no polling needed).
Run these as background bash commands so you get notified when they finish:
```bash
# Watch CI, then check merge state
gh pr checks <number> --repo <owner>/<repo> --watch && gh pr view <number> --repo <owner>/<repo> --json state --jq '.state'
```
For multiple PRs, launch one background command per PR in parallel. When each completes,
you'll be notified automatically via a task-notification. Do NOT use sleep-based polling
loops — `--watch` is event-driven and exits as soon as checks finish.
## Error Handling
**CRITICAL**: If something goes wrong or a command fails:

View File

@@ -0,0 +1,26 @@
---
name: profiling
description: Profile Lean programs with demangled names using samply and Firefox Profiler. Use when the user asks to profile a Lean binary or investigate performance.
allowed-tools: Bash, Read, Glob, Grep
---
# Profiling Lean Programs
Full documentation: `script/PROFILER_README.md`.
## Quick Start
```bash
script/lean_profile.sh ./build/release/stage1/bin/lean some_file.lean
```
Requires `samply` (`cargo install samply`) and `python3`.
## Agent Notes
- The pipeline is interactive (it serves the profile to the browser at the end). When running non-interactively, run the steps manually instead of using the wrapper script.
- The three steps are: `samply record --save-only`, `symbolicate_profile.py`, then `serve_profile.py`.
- `lean_demangle.py` works standalone as a stdin filter (like `c++filt`) for quick name lookups.
- The `--raw` flag on `lean_demangle.py` gives exact demangled names without postprocessing (keeps `._redArg`, `._lam_0` suffixes as-is).
- Use `PROFILE_KEEP=1` to keep the temp directory for later inspection.
- The demangled profile is a standard Firefox Profiler JSON. Function names live in `threads[i].stringArray`, indexed by `threads[i].funcTable.name`.
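Building on the note above about the profile layout, here is a minimal Python sketch of walking `stringArray` via `funcTable.name`; the profile dict is a hand-made stand-in for real samply output, not actual data:

```python
# Minimal sketch: extract function names from a Firefox Profiler JSON,
# assuming the layout described above (threads[i].stringArray holds the
# strings, threads[i].funcTable.name holds indices into it). A real profile
# would be loaded with json.load(); this dict is a hypothetical stand-in.
profile = {
    "threads": [
        {
            "stringArray": ["Lean.Elab.elabTerm", "List.map", "main"],
            "funcTable": {"name": [2, 0, 1]},
        }
    ]
}

def function_names(profile: dict) -> list[str]:
    names = []
    for thread in profile["threads"]:
        strings = thread["stringArray"]
        names.extend(strings[i] for i in thread["funcTable"]["name"])
    return names

print(function_names(profile))  # → ['main', 'Lean.Elab.elabTerm', 'List.map']
```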

View File

@@ -2,16 +2,19 @@ name: Check awaiting-manual label
on:
merge_group:
pull_request:
pull_request_target:
types: [opened, synchronize, reopened, labeled, unlabeled]
permissions:
pull-requests: read
jobs:
check-awaiting-manual:
runs-on: ubuntu-latest
steps:
- name: Check awaiting-manual label
id: check-awaiting-manual-label
if: github.event_name == 'pull_request'
if: github.event_name == 'pull_request_target'
uses: actions/github-script@v8
with:
script: |
@@ -28,7 +31,7 @@ jobs:
}
- name: Wait for manual compatibility
if: github.event_name == 'pull_request' && steps.check-awaiting-manual-label.outputs.awaiting == 'true'
if: github.event_name == 'pull_request_target' && steps.check-awaiting-manual-label.outputs.awaiting == 'true'
run: |
echo "::notice title=Awaiting manual::PR is marked 'awaiting-manual' but neither 'breaks-manual' nor 'builds-manual' labels are present."
echo "This check will remain in progress until the PR is updated with appropriate manual compatibility labels."

View File

@@ -2,16 +2,19 @@ name: Check awaiting-mathlib label
on:
merge_group:
pull_request:
pull_request_target:
types: [opened, synchronize, reopened, labeled, unlabeled]
permissions:
pull-requests: read
jobs:
check-awaiting-mathlib:
runs-on: ubuntu-latest
steps:
- name: Check awaiting-mathlib label
id: check-awaiting-mathlib-label
if: github.event_name == 'pull_request'
if: github.event_name == 'pull_request_target'
uses: actions/github-script@v8
with:
script: |
@@ -28,7 +31,7 @@ jobs:
}
- name: Wait for mathlib compatibility
if: github.event_name == 'pull_request' && steps.check-awaiting-mathlib-label.outputs.awaiting == 'true'
if: github.event_name == 'pull_request_target' && steps.check-awaiting-mathlib-label.outputs.awaiting == 'true'
run: |
echo "::notice title=Awaiting mathlib::PR is marked 'awaiting-mathlib' but neither 'breaks-mathlib' nor 'builds-mathlib' labels are present."
echo "This check will remain in progress until the PR is updated with appropriate mathlib compatibility labels."

View File

@@ -66,16 +66,10 @@ jobs:
brew install ccache tree zstd coreutils gmp libuv
if: runner.os == 'macOS'
- name: Checkout
if: (!endsWith(matrix.os, '-with-cache'))
uses: actions/checkout@v6
with:
# the default is to use a virtual merge commit between the PR and master: just use the PR
ref: ${{ github.event.pull_request.head.sha }}
- name: Namespace Checkout
if: endsWith(matrix.os, '-with-cache')
uses: namespacelabs/nscloud-checkout-action@v8
with:
ref: ${{ github.event.pull_request.head.sha }}
- name: Open Nix shell once
run: true
if: runner.os == 'Linux'
@@ -85,7 +79,7 @@ jobs:
- name: CI Merge Checkout
run: |
git fetch --depth=1 origin ${{ github.sha }}
git checkout FETCH_HEAD flake.nix flake.lock script/prepare-* tests/lean/run/importStructure.lean
git checkout FETCH_HEAD flake.nix flake.lock script/prepare-* tests/elab/importStructure.lean
if: github.event_name == 'pull_request'
# (needs to be after "Checkout" so files don't get overridden)
- name: Setup emsdk
@@ -235,25 +229,21 @@ jobs:
# prefix `if` above with `always` so it's run even if tests failed
if: always() && steps.test.conclusion != 'skipped'
- name: Check Test Binary
run: ${{ matrix.binary-check }} tests/compiler/534.lean.out
run: ${{ matrix.binary-check }} tests/compile/534.lean.out
if: (!matrix.cross) && steps.test.conclusion != 'skipped'
- name: Build Stage 2
run: |
make -C build -j$NPROC stage2
if: matrix.test-speedcenter
if: matrix.test-bench
- name: Check Stage 3
run: |
make -C build -j$NPROC check-stage3
if: matrix.check-stage3
- name: Test Speedcenter Benchmarks
- name: Test Benchmarks
run: |
# Necessary for some timing metrics but does not work on Namespace runners
# and we just want to test that the benchmarks run at all here
#echo -1 | sudo tee /proc/sys/kernel/perf_event_paranoid
export BUILD=$PWD/build PATH=$PWD/build/stage1/bin:$PATH
cd tests/bench
nix shell .#temci -c temci exec --config speedcenter.yaml --included_blocks fast --runs 1
if: matrix.test-speedcenter
cd tests
nix develop -c make -C ../build -j$NPROC bench
if: matrix.test-bench
- name: Check rebootstrap
run: |
set -e

View File

@@ -1,9 +1,12 @@
name: Check stdlib_flags.h modifications
on:
pull_request:
pull_request_target:
types: [opened, synchronize, reopened, labeled, unlabeled]
permissions:
pull-requests: read
jobs:
check-stdlib-flags:
runs-on: ubuntu-latest

View File

@@ -258,8 +258,8 @@ jobs:
"check-rebootstrap": level >= 1,
"check-stage3": level >= 2,
"test": true,
// NOTE: `test-speedcenter` currently seems to be broken on `ubuntu-latest`
"test-speedcenter": large && level >= 2,
// NOTE: `test-bench` currently seems to be broken on `ubuntu-latest`
"test-bench": large && level >= 2,
// We are not warning-free yet on all platforms, start here
"CMAKE_OPTIONS": "-DLEAN_EXTRA_CXX_FLAGS=-Werror",
},
@@ -269,6 +269,8 @@ jobs:
"enabled": level >= 2,
"test": true,
"CMAKE_PRESET": "reldebug",
// * `elab_bench/big_do` crashes with exit code 134
"CTEST_OPTIONS": "-E 'elab_bench/big_do'",
},
{
"name": "Linux fsanitize",

View File

@@ -2,17 +2,23 @@ name: Check PR body for changelog convention
on:
merge_group:
pull_request:
pull_request_target:
types: [opened, synchronize, reopened, edited, labeled, converted_to_draft, ready_for_review]
permissions:
pull-requests: read
jobs:
check-pr-body:
runs-on: ubuntu-latest
steps:
- name: Check PR body
if: github.event_name == 'pull_request'
if: github.event_name == 'pull_request_target'
uses: actions/github-script@v8
with:
# Safety note: this uses pull_request_target, so the workflow has elevated privileges.
# The PR title and body are only used in regex tests (read-only string matching),
# never interpolated into shell commands, eval'd, or written to GITHUB_ENV/GITHUB_OUTPUT.
script: |
const { title, body, labels, draft } = context.payload.pull_request;
if (!draft && /^(feat|fix):/.test(title) && !labels.some(label => label.name == "changelog-no")) {

1
.gitignore vendored
View File

@@ -1,7 +1,6 @@
*~
\#*
.#*
*.lock
.lake
lake-manifest.json
/build

View File

@@ -1,4 +1,8 @@
cmake_minimum_required(VERSION 3.11)
cmake_minimum_required(VERSION 3.21)
if(NOT CMAKE_GENERATOR MATCHES "Makefiles")
message(FATAL_ERROR "Only makefile generators are supported")
endif()
option(USE_MIMALLOC "use mimalloc" ON)
@@ -147,6 +151,7 @@ ExternalProject_Add(
INSTALL_COMMAND ""
DEPENDS stage2
EXCLUDE_FROM_ALL ON
STEP_TARGETS configure
)
# targets forwarded to appropriate stages
@@ -157,6 +162,25 @@ add_custom_target(update-stage0-commit COMMAND $(MAKE) -C stage1 update-stage0-c
add_custom_target(test COMMAND $(MAKE) -C stage1 test DEPENDS stage1)
add_custom_target(
bench
COMMAND $(MAKE) -C stage2
COMMAND $(MAKE) -C stage2 -j1 bench
DEPENDS stage2
)
add_custom_target(
bench-part1
COMMAND $(MAKE) -C stage2
COMMAND $(MAKE) -C stage2 -j1 bench-part1
DEPENDS stage2
)
add_custom_target(
bench-part2
COMMAND $(MAKE) -C stage2
COMMAND $(MAKE) -C stage2 -j1 bench-part2
DEPENDS stage2
)
add_custom_target(clean-stdlib COMMAND $(MAKE) -C stage1 clean-stdlib DEPENDS stage1)
install(CODE "execute_process(COMMAND make -C stage1 install)")

View File

@@ -41,7 +41,7 @@
"SMALL_ALLOCATOR": "OFF",
"USE_MIMALLOC": "OFF",
"BSYMBOLIC": "OFF",
"LEAN_TEST_VARS": "MAIN_STACK_SIZE=16000 LSAN_OPTIONS=max_leaks=10"
"LEAN_TEST_VARS": "MAIN_STACK_SIZE=16000 TEST_STACK_SIZE=16000 LSAN_OPTIONS=max_leaks=10"
},
"generator": "Unix Makefiles",
"binaryDir": "${sourceDir}/build/sanitize"

View File

@@ -1,5 +1,9 @@
# Test Suite
**Warning:** This document is partially outdated.
It describes the old test suite, which is currently in the process of being replaced.
The new test suite's documentation can be found at [`tests/README.md`](../../tests/README.md).
After [building Lean](../make/index.md) you can run all the tests using
```
cd build/release

View File

@@ -1 +1 @@
lean4
../../../build/release/stage1

View File

@@ -1 +1 @@
lean4
build/release/stage1

View File

@@ -2,21 +2,9 @@
"folders": [
{
"path": "."
},
{
"path": "src"
},
{
"path": "tests"
},
{
"path": "script"
}
],
"settings": {
// Open terminal at root, not current workspace folder
// (there is no way to directly refer to the root folder included as `.` above)
"terminal.integrated.cwd": "${workspaceFolder:src}/..",
"files.insertFinalNewline": true,
"files.trimTrailingWhitespace": true,
"cmake.buildDirectory": "${workspaceFolder}/build/release",

View File

@@ -83,7 +83,7 @@ def main (args : List String) : IO Unit := do
lastRSS? := some rss
let avgRSSDelta := totalRSSDelta / (n - 2)
IO.println s!"avg-reelab-rss-delta: {avgRSSDelta}"
IO.println s!"measurement: avg-reelab-rss-delta {avgRSSDelta*1024} b"
let _ ← Ipc.collectDiagnostics requestNo uri versionNo
(← Ipc.stdin).writeLspMessage (Message.notification "exit" none)

View File

@@ -82,7 +82,7 @@ def main (args : List String) : IO Unit := do
lastRSS? := some rss
let avgRSSDelta := totalRSSDelta / (n - 2)
IO.println s!"avg-reelab-rss-delta: {avgRSSDelta}"
IO.println s!"measurement: avg-reelab-rss-delta {avgRSSDelta*1024} b"
let _ ← Ipc.collectDiagnostics requestNo uri versionNo
Ipc.shutdown requestNo

View File

@@ -9,5 +9,5 @@ find -regex '.*/CMakeLists\.txt\(\.in\)?\|.*\.cmake\(\.in\)?' \
! -path "./stage0/*" \
-exec \
uvx gersemi --in-place --line-length 120 --indent 2 \
--definitions src/cmake/Modules/ src/CMakeLists.txt \
--definitions src/cmake/Modules/ src/CMakeLists.txt tests/CMakeLists.txt \
-- {} +

View File

@@ -1,4 +1,4 @@
#!/usr/bin/env python
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
#
# Copyright (c) 2015 Microsoft Corporation. All rights reserved.

View File

@@ -1,4 +1,4 @@
#!/usr/bin/env python
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
#
# Copyright (c) 2015 Microsoft Corporation. All rights reserved.

View File

@@ -1 +1 @@
lean4
../build/release/stage1

View File

@@ -1,4 +1,4 @@
#!/bin/bash
#!/usr/bin/env bash
# Profile a Lean binary with demangled names.
#
# Usage:

View File

@@ -1,7 +1,7 @@
#!/usr/bin/env bash
set -euo pipefail
rm -r stage0 || true
rm -rf stage0 || true
# don't copy untracked files
# `:!` is git glob flavor for exclude patterns
for f in $(git ls-files src ':!:src/lake/*' ':!:src/Leanc.lean'); do

View File

@@ -1,6 +1,4 @@
cmake_minimum_required(VERSION 3.10)
cmake_policy(SET CMP0054 NEW)
cmake_policy(SET CMP0110 NEW)
cmake_minimum_required(VERSION 3.21)
if(NOT CMAKE_GENERATOR MATCHES "Unix Makefiles")
message(FATAL_ERROR "The only supported CMake generator at the moment is 'Unix Makefiles'")
endif()

View File

@@ -1339,10 +1339,10 @@ transitive and contains `r`. `TransGen r a z` if and only if there exists a sequ
-/
inductive Relation.TransGen {α : Sort u} (r : α → α → Prop) : α → α → Prop
/-- If `r a b`, then `TransGen r a b`. This is the base case of the transitive closure. -/
| single {a b} : r a b → TransGen r a b
| single {a b : α} : r a b → TransGen r a b
/-- If `TransGen r a b` and `r b c`, then `TransGen r a c`.
This is the inductive case of the transitive closure. -/
| tail {a b c} : TransGen r a b → r b c → TransGen r a c
| tail {a b c : α} : TransGen r a b → r b c → TransGen r a c
/-- The transitive closure is transitive. -/
theorem Relation.TransGen.trans {α : Sort u} {r : α → α → Prop} {a b c} :

View File

@@ -283,7 +283,7 @@ Examples:
* `#[1, 2].isEmpty = false`
* `#[()].isEmpty = false`
-/
@[expose]
@[expose, inline]
def isEmpty (xs : Array α) : Bool :=
xs.size = 0
@@ -377,6 +377,7 @@ Returns the last element of an array, or panics if the array is empty.
Safer alternatives include `Array.back`, which requires a proof the array is non-empty, and
`Array.back?`, which returns an `Option`.
-/
@[inline]
def back! [Inhabited α] (xs : Array α) : α :=
xs[xs.size - 1]!
@@ -386,6 +387,7 @@ Returns the last element of an array, given a proof that the array is not empty.
See `Array.back!` for the version that panics if the array is empty, or `Array.back?` for the
version that returns an option.
-/
@[inline]
def back (xs : Array α) (h : 0 < xs.size := by get_elem_tactic) : α :=
xs[xs.size - 1]'(Nat.sub_one_lt_of_lt h)
@@ -395,6 +397,7 @@ Returns the last element of an array, or `none` if the array is empty.
See `Array.back!` for the version that panics if the array is empty, or `Array.back` for the version
that requires a proof the array is non-empty.
-/
@[inline]
def back? (xs : Array α) : Option α :=
xs[xs.size - 1]?

View File

@@ -72,6 +72,9 @@ theorem toArray_eq : List.toArray as = xs ↔ as = xs.toList := by
/-! ### size -/
theorem size_singleton {x : α} : #[x].size = 1 := by
simp
theorem eq_empty_of_size_eq_zero (h : xs.size = 0) : xs = #[] := by
cases xs
simp_all
@@ -3483,6 +3486,21 @@ theorem foldl_eq_foldr_reverse {xs : Array α} {f : β → α → β} {b} :
theorem foldr_eq_foldl_reverse {xs : Array α} {f : α → β → β} {b} :
xs.foldr f b = xs.reverse.foldl (fun x y => f y x) b := by simp
theorem foldl_eq_apply_foldr {xs : Array α} {f : α → α → α}
[Std.Associative f] [Std.LawfulRightIdentity f init] :
xs.foldl f x = f x (xs.foldr f init) := by
simp [← foldl_toList, ← foldr_toList, List.foldl_eq_apply_foldr]
theorem foldr_eq_apply_foldl {xs : Array α} {f : α → α → α}
[Std.Associative f] [Std.LawfulLeftIdentity f init] :
xs.foldr f x = f (xs.foldl f init) x := by
simp [← foldl_toList, ← foldr_toList, List.foldr_eq_apply_foldl]
theorem foldr_eq_foldl {xs : Array α} {f : α → α → α}
[Std.Associative f] [Std.LawfulIdentity f init] :
xs.foldr f init = xs.foldl f init := by
simp [foldl_eq_apply_foldr, Std.LawfulLeftIdentity.left_id]
@[simp] theorem foldr_push_eq_append {as : Array α} {bs : Array β} {f : α → β} (w : start = as.size) :
as.foldr (fun a xs => Array.push xs (f a)) bs start 0 = bs ++ (as.map f).reverse := by
subst w
@@ -4335,16 +4353,33 @@ def sum_eq_sum_toList := @sum_toList
@[simp, grind =]
theorem sum_append [Zero α] [Add α] [Std.Associative (α := α) (· + ·)]
[Std.LeftIdentity (α := α) (· + ·) 0] [Std.LawfulLeftIdentity (α := α) (· + ·) 0]
[Std.LawfulLeftIdentity (α := α) (· + ·) 0]
{as₁ as₂ : Array α} : (as₁ ++ as₂).sum = as₁.sum + as₂.sum := by
simp [← sum_toList, List.sum_append]
@[simp, grind =]
theorem sum_singleton [Add α] [Zero α] [Std.LawfulRightIdentity (· + ·) (0 : α)] {x : α} :
#[x].sum = x := by
simp [Array.sum_eq_foldr, Std.LawfulRightIdentity.right_id x]
@[simp, grind =]
theorem sum_push [Add α] [Zero α] [Std.Associative (α := α) (· + ·)]
[Std.LawfulIdentity (· + ·) (0 : α)] {xs : Array α} {x : α} :
(xs.push x).sum = xs.sum + x := by
simp [Array.sum_eq_foldr, Std.LawfulRightIdentity.right_id, Std.LawfulLeftIdentity.left_id,
Array.foldr_assoc]
@[simp, grind =]
theorem sum_reverse [Zero α] [Add α] [Std.Associative (α := α) (· + ·)]
[Std.Commutative (α := α) (· + ·)]
[Std.LawfulLeftIdentity (α := α) (· + ·) 0] (xs : Array α) : xs.reverse.sum = xs.sum := by
simp [← sum_toList, List.sum_reverse]
theorem sum_eq_foldl [Zero α] [Add α] [Std.Associative (α := α) (· + ·)]
[Std.LawfulIdentity (· + ·) (0 : α)] {xs : Array α} :
xs.sum = xs.foldl (init := 0) (· + ·) := by
simp [← sum_toList, List.sum_eq_foldl]
theorem foldl_toList_eq_flatMap {l : List α} {acc : Array β}
{F : Array β → α → Array β} {G : α → List β}
(H : ∀ acc a, (F acc a).toList = acc.toList ++ G a) :

View File

@@ -126,6 +126,14 @@ theorem swap_perm {xs : Array α} {i j : Nat} (h₁ : i < xs.size) (h₂ : j < x
simp only [swap, perm_iff_toList_perm, toList_set]
apply set_set_perm
theorem Perm.pairwise_iff {R : α α Prop} (S : {x y}, R x y R y x) {xs ys : Array α}
: _p : xs.Perm ys, xs.toList.Pairwise R ys.toList.Pairwise R := by
simpa only [perm_iff_toList_perm] using List.Perm.pairwise_iff S
theorem Perm.pairwise {R : α → α → Prop} {xs ys : Array α} (hp : xs ~ ys)
(hR : xs.toList.Pairwise R) (hsymm : ∀ {x y}, R x y → R y x) :
ys.toList.Pairwise R := (hp.pairwise_iff hsymm).mp hR
namespace Perm
set_option linter.indexVariables false in

View File

@@ -2393,4 +2393,412 @@ theorem fastUmulOverflow (x y : BitVec w) :
simp [← Nat.pow_add, show w + 1 - (k - 1) + k = w + 1 + 1 by omega] at this
omega
/-! ### Population Count -/
/-- Extract the `idx`-th bit from `x` and zero-extend it to have length `len`. -/
def extractAndExtendBit (idx len : Nat) (x : BitVec w) : BitVec len :=
BitVec.zeroExtend len (BitVec.extractLsb' idx 1 x)
/-- Recursively extract one bit at a time and extend it to width `len`. -/
def extractAndExtendAux (k len : Nat) (x : BitVec w) (acc : BitVec (k * len)) (hle : k ≤ w) :
BitVec (w * len) :=
match hwi : w - k with
| 0 => acc.cast (by simp [show w = k by omega])
| n' + 1 =>
let acc' := extractAndExtendBit k len x ++ acc
extractAndExtendAux (k + 1) len x (acc'.cast (by simp [Nat.add_mul]; omega)) (by omega)
termination_by w - k
/-- We instantiate `extractAndExtendAux` starting from bit 0, extending each
bit in `x` to have width `len` and returning a `BitVec (w * len)`. -/
def extractAndExtend (len : Nat) (x : BitVec w) : BitVec (w * len) :=
extractAndExtendAux 0 len x ((0#0).cast (by simp)) (by omega)
/--
Construct a layer of the parallel-prefix-sum tree by summing two-by-two the
`w`-long words in `oldLayer`, returning a bitvector containing `(len + 1) / 2`
flattened `w`-long words, each resulting from one addition.
-/
def cpopLayer (oldLayer : BitVec (len * w)) (newLayer : BitVec (iterNum * w))
(hold : 2 * (iterNum - 1) < len) : BitVec (((len + 1)/2) * w) :=
if hlen : len - (iterNum * 2) = 0 then
have : ((len + 1)/2) = iterNum := by omega
newLayer.cast (by simp [this])
else
let op1 := oldLayer.extractLsb' ((2 * iterNum) * w) w
let op2 := oldLayer.extractLsb' ((2 * iterNum + 1) * w) w
let newLayer' := (op1 + op2) ++ newLayer
have hcast : w + iterNum * w = (iterNum + 1) * w := by simp [Nat.add_mul]; omega
cpopLayer oldLayer (newLayer'.cast hcast) (by omega)
termination_by len - (iterNum * 2)
/--
Given a `BitVec (len * w)` of `len` flattened `w`-long words,
construct a binary tree that sums two-by-two the `w`-long words of the previous
layer, ultimately returning a single `w`-long word containing the total sum.
-/
def cpopTree (l : BitVec (len * w)) : BitVec w :=
if h : len = 0 then 0#w
else if h : len = 1 then
l.cast (by simp [h])
else
cpopTree (cpopLayer l 0#(0 * w) (by omega))
termination_by len
/--
Given a bitvector `x : BitVec w`, construct a parallel-prefix-sum circuit
adding together all the bits of `x`, i.e. computing its population count.
-/
def cpopRec (x : BitVec w) : BitVec w :=
if hw : 1 < w then
let extendedBits := x.extractAndExtend w
(cpopTree extendedBits).cast (by simp)
else if hw' : 0 < w then
x
else
0#w
/-- Recursive addition of the elements in a flattened bitvec, starting from the `rem`-th element. -/
private def addRecAux (x : BitVec (l * w)) (rem : Nat) (acc : BitVec w) : BitVec w :=
match rem with
| 0 => acc
| n + 1 => x.addRecAux n (acc + x.extractLsb' (n * w) w)
/-- Recursive addition of the elements in a flattened bitvec. -/
private def addRec (x : BitVec (l * w)) : BitVec w := addRecAux x l 0#w
theorem getLsbD_extractAndExtendBit {x : BitVec w} :
(extractAndExtendBit k len x).getLsbD i =
(decide (i = 0) && decide (0 < len) && x.getLsbD k) := by
simp only [extractAndExtendBit, truncate_eq_setWidth, getLsbD_setWidth, getLsbD_extractLsb',
Nat.lt_one_iff]
by_cases hi : i = 0
<;> simp [hi]
@[simp]
private theorem extractAndExtendAux_zero {k len : Nat} {x : BitVec w}
{acc : BitVec (k * len)} (heq : w = k) :
extractAndExtendAux k len x acc (by omega) = acc.cast (by simp [heq]) := by
unfold extractAndExtendAux
split
· simp
· omega
private theorem extractLsb'_extractAndExtendAux {k len : Nat} {x : BitVec w}
(acc : BitVec (k * len)) (hle : k ≤ w) :
(∀ i (_ : i < k), acc.extractLsb' (i * len) len = (x.extractLsb' i 1).setWidth len) →
(extractAndExtendAux k len x acc (by omega)).extractLsb' (i * len) len =
(x.extractLsb' i 1).setWidth len := by
intros hacc
induction hwi : w - k generalizing acc k
· case zero =>
rw [extractAndExtendAux_zero (by omega)]
by_cases hj : i < k
· apply hacc
exact hj
· ext l hl
have := mul_le_mul_right (n := k) (m := i) len (by omega)
simp [← getLsbD_eq_getElem, getLsbD_extractLsb', hl, getLsbD_setWidth,
show w ≤ i + l by omega, getLsbD_of_ge acc (i * len + l) (by omega)]
· case succ n' ihn' =>
rw [extractAndExtendAux]
split
· omega
· apply ihn'
· intros i hi
have hcast : len + k * len = (k + 1) * len := by
simp [Nat.mul_comm, Nat.mul_add, Nat.add_comm]
by_cases hi' : i < k
· have heq : extractLsb' (i * len) len (BitVec.cast hcast (extractAndExtendBit k len x ++ acc)) =
extractLsb' (i * len) len ((extractAndExtendBit k len x ++ acc)) := by
ext; simp
rw [heq, extractLsb'_append_of_lt hi']
apply hacc
exact hi'
· have heq : extractLsb' (i * len) len (BitVec.cast hcast (extractAndExtendBit k len x ++ acc)) =
extractLsb' (i * len) len ((extractAndExtendBit k len x ++ acc)) := by
ext; simp
rw [heq, extractLsb'_append_of_eq (by omega)]
simp [show i = k by omega, extractAndExtendBit]
· omega
theorem extractLsb'_cpopLayer {w iterNum i oldLen : Nat} {oldLayer : BitVec (oldLen * w)}
{newLayer : BitVec (iterNum * w)} (hold : 2 * (iterNum - 1) < oldLen) :
(∀ i (_hi: i < iterNum),
newLayer.extractLsb' (i * w) w =
oldLayer.extractLsb' ((2 * i) * w) w + (oldLayer.extractLsb' ((2 * i + 1) * w) w)) →
extractLsb' (i * w) w (oldLayer.cpopLayer newLayer hold) =
extractLsb' (2 * i * w) w oldLayer + extractLsb' ((2 * i + 1) * w) w oldLayer := by
intro proof_addition
rw [cpopLayer]
split
· by_cases hi : i < iterNum
· simp only [extractLsb'_cast]
apply proof_addition
exact hi
· ext j hj
have : iterNum * w ≤ i * w := by refine mul_le_mul_right w (by omega)
have : oldLen * w ≤ (2 * i) * w := by refine mul_le_mul_right w (by omega)
have : oldLen * w ≤ (2 * i + 1) * w := by refine mul_le_mul_right w (by omega)
have hz : extractLsb' (2 * i * w) w oldLayer = 0#w := by
ext j hj
simp [show oldLen * w ≤ 2 * i * w + j by omega]
have hz' : extractLsb' ((2 * i + 1) * w) w oldLayer = 0#w := by
ext j hj
simp [show oldLen * w ≤ (2 * i + 1) * w + j by omega]
simp [show iterNum * w ≤ i * w + j by omega, hz, hz']
· generalize hop1 : oldLayer.extractLsb' ((2 * iterNum) * w) w = op1
generalize hop2 : oldLayer.extractLsb' ((2 * iterNum + 1) * w) w = op2
have hcast : w + iterNum * w = (iterNum + 1) * w := by simp [Nat.add_mul]; omega
apply extractLsb'_cpopLayer
intros i hi
by_cases hlt : i < iterNum
· rw [extractLsb'_cast, extractLsb'_append_eq_of_add_le]
· apply proof_addition
exact hlt
· rw [show i * w + w = i * w + 1 * w by omega, Nat.add_mul]
exact mul_le_mul_right w hlt
· rw [extractLsb'_cast, show i = iterNum by omega, extractLsb'_append_eq_left, hop1, hop2]
termination_by oldLen - 2 * (iterNum + 1 - 1)
theorem getLsbD_cpopLayer {w iterNum: Nat} {oldLayer : BitVec (oldLen * w)}
{newLayer : BitVec (iterNum * w)} (hold : 2 * (iterNum - 1) < oldLen) :
(∀ i (_hi: i < iterNum),
newLayer.extractLsb' (i * w) w =
oldLayer.extractLsb' ((2 * i) * w) w + (oldLayer.extractLsb' ((2 * i + 1) * w) w)) →
(oldLayer.cpopLayer newLayer hold).getLsbD k =
(extractLsb' (2 * ((k - k % w) / w) * w) w oldLayer +
extractLsb' ((2 * ((k - k % w) / w) + 1) * w) w oldLayer).getLsbD (k % w) := by
intro proof_addition
by_cases hw0 : w = 0
· subst hw0
simp
· simp only [← extractLsb'_cpopLayer (hold := by omega) proof_addition,
Nat.mod_lt (x := k) (y := w) (by omega), getLsbD_eq_getElem, getElem_extractLsb']
congr
by_cases hmod : k % w = 0
· rw [hmod, Nat.sub_zero, Nat.add_zero, Nat.div_mul_cancel (by omega)]
· rw [Nat.div_mul_cancel (by exact dvd_sub_mod k), Nat.sub_add_cancel (by exact mod_le k w)]
@[simp]
private theorem addRecAux_zero {x : BitVec (l * w)} {acc : BitVec w} :
x.addRecAux 0 acc = acc := rfl
@[simp]
private theorem addRecAux_succ {x : BitVec (l * w)} {n : Nat} {acc : BitVec w} :
x.addRecAux (n + 1) acc = x.addRecAux n (acc + extractLsb' (n * w) w x) := rfl
private theorem addRecAux_eq {x : BitVec (l * w)} {n : Nat} {acc : BitVec w} :
x.addRecAux n acc = x.addRecAux n 0#w + acc := by
induction n generalizing acc
· case zero =>
simp
· case succ n ihn =>
simp only [addRecAux_succ, BitVec.zero_add, ihn (acc := extractLsb' (n * w) w x),
BitVec.add_assoc, ihn (acc := acc + extractLsb' (n * w) w x), BitVec.add_right_inj]
rw [BitVec.add_comm (x := acc)]
private theorem extractLsb'_addRecAux_of_le {x : BitVec (len * w)} (h : r ≤ k) :
(extractLsb' 0 (k * w) x).addRecAux r 0#w = x.addRecAux r 0#w := by
induction r generalizing x len k
· case zero =>
simp [addRecAux]
· case succ diff ihdiff =>
simp only [addRecAux_succ, BitVec.zero_add]
have hext : diff * w + w ≤ k * w := by
simp only [show diff * w + w = (diff + 1) * w by simp [Nat.add_mul]]
exact Nat.mul_le_mul_right w h
rw [extractLsb'_extractLsb'_of_le hext, addRecAux_eq (x := x),
addRecAux_eq (x := extractLsb' 0 (k * w) x), ihdiff (x := x) (by omega) (k := k)]
private theorem extractLsb'_extractAndExtend_eq {i len : Nat} {x : BitVec w} :
(extractAndExtend len x).extractLsb' (i * len) len = extractAndExtendBit i len x := by
unfold extractAndExtend
by_cases hilt : i < w
· ext j hj
simp [extractLsb'_extractAndExtendAux, extractAndExtendBit]
· ext k hk
have := Nat.mul_le_mul_right (n := w) (k := len) (m := i) (by omega)
simp only [extractAndExtendBit, cast_ofNat, getElem_extractLsb', truncate_eq_setWidth,
getElem_setWidth, getLsbD_extractLsb', Nat.lt_one_iff]
rw [getLsbD_of_ge, getLsbD_of_ge]
· simp
· omega
· omega
private theorem addRecAux_append_extractLsb' {x : BitVec (len * w)} (ha : 0 < len) :
((x.extractLsb' ((len - 1) * w) w ++
x.extractLsb' 0 ((len - 1) * w)).cast (m := len * w) hcast).addRecAux len 0#w =
x.extractLsb' ((len - 1) * w) w +
(x.extractLsb' 0 ((len - 1) * w)).addRecAux (len - 1) 0#w := by
simp only [extractLsb'_addRecAux_of_le (k := len - 1) (r := len - 1) (by omega),
BitVec.append_extractLsb'_of_lt (hcast := hcast)]
have hsucc := addRecAux_succ (x := x) (acc := 0#w) (n := len - 1)
rw [BitVec.zero_add, Nat.sub_one_add_one (by omega)] at hsucc
rw [hsucc, addRecAux_eq, BitVec.add_comm]
private theorem Nat.mul_add_le_mul_of_succ_le {a b c : Nat} (h : a + 1 ≤ c) :
a * b + b ≤ c * b := by
rw [← Nat.succ_mul]
exact mul_le_mul_right b h
/--
The recursive addition of `w`-long words on two flattened bitvectors `x` and `y` (with possibly
different numbers of words `len` and `len'`, respectively) returns the same value, provided
that each `w`-long word in `x` results from the addition of two adjacent `w`-long words in `y`,
using exactly all `w`-long words in `y`.
-/
private theorem addRecAux_eq_of {x : BitVec (len * w)} {y : BitVec (len' * w)}
(hlen : len = (len' + 1) / 2) :
(∀ (i : Nat) (_h : i < (len' + 1) / 2),
extractLsb' (i * w) w x = extractLsb' (2 * i * w) w y + extractLsb' ((2 * i + 1) * w) w y) →
x.addRecAux len 0#w = y.addRecAux len' 0#w := by
intro hadd
induction len generalizing len' y
· case zero =>
simp [show len' = 0 by omega]
· case succ len ih =>
have hcast : w + (len + 1 - 1) * w = (len + 1) * w := by
simp [Nat.add_mul, Nat.add_comm]
have hcast' : w + (len' - 1) * w = len' * w := by
rw [Nat.sub_mul, Nat.one_mul,
Nat.add_sub_assoc (by refine Nat.le_mul_of_pos_left w (by omega)), Nat.add_comm]
simp
rw [addRecAux_succ, BitVec.append_extractLsb'_of_lt (x := x) (hcast := hcast)]
have happ := addRecAux_append_extractLsb' (len := len + 1) (x := x) (hcast := hcast) (by omega)
simp only [Nat.add_one_sub_one, addRecAux_succ, BitVec.zero_add] at happ
simp only [Nat.add_one_sub_one, BitVec.zero_add, happ]
have := Nat.succ_mul (n := len' - 1) (m := w)
rw [succ_eq_add_one, Nat.sub_one_add_one (by omega)] at this
by_cases hmod : len' % 2 = 0
· /- `sum` results from the addition of the last two elements in `y`, `sum = op1 + op2` -/
have := Nat.mul_le_mul_right (n := len' - 1 - 1) (m := len' - 1) (k := w) (by omega)
have := Nat.succ_mul (n := len' - 1 - 1) (m := w)
have hcast'' : w + (len' - 1 - 1) * w = (len' - 1) * w := by
rw [Nat.sub_mul, Nat.one_mul,
Nat.add_sub_assoc (k := w) (by refine Nat.le_mul_of_pos_left w (by omega))]
simp
rw [succ_eq_add_one, Nat.sub_one_add_one (by omega)] at this
rw [← BitVec.append_extractLsb'_of_lt (x := y) (hcast := hcast'),
addRecAux_append_extractLsb' (by omega),
← BitVec.append_extractLsb'_of_lt (x := extractLsb' 0 ((len' - 1) * w) y) (hcast := hcast''),
addRecAux_append_extractLsb' (by omega),
extractLsb'_extractLsb'_of_le (by exact Nat.mul_add_le_mul_of_succ_le (by omega)),
extractLsb'_extractLsb'_of_le (by omega), BitVec.add_assoc, hadd (_h := by omega)]
congr 1
· rw [show len = (len' + 1) / 2 - 1 by omega, BitVec.add_comm]
congr <;> omega
· apply ih
· omega
· intros
rw [extractLsb'_extractLsb'_of_le (by exact Nat.mul_add_le_mul_of_succ_le (by omega)),
extractLsb'_extractLsb'_of_le (by exact Nat.mul_add_le_mul_of_succ_le (by omega)),
extractLsb'_extractLsb'_of_le (by exact Nat.mul_add_le_mul_of_succ_le (by omega)),
hadd (_h := by omega)]
· /- `sum` results from the addition of the last element in `y` with `0#w` -/
have : len' * w ≤ (len' - 1 + 1) * w := by exact mul_le_mul_right w (by omega)
rw [← BitVec.append_extractLsb'_of_lt (x := y) (hcast := hcast'),
addRecAux_append_extractLsb' (by omega), hadd (_h := by omega),
show 2 * len = len' - 1 by omega]
congr 1
· rw [BitVec.add_right_eq_self]
ext k hk
simp only [getElem_extractLsb', getElem_zero]
apply getLsbD_of_ge y ((len' - 1 + 1) * w + k) (by omega)
· apply ih
· omega
· intros
rw [extractLsb'_extractLsb'_of_le (by exact Nat.mul_add_le_mul_of_succ_le (by omega)),
extractLsb'_extractLsb'_of_le (by exact Nat.mul_add_le_mul_of_succ_le (by omega)),
extractLsb'_extractLsb'_of_le (by exact Nat.mul_add_le_mul_of_succ_le (by omega)),
hadd (_h := by omega)]
private theorem getLsbD_extractAndExtend_of_lt {x : BitVec w} (hk : k < v) :
(x.extractAndExtend v).getLsbD (pos * v + k) = (extractAndExtendBit pos v x).getLsbD k := by
simp [← extractLsb'_extractAndExtend_eq (w := w) (len := v) (i := pos) (x := x)]
omega
/--
Extracting a bit from a `BitVec.extractAndExtend` is the same as extracting a bit
from a zero-extended bit at a certain position in the original bitvector.
-/
theorem getLsbD_extractAndExtend {x : BitVec w} (hv : 0 < v) :
(BitVec.extractAndExtend v x).getLsbD k =
(BitVec.extractAndExtendBit ((k - (k % v)) / v) v x).getLsbD (k % v) := by
rw [← getLsbD_extractAndExtend_of_lt (by exact mod_lt k hv)]
congr
by_cases hmod : k % v = 0
· simp only [hmod, Nat.sub_zero, Nat.add_zero]
rw [Nat.div_mul_cancel (by omega)]
· rw [← Nat.div_eq_sub_mod_div]
exact Eq.symm (div_add_mod' k v)
private theorem addRecAux_extractAndExtend_eq_cpopNatRec {x : BitVec w} :
(extractAndExtend w x).addRecAux n 0#w = x.cpopNatRec n 0 := by
induction n
· case zero =>
simp
· case succ n' ihn' =>
rw [cpopNatRec_succ, Nat.zero_add, natCast_eq_ofNat, addRecAux_succ, BitVec.zero_add,
addRecAux_eq, cpopNatRec_eq, ihn', ofNat_add, natCast_eq_ofNat, BitVec.add_right_inj,
extractLsb'_extractAndExtend_eq]
ext k hk
simp only [extractAndExtendBit, getLsbD_eq_getElem, getLsbD_ofNat, hk, decide_true,
Bool.true_and, truncate_eq_setWidth, getLsbD_setWidth, getLsbD_extractLsb', Nat.lt_one_iff]
by_cases hk0 : k = 0
· simp only [hk0, testBit_zero, decide_true, Nat.add_zero, Bool.true_and]
cases x.getLsbD n' <;> simp
· simp only [show ¬k = 0 by omega, decide_false, Bool.false_and]
symm
apply testBit_lt_two_pow ?_
have : (x.getLsbD n').toNat ≤ 1 := by
cases x.getLsbD n' <;> simp
have : 1 < 2 ^ k := by exact Nat.one_lt_two_pow hk0
omega
private theorem addRecAux_extractAndExtend_eq_cpop {x : BitVec w} :
(extractAndExtend w x).addRecAux w 0#w = x.cpop := by
simp only [cpop]
apply addRecAux_extractAndExtend_eq_cpopNatRec
private theorem addRecAux_cpopTree {x : BitVec (len * w)} :
addRecAux ((cpopTree x).cast (m := 1 * w) (by simp)) 1 0#w = addRecAux x len 0#w := by
unfold cpopTree
split
· case _ h =>
subst h
simp [addRecAux]
· case _ h =>
split
· case _ h' =>
simp only [addRecAux_succ, Nat.zero_mul, BitVec.zero_add, addRecAux_zero, h']
ext; simp
· rw [addRecAux_cpopTree]
apply BitVec.addRecAux_eq_of (x := cpopLayer x 0#(0 * w) (by omega)) (y := x)
· rfl
· intros j hj
simp [extractLsb'_cpopLayer]
termination_by len
private theorem addRecAux_eq_cpopTree {x : BitVec (len * w)} :
x.addRecAux len 0#w = (x.cpopTree).cast (by simp) := by
rw [← addRecAux_cpopTree, addRecAux_succ, Nat.zero_mul, BitVec.zero_add, addRecAux_zero]
ext k hk
simp [← getLsbD_eq_getElem, hk]
theorem cpop_eq_cpopRec {x : BitVec w} :
BitVec.cpop x = BitVec.cpopRec x := by
unfold BitVec.cpopRec
split
· simp [← addRecAux_extractAndExtend_eq_cpop, addRecAux_eq_cpopTree (x := extractAndExtend w x)]
· split
· ext k hk
cases hx : x.getLsbD 0
<;> simp [hx, cpop, getLsbD_eq_getElem, show k = 0 by omega, show w = 1 by omega]
· have hw : w = 0 := by omega
subst hw
simp [of_length_zero]
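A hypothetical sanity check (not part of the diff) can relate the circuit-style implementation to the reference `cpop` on a concrete value, using the equivalence proved above:

```lean
-- Sketch: 0b1011#4 has three set bits; rewrite with `cpop_eq_cpopRec` and
-- evaluate the reference implementation.
example : (0b1011#4).cpopRec = 3#4 := by
  rw [← cpop_eq_cpopRec]
  decide
```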
end BitVec

View File

@@ -2786,6 +2786,14 @@ theorem msb_append {x : BitVec w} {y : BitVec v} :
rw [getElem_append] -- Why does this not work with `simp [getElem_append]`?
simp
theorem append_of_zero_width (x : BitVec w) (y : BitVec v) (h : w = 0) :
(x ++ y) = y.cast (by simp [h]) := by
ext i ih
subst h
simp [← getLsbD_eq_getElem, getLsbD_append]
omega
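A minimal sketch of how `append_of_zero_width` applies (hypothetical, instantiated at a concrete width):

```lean
-- Sketch: appending a zero-width bitvector is the identity up to a `cast`
-- that fixes the width index.
example (y : BitVec 8) : (0#0) ++ y = y.cast (by simp) :=
  BitVec.append_of_zero_width (0#0) y rfl
```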
set_option backward.isDefEq.respectTransparency false in
@[grind =]
theorem toInt_append {x : BitVec n} {y : BitVec m} :
(x ++ y).toInt = if n == 0 then y.toInt else (2 ^ m) * x.toInt + y.toNat := by
@@ -3012,6 +3020,34 @@ theorem extractLsb'_append_extractLsb'_eq_extractLsb' {x : BitVec w} (h : start
congr 1
omega
theorem append_extractLsb'_of_lt {x : BitVec (x_len * w)} :
(x.extractLsb' ((x_len - 1) * w) w ++ x.extractLsb' 0 ((x_len - 1) * w)).cast hcast = x := by
ext i hi
simp only [getElem_cast, getElem_append, getElem_extractLsb', Nat.zero_add, dite_eq_ite]
rw [← getLsbD_eq_getElem, ite_eq_left_iff, Nat.not_lt]
intros
simp only [show (x_len - 1) * w + (i - (x_len - 1) * w) = i by omega]
theorem extractLsb'_append_of_lt {x : BitVec (k * w)} {y : BitVec w} (hlt : i < k) :
extractLsb' (i * w) w (y ++ x) = extractLsb' (i * w) w x := by
ext j hj
simp only [← getLsbD_eq_getElem, getLsbD_extractLsb', hj, decide_true, getLsbD_append,
Bool.true_and, ite_eq_left_iff, Nat.not_lt]
intros h
by_cases hw0 : w = 0
· subst hw0
simp
· have : i * w ≤ (k - 1) * w := Nat.mul_le_mul_right w (by omega)
have h' : i * w + j < (k - 1 + 1) * w := by simp [Nat.add_mul]; omega
rw [Nat.sub_one_add_one (by omega)] at h'
omega
theorem extractLsb'_append_of_eq {x : BitVec (k * w)} {y : BitVec w} (heq : i = k) :
extractLsb' (i * w) w (y ++ x) = y := by
ext j hj
simp [← getLsbD_eq_getElem, getLsbD_append, hj, heq]
/-- Combine adjacent `~~~ (extractLsb _)'` operations into a single `~~~ (extractLsb _)'`. -/
theorem not_extractLsb'_append_not_extractLsb'_eq_not_extractLsb' {x : BitVec w} (h : start₂ = start₁ + len₁) :
(~~~ (x.extractLsb' start₂ len₂) ++ ~~~ (x.extractLsb' start₁ len₁)) =

View File

@@ -414,7 +414,7 @@ Renders a `Format` to a string.
-/
def pretty (f : Format) (width : Nat := defWidth) (indent : Nat := 0) (column := 0) : String :=
let act : StateM State Unit := prettyM f width indent
State.out <| act (State.mk "" column) |>.snd
State.out <| act.run (State.mk "" column) |>.snd
end Format
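A hypothetical usage sketch of `pretty` (the names `Format.group`, `Format.text`, and `Format.line` are from the existing `Std.Format` API):

```lean
-- Sketch: render a grouped format; at width 10 the flattened group
-- ("hello world", 11 characters) does not fit, so `Format.line` breaks.
open Std in
#eval (Format.group (Format.text "hello" ++ Format.line ++ Format.text "world")).pretty (width := 10)
```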

View File

@@ -36,3 +36,5 @@ public import Init.Data.List.FinRange
public import Init.Data.List.Lex
public import Init.Data.List.Range
public import Init.Data.List.Scan
public import Init.Data.List.ControlImpl
public import Init.Data.List.SplitOn

View File

@@ -135,7 +135,11 @@ protected def beq [BEq α] : List α → List α → Bool
@[simp] theorem beq_nil_nil [BEq α] : List.beq ([] : List α) ([] : List α) = true := rfl
@[simp] theorem beq_cons_nil [BEq α] {a : α} {as : List α} : List.beq (a::as) [] = false := rfl
@[simp] theorem beq_nil_cons [BEq α] {a : α} {as : List α} : List.beq [] (a::as) = false := rfl
theorem beq_cons [BEq α] {a b : α} {as bs : List α} : List.beq (a::as) (b::bs) = (a == b && List.beq as bs) := rfl
theorem beq_cons_cons [BEq α] {a b : α} {as bs : List α} : List.beq (a::as) (b::bs) = (a == b && List.beq as bs) := rfl
@[deprecated beq_cons_cons (since := "2026-02-26")]
theorem beq_cons₂ [BEq α] {a b : α} {as bs : List α} :
List.beq (a::as) (b::bs) = (a == b && List.beq as bs) := beq_cons_cons
instance [BEq α] : BEq (List α) := List.beq
@@ -175,7 +179,10 @@ Examples:
@[simp, grind =] theorem isEqv_nil_nil : isEqv ([] : List α) [] eqv = true := rfl
@[simp, grind =] theorem isEqv_nil_cons : isEqv ([] : List α) (a::as) eqv = false := rfl
@[simp, grind =] theorem isEqv_cons_nil : isEqv (a::as : List α) [] eqv = false := rfl
@[grind =] theorem isEqv_cons : isEqv (a::as) (b::bs) eqv = (eqv a b && isEqv as bs eqv) := rfl
@[grind =] theorem isEqv_cons_cons : isEqv (a::as) (b::bs) eqv = (eqv a b && isEqv as bs eqv) := rfl
@[deprecated isEqv_cons_cons (since := "2026-02-26")]
theorem isEqv_cons₂ : isEqv (a::as) (b::bs) eqv = (eqv a b && isEqv as bs eqv) := isEqv_cons_cons
/-! ## Lexicographic ordering -/
@@ -1048,9 +1055,12 @@ def dropLast {α} : List α → List α
@[simp, grind =] theorem dropLast_nil : ([] : List α).dropLast = [] := rfl
@[simp, grind =] theorem dropLast_singleton : [x].dropLast = [] := rfl
@[simp, grind =] theorem dropLast_cons :
@[simp, grind =] theorem dropLast_cons_cons :
(x::y::zs).dropLast = x :: (y::zs).dropLast := rfl
@[deprecated dropLast_cons_cons (since := "2026-02-26")]
theorem dropLast_cons₂ : (x::y::zs).dropLast = x :: (y::zs).dropLast := dropLast_cons_cons
-- Later this can be proved by `simp` via `[List.length_dropLast, List.length_cons, Nat.add_sub_cancel]`,
-- but we need this while bootstrapping `Array`.
@[simp] theorem length_dropLast_cons {a : α} {as : List α} : (a :: as).dropLast.length = as.length := by
@@ -1085,7 +1095,11 @@ inductive Sublist {α} : List α → List α → Prop
/-- If `l₁` is a subsequence of `l₂`, then it is also a subsequence of `a :: l₂`. -/
| cons a : Sublist l₁ l₂ → Sublist l₁ (a :: l₂)
/-- If `l₁` is a subsequence of `l₂`, then `a :: l₁` is a subsequence of `a :: l₂`. -/
| cons₂ a : Sublist l₁ l₂ → Sublist (a :: l₁) (a :: l₂)
| cons_cons a : Sublist l₁ l₂ → Sublist (a :: l₁) (a :: l₂)
set_option linter.missingDocs false in
@[deprecated Sublist.cons_cons (since := "2026-02-26"), match_pattern]
abbrev Sublist.cons₂ := @Sublist.cons_cons
@[inherit_doc] scoped infixl:50 " <+ " => Sublist
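A small sketch of the renamed constructors in use (hypothetical example, not part of the diff):

```lean
-- Sketch: `cons` skips an element of the right-hand list, while the renamed
-- `cons_cons` keeps a shared head on both sides.
example : [1, 2].Sublist [0, 1, 2] :=
  .cons 0 (.cons_cons 1 (.cons_cons 2 .slnil))
```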
@@ -1143,9 +1157,13 @@ def isPrefixOf [BEq α] : List α → List α → Bool
@[simp, grind =] theorem isPrefixOf_nil_left [BEq α] : isPrefixOf ([] : List α) l = true := by
simp [isPrefixOf]
@[simp, grind =] theorem isPrefixOf_cons_nil [BEq α] : isPrefixOf (a::as) ([] : List α) = false := rfl
@[grind =] theorem isPrefixOf_cons [BEq α] {a : α} :
@[grind =] theorem isPrefixOf_cons_cons [BEq α] {a : α} :
isPrefixOf (a::as) (b::bs) = (a == b && isPrefixOf as bs) := rfl
@[deprecated isPrefixOf_cons_cons (since := "2026-02-26")]
theorem isPrefixOf_cons₂ [BEq α] {a : α} :
isPrefixOf (a::as) (b::bs) = (a == b && isPrefixOf as bs) := isPrefixOf_cons_cons
/--
If the first list is a prefix of the second, returns the result of dropping the prefix.
@@ -2164,10 +2182,16 @@ def intersperse (sep : α) : (l : List α) → List α
| x::xs => x :: sep :: intersperse sep xs
@[simp] theorem intersperse_nil {sep : α} : ([] : List α).intersperse sep = [] := rfl
@[simp] theorem intersperse_single {x : α} {sep : α} : [x].intersperse sep = [x] := rfl
@[simp] theorem intersperse_cons₂ {x : α} {y : α} {zs : List α} {sep : α} :
@[simp] theorem intersperse_singleton {x : α} {sep : α} : [x].intersperse sep = [x] := rfl
@[deprecated intersperse_singleton (since := "2026-02-26")]
theorem intersperse_single {x : α} {sep : α} : [x].intersperse sep = [x] := rfl
@[simp] theorem intersperse_cons_cons {x : α} {y : α} {zs : List α} {sep : α} :
(x::y::zs).intersperse sep = x::sep::((y::zs).intersperse sep) := rfl
@[deprecated intersperse_cons_cons (since := "2026-02-26")]
theorem intersperse_cons₂ {x : α} {y : α} {zs : List α} {sep : α} :
(x::y::zs).intersperse sep = x::sep::((y::zs).intersperse sep) := intersperse_cons_cons
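A minimal concrete instance of the behavior the renamed lemmas describe (hypothetical, not part of the diff):

```lean
-- Sketch: the separator is placed only between consecutive elements.
example : [1, 2, 3].intersperse 0 = [1, 0, 2, 0, 3] := rfl
```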
/-! ### intercalate -/
set_option linter.listVariables false in
@@ -2186,7 +2210,7 @@ Examples:
* `List.intercalate sep [a, b] = a ++ sep ++ b`
* `List.intercalate sep [a, b, c] = a ++ sep ++ b ++ sep ++ c`
-/
def intercalate (sep : List α) (xs : List (List α)) : List α :=
noncomputable def intercalate (sep : List α) (xs : List (List α)) : List α :=
(intersperse sep xs).flatten
/-! ### eraseDupsBy -/

View File

@@ -219,9 +219,9 @@ def filterMapM {m : Type u → Type v} [Monad m] {α : Type w} {β : Type u} (f
Applies a monadic function that returns a list to each element of a list, from left to right, and
concatenates the resulting lists.
-/
@[inline, expose]
def flatMapM {m : Type u → Type v} [Monad m] {α : Type w} {β : Type u} (f : α → m (List β)) (as : List α) : m (List β) :=
let rec @[specialize] loop
@[expose]
noncomputable def flatMapM {m : Type u → Type v} [Monad m] {α : Type w} {β : Type u} (f : α → m (List β)) (as : List α) : m (List β) :=
let rec loop
| [], bs => pure bs.reverse.flatten
| a :: as, bs => do
let bs' ← f a

View File

@@ -0,0 +1,35 @@
/-
Copyright (c) 2026 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Author: Markus Himmel
-/
module
prelude
public import Init.Data.List.Control
public import Init.Data.List.Impl
public section
namespace List
/--
Applies a monadic function that returns a list to each element of a list, from left to right, and
concatenates the resulting lists.
-/
@[inline, expose]
def flatMapMTR {m : Type u → Type v} [Monad m] {α : Type w} {β : Type u} (f : α → m (List β)) (as : List α) : m (List β) :=
let rec @[specialize] loop
| [], bs => pure bs.reverse.flatten
| a :: as, bs => do
let bs' ← f a
loop as (bs' :: bs)
loop as []
@[csimp] theorem flatMapM_eq_flatMapMTR : @flatMapM = @flatMapMTR := by
funext m _ α β f l
simp only [flatMapM, flatMapMTR]
generalize [] = m
fun_induction flatMapM.loop <;> simp_all [flatMapMTR.loop]
end List
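A hedged usage sketch (in the `Id` monad; not part of the diff): thanks to the `@[csimp]` lemma, compiled code runs the tail-recursive `flatMapMTR` even though `flatMapM` itself is now `noncomputable`.

```lean
-- Sketch: flat-mapping a monadic function over a list in the `Id` monad.
#eval [1, 2, 3].flatMapM (m := Id) fun n => pure (List.replicate n n)
-- [1, 2, 2, 3, 3, 3]
```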

View File

@@ -125,7 +125,7 @@ protected theorem Sublist.eraseP : l₁ <+ l₂ → l₁.eraseP p <+ l₂.eraseP
by_cases h : p a
· simpa [h] using s.eraseP.trans eraseP_sublist
· simpa [h] using s.eraseP.cons _
| .cons₂ a s => by
| .cons_cons a s => by
by_cases h : p a
· simpa [h] using s
· simpa [h] using s.eraseP

View File

@@ -184,7 +184,7 @@ theorem Sublist.findSome?_isSome {l₁ l₂ : List α} (h : l₁ <+ l₂) :
induction h with
| slnil => simp
| cons a h ih
| cons₂ a h ih =>
| cons_cons a h ih =>
simp only [findSome?]
split
· simp_all
@@ -455,7 +455,7 @@ theorem Sublist.find?_isSome {l₁ l₂ : List α} (h : l₁ <+ l₂) : (l₁.fi
induction h with
| slnil => simp
| cons a h ih
| cons₂ a h ih =>
| cons_cons a h ih =>
simp only [find?]
split
· simp

View File

@@ -1394,7 +1394,7 @@ theorem head_filter_of_pos {p : α → Bool} {l : List α} (w : l ≠ []) (h : p
@[simp] theorem filter_sublist {p : α → Bool} : ∀ {l : List α}, filter p l <+ l
| [] => .slnil
| a :: l => by rw [filter]; split <;> simp [Sublist.cons, Sublist.cons₂, filter_sublist]
| a :: l => by rw [filter]; split <;> simp [Sublist.cons, Sublist.cons_cons, filter_sublist]
/-! ### filterMap -/
@@ -1838,6 +1838,11 @@ theorem sum_append [Add α] [Zero α] [Std.LawfulLeftIdentity (α := α) (· +
[Std.Associative (α := α) (· + ·)] {l₁ l₂ : List α} : (l₁ ++ l₂).sum = l₁.sum + l₂.sum := by
induction l₁ generalizing l₂ <;> simp_all [Std.Associative.assoc, Std.LawfulLeftIdentity.left_id]
@[simp, grind =]
theorem sum_singleton [Add α] [Zero α] [Std.LawfulRightIdentity (· + ·) (0 : α)] {x : α} :
[x].sum = x := by
simp [List.sum_eq_foldr, Std.LawfulRightIdentity.right_id x]
@[simp, grind =]
theorem sum_reverse [Zero α] [Add α] [Std.Associative (α := α) (· + ·)]
[Std.Commutative (α := α) (· + ·)]
@@ -2727,6 +2732,31 @@ theorem foldr_assoc {op : α → α → α} [ha : Std.Associative op] :
simp only [foldr_cons, ha.assoc]
rw [foldr_assoc]
theorem foldl_eq_apply_foldr {xs : List α} {f : α → α → α}
[Std.Associative f] [Std.LawfulRightIdentity f init] :
xs.foldl f x = f x (xs.foldr f init) := by
induction xs generalizing x
· simp [Std.LawfulRightIdentity.right_id]
· simp [foldl_assoc, *]
theorem foldr_eq_apply_foldl {xs : List α} {f : α → α → α}
[Std.Associative f] [Std.LawfulLeftIdentity f init] :
xs.foldr f x = f (xs.foldl f init) x := by
have : Std.Associative (fun x y => f y x) := ⟨by simp [Std.Associative.assoc]⟩
have : Std.RightIdentity (fun x y => f y x) init := ⟨⟩
have : Std.LawfulRightIdentity (fun x y => f y x) init := ⟨by simp [Std.LawfulLeftIdentity.left_id]⟩
rw [← List.reverse_reverse (as := xs), foldr_reverse, foldl_eq_apply_foldr, foldl_reverse]
theorem foldr_eq_foldl {xs : List α} {f : α → α → α}
[Std.Associative f] [Std.LawfulIdentity f init] :
xs.foldr f init = xs.foldl f init := by
simp [foldl_eq_apply_foldr, Std.LawfulLeftIdentity.left_id]
theorem sum_eq_foldl [Zero α] [Add α] [Std.Associative (α := α) (· + ·)]
[Std.LawfulIdentity (· + ·) (0 : α)] {xs : List α} :
xs.sum = xs.foldl (init := 0) (· + ·) := by
simp [sum_eq_foldr, foldl_eq_apply_foldr, Std.LawfulLeftIdentity.left_id]
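A hedged sketch instantiating `foldr_eq_foldl` (it assumes the standard `Nat` instances of `Std.Associative` and `Std.LawfulIdentity`):

```lean
-- Sketch: for `Nat` addition with identity `0`, the fold direction is
-- irrelevant.
example (xs : List Nat) : xs.foldr (· + ·) 0 = xs.foldl (· + ·) 0 :=
  List.foldr_eq_foldl
```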
-- The argument `f : α₁ → α₂` is intentionally explicit, as it is sometimes not found by unification.
theorem foldl_hom (f : α₁ → α₂) {g₁ : α₁ → β → α₁} {g₂ : α₂ → β → α₂} {l : List β} {init : α₁}
(H : ∀ x y, g₂ (f x) y = f (g₁ x y)) : l.foldl g₂ (f init) = f (l.foldl g₁ init) := by
@@ -3124,7 +3154,7 @@ theorem dropLast_concat_getLast : ∀ {l : List α} (h : l ≠ []), dropLast l +
| [], h => absurd rfl h
| [_], _ => rfl
| _ :: b :: l, _ => by
rw [dropLast_cons, cons_append, getLast_cons (cons_ne_nil _ _)]
rw [dropLast_cons_cons, cons_append, getLast_cons (cons_ne_nil _ _)]
congr
exact dropLast_concat_getLast (cons_ne_nil b l)
@@ -3744,4 +3774,28 @@ theorem get_mem : ∀ (l : List α) n, get l n ∈ l
theorem mem_iff_get {a} {l : List α} : a ∈ l ↔ ∃ n, get l n = a :=
⟨get_of_mem, fun ⟨_, e⟩ => e ▸ get_mem ..⟩
/-! ### `intercalate` -/
@[simp]
theorem intercalate_nil {ys : List α} : ys.intercalate [] = [] := rfl
@[simp]
theorem intercalate_singleton {ys xs : List α} : ys.intercalate [xs] = xs := by
simp [intercalate]
@[simp]
theorem intercalate_cons_cons {ys l l' : List α} {zs : List (List α)} :
ys.intercalate (l :: l' :: zs) = l ++ ys ++ ys.intercalate (l' :: zs) := by
simp [intercalate]
@[simp]
theorem intercalate_cons_cons_left {ys l : List α} {x : α} {zs : List (List α)} :
ys.intercalate ((x :: l) :: zs) = x :: ys.intercalate (l :: zs) := by
cases zs <;> simp
theorem intercalate_cons_of_ne_nil {ys l : List α} {zs : List (List α)} (h : zs ≠ []) :
ys.intercalate (l :: zs) = l ++ ys ++ ys.intercalate zs :=
match zs, h with
| l'::zs, _ => by simp
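A hedged concrete instance (not part of the diff): even though `intercalate` is now `noncomputable`, the `@[simp]` lemmas above evaluate it on literals:

```lean
-- Sketch: the separator list is spliced between consecutive chunks.
example : [0].intercalate [[1, 2], [3]] = [1, 2, 0, 3] := by
  simp
```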
end List

View File

@@ -42,7 +42,7 @@ theorem beq_eq_isEqv [BEq α] {as bs : List α} : as.beq bs = isEqv as bs (· ==
cases bs with
| nil => simp
| cons b bs =>
simp only [beq_cons, ih, isEqv_eq_decide, length_cons, Nat.add_right_cancel_iff,
simp only [beq_cons_cons, ih, isEqv_eq_decide, length_cons, Nat.add_right_cancel_iff,
Nat.forall_lt_succ_left', getElem_cons_zero, getElem_cons_succ, Bool.decide_and,
Bool.decide_eq_true]
split <;> simp

View File

@@ -106,7 +106,7 @@ theorem Sublist.le_countP (s : l₁ <+ l₂) (p) : countP p l₂ - (l₂.length
have := s.le_countP p
have := s.length_le
split <;> omega
| .cons₂ a s =>
| .cons_cons a s =>
rename_i l₁ l₂
simp only [countP_cons, length_cons]
have := s.le_countP p

View File

@@ -38,7 +38,7 @@ theorem map_getElem_sublist {l : List α} {is : List (Fin l.length)} (h : is.Pai
simp only [Fin.getElem_fin, map_cons]
have := IH h.of_cons (hd+1) (pairwise_cons.mp h).1
specialize his hd (.head _)
have := (drop_eq_getElem_cons ..).symm ▸ this.cons₂ (get l hd)
have := (drop_eq_getElem_cons ..).symm ▸ this.cons_cons (get l hd)
have := Sublist.append (nil_sublist (take hd l |>.drop j)) this
rwa [nil_append, (drop_append_of_le_length ?_), take_append_drop] at this
simp [Nat.min_eq_left (Nat.le_of_lt hd.isLt), his]
@@ -55,7 +55,7 @@ theorem sublist_eq_map_getElem {l l' : List α} (h : l' <+ l) : ∃ is : List (F
refine ⟨is.map (·.succ), ?_⟩
set_option backward.isDefEq.respectTransparency false in
simpa [Function.comp_def, pairwise_map]
| cons _ _ IH =>
| cons_cons _ _ IH =>
rcases IH with ⟨is, IH⟩
refine ⟨⟨0, by simp [Nat.zero_lt_succ]⟩ :: is.map (·.succ), ?_⟩
set_option backward.isDefEq.respectTransparency false in

@@ -207,7 +207,7 @@ theorem take_eq_dropLast {l : List α} {i : Nat} (h : i + 1 = l.length) :
· cases as with
| nil => simp_all
| cons b bs =>
simp only [take_succ_cons, dropLast_cons]
simp only [take_succ_cons, dropLast_cons_cons]
rw [ih]
simpa using h

@@ -33,7 +33,7 @@ open Nat
@[grind →] theorem Pairwise.sublist : l₁ <+ l₂ → l₂.Pairwise R → l₁.Pairwise R
| .slnil, h => h
| .cons _ s, .cons _ h₂ => h₂.sublist s
| .cons _ s, .cons h₁ h₂ => (h₂.sublist s).cons fun _ h => h₁ _ (s.subset h)
| .cons_cons _ s, .cons h₁ h₂ => (h₂.sublist s).cons fun _ h => h₁ _ (s.subset h)
theorem Pairwise.imp {α R S} (H : ∀ {a b}, R a b → S a b) :
∀ {l : List α}, l.Pairwise R → l.Pairwise S
@@ -226,7 +226,7 @@ theorem pairwise_iff_forall_sublist : l.Pairwise R ↔ (∀ {a b}, [a,b] <+ l
constructor <;> intro h
· intro
| a, b, .cons _ hab => exact IH.mp h.2 hab
| _, b, .cons _ hab => refine h.1 _ (hab.subset ?_); simp
| _, b, .cons_cons _ hab => refine h.1 _ (hab.subset ?_); simp
· constructor
· intro x hx
apply h
@@ -304,26 +304,43 @@ grind_pattern Nodup.sublist => l₁ <+ l₂, Nodup l₂
theorem Sublist.nodup : l₁ <+ l₂ → Nodup l₂ → Nodup l₁ :=
Nodup.sublist
theorem getElem?_inj {xs : List α}
(h₀ : i < xs.length) (h₁ : Nodup xs) (h₂ : xs[i]? = xs[j]?) : i = j := by
induction xs generalizing i j with
| nil => cases h
| cons x xs ih =>
match i, j with
| 0, 0 => rfl
| i+1, j+1 =>
cases h₁ with
| cons ha h₁ =>
simp only [getElem?_cons_succ] at h₂
exact congrArg (· + 1) (ih (Nat.lt_of_succ_lt_succ h) h₁ h₂)
| i+1, 0 => ?_
| 0, j+1 => ?_
all_goals
simp only [getElem?_cons_zero, getElem?_cons_succ] at h₂
cases h₁; rename_i h' h
have := h x ?_ rfl; cases this
rw [mem_iff_getElem?]
exact ⟨_, h₂⟩; exact ⟨_, h₂.symm⟩
theorem getElem?_inj {l : List α} (h₀ : i < l.length) (h₁ : List.Nodup l) :
l[i]? = l[j]? ↔ i = j :=
⟨by
intro h
induction l generalizing i j with
| nil => cases h₀
| cons x xs ih =>
match i, j with
| 0, 0 => rfl
| i+1, j+1 =>
cases h₁ with
| cons ha h₁ =>
simp only [getElem?_cons_succ] at h₂
exact congrArg (· + 1) (ih (Nat.lt_of_succ_lt_succ h₀) h₁ h₂)
| i+1, 0 => ?_
| 0, j+1 => ?_
all_goals
simp only [getElem?_cons_zero, getElem?_cons_succ] at h₂
cases h₁; rename_i h' h
have := h x ?_ rfl; cases this
rw [mem_iff_getElem?]
exact ⟨_, h₂⟩; exact ⟨_, h₂.symm⟩
, by simp +contextual⟩
theorem getElem_inj {xs : List α}
{h₀ : i < xs.length} {h₁ : j < xs.length} (h : Nodup xs) : xs[i] = xs[j] ↔ i = j := by
simpa only [List.getElem_eq_getElem?_get, Option.get_inj] using getElem?_inj h₀ h
theorem getD_inj {xs : List α}
(h₀ : i < xs.length) (h₁ : j < xs.length) (h₂ : Nodup xs) :
xs.getD i fallback = xs.getD j fallback ↔ i = j := by
simp only [List.getD_eq_getElem?_getD]
rw [Option.getD_inj, getElem?_inj] <;> simpa
theorem getElem!_inj [Inhabited α] {xs : List α}
(h₀ : i < xs.length) (h₁ : j < xs.length) (h₂ : Nodup xs) : xs[i]! = xs[j]! ↔ i = j := by
simpa only [getElem!_eq_getElem?_getD, getD_eq_getElem?_getD] using getD_inj h₀ h₁ h₂
@[simp, grind =] theorem nodup_replicate {n : Nat} {a : α} :
(replicate n a).Nodup ↔ n ≤ 1 := by simp [Nodup]
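A sketch of the strengthened `getElem?_inj` (assuming the iff form shown above; not part of the diff): with a `Nodup` list, two optional lookups agree exactly when the indices do.

```lean
-- Both hypotheses (index in bounds, Nodup) are discharged by decide.
example : ([3, 1, 2][0]? = [3, 1, 2][2]?) ↔ 0 = 2 :=
  List.getElem?_inj (by decide) (by decide)
```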

@@ -252,13 +252,13 @@ theorem exists_perm_sublist {l₁ l₂ l₂' : List α} (s : l₁ <+ l₂) (p :
| cons x _ IH =>
match s with
| .cons _ s => let ⟨l₁', p', s'⟩ := IH s; exact ⟨l₁', p', s'.cons _⟩
| .cons _ s => let ⟨l₁', p', s'⟩ := IH s; exact ⟨x :: l₁', p'.cons x, s'.cons _⟩
| .cons_cons _ s => let ⟨l₁', p', s'⟩ := IH s; exact ⟨x :: l₁', p'.cons x, s'.cons_cons _⟩
| swap x y l' =>
match s with
| .cons _ (.cons _ s) => exact ⟨_, .rfl, (s.cons _).cons _⟩
| .cons _ (.cons _ s) => exact ⟨x :: _, .rfl, (s.cons _).cons _⟩
| .cons _ (.cons _ s) => exact ⟨y :: _, .rfl, (s.cons _).cons _⟩
| .cons _ (.cons _ s) => exact ⟨x :: y :: _, .swap .., (s.cons _).cons _⟩
| .cons _ (.cons_cons _ s) => exact ⟨x :: _, .rfl, (s.cons _).cons_cons _⟩
| .cons_cons _ (.cons _ s) => exact ⟨y :: _, .rfl, (s.cons_cons _).cons _⟩
| .cons_cons _ (.cons_cons _ s) => exact ⟨x :: y :: _, .swap .., (s.cons_cons _).cons_cons _⟩
| trans _ _ IH₁ IH₂ =>
let ⟨_, pm, sm⟩ := IH₁ s
let ⟨r₁, pr, sr⟩ := IH₂ sm
@@ -277,7 +277,7 @@ theorem Sublist.exists_perm_append {l₁ l₂ : List α} : l₁ <+ l₂ → ∃
| Sublist.cons a s =>
let ⟨l, p⟩ := Sublist.exists_perm_append s
⟨a :: l, (p.cons a).trans perm_middle.symm⟩
| Sublist.cons a s =>
| Sublist.cons_cons a s =>
let ⟨l, p⟩ := Sublist.exists_perm_append s
⟨l, p.cons a⟩

@@ -452,7 +452,7 @@ theorem sublist_mergeSort
have h' := sublist_mergeSort trans total hc h
rw [h₂] at h'
exact h'.middle a
| _, _, @Sublist.cons _ l₁ l₂ a h => by
| _, _, @Sublist.cons_cons _ l₁ l₂ a h => by
rename_i hc
obtain ⟨l₃, l₄, h₁, h₂, h₃⟩ := mergeSort_cons trans total a l₂
rw [h₁]
@@ -460,7 +460,7 @@ theorem sublist_mergeSort
rw [h₂] at h'
simp only [Bool.not_eq_true', tail_cons] at h₃ h'
exact
sublist_append_of_sublist_right (Sublist.cons a
sublist_append_of_sublist_right (Sublist.cons_cons a
((fun w => Sublist.of_sublist_append_right w h') fun b m₁ m₃ =>
(Bool.eq_not_self true).mp ((rel_of_pairwise_cons hc m₁).symm.trans (h₃ b m₃))))

@@ -0,0 +1,10 @@
/-
Copyright (c) 2026 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Markus Himmel
-/
module
prelude
public import Init.Data.List.SplitOn.Basic
public import Init.Data.List.SplitOn.Lemmas

@@ -0,0 +1,70 @@
/-
Copyright (c) 2016 Microsoft Corporation. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Leonardo de Moura
-/
module
prelude
public import Init.Data.List.Basic
public import Init.NotationExtra
import Init.Data.Array.Bootstrap
import Init.Data.List.Lemmas
public section
set_option doc.verso true
namespace List
/--
Split a list at every element satisfying a predicate, and then prepend {lean}`acc.reverse` to the
first element of the result.
* {lean}`[1, 1, 2, 3, 2, 4, 4].splitOnPPrepend (· == 2) [0, 5] = [[5, 0, 1, 1], [3], [4, 4]]`
-/
noncomputable def splitOnPPrepend (p : α → Bool) : (l : List α) → (acc : List α) → List (List α)
| [], acc => [acc.reverse]
| a :: t, acc => if p a then acc.reverse :: splitOnPPrepend p t [] else splitOnPPrepend p t (a::acc)
/--
Split a list at every element satisfying a predicate. The separators are not in the result.
Examples:
* {lean}`[1, 1, 2, 3, 2, 4, 4].splitOnP (· == 2) = [[1, 1], [3], [4, 4]]`
-/
noncomputable def splitOnP (p : α → Bool) (l : List α) : List (List α) :=
splitOnPPrepend p l []
@[deprecated splitOnPPrepend (since := "2026-02-26")]
noncomputable def splitOnP.go (p : α → Bool) (l acc : List α) : List (List α) :=
splitOnPPrepend p l acc
/-- Tail recursive version of {name}`splitOnP`. -/
@[inline]
def splitOnPTR (p : α → Bool) (l : List α) : List (List α) := go l #[] #[] where
@[specialize] go : List α → Array α → Array (List α) → List (List α)
| [], acc, r => r.toListAppend [acc.toList]
| a :: t, acc, r => bif p a then go t #[] (r.push acc.toList) else go t (acc.push a) r
@[csimp] theorem splitOnP_eq_splitOnPTR : @splitOnP = @splitOnPTR := by
funext α P l
simp only [splitOnPTR]
suffices ∀ xs acc r,
splitOnPTR.go P xs acc r = r.toList ++ splitOnPPrepend P xs acc.toList.reverse from
(this l #[] #[]).symm
intro xs acc r
induction xs generalizing acc r with
| nil => simp [splitOnPPrepend, splitOnPTR.go]
| cons x xs IH => cases h : P x <;> simp [splitOnPPrepend, splitOnPTR.go, *]
/--
Split a list at every occurrence of a separator element. The separators are not in the result.
Examples:
* {lean}`[1, 1, 2, 3, 2, 4, 4].splitOn 2 = [[1, 1], [3], [4, 4]]`
-/
@[inline] def splitOn [BEq α] (a : α) (as : List α) : List (List α) :=
as.splitOnP (· == a)
end List
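An illustrative check of the docstring examples above (not part of the diff). The kernel can still evaluate `splitOnP` under `decide` even though it is marked `noncomputable`; only compiled code goes through the `csimp` replacement.

```lean
-- Separators are dropped; the accumulator is reversed onto the first chunk.
example : [1, 1, 2, 3, 2, 4, 4].splitOnP (· == 2) = [[1, 1], [3], [4, 4]] := by decide
example : [1, 1, 2, 3, 2, 4, 4].splitOnPPrepend (· == 2) [0, 5] = [[5, 0, 1, 1], [3], [4, 4]] := by decide
```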

@@ -0,0 +1,208 @@
/-
Copyright (c) 2014 Parikshit Khanna. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Parikshit Khanna, Jeremy Avigad, Leonardo de Moura, Floris van Doorn, Mario Carneiro, Markus Himmel
-/
module
prelude
public import Init.Data.List.SplitOn.Basic
import all Init.Data.List.SplitOn.Basic
import Init.Data.List.Nat.Modify
import Init.ByCases
public section
namespace List
variable {p : α → Bool} {xs : List α} {ls : List (List α)}
@[simp]
theorem splitOn_nil [BEq α] (a : α) : [].splitOn a = [[]] :=
(rfl)
@[simp]
theorem splitOnP_nil : [].splitOnP p = [[]] :=
(rfl)
@[simp]
theorem splitOnPPrepend_ne_nil (p : α → Bool) (xs acc : List α) : splitOnPPrepend p xs acc ≠ [] := by
fun_induction splitOnPPrepend <;> simp_all
@[deprecated splitOnPPrepend_ne_nil (since := "2026-02-26")]
theorem splitOnP.go_ne_nil (p : α → Bool) (xs acc : List α) : splitOnPPrepend p xs acc ≠ [] :=
splitOnPPrepend_ne_nil p xs acc
@[simp] theorem splitOnPPrepend_nil {acc : List α} : splitOnPPrepend p [] acc = [acc.reverse] := (rfl)
@[simp] theorem splitOnPPrepend_nil_right : splitOnPPrepend p xs [] = splitOnP p xs := (rfl)
theorem splitOnP_eq_splitOnPPrepend : splitOnP p xs = splitOnPPrepend p xs [] := (rfl)
theorem splitOnPPrepend_cons_eq_if {x : α} {xs acc : List α} :
splitOnPPrepend p (x :: xs) acc =
if p x then acc.reverse :: splitOnP p xs else splitOnPPrepend p xs (x :: acc) := by
simp [splitOnPPrepend]
theorem splitOnPPrepend_cons_pos {p : α → Bool} {a : α} {l acc : List α} (h : p a) :
splitOnPPrepend p (a :: l) acc = acc.reverse :: splitOnP p l := by
simp [splitOnPPrepend, h]
theorem splitOnPPrepend_cons_neg {p : α → Bool} {a : α} {l acc : List α} (h : p a = false) :
splitOnPPrepend p (a :: l) acc = splitOnPPrepend p l (a :: acc) := by
simp [splitOnPPrepend, h]
theorem splitOnP_cons_eq_if_splitOnPPrepend {x : α} {xs : List α} :
splitOnP p (x :: xs) = if p x then [] :: splitOnP p xs else splitOnPPrepend p xs [x] := by
simp [splitOnPPrepend_cons_eq_if, splitOnPPrepend_nil_right]
theorem splitOnPPrepend_eq_modifyHead {xs acc : List α} :
splitOnPPrepend p xs acc = modifyHead (acc.reverse ++ ·) (splitOnP p xs) := by
induction xs generalizing acc with
| nil => simp
| cons hd tl ih =>
simp [splitOnPPrepend_cons_eq_if, splitOnP_cons_eq_if_splitOnPPrepend, ih]
split <;> simp <;> congr
@[deprecated splitOnPPrepend_eq_modifyHead (since := "2026-02-26")]
theorem splitOnP.go_acc {xs acc : List α} :
splitOnPPrepend p xs acc = modifyHead (acc.reverse ++ ·) (splitOnP p xs) :=
splitOnPPrepend_eq_modifyHead
@[simp]
theorem splitOnP_ne_nil (p : α → Bool) (xs : List α) : xs.splitOnP p ≠ [] :=
splitOnPPrepend_ne_nil p xs []
theorem splitOnP_cons_eq_if_modifyHead (x : α) (xs : List α) :
(x :: xs).splitOnP p =
if p x then [] :: xs.splitOnP p else (xs.splitOnP p).modifyHead (cons x) := by
simp [splitOnP_cons_eq_if_splitOnPPrepend, splitOnPPrepend_eq_modifyHead]
@[deprecated splitOnP_cons_eq_if_modifyHead (since := "2026-02-26")]
theorem splitOnP_cons (x : α) (xs : List α) :
(x :: xs).splitOnP p =
if p x then [] :: xs.splitOnP p else (xs.splitOnP p).modifyHead (cons x) :=
splitOnP_cons_eq_if_modifyHead x xs
/-- The original list `as` can be recovered by flattening the lists produced by `splitOnP p as`,
interspersed with the elements `as.filter p`. -/
theorem splitOnP_spec (as : List α) :
flatten (zipWith (· ++ ·) (splitOnP p as) (((as.filter p).map fun x => [x]) ++ [[]])) = as := by
induction as with
| nil => simp
| cons a as' ih =>
rw [splitOnP_cons_eq_if_modifyHead]
split <;> simp [*, flatten_zipWith, splitOnP_ne_nil]
where
flatten_zipWith {xs ys : List (List α)} {a : α} (hxs : xs ≠ []) (hys : ys ≠ []) :
flatten (zipWith (fun x x_1 => x ++ x_1) (modifyHead (cons a) xs) ys) =
a :: flatten (zipWith (fun x x_1 => x ++ x_1) xs ys) := by
cases xs <;> cases ys <;> simp_all
/-- If no element satisfies `p` in the list `xs`, then `xs.splitOnP p = [xs]` -/
theorem splitOnP_eq_singleton (h : ∀ x ∈ xs, p x = false) : xs.splitOnP p = [xs] := by
induction xs with
| nil => simp
| cons hd tl ih =>
simp only [mem_cons, forall_eq_or_imp] at h
simp [splitOnP_cons_eq_if_modifyHead, h.1, ih h.2]
@[deprecated splitOnP_eq_singleton (since := "2026-02-26")]
theorem splitOnP_eq_single (h : ∀ x ∈ xs, p x = false) : xs.splitOnP p = [xs] :=
splitOnP_eq_singleton h
/-- When a list of the form `[...xs, sep, ...as]` is split at the `sep` element satisfying `p`,
the result is the concatenation of `splitOnP` called on `xs` and `as` -/
theorem splitOnP_append_cons (xs as : List α) {sep : α} (hsep : p sep) :
(xs ++ sep :: as).splitOnP p = List.splitOnP p xs ++ List.splitOnP p as := by
induction xs with
| nil => simp [splitOnP_cons_eq_if_modifyHead, hsep]
| cons hd tl ih =>
obtain ⟨hd1, tl1, h1'⟩ := List.exists_cons_of_ne_nil (List.splitOnP_ne_nil (p := p) (xs := tl))
by_cases hPh : p hd <;> simp [splitOnP_cons_eq_if_modifyHead, *]
/-- When a list of the form `[...xs, sep, ...as]` is split on `p`, the first element is `xs`,
assuming no element in `xs` satisfies `p` but `sep` does satisfy `p` -/
theorem splitOnP_append_cons_of_forall_mem (h : ∀ x ∈ xs, p x = false) (sep : α)
(hsep : p sep = true) (as : List α) : (xs ++ sep :: as).splitOnP p = xs :: as.splitOnP p := by
rw [splitOnP_append_cons xs as hsep, splitOnP_eq_singleton h, singleton_append]
@[deprecated splitOnP_append_cons_of_forall_mem (since := "2026-02-26")]
theorem splitOnP_first (h : ∀ x ∈ xs, p x = false) (sep : α)
(hsep : p sep = true) (as : List α) : (xs ++ sep :: as).splitOnP p = xs :: as.splitOnP p :=
splitOnP_append_cons_of_forall_mem h sep hsep as
theorem splitOn_eq_splitOnP [BEq α] {x : α} {xs : List α} : xs.splitOn x = xs.splitOnP (· == x) :=
(rfl)
@[simp]
theorem splitOn_ne_nil [BEq α] (a : α) (xs : List α) : xs.splitOn a ≠ [] := by
simp [splitOn_eq_splitOnP]
theorem splitOn_cons_eq_if_modifyHead [BEq α] {a : α} (x : α) (xs : List α) :
(x :: xs).splitOn a =
if x == a then [] :: xs.splitOn a else (xs.splitOn a).modifyHead (cons x) := by
simpa [splitOn_eq_splitOnP] using splitOnP_cons_eq_if_modifyHead ..
/-- If no element of `xs` is `==`-equal to `a`, then `xs.splitOn a = [xs]` -/
theorem splitOn_eq_singleton_of_beq_eq_false [BEq α] {a : α} (h : ∀ x ∈ xs, (x == a) = false) :
xs.splitOn a = [xs] := by
simpa [splitOn_eq_splitOnP] using splitOnP_eq_singleton h
theorem splitOn_eq_singleton [BEq α] [LawfulBEq α] {a : α} (h : a ∉ xs) :
xs.splitOn a = [xs] :=
splitOn_eq_singleton_of_beq_eq_false
(fun _ hb => beq_eq_false_iff_ne.2 (fun hab => absurd hb (hab ▸ h)))
/-- When a list of the form `[...xs, sep, ...as]` is split at the `sep` element `==`-equal to `a`,
the result is the concatenation of `splitOn` called on `xs` and `as` -/
theorem splitOn_append_cons_of_beq [BEq α] {a : α} (xs as : List α) {sep : α} (hsep : sep == a) :
(xs ++ sep :: as).splitOn a = List.splitOn a xs ++ List.splitOn a as := by
simpa [splitOn_eq_splitOnP] using splitOnP_append_cons (p := (· == a)) _ _ hsep
/-- When a list of the form `[...xs, a, ...as]` is split at `a`,
the result is the concatenation of `splitOn` called on `xs` and `as` -/
theorem splitOn_append_cons_self [BEq α] [ReflBEq α] {a : α} (xs as : List α) :
(xs ++ a :: as).splitOn a = List.splitOn a xs ++ List.splitOn a as :=
splitOn_append_cons_of_beq _ _ (BEq.refl _)
/-- When a list of the form `[...xs, sep, ...as]` is split at `a`, the first element is `xs`,
assuming no element in `xs` is equal to `a` but `sep` is equal to `a`. -/
theorem splitOn_append_cons_of_forall_mem_beq_eq_false [BEq α] {a : α}
(h : ∀ x ∈ xs, (x == a) = false) (sep : α)
(hsep : sep == a) (as : List α) : (xs ++ sep :: as).splitOn a = xs :: as.splitOn a := by
simpa [splitOn_eq_splitOnP] using splitOnP_append_cons_of_forall_mem h _ hsep _
/-- When a list of the form `[...xs, a, ...as]` is split at `a`, the first element is `xs`,
assuming no element in `xs` is equal to `a`. -/
theorem splitOn_append_cons_self_of_not_mem [BEq α] [LawfulBEq α] {a : α}
(h : a xs) (as : List α) : (xs ++ a :: as).splitOn a = xs :: as.splitOn a :=
splitOn_append_cons_of_forall_mem_beq_eq_false
(fun b hb => beq_eq_false_iff_ne.2 fun hab => absurd hb (hab ▸ h)) _ (by simp) _
/-- `intercalate [x]` is the left inverse of `splitOn x` -/
@[simp]
theorem intercalate_splitOn [BEq α] [LawfulBEq α] (x : α) : [x].intercalate (xs.splitOn x) = xs := by
induction xs with
| nil => simp
| cons hd tl ih =>
simp only [splitOn_cons_eq_if_modifyHead, beq_iff_eq]
split
· simp_all [intercalate_cons_of_ne_nil, splitOn_ne_nil]
· have hsp := splitOn_ne_nil x tl
generalize splitOn x tl = ls at *
cases ls <;> simp_all
/-- `splitOn x` is the left inverse of `intercalate [x]`, on the domain
consisting of each nonempty list of lists `ls` whose elements do not contain `x` -/
theorem splitOn_intercalate [BEq α] [LawfulBEq α] (x : α) (hx : ∀ l ∈ ls, x ∉ l) (hls : ls ≠ []) :
([x].intercalate ls).splitOn x = ls := by
induction ls with
| nil => simp at hls
| cons hd tl ih =>
simp only [mem_cons, forall_eq_or_imp] at hx
match tl with
| [] => simpa using splitOn_eq_singleton hx.1
| t::tl =>
simp only [intercalate_cons_cons, append_assoc, cons_append, nil_append]
rw [splitOn_append_cons_self_of_not_mem hx.1, ih hx.2 (by simp)]
end List
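An illustrative sketch (not part of the diff): the round-trip lemma `intercalate_splitOn` applied to concrete data.

```lean
-- splitOn 0 yields [[1], [2], [3]]; intercalating [0] restores the original list.
example : [0].intercalate ([1, 0, 2, 0, 3].splitOn 0) = [1, 0, 2, 0, 3] :=
  List.intercalate_splitOn 0
```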

@@ -32,8 +32,12 @@ open Nat
section isPrefixOf
variable [BEq α]
@[simp, grind =] theorem isPrefixOf_cons_self [LawfulBEq α] {a : α} :
isPrefixOf (a::as) (a::bs) = isPrefixOf as bs := by simp [isPrefixOf_cons]
@[simp, grind =] theorem isPrefixOf_cons_cons_self [LawfulBEq α] {a : α} :
isPrefixOf (a::as) (a::bs) = isPrefixOf as bs := by simp [isPrefixOf_cons_cons]
@[deprecated isPrefixOf_cons_cons_self (since := "2026-02-26")]
theorem isPrefixOf_cons₂_self [LawfulBEq α] {a : α} :
isPrefixOf (a::as) (a::bs) = isPrefixOf as bs := isPrefixOf_cons_cons_self
@[simp] theorem isPrefixOf_length_pos_nil {l : List α} (h : 0 < l.length) : isPrefixOf l [] = false := by
cases l <;> simp_all [isPrefixOf]
@@ -45,7 +49,7 @@ variable [BEq α]
| cons _ _ ih =>
cases n
· simp
· simp [replicate_succ, isPrefixOf_cons, ih, Nat.succ_le_succ_iff, Bool.and_left_comm]
· simp [replicate_succ, isPrefixOf_cons_cons, ih, Nat.succ_le_succ_iff, Bool.and_left_comm]
end isPrefixOf
@@ -169,18 +173,18 @@ theorem subset_replicate {n : Nat} {a : α} {l : List α} (h : n ≠ 0) : l ⊆
@[simp, grind ] theorem Sublist.refl : ∀ l : List α, l <+ l
| [] => .slnil
| a :: l => (Sublist.refl l).cons a
| a :: l => (Sublist.refl l).cons_cons a
theorem Sublist.trans {l₁ l₂ l₃ : List α} (h₁ : l₁ <+ l₂) (h₂ : l₂ <+ l₃) : l₁ <+ l₃ := by
induction h₂ generalizing l₁ with
| slnil => exact h₁
| cons _ _ IH => exact (IH h₁).cons _
| @cons l₂ _ a _ IH =>
| @cons_cons l₂ _ a _ IH =>
generalize e : a :: l₂ = l₂' at h₁
match h₁ with
| .slnil => apply nil_sublist
| .cons a' h₁' => cases e; apply (IH h₁').cons
| .cons a' h₁' => cases e; apply (IH h₁').cons
| .cons_cons a' h₁' => cases e; apply (IH h₁').cons_cons
instance : Trans (@Sublist α) Sublist Sublist := Sublist.trans
@@ -193,23 +197,23 @@ theorem sublist_of_cons_sublist : a :: l₁ <+ l₂ → l₁ <+ l₂ :=
@[simp, grind =]
theorem cons_sublist_cons : a :: l₁ <+ a :: l₂ ↔ l₁ <+ l₂ :=
⟨fun | .cons _ s => sublist_of_cons_sublist s | .cons _ s => s, .cons _⟩
⟨fun | .cons _ s => sublist_of_cons_sublist s | .cons_cons _ s => s, .cons_cons _⟩
theorem sublist_or_mem_of_sublist (h : l <+ l₁ ++ a :: l₂) : l <+ l₁ ++ l₂ ∨ a ∈ l := by
induction l₁ generalizing l with
| nil => match h with
| .cons _ h => exact .inl h
| .cons _ h => exact .inr (.head ..)
| .cons_cons _ h => exact .inr (.head ..)
| cons b l₁ IH =>
match h with
| .cons _ h => exact (IH h).imp_left (Sublist.cons _)
| .cons _ h => exact (IH h).imp (Sublist.cons _) (.tail _)
| .cons_cons _ h => exact (IH h).imp (Sublist.cons_cons _) (.tail _)
@[grind →] theorem Sublist.subset : l₁ <+ l₂ → l₁ ⊆ l₂
| .slnil, _, h => h
| .cons _ s, _, h => .tail _ (s.subset h)
| .cons .., _, .head .. => .head ..
| .cons _ s, _, .tail _ h => .tail _ (s.subset h)
| .cons_cons .., _, .head .. => .head ..
| .cons_cons _ s, _, .tail _ h => .tail _ (s.subset h)
protected theorem Sublist.mem (hx : a ∈ l₁) (hl : l₁ <+ l₂) : a ∈ l₂ :=
hl.subset hx
@@ -245,7 +249,7 @@ theorem eq_nil_of_sublist_nil {l : List α} (s : l <+ []) : l = [] :=
theorem Sublist.length_le : l₁ <+ l₂ → length l₁ ≤ length l₂
| .slnil => Nat.le_refl 0
| .cons _l s => le_succ_of_le (length_le s)
| .cons _ s => succ_le_succ (length_le s)
| .cons_cons _ s => succ_le_succ (length_le s)
grind_pattern Sublist.length_le => l₁ <+ l₂, length l₁
grind_pattern Sublist.length_le => l₁ <+ l₂, length l₂
@@ -253,7 +257,7 @@ grind_pattern Sublist.length_le => l₁ <+ l₂, length l₂
theorem Sublist.eq_of_length : l₁ <+ l₂ → length l₁ = length l₂ → l₁ = l₂
| .slnil, _ => rfl
| .cons a s, h => nomatch Nat.not_lt.2 s.length_le (h ▸ lt_succ_self _)
| .cons a s, h => by rw [s.eq_of_length (succ.inj h)]
| .cons_cons a s, h => by rw [s.eq_of_length (succ.inj h)]
theorem Sublist.eq_of_length_le (s : l₁ <+ l₂) (h : length l₂ ≤ length l₁) : l₁ = l₂ :=
s.eq_of_length <| Nat.le_antisymm s.length_le h
@@ -275,7 +279,7 @@ grind_pattern tail_sublist => tail l <+ _
protected theorem Sublist.tail : ∀ {l₁ l₂ : List α}, l₁ <+ l₂ → tail l₁ <+ tail l₂
| _, _, slnil => .slnil
| _, _, Sublist.cons _ h => (tail_sublist _).trans h
| _, _, Sublist.cons _ h => h
| _, _, Sublist.cons_cons _ h => h
@[grind ]
theorem Sublist.of_cons_cons {l₁ l₂ : List α} {a b : α} (h : a :: l₁ <+ b :: l₂) : l₁ <+ l₂ :=
@@ -287,8 +291,8 @@ protected theorem Sublist.map (f : α → β) {l₁ l₂} (s : l₁ <+ l₂) : m
| slnil => simp
| cons a s ih =>
simpa using cons (f a) ih
| cons a s ih =>
simpa using cons (f a) ih
| cons_cons a s ih =>
simpa using cons_cons (f a) ih
grind_pattern Sublist.map => l₁ <+ l₂, map f l₁
grind_pattern Sublist.map => l₁ <+ l₂, map f l₂
@@ -338,7 +342,7 @@ theorem sublist_filterMap_iff {l₁ : List β} {f : α → Option β} :
cases h with
| cons _ h =>
exact ⟨l', h, rfl⟩
| cons _ h =>
| cons_cons _ h =>
rename_i l'
exact ⟨l', h, by simp_all⟩
· constructor
@@ -347,10 +351,10 @@ theorem sublist_filterMap_iff {l₁ : List β} {f : α → Option β} :
| cons _ h =>
obtain ⟨l', s, rfl⟩ := ih.1 h
exact ⟨l', Sublist.cons a s, rfl⟩
| cons _ h =>
| cons_cons _ h =>
rename_i l'
obtain ⟨l', s, rfl⟩ := ih.1 h
refine ⟨a :: l', Sublist.cons a s, ?_⟩
refine ⟨a :: l', Sublist.cons_cons a s, ?_⟩
rwa [filterMap_cons_some]
· rintro ⟨l', h, rfl⟩
replace h := h.filterMap f
@@ -369,7 +373,7 @@ theorem sublist_filter_iff {l₁ : List α} {p : α → Bool} :
theorem sublist_append_left : ∀ l₁ l₂ : List α, l₁ <+ l₁ ++ l₂
| [], _ => nil_sublist _
| _ :: l₁, l₂ => (sublist_append_left l₁ l₂).cons _
| _ :: l₁, l₂ => (sublist_append_left l₁ l₂).cons_cons _
grind_pattern sublist_append_left => Sublist, l₁ ++ l₂
@@ -382,7 +386,7 @@ grind_pattern sublist_append_right => Sublist, l₁ ++ l₂
@[simp, grind =] theorem singleton_sublist {a : α} {l} : [a] <+ l ↔ a ∈ l := by
refine ⟨fun h => h.subset (mem_singleton_self _), fun h => ?_⟩
obtain ⟨_, _, rfl⟩ := append_of_mem h
exact ((nil_sublist _).cons _).trans (sublist_append_right ..)
exact ((nil_sublist _).cons_cons _).trans (sublist_append_right ..)
@[simp] theorem sublist_append_of_sublist_left (s : l <+ l₁) : l <+ l₁ ++ l₂ :=
s.trans <| sublist_append_left ..
@@ -404,7 +408,7 @@ theorem Sublist.append_left : l₁ <+ l₂ → ∀ l, l ++ l₁ <+ l ++ l₂ :=
theorem Sublist.append_right : l₁ <+ l₂ → ∀ l, l₁ ++ l <+ l₂ ++ l
| .slnil, _ => Sublist.refl _
| .cons _ h, _ => (h.append_right _).cons _
| .cons _ h, _ => (h.append_right _).cons _
| .cons_cons _ h, _ => (h.append_right _).cons_cons _
theorem Sublist.append (hl : l₁ <+ l₂) (hr : r₁ <+ r₂) : l₁ ++ r₁ <+ l₂ ++ r₂ :=
(hl.append_right _).trans ((append_sublist_append_left _).2 hr)
@@ -418,10 +422,10 @@ theorem sublist_cons_iff {a : α} {l l'} :
· intro h
cases h with
| cons _ h => exact Or.inl h
| cons _ h => exact Or.inr ⟨_, rfl, h⟩
| cons_cons _ h => exact Or.inr ⟨_, rfl, h⟩
· rintro (h | ⟨r, rfl, h⟩)
· exact h.cons _
· exact h.cons _
· exact h.cons_cons _
@[grind =]
theorem cons_sublist_iff {a : α} {l l'} :
@@ -435,7 +439,7 @@ theorem cons_sublist_iff {a : α} {l l'} :
| cons _ w =>
obtain ⟨r₁, r₂, rfl, h₁, h₂⟩ := ih.1 w
exact ⟨a' :: r₁, r₂, by simp, mem_cons_of_mem a' h₁, h₂⟩
| cons _ w =>
| cons_cons _ w =>
exact ⟨[a], l', by simp, mem_singleton_self _, w⟩
· rintro ⟨r₁, r₂, w, h₁, h₂⟩
rw [w, singleton_append]
@@ -458,7 +462,7 @@ theorem sublist_append_iff {l : List α} :
| cons _ w =>
obtain ⟨l₁, l₂, rfl, w₁, w₂⟩ := ih.1 w
exact ⟨l₁, l₂, rfl, Sublist.cons r w₁, w₂⟩
| cons _ w =>
| cons_cons _ w =>
rename_i l
obtain ⟨l₁, l₂, rfl, w₁, w₂⟩ := ih.1 w
refine ⟨r :: l₁, l₂, by simp, cons_sublist_cons.mpr w₁, w₂⟩
@@ -466,9 +470,9 @@ theorem sublist_append_iff {l : List α} :
cases w₁ with
| cons _ w₁ =>
exact Sublist.cons _ (Sublist.append w₁ w₂)
| cons _ w₁ =>
| cons_cons _ w₁ =>
rename_i l
exact Sublist.cons _ (Sublist.append w₁ w₂)
exact Sublist.cons_cons _ (Sublist.append w₁ w₂)
theorem append_sublist_iff {l₁ l₂ : List α} :
l₁ ++ l₂ <+ r ↔ ∃ r₁ r₂, r = r₁ ++ r₂ ∧ l₁ <+ r₁ ∧ l₂ <+ r₂ := by
@@ -516,7 +520,7 @@ theorem Sublist.middle {l : List α} (h : l <+ l₁ ++ l₂) (a : α) : l <+ l
theorem Sublist.reverse : l₁ <+ l₂ → l₁.reverse <+ l₂.reverse
| .slnil => Sublist.refl _
| .cons _ h => by rw [reverse_cons]; exact sublist_append_of_sublist_left h.reverse
| .cons _ h => by rw [reverse_cons, reverse_cons]; exact h.reverse.append_right _
| .cons_cons _ h => by rw [reverse_cons, reverse_cons]; exact h.reverse.append_right _
@[simp, grind =] theorem reverse_sublist : l₁.reverse <+ l₂.reverse ↔ l₁ <+ l₂ :=
⟨fun h => l₁.reverse_reverse ▸ l₂.reverse_reverse ▸ h.reverse, Sublist.reverse⟩
@@ -558,7 +562,7 @@ theorem sublist_replicate_iff : l <+ replicate m a ↔ ∃ n, n ≤ m ∧ l = re
obtain ⟨n, le, rfl⟩ := ih.1 (sublist_of_cons_sublist w)
obtain rfl := (mem_replicate.1 (mem_of_cons_sublist w)).2
exact ⟨n+1, Nat.add_le_add_right le 1, rfl⟩
| cons _ w =>
| cons_cons _ w =>
obtain ⟨n, le, rfl⟩ := ih.1 w
refine ⟨n+1, Nat.add_le_add_right le 1, by simp [replicate_succ]⟩
· rintro ⟨n, le, w⟩
@@ -644,7 +648,7 @@ theorem flatten_sublist_iff {L : List (List α)} {l} :
cases h_sub
case cons h_sub =>
exact isSublist_iff_sublist.mpr h_sub
case cons =>
case cons_cons =>
contradiction
instance [DecidableEq α] (l₁ l₂ : List α) : Decidable (l₁ <+ l₂) :=

@@ -393,7 +393,7 @@ theorem isPrefixOfAux_toArray_zero [BEq α] (l₁ l₂ : List α) (hle : l₁.le
| [], _ => rw [dif_neg] <;> simp
| _::_, [] => simp at hle
| a::l₁, b::l₂ =>
simp [isPrefixOf_cons, isPrefixOfAux_toArray_succ', isPrefixOfAux_toArray_zero]
simp [isPrefixOf_cons_cons, isPrefixOfAux_toArray_succ', isPrefixOfAux_toArray_zero]
@[simp, grind =] theorem isPrefixOf_toArray [BEq α] (l₁ l₂ : List α) :
l₁.toArray.isPrefixOf l₂.toArray = l₁.isPrefixOf l₂ := by
@@ -407,7 +407,7 @@ theorem isPrefixOfAux_toArray_zero [BEq α] (l₁ l₂ : List α) (hle : l₁.le
cases l₂ with
| nil => simp
| cons b l₂ =>
simp only [isPrefixOf_cons, Bool.and_eq_false_imp]
simp only [isPrefixOf_cons_cons, Bool.and_eq_false_imp]
intro w
rw [ih]
simp_all

@@ -82,6 +82,15 @@ theorem get_inj {o1 o2 : Option α} {h1} {h2} :
match o1, o2, h1, h2 with
| some a, some b, _, _ => simp only [Option.get_some, Option.some.injEq]
theorem getD_inj {o₁ o₂ : Option α} (h₁ : o₁.isSome) (h₂ : o₂.isSome) {fallback} :
o₁.getD fallback = o₂.getD fallback ↔ o₁ = o₂ := by
match o₁, o₂, h₁, h₂ with
| some a, some b, _, _ => simp only [Option.getD_some, Option.some.injEq]
theorem get!_inj [Inhabited α] {o₁ o₂ : Option α} (h₁ : o₁.isSome) (h₂ : o₂.isSome) :
o₁.get! = o₂.get! ↔ o₁ = o₂ := by
simpa [get!_eq_getD] using getD_inj h₁ h₂
theorem mem_unique {o : Option α} {a b : α} (ha : a ∈ o) (hb : b ∈ o) : a = b :=
some.inj <| ha ▸ hb

@@ -369,6 +369,12 @@ theorem String.ofList_toList {s : String} : String.ofList s.toList = s := by
theorem String.asString_data {b : String} : String.ofList b.toList = b :=
String.ofList_toList
@[simp]
theorem String.ofList_comp_toList : String.ofList ∘ String.toList = id := by ext; simp
@[simp]
theorem String.toList_comp_ofList : String.toList ∘ String.ofList = id := by ext; simp
theorem String.ofList_injective {l₁ l₂ : List Char} (h : String.ofList l₁ = String.ofList l₂) : l₁ = l₂ := by
simpa using congrArg String.toList h
@@ -1525,6 +1531,11 @@ def Slice.Pos.toReplaceEnd {s : Slice} (p₀ : s.Pos) (pos : s.Pos) (h : pos ≤
theorem Slice.Pos.offset_sliceTo {s : Slice} {p₀ : s.Pos} {pos : s.Pos} {h : pos ≤ p₀} :
(sliceTo p₀ pos h).offset = pos.offset := (rfl)
@[simp]
theorem Slice.Pos.sliceTo_inj {s : Slice} {p₀ : s.Pos} {pos pos' : s.Pos} {h h'} :
p₀.sliceTo pos h = p₀.sliceTo pos' h' ↔ pos = pos' := by
simp [Pos.ext_iff]
@[simp]
theorem Slice.Pos.ofSliceTo_startPos {s : Slice} {pos : s.Pos} :
ofSliceTo (s.sliceTo pos).startPos = s.startPos := by
@@ -1713,14 +1724,15 @@ def pos! (s : String) (off : Pos.Raw) : s.Pos :=
@[simp]
theorem offset_pos {s : String} {off : Pos.Raw} {h} : (s.pos off h).offset = off := rfl
/-- Constructs a valid position on `t` from a valid position on `s` and a proof that `s = t`. -/
/-- Constructs a valid position on `t` from a valid position on `s` and a proof that
`s.copy = t.copy`. -/
@[inline]
def Slice.Pos.cast {s t : Slice} (pos : s.Pos) (h : s = t) : t.Pos where
def Slice.Pos.cast {s t : Slice} (pos : s.Pos) (h : s.copy = t.copy) : t.Pos where
offset := pos.offset
isValidForSlice := h ▸ pos.isValidForSlice
isValidForSlice := Pos.Raw.isValid_copy_iff.mp (h ▸ Pos.Raw.isValid_copy_iff.mpr pos.isValidForSlice)
@[simp]
theorem Slice.Pos.offset_cast {s t : Slice} {pos : s.Pos} {h : s = t} :
theorem Slice.Pos.offset_cast {s t : Slice} {pos : s.Pos} {h : s.copy = t.copy} :
(pos.cast h).offset = pos.offset := (rfl)
@[simp]
@@ -1728,14 +1740,14 @@ theorem Slice.Pos.cast_rfl {s : Slice} {pos : s.Pos} : pos.cast rfl = pos :=
Slice.Pos.ext (by simp)
@[simp]
theorem Slice.Pos.cast_le_cast_iff {s t : Slice} {pos pos' : s.Pos} {h : s = t} :
theorem Slice.Pos.cast_le_cast_iff {s t : Slice} {pos pos' : s.Pos} {h : s.copy = t.copy} :
pos.cast h ≤ pos'.cast h ↔ pos ≤ pos' := by
cases h; simp
simp [Slice.Pos.le_iff]
@[simp]
theorem Slice.Pos.cast_lt_cast_iff {s t : Slice} {pos pos' : s.Pos} {h : s = t} :
theorem Slice.Pos.cast_lt_cast_iff {s t : Slice} {pos pos' : s.Pos} {h : s.copy = t.copy} :
pos.cast h < pos'.cast h ↔ pos < pos' := by
cases h; simp
simp [Slice.Pos.lt_iff]
/-- Constructs a valid position on `t` from a valid position on `s` and a proof that `s = t`. -/
@[inline]
@@ -1966,6 +1978,7 @@ theorem Pos.ne_of_lt {s : String} {p q : s.Pos} : p < q → p ≠ q := by
theorem Pos.lt_of_lt_of_le {s : String} {p q r : s.Pos} : p < q → q ≤ r → p < r := by
simpa [Pos.lt_iff, Pos.le_iff] using Pos.Raw.lt_of_lt_of_le
@[simp]
theorem Pos.le_endPos {s : String} (p : s.Pos) : p ≤ s.endPos := by
simpa [Pos.le_iff] using p.isValid.le_rawEndPos
@@ -2264,14 +2277,26 @@ theorem Slice.Pos.le_ofSliceFrom {s : Slice} {p₀ : s.Pos} {pos : (s.sliceFrom
p₀ ≤ ofSliceFrom pos := by
simp [Pos.le_iff, Pos.Raw.le_iff]
@[simp]
theorem Slice.Pos.ofSliceFrom_lt_ofSliceFrom_iff {s : Slice} {p : s.Pos}
{q r : (s.sliceFrom p).Pos} : Slice.Pos.ofSliceFrom q < Slice.Pos.ofSliceFrom r ↔ q < r := by
simp [Slice.Pos.lt_iff, Pos.Raw.lt_iff]
@[simp]
theorem Slice.Pos.ofSliceFrom_le_ofSliceFrom_iff {s : Slice} {p : s.Pos}
{q r : (s.sliceFrom p).Pos} : Slice.Pos.ofSliceFrom q ≤ Slice.Pos.ofSliceFrom r ↔ q ≤ r := by
simp [Slice.Pos.le_iff, Pos.Raw.le_iff]
@[simp]
theorem Pos.ofSliceFrom_lt_ofSliceFrom_iff {s : String} {p : s.Pos}
{q r : (s.sliceFrom p).Pos} : Pos.ofSliceFrom q < Pos.ofSliceFrom r ↔ q < r := by
simp [Pos.lt_iff, Slice.Pos.lt_iff, Pos.Raw.lt_iff]
@[simp]
theorem Pos.ofSliceFrom_le_ofSliceFrom_iff {s : String} {p : s.Pos}
{q r : (s.sliceFrom p).Pos} : Pos.ofSliceFrom q ≤ Pos.ofSliceFrom r ↔ q ≤ r := by
simp [Pos.le_iff, Slice.Pos.le_iff, Pos.Raw.le_iff]
theorem Pos.get_eq_get_ofSliceFrom {s : String} {p₀ : s.Pos}
{pos : (s.sliceFrom p₀).Pos} {h} :
pos.get h = (ofSliceFrom pos).get (by rwa [← ofSliceFrom_endPos, ne_eq, ofSliceFrom_inj]) := by
@@ -2335,6 +2360,16 @@ theorem Slice.Pos.ofSliceTo_le {s : Slice} {p₀ : s.Pos} {pos : (s.sliceTo p₀
ofSliceTo pos ≤ p₀ := by
simpa [Pos.le_iff, Pos.Raw.le_iff] using pos.isValidForSlice.le_utf8ByteSize
@[simp]
theorem Pos.ofSliceTo_lt_ofSliceTo_iff {s : String} {p : s.Pos}
{q r : (s.sliceTo p).Pos} : Pos.ofSliceTo q < Pos.ofSliceTo r ↔ q < r := by
simp [Pos.lt_iff, Slice.Pos.lt_iff, Pos.Raw.lt_iff]
@[simp]
theorem Pos.ofSliceTo_le_ofSliceTo_iff {s : String} {p : s.Pos}
{q r : (s.sliceTo p).Pos} : Pos.ofSliceTo q ≤ Pos.ofSliceTo r ↔ q ≤ r := by
simp [Pos.le_iff, Slice.Pos.le_iff, Pos.Raw.le_iff]
/-- Given a position in `s` that is at most `p₀`, obtain the corresponding position in `s.sliceTo p₀`. -/
@[inline]
def Pos.sliceTo {s : String} (p₀ : s.Pos) (pos : s.Pos) (h : pos ≤ p₀) :
@@ -2351,6 +2386,11 @@ def Pos.toReplaceEnd {s : String} (p₀ : s.Pos) (pos : s.Pos) (h : pos ≤ p₀
theorem Pos.offset_sliceTo {s : String} {p₀ : s.Pos} {pos : s.Pos} {h : pos ≤ p₀} :
    (sliceTo p₀ pos h).offset = pos.offset := (rfl)

@[simp]
theorem Pos.sliceTo_inj {s : String} {p₀ : s.Pos} {pos pos' : s.Pos} {h h'} :
    p₀.sliceTo pos h = p₀.sliceTo pos' h' ↔ pos = pos' := by
  simp [Pos.ext_iff, Slice.Pos.ext_iff]

@[simp]
theorem Slice.Pos.ofSliceTo_sliceTo {s : Slice} {p₀ p : s.Pos} {h : p ≤ p₀} :
    Slice.Pos.ofSliceTo (p₀.sliceTo p h) = p := by
@@ -2419,6 +2459,27 @@ theorem Slice.Pos.ofSlice_inj {s : Slice} {p₀ p₁ : s.Pos} {h} (pos₁ pos₂
    ofSlice pos₁ = ofSlice pos₂ ↔ pos₁ = pos₂ := by
  simp [Pos.ext_iff, Pos.Raw.ext_iff]

@[simp]
theorem Slice.Pos.le_ofSlice {s : Slice} {p₀ p₁ : s.Pos} {h}
    {pos : (s.slice p₀ p₁ h).Pos} : p₀ ≤ ofSlice pos := by
  simp [Pos.le_iff, Pos.Raw.le_iff]

@[simp]
theorem Slice.Pos.ofSlice_le {s : Slice} {p₀ p₁ : s.Pos} {h}
    {pos : (s.slice p₀ p₁ h).Pos} : ofSlice pos ≤ p₁ := by
  have := (Pos.Raw.isValidForSlice_slice _).1 pos.isValidForSlice |>.1
  simpa [Pos.le_iff, Pos.Raw.le_iff]

@[simp]
theorem Slice.Pos.ofSlice_lt_ofSlice_iff {s : Slice} {p₀ p₁ : s.Pos} {h}
    {q r : (s.slice p₀ p₁ h).Pos} : Slice.Pos.ofSlice q < Slice.Pos.ofSlice r ↔ q < r := by
  simp [Slice.Pos.lt_iff, Pos.Raw.lt_iff]

@[simp]
theorem Slice.Pos.ofSlice_le_ofSlice_iff {s : Slice} {p₀ p₁ : s.Pos} {h}
    {q r : (s.slice p₀ p₁ h).Pos} : Slice.Pos.ofSlice q ≤ Slice.Pos.ofSlice r ↔ q ≤ r := by
  simp [Slice.Pos.le_iff, Pos.Raw.le_iff]
/-- Given a position in `s.slice p₀ p₁ h`, obtain the corresponding position in `s`. -/
@[inline]
def Pos.ofSlice {s : String} {p₀ p₁ : s.Pos} {h} (pos : (s.slice p₀ p₁ h).Pos) : s.Pos :=
@@ -2449,6 +2510,27 @@ theorem Pos.ofSlice_inj {s : String} {p₀ p₁ : s.Pos} {h} (pos₁ pos₂ : (s
    ofSlice pos₁ = ofSlice pos₂ ↔ pos₁ = pos₂ := by
  simp [Pos.ext_iff, Pos.Raw.ext_iff, Slice.Pos.ext_iff]

@[simp]
theorem Pos.le_ofSlice {s : String} {p₀ p₁ : s.Pos} {h}
    {pos : (s.slice p₀ p₁ h).Pos} : p₀ ≤ ofSlice pos := by
  simp [Pos.le_iff, Pos.Raw.le_iff]

@[simp]
theorem Pos.ofSlice_le {s : String} {p₀ p₁ : s.Pos} {h}
    {pos : (s.slice p₀ p₁ h).Pos} : ofSlice pos ≤ p₁ := by
  have := (Pos.Raw.isValidForSlice_slice _).1 pos.isValidForSlice |>.1
  simpa [Pos.le_iff, Pos.Raw.le_iff]

@[simp]
theorem Pos.ofSlice_lt_ofSlice_iff {s : String} {p₀ p₁ : s.Pos} {h}
    {q r : (s.slice p₀ p₁ h).Pos} : Pos.ofSlice q < Pos.ofSlice r ↔ q < r := by
  simp [Pos.lt_iff, Slice.Pos.lt_iff, Pos.Raw.lt_iff]

@[simp]
theorem Pos.ofSlice_le_ofSlice_iff {s : String} {p₀ p₁ : s.Pos} {h}
    {q r : (s.slice p₀ p₁ h).Pos} : Pos.ofSlice q ≤ Pos.ofSlice r ↔ q ≤ r := by
  simp [Pos.le_iff, Slice.Pos.le_iff, Pos.Raw.le_iff]
theorem Slice.Pos.le_trans {s : Slice} {p q r : s.Pos} : p ≤ q → q ≤ r → p ≤ r := by
  simpa [Pos.le_iff, Pos.Raw.le_iff] using Nat.le_trans
@@ -2472,6 +2554,48 @@ def Pos.slice {s : String} (pos : s.Pos) (p₀ p₁ : s.Pos) (h₁ : p₀ ≤ po
theorem Pos.offset_slice {s : String} {p₀ p₁ pos : s.Pos} {h₁ : p₀ ≤ pos} {h₂ : pos ≤ p₁} :
    (pos.slice p₀ p₁ h₁ h₂).offset = pos.offset.unoffsetBy p₀.offset := (rfl)

@[simp]
theorem Slice.Pos.offset_slice {s : Slice} {p₀ p₁ pos : s.Pos} {h₁ : p₀ ≤ pos} {h₂ : pos ≤ p₁} :
    (pos.slice p₀ p₁ h₁ h₂).offset = pos.offset.unoffsetBy p₀.offset := (rfl)

@[simp]
theorem Slice.Pos.ofSlice_slice {s : Slice} {p₀ p₁ pos : s.Pos}
    {h₁ : p₀ ≤ pos} {h₂ : pos ≤ p₁} :
    Slice.Pos.ofSlice (pos.slice p₀ p₁ h₁ h₂) = pos := by
  simpa [Pos.ext_iff] using Pos.Raw.offsetBy_unoffsetBy_of_le h₁

@[simp]
theorem Slice.Pos.slice_ofSlice {s : Slice} {p₀ p₁ : s.Pos} {h}
    {pos : (s.slice p₀ p₁ h).Pos} :
    (Slice.Pos.ofSlice pos).slice p₀ p₁ Slice.Pos.le_ofSlice Slice.Pos.ofSlice_le = pos := by
  simp [← Slice.Pos.ofSlice_inj]

@[simp]
theorem Pos.ofSlice_slice {s : String} {p₀ p₁ pos : s.Pos}
    {h₁ : p₀ ≤ pos} {h₂ : pos ≤ p₁} :
    Pos.ofSlice (pos.slice p₀ p₁ h₁ h₂) = pos := by
  simpa [Pos.ext_iff] using Pos.Raw.offsetBy_unoffsetBy_of_le h₁

@[simp]
theorem Pos.slice_ofSlice {s : String} {p₀ p₁ : s.Pos} {h}
    {pos : (s.slice p₀ p₁ h).Pos} :
    (Pos.ofSlice pos).slice p₀ p₁ Pos.le_ofSlice Pos.ofSlice_le = pos := by
  simp [← Pos.ofSlice_inj]

@[simp]
theorem Slice.Pos.slice_inj {s : Slice} {p₀ p₁ : s.Pos} {pos pos' : s.Pos}
    {h₁ h₁' h₂ h₂'} :
    pos.slice p₀ p₁ h₁ h₂ = pos'.slice p₀ p₁ h₁' h₂' ↔ pos = pos' := by
  simp [Pos.ext_iff, Pos.Raw.ext_iff, Pos.le_iff, Pos.Raw.le_iff] at h₁ h₁' ⊢
  omega

@[simp]
theorem Pos.slice_inj {s : String} {p₀ p₁ : s.Pos} {pos pos' : s.Pos}
    {h₁ h₁' h₂ h₂'} :
    pos.slice p₀ p₁ h₁ h₂ = pos'.slice p₀ p₁ h₁' h₂' ↔ pos = pos' := by
  simp [Pos.ext_iff, Pos.Raw.ext_iff, Slice.Pos.ext_iff, Pos.le_iff, Pos.Raw.le_iff] at h₁ h₁' ⊢
  omega
/--
Given a position in `s`, obtain the corresponding position in `s.slice p₀ p₁ h`, or panic if `pos`
is not between `p₀` and `p₁`.
@@ -2504,7 +2628,7 @@ taking `s.slice! p₀ p₁` already panicked. -/
@[inline]
def Slice.Pos.ofSlice! {s : Slice} {p₀ p₁ : s.Pos} (pos : (s.slice! p₀ p₁).Pos) : s.Pos :=
  if h : p₀ ≤ p₁ then
    ofSlice (h := h) (pos.cast slice_eq_slice!.symm)
    ofSlice (h := h) (pos.cast (congrArg Slice.copy slice_eq_slice!.symm))
  else
    panic! "Starting position must be less than or equal to end position."
@@ -2522,7 +2646,7 @@ taking `s.slice! p₀ p₁` already panicked or if the position is not between `
def Slice.Pos.slice! {s : Slice} (pos : s.Pos) (p₀ p₁ : s.Pos) :
    (s.slice! p₀ p₁).Pos :=
  if h : p₀ ≤ pos ∧ pos ≤ p₁ then
    (pos.slice _ _ h.1 h.2).cast slice_eq_slice!
    (pos.slice _ _ h.1 h.2).cast (congrArg Slice.copy slice_eq_slice!)
  else
    panic! "Starting position must be less than or equal to end position and position must be between starting position and end position."
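The `!` variants above defer the bounds proofs to a runtime check; a minimal usage sketch (names and signatures as they appear in this diff, not checked against the final API):

```lean
-- Sketch: `Slice.Pos.slice!` performs the bounds check `p₀ ≤ pos ∧ pos ≤ p₁`
-- at runtime and panics (with the message above) when it fails, whereas
-- `Slice.Pos.slice` takes the two proofs up front and can never panic.
example (s : String.Slice) (pos p₀ p₁ : s.Pos) : (s.slice! p₀ p₁).Pos :=
  pos.slice! p₀ p₁
```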

View File

@@ -403,7 +403,6 @@ achieved by tracking the bounds by hand, the slice API is much more convenient.
`String.Slice` bundles proofs to ensure that the start and end positions always delineate a valid
string. For this reason, it should be preferred over `Substring.Raw`.
-/
@[ext]
structure Slice where
/-- The underlying strings. -/
str : String

View File

@@ -16,6 +16,7 @@ public import Init.Data.String.Lemmas.IsEmpty
public import Init.Data.String.Lemmas.Pattern
public import Init.Data.String.Lemmas.Slice
public import Init.Data.String.Lemmas.Iterate
public import Init.Data.String.Lemmas.Intercalate
import Init.Data.Order.Lemmas
public import Init.Data.String.Basic
import Init.Data.Char.Lemmas

View File

@@ -99,6 +99,15 @@ theorem Slice.utf8ByteSize_eq_size_toByteArray_copy {s : Slice} :
s.utf8ByteSize = s.copy.toByteArray.size := by
simp [utf8ByteSize_eq]
@[ext (iff := false)]
theorem Slice.ext {s t : Slice} (h : s.str = t.str)
    (hsi : s.startInclusive.cast h = t.startInclusive)
    (hee : s.endExclusive.cast h = t.endExclusive) : s = t := by
  rcases s with ⟨s, s₁, e₁, h₁⟩
  rcases t with ⟨t, s₂, e₂, h₂⟩
  cases h
  simp_all
section Iterate
/-
@@ -106,32 +115,71 @@ These lemmas are slightly evil because they are non-definitional equalities betw
are useful and they are at least equalities between slices with definitionally equal underlying
strings, so it should be fine.
-/
set_option backward.isDefEq.respectTransparency false in
@[simp]
theorem Slice.sliceTo_sliceFrom {s : Slice} {pos pos'} :
(s.sliceFrom pos).sliceTo pos' =
s.slice pos (Slice.Pos.ofSliceFrom pos') Slice.Pos.le_ofSliceFrom := by
ext <;> simp [String.Pos.ext_iff, Pos.Raw.offsetBy_assoc]
ext <;> simp [Pos.Raw.offsetBy_assoc]
set_option backward.isDefEq.respectTransparency false in
@[simp]
theorem Slice.sliceFrom_sliceTo {s : Slice} {pos pos'} :
(s.sliceTo pos).sliceFrom pos' =
s.slice (Slice.Pos.ofSliceTo pos') pos Slice.Pos.ofSliceTo_le := by
ext <;> simp [String.Pos.ext_iff]
ext <;> simp
set_option backward.isDefEq.respectTransparency false in
@[simp]
theorem Slice.sliceFrom_sliceFrom {s : Slice} {pos pos'} :
(s.sliceFrom pos).sliceFrom pos' =
s.sliceFrom (Slice.Pos.ofSliceFrom pos') := by
ext <;> simp [String.Pos.ext_iff, Pos.Raw.offsetBy_assoc]
ext <;> simp [Pos.Raw.offsetBy_assoc]
set_option backward.isDefEq.respectTransparency false in
@[simp]
theorem Slice.sliceTo_sliceTo {s : Slice} {pos pos'} :
(s.sliceTo pos).sliceTo pos' = s.sliceTo (Slice.Pos.ofSliceTo pos') := by
ext <;> simp [String.Pos.ext_iff]
ext <;> simp
@[simp]
theorem Slice.sliceFrom_slice {s : Slice} {p₁ p₂ h p} :
(s.slice p₁ p₂ h).sliceFrom p = s.slice (Pos.ofSlice p) p₂ Pos.ofSlice_le := by
ext <;> simp [Nat.add_assoc]
@[simp]
theorem Slice.sliceTo_slice {s : Slice} {p₁ p₂ h p} :
(s.slice p₁ p₂ h).sliceTo p = s.slice p₁ (Pos.ofSlice p) Pos.le_ofSlice := by
ext <;> simp [Nat.add_assoc]
@[simp]
theorem sliceTo_sliceFrom {s : String} {pos pos'} :
(s.sliceFrom pos).sliceTo pos' =
s.slice pos (Pos.ofSliceFrom pos') Pos.le_ofSliceFrom := by
ext <;> simp
@[simp]
theorem sliceFrom_sliceTo {s : String} {pos pos'} :
(s.sliceTo pos).sliceFrom pos' =
s.slice (Pos.ofSliceTo pos') pos Pos.ofSliceTo_le := by
ext <;> simp
@[simp]
theorem sliceFrom_sliceFrom {s : String} {pos pos'} :
(s.sliceFrom pos).sliceFrom pos' =
s.sliceFrom (Pos.ofSliceFrom pos') := by
ext <;> simp
@[simp]
theorem sliceTo_sliceTo {s : String} {pos pos'} :
(s.sliceTo pos).sliceTo pos' = s.sliceTo (Pos.ofSliceTo pos') := by
ext <;> simp
@[simp]
theorem sliceFrom_slice {s : String} {p₁ p₂ h p} :
(s.slice p₁ p₂ h).sliceFrom p = s.slice (Pos.ofSlice p) p₂ Pos.ofSlice_le := by
ext <;> simp
@[simp]
theorem sliceTo_slice {s : String} {p₁ p₂ h p} :
(s.slice p₁ p₂ h).sliceTo p = s.slice p₁ (Pos.ofSlice p) Pos.le_ofSlice := by
ext <;> simp
end Iterate
@@ -157,9 +205,10 @@ theorem Slice.copy_pos {s : Slice} {p : Pos.Raw} {h : Pos.Raw.IsValidForSlice s
simp [String.Pos.ext_iff]
@[simp]
theorem Slice.cast_pos {s t : Slice} {p : Pos.Raw} {h : Pos.Raw.IsValidForSlice s p} {h' : s = t} :
    (s.pos p h).cast h' = t.pos p (h' ▸ h) := by
  simp [Pos.ext_iff]

theorem Slice.cast_pos {s t : Slice} {p : Pos.Raw} {h : Pos.Raw.IsValidForSlice s p}
    {h' : s.copy = t.copy} {h'' : Pos.Raw.IsValidForSlice t p} :
    (s.pos p h).cast h' = t.pos p h'' := by
  simp [Slice.Pos.ext_iff]
@[simp]
theorem cast_pos {s t : String} {p : Pos.Raw} {h : Pos.Raw.IsValid s p} {h' : s = t} :
@@ -176,4 +225,7 @@ theorem Pos.get_ofToSlice {s : String} {p : (s.toSlice).Pos} {h} :
    (ofToSlice p).get h = p.get (by simpa [← ofToSlice_inj]) := by
simp [get_eq_get_toSlice]
@[simp]
theorem push_empty {c : Char} : "".push c = singleton c := rfl
end String
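The `push_empty` lemma just above matches the concrete behavior of `String.push` on the empty string; a quick `#guard` sanity check using only core `String` functions:

```lean
-- Sanity check for `push_empty`: pushing a character onto the empty
-- string yields the singleton string of that character.
#guard "".push 'a' = String.singleton 'a'
#guard "".push 'a' = "a"
```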

View File

@@ -0,0 +1,70 @@
/-
Copyright (c) 2026 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Markus Himmel
-/
module
prelude
public import Init.Data.String.Defs
import all Init.Data.String.Defs
public import Init.Data.String.Slice
import all Init.Data.String.Slice
public section
namespace String
@[simp]
theorem intercalate_nil {s : String} : s.intercalate [] = "" := by
simp [intercalate]
@[simp]
theorem intercalate_singleton {s t : String} : s.intercalate [t] = t := by
simp [intercalate, intercalate.go]
private theorem intercalateGo_append {s t u : String} {l : List String} :
intercalate.go (s ++ t) u l = s ++ intercalate.go t u l := by
induction l generalizing t <;> simp [intercalate.go, String.append_assoc, *]
@[simp]
theorem intercalate_cons_cons {s t u : String} {l : List String} :
s.intercalate (t :: u :: l) = t ++ s ++ s.intercalate (u :: l) := by
simp [intercalate, intercalate.go, intercalateGo_append]
@[simp]
theorem intercalate_cons_append {s t u : String} {l : List String} :
s.intercalate ((t ++ u) :: l) = t ++ s.intercalate (u :: l) := by
cases l <;> simp [String.append_assoc]
theorem intercalate_cons_of_ne_nil {s t : String} {l : List String} (h : l ≠ []) :
    s.intercalate (t :: l) = t ++ s ++ s.intercalate l :=
  match l, h with
  | u :: l, _ => by simp
@[simp]
theorem toList_intercalate {s : String} {l : List String} :
(s.intercalate l).toList = s.toList.intercalate (l.map String.toList) := by
induction l with
| nil => simp
| cons hd tl ih => cases tl <;> simp_all
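The `intercalate` simp lemmas above can be sanity-checked on concrete inputs; `#guard` evaluates these at elaboration time:

```lean
-- `intercalate_nil`: no pieces yields the empty string.
#guard "-".intercalate [] = ""
-- `intercalate_singleton`: a single piece is returned unchanged.
#guard "-".intercalate ["a"] = "a"
-- `intercalate_cons_cons`: the separator goes between consecutive pieces.
#guard "-".intercalate ["a", "b", "c"] = "a-b-c"
```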
namespace Slice
@[simp]
theorem _root_.String.appendSlice_eq {s : String} {t : Slice} : s ++ t = s ++ t.copy := rfl
private theorem intercalateGo_append {s t : String} {u : Slice} {l : List Slice} :
intercalate.go (s ++ t) u l = s ++ intercalate.go t u l := by
induction l generalizing t <;> simp [intercalate.go, String.append_assoc, *]
@[simp]
theorem intercalate_eq {s : Slice} {l : List Slice} :
s.intercalate l = s.copy.intercalate (l.map Slice.copy) := by
induction l with
| nil => simp [intercalate]
| cons hd tl ih => cases tl <;> simp_all [intercalate, intercalate.go, intercalateGo_append]
end Slice
end String

View File

@@ -87,6 +87,10 @@ theorem isEmpty_iff_utf8ByteSize_eq_zero {s : String} : s.isEmpty ↔ s.utf8Byte
theorem isEmpty_iff {s : String} : s.isEmpty ↔ s = "" := by
  simp [isEmpty_iff_utf8ByteSize_eq_zero]

@[simp]
theorem isEmpty_eq_false_iff {s : String} : s.isEmpty = false ↔ s ≠ "" := by
  simp [← isEmpty_iff]

theorem startPos_ne_endPos_iff {s : String} : s.startPos ≠ s.endPos ↔ s ≠ "" := by
  simp
@@ -175,4 +179,34 @@ theorem Slice.toByteArray_copy_ne_empty_iff {s : Slice} :
    s.copy.toByteArray ≠ ByteArray.empty ↔ s.isEmpty = false := by
  simp
section CopyEqEmpty
-- Yes, `simp` can prove these, but we still need to mark them as simp lemmas.
@[simp]
theorem copy_slice_self {s : String} {p : s.Pos} : (s.slice p p (Pos.le_refl _)).copy = "" := by
simp
@[simp]
theorem copy_sliceTo_startPos {s : String} : (s.sliceTo s.startPos).copy = "" := by
simp
@[simp]
theorem copy_sliceFrom_startPos {s : String} : (s.sliceFrom s.endPos).copy = "" := by
simp
@[simp]
theorem Slice.copy_slice_self {s : Slice} {p : s.Pos} : (s.slice p p (Pos.le_refl _)).copy = "" := by
simp
@[simp]
theorem Slice.copy_sliceTo_startPos {s : Slice} : (s.sliceTo s.startPos).copy = "" := by
simp
@[simp]
theorem Slice.copy_sliceFrom_startPos {s : Slice} : (s.sliceFrom s.endPos).copy = "" := by
simp
end CopyEqEmpty
end String

View File

@@ -16,6 +16,9 @@ import Init.ByCases
import Init.Data.Iterators.Lemmas.Combinators.FilterMap
import Init.Data.String.Lemmas.Basic
import Init.Data.Iterators.Lemmas.Consumers.Loop
public import Init.Data.String.Lemmas.Order
import Init.Data.String.OrderInstances
import Init.Data.Subtype.Basic
set_option doc.verso true
@@ -47,6 +50,19 @@ theorem Model.positionsFrom_eq_cons {s : Slice} {p : s.Pos} (hp : p ≠ s.endPos
rw [Model.positionsFrom]
simp [hp]
@[simp]
theorem Model.mem_positionsFrom {s : Slice} {p : s.Pos} {q : { q : s.Pos // q ≠ s.endPos } } :
    q ∈ Model.positionsFrom p ↔ p ≤ q := by
  induction p using Pos.next_induction with
  | next p h ih =>
    rw [Model.positionsFrom_eq_cons h, List.mem_cons, ih]
    simp [Subtype.ext_iff, Std.le_iff_lt_or_eq (a := p), or_comm, eq_comm]
  | endPos => simp [q.property]

theorem Model.mem_positionsFrom_startPos {s : Slice} {q : { q : s.Pos // q ≠ s.endPos} } :
    q ∈ Model.positionsFrom s.startPos := by
  simp
theorem Model.map_get_positionsFrom_of_splits {s : Slice} {p : s.Pos} {t₁ t₂ : String}
(hp : p.Splits t₁ t₂) : (Model.positionsFrom p).map (fun p => p.1.get p.2) = t₂.toList := by
induction p using Pos.next_induction generalizing t₁ t₂ with
@@ -60,7 +76,6 @@ theorem Model.map_get_positionsFrom_startPos {s : Slice} :
(Model.positionsFrom s.startPos).map (fun p => p.1.get p.2) = s.copy.toList :=
Model.map_get_positionsFrom_of_splits (splits_startPos s)
set_option backward.isDefEq.respectTransparency false in
@[simp]
theorem toList_positionsFrom {s : Slice} {p : s.Pos} :
(s.positionsFrom p).toList = Model.positionsFrom p := by
@@ -80,6 +95,38 @@ theorem toList_positions {s : Slice} : s.positions.toList = Model.positionsFrom
theorem toList_chars {s : Slice} : s.chars.toList = s.copy.toList := by
simp [chars, Model.map_get_positionsFrom_startPos]
theorem mem_toList_copy_iff_exists_get {s : Slice} {c : Char} :
    c ∈ s.copy.toList ↔ ∃ (p : s.Pos) (h : p ≠ s.endPos), p.get h = c := by
  simp [← Model.map_get_positionsFrom_startPos]

theorem Pos.Splits.mem_toList_left_iff {s : Slice} {pos : s.Pos} {t u : String} {c : Char}
    (hs : pos.Splits t u) :
    c ∈ t.toList ↔ ∃ pos', ∃ (h : pos' < pos), pos'.get (Pos.ne_endPos_of_lt h) = c := by
  rw [hs.eq_left pos.splits, mem_toList_copy_iff_exists_get]
  refine ⟨?_, ?_⟩
  · rintro ⟨p, hp, hpget⟩
    have hlt : Pos.ofSliceTo p < pos := by
      simpa using Pos.ofSliceTo_lt_ofSliceTo_iff.mpr ((Pos.lt_endPos_iff _).mpr hp)
    exact ⟨_, hlt, by rwa [Pos.get_eq_get_ofSliceTo] at hpget⟩
  · rintro ⟨pos', hlt, hget⟩
    exact ⟨pos.sliceTo pos' (Std.le_of_lt hlt),
      by simpa [← Pos.ofSliceTo_inj] using Std.ne_of_lt hlt,
      by rw [Slice.Pos.get_eq_get_ofSliceTo]; simpa using hget⟩

theorem Pos.Splits.mem_toList_right_iff {s : Slice} {pos : s.Pos} {t u : String} {c : Char}
    (hs : pos.Splits t u) :
    c ∈ u.toList ↔ ∃ pos', ∃ (_ : pos ≤ pos') (h : pos' ≠ s.endPos), pos'.get h = c := by
  rw [hs.eq_right pos.splits, mem_toList_copy_iff_exists_get]
  refine ⟨?_, ?_⟩
  · rintro ⟨p, hp, hpget⟩
    exact ⟨Pos.ofSliceFrom p, Pos.le_ofSliceFrom,
      fun h => hp (Pos.ofSliceFrom_inj.mp (h.trans (Pos.ofSliceFrom_endPos (pos := pos)).symm)),
      by rwa [Pos.get_eq_get_ofSliceFrom] at hpget⟩
  · rintro ⟨pos', hle, hne, hget⟩
    exact ⟨pos.sliceFrom pos' hle,
      fun h => hne (by simpa using congrArg Pos.ofSliceFrom h),
      by rw [Pos.get_eq_get_ofSliceFrom]; simpa using hget⟩
/--
A list of all positions strictly before {name}`p`, ordered from largest to smallest.
@@ -115,7 +162,6 @@ theorem Model.map_get_revPositionsFrom_endPos {s : Slice} :
(Model.revPositionsFrom s.endPos).map (fun p => p.1.get p.2) = s.copy.toList.reverse :=
Model.map_get_revPositionsFrom_of_splits (splits_endPos s)
set_option backward.isDefEq.respectTransparency false in
@[simp]
theorem toList_revPositionsFrom {s : Slice} {p : s.Pos} :
(s.revPositionsFrom p).toList = Model.revPositionsFrom p := by
@@ -168,6 +214,19 @@ theorem Model.positionsFrom_eq_cons {s : String} {p : s.Pos} (hp : p ≠ s.endPo
rw [Model.positionsFrom]
simp [hp]
@[simp]
theorem Model.mem_positionsFrom {s : String} {p : s.Pos} {q : { q : s.Pos // q ≠ s.endPos } } :
    q ∈ Model.positionsFrom p ↔ p ≤ q := by
  induction p using Pos.next_induction with
  | next p h ih =>
    rw [Model.positionsFrom_eq_cons h, List.mem_cons, ih]
    simp [Subtype.ext_iff, Std.le_iff_lt_or_eq (a := p), or_comm, eq_comm]
  | endPos => simp [q.property]

theorem Model.mem_positionsFrom_startPos {s : String} {q : { q : s.Pos // q ≠ s.endPos} } :
    q ∈ Model.positionsFrom s.startPos := by
  simp
theorem Model.positionsFrom_eq_map {s : String} {p : s.Pos} :
    Model.positionsFrom p = (Slice.Model.positionsFrom p.toSlice).map
      (fun p => ⟨Pos.ofToSlice p.1, by simpa [← Pos.toSlice_inj] using p.2⟩) := by
@@ -199,6 +258,38 @@ theorem toList_positions {s : String} : s.positions.toList = Model.positionsFrom
theorem toList_chars {s : String} : s.chars.toList = s.toList := by
simp [chars]
theorem mem_toList_iff_exists_get {s : String} {c : Char} :
    c ∈ s.toList ↔ ∃ (p : s.Pos) (h : p ≠ s.endPos), p.get h = c := by
  simp [← Model.map_get_positionsFrom_startPos]

theorem Pos.Splits.mem_toList_left_iff {s : String} {pos : s.Pos} {t u : String} {c : Char}
    (hs : pos.Splits t u) :
    c ∈ t.toList ↔ ∃ pos', ∃ (h : pos' < pos), pos'.get (Pos.ne_endPos_of_lt h) = c := by
  rw [hs.eq_left pos.splits, Slice.mem_toList_copy_iff_exists_get]
  refine ⟨?_, ?_⟩
  · rintro ⟨p, hp, hpget⟩
    have hlt : Pos.ofSliceTo p < pos := by
      simpa using Pos.ofSliceTo_lt_ofSliceTo_iff.mpr ((Slice.Pos.lt_endPos_iff _).mpr hp)
    exact ⟨_, hlt, by rwa [Pos.get_eq_get_ofSliceTo] at hpget⟩
  · rintro ⟨pos', hlt, hget⟩
    exact ⟨pos.sliceTo pos' (Std.le_of_lt hlt),
      fun h => Std.ne_of_lt hlt (by simpa using congrArg Pos.ofSliceTo h),
      by rw [Pos.get_eq_get_ofSliceTo]; simpa using hget⟩

theorem Pos.Splits.mem_toList_right_iff {s : String} {pos : s.Pos} {t u : String} {c : Char}
    (hs : pos.Splits t u) :
    c ∈ u.toList ↔ ∃ pos', ∃ (_ : pos ≤ pos') (h : pos' ≠ s.endPos), pos'.get h = c := by
  rw [hs.eq_right pos.splits, Slice.mem_toList_copy_iff_exists_get]
  refine ⟨?_, ?_⟩
  · rintro ⟨p, hp, hpget⟩
    exact ⟨Pos.ofSliceFrom p, Pos.le_ofSliceFrom,
      fun h => hp (Pos.ofSliceFrom_inj.mp (h.trans Pos.ofSliceFrom_endPos.symm)),
      by rwa [Pos.get_eq_get_ofSliceFrom] at hpget⟩
  · rintro ⟨pos', hle, hne, hget⟩
    exact ⟨pos.sliceFrom pos' hle,
      fun h => hne (by simpa using congrArg Pos.ofSliceFrom h),
      by rw [Pos.get_eq_get_ofSliceFrom]; simpa using hget⟩
/--
A list of all positions strictly before {name}`p`, ordered from largest to smallest.

View File

@@ -10,6 +10,7 @@ public import Init.Data.String.Basic
import Init.Data.String.OrderInstances
import Init.Data.String.Lemmas.Basic
import Init.Data.Order.Lemmas
import Init.Omega
public section
@@ -56,6 +57,14 @@ theorem Slice.Pos.endPos_le {s : Slice} (p : s.Pos) : s.endPos ≤ p ↔ p = s.e
theorem Slice.Pos.lt_endPos_iff {s : Slice} (p : s.Pos) : p < s.endPos ↔ p ≠ s.endPos := by
  simp [← endPos_le, Std.not_le]

@[simp]
theorem Pos.endPos_le {s : String} (p : s.Pos) : s.endPos ≤ p ↔ p = s.endPos :=
  ⟨fun h => Std.le_antisymm (le_endPos _) h, by simp +contextual⟩

@[simp]
theorem Pos.lt_endPos_iff {s : String} (p : s.Pos) : p < s.endPos ↔ p ≠ s.endPos := by
  simp [← endPos_le, Std.not_le]

@[simp]
theorem Pos.le_startPos {s : String} (p : s.Pos) : p ≤ s.startPos ↔ p = s.startPos :=
  ⟨fun h => Std.le_antisymm h (startPos_le _), by simp +contextual⟩
@@ -64,10 +73,6 @@ theorem Pos.le_startPos {s : String} (p : s.Pos) : p ≤ s.startPos ↔ p = s.st
theorem Pos.startPos_lt_iff {s : String} {p : s.Pos} : s.startPos < p ↔ p ≠ s.startPos := by
  simp [← le_startPos, Std.not_le]

@[simp]
theorem Pos.endPos_le {s : String} (p : s.Pos) : s.endPos ≤ p ↔ p = s.endPos :=
  ⟨fun h => Std.le_antisymm (le_endPos _) h, by simp +contextual [Std.le_refl]⟩
@[simp]
theorem Slice.Pos.not_lt_startPos {s : Slice} {p : s.Pos} : ¬ p < s.startPos :=
fun h => Std.lt_irrefl (Std.lt_of_lt_of_le h (Slice.Pos.startPos_le _))
@@ -100,19 +105,317 @@ theorem Slice.Pos.le_next {s : Slice} {p : s.Pos} {h} : p ≤ p.next h :=
theorem Pos.le_next {s : String} {p : s.Pos} {h} : p ≤ p.next h :=
  Std.le_of_lt (by simp)

@[simp]
theorem Slice.Pos.ne_next {s : Slice} {p : s.Pos} {h} : p ≠ p.next h :=
  Std.ne_of_lt (by simp)

@[simp]
theorem Pos.ne_next {s : String} {p : s.Pos} {h} : p ≠ p.next h :=
  Std.ne_of_lt (by simp)

@[simp]
theorem Slice.Pos.next_ne {s : Slice} {p : s.Pos} {h} : p.next h ≠ p :=
  Ne.symm (by simp)

@[simp]
theorem Pos.next_ne {s : String} {p : s.Pos} {h} : p.next h ≠ p :=
  Ne.symm (by simp)

@[simp]
theorem Slice.Pos.next_ne_startPos {s : Slice} {p : s.Pos} {h} :
    p.next h ≠ s.startPos :=
  ne_startPos_of_lt lt_next
@[simp]
theorem Slice.Pos.ofSliceTo_lt_ofSliceTo_iff {s : Slice} {p : s.Pos}
    {q r : (s.sliceTo p).Pos} : Slice.Pos.ofSliceTo q < Slice.Pos.ofSliceTo r ↔ q < r := by
  simp [Slice.Pos.lt_iff, Pos.Raw.lt_iff]

@[simp]
theorem Slice.Pos.ofSliceTo_le_ofSliceTo_iff {s : Slice} {p : s.Pos}
    {q r : (s.sliceTo p).Pos} : Slice.Pos.ofSliceTo q ≤ Slice.Pos.ofSliceTo r ↔ q ≤ r := by
  simp [Slice.Pos.le_iff, Pos.Raw.le_iff]

@[simp]
theorem Slice.Pos.sliceTo_lt_sliceTo_iff {s : Slice} {p₀ : s.Pos} {q r : s.Pos} {h₁ h₂} :
    Pos.sliceTo p₀ q h₁ < Pos.sliceTo p₀ r h₂ ↔ q < r := by
  simp [Slice.Pos.lt_iff, Pos.Raw.lt_iff]

@[simp]
theorem Slice.Pos.sliceTo_le_sliceTo_iff {s : Slice} {p₀ : s.Pos} {q r : s.Pos} {h₁ h₂} :
    Pos.sliceTo p₀ q h₁ ≤ Pos.sliceTo p₀ r h₂ ↔ q ≤ r := by
  simp [Slice.Pos.le_iff, Pos.Raw.le_iff]

@[simp]
theorem Pos.sliceTo_lt_sliceTo_iff {s : String} {p₀ : s.Pos} {q r : s.Pos} {h₁ h₂} :
    Pos.sliceTo p₀ q h₁ < Pos.sliceTo p₀ r h₂ ↔ q < r := by
  simp [Slice.Pos.lt_iff, Pos.lt_iff, Pos.Raw.lt_iff]

@[simp]
theorem Pos.sliceTo_le_sliceTo_iff {s : String} {p₀ : s.Pos} {q r : s.Pos} {h₁ h₂} :
    Pos.sliceTo p₀ q h₁ ≤ Pos.sliceTo p₀ r h₂ ↔ q ≤ r := by
  simp [Slice.Pos.le_iff, Pos.le_iff, Pos.Raw.le_iff]

@[simp]
theorem Slice.Pos.sliceFrom_lt_sliceFrom_iff {s : Slice} {p₀ : s.Pos} {q r : s.Pos} {h₁ h₂} :
    Pos.sliceFrom p₀ q h₁ < Pos.sliceFrom p₀ r h₂ ↔ q < r := by
  simp [Slice.Pos.lt_iff, Pos.Raw.lt_iff, Slice.Pos.le_iff, Pos.Raw.le_iff] at h₁ h₂ ⊢
  omega

@[simp]
theorem Slice.Pos.sliceFrom_le_sliceFrom_iff {s : Slice} {p₀ : s.Pos} {q r : s.Pos} {h₁ h₂} :
    Pos.sliceFrom p₀ q h₁ ≤ Pos.sliceFrom p₀ r h₂ ↔ q ≤ r := by
  simp [Slice.Pos.le_iff, Pos.Raw.le_iff] at h₁ h₂ ⊢
  omega

@[simp]
theorem Pos.sliceFrom_lt_sliceFrom_iff {s : String} {p₀ : s.Pos} {q r : s.Pos} {h₁ h₂} :
    Pos.sliceFrom p₀ q h₁ < Pos.sliceFrom p₀ r h₂ ↔ q < r := by
  simp [Slice.Pos.lt_iff, Pos.lt_iff, Pos.Raw.lt_iff, Pos.le_iff, Pos.Raw.le_iff] at h₁ h₂ ⊢
  omega

@[simp]
theorem Pos.sliceFrom_le_sliceFrom_iff {s : String} {p₀ : s.Pos} {q r : s.Pos} {h₁ h₂} :
    Pos.sliceFrom p₀ q h₁ ≤ Pos.sliceFrom p₀ r h₂ ↔ q ≤ r := by
  simp [Slice.Pos.le_iff, Pos.le_iff, Pos.Raw.le_iff] at h₁ h₂ ⊢
  omega
theorem Slice.Pos.ofSliceFrom_lt_iff {s : Slice} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {q : s.Pos} :
    Pos.ofSliceFrom p < q ↔ ∃ h, p < Slice.Pos.sliceFrom p₀ q h := by
  refine ⟨fun h => ⟨Std.le_of_lt (Std.lt_of_le_of_lt Pos.le_ofSliceFrom h), ?_⟩, fun ⟨h, h'⟩ => ?_⟩
  · simp +singlePass only [← Pos.sliceFrom_ofSliceFrom (p := p)]
    rwa [Pos.sliceFrom_lt_sliceFrom_iff]
  · simp +singlePass only [← Pos.ofSliceFrom_sliceFrom (h := h)]
    rwa [Pos.ofSliceFrom_lt_ofSliceFrom_iff]

theorem Slice.Pos.le_ofSliceFrom_iff {s : Slice} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {q : s.Pos} :
    q ≤ Pos.ofSliceFrom p ↔ ∀ h, Slice.Pos.sliceFrom p₀ q h ≤ p := by
  simp [← Std.not_lt, Pos.ofSliceFrom_lt_iff]

theorem Slice.Pos.ofSliceFrom_le_iff {s : Slice} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {q : s.Pos} :
    Pos.ofSliceFrom p ≤ q ↔ ∃ h, p ≤ Slice.Pos.sliceFrom p₀ q h := by
  refine ⟨fun h => ⟨Std.le_trans Pos.le_ofSliceFrom h, ?_⟩, fun ⟨h, h'⟩ => ?_⟩
  · simp +singlePass only [← Pos.sliceFrom_ofSliceFrom (p := p)]
    rwa [Pos.sliceFrom_le_sliceFrom_iff]
  · simp +singlePass only [← Pos.ofSliceFrom_sliceFrom (h := h)]
    rwa [Pos.ofSliceFrom_le_ofSliceFrom_iff]

theorem Slice.Pos.lt_ofSliceFrom_iff {s : Slice} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {q : s.Pos} :
    q < Pos.ofSliceFrom p ↔ ∀ h, Slice.Pos.sliceFrom p₀ q h < p := by
  simp [← Std.not_le, Pos.ofSliceFrom_le_iff]

theorem Pos.ofSliceFrom_lt_iff {s : String} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {q : s.Pos} :
    Pos.ofSliceFrom p < q ↔ ∃ h, p < Pos.sliceFrom p₀ q h := by
  refine ⟨fun h => ⟨Std.le_of_lt (Std.lt_of_le_of_lt Pos.le_ofSliceFrom h), ?_⟩, fun ⟨h, h'⟩ => ?_⟩
  · simp +singlePass only [← Pos.sliceFrom_ofSliceFrom (p := p)]
    rwa [Pos.sliceFrom_lt_sliceFrom_iff]
  · simp +singlePass only [← Pos.ofSliceFrom_sliceFrom (h := h)]
    rwa [Pos.ofSliceFrom_lt_ofSliceFrom_iff]

theorem Pos.le_ofSliceFrom_iff {s : String} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {q : s.Pos} :
    q ≤ Pos.ofSliceFrom p ↔ ∀ h, Pos.sliceFrom p₀ q h ≤ p := by
  simp [← Std.not_lt, Pos.ofSliceFrom_lt_iff]

theorem Pos.ofSliceFrom_le_iff {s : String} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {q : s.Pos} :
    Pos.ofSliceFrom p ≤ q ↔ ∃ h, p ≤ Pos.sliceFrom p₀ q h := by
  refine ⟨fun h => ⟨Std.le_trans Pos.le_ofSliceFrom h, ?_⟩, fun ⟨h, h'⟩ => ?_⟩
  · simp +singlePass only [← Pos.sliceFrom_ofSliceFrom (p := p)]
    rwa [Pos.sliceFrom_le_sliceFrom_iff]
  · simp +singlePass only [← Pos.ofSliceFrom_sliceFrom (h := h)]
    rwa [Pos.ofSliceFrom_le_ofSliceFrom_iff]

theorem Pos.lt_ofSliceFrom_iff {s : String} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {q : s.Pos} :
    q < Pos.ofSliceFrom p ↔ ∀ h, Pos.sliceFrom p₀ q h < p := by
  simp [← Std.not_le, Pos.ofSliceFrom_le_iff]
theorem Slice.Pos.ofSliceFrom_next {s : Slice} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {h} :
    Pos.ofSliceFrom (p.next h) = (Pos.ofSliceFrom p).next (by simpa [← Pos.ofSliceFrom_inj] using h) := by
  rw [eq_comm, Pos.next_eq_iff]
  simp only [Pos.ofSliceFrom_lt_ofSliceFrom_iff, Pos.lt_next, Pos.ofSliceFrom_le_iff,
    Pos.next_le_iff_lt, true_and]
  simp [Pos.ofSliceFrom_lt_iff]

theorem Pos.ofSliceFrom_next {s : String} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {h} :
    Pos.ofSliceFrom (p.next h) = (Pos.ofSliceFrom p).next (by simpa [← Pos.ofSliceFrom_inj] using h) := by
  rw [eq_comm, Pos.next_eq_iff]
  simp only [Pos.ofSliceFrom_lt_ofSliceFrom_iff, Slice.Pos.lt_next, Pos.ofSliceFrom_le_iff,
    Slice.Pos.next_le_iff_lt, true_and]
  simp [Pos.ofSliceFrom_lt_iff]
theorem Slice.Pos.le_ofSliceTo_iff {s : Slice} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {q : s.Pos} :
    q ≤ Pos.ofSliceTo p ↔ ∃ h, Slice.Pos.sliceTo p₀ q h ≤ p := by
  refine ⟨fun h => ⟨Slice.Pos.le_trans h Pos.ofSliceTo_le, ?_⟩, fun ⟨h, h'⟩ => ?_⟩
  · simp +singlePass only [← Pos.sliceTo_ofSliceTo (p := p)]
    rwa [Pos.sliceTo_le_sliceTo_iff]
  · simp +singlePass only [← Pos.ofSliceTo_sliceTo (h := h)]
    rwa [Pos.ofSliceTo_le_ofSliceTo_iff]

theorem Slice.Pos.ofSliceTo_lt_iff {s : Slice} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {q : s.Pos} :
    Pos.ofSliceTo p < q ↔ ∀ h, p < Slice.Pos.sliceTo p₀ q h := by
  simp [← Std.not_le, Slice.Pos.le_ofSliceTo_iff]

theorem Slice.Pos.lt_ofSliceTo_iff {s : Slice} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {q : s.Pos} :
    q < Pos.ofSliceTo p ↔ ∃ h, Slice.Pos.sliceTo p₀ q h < p := by
  refine ⟨fun h => ⟨Std.le_of_lt (Std.lt_of_le_of_lt (Std.le_refl q) (Std.lt_of_lt_of_le h Pos.ofSliceTo_le)), ?_⟩, fun ⟨h, h'⟩ => ?_⟩
  · simp +singlePass only [← Pos.sliceTo_ofSliceTo (p := p)]
    rwa [Pos.sliceTo_lt_sliceTo_iff]
  · simp +singlePass only [← Pos.ofSliceTo_sliceTo (h := h)]
    rwa [Pos.ofSliceTo_lt_ofSliceTo_iff]

theorem Slice.Pos.ofSliceTo_le_iff {s : Slice} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {q : s.Pos} :
    Pos.ofSliceTo p ≤ q ↔ ∀ h, p ≤ Slice.Pos.sliceTo p₀ q h := by
  simp [← Std.not_lt, Slice.Pos.lt_ofSliceTo_iff]

theorem Pos.le_ofSliceTo_iff {s : String} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {q : s.Pos} :
    q ≤ Pos.ofSliceTo p ↔ ∃ h, Pos.sliceTo p₀ q h ≤ p := by
  refine ⟨fun h => ⟨Pos.le_trans h Pos.ofSliceTo_le, ?_⟩, fun ⟨h, h'⟩ => ?_⟩
  · simp +singlePass only [← Pos.sliceTo_ofSliceTo (p := p)]
    rwa [Pos.sliceTo_le_sliceTo_iff]
  · simp +singlePass only [← Pos.ofSliceTo_sliceTo (h := h)]
    rwa [Pos.ofSliceTo_le_ofSliceTo_iff]

theorem Pos.ofSliceTo_lt_iff {s : String} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {q : s.Pos} :
    Pos.ofSliceTo p < q ↔ ∀ h, p < Pos.sliceTo p₀ q h := by
  simp [← Std.not_le, Pos.le_ofSliceTo_iff]

theorem Pos.lt_ofSliceTo_iff {s : String} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {q : s.Pos} :
    q < Pos.ofSliceTo p ↔ ∃ h, Pos.sliceTo p₀ q h < p := by
  refine ⟨fun h => ⟨Pos.le_of_lt (Pos.lt_of_lt_of_le h Pos.ofSliceTo_le), ?_⟩, fun ⟨h, h'⟩ => ?_⟩
  · simp +singlePass only [← Pos.sliceTo_ofSliceTo (p := p)]
    rwa [Pos.sliceTo_lt_sliceTo_iff]
  · simp +singlePass only [← Pos.ofSliceTo_sliceTo (h := h)]
    rwa [Pos.ofSliceTo_lt_ofSliceTo_iff]

theorem Pos.ofSliceTo_le_iff {s : String} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {q : s.Pos} :
    Pos.ofSliceTo p ≤ q ↔ ∀ h, p ≤ Pos.sliceTo p₀ q h := by
  simp [← Std.not_lt, Pos.lt_ofSliceTo_iff]
theorem Slice.Pos.lt_sliceFrom_iff {s : Slice} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {q : s.Pos} {h} :
    p < Slice.Pos.sliceFrom p₀ q h ↔ Pos.ofSliceFrom p < q := by
  simp [ofSliceFrom_lt_iff, h]

theorem Slice.Pos.sliceFrom_le_iff {s : Slice} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {q : s.Pos} {h} :
    Slice.Pos.sliceFrom p₀ q h ≤ p ↔ q ≤ Pos.ofSliceFrom p := by
  simp [← Std.not_lt, lt_sliceFrom_iff]

theorem Slice.Pos.le_sliceFrom_iff {s : Slice} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {q : s.Pos} {h} :
    p ≤ Slice.Pos.sliceFrom p₀ q h ↔ Pos.ofSliceFrom p ≤ q := by
  simp [ofSliceFrom_le_iff, h]

theorem Slice.Pos.sliceFrom_lt_iff {s : Slice} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {q : s.Pos} {h} :
    Slice.Pos.sliceFrom p₀ q h < p ↔ q < Pos.ofSliceFrom p := by
  simp [← Std.not_le, le_sliceFrom_iff]

theorem Pos.lt_sliceFrom_iff {s : String} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {q : s.Pos} {h} :
    p < Pos.sliceFrom p₀ q h ↔ Pos.ofSliceFrom p < q := by
  simp [ofSliceFrom_lt_iff, h]

theorem Pos.sliceFrom_le_iff {s : String} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {q : s.Pos} {h} :
    Pos.sliceFrom p₀ q h ≤ p ↔ q ≤ Pos.ofSliceFrom p := by
  simp [← Std.not_lt, lt_sliceFrom_iff]

theorem Pos.le_sliceFrom_iff {s : String} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {q : s.Pos} {h} :
    p ≤ Pos.sliceFrom p₀ q h ↔ Pos.ofSliceFrom p ≤ q := by
  simp [ofSliceFrom_le_iff, h]

theorem Pos.sliceFrom_lt_iff {s : String} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {q : s.Pos} {h} :
    Pos.sliceFrom p₀ q h < p ↔ q < Pos.ofSliceFrom p := by
  simp [← Std.not_le, le_sliceFrom_iff]

theorem Slice.Pos.sliceTo_le_iff {s : Slice} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {q : s.Pos} {h} :
    Pos.sliceTo p₀ q h ≤ p ↔ q ≤ Pos.ofSliceTo p := by
  simp [le_ofSliceTo_iff, h]

theorem Slice.Pos.lt_sliceTo_iff {s : Slice} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {q : s.Pos} {h} :
    p < Pos.sliceTo p₀ q h ↔ Pos.ofSliceTo p < q := by
  simp [← Std.not_le, sliceTo_le_iff]

theorem Slice.Pos.sliceTo_lt_iff {s : Slice} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {q : s.Pos} {h} :
    Slice.Pos.sliceTo p₀ q h < p ↔ q < Pos.ofSliceTo p := by
  simp [lt_ofSliceTo_iff, h]

theorem Slice.Pos.le_sliceTo_iff {s : Slice} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {q : s.Pos} {h} :
    p ≤ Slice.Pos.sliceTo p₀ q h ↔ Pos.ofSliceTo p ≤ q := by
  simp [← Std.not_lt, sliceTo_lt_iff]

theorem Pos.sliceTo_le_iff {s : String} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {q : s.Pos} {h} :
    Pos.sliceTo p₀ q h ≤ p ↔ q ≤ Pos.ofSliceTo p := by
  simp [le_ofSliceTo_iff, h]

theorem Pos.lt_sliceTo_iff {s : String} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {q : s.Pos} {h} :
    p < Pos.sliceTo p₀ q h ↔ Pos.ofSliceTo p < q := by
  simp [← Std.not_le, sliceTo_le_iff]

theorem Pos.sliceTo_lt_iff {s : String} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {q : s.Pos} {h} :
    Pos.sliceTo p₀ q h < p ↔ q < Pos.ofSliceTo p := by
  simp [lt_ofSliceTo_iff, h]

theorem Pos.le_sliceTo_iff {s : String} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {q : s.Pos} {h} :
    p ≤ Pos.sliceTo p₀ q h ↔ Pos.ofSliceTo p ≤ q := by
  simp [← Std.not_lt, sliceTo_lt_iff]
theorem Slice.Pos.ofSliceTo_ne_endPos {s : Slice} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos}
(h : p ≠ (s.sliceTo p₀).endPos) : Pos.ofSliceTo p ≠ s.endPos := by
refine (lt_endPos_iff _).1 (Std.lt_of_lt_of_le ?_ (le_endPos p₀))
simpa [← lt_endPos_iff, ofSliceTo_lt_ofSliceTo_iff] using h
theorem Pos.ofSliceTo_ne_endPos {s : String} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos}
(h : p ≠ (s.sliceTo p₀).endPos) : Pos.ofSliceTo p ≠ s.endPos := by
refine (lt_endPos_iff _).1 (Std.lt_of_lt_of_le ?_ (le_endPos p₀))
simpa [← Slice.Pos.lt_endPos_iff, ofSliceTo_lt_ofSliceTo_iff] using h
theorem Slice.Pos.ofSliceTo_next {s : Slice} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {h} :
Pos.ofSliceTo (p.next h) = (Pos.ofSliceTo p).next (ofSliceTo_ne_endPos h) := by
rw [eq_comm, Pos.next_eq_iff]
simp only [Pos.ofSliceTo_lt_ofSliceTo_iff, Pos.lt_next, Pos.ofSliceTo_le_iff,
Pos.next_le_iff_lt, true_and]
simp [Pos.ofSliceTo_lt_iff]
theorem Pos.ofSliceTo_next {s : String} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {h} :
Pos.ofSliceTo (p.next h) = (Pos.ofSliceTo p).next (ofSliceTo_ne_endPos h) := by
rw [eq_comm, Pos.next_eq_iff]
simp only [Pos.ofSliceTo_lt_ofSliceTo_iff, Slice.Pos.lt_next, Pos.ofSliceTo_le_iff,
Slice.Pos.next_le_iff_lt, true_and]
simp [Pos.ofSliceTo_lt_iff]
@[simp]
theorem Slice.Pos.slice_lt_slice_iff {s : Slice} {p₀ p₁ : s.Pos} {q r : s.Pos}
{h₁ h₁' h₂ h₂'} :
q.slice p₀ p₁ h₁ h₂ < r.slice p₀ p₁ h₁' h₂' ↔ q < r := by
simp [Slice.Pos.lt_iff, Pos.Raw.lt_iff, Slice.Pos.le_iff, Pos.Raw.le_iff] at h₁ h₁'
omega
@[simp]
theorem Slice.Pos.slice_le_slice_iff {s : Slice} {p₀ p₁ : s.Pos} {q r : s.Pos}
{h₁ h₁' h₂ h₂'} :
q.slice p₀ p₁ h₁ h₂ ≤ r.slice p₀ p₁ h₁' h₂' ↔ q ≤ r := by
simp [Slice.Pos.le_iff, Pos.Raw.le_iff] at h₁ h₁'
omega
@[simp]
theorem Pos.slice_lt_slice_iff {s : String} {p₀ p₁ : s.Pos} {q r : s.Pos}
{h₁ h₁' h₂ h₂'} :
q.slice p₀ p₁ h₁ h₂ < r.slice p₀ p₁ h₁' h₂' ↔ q < r := by
simp [Slice.Pos.lt_iff, Pos.lt_iff, Pos.Raw.lt_iff, Pos.le_iff, Pos.Raw.le_iff] at h₁ h₁'
omega
@[simp]
theorem Pos.slice_le_slice_iff {s : String} {p₀ p₁ : s.Pos} {q r : s.Pos}
{h₁ h₁' h₂ h₂'} :
q.slice p₀ p₁ h₁ h₂ ≤ r.slice p₀ p₁ h₁' h₂' ↔ q ≤ r := by
simp [Slice.Pos.le_iff, Pos.le_iff, Pos.Raw.le_iff] at h₁ h₁'
omega
theorem Slice.Pos.ofSlice_ne_endPos {s : Slice} {p₀ p₁ : s.Pos} {h} {p : (s.slice p₀ p₁ h).Pos}
(h : p ≠ (s.slice p₀ p₁ h).endPos) : Pos.ofSlice p ≠ s.endPos := by
refine (lt_endPos_iff _).1 (Std.lt_of_lt_of_le ?_ (le_endPos p₁))
simpa [← lt_endPos_iff, ofSlice_lt_ofSlice_iff] using h
theorem Pos.ofSlice_ne_endPos {s : String} {p₀ p₁ : s.Pos} {h} {p : (s.slice p₀ p₁ h).Pos}
(h : p ≠ (s.slice p₀ p₁ h).endPos) : Pos.ofSlice p ≠ s.endPos := by
refine (lt_endPos_iff _).1 (Std.lt_of_lt_of_le ?_ (le_endPos p₁))
simpa [← Slice.Pos.lt_endPos_iff, ofSlice_lt_ofSlice_iff] using h
@[simp]
theorem Slice.Pos.offset_le_rawEndPos {s : Slice} {p : s.Pos} :
p.offset ≤ s.rawEndPos :=
@@ -161,4 +464,38 @@ theorem Pos.isUTF8FirstByte_getUTF8Byte_offset {s : String} {p : s.Pos} {h} :
(s.getUTF8Byte p.offset h).IsUTF8FirstByte := by
simpa [getUTF8Byte_offset] using isUTF8FirstByte_byte
theorem Slice.Pos.get_eq_get_ofSliceTo {s : Slice} {p₀ : s.Pos} {pos : (s.sliceTo p₀).Pos} {h} :
pos.get h = (ofSliceTo pos).get (ofSliceTo_ne_endPos h) := by
simp [Slice.Pos.get]
theorem Pos.get_eq_get_ofSliceTo {s : String} {p₀ : s.Pos}
{pos : (s.sliceTo p₀).Pos} {h} :
pos.get h = (ofSliceTo pos).get (ofSliceTo_ne_endPos h) := by
simp [Pos.get, Slice.Pos.get]
theorem Slice.Pos.get_eq_get_ofSlice {s : Slice} {p₀ p₁ : s.Pos} {h}
{pos : (s.slice p₀ p₁ h).Pos} {h'} :
pos.get h' = (ofSlice pos).get (ofSlice_ne_endPos h') := by
simp [Slice.Pos.get, Nat.add_assoc]
theorem Pos.get_eq_get_ofSlice {s : String} {p₀ p₁ : s.Pos} {h}
{pos : (s.slice p₀ p₁ h).Pos} {h'} :
pos.get h' = (ofSlice pos).get (ofSlice_ne_endPos h') := by
simp [Pos.get, Slice.Pos.get]
theorem Slice.Pos.ofSlice_next {s : Slice} {p₀ p₁ : s.Pos} {h}
{p : (s.slice p₀ p₁ h).Pos} {h'} :
Pos.ofSlice (p.next h') = (Pos.ofSlice p).next (ofSlice_ne_endPos h') := by
simp only [Slice.Pos.ext_iff, Pos.Raw.ext_iff, Slice.Pos.offset_next, Slice.Pos.offset_ofSlice]
rw [Slice.Pos.get_eq_get_ofSlice (h' := h')]
simp [Pos.Raw.offsetBy, Nat.add_assoc]
theorem Pos.ofSlice_next {s : String} {p₀ p₁ : s.Pos} {h}
{p : (s.slice p₀ p₁ h).Pos} {h'} :
Pos.ofSlice (p.next h') = (Pos.ofSlice p).next (ofSlice_ne_endPos h') := by
simp only [Pos.ext_iff, Pos.Raw.ext_iff, Slice.Pos.offset_next, Pos.offset_next,
Pos.offset_ofSlice]
rw [Pos.get_eq_get_ofSlice (h' := h')]
simp [Pos.Raw.offsetBy, Nat.add_assoc]
end String


@@ -12,3 +12,4 @@ public import Init.Data.String.Lemmas.Pattern.Pred
public import Init.Data.String.Lemmas.Pattern.Char
public import Init.Data.String.Lemmas.Pattern.String
public import Init.Data.String.Lemmas.Pattern.Split
public import Init.Data.String.Lemmas.Pattern.Find


@@ -12,6 +12,7 @@ public import Init.Data.Iterators.Consumers.Collect
import all Init.Data.String.Pattern.Basic
import Init.Data.String.OrderInstances
import Init.Data.String.Lemmas.IsEmpty
import Init.Data.String.Lemmas.Basic
import Init.Data.String.Lemmas.Order
import Init.Data.String.Termination
import Init.Data.Order.Lemmas
@@ -168,6 +169,24 @@ theorem IsLongestMatchAt.eq {pat : ρ} [ForwardPatternModel pat] {s : Slice} {st
endPos = endPos' := by
simpa using h.isLongestMatch_sliceFrom.eq h'.isLongestMatch_sliceFrom
private theorem isLongestMatch_of_eq {pat : ρ} [ForwardPatternModel pat] {s t : Slice}
{pos : s.Pos} {pos' : t.Pos} (h_eq : s = t) (h_pos : pos.offset = pos'.offset)
(hm : IsLongestMatch pat pos) : IsLongestMatch pat pos' := by
subst h_eq; exact (Slice.Pos.ext h_pos) ▸ hm
theorem isLongestMatchAt_iff_isLongestMatchAt_ofSliceFrom {pat : ρ} [ForwardPatternModel pat]
{s : Slice} {base : s.Pos} {startPos endPos : (s.sliceFrom base).Pos} :
IsLongestMatchAt pat startPos endPos ↔ IsLongestMatchAt pat (Pos.ofSliceFrom startPos) (Pos.ofSliceFrom endPos) := by
constructor
· intro h
refine ⟨Slice.Pos.ofSliceFrom_le_ofSliceFrom_iff.mpr h.le, ?_⟩
exact isLongestMatch_of_eq Slice.sliceFrom_sliceFrom
(by simp [Pos.Raw.ext_iff]; omega) h.isLongestMatch_sliceFrom
· intro h
refine ⟨Slice.Pos.ofSliceFrom_le_ofSliceFrom_iff.mp h.le, ?_⟩
exact isLongestMatch_of_eq Slice.sliceFrom_sliceFrom.symm
(by simp [Pos.Raw.ext_iff]; omega) h.isLongestMatch_sliceFrom
theorem IsLongestMatch.isLongestMatchAt_ofSliceFrom {pat : ρ} [ForwardPatternModel pat] {s : Slice}
{p₀ : s.Pos} {pos : (s.sliceFrom p₀).Pos} (h : IsLongestMatch pat pos) :
IsLongestMatchAt pat p₀ (Slice.Pos.ofSliceFrom pos) where
@@ -198,6 +217,27 @@ theorem matchesAt_iff_exists_isMatch {pat : ρ} [ForwardPatternModel pat] {s : S
⟨Std.le_trans h₁ (by simpa [← Pos.ofSliceFrom_le_ofSliceFrom_iff] using hq.le_of_isMatch h₂),
by simpa using hq⟩
@[simp]
theorem not_matchesAt_endPos {pat : ρ} [ForwardPatternModel pat] {s : Slice} :
¬ MatchesAt pat s.endPos := by
simp only [matchesAt_iff_exists_isMatch, Pos.endPos_le, exists_prop_eq]
intro h
simpa [← Pos.ofSliceFrom_inj] using h.ne_startPos
theorem matchesAt_iff_matchesAt_ofSliceFrom {pat : ρ} [ForwardPatternModel pat] {s : Slice} {base : s.Pos}
{pos : (s.sliceFrom base).Pos} : MatchesAt pat pos ↔ MatchesAt pat (Pos.ofSliceFrom pos) := by
simp only [matchesAt_iff_exists_isLongestMatchAt]
constructor
· rintro ⟨endPos, h⟩
exact ⟨Pos.ofSliceFrom endPos, isLongestMatchAt_iff_isLongestMatchAt_ofSliceFrom.mp h⟩
· rintro ⟨endPos, h⟩
exact ⟨base.sliceFrom endPos (Std.le_trans Slice.Pos.le_ofSliceFrom h.le),
isLongestMatchAt_iff_isLongestMatchAt_ofSliceFrom.mpr (by simpa using h)⟩
theorem IsLongestMatchAt.matchesAt {pat : ρ} [ForwardPatternModel pat] {s : Slice} {startPos endPos : s.Pos}
(h : IsLongestMatchAt pat startPos endPos) : MatchesAt pat startPos where
exists_isLongestMatchAt := ⟨_, h⟩
open Classical in
/--
Noncomputable model function returning the end point of the longest match starting at the given


@@ -10,6 +10,10 @@ public import Init.Data.String.Pattern.Char
public import Init.Data.String.Lemmas.Pattern.Basic
import Init.Data.Option.Lemmas
import Init.Data.String.Lemmas.Basic
import Init.Data.String.Lemmas.Order
import Init.Data.Order.Lemmas
import Init.Data.String.OrderInstances
import Init.Omega
public section
@@ -25,8 +29,7 @@ instance {c : Char} : NoPrefixForwardPatternModel c :=
theorem isMatch_iff {c : Char} {s : Slice} {pos : s.Pos} :
IsMatch c pos ↔
∃ (h : s.startPos ≠ s.endPos), pos = s.startPos.next h ∧ s.startPos.get h = c := by
simp only [Model.isMatch_iff, ForwardPatternModel.Matches]
rw [sliceTo_copy_eq_iff_exists_splits]
simp only [Model.isMatch_iff, ForwardPatternModel.Matches, sliceTo_copy_eq_iff_exists_splits]
refine ⟨?_, ?_⟩
· simp only [splits_singleton_iff]
exact fun ⟨t₂, h, h₁, h₂, h₃⟩ => ⟨h, h₁, h₂⟩
@@ -38,12 +41,43 @@ theorem isLongestMatch_iff {c : Char} {s : Slice} {pos : s.Pos} :
∃ (h : s.startPos ≠ s.endPos), pos = s.startPos.next h ∧ s.startPos.get h = c := by
rw [isLongestMatch_iff_isMatch, isMatch_iff]
theorem isLongestMatchAt_iff {c : Char} {s : Slice} {pos pos' : s.Pos} :
IsLongestMatchAt c pos pos' ↔ ∃ h, pos' = pos.next h ∧ pos.get h = c := by
simp +contextual [Model.isLongestMatchAt_iff, isLongestMatch_iff, Pos.ofSliceFrom_inj,
Pos.get_eq_get_ofSliceFrom, Pos.ofSliceFrom_next]
theorem isLongestMatchAt_of_get_eq {c : Char} {s : Slice} {pos : s.Pos} {h : pos s.endPos}
(hc : pos.get h = c) : IsLongestMatchAt c pos (pos.next h) :=
isLongestMatchAt_iff.2 ⟨h, by simp [hc]⟩
instance {c : Char} : LawfulForwardPatternModel c where
dropPrefix?_eq_some_iff {s} pos := by
simp [isLongestMatch_iff, ForwardPattern.dropPrefix?]
exact ⟨fun ⟨h, h₁, h₂⟩ => ⟨h, h₂.symm, h₁⟩, fun ⟨h, h₁, h₂⟩ => ⟨h, h₂, h₁.symm⟩⟩
simp [isLongestMatch_iff, ForwardPattern.dropPrefix?, and_comm, eq_comm (b := pos)]
instance {c : Char} : LawfulToForwardSearcherModel c :=
.defaultImplementation
theorem matchesAt_iff {c : Char} {s : Slice} {pos : s.Pos} :
MatchesAt c pos ↔ ∃ (h : pos ≠ s.endPos), pos.get h = c := by
simp [matchesAt_iff_exists_isLongestMatchAt, isLongestMatchAt_iff, exists_comm]
theorem matchesAt_iff_splits {c : Char} {s : Slice} {pos : s.Pos} :
MatchesAt c pos ↔ ∃ t₁ t₂, pos.Splits t₁ (singleton c ++ t₂) := by
rw [matchesAt_iff]
refine ⟨?_, ?_⟩
· rintro ⟨h, rfl⟩
exact ⟨_, _, pos.splits_next_right h⟩
· rintro ⟨t₁, t₂, hs⟩
have hne := hs.ne_endPos_of_singleton
exact ⟨hne, (singleton_append_inj.mp (hs.eq_right (pos.splits_next_right hne))).1.symm⟩
theorem not_matchesAt_of_get_ne {c : Char} {s : Slice} {pos : s.Pos} {h : pos ≠ s.endPos}
(hc : pos.get h ≠ c) : ¬ MatchesAt c pos := by
simp [matchesAt_iff, hc]
theorem matchAt?_eq {s : Slice} {pos : s.Pos} {c : Char} :
matchAt? c pos =
if h₀ : ∃ (h : pos ≠ s.endPos), pos.get h = c then some (pos.next h₀.1) else none := by
split <;> simp_all [isLongestMatchAt_iff, matchesAt_iff]
end String.Slice.Pattern.Model.Char


@@ -0,0 +1,11 @@
/-
Copyright (c) 2026 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Markus Himmel
-/
module
prelude
public import Init.Data.String.Lemmas.Pattern.Find.Basic
public import Init.Data.String.Lemmas.Pattern.Find.Char
public import Init.Data.String.Lemmas.Pattern.Find.Pred


@@ -0,0 +1,129 @@
/-
Copyright (c) 2026 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Markus Himmel
-/
module
prelude
public import Init.Data.String.Slice
public import Init.Data.String.Search
public import Init.Data.String.Lemmas.Pattern.Basic
import all Init.Data.String.Slice
import all Init.Data.String.Search
import Init.Data.Iterators.Lemmas.Consumers.Loop
import Init.Data.String.Lemmas.Order
import Init.Data.String.Lemmas.Basic
import Init.Data.String.OrderInstances
import Init.Grind
public section
open Std String.Slice Pattern Pattern.Model
namespace String.Slice
theorem Pattern.Model.find?_eq_some_iff {ρ : Type} (pat : ρ) [ForwardPatternModel pat] {σ : Slice → Type}
[∀ s, Iterator (σ s) Id (SearchStep s)] [∀ s, Iterators.Finite (σ s) Id]
[∀ s, IteratorLoop (σ s) Id Id] [∀ s, LawfulIteratorLoop (σ s) Id Id]
[ToForwardSearcher pat σ] [LawfulToForwardSearcherModel pat] {s : Slice} {pos : s.Pos} :
s.find? pat = some pos ↔ MatchesAt pat pos ∧ (∀ pos', pos' < pos → ¬ MatchesAt pat pos') := by
rw [find?, Iter.findSome?_toList]
suffices ∀ (l : List (SearchStep s)) (pos : s.Pos) (hl : IsValidSearchFrom pat pos l) (pos' : s.Pos),
l.findSome? (fun | .matched s _ => some s | .rejected .. => none) = some pos' ↔
pos ≤ pos' ∧ MatchesAt pat pos' ∧ ∀ pos'', pos ≤ pos'' → pos'' < pos' → ¬ MatchesAt pat pos'' by
simpa using this (ToForwardSearcher.toSearcher pat s).toList s.startPos
(LawfulToForwardSearcherModel.isValidSearchFrom_toList s) pos
intro l pos hl pos'
induction hl with
| endPos => simp +contextual
| matched h₁ _ _ => have := h₁.matchesAt; grind
| mismatched => grind
theorem Pattern.Model.find?_eq_none_iff {ρ : Type} (pat : ρ) [ForwardPatternModel pat] {σ : Slice → Type}
[∀ s, Iterator (σ s) Id (SearchStep s)] [∀ s, Iterators.Finite (σ s) Id]
[∀ s, IteratorLoop (σ s) Id Id] [∀ s, LawfulIteratorLoop (σ s) Id Id]
[ToForwardSearcher pat σ] [LawfulToForwardSearcherModel pat] {s : Slice} :
s.find? pat = none ↔ ∀ (pos : s.Pos), ¬ MatchesAt pat pos := by
simp only [Option.eq_none_iff_forall_ne_some, ne_eq, find?_eq_some_iff, not_and,
Classical.not_forall, Classical.not_not]
refine ⟨fun _ pos => ?_, by grind⟩
induction pos using WellFounded.induction Pos.wellFounded_lt with grind
@[simp]
theorem isSome_find? {ρ : Type} (pat : ρ) {σ : Slice → Type}
[∀ s, Iterator (σ s) Id (SearchStep s)] [∀ s, Iterators.Finite (σ s) Id]
[∀ s, IteratorLoop (σ s) Id Id] [∀ s, LawfulIteratorLoop (σ s) Id Id]
[ToForwardSearcher pat σ] {s : Slice} : (s.find? pat).isSome = s.contains pat := by
rw [find?, contains, Iter.findSome?_toList, Iter.any_toList]
induction (ToForwardSearcher.toSearcher pat s).toList <;> grind
@[simp]
theorem find?_eq_none_iff {ρ : Type} (pat : ρ) {σ : Slice → Type}
[∀ s, Iterator (σ s) Id (SearchStep s)] [∀ s, Iterators.Finite (σ s) Id]
[∀ s, IteratorLoop (σ s) Id Id] [∀ s, LawfulIteratorLoop (σ s) Id Id]
[ToForwardSearcher pat σ] {s : Slice} : s.find? pat = none ↔ s.contains pat = false := by
rw [← Option.isNone_iff_eq_none, ← Option.isSome_eq_false_iff, isSome_find?]
theorem Pattern.Model.contains_eq_false_iff {ρ : Type} (pat : ρ) [ForwardPatternModel pat] {σ : Slice → Type}
[∀ s, Iterator (σ s) Id (SearchStep s)] [∀ s, Iterators.Finite (σ s) Id]
[∀ s, IteratorLoop (σ s) Id Id] [∀ s, LawfulIteratorLoop (σ s) Id Id]
[ToForwardSearcher pat σ] [LawfulToForwardSearcherModel pat] {s : Slice} :
s.contains pat = false ↔ ∀ (pos : s.Pos), ¬ MatchesAt pat pos := by
rw [← find?_eq_none_iff, Slice.find?_eq_none_iff]
theorem Pattern.Model.contains_eq_true_iff {ρ : Type} (pat : ρ) [ForwardPatternModel pat] {σ : Slice → Type}
[∀ s, Iterator (σ s) Id (SearchStep s)] [∀ s, Iterators.Finite (σ s) Id]
[∀ s, IteratorLoop (σ s) Id Id] [∀ s, LawfulIteratorLoop (σ s) Id Id]
[ToForwardSearcher pat σ] [LawfulToForwardSearcherModel pat] {s : Slice} :
s.contains pat ↔ ∃ (pos : s.Pos), MatchesAt pat pos := by
simp [← Bool.not_eq_false, contains_eq_false_iff]
theorem Pos.find?_eq_find?_sliceFrom {ρ : Type} {pat : ρ} {σ : Slice → Type}
[∀ s, Iterator (σ s) Id (SearchStep s)] [∀ s, IteratorLoop (σ s) Id Id] [ToForwardSearcher pat σ]
{s : Slice} {p : s.Pos} :
p.find? pat = ((s.sliceFrom p).find? pat).map Pos.ofSliceFrom :=
(rfl)
theorem Pattern.Model.posFind?_eq_some_iff {ρ : Type} {pat : ρ} [ForwardPatternModel pat] {σ : Slice → Type}
[∀ s, Iterator (σ s) Id (SearchStep s)] [∀ s, Iterators.Finite (σ s) Id]
[∀ s, IteratorLoop (σ s) Id Id] [∀ s, LawfulIteratorLoop (σ s) Id Id]
[ToForwardSearcher pat σ] [LawfulToForwardSearcherModel pat] {s : Slice} {pos pos' : s.Pos} :
pos.find? pat = some pos' ↔ pos ≤ pos' ∧ MatchesAt pat pos' ∧ (∀ pos'', pos ≤ pos'' → pos'' < pos' → ¬ MatchesAt pat pos'') := by
simp only [Pos.find?_eq_find?_sliceFrom, Option.map_eq_some_iff, find?_eq_some_iff,
matchesAt_iff_matchesAt_ofSliceFrom]
refine ⟨?_, ?_⟩
· rintro ⟨pos', ⟨h₁, h₂⟩, rfl⟩
refine ⟨Pos.le_ofSliceFrom, h₁, fun p hp₁ hp₂ => ?_⟩
simpa using h₂ (Pos.sliceFrom _ _ hp₁) (Pos.sliceFrom_lt_iff.2 hp₂)
· rintro ⟨h₁, h₂, h₃⟩
refine ⟨Pos.sliceFrom _ _ h₁, ⟨by simpa using h₂, fun p hp₁ hp₂ => ?_⟩, by simp⟩
exact h₃ (Pos.ofSliceFrom p) Slice.Pos.le_ofSliceFrom (Pos.lt_sliceFrom_iff.1 hp₁) hp₂
theorem Pattern.Model.posFind?_eq_none_iff {ρ : Type} {pat : ρ} [ForwardPatternModel pat] {σ : Slice → Type}
[∀ s, Iterator (σ s) Id (SearchStep s)] [∀ s, Iterators.Finite (σ s) Id]
[∀ s, IteratorLoop (σ s) Id Id] [∀ s, LawfulIteratorLoop (σ s) Id Id]
[ToForwardSearcher pat σ] [LawfulToForwardSearcherModel pat] {s : Slice} {pos : s.Pos} :
pos.find? pat = none ↔ ∀ pos', pos ≤ pos' → ¬ MatchesAt pat pos' := by
rw [Pos.find?_eq_find?_sliceFrom, Option.map_eq_none_iff, Pattern.Model.find?_eq_none_iff]
simpa only [matchesAt_iff_matchesAt_ofSliceFrom] using ⟨fun h p hp =>
by simpa using h (Pos.sliceFrom _ _ hp), fun h p => by simpa using h _ Pos.le_ofSliceFrom⟩
end Slice
theorem Pos.find?_eq_find?_toSlice {ρ : Type} {pat : ρ} {σ : Slice → Type}
[∀ s, Iterator (σ s) Id (SearchStep s)] [∀ s, IteratorLoop (σ s) Id Id] [ToForwardSearcher pat σ]
{s : String} {p : s.Pos} : p.find? pat = (p.toSlice.find? pat).map Pos.ofToSlice :=
(rfl)
theorem find?_eq_find?_toSlice {ρ : Type} {pat : ρ} {σ : Slice → Type}
[∀ s, Iterator (σ s) Id (SearchStep s)] [∀ s, IteratorLoop (σ s) Id Id] [ToForwardSearcher pat σ]
{s : String} : s.find? pat = (s.toSlice.find? pat).map Pos.ofToSlice :=
(rfl)
theorem contains_eq_contains_toSlice {ρ : Type} {pat : ρ} {σ : Slice → Type}
[∀ s, Iterator (σ s) Id (SearchStep s)] [∀ s, IteratorLoop (σ s) Id Id] [ToForwardSearcher pat σ]
{s : String} : s.contains pat = s.toSlice.contains pat :=
(rfl)
end String


@@ -0,0 +1,174 @@
/-
Copyright (c) 2026 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Markus Himmel
-/
module
prelude
public import Init.Data.String.Slice
import Init.Data.String.Lemmas.Pattern.Find.Basic
import Init.Data.String.Lemmas.Pattern.Char
import Init.Data.String.Lemmas.Basic
import Init.Data.String.Lemmas.Order
import Init.Data.String.Termination
import Init.Data.String.Lemmas.Iterate
import Init.Grind
import Init.Data.Option.Lemmas
import Init.Data.String.OrderInstances
namespace String.Slice
theorem find?_char_eq_some_iff {c : Char} {s : Slice} {pos : s.Pos} :
s.find? c = some pos ↔
∃ h, pos.get h = c ∧ ∀ pos', (h' : pos' < pos) → pos'.get (Pos.ne_endPos_of_lt h') ≠ c := by
grind [Pattern.Model.find?_eq_some_iff, Pattern.Model.Char.matchesAt_iff]
@[simp]
theorem contains_char_eq {c : Char} {s : Slice} : s.contains c = decide (c ∈ s.copy.toList) := by
rw [Bool.eq_iff_iff, Pattern.Model.contains_eq_true_iff]
simp [Pattern.Model.Char.matchesAt_iff, mem_toList_copy_iff_exists_get]
theorem find?_char_eq_some_iff_splits {c : Char} {s : Slice} {pos : s.Pos} :
s.find? c = some pos ↔ ∃ t u, pos.Splits t (singleton c ++ u) ∧ c ∉ t.toList := by
rw [find?_char_eq_some_iff]
refine ⟨?_, ?_⟩
· rintro ⟨h, hget, hmin⟩
refine ⟨_, _, hget ▸ pos.splits_next_right h, fun hmem => ?_⟩
obtain ⟨pos', hlt, hpget⟩ := (hget ▸ pos.splits_next_right h).mem_toList_left_iff.mp hmem
exact absurd hpget (hmin _ hlt)
· rintro ⟨t, u, hs, hnotin⟩
have hne := hs.ne_endPos_of_singleton
exact ⟨hne, (singleton_append_inj.mp (hs.eq_right (pos.splits_next_right hne))).1.symm,
fun pos' hlt hget => hnotin (hs.mem_toList_left_iff.mpr ⟨pos', hlt, hget⟩)⟩
theorem Pos.find?_char_eq_some_iff {c : Char} {s : Slice} {pos pos' : s.Pos} :
pos.find? c = some pos' ↔
pos ≤ pos' ∧ (∃ h, pos'.get h = c) ∧
∀ pos'', pos ≤ pos'' → (h' : pos'' < pos') → pos''.get (Pos.ne_endPos_of_lt h') ≠ c := by
grind [Pattern.Model.posFind?_eq_some_iff, Pattern.Model.Char.matchesAt_iff]
theorem Pos.find?_char_eq_some_iff_splits {c : Char} {s : Slice} {pos : s.Pos}
{t u : String} (hs : pos.Splits t u) {pos' : s.Pos} :
pos.find? c = some pos' ↔ ∃ v w, pos'.Splits (t ++ v) (singleton c ++ w) ∧ c ∉ v.toList := by
rw [Pos.find?_char_eq_some_iff]
refine ⟨?_, ?_⟩
· rintro ⟨hle, ⟨hne, hget⟩, hmin⟩
have hsplit := hget ▸ pos'.splits_next_right hne
obtain ⟨v, hv1, hv2⟩ := (hs.le_iff_exists_eq_append hsplit).mp hle
refine ⟨v, _, hsplit.of_eq hv1 rfl, fun hmem => ?_⟩
obtain ⟨_, hcopy⟩ :=
Slice.copy_slice_eq_iff_splits.mpr ⟨t, _, hs.of_eq rfl hv2, hsplit.of_eq hv1 rfl⟩
rw [← hcopy] at hmem
obtain ⟨p, hp, hpget⟩ := mem_toList_copy_iff_exists_get.mp hmem
have hlt : Pos.ofSlice p < pos' := by
simpa [← Slice.Pos.lt_endPos_iff, Pos.ofSlice_lt_ofSlice_iff] using hp
exact absurd (Pos.get_eq_get_ofSlice ▸ hpget) (hmin _ Pos.le_ofSlice hlt)
· rintro ⟨v, w, hsplit, hnotin⟩
have hne := hsplit.ne_endPos_of_singleton
have hu : u = v ++ (singleton c ++ w) :=
append_right_inj t |>.mp (hs.eq_append.symm.trans (by rw [hsplit.eq_append, append_assoc]))
have hle : pos ≤ pos' := (hs.le_iff_exists_eq_append hsplit).mpr ⟨v, rfl, hu⟩
refine ⟨hle,
⟨hne, (singleton_append_inj.mp (hsplit.eq_right (pos'.splits_next_right hne))).1.symm⟩,
fun pos'' hle' hlt hget => hnotin ?_⟩
obtain ⟨_, hcopy⟩ :=
Slice.copy_slice_eq_iff_splits.mpr ⟨t, _, hs.of_eq rfl hu, hsplit⟩
rw [← hcopy]
exact mem_toList_copy_iff_exists_get.mpr
⟨pos''.slice pos pos' hle' (Std.le_of_lt hlt),
fun h => Std.ne_of_lt hlt
(by rw [← Slice.Pos.ofSlice_slice (h₁ := hle') (h₂ := Std.le_of_lt hlt), h,
Slice.Pos.ofSlice_endPos]),
by rw [Slice.Pos.get_eq_get_ofSlice]
simp [Slice.Pos.ofSlice_slice]
exact hget⟩
theorem Pos.find?_char_eq_none_iff {c : Char} {s : Slice} {pos : s.Pos} :
pos.find? c = none ↔ ∀ pos', pos ≤ pos' → (h : pos' ≠ s.endPos) → pos'.get h ≠ c := by
grind [Pattern.Model.posFind?_eq_none_iff, Pattern.Model.Char.matchesAt_iff]
theorem Pos.find?_char_eq_none_iff_not_mem_of_splits {c : Char} {s : Slice} {pos : s.Pos}
{t u : String} (hs : pos.Splits t u) :
pos.find? c = none ↔ c ∉ u.toList := by
simp [Pos.find?_char_eq_none_iff, hs.mem_toList_right_iff]
end Slice
theorem Pos.find?_char_eq_some_iff {c : Char} {s : String} {pos pos' : s.Pos} :
pos.find? c = some pos' ↔
pos ≤ pos' ∧ (∃ h, pos'.get h = c) ∧
∀ pos'', pos ≤ pos'' → (h' : pos'' < pos') → pos''.get (Pos.ne_endPos_of_lt h') ≠ c := by
simp only [Pos.find?_eq_find?_toSlice, Option.map_eq_some_iff,
Slice.Pos.find?_char_eq_some_iff, ne_eq, endPos_toSlice]
refine ⟨?_, ?_⟩
· rintro ⟨pos', ⟨h₁, ⟨h₂, rfl⟩, h₃⟩, rfl⟩
refine ⟨by simpa [Pos.ofToSlice_le_iff] using h₁,
⟨by simpa [← Pos.ofToSlice_inj] using h₂, by simp [Pos.get_ofToSlice]⟩, ?_⟩
intro pos'' h₄ h₅
simpa using h₃ pos''.toSlice (by simpa [Pos.toSlice_le] using h₄) (by simpa using h₅)
· rintro ⟨h₁, ⟨h₂, hget⟩, h₃⟩
refine ⟨pos'.toSlice, ⟨by simpa [Pos.toSlice_le] using h₁,
⟨by simpa [← Pos.toSlice_inj] using h₂, by simpa using hget⟩, fun p hp₁ hp₂ => ?_⟩,
by simp⟩
simpa using h₃ (Pos.ofToSlice p)
(by simpa [Pos.ofToSlice_le_iff] using hp₁) (by simpa using hp₂)
theorem Pos.find?_char_eq_some_iff_splits {c : Char} {s : String} {pos : s.Pos}
{t u : String} (hs : pos.Splits t u) {pos' : s.Pos} :
pos.find? c = some pos' ↔ ∃ v w, pos'.Splits (t ++ v) (singleton c ++ w) ∧ c ∉ v.toList := by
simp only [Pos.find?_eq_find?_toSlice, Option.map_eq_some_iff,
Slice.Pos.find?_char_eq_some_iff_splits (Pos.splits_toSlice_iff.mpr hs)]
constructor
· rintro ⟨q, ⟨v, w, hsplit, hnotin⟩, rfl⟩
exact ⟨v, w, Slice.Pos.splits_ofToSlice_iff.mpr hsplit, hnotin⟩
· rintro ⟨v, w, hsplit, hnotin⟩
exact ⟨pos'.toSlice, ⟨v, w, Pos.splits_toSlice_iff.mpr hsplit, hnotin⟩, by simp⟩
theorem Pos.find?_char_eq_none_iff {c : Char} {s : String} {pos : s.Pos} :
pos.find? c = none ↔ ∀ pos', pos ≤ pos' → (h : pos' ≠ s.endPos) → pos'.get h ≠ c := by
simp only [Pos.find?_eq_find?_toSlice, Option.map_eq_none_iff,
Slice.Pos.find?_char_eq_none_iff, endPos_toSlice]
refine ⟨?_, ?_⟩
· intro h pos' h₁ h₂
simpa [Pos.get_ofToSlice] using
h pos'.toSlice (by simpa [Pos.toSlice_le] using h₁) (by simpa [← Pos.toSlice_inj] using h₂)
· intro h pos' h₁ h₂
simpa using h (Pos.ofToSlice pos')
(by simpa [Pos.ofToSlice_le_iff] using h₁) (by simpa [← Pos.ofToSlice_inj] using h₂)
theorem Pos.find?_char_eq_none_iff_not_mem_of_splits {c : Char} {s : String} {pos : s.Pos}
{t u : String} (hs : pos.Splits t u) :
pos.find? c = none ↔ c ∉ u.toList := by
rw [Pos.find?_eq_find?_toSlice, Option.map_eq_none_iff]
exact Slice.Pos.find?_char_eq_none_iff_not_mem_of_splits (Pos.splits_toSlice_iff.mpr hs)
theorem find?_char_eq_some_iff {c : Char} {s : String} {pos : s.Pos} :
s.find? c = some pos ↔
∃ h, pos.get h = c ∧ ∀ pos', (h' : pos' < pos) → pos'.get (Pos.ne_endPos_of_lt h') ≠ c := by
simp only [find?_eq_find?_toSlice, Option.map_eq_some_iff, Slice.find?_char_eq_some_iff, ne_eq,
endPos_toSlice, exists_and_right]
refine ⟨?_, ?_⟩
· rintro ⟨pos, ⟨⟨h, rfl⟩, h'⟩, rfl⟩
refine ⟨by simpa [← Pos.ofToSlice_inj] using h, by simp [Pos.get_ofToSlice], ?_⟩
intro pos' hp
simpa using h' pos'.toSlice hp
· rintro ⟨h, hget, hmin⟩
exact ⟨pos.toSlice, ⟨⟨by simpa [← Pos.toSlice_inj] using h, by simpa using hget⟩,
fun pos' hp => by simpa using hmin (Pos.ofToSlice pos') hp⟩, by simp⟩
theorem find?_char_eq_some_iff_splits {c : Char} {s : String} {pos : s.Pos} :
s.find? c = some pos ↔ ∃ t u, pos.Splits t (singleton c ++ u) ∧ c ∉ t.toList := by
simp only [find?_eq_find?_toSlice, Option.map_eq_some_iff,
Slice.find?_char_eq_some_iff_splits]
constructor
· rintro ⟨q, ⟨t, u, hsplit, hnotin⟩, rfl⟩
exact ⟨t, u, Slice.Pos.splits_ofToSlice_iff.mpr hsplit, hnotin⟩
· rintro ⟨t, u, hsplit, hnotin⟩
exact ⟨pos.toSlice, ⟨t, u, Pos.splits_toSlice_iff.mpr hsplit, hnotin⟩, by simp⟩
@[simp]
theorem contains_char_eq {c : Char} {s : String} : s.contains c = decide (c ∈ s.toList) := by
simp [contains_eq_contains_toSlice, Slice.contains_char_eq, copy_toSlice]
end String


@@ -0,0 +1,367 @@
/-
Copyright (c) 2026 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Markus Himmel
-/
module
prelude
public import Init.Data.String.Slice
import Init.Data.String.Lemmas.Pattern.Find.Basic
import Init.Data.String.Lemmas.Pattern.Pred
import Init.Data.String.Lemmas.Basic
import Init.Data.String.Lemmas.Order
import Init.Data.String.Termination
import Init.Data.String.Lemmas.Iterate
import Init.Grind
import Init.Data.Option.Lemmas
import Init.Data.String.OrderInstances
namespace String.Slice
theorem find?_bool_eq_some_iff {p : Char → Bool} {s : Slice} {pos : s.Pos} :
s.find? p = some pos ↔
∃ h, p (pos.get h) ∧ ∀ pos', (h' : pos' < pos) → p (pos'.get (Pos.ne_endPos_of_lt h')) = false := by
grind [Pattern.Model.find?_eq_some_iff, Pattern.Model.CharPred.matchesAt_iff]
theorem find?_prop_eq_some_iff {p : Char → Prop} [DecidablePred p] {s : Slice} {pos : s.Pos} :
s.find? p = some pos ↔
∃ h, p (pos.get h) ∧ ∀ pos', (h' : pos' < pos) → ¬ p (pos'.get (Pos.ne_endPos_of_lt h')) := by
grind [Pattern.Model.find?_eq_some_iff, Pattern.Model.CharPred.Decidable.matchesAt_iff]
theorem find?_bool_eq_some_iff_splits {p : Char → Bool} {s : Slice} {pos : s.Pos} :
s.find? p = some pos ↔
∃ t c u, pos.Splits t (singleton c ++ u) ∧ p c ∧ ∀ d ∈ t.toList, p d = false := by
rw [find?_bool_eq_some_iff]
refine ⟨?_, ?_⟩
· rintro ⟨h, hp, hmin⟩
exact ⟨_, _, _, pos.splits_next_right h, hp, fun d hd => by
obtain ⟨pos', hlt, hpget⟩ := (pos.splits_next_right h).mem_toList_left_iff.mp hd
subst hpget; exact hmin _ hlt⟩
· rintro ⟨t, c, u, hs, hpc, hmin⟩
have hne := hs.ne_endPos_of_singleton
refine ⟨hne, ?_, fun pos' hlt => hmin _ (hs.mem_toList_left_iff.mpr ⟨pos', hlt, rfl⟩)⟩
rw [(singleton_append_inj.mp (hs.eq_right (pos.splits_next_right hne))).1.symm]
exact hpc
theorem find?_prop_eq_some_iff_splits {p : Char → Prop} [DecidablePred p] {s : Slice}
{pos : s.Pos} :
s.find? p = some pos ↔
∃ t c u, pos.Splits t (singleton c ++ u) ∧ p c ∧ ∀ d ∈ t.toList, ¬ p d := by
rw [find?_prop_eq_some_iff]
refine ⟨?_, ?_⟩
· rintro ⟨h, hp, hmin⟩
exact ⟨_, _, _, pos.splits_next_right h, hp, fun d hd => by
obtain ⟨pos', hlt, hpget⟩ := (pos.splits_next_right h).mem_toList_left_iff.mp hd
subst hpget; exact hmin _ hlt⟩
· rintro ⟨t, c, u, hs, hpc, hmin⟩
have hne := hs.ne_endPos_of_singleton
refine ⟨hne, ?_, fun pos' hlt => hmin _ (hs.mem_toList_left_iff.mpr ⟨pos', hlt, rfl⟩)⟩
rw [(singleton_append_inj.mp (hs.eq_right (pos.splits_next_right hne))).1.symm]
exact hpc
@[simp]
theorem contains_bool_eq {p : Char → Bool} {s : Slice} : s.contains p = s.copy.toList.any p := by
rw [Bool.eq_iff_iff, Pattern.Model.contains_eq_true_iff]
simp only [Pattern.Model.CharPred.matchesAt_iff, ne_eq, List.any_eq_true,
mem_toList_copy_iff_exists_get]
exact ⟨fun ⟨pos, h, hp⟩ => ⟨_, ⟨_, _, rfl⟩, hp⟩, fun ⟨_, ⟨p, h, h'⟩, hp⟩ => ⟨p, h, h' ▸ hp⟩⟩
@[simp]
theorem contains_prop_eq {p : Char → Prop} [DecidablePred p] {s : Slice} :
s.contains p = s.copy.toList.any p := by
rw [Bool.eq_iff_iff, Pattern.Model.contains_eq_true_iff]
simp only [Pattern.Model.CharPred.Decidable.matchesAt_iff, ne_eq, List.any_eq_true,
mem_toList_copy_iff_exists_get, decide_eq_true_eq]
exact ⟨fun ⟨pos, h, hp⟩ => ⟨_, ⟨_, _, rfl⟩, hp⟩, fun ⟨_, ⟨p, h, h'⟩, hp⟩ => ⟨p, h, h' ▸ hp⟩⟩
theorem Pos.find?_bool_eq_some_iff {p : Char → Bool} {s : Slice} {pos pos' : s.Pos} :
pos.find? p = some pos' ↔
pos ≤ pos' ∧ (∃ h, p (pos'.get h)) ∧
∀ pos'', pos ≤ pos'' → (h' : pos'' < pos') →
p (pos''.get (Pos.ne_endPos_of_lt h')) = false := by
grind [Pattern.Model.posFind?_eq_some_iff, Pattern.Model.CharPred.matchesAt_iff]
theorem Pos.find?_bool_eq_some_iff_splits {p : Char → Bool} {s : Slice} {pos : s.Pos}
{t u : String} (hs : pos.Splits t u) {pos' : s.Pos} :
pos.find? p = some pos' ↔
∃ v c w, pos'.Splits (t ++ v) (singleton c ++ w) ∧ p c ∧
∀ d ∈ v.toList, p d = false := by
rw [Pos.find?_bool_eq_some_iff]
refine ⟨?_, ?_⟩
· rintro ⟨hle, ⟨hne, hp⟩, hmin⟩
have hsplit := pos'.splits_next_right hne
obtain ⟨v, hv1, hv2⟩ := (hs.le_iff_exists_eq_append hsplit).mp hle
refine ⟨v, pos'.get hne, _, hsplit.of_eq hv1 rfl, hp, fun d hd => ?_⟩
obtain ⟨_, hcopy⟩ :=
Slice.copy_slice_eq_iff_splits.mpr ⟨t, _, hs.of_eq rfl hv2, hsplit.of_eq hv1 rfl⟩
rw [← hcopy] at hd
obtain ⟨q, hq, hqget⟩ := mem_toList_copy_iff_exists_get.mp hd
have hlt : Pos.ofSlice q < pos' := by
simpa [← Slice.Pos.lt_endPos_iff, Pos.ofSlice_lt_ofSlice_iff] using hq
subst hqget; rw [Slice.Pos.get_eq_get_ofSlice]; exact hmin _ Pos.le_ofSlice hlt
· rintro ⟨v, c, w, hsplit, hpc, hmin⟩
have hne := hsplit.ne_endPos_of_singleton
have hu : u = v ++ (singleton c ++ w) :=
append_right_inj t |>.mp (hs.eq_append.symm.trans (by rw [hsplit.eq_append, append_assoc]))
have hle : pos ≤ pos' := (hs.le_iff_exists_eq_append hsplit).mpr ⟨v, rfl, hu⟩
refine ⟨hle, ⟨hne, ?_⟩, fun pos'' hle' hlt => hmin _ ?_⟩
· rw [(singleton_append_inj.mp (hsplit.eq_right (pos'.splits_next_right hne))).1.symm]
exact hpc
· obtain ⟨_, hcopy⟩ :=
Slice.copy_slice_eq_iff_splits.mpr ⟨t, _, hs.of_eq rfl hu, hsplit⟩
rw [← hcopy]
exact mem_toList_copy_iff_exists_get.mpr
⟨pos''.slice pos pos' hle' (Std.le_of_lt hlt),
fun h => Std.ne_of_lt hlt
(by rw [← Slice.Pos.ofSlice_slice (h₁ := hle') (h₂ := Std.le_of_lt hlt), h,
Slice.Pos.ofSlice_endPos]),
by rw [Slice.Pos.get_eq_get_ofSlice]
simp [Slice.Pos.ofSlice_slice]⟩
theorem Pos.find?_bool_eq_none_iff {p : Char → Bool} {s : Slice} {pos : s.Pos} :
pos.find? p = none ↔
∀ pos', pos ≤ pos' → (h : pos' ≠ s.endPos) → p (pos'.get h) = false := by
grind [Pattern.Model.posFind?_eq_none_iff, Pattern.Model.CharPred.matchesAt_iff]
theorem Pos.find?_bool_eq_none_iff_of_splits {p : Char → Bool} {s : Slice} {pos : s.Pos}
{t u : String} (hs : pos.Splits t u) :
pos.find? p = none ↔ ∀ c ∈ u.toList, p c = false := by
rw [Pos.find?_bool_eq_none_iff]
constructor
· intro h c hc
obtain ⟨pos', hle, hne, hget⟩ := hs.mem_toList_right_iff.mp hc
subst hget; exact h pos' hle hne
· intro h pos' hle hne
exact h _ (hs.mem_toList_right_iff.mpr ⟨pos', hle, hne, rfl⟩)
theorem Pos.find?_prop_eq_some_iff {p : Char Prop} [DecidablePred p] {s : Slice}
{pos pos' : s.Pos} :
pos.find? p = some pos'
pos pos' ( h, p (pos'.get h))
pos'', pos pos'' (h' : pos'' < pos')
¬ p (pos''.get (Pos.ne_endPos_of_lt h')) := by
grind [Pattern.Model.posFind?_eq_some_iff, Pattern.Model.CharPred.Decidable.matchesAt_iff]
theorem Pos.find?_prop_eq_some_iff_splits {p : Char → Prop} [DecidablePred p] {s : Slice}
    {pos : s.Pos} {t u : String} (hs : pos.Splits t u) {pos' : s.Pos} :
    pos.find? p = some pos' ↔
      ∃ v c w, pos'.Splits (t ++ v) (singleton c ++ w) ∧ p c ∧ ∀ d ∈ v.toList, ¬ p d := by
  rw [Pos.find?_prop_eq_some_iff]
  refine ⟨?_, ?_⟩
  · rintro ⟨hle, ⟨hne, hp⟩, hmin⟩
    have hsplit := pos'.splits_next_right hne
    obtain ⟨v, hv1, hv2⟩ := (hs.le_iff_exists_eq_append hsplit).mp hle
    refine ⟨v, pos'.get hne, _, hsplit.of_eq hv1 rfl, hp, fun d hd => ?_⟩
    obtain ⟨_, hcopy⟩ :=
      Slice.copy_slice_eq_iff_splits.mpr ⟨t, _, hs.of_eq rfl hv2, hsplit.of_eq hv1 rfl⟩
    rw [← hcopy] at hd
    obtain ⟨q, hq, hqget⟩ := mem_toList_copy_iff_exists_get.mp hd
    have hlt : Pos.ofSlice q < pos' := by
      simpa [← Slice.Pos.lt_endPos_iff, Pos.ofSlice_lt_ofSlice_iff] using hq
    subst hqget; rw [Slice.Pos.get_eq_get_ofSlice]; exact hmin _ Pos.le_ofSlice hlt
  · rintro ⟨v, c, w, hsplit, hpc, hmin⟩
    have hne := hsplit.ne_endPos_of_singleton
    have hu : u = v ++ (singleton c ++ w) :=
      append_right_inj t |>.mp (hs.eq_append.symm.trans (by rw [hsplit.eq_append, append_assoc]))
    have hle : pos ≤ pos' := (hs.le_iff_exists_eq_append hsplit).mpr ⟨v, rfl, hu⟩
    refine ⟨hle, ⟨hne, ?_⟩, fun pos'' hle' hlt => hmin _ ?_⟩
    · rw [(singleton_append_inj.mp (hsplit.eq_right (pos'.splits_next_right hne))).1.symm]
      exact hpc
    · obtain ⟨_, hcopy⟩ :=
        Slice.copy_slice_eq_iff_splits.mpr ⟨t, _, hs.of_eq rfl hu, hsplit⟩
      rw [← hcopy]
      exact mem_toList_copy_iff_exists_get.mpr
        ⟨pos''.slice pos pos' hle' (Std.le_of_lt hlt),
          fun h => Std.ne_of_lt hlt
            (by rw [← Slice.Pos.ofSlice_slice (h₁ := hle') (h₂ := Std.le_of_lt hlt), h,
              Slice.Pos.ofSlice_endPos]),
          by rw [Slice.Pos.get_eq_get_ofSlice]
             simp [Slice.Pos.ofSlice_slice]⟩
theorem Pos.find?_prop_eq_none_iff {p : Char → Prop} [DecidablePred p] {s : Slice} {pos : s.Pos} :
    pos.find? p = none ↔
      ∀ pos', pos ≤ pos' → (h : pos' ≠ s.endPos) → ¬ p (pos'.get h) := by
  grind [Pattern.Model.posFind?_eq_none_iff, Pattern.Model.CharPred.Decidable.matchesAt_iff]
theorem Pos.find?_prop_eq_none_iff_of_splits {p : Char → Prop} [DecidablePred p] {s : Slice}
    {pos : s.Pos} {t u : String} (hs : pos.Splits t u) :
    pos.find? p = none ↔ ∀ c ∈ u.toList, ¬ p c := by
  rw [Pos.find?_prop_eq_none_iff]
  constructor
  · intro h c hc
    obtain ⟨pos', hle, hne, hget⟩ := hs.mem_toList_right_iff.mp hc
    subst hget; exact h pos' hle hne
  · intro h pos' hle hne
    exact h _ (hs.mem_toList_right_iff.mpr ⟨pos', hle, hne, rfl⟩)
end String.Slice
namespace String
theorem Pos.find?_bool_eq_some_iff {p : Char → Bool} {s : String} {pos pos' : s.Pos} :
    pos.find? p = some pos' ↔
      pos ≤ pos' ∧ (∃ h, p (pos'.get h)) ∧
        ∀ pos'', pos ≤ pos'' → (h' : pos'' < pos') →
          p (pos''.get (Pos.ne_endPos_of_lt h')) = false := by
  simp only [Pos.find?_eq_find?_toSlice, Option.map_eq_some_iff,
    Slice.Pos.find?_bool_eq_some_iff, endPos_toSlice]
  refine ⟨?_, ?_⟩
  · rintro ⟨pos', ⟨h₁, ⟨h₂, hp⟩, h₃⟩, rfl⟩
    refine ⟨by simpa [Pos.ofToSlice_le_iff] using h₁,
      ⟨by simpa [← Pos.ofToSlice_inj] using h₂, by simpa [Pos.get_ofToSlice] using hp⟩, ?_⟩
    intro pos'' h₄ h₅
    simpa using h₃ pos''.toSlice (by simpa [Pos.toSlice_le] using h₄) (by simpa using h₅)
  · rintro ⟨h₁, ⟨h₂, hp⟩, h₃⟩
    refine ⟨pos'.toSlice, ⟨by simpa [Pos.toSlice_le] using h₁,
      ⟨by simpa [← Pos.toSlice_inj] using h₂, by simpa using hp⟩, fun p hp₁ hp₂ => ?_⟩,
      by simp⟩
    simpa using h₃ (Pos.ofToSlice p)
      (by simpa [Pos.ofToSlice_le_iff] using hp₁) (by simpa using hp₂)
theorem Pos.find?_bool_eq_some_iff_splits {p : Char → Bool} {s : String} {pos : s.Pos}
    {t u : String} (hs : pos.Splits t u) {pos' : s.Pos} :
    pos.find? p = some pos' ↔
      ∃ v c w, pos'.Splits (t ++ v) (singleton c ++ w) ∧ p c ∧
        ∀ d ∈ v.toList, p d = false := by
  simp only [Pos.find?_eq_find?_toSlice, Option.map_eq_some_iff,
    Slice.Pos.find?_bool_eq_some_iff_splits (Pos.splits_toSlice_iff.mpr hs)]
  constructor
  · rintro ⟨q, ⟨v, c, w, hsplit, hpc, hmin⟩, rfl⟩
    exact ⟨v, c, w, Slice.Pos.splits_ofToSlice_iff.mpr hsplit, hpc, hmin⟩
  · rintro ⟨v, c, w, hsplit, hpc, hmin⟩
    exact ⟨pos'.toSlice, ⟨v, c, w, Pos.splits_toSlice_iff.mpr hsplit, hpc, hmin⟩, by simp⟩
theorem Pos.find?_bool_eq_none_iff {p : Char → Bool} {s : String} {pos : s.Pos} :
    pos.find? p = none ↔
      ∀ pos', pos ≤ pos' → (h : pos' ≠ s.endPos) → p (pos'.get h) = false := by
  simp only [Pos.find?_eq_find?_toSlice, Option.map_eq_none_iff,
    Slice.Pos.find?_bool_eq_none_iff, endPos_toSlice]
  refine ⟨?_, ?_⟩
  · intro h pos' h₁ h₂
    simpa [Pos.get_ofToSlice] using
      h pos'.toSlice (by simpa [Pos.toSlice_le] using h₁) (by simpa [← Pos.toSlice_inj] using h₂)
  · intro h pos' h₁ h₂
    simpa using h (Pos.ofToSlice pos')
      (by simpa [Pos.ofToSlice_le_iff] using h₁) (by simpa [← Pos.ofToSlice_inj] using h₂)
theorem Pos.find?_bool_eq_none_iff_of_splits {p : Char → Bool} {s : String} {pos : s.Pos}
    {t u : String} (hs : pos.Splits t u) :
    pos.find? p = none ↔ ∀ c ∈ u.toList, p c = false := by
  rw [Pos.find?_eq_find?_toSlice, Option.map_eq_none_iff]
  exact Slice.Pos.find?_bool_eq_none_iff_of_splits (Pos.splits_toSlice_iff.mpr hs)
theorem Pos.find?_prop_eq_some_iff {p : Char → Prop} [DecidablePred p] {s : String}
    {pos pos' : s.Pos} :
    pos.find? p = some pos' ↔
      pos ≤ pos' ∧ (∃ h, p (pos'.get h)) ∧
        ∀ pos'', pos ≤ pos'' → (h' : pos'' < pos') →
          ¬ p (pos''.get (Pos.ne_endPos_of_lt h')) := by
  simp only [Pos.find?_eq_find?_toSlice, Option.map_eq_some_iff,
    Slice.Pos.find?_prop_eq_some_iff, endPos_toSlice]
  refine ⟨?_, ?_⟩
  · rintro ⟨pos', ⟨h₁, ⟨h₂, hp⟩, h₃⟩, rfl⟩
    refine ⟨by simpa [Pos.ofToSlice_le_iff] using h₁,
      ⟨by simpa [← Pos.ofToSlice_inj] using h₂, by simpa [Pos.get_ofToSlice] using hp⟩, ?_⟩
    intro pos'' h₄ h₅
    simpa using h₃ pos''.toSlice (by simpa [Pos.toSlice_le] using h₄) (by simpa using h₅)
  · rintro ⟨h₁, ⟨h₂, hp⟩, h₃⟩
    refine ⟨pos'.toSlice, ⟨by simpa [Pos.toSlice_le] using h₁,
      ⟨by simpa [← Pos.toSlice_inj] using h₂, by simpa using hp⟩, fun p hp₁ hp₂ => ?_⟩,
      by simp⟩
    simpa using h₃ (Pos.ofToSlice p)
      (by simpa [Pos.ofToSlice_le_iff] using hp₁) (by simpa using hp₂)
theorem Pos.find?_prop_eq_some_iff_splits {p : Char → Prop} [DecidablePred p] {s : String}
    {pos : s.Pos} {t u : String} (hs : pos.Splits t u) {pos' : s.Pos} :
    pos.find? p = some pos' ↔
      ∃ v c w, pos'.Splits (t ++ v) (singleton c ++ w) ∧ p c ∧ ∀ d ∈ v.toList, ¬ p d := by
  simp only [Pos.find?_eq_find?_toSlice, Option.map_eq_some_iff,
    Slice.Pos.find?_prop_eq_some_iff_splits (Pos.splits_toSlice_iff.mpr hs)]
  constructor
  · rintro ⟨q, ⟨v, c, w, hsplit, hpc, hmin⟩, rfl⟩
    exact ⟨v, c, w, Slice.Pos.splits_ofToSlice_iff.mpr hsplit, hpc, hmin⟩
  · rintro ⟨v, c, w, hsplit, hpc, hmin⟩
    exact ⟨pos'.toSlice, ⟨v, c, w, Pos.splits_toSlice_iff.mpr hsplit, hpc, hmin⟩, by simp⟩
theorem Pos.find?_prop_eq_none_iff {p : Char → Prop} [DecidablePred p] {s : String}
    {pos : s.Pos} :
    pos.find? p = none ↔
      ∀ pos', pos ≤ pos' → (h : pos' ≠ s.endPos) → ¬ p (pos'.get h) := by
  simp only [Pos.find?_eq_find?_toSlice, Option.map_eq_none_iff,
    Slice.Pos.find?_prop_eq_none_iff, endPos_toSlice]
  refine ⟨?_, ?_⟩
  · intro h pos' h₁ h₂
    simpa [Pos.get_ofToSlice] using
      h pos'.toSlice (by simpa [Pos.toSlice_le] using h₁) (by simpa [← Pos.toSlice_inj] using h₂)
  · intro h pos' h₁ h₂
    simpa using h (Pos.ofToSlice pos')
      (by simpa [Pos.ofToSlice_le_iff] using h₁) (by simpa [← Pos.ofToSlice_inj] using h₂)
theorem Pos.find?_prop_eq_none_iff_of_splits {p : Char → Prop} [DecidablePred p] {s : String}
    {pos : s.Pos} {t u : String} (hs : pos.Splits t u) :
    pos.find? p = none ↔ ∀ c ∈ u.toList, ¬ p c := by
  rw [Pos.find?_eq_find?_toSlice, Option.map_eq_none_iff]
  exact Slice.Pos.find?_prop_eq_none_iff_of_splits (Pos.splits_toSlice_iff.mpr hs)
theorem find?_bool_eq_some_iff {p : Char → Bool} {s : String} {pos : s.Pos} :
    s.find? p = some pos ↔
      ∃ h, p (pos.get h) ∧
        ∀ pos', (h' : pos' < pos) → p (pos'.get (Pos.ne_endPos_of_lt h')) = false := by
  simp only [find?_eq_find?_toSlice, Option.map_eq_some_iff, Slice.find?_bool_eq_some_iff,
    endPos_toSlice, exists_and_right]
  refine ⟨?_, ?_⟩
  · rintro ⟨pos, ⟨h, hp, h'⟩, rfl⟩
    refine ⟨by simpa [← Pos.ofToSlice_inj] using h, by simpa [Pos.get_ofToSlice] using hp, ?_⟩
    intro pos' hp
    simpa using h' pos'.toSlice hp
  · rintro ⟨h, hp, hmin⟩
    exact ⟨pos.toSlice, ⟨by simpa [← Pos.toSlice_inj] using h, by simpa using hp,
      fun pos' hp => by simpa using hmin (Pos.ofToSlice pos') hp⟩, by simp⟩
theorem find?_bool_eq_some_iff_splits {p : Char → Bool} {s : String} {pos : s.Pos} :
    s.find? p = some pos ↔
      ∃ t c u, pos.Splits t (singleton c ++ u) ∧ p c ∧ ∀ d ∈ t.toList, p d = false := by
  simp only [find?_eq_find?_toSlice, Option.map_eq_some_iff,
    Slice.find?_bool_eq_some_iff_splits]
  constructor
  · rintro ⟨q, ⟨t, c, u, hsplit, hpc, hmin⟩, rfl⟩
    exact ⟨t, c, u, Slice.Pos.splits_ofToSlice_iff.mpr hsplit, hpc, hmin⟩
  · rintro ⟨t, c, u, hsplit, hpc, hmin⟩
    exact ⟨pos.toSlice, ⟨t, c, u, Pos.splits_toSlice_iff.mpr hsplit, hpc, hmin⟩, by simp⟩
theorem find?_prop_eq_some_iff {p : Char → Prop} [DecidablePred p] {s : String} {pos : s.Pos} :
    s.find? p = some pos ↔
      ∃ h, p (pos.get h) ∧
        ∀ pos', (h' : pos' < pos) → ¬ p (pos'.get (Pos.ne_endPos_of_lt h')) := by
  simp only [find?_eq_find?_toSlice, Option.map_eq_some_iff, Slice.find?_prop_eq_some_iff,
    endPos_toSlice, exists_and_right]
  refine ⟨?_, ?_⟩
  · rintro ⟨pos, ⟨h, hp, h'⟩, rfl⟩
    refine ⟨by simpa [← Pos.ofToSlice_inj] using h, by simpa [Pos.get_ofToSlice] using hp, ?_⟩
    intro pos' hp
    simpa using h' pos'.toSlice hp
  · rintro ⟨h, hp, hmin⟩
    exact ⟨pos.toSlice, ⟨by simpa [← Pos.toSlice_inj] using h, by simpa using hp,
      fun pos' hp => by simpa using hmin (Pos.ofToSlice pos') hp⟩, by simp⟩
theorem find?_prop_eq_some_iff_splits {p : Char → Prop} [DecidablePred p] {s : String}
    {pos : s.Pos} :
    s.find? p = some pos ↔
      ∃ t c u, pos.Splits t (singleton c ++ u) ∧ p c ∧ ∀ d ∈ t.toList, ¬ p d := by
  simp only [find?_eq_find?_toSlice, Option.map_eq_some_iff,
    Slice.find?_prop_eq_some_iff_splits]
  constructor
  · rintro ⟨q, ⟨t, c, u, hsplit, hpc, hmin⟩, rfl⟩
    exact ⟨t, c, u, Slice.Pos.splits_ofToSlice_iff.mpr hsplit, hpc, hmin⟩
  · rintro ⟨t, c, u, hsplit, hpc, hmin⟩
    exact ⟨pos.toSlice, ⟨t, c, u, Pos.splits_toSlice_iff.mpr hsplit, hpc, hmin⟩, by simp⟩
@[simp]
theorem contains_bool_eq {p : Char → Bool} {s : String} : s.contains p = s.toList.any p := by
  simp [contains_eq_contains_toSlice, Slice.contains_bool_eq, copy_toSlice]
@[simp]
theorem contains_prop_eq {p : Char → Prop} [DecidablePred p] {s : String} :
    s.contains p = s.toList.any p := by
  simp [contains_eq_contains_toSlice, Slice.contains_prop_eq, copy_toSlice]
end String
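
As a reading aid (editorial sketch, not part of the diff): the `_splits` variants above characterize `find?` by an explicit decomposition of the remaining input. The following restates `find?_bool_eq_some_iff_splits` as a standalone proposition, assuming the `find?`/`Splits` API above is in scope:

```lean
-- Sketch: how `find?_bool_eq_some_iff_splits` is meant to be read.
-- Starting from a position that splits `s` into `t ++ u`, `find?` returns the
-- position just before the first character `c` of `u` with `p c = true`; the
-- prefix `v` of `u` before `c` contains only characters failing `p`.
example (s : String) (p : Char → Bool) : Prop :=
  ∀ (pos pos' : s.Pos) (t u : String), pos.Splits t u →
    (pos.find? p = some pos' ↔
      ∃ v c w, pos'.Splits (t ++ v) (String.singleton c ++ w) ∧ p c ∧
        ∀ d ∈ v.toList, p d = false)
```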


@@ -10,6 +10,10 @@ public import Init.Data.String.Pattern.Pred
public import Init.Data.String.Lemmas.Pattern.Basic
import Init.Data.Option.Lemmas
import Init.Data.String.Lemmas.Basic
import Init.Data.String.Lemmas.Order
import Init.Data.Order.Lemmas
import Init.Data.String.OrderInstances
import Init.Omega
public section
@@ -38,14 +42,35 @@ theorem isLongestMatch_iff {p : Char → Bool} {s : Slice} {pos : s.Pos} :
      ∃ (h : s.startPos ≠ s.endPos), pos = s.startPos.next h ∧ p (s.startPos.get h) := by
  rw [isLongestMatch_iff_isMatch, isMatch_iff]
theorem isLongestMatchAt_iff {p : Char → Bool} {s : Slice} {pos pos' : s.Pos} :
    IsLongestMatchAt p pos pos' ↔ ∃ h, pos' = pos.next h ∧ p (pos.get h) := by
  simp +contextual [Model.isLongestMatchAt_iff, isLongestMatch_iff, Pos.ofSliceFrom_inj,
    Pos.get_eq_get_ofSliceFrom, Pos.ofSliceFrom_next]
theorem isLongestMatchAt_of_get {p : Char → Bool} {s : Slice} {pos : s.Pos} {h : pos ≠ s.endPos}
    (hc : p (pos.get h)) : IsLongestMatchAt p pos (pos.next h) :=
  isLongestMatchAt_iff.2 ⟨h, by simp [hc]⟩
instance {p : Char → Bool} : LawfulForwardPatternModel p where
  dropPrefix?_eq_some_iff {s} pos := by
    simp [isLongestMatch_iff, ForwardPattern.dropPrefix?]
    exact ⟨fun ⟨h, h₁, h₂⟩ => ⟨h, h₂.symm, h₁⟩, fun ⟨h, h₁, h₂⟩ => ⟨h, h₂, h₁.symm⟩⟩
    simp [isLongestMatch_iff, ForwardPattern.dropPrefix?, and_comm, eq_comm (b := pos)]
instance {p : Char → Bool} : LawfulToForwardSearcherModel p :=
  .defaultImplementation
theorem matchesAt_iff {p : Char → Bool} {s : Slice} {pos : s.Pos} :
    MatchesAt p pos ↔ ∃ (h : pos ≠ s.endPos), p (pos.get h) := by
  simp [matchesAt_iff_exists_isLongestMatchAt, isLongestMatchAt_iff, exists_comm]
theorem not_matchesAt_of_get {p : Char → Bool} {s : Slice} {pos : s.Pos} {h : pos ≠ s.endPos}
    (hc : p (pos.get h) = false) : ¬ MatchesAt p pos := by
  simp [matchesAt_iff, hc]
theorem matchAt?_eq {s : Slice} {pos : s.Pos} {p : Char → Bool} :
    matchAt? p pos =
      if h₀ : ∃ (h : pos ≠ s.endPos), p (pos.get h) then some (pos.next h₀.1) else none := by
  split <;> simp_all [isLongestMatchAt_iff, matchesAt_iff]
namespace Decidable
instance {p : Char → Prop} [DecidablePred p] : ForwardPatternModel p where
@@ -73,6 +98,20 @@ theorem isLongestMatch_iff_isLongestMatch_decide {p : Char → Prop} [DecidableP
    {pos : s.Pos} : IsLongestMatch p pos ↔ IsLongestMatch (decide <| p ·) pos := by
  simp [isLongestMatch_iff_isMatch, isMatch_iff_isMatch_decide]
theorem isLongestMatchAt_iff_isLongestMatchAt_decide {p : Char → Prop} [DecidablePred p]
    {s : Slice} {pos pos' : s.Pos} :
    IsLongestMatchAt p pos pos' ↔ IsLongestMatchAt (decide <| p ·) pos pos' := by
  simp [Model.isLongestMatchAt_iff, isLongestMatch_iff_isLongestMatch_decide]
theorem isLongestMatchAt_iff {p : Char → Prop} [DecidablePred p] {s : Slice}
    {pos pos' : s.Pos} :
    IsLongestMatchAt p pos pos' ↔ ∃ h, pos' = pos.next h ∧ p (pos.get h) := by
  simp [isLongestMatchAt_iff_isLongestMatchAt_decide, CharPred.isLongestMatchAt_iff]
theorem isLongestMatchAt_of_get {p : Char → Prop} [DecidablePred p] {s : Slice} {pos : s.Pos}
    {h : pos ≠ s.endPos} (hc : p (pos.get h)) : IsLongestMatchAt p pos (pos.next h) :=
  isLongestMatchAt_iff.2 ⟨h, by simp [hc]⟩
theorem dropPrefix?_eq_dropPrefix?_decide {p : Char → Prop} [DecidablePred p] :
    ForwardPattern.dropPrefix? p = ForwardPattern.dropPrefix? (decide <| p ·) := rfl
@@ -84,6 +123,19 @@ instance {p : Char → Prop} [DecidablePred p] : LawfulForwardPatternModel p whe
instance {p : Char → Prop} [DecidablePred p] : LawfulToForwardSearcherModel p :=
  .defaultImplementation
theorem matchesAt_iff {p : Char → Prop} [DecidablePred p] {s : Slice} {pos : s.Pos} :
    MatchesAt p pos ↔ ∃ (h : pos ≠ s.endPos), p (pos.get h) := by
  simp [matchesAt_iff_exists_isLongestMatchAt, isLongestMatchAt_iff, exists_comm]
theorem not_matchesAt_of_get {p : Char → Prop} [DecidablePred p] {s : Slice} {pos : s.Pos}
    {h : pos ≠ s.endPos} (hc : ¬ p (pos.get h)) : ¬ MatchesAt p pos := by
  simp [matchesAt_iff, hc]
theorem matchAt?_eq {s : Slice} {pos : s.Pos} {p : Char → Prop} [DecidablePred p] :
    matchAt? p pos =
      if h₀ : ∃ (h : pos ≠ s.endPos), p (pos.get h) then some (pos.next h₀.1) else none := by
  split <;> simp_all [isLongestMatchAt_iff, matchesAt_iff]
end Decidable
end String.Slice.Pattern.Model.CharPred


@@ -6,235 +6,6 @@ Author: Markus Himmel
module
prelude
public import Init.Data.String.Lemmas.Pattern.Basic
public import Init.Data.String.Slice
import all Init.Data.String.Slice
import Init.Data.Option.Lemmas
import Init.Data.String.Termination
import Init.Data.String.Lemmas.Order
import Init.ByCases
import Init.Data.Order.Lemmas
import Init.Data.String.OrderInstances
import Init.Data.Iterators.Lemmas.Basic
import Init.Data.Iterators.Lemmas.Consumers.Collect
set_option doc.verso true
/-!
# Verification of {name}`String.Slice.splitToSubslice`
This PR verifies the {name}`String.Slice.splitToSubslice` function by relating it to a model
implementation based on the {name}`String.Slice.Pattern.Model.ForwardPatternModel` class.
This gives a low-level correctness proof from which higher-level API lemmas can be derived.
-/
namespace String.Slice.Pattern.Model
/--
Represents a list of subslices of a slice {name}`s`, the first of which starts at the given
position {name}`startPos`. This is a natural type for a split routine to return.
-/
@[ext]
public structure SlicesFrom {s : Slice} (startPos : s.Pos) : Type where
l : List s.Subslice
any_head? : l.head?.any (·.startInclusive = startPos)
namespace SlicesFrom
/--
A {name}`SlicesFrom` consisting of a single empty subslice at the position {name}`pos`.
-/
public def «at» {s : Slice} (pos : s.Pos) : SlicesFrom pos where
l := [s.subslice pos pos (Slice.Pos.le_refl _)]
any_head? := by simp
@[simp]
public theorem l_at {s : Slice} (pos : s.Pos) :
(SlicesFrom.at pos).l = [s.subslice pos pos (Slice.Pos.le_refl _)] := (rfl)
/--
Concatenating two {name}`SlicesFrom` yields a {name}`SlicesFrom` from the first position.
-/
public def append {s : Slice} {p₁ p₂ : s.Pos} (l₁ : SlicesFrom p₁) (l₂ : SlicesFrom p₂) :
SlicesFrom p₁ where
l := l₁.l ++ l₂.l
any_head? := by simpa using Option.any_or_of_any_left l₁.any_head?
@[simp]
public theorem l_append {s : Slice} {p₁ p₂ : s.Pos} {l₁ : SlicesFrom p₁} {l₂ : SlicesFrom p₂} :
(l₁.append l₂).l = l₁.l ++ l₂.l :=
(rfl)
/--
Given a {lean}`SlicesFrom p₂` and a position {name}`p₁` such that {lean}`p₁ ≤ p₂`, obtain a
{lean}`SlicesFrom p₁` by extending the left end of the first subslice from {name}`p₂` to
{name}`p₁`.
-/
public def extend {s : Slice} (p₁ : s.Pos) {p₂ : s.Pos} (h : p₁ ≤ p₂) (l : SlicesFrom p₂) :
SlicesFrom p₁ where
l :=
match l.l, l.any_head? with
| st :: sts, h => st.extendLeft p₁ (by simp_all) :: sts
any_head? := by split; simp
@[simp]
public theorem l_extend {s : Slice} {p₁ p₂ : s.Pos} (h : p₁ ≤ p₂) {l : SlicesFrom p₂} :
(l.extend p₁ h).l =
match l.l, l.any_head? with
| st :: sts, h => st.extendLeft p₁ (by simp_all) :: sts :=
(rfl)
@[simp]
public theorem extend_self {s : Slice} {p₁ : s.Pos} (l : SlicesFrom p₁) :
l.extend p₁ (Slice.Pos.le_refl _) = l := by
  rcases l with ⟨l, h⟩
match l, h with
| st :: sts, h =>
simp at h
simp [SlicesFrom.extend, h]
@[simp]
public theorem extend_extend {s : Slice} {p₀ p₁ p₂ : s.Pos} {h : p₀ ≤ p₁} {h' : p₁ ≤ p₂}
    {l : SlicesFrom p₂} : (l.extend p₁ h').extend p₀ h = l.extend p₀ (Slice.Pos.le_trans h h') := by
  rcases l with ⟨l, h⟩
match l, h with
| st :: sts, h => simp [SlicesFrom.extend]
end SlicesFrom
/--
Noncomputable model implementation of {name}`String.Slice.splitToSubslice` based on
{name}`ForwardPatternModel`. This is supposed to be simple enough to allow deriving higher-level
API lemmas about the public splitting functions.
-/
public protected noncomputable def split {ρ : Type} (pat : ρ) [ForwardPatternModel pat] {s : Slice}
(start : s.Pos) : SlicesFrom start :=
if h : start = s.endPos then
.at start
else
match hd : matchAt? pat start with
| some pos =>
have : start < pos := (matchAt?_eq_some_iff.1 hd).lt
(SlicesFrom.at start).append (Model.split pat pos)
| none => (Model.split pat (start.next h)).extend start (by simp)
termination_by start
@[simp]
public theorem split_endPos {ρ : Type} {pat : ρ} [ForwardPatternModel pat] {s : Slice} :
Model.split pat s.endPos = SlicesFrom.at s.endPos := by
simp [Model.split]
public theorem split_eq_of_isLongestMatchAt {ρ : Type} {pat : ρ} [ForwardPatternModel pat]
{s : Slice} {start stop : s.Pos} (h : IsLongestMatchAt pat start stop) :
Model.split pat start = (SlicesFrom.at start).append (Model.split pat stop) := by
rw [Model.split, dif_neg (Slice.Pos.ne_endPos_of_lt h.lt)]
split
· congr <;> exact (matchAt?_eq_some_iff.1 _).eq h
· simp [matchAt?_eq_some_iff.2 _] at *
public theorem split_eq_of_not_matchesAt {ρ : Type} {pat : ρ} [ForwardPatternModel pat] {s : Slice}
    {start stop : s.Pos} (h₀ : start ≤ stop) (h : ∀ p, start ≤ p → p < stop → ¬ MatchesAt pat p) :
    Model.split pat start = (SlicesFrom.extend start h₀ (Model.split pat stop)) := by
induction start using WellFounded.induction Slice.Pos.wellFounded_gt with | h start ih
by_cases h' : start < stop
· rw [Model.split, dif_neg (Slice.Pos.ne_endPos_of_lt h')]
have : ¬ MatchesAt pat start := h start (Slice.Pos.le_refl _) h'
split
· rename_i heq
simp [matchAt?_eq_none_iff.2 _] at heq
· rw [ih, SlicesFrom.extend_extend]
· simp
· simp [h']
· refine fun p hp₁ hp₂ => h p (Std.le_of_lt (by simpa using hp₁)) hp₂
· obtain rfl : start = stop := Std.le_antisymm h₀ (Std.not_lt.1 h')
simp
/--
Splits a slice {name}`s` into subslices from a list of {lean}`SearchStep s`.
This is an intermediate step in the verification. The equivalence of
{name}`String.Slice.splitToSubslice` and {name}`splitFromSteps` is pure "iteratorology", while
the equivalence of {name}`splitFromSteps` and {name}`split` is the actual correctness proof for the
splitting routine.
-/
def splitFromSteps {s : Slice} (currPos : s.Pos) (l : List (SearchStep s)) : List s.Subslice :=
match l with
| [] => [s.subsliceFrom currPos]
| .rejected .. :: l => splitFromSteps currPos l
| .matched p q :: l => s.subslice! currPos p :: splitFromSteps q l
theorem IsValidSearchFrom.splitFromSteps_eq_extend_split {ρ : Type} (pat : ρ)
    [ForwardPatternModel pat] (l : List (SearchStep s)) (pos pos' : s.Pos) (h₀ : pos ≤ pos')
    (h' : ∀ p, pos ≤ p → p < pos' → ¬ MatchesAt pat p)
    (h : IsValidSearchFrom pat pos' l) :
    splitFromSteps pos l = ((Model.split pat pos').extend pos h₀).l := by
induction h generalizing pos with
| endPos =>
simp only [splitFromSteps, Model.split, reduceDIte, SlicesFrom.l_extend, List.head?_cons,
Option.any_some]
split
simp_all only [SlicesFrom.l_at, List.cons.injEq, List.nil_eq, List.head?_cons, Option.any_some,
decide_eq_true_eq, heq_eq_eq, and_true]
rename_i h
    simp only [← h.1]
ext <;> simp
| matched h valid ih =>
simp only [splitFromSteps]
rw [subslice!_eq_subslice h₀, split_eq_of_isLongestMatchAt h]
simp only [SlicesFrom.append, SlicesFrom.at, List.cons_append, List.nil_append,
SlicesFrom.l_extend, List.cons.injEq]
    refine ⟨?_, ?_⟩
· ext <;> simp
· rw [ih _ (Slice.Pos.le_refl _), SlicesFrom.extend_self]
exact fun p hp₁ hp₂ => False.elim (Std.lt_irrefl (Std.lt_of_le_of_lt hp₁ hp₂))
| mismatched h rej valid ih =>
simp only [splitFromSteps]
rename_i l startPos endPos
rw [split_eq_of_not_matchesAt (Std.le_of_lt h) rej, SlicesFrom.extend_extend, ih]
intro p hp₁ hp₂
by_cases hp : p < startPos
· exact h' p hp₁ hp
· exact rej _ (Std.not_lt.1 hp) hp₂
theorem SplitIterator.toList_eq_splitFromSteps {ρ : Type} {pat : ρ} {σ : Slice → Type}
    [ToForwardSearcher pat σ]
    [∀ s, Std.Iterator (σ s) Id (SearchStep s)] [∀ s, Std.Iterators.Finite (σ s) Id] {s : Slice}
(it : Std.Iter (α := σ s) (SearchStep s)) (currPos : s.Pos) :
(Std.Iter.mk (α := SplitIterator pat s) (.operating currPos it)).toList =
splitFromSteps currPos it.toList := by
induction it using Std.Iter.inductSteps generalizing currPos with | step it ihy ihs
rw [Std.Iter.toList_eq_match_step, Std.Iter.step_eq]
conv => rhs; rw [Std.Iter.toList_eq_match_step]
simp only [Std.Iter.toIterM_mk]
cases it.step using Std.PlausibleIterStep.casesOn with
| yield it out h =>
match out with
| .matched startPos endPos => simp [splitFromSteps, ihy h]
| .rejected startPos endPos => simp [splitFromSteps, ihy h]
  | skip it h => simp [← ihs h]
| done =>
simp only [Id.run_pure, Std.Shrink.inflate_deflate, Std.IterM.Step.toPure_yield,
Std.PlausibleIterStep.yield, Std.IterM.toIter_mk, splitFromSteps, List.cons.injEq, true_and]
rw [Std.Iter.toList_eq_match_step, Std.Iter.step_eq]
simp
theorem toList_splitToSubslice_eq_splitFromSteps {ρ : Type} {pat : ρ} {σ : Slice → Type} [ToForwardSearcher pat σ]
    [∀ s, Std.Iterator (σ s) Id (SearchStep s)] [∀ s, Std.Iterators.Finite (σ s) Id] (s : Slice) :
(s.splitToSubslice pat).toList = splitFromSteps s.startPos (ToForwardSearcher.toSearcher pat s).toList := by
rw [splitToSubslice, SplitIterator.toList_eq_splitFromSteps]
end Model
open Model
public theorem toList_splitToSubslice_eq_modelSplit {ρ : Type} (pat : ρ) [ForwardPatternModel pat]
    {σ : Slice → Type} [ToForwardSearcher pat σ] [∀ s, Std.Iterator (σ s) Id (SearchStep s)]
    [∀ s, Std.Iterators.Finite (σ s) Id] [LawfulToForwardSearcherModel pat] (s : Slice) :
(s.splitToSubslice pat).toList = (Model.split pat s.startPos).l := by
rw [toList_splitToSubslice_eq_splitFromSteps, IsValidSearchFrom.splitFromSteps_eq_extend_split pat _
s.startPos s.startPos (Std.le_refl _) _ (LawfulToForwardSearcherModel.isValidSearchFrom_toList _),
SlicesFrom.extend_self]
simp
end String.Slice.Pattern
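
As a reading aid (editorial sketch, not part of the diff): the verification in this file factors into two equivalences, which can be summarized as follows, using only names defined above:

```lean
-- Sketch of the proof architecture used in this file:
--
--   (s.splitToSubslice pat).toList
--     = splitFromSteps s.startPos (toSearcher pat s).toList  -- iterator bookkeeping
--     = (Model.split pat s.startPos).l                       -- correctness proper
--
-- The first step is `SplitIterator.toList_eq_splitFromSteps`; the second is
-- `IsValidSearchFrom.splitFromSteps_eq_extend_split` applied at `s.startPos`,
-- where `SlicesFrom.extend_self` discharges the trivial extension.
example := @String.Slice.Pattern.toList_splitToSubslice_eq_modelSplit
```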
public import Init.Data.String.Lemmas.Pattern.Split.Basic
public import Init.Data.String.Lemmas.Pattern.Split.Char
public import Init.Data.String.Lemmas.Pattern.Split.Pred


@@ -0,0 +1,207 @@
/-
Copyright (c) 2026 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Author: Markus Himmel
-/
module
prelude
public import Init.Data.String.Lemmas.Pattern.Basic
public import Init.Data.String.Slice
public import Init.Data.String.Search
import all Init.Data.String.Slice
import all Init.Data.String.Search
import Init.Data.Option.Lemmas
import Init.Data.String.Termination
import Init.Data.String.Lemmas.Order
import Init.ByCases
import Init.Data.Order.Lemmas
import Init.Data.String.OrderInstances
import Init.Data.Iterators.Lemmas.Basic
import Init.Data.Iterators.Lemmas.Consumers.Collect
import Init.Data.Iterators.Lemmas.Combinators.FilterMap
import Init.Data.String.Lemmas.IsEmpty
set_option doc.verso true
/-!
# Verification of {name}`String.Slice.splitToSubslice`
This PR verifies the {name}`String.Slice.splitToSubslice` function by relating it to a model
implementation based on the {name}`String.Slice.Pattern.Model.ForwardPatternModel` class.
This gives a low-level correctness proof from which higher-level API lemmas can be derived.
-/
namespace String.Slice.Pattern.Model
public protected noncomputable def split {ρ : Type} (pat : ρ) [ForwardPatternModel pat] {s : Slice}
    (firstRejected curr : s.Pos) (hle : firstRejected ≤ curr) : List s.Subslice :=
if h : curr = s.endPos then
[s.subslice _ _ hle]
else
match hd : matchAt? pat curr with
| some pos =>
have : curr < pos := (matchAt?_eq_some_iff.1 hd).lt
s.subslice _ _ hle :: Model.split pat pos pos (Std.le_refl _)
| none => Model.split pat firstRejected (curr.next h) (Std.le_trans hle (by simp))
termination_by curr
@[simp]
public theorem split_endPos {ρ : Type} {pat : ρ} [ForwardPatternModel pat] {s : Slice}
{firstRejected : s.Pos} :
Model.split (s := s) pat firstRejected s.endPos (by simp) = [s.subslice firstRejected s.endPos (by simp)] := by
simp [Model.split]
public theorem split_eq_of_isLongestMatchAt {ρ : Type} {pat : ρ} [ForwardPatternModel pat]
{s : Slice} {firstRejected start stop : s.Pos} {hle} (h : IsLongestMatchAt pat start stop) :
Model.split pat firstRejected start hle =
s.subslice _ _ hle :: Model.split pat stop stop (by exact Std.le_refl _) := by
rw [Model.split, dif_neg (Slice.Pos.ne_endPos_of_lt h.lt)]
split
· congr <;> exact (matchAt?_eq_some_iff.1 _).eq h
· simp [matchAt?_eq_some_iff.2 _] at *
public theorem split_eq_of_not_matchesAt {ρ : Type} {pat : ρ} [ForwardPatternModel pat]
    {s : Slice} {firstRejected start} (stop : s.Pos) (h₀ : start ≤ stop) {hle}
    (h : ∀ p, start ≤ p → p < stop → ¬ MatchesAt pat p) :
Model.split pat firstRejected start hle =
Model.split pat firstRejected stop (by exact Std.le_trans hle h₀) := by
induction start using WellFounded.induction Slice.Pos.wellFounded_gt with | h start ih
by_cases h' : start < stop
· rw [Model.split, dif_neg (Slice.Pos.ne_endPos_of_lt h')]
have : ¬ MatchesAt pat start := h start (Slice.Pos.le_refl _) h'
split
· rename_i heq
simp [matchAt?_eq_none_iff.2 _] at heq
· rw [ih _ (by simp) (by simpa)]
exact fun p hp₁ hp₂ => h p (Std.le_of_lt (by simpa using hp₁)) hp₂
· obtain rfl : start = stop := Std.le_antisymm h₀ (Std.not_lt.1 h')
simp
public theorem split_eq_next_of_not_matchesAt {ρ : Type} {pat : ρ} [ForwardPatternModel pat]
    {s : Slice} {firstRejected start} {hle} (hs : start ≠ s.endPos) (h : ¬ MatchesAt pat start) :
Model.split pat firstRejected start hle =
Model.split pat firstRejected (start.next hs) (by exact Std.le_trans hle (by simp)) := by
refine split_eq_of_not_matchesAt _ (by simp) (fun p hp₁ hp₂ => ?_)
obtain rfl : start = p := Std.le_antisymm hp₁ (by simpa using hp₂)
exact h
/--
Splits a slice {name}`s` into subslices from a list of {lean}`SearchStep s`.
This is an intermediate step in the verification. The equivalence of
{name}`String.Slice.splitToSubslice` and {name}`splitFromSteps` is pure "iteratorology", while
the equivalence of {name}`splitFromSteps` and {name}`split` is the actual correctness proof for the
splitting routine.
-/
def splitFromSteps {s : Slice} (currPos : s.Pos) (l : List (SearchStep s)) : List s.Subslice :=
match l with
| [] => [s.subsliceFrom currPos]
| .rejected .. :: l => splitFromSteps currPos l
| .matched p q :: l => s.subslice! currPos p :: splitFromSteps q l
theorem IsValidSearchFrom.splitFromSteps_eq_extend_split {ρ : Type} (pat : ρ)
    [ForwardPatternModel pat] (l : List (SearchStep s)) (pos pos' : s.Pos) (h₀ : pos ≤ pos')
    (h' : ∀ p, pos ≤ p → p < pos' → ¬ MatchesAt pat p)
(h : IsValidSearchFrom pat pos' l) :
splitFromSteps pos l = Model.split pat pos pos' h₀ := by
induction h generalizing pos with
| endPos =>
simp [splitFromSteps]
| matched h valid ih =>
simp only [splitFromSteps]
rw [subslice!_eq_subslice h₀, split_eq_of_isLongestMatchAt h, ih]
    simp +contextual [← Std.not_lt]
| mismatched h rej valid ih =>
simp only [splitFromSteps]
rename_i l startPos endPos
rw [split_eq_of_not_matchesAt _ (Std.le_of_lt h) rej, ih]
intro p hp₁ hp₂
by_cases hp : p < startPos
· exact h' p hp₁ hp
· exact rej _ (Std.not_lt.1 hp) hp₂
theorem SplitIterator.toList_eq_splitFromSteps {ρ : Type} {pat : ρ} {σ : Slice → Type}
    [ToForwardSearcher pat σ]
    [∀ s, Std.Iterator (σ s) Id (SearchStep s)] [∀ s, Std.Iterators.Finite (σ s) Id] {s : Slice}
(it : Std.Iter (α := σ s) (SearchStep s)) (currPos : s.Pos) :
(Std.Iter.mk (α := SplitIterator pat s) (.operating currPos it)).toList =
splitFromSteps currPos it.toList := by
induction it using Std.Iter.inductSteps generalizing currPos with | step it ihy ihs
rw [Std.Iter.toList_eq_match_step, Std.Iter.step_eq]
conv => rhs; rw [Std.Iter.toList_eq_match_step]
simp only [Std.Iter.toIterM_mk]
cases it.step using Std.PlausibleIterStep.casesOn with
| yield it out h =>
match out with
| .matched startPos endPos => simp [splitFromSteps, ihy h]
| .rejected startPos endPos => simp [splitFromSteps, ihy h]
  | skip it h => simp [← ihs h]
| done =>
simp only [Id.run_pure, Std.Shrink.inflate_deflate, Std.IterM.Step.toPure_yield,
Std.PlausibleIterStep.yield, Std.IterM.toIter_mk, splitFromSteps, List.cons.injEq, true_and]
rw [Std.Iter.toList_eq_match_step, Std.Iter.step_eq]
simp
theorem toList_splitToSubslice_eq_splitFromSteps {ρ : Type} {pat : ρ} {σ : Slice → Type} [ToForwardSearcher pat σ]
    [∀ s, Std.Iterator (σ s) Id (SearchStep s)] [∀ s, Std.Iterators.Finite (σ s) Id] (s : Slice) :
(s.splitToSubslice pat).toList = splitFromSteps s.startPos (ToForwardSearcher.toSearcher pat s).toList := by
rw [splitToSubslice, SplitIterator.toList_eq_splitFromSteps]
end Model
open Model
public theorem toList_splitToSubslice_eq_modelSplit {ρ : Type} (pat : ρ) [ForwardPatternModel pat]
    {σ : Slice → Type} [ToForwardSearcher pat σ] [∀ s, Std.Iterator (σ s) Id (SearchStep s)]
    [∀ s, Std.Iterators.Finite (σ s) Id] [LawfulToForwardSearcherModel pat] (s : Slice) :
(s.splitToSubslice pat).toList = Model.split pat s.startPos s.startPos (by exact Std.le_refl _) := by
rw [toList_splitToSubslice_eq_splitFromSteps, IsValidSearchFrom.splitFromSteps_eq_extend_split pat _
s.startPos s.startPos (Std.le_refl _) _ (LawfulToForwardSearcherModel.isValidSearchFrom_toList _)]
simp
end Pattern
open Pattern
public theorem toList_splitToSubslice_of_isEmpty {ρ : Type} (pat : ρ)
    [Model.ForwardPatternModel pat] {σ : Slice → Type}
    [ToForwardSearcher pat σ] [∀ s, Std.Iterator (σ s) Id (SearchStep s)]
    [∀ s, Std.Iterators.Finite (σ s) Id] [Model.LawfulToForwardSearcherModel pat] {s : Slice}
(h : s.isEmpty = true) :
(s.splitToSubslice pat).toList = [s.subsliceFrom s.endPos] := by
simp [toList_splitToSubslice_eq_modelSplit, Slice.startPos_eq_endPos_iff.2 h]
public theorem toList_split_eq_splitToSubslice {ρ : Type} (pat : ρ) {σ : Slice → Type}
    [ToForwardSearcher pat σ] [∀ s, Std.Iterator (σ s) Id (SearchStep s)]
    [∀ s, Std.Iterators.Finite (σ s) Id] {s : Slice} :
(s.split pat).toList = (s.splitToSubslice pat).toList.map Subslice.toSlice := by
simp [split, Std.Iter.toList_map]
public theorem toList_split_of_isEmpty {ρ : Type} (pat : ρ)
    [Model.ForwardPatternModel pat] {σ : Slice → Type}
    [ToForwardSearcher pat σ] [∀ s, Std.Iterator (σ s) Id (SearchStep s)]
    [∀ s, Std.Iterators.Finite (σ s) Id] [Model.LawfulToForwardSearcherModel pat] {s : Slice}
(h : s.isEmpty = true) :
(s.split pat).toList.map Slice.copy = [""] := by
rw [toList_split_eq_splitToSubslice, toList_splitToSubslice_of_isEmpty _ h]
simp
end Slice
open Slice.Pattern
public theorem split_eq_split_toSlice {ρ : Type} {pat : ρ} {σ : Slice → Type}
[ToForwardSearcher pat σ] [∀ s, Std.Iterator (σ s) Id (SearchStep s)] {s : String} :
s.split pat = s.toSlice.split pat := (rfl)
@[simp]
public theorem toList_split_empty {ρ : Type} (pat : ρ)
[Model.ForwardPatternModel pat] {σ : Slice → Type}
[ToForwardSearcher pat σ] [∀ s, Std.Iterator (σ s) Id (SearchStep s)]
[∀ s, Std.Iterators.Finite (σ s) Id] [Model.LawfulToForwardSearcherModel pat] :
("".split pat).toList.map Slice.copy = [""] := by
rw [split_eq_split_toSlice, Slice.toList_split_of_isEmpty _ (by simp)]
end String

View File

@@ -0,0 +1,78 @@
/-
Copyright (c) 2026 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Author: Markus Himmel
-/
module
prelude
public import Init.Data.String.Slice
public import Init.Data.String.Search
public import Init.Data.List.SplitOn.Basic
import Init.Data.String.Termination
import Init.Data.Order.Lemmas
import Init.Data.Iterators.Lemmas.Combinators.FilterMap
import Init.Data.String.Lemmas.Pattern.Split.Basic
import Init.Data.String.Lemmas.Pattern.Char
import Init.ByCases
import Init.Data.String.OrderInstances
import Init.Data.String.Lemmas.Order
import Init.Data.String.Lemmas.Intercalate
import Init.Data.List.SplitOn.Lemmas
public section
namespace String.Slice
open Pattern.Model Pattern.Model.Char
theorem toList_splitToSubslice_char {s : Slice} {c : Char} :
(s.splitToSubslice c).toList.map (Slice.copy ∘ Subslice.toSlice) =
(s.copy.toList.splitOn c).map String.ofList := by
simp only [Pattern.toList_splitToSubslice_eq_modelSplit]
suffices ∀ (f p : s.Pos) (hle : f ≤ p) (t₁ t₂ : String),
p.Splits t₁ t₂ → (Pattern.Model.split c f p hle).map (copy ∘ Subslice.toSlice) =
(t₂.toList.splitOnPPrepend (· == c) (s.subslice f p hle).copy.toList.reverse).map String.ofList by
simpa [List.splitOn_eq_splitOnP] using this s.startPos s.startPos (Std.le_refl _) "" s.copy
intro f p hle t₁ t₂ hp
induction p using Pos.next_induction generalizing f t₁ t₂ with
| next p h ih =>
obtain ⟨t₂, rfl⟩ := hp.exists_eq_singleton_append h
by_cases hpc : p.get h = c
· simp [split_eq_of_isLongestMatchAt (isLongestMatchAt_of_get_eq hpc),
ih _ (Std.le_refl _) _ _ hp.next,
List.splitOnPPrepend_cons_pos (p := (· == c)) (beq_iff_eq.2 hpc)]
· rw [split_eq_next_of_not_matchesAt h (not_matchesAt_of_get_ne hpc)]
simp only [toList_append, toList_singleton, List.cons_append, List.nil_append, Subslice.copy_eq]
rw [ih _ _ _ _ hp.next, List.splitOnPPrepend_cons_neg (by simpa)]
have := (splits_slice (Std.le_trans hle (by simp)) (p.slice f (p.next h) hle (by simp))).eq_append
simp_all
| endPos => simp_all
theorem toList_split_char {s : Slice} {c : Char} :
(s.split c).toList.map Slice.copy = (s.copy.toList.splitOn c).map String.ofList := by
simp [toList_split_eq_splitToSubslice, toList_splitToSubslice_char]
end Slice
theorem toList_split_char {s : String} {c : Char} :
(s.split c).toList.map Slice.copy = (s.toList.splitOn c).map String.ofList := by
simp [split_eq_split_toSlice, Slice.toList_split_char]
theorem Slice.toList_split_intercalate {c : Char} {l : List Slice} (hl : ∀ s ∈ l, c ∉ s.copy.toList) :
((Slice.intercalate (String.singleton c) l).split c).toList.map Slice.copy =
if l = [] then [""] else l.map Slice.copy := by
simp [String.toList_split_char]
split
· simp_all
· rw [List.splitOn_intercalate] <;> simp_all
theorem toList_split_intercalate {c : Char} {l : List String} (hl : ∀ s ∈ l, c ∉ s.toList) :
((String.intercalate (String.singleton c) l).split c).toList.map (·.copy) =
if l = [] then [""] else l := by
simp only [toList_split_char, toList_intercalate, toList_singleton]
split
· simp_all
· rw [List.splitOn_intercalate] <;> simp_all
end String

View File

@@ -0,0 +1,103 @@
/-
Copyright (c) 2026 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Author: Markus Himmel
-/
module
prelude
public import Init.Data.String.Slice
public import Init.Data.String.Search
public import Init.Data.List.SplitOn.Basic
import Init.Data.String.Termination
import Init.Data.Order.Lemmas
import Init.Data.Iterators.Lemmas.Combinators.FilterMap
import Init.Data.String.Lemmas.Pattern.Split.Basic
import Init.Data.String.Lemmas.Pattern.Pred
import Init.ByCases
import Init.Data.String.OrderInstances
import Init.Data.List.SplitOn.Lemmas
import Init.Data.String.Lemmas.Order
public section
namespace String.Slice
section
open Pattern.Model Pattern.Model.CharPred
theorem toList_splitToSubslice_bool {s : Slice} {p : Char → Bool} :
(s.splitToSubslice p).toList.map (Slice.copy ∘ Subslice.toSlice) =
(s.copy.toList.splitOnP p).map String.ofList := by
simp only [Pattern.toList_splitToSubslice_eq_modelSplit]
suffices ∀ (f pos : s.Pos) (hle : f ≤ pos) (t₁ t₂ : String),
pos.Splits t₁ t₂ → (Pattern.Model.split p f pos hle).map (copy ∘ Subslice.toSlice) =
(t₂.toList.splitOnPPrepend p (s.subslice f pos hle).copy.toList.reverse).map String.ofList by
simpa using this s.startPos s.startPos (Std.le_refl _) "" s.copy
intro f pos hle t₁ t₂ hp
induction pos using Pos.next_induction generalizing f t₁ t₂ with
| next pos h ih =>
obtain ⟨t₂, rfl⟩ := hp.exists_eq_singleton_append h
by_cases hpc : p (pos.get h)
· simp [split_eq_of_isLongestMatchAt (isLongestMatchAt_of_get hpc),
ih _ (Std.le_refl _) _ _ hp.next,
List.splitOnPPrepend_cons_pos (p := p) hpc]
· rw [Bool.not_eq_true] at hpc
rw [split_eq_next_of_not_matchesAt h (not_matchesAt_of_get hpc)]
simp only [toList_append, toList_singleton, List.cons_append, List.nil_append, Subslice.copy_eq]
rw [ih _ _ _ _ hp.next, List.splitOnPPrepend_cons_neg (by simpa)]
have := (splits_slice (Std.le_trans hle (by simp)) (pos.slice f (pos.next h) hle (by simp))).eq_append
simp_all
| endPos => simp_all
theorem toList_split_bool {s : Slice} {p : Char → Bool} :
(s.split p).toList.map Slice.copy = (s.copy.toList.splitOnP p).map String.ofList := by
simp [toList_split_eq_splitToSubslice, toList_splitToSubslice_bool]
end
section
open Pattern.Model Pattern.Model.CharPred.Decidable
theorem toList_splitToSubslice_prop {s : Slice} {p : Char → Prop} [DecidablePred p] :
(s.splitToSubslice p).toList.map (Slice.copy ∘ Subslice.toSlice) =
(s.copy.toList.splitOnP p).map String.ofList := by
simp only [Pattern.toList_splitToSubslice_eq_modelSplit]
suffices ∀ (f pos : s.Pos) (hle : f ≤ pos) (t₁ t₂ : String),
pos.Splits t₁ t₂ → (Pattern.Model.split p f pos hle).map (copy ∘ Subslice.toSlice) =
(t₂.toList.splitOnPPrepend p (s.subslice f pos hle).copy.toList.reverse).map String.ofList by
simpa using this s.startPos s.startPos (Std.le_refl _) "" s.copy
intro f pos hle t₁ t₂ hp
induction pos using Pos.next_induction generalizing f t₁ t₂ with
| next pos h ih =>
obtain ⟨t₂, rfl⟩ := hp.exists_eq_singleton_append h
by_cases hpc : p (pos.get h)
· simp [split_eq_of_isLongestMatchAt (isLongestMatchAt_of_get hpc),
ih _ (Std.le_refl _) _ _ hp.next,
List.splitOnPPrepend_cons_pos (p := (decide <| p ·)) (by simpa using hpc)]
· rw [split_eq_next_of_not_matchesAt h (not_matchesAt_of_get hpc)]
simp only [toList_append, toList_singleton, List.cons_append, List.nil_append, Subslice.copy_eq]
rw [ih _ _ _ _ hp.next, List.splitOnPPrepend_cons_neg (by simpa)]
have := (splits_slice (Std.le_trans hle (by simp)) (pos.slice f (pos.next h) hle (by simp))).eq_append
simp_all
| endPos => simp_all
theorem toList_split_prop {s : Slice} {p : Char → Prop} [DecidablePred p] :
(s.split p).toList.map Slice.copy = (s.copy.toList.splitOnP p).map String.ofList := by
simp [toList_split_eq_splitToSubslice, toList_splitToSubslice_prop]
end
end Slice
theorem toList_split_bool {s : String} {p : Char → Bool} :
(s.split p).toList.map Slice.copy = (s.toList.splitOnP p).map String.ofList := by
simp [split_eq_split_toSlice, Slice.toList_split_bool]
theorem toList_split_prop {s : String} {p : Char → Prop} [DecidablePred p] :
(s.split p).toList.map Slice.copy = (s.toList.splitOnP p).map String.ofList := by
simp [split_eq_split_toSlice, Slice.toList_split_prop]
end String

View File

@@ -125,4 +125,105 @@ theorem le_of_matchesAt {pat s : Slice} {pos : s.Pos} (h : pat.isEmpty = false)
end ForwardSliceSearcher
namespace ForwardStringSearcher
instance {pat : String} : ForwardPatternModel pat where
Matches s := s ≠ "" ∧ s = pat
not_matches_empty := by simp
instance {pat : String} : NoPrefixForwardPatternModel pat :=
.of_length_eq (by simp +contextual [ForwardPatternModel.Matches])
theorem isMatch_iff_slice {pat : String} {s : Slice} {pos : s.Pos} :
IsMatch (ρ := String) pat pos ↔ IsMatch (ρ := Slice) pat.toSlice pos := by
simp only [Model.isMatch_iff, ForwardPatternModel.Matches, copy_toSlice]
theorem isLongestMatch_iff_slice {pat : String} {s : Slice} {pos : s.Pos} :
IsLongestMatch (ρ := String) pat pos ↔ IsLongestMatch (ρ := Slice) pat.toSlice pos where
mp h := ⟨isMatch_iff_slice.1 h.isMatch, fun p hp hm => h.not_isMatch p hp (isMatch_iff_slice.2 hm)⟩
mpr h := ⟨isMatch_iff_slice.2 h.isMatch, fun p hp hm => h.not_isMatch p hp (isMatch_iff_slice.1 hm)⟩
theorem isLongestMatchAt_iff_slice {pat : String} {s : Slice} {pos₁ pos₂ : s.Pos} :
IsLongestMatchAt (ρ := String) pat pos₁ pos₂ ↔
IsLongestMatchAt (ρ := Slice) pat.toSlice pos₁ pos₂ := by
simp [Model.isLongestMatchAt_iff, isLongestMatch_iff_slice]
theorem matchesAt_iff_slice {pat : String} {s : Slice} {pos : s.Pos} :
MatchesAt (ρ := String) pat pos ↔ MatchesAt (ρ := Slice) pat.toSlice pos := by
simp [matchesAt_iff_exists_isLongestMatchAt, isLongestMatchAt_iff_slice]
private theorem toSlice_isEmpty (h : pat ≠ "") : pat.toSlice.isEmpty = false := by
rwa [isEmpty_toSlice, isEmpty_eq_false_iff]
theorem isMatch_iff {pat : String} {s : Slice} {pos : s.Pos} (h : pat ≠ "") :
IsMatch pat pos ↔ (s.sliceTo pos).copy = pat := by
rw [isMatch_iff_slice, ForwardSliceSearcher.isMatch_iff (toSlice_isEmpty h)]
simp
theorem isLongestMatch_iff {pat : String} {s : Slice} {pos : s.Pos} (h : pat ≠ "") :
IsLongestMatch pat pos ↔ (s.sliceTo pos).copy = pat := by
rw [isLongestMatch_iff_isMatch, isMatch_iff h]
theorem isLongestMatchAt_iff {pat : String} {s : Slice} {pos₁ pos₂ : s.Pos} (h : pat ≠ "") :
IsLongestMatchAt pat pos₁ pos₂ ↔ ∃ h, (s.slice pos₁ pos₂ h).copy = pat := by
rw [isLongestMatchAt_iff_slice,
ForwardSliceSearcher.isLongestMatchAt_iff (toSlice_isEmpty h)]
simp
theorem isLongestMatchAt_iff_splits {pat : String} {s : Slice} {pos₁ pos₂ : s.Pos}
(h : pat ≠ "") :
IsLongestMatchAt pat pos₁ pos₂ ↔
∃ t₁ t₂, pos₁.Splits t₁ (pat ++ t₂) ∧ pos₂.Splits (t₁ ++ pat) t₂ := by
rw [isLongestMatchAt_iff_slice,
ForwardSliceSearcher.isLongestMatchAt_iff_splits (toSlice_isEmpty h)]
simp
theorem isLongestMatchAt_iff_extract {pat : String} {s : Slice} {pos₁ pos₂ : s.Pos}
(h : pat ≠ "") :
IsLongestMatchAt pat pos₁ pos₂ ↔
s.copy.toByteArray.extract pos₁.offset.byteIdx pos₂.offset.byteIdx = pat.toByteArray := by
rw [isLongestMatchAt_iff_slice,
ForwardSliceSearcher.isLongestMatchAt_iff_extract (toSlice_isEmpty h)]
simp
theorem offset_of_isLongestMatchAt {pat : String} {s : Slice} {pos₁ pos₂ : s.Pos}
(h : pat ≠ "") (h' : IsLongestMatchAt pat pos₁ pos₂) :
pos₂.offset = pos₁.offset.increaseBy pat.utf8ByteSize := by
rw [show pat.utf8ByteSize = pat.toSlice.utf8ByteSize from utf8ByteSize_toSlice.symm]
exact ForwardSliceSearcher.offset_of_isLongestMatchAt (toSlice_isEmpty h)
(isLongestMatchAt_iff_slice.1 h')
theorem matchesAt_iff_splits {pat : String} {s : Slice} {pos : s.Pos} (h : pat ≠ "") :
MatchesAt pat pos ↔ ∃ t₁ t₂, pos.Splits t₁ (pat ++ t₂) := by
rw [matchesAt_iff_slice,
ForwardSliceSearcher.matchesAt_iff_splits (toSlice_isEmpty h)]
simp
theorem matchesAt_iff_isLongestMatchAt {pat : String} {s : Slice} {pos : s.Pos}
(h : pat ≠ "") :
MatchesAt pat pos ↔ ∃ (h : (pos.offset.increaseBy pat.utf8ByteSize).IsValidForSlice s),
IsLongestMatchAt pat pos (s.pos _ h) := by
have key := ForwardSliceSearcher.matchesAt_iff_isLongestMatchAt (pat := pat.toSlice)
(toSlice_isEmpty h) (pos := pos)
simp only [utf8ByteSize_toSlice, isLongestMatchAt_iff_slice] at key
rwa [matchesAt_iff_slice]
theorem matchesAt_iff_getElem {pat : String} {s : Slice} {pos : s.Pos} (h : pat ≠ "") :
MatchesAt pat pos ↔
∃ (h : pos.offset.byteIdx + pat.toByteArray.size ≤ s.copy.toByteArray.size),
∀ (j), (hj : j < pat.toByteArray.size) →
pat.toByteArray[j] = s.copy.toByteArray[pos.offset.byteIdx + j] := by
have key := ForwardSliceSearcher.matchesAt_iff_getElem (pat := pat.toSlice)
(toSlice_isEmpty h) (pos := pos)
simp only [copy_toSlice] at key
rwa [matchesAt_iff_slice]
theorem le_of_matchesAt {pat : String} {s : Slice} {pos : s.Pos} (h : pat ≠ "")
(h' : MatchesAt pat pos) : pos.offset.increaseBy pat.utf8ByteSize ≤ s.rawEndPos := by
rw [show pat.utf8ByteSize = pat.toSlice.utf8ByteSize from utf8ByteSize_toSlice.symm]
exact ForwardSliceSearcher.le_of_matchesAt (toSlice_isEmpty h)
(matchesAt_iff_slice.1 h')
end ForwardStringSearcher
end String.Slice.Pattern.Model

View File

@@ -73,4 +73,17 @@ public theorem lawfulForwardPatternModel {pat : Slice} (hpat : pat.isEmpty = fal
end Model.ForwardSliceSearcher
namespace Model.ForwardStringSearcher
open Pattern.ForwardSliceSearcher
public theorem lawfulForwardPatternModel {pat : String} (hpat : pat ≠ "") :
LawfulForwardPatternModel pat where
dropPrefixOfNonempty?_eq h := rfl
startsWith_eq s := isSome_dropPrefix?.symm
dropPrefix?_eq_some_iff pos := by
simp [ForwardPattern.dropPrefix?, dropPrefix?_eq_some_iff, isLongestMatch_iff hpat]
end Model.ForwardStringSearcher
end String.Slice.Pattern

View File

@@ -565,4 +565,33 @@ public theorem lawfulToForwardSearcherModel {pat : Slice} (hpat : pat.isEmpty =
end ForwardSliceSearcher
namespace ForwardStringSearcher
private theorem isValidSearchFrom_iff_slice {pat : String} {s : Slice} {pos : s.Pos}
{l : List (SearchStep s)} :
IsValidSearchFrom (ρ := String) pat pos l ↔
IsValidSearchFrom (ρ := Slice) pat.toSlice pos l := by
constructor
· intro h
induction h with
| endPos => exact .endPos
| matched hm _ ih => exact .matched (isLongestMatchAt_iff_slice.1 hm) ih
| mismatched hlt hnm _ ih =>
exact .mismatched hlt (fun p hp₁ hp₂ hm => hnm p hp₁ hp₂ (matchesAt_iff_slice.2 hm)) ih
· intro h
induction h with
| endPos => exact .endPos
| matched hm _ ih => exact .matched (isLongestMatchAt_iff_slice.2 hm) ih
| mismatched hlt hnm _ ih =>
exact .mismatched hlt (fun p hp₁ hp₂ hm => hnm p hp₁ hp₂ (matchesAt_iff_slice.1 hm)) ih
public theorem lawfulToForwardSearcherModel {pat : String} (hpat : pat ≠ "") :
LawfulToForwardSearcherModel pat where
isValidSearchFrom_toList s :=
isValidSearchFrom_iff_slice.2
((ForwardSliceSearcher.lawfulToForwardSearcherModel
(by rwa [isEmpty_toSlice, isEmpty_eq_false_iff])).isValidSearchFrom_toList s)
end ForwardStringSearcher
end String.Slice.Pattern.Model

View File

@@ -56,11 +56,15 @@ theorem Pos.Splits.cast {s₁ s₂ : String} {p : s₁.Pos} {t₁ t₂ : String}
splits_cast_iff.mpr
@[simp]
theorem Slice.Pos.splits_cast_iff {s₁ s₂ : Slice} {h : s₁ = s₂} {p : s₁.Pos} {t₁ t₂ : String} :
theorem Slice.Pos.splits_cast_iff {s₁ s₂ : Slice} {h : s₁.copy = s₂.copy} {p : s₁.Pos}
{t₁ t₂ : String} :
(p.cast h).Splits t₁ t₂ ↔ p.Splits t₁ t₂ := by
subst h; simp
constructor
· intro ⟨h₁, h₂⟩; exact ⟨h ▸ h₁, by simpa using h₂⟩
· intro ⟨h₁, h₂⟩; exact ⟨h.symm ▸ h₁, by simpa using h₂⟩
theorem Slice.Pos.Splits.cast {s₁ s₂ : Slice} {p : s₁.Pos} {t₁ t₂ : String} (h : s₁ = s₂) :
theorem Slice.Pos.Splits.cast {s₁ s₂ : Slice} {p : s₁.Pos} {t₁ t₂ : String}
(h : s₁.copy = s₂.copy) :
p.Splits t₁ t₂ → (p.cast h).Splits t₁ t₂ :=
splits_cast_iff.mpr
@@ -416,14 +420,6 @@ theorem splits_singleton_iff {s : String} {p : s.Pos} {c : Char} {t : String} :
rw [← Pos.splits_toSlice_iff, Slice.splits_singleton_iff]
simp [← Pos.ofToSlice_inj]
@[simp]
theorem Slice.copy_sliceTo_startPos {s : Slice} : (s.sliceTo s.startPos).copy = "" :=
s.startPos.splits.eq_left s.splits_startPos
@[simp]
theorem copy_sliceTo_startPos {s : String} : (s.sliceTo s.startPos).copy = "" :=
s.startPos.splits.eq_left s.splits_startPos
theorem Slice.splits_next_startPos {s : Slice} {h : s.startPos ≠ s.endPos} :
(s.startPos.next h).Splits
(singleton (s.startPos.get h)) (s.sliceFrom (s.startPos.next h)).copy := by
@@ -597,4 +593,40 @@ theorem Slice.Pos.Splits.copy_sliceFrom_eq {s : Slice} {p : s.Pos} (h : p.Splits
(s.sliceFrom p).copy = t₂ :=
p.splits.eq_right h
theorem copy_slice_eq_append_of_lt {s : String} {p q : s.Pos} (h : p < q) :
(s.slice p q (by exact Std.le_of_lt h)).copy =
String.singleton (p.get (by exact Pos.ne_endPos_of_lt h)) ++
(s.slice (p.next (by exact Pos.ne_endPos_of_lt h)) q (by simpa)).copy := by
have hsp := (s.slice p q (Std.le_of_lt h)).splits_startPos
obtain ⟨t₂, ht⟩ := hsp.exists_eq_singleton_append (by simpa [← Pos.ofSlice_inj] using Std.ne_of_lt h)
have := (ht ▸ hsp).next.eq_right (Slice.Pos.splits _)
simpa [Pos.ofSlice_next, this, Pos.get_eq_get_ofSlice] using ht
@[simp]
theorem copy_slice_next {s : String} {p : s.Pos} {h} :
(s.slice p (p.next h) (by simp)).copy = String.singleton (p.get h) := by
rw [copy_slice_eq_append_of_lt (by simp), copy_slice_self, String.append_empty]
theorem splits_slice {s : String} {p₀ p₁ : s.Pos} (h) (p : (s.slice p₀ p₁ h).Pos) :
p.Splits (s.slice p₀ (Pos.ofSlice p) Pos.le_ofSlice).copy (s.slice (Pos.ofSlice p) p₁ Pos.ofSlice_le).copy := by
simpa using p.splits
theorem Slice.copy_slice_eq_append_of_lt {s : Slice} {p q : s.Pos} (h : p < q) :
(s.slice p q (by exact Std.le_of_lt h)).copy =
String.singleton (p.get (by exact Pos.ne_endPos_of_lt h)) ++
(s.slice (p.next (Pos.ne_endPos_of_lt h)) q (by simpa)).copy := by
have hsp := (s.slice p q (Std.le_of_lt h)).splits_startPos
obtain ⟨t₂, ht⟩ := hsp.exists_eq_singleton_append (by simpa [← Pos.ofSlice_inj] using Std.ne_of_lt h)
have := (ht ▸ hsp).next.eq_right (Slice.Pos.splits _)
simpa [Pos.ofSlice_next, this, Pos.get_eq_get_ofSlice] using ht
@[simp]
theorem Slice.copy_slice_next {s : Slice} {p : s.Pos} {h} :
(s.slice p (p.next h) (by simp)).copy = String.singleton (p.get h) := by
rw [copy_slice_eq_append_of_lt (by simp), copy_slice_self, String.append_empty]
theorem Slice.splits_slice {s : Slice} {p₀ p₁ : s.Pos} (h) (p : (s.slice p₀ p₁ h).Pos) :
p.Splits (s.slice p₀ (Pos.ofSlice p) Pos.le_ofSlice).copy (s.slice (Pos.ofSlice p) p₁ Pos.ofSlice_le).copy := by
simpa using p.splits
end String

View File

@@ -125,7 +125,7 @@ Examples:
-/
@[inline]
def find? (s : String) (pattern : ρ) [ToForwardSearcher pattern σ] : Option s.Pos :=
s.startPos.find? pattern
(s.toSlice.find? pattern).map Pos.ofToSlice
/--
Finds the position of the first match of the pattern {name}`pattern` in a slice {name}`s`. If there
@@ -140,7 +140,7 @@ Examples:
-/
@[inline]
def find (s : String) (pattern : ρ) [ToForwardSearcher pattern σ] : s.Pos :=
s.startPos.find pattern
Pos.ofToSlice (s.toSlice.find pattern)
/--
Finds the position of the first match of the pattern {name}`pattern` in a slice {name}`s` that is
@@ -189,7 +189,7 @@ Examples:
-/
@[inline]
def revFind? (s : String) (pattern : ρ) [ToBackwardSearcher pattern σ] : Option s.Pos :=
s.endPos.revFind? pattern
(s.toSlice.revFind? pattern).map Pos.ofToSlice
@[export lean_string_posof]
def Internal.posOfImpl (s : String) (c : Char) : Pos.Raw :=

View File

@@ -74,9 +74,15 @@ instance : BEq Slice where
def toString (s : Slice) : String :=
s.copy
@[simp]
theorem toString_eq : toString = copy := (rfl)
instance : ToString String.Slice where
toString := toString
@[simp]
theorem toStringToString_eq : ToString.toString = String.Slice.copy := (rfl)
@[extern "lean_slice_hash"]
opaque hash (s : @& Slice) : UInt64

View File

@@ -7,6 +7,8 @@ module
prelude
public import Init.Data.String.Basic
import Init.Data.String.Lemmas.IsEmpty
import Init.Data.String.Lemmas.Basic
set_option doc.verso true
@@ -59,6 +61,11 @@ theorem startInclusive_toSlice {s : Slice} {sl : s.Subslice} :
theorem endExclusive_toSlice {s : Slice} {sl : s.Subslice} :
sl.toSlice.endExclusive = sl.endExclusive.str := rfl
@[simp]
theorem isEmpty_toSlice_iff {s : Slice} {sl : s.Subslice} :
sl.toSlice.isEmpty ↔ sl.startInclusive = sl.endExclusive := by
simp [toSlice]
instance {s : Slice} : CoeOut s.Subslice Slice where
coe := Subslice.toSlice
@@ -76,6 +83,16 @@ def toString {s : Slice} (sl : s.Subslice) : String :=
instance {s : Slice} : ToString s.Subslice where
toString := toString
@[simp]
theorem copy_eq {s : Slice} : copy (s := s) = Slice.copy ∘ toSlice := (rfl)
@[simp]
theorem toString_eq {s : Slice} : toString (s := s) = Slice.copy ∘ toSlice := (rfl)
@[simp]
theorem toStringToString_eq {s : Slice} :
ToString.toString (α := s.Subslice) = Slice.copy ∘ toSlice := (rfl)
end Subslice
/--
@@ -130,6 +147,15 @@ theorem startInclusive_subsliceFrom {s : Slice} {newStart : s.Pos} :
theorem endExclusive_subsliceFrom {s : Slice} {newStart : s.Pos} :
(s.subsliceFrom newStart).endExclusive = s.endPos := (rfl)
@[simp]
theorem subslice_endPos {s : Slice} {newStart : s.Pos} :
s.subslice newStart s.endPos (Slice.Pos.le_endPos _) = s.subsliceFrom newStart := (rfl)
@[simp]
theorem toSlice_subsliceFrom {s : Slice} {newStart : s.Pos} :
(s.subsliceFrom newStart).toSlice = s.sliceFrom newStart := by
ext1 <;> simp
/-- The entire slice, as a subslice of itself. -/
@[inline]
def toSubslice (s : Slice) : s.Subslice :=
@@ -197,21 +223,21 @@ theorem extendLeft_self {s : Slice} {sl : s.Subslice} :
ext <;> simp
/--
Given a subslice of {name}`s` and a proof that {lean}`s = t`, obtain the corresponding subslice of
{name}`t`.
Given a subslice of {name}`s` and a proof that {lean}`s.copy = t.copy`, obtain the corresponding
subslice of {name}`t`.
-/
@[inline]
def cast {s t : Slice} (h : s = t) (sl : s.Subslice) : t.Subslice where
def cast {s t : Slice} (h : s.copy = t.copy) (sl : s.Subslice) : t.Subslice where
startInclusive := sl.startInclusive.cast h
endExclusive := sl.endExclusive.cast h
startInclusive_le_endExclusive := by simpa using sl.startInclusive_le_endExclusive
@[simp]
theorem startInclusive_cast {s t : Slice} {h : s = t} {sl : s.Subslice} :
theorem startInclusive_cast {s t : Slice} {h : s.copy = t.copy} {sl : s.Subslice} :
(sl.cast h).startInclusive = sl.startInclusive.cast h := (rfl)
@[simp]
theorem endExclusive_cast {s t : Slice} {h : s = t} {sl : s.Subslice} :
theorem endExclusive_cast {s t : Slice} {h : s.copy = t.copy} {sl : s.Subslice} :
(sl.cast h).endExclusive = sl.endExclusive.cast h := (rfl)
@[simp]

View File

@@ -101,6 +101,17 @@ theorem toArray_mk {xs : Array α} (h : xs.size = n) : (Vector.mk xs h).toArray
@[simp] theorem foldr_mk {f : α → β → β} {b : β} {xs : Array α} (h : xs.size = n) :
(Vector.mk xs h).foldr f b = xs.foldr f b := rfl
@[simp, grind =] theorem foldlM_toArray [Monad m]
{f : β → α → m β} {init : β} {xs : Vector α n} :
xs.toArray.foldlM f init = xs.foldlM f init := rfl
@[simp, grind =] theorem foldrM_toArray [Monad m]
{f : α → β → m β} {init : β} {xs : Vector α n} :
xs.toArray.foldrM f init = xs.foldrM f init := rfl
@[simp, grind =] theorem foldl_toArray (f : β → α → β) {init : β} {xs : Vector α n} :
xs.toArray.foldl f init = xs.foldl f init := rfl
@[simp] theorem drop_mk {xs : Array α} {h : xs.size = n} {i} :
(Vector.mk xs h).drop i = Vector.mk (xs.extract i xs.size) (by simp [h]) := rfl
@@ -514,17 +525,32 @@ protected theorem ext {xs ys : Vector α n} (h : (i : Nat) → (_ : i < n) → x
@[grind =_] theorem toList_toArray {xs : Vector α n} : xs.toArray.toList = xs.toList := rfl
theorem toArray_toList {xs : Vector α n} : xs.toList.toArray = xs.toArray := rfl
@[simp, grind =] theorem foldlM_toList [Monad m]
{f : β → α → m β} {init : β} {xs : Vector α n} :
xs.toList.foldlM f init = xs.foldlM f init := by
rw [← foldlM_toArray, ← toArray_toList, List.foldlM_toArray]
@[simp, grind =] theorem foldl_toList (f : β → α → β) {init : β} {xs : Vector α n} :
xs.toList.foldl f init = xs.foldl f init :=
List.foldl_eq_foldlM .. ▸ foldlM_toList ..
@[simp, grind =] theorem foldrM_toList [Monad m]
{f : α → β → m β} {init : β} {xs : Vector α n} :
xs.toList.foldrM f init = xs.foldrM f init := by
rw [← foldrM_toArray, ← toArray_toList, List.foldrM_toArray]
@[simp, grind =] theorem foldr_toList (f : α → β → β) {init : β} {xs : Vector α n} :
xs.toList.foldr f init = xs.foldr f init :=
List.foldr_eq_foldrM .. ▸ foldrM_toList ..
@[simp, grind =] theorem toList_mk : (Vector.mk xs h).toList = xs.toList := rfl
@[simp, grind =] theorem sum_toList [Add α] [Zero α] {xs : Vector α n} :
xs.toList.sum = xs.sum := by
rw [← toList_toArray, Array.sum_toList, sum_toArray]
@[simp, grind =]
theorem toList_zip {as : Vector α n} {bs : Vector β n} :
(Vector.zip as bs).toList = List.zip as.toList bs.toList := by
rw [mk_zip_mk, toList_mk, Array.toList_zip, toList_toArray, toList_toArray]
@[simp] theorem getElem_toList {xs : Vector α n} {i : Nat} (h : i < xs.toList.length) :
xs.toList[i] = xs[i]'(by simpa using h) := by
cases xs
@@ -609,6 +635,11 @@ theorem toList_swap {xs : Vector α n} {i j} (hi hj) :
@[simp] theorem toList_take {xs : Vector α n} {i} : (xs.take i).toList = xs.toList.take i := by
simp [toList]
@[simp, grind =]
theorem toList_zip {as : Vector α n} {bs : Vector β n} :
(Vector.zip as bs).toList = List.zip as.toList bs.toList := by
rw [mk_zip_mk, toList_mk, Array.toList_zip, toList_toArray, toList_toArray]
@[simp] theorem toList_zipWith {f : α → β → γ} {as : Vector α n} {bs : Vector β n} :
(Vector.zipWith f as bs).toList = List.zipWith f as.toList bs.toList := by
rcases as with ⟨as, rfl⟩
@@ -703,6 +734,9 @@ protected theorem eq_empty {xs : Vector α 0} : xs = #v[] := by
/-! ### size -/
theorem size_singleton {x : α} : #v[x].size = 1 := by
simp
theorem eq_empty_of_size_eq_zero {xs : Vector α n} (h : n = 0) : xs = #v[].cast h.symm := by
rcases xs with ⟨xs, rfl⟩
apply toArray_inj.1
@@ -2448,6 +2482,21 @@ theorem foldl_eq_foldr_reverse {xs : Vector α n} {f : β → α → β} {b} :
theorem foldr_eq_foldl_reverse {xs : Vector α n} {f : α → β → β} {b} :
xs.foldr f b = xs.reverse.foldl (fun x y => f y x) b := by simp
theorem foldl_eq_apply_foldr {xs : Vector α n} {f : α → α → α}
[Std.Associative f] [Std.LawfulRightIdentity f init] :
xs.foldl f x = f x (xs.foldr f init) := by
simp [← foldl_toList, ← foldr_toList, List.foldl_eq_apply_foldr]
theorem foldr_eq_apply_foldl {xs : Vector α n} {f : α → α → α}
[Std.Associative f] [Std.LawfulLeftIdentity f init] :
xs.foldr f x = f (xs.foldl f init) x := by
simp [← foldl_toList, ← foldr_toList, List.foldr_eq_apply_foldl]
theorem foldr_eq_foldl {xs : Vector α n} {f : α → α → α}
[Std.Associative f] [Std.LawfulIdentity f init] :
xs.foldr f init = xs.foldl f init := by
simp [foldl_eq_apply_foldr, Std.LawfulLeftIdentity.left_id]
theorem foldl_assoc {op : α → α → α} [ha : Std.Associative op] {xs : Vector α n} {a₁ a₂} :
xs.foldl op (op a₁ a₂) = op a₁ (xs.foldl op a₂) := by
rcases xs with ⟨xs, rfl⟩
@@ -3064,8 +3113,25 @@ theorem sum_append [Zero α] [Add α] [Std.Associative (α := α) (· + ·)]
{as₁ as₂ : Vector α n} : (as₁ ++ as₂).sum = as₁.sum + as₂.sum := by
simp [← sum_toList, List.sum_append]
@[simp, grind =]
theorem sum_singleton [Add α] [Zero α] [Std.LawfulRightIdentity (· + ·) (0 : α)] {x : α} :
#v[x].sum = x := by
simp [← sum_toList, Std.LawfulRightIdentity.right_id x]
@[simp, grind =]
theorem sum_push [Add α] [Zero α] [Std.Associative (α := α) (· + ·)]
[Std.LawfulIdentity (· + ·) (0 : α)] {xs : Vector α n} {x : α} :
(xs.push x).sum = xs.sum + x := by
simp [← sum_toArray]
@[simp, grind =]
theorem sum_reverse [Zero α] [Add α] [Std.Associative (α := α) (· + ·)]
[Std.Commutative (α := α) (· + ·)]
[Std.LawfulLeftIdentity (α := α) (· + ·) 0] (xs : Vector α n) : xs.reverse.sum = xs.sum := by
simp [← sum_toList, List.sum_reverse]
theorem sum_eq_foldl [Zero α] [Add α]
[Std.Associative (α := α) (· + ·)] [Std.LawfulIdentity (· + ·) (0 : α)]
{xs : Vector α n} :
xs.sum = xs.foldl (b := 0) (· + ·) := by
simp [← sum_toList, List.sum_eq_foldl]

View File

@@ -910,6 +910,8 @@ When messages contain autogenerated names (e.g., metavariables like `?m.47`), th
differ between runs or Lean versions. Use `set_option pp.mvars.anonymous false` to replace
anonymous metavariables with `?_` while preserving user-named metavariables like `?a`.
Alternatively, `set_option pp.mvars false` replaces all metavariables with `?_`.
Similarly, `set_option pp.fvars.anonymous false` replaces loose free variable names like
`_fvar.22` with `_fvar._`.
For example, `#guard_msgs (error, drop all) in cmd` means to check errors and drop
everything else.
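Taken together, the options described above can be set directly; a minimal sketch:

```lean
-- Render autogenerated metavariables such as `?m.47` as `?_`,
-- while keeping user-named ones such as `?a`:
set_option pp.mvars.anonymous false

-- Alternatively, render every metavariable, named or not, as `?_`:
set_option pp.mvars false

-- Similarly, render loose free variable names like `_fvar.22` as `_fvar._`:
set_option pp.fvars.anonymous false
```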

View File

@@ -144,7 +144,10 @@ theorem mul_def (xs ys : IntList) : xs * ys = List.zipWith (· * ·) xs ys :=
@[simp] theorem mul_nil_left : ([] : IntList) * ys = [] := rfl
@[simp] theorem mul_nil_right : xs * ([] : IntList) = [] := List.zipWith_nil_right
@[simp] theorem mul_cons : (x::xs : IntList) * (y::ys) = (x * y) :: (xs * ys) := rfl
@[simp] theorem mul_cons_cons : (x::xs : IntList) * (y::ys) = (x * y) :: (xs * ys) := rfl
@[deprecated mul_cons_cons (since := "2026-02-26")]
theorem mul_cons₂ : (x::xs : IntList) * (y::ys) = (x * y) :: (xs * ys) := mul_cons_cons
/-- Implementation of negation on `IntList`. -/
def neg (xs : IntList) : IntList := xs.map fun x => -x
@@ -278,7 +281,10 @@ example : IntList.dot [a, b, c] [x, y, z] = IntList.dot [a, b, c] [x, y, z, w] :
@[local simp] theorem dot_nil_left : dot ([] : IntList) ys = 0 := rfl
@[simp] theorem dot_nil_right : dot xs ([] : IntList) = 0 := by simp [dot]
@[simp] theorem dot_cons : dot (x::xs) (y::ys) = x * y + dot xs ys := rfl
@[simp] theorem dot_cons_cons : dot (x::xs) (y::ys) = x * y + dot xs ys := rfl
@[deprecated dot_cons_cons (since := "2026-02-26")]
theorem dot_cons₂ : dot (x::xs) (y::ys) = x * y + dot xs ys := dot_cons_cons
-- theorem dot_comm (xs ys : IntList) : dot xs ys = dot ys xs := by
-- rw [dot, dot, mul_comm]
@@ -296,7 +302,7 @@ example : IntList.dot [a, b, c] [x, y, z] = IntList.dot [a, b, c] [x, y, z, w] :
cases ys with
| nil => simp
| cons y ys =>
simp only [set_cons_zero, dot_cons, get_cons_zero, Int.sub_mul]
simp only [set_cons_zero, dot_cons_cons, get_cons_zero, Int.sub_mul]
rw [Int.add_right_comm, Int.add_comm (x * y), Int.sub_add_cancel]
| succ i =>
cases ys with
@@ -319,7 +325,7 @@ theorem dot_of_left_zero (w : ∀ x, x ∈ xs → x = 0) : dot xs ys = 0 := by
cases ys with
| nil => simp
| cons y ys =>
rw [dot_cons, w x (by simp [List.mem_cons_self]), ih]
rw [dot_cons_cons, w x (by simp [List.mem_cons_self]), ih]
· simp
· intro x m
apply w
@@ -400,7 +406,7 @@ attribute [simp] Int.zero_dvd
cases ys with
| nil => simp
| cons y ys =>
rw [dot_cons, Int.add_emod,
rw [dot_cons_cons, Int.add_emod,
Int.emod_emod_of_dvd (x * y) (gcd_cons_div_left),
Int.emod_emod_of_dvd (dot xs ys) (Int.ofNat_dvd.mpr gcd_cons_div_right)]
simp_all
@@ -415,7 +421,7 @@ theorem dot_eq_zero_of_left_eq_zero {xs ys : IntList} (h : ∀ x, x ∈ xs → x
cases ys with
| nil => rfl
| cons y ys =>
rw [dot_cons, h x List.mem_cons_self, ih (fun x m => h x (List.mem_cons_of_mem _ m)),
rw [dot_cons_cons, h x List.mem_cons_self, ih (fun x m => h x (List.mem_cons_of_mem _ m)),
Int.zero_mul, Int.add_zero]
@[simp] theorem nil_dot (xs : IntList) : dot [] xs = 0 := rfl
@@ -456,7 +462,7 @@ theorem dvd_bmod_dot_sub_dot_bmod (m : Nat) (xs ys : IntList) :
cases ys with
| nil => simp
| cons y ys =>
simp only [IntList.dot_cons, List.map_cons]
simp only [IntList.dot_cons_cons, List.map_cons]
specialize ih ys
rw [Int.sub_emod, Int.bmod_emod] at ih
rw [Int.sub_emod, Int.bmod_emod, Int.add_emod, Int.add_emod (Int.bmod x m * y),

View File

@@ -32,6 +32,89 @@ unsafe axiom lcAny : Type
/-- Internal representation of `Void` in the compiler. -/
unsafe axiom lcVoid : Type
set_option bootstrap.inductiveCheckResultingUniverse false in
/--
The canonical universe-polymorphic type with just one element.
It should be used in contexts that require a type to be universe polymorphic, thus disallowing
`Unit`.
-/
inductive PUnit : Sort u where
/-- The only element of the universe-polymorphic unit type. -/
| unit : PUnit
/--
The equality relation. It has one introduction rule, `Eq.refl`.
We use `a = b` as notation for `Eq a b`.
A fundamental property of equality is that it is an equivalence relation.
```
variable (α : Type) (a b c d : α)
variable (hab : a = b) (hcb : c = b) (hcd : c = d)
example : a = d :=
Eq.trans (Eq.trans hab (Eq.symm hcb)) hcd
```
Equality is much more than an equivalence relation, however. It has the important property that every assertion
respects the equivalence, in the sense that we can substitute equal expressions without changing the truth value.
That is, given `h1 : a = b` and `h2 : p a`, we can construct a proof for `p b` using substitution: `Eq.subst h1 h2`.
Example:
```
example (α : Type) (a b : α) (p : α → Prop)
(h1 : a = b) (h2 : p a) : p b :=
Eq.subst h1 h2
example (α : Type) (a b : α) (p : α → Prop)
(h1 : a = b) (h2 : p a) : p b :=
h1 ▸ h2
```
The triangle in the second presentation is a macro built on top of `Eq.subst` and `Eq.symm`, and you can enter it by typing `\t`.
For more information: [Equality](https://lean-lang.org/theorem_proving_in_lean4/quantifiers_and_equality.html#equality)
-/
inductive Eq : α → α → Prop where
/-- `Eq.refl a : a = a` is reflexivity, the unique constructor of the
equality type. See also `rfl`, which is usually used instead. -/
| refl (a : α) : Eq a a
/-- Non-dependent recursor for the equality type. -/
@[simp] abbrev Eq.ndrec.{u1, u2} {α : Sort u2} {a : α} {motive : α → Sort u1} (m : motive a) {b : α} (h : Eq a b) : motive b :=
h.rec m
/--
Heterogeneous equality. `a ≍ b` asserts that `a` and `b` have the same
type, and casting `a` across the equality yields `b`, and vice versa.
You should avoid using this type if you can. Heterogeneous equality does not
have all the same properties as `Eq`, because the assumption that the types of
`a` and `b` are equal is often too weak to prove theorems of interest. One
important non-theorem is the analogue of `congr`: If `f ≍ g` and `x ≍ y`
and `f x` and `g y` are well typed it does not follow that `f x ≍ g y`.
(This does follow if you have `f = g` instead.) However if `a` and `b` have
the same type then `a = b` and `a ≍ b` are equivalent.
-/
inductive HEq : {α : Sort u} → α → {β : Sort u} → β → Prop where
/-- Reflexivity of heterogeneous equality. -/
| refl (a : α) : HEq a a
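The docstring's final remark (on a single type, `Eq` and `HEq` agree) is witnessed by the usual one-line proof; a minimal sketch, stated outside the prelude's bootstrap constraints:

```lean
-- When both sides share a type, an `Eq` proof yields an `HEq` proof.
example {α : Sort u} {a b : α} (h : a = b) : HEq a b :=
  h ▸ HEq.refl a
```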
/--
The Boolean values, `true` and `false`.
Logically speaking, this is equivalent to `Prop` (the type of propositions). The distinction is
important for programming: both propositions and their proofs are erased in the code generator,
while `Bool` corresponds to the Boolean type in most programming languages and carries precisely one
bit of run-time information.
-/
inductive Bool : Type where
/-- The Boolean value `false`, not to be confused with the proposition `False`. -/
| false : Bool
/-- The Boolean value `true`, not to be confused with the proposition `True`. -/
| true : Bool
export Bool (false true)
/-- Compute whether `x` is a tagged pointer or not. -/
@[extern "lean_is_scalar"]
unsafe axiom isScalarObj {α : Type u} (x : α) : Bool
/--
The identity function. `id` takes an implicit argument `α : Sort u`
@@ -115,16 +198,7 @@ does.) Example:
-/
abbrev inferInstanceAs (α : Sort u) [i : α] : α := i
set_option bootstrap.inductiveCheckResultingUniverse false in
/--
The canonical universe-polymorphic type with just one element.
It should be used in contexts that require a type to be universe polymorphic, thus disallowing
`Unit`.
-/
inductive PUnit : Sort u where
/-- The only element of the universe-polymorphic unit type. -/
| unit : PUnit
/--
The canonical type with one element. This element is written `()`.
@@ -245,42 +319,6 @@ For more information: [Propositional Logic](https://lean-lang.org/theorem_provin
@[macro_inline] def absurd {a : Prop} {b : Sort v} (h₁ : a) (h₂ : Not a) : b :=
(h₂ h₁).rec
/--
The equality relation. It has one introduction rule, `Eq.refl`.
We use `a = b` as notation for `Eq a b`.
A fundamental property of equality is that it is an equivalence relation.
```
variable (α : Type) (a b c d : α)
variable (hab : a = b) (hcb : c = b) (hcd : c = d)
example : a = d :=
Eq.trans (Eq.trans hab (Eq.symm hcb)) hcd
```
Equality is much more than an equivalence relation, however. It has the important property that every assertion
respects the equivalence, in the sense that we can substitute equal expressions without changing the truth value.
That is, given `h1 : a = b` and `h2 : p a`, we can construct a proof for `p b` using substitution: `Eq.subst h1 h2`.
Example:
```
example (α : Type) (a b : α) (p : α → Prop)
(h1 : a = b) (h2 : p a) : p b :=
Eq.subst h1 h2
example (α : Type) (a b : α) (p : α → Prop)
(h1 : a = b) (h2 : p a) : p b :=
h1 ▸ h2
```
The triangle in the second presentation is a macro built on top of `Eq.subst` and `Eq.symm`, and you can enter it by typing `\t`.
For more information: [Equality](https://lean-lang.org/theorem_proving_in_lean4/quantifiers_and_equality.html#equality)
-/
inductive Eq : α → α → Prop where
/-- `Eq.refl a : a = a` is reflexivity, the unique constructor of the
equality type. See also `rfl`, which is usually used instead. -/
| refl (a : α) : Eq a a
/-- Non-dependent recursor for the equality type. -/
@[simp] abbrev Eq.ndrec.{u1, u2} {α : Sort u2} {a : α} {motive : α → Sort u1} (m : motive a) {b : α} (h : Eq a b) : motive b :=
h.rec m
/--
`rfl : a = a` is the unique constructor of the equality type. This is the
same as `Eq.refl` except that it takes `a` implicitly instead of explicitly.
@@ -477,21 +515,6 @@ Unsafe auxiliary constant used by the compiler to erase `Quot.lift`.
-/
unsafe axiom Quot.lcInv {α : Sort u} {r : α → α → Prop} (q : Quot r) : α
/--
Heterogeneous equality. `a ≍ b` asserts that `a` and `b` have the same
type, and casting `a` across the equality yields `b`, and vice versa.
You should avoid using this type if you can. Heterogeneous equality does not
have all the same properties as `Eq`, because the assumption that the types of
`a` and `b` are equal is often too weak to prove theorems of interest. One
important non-theorem is the analogue of `congr`: If `f ≍ g` and `x ≍ y`
and `f x` and `g y` are well typed it does not follow that `f x ≍ g y`.
(This does follow if you have `f = g` instead.) However if `a` and `b` have
the same type then `a = b` and `a ≍ b` are equivalent.
-/
inductive HEq : {α : Sort u} → α → {β : Sort u} → β → Prop where
/-- Reflexivity of heterogeneous equality. -/
| refl (a : α) : HEq a a
/-- A version of `HEq.refl` with an implicit argument. -/
@[match_pattern] protected def HEq.rfl {α : Sort u} {a : α} : HEq a a :=
@@ -599,23 +622,6 @@ theorem Or.resolve_left (h: Or a b) (na : Not a) : b := h.elim (absurd · na) i
theorem Or.resolve_right (h: Or a b) (nb : Not b) : a := h.elim id (absurd · nb)
theorem Or.neg_resolve_left (h : Or (Not a) b) (ha : a) : b := h.elim (absurd ha) id
theorem Or.neg_resolve_right (h : Or a (Not b)) (nb : b) : a := h.elim id (absurd nb)
/--
The Boolean values, `true` and `false`.
Logically speaking, this is equivalent to `Prop` (the type of propositions). The distinction is
important for programming: both propositions and their proofs are erased in the code generator,
while `Bool` corresponds to the Boolean type in most programming languages and carries precisely one
bit of run-time information.
-/
inductive Bool : Type where
/-- The Boolean value `false`, not to be confused with the proposition `False`. -/
| false : Bool
/-- The Boolean value `true`, not to be confused with the proposition `True`. -/
| true : Bool
export Bool (false true)
/--
All the elements of a type that satisfy a predicate.
@@ -3098,7 +3104,7 @@ Examples:
* `[["a"], ["b", "c"]].flatten = ["a", "b", "c"]`
* `[["a"], [], ["b", "c"], ["d", "e", "f"]].flatten = ["a", "b", "c", "d", "e", "f"]`
-/
def List.flatten : List (List α) → List α
noncomputable def List.flatten : List (List α) → List α
| nil => nil
| cons l L => List.append l (flatten L)
@@ -3125,7 +3131,7 @@ Examples:
* `[2, 3, 2].flatMap List.range = [0, 1, 0, 1, 2, 0, 1]`
* `["red", "blue"].flatMap String.toList = ['r', 'e', 'd', 'b', 'l', 'u', 'e']`
-/
@[inline] def List.flatMap {α : Type u} {β : Type v} (b : α → List β) (as : List α) : List β := flatten (map b as)
@[inline] noncomputable def List.flatMap {α : Type u} {β : Type v} (b : α → List β) (as : List α) : List β := flatten (map b as)
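The docstring examples for `flatten` and `flatMap` reduce definitionally, so they can be checked by `rfl`; a quick sanity check, assuming the surrounding definitions are in scope:

```lean
example : [[1], [2, 3]].flatten = [1, 2, 3] := rfl
example : [2, 3].flatMap (fun n => [n, n]) = [2, 2, 3, 3] := rfl
```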
/--
`Array α` is the type of [dynamic arrays](https://en.wikipedia.org/wiki/Dynamic_array) with elements
@@ -3453,7 +3459,7 @@ def String.utf8EncodeChar (c : Char) : List UInt8 :=
/-- Encode a list of characters (Unicode scalar values) in UTF-8. This is an inefficient model
implementation. Use `List.asString` instead. -/
def List.utf8Encode (l : List Char) : ByteArray :=
noncomputable def List.utf8Encode (l : List Char) : ByteArray :=
l.flatMap String.utf8EncodeChar |>.toByteArray
/-- A byte array is valid UTF-8 if it is of the form `List.Internal.utf8Encode m` for some `m`.


@@ -20,10 +20,28 @@ theorem ne_self (a : α) : (a ≠ a) = False := by simp
theorem not_true_eq : (¬ True) = False := by simp
theorem not_false_eq : (¬ False) = True := by simp
theorem or_eq_true_left (a b : Prop) (h : a = True) : (a ∨ b) = True := by simp [h]
theorem or_eq_right (a b : Prop) (h : a = False) : (a ∨ b) = b := by simp [h]
theorem and_eq_false_left (a b : Prop) (h : a = False) : (a ∧ b) = False := by simp [h]
theorem and_eq_left (a b : Prop) (h : a = True) : (a ∧ b) = b := by simp [h]
theorem ite_cond_congr {α : Sort u} (c : Prop) {inst : Decidable c} (a b : α)
(c' : Prop) {inst' : Decidable c'} (h : c = c') : @ite α c inst a b = @ite α c' inst' a b := by
simp [*]
theorem ite_true {α : Sort u} (c : Prop) {inst : Decidable c} (a b : α) {ht : c} : @ite α c inst a b = a := by
simp [*]
theorem ite_false {α : Sort u} (c : Prop) {inst : Decidable c} (a b : α) {ht : ¬ c} : @ite α c inst a b = b := by
simp [*]
theorem dite_true {α : Sort u} (c : Prop) {inst : Decidable c} (a : c → α) (b : ¬ c → α) {ht : c} : @dite α c inst a b = a ht := by
simp [*]
theorem dite_false {α : Sort u} (c : Prop) {inst : Decidable c} (a : c → α) (b : ¬ c → α) {ht : ¬ c} : @dite α c inst a b = b ht := by
simp [*]
theorem dite_cond_congr {α : Sort u} (c : Prop) {inst : Decidable c} (a : c → α) (b : ¬ c → α)
(c' : Prop) {inst' : Decidable c'} (h : c = c')
: @dite α c inst a b = @dite α c' inst' (fun h' => a (h.mpr_prop h')) (fun h' => b (h.mpr_not h')) := by
@@ -140,4 +158,13 @@ theorem Int.dvd_eq_true (a b : Int) (h : decide (a ∣ b) = true) : (a ∣ b) =
theorem Nat.dvd_eq_false (a b : Nat) (h : decide (a ∣ b) = false) : (a ∣ b) = False := by simp_all
theorem Int.dvd_eq_false (a b : Int) (h : decide (a ∣ b) = false) : (a ∣ b) = False := by simp_all
theorem decide_isTrue (p : Prop) {inst : Decidable p} {h : p} : decide p = true := by simp [*]
theorem decide_isTrue_congr (p p' : Prop) (heq : p = p') {inst : Decidable p} {hp : p'} : decide p = true := by simp [*]
theorem decide_isFalse (p : Prop) {inst : Decidable p} {h : ¬p} : decide p = false := by simp [*]
theorem decide_isFalse_congr (p p' : Prop) (heq : p = p') {inst : Decidable p} {hnp : ¬p'} : decide p = false := by simp [*]
theorem decide_prop_eq_true (p : Prop) {inst : Decidable p} (h : p = True) : decide p = true := by simp [*]
theorem decide_prop_eq_false (p : Prop) {inst : Decidable p} (h : p = False) : decide p = false := by simp [*]
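Inside the `Lean.Sym` namespace these bridging lemmas apply directly; a hypothetical use of `decide_prop_eq_true` (the concrete proposition is illustrative only):

```lean
example : decide (1 < 2) = true :=
  decide_prop_eq_true _ (eq_true (by decide))
```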
end Lean.Sym


@@ -2302,17 +2302,17 @@ It reduces terms by unfolding definitions using their defining equations and
applying matcher equations. The unfolding is propositional, so `cbv` also works
with functions defined via well-founded recursion or partial fixpoints.
`cbv` has built-in support for goals of the form `lhs = rhs`. It proceeds in
two passes:
1. Reduce `lhs`. If the result is definitionally equal to `rhs`, close the goal.
2. Otherwise, reduce `rhs`. If the result is now definitionally equal to the
reduced `lhs`, close the goal.
3. If neither check succeeds, generate a new goal `lhs' = rhs'`, where `lhs'`
and `rhs'` are the reduced forms of the original sides.
`cbv` reduces the goal type (and optionally hypothesis types) using call-by-value
evaluation. For equation goals (`lhs = rhs`), `cbv` automatically attempts `refl`
after reduction to close the goal.
`cbv` is therefore not a finishing tactic in general: it may leave a new
(simpler) equality goal. For goals that are not equalities, `cbv` currently
leaves the goal unchanged.
`cbv` supports the standard `at` location syntax:
- `cbv` — reduce the goal target
- `cbv at h` — reduce hypothesis `h`
- `cbv at h |-` — reduce hypothesis `h` and the goal target
- `cbv at *` — reduce all non-dependent propositional hypotheses and the goal target
`cbv` is not a finishing tactic in general: it may leave a new (simpler) goal.
The proofs produced by `cbv` only use the three standard axioms.
In particular, they do not require trust in the correctness of the code
@@ -2321,7 +2321,7 @@ generator.
This tactic is experimental and its behavior is likely to change in upcoming
releases of Lean.
-/
syntax (name := cbv) "cbv" : tactic
syntax (name := cbv) "cbv" (location)? : tactic
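With the optional `location`, the tactic can now also be pointed at hypotheses (`cbv at h`, `cbv at h |-`, `cbv at *`). An illustrative sketch, not taken from the test suite:

```lean
-- For an equation goal, `cbv` reduces both sides and then attempts `refl`.
example : 2 + 2 = 4 := by
  cbv
```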
/--
`decide_cbv` is a finishing tactic that closes goals of the form `p`, where `p`


@@ -30,6 +30,7 @@ variable {α : Sort _} {β : α → Sort _} {γ : (a : α) → β a → Sort _}
set_option doc.verso true
namespace WellFounded
open Relation
/--
The function implemented as the loop {lean}`opaqueFix R F a = F a (fun a _ => opaqueFix R F a)`.
@@ -85,6 +86,23 @@ public theorem extrinsicFix_eq_apply [∀ a, Nonempty (C a)] {R : α → α
simp only [extrinsicFix, dif_pos h]
rw [WellFounded.fix_eq]
public theorem extrinsicFix_invImage {α' : Sort _} [∀ a, Nonempty (C a)] (R : α → α → Prop) (f : α' → α)
(F : ∀ a, (∀ a', R a' a → C a') → C a) (F' : ∀ a, (∀ a', R (f a') (f a) → C (f a')) → C (f a))
(h : ∀ a r, F (f a) r = F' a fun a' hR => r (f a') hR) (a : α') (h : WellFounded R) :
extrinsicFix (C := (C <| f ·)) (InvImage R f) F' a = extrinsicFix (C := C) R F (f a) := by
have h' := h
rcases h with ⟨h⟩
specialize h (f a)
have : Acc (InvImage R f) a := InvImage.accessible _ h
clear h
induction this
rename_i ih
rw [extrinsicFix_eq_apply, extrinsicFix_eq_apply, h]
· congr; ext a x
rw [ih _ x]
· assumption
· exact InvImage.wf _ _
/--
A fixpoint combinator that allows for deferred proofs of termination.
@@ -242,4 +260,273 @@ nontrivial properties about it.
-/
add_decl_doc extrinsicFix₃
/--
A fixpoint combinator that can be used to construct recursive functions with an
*extrinsic, partial* proof of termination.
Given a relation {name}`R` and a fixpoint functional {name}`F` which must be decreasing with respect
to {name}`R`, {lean}`partialExtrinsicFix R F` is the recursive function obtained by having {name}`F` call
itself recursively.
For each input {given}`a`, {lean}`partialExtrinsicFix R F a` can be verified given a *partial* termination
proof. The precise semantics are as follows.
If {lean}`Acc R a` does not hold, {lean}`partialExtrinsicFix R F a` might run forever. In this case,
nothing interesting can be proved about the recursive function; it is opaque and behaves like a
recursive function with the `partial` modifier.
If {lean}`Acc R a` _does_ hold, {lean}`partialExtrinsicFix R F a` is equivalent to
{lean}`F a (fun a' _ => partialExtrinsicFix R F a')`, both logically and regarding its termination behavior.
In particular, if {name}`R` is well-founded, {lean}`partialExtrinsicFix R F a` is equivalent to
{lean}`WellFounded.fix _ F`.
-/
@[inline]
public def partialExtrinsicFix [∀ a, Nonempty (C a)] (R : α → α → Prop)
(F : ∀ a, (∀ a', R a' a → C a') → C a) (a : α) : C a :=
extrinsicFix (α := { a' : α // a' = a ∨ TransGen R a' a }) (C := (C ·.1))
(fun p q => R p.1 q.1)
(fun a recur => F a.1 fun a' hR => recur ⟨a', by
rcases a.property with ha | ha
· exact Or.inr (TransGen.single (ha ▸ hR))
· apply Or.inr
apply TransGen.trans ?_ _
apply TransGen.single
assumption⟩ _) ⟨a, Or.inl rfl⟩
public theorem partialExtrinsicFix_eq_apply_of_acc [∀ a, Nonempty (C a)] {R : α → α → Prop}
{F : ∀ a, (∀ a', R a' a → C a') → C a} {a : α} (h : Acc R a) :
partialExtrinsicFix R F a = F a (fun a' _ => partialExtrinsicFix R F a') := by
simp only [partialExtrinsicFix]
rw [extrinsicFix_eq_apply]
congr; ext a' hR
let f (x : { x : α // x = a' ∨ TransGen R x a' }) : { x : α // x = a ∨ TransGen R x a } :=
⟨x.val, by
cases x.property
· rename_i h
rw [h]
exact Or.inr (.single hR)
· rename_i h
apply Or.inr
refine TransGen.trans h ?_
exact .single hR⟩
have := extrinsicFix_invImage (C := (C ·.val)) (R := (R ·.1 ·.1)) (f := f)
(F := fun a r => F a.1 fun a' hR => r ⟨a', Or.inr (by rcases a.2 with ha | ha; exact .single (ha ▸ hR); exact .trans (.single hR) _)⟩ hR)
(F' := fun a r => F a.1 fun a' hR => r ⟨a', by rcases a.2 with ha | ha; exact .inr (.single (ha ▸ hR)); exact .inr (.trans (.single hR) _)⟩ hR)
unfold InvImage at this
rw [this]
· simp +zetaDelta
· constructor
intro x
refine InvImage.accessible _ ?_
cases x.2 <;> rename_i hx
· rwa [hx]
· exact h.inv_of_transGen hx
· constructor
intro x
refine InvImage.accessible _ ?_
cases x.2 <;> rename_i hx
· rwa [hx]
· exact h.inv_of_transGen hx
public theorem partialExtrinsicFix_eq_apply [∀ a, Nonempty (C a)] {R : α → α → Prop}
{F : ∀ a, (∀ a', R a' a → C a') → C a} {a : α} (wf : WellFounded R) :
partialExtrinsicFix R F a = F a (fun a' _ => partialExtrinsicFix R F a') :=
partialExtrinsicFix_eq_apply_of_acc (wf.apply _)
public theorem partialExtrinsicFix_eq_fix [∀ a, Nonempty (C a)] {R : α → α → Prop}
{F : ∀ a, (∀ a', R a' a → C a') → C a}
(wf : WellFounded R) {a : α} :
partialExtrinsicFix R F a = wf.fix F a := by
have h := wf.apply a
induction h with | intro a' h ih
rw [partialExtrinsicFix_eq_apply_of_acc (Acc.intro _ h), WellFounded.fix_eq]
congr 1; ext a'' hR
exact ih _ hR
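On a well-founded relation the combinator therefore behaves exactly like the ordinary fixpoint. A hypothetical instantiation (the function and names are illustrative, not from this file): countdown on `Nat` through `partialExtrinsicFix`, which `partialExtrinsicFix_eq_fix` would equate with `WellFounded.fix`:

```lean
example : Nat → Nat :=
  partialExtrinsicFix (· < ·) fun n recur =>
    match n with
    | 0 => 0
    | m + 1 => recur m (Nat.lt_succ_self m)
```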
/--
A 2-ary fixpoint combinator that can be used to construct recursive functions with an
*extrinsic, partial* proof of termination.
Given a relation {name}`R` and a fixpoint functional {name}`F` which must be decreasing with respect
to {name}`R`, {lean}`partialExtrinsicFix₂ R F` is the recursive function obtained by having {name}`F` call
itself recursively.
For each pair of inputs {given}`a` and {given}`b`, {lean}`partialExtrinsicFix₂ R F a b` can be verified
given a *partial* termination proof. The precise semantics are as follows.
If {lean}`Acc R ⟨a, b⟩` does not hold, {lean}`partialExtrinsicFix₂ R F a b` might run forever. In this
case, nothing interesting can be proved about the recursive function; it is opaque and behaves like
a recursive function with the `partial` modifier.
If {lean}`Acc R ⟨a, b⟩` _does_ hold, {lean}`partialExtrinsicFix₂ R F a b` is equivalent to
{lean}`F a b (fun a' b' _ => partialExtrinsicFix₂ R F a' b')`, both logically and regarding its
termination behavior.
In particular, if {name}`R` is well-founded, {lean}`partialExtrinsicFix₂ R F a b` is equivalent to
a well-founded fixpoint.
-/
@[inline]
public def partialExtrinsicFix₂ [∀ a b, Nonempty (C₂ a b)]
(R : (a : α) ×' β a → (a : α) ×' β a → Prop)
(F : (a : α) → (b : β a) → ((a' : α) → (b' : β a') → R ⟨a', b'⟩ ⟨a, b⟩ → C₂ a' b') → C₂ a b)
(a : α) (b : β a) :
C₂ a b :=
extrinsicFix₂ (α := α) (β := fun a' => { b' : β a' // (PSigma.mk a' b') = (PSigma.mk a b) ∨ TransGen R ⟨a', b'⟩ ⟨a, b⟩ })
(C₂ := (C₂ · ·.1))
(fun p q => R ⟨p.1, p.2.1⟩ ⟨q.1, q.2.1⟩)
(fun a b recur => F a b.1 fun a' b' hR => recur a' ⟨b', Or.inr (by
rcases b.property with hb | hb
· exact .single (hb ▸ hR)
· apply TransGen.trans ?_ _
apply TransGen.single
assumption)⟩ _) a ⟨b, Or.inl rfl⟩
public theorem partialExtrinsicFix₂_eq_partialExtrinsicFix [∀ a b, Nonempty (C₂ a b)]
{R : (a : α) ×' β a → (a : α) ×' β a → Prop}
{F : (a : α) → (b : β a) → ((a' : α) → (b' : β a') → R ⟨a', b'⟩ ⟨a, b⟩ → C₂ a' b') → C₂ a b}
{a : α} {b : β a} (h : Acc R ⟨a, b⟩) :
partialExtrinsicFix₂ R F a b = partialExtrinsicFix (α := PSigma β) (C := fun a => C₂ a.1 a.2) R (fun p r => F p.1 p.2 fun a' b' hR => r ⟨a', b'⟩ hR) ⟨a, b⟩ := by
simp only [partialExtrinsicFix, partialExtrinsicFix₂, extrinsicFix₂]
let f (x : ((a' : α) ×' { b' // PSigma.mk a' b' = ⟨a, b⟩ ∨ TransGen R ⟨a', b'⟩ ⟨a, b⟩ })) : { a' // a' = ⟨a, b⟩ ∨ TransGen R a' ⟨a, b⟩ } :=
⟨⟨x.1, x.2.1⟩, x.2.2⟩
have := extrinsicFix_invImage (C := fun a => C₂ a.1.1 a.1.2) (f := f) (R := (R ·.1 ·.1))
(F := fun a r => F a.1.1 a.1.2 fun a' b' hR => r ⟨⟨a', b'⟩, ?refine_a⟩ hR)
(F' := fun a r => F a.1 a.2.1 fun a' b' hR => r ⟨a', b', ?refine_b⟩ hR)
(a := ⟨⟨a, b⟩, ?refine_c⟩); rotate_left
· cases a.2 <;> rename_i heq
· rw [heq] at hR
exact .inr (.single hR)
· exact .inr (.trans (.single hR) heq)
· cases a.2.2 <;> rename_i heq
· rw [heq] at hR
exact .inr (.single hR)
· exact .inr (.trans (.single hR) heq)
· exact .inl rfl
unfold InvImage f at this
simp at this
rw [this]
constructor
intro x
apply InvImage.accessible
cases x.2 <;> rename_i heq
· rwa [heq]
· exact h.inv_of_transGen heq
public theorem partialExtrinsicFix₂_eq_apply_of_acc [∀ a b, Nonempty (C₂ a b)]
{R : (a : α) ×' β a → (a : α) ×' β a → Prop}
{F : (a : α) → (b : β a) → ((a' : α) → (b' : β a') → R ⟨a', b'⟩ ⟨a, b⟩ → C₂ a' b') → C₂ a b}
{a : α} {b : β a} (wf : Acc R ⟨a, b⟩) :
partialExtrinsicFix₂ R F a b = F a b (fun a' b' _ => partialExtrinsicFix₂ R F a' b') := by
rw [partialExtrinsicFix₂_eq_partialExtrinsicFix wf, partialExtrinsicFix_eq_apply_of_acc wf]
congr 1; ext a' b' hR
rw [partialExtrinsicFix₂_eq_partialExtrinsicFix (wf.inv hR)]
public theorem partialExtrinsicFix₂_eq_apply [∀ a b, Nonempty (C₂ a b)]
{R : (a : α) ×' β a → (a : α) ×' β a → Prop}
{F : (a : α) → (b : β a) → ((a' : α) → (b' : β a') → R ⟨a', b'⟩ ⟨a, b⟩ → C₂ a' b') → C₂ a b}
{a : α} {b : β a} (wf : WellFounded R) :
partialExtrinsicFix₂ R F a b = F a b (fun a' b' _ => partialExtrinsicFix₂ R F a' b') :=
partialExtrinsicFix₂_eq_apply_of_acc (wf.apply _)
public theorem partialExtrinsicFix₂_eq_fix [∀ a b, Nonempty (C₂ a b)]
{R : (a : α) ×' β a → (a : α) ×' β a → Prop}
{F : ∀ a b, (∀ a' b', R ⟨a', b'⟩ ⟨a, b⟩ → C₂ a' b') → C₂ a b}
(wf : WellFounded R) {a b} :
partialExtrinsicFix₂ R F a b = wf.fix (fun x G => F x.1 x.2 (fun a b h => G ⟨a, b⟩ h)) ⟨a, b⟩ := by
rw [partialExtrinsicFix₂_eq_partialExtrinsicFix (wf.apply _), partialExtrinsicFix_eq_fix wf]
/--
A 3-ary fixpoint combinator that can be used to construct recursive functions with an
*extrinsic, partial* proof of termination.
Given a relation {name}`R` and a fixpoint functional {name}`F` which must be decreasing with respect
to {name}`R`, {lean}`partialExtrinsicFix₃ R F` is the recursive function obtained by having {name}`F` call
itself recursively.
For each triple of inputs {given}`a`, {given}`b`, and {given}`c`, {lean}`partialExtrinsicFix₃ R F a b c` can be
verified given a *partial* termination proof. The precise semantics are as follows.
If {lean}`Acc R ⟨a, b, c⟩` does not hold, {lean}`partialExtrinsicFix₃ R F a b c` might run forever. In this
case, nothing interesting can be proved about the recursive function; it is opaque and behaves like
a recursive function with the `partial` modifier.
If {lean}`Acc R ⟨a, b, c⟩` _does_ hold, {lean}`partialExtrinsicFix₃ R F a b c` is equivalent to
{lean}`F a b c (fun a' b' c' _ => partialExtrinsicFix₃ R F a' b' c')`, both logically and regarding its
termination behavior.
In particular, if {name}`R` is well-founded, {lean}`partialExtrinsicFix₃ R F a b c` is equivalent to
a well-founded fixpoint.
-/
@[inline]
public def partialExtrinsicFix₃ [∀ a b c, Nonempty (C₃ a b c)]
(R : (a : α) ×' (b : β a) ×' γ a b → (a : α) ×' (b : β a) ×' γ a b → Prop)
(F : (a : α) → (b : β a) → (c : γ a b) → ((a' : α) → (b' : β a') → (c' : γ a' b') → R ⟨a', b', c'⟩ ⟨a, b, c⟩ → C₃ a' b' c') → C₃ a b c)
(a : α) (b : β a) (c : γ a b) :
C₃ a b c :=
extrinsicFix₃ (α := α) (β := β) (γ := fun a' b' => { c' : γ a' b' // (⟨a', b', c'⟩ : (a : α) ×' (b : β a) ×' γ a b) = ⟨a, b, c⟩ ∨ TransGen R ⟨a', b', c'⟩ ⟨a, b, c⟩ })
(C₃ := (C₃ · · ·.1))
(fun p q => R ⟨p.1, p.2.1, p.2.2.1⟩ ⟨q.1, q.2.1, q.2.2.1⟩)
(fun a b c recur => F a b c.1 fun a' b' c' hR => recur a' b' ⟨c', Or.inr (by
rcases c.property with hb | hb
· exact .single (hb ▸ hR)
· apply TransGen.trans ?_ _
apply TransGen.single
assumption)⟩ _) a b ⟨c, Or.inl rfl⟩
public theorem partialExtrinsicFix₃_eq_partialExtrinsicFix [∀ a b c, Nonempty (C₃ a b c)]
{R : (a : α) ×' (b : β a) ×' γ a b → (a : α) ×' (b : β a) ×' γ a b → Prop}
{F : (a : α) → (b : β a) → (c : γ a b) → ((a' : α) → (b' : β a') → (c' : γ a' b') → R ⟨a', b', c'⟩ ⟨a, b, c⟩ → C₃ a' b' c') → C₃ a b c}
{a : α} {b : β a} {c : γ a b} (h : Acc R ⟨a, b, c⟩) :
partialExtrinsicFix₃ R F a b c = partialExtrinsicFix (α := (a : α) ×' (b : β a) ×' γ a b) (C := fun a => C₃ a.1 a.2.1 a.2.2) R (fun p r => F p.1 p.2.1 p.2.2 fun a' b' c' hR => r ⟨a', b', c'⟩ hR) ⟨a, b, c⟩ := by
simp only [partialExtrinsicFix, partialExtrinsicFix₃, extrinsicFix₃]
let f (x : ((a' : α) ×' (b' : β a') ×' { c' // (⟨a', b', c'⟩ : (a : α) ×' (b : β a) ×' γ a b) = ⟨a, b, c⟩ ∨ TransGen R ⟨a', b', c'⟩ ⟨a, b, c⟩ })) : { a' // a' = ⟨a, b, c⟩ ∨ TransGen R a' ⟨a, b, c⟩ } :=
⟨⟨x.1, x.2.1, x.2.2.1⟩, x.2.2.2⟩
have := extrinsicFix_invImage (C := fun a => C₃ a.1.1 a.1.2.1 a.1.2.2) (f := f) (R := (R ·.1 ·.1))
(F := fun a r => F a.1.1 a.1.2.1 a.1.2.2 fun a' b' c' hR => r ⟨⟨a', b', c'⟩, ?refine_a⟩ hR)
(F' := fun a r => F a.1 a.2.1 a.2.2.1 fun a' b' c' hR => r ⟨a', b', c', ?refine_b⟩ hR)
(a := ⟨⟨a, b, c⟩, ?refine_c⟩); rotate_left
· cases a.2 <;> rename_i heq
· rw [heq] at hR
exact .inr (.single hR)
· exact .inr (.trans (.single hR) heq)
· cases a.2.2.2 <;> rename_i heq
· rw [heq] at hR
exact .inr (.single hR)
· exact .inr (.trans (.single hR) heq)
· exact .inl rfl
unfold InvImage f at this
simp at this
rw [this]
constructor
intro x
apply InvImage.accessible
cases x.2 <;> rename_i heq
· rwa [heq]
· exact h.inv_of_transGen heq
public theorem partialExtrinsicFix₃_eq_apply_of_acc [∀ a b c, Nonempty (C₃ a b c)]
{R : (a : α) ×' (b : β a) ×' γ a b → (a : α) ×' (b : β a) ×' γ a b → Prop}
{F : ∀ (a b c), (∀ (a' b' c'), R ⟨a', b', c'⟩ ⟨a, b, c⟩ → C₃ a' b' c') → C₃ a b c}
{a : α} {b : β a} {c : γ a b} (wf : Acc R ⟨a, b, c⟩) :
partialExtrinsicFix₃ R F a b c = F a b c (fun a b c _ => partialExtrinsicFix₃ R F a b c) := by
rw [partialExtrinsicFix₃_eq_partialExtrinsicFix wf, partialExtrinsicFix_eq_apply_of_acc wf]
congr 1; ext a' b' c' hR
rw [partialExtrinsicFix₃_eq_partialExtrinsicFix (wf.inv hR)]
public theorem partialExtrinsicFix₃_eq_apply [∀ a b c, Nonempty (C₃ a b c)]
{R : (a : α) ×' (b : β a) ×' γ a b → (a : α) ×' (b : β a) ×' γ a b → Prop}
{F : ∀ (a b c), (∀ (a' b' c'), R ⟨a', b', c'⟩ ⟨a, b, c⟩ → C₃ a' b' c') → C₃ a b c}
{a : α} {b : β a} {c : γ a b} (wf : WellFounded R) :
partialExtrinsicFix₃ R F a b c = F a b c (fun a b c _ => partialExtrinsicFix₃ R F a b c) :=
partialExtrinsicFix₃_eq_apply_of_acc (wf.apply _)
public theorem partialExtrinsicFix₃_eq_fix [∀ a b c, Nonempty (C₃ a b c)]
{R : (a : α) ×' (b : β a) ×' γ a b → (a : α) ×' (b : β a) ×' γ a b → Prop}
{F : ∀ a b c, (∀ a' b' c', R ⟨a', b', c'⟩ ⟨a, b, c⟩ → C₃ a' b' c') → C₃ a b c}
(wf : WellFounded R) {a b c} :
partialExtrinsicFix₃ R F a b c = wf.fix (fun x G => F x.1 x.2.1 x.2.2 (fun a b c h => G ⟨a, b, c⟩ h)) ⟨a, b, c⟩ := by
rw [partialExtrinsicFix₃_eq_partialExtrinsicFix (wf.apply _), partialExtrinsicFix_eq_fix wf]
end WellFounded


@@ -10,18 +10,14 @@ public import Lean.Compiler.IR.AddExtern
public import Lean.Compiler.IR.Basic
public import Lean.Compiler.IR.Format
public import Lean.Compiler.IR.CompilerM
public import Lean.Compiler.IR.PushProj
public import Lean.Compiler.IR.NormIds
public import Lean.Compiler.IR.Checker
public import Lean.Compiler.IR.ExpandResetReuse
public import Lean.Compiler.IR.UnboxResult
public import Lean.Compiler.IR.EmitC
public import Lean.Compiler.IR.Sorry
public import Lean.Compiler.IR.ToIR
public import Lean.Compiler.IR.ToIRType
public import Lean.Compiler.IR.Meta
public import Lean.Compiler.IR.SimpleGroundExpr
public import Lean.Compiler.IR.ElimDeadVars
-- The following imports are not required by the compiler. They are here to ensure that there
-- are no orphaned modules.
@@ -36,15 +32,9 @@ def compile (decls : Array Decl) : CompilerM (Array Decl) := do
logDecls `init decls
checkDecls decls
let mut decls := decls
if Compiler.LCNF.compiler.reuse.get (← getOptions) then
decls := decls.map Decl.expandResetReuse
logDecls `expand_reset_reuse decls
decls := decls.map Decl.pushProj
logDecls `push_proj decls
decls ← updateSorryDep decls
logDecls `result decls
checkDecls decls
decls.forM Decl.detectSimpleGround
addDecls decls
inferMeta decls
return decls


@@ -1,72 +0,0 @@
/-
Copyright (c) 2019 Microsoft Corporation. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Leonardo de Moura
-/
module
prelude
public import Lean.Compiler.IR.FreeVars
public section
namespace Lean.IR
/--
This function implements a simple heuristic for let values that we know may be dropped because they
are pure.
-/
private def safeToElim (e : Expr) : Bool :=
match e with
| .ctor .. | .reset .. | .reuse .. | .proj .. | .uproj .. | .sproj .. | .box .. | .unbox ..
| .lit .. | .isShared .. | .pap .. => true
-- 0-ary full applications are considered constants
| .fap _ args => args.isEmpty
| .ap .. => false
partial def reshapeWithoutDead (bs : Array FnBody) (term : FnBody) : FnBody :=
let rec reshape (bs : Array FnBody) (b : FnBody) (used : IndexSet) :=
if bs.isEmpty then b
else
let curr := bs.back!
let bs := bs.pop
let keep (_ : Unit) :=
let used := curr.collectFreeIndices used
let b := curr.setBody b
reshape bs b used
let keepIfUsedJp (vidx : Index) :=
if used.contains vidx then
keep ()
else
reshape bs b used
let keepIfUsedLet (vidx : Index) (val : Expr) :=
if used.contains vidx || !safeToElim val then
keep ()
else
reshape bs b used
match curr with
| FnBody.vdecl x _ e _ => keepIfUsedLet x.idx e
-- TODO: we should keep all struct/union projections because they are used to ensure struct/union values are fully consumed.
| FnBody.jdecl j _ _ _ => keepIfUsedJp j.idx
| _ => keep ()
reshape bs term term.freeIndices
partial def FnBody.elimDead (b : FnBody) : FnBody :=
let (bs, term) := b.flatten
let bs := modifyJPs bs elimDead
let term := match term with
| FnBody.case tid x xType alts =>
let alts := alts.map fun alt => alt.modifyBody elimDead
FnBody.case tid x xType alts
| other => other
reshapeWithoutDead bs term
/-- Eliminate dead let-declarations and join points -/
def Decl.elimDead (d : Decl) : Decl :=
match d with
| .fdecl (body := b) .. => d.updateBody! b.elimDead
| other => other
builtin_initialize registerTraceClass `compiler.ir.elim_dead (inherited := true)
end Lean.IR


@@ -13,7 +13,7 @@ public import Lean.Compiler.IR.SimpCase
public import Lean.Compiler.ModPkgExt
import Lean.Compiler.LCNF.Types
import Lean.Compiler.ClosedTermCache
import Lean.Compiler.IR.SimpleGroundExpr
import Lean.Compiler.LCNF.SimpleGroundExpr
import Init.Omega
import Init.While
import Init.Data.Range.Polymorphic.Iterators
@@ -22,7 +22,9 @@ import Lean.Runtime
public section
namespace Lean.IR.EmitC
open Lean.Compiler.LCNF (isBoxedName)
open Lean.Compiler.LCNF (isBoxedName isSimpleGroundDecl getSimpleGroundExpr
getSimpleGroundExprWithResolvedRefs uint64ToByteArrayLE SimpleGroundExpr SimpleGroundArg
addSimpleGroundDecl)
def leanMainFn := "_lean_main"
@@ -39,9 +41,9 @@ abbrev M := ReaderT Context (EStateM String String)
@[inline] def getModName : M Name := Context.modName <$> read
@[inline] def getModInitFn : M String := do
@[inline] def getModInitFn (phases : IRPhases) : M String := do
let pkg? := (← getEnv).getModulePackage?
return mkModuleInitializationFunctionName (← getModName) pkg?
return mkModuleInitializationFunctionName (phases := phases) (← getModName) pkg?
def getDecl (n : Name) : M Decl := do
let env ← getEnv
@@ -174,6 +176,23 @@ where
| .nameMkStr args =>
let obj ← groundNameMkStrToCLit args
mkValueCLit "lean_ctor_object" obj
| .array elems =>
let leanArrayTag := 246
let header := mkHeader s!"sizeof(lean_array_object) + sizeof(void*)*{elems.size}" 0 leanArrayTag
let elemLits ← elems.mapM groundArgToCLit
let dataArray := String.intercalate "," elemLits.toList
mkValueCLit
"lean_array_object"
s!"\{.m_header = {header}, .m_size = {elems.size}, .m_capacity = {elems.size}, .m_data = \{{dataArray}}}"
| .byteArray data =>
let leanScalarArrayTag := 248
let elemSize : Nat := 1
let header := mkHeader s!"sizeof(lean_sarray_object) + {data.size}" elemSize leanScalarArrayTag
let dataLits := data.map toString
let dataArray := String.intercalate "," dataLits.toList
mkValueCLit
"lean_sarray_object"
s!"\{.m_header = {header}, .m_size = {data.size}, .m_capacity = {data.size}, .m_data = \{{dataArray}}}"
| .reference refDecl => findValueDecl refDecl
mkValueName (name : String) : String :=
@@ -222,7 +241,7 @@ where
break
return mkValueName (← toCName decl)
compileCtor (cidx : Nat) (objArgs : Array SimpleGroundArg) (usizeArgs : Array USize)
compileCtor (cidx : Nat) (objArgs : Array SimpleGroundArg) (usizeArgs : Array UInt64)
(scalarArgs : Array UInt8) : GroundM String := do
let header := mkCtorHeader objArgs.size usizeArgs.size scalarArgs.size cidx
let objArgs ← objArgs.mapM groundArgToCLit
@@ -343,7 +362,7 @@ def emitMainFn : M Unit := do
/- We disable panic messages because they do not mesh well with extracted closed terms.
See issue #534. We can remove this workaround after we implement issue #467. -/
emitLn "lean_set_panic_messages(false);"
emitLn s!"res = {← getModInitFn}(1 /* builtin */);"
emitLn s!"res = {← getModInitFn (phases := if env.header.isModule then .runtime else .all)}(1 /* builtin */);"
emitLn "lean_set_panic_messages(true);"
emitLns ["lean_io_mark_end_initialization();",
"if (lean_io_result_is_ok(res)) {",
@@ -470,7 +489,7 @@ def emitDec (x : VarId) (n : Nat) (checkRef : Bool) : M Unit := do
emitLn ");"
def emitDel (x : VarId) : M Unit := do
emit "lean_free_object("; emit x; emitLn ");"
emit "lean_del_object("; emit x; emitLn ");"
def emitSetTag (x : VarId) (i : Nat) : M Unit := do
emit "lean_ctor_set_tag("; emit x; emit ", "; emit i; emitLn ");"
@@ -887,24 +906,21 @@ def emitMarkPersistent (d : Decl) (n : Name) : M Unit := do
emitCName n
emitLn ");"
def emitDeclInit (d : Decl) : M Unit := do
def withErrRet (emitIORes : M Unit) : M Unit := do
emit "res = "; emitIORes; emitLn ";"
emitLn "if (lean_io_result_is_error(res)) return res;"
def emitDeclInit (d : Decl) (isBuiltin : Bool) : M Unit := do
let env ← getEnv
let n := d.name
if isIOUnitInitFn env n then
if isIOUnitBuiltinInitFn env n then
emit "if (builtin) {"
emit "res = "; emitCName n; emitLn "();"
emitLn "if (lean_io_result_is_error(res)) return res;"
if (isBuiltin && isIOUnitBuiltinInitFn env n) || isIOUnitInitFn env n then
withErrRet do
emitCName n; emitLn "()"
emitLn "lean_dec_ref(res);"
if isIOUnitBuiltinInitFn env n then
emit "}"
else if d.params.size == 0 then
match getInitFnNameFor? env d.name with
| some initFn =>
if getBuiltinInitFnNameFor? env d.name |>.isSome then
emit "if (builtin) {"
emit "res = "; emitCName initFn; emitLn "();"
emitLn "if (lean_io_result_is_error(res)) return res;"
if let some initFn := (guard isBuiltin *> getBuiltinInitFnNameFor? env d.name) <|> getInitFnNameFor? env d.name then
withErrRet do
emitCName initFn; emitLn "()"
emitCName n
if d.resultType.isScalar then
emitLn (" = " ++ getUnboxOpName d.resultType ++ "(lean_io_result_get_value(res));")
@@ -912,41 +928,78 @@ def emitDeclInit (d : Decl) : M Unit := do
emitLn " = lean_io_result_get_value(res);"
emitMarkPersistent d n
emitLn "lean_dec_ref(res);"
if getBuiltinInitFnNameFor? env d.name |>.isSome then
emit "}"
| _ =>
if !isClosedTermName env d.name && !isSimpleGroundDecl env d.name then
emitCName n; emit " = "; emitCInitName n; emitLn "();"; emitMarkPersistent d n
else if !isClosedTermName env d.name && !isSimpleGroundDecl env d.name then
emitCName n; emit " = "; emitCInitName n; emitLn "();"; emitMarkPersistent d n
def emitInitFn : M Unit := do
def emitInitFn (phases : IRPhases) : M Unit := do
let env ← getEnv
let impInitFns ← env.imports.mapM fun imp => do
let impInitFns ← env.imports.filterMapM fun imp => do
if phases != .all && imp.isMeta != (phases == .comptime) then
return none
let some idx := env.getModuleIdx? imp.module
| throw "(internal) import without module index" -- should be unreachable
let pkg? := env.getModulePackageByIdx? idx
let fn := mkModuleInitializationFunctionName (phases := if phases == .all then .all else if imp.isMeta then .runtime else phases) imp.module pkg?
emitLn s!"lean_object* {fn}(uint8_t builtin);"
return some fn
let initialized := s!"_G_{mkModuleInitializationPrefix phases}initialized"
emitLns [
s!"static bool {initialized} = false;",
s!"LEAN_EXPORT lean_object* {← getModInitFn (phases := phases)}(uint8_t builtin) \{",
"lean_object * res;",
s!"if ({initialized}) return lean_io_result_mk_ok(lean_box(0));",
s!"{initialized} = true;"
]
impInitFns.forM fun fn => do
withErrRet do
emitLn s!"{fn}(builtin)"
emitLn "lean_dec_ref(res);"
let decls := getDecls env
for d in decls.reverse do
if phases == .all || (phases == .comptime) == isMarkedMeta env d.name then
emitDeclInit d (isBuiltin := phases != .comptime)
emitLns ["return lean_io_result_mk_ok(lean_box(0));", "}"]
/-- Init function used before phase split under module system, keep for compatibility. -/
def emitLegacyInitFn : M Unit := do
let env ← getEnv
let impInitFns ← env.imports.filterMapM fun imp => do
let some idx := env.getModuleIdx? imp.module
| throw "(internal) import without module index" -- should be unreachable
let pkg? := env.getModulePackageByIdx? idx
let fn := mkModuleInitializationFunctionName imp.module pkg?
emitLn s!"lean_object* {fn}(uint8_t builtin);"
return fn
return some fn
let initialized := s!"_G_initialized"
emitLns [
"static bool _G_initialized = false;",
s!"LEAN_EXPORT lean_object* {← getModInitFn}(uint8_t builtin) \{",
s!"static bool {initialized} = false;",
s!"LEAN_EXPORT lean_object* {← getModInitFn (phases := .all)}(uint8_t builtin) \{",
"lean_object * res;",
"if (_G_initialized) return lean_io_result_mk_ok(lean_box(0));",
"_G_initialized = true;"
s!"if ({initialized}) return lean_io_result_mk_ok(lean_box(0));",
s!"{initialized} = true;"
]
impInitFns.forM fun fn => emitLns [
s!"res = {fn}(builtin);",
"if (lean_io_result_is_error(res)) return res;",
"lean_dec_ref(res);"]
let decls := getDecls env
decls.reverse.forM emitDeclInit
emitLns ["return lean_io_result_mk_ok(lean_box(0));", "}"]
impInitFns.forM fun fn => do
withErrRet do
emitLn s!"{fn}(builtin)"
emitLn "lean_dec_ref(res);"
withErrRet do
emitLn s!"{← getModInitFn (phases := .runtime)}(builtin)"
emitLn "lean_dec_ref(res);"
withErrRet do
emitLn s!"{← getModInitFn (phases := .comptime)}(builtin)"
emitLn "lean_dec_ref(res);"
emitLns [s!"return {← getModInitFn (phases := .all)}(builtin);", "}"]
def main : M Unit := do
emitFileHeader
emitFnDecls
emitFns
emitInitFn
if (← getEnv).header.isModule then
emitInitFn (phases := .runtime)
emitInitFn (phases := .comptime)
emitLegacyInitFn
else
emitInitFn (phases := .all)
emitMainFnIfNeeded
emitFileFooter


@@ -1081,7 +1081,7 @@ def emitSSet (builder : LLVM.Builder llvmctx) (x : VarId) (n : Nat) (offset : Na
def emitDel (builder : LLVM.Builder llvmctx) (x : VarId) : M llvmctx Unit := do
let argtys := #[← LLVM.voidPtrType llvmctx]
let retty ← LLVM.voidType llvmctx
let fn ← getOrCreateFunctionPrototype (← getLLVMModule) retty "lean_free_object" argtys
let fn ← getOrCreateFunctionPrototype (← getLLVMModule) retty "lean_del_object" argtys
let xv ← emitLhsVal builder x
let fnty ← LLVM.functionType retty argtys
let _ ← LLVM.buildCall2 builder fnty fn #[xv]


@@ -21,7 +21,7 @@ def isTailCallTo (g : Name) (b : FnBody) : Bool :=
| _ => false
def usesModuleFrom (env : Environment) (modulePrefix : Name) : Bool :=
env.allImportedModuleNames.toList.any fun modName => modulePrefix.isPrefixOf modName
env.header.modules.any fun mod => mod.irPhases != .comptime && modulePrefix.isPrefixOf mod.module
namespace CollectUsedDecls


@@ -1,288 +0,0 @@
/-
Copyright (c) 2019 Microsoft Corporation. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Leonardo de Moura
-/
module
prelude
public import Lean.Compiler.IR.CompilerM
public import Lean.Compiler.IR.NormIds
public import Lean.Compiler.IR.FreeVars
import Init.Omega
public section
namespace Lean.IR.ExpandResetReuse
/-- Mapping from variable to projections -/
abbrev ProjMap := Std.HashMap VarId Expr
namespace CollectProjMap
abbrev Collector := ProjMap → ProjMap
@[inline] def collectVDecl (x : VarId) (v : Expr) : Collector := fun m =>
match v with
| .proj .. => m.insert x v
| .sproj .. => m.insert x v
| .uproj .. => m.insert x v
| _ => m
partial def collectFnBody : FnBody → Collector
| .vdecl x _ v b => collectVDecl x v ∘ collectFnBody b
| .jdecl _ _ v b => collectFnBody v ∘ collectFnBody b
| .case _ _ _ alts => fun s => alts.foldl (fun s alt => collectFnBody alt.body s) s
| e => if e.isTerminal then id else collectFnBody e.body
end CollectProjMap
/-- Create a mapping from variables to projections.
This function assumes variable ids have been normalized -/
def mkProjMap (d : Decl) : ProjMap :=
match d with
| .fdecl (body := b) .. => CollectProjMap.collectFnBody b {}
| _ => {}
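The `Collector` idiom above (state-transformer functions `σ → σ` chained with `∘`) can be sketched in miniature; the names below are illustrative only, collecting even numbers into a list instead of projections into a `ProjMap`:

```lean
abbrev Collector' := List Nat → List Nat

-- Record `x` when it is even, analogous to `collectVDecl` recording projections.
def collectEven (x : Nat) : Collector' := fun acc =>
  if x % 2 == 0 then x :: acc else acc

-- Traverse and compose, analogous to `collectFnBody`.
def collectAll : List Nat → Collector'
  | [] => id
  | x :: xs => collectEven x ∘ collectAll xs

#eval collectAll [1, 2, 3, 4] []  -- [2, 4]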
structure Context where
projMap : ProjMap
/-- Return true iff `x` is consumed in all branches of the current block.
Here consumption means the block contains a `dec x` or `reuse x ...`. -/
partial def consumed (x : VarId) : FnBody → Bool
| .vdecl _ _ v b =>
match v with
| Expr.reuse y _ _ _ => x == y || consumed x b
| _ => consumed x b
| .dec y _ _ _ b => x == y || consumed x b
| .case _ _ _ alts => alts.all fun alt => consumed x alt.body
| e => !e.isTerminal && consumed x e.body
abbrev Mask := Array (Option VarId)
/-- Auxiliary function for eraseProjIncFor -/
partial def eraseProjIncForAux (y : VarId) (bs : Array FnBody) (mask : Mask) (keep : Array FnBody) : Array FnBody × Mask :=
let done (_ : Unit) := (bs ++ keep.reverse, mask)
let keepInstr (b : FnBody) := eraseProjIncForAux y bs.pop mask (keep.push b)
if h : bs.size < 2 then done ()
else
let b := bs.back!
match b with
| .vdecl _ _ (.sproj _ _ _) _ => keepInstr b
| .vdecl _ _ (.uproj _ _) _ => keepInstr b
| .inc z n c p _ =>
if n == 0 then done () else
let b' := bs[bs.size - 2]
match b' with
| .vdecl w _ (.proj i x) _ =>
if w == z && y == x then
/- Found
```
let z := proj[i] y
inc z n c
```
We keep `proj`, and `inc` when `n > 1`
-/
let bs := bs.pop.pop
let mask := mask.set! i (some z)
let keep := keep.push b'
let keep := if n == 1 then keep else keep.push (FnBody.inc z (n-1) c p FnBody.nil)
eraseProjIncForAux y bs mask keep
else done ()
| _ => done ()
| _ => done ()
/-- Try to erase `inc` instructions on projections of `y` occurring in the tail of `bs`.
Return the updated `bs` and a bit mask specifying which `inc`s have been removed. -/
def eraseProjIncFor (n : Nat) (y : VarId) (bs : Array FnBody) : Array FnBody × Mask :=
eraseProjIncForAux y bs (.replicate n none) #[]
/-- Replace `reuse x ctor ...` with `ctor ...`, and remove `dec x` -/
partial def reuseToCtor (x : VarId) : FnBody → FnBody
| FnBody.dec y n c p b =>
if x == y then b -- n must be 1 since `x := reset ...`
else FnBody.dec y n c p (reuseToCtor x b)
| FnBody.vdecl z t v b =>
match v with
| Expr.reuse y c _ xs =>
if x == y then FnBody.vdecl z t (Expr.ctor c xs) b
else FnBody.vdecl z t v (reuseToCtor x b)
| _ =>
FnBody.vdecl z t v (reuseToCtor x b)
| FnBody.case tid y yType alts =>
let alts := alts.map fun alt => alt.modifyBody (reuseToCtor x)
FnBody.case tid y yType alts
| e =>
if e.isTerminal then
e
else
let (instr, b) := e.split
let b := reuseToCtor x b
instr.setBody b
/--
replace
```
x := reset y; b
```
with
```
inc z_1; ...; inc z_i; dec y; b'
```
where `z_i`'s are the variables in `mask`,
and `b'` is `b` where we removed `dec x` and replaced `reuse x ctor_i ...` with `ctor_i ...`.
-/
def mkSlowPath (x y : VarId) (mask : Mask) (b : FnBody) : FnBody :=
let b := reuseToCtor x b
let b := FnBody.dec y 1 true false b
mask.foldl (init := b) fun b m => match m with
| some z => FnBody.inc z 1 true false b
| none => b
abbrev M := ReaderT Context (StateM Nat)
def mkFresh : M VarId :=
modifyGet fun n => ({ idx := n }, n + 1)
def releaseUnreadFields (y : VarId) (mask : Mask) (b : FnBody) : M FnBody :=
mask.size.foldM (init := b) fun i _ b =>
match mask[i] with
| some _ => pure b -- code took ownership of this field
| none => do
let fld ← mkFresh
pure (FnBody.vdecl fld .tobject (Expr.proj i y) (FnBody.dec fld 1 true false b))
def setFields (y : VarId) (zs : Array Arg) (b : FnBody) : FnBody :=
zs.size.fold (init := b) fun i _ b => FnBody.set y i zs[i] b
/-- Given `set x[i] := y`, return true iff `y := proj[i] x` -/
def isSelfSet (ctx : Context) (x : VarId) (i : Nat) (y : Arg) : Bool :=
match y with
| .var y =>
match ctx.projMap[y]? with
| some (Expr.proj j w) => j == i && w == x
| _ => false
| .erased => false
/-- Given `uset x[i] := y`, return true iff `y := uproj[i] x` -/
def isSelfUSet (ctx : Context) (x : VarId) (i : Nat) (y : VarId) : Bool :=
match ctx.projMap[y]? with
| some (Expr.uproj j w) => j == i && w == x
| _ => false
/-- Given `sset x[n, i] := y`, return true iff `y := sproj[n, i] x` -/
def isSelfSSet (ctx : Context) (x : VarId) (n : Nat) (i : Nat) (y : VarId) : Bool :=
match ctx.projMap[y]? with
| some (Expr.sproj m j w) => n == m && j == i && w == x
| _ => false
/-- Remove unnecessary `set/uset/sset` operations -/
partial def removeSelfSet (ctx : Context) : FnBody → FnBody
| FnBody.set x i y b =>
if isSelfSet ctx x i y then removeSelfSet ctx b
else FnBody.set x i y (removeSelfSet ctx b)
| FnBody.uset x i y b =>
if isSelfUSet ctx x i y then removeSelfSet ctx b
else FnBody.uset x i y (removeSelfSet ctx b)
| FnBody.sset x n i y t b =>
if isSelfSSet ctx x n i y then removeSelfSet ctx b
else FnBody.sset x n i y t (removeSelfSet ctx b)
| FnBody.case tid y yType alts =>
let alts := alts.map fun alt => alt.modifyBody (removeSelfSet ctx)
FnBody.case tid y yType alts
| e =>
if e.isTerminal then e
else
let (instr, b) := e.split
let b := removeSelfSet ctx b
instr.setBody b
partial def reuseToSet (ctx : Context) (x y : VarId) : FnBody → FnBody
| FnBody.dec z n c p b =>
if x == z then FnBody.del y b
else FnBody.dec z n c p (reuseToSet ctx x y b)
| FnBody.vdecl z t v b =>
match v with
| Expr.reuse w c u zs =>
if x == w then
let b := setFields y zs (b.replaceVar z y)
let b := if u then FnBody.setTag y c.cidx b else b
removeSelfSet ctx b
else FnBody.vdecl z t v (reuseToSet ctx x y b)
| _ => FnBody.vdecl z t v (reuseToSet ctx x y b)
| FnBody.case tid z zType alts =>
let alts := alts.map fun alt => alt.modifyBody (reuseToSet ctx x y)
FnBody.case tid z zType alts
| e =>
if e.isTerminal then e
else
let (instr, b) := e.split
let b := reuseToSet ctx x y b
instr.setBody b
/--
replace
```
x := reset y; b
```
with
```
let f_i_1 := proj[i_1] y;
...
let f_i_k := proj[i_k] y;
b'
```
where `i_j`s are the field indexes
that the code did not touch immediately before the reset.
That is `mask[j] == none`.
`b'` is `b` where `dec x` is replaced with `del y`,
and `z := reuse x ctor_i ws; F` is replaced with
`set x i ws[i]` operations, and we replace `z` with `x` in `F`
-/
def mkFastPath (x y : VarId) (mask : Mask) (b : FnBody) : M FnBody := do
let ctx ← read
let b := reuseToSet ctx x y b
releaseUnreadFields y mask b
-- Expand `bs; x := reset[n] y; b`
partial def expand (mainFn : FnBody → Array FnBody → M FnBody)
(bs : Array FnBody) (x : VarId) (n : Nat) (y : VarId) (b : FnBody) : M FnBody := do
let (bs, mask) := eraseProjIncFor n y bs
/- Remark: we may be duplicating variable/JP indices. That is, `bSlow` and `bFast` may
have duplicate indices. We run `normalizeIds` to fix the ids after we have expanded them. -/
let bSlow := mkSlowPath x y mask b
let bFast ← mkFastPath x y mask b
/- We only optimize the fast path recursively. -/
let bFast ← mainFn bFast #[]
let c ← mkFresh
let b := FnBody.vdecl c IRType.uint8 (Expr.isShared y) (mkIf c bSlow bFast)
return reshape bs b
partial def searchAndExpand : FnBody → Array FnBody → M FnBody
| d@(FnBody.vdecl x _ (Expr.reset n y) b), bs =>
if consumed x b then do
expand searchAndExpand bs x n y b
else
searchAndExpand b (push bs d)
| FnBody.jdecl j xs v b, bs => do
let v ← searchAndExpand v #[]
searchAndExpand b (push bs (FnBody.jdecl j xs v FnBody.nil))
| FnBody.case tid x xType alts, bs => do
let alts ← alts.mapM fun alt => alt.modifyBodyM fun b => searchAndExpand b #[]
return reshape bs (FnBody.case tid x xType alts)
| b, bs =>
if b.isTerminal then return reshape bs b
else searchAndExpand b.body (push bs b)
def main (d : Decl) : Decl :=
match d with
| .fdecl (body := b) .. =>
let m := mkProjMap d
let nextIdx := d.maxIndex + 1
let bNew := (searchAndExpand b #[] { projMap := m }).run' nextIdx
d.updateBody! bNew
| d => d
end ExpandResetReuse
/-- (Try to) expand `reset` and `reuse` instructions. -/
def Decl.expandResetReuse (d : Decl) : Decl :=
(ExpandResetReuse.main d).normalizeIds
builtin_initialize registerTraceClass `compiler.ir.expand_reset_reuse (inherited := true)
end Lean.IR


@@ -1,245 +0,0 @@
/-
Copyright (c) 2019 Microsoft Corporation. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Leonardo de Moura
-/
module
prelude
public import Lean.Compiler.IR.Basic
public section
namespace Lean.IR
namespace MaxIndex
/-! Compute the maximum index `M` used in a declaration.
We use `M` to initialize the fresh index generator used to create fresh
variable and join point names.
Recall that variables and join points share the same namespace in
our implementation.
-/
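As a toy illustration of the seeding this module enables (hypothetical names, not the compiler's API): a counter seeded with `maxIndex + 1` only ever hands out unused ids.

```lean
def mkFreshIdx : StateM Nat Nat :=
  modifyGet fun n => (n, n + 1)

def twoFresh : StateM Nat (Nat × Nat) := do
  return ((← mkFreshIdx), (← mkFreshIdx))

-- Seeding with a maximum index of 7 (so the counter starts at 8)
-- yields 8, 9, …, never colliding with the existing ids 0–7.
#eval twoFresh.run' 8  -- (8, 9)
```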
structure State where
currentMax : Nat := 0
abbrev M := StateM State
private def visitIndex (x : Index) : M Unit := do
modify fun s => { s with currentMax := s.currentMax.max x }
private def visitVar (x : VarId) : M Unit :=
visitIndex x.idx
private def visitJP (j : JoinPointId) : M Unit :=
visitIndex j.idx
private def visitArg (arg : Arg) : M Unit :=
match arg with
| .var x => visitVar x
| .erased => pure ()
private def visitParam (p : Param) : M Unit :=
visitVar p.x
private def visitExpr (e : Expr) : M Unit := do
match e with
| .proj _ x | .uproj _ x | .sproj _ _ x | .box _ x | .unbox x | .reset _ x | .isShared x =>
visitVar x
| .ctor _ ys | .fap _ ys | .pap _ ys =>
ys.forM visitArg
| .ap x ys | .reuse x _ _ ys =>
visitVar x
ys.forM visitArg
| .lit _ => pure ()
partial def visitFnBody (fnBody : FnBody) : M Unit := do
match fnBody with
| .vdecl x _ v b =>
visitVar x
visitExpr v
visitFnBody b
| .jdecl j ys v b =>
visitJP j
visitFnBody v
ys.forM visitParam
visitFnBody b
| .set x _ y b =>
visitVar x
visitArg y
visitFnBody b
| .uset x _ y b | .sset x _ _ y _ b =>
visitVar x
visitVar y
visitFnBody b
| .setTag x _ b | .inc x _ _ _ b | .dec x _ _ _ b | .del x b =>
visitVar x
visitFnBody b
| .case _ x _ alts =>
visitVar x
alts.forM (visitFnBody ·.body)
| .jmp j ys =>
visitJP j
ys.forM visitArg
| .ret x =>
visitArg x
| .unreachable => pure ()
private def visitDecl (decl : Decl) : M Unit := do
match decl with
| .fdecl (xs := xs) (body := b) .. =>
xs.forM visitParam
visitFnBody b
| .extern (xs := xs) .. =>
xs.forM visitParam
end MaxIndex
def FnBody.maxIndex (b : FnBody) : Index := Id.run do
let ⟨_, { currentMax }⟩ := MaxIndex.visitFnBody b |>.run {}
return currentMax
def Decl.maxIndex (d : Decl) : Index := Id.run do
let ⟨_, { currentMax }⟩ := MaxIndex.visitDecl d |>.run {}
return currentMax
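The `run`-and-project pattern above can be sketched standalone: thread a running maximum through a traversal with `StateM`, then read the final state off the result pair. `maxOfList` is an illustrative helper, not part of this module.

```lean
def maxOfList (xs : List Nat) : Nat :=
  -- `step.run 0` returns a `(value, state)` pair; the state is the maximum seen.
  let step : StateM Nat Unit := xs.forM fun x => modify (Nat.max x)
  (step.run 0).snd

#eval maxOfList [3, 1, 4, 1, 5]  -- 5
```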
namespace FreeIndices
/-! We say a variable (join point) index (aka name) is free in a function body
if there isn't a `FnBody.vdecl` (`FnBody.jdecl`) binding it. -/
structure State where
freeIndices : IndexSet := {}
abbrev M := StateM State
private def visitIndex (x : Index) : M Unit := do
modify fun s => { s with freeIndices := s.freeIndices.insert x }
private def visitVar (x : VarId) : M Unit :=
visitIndex x.idx
private def visitJP (j : JoinPointId) : M Unit :=
visitIndex j.idx
private def visitArg (arg : Arg) : M Unit :=
match arg with
| .var x => visitVar x
| .erased => pure ()
private def visitParam (p : Param) : M Unit :=
visitVar p.x
private def visitExpr (e : Expr) : M Unit := do
match e with
| .proj _ x | .uproj _ x | .sproj _ _ x | .box _ x | .unbox x | .reset _ x | .isShared x =>
visitVar x
| .ctor _ ys | .fap _ ys | .pap _ ys =>
ys.forM visitArg
| .ap x ys | .reuse x _ _ ys =>
visitVar x
ys.forM visitArg
| .lit _ => pure ()
partial def visitFnBody (fnBody : FnBody) : M Unit := do
match fnBody with
| .vdecl x _ v b =>
visitVar x
visitExpr v
visitFnBody b
| .jdecl j ys v b =>
visitJP j
visitFnBody v
ys.forM visitParam
visitFnBody b
| .set x _ y b =>
visitVar x
visitArg y
visitFnBody b
| .uset x _ y b | .sset x _ _ y _ b =>
visitVar x
visitVar y
visitFnBody b
| .setTag x _ b | .inc x _ _ _ b | .dec x _ _ _ b | .del x b =>
visitVar x
visitFnBody b
| .case _ x _ alts =>
visitVar x
alts.forM (visitFnBody ·.body)
| .jmp j ys =>
visitJP j
ys.forM visitArg
| .ret x =>
visitArg x
| .unreachable => pure ()
private def visitDecl (decl : Decl) : M Unit := do
match decl with
| .fdecl (xs := xs) (body := b) .. =>
xs.forM visitParam
visitFnBody b
| .extern (xs := xs) .. =>
xs.forM visitParam
end FreeIndices
def FnBody.collectFreeIndices (b : FnBody) (init : IndexSet) : IndexSet := Id.run do
let ⟨_, { freeIndices }⟩ := FreeIndices.visitFnBody b |>.run { freeIndices := init }
return freeIndices
def FnBody.freeIndices (b : FnBody) : IndexSet :=
b.collectFreeIndices {}
namespace HasIndex
/-! In principle, we can check whether a function body `b` contains an index `i` using
`b.freeIndices.contains i`, but it is more efficient to avoid the construction
of the set of freeIndices and just search whether `i` occurs in `b` or not.
-/
def visitVar (w : Index) (x : VarId) : Bool := w == x.idx
def visitJP (w : Index) (x : JoinPointId) : Bool := w == x.idx
def visitArg (w : Index) : Arg → Bool
| .var x => visitVar w x
| .erased => false
def visitArgs (w : Index) (xs : Array Arg) : Bool :=
xs.any (visitArg w)
def visitParams (w : Index) (ps : Array Param) : Bool :=
ps.any (fun p => w == p.x.idx)
def visitExpr (w : Index) : Expr → Bool
| .proj _ x | .uproj _ x | .sproj _ _ x | .box _ x | .unbox x | .reset _ x | .isShared x =>
visitVar w x
| .ctor _ ys | .fap _ ys | .pap _ ys =>
visitArgs w ys
| .ap x ys | .reuse x _ _ ys =>
visitVar w x || visitArgs w ys
| .lit _ => false
partial def visitFnBody (w : Index) : FnBody → Bool
| .vdecl _ _ v b =>
visitExpr w v || visitFnBody w b
| .jdecl _ _ v b =>
visitFnBody w v || visitFnBody w b
| FnBody.set x _ y b =>
visitVar w x || visitArg w y || visitFnBody w b
| .uset x _ y b | .sset x _ _ y _ b =>
visitVar w x || visitVar w y || visitFnBody w b
| .setTag x _ b | .inc x _ _ _ b | .dec x _ _ _ b | .del x b =>
visitVar w x || visitFnBody w b
| .case _ x _ alts =>
visitVar w x || alts.any (fun alt => visitFnBody w alt.body)
| .jmp j ys =>
visitJP w j || visitArgs w ys
| .ret x =>
visitArg w x
| .unreachable => false
end HasIndex
def Arg.hasFreeVar (arg : Arg) (x : VarId) : Bool := HasIndex.visitArg x.idx arg
def Expr.hasFreeVar (e : Expr) (x : VarId) : Bool := HasIndex.visitExpr x.idx e
def FnBody.hasFreeVar (b : FnBody) (x : VarId) : Bool := HasIndex.visitFnBody x.idx b
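The tradeoff the `HasIndex` module comment describes can be seen in miniature (illustrative helpers, not compiler API): one version materializes the full set before answering, the other searches directly and can stop early.

```lean
import Std.Data.HashSet

-- Builds the full set first, like `b.freeIndices.contains i`.
def containsViaSet (xs : List Nat) (i : Nat) : Bool :=
  (xs.foldl (fun s x => s.insert x) (∅ : Std.HashSet Nat)).contains i

-- Searches directly with short-circuiting, like `HasIndex.visitFnBody`.
def containsDirect (xs : List Nat) (i : Nat) : Bool :=
  xs.any (· == i)

#eval containsViaSet [1, 2, 3] 2  -- true
#eval containsDirect [1, 2, 3] 5  -- false
```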
end Lean.IR


@@ -55,22 +55,8 @@ errors from the interpreter itself as those depend on whether we are running in
-/
@[export lean_eval_check_meta]
private partial def evalCheckMeta (env : Environment) (declName : Name) : Except String Unit := do
if !env.header.isModule then
return
go declName |>.run' {}
where go (ref : Name) : StateT NameSet (Except String) Unit := do
if (← get).contains ref then
return
modify (·.insert ref)
if let some localDecl := declMapExt.getState env |>.find? ref then
for ref in collectUsedFDecls localDecl do
go ref
else
-- NOTE: We do not use `getIRPhases` here as it's intended for env decls, not IR decls. We do
-- not set `includeServer` as we want this check to be independent of server mode. Server-only
-- users disable this check instead.
if findEnvDecl env ref |>.isNone then
throw s!"Cannot evaluate constant `{declName}` as it uses `{ref}` which is neither marked nor imported as `meta`"
if getIRPhases env declName == .runtime then
throw s!"Cannot evaluate constant `{declName}` as it is neither marked nor imported as `meta`"
builtin_initialize
registerTraceClass `compiler.ir.inferMeta


@@ -1,62 +0,0 @@
/-
Copyright (c) 2019 Microsoft Corporation. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Leonardo de Moura
-/
module
prelude
public import Lean.Compiler.IR.FreeVars
public import Lean.Compiler.IR.NormIds
public section
namespace Lean.IR
partial def pushProjs (bs : Array FnBody) (alts : Array Alt) (altsF : Array IndexSet) (ctx : Array FnBody) (ctxF : IndexSet) : Array FnBody × Array Alt :=
if bs.isEmpty then (ctx.reverse, alts)
else
let b := bs.back!
let bs := bs.pop
let done (_ : Unit) := (bs.push b ++ ctx.reverse, alts)
let skip (_ : Unit) := pushProjs bs alts altsF (ctx.push b) (b.collectFreeIndices ctxF)
let push (x : VarId) :=
if !ctxF.contains x.idx then
let alts := alts.mapIdx fun i alt => alt.modifyBody fun b' =>
if altsF[i]!.contains x.idx then b.setBody b'
else b'
let altsF := altsF.map fun s => if s.contains x.idx then b.collectFreeIndices s else s
pushProjs bs alts altsF ctx ctxF
else
skip ()
match b with
| FnBody.vdecl x _ v _ =>
match v with
| Expr.proj _ _ => push x
| Expr.uproj _ _ => push x
| Expr.sproj _ _ _ => push x
| Expr.isShared _ => skip ()
| _ => done ()
| _ => done ()
partial def FnBody.pushProj (b : FnBody) : FnBody :=
let (bs, term) := b.flatten
let bs := modifyJPs bs pushProj
match term with
| .case tid x xType alts =>
let altsF := alts.map fun alt => alt.body.freeIndices
let (bs, alts) := pushProjs bs alts altsF #[] (mkIndexSet x.idx)
let alts := alts.map fun alt => alt.modifyBody pushProj
let term := FnBody.case tid x xType alts
reshape bs term
| _ => reshape bs term
/-- Push projections inside `case` branches. -/
def Decl.pushProj (d : Decl) : Decl :=
match d with
| .fdecl (body := b) .. => d.updateBody! b.pushProj |>.normalizeIds
| other => other
builtin_initialize registerTraceClass `compiler.ir.push_proj (inherited := true)
end Lean.IR


@@ -101,6 +101,10 @@ partial def lowerCode (c : LCNF.Code .impure) : M FnBody := do
let ret ← getFVarValue fvarId
return .ret ret
| .unreach .. => return .unreachable
| .oset fvarId i y k _ =>
let y ← lowerArg y
let .var fvarId ← getFVarValue fvarId | unreachable!
return .set fvarId i y (← lowerCode k)
| .sset fvarId i offset y type k _ =>
let .var y ← getFVarValue y | unreachable!
let .var fvarId ← getFVarValue fvarId | unreachable!
@@ -109,12 +113,18 @@ partial def lowerCode (c : LCNF.Code .impure) : M FnBody := do
let .var y ← getFVarValue y | unreachable!
let .var fvarId ← getFVarValue fvarId | unreachable!
return .uset fvarId i y (← lowerCode k)
| .setTag fvarId cidx k _ =>
let .var var ← getFVarValue fvarId | unreachable!
return .setTag var cidx (← lowerCode k)
| .inc fvarId n check persistent k _ =>
let .var var ← getFVarValue fvarId | unreachable!
return .inc var n check persistent (← lowerCode k)
| .dec fvarId n check persistent k _ =>
let .var var ← getFVarValue fvarId | unreachable!
return .dec var n check persistent (← lowerCode k)
| .del fvarId k _ =>
let .var var ← getFVarValue fvarId | unreachable!
return .del var (← lowerCode k)
| .fun .. => panic! "all local functions should be λ-lifted"
partial def lowerLet (decl : LCNF.LetDecl .impure) (k : LCNF.Code .impure) : M FnBody := do
@@ -155,6 +165,9 @@ partial def lowerLet (decl : LCNF.LetDecl .impure) (k : LCNF.Code .impure) : M F
| .unbox var =>
withGetFVarValue var fun var => do
continueLet (.unbox var)
| .isShared var =>
withGetFVarValue var fun var => do
continueLet (.isShared var)
| .erased => mkErased ()
where
mkErased (_ : Unit) : M FnBody := do


@@ -37,8 +37,8 @@ Run the initializer of the given module (without `builtin_initialize` commands).
Return `false` if the initializer is not available as native code.
Initializers do not have corresponding Lean definitions, so they cannot be interpreted in this case.
-/
@[inline] private unsafe def runModInit (mod : Name) (pkg? : Option String) : IO Bool :=
runModInitCore (mkModuleInitializationFunctionName mod pkg?)
@[inline] private unsafe def runModInit (mod : Name) (pkg? : Option String) (phases : IRPhases) : IO Bool :=
runModInitCore (mkModuleInitializationFunctionName mod pkg? phases)
/-- Run the initializer for `decl` and store its value for global access. Should only be used while importing. -/
@[extern "lean_run_init"]
@@ -160,36 +160,46 @@ def declareBuiltin (forDecl : Name) (value : Expr) : CoreM Unit :=
@[export lean_run_init_attrs]
private unsafe def runInitAttrs (env : Environment) (opts : Options) : IO Unit := do
if (← isInitializerExecutionEnabled) then
-- **Note**: `ModuleIdx` is not an abbreviation, and we don't have instances for it.
-- Thus, we use `(modIdx : Nat)`
for mod in env.header.moduleNames, (modIdx : Nat) in 0...* do
-- any native Lean code reachable by the interpreter (i.e. from shared
-- libraries with their corresponding module in the Environment) must
-- first be initialized
let pkg? := env.getModulePackageByIdx? modIdx
if (← runModInit mod pkg?) then
if !(← isInitializerExecutionEnabled) then
throw <| IO.userError "`enableInitializerExecution` must be run before calling `importModules (loadExts := true)`"
-- **Note**: `ModuleIdx` is not an abbreviation, and we don't have instances for it.
-- Thus, we use `(modIdx : Nat)`
for mod in env.header.modules, (modIdx : Nat) in 0...* do
let initRuntime := Elab.inServer.get opts || mod.irPhases != .runtime
-- any native Lean code reachable by the interpreter (i.e. from shared
-- libraries with their corresponding module in the Environment) must
-- first be initialized
let pkg? := env.getModulePackageByIdx? modIdx
if env.header.isModule && /- TODO: remove after rebootstrap -/ false then
let initializedRuntime ← pure initRuntime <&&> runModInit (phases := .runtime) mod.module pkg?
let initializedComptime ← runModInit (phases := .comptime) mod.module pkg?
if initializedRuntime || initializedComptime then
continue
-- As `[init]` decls can have global side effects, ensure we run them at most once,
-- just like the compiled code does.
if (← interpretedModInits.get).contains mod then
else
if (← runModInit (phases := .all) mod.module pkg?) then
continue
interpretedModInits.modify (·.insert mod)
let modEntries := regularInitAttr.ext.getModuleEntries env modIdx
-- `getModuleIREntries` is identical to `getModuleEntries` if we loaded only one of
-- .olean (from `meta initialize`)/.ir (`initialize` via transitive `meta import`)
-- so deduplicate (these lists should be very short).
-- If we have both, we should not need to worry about their relative ordering as `meta` and
-- non-`meta` initialize should not have interdependencies.
let modEntries := modEntries ++ (regularInitAttr.ext.getModuleIREntries env modIdx).filter (!modEntries.contains ·)
for (decl, initDecl) in modEntries do
-- Skip initializers we do not have IR for; they should not be reachable by interpretation.
if !Elab.inServer.get opts && getIRPhases env decl == .runtime then
continue
if initDecl.isAnonymous then
let initFn ← IO.ofExcept <| env.evalConst (IO Unit) opts decl
initFn
else
runInit env opts decl initDecl
-- As `[init]` decls can have global side effects, ensure we run them at most once,
-- just like the compiled code does.
if (← interpretedModInits.get).contains mod.module then
continue
interpretedModInits.modify (·.insert mod.module)
let modEntries := regularInitAttr.ext.getModuleEntries env modIdx
-- `getModuleIREntries` is identical to `getModuleEntries` if we loaded only one of
-- .olean (from `meta initialize`)/.ir (`initialize` via transitive `meta import`)
-- so deduplicate (these lists should be very short).
-- If we have both, we should not need to worry about their relative ordering as `meta` and
-- non-`meta` initialize should not have interdependencies.
let modEntries := modEntries ++ (regularInitAttr.ext.getModuleIREntries env modIdx).filter (!modEntries.contains ·)
for (decl, initDecl) in modEntries do
if !initRuntime && getIRPhases env decl == .runtime then
continue
if initDecl.isAnonymous then
-- Don't check `meta` again as it would not respect `Elab.inServer`
let initFn ← IO.ofExcept <| env.evalConst (checkMeta := false) (IO Unit) opts decl
initFn
else
runInit env opts decl initDecl
end Lean
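The at-most-once discipline applied to `interpretedModInits` above, i.e. consult-and-insert on a ref before running a side effect, can be sketched with a hypothetical helper:

```lean
-- Run `act` for `mod` at most once, tracking already-run modules in `ran`.
def runOnce (ran : IO.Ref (List String)) (mod : String) (act : IO Unit) : IO Unit := do
  if (← ran.get).contains mod then return ()
  ran.modify (mod :: ·)
  act

#eval do
  let ran ← IO.mkRef ([] : List String)
  runOnce ran "Foo" (IO.println "init Foo")
  runOnce ran "Foo" (IO.println "init Foo")  -- skipped the second time
```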


@@ -75,6 +75,7 @@ def eqvLetValue (e₁ e₂ : LetValue pu) : EqvM Bool := do
pure (i₁ == i₂ && u₁ == u₂) <&&> eqvFVar v₁ v₂ <&&> eqvArgs as₁ as₂
| .box ty₁ v₁ _, .box ty₂ v₂ _ => eqvType ty₁ ty₂ <&&> eqvFVar v₁ v₂
| .unbox v₁ _, .unbox v₂ _ => eqvFVar v₁ v₂
| .isShared v₁ _, .isShared v₂ _ => eqvFVar v₁ v₂
| _, _ => return false
@[inline] def withFVar (fvarId₁ fvarId₂ : FVarId) (x : EqvM α) : EqvM α :=
@@ -143,6 +144,11 @@ partial def eqv (code₁ code₂ : Code pu) : EqvM Bool := do
eqvFVar c₁.discr c₂.discr <&&>
eqvType c₁.resultType c₂.resultType <&&>
eqvAlts c₁.alts c₂.alts
| .oset fvarId₁ i₁ y₁ k₁ _, .oset fvarId₂ i₂ y₂ k₂ _ =>
pure (i₁ == i₂) <&&>
eqvFVar fvarId₁ fvarId₂ <&&>
eqvArg y₁ y₂ <&&>
eqv k₁ k₂
| .sset fvarId₁ i₁ offset₁ y₁ ty₁ k₁ _, .sset fvarId₂ i₂ offset₂ y₂ ty₂ k₂ _ =>
pure (i₁ == i₂) <&&>
pure (offset₁ == offset₂) <&&>
@@ -155,6 +161,10 @@ partial def eqv (code₁ code₂ : Code pu) : EqvM Bool := do
eqvFVar fvarId₁ fvarId₂ <&&>
eqvFVar y₁ y₂ <&&>
eqv k₁ k₂
| .setTag fvarId₁ c₁ k₁ _, .setTag fvarId₂ c₂ k₂ _ =>
pure (c₁ == c₂) <&&>
eqvFVar fvarId₁ fvarId₂ <&&>
eqv k₁ k₂
| .inc fvarId₁ n₁ c₁ p₁ k₁ _, .inc fvarId₂ n₂ c₂ p₂ k₂ _ =>
pure (n₁ == n₂) <&&>
pure (c₁ == c₂) <&&>
@@ -167,6 +177,9 @@ partial def eqv (code₁ code₂ : Code pu) : EqvM Bool := do
pure (p₁ == p₂) <&&>
eqvFVar fvarId₁ fvarId₂ <&&>
eqv k₁ k₂
| .del fvarId₁ k₁ _, .del fvarId₂ k₂ _ =>
eqvFVar fvarId₁ fvarId₂ <&&>
eqv k₁ k₂
| _, _ => return false
end
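The new `eqv` clauses chain comparisons with `<&&>`, the monadic short-circuit "and" (`andM`): the right-hand computation runs only when the left one returns `true`, so subterm comparison stops at the first mismatch. A minimal sketch, with hypothetical names:

```lean
-- `cheap` runs first; `expensive` is only evaluated if `cheap` returns true,
-- just as `eqv k₁ k₂` above only runs after the earlier field checks pass.
def checkBoth (cheap expensive : IO Bool) : IO Bool :=
  pure true <&&> cheap <&&> expensive
```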


@@ -219,6 +219,10 @@ inductive LetValue (pu : Purity) where
| box (ty : Expr) (fvarId : FVarId) (h : pu = .impure := by purity_tac)
/-- Given `fvarId : [t]object`, obtain the underlying scalar value. -/
| unbox (fvarId : FVarId) (h : pu = .impure := by purity_tac)
/--
Return whether the object stored behind `fvarId` is shared or not. The return type is a `UInt8`.
-/
| isShared (fvarId : FVarId) (h : pu = .impure := by purity_tac)
deriving Inhabited, BEq, Hashable
def Arg.toLetValue (arg : Arg pu) : LetValue pu :=
@@ -298,7 +302,12 @@ private unsafe def LetValue.updateUnboxImp (e : LetValue pu) (fvarId' : FVarId)
@[implemented_by LetValue.updateUnboxImp] opaque LetValue.updateUnbox! (e : LetValue pu) (fvarId' : FVarId) : LetValue pu
private unsafe def LetValue.updateIsSharedImp (e : LetValue pu) (fvarId' : FVarId) : LetValue pu :=
match e with
| .isShared fvarId _ => if fvarId == fvarId' then e else .isShared fvarId'
| _ => unreachable!
@[implemented_by LetValue.updateIsSharedImp] opaque LetValue.updateIsShared! (e : LetValue pu) (fvarId' : FVarId) : LetValue pu
private unsafe def LetValue.updateArgsImp (e : LetValue pu) (args' : Array (Arg pu)) : LetValue pu :=
match e with
@@ -331,6 +340,7 @@ def LetValue.toExpr (e : LetValue pu) : Expr :=
#[.fvar var, .const i.name [], ToExpr.toExpr updateHeader] ++ (args.map Arg.toExpr)
| .box ty var _ => mkApp2 (.const `box []) ty (.fvar var)
| .unbox var _ => mkApp (.const `unbox []) (.fvar var)
| .isShared fvarId _ => mkApp (.const `isShared []) (.fvar fvarId)
structure LetDecl (pu : Purity) where
fvarId : FVarId
@@ -361,10 +371,13 @@ inductive Code (pu : Purity) where
| cases (cases : Cases pu)
| return (fvarId : FVarId)
| unreach (type : Expr)
| oset (fvarId : FVarId) (i : Nat) (y : Arg pu) (k : Code pu) (h : pu = .impure := by purity_tac)
| uset (fvarId : FVarId) (i : Nat) (y : FVarId) (k : Code pu) (h : pu = .impure := by purity_tac)
| sset (fvarId : FVarId) (i : Nat) (offset : Nat) (y : FVarId) (ty : Expr) (k : Code pu) (h : pu = .impure := by purity_tac)
| setTag (fvarId : FVarId) (cidx : Nat) (k : Code pu) (h : pu = .impure := by purity_tac)
| inc (fvarId : FVarId) (n : Nat) (check : Bool) (persistent : Bool) (k : Code pu) (h : pu = .impure := by purity_tac)
| dec (fvarId : FVarId) (n : Nat) (check : Bool) (persistent : Bool) (k : Code pu) (h : pu = .impure := by purity_tac)
| del (fvarId : FVarId) (k : Code pu) (h : pu = .impure := by purity_tac)
deriving Inhabited
end
@@ -440,25 +453,32 @@ inductive CodeDecl (pu : Purity) where
| let (decl : LetDecl pu)
| fun (decl : FunDecl pu) (h : pu = .pure := by purity_tac)
| jp (decl : FunDecl pu)
| oset (fvarId : FVarId) (i : Nat) (y : Arg pu) (h : pu = .impure := by purity_tac)
| uset (fvarId : FVarId) (i : Nat) (y : FVarId) (h : pu = .impure := by purity_tac)
| sset (fvarId : FVarId) (i : Nat) (offset : Nat) (y : FVarId) (ty : Expr) (h : pu = .impure := by purity_tac)
| setTag (fvarId : FVarId) (cidx : Nat) (h : pu = .impure := by purity_tac)
| inc (fvarId : FVarId) (n : Nat) (check : Bool) (persistent : Bool) (h : pu = .impure := by purity_tac)
| dec (fvarId : FVarId) (n : Nat) (check : Bool) (persistent : Bool) (h : pu = .impure := by purity_tac)
| del (fvarId : FVarId) (h : pu = .impure := by purity_tac)
deriving Inhabited
def CodeDecl.fvarId : CodeDecl pu → FVarId
| .let decl | .fun decl _ | .jp decl => decl.fvarId
| .uset fvarId .. | .sset fvarId .. | .inc fvarId .. | .dec fvarId .. => fvarId
| .uset fvarId .. | .sset fvarId .. | .inc fvarId .. | .dec fvarId .. | .del fvarId ..
| .oset fvarId .. | .setTag fvarId .. => fvarId
def Code.toCodeDecl! : Code pu → CodeDecl pu
| .let decl _ => .let decl
| .fun decl _ _ => .fun decl
| .jp decl _ => .jp decl
| .uset fvarId i y _ _ => .uset fvarId i y
| .sset fvarId i offset ty y _ _ => .sset fvarId i offset ty y
| .inc fvarId n check persistent _ _ => .inc fvarId n check persistent
| .dec fvarId n check persistent _ _ => .dec fvarId n check persistent
| _ => unreachable!
| .let decl _ => .let decl
| .fun decl _ _ => .fun decl
| .jp decl _ => .jp decl
| .oset fvarId i y _ _ => .oset fvarId i y
| .uset fvarId i y _ _ => .uset fvarId i y
| .sset fvarId i offset ty y _ _ => .sset fvarId i offset ty y
| .setTag fvarId cidx _ _ => .setTag fvarId cidx
| .inc fvarId n check persistent _ _ => .inc fvarId n check persistent
| .dec fvarId n check persistent _ _ => .dec fvarId n check persistent
| .del fvarId _ _ => .del fvarId
| _ => unreachable!
def attachCodeDecls (decls : Array (CodeDecl pu)) (code : Code pu) : Code pu :=
go decls.size code
@@ -469,10 +489,13 @@ where
| .let decl => go (i-1) (.let decl code)
| .fun decl _ => go (i-1) (.fun decl code)
| .jp decl => go (i-1) (.jp decl code)
| .oset fvarId idx y _ => go (i-1) (.oset fvarId idx y code)
| .uset fvarId idx y _ => go (i-1) (.uset fvarId idx y code)
| .sset fvarId idx offset y ty _ => go (i-1) (.sset fvarId idx offset y ty code)
| .setTag fvarId cidx _ => go (i-1) (.setTag fvarId cidx code)
| .inc fvarId n check persistent _ => go (i-1) (.inc fvarId n check persistent code)
| .dec fvarId n check persistent _ => go (i-1) (.dec fvarId n check persistent code)
| .del fvarId _ => go (i-1) (.del fvarId code)
else
code
@@ -488,14 +511,20 @@ mutual
| .jmp j₁ as₁, .jmp j₂ as₂ => j₁ == j₂ && as₁ == as₂
| .return r₁, .return r₂ => r₁ == r₂
| .unreach t₁, .unreach t₂ => t₁ == t₂
| .oset v₁ i₁ y₁ k₁ _, .oset v₂ i₂ y₂ k₂ _ =>
v₁ == v₂ && i₁ == i₂ && y₁ == y₂ && eqImp k₁ k₂
| .uset v₁ i₁ y₁ k₁ _, .uset v₂ i₂ y₂ k₂ _ =>
v₁ == v₂ && i₁ == i₂ && y₁ == y₂ && eqImp k₁ k₂
| .sset v₁ i₁ o₁ y₁ ty₁ k₁ _, .sset v₂ i₂ o₂ y₂ ty₂ k₂ _ =>
v₁ == v₂ && i₁ == i₂ && o₁ == o₂ && y₁ == y₂ && ty₁ == ty₂ && eqImp k₁ k₂
| .setTag v₁ c₁ k₁ _, .setTag v₂ c₂ k₂ _ =>
v₁ == v₂ && c₁ == c₂ && eqImp k₁ k₂
| .inc v₁ n₁ c₁ p₁ k₁ _, .inc v₂ n₂ c₂ p₂ k₂ _ =>
v₁ == v₂ && n₁ == n₂ && c₁ == c₂ && p₁ == p₂ && eqImp k₁ k₂
| .dec v₁ n₁ c₁ p₁ k₁ _, .dec v₂ n₂ c₂ p₂ k₂ _ =>
v₁ == v₂ && n₁ == n₂ && c₁ == c₂ && p₁ == p₂ && eqImp k₁ k₂
| .del v₁ k₁ _, .del v₂ k₂ _ =>
v₁ == v₂ && eqImp k₁ k₂
| _, _ => false
private unsafe def eqFunDecl (d₁ d₂ : FunDecl pu) : Bool :=
@@ -588,10 +617,13 @@ private unsafe def updateAltImp (alt : Alt pu) (ps' : Array (Param pu)) (k' : Co
| .let decl k => if ptrEq k k' then c else .let decl k'
| .fun decl k _ => if ptrEq k k' then c else .fun decl k'
| .jp decl k => if ptrEq k k' then c else .jp decl k'
| .oset fvarId offset y k _ => if ptrEq k k' then c else .oset fvarId offset y k'
| .sset fvarId i offset y ty k _ => if ptrEq k k' then c else .sset fvarId i offset y ty k'
| .uset fvarId offset y k _ => if ptrEq k k' then c else .uset fvarId offset y k'
| .setTag fvarId cidx k _ => if ptrEq k k' then c else .setTag fvarId cidx k'
| .inc fvarId n check persistent k _ => if ptrEq k k' then c else .inc fvarId n check persistent k'
| .dec fvarId n check persistent k _ => if ptrEq k k' then c else .dec fvarId n check persistent k'
| .del fvarId k _ => if ptrEq k k' then c else .del fvarId k'
| _ => unreachable!
@[implemented_by updateContImp] opaque Code.updateCont! (c : Code pu) (k' : Code pu) : Code pu
@@ -635,6 +667,19 @@ private unsafe def updateAltImp (alt : Alt pu) (ps' : Array (Param pu)) (k' : Co
.sset fvarId' i' offset' y' ty' k'
| _ => unreachable!
@[inline] private unsafe def updateOsetImp (c : Code pu) (fvarId' : FVarId)
(i' : Nat) (y' : Arg pu) (k' : Code pu) : Code pu :=
match c with
| .oset fvarId i y k _ =>
if ptrEq fvarId fvarId' && i == i' && ptrEq y y' && ptrEq k k' then
c
else
.oset fvarId' i' y' k'
| _ => unreachable!
@[implemented_by updateOsetImp] opaque Code.updateOset! (c : Code pu) (fvarId' : FVarId)
(i' : Nat) (y' : Arg pu) (k' : Code pu) : Code pu
@[implemented_by updateSsetImp] opaque Code.updateSset! (c : Code pu) (fvarId' : FVarId) (i' : Nat)
(offset' : Nat) (y' : FVarId) (ty' : Expr) (k' : Code pu) : Code pu
@@ -651,6 +696,19 @@ private unsafe def updateAltImp (alt : Alt pu) (ps' : Array (Param pu)) (k' : Co
@[implemented_by updateUsetImp] opaque Code.updateUset! (c : Code pu) (fvarId' : FVarId)
(i' : Nat) (y' : FVarId) (k' : Code pu) : Code pu
@[inline] private unsafe def updateSetTagImp (c : Code pu) (fvarId' : FVarId) (cidx' : Nat)
(k' : Code pu) : Code pu :=
match c with
| .setTag fvarId cidx k _ =>
if ptrEq fvarId fvarId' && cidx == cidx' && ptrEq k k' then
c
else
.setTag fvarId' cidx' k'
| _ => unreachable!
@[implemented_by updateSetTagImp] opaque Code.updateSetTag! (c : Code pu) (fvarId' : FVarId)
(cidx' : Nat) (k' : Code pu) : Code pu
@[inline] private unsafe def updateIncImp (c : Code pu) (fvarId' : FVarId) (n' : Nat)
(check' : Bool) (persistent' : Bool) (k' : Code pu) : Code pu :=
match c with
@@ -685,6 +743,19 @@ private unsafe def updateAltImp (alt : Alt pu) (ps' : Array (Param pu)) (k' : Co
@[implemented_by updateDecImp] opaque Code.updateDec! (c : Code pu) (fvarId' : FVarId) (n' : Nat)
(check' : Bool) (persistent' : Bool) (k' : Code pu) : Code pu
@[inline] private unsafe def updateDelImp (c : Code pu) (fvarId' : FVarId) (k' : Code pu) :
Code pu :=
match c with
| .del fvarId k _ =>
if ptrEq fvarId fvarId' && ptrEq k k' then
c
else
.del fvarId' k'
| _ => unreachable!
@[implemented_by updateDelImp] opaque Code.updateDel! (c : Code pu) (fvarId' : FVarId)
(k' : Code pu) : Code pu
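All of the `update*!` functions above follow the same sharing-preserving pattern: the unsafe `*Imp` body compares every new field against the old one with `ptrEq`, returns the original node when nothing changed, and is exposed through an `opaque` declaration via `@[implemented_by]`. A toy sketch on a hypothetical `Tree` type:

```lean
-- Hypothetical stand-in for `Code`, for illustration only.
inductive Tree where
  | leaf
  | node (l r : Tree)
  deriving Inhabited

-- If both children are pointer-identical to the old ones, return the
-- original node so unchanged subtrees stay physically shared.
private unsafe def Tree.updateNodeImp (t : Tree) (l' r' : Tree) : Tree :=
  match t with
  | .node l r => if ptrEq l l' && ptrEq r r' then t else .node l' r'
  | _ => unreachable!

@[implemented_by Tree.updateNodeImp]
opaque Tree.updateNode! (t : Tree) (l' r' : Tree) : Tree
```

The `opaque`/`@[implemented_by]` pairing keeps the pointer comparison out of the logical model while the compiled code benefits from the preserved sharing.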
private unsafe def updateParamCoreImp (p : Param pu) (type : Expr) : Param pu :=
if ptrEq type p.type then
p
@@ -753,8 +824,8 @@ partial def Code.size (c : Code pu) : Nat :=
where
go (c : Code pu) (n : Nat) : Nat :=
match c with
| .let (k := k) .. | .sset (k := k) .. | .uset (k := k) .. | .inc (k := k) ..
| .dec (k := k) .. => go k (n + 1)
| .let (k := k) .. | .oset (k := k) .. | .sset (k := k) .. | .uset (k := k) .. | .inc (k := k) ..
| .dec (k := k) .. | .setTag (k := k) .. | .del (k := k) .. => go k (n + 1)
| .jp decl k | .fun decl k _ => go k <| go decl.value n
| .cases c => c.alts.foldl (init := n+1) fun n alt => go alt.getCode (n+1)
| .jmp .. => n+1
@@ -772,8 +843,8 @@ where
go (c : Code pu) : EStateM Unit Nat Unit := do
match c with
| .let (k := k) .. | .sset (k := k) .. | .uset (k := k) .. | .inc (k := k) ..
| .dec (k := k) .. => inc; go k
| .let (k := k) .. | .oset (k := k) .. | .sset (k := k) .. | .uset (k := k) .. | .inc (k := k) ..
| .dec (k := k) .. | .setTag (k := k) .. | .del (k := k) .. => inc; go k
| .jp decl k | .fun decl k _ => inc; go decl.value; go k
| .cases c => inc; c.alts.forM fun alt => go alt.getCode
| .jmp .. => inc
@@ -785,8 +856,8 @@ where
go (c : Code pu) : m Unit := do
f c
match c with
| .let (k := k) .. | .sset (k := k) .. | .uset (k := k) .. | .inc (k := k) ..
| .dec (k := k) .. => go k
| .let (k := k) .. | .oset (k := k) .. | .sset (k := k) .. | .uset (k := k) .. | .inc (k := k) ..
| .dec (k := k) .. | .setTag (k := k) .. | .del (k := k) .. => go k
| .fun decl k _ | .jp decl k => go decl.value; go k
| .cases c => c.alts.forM fun alt => go alt.getCode
| .unreach .. | .return .. | .jmp .. => return ()
@@ -1017,7 +1088,7 @@ Return `true` if `decl` is supposed to be inlined/specialized.
-/
def Decl.isTemplateLike (decl : Decl pu) : CoreM Bool := do
let env ← getEnv
if hasLocalInst decl.type then
if !hasNospecializeAttribute env decl.name && (← hasLocalInst decl.type) then
return true -- `decl` applications will be specialized
else if (← isImplicitReducible decl.name) then
return true -- `decl` is "fuel" for code specialization
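The first branch of `isTemplateLike` can be illustrated from the user's side: a definition taking a local instance argument is a specialization template by default, and `@[nospecialize]` (the attribute `hasNospecializeAttribute` checks for) opts out. A hedged sketch with hypothetical definitions:

```lean
-- Template-like: call sites get copies specialized to the `Add α` instance.
def sumWith [Add α] (xs : List α) (init : α) : α :=
  xs.foldl (· + ·) init

-- Opted out: `isTemplateLike` now returns false for this declaration,
-- so applications stay generic.
@[nospecialize] def sumWithGeneric [Add α] (xs : List α) (init : α) : α :=
  xs.foldl (· + ·) init
```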
@@ -1053,7 +1124,7 @@ private def collectLetValue (e : LetValue pu) (s : FVarIdHashSet) : FVarIdHashSe
| .fvar fvarId args => collectArgs args <| s.insert fvarId
| .const _ _ args _ | .pap _ args _ | .fap _ args _ | .ctor _ args _ => collectArgs args s
| .proj _ _ fvarId _ | .sproj _ _ fvarId _ | .uproj _ fvarId _ | .oproj _ fvarId _
| .reset _ fvarId _ | .box _ fvarId _ | .unbox fvarId _ => s.insert fvarId
| .reset _ fvarId _ | .box _ fvarId _ | .unbox fvarId _ | .isShared fvarId _ => s.insert fvarId
| .lit .. | .erased => s
| .reuse fvarId _ _ args _ => collectArgs args <| s.insert fvarId
@@ -1082,7 +1153,12 @@ partial def Code.collectUsed (code : Code pu) (s : FVarIdHashSet := {}) : FVarId
let s := s.insert fvarId
let s := s.insert y
k.collectUsed s
| .inc (fvarId := fvarId) (k := k) .. | .dec (fvarId := fvarId) (k := k) .. =>
| .oset fvarId _ y k _ =>
let s := s.insert fvarId
let s := if let .fvar y := y then s.insert y else s
k.collectUsed s
| .inc (fvarId := fvarId) (k := k) .. | .dec (fvarId := fvarId) (k := k) ..
| .del (fvarId := fvarId) (k := k) .. | .setTag (fvarId := fvarId) (k := k) .. =>
k.collectUsed <| s.insert fvarId
end
@@ -1095,7 +1171,11 @@ def CodeDecl.collectUsed (codeDecl : CodeDecl pu) (s : FVarIdHashSet := ∅) : F
| .jp decl | .fun decl _ => decl.collectUsed s
| .sset (fvarId := fvarId) (y := y) .. | .uset (fvarId := fvarId) (y := y) .. =>
s.insert fvarId |>.insert y
| .inc (fvarId := fvarId) .. | .dec (fvarId := fvarId) .. => s.insert fvarId
| .oset (fvarId := fvarId) (y := y) .. =>
let s := s.insert fvarId
if let .fvar y := y then s.insert y else s
| .inc (fvarId := fvarId) .. | .dec (fvarId := fvarId) .. | .setTag (fvarId := fvarId) ..
| .del (fvarId := fvarId) .. => s.insert fvarId
/--
Traverse the given block of potentially mutually recursive functions
@@ -1125,7 +1205,8 @@ where
modify fun s => s.insert declName
| _ => pure ()
visit k
| .uset (k := k) .. | .sset (k := k) .. | .inc (k := k) .. | .dec (k := k) .. => visit k
| .oset (k := k) .. | .uset (k := k) .. | .sset (k := k) .. | .inc (k := k) ..
| .dec (k := k) .. | .del (k := k) .. | .setTag (k := k) .. => visit k
go : StateM NameSet Unit :=
decls.forM (·.value.forCodeM visit)


@@ -68,7 +68,8 @@ where
eraseCode k
eraseParam auxParam
return .unreach typeNew
| .sset (k := k) .. | .uset (k := k) .. | .inc (k := k) .. | .dec (k := k) .. =>
| .oset (k := k) .. | .sset (k := k) .. | .uset (k := k) .. | .inc (k := k) .. | .dec (k := k) ..
| .del (k := k) .. | .setTag (k := k) .. =>
return c.updateCont! (← go k)
instance : MonadCodeBind CompilerM where
