Mirror of https://github.com/leanprover/lean4.git (synced 2026-04-10 22:24:07 +00:00)
Compare commits: `hbv/inline` ... `v4.29.0` (46 commits)
Commits:

- 98dc76e3c0
- 58db58cad3
- d9764d755f
- 6f69c914f4
- 513160ea59
- dc8c5e9984
- 38bfb46cd0
- f5d7f18743
- 2ce5c19116
- ee890fd1e5
- b191f74011
- ab91146423
- 66b444c62a
- d7ebdca954
- d4c15567af
- 09b2b0cdf4
- 9cc7aa8902
- 0bbadfa02a
- 04d3ba35de
- b1ce232903
- 0b6cf8e962
- 00659f8e60
- da0bdb2e07
- 596d13f6d4
- ef26f95ee6
- ed910f9b59
- 1c667a8279
- 847c32c0df
- b83c0eefc3
- 1a612b77f6
- 1c61ba6420
- 51cdf26936
- 82f6653bca
- e8ebee6001
- 95583d74bd
- b9bfacd2da
- 4962edfada
- 86a8eb0051
- 5d86aa4032
- fd226c813d
- ca43b60947
- 6979644c23
- 52032bde9c
- 2152eddfb4
- 83e54b65b6
- d155c86f9c
@@ -4,25 +4,29 @@ To build Lean you should use `make -j$(nproc) -C build/release`.

## Running Tests

See `tests/README.md` for full documentation. Quick reference:
See `doc/dev/testing.md` for full documentation. Quick reference:

```bash
# Full test suite (use after builds to verify correctness)
CTEST_PARALLEL_LEVEL="$(nproc)" CTEST_OUTPUT_ON_FAILURE=1 \
  make -C build/release -j "$(nproc)" test
make -j$(nproc) -C build/release test ARGS="-j$(nproc)"

# Specific test by name (supports regex via ctest -R)
CTEST_PARALLEL_LEVEL="$(nproc)" CTEST_OUTPUT_ON_FAILURE=1 \
  make -C build/release -j "$(nproc)" test ARGS='-R grind_ematch'
make -j$(nproc) -C build/release test ARGS='-R grind_ematch --output-on-failure'

# Rerun only previously failed tests
CTEST_PARALLEL_LEVEL="$(nproc)" CTEST_OUTPUT_ON_FAILURE=1 \
  make -C build/release -j "$(nproc)" test ARGS='--rerun-failed'
make -j$(nproc) -C build/release test ARGS='--rerun-failed --output-on-failure'

# Single test from tests/foo/bar/ (quick check during development)
cd tests/foo/bar && ./run_test example_test.lean
# Single test from tests/lean/run/ (quick check during development)
cd tests/lean/run && ./test_single.sh example_test.lean

# ctest directly (from stage1 build dir)
cd build/release/stage1 && ctest -j$(nproc) --output-on-failure --timeout 300
```

The full test suite includes `tests/lean/`, `tests/lean/run/`, `tests/lean/interactive/`,
`tests/compiler/`, `tests/pkg/`, Lake tests, and more. Using `make test` or `ctest` runs
all of them; `test_single.sh` in `tests/lean/run/` only covers that one directory.

## New features

When asked to implement new features:

@@ -30,6 +34,8 @@ When asked to implement new features:
* write comprehensive tests first (expecting that these will initially fail)
* and then iterate on the implementation until the tests pass.

All new tests should go in `tests/lean/run/`. These tests don't have fixed expected output; we just check that there are no errors. Use `#guard_msgs` when a specific message needs to be checked.

## Success Criteria

*Never* report success on a task unless you have verified both a clean build without errors and that the relevant tests pass.

@@ -121,6 +121,24 @@ The nightly build system uses branches and tags across two repositories:

When a nightly succeeds with mathlib, all three should point to the same commit. Don't confuse these: branches are in the main lean4 repo, dated tags are in lean4-nightly.
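
A consistency check of the three refs can be sketched as follows (illustrative; the ref names and SHAs below are hypothetical, and in practice the values would come from `git ls-remote` against the two repositories):

```python
def nightly_in_sync(shas: dict[str, str]) -> bool:
    # After a successful nightly, every tracked ref (branches in lean4,
    # the dated tag in lean4-nightly) should resolve to the same commit.
    return len(set(shas.values())) == 1

# Hypothetical ref names and SHAs, for illustration only.
shas = {
    "lean4:nightly": "abc123",
    "lean4:nightly-with-mathlib": "abc123",
    "lean4-nightly:nightly-2026-04-10": "abc123",
}
print(nightly_in_sync(shas))
```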

## CI Failures: Investigate Immediately

**CRITICAL: If the checklist reports `❌ CI: X check(s) failing` for any PR, investigate immediately.**

Do NOT:
- Report it as "CI in progress" or "some checks pending"
- Wait for the remaining checks to finish before investigating
- Assume it's a transient failure without checking

DO:
1. Run `gh pr checks <number> --repo <owner>/<repo>` to see which specific check failed
2. Run `gh run view <run-id> --repo <owner>/<repo> --log-failed` to see the failure output
3. Diagnose the failure and report clearly to the user: what failed and why
4. Propose a fix if one is obvious (e.g., a subverso version mismatch or a transient elan install error)

The checklist now distinguishes `❌ X check(s) failing, Y still in progress` from `🔄 Y check(s) in progress`.
Any `❌` in CI status requires immediate investigation; do not move on.

## Waiting for CI or Merges

Use `gh pr checks --watch` to block until a PR's CI checks complete (no polling needed).

@@ -135,6 +153,10 @@ For multiple PRs, launch one background command per PR in parallel. When each completes
you'll be notified automatically via a task-notification. Do NOT use sleep-based polling
loops; `--watch` is event-driven and exits as soon as checks finish.

Note: `gh pr checks --watch` exits as soon as ALL checks complete (pass or fail). If some checks
fail while others are still running, `--watch` will continue until everything settles, then exit
with a non-zero code. So a background `--watch` finishing means all checks are done; check which failed.

## Error Handling

**CRITICAL**: If something goes wrong or a command fails:

@@ -1,26 +0,0 @@

---
name: profiling
description: Profile Lean programs with demangled names using samply and Firefox Profiler. Use when the user asks to profile a Lean binary or investigate performance.
allowed-tools: Bash, Read, Glob, Grep
---

# Profiling Lean Programs

Full documentation: `script/PROFILER_README.md`.

## Quick Start

```bash
script/lean_profile.sh ./build/release/stage1/bin/lean some_file.lean
```

Requires `samply` (`cargo install samply`) and `python3`.

## Agent Notes

- The pipeline is interactive (it serves results to the browser at the end). When running non-interactively, run the steps manually instead of using the wrapper script.
- The three steps are: `samply record --save-only`, `symbolicate_profile.py`, then `serve_profile.py`.
- `lean_demangle.py` works standalone as a stdin filter (like `c++filt`) for quick name lookups.
- The `--raw` flag on `lean_demangle.py` gives exact demangled names without postprocessing (keeps `._redArg`, `._lam_0` suffixes as-is).
- Use `PROFILE_KEEP=1` to keep the temp directory for later inspection.
- The demangled profile is a standard Firefox Profiler JSON. Function names live in `threads[i].stringArray`, indexed by `threads[i].funcTable.name`.
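
The JSON layout in the last bullet can be explored with a short standalone script (a sketch; the profile fragment below is hand-built for illustration, not real samply output):

```python
# Extract demangled function names from a Firefox Profiler JSON, following
# the layout noted above: names live in threads[i].stringArray, indexed by
# the integers in threads[i].funcTable.name.
profile = {
    "threads": [
        {
            "stringArray": ["main", "Lean.Elab.elabTerm", "lean_apply_1"],
            "funcTable": {"name": [0, 1, 2]},
        }
    ]
}

def function_names(profile: dict) -> list[str]:
    names = []
    for thread in profile["threads"]:
        strings = thread["stringArray"]
        names.extend(strings[i] for i in thread["funcTable"]["name"])
    return names

print(function_names(profile))
```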

**.github/workflows/awaiting-manual.yml** (9 changes)

```yaml
@@ -2,19 +2,16 @@ name: Check awaiting-manual label

on:
  merge_group:
  pull_request_target:
  pull_request:
    types: [opened, synchronize, reopened, labeled, unlabeled]

permissions:
  pull-requests: read

jobs:
  check-awaiting-manual:
    runs-on: ubuntu-latest
    steps:
      - name: Check awaiting-manual label
        id: check-awaiting-manual-label
        if: github.event_name == 'pull_request_target'
        if: github.event_name == 'pull_request'
        uses: actions/github-script@v8
        with:
          script: |
@@ -31,7 +28,7 @@ jobs:
            }

      - name: Wait for manual compatibility
        if: github.event_name == 'pull_request_target' && steps.check-awaiting-manual-label.outputs.awaiting == 'true'
        if: github.event_name == 'pull_request' && steps.check-awaiting-manual-label.outputs.awaiting == 'true'
        run: |
          echo "::notice title=Awaiting manual::PR is marked 'awaiting-manual' but neither 'breaks-manual' nor 'builds-manual' labels are present."
          echo "This check will remain in progress until the PR is updated with appropriate manual compatibility labels."
```
**.github/workflows/awaiting-mathlib.yml** (9 changes)

```yaml
@@ -2,19 +2,16 @@ name: Check awaiting-mathlib label

on:
  merge_group:
  pull_request_target:
  pull_request:
    types: [opened, synchronize, reopened, labeled, unlabeled]

permissions:
  pull-requests: read

jobs:
  check-awaiting-mathlib:
    runs-on: ubuntu-latest
    steps:
      - name: Check awaiting-mathlib label
        id: check-awaiting-mathlib-label
        if: github.event_name == 'pull_request_target'
        if: github.event_name == 'pull_request'
        uses: actions/github-script@v8
        with:
          script: |
@@ -31,7 +28,7 @@ jobs:
            }

      - name: Wait for mathlib compatibility
        if: github.event_name == 'pull_request_target' && steps.check-awaiting-mathlib-label.outputs.awaiting == 'true'
        if: github.event_name == 'pull_request' && steps.check-awaiting-mathlib-label.outputs.awaiting == 'true'
        run: |
          echo "::notice title=Awaiting mathlib::PR is marked 'awaiting-mathlib' but neither 'breaks-mathlib' nor 'builds-mathlib' labels are present."
          echo "This check will remain in progress until the PR is updated with appropriate mathlib compatibility labels."
```
**.github/workflows/build-template.yml** (18 changes)

```yaml
@@ -79,7 +79,7 @@ jobs:
      - name: CI Merge Checkout
        run: |
          git fetch --depth=1 origin ${{ github.sha }}
          git checkout FETCH_HEAD flake.nix flake.lock script/prepare-* tests/elab/importStructure.lean
          git checkout FETCH_HEAD flake.nix flake.lock script/prepare-* tests/lean/run/importStructure.lean
        if: github.event_name == 'pull_request'
      # (needs to be after "Checkout" so files don't get overridden)
      - name: Setup emsdk
@@ -229,21 +229,25 @@ jobs:
        # prefix `if` above with `always` so it's run even if tests failed
        if: always() && steps.test.conclusion != 'skipped'
      - name: Check Test Binary
        run: ${{ matrix.binary-check }} tests/compile/534.lean.out
        run: ${{ matrix.binary-check }} tests/compiler/534.lean.out
        if: (!matrix.cross) && steps.test.conclusion != 'skipped'
      - name: Build Stage 2
        run: |
          make -C build -j$NPROC stage2
        if: matrix.test-bench
        if: matrix.test-speedcenter
      - name: Check Stage 3
        run: |
          make -C build -j$NPROC check-stage3
        if: matrix.check-stage3
      - name: Test Benchmarks
      - name: Test Speedcenter Benchmarks
        run: |
          cd tests
          nix develop -c make -C ../build -j$NPROC bench
        if: matrix.test-bench
          # Necessary for some timing metrics but does not work on Namespace runners
          # and we just want to test that the benchmarks run at all here
          #echo -1 | sudo tee /proc/sys/kernel/perf_event_paranoid
          export BUILD=$PWD/build PATH=$PWD/build/stage1/bin:$PATH
          cd tests/bench
          nix shell .#temci -c temci exec --config speedcenter.yaml --included_blocks fast --runs 1
        if: matrix.test-speedcenter
      - name: Check rebootstrap
        run: |
          set -e
```
**.github/workflows/check-stdlib-flags.yml** (5 changes)

```yaml
@@ -1,12 +1,9 @@
name: Check stdlib_flags.h modifications

on:
  pull_request_target:
  pull_request:
    types: [opened, synchronize, reopened, labeled, unlabeled]

permissions:
  pull-requests: read

jobs:
  check-stdlib-flags:
    runs-on: ubuntu-latest
```
**.github/workflows/ci.yml** (38 changes)

```yaml
@@ -163,6 +163,34 @@ jobs:
          echo "Version validation passed: $TAG_MAJOR.$TAG_MINOR.$TAG_PATCH"

          # Also check stage0/src/CMakeLists.txt — the stage0 compiler stamps .olean
          # headers with its baked-in version, so a mismatch produces .olean files
          # with the wrong version in the release tarball.
          STAGE0_MAJOR=$(grep -E "^set\(LEAN_VERSION_MAJOR " stage0/src/CMakeLists.txt | grep -oE '[0-9]+')
          STAGE0_MINOR=$(grep -E "^set\(LEAN_VERSION_MINOR " stage0/src/CMakeLists.txt | grep -oE '[0-9]+')

          STAGE0_ERRORS=""
          if [[ "$STAGE0_MAJOR" != "$TAG_MAJOR" ]]; then
            STAGE0_ERRORS+="LEAN_VERSION_MAJOR: expected $TAG_MAJOR, found $STAGE0_MAJOR\n"
          fi
          if [[ "$STAGE0_MINOR" != "$TAG_MINOR" ]]; then
            STAGE0_ERRORS+="LEAN_VERSION_MINOR: expected $TAG_MINOR, found $STAGE0_MINOR\n"
          fi

          if [[ -n "$STAGE0_ERRORS" ]]; then
            echo "::error::Version mismatch between tag and stage0/src/CMakeLists.txt"
            echo ""
            echo "Tag ${{ steps.set-release.outputs.RELEASE_TAG }} expects version $TAG_MAJOR.$TAG_MINOR.$TAG_PATCH"
            echo "But stage0/src/CMakeLists.txt has mismatched values:"
            echo -e "$STAGE0_ERRORS"
            echo ""
            echo "The stage0 compiler stamps .olean headers with its baked-in version."
            echo "Run 'make update-stage0' to rebuild stage0 with the correct version, then re-tag."
            exit 1
          fi

          echo "stage0 version validation passed: $STAGE0_MAJOR.$STAGE0_MINOR"

          # 0: PRs without special label
          # 1: PRs with `merge-ci` label, merge queue checks, master commits
          # 2: nightlies
@@ -258,8 +286,8 @@ jobs:
              "check-rebootstrap": level >= 1,
              "check-stage3": level >= 2,
              "test": true,
              // NOTE: `test-bench` currently seems to be broken on `ubuntu-latest`
              "test-bench": large && level >= 2,
              // NOTE: `test-speedcenter` currently seems to be broken on `ubuntu-latest`
              "test-speedcenter": large && level >= 2,
              // We are not warning-free yet on all platforms, start here
              "CMAKE_OPTIONS": "-DLEAN_EXTRA_CXX_FLAGS=-Werror",
            },
@@ -269,16 +297,14 @@ jobs:
              "enabled": level >= 2,
              "test": true,
              "CMAKE_PRESET": "reldebug",
              // * `elab_bench/big_do` crashes with exit code 134
              "CTEST_OPTIONS": "-E 'elab_bench/big_do'",
            },
            {
              "name": "Linux fsanitize",
              // Always run on large if available, more reliable regarding timeouts
              "os": large ? "nscloud-ubuntu-22.04-amd64-16x32-with-cache" : "ubuntu-latest",
              "enabled": level >= 2,
              // do not fail nightlies on this for now
              "secondary": level <= 2,
              // do not fail releases/nightlies on this for now
              "secondary": true,
              "test": true,
              // turn off custom allocator & symbolic functions to make LSAN do its magic
              "CMAKE_PRESET": "sanitize",
```
**.github/workflows/pr-body.yml** (10 changes)

```yaml
@@ -2,23 +2,17 @@ name: Check PR body for changelog convention

on:
  merge_group:
  pull_request_target:
  pull_request:
    types: [opened, synchronize, reopened, edited, labeled, converted_to_draft, ready_for_review]

permissions:
  pull-requests: read

jobs:
  check-pr-body:
    runs-on: ubuntu-latest
    steps:
      - name: Check PR body
        if: github.event_name == 'pull_request_target'
        if: github.event_name == 'pull_request'
        uses: actions/github-script@v8
        with:
          # Safety note: this uses pull_request_target, so the workflow has elevated privileges.
          # The PR title and body are only used in regex tests (read-only string matching),
          # never interpolated into shell commands, eval'd, or written to GITHUB_ENV/GITHUB_OUTPUT.
          script: |
            const { title, body, labels, draft } = context.payload.pull_request;
            if (!draft && /^(feat|fix):/.test(title) && !labels.some(label => label.name == "changelog-no")) {
```
**.gitignore** (1 change)

```
@@ -1,6 +1,7 @@
*~
\#*
.#*
*.lock
.lake
lake-manifest.json
/build
```
```cmake
@@ -1,8 +1,4 @@
cmake_minimum_required(VERSION 3.21)

if(NOT CMAKE_GENERATOR MATCHES "Makefiles")
  message(FATAL_ERROR "Only makefile generators are supported")
endif()
cmake_minimum_required(VERSION 3.11)

option(USE_MIMALLOC "use mimalloc" ON)

@@ -151,7 +147,6 @@ ExternalProject_Add(
  INSTALL_COMMAND ""
  DEPENDS stage2
  EXCLUDE_FROM_ALL ON
  STEP_TARGETS configure
)

# targets forwarded to appropriate stages
@@ -162,25 +157,6 @@ add_custom_target(update-stage0-commit COMMAND $(MAKE) -C stage1 update-stage0-c

add_custom_target(test COMMAND $(MAKE) -C stage1 test DEPENDS stage1)

add_custom_target(
  bench
  COMMAND $(MAKE) -C stage2
  COMMAND $(MAKE) -C stage2 -j1 bench
  DEPENDS stage2
)
add_custom_target(
  bench-part1
  COMMAND $(MAKE) -C stage2
  COMMAND $(MAKE) -C stage2 -j1 bench-part1
  DEPENDS stage2
)
add_custom_target(
  bench-part2
  COMMAND $(MAKE) -C stage2
  COMMAND $(MAKE) -C stage2 -j1 bench-part2
  DEPENDS stage2
)

add_custom_target(clean-stdlib COMMAND $(MAKE) -C stage1 clean-stdlib DEPENDS stage1)

install(CODE "execute_process(COMMAND make -C stage1 install)")
```
```json
@@ -41,7 +41,7 @@
      "SMALL_ALLOCATOR": "OFF",
      "USE_MIMALLOC": "OFF",
      "BSYMBOLIC": "OFF",
      "LEAN_TEST_VARS": "MAIN_STACK_SIZE=16000 TEST_STACK_SIZE=16000 LSAN_OPTIONS=max_leaks=10"
      "LEAN_TEST_VARS": "MAIN_STACK_SIZE=16000 LSAN_OPTIONS=max_leaks=10"
    },
    "generator": "Unix Makefiles",
    "binaryDir": "${sourceDir}/build/sanitize"
```
@@ -1,9 +1,5 @@

# Test Suite

**Warning:** This document is partially outdated.
It describes the old test suite, which is currently in the process of being replaced.
The new test suite's documentation can be found at [`tests/README.md`](../../tests/README.md).

After [building Lean](../make/index.md) you can run all the tests using

```
cd build/release
```

@@ -1 +1 @@

```
../../../build/release/stage1
lean4
```

@@ -1 +1 @@

```
build/release/stage1
lean4
```
```json
@@ -2,9 +2,21 @@
  "folders": [
    {
      "path": "."
    },
    {
      "path": "src"
    },
    {
      "path": "tests"
    },
    {
      "path": "script"
    }
  ],
  "settings": {
    // Open terminal at root, not current workspace folder
    // (there is no way to directly refer to the root folder included as `.` above)
    "terminal.integrated.cwd": "${workspaceFolder:src}/..",
    "files.insertFinalNewline": true,
    "files.trimTrailingWhitespace": true,
    "cmake.buildDirectory": "${workspaceFolder}/build/release",
```
```lean
@@ -83,7 +83,7 @@ def main (args : List String) : IO Unit := do
    lastRSS? := some rss

  let avgRSSDelta := totalRSSDelta / (n - 2)
  IO.println s!"measurement: avg-reelab-rss-delta {avgRSSDelta*1024} b"
  IO.println s!"avg-reelab-rss-delta: {avgRSSDelta}"

  let _ ← Ipc.collectDiagnostics requestNo uri versionNo
  (← Ipc.stdin).writeLspMessage (Message.notification "exit" none)

@@ -82,7 +82,7 @@ def main (args : List String) : IO Unit := do
    lastRSS? := some rss

  let avgRSSDelta := totalRSSDelta / (n - 2)
  IO.println s!"measurement: avg-reelab-rss-delta {avgRSSDelta*1024} b"
  IO.println s!"avg-reelab-rss-delta: {avgRSSDelta}"

  let _ ← Ipc.collectDiagnostics requestNo uri versionNo
  Ipc.shutdown requestNo
```
```bash
@@ -9,5 +9,5 @@ find -regex '.*/CMakeLists\.txt\(\.in\)?\|.*\.cmake\(\.in\)?' \
  ! -path "./stage0/*" \
  -exec \
    uvx gersemi --in-place --line-length 120 --indent 2 \
      --definitions src/cmake/Modules/ src/CMakeLists.txt tests/CMakeLists.txt \
      --definitions src/cmake/Modules/ src/CMakeLists.txt \
      -- {} +
```
```python
@@ -1,4 +1,4 @@
#!/usr/bin/env python3
#!/usr/bin/env python
# -*- coding: utf-8 -*-
#
# Copyright (c) 2015 Microsoft Corporation. All rights reserved.

@@ -1,4 +1,4 @@
#!/usr/bin/env python3
#!/usr/bin/env python
# -*- coding: utf-8 -*-
#
# Copyright (c) 2015 Microsoft Corporation. All rights reserved.
```
@@ -1 +1 @@

```
../build/release/stage1
lean4
```
```bash
@@ -1,4 +1,4 @@
#!/usr/bin/env bash
#!/bin/bash
# Profile a Lean binary with demangled names.
#
# Usage:
```
```bash
@@ -1,7 +1,7 @@
#!/usr/bin/env bash
set -euo pipefail

rm -rf stage0 || true
rm -r stage0 || true
# don't copy untracked files
# `:!` is git glob flavor for exclude patterns
for f in $(git ls-files src ':!:src/lake/*' ':!:src/Leanc.lean'); do
```
@@ -11,7 +11,7 @@ IMPORTANT: Keep this documentation up-to-date when modifying the script's behavior.

What this script does:
1. Validates preliminary Lean4 release infrastructure:
   - Checks that the release branch (releases/vX.Y.0) exists
   - Verifies CMake version settings are correct
   - Verifies CMake version settings are correct (both src/ and stage0/)
   - Confirms the release tag exists
   - Validates the release page exists on GitHub (created automatically by CI after tag push)
   - Checks the release notes page on lean-lang.org (updated while bumping the `reference-manual` repository)
```python
@@ -326,6 +326,42 @@ def check_cmake_version(repo_url, branch, version_major, version_minor, github_t
    print(f" ✅ CMake version settings are correct in {cmake_file_path}")
    return True

def check_stage0_version(repo_url, branch, version_major, version_minor, github_token):
    """Verify that stage0/src/CMakeLists.txt has the same version as src/CMakeLists.txt.

    The stage0 pre-built binaries stamp .olean headers with their baked-in version.
    If stage0 has a different version (e.g. from a 'begin development cycle' bump),
    the release tarball will contain .olean files with the wrong version.
    """
    stage0_cmake = "stage0/src/CMakeLists.txt"
    content = get_branch_content(repo_url, branch, stage0_cmake, github_token)
    if content is None:
        print(f" ❌ Could not retrieve {stage0_cmake} from {branch}")
        return False

    errors = []
    for line in content.splitlines():
        stripped = line.strip()
        if stripped.startswith("set(LEAN_VERSION_MAJOR "):
            actual = stripped.split()[-1].rstrip(")")
            if actual != str(version_major):
                errors.append(f"LEAN_VERSION_MAJOR: expected {version_major}, found {actual}")
        elif stripped.startswith("set(LEAN_VERSION_MINOR "):
            actual = stripped.split()[-1].rstrip(")")
            if actual != str(version_minor):
                errors.append(f"LEAN_VERSION_MINOR: expected {version_minor}, found {actual}")

    if errors:
        print(f" ❌ stage0 version mismatch in {stage0_cmake}:")
        for error in errors:
            print(f"   {error}")
        print(f" The stage0 compiler stamps .olean headers with its baked-in version.")
        print(f" Run `make update-stage0` to rebuild stage0 with the correct version.")
        return False

    print(f" ✅ stage0 version matches in {stage0_cmake}")
    return True

def extract_org_repo_from_url(repo_url):
    """Extract the 'org/repo' part from a GitHub URL."""
    if repo_url.startswith("https://github.com/"):
```
```python
@@ -441,7 +477,10 @@ def get_pr_ci_status(repo_url, pr_number, github_token):
    conclusions = [run['conclusion'] for run in check_runs if run.get('status') == 'completed']
    in_progress = [run for run in check_runs if run.get('status') in ['queued', 'in_progress']]

    failed = sum(1 for c in conclusions if c in ['failure', 'timed_out', 'action_required'])
    if in_progress:
        if failed > 0:
            return "failure", f"{failed} check(s) failing, {len(in_progress)} still in progress"
        return "pending", f"{len(in_progress)} check(s) in progress"

    if not conclusions:
@@ -450,7 +489,6 @@ def get_pr_ci_status(repo_url, pr_number, github_token):
    if all(c == 'success' for c in conclusions):
        return "success", f"All {len(conclusions)} checks passed"

    failed = sum(1 for c in conclusions if c in ['failure', 'timed_out', 'action_required'])
    if failed > 0:
        return "failure", f"{failed} check(s) failed"
```
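
The resulting classification logic can be exercised in isolation (a sketch; the run dicts are simplified stand-ins for GitHub's check-run objects, and the fallback branch is assumed):

```python
def classify(check_runs):
    # Mirror of the status logic above: count failing conclusions among
    # completed runs, and report failures even while other checks are
    # still queued or in progress.
    conclusions = [r['conclusion'] for r in check_runs if r.get('status') == 'completed']
    in_progress = [r for r in check_runs if r.get('status') in ['queued', 'in_progress']]
    failed = sum(1 for c in conclusions if c in ['failure', 'timed_out', 'action_required'])
    if in_progress:
        if failed > 0:
            return "failure", f"{failed} check(s) failing, {len(in_progress)} still in progress"
        return "pending", f"{len(in_progress)} check(s) in progress"
    if not conclusions:
        return "unknown", "no completed checks"
    if all(c == 'success' for c in conclusions):
        return "success", f"All {len(conclusions)} checks passed"
    if failed > 0:
        return "failure", f"{failed} check(s) failed"
    return "unknown", "inconclusive checks"

runs = [
    {'status': 'completed', 'conclusion': 'failure'},
    {'status': 'in_progress'},
]
print(classify(runs))
```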
```python
@@ -680,6 +718,9 @@ def main():
    # Check CMake version settings
    if not check_cmake_version(lean_repo_url, branch_name, version_major, version_minor, github_token):
        lean4_success = False
    # Check that stage0 version matches (stage0 stamps .olean headers with its version)
    if not check_stage0_version(lean_repo_url, branch_name, version_major, version_minor, github_token):
        lean4_success = False

    # Check for tag and release page
    if not tag_exists(lean_repo_url, toolchain, github_token):
```
```python
@@ -965,14 +1006,15 @@ def main():
    # Find the actual minor version in CMakeLists.txt
    for line in cmake_lines:
        if line.strip().startswith("set(LEAN_VERSION_MINOR "):
            actual_minor = int(line.split()[-1].rstrip(")"))
            m = re.search(r'set\(LEAN_VERSION_MINOR\s+(\d+)', line)
            actual_minor = int(m.group(1)) if m else 0
            version_minor_correct = actual_minor >= next_minor
            break
    else:
        version_minor_correct = False

    is_release_correct = any(
        l.strip().startswith("set(LEAN_VERSION_IS_RELEASE 0)")
        re.match(r'set\(LEAN_VERSION_IS_RELEASE\s+0[\s)]', l.strip())
        for l in cmake_lines
    )
```
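
The replacement regexes can be sanity-checked standalone (a sketch; the sample CMake lines are illustrative, but the patterns are the ones introduced in the hunk above):

```python
import re

# Sample CMakeLists.txt lines of the form the regexes are meant to match.
lines = [
    "set(LEAN_VERSION_MINOR 30)",
    "set(LEAN_VERSION_IS_RELEASE 0) # This number is 1 in the release revision, and 0 otherwise.",
]

# Extract the minor version with the re.search-based variant.
m = re.search(r'set\(LEAN_VERSION_MINOR\s+(\d+)', lines[0])
minor = int(m.group(1)) if m else 0

# Detect LEAN_VERSION_IS_RELEASE 0, tolerating trailing comments/whitespace.
is_release_zero = any(
    re.match(r'set\(LEAN_VERSION_IS_RELEASE\s+0[\s)]', l.strip())
    for l in lines
)

print(minor, is_release_zero)
```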
```yaml
@@ -14,13 +14,6 @@ repositories:
    bump-branch: true
    dependencies: []

  - name: lean4checker
    url: https://github.com/leanprover/lean4checker
    toolchain-tag: true
    stable-branch: true
    branch: master
    dependencies: []

  - name: quote4
    url: https://github.com/leanprover-community/quote4
    toolchain-tag: true
```
```python
@@ -479,6 +479,25 @@ def execute_release_steps(repo, version, config):
        print(blue("Updating lakefile.toml..."))
        run_command(f'perl -pi -e \'s/"v4\\.[0-9]+(\\.[0-9]+)?(-rc[0-9]+)?"/"' + version + '"/g\' lakefile.*', cwd=repo_path)
        run_command("lake update", cwd=repo_path, stream_output=True)
    elif repo_name == "verso":
        # verso has nested Lake projects in test-projects/ that each have their own
        # lake-manifest.json with a subverso pin. After updating the root manifest via
        # `lake update`, sync the de-modulized subverso rev into all sub-manifests.
        # The sub-projects use an old toolchain (v4.21.0) that doesn't support module/prelude
        # syntax, so they need the de-modulized version (tagged no-modules/<root-rev>).
        # The "SubVerso version consistency" CI check accepts either the root or de-modulized rev.
        run_command("lake update", cwd=repo_path, stream_output=True)
        print(blue("Syncing de-modulized subverso rev to test-project sub-manifests..."))
        sync_script = (
            'ROOT_REV=$(jq -r \'.packages[] | select(.name == "subverso") | .rev\' lake-manifest.json); '
            'SUBVERSO_URL=$(jq -r \'.packages[] | select(.name == "subverso") | .url\' lake-manifest.json); '
            'DEMOD_REV=$(git ls-remote "$SUBVERSO_URL" "refs/tags/no-modules/$ROOT_REV" | awk \'{print $1}\'); '
            'find test-projects -name lake-manifest.json -print0 | while IFS= read -r -d \'\' f; do '
            'jq --arg rev "$DEMOD_REV" \'.packages |= map(if .name == "subverso" then .rev = $rev else . end)\' "$f" > /tmp/lm_tmp.json && mv /tmp/lm_tmp.json "$f"; '
            'done'
        )
        run_command(sync_script, cwd=repo_path)
        print(green("Synced de-modulized subverso rev to all test-project sub-manifests"))
    elif dependencies:
        run_command(f'perl -pi -e \'s/"v4\\.[0-9]+(\\.[0-9]+)?(-rc[0-9]+)?"/"' + version + '"/g\' lakefile.*', cwd=repo_path)
        run_command("lake update", cwd=repo_path, stream_output=True)
```
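
The jq filter in `sync_script` has a straightforward pure-Python equivalent, useful for checking the transformation in isolation (a sketch; the manifest dicts are illustrative stand-ins for real `lake-manifest.json` contents):

```python
def pin_subverso(manifest: dict, demod_rev: str) -> dict:
    # Same transformation as the jq filter above: rewrite the `rev` of the
    # `subverso` package entry, leaving every other package untouched.
    packages = [
        {**pkg, "rev": demod_rev} if pkg.get("name") == "subverso" else pkg
        for pkg in manifest["packages"]
    ]
    return {**manifest, "packages": packages}

# Hypothetical manifest contents and revs, for illustration only.
manifest = {
    "packages": [
        {"name": "subverso", "rev": "aaaa1111"},
        {"name": "other", "rev": "bbbb2222"},
    ]
}
updated = pin_subverso(manifest, "cccc3333")
print([p["rev"] for p in updated["packages"]])
```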
@@ -1,4 +1,6 @@
|
||||
cmake_minimum_required(VERSION 3.21)
|
||||
cmake_minimum_required(VERSION 3.10)
|
||||
cmake_policy(SET CMP0054 NEW)
|
||||
cmake_policy(SET CMP0110 NEW)
|
||||
if(NOT CMAKE_GENERATOR MATCHES "Unix Makefiles")
|
||||
message(FATAL_ERROR "The only supported CMake generator at the moment is 'Unix Makefiles'")
|
||||
endif()
|
||||
@@ -8,9 +10,9 @@ endif()
|
||||
include(ExternalProject)
|
||||
project(LEAN CXX C)
|
||||
set(LEAN_VERSION_MAJOR 4)
|
||||
set(LEAN_VERSION_MINOR 30)
|
||||
set(LEAN_VERSION_MINOR 29)
|
||||
set(LEAN_VERSION_PATCH 0)
|
||||
set(LEAN_VERSION_IS_RELEASE 0) # This number is 1 in the release revision, and 0 otherwise.
|
||||
set(LEAN_VERSION_IS_RELEASE 1) # This number is 1 in the release revision, and 0 otherwise.
|
||||
set(LEAN_SPECIAL_VERSION_DESC "" CACHE STRING "Additional version description like 'nightly-2018-03-11'")
|
||||
set(LEAN_VERSION_STRING "${LEAN_VERSION_MAJOR}.${LEAN_VERSION_MINOR}.${LEAN_VERSION_PATCH}")
|
||||
if(LEAN_SPECIAL_VERSION_DESC)
|
||||
|
||||
```lean
@@ -69,9 +69,11 @@ theorem em (p : Prop) : p ∨ ¬p :=
theorem exists_true_of_nonempty {α : Sort u} : Nonempty α → ∃ _ : α, True
  | ⟨x⟩ => ⟨x, trivial⟩

@[implicit_reducible]
noncomputable def inhabited_of_nonempty {α : Sort u} (h : Nonempty α) : Inhabited α :=
  ⟨choice h⟩

@[implicit_reducible]
noncomputable def inhabited_of_exists {α : Sort u} {p : α → Prop} (h : ∃ x, p x) : Inhabited α :=
  inhabited_of_nonempty (Exists.elim h (fun w _ => ⟨w⟩))

@@ -81,6 +83,7 @@ noncomputable scoped instance (priority := low) propDecidable (a : Prop) : Decid
  | Or.inl h => ⟨isTrue h⟩
  | Or.inr h => ⟨isFalse h⟩

@[implicit_reducible]
noncomputable def decidableInhabited (a : Prop) : Inhabited (Decidable a) where
  default := inferInstance
```
```lean
@@ -49,6 +49,7 @@ instance : Monad Id where
/--
The identity monad has a `bind` operator.
-/
@[implicit_reducible]
def hasBind : Bind Id :=
  inferInstance

@@ -58,7 +59,7 @@ Runs a computation in the identity monad.
This function is the identity function. Because its parameter has type `Id α`, it causes
`do`-notation in its arguments to use the `Monad Id` instance.
-/
@[always_inline, inline, expose]
@[always_inline, inline, expose, implicit_reducible]
protected def run (x : Id α) : α := x

instance [OfNat α n] : OfNat (Id α) n :=
```
```lean
@@ -72,11 +72,11 @@ public instance [Monad m] [LawfulMonad m] [MonadAttach m] [LawfulMonadAttach m]

public instance [Monad m] [MonadAttach m] [LawfulMonad m] [WeaklyLawfulMonadAttach m] :
    WeaklyLawfulMonadAttach (StateRefT' ω σ m) :=
  inferInstanceAs (WeaklyLawfulMonadAttach (ReaderT _ _))
  inferInstanceAs (WeaklyLawfulMonadAttach (ReaderT (ST.Ref ω σ) m))

public instance [Monad m] [MonadAttach m] [LawfulMonad m] [LawfulMonadAttach m] :
    LawfulMonadAttach (StateRefT' ω σ m) :=
  inferInstanceAs (LawfulMonadAttach (ReaderT _ _))
  inferInstanceAs (LawfulMonadAttach (ReaderT (ST.Ref ω σ) m))

section

@@ -103,11 +103,11 @@ namespace StateRefT'
instance {ω σ : Type} {m : Type → Type} [Monad m] : LawfulMonadLift m (StateRefT' ω σ m) where
  monadLift_pure _ := by
    simp only [MonadLift.monadLift, pure]
    unfold StateRefT'.lift ReaderT.pure
    unfold StateRefT'.lift instMonad._aux_5 ReaderT.pure
    simp only
  monadLift_bind _ _ := by
    simp only [MonadLift.monadLift, bind]
    unfold StateRefT'.lift ReaderT.bind
    unfold StateRefT'.lift instMonad._aux_13 ReaderT.bind
    simp only

end StateRefT'
```
@@ -1339,10 +1339,10 @@ transitive and contains `r`. `TransGen r a z` if and only if there exists a sequ
 -/
 inductive Relation.TransGen {α : Sort u} (r : α → α → Prop) : α → α → Prop
   /-- If `r a b`, then `TransGen r a b`. This is the base case of the transitive closure. -/
-  | single {a b : α} : r a b → TransGen r a b
+  | single {a b} : r a b → TransGen r a b
   /-- If `TransGen r a b` and `r b c`, then `TransGen r a c`.
   This is the inductive case of the transitive closure. -/
-  | tail {a b c : α} : TransGen r a b → r b c → TransGen r a c
+  | tail {a b c} : TransGen r a b → r b c → TransGen r a c
 
 /-- The transitive closure is transitive. -/
 theorem Relation.TransGen.trans {α : Sort u} {r : α → α → Prop} {a b c} :
 
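Editor's note, not part of the diff: a small example of the two `Relation.TransGen` constructors whose binders change above.

```lean
-- One base step (`single`) followed by one inductive step (`tail`)
-- witnesses membership in the transitive closure.
example {r : Nat → Nat → Prop} {a b c : Nat} (h₁ : r a b) (h₂ : r b c) :
    Relation.TransGen r a c :=
  (Relation.TransGen.single h₁).tail h₂
```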
@@ -283,7 +283,7 @@ Examples:
 * `#[1, 2].isEmpty = false`
 * `#[()].isEmpty = false`
 -/
-@[expose, inline]
+@[expose]
 def isEmpty (xs : Array α) : Bool :=
   xs.size = 0
 
@@ -377,7 +377,6 @@ Returns the last element of an array, or panics if the array is empty.
 Safer alternatives include `Array.back`, which requires a proof the array is non-empty, and
 `Array.back?`, which returns an `Option`.
 -/
-@[inline]
 def back! [Inhabited α] (xs : Array α) : α :=
   xs[xs.size - 1]!
 
@@ -387,7 +386,6 @@ Returns the last element of an array, given a proof that the array is not empty.
 See `Array.back!` for the version that panics if the array is empty, or `Array.back?` for the
 version that returns an option.
 -/
-@[inline]
 def back (xs : Array α) (h : 0 < xs.size := by get_elem_tactic) : α :=
   xs[xs.size - 1]'(Nat.sub_one_lt_of_lt h)
 
@@ -397,7 +395,6 @@ Returns the last element of an array, or `none` if the array is empty.
 See `Array.back!` for the version that panics if the array is empty, or `Array.back` for the version
 that requires a proof the array is non-empty.
 -/
-@[inline]
 def back? (xs : Array α) : Option α :=
   xs[xs.size - 1]?
 
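Editor's note, not part of the diff: the three `back` variants whose `@[inline]` attributes are removed above differ only in how the emptiness precondition is handled.

```lean
#eval #[1, 2, 3].back!             -- 3 (panics on an empty array)
#eval #[1, 2, 3].back              -- 3 (`0 < xs.size` discharged by `get_elem_tactic`)
#eval (#[] : Array Nat).back?      -- none
```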
@@ -72,9 +72,6 @@ theorem toArray_eq : List.toArray as = xs ↔ as = xs.toList := by
 
 /-! ### size -/
 
-theorem size_singleton {x : α} : #[x].size = 1 := by
-  simp
-
 theorem eq_empty_of_size_eq_zero (h : xs.size = 0) : xs = #[] := by
   cases xs
   simp_all
@@ -3486,21 +3483,6 @@ theorem foldl_eq_foldr_reverse {xs : Array α} {f : β → α → β} {b} :
 theorem foldr_eq_foldl_reverse {xs : Array α} {f : α → β → β} {b} :
     xs.foldr f b = xs.reverse.foldl (fun x y => f y x) b := by simp
 
-theorem foldl_eq_apply_foldr {xs : Array α} {f : α → α → α}
-    [Std.Associative f] [Std.LawfulRightIdentity f init] :
-    xs.foldl f x = f x (xs.foldr f init) := by
-  simp [← foldl_toList, ← foldr_toList, List.foldl_eq_apply_foldr]
-
-theorem foldr_eq_apply_foldl {xs : Array α} {f : α → α → α}
-    [Std.Associative f] [Std.LawfulLeftIdentity f init] :
-    xs.foldr f x = f (xs.foldl f init) x := by
-  simp [← foldl_toList, ← foldr_toList, List.foldr_eq_apply_foldl]
-
-theorem foldr_eq_foldl {xs : Array α} {f : α → α → α}
-    [Std.Associative f] [Std.LawfulIdentity f init] :
-    xs.foldr f init = xs.foldl f init := by
-  simp [foldl_eq_apply_foldr, Std.LawfulLeftIdentity.left_id]
-
 @[simp] theorem foldr_push_eq_append {as : Array α} {bs : Array β} {f : α → β} (w : start = as.size) :
     as.foldr (fun a xs => Array.push xs (f a)) bs start 0 = bs ++ (as.map f).reverse := by
   subst w
@@ -4353,33 +4335,16 @@ def sum_eq_sum_toList := @sum_toList
 
 @[simp, grind =]
 theorem sum_append [Zero α] [Add α] [Std.Associative (α := α) (· + ·)]
-    [Std.LawfulLeftIdentity (α := α) (· + ·) 0]
+    [Std.LeftIdentity (α := α) (· + ·) 0] [Std.LawfulLeftIdentity (α := α) (· + ·) 0]
     {as₁ as₂ : Array α} : (as₁ ++ as₂).sum = as₁.sum + as₂.sum := by
   simp [← sum_toList, List.sum_append]
 
-@[simp, grind =]
-theorem sum_singleton [Add α] [Zero α] [Std.LawfulRightIdentity (· + ·) (0 : α)] {x : α} :
-    #[x].sum = x := by
-  simp [Array.sum_eq_foldr, Std.LawfulRightIdentity.right_id x]
-
-@[simp, grind =]
-theorem sum_push [Add α] [Zero α] [Std.Associative (α := α) (· + ·)]
-    [Std.LawfulIdentity (· + ·) (0 : α)] {xs : Array α} {x : α} :
-    (xs.push x).sum = xs.sum + x := by
-  simp [Array.sum_eq_foldr, Std.LawfulRightIdentity.right_id, Std.LawfulLeftIdentity.left_id,
-    ← Array.foldr_assoc]
-
-@[simp, grind =]
-theorem sum_reverse [Zero α] [Add α] [Std.Associative (α := α) (· + ·)]
-    [Std.Commutative (α := α) (· + ·)]
-    [Std.LawfulLeftIdentity (α := α) (· + ·) 0] (xs : Array α) : xs.reverse.sum = xs.sum := by
-  simp [← sum_toList, List.sum_reverse]
-
 theorem sum_eq_foldl [Zero α] [Add α] [Std.Associative (α := α) (· + ·)]
     [Std.LawfulIdentity (· + ·) (0 : α)] {xs : Array α} :
     xs.sum = xs.foldl (init := 0) (· + ·) := by
   simp [← sum_toList, List.sum_eq_foldl]
 
 theorem foldl_toList_eq_flatMap {l : List α} {acc : Array β}
     {F : Array β → α → Array β} {G : α → List β}
     (H : ∀ acc a, (F acc a).toList = acc.toList ++ G a) :
@@ -126,14 +126,6 @@ theorem swap_perm {xs : Array α} {i j : Nat} (h₁ : i < xs.size) (h₂ : j < x
   simp only [swap, perm_iff_toList_perm, toList_set]
   apply set_set_perm
 
-theorem Perm.pairwise_iff {R : α → α → Prop} (S : ∀ {x y}, R x y → R y x) {xs ys : Array α}
-    : ∀ _p : xs.Perm ys, xs.toList.Pairwise R ↔ ys.toList.Pairwise R := by
-  simpa only [perm_iff_toList_perm] using List.Perm.pairwise_iff S
-
-theorem Perm.pairwise {R : α → α → Prop} {xs ys : Array α} (hp : xs ~ ys)
-    (hR : xs.toList.Pairwise R) (hsymm : ∀ {x y}, R x y → R y x) :
-    ys.toList.Pairwise R := (hp.pairwise_iff hsymm).mp hR
-
 namespace Perm
 
 set_option linter.indexVariables false in
 
@@ -2393,412 +2393,4 @@ theorem fastUmulOverflow (x y : BitVec w) :
   simp [← Nat.pow_add, show w + 1 - (k - 1) + k = w + 1 + 1 by omega] at this
   omega
 
-/-! ### Population Count -/
-
-/-- Extract the `k`-th bit from `x` and extend it to have length `len`. -/
-def extractAndExtendBit (idx len : Nat) (x : BitVec w) : BitVec len :=
-  BitVec.zeroExtend len (BitVec.extractLsb' idx 1 x)
-
-
-/-- Recursively extract one bit at a time and extend it to width `w` -/
-def extractAndExtendAux (k len : Nat) (x : BitVec w) (acc : BitVec (k * len)) (hle : k ≤ w) :
-    BitVec (w * len) :=
-  match hwi : w - k with
-  | 0 => acc.cast (by simp [show w = k by omega])
-  | n' + 1 =>
-    let acc' := extractAndExtendBit k len x ++ acc
-    extractAndExtendAux (k + 1) len x (acc'.cast (by simp [Nat.add_mul]; omega)) (by omega)
-termination_by w - k
-
-/-- We instantiate `extractAndExtendAux` to extend each bit to `len`, extending
-each bit in `x` to have width `w` and returning a `BitVec (w * w)`. -/
-def extractAndExtend (len : Nat) (x : BitVec w) : BitVec (w * len) :=
-  extractAndExtendAux 0 len x ((0#0).cast (by simp)) (by omega)
-
-/--
-Construct a layer of the parallel-prefix-sum tree by summing two-by-two all the
-`w`-long words in `oldLayer`, returning a bitvector containing `(oldLen + 1) / 2`
-flattened `w`-long words, each resulting from an addition.
--/
-def cpopLayer (oldLayer : BitVec (len * w)) (newLayer : BitVec (iterNum * w))
-    (hold : 2 * (iterNum - 1) < len) : BitVec (((len + 1)/2) * w) :=
-  if hlen : len - (iterNum * 2) = 0 then
-    have : ((len + 1)/2) = iterNum := by omega
-    newLayer.cast (by simp [this])
-  else
-    let op1 := oldLayer.extractLsb' ((2 * iterNum) * w) w
-    let op2 := oldLayer.extractLsb' ((2 * iterNum + 1) * w) w
-    let newLayer' := (op1 + op2) ++ newLayer
-    have hcast : w + iterNum * w = (iterNum + 1) * w := by simp [Nat.add_mul]; omega
-    cpopLayer oldLayer (newLayer'.cast hcast) (by omega)
-termination_by len - (iterNum * 2)
-
-/--
-Given a `BitVec (len * w)` of `len` flattened `w`-long words,
-construct a binary tree that sums two-by-two the `w`-long words in the previous layer,
-ultimately returning a single `w`-long words corresponding to the whole addition.
--/
-def cpopTree (l : BitVec (len * w)) : BitVec w :=
-  if h : len = 0 then 0#w
-  else if h : len = 1 then
-    l.cast (by simp [h])
-  else
-    cpopTree (cpopLayer l 0#(0 * w) (by omega))
-termination_by len
-
-/--
-Given flattened bitvector `x : BitVec w` and a length `l : Nat`,
-construct a parallel prefix sum circuit adding each available `l`-long word in `x`.
--/
-def cpopRec (x : BitVec w) : BitVec w :=
-  if hw : 1 < w then
-    let extendedBits := x.extractAndExtend w
-    (cpopTree extendedBits).cast (by simp)
-  else if hw' : 0 < w then
-    x
-  else
-    0#w
-
-/-- Recursive addition of the elements in a flattened bitvec, starting from the `rem`-th element. -/
-private def addRecAux (x : BitVec (l * w)) (rem : Nat) (acc : BitVec w) : BitVec w :=
-  match rem with
-  | 0 => acc
-  | n + 1 => x.addRecAux n (acc + x.extractLsb' (n * w) w)
-
-/-- Recursive addition of the elements in a flattened bitvec. -/
-private def addRec (x : BitVec (l * w)) : BitVec w := addRecAux x l 0#w
-
-theorem getLsbD_extractAndExtendBit {x : BitVec w} :
-    (extractAndExtendBit k len x).getLsbD i =
-      (decide (i = 0) && decide (0 < len) && x.getLsbD k) := by
-  simp only [extractAndExtendBit, truncate_eq_setWidth, getLsbD_setWidth, getLsbD_extractLsb',
-    Nat.lt_one_iff]
-  by_cases hi : i = 0
-  <;> simp [hi]
-
-@[simp]
-private theorem extractAndExtendAux_zero {k len : Nat} {x : BitVec w}
-    {acc : BitVec (k * len)} (heq : w = k) :
-    extractAndExtendAux k len x acc (by omega) = acc.cast (by simp [heq]) := by
-  unfold extractAndExtendAux
-  split
-  · simp
-  · omega
-
-private theorem extractLsb'_extractAndExtendAux {k len : Nat} {x : BitVec w}
-    (acc : BitVec (k * len)) (hle : k ≤ w) :
-    (∀ i (_ : i < k), acc.extractLsb' (i * len) len = (x.extractLsb' i 1).setWidth len) →
-    (extractAndExtendAux k len x acc (by omega)).extractLsb' (i * len) len =
-      (x.extractLsb' i 1).setWidth len := by
-  intros hacc
-  induction hwi : w - k generalizing acc k
-  · case zero =>
-    rw [extractAndExtendAux_zero (by omega)]
-    by_cases hj : i < k
-    · apply hacc
-      exact hj
-    · ext l hl
-      have := mul_le_mul_right (n := k) (m := i) len (by omega)
-      simp [← getLsbD_eq_getElem, getLsbD_extractLsb', hl, getLsbD_setWidth,
-        show w ≤ i + l by omega, getLsbD_of_ge acc (i * len + l) (by omega)]
-  · case succ n' ihn' =>
-    rw [extractAndExtendAux]
-    split
-    · omega
-    · apply ihn'
-      · intros i hi
-        have hcast : len + k * len = (k + 1) * len := by
-          simp [Nat.mul_comm, Nat.mul_add, Nat.add_comm]
-
-        by_cases hi' : i < k
-        · have heq : extractLsb' (i * len) len (BitVec.cast hcast (extractAndExtendBit k len x ++ acc)) =
-              extractLsb' (i * len) len ((extractAndExtendBit k len x ++ acc)) := by
-            ext; simp
-          rw [heq, extractLsb'_append_of_lt hi']
-          apply hacc
-          exact hi'
-        · have heq : extractLsb' (i * len) len (BitVec.cast hcast (extractAndExtendBit k len x ++ acc)) =
-              extractLsb' (i * len) len ((extractAndExtendBit k len x ++ acc)) := by
-            ext; simp
-          rw [heq, extractLsb'_append_of_eq (by omega)]
-          simp [show i = k by omega, extractAndExtendBit]
-      · omega
-
-theorem extractLsb'_cpopLayer {w iterNum i oldLen : Nat} {oldLayer : BitVec (oldLen * w)}
-    {newLayer : BitVec (iterNum * w)} (hold : 2 * (iterNum - 1) < oldLen) :
-    (∀ i (_hi: i < iterNum),
-      newLayer.extractLsb' (i * w) w =
-        oldLayer.extractLsb' ((2 * i) * w) w + (oldLayer.extractLsb' ((2 * i + 1) * w) w)) →
-    extractLsb' (i * w) w (oldLayer.cpopLayer newLayer hold) =
-      extractLsb' (2 * i * w) w oldLayer + extractLsb' ((2 * i + 1) * w) w oldLayer := by
-  intro proof_addition
-  rw [cpopLayer]
-  split
-  · by_cases hi : i < iterNum
-    · simp only [extractLsb'_cast]
-      apply proof_addition
-      exact hi
-    · ext j hj
-      have : iterNum * w ≤ i * w := by refine mul_le_mul_right w (by omega)
-      have : oldLen * w ≤ (2 * i) * w := by refine mul_le_mul_right w (by omega)
-      have : oldLen * w ≤ (2 * i + 1) * w := by refine mul_le_mul_right w (by omega)
-      have hz : extractLsb' (2 * i * w) w oldLayer = 0#w := by
-        ext j hj
-        simp [show oldLen * w ≤ 2 * i * w + j by omega]
-      have hz' : extractLsb' ((2 * i + 1) * w) w oldLayer = 0#w := by
-        ext j hj
-        simp [show oldLen * w ≤ (2 * i + 1) * w + j by omega]
-      simp [show iterNum * w ≤ i * w + j by omega, hz, hz']
-  · generalize hop1 : oldLayer.extractLsb' ((2 * iterNum) * w) w = op1
-    generalize hop2 : oldLayer.extractLsb' ((2 * iterNum + 1) * w) w = op2
-    have hcast : w + iterNum * w = (iterNum + 1) * w := by simp [Nat.add_mul]; omega
-    apply extractLsb'_cpopLayer
-    intros i hi
-    by_cases hlt : i < iterNum
-    · rw [extractLsb'_cast, extractLsb'_append_eq_of_add_le]
-      · apply proof_addition
-        exact hlt
-      · rw [show i * w + w = i * w + 1 * w by omega, ← Nat.add_mul]
-        exact mul_le_mul_right w hlt
-    · rw [extractLsb'_cast, show i = iterNum by omega, extractLsb'_append_eq_left, hop1, hop2]
-termination_by oldLen - 2 * (iterNum + 1 - 1)
-
-theorem getLsbD_cpopLayer {w iterNum: Nat} {oldLayer : BitVec (oldLen * w)}
-    {newLayer : BitVec (iterNum * w)} (hold : 2 * (iterNum - 1) < oldLen) :
-    (∀ i (_hi: i < iterNum),
-      newLayer.extractLsb' (i * w) w =
-        oldLayer.extractLsb' ((2 * i) * w) w + (oldLayer.extractLsb' ((2 * i + 1) * w) w)) →
-    (oldLayer.cpopLayer newLayer hold).getLsbD k =
-      (extractLsb' (2 * ((k - k % w) / w) * w) w oldLayer +
-        extractLsb' ((2 * ((k - k % w) / w) + 1) * w) w oldLayer).getLsbD (k % w) := by
-  intro proof_addition
-  by_cases hw0 : w = 0
-  · subst hw0
-    simp
-  · simp only [← extractLsb'_cpopLayer (hold := by omega) proof_addition,
-      Nat.mod_lt (x := k) (y := w) (by omega), getLsbD_eq_getElem, getElem_extractLsb']
-    congr
-    by_cases hmod : k % w = 0
-    · rw [hmod, Nat.sub_zero, Nat.add_zero, Nat.div_mul_cancel (by omega)]
-    · rw [Nat.div_mul_cancel (by exact dvd_sub_mod k), Nat.sub_add_cancel (by exact mod_le k w)]
-
-@[simp]
-private theorem addRecAux_zero {x : BitVec (l * w)} {acc : BitVec w} :
-    x.addRecAux 0 acc = acc := rfl
-
-@[simp]
-private theorem addRecAux_succ {x : BitVec (l * w)} {n : Nat} {acc : BitVec w} :
-    x.addRecAux (n + 1) acc = x.addRecAux n (acc + extractLsb' (n * w) w x) := rfl
-
-private theorem addRecAux_eq {x : BitVec (l * w)} {n : Nat} {acc : BitVec w} :
-    x.addRecAux n acc = x.addRecAux n 0#w + acc := by
-  induction n generalizing acc
-  · case zero =>
-    simp
-  · case succ n ihn =>
-    simp only [addRecAux_succ, BitVec.zero_add, ihn (acc := extractLsb' (n * w) w x),
-      BitVec.add_assoc, ihn (acc := acc + extractLsb' (n * w) w x), BitVec.add_right_inj]
-    rw [BitVec.add_comm (x := acc)]
-
-private theorem extractLsb'_addRecAux_of_le {x : BitVec (len * w)} (h : r ≤ k):
-    (extractLsb' 0 (k * w) x).addRecAux r 0#w = x.addRecAux r 0#w := by
-  induction r generalizing x len k
-  · case zero =>
-    simp [addRecAux]
-  · case succ diff ihdiff =>
-    simp only [addRecAux_succ, BitVec.zero_add]
-    have hext : diff * w + w ≤ k * w := by
-      simp only [show diff * w + w = (diff + 1) * w by simp [Nat.add_mul]]
-      exact Nat.mul_le_mul_right w h
-    rw [extractLsb'_extractLsb'_of_le hext, addRecAux_eq (x := x),
-      addRecAux_eq (x := extractLsb' 0 (k * w) x), ihdiff (x := x) (by omega) (k := k)]
-
-private theorem extractLsb'_extractAndExtend_eq {i len : Nat} {x : BitVec w} :
-    (extractAndExtend len x).extractLsb' (i * len) len = extractAndExtendBit i len x := by
-  unfold extractAndExtend
-  by_cases hilt : i < w
-  · ext j hj
-    simp [extractLsb'_extractAndExtendAux, extractAndExtendBit]
-  · ext k hk
-    have := Nat.mul_le_mul_right (n := w) (k := len) (m := i) (by omega)
-    simp only [extractAndExtendBit, cast_ofNat, getElem_extractLsb', truncate_eq_setWidth,
-      getElem_setWidth, getLsbD_extractLsb', Nat.lt_one_iff]
-    rw [getLsbD_of_ge, getLsbD_of_ge]
-    · simp
-    · omega
-    · omega
-
-private theorem addRecAux_append_extractLsb' {x : BitVec (len * w)} (ha : 0 < len) :
-    ((x.extractLsb' ((len - 1) * w) w ++
-      x.extractLsb' 0 ((len - 1) * w)).cast (m := len * w) hcast).addRecAux len 0#w =
-    x.extractLsb' ((len - 1) * w) w +
-      (x.extractLsb' 0 ((len - 1) * w)).addRecAux (len - 1) 0#w := by
-  simp only [extractLsb'_addRecAux_of_le (k := len - 1) (r := len - 1) (by omega),
-    BitVec.append_extractLsb'_of_lt (hcast := hcast)]
-  have hsucc := addRecAux_succ (x := x) (acc := 0#w) (n := len - 1)
-  rw [BitVec.zero_add, Nat.sub_one_add_one (by omega)] at hsucc
-  rw [hsucc, addRecAux_eq, BitVec.add_comm]
-
-private theorem Nat.mul_add_le_mul_of_succ_le {a b c : Nat} (h : a + 1 ≤ c) :
-    a * b + b ≤ c * b := by
-  rw [← Nat.succ_mul]
-  exact mul_le_mul_right b h
-
-/--
-The recursive addition of `w`-long words on two flattened bitvectors `x` and `y` (with different
-number of words `len` and `len'`, respectively) returns the same value, if we can prove
-that each `w`-long word in `x` results from the addition of two `w`-long words in `y`,
-using exactly all `w`-long words in `y`.
--/
-private theorem addRecAux_eq_of {x : BitVec (len * w)} {y : BitVec (len' * w)}
-    (hlen : len = (len' + 1) / 2) :
-    (∀ (i : Nat) (_h : i < (len' + 1) / 2),
-      extractLsb' (i * w) w x = extractLsb' (2 * i * w) w y + extractLsb' ((2 * i + 1) * w) w y) →
-    x.addRecAux len 0#w = y.addRecAux len' 0#w := by
-  intro hadd
-  induction len generalizing len' y
-  · case zero =>
-    simp [show len' = 0 by omega]
-  · case succ len ih =>
-    have hcast : w + (len + 1 - 1) * w = (len + 1) * w := by
-      simp [Nat.add_mul, Nat.add_comm]
-    have hcast' : w + (len' - 1) * w = len' * w := by
-      rw [Nat.sub_mul, Nat.one_mul,
-        ← Nat.add_sub_assoc (by refine Nat.le_mul_of_pos_left w (by omega)), Nat.add_comm]
-      simp
-    rw [addRecAux_succ, ← BitVec.append_extractLsb'_of_lt (x := x) (hcast := hcast)]
-    have happ := addRecAux_append_extractLsb' (len := len + 1) (x := x) (hcast := hcast) (by omega)
-    simp only [Nat.add_one_sub_one, addRecAux_succ, BitVec.zero_add] at happ
-    simp only [Nat.add_one_sub_one, BitVec.zero_add, happ]
-    have := Nat.succ_mul (n := len' - 1) (m := w)
-    rw [succ_eq_add_one, Nat.sub_one_add_one (by omega)] at this
-    by_cases hmod : len' % 2 = 0
-    · /- `sum` results from the addition of the two last elements in `y`, `sum = op1 + op2` -/
-      have := Nat.mul_le_mul_right (n := len' - 1 - 1) (m := len' - 1) (k := w) (by omega)
-      have := Nat.succ_mul (n := len' - 1 - 1) (m := w)
-      have hcast'' : w + (len' - 1 - 1) * w = (len' - 1) * w := by
-        rw [Nat.sub_mul, Nat.one_mul,
-          ← Nat.add_sub_assoc (k := w) (by refine Nat.le_mul_of_pos_left w (by omega))]
-        simp
-      rw [succ_eq_add_one, Nat.sub_one_add_one (by omega)] at this
-      rw [← BitVec.append_extractLsb'_of_lt (x := y) (hcast := hcast'),
-        addRecAux_append_extractLsb' (by omega),
-        ← BitVec.append_extractLsb'_of_lt (x := extractLsb' 0 ((len' - 1) * w) y) (hcast := hcast''),
-        addRecAux_append_extractLsb' (by omega),
-        extractLsb'_extractLsb'_of_le (by exact Nat.mul_add_le_mul_of_succ_le (by omega)),
-        extractLsb'_extractLsb'_of_le (by omega), ← BitVec.add_assoc, hadd (_h := by omega)]
-      congr 1
-      · rw [show len = (len' + 1) / 2 - 1 by omega, BitVec.add_comm]
-        congr <;> omega
-      · apply ih
-        · omega
-        · intros
-          rw [extractLsb'_extractLsb'_of_le (by exact Nat.mul_add_le_mul_of_succ_le (by omega)),
-            extractLsb'_extractLsb'_of_le (by exact Nat.mul_add_le_mul_of_succ_le (by omega)),
-            extractLsb'_extractLsb'_of_le (by exact Nat.mul_add_le_mul_of_succ_le (by omega)),
-            hadd (_h := by omega)]
-    · /- `sum` results from the addition of the last elements in `y` with `0#w` -/
-      have : len' * w ≤ (len' - 1 + 1) * w := by exact mul_le_mul_right w (by omega)
-      rw [← BitVec.append_extractLsb'_of_lt (x := y) (hcast := hcast'),
-        addRecAux_append_extractLsb' (by omega), hadd (_h := by omega),
-        show 2 * len = len' - 1 by omega]
-      congr 1
-      · rw [BitVec.add_right_eq_self]
-        ext k hk
-        simp only [getElem_extractLsb', getElem_zero]
-        apply getLsbD_of_ge y ((len' - 1 + 1) * w + k) (by omega)
-      · apply ih
-        · omega
-        · intros
-          rw [extractLsb'_extractLsb'_of_le (by exact Nat.mul_add_le_mul_of_succ_le (by omega)),
-            extractLsb'_extractLsb'_of_le (by exact Nat.mul_add_le_mul_of_succ_le (by omega)),
-            extractLsb'_extractLsb'_of_le (by exact Nat.mul_add_le_mul_of_succ_le (by omega)),
-            hadd (_h := by omega)]
-
-private theorem getLsbD_extractAndExtend_of_lt {x : BitVec w} (hk : k < v) :
-    (x.extractAndExtend v).getLsbD (pos * v + k) = (extractAndExtendBit pos v x).getLsbD k := by
-  simp [← extractLsb'_extractAndExtend_eq (w := w) (len := v) (i := pos) (x := x)]
-  omega
-
-/--
-Extracting a bit from a `BitVec.extractAndExtend` is the same as extracting a bit
-from a zero-extended bit at a certain position in the original bitvector.
--/
-theorem getLsbD_extractAndExtend {x : BitVec w} (hv : 0 < v) :
-    (BitVec.extractAndExtend v x).getLsbD k =
-      (BitVec.extractAndExtendBit ((k - (k % v)) / v) v x).getLsbD (k % v):= by
-  rw [← getLsbD_extractAndExtend_of_lt (by exact mod_lt k hv)]
-  congr
-  by_cases hmod : k % v = 0
-  · simp only [hmod, Nat.sub_zero, Nat.add_zero]
-    rw [Nat.div_mul_cancel (by omega)]
-  · rw [← Nat.div_eq_sub_mod_div]
-    exact Eq.symm (div_add_mod' k v)
-
-private theorem addRecAux_extractAndExtend_eq_cpopNatRec {x : BitVec w} :
-    (extractAndExtend w x).addRecAux n 0#w = x.cpopNatRec n 0 := by
-  induction n
-  · case zero =>
-    simp
-  · case succ n' ihn' =>
-    rw [cpopNatRec_succ, Nat.zero_add, natCast_eq_ofNat, addRecAux_succ, BitVec.zero_add,
-      addRecAux_eq, cpopNatRec_eq, ihn', ofNat_add, natCast_eq_ofNat, BitVec.add_right_inj,
-      extractLsb'_extractAndExtend_eq]
-    ext k hk
-    simp only [extractAndExtendBit, ← getLsbD_eq_getElem, getLsbD_ofNat, hk, decide_true,
-      Bool.true_and, truncate_eq_setWidth, getLsbD_setWidth, getLsbD_extractLsb', Nat.lt_one_iff]
-    by_cases hk0 : k = 0
-    · simp only [hk0, testBit_zero, decide_true, Nat.add_zero, Bool.true_and]
-      cases x.getLsbD n' <;> simp
-    · simp only [show ¬k = 0 by omega, decide_false, Bool.false_and]
-      symm
-      apply testBit_lt_two_pow ?_
-      have : (x.getLsbD n').toNat ≤ 1 := by
-        cases x.getLsbD n' <;> simp
-      have : 1 < 2 ^ k := by exact Nat.one_lt_two_pow hk0
-      omega
-
-private theorem addRecAux_extractAndExtend_eq_cpop {x : BitVec w} :
-    (extractAndExtend w x).addRecAux w 0#w = x.cpop := by
-  simp only [cpop]
-  apply addRecAux_extractAndExtend_eq_cpopNatRec
-
-private theorem addRecAux_cpopTree {x : BitVec (len * w)} :
-    addRecAux ((cpopTree x).cast (m := 1 * w) (by simp)) 1 0#w = addRecAux x len 0#w := by
-  unfold cpopTree
-  split
-  · case _ h =>
-    subst h
-    simp [addRecAux]
-  · case _ h =>
-    split
-    · case _ h' =>
-      simp only [addRecAux_succ, Nat.zero_mul, BitVec.zero_add, addRecAux_zero, h']
-      ext; simp
-    · rw [addRecAux_cpopTree]
-      apply BitVec.addRecAux_eq_of (x := cpopLayer x 0#(0 * w) (by omega)) (y := x)
-      · rfl
-      · intros j hj
-        simp [extractLsb'_cpopLayer]
-termination_by len
-
-private theorem addRecAux_eq_cpopTree {x : BitVec (len * w)} :
-    x.addRecAux len 0#w = (x.cpopTree).cast (by simp) := by
-  rw [← addRecAux_cpopTree, addRecAux_succ, Nat.zero_mul, BitVec.zero_add, addRecAux_zero]
-  ext k hk
-  simp [← getLsbD_eq_getElem, hk]
-
-theorem cpop_eq_cpopRec {x : BitVec w} :
-    BitVec.cpop x = BitVec.cpopRec x := by
-  unfold BitVec.cpopRec
-  split
-  · simp [← addRecAux_extractAndExtend_eq_cpop, addRecAux_eq_cpopTree (x := extractAndExtend w x)]
-  · split
-    · ext k hk
-      cases hx : x.getLsbD 0
-      <;> simp [hx, cpop, ← getLsbD_eq_getElem, show k = 0 by omega, show w = 1 by omega]
-    · have hw : w = 0 := by omega
-      subst hw
-      simp [of_length_zero]
-
 end BitVec
 
@@ -2786,14 +2786,6 @@ theorem msb_append {x : BitVec w} {y : BitVec v} :
   rw [getElem_append] -- Why does this not work with `simp [getElem_append]`?
   simp
 
-theorem append_of_zero_width (x : BitVec w) (y : BitVec v) (h : w = 0) :
-    (x ++ y) = y.cast (by simp [h]) := by
-  ext i ih
-  subst h
-  simp [← getLsbD_eq_getElem, getLsbD_append]
-  omega
-
 set_option backward.isDefEq.respectTransparency false in
 @[grind =]
 theorem toInt_append {x : BitVec n} {y : BitVec m} :
     (x ++ y).toInt = if n == 0 then y.toInt else (2 ^ m) * x.toInt + y.toNat := by
@@ -3020,34 +3012,6 @@ theorem extractLsb'_append_extractLsb'_eq_extractLsb' {x : BitVec w} (h : start
   congr 1
   omega
 
-theorem append_extractLsb'_of_lt {x : BitVec (x_len * w)} :
-    (x.extractLsb' ((x_len - 1) * w) w ++ x.extractLsb' 0 ((x_len - 1) * w)).cast hcast = x := by
-  ext i hi
-  simp only [getElem_cast, getElem_append, getElem_extractLsb', Nat.zero_add, dite_eq_ite]
-  rw [← getLsbD_eq_getElem, ite_eq_left_iff, Nat.not_lt]
-  intros
-  simp only [show (x_len - 1) * w + (i - (x_len - 1) * w) = i by omega]
-
-
-theorem extractLsb'_append_of_lt {x : BitVec (k * w)} {y : BitVec w} (hlt : i < k) :
-    extractLsb' (i * w) w (y ++ x) = extractLsb' (i * w) w x := by
-  ext j hj
-  simp only [← getLsbD_eq_getElem, getLsbD_extractLsb', hj, decide_true, getLsbD_append,
-    Bool.true_and, ite_eq_left_iff, Nat.not_lt]
-  intros h
-  by_cases hw0 : w = 0
-  · subst hw0
-    simp
-  · have : i * w ≤ (k - 1) * w := Nat.mul_le_mul_right w (by omega)
-    have h' : i * w + j < (k - 1 + 1) * w := by simp [Nat.add_mul]; omega
-    rw [Nat.sub_one_add_one (by omega)] at h'
-    omega
-
-theorem extractLsb'_append_of_eq {x : BitVec (k * w)} {y : BitVec w} (heq : i = k) :
-    extractLsb' (i * w) w (y ++ x) = y := by
-  ext j hj
-  simp [← getLsbD_eq_getElem, getLsbD_append, hj, heq]
-
 /-- Combine adjacent `~~~ (extractLsb _)'` operations into a single `~~~ (extractLsb _)'`. -/
 theorem not_extractLsb'_append_not_extractLsb'_eq_not_extractLsb' {x : BitVec w} (h : start₂ = start₁ + len₁) :
     (~~~ (x.extractLsb' start₂ len₂) ++ ~~~ (x.extractLsb' start₁ len₁)) =
@@ -629,6 +629,7 @@ export Bool (cond_eq_if cond_eq_ite xor and or not)
 This should not be turned on globally as an instance because it degrades performance in Mathlib,
 but may be used locally.
 -/
+@[implicit_reducible]
 def boolPredToPred : Coe (α → Bool) (α → Prop) where
   coe r := fun a => Eq (r a) true
 
@@ -62,7 +62,7 @@ instance ltTrichotomous : Std.Trichotomous (· < · : Char → Char → Prop) wh
   trichotomous _ _ h₁ h₂ := Char.le_antisymm (by simpa using h₂) (by simpa using h₁)
 
 @[deprecated ltTrichotomous (since := "2025-10-27")]
-def notLTAntisymm : Std.Antisymm (¬ · < · : Char → Char → Prop) where
+theorem notLTAntisymm : Std.Antisymm (¬ · < · : Char → Char → Prop) where
   antisymm := Char.ltTrichotomous.trichotomous
 
 instance ltAsymm : Std.Asymm (· < · : Char → Char → Prop) where
@@ -73,7 +73,7 @@ instance leTotal : Std.Total (· ≤ · : Char → Char → Prop) where
 
 -- This instance is useful while setting up instances for `String`.
 @[deprecated ltAsymm (since := "2025-08-01")]
-def notLTTotal : Std.Total (¬ · < · : Char → Char → Prop) where
+theorem notLTTotal : Std.Total (¬ · < · : Char → Char → Prop) where
   total := fun x y => by simpa using Char.le_total y x
 
 @[simp] theorem ofNat_toNat (c : Char) : Char.ofNat c.toNat = c := by
 
@@ -414,7 +414,7 @@ Renders a `Format` to a string.
 -/
 def pretty (f : Format) (width : Nat := defWidth) (indent : Nat := 0) (column := 0) : String :=
   let act : StateM State Unit := prettyM f width indent
-  State.out <| act.run (State.mk "" column) |>.snd
+  State.out <| act (State.mk "" column) |>.snd
 
 end Format
 
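Editor's note, not part of the diff: a sketch of calling the `Std.Format.pretty` function whose body changes above (the `width` named argument matches the signature shown in the hunk).

```lean
-- Render a format object to a `String` with a narrow target width,
-- forcing `Format.line` to become a line break.
#eval (Std.Format.text "hello" ++ Std.Format.line ++ Std.Format.text "world").pretty (width := 5)
```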
@@ -168,6 +168,13 @@ instance Map.instIterator {α β γ : Type w} {m : Type w → Type w'} {n : Type
     Iterator (Map α m n lift f) n γ :=
   inferInstanceAs <| Iterator (FilterMap α m n lift _) n γ
 
+theorem Map.instIterator_eq_filterMapInstIterator {α β γ : Type w} {m : Type w → Type w'}
+    {n : Type w → Type w''} [Monad n]
+    [Iterator α m β] {lift : ⦃α : Type w⦄ → m α → n α} {f : β → PostconditionT n γ} :
+    Map.instIterator (α := α) (β := β) (γ := γ) (m := m) (n := n) (lift := lift) (f := f) =
+      FilterMap.instIterator :=
+  rfl
+
 private def FilterMap.instFinitenessRelation {α β γ : Type w} {m : Type w → Type w'}
     {n : Type w → Type w''} [Monad n] [Iterator α m β] {lift : ⦃α : Type w⦄ → m α → n α}
     {f : β → PostconditionT n (Option γ)} [Finite α m] :
@@ -362,8 +362,7 @@ def Flatten.instProductivenessRelation [Monad m] [Iterator α m (IterM (α := α
   case innerDone =>
     apply Flatten.productiveRel_of_right₂
 
-@[no_expose]
-public def Flatten.instProductive [Monad m] [Iterator α m (IterM (α := α₂) m β)] [Iterator α₂ m β]
+public theorem Flatten.instProductive [Monad m] [Iterator α m (IterM (α := α₂) m β)] [Iterator α₂ m β]
     [Finite α m] [Productive α₂ m] : Productive (Flatten α α₂ β m) m :=
   .of_productivenessRelation instProductivenessRelation
 
@@ -35,7 +35,7 @@ A `ForIn'` instance for iterators. Its generic membership relation is not easy t
so this is not marked as `instance`. This way, more convenient instances can be built on top of it
or future library improvements will make it more comfortable.
-/
@[always_inline, inline]
@[always_inline, inline, expose, implicit_reducible]
def Iter.instForIn' {α : Type w} {β : Type w} {n : Type x → Type x'} [Monad n]
    [Iterator α Id β] [IteratorLoop α Id n] :
    ForIn' n (Iter (α := α) β) β ⟨fun it out => it.IsPlausibleIndirectOutput out⟩ where
@@ -53,7 +53,7 @@ instance (α : Type w) (β : Type w) (n : Type x → Type x') [Monad n]
/--
An implementation of `for h : ... in ... do ...` notation for partial iterators.
-/
@[always_inline, inline]
@[always_inline, inline, expose, implicit_reducible]
def Iter.Partial.instForIn' {α : Type w} {β : Type w} {n : Type x → Type x'} [Monad n]
    [Iterator α Id β] [IteratorLoop α Id n] :
    ForIn' n (Iter.Partial (α := α) β) β ⟨fun it out => it.it.IsPlausibleIndirectOutput out⟩ where
@@ -71,7 +71,7 @@ instance (α : Type w) (β : Type w) (n : Type x → Type x') [Monad n]
A `ForIn'` instance for iterators that is guaranteed to terminate after finitely many steps.
It is not marked as an instance because the membership predicate is difficult to work with.
-/
@[always_inline, inline]
@[always_inline, inline, expose, implicit_reducible]
def Iter.Total.instForIn' {α : Type w} {β : Type w} {n : Type x → Type x'} [Monad n]
    [Iterator α Id β] [IteratorLoop α Id n] [Finite α Id] :
    ForIn' n (Iter.Total (α := α) β) β ⟨fun it out => it.it.IsPlausibleIndirectOutput out⟩ where

@@ -159,7 +159,7 @@ This is the default implementation of the `IteratorLoop` class.
It simply iterates through the iterator using `IterM.step`. For certain iterators, more efficient
implementations are possible and should be used instead.
-/
@[always_inline, inline, expose]
@[always_inline, inline, expose, implicit_reducible]
def IteratorLoop.defaultImplementation {α : Type w} {m : Type w → Type w'} {n : Type x → Type x'}
    [Monad n] [Iterator α m β] :
    IteratorLoop α m n where
@@ -211,7 +211,7 @@ theorem IteratorLoop.wellFounded_of_productive {α β : Type w} {m : Type w →
/--
This `ForIn'`-style loop construct traverses a finite iterator using an `IteratorLoop` instance.
-/
@[always_inline, inline]
@[always_inline, inline, expose, implicit_reducible]
def IteratorLoop.finiteForIn' {m : Type w → Type w'} {n : Type x → Type x'}
    {α : Type w} {β : Type w} [Iterator α m β] [IteratorLoop α m n] [Monad n]
    (lift : ∀ γ δ, (γ → n δ) → m γ → n δ) :
@@ -224,7 +224,7 @@ A `ForIn'` instance for iterators. Its generic membership relation is not easy t
so this is not marked as `instance`. This way, more convenient instances can be built on top of it
or future library improvements will make it more comfortable.
-/
@[always_inline, inline]
@[always_inline, inline, expose, implicit_reducible]
def IterM.instForIn' {m : Type w → Type w'} {n : Type w → Type w''}
    {α : Type w} {β : Type w} [Iterator α m β] [IteratorLoop α m n] [Monad n]
    [MonadLiftT m n] :
@@ -239,7 +239,7 @@ instance IterM.instForInOfIteratorLoop {m : Type w → Type w'} {n : Type w →
  instForInOfForIn'

/-- Internal implementation detail of the iterator library. -/
@[always_inline, inline]
@[always_inline, inline, expose, implicit_reducible]
def IterM.Partial.instForIn' {m : Type w → Type w'} {n : Type w → Type w''}
    {α : Type w} {β : Type w} [Iterator α m β] [IteratorLoop α m n] [MonadLiftT m n] [Monad n] :
    ForIn' n (IterM.Partial (α := α) m β) β ⟨fun it out => it.it.IsPlausibleIndirectOutput out⟩ where
@@ -247,7 +247,7 @@ def IterM.Partial.instForIn' {m : Type w → Type w'} {n : Type w → Type w''}
  haveI := @IterM.instForIn'; forIn' it.it init f

/-- Internal implementation detail of the iterator library. -/
@[always_inline, inline]
@[always_inline, inline, expose, implicit_reducible]
def IterM.Total.instForIn' {m : Type w → Type w'} {n : Type w → Type w''}
    {α : Type w} {β : Type w} [Iterator α m β] [IteratorLoop α m n] [MonadLiftT m n] [Monad n]
    [Finite α m] :

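The `ForIn'` instances touched above all back the proof-carrying `for h : x in xs do` notation. What that notation provides can be sketched over plain lists (an illustrative example, not part of the diff):

```lean
-- `forIn'` hands the loop body a membership proof for each element.
def collect (xs : List Nat) : Id (List Nat) := do
  let mut acc := []
  for h : x in xs do
    -- Here `h : x ∈ xs` is in scope and could justify e.g. an index access.
    acc := x :: acc
  return acc
```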
@@ -70,7 +70,7 @@ theorem LawfulMonadLiftFunction.lift_seqRight [LawfulMonad m] [LawfulMonad n]
abbrev idToMonad [Monad m] ⦃α : Type u⦄ (x : Id α) : m α :=
  pure x.run

def LawfulMonadLiftFunction.idToMonad [Monad m] [LawfulMonad m] :
theorem LawfulMonadLiftFunction.idToMonad [LawfulMonad m] :
    LawfulMonadLiftFunction (m := Id) (n := m) idToMonad where
  lift_pure := by simp [Internal.idToMonad]
  lift_bind := by simp [Internal.idToMonad]
@@ -95,7 +95,7 @@ instance [LawfulMonadLiftBindFunction (n := n) (fun _ _ f x => lift x >>= f)] [L
  simpa using LawfulMonadLiftBindFunction.liftBind_bind (n := n)
    (liftBind := fun _ _ f x => lift x >>= f) (β := β) (γ := γ) (δ := γ) pure x g

def LawfulMonadLiftBindFunction.id [Monad m] [LawfulMonad m] :
theorem LawfulMonadLiftBindFunction.id [LawfulMonad m] :
    LawfulMonadLiftBindFunction (m := Id) (n := m) (fun _ _ f x => f x.run) where
  liftBind_pure := by simp
  liftBind_bind := by simp

@@ -702,18 +702,16 @@ theorem IterM.toList_map {α β β' : Type w} {m : Type w → Type w'} [Monad m]
    (it : IterM (α := α) m β) :
    (it.map f).toList = (fun x => x.map f) <$> it.toList := by
  rw [← List.filterMap_eq_map, ← toList_filterMap]
  let t := type_of% (it.map f)
  let t' := type_of% (it.filterMap (some ∘ f))
  simp only [map, mapWithPostcondition, InternalCombinators.map, filterMap,
    filterMapWithPostcondition, InternalCombinators.filterMap]
  unfold Map
  congr
  · simp [Map]
  · simp [Map.instIterator, inferInstanceAs]
  · simp
  · rw [Map.instIterator_eq_filterMapInstIterator]
    congr
    simp
  · simp only [map, mapWithPostcondition, InternalCombinators.map, Function.comp_apply, filterMap,
      filterMapWithPostcondition, InternalCombinators.filterMap]
  congr
  · simp [Map]
  · simp
  · simp
  · simp

@[simp]
theorem IterM.toList_filter {α : Type w} {m : Type w → Type w'} [Monad m] [LawfulMonad m]

@@ -32,7 +32,7 @@ theorem Iter.forIn'_eq {α β : Type w} [Iterator α Id β] [Finite α Id]
      IterM.DefaultConsumers.forIn' (n := m) (fun _ _ f x => f x.run) γ (fun _ _ _ => True)
        it.toIterM init _ (fun _ => id)
        (fun out h acc => return ⟨← f out (Iter.isPlausibleIndirectOutput_iff_isPlausibleIndirectOutput_toIterM.mpr h) acc, trivial⟩) := by
  simp +instances only [instForIn', ForIn'.forIn', IteratorLoop.finiteForIn']
  simp +instances only [ForIn'.forIn']
  have : ∀ a b c, f a b c = (Subtype.val <$> (⟨·, trivial⟩) <$> f a b c) := by simp
  simp +singlePass only [this]
  rw [hl.lawful (fun _ _ f x => f x.run) (wf := IteratorLoop.wellFounded_of_finite)]
@@ -81,7 +81,7 @@ theorem Iter.forIn'_eq_forIn'_toIterM {α β : Type w} [Iterator α Id β]
    letI : ForIn' m (IterM (α := α) Id β) β _ := IterM.instForIn'
    ForIn'.forIn' it.toIterM init
      (fun out h acc => f out (isPlausibleIndirectOutput_iff_isPlausibleIndirectOutput_toIterM.mpr h) acc) := by
  simp +instances [ForIn'.forIn', Iter.instForIn', IterM.instForIn', monadLift]
  simp +instances [ForIn'.forIn', monadLift]

theorem Iter.forIn_eq_forIn_toIterM {α β : Type w} [Iterator α Id β]
    [Finite α Id] {m : Type w → Type w''} [Monad m] [LawfulMonad m]
@@ -395,7 +395,7 @@ theorem Iter.fold_eq_fold_toIterM {α β : Type w} {γ : Type w} [Iterator α Id
    [Finite α Id] [IteratorLoop α Id Id]
    {f : γ → β → γ} {init : γ} {it : Iter (α := α) β} :
    it.fold (init := init) f = (it.toIterM.fold (init := init) f).run := by
  rw [fold_eq_foldM, foldM_eq_foldM_toIterM, IterM.fold_eq_foldM]; rfl
  rw [fold_eq_foldM, foldM_eq_foldM_toIterM, IterM.fold_eq_foldM]

@[simp]
theorem Iter.forIn_pure_yield_eq_fold {α β : Type w} {γ : Type x} [Iterator α Id β]

@@ -109,7 +109,7 @@ theorem IterM.forIn'_eq {α β : Type w} {m : Type w → Type w'} [Iterator α m
    letI : ForIn' n (IterM (α := α) m β) β _ := IterM.instForIn'
    ForIn'.forIn' (α := β) (m := n) it init f = IterM.DefaultConsumers.forIn' (n := n)
      (fun _ _ f x => monadLift x >>= f) γ (fun _ _ _ => True) it init _ (fun _ => id) (return ⟨← f · · ·, trivial⟩) := by
  simp +instances only [instForIn', ForIn'.forIn', IteratorLoop.finiteForIn']
  simp +instances only [ForIn'.forIn']
  have : f = (Subtype.val <$> (⟨·, trivial⟩) <$> f · · ·) := by simp
  rw [this, hl.lawful (fun _ _ f x => monadLift x >>= f) (wf := IteratorLoop.wellFounded_of_finite)]
  simp +instances [IteratorLoop.defaultImplementation]

@@ -32,14 +32,14 @@ def ToIterator.iter [ToIterator γ Id α β] (x : γ) : Iter (α := α) β :=
  ToIterator.iterM x |>.toIter

/-- Creates a monadic `ToIterator` instance. -/
@[always_inline, inline, expose, instance_reducible]
@[always_inline, inline, expose, implicit_reducible]
def ToIterator.ofM (α : Type w)
    (iterM : γ → IterM (α := α) m β) :
    ToIterator γ m α β where
  iterMInternal x := iterM x

/-- Creates a pure `ToIterator` instance. -/
@[always_inline, inline, expose, instance_reducible]
@[always_inline, inline, expose, implicit_reducible]
def ToIterator.of (α : Type w)
    (iter : γ → Iter (α := α) β) :
    ToIterator γ Id α β where

@@ -36,5 +36,3 @@ public import Init.Data.List.FinRange
public import Init.Data.List.Lex
public import Init.Data.List.Range
public import Init.Data.List.Scan
public import Init.Data.List.ControlImpl
public import Init.Data.List.SplitOn

@@ -135,11 +135,7 @@ protected def beq [BEq α] : List α → List α → Bool
@[simp] theorem beq_nil_nil [BEq α] : List.beq ([] : List α) ([] : List α) = true := rfl
@[simp] theorem beq_cons_nil [BEq α] {a : α} {as : List α} : List.beq (a::as) [] = false := rfl
@[simp] theorem beq_nil_cons [BEq α] {a : α} {as : List α} : List.beq [] (a::as) = false := rfl
theorem beq_cons_cons [BEq α] {a b : α} {as bs : List α} : List.beq (a::as) (b::bs) = (a == b && List.beq as bs) := rfl

@[deprecated beq_cons_cons (since := "2026-02-26")]
theorem beq_cons₂ [BEq α] {a b : α} {as bs : List α} :
    List.beq (a::as) (b::bs) = (a == b && List.beq as bs) := beq_cons_cons
theorem beq_cons₂ [BEq α] {a b : α} {as bs : List α} : List.beq (a::as) (b::bs) = (a == b && List.beq as bs) := rfl

instance [BEq α] : BEq (List α) := ⟨List.beq⟩

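Whichever name the cons/cons lemma carries, its statement just unfolds one step of `List.beq` behind the `==` notation, which a quick sanity check shows (an illustrative sketch, not part of the diff):

```lean
-- One unfolding step of elementwise comparison holds definitionally.
example : ([1, 2] == [1, 3]) = (1 == 1 && ([2] == [3])) := rfl

#eval [1, 2, 3] == [1, 2, 3]  -- elementwise `BEq` comparison of the two lists
```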
@@ -179,10 +175,7 @@ Examples:
@[simp, grind =] theorem isEqv_nil_nil : isEqv ([] : List α) [] eqv = true := rfl
@[simp, grind =] theorem isEqv_nil_cons : isEqv ([] : List α) (a::as) eqv = false := rfl
@[simp, grind =] theorem isEqv_cons_nil : isEqv (a::as : List α) [] eqv = false := rfl
@[grind =] theorem isEqv_cons_cons : isEqv (a::as) (b::bs) eqv = (eqv a b && isEqv as bs eqv) := rfl

@[deprecated isEqv_cons_cons (since := "2026-02-26")]
theorem isEqv_cons₂ : isEqv (a::as) (b::bs) eqv = (eqv a b && isEqv as bs eqv) := isEqv_cons_cons
@[grind =] theorem isEqv_cons₂ : isEqv (a::as) (b::bs) eqv = (eqv a b && isEqv as bs eqv) := rfl


/-! ## Lexicographic ordering -/
@@ -1055,12 +1048,9 @@ def dropLast {α} : List α → List α
@[simp, grind =] theorem dropLast_nil : ([] : List α).dropLast = [] := rfl
@[simp, grind =] theorem dropLast_singleton : [x].dropLast = [] := rfl

@[simp, grind =] theorem dropLast_cons_cons :
@[simp, grind =] theorem dropLast_cons₂ :
    (x::y::zs).dropLast = x :: (y::zs).dropLast := rfl

@[deprecated dropLast_cons_cons (since := "2026-02-26")]
theorem dropLast_cons₂ : (x::y::zs).dropLast = x :: (y::zs).dropLast := dropLast_cons_cons

-- Later this can be proved by `simp` via `[List.length_dropLast, List.length_cons, Nat.add_sub_cancel]`,
-- but we need this while bootstrapping `Array`.
@[simp] theorem length_dropLast_cons {a : α} {as : List α} : (a :: as).dropLast.length = as.length := by
@@ -1095,11 +1085,7 @@ inductive Sublist {α} : List α → List α → Prop
  /-- If `l₁` is a subsequence of `l₂`, then it is also a subsequence of `a :: l₂`. -/
  | cons a : Sublist l₁ l₂ → Sublist l₁ (a :: l₂)
  /-- If `l₁` is a subsequence of `l₂`, then `a :: l₁` is a subsequence of `a :: l₂`. -/
  | cons_cons a : Sublist l₁ l₂ → Sublist (a :: l₁) (a :: l₂)

set_option linter.missingDocs false in
@[deprecated Sublist.cons_cons (since := "2026-02-26"), match_pattern]
abbrev Sublist.cons₂ := @Sublist.cons_cons
  | cons₂ a : Sublist l₁ l₂ → Sublist (a :: l₁) (a :: l₂)

@[inherit_doc] scoped infixl:50 " <+ " => Sublist

@@ -1157,13 +1143,9 @@ def isPrefixOf [BEq α] : List α → List α → Bool
@[simp, grind =] theorem isPrefixOf_nil_left [BEq α] : isPrefixOf ([] : List α) l = true := by
  simp [isPrefixOf]
@[simp, grind =] theorem isPrefixOf_cons_nil [BEq α] : isPrefixOf (a::as) ([] : List α) = false := rfl
@[grind =] theorem isPrefixOf_cons_cons [BEq α] {a : α} :
@[grind =] theorem isPrefixOf_cons₂ [BEq α] {a : α} :
    isPrefixOf (a::as) (b::bs) = (a == b && isPrefixOf as bs) := rfl

@[deprecated isPrefixOf_cons_cons (since := "2026-02-26")]
theorem isPrefixOf_cons₂ [BEq α] {a : α} :
    isPrefixOf (a::as) (b::bs) = (a == b && isPrefixOf as bs) := isPrefixOf_cons_cons

/--
If the first list is a prefix of the second, returns the result of dropping the prefix.

@@ -2182,16 +2164,10 @@ def intersperse (sep : α) : (l : List α) → List α
  | x::xs => x :: sep :: intersperse sep xs

@[simp] theorem intersperse_nil {sep : α} : ([] : List α).intersperse sep = [] := rfl
@[simp] theorem intersperse_singleton {x : α} {sep : α} : [x].intersperse sep = [x] := rfl
@[deprecated intersperse_singleton (since := "2026-02-26")]
theorem intersperse_single {x : α} {sep : α} : [x].intersperse sep = [x] := rfl
@[simp] theorem intersperse_cons_cons {x : α} {y : α} {zs : List α} {sep : α} :
@[simp] theorem intersperse_single {x : α} {sep : α} : [x].intersperse sep = [x] := rfl
@[simp] theorem intersperse_cons₂ {x : α} {y : α} {zs : List α} {sep : α} :
    (x::y::zs).intersperse sep = x::sep::((y::zs).intersperse sep) := rfl

@[deprecated intersperse_cons_cons (since := "2026-02-26")]
theorem intersperse_cons₂ {x : α} {y : α} {zs : List α} {sep : α} :
    (x::y::zs).intersperse sep = x::sep::((y::zs).intersperse sep) := intersperse_cons_cons

/-! ### intercalate -/

set_option linter.listVariables false in
@@ -2210,7 +2186,7 @@ Examples:
* `List.intercalate sep [a, b] = a ++ sep ++ b`
* `List.intercalate sep [a, b, c] = a ++ sep ++ b ++ sep ++ c`
-/
noncomputable def intercalate (sep : List α) (xs : List (List α)) : List α :=
def intercalate (sep : List α) (xs : List (List α)) : List α :=
  (intersperse sep xs).flatten

/-! ### eraseDupsBy -/

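Since `intercalate` is just `intersperse` followed by `flatten`, its behavior is easy to check by hand (an illustrative sketch, not part of the diff):

```lean
-- intersperse: [[1, 2], [0], [3], [0], [4, 5]]; flatten then concatenates.
example : [0].intercalate [[1, 2], [3], [4, 5]] = [1, 2, 0, 3, 0, 4, 5] := rfl
```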
@@ -219,9 +219,9 @@ def filterMapM {m : Type u → Type v} [Monad m] {α : Type w} {β : Type u} (f
Applies a monadic function that returns a list to each element of a list, from left to right, and
concatenates the resulting lists.
-/
@[expose]
noncomputable def flatMapM {m : Type u → Type v} [Monad m] {α : Type w} {β : Type u} (f : α → m (List β)) (as : List α) : m (List β) :=
  let rec loop
@[inline, expose]
def flatMapM {m : Type u → Type v} [Monad m] {α : Type w} {β : Type u} (f : α → m (List β)) (as : List α) : m (List β) :=
  let rec @[specialize] loop
    | [], bs => pure bs.reverse.flatten
    | a :: as, bs => do
      let bs' ← f a

@@ -1,35 +0,0 @@
/-
Copyright (c) 2026 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Author: Markus Himmel
-/
module

prelude
public import Init.Data.List.Control
public import Init.Data.List.Impl

public section

namespace List

/--
Applies a monadic function that returns a list to each element of a list, from left to right, and
concatenates the resulting lists.
-/
@[inline, expose]
def flatMapMTR {m : Type u → Type v} [Monad m] {α : Type w} {β : Type u} (f : α → m (List β)) (as : List α) : m (List β) :=
  let rec @[specialize] loop
    | [], bs => pure bs.reverse.flatten
    | a :: as, bs => do
      let bs' ← f a
      loop as (bs' :: bs)
  loop as []

@[csimp] theorem flatMapM_eq_flatMapMTR : @flatMapM = @flatMapMTR := by
  funext m _ α β f l
  simp only [flatMapM, flatMapMTR]
  generalize [] = m
  fun_induction flatMapM.loop <;> simp_all [flatMapMTR.loop]

end List
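Both `flatMapM` variants use the same accumulator scheme: collect the chunks in reverse, then `reverse` and `flatten` once at the end. From the caller's perspective `flatMapM` is simply a monadic `flatMap` (an illustrative sketch, not part of the diff):

```lean
-- Each element contributes itself and its double; `Id` makes the effect trivial.
#eval [1, 2, 3].flatMapM (m := Id) (fun a => pure [a, a * 2])  -- [1, 2, 2, 4, 3, 6]
```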
@@ -125,7 +125,7 @@ protected theorem Sublist.eraseP : l₁ <+ l₂ → l₁.eraseP p <+ l₂.eraseP
    by_cases h : p a
    · simpa [h] using s.eraseP.trans eraseP_sublist
    · simpa [h] using s.eraseP.cons _
  | .cons_cons a s => by
  | .cons₂ a s => by
    by_cases h : p a
    · simpa [h] using s
    · simpa [h] using s.eraseP

@@ -184,7 +184,7 @@ theorem Sublist.findSome?_isSome {l₁ l₂ : List α} (h : l₁ <+ l₂) :
  induction h with
  | slnil => simp
  | cons a h ih
  | cons_cons a h ih =>
  | cons₂ a h ih =>
    simp only [findSome?]
    split
    · simp_all
@@ -455,7 +455,7 @@ theorem Sublist.find?_isSome {l₁ l₂ : List α} (h : l₁ <+ l₂) : (l₁.fi
  induction h with
  | slnil => simp
  | cons a h ih
  | cons_cons a h ih =>
  | cons₂ a h ih =>
    simp only [find?]
    split
    · simp

@@ -236,7 +236,6 @@ theorem getElem?_eq_some_iff {l : List α} : l[i]? = some a ↔ ∃ h : i < l.le
  · match i, h with
    | i + 1, h => simp [getElem?_eq_some_iff, Nat.succ_lt_succ_iff]

@[grind →]
theorem getElem_of_getElem? {l : List α} : l[i]? = some a → ∃ h : i < l.length, l[i] = a :=
  getElem?_eq_some_iff.mp

@@ -1394,7 +1393,7 @@ theorem head_filter_of_pos {p : α → Bool} {l : List α} (w : l ≠ []) (h : p

@[simp] theorem filter_sublist {p : α → Bool} : ∀ {l : List α}, filter p l <+ l
  | [] => .slnil
  | a :: l => by rw [filter]; split <;> simp [Sublist.cons, Sublist.cons_cons, filter_sublist]
  | a :: l => by rw [filter]; split <;> simp [Sublist.cons, Sublist.cons₂, filter_sublist]

/-! ### filterMap -/

@@ -1838,11 +1837,6 @@ theorem sum_append [Add α] [Zero α] [Std.LawfulLeftIdentity (α := α) (· +
    [Std.Associative (α := α) (· + ·)] {l₁ l₂ : List α} : (l₁ ++ l₂).sum = l₁.sum + l₂.sum := by
  induction l₁ generalizing l₂ <;> simp_all [Std.Associative.assoc, Std.LawfulLeftIdentity.left_id]

@[simp, grind =]
theorem sum_singleton [Add α] [Zero α] [Std.LawfulRightIdentity (· + ·) (0 : α)] {x : α} :
    [x].sum = x := by
  simp [List.sum_eq_foldr, Std.LawfulRightIdentity.right_id x]

@[simp, grind =]
theorem sum_reverse [Zero α] [Add α] [Std.Associative (α := α) (· + ·)]
    [Std.Commutative (α := α) (· + ·)]
@@ -2732,31 +2726,6 @@ theorem foldr_assoc {op : α → α → α} [ha : Std.Associative op] :
  simp only [foldr_cons, ha.assoc]
  rw [foldr_assoc]

theorem foldl_eq_apply_foldr {xs : List α} {f : α → α → α}
    [Std.Associative f] [Std.LawfulRightIdentity f init] :
    xs.foldl f x = f x (xs.foldr f init) := by
  induction xs generalizing x
  · simp [Std.LawfulRightIdentity.right_id]
  · simp [foldl_assoc, *]

theorem foldr_eq_apply_foldl {xs : List α} {f : α → α → α}
    [Std.Associative f] [Std.LawfulLeftIdentity f init] :
    xs.foldr f x = f (xs.foldl f init) x := by
  have : Std.Associative (fun x y => f y x) := ⟨by simp [Std.Associative.assoc]⟩
  have : Std.RightIdentity (fun x y => f y x) init := ⟨⟩
  have : Std.LawfulRightIdentity (fun x y => f y x) init := ⟨by simp [Std.LawfulLeftIdentity.left_id]⟩
  rw [← List.reverse_reverse (as := xs), foldr_reverse, foldl_eq_apply_foldr, foldl_reverse]

theorem foldr_eq_foldl {xs : List α} {f : α → α → α}
    [Std.Associative f] [Std.LawfulIdentity f init] :
    xs.foldr f init = xs.foldl f init := by
  simp [foldl_eq_apply_foldr, Std.LawfulLeftIdentity.left_id]

theorem sum_eq_foldl [Zero α] [Add α] [Std.Associative (α := α) (· + ·)]
    [Std.LawfulIdentity (· + ·) (0 : α)] {xs : List α} :
    xs.sum = xs.foldl (init := 0) (· + ·) := by
  simp [sum_eq_foldr, foldl_eq_apply_foldr, Std.LawfulLeftIdentity.left_id]

-- The argument `f : α₁ → α₂` is intentionally explicit, as it is sometimes not found by unification.
theorem foldl_hom (f : α₁ → α₂) {g₁ : α₁ → β → α₁} {g₂ : α₂ → β → α₂} {l : List β} {init : α₁}
    (H : ∀ x y, g₂ (f x) y = f (g₁ x y)) : l.foldl g₂ (f init) = f (l.foldl g₁ init) := by
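The `foldr_eq_foldl` family above needs only associativity and a lawful identity for the two folds to agree; for a concrete associative operation this can be checked directly (an illustrative sketch, not part of the diff):

```lean
-- foldr: 1 + (2 + (3 + 0)); foldl: ((0 + 1) + 2) + 3 — both reduce to 6.
example : [1, 2, 3].foldr (· + ·) 0 = [1, 2, 3].foldl (· + ·) 0 := rfl
```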
@@ -3154,7 +3123,7 @@ theorem dropLast_concat_getLast : ∀ {l : List α} (h : l ≠ []), dropLast l +
  | [], h => absurd rfl h
  | [_], _ => rfl
  | _ :: b :: l, _ => by
    rw [dropLast_cons_cons, cons_append, getLast_cons (cons_ne_nil _ _)]
    rw [dropLast_cons₂, cons_append, getLast_cons (cons_ne_nil _ _)]
    congr
    exact dropLast_concat_getLast (cons_ne_nil b l)

@@ -3774,28 +3743,4 @@ theorem get_mem : ∀ (l : List α) n, get l n ∈ l
theorem mem_iff_get {a} {l : List α} : a ∈ l ↔ ∃ n, get l n = a :=
  ⟨get_of_mem, fun ⟨_, e⟩ => e ▸ get_mem ..⟩

/-! ### `intercalate` -/

@[simp]
theorem intercalate_nil {ys : List α} : ys.intercalate [] = [] := rfl

@[simp]
theorem intercalate_singleton {ys xs : List α} : ys.intercalate [xs] = xs := by
  simp [intercalate]

@[simp]
theorem intercalate_cons_cons {ys l l' : List α} {zs : List (List α)} :
    ys.intercalate (l :: l' :: zs) = l ++ ys ++ ys.intercalate (l' :: zs) := by
  simp [intercalate]

@[simp]
theorem intercalate_cons_cons_left {ys l : List α} {x : α} {zs : List (List α)} :
    ys.intercalate ((x :: l) :: zs) = x :: ys.intercalate (l :: zs) := by
  cases zs <;> simp

theorem intercalate_cons_of_ne_nil {ys l : List α} {zs : List (List α)} (h : zs ≠ []) :
    ys.intercalate (l :: zs) = l ++ ys ++ ys.intercalate zs :=
  match zs, h with
  | l'::zs, _ => by simp

end List

@@ -481,13 +481,13 @@ protected theorem maxIdxOn_nil_eq_iff_false [LE β] [DecidableLE β] {f : α →
@[simp]
protected theorem maxIdxOn_singleton [LE β] [DecidableLE β] {x : α} {f : α → β} :
    [x].maxIdxOn f (of_decide_eq_false rfl) = 0 :=
  letI : LE β := (inferInstanceAs (LE β)).opposite
  letI : LE β := (inferInstance : LE β).opposite
  List.minIdxOn_singleton

@[simp]
protected theorem maxIdxOn_lt_length [LE β] [DecidableLE β] {f : α → β} {xs : List α}
    (h : xs ≠ []) : xs.maxIdxOn f h < xs.length :=
  letI : LE β := (inferInstanceAs (LE β)).opposite
  letI : LE β := (inferInstance : LE β).opposite
  List.minIdxOn_lt_length h

protected theorem maxIdxOn_le_of_apply_getElem_le_apply_maxOn [LE β] [DecidableLE β] [IsLinearPreorder β]
@@ -495,7 +495,7 @@ protected theorem maxIdxOn_le_of_apply_getElem_le_apply_maxOn [LE β] [Decidable
    {k : Nat} (hi : k < xs.length) (hle : f (xs.maxOn f h) ≤ f xs[k]) :
    xs.maxIdxOn f h ≤ k := by
  simp only [List.maxIdxOn_eq_minIdxOn, List.maxOn_eq_minOn] at hle ⊢
  letI : LE β := (inferInstanceAs (LE β)).opposite
  letI : LE β := (inferInstance : LE β).opposite
  exact List.minIdxOn_le_of_apply_getElem_le_apply_minOn h hi (by simpa [LE.le_opposite_iff] using hle)

protected theorem apply_maxOn_lt_apply_getElem_of_lt_maxIdxOn [LE β] [DecidableLE β] [LT β] [IsLinearPreorder β]
@@ -513,7 +513,7 @@ protected theorem getElem_maxIdxOn [LE β] [DecidableLE β] [IsLinearPreorder β
    {f : α → β} {xs : List α} (h : xs ≠ []) :
    xs[xs.maxIdxOn f h] = xs.maxOn f h := by
  simp only [List.maxIdxOn_eq_minIdxOn, List.maxOn_eq_minOn]
  letI : LE β := (inferInstanceAs (LE β)).opposite
  letI : LE β := (inferInstance : LE β).opposite
  exact List.getElem_minIdxOn h

protected theorem le_maxIdxOn_of_apply_getElem_lt_apply_getElem [LE β] [DecidableLE β] [LT β]
@@ -562,14 +562,14 @@ protected theorem maxIdxOn_cons
    else if f (xs.maxOn f h) ≤ f x then 0
    else (xs.maxIdxOn f h) + 1 := by
  simp only [List.maxIdxOn_eq_minIdxOn, List.maxOn_eq_minOn]
  letI : LE β := (inferInstanceAs (LE β)).opposite
  letI : LE β := (inferInstance : LE β).opposite
  simpa [LE.le_opposite_iff] using List.minIdxOn_cons (f := f)

protected theorem maxIdxOn_eq_zero_iff [LE β] [DecidableLE β] [IsLinearPreorder β]
    {xs : List α} {f : α → β} (h : xs ≠ []) :
    xs.maxIdxOn f h = 0 ↔ ∀ x ∈ xs, f x ≤ f (xs.head h) := by
  simp only [List.maxIdxOn_eq_minIdxOn]
  letI : LE β := (inferInstanceAs (LE β)).opposite
  letI : LE β := (inferInstance : LE β).opposite
  simpa [LE.le_opposite_iff] using List.minIdxOn_eq_zero_iff h (f := f)

protected theorem maxIdxOn_append [LE β] [DecidableLE β] [IsLinearPreorder β]
@@ -580,26 +580,26 @@ protected theorem maxIdxOn_append [LE β] [DecidableLE β] [IsLinearPreorder β]
    else
      xs.length + ys.maxIdxOn f hys := by
  simp only [List.maxIdxOn_eq_minIdxOn, List.maxOn_eq_minOn]
  letI : LE β := (inferInstanceAs (LE β)).opposite
  letI : LE β := (inferInstance : LE β).opposite
  simpa [LE.le_opposite_iff] using List.minIdxOn_append hxs hys (f := f)

protected theorem left_le_maxIdxOn_append [LE β] [DecidableLE β] [IsLinearPreorder β]
    {xs ys : List α} {f : α → β} (h : xs ≠ []) :
    xs.maxIdxOn f h ≤ (xs ++ ys).maxIdxOn f (by simp [h]) :=
  letI : LE β := (inferInstanceAs (LE β)).opposite
  letI : LE β := (inferInstance : LE β).opposite
  List.left_le_minIdxOn_append h

protected theorem maxIdxOn_take_le [LE β] [DecidableLE β] [IsLinearPreorder β]
    {xs : List α} {f : α → β} {i : Nat} (h : xs.take i ≠ []) :
    (xs.take i).maxIdxOn f h ≤ xs.maxIdxOn f (List.ne_nil_of_take_ne_nil h) :=
  letI : LE β := (inferInstanceAs (LE β)).opposite
  letI : LE β := (inferInstance : LE β).opposite
  List.minIdxOn_take_le h

@[simp]
protected theorem maxIdxOn_replicate [LE β] [DecidableLE β] [Refl (α := β) (· ≤ ·)]
    {n : Nat} {a : α} {f : α → β} (h : replicate n a ≠ []) :
    (replicate n a).maxIdxOn f h = 0 :=
  letI : LE β := (inferInstanceAs (LE β)).opposite
  letI : LE β := (inferInstance : LE β).opposite
  List.minIdxOn_replicate h

@[simp]

@@ -297,13 +297,13 @@ protected theorem maxOn_cons
    (x :: xs).maxOn f (by exact of_decide_eq_false rfl) =
    if h : xs = [] then x else maxOn f x (xs.maxOn f h) := by
  simp only [maxOn_eq_minOn]
  letI : LE β := (inferInstanceAs (LE β)).opposite
  letI : LE β := (inferInstance : LE β).opposite
  exact List.minOn_cons (f := f)

protected theorem maxOn_cons_cons [LE β] [DecidableLE β] {a b : α} {l : List α} {f : α → β} :
    (a :: b :: l).maxOn f (by simp) = (maxOn f a b :: l).maxOn f (by simp) := by
  simp only [List.maxOn_eq_minOn, maxOn_eq_minOn]
  letI : LE β := (inferInstanceAs (LE β)).opposite
  letI : LE β := (inferInstance : LE β).opposite
  exact List.minOn_cons_cons

@[simp]
@@ -334,51 +334,51 @@ protected theorem maxOn_id [Max α] [LE α] [DecidableLE α] [LawfulOrderLeftLea
    {xs : List α} (h : xs ≠ []) :
    xs.maxOn id h = xs.max h := by
  simp only [List.maxOn_eq_minOn]
  letI : LE α := (inferInstanceAs (LE α)).opposite
  letI : Min α := (inferInstanceAs (Max α)).oppositeMin
  letI : LE α := (inferInstance : LE α).opposite
  letI : Min α := (inferInstance : Max α).oppositeMin
  simpa only [List.max_eq_min] using List.minOn_id h

@[simp]
protected theorem maxOn_mem [LE β] [DecidableLE β] {xs : List α}
    {f : α → β} {h : xs ≠ []} : xs.maxOn f h ∈ xs :=
  letI : LE β := (inferInstanceAs (LE β)).opposite
  letI : LE β := (inferInstance : LE β).opposite
  List.minOn_mem (f := f)

protected theorem le_apply_maxOn_of_mem [LE β] [DecidableLE β] [IsLinearPreorder β]
    {xs : List α} {f : α → β} {y : α} (hx : y ∈ xs) :
    f y ≤ f (xs.maxOn f (List.ne_nil_of_mem hx)) := by
  rw [List.maxOn_eq_minOn]
  letI : LE β := (inferInstanceAs (LE β)).opposite
  letI : LE β := (inferInstance : LE β).opposite
  simpa [LE.le_opposite_iff] using List.apply_minOn_le_of_mem (f := f) hx

protected theorem apply_maxOn_le_iff [LE β] [DecidableLE β] [IsLinearPreorder β] {xs : List α}
    {f : α → β} (h : xs ≠ []) {b : β} :
    f (xs.maxOn f h) ≤ b ↔ ∀ x ∈ xs, f x ≤ b := by
  rw [List.maxOn_eq_minOn]
  letI : LE β := (inferInstanceAs (LE β)).opposite
  letI : LE β := (inferInstance : LE β).opposite
  simpa [LE.le_opposite_iff] using List.le_apply_minOn_iff (f := f) h

protected theorem le_apply_maxOn_iff [LE β] [DecidableLE β] [IsLinearPreorder β] {xs : List α}
    {f : α → β} (h : xs ≠ []) {b : β} :
    b ≤ f (xs.maxOn f h) ↔ ∃ x ∈ xs, b ≤ f x := by
  rw [List.maxOn_eq_minOn]
  letI : LE β := (inferInstanceAs (LE β)).opposite
  letI : LE β := (inferInstance : LE β).opposite
  simpa [LE.le_opposite_iff] using List.apply_minOn_le_iff (f := f) h

protected theorem apply_maxOn_lt_iff
    [LE β] [DecidableLE β] [LT β] [IsLinearPreorder β] [LawfulOrderLT β]
    {xs : List α} {f : α → β} (h : xs ≠ []) {b : β} :
    f (xs.maxOn f h) < b ↔ ∀ x ∈ xs, f x < b := by
  letI : LE β := (inferInstanceAs (LE β)).opposite
  letI : LT β := (inferInstanceAs (LT β)).opposite
  letI : LE β := (inferInstance : LE β).opposite
  letI : LT β := (inferInstance : LT β).opposite
  simpa [LT.lt_opposite_iff] using List.lt_apply_minOn_iff (f := f) h

protected theorem lt_apply_maxOn_iff
    [LE β] [DecidableLE β] [LT β] [IsLinearPreorder β] [LawfulOrderLT β]
    {xs : List α} {f : α → β} (h : xs ≠ []) {b : β} :
    b < f (xs.maxOn f h) ↔ ∃ x ∈ xs, b < f x := by
  letI : LE β := (inferInstanceAs (LE β)).opposite
  letI : LT β := (inferInstanceAs (LT β)).opposite
  letI : LE β := (inferInstance : LE β).opposite
  letI : LT β := (inferInstance : LT β).opposite
  simpa [LT.lt_opposite_iff] using List.apply_minOn_lt_iff (f := f) h

protected theorem apply_maxOn_le_apply_maxOn_of_subset [LE β] [DecidableLE β]
@@ -386,14 +386,14 @@ protected theorem apply_maxOn_le_apply_maxOn_of_subset [LE β] [DecidableLE β]
    haveI : xs ≠ [] := by intro h; rw [h] at hxs; simp_all [subset_nil]
    f (ys.maxOn f hys) ≤ f (xs.maxOn f this) := by
  rw [List.maxOn_eq_minOn]
  letI : LE β := (inferInstanceAs (LE β)).opposite
  letI : LE β := (inferInstance : LE β).opposite
  simpa [LE.le_opposite_iff] using List.apply_minOn_le_apply_minOn_of_subset (f := f) hxs hys

protected theorem apply_maxOn_take_le [LE β] [DecidableLE β] [IsLinearPreorder β]
    {xs : List α} {f : α → β} {i : Nat} (h : xs.take i ≠ []) :
    f ((xs.take i).maxOn f h) ≤ f (xs.maxOn f (List.ne_nil_of_take_ne_nil h)) := by
  rw [List.maxOn_eq_minOn]
  letI : LE β := (inferInstanceAs (LE β)).opposite
  letI : LE β := (inferInstance : LE β).opposite
  simpa [LE.le_opposite_iff] using List.le_apply_minOn_take (f := f) h

protected theorem le_apply_maxOn_append_left [LE β] [DecidableLE β] [IsLinearPreorder β]
@@ -401,7 +401,7 @@ protected theorem le_apply_maxOn_append_left [LE β] [DecidableLE β] [IsLinearP
    f (xs.maxOn f h) ≤
      f ((xs ++ ys).maxOn f (append_ne_nil_of_left_ne_nil h ys)) := by
  rw [List.maxOn_eq_minOn]
  letI : LE β := (inferInstanceAs (LE β)).opposite
  letI : LE β := (inferInstance : LE β).opposite
  simpa [LE.le_opposite_iff] using List.apply_minOn_append_le_left (f := f) h

protected theorem le_apply_maxOn_append_right [LE β] [DecidableLE β] [IsLinearPreorder β]
@@ -409,7 +409,7 @@ protected theorem le_apply_maxOn_append_right [LE β] [DecidableLE β] [IsLinear
    f (ys.maxOn f h) ≤
      f ((xs ++ ys).maxOn f (append_ne_nil_of_right_ne_nil xs h)) := by
|
||||
rw [List.maxOn_eq_minOn]
|
||||
letI : LE β := (inferInstanceAs (LE β)).opposite
|
||||
letI : LE β := (inferInstance : LE β).opposite
|
||||
simpa [LE.le_opposite_iff] using List.apply_minOn_append_le_right (f := f) h
|
||||
|
||||
@[simp]
|
||||
@@ -417,21 +417,21 @@ protected theorem maxOn_append [LE β] [DecidableLE β] [IsLinearPreorder β] {x
|
||||
{f : α → β} (hxs : xs ≠ []) (hys : ys ≠ []) :
|
||||
(xs ++ ys).maxOn f (by simp [hxs]) = maxOn f (xs.maxOn f hxs) (ys.maxOn f hys) := by
|
||||
simp only [List.maxOn_eq_minOn, maxOn_eq_minOn]
|
||||
letI : LE β := (inferInstanceAs (LE β)).opposite
|
||||
letI : LE β := (inferInstance : LE β).opposite
|
||||
simpa [LE.le_opposite_iff] using List.minOn_append (f := f) hxs hys
|
||||
|
||||
protected theorem maxOn_eq_head [LE β] [DecidableLE β] [IsLinearPreorder β] {xs : List α}
|
||||
{f : α → β} (h : xs ≠ []) (h' : ∀ x ∈ xs, f x ≤ f (xs.head h)) :
|
||||
xs.maxOn f h = xs.head h := by
|
||||
rw [List.maxOn_eq_minOn]
|
||||
letI : LE β := (inferInstanceAs (LE β)).opposite
|
||||
letI : LE β := (inferInstance : LE β).opposite
|
||||
simpa [LE.le_opposite_iff] using List.minOn_eq_head (f := f) h (by simpa [LE.le_opposite_iff] using h')
|
||||
|
||||
protected theorem max_map
|
||||
[LE β] [DecidableLE β] [Max β] [IsLinearPreorder β] [LawfulOrderLeftLeaningMax β] {xs : List α}
|
||||
{f : α → β} (h : xs ≠ []) : (xs.map f).max (by simpa) = f (xs.maxOn f h) := by
|
||||
letI : LE β := (inferInstanceAs (LE β)).opposite
|
||||
letI : Min β := (inferInstanceAs (Max β)).oppositeMin
|
||||
letI : LE β := (inferInstance : LE β).opposite
|
||||
letI : Min β := (inferInstance : Max β).oppositeMin
|
||||
simpa [List.max_eq_min] using List.min_map (f := f) h
|
||||
|
||||
protected theorem maxOn_eq_max [Max α] [LE α] [DecidableLE α] [LawfulOrderLeftLeaningMax α]
|
||||
@@ -458,7 +458,7 @@ protected theorem max_map_eq_max [Max α] [LE α] [DecidableLE α] [LawfulOrderL
|
||||
protected theorem maxOn_replicate [LE β] [DecidableLE β] [IsLinearPreorder β]
|
||||
{n : Nat} {a : α} {f : α → β} (h : replicate n a ≠ []) :
|
||||
(replicate n a).maxOn f h = a :=
|
||||
letI : LE β := (inferInstanceAs (LE β)).opposite
|
||||
letI : LE β := (inferInstance : LE β).opposite
|
||||
List.minOn_replicate (f := f) h
|
||||
|
||||
/-! # minOn? -/
|
||||
@@ -579,7 +579,7 @@ protected theorem maxOn?_nil [LE β] [DecidableLE β] {f : α → β} :
|
||||
protected theorem maxOn?_cons_eq_some_maxOn
|
||||
[LE β] [DecidableLE β] {f : α → β} {x : α} {xs : List α} :
|
||||
(x :: xs).maxOn? f = some ((x :: xs).maxOn f (fun h => nomatch h)) :=
|
||||
letI : LE β := (inferInstanceAs (LE β)).opposite
|
||||
letI : LE β := (inferInstance : LE β).opposite
|
||||
List.minOn?_cons_eq_some_minOn
|
||||
|
||||
protected theorem maxOn?_cons
|
||||
@@ -588,7 +588,7 @@ protected theorem maxOn?_cons
|
||||
have : maxOn f x = (letI : LE β := LE.opposite inferInstance; minOn f x) := by
|
||||
ext; simp only [maxOn_eq_minOn]
|
||||
simp only [List.maxOn?_eq_minOn?, this]
|
||||
letI : LE β := (inferInstanceAs (LE β)).opposite
|
||||
letI : LE β := (inferInstance : LE β).opposite
|
||||
exact List.minOn?_cons
|
||||
|
||||
@[simp]
|
||||
@@ -599,8 +599,8 @@ protected theorem maxOn?_singleton [LE β] [DecidableLE β] {x : α} {f : α →
|
||||
@[simp]
|
||||
protected theorem maxOn?_id [Max α] [LE α] [DecidableLE α] [LawfulOrderLeftLeaningMax α]
|
||||
{xs : List α} : xs.maxOn? id = xs.max? := by
|
||||
letI : LE α := (inferInstanceAs (LE α)).opposite
|
||||
letI : Min α := (inferInstanceAs (Max α)).oppositeMin
|
||||
letI : LE α := (inferInstance : LE α).opposite
|
||||
letI : Min α := (inferInstance : Max α).oppositeMin
|
||||
simpa only [List.maxOn?_eq_minOn?, List.max?_eq_min?] using List.minOn?_id (α := α)
|
||||
|
||||
protected theorem maxOn?_eq_if
|
||||
@@ -610,7 +610,7 @@ protected theorem maxOn?_eq_if
|
||||
some (xs.maxOn f h)
|
||||
else
|
||||
none :=
|
||||
letI : LE β := (inferInstanceAs (LE β)).opposite
|
||||
letI : LE β := (inferInstance : LE β).opposite
|
||||
List.minOn?_eq_if
|
||||
|
||||
@[simp]
|
||||
@@ -620,55 +620,55 @@ protected theorem isSome_maxOn?_iff [LE β] [DecidableLE β] {f : α → β} {xs
|
||||
|
||||
protected theorem maxOn_eq_get_maxOn? [LE β] [DecidableLE β] {f : α → β} {xs : List α}
|
||||
(h : xs ≠ []) : xs.maxOn f h = (xs.maxOn? f).get (List.isSome_maxOn?_iff.mpr h) :=
|
||||
letI : LE β := (inferInstanceAs (LE β)).opposite
|
||||
letI : LE β := (inferInstance : LE β).opposite
|
||||
List.minOn_eq_get_minOn? (f := f) h
|
||||
|
||||
protected theorem maxOn?_eq_some_maxOn [LE β] [DecidableLE β] {f : α → β} {xs : List α}
|
||||
(h : xs ≠ []) : xs.maxOn? f = some (xs.maxOn f h) :=
|
||||
letI : LE β := (inferInstanceAs (LE β)).opposite
|
||||
letI : LE β := (inferInstance : LE β).opposite
|
||||
List.minOn?_eq_some_minOn (f := f) h
|
||||
|
||||
@[simp]
|
||||
protected theorem get_maxOn? [LE β] [DecidableLE β] {f : α → β} {xs : List α}
|
||||
(h : xs ≠ []) : (xs.maxOn? f).get (List.isSome_maxOn?_iff.mpr h) = xs.maxOn f h :=
|
||||
letI : LE β := (inferInstanceAs (LE β)).opposite
|
||||
letI : LE β := (inferInstance : LE β).opposite
|
||||
List.get_minOn? (f := f) h
|
||||
|
||||
protected theorem maxOn_eq_of_maxOn?_eq_some
|
||||
[LE β] [DecidableLE β] {f : α → β} {xs : List α} {x : α} (h : xs.maxOn? f = some x) :
|
||||
xs.maxOn f (List.isSome_maxOn?_iff.mp (Option.isSome_of_eq_some h)) = x :=
|
||||
letI : LE β := (inferInstanceAs (LE β)).opposite
|
||||
letI : LE β := (inferInstance : LE β).opposite
|
||||
List.minOn_eq_of_minOn?_eq_some (f := f) h
|
||||
|
||||
protected theorem isSome_maxOn?_of_mem
|
||||
[LE β] [DecidableLE β] {f : α → β} {xs : List α} {x : α} (h : x ∈ xs) :
|
||||
(xs.maxOn? f).isSome :=
|
||||
letI : LE β := (inferInstanceAs (LE β)).opposite
|
||||
letI : LE β := (inferInstance : LE β).opposite
|
||||
List.isSome_minOn?_of_mem (f := f) h
|
||||
|
||||
protected theorem le_apply_get_maxOn?_of_mem
|
||||
[LE β] [DecidableLE β] [IsLinearPreorder β] {f : α → β} {xs : List α} {x : α} (h : x ∈ xs) :
|
||||
f x ≤ f ((xs.maxOn? f).get (List.isSome_maxOn?_of_mem h)) := by
|
||||
simp only [List.maxOn?_eq_minOn?]
|
||||
letI : LE β := (inferInstanceAs (LE β)).opposite
|
||||
letI : LE β := (inferInstance : LE β).opposite
|
||||
simpa [LE.le_opposite_iff] using List.apply_get_minOn?_le_of_mem (f := f) h
|
||||
|
||||
protected theorem maxOn?_mem [LE β] [DecidableLE β] {xs : List α}
|
||||
{f : α → β} (h : xs.maxOn? f = some a) : a ∈ xs :=
|
||||
letI : LE β := (inferInstanceAs (LE β)).opposite
|
||||
letI : LE β := (inferInstance : LE β).opposite
|
||||
List.minOn?_mem (f := f) h
|
||||
|
||||
protected theorem maxOn?_replicate [LE β] [DecidableLE β] [IsLinearPreorder β]
|
||||
{n : Nat} {a : α} {f : α → β} :
|
||||
(replicate n a).maxOn? f = if n = 0 then none else some a :=
|
||||
letI : LE β := (inferInstanceAs (LE β)).opposite
|
||||
letI : LE β := (inferInstance : LE β).opposite
|
||||
List.minOn?_replicate
|
||||
|
||||
@[simp]
|
||||
protected theorem maxOn?_replicate_of_pos [LE β] [DecidableLE β] [IsLinearPreorder β]
|
||||
{n : Nat} {a : α} {f : α → β} (h : 0 < n) :
|
||||
(replicate n a).maxOn? f = some a :=
|
||||
letI : LE β := (inferInstanceAs (LE β)).opposite
|
||||
letI : LE β := (inferInstance : LE β).opposite
|
||||
List.minOn?_replicate_of_pos (f := f) h
|
||||
|
||||
@[simp]
|
||||
@@ -678,7 +678,7 @@ protected theorem maxOn?_append [LE β] [DecidableLE β] [IsLinearPreorder β]
|
||||
have : maxOn f = (letI : LE β := LE.opposite inferInstance; minOn f) := by
|
||||
ext; simp only [maxOn_eq_minOn]
|
||||
simp only [List.maxOn?_eq_minOn?, this]
|
||||
letI : LE β := (inferInstanceAs (LE β)).opposite
|
||||
letI : LE β := (inferInstance : LE β).opposite
|
||||
exact List.minOn?_append xs ys f
|
||||
|
||||
end List
|
||||
|
||||
@@ -42,7 +42,7 @@ theorem beq_eq_isEqv [BEq α] {as bs : List α} : as.beq bs = isEqv as bs (· ==
   cases bs with
   | nil => simp
   | cons b bs =>
-    simp only [beq_cons_cons, ih, isEqv_eq_decide, length_cons, Nat.add_right_cancel_iff,
+    simp only [beq_cons₂, ih, isEqv_eq_decide, length_cons, Nat.add_right_cancel_iff,
       Nat.forall_lt_succ_left', getElem_cons_zero, getElem_cons_succ, Bool.decide_and,
       Bool.decide_eq_true]
     split <;> simp

@@ -106,7 +106,7 @@ theorem Sublist.le_countP (s : l₁ <+ l₂) (p) : countP p l₂ - (l₂.length
     have := s.le_countP p
     have := s.length_le
     split <;> omega
-  | .cons_cons a s =>
+  | .cons₂ a s =>
     rename_i l₁ l₂
     simp only [countP_cons, length_cons]
     have := s.le_countP p

@@ -38,7 +38,7 @@ theorem map_getElem_sublist {l : List α} {is : List (Fin l.length)} (h : is.Pai
     simp only [Fin.getElem_fin, map_cons]
     have := IH h.of_cons (hd+1) (pairwise_cons.mp h).1
     specialize his hd (.head _)
-    have := (drop_eq_getElem_cons ..).symm ▸ this.cons_cons (get l hd)
+    have := (drop_eq_getElem_cons ..).symm ▸ this.cons₂ (get l hd)
     have := Sublist.append (nil_sublist (take hd l |>.drop j)) this
     rwa [nil_append, ← (drop_append_of_le_length ?_), take_append_drop] at this
     simp [Nat.min_eq_left (Nat.le_of_lt hd.isLt), his]
@@ -55,7 +55,7 @@ theorem sublist_eq_map_getElem {l l' : List α} (h : l' <+ l) : ∃ is : List (F
     refine ⟨is.map (·.succ), ?_⟩
     set_option backward.isDefEq.respectTransparency false in
     simpa [Function.comp_def, pairwise_map]
-  | cons_cons _ _ IH =>
+  | cons₂ _ _ IH =>
     rcases IH with ⟨is,IH⟩
     refine ⟨⟨0, by simp [Nat.zero_lt_succ]⟩ :: is.map (·.succ), ?_⟩
    set_option backward.isDefEq.respectTransparency false in

@@ -207,7 +207,7 @@ theorem take_eq_dropLast {l : List α} {i : Nat} (h : i + 1 = l.length) :
   · cases as with
     | nil => simp_all
     | cons b bs =>
-      simp only [take_succ_cons, dropLast_cons_cons]
+      simp only [take_succ_cons, dropLast_cons₂]
       rw [ih]
      simpa using h

@@ -33,7 +33,7 @@ open Nat
 @[grind →] theorem Pairwise.sublist : l₁ <+ l₂ → l₂.Pairwise R → l₁.Pairwise R
   | .slnil, h => h
   | .cons _ s, .cons _ h₂ => h₂.sublist s
-  | .cons_cons _ s, .cons h₁ h₂ => (h₂.sublist s).cons fun _ h => h₁ _ (s.subset h)
+  | .cons₂ _ s, .cons h₁ h₂ => (h₂.sublist s).cons fun _ h => h₁ _ (s.subset h)
 
 theorem Pairwise.imp {α R S} (H : ∀ {a b}, R a b → S a b) :
     ∀ {l : List α}, l.Pairwise R → l.Pairwise S
@@ -226,7 +226,7 @@ theorem pairwise_iff_forall_sublist : l.Pairwise R ↔ (∀ {a b}, [a,b] <+ l
   constructor <;> intro h
   · intro
     | a, b, .cons _ hab => exact IH.mp h.2 hab
-    | _, b, .cons_cons _ hab => refine h.1 _ (hab.subset ?_); simp
+    | _, b, .cons₂ _ hab => refine h.1 _ (hab.subset ?_); simp
   · constructor
     · intro x hx
       apply h
@@ -304,43 +304,26 @@ grind_pattern Nodup.sublist => l₁ <+ l₂, Nodup l₂
 theorem Sublist.nodup : l₁ <+ l₂ → Nodup l₂ → Nodup l₁ :=
   Nodup.sublist
 
-theorem getElem?_inj {l : List α} (h₀ : i < l.length) (h₁ : List.Nodup l) :
-    l[i]? = l[j]? ↔ i = j :=
-  ⟨by
-    intro h₂
-    induction l generalizing i j with
-    | nil => cases h₀
-    | cons x xs ih =>
-      match i, j with
-      | 0, 0 => rfl
-      | i+1, j+1 =>
-        cases h₁ with
-        | cons ha h₁ =>
-          simp only [getElem?_cons_succ] at h₂
-          exact congrArg (· + 1) (ih (Nat.lt_of_succ_lt_succ h₀) h₁ h₂)
-      | i+1, 0 => ?_
-      | 0, j+1 => ?_
-      all_goals
-        simp only [getElem?_cons_zero, getElem?_cons_succ] at h₂
-        cases h₁; rename_i h' h
-        have := h x ?_ rfl; cases this
-        rw [mem_iff_getElem?]
-        exact ⟨_, h₂⟩; exact ⟨_ , h₂.symm⟩
-  , by simp +contextual⟩
-
-theorem getElem_inj {xs : List α}
-    {h₀ : i < xs.length} {h₁ : j < xs.length} (h : Nodup xs) : xs[i] = xs[j] ↔ i = j := by
-  simpa only [List.getElem_eq_getElem?_get, Option.get_inj] using getElem?_inj h₀ h
-
-theorem getD_inj {xs : List α}
-    (h₀ : i < xs.length) (h₁ : j < xs.length) (h₂ : Nodup xs) :
-    xs.getD i fallback = xs.getD j fallback ↔ i = j := by
-  simp only [List.getD_eq_getElem?_getD]
-  rw [Option.getD_inj, getElem?_inj] <;> simpa
-
-theorem getElem!_inj [Inhabited α] {xs : List α}
-    (h₀ : i < xs.length) (h₁ : j < xs.length) (h₂ : Nodup xs) : xs[i]! = xs[j]! ↔ i = j := by
-  simpa only [getElem!_eq_getElem?_getD, ← getD_eq_getElem?_getD] using getD_inj h₀ h₁ h₂
+theorem getElem?_inj {xs : List α}
+    (h₀ : i < xs.length) (h₁ : Nodup xs) (h₂ : xs[i]? = xs[j]?) : i = j := by
+  induction xs generalizing i j with
+  | nil => cases h₀
+  | cons x xs ih =>
+    match i, j with
+    | 0, 0 => rfl
+    | i+1, j+1 =>
+      cases h₁ with
+      | cons ha h₁ =>
+        simp only [getElem?_cons_succ] at h₂
+        exact congrArg (· + 1) (ih (Nat.lt_of_succ_lt_succ h₀) h₁ h₂)
+    | i+1, 0 => ?_
+    | 0, j+1 => ?_
+    all_goals
+      simp only [getElem?_cons_zero, getElem?_cons_succ] at h₂
+      cases h₁; rename_i h' h
+      have := h x ?_ rfl; cases this
+      rw [mem_iff_getElem?]
+      exact ⟨_, h₂⟩; exact ⟨_ , h₂.symm⟩
 
 @[simp, grind =] theorem nodup_replicate {n : Nat} {a : α} :
     (replicate n a).Nodup ↔ n ≤ 1 := by simp [Nodup]

@@ -252,13 +252,13 @@ theorem exists_perm_sublist {l₁ l₂ l₂' : List α} (s : l₁ <+ l₂) (p :
   | cons x _ IH =>
     match s with
     | .cons _ s => let ⟨l₁', p', s'⟩ := IH s; exact ⟨l₁', p', s'.cons _⟩
-    | .cons_cons _ s => let ⟨l₁', p', s'⟩ := IH s; exact ⟨x :: l₁', p'.cons x, s'.cons_cons _⟩
+    | .cons₂ _ s => let ⟨l₁', p', s'⟩ := IH s; exact ⟨x :: l₁', p'.cons x, s'.cons₂ _⟩
   | swap x y l' =>
     match s with
     | .cons _ (.cons _ s) => exact ⟨_, .rfl, (s.cons _).cons _⟩
-    | .cons _ (.cons_cons _ s) => exact ⟨x :: _, .rfl, (s.cons _).cons_cons _⟩
-    | .cons_cons _ (.cons _ s) => exact ⟨y :: _, .rfl, (s.cons_cons _).cons _⟩
-    | .cons_cons _ (.cons_cons _ s) => exact ⟨x :: y :: _, .swap .., (s.cons_cons _).cons_cons _⟩
+    | .cons _ (.cons₂ _ s) => exact ⟨x :: _, .rfl, (s.cons _).cons₂ _⟩
+    | .cons₂ _ (.cons _ s) => exact ⟨y :: _, .rfl, (s.cons₂ _).cons _⟩
+    | .cons₂ _ (.cons₂ _ s) => exact ⟨x :: y :: _, .swap .., (s.cons₂ _).cons₂ _⟩
   | trans _ _ IH₁ IH₂ =>
     let ⟨_, pm, sm⟩ := IH₁ s
     let ⟨r₁, pr, sr⟩ := IH₂ sm
@@ -277,7 +277,7 @@ theorem Sublist.exists_perm_append {l₁ l₂ : List α} : l₁ <+ l₂ → ∃
   | Sublist.cons a s =>
     let ⟨l, p⟩ := Sublist.exists_perm_append s
     ⟨a :: l, (p.cons a).trans perm_middle.symm⟩
-  | Sublist.cons_cons a s =>
+  | Sublist.cons₂ a s =>
     let ⟨l, p⟩ := Sublist.exists_perm_append s
     ⟨l, p.cons a⟩

@@ -452,7 +452,7 @@ theorem sublist_mergeSort
     have h' := sublist_mergeSort trans total hc h
     rw [h₂] at h'
     exact h'.middle a
-  | _, _, @Sublist.cons_cons _ l₁ l₂ a h => by
+  | _, _, @Sublist.cons₂ _ l₁ l₂ a h => by
     rename_i hc
     obtain ⟨l₃, l₄, h₁, h₂, h₃⟩ := mergeSort_cons trans total a l₂
     rw [h₁]
@@ -460,7 +460,7 @@
     rw [h₂] at h'
     simp only [Bool.not_eq_true', tail_cons] at h₃ h'
     exact
-      sublist_append_of_sublist_right (Sublist.cons_cons a
+      sublist_append_of_sublist_right (Sublist.cons₂ a
       ((fun w => Sublist.of_sublist_append_right w h') fun b m₁ m₃ =>
         (Bool.eq_not_self true).mp ((rel_of_pairwise_cons hc m₁).symm.trans (h₃ b m₃))))

@@ -1,10 +0,0 @@
-/-
-Copyright (c) 2026 Lean FRO, LLC. All rights reserved.
-Released under Apache 2.0 license as described in the file LICENSE.
-Authors: Markus Himmel
--/
-module
-
-prelude
-public import Init.Data.List.SplitOn.Basic
-public import Init.Data.List.SplitOn.Lemmas
@@ -1,70 +0,0 @@
-/-
-Copyright (c) 2016 Microsoft Corporation. All rights reserved.
-Released under Apache 2.0 license as described in the file LICENSE.
-Authors: Leonardo de Moura
--/
-module
-
-prelude
-public import Init.Data.List.Basic
-public import Init.NotationExtra
-import Init.Data.Array.Bootstrap
-import Init.Data.List.Lemmas
-
-public section
-
-set_option doc.verso true
-
-namespace List
-
-/--
-Split a list at every element satisfying a predicate, and then prepend {lean}`acc.reverse` to the
-first element of the result.
-
-* {lean}`[1, 1, 2, 3, 2, 4, 4].splitOnPPrepend (· == 2) [0, 5] = [[5, 0, 1, 1], [3], [4, 4]]`
--/
-noncomputable def splitOnPPrepend (p : α → Bool) : (l : List α) → (acc : List α) → List (List α)
-  | [], acc => [acc.reverse]
-  | a :: t, acc => if p a then acc.reverse :: splitOnPPrepend p t [] else splitOnPPrepend p t (a::acc)
-
-/--
-Split a list at every element satisfying a predicate. The separators are not in the result.
-
-Examples:
-* {lean}`[1, 1, 2, 3, 2, 4, 4].splitOnP (· == 2) = [[1, 1], [3], [4, 4]]`
--/
-noncomputable def splitOnP (p : α → Bool) (l : List α) : List (List α) :=
-  splitOnPPrepend p l []
-
-@[deprecated splitOnPPrepend (since := "2026-02-26")]
-noncomputable def splitOnP.go (p : α → Bool) (l acc : List α) : List (List α) :=
-  splitOnPPrepend p l acc
-
-/-- Tail recursive version of {name}`splitOnP`. -/
-@[inline]
-def splitOnPTR (p : α → Bool) (l : List α) : List (List α) := go l #[] #[] where
-  @[specialize] go : List α → Array α → Array (List α) → List (List α)
-    | [], acc, r => r.toListAppend [acc.toList]
-    | a :: t, acc, r => bif p a then go t #[] (r.push acc.toList) else go t (acc.push a) r
-
-@[csimp] theorem splitOnP_eq_splitOnPTR : @splitOnP = @splitOnPTR := by
-  funext α P l
-  simp only [splitOnPTR]
-  suffices ∀ xs acc r,
-      splitOnPTR.go P xs acc r = r.toList ++ splitOnPPrepend P xs acc.toList.reverse from
-    (this l #[] #[]).symm
-  intro xs acc r
-  induction xs generalizing acc r with
-  | nil => simp [splitOnPPrepend, splitOnPTR.go]
-  | cons x xs IH => cases h : P x <;> simp [splitOnPPrepend, splitOnPTR.go, *]
-
-/--
-Split a list at every occurrence of a separator element. The separators are not in the result.
-
-Examples:
-* {lean}`[1, 1, 2, 3, 2, 4, 4].splitOn 2 = [[1, 1], [3], [4, 4]]`
--/
-@[inline] def splitOn [BEq α] (a : α) (as : List α) : List (List α) :=
-  as.splitOnP (· == a)
-
-end List
@@ -1,208 +0,0 @@
-/-
-Copyright (c) 2014 Parikshit Khanna. All rights reserved.
-Released under Apache 2.0 license as described in the file LICENSE.
-Authors: Parikshit Khanna, Jeremy Avigad, Leonardo de Moura, Floris van Doorn, Mario Carneiro, Markus Himmel
--/
-module
-
-prelude
-public import Init.Data.List.SplitOn.Basic
-import all Init.Data.List.SplitOn.Basic
-import Init.Data.List.Nat.Modify
-import Init.ByCases
-
-public section
-
-namespace List
-
-variable {p : α → Bool} {xs : List α} {ls : List (List α)}
-
-@[simp]
-theorem splitOn_nil [BEq α] (a : α) : [].splitOn a = [[]] :=
-  (rfl)
-
-@[simp]
-theorem splitOnP_nil : [].splitOnP p = [[]] :=
-  (rfl)
-
-@[simp]
-theorem splitOnPPrepend_ne_nil (p : α → Bool) (xs acc : List α) : splitOnPPrepend p xs acc ≠ [] := by
-  fun_induction splitOnPPrepend <;> simp_all
-
-@[deprecated splitOnPPrepend_ne_nil (since := "2026-02-26")]
-theorem splitOnP.go_ne_nil (p : α → Bool) (xs acc : List α) : splitOnPPrepend p xs acc ≠ [] :=
-  splitOnPPrepend_ne_nil p xs acc
-
-@[simp] theorem splitOnPPrepend_nil {acc : List α} : splitOnPPrepend p [] acc = [acc.reverse] := (rfl)
-@[simp] theorem splitOnPPrepend_nil_right : splitOnPPrepend p xs [] = splitOnP p xs := (rfl)
-theorem splitOnP_eq_splitOnPPrepend : splitOnP p xs = splitOnPPrepend p xs [] := (rfl)
-
-theorem splitOnPPrepend_cons_eq_if {x : α} {xs acc : List α} :
-    splitOnPPrepend p (x :: xs) acc =
-      if p x then acc.reverse :: splitOnP p xs else splitOnPPrepend p xs (x :: acc) := by
-  simp [splitOnPPrepend]
-
-theorem splitOnPPrepend_cons_pos {p : α → Bool} {a : α} {l acc : List α} (h : p a) :
-    splitOnPPrepend p (a :: l) acc = acc.reverse :: splitOnP p l := by
-  simp [splitOnPPrepend, h]
-
-theorem splitOnPPrepend_cons_neg {p : α → Bool} {a : α} {l acc : List α} (h : p a = false) :
-    splitOnPPrepend p (a :: l) acc = splitOnPPrepend p l (a :: acc) := by
-  simp [splitOnPPrepend, h]
-
-theorem splitOnP_cons_eq_if_splitOnPPrepend {x : α} {xs : List α} :
-    splitOnP p (x :: xs) = if p x then [] :: splitOnP p xs else splitOnPPrepend p xs [x] := by
-  simp [splitOnPPrepend_cons_eq_if, ← splitOnPPrepend_nil_right]
-
-theorem splitOnPPrepend_eq_modifyHead {xs acc : List α} :
-    splitOnPPrepend p xs acc = modifyHead (acc.reverse ++ ·) (splitOnP p xs) := by
-  induction xs generalizing acc with
-  | nil => simp
-  | cons hd tl ih =>
-    simp [splitOnPPrepend_cons_eq_if, splitOnP_cons_eq_if_splitOnPPrepend, ih]
-    split <;> simp <;> congr
-
-@[deprecated splitOnPPrepend_eq_modifyHead (since := "2026-02-26")]
-theorem splitOnP.go_acc {xs acc : List α} :
-    splitOnPPrepend p xs acc = modifyHead (acc.reverse ++ ·) (splitOnP p xs) :=
-  splitOnPPrepend_eq_modifyHead
-
-@[simp]
-theorem splitOnP_ne_nil (p : α → Bool) (xs : List α) : xs.splitOnP p ≠ [] :=
-  splitOnPPrepend_ne_nil p xs []
-
-theorem splitOnP_cons_eq_if_modifyHead (x : α) (xs : List α) :
-    (x :: xs).splitOnP p =
-      if p x then [] :: xs.splitOnP p else (xs.splitOnP p).modifyHead (cons x) := by
-  simp [splitOnP_cons_eq_if_splitOnPPrepend, splitOnPPrepend_eq_modifyHead]
-
-@[deprecated splitOnP_cons_eq_if_modifyHead (since := "2026-02-26")]
-theorem splitOnP_cons (x : α) (xs : List α) :
-    (x :: xs).splitOnP p =
-      if p x then [] :: xs.splitOnP p else (xs.splitOnP p).modifyHead (cons x) :=
-  splitOnP_cons_eq_if_modifyHead x xs
-
-/-- The original list `L` can be recovered by flattening the lists produced by `splitOnP p L`,
-interspersed with the elements `L.filter p`. -/
-theorem splitOnP_spec (as : List α) :
    flatten (zipWith (· ++ ·) (splitOnP p as) (((as.filter p).map fun x => [x]) ++ [[]])) = as := by
-  induction as with
-  | nil => simp
-  | cons a as' ih =>
-    rw [splitOnP_cons_eq_if_modifyHead]
-    split <;> simp [*, flatten_zipWith, splitOnP_ne_nil]
-where
-  flatten_zipWith {xs ys : List (List α)} {a : α} (hxs : xs ≠ []) (hys : ys ≠ []) :
-      flatten (zipWith (fun x x_1 => x ++ x_1) (modifyHead (cons a) xs) ys) =
-        a :: flatten (zipWith (fun x x_1 => x ++ x_1) xs ys) := by
-    cases xs <;> cases ys <;> simp_all
-
-/-- If no element satisfies `p` in the list `xs`, then `xs.splitOnP p = [xs]` -/
-theorem splitOnP_eq_singleton (h : ∀ x ∈ xs, p x = false) : xs.splitOnP p = [xs] := by
-  induction xs with
-  | nil => simp
-  | cons hd tl ih =>
-    simp only [mem_cons, forall_eq_or_imp] at h
-    simp [splitOnP_cons_eq_if_modifyHead, h.1, ih h.2]
-
-@[deprecated splitOnP_eq_singleton (since := "2026-02-26")]
-theorem splitOnP_eq_single (h : ∀ x ∈ xs, p x = false) : xs.splitOnP p = [xs] :=
-  splitOnP_eq_singleton h
-
-/-- When a list of the form `[...xs, sep, ...as]` is split at the `sep` element satisfying `p`,
-the result is the concatenation of `splitOnP` called on `xs` and `as` -/
-theorem splitOnP_append_cons (xs as : List α) {sep : α} (hsep : p sep) :
-    (xs ++ sep :: as).splitOnP p = List.splitOnP p xs ++ List.splitOnP p as := by
-  induction xs with
-  | nil => simp [splitOnP_cons_eq_if_modifyHead, hsep]
-  | cons hd tl ih =>
-    obtain ⟨hd1, tl1, h1'⟩ := List.exists_cons_of_ne_nil (List.splitOnP_ne_nil (p := p) (xs := tl))
-    by_cases hPh : p hd <;> simp [splitOnP_cons_eq_if_modifyHead, *]
-
-/-- When a list of the form `[...xs, sep, ...as]` is split on `p`, the first element is `xs`,
-assuming no element in `xs` satisfies `p` but `sep` does satisfy `p` -/
-theorem splitOnP_append_cons_of_forall_mem (h : ∀ x ∈ xs, p x = false) (sep : α)
-    (hsep : p sep = true) (as : List α) : (xs ++ sep :: as).splitOnP p = xs :: as.splitOnP p := by
-  rw [splitOnP_append_cons xs as hsep, splitOnP_eq_singleton h, singleton_append]
-
-@[deprecated splitOnP_append_cons_of_forall_mem (since := "2026-02-26")]
-theorem splitOnP_first (h : ∀ x ∈ xs, p x = false) (sep : α)
-    (hsep : p sep = true) (as : List α) : (xs ++ sep :: as).splitOnP p = xs :: as.splitOnP p :=
-  splitOnP_append_cons_of_forall_mem h sep hsep as
-
-theorem splitOn_eq_splitOnP [BEq α] {x : α} {xs : List α} : xs.splitOn x = xs.splitOnP (· == x) :=
-  (rfl)
-
-@[simp]
-theorem splitOn_ne_nil [BEq α] (a : α) (xs : List α) : xs.splitOn a ≠ [] := by
-  simp [splitOn_eq_splitOnP]
-
-theorem splitOn_cons_eq_if_modifyHead [BEq α] {a : α} (x : α) (xs : List α) :
-    (x :: xs).splitOn a =
-      if x == a then [] :: xs.splitOn a else (xs.splitOn a).modifyHead (cons x) := by
-  simpa [splitOn_eq_splitOnP] using splitOnP_cons_eq_if_modifyHead ..
-
-/-- If no element satisfies `p` in the list `xs`, then `xs.splitOnP p = [xs]` -/
-theorem splitOn_eq_singleton_of_beq_eq_false [BEq α] {a : α} (h : ∀ x ∈ xs, (x == a) = false) :
-    xs.splitOn a = [xs] := by
-  simpa [splitOn_eq_splitOnP] using splitOnP_eq_singleton h
-
-theorem splitOn_eq_singleton [BEq α] [LawfulBEq α] {a : α} (h : a ∉ xs) :
-    xs.splitOn a = [xs] :=
-  splitOn_eq_singleton_of_beq_eq_false
-    (fun _ hb => beq_eq_false_iff_ne.2 (fun hab => absurd hb (hab ▸ h)))
-
-/-- When a list of the form `[...xs, sep, ...as]` is split at the `sep` element equal to `a`,
-the result is the concatenation of `splitOnP` called on `xs` and `as` -/
-theorem splitOn_append_cons_of_beq [BEq α] {a : α} (xs as : List α) {sep : α} (hsep : sep == a) :
-    (xs ++ sep :: as).splitOn a = List.splitOn a xs ++ List.splitOn a as := by
-  simpa [splitOn_eq_splitOnP] using splitOnP_append_cons (p := (· == a)) _ _ hsep
-
-/-- When a list of the form `[...xs, sep, ...as]` is split at `a`,
-the result is the concatenation of `splitOnP` called on `xs` and `as` -/
-theorem splitOn_append_cons_self [BEq α] [ReflBEq α] {a : α} (xs as : List α) :
-    (xs ++ a :: as).splitOn a = List.splitOn a xs ++ List.splitOn a as :=
-  splitOn_append_cons_of_beq _ _ (BEq.refl _)
-
-/-- When a list of the form `[...xs, a, ...as]` is split at `a`, the first element is `xs`,
-assuming no element in `xs` is equal to `a` but `sep` is equal to `a`. -/
-theorem splitOn_append_cons_of_forall_mem_beq_eq_false [BEq α] {a : α}
-    (h : ∀ x ∈ xs, (x == a) = false) (sep : α)
-    (hsep : sep == a) (as : List α) : (xs ++ sep :: as).splitOn a = xs :: as.splitOn a := by
-  simpa [splitOn_eq_splitOnP] using splitOnP_append_cons_of_forall_mem h _ hsep _
-
-/-- When a list of the form `[...xs, a, ...as]` is split at `a`, the first element is `xs`,
-assuming no element in `xs` is equal to `a`. -/
-theorem splitOn_append_cons_self_of_not_mem [BEq α] [LawfulBEq α] {a : α}
-    (h : a ∉ xs) (as : List α) : (xs ++ a :: as).splitOn a = xs :: as.splitOn a :=
-  splitOn_append_cons_of_forall_mem_beq_eq_false
-    (fun b hb => beq_eq_false_iff_ne.2 fun hab => absurd hb (hab ▸ h)) _ (by simp) _
-
-/-- `intercalate [x]` is the left inverse of `splitOn x` -/
-@[simp]
-theorem intercalate_splitOn [BEq α] [LawfulBEq α] (x : α) : [x].intercalate (xs.splitOn x) = xs := by
-  induction xs with
-  | nil => simp
-  | cons hd tl ih =>
-    simp only [splitOn_cons_eq_if_modifyHead, beq_iff_eq]
-    split
-    · simp_all [intercalate_cons_of_ne_nil, splitOn_ne_nil]
-    · have hsp := splitOn_ne_nil x tl
-      generalize splitOn x tl = ls at *
-      cases ls <;> simp_all
-
-/-- `splitOn x` is the left inverse of `intercalate [x]`, on the domain
-consisting of each nonempty list of lists `ls` whose elements do not contain `x` -/
-theorem splitOn_intercalate [BEq α] [LawfulBEq α] (x : α) (hx : ∀ l ∈ ls, x ∉ l) (hls : ls ≠ []) :
-    ([x].intercalate ls).splitOn x = ls := by
-  induction ls with
-  | nil => simp at hls
-  | cons hd tl ih =>
-    simp only [mem_cons, forall_eq_or_imp] at ⊢ hx
-    match tl with
-    | [] => simpa using splitOn_eq_singleton hx.1
-    | t::tl =>
-      simp only [intercalate_cons_cons, append_assoc, cons_append, nil_append]
-      rw [splitOn_append_cons_self_of_not_mem hx.1, ih hx.2 (by simp)]
-
-end List
@@ -32,12 +32,8 @@ open Nat

section isPrefixOf
variable [BEq α]

@[simp, grind =] theorem isPrefixOf_cons_cons_self [LawfulBEq α] {a : α} :
    isPrefixOf (a::as) (a::bs) = isPrefixOf as bs := by simp [isPrefixOf_cons_cons]

@[deprecated isPrefixOf_cons_cons_self (since := "2026-02-26")]
theorem isPrefixOf_cons₂_self [LawfulBEq α] {a : α} :
    isPrefixOf (a::as) (a::bs) = isPrefixOf as bs := isPrefixOf_cons_cons_self
@[simp, grind =] theorem isPrefixOf_cons₂_self [LawfulBEq α] {a : α} :
    isPrefixOf (a::as) (a::bs) = isPrefixOf as bs := by simp [isPrefixOf_cons₂]

@[simp] theorem isPrefixOf_length_pos_nil {l : List α} (h : 0 < l.length) : isPrefixOf l [] = false := by
  cases l <;> simp_all [isPrefixOf]
@@ -49,7 +45,7 @@ theorem isPrefixOf_cons₂_self [LawfulBEq α] {a : α} :
  | cons _ _ ih =>
    cases n
    · simp
    · simp [replicate_succ, isPrefixOf_cons_cons, ih, Nat.succ_le_succ_iff, Bool.and_left_comm]
    · simp [replicate_succ, isPrefixOf_cons₂, ih, Nat.succ_le_succ_iff, Bool.and_left_comm]

end isPrefixOf

@@ -173,18 +169,18 @@ theorem subset_replicate {n : Nat} {a : α} {l : List α} (h : n ≠ 0) : l ⊆

@[simp, grind ←] theorem Sublist.refl : ∀ l : List α, l <+ l
  | [] => .slnil
  | a :: l => (Sublist.refl l).cons_cons a
  | a :: l => (Sublist.refl l).cons₂ a

theorem Sublist.trans {l₁ l₂ l₃ : List α} (h₁ : l₁ <+ l₂) (h₂ : l₂ <+ l₃) : l₁ <+ l₃ := by
  induction h₂ generalizing l₁ with
  | slnil => exact h₁
  | cons _ _ IH => exact (IH h₁).cons _
  | @cons_cons l₂ _ a _ IH =>
  | @cons₂ l₂ _ a _ IH =>
    generalize e : a :: l₂ = l₂' at h₁
    match h₁ with
    | .slnil => apply nil_sublist
    | .cons a' h₁' => cases e; apply (IH h₁').cons
    | .cons_cons a' h₁' => cases e; apply (IH h₁').cons_cons
    | .cons₂ a' h₁' => cases e; apply (IH h₁').cons₂

instance : Trans (@Sublist α) Sublist Sublist := ⟨Sublist.trans⟩

@@ -197,23 +193,23 @@ theorem sublist_of_cons_sublist : a :: l₁ <+ l₂ → l₁ <+ l₂ :=

@[simp, grind =]
theorem cons_sublist_cons : a :: l₁ <+ a :: l₂ ↔ l₁ <+ l₂ :=
  ⟨fun | .cons _ s => sublist_of_cons_sublist s | .cons_cons _ s => s, .cons_cons _⟩
  ⟨fun | .cons _ s => sublist_of_cons_sublist s | .cons₂ _ s => s, .cons₂ _⟩

theorem sublist_or_mem_of_sublist (h : l <+ l₁ ++ a :: l₂) : l <+ l₁ ++ l₂ ∨ a ∈ l := by
  induction l₁ generalizing l with
  | nil => match h with
    | .cons _ h => exact .inl h
    | .cons_cons _ h => exact .inr (.head ..)
    | .cons₂ _ h => exact .inr (.head ..)
  | cons b l₁ IH =>
    match h with
    | .cons _ h => exact (IH h).imp_left (Sublist.cons _)
    | .cons_cons _ h => exact (IH h).imp (Sublist.cons_cons _) (.tail _)
    | .cons₂ _ h => exact (IH h).imp (Sublist.cons₂ _) (.tail _)

@[grind →] theorem Sublist.subset : l₁ <+ l₂ → l₁ ⊆ l₂
  | .slnil, _, h => h
  | .cons _ s, _, h => .tail _ (s.subset h)
  | .cons_cons .., _, .head .. => .head ..
  | .cons_cons _ s, _, .tail _ h => .tail _ (s.subset h)
  | .cons₂ .., _, .head .. => .head ..
  | .cons₂ _ s, _, .tail _ h => .tail _ (s.subset h)

protected theorem Sublist.mem (hx : a ∈ l₁) (hl : l₁ <+ l₂) : a ∈ l₂ :=
  hl.subset hx
@@ -249,7 +245,7 @@ theorem eq_nil_of_sublist_nil {l : List α} (s : l <+ []) : l = [] :=
theorem Sublist.length_le : l₁ <+ l₂ → length l₁ ≤ length l₂
  | .slnil => Nat.le_refl 0
  | .cons _l s => le_succ_of_le (length_le s)
  | .cons_cons _ s => succ_le_succ (length_le s)
  | .cons₂ _ s => succ_le_succ (length_le s)

grind_pattern Sublist.length_le => l₁ <+ l₂, length l₁
grind_pattern Sublist.length_le => l₁ <+ l₂, length l₂
@@ -257,7 +253,7 @@ grind_pattern Sublist.length_le => l₁ <+ l₂, length l₂
theorem Sublist.eq_of_length : l₁ <+ l₂ → length l₁ = length l₂ → l₁ = l₂
  | .slnil, _ => rfl
  | .cons a s, h => nomatch Nat.not_lt.2 s.length_le (h ▸ lt_succ_self _)
  | .cons_cons a s, h => by rw [s.eq_of_length (succ.inj h)]
  | .cons₂ a s, h => by rw [s.eq_of_length (succ.inj h)]

theorem Sublist.eq_of_length_le (s : l₁ <+ l₂) (h : length l₂ ≤ length l₁) : l₁ = l₂ :=
  s.eq_of_length <| Nat.le_antisymm s.length_le h
@@ -279,7 +275,7 @@ grind_pattern tail_sublist => tail l <+ _
protected theorem Sublist.tail : ∀ {l₁ l₂ : List α}, l₁ <+ l₂ → tail l₁ <+ tail l₂
  | _, _, slnil => .slnil
  | _, _, Sublist.cons _ h => (tail_sublist _).trans h
  | _, _, Sublist.cons_cons _ h => h
  | _, _, Sublist.cons₂ _ h => h

@[grind →]
theorem Sublist.of_cons_cons {l₁ l₂ : List α} {a b : α} (h : a :: l₁ <+ b :: l₂) : l₁ <+ l₂ :=
@@ -291,8 +287,8 @@ protected theorem Sublist.map (f : α → β) {l₁ l₂} (s : l₁ <+ l₂) : m
  | slnil => simp
  | cons a s ih =>
    simpa using cons (f a) ih
  | cons_cons a s ih =>
    simpa using cons_cons (f a) ih
  | cons₂ a s ih =>
    simpa using cons₂ (f a) ih

grind_pattern Sublist.map => l₁ <+ l₂, map f l₁
grind_pattern Sublist.map => l₁ <+ l₂, map f l₂
@@ -342,7 +338,7 @@ theorem sublist_filterMap_iff {l₁ : List β} {f : α → Option β} :
    cases h with
    | cons _ h =>
      exact ⟨l', h, rfl⟩
    | cons_cons _ h =>
    | cons₂ _ h =>
      rename_i l'
      exact ⟨l', h, by simp_all⟩
  · constructor
@@ -351,10 +347,10 @@ theorem sublist_filterMap_iff {l₁ : List β} {f : α → Option β} :
      | cons _ h =>
        obtain ⟨l', s, rfl⟩ := ih.1 h
        exact ⟨l', Sublist.cons a s, rfl⟩
      | cons_cons _ h =>
      | cons₂ _ h =>
        rename_i l'
        obtain ⟨l', s, rfl⟩ := ih.1 h
        refine ⟨a :: l', Sublist.cons_cons a s, ?_⟩
        refine ⟨a :: l', Sublist.cons₂ a s, ?_⟩
        rwa [filterMap_cons_some]
    · rintro ⟨l', h, rfl⟩
      replace h := h.filterMap f
@@ -373,7 +369,7 @@ theorem sublist_filter_iff {l₁ : List α} {p : α → Bool} :

theorem sublist_append_left : ∀ l₁ l₂ : List α, l₁ <+ l₁ ++ l₂
  | [], _ => nil_sublist _
  | _ :: l₁, l₂ => (sublist_append_left l₁ l₂).cons_cons _
  | _ :: l₁, l₂ => (sublist_append_left l₁ l₂).cons₂ _

grind_pattern sublist_append_left => Sublist, l₁ ++ l₂

@@ -386,7 +382,7 @@ grind_pattern sublist_append_right => Sublist, l₁ ++ l₂
@[simp, grind =] theorem singleton_sublist {a : α} {l} : [a] <+ l ↔ a ∈ l := by
  refine ⟨fun h => h.subset (mem_singleton_self _), fun h => ?_⟩
  obtain ⟨_, _, rfl⟩ := append_of_mem h
  exact ((nil_sublist _).cons_cons _).trans (sublist_append_right ..)
  exact ((nil_sublist _).cons₂ _).trans (sublist_append_right ..)

@[simp] theorem sublist_append_of_sublist_left (s : l <+ l₁) : l <+ l₁ ++ l₂ :=
  s.trans <| sublist_append_left ..
@@ -408,7 +404,7 @@ theorem Sublist.append_left : l₁ <+ l₂ → ∀ l, l ++ l₁ <+ l ++ l₂ :=
theorem Sublist.append_right : l₁ <+ l₂ → ∀ l, l₁ ++ l <+ l₂ ++ l
  | .slnil, _ => Sublist.refl _
  | .cons _ h, _ => (h.append_right _).cons _
  | .cons_cons _ h, _ => (h.append_right _).cons_cons _
  | .cons₂ _ h, _ => (h.append_right _).cons₂ _

theorem Sublist.append (hl : l₁ <+ l₂) (hr : r₁ <+ r₂) : l₁ ++ r₁ <+ l₂ ++ r₂ :=
  (hl.append_right _).trans ((append_sublist_append_left _).2 hr)
@@ -422,10 +418,10 @@ theorem sublist_cons_iff {a : α} {l l'} :
  · intro h
    cases h with
    | cons _ h => exact Or.inl h
    | cons_cons _ h => exact Or.inr ⟨_, rfl, h⟩
    | cons₂ _ h => exact Or.inr ⟨_, rfl, h⟩
  · rintro (h | ⟨r, rfl, h⟩)
    · exact h.cons _
    · exact h.cons_cons _
    · exact h.cons₂ _

@[grind =]
theorem cons_sublist_iff {a : α} {l l'} :
@@ -439,7 +435,7 @@ theorem cons_sublist_iff {a : α} {l l'} :
    | cons _ w =>
      obtain ⟨r₁, r₂, rfl, h₁, h₂⟩ := ih.1 w
      exact ⟨a' :: r₁, r₂, by simp, mem_cons_of_mem a' h₁, h₂⟩
    | cons_cons _ w =>
    | cons₂ _ w =>
      exact ⟨[a], l', by simp, mem_singleton_self _, w⟩
  · rintro ⟨r₁, r₂, w, h₁, h₂⟩
    rw [w, ← singleton_append]
@@ -462,7 +458,7 @@ theorem sublist_append_iff {l : List α} :
    | cons _ w =>
      obtain ⟨l₁, l₂, rfl, w₁, w₂⟩ := ih.1 w
      exact ⟨l₁, l₂, rfl, Sublist.cons r w₁, w₂⟩
    | cons_cons _ w =>
    | cons₂ _ w =>
      rename_i l
      obtain ⟨l₁, l₂, rfl, w₁, w₂⟩ := ih.1 w
      refine ⟨r :: l₁, l₂, by simp, cons_sublist_cons.mpr w₁, w₂⟩
@@ -470,9 +466,9 @@ theorem sublist_append_iff {l : List α} :
    cases w₁ with
    | cons _ w₁ =>
      exact Sublist.cons _ (Sublist.append w₁ w₂)
    | cons_cons _ w₁ =>
    | cons₂ _ w₁ =>
      rename_i l
      exact Sublist.cons_cons _ (Sublist.append w₁ w₂)
      exact Sublist.cons₂ _ (Sublist.append w₁ w₂)

theorem append_sublist_iff {l₁ l₂ : List α} :
    l₁ ++ l₂ <+ r ↔ ∃ r₁ r₂, r = r₁ ++ r₂ ∧ l₁ <+ r₁ ∧ l₂ <+ r₂ := by
@@ -520,7 +516,7 @@ theorem Sublist.middle {l : List α} (h : l <+ l₁ ++ l₂) (a : α) : l <+ l
theorem Sublist.reverse : l₁ <+ l₂ → l₁.reverse <+ l₂.reverse
  | .slnil => Sublist.refl _
  | .cons _ h => by rw [reverse_cons]; exact sublist_append_of_sublist_left h.reverse
  | .cons_cons _ h => by rw [reverse_cons, reverse_cons]; exact h.reverse.append_right _
  | .cons₂ _ h => by rw [reverse_cons, reverse_cons]; exact h.reverse.append_right _

@[simp, grind =] theorem reverse_sublist : l₁.reverse <+ l₂.reverse ↔ l₁ <+ l₂ :=
  ⟨fun h => l₁.reverse_reverse ▸ l₂.reverse_reverse ▸ h.reverse, Sublist.reverse⟩
@@ -562,7 +558,7 @@ theorem sublist_replicate_iff : l <+ replicate m a ↔ ∃ n, n ≤ m ∧ l = re
    obtain ⟨n, le, rfl⟩ := ih.1 (sublist_of_cons_sublist w)
    obtain rfl := (mem_replicate.1 (mem_of_cons_sublist w)).2
    exact ⟨n+1, Nat.add_le_add_right le 1, rfl⟩
  | cons_cons _ w =>
  | cons₂ _ w =>
    obtain ⟨n, le, rfl⟩ := ih.1 w
    refine ⟨n+1, Nat.add_le_add_right le 1, by simp [replicate_succ]⟩
  · rintro ⟨n, le, w⟩
@@ -648,7 +644,7 @@ theorem flatten_sublist_iff {L : List (List α)} {l} :
  cases h_sub
  case cons h_sub =>
    exact isSublist_iff_sublist.mpr h_sub
  case cons_cons =>
  case cons₂ =>
    contradiction

instance [DecidableEq α] (l₁ l₂ : List α) : Decidable (l₁ <+ l₂) :=

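The `Decidable (l₁ <+ l₂)` instance above makes concrete sublist facts provable by `decide`; a small sketch, assuming the `<+` sublist notation and this decidability instance from core:

```lean
-- `[1, 3]` arises from `[1, 2, 3]` by deleting `2`, so it is a sublist:
example : [1, 3] <+ [1, 2, 3] := by decide
-- `[3, 1]` reorders elements, so it is not:
example : ¬ ([3, 1] <+ [1, 2, 3]) := by decide
```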
@@ -393,7 +393,7 @@ theorem isPrefixOfAux_toArray_zero [BEq α] (l₁ l₂ : List α) (hle : l₁.le
  | [], _ => rw [dif_neg] <;> simp
  | _::_, [] => simp at hle
  | a::l₁, b::l₂ =>
    simp [isPrefixOf_cons_cons, isPrefixOfAux_toArray_succ', isPrefixOfAux_toArray_zero]
    simp [isPrefixOf_cons₂, isPrefixOfAux_toArray_succ', isPrefixOfAux_toArray_zero]

@[simp, grind =] theorem isPrefixOf_toArray [BEq α] (l₁ l₂ : List α) :
    l₁.toArray.isPrefixOf l₂.toArray = l₁.isPrefixOf l₂ := by
@@ -407,7 +407,7 @@ theorem isPrefixOfAux_toArray_zero [BEq α] (l₁ l₂ : List α) (hle : l₁.le
  cases l₂ with
  | nil => simp
  | cons b l₂ =>
    simp only [isPrefixOf_cons_cons, Bool.and_eq_false_imp]
    simp only [isPrefixOf_cons₂, Bool.and_eq_false_imp]
    intro w
    rw [ih]
    simp_all

@@ -478,7 +478,7 @@ instance : Std.Trichotomous (. < . : Nat → Nat → Prop) where

set_option linter.missingDocs false in
@[deprecated Nat.instTrichotomousLt (since := "2025-10-27")]
def Nat.instAntisymmNotLt : Std.Antisymm (¬ . < . : Nat → Nat → Prop) where
theorem Nat.instAntisymmNotLt : Std.Antisymm (¬ . < . : Nat → Nat → Prop) where
  antisymm := Nat.instTrichotomousLt.trichotomous

protected theorem add_le_add_left {n m : Nat} (h : n ≤ m) (k : Nat) : k + n ≤ k + m :=

@@ -82,15 +82,6 @@ theorem get_inj {o1 o2 : Option α} {h1} {h2} :
  match o1, o2, h1, h2 with
  | some a, some b, _, _ => simp only [Option.get_some, Option.some.injEq]

theorem getD_inj {o₁ o₂ : Option α} (h₁ : o₁.isSome) (h₂ : o₂.isSome) {fallback} :
    o₁.getD fallback = o₂.getD fallback ↔ o₁ = o₂ := by
  match o₁, o₂, h₁, h₂ with
  | some a, some b, _, _ => simp only [Option.getD_some, Option.some.injEq]

theorem get!_inj [Inhabited α] {o₁ o₂ : Option α} (h₁ : o₁.isSome) (h₂ : o₂.isSome) :
    o₁.get! = o₂.get! ↔ o₁ = o₂ := by
  simpa [get!_eq_getD] using getD_inj h₁ h₂

theorem mem_unique {o : Option α} {a b : α} (ha : a ∈ o) (hb : b ∈ o) : a = b :=
  some.inj <| ha ▸ hb

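`getD_inj` says that `getD` is injective once both options are known to be `some`; the behaviour of `Option.getD` itself can be sketched as:

```lean
-- `getD` returns the wrapped value for `some` and the fallback for `none`:
#guard (some 5).getD 0 == 5
#guard (none : Option Nat).getD 0 == 0
```

So two options that are both `isSome` and agree under `getD` must wrap the same value, which is exactly the statement of `getD_inj`.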
@@ -147,7 +147,7 @@ public theorem LawfulOrderMin.of_min_le {α : Type u} [Min α] [LE α]
This lemma characterizes in terms of `LE α` when a `Max α` instance "behaves like a supremum
operator".
-/
public def LawfulOrderSup.of_le {α : Type u} [Max α] [LE α]
public theorem LawfulOrderSup.of_le {α : Type u} [Max α] [LE α]
    (max_le_iff : ∀ a b c : α, max a b ≤ c ↔ a ≤ c ∧ b ≤ c) : LawfulOrderSup α where
  max_le_iff := max_le_iff

@@ -159,7 +159,7 @@ instances.

The produced instance entails `LawfulOrderSup α` and `MaxEqOr α`.
-/
public def LawfulOrderMax.of_max_le_iff {α : Type u} [Max α] [LE α]
public theorem LawfulOrderMax.of_max_le_iff {α : Type u} [Max α] [LE α]
    (max_le_iff : ∀ a b c : α, max a b ≤ c ↔ a ≤ c ∧ b ≤ c := by exact LawfulOrderInf.le_min_iff)
    (max_eq_or : ∀ a b : α, max a b = a ∨ max a b = b := by exact MaxEqOr.max_eq_or) :
    LawfulOrderMax α where
@@ -196,7 +196,7 @@ Creates a *total* `LE α` instance from an `LT α` instance.

This only makes sense for asymmetric `LT α` instances (see `Std.Asymm`).
-/
@[inline]
@[inline, implicit_reducible, expose]
public def _root_.LE.ofLT (α : Type u) [LT α] : LE α where
  le a b := ¬ b < a

@@ -208,7 +208,7 @@ public instance LawfulOrderLT.of_lt {α : Type u} [LT α] [i : Asymm (α := α)
    haveI := LE.ofLT α
    LawfulOrderLT α :=
  letI := LE.ofLT α
  { lt_iff a b := by simp +instances [LE.ofLT, LE.le]; apply Asymm.asymm }
  { lt_iff a b := by simp +instances [LE.le]; apply Asymm.asymm }

/--
If an `LT α` instance is asymmetric and its negation is transitive, then `LE.ofLT α` represents a
@@ -253,7 +253,7 @@ public theorem LawfulOrderInf.of_lt {α : Type u} [Min α] [LT α]
  letI := LE.ofLT α
  { le_min_iff a b c := by
      open Classical in
      simp +instances only [LE.ofLT, LE.le]
      simp +instances only [LE.le]
      simp [← not_or, Decidable.not_iff_not]
      simpa [Decidable.imp_iff_not_or] using min_lt_iff a b c }

@@ -276,14 +276,14 @@ public theorem LawfulOrderMin.of_lt {α : Type u} [Min α] [LT α]
This lemma characterizes in terms of `LT α` when a `Max α` instance
"behaves like an supremum operator" with respect to `LE.ofLT α`.
-/
public def LawfulOrderSup.of_lt {α : Type u} [Max α] [LT α]
public theorem LawfulOrderSup.of_lt {α : Type u} [Max α] [LT α]
    (lt_max_iff : ∀ a b c : α, c < max a b ↔ c < a ∨ c < b) :
    haveI := LE.ofLT α
    LawfulOrderSup α :=
  letI := LE.ofLT α
  { max_le_iff a b c := by
      open Classical in
      simp +instances only [LE.ofLT, LE.le]
      simp +instances only [LE.le]
      simp [← not_or, Decidable.not_iff_not]
      simpa [Decidable.imp_iff_not_or] using lt_max_iff a b c }

@@ -293,7 +293,7 @@ Derives a `LawfulOrderMax α` instance for `LE.ofLT` from two properties involvi

The produced instance entails `LawfulOrderSup α` and `MaxEqOr α`.
-/
public def LawfulOrderMax.of_lt {α : Type u} [Max α] [LT α]
public theorem LawfulOrderMax.of_lt {α : Type u} [Max α] [LT α]
    (lt_max_iff : ∀ a b c : α, c < max a b ↔ c < a ∨ c < b)
    (max_eq_or : ∀ a b : α, max a b = a ∨ max a b = b) :
    haveI := LE.ofLT α

@@ -26,7 +26,7 @@ public def _root_.LE.ofOrd (α : Type u) [Ord α] : LE α where
/--
Creates an `DecidableLE α` instance using a well-behaved `Ord α` instance.
-/
@[inline, expose]
@[inline, expose, implicit_reducible]
public def _root_.DecidableLE.ofOrd (α : Type u) [LE α] [Ord α] [LawfulOrderOrd α] :
    DecidableLE α :=
  fun a b => match h : (compare a b).isLE with
@@ -93,7 +93,7 @@ grind_pattern compare_ne_eq => compare a b, Ordering.eq where
/--
Creates a `DecidableLT α` instance using a well-behaved `Ord α` instance.
-/
@[inline, expose]
@[inline, expose, implicit_reducible]
public def _root_.DecidableLT.ofOrd (α : Type u) [LE α] [LT α] [Ord α] [LawfulOrderOrd α]
    [LawfulOrderLT α] :
    DecidableLT α :=

@@ -39,8 +39,8 @@ public theorem minOn_id [Min α] [LE α] [DecidableLE α] [LawfulOrderLeftLeanin

public theorem maxOn_id [Max α] [LE α] [DecidableLE α] [LawfulOrderLeftLeaningMax α] {x y : α} :
    maxOn id x y = max x y := by
  letI : LE α := (inferInstanceAs (LE α)).opposite
  letI : Min α := (inferInstanceAs (Max α)).oppositeMin
  letI : LE α := (inferInstance : LE α).opposite
  letI : Min α := (inferInstance : Max α).oppositeMin
  simp [maxOn, minOn_id, Max.min_oppositeMin, this]

public theorem minOn_eq_or [LE β] [DecidableLE β] {f : α → β} {x y : α} :
@@ -168,32 +168,32 @@ public theorem maxOn_eq_right_of_lt
    [LE β] [DecidableLE β] [LT β] [Total (α := β) (· ≤ ·)] [LawfulOrderLT β]
    {f : α → β} {x y : α} (h : f x < f y) :
    maxOn f x y = y :=
  letI : LE β := (inferInstanceAs (LE β)).opposite
  letI : LT β := (inferInstanceAs (LT β)).opposite
  letI : LE β := (inferInstance : LE β).opposite
  letI : LT β := (inferInstance : LT β).opposite
  minOn_eq_right_of_lt (h := by simpa [LT.lt_opposite_iff] using h) ..

public theorem left_le_apply_maxOn [le : LE β] [DecidableLE β] [IsLinearPreorder β] {f : α → β}
    {x y : α} : f x ≤ f (maxOn f x y) := by
  rw [maxOn_eq_minOn]
  letI : LE β := (inferInstanceAs (LE β)).opposite
  letI : LE β := (inferInstance : LE β).opposite
  simpa only [LE.le_opposite_iff] using apply_minOn_le_left (f := f) ..

public theorem right_le_apply_maxOn [LE β] [DecidableLE β] [IsLinearPreorder β] {f : α → β}
    {x y : α} : f y ≤ f (maxOn f x y) := by
  rw [maxOn_eq_minOn]
  letI : LE β := (inferInstanceAs (LE β)).opposite
  letI : LE β := (inferInstance : LE β).opposite
  simpa only [LE.le_opposite_iff] using apply_minOn_le_right (f := f)

public theorem apply_maxOn_le_iff [LE β] [DecidableLE β] [IsLinearPreorder β] {f : α → β}
    {x y : α} {b : β} :
    f (maxOn f x y) ≤ b ↔ f x ≤ b ∧ f y ≤ b := by
  rw [maxOn_eq_minOn]
  letI : LE β := (inferInstanceAs (LE β)).opposite
  letI : LE β := (inferInstance : LE β).opposite
  simpa only [LE.le_opposite_iff] using le_apply_minOn_iff (f := f)

public theorem maxOn_assoc [LE β] [DecidableLE β] [IsLinearPreorder β] {f : α → β}
    {x y z : α} : maxOn f (maxOn f x y) z = maxOn f x (maxOn f y z) :=
  letI : LE β := (inferInstanceAs (LE β)).opposite
  letI : LE β := (inferInstance : LE β).opposite
  minOn_assoc (f := f)

public instance [LE β] [DecidableLE β] [IsLinearPreorder β] {f : α → β} :
@@ -203,8 +203,8 @@ public instance [LE β] [DecidableLE β] [IsLinearPreorder β] {f : α → β} :

public theorem max_apply [LE β] [DecidableLE β] [Max β] [LawfulOrderLeftLeaningMax β]
    {f : α → β} {x y : α} : max (f x) (f y) = f (maxOn f x y) := by
  letI : LE β := (inferInstanceAs (LE β)).opposite
  letI : Min β := (inferInstanceAs (Max β)).oppositeMin
  letI : LE β := (inferInstance : LE β).opposite
  letI : Min β := (inferInstance : Max β).oppositeMin
  simpa [Max.min_oppositeMin] using min_apply (f := f)

public theorem apply_maxOn [LE β] [DecidableLE β] [Max β] [LawfulOrderLeftLeaningMax β]

@@ -44,7 +44,7 @@ def min' [LE α] [DecidableLE α] (a b : α) : α :=

open scoped Std.OppositeOrderInstances in
def max' [LE α] [DecidableLE α] (a b : α) : α :=
  letI : LE α := (inferInstanceAs (LE α)).opposite
  letI : LE α := (inferInstance : LE α).opposite
  -- `DecidableLE` for the opposite order is derived automatically via `OppositeOrderInstances`
  min' a b
```
@@ -52,7 +52,8 @@ def max' [LE α] [DecidableLE α] (a b : α) : α :=
Without the `open scoped` command, Lean would not find the required {lit}`DecidableLE α`
instance for the opposite order.
-/
@[implicit_reducible] def LE.opposite (le : LE α) : LE α where
@[implicit_reducible]
def LE.opposite (le : LE α) : LE α where
  le a b := b ≤ a

theorem LE.opposite_def {le : LE α} :
@@ -89,6 +90,7 @@ example [LE α] [LT α] [Std.LawfulOrderLT α] [Std.IsLinearOrder α] {x y : α}
Without the `open scoped` command, Lean would not find the {lit}`LawfulOrderLT α`
and {lit}`IsLinearOrder α` instances for the opposite order that are required by {name}`not_le`.
-/
@[implicit_reducible]
def LT.opposite (lt : LT α) : LT α where
  lt a b := b < a

@@ -125,6 +127,7 @@ example [LE α] [DecidableLE α] [Min α] [Std.LawfulOrderLeftLeaningMin α] {a
Without the `open scoped` command, Lean would not find the {lit}`LawfulOrderLeftLeaningMax α`
instance for the opposite order that is required by {name}`max_eq_if`.
-/
@[implicit_reducible]
def Min.oppositeMax (min : Min α) : Max α where
  max a b := Min.min a b

@@ -161,6 +164,7 @@ example [LE α] [DecidableLE α] [Max α] [Std.LawfulOrderLeftLeaningMax α] {a
Without the `open scoped` command, Lean would not find the {lit}`LawfulOrderLeftLeaningMin α`
instance for the opposite order that is required by {name}`min_eq_if`.
-/
@[implicit_reducible]
def Max.oppositeMin (max : Max α) : Min α where
  min a b := Max.max a b

@@ -47,7 +47,7 @@ public instance instLawfulOrderBEqOfDecidableLE {α : Type u} [LE α] [Decidable
  beq_iff_le_and_ge := by simp [BEq.beq]

/-- If `LT` can be characterized in terms of a decidable `LE`, then `LT` is decidable either. -/
@[expose, instance_reducible]
@[inline, expose, implicit_reducible]
public def decidableLTOfLE {α : Type u} [LE α] {_ : LT α} [DecidableLE α] [LawfulOrderLT α] :
    DecidableLT α :=
  fun a b =>
@@ -171,7 +171,7 @@ automatically. If it fails, it is necessary to provide some of the fields manual
* Other proof obligations, namely `le_refl` and `le_trans`, can be omitted if `Refl` and `Trans`
  instances can be synthesized.
-/
@[expose, implicit_reducible]
@[inline, expose, implicit_reducible]
public def PreorderPackage.ofLE (α : Type u)
    (args : Packages.PreorderOfLEArgs α := by exact {}) : PreorderPackage α where
  toLE := args.le
@@ -256,7 +256,7 @@ automatically. If it fails, it is necessary to provide some of the fields manual
* Other proof obligations, namely `le_refl`, `le_trans` and `le_antisymm`, can be omitted if `Refl`,
  `Trans` and `Antisymm` instances can be synthesized.
-/
@[expose, implicit_reducible]
@[inline, expose, implicit_reducible]
public def PartialOrderPackage.ofLE (α : Type u)
    (args : Packages.PartialOrderOfLEArgs α := by exact {}) : PartialOrderPackage α where
  toPreorderPackage := .ofLE α args.toPreorderOfLEArgs
@@ -385,7 +385,7 @@ automatically. If it fails, it is necessary to provide some of the fields manual
* Other proof obligations, namely `le_total` and `le_trans`, can be omitted if `Total` and `Trans`
  instances can be synthesized.
-/
@[expose, implicit_reducible]
@[inline, expose, implicit_reducible]
public def LinearPreorderPackage.ofLE (α : Type u)
    (args : Packages.LinearPreorderOfLEArgs α := by exact {}) : LinearPreorderPackage α where
  toPreorderPackage := .ofLE α args.toPreorderOfLEArgs
@@ -487,7 +487,7 @@ automatically. If it fails, it is necessary to provide some of the fields manual
* Other proof obligations, namely `le_total`, `le_trans` and `le_antisymm`, can be omitted if
  `Total`, `Trans` and `Antisymm` instances can be synthesized.
-/
@[expose, implicit_reducible]
@[inline, expose, implicit_reducible]
public def LinearOrderPackage.ofLE (α : Type u)
    (args : Packages.LinearOrderOfLEArgs α := by exact {}) : LinearOrderPackage α where
  toLinearPreorderPackage := .ofLE α args.toLinearPreorderOfLEArgs
@@ -647,7 +647,7 @@ automatically. If it fails, it is necessary to provide some of the fields manual
* Other proof obligations, for example `transOrd`, can be omitted if a matching instance can be
  synthesized.
-/
@[expose, instance_reducible]
@[inline, expose, implicit_reducible]
public def LinearPreorderPackage.ofOrd (α : Type u)
    (args : Packages.LinearPreorderOfOrdArgs α := by exact {}) : LinearPreorderPackage α :=
  letI := args.ord
@@ -793,7 +793,7 @@ automatically. If it fails, it is necessary to provide some of the fields manual
* Other proof obligations, such as `transOrd`, can be omitted if matching instances can be
  synthesized.
-/
@[expose, instance_reducible]
@[inline, expose, implicit_reducible]
public def LinearOrderPackage.ofOrd (α : Type u)
    (args : Packages.LinearOrderOfOrdArgs α := by exact {}) : LinearOrderPackage α :=
  -- set_option backward.isDefEq.respectTransparency false in

@@ -597,8 +597,7 @@ instance Iterator.instLawfulIteratorLoop [UpwardEnumerable α] [LE α] [Decidabl
    LawfulIteratorLoop (Rxc.Iterator α) Id n where
  lawful := by
    intro lift instLawfulMonadLiftFunction γ it init Pl wf f
    simp +instances only [IteratorLoop.defaultImplementation, IteratorLoop.forIn,
      IterM.DefaultConsumers.forIn'_eq_wf Pl wf]
    simp +instances only [IteratorLoop.forIn, IterM.DefaultConsumers.forIn'_eq_wf Pl wf]
    rw [IterM.DefaultConsumers.forIn'.wf]
    split; rotate_left
    · simp only [IterM.step_eq,
@@ -1173,8 +1172,7 @@ instance Iterator.instLawfulIteratorLoop [UpwardEnumerable α] [LT α] [Decidabl
    LawfulIteratorLoop (Rxo.Iterator α) Id n where
  lawful := by
    intro lift instLawfulMonadLiftFunction γ it init Pl wf f
    simp +instances only [IteratorLoop.defaultImplementation, IteratorLoop.forIn,
      IterM.DefaultConsumers.forIn'_eq_wf Pl wf]
    simp +instances only [IteratorLoop.forIn, IterM.DefaultConsumers.forIn'_eq_wf Pl wf]
    rw [IterM.DefaultConsumers.forIn'.wf]
    split; rotate_left
    · simp [IterM.step_eq, Monadic.step, Internal.LawfulMonadLiftBindFunction.liftBind_pure (liftBind := lift)]
@@ -1639,8 +1637,7 @@ instance Iterator.instLawfulIteratorLoop [UpwardEnumerable α]
    LawfulIteratorLoop (Rxi.Iterator α) Id n where
  lawful := by
    intro lift instLawfulMonadLiftFunction γ it init Pl wf f
    simp +instances only [IteratorLoop.defaultImplementation, IteratorLoop.forIn,
      IterM.DefaultConsumers.forIn'_eq_wf Pl wf]
    simp +instances only [IteratorLoop.forIn, IterM.DefaultConsumers.forIn'_eq_wf Pl wf]
    rw [IterM.DefaultConsumers.forIn'.wf]
    split; rotate_left
    · simp [Monadic.step_eq_step, Monadic.step, Internal.LawfulMonadLiftBindFunction.liftBind_pure]

@@ -438,6 +438,7 @@ protected theorem UpwardEnumerable.le_iff {α : Type u} [LE α] [UpwardEnumerabl
    [LawfulUpwardEnumerableLE α] {a b : α} : a ≤ b ↔ UpwardEnumerable.LE a b :=
  LawfulUpwardEnumerableLE.le_iff a b

@[expose, implicit_reducible]
def UpwardEnumerable.instLETransOfLawfulUpwardEnumerableLE {α : Type u} [LE α]
    [UpwardEnumerable α] [LawfulUpwardEnumerable α] [LawfulUpwardEnumerableLE α] :
    Trans (α := α) (· ≤ ·) (· ≤ ·) (· ≤ ·) where
@@ -502,12 +503,13 @@ protected theorem UpwardEnumerable.lt_succ_iff {α : Type u} [UpwardEnumerable
    ← succMany?_eq_some_iff_succMany] at hn
  exact ⟨n, hn⟩

@[expose, implicit_reducible]
def UpwardEnumerable.instLTTransOfLawfulUpwardEnumerableLT {α : Type u} [LT α]
    [UpwardEnumerable α] [LawfulUpwardEnumerable α] [LawfulUpwardEnumerableLT α] :
    Trans (α := α) (· < ·) (· < ·) (· < ·) where
  trans := by simpa [UpwardEnumerable.lt_iff] using @UpwardEnumerable.lt_trans

def UpwardEnumerable.instLawfulOrderLTOfLawfulUpwardEnumerableLT {α : Type u} [LT α] [LE α]
theorem UpwardEnumerable.instLawfulOrderLTOfLawfulUpwardEnumerableLT {α : Type u} [LT α] [LE α]
    [UpwardEnumerable α] [LawfulUpwardEnumerable α] [LawfulUpwardEnumerableLT α]
    [LawfulUpwardEnumerableLE α] :
    LawfulOrderLT α where

@@ -369,12 +369,6 @@ theorem String.ofList_toList {s : String} : String.ofList s.toList = s := by
theorem String.asString_data {b : String} : String.ofList b.toList = b :=
  String.ofList_toList

@[simp]
theorem String.ofList_comp_toList : String.ofList ∘ String.toList = id := by ext; simp

@[simp]
theorem String.toList_comp_ofList : String.toList ∘ String.ofList = id := by ext; simp

theorem String.ofList_injective {l₁ l₂ : List Char} (h : String.ofList l₁ = String.ofList l₂) : l₁ = l₂ := by
  simpa using congrArg String.toList h

@@ -1531,11 +1525,6 @@ def Slice.Pos.toReplaceEnd {s : Slice} (p₀ : s.Pos) (pos : s.Pos) (h : pos ≤
theorem Slice.Pos.offset_sliceTo {s : Slice} {p₀ : s.Pos} {pos : s.Pos} {h : pos ≤ p₀} :
    (sliceTo p₀ pos h).offset = pos.offset := (rfl)

@[simp]
theorem Slice.Pos.sliceTo_inj {s : Slice} {p₀ : s.Pos} {pos pos' : s.Pos} {h h'} :
    p₀.sliceTo pos h = p₀.sliceTo pos' h' ↔ pos = pos' := by
  simp [Pos.ext_iff]

@[simp]
theorem Slice.Pos.ofSliceTo_startPos {s : Slice} {pos : s.Pos} :
    ofSliceTo (s.sliceTo pos).startPos = s.startPos := by
@@ -1724,15 +1713,14 @@ def pos! (s : String) (off : Pos.Raw) : s.Pos :=
|
||||
@[simp]
|
||||
theorem offset_pos {s : String} {off : Pos.Raw} {h} : (s.pos off h).offset = off := rfl
|
||||
|
||||
/-- Constructs a valid position on `t` from a valid position on `s` and a proof that
|
||||
`s.copy = t.copy`. -/
|
||||
/-- Constructs a valid position on `t` from a valid position on `s` and a proof that `s = t`. -/
|
||||
@[inline]
|
||||
def Slice.Pos.cast {s t : Slice} (pos : s.Pos) (h : s.copy = t.copy) : t.Pos where
|
||||
def Slice.Pos.cast {s t : Slice} (pos : s.Pos) (h : s = t) : t.Pos where
|
||||
offset := pos.offset
|
||||
isValidForSlice := Pos.Raw.isValid_copy_iff.mp (h ▸ Pos.Raw.isValid_copy_iff.mpr pos.isValidForSlice)
|
||||
isValidForSlice := h ▸ pos.isValidForSlice
|
||||
|
||||
@[simp]
|
||||
theorem Slice.Pos.offset_cast {s t : Slice} {pos : s.Pos} {h : s.copy = t.copy} :
|
||||
theorem Slice.Pos.offset_cast {s t : Slice} {pos : s.Pos} {h : s = t} :
|
||||
(pos.cast h).offset = pos.offset := (rfl)
|
||||
|
||||
@[simp]
|
||||
@@ -1740,14 +1728,14 @@ theorem Slice.Pos.cast_rfl {s : Slice} {pos : s.Pos} : pos.cast rfl = pos :=
|
||||
Slice.Pos.ext (by simp)
|
||||
|
||||
@[simp]
|
||||
theorem Slice.Pos.cast_le_cast_iff {s t : Slice} {pos pos' : s.Pos} {h : s.copy = t.copy} :
|
||||
theorem Slice.Pos.cast_le_cast_iff {s t : Slice} {pos pos' : s.Pos} {h : s = t} :
|
||||
pos.cast h ≤ pos'.cast h ↔ pos ≤ pos' := by
|
||||
simp [Slice.Pos.le_iff]
|
||||
cases h; simp
|
||||
|
||||
@[simp]
|
||||
theorem Slice.Pos.cast_lt_cast_iff {s t : Slice} {pos pos' : s.Pos} {h : s.copy = t.copy} :
|
||||
theorem Slice.Pos.cast_lt_cast_iff {s t : Slice} {pos pos' : s.Pos} {h : s = t} :
|
||||
pos.cast h < pos'.cast h ↔ pos < pos' := by
|
||||
simp [Slice.Pos.lt_iff]
|
||||
cases h; simp
|
||||
|
||||
/-- Constructs a valid position on `t` from a valid position on `s` and a proof that `s = t`. -/
|
||||
@[inline]
|
||||
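The hunks above change `Slice.Pos.cast` to take a plain equality `s = t` instead of `s.copy = t.copy`. As an editorial illustration (not part of the diff; it assumes the post-change API shown above), the cast is definitionally offset-preserving:

```lean
-- Editorial sketch, assuming the new `h : s = t` signature of `Slice.Pos.cast`:
-- casting a position along an equality of slices leaves its byte offset unchanged,
-- which is exactly what `Slice.Pos.offset_cast` records.
example {s t : String.Slice} (pos : s.Pos) (h : s = t) :
    (pos.cast h).offset = pos.offset :=
  rfl
```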
@@ -1978,7 +1966,6 @@ theorem Pos.ne_of_lt {s : String} {p q : s.Pos} : p < q → p ≠ q := by
theorem Pos.lt_of_lt_of_le {s : String} {p q r : s.Pos} : p < q → q ≤ r → p < r := by
  simpa [Pos.lt_iff, Pos.le_iff] using Pos.Raw.lt_of_lt_of_le

@[simp]
theorem Pos.le_endPos {s : String} (p : s.Pos) : p ≤ s.endPos := by
  simpa [Pos.le_iff] using p.isValid.le_rawEndPos

@@ -2277,26 +2264,14 @@ theorem Slice.Pos.le_ofSliceFrom {s : Slice} {p₀ : s.Pos} {pos : (s.sliceFrom
    p₀ ≤ ofSliceFrom pos := by
  simp [Pos.le_iff, Pos.Raw.le_iff]

@[simp]
theorem Slice.Pos.ofSliceFrom_lt_ofSliceFrom_iff {s : Slice} {p : s.Pos}
    {q r : (s.sliceFrom p).Pos} : Slice.Pos.ofSliceFrom q < Slice.Pos.ofSliceFrom r ↔ q < r := by
  simp [Slice.Pos.lt_iff, Pos.Raw.lt_iff]

@[simp]
theorem Slice.Pos.ofSliceFrom_le_ofSliceFrom_iff {s : Slice} {p : s.Pos}
    {q r : (s.sliceFrom p).Pos} : Slice.Pos.ofSliceFrom q ≤ Slice.Pos.ofSliceFrom r ↔ q ≤ r := by
  simp [Slice.Pos.le_iff, Pos.Raw.le_iff]

@[simp]
theorem Pos.ofSliceFrom_lt_ofSliceFrom_iff {s : String} {p : s.Pos}
    {q r : (s.sliceFrom p).Pos} : Pos.ofSliceFrom q < Pos.ofSliceFrom r ↔ q < r := by
  simp [Pos.lt_iff, Slice.Pos.lt_iff, Pos.Raw.lt_iff]

@[simp]
theorem Pos.ofSliceFrom_le_ofSliceFrom_iff {s : String} {p : s.Pos}
    {q r : (s.sliceFrom p).Pos} : Pos.ofSliceFrom q ≤ Pos.ofSliceFrom r ↔ q ≤ r := by
  simp [Pos.le_iff, Slice.Pos.le_iff, Pos.Raw.le_iff]

theorem Pos.get_eq_get_ofSliceFrom {s : String} {p₀ : s.Pos}
    {pos : (s.sliceFrom p₀).Pos} {h} :
    pos.get h = (ofSliceFrom pos).get (by rwa [← ofSliceFrom_endPos, ne_eq, ofSliceFrom_inj]) := by
@@ -2360,16 +2335,6 @@ theorem Slice.Pos.ofSliceTo_le {s : Slice} {p₀ : s.Pos} {pos : (s.sliceTo p₀
    ofSliceTo pos ≤ p₀ := by
  simpa [Pos.le_iff, Pos.Raw.le_iff] using pos.isValidForSlice.le_utf8ByteSize

@[simp]
theorem Pos.ofSliceTo_lt_ofSliceTo_iff {s : String} {p : s.Pos}
    {q r : (s.sliceTo p).Pos} : Pos.ofSliceTo q < Pos.ofSliceTo r ↔ q < r := by
  simp [Pos.lt_iff, Slice.Pos.lt_iff, Pos.Raw.lt_iff]

@[simp]
theorem Pos.ofSliceTo_le_ofSliceTo_iff {s : String} {p : s.Pos}
    {q r : (s.sliceTo p).Pos} : Pos.ofSliceTo q ≤ Pos.ofSliceTo r ↔ q ≤ r := by
  simp [Pos.le_iff, Slice.Pos.le_iff, Pos.Raw.le_iff]

/-- Given a position in `s` that is at most `p₀`, obtain the corresponding position in `s.sliceTo p₀`. -/
@[inline]
def Pos.sliceTo {s : String} (p₀ : s.Pos) (pos : s.Pos) (h : pos ≤ p₀) :
@@ -2386,11 +2351,6 @@ def Pos.toReplaceEnd {s : String} (p₀ : s.Pos) (pos : s.Pos) (h : pos ≤ p₀
theorem Pos.offset_sliceTo {s : String} {p₀ : s.Pos} {pos : s.Pos} {h : pos ≤ p₀} :
    (sliceTo p₀ pos h).offset = pos.offset := (rfl)

@[simp]
theorem Pos.sliceTo_inj {s : String} {p₀ : s.Pos} {pos pos' : s.Pos} {h h'} :
    p₀.sliceTo pos h = p₀.sliceTo pos' h' ↔ pos = pos' := by
  simp [Pos.ext_iff, Slice.Pos.ext_iff]

@[simp]
theorem Slice.Pos.ofSliceTo_sliceTo {s : Slice} {p₀ p : s.Pos} {h : p ≤ p₀} :
    Slice.Pos.ofSliceTo (p₀.sliceTo p h) = p := by
@@ -2459,27 +2419,6 @@ theorem Slice.Pos.ofSlice_inj {s : Slice} {p₀ p₁ : s.Pos} {h} (pos₁ pos₂
    ofSlice pos₁ = ofSlice pos₂ ↔ pos₁ = pos₂ := by
  simp [Pos.ext_iff, Pos.Raw.ext_iff]

@[simp]
theorem Slice.Pos.le_ofSlice {s : Slice} {p₀ p₁ : s.Pos} {h}
    {pos : (s.slice p₀ p₁ h).Pos} : p₀ ≤ ofSlice pos := by
  simp [Pos.le_iff, Pos.Raw.le_iff]

@[simp]
theorem Slice.Pos.ofSlice_le {s : Slice} {p₀ p₁ : s.Pos} {h}
    {pos : (s.slice p₀ p₁ h).Pos} : ofSlice pos ≤ p₁ := by
  have := (Pos.Raw.isValidForSlice_slice _).1 pos.isValidForSlice |>.1
  simpa [Pos.le_iff, Pos.Raw.le_iff]

@[simp]
theorem Slice.Pos.ofSlice_lt_ofSlice_iff {s : Slice} {p₀ p₁ : s.Pos} {h}
    {q r : (s.slice p₀ p₁ h).Pos} : Slice.Pos.ofSlice q < Slice.Pos.ofSlice r ↔ q < r := by
  simp [Slice.Pos.lt_iff, Pos.Raw.lt_iff]

@[simp]
theorem Slice.Pos.ofSlice_le_ofSlice_iff {s : Slice} {p₀ p₁ : s.Pos} {h}
    {q r : (s.slice p₀ p₁ h).Pos} : Slice.Pos.ofSlice q ≤ Slice.Pos.ofSlice r ↔ q ≤ r := by
  simp [Slice.Pos.le_iff, Pos.Raw.le_iff]

/-- Given a position in `s.slice p₀ p₁ h`, obtain the corresponding position in `s`. -/
@[inline]
def Pos.ofSlice {s : String} {p₀ p₁ : s.Pos} {h} (pos : (s.slice p₀ p₁ h).Pos) : s.Pos :=
@@ -2510,27 +2449,6 @@ theorem Pos.ofSlice_inj {s : String} {p₀ p₁ : s.Pos} {h} (pos₁ pos₂ : (s
    ofSlice pos₁ = ofSlice pos₂ ↔ pos₁ = pos₂ := by
  simp [Pos.ext_iff, Pos.Raw.ext_iff, Slice.Pos.ext_iff]

@[simp]
theorem Pos.le_ofSlice {s : String} {p₀ p₁ : s.Pos} {h}
    {pos : (s.slice p₀ p₁ h).Pos} : p₀ ≤ ofSlice pos := by
  simp [Pos.le_iff, Pos.Raw.le_iff]

@[simp]
theorem Pos.ofSlice_le {s : String} {p₀ p₁ : s.Pos} {h}
    {pos : (s.slice p₀ p₁ h).Pos} : ofSlice pos ≤ p₁ := by
  have := (Pos.Raw.isValidForSlice_slice _).1 pos.isValidForSlice |>.1
  simpa [Pos.le_iff, Pos.Raw.le_iff]

@[simp]
theorem Pos.ofSlice_lt_ofSlice_iff {s : String} {p₀ p₁ : s.Pos} {h}
    {q r : (s.slice p₀ p₁ h).Pos} : Pos.ofSlice q < Pos.ofSlice r ↔ q < r := by
  simp [Pos.lt_iff, Slice.Pos.lt_iff, Pos.Raw.lt_iff]

@[simp]
theorem Pos.ofSlice_le_ofSlice_iff {s : String} {p₀ p₁ : s.Pos} {h}
    {q r : (s.slice p₀ p₁ h).Pos} : Pos.ofSlice q ≤ Pos.ofSlice r ↔ q ≤ r := by
  simp [Pos.le_iff, Slice.Pos.le_iff, Pos.Raw.le_iff]

theorem Slice.Pos.le_trans {s : Slice} {p q r : s.Pos} : p ≤ q → q ≤ r → p ≤ r := by
  simpa [Pos.le_iff, Pos.Raw.le_iff] using Nat.le_trans

@@ -2554,48 +2472,6 @@ def Pos.slice {s : String} (pos : s.Pos) (p₀ p₁ : s.Pos) (h₁ : p₀ ≤ po
theorem Pos.offset_slice {s : String} {p₀ p₁ pos : s.Pos} {h₁ : p₀ ≤ pos} {h₂ : pos ≤ p₁} :
    (pos.slice p₀ p₁ h₁ h₂).offset = pos.offset.unoffsetBy p₀.offset := (rfl)

@[simp]
theorem Slice.Pos.offset_slice {s : Slice} {p₀ p₁ pos : s.Pos} {h₁ : p₀ ≤ pos} {h₂ : pos ≤ p₁} :
    (pos.slice p₀ p₁ h₁ h₂).offset = pos.offset.unoffsetBy p₀.offset := (rfl)

@[simp]
theorem Slice.Pos.ofSlice_slice {s : Slice} {p₀ p₁ pos : s.Pos}
    {h₁ : p₀ ≤ pos} {h₂ : pos ≤ p₁} :
    Slice.Pos.ofSlice (pos.slice p₀ p₁ h₁ h₂) = pos := by
  simpa [Pos.ext_iff] using Pos.Raw.offsetBy_unoffsetBy_of_le h₁

@[simp]
theorem Slice.Pos.slice_ofSlice {s : Slice} {p₀ p₁ : s.Pos} {h}
    {pos : (s.slice p₀ p₁ h).Pos} :
    (Slice.Pos.ofSlice pos).slice p₀ p₁ Slice.Pos.le_ofSlice Slice.Pos.ofSlice_le = pos := by
  simp [← Slice.Pos.ofSlice_inj]

@[simp]
theorem Pos.ofSlice_slice {s : String} {p₀ p₁ pos : s.Pos}
    {h₁ : p₀ ≤ pos} {h₂ : pos ≤ p₁} :
    Pos.ofSlice (pos.slice p₀ p₁ h₁ h₂) = pos := by
  simpa [Pos.ext_iff] using Pos.Raw.offsetBy_unoffsetBy_of_le h₁

@[simp]
theorem Pos.slice_ofSlice {s : String} {p₀ p₁ : s.Pos} {h}
    {pos : (s.slice p₀ p₁ h).Pos} :
    (Pos.ofSlice pos).slice p₀ p₁ Pos.le_ofSlice Pos.ofSlice_le = pos := by
  simp [← Pos.ofSlice_inj]

@[simp]
theorem Slice.Pos.slice_inj {s : Slice} {p₀ p₁ : s.Pos} {pos pos' : s.Pos}
    {h₁ h₁' h₂ h₂'} :
    pos.slice p₀ p₁ h₁ h₂ = pos'.slice p₀ p₁ h₁' h₂' ↔ pos = pos' := by
  simp [Pos.ext_iff, Pos.Raw.ext_iff, Pos.le_iff, Pos.Raw.le_iff] at ⊢ h₁ h₁'
  omega

@[simp]
theorem Pos.slice_inj {s : String} {p₀ p₁ : s.Pos} {pos pos' : s.Pos}
    {h₁ h₁' h₂ h₂'} :
    pos.slice p₀ p₁ h₁ h₂ = pos'.slice p₀ p₁ h₁' h₂' ↔ pos = pos' := by
  simp [Pos.ext_iff, Pos.Raw.ext_iff, Slice.Pos.ext_iff, Pos.le_iff, Pos.Raw.le_iff] at ⊢ h₁ h₁'
  omega

/--
Given a position in `s`, obtain the corresponding position in `s.slice p₀ p₁ h`, or panic if `pos`
is not between `p₀` and `p₁`.
@@ -2628,7 +2504,7 @@ taking `s.slice! p₀ p₁` already panicked. -/
@[inline]
def Slice.Pos.ofSlice! {s : Slice} {p₀ p₁ : s.Pos} (pos : (s.slice! p₀ p₁).Pos) : s.Pos :=
  if h : p₀ ≤ p₁ then
    ofSlice (h := h) (pos.cast (congrArg Slice.copy slice_eq_slice!.symm))
    ofSlice (h := h) (pos.cast slice_eq_slice!.symm)
  else
    panic! "Starting position must be less than or equal to end position."

@@ -2646,7 +2522,7 @@ taking `s.slice! p₀ p₁` already panicked or if the position is not between `
def Slice.Pos.slice! {s : Slice} (pos : s.Pos) (p₀ p₁ : s.Pos) :
    (s.slice! p₀ p₁).Pos :=
  if h : p₀ ≤ pos ∧ pos ≤ p₁ then
    (pos.slice _ _ h.1 h.2).cast (congrArg Slice.copy slice_eq_slice!)
    (pos.slice _ _ h.1 h.2).cast slice_eq_slice!
  else
    panic! "Starting position must be less than or equal to end position and position must be between starting position and end position."

@@ -403,6 +403,7 @@ achieved by tracking the bounds by hand, the slice API is much more convenient.
`String.Slice` bundles proofs to ensure that the start and end positions always delineate a valid
string. For this reason, it should be preferred over `Substring.Raw`.
-/
@[ext]
structure Slice where
  /-- The underlying strings. -/
  str : String

@@ -16,7 +16,6 @@ public import Init.Data.String.Lemmas.IsEmpty
public import Init.Data.String.Lemmas.Pattern
public import Init.Data.String.Lemmas.Slice
public import Init.Data.String.Lemmas.Iterate
public import Init.Data.String.Lemmas.Intercalate
import Init.Data.Order.Lemmas
public import Init.Data.String.Basic
import Init.Data.Char.Lemmas

@@ -99,15 +99,6 @@ theorem Slice.utf8ByteSize_eq_size_toByteArray_copy {s : Slice} :
    s.utf8ByteSize = s.copy.toByteArray.size := by
  simp [utf8ByteSize_eq]

@[ext (iff := false)]
theorem Slice.ext {s t : Slice} (h : s.str = t.str)
    (hsi : s.startInclusive.cast h = t.startInclusive)
    (hee : s.endExclusive.cast h = t.endExclusive) : s = t := by
  rcases s with ⟨s, s₁, e₁, h₁⟩
  rcases t with ⟨t, s₂, e₂, h₂⟩
  cases h
  simp_all

section Iterate

/-
@@ -115,71 +106,32 @@ These lemmas are slightly evil because they are non-definitional equalities betw
are useful and they are at least equalities between slices with definitionally equal underlying
strings, so it should be fine.
-/
set_option backward.isDefEq.respectTransparency false in
@[simp]
theorem Slice.sliceTo_sliceFrom {s : Slice} {pos pos'} :
    (s.sliceFrom pos).sliceTo pos' =
      s.slice pos (Slice.Pos.ofSliceFrom pos') Slice.Pos.le_ofSliceFrom := by
  ext <;> simp [Pos.Raw.offsetBy_assoc]
  ext <;> simp [String.Pos.ext_iff, Pos.Raw.offsetBy_assoc]

set_option backward.isDefEq.respectTransparency false in
@[simp]
theorem Slice.sliceFrom_sliceTo {s : Slice} {pos pos'} :
    (s.sliceTo pos).sliceFrom pos' =
      s.slice (Slice.Pos.ofSliceTo pos') pos Slice.Pos.ofSliceTo_le := by
  ext <;> simp
  ext <;> simp [String.Pos.ext_iff]

set_option backward.isDefEq.respectTransparency false in
@[simp]
theorem Slice.sliceFrom_sliceFrom {s : Slice} {pos pos'} :
    (s.sliceFrom pos).sliceFrom pos' =
      s.sliceFrom (Slice.Pos.ofSliceFrom pos') := by
  ext <;> simp [Pos.Raw.offsetBy_assoc]
  ext <;> simp [String.Pos.ext_iff, Pos.Raw.offsetBy_assoc]

set_option backward.isDefEq.respectTransparency false in
@[simp]
theorem Slice.sliceTo_sliceTo {s : Slice} {pos pos'} :
    (s.sliceTo pos).sliceTo pos' = s.sliceTo (Slice.Pos.ofSliceTo pos') := by
  ext <;> simp

@[simp]
theorem Slice.sliceFrom_slice {s : Slice} {p₁ p₂ h p} :
    (s.slice p₁ p₂ h).sliceFrom p = s.slice (Pos.ofSlice p) p₂ Pos.ofSlice_le := by
  ext <;> simp [Nat.add_assoc]

@[simp]
theorem Slice.sliceTo_slice {s : Slice} {p₁ p₂ h p} :
    (s.slice p₁ p₂ h).sliceTo p = s.slice p₁ (Pos.ofSlice p) Pos.le_ofSlice := by
  ext <;> simp [Nat.add_assoc]

@[simp]
theorem sliceTo_sliceFrom {s : String} {pos pos'} :
    (s.sliceFrom pos).sliceTo pos' =
      s.slice pos (Pos.ofSliceFrom pos') Pos.le_ofSliceFrom := by
  ext <;> simp

@[simp]
theorem sliceFrom_sliceTo {s : String} {pos pos'} :
    (s.sliceTo pos).sliceFrom pos' =
      s.slice (Pos.ofSliceTo pos') pos Pos.ofSliceTo_le := by
  ext <;> simp

@[simp]
theorem sliceFrom_sliceFrom {s : String} {pos pos'} :
    (s.sliceFrom pos).sliceFrom pos' =
      s.sliceFrom (Pos.ofSliceFrom pos') := by
  ext <;> simp

@[simp]
theorem sliceTo_sliceTo {s : String} {pos pos'} :
    (s.sliceTo pos).sliceTo pos' = s.sliceTo (Pos.ofSliceTo pos') := by
  ext <;> simp

@[simp]
theorem sliceFrom_slice {s : String} {p₁ p₂ h p} :
    (s.slice p₁ p₂ h).sliceFrom p = s.slice (Pos.ofSlice p) p₂ Pos.ofSlice_le := by
  ext <;> simp

@[simp]
theorem sliceTo_slice {s : String} {p₁ p₂ h p} :
    (s.slice p₁ p₂ h).sliceTo p = s.slice p₁ (Pos.ofSlice p) Pos.le_ofSlice := by
  ext <;> simp
  ext <;> simp [String.Pos.ext_iff]

end Iterate

@@ -205,10 +157,9 @@ theorem Slice.copy_pos {s : Slice} {p : Pos.Raw} {h : Pos.Raw.IsValidForSlice s
  simp [String.Pos.ext_iff]

@[simp]
theorem Slice.cast_pos {s t : Slice} {p : Pos.Raw} {h : Pos.Raw.IsValidForSlice s p}
    {h' : s.copy = t.copy} {h'' : Pos.Raw.IsValidForSlice t p} :
    (s.pos p h).cast h' = t.pos p h'' := by
  simp [Slice.Pos.ext_iff]
theorem Slice.cast_pos {s t : Slice} {p : Pos.Raw} {h : Pos.Raw.IsValidForSlice s p} {h' : s = t} :
    (s.pos p h).cast h' = t.pos p (h' ▸ h) := by
  simp [Pos.ext_iff]

@[simp]
theorem cast_pos {s t : String} {p : Pos.Raw} {h : Pos.Raw.IsValid s p} {h' : s = t} :
@@ -225,7 +176,4 @@ theorem Pos.get_ofToSlice {s : String} {p : (s.toSlice).Pos} {h} :
    (ofToSlice p).get h = p.get (by simpa [← ofToSlice_inj]) := by
  simp [get_eq_get_toSlice]

@[simp]
theorem push_empty {c : Char} : "".push c = singleton c := rfl

end String

@@ -1,70 +0,0 @@
/-
Copyright (c) 2026 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Markus Himmel
-/
module

prelude
public import Init.Data.String.Defs
import all Init.Data.String.Defs
public import Init.Data.String.Slice
import all Init.Data.String.Slice

public section

namespace String

@[simp]
theorem intercalate_nil {s : String} : s.intercalate [] = "" := by
  simp [intercalate]

@[simp]
theorem intercalate_singleton {s t : String} : s.intercalate [t] = t := by
  simp [intercalate, intercalate.go]

private theorem intercalateGo_append {s t u : String} {l : List String} :
    intercalate.go (s ++ t) u l = s ++ intercalate.go t u l := by
  induction l generalizing t <;> simp [intercalate.go, String.append_assoc, *]

@[simp]
theorem intercalate_cons_cons {s t u : String} {l : List String} :
    s.intercalate (t :: u :: l) = t ++ s ++ s.intercalate (u :: l) := by
  simp [intercalate, intercalate.go, intercalateGo_append]

@[simp]
theorem intercalate_cons_append {s t u : String} {l : List String} :
    s.intercalate ((t ++ u) :: l) = t ++ s.intercalate (u :: l) := by
  cases l <;> simp [String.append_assoc]

theorem intercalate_cons_of_ne_nil {s t : String} {l : List String} (h : l ≠ []) :
    s.intercalate (t :: l) = t ++ s ++ s.intercalate l :=
  match l, h with
  | u::l, _ => by simp

@[simp]
theorem toList_intercalate {s : String} {l : List String} :
    (s.intercalate l).toList = s.toList.intercalate (l.map String.toList) := by
  induction l with
  | nil => simp
  | cons hd tl ih => cases tl <;> simp_all

namespace Slice

@[simp]
theorem _root_.String.appendSlice_eq {s : String} {t : Slice} : s ++ t = s ++ t.copy := rfl

private theorem intercalateGo_append {s t : String} {u : Slice} {l : List Slice} :
    intercalate.go (s ++ t) u l = s ++ intercalate.go t u l := by
  induction l generalizing t <;> simp [intercalate.go, String.append_assoc, *]

@[simp]
theorem intercalate_eq {s : Slice} {l : List Slice} :
    s.intercalate l = s.copy.intercalate (l.map Slice.copy) := by
  induction l with
  | nil => simp [intercalate]
  | cons hd tl ih => cases tl <;> simp_all [intercalate, intercalate.go, intercalateGo_append]

end Slice

end String
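The deleted file above characterizes `String.intercalate`, which joins a list of strings with a separator. As a brief editorial illustration of the behavior these lemmas describe (not part of the diff):

```lean
-- Editorial sketch: `intercalate_cons_cons` above says the separator is placed
-- between consecutive elements, so joining three strings with ", " gives:
#eval ", ".intercalate ["a", "b", "c"]  -- "a, b, c"
```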
@@ -87,10 +87,6 @@ theorem isEmpty_iff_utf8ByteSize_eq_zero {s : String} : s.isEmpty ↔ s.utf8Byte
theorem isEmpty_iff {s : String} : s.isEmpty ↔ s = "" := by
  simp [isEmpty_iff_utf8ByteSize_eq_zero]

@[simp]
theorem isEmpty_eq_false_iff {s : String} : s.isEmpty = false ↔ s ≠ "" := by
  simp [← isEmpty_iff]

theorem startPos_ne_endPos_iff {s : String} : s.startPos ≠ s.endPos ↔ s ≠ "" := by
  simp

@@ -179,34 +175,4 @@ theorem Slice.toByteArray_copy_ne_empty_iff {s : Slice} :
    s.copy.toByteArray ≠ ByteArray.empty ↔ s.isEmpty = false := by
  simp

section CopyEqEmpty

-- Yes, `simp` can prove these, but we still need to mark them as simp lemmas.

@[simp]
theorem copy_slice_self {s : String} {p : s.Pos} : (s.slice p p (Pos.le_refl _)).copy = "" := by
  simp

@[simp]
theorem copy_sliceTo_startPos {s : String} : (s.sliceTo s.startPos).copy = "" := by
  simp

@[simp]
theorem copy_sliceFrom_startPos {s : String} : (s.sliceFrom s.endPos).copy = "" := by
  simp

@[simp]
theorem Slice.copy_slice_self {s : Slice} {p : s.Pos} : (s.slice p p (Pos.le_refl _)).copy = "" := by
  simp

@[simp]
theorem Slice.copy_sliceTo_startPos {s : Slice} : (s.sliceTo s.startPos).copy = "" := by
  simp

@[simp]
theorem Slice.copy_sliceFrom_startPos {s : Slice} : (s.sliceFrom s.endPos).copy = "" := by
  simp

end CopyEqEmpty

end String

@@ -16,9 +16,6 @@ import Init.ByCases
import Init.Data.Iterators.Lemmas.Combinators.FilterMap
import Init.Data.String.Lemmas.Basic
import Init.Data.Iterators.Lemmas.Consumers.Loop
public import Init.Data.String.Lemmas.Order
import Init.Data.String.OrderInstances
import Init.Data.Subtype.Basic

set_option doc.verso true

@@ -50,19 +47,6 @@ theorem Model.positionsFrom_eq_cons {s : Slice} {p : s.Pos} (hp : p ≠ s.endPos
  rw [Model.positionsFrom]
  simp [hp]

@[simp]
theorem Model.mem_positionsFrom {s : Slice} {p : s.Pos} {q : { q : s.Pos // q ≠ s.endPos } } :
    q ∈ Model.positionsFrom p ↔ p ≤ q := by
  induction p using Pos.next_induction with
  | next p h ih =>
    rw [Model.positionsFrom_eq_cons h, List.mem_cons, ih]
    simp [Subtype.ext_iff, Std.le_iff_lt_or_eq (a := p), or_comm, eq_comm]
  | endPos => simp [q.property]

theorem Model.mem_positionsFrom_startPos {s : Slice} {q : { q : s.Pos // q ≠ s.endPos} } :
    q ∈ Model.positionsFrom s.startPos := by
  simp

theorem Model.map_get_positionsFrom_of_splits {s : Slice} {p : s.Pos} {t₁ t₂ : String}
    (hp : p.Splits t₁ t₂) : (Model.positionsFrom p).map (fun p => p.1.get p.2) = t₂.toList := by
  induction p using Pos.next_induction generalizing t₁ t₂ with
@@ -76,6 +60,7 @@ theorem Model.map_get_positionsFrom_startPos {s : Slice} :
    (Model.positionsFrom s.startPos).map (fun p => p.1.get p.2) = s.copy.toList :=
  Model.map_get_positionsFrom_of_splits (splits_startPos s)

set_option backward.isDefEq.respectTransparency false in
@[simp]
theorem toList_positionsFrom {s : Slice} {p : s.Pos} :
    (s.positionsFrom p).toList = Model.positionsFrom p := by
@@ -95,38 +80,6 @@ theorem toList_positions {s : Slice} : s.positions.toList = Model.positionsFrom
theorem toList_chars {s : Slice} : s.chars.toList = s.copy.toList := by
  simp [chars, Model.map_get_positionsFrom_startPos]

theorem mem_toList_copy_iff_exists_get {s : Slice} {c : Char} :
    c ∈ s.copy.toList ↔ ∃ (p : s.Pos) (h : p ≠ s.endPos), p.get h = c := by
  simp [← Model.map_get_positionsFrom_startPos]

theorem Pos.Splits.mem_toList_left_iff {s : Slice} {pos : s.Pos} {t u : String} {c : Char}
    (hs : pos.Splits t u) :
    c ∈ t.toList ↔ ∃ pos', ∃ (h : pos' < pos), pos'.get (Pos.ne_endPos_of_lt h) = c := by
  rw [hs.eq_left pos.splits, mem_toList_copy_iff_exists_get]
  refine ⟨?_, ?_⟩
  · rintro ⟨p, hp, hpget⟩
    have hlt : Pos.ofSliceTo p < pos := by
      simpa using Pos.ofSliceTo_lt_ofSliceTo_iff.mpr ((Pos.lt_endPos_iff _).mpr hp)
    exact ⟨_, hlt, by rwa [Pos.get_eq_get_ofSliceTo] at hpget⟩
  · rintro ⟨pos', hlt, hget⟩
    exact ⟨pos.sliceTo pos' (Std.le_of_lt hlt),
      by simpa [← Pos.ofSliceTo_inj] using Std.ne_of_lt hlt,
      by rw [Slice.Pos.get_eq_get_ofSliceTo]; simpa using hget⟩

theorem Pos.Splits.mem_toList_right_iff {s : Slice} {pos : s.Pos} {t u : String} {c : Char}
    (hs : pos.Splits t u) :
    c ∈ u.toList ↔ ∃ pos', ∃ (_ : pos ≤ pos') (h : pos' ≠ s.endPos), pos'.get h = c := by
  rw [hs.eq_right pos.splits, mem_toList_copy_iff_exists_get]
  refine ⟨?_, ?_⟩
  · rintro ⟨p, hp, hpget⟩
    exact ⟨Pos.ofSliceFrom p, Pos.le_ofSliceFrom,
      fun h => hp (Pos.ofSliceFrom_inj.mp (h.trans (Pos.ofSliceFrom_endPos (pos := pos)).symm)),
      by rwa [Pos.get_eq_get_ofSliceFrom] at hpget⟩
  · rintro ⟨pos', hle, hne, hget⟩
    exact ⟨pos.sliceFrom pos' hle,
      fun h => hne (by simpa using congrArg Pos.ofSliceFrom h),
      by rw [Pos.get_eq_get_ofSliceFrom]; simpa using hget⟩

/--
A list of all positions strictly before {name}`p`, ordered from largest to smallest.

@@ -162,6 +115,7 @@ theorem Model.map_get_revPositionsFrom_endPos {s : Slice} :
    (Model.revPositionsFrom s.endPos).map (fun p => p.1.get p.2) = s.copy.toList.reverse :=
  Model.map_get_revPositionsFrom_of_splits (splits_endPos s)

set_option backward.isDefEq.respectTransparency false in
@[simp]
theorem toList_revPositionsFrom {s : Slice} {p : s.Pos} :
    (s.revPositionsFrom p).toList = Model.revPositionsFrom p := by
@@ -214,19 +168,6 @@ theorem Model.positionsFrom_eq_cons {s : String} {p : s.Pos} (hp : p ≠ s.endPo
  rw [Model.positionsFrom]
  simp [hp]

@[simp]
theorem Model.mem_positionsFrom {s : String} {p : s.Pos} {q : { q : s.Pos // q ≠ s.endPos } } :
    q ∈ Model.positionsFrom p ↔ p ≤ q := by
  induction p using Pos.next_induction with
  | next p h ih =>
    rw [Model.positionsFrom_eq_cons h, List.mem_cons, ih]
    simp [Subtype.ext_iff, Std.le_iff_lt_or_eq (a := p), or_comm, eq_comm]
  | endPos => simp [q.property]

theorem Model.mem_positionsFrom_startPos {s : String} {q : { q : s.Pos // q ≠ s.endPos} } :
    q ∈ Model.positionsFrom s.startPos := by
  simp

theorem Model.positionsFrom_eq_map {s : String} {p : s.Pos} :
    Model.positionsFrom p = (Slice.Model.positionsFrom p.toSlice).map
      (fun p => ⟨Pos.ofToSlice p.1, by simpa [← Pos.toSlice_inj] using p.2⟩) := by
@@ -258,38 +199,6 @@ theorem toList_positions {s : String} : s.positions.toList = Model.positionsFrom
theorem toList_chars {s : String} : s.chars.toList = s.toList := by
  simp [chars]

theorem mem_toList_iff_exists_get {s : String} {c : Char} :
    c ∈ s.toList ↔ ∃ (p : s.Pos) (h : p ≠ s.endPos), p.get h = c := by
  simp [← Model.map_get_positionsFrom_startPos]

theorem Pos.Splits.mem_toList_left_iff {s : String} {pos : s.Pos} {t u : String} {c : Char}
    (hs : pos.Splits t u) :
    c ∈ t.toList ↔ ∃ pos', ∃ (h : pos' < pos), pos'.get (Pos.ne_endPos_of_lt h) = c := by
  rw [hs.eq_left pos.splits, Slice.mem_toList_copy_iff_exists_get]
  refine ⟨?_, ?_⟩
  · rintro ⟨p, hp, hpget⟩
    have hlt : Pos.ofSliceTo p < pos := by
      simpa using Pos.ofSliceTo_lt_ofSliceTo_iff.mpr ((Slice.Pos.lt_endPos_iff _).mpr hp)
    exact ⟨_, hlt, by rwa [Pos.get_eq_get_ofSliceTo] at hpget⟩
  · rintro ⟨pos', hlt, hget⟩
    exact ⟨pos.sliceTo pos' (Std.le_of_lt hlt),
      fun h => Std.ne_of_lt hlt (by simpa using congrArg Pos.ofSliceTo h),
      by rw [Pos.get_eq_get_ofSliceTo]; simpa using hget⟩

theorem Pos.Splits.mem_toList_right_iff {s : String} {pos : s.Pos} {t u : String} {c : Char}
    (hs : pos.Splits t u) :
    c ∈ u.toList ↔ ∃ pos', ∃ (_ : pos ≤ pos') (h : pos' ≠ s.endPos), pos'.get h = c := by
  rw [hs.eq_right pos.splits, Slice.mem_toList_copy_iff_exists_get]
  refine ⟨?_, ?_⟩
  · rintro ⟨p, hp, hpget⟩
    exact ⟨Pos.ofSliceFrom p, Pos.le_ofSliceFrom,
      fun h => hp (Pos.ofSliceFrom_inj.mp (h.trans Pos.ofSliceFrom_endPos.symm)),
      by rwa [Pos.get_eq_get_ofSliceFrom] at hpget⟩
  · rintro ⟨pos', hle, hne, hget⟩
    exact ⟨pos.sliceFrom pos' hle,
      fun h => hne (by simpa using congrArg Pos.ofSliceFrom h),
      by rw [Pos.get_eq_get_ofSliceFrom]; simpa using hget⟩

/--
A list of all positions strictly before {name}`p`, ordered from largest to smallest.

@@ -10,7 +10,6 @@ public import Init.Data.String.Basic
import Init.Data.String.OrderInstances
import Init.Data.String.Lemmas.Basic
import Init.Data.Order.Lemmas
import Init.Omega

public section

@@ -57,14 +56,6 @@ theorem Slice.Pos.endPos_le {s : Slice} (p : s.Pos) : s.endPos ≤ p ↔ p = s.e
theorem Slice.Pos.lt_endPos_iff {s : Slice} (p : s.Pos) : p < s.endPos ↔ p ≠ s.endPos := by
  simp [← endPos_le, Std.not_le]

@[simp]
theorem Pos.endPos_le {s : String} (p : s.Pos) : s.endPos ≤ p ↔ p = s.endPos :=
  ⟨fun h => Std.le_antisymm (le_endPos _) h, by simp +contextual⟩

@[simp]
theorem Pos.lt_endPos_iff {s : String} (p : s.Pos) : p < s.endPos ↔ p ≠ s.endPos := by
  simp [← endPos_le, Std.not_le]

@[simp]
theorem Pos.le_startPos {s : String} (p : s.Pos) : p ≤ s.startPos ↔ p = s.startPos :=
  ⟨fun h => Std.le_antisymm h (startPos_le _), by simp +contextual⟩
@@ -73,6 +64,10 @@ theorem Pos.le_startPos {s : String} (p : s.Pos) : p ≤ s.startPos ↔ p = s.st
theorem Pos.startPos_lt_iff {s : String} {p : s.Pos} : s.startPos < p ↔ p ≠ s.startPos := by
  simp [← le_startPos, Std.not_le]

@[simp]
theorem Pos.endPos_le {s : String} (p : s.Pos) : s.endPos ≤ p ↔ p = s.endPos :=
  ⟨fun h => Std.le_antisymm (le_endPos _) h, by simp +contextual [Std.le_refl]⟩

@[simp]
theorem Slice.Pos.not_lt_startPos {s : Slice} {p : s.Pos} : ¬ p < s.startPos :=
  fun h => Std.lt_irrefl (Std.lt_of_lt_of_le h (Slice.Pos.startPos_le _))
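The order lemmas above relate `≤` and `<` on validated positions to (in)equality with `startPos` and `endPos`. As an editorial sketch of how they are typically used (not part of the diff; it assumes the API shown above):

```lean
-- Editorial sketch: `Pos.lt_endPos_iff` converts "not at the end" into a strict
-- bound, which is how iteration arguments obtain `p < s.endPos` from `p ≠ s.endPos`.
example {s : String} (p : s.Pos) (h : p ≠ s.endPos) : p < s.endPos :=
  (String.Pos.lt_endPos_iff p).mpr h
```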
@@ -105,317 +100,19 @@ theorem Slice.Pos.le_next {s : Slice} {p : s.Pos} {h} : p ≤ p.next h :=
|
||||
theorem Pos.le_next {s : String} {p : s.Pos} {h} : p ≤ p.next h :=
|
||||
Std.le_of_lt (by simp)
|
||||
|
||||
@[simp]
|
||||
theorem Slice.Pos.ne_next {s : Slice} {p : s.Pos} {h} : p ≠ p.next h :=
|
||||
Std.ne_of_lt (by simp)
|
||||
|
||||
@[simp]
|
||||
theorem Pos.ne_next {s : String} {p : s.Pos} {h} : p ≠ p.next h :=
|
||||
Std.ne_of_lt (by simp)
|
||||
|
||||
@[simp]
|
||||
theorem Slice.Pos.next_ne {s : Slice} {p : s.Pos} {h} : p.next h ≠ p :=
|
||||
Ne.symm (by simp)
|
||||
|
||||
@[simp]
|
||||
theorem Pos.next_ne {s : String} {p : s.Pos} {h} : p.next h ≠ p :=
|
||||
Ne.symm (by simp)
|
||||
|
||||
@[simp]
|
||||
theorem Slice.Pos.next_ne_startPos {s : Slice} {p : s.Pos} {h} :
|
||||
p.next h ≠ s.startPos :=
|
||||
ne_startPos_of_lt lt_next
|
||||
|
||||
@[simp]
|
||||
theorem Slice.Pos.ofSliceTo_lt_ofSliceTo_iff {s : Slice} {p : s.Pos}
|
||||
{q r : (s.sliceTo p).Pos} : Slice.Pos.ofSliceTo q < Slice.Pos.ofSliceTo r ↔ q < r := by
|
||||
simp [Slice.Pos.lt_iff, Pos.Raw.lt_iff]
|
||||
|
||||
@[simp]
|
||||
theorem Slice.Pos.ofSliceTo_le_ofSliceTo_iff {s : Slice} {p : s.Pos}
|
||||
{q r : (s.sliceTo p).Pos} : Slice.Pos.ofSliceTo q ≤ Slice.Pos.ofSliceTo r ↔ q ≤ r := by
|
||||
simp [Slice.Pos.le_iff, Pos.Raw.le_iff]
|
||||
|
||||
@[simp]
|
||||
theorem Slice.Pos.sliceTo_lt_sliceTo_iff {s : Slice} {p₀ : s.Pos} {q r : s.Pos} {h₁ h₂} :
|
||||
Pos.sliceTo p₀ q h₁ < Pos.sliceTo p₀ r h₂ ↔ q < r := by
|
||||
simp [Slice.Pos.lt_iff, Pos.Raw.lt_iff]
|
||||
|
||||
@[simp]
|
||||
theorem Slice.Pos.sliceTo_le_sliceTo_iff {s : Slice} {p₀ : s.Pos} {q r : s.Pos} {h₁ h₂} :
|
||||
Pos.sliceTo p₀ q h₁ ≤ Pos.sliceTo p₀ r h₂ ↔ q ≤ r := by
|
||||
simp [Slice.Pos.le_iff, Pos.Raw.le_iff]
|
||||
|
||||
@[simp]
|
||||
theorem Pos.sliceTo_lt_sliceTo_iff {s : String} {p₀ : s.Pos} {q r : s.Pos} {h₁ h₂} :
|
||||
Pos.sliceTo p₀ q h₁ < Pos.sliceTo p₀ r h₂ ↔ q < r := by
|
||||
simp [Slice.Pos.lt_iff, Pos.lt_iff, Pos.Raw.lt_iff]
|
||||
|
||||
@[simp]
|
||||
theorem Pos.sliceTo_le_sliceTo_iff {s : String} {p₀ : s.Pos} {q r : s.Pos} {h₁ h₂} :
|
||||
Pos.sliceTo p₀ q h₁ ≤ Pos.sliceTo p₀ r h₂ ↔ q ≤ r := by
|
||||
simp [Slice.Pos.le_iff, Pos.le_iff, Pos.Raw.le_iff]
|
||||
|
||||
@[simp]
|
||||
theorem Slice.Pos.sliceFrom_lt_sliceFrom_iff {s : Slice} {p₀ : s.Pos} {q r : s.Pos} {h₁ h₂} :
|
||||
Pos.sliceFrom p₀ q h₁ < Pos.sliceFrom p₀ r h₂ ↔ q < r := by
|
||||
simp [Slice.Pos.lt_iff, Pos.Raw.lt_iff, Slice.Pos.le_iff, Pos.Raw.le_iff] at h₁ h₂ ⊢
|
||||
omega
|
||||
|
||||
@[simp]
|
||||
theorem Slice.Pos.sliceFrom_le_sliceFrom_iff {s : Slice} {p₀ : s.Pos} {q r : s.Pos} {h₁ h₂} :
|
||||
Pos.sliceFrom p₀ q h₁ ≤ Pos.sliceFrom p₀ r h₂ ↔ q ≤ r := by
|
||||
simp [Slice.Pos.le_iff, Pos.Raw.le_iff] at h₁ h₂ ⊢
|
||||
omega
|
||||
|
||||
@[simp]
|
||||
theorem Pos.sliceFrom_lt_sliceFrom_iff {s : String} {p₀ : s.Pos} {q r : s.Pos} {h₁ h₂} :
|
||||
Pos.sliceFrom p₀ q h₁ < Pos.sliceFrom p₀ r h₂ ↔ q < r := by
|
||||
simp [Slice.Pos.lt_iff, Pos.lt_iff, Pos.Raw.lt_iff, Pos.le_iff, Pos.Raw.le_iff] at h₁ h₂ ⊢
|
||||
omega
|
||||
|
||||
@[simp]
|
||||
theorem Pos.sliceFrom_le_sliceFrom_iff {s : String} {p₀ : s.Pos} {q r : s.Pos} {h₁ h₂} :
|
||||
Pos.sliceFrom p₀ q h₁ ≤ Pos.sliceFrom p₀ r h₂ ↔ q ≤ r := by
|
||||
simp [Slice.Pos.le_iff, Pos.le_iff, Pos.Raw.le_iff] at h₁ h₂ ⊢
|
||||
omega
|
||||
|
||||
theorem Slice.Pos.ofSliceFrom_lt_iff {s : Slice} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {q : s.Pos} :
    Pos.ofSliceFrom p < q ↔ ∃ h, p < Slice.Pos.sliceFrom p₀ q h := by
  refine ⟨fun h => ⟨Std.le_of_lt (Std.lt_of_le_of_lt Pos.le_ofSliceFrom h), ?_⟩, fun ⟨h, h'⟩ => ?_⟩
  · simp +singlePass only [← Pos.sliceFrom_ofSliceFrom (p := p)]
    rwa [Pos.sliceFrom_lt_sliceFrom_iff]
  · simp +singlePass only [← Pos.ofSliceFrom_sliceFrom (h := h)]
    rwa [Pos.ofSliceFrom_lt_ofSliceFrom_iff]

theorem Slice.Pos.le_ofSliceFrom_iff {s : Slice} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {q : s.Pos} :
    q ≤ Pos.ofSliceFrom p ↔ ∀ h, Slice.Pos.sliceFrom p₀ q h ≤ p := by
  simp [← Std.not_lt, Pos.ofSliceFrom_lt_iff]

theorem Slice.Pos.ofSliceFrom_le_iff {s : Slice} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {q : s.Pos} :
    Pos.ofSliceFrom p ≤ q ↔ ∃ h, p ≤ Slice.Pos.sliceFrom p₀ q h := by
  refine ⟨fun h => ⟨Std.le_trans Pos.le_ofSliceFrom h, ?_⟩, fun ⟨h, h'⟩ => ?_⟩
  · simp +singlePass only [← Pos.sliceFrom_ofSliceFrom (p := p)]
    rwa [Pos.sliceFrom_le_sliceFrom_iff]
  · simp +singlePass only [← Pos.ofSliceFrom_sliceFrom (h := h)]
    rwa [Pos.ofSliceFrom_le_ofSliceFrom_iff]

theorem Slice.Pos.lt_ofSliceFrom_iff {s : Slice} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {q : s.Pos} :
    q < Pos.ofSliceFrom p ↔ ∀ h, Slice.Pos.sliceFrom p₀ q h < p := by
  simp [← Std.not_le, Pos.ofSliceFrom_le_iff]

theorem Pos.ofSliceFrom_lt_iff {s : String} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {q : s.Pos} :
    Pos.ofSliceFrom p < q ↔ ∃ h, p < Pos.sliceFrom p₀ q h := by
  refine ⟨fun h => ⟨Std.le_of_lt (Std.lt_of_le_of_lt Pos.le_ofSliceFrom h), ?_⟩, fun ⟨h, h'⟩ => ?_⟩
  · simp +singlePass only [← Pos.sliceFrom_ofSliceFrom (p := p)]
    rwa [Pos.sliceFrom_lt_sliceFrom_iff]
  · simp +singlePass only [← Pos.ofSliceFrom_sliceFrom (h := h)]
    rwa [Pos.ofSliceFrom_lt_ofSliceFrom_iff]

theorem Pos.le_ofSliceFrom_iff {s : String} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {q : s.Pos} :
    q ≤ Pos.ofSliceFrom p ↔ ∀ h, Pos.sliceFrom p₀ q h ≤ p := by
  simp [← Std.not_lt, Pos.ofSliceFrom_lt_iff]

theorem Pos.ofSliceFrom_le_iff {s : String} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {q : s.Pos} :
    Pos.ofSliceFrom p ≤ q ↔ ∃ h, p ≤ Pos.sliceFrom p₀ q h := by
  refine ⟨fun h => ⟨Std.le_trans Pos.le_ofSliceFrom h, ?_⟩, fun ⟨h, h'⟩ => ?_⟩
  · simp +singlePass only [← Pos.sliceFrom_ofSliceFrom (p := p)]
    rwa [Pos.sliceFrom_le_sliceFrom_iff]
  · simp +singlePass only [← Pos.ofSliceFrom_sliceFrom (h := h)]
    rwa [Pos.ofSliceFrom_le_ofSliceFrom_iff]

theorem Pos.lt_ofSliceFrom_iff {s : String} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {q : s.Pos} :
    q < Pos.ofSliceFrom p ↔ ∀ h, Pos.sliceFrom p₀ q h < p := by
  simp [← Std.not_le, Pos.ofSliceFrom_le_iff]

theorem Slice.Pos.ofSliceFrom_next {s : Slice} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {h} :
    Pos.ofSliceFrom (p.next h) = (Pos.ofSliceFrom p).next (by simpa [← Pos.ofSliceFrom_inj] using h) := by
  rw [eq_comm, Pos.next_eq_iff]
  simp only [Pos.ofSliceFrom_lt_ofSliceFrom_iff, Pos.lt_next, Pos.ofSliceFrom_le_iff,
    Pos.next_le_iff_lt, true_and]
  simp [Pos.ofSliceFrom_lt_iff]

theorem Pos.ofSliceFrom_next {s : String} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {h} :
    Pos.ofSliceFrom (p.next h) = (Pos.ofSliceFrom p).next (by simpa [← Pos.ofSliceFrom_inj] using h) := by
  rw [eq_comm, Pos.next_eq_iff]
  simp only [Pos.ofSliceFrom_lt_ofSliceFrom_iff, Slice.Pos.lt_next, Pos.ofSliceFrom_le_iff,
    Slice.Pos.next_le_iff_lt, true_and]
  simp [Pos.ofSliceFrom_lt_iff]

theorem Slice.Pos.le_ofSliceTo_iff {s : Slice} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {q : s.Pos} :
    q ≤ Pos.ofSliceTo p ↔ ∃ h, Slice.Pos.sliceTo p₀ q h ≤ p := by
  refine ⟨fun h => ⟨Slice.Pos.le_trans h Pos.ofSliceTo_le, ?_⟩, fun ⟨h, h'⟩ => ?_⟩
  · simp +singlePass only [← Pos.sliceTo_ofSliceTo (p := p)]
    rwa [Pos.sliceTo_le_sliceTo_iff]
  · simp +singlePass only [← Pos.ofSliceTo_sliceTo (h := h)]
    rwa [Pos.ofSliceTo_le_ofSliceTo_iff]

theorem Slice.Pos.ofSliceTo_lt_iff {s : Slice} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {q : s.Pos} :
    Pos.ofSliceTo p < q ↔ ∀ h, p < Slice.Pos.sliceTo p₀ q h := by
  simp [← Std.not_le, Slice.Pos.le_ofSliceTo_iff]

theorem Slice.Pos.lt_ofSliceTo_iff {s : Slice} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {q : s.Pos} :
    q < Pos.ofSliceTo p ↔ ∃ h, Slice.Pos.sliceTo p₀ q h < p := by
  refine ⟨fun h => ⟨Std.le_of_lt (Std.lt_of_le_of_lt (Std.le_refl q) (Std.lt_of_lt_of_le h Pos.ofSliceTo_le)), ?_⟩, fun ⟨h, h'⟩ => ?_⟩
  · simp +singlePass only [← Pos.sliceTo_ofSliceTo (p := p)]
    rwa [Pos.sliceTo_lt_sliceTo_iff]
  · simp +singlePass only [← Pos.ofSliceTo_sliceTo (h := h)]
    rwa [Pos.ofSliceTo_lt_ofSliceTo_iff]

theorem Slice.Pos.ofSliceTo_le_iff {s : Slice} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {q : s.Pos} :
    Pos.ofSliceTo p ≤ q ↔ ∀ h, p ≤ Slice.Pos.sliceTo p₀ q h := by
  simp [← Std.not_lt, Slice.Pos.lt_ofSliceTo_iff]

theorem Pos.le_ofSliceTo_iff {s : String} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {q : s.Pos} :
    q ≤ Pos.ofSliceTo p ↔ ∃ h, Pos.sliceTo p₀ q h ≤ p := by
  refine ⟨fun h => ⟨Pos.le_trans h Pos.ofSliceTo_le, ?_⟩, fun ⟨h, h'⟩ => ?_⟩
  · simp +singlePass only [← Pos.sliceTo_ofSliceTo (p := p)]
    rwa [Pos.sliceTo_le_sliceTo_iff]
  · simp +singlePass only [← Pos.ofSliceTo_sliceTo (h := h)]
    rwa [Pos.ofSliceTo_le_ofSliceTo_iff]

theorem Pos.ofSliceTo_lt_iff {s : String} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {q : s.Pos} :
    Pos.ofSliceTo p < q ↔ ∀ h, p < Pos.sliceTo p₀ q h := by
  simp [← Std.not_le, Pos.le_ofSliceTo_iff]

theorem Pos.lt_ofSliceTo_iff {s : String} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {q : s.Pos} :
    q < Pos.ofSliceTo p ↔ ∃ h, Pos.sliceTo p₀ q h < p := by
  refine ⟨fun h => ⟨Pos.le_of_lt (Pos.lt_of_lt_of_le h Pos.ofSliceTo_le), ?_⟩, fun ⟨h, h'⟩ => ?_⟩
  · simp +singlePass only [← Pos.sliceTo_ofSliceTo (p := p)]
    rwa [Pos.sliceTo_lt_sliceTo_iff]
  · simp +singlePass only [← Pos.ofSliceTo_sliceTo (h := h)]
    rwa [Pos.ofSliceTo_lt_ofSliceTo_iff]

theorem Pos.ofSliceTo_le_iff {s : String} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {q : s.Pos} :
    Pos.ofSliceTo p ≤ q ↔ ∀ h, p ≤ Pos.sliceTo p₀ q h := by
  simp [← Std.not_lt, Pos.lt_ofSliceTo_iff]

theorem Slice.Pos.lt_sliceFrom_iff {s : Slice} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {q : s.Pos} {h} :
    p < Slice.Pos.sliceFrom p₀ q h ↔ Pos.ofSliceFrom p < q := by
  simp [ofSliceFrom_lt_iff, h]

theorem Slice.Pos.sliceFrom_le_iff {s : Slice} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {q : s.Pos} {h} :
    Slice.Pos.sliceFrom p₀ q h ≤ p ↔ q ≤ Pos.ofSliceFrom p := by
  simp [← Std.not_lt, lt_sliceFrom_iff]

theorem Slice.Pos.le_sliceFrom_iff {s : Slice} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {q : s.Pos} {h} :
    p ≤ Slice.Pos.sliceFrom p₀ q h ↔ Pos.ofSliceFrom p ≤ q := by
  simp [ofSliceFrom_le_iff, h]

theorem Slice.Pos.sliceFrom_lt_iff {s : Slice} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {q : s.Pos} {h} :
    Slice.Pos.sliceFrom p₀ q h < p ↔ q < Pos.ofSliceFrom p := by
  simp [← Std.not_le, le_sliceFrom_iff]

theorem Pos.lt_sliceFrom_iff {s : String} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {q : s.Pos} {h} :
    p < Pos.sliceFrom p₀ q h ↔ Pos.ofSliceFrom p < q := by
  simp [ofSliceFrom_lt_iff, h]

theorem Pos.sliceFrom_le_iff {s : String} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {q : s.Pos} {h} :
    Pos.sliceFrom p₀ q h ≤ p ↔ q ≤ Pos.ofSliceFrom p := by
  simp [← Std.not_lt, lt_sliceFrom_iff]

theorem Pos.le_sliceFrom_iff {s : String} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {q : s.Pos} {h} :
    p ≤ Pos.sliceFrom p₀ q h ↔ Pos.ofSliceFrom p ≤ q := by
  simp [ofSliceFrom_le_iff, h]

theorem Pos.sliceFrom_lt_iff {s : String} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {q : s.Pos} {h} :
    Pos.sliceFrom p₀ q h < p ↔ q < Pos.ofSliceFrom p := by
  simp [← Std.not_le, le_sliceFrom_iff]

theorem Slice.Pos.sliceTo_le_iff {s : Slice} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {q : s.Pos} {h} :
    Pos.sliceTo p₀ q h ≤ p ↔ q ≤ Pos.ofSliceTo p := by
  simp [le_ofSliceTo_iff, h]

theorem Slice.Pos.lt_sliceTo_iff {s : Slice} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {q : s.Pos} {h} :
    p < Pos.sliceTo p₀ q h ↔ Pos.ofSliceTo p < q := by
  simp [← Std.not_le, sliceTo_le_iff]

theorem Slice.Pos.sliceTo_lt_iff {s : Slice} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {q : s.Pos} {h} :
    Slice.Pos.sliceTo p₀ q h < p ↔ q < Pos.ofSliceTo p := by
  simp [lt_ofSliceTo_iff, h]

theorem Slice.Pos.le_sliceTo_iff {s : Slice} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {q : s.Pos} {h} :
    p ≤ Slice.Pos.sliceTo p₀ q h ↔ Pos.ofSliceTo p ≤ q := by
  simp [← Std.not_lt, sliceTo_lt_iff]

theorem Pos.sliceTo_le_iff {s : String} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {q : s.Pos} {h} :
    Pos.sliceTo p₀ q h ≤ p ↔ q ≤ Pos.ofSliceTo p := by
  simp [le_ofSliceTo_iff, h]

theorem Pos.lt_sliceTo_iff {s : String} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {q : s.Pos} {h} :
    p < Pos.sliceTo p₀ q h ↔ Pos.ofSliceTo p < q := by
  simp [← Std.not_le, sliceTo_le_iff]

theorem Pos.sliceTo_lt_iff {s : String} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {q : s.Pos} {h} :
    Pos.sliceTo p₀ q h < p ↔ q < Pos.ofSliceTo p := by
  simp [lt_ofSliceTo_iff, h]

theorem Pos.le_sliceTo_iff {s : String} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {q : s.Pos} {h} :
    p ≤ Pos.sliceTo p₀ q h ↔ Pos.ofSliceTo p ≤ q := by
  simp [← Std.not_lt, sliceTo_lt_iff]

theorem Slice.Pos.ofSliceTo_ne_endPos {s : Slice} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos}
    (h : p ≠ (s.sliceTo p₀).endPos) : Pos.ofSliceTo p ≠ s.endPos := by
  refine (lt_endPos_iff _).1 (Std.lt_of_lt_of_le ?_ (le_endPos p₀))
  simpa [← lt_endPos_iff, ← ofSliceTo_lt_ofSliceTo_iff] using h

theorem Pos.ofSliceTo_ne_endPos {s : String} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos}
    (h : p ≠ (s.sliceTo p₀).endPos) : Pos.ofSliceTo p ≠ s.endPos := by
  refine (lt_endPos_iff _).1 (Std.lt_of_lt_of_le ?_ (le_endPos p₀))
  simpa [← Slice.Pos.lt_endPos_iff, ← ofSliceTo_lt_ofSliceTo_iff] using h

theorem Slice.Pos.ofSliceTo_next {s : Slice} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {h} :
    Pos.ofSliceTo (p.next h) = (Pos.ofSliceTo p).next (ofSliceTo_ne_endPos h) := by
  rw [eq_comm, Pos.next_eq_iff]
  simp only [Pos.ofSliceTo_lt_ofSliceTo_iff, Pos.lt_next, Pos.ofSliceTo_le_iff,
    Pos.next_le_iff_lt, true_and]
  simp [Pos.ofSliceTo_lt_iff]

theorem Pos.ofSliceTo_next {s : String} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {h} :
    Pos.ofSliceTo (p.next h) = (Pos.ofSliceTo p).next (ofSliceTo_ne_endPos h) := by
  rw [eq_comm, Pos.next_eq_iff]
  simp only [Pos.ofSliceTo_lt_ofSliceTo_iff, Slice.Pos.lt_next, Pos.ofSliceTo_le_iff,
    Slice.Pos.next_le_iff_lt, true_and]
  simp [Pos.ofSliceTo_lt_iff]

@[simp]
theorem Slice.Pos.slice_lt_slice_iff {s : Slice} {p₀ p₁ : s.Pos} {q r : s.Pos}
    {h₁ h₁' h₂ h₂'} :
    q.slice p₀ p₁ h₁ h₂ < r.slice p₀ p₁ h₁' h₂' ↔ q < r := by
  simp [Slice.Pos.lt_iff, Pos.Raw.lt_iff, Slice.Pos.le_iff, Pos.Raw.le_iff] at h₁ h₁' ⊢
  omega

@[simp]
theorem Slice.Pos.slice_le_slice_iff {s : Slice} {p₀ p₁ : s.Pos} {q r : s.Pos}
    {h₁ h₁' h₂ h₂'} :
    q.slice p₀ p₁ h₁ h₂ ≤ r.slice p₀ p₁ h₁' h₂' ↔ q ≤ r := by
  simp [Slice.Pos.le_iff, Pos.Raw.le_iff] at h₁ h₁' ⊢
  omega

@[simp]
theorem Pos.slice_lt_slice_iff {s : String} {p₀ p₁ : s.Pos} {q r : s.Pos}
    {h₁ h₁' h₂ h₂'} :
    q.slice p₀ p₁ h₁ h₂ < r.slice p₀ p₁ h₁' h₂' ↔ q < r := by
  simp [Slice.Pos.lt_iff, Pos.lt_iff, Pos.Raw.lt_iff, Pos.le_iff, Pos.Raw.le_iff] at h₁ h₁' ⊢
  omega

@[simp]
theorem Pos.slice_le_slice_iff {s : String} {p₀ p₁ : s.Pos} {q r : s.Pos}
    {h₁ h₁' h₂ h₂'} :
    q.slice p₀ p₁ h₁ h₂ ≤ r.slice p₀ p₁ h₁' h₂' ↔ q ≤ r := by
  simp [Slice.Pos.le_iff, Pos.le_iff, Pos.Raw.le_iff] at h₁ h₁' ⊢
  omega

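-- Annotation (not part of the commit): as with the `sliceTo`/`sliceFrom`
-- lemmas, these `slice` order lemmas are `@[simp]` lemmas, so a goal that
-- compares two positions transported into the same sub-slice closes by `simp`.
-- A hypothetical usage sketch:
--
-- example {s : Slice} {p₀ p₁ : s.Pos} {q r : s.Pos} {h₁ h₁' h₂ h₂'} :
--     q.slice p₀ p₁ h₁ h₂ ≤ r.slice p₀ p₁ h₁' h₂' ↔ q ≤ r := by
--   simp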
theorem Slice.Pos.ofSlice_ne_endPos {s : Slice} {p₀ p₁ : s.Pos} {h} {p : (s.slice p₀ p₁ h).Pos}
    (h : p ≠ (s.slice p₀ p₁ h).endPos) : Pos.ofSlice p ≠ s.endPos := by
  refine (lt_endPos_iff _).1 (Std.lt_of_lt_of_le ?_ (le_endPos p₁))
  simpa [← lt_endPos_iff, ← ofSlice_lt_ofSlice_iff] using h

theorem Pos.ofSlice_ne_endPos {s : String} {p₀ p₁ : s.Pos} {h} {p : (s.slice p₀ p₁ h).Pos}
    (h : p ≠ (s.slice p₀ p₁ h).endPos) : Pos.ofSlice p ≠ s.endPos := by
  refine (lt_endPos_iff _).1 (Std.lt_of_lt_of_le ?_ (le_endPos p₁))
  simpa [← Slice.Pos.lt_endPos_iff, ← ofSlice_lt_ofSlice_iff] using h

@[simp]
theorem Slice.Pos.offset_le_rawEndPos {s : Slice} {p : s.Pos} :
    p.offset ≤ s.rawEndPos :=
@@ -464,38 +161,4 @@ theorem Pos.isUTF8FirstByte_getUTF8Byte_offset {s : String} {p : s.Pos} {h} :
    (s.getUTF8Byte p.offset h).IsUTF8FirstByte := by
  simpa [getUTF8Byte_offset] using isUTF8FirstByte_byte

theorem Slice.Pos.get_eq_get_ofSliceTo {s : Slice} {p₀ : s.Pos} {pos : (s.sliceTo p₀).Pos} {h} :
    pos.get h = (ofSliceTo pos).get (ofSliceTo_ne_endPos h) := by
  simp [Slice.Pos.get]

theorem Pos.get_eq_get_ofSliceTo {s : String} {p₀ : s.Pos}
    {pos : (s.sliceTo p₀).Pos} {h} :
    pos.get h = (ofSliceTo pos).get (ofSliceTo_ne_endPos h) := by
  simp [Pos.get, Slice.Pos.get]

theorem Slice.Pos.get_eq_get_ofSlice {s : Slice} {p₀ p₁ : s.Pos} {h}
    {pos : (s.slice p₀ p₁ h).Pos} {h'} :
    pos.get h' = (ofSlice pos).get (ofSlice_ne_endPos h') := by
  simp [Slice.Pos.get, Nat.add_assoc]

theorem Pos.get_eq_get_ofSlice {s : String} {p₀ p₁ : s.Pos} {h}
    {pos : (s.slice p₀ p₁ h).Pos} {h'} :
    pos.get h' = (ofSlice pos).get (ofSlice_ne_endPos h') := by
  simp [Pos.get, Slice.Pos.get]

theorem Slice.Pos.ofSlice_next {s : Slice} {p₀ p₁ : s.Pos} {h}
    {p : (s.slice p₀ p₁ h).Pos} {h'} :
    Pos.ofSlice (p.next h') = (Pos.ofSlice p).next (ofSlice_ne_endPos h') := by
  simp only [Slice.Pos.ext_iff, Pos.Raw.ext_iff, Slice.Pos.offset_next, Slice.Pos.offset_ofSlice]
  rw [Slice.Pos.get_eq_get_ofSlice (h' := h')]
  simp [Pos.Raw.offsetBy, Nat.add_assoc]

theorem Pos.ofSlice_next {s : String} {p₀ p₁ : s.Pos} {h}
    {p : (s.slice p₀ p₁ h).Pos} {h'} :
    Pos.ofSlice (p.next h') = (Pos.ofSlice p).next (ofSlice_ne_endPos h') := by
  simp only [Pos.ext_iff, Pos.Raw.ext_iff, Slice.Pos.offset_next, Pos.offset_next,
    Pos.offset_ofSlice]
  rw [Pos.get_eq_get_ofSlice (h' := h')]
  simp [Pos.Raw.offsetBy, Nat.add_assoc]

end String

@@ -12,4 +12,3 @@ public import Init.Data.String.Lemmas.Pattern.Pred
public import Init.Data.String.Lemmas.Pattern.Char
public import Init.Data.String.Lemmas.Pattern.String
public import Init.Data.String.Lemmas.Pattern.Split
public import Init.Data.String.Lemmas.Pattern.Find

@@ -12,7 +12,6 @@ public import Init.Data.Iterators.Consumers.Collect
import all Init.Data.String.Pattern.Basic
import Init.Data.String.OrderInstances
import Init.Data.String.Lemmas.IsEmpty
import Init.Data.String.Lemmas.Basic
import Init.Data.String.Lemmas.Order
import Init.Data.String.Termination
import Init.Data.Order.Lemmas
@@ -169,24 +168,6 @@ theorem IsLongestMatchAt.eq {pat : ρ} [ForwardPatternModel pat] {s : Slice} {st
    endPos = endPos' := by
  simpa using h.isLongestMatch_sliceFrom.eq h'.isLongestMatch_sliceFrom

private theorem isLongestMatch_of_eq {pat : ρ} [ForwardPatternModel pat] {s t : Slice}
    {pos : s.Pos} {pos' : t.Pos} (h_eq : s = t) (h_pos : pos.offset = pos'.offset)
    (hm : IsLongestMatch pat pos) : IsLongestMatch pat pos' := by
  subst h_eq; exact (Slice.Pos.ext h_pos) ▸ hm

theorem isLongestMatchAt_iff_isLongestMatchAt_ofSliceFrom {pat : ρ} [ForwardPatternModel pat]
    {s : Slice} {base : s.Pos} {startPos endPos : (s.sliceFrom base).Pos} :
    IsLongestMatchAt pat startPos endPos ↔ IsLongestMatchAt pat (Pos.ofSliceFrom startPos) (Pos.ofSliceFrom endPos) := by
  constructor
  · intro h
    refine ⟨Slice.Pos.ofSliceFrom_le_ofSliceFrom_iff.mpr h.le, ?_⟩
    exact isLongestMatch_of_eq Slice.sliceFrom_sliceFrom
      (by simp [Pos.Raw.ext_iff]; omega) h.isLongestMatch_sliceFrom
  · intro h
    refine ⟨Slice.Pos.ofSliceFrom_le_ofSliceFrom_iff.mp h.le, ?_⟩
    exact isLongestMatch_of_eq Slice.sliceFrom_sliceFrom.symm
      (by simp [Pos.Raw.ext_iff]; omega) h.isLongestMatch_sliceFrom

theorem IsLongestMatch.isLongestMatchAt_ofSliceFrom {pat : ρ} [ForwardPatternModel pat] {s : Slice}
    {p₀ : s.Pos} {pos : (s.sliceFrom p₀).Pos} (h : IsLongestMatch pat pos) :
    IsLongestMatchAt pat p₀ (Slice.Pos.ofSliceFrom pos) where
@@ -217,27 +198,6 @@ theorem matchesAt_iff_exists_isMatch {pat : ρ} [ForwardPatternModel pat] {s : S
    ⟨Std.le_trans h₁ (by simpa [← Pos.ofSliceFrom_le_ofSliceFrom_iff] using hq.le_of_isMatch h₂),
      by simpa using hq⟩⟩

@[simp]
theorem not_matchesAt_endPos {pat : ρ} [ForwardPatternModel pat] {s : Slice} :
    ¬ MatchesAt pat s.endPos := by
  simp only [matchesAt_iff_exists_isMatch, Pos.endPos_le, exists_prop_eq]
  intro h
  simpa [← Pos.ofSliceFrom_inj] using h.ne_startPos

theorem matchesAt_iff_matchesAt_ofSliceFrom {pat : ρ} [ForwardPatternModel pat] {s : Slice} {base : s.Pos}
    {pos : (s.sliceFrom base).Pos} : MatchesAt pat pos ↔ MatchesAt pat (Pos.ofSliceFrom pos) := by
  simp only [matchesAt_iff_exists_isLongestMatchAt]
  constructor
  · rintro ⟨endPos, h⟩
    exact ⟨Pos.ofSliceFrom endPos, isLongestMatchAt_iff_isLongestMatchAt_ofSliceFrom.mp h⟩
  · rintro ⟨endPos, h⟩
    exact ⟨base.sliceFrom endPos (Std.le_trans Slice.Pos.le_ofSliceFrom h.le),
      isLongestMatchAt_iff_isLongestMatchAt_ofSliceFrom.mpr (by simpa using h)⟩

theorem IsLongestMatchAt.matchesAt {pat : ρ} [ForwardPatternModel pat] {s : Slice} {startPos endPos : s.Pos}
    (h : IsLongestMatchAt pat startPos endPos) : MatchesAt pat startPos where
  exists_isLongestMatchAt := ⟨_, h⟩

open Classical in
/--
Noncomputable model function returning the end point of the longest match starting at the given

@@ -10,10 +10,6 @@ public import Init.Data.String.Pattern.Char
public import Init.Data.String.Lemmas.Pattern.Basic
import Init.Data.Option.Lemmas
import Init.Data.String.Lemmas.Basic
import Init.Data.String.Lemmas.Order
import Init.Data.Order.Lemmas
import Init.Data.String.OrderInstances
import Init.Omega

public section

@@ -29,7 +25,8 @@ instance {c : Char} : NoPrefixForwardPatternModel c :=
theorem isMatch_iff {c : Char} {s : Slice} {pos : s.Pos} :
    IsMatch c pos ↔
      ∃ (h : s.startPos ≠ s.endPos), pos = s.startPos.next h ∧ s.startPos.get h = c := by
  simp only [Model.isMatch_iff, ForwardPatternModel.Matches, sliceTo_copy_eq_iff_exists_splits]
  simp only [Model.isMatch_iff, ForwardPatternModel.Matches]
  rw [sliceTo_copy_eq_iff_exists_splits]
  refine ⟨?_, ?_⟩
  · simp only [splits_singleton_iff]
    exact fun ⟨t₂, h, h₁, h₂, h₃⟩ => ⟨h, h₁, h₂⟩
@@ -41,43 +38,12 @@ theorem isLongestMatch_iff {c : Char} {s : Slice} {pos : s.Pos} :
    ∃ (h : s.startPos ≠ s.endPos), pos = s.startPos.next h ∧ s.startPos.get h = c := by
  rw [isLongestMatch_iff_isMatch, isMatch_iff]

theorem isLongestMatchAt_iff {c : Char} {s : Slice} {pos pos' : s.Pos} :
    IsLongestMatchAt c pos pos' ↔ ∃ h, pos' = pos.next h ∧ pos.get h = c := by
  simp +contextual [Model.isLongestMatchAt_iff, isLongestMatch_iff, ← Pos.ofSliceFrom_inj,
    Pos.get_eq_get_ofSliceFrom, Pos.ofSliceFrom_next]

theorem isLongestMatchAt_of_get_eq {c : Char} {s : Slice} {pos : s.Pos} {h : pos ≠ s.endPos}
    (hc : pos.get h = c) : IsLongestMatchAt c pos (pos.next h) :=
  isLongestMatchAt_iff.2 ⟨h, by simp [hc]⟩

instance {c : Char} : LawfulForwardPatternModel c where
  dropPrefix?_eq_some_iff {s} pos := by
    simp [isLongestMatch_iff, ForwardPattern.dropPrefix?, and_comm, eq_comm (b := pos)]
    simp [isLongestMatch_iff, ForwardPattern.dropPrefix?]
    exact ⟨fun ⟨h, h₁, h₂⟩ => ⟨h, h₂.symm, h₁⟩, fun ⟨h, h₁, h₂⟩ => ⟨h, h₂, h₁.symm⟩⟩

instance {c : Char} : LawfulToForwardSearcherModel c :=
  .defaultImplementation

theorem matchesAt_iff {c : Char} {s : Slice} {pos : s.Pos} :
    MatchesAt c pos ↔ ∃ (h : pos ≠ s.endPos), pos.get h = c := by
  simp [matchesAt_iff_exists_isLongestMatchAt, isLongestMatchAt_iff, exists_comm]

theorem matchesAt_iff_splits {c : Char} {s : Slice} {pos : s.Pos} :
    MatchesAt c pos ↔ ∃ t₁ t₂, pos.Splits t₁ (singleton c ++ t₂) := by
  rw [matchesAt_iff]
  refine ⟨?_, ?_⟩
  · rintro ⟨h, rfl⟩
    exact ⟨_, _, pos.splits_next_right h⟩
  · rintro ⟨t₁, t₂, hs⟩
    have hne := hs.ne_endPos_of_singleton
    exact ⟨hne, (singleton_append_inj.mp (hs.eq_right (pos.splits_next_right hne))).1.symm⟩

theorem not_matchesAt_of_get_ne {c : Char} {s : Slice} {pos : s.Pos} {h : pos ≠ s.endPos}
    (hc : pos.get h ≠ c) : ¬ MatchesAt c pos := by
  simp [matchesAt_iff, hc]

theorem matchAt?_eq {s : Slice} {pos : s.Pos} {c : Char} :
    matchAt? c pos =
      if h₀ : ∃ (h : pos ≠ s.endPos), pos.get h = c then some (pos.next h₀.1) else none := by
  split <;> simp_all [isLongestMatchAt_iff, matchesAt_iff]

end String.Slice.Pattern.Model.Char

@@ -1,11 +0,0 @@
/-
Copyright (c) 2026 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Markus Himmel
-/
module

prelude
public import Init.Data.String.Lemmas.Pattern.Find.Basic
public import Init.Data.String.Lemmas.Pattern.Find.Char
public import Init.Data.String.Lemmas.Pattern.Find.Pred
@@ -1,129 +0,0 @@
/-
Copyright (c) 2026 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Markus Himmel
-/
module

prelude
public import Init.Data.String.Slice
public import Init.Data.String.Search
public import Init.Data.String.Lemmas.Pattern.Basic
import all Init.Data.String.Slice
import all Init.Data.String.Search
import Init.Data.Iterators.Lemmas.Consumers.Loop
import Init.Data.String.Lemmas.Order
import Init.Data.String.Lemmas.Basic
import Init.Data.String.OrderInstances
import Init.Grind

public section

open Std String.Slice Pattern Pattern.Model

namespace String.Slice

theorem Pattern.Model.find?_eq_some_iff {ρ : Type} (pat : ρ) [ForwardPatternModel pat] {σ : Slice → Type}
    [∀ s, Iterator (σ s) Id (SearchStep s)] [∀ s, Iterators.Finite (σ s) Id]
    [∀ s, IteratorLoop (σ s) Id Id] [∀ s, LawfulIteratorLoop (σ s) Id Id]
    [ToForwardSearcher pat σ] [LawfulToForwardSearcherModel pat] {s : Slice} {pos : s.Pos} :
    s.find? pat = some pos ↔ MatchesAt pat pos ∧ (∀ pos', pos' < pos → ¬ MatchesAt pat pos') := by
  rw [find?, ← Iter.findSome?_toList]
  suffices ∀ (l : List (SearchStep s)) (pos : s.Pos) (hl : IsValidSearchFrom pat pos l) (pos' : s.Pos),
      l.findSome? (fun | .matched s _ => some s | .rejected .. => none) = some pos' ↔
        pos ≤ pos' ∧ MatchesAt pat pos' ∧ ∀ pos'', pos ≤ pos'' → pos'' < pos' → ¬ MatchesAt pat pos'' by
    simpa using this (ToForwardSearcher.toSearcher pat s).toList s.startPos
      (LawfulToForwardSearcherModel.isValidSearchFrom_toList s) pos
  intro l pos hl pos'
  induction hl with
  | endPos => simp +contextual
  | matched h₁ _ _ => have := h₁.matchesAt; grind
  | mismatched => grind

theorem Pattern.Model.find?_eq_none_iff {ρ : Type} (pat : ρ) [ForwardPatternModel pat] {σ : Slice → Type}
    [∀ s, Iterator (σ s) Id (SearchStep s)] [∀ s, Iterators.Finite (σ s) Id]
    [∀ s, IteratorLoop (σ s) Id Id] [∀ s, LawfulIteratorLoop (σ s) Id Id]
    [ToForwardSearcher pat σ] [LawfulToForwardSearcherModel pat] {s : Slice} :
    s.find? pat = none ↔ ∀ (pos : s.Pos), ¬ MatchesAt pat pos := by
  simp only [Option.eq_none_iff_forall_ne_some, ne_eq, find?_eq_some_iff, not_and,
    Classical.not_forall, Classical.not_not]
  refine ⟨fun _ pos => ?_, by grind⟩
  induction pos using WellFounded.induction Pos.wellFounded_lt with grind

@[simp]
theorem isSome_find? {ρ : Type} (pat : ρ) {σ : Slice → Type}
    [∀ s, Iterator (σ s) Id (SearchStep s)] [∀ s, Iterators.Finite (σ s) Id]
    [∀ s, IteratorLoop (σ s) Id Id] [∀ s, LawfulIteratorLoop (σ s) Id Id]
    [ToForwardSearcher pat σ] {s : Slice} : (s.find? pat).isSome = s.contains pat := by
  rw [find?, contains, ← Iter.findSome?_toList, ← Iter.any_toList]
  induction (ToForwardSearcher.toSearcher pat s).toList <;> grind

@[simp]
theorem find?_eq_none_iff {ρ : Type} (pat : ρ) {σ : Slice → Type}
    [∀ s, Iterator (σ s) Id (SearchStep s)] [∀ s, Iterators.Finite (σ s) Id]
    [∀ s, IteratorLoop (σ s) Id Id] [∀ s, LawfulIteratorLoop (σ s) Id Id]
    [ToForwardSearcher pat σ] {s : Slice} : s.find? pat = none ↔ s.contains pat = false := by
  rw [← Option.isNone_iff_eq_none, ← Option.isSome_eq_false_iff, isSome_find?]

theorem Pattern.Model.contains_eq_false_iff {ρ : Type} (pat : ρ) [ForwardPatternModel pat] {σ : Slice → Type}
    [∀ s, Iterator (σ s) Id (SearchStep s)] [∀ s, Iterators.Finite (σ s) Id]
    [∀ s, IteratorLoop (σ s) Id Id] [∀ s, LawfulIteratorLoop (σ s) Id Id]
    [ToForwardSearcher pat σ] [LawfulToForwardSearcherModel pat] {s : Slice} :
    s.contains pat = false ↔ ∀ (pos : s.Pos), ¬ MatchesAt pat pos := by
  rw [← find?_eq_none_iff, Slice.find?_eq_none_iff]

theorem Pattern.Model.contains_eq_true_iff {ρ : Type} (pat : ρ) [ForwardPatternModel pat] {σ : Slice → Type}
    [∀ s, Iterator (σ s) Id (SearchStep s)] [∀ s, Iterators.Finite (σ s) Id]
    [∀ s, IteratorLoop (σ s) Id Id] [∀ s, LawfulIteratorLoop (σ s) Id Id]
    [ToForwardSearcher pat σ] [LawfulToForwardSearcherModel pat] {s : Slice} :
    s.contains pat ↔ ∃ (pos : s.Pos), MatchesAt pat pos := by
  simp [← Bool.not_eq_false, contains_eq_false_iff]

theorem Pos.find?_eq_find?_sliceFrom {ρ : Type} {pat : ρ} {σ : Slice → Type}
    [∀ s, Iterator (σ s) Id (SearchStep s)] [∀ s, IteratorLoop (σ s) Id Id] [ToForwardSearcher pat σ]
    {s : Slice} {p : s.Pos} :
    p.find? pat = ((s.sliceFrom p).find? pat).map Pos.ofSliceFrom :=
  (rfl)

theorem Pattern.Model.posFind?_eq_some_iff {ρ : Type} {pat : ρ} [ForwardPatternModel pat] {σ : Slice → Type}
    [∀ s, Iterator (σ s) Id (SearchStep s)] [∀ s, Iterators.Finite (σ s) Id]
    [∀ s, IteratorLoop (σ s) Id Id] [∀ s, LawfulIteratorLoop (σ s) Id Id]
    [ToForwardSearcher pat σ] [LawfulToForwardSearcherModel pat] {s : Slice} {pos pos' : s.Pos} :
    pos.find? pat = some pos' ↔ pos ≤ pos' ∧ MatchesAt pat pos' ∧ (∀ pos'', pos ≤ pos'' → pos'' < pos' → ¬ MatchesAt pat pos'') := by
  simp only [Pos.find?_eq_find?_sliceFrom, Option.map_eq_some_iff, find?_eq_some_iff,
    matchesAt_iff_matchesAt_ofSliceFrom]
  refine ⟨?_, ?_⟩
  · rintro ⟨pos', ⟨h₁, h₂⟩, rfl⟩
    refine ⟨Pos.le_ofSliceFrom, h₁, fun p hp₁ hp₂ => ?_⟩
    simpa using h₂ (Pos.sliceFrom _ _ hp₁) (Pos.sliceFrom_lt_iff.2 hp₂)
  · rintro ⟨h₁, h₂, h₃⟩
    refine ⟨Pos.sliceFrom _ _ h₁, ⟨by simpa using h₂, fun p hp₁ hp₂ => ?_⟩, by simp⟩
    exact h₃ (Pos.ofSliceFrom p) Slice.Pos.le_ofSliceFrom (Pos.lt_sliceFrom_iff.1 hp₁) hp₂

theorem Pattern.Model.posFind?_eq_none_iff {ρ : Type} {pat : ρ} [ForwardPatternModel pat] {σ : Slice → Type}
    [∀ s, Iterator (σ s) Id (SearchStep s)] [∀ s, Iterators.Finite (σ s) Id]
    [∀ s, IteratorLoop (σ s) Id Id] [∀ s, LawfulIteratorLoop (σ s) Id Id]
    [ToForwardSearcher pat σ] [LawfulToForwardSearcherModel pat] {s : Slice} {pos : s.Pos} :
    pos.find? pat = none ↔ ∀ pos', pos ≤ pos' → ¬ MatchesAt pat pos' := by
  rw [Pos.find?_eq_find?_sliceFrom, Option.map_eq_none_iff, Pattern.Model.find?_eq_none_iff]
  simpa only [matchesAt_iff_matchesAt_ofSliceFrom] using ⟨fun h p hp =>
    by simpa using h (Pos.sliceFrom _ _ hp), fun h p => by simpa using h _ Pos.le_ofSliceFrom⟩

end Slice

theorem Pos.find?_eq_find?_toSlice {ρ : Type} {pat : ρ} {σ : Slice → Type}
    [∀ s, Iterator (σ s) Id (SearchStep s)] [∀ s, IteratorLoop (σ s) Id Id] [ToForwardSearcher pat σ]
    {s : String} {p : s.Pos} : p.find? pat = (p.toSlice.find? pat).map Pos.ofToSlice :=
  (rfl)

theorem find?_eq_find?_toSlice {ρ : Type} {pat : ρ} {σ : Slice → Type}
    [∀ s, Iterator (σ s) Id (SearchStep s)] [∀ s, IteratorLoop (σ s) Id Id] [ToForwardSearcher pat σ]
    {s : String} : s.find? pat = (s.toSlice.find? pat).map Pos.ofToSlice :=
  (rfl)

theorem contains_eq_contains_toSlice {ρ : Type} {pat : ρ} {σ : Slice → Type}
    [∀ s, Iterator (σ s) Id (SearchStep s)] [∀ s, IteratorLoop (σ s) Id Id] [ToForwardSearcher pat σ]
    {s : String} : s.contains pat = s.toSlice.contains pat :=
  (rfl)

end String
@@ -1,174 +0,0 @@
/-
Copyright (c) 2026 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Markus Himmel
-/
module

prelude
public import Init.Data.String.Slice
import Init.Data.String.Lemmas.Pattern.Find.Basic
import Init.Data.String.Lemmas.Pattern.Char
import Init.Data.String.Lemmas.Basic
import Init.Data.String.Lemmas.Order
import Init.Data.String.Termination
import Init.Data.String.Lemmas.Iterate
import Init.Grind
import Init.Data.Option.Lemmas
import Init.Data.String.OrderInstances

namespace String.Slice

theorem find?_char_eq_some_iff {c : Char} {s : Slice} {pos : s.Pos} :
    s.find? c = some pos ↔
      ∃ h, pos.get h = c ∧ ∀ pos', (h' : pos' < pos) → pos'.get (Pos.ne_endPos_of_lt h') ≠ c := by
  grind [Pattern.Model.find?_eq_some_iff, Pattern.Model.Char.matchesAt_iff]

@[simp]
theorem contains_char_eq {c : Char} {s : Slice} : s.contains c = decide (c ∈ s.copy.toList) := by
  rw [Bool.eq_iff_iff, Pattern.Model.contains_eq_true_iff]
  simp [Pattern.Model.Char.matchesAt_iff, mem_toList_copy_iff_exists_get]

theorem find?_char_eq_some_iff_splits {c : Char} {s : Slice} {pos : s.Pos} :
    s.find? c = some pos ↔ ∃ t u, pos.Splits t (singleton c ++ u) ∧ c ∉ t.toList := by
  rw [find?_char_eq_some_iff]
  refine ⟨?_, ?_⟩
  · rintro ⟨h, hget, hmin⟩
    refine ⟨_, _, hget ▸ pos.splits_next_right h, fun hmem => ?_⟩
    obtain ⟨pos', hlt, hpget⟩ := (hget ▸ pos.splits_next_right h).mem_toList_left_iff.mp hmem
    exact absurd hpget (hmin _ hlt)
  · rintro ⟨t, u, hs, hnotin⟩
    have hne := hs.ne_endPos_of_singleton
    exact ⟨hne, (singleton_append_inj.mp (hs.eq_right (pos.splits_next_right hne))).1.symm,
      fun pos' hlt hget => hnotin (hs.mem_toList_left_iff.mpr ⟨pos', hlt, hget⟩)⟩

theorem Pos.find?_char_eq_some_iff {c : Char} {s : Slice} {pos pos' : s.Pos} :
|
||||
pos.find? c = some pos' ↔
|
||||
pos ≤ pos' ∧ (∃ h, pos'.get h = c) ∧
|
||||
∀ pos'', pos ≤ pos'' → (h' : pos'' < pos') → pos''.get (Pos.ne_endPos_of_lt h') ≠ c := by
|
||||
grind [Pattern.Model.posFind?_eq_some_iff, Pattern.Model.Char.matchesAt_iff]
|
||||
|
||||
theorem Pos.find?_char_eq_some_iff_splits {c : Char} {s : Slice} {pos : s.Pos}
|
||||
{t u : String} (hs : pos.Splits t u) {pos' : s.Pos} :
|
||||
pos.find? c = some pos' ↔ ∃ v w, pos'.Splits (t ++ v) (singleton c ++ w) ∧ c ∉ v.toList := by
|
||||
rw [Pos.find?_char_eq_some_iff]
|
||||
refine ⟨?_, ?_⟩
|
||||
· rintro ⟨hle, ⟨hne, hget⟩, hmin⟩
|
||||
have hsplit := hget ▸ pos'.splits_next_right hne
|
||||
obtain ⟨v, hv1, hv2⟩ := (hs.le_iff_exists_eq_append hsplit).mp hle
|
||||
refine ⟨v, _, hsplit.of_eq hv1 rfl, fun hmem => ?_⟩
|
||||
obtain ⟨_, hcopy⟩ :=
|
||||
Slice.copy_slice_eq_iff_splits.mpr ⟨t, _, hs.of_eq rfl hv2, hsplit.of_eq hv1 rfl⟩
|
||||
rw [← hcopy] at hmem
|
||||
obtain ⟨p, hp, hpget⟩ := mem_toList_copy_iff_exists_get.mp hmem
|
||||
have hlt : Pos.ofSlice p < pos' := by
|
||||
simpa [← Slice.Pos.lt_endPos_iff, ← Pos.ofSlice_lt_ofSlice_iff] using hp
|
||||
exact absurd (Pos.get_eq_get_ofSlice ▸ hpget) (hmin _ Pos.le_ofSlice hlt)
|
||||
· rintro ⟨v, w, hsplit, hnotin⟩
|
||||
have hne := hsplit.ne_endPos_of_singleton
|
||||
have hu : u = v ++ (singleton c ++ w) :=
|
||||
append_right_inj t |>.mp (hs.eq_append.symm.trans (by rw [hsplit.eq_append, append_assoc]))
|
||||
have hle : pos ≤ pos' := (hs.le_iff_exists_eq_append hsplit).mpr ⟨v, rfl, hu⟩
|
||||
refine ⟨hle,
|
||||
⟨hne, (singleton_append_inj.mp (hsplit.eq_right (pos'.splits_next_right hne))).1.symm⟩,
|
||||
fun pos'' hle' hlt hget => hnotin ?_⟩
|
||||
obtain ⟨_, hcopy⟩ :=
|
||||
Slice.copy_slice_eq_iff_splits.mpr ⟨t, _, hs.of_eq rfl hu, hsplit⟩
|
||||
rw [← hcopy]
|
||||
exact mem_toList_copy_iff_exists_get.mpr
|
||||
⟨pos''.slice pos pos' hle' (Std.le_of_lt hlt),
|
||||
fun h => Std.ne_of_lt hlt
|
||||
(by rw [← Slice.Pos.ofSlice_slice (h₁ := hle') (h₂ := Std.le_of_lt hlt), h,
|
||||
Slice.Pos.ofSlice_endPos]),
|
||||
by rw [Slice.Pos.get_eq_get_ofSlice]
|
||||
simp [Slice.Pos.ofSlice_slice]
|
||||
exact hget⟩
|
||||
|
||||
theorem Pos.find?_char_eq_none_iff {c : Char} {s : Slice} {pos : s.Pos} :
|
||||
pos.find? c = none ↔ ∀ pos', pos ≤ pos' → (h : pos' ≠ s.endPos) → pos'.get h ≠ c := by
|
||||
grind [Pattern.Model.posFind?_eq_none_iff, Pattern.Model.Char.matchesAt_iff]
|
||||
|
||||
theorem Pos.find?_char_eq_none_iff_not_mem_of_splits {c : Char} {s : Slice} {pos : s.Pos}
|
||||
{t u : String} (hs : pos.Splits t u) :
|
||||
pos.find? c = none ↔ c ∉ u.toList := by
|
||||
simp [Pos.find?_char_eq_none_iff, hs.mem_toList_right_iff]
|
||||
|
||||
end Slice
|
||||
|
||||
theorem Pos.find?_char_eq_some_iff {c : Char} {s : String} {pos pos' : s.Pos} :
|
||||
pos.find? c = some pos' ↔
|
||||
pos ≤ pos' ∧ (∃ h, pos'.get h = c) ∧
|
||||
∀ pos'', pos ≤ pos'' → (h' : pos'' < pos') → pos''.get (Pos.ne_endPos_of_lt h') ≠ c := by
|
||||
simp only [Pos.find?_eq_find?_toSlice, Option.map_eq_some_iff,
|
||||
Slice.Pos.find?_char_eq_some_iff, ne_eq, endPos_toSlice]
|
||||
refine ⟨?_, ?_⟩
|
||||
· rintro ⟨pos', ⟨h₁, ⟨h₂, rfl⟩, h₃⟩, rfl⟩
|
||||
refine ⟨by simpa [Pos.ofToSlice_le_iff] using h₁,
|
||||
⟨by simpa [← Pos.ofToSlice_inj] using h₂, by simp [Pos.get_ofToSlice]⟩, ?_⟩
|
||||
intro pos'' h₄ h₅
|
||||
simpa using h₃ pos''.toSlice (by simpa [Pos.toSlice_le] using h₄) (by simpa using h₅)
|
||||
· rintro ⟨h₁, ⟨h₂, hget⟩, h₃⟩
|
||||
refine ⟨pos'.toSlice, ⟨by simpa [Pos.toSlice_le] using h₁,
|
||||
⟨by simpa [← Pos.toSlice_inj] using h₂, by simpa using hget⟩, fun p hp₁ hp₂ => ?_⟩,
|
||||
by simp⟩
|
||||
simpa using h₃ (Pos.ofToSlice p)
|
||||
(by simpa [Pos.ofToSlice_le_iff] using hp₁) (by simpa using hp₂)
|
||||
|
||||
theorem Pos.find?_char_eq_some_iff_splits {c : Char} {s : String} {pos : s.Pos}
|
||||
{t u : String} (hs : pos.Splits t u) {pos' : s.Pos} :
|
||||
pos.find? c = some pos' ↔ ∃ v w, pos'.Splits (t ++ v) (singleton c ++ w) ∧ c ∉ v.toList := by
|
||||
simp only [Pos.find?_eq_find?_toSlice, Option.map_eq_some_iff,
|
||||
Slice.Pos.find?_char_eq_some_iff_splits (Pos.splits_toSlice_iff.mpr hs)]
|
||||
constructor
|
||||
· rintro ⟨q, ⟨v, w, hsplit, hnotin⟩, rfl⟩
|
||||
exact ⟨v, w, Slice.Pos.splits_ofToSlice_iff.mpr hsplit, hnotin⟩
|
||||
· rintro ⟨v, w, hsplit, hnotin⟩
|
||||
exact ⟨pos'.toSlice, ⟨v, w, Pos.splits_toSlice_iff.mpr hsplit, hnotin⟩, by simp⟩
|
||||
|
||||
theorem Pos.find?_char_eq_none_iff {c : Char} {s : String} {pos : s.Pos} :
|
||||
pos.find? c = none ↔ ∀ pos', pos ≤ pos' → (h : pos' ≠ s.endPos) → pos'.get h ≠ c := by
|
||||
simp only [Pos.find?_eq_find?_toSlice, Option.map_eq_none_iff,
|
||||
Slice.Pos.find?_char_eq_none_iff, endPos_toSlice]
|
||||
refine ⟨?_, ?_⟩
|
||||
· intro h pos' h₁ h₂
|
||||
simpa [Pos.get_ofToSlice] using
|
||||
h pos'.toSlice (by simpa [Pos.toSlice_le] using h₁) (by simpa [← Pos.toSlice_inj] using h₂)
|
||||
· intro h pos' h₁ h₂
|
||||
simpa using h (Pos.ofToSlice pos')
|
||||
(by simpa [Pos.ofToSlice_le_iff] using h₁) (by simpa [← Pos.ofToSlice_inj] using h₂)
|
||||
|
||||
theorem Pos.find?_char_eq_none_iff_not_mem_of_splits {c : Char} {s : String} {pos : s.Pos}
|
||||
{t u : String} (hs : pos.Splits t u) :
|
||||
pos.find? c = none ↔ c ∉ u.toList := by
|
||||
rw [Pos.find?_eq_find?_toSlice, Option.map_eq_none_iff]
|
||||
exact Slice.Pos.find?_char_eq_none_iff_not_mem_of_splits (Pos.splits_toSlice_iff.mpr hs)
|
||||
|
||||
theorem find?_char_eq_some_iff {c : Char} {s : String} {pos : s.Pos} :
|
||||
s.find? c = some pos ↔
|
||||
∃ h, pos.get h = c ∧ ∀ pos', (h' : pos' < pos) → pos'.get (Pos.ne_endPos_of_lt h') ≠ c := by
|
||||
simp only [find?_eq_find?_toSlice, Option.map_eq_some_iff, Slice.find?_char_eq_some_iff, ne_eq,
|
||||
endPos_toSlice, exists_and_right]
|
||||
refine ⟨?_, ?_⟩
|
||||
· rintro ⟨pos, ⟨⟨h, rfl⟩, h'⟩, rfl⟩
|
||||
refine ⟨⟨by simpa [← Pos.ofToSlice_inj] using h, by simp [Pos.get_ofToSlice]⟩, ?_⟩
|
||||
intro pos' hp
|
||||
simpa using h' pos'.toSlice hp
|
||||
· rintro ⟨⟨h, hget⟩, hmin⟩
|
||||
exact ⟨pos.toSlice, ⟨⟨by simpa [← Pos.toSlice_inj] using h, by simpa using hget⟩,
|
||||
fun pos' hp => by simpa using hmin (Pos.ofToSlice pos') hp⟩, by simp⟩
|
||||
|
||||
theorem find?_char_eq_some_iff_splits {c : Char} {s : String} {pos : s.Pos} :
|
||||
s.find? c = some pos ↔ ∃ t u, pos.Splits t (singleton c ++ u) ∧ c ∉ t.toList := by
|
||||
simp only [find?_eq_find?_toSlice, Option.map_eq_some_iff,
|
||||
Slice.find?_char_eq_some_iff_splits]
|
||||
constructor
|
||||
· rintro ⟨q, ⟨t, u, hsplit, hnotin⟩, rfl⟩
|
||||
exact ⟨t, u, Slice.Pos.splits_ofToSlice_iff.mpr hsplit, hnotin⟩
|
||||
· rintro ⟨t, u, hsplit, hnotin⟩
|
||||
exact ⟨pos.toSlice, ⟨t, u, Pos.splits_toSlice_iff.mpr hsplit, hnotin⟩, by simp⟩
|
||||
|
||||
@[simp]
|
||||
theorem contains_char_eq {c : Char} {s : String} : s.contains c = decide (c ∈ s.toList) := by
|
||||
simp [contains_eq_contains_toSlice, Slice.contains_char_eq, copy_toSlice]
|
||||
|
||||
end String
|
||||
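As a concrete consequence of `contains_char_eq` in the file above, character containment reduces to list membership. A hedged sketch (illustrative only, not part of the diff; assumes the `String` namespace and this file's imports are in scope):

```lean
-- Illustrative only: once `contains` is rewritten to `decide (c ∈ s.toList)`,
-- a membership hypothesis closes the goal.
example {c : Char} {s : String} (h : c ∈ s.toList) : s.contains c = true := by
  simp [contains_char_eq, h]
```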
@@ -1,367 +0,0 @@
/-
Copyright (c) 2026 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Markus Himmel
-/
module

prelude
public import Init.Data.String.Slice
import Init.Data.String.Lemmas.Pattern.Find.Basic
import Init.Data.String.Lemmas.Pattern.Pred
import Init.Data.String.Lemmas.Basic
import Init.Data.String.Lemmas.Order
import Init.Data.String.Termination
import Init.Data.String.Lemmas.Iterate
import Init.Grind
import Init.Data.Option.Lemmas
import Init.Data.String.OrderInstances

namespace String.Slice

theorem find?_bool_eq_some_iff {p : Char → Bool} {s : Slice} {pos : s.Pos} :
    s.find? p = some pos ↔
      ∃ h, p (pos.get h) ∧ ∀ pos', (h' : pos' < pos) → p (pos'.get (Pos.ne_endPos_of_lt h')) = false := by
  grind [Pattern.Model.find?_eq_some_iff, Pattern.Model.CharPred.matchesAt_iff]

theorem find?_prop_eq_some_iff {p : Char → Prop} [DecidablePred p] {s : Slice} {pos : s.Pos} :
    s.find? p = some pos ↔
      ∃ h, p (pos.get h) ∧ ∀ pos', (h' : pos' < pos) → ¬ p (pos'.get (Pos.ne_endPos_of_lt h')) := by
  grind [Pattern.Model.find?_eq_some_iff, Pattern.Model.CharPred.Decidable.matchesAt_iff]

theorem find?_bool_eq_some_iff_splits {p : Char → Bool} {s : Slice} {pos : s.Pos} :
    s.find? p = some pos ↔
      ∃ t c u, pos.Splits t (singleton c ++ u) ∧ p c ∧ ∀ d ∈ t.toList, p d = false := by
  rw [find?_bool_eq_some_iff]
  refine ⟨?_, ?_⟩
  · rintro ⟨h, hp, hmin⟩
    exact ⟨_, _, _, pos.splits_next_right h, hp, fun d hd => by
      obtain ⟨pos', hlt, hpget⟩ := (pos.splits_next_right h).mem_toList_left_iff.mp hd
      subst hpget; exact hmin _ hlt⟩
  · rintro ⟨t, c, u, hs, hpc, hmin⟩
    have hne := hs.ne_endPos_of_singleton
    refine ⟨hne, ?_, fun pos' hlt => hmin _ (hs.mem_toList_left_iff.mpr ⟨pos', hlt, rfl⟩)⟩
    rw [(singleton_append_inj.mp (hs.eq_right (pos.splits_next_right hne))).1.symm]
    exact hpc

theorem find?_prop_eq_some_iff_splits {p : Char → Prop} [DecidablePred p] {s : Slice}
    {pos : s.Pos} :
    s.find? p = some pos ↔
      ∃ t c u, pos.Splits t (singleton c ++ u) ∧ p c ∧ ∀ d ∈ t.toList, ¬ p d := by
  rw [find?_prop_eq_some_iff]
  refine ⟨?_, ?_⟩
  · rintro ⟨h, hp, hmin⟩
    exact ⟨_, _, _, pos.splits_next_right h, hp, fun d hd => by
      obtain ⟨pos', hlt, hpget⟩ := (pos.splits_next_right h).mem_toList_left_iff.mp hd
      subst hpget; exact hmin _ hlt⟩
  · rintro ⟨t, c, u, hs, hpc, hmin⟩
    have hne := hs.ne_endPos_of_singleton
    refine ⟨hne, ?_, fun pos' hlt => hmin _ (hs.mem_toList_left_iff.mpr ⟨pos', hlt, rfl⟩)⟩
    rw [(singleton_append_inj.mp (hs.eq_right (pos.splits_next_right hne))).1.symm]
    exact hpc

@[simp]
theorem contains_bool_eq {p : Char → Bool} {s : Slice} : s.contains p = s.copy.toList.any p := by
  rw [Bool.eq_iff_iff, Pattern.Model.contains_eq_true_iff]
  simp only [Pattern.Model.CharPred.matchesAt_iff, ne_eq, List.any_eq_true,
    mem_toList_copy_iff_exists_get]
  exact ⟨fun ⟨pos, h, hp⟩ => ⟨_, ⟨_, _, rfl⟩, hp⟩, fun ⟨_, ⟨p, h, h'⟩, hp⟩ => ⟨p, h, h' ▸ hp⟩⟩

@[simp]
theorem contains_prop_eq {p : Char → Prop} [DecidablePred p] {s : Slice} :
    s.contains p = s.copy.toList.any p := by
  rw [Bool.eq_iff_iff, Pattern.Model.contains_eq_true_iff]
  simp only [Pattern.Model.CharPred.Decidable.matchesAt_iff, ne_eq, List.any_eq_true,
    mem_toList_copy_iff_exists_get, decide_eq_true_eq]
  exact ⟨fun ⟨pos, h, hp⟩ => ⟨_, ⟨_, _, rfl⟩, hp⟩, fun ⟨_, ⟨p, h, h'⟩, hp⟩ => ⟨p, h, h' ▸ hp⟩⟩

theorem Pos.find?_bool_eq_some_iff {p : Char → Bool} {s : Slice} {pos pos' : s.Pos} :
    pos.find? p = some pos' ↔
      pos ≤ pos' ∧ (∃ h, p (pos'.get h)) ∧
        ∀ pos'', pos ≤ pos'' → (h' : pos'' < pos') →
          p (pos''.get (Pos.ne_endPos_of_lt h')) = false := by
  grind [Pattern.Model.posFind?_eq_some_iff, Pattern.Model.CharPred.matchesAt_iff]

theorem Pos.find?_bool_eq_some_iff_splits {p : Char → Bool} {s : Slice} {pos : s.Pos}
    {t u : String} (hs : pos.Splits t u) {pos' : s.Pos} :
    pos.find? p = some pos' ↔
      ∃ v c w, pos'.Splits (t ++ v) (singleton c ++ w) ∧ p c ∧
        ∀ d ∈ v.toList, p d = false := by
  rw [Pos.find?_bool_eq_some_iff]
  refine ⟨?_, ?_⟩
  · rintro ⟨hle, ⟨hne, hp⟩, hmin⟩
    have hsplit := pos'.splits_next_right hne
    obtain ⟨v, hv1, hv2⟩ := (hs.le_iff_exists_eq_append hsplit).mp hle
    refine ⟨v, pos'.get hne, _, hsplit.of_eq hv1 rfl, hp, fun d hd => ?_⟩
    obtain ⟨_, hcopy⟩ :=
      Slice.copy_slice_eq_iff_splits.mpr ⟨t, _, hs.of_eq rfl hv2, hsplit.of_eq hv1 rfl⟩
    rw [← hcopy] at hd
    obtain ⟨q, hq, hqget⟩ := mem_toList_copy_iff_exists_get.mp hd
    have hlt : Pos.ofSlice q < pos' := by
      simpa [← Slice.Pos.lt_endPos_iff, ← Pos.ofSlice_lt_ofSlice_iff] using hq
    subst hqget; rw [Slice.Pos.get_eq_get_ofSlice]; exact hmin _ Pos.le_ofSlice hlt
  · rintro ⟨v, c, w, hsplit, hpc, hmin⟩
    have hne := hsplit.ne_endPos_of_singleton
    have hu : u = v ++ (singleton c ++ w) :=
      append_right_inj t |>.mp (hs.eq_append.symm.trans (by rw [hsplit.eq_append, append_assoc]))
    have hle : pos ≤ pos' := (hs.le_iff_exists_eq_append hsplit).mpr ⟨v, rfl, hu⟩
    refine ⟨hle, ⟨hne, ?_⟩, fun pos'' hle' hlt => hmin _ ?_⟩
    · rw [(singleton_append_inj.mp (hsplit.eq_right (pos'.splits_next_right hne))).1.symm]
      exact hpc
    · obtain ⟨_, hcopy⟩ :=
        Slice.copy_slice_eq_iff_splits.mpr ⟨t, _, hs.of_eq rfl hu, hsplit⟩
      rw [← hcopy]
      exact mem_toList_copy_iff_exists_get.mpr
        ⟨pos''.slice pos pos' hle' (Std.le_of_lt hlt),
          fun h => Std.ne_of_lt hlt
            (by rw [← Slice.Pos.ofSlice_slice (h₁ := hle') (h₂ := Std.le_of_lt hlt), h,
              Slice.Pos.ofSlice_endPos]),
          by rw [Slice.Pos.get_eq_get_ofSlice]
             simp [Slice.Pos.ofSlice_slice]⟩

theorem Pos.find?_bool_eq_none_iff {p : Char → Bool} {s : Slice} {pos : s.Pos} :
    pos.find? p = none ↔
      ∀ pos', pos ≤ pos' → (h : pos' ≠ s.endPos) → p (pos'.get h) = false := by
  grind [Pattern.Model.posFind?_eq_none_iff, Pattern.Model.CharPred.matchesAt_iff]

theorem Pos.find?_bool_eq_none_iff_of_splits {p : Char → Bool} {s : Slice} {pos : s.Pos}
    {t u : String} (hs : pos.Splits t u) :
    pos.find? p = none ↔ ∀ c ∈ u.toList, p c = false := by
  rw [Pos.find?_bool_eq_none_iff]
  constructor
  · intro h c hc
    obtain ⟨pos', hle, hne, hget⟩ := hs.mem_toList_right_iff.mp hc
    subst hget; exact h pos' hle hne
  · intro h pos' hle hne
    exact h _ (hs.mem_toList_right_iff.mpr ⟨pos', hle, hne, rfl⟩)

theorem Pos.find?_prop_eq_some_iff {p : Char → Prop} [DecidablePred p] {s : Slice}
    {pos pos' : s.Pos} :
    pos.find? p = some pos' ↔
      pos ≤ pos' ∧ (∃ h, p (pos'.get h)) ∧
        ∀ pos'', pos ≤ pos'' → (h' : pos'' < pos') →
          ¬ p (pos''.get (Pos.ne_endPos_of_lt h')) := by
  grind [Pattern.Model.posFind?_eq_some_iff, Pattern.Model.CharPred.Decidable.matchesAt_iff]

theorem Pos.find?_prop_eq_some_iff_splits {p : Char → Prop} [DecidablePred p] {s : Slice}
    {pos : s.Pos} {t u : String} (hs : pos.Splits t u) {pos' : s.Pos} :
    pos.find? p = some pos' ↔
      ∃ v c w, pos'.Splits (t ++ v) (singleton c ++ w) ∧ p c ∧ ∀ d ∈ v.toList, ¬ p d := by
  rw [Pos.find?_prop_eq_some_iff]
  refine ⟨?_, ?_⟩
  · rintro ⟨hle, ⟨hne, hp⟩, hmin⟩
    have hsplit := pos'.splits_next_right hne
    obtain ⟨v, hv1, hv2⟩ := (hs.le_iff_exists_eq_append hsplit).mp hle
    refine ⟨v, pos'.get hne, _, hsplit.of_eq hv1 rfl, hp, fun d hd => ?_⟩
    obtain ⟨_, hcopy⟩ :=
      Slice.copy_slice_eq_iff_splits.mpr ⟨t, _, hs.of_eq rfl hv2, hsplit.of_eq hv1 rfl⟩
    rw [← hcopy] at hd
    obtain ⟨q, hq, hqget⟩ := mem_toList_copy_iff_exists_get.mp hd
    have hlt : Pos.ofSlice q < pos' := by
      simpa [← Slice.Pos.lt_endPos_iff, ← Pos.ofSlice_lt_ofSlice_iff] using hq
    subst hqget; rw [Slice.Pos.get_eq_get_ofSlice]; exact hmin _ Pos.le_ofSlice hlt
  · rintro ⟨v, c, w, hsplit, hpc, hmin⟩
    have hne := hsplit.ne_endPos_of_singleton
    have hu : u = v ++ (singleton c ++ w) :=
      append_right_inj t |>.mp (hs.eq_append.symm.trans (by rw [hsplit.eq_append, append_assoc]))
    have hle : pos ≤ pos' := (hs.le_iff_exists_eq_append hsplit).mpr ⟨v, rfl, hu⟩
    refine ⟨hle, ⟨hne, ?_⟩, fun pos'' hle' hlt => hmin _ ?_⟩
    · rw [(singleton_append_inj.mp (hsplit.eq_right (pos'.splits_next_right hne))).1.symm]
      exact hpc
    · obtain ⟨_, hcopy⟩ :=
        Slice.copy_slice_eq_iff_splits.mpr ⟨t, _, hs.of_eq rfl hu, hsplit⟩
      rw [← hcopy]
      exact mem_toList_copy_iff_exists_get.mpr
        ⟨pos''.slice pos pos' hle' (Std.le_of_lt hlt),
          fun h => Std.ne_of_lt hlt
            (by rw [← Slice.Pos.ofSlice_slice (h₁ := hle') (h₂ := Std.le_of_lt hlt), h,
              Slice.Pos.ofSlice_endPos]),
          by rw [Slice.Pos.get_eq_get_ofSlice]
             simp [Slice.Pos.ofSlice_slice]⟩

theorem Pos.find?_prop_eq_none_iff {p : Char → Prop} [DecidablePred p] {s : Slice} {pos : s.Pos} :
    pos.find? p = none ↔
      ∀ pos', pos ≤ pos' → (h : pos' ≠ s.endPos) → ¬ p (pos'.get h) := by
  grind [Pattern.Model.posFind?_eq_none_iff, Pattern.Model.CharPred.Decidable.matchesAt_iff]

theorem Pos.find?_prop_eq_none_iff_of_splits {p : Char → Prop} [DecidablePred p] {s : Slice}
    {pos : s.Pos} {t u : String} (hs : pos.Splits t u) :
    pos.find? p = none ↔ ∀ c ∈ u.toList, ¬ p c := by
  rw [Pos.find?_prop_eq_none_iff]
  constructor
  · intro h c hc
    obtain ⟨pos', hle, hne, hget⟩ := hs.mem_toList_right_iff.mp hc
    subst hget; exact h pos' hle hne
  · intro h pos' hle hne
    exact h _ (hs.mem_toList_right_iff.mpr ⟨pos', hle, hne, rfl⟩)

end String.Slice

namespace String

theorem Pos.find?_bool_eq_some_iff {p : Char → Bool} {s : String} {pos pos' : s.Pos} :
    pos.find? p = some pos' ↔
      pos ≤ pos' ∧ (∃ h, p (pos'.get h)) ∧
        ∀ pos'', pos ≤ pos'' → (h' : pos'' < pos') →
          p (pos''.get (Pos.ne_endPos_of_lt h')) = false := by
  simp only [Pos.find?_eq_find?_toSlice, Option.map_eq_some_iff,
    Slice.Pos.find?_bool_eq_some_iff, endPos_toSlice]
  refine ⟨?_, ?_⟩
  · rintro ⟨pos', ⟨h₁, ⟨h₂, hp⟩, h₃⟩, rfl⟩
    refine ⟨by simpa [Pos.ofToSlice_le_iff] using h₁,
      ⟨by simpa [← Pos.ofToSlice_inj] using h₂, by simpa [Pos.get_ofToSlice] using hp⟩, ?_⟩
    intro pos'' h₄ h₅
    simpa using h₃ pos''.toSlice (by simpa [Pos.toSlice_le] using h₄) (by simpa using h₅)
  · rintro ⟨h₁, ⟨h₂, hp⟩, h₃⟩
    refine ⟨pos'.toSlice, ⟨by simpa [Pos.toSlice_le] using h₁,
      ⟨by simpa [← Pos.toSlice_inj] using h₂, by simpa using hp⟩, fun p hp₁ hp₂ => ?_⟩,
      by simp⟩
    simpa using h₃ (Pos.ofToSlice p)
      (by simpa [Pos.ofToSlice_le_iff] using hp₁) (by simpa using hp₂)

theorem Pos.find?_bool_eq_some_iff_splits {p : Char → Bool} {s : String} {pos : s.Pos}
    {t u : String} (hs : pos.Splits t u) {pos' : s.Pos} :
    pos.find? p = some pos' ↔
      ∃ v c w, pos'.Splits (t ++ v) (singleton c ++ w) ∧ p c ∧
        ∀ d ∈ v.toList, p d = false := by
  simp only [Pos.find?_eq_find?_toSlice, Option.map_eq_some_iff,
    Slice.Pos.find?_bool_eq_some_iff_splits (Pos.splits_toSlice_iff.mpr hs)]
  constructor
  · rintro ⟨q, ⟨v, c, w, hsplit, hpc, hmin⟩, rfl⟩
    exact ⟨v, c, w, Slice.Pos.splits_ofToSlice_iff.mpr hsplit, hpc, hmin⟩
  · rintro ⟨v, c, w, hsplit, hpc, hmin⟩
    exact ⟨pos'.toSlice, ⟨v, c, w, Pos.splits_toSlice_iff.mpr hsplit, hpc, hmin⟩, by simp⟩

theorem Pos.find?_bool_eq_none_iff {p : Char → Bool} {s : String} {pos : s.Pos} :
    pos.find? p = none ↔
      ∀ pos', pos ≤ pos' → (h : pos' ≠ s.endPos) → p (pos'.get h) = false := by
  simp only [Pos.find?_eq_find?_toSlice, Option.map_eq_none_iff,
    Slice.Pos.find?_bool_eq_none_iff, endPos_toSlice]
  refine ⟨?_, ?_⟩
  · intro h pos' h₁ h₂
    simpa [Pos.get_ofToSlice] using
      h pos'.toSlice (by simpa [Pos.toSlice_le] using h₁) (by simpa [← Pos.toSlice_inj] using h₂)
  · intro h pos' h₁ h₂
    simpa using h (Pos.ofToSlice pos')
      (by simpa [Pos.ofToSlice_le_iff] using h₁) (by simpa [← Pos.ofToSlice_inj] using h₂)

theorem Pos.find?_bool_eq_none_iff_of_splits {p : Char → Bool} {s : String} {pos : s.Pos}
    {t u : String} (hs : pos.Splits t u) :
    pos.find? p = none ↔ ∀ c ∈ u.toList, p c = false := by
  rw [Pos.find?_eq_find?_toSlice, Option.map_eq_none_iff]
  exact Slice.Pos.find?_bool_eq_none_iff_of_splits (Pos.splits_toSlice_iff.mpr hs)

theorem Pos.find?_prop_eq_some_iff {p : Char → Prop} [DecidablePred p] {s : String}
    {pos pos' : s.Pos} :
    pos.find? p = some pos' ↔
      pos ≤ pos' ∧ (∃ h, p (pos'.get h)) ∧
        ∀ pos'', pos ≤ pos'' → (h' : pos'' < pos') →
          ¬ p (pos''.get (Pos.ne_endPos_of_lt h')) := by
  simp only [Pos.find?_eq_find?_toSlice, Option.map_eq_some_iff,
    Slice.Pos.find?_prop_eq_some_iff, endPos_toSlice]
  refine ⟨?_, ?_⟩
  · rintro ⟨pos', ⟨h₁, ⟨h₂, hp⟩, h₃⟩, rfl⟩
    refine ⟨by simpa [Pos.ofToSlice_le_iff] using h₁,
      ⟨by simpa [← Pos.ofToSlice_inj] using h₂, by simpa [Pos.get_ofToSlice] using hp⟩, ?_⟩
    intro pos'' h₄ h₅
    simpa using h₃ pos''.toSlice (by simpa [Pos.toSlice_le] using h₄) (by simpa using h₅)
  · rintro ⟨h₁, ⟨h₂, hp⟩, h₃⟩
    refine ⟨pos'.toSlice, ⟨by simpa [Pos.toSlice_le] using h₁,
      ⟨by simpa [← Pos.toSlice_inj] using h₂, by simpa using hp⟩, fun p hp₁ hp₂ => ?_⟩,
      by simp⟩
    simpa using h₃ (Pos.ofToSlice p)
      (by simpa [Pos.ofToSlice_le_iff] using hp₁) (by simpa using hp₂)

theorem Pos.find?_prop_eq_some_iff_splits {p : Char → Prop} [DecidablePred p] {s : String}
    {pos : s.Pos} {t u : String} (hs : pos.Splits t u) {pos' : s.Pos} :
    pos.find? p = some pos' ↔
      ∃ v c w, pos'.Splits (t ++ v) (singleton c ++ w) ∧ p c ∧ ∀ d ∈ v.toList, ¬ p d := by
  simp only [Pos.find?_eq_find?_toSlice, Option.map_eq_some_iff,
    Slice.Pos.find?_prop_eq_some_iff_splits (Pos.splits_toSlice_iff.mpr hs)]
  constructor
  · rintro ⟨q, ⟨v, c, w, hsplit, hpc, hmin⟩, rfl⟩
    exact ⟨v, c, w, Slice.Pos.splits_ofToSlice_iff.mpr hsplit, hpc, hmin⟩
  · rintro ⟨v, c, w, hsplit, hpc, hmin⟩
    exact ⟨pos'.toSlice, ⟨v, c, w, Pos.splits_toSlice_iff.mpr hsplit, hpc, hmin⟩, by simp⟩

theorem Pos.find?_prop_eq_none_iff {p : Char → Prop} [DecidablePred p] {s : String}
    {pos : s.Pos} :
    pos.find? p = none ↔
      ∀ pos', pos ≤ pos' → (h : pos' ≠ s.endPos) → ¬ p (pos'.get h) := by
  simp only [Pos.find?_eq_find?_toSlice, Option.map_eq_none_iff,
    Slice.Pos.find?_prop_eq_none_iff, endPos_toSlice]
  refine ⟨?_, ?_⟩
  · intro h pos' h₁ h₂
    simpa [Pos.get_ofToSlice] using
      h pos'.toSlice (by simpa [Pos.toSlice_le] using h₁) (by simpa [← Pos.toSlice_inj] using h₂)
  · intro h pos' h₁ h₂
    simpa using h (Pos.ofToSlice pos')
      (by simpa [Pos.ofToSlice_le_iff] using h₁) (by simpa [← Pos.ofToSlice_inj] using h₂)

theorem Pos.find?_prop_eq_none_iff_of_splits {p : Char → Prop} [DecidablePred p] {s : String}
    {pos : s.Pos} {t u : String} (hs : pos.Splits t u) :
    pos.find? p = none ↔ ∀ c ∈ u.toList, ¬ p c := by
  rw [Pos.find?_eq_find?_toSlice, Option.map_eq_none_iff]
  exact Slice.Pos.find?_prop_eq_none_iff_of_splits (Pos.splits_toSlice_iff.mpr hs)

theorem find?_bool_eq_some_iff {p : Char → Bool} {s : String} {pos : s.Pos} :
    s.find? p = some pos ↔
      ∃ h, p (pos.get h) ∧ ∀ pos', (h' : pos' < pos) → p (pos'.get (Pos.ne_endPos_of_lt h')) = false := by
  simp only [find?_eq_find?_toSlice, Option.map_eq_some_iff, Slice.find?_bool_eq_some_iff,
    endPos_toSlice, exists_and_right]
  refine ⟨?_, ?_⟩
  · rintro ⟨pos, ⟨⟨h, hp⟩, h'⟩, rfl⟩
    refine ⟨⟨by simpa [← Pos.ofToSlice_inj] using h, by simpa [Pos.get_ofToSlice] using hp⟩, ?_⟩
    intro pos' hp
    simpa using h' pos'.toSlice hp
  · rintro ⟨⟨h, hp⟩, hmin⟩
    exact ⟨pos.toSlice, ⟨⟨by simpa [← Pos.toSlice_inj] using h, by simpa using hp⟩,
      fun pos' hp => by simpa using hmin (Pos.ofToSlice pos') hp⟩, by simp⟩

theorem find?_bool_eq_some_iff_splits {p : Char → Bool} {s : String} {pos : s.Pos} :
    s.find? p = some pos ↔
      ∃ t c u, pos.Splits t (singleton c ++ u) ∧ p c ∧ ∀ d ∈ t.toList, p d = false := by
  simp only [find?_eq_find?_toSlice, Option.map_eq_some_iff,
    Slice.find?_bool_eq_some_iff_splits]
  constructor
  · rintro ⟨q, ⟨t, c, u, hsplit, hpc, hmin⟩, rfl⟩
    exact ⟨t, c, u, Slice.Pos.splits_ofToSlice_iff.mpr hsplit, hpc, hmin⟩
  · rintro ⟨t, c, u, hsplit, hpc, hmin⟩
    exact ⟨pos.toSlice, ⟨t, c, u, Pos.splits_toSlice_iff.mpr hsplit, hpc, hmin⟩, by simp⟩

theorem find?_prop_eq_some_iff {p : Char → Prop} [DecidablePred p] {s : String} {pos : s.Pos} :
    s.find? p = some pos ↔
      ∃ h, p (pos.get h) ∧ ∀ pos', (h' : pos' < pos) → ¬ p (pos'.get (Pos.ne_endPos_of_lt h')) := by
  simp only [find?_eq_find?_toSlice, Option.map_eq_some_iff, Slice.find?_prop_eq_some_iff,
    endPos_toSlice, exists_and_right]
  refine ⟨?_, ?_⟩
  · rintro ⟨pos, ⟨⟨h, hp⟩, h'⟩, rfl⟩
    refine ⟨⟨by simpa [← Pos.ofToSlice_inj] using h, by simpa [Pos.get_ofToSlice] using hp⟩, ?_⟩
    intro pos' hp
    simpa using h' pos'.toSlice hp
  · rintro ⟨⟨h, hp⟩, hmin⟩
    exact ⟨pos.toSlice, ⟨⟨by simpa [← Pos.toSlice_inj] using h, by simpa using hp⟩,
      fun pos' hp => by simpa using hmin (Pos.ofToSlice pos') hp⟩, by simp⟩

theorem find?_prop_eq_some_iff_splits {p : Char → Prop} [DecidablePred p] {s : String}
    {pos : s.Pos} :
    s.find? p = some pos ↔
      ∃ t c u, pos.Splits t (singleton c ++ u) ∧ p c ∧ ∀ d ∈ t.toList, ¬ p d := by
  simp only [find?_eq_find?_toSlice, Option.map_eq_some_iff,
    Slice.find?_prop_eq_some_iff_splits]
  constructor
  · rintro ⟨q, ⟨t, c, u, hsplit, hpc, hmin⟩, rfl⟩
    exact ⟨t, c, u, Slice.Pos.splits_ofToSlice_iff.mpr hsplit, hpc, hmin⟩
  · rintro ⟨t, c, u, hsplit, hpc, hmin⟩
    exact ⟨pos.toSlice, ⟨t, c, u, Pos.splits_toSlice_iff.mpr hsplit, hpc, hmin⟩, by simp⟩

@[simp]
theorem contains_bool_eq {p : Char → Bool} {s : String} : s.contains p = s.toList.any p := by
  simp [contains_eq_contains_toSlice, Slice.contains_bool_eq, copy_toSlice]

@[simp]
theorem contains_prop_eq {p : Char → Prop} [DecidablePred p] {s : String} :
    s.contains p = s.toList.any p := by
  simp [contains_eq_contains_toSlice, Slice.contains_prop_eq, copy_toSlice]

end String

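The `contains_bool_eq` simp lemma above rewrites `contains` for a boolean predicate into `List.any` over the string's characters, so `any` facts transfer directly. A hedged sketch (illustrative only, not part of the diff; assumes the `String` namespace and this file's imports are in scope):

```lean
-- Illustrative only: `contains_bool_eq` turns `contains` into `List.any`,
-- after which the hypothesis closes the goal exactly.
example {p : Char → Bool} {s : String} (h : s.toList.any p = true) :
    s.contains p = true := by
  rw [contains_bool_eq]
  exact h
```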
@@ -10,10 +6,6 @@ public import Init.Data.String.Pattern.Pred
public import Init.Data.String.Lemmas.Pattern.Basic
import Init.Data.Option.Lemmas
import Init.Data.String.Lemmas.Basic
import Init.Data.String.Lemmas.Order
import Init.Data.Order.Lemmas
import Init.Data.String.OrderInstances
import Init.Omega

public section

@@ -42,35 +38,14 @@ theorem isLongestMatch_iff {p : Char → Bool} {s : Slice} {pos : s.Pos} :
    ∃ (h : s.startPos ≠ s.endPos), pos = s.startPos.next h ∧ p (s.startPos.get h) := by
  rw [isLongestMatch_iff_isMatch, isMatch_iff]

theorem isLongestMatchAt_iff {p : Char → Bool} {s : Slice} {pos pos' : s.Pos} :
    IsLongestMatchAt p pos pos' ↔ ∃ h, pos' = pos.next h ∧ p (pos.get h) := by
  simp +contextual [Model.isLongestMatchAt_iff, isLongestMatch_iff, ← Pos.ofSliceFrom_inj,
    Pos.get_eq_get_ofSliceFrom, Pos.ofSliceFrom_next]

theorem isLongestMatchAt_of_get {p : Char → Bool} {s : Slice} {pos : s.Pos} {h : pos ≠ s.endPos}
    (hc : p (pos.get h)) : IsLongestMatchAt p pos (pos.next h) :=
  isLongestMatchAt_iff.2 ⟨h, by simp [hc]⟩

instance {p : Char → Bool} : LawfulForwardPatternModel p where
  dropPrefix?_eq_some_iff {s} pos := by
    simp [isLongestMatch_iff, ForwardPattern.dropPrefix?, and_comm, eq_comm (b := pos)]
    simp [isLongestMatch_iff, ForwardPattern.dropPrefix?]
    exact ⟨fun ⟨h, h₁, h₂⟩ => ⟨h, h₂.symm, h₁⟩, fun ⟨h, h₁, h₂⟩ => ⟨h, h₂, h₁.symm⟩⟩

instance {p : Char → Bool} : LawfulToForwardSearcherModel p :=
  .defaultImplementation

theorem matchesAt_iff {p : Char → Bool} {s : Slice} {pos : s.Pos} :
    MatchesAt p pos ↔ ∃ (h : pos ≠ s.endPos), p (pos.get h) := by
  simp [matchesAt_iff_exists_isLongestMatchAt, isLongestMatchAt_iff, exists_comm]

theorem not_matchesAt_of_get {p : Char → Bool} {s : Slice} {pos : s.Pos} {h : pos ≠ s.endPos}
    (hc : p (pos.get h) = false) : ¬ MatchesAt p pos := by
  simp [matchesAt_iff, hc]

theorem matchAt?_eq {s : Slice} {pos : s.Pos} {p : Char → Bool} :
    matchAt? p pos =
      if h₀ : ∃ (h : pos ≠ s.endPos), p (pos.get h) then some (pos.next h₀.1) else none := by
  split <;> simp_all [isLongestMatchAt_iff, matchesAt_iff]

namespace Decidable

instance {p : Char → Prop} [DecidablePred p] : ForwardPatternModel p where
@@ -98,20 +73,6 @@ theorem isLongestMatch_iff_isLongestMatch_decide {p : Char → Prop} [DecidableP
    {pos : s.Pos} : IsLongestMatch p pos ↔ IsLongestMatch (decide <| p ·) pos := by
  simp [isLongestMatch_iff_isMatch, isMatch_iff_isMatch_decide]

theorem isLongestMatchAt_iff_isLongestMatchAt_decide {p : Char → Prop} [DecidablePred p]
    {s : Slice} {pos pos' : s.Pos} :
    IsLongestMatchAt p pos pos' ↔ IsLongestMatchAt (decide <| p ·) pos pos' := by
  simp [Model.isLongestMatchAt_iff, isLongestMatch_iff_isLongestMatch_decide]

theorem isLongestMatchAt_iff {p : Char → Prop} [DecidablePred p] {s : Slice}
    {pos pos' : s.Pos} :
    IsLongestMatchAt p pos pos' ↔ ∃ h, pos' = pos.next h ∧ p (pos.get h) := by
  simp [isLongestMatchAt_iff_isLongestMatchAt_decide, CharPred.isLongestMatchAt_iff]

theorem isLongestMatchAt_of_get {p : Char → Prop} [DecidablePred p] {s : Slice} {pos : s.Pos}
    {h : pos ≠ s.endPos} (hc : p (pos.get h)) : IsLongestMatchAt p pos (pos.next h) :=
  isLongestMatchAt_iff.2 ⟨h, by simp [hc]⟩

theorem dropPrefix?_eq_dropPrefix?_decide {p : Char → Prop} [DecidablePred p] :
    ForwardPattern.dropPrefix? p = ForwardPattern.dropPrefix? (decide <| p ·) := rfl

@@ -123,19 +84,6 @@ instance {p : Char → Prop} [DecidablePred p] : LawfulForwardPatternModel p whe
instance {p : Char → Prop} [DecidablePred p] : LawfulToForwardSearcherModel p :=
  .defaultImplementation

theorem matchesAt_iff {p : Char → Prop} [DecidablePred p] {s : Slice} {pos : s.Pos} :
    MatchesAt p pos ↔ ∃ (h : pos ≠ s.endPos), p (pos.get h) := by
  simp [matchesAt_iff_exists_isLongestMatchAt, isLongestMatchAt_iff, exists_comm]

theorem not_matchesAt_of_get {p : Char → Prop} [DecidablePred p] {s : Slice} {pos : s.Pos}
    {h : pos ≠ s.endPos} (hc : ¬ p (pos.get h)) : ¬ MatchesAt p pos := by
  simp [matchesAt_iff, hc]

theorem matchAt?_eq {s : Slice} {pos : s.Pos} {p : Char → Prop} [DecidablePred p] :
    matchAt? p pos =
      if h₀ : ∃ (h : pos ≠ s.endPos), p (pos.get h) then some (pos.next h₀.1) else none := by
  split <;> simp_all [isLongestMatchAt_iff, matchesAt_iff]

end Decidable

end String.Slice.Pattern.Model.CharPred

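The `matchesAt_iff` characterization above gives a one-line way to produce a `MatchesAt` witness from a single character that satisfies the predicate. A hedged sketch (illustrative only, not part of the diff; assumes the `String.Slice.Pattern.Model.CharPred` namespace and its imports are in scope):

```lean
-- Illustrative only: constructing a `MatchesAt` proof via `matchesAt_iff`
-- from a position whose character satisfies `p`.
example {p : Char → Bool} {s : Slice} {pos : s.Pos} (h : pos ≠ s.endPos)
    (hp : p (pos.get h)) : MatchesAt p pos :=
  matchesAt_iff.mpr ⟨h, hp⟩
```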
@@ -6,6 +6,235 @@ Author: Markus Himmel
module

prelude
public import Init.Data.String.Lemmas.Pattern.Split.Basic
public import Init.Data.String.Lemmas.Pattern.Split.Char
public import Init.Data.String.Lemmas.Pattern.Split.Pred
public import Init.Data.String.Lemmas.Pattern.Basic
public import Init.Data.String.Slice
import all Init.Data.String.Slice
import Init.Data.Option.Lemmas
import Init.Data.String.Termination
import Init.Data.String.Lemmas.Order
import Init.ByCases
import Init.Data.Order.Lemmas
import Init.Data.String.OrderInstances
import Init.Data.Iterators.Lemmas.Basic
import Init.Data.Iterators.Lemmas.Consumers.Collect

set_option doc.verso true

/-!
# Verification of {name}`String.Slice.splitToSubslice`

This file verifies the {name}`String.Slice.splitToSubslice` function by relating it to a model
implementation based on the {name}`String.Slice.Pattern.Model.ForwardPatternModel` class.

This gives a low-level correctness proof from which higher-level API lemmas can be derived.
-/

namespace String.Slice.Pattern.Model

/--
Represents a list of subslices of a slice {name}`s`, the first of which starts at the given
position {name}`startPos`. This is a natural type for a split routine to return.
-/
@[ext]
public structure SlicesFrom {s : Slice} (startPos : s.Pos) : Type where
  l : List s.Subslice
  any_head? : l.head?.any (·.startInclusive = startPos)

namespace SlicesFrom

/--
A {name}`SlicesFrom` consisting of a single empty subslice at the position {name}`pos`.
-/
public def «at» {s : Slice} (pos : s.Pos) : SlicesFrom pos where
  l := [s.subslice pos pos (Slice.Pos.le_refl _)]
  any_head? := by simp

@[simp]
public theorem l_at {s : Slice} (pos : s.Pos) :
    (SlicesFrom.at pos).l = [s.subslice pos pos (Slice.Pos.le_refl _)] := (rfl)

/--
Concatenating two {name}`SlicesFrom` yields a {name}`SlicesFrom` from the first position.
-/
public def append {s : Slice} {p₁ p₂ : s.Pos} (l₁ : SlicesFrom p₁) (l₂ : SlicesFrom p₂) :
    SlicesFrom p₁ where
  l := l₁.l ++ l₂.l
  any_head? := by simpa using Option.any_or_of_any_left l₁.any_head?

@[simp]
public theorem l_append {s : Slice} {p₁ p₂ : s.Pos} {l₁ : SlicesFrom p₁} {l₂ : SlicesFrom p₂} :
    (l₁.append l₂).l = l₁.l ++ l₂.l :=
  (rfl)

/--
Given a {lean}`SlicesFrom p₂` and a position {name}`p₁` such that {lean}`p₁ ≤ p₂`, obtain a
{lean}`SlicesFrom p₁` by extending the left end of the first subslice from {name}`p₂` to
{name}`p₁`.
-/
public def extend {s : Slice} (p₁ : s.Pos) {p₂ : s.Pos} (h : p₁ ≤ p₂) (l : SlicesFrom p₂) :
    SlicesFrom p₁ where
  l :=
    match l.l, l.any_head? with
    | st :: sts, h => st.extendLeft p₁ (by simp_all) :: sts
  any_head? := by split; simp

@[simp]
public theorem l_extend {s : Slice} {p₁ p₂ : s.Pos} (h : p₁ ≤ p₂) {l : SlicesFrom p₂} :
    (l.extend p₁ h).l =
      match l.l, l.any_head? with
      | st :: sts, h => st.extendLeft p₁ (by simp_all) :: sts :=
  (rfl)

@[simp]
public theorem extend_self {s : Slice} {p₁ : s.Pos} (l : SlicesFrom p₁) :
    l.extend p₁ (Slice.Pos.le_refl _) = l := by
  rcases l with ⟨l, h⟩
  match l, h with
  | st :: sts, h =>
    simp at h
    simp [SlicesFrom.extend, ← h]

@[simp]
public theorem extend_extend {s : Slice} {p₀ p₁ p₂ : s.Pos} {h : p₀ ≤ p₁} {h' : p₁ ≤ p₂}
    {l : SlicesFrom p₂} : (l.extend p₁ h').extend p₀ h = l.extend p₀ (Slice.Pos.le_trans h h') := by
  rcases l with ⟨l, h⟩
  match l, h with
  | st :: sts, h => simp [SlicesFrom.extend]

end SlicesFrom
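A brief sketch of how these constructions interact; the slice `s`, the positions `p₀ ≤ p₁`, and the `SlicesFrom` value `l₂` below are hypothetical, introduced only for illustration:

```lean
-- `at` produces a single empty subslice, `append` concatenates the underlying
-- lists, and `extend` grows the left end of the head subslice. Unfolding the
-- definitions above (hypothetical positions p₀ ≤ p₁ on a slice s):
--
--   ((SlicesFrom.at p₁).extend p₀ h).l
--     = [(s.subslice p₁ p₁ _).extendLeft p₀ _]
--
--   ((SlicesFrom.at p₀).append l₂).l
--     = s.subslice p₀ p₀ _ :: l₂.l
--
-- `extend_self` and `extend_extend` then say that extending by reflexivity is
-- the identity and that successive extensions collapse into a single one.
```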
/--
Noncomputable model implementation of {name}`String.Slice.splitToSubslice` based on
{name}`ForwardPatternModel`. It is intended to be simple enough that higher-level API lemmas
about the public splitting functions can be derived from it.
-/
public protected noncomputable def split {ρ : Type} (pat : ρ) [ForwardPatternModel pat] {s : Slice}
    (start : s.Pos) : SlicesFrom start :=
  if h : start = s.endPos then
    .at start
  else
    match hd : matchAt? pat start with
    | some pos =>
      have : start < pos := (matchAt?_eq_some_iff.1 hd).lt
      (SlicesFrom.at start).append (Model.split pat pos)
    | none => (Model.split pat (start.next h)).extend start (by simp)
termination_by start
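To see the recursion at work, here is a hedged hand-unfolding of `Model.split` under an assumed scenario (positions `start < p < s.endPos`, a match of `pat` from `start` to `p`, and no match anywhere in `[p, s.endPos)`):

```lean
-- Hand-unfolding of `Model.split` in an assumed scenario: `pat` matches from
-- `start` to `p` and nowhere in [p, s.endPos).
--
--   Model.split pat start
--     = (SlicesFrom.at start).append (Model.split pat p)      -- `matchAt?` hit
--     = (SlicesFrom.at start).append
--         ((Model.split pat s.endPos).extend p _)             -- repeated misses
--     = (SlicesFrom.at start).append ((SlicesFrom.at s.endPos).extend p _)
--
-- so the resulting list is the empty subslice at `start` followed by the
-- subslice from `p` to `s.endPos`: one chunk per gap between matches.
```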
@[simp]
public theorem split_endPos {ρ : Type} {pat : ρ} [ForwardPatternModel pat] {s : Slice} :
    Model.split pat s.endPos = SlicesFrom.at s.endPos := by
  simp [Model.split]

public theorem split_eq_of_isLongestMatchAt {ρ : Type} {pat : ρ} [ForwardPatternModel pat]
    {s : Slice} {start stop : s.Pos} (h : IsLongestMatchAt pat start stop) :
    Model.split pat start = (SlicesFrom.at start).append (Model.split pat stop) := by
  rw [Model.split, dif_neg (Slice.Pos.ne_endPos_of_lt h.lt)]
  split
  · congr <;> exact (matchAt?_eq_some_iff.1 ‹_›).eq h
  · simp [matchAt?_eq_some_iff.2 ‹_›] at *

public theorem split_eq_of_not_matchesAt {ρ : Type} {pat : ρ} [ForwardPatternModel pat] {s : Slice}
    {start stop : s.Pos} (h₀ : start ≤ stop) (h : ∀ p, start ≤ p → p < stop → ¬ MatchesAt pat p) :
    Model.split pat start = (SlicesFrom.extend start h₀ (Model.split pat stop)) := by
  induction start using WellFounded.induction Slice.Pos.wellFounded_gt with | h start ih
  by_cases h' : start < stop
  · rw [Model.split, dif_neg (Slice.Pos.ne_endPos_of_lt h')]
    have : ¬ MatchesAt pat start := h start (Slice.Pos.le_refl _) h'
    split
    · rename_i heq
      simp [matchAt?_eq_none_iff.2 ‹_›] at heq
    · rw [ih, SlicesFrom.extend_extend]
      · simp
      · simp [h']
      · refine fun p hp₁ hp₂ => h p (Std.le_of_lt (by simpa using hp₁)) hp₂
  · obtain rfl : start = stop := Std.le_antisymm h₀ (Std.not_lt.1 h')
    simp

/--
Splits a slice {name}`s` into subslices from a list of {lean}`SearchStep s`.

This is an intermediate step in the verification. The equivalence of
{name}`String.Slice.splitToSubslice` and {name}`splitFromSteps` is pure "iteratorology", while
the equivalence of {name}`splitFromSteps` and {name}`split` is the actual correctness proof for the
splitting routine.
-/
def splitFromSteps {s : Slice} (currPos : s.Pos) (l : List (SearchStep s)) : List s.Subslice :=
  match l with
  | [] => [s.subsliceFrom currPos]
  | .rejected .. :: l => splitFromSteps currPos l
  | .matched p q :: l => s.subslice! currPos p :: splitFromSteps q l
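As a concrete trace of `splitFromSteps` by its defining equations; the step list and the positions `p₀ < p₁ < p₂` on a slice `s` are hypothetical:

```lean
-- A trace of `splitFromSteps` on a hypothetical step list, with positions
-- p₀ < p₁ < p₂ on a slice `s`:
--
--   splitFromSteps p₀ [.rejected p₀ p₁, .matched p₁ p₂]
--     = splitFromSteps p₀ [.matched p₁ p₂]        -- rejected steps are skipped
--     = s.subslice! p₀ p₁ :: splitFromSteps p₂ []
--     = [s.subslice! p₀ p₁, s.subsliceFrom p₂]
--
-- Rejections never produce a boundary; only a match cuts off the pending
-- subslice and restarts accumulation after the matched region.
```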

theorem IsValidSearchFrom.splitFromSteps_eq_extend_split {ρ : Type} (pat : ρ)
    [ForwardPatternModel pat] (l : List (SearchStep s)) (pos pos' : s.Pos) (h₀ : pos ≤ pos')
    (h' : ∀ p, pos ≤ p → p < pos' → ¬ MatchesAt pat p)
    (h : IsValidSearchFrom pat pos' l) :
    splitFromSteps pos l = ((Model.split pat pos').extend pos h₀).l := by
  induction h generalizing pos with
  | endPos =>
    simp only [splitFromSteps, Model.split, ↓reduceDIte, SlicesFrom.l_extend, List.head?_cons,
      Option.any_some]
    split
    simp_all only [SlicesFrom.l_at, List.cons.injEq, List.nil_eq, List.head?_cons, Option.any_some,
      decide_eq_true_eq, heq_eq_eq, and_true]
    rename_i h
    simp only [← h.1]
    ext <;> simp
  | matched h valid ih =>
    simp only [splitFromSteps]
    rw [subslice!_eq_subslice h₀, split_eq_of_isLongestMatchAt h]
    simp only [SlicesFrom.append, SlicesFrom.at, List.cons_append, List.nil_append,
      SlicesFrom.l_extend, List.cons.injEq]
    refine ⟨?_, ?_⟩
    · ext <;> simp
    · rw [ih _ (Slice.Pos.le_refl _), SlicesFrom.extend_self]
      exact fun p hp₁ hp₂ => False.elim (Std.lt_irrefl (Std.lt_of_le_of_lt hp₁ hp₂))
  | mismatched h rej valid ih =>
    simp only [splitFromSteps]
    rename_i l startPos endPos
    rw [split_eq_of_not_matchesAt (Std.le_of_lt h) rej, SlicesFrom.extend_extend, ih]
    intro p hp₁ hp₂
    by_cases hp : p < startPos
    · exact h' p hp₁ hp
    · exact rej _ (Std.not_lt.1 hp) hp₂

theorem SplitIterator.toList_eq_splitFromSteps {ρ : Type} {pat : ρ} {σ : Slice → Type}
    [ToForwardSearcher pat σ]
    [∀ s, Std.Iterator (σ s) Id (SearchStep s)] [∀ s, Std.Iterators.Finite (σ s) Id] {s : Slice}
    (it : Std.Iter (α := σ s) (SearchStep s)) (currPos : s.Pos) :
    (Std.Iter.mk (α := SplitIterator pat s) (.operating currPos it)).toList =
      splitFromSteps currPos it.toList := by
  induction it using Std.Iter.inductSteps generalizing currPos with | step it ihy ihs
  rw [Std.Iter.toList_eq_match_step, Std.Iter.step_eq]
  conv => rhs; rw [Std.Iter.toList_eq_match_step]
  simp only [Std.Iter.toIterM_mk]
  cases it.step using Std.PlausibleIterStep.casesOn with
  | yield it out h =>
    match out with
    | .matched startPos endPos => simp [splitFromSteps, ← ihy h]
    | .rejected startPos endPos => simp [splitFromSteps, ← ihy h]
  | skip it h => simp [← ihs h]
  | done =>
    simp only [Id.run_pure, Std.Shrink.inflate_deflate, Std.IterM.Step.toPure_yield,
      Std.PlausibleIterStep.yield, Std.IterM.toIter_mk, splitFromSteps, List.cons.injEq, true_and]
    rw [Std.Iter.toList_eq_match_step, Std.Iter.step_eq]
    simp

theorem toList_splitToSubslice_eq_splitFromSteps {ρ : Type} {pat : ρ} {σ : Slice → Type}
    [ToForwardSearcher pat σ]
    [∀ s, Std.Iterator (σ s) Id (SearchStep s)] [∀ s, Std.Iterators.Finite (σ s) Id] (s : Slice) :
    (s.splitToSubslice pat).toList =
      splitFromSteps s.startPos (ToForwardSearcher.toSearcher pat s).toList := by
  rw [splitToSubslice, SplitIterator.toList_eq_splitFromSteps]

end Model

open Model

public theorem toList_splitToSubslice_eq_modelSplit {ρ : Type} (pat : ρ) [ForwardPatternModel pat]
    {σ : Slice → Type} [ToForwardSearcher pat σ] [∀ s, Std.Iterator (σ s) Id (SearchStep s)]
    [∀ s, Std.Iterators.Finite (σ s) Id] [LawfulToForwardSearcherModel pat] (s : Slice) :
    (s.splitToSubslice pat).toList = (Model.split pat s.startPos).l := by
  rw [toList_splitToSubslice_eq_splitFromSteps,
    IsValidSearchFrom.splitFromSteps_eq_extend_split pat _ s.startPos s.startPos (Std.le_refl _) _
      (LawfulToForwardSearcherModel.isValidSearchFrom_toList _),
    SlicesFrom.extend_self]
  simp

end String.Slice.Pattern
@@ -1,207 +0,0 @@
/-
Copyright (c) 2026 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Author: Markus Himmel
-/
module

prelude
public import Init.Data.String.Lemmas.Pattern.Basic
public import Init.Data.String.Slice
public import Init.Data.String.Search
import all Init.Data.String.Slice
import all Init.Data.String.Search
import Init.Data.Option.Lemmas
import Init.Data.String.Termination
import Init.Data.String.Lemmas.Order
import Init.ByCases
import Init.Data.Order.Lemmas
import Init.Data.String.OrderInstances
import Init.Data.Iterators.Lemmas.Basic
import Init.Data.Iterators.Lemmas.Consumers.Collect
import Init.Data.Iterators.Lemmas.Combinators.FilterMap
import Init.Data.String.Lemmas.IsEmpty

set_option doc.verso true

/-!
# Verification of {name}`String.Slice.splitToSubslice`

This PR verifies the {name}`String.Slice.splitToSubslice` function by relating it to a model
implementation based on the {name}`String.Slice.Pattern.Model.ForwardPatternModel` class.

This gives a low-level correctness proof from which higher-level API lemmas can be derived.
-/

namespace String.Slice.Pattern.Model

public protected noncomputable def split {ρ : Type} (pat : ρ) [ForwardPatternModel pat] {s : Slice}
    (firstRejected curr : s.Pos) (hle : firstRejected ≤ curr) : List s.Subslice :=
  if h : curr = s.endPos then
    [s.subslice _ _ hle]
  else
    match hd : matchAt? pat curr with
    | some pos =>
      have : curr < pos := (matchAt?_eq_some_iff.1 hd).lt
      s.subslice _ _ hle :: Model.split pat pos pos (Std.le_refl _)
    | none => Model.split pat firstRejected (curr.next h) (Std.le_trans hle (by simp))
termination_by curr

@[simp]
public theorem split_endPos {ρ : Type} {pat : ρ} [ForwardPatternModel pat] {s : Slice}
    {firstRejected : s.Pos} :
    Model.split (s := s) pat firstRejected s.endPos (by simp) =
      [s.subslice firstRejected s.endPos (by simp)] := by
  simp [Model.split]

public theorem split_eq_of_isLongestMatchAt {ρ : Type} {pat : ρ} [ForwardPatternModel pat]
    {s : Slice} {firstRejected start stop : s.Pos} {hle} (h : IsLongestMatchAt pat start stop) :
    Model.split pat firstRejected start hle =
      s.subslice _ _ hle :: Model.split pat stop stop (by exact Std.le_refl _) := by
  rw [Model.split, dif_neg (Slice.Pos.ne_endPos_of_lt h.lt)]
  split
  · congr <;> exact (matchAt?_eq_some_iff.1 ‹_›).eq h
  · simp [matchAt?_eq_some_iff.2 ‹_›] at *

public theorem split_eq_of_not_matchesAt {ρ : Type} {pat : ρ} [ForwardPatternModel pat]
    {s : Slice} {firstRejected start} (stop : s.Pos) (h₀ : start ≤ stop) {hle}
    (h : ∀ p, start ≤ p → p < stop → ¬ MatchesAt pat p) :
    Model.split pat firstRejected start hle =
      Model.split pat firstRejected stop (by exact Std.le_trans hle h₀) := by
  induction start using WellFounded.induction Slice.Pos.wellFounded_gt with | h start ih
  by_cases h' : start < stop
  · rw [Model.split, dif_neg (Slice.Pos.ne_endPos_of_lt h')]
    have : ¬ MatchesAt pat start := h start (Slice.Pos.le_refl _) h'
    split
    · rename_i heq
      simp [matchAt?_eq_none_iff.2 ‹_›] at heq
    · rw [ih _ (by simp) (by simpa)]
      exact fun p hp₁ hp₂ => h p (Std.le_of_lt (by simpa using hp₁)) hp₂
  · obtain rfl : start = stop := Std.le_antisymm h₀ (Std.not_lt.1 h')
    simp

public theorem split_eq_next_of_not_matchesAt {ρ : Type} {pat : ρ} [ForwardPatternModel pat]
    {s : Slice} {firstRejected start} {hle} (hs : start ≠ s.endPos) (h : ¬ MatchesAt pat start) :
    Model.split pat firstRejected start hle =
      Model.split pat firstRejected (start.next hs) (by exact Std.le_trans hle (by simp)) := by
  refine split_eq_of_not_matchesAt _ (by simp) (fun p hp₁ hp₂ => ?_)
  obtain rfl : start = p := Std.le_antisymm hp₁ (by simpa using hp₂)
  exact h

/--
Splits a slice {name}`s` into subslices from a list of {lean}`SearchStep s`.

This is an intermediate step in the verification. The equivalence of
{name}`String.Slice.splitToSubslice` and {name}`splitFromSteps` is pure "iteratorology", while
the equivalence of {name}`splitFromSteps` and {name}`split` is the actual correctness proof for the
splitting routine.
-/
def splitFromSteps {s : Slice} (currPos : s.Pos) (l : List (SearchStep s)) : List s.Subslice :=
  match l with
  | [] => [s.subsliceFrom currPos]
  | .rejected .. :: l => splitFromSteps currPos l
  | .matched p q :: l => s.subslice! currPos p :: splitFromSteps q l

theorem IsValidSearchFrom.splitFromSteps_eq_extend_split {ρ : Type} (pat : ρ)
    [ForwardPatternModel pat] (l : List (SearchStep s)) (pos pos' : s.Pos) (h₀ : pos ≤ pos')
    (h' : ∀ p, pos ≤ p → p < pos' → ¬ MatchesAt pat p)
    (h : IsValidSearchFrom pat pos' l) :
    splitFromSteps pos l = Model.split pat pos pos' h₀ := by
  induction h generalizing pos with
  | endPos =>
    simp [splitFromSteps]
  | matched h valid ih =>
    simp only [splitFromSteps]
    rw [subslice!_eq_subslice h₀, split_eq_of_isLongestMatchAt h, ih]
    simp +contextual [← Std.not_lt]
  | mismatched h rej valid ih =>
    simp only [splitFromSteps]
    rename_i l startPos endPos
    rw [split_eq_of_not_matchesAt _ (Std.le_of_lt h) rej, ih]
    intro p hp₁ hp₂
    by_cases hp : p < startPos
    · exact h' p hp₁ hp
    · exact rej _ (Std.not_lt.1 hp) hp₂

theorem SplitIterator.toList_eq_splitFromSteps {ρ : Type} {pat : ρ} {σ : Slice → Type}
    [ToForwardSearcher pat σ]
    [∀ s, Std.Iterator (σ s) Id (SearchStep s)] [∀ s, Std.Iterators.Finite (σ s) Id] {s : Slice}
    (it : Std.Iter (α := σ s) (SearchStep s)) (currPos : s.Pos) :
    (Std.Iter.mk (α := SplitIterator pat s) (.operating currPos it)).toList =
      splitFromSteps currPos it.toList := by
  induction it using Std.Iter.inductSteps generalizing currPos with | step it ihy ihs
  rw [Std.Iter.toList_eq_match_step, Std.Iter.step_eq]
  conv => rhs; rw [Std.Iter.toList_eq_match_step]
  simp only [Std.Iter.toIterM_mk]
  cases it.step using Std.PlausibleIterStep.casesOn with
  | yield it out h =>
    match out with
    | .matched startPos endPos => simp [splitFromSteps, ← ihy h]
    | .rejected startPos endPos => simp [splitFromSteps, ← ihy h]
  | skip it h => simp [← ihs h]
  | done =>
    simp only [Id.run_pure, Std.Shrink.inflate_deflate, Std.IterM.Step.toPure_yield,
      Std.PlausibleIterStep.yield, Std.IterM.toIter_mk, splitFromSteps, List.cons.injEq, true_and]
    rw [Std.Iter.toList_eq_match_step, Std.Iter.step_eq]
    simp

theorem toList_splitToSubslice_eq_splitFromSteps {ρ : Type} {pat : ρ} {σ : Slice → Type}
    [ToForwardSearcher pat σ]
    [∀ s, Std.Iterator (σ s) Id (SearchStep s)] [∀ s, Std.Iterators.Finite (σ s) Id] (s : Slice) :
    (s.splitToSubslice pat).toList =
      splitFromSteps s.startPos (ToForwardSearcher.toSearcher pat s).toList := by
  rw [splitToSubslice, SplitIterator.toList_eq_splitFromSteps]

end Model

open Model

public theorem toList_splitToSubslice_eq_modelSplit {ρ : Type} (pat : ρ) [ForwardPatternModel pat]
    {σ : Slice → Type} [ToForwardSearcher pat σ] [∀ s, Std.Iterator (σ s) Id (SearchStep s)]
    [∀ s, Std.Iterators.Finite (σ s) Id] [LawfulToForwardSearcherModel pat] (s : Slice) :
    (s.splitToSubslice pat).toList =
      Model.split pat s.startPos s.startPos (by exact Std.le_refl _) := by
  rw [toList_splitToSubslice_eq_splitFromSteps,
    IsValidSearchFrom.splitFromSteps_eq_extend_split pat _ s.startPos s.startPos (Std.le_refl _) _
      (LawfulToForwardSearcherModel.isValidSearchFrom_toList _)]
  simp

end Pattern

open Pattern

public theorem toList_splitToSubslice_of_isEmpty {ρ : Type} (pat : ρ)
    [Model.ForwardPatternModel pat] {σ : Slice → Type}
    [ToForwardSearcher pat σ] [∀ s, Std.Iterator (σ s) Id (SearchStep s)]
    [∀ s, Std.Iterators.Finite (σ s) Id] [Model.LawfulToForwardSearcherModel pat] {s : Slice}
    (h : s.isEmpty = true) :
    (s.splitToSubslice pat).toList = [s.subsliceFrom s.endPos] := by
  simp [toList_splitToSubslice_eq_modelSplit, Slice.startPos_eq_endPos_iff.2 h]

public theorem toList_split_eq_splitToSubslice {ρ : Type} (pat : ρ) {σ : Slice → Type}
    [ToForwardSearcher pat σ] [∀ s, Std.Iterator (σ s) Id (SearchStep s)]
    [∀ s, Std.Iterators.Finite (σ s) Id] {s : Slice} :
    (s.split pat).toList = (s.splitToSubslice pat).toList.map Subslice.toSlice := by
  simp [split, Std.Iter.toList_map]

public theorem toList_split_of_isEmpty {ρ : Type} (pat : ρ)
    [Model.ForwardPatternModel pat] {σ : Slice → Type}
    [ToForwardSearcher pat σ] [∀ s, Std.Iterator (σ s) Id (SearchStep s)]
    [∀ s, Std.Iterators.Finite (σ s) Id] [Model.LawfulToForwardSearcherModel pat] {s : Slice}
    (h : s.isEmpty = true) :
    (s.split pat).toList.map Slice.copy = [""] := by
  rw [toList_split_eq_splitToSubslice, toList_splitToSubslice_of_isEmpty _ h]
  simp

end Slice

open Slice.Pattern

public theorem split_eq_split_toSlice {ρ : Type} {pat : ρ} {σ : Slice → Type}
    [ToForwardSearcher pat σ] [∀ s, Std.Iterator (σ s) Id (SearchStep s)] {s : String} :
    s.split pat = s.toSlice.split pat := (rfl)

@[simp]
public theorem toList_split_empty {ρ : Type} (pat : ρ)
    [Model.ForwardPatternModel pat] {σ : Slice → Type}
    [ToForwardSearcher pat σ] [∀ s, Std.Iterator (σ s) Id (SearchStep s)]
    [∀ s, Std.Iterators.Finite (σ s) Id] [Model.LawfulToForwardSearcherModel pat] :
    ("".split pat).toList.map Slice.copy = [""] := by
  rw [split_eq_split_toSlice, Slice.toList_split_of_isEmpty _ (by simp)]

end String
@@ -1,78 +0,0 @@
/-
Copyright (c) 2026 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Author: Markus Himmel
-/
module

prelude
public import Init.Data.String.Slice
public import Init.Data.String.Search
public import Init.Data.List.SplitOn.Basic
import Init.Data.String.Termination
import Init.Data.Order.Lemmas
import Init.Data.Iterators.Lemmas.Combinators.FilterMap
import Init.Data.String.Lemmas.Pattern.Split.Basic
import Init.Data.String.Lemmas.Pattern.Char
import Init.ByCases
import Init.Data.String.OrderInstances
import Init.Data.String.Lemmas.Order
import Init.Data.String.Lemmas.Intercalate
import Init.Data.List.SplitOn.Lemmas

public section

namespace String.Slice

open Pattern.Model Pattern.Model.Char

theorem toList_splitToSubslice_char {s : Slice} {c : Char} :
    (s.splitToSubslice c).toList.map (Slice.copy ∘ Subslice.toSlice) =
      (s.copy.toList.splitOn c).map String.ofList := by
  simp only [Pattern.toList_splitToSubslice_eq_modelSplit]
  suffices ∀ (f p : s.Pos) (hle : f ≤ p) (t₁ t₂ : String),
      p.Splits t₁ t₂ → (Pattern.Model.split c f p hle).map (copy ∘ Subslice.toSlice) =
        (t₂.toList.splitOnPPrepend (· == c) (s.subslice f p hle).copy.toList.reverse).map
          String.ofList by
    simpa [List.splitOn_eq_splitOnP] using this s.startPos s.startPos (Std.le_refl _) "" s.copy
  intro f p hle t₁ t₂ hp
  induction p using Pos.next_induction generalizing f t₁ t₂ with
  | next p h ih =>
    obtain ⟨t₂, rfl⟩ := hp.exists_eq_singleton_append h
    by_cases hpc : p.get h = c
    · simp [split_eq_of_isLongestMatchAt (isLongestMatchAt_of_get_eq hpc),
        ih _ (Std.le_refl _) _ _ hp.next,
        List.splitOnPPrepend_cons_pos (p := (· == c)) (beq_iff_eq.2 hpc)]
    · rw [split_eq_next_of_not_matchesAt h (not_matchesAt_of_get_ne hpc)]
      simp only [toList_append, toList_singleton, List.cons_append, List.nil_append,
        Subslice.copy_eq]
      rw [ih _ _ _ _ hp.next, List.splitOnPPrepend_cons_neg (by simpa)]
      have := (splits_slice (Std.le_trans hle (by simp))
        (p.slice f (p.next h) hle (by simp))).eq_append
      simp_all
  | endPos => simp_all

theorem toList_split_char {s : Slice} {c : Char} :
    (s.split c).toList.map Slice.copy = (s.copy.toList.splitOn c).map String.ofList := by
  simp [toList_split_eq_splitToSubslice, ← toList_splitToSubslice_char]

end Slice

theorem toList_split_char {s : String} {c : Char} :
    (s.split c).toList.map Slice.copy = (s.toList.splitOn c).map String.ofList := by
  simp [split_eq_split_toSlice, Slice.toList_split_char]

theorem Slice.toList_split_intercalate {c : Char} {l : List Slice}
    (hl : ∀ s ∈ l, c ∉ s.copy.toList) :
    ((Slice.intercalate (String.singleton c) l).split c).toList.map Slice.copy =
      if l = [] then [""] else l.map Slice.copy := by
  simp [String.toList_split_char]
  split
  · simp_all
  · rw [List.splitOn_intercalate] <;> simp_all

theorem toList_split_intercalate {c : Char} {l : List String} (hl : ∀ s ∈ l, c ∉ s.toList) :
    ((String.intercalate (String.singleton c) l).split c).toList.map (·.copy) =
      if l = [] then [""] else l := by
  simp only [toList_split_char, toList_intercalate, toList_singleton]
  split
  · simp_all
  · rw [List.splitOn_intercalate] <;> simp_all

end String
@@ -1,103 +0,0 @@
/-
Copyright (c) 2026 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Author: Markus Himmel
-/
module

prelude
public import Init.Data.String.Slice
public import Init.Data.String.Search
public import Init.Data.List.SplitOn.Basic
import Init.Data.String.Termination
import Init.Data.Order.Lemmas
import Init.Data.Iterators.Lemmas.Combinators.FilterMap
import Init.Data.String.Lemmas.Pattern.Split.Basic
import Init.Data.String.Lemmas.Pattern.Pred
import Init.ByCases
import Init.Data.String.OrderInstances
import Init.Data.List.SplitOn.Lemmas
import Init.Data.String.Lemmas.Order

public section

namespace String.Slice

section

open Pattern.Model Pattern.Model.CharPred

theorem toList_splitToSubslice_bool {s : Slice} {p : Char → Bool} :
    (s.splitToSubslice p).toList.map (Slice.copy ∘ Subslice.toSlice) =
      (s.copy.toList.splitOnP p).map String.ofList := by
  simp only [Pattern.toList_splitToSubslice_eq_modelSplit]
  suffices ∀ (f pos : s.Pos) (hle : f ≤ pos) (t₁ t₂ : String),
      pos.Splits t₁ t₂ → (Pattern.Model.split p f pos hle).map (copy ∘ Subslice.toSlice) =
        (t₂.toList.splitOnPPrepend p (s.subslice f pos hle).copy.toList.reverse).map
          String.ofList by
    simpa using this s.startPos s.startPos (Std.le_refl _) "" s.copy
  intro f pos hle t₁ t₂ hp
  induction pos using Pos.next_induction generalizing f t₁ t₂ with
  | next pos h ih =>
    obtain ⟨t₂, rfl⟩ := hp.exists_eq_singleton_append h
    by_cases hpc : p (pos.get h)
    · simp [split_eq_of_isLongestMatchAt (isLongestMatchAt_of_get hpc),
        ih _ (Std.le_refl _) _ _ hp.next,
        List.splitOnPPrepend_cons_pos (p := p) hpc]
    · rw [Bool.not_eq_true] at hpc
      rw [split_eq_next_of_not_matchesAt h (not_matchesAt_of_get hpc)]
      simp only [toList_append, toList_singleton, List.cons_append, List.nil_append,
        Subslice.copy_eq]
      rw [ih _ _ _ _ hp.next, List.splitOnPPrepend_cons_neg (by simpa)]
      have := (splits_slice (Std.le_trans hle (by simp))
        (pos.slice f (pos.next h) hle (by simp))).eq_append
      simp_all
  | endPos => simp_all

theorem toList_split_bool {s : Slice} {p : Char → Bool} :
    (s.split p).toList.map Slice.copy = (s.copy.toList.splitOnP p).map String.ofList := by
  simp [toList_split_eq_splitToSubslice, ← toList_splitToSubslice_bool]

end

section

open Pattern.Model Pattern.Model.CharPred.Decidable

theorem toList_splitToSubslice_prop {s : Slice} {p : Char → Prop} [DecidablePred p] :
    (s.splitToSubslice p).toList.map (Slice.copy ∘ Subslice.toSlice) =
      (s.copy.toList.splitOnP p).map String.ofList := by
  simp only [Pattern.toList_splitToSubslice_eq_modelSplit]
  suffices ∀ (f pos : s.Pos) (hle : f ≤ pos) (t₁ t₂ : String),
      pos.Splits t₁ t₂ → (Pattern.Model.split p f pos hle).map (copy ∘ Subslice.toSlice) =
        (t₂.toList.splitOnPPrepend p (s.subslice f pos hle).copy.toList.reverse).map
          String.ofList by
    simpa using this s.startPos s.startPos (Std.le_refl _) "" s.copy
  intro f pos hle t₁ t₂ hp
  induction pos using Pos.next_induction generalizing f t₁ t₂ with
  | next pos h ih =>
    obtain ⟨t₂, rfl⟩ := hp.exists_eq_singleton_append h
    by_cases hpc : p (pos.get h)
    · simp [split_eq_of_isLongestMatchAt (isLongestMatchAt_of_get hpc),
        ih _ (Std.le_refl _) _ _ hp.next,
        List.splitOnPPrepend_cons_pos (p := (decide <| p ·)) (by simpa using hpc)]
    · rw [split_eq_next_of_not_matchesAt h (not_matchesAt_of_get hpc)]
      simp only [toList_append, toList_singleton, List.cons_append, List.nil_append,
        Subslice.copy_eq]
      rw [ih _ _ _ _ hp.next, List.splitOnPPrepend_cons_neg (by simpa)]
      have := (splits_slice (Std.le_trans hle (by simp))
        (pos.slice f (pos.next h) hle (by simp))).eq_append
      simp_all
  | endPos => simp_all

theorem toList_split_prop {s : Slice} {p : Char → Prop} [DecidablePred p] :
    (s.split p).toList.map Slice.copy = (s.copy.toList.splitOnP p).map String.ofList := by
  simp [toList_split_eq_splitToSubslice, ← toList_splitToSubslice_prop]

end

end Slice

theorem toList_split_bool {s : String} {p : Char → Bool} :
    (s.split p).toList.map Slice.copy = (s.toList.splitOnP p).map String.ofList := by
  simp [split_eq_split_toSlice, Slice.toList_split_bool]

theorem toList_split_prop {s : String} {p : Char → Prop} [DecidablePred p] :
    (s.split p).toList.map Slice.copy = (s.toList.splitOnP p).map String.ofList := by
  simp [split_eq_split_toSlice, Slice.toList_split_prop]

end String
Some files were not shown because too many files have changed in this diff.