Merge `package:markdown` (#1208)
- [x] Move and fix workflow files, `labeler.yml`, and badges in the
README.md
- [x] Rev the version of the package, so that pub.dev points to the
correct site
- [x] Add a line to the changelog:
```
* Move to `dart-lang/tools` monorepo.
```
- [x] Add the package to the top-level readme of the monorepo:
```
| [markdown](pkgs/markdown/) | A portable Markdown library written in Dart that can parse Markdown into HTML. | [](https://pub.dev/packages/markdown) |
```
- [ ] **Important!** Merge the PR with 'Create a merge commit' (enabling
then disabling the `Allow merge commits` admin setting)
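If it's easier, the same merge commit can be created from the command line
while `Allow merge commits` is enabled, e.g. with the GitHub CLI (a sketch,
assuming `gh` is installed and authenticated and that this is PR #1208 in
dart-lang/tools):
```
# creates a merge commit for this PR while 'Allow merge commits' is enabled
gh pr merge 1208 --repo dart-lang/tools --merge
```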
- [x] Update the auto-publishing settings on
https://pub.dev/packages/markdown/admin
- [x] Add the following text to https://github.com/dart-lang/markdown/:
```
> [!IMPORTANT]
> This repo has moved to https://github.com/dart-lang/tools/tree/main/pkgs/markdown
```
- [ ] Publish using the autopublish workflow
- [ ] Push tags to GitHub using
```git tag --list 'markdown*' | xargs git push origin```
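To preview which tags the command above would push, the same pipeline can be
run with `--dry-run` first:
```
# lists the refs that would be pushed, without pushing anything
git tag --list 'markdown*' | xargs git push --dry-run origin
```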
- [x] Close open PRs in dart-lang/markdown with the following message:
```
Closing as the
[dart-lang/markdown](https://github.com/dart-lang/markdown) repository
is merged into the [dart-lang/tools](https://github.com/dart-lang/tools)
monorepo. Please re-open this PR there!
```
- [x] Transfer issues by running
```dart run pkgs/repo_manage/bin/report.dart transfer-issues
--source-repo dart-lang/markdown --target-repo dart-lang/tools
--add-label package:markdown --apply-changes```
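Once transferred, the issues can be spot-checked with the GitHub CLI (assuming
`gh` is available and the label was applied as above):
```
# lists open issues in dart-lang/tools carrying the package:markdown label
gh issue list --repo dart-lang/tools --label 'package:markdown'
```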
- [ ] Archive https://github.com/dart-lang/markdown/
---
- [x] I’ve reviewed the contributor guide and applied the relevant
portions to this PR.
<details>
<summary>Contribution guidelines:</summary><br>
- See our [contributor
guide](https://github.com/dart-lang/.github/blob/main/CONTRIBUTING.md)
for general expectations for PRs.
- Larger or significant changes should be discussed in an issue before
creating a PR.
- Contributions to our repos should follow the [Dart style
guide](https://dart.dev/guides/language/effective-dart) and use `dart
format`.
- Most changes should add an entry to the changelog and may need to [rev
the pubspec package
version](https://github.com/dart-lang/sdk/blob/main/docs/External-Package-Maintenance.md#making-a-change).
- Changes to packages require [corresponding
tests](https://github.com/dart-lang/.github/blob/main/CONTRIBUTING.md#Testing).
Note that many Dart repos have a weekly cadence for reviewing PRs -
please allow for some latency before initial review feedback.
</details>
diff --git a/.github/ISSUE_TEMPLATE/io.md b/.github/ISSUE_TEMPLATE/io.md
new file mode 100644
index 0000000..5646f0f
--- /dev/null
+++ b/.github/ISSUE_TEMPLATE/io.md
@@ -0,0 +1,5 @@
+---
+name: "package:io"
+about: "Create a bug or file a feature request against package:io."
+labels: "package:io"
+---
\ No newline at end of file
diff --git a/.github/ISSUE_TEMPLATE/package_config.md b/.github/ISSUE_TEMPLATE/package_config.md
new file mode 100644
index 0000000..f6322d0
--- /dev/null
+++ b/.github/ISSUE_TEMPLATE/package_config.md
@@ -0,0 +1,5 @@
+---
+name: "package:package_config"
+about: "Create a bug or file a feature request against package:package_config."
+labels: "package:package_config"
+---
\ No newline at end of file
diff --git a/.github/ISSUE_TEMPLATE/pool.md b/.github/ISSUE_TEMPLATE/pool.md
new file mode 100644
index 0000000..7af32c4
--- /dev/null
+++ b/.github/ISSUE_TEMPLATE/pool.md
@@ -0,0 +1,5 @@
+---
+name: "package:pool"
+about: "Create a bug or file a feature request against package:pool."
+labels: "package:pool"
+---
\ No newline at end of file
diff --git a/.github/ISSUE_TEMPLATE/pub_semver.md b/.github/ISSUE_TEMPLATE/pub_semver.md
new file mode 100644
index 0000000..c7db9b5
--- /dev/null
+++ b/.github/ISSUE_TEMPLATE/pub_semver.md
@@ -0,0 +1,5 @@
+---
+name: "package:pub_semver"
+about: "Create a bug or file a feature request against package:pub_semver."
+labels: "package:pub_semver"
+---
\ No newline at end of file
diff --git a/.github/ISSUE_TEMPLATE/pubspec_parse.md b/.github/ISSUE_TEMPLATE/pubspec_parse.md
new file mode 100644
index 0000000..2d65881
--- /dev/null
+++ b/.github/ISSUE_TEMPLATE/pubspec_parse.md
@@ -0,0 +1,5 @@
+---
+name: "package:pubspec_parse"
+about: "Create a bug or file a feature request against package:pubspec_parse."
+labels: "package:pubspec_parse"
+---
\ No newline at end of file
diff --git a/.github/ISSUE_TEMPLATE/source_maps.md b/.github/ISSUE_TEMPLATE/source_maps.md
new file mode 100644
index 0000000..a1e390a
--- /dev/null
+++ b/.github/ISSUE_TEMPLATE/source_maps.md
@@ -0,0 +1,5 @@
+---
+name: "package:source_maps"
+about: "Create a bug or file a feature request against package:source_maps."
+labels: "package:source_maps"
+---
\ No newline at end of file
diff --git a/.github/ISSUE_TEMPLATE/source_span.md b/.github/ISSUE_TEMPLATE/source_span.md
new file mode 100644
index 0000000..7dbb3c4
--- /dev/null
+++ b/.github/ISSUE_TEMPLATE/source_span.md
@@ -0,0 +1,5 @@
+---
+name: "package:source_span"
+about: "Create a bug or file a feature request against package:source_span."
+labels: "package:source_span"
+---
\ No newline at end of file
diff --git a/.github/ISSUE_TEMPLATE/sse.md b/.github/ISSUE_TEMPLATE/sse.md
new file mode 100644
index 0000000..17cc488
--- /dev/null
+++ b/.github/ISSUE_TEMPLATE/sse.md
@@ -0,0 +1,5 @@
+---
+name: "package:sse"
+about: "Create a bug or file a feature request against package:sse."
+labels: "package:sse"
+---
\ No newline at end of file
diff --git a/.github/ISSUE_TEMPLATE/stack_trace.md b/.github/ISSUE_TEMPLATE/stack_trace.md
new file mode 100644
index 0000000..417362b
--- /dev/null
+++ b/.github/ISSUE_TEMPLATE/stack_trace.md
@@ -0,0 +1,5 @@
+---
+name: "package:stack_trace"
+about: "Create a bug or file a feature request against package:stack_trace."
+labels: "package:stack_trace"
+---
\ No newline at end of file
diff --git a/.github/ISSUE_TEMPLATE/stream_channel.md b/.github/ISSUE_TEMPLATE/stream_channel.md
new file mode 100644
index 0000000..76b5994
--- /dev/null
+++ b/.github/ISSUE_TEMPLATE/stream_channel.md
@@ -0,0 +1,5 @@
+---
+name: "package:stream_channel"
+about: "Create a bug or file a feature request against package:stream_channel."
+labels: "package:stream_channel"
+---
\ No newline at end of file
diff --git a/.github/ISSUE_TEMPLATE/stream_transform.md b/.github/ISSUE_TEMPLATE/stream_transform.md
new file mode 100644
index 0000000..475bd83
--- /dev/null
+++ b/.github/ISSUE_TEMPLATE/stream_transform.md
@@ -0,0 +1,5 @@
+---
+name: "package:stream_transform"
+about: "Create a bug or file a feature request against package:stream_transform."
+labels: "package:stream_transform"
+---
\ No newline at end of file
diff --git a/.github/ISSUE_TEMPLATE/string_scanner.md b/.github/ISSUE_TEMPLATE/string_scanner.md
new file mode 100644
index 0000000..ad89f1b
--- /dev/null
+++ b/.github/ISSUE_TEMPLATE/string_scanner.md
@@ -0,0 +1,5 @@
+---
+name: "package:string_scanner"
+about: "Create a bug or file a feature request against package:string_scanner."
+labels: "package:string_scanner"
+---
\ No newline at end of file
diff --git a/.github/ISSUE_TEMPLATE/term_glyph.md b/.github/ISSUE_TEMPLATE/term_glyph.md
new file mode 100644
index 0000000..b6a4766
--- /dev/null
+++ b/.github/ISSUE_TEMPLATE/term_glyph.md
@@ -0,0 +1,5 @@
+---
+name: "package:term_glyph"
+about: "Create a bug or file a feature request against package:term_glyph."
+labels: "package:term_glyph"
+---
\ No newline at end of file
diff --git a/.github/ISSUE_TEMPLATE/test_reflective_loader.md b/.github/ISSUE_TEMPLATE/test_reflective_loader.md
new file mode 100644
index 0000000..bde03fe
--- /dev/null
+++ b/.github/ISSUE_TEMPLATE/test_reflective_loader.md
@@ -0,0 +1,5 @@
+---
+name: "package:test_reflective_loader"
+about: "Create a bug or file a feature request against package:test_reflective_loader."
+labels: "package:test_reflective_loader"
+---
\ No newline at end of file
diff --git a/.github/ISSUE_TEMPLATE/timing.md b/.github/ISSUE_TEMPLATE/timing.md
new file mode 100644
index 0000000..38a0015
--- /dev/null
+++ b/.github/ISSUE_TEMPLATE/timing.md
@@ -0,0 +1,5 @@
+---
+name: "package:timing"
+about: "Create a bug or file a feature request against package:timing."
+labels: "package:timing"
+---
\ No newline at end of file
diff --git a/.github/ISSUE_TEMPLATE/watcher.md b/.github/ISSUE_TEMPLATE/watcher.md
new file mode 100644
index 0000000..2578819
--- /dev/null
+++ b/.github/ISSUE_TEMPLATE/watcher.md
@@ -0,0 +1,5 @@
+---
+name: "package:watcher"
+about: "Create a bug or file a feature request against package:watcher."
+labels: "package:watcher"
+---
\ No newline at end of file
diff --git a/.github/ISSUE_TEMPLATE/yaml.md b/.github/ISSUE_TEMPLATE/yaml.md
new file mode 100644
index 0000000..d6a7c7f
--- /dev/null
+++ b/.github/ISSUE_TEMPLATE/yaml.md
@@ -0,0 +1,5 @@
+---
+name: "package:yaml"
+about: "Create a bug or file a feature request against package:yaml."
+labels: "package:yaml"
+---
\ No newline at end of file
diff --git a/.github/ISSUE_TEMPLATE/yaml_edit.md b/.github/ISSUE_TEMPLATE/yaml_edit.md
new file mode 100644
index 0000000..d1122a9
--- /dev/null
+++ b/.github/ISSUE_TEMPLATE/yaml_edit.md
@@ -0,0 +1,5 @@
+---
+name: "package:yaml_edit"
+about: "Create a bug or file a feature request against package:yaml_edit."
+labels: "package:yaml_edit"
+---
\ No newline at end of file
diff --git a/.github/labeler.yml b/.github/labeler.yml
index e158737..0bb7feb 100644
--- a/.github/labeler.yml
+++ b/.github/labeler.yml
@@ -64,6 +64,10 @@
- changed-files:
- any-glob-to-any-file: 'pkgs/html/**'
+'package:io':
+ - changed-files:
+ - any-glob-to-any-file: 'pkgs/io/**'
+
'package:json_rpc_2':
- changed-files:
- any-glob-to-any-file: 'pkgs/json_rpc_2/**'
@@ -80,10 +84,74 @@
- changed-files:
- any-glob-to-any-file: 'pkgs/oauth2/**'
+'package:package_config':
+ - changed-files:
+ - any-glob-to-any-file: 'pkgs/package_config/**'
+
+'package:pool':
+ - changed-files:
+ - any-glob-to-any-file: 'pkgs/pool/**'
+
+'package:pub_semver':
+ - changed-files:
+ - any-glob-to-any-file: 'pkgs/pub_semver/**'
+
+'package:pubspec_parse':
+ - changed-files:
+ - any-glob-to-any-file: 'pkgs/pubspec_parse/**'
+
'package:source_map_stack_trace':
- changed-files:
- any-glob-to-any-file: 'pkgs/source_map_stack_trace/**'
+'package:source_maps':
+ - changed-files:
+ - any-glob-to-any-file: 'pkgs/source_maps/**'
+
+'package:source_span':
+ - changed-files:
+ - any-glob-to-any-file: 'pkgs/source_span/**'
+
+'package:sse':
+ - changed-files:
+ - any-glob-to-any-file: 'pkgs/sse/**'
+
+'package:stack_trace':
+ - changed-files:
+ - any-glob-to-any-file: 'pkgs/stack_trace/**'
+
+'package:stream_channel':
+ - changed-files:
+ - any-glob-to-any-file: 'pkgs/stream_channel/**'
+
+'package:stream_transform':
+ - changed-files:
+ - any-glob-to-any-file: 'pkgs/stream_transform/**'
+
+'package:term_glyph':
+ - changed-files:
+ - any-glob-to-any-file: 'pkgs/term_glyph/**'
+
+'package:test_reflective_loader':
+ - changed-files:
+ - any-glob-to-any-file: 'pkgs/test_reflective_loader/**'
+
+'package:timing':
+ - changed-files:
+ - any-glob-to-any-file: 'pkgs/timing/**'
+
'package:unified_analytics':
- changed-files:
- any-glob-to-any-file: 'pkgs/unified_analytics/**'
+
+'package:watcher':
+ - changed-files:
+ - any-glob-to-any-file: 'pkgs/watcher/**'
+
+'package:yaml':
+ - changed-files:
+ - any-glob-to-any-file: 'pkgs/yaml/**'
+
+'package:yaml_edit':
+ - changed-files:
+ - any-glob-to-any-file: 'pkgs/yaml_edit/**'
diff --git a/.github/workflows/bazel_worker.yaml b/.github/workflows/bazel_worker.yaml
index 0eb06da..b448219 100644
--- a/.github/workflows/bazel_worker.yaml
+++ b/.github/workflows/bazel_worker.yaml
@@ -36,6 +36,8 @@
- uses: dart-lang/setup-dart@e630b99d28a3b71860378cafdc2a067c71107f94
with:
sdk: ${{ matrix.sdk }}
+ - run: dart pub get
- run: "dart format --output=none --set-exit-if-changed ."
+ if: ${{ matrix.sdk == 'dev' }}
- name: Test
run: ./tool/travis.sh
diff --git a/.github/workflows/io.yaml b/.github/workflows/io.yaml
new file mode 100644
index 0000000..0c719a6
--- /dev/null
+++ b/.github/workflows/io.yaml
@@ -0,0 +1,72 @@
+name: package:io
+
+on:
+ # Run on PRs and pushes to the default branch.
+ push:
+ branches: [ main ]
+ paths:
+ - '.github/workflows/io.yaml'
+ - 'pkgs/io/**'
+ pull_request:
+ branches: [ main ]
+ paths:
+ - '.github/workflows/io.yaml'
+ - 'pkgs/io/**'
+ schedule:
+ - cron: "0 0 * * 0"
+
+env:
+ PUB_ENVIRONMENT: bot.github
+
+
+defaults:
+ run:
+ working-directory: pkgs/io/
+
+
+jobs:
+ # Check code formatting and static analysis on a single OS (linux)
+ # against Dart dev and stable.
+ analyze:
+ runs-on: ubuntu-latest
+ strategy:
+ fail-fast: false
+ matrix:
+ sdk: [dev, 3.4]
+ steps:
+ - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
+ - uses: dart-lang/setup-dart@e630b99d28a3b71860378cafdc2a067c71107f94
+ with:
+ sdk: ${{ matrix.sdk }}
+ - id: install
+ name: Install dependencies
+ run: dart pub get
+ - name: Check formatting
+ run: dart format --output=none --set-exit-if-changed .
+ if: always() && steps.install.outcome == 'success'
+ - name: Analyze code
+ run: dart analyze --fatal-infos
+ if: always() && steps.install.outcome == 'success'
+
+ # Run tests on a matrix consisting of two dimensions:
+ # 1. OS: ubuntu-latest, (macos-latest, windows-latest)
+ # 2. release channel: dev, stable
+ test:
+ needs: analyze
+ runs-on: ${{ matrix.os }}
+ strategy:
+ fail-fast: false
+ matrix:
+ # Add macos-latest and/or windows-latest if relevant for this package.
+ os: [ubuntu-latest]
+ sdk: [dev, 3.4]
+ steps:
+ - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
+ - uses: dart-lang/setup-dart@e630b99d28a3b71860378cafdc2a067c71107f94
+ with:
+ sdk: ${{ matrix.sdk }}
+ - id: install
+ name: Install dependencies
+ run: dart pub get
+ - run: dart test
+ if: always() && steps.install.outcome == 'success'
diff --git a/.github/workflows/package_config.yaml b/.github/workflows/package_config.yaml
new file mode 100644
index 0000000..416ea1a
--- /dev/null
+++ b/.github/workflows/package_config.yaml
@@ -0,0 +1,71 @@
+name: package:package_config
+
+on:
+ # Run on PRs and pushes to the default branch.
+ push:
+ branches: [ main ]
+ paths:
+ - '.github/workflows/package_config.yaml'
+ - 'pkgs/package_config/**'
+ pull_request:
+ branches: [ main ]
+ paths:
+ - '.github/workflows/package_config.yaml'
+ - 'pkgs/package_config/**'
+ schedule:
+ - cron: "0 0 * * 0"
+
+env:
+ PUB_ENVIRONMENT: bot.github
+
+
+defaults:
+ run:
+ working-directory: pkgs/package_config/
+
+jobs:
+ # Check code formatting and static analysis on a single OS (linux)
+ # against Dart dev.
+ analyze:
+ runs-on: ubuntu-latest
+ strategy:
+ fail-fast: false
+ matrix:
+ sdk: [dev]
+ steps:
+ - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
+ - uses: dart-lang/setup-dart@e630b99d28a3b71860378cafdc2a067c71107f94
+ with:
+ sdk: ${{ matrix.sdk }}
+ - id: install
+ name: Install dependencies
+ run: dart pub get
+ - name: Check formatting
+ run: dart format --output=none --set-exit-if-changed .
+ if: always() && steps.install.outcome == 'success'
+ - name: Analyze code
+ run: dart analyze --fatal-infos
+ if: always() && steps.install.outcome == 'success'
+
+ # Run tests on a matrix consisting of two dimensions:
+ # 1. OS: ubuntu-latest, (macos-latest, windows-latest)
+ # 2. release channel: dev
+ test:
+ needs: analyze
+ runs-on: ${{ matrix.os }}
+ strategy:
+ fail-fast: false
+ matrix:
+ os: [ubuntu-latest, windows-latest]
+ sdk: [3.4, dev]
+ steps:
+ - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
+ - uses: dart-lang/setup-dart@e630b99d28a3b71860378cafdc2a067c71107f94
+ with:
+ sdk: ${{ matrix.sdk }}
+ - id: install
+ name: Install dependencies
+ run: dart pub get
+ - name: Run tests
+ run: dart test -p chrome,vm
+ if: always() && steps.install.outcome == 'success'
diff --git a/.github/workflows/pool.yaml b/.github/workflows/pool.yaml
new file mode 100644
index 0000000..6d64062
--- /dev/null
+++ b/.github/workflows/pool.yaml
@@ -0,0 +1,78 @@
+name: package:pool
+
+on:
+ # Run on PRs and pushes to the default branch.
+ push:
+ branches: [ main ]
+ paths:
+ - '.github/workflows/pool.yaml'
+ - 'pkgs/pool/**'
+ pull_request:
+ branches: [ main ]
+ paths:
+ - '.github/workflows/pool.yaml'
+ - 'pkgs/pool/**'
+ schedule:
+ - cron: "0 0 * * 0"
+
+env:
+ PUB_ENVIRONMENT: bot.github
+
+
+defaults:
+ run:
+ working-directory: pkgs/pool/
+
+jobs:
+ # Check code formatting and static analysis on a single OS (linux)
+ # against Dart dev.
+ analyze:
+ runs-on: ubuntu-latest
+ strategy:
+ fail-fast: false
+ matrix:
+ sdk: [dev]
+ steps:
+ - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
+ - uses: dart-lang/setup-dart@e630b99d28a3b71860378cafdc2a067c71107f94
+ with:
+ sdk: ${{ matrix.sdk }}
+ - id: install
+ name: Install dependencies
+ run: dart pub get
+ - name: Check formatting
+ run: dart format --output=none --set-exit-if-changed .
+ if: always() && steps.install.outcome == 'success'
+ - name: Analyze code
+ run: dart analyze --fatal-infos
+ if: always() && steps.install.outcome == 'success'
+
+ # Run tests on a matrix consisting of two dimensions:
+ # 1. OS: ubuntu-latest, (macos-latest, windows-latest)
+ # 2. release channel: dev
+ test:
+ needs: analyze
+ runs-on: ${{ matrix.os }}
+ strategy:
+ fail-fast: false
+ matrix:
+ # Add macos-latest and/or windows-latest if relevant for this package.
+ os: [ubuntu-latest]
+ sdk: [3.4, dev]
+ steps:
+ - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
+ - uses: dart-lang/setup-dart@e630b99d28a3b71860378cafdc2a067c71107f94
+ with:
+ sdk: ${{ matrix.sdk }}
+ - id: install
+ name: Install dependencies
+ run: dart pub get
+ - name: Run VM tests
+ run: dart test --platform vm
+ if: always() && steps.install.outcome == 'success'
+ - name: Run Chrome tests
+ run: dart test --platform chrome
+ if: always() && steps.install.outcome == 'success'
+ - name: Run Chrome tests - wasm
+ run: dart test --platform chrome -c dart2wasm
+ if: always() && steps.install.outcome == 'success' && matrix.sdk == 'dev'
diff --git a/.github/workflows/pub_semver.yaml b/.github/workflows/pub_semver.yaml
new file mode 100644
index 0000000..ba0db18
--- /dev/null
+++ b/.github/workflows/pub_semver.yaml
@@ -0,0 +1,75 @@
+name: package:pub_semver
+
+on:
+ # Run on PRs and pushes to the default branch.
+ push:
+ branches: [ main ]
+ paths:
+ - '.github/workflows/pub_semver.yaml'
+ - 'pkgs/pub_semver/**'
+ pull_request:
+ branches: [ main ]
+ paths:
+ - '.github/workflows/pub_semver.yaml'
+ - 'pkgs/pub_semver/**'
+ schedule:
+ - cron: "0 0 * * 0"
+
+env:
+ PUB_ENVIRONMENT: bot.github
+
+
+defaults:
+ run:
+ working-directory: pkgs/pub_semver/
+
+jobs:
+ # Check code formatting and static analysis on a single OS (linux)
+ # against Dart dev.
+ analyze:
+ runs-on: ubuntu-latest
+ strategy:
+ fail-fast: false
+ matrix:
+ sdk: [dev]
+ steps:
+ - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
+ - uses: dart-lang/setup-dart@e630b99d28a3b71860378cafdc2a067c71107f94
+ with:
+ sdk: ${{ matrix.sdk }}
+ - id: install
+ name: Install dependencies
+ run: dart pub get
+ - name: Check formatting
+ run: dart format --output=none --set-exit-if-changed .
+ if: always() && steps.install.outcome == 'success'
+ - name: Analyze code
+ run: dart analyze --fatal-infos
+ if: always() && steps.install.outcome == 'success'
+
+ # Run tests on a matrix consisting of two dimensions:
+ # 1. OS: ubuntu-latest, (macos-latest, windows-latest)
+ # 2. release channel: dev
+ test:
+ needs: analyze
+ runs-on: ${{ matrix.os }}
+ strategy:
+ fail-fast: false
+ matrix:
+ # Add macos-latest and/or windows-latest if relevant for this package.
+ os: [ubuntu-latest]
+ sdk: [3.4, dev]
+ steps:
+ - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
+ - uses: dart-lang/setup-dart@e630b99d28a3b71860378cafdc2a067c71107f94
+ with:
+ sdk: ${{ matrix.sdk }}
+ - id: install
+ name: Install dependencies
+ run: dart pub get
+ - name: Run VM tests
+ run: dart test --platform vm
+ if: always() && steps.install.outcome == 'success'
+ - name: Run Chrome tests
+ run: dart test --platform chrome --compiler dart2js,dart2wasm
+ if: always() && steps.install.outcome == 'success'
diff --git a/.github/workflows/pubspec_parse.yaml b/.github/workflows/pubspec_parse.yaml
new file mode 100644
index 0000000..9cf6257
--- /dev/null
+++ b/.github/workflows/pubspec_parse.yaml
@@ -0,0 +1,71 @@
+name: package:pubspec_parse
+
+on:
+ # Run on PRs and pushes to the default branch.
+ push:
+ branches: [ main ]
+ paths:
+ - '.github/workflows/pubspec_parse.yaml'
+ - 'pkgs/pubspec_parse/**'
+ pull_request:
+ branches: [ main ]
+ paths:
+ - '.github/workflows/pubspec_parse.yaml'
+ - 'pkgs/pubspec_parse/**'
+ schedule:
+ - cron: "0 0 * * 0"
+
+env:
+ PUB_ENVIRONMENT: bot.github
+
+
+defaults:
+ run:
+ working-directory: pkgs/pubspec_parse/
+
+jobs:
+ # Check code formatting and static analysis on a single OS (linux)
+ # against Dart dev.
+ analyze:
+ runs-on: ubuntu-latest
+ strategy:
+ fail-fast: false
+ matrix:
+ sdk: [dev]
+ steps:
+ - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
+ - uses: dart-lang/setup-dart@e630b99d28a3b71860378cafdc2a067c71107f94
+ with:
+ sdk: ${{ matrix.sdk }}
+ - id: install
+ name: Install dependencies
+ run: dart pub get
+ - name: Check formatting
+ run: dart format --output=none --set-exit-if-changed .
+ if: always() && steps.install.outcome == 'success'
+ - name: Analyze code
+ run: dart analyze --fatal-infos
+ if: always() && steps.install.outcome == 'success'
+
+ # Run tests on a matrix consisting of two dimensions:
+ # 1. OS: ubuntu-latest, (macos-latest, windows-latest)
+ # 2. release channel: dev
+ test:
+ needs: analyze
+ runs-on: ${{ matrix.os }}
+ strategy:
+ fail-fast: false
+ matrix:
+ os: [ubuntu-latest]
+ sdk: [3.6, dev]
+ steps:
+ - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
+ - uses: dart-lang/setup-dart@e630b99d28a3b71860378cafdc2a067c71107f94
+ with:
+ sdk: ${{ matrix.sdk }}
+ - id: install
+ name: Install dependencies
+ run: dart pub get
+ - name: Run VM tests
+ run: dart test --platform vm --run-skipped
+ if: always() && steps.install.outcome == 'success'
diff --git a/.github/workflows/source_maps.yaml b/.github/workflows/source_maps.yaml
new file mode 100644
index 0000000..2ae0f20
--- /dev/null
+++ b/.github/workflows/source_maps.yaml
@@ -0,0 +1,72 @@
+name: package:source_maps
+
+on:
+ # Run on PRs and pushes to the default branch.
+ push:
+ branches: [ main ]
+ paths:
+ - '.github/workflows/source_maps.yaml'
+ - 'pkgs/source_maps/**'
+ pull_request:
+ branches: [ main ]
+ paths:
+ - '.github/workflows/source_maps.yaml'
+ - 'pkgs/source_maps/**'
+ schedule:
+ - cron: "0 0 * * 0"
+
+env:
+ PUB_ENVIRONMENT: bot.github
+
+
+defaults:
+ run:
+ working-directory: pkgs/source_maps/
+
+jobs:
+ # Check code formatting and static analysis on a single OS (linux)
+ # against Dart dev.
+ analyze:
+ runs-on: ubuntu-latest
+ strategy:
+ fail-fast: false
+ matrix:
+ sdk: [dev]
+ steps:
+ - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
+ - uses: dart-lang/setup-dart@e630b99d28a3b71860378cafdc2a067c71107f94
+ with:
+ sdk: ${{ matrix.sdk }}
+ - id: install
+ name: Install dependencies
+ run: dart pub get
+ - name: Check formatting
+ run: dart format --output=none --set-exit-if-changed .
+ if: always() && steps.install.outcome == 'success'
+ - name: Analyze code
+ run: dart analyze --fatal-infos
+ if: always() && steps.install.outcome == 'success'
+
+ # Run tests on a matrix consisting of two dimensions:
+ # 1. OS: ubuntu-latest, (macos-latest, windows-latest)
+ # 2. release channel: dev
+ test:
+ needs: analyze
+ runs-on: ${{ matrix.os }}
+ strategy:
+ fail-fast: false
+ matrix:
+ # Add macos-latest and/or windows-latest if relevant for this package.
+ os: [ubuntu-latest]
+ sdk: [3.3.0, dev]
+ steps:
+ - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
+ - uses: dart-lang/setup-dart@e630b99d28a3b71860378cafdc2a067c71107f94
+ with:
+ sdk: ${{ matrix.sdk }}
+ - id: install
+ name: Install dependencies
+ run: dart pub get
+ - name: Run VM tests
+ run: dart test --platform vm
+ if: always() && steps.install.outcome == 'success'
diff --git a/.github/workflows/source_span.yaml b/.github/workflows/source_span.yaml
new file mode 100644
index 0000000..2c2ba05
--- /dev/null
+++ b/.github/workflows/source_span.yaml
@@ -0,0 +1,75 @@
+name: package:source_span
+
+on:
+ # Run on PRs and pushes to the default branch.
+ push:
+ branches: [ main ]
+ paths:
+ - '.github/workflows/source_span.yaml'
+ - 'pkgs/source_span/**'
+ pull_request:
+ branches: [ main ]
+ paths:
+ - '.github/workflows/source_span.yaml'
+ - 'pkgs/source_span/**'
+ schedule:
+ - cron: "0 0 * * 0"
+
+env:
+ PUB_ENVIRONMENT: bot.github
+
+
+defaults:
+ run:
+ working-directory: pkgs/source_span/
+
+jobs:
+ # Check code formatting and static analysis on a single OS (linux)
+ # against Dart dev.
+ analyze:
+ runs-on: ubuntu-latest
+ strategy:
+ fail-fast: false
+ matrix:
+ sdk: [dev]
+ steps:
+ - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
+ - uses: dart-lang/setup-dart@e630b99d28a3b71860378cafdc2a067c71107f94
+ with:
+ sdk: ${{ matrix.sdk }}
+ - id: install
+ name: Install dependencies
+ run: dart pub get
+ - name: Check formatting
+ run: dart format --output=none --set-exit-if-changed .
+ if: always() && steps.install.outcome == 'success'
+ - name: Analyze code
+ run: dart analyze --fatal-infos
+ if: always() && steps.install.outcome == 'success'
+
+ # Run tests on a matrix consisting of two dimensions:
+ # 1. OS: ubuntu-latest, (macos-latest, windows-latest)
+ # 2. release channel: dev
+ test:
+ needs: analyze
+ runs-on: ${{ matrix.os }}
+ strategy:
+ fail-fast: false
+ matrix:
+ # Add macos-latest and/or windows-latest if relevant for this package.
+ os: [ubuntu-latest]
+ sdk: [3.1.0, dev]
+ steps:
+ - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
+ - uses: dart-lang/setup-dart@e630b99d28a3b71860378cafdc2a067c71107f94
+ with:
+ sdk: ${{ matrix.sdk }}
+ - id: install
+ name: Install dependencies
+ run: dart pub get
+ - name: Run VM tests
+ run: dart test --platform vm
+ if: always() && steps.install.outcome == 'success'
+ - name: Run Chrome tests
+ run: dart test --platform chrome
+ if: always() && steps.install.outcome == 'success'
diff --git a/.github/workflows/sse.yaml b/.github/workflows/sse.yaml
new file mode 100644
index 0000000..9e2f212
--- /dev/null
+++ b/.github/workflows/sse.yaml
@@ -0,0 +1,73 @@
+name: package:sse
+
+on:
+ # Run on PRs and pushes to the default branch.
+ push:
+ branches: [ main ]
+ paths:
+ - '.github/workflows/sse.yaml'
+ - 'pkgs/sse/**'
+ pull_request:
+ branches: [ main ]
+ paths:
+ - '.github/workflows/sse.yaml'
+ - 'pkgs/sse/**'
+ schedule:
+ - cron: "0 0 * * 0"
+
+env:
+ PUB_ENVIRONMENT: bot.github
+
+
+defaults:
+ run:
+ working-directory: pkgs/sse/
+
+jobs:
+ # Check code formatting and static analysis on a single OS (linux)
+ # against Dart dev.
+ analyze:
+ runs-on: ubuntu-latest
+ strategy:
+ fail-fast: false
+ matrix:
+ sdk: [dev]
+ steps:
+ - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
+ - uses: dart-lang/setup-dart@e630b99d28a3b71860378cafdc2a067c71107f94
+ with:
+ sdk: ${{ matrix.sdk }}
+ - id: install
+ name: Install dependencies
+ run: dart pub get
+ - name: Check formatting
+ run: dart format --output=none --set-exit-if-changed .
+ if: always() && steps.install.outcome == 'success'
+ - name: Analyze code
+ run: dart analyze --fatal-infos
+ if: always() && steps.install.outcome == 'success'
+
+ # Run tests on a matrix consisting of two dimensions:
+ # 1. OS: ubuntu-latest, (macos-latest, windows-latest)
+ # 2. release channel: dev
+ test:
+ needs: analyze
+ runs-on: ${{ matrix.os }}
+ strategy:
+ fail-fast: false
+ matrix:
+ # Add macos-latest and/or windows-latest if relevant for this package.
+ os: [ubuntu-latest]
+ sdk: [3.3, dev]
+ steps:
+ - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
+ - uses: dart-lang/setup-dart@e630b99d28a3b71860378cafdc2a067c71107f94
+ with:
+ sdk: ${{ matrix.sdk }}
+ - uses: nanasess/setup-chromedriver@42cc2998329f041de87dc3cfa33a930eacd57eaa
+ - id: install
+ name: Install dependencies
+ run: dart pub get
+ - name: Run VM tests
+ run: dart test --platform vm --test-randomize-ordering-seed=random -j 1
+ if: always() && steps.install.outcome == 'success'
diff --git a/.github/workflows/stack_trace.yaml b/.github/workflows/stack_trace.yaml
new file mode 100644
index 0000000..7435967
--- /dev/null
+++ b/.github/workflows/stack_trace.yaml
@@ -0,0 +1,75 @@
+name: package:stack_trace
+
+on:
+ # Run on PRs and pushes to the default branch.
+ push:
+ branches: [ main ]
+ paths:
+ - '.github/workflows/stack_trace.yaml'
+ - 'pkgs/stack_trace/**'
+ pull_request:
+ branches: [ main ]
+ paths:
+ - '.github/workflows/stack_trace.yaml'
+ - 'pkgs/stack_trace/**'
+ schedule:
+ - cron: "0 0 * * 0"
+
+env:
+ PUB_ENVIRONMENT: bot.github
+
+
+defaults:
+ run:
+ working-directory: pkgs/stack_trace/
+
+jobs:
+ # Check code formatting and static analysis on a single OS (linux)
+ # against Dart dev.
+ analyze:
+ runs-on: ubuntu-latest
+ strategy:
+ fail-fast: false
+ matrix:
+ sdk: [dev]
+ steps:
+ - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
+ - uses: dart-lang/setup-dart@e630b99d28a3b71860378cafdc2a067c71107f94
+ with:
+ sdk: ${{ matrix.sdk }}
+ - id: install
+ name: Install dependencies
+ run: dart pub get
+ - name: Check formatting
+ run: dart format --output=none --set-exit-if-changed .
+ if: always() && steps.install.outcome == 'success'
+ - name: Analyze code
+ run: dart analyze --fatal-infos
+ if: always() && steps.install.outcome == 'success'
+
+ # Run tests on a matrix consisting of two dimensions:
+ # 1. OS: ubuntu-latest, (macos-latest, windows-latest)
+ # 2. release channel: dev
+ test:
+ needs: analyze
+ runs-on: ${{ matrix.os }}
+ strategy:
+ fail-fast: false
+ matrix:
+ # Add macos-latest and/or windows-latest if relevant for this package.
+ os: [ubuntu-latest]
+ sdk: [3.4, dev]
+ steps:
+ - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
+ - uses: dart-lang/setup-dart@e630b99d28a3b71860378cafdc2a067c71107f94
+ with:
+ sdk: ${{ matrix.sdk }}
+ - id: install
+ name: Install dependencies
+ run: dart pub get
+ - name: Run VM tests
+ run: dart test --platform vm
+ if: always() && steps.install.outcome == 'success'
+ - name: Run browser tests
+ run: dart test --platform chrome
+ if: always() && steps.install.outcome == 'success'
diff --git a/.github/workflows/stream_channel.yaml b/.github/workflows/stream_channel.yaml
new file mode 100644
index 0000000..c39424d
--- /dev/null
+++ b/.github/workflows/stream_channel.yaml
@@ -0,0 +1,74 @@
+name: package:stream_channel
+
+on:
+ # Run on PRs and pushes to the default branch.
+ push:
+ branches: [ main ]
+ paths:
+ - '.github/workflows/stream_channel.yaml'
+ - 'pkgs/stream_channel/**'
+ pull_request:
+ branches: [ main ]
+ paths:
+ - '.github/workflows/stream_channel.yaml'
+ - 'pkgs/stream_channel/**'
+ schedule:
+ - cron: "0 0 * * 0"
+
+env:
+ PUB_ENVIRONMENT: bot.github
+
+defaults:
+ run:
+ working-directory: pkgs/stream_channel/
+
+jobs:
+ # Check code formatting and static analysis on a single OS (linux)
+ # against Dart dev.
+ analyze:
+ runs-on: ubuntu-latest
+ strategy:
+ fail-fast: false
+ matrix:
+ sdk: [dev]
+ steps:
+ - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
+ - uses: dart-lang/setup-dart@e630b99d28a3b71860378cafdc2a067c71107f94
+ with:
+ sdk: ${{ matrix.sdk }}
+ - id: install
+ name: Install dependencies
+ run: dart pub get
+ - name: Check formatting
+ run: dart format --output=none --set-exit-if-changed .
+ if: always() && steps.install.outcome == 'success'
+ - name: Analyze code
+ run: dart analyze --fatal-infos
+ if: always() && steps.install.outcome == 'success'
+
+ # Run tests on a matrix consisting of two dimensions:
+ # 1. OS: ubuntu-latest, (macos-latest, windows-latest)
+ # 2. release channel: dev
+ test:
+ needs: analyze
+ runs-on: ${{ matrix.os }}
+ strategy:
+ fail-fast: false
+ matrix:
+ # Add macos-latest and/or windows-latest if relevant for this package.
+ os: [ubuntu-latest]
+ sdk: [3.3, dev]
+ steps:
+ - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
+ - uses: dart-lang/setup-dart@e630b99d28a3b71860378cafdc2a067c71107f94
+ with:
+ sdk: ${{ matrix.sdk }}
+ - id: install
+ name: Install dependencies
+ run: dart pub get
+ - name: Run VM tests
+ run: dart test --platform vm
+ if: always() && steps.install.outcome == 'success'
+ - name: Run Chrome tests
+ run: dart test --platform chrome
+ if: always() && steps.install.outcome == 'success'
diff --git a/.github/workflows/stream_transform.yaml b/.github/workflows/stream_transform.yaml
new file mode 100644
index 0000000..38be5cc
--- /dev/null
+++ b/.github/workflows/stream_transform.yaml
@@ -0,0 +1,73 @@
+name: package:stream_transform
+
+on:
+ # Run on PRs and pushes to the default branch.
+ push:
+ branches: [ main ]
+ paths:
+ - '.github/workflows/stream_transform.yaml'
+ - 'pkgs/stream_transform/**'
+ pull_request:
+ branches: [ main ]
+ paths:
+ - '.github/workflows/stream_transform.yaml'
+ - 'pkgs/stream_transform/**'
+ schedule:
+ - cron: "0 0 * * 0"
+
+env:
+ PUB_ENVIRONMENT: bot.github
+
+
+defaults:
+ run:
+ working-directory: pkgs/stream_transform/
+
+jobs:
+ # Check code formatting and static analysis on a single OS (linux)
+ # against Dart dev.
+ analyze:
+ runs-on: ubuntu-latest
+ strategy:
+ fail-fast: false
+ matrix:
+ sdk: [dev]
+ steps:
+ - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
+ - uses: dart-lang/setup-dart@e630b99d28a3b71860378cafdc2a067c71107f94
+ with:
+ sdk: ${{ matrix.sdk }}
+ - id: install
+ name: Install dependencies
+ run: dart pub get
+ - name: Check formatting
+ run: dart format --output=none --set-exit-if-changed .
+ if: always() && steps.install.outcome == 'success'
+ - name: Analyze code
+ run: dart analyze --fatal-infos
+ if: always() && steps.install.outcome == 'success'
+
+ # Run tests on a matrix consisting of two dimensions:
+ # 1. OS: ubuntu-latest, (macos-latest, windows-latest)
+ # 2. release channel: dev
+ test:
+ needs: analyze
+ runs-on: ${{ matrix.os }}
+ strategy:
+ fail-fast: false
+ matrix:
+ # Add macos-latest and/or windows-latest if relevant for this package.
+ os: [ubuntu-latest]
+ # Bump SDK for Legacy tests when changing min SDK.
+ sdk: [3.4, dev]
+ steps:
+ - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
+ - uses: dart-lang/setup-dart@e630b99d28a3b71860378cafdc2a067c71107f94
+ with:
+ sdk: ${{ matrix.sdk }}
+ - id: install
+ name: Install dependencies
+ run: dart pub get
+ - name: Run tests
+ run: dart test -p chrome,vm --test-randomize-ordering-seed=random
+ if: always() && steps.install.outcome == 'success'
diff --git a/.github/workflows/term_glyph.yaml b/.github/workflows/term_glyph.yaml
new file mode 100644
index 0000000..5b3b320
--- /dev/null
+++ b/.github/workflows/term_glyph.yaml
@@ -0,0 +1,72 @@
+name: package:term_glyph
+
+on:
+ # Run on PRs and pushes to the default branch.
+ push:
+ branches: [ main ]
+ paths:
+ - '.github/workflows/term_glyph.yaml'
+ - 'pkgs/term_glyph/**'
+ pull_request:
+ branches: [ main ]
+ paths:
+ - '.github/workflows/term_glyph.yaml'
+ - 'pkgs/term_glyph/**'
+ schedule:
+ - cron: "0 0 * * 0"
+
+env:
+ PUB_ENVIRONMENT: bot.github
+
+
+defaults:
+ run:
+ working-directory: pkgs/term_glyph/
+
+jobs:
+ # Check code formatting and static analysis on a single OS (linux)
+ # against Dart dev.
+ analyze:
+ runs-on: ubuntu-latest
+ strategy:
+ fail-fast: false
+ matrix:
+ sdk: [dev]
+ steps:
+ - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
+ - uses: dart-lang/setup-dart@e630b99d28a3b71860378cafdc2a067c71107f94
+ with:
+ sdk: ${{ matrix.sdk }}
+ - id: install
+ name: Install dependencies
+ run: dart pub get
+ - name: Check formatting
+ run: dart format --output=none --set-exit-if-changed .
+ if: always() && steps.install.outcome == 'success'
+ - name: Analyze code
+ run: dart analyze --fatal-infos
+ if: always() && steps.install.outcome == 'success'
+
+ # Run tests on a matrix consisting of two dimensions:
+ # 1. OS: ubuntu-latest, (macos-latest, windows-latest)
+ # 2. release channel: dev
+ test:
+ needs: analyze
+ runs-on: ${{ matrix.os }}
+ strategy:
+ fail-fast: false
+ matrix:
+ # Add macos-latest and/or windows-latest if relevant for this package.
+ os: [ubuntu-latest]
+ sdk: [3.1, dev]
+ steps:
+ - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
+ - uses: dart-lang/setup-dart@e630b99d28a3b71860378cafdc2a067c71107f94
+ with:
+ sdk: ${{ matrix.sdk }}
+ - id: install
+ name: Install dependencies
+ run: dart pub get
+ - name: Run VM tests
+ run: dart test --platform vm
+ if: always() && steps.install.outcome == 'success'
diff --git a/.github/workflows/test_reflective_loader.yaml b/.github/workflows/test_reflective_loader.yaml
new file mode 100644
index 0000000..975c970
--- /dev/null
+++ b/.github/workflows/test_reflective_loader.yaml
@@ -0,0 +1,43 @@
+name: package:test_reflective_loader
+
+on:
+ # Run on PRs and pushes to the default branch.
+ push:
+ branches: [ main ]
+ paths:
+ - '.github/workflows/test_reflective_loader.yaml'
+ - 'pkgs/test_reflective_loader/**'
+ pull_request:
+ branches: [ main ]
+ paths:
+ - '.github/workflows/test_reflective_loader.yaml'
+ - 'pkgs/test_reflective_loader/**'
+ schedule:
+ - cron: "0 0 * * 0"
+
+env:
+ PUB_ENVIRONMENT: bot.github
+
+defaults:
+ run:
+ working-directory: pkgs/test_reflective_loader/
+
+jobs:
+ build:
+ runs-on: ubuntu-latest
+ strategy:
+ fail-fast: false
+ matrix:
+ sdk: [dev, 3.1]
+
+ steps:
+ - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
+ - uses: dart-lang/setup-dart@e630b99d28a3b71860378cafdc2a067c71107f94
+ with:
+ sdk: ${{ matrix.sdk }}
+
+ - run: dart pub get
+ - name: dart format
+ run: dart format --output=none --set-exit-if-changed .
+ - run: dart analyze --fatal-infos
+ - run: dart test
diff --git a/.github/workflows/timing.yaml b/.github/workflows/timing.yaml
new file mode 100644
index 0000000..df77b13
--- /dev/null
+++ b/.github/workflows/timing.yaml
@@ -0,0 +1,67 @@
+name: package:timing
+
+on:
+ # Run on PRs and pushes to the default branch.
+ push:
+ branches: [ main ]
+ paths:
+ - '.github/workflows/timing.yaml'
+ - 'pkgs/timing/**'
+ pull_request:
+ branches: [ main ]
+ paths:
+ - '.github/workflows/timing.yaml'
+ - 'pkgs/timing/**'
+ schedule:
+ - cron: "0 0 * * 0"
+
+env:
+ PUB_ENVIRONMENT: bot.github
+
+
+defaults:
+ run:
+ working-directory: pkgs/timing/
+
+jobs:
+ # Check code formatting and static analysis on a single OS (linux)
+ # against Dart dev.
+ analyze:
+ runs-on: ubuntu-latest
+ strategy:
+ fail-fast: false
+ matrix:
+ sdk: [3.4, dev]
+ steps:
+ - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
+ - uses: dart-lang/setup-dart@e630b99d28a3b71860378cafdc2a067c71107f94
+ with:
+ sdk: ${{ matrix.sdk }}
+ - id: install
+ run: dart pub get
+ - run: dart format --output=none --set-exit-if-changed .
+ if: always() && steps.install.outcome == 'success'
+ - run: dart analyze --fatal-infos
+ if: always() && steps.install.outcome == 'success'
+
+ # Run tests on a matrix consisting of two dimensions:
+ # 1. OS: ubuntu-latest, (macos-latest, windows-latest)
+ # 2. release channel: 3.4, dev
+ test:
+ needs: analyze
+ runs-on: ${{ matrix.os }}
+ strategy:
+ fail-fast: false
+ matrix:
+ # Add macos-latest and/or windows-latest if relevant for this package.
+ os: [ubuntu-latest]
+ sdk: [3.4, dev]
+ steps:
+ - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
+ - uses: dart-lang/setup-dart@e630b99d28a3b71860378cafdc2a067c71107f94
+ with:
+ sdk: ${{ matrix.sdk }}
+ - id: install
+ run: dart pub get
+ - run: dart test --platform vm
+ if: always() && steps.install.outcome == 'success'
diff --git a/.github/workflows/watcher.yaml b/.github/workflows/watcher.yaml
new file mode 100644
index 0000000..a04483c
--- /dev/null
+++ b/.github/workflows/watcher.yaml
@@ -0,0 +1,71 @@
+name: package:watcher
+
+on:
+ # Run on PRs and pushes to the default branch.
+ push:
+ branches: [ main ]
+ paths:
+ - '.github/workflows/watcher.yaml'
+ - 'pkgs/watcher/**'
+ pull_request:
+ branches: [ main ]
+ paths:
+ - '.github/workflows/watcher.yaml'
+ - 'pkgs/watcher/**'
+ schedule:
+ - cron: "0 0 * * 0"
+
+env:
+ PUB_ENVIRONMENT: bot.github
+
+
+defaults:
+ run:
+ working-directory: pkgs/watcher/
+
+jobs:
+ # Check code formatting and static analysis on a single OS (linux)
+ # against Dart dev.
+ analyze:
+ runs-on: ubuntu-latest
+ strategy:
+ fail-fast: false
+ matrix:
+ sdk: [dev]
+ steps:
+ - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
+ - uses: dart-lang/setup-dart@e630b99d28a3b71860378cafdc2a067c71107f94
+ with:
+ sdk: ${{ matrix.sdk }}
+ - id: install
+ name: Install dependencies
+ run: dart pub get
+ - name: Check formatting
+ run: dart format --output=none --set-exit-if-changed .
+ if: always() && steps.install.outcome == 'success'
+ - name: Analyze code
+ run: dart analyze --fatal-infos
+ if: always() && steps.install.outcome == 'success'
+
+ # Run tests on a matrix consisting of two dimensions:
+ # 1. OS: ubuntu-latest, macos-latest, windows-latest
+ # 2. release channel: dev
+ test:
+ needs: analyze
+ runs-on: ${{ matrix.os }}
+ strategy:
+ fail-fast: false
+ matrix:
+ os: [ubuntu-latest, macos-latest, windows-latest]
+ sdk: [3.1, dev]
+ steps:
+ - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
+ - uses: dart-lang/setup-dart@e630b99d28a3b71860378cafdc2a067c71107f94
+ with:
+ sdk: ${{ matrix.sdk }}
+ - id: install
+ name: Install dependencies
+ run: dart pub get
+ - name: Run VM tests
+ run: dart test --platform vm
+ if: always() && steps.install.outcome == 'success'
diff --git a/.github/workflows/yaml.yaml b/.github/workflows/yaml.yaml
new file mode 100644
index 0000000..735461e
--- /dev/null
+++ b/.github/workflows/yaml.yaml
@@ -0,0 +1,75 @@
+name: package:yaml
+
+on:
+ # Run on PRs and pushes to the default branch.
+ push:
+ branches: [ main ]
+ paths:
+ - '.github/workflows/yaml.yaml'
+ - 'pkgs/yaml/**'
+ pull_request:
+ branches: [ main ]
+ paths:
+ - '.github/workflows/yaml.yaml'
+ - 'pkgs/yaml/**'
+ schedule:
+ - cron: "0 0 * * 0"
+
+env:
+ PUB_ENVIRONMENT: bot.github
+
+
+defaults:
+ run:
+ working-directory: pkgs/yaml/
+
+jobs:
+ # Check code formatting and static analysis on a single OS (linux)
+ # against Dart dev.
+ analyze:
+ runs-on: ubuntu-latest
+ strategy:
+ fail-fast: false
+ matrix:
+ sdk: [dev]
+ steps:
+ - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
+ - uses: dart-lang/setup-dart@e630b99d28a3b71860378cafdc2a067c71107f94
+ with:
+ sdk: ${{ matrix.sdk }}
+ - id: install
+ name: Install dependencies
+ run: dart pub get
+ - name: Check formatting
+ run: dart format --output=none --set-exit-if-changed .
+ if: always() && steps.install.outcome == 'success'
+ - name: Analyze code
+ run: dart analyze --fatal-infos
+ if: always() && steps.install.outcome == 'success'
+
+ # Run tests on a matrix consisting of two dimensions:
+ # 1. OS: ubuntu-latest, (macos-latest, windows-latest)
+ # 2. release channel: dev
+ test:
+ needs: analyze
+ runs-on: ${{ matrix.os }}
+ strategy:
+ fail-fast: false
+ matrix:
+ # Add macos-latest and/or windows-latest if relevant for this package.
+ os: [ubuntu-latest]
+ sdk: [3.4, dev]
+ steps:
+ - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
+ - uses: dart-lang/setup-dart@e630b99d28a3b71860378cafdc2a067c71107f94
+ with:
+ sdk: ${{ matrix.sdk }}
+ - id: install
+ name: Install dependencies
+ run: dart pub get
+ - name: Run VM tests
+ run: dart test --platform vm
+ if: always() && steps.install.outcome == 'success'
+ - name: Run Chrome tests
+ run: dart test --platform chrome
+ if: always() && steps.install.outcome == 'success'
diff --git a/.github/workflows/yaml_edit.yaml b/.github/workflows/yaml_edit.yaml
new file mode 100644
index 0000000..ffea62c
--- /dev/null
+++ b/.github/workflows/yaml_edit.yaml
@@ -0,0 +1,91 @@
+name: package:yaml_edit
+
+on:
+ # Run on PRs and pushes to the default branch.
+ push:
+ branches: [ main ]
+ paths:
+ - '.github/workflows/yaml_edit.yaml'
+ - 'pkgs/yaml_edit/**'
+ pull_request:
+ branches: [ main ]
+ paths:
+ - '.github/workflows/yaml_edit.yaml'
+ - 'pkgs/yaml_edit/**'
+ schedule:
+ - cron: "0 0 * * 0"
+
+env:
+ PUB_ENVIRONMENT: bot.github
+
+
+defaults:
+ run:
+ working-directory: pkgs/yaml_edit/
+
+jobs:
+ # Check code formatting and static analysis on a single OS (linux)
+ # against Dart dev.
+ analyze:
+ runs-on: ubuntu-latest
+ strategy:
+ fail-fast: false
+ matrix:
+ sdk: [dev]
+ steps:
+ - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
+ - uses: dart-lang/setup-dart@e630b99d28a3b71860378cafdc2a067c71107f94
+ with:
+ sdk: ${{ matrix.sdk }}
+ - id: install
+ name: Install dependencies
+ run: dart pub get
+ - name: Check formatting
+ run: dart format --output=none --set-exit-if-changed .
+ if: always() && steps.install.outcome == 'success'
+ - name: Analyze code
+ run: dart analyze --fatal-infos
+ if: always() && steps.install.outcome == 'success'
+
+ # Run tests on a matrix consisting of two dimensions:
+ # 1. OS: ubuntu-latest, (macos-latest, windows-latest)
+ # 2. release channel: dev
+ test:
+ needs: analyze
+ runs-on: ${{ matrix.os }}
+ strategy:
+ fail-fast: false
+ matrix:
+ # Add macos-latest and/or windows-latest if relevant for this package.
+ os: [ubuntu-latest]
+ sdk: ['3.1', stable, dev]
+ platform: [vm, chrome]
+ steps:
+ - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
+ - uses: dart-lang/setup-dart@e630b99d28a3b71860378cafdc2a067c71107f94
+ with:
+ sdk: ${{ matrix.sdk }}
+ - id: install
+ name: Install dependencies
+ run: dart pub get
+ - name: Run tests on ${{ matrix.platform }}
+ run: dart test --platform ${{ matrix.platform }} --coverage=./coverage
+ if: always() && steps.install.outcome == 'success'
+ # We don't collect code coverage from 2.12.0, because it doesn't work
+ - name: Convert coverage to lcov
+ run: dart run coverage:format_coverage -i ./coverage -o ./coverage/lcov.info --lcov --report-on lib/
+ if: always() && steps.install.outcome == 'success' && matrix.sdk != '2.12.0'
+ - uses: coverallsapp/github-action@cfd0633edbd2411b532b808ba7a8b5e04f76d2c8
+ if: always() && steps.install.outcome == 'success' && matrix.sdk != '2.12.0'
+ with:
+ flag-name: os:${{ matrix.os }}/dart:${{ matrix.sdk }}/platform:${{ matrix.platform }}
+ parallel: true
+
+ report-coverage:
+ needs: test
+ if: ${{ always() }}
+ runs-on: ubuntu-latest
+ steps:
+ - uses: coverallsapp/github-action@cfd0633edbd2411b532b808ba7a8b5e04f76d2c8
+ with:
+ parallel-finished: true
diff --git a/README.md b/README.md
index 91c85e5..f3281a8 100644
--- a/README.md
+++ b/README.md
@@ -29,12 +29,29 @@
| [file_testing](pkgs/file_testing/) | Testing utilities for package:file. | [](https://github.com/dart-lang/tools/issues?q=is%3Aissue+is%3Aopen+label%3Apackage%3Afile_testing) | [](https://pub.dev/packages/file_testing) |
| [graphs](pkgs/graphs/) | Graph algorithms that operate on graphs in any representation. | [](https://github.com/dart-lang/tools/issues?q=is%3Aissue+is%3Aopen+label%3Apackage%3Agraphs) | [](https://pub.dev/packages/graphs) |
| [html](pkgs/html/) | APIs for parsing and manipulating HTML content outside the browser. | [](https://github.com/dart-lang/tools/issues?q=is%3Aissue+is%3Aopen+label%3Apackage%3Ahtml) | [](https://pub.dev/packages/html) |
+| [io](pkgs/io/) | Utilities for the Dart VM Runtime including support for ANSI colors, file copying, and standard exit code values. | [](https://pub.dev/packages/io) |
| [json_rpc_2](pkgs/json_rpc_2/) | Utilities to write a client or server using the JSON-RPC 2.0 spec. | [](https://github.com/dart-lang/tools/issues?q=is%3Aissue+is%3Aopen+label%3Apackage%3Ajson_rpc_2) | [](https://pub.dev/packages/json_rpc_2) |
| [markdown](pkgs/markdown/) | A portable Markdown library written in Dart that can parse Markdown into HTML. | [](https://pub.dev/packages/markdown) |
| [mime](pkgs/mime/) | Utilities for handling media (MIME) types, including determining a type from a file extension and file contents. | [](https://github.com/dart-lang/tools/issues?q=is%3Aissue+is%3Aopen+label%3Apackage%3Amime) | [](https://pub.dev/packages/mime) |
| [oauth2](pkgs/oauth2/) | A client library for authenticating with a remote service via OAuth2 on behalf of a user, and making authorized HTTP requests with the user's OAuth2 credentials. | [](https://github.com/dart-lang/tools/issues?q=is%3Aissue+is%3Aopen+label%3Apackage%3Aoauth2) | [](https://pub.dev/packages/oauth2) |
+| [package_config](pkgs/package_config/) | Support for reading and writing Dart Package Configuration files. | [](https://github.com/dart-lang/tools/issues?q=is%3Aissue+is%3Aopen+label%3Apackage%3Apackage_config) | [](https://pub.dev/packages/package_config) |
+| [pool](pkgs/pool/) | Manage a finite pool of resources. Useful for controlling concurrent file system or network requests. | [](https://github.com/dart-lang/tools/issues?q=is%3Aissue+is%3Aopen+label%3Apackage%3Apool) | [](https://pub.dev/packages/pool) |
+| [pub_semver](pkgs/pub_semver/) | Versions and version constraints implementing pub's versioning policy. This is very similar to vanilla semver, with a few corner cases. | [](https://github.com/dart-lang/tools/issues?q=is%3Aissue+is%3Aopen+label%3Apackage%3Apub_semver) | [](https://pub.dev/packages/pub_semver) |
+| [pubspec_parse](pkgs/pubspec_parse/) | Simple package for parsing pubspec.yaml files with a type-safe API and rich error reporting. | [](https://github.com/dart-lang/tools/issues?q=is%3Aissue+is%3Aopen+label%3Apackage%3Apubspec_parse) | [](https://pub.dev/packages/pubspec_parse) |
| [source_map_stack_trace](pkgs/source_map_stack_trace/) | A package for applying source maps to stack traces. | [](https://github.com/dart-lang/tools/issues?q=is%3Aissue+is%3Aopen+label%3Apackage%3Asource_map_stack_trace) | [](https://pub.dev/packages/source_map_stack_trace) |
+| [source_maps](pkgs/source_maps/) | A library to programmatically manipulate source map files. | [](https://github.com/dart-lang/tools/issues?q=is%3Aissue+is%3Aopen+label%3Apackage%3Asource_maps) | [](https://pub.dev/packages/source_maps) |
+| [source_span](pkgs/source_span/) | Provides a standard representation for source code locations and spans. | [](https://github.com/dart-lang/tools/issues?q=is%3Aissue+is%3Aopen+label%3Apackage%3Asource_span) | [](https://pub.dev/packages/source_span) |
+| [sse](pkgs/sse/) | Provides client and server functionality for setting up bi-directional communication through Server Sent Events (SSE) and corresponding POST requests. | [](https://github.com/dart-lang/tools/issues?q=is%3Aissue+is%3Aopen+label%3Apackage%3Asse) | [](https://pub.dev/packages/sse) |
+| [stack_trace](pkgs/stack_trace/) | A package for manipulating stack traces and printing them readably. | [](https://github.com/dart-lang/tools/issues?q=is%3Aissue+is%3Aopen+label%3Apackage%3Astack_trace) | [](https://pub.dev/packages/stack_trace) |
+| [stream_channel](pkgs/stream_channel/) | An abstraction for two-way communication channels based on the Dart Stream class. | [](https://github.com/dart-lang/tools/issues?q=is%3Aissue+is%3Aopen+label%3Apackage%3Astream_channel) | [](https://pub.dev/packages/stream_channel) |
+| [stream_transform](pkgs/stream_transform/) | A collection of utilities to transform and manipulate streams. | [](https://github.com/dart-lang/tools/issues?q=is%3Aissue+is%3Aopen+label%3Apackage%3Astream_transform) | [](https://pub.dev/packages/stream_transform) |
+| [term_glyph](pkgs/term_glyph/) | Useful Unicode glyphs and ASCII substitutes. | [](https://github.com/dart-lang/tools/issues?q=is%3Aissue+is%3Aopen+label%3Apackage%3Aterm_glyph) | [](https://pub.dev/packages/term_glyph) |
+| [test_reflective_loader](pkgs/test_reflective_loader/) | Support for discovering tests and test suites using reflection. | [](https://github.com/dart-lang/tools/issues?q=is%3Aissue+is%3Aopen+label%3Apackage%3Atest_reflective_loader) | [](https://pub.dev/packages/test_reflective_loader) |
+| [timing](pkgs/timing/) | A simple package for tracking the performance of synchronous and asynchronous actions. | [](https://github.com/dart-lang/tools/issues?q=is%3Aissue+is%3Aopen+label%3Apackage%3Atiming) | [](https://pub.dev/packages/timing) |
| [unified_analytics](pkgs/unified_analytics/) | A package for logging analytics for all Dart and Flutter related tooling to Google Analytics. | [](https://github.com/dart-lang/tools/issues?q=is%3Aissue+is%3Aopen+label%3Apackage%3Aunified_analytics) | [](https://pub.dev/packages/unified_analytics) |
+| [watcher](pkgs/watcher/) | Monitor directories and send notifications when the contents change. | [](https://github.com/dart-lang/tools/issues?q=is%3Aissue+is%3Aopen+label%3Apackage%3Awatcher) | [](https://pub.dev/packages/watcher) |
+| [yaml](pkgs/yaml/) | A parser for YAML, a human-friendly data serialization standard | [](https://github.com/dart-lang/tools/issues?q=is%3Aissue+is%3Aopen+label%3Apackage%3Ayaml) | [](https://pub.dev/packages/yaml) |
+| [yaml_edit](pkgs/yaml_edit/) | A library for YAML manipulation with comment and whitespace preservation. | [](https://github.com/dart-lang/tools/issues?q=is%3Aissue+is%3Aopen+label%3Apackage%3Ayaml_edit) | [](https://pub.dev/packages/yaml_edit) |
## Publishing automation
diff --git a/pkgs/bazel_worker/benchmark/benchmark.dart b/pkgs/bazel_worker/benchmark/benchmark.dart
index 035e2b8..0a03122 100644
--- a/pkgs/bazel_worker/benchmark/benchmark.dart
+++ b/pkgs/bazel_worker/benchmark/benchmark.dart
@@ -12,10 +12,7 @@
var path = 'blaze-bin/some/path/to/a/file/that/is/an/input/$i';
workRequest
..arguments.add('--input=$path')
- ..inputs.add(Input(
- path: '',
- digest: List.filled(70, 0x11),
- ));
+ ..inputs.add(Input(path: '', digest: List.filled(70, 0x11)));
}
// Serialize it.
@@ -24,14 +21,20 @@
print('Request has $length requestBytes.');
// Add the length in front base 128 encoded as in the worker protocol.
- requestBytes =
- Uint8List.fromList(requestBytes.toList()..insertAll(0, _varInt(length)));
+ requestBytes = Uint8List.fromList(
+ requestBytes.toList()..insertAll(0, _varInt(length)),
+ );
// Split into 10000 byte chunks.
var lists = <Uint8List>[];
for (var i = 0; i < requestBytes.length; i += 10000) {
- lists.add(Uint8List.sublistView(
- requestBytes, i, min(i + 10000, requestBytes.length)));
+ lists.add(
+ Uint8List.sublistView(
+ requestBytes,
+ i,
+ min(i + 10000, requestBytes.length),
+ ),
+ );
}
// Time `AsyncMessageGrouper` and deserialization.
diff --git a/pkgs/bazel_worker/e2e_test/bin/async_worker_in_isolate.dart b/pkgs/bazel_worker/e2e_test/bin/async_worker_in_isolate.dart
index a94875d..285b03d 100644
--- a/pkgs/bazel_worker/e2e_test/bin/async_worker_in_isolate.dart
+++ b/pkgs/bazel_worker/e2e_test/bin/async_worker_in_isolate.dart
@@ -17,7 +17,10 @@
Future main(List<String> args, [SendPort? message]) async {
var receivePort = ReceivePort();
await Isolate.spawnUri(
- Uri.file('async_worker.dart'), [], receivePort.sendPort);
+ Uri.file('async_worker.dart'),
+ [],
+ receivePort.sendPort,
+ );
var worker = await ForwardsToIsolateAsyncWorker.create(receivePort);
await worker.run();
diff --git a/pkgs/bazel_worker/e2e_test/lib/async_worker.dart b/pkgs/bazel_worker/e2e_test/lib/async_worker.dart
index d48d87c..55f5171 100644
--- a/pkgs/bazel_worker/e2e_test/lib/async_worker.dart
+++ b/pkgs/bazel_worker/e2e_test/lib/async_worker.dart
@@ -16,9 +16,6 @@
@override
Future<WorkResponse> performRequest(WorkRequest request) async {
- return WorkResponse(
- exitCode: 0,
- output: request.arguments.join('\n'),
- );
+ return WorkResponse(exitCode: 0, output: request.arguments.join('\n'));
}
}
diff --git a/pkgs/bazel_worker/e2e_test/lib/forwards_to_isolate_async_worker.dart b/pkgs/bazel_worker/e2e_test/lib/forwards_to_isolate_async_worker.dart
index bb937b2..a4845cf 100644
--- a/pkgs/bazel_worker/e2e_test/lib/forwards_to_isolate_async_worker.dart
+++ b/pkgs/bazel_worker/e2e_test/lib/forwards_to_isolate_async_worker.dart
@@ -13,9 +13,11 @@
final IsolateDriverConnection _isolateDriverConnection;
static Future<ForwardsToIsolateAsyncWorker> create(
- ReceivePort receivePort) async {
+ ReceivePort receivePort,
+ ) async {
return ForwardsToIsolateAsyncWorker(
- await IsolateDriverConnection.create(receivePort));
+ await IsolateDriverConnection.create(receivePort),
+ );
}
ForwardsToIsolateAsyncWorker(this._isolateDriverConnection);
diff --git a/pkgs/bazel_worker/e2e_test/pubspec.yaml b/pkgs/bazel_worker/e2e_test/pubspec.yaml
index 56f00cd..7eaa89a 100644
--- a/pkgs/bazel_worker/e2e_test/pubspec.yaml
+++ b/pkgs/bazel_worker/e2e_test/pubspec.yaml
@@ -10,6 +10,6 @@
dev_dependencies:
cli_util: ^0.4.2
- dart_flutter_team_lints: ^1.0.0
+ dart_flutter_team_lints: ^3.0.0
path: ^1.8.0
test: ^1.16.0
diff --git a/pkgs/bazel_worker/e2e_test/test/e2e_test.dart b/pkgs/bazel_worker/e2e_test/test/e2e_test.dart
index caa813a..6b79b5e 100644
--- a/pkgs/bazel_worker/e2e_test/test/e2e_test.dart
+++ b/pkgs/bazel_worker/e2e_test/test/e2e_test.dart
@@ -12,14 +12,18 @@
void main() {
var dart = p.join(sdkPath, 'bin', 'dart');
- runE2eTestForWorker('sync worker',
- () => Process.start(dart, [p.join('bin', 'sync_worker.dart')]));
- runE2eTestForWorker('async worker',
- () => Process.start(dart, [p.join('bin', 'async_worker.dart')]));
runE2eTestForWorker(
- 'async worker in isolate',
- () =>
- Process.start(dart, [p.join('bin', 'async_worker_in_isolate.dart')]));
+ 'sync worker',
+ () => Process.start(dart, [p.join('bin', 'sync_worker.dart')]),
+ );
+ runE2eTestForWorker(
+ 'async worker',
+ () => Process.start(dart, [p.join('bin', 'async_worker.dart')]),
+ );
+ runE2eTestForWorker(
+ 'async worker in isolate',
+ () => Process.start(dart, [p.join('bin', 'async_worker_in_isolate.dart')]),
+ );
}
void runE2eTestForWorker(String groupName, SpawnWorker spawnWorker) {
diff --git a/pkgs/bazel_worker/example/client.dart b/pkgs/bazel_worker/example/client.dart
index 7147fcb..326bb18 100644
--- a/pkgs/bazel_worker/example/client.dart
+++ b/pkgs/bazel_worker/example/client.dart
@@ -5,10 +5,14 @@
void main() async {
var scratchSpace = await Directory.systemTemp.createTemp();
var driver = BazelWorkerDriver(
- () => Process.start(Platform.resolvedExecutable,
- [Platform.script.resolve('worker.dart').toFilePath()],
- workingDirectory: scratchSpace.path),
- maxWorkers: 4);
+ () => Process.start(
+ Platform.resolvedExecutable,
+ [
+ Platform.script.resolve('worker.dart').toFilePath(),
+ ],
+ workingDirectory: scratchSpace.path),
+ maxWorkers: 4,
+ );
var response = await driver.doWork(WorkRequest(arguments: ['foo']));
if (response.exitCode != EXIT_CODE_OK) {
print('Worker request failed');
diff --git a/pkgs/bazel_worker/lib/src/async_message_grouper.dart b/pkgs/bazel_worker/lib/src/async_message_grouper.dart
index e1f0dea..8fc4778 100644
--- a/pkgs/bazel_worker/lib/src/async_message_grouper.dart
+++ b/pkgs/bazel_worker/lib/src/async_message_grouper.dart
@@ -86,13 +86,18 @@
// Copy as much as possible from the input buffer. Limit is the
// smaller of the remaining length to fill in the message and the
// remaining length in the buffer.
- var lengthToCopy = min(_message.length - _messagePos,
- _inputBuffer.length - _inputBufferPos);
+ var lengthToCopy = min(
+ _message.length - _messagePos,
+ _inputBuffer.length - _inputBufferPos,
+ );
_message.setRange(
- _messagePos,
- _messagePos + lengthToCopy,
- _inputBuffer.sublist(
- _inputBufferPos, _inputBufferPos + lengthToCopy));
+ _messagePos,
+ _messagePos + lengthToCopy,
+ _inputBuffer.sublist(
+ _inputBufferPos,
+ _inputBufferPos + lengthToCopy,
+ ),
+ );
_messagePos += lengthToCopy;
_inputBufferPos += lengthToCopy;
diff --git a/pkgs/bazel_worker/lib/src/driver/driver.dart b/pkgs/bazel_worker/lib/src/driver/driver.dart
index 4a78020..06cf0fe 100644
--- a/pkgs/bazel_worker/lib/src/driver/driver.dart
+++ b/pkgs/bazel_worker/lib/src/driver/driver.dart
@@ -44,9 +44,12 @@
/// Factory method that spawns a worker process.
final SpawnWorker _spawnWorker;
- BazelWorkerDriver(this._spawnWorker,
- {int? maxIdleWorkers, int? maxWorkers, int? maxRetries})
- : _maxIdleWorkers = maxIdleWorkers ?? 4,
+ BazelWorkerDriver(
+ this._spawnWorker, {
+ int? maxIdleWorkers,
+ int? maxWorkers,
+ int? maxRetries,
+ }) : _maxIdleWorkers = maxIdleWorkers ?? 4,
_maxWorkers = maxWorkers ?? 4,
_maxRetries = maxRetries ?? 4;
@@ -56,8 +59,10 @@
/// [request] has been actually sent to the worker. This allows the caller
/// to determine when actual work is being done versus just waiting for an
/// available worker.
- Future<WorkResponse> doWork(WorkRequest request,
- {void Function(Future<WorkResponse?>)? trackWork}) {
+ Future<WorkResponse> doWork(
+ WorkRequest request, {
+ void Function(Future<WorkResponse?>)? trackWork,
+ }) {
var attempt = _WorkAttempt(request, trackWork: trackWork);
_workQueue.add(attempt);
_runWorkQueue();
@@ -69,9 +74,11 @@
for (var worker in _readyWorkers.toList()) {
_killWorker(worker);
}
- await Future.wait(_spawningWorkers.map((worker) async {
- _killWorker(await worker);
- }));
+ await Future.wait(
+ _spawningWorkers.map((worker) async {
+ _killWorker(await worker);
+ }),
+ );
}
/// Runs as many items in [_workQueue] as possible given the number of
@@ -88,8 +95,10 @@
if (_workQueue.isEmpty) return;
if (_numWorkers == _maxWorkers && _idleWorkers.isEmpty) return;
if (_numWorkers > _maxWorkers) {
-      throw StateError('Internal error, created too many workers. Please '
- 'file a bug at https://github.com/dart-lang/bazel_worker/issues/new');
+ throw StateError(
+        'Internal error, created too many workers. Please '
+ 'file a bug at https://github.com/dart-lang/bazel_worker/issues/new',
+ );
}
// At this point we definitely want to run a task, we just need to decide
@@ -137,48 +146,51 @@
void _runWorker(Process worker, _WorkAttempt attempt) {
var rescheduled = false;
- runZonedGuarded(() async {
- var connection = _workerConnections[worker]!;
+ runZonedGuarded(
+ () async {
+ var connection = _workerConnections[worker]!;
- connection.writeRequest(attempt.request);
- var responseFuture = connection.readResponse();
- if (attempt.trackWork != null) {
- attempt.trackWork!(responseFuture);
- }
- var response = await responseFuture;
+ connection.writeRequest(attempt.request);
+ var responseFuture = connection.readResponse();
+ if (attempt.trackWork != null) {
+ attempt.trackWork!(responseFuture);
+ }
+ var response = await responseFuture;
- // It is possible for us to complete with an error response due to an
- // unhandled async error before we get here.
- if (!attempt.responseCompleter.isCompleted) {
- if (response.exitCode == EXIT_CODE_BROKEN_PIPE) {
+ // It is possible for us to complete with an error response due to an
+ // unhandled async error before we get here.
+ if (!attempt.responseCompleter.isCompleted) {
+ if (response.exitCode == EXIT_CODE_BROKEN_PIPE) {
+ rescheduled = _tryReschedule(attempt);
+ if (rescheduled) return;
+ stderr.writeln('Failed to run request ${attempt.request}');
+ response = WorkResponse(
+ exitCode: EXIT_CODE_ERROR,
+ output:
+ 'Invalid response from worker, this probably means it wrote '
+ 'invalid output or died.',
+ );
+ }
+ attempt.responseCompleter.complete(response);
+ _cleanUp(worker);
+ }
+ },
+ (e, s) {
+ // Note that we don't need to do additional cleanup here on failures. If
+ // the worker dies that is already handled in a generic fashion, we just
+ // need to make sure we complete with a valid response.
+ if (!attempt.responseCompleter.isCompleted) {
rescheduled = _tryReschedule(attempt);
if (rescheduled) return;
- stderr.writeln('Failed to run request ${attempt.request}');
- response = WorkResponse(
+ var response = WorkResponse(
exitCode: EXIT_CODE_ERROR,
- output:
- 'Invalid response from worker, this probably means it wrote '
- 'invalid output or died.',
+ output: 'Error running worker:\n$e\n$s',
);
+ attempt.responseCompleter.complete(response);
+ _cleanUp(worker);
}
- attempt.responseCompleter.complete(response);
- _cleanUp(worker);
- }
- }, (e, s) {
- // Note that we don't need to do additional cleanup here on failures. If
- // the worker dies that is already handled in a generic fashion, we just
- // need to make sure we complete with a valid response.
- if (!attempt.responseCompleter.isCompleted) {
- rescheduled = _tryReschedule(attempt);
- if (rescheduled) return;
- var response = WorkResponse(
- exitCode: EXIT_CODE_ERROR,
- output: 'Error running worker:\n$e\n$s',
- );
- attempt.responseCompleter.complete(response);
- _cleanUp(worker);
- }
- });
+ },
+ );
}
/// Performs post-work cleanup for [worker].
diff --git a/pkgs/bazel_worker/lib/src/driver/driver_connection.dart b/pkgs/bazel_worker/lib/src/driver/driver_connection.dart
index b419deb..80d5c98 100644
--- a/pkgs/bazel_worker/lib/src/driver/driver_connection.dart
+++ b/pkgs/bazel_worker/lib/src/driver/driver_connection.dart
@@ -34,13 +34,16 @@
Future<void> get done => _messageGrouper.done;
- StdDriverConnection(
- {Stream<List<int>>? inputStream, StreamSink<List<int>>? outputStream})
- : _messageGrouper = AsyncMessageGrouper(inputStream ?? stdin),
+ StdDriverConnection({
+ Stream<List<int>>? inputStream,
+ StreamSink<List<int>>? outputStream,
+ }) : _messageGrouper = AsyncMessageGrouper(inputStream ?? stdin),
_outputStream = outputStream ?? stdout;
factory StdDriverConnection.forWorker(Process worker) => StdDriverConnection(
- inputStream: worker.stdout, outputStream: worker.stdin);
+ inputStream: worker.stdout,
+ outputStream: worker.stdin,
+ );
  /// Note: This will attempt to recover from invalid proto messages by parsing
/// them as strings. This is a common error case for workers (they print a
diff --git a/pkgs/bazel_worker/lib/src/utils.dart b/pkgs/bazel_worker/lib/src/utils.dart
index 609b435..f67bbac 100644
--- a/pkgs/bazel_worker/lib/src/utils.dart
+++ b/pkgs/bazel_worker/lib/src/utils.dart
@@ -13,8 +13,9 @@
var delimiterBuffer = CodedBufferWriter();
delimiterBuffer.writeInt32NoTag(messageBuffer.lengthInBytes);
- var result =
- Uint8List(messageBuffer.lengthInBytes + delimiterBuffer.lengthInBytes);
+ var result = Uint8List(
+ messageBuffer.lengthInBytes + delimiterBuffer.lengthInBytes,
+ );
delimiterBuffer.writeTo(result);
messageBuffer.writeTo(result, delimiterBuffer.lengthInBytes);
diff --git a/pkgs/bazel_worker/lib/src/worker/async_worker_loop.dart b/pkgs/bazel_worker/lib/src/worker/async_worker_loop.dart
index 5182b55..a95d09a 100644
--- a/pkgs/bazel_worker/lib/src/worker/async_worker_loop.dart
+++ b/pkgs/bazel_worker/lib/src/worker/async_worker_loop.dart
@@ -32,20 +32,20 @@
var request = await connection.readRequest();
if (request == null) break;
var printMessages = StringBuffer();
- response = await runZoned(() => performRequest(request),
- zoneSpecification:
- ZoneSpecification(print: (self, parent, zone, message) {
- printMessages.writeln();
- printMessages.write(message);
- }));
+ response = await runZoned(
+ () => performRequest(request),
+ zoneSpecification: ZoneSpecification(
+ print: (self, parent, zone, message) {
+ printMessages.writeln();
+ printMessages.write(message);
+ },
+ ),
+ );
if (printMessages.isNotEmpty) {
response.output = '${response.output}$printMessages';
}
} catch (e, s) {
- response = WorkResponse(
- exitCode: EXIT_CODE_ERROR,
- output: '$e\n$s',
- );
+ response = WorkResponse(exitCode: EXIT_CODE_ERROR, output: '$e\n$s');
}
connection.writeResponse(response);
diff --git a/pkgs/bazel_worker/lib/src/worker/sync_worker_loop.dart b/pkgs/bazel_worker/lib/src/worker/sync_worker_loop.dart
index a857105..51da684 100644
--- a/pkgs/bazel_worker/lib/src/worker/sync_worker_loop.dart
+++ b/pkgs/bazel_worker/lib/src/worker/sync_worker_loop.dart
@@ -30,19 +30,20 @@
var request = connection.readRequest();
if (request == null) break;
var printMessages = StringBuffer();
- response = runZoned(() => performRequest(request), zoneSpecification:
- ZoneSpecification(print: (self, parent, zone, message) {
- printMessages.writeln();
- printMessages.write(message);
- }));
+ response = runZoned(
+ () => performRequest(request),
+ zoneSpecification: ZoneSpecification(
+ print: (self, parent, zone, message) {
+ printMessages.writeln();
+ printMessages.write(message);
+ },
+ ),
+ );
if (printMessages.isNotEmpty) {
response.output = '${response.output}$printMessages';
}
} catch (e, s) {
- response = WorkResponse(
- exitCode: EXIT_CODE_ERROR,
- output: '$e\n$s',
- );
+ response = WorkResponse(exitCode: EXIT_CODE_ERROR, output: '$e\n$s');
}
connection.writeResponse(response);
diff --git a/pkgs/bazel_worker/lib/src/worker/worker_connection.dart b/pkgs/bazel_worker/lib/src/worker/worker_connection.dart
index b395316..fd5508e 100644
--- a/pkgs/bazel_worker/lib/src/worker/worker_connection.dart
+++ b/pkgs/bazel_worker/lib/src/worker/worker_connection.dart
@@ -29,13 +29,16 @@
/// Creates a [StdAsyncWorkerConnection] with the specified [inputStream]
/// and [outputStream], unless [sendPort] is specified, in which case
/// creates a [SendPortAsyncWorkerConnection].
- factory AsyncWorkerConnection(
- {Stream<List<int>>? inputStream,
- StreamSink<List<int>>? outputStream,
- SendPort? sendPort}) =>
+ factory AsyncWorkerConnection({
+ Stream<List<int>>? inputStream,
+ StreamSink<List<int>>? outputStream,
+ SendPort? sendPort,
+ }) =>
sendPort == null
? StdAsyncWorkerConnection(
- inputStream: inputStream, outputStream: outputStream)
+ inputStream: inputStream,
+ outputStream: outputStream,
+ )
: SendPortAsyncWorkerConnection(sendPort);
@override
@@ -53,9 +56,10 @@
final AsyncMessageGrouper _messageGrouper;
final StreamSink<List<int>> _outputStream;
- StdAsyncWorkerConnection(
- {Stream<List<int>>? inputStream, StreamSink<List<int>>? outputStream})
- : _messageGrouper = AsyncMessageGrouper(inputStream ?? stdin),
+ StdAsyncWorkerConnection({
+ Stream<List<int>>? inputStream,
+ StreamSink<List<int>>? outputStream,
+ }) : _messageGrouper = AsyncMessageGrouper(inputStream ?? stdin),
_outputStream = outputStream ?? stdout;
@override
diff --git a/pkgs/bazel_worker/lib/testing.dart b/pkgs/bazel_worker/lib/testing.dart
index 3ae4c1f..7aefabb 100644
--- a/pkgs/bazel_worker/lib/testing.dart
+++ b/pkgs/bazel_worker/lib/testing.dart
@@ -72,10 +72,18 @@
}
@override
- StreamSubscription<Uint8List> listen(void Function(Uint8List bytes)? onData,
- {Function? onError, void Function()? onDone, bool? cancelOnError}) {
- return _controller.stream.listen(onData,
- onError: onError, onDone: onDone, cancelOnError: cancelOnError);
+ StreamSubscription<Uint8List> listen(
+ void Function(Uint8List bytes)? onData, {
+ Function? onError,
+ void Function()? onDone,
+ bool? cancelOnError,
+ }) {
+ return _controller.stream.listen(
+ onData,
+ onError: onError,
+ onDone: onDone,
+ cancelOnError: cancelOnError,
+ );
}
@override
@@ -165,8 +173,9 @@
final List<WorkResponse> responses = <WorkResponse>[];
TestAsyncWorkerConnection(
- Stream<List<int>> inputStream, StreamSink<List<int>> outputStream)
- : super(inputStream: inputStream, outputStream: outputStream);
+ Stream<List<int>> inputStream,
+ StreamSink<List<int>> outputStream,
+ ) : super(inputStream: inputStream, outputStream: outputStream);
@override
void writeResponse(WorkResponse response) {
diff --git a/pkgs/bazel_worker/test/driver_test.dart b/pkgs/bazel_worker/test/driver_test.dart
index c397830..c3db55c 100644
--- a/pkgs/bazel_worker/test/driver_test.dart
+++ b/pkgs/bazel_worker/test/driver_test.dart
@@ -23,27 +23,37 @@
await _doRequests(count: 1);
});
- test('can run multiple batches of requests through multiple workers',
- () async {
- var maxWorkers = 4;
- var maxIdleWorkers = 2;
- driver = BazelWorkerDriver(MockWorker.spawn,
- maxWorkers: maxWorkers, maxIdleWorkers: maxIdleWorkers);
- for (var i = 0; i < 10; i++) {
- await _doRequests(driver: driver);
- expect(MockWorker.liveWorkers.length, maxIdleWorkers);
- // No workers should be killed while there is ongoing work, but they
- // should be cleaned up once there isn't any more work to do.
- expect(MockWorker.deadWorkers.length,
- (maxWorkers - maxIdleWorkers) * (i + 1));
- }
- });
+ test(
+ 'can run multiple batches of requests through multiple workers',
+ () async {
+ var maxWorkers = 4;
+ var maxIdleWorkers = 2;
+ driver = BazelWorkerDriver(
+ MockWorker.spawn,
+ maxWorkers: maxWorkers,
+ maxIdleWorkers: maxIdleWorkers,
+ );
+ for (var i = 0; i < 10; i++) {
+ await _doRequests(driver: driver);
+ expect(MockWorker.liveWorkers.length, maxIdleWorkers);
+ // No workers should be killed while there is ongoing work, but they
+ // should be cleaned up once there isn't any more work to do.
+ expect(
+ MockWorker.deadWorkers.length,
+ (maxWorkers - maxIdleWorkers) * (i + 1),
+ );
+ }
+ },
+ );
test('can run multiple requests through one worker', () async {
var maxWorkers = 1;
var maxIdleWorkers = 1;
- driver = BazelWorkerDriver(MockWorker.spawn,
- maxWorkers: maxWorkers, maxIdleWorkers: maxIdleWorkers);
+ driver = BazelWorkerDriver(
+ MockWorker.spawn,
+ maxWorkers: maxWorkers,
+ maxIdleWorkers: maxIdleWorkers,
+ );
for (var i = 0; i < 10; i++) {
await _doRequests(driver: driver);
expect(MockWorker.liveWorkers.length, 1);
@@ -52,8 +62,11 @@
});
test('can run one request through multiple workers', () async {
- driver =
- BazelWorkerDriver(MockWorker.spawn, maxWorkers: 4, maxIdleWorkers: 4);
+ driver = BazelWorkerDriver(
+ MockWorker.spawn,
+ maxWorkers: 4,
+ maxIdleWorkers: 4,
+ );
for (var i = 0; i < 10; i++) {
await _doRequests(driver: driver, count: 1);
expect(MockWorker.liveWorkers.length, 1);
@@ -63,8 +76,11 @@
test('can run with maxIdleWorkers == 0', () async {
var maxWorkers = 4;
- driver = BazelWorkerDriver(MockWorker.spawn,
- maxWorkers: maxWorkers, maxIdleWorkers: 0);
+ driver = BazelWorkerDriver(
+ MockWorker.spawn,
+ maxWorkers: maxWorkers,
+ maxIdleWorkers: 0,
+ );
for (var i = 0; i < 10; i++) {
await _doRequests(driver: driver);
expect(MockWorker.liveWorkers.length, 0);
@@ -77,14 +93,15 @@
driver = BazelWorkerDriver(MockWorker.spawn, maxWorkers: maxWorkers);
var tracking = <Future>[];
await _doRequests(
- driver: driver,
- count: 10,
- trackWork: (Future response) {
- // We should never be tracking more than `maxWorkers` jobs at a time.
- expect(tracking.length, lessThan(maxWorkers));
- tracking.add(response);
- response.then((_) => tracking.remove(response));
- });
+ driver: driver,
+ count: 10,
+ trackWork: (Future response) {
+ // We should never be tracking more than `maxWorkers` jobs at a time.
+ expect(tracking.length, lessThan(maxWorkers));
+ tracking.add(response);
+ response.then((_) => tracking.remove(response));
+ },
+ );
});
group('failing workers', () {
@@ -93,27 +110,39 @@
void createDriver({int maxRetries = 2, int numBadWorkers = 2}) {
var numSpawned = 0;
driver = BazelWorkerDriver(
- () async => MockWorker(workerLoopFactory: (MockWorker worker) {
- var connection = StdAsyncWorkerConnection(
- inputStream: worker._stdinController.stream,
- outputStream: worker._stdoutController.sink);
- if (numSpawned < numBadWorkers) {
- numSpawned++;
- return ThrowingMockWorkerLoop(
- worker, MockWorker.responseQueue, connection);
- } else {
- return MockWorkerLoop(MockWorker.responseQueue,
- connection: connection);
- }
- }),
- maxRetries: maxRetries);
+ () async => MockWorker(
+ workerLoopFactory: (MockWorker worker) {
+ var connection = StdAsyncWorkerConnection(
+ inputStream: worker._stdinController.stream,
+ outputStream: worker._stdoutController.sink,
+ );
+ if (numSpawned < numBadWorkers) {
+ numSpawned++;
+ return ThrowingMockWorkerLoop(
+ worker,
+ MockWorker.responseQueue,
+ connection,
+ );
+ } else {
+ return MockWorkerLoop(
+ MockWorker.responseQueue,
+ connection: connection,
+ );
+ }
+ },
+ ),
+ maxRetries: maxRetries,
+ );
}
test('should retry up to maxRetries times', () async {
createDriver();
var expectedResponse = WorkResponse();
- MockWorker.responseQueue.addAll(
- [disconnectedResponse, disconnectedResponse, expectedResponse]);
+ MockWorker.responseQueue.addAll([
+ disconnectedResponse,
+ disconnectedResponse,
+ expectedResponse,
+ ]);
var actualResponse = await driver!.doWork(WorkRequest());
// The first 2 null responses are thrown away, and we should get the
// third one.
@@ -125,23 +154,29 @@
test('should fail if it exceeds maxRetries failures', () async {
createDriver(maxRetries: 2, numBadWorkers: 3);
- MockWorker.responseQueue.addAll(
- [disconnectedResponse, disconnectedResponse, WorkResponse()]);
+ MockWorker.responseQueue.addAll([
+ disconnectedResponse,
+ disconnectedResponse,
+ WorkResponse(),
+ ]);
var actualResponse = await driver!.doWork(WorkRequest());
// Should actually get a bad response.
expect(actualResponse.exitCode, 15);
expect(
- actualResponse.output,
- 'Invalid response from worker, this probably means it wrote '
- 'invalid output or died.');
+ actualResponse.output,
+ 'Invalid response from worker, this probably means it wrote '
+ 'invalid output or died.',
+ );
expect(MockWorker.deadWorkers.length, 3);
});
});
test('handles spawnWorker failures', () async {
- driver = BazelWorkerDriver(() async => throw StateError('oh no!'),
- maxRetries: 0);
+ driver = BazelWorkerDriver(
+ () async => throw StateError('oh no!'),
+ maxRetries: 0,
+ );
expect(driver!.doWork(WorkRequest()), throwsA(isA<StateError>()));
});
@@ -156,10 +191,11 @@
/// Runs [count] of fake work requests through [driver], and asserts that they
/// all completed.
-Future _doRequests(
- {BazelWorkerDriver? driver,
- int count = 100,
- void Function(Future<WorkResponse?>)? trackWork}) async {
+Future _doRequests({
+ BazelWorkerDriver? driver,
+ int count = 100,
+ void Function(Future<WorkResponse?>)? trackWork,
+}) async {
// If we create a driver, we need to make sure and terminate it.
var terminateDriver = driver == null;
driver ??= BazelWorkerDriver(MockWorker.spawn);
@@ -167,7 +203,8 @@
var responses = List.generate(count, (_) => WorkResponse());
MockWorker.responseQueue.addAll(responses);
var actualResponses = await Future.wait(
- requests.map((request) => driver!.doWork(request, trackWork: trackWork)));
+ requests.map((request) => driver!.doWork(request, trackWork: trackWork)),
+ );
expect(actualResponses, unorderedEquals(responses));
if (terminateDriver) await driver.terminateWorkers();
}
@@ -191,9 +228,11 @@
class ThrowingMockWorkerLoop extends MockWorkerLoop {
final MockWorker _mockWorker;
- ThrowingMockWorkerLoop(this._mockWorker, Queue<WorkResponse> responseQueue,
- AsyncWorkerConnection connection)
- : super(responseQueue, connection: connection);
+ ThrowingMockWorkerLoop(
+ this._mockWorker,
+ Queue<WorkResponse> responseQueue,
+ AsyncWorkerConnection connection,
+ ) : super(responseQueue, connection: connection);
/// Run the worker loop. The returned [Future] doesn't complete until
/// [connection#readRequest] returns `null`.
@@ -234,10 +273,13 @@
liveWorkers.add(this);
var workerLoop = workerLoopFactory != null
? workerLoopFactory(this)
- : MockWorkerLoop(responseQueue,
+ : MockWorkerLoop(
+ responseQueue,
connection: StdAsyncWorkerConnection(
- inputStream: _stdinController.stream,
- outputStream: _stdoutController.sink));
+ inputStream: _stdinController.stream,
+ outputStream: _stdoutController.sink,
+ ),
+ );
workerLoop.run();
}
@@ -260,8 +302,10 @@
int get pid => throw UnsupportedError('Not needed.');
@override
- bool kill(
- [ProcessSignal processSignal = ProcessSignal.sigterm, int exitCode = 0]) {
+ bool kill([
+ ProcessSignal processSignal = ProcessSignal.sigterm,
+ int exitCode = 0,
+ ]) {
if (_killed) return false;
() async {
await _stdoutController.close();
diff --git a/pkgs/bazel_worker/test/message_grouper_test.dart b/pkgs/bazel_worker/test/message_grouper_test.dart
index 475190e..fd99911 100644
--- a/pkgs/bazel_worker/test/message_grouper_test.dart
+++ b/pkgs/bazel_worker/test/message_grouper_test.dart
@@ -18,8 +18,10 @@
});
}
-void runTests(TestStdin Function() stdinFactory,
- MessageGrouper Function(Stdin) messageGrouperFactory) {
+void runTests(
+ TestStdin Function() stdinFactory,
+ MessageGrouper Function(Stdin) messageGrouperFactory,
+) {
late MessageGrouper messageGrouper;
late TestStdin stdinStream;
@@ -52,16 +54,12 @@
});
test('Short message', () async {
- await check([
- 5,
- 10,
- 20,
- 30,
- 40,
- 50
- ], [
- [10, 20, 30, 40, 50]
- ]);
+ await check(
+ [5, 10, 20, 30, 40, 50],
+ [
+ [10, 20, 30, 40, 50],
+ ],
+ );
});
test('Message with 2-byte length', () async {
@@ -79,57 +77,44 @@
});
test('Multiple messages', () async {
- await check([
- 2,
- 10,
- 20,
- 2,
- 30,
- 40
- ], [
- [10, 20],
- [30, 40]
- ]);
+ await check(
+ [2, 10, 20, 2, 30, 40],
+ [
+ [10, 20],
+ [30, 40],
+ ],
+ );
});
test('Empty message at start', () async {
- await check([
- 0,
- 2,
- 10,
- 20
- ], [
- [],
- [10, 20]
- ]);
+ await check(
+ [0, 2, 10, 20],
+ [
+ [],
+ [10, 20],
+ ],
+ );
});
test('Empty message at end', () async {
- await check([
- 2,
- 10,
- 20,
- 0
- ], [
- [10, 20],
- []
- ]);
+ await check(
+ [2, 10, 20, 0],
+ [
+ [10, 20],
+ [],
+ ],
+ );
});
test('Empty message in the middle', () async {
- await check([
- 2,
- 10,
- 20,
- 0,
- 2,
- 30,
- 40
- ], [
- [10, 20],
- [],
- [30, 40]
- ]);
+ await check(
+ [2, 10, 20, 0, 2, 30, 40],
+ [
+ [10, 20],
+ [],
+ [30, 40],
+ ],
+ );
});
test('Handles the case when stdin gives an error instead of EOF', () async {
diff --git a/pkgs/bazel_worker/test/worker_loop_test.dart b/pkgs/bazel_worker/test/worker_loop_test.dart
index 50d2151..24068b1 100644
--- a/pkgs/bazel_worker/test/worker_loop_test.dart
+++ b/pkgs/bazel_worker/test/worker_loop_test.dart
@@ -11,36 +11,45 @@
void main() {
group('SyncWorkerLoop', () {
- runTests(TestStdinSync.new, TestSyncWorkerConnection.new,
- TestSyncWorkerLoop.new);
+ runTests(
+ TestStdinSync.new,
+ TestSyncWorkerConnection.new,
+ TestSyncWorkerLoop.new,
+ );
});
group('AsyncWorkerLoop', () {
- runTests(TestStdinAsync.new, TestAsyncWorkerConnection.new,
- TestAsyncWorkerLoop.new);
+ runTests(
+ TestStdinAsync.new,
+ TestAsyncWorkerConnection.new,
+ TestAsyncWorkerLoop.new,
+ );
});
group('SyncWorkerLoopWithPrint', () {
runTests(
- TestStdinSync.new,
- TestSyncWorkerConnection.new,
- (TestSyncWorkerConnection connection) =>
- TestSyncWorkerLoop(connection, printMessage: 'Goodbye!'));
+ TestStdinSync.new,
+ TestSyncWorkerConnection.new,
+ (TestSyncWorkerConnection connection) =>
+ TestSyncWorkerLoop(connection, printMessage: 'Goodbye!'),
+ );
});
group('AsyncWorkerLoopWithPrint', () {
runTests(
- TestStdinAsync.new,
- TestAsyncWorkerConnection.new,
- (TestAsyncWorkerConnection connection) =>
- TestAsyncWorkerLoop(connection, printMessage: 'Goodbye!'));
+ TestStdinAsync.new,
+ TestAsyncWorkerConnection.new,
+ (TestAsyncWorkerConnection connection) =>
+ TestAsyncWorkerLoop(connection, printMessage: 'Goodbye!'),
+ );
});
}
void runTests<T extends TestWorkerConnection>(
- TestStdin Function() stdinFactory,
- T Function(Stdin, Stdout) workerConnectionFactory,
- TestWorkerLoop Function(T) workerLoopFactory) {
+ TestStdin Function() stdinFactory,
+ T Function(Stdin, Stdout) workerConnectionFactory,
+ TestWorkerLoop Function(T) workerLoopFactory,
+) {
late TestStdin stdinStream;
late TestStdoutStream stdoutStream;
late T connection;
@@ -63,19 +72,29 @@
// Make sure `print` never gets called in the parent zone.
var printMessages = <String>[];
- await runZoned(() => workerLoop.run(), zoneSpecification:
- ZoneSpecification(print: (self, parent, zone, message) {
- printMessages.add(message);
- }));
- expect(printMessages, isEmpty,
- reason: 'The worker loop should hide all print calls from the parent '
- 'zone.');
+ await runZoned(
+ () => workerLoop.run(),
+ zoneSpecification: ZoneSpecification(
+ print: (self, parent, zone, message) {
+ printMessages.add(message);
+ },
+ ),
+ );
+ expect(
+ printMessages,
+ isEmpty,
+ reason: 'The worker loop should hide all print calls from the parent '
+ 'zone.',
+ );
expect(connection.responses, hasLength(1));
expect(connection.responses[0], response);
if (workerLoop.printMessage != null) {
- expect(response.output, endsWith(workerLoop.printMessage!),
- reason: 'Print messages should get appended to the response output.');
+ expect(
+ response.output,
+ endsWith(workerLoop.printMessage!),
+ reason: 'Print messages should get appended to the response output.',
+ );
}
// Check that a serialized version was written to std out.
diff --git a/pkgs/clock/analysis_options.yaml b/pkgs/clock/analysis_options.yaml
index 9ee7c2b..db6072d 100644
--- a/pkgs/clock/analysis_options.yaml
+++ b/pkgs/clock/analysis_options.yaml
@@ -11,4 +11,3 @@
rules:
- avoid_private_typedef_functions
- avoid_redundant_argument_values
- - use_super_parameters
diff --git a/pkgs/coverage/analysis_options.yaml b/pkgs/coverage/analysis_options.yaml
index 82ce5e0..bb1afe0 100644
--- a/pkgs/coverage/analysis_options.yaml
+++ b/pkgs/coverage/analysis_options.yaml
@@ -9,14 +9,9 @@
linter:
rules:
- - always_declare_return_types
- avoid_slow_async_io
- cancel_subscriptions
- - comment_references
- literal_only_boolean_expressions
- prefer_final_locals
- sort_constructors_first
- sort_unnamed_constructors_first
- - test_types_in_equals
- - throw_in_finally
- - type_annotate_public_apis
diff --git a/pkgs/file/CHANGELOG.md b/pkgs/file/CHANGELOG.md
index 50c96c4..3a3969c 100644
--- a/pkgs/file/CHANGELOG.md
+++ b/pkgs/file/CHANGELOG.md
@@ -1,3 +1,5 @@
+## 7.0.2-wip
+
## 7.0.1
* Update the pubspec repository field to reflect the new package repository.
diff --git a/pkgs/file/analysis_options.yaml b/pkgs/file/analysis_options.yaml
index 8fbd2e4..d978f81 100644
--- a/pkgs/file/analysis_options.yaml
+++ b/pkgs/file/analysis_options.yaml
@@ -1,6 +1 @@
-include: package:lints/recommended.yaml
-
-analyzer:
- errors:
- # Allow having TODOs in the code
- todo: ignore
+include: package:dart_flutter_team_lints/analysis_options.yaml
diff --git a/pkgs/file/example/main.dart b/pkgs/file/example/main.dart
index 7ca0bc7..b03b363 100644
--- a/pkgs/file/example/main.dart
+++ b/pkgs/file/example/main.dart
@@ -7,8 +7,8 @@
Future<void> main() async {
final FileSystem fs = MemoryFileSystem();
- final Directory tmp = await fs.systemTempDirectory.createTemp('example_');
- final File outputFile = tmp.childFile('output');
+ final tmp = await fs.systemTempDirectory.createTemp('example_');
+ final outputFile = tmp.childFile('output');
await outputFile.writeAsString('Hello world!');
print(outputFile.readAsStringSync());
}
diff --git a/pkgs/file/lib/chroot.dart b/pkgs/file/lib/chroot.dart
index 56d2bd5..6992ad0 100644
--- a/pkgs/file/lib/chroot.dart
+++ b/pkgs/file/lib/chroot.dart
@@ -3,4 +3,6 @@
// BSD-style license that can be found in the LICENSE file.
/// A file system that provides a view into _another_ `FileSystem` via a path.
+library;
+
export 'src/backends/chroot.dart';
diff --git a/pkgs/file/lib/file.dart b/pkgs/file/lib/file.dart
index cdde9fe..c2e97b2 100644
--- a/pkgs/file/lib/file.dart
+++ b/pkgs/file/lib/file.dart
@@ -4,5 +4,7 @@
/// Core interfaces containing the abstract `FileSystem` interface definition
/// and all associated types used by `FileSystem`.
+library;
+
export 'src/forwarding.dart';
export 'src/interface.dart';
diff --git a/pkgs/file/lib/local.dart b/pkgs/file/lib/local.dart
index 74f506e..5b1e3cd 100644
--- a/pkgs/file/lib/local.dart
+++ b/pkgs/file/lib/local.dart
@@ -4,4 +4,6 @@
/// A local file system implementation. This relies on the use of `dart:io`
/// and is thus not suitable for use in the browser.
+library;
+
export 'src/backends/local.dart';
diff --git a/pkgs/file/lib/memory.dart b/pkgs/file/lib/memory.dart
index c5705ef..690b65f 100644
--- a/pkgs/file/lib/memory.dart
+++ b/pkgs/file/lib/memory.dart
@@ -4,5 +4,7 @@
/// An implementation of `FileSystem` that exists entirely in memory with an
/// internal representation loosely based on the Filesystem Hierarchy Standard.
+library;
+
export 'src/backends/memory.dart';
export 'src/backends/memory/operations.dart';
diff --git a/pkgs/file/lib/src/backends/chroot.dart b/pkgs/file/lib/src/backends/chroot.dart
index 6082e80..402dbec 100644
--- a/pkgs/file/lib/src/backends/chroot.dart
+++ b/pkgs/file/lib/src/backends/chroot.dart
@@ -2,16 +2,16 @@
// for details. All rights reserved. Use of this source code is governed by a
// BSD-style license that can be found in the LICENSE file.
-library file.src.backends.chroot;
-
import 'dart:convert';
import 'dart:typed_data';
-import 'package:file/file.dart';
-import 'package:file/src/common.dart' as common;
-import 'package:file/src/io.dart' as io;
import 'package:path/path.dart' as p;
+import '../common.dart' as common;
+import '../forwarding.dart';
+import '../interface.dart';
+import '../io.dart' as io;
+
part 'chroot/chroot_directory.dart';
part 'chroot/chroot_file.dart';
part 'chroot/chroot_file_system.dart';
diff --git a/pkgs/file/lib/src/backends/chroot/chroot_directory.dart b/pkgs/file/lib/src/backends/chroot/chroot_directory.dart
index 8fec7b1..e094193 100644
--- a/pkgs/file/lib/src/backends/chroot/chroot_directory.dart
+++ b/pkgs/file/lib/src/backends/chroot/chroot_directory.dart
@@ -2,18 +2,18 @@
// for details. All rights reserved. Use of this source code is governed by a
// BSD-style license that can be found in the LICENSE file.
-part of file.src.backends.chroot;
+part of '../chroot.dart';
class _ChrootDirectory extends _ChrootFileSystemEntity<Directory, io.Directory>
with ForwardingDirectory<Directory>, common.DirectoryAddOnsMixin {
- _ChrootDirectory(ChrootFileSystem fs, String path) : super(fs, path);
+ _ChrootDirectory(super.fs, super.path);
factory _ChrootDirectory.wrapped(
ChrootFileSystem fs,
Directory delegate, {
bool relative = false,
}) {
- String localPath = fs._local(delegate.path, relative: relative);
+ var localPath = fs._local(delegate.path, relative: relative);
return _ChrootDirectory(fs, localPath);
}
@@ -32,7 +32,7 @@
if (await fileSystem.type(path) != expectedType) {
throw common.notADirectory(path);
}
- FileSystemEntityType type = await fileSystem.type(newPath);
+ var type = await fileSystem.type(newPath);
if (type != FileSystemEntityType.notFound) {
if (type != expectedType) {
throw common.notADirectory(newPath);
@@ -44,7 +44,7 @@
throw common.directoryNotEmpty(newPath);
}
}
- String target = await fileSystem.link(path).target();
+ var target = await fileSystem.link(path).target();
await fileSystem.link(path).delete();
await fileSystem.link(newPath).create(target);
return fileSystem.directory(newPath);
@@ -60,7 +60,7 @@
if (fileSystem.typeSync(path) != expectedType) {
throw common.notADirectory(path);
}
- FileSystemEntityType type = fileSystem.typeSync(newPath);
+ var type = fileSystem.typeSync(newPath);
if (type != FileSystemEntityType.notFound) {
if (type != expectedType) {
throw common.notADirectory(newPath);
@@ -72,7 +72,7 @@
throw common.directoryNotEmpty(newPath);
}
}
- String target = fileSystem.link(path).targetSync();
+ var target = fileSystem.link(path).targetSync();
fileSystem.link(path).deleteSync();
fileSystem.link(newPath).createSync(target);
return fileSystem.directory(newPath);
@@ -97,17 +97,15 @@
@override
Future<Directory> create({bool recursive = false}) async {
if (_isLink) {
- switch (await fileSystem.type(path)) {
- case FileSystemEntityType.notFound:
- throw common.noSuchFileOrDirectory(path);
- case FileSystemEntityType.file:
- throw common.fileExists(path);
- case FileSystemEntityType.directory:
+ return switch (await fileSystem.type(path)) {
+ FileSystemEntityType.notFound =>
+ throw common.noSuchFileOrDirectory(path),
+ FileSystemEntityType.file => throw common.fileExists(path),
+ FileSystemEntityType.directory =>
// Nothing to do.
- return this;
- default:
- throw AssertionError();
- }
+ this,
+ _ => throw AssertionError()
+ };
} else {
return wrap(await delegate.create(recursive: recursive));
}
@@ -137,8 +135,8 @@
bool recursive = false,
bool followLinks = true,
}) {
- Directory delegate = this.delegate as Directory;
- String dirname = delegate.path;
+ var delegate = this.delegate as Directory;
+ var dirname = delegate.path;
return delegate
.list(recursive: recursive, followLinks: followLinks)
.map((io.FileSystemEntity entity) => _denormalize(entity, dirname));
@@ -149,8 +147,8 @@
bool recursive = false,
bool followLinks = true,
}) {
- Directory delegate = this.delegate as Directory;
- String dirname = delegate.path;
+ var delegate = this.delegate as Directory;
+ var dirname = delegate.path;
return delegate
.listSync(recursive: recursive, followLinks: followLinks)
.map((io.FileSystemEntity entity) => _denormalize(entity, dirname))
@@ -158,9 +156,9 @@
}
FileSystemEntity _denormalize(io.FileSystemEntity entity, String dirname) {
- p.Context ctx = fileSystem.path;
- String relativePart = ctx.relative(entity.path, from: dirname);
- String entityPath = ctx.join(path, relativePart);
+ var ctx = fileSystem.path;
+ var relativePart = ctx.relative(entity.path, from: dirname);
+ var entityPath = ctx.join(path, relativePart);
if (entity is io.File) {
return _ChrootFile(fileSystem, entityPath);
} else if (entity is io.Directory) {
diff --git a/pkgs/file/lib/src/backends/chroot/chroot_file.dart b/pkgs/file/lib/src/backends/chroot/chroot_file.dart
index 4b67bc1..d6c29fc 100644
--- a/pkgs/file/lib/src/backends/chroot/chroot_file.dart
+++ b/pkgs/file/lib/src/backends/chroot/chroot_file.dart
@@ -2,20 +2,20 @@
// for details. All rights reserved. Use of this source code is governed by a
// BSD-style license that can be found in the LICENSE file.
-part of file.src.backends.chroot;
+part of '../chroot.dart';
typedef _SetupCallback = dynamic Function();
class _ChrootFile extends _ChrootFileSystemEntity<File, io.File>
with ForwardingFile {
- _ChrootFile(ChrootFileSystem fs, String path) : super(fs, path);
+ _ChrootFile(super.fs, super.path);
factory _ChrootFile.wrapped(
ChrootFileSystem fs,
io.File delegate, {
bool relative = false,
}) {
- String localPath = fs._local(delegate.path, relative: relative);
+ var localPath = fs._local(delegate.path, relative: relative);
return _ChrootFile(fs, localPath);
}
@@ -126,7 +126,7 @@
@override
Future<File> create({bool recursive = false, bool exclusive = false}) async {
- String path = fileSystem._resolve(
+ var path = fileSystem._resolve(
this.path,
followLinks: false,
notFound: recursive ? _NotFoundBehavior.mkdir : _NotFoundBehavior.allow,
@@ -158,7 +158,7 @@
@override
void createSync({bool recursive = false, bool exclusive = false}) {
- String path = fileSystem._resolve(
+ var path = fileSystem._resolve(
this.path,
followLinks: false,
notFound: recursive ? _NotFoundBehavior.mkdir : _NotFoundBehavior.allow,
diff --git a/pkgs/file/lib/src/backends/chroot/chroot_file_system.dart b/pkgs/file/lib/src/backends/chroot/chroot_file_system.dart
index 6889c98..503821f 100644
--- a/pkgs/file/lib/src/backends/chroot/chroot_file_system.dart
+++ b/pkgs/file/lib/src/backends/chroot/chroot_file_system.dart
@@ -2,7 +2,7 @@
// for details. All rights reserved. Use of this source code is governed by a
// BSD-style license that can be found in the LICENSE file.
-part of file.src.backends.chroot;
+part of '../chroot.dart';
const String _thisDir = '.';
const String _parentDir = '..';
@@ -107,7 +107,7 @@
}
value = _resolve(value, notFound: _NotFoundBehavior.throwError);
- String realPath = _real(value, resolve: false);
+ var realPath = _real(value, resolve: false);
switch (delegate.typeSync(realPath, followLinks: false)) {
case FileSystemEntityType.directory:
break;
@@ -117,7 +117,7 @@
throw common.notADirectory(path as String);
}
assert(() {
- p.Context ctx = delegate.path;
+ var ctx = delegate.path;
return ctx.isAbsolute(value) && value == ctx.canonicalize(value);
}());
_cwd = value;
@@ -201,7 +201,7 @@
throw _ChrootJailException();
}
// TODO(tvolkert): See if _context.relative() works here
- String result = realPath.substring(root.length);
+ var result = realPath.substring(root.length);
if (result.isEmpty) {
result = _localRoot;
}
@@ -263,8 +263,8 @@
throw common.noSuchFileOrDirectory(path);
}
- p.Context ctx = this.path;
- String root = _localRoot;
+ var ctx = this.path;
+ var root = _localRoot;
List<String> parts, ledger;
if (ctx.isAbsolute(path)) {
parts = ctx.split(path).sublist(1);
@@ -277,9 +277,9 @@
}
String getCurrentPath() => root + ctx.joinAll(ledger);
- Set<String> breadcrumbs = <String>{};
+ var breadcrumbs = <String>{};
while (parts.isNotEmpty) {
- String segment = parts.removeAt(0);
+ var segment = parts.removeAt(0);
if (segment == _thisDir) {
continue;
} else if (segment == _parentDir) {
@@ -290,8 +290,8 @@
}
ledger.add(segment);
- String currentPath = getCurrentPath();
- String realPath = _real(currentPath, resolve: false);
+ var currentPath = getCurrentPath();
+ var realPath = _real(currentPath, resolve: false);
switch (delegate.typeSync(realPath, followLinks: false)) {
case FileSystemEntityType.directory:
@@ -333,7 +333,7 @@
if (!breadcrumbs.add(currentPath)) {
throw common.tooManyLevelsOfSymbolicLinks(path);
}
- String target = delegate.link(realPath).targetSync();
+ var target = delegate.link(realPath).targetSync();
if (ctx.isAbsolute(target)) {
ledger.clear();
parts.insertAll(0, ctx.split(target).sublist(1));
diff --git a/pkgs/file/lib/src/backends/chroot/chroot_file_system_entity.dart b/pkgs/file/lib/src/backends/chroot/chroot_file_system_entity.dart
index 8e859ac..18e37cd 100644
--- a/pkgs/file/lib/src/backends/chroot/chroot_file_system_entity.dart
+++ b/pkgs/file/lib/src/backends/chroot/chroot_file_system_entity.dart
@@ -2,7 +2,7 @@
// for details. All rights reserved. Use of this source code is governed by a
// BSD-style license that can be found in the LICENSE file.
-part of file.src.backends.chroot;
+part of '../chroot.dart';
abstract class _ChrootFileSystemEntity<T extends FileSystemEntity,
D extends io.FileSystemEntity> extends ForwardingFileSystemEntity<T, D> {
@@ -103,7 +103,7 @@
@override
Future<T> delete({bool recursive = false}) async {
- String path = fileSystem._resolve(this.path,
+ var path = fileSystem._resolve(this.path,
followLinks: false, notFound: _NotFoundBehavior.throwError);
String real(String path) => fileSystem._real(path, resolve: false);
@@ -114,7 +114,7 @@
if (expectedType == FileSystemEntityType.link) {
await fileSystem.delegate.link(real(path)).delete();
} else {
- String resolvedPath = fileSystem._resolve(p.basename(path),
+ var resolvedPath = fileSystem._resolve(p.basename(path),
from: p.dirname(path), notFound: _NotFoundBehavior.allowAtTail);
if (!recursive && await type(resolvedPath) != expectedType) {
throw expectedType == FileSystemEntityType.file
@@ -132,7 +132,7 @@
@override
void deleteSync({bool recursive = false}) {
- String path = fileSystem._resolve(this.path,
+ var path = fileSystem._resolve(this.path,
followLinks: false, notFound: _NotFoundBehavior.throwError);
String real(String path) => fileSystem._real(path, resolve: false);
@@ -143,7 +143,7 @@
if (expectedType == FileSystemEntityType.link) {
fileSystem.delegate.link(real(path)).deleteSync();
} else {
- String resolvedPath = fileSystem._resolve(p.basename(path),
+ var resolvedPath = fileSystem._resolve(p.basename(path),
from: p.dirname(path), notFound: _NotFoundBehavior.allowAtTail);
if (!recursive && type(resolvedPath) != expectedType) {
throw expectedType == FileSystemEntityType.file
diff --git a/pkgs/file/lib/src/backends/chroot/chroot_link.dart b/pkgs/file/lib/src/backends/chroot/chroot_link.dart
index acbeda6..1620df9 100644
--- a/pkgs/file/lib/src/backends/chroot/chroot_link.dart
+++ b/pkgs/file/lib/src/backends/chroot/chroot_link.dart
@@ -2,18 +2,18 @@
// for details. All rights reserved. Use of this source code is governed by a
// BSD-style license that can be found in the LICENSE file.
-part of file.src.backends.chroot;
+part of '../chroot.dart';
class _ChrootLink extends _ChrootFileSystemEntity<Link, io.Link>
with ForwardingLink {
- _ChrootLink(ChrootFileSystem fs, String path) : super(fs, path);
+ _ChrootLink(super.fs, super.path);
factory _ChrootLink.wrapped(
ChrootFileSystem fs,
io.Link delegate, {
bool relative = false,
}) {
- String localPath = fs._local(delegate.path, relative: relative);
+ var localPath = fs._local(delegate.path, relative: relative);
return _ChrootLink(fs, localPath);
}
diff --git a/pkgs/file/lib/src/backends/chroot/chroot_random_access_file.dart b/pkgs/file/lib/src/backends/chroot/chroot_random_access_file.dart
index 4105ac8..10bbd70 100644
--- a/pkgs/file/lib/src/backends/chroot/chroot_random_access_file.dart
+++ b/pkgs/file/lib/src/backends/chroot/chroot_random_access_file.dart
@@ -2,7 +2,7 @@
// for details. All rights reserved. Use of this source code is governed by a
// BSD-style license that can be found in the LICENSE file.
-part of file.src.backends.chroot;
+part of '../chroot.dart';
class _ChrootRandomAccessFile with ForwardingRandomAccessFile {
_ChrootRandomAccessFile(this.path, this.delegate);
diff --git a/pkgs/file/lib/src/backends/local/local_directory.dart b/pkgs/file/lib/src/backends/local/local_directory.dart
index e23e68f..3e1db61 100644
--- a/pkgs/file/lib/src/backends/local/local_directory.dart
+++ b/pkgs/file/lib/src/backends/local/local_directory.dart
@@ -2,10 +2,10 @@
// for details. All rights reserved. Use of this source code is governed by a
// BSD-style license that can be found in the LICENSE file.
-import 'package:file/file.dart';
-import 'package:file/src/common.dart' as common;
-import 'package:file/src/io.dart' as io;
-
+import '../../common.dart' as common;
+import '../../forwarding.dart';
+import '../../interface.dart';
+import '../../io.dart' as io;
import 'local_file_system_entity.dart';
/// [Directory] implementation that forwards all calls to `dart:io`.
@@ -13,7 +13,7 @@
with ForwardingDirectory<LocalDirectory>, common.DirectoryAddOnsMixin {
/// Instantiates a new [LocalDirectory] tied to the specified file system
/// and delegating to the specified [delegate].
- LocalDirectory(FileSystem fs, io.Directory delegate) : super(fs, delegate);
+ LocalDirectory(super.fs, super.delegate);
@override
String toString() => "LocalDirectory: '$path'";
diff --git a/pkgs/file/lib/src/backends/local/local_file.dart b/pkgs/file/lib/src/backends/local/local_file.dart
index 36293ba..a4bc106 100644
--- a/pkgs/file/lib/src/backends/local/local_file.dart
+++ b/pkgs/file/lib/src/backends/local/local_file.dart
@@ -2,9 +2,9 @@
// for details. All rights reserved. Use of this source code is governed by a
// BSD-style license that can be found in the LICENSE file.
-import 'package:file/file.dart';
-import 'package:file/src/io.dart' as io;
-
+import '../../forwarding.dart';
+import '../../interface.dart';
+import '../../io.dart' as io;
import 'local_file_system_entity.dart';
/// [File] implementation that forwards all calls to `dart:io`.
@@ -12,7 +12,7 @@
with ForwardingFile {
/// Instantiates a new [LocalFile] tied to the specified file system
/// and delegating to the specified [delegate].
- LocalFile(FileSystem fs, io.File delegate) : super(fs, delegate);
+ LocalFile(super.fs, super.delegate);
@override
String toString() => "LocalFile: '$path'";
diff --git a/pkgs/file/lib/src/backends/local/local_file_system.dart b/pkgs/file/lib/src/backends/local/local_file_system.dart
index 635998e..7541c37 100644
--- a/pkgs/file/lib/src/backends/local/local_file_system.dart
+++ b/pkgs/file/lib/src/backends/local/local_file_system.dart
@@ -2,10 +2,10 @@
// for details. All rights reserved. Use of this source code is governed by a
// BSD-style license that can be found in the LICENSE file.
-import 'package:file/src/io.dart' as io;
-import 'package:file/file.dart';
import 'package:path/path.dart' as p;
+import '../../interface.dart';
+import '../../io.dart' as io;
import 'local_directory.dart';
import 'local_file.dart';
import 'local_link.dart';
diff --git a/pkgs/file/lib/src/backends/local/local_file_system_entity.dart b/pkgs/file/lib/src/backends/local/local_file_system_entity.dart
index ca4617b..d0da559 100644
--- a/pkgs/file/lib/src/backends/local/local_file_system_entity.dart
+++ b/pkgs/file/lib/src/backends/local/local_file_system_entity.dart
@@ -2,9 +2,9 @@
// for details. All rights reserved. Use of this source code is governed by a
// BSD-style license that can be found in the LICENSE file.
-import 'package:file/file.dart';
-import 'package:file/src/io.dart' as io;
-
+import '../../forwarding.dart';
+import '../../interface.dart';
+import '../../io.dart' as io;
import 'local_directory.dart';
import 'local_file.dart';
import 'local_link.dart';
diff --git a/pkgs/file/lib/src/backends/local/local_link.dart b/pkgs/file/lib/src/backends/local/local_link.dart
index fc67d5e..2ce4791 100644
--- a/pkgs/file/lib/src/backends/local/local_link.dart
+++ b/pkgs/file/lib/src/backends/local/local_link.dart
@@ -2,9 +2,9 @@
// for details. All rights reserved. Use of this source code is governed by a
// BSD-style license that can be found in the LICENSE file.
-import 'package:file/file.dart';
-import 'package:file/src/io.dart' as io;
-
+import '../../forwarding.dart';
+import '../../interface.dart';
+import '../../io.dart' as io;
import 'local_file_system_entity.dart';
/// [Link] implementation that forwards all calls to `dart:io`.
@@ -12,7 +12,7 @@
with ForwardingLink {
/// Instantiates a new [LocalLink] tied to the specified file system
/// and delegating to the specified [delegate].
- LocalLink(FileSystem fs, io.Link delegate) : super(fs, delegate);
+ LocalLink(super.fs, super.delegate);
@override
String toString() => "LocalLink: '$path'";
diff --git a/pkgs/file/lib/src/backends/memory/clock.dart b/pkgs/file/lib/src/backends/memory/clock.dart
index 98d5434..57c1b72 100644
--- a/pkgs/file/lib/src/backends/memory/clock.dart
+++ b/pkgs/file/lib/src/backends/memory/clock.dart
@@ -2,6 +2,8 @@
// for details. All rights reserved. Use of this source code is governed by a
// BSD-style license that can be found in the LICENSE file.
+// ignore_for_file: comment_references
+
/// Interface describing clocks used by the [MemoryFileSystem].
///
/// The [MemoryFileSystem] uses a clock to determine the modification times of
diff --git a/pkgs/file/lib/src/backends/memory/common.dart b/pkgs/file/lib/src/backends/memory/common.dart
index 80e3c38..eb4ca43 100644
--- a/pkgs/file/lib/src/backends/memory/common.dart
+++ b/pkgs/file/lib/src/backends/memory/common.dart
@@ -2,7 +2,7 @@
// for details. All rights reserved. Use of this source code is governed by a
// BSD-style license that can be found in the LICENSE file.
-import 'package:file/src/common.dart' as common;
+import '../../common.dart' as common;
/// Generates a path to use in error messages.
typedef PathGenerator = dynamic Function();
diff --git a/pkgs/file/lib/src/backends/memory/memory_directory.dart b/pkgs/file/lib/src/backends/memory/memory_directory.dart
index 95fe542..e73b967 100644
--- a/pkgs/file/lib/src/backends/memory/memory_directory.dart
+++ b/pkgs/file/lib/src/backends/memory/memory_directory.dart
@@ -2,11 +2,11 @@
// for details. All rights reserved. Use of this source code is governed by a
// BSD-style license that can be found in the LICENSE file.
-import 'package:file/file.dart';
-import 'package:file/src/common.dart' as common;
-import 'package:file/src/io.dart' as io;
import 'package:meta/meta.dart';
+import '../../common.dart' as common;
+import '../../interface.dart';
+import '../../io.dart' as io;
import 'common.dart';
import 'memory_file.dart';
import 'memory_file_system_entity.dart';
@@ -25,8 +25,7 @@
with common.DirectoryAddOnsMixin
implements Directory {
/// Instantiates a new [MemoryDirectory].
- MemoryDirectory(NodeBasedFileSystem fileSystem, String path)
- : super(fileSystem, path);
+ MemoryDirectory(super.fileSystem, super.path);
@override
io.FileSystemEntityType get expectedType => io.FileSystemEntityType.directory;
@@ -52,7 +51,7 @@
@override
void createSync({bool recursive = false}) {
fileSystem.opHandle(path, FileSystemOp.create);
- Node? node = internalCreateSync(
+ var node = internalCreateSync(
followTailLink: true,
visitLinks: true,
createChild: (DirectoryNode parent, bool isFinalSegment) {
@@ -75,19 +74,19 @@
@override
Directory createTempSync([String? prefix]) {
prefix = '${prefix ?? ''}rand';
- String fullPath = fileSystem.path.join(path, prefix);
- String dirname = fileSystem.path.dirname(fullPath);
- String basename = fileSystem.path.basename(fullPath);
- DirectoryNode? node = fileSystem.findNode(dirname) as DirectoryNode?;
+ var fullPath = fileSystem.path.join(path, prefix);
+ var dirname = fileSystem.path.dirname(fullPath);
+ var basename = fileSystem.path.basename(fullPath);
+ var node = fileSystem.findNode(dirname) as DirectoryNode?;
checkExists(node, () => dirname);
utils.checkIsDir(node!, () => dirname);
- int tempCounter = _systemTempCounter[fileSystem] ?? 0;
+ var tempCounter = _systemTempCounter[fileSystem] ?? 0;
String name() => '$basename$tempCounter';
while (node.children.containsKey(name())) {
tempCounter++;
}
_systemTempCounter[fileSystem] = tempCounter;
- DirectoryNode tempDir = DirectoryNode(node);
+ var tempDir = DirectoryNode(node);
node.children[name()] = tempDir;
return MemoryDirectory(fileSystem, fileSystem.path.join(dirname, name()))
..createSync();
@@ -128,9 +127,9 @@
bool recursive = false,
bool followLinks = true,
}) {
- DirectoryNode node = backing as DirectoryNode;
- List<FileSystemEntity> listing = <FileSystemEntity>[];
- List<_PendingListTask> tasks = <_PendingListTask>[
+ var node = backing as DirectoryNode;
+ var listing = <FileSystemEntity>[];
+ var tasks = <_PendingListTask>[
_PendingListTask(
node,
path.endsWith(fileSystem.path.separator)
@@ -140,14 +139,14 @@
),
];
while (tasks.isNotEmpty) {
- _PendingListTask task = tasks.removeLast();
+ var task = tasks.removeLast();
task.dir.children.forEach((String name, Node child) {
- Set<LinkNode> breadcrumbs = Set<LinkNode>.from(task.breadcrumbs);
- String childPath = fileSystem.path.join(task.path, name);
+ var breadcrumbs = Set<LinkNode>.from(task.breadcrumbs);
+ var childPath = fileSystem.path.join(task.path, name);
while (followLinks &&
utils.isLink(child) &&
breadcrumbs.add(child as LinkNode)) {
- Node? referent = child.referentOrNull;
+ var referent = child.referentOrNull;
if (referent != null) {
child = referent;
}
diff --git a/pkgs/file/lib/src/backends/memory/memory_file.dart b/pkgs/file/lib/src/backends/memory/memory_file.dart
index ba4faab..1a8f5f9 100644
--- a/pkgs/file/lib/src/backends/memory/memory_file.dart
+++ b/pkgs/file/lib/src/backends/memory/memory_file.dart
@@ -7,26 +7,25 @@
import 'dart:math' as math show min;
import 'dart:typed_data';
-import 'package:file/file.dart';
-import 'package:file/src/backends/memory/operations.dart';
-import 'package:file/src/common.dart' as common;
-import 'package:file/src/io.dart' as io;
import 'package:meta/meta.dart';
+import '../../common.dart' as common;
+import '../../interface.dart';
+import '../../io.dart' as io;
import 'common.dart';
import 'memory_file_system_entity.dart';
import 'memory_random_access_file.dart';
import 'node.dart';
+import 'operations.dart';
import 'utils.dart' as utils;
/// Internal implementation of [File].
class MemoryFile extends MemoryFileSystemEntity implements File {
/// Instantiates a new [MemoryFile].
- const MemoryFile(NodeBasedFileSystem fileSystem, String path)
- : super(fileSystem, path);
+ const MemoryFile(super.fileSystem, super.path);
FileNode get _resolvedBackingOrCreate {
- Node? node = backingOrNull;
+ var node = backingOrNull;
if (node == null) {
node = _doCreate();
} else {
@@ -61,7 +60,7 @@
}
Node? _doCreate({bool recursive = false}) {
- Node? node = internalCreateSync(
+ var node = internalCreateSync(
followTailLink: true,
createChild: (DirectoryNode parent, bool isFinalSegment) {
if (isFinalSegment) {
@@ -88,7 +87,7 @@
newPath,
followTailLink: true,
checkType: (Node node) {
- FileSystemEntityType actualType = node.stat.type;
+ var actualType = node.stat.type;
if (actualType != expectedType) {
throw actualType == FileSystemEntityType.notFound
? common.noSuchFileOrDirectory(path)
@@ -103,7 +102,7 @@
@override
File copySync(String newPath) {
fileSystem.opHandle(path, FileSystemOp.copy);
- FileNode sourceNode = resolvedBacking as FileNode;
+ var sourceNode = resolvedBacking as FileNode;
fileSystem.findNode(
newPath,
segmentVisitor: (
@@ -116,7 +115,7 @@
if (currentSegment == finalSegment) {
if (child != null) {
if (utils.isLink(child)) {
- List<String> ledger = <String>[];
+ var ledger = <String>[];
child = utils.resolveLinks(child as LinkNode, () => newPath,
ledger: ledger);
checkExists(child, () => newPath);
@@ -127,7 +126,7 @@
utils.checkType(expectedType, child.type, () => newPath);
parent.children.remove(childName);
}
- FileNode newNode = FileNode(parent);
+ var newNode = FileNode(parent);
newNode.copyFrom(sourceNode);
parent.children[childName] = newNode;
}
@@ -158,7 +157,7 @@
@override
void setLastAccessedSync(DateTime time) {
- FileNode node = resolvedBacking as FileNode;
+ var node = resolvedBacking as FileNode;
node.accessed = time.millisecondsSinceEpoch;
}
@@ -174,7 +173,7 @@
@override
void setLastModifiedSync(DateTime time) {
- FileNode node = resolvedBacking as FileNode;
+ var node = resolvedBacking as FileNode;
node.modified = time.millisecondsSinceEpoch;
}
@@ -199,8 +198,8 @@
Stream<List<int>> openRead([int? start, int? end]) {
fileSystem.opHandle(path, FileSystemOp.open);
try {
- FileNode node = resolvedBacking as FileNode;
- Uint8List content = node.content;
+ var node = resolvedBacking as FileNode;
+ var content = node.content;
if (start != null) {
content = end == null
? content.sublist(start)
@@ -253,13 +252,13 @@
@override
List<String> readAsLinesSync({Encoding encoding = utf8}) {
- String str = readAsStringSync(encoding: encoding);
+ var str = readAsStringSync(encoding: encoding);
if (str.isEmpty) {
return <String>[];
}
- final List<String> lines = str.split('\n');
+ final lines = str.split('\n');
if (str.endsWith('\n')) {
// A final newline should not create an additional line.
lines.removeLast();
@@ -287,7 +286,7 @@
if (!utils.isWriteMode(mode)) {
throw common.badFileDescriptor(path);
}
- FileNode node = _resolvedBackingOrCreate;
+ var node = _resolvedBackingOrCreate;
_truncateIfNecessary(node, mode);
fileSystem.opHandle(path, FileSystemOp.write);
node.write(bytes);
@@ -349,7 +348,7 @@
deferredException = e;
}
- Future<FileNode> future = Future<FileNode>.microtask(() {
+ var future = Future<FileNode>.microtask(() {
if (deferredException != null) {
throw deferredException;
}
@@ -387,7 +386,7 @@
@override
void writeAll(Iterable<dynamic> objects, [String separator = '']) {
- bool firstIter = true;
+ var firstIter = true;
for (dynamic obj in objects) {
if (!firstIter) {
write(separator);
@@ -418,7 +417,7 @@
_streamCompleter = Completer<void>();
stream.listen(
- (List<int> data) => _addData(data),
+ _addData,
cancelOnError: true,
onError: (Object error, StackTrace stackTrace) {
_streamCompleter!.completeError(error, stackTrace);
@@ -445,8 +444,7 @@
_isClosed = true;
_pendingWrites.then(
(_) => _completer.complete(),
- onError: (Object error, StackTrace stackTrace) =>
- _completer.completeError(error, stackTrace),
+ onError: _completer.completeError,
);
}
return _completer.future;
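
Two of the hunks above replace single-use closures (`(List<int> data) => _addData(data)` and the `completeError` wrapper) with tear-offs of the underlying methods. A standalone illustration of the same refactor, with a hypothetical `logChunk` standing in for `_addData`:

```dart
void logChunk(List<int> chunk) => print('received ${chunk.length} bytes');

Future<void> main() async {
  final stream = Stream<List<int>>.fromIterable([
    [1, 2, 3],
    [4, 5],
  ]);
  // Passing the function itself (a tear-off) behaves exactly like the
  // closure form `(chunk) => logChunk(chunk)` that it replaces.
  await stream.forEach(logChunk);
}
```
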
diff --git a/pkgs/file/lib/src/backends/memory/memory_file_stat.dart b/pkgs/file/lib/src/backends/memory/memory_file_stat.dart
index 94f86d1..ce6beda 100644
--- a/pkgs/file/lib/src/backends/memory/memory_file_stat.dart
+++ b/pkgs/file/lib/src/backends/memory/memory_file_stat.dart
@@ -2,7 +2,7 @@
// for details. All rights reserved. Use of this source code is governed by a
// BSD-style license that can be found in the LICENSE file.
-import 'package:file/src/io.dart' as io;
+import '../../io.dart' as io;
/// Internal implementation of [io.FileStat].
class MemoryFileStat implements io.FileStat {
@@ -47,8 +47,8 @@
@override
String modeString() {
- int permissions = mode & 0xFFF;
- List<String> codes = const <String>[
+ var permissions = mode & 0xFFF;
+ var codes = const <String>[
'---',
'--x',
'-w-',
@@ -58,7 +58,7 @@
'rw-',
'rwx',
];
- List<String> result = <String>[];
+ var result = <String>[];
result
..add(codes[(permissions >> 6) & 0x7])
..add(codes[(permissions >> 3) & 0x7])
diff --git a/pkgs/file/lib/src/backends/memory/memory_file_system.dart b/pkgs/file/lib/src/backends/memory/memory_file_system.dart
index f3cdaee..dd359f0 100644
--- a/pkgs/file/lib/src/backends/memory/memory_file_system.dart
+++ b/pkgs/file/lib/src/backends/memory/memory_file_system.dart
@@ -2,11 +2,10 @@
// for details. All rights reserved. Use of this source code is governed by a
// BSD-style license that can be found in the LICENSE file.
-import 'package:file/file.dart';
-import 'package:file/src/backends/memory/operations.dart';
-import 'package:file/src/io.dart' as io;
import 'package:path/path.dart' as p;
+import '../../interface.dart';
+import '../../io.dart' as io;
import 'clock.dart';
import 'common.dart';
import 'memory_directory.dart';
@@ -14,6 +13,7 @@
import 'memory_file_stat.dart';
import 'memory_link.dart';
import 'node.dart';
+import 'operations.dart';
import 'style.dart';
import 'utils.dart' as utils;
@@ -91,7 +91,7 @@
p.Context _context;
@override
- final Function(String context, FileSystemOp operation) opHandle;
+ final void Function(String context, FileSystemOp operation) opHandle;
@override
final Clock clock;
@@ -141,7 +141,7 @@
}
value = directory(value).resolveSymbolicLinksSync();
- Node? node = findNode(value);
+ var node = findNode(value);
checkExists(node, () => value);
utils.checkIsDir(node!, () => value);
assert(_context.isAbsolute(value));
@@ -166,9 +166,9 @@
@override
bool identicalSync(String path1, String path2) {
- Node? node1 = findNode(path1);
+ var node1 = findNode(path1);
checkExists(node1, () => path1);
- Node? node2 = findNode(path2);
+ var node2 = findNode(path2);
checkExists(node2, () => path2);
return node1 != null && node1 == node2;
}
@@ -220,14 +220,13 @@
reference ??= _current;
}
- List<String> parts = path.split(style.separator)
- ..removeWhere(utils.isEmpty);
- DirectoryNode? directory = reference?.directory;
+ var parts = path.split(style.separator)..removeWhere(utils.isEmpty);
+ var directory = reference?.directory;
Node? child = directory;
- int finalSegment = parts.length - 1;
- for (int i = 0; i <= finalSegment; i++) {
- String basename = parts[i];
+ var finalSegment = parts.length - 1;
+ for (var i = 0; i <= finalSegment; i++) {
+ var basename = parts[i];
assert(basename.isNotEmpty);
switch (basename) {
diff --git a/pkgs/file/lib/src/backends/memory/memory_file_system_entity.dart b/pkgs/file/lib/src/backends/memory/memory_file_system_entity.dart
index ad987d7..1990abc 100644
--- a/pkgs/file/lib/src/backends/memory/memory_file_system_entity.dart
+++ b/pkgs/file/lib/src/backends/memory/memory_file_system_entity.dart
@@ -2,11 +2,11 @@
// for details. All rights reserved. Use of this source code is governed by a
// BSD-style license that can be found in the LICENSE file.
-import 'package:file/file.dart';
-import 'package:file/src/common.dart' as common;
-import 'package:file/src/io.dart' as io;
import 'package:meta/meta.dart';
+import '../../common.dart' as common;
+import '../../interface.dart';
+import '../../io.dart' as io;
import 'common.dart';
import 'memory_directory.dart';
import 'node.dart';
@@ -60,7 +60,7 @@
/// The type of the node is not guaranteed to match [expectedType].
@protected
Node get backing {
- Node? node = fileSystem.findNode(path);
+ var node = fileSystem.findNode(path);
checkExists(node, () => path);
return node!;
}
@@ -71,7 +71,7 @@
/// doesn't match, this will throw a [io.FileSystemException].
@protected
Node get resolvedBacking {
- Node node = backing;
+ var node = backing;
node = utils.isLink(node)
? utils.resolveLinks(node as LinkNode, () => path)
: node;
@@ -107,14 +107,14 @@
if (path.isEmpty) {
throw common.noSuchFileOrDirectory(path);
}
- List<String> ledger = <String>[];
+ var ledger = <String>[];
if (isAbsolute) {
ledger.add(fileSystem.style.drive);
}
- Node? node = fileSystem.findNode(path,
+ var node = fileSystem.findNode(path,
pathWithSymlinks: ledger, followTailLink: true);
checkExists(node, () => path);
- String resolved = ledger.join(fileSystem.path.separator);
+ var resolved = ledger.join(fileSystem.path.separator);
if (resolved == fileSystem.style.drive) {
resolved = fileSystem.style.root;
} else if (!fileSystem.path.isAbsolute(resolved)) {
@@ -151,7 +151,7 @@
@override
FileSystemEntity get absolute {
- String absolutePath = path;
+ var absolutePath = path;
if (!fileSystem.path.isAbsolute(absolutePath)) {
absolutePath = fileSystem.path.join(fileSystem.cwd, absolutePath);
}
@@ -242,7 +242,7 @@
bool followTailLink = false,
utils.TypeChecker? checkType,
}) {
- Node node = backing;
+ var node = backing;
(checkType ?? defaultCheckType)(node);
fileSystem.findNode(
newPath,
@@ -256,7 +256,7 @@
if (currentSegment == finalSegment) {
if (child != null) {
if (followTailLink) {
- FileSystemEntityType childType = child.stat.type;
+ var childType = child.stat.type;
if (childType != FileSystemEntityType.notFound) {
utils.checkType(expectedType, child.stat.type, () => newPath);
}
@@ -289,7 +289,7 @@
utils.TypeChecker? checkType,
}) {
fileSystem.opHandle(path, FileSystemOp.delete);
- Node node = backing;
+ var node = backing;
if (!recursive) {
if (node is DirectoryNode && node.children.isNotEmpty) {
throw common.directoryNotEmpty(path);
diff --git a/pkgs/file/lib/src/backends/memory/memory_link.dart b/pkgs/file/lib/src/backends/memory/memory_link.dart
index 7d5afb4..a599fe8 100644
--- a/pkgs/file/lib/src/backends/memory/memory_link.dart
+++ b/pkgs/file/lib/src/backends/memory/memory_link.dart
@@ -2,11 +2,11 @@
// for details. All rights reserved. Use of this source code is governed by a
// BSD-style license that can be found in the LICENSE file.
-import 'package:file/file.dart';
-import 'package:file/src/common.dart' as common;
-import 'package:file/src/io.dart' as io;
import 'package:meta/meta.dart';
+import '../../common.dart' as common;
+import '../../interface.dart';
+import '../../io.dart' as io;
import 'memory_file_system_entity.dart';
import 'node.dart';
import 'operations.dart';
@@ -15,8 +15,7 @@
/// Internal implementation of [Link].
class MemoryLink extends MemoryFileSystemEntity implements Link {
/// Instantiates a new [MemoryLink].
- const MemoryLink(NodeBasedFileSystem fileSystem, String path)
- : super(fileSystem, path);
+ const MemoryLink(super.fileSystem, super.path);
@override
io.FileSystemEntityType get expectedType => io.FileSystemEntityType.link;
@@ -50,7 +49,7 @@
@override
void createSync(String target, {bool recursive = false}) {
- bool preexisting = true;
+ var preexisting = true;
fileSystem.opHandle(path, FileSystemOp.create);
internalCreateSync(
createChild: (DirectoryNode parent, bool isFinalSegment) {
@@ -76,7 +75,7 @@
@override
void updateSync(String target) {
- Node node = backing;
+ var node = backing;
utils.checkType(expectedType, node.type, () => path);
(node as LinkNode).target = target;
}
@@ -93,7 +92,7 @@
@override
String targetSync() {
- Node node = backing;
+ var node = backing;
if (node.type != expectedType) {
// Note: this may change; https://github.com/dart-lang/sdk/issues/28204
throw common.noSuchFileOrDirectory(path);
diff --git a/pkgs/file/lib/src/backends/memory/memory_random_access_file.dart b/pkgs/file/lib/src/backends/memory/memory_random_access_file.dart
index d4fe73d..190f0a1 100644
--- a/pkgs/file/lib/src/backends/memory/memory_random_access_file.dart
+++ b/pkgs/file/lib/src/backends/memory/memory_random_access_file.dart
@@ -6,10 +6,11 @@
import 'dart:math' as math show min;
import 'dart:typed_data';
-import 'package:file/src/common.dart' as common;
-import 'package:file/src/io.dart' as io;
-
+import '../../common.dart' as common;
+import '../../io.dart' as io;
+import '../memory.dart' show MemoryFileSystem;
import 'memory_file.dart';
+import 'memory_file_system.dart' show MemoryFileSystem;
import 'node.dart';
import 'utils.dart' as utils;
@@ -106,8 +107,8 @@
/// Wraps a synchronous function to make it appear asynchronous.
///
/// [_asyncOperationPending], [_checkAsync], and [_asyncWrapper] are used to
- /// mimic [RandomAccessFile]'s enforcement that only one asynchronous
- /// operation is pending for a [RandomAccessFile] instance. Since
+ /// mimic [io.RandomAccessFile]'s enforcement that only one asynchronous
+ /// operation is pending for a [io.RandomAccessFile] instance. Since
/// [MemoryFileSystem]-based classes are likely to be used in tests, fidelity
/// is important to catch errors that might occur in production.
///
@@ -211,7 +212,7 @@
_checkReadable('read');
// TODO(jamesderlin): Check for integer overflow.
final int end = math.min(_position + bytes, lengthSync());
- final Uint8List copy = _node.content.sublist(_position, end);
+ final copy = _node.content.sublist(_position, end);
_position = end;
return copy;
}
@@ -243,7 +244,7 @@
end = RangeError.checkValidRange(start, end, buffer.length);
- final int length = lengthSync();
+ final length = lengthSync();
int i;
for (i = start; i < end && _position < length; i += 1, _position += 1) {
buffer[i] = _node.content[_position];
@@ -288,7 +289,7 @@
'truncate failed', path, common.invalidArgument(path).osError);
}
- final int oldLength = lengthSync();
+ final oldLength = lengthSync();
if (length < oldLength) {
_node.truncate(length);
@@ -329,7 +330,7 @@
// [Uint8List] will truncate values to 8-bits automatically, so we don't
// need to check [value].
- int length = lengthSync();
+ var length = lengthSync();
if (_position >= length) {
// If [_position] is out of bounds, [RandomAccessFile] zero-fills the
// file.
@@ -363,8 +364,8 @@
end = RangeError.checkValidRange(start, end, buffer.length);
- final int writeByteCount = end - start;
- final int endPosition = _position + writeByteCount;
+ final writeByteCount = end - start;
+ final endPosition = _position + writeByteCount;
if (endPosition > lengthSync()) {
truncateSync(endPosition);
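
The doc comment touched up above explains why `MemoryRandomAccessFile` enforces that only one asynchronous operation is in flight at a time: `dart:io` does the same, so tests written against the in-memory implementation should hit the same failure mode as production code. A rough sketch of the behavior being mimicked; the exact exception type raised by `dart:io` is not shown in this diff, so the catch below is deliberately broad:

```dart
import 'dart:io';

Future<void> main() async {
  final raf = await File('example.txt').open(mode: FileMode.write);
  final pending = raf.writeString('hello');
  try {
    // Issuing a second asynchronous call while `pending` is still running
    // is rejected rather than queued.
    await raf.writeString('world');
  } catch (e) {
    print('concurrent operation rejected: $e');
  }
  await pending;
  await raf.close();
}
```
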
diff --git a/pkgs/file/lib/src/backends/memory/node.dart b/pkgs/file/lib/src/backends/memory/node.dart
index ae4d3f7..eea72b5 100644
--- a/pkgs/file/lib/src/backends/memory/node.dart
+++ b/pkgs/file/lib/src/backends/memory/node.dart
@@ -4,13 +4,12 @@
import 'dart:typed_data';
-import 'package:file/file.dart';
-import 'package:file/src/backends/memory/operations.dart';
-import 'package:file/src/io.dart' as io;
-
+import '../../interface.dart';
+import '../../io.dart' as io;
import 'clock.dart';
import 'common.dart';
import 'memory_file_stat.dart';
+import 'operations.dart';
import 'style.dart';
/// Visitor callback for use with [NodeBasedFileSystem.findNode].
@@ -115,7 +114,7 @@
/// Reparents this node to live in the specified directory.
set parent(DirectoryNode parent) {
- DirectoryNode ancestor = parent;
+ var ancestor = parent;
while (!ancestor.isRoot) {
if (ancestor == this) {
throw const io.FileSystemException(
@@ -149,8 +148,8 @@
/// you call [stat] on them).
abstract class RealNode extends Node {
/// Constructs a new [RealNode] as a child of the specified [parent].
- RealNode(DirectoryNode? parent) : super(parent) {
- int now = clock.now.millisecondsSinceEpoch;
+ RealNode(super.parent) {
+ var now = clock.now.millisecondsSinceEpoch;
changed = now;
modified = now;
accessed = now;
@@ -195,7 +194,7 @@
/// Class that represents the backing for an in-memory directory.
class DirectoryNode extends RealNode {
/// Constructs a new [DirectoryNode] as a child of the specified [parent].
- DirectoryNode(DirectoryNode? parent) : super(parent);
+ DirectoryNode(super.parent);
/// Child nodes, indexed by their basename.
final Map<String, Node> children = <String, Node>{};
@@ -237,7 +236,7 @@
/// Class that represents the backing for an in-memory regular file.
class FileNode extends RealNode {
/// Constructs a new [FileNode] as a child of the specified [parent].
- FileNode(DirectoryNode parent) : super(parent);
+ FileNode(DirectoryNode super.parent);
/// File contents in bytes.
Uint8List get content => _content;
@@ -251,7 +250,7 @@
/// Appends the specified bytes to the end of this node's [content].
void write(List<int> bytes) {
- Uint8List existing = _content;
+ var existing = _content;
_content = Uint8List(existing.length + bytes.length);
_content.setRange(0, existing.length, existing);
_content.setRange(existing.length, _content.length, bytes);
@@ -286,9 +285,7 @@
class LinkNode extends Node {
/// Constructs a new [LinkNode] as a child of the specified [parent] and
/// linking to the specified [target] path.
- LinkNode(DirectoryNode parent, this.target)
- : assert(target.isNotEmpty),
- super(parent);
+ LinkNode(DirectoryNode super.parent, this.target) : assert(target.isNotEmpty);
/// The path to which this link points.
String target;
@@ -309,7 +306,7 @@
Node? Function(DirectoryNode parent, String childName, Node? child)?
tailVisitor,
}) {
- Node? referent = fs.findNode(
+ var referent = fs.findNode(
target,
reference: this,
segmentVisitor: (
@@ -349,7 +346,7 @@
}
_reentrant = true;
try {
- Node? node = referentOrNull;
+ var node = referentOrNull;
return node == null ? MemoryFileStat.notFound : node.stat;
} finally {
_reentrant = false;
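
Several constructors in this file (`RealNode(super.parent)`, `FileNode(DirectoryNode super.parent)`, `LinkNode(DirectoryNode super.parent, this.target)`) are converted to Dart 2.17 super-initializer parameters. A simplified sketch with stand-in classes, not the package's real node types:

```dart
class Node {
  Node(this.parent);
  final Node? parent;
}

class RealNode extends Node {
  // Equivalent to the old form: RealNode(Node? parent) : super(parent);
  RealNode(super.parent);
}

class FileNode extends RealNode {
  // A typed super parameter narrows what callers may pass, mirroring the
  // old `FileNode(DirectoryNode parent) : super(parent);` pattern.
  FileNode(Node super.parent);
}

void main() {
  final file = FileNode(RealNode(Node(null)));
  print(file.parent); // Instance of 'RealNode'
}
```
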
diff --git a/pkgs/file/lib/src/backends/memory/operations.dart b/pkgs/file/lib/src/backends/memory/operations.dart
index 9fc7462..57d118b 100644
--- a/pkgs/file/lib/src/backends/memory/operations.dart
+++ b/pkgs/file/lib/src/backends/memory/operations.dart
@@ -2,6 +2,8 @@
// for details. All rights reserved. Use of this source code is governed by a
// BSD-style license that can be found in the LICENSE file.
+// ignore_for_file: comment_references

+
/// A file system operation used by the [MemoryFileSystem] to allow
/// tests to insert errors for certain operations.
///
@@ -64,23 +66,15 @@
@override
String toString() {
- switch (_value) {
- case 0:
- return 'FileSystemOp.read';
- case 1:
- return 'FileSystemOp.write';
- case 2:
- return 'FileSystemOp.delete';
- case 3:
- return 'FileSystemOp.create';
- case 4:
- return 'FileSystemOp.open';
- case 5:
- return 'FileSystemOp.copy';
- case 6:
- return 'FileSystemOp.exists';
- default:
- throw StateError('Invalid FileSytemOp type: $this');
- }
+ return switch (_value) {
+ 0 => 'FileSystemOp.read',
+ 1 => 'FileSystemOp.write',
+ 2 => 'FileSystemOp.delete',
+ 3 => 'FileSystemOp.create',
+ 4 => 'FileSystemOp.open',
+ 5 => 'FileSystemOp.copy',
+ 6 => 'FileSystemOp.exists',
+      _ => throw StateError('Invalid FileSystemOp type: $this')
+ };
}
}
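
The `toString()` rewrite above is a mechanical conversion from a switch statement to a Dart 3 switch expression: each `case`/`return` pair becomes an arm, and `default:` becomes the `_` wildcard. The same shape in isolation:

```dart
String describe(int value) => switch (value) {
      0 => 'read',
      1 => 'write',
      2 => 'delete',
      // The wildcard arm plays the role of the old `default:` branch.
      _ => throw StateError('unknown value: $value'),
    };

void main() {
  print(describe(1)); // write
}
```
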
diff --git a/pkgs/file/lib/src/backends/memory/style.dart b/pkgs/file/lib/src/backends/memory/style.dart
index 701c9d0..f4bd33f 100644
--- a/pkgs/file/lib/src/backends/memory/style.dart
+++ b/pkgs/file/lib/src/backends/memory/style.dart
@@ -2,9 +2,10 @@
// for details. All rights reserved. Use of this source code is governed by a
// BSD-style license that can be found in the LICENSE file.
-import 'package:file/file.dart';
import 'package:path/path.dart' as p;
+import '../../interface.dart';
+
/// Class that represents the path style that a memory file system should
/// adopt.
///
diff --git a/pkgs/file/lib/src/backends/memory/utils.dart b/pkgs/file/lib/src/backends/memory/utils.dart
index eec9980..aa24cfb 100644
--- a/pkgs/file/lib/src/backends/memory/utils.dart
+++ b/pkgs/file/lib/src/backends/memory/utils.dart
@@ -2,20 +2,19 @@
// for details. All rights reserved. Use of this source code is governed by a
// BSD-style license that can be found in the LICENSE file.
-import 'package:file/file.dart';
-import 'package:file/src/common.dart' as common;
-import 'package:file/src/io.dart' as io;
-
+import '../../common.dart' as common;
+import '../../interface.dart';
+import '../../io.dart' as io;
import 'common.dart';
import 'node.dart';
-/// Checks if `node.type` returns [io.FileSystemEntityType.FILE].
+/// Checks if `node.type` returns [io.FileSystemEntityType.file].
bool isFile(Node? node) => node?.type == io.FileSystemEntityType.file;
-/// Checks if `node.type` returns [io.FileSystemEntityType.DIRECTORY].
+/// Checks if `node.type` returns [io.FileSystemEntityType.directory].
bool isDirectory(Node? node) => node?.type == io.FileSystemEntityType.directory;
-/// Checks if `node.type` returns [io.FileSystemEntityType.LINK].
+/// Checks if `node.type` returns [io.FileSystemEntityType.link].
bool isLink(Node? node) => node?.type == io.FileSystemEntityType.link;
/// Validator function that is expected to throw a [FileSystemException] if
@@ -86,7 +85,7 @@
tailVisitor,
}) {
// Record a breadcrumb trail to guard against symlink loops.
- Set<LinkNode> breadcrumbs = <LinkNode>{};
+ var breadcrumbs = <LinkNode>{};
Node node = link;
while (isLink(node)) {
diff --git a/pkgs/file/lib/src/forwarding/forwarding_directory.dart b/pkgs/file/lib/src/forwarding/forwarding_directory.dart
index dba0c8e..ad1c548 100644
--- a/pkgs/file/lib/src/forwarding/forwarding_directory.dart
+++ b/pkgs/file/lib/src/forwarding/forwarding_directory.dart
@@ -2,8 +2,9 @@
// for details. All rights reserved. Use of this source code is governed by a
// BSD-style license that can be found in the LICENSE file.
-import 'package:file/src/io.dart' as io;
-import 'package:file/file.dart';
+import '../forwarding.dart';
+import '../interface.dart';
+import '../io.dart' as io;
/// A directory that forwards all methods and properties to a delegate.
mixin ForwardingDirectory<T extends Directory>
diff --git a/pkgs/file/lib/src/forwarding/forwarding_file.dart b/pkgs/file/lib/src/forwarding/forwarding_file.dart
index 49c211d..d6cfe3b 100644
--- a/pkgs/file/lib/src/forwarding/forwarding_file.dart
+++ b/pkgs/file/lib/src/forwarding/forwarding_file.dart
@@ -5,8 +5,9 @@
import 'dart:convert';
import 'dart:typed_data';
-import 'package:file/src/io.dart' as io;
-import 'package:file/file.dart';
+import '../forwarding.dart';
+import '../interface.dart';
+import '../io.dart' as io;
/// A file that forwards all methods and properties to a delegate.
mixin ForwardingFile
diff --git a/pkgs/file/lib/src/forwarding/forwarding_file_system.dart b/pkgs/file/lib/src/forwarding/forwarding_file_system.dart
index d864db9..885fdb6 100644
--- a/pkgs/file/lib/src/forwarding/forwarding_file_system.dart
+++ b/pkgs/file/lib/src/forwarding/forwarding_file_system.dart
@@ -2,11 +2,12 @@
// for details. All rights reserved. Use of this source code is governed by a
// BSD-style license that can be found in the LICENSE file.
-import 'package:file/src/io.dart' as io;
-import 'package:file/file.dart';
import 'package:meta/meta.dart';
import 'package:path/path.dart' as p;
+import '../interface.dart';
+import '../io.dart' as io;
+
/// A file system that forwards all methods and properties to a delegate.
abstract class ForwardingFileSystem extends FileSystem {
/// Creates a new [ForwardingFileSystem] that forwards all methods and
diff --git a/pkgs/file/lib/src/forwarding/forwarding_file_system_entity.dart b/pkgs/file/lib/src/forwarding/forwarding_file_system_entity.dart
index 3c41b39..1c0628e 100644
--- a/pkgs/file/lib/src/forwarding/forwarding_file_system_entity.dart
+++ b/pkgs/file/lib/src/forwarding/forwarding_file_system_entity.dart
@@ -2,10 +2,11 @@
// for details. All rights reserved. Use of this source code is governed by a
// BSD-style license that can be found in the LICENSE file.
-import 'package:file/src/io.dart' as io;
-import 'package:file/file.dart';
import 'package:meta/meta.dart';
+import '../interface.dart';
+import '../io.dart' as io;
+
/// A file system entity that forwards all methods and properties to a delegate.
abstract class ForwardingFileSystemEntity<T extends FileSystemEntity,
D extends io.FileSystemEntity> implements FileSystemEntity {
diff --git a/pkgs/file/lib/src/forwarding/forwarding_link.dart b/pkgs/file/lib/src/forwarding/forwarding_link.dart
index 7a60ecb..915e710 100644
--- a/pkgs/file/lib/src/forwarding/forwarding_link.dart
+++ b/pkgs/file/lib/src/forwarding/forwarding_link.dart
@@ -2,8 +2,9 @@
// for details. All rights reserved. Use of this source code is governed by a
// BSD-style license that can be found in the LICENSE file.
-import 'package:file/src/io.dart' as io;
-import 'package:file/file.dart';
+import '../forwarding.dart';
+import '../interface.dart';
+import '../io.dart' as io;
/// A link that forwards all methods and properties to a delegate.
mixin ForwardingLink
diff --git a/pkgs/file/lib/src/forwarding/forwarding_random_access_file.dart b/pkgs/file/lib/src/forwarding/forwarding_random_access_file.dart
index 9dd4079..3847b91 100644
--- a/pkgs/file/lib/src/forwarding/forwarding_random_access_file.dart
+++ b/pkgs/file/lib/src/forwarding/forwarding_random_access_file.dart
@@ -5,11 +5,12 @@
import 'dart:convert';
import 'dart:typed_data';
-import 'package:file/src/io.dart' as io;
import 'package:meta/meta.dart';
-/// A [RandomAccessFile] implementation that forwards all methods and properties
-/// to a delegate.
+import '../io.dart' as io;
+
+/// A [io.RandomAccessFile] implementation that forwards all methods and
+/// properties to a delegate.
mixin ForwardingRandomAccessFile implements io.RandomAccessFile {
/// The entity to which this entity will forward all methods and properties.
@protected
diff --git a/pkgs/file/lib/src/interface.dart b/pkgs/file/lib/src/interface.dart
index 4662e35..d9b7ed5 100644
--- a/pkgs/file/lib/src/interface.dart
+++ b/pkgs/file/lib/src/interface.dart
@@ -2,8 +2,6 @@
// for details. All rights reserved. Use of this source code is governed by a
// BSD-style license that can be found in the LICENSE file.
-library file.src.interface;
-
export 'interface/directory.dart';
export 'interface/error_codes.dart';
export 'interface/file.dart';
diff --git a/pkgs/file/lib/src/interface/error_codes.dart b/pkgs/file/lib/src/interface/error_codes.dart
index 8943538..4836b56 100644
--- a/pkgs/file/lib/src/interface/error_codes.dart
+++ b/pkgs/file/lib/src/interface/error_codes.dart
@@ -168,7 +168,7 @@
static int get EXDEV => _platform((_Codes codes) => codes.exdev);
static int _platform(int Function(_Codes codes) getCode) {
- _Codes codes = (_platforms[operatingSystem] ?? _platforms['linux'])!;
+ var codes = (_platforms[operatingSystem] ?? _platforms['linux'])!;
return getCode(codes);
}
}
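
`ErrorCodes` maps symbolic errno names (ENOENT, EEXIST, ...) to the numeric value used on the host platform, which is what the `expectFileSystemException(ErrorCodes.ENOENT, ...)` assertions later in this diff compare against. A small sketch of how that surfaces to a user of the in-memory file system, assuming a missing path reports ENOENT as those tests expect:

```dart
import 'package:file/file.dart';
import 'package:file/memory.dart';

void main() {
  final fs = MemoryFileSystem();
  try {
    fs.file('/does-not-exist').lengthSync();
  } on FileSystemException catch (e) {
    // ErrorCodes.ENOENT resolves to the current platform's errno value.
    print(e.osError?.errorCode == ErrorCodes.ENOENT); // true
  }
}
```
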
diff --git a/pkgs/file/lib/src/interface/file_system.dart b/pkgs/file/lib/src/interface/file_system.dart
index ecc01a8..2d4e4aa 100644
--- a/pkgs/file/lib/src/interface/file_system.dart
+++ b/pkgs/file/lib/src/interface/file_system.dart
@@ -6,7 +6,6 @@
import 'package:path/path.dart' as p;
import '../io.dart' as io;
-
import 'directory.dart';
import 'file.dart';
import 'file_system_entity.dart';
@@ -99,9 +98,9 @@
bool get isWatchSupported;
/// Finds the type of file system object that a [path] points to. Returns
- /// a Future<FileSystemEntityType> that completes with the result.
+ /// a `Future<FileSystemEntityType>` that completes with the result.
///
- /// [io.FileSystemEntityType.LINK] will only be returned if [followLinks] is
+ /// [io.FileSystemEntityType.link] will only be returned if [followLinks] is
/// `false`, and [path] points to a link
///
/// If the [path] does not point to a file system object or an error occurs
@@ -111,37 +110,38 @@
  /// Synchronously finds the type of file system object that a [path] points
/// to. Returns a [io.FileSystemEntityType].
///
- /// [io.FileSystemEntityType.LINK] will only be returned if [followLinks] is
+ /// [io.FileSystemEntityType.link] will only be returned if [followLinks] is
/// `false`, and [path] points to a link
///
/// If the [path] does not point to a file system object or an error occurs
/// then [io.FileSystemEntityType.notFound] is returned.
io.FileSystemEntityType typeSync(String path, {bool followLinks = true});
- /// Checks if [`type(path)`](type) returns [io.FileSystemEntityType.FILE].
+ /// Checks if [`type(path)`](type) returns [io.FileSystemEntityType.file].
Future<bool> isFile(String path) async =>
await type(path) == io.FileSystemEntityType.file;
/// Synchronously checks if [`type(path)`](type) returns
- /// [io.FileSystemEntityType.FILE].
+ /// [io.FileSystemEntityType.file].
bool isFileSync(String path) =>
typeSync(path) == io.FileSystemEntityType.file;
- /// Checks if [`type(path)`](type) returns [io.FileSystemEntityType.DIRECTORY].
+ /// Checks if [`type(path)`](type) returns
+ /// [io.FileSystemEntityType.directory].
Future<bool> isDirectory(String path) async =>
await type(path) == io.FileSystemEntityType.directory;
/// Synchronously checks if [`type(path)`](type) returns
- /// [io.FileSystemEntityType.DIRECTORY].
+ /// [io.FileSystemEntityType.directory].
bool isDirectorySync(String path) =>
typeSync(path) == io.FileSystemEntityType.directory;
- /// Checks if [`type(path)`](type) returns [io.FileSystemEntityType.LINK].
+ /// Checks if [`type(path)`](type) returns [io.FileSystemEntityType.link].
Future<bool> isLink(String path) async =>
await type(path, followLinks: false) == io.FileSystemEntityType.link;
/// Synchronously checks if [`type(path)`](type) returns
- /// [io.FileSystemEntityType.LINK].
+ /// [io.FileSystemEntityType.link].
bool isLinkSync(String path) =>
typeSync(path, followLinks: false) == io.FileSystemEntityType.link;
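
The doc fixes above reference the lowercase `FileSystemEntityType` constants (`file`, `directory`, `link`, `notFound`) that replaced the old SCREAMING_CAPS names. A short usage sketch of `typeSync` and the `isLinkSync` helper with the in-memory implementation:

```dart
import 'package:file/memory.dart';

void main() {
  final fs = MemoryFileSystem();
  fs.file('/foo').createSync();
  fs.link('/bar').createSync('/foo');

  print(fs.typeSync('/bar')); // file (links are followed by default)
  print(fs.typeSync('/bar', followLinks: false)); // link
  print(fs.isLinkSync('/bar')); // true
}
```
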
diff --git a/pkgs/file/lib/src/io.dart b/pkgs/file/lib/src/io.dart
index 9d57e78..28c1d6d 100644
--- a/pkgs/file/lib/src/io.dart
+++ b/pkgs/file/lib/src/io.dart
@@ -8,6 +8,8 @@
/// the `file` package. The `file` package re-exports these interfaces (or in
/// some cases, implementations of these interfaces by the same name), so this
/// file need not be exposed publicly and exists for internal use only.
+library;
+
export 'dart:io'
show
Directory,
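
`library file.src.interface;` is dropped and a bare `library;` is added in `io.dart`: since Dart 2.19 a library directive no longer needs a name, and the unnamed form exists purely so that a library-level doc comment, like the one above the export, has something to attach to. For example:

```dart
/// Example of the unnamed `library;` directive; this doc comment and any
/// library-level annotations attach to it without requiring a dotted name.
library;

void main() => print('ok');
```
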
diff --git a/pkgs/file/pubspec.yaml b/pkgs/file/pubspec.yaml
index 5de5d37..0ad65b0 100644
--- a/pkgs/file/pubspec.yaml
+++ b/pkgs/file/pubspec.yaml
@@ -1,5 +1,5 @@
name: file
-version: 7.0.1
+version: 7.0.2-wip
description: A pluggable, mockable file system abstraction for Dart.
repository: https://github.com/dart-lang/tools/tree/main/pkgs/file
issue_tracker: https://github.com/dart-lang/tools/issues?q=is%3Aissue+is%3Aopen+label%3Apackage%3Afile
@@ -12,6 +12,10 @@
path: ^1.8.3
dev_dependencies:
+ dart_flutter_team_lints: ^3.0.0
file_testing: ^3.0.0
- lints: ^2.0.1
test: ^1.23.1
+
+dependency_overrides:
+ file_testing:
+ path: ../file_testing
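
The new `dependency_overrides` stanza points `file_testing` at its sibling package in the monorepo, so the tests below exercise the in-repo version rather than the published one. Those tests lean on `file_testing`'s matchers; a minimal example of the style:

```dart
import 'package:file/memory.dart';
import 'package:file_testing/file_testing.dart';
import 'package:test/test.dart';

void main() {
  test('file_testing matchers', () {
    final fs = MemoryFileSystem();
    fs.directory('/foo').createSync();
    fs.file('/foo/bar').createSync();

    // `exists`, `isDirectory`, and `isFile` come from package:file_testing.
    expect(fs.directory('/foo'), allOf(exists, isDirectory));
    expect(fs.file('/foo/bar'), allOf(exists, isFile));
  });
}
```
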
diff --git a/pkgs/file/test/chroot_test.dart b/pkgs/file/test/chroot_test.dart
index 6c34ff2..cf23f47 100644
--- a/pkgs/file/test/chroot_test.dart
+++ b/pkgs/file/test/chroot_test.dart
@@ -3,6 +3,8 @@
// BSD-style license that can be found in the LICENSE file.
@TestOn('vm')
+library;
+
import 'dart:io' as io;
import 'package:file/chroot.dart';
@@ -17,14 +19,15 @@
void main() {
group('ChrootFileSystem', () {
ChrootFileSystem createMemoryBackedChrootFileSystem() {
- MemoryFileSystem fs = MemoryFileSystem();
+ var fs = MemoryFileSystem();
fs.directory('/tmp').createSync();
return ChrootFileSystem(fs, '/tmp');
}
// TODO(jamesderlin): Make ChrootFile.openSync return a delegating
// RandomAccessFile that uses the chroot'd path.
- List<String> skipCommon = <String>[
+ var skipCommon = <String>[
+ // ignore: lines_longer_than_80_chars
'File > open > .* > RandomAccessFile > read > openReadHandleDoesNotChange',
'File > open > .* > RandomAccessFile > openWriteHandleDoesNotChange',
];
@@ -137,6 +140,7 @@
test('referencesRootEntityForJailbreakPath', () {
mem.file('/foo').createSync();
dynamic f = fs.file('../foo');
+ // ignore: avoid_dynamic_calls
expect(f.delegate.path, '/tmp/foo');
});
});
@@ -151,7 +155,7 @@
group('copy', () {
test('copiesToRootDirectoryIfDestinationIsJailbreakPath', () {
- File f = fs.file('/foo')..createSync();
+ var f = fs.file('/foo')..createSync();
f.copySync('../bar');
expect(mem.file('/bar'), isNot(exists));
expect(mem.file('/tmp/bar'), exists);
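
`ChrootFileSystem` wraps another file system and re-roots all paths under a directory of the delegate, which is why the jailbreak tests above expect `../bar` to land under `/tmp` of the backing `MemoryFileSystem`. A compact sketch of the same setup used by `createMemoryBackedChrootFileSystem`:

```dart
import 'package:file/chroot.dart';
import 'package:file/memory.dart';

void main() {
  final mem = MemoryFileSystem();
  mem.directory('/tmp').createSync();

  // Paths seen through `fs` are resolved under /tmp in the backing `mem`.
  final fs = ChrootFileSystem(mem, '/tmp');
  fs.file('/foo').createSync();

  print(mem.file('/tmp/foo').existsSync()); // true
  print(fs.file('/foo').existsSync()); // true
}
```
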
diff --git a/pkgs/file/test/common_tests.dart b/pkgs/file/test/common_tests.dart
index 6028c77..491d4f9 100644
--- a/pkgs/file/test/common_tests.dart
+++ b/pkgs/file/test/common_tests.dart
@@ -3,6 +3,8 @@
// BSD-style license that can be found in the LICENSE file.
@TestOn('vm')
+library;
+
import 'dart:async';
import 'dart:convert';
import 'dart:io' as io;
@@ -10,8 +12,8 @@
import 'package:file/file.dart';
import 'package:file_testing/file_testing.dart';
import 'package:path/path.dart' as p;
-import 'package:test/test.dart';
import 'package:test/test.dart' as testpkg show group, setUp, tearDown, test;
+import 'package:test/test.dart';
import 'utils.dart';
@@ -54,7 +56,7 @@
List<String> skip = const <String>[],
FileSystemGenerator? replay,
}) {
- RootPathGenerator? rootfn = root;
+ var rootfn = root;
group('common', () {
late FileSystemGenerator createFs;
@@ -62,7 +64,7 @@
late List<SetUpTearDown> tearDowns;
late FileSystem fs;
late String root;
- List<String> stack = <String>[];
+ var stack = <String>[];
void skipIfNecessary(String description, void Function() callback) {
stack.add(description);
@@ -105,7 +107,7 @@
testpkg.setUp(() async {
await Future.forEach(setUps, (SetUpTearDown setUp) => setUp());
await body();
- for (SetUpTearDown tearDown in tearDowns) {
+ for (var tearDown in tearDowns) {
await tearDown();
}
createFs = replay;
@@ -115,7 +117,7 @@
testpkg.test(description, body, skip: skip);
testpkg.tearDown(() async {
- for (SetUpTearDown tearDown in tearDowns) {
+ for (var tearDown in tearDowns) {
await tearDown();
}
});
@@ -126,13 +128,13 @@
/// Returns [path] prefixed by the [root] namespace.
/// This is only intended for absolute paths.
String ns(String path) {
- p.Context posix = p.Context(style: p.Style.posix);
- List<String> parts = posix.split(path);
+ var posix = p.Context(style: p.Style.posix);
+ var parts = posix.split(path);
parts[0] = root;
path = fs.path.joinAll(parts);
- String rootPrefix = fs.path.rootPrefix(path);
+ var rootPrefix = fs.path.rootPrefix(path);
assert(rootPrefix.isNotEmpty);
- String result = root == rootPrefix
+ var result = root == rootPrefix
? path
: (path == rootPrefix
? root
@@ -160,7 +162,7 @@
test('succeedsWithUriArgument', () {
fs.directory(ns('/foo')).createSync();
- Uri uri = fs.path.toUri(ns('/foo'));
+ var uri = fs.path.toUri(ns('/foo'));
expect(fs.directory(uri), exists);
});
@@ -173,11 +175,11 @@
});
// Fails due to
- // https://github.com/google/file.dart/issues/112
+ // https://github.com/dart-lang/tools/issues/632
test('considersBothSlashesEquivalent', () {
fs.directory(r'foo\bar_dir').createSync(recursive: true);
expect(fs.directory(r'foo/bar_dir'), exists);
- }, skip: 'Fails due to https://github.com/google/file.dart/issues/112');
+ }, skip: 'Fails due to https://github.com/dart-lang/tools/issues/632');
});
group('file', () {
@@ -191,7 +193,7 @@
test('succeedsWithUriArgument', () {
fs.file(ns('/foo')).createSync();
- Uri uri = fs.path.toUri(ns('/foo'));
+ var uri = fs.path.toUri(ns('/foo'));
expect(fs.file(uri), exists);
});
@@ -204,11 +206,11 @@
});
// Fails due to
- // https://github.com/google/file.dart/issues/112
+ // https://github.com/dart-lang/tools/issues/632
test('considersBothSlashesEquivalent', () {
fs.file(r'foo\bar_file').createSync(recursive: true);
expect(fs.file(r'foo/bar_file'), exists);
- }, skip: 'Fails due to https://github.com/google/file.dart/issues/112');
+ }, skip: 'Fails due to https://github.com/dart-lang/tools/issues/632');
});
group('link', () {
@@ -223,7 +225,7 @@
test('succeedsWithUriArgument', () {
fs.file(ns('/foo')).createSync();
fs.link(ns('/bar')).createSync(ns('/foo'));
- Uri uri = fs.path.toUri(ns('/bar'));
+ var uri = fs.path.toUri(ns('/bar'));
expect(fs.link(uri), exists);
});
@@ -248,7 +250,7 @@
group('systemTempDirectory', () {
test('existsAsDirectory', () {
- Directory tmp = fs.systemTempDirectory;
+ var tmp = fs.systemTempDirectory;
expect(tmp, isDirectory);
expect(tmp, exists);
});
@@ -318,7 +320,7 @@
test('staysAtRootIfSetToParentOfRoot', () {
fs.currentDirectory =
List<String>.filled(20, '..').join(fs.path.separator);
- String cwd = fs.currentDirectory.path;
+ var cwd = fs.currentDirectory.path;
expect(cwd, fs.path.rootPrefix(cwd));
});
@@ -371,36 +373,36 @@
group('stat', () {
test('isNotFoundForEmptyPath', () {
- FileStat stat = fs.statSync('');
+ var stat = fs.statSync('');
expect(stat.type, FileSystemEntityType.notFound);
});
test('isNotFoundForPathToNonExistentEntityAtTail', () {
- FileStat stat = fs.statSync(ns('/foo'));
+ var stat = fs.statSync(ns('/foo'));
expect(stat.type, FileSystemEntityType.notFound);
});
test('isNotFoundForPathToNonExistentEntityInTraversal', () {
- FileStat stat = fs.statSync(ns('/foo/bar'));
+ var stat = fs.statSync(ns('/foo/bar'));
expect(stat.type, FileSystemEntityType.notFound);
});
test('isDirectoryForDirectory', () {
fs.directory(ns('/foo')).createSync();
- FileStat stat = fs.statSync(ns('/foo'));
+ var stat = fs.statSync(ns('/foo'));
expect(stat.type, FileSystemEntityType.directory);
});
test('isFileForFile', () {
fs.file(ns('/foo')).createSync();
- FileStat stat = fs.statSync(ns('/foo'));
+ var stat = fs.statSync(ns('/foo'));
expect(stat.type, FileSystemEntityType.file);
});
test('isFileForLinkToFile', () {
fs.file(ns('/foo')).createSync();
fs.link(ns('/bar')).createSync(ns('/foo'));
- FileStat stat = fs.statSync(ns('/bar'));
+ var stat = fs.statSync(ns('/bar'));
expect(stat.type, FileSystemEntityType.file);
});
@@ -408,7 +410,7 @@
fs.link(ns('/foo')).createSync(ns('/bar'));
fs.link(ns('/bar')).createSync(ns('/baz'));
fs.link(ns('/baz')).createSync(ns('/foo'));
- FileStat stat = fs.statSync(ns('/foo'));
+ var stat = fs.statSync(ns('/foo'));
expect(stat.type, FileSystemEntityType.notFound);
});
});
@@ -454,18 +456,18 @@
group('type', () {
test('isFileForFile', () {
fs.file(ns('/foo')).createSync();
- FileSystemEntityType type = fs.typeSync(ns('/foo'));
+ var type = fs.typeSync(ns('/foo'));
expect(type, FileSystemEntityType.file);
});
test('isDirectoryForDirectory', () {
fs.directory(ns('/foo')).createSync();
- FileSystemEntityType type = fs.typeSync(ns('/foo'));
+ var type = fs.typeSync(ns('/foo'));
expect(type, FileSystemEntityType.directory);
});
test('isDirectoryForAncestorOfRoot', () {
- FileSystemEntityType type = fs
+ var type = fs
.typeSync(List<String>.filled(20, '..').join(fs.path.separator));
expect(type, FileSystemEntityType.directory);
});
@@ -473,15 +475,14 @@
test('isFileForLinkToFileAndFollowLinksTrue', () {
fs.file(ns('/foo')).createSync();
fs.link(ns('/bar')).createSync(ns('/foo'));
- FileSystemEntityType type = fs.typeSync(ns('/bar'));
+ var type = fs.typeSync(ns('/bar'));
expect(type, FileSystemEntityType.file);
});
test('isLinkForLinkToFileAndFollowLinksFalse', () {
fs.file(ns('/foo')).createSync();
fs.link(ns('/bar')).createSync(ns('/foo'));
- FileSystemEntityType type =
- fs.typeSync(ns('/bar'), followLinks: false);
+ var type = fs.typeSync(ns('/bar'), followLinks: false);
expect(type, FileSystemEntityType.link);
});
@@ -489,17 +490,17 @@
fs.link(ns('/foo')).createSync(ns('/bar'));
fs.link(ns('/bar')).createSync(ns('/baz'));
fs.link(ns('/baz')).createSync(ns('/foo'));
- FileSystemEntityType type = fs.typeSync(ns('/foo'));
+ var type = fs.typeSync(ns('/foo'));
expect(type, FileSystemEntityType.notFound);
});
test('isNotFoundForNoEntityAtTail', () {
- FileSystemEntityType type = fs.typeSync(ns('/foo'));
+ var type = fs.typeSync(ns('/foo'));
expect(type, FileSystemEntityType.notFound);
});
test('isNotFoundForNoDirectoryInTraversal', () {
- FileSystemEntityType type = fs.typeSync(ns('/foo/bar/baz'));
+ var type = fs.typeSync(ns('/foo/bar/baz'));
expect(type, FileSystemEntityType.notFound);
});
});
@@ -676,8 +677,8 @@
});
test('succeedsIfDestinationDoesntExist', () {
- Directory src = fs.directory(ns('/foo'))..createSync();
- Directory dest = src.renameSync(ns('/bar'));
+ var src = fs.directory(ns('/foo'))..createSync();
+ var dest = src.renameSync(ns('/bar'));
expect(dest.path, ns('/bar'));
expect(dest, exists);
});
@@ -686,8 +687,8 @@
'succeedsIfDestinationIsEmptyDirectory',
() {
fs.directory(ns('/bar')).createSync();
- Directory src = fs.directory(ns('/foo'))..createSync();
- Directory dest = src.renameSync(ns('/bar'));
+ var src = fs.directory(ns('/foo'))..createSync();
+ var dest = src.renameSync(ns('/bar'));
expect(src, isNot(exists));
expect(dest, exists);
},
@@ -697,14 +698,14 @@
test('throwsIfDestinationIsFile', () {
fs.file(ns('/bar')).createSync();
- Directory src = fs.directory(ns('/foo'))..createSync();
+ var src = fs.directory(ns('/foo'))..createSync();
expectFileSystemException(ErrorCodes.ENOTDIR, () {
src.renameSync(ns('/bar'));
});
});
test('throwsIfDestinationParentFolderDoesntExist', () {
- Directory src = fs.directory(ns('/foo'))..createSync();
+ var src = fs.directory(ns('/foo'))..createSync();
expectFileSystemException(ErrorCodes.ENOENT, () {
src.renameSync(ns('/bar/baz'));
});
@@ -712,7 +713,7 @@
test('throwsIfDestinationIsNonEmptyDirectory', () {
fs.file(ns('/bar/baz')).createSync(recursive: true);
- Directory src = fs.directory(ns('/foo'))..createSync();
+ var src = fs.directory(ns('/foo'))..createSync();
// The error will be 'Directory not empty' on OS X, but it will be
// 'File exists' on Linux.
expectFileSystemException(
@@ -749,7 +750,7 @@
});
test('throwsIfDestinationIsLinkToNotFound', () {
- Directory src = fs.directory(ns('/foo'))..createSync();
+ var src = fs.directory(ns('/foo'))..createSync();
fs.link(ns('/bar')).createSync(ns('/baz'));
expectFileSystemException(ErrorCodes.ENOTDIR, () {
src.renameSync(ns('/bar'));
@@ -757,7 +758,7 @@
});
test('throwsIfDestinationIsLinkToEmptyDirectory', () {
- Directory src = fs.directory(ns('/foo'))..createSync();
+ var src = fs.directory(ns('/foo'))..createSync();
fs.directory(ns('/bar')).createSync();
fs.link(ns('/baz')).createSync(ns('/bar'));
expectFileSystemException(ErrorCodes.ENOTDIR, () {
@@ -766,7 +767,7 @@
});
test('succeedsIfDestinationIsInDifferentDirectory', () {
- Directory src = fs.directory(ns('/foo'))..createSync();
+ var src = fs.directory(ns('/foo'))..createSync();
fs.directory(ns('/bar')).createSync();
src.renameSync(ns('/bar/baz'));
expect(fs.typeSync(ns('/foo')), FileSystemEntityType.notFound);
@@ -790,24 +791,24 @@
group('delete', () {
test('returnsCovariantType', () async {
- Directory dir = fs.directory(ns('/foo'))..createSync();
+ var dir = fs.directory(ns('/foo'))..createSync();
expect(await dir.delete(), isDirectory);
});
test('succeedsIfEmptyDirectoryExistsAndRecursiveFalse', () {
- Directory dir = fs.directory(ns('/foo'))..createSync();
+ var dir = fs.directory(ns('/foo'))..createSync();
dir.deleteSync();
expect(dir, isNot(exists));
});
test('succeedsIfEmptyDirectoryExistsAndRecursiveTrue', () {
- Directory dir = fs.directory(ns('/foo'))..createSync();
+ var dir = fs.directory(ns('/foo'))..createSync();
dir.deleteSync(recursive: true);
expect(dir, isNot(exists));
});
test('throwsIfNonEmptyDirectoryExistsAndRecursiveFalse', () {
- Directory dir = fs.directory(ns('/foo'))..createSync();
+ var dir = fs.directory(ns('/foo'))..createSync();
fs.file(ns('/foo/bar')).createSync();
expectFileSystemException(ErrorCodes.ENOTEMPTY, () {
dir.deleteSync();
@@ -815,7 +816,7 @@
});
test('succeedsIfNonEmptyDirectoryExistsAndRecursiveTrue', () {
- Directory dir = fs.directory(ns('/foo'))..createSync();
+ var dir = fs.directory(ns('/foo'))..createSync();
fs.file(ns('/foo/bar')).createSync();
dir.deleteSync(recursive: true);
expect(fs.directory(ns('/foo')), isNot(exists));
@@ -997,7 +998,7 @@
test('handlesParentAndThisFolderReferences', () {
fs.directory(ns('/foo/bar/baz')).createSync(recursive: true);
fs.link(ns('/foo/bar/baz/qux')).createSync(fs.path.join('..', '..'));
- String resolved = fs
+ var resolved = fs
.directory(ns('/foo/./bar/baz/../baz/qux/bar'))
.resolveSymbolicLinksSync();
expect(resolved, ns('/foo/bar'));
@@ -1015,7 +1016,7 @@
.createSync(fs.path.join('..', '..', 'qux'), recursive: true);
fs.link(ns('/qux')).createSync('quux');
fs.link(ns('/quux/quuz')).createSync(ns('/foo'), recursive: true);
- String resolved = fs
+ var resolved = fs
.directory(ns('/foo//bar/./baz/quuz/bar/..///bar/baz/'))
.resolveSymbolicLinksSync();
expect(resolved, ns('/quux'));
@@ -1069,29 +1070,29 @@
test('resolvesNameCollisions', () {
fs.directory(ns('/foo/bar')).createSync(recursive: true);
- Directory tmp = fs.directory(ns('/foo')).createTempSync('bar');
+ var tmp = fs.directory(ns('/foo')).createTempSync('bar');
expect(tmp.path,
allOf(isNot(ns('/foo/bar')), startsWith(ns('/foo/bar'))));
});
test('succeedsWithoutPrefix', () {
- Directory dir = fs.directory(ns('/foo'))..createSync();
+ var dir = fs.directory(ns('/foo'))..createSync();
expect(dir.createTempSync().path, startsWith(ns('/foo/')));
});
test('succeedsWithPrefix', () {
- Directory dir = fs.directory(ns('/foo'))..createSync();
+ var dir = fs.directory(ns('/foo'))..createSync();
expect(dir.createTempSync('bar').path, startsWith(ns('/foo/bar')));
});
test('succeedsWithNestedPathPrefixThatExists', () {
fs.directory(ns('/foo/bar')).createSync(recursive: true);
- Directory tmp = fs.directory(ns('/foo')).createTempSync('bar/baz');
+ var tmp = fs.directory(ns('/foo')).createTempSync('bar/baz');
expect(tmp.path, startsWith(ns('/foo/bar/baz')));
});
test('throwsWithNestedPathPrefixThatDoesntExist', () {
- Directory dir = fs.directory(ns('/foo'))..createSync();
+ var dir = fs.directory(ns('/foo'))..createSync();
expectFileSystemException(ErrorCodes.ENOENT, () {
dir.createTempSync('bar/baz');
});
@@ -1123,7 +1124,7 @@
});
test('returnsEmptyListForEmptyDirectory', () {
- Directory empty = fs.directory(ns('/bar'))..createSync();
+ var empty = fs.directory(ns('/bar'))..createSync();
expect(empty.listSync(), isEmpty);
});
@@ -1134,7 +1135,7 @@
});
test('returnsLinkObjectsIfFollowLinksFalse', () {
- List<FileSystemEntity> list = dir.listSync(followLinks: false);
+ var list = dir.listSync(followLinks: false);
expect(list, hasLength(3));
expect(list, contains(allOf(isFile, hasPath(ns('/foo/bar')))));
expect(list, contains(allOf(isDirectory, hasPath(ns('/foo/baz')))));
@@ -1142,7 +1143,7 @@
});
test('followsLinksIfFollowLinksTrue', () {
- List<FileSystemEntity> list = dir.listSync();
+ var list = dir.listSync();
expect(list, hasLength(3));
expect(list, contains(allOf(isFile, hasPath(ns('/foo/bar')))));
expect(list, contains(allOf(isDirectory, hasPath(ns('/foo/baz')))));
@@ -1189,8 +1190,7 @@
test('childEntriesNotNormalized', () {
dir = fs.directory(ns('/bar/baz'))..createSync(recursive: true);
fs.file(ns('/bar/baz/qux')).createSync();
- List<FileSystemEntity> list =
- fs.directory(ns('/bar//../bar/./baz')).listSync();
+ var list = fs.directory(ns('/bar//../bar/./baz')).listSync();
expect(list, hasLength(1));
expect(list[0], allOf(isFile, hasPath(ns('/bar//../bar/./baz/qux'))));
});
@@ -1198,9 +1198,8 @@
test('symlinksToNotFoundAlwaysReturnedAsLinks', () {
dir = fs.directory(ns('/bar'))..createSync();
fs.link(ns('/bar/baz')).createSync('qux');
- for (bool followLinks in const <bool>[true, false]) {
- List<FileSystemEntity> list =
- dir.listSync(followLinks: followLinks);
+ for (var followLinks in const <bool>[true, false]) {
+ var list = dir.listSync(followLinks: followLinks);
expect(list, hasLength(1));
expect(list[0], allOf(isLink, hasPath(ns('/bar/baz'))));
}
@@ -1208,7 +1207,7 @@
});
test('childEntities', () {
- Directory dir = fs.directory(ns('/foo'))..createSync();
+ var dir = fs.directory(ns('/foo'))..createSync();
dir.childDirectory('bar').createSync();
dir.childFile('baz').createSync();
dir.childLink('qux').createSync('bar');
@@ -1321,22 +1320,22 @@
});
test('succeedsIfDestinationDoesntExistAtTail', () {
- File src = fs.file(ns('/foo'))..createSync();
- File dest = src.renameSync(ns('/bar'));
+ var src = fs.file(ns('/foo'))..createSync();
+ var dest = src.renameSync(ns('/bar'));
expect(fs.file(ns('/foo')), isNot(exists));
expect(fs.file(ns('/bar')), exists);
expect(dest.path, ns('/bar'));
});
test('throwsIfDestinationDoesntExistViaTraversal', () {
- File f = fs.file(ns('/foo'))..createSync();
+ var f = fs.file(ns('/foo'))..createSync();
expectFileSystemException(ErrorCodes.ENOENT, () {
f.renameSync(ns('/bar/baz'));
});
});
test('succeedsIfDestinationExistsAsFile', () {
- File f = fs.file(ns('/foo'))..createSync();
+ var f = fs.file(ns('/foo'))..createSync();
fs.file(ns('/bar')).createSync();
f.renameSync(ns('/bar'));
expect(fs.file(ns('/foo')), isNot(exists));
@@ -1344,7 +1343,7 @@
});
test('throwsIfDestinationExistsAsDirectory', () {
- File f = fs.file(ns('/foo'))..createSync();
+ var f = fs.file(ns('/foo'))..createSync();
fs.directory(ns('/bar')).createSync();
expectFileSystemException(ErrorCodes.EISDIR, () {
f.renameSync(ns('/bar'));
@@ -1352,7 +1351,7 @@
});
test('succeedsIfDestinationExistsAsLinkToFile', () {
- File f = fs.file(ns('/foo'))..createSync();
+ var f = fs.file(ns('/foo'))..createSync();
fs.file(ns('/bar')).createSync();
fs.link(ns('/baz')).createSync(ns('/bar'));
f.renameSync(ns('/baz'));
@@ -1364,7 +1363,7 @@
});
test('throwsIfDestinationExistsAsLinkToDirectory', () {
- File f = fs.file(ns('/foo'))..createSync();
+ var f = fs.file(ns('/foo'))..createSync();
fs.directory(ns('/bar')).createSync();
fs.link(ns('/baz')).createSync(ns('/bar'));
expectFileSystemException(ErrorCodes.EISDIR, () {
@@ -1373,7 +1372,7 @@
});
test('succeedsIfDestinationExistsAsLinkToNotFound', () {
- File f = fs.file(ns('/foo'))..createSync();
+ var f = fs.file(ns('/foo'))..createSync();
fs.link(ns('/bar')).createSync(ns('/baz'));
f.renameSync(ns('/bar'));
expect(fs.typeSync(ns('/foo')), FileSystemEntityType.notFound);
@@ -1429,7 +1428,7 @@
});
test('succeedsIfDestinationDoesntExistAtTail', () {
- File f = fs.file(ns('/foo'))
+ var f = fs.file(ns('/foo'))
..createSync()
..writeAsStringSync('foo');
f.copySync(ns('/bar'));
@@ -1439,14 +1438,14 @@
});
test('throwsIfDestinationDoesntExistViaTraversal', () {
- File f = fs.file(ns('/foo'))..createSync();
+ var f = fs.file(ns('/foo'))..createSync();
expectFileSystemException(ErrorCodes.ENOENT, () {
f.copySync(ns('/bar/baz'));
});
});
test('succeedsIfDestinationExistsAsFile', () {
- File f = fs.file(ns('/foo'))
+ var f = fs.file(ns('/foo'))
..createSync()
..writeAsStringSync('foo');
fs.file(ns('/bar'))
@@ -1460,7 +1459,7 @@
});
test('throwsIfDestinationExistsAsDirectory', () {
- File f = fs.file(ns('/foo'))..createSync();
+ var f = fs.file(ns('/foo'))..createSync();
fs.directory(ns('/bar')).createSync();
expectFileSystemException(ErrorCodes.EISDIR, () {
f.copySync(ns('/bar'));
@@ -1468,7 +1467,7 @@
});
test('succeedsIfDestinationExistsAsLinkToFile', () {
- File f = fs.file(ns('/foo'))
+ var f = fs.file(ns('/foo'))
..createSync()
..writeAsStringSync('foo');
fs.file(ns('/bar'))
@@ -1487,7 +1486,7 @@
}, skip: io.Platform.isWindows /* No links on Windows */);
test('throwsIfDestinationExistsAsLinkToDirectory', () {
- File f = fs.file(ns('/foo'))..createSync();
+ var f = fs.file(ns('/foo'))..createSync();
fs.directory(ns('/bar')).createSync();
fs.link(ns('/baz')).createSync(ns('/bar'));
expectFileSystemException(ErrorCodes.EISDIR, () {
@@ -1525,7 +1524,7 @@
});
test('succeedsIfDestinationIsInDifferentDirectoryThanSource', () {
- File f = fs.file(ns('/foo/bar'))
+ var f = fs.file(ns('/foo/bar'))
..createSync(recursive: true)
..writeAsStringSync('foo');
fs.directory(ns('/baz')).createSync();
@@ -1587,12 +1586,12 @@
});
test('returnsZeroForNewlyCreatedFile', () {
- File f = fs.file(ns('/foo'))..createSync();
+ var f = fs.file(ns('/foo'))..createSync();
expect(f.lengthSync(), 0);
});
test('writeNBytesReturnsLengthN', () {
- File f = fs.file(ns('/foo'))..createSync();
+ var f = fs.file(ns('/foo'))..createSync();
f.writeAsBytesSync(<int>[1, 2, 3, 4], flush: true);
expect(f.lengthSync(), 4);
});
@@ -1616,10 +1615,10 @@
group('lastAccessed', () {
test('isNowForNewlyCreatedFile', () {
- DateTime before = downstairs();
- File f = fs.file(ns('/foo'))..createSync();
- DateTime after = ceil();
- DateTime accessed = f.lastAccessedSync();
+ var before = downstairs();
+ var f = fs.file(ns('/foo'))..createSync();
+ var after = ceil();
+ var accessed = f.lastAccessedSync();
expect(accessed, isSameOrAfter(before));
expect(accessed, isSameOrBefore(after));
});
@@ -1638,18 +1637,18 @@
});
test('succeedsIfExistsAsLinkToFile', () {
- DateTime before = downstairs();
+ var before = downstairs();
fs.file(ns('/foo')).createSync();
fs.link(ns('/bar')).createSync(ns('/foo'));
- DateTime after = ceil();
- DateTime accessed = fs.file(ns('/bar')).lastAccessedSync();
+ var after = ceil();
+ var accessed = fs.file(ns('/bar')).lastAccessedSync();
expect(accessed, isSameOrAfter(before));
expect(accessed, isSameOrBefore(after));
});
});
group('setLastAccessed', () {
- final DateTime time = DateTime(1999);
+ final time = DateTime(1999);
test('throwsIfDoesntExist', () {
expectFileSystemException(ErrorCodes.ENOENT, () {
@@ -1665,13 +1664,13 @@
});
test('succeedsIfExistsAsFile', () {
- File f = fs.file(ns('/foo'))..createSync();
+ var f = fs.file(ns('/foo'))..createSync();
f.setLastAccessedSync(time);
expect(fs.file(ns('/foo')).lastAccessedSync(), time);
});
test('succeedsIfExistsAsLinkToFile', () {
- File f = fs.file(ns('/foo'))..createSync();
+ var f = fs.file(ns('/foo'))..createSync();
fs.link(ns('/bar')).createSync(ns('/foo'));
f.setLastAccessedSync(time);
expect(fs.file(ns('/bar')).lastAccessedSync(), time);
@@ -1680,10 +1679,10 @@
group('lastModified', () {
test('isNowForNewlyCreatedFile', () {
- DateTime before = downstairs();
- File f = fs.file(ns('/foo'))..createSync();
- DateTime after = ceil();
- DateTime modified = f.lastModifiedSync();
+ var before = downstairs();
+ var f = fs.file(ns('/foo'))..createSync();
+ var after = ceil();
+ var modified = f.lastModifiedSync();
expect(modified, isSameOrAfter(before));
expect(modified, isSameOrBefore(after));
});
@@ -1702,18 +1701,18 @@
});
test('succeedsIfExistsAsLinkToFile', () {
- DateTime before = downstairs();
+ var before = downstairs();
fs.file(ns('/foo')).createSync();
fs.link(ns('/bar')).createSync(ns('/foo'));
- DateTime after = ceil();
- DateTime modified = fs.file(ns('/bar')).lastModifiedSync();
+ var after = ceil();
+ var modified = fs.file(ns('/bar')).lastModifiedSync();
expect(modified, isSameOrAfter(before));
expect(modified, isSameOrBefore(after));
});
});
group('setLastModified', () {
- final DateTime time = DateTime(1999);
+ final time = DateTime(1999);
test('throwsIfDoesntExist', () {
expectFileSystemException(ErrorCodes.ENOENT, () {
@@ -1729,13 +1728,13 @@
});
test('succeedsIfExistsAsFile', () {
- File f = fs.file(ns('/foo'))..createSync();
+ var f = fs.file(ns('/foo'))..createSync();
f.setLastModifiedSync(time);
expect(fs.file(ns('/foo')).lastModifiedSync(), time);
});
test('succeedsIfExistsAsLinkToFile', () {
- File f = fs.file(ns('/foo'))..createSync();
+ var f = fs.file(ns('/foo'))..createSync();
fs.link(ns('/bar')).createSync(ns('/foo'));
f.setLastModifiedSync(time);
expect(fs.file(ns('/bar')).lastModifiedSync(), time);
@@ -1752,7 +1751,7 @@
});
} else {
test('createsFileIfDoesntExistAtTail', () {
- RandomAccessFile raf = fs.file(ns('/bar')).openSync(mode: mode);
+ var raf = fs.file(ns('/bar')).openSync(mode: mode);
raf.closeSync();
expect(fs.file(ns('/bar')), exists);
});
@@ -1877,39 +1876,39 @@
});
test('readIntoWithBufferLargerThanContent', () {
- List<int> buffer = List<int>.filled(1024, 0);
- int numRead = raf.readIntoSync(buffer);
+ var buffer = List<int>.filled(1024, 0);
+ var numRead = raf.readIntoSync(buffer);
expect(numRead, 21);
expect(utf8.decode(buffer.sublist(0, 21)),
'pre-existing content\n');
});
test('readIntoWithBufferSmallerThanContent', () {
- List<int> buffer = List<int>.filled(10, 0);
- int numRead = raf.readIntoSync(buffer);
+ var buffer = List<int>.filled(10, 0);
+ var numRead = raf.readIntoSync(buffer);
expect(numRead, 10);
expect(utf8.decode(buffer), 'pre-existi');
});
test('readIntoWithStart', () {
- List<int> buffer = List<int>.filled(10, 0);
- int numRead = raf.readIntoSync(buffer, 2);
+ var buffer = List<int>.filled(10, 0);
+ var numRead = raf.readIntoSync(buffer, 2);
expect(numRead, 8);
expect(utf8.decode(buffer.sublist(2)), 'pre-exis');
});
test('readIntoWithStartAndEnd', () {
- List<int> buffer = List<int>.filled(10, 0);
- int numRead = raf.readIntoSync(buffer, 2, 5);
+ var buffer = List<int>.filled(10, 0);
+ var numRead = raf.readIntoSync(buffer, 2, 5);
expect(numRead, 3);
expect(utf8.decode(buffer.sublist(2, 5)), 'pre');
});
test('openReadHandleDoesNotChange', () {
- final String initial = utf8.decode(raf.readSync(4));
+ final initial = utf8.decode(raf.readSync(4));
expect(initial, 'pre-');
- final File newFile = f.renameSync(ns('/bar'));
- String rest = utf8.decode(raf.readSync(1024));
+ final newFile = f.renameSync(ns('/bar'));
+ var rest = utf8.decode(raf.readSync(1024));
expect(rest, 'existing content\n');
assert(newFile.path != f.path);
@@ -1942,13 +1941,13 @@
});
} else {
test('lengthGrowsAsDataIsWritten', () {
- int lengthBefore = f.lengthSync();
+ var lengthBefore = f.lengthSync();
raf.writeByteSync(0xFACE);
expect(raf.lengthSync(), lengthBefore + 1);
});
test('flush', () {
- int lengthBefore = f.lengthSync();
+ var lengthBefore = f.lengthSync();
raf.writeByteSync(0xFACE);
raf.flushSync();
expect(f.lengthSync(), lengthBefore + 1);
@@ -2009,10 +2008,10 @@
test('openWriteHandleDoesNotChange', () {
raf.writeStringSync('Hello ');
- final File newFile = f.renameSync(ns('/bar'));
+ final newFile = f.renameSync(ns('/bar'));
raf.writeStringSync('world');
- final String contents = newFile.readAsStringSync();
+ final contents = newFile.readAsStringSync();
if (mode == FileMode.write || mode == FileMode.writeOnly) {
expect(contents, 'Hello world');
} else {
@@ -2067,7 +2066,7 @@
});
} else {
test('growsAfterWrite', () {
- int positionBefore = raf.positionSync();
+ var positionBefore = raf.positionSync();
raf.writeStringSync('Hello world');
expect(raf.positionSync(), positionBefore + 11);
});
@@ -2165,42 +2164,42 @@
group('openRead', () {
test('throwsIfDoesntExist', () {
- Stream<List<int>> stream = fs.file(ns('/foo')).openRead();
+ var stream = fs.file(ns('/foo')).openRead();
expect(stream.drain<void>(),
throwsFileSystemException(ErrorCodes.ENOENT));
});
test('succeedsIfExistsAsFile', () async {
- File f = fs.file(ns('/foo'))..createSync();
+ var f = fs.file(ns('/foo'))..createSync();
f.writeAsStringSync('Hello world', flush: true);
- Stream<List<int>> stream = f.openRead();
- List<List<int>> data = await stream.toList();
+ var stream = f.openRead();
+ var data = await stream.toList();
expect(data, hasLength(1));
expect(utf8.decode(data[0]), 'Hello world');
});
test('throwsIfExistsAsDirectory', () {
fs.directory(ns('/foo')).createSync();
- Stream<List<int>> stream = fs.file(ns('/foo')).openRead();
+ var stream = fs.file(ns('/foo')).openRead();
expect(stream.drain<void>(),
throwsFileSystemException(ErrorCodes.EISDIR));
});
test('succeedsIfExistsAsLinkToFile', () async {
- File f = fs.file(ns('/foo'))..createSync();
+ var f = fs.file(ns('/foo'))..createSync();
fs.link(ns('/bar')).createSync(ns('/foo'));
f.writeAsStringSync('Hello world', flush: true);
- Stream<List<int>> stream = fs.file(ns('/bar')).openRead();
- List<List<int>> data = await stream.toList();
+ var stream = fs.file(ns('/bar')).openRead();
+ var data = await stream.toList();
expect(data, hasLength(1));
expect(utf8.decode(data[0]), 'Hello world');
});
test('respectsStartAndEndParameters', () async {
- File f = fs.file(ns('/foo'))..createSync();
+ var f = fs.file(ns('/foo'))..createSync();
f.writeAsStringSync('Hello world', flush: true);
- Stream<List<int>> stream = f.openRead(2);
- List<List<int>> data = await stream.toList();
+ var stream = f.openRead(2);
+ var data = await stream.toList();
expect(data, hasLength(1));
expect(utf8.decode(data[0]), 'llo world');
stream = f.openRead(2, 5);
@@ -2210,24 +2209,24 @@
});
test('throwsIfStartParameterIsNegative', () async {
- File f = fs.file(ns('/foo'))..createSync();
- Stream<List<int>> stream = f.openRead(-2);
+ var f = fs.file(ns('/foo'))..createSync();
+ var stream = f.openRead(-2);
expect(stream.drain<void>(), throwsRangeError);
});
test('stopsAtEndOfFileIfEndParameterIsPastEndOfFile', () async {
- File f = fs.file(ns('/foo'))..createSync();
+ var f = fs.file(ns('/foo'))..createSync();
f.writeAsStringSync('Hello world', flush: true);
- Stream<List<int>> stream = f.openRead(2, 1024);
- List<List<int>> data = await stream.toList();
+ var stream = f.openRead(2, 1024);
+ var data = await stream.toList();
expect(data, hasLength(1));
expect(utf8.decode(data[0]), 'llo world');
});
test('providesSingleSubscriptionStream', () async {
- File f = fs.file(ns('/foo'))..createSync();
+ var f = fs.file(ns('/foo'))..createSync();
f.writeAsStringSync('Hello world', flush: true);
- Stream<List<int>> stream = f.openRead();
+ var stream = f.openRead();
expect(stream.isBroadcast, isFalse);
await stream.drain<void>();
});
@@ -2237,20 +2236,20 @@
// split across multiple chunks in the [Stream]. However, there
// doesn't seem to be a good way to determine the chunk size used by
// [io.File].
- final List<int> data = List<int>.generate(
+ final data = List<int>.generate(
1024 * 256,
(int index) => index & 0xFF,
growable: false,
);
- final File f = fs.file(ns('/foo'))..createSync();
+ final f = fs.file(ns('/foo'))..createSync();
f.writeAsBytesSync(data, flush: true);
- final Stream<List<int>> stream = f.openRead();
+ final stream = f.openRead();
File? newFile;
List<int>? initialChunk;
- final List<int> remainingChunks = <int>[];
+ final remainingChunks = <int>[];
await for (List<int> chunk in stream) {
if (initialChunk == null) {
@@ -2276,7 +2275,7 @@
test('openReadCompatibleWithUtf8Decoder', () async {
const content = 'Hello world!';
- File file = fs.file(ns('/foo'))
+ var file = fs.file(ns('/foo'))
..createSync()
..writeAsStringSync(content);
expect(
@@ -2315,8 +2314,8 @@
});
test('succeedsIfExistsAsEmptyFile', () async {
- File f = fs.file(ns('/foo'))..createSync();
- IOSink sink = f.openWrite();
+ var f = fs.file(ns('/foo'))..createSync();
+ var sink = f.openWrite();
sink.write('Hello world');
await sink.flush();
await sink.close();
@@ -2326,7 +2325,7 @@
test('succeedsIfExistsAsLinkToFile', () async {
fs.file(ns('/foo')).createSync();
fs.link(ns('/bar')).createSync(ns('/foo'));
- IOSink sink = fs.file(ns('/bar')).openWrite();
+ var sink = fs.file(ns('/bar')).openWrite();
sink.write('Hello world');
await sink.flush();
await sink.close();
@@ -2334,9 +2333,9 @@
});
test('overwritesContentInWriteMode', () async {
- File f = fs.file(ns('/foo'))..createSync();
+ var f = fs.file(ns('/foo'))..createSync();
f.writeAsStringSync('Hello');
- IOSink sink = f.openWrite();
+ var sink = f.openWrite();
sink.write('Goodbye');
await sink.flush();
await sink.close();
@@ -2344,9 +2343,9 @@
});
test('appendsContentInAppendMode', () async {
- File f = fs.file(ns('/foo'))..createSync();
+ var f = fs.file(ns('/foo'))..createSync();
f.writeAsStringSync('Hello');
- IOSink sink = f.openWrite(mode: FileMode.append);
+ var sink = f.openWrite(mode: FileMode.append);
sink.write('Goodbye');
await sink.flush();
await sink.close();
@@ -2354,12 +2353,12 @@
});
test('openWriteHandleDoesNotChange', () async {
- File f = fs.file(ns('/foo'))..createSync();
- IOSink sink = f.openWrite();
+ var f = fs.file(ns('/foo'))..createSync();
+ var sink = f.openWrite();
sink.write('Hello');
await sink.flush();
- final File newFile = f.renameSync(ns('/bar'));
+ final newFile = f.renameSync(ns('/bar'));
sink.write('Goodbye');
await sink.flush();
await sink.close();
@@ -2377,7 +2376,7 @@
late bool isSinkClosed;
Future<dynamic> closeSink() {
- Future<dynamic> future = sink.close();
+ var future = sink.close();
isSinkClosed = true;
return future;
}
@@ -2448,13 +2447,13 @@
test('ignoresCloseAfterAlreadyClosed', () async {
sink.write('Hello world');
- Future<dynamic> f1 = closeSink();
- Future<dynamic> f2 = closeSink();
+ var f1 = closeSink();
+ var f2 = closeSink();
await Future.wait<dynamic>(<Future<dynamic>>[f1, f2]);
});
test('returnsAccurateDoneFuture', () async {
- bool done = false;
+ var done = false;
// ignore: unawaited_futures
sink.done.then((dynamic _) => done = true);
expect(done, isFalse);
@@ -2469,7 +2468,7 @@
late bool isControllerClosed;
Future<dynamic> closeController() {
- Future<dynamic> future = controller.close();
+ var future = controller.close();
isControllerClosed = true;
return future;
}
@@ -2543,7 +2542,7 @@
});
test('succeedsIfExistsAsFile', () {
- File f = fs.file(ns('/foo'))..createSync();
+ var f = fs.file(ns('/foo'))..createSync();
f.writeAsBytesSync(<int>[1, 2, 3, 4]);
expect(f.readAsBytesSync(), <int>[1, 2, 3, 4]);
});
@@ -2556,12 +2555,12 @@
});
test('returnsEmptyListForZeroByteFile', () {
- File f = fs.file(ns('/foo'))..createSync();
+ var f = fs.file(ns('/foo'))..createSync();
expect(f.readAsBytesSync(), isEmpty);
});
test('returns a copy, not a view, of the file content', () {
- File f = fs.file(ns('/foo'))..createSync();
+ var f = fs.file(ns('/foo'))..createSync();
f.writeAsBytesSync(<int>[1, 2, 3, 4]);
List<int> result = f.readAsBytesSync();
expect(result, <int>[1, 2, 3, 4]);
@@ -2593,7 +2592,7 @@
});
test('succeedsIfExistsAsFile', () {
- File f = fs.file(ns('/foo'))..createSync();
+ var f = fs.file(ns('/foo'))..createSync();
f.writeAsStringSync('Hello world');
expect(f.readAsStringSync(), 'Hello world');
});
@@ -2606,14 +2605,14 @@
});
test('returnsEmptyStringForZeroByteFile', () {
- File f = fs.file(ns('/foo'))..createSync();
+ var f = fs.file(ns('/foo'))..createSync();
expect(f.readAsStringSync(), isEmpty);
});
});
group('readAsLines', () {
- const String testString = 'Hello world\nHow are you?\nI am fine';
- final List<String> expectedLines = <String>[
+ const testString = 'Hello world\nHow are you?\nI am fine';
+ final expectedLines = <String>[
'Hello world',
'How are you?',
'I am fine',
@@ -2641,25 +2640,25 @@
});
test('succeedsIfExistsAsFile', () {
- File f = fs.file(ns('/foo'))..createSync();
+ var f = fs.file(ns('/foo'))..createSync();
f.writeAsStringSync(testString);
expect(f.readAsLinesSync(), expectedLines);
});
test('succeedsIfExistsAsLinkToFile', () {
- File f = fs.file(ns('/foo'))..createSync();
+ var f = fs.file(ns('/foo'))..createSync();
fs.link(ns('/bar')).createSync(ns('/foo'));
f.writeAsStringSync(testString);
expect(f.readAsLinesSync(), expectedLines);
});
test('returnsEmptyListForZeroByteFile', () {
- File f = fs.file(ns('/foo'))..createSync();
+ var f = fs.file(ns('/foo'))..createSync();
expect(f.readAsLinesSync(), isEmpty);
});
test('isTrailingNewlineAgnostic', () {
- File f = fs.file(ns('/foo'))..createSync();
+ var f = fs.file(ns('/foo'))..createSync();
f.writeAsStringSync('$testString\n');
expect(f.readAsLinesSync(), expectedLines);
@@ -2677,7 +2676,7 @@
});
test('createsFileIfDoesntExist', () {
- File f = fs.file(ns('/foo'));
+ var f = fs.file(ns('/foo'));
expect(f, isNot(exists));
f.writeAsBytesSync(<int>[1, 2, 3, 4]);
expect(f, exists);
@@ -2699,21 +2698,21 @@
});
test('succeedsIfExistsAsLinkToFile', () {
- File f = fs.file(ns('/foo'))..createSync();
+ var f = fs.file(ns('/foo'))..createSync();
fs.link(ns('/bar')).createSync(ns('/foo'));
fs.file(ns('/bar')).writeAsBytesSync(<int>[1, 2, 3, 4]);
expect(f.readAsBytesSync(), <int>[1, 2, 3, 4]);
});
test('throwsIfFileModeRead', () {
- File f = fs.file(ns('/foo'))..createSync();
+ var f = fs.file(ns('/foo'))..createSync();
expectFileSystemException(ErrorCodes.EBADF, () {
f.writeAsBytesSync(<int>[1], mode: FileMode.read);
});
});
test('overwritesContentIfFileModeWrite', () {
- File f = fs.file(ns('/foo'))..createSync();
+ var f = fs.file(ns('/foo'))..createSync();
f.writeAsBytesSync(<int>[1, 2]);
expect(f.readAsBytesSync(), <int>[1, 2]);
f.writeAsBytesSync(<int>[3, 4]);
@@ -2721,7 +2720,7 @@
});
test('appendsContentIfFileModeAppend', () {
- File f = fs.file(ns('/foo'))..createSync();
+ var f = fs.file(ns('/foo'))..createSync();
f.writeAsBytesSync(<int>[1, 2], mode: FileMode.append);
expect(f.readAsBytesSync(), <int>[1, 2]);
f.writeAsBytesSync(<int>[3, 4], mode: FileMode.append);
@@ -2729,17 +2728,17 @@
});
test('acceptsEmptyBytesList', () {
- File f = fs.file(ns('/foo'))..createSync();
+ var f = fs.file(ns('/foo'))..createSync();
f.writeAsBytesSync(<int>[]);
expect(f.readAsBytesSync(), <int>[]);
});
test('updatesLastModifiedTime', () async {
- File f = fs.file(ns('/foo'))..createSync();
- DateTime before = f.statSync().modified;
+ var f = fs.file(ns('/foo'))..createSync();
+ var before = f.statSync().modified;
await Future<void>.delayed(const Duration(seconds: 2));
f.writeAsBytesSync(<int>[1, 2, 3]);
- DateTime after = f.statSync().modified;
+ var after = f.statSync().modified;
expect(after, isAfter(before));
});
});
@@ -2750,7 +2749,7 @@
});
test('createsFileIfDoesntExist', () {
- File f = fs.file(ns('/foo'));
+ var f = fs.file(ns('/foo'));
expect(f, isNot(exists));
f.writeAsStringSync('Hello world');
expect(f, exists);
@@ -2772,21 +2771,21 @@
});
test('succeedsIfExistsAsLinkToFile', () {
- File f = fs.file(ns('/foo'))..createSync();
+ var f = fs.file(ns('/foo'))..createSync();
fs.link(ns('/bar')).createSync(ns('/foo'));
fs.file(ns('/bar')).writeAsStringSync('Hello world');
expect(f.readAsStringSync(), 'Hello world');
});
test('throwsIfFileModeRead', () {
- File f = fs.file(ns('/foo'))..createSync();
+ var f = fs.file(ns('/foo'))..createSync();
expectFileSystemException(ErrorCodes.EBADF, () {
f.writeAsStringSync('Hello world', mode: FileMode.read);
});
});
test('overwritesContentIfFileModeWrite', () {
- File f = fs.file(ns('/foo'))..createSync();
+ var f = fs.file(ns('/foo'))..createSync();
f.writeAsStringSync('Hello world');
expect(f.readAsStringSync(), 'Hello world');
f.writeAsStringSync('Goodbye cruel world');
@@ -2794,7 +2793,7 @@
});
test('appendsContentIfFileModeAppend', () {
- File f = fs.file(ns('/foo'))..createSync();
+ var f = fs.file(ns('/foo'))..createSync();
f.writeAsStringSync('Hello', mode: FileMode.append);
expect(f.readAsStringSync(), 'Hello');
f.writeAsStringSync('Goodbye', mode: FileMode.append);
@@ -2802,7 +2801,7 @@
});
test('acceptsEmptyString', () {
- File f = fs.file(ns('/foo'))..createSync();
+ var f = fs.file(ns('/foo'))..createSync();
f.writeAsStringSync('');
expect(f.readAsStringSync(), isEmpty);
});
@@ -2847,38 +2846,38 @@
group('stat', () {
test('isNotFoundIfDoesntExistAtTail', () {
- FileStat stat = fs.file(ns('/foo')).statSync();
+ var stat = fs.file(ns('/foo')).statSync();
expect(stat.type, FileSystemEntityType.notFound);
});
test('isNotFoundIfDoesntExistViaTraversal', () {
- FileStat stat = fs.file(ns('/foo/bar')).statSync();
+ var stat = fs.file(ns('/foo/bar')).statSync();
expect(stat.type, FileSystemEntityType.notFound);
});
test('isDirectoryIfExistsAsDirectory', () {
fs.directory(ns('/foo')).createSync();
- FileStat stat = fs.file(ns('/foo')).statSync();
+ var stat = fs.file(ns('/foo')).statSync();
expect(stat.type, FileSystemEntityType.directory);
});
test('isFileIfExistsAsFile', () {
fs.file(ns('/foo')).createSync();
- FileStat stat = fs.file(ns('/foo')).statSync();
+ var stat = fs.file(ns('/foo')).statSync();
expect(stat.type, FileSystemEntityType.file);
});
test('isFileIfExistsAsLinkToFile', () {
fs.file(ns('/foo')).createSync();
fs.link(ns('/bar')).createSync(ns('/foo'));
- FileStat stat = fs.file(ns('/bar')).statSync();
+ var stat = fs.file(ns('/bar')).statSync();
expect(stat.type, FileSystemEntityType.file);
});
});
group('delete', () {
test('returnsCovariantType', () async {
- File f = fs.file(ns('/foo'))..createSync();
+ var f = fs.file(ns('/foo'))..createSync();
expect(await f.delete(), isFile);
});
@@ -2953,14 +2952,14 @@
group('uri', () {
test('whenTargetIsDirectory', () {
fs.directory(ns('/foo')).createSync();
- Link l = fs.link(ns('/bar'))..createSync(ns('/foo'));
+ var l = fs.link(ns('/bar'))..createSync(ns('/foo'));
expect(l.uri, fs.path.toUri(ns('/bar')));
expect(fs.link('bar').uri.toString(), 'bar');
});
test('whenTargetIsFile', () {
fs.file(ns('/foo')).createSync();
- Link l = fs.link(ns('/bar'))..createSync(ns('/foo'));
+ var l = fs.link(ns('/bar'))..createSync(ns('/foo'));
expect(l.uri, fs.path.toUri(ns('/bar')));
expect(fs.link('bar').uri.toString(), 'bar');
});
@@ -2991,24 +2990,24 @@
});
test('isTrueIfTargetIsNotFound', () {
- Link l = fs.link(ns('/foo'))..createSync(ns('/bar'));
+ var l = fs.link(ns('/foo'))..createSync(ns('/bar'));
expect(l, exists);
});
test('isTrueIfTargetIsFile', () {
- Link l = fs.link(ns('/foo'))..createSync(ns('/bar'));
+ var l = fs.link(ns('/foo'))..createSync(ns('/bar'));
fs.file(ns('/bar')).createSync();
expect(l, exists);
});
test('isTrueIfTargetIsDirectory', () {
- Link l = fs.link(ns('/foo'))..createSync(ns('/bar'));
+ var l = fs.link(ns('/foo'))..createSync(ns('/bar'));
fs.directory(ns('/bar')).createSync();
expect(l, exists);
});
test('isTrueIfTargetIsLinkLoop', () {
- Link l = fs.link(ns('/foo'))..createSync(ns('/bar'));
+ var l = fs.link(ns('/foo'))..createSync(ns('/bar'));
fs.link(ns('/bar')).createSync(ns('/foo'));
expect(l, exists);
});
@@ -3038,29 +3037,29 @@
});
test('isNotFoundIfTargetNotFoundAtTail', () {
- Link l = fs.link(ns('/foo'))..createSync(ns('/bar'));
+ var l = fs.link(ns('/foo'))..createSync(ns('/bar'));
expect(l.statSync().type, FileSystemEntityType.notFound);
});
test('isNotFoundIfTargetNotFoundViaTraversal', () {
- Link l = fs.link(ns('/foo'))..createSync(ns('/bar/baz'));
+ var l = fs.link(ns('/foo'))..createSync(ns('/bar/baz'));
expect(l.statSync().type, FileSystemEntityType.notFound);
});
test('isNotFoundIfTargetIsLinkLoop', () {
- Link l = fs.link(ns('/foo'))..createSync(ns('/bar'));
+ var l = fs.link(ns('/foo'))..createSync(ns('/bar'));
fs.link(ns('/bar')).createSync(ns('/foo'));
expect(l.statSync().type, FileSystemEntityType.notFound);
});
test('isFileIfTargetIsFile', () {
- Link l = fs.link(ns('/foo'))..createSync(ns('/bar'));
+ var l = fs.link(ns('/foo'))..createSync(ns('/bar'));
fs.file(ns('/bar')).createSync();
expect(l.statSync().type, FileSystemEntityType.file);
});
test('isDirectoryIfTargetIsDirectory', () {
- Link l = fs.link(ns('/foo'))..createSync(ns('/bar'));
+ var l = fs.link(ns('/foo'))..createSync(ns('/bar'));
fs.directory(ns('/bar')).createSync();
expect(l.statSync().type, FileSystemEntityType.directory);
});
@@ -3068,7 +3067,7 @@
group('delete', () {
test('returnsCovariantType', () async {
- Link link = fs.link(ns('/foo'))..createSync(ns('/bar'));
+ var link = fs.link(ns('/foo'))..createSync(ns('/bar'));
expect(await link.delete(), isLink);
});
@@ -3118,7 +3117,7 @@
});
test('unlinksIfTargetIsFileAndRecursiveFalse', () {
- Link l = fs.link(ns('/foo'))..createSync(ns('/bar'));
+ var l = fs.link(ns('/foo'))..createSync(ns('/bar'));
fs.file(ns('/bar')).createSync();
l.deleteSync();
expect(fs.typeSync(ns('/foo'), followLinks: false),
@@ -3128,7 +3127,7 @@
});
test('unlinksIfTargetIsFileAndRecursiveTrue', () {
- Link l = fs.link(ns('/foo'))..createSync(ns('/bar'));
+ var l = fs.link(ns('/foo'))..createSync(ns('/bar'));
fs.file(ns('/bar')).createSync();
l.deleteSync(recursive: true);
expect(fs.typeSync(ns('/foo'), followLinks: false),
@@ -3138,7 +3137,7 @@
});
test('unlinksIfTargetIsDirectoryAndRecursiveFalse', () {
- Link l = fs.link(ns('/foo'))..createSync(ns('/bar'));
+ var l = fs.link(ns('/foo'))..createSync(ns('/bar'));
fs.directory(ns('/bar')).createSync();
l.deleteSync();
expect(fs.typeSync(ns('/foo'), followLinks: false),
@@ -3148,7 +3147,7 @@
});
test('unlinksIfTargetIsDirectoryAndRecursiveTrue', () {
- Link l = fs.link(ns('/foo'))..createSync(ns('/bar'));
+ var l = fs.link(ns('/foo'))..createSync(ns('/bar'));
fs.directory(ns('/bar')).createSync();
l.deleteSync(recursive: true);
expect(fs.typeSync(ns('/foo'), followLinks: false),
@@ -3158,7 +3157,7 @@
});
test('unlinksIfTargetIsLinkLoop', () {
- Link l = fs.link(ns('/foo'))..createSync(ns('/bar'));
+ var l = fs.link(ns('/foo'))..createSync(ns('/bar'));
fs.link(ns('/bar')).createSync(ns('/foo'));
l.deleteSync();
expect(fs.typeSync(ns('/foo'), followLinks: false),
@@ -3178,7 +3177,7 @@
});
test('ignoresLinkTarget', () {
- Link l = fs.link(ns('/foo/bar'))
+ var l = fs.link(ns('/foo/bar'))
..createSync(ns('/baz/qux'), recursive: true);
expect(l.parent.path, ns('/foo'));
});
@@ -3190,7 +3189,7 @@
});
test('succeedsIfLinkDoesntExistAtTail', () {
- Link l = fs.link(ns('/foo'))..createSync(ns('/bar'));
+ var l = fs.link(ns('/foo'))..createSync(ns('/bar'));
expect(fs.typeSync(ns('/foo'), followLinks: false),
FileSystemEntityType.link);
expect(l.targetSync(), ns('/bar'));
@@ -3203,7 +3202,7 @@
});
test('succeedsIfLinkDoesntExistViaTraversalAndRecursiveTrue', () {
- Link l = fs.link(ns('/foo/bar'))..createSync('baz', recursive: true);
+ var l = fs.link(ns('/foo/bar'))..createSync('baz', recursive: true);
expect(fs.typeSync(ns('/foo'), followLinks: false),
FileSystemEntityType.directory);
expect(fs.typeSync(ns('/foo/bar'), followLinks: false),
@@ -3242,7 +3241,7 @@
group('update', () {
test('returnsCovariantType', () async {
- Link l = fs.link(ns('/foo'))..createSync(ns('/bar'));
+ var l = fs.link(ns('/foo'))..createSync(ns('/bar'));
expect(await l.update(ns('/baz')), isLink);
});
@@ -3336,24 +3335,24 @@
});
test('succeedsIfTargetIsNotFound', () {
- Link l = fs.link(ns('/foo'))..createSync(ns('/bar'));
+ var l = fs.link(ns('/foo'))..createSync(ns('/bar'));
expect(l.targetSync(), ns('/bar'));
});
test('succeedsIfTargetIsFile', () {
- Link l = fs.link(ns('/foo'))..createSync(ns('/bar'));
+ var l = fs.link(ns('/foo'))..createSync(ns('/bar'));
fs.file(ns('/bar')).createSync();
expect(l.targetSync(), ns('/bar'));
});
test('succeedsIfTargetIsDirectory', () {
- Link l = fs.link(ns('/foo'))..createSync(ns('/bar'));
+ var l = fs.link(ns('/foo'))..createSync(ns('/bar'));
fs.directory(ns('/bar')).createSync();
expect(l.targetSync(), ns('/bar'));
});
test('succeedsIfTargetIsLinkLoop', () {
- Link l = fs.link(ns('/foo'))..createSync(ns('/bar'));
+ var l = fs.link(ns('/foo'))..createSync(ns('/bar'));
fs.link(ns('/bar')).createSync(ns('/foo'));
expect(l.targetSync(), ns('/bar'));
});
@@ -3393,9 +3392,9 @@
});
test('succeedsIfSourceIsLinkToFile', () {
- Link l = fs.link(ns('/foo'))..createSync(ns('/bar'));
+ var l = fs.link(ns('/foo'))..createSync(ns('/bar'));
fs.file(ns('/bar')).createSync();
- Link renamed = l.renameSync(ns('/baz'));
+ var renamed = l.renameSync(ns('/baz'));
expect(renamed.path, ns('/baz'));
expect(fs.typeSync(ns('/foo'), followLinks: false),
FileSystemEntityType.notFound);
@@ -3407,8 +3406,8 @@
});
test('succeedsIfSourceIsLinkToNotFound', () {
- Link l = fs.link(ns('/foo'))..createSync(ns('/bar'));
- Link renamed = l.renameSync(ns('/baz'));
+ var l = fs.link(ns('/foo'))..createSync(ns('/bar'));
+ var renamed = l.renameSync(ns('/baz'));
expect(renamed.path, ns('/baz'));
expect(fs.typeSync(ns('/foo'), followLinks: false),
FileSystemEntityType.notFound);
@@ -3418,9 +3417,9 @@
});
test('succeedsIfSourceIsLinkToDirectory', () {
- Link l = fs.link(ns('/foo'))..createSync(ns('/bar'));
+ var l = fs.link(ns('/foo'))..createSync(ns('/bar'));
fs.directory(ns('/bar')).createSync();
- Link renamed = l.renameSync(ns('/baz'));
+ var renamed = l.renameSync(ns('/baz'));
expect(renamed.path, ns('/baz'));
expect(fs.typeSync(ns('/foo'), followLinks: false),
FileSystemEntityType.notFound);
@@ -3432,9 +3431,9 @@
});
test('succeedsIfSourceIsLinkLoop', () {
- Link l = fs.link(ns('/foo'))..createSync(ns('/bar'));
+ var l = fs.link(ns('/foo'))..createSync(ns('/bar'));
fs.link(ns('/bar')).createSync(ns('/foo'));
- Link renamed = l.renameSync(ns('/baz'));
+ var renamed = l.renameSync(ns('/baz'));
expect(renamed.path, ns('/baz'));
expect(fs.typeSync(ns('/foo'), followLinks: false),
FileSystemEntityType.notFound);
@@ -3446,22 +3445,22 @@
});
test('succeedsIfDestinationDoesntExistAtTail', () {
- Link l = fs.link(ns('/foo'))..createSync(ns('/bar'));
- Link renamed = l.renameSync(ns('/baz'));
+ var l = fs.link(ns('/foo'))..createSync(ns('/bar'));
+ var renamed = l.renameSync(ns('/baz'));
expect(renamed.path, ns('/baz'));
expect(fs.link(ns('/foo')), isNot(exists));
expect(fs.link(ns('/baz')), exists);
});
test('throwsIfDestinationDoesntExistViaTraversal', () {
- Link l = fs.link(ns('/foo'))..createSync(ns('/bar'));
+ var l = fs.link(ns('/foo'))..createSync(ns('/bar'));
expectFileSystemException(ErrorCodes.ENOENT, () {
l.renameSync(ns('/baz/qux'));
});
});
test('throwsIfDestinationExistsAsFile', () {
- Link l = fs.link(ns('/foo'))..createSync(ns('/bar'));
+ var l = fs.link(ns('/foo'))..createSync(ns('/bar'));
fs.file(ns('/baz')).createSync();
expectFileSystemException(ErrorCodes.EINVAL, () {
l.renameSync(ns('/baz'));
@@ -3469,7 +3468,7 @@
});
test('throwsIfDestinationExistsAsDirectory', () {
- Link l = fs.link(ns('/foo'))..createSync(ns('/bar'));
+ var l = fs.link(ns('/foo'))..createSync(ns('/bar'));
fs.directory(ns('/baz')).createSync();
expectFileSystemException(ErrorCodes.EINVAL, () {
l.renameSync(ns('/baz'));
@@ -3477,7 +3476,7 @@
});
test('succeedsIfDestinationExistsAsLinkToFile', () {
- Link l = fs.link(ns('/foo'))..createSync(ns('/bar'));
+ var l = fs.link(ns('/foo'))..createSync(ns('/bar'));
fs.file(ns('/baz')).createSync();
fs.link(ns('/qux')).createSync(ns('/baz'));
l.renameSync(ns('/qux'));
@@ -3490,7 +3489,7 @@
});
test('throwsIfDestinationExistsAsLinkToDirectory', () {
- Link l = fs.link(ns('/foo'))..createSync(ns('/bar'));
+ var l = fs.link(ns('/foo'))..createSync(ns('/bar'));
fs.directory(ns('/baz')).createSync();
fs.link(ns('/qux')).createSync(ns('/baz'));
l.renameSync(ns('/qux'));
@@ -3503,7 +3502,7 @@
});
test('succeedsIfDestinationExistsAsLinkToNotFound', () {
- Link l = fs.link(ns('/foo'))..createSync(ns('/bar'));
+ var l = fs.link(ns('/foo'))..createSync(ns('/bar'));
fs.link(ns('/baz')).createSync(ns('/qux'));
l.renameSync(ns('/baz'));
expect(fs.typeSync(ns('/foo')), FileSystemEntityType.notFound);
diff --git a/pkgs/file/test/local_test.dart b/pkgs/file/test/local_test.dart
index e1618d2..b794ccd 100644
--- a/pkgs/file/test/local_test.dart
+++ b/pkgs/file/test/local_test.dart
@@ -2,7 +2,11 @@
// for details. All rights reserved. Use of this source code is governed by a
// BSD-style license that can be found in the LICENSE file.
+// ignore_for_file: lines_longer_than_80_chars
+
@TestOn('vm')
+library;
+
import 'dart:io' as io;
import 'package:file/local.dart';
@@ -33,7 +37,7 @@
setUpAll(() {
if (io.Platform.isWindows) {
// TODO(tvolkert): Remove once all more serious test failures are fixed
- // https://github.com/google/file.dart/issues/56
+ // https://github.com/dart-lang/tools/issues/618
ignoreOsErrorCodes = true;
}
});
@@ -42,7 +46,7 @@
ignoreOsErrorCodes = false;
});
- Map<String, List<String>> skipOnPlatform = <String, List<String>>{
+ var skipOnPlatform = <String, List<String>>{
'windows': <String>[
'FileSystem > currentDirectory > throwsIfHasNonExistentPathInComplexChain',
'FileSystem > currentDirectory > resolvesLinksIfEncountered',
diff --git a/pkgs/file/test/memory_operations_test.dart b/pkgs/file/test/memory_operations_test.dart
index 5e27843..916707c 100644
--- a/pkgs/file/test/memory_operations_test.dart
+++ b/pkgs/file/test/memory_operations_test.dart
@@ -2,22 +2,21 @@
// for details. All rights reserved. Use of this source code is governed by a
// BSD-style license that can be found in the LICENSE file.
-import 'package:file/file.dart';
import 'package:file/memory.dart';
import 'package:test/test.dart';
void main() {
test('Read operations invoke opHandle', () async {
- List<String> contexts = <String>[];
- List<FileSystemOp> operations = <FileSystemOp>[];
- MemoryFileSystem fs = MemoryFileSystem.test(
+ var contexts = <String>[];
+ var operations = <FileSystemOp>[];
+ var fs = MemoryFileSystem.test(
opHandle: (String context, FileSystemOp operation) {
if (operation == FileSystemOp.read) {
contexts.add(context);
operations.add(operation);
}
});
- final File file = fs.file('test')..createSync();
+ final file = fs.file('test')..createSync();
await file.readAsBytes();
file.readAsBytesSync();
@@ -34,16 +33,16 @@
});
test('Write operations invoke opHandle', () async {
- List<String> contexts = <String>[];
- List<FileSystemOp> operations = <FileSystemOp>[];
- MemoryFileSystem fs = MemoryFileSystem.test(
+ var contexts = <String>[];
+ var operations = <FileSystemOp>[];
+ var fs = MemoryFileSystem.test(
opHandle: (String context, FileSystemOp operation) {
if (operation == FileSystemOp.write) {
contexts.add(context);
operations.add(operation);
}
});
- final File file = fs.file('test')..createSync();
+ final file = fs.file('test')..createSync();
await file.writeAsBytes(<int>[]);
file.writeAsBytesSync(<int>[]);
@@ -60,18 +59,18 @@
});
test('Delete operations invoke opHandle', () async {
- List<String> contexts = <String>[];
- List<FileSystemOp> operations = <FileSystemOp>[];
- MemoryFileSystem fs = MemoryFileSystem.test(
+ var contexts = <String>[];
+ var operations = <FileSystemOp>[];
+ var fs = MemoryFileSystem.test(
opHandle: (String context, FileSystemOp operation) {
if (operation == FileSystemOp.delete) {
contexts.add(context);
operations.add(operation);
}
});
- final File file = fs.file('test')..createSync();
- final Directory directory = fs.directory('testDir')..createSync();
- final Link link = fs.link('testLink')..createSync('foo');
+ final file = fs.file('test')..createSync();
+ final directory = fs.directory('testDir')..createSync();
+ final link = fs.link('testLink')..createSync('foo');
await file.delete();
file.createSync();
@@ -98,9 +97,9 @@
});
test('Create operations invoke opHandle', () async {
- List<String> contexts = <String>[];
- List<FileSystemOp> operations = <FileSystemOp>[];
- MemoryFileSystem fs = MemoryFileSystem.test(
+ var contexts = <String>[];
+ var operations = <FileSystemOp>[];
+ var fs = MemoryFileSystem.test(
opHandle: (String context, FileSystemOp operation) {
if (operation == FileSystemOp.create) {
contexts.add(context);
@@ -139,16 +138,16 @@
});
test('Open operations invoke opHandle', () async {
- List<String> contexts = <String>[];
- List<FileSystemOp> operations = <FileSystemOp>[];
- MemoryFileSystem fs = MemoryFileSystem.test(
+ var contexts = <String>[];
+ var operations = <FileSystemOp>[];
+ var fs = MemoryFileSystem.test(
opHandle: (String context, FileSystemOp operation) {
if (operation == FileSystemOp.open) {
contexts.add(context);
operations.add(operation);
}
});
- final File file = fs.file('test')..createSync();
+ final file = fs.file('test')..createSync();
await file.open();
file.openSync();
@@ -165,16 +164,16 @@
});
test('Copy operations invoke opHandle', () async {
- List<String> contexts = <String>[];
- List<FileSystemOp> operations = <FileSystemOp>[];
- MemoryFileSystem fs = MemoryFileSystem.test(
+ var contexts = <String>[];
+ var operations = <FileSystemOp>[];
+ var fs = MemoryFileSystem.test(
opHandle: (String context, FileSystemOp operation) {
if (operation == FileSystemOp.copy) {
contexts.add(context);
operations.add(operation);
}
});
- final File file = fs.file('test')..createSync();
+ final file = fs.file('test')..createSync();
await file.copy('A');
file.copySync('B');
@@ -187,9 +186,9 @@
});
test('Exists operations invoke opHandle', () async {
- List<String> contexts = <String>[];
- List<FileSystemOp> operations = <FileSystemOp>[];
- MemoryFileSystem fs = MemoryFileSystem.test(
+ var contexts = <String>[];
+ var operations = <FileSystemOp>[];
+ var fs = MemoryFileSystem.test(
opHandle: (String context, FileSystemOp operation) {
if (operation == FileSystemOp.exists) {
contexts.add(context);
diff --git a/pkgs/file/test/memory_test.dart b/pkgs/file/test/memory_test.dart
index f3b324e..ce8675f 100644
--- a/pkgs/file/test/memory_test.dart
+++ b/pkgs/file/test/memory_test.dart
@@ -66,8 +66,7 @@
});
test('MemoryFileSystem.test', () {
- final MemoryFileSystem fs =
- MemoryFileSystem.test(); // creates root directory
+ final fs = MemoryFileSystem.test(); // creates root directory
fs.file('/test1.txt').createSync(); // creates file
fs.file('/test2.txt').createSync(); // creates file
expect(fs.directory('/').statSync().modified, DateTime(2000, 1, 1, 0, 1));
@@ -95,10 +94,10 @@
});
test('MemoryFile.openSync returns a MemoryRandomAccessFile', () async {
- final MemoryFileSystem fs = MemoryFileSystem.test();
+ final fs = MemoryFileSystem.test();
final io.File file = fs.file('/test1')..createSync();
- io.RandomAccessFile raf = file.openSync();
+ var raf = file.openSync();
try {
expect(raf, isA<MemoryRandomAccessFile>());
} finally {
@@ -114,7 +113,7 @@
});
test('MemoryFileSystem.systemTempDirectory test', () {
- final MemoryFileSystem fs = MemoryFileSystem.test();
+ final fs = MemoryFileSystem.test();
final io.Directory fooA = fs.systemTempDirectory.createTempSync('foo');
final io.Directory fooB = fs.systemTempDirectory.createTempSync('foo');
@@ -122,7 +121,7 @@
expect(fooA.path, '/.tmp_rand0/foorand0');
expect(fooB.path, '/.tmp_rand0/foorand1');
- final MemoryFileSystem secondFs = MemoryFileSystem.test();
+ final secondFs = MemoryFileSystem.test();
final io.Directory fooAA =
secondFs.systemTempDirectory.createTempSync('foo');
@@ -136,16 +135,16 @@
test('Failed UTF8 decoding in MemoryFileSystem throws a FileSystemException',
() {
- final MemoryFileSystem fileSystem = MemoryFileSystem.test();
- final File file = fileSystem.file('foo')
+ final fileSystem = MemoryFileSystem.test();
+ final file = fileSystem.file('foo')
..writeAsBytesSync(<int>[0xFFFE]); // Invalid UTF8
expect(file.readAsStringSync, throwsA(isA<FileSystemException>()));
});
test('Creating a temporary directory actually creates the directory', () {
- final MemoryFileSystem fileSystem = MemoryFileSystem.test();
- final Directory tempDir = fileSystem.currentDirectory.createTempSync('foo');
+ final fileSystem = MemoryFileSystem.test();
+ final tempDir = fileSystem.currentDirectory.createTempSync('foo');
expect(tempDir.existsSync(), true);
});
diff --git a/pkgs/file/test/utils.dart b/pkgs/file/test/utils.dart
index 231312f..797ec9d 100644
--- a/pkgs/file/test/utils.dart
+++ b/pkgs/file/test/utils.dart
@@ -25,7 +25,7 @@
/// If [time] is not specified, it will default to the current time.
DateTime ceil([DateTime? time]) {
time ??= DateTime.now();
- int microseconds = (1000 * time.millisecond) + time.microsecond;
+ var microseconds = (1000 * time.millisecond) + time.microsecond;
return (microseconds == 0)
? time
// Add just enough milliseconds and microseconds to reach the next second.
@@ -78,7 +78,7 @@
bool verbose,
) {
if (item is DateTime) {
- Duration diff = item.difference(_time).abs();
+ var diff = item.difference(_time).abs();
return description.add('is $mismatchAdjective $_time by $diff');
} else {
return description.add('is not a DateTime');
diff --git a/pkgs/file/test/utils_test.dart b/pkgs/file/test/utils_test.dart
index 75293bf..23788e9 100644
--- a/pkgs/file/test/utils_test.dart
+++ b/pkgs/file/test/utils_test.dart
@@ -8,9 +8,9 @@
void main() {
test('floorAndCeilProduceExactSecondDateTime', () {
- DateTime time = DateTime.fromMicrosecondsSinceEpoch(1001);
- DateTime lower = floor(time);
- DateTime upper = ceil(time);
+ var time = DateTime.fromMicrosecondsSinceEpoch(1001);
+ var lower = floor(time);
+ var upper = ceil(time);
expect(lower.millisecond, 0);
expect(upper.millisecond, 0);
expect(lower.microsecond, 0);
@@ -18,26 +18,26 @@
});
test('floorAndCeilWorkWithNow', () {
- DateTime time = DateTime.now();
- int lower = time.difference(floor(time)).inMicroseconds;
- int upper = ceil(time).difference(time).inMicroseconds;
+ var time = DateTime.now();
+ var lower = time.difference(floor(time)).inMicroseconds;
+ var upper = ceil(time).difference(time).inMicroseconds;
expect(lower, lessThan(1000000));
expect(upper, lessThanOrEqualTo(1000000));
});
test('floorAndCeilWorkWithExactSecondDateTime', () {
- DateTime time = DateTime.parse('1999-12-31 23:59:59');
- DateTime lower = floor(time);
- DateTime upper = ceil(time);
+ var time = DateTime.parse('1999-12-31 23:59:59');
+ var lower = floor(time);
+ var upper = ceil(time);
expect(lower, time);
expect(upper, time);
});
test('floorAndCeilWorkWithInexactSecondDateTime', () {
- DateTime time = DateTime.parse('1999-12-31 23:59:59.500');
- DateTime lower = floor(time);
- DateTime upper = ceil(time);
- Duration difference = upper.difference(lower);
+ var time = DateTime.parse('1999-12-31 23:59:59.500');
+ var lower = floor(time);
+ var upper = ceil(time);
+ var difference = upper.difference(lower);
expect(difference.inMicroseconds, 1000000);
});
}
diff --git a/pkgs/file_testing/CHANGELOG.md b/pkgs/file_testing/CHANGELOG.md
index 0af779d..17039ee 100644
--- a/pkgs/file_testing/CHANGELOG.md
+++ b/pkgs/file_testing/CHANGELOG.md
@@ -1,3 +1,8 @@
+## 3.1.0-wip
+
+* Changed the type of several matchers to `TypeMatcher` which allows cascading
+ their usage with `.having` and similar.
+
## 3.0.2
* Require Dart 3.1.
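
For reference, the cascading enabled by the `TypeMatcher` change noted in the 3.1.0-wip entry above looks roughly like the sketch below (illustrative only, not part of this diff; the `MemoryFileSystem` setup from `package:file` is assumed purely for demonstration):

```dart
import 'package:file/memory.dart';
import 'package:file_testing/file_testing.dart';
import 'package:test/test.dart';

void main() {
  test('matchers cascade with .having', () {
    final fs = MemoryFileSystem.test();
    final file = fs.file('/foo.txt')..createSync();

    // `isFile` is now a TypeMatcher<File>, so it can be refined in place.
    expect(file, isFile.having((f) => f.path, 'path', endsWith('foo.txt')));

    // `exists` is a TypeMatcher<FileSystemEntity> and composes the same way.
    expect(file, exists.having((e) => e.path, 'path', startsWith('/')));
  });
}
```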
diff --git a/pkgs/file_testing/analysis_options.yaml b/pkgs/file_testing/analysis_options.yaml
index 8fbd2e4..d978f81 100644
--- a/pkgs/file_testing/analysis_options.yaml
+++ b/pkgs/file_testing/analysis_options.yaml
@@ -1,6 +1 @@
-include: package:lints/recommended.yaml
-
-analyzer:
- errors:
- # Allow having TODOs in the code
- todo: ignore
+include: package:dart_flutter_team_lints/analysis_options.yaml
diff --git a/pkgs/file_testing/lib/src/testing/core_matchers.dart b/pkgs/file_testing/lib/src/testing/core_matchers.dart
index f58539f..801209e 100644
--- a/pkgs/file_testing/lib/src/testing/core_matchers.dart
+++ b/pkgs/file_testing/lib/src/testing/core_matchers.dart
@@ -2,6 +2,8 @@
// for details. All rights reserved. Use of this source code is governed by a
// BSD-style license that can be found in the LICENSE file.
+// ignore_for_file: comment_references
+
import 'dart:io';
import 'package:test/test.dart';
@@ -9,26 +11,27 @@
import 'internal.dart';
/// Matcher that successfully matches against any instance of [Directory].
-const Matcher isDirectory = TypeMatcher<Directory>();
+const isDirectory = TypeMatcher<Directory>();
/// Matcher that successfully matches against any instance of [File].
-const Matcher isFile = TypeMatcher<File>();
+const isFile = TypeMatcher<File>();
/// Matcher that successfully matches against any instance of [Link].
-const Matcher isLink = TypeMatcher<Link>();
+const isLink = TypeMatcher<Link>();
/// Matcher that successfully matches against any instance of
/// [FileSystemEntity].
-const Matcher isFileSystemEntity = TypeMatcher<FileSystemEntity>();
+const isFileSystemEntity = TypeMatcher<FileSystemEntity>();
/// Matcher that successfully matches against any instance of [FileStat].
-const Matcher isFileStat = TypeMatcher<FileStat>();
+const isFileStat = TypeMatcher<FileStat>();
/// Returns a [Matcher] that matches [path] against an entity's path.
///
/// [path] may be a String, a predicate function, or a [Matcher]. If it is
/// a String, it will be wrapped in an equality matcher.
-Matcher hasPath(dynamic path) => _HasPath(path);
+TypeMatcher<FileSystemEntity> hasPath(dynamic path) =>
+ isFileSystemEntity.having((e) => e.path, 'path', path);
/// Returns a [Matcher] that successfully matches against an instance of
/// [FileSystemException].
@@ -39,7 +42,8 @@
/// [osErrorCode] may be an `int`, a predicate function, or a [Matcher]. If it
/// is an `int`, it will be wrapped in an equality matcher.
Matcher isFileSystemException([dynamic osErrorCode]) =>
- _FileSystemException(osErrorCode);
+ const TypeMatcher<FileSystemException>().having((e) => e.osError?.errorCode,
+ 'osError.errorCode', _fileExceptionWrapMatcher(osErrorCode));
/// Returns a matcher that successfully matches against a future or function
/// that throws a [FileSystemException].
@@ -67,89 +71,10 @@
/// Matcher that successfully matches against a [FileSystemEntity] that
/// exists ([FileSystemEntity.existsSync] returns true).
-const Matcher exists = _Exists();
+final TypeMatcher<FileSystemEntity> exists =
+ isFileSystemEntity.having((e) => e.existsSync(), 'existsSync', true);
-class _FileSystemException extends Matcher {
- _FileSystemException(dynamic osErrorCode)
- : _matcher = _wrapMatcher(osErrorCode);
-
- final Matcher? _matcher;
-
- static Matcher? _wrapMatcher(dynamic osErrorCode) {
- if (osErrorCode == null) {
- return null;
- }
- return ignoreOsErrorCodes ? anything : wrapMatcher(osErrorCode);
- }
-
- @override
- bool matches(dynamic item, Map<dynamic, dynamic> matchState) {
- if (item is FileSystemException) {
- return _matcher == null ||
- _matcher!.matches(item.osError?.errorCode, matchState);
- }
- return false;
- }
-
- @override
- Description describe(Description desc) {
- if (_matcher == null) {
- return desc.add('FileSystemException');
- } else {
- desc.add('FileSystemException with osError.errorCode: ');
- return _matcher!.describe(desc);
- }
- }
-}
-
-class _HasPath extends Matcher {
- _HasPath(dynamic path) : _matcher = wrapMatcher(path);
-
- final Matcher _matcher;
-
- @override
- bool matches(dynamic item, Map<dynamic, dynamic> matchState) =>
- _matcher.matches(item.path, matchState);
-
- @override
- Description describe(Description desc) {
- desc.add('has path: ');
- return _matcher.describe(desc);
- }
-
- @override
- Description describeMismatch(
- dynamic item,
- Description desc,
- Map<dynamic, dynamic> matchState,
- bool verbose,
- ) {
- desc.add('has path: \'${item.path}\'').add('\n Which: ');
- final Description pathDesc = StringDescription();
- _matcher.describeMismatch(item.path, pathDesc, matchState, verbose);
- desc.add(pathDesc.toString());
- return desc;
- }
-}
-
-class _Exists extends Matcher {
- const _Exists();
-
- @override
- bool matches(dynamic item, Map<dynamic, dynamic> matchState) =>
- item is FileSystemEntity && item.existsSync();
-
- @override
- Description describe(Description description) =>
- description.add('a file system entity that exists');
-
- @override
- Description describeMismatch(
- dynamic item,
- Description description,
- Map<dynamic, dynamic> matchState,
- bool verbose,
- ) {
- return description.add('does not exist');
- }
-}
+Matcher? _fileExceptionWrapMatcher(dynamic osErrorCode) =>
+ (osErrorCode == null || ignoreOsErrorCodes)
+ ? anything
+ : wrapMatcher(osErrorCode);
diff --git a/pkgs/file_testing/pubspec.yaml b/pkgs/file_testing/pubspec.yaml
index 691efa0..895826a 100644
--- a/pkgs/file_testing/pubspec.yaml
+++ b/pkgs/file_testing/pubspec.yaml
@@ -1,5 +1,5 @@
name: file_testing
-version: 3.0.2
+version: 3.1.0-wip
description: Testing utilities for package:file.
repository: https://github.com/dart-lang/tools/tree/main/pkgs/file_testing
issue_tracker: https://github.com/dart-lang/tools/issues?q=is%3Aissue+is%3Aopen+label%3Apackage%3Afile_testing
@@ -10,5 +10,5 @@
dependencies:
test: ^1.23.1
-dev_dependencies:
- lints: ^5.0.0
+dev_dependencies:
+ dart_flutter_team_lints: ^3.0.0
diff --git a/pkgs/graphs/pubspec.yaml b/pkgs/graphs/pubspec.yaml
index bdbda2f..e25ee85 100644
--- a/pkgs/graphs/pubspec.yaml
+++ b/pkgs/graphs/pubspec.yaml
@@ -15,6 +15,6 @@
test: ^1.21.6
# For examples
- analyzer: '>=5.2.0 <7.0.0'
+ analyzer: '>=5.2.0 <8.0.0'
path: ^1.8.0
pool: ^1.5.0
diff --git a/pkgs/io/.gitignore b/pkgs/io/.gitignore
new file mode 100644
index 0000000..01d42c0
--- /dev/null
+++ b/pkgs/io/.gitignore
@@ -0,0 +1,4 @@
+.dart_tool/
+.pub/
+.packages
+pubspec.lock
diff --git a/pkgs/io/AUTHORS b/pkgs/io/AUTHORS
new file mode 100644
index 0000000..ff09364
--- /dev/null
+++ b/pkgs/io/AUTHORS
@@ -0,0 +1,7 @@
+# Below is a list of people and organizations that have contributed
+# to the project. Names should be added to the list like so:
+#
+# Name/Organization <email address>
+
+Google Inc.
+
diff --git a/pkgs/io/CHANGELOG.md b/pkgs/io/CHANGELOG.md
new file mode 100644
index 0000000..e0631fa
--- /dev/null
+++ b/pkgs/io/CHANGELOG.md
@@ -0,0 +1,119 @@
+## 1.0.5
+
+* Require Dart 3.4.
+* Move to `dart-lang/tools` monorepo.
+
+## 1.0.4
+
+* Updates to the readme.
+
+## 1.0.3
+
+* Revert `meta` constraint to `^1.3.0`.
+
+## 1.0.2
+
+* Update `meta` constraint to `>=1.3.0 <3.0.0`.
+
+## 1.0.1
+
+* Update code examples to call the unified `dart` developer tool.
+
+## 1.0.0
+
+* Migrate this package to null-safety.
+* Require Dart >=2.12.
+
+## 0.3.5
+
+* Require Dart >=2.1.
+* Remove dependency on `package:charcode`.
+
+## 0.3.4
+
+* Fix a number of issues affecting the package score on `pub.dev`.
+
+## 0.3.3
+
+* Updates for Dart 2 constants. Require at least Dart `2.0.0-dev.54`.
+
+* Fix the type of `StartProcess` typedef to match `Process.start` from
+ `dart:io`.
+
+## 0.3.2+1
+
+* `ansi.dart`
+
+ * The "forScript" code paths now ignore the `ansiOutputEnabled` value. Affects
+ the `escapeForScript` property on `AnsiCode` and the `wrap` and `wrapWith`
+ functions when `forScript` is true.
+
+## 0.3.2
+
+* `ansi.dart`
+
+ * Added `forScript` named argument to top-level `wrapWith` function.
+
+ * `AnsiCode`
+
+ * Added `String get escapeForScript` property.
+
+ * Added `forScript` named argument to `wrap` function.
+
+## 0.3.1
+
+- Added `SharedStdIn.nextLine` (similar to `readLineSync`) and `lines`:
+
+```dart
+main() async {
+ // Prints the first line entered on stdin.
+ print(await sharedStdIn.nextLine());
+
+ // Prints all remaining lines.
+ await for (final line in sharedStdIn.lines) {
+ print(line);
+ }
+}
+```
+
+- Added a `copyPath` and `copyPathSync` function, similar to `cp -R`.
+
+- Added a dependency on `package:path`.
+
+- Added the remaining missing arguments to `ProcessManager.spawnX` which
+ forward to `Process.start`. It is now an interchangeable function for running
+ a process.
+
+## 0.3.0
+
+- **BREAKING CHANGE**: The `arguments` argument to `ProcessManager.spawn` is
+ now positional (not named) and required. This makes it more similar to the
+ built-in `Process.start`, and easier to use as a drop in replacement:
+
+```dart
+main() {
+ processManager.spawn('dart', ['--version']);
+}
+```
+
+- Fixed a bug where processes created from `ProcessManager.spawn` could not
+ have their `stdout`/`stderr` read through their respective getters (a runtime
+ error was always thrown).
+
+- Added `ProcessManager#spawnBackground`, which does not forward `stdin`.
+
+- Added `ProcessManager#spawnDetached`, which does not forward any I/O.
+
+- Added the `shellSplit()` function, which parses a list of arguments in the
+ same manner as [the POSIX shell][what_is_posix_shell].
+
+[what_is_posix_shell]: https://pubs.opengroup.org/onlinepubs/9699919799/utilities/contents.html
+
+## 0.2.0
+
+- Initial commit of...
+  - `FutureOr<bool> isExecutable(String path)`.
+ - `ExitCode`
+ - `ProcessManager` and `Spawn`
+ - `sharedStdIn` and `SharedStdIn`
+ - `ansi.dart` library with support for formatting terminal output
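
Not part of the diff, but as a quick illustration of the `shellSplit()` entry above (assuming the published `package:io` API):

```dart
import 'package:io/io.dart';

void main() {
  // Quotes group words the way a POSIX shell would.
  final args = shellSplit('cp -r "my dir" backup');
  print(args); // [cp, -r, my dir, backup]
}
```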
diff --git a/pkgs/io/LICENSE b/pkgs/io/LICENSE
new file mode 100644
index 0000000..03af64a
--- /dev/null
+++ b/pkgs/io/LICENSE
@@ -0,0 +1,27 @@
+Copyright 2017, the Dart project authors.
+
+Redistribution and use in source and binary forms, with or without
+modification, are permitted provided that the following conditions are
+met:
+
+ * Redistributions of source code must retain the above copyright
+ notice, this list of conditions and the following disclaimer.
+ * Redistributions in binary form must reproduce the above
+ copyright notice, this list of conditions and the following
+ disclaimer in the documentation and/or other materials provided
+ with the distribution.
+ * Neither the name of Google LLC nor the names of its
+ contributors may be used to endorse or promote products derived
+ from this software without specific prior written permission.
+
+THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
diff --git a/pkgs/io/README.md b/pkgs/io/README.md
new file mode 100644
index 0000000..adbc941
--- /dev/null
+++ b/pkgs/io/README.md
@@ -0,0 +1,104 @@
+[](https://github.com/dart-lang/tools/actions/workflows/io.yaml)
+[](https://pub.dev/packages/io)
+[](https://pub.dev/packages/io/publisher)
+
+Contains utilities for the Dart VM's `dart:io`.
+
+## Usage - `io.dart`
+
+### Files
+
+#### `isExecutable`
+
+Returns whether a provided file path is considered _executable_ on the host
+operating system.
+
+### Processes
+
+#### `ExitCode`
+
+An `enum`-like class that contains known exit codes.
+
+#### `ProcessManager`
+
+A higher-level service for spawning and communicating with processes.
+
+##### Use `spawn` to create a process with std[in|out|err] forwarded by default
+
+```dart
+Future<void> main() async {
+ final manager = ProcessManager();
+
+ // Print `dart` tool version to stdout.
+ print('** Running `dart --version`');
+ var spawn = await manager.spawn('dart', ['--version']);
+ await spawn.exitCode;
+
+ // Check formatting and print the result to stdout.
+ print('** Running `dart format --output=none .`');
+ spawn = await manager.spawn('dart', ['format', '--output=none', '.']);
+ await spawn.exitCode;
+
+ // Check if a package is ready for publishing.
+ // Upon hitting a blocking stdin state, you may directly
+  // output to the process's stdin via your own, similar to how a bash or
+ // shell script would spawn a process.
+ print('** Running pub publish');
+ spawn = await manager.spawn('dart', ['pub', 'publish', '--dry-run']);
+ await spawn.exitCode;
+
+ // Closes stdin for the entire program.
+ await sharedStdIn.terminate();
+}
+```
+
+#### `sharedStdIn`
+
+A safer version of the default `stdin` stream from `dart:io` that allows a
+subscriber to cancel their subscription, and then allows a _new_ subscriber to
+start listening. This differs from the default behavior where only a single
+listener is ever allowed in the application lifecycle:
+
+```dart
+test('should allow multiple subscribers', () async {
+ final logs = <String>[];
+  final asUtf8 = sharedStdIn.transform(utf8.decoder);
+ // Wait for input for the user.
+ logs.add(await asUtf8.first);
+ // Wait for more input for the user.
+ logs.add(await asUtf8.first);
+ expect(logs, ['Hello World', 'Goodbye World']);
+});
+```
+
+For testing, an instance of `SharedStdIn` may be created directly.
+
+## Usage - `ansi.dart`
+
+```dart
+import 'dart:io' as io;
+import 'package:io/ansi.dart';
+
+void main() {
+ // To use one style, call the `wrap` method on one of the provided top-level
+ // values.
+ io.stderr.writeln(red.wrap("Bad error!"));
+
+ // To use multiple styles, call `wrapWith`.
+ print(wrapWith('** Important **', [red, styleBold, styleUnderlined]));
+
+ // The wrap functions will simply return the provided value unchanged if
+ // `ansiOutputEnabled` is false.
+ //
+ // You can override the value `ansiOutputEnabled` by wrapping code in
+ // `overrideAnsiOutput`.
+ overrideAnsiOutput(false, () {
+ assert('Normal text' == green.wrap('Normal text'));
+ });
+}
+```
+
+## Publishing automation
+
+For information about our publishing automation and release process, see
+https://github.com/dart-lang/ecosystem/wiki/Publishing-automation.
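
As a small aside on the `ExitCode` class described in the README above, typical usage looks like the following sketch (illustrative only; `ExitCode.usage`, `ExitCode.success`, and their `code` getters come from the published `package:io` API):

```dart
import 'dart:io' as io;

import 'package:io/io.dart';

void main(List<String> args) {
  if (args.isEmpty) {
    io.stderr.writeln('usage: greet <name>');
    // Exit with a named, documented code instead of a magic number.
    io.exitCode = ExitCode.usage.code;
    return;
  }
  print('Hello, ${args.first}!');
  io.exitCode = ExitCode.success.code;
}
```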
diff --git a/pkgs/io/analysis_options.yaml b/pkgs/io/analysis_options.yaml
new file mode 100644
index 0000000..6d74ee9
--- /dev/null
+++ b/pkgs/io/analysis_options.yaml
@@ -0,0 +1,32 @@
+# https://dart.dev/guides/language/analysis-options
+include: package:dart_flutter_team_lints/analysis_options.yaml
+
+analyzer:
+ language:
+ strict-casts: true
+ strict-inference: true
+ strict-raw-types: true
+
+linter:
+ rules:
+ - avoid_bool_literals_in_conditional_expressions
+ - avoid_classes_with_only_static_members
+ - avoid_private_typedef_functions
+ - avoid_redundant_argument_values
+ - avoid_returning_this
+ - avoid_unused_constructor_parameters
+ - avoid_void_async
+ - cancel_subscriptions
+ - join_return_with_assignment
+ - literal_only_boolean_expressions
+ - missing_whitespace_between_adjacent_strings
+ - no_adjacent_strings_in_list
+ - no_runtimeType_toString
+ - prefer_const_declarations
+ - prefer_expression_function_bodies
+ - prefer_final_locals
+ - unnecessary_await_in_return
+ - unnecessary_breaks
+ - use_if_null_to_convert_nulls_to_bools
+ - use_raw_strings
+ - use_string_buffers
diff --git a/pkgs/io/example/example.dart b/pkgs/io/example/example.dart
new file mode 100644
index 0000000..8e358fd
--- /dev/null
+++ b/pkgs/io/example/example.dart
@@ -0,0 +1,33 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:math';
+
+import 'package:io/ansi.dart';
+
+/// Prints a sample of all of the `AnsiCode` values.
+void main(List<String> args) {
+ final forScript = args.contains('--for-script');
+
+ if (!ansiOutputEnabled) {
+ print('`ansiOutputEnabled` is `false`.');
+ print("Don't expect pretty output.");
+ }
+ _preview('Foreground', foregroundColors, forScript);
+ _preview('Background', backgroundColors, forScript);
+ _preview('Styles', styles, forScript);
+}
+
+void _preview(String name, List<AnsiCode> values, bool forScript) {
+ print('');
+ final longest = values.map((ac) => ac.name.length).reduce(max);
+
+ print(wrapWith('** $name **', [styleBold, styleUnderlined]));
+ for (var code in values) {
+ final header =
+ '${code.name.padRight(longest)} ${code.code.toString().padLeft(3)}';
+
+ print("$header: ${code.wrap('Sample', forScript: forScript)}");
+ }
+}
diff --git a/pkgs/io/example/spawn_process_example.dart b/pkgs/io/example/spawn_process_example.dart
new file mode 100644
index 0000000..b7ba247
--- /dev/null
+++ b/pkgs/io/example/spawn_process_example.dart
@@ -0,0 +1,33 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'package:io/io.dart';
+
+/// Runs a few subcommands in the `dart` command.
+Future<void> main() async {
+ final manager = ProcessManager();
+
+ // Print `dart` tool version to stdout.
+ print('** Running `dart --version`');
+ var spawn = await manager.spawn('dart', ['--version']);
+ await spawn.exitCode;
+
+ // Check formatting and print the result to stdout.
+ print('** Running `dart format --output=none .`');
+ spawn = await manager.spawn('dart', ['format', '--output=none', '.']);
+ await spawn.exitCode;
+
+ // Check if a package is ready for publishing.
+ // Upon hitting a blocking stdin state, you may directly
+  // output to the process's stdin via your own, similar to how a bash or
+ // shell script would spawn a process.
+ print('** Running pub publish');
+ spawn = await manager.spawn('dart', ['pub', 'publish', '--dry-run']);
+ await spawn.exitCode;
+
+ // Closes stdin for the entire program.
+ await sharedStdIn.terminate();
+}
diff --git a/pkgs/io/lib/ansi.dart b/pkgs/io/lib/ansi.dart
new file mode 100644
index 0000000..a2adbe7
--- /dev/null
+++ b/pkgs/io/lib/ansi.dart
@@ -0,0 +1,5 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+export 'src/ansi_code.dart';
diff --git a/pkgs/io/lib/io.dart b/pkgs/io/lib/io.dart
new file mode 100644
index 0000000..8ee0843
--- /dev/null
+++ b/pkgs/io/lib/io.dart
@@ -0,0 +1,10 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+export 'src/copy_path.dart' show copyPath, copyPathSync;
+export 'src/exit_code.dart' show ExitCode;
+export 'src/permissions.dart' show isExecutable;
+export 'src/process_manager.dart' show ProcessManager, Spawn, StartProcess;
+export 'src/shared_stdin.dart' show SharedStdIn, sharedStdIn;
+export 'src/shell_words.dart' show shellSplit;
diff --git a/pkgs/io/lib/src/ansi_code.dart b/pkgs/io/lib/src/ansi_code.dart
new file mode 100644
index 0000000..c9a22c5
--- /dev/null
+++ b/pkgs/io/lib/src/ansi_code.dart
@@ -0,0 +1,316 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+import 'dart:io' as io;
+
+const _ansiEscapeLiteral = '\x1B';
+const _ansiEscapeForScript = r'\033';
+
+/// Whether formatted ANSI output is enabled for [wrapWith] and [AnsiCode.wrap].
+///
+/// By default, returns `true` if both `stdout.supportsAnsiEscapes` and
+/// `stderr.supportsAnsiEscapes` from `dart:io` are `true`.
+///
+/// The default can be overridden by setting the [Zone] variable [AnsiCode] to
+/// either `true` or `false`.
+///
+/// [overrideAnsiOutput] is provided to make this easy.
+bool get ansiOutputEnabled =>
+ Zone.current[AnsiCode] as bool? ??
+ (io.stdout.supportsAnsiEscapes && io.stderr.supportsAnsiEscapes);
+
+/// Returns `true` if no formatting is required for [input].
+bool _isNoop(bool skip, String? input, bool? forScript) =>
+ skip ||
+ input == null ||
+ input.isEmpty ||
+ !((forScript ?? false) || ansiOutputEnabled);
+
+/// Allows overriding [ansiOutputEnabled] to [enableAnsiOutput] for the code run
+/// within [body].
+T overrideAnsiOutput<T>(bool enableAnsiOutput, T Function() body) =>
+ runZoned(body, zoneValues: <Object, Object>{AnsiCode: enableAnsiOutput});
+
+/// The type of code represented by [AnsiCode].
+class AnsiCodeType {
+ final String _name;
+
+ /// A foreground color.
+ static const AnsiCodeType foreground = AnsiCodeType._('foreground');
+
+ /// A style.
+ static const AnsiCodeType style = AnsiCodeType._('style');
+
+ /// A background color.
+ static const AnsiCodeType background = AnsiCodeType._('background');
+
+ /// A reset value.
+ static const AnsiCodeType reset = AnsiCodeType._('reset');
+
+ const AnsiCodeType._(this._name);
+
+ @override
+ String toString() => 'AnsiType.$_name';
+}
+
+/// Standard ANSI escape code for customizing terminal text output.
+///
+/// [Source](https://en.wikipedia.org/wiki/ANSI_escape_code#Colors)
+class AnsiCode {
+ /// The numeric value associated with this code.
+ final int code;
+
+ /// The [AnsiCode] that resets this value, if one exists.
+ ///
+ /// Otherwise, `null`.
+ final AnsiCode? reset;
+
+ /// A description of this code.
+ final String name;
+
+ /// The type of code that is represented.
+ final AnsiCodeType type;
+
+ const AnsiCode._(this.name, this.type, this.code, this.reset);
+
+ /// Represents the value escaped for use in terminal output.
+ String get escape => '$_ansiEscapeLiteral[${code}m';
+
+ /// Represents the value as an unescaped literal suitable for scripts.
+ String get escapeForScript => '$_ansiEscapeForScript[${code}m';
+
+ String _escapeValue({bool forScript = false}) =>
+ forScript ? escapeForScript : escape;
+
+ /// Wraps [value] with the [escape] value for this code, followed by
+ /// [resetAll].
+ ///
+ /// If [forScript] is `true`, the return value is an unescaped literal. The
+ /// value of [ansiOutputEnabled] is also ignored.
+ ///
+ /// Returns `value` unchanged if
+  /// * [value] is `null` or empty.
+  /// * both [ansiOutputEnabled] and [forScript] are `false`.
+  /// * [type] is [AnsiCodeType.reset].
+ String? wrap(String? value, {bool forScript = false}) =>
+ _isNoop(type == AnsiCodeType.reset, value, forScript)
+ ? value
+ : '${_escapeValue(forScript: forScript)}$value'
+ '${reset!._escapeValue(forScript: forScript)}';
+
+ @override
+ String toString() => '$name ${type._name} ($code)';
+}
+
+/// Returns a [String] formatted with [codes].
+///
+/// If [forScript] is `true`, the return value is an unescaped literal. The
+/// value of [ansiOutputEnabled] is also ignored.
+///
+/// Returns `value` unchanged if
+/// * [value] is `null` or empty.
+/// * both [ansiOutputEnabled] and [forScript] are `false`.
+/// * [codes] is empty.
+///
+/// Throws an [ArgumentError] if
+/// * [codes] contains more than one value of type [AnsiCodeType.foreground].
+/// * [codes] contains more than one value of type [AnsiCodeType.background].
+/// * [codes] contains any value of type [AnsiCodeType.reset].
+String? wrapWith(String? value, Iterable<AnsiCode> codes,
+ {bool forScript = false}) {
+ // Eliminate duplicates
+ final myCodes = codes.toSet();
+
+ if (_isNoop(myCodes.isEmpty, value, forScript)) {
+ return value;
+ }
+
+ var foreground = 0, background = 0;
+ for (var code in myCodes) {
+ switch (code.type) {
+ case AnsiCodeType.foreground:
+ foreground++;
+ if (foreground > 1) {
+ throw ArgumentError.value(codes, 'codes',
+ 'Cannot contain more than one foreground color code.');
+ }
+      case AnsiCodeType.background:
+        background++;
+        if (background > 1) {
+          throw ArgumentError.value(codes, 'codes',
+              'Cannot contain more than one background color code.');
+        }
+ case AnsiCodeType.reset:
+ throw ArgumentError.value(
+ codes, 'codes', 'Cannot contain reset codes.');
+ case AnsiCodeType.style:
+ // Ignore.
+ break;
+ }
+ }
+
+ final sortedCodes = myCodes.map((ac) => ac.code).toList()..sort();
+ final escapeValue = forScript ? _ansiEscapeForScript : _ansiEscapeLiteral;
+
+ return "$escapeValue[${sortedCodes.join(';')}m$value"
+ '${resetAll._escapeValue(forScript: forScript)}';
+}
+
+//
+// Style values
+//
+
+const styleBold = AnsiCode._('bold', AnsiCodeType.style, 1, resetBold);
+const styleDim = AnsiCode._('dim', AnsiCodeType.style, 2, resetDim);
+const styleItalic = AnsiCode._('italic', AnsiCodeType.style, 3, resetItalic);
+const styleUnderlined =
+ AnsiCode._('underlined', AnsiCodeType.style, 4, resetUnderlined);
+const styleBlink = AnsiCode._('blink', AnsiCodeType.style, 5, resetBlink);
+const styleReverse = AnsiCode._('reverse', AnsiCodeType.style, 7, resetReverse);
+
+/// Not widely supported.
+const styleHidden = AnsiCode._('hidden', AnsiCodeType.style, 8, resetHidden);
+
+/// Not widely supported.
+const styleCrossedOut =
+ AnsiCode._('crossed out', AnsiCodeType.style, 9, resetCrossedOut);
+
+//
+// Reset values
+//
+
+const resetAll = AnsiCode._('all', AnsiCodeType.reset, 0, null);
+
+// NOTE: bold is weird. The reset code seems to be 22 sometimes – not 21
+// See https://gitlab.com/gnachman/iterm2/issues/3208
+const resetBold = AnsiCode._('bold', AnsiCodeType.reset, 22, null);
+const resetDim = AnsiCode._('dim', AnsiCodeType.reset, 22, null);
+const resetItalic = AnsiCode._('italic', AnsiCodeType.reset, 23, null);
+const resetUnderlined = AnsiCode._('underlined', AnsiCodeType.reset, 24, null);
+const resetBlink = AnsiCode._('blink', AnsiCodeType.reset, 25, null);
+const resetReverse = AnsiCode._('reverse', AnsiCodeType.reset, 27, null);
+const resetHidden = AnsiCode._('hidden', AnsiCodeType.reset, 28, null);
+const resetCrossedOut = AnsiCode._('crossed out', AnsiCodeType.reset, 29, null);
+
+//
+// Foreground values
+//
+
+const black = AnsiCode._('black', AnsiCodeType.foreground, 30, resetAll);
+const red = AnsiCode._('red', AnsiCodeType.foreground, 31, resetAll);
+const green = AnsiCode._('green', AnsiCodeType.foreground, 32, resetAll);
+const yellow = AnsiCode._('yellow', AnsiCodeType.foreground, 33, resetAll);
+const blue = AnsiCode._('blue', AnsiCodeType.foreground, 34, resetAll);
+const magenta = AnsiCode._('magenta', AnsiCodeType.foreground, 35, resetAll);
+const cyan = AnsiCode._('cyan', AnsiCodeType.foreground, 36, resetAll);
+const lightGray =
+ AnsiCode._('light gray', AnsiCodeType.foreground, 37, resetAll);
+const defaultForeground =
+ AnsiCode._('default', AnsiCodeType.foreground, 39, resetAll);
+const darkGray = AnsiCode._('dark gray', AnsiCodeType.foreground, 90, resetAll);
+const lightRed = AnsiCode._('light red', AnsiCodeType.foreground, 91, resetAll);
+const lightGreen =
+ AnsiCode._('light green', AnsiCodeType.foreground, 92, resetAll);
+const lightYellow =
+ AnsiCode._('light yellow', AnsiCodeType.foreground, 93, resetAll);
+const lightBlue =
+ AnsiCode._('light blue', AnsiCodeType.foreground, 94, resetAll);
+const lightMagenta =
+ AnsiCode._('light magenta', AnsiCodeType.foreground, 95, resetAll);
+const lightCyan =
+ AnsiCode._('light cyan', AnsiCodeType.foreground, 96, resetAll);
+const white = AnsiCode._('white', AnsiCodeType.foreground, 97, resetAll);
+
+//
+// Background values
+//
+
+const backgroundBlack =
+ AnsiCode._('black', AnsiCodeType.background, 40, resetAll);
+const backgroundRed = AnsiCode._('red', AnsiCodeType.background, 41, resetAll);
+const backgroundGreen =
+ AnsiCode._('green', AnsiCodeType.background, 42, resetAll);
+const backgroundYellow =
+ AnsiCode._('yellow', AnsiCodeType.background, 43, resetAll);
+const backgroundBlue =
+ AnsiCode._('blue', AnsiCodeType.background, 44, resetAll);
+const backgroundMagenta =
+ AnsiCode._('magenta', AnsiCodeType.background, 45, resetAll);
+const backgroundCyan =
+ AnsiCode._('cyan', AnsiCodeType.background, 46, resetAll);
+const backgroundLightGray =
+ AnsiCode._('light gray', AnsiCodeType.background, 47, resetAll);
+const backgroundDefault =
+ AnsiCode._('default', AnsiCodeType.background, 49, resetAll);
+const backgroundDarkGray =
+ AnsiCode._('dark gray', AnsiCodeType.background, 100, resetAll);
+const backgroundLightRed =
+ AnsiCode._('light red', AnsiCodeType.background, 101, resetAll);
+const backgroundLightGreen =
+ AnsiCode._('light green', AnsiCodeType.background, 102, resetAll);
+const backgroundLightYellow =
+ AnsiCode._('light yellow', AnsiCodeType.background, 103, resetAll);
+const backgroundLightBlue =
+ AnsiCode._('light blue', AnsiCodeType.background, 104, resetAll);
+const backgroundLightMagenta =
+ AnsiCode._('light magenta', AnsiCodeType.background, 105, resetAll);
+const backgroundLightCyan =
+ AnsiCode._('light cyan', AnsiCodeType.background, 106, resetAll);
+const backgroundWhite =
+ AnsiCode._('white', AnsiCodeType.background, 107, resetAll);
+
+/// All of the [AnsiCode] values that represent [AnsiCodeType.style].
+const List<AnsiCode> styles = [
+ styleBold,
+ styleDim,
+ styleItalic,
+ styleUnderlined,
+ styleBlink,
+ styleReverse,
+ styleHidden,
+ styleCrossedOut
+];
+
+/// All of the [AnsiCode] values that represent [AnsiCodeType.foreground].
+const List<AnsiCode> foregroundColors = [
+ black,
+ red,
+ green,
+ yellow,
+ blue,
+ magenta,
+ cyan,
+ lightGray,
+ defaultForeground,
+ darkGray,
+ lightRed,
+ lightGreen,
+ lightYellow,
+ lightBlue,
+ lightMagenta,
+ lightCyan,
+ white
+];
+
+/// All of the [AnsiCode] values that represent [AnsiCodeType.background].
+const List<AnsiCode> backgroundColors = [
+ backgroundBlack,
+ backgroundRed,
+ backgroundGreen,
+ backgroundYellow,
+ backgroundBlue,
+ backgroundMagenta,
+ backgroundCyan,
+ backgroundLightGray,
+ backgroundDefault,
+ backgroundDarkGray,
+ backgroundLightRed,
+ backgroundLightGreen,
+ backgroundLightYellow,
+ backgroundLightBlue,
+ backgroundLightMagenta,
+ backgroundLightCyan,
+ backgroundWhite
+];
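
For reviewers, a minimal usage sketch of the ANSI helpers in `ansi_code.dart` (not part of this diff); it only relies on `wrapWith`, `overrideAnsiOutput`, and the color/style constants exported via `package:io/ansi.dart`:

```dart
import 'package:io/ansi.dart';

void main() {
  // Wrap a single value with one code; the matching reset code is appended.
  print(red.wrap('error: build failed'));

  // Combine a foreground color, a background color, and styles in one escape.
  print(wrapWith('important', [blue, backgroundWhite, styleBold]));

  // Force ANSI output on (or off) regardless of terminal support, e.g. when
  // capturing output in tests.
  overrideAnsiOutput(true, () {
    print(green.wrap('always colored'));
  });
}
```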
diff --git a/pkgs/io/lib/src/charcodes.dart b/pkgs/io/lib/src/charcodes.dart
new file mode 100644
index 0000000..4acaf0a
--- /dev/null
+++ b/pkgs/io/lib/src/charcodes.dart
@@ -0,0 +1,34 @@
+// Copyright (c) 2021, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+// Generated using:
+// pub global run charcode \$=dollar \'=single_quote \"=double_quote \
+// \' '\\\n"$`# \t'
+
+/// "Horizontal Tab" control character, common name.
+const int $tab = 0x09;
+
+/// "Line feed" control character.
+const int $lf = 0x0a;
+
+/// Space character.
+const int $space = 0x20;
+
+/// Character `"`, short name.
+const int $doubleQuote = 0x22;
+
+/// Character `#`.
+const int $hash = 0x23;
+
+/// Character `$`.
+const int $dollar = 0x24;
+
+/// Character "'".
+const int $singleQuote = 0x27;
+
+/// Character `\`.
+const int $backslash = 0x5c;
+
+/// Character `` ` ``.
+const int $backquote = 0x60;
diff --git a/pkgs/io/lib/src/copy_path.dart b/pkgs/io/lib/src/copy_path.dart
new file mode 100644
index 0000000..3a999b6
--- /dev/null
+++ b/pkgs/io/lib/src/copy_path.dart
@@ -0,0 +1,69 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:io';
+
+import 'package:path/path.dart' as p;
+
+bool _doNothing(String from, String to) {
+ if (p.canonicalize(from) == p.canonicalize(to)) {
+ return true;
+ }
+ if (p.isWithin(from, to)) {
+ throw ArgumentError('Cannot copy from $from to $to');
+ }
+ return false;
+}
+
+/// Copies all of the files in the [from] directory to [to].
+///
+/// This is similar to `cp -R <from> <to>`:
+/// * Symlinks are supported.
+/// * Existing files are over-written, if any.
+/// * If [to] is within [from], throws [ArgumentError] (an infinite operation).
+/// * If [from] and [to] are canonically the same, no operation occurs.
+///
+/// Returns a future that completes when the copy is complete.
+Future<void> copyPath(String from, String to) async {
+ if (_doNothing(from, to)) {
+ return;
+ }
+ await Directory(to).create(recursive: true);
+ await for (final file in Directory(from).list(recursive: true)) {
+ final copyTo = p.join(to, p.relative(file.path, from: from));
+ if (file is Directory) {
+ await Directory(copyTo).create(recursive: true);
+ } else if (file is File) {
+ await File(file.path).copy(copyTo);
+ } else if (file is Link) {
+ await Link(copyTo).create(await file.target(), recursive: true);
+ }
+ }
+}
+
+/// Copies all of the files in the [from] directory to [to].
+///
+/// This is similar to `cp -R <from> <to>`:
+/// * Symlinks are supported.
+/// * Existing files are over-written, if any.
+/// * If [to] is within [from], throws [ArgumentError] (an infinite operation).
+/// * If [from] and [to] are canonically the same, no operation occurs.
+///
+/// This action is performed synchronously (blocking I/O).
+void copyPathSync(String from, String to) {
+ if (_doNothing(from, to)) {
+ return;
+ }
+ Directory(to).createSync(recursive: true);
+ for (final file in Directory(from).listSync(recursive: true)) {
+ final copyTo = p.join(to, p.relative(file.path, from: from));
+ if (file is Directory) {
+ Directory(copyTo).createSync(recursive: true);
+ } else if (file is File) {
+ File(file.path).copySync(copyTo);
+ } else if (file is Link) {
+ Link(copyTo).createSync(file.targetSync(), recursive: true);
+ }
+ }
+}
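
A small sketch of the copy helpers above (not part of this diff; the `templates` and `build/` paths are placeholders):

```dart
import 'package:io/io.dart';

Future<void> main() async {
  // Recursively copy ./templates into ./build/templates, like `cp -R`.
  await copyPath('templates', 'build/templates');

  // The synchronous variant blocks until every entity has been copied.
  copyPathSync('templates', 'build/templates_sync');
}
```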
diff --git a/pkgs/io/lib/src/exit_code.dart b/pkgs/io/lib/src/exit_code.dart
new file mode 100644
index 0000000..d405558
--- /dev/null
+++ b/pkgs/io/lib/src/exit_code.dart
@@ -0,0 +1,82 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+/// Exit code constants.
+///
+/// [Source](https://www.freebsd.org/cgi/man.cgi?query=sysexits).
+class ExitCode {
+ /// Command completed successfully.
+ static const success = ExitCode._(0, 'success');
+
+ /// Command was used incorrectly.
+ ///
+  /// This may occur if the wrong number of arguments was used, a bad flag was
+  /// passed, or a parameter had bad syntax.
+ static const usage = ExitCode._(64, 'usage');
+
+ /// Input data was used incorrectly.
+ ///
+ /// This should occur only for user data (not system files).
+ static const data = ExitCode._(65, 'data');
+
+ /// An input file (not a system file) did not exist or was not readable.
+ static const noInput = ExitCode._(66, 'noInput');
+
+  /// The user specified did not exist.
+ static const noUser = ExitCode._(67, 'noUser');
+
+  /// The host specified did not exist.
+ static const noHost = ExitCode._(68, 'noHost');
+
+ /// A service is unavailable.
+ ///
+ /// This may occur if a support program or file does not exist. This may also
+ /// be used as a catch-all error when something you wanted to do does not
+ /// work, but you do not know why.
+ static const unavailable = ExitCode._(69, 'unavailable');
+
+ /// An internal software error has been detected.
+ ///
+  /// This should be limited to non-operating-system-related errors where possible.
+ static const software = ExitCode._(70, 'software');
+
+ /// An operating system error has been detected.
+ ///
+  /// This is intended to be used for such things as `cannot fork` or `cannot pipe`.
+ static const osError = ExitCode._(71, 'osError');
+
+ /// Some system file (e.g. `/etc/passwd`) does not exist or could not be read.
+ static const osFile = ExitCode._(72, 'osFile');
+
+ /// A (user specified) output file cannot be created.
+ static const cantCreate = ExitCode._(73, 'cantCreate');
+
+ /// An error occurred doing I/O on some file.
+ static const ioError = ExitCode._(74, 'ioError');
+
+ /// Temporary failure, indicating something is not really an error.
+ ///
+ /// In some cases, this can be re-attempted and will succeed later.
+ static const tempFail = ExitCode._(75, 'tempFail');
+
+ /// You did not have sufficient permissions to perform the operation.
+ ///
+ /// This is not intended for file system problems, which should use [noInput]
+ /// or [cantCreate], but rather for higher-level permissions.
+ static const noPerm = ExitCode._(77, 'noPerm');
+
+ /// Something was found in an unconfigured or misconfigured state.
+ static const config = ExitCode._(78, 'config');
+
+ /// Exit code value.
+ final int code;
+
+ /// Name of the exit code.
+ final String _name;
+
+ const ExitCode._(this.code, this._name);
+
+ @override
+ String toString() => '$_name: $code';
+}
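
A sketch of how the `ExitCode` constants are meant to be used from a command-line tool (not part of this diff; the tool and file names are illustrative):

```dart
import 'dart:io' as io;

import 'package:io/io.dart';

void main(List<String> args) {
  if (args.isEmpty) {
    io.stderr.writeln('usage: tool <input-file>');
    io.exitCode = ExitCode.usage.code;
    return;
  }
  if (!io.File(args.first).existsSync()) {
    io.stderr.writeln('no such file: ${args.first}');
    io.exitCode = ExitCode.noInput.code;
    return;
  }
  // Success is exit code 0.
  io.exitCode = ExitCode.success.code;
}
```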
diff --git a/pkgs/io/lib/src/permissions.dart b/pkgs/io/lib/src/permissions.dart
new file mode 100644
index 0000000..c516943
--- /dev/null
+++ b/pkgs/io/lib/src/permissions.dart
@@ -0,0 +1,69 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+import 'dart:io';
+
+/// What type of permission is granted to a file based on file permission roles.
+enum _FilePermission {
+ execute,
+ // Although these two values are unused, their positions in the enum are
+ // meaningful.
+ write, // ignore: unused_field
+ read, // ignore: unused_field
+ setGid,
+ setUid,
+ sticky,
+}
+
+/// What type of role is assigned to a file.
+enum _FilePermissionRole {
+ world,
+ group,
+ user,
+}
+
+/// Returns whether file [stat] has [permission] for a [role] type.
+bool _hasPermission(
+ FileStat stat,
+ _FilePermission permission, {
+ _FilePermissionRole role = _FilePermissionRole.world,
+}) {
+ final index = _permissionBitIndex(permission, role);
+ return (stat.mode & (1 << index)) != 0;
+}
+
+int _permissionBitIndex(_FilePermission permission, _FilePermissionRole role) =>
+ switch (permission) {
+ _FilePermission.setUid => 11,
+ _FilePermission.setGid => 10,
+ _FilePermission.sticky => 9,
+ _ => (role.index * 3) + permission.index
+ };
+
+/// Returns whether [path] is considered an executable file on this OS.
+///
+/// The [getStat] implementation and whether to treat the platform as Windows
+/// ([isWindows]) may optionally be overridden; when not set, they are derived
+/// from `dart:io`'s `Platform`.
+///
+/// **NOTE**: On Windows this always returns `true`.
+FutureOr<bool> isExecutable(
+ String path, {
+ bool? isWindows,
+ FutureOr<FileStat> Function(String path) getStat = FileStat.stat,
+}) {
+ // Windows has no concept of executable.
+ if (isWindows ?? Platform.isWindows) return true;
+ final stat = getStat(path);
+ if (stat is FileStat) {
+ return _isExecutable(stat);
+ }
+ return stat.then(_isExecutable);
+}
+
+bool _isExecutable(FileStat stat) =>
+ stat.type == FileSystemEntityType.file &&
+ _FilePermissionRole.values.any(
+ (role) => _hasPermission(stat, _FilePermission.execute, role: role));
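
And a sketch of `isExecutable` (not part of this diff; the script path is a placeholder):

```dart
import 'package:io/io.dart';

Future<void> main() async {
  // On Windows this always reports true; elsewhere it checks the execute bits.
  final executable = await isExecutable('tool/build.sh');
  print(executable ? 'can be run directly' : 'run it via `bash tool/build.sh`');
}
```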
diff --git a/pkgs/io/lib/src/process_manager.dart b/pkgs/io/lib/src/process_manager.dart
new file mode 100644
index 0000000..84d22ec
--- /dev/null
+++ b/pkgs/io/lib/src/process_manager.dart
@@ -0,0 +1,255 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+// ignore_for_file: close_sinks,cancel_subscriptions
+
+import 'dart:async';
+import 'dart:io' as io;
+
+import 'package:meta/meta.dart';
+
+import 'shared_stdin.dart';
+
+/// Type definition for both [io.Process.start] and [ProcessManager.spawn].
+///
+/// Useful for taking different implementations of this base functionality.
+typedef StartProcess = Future<io.Process> Function(
+ String executable,
+ List<String> arguments, {
+ String workingDirectory,
+ Map<String, String> environment,
+ bool includeParentEnvironment,
+ bool runInShell,
+ io.ProcessStartMode mode,
+});
+
+/// A high-level abstraction around using and managing processes on the system.
+abstract class ProcessManager {
+ /// Terminates the global `stdin` listener, making future listens impossible.
+ ///
+ /// This method should be invoked only at the _end_ of a program's execution.
+ static Future<void> terminateStdIn() async {
+ await sharedStdIn.terminate();
+ }
+
+ /// Create a new instance of [ProcessManager] for the current platform.
+ ///
+ /// May manually specify whether the current platform [isWindows], otherwise
+ /// this is derived from the Dart runtime (i.e. [io.Platform.isWindows]).
+ factory ProcessManager({
+ Stream<List<int>>? stdin,
+ io.IOSink? stdout,
+ io.IOSink? stderr,
+ bool? isWindows,
+ }) {
+ stdin ??= sharedStdIn;
+ stdout ??= io.stdout;
+ stderr ??= io.stderr;
+ isWindows ??= io.Platform.isWindows;
+ if (isWindows) {
+ return _WindowsProcessManager(stdin, stdout, stderr);
+ }
+ return _UnixProcessManager(stdin, stdout, stderr);
+ }
+
+ final Stream<List<int>> _stdin;
+ final io.IOSink _stdout;
+ final io.IOSink _stderr;
+
+ const ProcessManager._(this._stdin, this._stdout, this._stderr);
+
+ /// Spawns a process by invoking [executable] with [arguments].
+ ///
+ /// This is _similar_ to [io.Process.start], but all standard input and output
+ /// is forwarded/routed between the process and the host, similar to how a
+ /// shell script works.
+ ///
+ /// Returns a future that completes with a handle to the spawned process.
+ Future<io.Process> spawn(
+ String executable,
+ Iterable<String> arguments, {
+ String? workingDirectory,
+ Map<String, String>? environment,
+ bool includeParentEnvironment = true,
+ bool runInShell = false,
+ io.ProcessStartMode mode = io.ProcessStartMode.normal,
+ }) async {
+ final process = io.Process.start(
+ executable,
+ arguments.toList(),
+ workingDirectory: workingDirectory,
+ environment: environment,
+ includeParentEnvironment: includeParentEnvironment,
+ runInShell: runInShell,
+ mode: mode,
+ );
+ return _ForwardingSpawn(await process, _stdin, _stdout, _stderr);
+ }
+
+ /// Spawns a process by invoking [executable] with [arguments].
+ ///
+ /// This is _similar_ to [io.Process.start], but `stdout` and `stderr` is
+ /// forwarded/routed between the process and host, similar to how a shell
+ /// script works.
+ ///
+ /// Returns a future that completes with a handle to the spawned process.
+ Future<io.Process> spawnBackground(
+ String executable,
+ Iterable<String> arguments, {
+ String? workingDirectory,
+ Map<String, String>? environment,
+ bool includeParentEnvironment = true,
+ bool runInShell = false,
+ io.ProcessStartMode mode = io.ProcessStartMode.normal,
+ }) async {
+ final process = io.Process.start(
+ executable,
+ arguments.toList(),
+ workingDirectory: workingDirectory,
+ environment: environment,
+ includeParentEnvironment: includeParentEnvironment,
+ runInShell: runInShell,
+ mode: mode,
+ );
+ return _ForwardingSpawn(
+ await process,
+ const Stream.empty(),
+ _stdout,
+ _stderr,
+ );
+ }
+
+ /// Spawns a process by invoking [executable] with [arguments].
+ ///
+  /// This is _identical_ to [io.Process.start] (no forwarding of I/O).
+ ///
+ /// Returns a future that completes with a handle to the spawned process.
+ Future<io.Process> spawnDetached(
+ String executable,
+ Iterable<String> arguments, {
+ String? workingDirectory,
+ Map<String, String>? environment,
+ bool includeParentEnvironment = true,
+ bool runInShell = false,
+ io.ProcessStartMode mode = io.ProcessStartMode.normal,
+ }) async =>
+ io.Process.start(
+ executable,
+ arguments.toList(),
+ workingDirectory: workingDirectory,
+ environment: environment,
+ includeParentEnvironment: includeParentEnvironment,
+ runInShell: runInShell,
+ mode: mode,
+ );
+}
+
+/// A process instance created and managed through [ProcessManager].
+///
+/// Unlike one created directly by [io.Process.start] or [io.Process.run], a
+/// spawned process works more like executing a command in a shell script.
+class Spawn implements io.Process {
+ final io.Process _delegate;
+
+ Spawn._(this._delegate) {
+ _delegate.exitCode.then((_) => _onClosed());
+ }
+
+ @mustCallSuper
+ void _onClosed() {}
+
+ @override
+ bool kill([io.ProcessSignal signal = io.ProcessSignal.sigterm]) =>
+ _delegate.kill(signal);
+
+ @override
+ Future<int> get exitCode => _delegate.exitCode;
+
+ @override
+ int get pid => _delegate.pid;
+
+ @override
+ Stream<List<int>> get stderr => _delegate.stderr;
+
+ @override
+ io.IOSink get stdin => _delegate.stdin;
+
+ @override
+ Stream<List<int>> get stdout => _delegate.stdout;
+}
+
+/// Forwards `stdin`/`stdout`/`stderr` to/from the host.
+class _ForwardingSpawn extends Spawn {
+ final StreamSubscription<List<int>> _stdInSub;
+ final StreamSubscription<List<int>> _stdOutSub;
+ final StreamSubscription<List<int>> _stdErrSub;
+ final StreamController<List<int>> _stdOut;
+ final StreamController<List<int>> _stdErr;
+
+ factory _ForwardingSpawn(
+ io.Process delegate,
+ Stream<List<int>> stdin,
+ io.IOSink stdout,
+ io.IOSink stderr,
+ ) {
+ final stdoutSelf = StreamController<List<int>>();
+ final stderrSelf = StreamController<List<int>>();
+ final stdInSub = stdin.listen(delegate.stdin.add);
+ final stdOutSub = delegate.stdout.listen((event) {
+ stdout.add(event);
+ stdoutSelf.add(event);
+ });
+ final stdErrSub = delegate.stderr.listen((event) {
+ stderr.add(event);
+ stderrSelf.add(event);
+ });
+ return _ForwardingSpawn._delegate(
+ delegate,
+ stdInSub,
+ stdOutSub,
+ stdErrSub,
+ stdoutSelf,
+ stderrSelf,
+ );
+ }
+
+ _ForwardingSpawn._delegate(
+ super.delegate,
+ this._stdInSub,
+ this._stdOutSub,
+ this._stdErrSub,
+ this._stdOut,
+ this._stdErr,
+ ) : super._();
+
+ @override
+ void _onClosed() {
+ _stdInSub.cancel();
+ _stdOutSub.cancel();
+ _stdErrSub.cancel();
+ super._onClosed();
+ }
+
+ @override
+ Stream<List<int>> get stdout => _stdOut.stream;
+
+ @override
+ Stream<List<int>> get stderr => _stdErr.stream;
+}
+
+class _UnixProcessManager extends ProcessManager {
+ const _UnixProcessManager(
+ super.stdin,
+ super.stdout,
+ super.stderr,
+ ) : super._();
+}
+
+class _WindowsProcessManager extends ProcessManager {
+ const _WindowsProcessManager(
+ super.stdin,
+ super.stdout,
+ super.stderr,
+ ) : super._();
+}
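
A minimal sketch of spawning a process with forwarded stdio via `ProcessManager` (not part of this diff):

```dart
import 'package:io/io.dart';

Future<void> main() async {
  final manager = ProcessManager();
  // The child's stdin/stdout/stderr are forwarded to this process, similar to
  // running a command from a shell script.
  final process = await manager.spawn('dart', ['--version']);
  print('dart exited with ${await process.exitCode}');
  // Release the shared stdin listener so the VM can exit.
  await ProcessManager.terminateStdIn();
}
```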
diff --git a/pkgs/io/lib/src/shared_stdin.dart b/pkgs/io/lib/src/shared_stdin.dart
new file mode 100644
index 0000000..72bb50c
--- /dev/null
+++ b/pkgs/io/lib/src/shared_stdin.dart
@@ -0,0 +1,99 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+import 'dart:convert';
+import 'dart:io';
+
+import 'package:meta/meta.dart';
+
+/// A shared singleton instance of `dart:io`'s [stdin] stream.
+///
+/// _Unlike_ the normal [stdin] stream, [sharedStdIn] may switch subscribers
+/// as long as the previous subscriber cancels before the new subscriber starts
+/// listening.
+///
+/// [SharedStdIn.terminate] *must* be invoked in order to close the underlying
+/// connection to [stdin], allowing your program to close automatically without
+/// hanging.
+final SharedStdIn sharedStdIn = SharedStdIn(stdin);
+
+/// A singleton wrapper around `stdin` that allows new subscribers.
+///
+/// This class is visible in order to be used as a test harness for mock
+/// implementations of `stdin`. In normal programs, [sharedStdIn] should be
+/// used directly.
+@visibleForTesting
+class SharedStdIn extends Stream<List<int>> {
+ StreamController<List<int>>? _current;
+ StreamSubscription<List<int>>? _sub;
+
+ SharedStdIn([Stream<List<int>>? stream]) {
+ _sub = (stream ??= stdin).listen(_onInput);
+ }
+
+ /// Returns a future that completes with the next line.
+ ///
+ /// This is similar to the standard [Stdin.readLineSync], but asynchronous.
+ Future<String> nextLine({Encoding encoding = systemEncoding}) =>
+ lines(encoding: encoding).first;
+
+ /// Returns the stream transformed as UTF8 strings separated by line breaks.
+ ///
+ /// This is similar to synchronous code using [Stdin.readLineSync]:
+ /// ```dart
+ /// while (true) {
+ /// var line = stdin.readLineSync();
+ /// // ...
+ /// }
+ /// ```
+ ///
+ /// ... but asynchronous.
+ Stream<String> lines({Encoding encoding = systemEncoding}) =>
+ transform(utf8.decoder).transform(const LineSplitter());
+
+ void _onInput(List<int> event) => _getCurrent().add(event);
+
+ StreamController<List<int>> _getCurrent() =>
+ _current ??= StreamController<List<int>>(
+ onCancel: () {
+ _current = null;
+ },
+ sync: true);
+
+ @override
+ StreamSubscription<List<int>> listen(
+ void Function(List<int> event)? onData, {
+ Function? onError,
+ void Function()? onDone,
+ bool? cancelOnError,
+ }) {
+ if (_sub == null) {
+ throw StateError('Stdin has already been terminated.');
+ }
+ // ignore: close_sinks
+ final controller = _getCurrent();
+ if (controller.hasListener) {
+ throw StateError(''
+ 'Subscriber already listening. The existing subscriber must cancel '
+ 'before another may be added.');
+ }
+ return controller.stream.listen(
+ onData,
+ onDone: onDone,
+ onError: onError,
+ cancelOnError: cancelOnError,
+ );
+ }
+
+  /// Terminates the connection to `stdin`, closing all subscriptions.
+ Future<void> terminate() async {
+ if (_sub == null) {
+ throw StateError('Stdin has already been terminated.');
+ }
+ await _sub?.cancel();
+ await _current?.close();
+ _sub = null;
+ }
+}
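
A sketch of reading lines through the shared stdin singleton (not part of this diff):

```dart
import 'package:io/io.dart';

Future<void> main() async {
  print('What is your name?');
  final name = await sharedStdIn.nextLine();
  print('Hello, $name!');
  // Close the underlying stdin subscription so the program can exit.
  await sharedStdIn.terminate();
}
```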
diff --git a/pkgs/io/lib/src/shell_words.dart b/pkgs/io/lib/src/shell_words.dart
new file mode 100644
index 0000000..5fca6d9
--- /dev/null
+++ b/pkgs/io/lib/src/shell_words.dart
@@ -0,0 +1,142 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+// ignore_for_file: comment_references
+
+import 'package:string_scanner/string_scanner.dart';
+
+import 'charcodes.dart';
+
+/// Splits [command] into tokens according to [the POSIX shell
+/// specification][spec].
+///
+/// [spec]: http://pubs.opengroup.org/onlinepubs/9699919799/utilities/contents.html
+///
+/// This returns the unquoted values of quoted tokens. For example,
+/// `shellSplit('foo "bar baz"')` returns `["foo", "bar baz"]`. It does not
+/// currently support here-documents. It does *not* treat dynamic features such
+/// as parameter expansion specially. For example, `shellSplit("foo $(bar
+/// baz)")` returns `["foo", "$(bar", "baz)"]`.
+///
+/// This will discard any comments at the end of [command].
+///
+/// Throws a [FormatException] if [command] isn't a valid shell command.
+List<String> shellSplit(String command) {
+ final scanner = StringScanner(command);
+ final results = <String>[];
+ final token = StringBuffer();
+
+ // Whether a token is being parsed, as opposed to a separator character. This
+ // is different than just [token.isEmpty], because empty quoted tokens can
+ // exist.
+ var hasToken = false;
+
+ while (!scanner.isDone) {
+ final next = scanner.readChar();
+ switch (next) {
+ case $backslash:
+ // Section 2.2.1: A <backslash> that is not quoted shall preserve the
+ // literal value of the following character, with the exception of a
+ // <newline>. If a <newline> follows the <backslash>, the shell shall
+ // interpret this as line continuation. The <backslash> and <newline>
+ // shall be removed before splitting the input into tokens. Since the
+ // escaped <newline> is removed entirely from the input and is not
+ // replaced by any white space, it cannot serve as a token separator.
+ if (scanner.scanChar($lf)) break;
+
+ hasToken = true;
+ token.writeCharCode(scanner.readChar());
+
+ case $singleQuote:
+ hasToken = true;
+ // Section 2.2.2: Enclosing characters in single-quotes ( '' ) shall
+ // preserve the literal value of each character within the
+ // single-quotes. A single-quote cannot occur within single-quotes.
+ final firstQuote = scanner.position - 1;
+ while (!scanner.scanChar($singleQuote)) {
+ _checkUnmatchedQuote(scanner, firstQuote);
+ token.writeCharCode(scanner.readChar());
+ }
+
+ case $doubleQuote:
+ hasToken = true;
+ // Section 2.2.3: Enclosing characters in double-quotes ( "" ) shall
+ // preserve the literal value of all characters within the
+ // double-quotes, with the exception of the characters backquote,
+ // <dollar-sign>, and <backslash>.
+ //
+ // (Note that this code doesn't preserve special behavior of backquote
+ // or dollar sign within double quotes, since those are dynamic
+ // features.)
+ final firstQuote = scanner.position - 1;
+ while (!scanner.scanChar($doubleQuote)) {
+ _checkUnmatchedQuote(scanner, firstQuote);
+
+ if (scanner.scanChar($backslash)) {
+ _checkUnmatchedQuote(scanner, firstQuote);
+
+ // The <backslash> shall retain its special meaning as an escape
+ // character (see Escape Character (Backslash)) only when followed
+ // by one of the following characters when considered special:
+ //
+ // $ ` " \ <newline>
+ final next = scanner.readChar();
+ if (next == $lf) continue;
+ if (next == $dollar ||
+ next == $backquote ||
+ next == $doubleQuote ||
+ next == $backslash) {
+ token.writeCharCode(next);
+ } else {
+ token
+ ..writeCharCode($backslash)
+ ..writeCharCode(next);
+ }
+ } else {
+ token.writeCharCode(scanner.readChar());
+ }
+ }
+
+ case $hash:
+ // Section 2.3: If the current character is a '#' [and the previous
+        // character was not part of a word], it and all subsequent characters
+ // up to, but excluding, the next <newline> shall be discarded as a
+ // comment. The <newline> that ends the line is not considered part of
+ // the comment.
+ if (hasToken) {
+ token.writeCharCode($hash);
+ break;
+ }
+
+ while (!scanner.isDone && scanner.peekChar() != $lf) {
+ scanner.readChar();
+ }
+
+ case $space:
+ case $tab:
+ case $lf:
+ // ignore: invariant_booleans
+ if (hasToken) results.add(token.toString());
+ hasToken = false;
+ token.clear();
+
+ default:
+ hasToken = true;
+ token.writeCharCode(next);
+ }
+ }
+
+ if (hasToken) results.add(token.toString());
+ return results;
+}
+
+/// Throws a [FormatException] if [scanner] is done indicating that a closing
+/// quote matching the one at position [openingQuote] is missing.
+void _checkUnmatchedQuote(StringScanner scanner, int openingQuote) {
+ if (!scanner.isDone) return;
+ final type = scanner.substring(openingQuote, openingQuote + 1) == '"'
+ ? 'double'
+ : 'single';
+ scanner.error('Unmatched $type quote.', position: openingQuote, length: 1);
+}
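
A sketch of `shellSplit` (not part of this diff; the command line is illustrative):

```dart
import 'package:io/io.dart';

void main() {
  final words = shellSplit(r'dart format --output=none "lib/my app" # dry run');
  // Prints [dart, format, --output=none, lib/my app]; quoting is honored and
  // the trailing comment is discarded.
  print(words);
}
```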
diff --git a/pkgs/io/pubspec.yaml b/pkgs/io/pubspec.yaml
new file mode 100644
index 0000000..7e00d99
--- /dev/null
+++ b/pkgs/io/pubspec.yaml
@@ -0,0 +1,19 @@
+name: io
+description: >-
+ Utilities for the Dart VM Runtime including support for ANSI colors, file
+ copying, and standard exit code values.
+version: 1.0.5
+repository: https://github.com/dart-lang/tools/tree/main/pkgs/io
+
+environment:
+ sdk: ^3.4.0
+
+dependencies:
+ meta: ^1.3.0
+ path: ^1.8.0
+ string_scanner: ^1.1.0
+
+dev_dependencies:
+ dart_flutter_team_lints: ^3.0.0
+ test: ^1.16.6
+ test_descriptor: ^2.0.0
diff --git a/pkgs/io/test/_files/is_executable.sh b/pkgs/io/test/_files/is_executable.sh
new file mode 100755
index 0000000..f1f641a
--- /dev/null
+++ b/pkgs/io/test/_files/is_executable.sh
@@ -0,0 +1 @@
+#!/usr/bin/env bash
diff --git a/pkgs/io/test/_files/is_not_executable.sh b/pkgs/io/test/_files/is_not_executable.sh
new file mode 100644
index 0000000..f1f641a
--- /dev/null
+++ b/pkgs/io/test/_files/is_not_executable.sh
@@ -0,0 +1 @@
+#!/usr/bin/env bash
diff --git a/pkgs/io/test/_files/stderr_hello.dart b/pkgs/io/test/_files/stderr_hello.dart
new file mode 100644
index 0000000..ac7a7d3
--- /dev/null
+++ b/pkgs/io/test/_files/stderr_hello.dart
@@ -0,0 +1,7 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:io';
+
+void main() => stderr.write('Hello');
diff --git a/pkgs/io/test/_files/stdin_echo.dart b/pkgs/io/test/_files/stdin_echo.dart
new file mode 100644
index 0000000..256e0ee
--- /dev/null
+++ b/pkgs/io/test/_files/stdin_echo.dart
@@ -0,0 +1,7 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:io';
+
+void main() => stdout.writeln('You said: ${stdin.readLineSync()}');
diff --git a/pkgs/io/test/_files/stdout_hello.dart b/pkgs/io/test/_files/stdout_hello.dart
new file mode 100644
index 0000000..af3bf51
--- /dev/null
+++ b/pkgs/io/test/_files/stdout_hello.dart
@@ -0,0 +1,7 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:io';
+
+void main() => stdout.write('Hello');
diff --git a/pkgs/io/test/ansi_code_test.dart b/pkgs/io/test/ansi_code_test.dart
new file mode 100644
index 0000000..98ae68b
--- /dev/null
+++ b/pkgs/io/test/ansi_code_test.dart
@@ -0,0 +1,187 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+@TestOn('vm')
+library;
+
+import 'dart:io';
+
+import 'package:io/ansi.dart';
+import 'package:test/test.dart';
+
+const _ansiEscapeLiteral = '\x1B';
+const _ansiEscapeForScript = r'\033';
+const sampleInput = 'sample input';
+
+void main() {
+ group('ansiOutputEnabled', () {
+ test('default value matches dart:io', () {
+ expect(ansiOutputEnabled,
+ stdout.supportsAnsiEscapes && stderr.supportsAnsiEscapes);
+ });
+
+ test('override true', () {
+ overrideAnsiOutput(true, () {
+ expect(ansiOutputEnabled, isTrue);
+ });
+ });
+
+ test('override false', () {
+ overrideAnsiOutput(false, () {
+ expect(ansiOutputEnabled, isFalse);
+ });
+ });
+
+    test('forScript variants ignore `ansiOutputEnabled`', () {
+ const expected =
+ '$_ansiEscapeForScript[34m$sampleInput$_ansiEscapeForScript[0m';
+
+ for (var override in [true, false]) {
+ overrideAnsiOutput(override, () {
+ expect(blue.escapeForScript, '$_ansiEscapeForScript[34m');
+ expect(blue.wrap(sampleInput, forScript: true), expected);
+ expect(wrapWith(sampleInput, [blue], forScript: true), expected);
+ });
+ }
+ });
+ });
+
+ test('foreground and background colors match', () {
+ expect(foregroundColors, hasLength(backgroundColors.length));
+
+ for (var i = 0; i < foregroundColors.length; i++) {
+ final foreground = foregroundColors[i];
+ expect(foreground.type, AnsiCodeType.foreground);
+ expect(foreground.name.toLowerCase(), foreground.name,
+ reason: 'All names should be lower case');
+ final background = backgroundColors[i];
+ expect(background.type, AnsiCodeType.background);
+ expect(background.name.toLowerCase(), background.name,
+ reason: 'All names should be lower case');
+
+ expect(foreground.name, background.name);
+
+ // The last base-10 digit also matches – good to sanity check
+ expect(foreground.code % 10, background.code % 10);
+ }
+ });
+
+ test('all styles are styles', () {
+ for (var style in styles) {
+ expect(style.type, AnsiCodeType.style);
+ expect(style.name.toLowerCase(), style.name,
+ reason: 'All names should be lower case');
+ if (style == styleBold) {
+ expect(style.reset, resetBold);
+ } else {
+ expect(style.reset!.code, equals(style.code + 20));
+ }
+ expect(style.name, equals(style.reset!.name));
+ }
+ });
+
+ for (var forScript in [true, false]) {
+ group(forScript ? 'forScript' : 'escaped', () {
+ final escapeLiteral =
+ forScript ? _ansiEscapeForScript : _ansiEscapeLiteral;
+
+ group('wrap', () {
+ _test('color', () {
+ final expected = '$escapeLiteral[34m$sampleInput$escapeLiteral[0m';
+
+ expect(blue.wrap(sampleInput, forScript: forScript), expected);
+ });
+
+ _test('style', () {
+ final expected = '$escapeLiteral[1m$sampleInput$escapeLiteral[22m';
+
+ expect(styleBold.wrap(sampleInput, forScript: forScript), expected);
+ });
+
+ _test('style', () {
+ final expected = '$escapeLiteral[34m$sampleInput$escapeLiteral[0m';
+
+ expect(blue.wrap(sampleInput, forScript: forScript), expected);
+ });
+
+ test('empty', () {
+ expect(blue.wrap('', forScript: forScript), '');
+ });
+
+ test(null, () {
+ expect(blue.wrap(null, forScript: forScript), isNull);
+ });
+ });
+
+ group('wrapWith', () {
+ _test('foreground', () {
+ final expected = '$escapeLiteral[34m$sampleInput$escapeLiteral[0m';
+
+ expect(wrapWith(sampleInput, [blue], forScript: forScript), expected);
+ });
+
+ _test('background', () {
+ final expected = '$escapeLiteral[44m$sampleInput$escapeLiteral[0m';
+
+ expect(wrapWith(sampleInput, [backgroundBlue], forScript: forScript),
+ expected);
+ });
+
+ _test('style', () {
+ final expected = '$escapeLiteral[1m$sampleInput$escapeLiteral[0m';
+
+ expect(wrapWith(sampleInput, [styleBold], forScript: forScript),
+ expected);
+ });
+
+ _test('2 styles', () {
+ final expected = '$escapeLiteral[1;3m$sampleInput$escapeLiteral[0m';
+
+ expect(
+ wrapWith(sampleInput, [styleBold, styleItalic],
+ forScript: forScript),
+ expected);
+ });
+
+ _test('2 foregrounds', () {
+ expect(
+ () => wrapWith(sampleInput, [blue, white], forScript: forScript),
+ throwsArgumentError);
+ });
+
+ _test('multi', () {
+ final expected =
+ '$escapeLiteral[1;4;34;107m$sampleInput$escapeLiteral[0m';
+
+ expect(
+ wrapWith(sampleInput,
+ [blue, backgroundWhite, styleBold, styleUnderlined],
+ forScript: forScript),
+ expected);
+ });
+
+ test('no codes', () {
+ expect(wrapWith(sampleInput, []), sampleInput);
+ });
+
+ _test('empty', () {
+ expect(
+ wrapWith('', [blue, backgroundWhite, styleBold],
+ forScript: forScript),
+ '');
+ });
+
+ _test('null', () {
+ expect(
+ wrapWith(null, [blue, backgroundWhite, styleBold],
+ forScript: forScript),
+ isNull);
+ });
+ });
+ });
+ }
+}
+
+void _test<T>(String name, T Function() body) =>
+ test(name, () => overrideAnsiOutput<T>(true, body));
diff --git a/pkgs/io/test/copy_path_test.dart b/pkgs/io/test/copy_path_test.dart
new file mode 100644
index 0000000..fd1e9ce
--- /dev/null
+++ b/pkgs/io/test/copy_path_test.dart
@@ -0,0 +1,45 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+@TestOn('vm')
+library;
+
+import 'package:io/io.dart';
+import 'package:path/path.dart' as p;
+import 'package:test/test.dart';
+import 'package:test_descriptor/test_descriptor.dart' as d;
+
+void main() {
+ test('should copy a directory (async)', () async {
+ await _create();
+ await copyPath(p.join(d.sandbox, 'parent'), p.join(d.sandbox, 'copy'));
+ await _validate();
+ });
+
+ test('should copy a directory (sync)', () async {
+ await _create();
+ copyPathSync(p.join(d.sandbox, 'parent'), p.join(d.sandbox, 'copy'));
+ await _validate();
+ });
+
+ test('should catch an infinite operation', () async {
+ await _create();
+ expect(
+ copyPath(
+ p.join(d.sandbox, 'parent'),
+ p.join(d.sandbox, 'parent', 'child'),
+ ),
+ throwsArgumentError,
+ );
+ });
+}
+
+d.DirectoryDescriptor _struct() => d.dir('parent', [
+ d.dir('child', [
+ d.file('foo.txt'),
+ ]),
+ ]);
+
+Future<void> _create() => _struct().create();
+Future<void> _validate() => _struct().validate();
diff --git a/pkgs/io/test/permissions_test.dart b/pkgs/io/test/permissions_test.dart
new file mode 100644
index 0000000..478e8df
--- /dev/null
+++ b/pkgs/io/test/permissions_test.dart
@@ -0,0 +1,37 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+@TestOn('vm')
+library;
+
+import 'package:io/io.dart';
+import 'package:test/test.dart';
+
+void main() {
+ group('isExecutable', () {
+ const files = 'test/_files';
+ const shellIsExec = '$files/is_executable.sh';
+ const shellNotExec = '$files/is_not_executable.sh';
+
+ group('on shell scripts', () {
+ test('should return true for "is_executable.sh"', () async {
+ expect(await isExecutable(shellIsExec), isTrue);
+ });
+
+ test('should return false for "is_not_executable.sh"', () async {
+ expect(await isExecutable(shellNotExec), isFalse);
+ });
+ }, testOn: '!windows');
+
+ group('on shell scripts [windows]', () {
+ test('should return true for "is_executable.sh"', () async {
+ expect(await isExecutable(shellIsExec, isWindows: true), isTrue);
+ });
+
+ test('should return true for "is_not_executable.sh"', () async {
+ expect(await isExecutable(shellNotExec, isWindows: true), isTrue);
+ });
+ });
+ });
+}
diff --git a/pkgs/io/test/process_manager_test.dart b/pkgs/io/test/process_manager_test.dart
new file mode 100644
index 0000000..9871a77
--- /dev/null
+++ b/pkgs/io/test/process_manager_test.dart
@@ -0,0 +1,100 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+// ignore_for_file: close_sinks
+
+import 'dart:async';
+import 'dart:convert';
+import 'dart:io';
+
+import 'package:io/io.dart' hide sharedStdIn;
+import 'package:path/path.dart' as p;
+import 'package:test/test.dart';
+
+void main() {
+ StreamController<String> fakeStdIn;
+ late ProcessManager processManager;
+ SharedStdIn sharedStdIn;
+ late List<String> stdoutLog;
+ late List<String> stderrLog;
+
+ test('spawn functions should match the type definition of Process.start', () {
+ const isStartProcess = TypeMatcher<StartProcess>();
+ expect(Process.start, isStartProcess);
+ final manager = ProcessManager();
+ expect(manager.spawn, isStartProcess);
+ expect(manager.spawnBackground, isStartProcess);
+ expect(manager.spawnDetached, isStartProcess);
+ });
+
+ group('spawn', () {
+ setUp(() async {
+ fakeStdIn = StreamController<String>(sync: true);
+ sharedStdIn = SharedStdIn(fakeStdIn.stream.map((s) => s.codeUnits));
+ stdoutLog = <String>[];
+ stderrLog = <String>[];
+
+ final stdoutController = StreamController<List<int>>(sync: true);
+ stdoutController.stream.map(utf8.decode).listen(stdoutLog.add);
+ final stdout = IOSink(stdoutController);
+ final stderrController = StreamController<List<int>>(sync: true);
+ stderrController.stream.map(utf8.decode).listen(stderrLog.add);
+ final stderr = IOSink(stderrController);
+
+ processManager = ProcessManager(
+ stdin: sharedStdIn,
+ stdout: stdout,
+ stderr: stderr,
+ );
+ });
+
+ final dart = Platform.executable;
+
+ test('should output Hello from another process [via stdout]', () async {
+ final spawn = await processManager.spawn(
+ dart,
+ [p.join('test', '_files', 'stdout_hello.dart')],
+ );
+ await spawn.exitCode;
+ expect(stdoutLog, ['Hello']);
+ });
+
+ test('should output Hello from another process [via stderr]', () async {
+ final spawn = await processManager.spawn(
+ dart,
+ [p.join('test', '_files', 'stderr_hello.dart')],
+ );
+ await spawn.exitCode;
+ expect(stderrLog, ['Hello']);
+ });
+
+ test('should forward stdin to another process', () async {
+ final spawn = await processManager.spawn(
+ dart,
+ [p.join('test', '_files', 'stdin_echo.dart')],
+ );
+ spawn.stdin.writeln('Ping');
+ await spawn.exitCode;
+ expect(stdoutLog.join(), contains('You said: Ping'));
+ });
+
+ group('should return a Process where', () {
+ test('.stdout is readable', () async {
+ final spawn = await processManager.spawn(
+ dart,
+ [p.join('test', '_files', 'stdout_hello.dart')],
+ );
+ expect(await spawn.stdout.transform(utf8.decoder).first, 'Hello');
+ });
+
+ test('.stderr is readable', () async {
+ final spawn = await processManager.spawn(
+ dart,
+ [p.join('test', '_files', 'stderr_hello.dart')],
+ );
+ expect(await spawn.stderr.transform(utf8.decoder).first, 'Hello');
+ });
+ });
+ });
+}
diff --git a/pkgs/io/test/shared_stdin_test.dart b/pkgs/io/test/shared_stdin_test.dart
new file mode 100644
index 0000000..71629ec
--- /dev/null
+++ b/pkgs/io/test/shared_stdin_test.dart
@@ -0,0 +1,80 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+import 'dart:convert';
+
+import 'package:io/io.dart' hide sharedStdIn;
+import 'package:test/test.dart';
+
+void main() {
+ // ignore: close_sinks
+ late StreamController<String> fakeStdIn;
+ late SharedStdIn sharedStdIn;
+
+ setUp(() {
+ fakeStdIn = StreamController<String>(sync: true);
+ sharedStdIn = SharedStdIn(fakeStdIn.stream.map((s) => s.codeUnits));
+ });
+
+ test('should allow a single subscriber', () async {
+ final logs = <String>[];
+ final sub = sharedStdIn.transform(utf8.decoder).listen(logs.add);
+ fakeStdIn.add('Hello World');
+ await sub.cancel();
+ expect(logs, ['Hello World']);
+ });
+
+ test('should allow multiple subscribers', () async {
+ final logs = <String>[];
+ final asUtf8 = sharedStdIn.transform(utf8.decoder);
+ var sub = asUtf8.listen(logs.add);
+ fakeStdIn.add('Hello World');
+ await sub.cancel();
+ sub = asUtf8.listen(logs.add);
+ fakeStdIn.add('Goodbye World');
+ await sub.cancel();
+ expect(logs, ['Hello World', 'Goodbye World']);
+ });
+
+ test('should throw if a subscriber is still active', () async {
+ final active = sharedStdIn.listen((_) {});
+ expect(() => sharedStdIn.listen((_) {}), throwsStateError);
+ await active.cancel();
+ expect(() => sharedStdIn.listen((_) {}), returnsNormally);
+ });
+
+ test('should return a stream of lines', () async {
+ expect(
+ sharedStdIn.lines(),
+ emitsInOrder(<dynamic>[
+ 'I',
+ 'Think',
+ 'Therefore',
+ 'I',
+ 'Am',
+ ]),
+ );
+ [
+ 'I\nThink\n',
+ 'Therefore\n',
+ 'I\n',
+ 'Am\n',
+ ].forEach(fakeStdIn.add);
+ });
+
+ test('should return the next line', () {
+ expect(sharedStdIn.nextLine(), completion('Hello World'));
+ fakeStdIn.add('Hello World\n');
+ });
+
+ test('should allow listening for new lines multiple times', () async {
+ expect(sharedStdIn.nextLine(), completion('Hello World'));
+ fakeStdIn.add('Hello World\n');
+ await Future<void>.value();
+
+ expect(sharedStdIn.nextLine(), completion('Hello World'));
+ fakeStdIn.add('Hello World\n');
+ });
+}
diff --git a/pkgs/io/test/shell_words_test.dart b/pkgs/io/test/shell_words_test.dart
new file mode 100644
index 0000000..dc4441c
--- /dev/null
+++ b/pkgs/io/test/shell_words_test.dart
@@ -0,0 +1,185 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:io/io.dart';
+import 'package:test/test.dart';
+
+void main() {
+ group('shellSplit()', () {
+ group('returns an empty list for', () {
+ test('an empty string', () {
+ expect(shellSplit(''), isEmpty);
+ });
+
+ test('spaces', () {
+ expect(shellSplit(' '), isEmpty);
+ });
+
+ test('tabs', () {
+ expect(shellSplit('\t\t\t'), isEmpty);
+ });
+
+ test('newlines', () {
+ expect(shellSplit('\n\n\n'), isEmpty);
+ });
+
+ test('a comment', () {
+ expect(shellSplit('#foo bar baz'), isEmpty);
+ });
+
+ test('a mix', () {
+ expect(shellSplit(' \t\n# foo'), isEmpty);
+ });
+ });
+
+ group('parses unquoted', () {
+ test('a single token', () {
+ expect(shellSplit('foo'), equals(['foo']));
+ });
+
+ test('multiple tokens', () {
+ expect(shellSplit('foo bar baz'), equals(['foo', 'bar', 'baz']));
+ });
+
+ test('tokens separated by tabs', () {
+ expect(shellSplit('foo\tbar\tbaz'), equals(['foo', 'bar', 'baz']));
+ });
+
+ test('tokens separated by newlines', () {
+ expect(shellSplit('foo\nbar\nbaz'), equals(['foo', 'bar', 'baz']));
+ });
+
+ test('a token after whitespace', () {
+ expect(shellSplit(' \t\nfoo'), equals(['foo']));
+ });
+
+ test('a token before whitespace', () {
+ expect(shellSplit('foo \t\n'), equals(['foo']));
+ });
+
+ test('a token with a hash', () {
+ expect(shellSplit('foo#bar'), equals(['foo#bar']));
+ });
+
+ test('a token before a comment', () {
+ expect(shellSplit('foo #bar'), equals(['foo']));
+ });
+
+ test('dynamic shell features', () {
+ expect(
+ shellSplit(r'foo $(bar baz)'), equals(['foo', r'$(bar', 'baz)']));
+ expect(shellSplit('foo `bar baz`'), equals(['foo', '`bar', 'baz`']));
+ expect(shellSplit(r'foo $bar | baz'),
+ equals(['foo', r'$bar', '|', 'baz']));
+ });
+ });
+
+ group('parses a backslash', () {
+ test('before a normal character', () {
+ expect(shellSplit(r'foo\bar'), equals(['foobar']));
+ });
+
+ test('before a dynamic shell feature', () {
+ expect(shellSplit(r'foo\$bar'), equals([r'foo$bar']));
+ });
+
+ test('before a single quote', () {
+ expect(shellSplit(r"foo\'bar"), equals(["foo'bar"]));
+ });
+
+ test('before a double quote', () {
+ expect(shellSplit(r'foo\"bar'), equals(['foo"bar']));
+ });
+
+ test('before a space', () {
+ expect(shellSplit(r'foo\ bar'), equals(['foo bar']));
+ });
+
+ test('at the beginning of a token', () {
+ expect(shellSplit(r'\ foo'), equals([' foo']));
+ });
+
+ test('before whitespace followed by a hash', () {
+ expect(shellSplit(r'\ #foo'), equals([' #foo']));
+ });
+
+ test('before a newline in a token', () {
+ expect(shellSplit('foo\\\nbar'), equals(['foobar']));
+ });
+
+ test('before a newline outside a token', () {
+ expect(shellSplit('foo \\\n bar'), equals(['foo', 'bar']));
+ });
+
+ test('before a backslash', () {
+ expect(shellSplit(r'foo\\bar'), equals([r'foo\bar']));
+ });
+ });
+
+ group('parses single quotes', () {
+ test('that are empty', () {
+ expect(shellSplit("''"), equals(['']));
+ });
+
+ test('that contain normal characters', () {
+ expect(shellSplit("'foo'"), equals(['foo']));
+ });
+
+ test('that contain active characters', () {
+ expect(shellSplit("'\" \\#'"), equals([r'" \#']));
+ });
+
+ test('before a hash', () {
+ expect(shellSplit("''#foo"), equals([r'#foo']));
+ });
+
+ test('inside a token', () {
+ expect(shellSplit("foo'bar baz'qux"), equals([r'foobar bazqux']));
+ });
+
+ test('without a closing quote', () {
+ expect(() => shellSplit("'foo bar"), throwsFormatException);
+ });
+ });
+
+ group('parses double quotes', () {
+ test('that are empty', () {
+ expect(shellSplit('""'), equals(['']));
+ });
+
+ test('that contain normal characters', () {
+ expect(shellSplit('"foo"'), equals(['foo']));
+ });
+
+ test('that contain otherwise-active characters', () {
+ expect(shellSplit('"\' #"'), equals(["' #"]));
+ });
+
+ test('that contain escaped characters', () {
+ expect(shellSplit(r'"\$\`\"\\"'), equals([r'$`"\']));
+ });
+
+ test('that contain an escaped newline', () {
+ expect(shellSplit('"\\\n"'), equals(['']));
+ });
+
+ test("that contain a backslash that's not an escape", () {
+ expect(shellSplit(r'"f\oo"'), equals([r'f\oo']));
+ });
+
+ test('before a hash', () {
+ expect(shellSplit('""#foo'), equals([r'#foo']));
+ });
+
+ test('inside a token', () {
+ expect(shellSplit('foo"bar baz"qux'), equals([r'foobar bazqux']));
+ });
+
+ test('without a closing quote', () {
+ expect(() => shellSplit('"foo bar'), throwsFormatException);
+ expect(() => shellSplit(r'"foo bar\'), throwsFormatException);
+ });
+ });
+ });
+}
diff --git a/pkgs/package_config/.gitignore b/pkgs/package_config/.gitignore
new file mode 100644
index 0000000..7b888b8
--- /dev/null
+++ b/pkgs/package_config/.gitignore
@@ -0,0 +1,7 @@
+.packages
+.pub
+.dart_tool/
+.vscode/
+packages
+pubspec.lock
+doc/api/
diff --git a/pkgs/package_config/AUTHORS b/pkgs/package_config/AUTHORS
new file mode 100644
index 0000000..e8063a8
--- /dev/null
+++ b/pkgs/package_config/AUTHORS
@@ -0,0 +1,6 @@
+# Below is a list of people and organizations that have contributed
+# to the project. Names should be added to the list like so:
+#
+# Name/Organization <email address>
+
+Google Inc.
diff --git a/pkgs/package_config/CHANGELOG.md b/pkgs/package_config/CHANGELOG.md
new file mode 100644
index 0000000..101a0fe
--- /dev/null
+++ b/pkgs/package_config/CHANGELOG.md
@@ -0,0 +1,108 @@
+## 2.1.1
+
+- Require Dart 3.4
+- Move to `dart-lang/tools` monorepo.
+
+## 2.1.0
+
+- Adds `minVersion` to `findPackageConfig` and `findPackageConfigVersion`
+ which allows ignoring earlier versions (which currently only means
+ ignoring version 1, aka. `.packages` files.)
+
+- Changes the version number of `SimplePackageConfig.empty` to the
+ current maximum version.
+
+- Improve file read performance; improve lookup performance.
+- Emit an error when a package is inside the package root of another package.
+- Fix a link in the readme.
+
+## 2.0.2
+
+- Update package description and README.
+- Change to package:lints for style checking.
+- Add an example.
+
+## 2.0.1
+
+- Use unique library names to correct docs issue.
+
+## 2.0.0
+
+- Migrate to null safety.
+- Remove legacy APIs.
+- Adds `relativeRoot` property to `Package` which controls whether to
+ make the root URI relative when writing a configuration file.
+
+## 1.9.3
+
+- Fix `Package` constructor not accepting relative `packageUriRoot`.
+
+## 1.9.2
+
+- Updated to support new rules for picking `package_config.json` over
+ a specified `.packages`.
+- Deduce package root from `.packages` derived package configuration,
+ and default all such packages to language version 2.7.
+
+## 1.9.1
+
+- Remove accidental transitive import of `dart:io` from entrypoints that are
+ supposed to be cross-platform compatible.
+
+## 1.9.0
+
+- Based on new JSON file format with more content.
+- This version includes all the new functionality intended for a 2.0.0
+ version, as well as the, now deprecated, version 1 functionality.
+ When we release 2.0.0, the deprecated functionality will be removed.
+
+## 1.1.0
+
+- Allow parsing files with default-package entries and metadata.
+ A default-package entry has an empty key and a valid package name
+ as value.
+ Metadata is attached as fragments to base URIs.
+
+## 1.0.5
+
+- Fix usage of SDK constants.
+
+## 1.0.4
+
+- Set max SDK version to <3.0.0.
+
+## 1.0.3
+
+- Removed unneeded dependency constraint on SDK.
+
+## 1.0.2
+
+- Update SDK constraint to be 2.0.0 dev friendly.
+
+## 1.0.1
+
+- Fix test to not write to sink after it's closed.
+
+## 1.0.0
+
+- Public API marked stable.
+
+## 0.1.5
+
+- `FilePackagesDirectoryPackages.getBase(..)` performance improvements.
+
+## 0.1.4
+
+- Strong mode fixes.
+
+## 0.1.3
+
+- Invalid test cleanup (to keep up with changes in `Uri`).
+
+## 0.1.1
+
+- Syntax updates.
+
+## 0.1.0
+
+- Initial implementation.
diff --git a/pkgs/package_config/LICENSE b/pkgs/package_config/LICENSE
new file mode 100644
index 0000000..7670007
--- /dev/null
+++ b/pkgs/package_config/LICENSE
@@ -0,0 +1,27 @@
+Copyright 2019, the Dart project authors.
+
+Redistribution and use in source and binary forms, with or without
+modification, are permitted provided that the following conditions are
+met:
+
+ * Redistributions of source code must retain the above copyright
+ notice, this list of conditions and the following disclaimer.
+ * Redistributions in binary form must reproduce the above
+ copyright notice, this list of conditions and the following
+ disclaimer in the documentation and/or other materials provided
+ with the distribution.
+ * Neither the name of Google LLC nor the names of its
+ contributors may be used to endorse or promote products derived
+ from this software without specific prior written permission.
+
+THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
diff --git a/pkgs/package_config/README.md b/pkgs/package_config/README.md
new file mode 100644
index 0000000..76fd3cb
--- /dev/null
+++ b/pkgs/package_config/README.md
@@ -0,0 +1,26 @@
+[](https://github.com/dart-lang/tools/actions/workflows/package_config.yaml)
+[](https://pub.dev/packages/package_config)
+[](https://pub.dev/packages/package_config/publisher)
+
+Support for working with **Package Configuration** files as described
+in the Package Configuration v2 [design document](https://github.com/dart-lang/language/blob/master/accepted/2.8/language-versioning/package-config-file-v2.md).
+
+A Dart package configuration file is used to resolve Dart package names (e.g.
+`foobar`) to Dart files containing the source code for that package (e.g.
+`file:///Users/myuser/.pub-cache/hosted/pub.dartlang.org/foobar-1.1.0`). The
+standard package configuration file is `.dart_tool/package_config.json`, and is
+written by the Dart tool when the command `dart pub get` is run.
+
+The primary libraries of this package are
+* `package_config.dart`:
+ Defines the `PackageConfig` class and other types needed to use
+ package configurations, and provides functions to find, read and
+ write package configuration files.
+
+* `package_config_types.dart`:
+ Just the `PackageConfig` class and other types needed to use
+ package configurations. This library does not depend on `dart:io`.
+
+The package includes deprecated, backwards-compatible functionality to
+work with the `.packages` file. This functionality will not be maintained,
+and will be removed in a future version of this package.
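For reviewers skimming the diff, here is a minimal usage sketch of the two lookup directions the README describes (illustrative only, not part of the moved sources; it assumes it runs inside a project that already depends on `package_config`):

```dart
import 'dart:io' show Directory;

import 'package:package_config/package_config.dart';

Future<void> main() async {
  // Walks up from the current directory until it finds a
  // `.dart_tool/package_config.json` (or a legacy `.packages`) file.
  var config = await findPackageConfig(Directory.current);
  if (config == null) {
    print('No package configuration found.');
    return;
  }

  // `package:` URI -> file URI ...
  var fileUri =
      config.resolve(Uri.parse('package:package_config/package_config.dart'));
  print('package_config.dart resolves to: $fileUri');

  // ... and back again.
  if (fileUri != null) {
    print('Round-trips to: ${config.toPackageUri(fileUri)}');
  }
}
```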
diff --git a/pkgs/package_config/analysis_options.yaml b/pkgs/package_config/analysis_options.yaml
new file mode 100644
index 0000000..c0249e5
--- /dev/null
+++ b/pkgs/package_config/analysis_options.yaml
@@ -0,0 +1,5 @@
+# Copyright (c) 2020, the Dart project authors. Please see the AUTHORS file
+# for details. All rights reserved. Use of this source code is governed by a
+# BSD-style license that can be found in the LICENSE file.
+
+include: package:dart_flutter_team_lints/analysis_options.yaml
diff --git a/pkgs/package_config/example/main.dart b/pkgs/package_config/example/main.dart
new file mode 100644
index 0000000..db137ca
--- /dev/null
+++ b/pkgs/package_config/example/main.dart
@@ -0,0 +1,19 @@
+// Copyright (c) 2020, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:io' show Directory;
+
+import 'package:package_config/package_config.dart';
+
+void main() async {
+ var packageConfig = await findPackageConfig(Directory.current);
+ if (packageConfig == null) {
+ print('Failed to locate or read package config.');
+ } else {
+ print('This package depends on ${packageConfig.packages.length} packages:');
+ for (var package in packageConfig.packages) {
+ print('- ${package.name}');
+ }
+ }
+}
diff --git a/pkgs/package_config/lib/package_config.dart b/pkgs/package_config/lib/package_config.dart
new file mode 100644
index 0000000..074c977
--- /dev/null
+++ b/pkgs/package_config/lib/package_config.dart
@@ -0,0 +1,199 @@
+// Copyright (c) 2019, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+/// A package configuration is a way to assign file paths to package URIs,
+/// and vice-versa.
+///
+/// This package provides functionality to find, read and write package
+/// configurations in the [specified format](https://github.com/dart-lang/language/blob/master/accepted/future-releases/language-versioning/package-config-file-v2.md).
+library;
+
+import 'dart:io' show Directory, File;
+import 'dart:typed_data' show Uint8List;
+
+import 'src/discovery.dart' as discover;
+import 'src/errors.dart' show throwError;
+import 'src/package_config.dart';
+import 'src/package_config_io.dart';
+
+export 'package_config_types.dart';
+
+/// Reads a specific package configuration file.
+///
+/// The file must exist and be readable.
+/// It must be either a valid `package_config.json` file
+/// or a valid `.packages` file.
+/// It is considered a `package_config.json` file if its first character
+/// is a `{`.
+///
+/// If the file is a `.packages` file (the file name is `.packages`)
+/// and [preferNewest] is true (the default), this function also checks
+/// whether there is a `.dart_tool/package_config.json` file next
+/// to the original file, and if so, loads that instead.
+/// If [preferNewest] is set to false, a directly specified `.packages` file
+/// is loaded even if there is an available `package_config.json` file.
+/// The caller can determine this from the [PackageConfig.version]
+/// being 1 and look for a `package_config.json` file themselves.
+///
+/// If [onError] is provided, the configuration file parsing will report errors
+/// by calling that function, and then try to recover.
+/// The returned package configuration is a *best effort* attempt to create
+/// a valid configuration from the invalid configuration file.
+/// If no [onError] is provided, errors are thrown immediately.
+Future<PackageConfig> loadPackageConfig(File file,
+ {bool preferNewest = true, void Function(Object error)? onError}) =>
+ readAnyConfigFile(file, preferNewest, onError ?? throwError);
+
+/// Reads a specific package configuration URI.
+///
+/// The file of the URI must exist and be readable.
+/// It must be either a valid `package_config.json` file
+/// or a valid `.packages` file.
+/// It is considered a `package_config.json` file if its first
+/// non-whitespace character is a `{`.
+///
+/// If [preferNewest] is true (the default) and the file is a `.packages`
+/// file, as determined by its file name being `.packages`,
+/// this function first checks if there is a `.dart_tool/package_config.json`
+/// file next to the original file, and if so, loads that instead.
+/// The [file] *must not* be a `package:` URI.
+/// If [preferNewest] is set to false, a directly specified `.packages` file
+/// is loaded even if there is an available `package_config.json` file.
+/// The caller can determine this from the [PackageConfig.version]
+/// being 1 and look for a `package_config.json` file themselves.
+///
+/// If [loader] is provided, URIs are loaded using that function.
+/// The future returned by the loader must complete with a [Uint8List]
+/// containing the entire file content encoded as UTF-8,
+/// or with `null` if the file does not exist.
+/// The loader may throw at its own discretion, for situations where
+/// it determines that an error might need user attention,
+/// but it is always allowed to return `null`.
+/// This function makes no attempt to catch such errors.
+/// As such, it may throw any error that [loader] throws.
+///
+/// If no [loader] is supplied, a default loader is used which
+/// only accepts `file:`, `http:` and `https:` URIs,
+/// and which uses the platform file system and HTTP requests to
+/// fetch file content. The default loader never throws because
+/// of an I/O issue, as long as the location URIs are valid.
+/// As such, it does not distinguish between a file not existing,
+/// and it being temporarily locked or unreachable.
+///
+/// If [onError] is provided, the configuration file parsing will report errors
+/// by calling that function, and then try to recover.
+/// The returned package configuration is a *best effort* attempt to create
+/// a valid configuration from the invalid configuration file.
+/// If no [onError] is provided, errors are thrown immediately.
+Future<PackageConfig> loadPackageConfigUri(Uri file,
+ {Future<Uint8List?> Function(Uri uri)? loader,
+ bool preferNewest = true,
+ void Function(Object error)? onError}) =>
+ readAnyConfigFileUri(file, loader, onError ?? throwError, preferNewest);
+
+/// Finds a package configuration relative to [directory].
+///
+/// If [directory] contains a package configuration,
+/// either a `.dart_tool/package_config.json` file or,
+/// if not, a `.packages` file, then that file is loaded.
+///
+/// If no file is found in the current directory,
+/// then the parent directories are checked recursively,
+/// all the way to the root directory, to check if those contain
+/// a package configuration.
+/// If [recurse] is set to `false`, this parent directory check is not
+/// performed.
+///
+/// If [onError] is provided, the configuration file parsing will report errors
+/// by calling that function, and then try to recover.
+/// The returned package configuration is a *best effort* attempt to create
+/// a valid configuration from the invalid configuration file.
+/// If no [onError] is provided, errors are thrown immediately.
+///
+/// If [minVersion] is set to something greater than its default,
+/// any lower-version configuration files are ignored in the search.
+///
+/// Returns `null` if no configuration file is found.
+Future<PackageConfig?> findPackageConfig(Directory directory,
+ {bool recurse = true,
+ void Function(Object error)? onError,
+ int minVersion = 1}) {
+ if (minVersion > PackageConfig.maxVersion) {
+ throw ArgumentError.value(minVersion, 'minVersion',
+ 'Maximum known version is ${PackageConfig.maxVersion}');
+ }
+ return discover.findPackageConfig(
+ directory, minVersion, recurse, onError ?? throwError);
+}
+
+/// Finds a package configuration relative to [location].
+///
+/// If [location] contains a package configuration,
+/// either a `.dart_tool/package_config.json` file or,
+/// if not, a `.packages` file, then that file is loaded.
+/// The [location] URI *must not* be a `package:` URI.
+/// It should be a hierarchical URI which is supported
+/// by [loader].
+///
+/// If no file is found in the current directory,
+/// then the parent directories are checked recursively,
+/// all the way to the root directory, to check if those contain
+/// a package configuration.
+/// If [recurse] is set to `false`, this parent directory check is not
+/// performed.
+///
+/// If [loader] is provided, URIs are loaded using that function.
+/// The future returned by the loader must complete with a [Uint8List]
+/// containing the entire file content,
+/// or with `null` if the file does not exist.
+/// The loader may throw at its own discretion, for situations where
+/// it determines that an error might need user attention,
+/// but it is always allowed to return `null`.
+/// This function makes no attempt to catch such errors.
+///
+/// If no [loader] is supplied, a default loader is used which
+/// only accepts `file:`, `http:` and `https:` URIs,
+/// and which uses the platform file system and HTTP requests to
+/// fetch file content. The default loader never throws because
+/// of an I/O issue, as long as the location URIs are valid.
+/// As such, it does not distinguish between a file not existing,
+/// and it being temporarily locked or unreachable.
+///
+/// If [onError] is provided, the configuration file parsing will report errors
+/// by calling that function, and then try to recover.
+/// The returned package configuration is a *best effort* attempt to create
+/// a valid configuration from the invalid configuration file.
+/// If no [onError] is provided, errors are thrown immediately.
+///
+/// If [minVersion] is set to something greater than its default,
+/// any lower-version configuration files are ignored in the search.
+///
+/// Returns `null` if no configuration file is found.
+Future<PackageConfig?> findPackageConfigUri(Uri location,
+ {bool recurse = true,
+ int minVersion = 1,
+ Future<Uint8List?> Function(Uri uri)? loader,
+ void Function(Object error)? onError}) {
+ if (minVersion > PackageConfig.maxVersion) {
+ throw ArgumentError.value(minVersion, 'minVersion',
+ 'Maximum known version is ${PackageConfig.maxVersion}');
+ }
+ return discover.findPackageConfigUri(
+ location, minVersion, loader, onError ?? throwError, recurse);
+}
+
+/// Writes a package configuration to the provided directory.
+///
+/// Writes `.dart_tool/package_config.json` relative to [directory].
+/// If the `.dart_tool/` directory does not exist, it is created.
+/// If it cannot be created, this operation fails.
+///
+/// Also writes a `.packages` file in [directory].
+/// This will stop happening eventually as the `.packages` file becomes
+/// discontinued.
+/// A comment is generated if `[PackageConfig.extraData]` contains a
+/// `"generator"` entry.
+Future<void> savePackageConfig(
+ PackageConfig configuration, Directory directory) =>
+ writePackageConfigJsonFile(configuration, directory);
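The `loader` parameter documented above lets `loadPackageConfigUri` read a configuration from somewhere other than the local file system. A hedged sketch follows; the `memory:` scheme and the `fakeFiles` map are invented purely for illustration:

```dart
import 'dart:convert' show utf8;
import 'dart:typed_data' show Uint8List;

import 'package:package_config/package_config.dart';

// Hypothetical in-memory "file system" used only by this sketch.
final fakeFiles = <Uri, String>{
  Uri.parse('memory:/app/.dart_tool/package_config.json'): '''
{
  "configVersion": 2,
  "packages": [
    {
      "name": "my_app",
      "rootUri": "../",
      "packageUri": "lib/",
      "languageVersion": "3.4"
    }
  ]
}
''',
};

Future<void> main() async {
  // The loader maps a URI to its bytes, or `null` if the "file" is missing.
  var config = await loadPackageConfigUri(
    Uri.parse('memory:/app/.dart_tool/package_config.json'),
    loader: (uri) async {
      var contents = fakeFiles[uri];
      return contents == null
          ? null
          : Uint8List.fromList(utf8.encode(contents));
    },
  );
  print('Loaded version ${config.version}: '
      '${config.packages.map((p) => p.name).join(', ')}');
}
```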
diff --git a/pkgs/package_config/lib/package_config_types.dart b/pkgs/package_config/lib/package_config_types.dart
new file mode 100644
index 0000000..825f7ac
--- /dev/null
+++ b/pkgs/package_config/lib/package_config_types.dart
@@ -0,0 +1,17 @@
+// Copyright (c) 2019, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+/// A package configuration is a way to assign file paths to package URIs,
+/// and vice-versa.
+///
+/// {@canonicalFor package_config.InvalidLanguageVersion}
+/// {@canonicalFor package_config.LanguageVersion}
+/// {@canonicalFor package_config.Package}
+/// {@canonicalFor package_config.PackageConfig}
+/// {@canonicalFor errors.PackageConfigError}
+library;
+
+export 'src/errors.dart' show PackageConfigError;
+export 'src/package_config.dart'
+ show InvalidLanguageVersion, LanguageVersion, Package, PackageConfig;
diff --git a/pkgs/package_config/lib/src/discovery.dart b/pkgs/package_config/lib/src/discovery.dart
new file mode 100644
index 0000000..b678410
--- /dev/null
+++ b/pkgs/package_config/lib/src/discovery.dart
@@ -0,0 +1,148 @@
+// Copyright (c) 2019, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:io';
+import 'dart:typed_data';
+
+import 'errors.dart';
+import 'package_config_impl.dart';
+import 'package_config_io.dart';
+import 'package_config_json.dart';
+import 'packages_file.dart' as packages_file;
+import 'util_io.dart' show defaultLoader, pathJoin;
+
+final Uri packageConfigJsonPath = Uri(path: '.dart_tool/package_config.json');
+final Uri dotPackagesPath = Uri(path: '.packages');
+final Uri currentPath = Uri(path: '.');
+final Uri parentPath = Uri(path: '..');
+
+/// Discover the package configuration for a Dart script.
+///
+/// The [baseDirectory] points to the directory of the Dart script.
+/// A package resolution strategy is found by going through the following steps,
+/// and stopping when something is found.
+///
+/// * Check if a `.dart_tool/package_config.json` file exists in the directory.
+/// * Check if a `.packages` file exists in the directory
+/// (if `minVersion <= 1`).
+/// * Repeat these checks for the parent directories until reaching the
+/// root directory if [recursive] is true.
+///
+/// If any of these tests succeed, a `PackageConfig` object is returned.
+/// Returns `null` if no configuration was found. If a configuration
+/// is needed, then the caller can supply [PackageConfig.empty].
+///
+/// If [minVersion] is greater than 1, `.packages` files are ignored.
+/// If [minVersion] is greater than the version read from the
+/// `package_config.json` file, it too is ignored.
+Future<PackageConfig?> findPackageConfig(Directory baseDirectory,
+ int minVersion, bool recursive, void Function(Object error) onError) async {
+ var directory = baseDirectory;
+ if (!directory.isAbsolute) directory = directory.absolute;
+ if (!await directory.exists()) {
+ return null;
+ }
+ do {
+    // Check for a package configuration in this directory.
+ var packageConfig =
+ await findPackageConfigInDirectory(directory, minVersion, onError);
+ if (packageConfig != null) return packageConfig;
+ if (!recursive) break;
+ // Check in parent directories.
+ var parentDirectory = directory.parent;
+ if (parentDirectory.path == directory.path) break;
+ directory = parentDirectory;
+ } while (true);
+ return null;
+}
+
+/// Similar to [findPackageConfig] but based on a URI.
+Future<PackageConfig?> findPackageConfigUri(
+ Uri location,
+ int minVersion,
+ Future<Uint8List?> Function(Uri uri)? loader,
+ void Function(Object error) onError,
+ bool recursive) async {
+ if (location.isScheme('package')) {
+ onError(PackageConfigArgumentError(
+ location, 'location', 'Must not be a package: URI'));
+ return null;
+ }
+ if (loader == null) {
+ if (location.isScheme('file')) {
+ return findPackageConfig(
+ Directory.fromUri(location.resolveUri(currentPath)),
+ minVersion,
+ recursive,
+ onError);
+ }
+ loader = defaultLoader;
+ }
+ if (!location.path.endsWith('/')) location = location.resolveUri(currentPath);
+ while (true) {
+ var file = location.resolveUri(packageConfigJsonPath);
+ var bytes = await loader(file);
+ if (bytes != null) {
+ var config = parsePackageConfigBytes(bytes, file, onError);
+ if (config.version >= minVersion) return config;
+ }
+ if (minVersion <= 1) {
+ file = location.resolveUri(dotPackagesPath);
+ bytes = await loader(file);
+ if (bytes != null) {
+ return packages_file.parse(bytes, file, onError);
+ }
+ }
+ if (!recursive) break;
+ var parent = location.resolveUri(parentPath);
+ if (parent == location) break;
+ location = parent;
+ }
+ return null;
+}
+
+/// Finds a `.packages` or `.dart_tool/package_config.json` file in [directory].
+///
+/// Loads the file, if it is there, and returns the resulting [PackageConfig].
+/// Returns `null` if the file isn't there.
+/// Reports a [FormatException] if a file is there but the content is not valid.
+/// If the file exists, but fails to be read, the file system error is reported.
+///
+/// If [onError] is supplied, parsing errors are reported using that, and
+/// a best-effort attempt is made to return a package configuration.
+/// This may be the empty package configuration.
+///
+/// If [minVersion] is greater than 1, `.packages` files are ignored.
+/// If [minVersion] is greater than the version read from the
+/// `package_config.json` file, it too is ignored.
+Future<PackageConfig?> findPackageConfigInDirectory(Directory directory,
+ int minVersion, void Function(Object error) onError) async {
+ var packageConfigFile = await checkForPackageConfigJsonFile(directory);
+ if (packageConfigFile != null) {
+ var config = await readPackageConfigJsonFile(packageConfigFile, onError);
+ if (config.version < minVersion) return null;
+ return config;
+ }
+ if (minVersion <= 1) {
+ packageConfigFile = await checkForDotPackagesFile(directory);
+ if (packageConfigFile != null) {
+ return await readDotPackagesFile(packageConfigFile, onError);
+ }
+ }
+ return null;
+}
+
+Future<File?> checkForPackageConfigJsonFile(Directory directory) async {
+ assert(directory.isAbsolute);
+ var file =
+ File(pathJoin(directory.path, '.dart_tool', 'package_config.json'));
+ if (await file.exists()) return file;
+ return null;
+}
+
+Future<File?> checkForDotPackagesFile(Directory directory) async {
+ var file = File(pathJoin(directory.path, '.packages'));
+ if (await file.exists()) return file;
+ return null;
+}
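The discovery walk above is what `findPackageConfig` delegates to; the `minVersion` filter is the only subtle part. A small sketch (illustrative only) of skipping legacy `.packages` files during the walk:

```dart
import 'dart:io' show Directory;

import 'package:package_config/package_config.dart';

Future<void> main() async {
  // With `minVersion: 2` the parent-directory walk ignores legacy
  // `.packages` files and only accepts `.dart_tool/package_config.json`.
  var config = await findPackageConfig(
    Directory.current,
    recurse: true, // also search parent directories (the default)
    minVersion: 2,
    onError: (error) => print('Ignoring malformed configuration: $error'),
  );
  print(config == null
      ? 'No version >= 2 configuration found.'
      : 'Found a version ${config.version} configuration.');
}
```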
diff --git a/pkgs/package_config/lib/src/errors.dart b/pkgs/package_config/lib/src/errors.dart
new file mode 100644
index 0000000..a66fef7
--- /dev/null
+++ b/pkgs/package_config/lib/src/errors.dart
@@ -0,0 +1,34 @@
+// Copyright (c) 2019, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+/// General superclass of most errors and exceptions thrown by this package.
+///
+/// Only covers errors thrown while parsing package configuration files.
+/// Programming errors and I/O exceptions are not covered.
+abstract class PackageConfigError {
+ PackageConfigError._();
+}
+
+class PackageConfigArgumentError extends ArgumentError
+ implements PackageConfigError {
+ PackageConfigArgumentError(
+ Object? super.value, String super.name, String super.message)
+ : super.value();
+
+ PackageConfigArgumentError.from(ArgumentError error)
+ : super.value(error.invalidValue, error.name, error.message);
+}
+
+class PackageConfigFormatException extends FormatException
+ implements PackageConfigError {
+ PackageConfigFormatException(super.message, Object? super.source,
+ [super.offset]);
+
+ PackageConfigFormatException.from(FormatException exception)
+ : super(exception.message, exception.source, exception.offset);
+}
+
+/// The default `onError` handler.
+// ignore: only_throw_errors
+Never throwError(Object error) => throw error;
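`PackageConfigError` is only a marker interface; the concrete errors remain ordinary `ArgumentError`s and `FormatException`s, and a caller-supplied `onError` turns them into recoverable diagnostics. A sketch (with invented package names and paths) of collecting them instead of throwing:

```dart
import 'package:package_config/package_config.dart';

void main() {
  // The second package has an invalid name, but parsing continues because
  // errors are collected instead of thrown.
  const brokenJson = '''
{
  "configVersion": 2,
  "packages": [
    {"name": "good_package", "rootUri": "file:///good/"},
    {"name": "not a valid name!", "rootUri": "file:///bad/"}
  ]
}
''';

  var errors = <Object>[];
  var config = PackageConfig.parseString(
      brokenJson, Uri.parse('file:///app/.dart_tool/package_config.json'),
      onError: errors.add);

  print('Recovered ${config.packages.length} package(s).');
  for (var error in errors) {
    // Each reported error also implements the PackageConfigError marker.
    print('${error.runtimeType} (PackageConfigError: '
        '${error is PackageConfigError}): $error');
  }
}
```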
diff --git a/pkgs/package_config/lib/src/package_config.dart b/pkgs/package_config/lib/src/package_config.dart
new file mode 100644
index 0000000..155dfc5
--- /dev/null
+++ b/pkgs/package_config/lib/src/package_config.dart
@@ -0,0 +1,402 @@
+// Copyright (c) 2019, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:typed_data';
+
+import 'errors.dart';
+import 'package_config_impl.dart';
+import 'package_config_json.dart';
+
+/// A package configuration.
+///
+/// Associates configuration data to packages and files in packages.
+///
+/// More members may be added to this class in the future,
+/// so classes outside of this package must not implement [PackageConfig]
+/// or any subclass of it.
+abstract class PackageConfig {
+ /// The largest configuration version currently recognized.
+ static const int maxVersion = 2;
+
+ /// An empty package configuration.
+ ///
+ /// A package configuration with no available packages.
+ /// Is used as a default value where a package configuration
+ /// is expected, but none have been specified or found.
+ static const PackageConfig empty = SimplePackageConfig.empty();
+
+ /// Creates a package configuration with the provided available [packages].
+ ///
+ /// The packages must be valid packages (valid package name, valid
+ /// absolute directory URIs, valid language version, if any),
+ /// and there must not be two packages with the same name.
+ ///
+ /// The package's root ([Package.root]) and package-root
+  /// ([Package.packageUriRoot]) paths must satisfy a number of constraints.
+ /// We say that one path (which we know ends with a `/` character)
+ /// is inside another path, if the latter path is a prefix of the former path,
+ /// including the two paths being the same.
+ ///
+  /// * No two packages may have the same root.
+ /// * The package-root of a package must be inside the package's root.
+ /// * If one package's package-root is inside another package's root,
+ /// then the latter package's package root must not be inside the former
+ /// package's root. (No getting between a package and its package root!)
+ /// This also disallows a package's root being the same as another
+ /// package's package root.
+ ///
+ /// If supplied, the [extraData] will be available as the
+ /// [PackageConfig.extraData] of the created configuration.
+ ///
+ /// The version of the resulting configuration is always [maxVersion].
+ factory PackageConfig(Iterable<Package> packages, {Object? extraData}) =>
+ SimplePackageConfig(maxVersion, packages, extraData);
+
+ /// Parses a package configuration file.
+ ///
+ /// The [bytes] must be an UTF-8 encoded JSON object
+ /// containing a valid package configuration.
+ ///
+ /// The [baseUri] is used as the base for resolving relative
+ /// URI references in the configuration file. If the configuration
+ /// has been read from a file, the [baseUri] can be the URI of that
+ /// file, or of the directory it occurs in.
+ ///
+ /// If [onError] is provided, errors found during parsing or building
+ /// the configuration are reported by calling [onError] instead of
+  /// throwing, and the parser makes a *best effort* attempt to continue
+ /// despite the error. The input must still be valid JSON.
+ /// The result may be [PackageConfig.empty] if there is no way to
+ /// extract useful information from the bytes.
+ static PackageConfig parseBytes(Uint8List bytes, Uri baseUri,
+ {void Function(Object error)? onError}) =>
+ parsePackageConfigBytes(bytes, baseUri, onError ?? throwError);
+
+ /// Parses a package configuration file.
+ ///
+ /// The [configuration] must be a JSON object
+ /// containing a valid package configuration.
+ ///
+ /// The [baseUri] is used as the base for resolving relative
+ /// URI references in the configuration file. If the configuration
+ /// has been read from a file, the [baseUri] can be the URI of that
+ /// file, or of the directory it occurs in.
+ ///
+ /// If [onError] is provided, errors found during parsing or building
+ /// the configuration are reported by calling [onError] instead of
+  /// throwing, and the parser makes a *best effort* attempt to continue
+ /// despite the error. The input must still be valid JSON.
+ /// The result may be [PackageConfig.empty] if there is no way to
+  /// extract useful information from the string.
+ static PackageConfig parseString(String configuration, Uri baseUri,
+ {void Function(Object error)? onError}) =>
+ parsePackageConfigString(configuration, baseUri, onError ?? throwError);
+
+ /// Parses the JSON data of a package configuration file.
+ ///
+ /// The [jsonData] must be a JSON-like Dart data structure,
+ /// like the one provided by parsing JSON text using `dart:convert`,
+ /// containing a valid package configuration.
+ ///
+ /// The [baseUri] is used as the base for resolving relative
+ /// URI references in the configuration file. If the configuration
+ /// has been read from a file, the [baseUri] can be the URI of that
+ /// file, or of the directory it occurs in.
+ ///
+ /// If [onError] is provided, errors found during parsing or building
+ /// the configuration are reported by calling [onError] instead of
+  /// throwing, and the parser makes a *best effort* attempt to continue
+ /// despite the error. The input must still be valid JSON.
+ /// The result may be [PackageConfig.empty] if there is no way to
+  /// extract useful information from the data.
+ static PackageConfig parseJson(Object? jsonData, Uri baseUri,
+ {void Function(Object error)? onError}) =>
+ parsePackageConfigJson(jsonData, baseUri, onError ?? throwError);
+
+ /// Writes a configuration file for this configuration on [output].
+ ///
+ /// If [baseUri] is provided, URI references in the generated file
+ /// will be made relative to [baseUri] where possible.
+ static void writeBytes(PackageConfig configuration, Sink<Uint8List> output,
+ [Uri? baseUri]) {
+ writePackageConfigJsonUtf8(configuration, baseUri, output);
+ }
+
+ /// Writes a configuration JSON text for this configuration on [output].
+ ///
+ /// If [baseUri] is provided, URI references in the generated file
+ /// will be made relative to [baseUri] where possible.
+ static void writeString(PackageConfig configuration, StringSink output,
+ [Uri? baseUri]) {
+ writePackageConfigJsonString(configuration, baseUri, output);
+ }
+
+ /// Converts a configuration to a JSON-like data structure.
+ ///
+ /// If [baseUri] is provided, URI references in the generated data
+ /// will be made relative to [baseUri] where possible.
+ static Map<String, Object?> toJson(PackageConfig configuration,
+ [Uri? baseUri]) =>
+ packageConfigToJson(configuration, baseUri);
+
+ /// The configuration version number.
+ ///
+ /// Currently this is 1 or 2, where
+ /// * Version one is the `.packages` file format and
+ /// * Version two is the first `package_config.json` format.
+ ///
+  /// Instances of this class support both, and the version
+ /// is only useful for detecting which kind of file the configuration
+ /// was read from.
+ int get version;
+
+ /// All the available packages of this configuration.
+ ///
+ /// No two of these packages have the same name,
+ /// and no two [Package.root] directories overlap.
+ Iterable<Package> get packages;
+
+ /// Look up a package by name.
+ ///
+ /// Returns the [Package] from [packages] with [packageName] as
+ /// [Package.name]. Returns `null` if the package is not available in the
+ /// current configuration.
+ Package? operator [](String packageName);
+
+ /// Provides the associated package for a specific [file] (or directory).
+ ///
+ /// Returns a [Package] which contains the [file]'s path, if any.
+ /// That is, the [Package.root] directory is a parent directory
+ /// of the [file]'s location.
+ ///
+ /// Returns `null` if the file does not belong to any package.
+ Package? packageOf(Uri file);
+
+  /// Resolves a `package:` URI to a non-package URI.
+ ///
+ /// The [packageUri] must be a valid package URI. That means:
+ /// * A URI with `package` as scheme,
+ /// * with no authority part (`package://...`),
+ /// * with a path starting with a valid package name followed by a slash, and
+ /// * with no query or fragment part.
+ ///
+ /// Throws an [ArgumentError] (which also implements [PackageConfigError])
+ /// if the package URI is not valid.
+ ///
+ /// Returns `null` if the package name of [packageUri] is not available
+ /// in this package configuration.
+ /// Returns the remaining path of the package URI resolved relative to the
+ /// [Package.packageUriRoot] of the corresponding package.
+ Uri? resolve(Uri packageUri);
+
+ /// The package URI which resolves to [nonPackageUri].
+ ///
+ /// The [nonPackageUri] must not have any query or fragment part,
+ /// and it must not have `package` as scheme.
+ /// Throws an [ArgumentError] (which also implements [PackageConfigError])
+ /// if the non-package URI is not valid.
+ ///
+ /// Returns a package URI which [resolve] will convert to [nonPackageUri],
+ /// if any such URI exists. Returns `null` if no such package URI exists.
+ Uri? toPackageUri(Uri nonPackageUri);
+
+ /// Extra data associated with the package configuration.
+ ///
+ /// The data may be in any format, depending on who introduced it.
+ /// The standard `package_config.json` file storage will only store
+ /// JSON-like list/map data structures.
+ Object? get extraData;
+}
+
+/// Configuration data for a single package.
+abstract class Package {
+ /// Creates a package with the provided properties.
+ ///
+ /// The [name] must be a valid package name.
+ /// The [root] must be an absolute directory URI, meaning an absolute URI
+  /// with no query or fragment part and a path starting and ending with `/`.
+ /// The [packageUriRoot], if provided, must be either an absolute
+ /// directory URI or a relative URI reference which is then resolved
+ /// relative to [root]. It must then also be a subdirectory of [root],
+ /// or the same directory, and must end with `/`.
+ /// If [languageVersion] is supplied, it must be a valid Dart language
+ /// version, which means two decimal integer literals separated by a `.`,
+ /// where the integer literals have no leading zeros unless they are
+ /// a single zero digit.
+ ///
+ /// The [relativeRoot] controls whether the [root] is written as
+ /// relative to the `package_config.json` file when the package
+ /// configuration is written to a file. It defaults to being relative.
+ ///
+ /// If [extraData] is supplied, it will be available as the
+ /// [Package.extraData] of the created package.
+ factory Package(String name, Uri root,
+ {Uri? packageUriRoot,
+ LanguageVersion? languageVersion,
+ Object? extraData,
+ bool relativeRoot = true}) =>
+ SimplePackage.validate(name, root, packageUriRoot, languageVersion,
+ extraData, relativeRoot, throwError)!;
+
+ /// The package-name of the package.
+ String get name;
+
+ /// The location of the root of the package.
+ ///
+ /// Is always an absolute URI with no query or fragment parts,
+ /// and with a path ending in `/`.
+ ///
+ /// All files in the [root] directory are considered
+  /// part of the package for purposes where that matters.
+ Uri get root;
+
+ /// The root of the files available through `package:` URIs.
+ ///
+ /// A `package:` URI with [name] as the package name is
+ /// resolved relative to this location.
+ ///
+ /// Is always an absolute URI with no query or fragment part
+ /// with a path ending in `/`,
+ /// and with a location which is a subdirectory
+ /// of the [root], or the same as the [root].
+ Uri get packageUriRoot;
+
+ /// The default language version associated with this package.
+ ///
+ /// Each package may have a default language version associated,
+ /// which is the language version used to parse and compile
+ /// Dart files in the package.
+  /// A language version is defined by two non-negative numbers,
+ /// the *major* and *minor* version numbers.
+ ///
+ /// A package may have no language version associated with it
+ /// in the package configuration, in which case tools should
+ /// use a default behavior for the package.
+ LanguageVersion? get languageVersion;
+
+ /// Extra data associated with the specific package.
+ ///
+ /// The data may be in any format, depending on who introduced it.
+ /// The standard `package_config.json` file storage will only store
+ /// JSON-like list/map data structures.
+ Object? get extraData;
+
+ /// Whether the [root] URI should be written as relative.
+ ///
+ /// When the configuration is written to a `package_config.json`
+ /// file, the [root] URI can be either relative to the file
+  /// location or absolute, controlled by this value.
+ bool get relativeRoot;
+}
+
+/// A language version.
+///
+/// A language version is represented by two non-negative integers,
+/// the [major] and [minor] version numbers.
+///
+/// If errors during parsing are handled using an `onError` handler,
+/// then an *invalid* language version may be represented by an
+/// [InvalidLanguageVersion] object.
+abstract class LanguageVersion implements Comparable<LanguageVersion> {
+ /// The maximal value allowed by [major] and [minor] values;
+ static const int maxValue = 0x7FFFFFFF;
+ factory LanguageVersion(int major, int minor) {
+ RangeError.checkValueInInterval(major, 0, maxValue, 'major');
+    RangeError.checkValueInInterval(minor, 0, maxValue, 'minor');
+ return SimpleLanguageVersion(major, minor, null);
+ }
+
+ /// Parses a language version string.
+ ///
+ /// A valid language version string has the form
+ ///
+ /// > *decimalNumber* `.` *decimalNumber*
+ ///
+ /// where a *decimalNumber* is a non-empty sequence of decimal digits
+ /// with no unnecessary leading zeros (the decimal number only starts
+ /// with a zero digit if that digit is the entire number).
+ /// No spaces are allowed in the string.
+ ///
+ /// If the [source] is valid then it is parsed into a valid
+ /// [LanguageVersion] object.
+ /// If not, then the [onError] is called with a [FormatException].
+ /// If [onError] is not supplied, it defaults to throwing the exception.
+ /// If the call does not throw, then an [InvalidLanguageVersion] is returned
+ /// containing the original [source].
+ static LanguageVersion parse(String source,
+ {void Function(Object error)? onError}) =>
+ parseLanguageVersion(source, onError ?? throwError);
+
+ /// The major language version.
+ ///
+ /// A non-negative integer less than 2<sup>31</sup>.
+ ///
+ /// The value is negative for objects representing *invalid* language
+ /// versions ([InvalidLanguageVersion]).
+ int get major;
+
+ /// The minor language version.
+ ///
+ /// A non-negative integer less than 2<sup>31</sup>.
+ ///
+ /// The value is negative for objects representing *invalid* language
+ /// versions ([InvalidLanguageVersion]).
+ int get minor;
+
+ /// Compares language versions.
+ ///
+ /// Two language versions are considered equal if they have the
+ /// same major and minor version numbers.
+ ///
+  /// A language version is greater than another if the former's major version
+ /// is greater than the latter's major version, or if they have
+ /// the same major version and the former's minor version is greater than
+ /// the latter's.
+ @override
+ int compareTo(LanguageVersion other);
+
+ /// Valid language versions with the same [major] and [minor] values are
+ /// equal.
+ ///
+ /// Invalid language versions ([InvalidLanguageVersion]) are not equal to
+ /// any other object.
+ @override
+ bool operator ==(Object other);
+
+ @override
+ int get hashCode;
+
+ /// A string representation of the language version.
+ ///
+ /// A valid language version is represented as
+ /// `"${version.major}.${version.minor}"`.
+ @override
+ String toString();
+}
+
+/// An *invalid* language version.
+///
+/// Stored in a [Package] when the original language version string
+/// was invalid and a `onError` handler was passed to the parser
+/// which did not throw on an error.
+abstract class InvalidLanguageVersion implements LanguageVersion {
+ /// The value -1 for an invalid language version.
+ @override
+ int get major;
+
+ /// The value -1 for an invalid language version.
+ @override
+ int get minor;
+
+ /// An invalid language version is only equal to itself.
+ @override
+ bool operator ==(Object other);
+
+ @override
+ int get hashCode;
+
+ /// The original invalid version string.
+ @override
+ String toString();
+}
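Besides parsing, the API above can build a configuration from `Package` objects and serialize it again. A sketch under invented `/work/...` paths showing the constructor constraints and `writeString`:

```dart
import 'package:package_config/package_config.dart';

void main() {
  // Build a configuration in memory. Each root must be an absolute
  // directory URI ending in `/`, and no two roots may be identical.
  var config = PackageConfig([
    Package('my_app', Uri.parse('file:///work/my_app/'),
        packageUriRoot: Uri.parse('file:///work/my_app/lib/'),
        languageVersion: LanguageVersion(3, 4)),
    Package('helper', Uri.parse('file:///work/helper/'),
        packageUriRoot: Uri.parse('file:///work/helper/lib/')),
  ], extraData: {'generator': 'example'});

  // Serialize as `package_config.json` text; URIs are written relative to
  // the directory the file would live in.
  var buffer = StringBuffer();
  PackageConfig.writeString(
      config, buffer, Uri.parse('file:///work/my_app/.dart_tool/'));
  print(buffer);
}
```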
diff --git a/pkgs/package_config/lib/src/package_config_impl.dart b/pkgs/package_config/lib/src/package_config_impl.dart
new file mode 100644
index 0000000..865e99a
--- /dev/null
+++ b/pkgs/package_config/lib/src/package_config_impl.dart
@@ -0,0 +1,568 @@
+// Copyright (c) 2019, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'errors.dart';
+import 'package_config.dart';
+import 'util.dart';
+
+export 'package_config.dart';
+
+const bool _disallowPackagesInsidePackageUriRoot = false;
+
+// Implementations of the main data types exposed by the API of this package.
+
+class SimplePackageConfig implements PackageConfig {
+ @override
+ final int version;
+ final Map<String, Package> _packages;
+ final PackageTree _packageTree;
+ @override
+ final Object? extraData;
+
+ factory SimplePackageConfig(int version, Iterable<Package> packages,
+ [Object? extraData, void Function(Object error)? onError]) {
+ onError ??= throwError;
+ var validVersion = _validateVersion(version, onError);
+ var sortedPackages = [...packages]..sort(_compareRoot);
+ var packageTree = _validatePackages(packages, sortedPackages, onError);
+ return SimplePackageConfig._(validVersion, packageTree,
+ {for (var p in packageTree.allPackages) p.name: p}, extraData);
+ }
+
+ SimplePackageConfig._(
+ this.version, this._packageTree, this._packages, this.extraData);
+
+ /// Creates empty configuration.
+ ///
+ /// The empty configuration can be used in cases where no configuration is
+ /// found, but code expects a non-null configuration.
+ ///
+ /// The version number is [PackageConfig.maxVersion] to avoid
+ /// minimum-version filters discarding the configuration.
+ const SimplePackageConfig.empty()
+ : version = PackageConfig.maxVersion,
+ _packageTree = const EmptyPackageTree(),
+ _packages = const <String, Package>{},
+ extraData = null;
+
+ static int _validateVersion(
+ int version, void Function(Object error) onError) {
+ if (version < 0 || version > PackageConfig.maxVersion) {
+ onError(PackageConfigArgumentError(version, 'version',
+ 'Must be in the range 1 to ${PackageConfig.maxVersion}'));
+ return 2; // The minimal version supporting a SimplePackageConfig.
+ }
+ return version;
+ }
+
+ static PackageTree _validatePackages(Iterable<Package> originalPackages,
+ List<Package> packages, void Function(Object error) onError) {
+ var packageNames = <String>{};
+ var tree = TriePackageTree();
+ for (var originalPackage in packages) {
+ SimplePackage? newPackage;
+ if (originalPackage is! SimplePackage) {
+ // SimplePackage validates these properties.
+ newPackage = SimplePackage.validate(
+ originalPackage.name,
+ originalPackage.root,
+ originalPackage.packageUriRoot,
+ originalPackage.languageVersion,
+ originalPackage.extraData,
+ originalPackage.relativeRoot, (error) {
+ if (error is PackageConfigArgumentError) {
+ onError(PackageConfigArgumentError(packages, 'packages',
+ 'Package ${newPackage!.name}: ${error.message}'));
+ } else {
+ onError(error);
+ }
+ });
+ if (newPackage == null) continue;
+ } else {
+ newPackage = originalPackage;
+ }
+ var name = newPackage.name;
+ if (packageNames.contains(name)) {
+ onError(PackageConfigArgumentError(
+ name, 'packages', "Duplicate package name '$name'"));
+ continue;
+ }
+ packageNames.add(name);
+ tree.add(newPackage, (error) {
+ if (error is ConflictException) {
+ // There is a conflict with an existing package.
+ var existingPackage = error.existingPackage;
+ switch (error.conflictType) {
+ case ConflictType.sameRoots:
+ onError(PackageConfigArgumentError(
+ originalPackages,
+ 'packages',
+ 'Packages ${newPackage!.name} and ${existingPackage.name} '
+ 'have the same root directory: ${newPackage.root}.\n'));
+ break;
+ case ConflictType.interleaving:
+ // The new package is inside the package URI root of the existing
+ // package.
+ onError(PackageConfigArgumentError(
+ originalPackages,
+ 'packages',
+ 'Package ${newPackage!.name} is inside the root of '
+ 'package ${existingPackage.name}, and the package root '
+ 'of ${existingPackage.name} is inside the root of '
+ '${newPackage.name}.\n'
+ '${existingPackage.name} package root: '
+ '${existingPackage.packageUriRoot}\n'
+ '${newPackage.name} root: ${newPackage.root}\n'));
+ break;
+ case ConflictType.insidePackageRoot:
+ onError(PackageConfigArgumentError(
+ originalPackages,
+ 'packages',
+ 'Package ${newPackage!.name} is inside the package root of '
+ 'package ${existingPackage.name}.\n'
+ '${existingPackage.name} package root: '
+ '${existingPackage.packageUriRoot}\n'
+ '${newPackage.name} root: ${newPackage.root}\n'));
+ break;
+ }
+ } else {
+ // Any other error.
+ onError(error);
+ }
+ });
+ }
+ return tree;
+ }
+
+ @override
+ Iterable<Package> get packages => _packages.values;
+
+ @override
+ Package? operator [](String packageName) => _packages[packageName];
+
+ @override
+ Package? packageOf(Uri file) => _packageTree.packageOf(file);
+
+ @override
+ Uri? resolve(Uri packageUri) {
+ var packageName = checkValidPackageUri(packageUri, 'packageUri');
+ return _packages[packageName]?.packageUriRoot.resolveUri(
+ Uri(path: packageUri.path.substring(packageName.length + 1)));
+ }
+
+ @override
+ Uri? toPackageUri(Uri nonPackageUri) {
+ if (nonPackageUri.isScheme('package')) {
+ throw PackageConfigArgumentError(
+ nonPackageUri, 'nonPackageUri', 'Must not be a package URI');
+ }
+ if (nonPackageUri.hasQuery || nonPackageUri.hasFragment) {
+ throw PackageConfigArgumentError(nonPackageUri, 'nonPackageUri',
+ 'Must not have query or fragment part');
+ }
+ // Find package that file belongs to.
+ var package = _packageTree.packageOf(nonPackageUri);
+ if (package == null) return null;
+ // Check if it is inside the package URI root.
+ var path = nonPackageUri.toString();
+ var root = package.packageUriRoot.toString();
+ if (_beginsWith(package.root.toString().length, root, path)) {
+ var rest = path.substring(root.length);
+ return Uri(scheme: 'package', path: '${package.name}/$rest');
+ }
+ return null;
+ }
+}
+
+/// Configuration data for a single package.
+class SimplePackage implements Package {
+ @override
+ final String name;
+ @override
+ final Uri root;
+ @override
+ final Uri packageUriRoot;
+ @override
+ final LanguageVersion? languageVersion;
+ @override
+ final Object? extraData;
+ @override
+ final bool relativeRoot;
+
+ SimplePackage._(this.name, this.root, this.packageUriRoot,
+ this.languageVersion, this.extraData, this.relativeRoot);
+
+ /// Creates a [SimplePackage] with the provided content.
+ ///
+ /// The provided arguments must be valid.
+ ///
+ /// If the arguments are invalid then the error is reported by
+ /// calling [onError], then the erroneous entry is ignored.
+ ///
+ /// If [onError] is provided, the user is expected to be able to handle
+ /// errors themselves. An invalid [languageVersion] string
+ /// will be replaced with the string `"invalid"`. This allows
+ /// users to detect the difference between an absent version and
+ /// an invalid one.
+ ///
+ /// Returns `null` if the input is invalid and an approximately valid package
+ /// cannot be salvaged from the input.
+ static SimplePackage? validate(
+ String name,
+ Uri root,
+ Uri? packageUriRoot,
+ LanguageVersion? languageVersion,
+ Object? extraData,
+ bool relativeRoot,
+ void Function(Object error) onError) {
+ var fatalError = false;
+ var invalidIndex = checkPackageName(name);
+ if (invalidIndex >= 0) {
+ onError(PackageConfigFormatException(
+ 'Not a valid package name', name, invalidIndex));
+ fatalError = true;
+ }
+ if (root.isScheme('package')) {
+ onError(PackageConfigArgumentError(
+ '$root', 'root', 'Must not be a package URI'));
+ fatalError = true;
+ } else if (!isAbsoluteDirectoryUri(root)) {
+ onError(PackageConfigArgumentError(
+ '$root',
+ 'root',
+ 'In package $name: Not an absolute URI with no query or fragment '
+ 'with a path ending in /'));
+ // Try to recover. If the URI has a scheme,
+ // then ensure that the path ends with `/`.
+ if (!root.hasScheme) {
+ fatalError = true;
+ } else if (!root.path.endsWith('/')) {
+ root = root.replace(path: '${root.path}/');
+ }
+ }
+ if (packageUriRoot == null) {
+ packageUriRoot = root;
+ } else if (!fatalError) {
+ packageUriRoot = root.resolveUri(packageUriRoot);
+ if (!isAbsoluteDirectoryUri(packageUriRoot)) {
+ onError(PackageConfigArgumentError(
+ packageUriRoot,
+ 'packageUriRoot',
+ 'In package $name: Not an absolute URI with no query or fragment '
+ 'with a path ending in /'));
+ packageUriRoot = root;
+ } else if (!isUriPrefix(root, packageUriRoot)) {
+ onError(PackageConfigArgumentError(packageUriRoot, 'packageUriRoot',
+ 'The package URI root is not below the package root'));
+ packageUriRoot = root;
+ }
+ }
+ if (fatalError) return null;
+ return SimplePackage._(
+ name, root, packageUriRoot, languageVersion, extraData, relativeRoot);
+ }
+}
+
+/// Checks whether [source] is a valid Dart language version string.
+///
+/// The format is (as RegExp) `^(0|[1-9]\d+)\.(0|[1-9]\d+)$`.
+///
+/// Reports a format exception on [onError] if not, or if the numbers
+/// are too large (at most 32-bit signed integers).
+LanguageVersion parseLanguageVersion(
+ String? source, void Function(Object error) onError) {
+ var index = 0;
+ // Reads a positive decimal numeral. Returns the value of the numeral,
+ // or a negative number in case of an error.
+ // Starts at [index] and increments the index to the position after
+ // the numeral.
+  // It is an error if the numeral value is greater than 0x7FFFFFFF.
+ // It is a recoverable error if the numeral starts with leading zeros.
+ int readNumeral() {
+ const maxValue = 0x7FFFFFFF;
+ if (index == source!.length) {
+ onError(PackageConfigFormatException('Missing number', source, index));
+ return -1;
+ }
+ var start = index;
+
+ var char = source.codeUnitAt(index);
+ var digit = char ^ 0x30;
+ if (digit > 9) {
+ onError(PackageConfigFormatException('Missing number', source, index));
+ return -1;
+ }
+ var firstDigit = digit;
+ var value = 0;
+ do {
+ value = value * 10 + digit;
+ if (value > maxValue) {
+ onError(
+ PackageConfigFormatException('Number too large', source, start));
+ return -1;
+ }
+ index++;
+ if (index == source.length) break;
+ char = source.codeUnitAt(index);
+ digit = char ^ 0x30;
+ } while (digit <= 9);
+ if (firstDigit == 0 && index > start + 1) {
+ onError(PackageConfigFormatException(
+ 'Leading zero not allowed', source, start));
+ }
+ return value;
+ }
+
+ var major = readNumeral();
+ if (major < 0) {
+ return SimpleInvalidLanguageVersion(source);
+ }
+ if (index == source!.length || source.codeUnitAt(index) != $dot) {
+ onError(PackageConfigFormatException("Missing '.'", source, index));
+ return SimpleInvalidLanguageVersion(source);
+ }
+ index++;
+ var minor = readNumeral();
+ if (minor < 0) {
+ return SimpleInvalidLanguageVersion(source);
+ }
+ if (index != source.length) {
+ onError(PackageConfigFormatException(
+ 'Unexpected trailing character', source, index));
+ return SimpleInvalidLanguageVersion(source);
+ }
+ return SimpleLanguageVersion(major, minor, source);
+}
+
+abstract class _SimpleLanguageVersionBase implements LanguageVersion {
+ @override
+ int compareTo(LanguageVersion other) {
+ var result = major.compareTo(other.major);
+ if (result != 0) return result;
+ return minor.compareTo(other.minor);
+ }
+}
+
+class SimpleLanguageVersion extends _SimpleLanguageVersionBase {
+ @override
+ final int major;
+ @override
+ final int minor;
+ String? _source;
+ SimpleLanguageVersion(this.major, this.minor, this._source);
+
+ @override
+ bool operator ==(Object other) =>
+ other is LanguageVersion && major == other.major && minor == other.minor;
+
+ @override
+ int get hashCode => (major * 17 ^ minor * 37) & 0x3FFFFFFF;
+
+ @override
+ String toString() => _source ??= '$major.$minor';
+}
+
+class SimpleInvalidLanguageVersion extends _SimpleLanguageVersionBase
+ implements InvalidLanguageVersion {
+ final String? _source;
+ SimpleInvalidLanguageVersion(this._source);
+ @override
+ int get major => -1;
+ @override
+ int get minor => -1;
+
+ @override
+ String toString() => _source!;
+}
+
+abstract class PackageTree {
+ Iterable<Package> get allPackages;
+ SimplePackage? packageOf(Uri file);
+}
+
+class _PackageTrieNode {
+ SimplePackage? package;
+
+ /// Indexed by path segment.
+ Map<String, _PackageTrieNode> map = {};
+}
+
+/// Packages of a package configuration ordered by root path.
+///
+/// A package has a root path and a package root path, where the latter
+/// contains the files exposed by `package:` URIs.
+///
+/// A package is said to be inside another package if the root path URI of
+/// the latter is a prefix of the root path URI of the former.
+///
+/// No two packages of a package configuration may have the same root path.
+/// The package root path of a package must not be inside another package's
+/// root path.
+/// Entire other packages are allowed inside a package's root.
+class TriePackageTree implements PackageTree {
+ /// Indexed by URI scheme.
+ final Map<String, _PackageTrieNode> _map = {};
+
+ /// A list of all packages.
+ final List<SimplePackage> _packages = [];
+
+ @override
+ Iterable<Package> get allPackages sync* {
+ for (var package in _packages) {
+ yield package;
+ }
+ }
+
+ bool _checkConflict(_PackageTrieNode node, SimplePackage newPackage,
+ void Function(Object error) onError) {
+ var existingPackage = node.package;
+ if (existingPackage != null) {
+ // Trying to add package that is inside the existing package.
+ // 1) If it's an exact match it's not allowed (i.e. the roots can't be
+ // the same).
+ if (newPackage.root.path.length == existingPackage.root.path.length) {
+ onError(ConflictException(
+ newPackage, existingPackage, ConflictType.sameRoots));
+ return true;
+ }
+      // 2) The existing package has a packageUriRoot that's inside the
+ // root of the new package.
+ if (_beginsWith(0, newPackage.root.toString(),
+ existingPackage.packageUriRoot.toString())) {
+ onError(ConflictException(
+ newPackage, existingPackage, ConflictType.interleaving));
+ return true;
+ }
+
+ // For internal reasons we allow this (for now). One should still never do
+ // it though.
+ // 3) The new package is inside the packageUriRoot of existing package.
+ if (_disallowPackagesInsidePackageUriRoot) {
+ if (_beginsWith(0, existingPackage.packageUriRoot.toString(),
+ newPackage.root.toString())) {
+ onError(ConflictException(
+ newPackage, existingPackage, ConflictType.insidePackageRoot));
+ return true;
+ }
+ }
+ }
+ return false;
+ }
+
+ /// Tries to add `newPackage` to the tree.
+ ///
+ /// Reports a [ConflictException] if the added package conflicts with an
+ /// existing package.
+ /// It conflicts if its root or package root is the same as an existing
+ /// package's root or package root, is between the two, or if it's inside the
+ /// package root of an existing package.
+ ///
+ /// If a conflict is detected between [newPackage] and a previous package,
+ /// then [onError] is called with a [ConflictException] object
+ /// and the [newPackage] is not added to the tree.
+ ///
+ /// The packages are added in order of their root path.
+ void add(SimplePackage newPackage, void Function(Object error) onError) {
+ var root = newPackage.root;
+ var node = _map[root.scheme] ??= _PackageTrieNode();
+ if (_checkConflict(node, newPackage, onError)) return;
+ var segments = root.pathSegments;
+ // Notice that we're skipping the last segment as it's always the empty
+ // string because roots are directories.
+ for (var i = 0; i < segments.length - 1; i++) {
+ var path = segments[i];
+ node = node.map[path] ??= _PackageTrieNode();
+ if (_checkConflict(node, newPackage, onError)) return;
+ }
+ node.package = newPackage;
+ _packages.add(newPackage);
+ }
+
+ bool _isMatch(
+ String path, _PackageTrieNode node, List<SimplePackage> potential) {
+ var currentPackage = node.package;
+ if (currentPackage != null) {
+ var currentPackageRootLength = currentPackage.root.toString().length;
+ if (path.length == currentPackageRootLength) return true;
+ var currentPackageUriRoot = currentPackage.packageUriRoot.toString();
+ // Is [file] inside the package root of [currentPackage]?
+ if (currentPackageUriRoot.length == currentPackageRootLength ||
+ _beginsWith(currentPackageRootLength, currentPackageUriRoot, path)) {
+ return true;
+ }
+ potential.add(currentPackage);
+ }
+ return false;
+ }
+
+ @override
+ SimplePackage? packageOf(Uri file) {
+ var currentTrieNode = _map[file.scheme];
+ if (currentTrieNode == null) return null;
+ var path = file.toString();
+ var potential = <SimplePackage>[];
+ if (_isMatch(path, currentTrieNode, potential)) {
+ return currentTrieNode.package;
+ }
+ var segments = file.pathSegments;
+
+ for (var i = 0; i < segments.length - 1; i++) {
+ var segment = segments[i];
+ currentTrieNode = currentTrieNode!.map[segment];
+ if (currentTrieNode == null) break;
+ if (_isMatch(path, currentTrieNode, potential)) {
+ return currentTrieNode.package;
+ }
+ }
+ if (potential.isEmpty) return null;
+ return potential.last;
+ }
+}
+
+class EmptyPackageTree implements PackageTree {
+ const EmptyPackageTree();
+
+ @override
+ Iterable<Package> get allPackages => const Iterable<Package>.empty();
+
+ @override
+ SimplePackage? packageOf(Uri file) => null;
+}
+
+/// Checks whether [longerPath] begins with [parentPath].
+///
+/// Skips checking the [start] first characters which are assumed to
+/// already have been matched.
+bool _beginsWith(int start, String parentPath, String longerPath) {
+ if (longerPath.length < parentPath.length) return false;
+ for (var i = start; i < parentPath.length; i++) {
+ if (longerPath.codeUnitAt(i) != parentPath.codeUnitAt(i)) return false;
+ }
+ return true;
+}
+
+enum ConflictType { sameRoots, interleaving, insidePackageRoot }
+
+/// Conflict between packages added to the same configuration.
+///
+/// The [package] conflicts with [existingPackage] if it has
+/// the same root path or the package URI root path
+/// of [existingPackage] is inside the root path of [package].
+class ConflictException {
+ /// The existing package that [package] conflicts with.
+ final SimplePackage existingPackage;
+
+ /// The package that could not be added without a conflict.
+ final SimplePackage package;
+
+  /// The kind of conflict between [package] and [existingPackage].
+ final ConflictType conflictType;
+
+ /// Creates a root conflict between [package] and [existingPackage].
+ ConflictException(this.package, this.existingPackage, this.conflictType);
+}
+
+/// Used for sorting packages by root path.
+int _compareRoot(Package p1, Package p2) =>
+ p1.root.toString().compareTo(p2.root.toString());
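The trie-based `packageOf` lookup above returns the package whose root is the longest matching prefix of the queried file URI, so a nested package root wins over an enclosing one. A minimal sketch of that behavior through the public API (assuming the published `loadPackageConfig` and `PackageConfig.packageOf` surface; the file paths are illustrative):

```
import 'dart:io';

import 'package:package_config/package_config.dart';

Future<void> main() async {
  // Load the project's package configuration.
  var config =
      await loadPackageConfig(File('.dart_tool/package_config.json'));

  // packageOf picks the package whose root is the longest prefix of the
  // file URI, and returns null if no package root contains the file.
  var package = config.packageOf(
      Uri.parse('file:///dart/packages/foo/lib/src/foo.dart'));
  print(package?.name);
}
```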
diff --git a/pkgs/package_config/lib/src/package_config_io.dart b/pkgs/package_config/lib/src/package_config_io.dart
new file mode 100644
index 0000000..8c5773b
--- /dev/null
+++ b/pkgs/package_config/lib/src/package_config_io.dart
@@ -0,0 +1,166 @@
+// Copyright (c) 2019, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+// dart:io dependent functionality for reading and writing configuration files.
+
+import 'dart:convert';
+import 'dart:io';
+import 'dart:typed_data';
+
+import 'errors.dart';
+import 'package_config_impl.dart';
+import 'package_config_json.dart';
+import 'packages_file.dart' as packages_file;
+import 'util.dart';
+import 'util_io.dart';
+
+/// Name of directory where Dart tools store their configuration.
+///
+/// Directory is created in the package root directory.
+const dartToolDirName = '.dart_tool';
+
+/// Name of file containing new package configuration data.
+///
+/// File is stored in the dart tool directory.
+const packageConfigFileName = 'package_config.json';
+
+/// Name of file containing legacy package configuration data.
+///
+/// File is stored in the package root directory.
+const packagesFileName = '.packages';
+
+/// Reads a package configuration file.
+///
+/// Detects whether the [file] is a version one `.packages` file or
+/// a version two `package_config.json` file.
+///
+/// If the [file] is a `.packages` file and [preferNewest] is true,
+/// first checks whether there is an adjacent `.dart_tool/package_config.json`
+/// file, and if so, reads that instead.
+/// If [preferNewest] is false, the specified file is loaded even if it is
+/// a `.packages` file and there is an available `package_config.json` file.
+///
+/// The file must exist and be a normal file.
+Future<PackageConfig> readAnyConfigFile(
+ File file, bool preferNewest, void Function(Object error) onError) async {
+ if (preferNewest && fileName(file.path) == packagesFileName) {
+ var alternateFile = File(
+ pathJoin(dirName(file.path), dartToolDirName, packageConfigFileName));
+ if (alternateFile.existsSync()) {
+ return await readPackageConfigJsonFile(alternateFile, onError);
+ }
+ }
+ Uint8List bytes;
+ try {
+ bytes = await file.readAsBytes();
+ } catch (e) {
+ onError(e);
+ return const SimplePackageConfig.empty();
+ }
+ return parseAnyConfigFile(bytes, file.uri, onError);
+}
+
+/// Like [readAnyConfigFile] but uses a URI and an optional loader.
+Future<PackageConfig> readAnyConfigFileUri(
+ Uri file,
+ Future<Uint8List?> Function(Uri uri)? loader,
+ void Function(Object error) onError,
+ bool preferNewest) async {
+ if (file.isScheme('package')) {
+ throw PackageConfigArgumentError(
+ file, 'file', 'Must not be a package: URI');
+ }
+ if (loader == null) {
+ if (file.isScheme('file')) {
+ return await readAnyConfigFile(File.fromUri(file), preferNewest, onError);
+ }
+ loader = defaultLoader;
+ }
+ if (preferNewest && file.pathSegments.last == packagesFileName) {
+ var alternateFile = file.resolve('$dartToolDirName/$packageConfigFileName');
+ Uint8List? bytes;
+ try {
+ bytes = await loader(alternateFile);
+ } catch (e) {
+ onError(e);
+ return const SimplePackageConfig.empty();
+ }
+ if (bytes != null) {
+ return parsePackageConfigBytes(bytes, alternateFile, onError);
+ }
+ }
+ Uint8List? bytes;
+ try {
+ bytes = await loader(file);
+ } catch (e) {
+ onError(e);
+ return const SimplePackageConfig.empty();
+ }
+ if (bytes == null) {
+ onError(PackageConfigArgumentError(
+ file.toString(), 'file', 'File cannot be read'));
+ return const SimplePackageConfig.empty();
+ }
+ return parseAnyConfigFile(bytes, file, onError);
+}
+
+/// Parses a `.packages` or `package_config.json` file's contents.
+///
+/// Assumes it's a JSON file if the first non-whitespace character
+/// is `{`, otherwise assumes it's a `.packages` file.
+PackageConfig parseAnyConfigFile(
+ Uint8List bytes, Uri file, void Function(Object error) onError) {
+ var firstChar = firstNonWhitespaceChar(bytes);
+ if (firstChar != $lbrace) {
+ // Definitely not a JSON object, probably a .packages.
+ return packages_file.parse(bytes, file, onError);
+ }
+ return parsePackageConfigBytes(bytes, file, onError);
+}
+
+Future<PackageConfig> readPackageConfigJsonFile(
+ File file, void Function(Object error) onError) async {
+ Uint8List bytes;
+ try {
+ bytes = await file.readAsBytes();
+ } catch (error) {
+ onError(error);
+ return const SimplePackageConfig.empty();
+ }
+ return parsePackageConfigBytes(bytes, file.uri, onError);
+}
+
+Future<PackageConfig> readDotPackagesFile(
+ File file, void Function(Object error) onError) async {
+ Uint8List bytes;
+ try {
+ bytes = await file.readAsBytes();
+ } catch (error) {
+ onError(error);
+ return const SimplePackageConfig.empty();
+ }
+ return packages_file.parse(bytes, file.uri, onError);
+}
+
+Future<void> writePackageConfigJsonFile(
+ PackageConfig config, Directory targetDirectory) async {
+ // Write .dart_tool/package_config.json first.
+ var dartToolDir = Directory(pathJoin(targetDirectory.path, dartToolDirName));
+ await dartToolDir.create(recursive: true);
+ var file = File(pathJoin(dartToolDir.path, packageConfigFileName));
+ var baseUri = file.uri;
+
+ var sink = file.openWrite(encoding: utf8);
+ writePackageConfigJsonUtf8(config, baseUri, sink);
+ var doneJson = sink.close();
+
+ // Write .packages too.
+ file = File(pathJoin(targetDirectory.path, packagesFileName));
+ baseUri = file.uri;
+ sink = file.openWrite(encoding: utf8);
+ writeDotPackages(config, baseUri, sink);
+ var donePackages = sink.close();
+
+ await Future.wait([doneJson, donePackages]);
+}
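The `preferNewest` handling above is what lets callers point at a legacy `.packages` file and still pick up an adjacent `.dart_tool/package_config.json`. A hedged sketch of the two modes through the public `loadPackageConfig` entry point (file paths are illustrative):

```
import 'dart:io';

import 'package:package_config/package_config.dart';

Future<void> main() async {
  // Default: if a package_config.json exists next to the .packages file
  // (under .dart_tool/), that newer file is read instead.
  var newest = await loadPackageConfig(File('.packages'));

  // preferNewest: false forces the .packages file itself to be parsed.
  var legacy = await loadPackageConfig(File('.packages'), preferNewest: false);

  print('versions: ${newest.version} and ${legacy.version}');
}
```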
diff --git a/pkgs/package_config/lib/src/package_config_json.dart b/pkgs/package_config/lib/src/package_config_json.dart
new file mode 100644
index 0000000..65560a0
--- /dev/null
+++ b/pkgs/package_config/lib/src/package_config_json.dart
@@ -0,0 +1,321 @@
+// Copyright (c) 2019, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+// Parsing and serialization of package configurations.
+
+import 'dart:convert';
+import 'dart:typed_data';
+
+import 'errors.dart';
+import 'package_config_impl.dart';
+import 'packages_file.dart' as packages_file;
+import 'util.dart';
+
+const String _configVersionKey = 'configVersion';
+const String _packagesKey = 'packages';
+const List<String> _topNames = [_configVersionKey, _packagesKey];
+const String _nameKey = 'name';
+const String _rootUriKey = 'rootUri';
+const String _packageUriKey = 'packageUri';
+const String _languageVersionKey = 'languageVersion';
+const List<String> _packageNames = [
+ _nameKey,
+ _rootUriKey,
+ _packageUriKey,
+ _languageVersionKey
+];
+
+const String _generatedKey = 'generated';
+const String _generatorKey = 'generator';
+const String _generatorVersionKey = 'generatorVersion';
+
+final _jsonUtf8Decoder = json.fuse(utf8).decoder;
+
+PackageConfig parsePackageConfigBytes(
+ Uint8List bytes, Uri file, void Function(Object error) onError) {
+ // TODO(lrn): Make this simpler. Maybe parse directly from bytes.
+ Object? jsonObject;
+ try {
+ jsonObject = _jsonUtf8Decoder.convert(bytes);
+ } on FormatException catch (e) {
+ onError(PackageConfigFormatException.from(e));
+ return const SimplePackageConfig.empty();
+ }
+ return parsePackageConfigJson(jsonObject, file, onError);
+}
+
+PackageConfig parsePackageConfigString(
+ String source, Uri file, void Function(Object error) onError) {
+ Object? jsonObject;
+ try {
+ jsonObject = jsonDecode(source);
+ } on FormatException catch (e) {
+ onError(PackageConfigFormatException.from(e));
+ return const SimplePackageConfig.empty();
+ }
+ return parsePackageConfigJson(jsonObject, file, onError);
+}
+
+/// Creates a [PackageConfig] from a parsed JSON-like object structure.
+///
+/// The [json] argument must be a JSON object (`Map<String, Object?>`)
+/// containing a `"configVersion"` entry with an integer value in the range
+/// 1 to [PackageConfig.maxVersion],
+/// and with a `"packages"` entry which is a JSON array (`List<Object?>`)
+/// containing JSON objects which each has the following properties:
+///
+/// * `"name"`: The package name as a string.
+/// * `"rootUri"`: The root of the package as a URI stored as a string.
+/// * `"packageUri"`: Optionally the root of for `package:` URI resolution
+/// for the package, as a relative URI below the root URI
+/// stored as a string.
+/// * `"languageVersion"`: Optionally a language version string which is a
+/// an integer numeral, a decimal point (`.`) and another integer numeral,
+/// where the integer numeral cannot have a sign, and can only have a
+/// leading zero if the entire numeral is a single zero.
+///
+/// The [baseLocation] is used as base URI to resolve the "rootUri"
+/// URI reference string.
+PackageConfig parsePackageConfigJson(
+ Object? json, Uri baseLocation, void Function(Object error) onError) {
+ if (!baseLocation.hasScheme || baseLocation.isScheme('package')) {
+ throw PackageConfigArgumentError(baseLocation.toString(), 'baseLocation',
+ 'Must be an absolute non-package: URI');
+ }
+
+ if (!baseLocation.path.endsWith('/')) {
+ baseLocation = baseLocation.resolveUri(Uri(path: '.'));
+ }
+
+ String typeName<T>() {
+ if (0 is T) return 'int';
+ if ('' is T) return 'string';
+ if (const <Object?>[] is T) return 'array';
+ return 'object';
+ }
+
+ T? checkType<T>(Object? value, String name, [String? packageName]) {
+ if (value is T) return value;
+ // The only types we are called with are [int], [String], [List<Object?>]
+ // and Map<String, Object?>. Recognize which to give a better error message.
+ var message =
+ "$name${packageName != null ? " of package $packageName" : ""}"
+ ' is not a JSON ${typeName<T>()}';
+ onError(PackageConfigFormatException(message, value));
+ return null;
+ }
+
+ Package? parsePackage(Map<String, Object?> entry) {
+ String? name;
+ String? rootUri;
+ String? packageUri;
+ String? languageVersion;
+ Map<String, Object?>? extraData;
+ var hasName = false;
+ var hasRoot = false;
+ var hasVersion = false;
+ entry.forEach((key, value) {
+ switch (key) {
+ case _nameKey:
+ hasName = true;
+ name = checkType<String>(value, _nameKey);
+ break;
+ case _rootUriKey:
+ hasRoot = true;
+ rootUri = checkType<String>(value, _rootUriKey, name);
+ break;
+ case _packageUriKey:
+ packageUri = checkType<String>(value, _packageUriKey, name);
+ break;
+ case _languageVersionKey:
+ hasVersion = true;
+ languageVersion = checkType<String>(value, _languageVersionKey, name);
+ break;
+ default:
+ (extraData ??= {})[key] = value;
+ break;
+ }
+ });
+ if (!hasName) {
+ onError(PackageConfigFormatException('Missing name entry', entry));
+ }
+ if (!hasRoot) {
+ onError(PackageConfigFormatException('Missing rootUri entry', entry));
+ }
+ if (name == null || rootUri == null) return null;
+ var parsedRootUri = Uri.parse(rootUri!);
+ var relativeRoot = !hasAbsolutePath(parsedRootUri);
+ var root = baseLocation.resolveUri(parsedRootUri);
+ if (!root.path.endsWith('/')) root = root.replace(path: '${root.path}/');
+ var packageRoot = root;
+ if (packageUri != null) packageRoot = root.resolve(packageUri!);
+ if (!packageRoot.path.endsWith('/')) {
+ packageRoot = packageRoot.replace(path: '${packageRoot.path}/');
+ }
+
+ LanguageVersion? version;
+ if (languageVersion != null) {
+ version = parseLanguageVersion(languageVersion, onError);
+ } else if (hasVersion) {
+ version = SimpleInvalidLanguageVersion('invalid');
+ }
+
+ return SimplePackage.validate(
+ name!, root, packageRoot, version, extraData, relativeRoot, (error) {
+ if (error is ArgumentError) {
+ onError(
+ PackageConfigFormatException(
+ error.message.toString(), error.invalidValue),
+ );
+ } else {
+ onError(error);
+ }
+ });
+ }
+
+ var map = checkType<Map<String, Object?>>(json, 'value');
+ if (map == null) return const SimplePackageConfig.empty();
+ Map<String, Object?>? extraData;
+ List<Package>? packageList;
+ int? configVersion;
+ map.forEach((key, value) {
+ switch (key) {
+ case _configVersionKey:
+ configVersion = checkType<int>(value, _configVersionKey) ?? 2;
+ break;
+ case _packagesKey:
+ var packageArray = checkType<List<Object?>>(value, _packagesKey) ?? [];
+ var packages = <Package>[];
+ for (var package in packageArray) {
+ var packageMap =
+ checkType<Map<String, Object?>>(package, 'package entry');
+ if (packageMap != null) {
+ var entry = parsePackage(packageMap);
+ if (entry != null) {
+ packages.add(entry);
+ }
+ }
+ }
+ packageList = packages;
+ break;
+ default:
+ (extraData ??= {})[key] = value;
+ break;
+ }
+ });
+ if (configVersion == null) {
+ onError(PackageConfigFormatException('Missing configVersion entry', json));
+ configVersion = 2;
+ }
+ if (packageList == null) {
+ onError(PackageConfigFormatException('Missing packages list', json));
+ packageList = [];
+ }
+ return SimplePackageConfig(configVersion!, packageList!, extraData, (error) {
+ if (error is ArgumentError) {
+ onError(
+ PackageConfigFormatException(
+ error.message.toString(), error.invalidValue),
+ );
+ } else {
+ onError(error);
+ }
+ });
+}
+
+final _jsonUtf8Encoder = JsonUtf8Encoder(' ');
+
+void writePackageConfigJsonUtf8(
+ PackageConfig config, Uri? baseUri, Sink<List<int>> output) {
+ // Can be optimized.
+ var data = packageConfigToJson(config, baseUri);
+ output.add(_jsonUtf8Encoder.convert(data) as Uint8List);
+}
+
+void writePackageConfigJsonString(
+ PackageConfig config, Uri? baseUri, StringSink output) {
+ // Can be optimized.
+ var data = packageConfigToJson(config, baseUri);
+ output.write(const JsonEncoder.withIndent(' ').convert(data));
+}
+
+Map<String, Object?> packageConfigToJson(PackageConfig config, Uri? baseUri) =>
+ <String, Object?>{
+ ...?_extractExtraData(config.extraData, _topNames),
+ _configVersionKey: PackageConfig.maxVersion,
+ _packagesKey: [
+ for (var package in config.packages)
+ <String, Object?>{
+ _nameKey: package.name,
+ _rootUriKey: trailingSlash((package.relativeRoot
+ ? relativizeUri(package.root, baseUri)
+ : package.root)
+ .toString()),
+ if (package.root != package.packageUriRoot)
+ _packageUriKey: trailingSlash(
+ relativizeUri(package.packageUriRoot, package.root)
+ .toString()),
+ if (package.languageVersion != null &&
+ package.languageVersion is! InvalidLanguageVersion)
+ _languageVersionKey: package.languageVersion.toString(),
+ ...?_extractExtraData(package.extraData, _packageNames),
+ }
+ ],
+ };
+
+void writeDotPackages(PackageConfig config, Uri baseUri, StringSink output) {
+ var extraData = config.extraData;
+ // Write .packages too.
+ String? comment;
+ if (extraData is Map<String, Object?>) {
+ var generator = extraData[_generatorKey];
+ if (generator is String) {
+ var generated = extraData[_generatedKey];
+ var generatorVersion = extraData[_generatorVersionKey];
+ comment = 'Generated by $generator'
+ "${generatorVersion is String ? " $generatorVersion" : ""}"
+ "${generated is String ? " on $generated" : ""}.";
+ }
+ }
+ packages_file.write(output, config, baseUri: baseUri, comment: comment);
+}
+
+/// If "extraData" is a JSON map, then return it, otherwise return null.
+///
+/// If the value contains any of the [reservedNames] for the current context,
+/// entries with that name in the extra data are dropped.
+Map<String, Object?>? _extractExtraData(
+ Object? data, Iterable<String> reservedNames) {
+ if (data is Map<String, Object?>) {
+ if (data.isEmpty) return null;
+ for (var name in reservedNames) {
+ if (data.containsKey(name)) {
+ var filteredData = {
+ for (var key in data.keys)
+ if (!reservedNames.contains(key)) key: data[key]
+ };
+ if (filteredData.isEmpty) return null;
+ for (var value in filteredData.values) {
+ if (!_validateJson(value)) return null;
+ }
+ return filteredData;
+ }
+ }
+ return data;
+ }
+ return null;
+}
+
+/// Checks that the object is a valid JSON-like data structure.
+bool _validateJson(Object? object) {
+ if (object == null || true == object || false == object) return true;
+ if (object is num || object is String) return true;
+ if (object is List<Object?>) {
+ return object.every(_validateJson);
+ }
+ if (object is Map<String, Object?>) {
+ return object.values.every(_validateJson);
+ }
+ return false;
+}
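For reference, a minimal `package_config.json` document in the shape `parsePackageConfigJson` accepts (mirroring the fixtures used in the tests further down); `packageUri` and `languageVersion` are optional per package:

```
{
  "configVersion": 2,
  "packages": [
    {
      "name": "foo",
      "rootUri": "../packages/foo/",
      "packageUri": "lib/",
      "languageVersion": "2.12"
    }
  ]
}
```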
diff --git a/pkgs/package_config/lib/src/packages_file.dart b/pkgs/package_config/lib/src/packages_file.dart
new file mode 100644
index 0000000..bf68f2c
--- /dev/null
+++ b/pkgs/package_config/lib/src/packages_file.dart
@@ -0,0 +1,193 @@
+// Copyright (c) 2019, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'errors.dart';
+import 'package_config_impl.dart';
+import 'util.dart';
+
+/// The language version prior to the release of language versioning.
+///
+/// This is the default language version used by all packages from a
+/// `.packages` file.
+final LanguageVersion _languageVersion = LanguageVersion(2, 7);
+
+/// Parses a `.packages` file into a [PackageConfig].
+///
+/// The [source] is the byte content of a `.packages` file, assumed to be
+/// UTF-8 encoded. In practice, all significant parts of the file must be ASCII,
+/// so Latin-1 or Windows-1252 encoding will also work fine.
+///
+/// If the file content is available as a string, its [String.codeUnits] can
+/// be used as the `source` argument of this function.
+///
+/// The [baseLocation] is used as a base URI to resolve all relative
+/// URI references against.
+/// If the content was read from a file, `baseLocation` should be the
+/// location of that file.
+///
+/// Returns a simple package configuration where each package's
+/// [Package.packageUriRoot] is the same as its [Package.root]
+/// and it has no [Package.languageVersion].
+PackageConfig parse(
+ List<int> source, Uri baseLocation, void Function(Object error) onError) {
+ if (baseLocation.isScheme('package')) {
+ onError(PackageConfigArgumentError(
+ baseLocation, 'baseLocation', 'Must not be a package: URI'));
+ return PackageConfig.empty;
+ }
+ var index = 0;
+ var packages = <Package>[];
+ var packageNames = <String>{};
+ while (index < source.length) {
+ var ignoreLine = false;
+ var start = index;
+ var separatorIndex = -1;
+ var end = source.length;
+ var char = source[index++];
+ if (char == $cr || char == $lf) {
+ continue;
+ }
+ if (char == $colon) {
+ onError(PackageConfigFormatException(
+ 'Missing package name', source, index - 1));
+ ignoreLine = true; // Ignore if package name is invalid.
+ } else {
+ ignoreLine = char == $hash; // Ignore if comment.
+ }
+ var queryStart = -1;
+ var fragmentStart = -1;
+ while (index < source.length) {
+ char = source[index++];
+ if (char == $colon && separatorIndex < 0) {
+ separatorIndex = index - 1;
+ } else if (char == $cr || char == $lf) {
+ end = index - 1;
+ break;
+ } else if (char == $question && queryStart < 0 && fragmentStart < 0) {
+ queryStart = index - 1;
+ } else if (char == $hash && fragmentStart < 0) {
+ fragmentStart = index - 1;
+ }
+ }
+ if (ignoreLine) continue;
+ if (separatorIndex < 0) {
+ onError(
+ PackageConfigFormatException("No ':' on line", source, index - 1));
+ continue;
+ }
+ var packageName = String.fromCharCodes(source, start, separatorIndex);
+ var invalidIndex = checkPackageName(packageName);
+ if (invalidIndex >= 0) {
+ onError(PackageConfigFormatException(
+ 'Not a valid package name', source, start + invalidIndex));
+ continue;
+ }
+ if (queryStart >= 0) {
+ onError(PackageConfigFormatException(
+ 'Location URI must not have query', source, queryStart));
+ end = queryStart;
+ } else if (fragmentStart >= 0) {
+ onError(PackageConfigFormatException(
+ 'Location URI must not have fragment', source, fragmentStart));
+ end = fragmentStart;
+ }
+ var packageValue = String.fromCharCodes(source, separatorIndex + 1, end);
+ Uri packageLocation;
+ try {
+ packageLocation = Uri.parse(packageValue);
+ } on FormatException catch (e) {
+ onError(PackageConfigFormatException.from(e));
+ continue;
+ }
+ var relativeRoot = !hasAbsolutePath(packageLocation);
+ packageLocation = baseLocation.resolveUri(packageLocation);
+ if (packageLocation.isScheme('package')) {
+ onError(PackageConfigFormatException(
+ 'Package URI as location for package', source, separatorIndex + 1));
+ continue;
+ }
+ var path = packageLocation.path;
+ if (!path.endsWith('/')) {
+ path += '/';
+ packageLocation = packageLocation.replace(path: path);
+ }
+ if (packageNames.contains(packageName)) {
+ onError(PackageConfigFormatException(
+ 'Same package name occurred more than once', source, start));
+ continue;
+ }
+ var rootUri = packageLocation;
+ if (path.endsWith('/lib/')) {
+ // Assume default Pub package layout. Include package itself in root.
+ rootUri =
+ packageLocation.replace(path: path.substring(0, path.length - 4));
+ }
+ var package = SimplePackage.validate(packageName, rootUri, packageLocation,
+ _languageVersion, null, relativeRoot, (error) {
+ if (error is ArgumentError) {
+ onError(PackageConfigFormatException(error.message.toString(), source));
+ } else {
+ onError(error);
+ }
+ });
+ if (package != null) {
+ packages.add(package);
+ packageNames.add(packageName);
+ }
+ }
+ return SimplePackageConfig(1, packages, null, onError);
+}
+
+/// Writes the configuration to a [StringSink].
+///
+/// If [comment] is provided, the output will contain this comment
+/// with `# ` in front of each line.
+/// Lines are defined as ending in line feed (`'\n'`). If the final
+/// line of the comment doesn't end in a line feed, one will be added.
+///
+/// If [baseUri] is provided, package locations will be made relative
+/// to the base URI, if possible, before writing.
+void write(StringSink output, PackageConfig config,
+ {Uri? baseUri, String? comment}) {
+ if (baseUri != null && !baseUri.isAbsolute) {
+ throw PackageConfigArgumentError(baseUri, 'baseUri', 'Must be absolute');
+ }
+
+ if (comment != null) {
+ var lines = comment.split('\n');
+ if (lines.last.isEmpty) lines.removeLast();
+ for (var commentLine in lines) {
+ output.write('# ');
+ output.writeln(commentLine);
+ }
+ } else {
+ output.write('# generated by package:package_config at ');
+ output.write(DateTime.now());
+ output.writeln();
+ }
+ for (var package in config.packages) {
+ var packageName = package.name;
+ var uri = package.packageUriRoot;
+ // Validate packageName.
+ if (!isValidPackageName(packageName)) {
+ throw PackageConfigArgumentError(
+ config, 'config', '"$packageName" is not a valid package name');
+ }
+ if (uri.scheme == 'package') {
+ throw PackageConfigArgumentError(
+ config, 'config', 'Package location must not be a package URI: $uri');
+ }
+ output.write(packageName);
+ output.write(':');
+ // If baseUri is provided, make the URI relative to baseUri.
+ if (baseUri != null) {
+ uri = relativizeUri(uri, baseUri)!;
+ }
+ if (!uri.path.endsWith('/')) {
+ uri = uri.replace(path: '${uri.path}/');
+ }
+ output.write(uri);
+ output.writeln();
+ }
+}
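For comparison, a small legacy `.packages` file of the kind `parse` above accepts (entries are illustrative). Each non-comment line is `name:location`; the location becomes the package URI root, and, per the default pub layout handling above, a location ending in `/lib/` makes the enclosing directory the package root:

```
# A comment
foo:file:///dart/packages/foo/lib/
bar:../packages/bar/lib/
```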
diff --git a/pkgs/package_config/lib/src/util.dart b/pkgs/package_config/lib/src/util.dart
new file mode 100644
index 0000000..4f0210c
--- /dev/null
+++ b/pkgs/package_config/lib/src/util.dart
@@ -0,0 +1,253 @@
+// Copyright (c) 2019, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+/// Utility methods used by more than one library in the package.
+library;
+
+import 'errors.dart';
+
+// All ASCII characters that are valid in a package name, with space
+// for all the invalid ones (including space).
+const String _validPackageNameCharacters =
+ r" ! $ &'()*+,-. 0123456789 ; = "
+ r'@ABCDEFGHIJKLMNOPQRSTUVWXYZ _ abcdefghijklmnopqrstuvwxyz ~ ';
+
+/// Tests whether something is a valid Dart package name.
+bool isValidPackageName(String string) {
+ return checkPackageName(string) < 0;
+}
+
+/// Check if a string is a valid package name.
+///
+/// Valid package names contain only characters in [_validPackageNameCharacters]
+/// and must contain at least one non-'.' character.
+///
+/// Returns `-1` if the string is valid.
+/// Otherwise returns the index of the first invalid character,
+/// or `string.length` if the string contains no non-'.' character.
+int checkPackageName(String string) {
+ // Becomes non-zero if any non-'.' character is encountered.
+ var nonDot = 0;
+ for (var i = 0; i < string.length; i++) {
+ var c = string.codeUnitAt(i);
+ if (c > 0x7f || _validPackageNameCharacters.codeUnitAt(c) <= $space) {
+ return i;
+ }
+ nonDot += c ^ $dot;
+ }
+ if (nonDot == 0) return string.length;
+ return -1;
+}
+
+/// Validate that a [Uri] is a valid `package:` URI.
+///
+/// Used to validate user input.
+///
+/// Returns the package name extracted from the package URI,
+/// which is the path segment between `package:` and the first `/`.
+String checkValidPackageUri(Uri packageUri, String name) {
+ if (packageUri.scheme != 'package') {
+ throw PackageConfigArgumentError(packageUri, name, 'Not a package: URI');
+ }
+ if (packageUri.hasAuthority) {
+ throw PackageConfigArgumentError(
+ packageUri, name, 'Package URIs must not have a host part');
+ }
+ if (packageUri.hasQuery) {
+ // A query makes no sense if resolved to a file: URI.
+ throw PackageConfigArgumentError(
+ packageUri, name, 'Package URIs must not have a query part');
+ }
+ if (packageUri.hasFragment) {
+ // We could leave the fragment after the URL when resolving,
+ // but it would be odd if "package:foo/foo.dart#1" and
+ // "package:foo/foo.dart#2" were considered different libraries.
+ // Keep the syntax open in case we ever get multiple libraries in one file.
+ throw PackageConfigArgumentError(
+ packageUri, name, 'Package URIs must not have a fragment part');
+ }
+ if (packageUri.path.startsWith('/')) {
+ throw PackageConfigArgumentError(
+ packageUri, name, "Package URIs must not start with a '/'");
+ }
+ var firstSlash = packageUri.path.indexOf('/');
+ if (firstSlash == -1) {
+ throw PackageConfigArgumentError(packageUri, name,
+ "Package URIs must start with the package name followed by a '/'");
+ }
+ var packageName = packageUri.path.substring(0, firstSlash);
+ var badIndex = checkPackageName(packageName);
+ if (badIndex >= 0) {
+ if (packageName.isEmpty) {
+ throw PackageConfigArgumentError(
+          packageUri, name, 'Package names must be non-empty');
+ }
+ if (badIndex == packageName.length) {
+ throw PackageConfigArgumentError(packageUri, name,
+ "Package names must contain at least one non-'.' character");
+ }
+ assert(badIndex < packageName.length);
+ var badCharCode = packageName.codeUnitAt(badIndex);
+ var badChar = 'U+${badCharCode.toRadixString(16).padLeft(4, '0')}';
+ if (badCharCode >= 0x20 && badCharCode <= 0x7e) {
+ // Printable character.
+ badChar = "'${packageName[badIndex]}' ($badChar)";
+ }
+ throw PackageConfigArgumentError(
+ packageUri, name, 'Package names must not contain $badChar');
+ }
+ return packageName;
+}
+
+/// Checks whether URI is just an absolute directory.
+///
+/// * It must have a scheme.
+/// * It must not have a query or fragment.
+/// * The path must end with `/`.
+bool isAbsoluteDirectoryUri(Uri uri) {
+ if (uri.hasQuery) return false;
+ if (uri.hasFragment) return false;
+ if (!uri.hasScheme) return false;
+ var path = uri.path;
+ if (!path.endsWith('/')) return false;
+ return true;
+}
+
+/// Whether the former URI is a prefix of the latter.
+bool isUriPrefix(Uri prefix, Uri path) {
+ assert(!prefix.hasFragment);
+ assert(!prefix.hasQuery);
+ assert(!path.hasQuery);
+ assert(!path.hasFragment);
+ assert(prefix.path.endsWith('/'));
+ return path.toString().startsWith(prefix.toString());
+}
+
+/// Finds the first non-JSON-whitespace character in a file.
+///
+/// Used to heuristically detect whether a file is a JSON file or an .ini file.
+int firstNonWhitespaceChar(List<int> bytes) {
+ for (var i = 0; i < bytes.length; i++) {
+ var char = bytes[i];
+ if (char != 0x20 && char != 0x09 && char != 0x0a && char != 0x0d) {
+ return char;
+ }
+ }
+ return -1;
+}
+
+/// Appends a trailing `/` if the path doesn't end with one.
+String trailingSlash(String path) {
+ if (path.isEmpty || path.endsWith('/')) return path;
+ return '$path/';
+}
+
+/// Whether a URI should not be considered relative to the base URI.
+///
+/// Used to determine whether a parsed root URI is relative
+/// to the configuration file or not.
+/// If it is relative, then it's rewritten as relative when
+/// output again later. If not, it's output as absolute.
+bool hasAbsolutePath(Uri uri) =>
+ uri.hasScheme || uri.hasAuthority || uri.hasAbsolutePath;
+
+/// Attempts to return a relative path-only URI for [uri].
+///
+/// First removes any query or fragment part from [uri].
+///
+/// If [uri] is already relative (has no scheme), it's returned as-is.
+/// If that is not desired, the caller can pass `baseUri.resolveUri(uri)`
+/// as the [uri] instead.
+///
+/// If the [uri] has a scheme or authority part which differs from
+/// the [baseUri], or if there is no overlap in the paths of the
+/// two URIs at all, the [uri] is returned as-is.
+///
+/// Otherwise the result is a path-only URI which satisfies
+/// `baseUri.resolveUri(result) == uri`.
+///
+/// The `baseUri` must be absolute.
+Uri? relativizeUri(Uri? uri, Uri? baseUri) {
+ if (baseUri == null) return uri;
+ assert(baseUri.isAbsolute);
+ if (uri!.hasQuery || uri.hasFragment) {
+ uri = Uri(
+ scheme: uri.scheme,
+ userInfo: uri.hasAuthority ? uri.userInfo : null,
+ host: uri.hasAuthority ? uri.host : null,
+ port: uri.hasAuthority ? uri.port : null,
+ path: uri.path);
+ }
+
+ // Already relative. We assume the caller knows what they are doing.
+ if (!uri.isAbsolute) return uri;
+
+ if (baseUri.scheme != uri.scheme) {
+ return uri;
+ }
+
+ // If authority differs, we could remove the scheme, but it's not worth it.
+ if (uri.hasAuthority != baseUri.hasAuthority) return uri;
+ if (uri.hasAuthority) {
+ if (uri.userInfo != baseUri.userInfo ||
+ uri.host.toLowerCase() != baseUri.host.toLowerCase() ||
+ uri.port != baseUri.port) {
+ return uri;
+ }
+ }
+
+ baseUri = baseUri.normalizePath();
+ var base = [...baseUri.pathSegments];
+ if (base.isNotEmpty) base.removeLast();
+ uri = uri.normalizePath();
+ var target = [...uri.pathSegments];
+ if (target.isNotEmpty && target.last.isEmpty) target.removeLast();
+ var index = 0;
+ while (index < base.length && index < target.length) {
+ if (base[index] != target[index]) {
+ break;
+ }
+ index++;
+ }
+ if (index == base.length) {
+ if (index == target.length) {
+ return Uri(path: './');
+ }
+ return Uri(path: target.skip(index).join('/'));
+ } else if (index > 0) {
+ var buffer = StringBuffer();
+ for (var n = base.length - index; n > 0; --n) {
+ buffer.write('../');
+ }
+ buffer.writeAll(target.skip(index), '/');
+ return Uri(path: buffer.toString());
+ } else {
+ return uri;
+ }
+}
+
+// Character constants used by this package.
+/// "Line feed" control character.
+const int $lf = 0x0a;
+
+/// "Carriage return" control character.
+const int $cr = 0x0d;
+
+/// Space character.
+const int $space = 0x20;
+
+/// Character `#`.
+const int $hash = 0x23;
+
+/// Character `.`.
+const int $dot = 0x2e;
+
+/// Character `:`.
+const int $colon = 0x3a;
+
+/// Character `?`.
+const int $question = 0x3f;
+
+/// Character `{`.
+const int $lbrace = 0x7b;
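A small illustration of `relativizeUri`, which the serializers above use to emit relative `rootUri`/`packageUri` values (importing the internal library the same way the benchmark below does; the URIs are made up):

```
import 'package:package_config/src/util.dart';

void main() {
  var base = Uri.parse('file:///work/pkg/');

  // Shares the base's path prefix: becomes a relative path-only URI.
  print(relativizeUri(Uri.parse('file:///work/pkg/lib/'), base)); // lib

  // Sibling directory: goes up and back down.
  print(relativizeUri(Uri.parse('file:///work/other/lib/'), base)); // ../other/lib

  // Different scheme: returned unchanged.
  print(relativizeUri(Uri.parse('https://example.com/x/'), base));
}
```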
diff --git a/pkgs/package_config/lib/src/util_io.dart b/pkgs/package_config/lib/src/util_io.dart
new file mode 100644
index 0000000..4680eef
--- /dev/null
+++ b/pkgs/package_config/lib/src/util_io.dart
@@ -0,0 +1,108 @@
+// Copyright (c) 2020, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+/// Utility methods requiring dart:io and used by more than one library in the
+/// package.
+library;
+
+import 'dart:io';
+import 'dart:typed_data';
+
+Future<Uint8List?> defaultLoader(Uri uri) async {
+ if (uri.isScheme('file')) {
+ var file = File.fromUri(uri);
+ try {
+ return await file.readAsBytes();
+ } catch (_) {
+ return null;
+ }
+ }
+ if (uri.isScheme('http') || uri.isScheme('https')) {
+ return _httpGet(uri);
+ }
+ throw UnsupportedError('Default URI unsupported scheme: $uri');
+}
+
+Future<Uint8List?> _httpGet(Uri uri) async {
+ assert(uri.isScheme('http') || uri.isScheme('https'));
+ var client = HttpClient();
+ var request = await client.getUrl(uri);
+ var response = await request.close();
+ if (response.statusCode != HttpStatus.ok) {
+ return null;
+ }
+ var splitContent = await response.toList();
+ var totalLength = 0;
+ if (splitContent.length == 1) {
+ var part = splitContent[0];
+ if (part is Uint8List) {
+ return part;
+ }
+ }
+ for (var list in splitContent) {
+ totalLength += list.length;
+ }
+ var result = Uint8List(totalLength);
+ var offset = 0;
+ for (var contentPart in splitContent as Iterable<Uint8List>) {
+ result.setRange(offset, offset + contentPart.length, contentPart);
+ offset += contentPart.length;
+ }
+ return result;
+}
+
+/// The file name of a path.
+///
+/// The file name is everything after the last occurrence of
+/// [Platform.pathSeparator], or the entire string if no
+/// path separator occurs in the string.
+String fileName(String path) {
+ var separator = Platform.pathSeparator;
+ var lastSeparator = path.lastIndexOf(separator);
+ if (lastSeparator < 0) return path;
+ return path.substring(lastSeparator + separator.length);
+}
+
+/// The directory name of a path.
+///
+/// The directory name is everything before the last occurrence of
+/// [Platform.pathSeparator], or the empty string if no
+/// path separator occurs in the string.
+String dirName(String path) {
+ var separator = Platform.pathSeparator;
+ var lastSeparator = path.lastIndexOf(separator);
+ if (lastSeparator < 0) return '';
+ return path.substring(0, lastSeparator);
+}
+
+/// Join path parts with the [Platform.pathSeparator].
+///
+/// If a part ends with a path separator, then no extra separator is
+/// inserted.
+String pathJoin(String part1, String part2, [String? part3]) {
+ var separator = Platform.pathSeparator;
+ var separator1 = part1.endsWith(separator) ? '' : separator;
+ if (part3 == null) {
+ return '$part1$separator1$part2';
+ }
+ var separator2 = part2.endsWith(separator) ? '' : separator;
+ return '$part1$separator1$part2$separator2$part3';
+}
+
+/// Join an unknown number of path parts with [Platform.pathSeparator].
+///
+/// If a part ends with a path separator, then no extra separator is
+/// inserted.
+String pathJoinAll(Iterable<String> parts) {
+ var buffer = StringBuffer();
+ var separator = '';
+ for (var part in parts) {
+ buffer
+ ..write(separator)
+ ..write(part);
+ separator =
+ part.endsWith(Platform.pathSeparator) ? '' : Platform.pathSeparator;
+ }
+ return buffer.toString();
+}
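A quick sketch of the path helpers above (assuming a platform whose `Platform.pathSeparator` is `/`; on Windows the separator differs):

```
import 'package:package_config/src/util_io.dart';

void main() {
  print(pathJoin('/tmp/proj', '.dart_tool', 'package_config.json'));
  // /tmp/proj/.dart_tool/package_config.json

  print(fileName('/tmp/proj/.packages')); // .packages
  print(dirName('/tmp/proj/.packages'));  // /tmp/proj
}
```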
diff --git a/pkgs/package_config/pubspec.yaml b/pkgs/package_config/pubspec.yaml
new file mode 100644
index 0000000..28f3e13
--- /dev/null
+++ b/pkgs/package_config/pubspec.yaml
@@ -0,0 +1,14 @@
+name: package_config
+version: 2.1.1
+description: Support for reading and writing Dart Package Configuration files.
+repository: https://github.com/dart-lang/tools/tree/main/pkgs/package_config
+
+environment:
+ sdk: ^3.4.0
+
+dependencies:
+ path: ^1.8.0
+
+dev_dependencies:
+ dart_flutter_team_lints: ^3.0.0
+ test: ^1.16.0
diff --git a/pkgs/package_config/test/bench.dart b/pkgs/package_config/test/bench.dart
new file mode 100644
index 0000000..8428481
--- /dev/null
+++ b/pkgs/package_config/test/bench.dart
@@ -0,0 +1,71 @@
+// Copyright (c) 2021, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:convert';
+import 'dart:typed_data';
+
+import 'package:package_config/src/errors.dart';
+import 'package:package_config/src/package_config_json.dart';
+
+void bench(final int size, final bool doPrint) {
+ var sb = StringBuffer();
+ sb.writeln('{');
+ sb.writeln('"configVersion": 2,');
+ sb.writeln('"packages": [');
+ for (var i = 0; i < size; i++) {
+ if (i != 0) {
+ sb.writeln(',');
+ }
+ sb.writeln('{');
+ sb.writeln(' "name": "p_$i",');
+ sb.writeln(' "rootUri": "file:///p_$i/",');
+ sb.writeln(' "packageUri": "lib/",');
+ sb.writeln(' "languageVersion": "2.5",');
+ sb.writeln(' "nonstandard": true');
+ sb.writeln('}');
+ }
+ sb.writeln('],');
+ sb.writeln('"generator": "pub",');
+ sb.writeln('"other": [42]');
+ sb.writeln('}');
+ var stopwatch = Stopwatch()..start();
+ var config = parsePackageConfigBytes(
+ // ignore: unnecessary_cast
+ utf8.encode(sb.toString()) as Uint8List,
+ Uri.parse('file:///tmp/.dart_tool/file.dart'),
+ throwError,
+ );
+ final read = stopwatch.elapsedMilliseconds;
+
+ stopwatch.reset();
+ for (var i = 0; i < size; i++) {
+ if (config.packageOf(Uri.parse('file:///p_$i/lib/src/foo.dart'))!.name !=
+ 'p_$i') {
+ throw StateError('Unexpected result!');
+ }
+ }
+ final lookup = stopwatch.elapsedMilliseconds;
+
+ if (doPrint) {
+ print('Read file with $size packages in $read ms, '
+ 'looked up all packages in $lookup ms');
+ }
+}
+
+void main(List<String> args) {
+ if (args.length != 1 && args.length != 2) {
+ throw ArgumentError('Expects arguments: <size> <warmup iterations>?');
+ }
+ final size = int.parse(args[0]);
+ if (args.length > 1) {
+ final warmups = int.parse(args[1]);
+ print('Performing $warmups warmup iterations.');
+ for (var i = 0; i < warmups; i++) {
+ bench(10, false);
+ }
+ }
+
+ // Benchmark.
+ bench(size, true);
+}
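The benchmark takes a package count and an optional number of warmup iterations, so it can be run directly, for example from `pkgs/package_config/` (the numbers are arbitrary):

```
dart test/bench.dart 10000 3
```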
diff --git a/pkgs/package_config/test/discovery_test.dart b/pkgs/package_config/test/discovery_test.dart
new file mode 100644
index 0000000..6d1b655
--- /dev/null
+++ b/pkgs/package_config/test/discovery_test.dart
@@ -0,0 +1,346 @@
+// Copyright (c) 2019, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+@TestOn('vm')
+library;
+
+import 'dart:io';
+
+import 'package:package_config/package_config.dart';
+import 'package:test/test.dart';
+
+import 'src/util.dart';
+import 'src/util_io.dart';
+
+const packagesFile = '''
+# A comment
+foo:file:///dart/packages/foo/
+bar:/dart/packages/bar/
+baz:packages/baz/
+''';
+
+const packageConfigFile = '''
+{
+ "configVersion": 2,
+ "packages": [
+ {
+ "name": "foo",
+ "rootUri": "file:///dart/packages/foo/"
+ },
+ {
+ "name": "bar",
+ "rootUri": "/dart/packages/bar/"
+ },
+ {
+ "name": "baz",
+ "rootUri": "../packages/baz/"
+ }
+ ],
+ "extra": [42]
+}
+''';
+
+void validatePackagesFile(PackageConfig resolver, Directory directory) {
+ expect(resolver, isNotNull);
+ expect(resolver.resolve(pkg('foo', 'bar/baz')),
+ equals(Uri.parse('file:///dart/packages/foo/bar/baz')));
+ expect(resolver.resolve(pkg('bar', 'baz/qux')),
+ equals(Uri.parse('file:///dart/packages/bar/baz/qux')));
+ expect(resolver.resolve(pkg('baz', 'qux/foo')),
+ equals(Uri.directory(directory.path).resolve('packages/baz/qux/foo')));
+ expect([for (var p in resolver.packages) p.name],
+ unorderedEquals(['foo', 'bar', 'baz']));
+}
+
+void main() {
+ group('findPackages', () {
+ // Finds package_config.json if there.
+ fileTest('package_config.json', {
+ '.packages': 'invalid .packages file',
+ 'script.dart': 'main(){}',
+ 'packages': {'shouldNotBeFound': <Never, Never>{}},
+ '.dart_tool': {
+ 'package_config.json': packageConfigFile,
+ }
+ }, (Directory directory) async {
+ var config = (await findPackageConfig(directory))!;
+ expect(config.version, 2); // Found package_config.json file.
+ validatePackagesFile(config, directory);
+ });
+
+ // Finds .packages if no package_config.json.
+ fileTest('.packages', {
+ '.packages': packagesFile,
+ 'script.dart': 'main(){}',
+ 'packages': {'shouldNotBeFound': <Object, Object>{}}
+ }, (Directory directory) async {
+ var config = (await findPackageConfig(directory))!;
+ expect(config.version, 1); // Found .packages file.
+ validatePackagesFile(config, directory);
+ });
+
+ // Finds package_config.json in super-directory.
+ fileTest('package_config.json recursive', {
+ '.packages': packagesFile,
+ '.dart_tool': {
+ 'package_config.json': packageConfigFile,
+ },
+ 'subdir': {
+ 'script.dart': 'main(){}',
+ }
+ }, (Directory directory) async {
+ var config = (await findPackageConfig(subdir(directory, 'subdir/')))!;
+ expect(config.version, 2);
+ validatePackagesFile(config, directory);
+ });
+
+ // Finds .packages in super-directory.
+ fileTest('.packages recursive', {
+ '.packages': packagesFile,
+ 'subdir': {'script.dart': 'main(){}'}
+ }, (Directory directory) async {
+ var config = (await findPackageConfig(subdir(directory, 'subdir/')))!;
+ expect(config.version, 1);
+ validatePackagesFile(config, directory);
+ });
+
+ // Does not find a packages/ directory, and returns null if nothing found.
+ fileTest('package directory packages not supported', {
+ 'packages': {
+ 'foo': <String, dynamic>{},
+ }
+ }, (Directory directory) async {
+ var config = await findPackageConfig(directory);
+ expect(config, null);
+ });
+
+ group('throws', () {
+ fileTest('invalid .packages', {
+ '.packages': 'not a .packages file',
+ }, (Directory directory) {
+ expect(findPackageConfig(directory), throwsA(isA<FormatException>()));
+ });
+
+ fileTest('invalid .packages as JSON', {
+ '.packages': packageConfigFile,
+ }, (Directory directory) {
+ expect(findPackageConfig(directory), throwsA(isA<FormatException>()));
+ });
+
+ fileTest('invalid .packages', {
+ '.dart_tool': {
+ 'package_config.json': 'not a JSON file',
+ }
+ }, (Directory directory) {
+ expect(findPackageConfig(directory), throwsA(isA<FormatException>()));
+ });
+
+ fileTest('invalid .packages as INI', {
+ '.dart_tool': {
+ 'package_config.json': packagesFile,
+ }
+ }, (Directory directory) {
+ expect(findPackageConfig(directory), throwsA(isA<FormatException>()));
+ });
+ });
+
+ group('handles error', () {
+ fileTest('invalid .packages', {
+ '.packages': 'not a .packages file',
+ }, (Directory directory) async {
+ var hadError = false;
+ await findPackageConfig(directory,
+ onError: expectAsync1((error) {
+ hadError = true;
+ expect(error, isA<FormatException>());
+ }, max: -1));
+ expect(hadError, true);
+ });
+
+ fileTest('invalid .packages as JSON', {
+ '.packages': packageConfigFile,
+ }, (Directory directory) async {
+ var hadError = false;
+ await findPackageConfig(directory,
+ onError: expectAsync1((error) {
+ hadError = true;
+ expect(error, isA<FormatException>());
+ }, max: -1));
+ expect(hadError, true);
+ });
+
+ fileTest('invalid package_config not JSON', {
+ '.dart_tool': {
+ 'package_config.json': 'not a JSON file',
+ }
+ }, (Directory directory) async {
+ var hadError = false;
+ await findPackageConfig(directory,
+ onError: expectAsync1((error) {
+ hadError = true;
+ expect(error, isA<FormatException>());
+ }, max: -1));
+ expect(hadError, true);
+ });
+
+ fileTest('invalid package config as INI', {
+ '.dart_tool': {
+ 'package_config.json': packagesFile,
+ }
+ }, (Directory directory) async {
+ var hadError = false;
+ await findPackageConfig(directory,
+ onError: expectAsync1((error) {
+ hadError = true;
+ expect(error, isA<FormatException>());
+ }, max: -1));
+ expect(hadError, true);
+ });
+ });
+
+ // Does not find .packages if no package_config.json and minVersion > 1.
+ fileTest('.packages ignored', {
+ '.packages': packagesFile,
+ 'script.dart': 'main(){}'
+ }, (Directory directory) async {
+ var config = await findPackageConfig(directory, minVersion: 2);
+ expect(config, null);
+ });
+
+ // Finds package_config.json in super-directory, with .packages in
+ // subdir and minVersion > 1.
+ fileTest('package_config.json recursive .packages ignored', {
+ '.dart_tool': {
+ 'package_config.json': packageConfigFile,
+ },
+ 'subdir': {
+ '.packages': packagesFile,
+ 'script.dart': 'main(){}',
+ }
+ }, (Directory directory) async {
+ var config = (await findPackageConfig(subdir(directory, 'subdir/'),
+ minVersion: 2))!;
+ expect(config.version, 2);
+ validatePackagesFile(config, directory);
+ });
+ });
+
+ group('loadPackageConfig', () {
+    // Load specific files.
+ group('package_config.json', () {
+ var files = {
+ '.packages': packagesFile,
+ '.dart_tool': {
+ 'package_config.json': packageConfigFile,
+ },
+ };
+ fileTest('directly', files, (Directory directory) async {
+ var file =
+ dirFile(subdir(directory, '.dart_tool'), 'package_config.json');
+ var config = await loadPackageConfig(file);
+ expect(config.version, 2);
+ validatePackagesFile(config, directory);
+ });
+ fileTest('indirectly through .packages', files,
+ (Directory directory) async {
+ var file = dirFile(directory, '.packages');
+ var config = await loadPackageConfig(file);
+ expect(config.version, 2);
+ validatePackagesFile(config, directory);
+ });
+ fileTest('prefer .packages', files, (Directory directory) async {
+ var file = dirFile(directory, '.packages');
+ var config = await loadPackageConfig(file, preferNewest: false);
+ expect(config.version, 1);
+ validatePackagesFile(config, directory);
+ });
+ });
+
+ fileTest('package_config.json non-default name', {
+ '.packages': packagesFile,
+ 'subdir': {
+ 'pheldagriff': packageConfigFile,
+ },
+ }, (Directory directory) async {
+ var file = dirFile(directory, 'subdir/pheldagriff');
+ var config = await loadPackageConfig(file);
+ expect(config.version, 2);
+ validatePackagesFile(config, directory);
+ });
+
+ fileTest('package_config.json named .packages', {
+ 'subdir': {
+ '.packages': packageConfigFile,
+ },
+ }, (Directory directory) async {
+ var file = dirFile(directory, 'subdir/.packages');
+ var config = await loadPackageConfig(file);
+ expect(config.version, 2);
+ validatePackagesFile(config, directory);
+ });
+
+ fileTest('.packages', {
+ '.packages': packagesFile,
+ }, (Directory directory) async {
+ var file = dirFile(directory, '.packages');
+ var config = await loadPackageConfig(file);
+ expect(config.version, 1);
+ validatePackagesFile(config, directory);
+ });
+
+ fileTest('.packages non-default name', {
+ 'pheldagriff': packagesFile,
+ }, (Directory directory) async {
+ var file = dirFile(directory, 'pheldagriff');
+ var config = await loadPackageConfig(file);
+ expect(config.version, 1);
+ validatePackagesFile(config, directory);
+ });
+
+ fileTest('no config found', {}, (Directory directory) {
+ var file = dirFile(directory, 'anyname');
+ expect(
+ () => loadPackageConfig(file), throwsA(isA<FileSystemException>()));
+ });
+
+ fileTest('no config found, handled', {}, (Directory directory) async {
+ var file = dirFile(directory, 'anyname');
+ var hadError = false;
+ await loadPackageConfig(file,
+ onError: expectAsync1((error) {
+ hadError = true;
+ expect(error, isA<FileSystemException>());
+ }, max: -1));
+ expect(hadError, true);
+ });
+
+ fileTest('specified file syntax error', {
+ 'anyname': 'syntax error',
+ }, (Directory directory) {
+ var file = dirFile(directory, 'anyname');
+ expect(() => loadPackageConfig(file), throwsFormatException);
+ });
+
+ // Find package_config.json in subdir even if initial file syntax error.
+ fileTest('specified file syntax onError', {
+ '.packages': 'syntax error',
+ '.dart_tool': {
+ 'package_config.json': packageConfigFile,
+ },
+ }, (Directory directory) async {
+ var file = dirFile(directory, '.packages');
+ var config = await loadPackageConfig(file);
+ expect(config.version, 2);
+ validatePackagesFile(config, directory);
+ });
+
+ // A file starting with `{` is a package_config.json file.
+ fileTest('file syntax error with {', {
+ '.packages': '{syntax error',
+ }, (Directory directory) {
+ var file = dirFile(directory, '.packages');
+ expect(() => loadPackageConfig(file), throwsFormatException);
+ });
+ });
+}
diff --git a/pkgs/package_config/test/discovery_uri_test.dart b/pkgs/package_config/test/discovery_uri_test.dart
new file mode 100644
index 0000000..542bf0a
--- /dev/null
+++ b/pkgs/package_config/test/discovery_uri_test.dart
@@ -0,0 +1,310 @@
+// Copyright (c) 2019, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+@TestOn('vm')
+library;
+
+import 'package:package_config/package_config.dart';
+import 'package:test/test.dart';
+
+import 'src/util.dart';
+
+const packagesFile = '''
+# A comment
+foo:file:///dart/packages/foo/
+bar:/dart/packages/bar/
+baz:packages/baz/
+''';
+
+const packageConfigFile = '''
+{
+ "configVersion": 2,
+ "packages": [
+ {
+ "name": "foo",
+ "rootUri": "file:///dart/packages/foo/"
+ },
+ {
+ "name": "bar",
+ "rootUri": "/dart/packages/bar/"
+ },
+ {
+ "name": "baz",
+ "rootUri": "../packages/baz/"
+ }
+ ],
+ "extra": [42]
+}
+''';
+
+void validatePackagesFile(PackageConfig resolver, Uri directory) {
+ expect(resolver, isNotNull);
+ expect(resolver.resolve(pkg('foo', 'bar/baz')),
+ equals(Uri.parse('file:///dart/packages/foo/bar/baz')));
+ expect(resolver.resolve(pkg('bar', 'baz/qux')),
+ equals(directory.resolve('/dart/packages/bar/baz/qux')));
+ expect(resolver.resolve(pkg('baz', 'qux/foo')),
+ equals(directory.resolve('packages/baz/qux/foo')));
+ expect([for (var p in resolver.packages) p.name],
+ unorderedEquals(['foo', 'bar', 'baz']));
+}
+
+void main() {
+ group('findPackages', () {
+ // Finds package_config.json if there.
+ loaderTest('package_config.json', {
+ '.packages': 'invalid .packages file',
+ 'script.dart': 'main(){}',
+ 'packages': {'shouldNotBeFound': <String, dynamic>{}},
+ '.dart_tool': {
+ 'package_config.json': packageConfigFile,
+ }
+ }, (directory, loader) async {
+ var config = (await findPackageConfigUri(directory, loader: loader))!;
+ expect(config.version, 2); // Found package_config.json file.
+ validatePackagesFile(config, directory);
+ });
+
+ // Finds .packages if no package_config.json.
+ loaderTest('.packages', {
+ '.packages': packagesFile,
+ 'script.dart': 'main(){}',
+ 'packages': {'shouldNotBeFound': <String, dynamic>{}}
+ }, (directory, loader) async {
+ var config = (await findPackageConfigUri(directory, loader: loader))!;
+ expect(config.version, 1); // Found .packages file.
+ validatePackagesFile(config, directory);
+ });
+
+ // Finds package_config.json in super-directory.
+ loaderTest('package_config.json recursive', {
+ '.packages': packagesFile,
+ '.dart_tool': {
+ 'package_config.json': packageConfigFile,
+ },
+ 'subdir': {
+ 'script.dart': 'main(){}',
+ }
+ }, (directory, loader) async {
+ var config = (await findPackageConfigUri(directory.resolve('subdir/'),
+ loader: loader))!;
+ expect(config.version, 2);
+ validatePackagesFile(config, directory);
+ });
+
+ // Finds .packages in super-directory.
+ loaderTest('.packages recursive', {
+ '.packages': packagesFile,
+ 'subdir': {'script.dart': 'main(){}'}
+ }, (directory, loader) async {
+ var config = (await findPackageConfigUri(directory.resolve('subdir/'),
+ loader: loader))!;
+ expect(config.version, 1);
+ validatePackagesFile(config, directory);
+ });
+
+ // Does not find a packages/ directory, and returns null if nothing found.
+ loaderTest('package directory packages not supported', {
+ 'packages': {
+ 'foo': <String, dynamic>{},
+ }
+ }, (Uri directory, loader) async {
+ var config = await findPackageConfigUri(directory, loader: loader);
+ expect(config, null);
+ });
+
+ loaderTest('invalid .packages', {
+ '.packages': 'not a .packages file',
+ }, (Uri directory, loader) {
+ expect(() => findPackageConfigUri(directory, loader: loader),
+ throwsA(isA<FormatException>()));
+ });
+
+ loaderTest('invalid .packages as JSON', {
+ '.packages': packageConfigFile,
+ }, (Uri directory, loader) {
+ expect(() => findPackageConfigUri(directory, loader: loader),
+ throwsA(isA<FormatException>()));
+ });
+
+ loaderTest('invalid .packages', {
+ '.dart_tool': {
+ 'package_config.json': 'not a JSON file',
+ }
+ }, (Uri directory, loader) {
+ expect(() => findPackageConfigUri(directory, loader: loader),
+ throwsA(isA<FormatException>()));
+ });
+
+ loaderTest('invalid .packages as INI', {
+ '.dart_tool': {
+ 'package_config.json': packagesFile,
+ }
+ }, (Uri directory, loader) {
+ expect(() => findPackageConfigUri(directory, loader: loader),
+ throwsA(isA<FormatException>()));
+ });
+
+ // Does not find .packages if no package_config.json and minVersion > 1.
+ loaderTest('.packages ignored', {
+ '.packages': packagesFile,
+ 'script.dart': 'main(){}'
+ }, (directory, loader) async {
+ var config =
+ await findPackageConfigUri(directory, minVersion: 2, loader: loader);
+ expect(config, null);
+ });
+
+ // Finds package_config.json in super-directory, with .packages in
+ // subdir and minVersion > 1.
+ loaderTest('package_config.json recursive ignores .packages', {
+ '.dart_tool': {
+ 'package_config.json': packageConfigFile,
+ },
+ 'subdir': {
+ '.packages': packagesFile,
+ 'script.dart': 'main(){}',
+ }
+ }, (directory, loader) async {
+ var config = (await findPackageConfigUri(directory.resolve('subdir/'),
+ minVersion: 2, loader: loader))!;
+ expect(config.version, 2);
+ validatePackagesFile(config, directory);
+ });
+ });
+
+ group('loadPackageConfig', () {
+    // Load specific files.
+ group('package_config.json', () {
+ var files = {
+ '.packages': packagesFile,
+ '.dart_tool': {
+ 'package_config.json': packageConfigFile,
+ },
+ };
+ loaderTest('directly', files, (Uri directory, loader) async {
+ var file = directory.resolve('.dart_tool/package_config.json');
+ var config = await loadPackageConfigUri(file, loader: loader);
+ expect(config.version, 2);
+ validatePackagesFile(config, directory);
+ });
+ loaderTest('indirectly through .packages', files,
+ (Uri directory, loader) async {
+ var file = directory.resolve('.packages');
+ var config = await loadPackageConfigUri(file, loader: loader);
+ expect(config.version, 2);
+ validatePackagesFile(config, directory);
+ });
+ });
+
+ loaderTest('package_config.json non-default name', {
+ '.packages': packagesFile,
+ 'subdir': {
+ 'pheldagriff': packageConfigFile,
+ },
+ }, (Uri directory, loader) async {
+ var file = directory.resolve('subdir/pheldagriff');
+ var config = await loadPackageConfigUri(file, loader: loader);
+ expect(config.version, 2);
+ validatePackagesFile(config, directory);
+ });
+
+ loaderTest('package_config.json named .packages', {
+ 'subdir': {
+ '.packages': packageConfigFile,
+ },
+ }, (Uri directory, loader) async {
+ var file = directory.resolve('subdir/.packages');
+ var config = await loadPackageConfigUri(file, loader: loader);
+ expect(config.version, 2);
+ validatePackagesFile(config, directory);
+ });
+
+ loaderTest('.packages', {
+ '.packages': packagesFile,
+ }, (Uri directory, loader) async {
+ var file = directory.resolve('.packages');
+ var config = await loadPackageConfigUri(file, loader: loader);
+ expect(config.version, 1);
+ validatePackagesFile(config, directory);
+ });
+
+ loaderTest('.packages non-default name', {
+ 'pheldagriff': packagesFile,
+ }, (Uri directory, loader) async {
+ var file = directory.resolve('pheldagriff');
+ var config = await loadPackageConfigUri(file, loader: loader);
+ expect(config.version, 1);
+ validatePackagesFile(config, directory);
+ });
+
+ loaderTest('no config found', {}, (Uri directory, loader) {
+ var file = directory.resolve('anyname');
+ expect(() => loadPackageConfigUri(file, loader: loader),
+ throwsA(isA<ArgumentError>()));
+ });
+
+ loaderTest('no config found, handle error', {},
+ (Uri directory, loader) async {
+ var file = directory.resolve('anyname');
+ var hadError = false;
+ await loadPackageConfigUri(file,
+ loader: loader,
+ onError: expectAsync1((error) {
+ hadError = true;
+ expect(error, isA<ArgumentError>());
+ }, max: -1));
+ expect(hadError, true);
+ });
+
+ loaderTest('specified file syntax error', {
+ 'anyname': 'syntax error',
+ }, (Uri directory, loader) {
+ var file = directory.resolve('anyname');
+ expect(() => loadPackageConfigUri(file, loader: loader),
+ throwsFormatException);
+ });
+
+ loaderTest('specified file syntax onError', {
+ 'anyname': 'syntax error',
+ }, (directory, loader) async {
+ var file = directory.resolve('anyname');
+ var hadError = false;
+ await loadPackageConfigUri(file,
+ loader: loader,
+ onError: expectAsync1((error) {
+ hadError = true;
+ expect(error, isA<FormatException>());
+ }, max: -1));
+ expect(hadError, true);
+ });
+
+    // Don't look for package_config.json if the original file is not named
+    // .packages.
+ loaderTest('specified file syntax error with alternative', {
+ 'anyname': 'syntax error',
+ '.dart_tool': {
+ 'package_config.json': packageConfigFile,
+ },
+ }, (directory, loader) async {
+ var file = directory.resolve('anyname');
+ expect(() => loadPackageConfigUri(file, loader: loader),
+ throwsFormatException);
+ });
+
+ // A file starting with `{` is a package_config.json file.
+ loaderTest('file syntax error with {', {
+ '.packages': '{syntax error',
+ }, (directory, loader) async {
+ var file = directory.resolve('.packages');
+ var hadError = false;
+ await loadPackageConfigUri(file,
+ loader: loader,
+ onError: expectAsync1((error) {
+ hadError = true;
+ expect(error, isA<FormatException>());
+ }, max: -1));
+ expect(hadError, true);
+ });
+ });
+}
diff --git a/pkgs/package_config/test/package_config_impl_test.dart b/pkgs/package_config/test/package_config_impl_test.dart
new file mode 100644
index 0000000..0f39963
--- /dev/null
+++ b/pkgs/package_config/test/package_config_impl_test.dart
@@ -0,0 +1,188 @@
+// Copyright (c) 2020, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:convert' show jsonDecode;
+
+import 'package:package_config/package_config_types.dart';
+import 'package:test/test.dart';
+import 'src/util.dart';
+
+void main() {
+ var unique = Object();
+ var root = Uri.file('/tmp/root/');
+
+ group('LanguageVersion', () {
+ test('minimal', () {
+ var version = LanguageVersion(3, 5);
+ expect(version.major, 3);
+ expect(version.minor, 5);
+ });
+
+ test('negative major', () {
+ expect(() => LanguageVersion(-1, 1), throwsArgumentError);
+ });
+
+ test('negative minor', () {
+ expect(() => LanguageVersion(1, -1), throwsArgumentError);
+ });
+
+ test('minimal parse', () {
+ var version = LanguageVersion.parse('3.5');
+ expect(version.major, 3);
+ expect(version.minor, 5);
+ });
+
+ void failParse(String name, String input) {
+ test('$name - error', () {
+ expect(() => LanguageVersion.parse(input),
+ throwsA(isA<PackageConfigError>()));
+ expect(() => LanguageVersion.parse(input), throwsFormatException);
+ var failed = false;
+ var actual = LanguageVersion.parse(input, onError: (_) {
+ failed = true;
+ });
+ expect(failed, true);
+ expect(actual, isA<LanguageVersion>());
+ });
+ }
+
+ failParse('Leading zero major', '01.1');
+ failParse('Leading zero minor', '1.01');
+ failParse('Sign+ major', '+1.1');
+ failParse('Sign- major', '-1.1');
+ failParse('Sign+ minor', '1.+1');
+ failParse('Sign- minor', '1.-1');
+ failParse('WhiteSpace 1', ' 1.1');
+ failParse('WhiteSpace 2', '1 .1');
+ failParse('WhiteSpace 3', '1. 1');
+ failParse('WhiteSpace 4', '1.1 ');
+ });
+
+ group('Package', () {
+ test('minimal', () {
+ var package = Package('name', root, extraData: unique);
+ expect(package.name, 'name');
+ expect(package.root, root);
+ expect(package.packageUriRoot, root);
+ expect(package.languageVersion, null);
+ expect(package.extraData, same(unique));
+ });
+
+ test('absolute package root', () {
+ var version = LanguageVersion(1, 1);
+ var absolute = root.resolve('foo/bar/');
+ var package = Package('name', root,
+ packageUriRoot: absolute,
+ relativeRoot: false,
+ languageVersion: version,
+ extraData: unique);
+ expect(package.name, 'name');
+ expect(package.root, root);
+ expect(package.packageUriRoot, absolute);
+ expect(package.languageVersion, version);
+ expect(package.extraData, same(unique));
+ expect(package.relativeRoot, false);
+ });
+
+ test('relative package root', () {
+ var relative = Uri.parse('foo/bar/');
+ var absolute = root.resolveUri(relative);
+ var package = Package('name', root,
+ packageUriRoot: relative, relativeRoot: true, extraData: unique);
+ expect(package.name, 'name');
+ expect(package.root, root);
+ expect(package.packageUriRoot, absolute);
+ expect(package.relativeRoot, true);
+ expect(package.languageVersion, null);
+ expect(package.extraData, same(unique));
+ });
+
+ for (var badName in ['a/z', 'a:z', '', '...']) {
+ test("Invalid name '$badName'", () {
+ expect(() => Package(badName, root), throwsPackageConfigError);
+ });
+ }
+
+ test('Invalid root, not absolute', () {
+ expect(
+ () => Package('name', Uri.parse('/foo/')), throwsPackageConfigError);
+ });
+
+ test('Invalid root, not ending in slash', () {
+ expect(() => Package('name', Uri.parse('file:///foo')),
+ throwsPackageConfigError);
+ });
+
+ test('invalid package root, not ending in slash', () {
+ expect(() => Package('name', root, packageUriRoot: Uri.parse('foo')),
+ throwsPackageConfigError);
+ });
+
+ test('invalid package root, not inside root', () {
+ expect(() => Package('name', root, packageUriRoot: Uri.parse('../baz/')),
+ throwsPackageConfigError);
+ });
+ });
+
+ group('package config', () {
+ test('empty', () {
+ var empty = PackageConfig([], extraData: unique);
+ expect(empty.version, 2);
+ expect(empty.packages, isEmpty);
+ expect(empty.extraData, same(unique));
+ expect(empty.resolve(pkg('a', 'b')), isNull);
+ });
+
+ test('single', () {
+ var package = Package('name', root);
+ var single = PackageConfig([package], extraData: unique);
+ expect(single.version, 2);
+ expect(single.packages, hasLength(1));
+ expect(single.extraData, same(unique));
+ expect(single.resolve(pkg('a', 'b')), isNull);
+ var resolved = single.resolve(pkg('name', 'a/b'));
+ expect(resolved, root.resolve('a/b'));
+ });
+ });
+ test('writeString', () {
+ var config = PackageConfig([
+ Package('foo', Uri.parse('file:///pkg/foo/'),
+ packageUriRoot: Uri.parse('file:///pkg/foo/lib/'),
+ relativeRoot: false,
+ languageVersion: LanguageVersion(2, 4),
+ extraData: {'foo': 'foo!'}),
+ Package('bar', Uri.parse('file:///pkg/bar/'),
+ packageUriRoot: Uri.parse('file:///pkg/bar/lib/'),
+ relativeRoot: true,
+ extraData: {'bar': 'bar!'}),
+ ], extraData: {
+ 'extra': 'data'
+ });
+ var buffer = StringBuffer();
+ PackageConfig.writeString(config, buffer, Uri.parse('file:///pkg/'));
+ var text = buffer.toString();
+ var json = jsonDecode(text); // Is valid JSON.
+ expect(json, {
+ 'configVersion': 2,
+ 'packages': unorderedEquals([
+ {
+ 'name': 'foo',
+ 'rootUri': 'file:///pkg/foo/',
+ 'packageUri': 'lib/',
+ 'languageVersion': '2.4',
+ 'foo': 'foo!',
+ },
+ {
+ 'name': 'bar',
+ 'rootUri': 'bar/',
+ 'packageUri': 'lib/',
+ 'bar': 'bar!',
+ },
+ ]),
+ 'extra': 'data',
+ });
+ });
+}
+
+final Matcher throwsPackageConfigError = throwsA(isA<PackageConfigError>());
diff --git a/pkgs/package_config/test/parse_test.dart b/pkgs/package_config/test/parse_test.dart
new file mode 100644
index 0000000..a92b9bf
--- /dev/null
+++ b/pkgs/package_config/test/parse_test.dart
@@ -0,0 +1,552 @@
+// Copyright (c) 2019, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:convert';
+import 'dart:typed_data';
+
+import 'package:package_config/package_config_types.dart';
+import 'package:package_config/src/errors.dart';
+import 'package:package_config/src/package_config_json.dart';
+import 'package:package_config/src/packages_file.dart' as packages;
+import 'package:test/test.dart';
+
+import 'src/util.dart';
+
+void main() {
+ group('.packages', () {
+ test('valid', () {
+ var packagesFile = '# Generated by pub yadda yadda\n'
+ 'foo:file:///foo/lib/\n'
+ 'bar:/bar/lib/\n'
+ 'baz:lib/\n';
+ var result = packages.parse(utf8.encode(packagesFile),
+ Uri.parse('file:///tmp/file.dart'), throwError);
+ expect(result.version, 1);
+ expect({for (var p in result.packages) p.name}, {'foo', 'bar', 'baz'});
+ expect(result.resolve(pkg('foo', 'foo.dart')),
+ Uri.parse('file:///foo/lib/foo.dart'));
+ expect(result.resolve(pkg('bar', 'bar.dart')),
+ Uri.parse('file:///bar/lib/bar.dart'));
+ expect(result.resolve(pkg('baz', 'baz.dart')),
+ Uri.parse('file:///tmp/lib/baz.dart'));
+
+ var foo = result['foo']!;
+ expect(foo, isNotNull);
+ expect(foo.root, Uri.parse('file:///foo/'));
+ expect(foo.packageUriRoot, Uri.parse('file:///foo/lib/'));
+ expect(foo.languageVersion, LanguageVersion(2, 7));
+ expect(foo.relativeRoot, false);
+ });
+
+ test('valid empty', () {
+ var packagesFile = '# Generated by pub yadda yadda\n';
+ var result = packages.parse(
+ utf8.encode(packagesFile), Uri.file('/tmp/file.dart'), throwError);
+ expect(result.version, 1);
+ expect({for (var p in result.packages) p.name}, <String>{});
+ });
+
+ group('invalid', () {
+ var baseFile = Uri.file('/tmp/file.dart');
+ void testThrows(String name, String content) {
+ test(name, () {
+ expect(
+ () => packages.parse(utf8.encode(content), baseFile, throwError),
+ throwsA(isA<FormatException>()));
+ });
+ test('$name, handle error', () {
+ var hadError = false;
+ packages.parse(utf8.encode(content), baseFile, (error) {
+ hadError = true;
+ expect(error, isA<FormatException>());
+ });
+ expect(hadError, true);
+ });
+ }
+
+ testThrows('repeated package name', 'foo:lib/\nfoo:lib\n');
+ testThrows('no colon', 'foo\n');
+ testThrows('empty package name', ':lib/\n');
+ testThrows('dot only package name', '.:lib/\n');
+      testThrows('dot dot only package name', '..:lib/\n');
+ testThrows('invalid package name character', 'f\\o:lib/\n');
+ testThrows('package URI', 'foo:package:bar/lib/');
+ testThrows('location with query', 'f\\o:lib/?\n');
+ testThrows('location with fragment', 'f\\o:lib/#\n');
+ });
+ });
+
+ group('package_config.json', () {
+ test('valid', () {
+ var packageConfigFile = '''
+ {
+ "configVersion": 2,
+ "packages": [
+ {
+ "name": "foo",
+ "rootUri": "file:///foo/",
+ "packageUri": "lib/",
+ "languageVersion": "2.5",
+ "nonstandard": true
+ },
+ {
+ "name": "bar",
+ "rootUri": "/bar/",
+ "packageUri": "lib/",
+ "languageVersion": "9999.9999"
+ },
+ {
+ "name": "baz",
+ "rootUri": "../",
+ "packageUri": "lib/"
+ },
+ {
+ "name": "noslash",
+ "rootUri": "../noslash",
+ "packageUri": "lib"
+ }
+ ],
+ "generator": "pub",
+ "other": [42]
+ }
+ ''';
+ var config = parsePackageConfigBytes(
+ // ignore: unnecessary_cast
+ utf8.encode(packageConfigFile) as Uint8List,
+ Uri.parse('file:///tmp/.dart_tool/file.dart'),
+ throwError);
+ expect(config.version, 2);
+ expect({for (var p in config.packages) p.name},
+ {'foo', 'bar', 'baz', 'noslash'});
+
+ expect(config.resolve(pkg('foo', 'foo.dart')),
+ Uri.parse('file:///foo/lib/foo.dart'));
+ expect(config.resolve(pkg('bar', 'bar.dart')),
+ Uri.parse('file:///bar/lib/bar.dart'));
+ expect(config.resolve(pkg('baz', 'baz.dart')),
+ Uri.parse('file:///tmp/lib/baz.dart'));
+
+ var foo = config['foo']!;
+ expect(foo, isNotNull);
+ expect(foo.root, Uri.parse('file:///foo/'));
+ expect(foo.packageUriRoot, Uri.parse('file:///foo/lib/'));
+ expect(foo.languageVersion, LanguageVersion(2, 5));
+ expect(foo.extraData, {'nonstandard': true});
+ expect(foo.relativeRoot, false);
+
+ var bar = config['bar']!;
+ expect(bar, isNotNull);
+ expect(bar.root, Uri.parse('file:///bar/'));
+ expect(bar.packageUriRoot, Uri.parse('file:///bar/lib/'));
+ expect(bar.languageVersion, LanguageVersion(9999, 9999));
+ expect(bar.extraData, null);
+ expect(bar.relativeRoot, false);
+
+ var baz = config['baz']!;
+ expect(baz, isNotNull);
+ expect(baz.root, Uri.parse('file:///tmp/'));
+ expect(baz.packageUriRoot, Uri.parse('file:///tmp/lib/'));
+ expect(baz.languageVersion, null);
+ expect(baz.relativeRoot, true);
+
+ // No slash after root or package root. One is inserted.
+ var noslash = config['noslash']!;
+ expect(noslash, isNotNull);
+ expect(noslash.root, Uri.parse('file:///tmp/noslash/'));
+ expect(noslash.packageUriRoot, Uri.parse('file:///tmp/noslash/lib/'));
+ expect(noslash.languageVersion, null);
+ expect(noslash.relativeRoot, true);
+
+ expect(config.extraData, {
+ 'generator': 'pub',
+ 'other': [42]
+ });
+ });
+
+ test('valid other order', () {
+ // The ordering in the file is not important.
+ var packageConfigFile = '''
+ {
+ "generator": "pub",
+ "other": [42],
+ "packages": [
+ {
+ "languageVersion": "2.5",
+ "packageUri": "lib/",
+ "rootUri": "file:///foo/",
+ "name": "foo"
+ },
+ {
+ "packageUri": "lib/",
+ "languageVersion": "9999.9999",
+ "rootUri": "/bar/",
+ "name": "bar"
+ },
+ {
+ "packageUri": "lib/",
+ "name": "baz",
+ "rootUri": "../"
+ }
+ ],
+ "configVersion": 2
+ }
+ ''';
+ var config = parsePackageConfigBytes(
+ // ignore: unnecessary_cast
+ utf8.encode(packageConfigFile) as Uint8List,
+ Uri.parse('file:///tmp/.dart_tool/file.dart'),
+ throwError);
+ expect(config.version, 2);
+ expect({for (var p in config.packages) p.name}, {'foo', 'bar', 'baz'});
+
+ expect(config.resolve(pkg('foo', 'foo.dart')),
+ Uri.parse('file:///foo/lib/foo.dart'));
+ expect(config.resolve(pkg('bar', 'bar.dart')),
+ Uri.parse('file:///bar/lib/bar.dart'));
+ expect(config.resolve(pkg('baz', 'baz.dart')),
+ Uri.parse('file:///tmp/lib/baz.dart'));
+ expect(config.extraData, {
+ 'generator': 'pub',
+ 'other': [42]
+ });
+ });
+
+ // Check that a few minimal configurations are valid.
+ // These form the basis of invalid tests below.
+ var cfg = '"configVersion":2';
+ var pkgs = '"packages":[]';
+ var name = '"name":"foo"';
+ var root = '"rootUri":"/foo/"';
+ test('minimal', () {
+ var config = parsePackageConfigBytes(
+ // ignore: unnecessary_cast
+ utf8.encode('{$cfg,$pkgs}') as Uint8List,
+ Uri.parse('file:///tmp/.dart_tool/file.dart'),
+ throwError);
+ expect(config.version, 2);
+ expect(config.packages, isEmpty);
+ });
+ test('minimal package', () {
+      // A package must have a name and a rootUri; the remaining properties
+      // are optional.
+ var config = parsePackageConfigBytes(
+ // ignore: unnecessary_cast
+ utf8.encode('{$cfg,"packages":[{$name,$root}]}') as Uint8List,
+ Uri.parse('file:///tmp/.dart_tool/file.dart'),
+ throwError);
+ expect(config.version, 2);
+ expect(config.packages.first.name, 'foo');
+ });
+
+ test('nested packages', () {
+ var configBytes = utf8.encode(json.encode({
+ 'configVersion': 2,
+ 'packages': [
+ {'name': 'foo', 'rootUri': '/foo/', 'packageUri': 'lib/'},
+ {'name': 'bar', 'rootUri': '/foo/bar/', 'packageUri': 'lib/'},
+ {'name': 'baz', 'rootUri': '/foo/bar/baz/', 'packageUri': 'lib/'},
+ {'name': 'qux', 'rootUri': '/foo/qux/', 'packageUri': 'lib/'},
+ ]
+ }));
+ // ignore: unnecessary_cast
+ var config = parsePackageConfigBytes(configBytes as Uint8List,
+ Uri.parse('file:///tmp/.dart_tool/file.dart'), throwError);
+ expect(config.version, 2);
+ expect(config.packageOf(Uri.parse('file:///foo/lala/lala.dart'))!.name,
+ 'foo');
+ expect(config.packageOf(Uri.parse('file:///foo/bar/lala.dart'))!.name,
+ 'bar');
+ expect(config.packageOf(Uri.parse('file:///foo/bar/baz/lala.dart'))!.name,
+ 'baz');
+ expect(config.packageOf(Uri.parse('file:///foo/qux/lala.dart'))!.name,
+ 'qux');
+ expect(config.toPackageUri(Uri.parse('file:///foo/lib/diz')),
+ Uri.parse('package:foo/diz'));
+ expect(config.toPackageUri(Uri.parse('file:///foo/bar/lib/diz')),
+ Uri.parse('package:bar/diz'));
+ expect(config.toPackageUri(Uri.parse('file:///foo/bar/baz/lib/diz')),
+ Uri.parse('package:baz/diz'));
+ expect(config.toPackageUri(Uri.parse('file:///foo/qux/lib/diz')),
+ Uri.parse('package:qux/diz'));
+ });
+
+ test('nested packages 2', () {
+ var configBytes = utf8.encode(json.encode({
+ 'configVersion': 2,
+ 'packages': [
+ {'name': 'foo', 'rootUri': '/', 'packageUri': 'lib/'},
+ {'name': 'bar', 'rootUri': '/bar/', 'packageUri': 'lib/'},
+ {'name': 'baz', 'rootUri': '/bar/baz/', 'packageUri': 'lib/'},
+ {'name': 'qux', 'rootUri': '/qux/', 'packageUri': 'lib/'},
+ ]
+ }));
+ // ignore: unnecessary_cast
+ var config = parsePackageConfigBytes(configBytes as Uint8List,
+ Uri.parse('file:///tmp/.dart_tool/file.dart'), throwError);
+ expect(config.version, 2);
+ expect(
+ config.packageOf(Uri.parse('file:///lala/lala.dart'))!.name, 'foo');
+ expect(config.packageOf(Uri.parse('file:///bar/lala.dart'))!.name, 'bar');
+ expect(config.packageOf(Uri.parse('file:///bar/baz/lala.dart'))!.name,
+ 'baz');
+ expect(config.packageOf(Uri.parse('file:///qux/lala.dart'))!.name, 'qux');
+ expect(config.toPackageUri(Uri.parse('file:///lib/diz')),
+ Uri.parse('package:foo/diz'));
+ expect(config.toPackageUri(Uri.parse('file:///bar/lib/diz')),
+ Uri.parse('package:bar/diz'));
+ expect(config.toPackageUri(Uri.parse('file:///bar/baz/lib/diz')),
+ Uri.parse('package:baz/diz'));
+ expect(config.toPackageUri(Uri.parse('file:///qux/lib/diz')),
+ Uri.parse('package:qux/diz'));
+ });
+
+ test('packageOf is case sensitive on windows', () {
+ var configBytes = utf8.encode(json.encode({
+ 'configVersion': 2,
+ 'packages': [
+ {'name': 'foo', 'rootUri': 'file:///C:/Foo/', 'packageUri': 'lib/'},
+ ]
+ }));
+ var config = parsePackageConfigBytes(
+ // ignore: unnecessary_cast
+ configBytes as Uint8List,
+ Uri.parse('file:///C:/tmp/.dart_tool/file.dart'),
+ throwError);
+ expect(config.version, 2);
+ expect(
+ config.packageOf(Uri.parse('file:///C:/foo/lala/lala.dart')), null);
+ expect(config.packageOf(Uri.parse('file:///C:/Foo/lala/lala.dart'))!.name,
+ 'foo');
+ });
+
+ group('invalid', () {
+ void testThrows(String name, String source) {
+ test(name, () {
+ expect(
+ // ignore: unnecessary_cast
+ () => parsePackageConfigBytes(utf8.encode(source) as Uint8List,
+ Uri.parse('file:///tmp/.dart_tool/file.dart'), throwError),
+ throwsA(isA<FormatException>()));
+ });
+ }
+
+ void testThrowsContains(
+ String name, String source, String containsString) {
+ test(name, () {
+ dynamic exception;
+ try {
+ parsePackageConfigBytes(
+ // ignore: unnecessary_cast
+ utf8.encode(source) as Uint8List,
+ Uri.parse('file:///tmp/.dart_tool/file.dart'),
+ throwError,
+ );
+ } catch (e) {
+ exception = e;
+ }
+ if (exception == null) fail("Didn't get exception");
+ expect('$exception', contains(containsString));
+ });
+ }
+
+ testThrows('comment', '# comment\n {$cfg,$pkgs}');
+ testThrows('.packages file', 'foo:/foo\n');
+ testThrows('no configVersion', '{$pkgs}');
+ testThrows('no packages', '{$cfg}');
+ group('config version:', () {
+ testThrows('null', '{"configVersion":null,$pkgs}');
+ testThrows('string', '{"configVersion":"2",$pkgs}');
+ testThrows('array', '{"configVersion":[2],$pkgs}');
+ });
+ group('packages:', () {
+ testThrows('null', '{$cfg,"packages":null}');
+ testThrows('string', '{$cfg,"packages":"foo"}');
+ testThrows('object', '{$cfg,"packages":{}}');
+ });
+ group('packages entry:', () {
+ testThrows('null', '{$cfg,"packages":[null]}');
+ testThrows('string', '{$cfg,"packages":["foo"]}');
+ testThrows('array', '{$cfg,"packages":[[]]}');
+ });
+ group('package', () {
+ testThrows('no name', '{$cfg,"packages":[{$root}]}');
+ group('name:', () {
+ testThrows('null', '{$cfg,"packages":[{"name":null,$root}]}');
+ testThrows('num', '{$cfg,"packages":[{"name":1,$root}]}');
+ testThrows('object', '{$cfg,"packages":[{"name":{},$root}]}');
+ testThrows('empty', '{$cfg,"packages":[{"name":"",$root}]}');
+ testThrows('one-dot', '{$cfg,"packages":[{"name":".",$root}]}');
+ testThrows('two-dot', '{$cfg,"packages":[{"name":"..",$root}]}');
+ testThrows(
+ "invalid char '\\'", '{$cfg,"packages":[{"name":"\\",$root}]}');
+ testThrows(
+ "invalid char ':'", '{$cfg,"packages":[{"name":":",$root}]}');
+ testThrows(
+ "invalid char ' '", '{$cfg,"packages":[{"name":" ",$root}]}');
+ });
+
+ testThrows('no root', '{$cfg,"packages":[{$name}]}');
+ group('root:', () {
+ testThrows('null', '{$cfg,"packages":[{$name,"rootUri":null}]}');
+ testThrows('num', '{$cfg,"packages":[{$name,"rootUri":1}]}');
+ testThrows('object', '{$cfg,"packages":[{$name,"rootUri":{}}]}');
+ testThrows('fragment', '{$cfg,"packages":[{$name,"rootUri":"x/#"}]}');
+ testThrows('query', '{$cfg,"packages":[{$name,"rootUri":"x/?"}]}');
+ testThrows('package-URI',
+ '{$cfg,"packages":[{$name,"rootUri":"package:x/x/"}]}');
+ });
+ group('package-URI root:', () {
+ testThrows(
+ 'null', '{$cfg,"packages":[{$name,$root,"packageUri":null}]}');
+ testThrows('num', '{$cfg,"packages":[{$name,$root,"packageUri":1}]}');
+ testThrows(
+ 'object', '{$cfg,"packages":[{$name,$root,"packageUri":{}}]}');
+ testThrows('fragment',
+ '{$cfg,"packages":[{$name,$root,"packageUri":"x/#"}]}');
+ testThrows(
+ 'query', '{$cfg,"packages":[{$name,$root,"packageUri":"x/?"}]}');
+ testThrows('package: URI',
+ '{$cfg,"packages":[{$name,$root,"packageUri":"package:x/x/"}]}');
+ testThrows('not inside root',
+ '{$cfg,"packages":[{$name,$root,"packageUri":"../other/"}]}');
+ });
+ group('language version', () {
+ testThrows('null',
+ '{$cfg,"packages":[{$name,$root,"languageVersion":null}]}');
+ testThrows(
+ 'num', '{$cfg,"packages":[{$name,$root,"languageVersion":1}]}');
+ testThrows('object',
+ '{$cfg,"packages":[{$name,$root,"languageVersion":{}}]}');
+ testThrows('empty',
+ '{$cfg,"packages":[{$name,$root,"languageVersion":""}]}');
+ testThrows('non number.number',
+ '{$cfg,"packages":[{$name,$root,"languageVersion":"x.1"}]}');
+ testThrows('number.non number',
+ '{$cfg,"packages":[{$name,$root,"languageVersion":"1.x"}]}');
+ testThrows('non number',
+ '{$cfg,"packages":[{$name,$root,"languageVersion":"x"}]}');
+ testThrows('one number',
+ '{$cfg,"packages":[{$name,$root,"languageVersion":"1"}]}');
+ testThrows('three numbers',
+ '{$cfg,"packages":[{$name,$root,"languageVersion":"1.2.3"}]}');
+ testThrows('leading zero first',
+ '{$cfg,"packages":[{$name,$root,"languageVersion":"01.1"}]}');
+ testThrows('leading zero second',
+ '{$cfg,"packages":[{$name,$root,"languageVersion":"1.01"}]}');
+ testThrows('trailing-',
+ '{$cfg,"packages":[{$name,$root,"languageVersion":"1.1-1"}]}');
+ testThrows('trailing+',
+ '{$cfg,"packages":[{$name,$root,"languageVersion":"1.1+1"}]}');
+ });
+ });
+ testThrows('duplicate package name',
+ '{$cfg,"packages":[{$name,$root},{$name,"rootUri":"/other/"}]}');
+ testThrowsContains(
+ // The roots of foo and bar are the same.
+ 'same roots',
+ '{$cfg,"packages":[{$name,$root},{"name":"bar",$root}]}',
+ 'the same root directory');
+ testThrowsContains(
+ // The roots of foo and bar are the same.
+ 'same roots 2',
+ '{$cfg,"packages":[{$name,"rootUri":"/"},{"name":"bar","rootUri":"/"}]}',
+ 'the same root directory');
+ testThrowsContains(
+ // The root of bar is inside the root of foo,
+ // but the package root of foo is inside the root of bar.
+ 'between root and lib',
+ '{$cfg,"packages":['
+ '{"name":"foo","rootUri":"/foo/","packageUri":"bar/lib/"},'
+ '{"name":"bar","rootUri":"/foo/bar/","packageUri":"baz/lib"}]}',
+ 'package root of foo is inside the root of bar');
+
+ // This shouldn't be allowed, but for internal reasons it is.
+ test('package inside package root', () {
+ var config = parsePackageConfigBytes(
+ // ignore: unnecessary_cast
+ utf8.encode(
+ '{$cfg,"packages":['
+ '{"name":"foo","rootUri":"/foo/","packageUri":"lib/"},'
+ '{"name":"bar","rootUri":"/foo/lib/bar/","packageUri":"lib"}]}',
+ ) as Uint8List,
+ Uri.parse('file:///tmp/.dart_tool/file.dart'),
+ throwError);
+ expect(
+ config
+ .packageOf(Uri.parse('file:///foo/lib/bar/lib/lala.dart'))!
+ .name,
+ 'foo'); // why not bar?
+ expect(config.toPackageUri(Uri.parse('file:///foo/lib/bar/lib/diz')),
+ Uri.parse('package:foo/bar/lib/diz')); // why not package:bar/diz?
+ });
+ });
+ });
+
+ group('factories', () {
+ void testConfig(String name, PackageConfig config, PackageConfig expected) {
+ group(name, () {
+ test('structure', () {
+ expect(config.version, expected.version);
+ var expectedPackages = {for (var p in expected.packages) p.name};
+ var actualPackages = {for (var p in config.packages) p.name};
+ expect(actualPackages, expectedPackages);
+ });
+ for (var package in config.packages) {
+ var name = package.name;
+ test('package $name', () {
+ var expectedPackage = expected[name]!;
+ expect(expectedPackage, isNotNull);
+ expect(package.root, expectedPackage.root, reason: 'root');
+ expect(package.packageUriRoot, expectedPackage.packageUriRoot,
+ reason: 'package root');
+ expect(package.languageVersion, expectedPackage.languageVersion,
+ reason: 'languageVersion');
+ });
+ }
+ });
+ }
+
+ var configText = '''
+ {"configVersion": 2, "packages": [
+ {
+ "name": "foo",
+ "rootUri": "foo/",
+ "packageUri": "bar/",
+ "languageVersion": "1.2"
+ }
+ ]}
+ ''';
+ var baseUri = Uri.parse('file:///start/');
+ var config = PackageConfig([
+ Package('foo', Uri.parse('file:///start/foo/'),
+ packageUriRoot: Uri.parse('file:///start/foo/bar/'),
+ languageVersion: LanguageVersion(1, 2))
+ ]);
+ testConfig(
+ 'string', PackageConfig.parseString(configText, baseUri), config);
+ testConfig(
+ 'bytes',
+ PackageConfig.parseBytes(
+ Uint8List.fromList(configText.codeUnits), baseUri),
+ config);
+ testConfig('json', PackageConfig.parseJson(jsonDecode(configText), baseUri),
+ config);
+
+ baseUri = Uri.parse('file:///start2/');
+ config = PackageConfig([
+ Package('foo', Uri.parse('file:///start2/foo/'),
+ packageUriRoot: Uri.parse('file:///start2/foo/bar/'),
+ languageVersion: LanguageVersion(1, 2))
+ ]);
+ testConfig(
+ 'string2', PackageConfig.parseString(configText, baseUri), config);
+ testConfig(
+ 'bytes2',
+ PackageConfig.parseBytes(
+ Uint8List.fromList(configText.codeUnits), baseUri),
+ config);
+ testConfig('json2',
+ PackageConfig.parseJson(jsonDecode(configText), baseUri), config);
+ });
+}
diff --git a/pkgs/package_config/test/src/util.dart b/pkgs/package_config/test/src/util.dart
new file mode 100644
index 0000000..780ee80
--- /dev/null
+++ b/pkgs/package_config/test/src/util.dart
@@ -0,0 +1,57 @@
+// Copyright (c) 2019, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:convert';
+import 'dart:typed_data';
+
+import 'package:test/test.dart';
+
+/// Creates a package: URI.
+Uri pkg(String packageName, String packagePath) {
+ var path =
+ "$packageName${packagePath.startsWith('/') ? "" : "/"}$packagePath";
+ return Uri(scheme: 'package', path: path);
+}
+
+// Remove if not used.
+String configFromPackages(List<List<String>> packages) => """
+{
+ "configVersion": 2,
+ "packages": [
+${packages.map((nu) => """
+ {
+ "name": "${nu[0]}",
+ "rootUri": "${nu[1]}"
+ }""").join(",\n")}
+ ]
+}
+""";
+
+/// Mimics a directory structure of [description] and runs [loaderTest].
+///
+/// Description is a map, each key is a file entry. If the value is a map,
+/// it's a subdirectory, otherwise it's a file and the value is the content
+/// as a string.
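+///
+/// A minimal usage sketch; `packageConfigFile` stands for a valid
+/// configuration string defined by the calling test, as in the tests that
+/// use this helper:
+/// ```dart
+/// loaderTest('finds package_config.json', {
+///   '.dart_tool': {'package_config.json': packageConfigFile},
+/// }, (root, loader) async {
+///   var config = await findPackageConfigUri(root, loader: loader);
+///   expect(config, isNotNull);
+/// });
+/// ```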
+void loaderTest(
+ String name,
+ Map<String, Object> description,
+ void Function(Uri root, Future<Uint8List?> Function(Uri) loader) loaderTest,
+) {
+ var root = Uri(scheme: 'test', path: '/');
+ Future<Uint8List?> loader(Uri uri) async {
+ var path = uri.path;
+ if (!uri.isScheme('test') || !path.startsWith('/')) return null;
+ var parts = path.split('/');
+ Object? value = description;
+ for (var i = 1; i < parts.length; i++) {
+ if (value is! Map<String, Object?>) return null;
+ value = value[parts[i]];
+ }
+ // ignore: unnecessary_cast
+ if (value is String) return utf8.encode(value) as Uint8List;
+ return null;
+ }
+
+ test(name, () => loaderTest(root, loader));
+}
diff --git a/pkgs/package_config/test/src/util_io.dart b/pkgs/package_config/test/src/util_io.dart
new file mode 100644
index 0000000..e032556
--- /dev/null
+++ b/pkgs/package_config/test/src/util_io.dart
@@ -0,0 +1,62 @@
+// Copyright (c) 2020, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:io';
+
+import 'package:package_config/src/util_io.dart';
+import 'package:test/test.dart';
+
+/// Creates a directory structure from [description] and runs [fileTest].
+///
+/// Description is a map, each key is a file entry. If the value is a map,
+/// it's a subdirectory, otherwise it's a file and the value is the content
+/// as a string.
+/// Introduces a group to hold the [setUp]/[tearDown] logic.
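+///
+/// A minimal usage sketch (file names and contents are placeholders):
+/// ```dart
+/// fileTest('reads a file', {
+///   'sub': {'data.txt': 'hello'},
+/// }, (directory) {
+///   var file = dirFile(subdir(directory, 'sub'), 'data.txt');
+///   expect(file.readAsStringSync(), 'hello');
+/// });
+/// ```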
+void fileTest(String name, Map<String, Object> description,
+ void Function(Directory directory) fileTest) {
+ group('file-test', () {
+ var tempDir = Directory.systemTemp.createTempSync('pkgcfgtest');
+ setUp(() {
+ _createFiles(tempDir, description);
+ });
+ tearDown(() {
+ tempDir.deleteSync(recursive: true);
+ });
+ test(name, () => fileTest(tempDir));
+ });
+}
+
+/// Creates a set of files under a new temporary directory.
+/// Returns the temporary directory.
+///
+/// The [description] is a map from file names to content.
+/// If the content is again a map, it represents a subdirectory
+/// with the content as description.
+/// Otherwise the content should be a string,
+/// which is written to the file as UTF-8.
+// Directory createTestFiles(Map<String, Object> description) {
+// var target = Directory.systemTemp.createTempSync("pkgcfgtest");
+// _createFiles(target, description);
+// return target;
+// }
+
+// Creates temporary files in the target directory.
+void _createFiles(Directory target, Map<Object?, Object?> description) {
+ description.forEach((name, content) {
+ var entryName = pathJoin(target.path, '$name');
+ if (content is Map<Object?, Object?>) {
+ _createFiles(Directory(entryName)..createSync(), content);
+ } else {
+ File(entryName).writeAsStringSync(content as String, flush: true);
+ }
+ });
+}
+
+/// Creates a [Directory] for a subdirectory of [parent].
+Directory subdir(Directory parent, String dirName) =>
+ Directory(pathJoinAll([parent.path, ...dirName.split('/')]));
+
+/// Creates a [File] for an entry in the [directory] directory.
+File dirFile(Directory directory, String fileName) =>
+ File(pathJoin(directory.path, fileName));
diff --git a/pkgs/pool/.gitignore b/pkgs/pool/.gitignore
new file mode 100644
index 0000000..e450c83
--- /dev/null
+++ b/pkgs/pool/.gitignore
@@ -0,0 +1,5 @@
+# Don’t commit the following directories created by pub.
+.dart_tool/
+.packages
+.pub/
+pubspec.lock
diff --git a/pkgs/pool/CHANGELOG.md b/pkgs/pool/CHANGELOG.md
new file mode 100644
index 0000000..56424fc
--- /dev/null
+++ b/pkgs/pool/CHANGELOG.md
@@ -0,0 +1,105 @@
+## 1.5.2-wip
+
+* Require Dart 3.4.
+* Move to `dart-lang/tools` monorepo.
+
+## 1.5.1
+
+* Populate the pubspec `repository` field.
+
+## 1.5.0
+
+* Stable release for null safety.
+
+## 1.5.0-nullsafety.3
+
+* Update SDK constraints to `>=2.12.0-0 <3.0.0` based on beta release
+ guidelines.
+
+## 1.5.0-nullsafety.2
+
+* Allow prerelease versions of the 2.12 sdk.
+
+## 1.5.0-nullsafety.1
+
+* Allow 2.10 stable and 2.11.0 dev SDK versions.
+
+## 1.5.0-nullsafety
+
+* Migrate to null safety.
+* `forEach`: Avoid `await null` if the `Stream` is not paused.
+ Improves trivial benchmark by 40%.
+
+## 1.4.0
+
+* Add `forEach` to `Pool` to support efficient async processing of an
+ `Iterable`.
+
+* Throw `ArgumentError` if `poolSize <= 0`.
+
+## 1.3.6
+
+* Set max SDK version to `<3.0.0`, and adjust other dependencies.
+
+## 1.3.5
+
+* Updated SDK version to 2.0.0-dev.17.0.
+
+## 1.3.4
+
+* Modify code to eliminate Future flattening.
+
+## 1.3.3
+
+* Declare support for `async` 2.0.0.
+
+## 1.3.2
+
+* Update to make the code work with strong-mode clean Zone API.
+
+* Required minimum SDK of 1.23.0.
+
+## 1.3.1
+
+* Fix the type annotation of `Pool.withResource()` to indicate that it takes
+ `() -> FutureOr<T>`.
+
+## 1.3.0
+
+* Add a `Pool.done` getter that returns the same future returned by
+ `Pool.close()`.
+
+## 1.2.4
+
+* Fix a strong-mode error.
+
+## 1.2.3
+
+* Fix a bug in which `Pool.withResource()` could throw a `StateError` when
+ called immediately before closing the pool.
+
+## 1.2.2
+
+* Fix strong mode warnings and add generic method annotations.
+
+## 1.2.1
+
+* Internal changes only.
+
+## 1.2.0
+
+* Add `Pool.close()`, which forbids new resource requests and releases all
+ releasable resources.
+
+## 1.1.0
+
+* Add `PoolResource.allowRelease()`, which allows a resource to indicate that it
+ can be released without forcing it to deallocate immediately.
+
+## 1.0.2
+
+* Fixed the homepage.
+
+## 1.0.1
+
+* A `TimeoutException` is now correctly thrown if the pool detects a deadlock.
diff --git a/pkgs/pool/LICENSE b/pkgs/pool/LICENSE
new file mode 100644
index 0000000..000cd7b
--- /dev/null
+++ b/pkgs/pool/LICENSE
@@ -0,0 +1,27 @@
+Copyright 2014, the Dart project authors.
+
+Redistribution and use in source and binary forms, with or without
+modification, are permitted provided that the following conditions are
+met:
+
+ * Redistributions of source code must retain the above copyright
+ notice, this list of conditions and the following disclaimer.
+ * Redistributions in binary form must reproduce the above
+ copyright notice, this list of conditions and the following
+ disclaimer in the documentation and/or other materials provided
+ with the distribution.
+ * Neither the name of Google LLC nor the names of its
+ contributors may be used to endorse or promote products derived
+ from this software without specific prior written permission.
+
+THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
diff --git a/pkgs/pool/README.md b/pkgs/pool/README.md
new file mode 100644
index 0000000..461e872
--- /dev/null
+++ b/pkgs/pool/README.md
@@ -0,0 +1,57 @@
+[](https://github.com/dart-lang/tools/actions/workflows/pool.yaml)
+[](https://pub.dev/packages/pool)
+[](https://pub.dev/packages/pool/publisher)
+
+The pool package exposes a `Pool` class which makes it easy to manage a limited
+pool of resources.
+
+The easiest way to use a pool is by calling `withResource`. This runs a callback
+and returns its result, but only once there aren't too many other callbacks
+currently running.
+
+```dart
+// Create a Pool that will only allocate 10 resources at once. After 30 seconds
+// of inactivity with all resources checked out, the pool will throw an error.
+final pool = Pool(10, timeout: const Duration(seconds: 30));
+
+Future<String> readFile(String path) {
+ // Since the call to [File.readAsString] is within [withResource], no more
+ // than ten files will be open at once.
+  return pool.withResource(() => File(path).readAsString());
+}
+```
+
+For more fine-grained control, the user can also explicitly request generic
+`PoolResource` objects that can later be released back into the pool. This is
+what `withResource` does under the covers: requests a resource, then releases it
+once the callback completes.
+
+`Pool` ensures that only a limited number of resources are allocated at once.
+It's the caller's responsibility to ensure that the corresponding physical
+resource is only consumed when a `PoolResource` is allocated.
+
+```dart
+class PooledFile implements RandomAccessFile {
+ final RandomAccessFile _file;
+ final PoolResource _resource;
+
+ static Future<PooledFile> open(String path) {
+ return pool.request().then((resource) {
+      return File(path).open().then((file) {
+        return PooledFile._(file, resource);
+ });
+ });
+ }
+
+  PooledFile._(this._file, this._resource);
+
+ // ...
+
+ Future<RandomAccessFile> close() {
+    return _file.close().then((_) {
+ _resource.release();
+ return this;
+ });
+ }
+}
+```
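+
+`Pool` also exposes `forEach` for processing a collection with bounded
+concurrency. The sketch below is illustrative only; it assumes `dart:io` is
+imported and that each path points at a readable file:
+
+```dart
+final pool = Pool(5);
+
+Future<int> totalBytes(List<String> paths) async {
+  // At most five files are measured at once. Results may complete out of
+  // order, so the stream's order need not match the order of [paths].
+  var total = 0;
+  await for (final size
+      in pool.forEach<String, int>(paths, (path) => File(path).length())) {
+    total += size;
+  }
+  return total;
+}
+```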
diff --git a/pkgs/pool/analysis_options.yaml b/pkgs/pool/analysis_options.yaml
new file mode 100644
index 0000000..44cda4d
--- /dev/null
+++ b/pkgs/pool/analysis_options.yaml
@@ -0,0 +1,5 @@
+include: package:dart_flutter_team_lints/analysis_options.yaml
+
+analyzer:
+ language:
+ strict-casts: true
diff --git a/pkgs/pool/benchmark/for_each_benchmark.dart b/pkgs/pool/benchmark/for_each_benchmark.dart
new file mode 100644
index 0000000..0cd2543
--- /dev/null
+++ b/pkgs/pool/benchmark/for_each_benchmark.dart
@@ -0,0 +1,55 @@
+// Copyright (c) 2024, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:pool/pool.dart';
+
+void main(List<String> args) async {
+ var poolSize = args.isEmpty ? 5 : int.parse(args.first);
+ print('Pool size: $poolSize');
+
+ final pool = Pool(poolSize);
+ final watch = Stopwatch()..start();
+ final start = DateTime.now();
+
+ DateTime? lastLog;
+ Duration? fastest;
+ late int fastestIteration;
+ var i = 1;
+
+ void log(bool force) {
+ var now = DateTime.now();
+ if (force ||
+ lastLog == null ||
+ now.difference(lastLog!) > const Duration(seconds: 1)) {
+ lastLog = now;
+ print([
+ now.difference(start),
+ i.toString().padLeft(10),
+ fastestIteration.toString().padLeft(7),
+ fastest!.inMicroseconds.toString().padLeft(9)
+ ].join(' '));
+ }
+ }
+
+ print(['Elapsed ', 'Iterations', 'Fastest', 'Time (us)'].join(' '));
+
+ for (;; i++) {
+ watch.reset();
+
+ var sum = await pool
+ .forEach<int, int>(Iterable<int>.generate(100000), (i) => i)
+ .reduce((a, b) => a + b);
+
+ assert(sum == 4999950000, 'was $sum');
+
+ var elapsed = watch.elapsed;
+ if (fastest == null || fastest > elapsed) {
+ fastest = elapsed;
+ fastestIteration = i;
+ log(true);
+ } else {
+ log(false);
+ }
+ }
+}
diff --git a/pkgs/pool/lib/pool.dart b/pkgs/pool/lib/pool.dart
new file mode 100644
index 0000000..70e9df1
--- /dev/null
+++ b/pkgs/pool/lib/pool.dart
@@ -0,0 +1,380 @@
+// Copyright (c) 2014, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+import 'dart:collection';
+
+import 'package:async/async.dart';
+import 'package:stack_trace/stack_trace.dart';
+
+/// Manages an abstract pool of resources with a limit on how many may be in use
+/// at once.
+///
+/// When a resource is needed, the user should call [request]. When the returned
+/// future completes with a [PoolResource], the resource may be allocated. Once
+/// the resource has been released, the user should call [PoolResource.release].
+/// The pool will ensure that only a certain number of [PoolResource]s may be
+/// allocated at once.
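+///
+/// A minimal request/release sketch (error handling beyond the `finally`
+/// block is omitted):
+/// ```dart
+/// final pool = Pool(10);
+///
+/// Future<void> useResource() async {
+///   final resource = await pool.request();
+///   try {
+///     // Use the underlying resource here.
+///   } finally {
+///     resource.release();
+///   }
+/// }
+/// ```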
+class Pool {
+ /// Completers for requests beyond the first [_maxAllocatedResources].
+ ///
+ /// When an item is released, the next element of [_requestedResources] will
+ /// be completed.
+ final _requestedResources = Queue<Completer<PoolResource>>();
+
+ /// Callbacks that must be called before additional resources can be
+ /// allocated.
+ ///
+ /// See [PoolResource.allowRelease].
+ final _onReleaseCallbacks = Queue<void Function()>();
+
+ /// Completers that will be completed once `onRelease` callbacks are done
+ /// running.
+ ///
+ /// These are kept in a queue to ensure that the earliest request completes
+ /// first regardless of what order the `onRelease` callbacks complete in.
+ final _onReleaseCompleters = Queue<Completer<PoolResource>>();
+
+ /// The maximum number of resources that may be allocated at once.
+ final int _maxAllocatedResources;
+
+ /// The number of resources that are currently allocated.
+ int _allocatedResources = 0;
+
+ /// The timeout timer.
+ ///
+ /// This timer is canceled as long as the pool is below the resource limit.
+  /// It's reset once the resource limit is reached and again every time a
+  /// resource is released or a new resource is requested. If it fires, that
+ /// indicates that the caller became deadlocked, likely due to files waiting
+ /// for additional files to be read before they could be closed.
+ ///
+ /// This is `null` if this pool shouldn't time out.
+ RestartableTimer? _timer;
+
+ /// The amount of time to wait before timing out the pending resources.
+ final Duration? _timeout;
+
+ /// A [FutureGroup] that tracks all the `onRelease` callbacks for resources
+ /// that have been marked releasable.
+ ///
+ /// This is `null` until [close] is called.
+ FutureGroup? _closeGroup;
+
+ /// Whether [close] has been called.
+ bool get isClosed => _closeMemo.hasRun;
+
+ /// A future that completes once the pool is closed and all its outstanding
+ /// resources have been released.
+ ///
+ /// If any [PoolResource.allowRelease] callback throws an exception after the
+ /// pool is closed, this completes with that exception.
+ Future get done => _closeMemo.future;
+
+ /// Creates a new pool with the given limit on how many resources may be
+ /// allocated at once.
+ ///
+  /// If [timeout] is passed, then if that much time passes without any activity,
+ /// all pending [request] futures will throw a [TimeoutException]. This is
+ /// intended to avoid deadlocks.
+ Pool(this._maxAllocatedResources, {Duration? timeout}) : _timeout = timeout {
+ if (_maxAllocatedResources <= 0) {
+ throw ArgumentError.value(_maxAllocatedResources, 'maxAllocatedResources',
+ 'Must be greater than zero.');
+ }
+
+ if (timeout != null) {
+ // Start the timer canceled since we only want to start counting down once
+ // we've run out of available resources.
+ _timer = RestartableTimer(timeout, _onTimeout)..cancel();
+ }
+ }
+
+ /// Request a [PoolResource].
+ ///
+ /// If the maximum number of resources is already allocated, this will delay
+ /// until one of them is released.
+ Future<PoolResource> request() {
+ if (isClosed) {
+ throw StateError('request() may not be called on a closed Pool.');
+ }
+
+ if (_allocatedResources < _maxAllocatedResources) {
+ _allocatedResources++;
+ return Future.value(PoolResource._(this));
+ } else if (_onReleaseCallbacks.isNotEmpty) {
+ return _runOnRelease(_onReleaseCallbacks.removeFirst());
+ } else {
+ var completer = Completer<PoolResource>();
+ _requestedResources.add(completer);
+ _resetTimer();
+ return completer.future;
+ }
+ }
+
+ /// Requests a resource for the duration of [callback], which may return a
+ /// Future.
+ ///
+ /// The return value of [callback] is piped to the returned Future.
+ Future<T> withResource<T>(FutureOr<T> Function() callback) async {
+ if (isClosed) {
+ throw StateError('withResource() may not be called on a closed Pool.');
+ }
+
+ var resource = await request();
+ try {
+ return await callback();
+ } finally {
+ resource.release();
+ }
+ }
+
+ /// Returns a [Stream] containing the result of [action] applied to each
+ /// element of [elements].
+ ///
+ /// While [action] is invoked on each element of [elements] in order,
+  /// it's possible the returned [Stream] may have items out-of-order, especially
+ /// if the completion time of [action] varies.
+ ///
+ /// If [action] throws an error the source item along with the error object
+ /// and [StackTrace] are passed to [onError], if it is provided. If [onError]
+ /// returns `true`, the error is added to the returned [Stream], otherwise
+ /// it is ignored.
+ ///
+ /// Errors thrown from iterating [elements] will not be passed to
+ /// [onError]. They will always be added to the returned stream as an error.
+ ///
+  /// Note: all of the resources of this [Pool] will be used when the
+ /// returned [Stream] is listened to until it is completed or canceled.
+ ///
+ /// Note: if this [Pool] is closed before the returned [Stream] is listened
+ /// to, a [StateError] is thrown.
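+  ///
+  /// A minimal sketch of [onError]; `pool`, `urls`, and a `fetch` function
+  /// that may throw are assumed to be defined by the caller:
+  /// ```dart
+  /// final bodies = pool.forEach(urls, fetch,
+  ///     onError: (url, error, stack) => error is! FormatException);
+  /// // FormatExceptions are dropped; any other error is added to [bodies].
+  /// ```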
+ Stream<T> forEach<S, T>(
+ Iterable<S> elements, FutureOr<T> Function(S source) action,
+ {bool Function(S item, Object error, StackTrace stack)? onError}) {
+ onError ??= (item, e, s) => true;
+
+ var cancelPending = false;
+
+ Completer? resumeCompleter;
+ late StreamController<T> controller;
+
+ late Iterator<S> iterator;
+
+ Future<void> run(int _) async {
+ while (iterator.moveNext()) {
+ // caching `current` is necessary because there are async breaks
+ // in this code and `iterator` is shared across many workers
+ final current = iterator.current;
+
+ _resetTimer();
+
+ if (resumeCompleter != null) {
+ await resumeCompleter!.future;
+ }
+
+ if (cancelPending) {
+ break;
+ }
+
+ T value;
+ try {
+ value = await action(current);
+ } catch (e, stack) {
+ if (onError!(current, e, stack)) {
+ controller.addError(e, stack);
+ }
+ continue;
+ }
+ controller.add(value);
+ }
+ }
+
+ Future<void>? doneFuture;
+
+ void onListen() {
+ iterator = elements.iterator;
+
+ assert(doneFuture == null);
+ var futures = Iterable<Future<void>>.generate(
+ _maxAllocatedResources, (i) => withResource(() => run(i)));
+ doneFuture = Future.wait(futures, eagerError: true)
+ .then<void>((_) {})
+ .catchError(controller.addError);
+
+ doneFuture!.whenComplete(controller.close);
+ }
+
+ controller = StreamController<T>(
+ sync: true,
+ onListen: onListen,
+ onCancel: () async {
+ assert(!cancelPending);
+ cancelPending = true;
+ await doneFuture;
+ },
+ onPause: () {
+ assert(resumeCompleter == null);
+ resumeCompleter = Completer<void>();
+ },
+ onResume: () {
+ assert(resumeCompleter != null);
+ resumeCompleter!.complete();
+ resumeCompleter = null;
+ },
+ );
+
+ return controller.stream;
+ }
+
+ /// Closes the pool so that no more resources are requested.
+ ///
+ /// Existing resource requests remain unchanged.
+ ///
+ /// Any resources that are marked as releasable using
+ /// [PoolResource.allowRelease] are released immediately. Once all resources
+ /// have been released and any `onRelease` callbacks have completed, the
+ /// returned future completes successfully. If any `onRelease` callback throws
+ /// an error, the returned future completes with that error.
+ ///
+ /// This may be called more than once; it returns the same [Future] each time.
+ Future close() => _closeMemo.runOnce(_close);
+
+ Future<void> _close() {
+ if (_closeGroup != null) return _closeGroup!.future;
+
+ _resetTimer();
+
+ _closeGroup = FutureGroup();
+ for (var callback in _onReleaseCallbacks) {
+ _closeGroup!.add(Future.sync(callback));
+ }
+
+ _allocatedResources -= _onReleaseCallbacks.length;
+ _onReleaseCallbacks.clear();
+
+ if (_allocatedResources == 0) _closeGroup!.close();
+ return _closeGroup!.future;
+ }
+
+ final _closeMemo = AsyncMemoizer<void>();
+
+ /// If there are any pending requests, this will fire the oldest one.
+ void _onResourceReleased() {
+ _resetTimer();
+
+ if (_requestedResources.isNotEmpty) {
+ var pending = _requestedResources.removeFirst();
+ pending.complete(PoolResource._(this));
+ } else {
+ _allocatedResources--;
+ if (isClosed && _allocatedResources == 0) _closeGroup!.close();
+ }
+ }
+
+ /// If there are any pending requests, this will fire the oldest one after
+ /// running [onRelease].
+ void _onResourceReleaseAllowed(void Function() onRelease) {
+ _resetTimer();
+
+ if (_requestedResources.isNotEmpty) {
+ var pending = _requestedResources.removeFirst();
+ pending.complete(_runOnRelease(onRelease));
+ } else if (isClosed) {
+ _closeGroup!.add(Future.sync(onRelease));
+ _allocatedResources--;
+ if (_allocatedResources == 0) _closeGroup!.close();
+ } else {
+ var zone = Zone.current;
+ var registered = zone.registerCallback(onRelease);
+ _onReleaseCallbacks.add(() => zone.run(registered));
+ }
+ }
+
+ /// Runs [onRelease] and returns a Future that completes to a resource once an
+ /// [onRelease] callback completes.
+ ///
+ /// Futures returned by [_runOnRelease] always complete in the order they were
+ /// created, even if earlier [onRelease] callbacks take longer to run.
+ Future<PoolResource> _runOnRelease(void Function() onRelease) {
+ Future.sync(onRelease).then((value) {
+ _onReleaseCompleters.removeFirst().complete(PoolResource._(this));
+ }).catchError((Object error, StackTrace stackTrace) {
+ _onReleaseCompleters.removeFirst().completeError(error, stackTrace);
+ });
+
+ var completer = Completer<PoolResource>.sync();
+ _onReleaseCompleters.add(completer);
+ return completer.future;
+ }
+
+ /// A resource has been requested, allocated, or released.
+ void _resetTimer() {
+ if (_timer == null) return;
+
+ if (_requestedResources.isEmpty) {
+ _timer!.cancel();
+ } else {
+ _timer!.reset();
+ }
+ }
+
+ /// Handles [_timer] timing out by causing all pending resource completers to
+ /// emit exceptions.
+ void _onTimeout() {
+ for (var completer in _requestedResources) {
+ completer.completeError(
+ TimeoutException(
+ 'Pool deadlock: all resources have been '
+ 'allocated for too long.',
+ _timeout),
+ Chain.current());
+ }
+ _requestedResources.clear();
+ _timer = null;
+ }
+}
+
+/// A member of a [Pool].
+///
+/// A [PoolResource] is a token that indicates that a resource is allocated.
+/// When the associated resource is released, the user should call [release].
+class PoolResource {
+ final Pool _pool;
+
+ /// Whether `this` has been released yet.
+ bool _released = false;
+
+ PoolResource._(this._pool);
+
+ /// Tells the parent [Pool] that the resource associated with this resource is
+ /// no longer allocated, and that a new [PoolResource] may be allocated.
+ void release() {
+ if (_released) {
+ throw StateError('A PoolResource may only be released once.');
+ }
+ _released = true;
+ _pool._onResourceReleased();
+ }
+
+ /// Tells the parent [Pool] that the resource associated with this resource is
+ /// no longer necessary, but should remain allocated until more resources are
+ /// needed.
+ ///
+ /// When [Pool.request] is called and there are no remaining available
+ /// resources, the [onRelease] callback is called. It should free the
+ /// resource, and it may return a Future or `null`. Once that completes, the
+ /// [Pool.request] call will complete to a new [PoolResource].
+ ///
+ /// This is useful when a resource's main function is complete, but it may
+ /// produce additional information later on. For example, an isolate's task
+ /// may be complete, but it could still emit asynchronous errors.
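+  ///
+  /// A minimal sketch; `openConnection()` and `connection.close()` are
+  /// hypothetical helpers standing in for the caller's own resource:
+  /// ```dart
+  /// final resource = await pool.request();
+  /// final connection = await openConnection();
+  /// // ... use the connection ...
+  /// resource.allowRelease(() => connection.close());
+  /// ```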
+ void allowRelease(FutureOr<void> Function() onRelease) {
+ if (_released) {
+ throw StateError('A PoolResource may only be released once.');
+ }
+ _released = true;
+ _pool._onResourceReleaseAllowed(onRelease);
+ }
+}
diff --git a/pkgs/pool/pubspec.yaml b/pkgs/pool/pubspec.yaml
new file mode 100644
index 0000000..a205b74
--- /dev/null
+++ b/pkgs/pool/pubspec.yaml
@@ -0,0 +1,18 @@
+name: pool
+version: 1.5.2-wip
+description: >-
+ Manage a finite pool of resources.
+ Useful for controlling concurrent file system or network requests.
+repository: https://github.com/dart-lang/tools/tree/main/pkgs/pool
+
+environment:
+ sdk: ^3.4.0
+
+dependencies:
+ async: ^2.5.0
+ stack_trace: ^1.10.0
+
+dev_dependencies:
+ dart_flutter_team_lints: ^3.0.0
+ fake_async: ^1.2.0
+ test: ^1.16.6
diff --git a/pkgs/pool/test/pool_test.dart b/pkgs/pool/test/pool_test.dart
new file mode 100644
index 0000000..6334a8a
--- /dev/null
+++ b/pkgs/pool/test/pool_test.dart
@@ -0,0 +1,745 @@
+// Copyright (c) 2014, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'package:fake_async/fake_async.dart';
+import 'package:pool/pool.dart';
+import 'package:stack_trace/stack_trace.dart';
+import 'package:test/test.dart';
+
+void main() {
+ group('request()', () {
+ test('resources can be requested freely up to the limit', () {
+ var pool = Pool(50);
+ for (var i = 0; i < 50; i++) {
+ expect(pool.request(), completes);
+ }
+ });
+
+ test('resources block past the limit', () {
+ FakeAsync().run((async) {
+ var pool = Pool(50);
+ for (var i = 0; i < 50; i++) {
+ expect(pool.request(), completes);
+ }
+ expect(pool.request(), doesNotComplete);
+
+ async.elapse(const Duration(seconds: 1));
+ });
+ });
+
+ test('a blocked resource is allocated when another is released', () {
+ FakeAsync().run((async) {
+ var pool = Pool(50);
+ for (var i = 0; i < 49; i++) {
+ expect(pool.request(), completes);
+ }
+
+ pool.request().then((lastAllocatedResource) {
+ // This will only complete once [lastAllocatedResource] is released.
+ expect(pool.request(), completes);
+
+ Future<void>.delayed(const Duration(microseconds: 1)).then((_) {
+ lastAllocatedResource.release();
+ });
+ });
+
+ async.elapse(const Duration(seconds: 1));
+ });
+ });
+ });
+
+ group('withResource()', () {
+ test('can be called freely up to the limit', () {
+ var pool = Pool(50);
+ for (var i = 0; i < 50; i++) {
+ pool.withResource(expectAsync0(() => Completer<void>().future));
+ }
+ });
+
+ test('blocks the callback past the limit', () {
+ FakeAsync().run((async) {
+ var pool = Pool(50);
+ for (var i = 0; i < 50; i++) {
+ pool.withResource(expectAsync0(() => Completer<void>().future));
+ }
+ pool.withResource(expectNoAsync());
+
+ async.elapse(const Duration(seconds: 1));
+ });
+ });
+
+ test('a blocked resource is allocated when another is released', () {
+ FakeAsync().run((async) {
+ var pool = Pool(50);
+ for (var i = 0; i < 49; i++) {
+ pool.withResource(expectAsync0(() => Completer<void>().future));
+ }
+
+ var completer = Completer<void>();
+ pool.withResource(() => completer.future);
+ var blockedResourceAllocated = false;
+ pool.withResource(() {
+ blockedResourceAllocated = true;
+ });
+
+ Future<void>.delayed(const Duration(microseconds: 1)).then((_) {
+ expect(blockedResourceAllocated, isFalse);
+ completer.complete();
+ return Future<void>.delayed(const Duration(microseconds: 1));
+ }).then((_) {
+ expect(blockedResourceAllocated, isTrue);
+ });
+
+ async.elapse(const Duration(seconds: 1));
+ });
+ });
+
+ // Regression test for #3.
+ test('can be called immediately before close()', () async {
+ var pool = Pool(1);
+ unawaited(pool.withResource(expectAsync0(() {})));
+ await pool.close();
+ });
+ });
+
+ group('with a timeout', () {
+ test("doesn't time out if there are no pending requests", () {
+ FakeAsync().run((async) {
+ var pool = Pool(50, timeout: const Duration(seconds: 5));
+ for (var i = 0; i < 50; i++) {
+ expect(pool.request(), completes);
+ }
+
+ async.elapse(const Duration(seconds: 6));
+ });
+ });
+
+ test('resets the timer if a resource is returned', () {
+ FakeAsync().run((async) {
+ var pool = Pool(50, timeout: const Duration(seconds: 5));
+ for (var i = 0; i < 49; i++) {
+ expect(pool.request(), completes);
+ }
+
+ pool.request().then((lastAllocatedResource) {
+ // This will only complete once [lastAllocatedResource] is released.
+ expect(pool.request(), completes);
+
+ Future<void>.delayed(const Duration(seconds: 3)).then((_) {
+ lastAllocatedResource.release();
+ expect(pool.request(), doesNotComplete);
+ });
+ });
+
+ async.elapse(const Duration(seconds: 6));
+ });
+ });
+
+ test('resets the timer if a resource is requested', () {
+ FakeAsync().run((async) {
+ var pool = Pool(50, timeout: const Duration(seconds: 5));
+ for (var i = 0; i < 50; i++) {
+ expect(pool.request(), completes);
+ }
+ expect(pool.request(), doesNotComplete);
+
+ Future<void>.delayed(const Duration(seconds: 3)).then((_) {
+ expect(pool.request(), doesNotComplete);
+ });
+
+ async.elapse(const Duration(seconds: 6));
+ });
+ });
+
+ test('times out if nothing happens', () {
+ FakeAsync().run((async) {
+ var pool = Pool(50, timeout: const Duration(seconds: 5));
+ for (var i = 0; i < 50; i++) {
+ expect(pool.request(), completes);
+ }
+ expect(pool.request(), throwsA(const TypeMatcher<TimeoutException>()));
+
+ async.elapse(const Duration(seconds: 6));
+ });
+ });
+ });
+
+ group('allowRelease()', () {
+ test('runs the callback once the resource limit is exceeded', () async {
+ var pool = Pool(50);
+ for (var i = 0; i < 49; i++) {
+ expect(pool.request(), completes);
+ }
+
+ var resource = await pool.request();
+ var onReleaseCalled = false;
+ resource.allowRelease(() => onReleaseCalled = true);
+ await Future<void>.delayed(Duration.zero);
+ expect(onReleaseCalled, isFalse);
+
+ expect(pool.request(), completes);
+ await Future<void>.delayed(Duration.zero);
+ expect(onReleaseCalled, isTrue);
+ });
+
+ test('runs the callback immediately if there are blocked requests',
+ () async {
+ var pool = Pool(1);
+ var resource = await pool.request();
+
+ // This will be blocked until [resource.allowRelease] is called.
+ expect(pool.request(), completes);
+
+ var onReleaseCalled = false;
+ resource.allowRelease(() => onReleaseCalled = true);
+ await Future<void>.delayed(Duration.zero);
+ expect(onReleaseCalled, isTrue);
+ });
+
+ test('blocks the request until the callback completes', () async {
+ var pool = Pool(1);
+ var resource = await pool.request();
+
+ var requestComplete = false;
+ unawaited(pool.request().then((_) => requestComplete = true));
+
+ var completer = Completer<void>();
+ resource.allowRelease(() => completer.future);
+ await Future<void>.delayed(Duration.zero);
+ expect(requestComplete, isFalse);
+
+ completer.complete();
+ await Future<void>.delayed(Duration.zero);
+ expect(requestComplete, isTrue);
+ });
+
+ test('completes requests in request order regardless of callback order',
+ () async {
+ var pool = Pool(2);
+ var resource1 = await pool.request();
+ var resource2 = await pool.request();
+
+ var request1Complete = false;
+ unawaited(pool.request().then((_) => request1Complete = true));
+ var request2Complete = false;
+ unawaited(pool.request().then((_) => request2Complete = true));
+
+ var onRelease1Called = false;
+ var completer1 = Completer<void>();
+ resource1.allowRelease(() {
+ onRelease1Called = true;
+ return completer1.future;
+ });
+ await Future<void>.delayed(Duration.zero);
+ expect(onRelease1Called, isTrue);
+
+ var onRelease2Called = false;
+ var completer2 = Completer<void>();
+ resource2.allowRelease(() {
+ onRelease2Called = true;
+ return completer2.future;
+ });
+ await Future<void>.delayed(Duration.zero);
+ expect(onRelease2Called, isTrue);
+ expect(request1Complete, isFalse);
+ expect(request2Complete, isFalse);
+
+ // Complete the second resource's onRelease callback first. Even though it
+ // was triggered by the second blocking request, it should complete the
+ // first one to preserve ordering.
+ completer2.complete();
+ await Future<void>.delayed(Duration.zero);
+ expect(request1Complete, isTrue);
+ expect(request2Complete, isFalse);
+
+ completer1.complete();
+ await Future<void>.delayed(Duration.zero);
+ expect(request1Complete, isTrue);
+ expect(request2Complete, isTrue);
+ });
+
+    test('runs onRelease in the zone it was created', () async {
+ var pool = Pool(1);
+ var resource = await pool.request();
+
+ var outerZone = Zone.current;
+ runZoned(() {
+ var innerZone = Zone.current;
+ expect(innerZone, isNot(equals(outerZone)));
+
+ resource.allowRelease(expectAsync0(() {
+ expect(Zone.current, equals(innerZone));
+ }));
+ });
+
+ await pool.request();
+ });
+ });
+
+ test("done doesn't complete without close", () async {
+ var pool = Pool(1);
+ unawaited(pool.done.then(expectAsync1((_) {}, count: 0)));
+
+ var resource = await pool.request();
+ resource.release();
+
+ await Future<void>.delayed(Duration.zero);
+ });
+
+ group('close()', () {
+ test('disallows request() and withResource()', () {
+ var pool = Pool(1)..close();
+ expect(pool.request, throwsStateError);
+ expect(() => pool.withResource(() {}), throwsStateError);
+ });
+
+ test('pending requests are fulfilled', () async {
+ var pool = Pool(1);
+ var resource1 = await pool.request();
+ expect(
+ pool.request().then((resource2) {
+ resource2.release();
+ }),
+ completes);
+ expect(pool.done, completes);
+ expect(pool.close(), completes);
+ resource1.release();
+ });
+
+ test('pending requests are fulfilled with allowRelease', () async {
+ var pool = Pool(1);
+ var resource1 = await pool.request();
+
+ var completer = Completer<void>();
+ expect(
+ pool.request().then((resource2) {
+ expect(completer.isCompleted, isTrue);
+ resource2.release();
+ }),
+ completes);
+ expect(pool.close(), completes);
+
+ resource1.allowRelease(() => completer.future);
+ await Future<void>.delayed(Duration.zero);
+
+ completer.complete();
+ });
+
+ test("doesn't complete until all resources are released", () async {
+ var pool = Pool(2);
+ var resource1 = await pool.request();
+ var resource2 = await pool.request();
+ var resource3Future = pool.request();
+
+ var resource1Released = false;
+ var resource2Released = false;
+ var resource3Released = false;
+ expect(
+ pool.close().then((_) {
+ expect(resource1Released, isTrue);
+ expect(resource2Released, isTrue);
+ expect(resource3Released, isTrue);
+ }),
+ completes);
+
+ resource1Released = true;
+ resource1.release();
+ await Future<void>.delayed(Duration.zero);
+
+ resource2Released = true;
+ resource2.release();
+ await Future<void>.delayed(Duration.zero);
+
+ var resource3 = await resource3Future;
+ resource3Released = true;
+ resource3.release();
+ });
+
+ test('active onReleases complete as usual', () async {
+ var pool = Pool(1);
+ var resource = await pool.request();
+
+ // Set up an onRelease callback whose completion is controlled by
+ // [completer].
+ var completer = Completer<void>();
+ resource.allowRelease(() => completer.future);
+ expect(
+ pool.request().then((_) {
+ expect(completer.isCompleted, isTrue);
+ }),
+ completes);
+
+ await Future<void>.delayed(Duration.zero);
+ unawaited(pool.close());
+
+ await Future<void>.delayed(Duration.zero);
+ completer.complete();
+ });
+
+ test('inactive onReleases fire', () async {
+ var pool = Pool(2);
+ var resource1 = await pool.request();
+ var resource2 = await pool.request();
+
+ var completer1 = Completer<void>();
+ resource1.allowRelease(() => completer1.future);
+ var completer2 = Completer<void>();
+ resource2.allowRelease(() => completer2.future);
+
+ expect(
+ pool.close().then((_) {
+ expect(completer1.isCompleted, isTrue);
+ expect(completer2.isCompleted, isTrue);
+ }),
+ completes);
+
+ await Future<void>.delayed(Duration.zero);
+ completer1.complete();
+
+ await Future<void>.delayed(Duration.zero);
+ completer2.complete();
+ });
+
+ test('new allowReleases fire immediately', () async {
+ var pool = Pool(1);
+ var resource = await pool.request();
+
+ var completer = Completer<void>();
+ expect(
+ pool.close().then((_) {
+ expect(completer.isCompleted, isTrue);
+ }),
+ completes);
+
+ await Future<void>.delayed(Duration.zero);
+ resource.allowRelease(() => completer.future);
+
+ await Future<void>.delayed(Duration.zero);
+ completer.complete();
+ });
+
+ test('an onRelease error is piped to the return value', () async {
+ var pool = Pool(1);
+ var resource = await pool.request();
+
+ var completer = Completer<void>();
+ resource.allowRelease(() => completer.future);
+
+ expect(pool.done, throwsA('oh no!'));
+ expect(pool.close(), throwsA('oh no!'));
+
+ await Future<void>.delayed(Duration.zero);
+ completer.completeError('oh no!');
+ });
+ });
+
+ group('forEach', () {
+ late Pool pool;
+
+ tearDown(() async {
+ await pool.close();
+ });
+
+ const delayedToStringDuration = Duration(milliseconds: 10);
+
+ Future<String> delayedToString(int i) =>
+ Future<String>.delayed(delayedToStringDuration, () => i.toString());
+
+ for (var itemCount in [0, 5]) {
+ for (var poolSize in [1, 5, 6]) {
+ test('poolSize: $poolSize, itemCount: $itemCount', () async {
+ pool = Pool(poolSize);
+
+ var finishedItems = 0;
+
+ await for (var item in pool.forEach(
+ Iterable.generate(itemCount, (i) {
+ expect(i, lessThanOrEqualTo(finishedItems + poolSize),
+ reason: 'the iterator should be called lazily');
+ return i;
+ }),
+ delayedToString)) {
+ expect(int.parse(item), lessThan(itemCount));
+ finishedItems++;
+ }
+
+ expect(finishedItems, itemCount);
+ });
+ }
+ }
+
+ test('pool closed before listen', () async {
+ pool = Pool(2);
+
+ var stream = pool.forEach(Iterable<int>.generate(5), delayedToString);
+
+ await pool.close();
+
+ expect(stream.toList(), throwsStateError);
+ });
+
+ test('completes even if the pool is partially used', () async {
+ pool = Pool(2);
+
+ var resource = await pool.request();
+
+ var stream = pool.forEach(<int>[], delayedToString);
+
+ expect(await stream.length, 0);
+
+ resource.release();
+ });
+
+ test('stream paused longer than timeout', () async {
+ pool = Pool(2, timeout: delayedToStringDuration);
+
+ var resource = await pool.request();
+
+ var stream = pool.forEach<int, int>(
+ Iterable.generate(100, (i) {
+ expect(i, lessThan(20),
+ reason: 'The timeout should happen '
+ 'before the entire iterable is iterated.');
+ return i;
+ }), (i) async {
+ await Future<void>.delayed(Duration(milliseconds: i));
+ return i;
+ });
+
+ await expectLater(
+ stream.toList,
+ throwsA(const TypeMatcher<TimeoutException>().having(
+ (te) => te.message,
+ 'message',
+ contains('Pool deadlock: '
+ 'all resources have been allocated for too long.'))));
+
+ resource.release();
+ });
+
+ group('timing and timeout', () {
+ for (var poolSize in [2, 8, 64]) {
+ for (var otherTaskCount
+ in [0, 1, 7, 63].where((otc) => otc < poolSize)) {
+ test('poolSize: $poolSize, otherTaskCount: $otherTaskCount',
+ () async {
+ final itemCount = 128;
+ pool = Pool(poolSize, timeout: const Duration(milliseconds: 20));
+
+ var otherTasks = await Future.wait(
+ Iterable<int>.generate(otherTaskCount)
+ .map((i) => pool.request()));
+
+ try {
+ var finishedItems = 0;
+
+ var watch = Stopwatch()..start();
+
+ await for (var item in pool.forEach(
+ Iterable.generate(itemCount, (i) {
+ expect(i, lessThanOrEqualTo(finishedItems + poolSize),
+ reason: 'the iterator should be called lazily');
+ return i;
+ }),
+ delayedToString)) {
+ expect(int.parse(item), lessThan(itemCount));
+ finishedItems++;
+ }
+
+ expect(finishedItems, itemCount);
+
+ final expectedElapsed =
+ delayedToStringDuration.inMicroseconds * 4;
+
+ expect((watch.elapsed ~/ itemCount).inMicroseconds,
+ lessThan(expectedElapsed / (poolSize - otherTaskCount)),
+ reason: 'Average time per task should be '
+ 'proportionate to the available pool resources.');
+ } finally {
+ for (var task in otherTasks) {
+ task.release();
+ }
+ }
+ });
+ }
+ }
+ }, testOn: 'vm');
+
+ test('partial iteration', () async {
+ pool = Pool(5);
+ var stream = pool.forEach(Iterable<int>.generate(100), delayedToString);
+ expect(await stream.take(10).toList(), hasLength(10));
+ });
+
+ test('pool close during data with waiting to be done', () async {
+ pool = Pool(5);
+
+ var stream = pool.forEach(Iterable<int>.generate(100), delayedToString);
+
+ var dataCount = 0;
+ var subscription = stream.listen((data) {
+ dataCount++;
+ pool.close();
+ });
+
+ await subscription.asFuture<void>();
+ expect(dataCount, 100);
+ await subscription.cancel();
+ });
+
+    test('pause and resume', () async {
+ var generatedCount = 0;
+ var dataCount = 0;
+ final poolSize = 5;
+
+ pool = Pool(poolSize);
+
+ var stream = pool.forEach(
+ Iterable<int>.generate(40, (i) {
+ expect(generatedCount, lessThanOrEqualTo(dataCount + 2 * poolSize),
+ reason: 'The iterator should not be called '
+ 'much faster than the data is consumed.');
+ generatedCount++;
+ return i;
+ }),
+ delayedToString);
+
+ // ignore: cancel_subscriptions
+ late StreamSubscription subscription;
+
+ subscription = stream.listen(
+ (data) {
+ dataCount++;
+
+ if (int.parse(data) % 3 == 1) {
+ subscription.pause(Future(() async {
+ await Future<void>.delayed(const Duration(milliseconds: 100));
+ }));
+ }
+ },
+ onError: registerException,
+ onDone: expectAsync0(() {
+ expect(dataCount, 40);
+ }),
+ );
+ });
+
+ group('cancel', () {
+ final dataSize = 32;
+ for (var i = 1; i < 5; i++) {
+ test('with pool size $i', () async {
+ pool = Pool(i);
+
+ var stream =
+ pool.forEach(Iterable<int>.generate(dataSize), delayedToString);
+
+ var cancelCompleter = Completer<void>();
+
+ StreamSubscription subscription;
+
+ var eventCount = 0;
+ subscription = stream.listen((data) {
+ eventCount++;
+ if (int.parse(data) == dataSize ~/ 2) {
+ cancelCompleter.complete();
+ }
+ }, onError: registerException);
+
+ await cancelCompleter.future;
+
+ await subscription.cancel();
+
+ expect(eventCount, 1 + dataSize ~/ 2);
+ });
+ }
+ });
+
+ group('errors', () {
+ Future<void> errorInIterator({
+ bool Function(int item, Object error, StackTrace stack)? onError,
+ }) async {
+ pool = Pool(20);
+
+ var listFuture = pool
+ .forEach(
+ Iterable.generate(100, (i) {
+ if (i == 50) {
+ throw StateError('error while generating item in iterator');
+ }
+
+ return i;
+ }),
+ delayedToString,
+ onError: onError)
+ .toList();
+
+ await expectLater(() async => listFuture, throwsStateError);
+ }
+
+ test('iteration, no onError', () async {
+ await errorInIterator();
+ });
+ test('iteration, with onError', () async {
+ await errorInIterator(onError: (i, e, s) => false);
+ });
+
+ test('error in action, no onError', () async {
+ pool = Pool(20);
+
+ var listFuture = pool.forEach(Iterable<int>.generate(100), (i) async {
+ await Future<void>.delayed(const Duration(milliseconds: 10));
+ if (i == 10) {
+ throw UnsupportedError('10 is not supported');
+ }
+ return i.toString();
+ }).toList();
+
+ await expectLater(() async => listFuture, throwsUnsupportedError);
+ });
+
+    test('error in action, with onError', () async {
+ pool = Pool(20);
+
+ var list = await pool.forEach(Iterable<int>.generate(100),
+ (int i) async {
+ await Future<void>.delayed(const Duration(milliseconds: 10));
+ if (i % 10 == 0) {
+ throw UnsupportedError('Multiples of 10 not supported');
+ }
+ return i.toString();
+ },
+ onError: (item, error, stack) =>
+ error is! UnsupportedError).toList();
+
+ expect(list, hasLength(90));
+ });
+ });
+ });
+
+ test('throw error when pool limit <= 0', () {
+ expect(() => Pool(-1), throwsArgumentError);
+ expect(() => Pool(0), throwsArgumentError);
+ });
+}
+
+/// Returns a function that will cause the test to fail if it's called.
+///
+/// This should only be called within a [FakeAsync.run] zone.
+void Function() expectNoAsync() {
+ var stack = Trace.current(1);
+ return () => registerException(
+ TestFailure('Expected function not to be called.'), stack);
+}
+
+/// A matcher for Futures that asserts that they don't complete.
+///
+/// This should only be called within a [FakeAsync.run] zone.
+Matcher get doesNotComplete => predicate((Future future) {
+ var stack = Trace.current(1);
+ future.then((_) => registerException(
+ TestFailure('Expected future not to complete.'), stack));
+ return true;
+ });
diff --git a/pkgs/pub_semver/.gitignore b/pkgs/pub_semver/.gitignore
new file mode 100644
index 0000000..49ce72d
--- /dev/null
+++ b/pkgs/pub_semver/.gitignore
@@ -0,0 +1,3 @@
+.dart_tool/
+.packages
+pubspec.lock
diff --git a/pkgs/pub_semver/CHANGELOG.md b/pkgs/pub_semver/CHANGELOG.md
new file mode 100644
index 0000000..a31fbb2
--- /dev/null
+++ b/pkgs/pub_semver/CHANGELOG.md
@@ -0,0 +1,177 @@
+## 2.1.5
+
+- Require Dart `3.4.0`.
+- Move to `dart-lang/tools` monorepo.
+
+## 2.1.4
+
+- Added topics to `pubspec.yaml`.
+
+## 2.1.3
+
+- Add type parameters to the signatures of the `Version.preRelease` and
+ `Version.build` fields (`List` ==> `List<Object>`).
+ [#74](https://github.com/dart-lang/pub_semver/pull/74).
+- Require Dart 2.17.
+
+## 2.1.2
+
+- Add markdown badges to the readme.
+
+## 2.1.1
+
+- Fixed the version parsing pattern to only accept dots between version
+ components.
+
+## 2.1.0
+
+- Added `Version.canonicalizedVersion` to help scrub leading zeros and highlight
+ that `Version.toString()` preserves leading zeros.
+- Annotated `Version` with `@sealed` to discourage users from implementing the
+ interface.
+
+## 2.0.0
+
+- Stable null safety release.
+- `Version.primary` now throws `StateError` if the `versions` argument is empty.
+
+## 1.4.4
+
+- Fix a bug in `VersionRange.union` where ranges bounded at infinity could be
+  combined incorrectly.
+
+## 1.4.3
+
+- Update Dart SDK constraint to `>=2.0.0 <3.0.0`.
+- Update `package:collection` constraint to `^1.0.0`.
+
+## 1.4.2
+
+* Set max SDK version to `<3.0.0`.
+
+## 1.4.1
+
+* Fix a bug where the upper bound of a version range with a build identifier
+ could accidentally be rewritten.
+
+## 1.4.0
+
+* Add a `Version.firstPreRelease` getter that returns the first possible
+ pre-release of a version.
+
+* Add a `Version.isFirstPreRelease` getter that returns whether a version is the
+ first possible pre-release.
+
+* `new VersionRange()` with an exclusive maximum now replaces the maximum with
+ its first pre-release version. This matches the existing semantics, where an
+ exclusive maximum would exclude pre-release versions of that maximum.
+
+ Explicitly representing this by changing the maximum version ensures that all
+ operations behave correctly with respect to the special pre-release semantics.
+ In particular, it fixes bugs where, for example,
+ `(>=1.0.0 <2.0.0-dev).union(>=2.0.0-dev <2.0.0)` and
+ `(>=1.0.0 <3.0.0).difference(^1.0.0)` wouldn't include `2.0.0-dev`.
+
+* Add an `alwaysIncludeMaxPreRelease` parameter to `new VersionRange()`, which
+ disables the replacement described above and allows users to create ranges
+ that do include the pre-release versions of an exclusive max version.
+
+## 1.3.7
+
+* Fix more bugs with `VersionRange.intersect()`, `VersionRange.difference()`,
+ and `VersionRange.union()` involving version ranges with pre-release maximums.
+
+## 1.3.6
+
+* Fix a bug where constraints that only allowed pre-release versions would be
+ parsed as empty constraints.
+
+## 1.3.5
+
+* Fix a bug where `VersionRange.intersect()` would return incorrect results for
+ pre-release versions with the same base version number as release versions.
+
+## 1.3.4
+
+* Fix a bug where `VersionRange.allowsAll()`, `VersionRange.allowsAny()`, and
+ `VersionRange.difference()` would return incorrect results for pre-release
+ versions with the same base version number as release versions.
+
+## 1.3.3
+
+* Fix a bug where `VersionRange.difference()` with a union constraint that
+ covered the entire range would crash.
+
+## 1.3.2
+
+* Fix a checked-mode error in `VersionRange.difference()`.
+
+## 1.3.1
+
+* Fix a new strong mode error.
+
+## 1.3.0
+
+* Make the `VersionUnion` class public. This was previously used internally to
+ implement `new VersionConstraint.unionOf()` and `VersionConstraint.union()`.
+ Now it's public so you can use it too.
+
+* Added `VersionConstraint.difference()`. This returns a constraint matching all
+ versions matched by one constraint but not another.
+
+* Make `VersionRange` implement `Comparable<VersionRange>`. Ranges are ordered
+ first by lower bound, then by upper bound.
+
+## 1.2.4
+
+* Fix all remaining strong mode warnings.
+
+## 1.2.3
+
+* Addressed three strong mode warnings.
+
+## 1.2.2
+
+* Make the package analyze under strong mode and compile with the DDC (Dart Dev
+ Compiler). Fix two issues with a private subclass of `VersionConstraint`
+ having different types for overridden methods.
+
+## 1.2.1
+
+* Allow version ranges like `>=1.2.3-dev.1 <1.2.3` to match pre-release versions
+ of `1.2.3`. Previously, these didn't match, since the pre-release versions had
+ the same major, minor, and patch numbers as the max; now an exception has been
+ added if they also have the same major, minor, and patch numbers as the min
+ *and* the min is also a pre-release version.
+
+## 1.2.0
+
+* Add a `VersionConstraint.union()` method and a `new
+ VersionConstraint.unionOf()` constructor. These each return a constraint that
+ matches multiple existing constraints.
+
+* Add a `VersionConstraint.allowsAll()` method, which returns whether one
+ constraint is a superset of another.
+
+* Add a `VersionConstraint.allowsAny()` method, which returns whether one
+ constraint overlaps another.
+
+* `Version` now implements `VersionRange`.
+
+## 1.1.0
+
+* Add support for the `^` operator for compatible versions according to pub's
+ notion of compatibility. `^1.2.3` is equivalent to `>=1.2.3 <2.0.0`; `^0.1.2`
+ is equivalent to `>=0.1.2 <0.2.0`.
+
+* Add `Version.nextBreaking`, which returns the next version that introduces
+ breaking changes after a given version.
+
+* Add `new VersionConstraint.compatibleWith()`, which returns a range covering
+ all versions compatible with a given version.
+
+* Add a custom `VersionRange.hashCode` to make it properly hashable.
+
+## 1.0.0
+
+* Initial release.
diff --git a/pkgs/pub_semver/LICENSE b/pkgs/pub_semver/LICENSE
new file mode 100644
index 0000000..000cd7b
--- /dev/null
+++ b/pkgs/pub_semver/LICENSE
@@ -0,0 +1,27 @@
+Copyright 2014, the Dart project authors.
+
+Redistribution and use in source and binary forms, with or without
+modification, are permitted provided that the following conditions are
+met:
+
+ * Redistributions of source code must retain the above copyright
+ notice, this list of conditions and the following disclaimer.
+ * Redistributions in binary form must reproduce the above
+ copyright notice, this list of conditions and the following
+ disclaimer in the documentation and/or other materials provided
+ with the distribution.
+ * Neither the name of Google LLC nor the names of its
+ contributors may be used to endorse or promote products derived
+ from this software without specific prior written permission.
+
+THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
diff --git a/pkgs/pub_semver/README.md b/pkgs/pub_semver/README.md
new file mode 100644
index 0000000..03c92a3
--- /dev/null
+++ b/pkgs/pub_semver/README.md
@@ -0,0 +1,107 @@
+[](https://github.com/dart-lang/tools/actions/workflows/pub_semver.yaml)
+[](https://pub.dev/packages/pub_semver)
+[](https://pub.dev/packages/pub_semver/publisher)
+
+Handles version numbers and version constraints in the same way that [pub][]
+does.
+
+## Semantics
+
+The semantics here very closely follow the
+[Semantic Versioning spec version 2.0.0-rc.1][semver]. They differ from semver
+in a few corner cases (see the example at the end of this readme):
+
+ * **Version ordering does take build suffixes into account.** This is unlike
+ semver 2.0.0 but like earlier versions of semver. Version `1.2.3+1` is
+ considered a lower number than `1.2.3+2`.
+
+ Since a package may have published multiple versions that differ only by
+ build suffix, pub still has to pick one of them *somehow*. Semver leaves
+ that issue unresolved, so we just say that build numbers are sorted like
+ pre-release suffixes.
+
+ * **Pre-release versions are excluded from most max ranges.** Let's say a
+ user is depending on "foo" with constraint `>=1.0.0 <2.0.0` and that "foo"
+ has published these versions:
+
+ * `1.0.0`
+ * `1.1.0`
+ * `1.2.0`
+ * `2.0.0-alpha`
+ * `2.0.0-beta`
+ * `2.0.0`
+ * `2.1.0`
+
+ Versions `2.0.0` and `2.1.0` are excluded by the constraint since neither
+ matches `<2.0.0`. However, since semver specifies that pre-release versions
+    are lower than the non-prerelease version (i.e. `2.0.0-beta < 2.0.0`), the
+    `<2.0.0` constraint does technically allow those pre-release versions.
+
+ But that's almost never what the user wants. If their package doesn't work
+ with foo `2.0.0`, it's certainly not likely to work with experimental,
+ unstable versions of `2.0.0`'s API, which is what pre-release versions
+ represent.
+
+ To handle that, `<` version ranges don't allow pre-release versions of the
+ maximum unless the max is itself a pre-release, or the min is a pre-release
+ of the same version. In other words, a `<2.0.0` constraint will prohibit not
+ just `2.0.0` but any pre-release of `2.0.0`. However, `<2.0.0-beta` will
+ exclude `2.0.0-beta` but allow `2.0.0-alpha`. Likewise, `>2.0.0-alpha
+ <2.0.0` will exclude `2.0.0-alpha` but allow `2.0.0-beta`.
+
+ * **Pre-release versions are avoided when possible.** The above case
+ handles pre-release versions at the top of the range, but what about in
+ the middle? What if "foo" has these versions:
+
+ * `1.0.0`
+ * `1.2.0-alpha`
+ * `1.2.0`
+ * `1.3.0-experimental`
+
+ When a number of versions are valid, pub chooses the best one where "best"
+ usually means "highest numbered". That follows the user's intuition that,
+ all else being equal, they want the latest and greatest. Here, that would
+ mean `1.3.0-experimental`. However, most users don't want to use unstable
+ versions of their dependencies.
+
+ We want pre-releases to be explicitly opt-in so that package consumers
+ don't get unpleasant surprises and so that package maintainers are free to
+ put out pre-releases and get feedback without dragging all of their users
+ onto the bleeding edge.
+
+ To accommodate that, when pub is choosing a version, it uses *priority*
+ order which is different from strict comparison ordering. Any stable
+ version is considered higher priority than any unstable version. The above
+ versions, in priority order, are:
+
+ * `1.2.0-alpha`
+ * `1.3.0-experimental`
+ * `1.0.0`
+ * `1.2.0`
+
+ This ensures that users only end up with an unstable version when there are
+ no alternatives. Usually this means they've picked a constraint that
+ specifically selects that unstable version -- they've deliberately opted
+ into it.
+
+ * **There is a notion of compatibility between pre-1.0.0 versions.** Semver
+ deems all pre-1.0.0 versions to be incompatible. This means that the only
+ way to ensure compatibility when depending on a pre-1.0.0 package is to
+ pin the dependency to an exact version. Pinned version constraints prevent
+ automatic patch and pre-release updates. To avoid this situation, pub
+ defines the "next breaking" version as the version which increments the
+ major version if it's greater than zero, and the minor version otherwise,
+ resets subsequent digits to zero, and strips any pre-release or build
+ suffix. For example, here are some versions along with their next breaking
+ ones:
+
+ `0.0.3` -> `0.1.0`
+ `0.7.2-alpha` -> `0.8.0`
+ `1.2.3` -> `2.0.0`
+
+ To make use of this, pub defines a "^" operator which yields a version
+ constraint greater than or equal to a given version, but less than its next
+ breaking one.
+
+[pub]: https://pub.dev
+[semver]: https://semver.org/spec/v2.0.0-rc.1.html
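+
+## Example
+
+A short sketch of the semantics described above, using only the public API of
+this package:
+
+```dart
+import 'package:pub_semver/pub_semver.dart';
+
+void main() {
+  // Build suffixes participate in ordering (unlike semver 2.0.0).
+  print(Version.parse('1.2.3+1') < Version.parse('1.2.3+2')); // true
+
+  // Exclusive upper bounds exclude pre-releases of the maximum.
+  final range = VersionConstraint.parse('>=1.0.0 <2.0.0');
+  print(range.allows(Version.parse('1.9.0'))); // true
+  print(range.allows(Version.parse('2.0.0-alpha'))); // false
+
+  // Stable releases take priority over newer pre-releases.
+  final versions = [
+    Version.parse('1.3.0-experimental'),
+    Version.parse('1.2.0'),
+    Version.parse('1.0.0'),
+  ];
+  versions.sort(Version.prioritize);
+  print(versions.last); // 1.2.0
+
+  // "Next breaking" versions and the ^ operator.
+  print(Version.parse('0.7.2-alpha').nextBreaking); // 0.8.0
+  print(VersionConstraint.parse('^0.1.2').allows(Version.parse('0.1.9'))); // true
+  print(VersionConstraint.parse('^0.1.2').allows(Version.parse('0.2.0'))); // false
+}
+```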
diff --git a/pkgs/pub_semver/analysis_options.yaml b/pkgs/pub_semver/analysis_options.yaml
new file mode 100644
index 0000000..76380a0
--- /dev/null
+++ b/pkgs/pub_semver/analysis_options.yaml
@@ -0,0 +1,31 @@
+# https://dart.dev/guides/language/analysis-options
+include: package:dart_flutter_team_lints/analysis_options.yaml
+
+analyzer:
+ language:
+ strict-casts: true
+ strict-inference: true
+ strict-raw-types: true
+
+linter:
+ rules:
+ - avoid_bool_literals_in_conditional_expressions
+ - avoid_classes_with_only_static_members
+ - avoid_private_typedef_functions
+ - avoid_redundant_argument_values
+ - avoid_returning_this
+ - avoid_unused_constructor_parameters
+ - avoid_void_async
+ - cancel_subscriptions
+ - cascade_invocations
+ - join_return_with_assignment
+ - literal_only_boolean_expressions
+ - missing_whitespace_between_adjacent_strings
+ - no_adjacent_strings_in_list
+ - no_runtimeType_toString
+ - prefer_const_declarations
+ - prefer_expression_function_bodies
+ - unnecessary_await_in_return
+ - use_if_null_to_convert_nulls_to_bools
+ - use_raw_strings
+ - use_string_buffers
diff --git a/pkgs/pub_semver/example/example.dart b/pkgs/pub_semver/example/example.dart
new file mode 100644
index 0000000..890343c
--- /dev/null
+++ b/pkgs/pub_semver/example/example.dart
@@ -0,0 +1,17 @@
+// Copyright (c) 2020, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:pub_semver/pub_semver.dart';
+
+void main() {
+ final range = VersionConstraint.parse('^2.0.0');
+
+ for (var version in [
+ Version.parse('1.2.3-pre'),
+ Version.parse('2.0.0+123'),
+ Version.parse('3.0.0-dev'),
+ ]) {
+ print('$version ${version.isPreRelease} ${range.allows(version)}');
+ }
+}
diff --git a/pkgs/pub_semver/lib/pub_semver.dart b/pkgs/pub_semver/lib/pub_semver.dart
new file mode 100644
index 0000000..4b6487c
--- /dev/null
+++ b/pkgs/pub_semver/lib/pub_semver.dart
@@ -0,0 +1,8 @@
+// Copyright (c) 2014, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+export 'src/version.dart';
+export 'src/version_constraint.dart';
+export 'src/version_range.dart' hide CompatibleWithVersionRange;
+export 'src/version_union.dart';
diff --git a/pkgs/pub_semver/lib/src/patterns.dart b/pkgs/pub_semver/lib/src/patterns.dart
new file mode 100644
index 0000000..03119ac
--- /dev/null
+++ b/pkgs/pub_semver/lib/src/patterns.dart
@@ -0,0 +1,19 @@
+// Copyright (c) 2014, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+/// Regex that matches a version number at the beginning of a string.
+final startVersion = RegExp(r'^' // Start at beginning.
+ r'(\d+)\.(\d+)\.(\d+)' // Version number.
+ r'(-([0-9A-Za-z-]+(\.[0-9A-Za-z-]+)*))?' // Pre-release.
+ r'(\+([0-9A-Za-z-]+(\.[0-9A-Za-z-]+)*))?'); // Build.
+
+/// Like [startVersion] but matches the entire string.
+final completeVersion = RegExp('${startVersion.pattern}\$');
+
+/// Parses a comparison operator ("<", ">", "<=", or ">=") at the beginning of
+/// a string.
+final startComparison = RegExp(r'^[<>]=?');
+
+/// The "compatible with" operator.
+const compatibleWithChar = '^';
diff --git a/pkgs/pub_semver/lib/src/utils.dart b/pkgs/pub_semver/lib/src/utils.dart
new file mode 100644
index 0000000..a9f714f
--- /dev/null
+++ b/pkgs/pub_semver/lib/src/utils.dart
@@ -0,0 +1,58 @@
+// Copyright (c) 2015, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'version.dart';
+import 'version_range.dart';
+
+/// Returns whether [range1] is immediately next to, but not overlapping,
+/// [range2].
+bool areAdjacent(VersionRange range1, VersionRange range2) {
+ if (range1.max != range2.min) return false;
+
+ return (range1.includeMax && !range2.includeMin) ||
+ (!range1.includeMax && range2.includeMin);
+}
+
+/// Returns whether [range1] allows lower versions than [range2].
+bool allowsLower(VersionRange range1, VersionRange range2) {
+ if (range1.min == null) return range2.min != null;
+ if (range2.min == null) return false;
+
+ var comparison = range1.min!.compareTo(range2.min!);
+ if (comparison == -1) return true;
+ if (comparison == 1) return false;
+ return range1.includeMin && !range2.includeMin;
+}
+
+/// Returns whether [range1] allows higher versions than [range2].
+bool allowsHigher(VersionRange range1, VersionRange range2) {
+ if (range1.max == null) return range2.max != null;
+ if (range2.max == null) return false;
+
+ var comparison = range1.max!.compareTo(range2.max!);
+ if (comparison == 1) return true;
+ if (comparison == -1) return false;
+ return range1.includeMax && !range2.includeMax;
+}
+
+/// Returns whether [range1] allows only versions lower than those allowed by
+/// [range2].
+bool strictlyLower(VersionRange range1, VersionRange range2) {
+ if (range1.max == null || range2.min == null) return false;
+
+ var comparison = range1.max!.compareTo(range2.min!);
+ if (comparison == -1) return true;
+ if (comparison == 1) return false;
+ return !range1.includeMax || !range2.includeMin;
+}
+
+/// Returns whether [range1] allows only versions higher than those allowed by
+/// [range2].
+bool strictlyHigher(VersionRange range1, VersionRange range2) =>
+ strictlyLower(range2, range1);
+
+/// Returns whether [version1] and [version2] have the same major, minor, and
+/// patch numbers, ignoring any pre-release or build suffix.
+bool equalsWithoutPreRelease(Version version1, Version version2) =>
+    version1.major == version2.major &&
+    version1.minor == version2.minor &&
+    version1.patch == version2.patch;
diff --git a/pkgs/pub_semver/lib/src/version.dart b/pkgs/pub_semver/lib/src/version.dart
new file mode 100644
index 0000000..90f3d53
--- /dev/null
+++ b/pkgs/pub_semver/lib/src/version.dart
@@ -0,0 +1,391 @@
+// Copyright (c) 2014, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:math' as math;
+
+import 'package:collection/collection.dart';
+import 'package:meta/meta.dart' show sealed;
+
+import 'patterns.dart';
+import 'version_constraint.dart';
+import 'version_range.dart';
+
+/// The equality operator to use for comparing version components.
+const _equality = IterableEquality<Object>();
+
+/// A parsed semantic version number.
+@sealed
+class Version implements VersionConstraint, VersionRange {
+ /// No released version: i.e. "0.0.0".
+ static Version get none => Version(0, 0, 0);
+
+ /// Compares [a] and [b] to see which takes priority over the other.
+ ///
+ /// Returns `1` if [a] takes priority over [b] and `-1` if vice versa. If
+ /// [a] and [b] are equivalent, returns `0`.
+ ///
+ /// Unlike [compareTo], which *orders* versions, this determines which
+ /// version a user is likely to prefer. In particular, it prioritizes
+ /// pre-release versions lower than stable versions, regardless of their
+ /// version numbers. Pub uses this when determining which version to prefer
+ /// when a number of versions are allowed. In that case, it will always
+ /// choose a stable version when possible.
+ ///
+ /// When used to sort a list, orders in ascending priority so that the
+ /// highest priority version is *last* in the result.
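+  ///
+  /// For example, sorting with [prioritize] puts the preferred (stable)
+  /// version last:
+  ///
+  /// ```dart
+  /// final versions = [Version.parse('1.3.0-experimental'), Version.parse('1.2.0')];
+  /// versions.sort(Version.prioritize);
+  /// print(versions.last); // 1.2.0
+  /// ```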
+ static int prioritize(Version a, Version b) {
+ // Sort all prerelease versions after all normal versions. This way
+ // the solver will prefer stable packages over unstable ones.
+ if (a.isPreRelease && !b.isPreRelease) return -1;
+ if (!a.isPreRelease && b.isPreRelease) return 1;
+
+ return a.compareTo(b);
+ }
+
+ /// Like [prioritize], but lower version numbers are considered greater than
+ /// higher version numbers.
+ ///
+ /// This still considers prerelease versions to be lower than non-prerelease
+ /// versions. Pub uses this when downgrading -- it chooses the lowest version
+ /// but still excludes pre-release versions when possible.
+ static int antiprioritize(Version a, Version b) {
+ if (a.isPreRelease && !b.isPreRelease) return -1;
+ if (!a.isPreRelease && b.isPreRelease) return 1;
+
+ return b.compareTo(a);
+ }
+
+ /// The major version number: "1" in "1.2.3".
+ final int major;
+
+ /// The minor version number: "2" in "1.2.3".
+ final int minor;
+
+ /// The patch version number: "3" in "1.2.3".
+ final int patch;
+
+ /// The pre-release identifier: "foo" in "1.2.3-foo".
+ ///
+ /// This is split into a list of components, each of which may be either a
+ /// string or a non-negative integer. It may also be empty, indicating that
+ /// this version has no pre-release identifier.
+ final List<Object> preRelease;
+
+ /// The build identifier: "foo" in "1.2.3+foo".
+ ///
+ /// This is split into a list of components, each of which may be either a
+ /// string or a non-negative integer. It may also be empty, indicating that
+ /// this version has no build identifier.
+ final List<Object> build;
+
+ /// The original string representation of the version number.
+ ///
+ /// This preserves textual artifacts like leading zeros that may be left out
+ /// of the parsed version.
+ final String _text;
+
+ @override
+ Version get min => this;
+ @override
+ Version get max => this;
+ @override
+ bool get includeMin => true;
+ @override
+ bool get includeMax => true;
+
+ Version._(this.major, this.minor, this.patch, String? preRelease,
+ String? build, this._text)
+ : preRelease = preRelease == null ? <Object>[] : _splitParts(preRelease),
+ build = build == null ? [] : _splitParts(build) {
+ if (major < 0) throw ArgumentError('Major version must be non-negative.');
+ if (minor < 0) throw ArgumentError('Minor version must be non-negative.');
+ if (patch < 0) throw ArgumentError('Patch version must be non-negative.');
+ }
+
+ /// Creates a new [Version] object.
+ factory Version(int major, int minor, int patch,
+ {String? pre, String? build}) {
+ var text = '$major.$minor.$patch';
+ if (pre != null) text += '-$pre';
+ if (build != null) text += '+$build';
+
+ return Version._(major, minor, patch, pre, build, text);
+ }
+
+ /// Creates a new [Version] by parsing [text].
+ factory Version.parse(String text) {
+ final match = completeVersion.firstMatch(text);
+ if (match == null) {
+ throw FormatException('Could not parse "$text".');
+ }
+
+ try {
+ var major = int.parse(match[1]!);
+ var minor = int.parse(match[2]!);
+ var patch = int.parse(match[3]!);
+
+ var preRelease = match[5];
+ var build = match[8];
+
+ return Version._(major, minor, patch, preRelease, build, text);
+ } on FormatException {
+ throw FormatException('Could not parse "$text".');
+ }
+ }
+
+ /// Returns the primary version out of [versions].
+ ///
+ /// This is the highest-numbered stable (non-prerelease) version. If there
+ /// are no stable versions, it's just the highest-numbered version.
+ ///
+ /// If [versions] is empty, throws a [StateError].
+ static Version primary(List<Version> versions) {
+ var primary = versions.first;
+ for (var version in versions.skip(1)) {
+ if ((!version.isPreRelease && primary.isPreRelease) ||
+ (version.isPreRelease == primary.isPreRelease && version > primary)) {
+ primary = version;
+ }
+ }
+ return primary;
+ }
+
+ /// Splits a string of dot-delimited identifiers into their component parts.
+ ///
+ /// Identifiers that are numeric are converted to numbers.
+ static List<Object> _splitParts(String text) => text
+ .split('.')
+ .map((part) =>
+ // Return an integer part if possible, otherwise return the string
+ // as-is
+ int.tryParse(part) ?? part)
+ .toList();
+
+ @override
+ bool operator ==(Object other) =>
+ other is Version &&
+ major == other.major &&
+ minor == other.minor &&
+ patch == other.patch &&
+ _equality.equals(preRelease, other.preRelease) &&
+ _equality.equals(build, other.build);
+
+ @override
+ int get hashCode =>
+ major ^
+ minor ^
+ patch ^
+ _equality.hash(preRelease) ^
+ _equality.hash(build);
+
+ bool operator <(Version other) => compareTo(other) < 0;
+ bool operator >(Version other) => compareTo(other) > 0;
+ bool operator <=(Version other) => compareTo(other) <= 0;
+ bool operator >=(Version other) => compareTo(other) >= 0;
+
+ @override
+ bool get isAny => false;
+ @override
+ bool get isEmpty => false;
+
+ /// Whether or not this is a pre-release version.
+ bool get isPreRelease => preRelease.isNotEmpty;
+
+ /// Gets the next major version number that follows this one.
+ ///
+ /// If this version is a pre-release of a major version release (i.e. the
+ /// minor and patch versions are zero), then it just strips the pre-release
+ /// suffix. Otherwise, it increments the major version and resets the minor
+ /// and patch.
+ Version get nextMajor {
+ if (isPreRelease && minor == 0 && patch == 0) {
+ return Version(major, minor, patch);
+ }
+
+ return _incrementMajor();
+ }
+
+ /// Gets the next minor version number that follows this one.
+ ///
+ /// If this version is a pre-release of a minor version release (i.e. the
+ /// patch version is zero), then it just strips the pre-release suffix.
+ /// Otherwise, it increments the minor version and resets the patch.
+ Version get nextMinor {
+ if (isPreRelease && patch == 0) {
+ return Version(major, minor, patch);
+ }
+
+ return _incrementMinor();
+ }
+
+ /// Gets the next patch version number that follows this one.
+ ///
+ /// If this version is a pre-release, then it just strips the pre-release
+ /// suffix. Otherwise, it increments the patch version.
+ Version get nextPatch {
+ if (isPreRelease) {
+ return Version(major, minor, patch);
+ }
+
+ return _incrementPatch();
+ }
+
+ /// Gets the next breaking version number that follows this one.
+ ///
+ /// Increments [major] if it's greater than zero, otherwise [minor], resets
+ /// subsequent digits to zero, and strips any [preRelease] or [build]
+ /// suffix.
+ Version get nextBreaking {
+ if (major == 0) {
+ return _incrementMinor();
+ }
+
+ return _incrementMajor();
+ }
+
+ /// Returns the first possible pre-release of this version.
+ Version get firstPreRelease => Version(major, minor, patch, pre: '0');
+
+ /// Returns whether this is the first possible pre-release of its version.
+ bool get isFirstPreRelease => preRelease.length == 1 && preRelease.first == 0;
+
+ Version _incrementMajor() => Version(major + 1, 0, 0);
+ Version _incrementMinor() => Version(major, minor + 1, 0);
+ Version _incrementPatch() => Version(major, minor, patch + 1);
+
+ /// Tests if [other] matches this version exactly.
+ @override
+ bool allows(Version other) => this == other;
+
+ @override
+ bool allowsAll(VersionConstraint other) => other.isEmpty || other == this;
+
+ @override
+ bool allowsAny(VersionConstraint other) => other.allows(this);
+
+ @override
+ VersionConstraint intersect(VersionConstraint other) =>
+ other.allows(this) ? this : VersionConstraint.empty;
+
+ @override
+ VersionConstraint union(VersionConstraint other) {
+ if (other.allows(this)) return other;
+
+ if (other is VersionRange) {
+ if (other.min == this) {
+ return VersionRange(
+ min: other.min,
+ max: other.max,
+ includeMin: true,
+ includeMax: other.includeMax,
+ alwaysIncludeMaxPreRelease: true);
+ }
+
+ if (other.max == this) {
+ return VersionRange(
+ min: other.min,
+ max: other.max,
+ includeMin: other.includeMin,
+ includeMax: true,
+ alwaysIncludeMaxPreRelease: true);
+ }
+ }
+
+ return VersionConstraint.unionOf([this, other]);
+ }
+
+ @override
+ VersionConstraint difference(VersionConstraint other) =>
+ other.allows(this) ? VersionConstraint.empty : this;
+
+ @override
+ int compareTo(VersionRange other) {
+ if (other is Version) {
+ if (major != other.major) return major.compareTo(other.major);
+ if (minor != other.minor) return minor.compareTo(other.minor);
+ if (patch != other.patch) return patch.compareTo(other.patch);
+
+ // Pre-releases always come before no pre-release string.
+ if (!isPreRelease && other.isPreRelease) return 1;
+ if (!other.isPreRelease && isPreRelease) return -1;
+
+ var comparison = _compareLists(preRelease, other.preRelease);
+ if (comparison != 0) return comparison;
+
+ // Builds always come after no build string.
+ if (build.isEmpty && other.build.isNotEmpty) return -1;
+ if (other.build.isEmpty && build.isNotEmpty) return 1;
+ return _compareLists(build, other.build);
+ } else {
+ return -other.compareTo(this);
+ }
+ }
+
+ /// Get non-canonical string representation of this [Version].
+ ///
+ /// If created with [Version.parse], the string from which the version was
+ /// parsed is returned. Unlike the [canonicalizedVersion] this preserves
+ /// artifacts such as leading zeros.
+ @override
+ String toString() => _text;
+
+ /// Get a canonicalized string representation of this [Version].
+ ///
+ /// Unlike [Version.toString()] this always returns a canonical string
+ /// representation of this [Version].
+ ///
+ /// **Example**
+ /// ```dart
+ /// final v = Version.parse('01.02.03-01.dev+pre.02');
+ ///
+ /// assert(v.toString() == '01.02.03-01.dev+pre.02');
+ /// assert(v.canonicalizedVersion == '1.2.3-1.dev+pre.2');
+ /// assert(Version.parse(v.canonicalizedVersion) == v);
+ /// ```
+ String get canonicalizedVersion => Version(
+ major,
+ minor,
+ patch,
+ pre: preRelease.isNotEmpty ? preRelease.join('.') : null,
+ build: build.isNotEmpty ? build.join('.') : null,
+ ).toString();
+
+ /// Compares a dot-separated component of two versions.
+ ///
+ /// This is used for the pre-release and build version parts. This follows
+ /// Rule 12 of the Semantic Versioning spec (v2.0.0-rc.1).
+ int _compareLists(List<Object> a, List<Object> b) {
+ for (var i = 0; i < math.max(a.length, b.length); i++) {
+ var aPart = (i < a.length) ? a[i] : null;
+ var bPart = (i < b.length) ? b[i] : null;
+
+ if (aPart == bPart) continue;
+
+ // Missing parts come before present ones.
+ if (aPart == null) return -1;
+ if (bPart == null) return 1;
+
+ if (aPart is num) {
+ if (bPart is num) {
+ // Compare two numbers.
+ return aPart.compareTo(bPart);
+ } else {
+ // Numbers come before strings.
+ return -1;
+ }
+ } else {
+ if (bPart is num) {
+ // Strings come after numbers.
+ return 1;
+ } else {
+ // Compare two strings.
+ return (aPart as String).compareTo(bPart as String);
+ }
+ }
+ }
+
+ // The lists are entirely equal.
+ return 0;
+ }
+}
diff --git a/pkgs/pub_semver/lib/src/version_constraint.dart b/pkgs/pub_semver/lib/src/version_constraint.dart
new file mode 100644
index 0000000..948118e
--- /dev/null
+++ b/pkgs/pub_semver/lib/src/version_constraint.dart
@@ -0,0 +1,287 @@
+// Copyright (c) 2014, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'patterns.dart';
+import 'utils.dart';
+import 'version.dart';
+import 'version_range.dart';
+import 'version_union.dart';
+
+/// A [VersionConstraint] is a predicate that can determine whether a given
+/// version is valid or not.
+///
+/// For example, a ">= 2.0.0" constraint allows any version that is "2.0.0" or
+/// greater. Version objects themselves implement this to match a specific
+/// version.
+abstract class VersionConstraint {
+ /// A [VersionConstraint] that allows all versions.
+ static VersionConstraint any = VersionRange();
+
+ /// A [VersionConstraint] that allows no versions -- the empty set.
+ static VersionConstraint empty = const _EmptyVersion();
+
+ /// Parses a version constraint.
+ ///
+ /// This string is one of:
+ ///
+ /// * "any". [any] version.
+ /// * "^" followed by a version string. Versions compatible with
+ /// ([VersionConstraint.compatibleWith]) the version.
+ /// * a series of version parts. Each part can be one of:
+ /// * A version string like `1.2.3`. In other words, anything that can be
+ /// parsed by [Version.parse()].
+ /// * A comparison operator (`<`, `>`, `<=`, or `>=`) followed by a
+ /// version string.
+ ///
+ /// Whitespace is ignored.
+ ///
+ /// Examples:
+ ///
+ /// any
+ /// ^0.7.2
+ /// ^1.0.0-alpha
+ /// 1.2.3-alpha
+ /// <=5.1.4
+ /// >2.0.4 <= 2.4.6
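+  ///
+  /// A short usage sketch:
+  ///
+  /// ```dart
+  /// final constraint = VersionConstraint.parse('>=1.2.3 <2.0.0');
+  /// constraint.allows(Version.parse('1.5.0')); // true
+  /// // Pre-releases of an excluded maximum are themselves excluded:
+  /// constraint.allows(Version.parse('2.0.0-beta')); // false
+  /// ```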
+ factory VersionConstraint.parse(String text) {
+ var originalText = text;
+
+ void skipWhitespace() {
+ text = text.trim();
+ }
+
+ skipWhitespace();
+
+ // Handle the "any" constraint.
+ if (text == 'any') return any;
+
+ // Try to parse and consume a version number.
+ Version? matchVersion() {
+ var version = startVersion.firstMatch(text);
+ if (version == null) return null;
+
+ text = text.substring(version.end);
+ return Version.parse(version[0]!);
+ }
+
+ // Try to parse and consume a comparison operator followed by a version.
+ VersionRange? matchComparison() {
+ var comparison = startComparison.firstMatch(text);
+ if (comparison == null) return null;
+
+ var op = comparison[0]!;
+ text = text.substring(comparison.end);
+ skipWhitespace();
+
+ var version = matchVersion();
+ if (version == null) {
+ throw FormatException('Expected version number after "$op" in '
+ '"$originalText", got "$text".');
+ }
+
+ return switch (op) {
+ '<=' => VersionRange(max: version, includeMax: true),
+ '<' => VersionRange(max: version, alwaysIncludeMaxPreRelease: true),
+ '>=' => VersionRange(min: version, includeMin: true),
+ '>' => VersionRange(min: version),
+ _ => throw UnsupportedError(op),
+ };
+ }
+
+ // Try to parse the "^" operator followed by a version.
+ VersionConstraint? matchCompatibleWith() {
+ if (!text.startsWith(compatibleWithChar)) return null;
+
+ text = text.substring(compatibleWithChar.length);
+ skipWhitespace();
+
+ var version = matchVersion();
+ if (version == null) {
+ throw FormatException('Expected version number after '
+ '"$compatibleWithChar" in "$originalText", got "$text".');
+ }
+
+ if (text.isNotEmpty) {
+ throw FormatException('Cannot include other constraints with '
+ '"$compatibleWithChar" constraint in "$originalText".');
+ }
+
+ return VersionConstraint.compatibleWith(version);
+ }
+
+ var compatibleWith = matchCompatibleWith();
+ if (compatibleWith != null) return compatibleWith;
+
+ Version? min;
+ var includeMin = false;
+ Version? max;
+ var includeMax = false;
+
+ for (;;) {
+ skipWhitespace();
+
+ if (text.isEmpty) break;
+
+ var newRange = matchVersion() ?? matchComparison();
+ if (newRange == null) {
+ throw FormatException('Could not parse version "$originalText". '
+ 'Unknown text at "$text".');
+ }
+
+ if (newRange.min != null) {
+ if (min == null || newRange.min! > min) {
+ min = newRange.min;
+ includeMin = newRange.includeMin;
+ } else if (newRange.min == min && !newRange.includeMin) {
+ includeMin = false;
+ }
+ }
+
+ if (newRange.max != null) {
+ if (max == null || newRange.max! < max) {
+ max = newRange.max;
+ includeMax = newRange.includeMax;
+ } else if (newRange.max == max && !newRange.includeMax) {
+ includeMax = false;
+ }
+ }
+ }
+
+ if (min == null && max == null) {
+ throw const FormatException('Cannot parse an empty string.');
+ }
+
+ if (min != null && max != null) {
+ if (min > max) return VersionConstraint.empty;
+ if (min == max) {
+ if (includeMin && includeMax) return min;
+ return VersionConstraint.empty;
+ }
+ }
+
+ return VersionRange(
+ min: min, includeMin: includeMin, max: max, includeMax: includeMax);
+ }
+
+ /// Creates a version constraint which allows all versions that are
+ /// backward compatible with [version].
+ ///
+ /// Versions are considered backward compatible with [version] if they
+ /// are greater than or equal to [version], but less than the next breaking
+ /// version ([Version.nextBreaking]) of [version].
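+  ///
+  /// For example, `VersionConstraint.compatibleWith(Version.parse('0.1.2'))`
+  /// allows `0.1.9` but not `0.2.0`.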
+ factory VersionConstraint.compatibleWith(Version version) =>
+ CompatibleWithVersionRange(version);
+
+ /// Creates a new version constraint that is the intersection of
+ /// [constraints].
+ ///
+ /// It only allows versions that all of those constraints allow. If
+ /// constraints is empty, then it returns a VersionConstraint that allows
+ /// all versions.
+ factory VersionConstraint.intersection(
+ Iterable<VersionConstraint> constraints) {
+ var constraint = VersionRange();
+ for (var other in constraints) {
+ constraint = constraint.intersect(other) as VersionRange;
+ }
+ return constraint;
+ }
+
+ /// Creates a new version constraint that is the union of [constraints].
+ ///
+ /// It allows any versions that any of those constraints allows. If
+ /// [constraints] is empty, this returns a constraint that allows no versions.
+ factory VersionConstraint.unionOf(Iterable<VersionConstraint> constraints) {
+ var flattened = constraints.expand((constraint) {
+ if (constraint.isEmpty) return <VersionRange>[];
+ if (constraint is VersionUnion) return constraint.ranges;
+ if (constraint is VersionRange) return [constraint];
+ throw ArgumentError('Unknown VersionConstraint type $constraint.');
+ }).toList();
+
+ if (flattened.isEmpty) return VersionConstraint.empty;
+
+ if (flattened.any((constraint) => constraint.isAny)) {
+ return VersionConstraint.any;
+ }
+
+ flattened.sort();
+
+ var merged = <VersionRange>[];
+ for (var constraint in flattened) {
+ // Merge this constraint with the previous one, but only if they touch.
+ if (merged.isEmpty ||
+ (!merged.last.allowsAny(constraint) &&
+ !areAdjacent(merged.last, constraint))) {
+ merged.add(constraint);
+ } else {
+ merged[merged.length - 1] =
+ merged.last.union(constraint) as VersionRange;
+ }
+ }
+
+ if (merged.length == 1) return merged.single;
+ return VersionUnion.fromRanges(merged);
+ }
+
+ /// Returns `true` if this constraint allows no versions.
+ bool get isEmpty;
+
+ /// Returns `true` if this constraint allows all versions.
+ bool get isAny;
+
+ /// Returns `true` if this constraint allows [version].
+ bool allows(Version version);
+
+ /// Returns `true` if this constraint allows all the versions that [other]
+ /// allows.
+ bool allowsAll(VersionConstraint other);
+
+ /// Returns `true` if this constraint allows any of the versions that [other]
+ /// allows.
+ bool allowsAny(VersionConstraint other);
+
+ /// Returns a [VersionConstraint] that only allows [Version]s allowed by both
+ /// this and [other].
+ VersionConstraint intersect(VersionConstraint other);
+
+ /// Returns a [VersionConstraint] that allows [Version]s allowed by either
+ /// this or [other].
+ VersionConstraint union(VersionConstraint other);
+
+ /// Returns a [VersionConstraint] that allows [Version]s allowed by this but
+ /// not [other].
+ VersionConstraint difference(VersionConstraint other);
+}
+
+class _EmptyVersion implements VersionConstraint {
+ const _EmptyVersion();
+
+ @override
+ bool get isEmpty => true;
+
+ @override
+ bool get isAny => false;
+
+ @override
+ bool allows(Version other) => false;
+
+ @override
+ bool allowsAll(VersionConstraint other) => other.isEmpty;
+
+ @override
+ bool allowsAny(VersionConstraint other) => false;
+
+ @override
+ VersionConstraint intersect(VersionConstraint other) => this;
+
+ @override
+ VersionConstraint union(VersionConstraint other) => other;
+
+ @override
+ VersionConstraint difference(VersionConstraint other) => this;
+
+ @override
+ String toString() => '<empty>';
+}
diff --git a/pkgs/pub_semver/lib/src/version_range.dart b/pkgs/pub_semver/lib/src/version_range.dart
new file mode 100644
index 0000000..6f2ed54
--- /dev/null
+++ b/pkgs/pub_semver/lib/src/version_range.dart
@@ -0,0 +1,476 @@
+// Copyright (c) 2014, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'utils.dart';
+import 'version.dart';
+import 'version_constraint.dart';
+import 'version_union.dart';
+
+/// Constrains versions to fall within a given range.
+///
+/// If there is a minimum, then this only allows versions that are at that
+/// minimum or greater. If there is a maximum, then only versions less than
+/// that are allowed. In other words, this allows `>= min, < max`.
+///
+/// Version ranges are ordered first by their lower bounds, then by their upper
+/// bounds. For example, `>=1.0.0 <2.0.0` is before `>=1.5.0 <2.0.0` is before
+/// `>=1.5.0 <3.0.0`.
+class VersionRange implements Comparable<VersionRange>, VersionConstraint {
+ /// The minimum end of the range.
+ ///
+ /// If [includeMin] is `true`, this will be the minimum allowed version.
+ /// Otherwise, it will be the highest version below the range that is not
+ /// allowed.
+ ///
+ /// This may be `null` in which case the range has no minimum end and allows
+ /// any version less than the maximum.
+ final Version? min;
+
+ /// The maximum end of the range.
+ ///
+ /// If [includeMax] is `true`, this will be the maximum allowed version.
+ /// Otherwise, it will be the lowest version above the range that is not
+ /// allowed.
+ ///
+ /// This may be `null` in which case the range has no maximum end and allows
+ /// any version greater than the minimum.
+ final Version? max;
+
+ /// If `true` then [min] is allowed by the range.
+ final bool includeMin;
+
+ /// If `true`, then [max] is allowed by the range.
+ final bool includeMax;
+
+ /// Creates a new version range from [min] to [max], either inclusive or
+ /// exclusive.
+ ///
+  /// It is an error if [min] is greater than [max].
+ ///
+ /// Either [max] or [min] may be omitted to not clamp the range at that end.
+ /// If both are omitted, the range allows all versions.
+ ///
+ /// If [includeMin] is `true`, then the minimum end of the range is inclusive.
+ /// Likewise, passing [includeMax] as `true` makes the upper end inclusive.
+ ///
+ /// If [alwaysIncludeMaxPreRelease] is `true`, this will always include
+ /// pre-release versions of an exclusive [max]. Otherwise, it will use the
+ /// default behavior for pre-release versions of [max].
+ factory VersionRange(
+ {Version? min,
+ Version? max,
+ bool includeMin = false,
+ bool includeMax = false,
+ bool alwaysIncludeMaxPreRelease = false}) {
+ if (min != null && max != null && min > max) {
+ throw ArgumentError(
+ 'Minimum version ("$min") must be less than maximum ("$max").');
+ }
+
+ if (!alwaysIncludeMaxPreRelease &&
+ !includeMax &&
+ max != null &&
+ !max.isPreRelease &&
+ max.build.isEmpty &&
+ (min == null ||
+ !min.isPreRelease ||
+ !equalsWithoutPreRelease(min, max))) {
+ max = max.firstPreRelease;
+ }
+
+ return VersionRange._(min, max, includeMin, includeMax);
+ }
+
+ VersionRange._(this.min, this.max, this.includeMin, this.includeMax);
+
+ @override
+ bool operator ==(Object other) {
+ if (other is! VersionRange) return false;
+
+ return min == other.min &&
+ max == other.max &&
+ includeMin == other.includeMin &&
+ includeMax == other.includeMax;
+ }
+
+ @override
+ int get hashCode =>
+ min.hashCode ^
+ (max.hashCode * 3) ^
+ (includeMin.hashCode * 5) ^
+ (includeMax.hashCode * 7);
+
+ @override
+ bool get isEmpty => false;
+
+ @override
+ bool get isAny => min == null && max == null;
+
+ /// Tests if [other] falls within this version range.
+ @override
+ bool allows(Version other) {
+ if (min != null) {
+ if (other < min!) return false;
+ if (!includeMin && other == min) return false;
+ }
+
+ if (max != null) {
+ if (other > max!) return false;
+ if (!includeMax && other == max) return false;
+ }
+
+ return true;
+ }
+
+ @override
+ bool allowsAll(VersionConstraint other) {
+ if (other.isEmpty) return true;
+ if (other is Version) return allows(other);
+
+ if (other is VersionUnion) {
+ return other.ranges.every(allowsAll);
+ }
+
+ if (other is VersionRange) {
+ return !allowsLower(other, this) && !allowsHigher(other, this);
+ }
+
+ throw ArgumentError('Unknown VersionConstraint type $other.');
+ }
+
+ @override
+ bool allowsAny(VersionConstraint other) {
+ if (other.isEmpty) return false;
+ if (other is Version) return allows(other);
+
+ if (other is VersionUnion) {
+ return other.ranges.any(allowsAny);
+ }
+
+ if (other is VersionRange) {
+ return !strictlyLower(other, this) && !strictlyHigher(other, this);
+ }
+
+ throw ArgumentError('Unknown VersionConstraint type $other.');
+ }
+
+ @override
+ VersionConstraint intersect(VersionConstraint other) {
+ if (other.isEmpty) return other;
+ if (other is VersionUnion) return other.intersect(this);
+
+    // Intersecting with a Version just yields the version if it's in the range.
+ if (other is Version) {
+ return allows(other) ? other : VersionConstraint.empty;
+ }
+
+ if (other is VersionRange) {
+ // Intersect the two ranges.
+ Version? intersectMin;
+ bool intersectIncludeMin;
+ if (allowsLower(this, other)) {
+ if (strictlyLower(this, other)) return VersionConstraint.empty;
+ intersectMin = other.min;
+ intersectIncludeMin = other.includeMin;
+ } else {
+ if (strictlyLower(other, this)) return VersionConstraint.empty;
+ intersectMin = min;
+ intersectIncludeMin = includeMin;
+ }
+
+ Version? intersectMax;
+ bool intersectIncludeMax;
+ if (allowsHigher(this, other)) {
+ intersectMax = other.max;
+ intersectIncludeMax = other.includeMax;
+ } else {
+ intersectMax = max;
+ intersectIncludeMax = includeMax;
+ }
+
+ if (intersectMin == null && intersectMax == null) {
+ // Open range.
+ return VersionRange();
+ }
+
+ // If the range is just a single version.
+ if (intersectMin == intersectMax) {
+ // Because we already verified that the lower range isn't strictly
+ // lower, there must be some overlap.
+ assert(intersectIncludeMin && intersectIncludeMax);
+ return intersectMin!;
+ }
+
+ // If we got here, there is an actual range.
+ return VersionRange(
+ min: intersectMin,
+ max: intersectMax,
+ includeMin: intersectIncludeMin,
+ includeMax: intersectIncludeMax,
+ alwaysIncludeMaxPreRelease: true);
+ }
+
+ throw ArgumentError('Unknown VersionConstraint type $other.');
+ }
+
+ @override
+ VersionConstraint union(VersionConstraint other) {
+ if (other is Version) {
+ if (allows(other)) return this;
+
+ if (other == min) {
+ return VersionRange(
+ min: min,
+ max: max,
+ includeMin: true,
+ includeMax: includeMax,
+ alwaysIncludeMaxPreRelease: true);
+ }
+
+ if (other == max) {
+ return VersionRange(
+ min: min,
+ max: max,
+ includeMin: includeMin,
+ includeMax: true,
+ alwaysIncludeMaxPreRelease: true);
+ }
+
+ return VersionConstraint.unionOf([this, other]);
+ }
+
+ if (other is VersionRange) {
+ // If the two ranges don't overlap, we won't be able to create a single
+ // VersionRange for both of them.
+ var edgesTouch = (max != null &&
+ max == other.min &&
+ (includeMax || other.includeMin)) ||
+ (min != null && min == other.max && (includeMin || other.includeMax));
+ if (!edgesTouch && !allowsAny(other)) {
+ return VersionConstraint.unionOf([this, other]);
+ }
+
+ Version? unionMin;
+ bool unionIncludeMin;
+ if (allowsLower(this, other)) {
+ unionMin = min;
+ unionIncludeMin = includeMin;
+ } else {
+ unionMin = other.min;
+ unionIncludeMin = other.includeMin;
+ }
+
+ Version? unionMax;
+ bool unionIncludeMax;
+ if (allowsHigher(this, other)) {
+ unionMax = max;
+ unionIncludeMax = includeMax;
+ } else {
+ unionMax = other.max;
+ unionIncludeMax = other.includeMax;
+ }
+
+ return VersionRange(
+ min: unionMin,
+ max: unionMax,
+ includeMin: unionIncludeMin,
+ includeMax: unionIncludeMax,
+ alwaysIncludeMaxPreRelease: true);
+ }
+
+ return VersionConstraint.unionOf([this, other]);
+ }
+
+ @override
+ VersionConstraint difference(VersionConstraint other) {
+ if (other.isEmpty) return this;
+
+ if (other is Version) {
+ if (!allows(other)) return this;
+
+ if (other == min) {
+ if (!includeMin) return this;
+ return VersionRange(
+ min: min,
+ max: max,
+ includeMax: includeMax,
+ alwaysIncludeMaxPreRelease: true);
+ }
+
+ if (other == max) {
+ if (!includeMax) return this;
+ return VersionRange(
+ min: min,
+ max: max,
+ includeMin: includeMin,
+ alwaysIncludeMaxPreRelease: true);
+ }
+
+ return VersionUnion.fromRanges([
+ VersionRange(
+ min: min,
+ max: other,
+ includeMin: includeMin,
+ alwaysIncludeMaxPreRelease: true),
+ VersionRange(
+ min: other,
+ max: max,
+ includeMax: includeMax,
+ alwaysIncludeMaxPreRelease: true)
+ ]);
+ } else if (other is VersionRange) {
+ if (!allowsAny(other)) return this;
+
+ VersionRange? before;
+ if (!allowsLower(this, other)) {
+ before = null;
+ } else if (min == other.min) {
+ assert(includeMin && !other.includeMin);
+ assert(min != null);
+ before = min;
+ } else {
+ before = VersionRange(
+ min: min,
+ max: other.min,
+ includeMin: includeMin,
+ includeMax: !other.includeMin,
+ alwaysIncludeMaxPreRelease: true);
+ }
+
+ VersionRange? after;
+ if (!allowsHigher(this, other)) {
+ after = null;
+ } else if (max == other.max) {
+ assert(includeMax && !other.includeMax);
+ assert(max != null);
+ after = max;
+ } else {
+ after = VersionRange(
+ min: other.max,
+ max: max,
+ includeMin: !other.includeMax,
+ includeMax: includeMax,
+ alwaysIncludeMaxPreRelease: true);
+ }
+
+ if (before == null && after == null) return VersionConstraint.empty;
+ if (before == null) return after!;
+ if (after == null) return before;
+ return VersionUnion.fromRanges([before, after]);
+ } else if (other is VersionUnion) {
+ var ranges = <VersionRange>[];
+ var current = this;
+
+ for (var range in other.ranges) {
+ // Skip any ranges that are strictly lower than [current].
+ if (strictlyLower(range, current)) continue;
+
+ // If we reach a range strictly higher than [current], no more ranges
+ // will be relevant so we can bail early.
+ if (strictlyHigher(range, current)) break;
+
+ var difference = current.difference(range);
+ if (difference.isEmpty) {
+ return VersionConstraint.empty;
+ } else if (difference is VersionUnion) {
+ // If [range] split [current] in half, we only need to continue
+ // checking future ranges against the latter half.
+ assert(difference.ranges.length == 2);
+ ranges.add(difference.ranges.first);
+ current = difference.ranges.last;
+ } else {
+ current = difference as VersionRange;
+ }
+ }
+
+ if (ranges.isEmpty) return current;
+ return VersionUnion.fromRanges(ranges..add(current));
+ }
+
+ throw ArgumentError('Unknown VersionConstraint type $other.');
+ }
+
+ @override
+ int compareTo(VersionRange other) {
+ if (min == null) {
+ if (other.min == null) return _compareMax(other);
+ return -1;
+ } else if (other.min == null) {
+ return 1;
+ }
+
+ var result = min!.compareTo(other.min!);
+ if (result != 0) return result;
+ if (includeMin != other.includeMin) return includeMin ? -1 : 1;
+
+ return _compareMax(other);
+ }
+
+ /// Compares the maximum values of `this` and [other].
+ int _compareMax(VersionRange other) {
+ if (max == null) {
+ if (other.max == null) return 0;
+ return 1;
+ } else if (other.max == null) {
+ return -1;
+ }
+
+ var result = max!.compareTo(other.max!);
+ if (result != 0) return result;
+ if (includeMax != other.includeMax) return includeMax ? 1 : -1;
+ return 0;
+ }
+
+ @override
+ String toString() {
+ var buffer = StringBuffer();
+
+ final min = this.min;
+ if (min != null) {
+ buffer
+ ..write(includeMin ? '>=' : '>')
+ ..write(min);
+ }
+
+ final max = this.max;
+
+ if (max != null) {
+ if (min != null) buffer.write(' ');
+ if (includeMax) {
+ buffer
+ ..write('<=')
+ ..write(max);
+ } else {
+ buffer.write('<');
+ if (max.isFirstPreRelease) {
+ // Since `"<$max"` would parse the same as `"<$max-0"`, we just emit
+ // `<$max` to avoid confusing "-0" suffixes.
+ buffer.write('${max.major}.${max.minor}.${max.patch}');
+ } else {
+ buffer.write(max);
+
+          // If `">=$min <$max"` would parse as `">=$min <$max-0"`, add `-∞` to
+          // indicate that it actually does allow pre-release versions.
+ var minIsPreReleaseOfMax = min != null &&
+ min.isPreRelease &&
+ equalsWithoutPreRelease(min, max);
+ if (!max.isPreRelease && max.build.isEmpty && !minIsPreReleaseOfMax) {
+ buffer.write('-∞');
+ }
+ }
+ }
+ }
+
+ if (min == null && max == null) buffer.write('any');
+ return buffer.toString();
+ }
+}
+
+class CompatibleWithVersionRange extends VersionRange {
+ CompatibleWithVersionRange(Version version)
+ : super._(version, version.nextBreaking.firstPreRelease, true, false);
+
+ @override
+ String toString() => '^$min';
+}
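As an aside (not part of the diff above), the pre-release handling in the `VersionRange` factory is easiest to see from the caller's side. A minimal sketch, assuming `package:pub_semver` is available; the version literals are chosen purely for illustration:

```dart
import 'package:pub_semver/pub_semver.dart';

void main() {
  // An exclusive max that is not itself a pre-release gets bumped down to its
  // first pre-release (2.0.0-0), so pre-releases of the max are excluded.
  final range = VersionRange(
      min: Version.parse('1.0.0'),
      max: Version.parse('2.0.0'),
      includeMin: true);
  print(range);                                     // >=1.0.0 <2.0.0
  print(range.allows(Version.parse('1.9.9')));      // true
  print(range.allows(Version.parse('2.0.0-dev')));  // false

  // The caret syntax is parsed into a CompatibleWithVersionRange.
  final caret = VersionConstraint.parse('^1.2.3');
  print(caret);                                     // ^1.2.3
  print(caret.allows(Version.parse('1.9.0')));      // true
  print(caret.allows(Version.parse('2.0.0')));      // false
}
```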
diff --git a/pkgs/pub_semver/lib/src/version_union.dart b/pkgs/pub_semver/lib/src/version_union.dart
new file mode 100644
index 0000000..844d3b8
--- /dev/null
+++ b/pkgs/pub_semver/lib/src/version_union.dart
@@ -0,0 +1,224 @@
+// Copyright (c) 2015, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:collection/collection.dart';
+
+import 'utils.dart';
+import 'version.dart';
+import 'version_constraint.dart';
+import 'version_range.dart';
+
+/// A version constraint representing a union of multiple disjoint version
+/// ranges.
+///
+/// An instance of this will only be created if the constraint can't be
+/// represented as a non-compound value.
+class VersionUnion implements VersionConstraint {
+ /// The constraints that compose this union.
+ ///
+ /// This list has two invariants:
+ ///
+ /// * Its contents are sorted using the standard ordering of [VersionRange]s.
+ /// * Its contents are disjoint and non-adjacent. In other words, for any two
+ /// constraints next to each other in the list, there's some version between
+ /// those constraints that they don't match.
+ final List<VersionRange> ranges;
+
+ @override
+ bool get isEmpty => false;
+
+ @override
+ bool get isAny => false;
+
+ /// Creates a union from a list of ranges with no pre-processing.
+ ///
+ /// It's up to the caller to ensure that the invariants described in [ranges]
+ /// are maintained. They are not verified by this constructor. To
+ /// automatically ensure that they're maintained, use
+ /// [VersionConstraint.unionOf] instead.
+ VersionUnion.fromRanges(this.ranges);
+
+ @override
+ bool allows(Version version) =>
+ ranges.any((constraint) => constraint.allows(version));
+
+ @override
+ bool allowsAll(VersionConstraint other) {
+ var ourRanges = ranges.iterator;
+ var theirRanges = _rangesFor(other).iterator;
+
+ // Because both lists of ranges are ordered by minimum version, we can
+ // safely move through them linearly here.
+ var ourRangesMoved = ourRanges.moveNext();
+ var theirRangesMoved = theirRanges.moveNext();
+ while (ourRangesMoved && theirRangesMoved) {
+ if (ourRanges.current.allowsAll(theirRanges.current)) {
+ theirRangesMoved = theirRanges.moveNext();
+ } else {
+ ourRangesMoved = ourRanges.moveNext();
+ }
+ }
+
+ // If our ranges have allowed all of their ranges, we'll have consumed all
+ // of them.
+ return !theirRangesMoved;
+ }
+
+ @override
+ bool allowsAny(VersionConstraint other) {
+ var ourRanges = ranges.iterator;
+ var theirRanges = _rangesFor(other).iterator;
+
+ // Because both lists of ranges are ordered by minimum version, we can
+ // safely move through them linearly here.
+ var ourRangesMoved = ourRanges.moveNext();
+ var theirRangesMoved = theirRanges.moveNext();
+ while (ourRangesMoved && theirRangesMoved) {
+ if (ourRanges.current.allowsAny(theirRanges.current)) {
+ return true;
+ }
+
+ // Move the constraint with the lower max value forward. This ensures that
+ // we keep both lists in sync as much as possible.
+ if (allowsHigher(theirRanges.current, ourRanges.current)) {
+ ourRangesMoved = ourRanges.moveNext();
+ } else {
+ theirRangesMoved = theirRanges.moveNext();
+ }
+ }
+
+ return false;
+ }
+
+ @override
+ VersionConstraint intersect(VersionConstraint other) {
+ var ourRanges = ranges.iterator;
+ var theirRanges = _rangesFor(other).iterator;
+
+ // Because both lists of ranges are ordered by minimum version, we can
+ // safely move through them linearly here.
+ var newRanges = <VersionRange>[];
+ var ourRangesMoved = ourRanges.moveNext();
+ var theirRangesMoved = theirRanges.moveNext();
+ while (ourRangesMoved && theirRangesMoved) {
+ var intersection = ourRanges.current.intersect(theirRanges.current);
+
+ if (!intersection.isEmpty) newRanges.add(intersection as VersionRange);
+
+ // Move the constraint with the lower max value forward. This ensures that
+ // we keep both lists in sync as much as possible, and that large ranges
+ // have a chance to match multiple small ranges that they contain.
+ if (allowsHigher(theirRanges.current, ourRanges.current)) {
+ ourRangesMoved = ourRanges.moveNext();
+ } else {
+ theirRangesMoved = theirRanges.moveNext();
+ }
+ }
+
+ if (newRanges.isEmpty) return VersionConstraint.empty;
+ if (newRanges.length == 1) return newRanges.single;
+
+ return VersionUnion.fromRanges(newRanges);
+ }
+
+ @override
+ VersionConstraint difference(VersionConstraint other) {
+ var ourRanges = ranges.iterator;
+ var theirRanges = _rangesFor(other).iterator;
+
+ var newRanges = <VersionRange>[];
+ ourRanges.moveNext();
+ theirRanges.moveNext();
+ var current = ourRanges.current;
+
+ bool theirNextRange() {
+ if (theirRanges.moveNext()) return true;
+
+ // If there are no more of their ranges, none of the rest of our ranges
+ // need to be subtracted so we can add them as-is.
+ newRanges.add(current);
+ while (ourRanges.moveNext()) {
+ newRanges.add(ourRanges.current);
+ }
+ return false;
+ }
+
+ bool ourNextRange({bool includeCurrent = true}) {
+ if (includeCurrent) newRanges.add(current);
+ if (!ourRanges.moveNext()) return false;
+ current = ourRanges.current;
+ return true;
+ }
+
+ for (;;) {
+ // If the current ranges are disjoint, move the lowest one forward.
+ if (strictlyLower(theirRanges.current, current)) {
+ if (!theirNextRange()) break;
+ continue;
+ }
+
+ if (strictlyHigher(theirRanges.current, current)) {
+ if (!ourNextRange()) break;
+ continue;
+ }
+
+ // If we're here, we know [theirRanges.current] overlaps [current].
+ var difference = current.difference(theirRanges.current);
+ if (difference is VersionUnion) {
+ // If their range split [current] in half, we only need to continue
+ // checking future ranges against the latter half.
+ assert(difference.ranges.length == 2);
+ newRanges.add(difference.ranges.first);
+ current = difference.ranges.last;
+
+ // Since their range split [current], it definitely doesn't allow higher
+ // versions, so we should move their ranges forward.
+ if (!theirNextRange()) break;
+ } else if (difference.isEmpty) {
+ if (!ourNextRange(includeCurrent: false)) break;
+ } else {
+ current = difference as VersionRange;
+
+ // Move the constraint with the lower max value forward. This ensures
+ // that we keep both lists in sync as much as possible, and that large
+ // ranges have a chance to subtract or be subtracted by multiple small
+ // ranges that they contain.
+ if (allowsHigher(current, theirRanges.current)) {
+ if (!theirNextRange()) break;
+ } else {
+ if (!ourNextRange()) break;
+ }
+ }
+ }
+
+ if (newRanges.isEmpty) return VersionConstraint.empty;
+ if (newRanges.length == 1) return newRanges.single;
+ return VersionUnion.fromRanges(newRanges);
+ }
+
+ /// Returns [constraint] as a list of ranges.
+ ///
+ /// This is used to normalize ranges of various types.
+ List<VersionRange> _rangesFor(VersionConstraint constraint) {
+ if (constraint.isEmpty) return [];
+ if (constraint is VersionUnion) return constraint.ranges;
+ if (constraint is VersionRange) return [constraint];
+ throw ArgumentError('Unknown VersionConstraint type $constraint.');
+ }
+
+ @override
+ VersionConstraint union(VersionConstraint other) =>
+ VersionConstraint.unionOf([this, other]);
+
+ @override
+ bool operator ==(Object other) =>
+ other is VersionUnion &&
+ const ListEquality<VersionRange>().equals(ranges, other.ranges);
+
+ @override
+ int get hashCode => const ListEquality<VersionRange>().hash(ranges);
+
+ @override
+ String toString() => ranges.join(' or ');
+}
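For context, a `VersionUnion` is never constructed directly by callers; it falls out of set operations on disjoint ranges. A small sketch of that behavior, again with illustrative version numbers:

```dart
import 'package:pub_semver/pub_semver.dart';

void main() {
  final low = VersionConstraint.parse('>=1.0.0 <2.0.0');
  final high = VersionConstraint.parse('>=3.0.0 <4.0.0');

  // The ranges are disjoint and non-adjacent, so union() falls back to a
  // VersionUnion whose ranges stay sorted by minimum version.
  final union = low.union(high);
  print(union);                                 // >=1.0.0 <2.0.0 or >=3.0.0 <4.0.0
  print(union.allows(Version.parse('1.5.0'))); // true
  print(union.allows(Version.parse('2.5.0'))); // false

  // Because both range lists are sorted, intersect() can sweep them linearly.
  print(union.intersect(VersionConstraint.parse('>=1.5.0 <3.5.0')));
  // >=1.5.0 <2.0.0 or >=3.0.0 <3.5.0
}
```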
diff --git a/pkgs/pub_semver/pubspec.yaml b/pkgs/pub_semver/pubspec.yaml
new file mode 100644
index 0000000..290fb92
--- /dev/null
+++ b/pkgs/pub_semver/pubspec.yaml
@@ -0,0 +1,20 @@
+name: pub_semver
+version: 2.1.5
+description: >-
+ Versions and version constraints implementing pub's versioning policy. This
+ is very similar to vanilla semver, with a few corner cases.
+repository: https://github.com/dart-lang/tools/tree/main/pkgs/pub_semver
+topics:
+ - dart-pub
+ - semver
+
+environment:
+ sdk: ^3.4.0
+
+dependencies:
+ collection: ^1.15.0
+ meta: ^1.3.0
+
+dev_dependencies:
+ dart_flutter_team_lints: ^3.0.0
+ test: ^1.16.0
diff --git a/pkgs/pub_semver/test/utils.dart b/pkgs/pub_semver/test/utils.dart
new file mode 100644
index 0000000..bd7aa8f
--- /dev/null
+++ b/pkgs/pub_semver/test/utils.dart
@@ -0,0 +1,123 @@
+// Copyright (c) 2014, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:pub_semver/pub_semver.dart';
+import 'package:test/test.dart';
+
+/// Some stock example versions to use in tests.
+final v003 = Version.parse('0.0.3');
+final v010 = Version.parse('0.1.0');
+final v072 = Version.parse('0.7.2');
+final v080 = Version.parse('0.8.0');
+final v114 = Version.parse('1.1.4');
+final v123 = Version.parse('1.2.3');
+final v124 = Version.parse('1.2.4');
+final v130 = Version.parse('1.3.0');
+final v140 = Version.parse('1.4.0');
+final v200 = Version.parse('2.0.0');
+final v201 = Version.parse('2.0.1');
+final v234 = Version.parse('2.3.4');
+final v250 = Version.parse('2.5.0');
+final v300 = Version.parse('3.0.0');
+
+/// A range that allows pre-release versions of its max version.
+final includeMaxPreReleaseRange =
+ VersionRange(max: v200, alwaysIncludeMaxPreRelease: true);
+
+/// A [Matcher] that tests if a [VersionConstraint] allows or does not allow a
+/// given list of [Version]s.
+class _VersionConstraintMatcher implements Matcher {
+ final List<Version> _expected;
+ final bool _allow;
+
+ _VersionConstraintMatcher(this._expected, this._allow);
+
+ @override
+ bool matches(dynamic item, Map<dynamic, dynamic> matchState) =>
+ (item is VersionConstraint) &&
+ _expected.every((version) => item.allows(version) == _allow);
+
+ @override
+ Description describe(Description description) {
+ description.addAll(' ${_allow ? "allows" : "does not allow"} versions ',
+ ', ', '', _expected);
+ return description;
+ }
+
+ @override
+ Description describeMismatch(dynamic item, Description mismatchDescription,
+ Map<dynamic, dynamic> matchState, bool verbose) {
+ if (item is! VersionConstraint) {
+ mismatchDescription.add('was not a VersionConstraint');
+ return mismatchDescription;
+ }
+
+ var first = true;
+ for (var version in _expected) {
+ if (item.allows(version) != _allow) {
+ if (first) {
+ if (_allow) {
+ mismatchDescription.addDescriptionOf(item).add(' did not allow ');
+ } else {
+ mismatchDescription.addDescriptionOf(item).add(' allowed ');
+ }
+ } else {
+ mismatchDescription.add(' and ');
+ }
+ first = false;
+
+ mismatchDescription.add(version.toString());
+ }
+ }
+
+ return mismatchDescription;
+ }
+}
+
+/// Gets a [Matcher] that validates that a [VersionConstraint] allows all
+/// given versions.
+Matcher allows(Version v1,
+ [Version? v2,
+ Version? v3,
+ Version? v4,
+ Version? v5,
+ Version? v6,
+ Version? v7,
+ Version? v8]) {
+ var versions = _makeVersionList(v1, v2, v3, v4, v5, v6, v7, v8);
+ return _VersionConstraintMatcher(versions, true);
+}
+
+/// Gets a [Matcher] that validates that a [VersionConstraint] allows none of
+/// the given versions.
+Matcher doesNotAllow(Version v1,
+ [Version? v2,
+ Version? v3,
+ Version? v4,
+ Version? v5,
+ Version? v6,
+ Version? v7,
+ Version? v8]) {
+ var versions = _makeVersionList(v1, v2, v3, v4, v5, v6, v7, v8);
+ return _VersionConstraintMatcher(versions, false);
+}
+
+List<Version> _makeVersionList(Version v1,
+ [Version? v2,
+ Version? v3,
+ Version? v4,
+ Version? v5,
+ Version? v6,
+ Version? v7,
+ Version? v8]) {
+ var versions = [v1];
+ if (v2 != null) versions.add(v2);
+ if (v3 != null) versions.add(v3);
+ if (v4 != null) versions.add(v4);
+ if (v5 != null) versions.add(v5);
+ if (v6 != null) versions.add(v6);
+ if (v7 != null) versions.add(v7);
+ if (v8 != null) versions.add(v8);
+ return versions;
+}
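A short sketch of how the `allows`/`doesNotAllow` matchers above read in practice (the range and version literals are illustrative, not taken from the test files that follow):

```dart
import 'package:pub_semver/pub_semver.dart';
import 'package:test/test.dart';

import 'utils.dart';

void main() {
  test('an exclusive range rejects both of its bounds', () {
    var range = VersionRange(
        min: Version.parse('1.0.0'), max: Version.parse('2.0.0'));

    // Each listed version is checked against the constraint in turn.
    expect(range, allows(Version.parse('1.0.1'), Version.parse('1.9.9')));
    expect(
        range, doesNotAllow(Version.parse('1.0.0'), Version.parse('2.0.0')));
  });
}
```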
diff --git a/pkgs/pub_semver/test/version_constraint_test.dart b/pkgs/pub_semver/test/version_constraint_test.dart
new file mode 100644
index 0000000..4fbcbe0
--- /dev/null
+++ b/pkgs/pub_semver/test/version_constraint_test.dart
@@ -0,0 +1,185 @@
+// Copyright (c) 2014, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:pub_semver/pub_semver.dart';
+import 'package:test/test.dart';
+
+import 'utils.dart';
+
+void main() {
+ test('any', () {
+ expect(VersionConstraint.any.isAny, isTrue);
+ expect(
+ VersionConstraint.any,
+ allows(Version.parse('0.0.0-blah'), Version.parse('1.2.3'),
+ Version.parse('12345.678.90')));
+ });
+
+ test('empty', () {
+ expect(VersionConstraint.empty.isEmpty, isTrue);
+ expect(VersionConstraint.empty.isAny, isFalse);
+ expect(
+ VersionConstraint.empty,
+ doesNotAllow(Version.parse('0.0.0-blah'), Version.parse('1.2.3'),
+ Version.parse('12345.678.90')));
+ });
+
+ group('parse()', () {
+ test('parses an exact version', () {
+ var constraint = VersionConstraint.parse('1.2.3-alpha');
+
+ expect(constraint is Version, isTrue);
+ expect(constraint, equals(Version(1, 2, 3, pre: 'alpha')));
+ });
+
+ test('parses "any"', () {
+ var constraint = VersionConstraint.parse('any');
+
+ expect(
+ constraint,
+ allows(Version.parse('0.0.0'), Version.parse('1.2.3'),
+ Version.parse('12345.678.90')));
+ });
+
+ test('parses a ">" minimum version', () {
+ var constraint = VersionConstraint.parse('>1.2.3');
+
+ expect(constraint,
+ allows(Version.parse('1.2.3+foo'), Version.parse('1.2.4')));
+ expect(
+ constraint,
+ doesNotAllow(Version.parse('1.2.1'), Version.parse('1.2.3-build'),
+ Version.parse('1.2.3')));
+ });
+
+ test('parses a ">=" minimum version', () {
+ var constraint = VersionConstraint.parse('>=1.2.3');
+
+ expect(
+ constraint,
+ allows(Version.parse('1.2.3'), Version.parse('1.2.3+foo'),
+ Version.parse('1.2.4')));
+ expect(constraint,
+ doesNotAllow(Version.parse('1.2.1'), Version.parse('1.2.3-build')));
+ });
+
+ test('parses a "<" maximum version', () {
+ var constraint = VersionConstraint.parse('<1.2.3');
+
+ expect(constraint,
+ allows(Version.parse('1.2.1'), Version.parse('1.2.2+foo')));
+ expect(
+ constraint,
+ doesNotAllow(Version.parse('1.2.3'), Version.parse('1.2.3+foo'),
+ Version.parse('1.2.4')));
+ });
+
+ test('parses a "<=" maximum version', () {
+ var constraint = VersionConstraint.parse('<=1.2.3');
+
+ expect(
+ constraint,
+ allows(Version.parse('1.2.1'), Version.parse('1.2.3-build'),
+ Version.parse('1.2.3')));
+ expect(constraint,
+ doesNotAllow(Version.parse('1.2.3+foo'), Version.parse('1.2.4')));
+ });
+
+ test('parses a series of space-separated constraints', () {
+ var constraint = VersionConstraint.parse('>1.0.0 >=1.2.3 <1.3.0');
+
+ expect(
+ constraint, allows(Version.parse('1.2.3'), Version.parse('1.2.5')));
+ expect(
+ constraint,
+ doesNotAllow(Version.parse('1.2.3-pre'), Version.parse('1.3.0'),
+ Version.parse('3.4.5')));
+ });
+
+ test('parses a pre-release-only constraint', () {
+ var constraint = VersionConstraint.parse('>=1.0.0-dev.2 <1.0.0');
+ expect(constraint,
+ allows(Version.parse('1.0.0-dev.2'), Version.parse('1.0.0-dev.3')));
+ expect(constraint,
+ doesNotAllow(Version.parse('1.0.0-dev.1'), Version.parse('1.0.0')));
+ });
+
+ test('ignores whitespace around comparison operators', () {
+ var constraint = VersionConstraint.parse(' >1.0.0>=1.2.3 < 1.3.0');
+
+ expect(
+ constraint, allows(Version.parse('1.2.3'), Version.parse('1.2.5')));
+ expect(
+ constraint,
+ doesNotAllow(Version.parse('1.2.3-pre'), Version.parse('1.3.0'),
+ Version.parse('3.4.5')));
+ });
+
+ test('does not allow "any" to be mixed with other constraints', () {
+ expect(() => VersionConstraint.parse('any 1.0.0'), throwsFormatException);
+ });
+
+ test('parses a "^" version', () {
+ expect(VersionConstraint.parse('^0.0.3'),
+ equals(VersionConstraint.compatibleWith(v003)));
+
+ expect(VersionConstraint.parse('^0.7.2'),
+ equals(VersionConstraint.compatibleWith(v072)));
+
+ expect(VersionConstraint.parse('^1.2.3'),
+ equals(VersionConstraint.compatibleWith(v123)));
+
+ var min = Version.parse('0.7.2-pre+1');
+ expect(VersionConstraint.parse('^0.7.2-pre+1'),
+ equals(VersionConstraint.compatibleWith(min)));
+ });
+
+ test('does not allow "^" to be mixed with other constraints', () {
+ expect(() => VersionConstraint.parse('>=1.2.3 ^1.0.0'),
+ throwsFormatException);
+ expect(() => VersionConstraint.parse('^1.0.0 <1.2.3'),
+ throwsFormatException);
+ });
+
+ test('ignores whitespace around "^"', () {
+ var constraint = VersionConstraint.parse(' ^ 1.2.3 ');
+
+ expect(constraint, equals(VersionConstraint.compatibleWith(v123)));
+ });
+
+ test('throws FormatException on a bad string', () {
+ var bad = [
+ '', ' ', // Empty string.
+ 'foo', // Bad text.
+ '>foo', // Bad text after operator.
+ '^foo', // Bad text after "^".
+ '1.0.0 foo', '1.0.0foo', // Bad text after version.
+ 'anything', // Bad text after "any".
+ '<>1.0.0', // Multiple operators.
+ '1.0.0<' // Trailing operator.
+ ];
+
+ for (var text in bad) {
+ expect(() => VersionConstraint.parse(text), throwsFormatException);
+ }
+ });
+ });
+
+ group('compatibleWith()', () {
+ test('returns the range of compatible versions', () {
+ var constraint = VersionConstraint.compatibleWith(v072);
+
+ expect(
+ constraint,
+ equals(VersionRange(
+ min: v072, includeMin: true, max: v072.nextBreaking)));
+ });
+
+ test('toString() uses "^"', () {
+ var constraint = VersionConstraint.compatibleWith(v072);
+
+ expect(constraint.toString(), equals('^0.7.2'));
+ });
+ });
+}
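The pre-release boundary rules these tests exercise can be summarized with a couple of `parse()` calls; a sketch with illustrative versions:

```dart
import 'package:pub_semver/pub_semver.dart';

void main() {
  final constraint = VersionConstraint.parse('>=1.2.3 <2.0.0');
  print(constraint.allows(Version.parse('1.2.3')));     // true  (inclusive min)
  print(constraint.allows(Version.parse('1.2.3-dev'))); // false (below the min)
  print(constraint.allows(Version.parse('2.0.0-dev'))); // false (pre-release of the excluded max)

  // Writing a pre-release bound explicitly opts back in to pre-releases.
  final dev = VersionConstraint.parse('>=1.0.0-dev.2 <1.0.0');
  print(dev.allows(Version.parse('1.0.0-dev.3')));      // true
  print(dev.allows(Version.parse('1.0.0')));            // false
}
```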
diff --git a/pkgs/pub_semver/test/version_range_test.dart b/pkgs/pub_semver/test/version_range_test.dart
new file mode 100644
index 0000000..5978df0
--- /dev/null
+++ b/pkgs/pub_semver/test/version_range_test.dart
@@ -0,0 +1,998 @@
+// Copyright (c) 2014, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:pub_semver/pub_semver.dart';
+import 'package:test/test.dart';
+
+import 'utils.dart';
+
+void main() {
+ group('constructor', () {
+ test('takes a min and max', () {
+ var range = VersionRange(min: v123, max: v124);
+ expect(range.isAny, isFalse);
+ expect(range.min, equals(v123));
+ expect(range.max, equals(v124.firstPreRelease));
+ });
+
+ group("doesn't make the max a pre-release if", () {
+ test("it's already a pre-release", () {
+ expect(VersionRange(max: Version.parse('1.2.4-pre')).max,
+ equals(Version.parse('1.2.4-pre')));
+ });
+
+ test('includeMax is true', () {
+ expect(VersionRange(max: v124, includeMax: true).max, equals(v124));
+ });
+
+ test('min is a prerelease of max', () {
+ expect(VersionRange(min: Version.parse('1.2.4-pre'), max: v124).max,
+ equals(v124));
+ });
+
+ test('max has a build identifier', () {
+ expect(VersionRange(max: Version.parse('1.2.4+1')).max,
+ equals(Version.parse('1.2.4+1')));
+ });
+ });
+
+ test('allows omitting max', () {
+ var range = VersionRange(min: v123);
+ expect(range.isAny, isFalse);
+ expect(range.min, equals(v123));
+ expect(range.max, isNull);
+ });
+
+ test('allows omitting min and max', () {
+ var range = VersionRange();
+ expect(range.isAny, isTrue);
+ expect(range.min, isNull);
+ expect(range.max, isNull);
+ });
+
+ test('takes includeMin', () {
+ var range = VersionRange(min: v123, includeMin: true);
+ expect(range.includeMin, isTrue);
+ });
+
+ test('includeMin defaults to false if omitted', () {
+ var range = VersionRange(min: v123);
+ expect(range.includeMin, isFalse);
+ });
+
+ test('takes includeMax', () {
+ var range = VersionRange(max: v123, includeMax: true);
+ expect(range.includeMax, isTrue);
+ });
+
+ test('includeMax defaults to false if omitted', () {
+ var range = VersionRange(max: v123);
+ expect(range.includeMax, isFalse);
+ });
+
+ test('throws if min > max', () {
+ expect(() => VersionRange(min: v124, max: v123), throwsArgumentError);
+ });
+ });
+
+ group('allows()', () {
+ test('version must be greater than min', () {
+ var range = VersionRange(min: v123);
+
+ expect(range, allows(Version.parse('1.3.3'), Version.parse('2.3.3')));
+ expect(
+ range, doesNotAllow(Version.parse('1.2.2'), Version.parse('1.2.3')));
+ });
+
+ test('version must be min or greater if includeMin', () {
+ var range = VersionRange(min: v123, includeMin: true);
+
+ expect(
+ range,
+ allows(Version.parse('1.2.3'), Version.parse('1.3.3'),
+ Version.parse('2.3.3')));
+ expect(range, doesNotAllow(Version.parse('1.2.2')));
+ });
+
+ test('pre-release versions of inclusive min are excluded', () {
+ var range = VersionRange(min: v123, includeMin: true);
+
+ expect(range, allows(Version.parse('1.2.4-dev')));
+ expect(range, doesNotAllow(Version.parse('1.2.3-dev')));
+ });
+
+ test('version must be less than max', () {
+ var range = VersionRange(max: v234);
+
+ expect(range, allows(Version.parse('2.3.3')));
+ expect(
+ range, doesNotAllow(Version.parse('2.3.4'), Version.parse('2.4.3')));
+ });
+
+ test('pre-release versions of non-pre-release max are excluded', () {
+ var range = VersionRange(max: v234);
+
+ expect(range, allows(Version.parse('2.3.3')));
+ expect(range,
+ doesNotAllow(Version.parse('2.3.4-dev'), Version.parse('2.3.4')));
+ });
+
+ test(
+ 'pre-release versions of non-pre-release max are included if min is a '
+ 'pre-release of the same version', () {
+ var range = VersionRange(min: Version.parse('2.3.4-dev.0'), max: v234);
+
+ expect(range, allows(Version.parse('2.3.4-dev.1')));
+ expect(
+ range,
+ doesNotAllow(Version.parse('2.3.3'), Version.parse('2.3.4-dev'),
+ Version.parse('2.3.4')));
+ });
+
+ test('pre-release versions of pre-release max are included', () {
+ var range = VersionRange(max: Version.parse('2.3.4-dev.2'));
+
+ expect(range, allows(Version.parse('2.3.4-dev.1')));
+ expect(
+ range,
+ doesNotAllow(
+ Version.parse('2.3.4-dev.2'), Version.parse('2.3.4-dev.3')));
+ });
+
+ test('version must be max or less if includeMax', () {
+ var range = VersionRange(min: v123, max: v234, includeMax: true);
+
+ expect(
+ range,
+ allows(
+ Version.parse('2.3.3'),
+ Version.parse('2.3.4'),
+ // Pre-releases of the max are allowed.
+ Version.parse('2.3.4-dev')));
+ expect(range, doesNotAllow(Version.parse('2.4.3')));
+ });
+
+ test('has no min if one was not set', () {
+ var range = VersionRange(max: v123);
+
+ expect(range, allows(Version.parse('0.0.0')));
+ expect(range, doesNotAllow(Version.parse('1.2.3')));
+ });
+
+ test('has no max if one was not set', () {
+ var range = VersionRange(min: v123);
+
+ expect(range, allows(Version.parse('1.3.3'), Version.parse('999.3.3')));
+ expect(range, doesNotAllow(Version.parse('1.2.3')));
+ });
+
+ test('allows any version if there is no min or max', () {
+ var range = VersionRange();
+
+ expect(range, allows(Version.parse('0.0.0'), Version.parse('999.99.9')));
+ });
+
+ test('allows pre-releases of the max with includeMaxPreRelease', () {
+ expect(includeMaxPreReleaseRange, allows(Version.parse('2.0.0-dev')));
+ });
+ });
+
+ group('allowsAll()', () {
+ test('allows an empty constraint', () {
+ expect(
+ VersionRange(min: v123, max: v250).allowsAll(VersionConstraint.empty),
+ isTrue);
+ });
+
+ test('allows allowed versions', () {
+ var range = VersionRange(min: v123, max: v250, includeMax: true);
+ expect(range.allowsAll(v123), isFalse);
+ expect(range.allowsAll(v124), isTrue);
+ expect(range.allowsAll(v250), isTrue);
+ expect(range.allowsAll(v300), isFalse);
+ });
+
+ test('with no min', () {
+ var range = VersionRange(max: v250);
+ expect(range.allowsAll(VersionRange(min: v080, max: v140)), isTrue);
+ expect(range.allowsAll(VersionRange(min: v080, max: v300)), isFalse);
+ expect(range.allowsAll(VersionRange(max: v140)), isTrue);
+ expect(range.allowsAll(VersionRange(max: v300)), isFalse);
+ expect(range.allowsAll(range), isTrue);
+ expect(range.allowsAll(VersionConstraint.any), isFalse);
+ });
+
+ test('with no max', () {
+ var range = VersionRange(min: v010);
+ expect(range.allowsAll(VersionRange(min: v080, max: v140)), isTrue);
+ expect(range.allowsAll(VersionRange(min: v003, max: v140)), isFalse);
+ expect(range.allowsAll(VersionRange(min: v080)), isTrue);
+ expect(range.allowsAll(VersionRange(min: v003)), isFalse);
+ expect(range.allowsAll(range), isTrue);
+ expect(range.allowsAll(VersionConstraint.any), isFalse);
+ });
+
+ test('with a min and max', () {
+ var range = VersionRange(min: v010, max: v250);
+ expect(range.allowsAll(VersionRange(min: v080, max: v140)), isTrue);
+ expect(range.allowsAll(VersionRange(min: v080, max: v300)), isFalse);
+ expect(range.allowsAll(VersionRange(min: v003, max: v140)), isFalse);
+ expect(range.allowsAll(VersionRange(min: v080)), isFalse);
+ expect(range.allowsAll(VersionRange(max: v140)), isFalse);
+ expect(range.allowsAll(range), isTrue);
+ });
+
+ test("allows a bordering range that's not more inclusive", () {
+ var exclusive = VersionRange(min: v010, max: v250);
+ var inclusive = VersionRange(
+ min: v010, includeMin: true, max: v250, includeMax: true);
+ expect(inclusive.allowsAll(exclusive), isTrue);
+ expect(inclusive.allowsAll(inclusive), isTrue);
+ expect(exclusive.allowsAll(inclusive), isFalse);
+ expect(exclusive.allowsAll(exclusive), isTrue);
+ });
+
+ test('allows unions that are completely contained', () {
+ var range = VersionRange(min: v114, max: v200);
+ expect(range.allowsAll(VersionRange(min: v123, max: v124).union(v140)),
+ isTrue);
+ expect(range.allowsAll(VersionRange(min: v010, max: v124).union(v140)),
+ isFalse);
+ expect(range.allowsAll(VersionRange(min: v123, max: v234).union(v140)),
+ isFalse);
+ });
+
+ group('pre-release versions', () {
+ test('of inclusive min are excluded', () {
+ var range = VersionRange(min: v123, includeMin: true);
+
+ expect(range.allowsAll(VersionConstraint.parse('>1.2.4-dev')), isTrue);
+ expect(range.allowsAll(VersionConstraint.parse('>1.2.3-dev')), isFalse);
+ });
+
+ test('of non-pre-release max are excluded', () {
+ var range = VersionRange(max: v234);
+
+ expect(range.allowsAll(VersionConstraint.parse('<2.3.3')), isTrue);
+ expect(range.allowsAll(VersionConstraint.parse('<2.3.4-dev')), isFalse);
+ });
+
+ test('of non-pre-release max are included with includeMaxPreRelease', () {
+ expect(
+ includeMaxPreReleaseRange
+ .allowsAll(VersionConstraint.parse('<2.0.0-dev')),
+ isTrue);
+ });
+
+ test(
+ 'of non-pre-release max are included if min is a pre-release of the '
+ 'same version', () {
+ var range = VersionRange(min: Version.parse('2.3.4-dev.0'), max: v234);
+
+ expect(
+ range.allowsAll(
+ VersionConstraint.parse('>2.3.4-dev.0 <2.3.4-dev.1')),
+ isTrue);
+ });
+
+ test('of pre-release max are included', () {
+ var range = VersionRange(max: Version.parse('2.3.4-dev.2'));
+
+ expect(
+ range.allowsAll(VersionConstraint.parse('<2.3.4-dev.1')), isTrue);
+ expect(
+ range.allowsAll(VersionConstraint.parse('<2.3.4-dev.2')), isTrue);
+ expect(
+ range.allowsAll(VersionConstraint.parse('<=2.3.4-dev.2')), isFalse);
+ expect(
+ range.allowsAll(VersionConstraint.parse('<2.3.4-dev.3')), isFalse);
+ });
+ });
+ });
+
+ group('allowsAny()', () {
+ test('disallows an empty constraint', () {
+ expect(
+ VersionRange(min: v123, max: v250).allowsAny(VersionConstraint.empty),
+ isFalse);
+ });
+
+ test('allows allowed versions', () {
+ var range = VersionRange(min: v123, max: v250, includeMax: true);
+ expect(range.allowsAny(v123), isFalse);
+ expect(range.allowsAny(v124), isTrue);
+ expect(range.allowsAny(v250), isTrue);
+ expect(range.allowsAny(v300), isFalse);
+ });
+
+ test('with no min', () {
+ var range = VersionRange(max: v200);
+ expect(range.allowsAny(VersionRange(min: v140, max: v300)), isTrue);
+ expect(range.allowsAny(VersionRange(min: v234, max: v300)), isFalse);
+ expect(range.allowsAny(VersionRange(min: v140)), isTrue);
+ expect(range.allowsAny(VersionRange(min: v234)), isFalse);
+ expect(range.allowsAny(range), isTrue);
+ });
+
+ test('with no max', () {
+ var range = VersionRange(min: v072);
+ expect(range.allowsAny(VersionRange(min: v003, max: v140)), isTrue);
+ expect(range.allowsAny(VersionRange(min: v003, max: v010)), isFalse);
+ expect(range.allowsAny(VersionRange(max: v080)), isTrue);
+ expect(range.allowsAny(VersionRange(max: v003)), isFalse);
+ expect(range.allowsAny(range), isTrue);
+ });
+
+ test('with a min and max', () {
+ var range = VersionRange(min: v072, max: v200);
+ expect(range.allowsAny(VersionRange(min: v003, max: v140)), isTrue);
+ expect(range.allowsAny(VersionRange(min: v140, max: v300)), isTrue);
+ expect(range.allowsAny(VersionRange(min: v003, max: v010)), isFalse);
+ expect(range.allowsAny(VersionRange(min: v234, max: v300)), isFalse);
+ expect(range.allowsAny(VersionRange(max: v010)), isFalse);
+ expect(range.allowsAny(VersionRange(min: v234)), isFalse);
+ expect(range.allowsAny(range), isTrue);
+ });
+
+ test('allows a bordering range when both are inclusive', () {
+ expect(
+ VersionRange(max: v250).allowsAny(VersionRange(min: v250)), isFalse);
+
+ expect(
+ VersionRange(max: v250, includeMax: true)
+ .allowsAny(VersionRange(min: v250)),
+ isFalse);
+
+ expect(
+ VersionRange(max: v250)
+ .allowsAny(VersionRange(min: v250, includeMin: true)),
+ isFalse);
+
+ expect(
+ VersionRange(max: v250, includeMax: true)
+ .allowsAny(VersionRange(min: v250, includeMin: true)),
+ isTrue);
+
+ expect(
+ VersionRange(min: v250).allowsAny(VersionRange(max: v250)), isFalse);
+
+ expect(
+ VersionRange(min: v250, includeMin: true)
+ .allowsAny(VersionRange(max: v250)),
+ isFalse);
+
+ expect(
+ VersionRange(min: v250)
+ .allowsAny(VersionRange(max: v250, includeMax: true)),
+ isFalse);
+
+ expect(
+ VersionRange(min: v250, includeMin: true)
+ .allowsAny(VersionRange(max: v250, includeMax: true)),
+ isTrue);
+ });
+
+ test('allows unions that are partially contained', () {
+ var range = VersionRange(min: v114, max: v200);
+ expect(range.allowsAny(VersionRange(min: v010, max: v080).union(v140)),
+ isTrue);
+ expect(range.allowsAny(VersionRange(min: v123, max: v234).union(v300)),
+ isTrue);
+ expect(range.allowsAny(VersionRange(min: v234, max: v300).union(v010)),
+ isFalse);
+ });
+
+ group('pre-release versions', () {
+ test('of inclusive min are excluded', () {
+ var range = VersionRange(min: v123, includeMin: true);
+
+ expect(range.allowsAny(VersionConstraint.parse('<1.2.4-dev')), isTrue);
+ expect(range.allowsAny(VersionConstraint.parse('<1.2.3-dev')), isFalse);
+ });
+
+ test('of non-pre-release max are excluded', () {
+ var range = VersionRange(max: v234);
+
+ expect(range.allowsAny(VersionConstraint.parse('>2.3.3')), isTrue);
+ expect(range.allowsAny(VersionConstraint.parse('>2.3.4-dev')), isFalse);
+ });
+
+ test('of non-pre-release max are included with includeMaxPreRelease', () {
+ expect(
+ includeMaxPreReleaseRange
+ .allowsAny(VersionConstraint.parse('>2.0.0-dev')),
+ isTrue);
+ });
+
+ test(
+ 'of non-pre-release max are included if min is a pre-release of the '
+ 'same version', () {
+ var range = VersionRange(min: Version.parse('2.3.4-dev.0'), max: v234);
+
+ expect(
+ range.allowsAny(VersionConstraint.parse('>2.3.4-dev.1')), isTrue);
+ expect(range.allowsAny(VersionConstraint.parse('>2.3.4')), isFalse);
+
+ expect(
+ range.allowsAny(VersionConstraint.parse('<2.3.4-dev.1')), isTrue);
+ expect(range.allowsAny(VersionConstraint.parse('<2.3.4-dev')), isFalse);
+ });
+
+ test('of pre-release max are included', () {
+ var range = VersionConstraint.parse('<2.3.4-dev.2');
+
+ expect(
+ range.allowsAny(VersionConstraint.parse('>2.3.4-dev.1')), isTrue);
+ expect(
+ range.allowsAny(VersionConstraint.parse('>2.3.4-dev.2')), isFalse);
+ expect(
+ range.allowsAny(VersionConstraint.parse('>2.3.4-dev.3')), isFalse);
+ });
+ });
+ });
+
+ group('intersect()', () {
+ test('two overlapping ranges', () {
+ expect(
+ VersionRange(min: v123, max: v250)
+ .intersect(VersionRange(min: v200, max: v300)),
+ equals(VersionRange(min: v200, max: v250)));
+ });
+
+ test('a non-overlapping range allows no versions', () {
+ var a = VersionRange(min: v114, max: v124);
+ var b = VersionRange(min: v200, max: v250);
+ expect(a.intersect(b).isEmpty, isTrue);
+ });
+
+ test('adjacent ranges allow no versions if exclusive', () {
+ var a = VersionRange(min: v114, max: v124);
+ var b = VersionRange(min: v124, max: v200);
+ expect(a.intersect(b).isEmpty, isTrue);
+ });
+
+ test('adjacent ranges allow version if inclusive', () {
+ var a = VersionRange(min: v114, max: v124, includeMax: true);
+ var b = VersionRange(min: v124, max: v200, includeMin: true);
+ expect(a.intersect(b), equals(v124));
+ });
+
+ test('with an open range', () {
+ var open = VersionRange();
+ var a = VersionRange(min: v114, max: v124);
+ expect(open.intersect(open), equals(open));
+ expect(a.intersect(open), equals(a));
+ });
+
+ test('returns the version if the range allows it', () {
+ expect(VersionRange(min: v114, max: v124).intersect(v123), equals(v123));
+ expect(
+ VersionRange(min: v123, max: v124).intersect(v114).isEmpty, isTrue);
+ });
+
+ test('with a range with a pre-release min, returns an empty constraint',
+ () {
+ expect(
+ VersionRange(max: v200)
+ .intersect(VersionConstraint.parse('>=2.0.0-dev')),
+ equals(VersionConstraint.empty));
+ });
+
+ test('with a range with a pre-release max, returns the original', () {
+ expect(
+ VersionRange(max: v200)
+ .intersect(VersionConstraint.parse('<2.0.0-dev')),
+ equals(VersionRange(max: v200)));
+ });
+
+ group('with includeMaxPreRelease', () {
+ test('preserves includeMaxPreRelease if the max version is included', () {
+ expect(
+ includeMaxPreReleaseRange
+ .intersect(VersionConstraint.parse('<1.0.0')),
+ equals(VersionConstraint.parse('<1.0.0')));
+ expect(
+ includeMaxPreReleaseRange
+ .intersect(VersionConstraint.parse('<2.0.0')),
+ equals(VersionConstraint.parse('<2.0.0')));
+ expect(includeMaxPreReleaseRange.intersect(includeMaxPreReleaseRange),
+ equals(includeMaxPreReleaseRange));
+ expect(
+ includeMaxPreReleaseRange
+ .intersect(VersionConstraint.parse('<3.0.0')),
+ equals(includeMaxPreReleaseRange));
+ expect(
+ includeMaxPreReleaseRange
+ .intersect(VersionConstraint.parse('>1.1.4')),
+ equals(VersionRange(
+ min: v114, max: v200, alwaysIncludeMaxPreRelease: true)));
+ });
+
+ test(
+ 'and a range with a pre-release min, returns '
+ 'an intersection', () {
+ expect(
+ includeMaxPreReleaseRange
+ .intersect(VersionConstraint.parse('>=2.0.0-dev')),
+ equals(VersionConstraint.parse('>=2.0.0-dev <2.0.0')));
+ });
+
+ test(
+ 'and a range with a pre-release max, returns '
+ 'the narrower constraint', () {
+ expect(
+ includeMaxPreReleaseRange
+ .intersect(VersionConstraint.parse('<2.0.0-dev')),
+ equals(VersionConstraint.parse('<2.0.0-dev')));
+ });
+ });
+ });
+
+ group('union()', () {
+ test('with a version returns the range if it contains the version', () {
+ var range = VersionRange(min: v114, max: v124);
+ expect(range.union(v123), equals(range));
+ });
+
+ test('with a version on the edge of the range, expands the range', () {
+ expect(
+ VersionRange(min: v114, max: v124, alwaysIncludeMaxPreRelease: true)
+ .union(v124),
+ equals(VersionRange(min: v114, max: v124, includeMax: true)));
+ expect(VersionRange(min: v114, max: v124).union(v114),
+ equals(VersionRange(min: v114, max: v124, includeMin: true)));
+ });
+
+ test(
+ 'with a version allows both the range and the version if the range '
+ "doesn't contain the version", () {
+ var result = VersionRange(min: v003, max: v114).union(v124);
+ expect(result, allows(v010));
+ expect(result, doesNotAllow(v123));
+ expect(result, allows(v124));
+ });
+
+ test('returns a VersionUnion for a disjoint range', () {
+ var result = VersionRange(min: v003, max: v114)
+ .union(VersionRange(min: v130, max: v200));
+ expect(result, allows(v080));
+ expect(result, doesNotAllow(v123));
+ expect(result, allows(v140));
+ });
+
+ test('returns a VersionUnion for a disjoint range with infinite end', () {
+ void isVersionUnion(VersionConstraint constraint) {
+ expect(constraint, allows(v080));
+ expect(constraint, doesNotAllow(v123));
+ expect(constraint, allows(v140));
+ }
+
+ for (final includeAMin in [true, false]) {
+ for (final includeAMax in [true, false]) {
+ for (final includeBMin in [true, false]) {
+ for (final includeBMax in [true, false]) {
+ final a = VersionRange(
+ min: v130, includeMin: includeAMin, includeMax: includeAMax);
+ final b = VersionRange(
+ max: v114, includeMin: includeBMin, includeMax: includeBMax);
+ isVersionUnion(a.union(b));
+ isVersionUnion(b.union(a));
+ }
+ }
+ }
+ }
+ });
+
+ test('considers open ranges disjoint', () {
+ var result = VersionRange(min: v003, max: v114)
+ .union(VersionRange(min: v114, max: v200));
+ expect(result, allows(v080));
+ expect(result, doesNotAllow(v114));
+ expect(result, allows(v140));
+
+ result = VersionRange(min: v114, max: v200)
+ .union(VersionRange(min: v003, max: v114));
+ expect(result, allows(v080));
+ expect(result, doesNotAllow(v114));
+ expect(result, allows(v140));
+ });
+
+ test('returns a merged range for an overlapping range', () {
+ var result = VersionRange(min: v003, max: v114)
+ .union(VersionRange(min: v080, max: v200));
+ expect(result, equals(VersionRange(min: v003, max: v200)));
+ });
+
+ test('considers closed ranges overlapping', () {
+ var result = VersionRange(min: v003, max: v114, includeMax: true)
+ .union(VersionRange(min: v114, max: v200));
+ expect(result, equals(VersionRange(min: v003, max: v200)));
+
+ result =
+ VersionRange(min: v003, max: v114, alwaysIncludeMaxPreRelease: true)
+ .union(VersionRange(min: v114, max: v200, includeMin: true));
+ expect(result, equals(VersionRange(min: v003, max: v200)));
+
+ result = VersionRange(min: v114, max: v200)
+ .union(VersionRange(min: v003, max: v114, includeMax: true));
+ expect(result, equals(VersionRange(min: v003, max: v200)));
+
+ result = VersionRange(min: v114, max: v200, includeMin: true).union(
+ VersionRange(min: v003, max: v114, alwaysIncludeMaxPreRelease: true));
+ expect(result, equals(VersionRange(min: v003, max: v200)));
+ });
+
+ test('includes edges if either range does', () {
+ var result = VersionRange(min: v003, max: v114, includeMin: true)
+ .union(VersionRange(min: v003, max: v114, includeMax: true));
+ expect(
+ result,
+ equals(VersionRange(
+ min: v003, max: v114, includeMin: true, includeMax: true)));
+ });
+
+ test('with a range with a pre-release min, returns a constraint with a gap',
+ () {
+ var result =
+ VersionRange(max: v200).union(VersionConstraint.parse('>=2.0.0-dev'));
+ expect(result, allows(v140));
+ expect(result, doesNotAllow(Version.parse('2.0.0-alpha')));
+ expect(result, allows(Version.parse('2.0.0-dev')));
+ expect(result, allows(Version.parse('2.0.0-dev.1')));
+ expect(result, allows(Version.parse('2.0.0')));
+ });
+
+ test('with a range with a pre-release max, returns the larger constraint',
+ () {
+ expect(
+ VersionRange(max: v200).union(VersionConstraint.parse('<2.0.0-dev')),
+ equals(VersionConstraint.parse('<2.0.0-dev')));
+ });
+
+ group('with includeMaxPreRelease', () {
+ test('adds includeMaxPreRelease if the max version is included', () {
+ expect(
+ includeMaxPreReleaseRange.union(VersionConstraint.parse('<1.0.0')),
+ equals(includeMaxPreReleaseRange));
+ expect(includeMaxPreReleaseRange.union(includeMaxPreReleaseRange),
+ equals(includeMaxPreReleaseRange));
+ expect(
+ includeMaxPreReleaseRange.union(VersionConstraint.parse('<2.0.0')),
+ equals(includeMaxPreReleaseRange));
+ expect(
+ includeMaxPreReleaseRange.union(VersionConstraint.parse('<3.0.0')),
+ equals(VersionConstraint.parse('<3.0.0')));
+ });
+
+ test('and a range with a pre-release min, returns any', () {
+ expect(
+ includeMaxPreReleaseRange
+ .union(VersionConstraint.parse('>=2.0.0-dev')),
+ equals(VersionConstraint.any));
+ });
+
+ test('and a range with a pre-release max, returns the original', () {
+ expect(
+ includeMaxPreReleaseRange
+ .union(VersionConstraint.parse('<2.0.0-dev')),
+ equals(includeMaxPreReleaseRange));
+ });
+ });
+ });
+
+ group('difference()', () {
+ test('with an empty range returns the original range', () {
+ expect(
+ VersionRange(min: v003, max: v114)
+ .difference(VersionConstraint.empty),
+ equals(VersionRange(min: v003, max: v114)));
+ });
+
+ test('with a version outside the range returns the original range', () {
+ expect(VersionRange(min: v003, max: v114).difference(v200),
+ equals(VersionRange(min: v003, max: v114)));
+ });
+
+ test('with a version in the range splits the range', () {
+ expect(
+ VersionRange(min: v003, max: v114).difference(v072),
+ equals(VersionConstraint.unionOf([
+ VersionRange(
+ min: v003, max: v072, alwaysIncludeMaxPreRelease: true),
+ VersionRange(min: v072, max: v114)
+ ])));
+ });
+
+ test('with the max version makes the max exclusive', () {
+ expect(
+ VersionRange(min: v003, max: v114, includeMax: true).difference(v114),
+ equals(VersionRange(
+ min: v003, max: v114, alwaysIncludeMaxPreRelease: true)));
+ });
+
+ test('with the min version makes the min exclusive', () {
+ expect(
+ VersionRange(min: v003, max: v114, includeMin: true).difference(v003),
+ equals(VersionRange(min: v003, max: v114)));
+ });
+
+ test('with a disjoint range returns the original', () {
+ expect(
+ VersionRange(min: v003, max: v114)
+ .difference(VersionRange(min: v123, max: v140)),
+ equals(VersionRange(min: v003, max: v114)));
+ });
+
+ test('with an adjacent range returns the original', () {
+ expect(
+ VersionRange(min: v003, max: v114, includeMax: true)
+ .difference(VersionRange(min: v114, max: v140)),
+ equals(VersionRange(min: v003, max: v114, includeMax: true)));
+ });
+
+ test('with a range at the beginning cuts off the beginning of the range',
+ () {
+ expect(
+ VersionRange(min: v080, max: v130)
+ .difference(VersionRange(min: v010, max: v114)),
+ equals(VersionConstraint.parse('>=1.1.4-0 <1.3.0')));
+ expect(
+ VersionRange(min: v080, max: v130)
+ .difference(VersionRange(max: v114)),
+ equals(VersionConstraint.parse('>=1.1.4-0 <1.3.0')));
+ expect(
+ VersionRange(min: v080, max: v130)
+ .difference(VersionRange(min: v010, max: v114, includeMax: true)),
+ equals(VersionRange(min: v114, max: v130)));
+ expect(
+ VersionRange(min: v080, max: v130, includeMin: true)
+ .difference(VersionRange(min: v010, max: v080, includeMax: true)),
+ equals(VersionRange(min: v080, max: v130)));
+ expect(
+ VersionRange(min: v080, max: v130, includeMax: true)
+ .difference(VersionRange(min: v080, max: v130)),
+ equals(VersionConstraint.parse('>=1.3.0-0 <=1.3.0')));
+ });
+
+ test('with a range at the end cuts off the end of the range', () {
+ expect(
+ VersionRange(min: v080, max: v130)
+ .difference(VersionRange(min: v114, max: v140)),
+ equals(VersionRange(min: v080, max: v114, includeMax: true)));
+ expect(
+ VersionRange(min: v080, max: v130)
+ .difference(VersionRange(min: v114)),
+ equals(VersionRange(min: v080, max: v114, includeMax: true)));
+ expect(
+ VersionRange(min: v080, max: v130)
+ .difference(VersionRange(min: v114, max: v140, includeMin: true)),
+ equals(VersionRange(
+ min: v080, max: v114, alwaysIncludeMaxPreRelease: true)));
+ expect(
+ VersionRange(min: v080, max: v130, includeMax: true)
+ .difference(VersionRange(min: v130, max: v140, includeMin: true)),
+ equals(VersionRange(
+ min: v080, max: v130, alwaysIncludeMaxPreRelease: true)));
+ expect(
+ VersionRange(min: v080, max: v130, includeMin: true)
+ .difference(VersionRange(min: v080, max: v130)),
+ equals(v080));
+ });
+
+ test('with a range in the middle cuts the range in half', () {
+ expect(
+ VersionRange(min: v003, max: v130)
+ .difference(VersionRange(min: v072, max: v114)),
+ equals(VersionConstraint.unionOf([
+ VersionRange(min: v003, max: v072, includeMax: true),
+ VersionConstraint.parse('>=1.1.4-0 <1.3.0')
+ ])));
+ });
+
+ test('with a totally covering range returns empty', () {
+ expect(
+ VersionRange(min: v114, max: v200)
+ .difference(VersionRange(min: v072, max: v300)),
+ isEmpty);
+ expect(
+ VersionRange(min: v003, max: v114)
+ .difference(VersionRange(min: v003, max: v114)),
+ isEmpty);
+ expect(
+ VersionRange(min: v003, max: v114, includeMin: true, includeMax: true)
+ .difference(VersionRange(
+ min: v003, max: v114, includeMin: true, includeMax: true)),
+ isEmpty);
+ });
+
+ test(
+ "with a version union that doesn't cover the range, returns the "
+ 'original', () {
+ expect(
+ VersionRange(min: v114, max: v140)
+ .difference(VersionConstraint.unionOf([v010, v200])),
+ equals(VersionRange(min: v114, max: v140)));
+ });
+
+ test('with a version union that intersects the ends, chops them off', () {
+ expect(
+ VersionRange(min: v114, max: v140).difference(
+ VersionConstraint.unionOf([
+ VersionRange(min: v080, max: v123),
+ VersionRange(min: v130, max: v200)
+ ])),
+ equals(VersionConstraint.parse('>=1.2.3-0 <=1.3.0')));
+ });
+
+ test('with a version union that intersects the middle, chops it up', () {
+ expect(
+ VersionRange(min: v114, max: v140)
+ .difference(VersionConstraint.unionOf([v123, v124, v130])),
+ equals(VersionConstraint.unionOf([
+ VersionRange(
+ min: v114, max: v123, alwaysIncludeMaxPreRelease: true),
+ VersionRange(
+ min: v123, max: v124, alwaysIncludeMaxPreRelease: true),
+ VersionRange(
+ min: v124, max: v130, alwaysIncludeMaxPreRelease: true),
+ VersionRange(min: v130, max: v140)
+ ])));
+ });
+
+ test('with a version union that covers the whole range, returns empty', () {
+ expect(
+ VersionRange(min: v114, max: v140).difference(
+ VersionConstraint.unionOf([v003, VersionRange(min: v010)])),
+ equals(VersionConstraint.empty));
+ });
+
+ test('with a range with a pre-release min, returns the original', () {
+ expect(
+ VersionRange(max: v200)
+ .difference(VersionConstraint.parse('>=2.0.0-dev')),
+ equals(VersionRange(max: v200)));
+ });
+
+    test('with a range with a pre-release max, returns empty', () {
+ expect(
+ VersionRange(max: v200)
+ .difference(VersionConstraint.parse('<2.0.0-dev')),
+ equals(VersionConstraint.empty));
+ });
+
+ group('with includeMaxPreRelease', () {
+ group('for the minuend', () {
+ test('preserves includeMaxPreRelease if the max version is included',
+ () {
+ expect(
+ includeMaxPreReleaseRange
+ .difference(VersionConstraint.parse('<1.0.0')),
+ equals(VersionRange(
+ min: Version.parse('1.0.0-0'),
+ max: v200,
+ includeMin: true,
+ alwaysIncludeMaxPreRelease: true)));
+ expect(
+ includeMaxPreReleaseRange
+ .difference(VersionConstraint.parse('<2.0.0')),
+ equals(VersionRange(
+ min: v200.firstPreRelease,
+ max: v200,
+ includeMin: true,
+ alwaysIncludeMaxPreRelease: true)));
+ expect(
+ includeMaxPreReleaseRange.difference(includeMaxPreReleaseRange),
+ equals(VersionConstraint.empty));
+ expect(
+ includeMaxPreReleaseRange
+ .difference(VersionConstraint.parse('<3.0.0')),
+ equals(VersionConstraint.empty));
+ });
+
+ test('with a range with a pre-release min, adjusts the max', () {
+ expect(
+ includeMaxPreReleaseRange
+ .difference(VersionConstraint.parse('>=2.0.0-dev')),
+ equals(VersionConstraint.parse('<2.0.0-dev')));
+ });
+
+ test('with a range with a pre-release max, adjusts the min', () {
+ expect(
+ includeMaxPreReleaseRange
+ .difference(VersionConstraint.parse('<2.0.0-dev')),
+ equals(VersionConstraint.parse('>=2.0.0-dev <2.0.0')));
+ });
+ });
+
+ group('for the subtrahend', () {
+ group("doesn't create a pre-release minimum", () {
+ test('when cutting off the bottom', () {
+ expect(
+ VersionConstraint.parse('<3.0.0')
+ .difference(includeMaxPreReleaseRange),
+ equals(VersionRange(min: v200, max: v300, includeMin: true)));
+ });
+
+ test('with splitting down the middle', () {
+ expect(
+ VersionConstraint.parse('<4.0.0').difference(VersionRange(
+ min: v200,
+ max: v300,
+ includeMin: true,
+ alwaysIncludeMaxPreRelease: true)),
+ equals(VersionConstraint.unionOf([
+ VersionRange(max: v200, alwaysIncludeMaxPreRelease: true),
+ VersionConstraint.parse('>=3.0.0 <4.0.0')
+ ])));
+ });
+
+ test('can leave a single version', () {
+ expect(
+ VersionConstraint.parse('<=2.0.0')
+ .difference(includeMaxPreReleaseRange),
+ equals(v200));
+ });
+ });
+ });
+ });
+ });
+
+ test('isEmpty', () {
+ expect(VersionRange().isEmpty, isFalse);
+ expect(VersionRange(min: v123, max: v124).isEmpty, isFalse);
+ });
+
+ group('compareTo()', () {
+ test('orders by minimum first', () {
+ _expectComparesSmaller(VersionRange(min: v003, max: v080),
+ VersionRange(min: v010, max: v072));
+ _expectComparesSmaller(VersionRange(min: v003, max: v080),
+ VersionRange(min: v010, max: v080));
+ _expectComparesSmaller(VersionRange(min: v003, max: v080),
+ VersionRange(min: v010, max: v114));
+ });
+
+ test('orders by maximum second', () {
+ _expectComparesSmaller(VersionRange(min: v003, max: v010),
+ VersionRange(min: v003, max: v072));
+ });
+
+ test('includeMin comes before !includeMin', () {
+ _expectComparesSmaller(
+ VersionRange(min: v003, max: v080, includeMin: true),
+ VersionRange(min: v003, max: v080));
+ });
+
+ test('includeMax comes after !includeMax', () {
+ _expectComparesSmaller(VersionRange(min: v003, max: v080),
+ VersionRange(min: v003, max: v080, includeMax: true));
+ });
+
+ test('includeMaxPreRelease comes after !includeMaxPreRelease', () {
+ _expectComparesSmaller(
+ VersionRange(max: v200), includeMaxPreReleaseRange);
+ });
+
+ test('no minimum comes before small minimum', () {
+ _expectComparesSmaller(
+ VersionRange(max: v010), VersionRange(min: v003, max: v010));
+ _expectComparesSmaller(VersionRange(max: v010, includeMin: true),
+ VersionRange(min: v003, max: v010));
+ });
+
+    test('no maximum comes after large maximum', () {
+ _expectComparesSmaller(
+ VersionRange(min: v003, max: v300), VersionRange(min: v003));
+ _expectComparesSmaller(VersionRange(min: v003, max: v300),
+ VersionRange(min: v003, includeMax: true));
+ });
+ });
+}
+
+void _expectComparesSmaller(VersionRange smaller, VersionRange larger) {
+ expect(smaller.compareTo(larger), lessThan(0),
+ reason: 'expected $smaller to sort below $larger');
+ expect(larger.compareTo(smaller), greaterThan(0),
+ reason: 'expected $larger to sort above $smaller');
+}
diff --git a/pkgs/pub_semver/test/version_test.dart b/pkgs/pub_semver/test/version_test.dart
new file mode 100644
index 0000000..d7f1197
--- /dev/null
+++ b/pkgs/pub_semver/test/version_test.dart
@@ -0,0 +1,411 @@
+// Copyright (c) 2014, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:pub_semver/pub_semver.dart';
+import 'package:test/test.dart';
+
+import 'utils.dart';
+
+void main() {
+ test('none', () {
+ expect(Version.none.toString(), equals('0.0.0'));
+ });
+
+ test('prioritize()', () {
+ // A correctly sorted list of versions in order of increasing priority.
+ var versions = [
+ '1.0.0-alpha',
+ '2.0.0-alpha',
+ '1.0.0',
+ '1.0.0+build',
+ '1.0.1',
+ '1.1.0',
+ '2.0.0'
+ ];
+
+ // Ensure that every pair of versions is prioritized in the order that it
+ // appears in the list.
+ for (var i = 0; i < versions.length; i++) {
+ for (var j = 0; j < versions.length; j++) {
+ var a = Version.parse(versions[i]);
+ var b = Version.parse(versions[j]);
+ expect(Version.prioritize(a, b), equals(i.compareTo(j)));
+ }
+ }
+ });
+
+ test('antiprioritize()', () {
+ // A correctly sorted list of versions in order of increasing antipriority.
+ var versions = [
+ '2.0.0-alpha',
+ '1.0.0-alpha',
+ '2.0.0',
+ '1.1.0',
+ '1.0.1',
+ '1.0.0+build',
+ '1.0.0'
+ ];
+
+ // Ensure that every pair of versions is prioritized in the order that it
+ // appears in the list.
+ for (var i = 0; i < versions.length; i++) {
+ for (var j = 0; j < versions.length; j++) {
+ var a = Version.parse(versions[i]);
+ var b = Version.parse(versions[j]);
+ expect(Version.antiprioritize(a, b), equals(i.compareTo(j)));
+ }
+ }
+ });
+
+ group('constructor', () {
+ test('throws on negative numbers', () {
+ expect(() => Version(-1, 1, 1), throwsArgumentError);
+ expect(() => Version(1, -1, 1), throwsArgumentError);
+ expect(() => Version(1, 1, -1), throwsArgumentError);
+ });
+ });
+
+ group('comparison', () {
+ // A correctly sorted list of versions.
+ var versions = [
+ '1.0.0-alpha',
+ '1.0.0-alpha.1',
+ '1.0.0-beta.2',
+ '1.0.0-beta.11',
+ '1.0.0-rc.1',
+ '1.0.0-rc.1+build.1',
+ '1.0.0',
+ '1.0.0+0.3.7',
+ '1.3.7+build',
+ '1.3.7+build.2.b8f12d7',
+ '1.3.7+build.11.e0f985a',
+ '2.0.0',
+ '2.1.0',
+ '2.2.0',
+ '2.11.0',
+ '2.11.1'
+ ];
+
+ test('compareTo()', () {
+ // Ensure that every pair of versions compares in the order that it
+ // appears in the list.
+ for (var i = 0; i < versions.length; i++) {
+ for (var j = 0; j < versions.length; j++) {
+ var a = Version.parse(versions[i]);
+ var b = Version.parse(versions[j]);
+ expect(a.compareTo(b), equals(i.compareTo(j)));
+ }
+ }
+ });
+
+ test('operators', () {
+ for (var i = 0; i < versions.length; i++) {
+ for (var j = 0; j < versions.length; j++) {
+ var a = Version.parse(versions[i]);
+ var b = Version.parse(versions[j]);
+ expect(a < b, equals(i < j));
+ expect(a > b, equals(i > j));
+ expect(a <= b, equals(i <= j));
+ expect(a >= b, equals(i >= j));
+ expect(a == b, equals(i == j));
+ expect(a != b, equals(i != j));
+ }
+ }
+ });
+
+ test('equality', () {
+ expect(Version.parse('01.2.3'), equals(Version.parse('1.2.3')));
+ expect(Version.parse('1.02.3'), equals(Version.parse('1.2.3')));
+ expect(Version.parse('1.2.03'), equals(Version.parse('1.2.3')));
+ expect(Version.parse('1.2.3-01'), equals(Version.parse('1.2.3-1')));
+ expect(Version.parse('1.2.3+01'), equals(Version.parse('1.2.3+1')));
+ });
+ });
+
+ test('allows()', () {
+ expect(v123, allows(v123));
+ expect(
+ v123,
+ doesNotAllow(
+ Version.parse('2.2.3'),
+ Version.parse('1.3.3'),
+ Version.parse('1.2.4'),
+ Version.parse('1.2.3-dev'),
+ Version.parse('1.2.3+build')));
+ });
+
+ test('allowsAll()', () {
+ expect(v123.allowsAll(v123), isTrue);
+ expect(v123.allowsAll(v003), isFalse);
+ expect(v123.allowsAll(VersionRange(min: v114, max: v124)), isFalse);
+ expect(v123.allowsAll(VersionConstraint.any), isFalse);
+ expect(v123.allowsAll(VersionConstraint.empty), isTrue);
+ });
+
+ test('allowsAny()', () {
+ expect(v123.allowsAny(v123), isTrue);
+ expect(v123.allowsAny(v003), isFalse);
+ expect(v123.allowsAny(VersionRange(min: v114, max: v124)), isTrue);
+ expect(v123.allowsAny(VersionConstraint.any), isTrue);
+ expect(v123.allowsAny(VersionConstraint.empty), isFalse);
+ });
+
+ test('intersect()', () {
+ // Intersecting the same version returns the version.
+ expect(v123.intersect(v123), equals(v123));
+
+ // Intersecting a different version allows no versions.
+ expect(v123.intersect(v114).isEmpty, isTrue);
+
+ // Intersecting a range returns the version if the range allows it.
+ expect(v123.intersect(VersionRange(min: v114, max: v124)), equals(v123));
+
+ // Intersecting a range allows no versions if the range doesn't allow it.
+ expect(v114.intersect(VersionRange(min: v123, max: v124)).isEmpty, isTrue);
+ });
+
+ group('union()', () {
+ test('with the same version returns the version', () {
+ expect(v123.union(v123), equals(v123));
+ });
+
+ test('with a different version returns a version that matches both', () {
+ var result = v123.union(v080);
+ expect(result, allows(v123));
+ expect(result, allows(v080));
+
+ // Nothing in between should match.
+ expect(result, doesNotAllow(v114));
+ });
+
+ test('with a range returns the range if it contains the version', () {
+ var range = VersionRange(min: v114, max: v124);
+ expect(v123.union(range), equals(range));
+ });
+
+ test('with a range with the version on the edge, expands the range', () {
+ expect(
+ v124.union(VersionRange(
+ min: v114, max: v124, alwaysIncludeMaxPreRelease: true)),
+ equals(VersionRange(min: v114, max: v124, includeMax: true)));
+ expect(
+ v124.firstPreRelease.union(VersionRange(min: v114, max: v124)),
+ equals(VersionRange(
+ min: v114, max: v124.firstPreRelease, includeMax: true)));
+ expect(v114.union(VersionRange(min: v114, max: v124)),
+ equals(VersionRange(min: v114, max: v124, includeMin: true)));
+ });
+
+ test(
+ 'with a range allows both the range and the version if the range '
+ "doesn't contain the version", () {
+ var result = v123.union(VersionRange(min: v003, max: v114));
+ expect(result, allows(v123));
+ expect(result, allows(v010));
+ });
+ });
+
+ group('difference()', () {
+ test('with the same version returns an empty constraint', () {
+ expect(v123.difference(v123), isEmpty);
+ });
+
+ test('with a different version returns the original version', () {
+ expect(v123.difference(v080), equals(v123));
+ });
+
+ test('returns an empty constraint with a range that contains the version',
+ () {
+ expect(v123.difference(VersionRange(min: v114, max: v124)), isEmpty);
+ });
+
+ test("returns the version constraint with a range that doesn't contain it",
+ () {
+ expect(v123.difference(VersionRange(min: v140, max: v300)), equals(v123));
+ });
+ });
+
+ test('isEmpty', () {
+ expect(v123.isEmpty, isFalse);
+ });
+
+ test('nextMajor', () {
+ expect(v123.nextMajor, equals(v200));
+ expect(v114.nextMajor, equals(v200));
+ expect(v200.nextMajor, equals(v300));
+
+ // Ignores pre-release if not on a major version.
+ expect(Version.parse('1.2.3-dev').nextMajor, equals(v200));
+
+ // Just removes it if on a major version.
+ expect(Version.parse('2.0.0-dev').nextMajor, equals(v200));
+
+ // Strips build suffix.
+ expect(Version.parse('1.2.3+patch').nextMajor, equals(v200));
+ });
+
+ test('nextMinor', () {
+ expect(v123.nextMinor, equals(v130));
+ expect(v130.nextMinor, equals(v140));
+
+ // Ignores pre-release if not on a minor version.
+ expect(Version.parse('1.2.3-dev').nextMinor, equals(v130));
+
+ // Just removes it if on a minor version.
+ expect(Version.parse('1.3.0-dev').nextMinor, equals(v130));
+
+ // Strips build suffix.
+ expect(Version.parse('1.2.3+patch').nextMinor, equals(v130));
+ });
+
+ test('nextPatch', () {
+ expect(v123.nextPatch, equals(v124));
+ expect(v200.nextPatch, equals(v201));
+
+ // Just removes pre-release version if present.
+ expect(Version.parse('1.2.4-dev').nextPatch, equals(v124));
+
+ // Strips build suffix.
+ expect(Version.parse('1.2.3+patch').nextPatch, equals(v124));
+ });
+
+ test('nextBreaking', () {
+ expect(v123.nextBreaking, equals(v200));
+ expect(v072.nextBreaking, equals(v080));
+ expect(v003.nextBreaking, equals(v010));
+
+ // Removes pre-release version if present.
+ expect(Version.parse('1.2.3-dev').nextBreaking, equals(v200));
+
+ // Strips build suffix.
+ expect(Version.parse('1.2.3+patch').nextBreaking, equals(v200));
+ });
+
+ test('parse()', () {
+ expect(Version.parse('0.0.0'), equals(Version(0, 0, 0)));
+ expect(Version.parse('12.34.56'), equals(Version(12, 34, 56)));
+
+ expect(Version.parse('1.2.3-alpha.1'),
+ equals(Version(1, 2, 3, pre: 'alpha.1')));
+ expect(Version.parse('1.2.3-x.7.z-92'),
+ equals(Version(1, 2, 3, pre: 'x.7.z-92')));
+
+ expect(Version.parse('1.2.3+build.1'),
+ equals(Version(1, 2, 3, build: 'build.1')));
+ expect(Version.parse('1.2.3+x.7.z-92'),
+ equals(Version(1, 2, 3, build: 'x.7.z-92')));
+
+ expect(Version.parse('1.0.0-rc-1+build-1'),
+ equals(Version(1, 0, 0, pre: 'rc-1', build: 'build-1')));
+
+ expect(() => Version.parse('1.0'), throwsFormatException);
+ expect(() => Version.parse('1a2b3'), throwsFormatException);
+ expect(() => Version.parse('1.2.3.4'), throwsFormatException);
+ expect(() => Version.parse('1234'), throwsFormatException);
+ expect(() => Version.parse('-2.3.4'), throwsFormatException);
+ expect(() => Version.parse('1.3-pre'), throwsFormatException);
+ expect(() => Version.parse('1.3+build'), throwsFormatException);
+ expect(() => Version.parse('1.3+bu?!3ild'), throwsFormatException);
+ });
+
+ group('toString()', () {
+ test('returns the version string', () {
+ expect(Version(0, 0, 0).toString(), equals('0.0.0'));
+ expect(Version(12, 34, 56).toString(), equals('12.34.56'));
+
+ expect(
+ Version(1, 2, 3, pre: 'alpha.1').toString(), equals('1.2.3-alpha.1'));
+ expect(Version(1, 2, 3, pre: 'x.7.z-92').toString(),
+ equals('1.2.3-x.7.z-92'));
+
+ expect(Version(1, 2, 3, build: 'build.1').toString(),
+ equals('1.2.3+build.1'));
+ expect(Version(1, 2, 3, pre: 'pre', build: 'bui').toString(),
+ equals('1.2.3-pre+bui'));
+ });
+
+ test('preserves leading zeroes', () {
+ expect(Version.parse('001.02.0003-01.dev+pre.002').toString(),
+ equals('001.02.0003-01.dev+pre.002'));
+ });
+ });
+
+ group('canonicalizedVersion', () {
+ test('returns version string', () {
+ expect(Version(0, 0, 0).canonicalizedVersion, equals('0.0.0'));
+ expect(Version(12, 34, 56).canonicalizedVersion, equals('12.34.56'));
+
+ expect(Version(1, 2, 3, pre: 'alpha.1').canonicalizedVersion,
+ equals('1.2.3-alpha.1'));
+ expect(Version(1, 2, 3, pre: 'x.7.z-92').canonicalizedVersion,
+ equals('1.2.3-x.7.z-92'));
+
+ expect(Version(1, 2, 3, build: 'build.1').canonicalizedVersion,
+ equals('1.2.3+build.1'));
+ expect(Version(1, 2, 3, pre: 'pre', build: 'bui').canonicalizedVersion,
+ equals('1.2.3-pre+bui'));
+ });
+
+ test('discards leading zeroes', () {
+ expect(Version.parse('001.02.0003-01.dev+pre.002').canonicalizedVersion,
+ equals('1.2.3-1.dev+pre.2'));
+ });
+
+ test('example from documentation', () {
+ final v = Version.parse('01.02.03-01.dev+pre.02');
+
+ assert(v.toString() == '01.02.03-01.dev+pre.02');
+ assert(v.canonicalizedVersion == '1.2.3-1.dev+pre.2');
+ assert(Version.parse(v.canonicalizedVersion) == v);
+ });
+ });
+
+ group('primary', () {
+ test('single', () {
+ expect(
+ _primary([
+ '1.2.3',
+ ]).toString(),
+ '1.2.3',
+ );
+ });
+
+ test('normal', () {
+ expect(
+ _primary([
+ '1.2.3',
+ '1.2.2',
+ ]).toString(),
+ '1.2.3',
+ );
+ });
+
+ test('all prerelease', () {
+ expect(
+ _primary([
+ '1.2.2-dev.1',
+ '1.2.2-dev.2',
+ ]).toString(),
+ '1.2.2-dev.2',
+ );
+ });
+
+ test('later prerelease', () {
+ expect(
+ _primary([
+ '1.2.3',
+ '1.2.3-dev',
+ ]).toString(),
+ '1.2.3',
+ );
+ });
+
+ test('empty', () {
+ expect(() => Version.primary([]), throwsStateError);
+ });
+ });
+}
+
+Version _primary(List<String> input) =>
+ Version.primary(input.map(Version.parse).toList());
diff --git a/pkgs/pub_semver/test/version_union_test.dart b/pkgs/pub_semver/test/version_union_test.dart
new file mode 100644
index 0000000..857f10e
--- /dev/null
+++ b/pkgs/pub_semver/test/version_union_test.dart
@@ -0,0 +1,482 @@
+// Copyright (c) 2015, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:pub_semver/pub_semver.dart';
+import 'package:test/test.dart';
+
+import 'utils.dart';
+
+void main() {
+ group('factory', () {
+ test('ignores empty constraints', () {
+ expect(
+ VersionConstraint.unionOf([
+ VersionConstraint.empty,
+ VersionConstraint.empty,
+ v123,
+ VersionConstraint.empty
+ ]),
+ equals(v123));
+
+ expect(
+ VersionConstraint.unionOf(
+ [VersionConstraint.empty, VersionConstraint.empty]),
+ isEmpty);
+ });
+
+ test('returns an empty constraint for an empty list', () {
+ expect(VersionConstraint.unionOf([]), isEmpty);
+ });
+
+ test('any constraints override everything', () {
+ expect(
+ VersionConstraint.unionOf([
+ v123,
+ VersionConstraint.any,
+ v200,
+ VersionRange(min: v234, max: v250)
+ ]),
+ equals(VersionConstraint.any));
+ });
+
+ test('flattens other unions', () {
+ expect(
+ VersionConstraint.unionOf([
+ v072,
+ VersionConstraint.unionOf([v123, v124]),
+ v250
+ ]),
+ equals(VersionConstraint.unionOf([v072, v123, v124, v250])));
+ });
+
+ test('returns a single merged range as-is', () {
+ expect(
+ VersionConstraint.unionOf([
+ VersionRange(min: v080, max: v140),
+ VersionRange(min: v123, max: v200)
+ ]),
+ equals(VersionRange(min: v080, max: v200)));
+ });
+ });
+
+ group('equality', () {
+ test("doesn't depend on original order", () {
+ expect(
+ VersionConstraint.unionOf([
+ v250,
+ VersionRange(min: v201, max: v234),
+ v124,
+ v072,
+ VersionRange(min: v080, max: v114),
+ v123
+ ]),
+ equals(VersionConstraint.unionOf([
+ v072,
+ VersionRange(min: v080, max: v114),
+ v123,
+ v124,
+ VersionRange(min: v201, max: v234),
+ v250
+ ])));
+ });
+
+ test('merges overlapping ranges', () {
+ expect(
+ VersionConstraint.unionOf([
+ VersionRange(min: v003, max: v072),
+ VersionRange(min: v010, max: v080),
+ VersionRange(min: v114, max: v124),
+ VersionRange(min: v123, max: v130)
+ ]),
+ equals(VersionConstraint.unionOf([
+ VersionRange(min: v003, max: v080),
+ VersionRange(min: v114, max: v130)
+ ])));
+ });
+
+ test('merges adjacent ranges', () {
+ expect(
+ VersionConstraint.unionOf([
+ VersionRange(min: v003, max: v072, includeMax: true),
+ VersionRange(min: v072, max: v080),
+ VersionRange(
+ min: v114, max: v124, alwaysIncludeMaxPreRelease: true),
+ VersionRange(min: v124, max: v130, includeMin: true),
+ VersionRange(min: v130.firstPreRelease, max: v200, includeMin: true)
+ ]),
+ equals(VersionConstraint.unionOf([
+ VersionRange(min: v003, max: v080),
+ VersionRange(min: v114, max: v200)
+ ])));
+ });
+
+ test("doesn't merge not-quite-adjacent ranges", () {
+ expect(
+ VersionConstraint.unionOf([
+ VersionRange(min: v114, max: v124),
+ VersionRange(min: v124, max: v130, includeMin: true)
+ ]),
+ isNot(equals(VersionRange(min: v114, max: v130))));
+
+ expect(
+ VersionConstraint.unionOf([
+ VersionRange(min: v003, max: v072),
+ VersionRange(min: v072, max: v080)
+ ]),
+ isNot(equals(VersionRange(min: v003, max: v080))));
+ });
+
+ test('merges version numbers into ranges', () {
+ expect(
+ VersionConstraint.unionOf([
+ VersionRange(min: v003, max: v072),
+ v010,
+ VersionRange(min: v114, max: v124),
+ v123
+ ]),
+ equals(VersionConstraint.unionOf([
+ VersionRange(min: v003, max: v072),
+ VersionRange(min: v114, max: v124)
+ ])));
+ });
+
+ test('merges adjacent version numbers into ranges', () {
+ expect(
+ VersionConstraint.unionOf([
+ VersionRange(
+ min: v003, max: v072, alwaysIncludeMaxPreRelease: true),
+ v072,
+ v114,
+ VersionRange(min: v114, max: v124),
+ v124.firstPreRelease
+ ]),
+ equals(VersionConstraint.unionOf([
+ VersionRange(min: v003, max: v072, includeMax: true),
+ VersionRange(
+ min: v114,
+ max: v124.firstPreRelease,
+ includeMin: true,
+ includeMax: true)
+ ])));
+ });
+
+ test("doesn't merge not-quite-adjacent version numbers into ranges", () {
+ expect(
+ VersionConstraint.unionOf([VersionRange(min: v003, max: v072), v072]),
+ isNot(equals(VersionRange(min: v003, max: v072, includeMax: true))));
+ });
+ });
+
+ test('isEmpty returns false', () {
+ expect(
+ VersionConstraint.unionOf([
+ VersionRange(min: v003, max: v080),
+ VersionRange(min: v123, max: v130),
+ ]),
+ isNot(isEmpty));
+ });
+
+ test('isAny returns false', () {
+ expect(
+ VersionConstraint.unionOf([
+ VersionRange(min: v003, max: v080),
+ VersionRange(min: v123, max: v130),
+ ]).isAny,
+ isFalse);
+ });
+
+ test('allows() allows anything the components allow', () {
+ var union = VersionConstraint.unionOf([
+ VersionRange(min: v003, max: v080),
+ VersionRange(min: v123, max: v130),
+ v200
+ ]);
+
+ expect(union, allows(v010));
+ expect(union, doesNotAllow(v080));
+ expect(union, allows(v124));
+ expect(union, doesNotAllow(v140));
+ expect(union, allows(v200));
+ });
+
+ group('allowsAll()', () {
+ test('for a version, returns true if any component allows the version', () {
+ var union = VersionConstraint.unionOf([
+ VersionRange(min: v003, max: v080),
+ VersionRange(min: v123, max: v130),
+ v200
+ ]);
+
+ expect(union.allowsAll(v010), isTrue);
+ expect(union.allowsAll(v080), isFalse);
+ expect(union.allowsAll(v124), isTrue);
+ expect(union.allowsAll(v140), isFalse);
+ expect(union.allowsAll(v200), isTrue);
+ });
+
+ test(
+ 'for a version range, returns true if any component allows the whole '
+ 'range', () {
+ var union = VersionConstraint.unionOf([
+ VersionRange(min: v003, max: v080),
+ VersionRange(min: v123, max: v130)
+ ]);
+
+ expect(union.allowsAll(VersionRange(min: v003, max: v080)), isTrue);
+ expect(union.allowsAll(VersionRange(min: v010, max: v072)), isTrue);
+ expect(union.allowsAll(VersionRange(min: v010, max: v124)), isFalse);
+ });
+
+ group('for a union,', () {
+ var union = VersionConstraint.unionOf([
+ VersionRange(min: v003, max: v080),
+ VersionRange(min: v123, max: v130)
+ ]);
+
+ test('returns true if every constraint matches a different constraint',
+ () {
+ expect(
+ union.allowsAll(VersionConstraint.unionOf([
+ VersionRange(min: v010, max: v072),
+ VersionRange(min: v124, max: v130)
+ ])),
+ isTrue);
+ });
+
+ test('returns true if every constraint matches the same constraint', () {
+ expect(
+ union.allowsAll(VersionConstraint.unionOf([
+ VersionRange(min: v003, max: v010),
+ VersionRange(min: v072, max: v080)
+ ])),
+ isTrue);
+ });
+
+ test("returns false if there's an unmatched constraint", () {
+ expect(
+ union.allowsAll(VersionConstraint.unionOf([
+ VersionRange(min: v010, max: v072),
+ VersionRange(min: v124, max: v130),
+ VersionRange(min: v140, max: v200)
+ ])),
+ isFalse);
+ });
+
+ test("returns false if a constraint isn't fully matched", () {
+ expect(
+ union.allowsAll(VersionConstraint.unionOf([
+ VersionRange(min: v010, max: v114),
+ VersionRange(min: v124, max: v130)
+ ])),
+ isFalse);
+ });
+ });
+ });
+
+ group('allowsAny()', () {
+ test('for a version, returns true if any component allows the version', () {
+ var union = VersionConstraint.unionOf([
+ VersionRange(min: v003, max: v080),
+ VersionRange(min: v123, max: v130),
+ v200
+ ]);
+
+ expect(union.allowsAny(v010), isTrue);
+ expect(union.allowsAny(v080), isFalse);
+ expect(union.allowsAny(v124), isTrue);
+ expect(union.allowsAny(v140), isFalse);
+ expect(union.allowsAny(v200), isTrue);
+ });
+
+ test(
+ 'for a version range, returns true if any component allows part of '
+ 'the range', () {
+ var union =
+ VersionConstraint.unionOf([VersionRange(min: v003, max: v080), v123]);
+
+ expect(union.allowsAny(VersionRange(min: v010, max: v114)), isTrue);
+ expect(union.allowsAny(VersionRange(min: v114, max: v124)), isTrue);
+ expect(union.allowsAny(VersionRange(min: v124, max: v130)), isFalse);
+ });
+
+ group('for a union,', () {
+ var union = VersionConstraint.unionOf([
+ VersionRange(min: v010, max: v080),
+ VersionRange(min: v123, max: v130)
+ ]);
+
+ test('returns true if any constraint matches', () {
+ expect(
+ union.allowsAny(VersionConstraint.unionOf(
+ [v072, VersionRange(min: v200, max: v300)])),
+ isTrue);
+
+ expect(
+ union.allowsAny(VersionConstraint.unionOf(
+ [v003, VersionRange(min: v124, max: v300)])),
+ isTrue);
+ });
+
+ test('returns false if no constraint matches', () {
+ expect(
+ union.allowsAny(VersionConstraint.unionOf([
+ v003,
+ VersionRange(min: v130, max: v140),
+ VersionRange(min: v140, max: v200)
+ ])),
+ isFalse);
+ });
+ });
+ });
+
+ group('intersect()', () {
+ test('with an overlapping version, returns that version', () {
+ expect(
+ VersionConstraint.unionOf([
+ VersionRange(min: v010, max: v080),
+ VersionRange(min: v123, max: v140)
+ ]).intersect(v072),
+ equals(v072));
+ });
+
+ test('with a non-overlapping version, returns an empty constraint', () {
+ expect(
+ VersionConstraint.unionOf([
+ VersionRange(min: v010, max: v080),
+ VersionRange(min: v123, max: v140)
+ ]).intersect(v300),
+ isEmpty);
+ });
+
+ test('with an overlapping range, returns that range', () {
+ var range = VersionRange(min: v072, max: v080);
+ expect(
+ VersionConstraint.unionOf([
+ VersionRange(min: v010, max: v080),
+ VersionRange(min: v123, max: v140)
+ ]).intersect(range),
+ equals(range));
+ });
+
+ test('with a non-overlapping range, returns an empty constraint', () {
+ expect(
+ VersionConstraint.unionOf([
+ VersionRange(min: v010, max: v080),
+ VersionRange(min: v123, max: v140)
+ ]).intersect(VersionRange(min: v080, max: v123)),
+ isEmpty);
+ });
+
+    test('with a partially-overlapping range, returns the overlapping parts',
+ () {
+ expect(
+ VersionConstraint.unionOf([
+ VersionRange(min: v010, max: v080),
+ VersionRange(min: v123, max: v140)
+ ]).intersect(VersionRange(min: v072, max: v130)),
+ equals(VersionConstraint.unionOf([
+ VersionRange(min: v072, max: v080),
+ VersionRange(min: v123, max: v130)
+ ])));
+ });
+
+ group('for a union,', () {
+ var union = VersionConstraint.unionOf([
+ VersionRange(min: v003, max: v080),
+ VersionRange(min: v123, max: v130)
+ ]);
+
+ test('returns the overlapping parts', () {
+ expect(
+ union.intersect(VersionConstraint.unionOf([
+ v010,
+ VersionRange(min: v072, max: v124),
+ VersionRange(min: v124, max: v130)
+ ])),
+ equals(VersionConstraint.unionOf([
+ v010,
+ VersionRange(min: v072, max: v080),
+ VersionRange(min: v123, max: v124),
+ VersionRange(min: v124, max: v130)
+ ])));
+ });
+
+ test("drops parts that don't match", () {
+ expect(
+ union.intersect(VersionConstraint.unionOf([
+ v003,
+ VersionRange(min: v072, max: v080),
+ VersionRange(min: v080, max: v123)
+ ])),
+ equals(VersionRange(min: v072, max: v080)));
+ });
+ });
+ });
+
+ group('difference()', () {
+ test("ignores ranges that don't intersect", () {
+ expect(
+ VersionConstraint.unionOf([
+ VersionRange(min: v072, max: v080),
+ VersionRange(min: v123, max: v130)
+ ]).difference(VersionConstraint.unionOf([
+ VersionRange(min: v003, max: v010),
+ VersionRange(min: v080, max: v123),
+ VersionRange(min: v140)
+ ])),
+ equals(VersionConstraint.unionOf([
+ VersionRange(min: v072, max: v080),
+ VersionRange(min: v123, max: v130)
+ ])));
+ });
+
+ test('removes overlapping portions', () {
+ expect(
+ VersionConstraint.unionOf([
+ VersionRange(min: v010, max: v080),
+ VersionRange(min: v123, max: v130)
+ ]).difference(VersionConstraint.unionOf(
+ [VersionRange(min: v003, max: v072), VersionRange(min: v124)])),
+ equals(VersionConstraint.unionOf([
+ VersionRange(
+ min: v072.firstPreRelease, max: v080, includeMin: true),
+ VersionRange(min: v123, max: v124, includeMax: true)
+ ])));
+ });
+
+ test('removes multiple portions from the same range', () {
+ expect(
+ VersionConstraint.unionOf([
+ VersionRange(min: v010, max: v114),
+ VersionRange(min: v130, max: v200)
+ ]).difference(VersionConstraint.unionOf([v072, v080])),
+ equals(VersionConstraint.unionOf([
+ VersionRange(
+ min: v010, max: v072, alwaysIncludeMaxPreRelease: true),
+ VersionRange(
+ min: v072, max: v080, alwaysIncludeMaxPreRelease: true),
+ VersionRange(min: v080, max: v114),
+ VersionRange(min: v130, max: v200)
+ ])));
+ });
+
+ test('removes the same range from multiple ranges', () {
+ expect(
+ VersionConstraint.unionOf([
+ VersionRange(min: v010, max: v072),
+ VersionRange(min: v080, max: v123),
+ VersionRange(min: v124, max: v130),
+ VersionRange(min: v200, max: v234),
+ VersionRange(min: v250, max: v300)
+ ]).difference(VersionRange(min: v114, max: v201)),
+ equals(VersionConstraint.unionOf([
+ VersionRange(min: v010, max: v072),
+ VersionRange(min: v080, max: v114, includeMax: true),
+ VersionRange(
+ min: v201.firstPreRelease, max: v234, includeMin: true),
+ VersionRange(min: v250, max: v300)
+ ])));
+ });
+ });
+}
diff --git a/pkgs/pubspec_parse/.gitignore b/pkgs/pubspec_parse/.gitignore
new file mode 100644
index 0000000..ec8eae3
--- /dev/null
+++ b/pkgs/pubspec_parse/.gitignore
@@ -0,0 +1,4 @@
+# Don’t commit the following files and directories created by pub.
+.dart_tool/
+.packages
+pubspec.lock
diff --git a/pkgs/pubspec_parse/CHANGELOG.md b/pkgs/pubspec_parse/CHANGELOG.md
new file mode 100644
index 0000000..5aeb498
--- /dev/null
+++ b/pkgs/pubspec_parse/CHANGELOG.md
@@ -0,0 +1,110 @@
+## 1.5.0
+
+- Added fields to `Pubspec`: `executables`, `resolution`, `workspace`.
+- Require Dart 3.6
+- Update dependencies.
+
+## 1.4.0
+
+- Require Dart 3.2
+- Seal the `Dependency` class.
+- Set `Pubspec.environment` to non-nullable.
+- Remove deprecated package_api_docs rule
+- Move to `dart-lang/tools` monorepo.
+
+## 1.3.0
+
+- Require Dart 3.0
+- Added support for `ignored_advisories` field.
+- Added structural equality for `Dependency` subclasses and `HostedDetails`.
+
+## 1.2.3
+
+- Added topics to `pubspec.yaml`.
+
+## 1.2.2
+
+- Require Dart SDK >= 2.18.0
+- Required `json_annotation: ^4.8.0`
+- Added support for `topics` field.
+
+## 1.2.1
+
+- Added support for `funding` field.
+
+## 1.2.0
+
+- Added support for `screenshots` field.
+- Update `HostedDetails` to reflect how `hosted` dependencies are parsed in
+ Dart 2.15:
+ - Add `HostedDetails.declaredName` as the (optional) `name` property in a
+ `hosted` block.
+ - `HostedDetails.name` now falls back to the name of the dependency if no
+ name is declared in the block.
+- Require Dart SDK >= 2.14.0
+
+## 1.1.0
+
+- Export `HostedDetails` publicly.
+
+## 1.0.0
+
+- Migrate to null-safety.
+- Pubspec: `author` and `authors` are both now deprecated.
+ See https://dart.dev/tools/pub/pubspec#authorauthors
+
+## 0.1.8
+
+- Allow the latest `package:pub_semver`.
+
+## 0.1.7
+
+- Allow `package:yaml` `v3.x`.
+
+## 0.1.6
+
+- Update SDK requirement to `>=2.7.0 <3.0.0`.
+- Allow `package:json_annotation` `v4.x`.
+
+## 0.1.5
+
+- Update SDK requirement to `>=2.2.0 <3.0.0`.
+- Support the latest `package:json_annotation`.
+
+## 0.1.4
+
+- Added `lenient` named argument to `Pubspec.fromJson` to ignore format and type errors.
+
+## 0.1.3
+
+- Added support for `flutter`, `issue_tracker`, `publish_to`, and `repository`
+ fields.
+
+## 0.1.2+3
+
+- Support the latest version of `package:json_annotation`.
+
+## 0.1.2+2
+
+- Support `package:json_annotation` v1.
+
+## 0.1.2+1
+
+- Support the Dart 2 stable release.
+
+## 0.1.2
+
+- Allow superfluous `version` keys with `git` and `path` dependencies.
+- Improve errors when unsupported keys are provided in dependencies.
+- Provide better errors with invalid `sdk` dependency values.
+- Support "scp-like syntax" for Git SSH URIs in the form
+ `[user@]host.xz:path/to/repo.git/`.
+
+## 0.1.1
+
+- Fixed name collision with error type in latest `package:json_annotation`.
+- Improved parsing of hosted dependencies and environment constraints.
+
+## 0.1.0
+
+- Initial release.
diff --git a/pkgs/pubspec_parse/LICENSE b/pkgs/pubspec_parse/LICENSE
new file mode 100644
index 0000000..4d1ad40
--- /dev/null
+++ b/pkgs/pubspec_parse/LICENSE
@@ -0,0 +1,27 @@
+Copyright 2018, the Dart project authors.
+
+Redistribution and use in source and binary forms, with or without
+modification, are permitted provided that the following conditions are
+met:
+
+ * Redistributions of source code must retain the above copyright
+ notice, this list of conditions and the following disclaimer.
+ * Redistributions in binary form must reproduce the above
+ copyright notice, this list of conditions and the following
+ disclaimer in the documentation and/or other materials provided
+ with the distribution.
+ * Neither the name of Google LLC nor the names of its
+ contributors may be used to endorse or promote products derived
+ from this software without specific prior written permission.
+
+THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
diff --git a/pkgs/pubspec_parse/README.md b/pkgs/pubspec_parse/README.md
new file mode 100644
index 0000000..1d04aa4
--- /dev/null
+++ b/pkgs/pubspec_parse/README.md
@@ -0,0 +1,12 @@
+[](https://github.com/dart-lang/tools/actions/workflows/pubspec_parse.yaml)
+[](https://pub.dev/packages/pubspec_parse)
+[](https://pub.dev/packages/pubspec_parse/publisher)
+
+## What's this?
+
+Parses `pubspec.yaml` files with robust error reporting and supports most of
+the documented features.
+
+## More information
+
+Read more about the [pubspec format](https://dart.dev/tools/pub/pubspec).
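
To make the README above concrete, here is a minimal usage sketch of the parsing API this package exposes; the pubspec contents and printed values are illustrative assumptions only:

```dart
import 'package:pubspec_parse/pubspec_parse.dart';

void main() {
  // A small, hypothetical pubspec used only for illustration.
  const yaml = '''
name: example_app
version: 1.2.3
environment:
  sdk: ^3.6.0
dependencies:
  path: ^1.9.0
''';

  // Pubspec.parse reports malformed input with source locations
  // (via package:checked_yaml).
  final pubspec = Pubspec.parse(yaml);
  print(pubspec.name); // example_app
  print(pubspec.version); // 1.2.3
  print(pubspec.dependencies['path']); // HostedDependency: ^1.9.0
}
```
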
diff --git a/pkgs/pubspec_parse/analysis_options.yaml b/pkgs/pubspec_parse/analysis_options.yaml
new file mode 100644
index 0000000..93eeebf
--- /dev/null
+++ b/pkgs/pubspec_parse/analysis_options.yaml
@@ -0,0 +1,30 @@
+# https://dart.dev/guides/language/analysis-options
+include: package:dart_flutter_team_lints/analysis_options.yaml
+
+analyzer:
+ language:
+ strict-casts: true
+ strict-inference: true
+
+linter:
+ rules:
+ - avoid_bool_literals_in_conditional_expressions
+ - avoid_classes_with_only_static_members
+ - avoid_private_typedef_functions
+ - avoid_redundant_argument_values
+ - avoid_returning_this
+ - avoid_unused_constructor_parameters
+ - avoid_void_async
+ - cancel_subscriptions
+ - cascade_invocations
+ - join_return_with_assignment
+ - literal_only_boolean_expressions
+ - missing_whitespace_between_adjacent_strings
+ - no_adjacent_strings_in_list
+ - no_runtimeType_toString
+ - prefer_const_declarations
+ - prefer_expression_function_bodies
+ - prefer_final_locals
+ - require_trailing_commas
+ - unnecessary_await_in_return
+ - use_string_buffers
diff --git a/pkgs/pubspec_parse/build.yaml b/pkgs/pubspec_parse/build.yaml
new file mode 100644
index 0000000..2003bc2
--- /dev/null
+++ b/pkgs/pubspec_parse/build.yaml
@@ -0,0 +1,25 @@
+# Read about `build.yaml` at https://pub.dev/packages/build_config
+# To update generated code, run `dart run build_runner build`
+targets:
+ $default:
+ builders:
+ json_serializable:
+ generate_for:
+ - lib/src/pubspec.dart
+ - lib/src/dependency.dart
+ options:
+ any_map: true
+ checked: true
+ create_to_json: false
+ field_rename: snake
+
+ # The end-user of a builder which applies "source_gen|combining_builder"
+ # may configure the builder to ignore specific lints for their project
+ source_gen|combining_builder:
+ options:
+ ignore_for_file:
+ - deprecated_member_use_from_same_package
+ - lines_longer_than_80_chars
+ - require_trailing_commas
+ # https://github.com/google/json_serializable.dart/issues/945
+ - unnecessary_cast
diff --git a/pkgs/pubspec_parse/dart_test.yaml b/pkgs/pubspec_parse/dart_test.yaml
new file mode 100644
index 0000000..1d7ac69
--- /dev/null
+++ b/pkgs/pubspec_parse/dart_test.yaml
@@ -0,0 +1,3 @@
+tags:
+ presubmit-only:
+ skip: "Should only be run during presubmit"
diff --git a/pkgs/pubspec_parse/lib/pubspec_parse.dart b/pkgs/pubspec_parse/lib/pubspec_parse.dart
new file mode 100644
index 0000000..b5c12e4
--- /dev/null
+++ b/pkgs/pubspec_parse/lib/pubspec_parse.dart
@@ -0,0 +1,14 @@
+// Copyright (c) 2018, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+export 'src/dependency.dart'
+ show
+ Dependency,
+ GitDependency,
+ HostedDependency,
+ HostedDetails,
+ PathDependency,
+ SdkDependency;
+export 'src/pubspec.dart' show Pubspec;
+export 'src/screenshot.dart' show Screenshot;
diff --git a/pkgs/pubspec_parse/lib/src/dependency.dart b/pkgs/pubspec_parse/lib/src/dependency.dart
new file mode 100644
index 0000000..24c65ea
--- /dev/null
+++ b/pkgs/pubspec_parse/lib/src/dependency.dart
@@ -0,0 +1,277 @@
+// Copyright (c) 2018, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:collection/collection.dart';
+import 'package:json_annotation/json_annotation.dart';
+import 'package:pub_semver/pub_semver.dart';
+import 'package:yaml/yaml.dart';
+
+part 'dependency.g.dart';
+
+Map<String, Dependency> parseDeps(Map? source) =>
+ source?.map((k, v) {
+ final key = k as String;
+ Dependency? value;
+ try {
+ value = _fromJson(v, k);
+ } on CheckedFromJsonException catch (e) {
+ if (e.map is! YamlMap) {
+ // This is likely a "synthetic" map created from a String value
+ // Use `source` to throw this exception with an actual YamlMap and
+ // extract the associated error information.
+ throw CheckedFromJsonException(source, key, e.className!, e.message);
+ }
+ rethrow;
+ }
+
+ if (value == null) {
+ throw CheckedFromJsonException(
+ source,
+ key,
+ 'Pubspec',
+ 'Not a valid dependency value.',
+ );
+ }
+ return MapEntry(key, value);
+ }) ??
+ {};
+
+const _sourceKeys = ['sdk', 'git', 'path', 'hosted'];
+
+/// Returns `null` if the data could not be parsed.
+Dependency? _fromJson(Object? data, String name) {
+ if (data is String || data == null) {
+ return _$HostedDependencyFromJson({'version': data});
+ }
+
+ if (data is Map) {
+ final matchedKeys =
+ data.keys.cast<String>().where((key) => key != 'version').toList();
+
+ if (data.isEmpty || (matchedKeys.isEmpty && data.containsKey('version'))) {
+ return _$HostedDependencyFromJson(data);
+ } else {
+ final firstUnrecognizedKey =
+ matchedKeys.firstWhereOrNull((k) => !_sourceKeys.contains(k));
+
+ return $checkedNew<Dependency>('Dependency', data, () {
+ if (firstUnrecognizedKey != null) {
+ throw UnrecognizedKeysException(
+ [firstUnrecognizedKey],
+ data,
+ _sourceKeys,
+ );
+ }
+ if (matchedKeys.length > 1) {
+ throw CheckedFromJsonException(
+ data,
+ matchedKeys[1],
+ 'Dependency',
+ 'A dependency may only have one source.',
+ );
+ }
+
+ final key = matchedKeys.single;
+
+ return switch (key) {
+ 'git' => GitDependency.fromData(data[key]),
+ 'path' => PathDependency.fromData(data[key]),
+ 'sdk' => _$SdkDependencyFromJson(data),
+ 'hosted' => _$HostedDependencyFromJson(data)
+ ..hosted?._nameOfPackage = name,
+ _ => throw StateError('There is a bug in pubspec_parse.'),
+ };
+ });
+ }
+ }
+
+  // Not a String or a Map; return null so the parent logic can throw a
+  // proper error.
+ return null;
+}
+
+sealed class Dependency {}
+
+@JsonSerializable()
+class SdkDependency extends Dependency {
+ final String sdk;
+ @JsonKey(fromJson: _constraintFromString)
+ final VersionConstraint version;
+
+ SdkDependency(this.sdk, {VersionConstraint? version})
+ : version = version ?? VersionConstraint.any;
+
+ @override
+ bool operator ==(Object other) =>
+ other is SdkDependency && other.sdk == sdk && other.version == version;
+
+ @override
+ int get hashCode => Object.hash(sdk, version);
+
+ @override
+ String toString() => 'SdkDependency: $sdk';
+}
+
+@JsonSerializable()
+class GitDependency extends Dependency {
+ @JsonKey(fromJson: parseGitUri)
+ final Uri url;
+ final String? ref;
+ final String? path;
+
+ GitDependency(this.url, {this.ref, this.path});
+
+ factory GitDependency.fromData(Object? data) {
+ if (data is String) {
+ data = {'url': data};
+ }
+
+ if (data is Map) {
+ return _$GitDependencyFromJson(data);
+ }
+
+ throw ArgumentError.value(data, 'git', 'Must be a String or a Map.');
+ }
+
+ @override
+ bool operator ==(Object other) =>
+ other is GitDependency &&
+ other.url == url &&
+ other.ref == ref &&
+ other.path == path;
+
+ @override
+ int get hashCode => Object.hash(url, ref, path);
+
+ @override
+ String toString() => 'GitDependency: url@$url';
+}
+
+Uri? parseGitUriOrNull(String? value) =>
+ value == null ? null : parseGitUri(value);
+
+Uri parseGitUri(String value) => _tryParseScpUri(value) ?? Uri.parse(value);
+
+/// Supports URIs like `[user@]host.xz:path/to/repo.git/`
+/// See https://git-scm.com/docs/git-clone#_git_urls_a_id_urls_a
+Uri? _tryParseScpUri(String value) {
+ final colonIndex = value.indexOf(':');
+
+ if (colonIndex < 0) {
+ return null;
+ } else if (colonIndex == value.indexOf('://')) {
+ // If the first colon is part of a scheme, it's not an scp-like URI
+ return null;
+ }
+ final slashIndex = value.indexOf('/');
+
+ if (slashIndex >= 0 && slashIndex < colonIndex) {
+ // Per docs: This syntax is only recognized if there are no slashes before
+ // the first colon. This helps differentiate a local path that contains a
+ // colon. For example the local path foo:bar could be specified as an
+ // absolute path or ./foo:bar to avoid being misinterpreted as an ssh url.
+ return null;
+ }
+
+ final atIndex = value.indexOf('@');
+ if (colonIndex > atIndex) {
+ final user = atIndex >= 0 ? value.substring(0, atIndex) : null;
+ final host = value.substring(atIndex + 1, colonIndex);
+ final path = value.substring(colonIndex + 1);
+ return Uri(scheme: 'ssh', userInfo: user, host: host, path: path);
+ }
+ return null;
+}
+
+class PathDependency extends Dependency {
+ final String path;
+
+ PathDependency(this.path);
+
+ factory PathDependency.fromData(Object? data) {
+ if (data is String) {
+ return PathDependency(data);
+ }
+ throw ArgumentError.value(data, 'path', 'Must be a String.');
+ }
+
+ @override
+ bool operator ==(Object other) =>
+ other is PathDependency && other.path == path;
+
+ @override
+ int get hashCode => path.hashCode;
+
+ @override
+ String toString() => 'PathDependency: path@$path';
+}
+
+@JsonSerializable(disallowUnrecognizedKeys: true)
+class HostedDependency extends Dependency {
+ @JsonKey(fromJson: _constraintFromString)
+ final VersionConstraint version;
+
+ @JsonKey(disallowNullValue: true)
+ final HostedDetails? hosted;
+
+ HostedDependency({VersionConstraint? version, this.hosted})
+ : version = version ?? VersionConstraint.any;
+
+ @override
+ bool operator ==(Object other) =>
+ other is HostedDependency &&
+ other.version == version &&
+ other.hosted == hosted;
+
+ @override
+ int get hashCode => Object.hash(version, hosted);
+
+ @override
+ String toString() => 'HostedDependency: $version';
+}
+
+@JsonSerializable(disallowUnrecognizedKeys: true)
+class HostedDetails {
+ /// The name of the target dependency as declared in a `hosted` block.
+ ///
+ /// This may be null if no explicit name is present, for instance because the
+ /// hosted dependency was declared as a string (`hosted: pub.example.org`).
+ @JsonKey(name: 'name')
+ final String? declaredName;
+
+ @JsonKey(fromJson: parseGitUriOrNull, disallowNullValue: true)
+ final Uri? url;
+
+ @JsonKey(includeFromJson: false, includeToJson: false)
+ String? _nameOfPackage;
+
+ /// The name of this package on the package repository.
+ ///
+ /// If this hosted block has a [declaredName], that one will be used.
+ /// Otherwise, the name will be inferred from the surrounding package name.
+ String get name => declaredName ?? _nameOfPackage!;
+
+ HostedDetails(this.declaredName, this.url);
+
+ factory HostedDetails.fromJson(Object data) {
+ if (data is String) {
+ data = {'url': data};
+ }
+
+ if (data is Map) {
+ return _$HostedDetailsFromJson(data);
+ }
+
+ throw ArgumentError.value(data, 'hosted', 'Must be a Map or String.');
+ }
+
+ @override
+ bool operator ==(Object other) =>
+ other is HostedDetails && other.name == name && other.url == url;
+
+ @override
+ int get hashCode => Object.hash(name, url);
+}
+
+VersionConstraint _constraintFromString(String? input) =>
+ input == null ? VersionConstraint.any : VersionConstraint.parse(input);
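
The `HostedDetails` name fallback and the scp-like git URL normalization documented in `dependency.dart` above can be seen in a short sketch; the dependency names and hosts below are hypothetical:

```dart
import 'package:pubspec_parse/pubspec_parse.dart';

void main() {
  // Hypothetical dependencies, for illustration only.
  final pubspec = Pubspec.parse('''
name: example_app
dependencies:
  transmogrify:
    hosted: https://some-package-server.com
    version: ^1.4.0
  kittens:
    git: git@github.com:munificent/kittens.git
''');

  // With no explicit `name` in the hosted block, HostedDetails.name falls
  // back to the name of the dependency itself.
  final transmogrify =
      pubspec.dependencies['transmogrify'] as HostedDependency;
  print(transmogrify.hosted!.name); // transmogrify
  print(transmogrify.hosted!.url); // https://some-package-server.com

  // scp-like git URLs are rewritten into ssh:// URIs by parseGitUri.
  final kittens = pubspec.dependencies['kittens'] as GitDependency;
  print(kittens.url); // ssh://git@github.com/munificent/kittens.git
}
```
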
diff --git a/pkgs/pubspec_parse/lib/src/dependency.g.dart b/pkgs/pubspec_parse/lib/src/dependency.g.dart
new file mode 100644
index 0000000..1a504f1
--- /dev/null
+++ b/pkgs/pubspec_parse/lib/src/dependency.g.dart
@@ -0,0 +1,72 @@
+// GENERATED CODE - DO NOT MODIFY BY HAND
+
+// ignore_for_file: deprecated_member_use_from_same_package, lines_longer_than_80_chars, require_trailing_commas, unnecessary_cast
+
+part of 'dependency.dart';
+
+// **************************************************************************
+// JsonSerializableGenerator
+// **************************************************************************
+
+SdkDependency _$SdkDependencyFromJson(Map json) => $checkedCreate(
+ 'SdkDependency',
+ json,
+ ($checkedConvert) {
+ final val = SdkDependency(
+ $checkedConvert('sdk', (v) => v as String),
+ version: $checkedConvert(
+ 'version', (v) => _constraintFromString(v as String?)),
+ );
+ return val;
+ },
+ );
+
+GitDependency _$GitDependencyFromJson(Map json) => $checkedCreate(
+ 'GitDependency',
+ json,
+ ($checkedConvert) {
+ final val = GitDependency(
+ $checkedConvert('url', (v) => parseGitUri(v as String)),
+ ref: $checkedConvert('ref', (v) => v as String?),
+ path: $checkedConvert('path', (v) => v as String?),
+ );
+ return val;
+ },
+ );
+
+HostedDependency _$HostedDependencyFromJson(Map json) => $checkedCreate(
+ 'HostedDependency',
+ json,
+ ($checkedConvert) {
+ $checkKeys(
+ json,
+ allowedKeys: const ['version', 'hosted'],
+ disallowNullValues: const ['hosted'],
+ );
+ final val = HostedDependency(
+ version: $checkedConvert(
+ 'version', (v) => _constraintFromString(v as String?)),
+ hosted: $checkedConvert('hosted',
+ (v) => v == null ? null : HostedDetails.fromJson(v as Object)),
+ );
+ return val;
+ },
+ );
+
+HostedDetails _$HostedDetailsFromJson(Map json) => $checkedCreate(
+ 'HostedDetails',
+ json,
+ ($checkedConvert) {
+ $checkKeys(
+ json,
+ allowedKeys: const ['name', 'url'],
+ disallowNullValues: const ['url'],
+ );
+ final val = HostedDetails(
+ $checkedConvert('name', (v) => v as String?),
+ $checkedConvert('url', (v) => parseGitUriOrNull(v as String?)),
+ );
+ return val;
+ },
+ fieldKeyMap: const {'declaredName': 'name'},
+ );
diff --git a/pkgs/pubspec_parse/lib/src/pubspec.dart b/pkgs/pubspec_parse/lib/src/pubspec.dart
new file mode 100644
index 0000000..eb77908
--- /dev/null
+++ b/pkgs/pubspec_parse/lib/src/pubspec.dart
@@ -0,0 +1,258 @@
+// Copyright (c) 2018, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:checked_yaml/checked_yaml.dart';
+import 'package:json_annotation/json_annotation.dart';
+import 'package:pub_semver/pub_semver.dart';
+
+import 'dependency.dart';
+import 'screenshot.dart';
+
+part 'pubspec.g.dart';
+
+@JsonSerializable()
+class Pubspec {
+ // TODO: executables
+
+ final String name;
+
+ @JsonKey(fromJson: _versionFromString)
+ final Version? version;
+
+ final String? description;
+
+ /// This should be a URL pointing to the website for the package.
+ final String? homepage;
+
+ /// Specifies where to publish this package.
+ ///
+ /// Accepted values: `null`, `'none'` or an `http` or `https` URL.
+ ///
+ /// [More information](https://dart.dev/tools/pub/pubspec#publish_to).
+ final String? publishTo;
+
+ /// Optional field to specify the source code repository of the package.
+ /// Useful when a package has both a home page and a repository.
+ final Uri? repository;
+
+  /// Optional field pointing to a web page where developers can report new
+  /// issues or view existing ones.
+ final Uri? issueTracker;
+
+ /// Optional field to list the URLs where the package authors accept
+ /// support or funding.
+ final List<Uri>? funding;
+
+  /// Optional field to list the topics that this package belongs to.
+ final List<String>? topics;
+
+ /// Optional field to list advisories to be ignored by the client.
+ final List<String>? ignoredAdvisories;
+
+ /// Optional field for specifying included screenshot files.
+ @JsonKey(fromJson: parseScreenshots)
+ final List<Screenshot>? screenshots;
+
+ /// If there is exactly 1 value in [authors], returns it.
+ ///
+ /// If there are 0 or more than 1, returns `null`.
+ @Deprecated(
+ 'See https://dart.dev/tools/pub/pubspec#authorauthors',
+ )
+ String? get author {
+ if (authors.length == 1) {
+ return authors.single;
+ }
+ return null;
+ }
+
+ @Deprecated(
+ 'See https://dart.dev/tools/pub/pubspec#authorauthors',
+ )
+ final List<String> authors;
+ final String? documentation;
+
+ @JsonKey(fromJson: _environmentMap)
+ final Map<String, VersionConstraint?> environment;
+
+ @JsonKey(fromJson: parseDeps)
+ final Map<String, Dependency> dependencies;
+
+ @JsonKey(fromJson: parseDeps)
+ final Map<String, Dependency> devDependencies;
+
+ @JsonKey(fromJson: parseDeps)
+ final Map<String, Dependency> dependencyOverrides;
+
+ /// Optional configuration specific to [Flutter](https://flutter.io/)
+ /// packages.
+ ///
+ /// May include
+ /// [assets](https://flutter.io/docs/development/ui/assets-and-images)
+ /// and other settings.
+ final Map<String, dynamic>? flutter;
+
+  /// Optional field to specify executables.
+ @JsonKey(fromJson: _executablesMap)
+ final Map<String, String?> executables;
+
+ /// If this package is a Pub Workspace, this field lists the sub-packages.
+ final List<String>? workspace;
+
+ /// Specifies how to resolve dependencies with the surrounding Pub Workspace.
+ final String? resolution;
+
+ /// If [author] and [authors] are both provided, their values are combined
+ /// with duplicates eliminated.
+ Pubspec(
+ this.name, {
+ this.version,
+ this.publishTo,
+ @Deprecated(
+ 'See https://dart.dev/tools/pub/pubspec#authorauthors',
+ )
+ String? author,
+ @Deprecated(
+ 'See https://dart.dev/tools/pub/pubspec#authorauthors',
+ )
+ List<String>? authors,
+ Map<String, VersionConstraint?>? environment,
+ this.homepage,
+ this.repository,
+ this.issueTracker,
+ this.funding,
+ this.topics,
+ this.ignoredAdvisories,
+ this.screenshots,
+ this.documentation,
+ this.description,
+ this.workspace,
+ this.resolution,
+ Map<String, Dependency>? dependencies,
+ Map<String, Dependency>? devDependencies,
+ Map<String, Dependency>? dependencyOverrides,
+ this.flutter,
+ Map<String, String?>? executables,
+ }) :
+ // ignore: deprecated_member_use_from_same_package
+ authors = _normalizeAuthors(author, authors),
+ environment = environment ?? const {},
+ dependencies = dependencies ?? const {},
+ devDependencies = devDependencies ?? const {},
+ executables = executables ?? const {},
+ dependencyOverrides = dependencyOverrides ?? const {} {
+ if (name.isEmpty) {
+ throw ArgumentError.value(name, 'name', '"name" cannot be empty.');
+ }
+
+ if (publishTo != null && publishTo != 'none') {
+ try {
+ final targetUri = Uri.parse(publishTo!);
+ if (!(targetUri.isScheme('http') || targetUri.isScheme('https'))) {
+ throw const FormatException('Must be an http or https URL.');
+ }
+ } on FormatException catch (e) {
+ throw ArgumentError.value(publishTo, 'publishTo', e.message);
+ }
+ }
+ }
+
+ factory Pubspec.fromJson(Map json, {bool lenient = false}) {
+ if (lenient) {
+ while (json.isNotEmpty) {
+ // Attempting to remove top-level properties that cause parsing errors.
+ try {
+ return _$PubspecFromJson(json);
+ } on CheckedFromJsonException catch (e) {
+ if (e.map == json && json.containsKey(e.key)) {
+ json = Map.from(json)..remove(e.key);
+ continue;
+ }
+ rethrow;
+ }
+ }
+ }
+
+ return _$PubspecFromJson(json);
+ }
+
+ /// Parses source [yaml] into [Pubspec].
+ ///
+ /// When [lenient] is set, top-level property-parsing or type cast errors are
+ /// ignored and `null` values are returned.
+ factory Pubspec.parse(String yaml, {Uri? sourceUrl, bool lenient = false}) =>
+ checkedYamlDecode(
+ yaml,
+ (map) => Pubspec.fromJson(map!, lenient: lenient),
+ sourceUrl: sourceUrl,
+ );
+
+ static List<String> _normalizeAuthors(String? author, List<String>? authors) {
+ final value = <String>{
+ if (author != null) author,
+ ...?authors,
+ };
+ return value.toList();
+ }
+}
+
+Version? _versionFromString(String? input) =>
+ input == null ? null : Version.parse(input);
+
+Map<String, VersionConstraint?> _environmentMap(Map? source) =>
+ source?.map((k, value) {
+ final key = k as String;
+ if (key == 'dart') {
+ // github.com/dart-lang/pub/blob/d84173eeb03c3/lib/src/pubspec.dart#L342
+ // 'dart' is not allowed as a key!
+ throw CheckedFromJsonException(
+ source,
+ 'dart',
+ 'VersionConstraint',
+ 'Use "sdk" to for Dart SDK constraints.',
+ badKey: true,
+ );
+ }
+
+ VersionConstraint? constraint;
+ if (value == null) {
+ constraint = null;
+ } else if (value is String) {
+ try {
+ constraint = VersionConstraint.parse(value);
+ } on FormatException catch (e) {
+ throw CheckedFromJsonException(source, key, 'Pubspec', e.message);
+ }
+
+ return MapEntry(key, constraint);
+ } else {
+ throw CheckedFromJsonException(
+ source,
+ key,
+ 'VersionConstraint',
+ '`$value` is not a String.',
+ );
+ }
+
+ return MapEntry(key, constraint);
+ }) ??
+ {};
+
+Map<String, String?> _executablesMap(Map? source) =>
+ source?.map((k, value) {
+ final key = k as String;
+ if (value == null) {
+ return MapEntry(key, null);
+ } else if (value is String) {
+ return MapEntry(key, value);
+ } else {
+ throw CheckedFromJsonException(
+ source,
+ key,
+ 'String',
+ '`$value` is not a String.',
+ );
+ }
+ }) ??
+ {};
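
A quick sketch of the `lenient` behavior implemented in `Pubspec.fromJson` above, which drops top-level keys that fail to parse and retries; the malformed `funding` value is a made-up example:

```dart
import 'package:pubspec_parse/pubspec_parse.dart';

void main() {
  // `funding` should be a list of URLs, so this (hypothetical) value would
  // fail strict parsing.
  const yaml = '''
name: example_app
version: 1.0.0
funding: not-a-list
''';

  // With lenient parsing, the offending top-level key is removed and the
  // remaining fields are still returned.
  final pubspec = Pubspec.parse(yaml, lenient: true);
  print(pubspec.name); // example_app
  print(pubspec.funding); // null
}
```
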
diff --git a/pkgs/pubspec_parse/lib/src/pubspec.g.dart b/pkgs/pubspec_parse/lib/src/pubspec.g.dart
new file mode 100644
index 0000000..58e015a
--- /dev/null
+++ b/pkgs/pubspec_parse/lib/src/pubspec.g.dart
@@ -0,0 +1,69 @@
+// GENERATED CODE - DO NOT MODIFY BY HAND
+
+// ignore_for_file: deprecated_member_use_from_same_package, lines_longer_than_80_chars, require_trailing_commas, unnecessary_cast
+
+part of 'pubspec.dart';
+
+// **************************************************************************
+// JsonSerializableGenerator
+// **************************************************************************
+
+Pubspec _$PubspecFromJson(Map json) => $checkedCreate(
+ 'Pubspec',
+ json,
+ ($checkedConvert) {
+ final val = Pubspec(
+ $checkedConvert('name', (v) => v as String),
+ version: $checkedConvert(
+ 'version', (v) => _versionFromString(v as String?)),
+ publishTo: $checkedConvert('publish_to', (v) => v as String?),
+ author: $checkedConvert('author', (v) => v as String?),
+ authors: $checkedConvert('authors',
+ (v) => (v as List<dynamic>?)?.map((e) => e as String).toList()),
+ environment:
+ $checkedConvert('environment', (v) => _environmentMap(v as Map?)),
+ homepage: $checkedConvert('homepage', (v) => v as String?),
+ repository: $checkedConvert(
+ 'repository', (v) => v == null ? null : Uri.parse(v as String)),
+ issueTracker: $checkedConvert('issue_tracker',
+ (v) => v == null ? null : Uri.parse(v as String)),
+ funding: $checkedConvert(
+ 'funding',
+ (v) => (v as List<dynamic>?)
+ ?.map((e) => Uri.parse(e as String))
+ .toList()),
+ topics: $checkedConvert('topics',
+ (v) => (v as List<dynamic>?)?.map((e) => e as String).toList()),
+ ignoredAdvisories: $checkedConvert('ignored_advisories',
+ (v) => (v as List<dynamic>?)?.map((e) => e as String).toList()),
+ screenshots: $checkedConvert(
+ 'screenshots', (v) => parseScreenshots(v as List?)),
+ documentation: $checkedConvert('documentation', (v) => v as String?),
+ description: $checkedConvert('description', (v) => v as String?),
+ workspace: $checkedConvert('workspace',
+ (v) => (v as List<dynamic>?)?.map((e) => e as String).toList()),
+ resolution: $checkedConvert('resolution', (v) => v as String?),
+ dependencies:
+ $checkedConvert('dependencies', (v) => parseDeps(v as Map?)),
+ devDependencies:
+ $checkedConvert('dev_dependencies', (v) => parseDeps(v as Map?)),
+ dependencyOverrides: $checkedConvert(
+ 'dependency_overrides', (v) => parseDeps(v as Map?)),
+ flutter: $checkedConvert(
+ 'flutter',
+ (v) => (v as Map?)?.map(
+ (k, e) => MapEntry(k as String, e),
+ )),
+ executables:
+ $checkedConvert('executables', (v) => _executablesMap(v as Map?)),
+ );
+ return val;
+ },
+ fieldKeyMap: const {
+ 'publishTo': 'publish_to',
+ 'issueTracker': 'issue_tracker',
+ 'ignoredAdvisories': 'ignored_advisories',
+ 'devDependencies': 'dev_dependencies',
+ 'dependencyOverrides': 'dependency_overrides'
+ },
+ );
diff --git a/pkgs/pubspec_parse/lib/src/screenshot.dart b/pkgs/pubspec_parse/lib/src/screenshot.dart
new file mode 100644
index 0000000..f5f0be2
--- /dev/null
+++ b/pkgs/pubspec_parse/lib/src/screenshot.dart
@@ -0,0 +1,65 @@
+// Copyright (c) 2021, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:json_annotation/json_annotation.dart';
+
+@JsonSerializable()
+class Screenshot {
+ final String description;
+ final String path;
+
+ Screenshot(this.description, this.path);
+}
+
+List<Screenshot> parseScreenshots(List? input) {
+ final res = <Screenshot>[];
+ if (input == null) {
+ return res;
+ }
+
+ for (final e in input) {
+ if (e is! Map) continue;
+
+ final description = e['description'];
+ if (description == null) {
+ throw CheckedFromJsonException(
+ e,
+ 'description',
+ 'Screenshot',
+ 'Missing required key `description`',
+ );
+ }
+
+ if (description is! String) {
+ throw CheckedFromJsonException(
+ e,
+ 'description',
+ 'Screenshot',
+ '`$description` is not a String',
+ );
+ }
+
+ final path = e['path'];
+ if (path == null) {
+ throw CheckedFromJsonException(
+ e,
+ 'path',
+ 'Screenshot',
+ 'Missing required key `path`',
+ );
+ }
+
+ if (path is! String) {
+ throw CheckedFromJsonException(
+ e,
+ 'path',
+ 'Screenshot',
+ '`$path` is not a String',
+ );
+ }
+
+ res.add(Screenshot(description, path));
+ }
+ return res;
+}
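
For a quick sense of the behavior above: `parseScreenshots` silently skips list entries that are not maps, and throws a `CheckedFromJsonException` when a map is missing `description` or `path`. A minimal sketch, not part of this diff (the `src/` import path is internal and assumed here only for illustration):

```dart
import 'package:json_annotation/json_annotation.dart';
// Internal path, assumed here only for illustration.
import 'package:pubspec_parse/src/screenshot.dart';

void main() {
  // The non-map entry (42) is ignored; the valid map is kept.
  final shots = parseScreenshots([
    42,
    {'description': 'home screen', 'path': 'doc/home.png'},
  ]);
  print(shots.single.path); // doc/home.png

  // A map without `path` is rejected.
  try {
    parseScreenshots([
      {'description': 'missing path'},
    ]);
  } on CheckedFromJsonException catch (e) {
    print(e.message); // Missing required key `path`
  }
}
```
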
diff --git a/pkgs/pubspec_parse/pubspec.yaml b/pkgs/pubspec_parse/pubspec.yaml
new file mode 100644
index 0000000..73a1117
--- /dev/null
+++ b/pkgs/pubspec_parse/pubspec.yaml
@@ -0,0 +1,32 @@
+name: pubspec_parse
+version: 1.5.0
+description: >-
+ Simple package for parsing pubspec.yaml files with a type-safe API and rich
+ error reporting.
+repository: https://github.com/dart-lang/tools/tree/main/pkgs/pubspec_parse
+
+topics:
+- dart-pub
+
+environment:
+ sdk: ^3.6.0
+
+dependencies:
+ checked_yaml: ^2.0.1
+ collection: ^1.19.0
+ json_annotation: ^4.9.0
+ pub_semver: ^2.1.4
+ yaml: ^3.0.0
+
+dev_dependencies:
+ build_runner: ^2.4.6
+ build_verify: ^3.0.0
+ dart_flutter_team_lints: ^3.0.0
+ json_serializable: ^6.9.1
+ path: ^1.9.0
+ # Needed because we are configuring `combining_builder`
+ source_gen: ^2.0.0
+ stack_trace: ^1.10.0
+ test: ^1.24.4
+ test_descriptor: ^2.0.0
+ test_process: ^2.0.0
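
As context for the tests that follow, a minimal sketch of the package's public entry point (not part of this diff; the YAML literal is made up):

```dart
import 'package:pubspec_parse/pubspec_parse.dart';

void main() {
  const source = '''
name: sample
environment:
  sdk: ^3.6.0
dependencies:
  path: ^1.9.0
''';

  final pubspec = Pubspec.parse(source);
  print(pubspec.name); // sample
  print(pubspec.dependencies.keys); // (path)
}
```
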
diff --git a/pkgs/pubspec_parse/test/dependency_test.dart b/pkgs/pubspec_parse/test/dependency_test.dart
new file mode 100644
index 0000000..f1e4f57
--- /dev/null
+++ b/pkgs/pubspec_parse/test/dependency_test.dart
@@ -0,0 +1,446 @@
+// Copyright (c) 2018, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:io';
+
+import 'package:pub_semver/pub_semver.dart';
+import 'package:pubspec_parse/pubspec_parse.dart';
+import 'package:test/test.dart';
+
+import 'test_utils.dart';
+
+void main() {
+ group('hosted', _hostedDependency);
+ group('git', _gitDependency);
+ group('sdk', _sdkDependency);
+ group('path', _pathDependency);
+
+ group('errors', () {
+ test('List', () {
+ _expectThrows(
+ [],
+ r'''
+line 4, column 10: Unsupported value for "dep". Not a valid dependency value.
+ ╷
+4 │ "dep": []
+ │ ^^
+ ╵''',
+ );
+ });
+
+ test('int', () {
+ _expectThrows(
+ 42,
+ r'''
+line 4, column 10: Unsupported value for "dep". Not a valid dependency value.
+ ╷
+4 │ "dep": 42
+ │ ┌──────────^
+5 │ │ }
+ │ └─^
+ ╵''',
+ );
+ });
+
+ test('map with too many keys', () {
+ _expectThrows(
+ {'path': 'a', 'git': 'b'},
+ r'''
+line 6, column 11: Unsupported value for "git". A dependency may only have one source.
+ ╷
+6 │ "git": "b"
+ │ ^^^
+ ╵''',
+ );
+ });
+
+ test('map with unsupported keys', () {
+ _expectThrows(
+ {'bob': 'a', 'jones': 'b'},
+ r'''
+line 5, column 4: Unrecognized keys: [bob]; supported keys: [sdk, git, path, hosted]
+ ╷
+5 │ "bob": "a",
+ │ ^^^^^
+ ╵''',
+ );
+ });
+ });
+}
+
+void _hostedDependency() {
+ test('null', () async {
+ final dep = await _dependency<HostedDependency>(null);
+ expect(dep.version.toString(), 'any');
+ expect(dep.hosted, isNull);
+ expect(dep.toString(), 'HostedDependency: any');
+ });
+
+ test('empty map', () async {
+ final dep = await _dependency<HostedDependency>({});
+ expect(dep.hosted, isNull);
+ expect(dep.toString(), 'HostedDependency: any');
+ });
+
+ test('string version', () async {
+ final dep = await _dependency<HostedDependency>('^1.0.0');
+ expect(dep.version.toString(), '^1.0.0');
+ expect(dep.hosted, isNull);
+ expect(dep.toString(), 'HostedDependency: ^1.0.0');
+ });
+
+ test('bad string version', () {
+ _expectThrows(
+ 'not a version',
+ r'''
+line 4, column 10: Unsupported value for "dep". Could not parse version "not a version". Unknown text at "not a version".
+ ╷
+4 │ "dep": "not a version"
+ │ ^^^^^^^^^^^^^^^
+ ╵''',
+ );
+ });
+
+ test('map w/ just version', () async {
+ final dep = await _dependency<HostedDependency>({'version': '^1.0.0'});
+ expect(dep.version.toString(), '^1.0.0');
+ expect(dep.hosted, isNull);
+ expect(dep.toString(), 'HostedDependency: ^1.0.0');
+ });
+
+ test('map w/ version and hosted as Map', () async {
+ final dep = await _dependency<HostedDependency>({
+ 'version': '^1.0.0',
+ 'hosted': {'name': 'hosted_name', 'url': 'https://hosted_url'},
+ });
+ expect(dep.version.toString(), '^1.0.0');
+ expect(dep.hosted!.name, 'hosted_name');
+ expect(dep.hosted!.url.toString(), 'https://hosted_url');
+ expect(dep.toString(), 'HostedDependency: ^1.0.0');
+ });
+
+ test('map /w hosted as a map without name', () async {
+ final dep = await _dependency<HostedDependency>(
+ {
+ 'version': '^1.0.0',
+ 'hosted': {'url': 'https://hosted_url'},
+ },
+ skipTryPub: true, // todo: Unskip once pub supports this syntax
+ );
+ expect(dep.version.toString(), '^1.0.0');
+ expect(dep.hosted!.declaredName, isNull);
+ expect(dep.hosted!.name, 'dep');
+ expect(dep.hosted!.url.toString(), 'https://hosted_url');
+ expect(dep.toString(), 'HostedDependency: ^1.0.0');
+ });
+
+ test('map w/ bad version value', () {
+ _expectThrows(
+ {
+ 'version': 'not a version',
+ 'hosted': {'name': 'hosted_name', 'url': 'hosted_url'},
+ },
+ r'''
+line 5, column 15: Unsupported value for "version". Could not parse version "not a version". Unknown text at "not a version".
+ ╷
+5 │ "version": "not a version",
+ │ ^^^^^^^^^^^^^^^
+ ╵''',
+ );
+ });
+
+ test('map w/ extra keys should fail', () {
+ _expectThrows(
+ {
+ 'version': '^1.0.0',
+ 'hosted': {'name': 'hosted_name', 'url': 'hosted_url'},
+ 'not_supported': null,
+ },
+ r'''
+line 10, column 4: Unrecognized keys: [not_supported]; supported keys: [sdk, git, path, hosted]
+ ╷
+10 │ "not_supported": null
+ │ ^^^^^^^^^^^^^^^
+ ╵''',
+ );
+ });
+
+ test('map w/ version and hosted as String', () async {
+ final dep = await _dependency<HostedDependency>(
+ {'version': '^1.0.0', 'hosted': 'hosted_url'},
+      skipTryPub: true, // todo: Unskip once pub supports this
+ );
+ expect(dep.version.toString(), '^1.0.0');
+ expect(dep.hosted!.declaredName, isNull);
+ expect(dep.hosted!.name, 'dep');
+ expect(dep.hosted!.url, Uri.parse('hosted_url'));
+ expect(dep.toString(), 'HostedDependency: ^1.0.0');
+ });
+
+ test('map w/ hosted as String', () async {
+ final dep = await _dependency<HostedDependency>({'hosted': 'hosted_url'});
+ expect(dep.version, VersionConstraint.any);
+ expect(dep.hosted!.declaredName, isNull);
+ expect(dep.hosted!.name, 'dep');
+ expect(dep.hosted!.url, Uri.parse('hosted_url'));
+ expect(dep.toString(), 'HostedDependency: any');
+ });
+
+ test('map w/ null hosted should error', () {
+ _expectThrows(
+ {'hosted': null},
+ r'''
+line 5, column 4: These keys had `null` values, which is not allowed: [hosted]
+ ╷
+5 │ "hosted": null
+ │ ^^^^^^^^
+ ╵''',
+ );
+ });
+
+ test('map w/ null version is fine', () async {
+ final dep = await _dependency<HostedDependency>({'version': null});
+ expect(dep.version, VersionConstraint.any);
+ expect(dep.hosted, isNull);
+ expect(dep.toString(), 'HostedDependency: any');
+ });
+}
+
+void _sdkDependency() {
+ test('without version', () async {
+ final dep = await _dependency<SdkDependency>({'sdk': 'flutter'});
+ expect(dep.sdk, 'flutter');
+ expect(dep.version, VersionConstraint.any);
+ expect(dep.toString(), 'SdkDependency: flutter');
+ });
+
+ test('with version', () async {
+ final dep = await _dependency<SdkDependency>(
+ {'sdk': 'flutter', 'version': '>=1.2.3 <2.0.0'},
+ );
+ expect(dep.sdk, 'flutter');
+ expect(dep.version.toString(), '>=1.2.3 <2.0.0');
+ expect(dep.toString(), 'SdkDependency: flutter');
+ });
+
+ test('null content', () {
+ _expectThrowsContaining(
+ {'sdk': null},
+ r"type 'Null' is not a subtype of type 'String'",
+ );
+ });
+
+ test('number content', () {
+ _expectThrowsContaining(
+ {'sdk': 42},
+ r"type 'int' is not a subtype of type 'String'",
+ );
+ });
+}
+
+void _gitDependency() {
+ test('string', () async {
+ final dep = await _dependency<GitDependency>({'git': 'url'});
+ expect(dep.url.toString(), 'url');
+ expect(dep.path, isNull);
+ expect(dep.ref, isNull);
+ expect(dep.toString(), 'GitDependency: url@url');
+ });
+
+ test('string with version key is ignored', () async {
+ // Regression test for https://github.com/dart-lang/pubspec_parse/issues/13
+ final dep =
+ await _dependency<GitDependency>({'git': 'url', 'version': '^1.2.3'});
+ expect(dep.url.toString(), 'url');
+ expect(dep.path, isNull);
+ expect(dep.ref, isNull);
+ expect(dep.toString(), 'GitDependency: url@url');
+ });
+
+ test('string with user@ URL', () async {
+ final skipTryParse = Platform.environment.containsKey('TRAVIS');
+ if (skipTryParse) {
+ print('FYI: not validating git@ URI on travis due to failure');
+ }
+ final dep = await _dependency<GitDependency>(
+ {'git': 'git@localhost:dep.git'},
+ skipTryPub: skipTryParse,
+ );
+ expect(dep.url.toString(), 'ssh://git@localhost/dep.git');
+ expect(dep.path, isNull);
+ expect(dep.ref, isNull);
+ expect(dep.toString(), 'GitDependency: url@ssh://git@localhost/dep.git');
+ });
+
+ test('string with random extra key fails', () {
+ _expectThrows(
+ {'git': 'url', 'bob': '^1.2.3'},
+ r'''
+line 6, column 4: Unrecognized keys: [bob]; supported keys: [sdk, git, path, hosted]
+ ╷
+6 │ "bob": "^1.2.3"
+ │ ^^^^^
+ ╵''',
+ );
+ });
+
+ test('map', () async {
+ final dep = await _dependency<GitDependency>({
+ 'git': {'url': 'url', 'path': 'path', 'ref': 'ref'},
+ });
+ expect(dep.url.toString(), 'url');
+ expect(dep.path, 'path');
+ expect(dep.ref, 'ref');
+ expect(dep.toString(), 'GitDependency: url@url');
+ });
+
+ test('git - null content', () {
+ _expectThrows(
+ {'git': null},
+ r'''
+line 5, column 11: Unsupported value for "git". Must be a String or a Map.
+ ╷
+5 │ "git": null
+ │ ┌───────────^
+6 │ │ }
+ │ └──^
+ ╵''',
+ );
+ });
+
+ test('git - int content', () {
+ _expectThrows(
+ {'git': 42},
+ r'''
+line 5, column 11: Unsupported value for "git". Must be a String or a Map.
+ ╷
+5 │ "git": 42
+ │ ┌───────────^
+6 │ │ }
+ │ └──^
+ ╵''',
+ );
+ });
+
+ test('git - empty map', () {
+ _expectThrowsContaining(
+ {'git': <String, dynamic>{}},
+ r"type 'Null' is not a subtype of type 'String'",
+ );
+ });
+
+ test('git - null url', () {
+ _expectThrowsContaining(
+ {
+ 'git': {'url': null},
+ },
+ r"type 'Null' is not a subtype of type 'String'",
+ );
+ });
+
+ test('git - int url', () {
+ _expectThrowsContaining(
+ {
+ 'git': {'url': 42},
+ },
+ r"type 'int' is not a subtype of type 'String'",
+ );
+ });
+}
+
+void _pathDependency() {
+ test('valid', () async {
+ final dep = await _dependency<PathDependency>({'path': '../path'});
+ expect(dep.path, '../path');
+ expect(dep.toString(), 'PathDependency: path@../path');
+ });
+
+ test('valid with version key is ignored', () async {
+ final dep = await _dependency<PathDependency>(
+ {'path': '../path', 'version': '^1.2.3'},
+ );
+ expect(dep.path, '../path');
+ expect(dep.toString(), 'PathDependency: path@../path');
+ });
+
+ test('valid with random extra key fails', () {
+ _expectThrows(
+ {'path': '../path', 'bob': '^1.2.3'},
+ r'''
+line 6, column 4: Unrecognized keys: [bob]; supported keys: [sdk, git, path, hosted]
+ ╷
+6 │ "bob": "^1.2.3"
+ │ ^^^^^
+ ╵''',
+ );
+ });
+
+ test('null content', () {
+ _expectThrows(
+ {'path': null},
+ r'''
+line 5, column 12: Unsupported value for "path". Must be a String.
+ ╷
+5 │ "path": null
+ │ ┌────────────^
+6 │ │ }
+ │ └──^
+ ╵''',
+ );
+ });
+
+ test('int content', () {
+ _expectThrows(
+ {'path': 42},
+ r'''
+line 5, column 12: Unsupported value for "path". Must be a String.
+ ╷
+5 │ "path": 42
+ │ ┌────────────^
+6 │ │ }
+ │ └──^
+ ╵''',
+ );
+ });
+}
+
+void _expectThrows(Object content, String expectedError) {
+ expectParseThrows(
+ {
+ 'name': 'sample',
+ 'dependencies': {'dep': content},
+ },
+ expectedError,
+ );
+}
+
+void _expectThrowsContaining(Object content, String errorText) {
+ expectParseThrowsContaining(
+ {
+ 'name': 'sample',
+ 'dependencies': {'dep': content},
+ },
+ errorText,
+ );
+}
+
+Future<T> _dependency<T extends Dependency>(
+ Object? content, {
+ bool skipTryPub = false,
+}) async {
+ final value = await parse(
+ {
+ ...defaultPubspec,
+ 'dependencies': {'dep': content},
+ },
+ skipTryPub: skipTryPub,
+ );
+ expect(value.name, 'sample');
+ expect(value.dependencies, hasLength(1));
+
+ final entry = value.dependencies.entries.single;
+ expect(entry.key, 'dep');
+
+ return entry.value as T;
+}
diff --git a/pkgs/pubspec_parse/test/ensure_build_test.dart b/pkgs/pubspec_parse/test/ensure_build_test.dart
new file mode 100644
index 0000000..0e4371c
--- /dev/null
+++ b/pkgs/pubspec_parse/test/ensure_build_test.dart
@@ -0,0 +1,18 @@
+// Copyright (c) 2018, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+@Timeout.factor(2)
+@TestOn('vm')
+@Tags(['presubmit-only'])
+library;
+
+import 'package:build_verify/build_verify.dart';
+import 'package:test/test.dart';
+
+void main() {
+ test(
+ 'ensure_build',
+ () => expectBuildClean(packageRelativeDirectory: 'pkgs/pubspec_parse/'),
+ );
+}
diff --git a/pkgs/pubspec_parse/test/git_uri_test.dart b/pkgs/pubspec_parse/test/git_uri_test.dart
new file mode 100644
index 0000000..be89ba8
--- /dev/null
+++ b/pkgs/pubspec_parse/test/git_uri_test.dart
@@ -0,0 +1,25 @@
+import 'package:pubspec_parse/src/dependency.dart';
+import 'package:test/test.dart';
+
+void main() {
+ for (var item in {
+ 'git@github.com:google/grinder.dart.git':
+ 'ssh://git@github.com/google/grinder.dart.git',
+ 'host.xz:path/to/repo.git/': 'ssh://host.xz/path/to/repo.git/',
+ 'http:path/to/repo.git/': 'ssh://http/path/to/repo.git/',
+ 'file:path/to/repo.git/': 'ssh://file/path/to/repo.git/',
+ './foo:bar': 'foo%3Abar',
+ '/path/to/repo.git/': '/path/to/repo.git/',
+ 'file:///path/to/repo.git/': 'file:///path/to/repo.git/',
+ }.entries) {
+ test(item.key, () {
+ final uri = parseGitUri(item.key);
+
+ printOnFailure(
+ [uri.scheme, uri.userInfo, uri.host, uri.port, uri.path].join('\n'),
+ );
+
+ expect(uri, Uri.parse(item.value));
+ });
+ }
+}
diff --git a/pkgs/pubspec_parse/test/parse_test.dart b/pkgs/pubspec_parse/test/parse_test.dart
new file mode 100644
index 0000000..e0698af
--- /dev/null
+++ b/pkgs/pubspec_parse/test/parse_test.dart
@@ -0,0 +1,827 @@
+// Copyright (c) 2018, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+// ignore_for_file: deprecated_member_use_from_same_package
+// ignore_for_file: lines_longer_than_80_chars
+
+import 'package:pub_semver/pub_semver.dart';
+import 'package:test/test.dart';
+
+import 'test_utils.dart';
+
+void main() {
+ test('minimal set values', () async {
+ final value = await parse(defaultPubspec);
+ expect(value.name, 'sample');
+ expect(value.version, isNull);
+ expect(value.publishTo, isNull);
+ expect(value.description, isNull);
+ expect(value.homepage, isNull);
+ expect(value.author, isNull);
+ expect(value.authors, isEmpty);
+ expect(
+ value.environment,
+ {'sdk': VersionConstraint.parse('>=2.12.0 <3.0.0')},
+ );
+ expect(value.documentation, isNull);
+ expect(value.dependencies, isEmpty);
+ expect(value.devDependencies, isEmpty);
+ expect(value.dependencyOverrides, isEmpty);
+ expect(value.flutter, isNull);
+ expect(value.repository, isNull);
+ expect(value.issueTracker, isNull);
+ expect(value.screenshots, isEmpty);
+ expect(value.workspace, isNull);
+ expect(value.resolution, isNull);
+ expect(value.executables, isEmpty);
+ });
+
+ test('all fields set', () async {
+ final version = Version.parse('1.2.3');
+ final sdkConstraint = VersionConstraint.parse('>=3.6.0 <4.0.0');
+ final value = await parse(
+ {
+ 'name': 'sample',
+ 'version': version.toString(),
+ 'publish_to': 'none',
+ 'author': 'name@example.com',
+ 'environment': {'sdk': sdkConstraint.toString()},
+ 'description': 'description',
+ 'homepage': 'homepage',
+ 'documentation': 'documentation',
+ 'repository': 'https://github.com/example/repo',
+ 'issue_tracker': 'https://github.com/example/repo/issues',
+ 'funding': [
+ 'https://patreon.com/example',
+ ],
+ 'topics': ['widget', 'button'],
+ 'ignored_advisories': ['111', '222'],
+ 'screenshots': [
+ {'description': 'my screenshot', 'path': 'path/to/screenshot'},
+ ],
+ 'workspace': [
+ 'pkg1',
+ 'pkg2',
+ ],
+ 'resolution': 'workspace',
+ 'executables': {
+ 'my_script': 'bin/my_script.dart',
+ 'my_script2': 'bin/my_script2.dart',
+ },
+ },
+ skipTryPub: true,
+ );
+ expect(value.name, 'sample');
+ expect(value.version, version);
+ expect(value.publishTo, 'none');
+ expect(value.description, 'description');
+ expect(value.homepage, 'homepage');
+ expect(value.author, 'name@example.com');
+ expect(value.authors, ['name@example.com']);
+ expect(value.environment, hasLength(1));
+ expect(value.environment, containsPair('sdk', sdkConstraint));
+ expect(value.documentation, 'documentation');
+ expect(value.dependencies, isEmpty);
+ expect(value.devDependencies, isEmpty);
+ expect(value.dependencyOverrides, isEmpty);
+ expect(value.repository, Uri.parse('https://github.com/example/repo'));
+ expect(
+ value.issueTracker,
+ Uri.parse('https://github.com/example/repo/issues'),
+ );
+ expect(value.funding, hasLength(1));
+ expect(value.funding!.single.toString(), 'https://patreon.com/example');
+ expect(value.topics, hasLength(2));
+ expect(value.topics!.first, 'widget');
+ expect(value.topics!.last, 'button');
+ expect(value.ignoredAdvisories, hasLength(2));
+ expect(value.ignoredAdvisories!.first, '111');
+ expect(value.ignoredAdvisories!.last, '222');
+ expect(value.screenshots, hasLength(1));
+ expect(value.screenshots!.first.description, 'my screenshot');
+ expect(value.screenshots!.first.path, 'path/to/screenshot');
+ expect(value.executables, hasLength(2));
+ expect(value.executables.keys, contains('my_script'));
+ expect(value.executables.keys, contains('my_script2'));
+ expect(value.executables['my_script'], 'bin/my_script.dart');
+ expect(value.executables['my_script2'], 'bin/my_script2.dart');
+ expect(value.workspace, hasLength(2));
+ expect(value.workspace!.first, 'pkg1');
+ expect(value.workspace!.last, 'pkg2');
+ expect(value.resolution, 'workspace');
+ });
+
+ test('environment values can be null', () async {
+ final value = await parse(
+ {
+ 'name': 'sample',
+ 'environment': {
+ 'sdk': '>=2.12.0 <3.0.0',
+ 'bob': null,
+ },
+ },
+ skipTryPub: true,
+ );
+ expect(value.name, 'sample');
+ expect(value.environment, hasLength(2));
+ expect(value.environment, containsPair('bob', isNull));
+ });
+
+ group('publish_to', () {
+ for (var entry in {
+ 42: "Unsupported value for \"publish_to\". type 'int' is not a subtype of type 'String?'",
+ '##not a uri!': r'''
+line 3, column 16: Unsupported value for "publish_to". Must be an http or https URL.
+ ╷
+3 │ "publish_to": "##not a uri!"
+ │ ^^^^^^^^^^^^^^
+ ╵''',
+ '/cool/beans': r'''
+line 3, column 16: Unsupported value for "publish_to". Must be an http or https URL.
+ ╷
+3 │ "publish_to": "/cool/beans"
+ │ ^^^^^^^^^^^^^
+ ╵''',
+ 'file:///Users/kevmoo/': r'''
+line 3, column 16: Unsupported value for "publish_to". Must be an http or https URL.
+ ╷
+3 │ "publish_to": "file:///Users/kevmoo/"
+ │ ^^^^^^^^^^^^^^^^^^^^^^^
+ ╵''',
+ }.entries) {
+ test('cannot be `${entry.key}`', () {
+ expectParseThrowsContaining(
+ {'name': 'sample', 'publish_to': entry.key},
+ entry.value,
+ skipTryPub: true,
+ );
+ });
+ }
+
+ for (var entry in {
+ null: null,
+ 'http': 'http://example.com',
+ 'https': 'https://example.com',
+ 'none': 'none',
+ }.entries) {
+ test('can be ${entry.key}', () async {
+ final value = await parse({
+ ...defaultPubspec,
+ 'publish_to': entry.value,
+ });
+ expect(value.publishTo, entry.value);
+ });
+ }
+ });
+
+ group('author, authors', () {
+ test('one author', () async {
+ final value = await parse({
+ ...defaultPubspec,
+ 'author': 'name@example.com',
+ });
+ expect(value.author, 'name@example.com');
+ expect(value.authors, ['name@example.com']);
+ });
+
+ test('one author, via authors', () async {
+ final value = await parse({
+ ...defaultPubspec,
+ 'authors': ['name@example.com'],
+ });
+ expect(value.author, 'name@example.com');
+ expect(value.authors, ['name@example.com']);
+ });
+
+ test('many authors', () async {
+ final value = await parse({
+ ...defaultPubspec,
+ 'authors': ['name@example.com', 'name2@example.com'],
+ });
+ expect(value.author, isNull);
+ expect(value.authors, ['name@example.com', 'name2@example.com']);
+ });
+
+ test('author and authors', () async {
+ final value = await parse({
+ ...defaultPubspec,
+ 'author': 'name@example.com',
+ 'authors': ['name2@example.com'],
+ });
+ expect(value.author, isNull);
+ expect(value.authors, ['name@example.com', 'name2@example.com']);
+ });
+
+ test('duplicate author values', () async {
+ final value = await parse({
+ ...defaultPubspec,
+ 'author': 'name@example.com',
+ 'authors': ['name@example.com', 'name@example.com'],
+ });
+ expect(value.author, 'name@example.com');
+ expect(value.authors, ['name@example.com']);
+ });
+
+ test('flutter', () async {
+ final value = await parse({
+ ...defaultPubspec,
+ 'flutter': {'key': 'value'},
+ });
+ expect(value.flutter, {'key': 'value'});
+ });
+ });
+
+ group('executables', () {
+ test('one executable', () async {
+ final value = await parse({
+ ...defaultPubspec,
+ 'executables': {'my_script': 'bin/my_script.dart'},
+ });
+ expect(value.executables, hasLength(1));
+ expect(value.executables.keys, contains('my_script'));
+ expect(value.executables['my_script'], 'bin/my_script.dart');
+ });
+
+ test('many executables', () async {
+ final value = await parse({
+ ...defaultPubspec,
+ 'executables': {
+ 'my_script': 'bin/my_script.dart',
+ 'my_script2': 'bin/my_script2.dart',
+ },
+ });
+ expect(value.executables, hasLength(2));
+ expect(value.executables.keys, contains('my_script'));
+ expect(value.executables.keys, contains('my_script2'));
+ expect(value.executables['my_script'], 'bin/my_script.dart');
+ expect(value.executables['my_script2'], 'bin/my_script2.dart');
+ });
+
+ test('invalid value', () async {
+ expectParseThrowsContaining(
+ {
+ ...defaultPubspec,
+ 'executables': {
+ 'script': 32,
+ },
+ },
+ 'Unsupported value for "script". `32` is not a String.',
+ skipTryPub: true,
+ );
+ });
+
+ test('invalid executable - lenient', () async {
+ final value = await parse(
+ {
+ ...defaultPubspec,
+ 'executables': 'Invalid value',
+ },
+ lenient: true,
+ );
+ expect(value.name, 'sample');
+ expect(value.executables, isEmpty);
+ });
+ });
+
+ group('invalid', () {
+ test('null', () {
+ expectParseThrows(
+ null,
+ r'''
+line 1, column 1: Not a map
+ ╷
+1 │ null
+ │ ^^^^
+ ╵''',
+ );
+ });
+ test('empty string', () {
+ expectParseThrows(
+ '',
+ r'''
+line 1, column 1: Not a map
+ ╷
+1 │ ""
+ │ ^^
+ ╵''',
+ );
+ });
+ test('array', () {
+ expectParseThrows(
+ [],
+ r'''
+line 1, column 1: Not a map
+ ╷
+1 │ []
+ │ ^^
+ ╵''',
+ );
+ });
+
+ test('missing name', () {
+ expectParseThrowsContaining(
+ {},
+ "Missing key \"name\". type 'Null' is not a subtype of type 'String'",
+ );
+ });
+
+ test('null name value', () {
+ expectParseThrowsContaining(
+ {'name': null},
+ "Unsupported value for \"name\". type 'Null' is not a subtype of type 'String'",
+ );
+ });
+
+ test('empty name value', () {
+ expectParseThrows(
+ {'name': ''},
+ r'''
+line 2, column 10: Unsupported value for "name". "name" cannot be empty.
+ ╷
+2 │ "name": ""
+ │ ^^
+ ╵''',
+ );
+ });
+
+ test('"dart" is an invalid environment key', () {
+ expectParseThrows(
+ {
+ 'name': 'sample',
+ 'environment': {'dart': 'cool'},
+ },
+ r'''
+line 4, column 3: Use "sdk" to for Dart SDK constraints.
+ ╷
+4 │ "dart": "cool"
+ │ ^^^^^^
+ ╵''',
+ );
+ });
+
+ test('environment values cannot be int', () {
+ expectParseThrows(
+ {
+ 'name': 'sample',
+ 'environment': {'sdk': 42},
+ },
+ r'''
+line 4, column 10: Unsupported value for "sdk". `42` is not a String.
+ ╷
+4 │ "sdk": 42
+ │ ┌──────────^
+5 │ │ }
+ │ └─^
+ ╵''',
+ );
+ });
+
+ test('version', () {
+ expectParseThrows(
+ {'name': 'sample', 'version': 'invalid'},
+ r'''
+line 3, column 13: Unsupported value for "version". Could not parse "invalid".
+ ╷
+3 │ "version": "invalid"
+ │ ^^^^^^^^^
+ ╵''',
+ );
+ });
+
+ test('invalid environment value', () {
+ expectParseThrows(
+ {
+ 'name': 'sample',
+ 'environment': {'sdk': 'silly'},
+ },
+ r'''
+line 4, column 10: Unsupported value for "sdk". Could not parse version "silly". Unknown text at "silly".
+ ╷
+4 │ "sdk": "silly"
+ │ ^^^^^^^
+ ╵''',
+ );
+ });
+
+ test('bad repository url', () {
+ expectParseThrowsContaining(
+ {
+ ...defaultPubspec,
+ 'repository': {'x': 'y'},
+ },
+ "Unsupported value for \"repository\". type 'YamlMap' is not a subtype of type 'String'",
+ skipTryPub: true,
+ );
+ });
+
+ test('bad issue_tracker url', () {
+ expectParseThrowsContaining(
+ {
+ 'name': 'sample',
+ 'issue_tracker': {'x': 'y'},
+ },
+ "Unsupported value for \"issue_tracker\". type 'YamlMap' is not a subtype of type 'String'",
+ skipTryPub: true,
+ );
+ });
+ });
+
+ group('funding', () {
+ test('not a list', () {
+ expectParseThrowsContaining(
+ {
+ ...defaultPubspec,
+ 'funding': 1,
+ },
+ "Unsupported value for \"funding\". type 'int' is not a subtype of type 'List<dynamic>?'",
+ skipTryPub: true,
+ );
+ });
+
+ test('not an uri', () {
+ expectParseThrowsContaining(
+ {
+ ...defaultPubspec,
+ 'funding': [1],
+ },
+ "Unsupported value for \"funding\". type 'int' is not a subtype of type 'String'",
+ skipTryPub: true,
+ );
+ });
+
+ test('not an uri', () {
+ expectParseThrows(
+ {
+ ...defaultPubspec,
+ 'funding': ['ht tps://example.com/'],
+ },
+ r'''
+line 6, column 13: Unsupported value for "funding". Illegal scheme character at offset 2.
+ ╷
+6 │ "funding": [
+ │ ┌─────────────^
+7 │ │ "ht tps://example.com/"
+8 │ └ ]
+ ╵''',
+ skipTryPub: true,
+ );
+ });
+ });
+ group('topics', () {
+ test('not a list', () {
+ expectParseThrowsContaining(
+ {
+ ...defaultPubspec,
+ 'topics': 1,
+ },
+ "Unsupported value for \"topics\". type 'int' is not a subtype of type 'List<dynamic>?'",
+ skipTryPub: true,
+ );
+ });
+
+ test('not a string', () {
+ expectParseThrowsContaining(
+ {
+ ...defaultPubspec,
+ 'topics': [1],
+ },
+ "Unsupported value for \"topics\". type 'int' is not a subtype of type 'String'",
+ skipTryPub: true,
+ );
+ });
+
+ test('invalid data - lenient', () async {
+ final value = await parse(
+ {
+ ...defaultPubspec,
+ 'topics': [1],
+ },
+ skipTryPub: true,
+ lenient: true,
+ );
+ expect(value.name, 'sample');
+ expect(value.topics, isNull);
+ });
+ });
+
+ group('ignored_advisories', () {
+ test('not a list', () {
+ expectParseThrowsContaining(
+ {
+ ...defaultPubspec,
+ 'ignored_advisories': 1,
+ },
+ "Unsupported value for \"ignored_advisories\". type 'int' is not a subtype of type 'List<dynamic>?'",
+ skipTryPub: true,
+ );
+ });
+
+ test('not a string', () {
+ expectParseThrowsContaining(
+ {
+ ...defaultPubspec,
+ 'ignored_advisories': [1],
+ },
+ "Unsupported value for \"ignored_advisories\". type 'int' is not a subtype of type 'String'",
+ skipTryPub: true,
+ );
+ });
+
+ test('invalid data - lenient', () async {
+ final value = await parse(
+ {
+ ...defaultPubspec,
+ 'ignored_advisories': [1],
+ },
+ skipTryPub: true,
+ lenient: true,
+ );
+ expect(value.name, 'sample');
+ expect(value.ignoredAdvisories, isNull);
+ });
+ });
+
+ group('screenshots', () {
+ test('one screenshot', () async {
+ final value = await parse({
+ ...defaultPubspec,
+ 'screenshots': [
+ {'description': 'my screenshot', 'path': 'path/to/screenshot'},
+ ],
+ });
+ expect(value.screenshots, hasLength(1));
+ expect(value.screenshots!.first.description, 'my screenshot');
+ expect(value.screenshots!.first.path, 'path/to/screenshot');
+ });
+
+ test('many screenshots', () async {
+ final value = await parse({
+ ...defaultPubspec,
+ 'screenshots': [
+ {'description': 'my screenshot', 'path': 'path/to/screenshot'},
+ {
+ 'description': 'my second screenshot',
+ 'path': 'path/to/screenshot2',
+ },
+ ],
+ });
+ expect(value.screenshots, hasLength(2));
+ expect(value.screenshots!.first.description, 'my screenshot');
+ expect(value.screenshots!.first.path, 'path/to/screenshot');
+ expect(value.screenshots!.last.description, 'my second screenshot');
+ expect(value.screenshots!.last.path, 'path/to/screenshot2');
+ });
+
+ test('one screenshot plus invalid entries', () async {
+ final value = await parse({
+ ...defaultPubspec,
+ 'screenshots': [
+ 42,
+ {
+ 'description': 'my screenshot',
+ 'path': 'path/to/screenshot',
+ 'extraKey': 'not important',
+ },
+ 'not a screenshot',
+ ],
+ });
+ expect(value.screenshots, hasLength(1));
+ expect(value.screenshots!.first.description, 'my screenshot');
+ expect(value.screenshots!.first.path, 'path/to/screenshot');
+ });
+
+ test('invalid entries', () async {
+ final value = await parse({
+ ...defaultPubspec,
+ 'screenshots': [
+ 42,
+ 'not a screenshot',
+ ],
+ });
+ expect(value.screenshots, isEmpty);
+ });
+
+    test('missing key `description`', () {
+ expectParseThrows(
+ {
+ ...defaultPubspec,
+ 'screenshots': [
+ {'path': 'my/path'},
+ ],
+ },
+ r'''
+line 7, column 3: Missing key "description". Missing required key `description`
+ ╷
+7 │ ┌ {
+8 │ │ "path": "my/path"
+9 │ └ }
+ ╵''',
+ skipTryPub: true,
+ );
+ });
+
+ test('missing key `path`', () {
+ expectParseThrows(
+ {
+ ...defaultPubspec,
+ 'screenshots': [
+ {'description': 'my screenshot'},
+ ],
+ },
+ r'''
+line 7, column 3: Missing key "path". Missing required key `path`
+ ╷
+7 │ ┌ {
+8 │ │ "description": "my screenshot"
+9 │ └ }
+ ╵''',
+ skipTryPub: true,
+ );
+ });
+
+    test('Value of description not a String', () {
+ expectParseThrows(
+ {
+ ...defaultPubspec,
+ 'screenshots': [
+ {'description': 42},
+ ],
+ },
+ r'''
+line 8, column 19: Unsupported value for "description". `42` is not a String
+ ╷
+8 │ "description": 42
+ │ ┌───────────────────^
+9 │ │ }
+ │ └──^
+ ╵''',
+ skipTryPub: true,
+ );
+ });
+
+    test('Value of path not a String', () {
+ expectParseThrows(
+ {
+ ...defaultPubspec,
+ 'screenshots': [
+ {
+ 'description': '',
+ 'path': 42,
+ },
+ ],
+ },
+ r'''
+line 9, column 12: Unsupported value for "path". `42` is not a String
+ ╷
+9 │ "path": 42
+ │ ┌────────────^
+10 │ │ }
+ │ └──^
+ ╵''',
+ skipTryPub: true,
+ );
+ });
+
+ test('invalid screenshot - lenient', () async {
+ final value = await parse(
+ {
+ ...defaultPubspec,
+ 'screenshots': 'Invalid value',
+ },
+ lenient: true,
+ );
+ expect(value.name, 'sample');
+ expect(value.screenshots, isEmpty);
+ });
+ });
+
+ group('lenient', () {
+ test('null', () {
+ expectParseThrows(
+ null,
+ r'''
+line 1, column 1: Not a map
+ ╷
+1 │ null
+ │ ^^^^
+ ╵''',
+ lenient: true,
+ );
+ });
+
+ test('empty string', () {
+ expectParseThrows(
+ '',
+ r'''
+line 1, column 1: Not a map
+ ╷
+1 │ ""
+ │ ^^
+ ╵''',
+ lenient: true,
+ );
+ });
+
+ test('name cannot be empty', () {
+ expectParseThrowsContaining(
+ {},
+ "Missing key \"name\". type 'Null' is not a subtype of type 'String'",
+ lenient: true,
+ );
+ });
+
+ test('bad repository url', () async {
+ final value = await parse(
+ {
+ ...defaultPubspec,
+ 'repository': {'x': 'y'},
+ },
+ lenient: true,
+ );
+ expect(value.name, 'sample');
+ expect(value.repository, isNull);
+ });
+
+ test('bad issue_tracker url', () async {
+ final value = await parse(
+ {
+ ...defaultPubspec,
+ 'issue_tracker': {'x': 'y'},
+ },
+ lenient: true,
+ );
+ expect(value.name, 'sample');
+ expect(value.issueTracker, isNull);
+ });
+
+ test('multiple bad values', () async {
+ final value = await parse(
+ {
+ ...defaultPubspec,
+ 'repository': {'x': 'y'},
+ 'issue_tracker': {'x': 'y'},
+ },
+ lenient: true,
+ );
+ expect(value.name, 'sample');
+ expect(value.repository, isNull);
+ expect(value.issueTracker, isNull);
+ });
+
+ test('deep error throws with lenient', () {
+ expect(
+ () => parse(
+ {
+ 'name': 'sample',
+ 'dependencies': {
+ 'foo': {
+ 'git': {'url': 1},
+ },
+ },
+ 'issue_tracker': {'x': 'y'},
+ },
+ skipTryPub: true,
+ lenient: true,
+ ),
+ throwsException,
+ );
+ });
+ });
+
+ group('workspaces', () {
+ test('workspace key must be a list', () {
+ expectParseThrowsContaining(
+ {
+ ...defaultPubspec,
+ 'workspace': 42,
+ },
+ 'Unsupported value for "workspace". type \'int\' is not a subtype of type \'List<dynamic>?\' in type cast',
+ skipTryPub: true,
+ );
+ });
+
+ test('workspace key must be a list of strings', () {
+ expectParseThrowsContaining(
+ {
+ ...defaultPubspec,
+ 'workspace': [42],
+ },
+ 'Unsupported value for "workspace". type \'int\' is not a subtype of type \'String\' in type cast',
+ skipTryPub: true,
+ );
+ });
+
+ test('resolution key must be a string', () {
+ expectParseThrowsContaining(
+ {
+ 'name': 'sample',
+ 'environment': {'sdk': '^3.6.0'},
+ 'resolution': 42,
+ },
+ 'Unsupported value for "resolution". type \'int\' is not a subtype of type \'String?\' in type cast',
+ skipTryPub: true,
+ );
+ });
+ });
+}
diff --git a/pkgs/pubspec_parse/test/pub_utils.dart b/pkgs/pubspec_parse/test/pub_utils.dart
new file mode 100644
index 0000000..a60aa2a
--- /dev/null
+++ b/pkgs/pubspec_parse/test/pub_utils.dart
@@ -0,0 +1,88 @@
+// Copyright (c) 2018, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+import 'dart:io';
+
+import 'package:path/path.dart' as p;
+import 'package:test/test.dart';
+import 'package:test_descriptor/test_descriptor.dart' as d;
+import 'package:test_process/test_process.dart';
+
+Future<ProcResult> tryPub(String content) async {
+ await d.file('pubspec.yaml', content).create();
+
+ final proc = await TestProcess.start(
+ Platform.resolvedExecutable,
+ ['pub', 'get', '--offline'],
+ workingDirectory: d.sandbox,
+ // Don't pass current process VM options to child
+ environment: Map.from(Platform.environment)..remove('DART_VM_OPTIONS'),
+ );
+
+ final result = await ProcResult.fromTestProcess(proc);
+
+ printOnFailure(
+ [
+ '-----BEGIN pub output-----',
+ result.toString().trim(),
+ '-----END pub output-----',
+ ].join('\n'),
+ );
+
+ if (result.exitCode == 0) {
+ final lockContent =
+ File(p.join(d.sandbox, 'pubspec.lock')).readAsStringSync();
+
+ printOnFailure(
+ [
+ '-----BEGIN pubspec.lock-----',
+ lockContent.trim(),
+ '-----END pubspec.lock-----',
+ ].join('\n'),
+ );
+ }
+
+ return result;
+}
+
+class ProcResult {
+ final int exitCode;
+ final List<ProcLine> lines;
+
+ bool get cleanParse => exitCode == 0 || exitCode == 66 || exitCode == 69;
+
+ ProcResult(this.exitCode, this.lines);
+
+ static Future<ProcResult> fromTestProcess(TestProcess proc) async {
+ final items = <ProcLine>[];
+
+ final values = await Future.wait([
+ proc.exitCode,
+ proc.stdoutStream().forEach((line) => items.add(ProcLine(false, line))),
+ proc.stderrStream().forEach((line) => items.add(ProcLine(true, line))),
+ ]);
+
+ return ProcResult(values[0] as int, items);
+ }
+
+ @override
+ String toString() {
+ final buffer = StringBuffer('Exit code: $exitCode');
+ for (var line in lines) {
+ buffer.write('\n$line');
+ }
+ return buffer.toString();
+ }
+}
+
+class ProcLine {
+ final bool isError;
+ final String line;
+
+ ProcLine(this.isError, this.line);
+
+ @override
+ String toString() => '${isError ? 'err' : 'out'} $line';
+}
diff --git a/pkgs/pubspec_parse/test/test_utils.dart b/pkgs/pubspec_parse/test/test_utils.dart
new file mode 100644
index 0000000..cc46522
--- /dev/null
+++ b/pkgs/pubspec_parse/test/test_utils.dart
@@ -0,0 +1,157 @@
+// Copyright (c) 2018, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:convert';
+
+import 'package:checked_yaml/checked_yaml.dart';
+import 'package:json_annotation/json_annotation.dart';
+import 'package:pubspec_parse/pubspec_parse.dart';
+import 'package:stack_trace/stack_trace.dart';
+import 'package:test/test.dart';
+
+import 'pub_utils.dart';
+
+const defaultPubspec = {
+ 'name': 'sample',
+ 'environment': {'sdk': '>=2.12.0 <3.0.0'},
+};
+
+String _encodeJson(Object? input) =>
+ const JsonEncoder.withIndent(' ').convert(input);
+
+Matcher _throwsParsedYamlException(String prettyValue) => throwsA(
+ const TypeMatcher<ParsedYamlException>().having(
+ (e) {
+ final message = e.formattedMessage;
+ printOnFailure("Actual error format:\nr'''\n$message'''");
+ _printDebugParsedYamlException(e);
+ return message;
+ },
+ 'formattedMessage',
+ prettyValue,
+ ),
+ );
+
+void _printDebugParsedYamlException(ParsedYamlException e) {
+ var innerError = e.innerError;
+ StackTrace? innerStack;
+
+ if (innerError is CheckedFromJsonException) {
+ final cfje = innerError;
+
+ if (cfje.innerError != null) {
+ innerError = cfje.innerError;
+ innerStack = cfje.innerStack;
+ }
+ }
+
+ if (innerError != null) {
+ final items = [innerError];
+ if (innerStack != null) {
+ items.add(Trace.format(innerStack));
+ }
+
+ final content =
+ LineSplitter.split(items.join('\n')).map((e) => ' $e').join('\n');
+
+ printOnFailure('Inner error details:\n$content');
+ }
+}
+
+Future<Pubspec> parse(
+ Object? content, {
+ bool quietOnError = false,
+ bool skipTryPub = false,
+ bool lenient = false,
+}) async {
+ final encoded = _encodeJson(content);
+
+ ProcResult? pubResult;
+ if (!skipTryPub) {
+ // ignore: deprecated_member_use
+ pubResult = await tryPub(encoded);
+ expect(pubResult, isNotNull);
+ }
+
+ try {
+ final value = Pubspec.parse(encoded, lenient: lenient);
+
+ if (pubResult != null) {
+ addTearDown(() {
+ expect(
+ pubResult!.cleanParse,
+ isTrue,
+ reason:
+ 'On success, parsing from the pub client should also succeed.',
+ );
+ });
+ }
+ return value;
+ } catch (e) {
+ if (pubResult != null) {
+ addTearDown(() {
+ expect(
+ pubResult!.cleanParse,
+ isFalse,
+ reason: 'On failure, parsing from the pub client should also fail.',
+ );
+ });
+ }
+ if (e is ParsedYamlException) {
+ if (!quietOnError) {
+ _printDebugParsedYamlException(e);
+ }
+ }
+ rethrow;
+ }
+}
+
+void expectParseThrows(
+ Object? content,
+ String expectedError, {
+ bool skipTryPub = false,
+ bool lenient = false,
+}) =>
+ expect(
+ () => parse(
+ content,
+ lenient: lenient,
+ quietOnError: true,
+ skipTryPub: skipTryPub,
+ ),
+ _throwsParsedYamlException(expectedError),
+ );
+
+void expectParseThrowsContaining(
+ Object? content,
+ String errorFragment, {
+ bool skipTryPub = false,
+ bool lenient = false,
+}) {
+ expect(
+ () => parse(
+ content,
+ lenient: lenient,
+ quietOnError: true,
+ skipTryPub: skipTryPub,
+ ),
+ _throwsParsedYamlExceptionContaining(errorFragment),
+ );
+}
+
+// ignore: prefer_expression_function_bodies
+Matcher _throwsParsedYamlExceptionContaining(String errorFragment) {
+ return throwsA(
+ const TypeMatcher<ParsedYamlException>().having(
+ (e) {
+ final message = e.formattedMessage;
+ printOnFailure("Actual error format:\nr'''\n$message'''");
+ _printDebugParsedYamlException(e);
+ return message;
+ },
+ 'formattedMessage',
+ contains(errorFragment),
+ ),
+ );
+}
diff --git a/pkgs/source_maps/.gitignore b/pkgs/source_maps/.gitignore
new file mode 100644
index 0000000..f73b2f9
--- /dev/null
+++ b/pkgs/source_maps/.gitignore
@@ -0,0 +1,4 @@
+.dart_tool/
+.packages
+.pub/
+pubspec.lock
diff --git a/pkgs/source_maps/CHANGELOG.md b/pkgs/source_maps/CHANGELOG.md
new file mode 100644
index 0000000..b06ac72
--- /dev/null
+++ b/pkgs/source_maps/CHANGELOG.md
@@ -0,0 +1,133 @@
+## 0.10.14-wip
+
+## 0.10.13
+
+* Require Dart 3.3
+* Move to `dart-lang/tools` monorepo.
+
+## 0.10.12
+
+* Add additional types at API boundaries.
+
+## 0.10.11
+
+* Populate the pubspec `repository` field.
+* Update the source map documentation link in the readme.
+
+## 0.10.10
+
+* Stable release for null safety.
+
+## 0.10.9
+
+* Fix a number of document comment issues.
+* Allow parsing source map files with a missing `names` field.
+
+## 0.10.8
+
+* Preserve source-map extensions in `SingleMapping`. Extensions are keys in the
+ json map that start with `"x_"`.
+
+## 0.10.7
+
+* Set max SDK version to `<3.0.0`, and adjust other dependencies.
+
+## 0.10.6
+
+* Require version 2.0.0 of the Dart SDK.
+
+## 0.10.5
+
+* Add a `SingleMapping.files` field which provides access to `SourceFile`s
+ representing the `"sourcesContent"` fields in the source map.
+
+* Add an `includeSourceContents` flag to `SingleMapping.toJson()` which
+ indicates whether to include source file contents in the source map.
+
+## 0.10.4
+* Implement `highlight` in `SourceMapFileSpan`.
+* Require version `^1.3.0` of `source_span`.
+
+## 0.10.3
+ * Add `addMapping` and `containsMapping` members to `MappingBundle`.
+
+## 0.10.2
+ * Support for extended source map format.
+ * Polish `MappingBundle.spanFor` handling of URIs that have a suffix that
+ exactly match a source map in the MappingBundle.
+
+## 0.10.1+5
+ * Fix strong mode warning in test.
+
+## 0.10.1+4
+
+* Extend `MappingBundle.spanFor` to accept requests for output files that
+ don't have source maps.
+
+## 0.10.1+3
+
+* Add `MappingBundle` class that handles extended source map format that
+ supports source maps for multiple output files in a single mapper.
+ Extend `Mapping.spanFor` API to accept a uri parameter that is optional
+ for normal source maps but required for MappingBundle source maps.
+
+## 0.10.1+2
+
+* Fix more strong mode warnings.
+
+## 0.10.1+1
+
+* Fix all strong mode warnings.
+
+## 0.10.1
+
+* Add a `mapUrl` named argument to `parse` and `parseJson`. This argument is
+ used to resolve source URLs for source spans.
+
+## 0.10.0+2
+
+* Fix analyzer error (FileSpan has a new field since `source_span` 1.1.1)
+
+## 0.10.0+1
+
+* Remove an unnecessary warning printed when the "file" field is missing from a
+ Json formatted source map. This field is optional and its absence is not
+ unusual.
+
+## 0.10.0
+
+* Remove the `Span`, `Location` and `SourceFile` classes. Use the
+ corresponding `source_span` classes instead.
+
+## 0.9.4
+
+* Update `SpanFormatException` with `source` and `offset`.
+
+* All methods that take `Span`s, `Location`s, and `SourceFile`s as inputs now
+ also accept the corresponding `source_span` classes as well. Using the old
+ classes is now deprecated and will be unsupported in version 0.10.0.
+
+## 0.9.3
+
+* Support writing SingleMapping objects to source map version 3 format.
+* Support the `sourceRoot` field in the SingleMapping class.
+* Support updating the `targetUrl` field in the SingleMapping class.
+
+## 0.9.2+2
+
+* Fix a bug in `FixedSpan.getLocationMessage`.
+
+## 0.9.2+1
+
+* Minor readability improvements to `FixedSpan.getLocationMessage` and
+ `SpanException.toString`.
+
+## 0.9.2
+
+* Add `SpanException` and `SpanFormatException` classes.
+
+## 0.9.1
+
+* Support unmapped areas in source maps.
+
+* Increase the readability of location messages.
diff --git a/pkgs/source_maps/LICENSE b/pkgs/source_maps/LICENSE
new file mode 100644
index 0000000..162572a
--- /dev/null
+++ b/pkgs/source_maps/LICENSE
@@ -0,0 +1,27 @@
+Copyright 2014, the Dart project authors.
+
+Redistribution and use in source and binary forms, with or without
+modification, are permitted provided that the following conditions are
+met:
+
+ * Redistributions of source code must retain the above copyright
+ notice, this list of conditions and the following disclaimer.
+ * Redistributions in binary form must reproduce the above
+ copyright notice, this list of conditions and the following
+ disclaimer in the documentation and/or other materials provided
+ with the distribution.
+ * Neither the name of Google LLC nor the names of its
+ contributors may be used to endorse or promote products derived
+ from this software without specific prior written permission.
+
+THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
diff --git a/pkgs/source_maps/README.md b/pkgs/source_maps/README.md
new file mode 100644
index 0000000..cf80291
--- /dev/null
+++ b/pkgs/source_maps/README.md
@@ -0,0 +1,25 @@
+[](https://github.com/dart-lang/tools/actions/workflows/source_maps.yaml)
+[](https://pub.dev/packages/source_maps)
+[](https://pub.dev/packages/source_maps/publisher)
+
+This project implements a Dart pub package to work with source maps.
+
+## Docs and usage
+
+The implementation is based on the [source map version 3 spec][spec], which
+originated from the [Closure Compiler][closure] and has been implemented in
+Chrome and Firefox.
+
+In this package we provide:
+
+ * Data types defining file locations and spans: these are not part of the
+   original source map specification. These data types are great for tracking
+   source locations on source maps, but they can also be used by tools to
+   report useful error messages that reference source locations.
+ * A builder that creates a source map programmatically and produces the encoded
+ source map format.
+ * A parser that reads the source map format and provides APIs to read the
+ mapping information.
+
+[closure]: https://github.com/google/closure-compiler/wiki/Source-Maps
+[spec]: https://docs.google.com/a/google.com/document/d/1U1RGAehQwRypUTovF1KRlpiOFze0b-_2gc6fAH0KY0k/edit
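
To make the builder/parser split above concrete, a minimal round-trip sketch (not part of this diff; file names and offsets are made up):

```dart
import 'package:source_maps/source_maps.dart';
import 'package:source_span/source_span.dart';

void main() {
  // Build a one-entry map: output.js 0:0 originates from input.dart 0:0.
  final builder = SourceMapBuilder()
    ..addLocation(
      SourceLocation(0, line: 0, column: 0, sourceUrl: Uri.parse('input.dart')),
      SourceLocation(0, line: 0, column: 0, sourceUrl: Uri.parse('output.js')),
      null,
    );
  final encoded = builder.toJson('output.js');

  // Parse the encoded map back and look up the original location for 0:0.
  final mapping = parse(encoded);
  final span = mapping.spanFor(0, 0);
  print(span?.sourceUrl); // input.dart
}
```
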
diff --git a/pkgs/source_maps/analysis_options.yaml b/pkgs/source_maps/analysis_options.yaml
new file mode 100644
index 0000000..d978f81
--- /dev/null
+++ b/pkgs/source_maps/analysis_options.yaml
@@ -0,0 +1 @@
+include: package:dart_flutter_team_lints/analysis_options.yaml
diff --git a/pkgs/source_maps/lib/builder.dart b/pkgs/source_maps/lib/builder.dart
new file mode 100644
index 0000000..9043c63
--- /dev/null
+++ b/pkgs/source_maps/lib/builder.dart
@@ -0,0 +1,84 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+/// Contains a builder object useful for creating source maps programmatically.
+library;
+
+// TODO(sigmund): add a builder for multi-section mappings.
+
+import 'dart:convert';
+
+import 'package:source_span/source_span.dart';
+
+import 'parser.dart';
+import 'src/source_map_span.dart';
+
+/// Builds a source map given a set of mappings.
+class SourceMapBuilder {
+ final List<Entry> _entries = <Entry>[];
+
+ /// Adds an entry mapping the [targetOffset] to [source].
+ void addFromOffset(SourceLocation source, SourceFile targetFile,
+ int targetOffset, String identifier) {
+ ArgumentError.checkNotNull(targetFile, 'targetFile');
+ _entries.add(Entry(source, targetFile.location(targetOffset), identifier));
+ }
+
+ /// Adds an entry mapping [target] to [source].
+ ///
+ /// If [isIdentifier] is true or if [target] is a [SourceMapSpan] with
+ /// `isIdentifier` set to true, this entry is considered to represent an
+ /// identifier whose value will be stored in the source map. [isIdentifier]
+ /// takes precedence over [target]'s `isIdentifier` value.
+ void addSpan(SourceSpan source, SourceSpan target, {bool? isIdentifier}) {
+ isIdentifier ??= source is SourceMapSpan ? source.isIdentifier : false;
+
+ var name = isIdentifier ? source.text : null;
+ _entries.add(Entry(source.start, target.start, name));
+ }
+
+ /// Adds an entry mapping [target] to [source].
+ void addLocation(
+ SourceLocation source, SourceLocation target, String? identifier) {
+ _entries.add(Entry(source, target, identifier));
+ }
+
+ /// Encodes all mappings added to this builder as a json map.
+ Map<String, dynamic> build(String fileUrl) {
+ return SingleMapping.fromEntries(_entries, fileUrl).toJson();
+ }
+
+ /// Encodes all mappings added to this builder as a json string.
+ String toJson(String fileUrl) => jsonEncode(build(fileUrl));
+}
+
+/// An entry in the source map builder.
+class Entry implements Comparable<Entry> {
+ /// Span denoting the original location in the input source file
+ final SourceLocation source;
+
+ /// Span indicating the corresponding location in the target file.
+ final SourceLocation target;
+
+ /// An identifier name, when this location is the start of an identifier.
+ final String? identifierName;
+
+ /// Creates a new [Entry] mapping [target] to [source].
+ Entry(this.source, this.target, this.identifierName);
+
+ /// Implements [Comparable] to ensure that entries are ordered by their
+ /// location in the target file. We sort primarily by the target offset
+ /// because source map files are encoded by printing each mapping in order as
+ /// they appear in the target file.
+ @override
+ int compareTo(Entry other) {
+ var res = target.compareTo(other.target);
+ if (res != 0) return res;
+ res = source.sourceUrl
+ .toString()
+ .compareTo(other.source.sourceUrl.toString());
+ if (res != 0) return res;
+ return source.compareTo(other.source);
+ }
+}
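
A small usage sketch of the `addSpan` path documented above (not part of this diff; the file contents are made up, and `SourceFile` comes from `package:source_span`):

```dart
import 'package:source_maps/builder.dart';
import 'package:source_span/source_span.dart';

void main() {
  final dart = SourceFile.fromString('main() {}', url: 'input.dart');
  final js = SourceFile.fromString('function main() {}', url: 'output.js');

  // Map the `main` identifier in the generated JS back to the Dart source.
  // `isIdentifier: true` stores the identifier's text in the map's names list.
  final builder = SourceMapBuilder()
    ..addSpan(dart.span(0, 4), js.span(9, 13), isIdentifier: true);

  print(builder.toJson('output.js'));
}
```
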
diff --git a/pkgs/source_maps/lib/parser.dart b/pkgs/source_maps/lib/parser.dart
new file mode 100644
index 0000000..590dfc6
--- /dev/null
+++ b/pkgs/source_maps/lib/parser.dart
@@ -0,0 +1,718 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+/// Contains the top-level function to parse source maps version 3.
+library;
+
+import 'dart:convert';
+
+import 'package:source_span/source_span.dart';
+
+import 'builder.dart' as builder;
+import 'src/source_map_span.dart';
+import 'src/utils.dart';
+import 'src/vlq.dart';
+
+/// Parses a source map directly from a json string.
+///
+/// [mapUrl], which may be either a [String] or a [Uri], indicates the URL of
+/// the source map file itself. If it's passed, any URLs in the source
+/// map will be interpreted as relative to this URL when generating spans.
+// TODO(sigmund): evaluate whether other maps should have the json parsed, or
+// the string representation.
+// TODO(tjblasi): Ignore the first line of [jsonMap] if the JSON safety string
+// `)]}'` begins the string representation of the map.
+Mapping parse(String jsonMap,
+ {Map<String, Map>? otherMaps, /*String|Uri*/ Object? mapUrl}) =>
+ parseJson(jsonDecode(jsonMap) as Map, otherMaps: otherMaps, mapUrl: mapUrl);
+
+/// Parses a source map or source map bundle directly from a json string.
+///
+/// [mapUrl], which may be either a [String] or a [Uri], indicates the URL of
+/// the source map file itself. If it's passed, any URLs in the source
+/// map will be interpreted as relative to this URL when generating spans.
+Mapping parseExtended(String jsonMap,
+ {Map<String, Map>? otherMaps, /*String|Uri*/ Object? mapUrl}) =>
+ parseJsonExtended(jsonDecode(jsonMap),
+ otherMaps: otherMaps, mapUrl: mapUrl);
+
+/// Parses a source map or source map bundle.
+///
+/// [mapUrl], which may be either a [String] or a [Uri], indicates the URL of
+/// the source map file itself. If it's passed, any URLs in the source
+/// map will be interpreted as relative to this URL when generating spans.
+Mapping parseJsonExtended(/*List|Map*/ Object? json,
+ {Map<String, Map>? otherMaps, /*String|Uri*/ Object? mapUrl}) {
+ if (json is List) {
+ return MappingBundle.fromJson(json, mapUrl: mapUrl);
+ }
+ return parseJson(json as Map);
+}
+
+/// Parses a source map.
+///
+/// [mapUrl], which may be either a [String] or a [Uri], indicates the URL of
+/// the source map file itself. If it's passed, any URLs in the source
+/// map will be interpreted as relative to this URL when generating spans.
+Mapping parseJson(Map map,
+ {Map<String, Map>? otherMaps, /*String|Uri*/ Object? mapUrl}) {
+ if (map['version'] != 3) {
+ throw ArgumentError('unexpected source map version: ${map["version"]}. '
+ 'Only version 3 is supported.');
+ }
+
+ if (map.containsKey('sections')) {
+ if (map.containsKey('mappings') ||
+ map.containsKey('sources') ||
+ map.containsKey('names')) {
+ throw const FormatException('map containing "sections" '
+ 'cannot contain "mappings", "sources", or "names".');
+ }
+ return MultiSectionMapping.fromJson(map['sections'] as List, otherMaps,
+ mapUrl: mapUrl);
+ }
+ return SingleMapping.fromJson(map.cast<String, dynamic>(), mapUrl: mapUrl);
+}
+
+/// A mapping parsed out of a source map.
+abstract class Mapping {
+ /// Returns the span associated with [line] and [column].
+ ///
+ /// [uri] is the optional location of the output file to find the span for
+ /// to disambiguate cases where a mapping may have different mappings for
+ /// different output files.
+ SourceMapSpan? spanFor(int line, int column,
+ {Map<String, SourceFile>? files, String? uri});
+
+ /// Returns the span associated with [location].
+ SourceMapSpan? spanForLocation(SourceLocation location,
+ {Map<String, SourceFile>? files}) {
+ return spanFor(location.line, location.column,
+ uri: location.sourceUrl?.toString(), files: files);
+ }
+}
+
+/// A meta-level map containing sections.
+class MultiSectionMapping extends Mapping {
+ /// For each section, the start line offset.
+ final List<int> _lineStart = <int>[];
+
+ /// For each section, the start column offset.
+ final List<int> _columnStart = <int>[];
+
+ /// For each section, the actual source map information, which is not adjusted
+ /// for offsets.
+ final List<Mapping> _maps = <Mapping>[];
+
+ /// Creates a section mapping from json.
+ MultiSectionMapping.fromJson(List sections, Map<String, Map>? otherMaps,
+ {/*String|Uri*/ Object? mapUrl}) {
+ for (var section in sections.cast<Map>()) {
+ var offset = section['offset'] as Map?;
+ if (offset == null) throw const FormatException('section missing offset');
+
+ var line = offset['line'] as int?;
+ if (line == null) throw const FormatException('offset missing line');
+
+ var column = offset['column'] as int?;
+ if (column == null) throw const FormatException('offset missing column');
+
+ _lineStart.add(line);
+ _columnStart.add(column);
+
+ var url = section['url'] as String?;
+ var map = section['map'] as Map?;
+
+ if (url != null && map != null) {
+ throw const FormatException(
+ "section can't use both url and map entries");
+ } else if (url != null) {
+ var other = otherMaps?[url];
+ if (otherMaps == null || other == null) {
+ throw FormatException(
+            'section refers to $url, but no map was '
+ 'given for it. Make sure a map is passed in "otherMaps"');
+ }
+ _maps.add(parseJson(other, otherMaps: otherMaps, mapUrl: url));
+ } else if (map != null) {
+ _maps.add(parseJson(map, otherMaps: otherMaps, mapUrl: mapUrl));
+ } else {
+ throw const FormatException('section missing url or map');
+ }
+ }
+ if (_lineStart.isEmpty) {
+ throw const FormatException('expected at least one section');
+ }
+ }
+
+ int _indexFor(int line, int column) {
+ for (var i = 0; i < _lineStart.length; i++) {
+ if (line < _lineStart[i]) return i - 1;
+ if (line == _lineStart[i] && column < _columnStart[i]) return i - 1;
+ }
+ return _lineStart.length - 1;
+ }
+
+ @override
+ SourceMapSpan? spanFor(int line, int column,
+ {Map<String, SourceFile>? files, String? uri}) {
+ // TODO(jacobr): perhaps verify that targetUrl matches the actual uri
+ // or at least ends in the same file name.
+ var index = _indexFor(line, column);
+ return _maps[index].spanFor(
+ line - _lineStart[index], column - _columnStart[index],
+ files: files);
+ }
+
+ @override
+ String toString() {
+ var buff = StringBuffer('$runtimeType : [');
+ for (var i = 0; i < _lineStart.length; i++) {
+ buff
+ ..write('(')
+ ..write(_lineStart[i])
+ ..write(',')
+ ..write(_columnStart[i])
+ ..write(':')
+ ..write(_maps[i])
+ ..write(')');
+ }
+ buff.write(']');
+ return buff.toString();
+ }
+}
+
+class MappingBundle extends Mapping {
+ final Map<String, SingleMapping> _mappings = {};
+
+ MappingBundle();
+
+ MappingBundle.fromJson(List json, {/*String|Uri*/ Object? mapUrl}) {
+ for (var map in json) {
+ addMapping(parseJson(map as Map, mapUrl: mapUrl) as SingleMapping);
+ }
+ }
+
+ void addMapping(SingleMapping mapping) {
+    // TODO(jacobr): verify that targetUrl is a valid uri instead of a windows
+ // path.
+ // TODO: Remove type arg https://github.com/dart-lang/sdk/issues/42227
+ var targetUrl = ArgumentError.checkNotNull<String>(
+ mapping.targetUrl, 'mapping.targetUrl');
+ _mappings[targetUrl] = mapping;
+ }
+
+  /// Encodes the mappings as a json map.
+ List toJson() => _mappings.values.map((v) => v.toJson()).toList();
+
+ @override
+ String toString() {
+ var buff = StringBuffer();
+ for (var map in _mappings.values) {
+ buff.write(map.toString());
+ }
+ return buff.toString();
+ }
+
+ bool containsMapping(String url) => _mappings.containsKey(url);
+
+ @override
+ SourceMapSpan? spanFor(int line, int column,
+ {Map<String, SourceFile>? files, String? uri}) {
+ // TODO: Remove type arg https://github.com/dart-lang/sdk/issues/42227
+ uri = ArgumentError.checkNotNull<String>(uri, 'uri');
+
+ // Find the longest suffix of the uri that matches the sourcemap
+ // where the suffix starts after a path segment boundary.
+ // We consider ":" and "/" as path segment boundaries so that
+ // "package:" uris can be handled with minimal special casing. Having a
+ // few false positive path segment boundaries is not a significant issue
+    // as we prefer the longest matching suffix.
+ // Using package:path `path.split` to find path segment boundaries would
+ // not generate all of the path segment boundaries we want for "package:"
+ // urls as "package:package_name" would be one path segment when we want
+ // "package" and "package_name" to be sepearate path segments.
+
+ var onBoundary = true;
+ var separatorCodeUnits = ['/'.codeUnitAt(0), ':'.codeUnitAt(0)];
+ for (var i = 0; i < uri.length; ++i) {
+ if (onBoundary) {
+ var candidate = uri.substring(i);
+ var candidateMapping = _mappings[candidate];
+ if (candidateMapping != null) {
+ return candidateMapping.spanFor(line, column,
+ files: files, uri: candidate);
+ }
+ }
+ onBoundary = separatorCodeUnits.contains(uri.codeUnitAt(i));
+ }
+
+    // Note: when there is no source map for a uri, this behaves like an
+ // identity function, returning the requested location as the result.
+
+ // Create a mock offset for the output location. We compute it in terms
+ // of the input line and column to minimize the chances that two different
+ // line and column locations are mapped to the same offset.
+ var offset = line * 1000000 + column;
+ var location = SourceLocation(offset,
+ line: line, column: column, sourceUrl: Uri.parse(uri));
+ return SourceMapSpan(location, location, '');
+ }
+}
+
+/// A map containing direct source mappings.
+class SingleMapping extends Mapping {
+ /// Source urls used in the mapping, indexed by id.
+ final List<String> urls;
+
+ /// Source names used in the mapping, indexed by id.
+ final List<String> names;
+
+ /// The [SourceFile]s to which the entries in [lines] refer.
+ ///
+ /// This is in the same order as [urls]. If this was constructed using
+ /// [SingleMapping.fromEntries], this contains files from any [FileLocation]s
+ /// used to build the mapping. If it was parsed from JSON, it contains files
+ /// for any sources whose contents were provided via the `"sourcesContent"`
+ /// field.
+ ///
+ /// Files whose contents aren't available are `null`.
+ final List<SourceFile?> files;
+
+ /// Entries indicating the beginning of each span.
+ final List<TargetLineEntry> lines;
+
+ /// Url of the target file.
+ String? targetUrl;
+
+ /// Source root prepended to all entries in [urls].
+ String? sourceRoot;
+
+ final Uri? _mapUrl;
+
+ final Map<String, dynamic> extensions;
+
+ SingleMapping._(this.targetUrl, this.files, this.urls, this.names, this.lines)
+ : _mapUrl = null,
+ extensions = {};
+
+ factory SingleMapping.fromEntries(Iterable<builder.Entry> entries,
+ [String? fileUrl]) {
+    // The entries need to be sorted by the target offsets.
+ var sourceEntries = entries.toList()..sort();
+ var lines = <TargetLineEntry>[];
+
+ // Indices associated with file urls that will be part of the source map. We
+ // rely on map order so that `urls.keys[urls[u]] == u`
+ var urls = <String, int>{};
+
+ // Indices associated with identifiers that will be part of the source map.
+ // We rely on map order so that `names.keys[names[n]] == n`
+ var names = <String, int>{};
+
+ /// The file for each URL, indexed by [urls]' values.
+ var files = <int, SourceFile>{};
+
+ int? lineNum;
+ late List<TargetEntry> targetEntries;
+ for (var sourceEntry in sourceEntries) {
+ if (lineNum == null || sourceEntry.target.line > lineNum) {
+ lineNum = sourceEntry.target.line;
+ targetEntries = <TargetEntry>[];
+ lines.add(TargetLineEntry(lineNum, targetEntries));
+ }
+
+ var sourceUrl = sourceEntry.source.sourceUrl;
+ var urlId = urls.putIfAbsent(
+ sourceUrl == null ? '' : sourceUrl.toString(), () => urls.length);
+
+ if (sourceEntry.source is FileLocation) {
+ files.putIfAbsent(
+ urlId, () => (sourceEntry.source as FileLocation).file);
+ }
+
+ var sourceEntryIdentifierName = sourceEntry.identifierName;
+ var srcNameId = sourceEntryIdentifierName == null
+ ? null
+ : names.putIfAbsent(sourceEntryIdentifierName, () => names.length);
+ targetEntries.add(TargetEntry(sourceEntry.target.column, urlId,
+ sourceEntry.source.line, sourceEntry.source.column, srcNameId));
+ }
+ return SingleMapping._(fileUrl, urls.values.map((i) => files[i]).toList(),
+ urls.keys.toList(), names.keys.toList(), lines);
+ }
+
+ SingleMapping.fromJson(Map<String, dynamic> map, {Object? mapUrl})
+ : targetUrl = map['file'] as String?,
+ urls = List<String>.from(map['sources'] as List),
+ names = List<String>.from((map['names'] as List?) ?? []),
+ files = List.filled((map['sources'] as List).length, null),
+ sourceRoot = map['sourceRoot'] as String?,
+ lines = <TargetLineEntry>[],
+ _mapUrl = mapUrl is String ? Uri.parse(mapUrl) : (mapUrl as Uri?),
+ extensions = {} {
+ var sourcesContent = map['sourcesContent'] == null
+ ? const <String?>[]
+ : List<String?>.from(map['sourcesContent'] as List);
+ for (var i = 0; i < urls.length && i < sourcesContent.length; i++) {
+ var source = sourcesContent[i];
+ if (source == null) continue;
+ files[i] = SourceFile.fromString(source, url: urls[i]);
+ }
+
+ var line = 0;
+ var column = 0;
+ var srcUrlId = 0;
+ var srcLine = 0;
+ var srcColumn = 0;
+ var srcNameId = 0;
+ var tokenizer = _MappingTokenizer(map['mappings'] as String);
+ var entries = <TargetEntry>[];
+
+ while (tokenizer.hasTokens) {
+ if (tokenizer.nextKind.isNewLine) {
+ if (entries.isNotEmpty) {
+ lines.add(TargetLineEntry(line, entries));
+ entries = <TargetEntry>[];
+ }
+ line++;
+ column = 0;
+ tokenizer._consumeNewLine();
+ continue;
+ }
+
+      // Decode the next entry, using the previously encountered values to
+ // decode the relative values.
+ //
+ // We expect 1, 4, or 5 values. If present, values are expected in the
+ // following order:
+ // 0: the starting column in the current line of the generated file
+ // 1: the id of the original source file
+ // 2: the starting line in the original source
+ // 3: the starting column in the original source
+ // 4: the id of the original symbol name
+      // The values are relative to the previously encountered values.
+ if (tokenizer.nextKind.isNewSegment) throw _segmentError(0, line);
+ column += tokenizer._consumeValue();
+ if (!tokenizer.nextKind.isValue) {
+ entries.add(TargetEntry(column));
+ } else {
+ srcUrlId += tokenizer._consumeValue();
+ if (srcUrlId >= urls.length) {
+ throw StateError(
+ 'Invalid source url id. $targetUrl, $line, $srcUrlId');
+ }
+ if (!tokenizer.nextKind.isValue) throw _segmentError(2, line);
+ srcLine += tokenizer._consumeValue();
+ if (!tokenizer.nextKind.isValue) throw _segmentError(3, line);
+ srcColumn += tokenizer._consumeValue();
+ if (!tokenizer.nextKind.isValue) {
+ entries.add(TargetEntry(column, srcUrlId, srcLine, srcColumn));
+ } else {
+ srcNameId += tokenizer._consumeValue();
+ if (srcNameId >= names.length) {
+ throw StateError('Invalid name id: $targetUrl, $line, $srcNameId');
+ }
+ entries.add(
+ TargetEntry(column, srcUrlId, srcLine, srcColumn, srcNameId));
+ }
+ }
+ if (tokenizer.nextKind.isNewSegment) tokenizer._consumeNewSegment();
+ }
+ if (entries.isNotEmpty) {
+ lines.add(TargetLineEntry(line, entries));
+ }
+
+ map.forEach((name, value) {
+ if (name.startsWith('x_')) extensions[name] = value;
+ });
+ }
+
+  /// Encodes the mappings as a json map.
+ ///
+ /// If [includeSourceContents] is `true`, this includes the source file
+ /// contents from [files] in the map if possible.
+ Map<String, dynamic> toJson({bool includeSourceContents = false}) {
+ var buff = StringBuffer();
+ var line = 0;
+ var column = 0;
+ var srcLine = 0;
+ var srcColumn = 0;
+ var srcUrlId = 0;
+ var srcNameId = 0;
+ var first = true;
+
+ for (var entry in lines) {
+ var nextLine = entry.line;
+ if (nextLine > line) {
+ for (var i = line; i < nextLine; ++i) {
+ buff.write(';');
+ }
+ line = nextLine;
+ column = 0;
+ first = true;
+ }
+
+ for (var segment in entry.entries) {
+ if (!first) buff.write(',');
+ first = false;
+ column = _append(buff, column, segment.column);
+
+ // Encoding can be just the column offset if there is no source
+ // information.
+ var newUrlId = segment.sourceUrlId;
+ if (newUrlId == null) continue;
+ srcUrlId = _append(buff, srcUrlId, newUrlId);
+ srcLine = _append(buff, srcLine, segment.sourceLine!);
+ srcColumn = _append(buff, srcColumn, segment.sourceColumn!);
+
+ if (segment.sourceNameId == null) continue;
+ srcNameId = _append(buff, srcNameId, segment.sourceNameId!);
+ }
+ }
+
+ var result = <String, dynamic>{
+ 'version': 3,
+ 'sourceRoot': sourceRoot ?? '',
+ 'sources': urls,
+ 'names': names,
+ 'mappings': buff.toString(),
+ };
+ if (targetUrl != null) result['file'] = targetUrl!;
+
+ if (includeSourceContents) {
+ result['sourcesContent'] = files.map((file) => file?.getText(0)).toList();
+ }
+ extensions.forEach((name, value) => result[name] = value);
+
+ return result;
+ }
+
+ /// Appends to [buff] a VLQ encoding of [newValue] using the difference
+ /// between [oldValue] and [newValue]
+ static int _append(StringBuffer buff, int oldValue, int newValue) {
+ buff.writeAll(encodeVlq(newValue - oldValue));
+ return newValue;
+ }
+
+ StateError _segmentError(int seen, int line) =>
+ StateError('Invalid entry in sourcemap, expected 1, 4, or 5'
+ ' values, but got $seen.\ntargeturl: $targetUrl, line: $line');
+
+  /// Returns the [TargetLineEntry] that contains the target [line].
+  /// In particular, the resulting entry is the last entry whose line
+  /// number is less than or equal to [line].
+ TargetLineEntry? _findLine(int line) {
+ var index = binarySearch(lines, (e) => e.line > line);
+ return (index <= 0) ? null : lines[index - 1];
+ }
+
+ /// Returns [TargetEntry] which includes the location denoted by
+ /// [line], [column]. If [lineEntry] corresponds to [line], then this will be
+  /// the last entry whose column is less than or equal to [column]. If
+ /// [lineEntry] corresponds to a line prior to [line], then the result will be
+ /// the very last entry on that line.
+ TargetEntry? _findColumn(int line, int column, TargetLineEntry? lineEntry) {
+ if (lineEntry == null || lineEntry.entries.isEmpty) return null;
+ if (lineEntry.line != line) return lineEntry.entries.last;
+ var entries = lineEntry.entries;
+ var index = binarySearch(entries, (e) => e.column > column);
+ return (index <= 0) ? null : entries[index - 1];
+ }
+
+ @override
+ SourceMapSpan? spanFor(int line, int column,
+ {Map<String, SourceFile>? files, String? uri}) {
+ var entry = _findColumn(line, column, _findLine(line));
+ if (entry == null) return null;
+
+ var sourceUrlId = entry.sourceUrlId;
+ if (sourceUrlId == null) return null;
+
+ var url = urls[sourceUrlId];
+ if (sourceRoot != null) {
+ url = '$sourceRoot$url';
+ }
+
+ var sourceNameId = entry.sourceNameId;
+ var file = files?[url];
+ if (file != null) {
+ var start = file.getOffset(entry.sourceLine!, entry.sourceColumn);
+ if (sourceNameId != null) {
+ var text = names[sourceNameId];
+ return SourceMapFileSpan(file.span(start, start + text.length),
+ isIdentifier: true);
+ } else {
+ return SourceMapFileSpan(file.location(start).pointSpan());
+ }
+ } else {
+ var start = SourceLocation(0,
+ sourceUrl: _mapUrl?.resolve(url) ?? url,
+ line: entry.sourceLine,
+ column: entry.sourceColumn);
+
+ // Offset and other context is not available.
+ if (sourceNameId != null) {
+ return SourceMapSpan.identifier(start, names[sourceNameId]);
+ } else {
+ return SourceMapSpan(start, start, '');
+ }
+ }
+ }
+
+ @override
+ String toString() {
+ return (StringBuffer('$runtimeType : [')
+ ..write('targetUrl: ')
+ ..write(targetUrl)
+ ..write(', sourceRoot: ')
+ ..write(sourceRoot)
+ ..write(', urls: ')
+ ..write(urls)
+ ..write(', names: ')
+ ..write(names)
+ ..write(', lines: ')
+ ..write(lines)
+ ..write(']'))
+ .toString();
+ }
+
+ String get debugString {
+ var buff = StringBuffer();
+ for (var lineEntry in lines) {
+ var line = lineEntry.line;
+ for (var entry in lineEntry.entries) {
+ buff
+ ..write(targetUrl)
+ ..write(': ')
+ ..write(line)
+ ..write(':')
+ ..write(entry.column);
+ var sourceUrlId = entry.sourceUrlId;
+ if (sourceUrlId != null) {
+ buff
+ ..write(' --> ')
+ ..write(sourceRoot)
+ ..write(urls[sourceUrlId])
+ ..write(': ')
+ ..write(entry.sourceLine)
+ ..write(':')
+ ..write(entry.sourceColumn);
+ }
+ var sourceNameId = entry.sourceNameId;
+ if (sourceNameId != null) {
+ buff
+ ..write(' (')
+ ..write(names[sourceNameId])
+ ..write(')');
+ }
+ buff.write('\n');
+ }
+ }
+ return buff.toString();
+ }
+}
+
+/// A line entry read from a source map.
+class TargetLineEntry {
+ final int line;
+ List<TargetEntry> entries;
+ TargetLineEntry(this.line, this.entries);
+
+ @override
+ String toString() => '$runtimeType: $line $entries';
+}
+
+/// A target segment entry read from a source map.
+class TargetEntry {
+ final int column;
+ final int? sourceUrlId;
+ final int? sourceLine;
+ final int? sourceColumn;
+ final int? sourceNameId;
+
+ TargetEntry(this.column,
+ [this.sourceUrlId,
+ this.sourceLine,
+ this.sourceColumn,
+ this.sourceNameId]);
+
+ @override
+ String toString() => '$runtimeType: '
+ '($column, $sourceUrlId, $sourceLine, $sourceColumn, $sourceNameId)';
+}
+
+/// A character iterator over a string that can peek one character ahead.
+class _MappingTokenizer implements Iterator<String> {
+ final String _internal;
+ final int _length;
+ int index = -1;
+ _MappingTokenizer(String internal)
+ : _internal = internal,
+ _length = internal.length;
+
+ // Iterator API is used by decodeVlq to consume VLQ entries.
+ @override
+ bool moveNext() => ++index < _length;
+
+ @override
+ String get current => (index >= 0 && index < _length)
+ ? _internal[index]
+ : throw RangeError.index(index, _internal);
+
+ bool get hasTokens => index < _length - 1 && _length > 0;
+
+ _TokenKind get nextKind {
+ if (!hasTokens) return _TokenKind.eof;
+ var next = _internal[index + 1];
+ if (next == ';') return _TokenKind.line;
+ if (next == ',') return _TokenKind.segment;
+ return _TokenKind.value;
+ }
+
+ int _consumeValue() => decodeVlq(this);
+ void _consumeNewLine() {
+ ++index;
+ }
+
+ void _consumeNewSegment() {
+ ++index;
+ }
+
+ // Print the state of the iterator, with colors indicating the current
+ // position.
+ @override
+ String toString() {
+ var buff = StringBuffer();
+ for (var i = 0; i < index; i++) {
+ buff.write(_internal[i]);
+ }
+    buff.write('\u001b[31m'); // ANSI: switch to red
+ try {
+ buff.write(current);
+ // TODO: Determine whether this try / catch can be removed.
+ // ignore: avoid_catching_errors
+ } on RangeError catch (_) {}
+    buff.write('\u001b[0m'); // ANSI: reset color
+ for (var i = index + 1; i < _internal.length; i++) {
+ buff.write(_internal[i]);
+ }
+ buff.write(' ($index)');
+ return buff.toString();
+ }
+}
+
+class _TokenKind {
+ static const _TokenKind line = _TokenKind(isNewLine: true);
+ static const _TokenKind segment = _TokenKind(isNewSegment: true);
+ static const _TokenKind eof = _TokenKind(isEof: true);
+ static const _TokenKind value = _TokenKind();
+ final bool isNewLine;
+ final bool isNewSegment;
+ final bool isEof;
+ bool get isValue => !isNewLine && !isNewSegment && !isEof;
+
+ const _TokenKind(
+ {this.isNewLine = false, this.isNewSegment = false, this.isEof = false});
+}
diff --git a/pkgs/source_maps/lib/printer.dart b/pkgs/source_maps/lib/printer.dart
new file mode 100644
index 0000000..32523d6
--- /dev/null
+++ b/pkgs/source_maps/lib/printer.dart
@@ -0,0 +1,262 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+/// Contains a code printer that generates code by recording the source maps.
+library;
+
+import 'package:source_span/source_span.dart';
+
+import 'builder.dart';
+import 'src/source_map_span.dart';
+import 'src/utils.dart';
+
+/// A simple printer that keeps track of offset locations and records source
+/// maps locations.
+class Printer {
+ final String filename;
+ final StringBuffer _buff = StringBuffer();
+ final SourceMapBuilder _maps = SourceMapBuilder();
+ String get text => _buff.toString();
+ String get map => _maps.toJson(filename);
+
+ /// Current source location mapping.
+ SourceLocation? _loc;
+
+  /// Current line in the buffer.
+ int _line = 0;
+
+ /// Current column in the buffer.
+ int _column = 0;
+
+ Printer(this.filename);
+
+  /// Add [str] contents to the output, tracking new lines so that positions
+  /// for span locations stay correct. When [projectMarks] is true, this method
+ /// adds a source map location on each new line, projecting that every new
+ /// line in the target file (printed here) corresponds to a new line in the
+ /// source file.
+ void add(String str, {bool projectMarks = false}) {
+ var chars = str.runes.toList();
+ var length = chars.length;
+ for (var i = 0; i < length; i++) {
+ var c = chars[i];
+ if (c == lineFeed ||
+ (c == carriageReturn &&
+ (i + 1 == length || chars[i + 1] != lineFeed))) {
+        // A carriage return not followed by a line feed is treated as a new line.
+ _line++;
+ _column = 0;
+ {
+ // **Warning**: Any calls to `mark` will change the value of `_loc`,
+ // so this local variable is no longer up to date after that point.
+ //
+ // This is why it has been put inside its own block to limit the
+ // scope in which it is available.
+ var loc = _loc;
+ if (projectMarks && loc != null) {
+ if (loc is FileLocation) {
+ var file = loc.file;
+ mark(file.location(file.getOffset(loc.line + 1)));
+ } else {
+ mark(SourceLocation(0,
+ sourceUrl: loc.sourceUrl, line: loc.line + 1, column: 0));
+ }
+ }
+ }
+ } else {
+ _column++;
+ }
+ }
+ _buff.write(str);
+ }
+
+ /// Append a [total] number of spaces in the target file. Typically used for
+ /// formatting indentation.
+ void addSpaces(int total) {
+ for (var i = 0; i < total; i++) {
+ _buff.write(' ');
+ }
+ _column += total;
+ }
+
+ /// Marks that the current point in the target file corresponds to the [mark]
+ /// in the source file, which can be either a [SourceLocation] or a
+ /// [SourceSpan]. When the mark is a [SourceMapSpan] with `isIdentifier` set,
+ /// this also records the name of the identifier in the source map
+ /// information.
+ void mark(Object mark) {
+ late final SourceLocation loc;
+ String? identifier;
+ if (mark is SourceLocation) {
+ loc = mark;
+ } else if (mark is SourceSpan) {
+ loc = mark.start;
+ if (mark is SourceMapSpan && mark.isIdentifier) identifier = mark.text;
+ }
+ _maps.addLocation(loc,
+ SourceLocation(_buff.length, line: _line, column: _column), identifier);
+ _loc = loc;
+ }
+}
+
+/// A more advanced printer that keeps track of offset locations to record
+/// source maps, but additionally allows nesting of different kinds of items,
+/// including [NestedPrinter]s, and it lets you automatically indent text.
+///
+/// This class is especially useful when doing code generation, where different
+/// pieces of the code are generated independently on separate printers, and are
+/// finally put together in the end.
+class NestedPrinter implements NestedItem {
+  /// Items recorded by this printer, which can be [String] literals,
+ /// [NestedItem]s, and source map information like [SourceLocation] and
+ /// [SourceSpan].
+ final List<Object> _items = [];
+
+ /// Internal buffer to merge consecutive strings added to this printer.
+ StringBuffer? _buff;
+
+ /// Current indentation, which can be updated from outside this class.
+ int indent;
+
+ /// [Printer] used during the last call to [build], if any.
+ Printer? printer;
+
+ /// Returns the text produced after calling [build].
+ String? get text => printer?.text;
+
+ /// Returns the source-map information produced after calling [build].
+ String? get map => printer?.map;
+
+ /// Item used to indicate that the following item is copied from the original
+ /// source code, and hence we should preserve source-maps on every new line.
+ static final _original = Object();
+
+ NestedPrinter([this.indent = 0]);
+
+ /// Adds [object] to this printer. [object] can be a [String],
+ /// [NestedPrinter], or anything implementing [NestedItem]. If [object] is a
+ /// [String], the value is appended directly, without doing any formatting
+ /// changes. If you wish to add a line of code with automatic indentation, use
+ /// [addLine] instead. [NestedPrinter]s and [NestedItem]s are not processed
+ /// until [build] gets called later on. We ensure that [build] emits every
+ /// object in the order that they were added to this printer.
+ ///
+ /// The [location] and [span] parameters indicate the corresponding source map
+ /// location of [object] in the original input. Only one, [location] or
+ /// [span], should be provided at a time.
+ ///
+ /// Indicate [isOriginal] when [object] is copied directly from the user code.
+ /// Setting [isOriginal] will make this printer propagate source map locations
+ /// on every line-break.
+ void add(Object object,
+ {SourceLocation? location, SourceSpan? span, bool isOriginal = false}) {
+ if (object is! String || location != null || span != null || isOriginal) {
+ _flush();
+ assert(location == null || span == null);
+ if (location != null) _items.add(location);
+ if (span != null) _items.add(span);
+ if (isOriginal) _items.add(_original);
+ }
+
+ if (object is String) {
+ _appendString(object);
+ } else {
+ _items.add(object);
+ }
+ }
+
+ /// Append `2 * indent` spaces to this printer.
+ void insertIndent() => _indent(indent);
+
+ /// Add a [line], autoindenting to the current value of [indent]. Note,
+ /// indentation is not inferred from the contents added to this printer. If a
+ /// line starts or ends an indentation block, you need to also update [indent]
+ /// accordingly. Also, indentation is not adapted for nested printers. If
+ /// you add a [NestedPrinter] to this printer, its indentation is set
+  /// separately and will not include any of the indentation set here.
+ ///
+ /// The [location] and [span] parameters indicate the corresponding source map
+ /// location of [line] in the original input. Only one, [location] or
+ /// [span], should be provided at a time.
+ void addLine(String? line, {SourceLocation? location, SourceSpan? span}) {
+ if (location != null || span != null) {
+ _flush();
+ assert(location == null || span == null);
+ if (location != null) _items.add(location);
+ if (span != null) _items.add(span);
+ }
+ if (line == null) return;
+ if (line != '') {
+ // We don't indent empty lines.
+ _indent(indent);
+ _appendString(line);
+ }
+ _appendString('\n');
+ }
+
+ /// Appends a string merging it with any previous strings, if possible.
+ void _appendString(String s) {
+ var buf = _buff ??= StringBuffer();
+ buf.write(s);
+ }
+
+ /// Adds all of the current [_buff] contents as a string item.
+ void _flush() {
+ if (_buff != null) {
+ _items.add(_buff.toString());
+ _buff = null;
+ }
+ }
+
+ void _indent(int indent) {
+ for (var i = 0; i < indent; i++) {
+ _appendString(' ');
+ }
+ }
+
+ /// Returns a string representation of all the contents appended to this
+ /// printer, including source map location tokens.
+ @override
+ String toString() {
+ _flush();
+ return (StringBuffer()..writeAll(_items)).toString();
+ }
+
+ /// Builds the output of this printer and source map information. After
+ /// calling this function, you can use [text] and [map] to retrieve the
+  /// generated code and source map information, respectively.
+ void build(String filename) {
+ writeTo(printer = Printer(filename));
+ }
+
+ /// Implements the [NestedItem] interface.
+ @override
+ void writeTo(Printer printer) {
+ _flush();
+ var propagate = false;
+ for (var item in _items) {
+ if (item is NestedItem) {
+ item.writeTo(printer);
+ } else if (item is String) {
+ printer.add(item, projectMarks: propagate);
+ propagate = false;
+ } else if (item is SourceLocation || item is SourceSpan) {
+ printer.mark(item);
+ } else if (item == _original) {
+        // The `_original` sentinel marks text that was copied verbatim from
+        // the original source. In that case, we propagate marks on every
+        // new line.
+ propagate = true;
+ } else {
+ throw UnsupportedError('Unknown item type: $item');
+ }
+ }
+ }
+}
+
+/// An item added to a [NestedPrinter].
+abstract class NestedItem {
+ /// Write the contents of this item into [printer].
+ void writeTo(Printer printer);
+}
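A minimal sketch (not part of the diff) of the `Printer` API above; the file names, input text, and offsets are made up.
```dart
import 'package:source_maps/source_maps.dart';
import 'package:source_span/source_span.dart';

void main() {
  // Hypothetical input; offsets 4..12 cover the identifier `longName`.
  var input = SourceFile.fromString('int longName = 1;\n', url: 'input.dart');

  var printer = Printer('output.js')
    // Record that the text emitted next corresponds to `longName` in the input.
    ..mark(input.span(4, 12))
    ..add('var n = 1;\n');

  print(printer.text); // the generated text
  print(printer.map);  // JSON source map with 'output.js' as the target file
}
```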
diff --git a/pkgs/source_maps/lib/refactor.dart b/pkgs/source_maps/lib/refactor.dart
new file mode 100644
index 0000000..a518a0c
--- /dev/null
+++ b/pkgs/source_maps/lib/refactor.dart
@@ -0,0 +1,140 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+/// Tools to help implement refactoring-like transformations to Dart code.
+///
+/// [TextEditTransaction] supports making a series of changes to a text buffer.
+/// [guessIndent] helps to guess the appropriate indentation for the new code.
+library;
+
+import 'package:source_span/source_span.dart';
+
+import 'printer.dart';
+import 'src/utils.dart';
+
+/// Editable text transaction.
+///
+/// Applies a series of edits using original location
+/// information, and composes them into the edited string.
+class TextEditTransaction {
+ final SourceFile? file;
+ final String original;
+ final _edits = <_TextEdit>[];
+
+ /// Creates a new transaction.
+ TextEditTransaction(this.original, this.file);
+
+ bool get hasEdits => _edits.isNotEmpty;
+
+  /// Edit the original text, replacing the text between [begin] and [end]
+  /// with [replacement]. [replacement] can be either a string or a
+ /// [NestedPrinter].
+ void edit(int begin, int end, Object replacement) {
+ _edits.add(_TextEdit(begin, end, replacement));
+ }
+
+ /// Create a source map [SourceLocation] for [offset], if [file] is not
+ /// `null`.
+ SourceLocation? _loc(int offset) => file?.location(offset);
+
+ /// Applies all pending [edit]s and returns a [NestedPrinter] containing the
+ /// rewritten string and source map information. [file]`.location` is given to
+ /// the underlying printer to indicate the name of the generated file that
+  /// will contain the source map information.
+ ///
+ /// Throws [UnsupportedError] if the edits were overlapping. If no edits were
+ /// made, the printer simply contains the original string.
+ NestedPrinter commit() {
+ var printer = NestedPrinter();
+ if (_edits.isEmpty) {
+ return printer..add(original, location: _loc(0), isOriginal: true);
+ }
+
+ // Sort edits by start location.
+ _edits.sort();
+
+ var consumed = 0;
+ for (var edit in _edits) {
+ if (consumed > edit.begin) {
+ var sb = StringBuffer();
+ sb
+ ..write(file?.location(edit.begin).toolString)
+ ..write(': overlapping edits. Insert at offset ')
+ ..write(edit.begin)
+ ..write(' but have consumed ')
+ ..write(consumed)
+ ..write(' input characters. List of edits:');
+ for (var e in _edits) {
+ sb
+ ..write('\n ')
+ ..write(e);
+ }
+ throw UnsupportedError(sb.toString());
+ }
+
+ // Add characters from the original string between this edit and the last
+ // one, if any.
+ var betweenEdits = original.substring(consumed, edit.begin);
+ printer
+ ..add(betweenEdits, location: _loc(consumed), isOriginal: true)
+ ..add(edit.replace, location: _loc(edit.begin));
+ consumed = edit.end;
+ }
+
+ // Add any text from the end of the original string that was not replaced.
+ printer.add(original.substring(consumed),
+ location: _loc(consumed), isOriginal: true);
+ return printer;
+ }
+}
+
+class _TextEdit implements Comparable<_TextEdit> {
+ final int begin;
+ final int end;
+
+ /// The replacement used by the edit, can be a string or a [NestedPrinter].
+ final Object replace;
+
+ _TextEdit(this.begin, this.end, this.replace);
+
+ int get length => end - begin;
+
+ @override
+ String toString() => '(Edit @ $begin,$end: "$replace")';
+
+ @override
+ int compareTo(_TextEdit other) {
+ var diff = begin - other.begin;
+ if (diff != 0) return diff;
+ return end - other.end;
+ }
+}
+
+/// Returns all whitespace characters at the start of [charOffset]'s line.
+String guessIndent(String code, int charOffset) {
+ // Find the beginning of the line
+ var lineStart = 0;
+ for (var i = charOffset - 1; i >= 0; i--) {
+ var c = code.codeUnitAt(i);
+ if (c == lineFeed || c == carriageReturn) {
+ lineStart = i + 1;
+ break;
+ }
+ }
+
+ // Grab all the whitespace
+ var whitespaceEnd = code.length;
+ for (var i = lineStart; i < code.length; i++) {
+ var c = code.codeUnitAt(i);
+ if (c != _space && c != _tab) {
+ whitespaceEnd = i;
+ break;
+ }
+ }
+
+ return code.substring(lineStart, whitespaceEnd);
+}
+
+const int _tab = 9;
+const int _space = 32;
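A minimal sketch (not part of the diff) of the `TextEditTransaction` flow above; the input string and edit offsets are hypothetical.
```dart
import 'package:source_maps/refactor.dart';
import 'package:source_span/source_span.dart';

void main() {
  const original = 'final x = oldName(1);\n';
  var file = SourceFile.fromString(original, url: 'input.dart');

  // Replace `oldName` (offsets 10..17) with `newName`.
  var transaction = TextEditTransaction(original, file)
    ..edit(10, 17, 'newName');

  // `commit` applies the edits; `build` produces the text and the source map.
  var printer = transaction.commit()..build('output.dart');
  print(printer.text); // final x = newName(1);
  print(printer.map);  // source map pointing back into input.dart
}
```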
diff --git a/pkgs/source_maps/lib/source_maps.dart b/pkgs/source_maps/lib/source_maps.dart
new file mode 100644
index 0000000..244dee7
--- /dev/null
+++ b/pkgs/source_maps/lib/source_maps.dart
@@ -0,0 +1,38 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+/// Library to create and parse source maps.
+///
+/// Create a source map using [SourceMapBuilder]. For example:
+///
+/// ```dart
+/// var json = (SourceMapBuilder()
+///     ..addSpan(inputSpan1, outputSpan1)
+///     ..addSpan(inputSpan2, outputSpan2)
+///     ..addSpan(inputSpan3, outputSpan3))
+///     .toJson(outputFile);
+/// ```
+///
+/// Use the source_span package's [SourceSpan] and [SourceFile] classes to
+/// specify span locations.
+///
+/// Parse a source map using [parse], and call `spanFor` on the returned mapping
+/// object. For example:
+///
+/// ```dart
+/// var mapping = parse(json);
+/// mapping.spanFor(outputSpan1.line, outputSpan1.column);
+/// ```
+library;
+
+import 'package:source_span/source_span.dart';
+
+import 'builder.dart';
+import 'parser.dart';
+
+export 'builder.dart';
+export 'parser.dart';
+export 'printer.dart';
+export 'refactor.dart';
+export 'src/source_map_span.dart';
diff --git a/pkgs/source_maps/lib/src/source_map_span.dart b/pkgs/source_maps/lib/src/source_map_span.dart
new file mode 100644
index 0000000..aad8a32
--- /dev/null
+++ b/pkgs/source_maps/lib/src/source_map_span.dart
@@ -0,0 +1,72 @@
+// Copyright (c) 2014, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:source_span/source_span.dart';
+
+/// A [SourceSpan] for spans coming from or being written to source maps.
+///
+/// These spans have an extra piece of metadata: whether or not they represent
+/// an identifier (see [isIdentifier]).
+class SourceMapSpan extends SourceSpanBase {
+ /// Whether this span represents an identifier.
+ ///
+ /// If this is `true`, [text] is the value of the identifier.
+ final bool isIdentifier;
+
+ SourceMapSpan(super.start, super.end, super.text,
+ {this.isIdentifier = false});
+
+ /// Creates a [SourceMapSpan] for an identifier with value [text] starting at
+ /// [start].
+ ///
+ /// The [end] location is determined by adding [text] to [start].
+ SourceMapSpan.identifier(SourceLocation start, String text)
+ : this(
+ start,
+ SourceLocation(start.offset + text.length,
+ sourceUrl: start.sourceUrl,
+ line: start.line,
+ column: start.column + text.length),
+ text,
+ isIdentifier: true);
+}
+
+/// A wrapper around a [FileSpan] that implements [SourceMapSpan].
+class SourceMapFileSpan implements SourceMapSpan, FileSpan {
+ final FileSpan _inner;
+ @override
+ final bool isIdentifier;
+
+ @override
+ SourceFile get file => _inner.file;
+ @override
+ FileLocation get start => _inner.start;
+ @override
+ FileLocation get end => _inner.end;
+ @override
+ String get text => _inner.text;
+ @override
+ String get context => _inner.context;
+ @override
+ Uri? get sourceUrl => _inner.sourceUrl;
+ @override
+ int get length => _inner.length;
+
+ SourceMapFileSpan(this._inner, {this.isIdentifier = false});
+
+ @override
+ int compareTo(SourceSpan other) => _inner.compareTo(other);
+ @override
+ String highlight({Object? color}) => _inner.highlight(color: color);
+ @override
+ SourceSpan union(SourceSpan other) => _inner.union(other);
+ @override
+ FileSpan expand(FileSpan other) => _inner.expand(other);
+ @override
+ String message(String message, {Object? color}) =>
+ _inner.message(message, color: color);
+ @override
+ String toString() =>
+ _inner.toString().replaceAll('FileSpan', 'SourceMapFileSpan');
+}
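A small sketch (not part of the diff) of the `SourceMapSpan.identifier` behavior documented above, using hypothetical offsets.
```dart
import 'package:source_maps/source_maps.dart';
import 'package:source_span/source_span.dart';

void main() {
  // An identifier `longVar1` starting at offset 30, line 1, column 4.
  var start = SourceLocation(30, line: 1, column: 4, sourceUrl: 'input.dart');
  var span = SourceMapSpan.identifier(start, 'longVar1');

  print(span.isIdentifier);                     // true
  print(span.end.offset);                       // 38 (start + text length)
  print('${span.end.line}:${span.end.column}'); // 1:12
}
```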
diff --git a/pkgs/source_maps/lib/src/utils.dart b/pkgs/source_maps/lib/src/utils.dart
new file mode 100644
index 0000000..ba04fbb
--- /dev/null
+++ b/pkgs/source_maps/lib/src/utils.dart
@@ -0,0 +1,32 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+/// Utilities that shouldn't be in this package.
+library;
+
+/// Finds the first entry in a sorted [list] that matches a monotonic predicate.
+/// Given a result `n`, no item before `n` matches, while `n` and every item
+/// after it match. The result is -1 when there are no items, 0 when all items
+/// match, and `list.length` when none do.
+// TODO(sigmund): remove this function after dartbug.com/5624 is fixed.
+int binarySearch<T>(List<T> list, bool Function(T) matches) {
+ if (list.isEmpty) return -1;
+ if (matches(list.first)) return 0;
+ if (!matches(list.last)) return list.length;
+
+ var min = 0;
+ var max = list.length - 1;
+ while (min < max) {
+ var half = min + ((max - min) ~/ 2);
+ if (matches(list[half])) {
+ max = half;
+ } else {
+ min = half + 1;
+ }
+ }
+ return max;
+}
+
+const int lineFeed = 10;
+const int carriageReturn = 13;
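A small sketch (not part of the diff) of the `binarySearch` contract described above; `src/utils.dart` is package-internal, so this is purely illustrative.
```dart
import 'package:source_maps/src/utils.dart';

void main() {
  var list = [1, 3, 5, 7];
  print(binarySearch(list, (e) => e >= 5));  // 2: first matching index
  print(binarySearch(list, (e) => e >= 0));  // 0: every item matches
  print(binarySearch(list, (e) => e > 10));  // 4: list.length, nothing matches
  print(binarySearch(<int>[], (e) => true)); // -1: empty list
}
```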
diff --git a/pkgs/source_maps/lib/src/vlq.dart b/pkgs/source_maps/lib/src/vlq.dart
new file mode 100644
index 0000000..3b0562d
--- /dev/null
+++ b/pkgs/source_maps/lib/src/vlq.dart
@@ -0,0 +1,101 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+/// Utilities to encode and decode VLQ values used in source maps.
+///
+/// Sourcemaps are encoded with variable length numbers as base64 encoded
+/// strings with the least significant digit coming first. Each base64 digit
+/// encodes a 5-bit value (0-31) and a continuation bit. Signed values can be
+/// represented by using the least significant bit of the value as the sign bit.
+///
+/// For more details see the source map [version 3 documentation](https://docs.google.com/document/d/1U1RGAehQwRypUTovF1KRlpiOFze0b-_2gc6fAH0KY0k/edit?usp=sharing).
+library;
+
+import 'dart:math';
+
+const int vlqBaseShift = 5;
+
+const int vlqBaseMask = (1 << 5) - 1;
+
+const int vlqContinuationBit = 1 << 5;
+
+const int vlqContinuationMask = 1 << 5;
+
+const String base64Digits =
+ 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/';
+
+final Map<String, int> _digits = () {
+ var map = <String, int>{};
+ for (var i = 0; i < 64; i++) {
+ map[base64Digits[i]] = i;
+ }
+ return map;
+}();
+
+final int maxInt32 = (pow(2, 31) as int) - 1;
+final int minInt32 = -(pow(2, 31) as int);
+
+/// Creates the VLQ encoding of [value] as a sequence of characters
+Iterable<String> encodeVlq(int value) {
+ if (value < minInt32 || value > maxInt32) {
+ throw ArgumentError('expected 32 bit int, got: $value');
+ }
+ var res = <String>[];
+ var signBit = 0;
+ if (value < 0) {
+ signBit = 1;
+ value = -value;
+ }
+ value = (value << 1) | signBit;
+ do {
+ var digit = value & vlqBaseMask;
+ value >>= vlqBaseShift;
+ if (value > 0) {
+ digit |= vlqContinuationBit;
+ }
+ res.add(base64Digits[digit]);
+ } while (value > 0);
+ return res;
+}
+
+/// Decodes a value written as a sequence of VLQ characters. The first input
+/// character will be `chars.current` after calling `chars.moveNext` once. The
+/// iterator is advanced until a stop character is found (a character without
+/// the [vlqContinuationBit]).
+int decodeVlq(Iterator<String> chars) {
+ var result = 0;
+ var stop = false;
+ var shift = 0;
+ while (!stop) {
+ if (!chars.moveNext()) throw StateError('incomplete VLQ value');
+ var char = chars.current;
+ var digit = _digits[char];
+ if (digit == null) {
+ throw FormatException('invalid character in VLQ encoding: $char');
+ }
+ stop = (digit & vlqContinuationBit) == 0;
+ digit &= vlqBaseMask;
+ result += digit << shift;
+ shift += vlqBaseShift;
+ }
+
+ // Result uses the least significant bit as a sign bit. We convert it into a
+  // two's-complement value. For example,
+ // 2 (10 binary) becomes 1
+ // 3 (11 binary) becomes -1
+ // 4 (100 binary) becomes 2
+ // 5 (101 binary) becomes -2
+ // 6 (110 binary) becomes 3
+ // 7 (111 binary) becomes -3
+ var negate = (result & 1) == 1;
+ result = result >> 1;
+ result = negate ? -result : result;
+
+ // TODO(sigmund): can we detect this earlier?
+ if (result < minInt32 || result > maxInt32) {
+ throw FormatException(
+ 'expected an encoded 32 bit int, but we got: $result');
+ }
+ return result;
+}
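A small sketch (not part of the diff) of the VLQ helpers above; `src/vlq.dart` is package-internal, so this is only for illustration.
```dart
import 'package:source_maps/src/vlq.dart';

void main() {
  // 4 -> shifted left with a 0 sign bit -> 8 -> base64 digit 'I'.
  print(encodeVlq(4).join());  // I
  // -1 -> (1 << 1) | 1 == 3 -> base64 digit 'D'.
  print(encodeVlq(-1).join()); // D
  // Decoding consumes characters from the iterator until a digit without the
  // continuation bit is found.
  print(decodeVlq('I'.split('').iterator)); // 4
}
```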
diff --git a/pkgs/source_maps/pubspec.yaml b/pkgs/source_maps/pubspec.yaml
new file mode 100644
index 0000000..32cbf4f
--- /dev/null
+++ b/pkgs/source_maps/pubspec.yaml
@@ -0,0 +1,15 @@
+name: source_maps
+version: 0.10.14-wip
+description: A library to programmatically manipulate source map files.
+repository: https://github.com/dart-lang/tools/tree/main/pkgs/source_maps
+
+environment:
+ sdk: ^3.3.0
+
+dependencies:
+ source_span: ^1.8.0
+
+dev_dependencies:
+ dart_flutter_team_lints: ^3.0.0
+ term_glyph: ^1.2.0
+ test: ^1.16.0
diff --git a/pkgs/source_maps/test/builder_test.dart b/pkgs/source_maps/test/builder_test.dart
new file mode 100644
index 0000000..4f773e7
--- /dev/null
+++ b/pkgs/source_maps/test/builder_test.dart
@@ -0,0 +1,32 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:convert';
+
+import 'package:source_maps/source_maps.dart';
+import 'package:test/test.dart';
+
+import 'common.dart';
+
+void main() {
+ test('builder - with span', () {
+ var map = (SourceMapBuilder()
+ ..addSpan(inputVar1, outputVar1)
+ ..addSpan(inputFunction, outputFunction)
+ ..addSpan(inputVar2, outputVar2)
+ ..addSpan(inputExpr, outputExpr))
+ .build(output.url.toString());
+ expect(map, equals(expectedMap));
+ });
+
+ test('builder - with location', () {
+ var str = (SourceMapBuilder()
+ ..addLocation(inputVar1.start, outputVar1.start, 'longVar1')
+ ..addLocation(inputFunction.start, outputFunction.start, 'longName')
+ ..addLocation(inputVar2.start, outputVar2.start, 'longVar2')
+ ..addLocation(inputExpr.start, outputExpr.start, null))
+ .toJson(output.url.toString());
+ expect(str, jsonEncode(expectedMap));
+ });
+}
diff --git a/pkgs/source_maps/test/common.dart b/pkgs/source_maps/test/common.dart
new file mode 100644
index 0000000..e225ff5
--- /dev/null
+++ b/pkgs/source_maps/test/common.dart
@@ -0,0 +1,107 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+/// Common input/output used by builder, parser and end2end tests
+library;
+
+import 'package:source_maps/source_maps.dart';
+import 'package:source_span/source_span.dart';
+import 'package:test/test.dart';
+
+/// Content of the source file
+const String inputContent = '''
+/** this is a comment. */
+int longVar1 = 3;
+
+// this is a comment too
+int longName(int longVar2) {
+ return longVar1 + longVar2;
+}
+''';
+final input = SourceFile.fromString(inputContent, url: 'input.dart');
+
+/// A span in the input file
+SourceMapSpan ispan(int start, int end, [bool isIdentifier = false]) =>
+ SourceMapFileSpan(input.span(start, end), isIdentifier: isIdentifier);
+
+SourceMapSpan inputVar1 = ispan(30, 38, true);
+SourceMapSpan inputFunction = ispan(74, 82, true);
+SourceMapSpan inputVar2 = ispan(87, 95, true);
+
+SourceMapSpan inputVar1NoSymbol = ispan(30, 38);
+SourceMapSpan inputFunctionNoSymbol = ispan(74, 82);
+SourceMapSpan inputVar2NoSymbol = ispan(87, 95);
+
+SourceMapSpan inputExpr = ispan(108, 127);
+
+/// Content of the target file
+const String outputContent = '''
+var x = 3;
+f(y) => x + y;
+''';
+final output = SourceFile.fromString(outputContent, url: 'output.dart');
+
+/// A span in the output file
+SourceMapSpan ospan(int start, int end, [bool isIdentifier = false]) =>
+ SourceMapFileSpan(output.span(start, end), isIdentifier: isIdentifier);
+
+SourceMapSpan outputVar1 = ospan(4, 5, true);
+SourceMapSpan outputFunction = ospan(11, 12, true);
+SourceMapSpan outputVar2 = ospan(13, 14, true);
+SourceMapSpan outputVar1NoSymbol = ospan(4, 5);
+SourceMapSpan outputFunctionNoSymbol = ospan(11, 12);
+SourceMapSpan outputVar2NoSymbol = ospan(13, 14);
+SourceMapSpan outputExpr = ospan(19, 24);
+
+/// Expected output mapping when recording the following four mappings:
+/// inputVar1 <= outputVar1
+/// inputFunction <= outputFunction
+/// inputVar2 <= outputVar2
+/// inputExpr <= outputExpr
+///
+/// This mapping is stored in the tests so we can independently test the builder
+/// and parser algorithms without relying entirely on end2end tests.
+const Map<String, dynamic> expectedMap = {
+ 'version': 3,
+ 'sourceRoot': '',
+ 'sources': ['input.dart'],
+ 'names': ['longVar1', 'longName', 'longVar2'],
+ 'mappings': 'IACIA;AAGAC,EAAaC,MACR',
+ 'file': 'output.dart'
+};
+
+void check(SourceSpan outputSpan, Mapping mapping, SourceMapSpan inputSpan,
+ bool realOffsets) {
+ var line = outputSpan.start.line;
+ var column = outputSpan.start.column;
+ var files = realOffsets ? {'input.dart': input} : null;
+ var span = mapping.spanFor(line, column, files: files)!;
+ var span2 = mapping.spanForLocation(outputSpan.start, files: files)!;
+
+ // Both mapping APIs are equivalent.
+ expect(span.start.offset, span2.start.offset);
+ expect(span.start.line, span2.start.line);
+ expect(span.start.column, span2.start.column);
+ expect(span.end.offset, span2.end.offset);
+ expect(span.end.line, span2.end.line);
+ expect(span.end.column, span2.end.column);
+
+ // Mapping matches our input location (modulo using real offsets)
+ expect(span.start.line, inputSpan.start.line);
+ expect(span.start.column, inputSpan.start.column);
+ expect(span.sourceUrl, inputSpan.sourceUrl);
+ expect(span.start.offset, realOffsets ? inputSpan.start.offset : 0);
+
+ // Mapping includes the identifier, if any
+ if (inputSpan.isIdentifier) {
+ expect(span.end.line, inputSpan.end.line);
+ expect(span.end.column, inputSpan.end.column);
+ expect(span.end.offset, span.start.offset + inputSpan.text.length);
+ if (realOffsets) expect(span.end.offset, inputSpan.end.offset);
+ } else {
+ expect(span.end.offset, span.start.offset);
+ expect(span.end.line, span.start.line);
+ expect(span.end.column, span.start.column);
+ }
+}
diff --git a/pkgs/source_maps/test/end2end_test.dart b/pkgs/source_maps/test/end2end_test.dart
new file mode 100644
index 0000000..84dd5ba
--- /dev/null
+++ b/pkgs/source_maps/test/end2end_test.dart
@@ -0,0 +1,160 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:source_maps/source_maps.dart';
+import 'package:source_span/source_span.dart';
+import 'package:test/test.dart';
+
+import 'common.dart';
+
+void main() {
+ test('end-to-end setup', () {
+ expect(inputVar1.text, 'longVar1');
+ expect(inputFunction.text, 'longName');
+ expect(inputVar2.text, 'longVar2');
+ expect(inputVar1NoSymbol.text, 'longVar1');
+ expect(inputFunctionNoSymbol.text, 'longName');
+ expect(inputVar2NoSymbol.text, 'longVar2');
+ expect(inputExpr.text, 'longVar1 + longVar2');
+
+ expect(outputVar1.text, 'x');
+ expect(outputFunction.text, 'f');
+ expect(outputVar2.text, 'y');
+ expect(outputVar1NoSymbol.text, 'x');
+ expect(outputFunctionNoSymbol.text, 'f');
+ expect(outputVar2NoSymbol.text, 'y');
+ expect(outputExpr.text, 'x + y');
+ });
+
+ test('build + parse', () {
+ var map = (SourceMapBuilder()
+ ..addSpan(inputVar1, outputVar1)
+ ..addSpan(inputFunction, outputFunction)
+ ..addSpan(inputVar2, outputVar2)
+ ..addSpan(inputExpr, outputExpr))
+ .build(output.url.toString());
+ var mapping = parseJson(map);
+ check(outputVar1, mapping, inputVar1, false);
+ check(outputVar2, mapping, inputVar2, false);
+ check(outputFunction, mapping, inputFunction, false);
+ check(outputExpr, mapping, inputExpr, false);
+ });
+
+ test('build + parse - no symbols', () {
+ var map = (SourceMapBuilder()
+ ..addSpan(inputVar1NoSymbol, outputVar1NoSymbol)
+ ..addSpan(inputFunctionNoSymbol, outputFunctionNoSymbol)
+ ..addSpan(inputVar2NoSymbol, outputVar2NoSymbol)
+ ..addSpan(inputExpr, outputExpr))
+ .build(output.url.toString());
+ var mapping = parseJson(map);
+ check(outputVar1NoSymbol, mapping, inputVar1NoSymbol, false);
+ check(outputVar2NoSymbol, mapping, inputVar2NoSymbol, false);
+ check(outputFunctionNoSymbol, mapping, inputFunctionNoSymbol, false);
+ check(outputExpr, mapping, inputExpr, false);
+ });
+
+ test('build + parse, repeated entries', () {
+ var map = (SourceMapBuilder()
+ ..addSpan(inputVar1, outputVar1)
+ ..addSpan(inputVar1, outputVar1)
+ ..addSpan(inputFunction, outputFunction)
+ ..addSpan(inputFunction, outputFunction)
+ ..addSpan(inputVar2, outputVar2)
+ ..addSpan(inputVar2, outputVar2)
+ ..addSpan(inputExpr, outputExpr)
+ ..addSpan(inputExpr, outputExpr))
+ .build(output.url.toString());
+ var mapping = parseJson(map);
+ check(outputVar1, mapping, inputVar1, false);
+ check(outputVar2, mapping, inputVar2, false);
+ check(outputFunction, mapping, inputFunction, false);
+ check(outputExpr, mapping, inputExpr, false);
+ });
+
+ test('build + parse - no symbols, repeated entries', () {
+ var map = (SourceMapBuilder()
+ ..addSpan(inputVar1NoSymbol, outputVar1NoSymbol)
+ ..addSpan(inputVar1NoSymbol, outputVar1NoSymbol)
+ ..addSpan(inputFunctionNoSymbol, outputFunctionNoSymbol)
+ ..addSpan(inputFunctionNoSymbol, outputFunctionNoSymbol)
+ ..addSpan(inputVar2NoSymbol, outputVar2NoSymbol)
+ ..addSpan(inputVar2NoSymbol, outputVar2NoSymbol)
+ ..addSpan(inputExpr, outputExpr))
+ .build(output.url.toString());
+ var mapping = parseJson(map);
+ check(outputVar1NoSymbol, mapping, inputVar1NoSymbol, false);
+ check(outputVar2NoSymbol, mapping, inputVar2NoSymbol, false);
+ check(outputFunctionNoSymbol, mapping, inputFunctionNoSymbol, false);
+ check(outputExpr, mapping, inputExpr, false);
+ });
+
+ test('build + parse with file', () {
+ var json = (SourceMapBuilder()
+ ..addSpan(inputVar1, outputVar1)
+ ..addSpan(inputFunction, outputFunction)
+ ..addSpan(inputVar2, outputVar2)
+ ..addSpan(inputExpr, outputExpr))
+ .toJson(output.url.toString());
+ var mapping = parse(json);
+ check(outputVar1, mapping, inputVar1, true);
+ check(outputVar2, mapping, inputVar2, true);
+ check(outputFunction, mapping, inputFunction, true);
+ check(outputExpr, mapping, inputExpr, true);
+ });
+
+ test('printer projecting marks + parse', () {
+ var out = inputContent.replaceAll('long', '_s');
+ var file = SourceFile.fromString(out, url: 'output2.dart');
+ var printer = Printer('output2.dart');
+ printer.mark(ispan(0, 0));
+
+ var segments = inputContent.split('long');
+ expect(segments.length, 6);
+ printer.add(segments[0], projectMarks: true);
+ printer.mark(inputVar1);
+ printer.add('_s');
+ printer.add(segments[1], projectMarks: true);
+ printer.mark(inputFunction);
+ printer.add('_s');
+ printer.add(segments[2], projectMarks: true);
+ printer.mark(inputVar2);
+ printer.add('_s');
+ printer.add(segments[3], projectMarks: true);
+ printer.mark(inputExpr);
+ printer.add('_s');
+ printer.add(segments[4], projectMarks: true);
+ printer.add('_s');
+ printer.add(segments[5], projectMarks: true);
+
+ expect(printer.text, out);
+
+ var mapping = parse(printer.map);
+ void checkHelper(SourceMapSpan inputSpan, int adjustment) {
+ var start = inputSpan.start.offset - adjustment;
+ var end = (inputSpan.end.offset - adjustment) - 2;
+ var span = SourceMapFileSpan(file.span(start, end),
+ isIdentifier: inputSpan.isIdentifier);
+ check(span, mapping, inputSpan, true);
+ }
+
+ checkHelper(inputVar1, 0);
+ checkHelper(inputFunction, 2);
+ checkHelper(inputVar2, 4);
+ checkHelper(inputExpr, 6);
+
+    // We correctly projected lines that have no mappings
+ check(file.span(66, 66), mapping, ispan(45, 45), true);
+ check(file.span(63, 64), mapping, ispan(45, 45), true);
+ check(file.span(68, 68), mapping, ispan(70, 70), true);
+ check(file.span(71, 71), mapping, ispan(70, 70), true);
+
+ // Start of the last line
+ var oOffset = out.length - 2;
+ var iOffset = inputContent.length - 2;
+ check(file.span(oOffset, oOffset), mapping, ispan(iOffset, iOffset), true);
+ check(file.span(oOffset + 1, oOffset + 1), mapping, ispan(iOffset, iOffset),
+ true);
+ });
+}
diff --git a/pkgs/source_maps/test/parser_test.dart b/pkgs/source_maps/test/parser_test.dart
new file mode 100644
index 0000000..6cfe928
--- /dev/null
+++ b/pkgs/source_maps/test/parser_test.dart
@@ -0,0 +1,431 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+// ignore_for_file: inference_failure_on_collection_literal
+// ignore_for_file: inference_failure_on_instance_creation
+
+import 'dart:convert';
+
+import 'package:source_maps/source_maps.dart';
+import 'package:source_span/source_span.dart';
+import 'package:test/test.dart';
+
+import 'common.dart';
+
+const Map<String, dynamic> _mapWithNoSourceLocation = {
+ 'version': 3,
+ 'sourceRoot': '',
+ 'sources': ['input.dart'],
+ 'names': [],
+ 'mappings': 'A',
+ 'file': 'output.dart'
+};
+
+const Map<String, dynamic> _mapWithSourceLocation = {
+ 'version': 3,
+ 'sourceRoot': '',
+ 'sources': ['input.dart'],
+ 'names': [],
+ 'mappings': 'AAAA',
+ 'file': 'output.dart'
+};
+
+const Map<String, dynamic> _mapWithSourceLocationAndMissingNames = {
+ 'version': 3,
+ 'sourceRoot': '',
+ 'sources': ['input.dart'],
+ 'mappings': 'AAAA',
+ 'file': 'output.dart'
+};
+
+const Map<String, dynamic> _mapWithSourceLocationAndName = {
+ 'version': 3,
+ 'sourceRoot': '',
+ 'sources': ['input.dart'],
+ 'names': ['var'],
+ 'mappings': 'AAAAA',
+ 'file': 'output.dart'
+};
+
+const Map<String, dynamic> _mapWithSourceLocationAndName1 = {
+ 'version': 3,
+ 'sourceRoot': 'pkg/',
+ 'sources': ['input1.dart'],
+ 'names': ['var1'],
+ 'mappings': 'AAAAA',
+ 'file': 'output.dart'
+};
+
+const Map<String, dynamic> _mapWithSourceLocationAndName2 = {
+ 'version': 3,
+ 'sourceRoot': 'pkg/',
+ 'sources': ['input2.dart'],
+ 'names': ['var2'],
+ 'mappings': 'AAAAA',
+ 'file': 'output2.dart'
+};
+
+const Map<String, dynamic> _mapWithSourceLocationAndName3 = {
+ 'version': 3,
+ 'sourceRoot': 'pkg/',
+ 'sources': ['input3.dart'],
+ 'names': ['var3'],
+ 'mappings': 'AAAAA',
+ 'file': '3/output.dart'
+};
+
+const _sourceMapBundle = [
+ _mapWithSourceLocationAndName1,
+ _mapWithSourceLocationAndName2,
+ _mapWithSourceLocationAndName3,
+];
+
+void main() {
+ test('parse', () {
+ var mapping = parseJson(expectedMap);
+ check(outputVar1, mapping, inputVar1, false);
+ check(outputVar2, mapping, inputVar2, false);
+ check(outputFunction, mapping, inputFunction, false);
+ check(outputExpr, mapping, inputExpr, false);
+ });
+
+ test('parse + json', () {
+ var mapping = parse(jsonEncode(expectedMap));
+ check(outputVar1, mapping, inputVar1, false);
+ check(outputVar2, mapping, inputVar2, false);
+ check(outputFunction, mapping, inputFunction, false);
+ check(outputExpr, mapping, inputExpr, false);
+ });
+
+ test('parse with file', () {
+ var mapping = parseJson(expectedMap);
+ check(outputVar1, mapping, inputVar1, true);
+ check(outputVar2, mapping, inputVar2, true);
+ check(outputFunction, mapping, inputFunction, true);
+ check(outputExpr, mapping, inputExpr, true);
+ });
+
+ test('parse with no source location', () {
+ var map = parse(jsonEncode(_mapWithNoSourceLocation)) as SingleMapping;
+ expect(map.lines.length, 1);
+ expect(map.lines.first.entries.length, 1);
+ var entry = map.lines.first.entries.first;
+
+ expect(entry.column, 0);
+ expect(entry.sourceUrlId, null);
+ expect(entry.sourceColumn, null);
+ expect(entry.sourceLine, null);
+ expect(entry.sourceNameId, null);
+ });
+
+ test('parse with source location and no name', () {
+ var map = parse(jsonEncode(_mapWithSourceLocation)) as SingleMapping;
+ expect(map.lines.length, 1);
+ expect(map.lines.first.entries.length, 1);
+ var entry = map.lines.first.entries.first;
+
+ expect(entry.column, 0);
+ expect(entry.sourceUrlId, 0);
+ expect(entry.sourceColumn, 0);
+ expect(entry.sourceLine, 0);
+ expect(entry.sourceNameId, null);
+ });
+
+ test('parse with source location and missing names entry', () {
+ var map = parse(jsonEncode(_mapWithSourceLocationAndMissingNames))
+ as SingleMapping;
+ expect(map.lines.length, 1);
+ expect(map.lines.first.entries.length, 1);
+ var entry = map.lines.first.entries.first;
+
+ expect(entry.column, 0);
+ expect(entry.sourceUrlId, 0);
+ expect(entry.sourceColumn, 0);
+ expect(entry.sourceLine, 0);
+ expect(entry.sourceNameId, null);
+ });
+
+ test('parse with source location and name', () {
+ var map = parse(jsonEncode(_mapWithSourceLocationAndName)) as SingleMapping;
+ expect(map.lines.length, 1);
+ expect(map.lines.first.entries.length, 1);
+ var entry = map.lines.first.entries.first;
+
+ expect(entry.sourceUrlId, 0);
+ expect(entry.sourceUrlId, 0);
+ expect(entry.sourceColumn, 0);
+ expect(entry.sourceLine, 0);
+ expect(entry.sourceNameId, 0);
+ });
+
+ test('parse with source root', () {
+ var inputMap = Map.from(_mapWithSourceLocation);
+ inputMap['sourceRoot'] = '/pkg/';
+ var mapping = parseJson(inputMap) as SingleMapping;
+ expect(mapping.spanFor(0, 0)?.sourceUrl, Uri.parse('/pkg/input.dart'));
+ expect(
+ mapping
+ .spanForLocation(
+ SourceLocation(0, sourceUrl: Uri.parse('ignored.dart')))
+ ?.sourceUrl,
+ Uri.parse('/pkg/input.dart'));
+
+ var newSourceRoot = '/new/';
+
+ mapping.sourceRoot = newSourceRoot;
+ inputMap['sourceRoot'] = newSourceRoot;
+
+ expect(mapping.toJson(), equals(inputMap));
+ });
+
+ test('parse with map URL', () {
+ var inputMap = Map.from(_mapWithSourceLocation);
+ inputMap['sourceRoot'] = 'pkg/';
+ var mapping = parseJson(inputMap, mapUrl: 'file:///path/to/map');
+ expect(mapping.spanFor(0, 0)?.sourceUrl,
+ Uri.parse('file:///path/to/pkg/input.dart'));
+ });
+
+ group('parse with bundle', () {
+ var mapping =
+ parseJsonExtended(_sourceMapBundle, mapUrl: 'file:///path/to/map');
+
+ test('simple', () {
+ expect(
+ mapping
+ .spanForLocation(SourceLocation(0,
+ sourceUrl: Uri.file('/path/to/output.dart')))
+ ?.sourceUrl,
+ Uri.parse('file:///path/to/pkg/input1.dart'));
+ expect(
+ mapping
+ .spanForLocation(SourceLocation(0,
+ sourceUrl: Uri.file('/path/to/output2.dart')))
+ ?.sourceUrl,
+ Uri.parse('file:///path/to/pkg/input2.dart'));
+ expect(
+ mapping
+ .spanForLocation(SourceLocation(0,
+ sourceUrl: Uri.file('/path/to/3/output.dart')))
+ ?.sourceUrl,
+ Uri.parse('file:///path/to/pkg/input3.dart'));
+
+ expect(
+ mapping.spanFor(0, 0, uri: 'file:///path/to/output.dart')?.sourceUrl,
+ Uri.parse('file:///path/to/pkg/input1.dart'));
+ expect(
+ mapping.spanFor(0, 0, uri: 'file:///path/to/output2.dart')?.sourceUrl,
+ Uri.parse('file:///path/to/pkg/input2.dart'));
+ expect(
+ mapping
+ .spanFor(0, 0, uri: 'file:///path/to/3/output.dart')
+ ?.sourceUrl,
+ Uri.parse('file:///path/to/pkg/input3.dart'));
+ });
+
+ test('package uris', () {
+ expect(
+ mapping
+ .spanForLocation(SourceLocation(0,
+ sourceUrl: Uri.parse('package:1/output.dart')))
+ ?.sourceUrl,
+ Uri.parse('file:///path/to/pkg/input1.dart'));
+ expect(
+ mapping
+ .spanForLocation(SourceLocation(0,
+ sourceUrl: Uri.parse('package:2/output2.dart')))
+ ?.sourceUrl,
+ Uri.parse('file:///path/to/pkg/input2.dart'));
+ expect(
+ mapping
+ .spanForLocation(SourceLocation(0,
+ sourceUrl: Uri.parse('package:3/output.dart')))
+ ?.sourceUrl,
+ Uri.parse('file:///path/to/pkg/input3.dart'));
+
+ expect(mapping.spanFor(0, 0, uri: 'package:1/output.dart')?.sourceUrl,
+ Uri.parse('file:///path/to/pkg/input1.dart'));
+ expect(mapping.spanFor(0, 0, uri: 'package:2/output2.dart')?.sourceUrl,
+ Uri.parse('file:///path/to/pkg/input2.dart'));
+ expect(mapping.spanFor(0, 0, uri: 'package:3/output.dart')?.sourceUrl,
+ Uri.parse('file:///path/to/pkg/input3.dart'));
+ });
+
+ test('unmapped path', () {
+ var span = mapping.spanFor(0, 0, uri: 'unmapped_output.dart')!;
+ expect(span.sourceUrl, Uri.parse('unmapped_output.dart'));
+ expect(span.start.line, equals(0));
+ expect(span.start.column, equals(0));
+
+ span = mapping.spanFor(10, 5, uri: 'unmapped_output.dart')!;
+ expect(span.sourceUrl, Uri.parse('unmapped_output.dart'));
+ expect(span.start.line, equals(10));
+ expect(span.start.column, equals(5));
+ });
+
+ test('missing path', () {
+ expect(() => mapping.spanFor(0, 0), throwsA(anything));
+ });
+
+ test('incomplete paths', () {
+ expect(mapping.spanFor(0, 0, uri: 'output.dart')?.sourceUrl,
+ Uri.parse('file:///path/to/pkg/input1.dart'));
+ expect(mapping.spanFor(0, 0, uri: 'output2.dart')?.sourceUrl,
+ Uri.parse('file:///path/to/pkg/input2.dart'));
+ expect(mapping.spanFor(0, 0, uri: '3/output.dart')?.sourceUrl,
+ Uri.parse('file:///path/to/pkg/input3.dart'));
+ });
+
+ test('parseExtended', () {
+ var mapping = parseExtended(jsonEncode(_sourceMapBundle),
+ mapUrl: 'file:///path/to/map');
+
+ expect(mapping.spanFor(0, 0, uri: 'output.dart')?.sourceUrl,
+ Uri.parse('file:///path/to/pkg/input1.dart'));
+ expect(mapping.spanFor(0, 0, uri: 'output2.dart')?.sourceUrl,
+ Uri.parse('file:///path/to/pkg/input2.dart'));
+ expect(mapping.spanFor(0, 0, uri: '3/output.dart')?.sourceUrl,
+ Uri.parse('file:///path/to/pkg/input3.dart'));
+ });
+
+ test('build bundle incrementally', () {
+ var mapping = MappingBundle();
+
+ mapping.addMapping(parseJson(_mapWithSourceLocationAndName1,
+ mapUrl: 'file:///path/to/map') as SingleMapping);
+ expect(mapping.spanFor(0, 0, uri: 'output.dart')?.sourceUrl,
+ Uri.parse('file:///path/to/pkg/input1.dart'));
+
+ expect(mapping.containsMapping('output2.dart'), isFalse);
+ mapping.addMapping(parseJson(_mapWithSourceLocationAndName2,
+ mapUrl: 'file:///path/to/map') as SingleMapping);
+ expect(mapping.containsMapping('output2.dart'), isTrue);
+ expect(mapping.spanFor(0, 0, uri: 'output2.dart')?.sourceUrl,
+ Uri.parse('file:///path/to/pkg/input2.dart'));
+
+ expect(mapping.containsMapping('3/output.dart'), isFalse);
+ mapping.addMapping(parseJson(_mapWithSourceLocationAndName3,
+ mapUrl: 'file:///path/to/map') as SingleMapping);
+ expect(mapping.containsMapping('3/output.dart'), isTrue);
+ expect(mapping.spanFor(0, 0, uri: '3/output.dart')?.sourceUrl,
+ Uri.parse('file:///path/to/pkg/input3.dart'));
+ });
+
+ // Test that the source map can handle cases where the uri passed in is
+ // not from the expected host but it is still unambiguous which source
+ // map should be used.
+ test('different paths', () {
+ expect(
+ mapping
+ .spanForLocation(SourceLocation(0,
+ sourceUrl: Uri.parse('http://localhost/output.dart')))
+ ?.sourceUrl,
+ Uri.parse('file:///path/to/pkg/input1.dart'));
+ expect(
+ mapping
+ .spanForLocation(SourceLocation(0,
+ sourceUrl: Uri.parse('http://localhost/output2.dart')))
+ ?.sourceUrl,
+ Uri.parse('file:///path/to/pkg/input2.dart'));
+ expect(
+ mapping
+ .spanForLocation(SourceLocation(0,
+ sourceUrl: Uri.parse('http://localhost/3/output.dart')))
+ ?.sourceUrl,
+ Uri.parse('file:///path/to/pkg/input3.dart'));
+
+ expect(
+ mapping.spanFor(0, 0, uri: 'http://localhost/output.dart')?.sourceUrl,
+ Uri.parse('file:///path/to/pkg/input1.dart'));
+ expect(
+ mapping
+ .spanFor(0, 0, uri: 'http://localhost/output2.dart')
+ ?.sourceUrl,
+ Uri.parse('file:///path/to/pkg/input2.dart'));
+ expect(
+ mapping
+ .spanFor(0, 0, uri: 'http://localhost/3/output.dart')
+ ?.sourceUrl,
+ Uri.parse('file:///path/to/pkg/input3.dart'));
+ });
+ });
+
+ test('parse and re-emit', () {
+ for (var expected in [
+ expectedMap,
+ _mapWithNoSourceLocation,
+ _mapWithSourceLocation,
+ _mapWithSourceLocationAndName
+ ]) {
+ var mapping = parseJson(expected) as SingleMapping;
+ expect(mapping.toJson(), equals(expected));
+
+ mapping = parseJsonExtended(expected) as SingleMapping;
+ expect(mapping.toJson(), equals(expected));
+ }
+
+ var mapping = parseJsonExtended(_sourceMapBundle) as MappingBundle;
+ expect(mapping.toJson(), equals(_sourceMapBundle));
+ });
+
+ test('parse extensions', () {
+ var map = Map.from(expectedMap);
+ map['x_foo'] = 'a';
+ map['x_bar'] = [3];
+ var mapping = parseJson(map) as SingleMapping;
+ expect(mapping.toJson(), equals(map));
+ expect(mapping.extensions['x_foo'], equals('a'));
+ expect((mapping.extensions['x_bar'] as List).first, equals(3));
+ });
+
+ group('source files', () {
+ group('from fromEntries()', () {
+ test('are null for non-FileLocations', () {
+ var mapping = SingleMapping.fromEntries([
+ Entry(SourceLocation(10, line: 1, column: 8), outputVar1.start, null)
+ ]);
+ expect(mapping.files, equals([null]));
+ });
+
+ test("use a file location's file", () {
+ var mapping = SingleMapping.fromEntries(
+ [Entry(inputVar1.start, outputVar1.start, null)]);
+ expect(mapping.files, equals([input]));
+ });
+ });
+
+ group('from parse()', () {
+ group('are null', () {
+ test('with no sourcesContent field', () {
+ var mapping = parseJson(expectedMap) as SingleMapping;
+ expect(mapping.files, equals([null]));
+ });
+
+ test('with null sourcesContent values', () {
+ var map = Map.from(expectedMap);
+ map['sourcesContent'] = [null];
+ var mapping = parseJson(map) as SingleMapping;
+ expect(mapping.files, equals([null]));
+ });
+
+ test('with a too-short sourcesContent', () {
+ var map = Map.from(expectedMap);
+ map['sourcesContent'] = [];
+ var mapping = parseJson(map) as SingleMapping;
+ expect(mapping.files, equals([null]));
+ });
+ });
+
+ test('are parsed from sourcesContent', () {
+ var map = Map.from(expectedMap);
+ map['sourcesContent'] = ['hello, world!'];
+ var mapping = parseJson(map) as SingleMapping;
+
+ var file = mapping.files[0]!;
+ expect(file.url, equals(Uri.parse('input.dart')));
+ expect(file.getText(0), equals('hello, world!'));
+ });
+ });
+ });
+}
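The tests above exercise `parse`, `parseJson`, and `spanFor` from `package:source_maps`. As a rough orientation (not taken from the test fixtures), a minimal standalone use of that API might look like the following; the JSON field values are illustrative only:
```dart
import 'package:source_maps/source_maps.dart';

void main() {
  // A tiny version-3 source map; the values here are made up for illustration.
  const mapJson = '''
  {
    "version": 3,
    "sourceRoot": "",
    "sources": ["input.dart"],
    "names": ["foo"],
    "mappings": "AAAAA",
    "file": "output.dart"
  }''';

  // Parse the map and look up where (line 0, column 0) of the generated file
  // came from.
  final mapping = parse(mapJson);
  final span = mapping.spanFor(0, 0);
  print(span?.sourceUrl); // input.dart
  print('${span?.start.line}:${span?.start.column}'); // 0:0
}
```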
diff --git a/pkgs/source_maps/test/printer_test.dart b/pkgs/source_maps/test/printer_test.dart
new file mode 100644
index 0000000..89265e3
--- /dev/null
+++ b/pkgs/source_maps/test/printer_test.dart
@@ -0,0 +1,126 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:convert';
+
+import 'package:source_maps/source_maps.dart';
+import 'package:source_span/source_span.dart';
+import 'package:test/test.dart';
+
+import 'common.dart';
+
+void main() {
+ test('printer', () {
+ var printer = Printer('output.dart');
+ printer
+ ..add('var ')
+ ..mark(inputVar1)
+ ..add('x = 3;\n')
+ ..mark(inputFunction)
+ ..add('f(')
+ ..mark(inputVar2)
+ ..add('y) => ')
+ ..mark(inputExpr)
+ ..add('x + y;\n');
+ expect(printer.text, outputContent);
+ expect(printer.map, jsonEncode(expectedMap));
+ });
+
+ test('printer projecting marks', () {
+ var out = inputContent.replaceAll('long', '_s');
+ var printer = Printer('output2.dart');
+
+ var segments = inputContent.split('long');
+ expect(segments.length, 6);
+ printer
+ ..mark(ispan(0, 0))
+ ..add(segments[0], projectMarks: true)
+ ..mark(inputVar1)
+ ..add('_s')
+ ..add(segments[1], projectMarks: true)
+ ..mark(inputFunction)
+ ..add('_s')
+ ..add(segments[2], projectMarks: true)
+ ..mark(inputVar2)
+ ..add('_s')
+ ..add(segments[3], projectMarks: true)
+ ..mark(inputExpr)
+ ..add('_s')
+ ..add(segments[4], projectMarks: true)
+ ..add('_s')
+ ..add(segments[5], projectMarks: true);
+
+ expect(printer.text, out);
+ // 8 new lines in the source map:
+ expect(printer.map.split(';').length, 8);
+
+ SourceMapSpan asFixed(SourceMapSpan s) =>
+ SourceMapSpan(s.start, s.end, s.text, isIdentifier: s.isIdentifier);
+
+ // The result is the same if we use fixed positions
+ var printer2 = Printer('output2.dart');
+ printer2
+ ..mark(SourceLocation(0, sourceUrl: 'input.dart').pointSpan())
+ ..add(segments[0], projectMarks: true)
+ ..mark(asFixed(inputVar1))
+ ..add('_s')
+ ..add(segments[1], projectMarks: true)
+ ..mark(asFixed(inputFunction))
+ ..add('_s')
+ ..add(segments[2], projectMarks: true)
+ ..mark(asFixed(inputVar2))
+ ..add('_s')
+ ..add(segments[3], projectMarks: true)
+ ..mark(asFixed(inputExpr))
+ ..add('_s')
+ ..add(segments[4], projectMarks: true)
+ ..add('_s')
+ ..add(segments[5], projectMarks: true);
+
+ expect(printer2.text, out);
+ expect(printer2.map, printer.map);
+ });
+
+ group('nested printer', () {
+ test('simple use', () {
+ var printer = NestedPrinter();
+ printer
+ ..add('var ')
+ ..add('x = 3;\n', span: inputVar1)
+ ..add('f(', span: inputFunction)
+ ..add('y) => ', span: inputVar2)
+ ..add('x + y;\n', span: inputExpr)
+ ..build('output.dart');
+ expect(printer.text, outputContent);
+ expect(printer.map, jsonEncode(expectedMap));
+ });
+
+ test('nested use', () {
+ var printer = NestedPrinter();
+ printer
+ ..add('var ')
+ ..add(NestedPrinter()..add('x = 3;\n', span: inputVar1))
+ ..add('f(', span: inputFunction)
+ ..add(NestedPrinter()..add('y) => ', span: inputVar2))
+ ..add('x + y;\n', span: inputExpr)
+ ..build('output.dart');
+ expect(printer.text, outputContent);
+ expect(printer.map, jsonEncode(expectedMap));
+ });
+
+ test('add indentation', () {
+ var out = inputContent.replaceAll('long', '_s');
+ var lines = inputContent.trim().split('\n');
+ expect(lines.length, 7);
+ var printer = NestedPrinter();
+ for (var i = 0; i < lines.length; i++) {
+ if (i == 5) printer.indent++;
+ printer.addLine(lines[i].replaceAll('long', '_s').trim());
+ if (i == 5) printer.indent--;
+ }
+ printer.build('output.dart');
+ expect(printer.text, out);
+ });
+ });
+}
diff --git a/pkgs/source_maps/test/refactor_test.dart b/pkgs/source_maps/test/refactor_test.dart
new file mode 100644
index 0000000..5bc3818
--- /dev/null
+++ b/pkgs/source_maps/test/refactor_test.dart
@@ -0,0 +1,199 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:source_maps/parser.dart' show Mapping, parse;
+import 'package:source_maps/refactor.dart';
+import 'package:source_span/source_span.dart';
+import 'package:term_glyph/term_glyph.dart' as term_glyph;
+import 'package:test/test.dart';
+
+void main() {
+ setUpAll(() {
+ term_glyph.ascii = true;
+ });
+
+ group('conflict detection', () {
+ var original = '0123456789abcdefghij';
+ var file = SourceFile.fromString(original);
+
+ test('no conflict, in order', () {
+ var txn = TextEditTransaction(original, file);
+ txn.edit(2, 4, '.');
+ txn.edit(5, 5, '|');
+ txn.edit(6, 6, '-');
+ txn.edit(6, 7, '_');
+ expect((txn.commit()..build('')).text, '01.4|5-_789abcdefghij');
+ });
+
+ test('no conflict, out of order', () {
+ var txn = TextEditTransaction(original, file);
+ txn.edit(2, 4, '.');
+ txn.edit(5, 5, '|');
+
+    // Regression test for issue #404: there is no conflict/overlap for edits
+ // that don't remove any of the original code.
+ txn.edit(6, 7, '_');
+ txn.edit(6, 6, '-');
+ expect((txn.commit()..build('')).text, '01.4|5-_789abcdefghij');
+ });
+
+ test('conflict', () {
+ var txn = TextEditTransaction(original, file);
+ txn.edit(2, 4, '.');
+ txn.edit(3, 3, '-');
+ expect(
+ () => txn.commit(),
+ throwsA(
+ predicate((e) => e.toString().contains('overlapping edits'))));
+ });
+ });
+
+ test('generated source maps', () {
+ var original =
+ '0123456789\n0*23456789\n01*3456789\nabcdefghij\nabcd*fghij\n';
+ var file = SourceFile.fromString(original);
+ var txn = TextEditTransaction(original, file);
+ txn.edit(27, 29, '__\n ');
+ txn.edit(34, 35, '___');
+ var printer = (txn.commit()..build(''));
+ var output = printer.text;
+ var map = parse(printer.map!);
+ expect(output,
+ '0123456789\n0*23456789\n01*34__\n 789\na___cdefghij\nabcd*fghij\n');
+
+    // Lines 1 and 2 are unmodified: mapping any column returns the beginning
+ // of the corresponding line:
+ expect(
+ _span(1, 1, map, file),
+ 'line 1, column 1: \n'
+ ' ,\n'
+ '1 | 0123456789\n'
+ ' | ^\n'
+ " '");
+ expect(
+ _span(1, 5, map, file),
+ 'line 1, column 1: \n'
+ ' ,\n'
+ '1 | 0123456789\n'
+ ' | ^\n'
+ " '");
+ expect(
+ _span(2, 1, map, file),
+ 'line 2, column 1: \n'
+ ' ,\n'
+ '2 | 0*23456789\n'
+ ' | ^\n'
+ " '");
+ expect(
+ _span(2, 8, map, file),
+ 'line 2, column 1: \n'
+ ' ,\n'
+ '2 | 0*23456789\n'
+ ' | ^\n'
+ " '");
+
+    // Line 3 is modified partway through: positions before the edits keep the
+    // right mapping; after the edits the mapping is null.
+ expect(
+ _span(3, 1, map, file),
+ 'line 3, column 1: \n'
+ ' ,\n'
+ '3 | 01*3456789\n'
+ ' | ^\n'
+ " '");
+ expect(
+ _span(3, 5, map, file),
+ 'line 3, column 1: \n'
+ ' ,\n'
+ '3 | 01*3456789\n'
+ ' | ^\n'
+ " '");
+
+    // The start of the edits maps to the beginning of the edit section:
+ expect(
+ _span(3, 6, map, file),
+ 'line 3, column 6: \n'
+ ' ,\n'
+ '3 | 01*3456789\n'
+ ' | ^\n'
+ " '");
+ expect(
+ _span(3, 7, map, file),
+ 'line 3, column 6: \n'
+ ' ,\n'
+ '3 | 01*3456789\n'
+ ' | ^\n'
+ " '");
+
+ // Lines added have no mapping (they should inherit the last mapping),
+    // but the end of the edit region continues where we left off:
+ expect(_span(4, 1, map, file), isNull);
+ expect(
+ _span(4, 5, map, file),
+ 'line 3, column 8: \n'
+ ' ,\n'
+ '3 | 01*3456789\n'
+ ' | ^\n'
+ " '");
+
+ // Subsequent lines are still mapped correctly:
+ // a (in a___cd...)
+ expect(
+ _span(5, 1, map, file),
+ 'line 4, column 1: \n'
+ ' ,\n'
+ '4 | abcdefghij\n'
+ ' | ^\n'
+ " '");
+ // _ (in a___cd...)
+ expect(
+ _span(5, 2, map, file),
+ 'line 4, column 2: \n'
+ ' ,\n'
+ '4 | abcdefghij\n'
+ ' | ^\n'
+ " '");
+ // _ (in a___cd...)
+ expect(
+ _span(5, 3, map, file),
+ 'line 4, column 2: \n'
+ ' ,\n'
+ '4 | abcdefghij\n'
+ ' | ^\n'
+ " '");
+ // _ (in a___cd...)
+ expect(
+ _span(5, 4, map, file),
+ 'line 4, column 2: \n'
+ ' ,\n'
+ '4 | abcdefghij\n'
+ ' | ^\n'
+ " '");
+ // c (in a___cd...)
+ expect(
+ _span(5, 5, map, file),
+ 'line 4, column 3: \n'
+ ' ,\n'
+ '4 | abcdefghij\n'
+ ' | ^\n'
+ " '");
+ expect(
+ _span(6, 1, map, file),
+ 'line 5, column 1: \n'
+ ' ,\n'
+ '5 | abcd*fghij\n'
+ ' | ^\n'
+ " '");
+ expect(
+ _span(6, 8, map, file),
+ 'line 5, column 1: \n'
+ ' ,\n'
+ '5 | abcd*fghij\n'
+ ' | ^\n'
+ " '");
+ });
+}
+
+String? _span(int line, int column, Mapping map, SourceFile file) =>
+ map.spanFor(line - 1, column - 1, files: {'': file})?.message('').trim();
diff --git a/pkgs/source_maps/test/utils_test.dart b/pkgs/source_maps/test/utils_test.dart
new file mode 100644
index 0000000..2516d1e
--- /dev/null
+++ b/pkgs/source_maps/test/utils_test.dart
@@ -0,0 +1,53 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+/// Tests for the binary search utility algorithm.
+library;
+
+import 'package:source_maps/src/utils.dart';
+import 'package:test/test.dart';
+
+void main() {
+ group('binary search', () {
+ test('empty', () {
+ expect(binarySearch([], (x) => true), -1);
+ });
+
+ test('single element', () {
+ expect(binarySearch([1], (x) => true), 0);
+ expect(binarySearch([1], (x) => false), 1);
+ });
+
+ test('no matches', () {
+ var list = [1, 2, 3, 4, 5, 6, 7];
+ expect(binarySearch(list, (x) => false), list.length);
+ });
+
+ test('all match', () {
+ var list = [1, 2, 3, 4, 5, 6, 7];
+ expect(binarySearch(list, (x) => true), 0);
+ });
+
+ test('compare with linear search', () {
+ for (var size = 0; size < 100; size++) {
+ var list = <int>[];
+ for (var i = 0; i < size; i++) {
+ list.add(i);
+ }
+ for (var pos = 0; pos <= size; pos++) {
+ expect(binarySearch(list, (x) => x >= pos),
+ _linearSearch(list, (x) => x >= pos));
+ }
+ }
+ });
+ });
+}
+
+int _linearSearch<T>(List<T> list, bool Function(T) predicate) {
+ if (list.isEmpty) return -1;
+ for (var i = 0; i < list.length; i++) {
+ if (predicate(list[i])) return i;
+ }
+ return list.length;
+}
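The expectations above pin down the helper's contract: it returns the index of the first element for which the predicate is true, `list.length` when nothing matches, and `-1` for an empty list. A minimal sketch of that contract (not the package's actual implementation) is:
```dart
// Predicate-based binary search: assumes [matches] is monotonic over [list]
// (all false elements precede all true elements).
int binarySearchSketch<T>(List<T> list, bool Function(T) matches) {
  if (list.isEmpty) return -1;
  var min = 0;
  var max = list.length;
  while (min < max) {
    final mid = min + ((max - min) ~/ 2);
    if (matches(list[mid])) {
      max = mid; // The first match is at [mid] or before it.
    } else {
      min = mid + 1; // The first match, if any, is after [mid].
    }
  }
  return max;
}

void main() {
  final list = [1, 2, 3, 4, 5, 6, 7];
  print(binarySearchSketch(list, (x) => x >= 4)); // 3
  print(binarySearchSketch(list, (x) => false)); // 7 (list.length)
  print(binarySearchSketch<int>([], (x) => true)); // -1
}
```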
diff --git a/pkgs/source_maps/test/vlq_test.dart b/pkgs/source_maps/test/vlq_test.dart
new file mode 100644
index 0000000..4568cff
--- /dev/null
+++ b/pkgs/source_maps/test/vlq_test.dart
@@ -0,0 +1,59 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:math';
+
+import 'package:source_maps/src/vlq.dart';
+import 'package:test/test.dart';
+
+void main() {
+ test('encode and decode - simple values', () {
+ expect(encodeVlq(1).join(''), 'C');
+ expect(encodeVlq(2).join(''), 'E');
+ expect(encodeVlq(3).join(''), 'G');
+ expect(encodeVlq(100).join(''), 'oG');
+ expect(decodeVlq('C'.split('').iterator), 1);
+ expect(decodeVlq('E'.split('').iterator), 2);
+ expect(decodeVlq('G'.split('').iterator), 3);
+ expect(decodeVlq('oG'.split('').iterator), 100);
+ });
+
+ test('encode and decode', () {
+ for (var i = -10000; i < 10000; i++) {
+ _checkEncodeDecode(i);
+ }
+ });
+
+ test('only 32-bit ints allowed', () {
+ var maxInt = (pow(2, 31) as int) - 1;
+ var minInt = -(pow(2, 31) as int);
+ _checkEncodeDecode(maxInt - 1);
+ _checkEncodeDecode(minInt + 1);
+ _checkEncodeDecode(maxInt);
+ _checkEncodeDecode(minInt);
+
+ expect(encodeVlq(minInt).join(''), 'hgggggE');
+ expect(decodeVlq('hgggggE'.split('').iterator), minInt);
+
+ expect(() => encodeVlq(maxInt + 1), throwsA(anything));
+ expect(() => encodeVlq(maxInt + 2), throwsA(anything));
+ expect(() => encodeVlq(minInt - 1), throwsA(anything));
+ expect(() => encodeVlq(minInt - 2), throwsA(anything));
+
+ // if we allowed more than 32 bits, these would be the expected encodings
+ // for the large numbers above.
+ expect(() => decodeVlq('ggggggE'.split('').iterator), throwsA(anything));
+ expect(() => decodeVlq('igggggE'.split('').iterator), throwsA(anything));
+ expect(() => decodeVlq('jgggggE'.split('').iterator), throwsA(anything));
+ expect(() => decodeVlq('lgggggE'.split('').iterator), throwsA(anything));
+ },
+ // This test uses integers so large they overflow in JS.
+ testOn: 'dart-vm');
+}
+
+void _checkEncodeDecode(int value) {
+ var encoded = encodeVlq(value);
+ expect(decodeVlq(encoded.iterator), value);
+ expect(decodeVlq(encoded.join('').split('').iterator), value);
+}
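The expected strings above follow the standard base64 VLQ scheme used by source maps: the sign goes in the lowest bit and the value is then emitted in 5-bit groups, least-significant group first, with a sixth bit marking continuation. A rough sketch of the encoding side only (the package's `encodeVlq` returns an iterable that the tests `join`, and also enforces the 32-bit range check tested above):
```dart
const _base64Digits =
    'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/';

// Illustrative encoder only; it omits the 32-bit range check.
String encodeVlqSketch(int value) {
  var n = value < 0 ? (-value << 1) | 1 : value << 1; // sign in the low bit
  final out = StringBuffer();
  do {
    var digit = n & 0x1f; // low 5 bits
    n >>= 5;
    if (n > 0) digit |= 0x20; // continuation bit
    out.write(_base64Digits[digit]);
  } while (n > 0);
  return out.toString();
}

void main() {
  print(encodeVlqSketch(1)); // C
  print(encodeVlqSketch(3)); // G
  print(encodeVlqSketch(100)); // oG
}
```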
diff --git a/pkgs/source_span/.gitignore b/pkgs/source_span/.gitignore
new file mode 100644
index 0000000..ab3cb76
--- /dev/null
+++ b/pkgs/source_span/.gitignore
@@ -0,0 +1,16 @@
+# Don’t commit the following directories created by pub.
+.buildlog
+.dart_tool/
+.pub/
+build/
+packages
+.packages
+
+# Or the files created by dart2js.
+*.dart.js
+*.js_
+*.js.deps
+*.js.map
+
+# Include when developing application packages.
+pubspec.lock
diff --git a/pkgs/source_span/CHANGELOG.md b/pkgs/source_span/CHANGELOG.md
new file mode 100644
index 0000000..b8319d7
--- /dev/null
+++ b/pkgs/source_span/CHANGELOG.md
@@ -0,0 +1,240 @@
+## 1.10.1
+
+* Require Dart 3.1
+* Move to `dart-lang/tools` monorepo.
+
+## 1.10.0
+
+* Add a `SourceFile.codeUnits` property.
+* Require Dart 2.18
+* Add an API usage example in `example/`.
+
+## 1.9.1
+
+* Properly handle multi-line labels for multi-span highlights.
+
+* Populate the pubspec `repository` field.
+
+## 1.9.0
+
+* Add `SourceSpanWithContextExtension.subspan` that returns a
+ `SourceSpanWithContext` rather than a plain `SourceSpan`.
+
+## 1.8.2
+
+* Fix a bug where highlighting multiple spans with `null` URLs could cause an
+ assertion error. Now when multiple spans are passed with `null` URLs, they're
+ highlighted as though they all come from different source files.
+
+## 1.8.1
+
+* Fix a bug where the URL header for the highlights with multiple files would
+  get omitted when only one span has a non-null URI.
+
+## 1.8.0
+
+* Stable release for null safety.
+
+## 1.7.0
+
+* Add a `SourceSpan.subspan()` extension method which returns a slice of an
+ existing source span.
+
+## 1.6.0
+
+* Add support for highlighting multiple source spans at once, providing more
+ context for span-based messages. This is exposed through the new APIs
+ `SourceSpan.highlightMultiple()` and `SourceSpan.messageMultiple()` (both
+ extension methods), `MultiSourceSpanException`, and
+ `MultiSourceSpanFormatException`.
+
+## 1.5.6
+
+* Fix padding around line numbers that are powers of 10 in
+ `FileSpan.highlight()`.
+
+## 1.5.5
+
+* Fix a bug where `FileSpan.highlight()` would crash for spans that covered a
+ trailing newline and a single additional empty line.
+
+## 1.5.4
+
+* `FileSpan.highlight()` now properly highlights point spans at the beginning of
+ lines.
+
+## 1.5.3
+
+* Fix an edge case where `FileSpan.highlight()` would put the highlight
+ indicator in the wrong position when highlighting a point span after the end
+ of a file.
+
+## 1.5.2
+
+* `SourceFile.span()` now goes to the end of the file by default, rather than
+ ending one character before the end of the file. This matches the documented
+ behavior.
+
+* `FileSpan.context` now includes the full line on which the span appears for
+ empty spans at the beginning and end of lines.
+
+* Fix an edge case where `FileSpan.highlight()` could crash when highlighting a
+ span that ended with an empty line.
+
+## 1.5.1
+
+* Produce better source span highlights for multi-line spans that cover the
+ entire last line of the span, including the newline.
+
+* Produce better source span highlights for spans that contain Windows-style
+ newlines.
+
+## 1.5.0
+
+* Improve the output of `SourceSpan.highlight()` and `SourceSpan.message()`:
+
+ * They now include line numbers.
+ * They will now print every line of a multiline span.
+ * They will now use Unicode box-drawing characters by default (this can be
+ controlled using [`term_glyph.ascii`][]).
+
+[`term_glyph.ascii`]: https://pub.dartlang.org/documentation/term_glyph/latest/term_glyph/ascii.html
+
+## 1.4.1
+
+* Set max SDK version to `<3.0.0`, and adjust other dependencies.
+
+## 1.4.0
+
+* The `new SourceFile()` constructor is deprecated. This constructed a source
+ file from a string's runes, rather than its code units, which runs counter to
+  the way Dart handles strings otherwise. The `new SourceFile.fromString()`
+ constructor (see below) should be used instead.
+
+* The `new SourceFile.fromString()` constructor was added. This works like `new
+ SourceFile()`, except it uses code units rather than runes.
+
+* The current behavior when characters larger than `0xFFFF` are passed to `new
+ SourceFile.decoded()` is now considered deprecated.
+
+## 1.3.1
+
+* Properly highlight spans for lines that include tabs with
+ `SourceSpan.highlight()` and `SourceSpan.message()`.
+
+## 1.3.0
+
+* Add `SourceSpan.highlight()`, which returns just the highlighted text that
+ would be included in `SourceSpan.message()`.
+
+## 1.2.4
+
+* Fix a new strong mode error.
+
+## 1.2.3
+
+* Fix a bug where a point span at the end of a file without a trailing newline
+ would be printed incorrectly.
+
+## 1.2.2
+
+* Allow `SourceSpanException.message`, `SourceSpanFormatException.source`, and
+ `SourceSpanWithContext.context` to be overridden in strong mode.
+
+## 1.2.1
+
+* Fix the declared type of `FileSpan.start` and `FileSpan.end`. In 1.2.0 these
+ were mistakenly changed from `FileLocation` to `SourceLocation`.
+
+## 1.2.0
+
+* **Deprecated:** Extending `SourceLocation` directly is deprecated. Instead,
+ extend the new `SourceLocationBase` class or mix in the new
+ `SourceLocationMixin` mixin.
+
+* Dramatically improve the performance of `FileLocation`.
+
+## 1.1.6
+
+* Optimize `getLine()` in `SourceFile` when repeatedly called.
+
+## 1.1.5
+
+* Fixed another case in which `FileSpan.union` could throw an exception for
+ external implementations of `FileSpan`.
+
+## 1.1.4
+
+* Eliminated dart2js warning about overriding `==`, but not `hashCode`.
+
+## 1.1.3
+
+* `FileSpan.compareTo`, `FileSpan.==`, `FileSpan.union`, and `FileSpan.expand`
+ no longer throw exceptions for external implementations of `FileSpan`.
+
+* `FileSpan.hashCode` now fully agrees with `FileSpan.==`.
+
+## 1.1.2
+
+* Fixed validation in `SourceSpanWithContext` to allow multiple occurrences of
+ `text` within `context`.
+
+## 1.1.1
+
+* Fixed `FileSpan`'s context to include the full span text, not just the first
+ line of it.
+
+## 1.1.0
+
+* Added `SourceSpanWithContext`: a span that also includes the full line of text
+ that contains the span.
+
+## 1.0.3
+
+* Cleanup equality operator to accept any Object rather than just a
+ `SourceLocation`.
+
+## 1.0.2
+
+* Avoid unintentionally allocating extra objects for internal `FileSpan`
+ operations.
+
+* Ensure that `SourceSpan.operator==` works on arbitrary `Object`s.
+
+## 1.0.1
+
+* Use a more compact internal representation for `FileSpan`.
+
+## 1.0.0
+
+This package was extracted from the
+[`source_maps`](https://pub.dev/packages/source_maps) package, but the
+API has many differences. Among them:
+
+* `Span` has been renamed to `SourceSpan` and `Location` has been renamed to
+ `SourceLocation` to clarify their purpose and maintain consistency with the
+ package name. Likewise, `SpanException` is now `SourceSpanException` and
+  `SpanFormatException` is now `SourceSpanFormatException`.
+
+* `FixedSpan` and `FixedLocation` have been rolled into the `Span` and
+ `Location` classes, respectively.
+
+* `SourceFile` is more aggressive about validating its arguments. Out-of-bounds
+ lines, columns, and offsets will now throw errors rather than be silently
+ clamped.
+
+* `SourceSpan.sourceUrl`, `SourceLocation.sourceUrl`, and `SourceFile.url` now
+ return `Uri` objects rather than `String`s. The constructors allow either
+ `String`s or `Uri`s.
+
+* `Span.getLocationMessage` and `SourceFile.getLocationMessage` are now
+ `SourceSpan.message` and `SourceFile.message`, respectively. Rather than
+ taking both a `useColor` and a `color` parameter, they now take a single
+ `color` parameter that controls both whether and which color is used.
+
+* `Span.isIdentifier` has been removed. This property doesn't make sense outside
+ of a source map context.
+
+* `SourceFileSegment` has been removed. This class wasn't widely used and was
+ inconsistent in its choice of which parameters were considered relative and
+ which absolute.
diff --git a/pkgs/source_span/LICENSE b/pkgs/source_span/LICENSE
new file mode 100644
index 0000000..000cd7b
--- /dev/null
+++ b/pkgs/source_span/LICENSE
@@ -0,0 +1,27 @@
+Copyright 2014, the Dart project authors.
+
+Redistribution and use in source and binary forms, with or without
+modification, are permitted provided that the following conditions are
+met:
+
+ * Redistributions of source code must retain the above copyright
+ notice, this list of conditions and the following disclaimer.
+ * Redistributions in binary form must reproduce the above
+ copyright notice, this list of conditions and the following
+ disclaimer in the documentation and/or other materials provided
+ with the distribution.
+ * Neither the name of Google LLC nor the names of its
+ contributors may be used to endorse or promote products derived
+ from this software without specific prior written permission.
+
+THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
diff --git a/pkgs/source_span/README.md b/pkgs/source_span/README.md
new file mode 100644
index 0000000..b4ce25f
--- /dev/null
+++ b/pkgs/source_span/README.md
@@ -0,0 +1,21 @@
+[](https://github.com/dart-lang/tools/actions/workflows/source_span.yaml)
+[](https://pub.dev/packages/source_span)
+[](https://pub.dev/packages/source_span/publisher)
+
+## About this package
+
+`source_span` is a library for tracking locations in source code. It's designed
+to provide a standard representation for source code locations and spans so that
+disparate packages can easily pass them among one another, and to make it easy
+to generate human-friendly messages associated with a given piece of code.
+
+The most commonly-used class is the package's namesake, `SourceSpan`. It
+represents a span of characters in some source file, and is often attached to an
+object that has been parsed to indicate where it was parsed from. It provides
+access to the text of the span via `SourceSpan.text` and can be used to produce
+human-friendly messages using `SourceSpan.message()`.
+
+When parsing code from a file, `SourceFile` is useful. Not only does it provide
+an efficient means of computing line and column numbers, but `SourceFile.span()`
+also returns special `FileSpan`s that can provide more context for their
+error messages.
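As a quick illustration of the API the README describes (the file name, contents, and offsets below are made up):
```dart
import 'package:source_span/source_span.dart';

void main() {
  // SourceFile makes line/column lookups cheap and produces FileSpans.
  final file = SourceFile.fromString('var x = 1;\nvar y = 2;\n',
      url: Uri.parse('example.dart'));

  // A span covering the second `var` keyword (offsets are 0-based).
  final span = file.span(11, 14);

  print(span.text); // var
  print('${span.start.line}:${span.start.column}'); // 1:0
  print(span.message('expected a type annotation'));
}
```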
diff --git a/pkgs/source_span/analysis_options.yaml b/pkgs/source_span/analysis_options.yaml
new file mode 100644
index 0000000..d2ebdbf
--- /dev/null
+++ b/pkgs/source_span/analysis_options.yaml
@@ -0,0 +1,32 @@
+# https://dart.dev/guides/language/analysis-options
+include: package:dart_flutter_team_lints/analysis_options.yaml
+
+analyzer:
+ language:
+ strict-casts: true
+ strict-inference: true
+ strict-raw-types: true
+
+linter:
+ rules:
+ - avoid_bool_literals_in_conditional_expressions
+ - avoid_classes_with_only_static_members
+ - avoid_private_typedef_functions
+ - avoid_redundant_argument_values
+ - avoid_returning_this
+ - avoid_unused_constructor_parameters
+ - avoid_void_async
+ - cancel_subscriptions
+ - cascade_invocations
+ - join_return_with_assignment
+ - literal_only_boolean_expressions
+ - missing_whitespace_between_adjacent_strings
+ - no_adjacent_strings_in_list
+ - prefer_const_declarations
+ - prefer_expression_function_bodies
+ - prefer_final_locals
+ - unnecessary_await_in_return
+ - unnecessary_raw_strings
+ - use_if_null_to_convert_nulls_to_bools
+ - use_raw_strings
+ - use_string_buffers
diff --git a/pkgs/source_span/example/main.dart b/pkgs/source_span/example/main.dart
new file mode 100644
index 0000000..e296765
--- /dev/null
+++ b/pkgs/source_span/example/main.dart
@@ -0,0 +1,51 @@
+// Copyright (c) 2023, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:io';
+
+import 'package:source_span/source_span.dart';
+
+void main(List<String> args) {
+ final file = File('README.md');
+ final contents = file.readAsStringSync();
+
+ final sourceFile = SourceFile.fromString(contents, url: file.uri);
+ final spans = _parseFile(contents, sourceFile);
+
+ for (var span in spans.take(30)) {
+ print('[${span.start.line + 1}:${span.start.column + 1}] ${span.text}');
+ }
+}
+
+Iterable<SourceSpan> _parseFile(String contents, SourceFile sourceFile) sync* {
+ var wordStart = 0;
+ var inWhiteSpace = true;
+
+ for (var i = 0; i < contents.length; i++) {
+ final codeUnit = contents.codeUnitAt(i);
+
+ if (codeUnit == _eol || codeUnit == _space) {
+ if (!inWhiteSpace) {
+ inWhiteSpace = true;
+
+ // emit a word
+ yield sourceFile.span(wordStart, i);
+ }
+ } else {
+ if (inWhiteSpace) {
+ inWhiteSpace = false;
+
+ wordStart = i;
+ }
+ }
+ }
+
+ if (!inWhiteSpace) {
+ // emit a word
+ yield sourceFile.span(wordStart, contents.length);
+ }
+}
+
+const int _eol = 10;
+const int _space = 32;
diff --git a/pkgs/source_span/lib/source_span.dart b/pkgs/source_span/lib/source_span.dart
new file mode 100644
index 0000000..534a3a7
--- /dev/null
+++ b/pkgs/source_span/lib/source_span.dart
@@ -0,0 +1,11 @@
+// Copyright (c) 2014, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+export 'src/file.dart';
+export 'src/location.dart';
+export 'src/location_mixin.dart';
+export 'src/span.dart';
+export 'src/span_exception.dart';
+export 'src/span_mixin.dart';
+export 'src/span_with_context.dart';
diff --git a/pkgs/source_span/lib/src/charcode.dart b/pkgs/source_span/lib/src/charcode.dart
new file mode 100644
index 0000000..5182638
--- /dev/null
+++ b/pkgs/source_span/lib/src/charcode.dart
@@ -0,0 +1,15 @@
+// Copyright (c) 2020, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+/// "Carriage return" control character.
+const int $cr = 0x0D;
+
+/// "Line feed" control character.
+const int $lf = 0x0A;
+
+/// Space character.
+const int $space = 0x20;
+
+/// "Horizontal Tab" control character, common name.
+const int $tab = 0x09;
diff --git a/pkgs/source_span/lib/src/colors.dart b/pkgs/source_span/lib/src/colors.dart
new file mode 100644
index 0000000..b48d468
--- /dev/null
+++ b/pkgs/source_span/lib/src/colors.dart
@@ -0,0 +1,12 @@
+// Copyright (c) 2014, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+// Color constants used for generating messages.
+const String red = '\u001b[31m';
+
+const String yellow = '\u001b[33m';
+
+const String blue = '\u001b[34m';
+
+const String none = '\u001b[0m';
diff --git a/pkgs/source_span/lib/src/file.dart b/pkgs/source_span/lib/src/file.dart
new file mode 100644
index 0000000..74c9234
--- /dev/null
+++ b/pkgs/source_span/lib/src/file.dart
@@ -0,0 +1,454 @@
+// Copyright (c) 2014, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:math' as math;
+import 'dart:typed_data';
+
+import 'location.dart';
+import 'location_mixin.dart';
+import 'span.dart';
+import 'span_mixin.dart';
+import 'span_with_context.dart';
+
+// Constants to determine end-of-lines.
+const int _lf = 10;
+const int _cr = 13;
+
+/// A class representing a source file.
+///
+/// This doesn't necessarily have to correspond to a file on disk, just a chunk
+/// of text usually with a URL associated with it.
+class SourceFile {
+ /// The URL where the source file is located.
+ ///
+ /// This may be null, indicating that the URL is unknown or unavailable.
+ final Uri? url;
+
+ /// An array of offsets for each line beginning in the file.
+ ///
+ /// Each offset refers to the first character *after* the newline. If the
+ /// source file has a trailing newline, the final offset won't actually be in
+ /// the file.
+ final _lineStarts = <int>[0];
+
+ /// The code units of the characters in the file.
+ ///
+ /// If this was constructed with the deprecated `SourceFile()` constructor,
+ /// this will instead contain the code _points_ of the characters in the file
+ /// (so characters above 2^16 are represented as individual integers rather
+ /// than surrogate pairs).
+ List<int> get codeUnits => _decodedChars;
+
+ /// The code units of the characters in this file.
+ final Uint32List _decodedChars;
+
+ /// The length of the file in characters.
+ int get length => _decodedChars.length;
+
+ /// The number of lines in the file.
+ int get lines => _lineStarts.length;
+
+ /// The line that the offset fell on the last time [getLine] was called.
+ ///
+ /// In many cases, sequential calls to getLine() are for nearby, usually
+ /// increasing offsets. In that case, we can find the line for an offset
+ /// quickly by first checking to see if the offset is on the same line as the
+ /// previous result.
+ int? _cachedLine;
+
+ /// This constructor is deprecated.
+ ///
+ /// Use [SourceFile.fromString] instead.
+ @Deprecated('Will be removed in 2.0.0')
+ SourceFile(String text, {Object? url}) : this.decoded(text.runes, url: url);
+
+ /// Creates a new source file from [text].
+ ///
+ /// [url] may be either a [String], a [Uri], or `null`.
+ SourceFile.fromString(String text, {Object? url})
+ : this.decoded(text.codeUnits, url: url);
+
+ /// Creates a new source file from a list of decoded code units.
+ ///
+ /// [url] may be either a [String], a [Uri], or `null`.
+ ///
+ /// Currently, if [decodedChars] contains characters larger than `0xFFFF`,
+ /// they'll be treated as single characters rather than being split into
+ /// surrogate pairs. **This behavior is deprecated**. For
+ /// forwards-compatibility, callers should only pass in characters less than
+ /// or equal to `0xFFFF`.
+ SourceFile.decoded(Iterable<int> decodedChars, {Object? url})
+ : url = url is String ? Uri.parse(url) : url as Uri?,
+ _decodedChars = Uint32List.fromList(decodedChars.toList()) {
+ for (var i = 0; i < _decodedChars.length; i++) {
+ var c = _decodedChars[i];
+ if (c == _cr) {
+ // Return not followed by newline is treated as a newline
+ final j = i + 1;
+ if (j >= _decodedChars.length || _decodedChars[j] != _lf) c = _lf;
+ }
+ if (c == _lf) _lineStarts.add(i + 1);
+ }
+ }
+
+ /// Returns a span from [start] to [end] (exclusive).
+ ///
+ /// If [end] isn't passed, it defaults to the end of the file.
+ FileSpan span(int start, [int? end]) {
+ end ??= length;
+ return _FileSpan(this, start, end);
+ }
+
+ /// Returns a location at [offset].
+ FileLocation location(int offset) => FileLocation._(this, offset);
+
+ /// Gets the 0-based line corresponding to [offset].
+ int getLine(int offset) {
+ if (offset < 0) {
+ throw RangeError('Offset may not be negative, was $offset.');
+ } else if (offset > length) {
+ throw RangeError('Offset $offset must not be greater than the number '
+ 'of characters in the file, $length.');
+ }
+
+ if (offset < _lineStarts.first) return -1;
+ if (offset >= _lineStarts.last) return _lineStarts.length - 1;
+
+ if (_isNearCachedLine(offset)) return _cachedLine!;
+
+ _cachedLine = _binarySearch(offset) - 1;
+ return _cachedLine!;
+ }
+
+ /// Returns `true` if [offset] is near [_cachedLine].
+ ///
+ /// Checks on [_cachedLine] and the next line. If it's on the next line, it
+ /// updates [_cachedLine] to point to that.
+ bool _isNearCachedLine(int offset) {
+ if (_cachedLine == null) return false;
+ final cachedLine = _cachedLine!;
+
+ // See if it's before the cached line.
+ if (offset < _lineStarts[cachedLine]) return false;
+
+ // See if it's on the cached line.
+ if (cachedLine >= _lineStarts.length - 1 ||
+ offset < _lineStarts[cachedLine + 1]) {
+ return true;
+ }
+
+ // See if it's on the next line.
+ if (cachedLine >= _lineStarts.length - 2 ||
+ offset < _lineStarts[cachedLine + 2]) {
+ _cachedLine = cachedLine + 1;
+ return true;
+ }
+
+ return false;
+ }
+
+ /// Binary search through [_lineStarts] to find the line containing [offset].
+ ///
+ /// Returns the index of the line in [_lineStarts].
+ int _binarySearch(int offset) {
+ var min = 0;
+ var max = _lineStarts.length - 1;
+ while (min < max) {
+ final half = min + ((max - min) ~/ 2);
+ if (_lineStarts[half] > offset) {
+ max = half;
+ } else {
+ min = half + 1;
+ }
+ }
+
+ return max;
+ }
+
+ /// Gets the 0-based column corresponding to [offset].
+ ///
+ /// If [line] is passed, it's assumed to be the line containing [offset] and
+ /// is used to more efficiently compute the column.
+ int getColumn(int offset, {int? line}) {
+ if (offset < 0) {
+ throw RangeError('Offset may not be negative, was $offset.');
+ } else if (offset > length) {
+      throw RangeError('Offset $offset must not be greater than the '
+ 'number of characters in the file, $length.');
+ }
+
+ if (line == null) {
+ line = getLine(offset);
+ } else if (line < 0) {
+ throw RangeError('Line may not be negative, was $line.');
+ } else if (line >= lines) {
+ throw RangeError('Line $line must be less than the number of '
+ 'lines in the file, $lines.');
+ }
+
+ final lineStart = _lineStarts[line];
+ if (lineStart > offset) {
+ throw RangeError('Line $line comes after offset $offset.');
+ }
+
+ return offset - lineStart;
+ }
+
+ /// Gets the offset for a [line] and [column].
+ ///
+ /// [column] defaults to 0.
+ int getOffset(int line, [int? column]) {
+ column ??= 0;
+
+ if (line < 0) {
+ throw RangeError('Line may not be negative, was $line.');
+ } else if (line >= lines) {
+ throw RangeError('Line $line must be less than the number of '
+ 'lines in the file, $lines.');
+ } else if (column < 0) {
+ throw RangeError('Column may not be negative, was $column.');
+ }
+
+ final result = _lineStarts[line] + column;
+ if (result > length ||
+ (line + 1 < lines && result >= _lineStarts[line + 1])) {
+ throw RangeError("Line $line doesn't have $column columns.");
+ }
+
+ return result;
+ }
+
+ /// Returns the text of the file from [start] to [end] (exclusive).
+ ///
+ /// If [end] isn't passed, it defaults to the end of the file.
+ String getText(int start, [int? end]) =>
+ String.fromCharCodes(_decodedChars.sublist(start, end));
+}
+
+/// A [SourceLocation] within a [SourceFile].
+///
+/// Unlike the base [SourceLocation], [FileLocation] lazily computes its line
+/// and column values based on its offset and the contents of [file].
+///
+/// A [FileLocation] can be created using [SourceFile.location].
+class FileLocation extends SourceLocationMixin implements SourceLocation {
+ /// The [file] that `this` belongs to.
+ final SourceFile file;
+
+ @override
+ final int offset;
+
+ @override
+ Uri? get sourceUrl => file.url;
+
+ @override
+ int get line => file.getLine(offset);
+
+ @override
+ int get column => file.getColumn(offset);
+
+ FileLocation._(this.file, this.offset) {
+ if (offset < 0) {
+ throw RangeError('Offset may not be negative, was $offset.');
+ } else if (offset > file.length) {
+ throw RangeError('Offset $offset must not be greater than the number '
+ 'of characters in the file, ${file.length}.');
+ }
+ }
+
+ @override
+ FileSpan pointSpan() => _FileSpan(file, offset, offset);
+}
+
+/// A [SourceSpan] within a [SourceFile].
+///
+/// Unlike the base [SourceSpan], [FileSpan] lazily computes its line and column
+/// values based on its offset and the contents of [file]. [FileSpan.message]
+/// is also able to provide more context than [SourceSpan.message], and
+/// [FileSpan.union] will return a [FileSpan] if possible.
+///
+/// A [FileSpan] can be created using [SourceFile.span].
+abstract class FileSpan implements SourceSpanWithContext {
+ /// The [file] that `this` belongs to.
+ SourceFile get file;
+
+ @override
+ FileLocation get start;
+
+ @override
+ FileLocation get end;
+
+ /// Returns a new span that covers both `this` and [other].
+ ///
+ /// Unlike [union], [other] may be disjoint from `this`. If it is, the text
+ /// between the two will be covered by the returned span.
+ FileSpan expand(FileSpan other);
+}
+
+/// The implementation of [FileSpan].
+///
+/// This is split into a separate class so that `is _FileSpan` checks can be run
+/// to make certain operations more efficient. If we used `is FileSpan`, that
+/// would break if external classes implemented the interface.
+class _FileSpan extends SourceSpanMixin implements FileSpan {
+ @override
+ final SourceFile file;
+
+ /// The offset of the beginning of the span.
+ ///
+ /// [start] is lazily generated from this to avoid allocating unnecessary
+ /// objects.
+ final int _start;
+
+ /// The offset of the end of the span.
+ ///
+ /// [end] is lazily generated from this to avoid allocating unnecessary
+ /// objects.
+ final int _end;
+
+ @override
+ Uri? get sourceUrl => file.url;
+
+ @override
+ int get length => _end - _start;
+
+ @override
+ FileLocation get start => FileLocation._(file, _start);
+
+ @override
+ FileLocation get end => FileLocation._(file, _end);
+
+ @override
+ String get text => file.getText(_start, _end);
+
+ @override
+ String get context {
+ final endLine = file.getLine(_end);
+ final endColumn = file.getColumn(_end);
+
+ int? endOffset;
+ if (endColumn == 0 && endLine != 0) {
+ // If [end] is at the very beginning of the line, the span covers the
+ // previous newline, so we only want to include the previous line in the
+ // context...
+
+ if (length == 0) {
+ // ...unless this is a point span, in which case we want to include the
+ // next line (or the empty string if this is the end of the file).
+ return endLine == file.lines - 1
+ ? ''
+ : file.getText(
+ file.getOffset(endLine), file.getOffset(endLine + 1));
+ }
+
+ endOffset = _end;
+ } else if (endLine == file.lines - 1) {
+ // If the span covers the last line of the file, the context should go all
+ // the way to the end of the file.
+ endOffset = file.length;
+ } else {
+ // Otherwise, the context should cover the full line on which [end]
+ // appears.
+ endOffset = file.getOffset(endLine + 1);
+ }
+
+ return file.getText(file.getOffset(file.getLine(_start)), endOffset);
+ }
+
+ _FileSpan(this.file, this._start, this._end) {
+ if (_end < _start) {
+ throw ArgumentError('End $_end must come after start $_start.');
+ } else if (_end > file.length) {
+ throw RangeError('End $_end must not be greater than the number '
+ 'of characters in the file, ${file.length}.');
+ } else if (_start < 0) {
+ throw RangeError('Start may not be negative, was $_start.');
+ }
+ }
+
+ @override
+ int compareTo(SourceSpan other) {
+ if (other is! _FileSpan) return super.compareTo(other);
+
+ final result = _start.compareTo(other._start);
+ return result == 0 ? _end.compareTo(other._end) : result;
+ }
+
+ @override
+ SourceSpan union(SourceSpan other) {
+ if (other is! FileSpan) return super.union(other);
+
+ final span = expand(other);
+
+ if (other is _FileSpan) {
+ if (_start > other._end || other._start > _end) {
+ throw ArgumentError('Spans $this and $other are disjoint.');
+ }
+ } else {
+ if (_start > other.end.offset || other.start.offset > _end) {
+ throw ArgumentError('Spans $this and $other are disjoint.');
+ }
+ }
+
+ return span;
+ }
+
+ @override
+ bool operator ==(Object other) {
+ if (other is! FileSpan) return super == other;
+ if (other is! _FileSpan) {
+ return super == other && sourceUrl == other.sourceUrl;
+ }
+
+ return _start == other._start &&
+ _end == other._end &&
+ sourceUrl == other.sourceUrl;
+ }
+
+ @override
+ int get hashCode => Object.hash(_start, _end, sourceUrl);
+
+ /// Returns a new span that covers both `this` and [other].
+ ///
+ /// Unlike [union], [other] may be disjoint from `this`. If it is, the text
+ /// between the two will be covered by the returned span.
+ @override
+ FileSpan expand(FileSpan other) {
+ if (sourceUrl != other.sourceUrl) {
+ throw ArgumentError('Source URLs "$sourceUrl" and '
+ " \"${other.sourceUrl}\" don't match.");
+ }
+
+ if (other is _FileSpan) {
+ final start = math.min(_start, other._start);
+ final end = math.max(_end, other._end);
+ return _FileSpan(file, start, end);
+ } else {
+ final start = math.min(_start, other.start.offset);
+ final end = math.max(_end, other.end.offset);
+ return _FileSpan(file, start, end);
+ }
+ }
+
+ /// See `SourceSpanExtension.subspan`.
+ FileSpan subspan(int start, [int? end]) {
+ RangeError.checkValidRange(start, end, length);
+ if (start == 0 && (end == null || end == length)) return this;
+ return file.span(_start + start, end == null ? _end : _start + end);
+ }
+}
+
+// TODO(#52): Move these to instance methods in the next breaking release.
+/// Extension methods on the [FileSpan] API.
+extension FileSpanExtension on FileSpan {
+ /// See `SourceSpanExtension.subspan`.
+ FileSpan subspan(int start, [int? end]) {
+ RangeError.checkValidRange(start, end, length);
+ if (start == 0 && (end == null || end == length)) return this;
+
+ final startOffset = this.start.offset;
+ return file.span(
+ startOffset + start, end == null ? this.end.offset : startOffset + end);
+ }
+}
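The doc comments above draw a distinction between `expand` and `union`: `expand` accepts disjoint spans and covers the text between them, while `union` throws for disjoint spans. A small sketch of that difference (the file contents here are made up):
```dart
import 'package:source_span/source_span.dart';

void main() {
  final file = SourceFile.fromString('abcdefghij', url: 'letters.txt');
  final first = file.span(0, 2); // "ab"
  final last = file.span(8, 10); // "ij"

  // expand() tolerates disjoint spans and covers everything in between.
  print(first.expand(last).text); // abcdefghij

  // union() requires the spans to touch or overlap.
  try {
    first.union(last);
  } on ArgumentError catch (e) {
    print(e.message); // Spans ... are disjoint.
  }
}
```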
diff --git a/pkgs/source_span/lib/src/highlighter.dart b/pkgs/source_span/lib/src/highlighter.dart
new file mode 100644
index 0000000..19e04d0
--- /dev/null
+++ b/pkgs/source_span/lib/src/highlighter.dart
@@ -0,0 +1,727 @@
+// Copyright (c) 2018, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:math' as math;
+
+import 'package:collection/collection.dart';
+import 'package:path/path.dart' as p;
+import 'package:term_glyph/term_glyph.dart' as glyph;
+
+import 'charcode.dart';
+import 'colors.dart' as colors;
+import 'location.dart';
+import 'span.dart';
+import 'span_with_context.dart';
+import 'utils.dart';
+
+/// A class for writing a chunk of text with a particular span highlighted.
+class Highlighter {
+ /// The lines to display, including context around the highlighted spans.
+ final List<_Line> _lines;
+
+ /// The color to highlight the primary [_Highlight] within its context, or
+ /// `null` if it should not be colored.
+ final String? _primaryColor;
+
+ /// The color to highlight the secondary [_Highlight]s within their context,
+ /// or `null` if they should not be colored.
+ final String? _secondaryColor;
+
+ /// The number of characters before the bar in the sidebar.
+ final int _paddingBeforeSidebar;
+
+ /// The maximum number of multiline spans that cover any part of a single
+ /// line in [_lines].
+ final int _maxMultilineSpans;
+
+ /// Whether [_lines] includes lines from multiple different files.
+ final bool _multipleFiles;
+
+ /// The buffer to which to write the result.
+ final _buffer = StringBuffer();
+
+  /// The number of spaces to render for hard tabs that appear in the source
+  /// text.
+ ///
+ /// We don't want to render raw tabs, because they'll mess up our character
+ /// alignment.
+ static const _spacesPerTab = 4;
+
+ /// Creates a [Highlighter] that will return a string highlighting [span]
+ /// within the text of its file when [highlight] is called.
+ ///
+ /// [color] may either be a [String], a [bool], or `null`. If it's a string,
+ /// it indicates an [ANSI terminal color escape][] that should be used to
+ /// highlight [span]'s text (for example, `"\u001b[31m"` will color red). If
+ /// it's `true`, it indicates that the text should be highlighted using the
+ /// default color. If it's `false` or `null`, it indicates that no color
+ /// should be used.
+ ///
+ /// [ANSI terminal color escape]: https://en.wikipedia.org/wiki/ANSI_escape_code#Colors
+ Highlighter(SourceSpan span, {Object? color})
+ : this._(_collateLines([_Highlight(span, primary: true)]), () {
+ if (color == true) return colors.red;
+ if (color == false) return null;
+ return color as String?;
+ }(), null);
+
+ /// Creates a [Highlighter] that will return a string highlighting
+ /// [primarySpan] as well as all the spans in [secondarySpans] within the text
+ /// of their file when [highlight] is called.
+ ///
+ /// Each span has an associated label that will be written alongside it. For
+ /// [primarySpan] this message is [primaryLabel], and for [secondarySpans] the
+ /// labels are the map values.
+ ///
+ /// If [color] is `true`, this will use [ANSI terminal color escapes][] to
+ /// highlight the text. The [primarySpan] will be highlighted with
+ /// [primaryColor] (which defaults to red), and the [secondarySpans] will be
+ /// highlighted with [secondaryColor] (which defaults to blue). These
+ /// arguments are ignored if [color] is `false`.
+ ///
+ /// [ANSI terminal color escape]: https://en.wikipedia.org/wiki/ANSI_escape_code#Colors
+ Highlighter.multiple(SourceSpan primarySpan, String primaryLabel,
+ Map<SourceSpan, String> secondarySpans,
+ {bool color = false, String? primaryColor, String? secondaryColor})
+ : this._(
+ _collateLines([
+ _Highlight(primarySpan, label: primaryLabel, primary: true),
+ for (var entry in secondarySpans.entries)
+ _Highlight(entry.key, label: entry.value)
+ ]),
+ color ? (primaryColor ?? colors.red) : null,
+ color ? (secondaryColor ?? colors.blue) : null);
+
+ Highlighter._(this._lines, this._primaryColor, this._secondaryColor)
+ : _paddingBeforeSidebar = 1 +
+ math.max<int>(
+ // In a purely mathematical world, floor(log10(n)) would give the
+ // number of digits in n, but floating point errors render that
+ // unreliable in practice.
+ (_lines.last.number + 1).toString().length,
+ // If [_lines] aren't contiguous, we'll write "..." in place of a
+ // line number.
+ _contiguous(_lines) ? 0 : 3,
+ ),
+ _maxMultilineSpans = _lines
+ .map((line) => line.highlights
+ .where((highlight) => isMultiline(highlight.span))
+ .length)
+ .reduce(math.max),
+ _multipleFiles = !isAllTheSame(_lines.map((line) => line.url));
+
+  /// Returns whether successive lines in [lines] that come from the same
+  /// source file are also adjacent in the original file.
+ static bool _contiguous(List<_Line> lines) {
+ for (var i = 0; i < lines.length - 1; i++) {
+ final thisLine = lines[i];
+ final nextLine = lines[i + 1];
+ if (thisLine.number + 1 != nextLine.number &&
+ thisLine.url == nextLine.url) {
+ return false;
+ }
+ }
+ return true;
+ }
+
+  /// Collects all the source lines from the contexts of all spans in
+ /// [highlights], and associates them with the highlights that cover them.
+ static List<_Line> _collateLines(List<_Highlight> highlights) {
+ // Assign spans without URLs opaque Objects as keys. Each such Object will
+ // be different, but they can then be used later on to determine which lines
+ // came from the same span even if they'd all otherwise have `null` URLs.
+ final highlightsByUrl = groupBy<_Highlight, Object>(
+ highlights, (highlight) => highlight.span.sourceUrl ?? Object());
+ for (var list in highlightsByUrl.values) {
+ list.sort((highlight1, highlight2) =>
+ highlight1.span.compareTo(highlight2.span));
+ }
+
+ return highlightsByUrl.entries.expand((entry) {
+ final url = entry.key;
+ final highlightsForFile = entry.value;
+
+ // First, create a list of all the lines in the current file that we have
+ // context for along with their line numbers.
+ final lines = <_Line>[];
+ for (var highlight in highlightsForFile) {
+ final context = highlight.span.context;
+ // If [highlight.span.context] contains lines prior to the one
+ // [highlight.span.text] appears on, write those first.
+ final lineStart = findLineStart(
+ context, highlight.span.text, highlight.span.start.column)!;
+
+ final linesBeforeSpan =
+ '\n'.allMatches(context.substring(0, lineStart)).length;
+
+ var lineNumber = highlight.span.start.line - linesBeforeSpan;
+ for (var line in context.split('\n')) {
+ // Only add a line if it hasn't already been added for a previous span
+ if (lines.isEmpty || lineNumber > lines.last.number) {
+ lines.add(_Line(line, lineNumber, url));
+ }
+ lineNumber++;
+ }
+ }
+
+ // Next, associate each line with each highlight that covers it.
+ final activeHighlights = <_Highlight>[];
+ var highlightIndex = 0;
+ for (var line in lines) {
+ activeHighlights
+ .removeWhere((highlight) => highlight.span.end.line < line.number);
+
+ final oldHighlightLength = activeHighlights.length;
+ for (var highlight in highlightsForFile.skip(highlightIndex)) {
+ if (highlight.span.start.line > line.number) break;
+ activeHighlights.add(highlight);
+ }
+ highlightIndex += activeHighlights.length - oldHighlightLength;
+
+ line.highlights.addAll(activeHighlights);
+ }
+
+ return lines;
+ }).toList();
+ }
+
+ /// Returns the highlighted span text.
+ ///
+ /// This method should only be called once.
+ String highlight() {
+ _writeFileStart(_lines.first.url);
+
+ // Each index of this list represents a column after the sidebar that could
+ // contain a line indicating an active highlight. If it's `null`, that
+ // column is empty; if it contains a highlight, it should be drawn for that
+ // column.
+ final highlightsByColumn =
+ List<_Highlight?>.filled(_maxMultilineSpans, null);
+
+ for (var i = 0; i < _lines.length; i++) {
+ final line = _lines[i];
+ if (i > 0) {
+ final lastLine = _lines[i - 1];
+ if (lastLine.url != line.url) {
+ _writeSidebar(end: glyph.upEnd);
+ _buffer.writeln();
+ _writeFileStart(line.url);
+ } else if (lastLine.number + 1 != line.number) {
+ _writeSidebar(text: '...');
+ _buffer.writeln();
+ }
+ }
+
+ // If a highlight covers the entire first line other than initial
+ // whitespace, don't bother pointing out exactly where it begins. Iterate
+ // in reverse so that longer highlights (which are sorted after shorter
+ // highlights) appear further out, leading to fewer crossed lines.
+ for (var highlight in line.highlights.reversed) {
+ if (isMultiline(highlight.span) &&
+ highlight.span.start.line == line.number &&
+ _isOnlyWhitespace(
+ line.text.substring(0, highlight.span.start.column))) {
+ replaceFirstNull(highlightsByColumn, highlight);
+ }
+ }
+
+ _writeSidebar(line: line.number);
+ _buffer.write(' ');
+ _writeMultilineHighlights(line, highlightsByColumn);
+ if (highlightsByColumn.isNotEmpty) _buffer.write(' ');
+
+ final primaryIdx =
+ line.highlights.indexWhere((highlight) => highlight.isPrimary);
+ final primary = primaryIdx == -1 ? null : line.highlights[primaryIdx];
+
+ if (primary != null) {
+ _writeHighlightedText(
+ line.text,
+ primary.span.start.line == line.number
+ ? primary.span.start.column
+ : 0,
+ primary.span.end.line == line.number
+ ? primary.span.end.column
+ : line.text.length,
+ color: _primaryColor);
+ } else {
+ _writeText(line.text);
+ }
+ _buffer.writeln();
+
+ // Always write the primary span's indicator first so that it's right next
+ // to the highlighted text.
+ if (primary != null) _writeIndicator(line, primary, highlightsByColumn);
+ for (var highlight in line.highlights) {
+ if (highlight.isPrimary) continue;
+ _writeIndicator(line, highlight, highlightsByColumn);
+ }
+ }
+
+ _writeSidebar(end: glyph.upEnd);
+ return _buffer.toString();
+ }
+
+ /// Writes the beginning of the file highlight for the file with the given
+ /// [url] (or opaque object if it comes from a span with a null URL).
+ void _writeFileStart(Object url) {
+ if (!_multipleFiles || url is! Uri) {
+ _writeSidebar(end: glyph.downEnd);
+ } else {
+ _writeSidebar(end: glyph.topLeftCorner);
+ _colorize(() => _buffer.write('${glyph.horizontalLine * 2}>'),
+ color: colors.blue);
+ _buffer.write(' ${p.prettyUri(url)}');
+ }
+ _buffer.writeln();
+ }
+
+ /// Writes the post-sidebar highlight bars for [line] according to
+ /// [highlightsByColumn].
+ ///
+ /// If [current] is passed, it's the highlight for which an indicator is being
+ /// written. If it appears in [highlightsByColumn], a horizontal line is
+ /// written from its column to the rightmost column.
+ void _writeMultilineHighlights(
+ _Line line, List<_Highlight?> highlightsByColumn,
+ {_Highlight? current}) {
+ // Whether we've written a sidebar indicator for opening a new span on this
+ // line, and which color should be used for that indicator's rightward line.
+ var openedOnThisLine = false;
+ String? openedOnThisLineColor;
+
+ final currentColor = current == null
+ ? null
+ : current.isPrimary
+ ? _primaryColor
+ : _secondaryColor;
+ var foundCurrent = false;
+ for (var highlight in highlightsByColumn) {
+ final startLine = highlight?.span.start.line;
+ final endLine = highlight?.span.end.line;
+ if (current != null && highlight == current) {
+ foundCurrent = true;
+ assert(startLine == line.number || endLine == line.number);
+ _colorize(() {
+ _buffer.write(startLine == line.number
+ ? glyph.topLeftCorner
+ : glyph.bottomLeftCorner);
+ }, color: currentColor);
+ } else if (foundCurrent) {
+ _colorize(() {
+ _buffer.write(highlight == null ? glyph.horizontalLine : glyph.cross);
+ }, color: currentColor);
+ } else if (highlight == null) {
+ if (openedOnThisLine) {
+ _colorize(() => _buffer.write(glyph.horizontalLine),
+ color: openedOnThisLineColor);
+ } else {
+ _buffer.write(' ');
+ }
+ } else {
+ _colorize(() {
+ final vertical = openedOnThisLine ? glyph.cross : glyph.verticalLine;
+ if (current != null) {
+ _buffer.write(vertical);
+ } else if (startLine == line.number) {
+ _colorize(() {
+ _buffer
+ .write(glyph.glyphOrAscii(openedOnThisLine ? '┬' : '┌', '/'));
+ }, color: openedOnThisLineColor);
+ openedOnThisLine = true;
+ openedOnThisLineColor ??=
+ highlight.isPrimary ? _primaryColor : _secondaryColor;
+ } else if (endLine == line.number &&
+ highlight.span.end.column == line.text.length) {
+ _buffer.write(highlight.label == null
+ ? glyph.glyphOrAscii('└', r'\')
+ : vertical);
+ } else {
+ _colorize(() {
+ _buffer.write(vertical);
+ }, color: openedOnThisLineColor);
+ }
+ }, color: highlight.isPrimary ? _primaryColor : _secondaryColor);
+ }
+ }
+ }
+
+ // Writes [text], with text between [startColumn] and [endColumn] colorized in
+ // the same way as [_colorize].
+ void _writeHighlightedText(String text, int startColumn, int endColumn,
+ {required String? color}) {
+ _writeText(text.substring(0, startColumn));
+ _colorize(() => _writeText(text.substring(startColumn, endColumn)),
+ color: color);
+ _writeText(text.substring(endColumn, text.length));
+ }
+
+ /// Writes an indicator for where [highlight] starts, ends, or both below
+ /// [line].
+ ///
+ /// This may either add or remove [highlight] from [highlightsByColumn].
+ void _writeIndicator(
+ _Line line, _Highlight highlight, List<_Highlight?> highlightsByColumn) {
+ final color = highlight.isPrimary ? _primaryColor : _secondaryColor;
+ if (!isMultiline(highlight.span)) {
+ _writeSidebar();
+ _buffer.write(' ');
+ _writeMultilineHighlights(line, highlightsByColumn, current: highlight);
+ if (highlightsByColumn.isNotEmpty) _buffer.write(' ');
+
+ final underlineLength = _colorize(() {
+ final start = _buffer.length;
+ _writeUnderline(line, highlight.span,
+ highlight.isPrimary ? '^' : glyph.horizontalLineBold);
+ return _buffer.length - start;
+ }, color: color);
+ _writeLabel(highlight, highlightsByColumn, underlineLength);
+ } else if (highlight.span.start.line == line.number) {
+ if (highlightsByColumn.contains(highlight)) return;
+ replaceFirstNull(highlightsByColumn, highlight);
+
+ _writeSidebar();
+ _buffer.write(' ');
+ _writeMultilineHighlights(line, highlightsByColumn, current: highlight);
+ _colorize(() => _writeArrow(line, highlight.span.start.column),
+ color: color);
+ _buffer.writeln();
+ } else if (highlight.span.end.line == line.number) {
+ final coversWholeLine = highlight.span.end.column == line.text.length;
+ if (coversWholeLine && highlight.label == null) {
+ replaceWithNull(highlightsByColumn, highlight);
+ return;
+ }
+
+ _writeSidebar();
+ _buffer.write(' ');
+ _writeMultilineHighlights(line, highlightsByColumn, current: highlight);
+
+ final underlineLength = _colorize(() {
+ final start = _buffer.length;
+ if (coversWholeLine) {
+ _buffer.write(glyph.horizontalLine * 3);
+ } else {
+ _writeArrow(line, math.max(highlight.span.end.column - 1, 0),
+ beginning: false);
+ }
+ return _buffer.length - start;
+ }, color: color);
+ _writeLabel(highlight, highlightsByColumn, underlineLength);
+ replaceWithNull(highlightsByColumn, highlight);
+ }
+ }
+
+ /// Underlines the portion of [line] covered by [span] with repeated instances
+ /// of [character].
+ void _writeUnderline(_Line line, SourceSpan span, String character) {
+ assert(!isMultiline(span));
+ assert(line.text.contains(span.text),
+ '"${line.text}" should contain "${span.text}"');
+
+ var startColumn = span.start.column;
+ var endColumn = span.end.column;
+
+ // Adjust the start and end columns to account for any tabs that were
+ // converted to spaces.
+ final tabsBefore = _countTabs(line.text.substring(0, startColumn));
+ final tabsInside = _countTabs(line.text.substring(startColumn, endColumn));
+ startColumn += tabsBefore * (_spacesPerTab - 1);
+ endColumn += (tabsBefore + tabsInside) * (_spacesPerTab - 1);
+
+ _buffer
+ ..write(' ' * startColumn)
+ ..write(character * math.max(endColumn - startColumn, 1));
+ }
+
+ /// Write an arrow pointing to column [column] in [line].
+ ///
+ /// If the arrow points to a tab character, this will point to the beginning
+ /// of the tab if [beginning] is `true` and the end if it's `false`.
+ void _writeArrow(_Line line, int column, {bool beginning = true}) {
+ final tabs =
+ _countTabs(line.text.substring(0, column + (beginning ? 0 : 1)));
+ _buffer
+ ..write(glyph.horizontalLine * (1 + column + tabs * (_spacesPerTab - 1)))
+ ..write('^');
+ }
+
+ /// Writes [highlight]'s label.
+ ///
+ /// The `_buffer` is assumed to be written to the point where the first line
+ /// of `highlight.label` can be written after a space, but this takes care of
+ /// writing indentation and highlight columns for later lines.
+ ///
+ /// The [highlightsByColumn] are used to write ongoing highlight lines if the
+ /// label is more than one line long.
+ ///
+ /// The [underlineLength] is the length of the line written between the
+ /// highlights and the beginning of the first label.
+ void _writeLabel(_Highlight highlight, List<_Highlight?> highlightsByColumn,
+ int underlineLength) {
+ final label = highlight.label;
+ if (label == null) {
+ _buffer.writeln();
+ return;
+ }
+
+ final lines = label.split('\n');
+ final color = highlight.isPrimary ? _primaryColor : _secondaryColor;
+ _colorize(() => _buffer.write(' ${lines.first}'), color: color);
+ _buffer.writeln();
+
+ for (var text in lines.skip(1)) {
+ _writeSidebar();
+ _buffer.write(' ');
+ for (var columnHighlight in highlightsByColumn) {
+ if (columnHighlight == null || columnHighlight == highlight) {
+ _buffer.write(' ');
+ } else {
+ _buffer.write(glyph.verticalLine);
+ }
+ }
+
+ _buffer.write(' ' * underlineLength);
+ _colorize(() => _buffer.write(' $text'), color: color);
+ _buffer.writeln();
+ }
+ }
+
+ /// Writes a snippet from the source text, converting hard tab characters into
+ /// plain indentation.
+ void _writeText(String text) {
+ for (var char in text.codeUnits) {
+ if (char == $tab) {
+ _buffer.write(' ' * _spacesPerTab);
+ } else {
+ _buffer.writeCharCode(char);
+ }
+ }
+ }
+
+ // Writes a sidebar to [_buffer] that includes [line] as the line number if
+ // given and writes [end] at the end (defaults to [glyph.verticalLine]).
+ //
+ // If [text] is given, it's used in place of the line number. It can't be
+ // passed at the same time as [line].
+ void _writeSidebar({int? line, String? text, String? end}) {
+ assert(line == null || text == null);
+
+ // Add 1 to line to convert from computer-friendly 0-indexed line numbers to
+ // human-friendly 1-indexed line numbers.
+ if (line != null) text = (line + 1).toString();
+ _colorize(() {
+ _buffer
+ ..write((text ?? '').padRight(_paddingBeforeSidebar))
+ ..write(end ?? glyph.verticalLine);
+ }, color: colors.blue);
+ }
+
+ /// Returns the number of hard tabs in [text].
+ int _countTabs(String text) {
+ var count = 0;
+ for (var char in text.codeUnits) {
+ if (char == $tab) count++;
+ }
+ return count;
+ }
+
+ /// Returns whether [text] contains only space or tab characters.
+ bool _isOnlyWhitespace(String text) {
+ for (var char in text.codeUnits) {
+ if (char != $space && char != $tab) return false;
+ }
+ return true;
+ }
+
+ /// Colors all text written to [_buffer] during [callback], if colorization is
+ /// enabled and [color] is not `null`.
+ T _colorize<T>(T Function() callback, {required String? color}) {
+ if (_primaryColor != null && color != null) _buffer.write(color);
+ final result = callback();
+ if (_primaryColor != null && color != null) _buffer.write(colors.none);
+ return result;
+ }
+}
+
+/// Information about how to highlight a single section of a source file.
+class _Highlight {
+ /// The section of the source file to highlight.
+ ///
+ /// This is normalized to make it easier for [Highlighter] to work with.
+ final SourceSpanWithContext span;
+
+ /// Whether this is the primary span in the highlight.
+ ///
+ /// The primary span is highlighted with a different character and colored
+ /// differently than non-primary spans.
+ final bool isPrimary;
+
+ /// The label to include inline when highlighting [span].
+ ///
+ /// This helps clarify what each highlight means when multiple highlights are
+ /// used in the same message.
+ final String? label;
+
+ _Highlight(SourceSpan span, {String? label, bool primary = false})
+ : span = (() {
+ var newSpan = _normalizeContext(span);
+ newSpan = _normalizeNewlines(newSpan);
+ newSpan = _normalizeTrailingNewline(newSpan);
+ return _normalizeEndOfLine(newSpan);
+ })(),
+ isPrimary = primary,
+ label = label?.replaceAll('\r\n', '\n');
+
+ /// Normalizes [span] to ensure that it's a [SourceSpanWithContext] whose
+ /// context actually contains its text at the expected column.
+ ///
+ /// If it's not already a [SourceSpanWithContext], adjust the start and end
+ /// locations' line and column fields so that the highlighter can assume they
+ /// match up with the context.
+ static SourceSpanWithContext _normalizeContext(SourceSpan span) =>
+ span is SourceSpanWithContext &&
+ findLineStart(span.context, span.text, span.start.column) != null
+ ? span
+ : SourceSpanWithContext(
+ SourceLocation(span.start.offset,
+ sourceUrl: span.sourceUrl, line: 0, column: 0),
+ SourceLocation(span.end.offset,
+ sourceUrl: span.sourceUrl,
+ line: countCodeUnits(span.text, $lf),
+ column: _lastLineLength(span.text)),
+ span.text,
+ span.text);
+
+ /// Normalizes [span] to replace Windows-style newlines with Unix-style
+ /// newlines.
+ static SourceSpanWithContext _normalizeNewlines(SourceSpanWithContext span) {
+ final text = span.text;
+ if (!text.contains('\r\n')) return span;
+
+ var endOffset = span.end.offset;
+ for (var i = 0; i < text.length - 1; i++) {
+ if (text.codeUnitAt(i) == $cr && text.codeUnitAt(i + 1) == $lf) {
+ endOffset--;
+ }
+ }
+
+ return SourceSpanWithContext(
+ span.start,
+ SourceLocation(endOffset,
+ sourceUrl: span.sourceUrl,
+ line: span.end.line,
+ column: span.end.column),
+ text.replaceAll('\r\n', '\n'),
+ span.context.replaceAll('\r\n', '\n'));
+ }
+
+ /// Normalizes [span] to remove a trailing newline from `span.context`.
+ ///
+ /// If necessary, also adjust `span.end` so that it doesn't point past where
+ /// the trailing newline used to be.
+ static SourceSpanWithContext _normalizeTrailingNewline(
+ SourceSpanWithContext span) {
+ if (!span.context.endsWith('\n')) return span;
+
+ // If there's a full blank line on the end of [span.context], it's probably
+ // significant, so we shouldn't trim it.
+ if (span.text.endsWith('\n\n')) return span;
+
+ final context = span.context.substring(0, span.context.length - 1);
+ var text = span.text;
+ var start = span.start;
+ var end = span.end;
+ if (span.text.endsWith('\n') && _isTextAtEndOfContext(span)) {
+ text = span.text.substring(0, span.text.length - 1);
+ if (text.isEmpty) {
+ end = start;
+ } else {
+ end = SourceLocation(span.end.offset - 1,
+ sourceUrl: span.sourceUrl,
+ line: span.end.line - 1,
+ column: _lastLineLength(context));
+ start = span.start.offset == span.end.offset ? end : span.start;
+ }
+ }
+ return SourceSpanWithContext(start, end, text, context);
+ }
+
+ /// Normalizes [span] so that the end location is at the end of a line rather
+ /// than at the beginning of the next line.
+ static SourceSpanWithContext _normalizeEndOfLine(SourceSpanWithContext span) {
+ if (span.end.column != 0) return span;
+ if (span.end.line == span.start.line) return span;
+
+ final text = span.text.substring(0, span.text.length - 1);
+
+ return SourceSpanWithContext(
+ span.start,
+ SourceLocation(span.end.offset - 1,
+ sourceUrl: span.sourceUrl,
+ line: span.end.line - 1,
+ column: text.length - text.lastIndexOf('\n') - 1),
+ text,
+ // If the context also ends with a newline, it's possible that we don't
+ // have the full context for that line, so we shouldn't print it at all.
+ span.context.endsWith('\n')
+ ? span.context.substring(0, span.context.length - 1)
+ : span.context);
+ }
+
+ /// Returns the length of the last line in [text], whether or not it ends in a
+ /// newline.
+ static int _lastLineLength(String text) {
+ if (text.isEmpty) {
+ return 0;
+ } else if (text.codeUnitAt(text.length - 1) == $lf) {
+ return text.length == 1
+ ? 0
+ : text.length - text.lastIndexOf('\n', text.length - 2) - 1;
+ } else {
+ return text.length - text.lastIndexOf('\n') - 1;
+ }
+ }
+
+ /// Returns whether [span]'s text runs all the way to the end of its context.
+ static bool _isTextAtEndOfContext(SourceSpanWithContext span) =>
+ findLineStart(span.context, span.text, span.start.column)! +
+ span.start.column +
+ span.length ==
+ span.context.length;
+
+ @override
+ String toString() {
+ final buffer = StringBuffer();
+ if (isPrimary) buffer.write('primary ');
+ buffer.write('${span.start.line}:${span.start.column}-'
+ '${span.end.line}:${span.end.column}');
+ if (label != null) buffer.write(' ($label)');
+ return buffer.toString();
+ }
+}
+
+/// A single line of the source file being highlighted.
+class _Line {
+ /// The text of the line, not including the trailing newline.
+ final String text;
+
+ /// The 0-based line number in the source file.
+ final int number;
+
+ /// The URL of the source file in which this line appears.
+ ///
+ /// For lines created from spans without an explicit URL, this is an opaque
+ /// object that differs between lines that come from different spans.
+ final Object url;
+
+ /// All highlights that cover any portion of this line, in source span order.
+ ///
+ /// This is populated after the initial line is created.
+ final highlights = <_Highlight>[];
+
+ _Line(this.text, this.number, this.url);
+
+ @override
+ String toString() => '$number: "$text" (${highlights.join(', ')})';
+}
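
The `Highlighter` above is internal to `src/`; consumers reach it through `SourceSpan.message`/`highlight` and the `*Multiple` extensions added later in this diff. A rough usage sketch (the file contents, offsets, and labels are invented, and `SourceFile` comes from `file.dart`, which isn't part of this hunk):

```dart
import 'package:source_span/source_span.dart';

void main() {
  final file = SourceFile.fromString('int add(int a, int b) => a + b;\n',
      url: 'add.dart');

  final primary = file.span(25, 30); // "a + b"
  final secondary = file.span(0, 7); // "int add"

  // Prints "line 1, column 26 of add.dart: expression has type int" followed
  // by the underlined snippet, with each label drawn next to its highlight.
  print(primary.messageMultiple('expression has type int', 'returned here',
      {secondary: 'declared here'}));
}
```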
diff --git a/pkgs/source_span/lib/src/location.dart b/pkgs/source_span/lib/src/location.dart
new file mode 100644
index 0000000..8f22d7b
--- /dev/null
+++ b/pkgs/source_span/lib/src/location.dart
@@ -0,0 +1,102 @@
+// Copyright (c) 2014, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'span.dart';
+
+// TODO(nweiz): Use SourceLocationMixin once we decide to cut a release with
+// breaking changes. See SourceLocationMixin for details.
+
+/// A class that describes a single location within a source file.
+///
+ /// This class should not be extended. Instead, extend [SourceLocationBase].
+class SourceLocation implements Comparable<SourceLocation> {
+ /// URL of the source containing this location.
+ ///
+ /// This may be null, indicating that the source URL is unknown or
+ /// unavailable.
+ final Uri? sourceUrl;
+
+ /// The 0-based offset of this location in the source.
+ final int offset;
+
+ /// The 0-based line of this location in the source.
+ final int line;
+
+ /// The 0-based column of this location in the source.
+ final int column;
+
+ /// Returns a representation of this location in the `source:line:column`
+ /// format used by text editors.
+ ///
+ /// This prints 1-based lines and columns.
+ String get toolString {
+ final source = sourceUrl ?? 'unknown source';
+ return '$source:${line + 1}:${column + 1}';
+ }
+
+ /// Creates a new location indicating [offset] within [sourceUrl].
+ ///
+ /// [line] and [column] default to assuming the source is a single line. This
+ /// means that [line] defaults to 0 and [column] defaults to [offset].
+ ///
+ /// [sourceUrl] may be either a [String], a [Uri], or `null`.
+ SourceLocation(this.offset, {Object? sourceUrl, int? line, int? column})
+ : sourceUrl =
+ sourceUrl is String ? Uri.parse(sourceUrl) : sourceUrl as Uri?,
+ line = line ?? 0,
+ column = column ?? offset {
+ if (offset < 0) {
+ throw RangeError('Offset may not be negative, was $offset.');
+ } else if (line != null && line < 0) {
+ throw RangeError('Line may not be negative, was $line.');
+ } else if (column != null && column < 0) {
+ throw RangeError('Column may not be negative, was $column.');
+ }
+ }
+
+ /// Returns the distance in characters between `this` and [other].
+ ///
+ /// This always returns a non-negative value.
+ int distance(SourceLocation other) {
+ if (sourceUrl != other.sourceUrl) {
+ throw ArgumentError('Source URLs "$sourceUrl" and '
+ "\"${other.sourceUrl}\" don't match.");
+ }
+ return (offset - other.offset).abs();
+ }
+
+ /// Returns a span that covers only a single point: this location.
+ SourceSpan pointSpan() => SourceSpan(this, this, '');
+
+ /// Compares two locations.
+ ///
+ /// [other] must have the same source URL as `this`.
+ @override
+ int compareTo(SourceLocation other) {
+ if (sourceUrl != other.sourceUrl) {
+ throw ArgumentError('Source URLs "$sourceUrl" and '
+ "\"${other.sourceUrl}\" don't match.");
+ }
+ return offset - other.offset;
+ }
+
+ @override
+ bool operator ==(Object other) =>
+ other is SourceLocation &&
+ sourceUrl == other.sourceUrl &&
+ offset == other.offset;
+
+ @override
+ int get hashCode => (sourceUrl?.hashCode ?? 0) + offset;
+
+ @override
+ String toString() => '<$runtimeType: $offset $toolString>';
+}
+
+/// A base class for source locations with [offset], [line], and [column] known
+/// at construction time.
+class SourceLocationBase extends SourceLocation {
+ SourceLocationBase(super.offset, {super.sourceUrl, super.line, super.column});
+}
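
A quick sketch of the `SourceLocation` API defined above; the URL, offsets, and printed values are illustrative:

```dart
import 'package:source_span/source_span.dart';

void main() {
  final start = SourceLocation(0, sourceUrl: 'example.dart');
  final end = SourceLocation(5, sourceUrl: 'example.dart', column: 5);

  print(start.toolString);         // example.dart:1:1 (1-based for humans)
  print(start.distance(end));      // 5
  print(start.compareTo(end) < 0); // true -- `start` comes first
  print(start.pointSpan().length); // 0 -- a span covering only this point
}
```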
diff --git a/pkgs/source_span/lib/src/location_mixin.dart b/pkgs/source_span/lib/src/location_mixin.dart
new file mode 100644
index 0000000..a44f5e2
--- /dev/null
+++ b/pkgs/source_span/lib/src/location_mixin.dart
@@ -0,0 +1,55 @@
+// Copyright (c) 2015, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'location.dart';
+import 'span.dart';
+
+// Note: this class duplicates a lot of functionality of [SourceLocation]. This
+// is because in order for SourceLocation to use SourceLocationMixin,
+// SourceLocationMixin couldn't implement SourceLocation. In SourceSpan we
+// handle this by making the class itself non-extensible, but that would be a
+// breaking change for SourceLocation. So until we want to endure the pain of
+// cutting a release with breaking changes, we duplicate the code here.
+
+/// A mixin for easily implementing [SourceLocation].
+abstract class SourceLocationMixin implements SourceLocation {
+ @override
+ String get toolString {
+ final source = sourceUrl ?? 'unknown source';
+ return '$source:${line + 1}:${column + 1}';
+ }
+
+ @override
+ int distance(SourceLocation other) {
+ if (sourceUrl != other.sourceUrl) {
+ throw ArgumentError('Source URLs "$sourceUrl" and '
+ "\"${other.sourceUrl}\" don't match.");
+ }
+ return (offset - other.offset).abs();
+ }
+
+ @override
+ SourceSpan pointSpan() => SourceSpan(this, this, '');
+
+ @override
+ int compareTo(SourceLocation other) {
+ if (sourceUrl != other.sourceUrl) {
+ throw ArgumentError('Source URLs "$sourceUrl" and '
+ "\"${other.sourceUrl}\" don't match.");
+ }
+ return offset - other.offset;
+ }
+
+ @override
+ bool operator ==(Object other) =>
+ other is SourceLocation &&
+ sourceUrl == other.sourceUrl &&
+ offset == other.offset;
+
+ @override
+ int get hashCode => (sourceUrl?.hashCode ?? 0) + offset;
+
+ @override
+ String toString() => '<$runtimeType: $offset $toolString>';
+}
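
A hypothetical implementer of `SourceLocationMixin`: the mixin supplies `toolString`, `distance`, `pointSpan`, `compareTo`, `==`, and `hashCode`, so a subclass only needs to expose `sourceUrl`, `offset`, `line`, and `column`. The `CsvCellLocation` class and its data are invented, and this assumes the mixin is exported from the package's main library:

```dart
import 'package:source_span/source_span.dart';

/// A made-up location type for cells in a CSV file.
class CsvCellLocation extends SourceLocationMixin {
  @override
  final Uri? sourceUrl;
  @override
  final int offset;
  @override
  final int line;
  @override
  final int column;

  CsvCellLocation(this.offset,
      {required this.line, required this.column, this.sourceUrl});
}

void main() {
  final cell = CsvCellLocation(42,
      line: 2, column: 7, sourceUrl: Uri.parse('data.csv'));
  print(cell.toolString); // data.csv:3:8
}
```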
diff --git a/pkgs/source_span/lib/src/span.dart b/pkgs/source_span/lib/src/span.dart
new file mode 100644
index 0000000..941dedc
--- /dev/null
+++ b/pkgs/source_span/lib/src/span.dart
@@ -0,0 +1,193 @@
+// Copyright (c) 2014, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:path/path.dart' as p;
+import 'package:term_glyph/term_glyph.dart' as glyph;
+
+import 'file.dart';
+import 'highlighter.dart';
+import 'location.dart';
+import 'span_mixin.dart';
+import 'span_with_context.dart';
+import 'utils.dart';
+
+/// A class that describes a segment of source text.
+abstract class SourceSpan implements Comparable<SourceSpan> {
+ /// The start location of this span.
+ SourceLocation get start;
+
+ /// The end location of this span, exclusive.
+ SourceLocation get end;
+
+ /// The source text for this span.
+ String get text;
+
+ /// The URL of the source (typically a file) of this span.
+ ///
+ /// This may be null, indicating that the source URL is unknown or
+ /// unavailable.
+ Uri? get sourceUrl;
+
+ /// The length of this span, in characters.
+ int get length;
+
+ /// Creates a new span from [start] to [end] (exclusive) containing [text].
+ ///
+ /// [start] and [end] must have the same source URL and [start] must come
+ /// before [end]. [text] must have a number of characters equal to the
+ /// distance between [start] and [end].
+ factory SourceSpan(SourceLocation start, SourceLocation end, String text) =>
+ SourceSpanBase(start, end, text);
+
+ /// Creates a new span that's the union of `this` and [other].
+ ///
+ /// The two spans must have the same source URL and may not be disjoint.
+ /// [text] is computed by combining `this.text` and `other.text`.
+ SourceSpan union(SourceSpan other);
+
+ /// Compares two spans.
+ ///
+ /// [other] must have the same source URL as `this`. This orders spans by
+ /// [start] then [length].
+ @override
+ int compareTo(SourceSpan other);
+
+ /// Formats [message] in a human-friendly way associated with this span.
+ ///
+ /// [color] may either be a [String], a [bool], or `null`. If it's a string,
+ /// it indicates an [ANSI terminal color escape][] that should
+ /// be used to highlight the span's text (for example, `"\u001b[31m"` will
+ /// color red). If it's `true`, it indicates that the text should be
+ /// highlighted using the default color. If it's `false` or `null`, it
+ /// indicates that the text shouldn't be highlighted.
+ ///
+ /// This uses the full range of Unicode characters to highlight the source
+ /// span if [glyph.ascii] is `false` (the default), but only uses ASCII
+ /// characters if it's `true`.
+ ///
+ /// [ANSI terminal color escape]: https://en.wikipedia.org/wiki/ANSI_escape_code#Colors
+ String message(String message, {Object? color});
+
+ /// Prints the text associated with this span in a user-friendly way.
+ ///
+ /// This is identical to [message], except that it doesn't print the file
+ /// name, line number, column number, or message. If [length] is 0 and this
+ /// isn't a [SourceSpanWithContext], returns an empty string.
+ ///
+ /// [color] may either be a [String], a [bool], or `null`. If it's a string,
+ /// it indicates an [ANSI terminal color escape][] that should
+ /// be used to highlight the span's text (for example, `"\u001b[31m"` will
+ /// color red). If it's `true`, it indicates that the text should be
+ /// highlighted using the default color. If it's `false` or `null`, it
+ /// indicates that the text shouldn't be highlighted.
+ ///
+ /// This uses the full range of Unicode characters to highlight the source
+ /// span if [glyph.ascii] is `false` (the default), but only uses ASCII
+ /// characters if it's `true`.
+ ///
+ /// [ANSI terminal color escape]: https://en.wikipedia.org/wiki/ANSI_escape_code#Colors
+ String highlight({Object? color});
+}
+
+/// A base class for source spans with [start], [end], and [text] known at
+/// construction time.
+class SourceSpanBase extends SourceSpanMixin {
+ @override
+ final SourceLocation start;
+ @override
+ final SourceLocation end;
+ @override
+ final String text;
+
+ SourceSpanBase(this.start, this.end, this.text) {
+ if (end.sourceUrl != start.sourceUrl) {
+ throw ArgumentError('Source URLs "${start.sourceUrl}" and '
+ " \"${end.sourceUrl}\" don't match.");
+ } else if (end.offset < start.offset) {
+ throw ArgumentError('End $end must come after start $start.');
+ } else if (text.length != start.distance(end)) {
+ throw ArgumentError('Text "$text" must be ${start.distance(end)} '
+ 'characters long.');
+ }
+ }
+}
+
+// TODO(#52): Move these to instance methods in the next breaking release.
+/// Extension methods on the base [SourceSpan] API.
+extension SourceSpanExtension on SourceSpan {
+ /// Like [SourceSpan.message], but also highlights [secondarySpans] to provide
+ /// the user with additional context.
+ ///
+ /// Each span takes a label ([label] for this span, and the values of the
+ /// [secondarySpans] map for the secondary spans) that's used to indicate to
+ /// the user what that particular span represents.
+ ///
+ /// If [color] is `true`, [ANSI terminal color escapes][] are used to color
+ /// the resulting string. By default this span is colored red and the
+ /// secondary spans are colored blue, but that can be customized by passing
+ /// ANSI escape strings to [primaryColor] or [secondaryColor].
+ ///
+ /// [ANSI terminal color escapes]: https://en.wikipedia.org/wiki/ANSI_escape_code#Colors
+ ///
+ /// The spans in [secondarySpans] may come from the same file as this span or
+ /// from different files; spans from other files are highlighted under their
+ /// own file headers.
+ ///
+ /// Note that while this will work with plain [SourceSpan]s, it will produce
+ /// much more useful output with [SourceSpanWithContext]s (including
+ /// [FileSpan]s).
+ String messageMultiple(
+ String message, String label, Map<SourceSpan, String> secondarySpans,
+ {bool color = false, String? primaryColor, String? secondaryColor}) {
+ final buffer = StringBuffer()
+ ..write('line ${start.line + 1}, column ${start.column + 1}');
+ if (sourceUrl != null) buffer.write(' of ${p.prettyUri(sourceUrl)}');
+ buffer
+ ..writeln(': $message')
+ ..write(highlightMultiple(label, secondarySpans,
+ color: color,
+ primaryColor: primaryColor,
+ secondaryColor: secondaryColor));
+ return buffer.toString();
+ }
+
+ /// Like [SourceSpan.highlight], but also highlights [secondarySpans] to
+ /// provide the user with additional context.
+ ///
+ /// Each span takes a label ([label] for this span, and the values of the
+ /// [secondarySpans] map for the secondary spans) that's used to indicate to
+ /// the user what that particular span represents.
+ ///
+ /// If [color] is `true`, [ANSI terminal color escapes][] are used to color
+ /// the resulting string. By default this span is colored red and the
+ /// secondary spans are colored blue, but that can be customized by passing
+ /// ANSI escape strings to [primaryColor] or [secondaryColor].
+ ///
+ /// [ANSI terminal color escapes]: https://en.wikipedia.org/wiki/ANSI_escape_code#Colors
+ ///
+ /// The spans in [secondarySpans] may come from the same file as this span or
+ /// from different files; spans from other files are highlighted under their
+ /// own file headers.
+ ///
+ /// Note that while this will work with plain [SourceSpan]s, it will produce
+ /// much more useful output with [SourceSpanWithContext]s (including
+ /// [FileSpan]s).
+ String highlightMultiple(String label, Map<SourceSpan, String> secondarySpans,
+ {bool color = false, String? primaryColor, String? secondaryColor}) =>
+ Highlighter.multiple(this, label, secondarySpans,
+ color: color,
+ primaryColor: primaryColor,
+ secondaryColor: secondaryColor)
+ .highlight();
+
+ /// Returns a span from [start] code units (inclusive) to [end] code units
+ /// (exclusive) after the beginning of this span.
+ SourceSpan subspan(int start, [int? end]) {
+ RangeError.checkValidRange(start, end, length);
+ if (start == 0 && (end == null || end == length)) return this;
+
+ final locations = subspanLocations(this, start, end);
+ return SourceSpan(locations[0], locations[1], text.substring(start, end));
+ }
+}
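
An illustrative use of the `subspan` extension defined above on a plain `SourceSpan`; the span text and offsets are made up:

```dart
import 'package:source_span/source_span.dart';

void main() {
  final span = SourceSpan(
      SourceLocation(0, sourceUrl: 'greeting.txt'),
      SourceLocation(11, sourceUrl: 'greeting.txt'),
      'hello world');

  print(span.subspan(6).text);    // world
  print(span.subspan(0, 5).text); // hello

  // Asking for the full range just returns the original span object.
  print(identical(span.subspan(0), span)); // true
}
```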
diff --git a/pkgs/source_span/lib/src/span_exception.dart b/pkgs/source_span/lib/src/span_exception.dart
new file mode 100644
index 0000000..90ad690
--- /dev/null
+++ b/pkgs/source_span/lib/src/span_exception.dart
@@ -0,0 +1,114 @@
+// Copyright (c) 2014, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'span.dart';
+
+/// A class for exceptions that have source span information attached.
+class SourceSpanException implements Exception {
+ // This is a getter so that subclasses can override it.
+ /// A message describing the exception.
+ String get message => _message;
+ final String _message;
+
+ // This is a getter so that subclasses can override it.
+ /// The span associated with this exception.
+ ///
+ /// This may be `null` if the source location can't be determined.
+ SourceSpan? get span => _span;
+ final SourceSpan? _span;
+
+ SourceSpanException(this._message, this._span);
+
+ /// Returns a string representation of `this`.
+ ///
+ /// [color] may either be a [String], a [bool], or `null`. If it's a string,
+ /// it indicates an ANSI terminal color escape that should be used to
+ /// highlight the span's text. If it's `true`, it indicates that the text
+ /// should be highlighted using the default color. If it's `false` or `null`,
+ /// it indicates that the text shouldn't be highlighted.
+ @override
+ String toString({Object? color}) {
+ if (span == null) return message;
+ return 'Error on ${span!.message(message, color: color)}';
+ }
+}
+
+/// A [SourceSpanException] that's also a [FormatException].
+class SourceSpanFormatException extends SourceSpanException
+ implements FormatException {
+ @override
+ final dynamic source;
+
+ @override
+ int? get offset => span?.start.offset;
+
+ SourceSpanFormatException(super.message, super.span, [this.source]);
+}
+
+/// A [SourceSpanException] that also highlights some secondary spans to provide
+/// the user with extra context.
+///
+/// Each span has a label ([primaryLabel] for the primary, and the values of the
+/// [secondarySpans] map for the secondary spans) that's used to indicate to the
+/// user what that particular span represents.
+class MultiSourceSpanException extends SourceSpanException {
+ /// A label to attach to [span] that provides additional information and helps
+ /// distinguish it from [secondarySpans].
+ final String primaryLabel;
+
+ /// A map whose keys are secondary spans that should be highlighted.
+ ///
+ /// Each span's value is a label to attach to that span that provides
+ /// additional information and helps distinguish it from [span].
+ final Map<SourceSpan, String> secondarySpans;
+
+ MultiSourceSpanException(super.message, super.span, this.primaryLabel,
+ Map<SourceSpan, String> secondarySpans)
+ : secondarySpans = Map.unmodifiable(secondarySpans);
+
+ /// Returns a string representation of `this`.
+ ///
+ /// [color] may either be a [String], a [bool], or `null`. If it's a string,
+ /// it indicates an ANSI terminal color escape that should be used to
+ /// highlight the primary span's text. If it's `true`, it indicates that the
+ /// text should be highlighted using the default color. If it's `false` or
+ /// `null`, it indicates that the text shouldn't be highlighted.
+ ///
+ /// If [color] is `true` or a string, [secondaryColor] is used to highlight
+ /// [secondarySpans].
+ @override
+ String toString({Object? color, String? secondaryColor}) {
+ if (span == null) return message;
+
+ var useColor = false;
+ String? primaryColor;
+ if (color is String) {
+ useColor = true;
+ primaryColor = color;
+ } else if (color == true) {
+ useColor = true;
+ }
+
+ final formatted = span!.messageMultiple(
+ message, primaryLabel, secondarySpans,
+ color: useColor,
+ primaryColor: primaryColor,
+ secondaryColor: secondaryColor);
+ return 'Error on $formatted';
+ }
+}
+
+/// A [MultiSourceSpanException] that's also a [FormatException].
+class MultiSourceSpanFormatException extends MultiSourceSpanException
+ implements FormatException {
+ @override
+ final dynamic source;
+
+ @override
+ int? get offset => span?.start.offset;
+
+ MultiSourceSpanFormatException(
+ super.message, super.span, super.primaryLabel, super.secondarySpans,
+ [this.source]);
+}
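
A sketch of how a parser might surface errors with these exception types; the parser scenario, input, and offsets are invented:

```dart
import 'package:source_span/source_span.dart';

void main() {
  const input = 'name = ';
  final file = SourceFile.fromString(input, url: 'config.ini');

  try {
    // Pretend a parser reached the end of the line while expecting a value.
    throw SourceSpanFormatException(
        'expected a value after "="', file.span(6, 7), input);
  } on SourceSpanFormatException catch (error) {
    // toString() delegates to span.message(), so this prints
    // "Error on line 1, column 7 of config.ini: ..." followed by the
    // highlighted snippet.
    print(error);
    print(error.offset); // 6 -- via the FormatException interface
  }
}
```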
diff --git a/pkgs/source_span/lib/src/span_mixin.dart b/pkgs/source_span/lib/src/span_mixin.dart
new file mode 100644
index 0000000..29b6119
--- /dev/null
+++ b/pkgs/source_span/lib/src/span_mixin.dart
@@ -0,0 +1,84 @@
+// Copyright (c) 2014, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:path/path.dart' as p;
+
+import 'highlighter.dart';
+import 'span.dart';
+import 'span_with_context.dart';
+import 'utils.dart';
+
+/// A mixin for easily implementing [SourceSpan].
+///
+/// This implements the [SourceSpan] methods in terms of [start], [end], and
+/// [text]. This assumes that [start] and [end] have the same source URL, that
+/// [start] comes before [end], and that [text] has a number of characters equal
+/// to the distance between [start] and [end].
+abstract class SourceSpanMixin implements SourceSpan {
+ @override
+ Uri? get sourceUrl => start.sourceUrl;
+
+ @override
+ int get length => end.offset - start.offset;
+
+ @override
+ int compareTo(SourceSpan other) {
+ final result = start.compareTo(other.start);
+ return result == 0 ? end.compareTo(other.end) : result;
+ }
+
+ @override
+ SourceSpan union(SourceSpan other) {
+ if (sourceUrl != other.sourceUrl) {
+ throw ArgumentError('Source URLs "$sourceUrl" and '
+ " \"${other.sourceUrl}\" don't match.");
+ }
+
+ final start = min(this.start, other.start);
+ final end = max(this.end, other.end);
+ final beginSpan = start == this.start ? this : other;
+ final endSpan = end == this.end ? this : other;
+
+ if (beginSpan.end.compareTo(endSpan.start) < 0) {
+ throw ArgumentError('Spans $this and $other are disjoint.');
+ }
+
+ final text = beginSpan.text +
+ endSpan.text.substring(beginSpan.end.distance(endSpan.start));
+ return SourceSpan(start, end, text);
+ }
+
+ @override
+ String message(String message, {Object? color}) {
+ final buffer = StringBuffer()
+ ..write('line ${start.line + 1}, column ${start.column + 1}');
+ if (sourceUrl != null) buffer.write(' of ${p.prettyUri(sourceUrl)}');
+ buffer.write(': $message');
+
+ final highlight = this.highlight(color: color);
+ if (highlight.isNotEmpty) {
+ buffer
+ ..writeln()
+ ..write(highlight);
+ }
+
+ return buffer.toString();
+ }
+
+ @override
+ String highlight({Object? color}) {
+ if (this is! SourceSpanWithContext && length == 0) return '';
+ return Highlighter(this, color: color).highlight();
+ }
+
+ @override
+ bool operator ==(Object other) =>
+ other is SourceSpan && start == other.start && end == other.end;
+
+ @override
+ int get hashCode => Object.hash(start, end);
+
+ @override
+ String toString() => '<$runtimeType: from $start to $end "$text">';
+}
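
A small demonstration of the behavior `SourceSpanMixin` provides (`union`, `message`, and friends); the spans and text are invented:

```dart
import 'package:source_span/source_span.dart';

void main() {
  SourceLocation loc(int offset) =>
      SourceLocation(offset, sourceUrl: 'poem.txt');

  final first = SourceSpan(loc(0), loc(5), 'roses');
  final second = SourceSpan(loc(5), loc(9), ' are');

  final joined = first.union(second);
  print(joined.text);   // roses are
  print(joined.length); // 9

  // Prints "line 1, column 1 of poem.txt: flower expected" followed by an
  // underlined snippet built from the span's own text.
  print(first.message('flower expected'));
}
```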
diff --git a/pkgs/source_span/lib/src/span_with_context.dart b/pkgs/source_span/lib/src/span_with_context.dart
new file mode 100644
index 0000000..776c789
--- /dev/null
+++ b/pkgs/source_span/lib/src/span_with_context.dart
@@ -0,0 +1,51 @@
+// Copyright (c) 2014, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'location.dart';
+import 'span.dart';
+import 'utils.dart';
+
+/// A class that describes a segment of source text with additional context.
+class SourceSpanWithContext extends SourceSpanBase {
+ // This is a getter so that subclasses can override it.
+ /// Text around the span, which includes the line containing this span.
+ String get context => _context;
+ final String _context;
+
+ /// Creates a new span from [start] to [end] (exclusive) containing [text], in
+ /// the given [context].
+ ///
+ /// [start] and [end] must have the same source URL and [start] must come
+ /// before [end]. [text] must have a number of characters equal to the
+ /// distance between [start] and [end]. [context] must contain [text], and
+ /// [text] should start at `start.column` from the beginning of a line in
+ /// [context].
+ SourceSpanWithContext(
+ SourceLocation start, SourceLocation end, String text, this._context)
+ : super(start, end, text) {
+ if (!context.contains(text)) {
+ throw ArgumentError('The context line "$context" must contain "$text".');
+ }
+
+ if (findLineStart(context, text, start.column) == null) {
+ throw ArgumentError('The span text "$text" must start at '
+ 'column ${start.column + 1} in a line within "$context".');
+ }
+ }
+}
+
+// TODO(#52): Move these to instance methods in the next breaking release.
+/// Extension methods on the base [SourceSpan] API.
+extension SourceSpanWithContextExtension on SourceSpanWithContext {
+ /// Returns a span from [start] code units (inclusive) to [end] code units
+ /// (exclusive) after the beginning of this span.
+ SourceSpanWithContext subspan(int start, [int? end]) {
+ RangeError.checkValidRange(start, end, length);
+ if (start == 0 && (end == null || end == length)) return this;
+
+ final locations = subspanLocations(this, start, end);
+ return SourceSpanWithContext(
+ locations[0], locations[1], text.substring(start, end), context);
+ }
+}
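
An illustrative `SourceSpanWithContext`: as the constructor above checks, the context must contain the span's text starting at `start.column` within one of its lines. The file name and code are made up:

```dart
import 'package:source_span/source_span.dart';

void main() {
  const context = 'final answer = 42;\n';
  final span = SourceSpanWithContext(
      SourceLocation(15, sourceUrl: 'answer.dart', column: 15),
      SourceLocation(17, sourceUrl: 'answer.dart', column: 17),
      '42',
      context);

  print(span.context == context); // true
  // With context available, highlight() can underline "42" within its line.
  print(span.message('magic number'));
}
```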
diff --git a/pkgs/source_span/lib/src/utils.dart b/pkgs/source_span/lib/src/utils.dart
new file mode 100644
index 0000000..aba14ec
--- /dev/null
+++ b/pkgs/source_span/lib/src/utils.dart
@@ -0,0 +1,145 @@
+// Copyright (c) 2014, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'charcode.dart';
+import 'location.dart';
+import 'span.dart';
+import 'span_with_context.dart';
+
+/// Returns the minimum of [obj1] and [obj2] according to
+/// [Comparable.compareTo].
+T min<T extends Comparable<T>>(T obj1, T obj2) =>
+ obj1.compareTo(obj2) > 0 ? obj2 : obj1;
+
+/// Returns the maximum of [obj1] and [obj2] according to
+/// [Comparable.compareTo].
+T max<T extends Comparable<T>>(T obj1, T obj2) =>
+ obj1.compareTo(obj2) > 0 ? obj1 : obj2;
+
+/// Returns whether all elements of [iter] are the same value, according to
+/// `==`.
+bool isAllTheSame(Iterable<Object?> iter) {
+ if (iter.isEmpty) return true;
+ final firstValue = iter.first;
+ for (var value in iter.skip(1)) {
+ if (value != firstValue) {
+ return false;
+ }
+ }
+ return true;
+}
+
+/// Returns whether [span] covers multiple lines.
+bool isMultiline(SourceSpan span) => span.start.line != span.end.line;
+
+/// Sets the first `null` element of [list] to [element].
+void replaceFirstNull<E>(List<E?> list, E element) {
+ final index = list.indexOf(null);
+ if (index < 0) throw ArgumentError('$list contains no null elements.');
+ list[index] = element;
+}
+
+/// Sets the element of [list] that currently contains [element] to `null`.
+void replaceWithNull<E>(List<E?> list, E element) {
+ final index = list.indexOf(element);
+ if (index < 0) {
+ throw ArgumentError('$list contains no elements matching $element.');
+ }
+
+ list[index] = null;
+}
+
+/// Returns the number of instances of [codeUnit] in [string].
+int countCodeUnits(String string, int codeUnit) {
+ var count = 0;
+ for (var codeUnitToCheck in string.codeUnits) {
+ if (codeUnitToCheck == codeUnit) count++;
+ }
+ return count;
+}
+
+/// Finds a line in [context] containing [text] at the specified [column].
+///
+/// Returns the index in [context] where that line begins, or null if none
+/// exists.
+int? findLineStart(String context, String text, int column) {
+ // If the text is empty, we just want to find the first line that has at least
+ // [column] characters.
+ if (text.isEmpty) {
+ var beginningOfLine = 0;
+ while (true) {
+ final index = context.indexOf('\n', beginningOfLine);
+ if (index == -1) {
+ return context.length - beginningOfLine >= column
+ ? beginningOfLine
+ : null;
+ }
+
+ if (index - beginningOfLine >= column) return beginningOfLine;
+ beginningOfLine = index + 1;
+ }
+ }
+
+ var index = context.indexOf(text);
+ while (index != -1) {
+ // Start looking before [index] in case [text] starts with a newline.
+ final lineStart = index == 0 ? 0 : context.lastIndexOf('\n', index - 1) + 1;
+ final textColumn = index - lineStart;
+ if (column == textColumn) return lineStart;
+ index = context.indexOf(text, index + 1);
+ }
+ // ignore: avoid_returning_null
+ return null;
+}
+
+/// Returns a two-element list containing the start and end locations of the
+/// span from [start] code units (inclusive) to [end] code units (exclusive)
+/// after the beginning of [span].
+///
+/// This is factored out so it can be shared between
+/// [SourceSpanExtension.subspan] and [SourceSpanWithContextExtension.subspan].
+List<SourceLocation> subspanLocations(SourceSpan span, int start, [int? end]) {
+ final text = span.text;
+ final startLocation = span.start;
+ var line = startLocation.line;
+ var column = startLocation.column;
+
+ // Adjust [line] and [column] as necessary if the character at [i] in [text]
+ // is a newline.
+ void consumeCodePoint(int i) {
+ final codeUnit = text.codeUnitAt(i);
+ if (codeUnit == $lf ||
+ // A carriage return counts as a newline, but only if it's not
+ // followed by a line feed.
+ (codeUnit == $cr &&
+ (i + 1 == text.length || text.codeUnitAt(i + 1) != $lf))) {
+ line += 1;
+ column = 0;
+ } else {
+ column += 1;
+ }
+ }
+
+ for (var i = 0; i < start; i++) {
+ consumeCodePoint(i);
+ }
+
+ final newStartLocation = SourceLocation(startLocation.offset + start,
+ sourceUrl: span.sourceUrl, line: line, column: column);
+
+ SourceLocation newEndLocation;
+ if (end == null || end == span.length) {
+ newEndLocation = span.end;
+ } else if (end == start) {
+ newEndLocation = newStartLocation;
+ } else {
+ for (var i = start; i < end; i++) {
+ consumeCodePoint(i);
+ }
+ newEndLocation = SourceLocation(startLocation.offset + end,
+ sourceUrl: span.sourceUrl, line: line, column: column);
+ }
+
+ return [newStartLocation, newEndLocation];
+}
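
A worked illustration of `findLineStart`'s contract. It's an internal helper (not exported from `package:source_span`), so the direct `src/` import below is only for illustration, e.g. from a test inside this package; the inputs are invented:

```dart
import 'package:source_span/src/utils.dart';

void main() {
  // "ef gh" begins at index 6, and "gh" sits at column 3 of that line.
  print(findLineStart('ab cd\nef gh', 'gh', 3)); // 6

  // No line has "gh" starting at column 0, so the result is null.
  print(findLineStart('ab cd\nef gh', 'gh', 0)); // null

  // An empty [text] matches the first line with at least [column] characters.
  print(findLineStart('ab\nabcdef', '', 4)); // 3
}
```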
diff --git a/pkgs/source_span/pubspec.yaml b/pkgs/source_span/pubspec.yaml
new file mode 100644
index 0000000..8757b2d
--- /dev/null
+++ b/pkgs/source_span/pubspec.yaml
@@ -0,0 +1,17 @@
+name: source_span
+version: 1.10.1
+description: >-
+ Provides a standard representation for source code locations and spans.
+repository: https://github.com/dart-lang/tools/tree/main/pkgs/source_span
+
+environment:
+ sdk: ^3.1.0
+
+dependencies:
+ collection: ^1.15.0
+ path: ^1.8.0
+ term_glyph: ^1.2.0
+
+dev_dependencies:
+ dart_flutter_team_lints: ^3.0.0
+ test: ^1.16.0
diff --git a/pkgs/source_span/test/file_test.dart b/pkgs/source_span/test/file_test.dart
new file mode 100644
index 0000000..dff51ee
--- /dev/null
+++ b/pkgs/source_span/test/file_test.dart
@@ -0,0 +1,530 @@
+// Copyright (c) 2014, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:source_span/source_span.dart';
+import 'package:test/test.dart';
+
+void main() {
+ late SourceFile file;
+ setUp(() {
+ file = SourceFile.fromString('''
+foo bar baz
+whiz bang boom
+zip zap zop''', url: 'foo.dart');
+ });
+
+ group('errors', () {
+ group('for span()', () {
+ test('end must come after start', () {
+ expect(() => file.span(10, 5), throwsArgumentError);
+ });
+
+ test('start may not be negative', () {
+ expect(() => file.span(-1, 5), throwsRangeError);
+ });
+
+ test('end may not be outside the file', () {
+ expect(() => file.span(10, 100), throwsRangeError);
+ });
+ });
+
+ group('for location()', () {
+ test('offset may not be negative', () {
+ expect(() => file.location(-1), throwsRangeError);
+ });
+
+ test('offset may not be outside the file', () {
+ expect(() => file.location(100), throwsRangeError);
+ });
+ });
+
+ group('for getLine()', () {
+ test('offset may not be negative', () {
+ expect(() => file.getLine(-1), throwsRangeError);
+ });
+
+ test('offset may not be outside the file', () {
+ expect(() => file.getLine(100), throwsRangeError);
+ });
+ });
+
+ group('for getColumn()', () {
+ test('offset may not be negative', () {
+ expect(() => file.getColumn(-1), throwsRangeError);
+ });
+
+ test('offset may not be outside the file', () {
+ expect(() => file.getColumn(100), throwsRangeError);
+ });
+
+ test('line may not be negative', () {
+ expect(() => file.getColumn(1, line: -1), throwsRangeError);
+ });
+
+ test('line may not be outside the file', () {
+ expect(() => file.getColumn(1, line: 100), throwsRangeError);
+ });
+
+ test('line must be accurate', () {
+ expect(() => file.getColumn(1, line: 1), throwsRangeError);
+ });
+ });
+
+ group('getOffset()', () {
+ test('line may not be negative', () {
+ expect(() => file.getOffset(-1), throwsRangeError);
+ });
+
+ test('column may not be negative', () {
+ expect(() => file.getOffset(1, -1), throwsRangeError);
+ });
+
+ test('line may not be outside the file', () {
+ expect(() => file.getOffset(100), throwsRangeError);
+ });
+
+ test('column may not be outside the file', () {
+ expect(() => file.getOffset(2, 100), throwsRangeError);
+ });
+
+ test('column may not be outside the line', () {
+ expect(() => file.getOffset(1, 20), throwsRangeError);
+ });
+ });
+
+ group('for getText()', () {
+ test('end must come after start', () {
+ expect(() => file.getText(10, 5), throwsArgumentError);
+ });
+
+ test('start may not be negative', () {
+ expect(() => file.getText(-1, 5), throwsRangeError);
+ });
+
+ test('end may not be outside the file', () {
+ expect(() => file.getText(10, 100), throwsRangeError);
+ });
+ });
+
+ group('for span().union()', () {
+ test('source URLs must match', () {
+ final other = SourceSpan(SourceLocation(10), SourceLocation(11), '_');
+
+ expect(() => file.span(9, 10).union(other), throwsArgumentError);
+ });
+
+ test('spans may not be disjoint', () {
+ expect(() => file.span(9, 10).union(file.span(11, 12)),
+ throwsArgumentError);
+ });
+ });
+
+ test('for span().expand() source URLs must match', () {
+ final other = SourceFile.fromString('''
+foo bar baz
+whiz bang boom
+zip zap zop''', url: 'bar.dart').span(10, 11);
+
+ expect(() => file.span(9, 10).expand(other), throwsArgumentError);
+ });
+ });
+
+ test('fields work correctly', () {
+ expect(file.url, equals(Uri.parse('foo.dart')));
+ expect(file.lines, equals(3));
+ expect(file.length, equals(38));
+ });
+
+ group('new SourceFile()', () {
+ test('handles CRLF correctly', () {
+ expect(SourceFile.fromString('foo\r\nbar').getLine(6), equals(1));
+ });
+
+ test('handles a lone CR correctly', () {
+ expect(SourceFile.fromString('foo\rbar').getLine(5), equals(1));
+ });
+ });
+
+ group('span()', () {
+ test('returns a span between the given offsets', () {
+ final span = file.span(5, 10);
+ expect(span.start, equals(file.location(5)));
+ expect(span.end, equals(file.location(10)));
+ });
+
+ test('end defaults to the end of the file', () {
+ final span = file.span(5);
+ expect(span.start, equals(file.location(5)));
+ expect(span.end, equals(file.location(file.length)));
+ });
+ });
+
+ group('getLine()', () {
+ test('works for a middle character on the line', () {
+ expect(file.getLine(15), equals(1));
+ });
+
+ test('works for the first character of a line', () {
+ expect(file.getLine(12), equals(1));
+ });
+
+ test('works for a newline character', () {
+ expect(file.getLine(11), equals(0));
+ });
+
+ test('works for the last offset', () {
+ expect(file.getLine(file.length), equals(2));
+ });
+ });
+
+ group('getColumn()', () {
+ test('works for a middle character on the line', () {
+ expect(file.getColumn(15), equals(3));
+ });
+
+ test('works for the first character of a line', () {
+ expect(file.getColumn(12), equals(0));
+ });
+
+ test('works for a newline character', () {
+ expect(file.getColumn(11), equals(11));
+ });
+
+ test('works when line is passed as well', () {
+ expect(file.getColumn(12, line: 1), equals(0));
+ });
+
+ test('works for the last offset', () {
+ expect(file.getColumn(file.length), equals(11));
+ });
+ });
+
+ group('getOffset()', () {
+ test('works for a middle character on the line', () {
+ expect(file.getOffset(1, 3), equals(15));
+ });
+
+ test('works for the first character of a line', () {
+ expect(file.getOffset(1), equals(12));
+ });
+
+ test('works for a newline character', () {
+ expect(file.getOffset(0, 11), equals(11));
+ });
+
+ test('works for the last offset', () {
+ expect(file.getOffset(2, 11), equals(file.length));
+ });
+ });
+
+ group('getText()', () {
+ test('returns a substring of the source', () {
+ expect(file.getText(8, 15), equals('baz\nwhi'));
+ });
+
+ test('end defaults to the end of the file', () {
+ expect(file.getText(20), equals('g boom\nzip zap zop'));
+ });
+ });
+
+ group('FileLocation', () {
+ test('reports the correct line number', () {
+ expect(file.location(15).line, equals(1));
+ });
+
+ test('reports the correct column number', () {
+ expect(file.location(15).column, equals(3));
+ });
+
+ test('pointSpan() returns a FileSpan', () {
+ final location = file.location(15);
+ final span = location.pointSpan();
+ expect(span, isA<FileSpan>());
+ expect(span.start, equals(location));
+ expect(span.end, equals(location));
+ expect(span.text, isEmpty);
+ });
+ });
+
+ group('FileSpan', () {
+ test('text returns a substring of the source', () {
+ expect(file.span(8, 15).text, equals('baz\nwhi'));
+ });
+
+ test('text includes the last char when end is defaulted to EOF', () {
+ expect(file.span(29).text, equals('p zap zop'));
+ });
+
+ group('context', () {
+ test("contains the span's text", () {
+ final span = file.span(8, 15);
+ expect(span.context.contains(span.text), isTrue);
+ expect(span.context, equals('foo bar baz\nwhiz bang boom\n'));
+ });
+
+ test('contains the previous line for a point span at the end of a line',
+ () {
+ final span = file.span(25, 25);
+ expect(span.context, equals('whiz bang boom\n'));
+ });
+
+ test('contains the next line for a point span at the beginning of a line',
+ () {
+ final span = file.span(12, 12);
+ expect(span.context, equals('whiz bang boom\n'));
+ });
+
+ group('for a point span at the end of a file', () {
+ test('without a newline, contains the last line', () {
+ final span = file.span(file.length, file.length);
+ expect(span.context, equals('zip zap zop'));
+ });
+
+ test('with a newline, contains an empty line', () {
+ file = SourceFile.fromString('''
+foo bar baz
+whiz bang boom
+zip zap zop
+''', url: 'foo.dart');
+
+ final span = file.span(file.length, file.length);
+ expect(span.context, isEmpty);
+ });
+ });
+ });
+
+ group('union()', () {
+ late FileSpan span;
+ setUp(() {
+ span = file.span(5, 12);
+ });
+
+ test('works with a preceding adjacent span', () {
+ final other = file.span(0, 5);
+ final result = span.union(other);
+ expect(result.start, equals(other.start));
+ expect(result.end, equals(span.end));
+ expect(result.text, equals('foo bar baz\n'));
+ });
+
+ test('works with a preceding overlapping span', () {
+ final other = file.span(0, 8);
+ final result = span.union(other);
+ expect(result.start, equals(other.start));
+ expect(result.end, equals(span.end));
+ expect(result.text, equals('foo bar baz\n'));
+ });
+
+ test('works with a following adjacent span', () {
+ final other = file.span(12, 16);
+ final result = span.union(other);
+ expect(result.start, equals(span.start));
+ expect(result.end, equals(other.end));
+ expect(result.text, equals('ar baz\nwhiz'));
+ });
+
+ test('works with a following overlapping span', () {
+ final other = file.span(9, 16);
+ final result = span.union(other);
+ expect(result.start, equals(span.start));
+ expect(result.end, equals(other.end));
+ expect(result.text, equals('ar baz\nwhiz'));
+ });
+
+ test('works with an internal overlapping span', () {
+ final other = file.span(7, 10);
+ expect(span.union(other), equals(span));
+ });
+
+ test('works with an external overlapping span', () {
+ final other = file.span(0, 16);
+ expect(span.union(other), equals(other));
+ });
+
+ test('returns a FileSpan for a FileSpan input', () {
+ expect(span.union(file.span(0, 5)), isA<FileSpan>());
+ });
+
+ test('returns a base SourceSpan for a SourceSpan input', () {
+ final other = SourceSpan(SourceLocation(0, sourceUrl: 'foo.dart'),
+ SourceLocation(5, sourceUrl: 'foo.dart'), 'hey, ');
+ final result = span.union(other);
+ expect(result, isNot(isA<FileSpan>()));
+ expect(result.start, equals(other.start));
+ expect(result.end, equals(span.end));
+ expect(result.text, equals('hey, ar baz\n'));
+ });
+ });
+
+ group('expand()', () {
+ late FileSpan span;
+ setUp(() {
+ span = file.span(5, 12);
+ });
+
+ test('works with a preceding nonadjacent span', () {
+ final other = file.span(0, 3);
+ final result = span.expand(other);
+ expect(result.start, equals(other.start));
+ expect(result.end, equals(span.end));
+ expect(result.text, equals('foo bar baz\n'));
+ });
+
+ test('works with a preceding overlapping span', () {
+ final other = file.span(0, 8);
+ final result = span.expand(other);
+ expect(result.start, equals(other.start));
+ expect(result.end, equals(span.end));
+ expect(result.text, equals('foo bar baz\n'));
+ });
+
+ test('works with a following nonadjacent span', () {
+ final other = file.span(14, 16);
+ final result = span.expand(other);
+ expect(result.start, equals(span.start));
+ expect(result.end, equals(other.end));
+ expect(result.text, equals('ar baz\nwhiz'));
+ });
+
+ test('works with a following overlapping span', () {
+ final other = file.span(9, 16);
+ final result = span.expand(other);
+ expect(result.start, equals(span.start));
+ expect(result.end, equals(other.end));
+ expect(result.text, equals('ar baz\nwhiz'));
+ });
+
+ test('works with an internal overlapping span', () {
+ final other = file.span(7, 10);
+ expect(span.expand(other), equals(span));
+ });
+
+ test('works with an external overlapping span', () {
+ final other = file.span(0, 16);
+ expect(span.expand(other), equals(other));
+ });
+ });
+
+ group('subspan()', () {
+ late FileSpan span;
+ setUp(() {
+ span = file.span(5, 11); // "ar baz"
+ });
+
+ group('errors', () {
+ test('start must be greater than zero', () {
+ expect(() => span.subspan(-1), throwsRangeError);
+ });
+
+ test('start must be less than or equal to length', () {
+ expect(() => span.subspan(span.length + 1), throwsRangeError);
+ });
+
+ test('end must be greater than start', () {
+ expect(() => span.subspan(2, 1), throwsRangeError);
+ });
+
+ test('end must be less than or equal to length', () {
+ expect(() => span.subspan(0, span.length + 1), throwsRangeError);
+ });
+ });
+
+ test('preserves the source URL', () {
+ final result = span.subspan(1, 2);
+ expect(result.start.sourceUrl, equals(span.sourceUrl));
+ expect(result.end.sourceUrl, equals(span.sourceUrl));
+ });
+
+ group('returns the original span', () {
+ test('with an implicit end',
+ () => expect(span.subspan(0), equals(span)));
+
+ test('with an explicit end',
+ () => expect(span.subspan(0, span.length), equals(span)));
+ });
+
+ group('within a single line', () {
+ test('returns a strict substring of the original span', () {
+ final result = span.subspan(1, 5);
+ expect(result.text, equals('r ba'));
+ expect(result.start.offset, equals(6));
+ expect(result.start.line, equals(0));
+ expect(result.start.column, equals(6));
+ expect(result.end.offset, equals(10));
+ expect(result.end.line, equals(0));
+ expect(result.end.column, equals(10));
+ });
+
+ test('an implicit end goes to the end of the original span', () {
+ final result = span.subspan(1);
+ expect(result.text, equals('r baz'));
+ expect(result.start.offset, equals(6));
+ expect(result.start.line, equals(0));
+ expect(result.start.column, equals(6));
+ expect(result.end.offset, equals(11));
+ expect(result.end.line, equals(0));
+ expect(result.end.column, equals(11));
+ });
+
+ test('can return an empty span', () {
+ final result = span.subspan(3, 3);
+ expect(result.text, isEmpty);
+ expect(result.start.offset, equals(8));
+ expect(result.start.line, equals(0));
+ expect(result.start.column, equals(8));
+ expect(result.end, equals(result.start));
+ });
+ });
+
+ group('across multiple lines', () {
+ setUp(() {
+ span = file.span(22, 30); // "boom\nzip"
+ });
+
+ test('with start and end in the middle of a line', () {
+ final result = span.subspan(3, 6);
+ expect(result.text, equals('m\nz'));
+ expect(result.start.offset, equals(25));
+ expect(result.start.line, equals(1));
+ expect(result.start.column, equals(13));
+ expect(result.end.offset, equals(28));
+ expect(result.end.line, equals(2));
+ expect(result.end.column, equals(1));
+ });
+
+ test('with start at the end of a line', () {
+ final result = span.subspan(4, 6);
+ expect(result.text, equals('\nz'));
+ expect(result.start.offset, equals(26));
+ expect(result.start.line, equals(1));
+ expect(result.start.column, equals(14));
+ });
+
+ test('with start at the beginning of a line', () {
+ final result = span.subspan(5, 6);
+ expect(result.text, equals('z'));
+ expect(result.start.offset, equals(27));
+ expect(result.start.line, equals(2));
+ expect(result.start.column, equals(0));
+ });
+
+ test('with end at the end of a line', () {
+ final result = span.subspan(3, 4);
+ expect(result.text, equals('m'));
+ expect(result.end.offset, equals(26));
+ expect(result.end.line, equals(1));
+ expect(result.end.column, equals(14));
+ });
+
+ test('with end at the beginning of a line', () {
+ final result = span.subspan(3, 5);
+ expect(result.text, equals('m\n'));
+ expect(result.end.offset, equals(27));
+ expect(result.end.line, equals(2));
+ expect(result.end.column, equals(0));
+ });
+ });
+ });
+ });
+}
diff --git a/pkgs/source_span/test/highlight_test.dart b/pkgs/source_span/test/highlight_test.dart
new file mode 100644
index 0000000..93c42db
--- /dev/null
+++ b/pkgs/source_span/test/highlight_test.dart
@@ -0,0 +1,605 @@
+// Copyright (c) 2014, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+// ignore_for_file: prefer_interpolation_to_compose_strings
+
+import 'package:source_span/source_span.dart';
+import 'package:source_span/src/colors.dart' as colors;
+import 'package:term_glyph/term_glyph.dart' as glyph;
+import 'package:test/test.dart';
+
+void main() {
+ late bool oldAscii;
+ setUpAll(() {
+ oldAscii = glyph.ascii;
+ glyph.ascii = true;
+ });
+
+ tearDownAll(() {
+ glyph.ascii = oldAscii;
+ });
+
+ late SourceFile file;
+ setUp(() {
+ file = SourceFile.fromString('''
+foo bar baz
+whiz bang boom
+zip zap zop
+''');
+ });
+
+ test('points to the span in the source', () {
+ expect(file.span(4, 7).highlight(), equals("""
+ ,
+1 | foo bar baz
+ | ^^^
+ '"""));
+ });
+
+ test('gracefully handles a missing source URL', () {
+ final span = SourceFile.fromString('foo bar baz').span(4, 7);
+ expect(span.highlight(), equals("""
+ ,
+1 | foo bar baz
+ | ^^^
+ '"""));
+ });
+
+ group('highlights a point span', () {
+ test('in the middle of a line', () {
+ expect(file.location(4).pointSpan().highlight(), equals("""
+ ,
+1 | foo bar baz
+ | ^
+ '"""));
+ });
+
+ test('at the beginning of the file', () {
+ expect(file.location(0).pointSpan().highlight(), equals("""
+ ,
+1 | foo bar baz
+ | ^
+ '"""));
+ });
+
+ test('at the beginning of a line', () {
+ expect(file.location(12).pointSpan().highlight(), equals("""
+ ,
+2 | whiz bang boom
+ | ^
+ '"""));
+ });
+
+ test('at the end of a line', () {
+ expect(file.location(11).pointSpan().highlight(), equals("""
+ ,
+1 | foo bar baz
+ | ^
+ '"""));
+ });
+
+ test('at the end of the file', () {
+ expect(file.location(38).pointSpan().highlight(), equals("""
+ ,
+3 | zip zap zop
+ | ^
+ '"""));
+ });
+
+ test('after the end of the file', () {
+ expect(file.location(39).pointSpan().highlight(), equals("""
+ ,
+4 |
+ | ^
+ '"""));
+ });
+
+ test('at the end of the file with no trailing newline', () {
+ file = SourceFile.fromString('zip zap zop');
+ expect(file.location(10).pointSpan().highlight(), equals("""
+ ,
+1 | zip zap zop
+ | ^
+ '"""));
+ });
+
+ test('after the end of the file with no trailing newline', () {
+ file = SourceFile.fromString('zip zap zop');
+ expect(file.location(11).pointSpan().highlight(), equals("""
+ ,
+1 | zip zap zop
+ | ^
+ '"""));
+ });
+
+ test('in an empty file', () {
+ expect(SourceFile.fromString('').location(0).pointSpan().highlight(),
+ equals("""
+ ,
+1 |
+ | ^
+ '"""));
+ });
+
+ test('on an empty line', () {
+ final file = SourceFile.fromString('foo\n\nbar');
+ expect(file.location(4).pointSpan().highlight(), equals("""
+ ,
+2 |
+ | ^
+ '"""));
+ });
+ });
+
+ test('highlights a single-line file without a newline', () {
+ expect(SourceFile.fromString('foo bar').span(0, 7).highlight(), equals("""
+ ,
+1 | foo bar
+ | ^^^^^^^
+ '"""));
+ });
+
+ test('highlights text including a trailing newline', () {
+ expect(file.span(8, 12).highlight(), equals("""
+ ,
+1 | foo bar baz
+ | ^^^
+ '"""));
+ });
+
+ test('highlights a single empty line', () {
+ expect(
+ SourceFile.fromString('foo\n\nbar').span(4, 5).highlight(), equals("""
+ ,
+2 |
+ | ^
+ '"""));
+ });
+
+ test('highlights a trailing newline', () {
+ expect(file.span(11, 12).highlight(), equals("""
+ ,
+1 | foo bar baz
+ | ^
+ '"""));
+ });
+
+ group('with a multiline span', () {
+ test('highlights the middle of the first and last lines', () {
+ expect(file.span(4, 34).highlight(), equals("""
+ ,
+1 | foo bar baz
+ | ,-----^
+2 | | whiz bang boom
+3 | | zip zap zop
+ | '-------^
+ '"""));
+ });
+
+ test('works when it begins at the end of a line', () {
+ expect(file.span(11, 34).highlight(), equals("""
+ ,
+1 | foo bar baz
+ | ,------------^
+2 | | whiz bang boom
+3 | | zip zap zop
+ | '-------^
+ '"""));
+ });
+
+ test('works when it ends at the beginning of a line', () {
+ expect(file.span(4, 28).highlight(), equals("""
+ ,
+1 | foo bar baz
+ | ,-----^
+2 | | whiz bang boom
+3 | | zip zap zop
+ | '-^
+ '"""));
+ });
+
+ test('highlights the full first line', () {
+ expect(file.span(0, 34).highlight(), equals("""
+ ,
+1 | / foo bar baz
+2 | | whiz bang boom
+3 | | zip zap zop
+ | '-------^
+ '"""));
+ });
+
+ test("highlights the full first line even if it's indented", () {
+ final file = SourceFile.fromString('''
+ foo bar baz
+ whiz bang boom
+ zip zap zop
+''');
+
+ expect(file.span(2, 38).highlight(), equals("""
+ ,
+1 | / foo bar baz
+2 | | whiz bang boom
+3 | | zip zap zop
+ | '-------^
+ '"""));
+ });
+
+ test("highlights the full first line if it's empty", () {
+ final file = SourceFile.fromString('''
+foo
+
+bar
+''');
+
+ expect(file.span(4, 9).highlight(), equals(r"""
+ ,
+2 | /
+3 | \ bar
+ '"""));
+ });
+
+ test('highlights the full last line', () {
+ expect(file.span(4, 27).highlight(), equals(r"""
+ ,
+1 | foo bar baz
+ | ,-----^
+2 | \ whiz bang boom
+ '"""));
+ });
+
+ test('highlights the full last line with no trailing newline', () {
+ expect(file.span(4, 26).highlight(), equals(r"""
+ ,
+1 | foo bar baz
+ | ,-----^
+2 | \ whiz bang boom
+ '"""));
+ });
+
+ test('highlights the full last line with a trailing Windows newline', () {
+ final file = SourceFile.fromString('''
+foo bar baz\r
+whiz bang boom\r
+zip zap zop\r
+''');
+
+ expect(file.span(4, 29).highlight(), equals(r"""
+ ,
+1 | foo bar baz
+ | ,-----^
+2 | \ whiz bang boom
+ '"""));
+ });
+
+ test('highlights the full last line at the end of the file', () {
+ expect(file.span(4, 39).highlight(), equals(r"""
+ ,
+1 | foo bar baz
+ | ,-----^
+2 | | whiz bang boom
+3 | \ zip zap zop
+ '"""));
+ });
+
+ test(
+ 'highlights the full last line at the end of the file with no trailing '
+ 'newline', () {
+ final file = SourceFile.fromString('''
+foo bar baz
+whiz bang boom
+zip zap zop''');
+
+ expect(file.span(4, 38).highlight(), equals(r"""
+ ,
+1 | foo bar baz
+ | ,-----^
+2 | | whiz bang boom
+3 | \ zip zap zop
+ '"""));
+ });
+
+ test("highlights the full last line if it's empty", () {
+ final file = SourceFile.fromString('''
+foo
+
+bar
+''');
+
+ expect(file.span(0, 5).highlight(), equals(r"""
+ ,
+1 | / foo
+2 | \
+ '"""));
+ });
+
+ test('highlights multiple empty lines', () {
+ final file = SourceFile.fromString('foo\n\n\n\nbar');
+ expect(file.span(4, 7).highlight(), equals(r"""
+ ,
+2 | /
+3 | |
+4 | \
+ '"""));
+ });
+
+ // Regression test for #32
+ test('highlights the end of a line and an empty line', () {
+ final file = SourceFile.fromString('foo\n\n');
+ expect(file.span(3, 5).highlight(), equals(r"""
+ ,
+1 | foo
+ | ,----^
+2 | \
+ '"""));
+ });
+ });
+
+ group('prints tabs as spaces', () {
+ group('in a single-line span', () {
+ test('before the highlighted section', () {
+ final span = SourceFile.fromString('foo\tbar baz').span(4, 7);
+
+ expect(span.highlight(), equals("""
+ ,
+1 | foo bar baz
+ | ^^^
+ '"""));
+ });
+
+ test('within the highlighted section', () {
+ final span = SourceFile.fromString('foo bar\tbaz bang').span(4, 11);
+
+ expect(span.highlight(), equals("""
+ ,
+1 | foo bar baz bang
+ | ^^^^^^^^^^
+ '"""));
+ });
+
+ test('after the highlighted section', () {
+ final span = SourceFile.fromString('foo bar\tbaz').span(4, 7);
+
+ expect(span.highlight(), equals("""
+ ,
+1 | foo bar baz
+ | ^^^
+ '"""));
+ });
+ });
+
+ group('in a multi-line span', () {
+ test('before the highlighted section', () {
+ final span = SourceFile.fromString('''
+foo\tbar baz
+whiz bang boom
+''').span(4, 21);
+
+ expect(span.highlight(), equals("""
+ ,
+1 | foo bar baz
+ | ,--------^
+2 | | whiz bang boom
+ | '---------^
+ '"""));
+ });
+
+ test('within the first highlighted line', () {
+ final span = SourceFile.fromString('''
+foo bar\tbaz
+whiz bang boom
+''').span(4, 21);
+
+ expect(span.highlight(), equals("""
+ ,
+1 | foo bar baz
+ | ,-----^
+2 | | whiz bang boom
+ | '---------^
+ '"""));
+ });
+
+ test('at the beginning of the first highlighted line', () {
+ final span = SourceFile.fromString('''
+foo bar\tbaz
+whiz bang boom
+''').span(7, 21);
+
+ expect(span.highlight(), equals("""
+ ,
+1 | foo bar baz
+ | ,--------^
+2 | | whiz bang boom
+ | '---------^
+ '"""));
+ });
+
+ test('within a middle highlighted line', () {
+ final span = SourceFile.fromString('''
+foo bar baz
+whiz\tbang boom
+zip zap zop
+''').span(4, 34);
+
+ expect(span.highlight(), equals("""
+ ,
+1 | foo bar baz
+ | ,-----^
+2 | | whiz bang boom
+3 | | zip zap zop
+ | '-------^
+ '"""));
+ });
+
+ test('within the last highlighted line', () {
+ final span = SourceFile.fromString('''
+foo bar baz
+whiz\tbang boom
+''').span(4, 21);
+
+ expect(span.highlight(), equals("""
+ ,
+1 | foo bar baz
+ | ,-----^
+2 | | whiz bang boom
+ | '------------^
+ '"""));
+ });
+
+ test('at the end of the last highlighted line', () {
+ final span = SourceFile.fromString('''
+foo bar baz
+whiz\tbang boom
+''').span(4, 17);
+
+ expect(span.highlight(), equals("""
+ ,
+1 | foo bar baz
+ | ,-----^
+2 | | whiz bang boom
+ | '--------^
+ '"""));
+ });
+
+ test('after the highlighted section', () {
+ final span = SourceFile.fromString('''
+foo bar baz
+whiz bang\tboom
+''').span(4, 21);
+
+ expect(span.highlight(), equals("""
+ ,
+1 | foo bar baz
+ | ,-----^
+2 | | whiz bang boom
+ | '---------^
+ '"""));
+ });
+ });
+ });
+
+ group('supports lines of preceding and following context for a span', () {
+ test('within a single line', () {
+ final span = SourceSpanWithContext(
+ SourceLocation(20, line: 2, column: 5, sourceUrl: 'foo.dart'),
+ SourceLocation(27, line: 2, column: 12, sourceUrl: 'foo.dart'),
+ 'foo bar',
+ 'previous\nlines\n-----foo bar-----\nfollowing line\n');
+
+ expect(span.highlight(), equals("""
+ ,
+1 | previous
+2 | lines
+3 | -----foo bar-----
+ | ^^^^^^^
+4 | following line
+ '"""));
+ });
+
+ test('covering a full line', () {
+ final span = SourceSpanWithContext(
+ SourceLocation(15, line: 2, column: 0, sourceUrl: 'foo.dart'),
+ SourceLocation(33, line: 3, column: 0, sourceUrl: 'foo.dart'),
+ '-----foo bar-----\n',
+ 'previous\nlines\n-----foo bar-----\nfollowing line\n');
+
+ expect(span.highlight(), equals("""
+ ,
+1 | previous
+2 | lines
+3 | -----foo bar-----
+ | ^^^^^^^^^^^^^^^^^
+4 | following line
+ '"""));
+ });
+
+ test('covering multiple full lines', () {
+ final span = SourceSpanWithContext(
+ SourceLocation(15, line: 2, column: 0, sourceUrl: 'foo.dart'),
+ SourceLocation(23, line: 4, column: 0, sourceUrl: 'foo.dart'),
+ 'foo\nbar\n',
+ 'previous\nlines\nfoo\nbar\nfollowing line\n');
+
+ expect(span.highlight(), equals(r"""
+ ,
+1 | previous
+2 | lines
+3 | / foo
+4 | \ bar
+5 | following line
+ '"""));
+ });
+ });
+
+ group('colors', () {
+ test("doesn't colorize if color is false", () {
+ expect(file.span(4, 7).highlight(color: false), equals("""
+ ,
+1 | foo bar baz
+ | ^^^
+ '"""));
+ });
+
+ test('colorizes if color is true', () {
+ expect(file.span(4, 7).highlight(color: true), equals('''
+${colors.blue} ,${colors.none}
+${colors.blue}1 |${colors.none} foo ${colors.red}bar${colors.none} baz
+${colors.blue} |${colors.none} ${colors.red} ^^^${colors.none}
+${colors.blue} '${colors.none}'''));
+ });
+
+ test("uses the given color if it's passed", () {
+ expect(file.span(4, 7).highlight(color: colors.yellow), equals('''
+${colors.blue} ,${colors.none}
+${colors.blue}1 |${colors.none} foo ${colors.yellow}bar${colors.none} baz
+${colors.blue} |${colors.none} ${colors.yellow} ^^^${colors.none}
+${colors.blue} '${colors.none}'''));
+ });
+
+ test('colorizes a multiline span', () {
+ expect(file.span(4, 34).highlight(color: true), equals('''
+${colors.blue} ,${colors.none}
+${colors.blue}1 |${colors.none} foo ${colors.red}bar baz${colors.none}
+${colors.blue} |${colors.none} ${colors.red},${colors.none}${colors.red}-----^${colors.none}
+${colors.blue}2 |${colors.none} ${colors.red}|${colors.none} ${colors.red}whiz bang boom${colors.none}
+${colors.blue}3 |${colors.none} ${colors.red}|${colors.none} ${colors.red}zip zap${colors.none} zop
+${colors.blue} |${colors.none} ${colors.red}'${colors.none}${colors.red}-------^${colors.none}
+${colors.blue} '${colors.none}'''));
+ });
+
+ test('colorizes a multiline span that highlights full lines', () {
+ expect(file.span(0, 39).highlight(color: true), equals('''
+${colors.blue} ,${colors.none}
+${colors.blue}1 |${colors.none} ${colors.red}/${colors.none} ${colors.red}foo bar baz${colors.none}
+${colors.blue}2 |${colors.none} ${colors.red}|${colors.none} ${colors.red}whiz bang boom${colors.none}
+${colors.blue}3 |${colors.none} ${colors.red}\\${colors.none} ${colors.red}zip zap zop${colors.none}
+${colors.blue} '${colors.none}'''));
+ });
+ });
+
+ group('line numbers have appropriate padding', () {
+ test('with line number 9', () {
+ expect(
+ SourceFile.fromString('\n' * 8 + 'foo bar baz\n')
+ .span(8, 11)
+ .highlight(),
+ equals("""
+ ,
+9 | foo bar baz
+ | ^^^
+ '"""));
+ });
+
+ test('with line number 10', () {
+ expect(
+ SourceFile.fromString('\n' * 9 + 'foo bar baz\n')
+ .span(9, 12)
+ .highlight(),
+ equals("""
+ ,
+10 | foo bar baz
+ | ^^^
+ '"""));
+ });
+ });
+}
diff --git a/pkgs/source_span/test/location_test.dart b/pkgs/source_span/test/location_test.dart
new file mode 100644
index 0000000..bbe259b
--- /dev/null
+++ b/pkgs/source_span/test/location_test.dart
@@ -0,0 +1,97 @@
+// Copyright (c) 2014, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:source_span/source_span.dart';
+import 'package:test/test.dart';
+
+void main() {
+ late SourceLocation location;
+ setUp(() {
+ location = SourceLocation(15, line: 2, column: 6, sourceUrl: 'foo.dart');
+ });
+
+ group('errors', () {
+ group('for new SourceLocation()', () {
+ test('offset may not be negative', () {
+ expect(() => SourceLocation(-1), throwsRangeError);
+ });
+
+ test('line may not be negative', () {
+ expect(() => SourceLocation(0, line: -1), throwsRangeError);
+ });
+
+ test('column may not be negative', () {
+ expect(() => SourceLocation(0, column: -1), throwsRangeError);
+ });
+ });
+
+ test('for distance() source URLs must match', () {
+ expect(() => location.distance(SourceLocation(0)), throwsArgumentError);
+ });
+
+ test('for compareTo() source URLs must match', () {
+ expect(() => location.compareTo(SourceLocation(0)), throwsArgumentError);
+ });
+ });
+
+ test('fields work correctly', () {
+ expect(location.sourceUrl, equals(Uri.parse('foo.dart')));
+ expect(location.offset, equals(15));
+ expect(location.line, equals(2));
+ expect(location.column, equals(6));
+ });
+
+ group('toolString', () {
+ test('returns a computer-readable representation', () {
+ expect(location.toolString, equals('foo.dart:3:7'));
+ });
+
+ test('gracefully handles a missing source URL', () {
+ final location = SourceLocation(15, line: 2, column: 6);
+ expect(location.toolString, equals('unknown source:3:7'));
+ });
+ });
+
+ test('distance returns the absolute distance between locations', () {
+ final other = SourceLocation(10, sourceUrl: 'foo.dart');
+ expect(location.distance(other), equals(5));
+ expect(other.distance(location), equals(5));
+ });
+
+ test('pointSpan returns an empty span at location', () {
+ final span = location.pointSpan();
+ expect(span.start, equals(location));
+ expect(span.end, equals(location));
+ expect(span.text, isEmpty);
+ });
+
+ group('compareTo()', () {
+ test('sorts by offset', () {
+ final other = SourceLocation(20, sourceUrl: 'foo.dart');
+ expect(location.compareTo(other), lessThan(0));
+ expect(other.compareTo(location), greaterThan(0));
+ });
+
+ test('considers equal locations equal', () {
+ expect(location.compareTo(location), equals(0));
+ });
+ });
+
+ group('equality', () {
+ test('two locations with the same offset and source are equal', () {
+ final other = SourceLocation(15, sourceUrl: 'foo.dart');
+ expect(location, equals(other));
+ });
+
+ test("a different offset isn't equal", () {
+ final other = SourceLocation(10, sourceUrl: 'foo.dart');
+ expect(location, isNot(equals(other)));
+ });
+
+ test("a different source isn't equal", () {
+ final other = SourceLocation(15, sourceUrl: 'bar.dart');
+ expect(location, isNot(equals(other)));
+ });
+ });
+}
diff --git a/pkgs/source_span/test/multiple_highlight_test.dart b/pkgs/source_span/test/multiple_highlight_test.dart
new file mode 100644
index 0000000..139d53c
--- /dev/null
+++ b/pkgs/source_span/test/multiple_highlight_test.dart
@@ -0,0 +1,423 @@
+// Copyright (c) 2019, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:source_span/source_span.dart';
+import 'package:term_glyph/term_glyph.dart' as glyph;
+import 'package:test/test.dart';
+
+void main() {
+ late bool oldAscii;
+ setUpAll(() {
+ oldAscii = glyph.ascii;
+ glyph.ascii = true;
+ });
+
+ tearDownAll(() {
+ glyph.ascii = oldAscii;
+ });
+
+ late SourceFile file;
+ setUp(() {
+ file = SourceFile.fromString('''
+foo bar baz
+whiz bang boom
+zip zap zop
+fwee fwoo fwip
+argle bargle boo
+gibble bibble bop
+''', url: 'file1.txt');
+ });
+
+ test('highlights spans on separate lines', () {
+ expect(
+ file.span(17, 21).highlightMultiple(
+ 'one', {file.span(31, 34): 'two', file.span(4, 7): 'three'}),
+ equals("""
+ ,
+1 | foo bar baz
+ | === three
+2 | whiz bang boom
+ | ^^^^ one
+3 | zip zap zop
+ | === two
+ '"""));
+ });
+
+ test('highlights spans on the same line', () {
+ expect(
+ file.span(17, 21).highlightMultiple(
+ 'one', {file.span(22, 26): 'two', file.span(12, 16): 'three'}),
+ equals("""
+ ,
+2 | whiz bang boom
+ | ^^^^ one
+ | ==== three
+ | ==== two
+ '"""));
+ });
+
+ test('highlights overlapping spans on the same line', () {
+ expect(
+ file.span(17, 21).highlightMultiple(
+ 'one', {file.span(20, 26): 'two', file.span(12, 18): 'three'}),
+ equals("""
+ ,
+2 | whiz bang boom
+ | ^^^^ one
+ | ====== three
+ | ====== two
+ '"""));
+ });
+
+ test('highlights multiple multiline spans', () {
+ expect(
+ file.span(27, 54).highlightMultiple(
+ 'one', {file.span(54, 89): 'two', file.span(0, 27): 'three'}),
+ equals("""
+ ,
+1 | / foo bar baz
+2 | | whiz bang boom
+ | '--- three
+3 | / zip zap zop
+4 | | fwee fwoo fwip
+ | '--- one
+5 | / argle bargle boo
+6 | | gibble bibble bop
+ | '--- two
+ '"""));
+ });
+
+ test('highlights multiple overlapping multiline spans', () {
+ expect(
+ file.span(12, 70).highlightMultiple(
+ 'one', {file.span(54, 89): 'two', file.span(0, 27): 'three'}),
+ equals("""
+ ,
+1 | /- foo bar baz
+2 | |/ whiz bang boom
+ | '+--- three
+3 | | zip zap zop
+4 | | fwee fwoo fwip
+5 | /+ argle bargle boo
+ | |'--- one
+6 | | gibble bibble bop
+ | '---- two
+ '"""));
+ });
+
+ test('highlights many layers of overlaps', () {
+ expect(
+ file.span(0, 54).highlightMultiple('one', {
+ file.span(12, 77): 'two',
+ file.span(27, 84): 'three',
+ file.span(39, 88): 'four'
+ }),
+ equals("""
+ ,
+1 | /--- foo bar baz
+2 | |/-- whiz bang boom
+3 | ||/- zip zap zop
+4 | |||/ fwee fwoo fwip
+ | '+++--- one
+5 | ||| argle bargle boo
+6 | ||| gibble bibble bop
+ | '++------^ two
+ | '+-------------^ three
+ | '--- four
+ '"""));
+ });
+
+ group("highlights a multiline span that's a subset", () {
+ test('with no first or last line overlap', () {
+ expect(
+ file
+ .span(27, 53)
+ .highlightMultiple('inner', {file.span(12, 70): 'outer'}),
+ equals("""
+ ,
+2 | /- whiz bang boom
+3 | |/ zip zap zop
+4 | || fwee fwoo fwip
+ | |'--- inner
+5 | | argle bargle boo
+ | '---- outer
+ '"""));
+ });
+
+ test('overlapping the whole first line', () {
+ expect(
+ file
+ .span(12, 53)
+ .highlightMultiple('inner', {file.span(12, 70): 'outer'}),
+ equals("""
+ ,
+2 | // whiz bang boom
+3 | || zip zap zop
+4 | || fwee fwoo fwip
+ | |'--- inner
+5 | | argle bargle boo
+ | '---- outer
+ '"""));
+ });
+
+ test('overlapping part of first line', () {
+ expect(
+ file
+ .span(17, 53)
+ .highlightMultiple('inner', {file.span(12, 70): 'outer'}),
+ equals("""
+ ,
+2 | /- whiz bang boom
+ | |,------^
+3 | || zip zap zop
+4 | || fwee fwoo fwip
+ | |'--- inner
+5 | | argle bargle boo
+ | '---- outer
+ '"""));
+ });
+
+ test('overlapping the whole last line', () {
+ expect(
+ file
+ .span(27, 70)
+ .highlightMultiple('inner', {file.span(12, 70): 'outer'}),
+ equals("""
+ ,
+2 | /- whiz bang boom
+3 | |/ zip zap zop
+4 | || fwee fwoo fwip
+5 | || argle bargle boo
+ | |'--- inner
+ | '---- outer
+ '"""));
+ });
+
+ test('overlapping part of the last line', () {
+ expect(
+ file
+ .span(27, 66)
+ .highlightMultiple('inner', {file.span(12, 70): 'outer'}),
+ equals("""
+ ,
+2 | /- whiz bang boom
+3 | |/ zip zap zop
+4 | || fwee fwoo fwip
+5 | || argle bargle boo
+ | |'------------^ inner
+ | '---- outer
+ '"""));
+ });
+ });
+
+ group('a single-line span in a multiline span', () {
+ test('on the first line', () {
+ expect(
+ file
+ .span(17, 21)
+ .highlightMultiple('inner', {file.span(12, 70): 'outer'}),
+ equals("""
+ ,
+2 | / whiz bang boom
+ | | ^^^^ inner
+3 | | zip zap zop
+4 | | fwee fwoo fwip
+5 | | argle bargle boo
+ | '--- outer
+ '"""));
+ });
+
+ test('in the middle', () {
+ expect(
+ file
+ .span(31, 34)
+ .highlightMultiple('inner', {file.span(12, 70): 'outer'}),
+ equals("""
+ ,
+2 | / whiz bang boom
+3 | | zip zap zop
+ | | ^^^ inner
+4 | | fwee fwoo fwip
+5 | | argle bargle boo
+ | '--- outer
+ '"""));
+ });
+
+ test('on the last line', () {
+ expect(
+ file
+ .span(60, 66)
+ .highlightMultiple('inner', {file.span(12, 70): 'outer'}),
+ equals("""
+ ,
+2 | / whiz bang boom
+3 | | zip zap zop
+4 | | fwee fwoo fwip
+5 | | argle bargle boo
+ | | ^^^^^^ inner
+ | '--- outer
+ '"""));
+ });
+ });
+
+ group('writes headers when highlighting multiple files', () {
+ test('writes all file URLs', () {
+ final span2 = SourceFile.fromString('''
+quibble bibble boop
+''', url: 'file2.txt').span(8, 14);
+
+ expect(
+ file.span(31, 34).highlightMultiple('one', {span2: 'two'}), equals("""
+ ,--> file1.txt
+3 | zip zap zop
+ | ^^^ one
+ '
+ ,--> file2.txt
+1 | quibble bibble boop
+ | ====== two
+ '"""));
+ });
+
+ test('allows secondary spans to have null URL', () {
+ final span2 = SourceSpan(SourceLocation(1), SourceLocation(4), 'foo');
+
+ expect(
+ file.span(31, 34).highlightMultiple('one', {span2: 'two'}), equals("""
+ ,--> file1.txt
+3 | zip zap zop
+ | ^^^ one
+ '
+ ,
+1 | foo
+ | === two
+ '"""));
+ });
+
+ test('allows primary span to have null URL', () {
+ final span1 = SourceSpan(SourceLocation(1), SourceLocation(4), 'foo');
+
+ expect(
+ span1.highlightMultiple('one', {file.span(31, 34): 'two'}), equals("""
+ ,
+1 | foo
+ | ^^^ one
+ '
+ ,--> file1.txt
+3 | zip zap zop
+ | === two
+ '"""));
+ });
+ });
+
+ test('highlights multiple null URLs as separate files', () {
+ final span1 = SourceSpan(SourceLocation(1), SourceLocation(4), 'foo');
+ final span2 = SourceSpan(SourceLocation(1), SourceLocation(4), 'bar');
+
+ expect(span1.highlightMultiple('one', {span2: 'two'}), equals("""
+ ,
+1 | foo
+ | ^^^ one
+ '
+ ,
+1 | bar
+ | === two
+ '"""));
+ });
+
+ group('indents multi-line labels', () {
+ test('for the primary label', () {
+ expect(file.span(17, 21).highlightMultiple('line 1\nline 2\nline 3', {}),
+ equals("""
+ ,
+2 | whiz bang boom
+ | ^^^^ line 1
+ | line 2
+ | line 3
+ '"""));
+ });
+
+ group('for a secondary label', () {
+ test('on the same line', () {
+ expect(
+ file.span(17, 21).highlightMultiple(
+ 'primary', {file.span(22, 26): 'line 1\nline 2\nline 3'}),
+ equals("""
+ ,
+2 | whiz bang boom
+ | ^^^^ primary
+ | ==== line 1
+ | line 2
+ | line 3
+ '"""));
+ });
+
+ test('on a different line', () {
+ expect(
+ file.span(17, 21).highlightMultiple(
+ 'primary', {file.span(31, 34): 'line 1\nline 2\nline 3'}),
+ equals("""
+ ,
+2 | whiz bang boom
+ | ^^^^ primary
+3 | zip zap zop
+ | === line 1
+ | line 2
+ | line 3
+ '"""));
+ });
+ });
+
+ group('for a multiline span', () {
+ test('that covers the whole last line', () {
+ expect(
+ file.span(12, 70).highlightMultiple('line 1\nline 2\nline 3', {}),
+ equals("""
+ ,
+2 | / whiz bang boom
+3 | | zip zap zop
+4 | | fwee fwoo fwip
+5 | | argle bargle boo
+ | '--- line 1
+ | line 2
+ | line 3
+ '"""));
+ });
+
+ test('that covers part of the last line', () {
+ expect(
+ file.span(12, 66).highlightMultiple('line 1\nline 2\nline 3', {}),
+ equals("""
+ ,
+2 | / whiz bang boom
+3 | | zip zap zop
+4 | | fwee fwoo fwip
+5 | | argle bargle boo
+ | '------------^ line 1
+ | line 2
+ | line 3
+ '"""));
+ });
+ });
+
+ test('with an overlapping span', () {
+ expect(
+ file.span(12, 70).highlightMultiple('line 1\nline 2\nline 3',
+ {file.span(54, 89): 'two', file.span(0, 27): 'three'}),
+ equals("""
+ ,
+1 | /- foo bar baz
+2 | |/ whiz bang boom
+ | '+--- three
+3 | | zip zap zop
+4 | | fwee fwoo fwip
+5 | /+ argle bargle boo
+ | |'--- line 1
+ | | line 2
+ | | line 3
+6 | | gibble bibble bop
+ | '---- two
+ '"""));
+ });
+ });
+}
diff --git a/pkgs/source_span/test/span_test.dart b/pkgs/source_span/test/span_test.dart
new file mode 100644
index 0000000..22c498e
--- /dev/null
+++ b/pkgs/source_span/test/span_test.dart
@@ -0,0 +1,432 @@
+// Copyright (c) 2014, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:source_span/source_span.dart';
+import 'package:source_span/src/colors.dart' as colors;
+import 'package:term_glyph/term_glyph.dart' as glyph;
+import 'package:test/test.dart';
+
+void main() {
+ late bool oldAscii;
+
+ setUpAll(() {
+ oldAscii = glyph.ascii;
+ glyph.ascii = true;
+ });
+
+ tearDownAll(() {
+ glyph.ascii = oldAscii;
+ });
+
+ late SourceSpan span;
+ setUp(() {
+ span = SourceSpan(SourceLocation(5, sourceUrl: 'foo.dart'),
+ SourceLocation(12, sourceUrl: 'foo.dart'), 'foo bar');
+ });
+
+ group('errors', () {
+ group('for new SourceSpan()', () {
+ test('source URLs must match', () {
+ final start = SourceLocation(0, sourceUrl: 'foo.dart');
+ final end = SourceLocation(1, sourceUrl: 'bar.dart');
+ expect(() => SourceSpan(start, end, '_'), throwsArgumentError);
+ });
+
+ test('end must come after start', () {
+ final start = SourceLocation(1);
+ final end = SourceLocation(0);
+ expect(() => SourceSpan(start, end, '_'), throwsArgumentError);
+ });
+
+ test('text must be the right length', () {
+ final start = SourceLocation(0);
+ final end = SourceLocation(1);
+ expect(() => SourceSpan(start, end, 'abc'), throwsArgumentError);
+ });
+ });
+
+ group('for new SourceSpanWithContext()', () {
+ test('context must contain text', () {
+ final start = SourceLocation(2);
+ final end = SourceLocation(5);
+ expect(() => SourceSpanWithContext(start, end, 'abc', '--axc--'),
+ throwsArgumentError);
+ });
+
+ test('text starts at start.column in context', () {
+ final start = SourceLocation(3);
+ final end = SourceLocation(5);
+ expect(() => SourceSpanWithContext(start, end, 'abc', '--abc--'),
+ throwsArgumentError);
+ });
+
+ test('text starts at start.column of line in multi-line context', () {
+ final start = SourceLocation(4, line: 55, column: 3);
+ final end = SourceLocation(7, line: 55, column: 6);
+ expect(() => SourceSpanWithContext(start, end, 'abc', '\n--abc--'),
+ throwsArgumentError);
+ expect(() => SourceSpanWithContext(start, end, 'abc', '\n----abc--'),
+ throwsArgumentError);
+ expect(() => SourceSpanWithContext(start, end, 'abc', '\n\n--abc--'),
+ throwsArgumentError);
+
+ // However, these are valid:
+ SourceSpanWithContext(start, end, 'abc', '\n---abc--');
+ SourceSpanWithContext(start, end, 'abc', '\n\n---abc--');
+ });
+
+ test('text can occur multiple times in context', () {
+ final start1 = SourceLocation(4, line: 55, column: 2);
+ final end1 = SourceLocation(7, line: 55, column: 5);
+ final start2 = SourceLocation(4, line: 55, column: 8);
+ final end2 = SourceLocation(7, line: 55, column: 11);
+ SourceSpanWithContext(start1, end1, 'abc', '--abc---abc--\n');
+ SourceSpanWithContext(start1, end1, 'abc', '--abc--abc--\n');
+ SourceSpanWithContext(start2, end2, 'abc', '--abc---abc--\n');
+ SourceSpanWithContext(start2, end2, 'abc', '---abc--abc--\n');
+ expect(
+ () => SourceSpanWithContext(start1, end1, 'abc', '---abc--abc--\n'),
+ throwsArgumentError);
+ expect(
+ () => SourceSpanWithContext(start2, end2, 'abc', '--abc--abc--\n'),
+ throwsArgumentError);
+ });
+ });
+
+ group('for union()', () {
+ test('source URLs must match', () {
+ final other = SourceSpan(SourceLocation(12, sourceUrl: 'bar.dart'),
+ SourceLocation(13, sourceUrl: 'bar.dart'), '_');
+
+ expect(() => span.union(other), throwsArgumentError);
+ });
+
+ test('spans may not be disjoint', () {
+ final other = SourceSpan(SourceLocation(13, sourceUrl: 'foo.dart'),
+ SourceLocation(14, sourceUrl: 'foo.dart'), '_');
+
+ expect(() => span.union(other), throwsArgumentError);
+ });
+ });
+
+ test('for compareTo() source URLs must match', () {
+ final other = SourceSpan(SourceLocation(12, sourceUrl: 'bar.dart'),
+ SourceLocation(13, sourceUrl: 'bar.dart'), '_');
+
+ expect(() => span.compareTo(other), throwsArgumentError);
+ });
+ });
+
+ test('fields work correctly', () {
+ expect(span.start, equals(SourceLocation(5, sourceUrl: 'foo.dart')));
+ expect(span.end, equals(SourceLocation(12, sourceUrl: 'foo.dart')));
+ expect(span.sourceUrl, equals(Uri.parse('foo.dart')));
+ expect(span.length, equals(7));
+ });
+
+ group('union()', () {
+ test('works with a preceding adjacent span', () {
+ final other = SourceSpan(SourceLocation(0, sourceUrl: 'foo.dart'),
+ SourceLocation(5, sourceUrl: 'foo.dart'), 'hey, ');
+
+ final result = span.union(other);
+ expect(result.start, equals(other.start));
+ expect(result.end, equals(span.end));
+ expect(result.text, equals('hey, foo bar'));
+ });
+
+ test('works with a preceding overlapping span', () {
+ final other = SourceSpan(SourceLocation(0, sourceUrl: 'foo.dart'),
+ SourceLocation(8, sourceUrl: 'foo.dart'), 'hey, foo');
+
+ final result = span.union(other);
+ expect(result.start, equals(other.start));
+ expect(result.end, equals(span.end));
+ expect(result.text, equals('hey, foo bar'));
+ });
+
+ test('works with a following adjacent span', () {
+ final other = SourceSpan(SourceLocation(12, sourceUrl: 'foo.dart'),
+ SourceLocation(16, sourceUrl: 'foo.dart'), ' baz');
+
+ final result = span.union(other);
+ expect(result.start, equals(span.start));
+ expect(result.end, equals(other.end));
+ expect(result.text, equals('foo bar baz'));
+ });
+
+ test('works with a following overlapping span', () {
+ final other = SourceSpan(SourceLocation(9, sourceUrl: 'foo.dart'),
+ SourceLocation(16, sourceUrl: 'foo.dart'), 'bar baz');
+
+ final result = span.union(other);
+ expect(result.start, equals(span.start));
+ expect(result.end, equals(other.end));
+ expect(result.text, equals('foo bar baz'));
+ });
+
+ test('works with an internal overlapping span', () {
+ final other = SourceSpan(SourceLocation(7, sourceUrl: 'foo.dart'),
+ SourceLocation(10, sourceUrl: 'foo.dart'), 'o b');
+
+ expect(span.union(other), equals(span));
+ });
+
+ test('works with an external overlapping span', () {
+ final other = SourceSpan(SourceLocation(0, sourceUrl: 'foo.dart'),
+ SourceLocation(16, sourceUrl: 'foo.dart'), 'hey, foo bar baz');
+
+ expect(span.union(other), equals(other));
+ });
+ });
+
+ group('subspan()', () {
+ group('errors', () {
+ test('start must be greater than zero', () {
+ expect(() => span.subspan(-1), throwsRangeError);
+ });
+
+ test('start must be less than or equal to length', () {
+ expect(() => span.subspan(span.length + 1), throwsRangeError);
+ });
+
+ test('end must be greater than start', () {
+ expect(() => span.subspan(2, 1), throwsRangeError);
+ });
+
+ test('end must be less than or equal to length', () {
+ expect(() => span.subspan(0, span.length + 1), throwsRangeError);
+ });
+ });
+
+ test('preserves the source URL', () {
+ final result = span.subspan(1, 2);
+ expect(result.start.sourceUrl, equals(span.sourceUrl));
+ expect(result.end.sourceUrl, equals(span.sourceUrl));
+ });
+
+ test('preserves the context', () {
+ final start = SourceLocation(2);
+ final end = SourceLocation(5);
+ final span = SourceSpanWithContext(start, end, 'abc', '--abc--');
+ expect(span.subspan(1, 2).context, equals('--abc--'));
+ });
+
+ group('returns the original span', () {
+ test('with an implicit end', () => expect(span.subspan(0), equals(span)));
+
+ test('with an explicit end',
+ () => expect(span.subspan(0, span.length), equals(span)));
+ });
+
+ group('within a single line', () {
+ test('returns a strict substring of the original span', () {
+ final result = span.subspan(1, 5);
+ expect(result.text, equals('oo b'));
+ expect(result.start.offset, equals(6));
+ expect(result.start.line, equals(0));
+ expect(result.start.column, equals(6));
+ expect(result.end.offset, equals(10));
+ expect(result.end.line, equals(0));
+ expect(result.end.column, equals(10));
+ });
+
+ test('an implicit end goes to the end of the original span', () {
+ final result = span.subspan(1);
+ expect(result.text, equals('oo bar'));
+ expect(result.start.offset, equals(6));
+ expect(result.start.line, equals(0));
+ expect(result.start.column, equals(6));
+ expect(result.end.offset, equals(12));
+ expect(result.end.line, equals(0));
+ expect(result.end.column, equals(12));
+ });
+
+ test('can return an empty span', () {
+ final result = span.subspan(3, 3);
+ expect(result.text, isEmpty);
+ expect(result.start.offset, equals(8));
+ expect(result.start.line, equals(0));
+ expect(result.start.column, equals(8));
+ expect(result.end, equals(result.start));
+ });
+ });
+
+ group('across multiple lines', () {
+ setUp(() {
+ span = SourceSpan(
+ SourceLocation(5, line: 2, column: 0),
+ SourceLocation(16, line: 4, column: 3),
+ 'foo\n'
+ 'bar\n'
+ 'baz');
+ });
+
+ test('with start and end in the middle of a line', () {
+ final result = span.subspan(2, 5);
+ expect(result.text, equals('o\nb'));
+ expect(result.start.offset, equals(7));
+ expect(result.start.line, equals(2));
+ expect(result.start.column, equals(2));
+ expect(result.end.offset, equals(10));
+ expect(result.end.line, equals(3));
+ expect(result.end.column, equals(1));
+ });
+
+ test('with start at the end of a line', () {
+ final result = span.subspan(3, 5);
+ expect(result.text, equals('\nb'));
+ expect(result.start.offset, equals(8));
+ expect(result.start.line, equals(2));
+ expect(result.start.column, equals(3));
+ });
+
+ test('with start at the beginning of a line', () {
+ final result = span.subspan(4, 5);
+ expect(result.text, equals('b'));
+ expect(result.start.offset, equals(9));
+ expect(result.start.line, equals(3));
+ expect(result.start.column, equals(0));
+ });
+
+ test('with end at the end of a line', () {
+ final result = span.subspan(2, 3);
+ expect(result.text, equals('o'));
+ expect(result.end.offset, equals(8));
+ expect(result.end.line, equals(2));
+ expect(result.end.column, equals(3));
+ });
+
+ test('with end at the beginning of a line', () {
+ final result = span.subspan(2, 4);
+ expect(result.text, equals('o\n'));
+ expect(result.end.offset, equals(9));
+ expect(result.end.line, equals(3));
+ expect(result.end.column, equals(0));
+ });
+ });
+ });
+
+ group('message()', () {
+ test('prints the text being described', () {
+ expect(span.message('oh no'), equals("""
+line 1, column 6 of foo.dart: oh no
+ ,
+1 | foo bar
+ | ^^^^^^^
+ '"""));
+ });
+
+ test('gracefully handles a missing source URL', () {
+ final span = SourceSpan(SourceLocation(5), SourceLocation(12), 'foo bar');
+
+ expect(span.message('oh no'), equalsIgnoringWhitespace("""
+line 1, column 6: oh no
+ ,
+1 | foo bar
+ | ^^^^^^^
+ '"""));
+ });
+
+ test('gracefully handles empty text', () {
+ final span = SourceSpan(SourceLocation(5), SourceLocation(5), '');
+
+ expect(span.message('oh no'), equals('line 1, column 6: oh no'));
+ });
+
+ test("doesn't colorize if color is false", () {
+ expect(span.message('oh no', color: false), equals("""
+line 1, column 6 of foo.dart: oh no
+ ,
+1 | foo bar
+ | ^^^^^^^
+ '"""));
+ });
+
+ test('colorizes if color is true', () {
+ expect(span.message('oh no', color: true), equals("""
+line 1, column 6 of foo.dart: oh no
+${colors.blue} ,${colors.none}
+${colors.blue}1 |${colors.none} ${colors.red}foo bar${colors.none}
+${colors.blue} |${colors.none} ${colors.red}^^^^^^^${colors.none}
+${colors.blue} '${colors.none}"""));
+ });
+
+ test("uses the given color if it's passed", () {
+ expect(span.message('oh no', color: colors.yellow), equals("""
+line 1, column 6 of foo.dart: oh no
+${colors.blue} ,${colors.none}
+${colors.blue}1 |${colors.none} ${colors.yellow}foo bar${colors.none}
+${colors.blue} |${colors.none} ${colors.yellow}^^^^^^^${colors.none}
+${colors.blue} '${colors.none}"""));
+ });
+
+ test('with context, underlines the right column', () {
+ final spanWithContext = SourceSpanWithContext(
+ SourceLocation(5, sourceUrl: 'foo.dart'),
+ SourceLocation(12, sourceUrl: 'foo.dart'),
+ 'foo bar',
+ '-----foo bar-----');
+
+ expect(spanWithContext.message('oh no', color: colors.yellow), equals("""
+line 1, column 6 of foo.dart: oh no
+${colors.blue} ,${colors.none}
+${colors.blue}1 |${colors.none} -----${colors.yellow}foo bar${colors.none}-----
+${colors.blue} |${colors.none} ${colors.yellow} ^^^^^^^${colors.none}
+${colors.blue} '${colors.none}"""));
+ });
+ });
+
+ group('compareTo()', () {
+ test('sorts by start location first', () {
+ final other = SourceSpan(SourceLocation(6, sourceUrl: 'foo.dart'),
+ SourceLocation(14, sourceUrl: 'foo.dart'), 'oo bar b');
+
+ expect(span.compareTo(other), lessThan(0));
+ expect(other.compareTo(span), greaterThan(0));
+ });
+
+ test('sorts by length second', () {
+ final other = SourceSpan(SourceLocation(5, sourceUrl: 'foo.dart'),
+ SourceLocation(14, sourceUrl: 'foo.dart'), 'foo bar b');
+
+ expect(span.compareTo(other), lessThan(0));
+ expect(other.compareTo(span), greaterThan(0));
+ });
+
+ test('considers equal spans equal', () {
+ expect(span.compareTo(span), equals(0));
+ });
+ });
+
+ group('equality', () {
+ test('two spans with the same locations are equal', () {
+ final other = SourceSpan(SourceLocation(5, sourceUrl: 'foo.dart'),
+ SourceLocation(12, sourceUrl: 'foo.dart'), 'foo bar');
+
+ expect(span, equals(other));
+ });
+
+ test("a different start isn't equal", () {
+ final other = SourceSpan(SourceLocation(0, sourceUrl: 'foo.dart'),
+ SourceLocation(12, sourceUrl: 'foo.dart'), 'hey, foo bar');
+
+ expect(span, isNot(equals(other)));
+ });
+
+ test("a different end isn't equal", () {
+ final other = SourceSpan(SourceLocation(5, sourceUrl: 'foo.dart'),
+ SourceLocation(16, sourceUrl: 'foo.dart'), 'foo bar baz');
+
+ expect(span, isNot(equals(other)));
+ });
+
+ test("a different source URL isn't equal", () {
+ final other = SourceSpan(SourceLocation(5, sourceUrl: 'bar.dart'),
+ SourceLocation(12, sourceUrl: 'bar.dart'), 'foo bar');
+
+ expect(span, isNot(equals(other)));
+ });
+ });
+}
diff --git a/pkgs/source_span/test/utils_test.dart b/pkgs/source_span/test/utils_test.dart
new file mode 100644
index 0000000..91397c0
--- /dev/null
+++ b/pkgs/source_span/test/utils_test.dart
@@ -0,0 +1,58 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:source_span/src/utils.dart';
+import 'package:test/test.dart';
+
+void main() {
+ group('find line start', () {
+ test('skip entries in wrong column', () {
+ const context = '0_bb\n1_bbb\n2b____\n3bbb\n';
+ final index = findLineStart(context, 'b', 1)!;
+ expect(index, 11);
+ expect(context.substring(index - 1, index + 3), '\n2b_');
+ });
+
+ test('end of line column for empty text', () {
+ const context = '0123\n56789\nabcdefgh\n';
+ final index = findLineStart(context, '', 5)!;
+ expect(index, 5);
+ expect(context[index], '5');
+ });
+
+ test('column at the end of the file for empty text', () {
+ var context = '0\n2\n45\n';
+ var index = findLineStart(context, '', 2)!;
+ expect(index, 4);
+ expect(context[index], '4');
+
+ context = '0\n2\n45';
+ index = findLineStart(context, '', 2)!;
+ expect(index, 4);
+ });
+
+ test('empty text in empty context', () {
+ final index = findLineStart('', '', 0);
+ expect(index, 0);
+ });
+
+ test('found on the first line', () {
+ const context = '0\n2\n45\n';
+ final index = findLineStart(context, '0', 0);
+ expect(index, 0);
+ });
+
+ test('finds text that starts with a newline', () {
+ const context = '0\n2\n45\n';
+ final index = findLineStart(context, '\n2', 1);
+ expect(index, 0);
+ });
+
+ test('not found', () {
+ const context = '0\n2\n45\n';
+ final index = findLineStart(context, '0', 1);
+ expect(index, isNull);
+ });
+ });
+}
diff --git a/pkgs/sse/.gitignore b/pkgs/sse/.gitignore
new file mode 100644
index 0000000..1467782
--- /dev/null
+++ b/pkgs/sse/.gitignore
@@ -0,0 +1,3 @@
+.dart_tool
+pubspec.lock
+test/web/index.dart.js.deps
diff --git a/pkgs/sse/AUTHORS b/pkgs/sse/AUTHORS
new file mode 100644
index 0000000..7c12ae6
--- /dev/null
+++ b/pkgs/sse/AUTHORS
@@ -0,0 +1,6 @@
+# Below is a list of people and organizations that have contributed
+# to the Dart project. Names should be added to the list like so:
+#
+# Name/Organization <email address>
+
+Google Inc.
diff --git a/pkgs/sse/CHANGELOG.md b/pkgs/sse/CHANGELOG.md
new file mode 100644
index 0000000..0387ba9
--- /dev/null
+++ b/pkgs/sse/CHANGELOG.md
@@ -0,0 +1,178 @@
+## 4.1.7
+
+- Move to `dart-lang/tools` monorepo.
+
+## 4.1.6
+
+- Require package `web: '>=0.5.0 <2.0.0'`.
+
+## 4.1.5
+
+- Drop unneeded dependency on `package:js`.
+- Update the minimum Dart SDK version to `3.3.0`.
+- Support the latest `package:web`.
+
+## 4.1.4
+
+- Fix incorrect cast causing failure with `dart2wasm`.
+
+## 4.1.3
+
+- Update the minimum Dart SDK version to `3.2.0`.
+
+## 4.1.2
+
+- Send `fetch` requests instead of `XHR` requests.
+- Add an optional `debugKey` parameter to `SseClient` to include in logging.
+- Add a dependency on `package:js`.
+- Update the minimum Dart SDK version to `2.16.0`.
+
+## 4.1.1
+
+- Apply `keepAlive` logic to `SocketException`s.
+- Switch from using `package:pedantic` to `package:lints`.
+- Rev the minimum required SDK to 2.15.
+- Populate the pubspec `repository` field.
+
+## 4.1.0
+
+- Limit the number of concurrent requests to prevent Chrome from automatically
+ dropping them on the floor.
+
+## 4.0.0
+
+- Support null safety.
+
+## 3.8.3
+
+- Require the latest shelf and remove dead code.
+
+## 3.8.2
+
+- Complete `onConnected` with an error if the `SseClient` receives an error
+ before the connection is successfully opened.
+
+## 3.8.1
+
+- Fix an issue where closing the `SseConnection` stream would result in an
+ error.
+
+## 3.8.0
+
+- Add `onConnected` to replace `onOpen`.
+- Fix an issue where failed requests would not add a `done` event to the
+ connection `sink`.
+
+## 3.7.0
+
+- Deprecate the client's `onOpen` getter. Messages will now be buffered until a
+ connection is established.
+
+## 3.6.1
+
+- Drop dependency on `package:uuid`.
+
+## 3.6.0
+
+- Improve performance by buffering out of order messages in the server instead
+ of the client.
+
+**Note:** This is not modelled as a breaking change, as the server can
+handle messages from older clients. However, clients should be using the latest
+server if they require order guarantees.
+
+## 3.5.0
+
+- Add new `shutdown` methods on `SseHandler` and `SseConnection` to allow
+ closing connections immediately, ignoring any keep-alive periods.
+
+## 3.4.0
+
+- Remove `onClose` from `SseConnection` and ensure the corresponding
+ `sink.close` correctly fires.
+
+## 3.3.0
+
+- Add an `onClose` event to the `SseConnection`. This allows consumers to listen
+ to this event in lieu of `sseConnection.sink.done` as that is not guaranteed to
+ fire.
+
+## 3.2.2
+
+- Fix an issue where `keepAlive` may cause state errors when attempting to send
+ messages on a closed stream.
+
+## 3.2.1
+
+- Fix an issue where `keepAlive` would only allow a single reconnection.
+
+## 3.2.0
+
+- Re-expose `isInKeepAlivePeriod` flag on `SseConnection`. This flag will be
+ `true` when a connection has been dropped and is in the keep-alive period
+ waiting for a client to reconnect.
+
+## 3.1.2
+
+- Fix an issue where the `SseClient` would not send a `done` event when there
+ was an error with the SSE connection.
+
+## 3.1.1
+
+- Make `isInKeepAlive` on `SseConnection` private.
+
+**Note that this is a breaking change, but in practice no one should be
+depending on this API.**
+
+## 3.1.0
+
+- Add optional `keepAlive` parameter to the `SseHandler`. If `keepAlive` is
+ supplied, the connection will remain active for this period after a disconnect
+ and can be reconnected transparently. If there is no reconnect within that
+ period, the connection will be closed normally.
+
+## 3.0.0
+
+- Add retry logic.
+
+**Possible breaking change:** Error messages may now be delayed up to 5 seconds
+in the client.
+
+## 2.1.2
+
+- Remove `package:http` dependency.
+
+## 2.1.1
+
+- Use proper headers delimiter.
+
+## 2.1.0
+
+- Support Firefox.
+
+## 2.0.3
+
+- Fix an issue where messages could come out of order.
+
+## 2.0.2
+
+- Support the latest `package:stream_channel`.
+- Require Dart SDK `>=2.1.0 <3.0.0`.
+
+## 2.0.1
+
+- Update to `package:uuid` version 2.0.
+
+## 2.0.0
+
+- No longer expose `close` and `onClose` on an `SseConnection`. This is simply
+ handled by the underlying `stream` / `sink`.
+- Fix a bug where resources of the `SseConnection` were not properly closed.
+
+## 1.0.0
+
+- Internal cleanup.
+
+## 0.0.1
+
+- Initial commit.
diff --git a/pkgs/sse/LICENSE b/pkgs/sse/LICENSE
new file mode 100644
index 0000000..a0d5f54
--- /dev/null
+++ b/pkgs/sse/LICENSE
@@ -0,0 +1,27 @@
+Copyright 2019, the Dart project authors.
+
+Redistribution and use in source and binary forms, with or without
+modification, are permitted provided that the following conditions are
+met:
+
+ * Redistributions of source code must retain the above copyright
+ notice, this list of conditions and the following disclaimer.
+ * Redistributions in binary form must reproduce the above
+ copyright notice, this list of conditions and the following
+ disclaimer in the documentation and/or other materials provided
+ with the distribution.
+ * Neither the name of Google LLC nor the names of its
+ contributors may be used to endorse or promote products derived
+ from this software without specific prior written permission.
+
+THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
diff --git a/pkgs/sse/README.md b/pkgs/sse/README.md
new file mode 100644
index 0000000..ef51415
--- /dev/null
+++ b/pkgs/sse/README.md
@@ -0,0 +1,14 @@
+[](https://github.com/dart-lang/tools/actions/workflows/sse.yaml)
+[](https://pub.dev/packages/sse)
+[](https://pub.dev/packages/sse/publisher)
+
+This package provides support for bi-directional communication through
+Server-Sent Events and corresponding POST requests.
+
+This package is not intended to be a general-purpose SSE package; rather, it
+implements a bidirectional protocol for use when WebSockets are unavailable.
+Both the client and the server expose a `sink` and a `stream` on which to send
+and receive messages, respectively.
+
+The server and client make implicit assumptions about each other, so a client
+from this package must be paired with a server from this package.
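+
+As a minimal sketch of the intended pairing (mirroring `example/index.dart`
+and `example/server.dart` in this package; the handler path and port are
+arbitrary choices):
+
+```dart
+// Browser-side client.
+import 'package:sse/client/sse_client.dart';
+
+void main() {
+  final channel = SseClient('/sseHandler');
+  channel.sink.add('hello');    // send a message to the server
+  channel.stream.listen(print); // print messages from the server
+}
+```
+
+```dart
+// Server-side handler.
+import 'package:shelf/shelf_io.dart' as io;
+import 'package:sse/server/sse_handler.dart';
+
+void main() async {
+  final handler = SseHandler(Uri.parse('/sseHandler'));
+  await io.serve(handler.handler, 'localhost', 8080);
+  final connection = await handler.connections.next;
+  connection.sink.add('foo');      // send a message to the client
+  connection.stream.listen(print); // print messages from the client
+}
+```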
diff --git a/pkgs/sse/analysis_options.yaml b/pkgs/sse/analysis_options.yaml
new file mode 100644
index 0000000..6729bd9
--- /dev/null
+++ b/pkgs/sse/analysis_options.yaml
@@ -0,0 +1,13 @@
+# https://dart.dev/guides/language/analysis-options
+include: package:dart_flutter_team_lints/analysis_options.yaml
+
+analyzer:
+ language:
+ strict-casts: true
+
+linter:
+ rules:
+ - avoid_unused_constructor_parameters
+ - cancel_subscriptions
+ - literal_only_boolean_expressions
+ - no_adjacent_strings_in_list
diff --git a/pkgs/sse/example/index.dart b/pkgs/sse/example/index.dart
new file mode 100644
index 0000000..0ed7596
--- /dev/null
+++ b/pkgs/sse/example/index.dart
@@ -0,0 +1,15 @@
+// Copyright (c) 2019, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:sse/client/sse_client.dart';
+
+/// A basic example which should be used in a browser that supports SSE.
+void main() {
+ var channel = SseClient('/sseHandler');
+
+ channel.stream.listen((s) {
+ // Listen for messages and send them back.
+ channel.sink.add(s);
+ });
+}
diff --git a/pkgs/sse/example/server.dart b/pkgs/sse/example/server.dart
new file mode 100644
index 0000000..b6ee750
--- /dev/null
+++ b/pkgs/sse/example/server.dart
@@ -0,0 +1,21 @@
+// Copyright (c) 2019, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:shelf/shelf_io.dart' as io;
+import 'package:sse/server/sse_handler.dart';
+
+/// A basic server which sets up an SSE handler.
+///
+/// When a client connects it will send a simple message and print the
+/// response.
+void main() async {
+ var handler = SseHandler(Uri.parse('/sseHandler'));
+ await io.serve(handler.handler, 'localhost', 0);
+ var connections = handler.connections;
+ while (await connections.hasNext) {
+ var connection = await connections.next;
+ connection.sink.add('foo');
+ connection.stream.listen(print);
+ }
+}
diff --git a/pkgs/sse/lib/client/sse_client.dart b/pkgs/sse/lib/client/sse_client.dart
new file mode 100644
index 0000000..4d3df49
--- /dev/null
+++ b/pkgs/sse/lib/client/sse_client.dart
@@ -0,0 +1,166 @@
+// Copyright (c) 2019, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+import 'dart:convert';
+import 'dart:js_interop';
+
+import 'package:logging/logging.dart';
+import 'package:pool/pool.dart';
+import 'package:stream_channel/stream_channel.dart';
+import 'package:web/web.dart';
+
+import '../src/util/uuid.dart';
+
+/// Limit for the number of concurrent outgoing requests.
+///
+/// Chrome drops outgoing requests on the floor after some threshold. To
+/// prevent these errors, we buffer outgoing requests with a pool.
+///
+/// Note that Chrome's limit is 6000, so this gives us plenty of headroom.
+final _requestPool = Pool(1000);
+
+/// A client for bi-directional SSE communication.
+///
+/// The client can send any JSON-encodable messages to the server by adding
+/// them to the [sink] and listen to messages from the server on the [stream].
+class SseClient extends StreamChannelMixin<String?> {
+ final String _clientId;
+
+ final _incomingController = StreamController<String>();
+
+ final _outgoingController = StreamController<String>();
+
+ final _logger = Logger('SseClient');
+
+ final _onConnected = Completer<void>();
+
+ int _lastMessageId = -1;
+
+ late EventSource _eventSource;
+
+ late String _serverUrl;
+
+ Timer? _errorTimer;
+
+ /// [serverUrl] is the URL under which the server is listening for
+ /// incoming bi-directional SSE connections. [debugKey] is an optional key
+ /// that can be used to identify the SSE connection.
+ SseClient(String serverUrl, {String? debugKey})
+ : _clientId = debugKey == null
+ ? generateUuidV4()
+ : '$debugKey-${generateUuidV4()}' {
+ _serverUrl = '$serverUrl?sseClientId=$_clientId';
+ _eventSource =
+ EventSource(_serverUrl, EventSourceInit(withCredentials: true));
+ _eventSource.onOpen.first.whenComplete(() {
+ _onConnected.complete();
+ _outgoingController.stream
+ .listen(_onOutgoingMessage, onDone: _onOutgoingDone);
+ });
+ _eventSource.addEventListener('message', _onIncomingMessage.toJS);
+ _eventSource.addEventListener('control', _onIncomingControlMessage.toJS);
+
+ _eventSource.onOpen.listen((_) {
+ _errorTimer?.cancel();
+ });
+ _eventSource.onError.listen((error) {
+ if (!(_errorTimer?.isActive ?? false)) {
+ // By default the SSE client uses keep-alive.
+ // Allow for a retry to connect before giving up.
+ _errorTimer = Timer(const Duration(seconds: 5), () {
+ _closeWithError(error);
+ });
+ }
+ });
+ }
+
+ @Deprecated('Use onConnected instead.')
+ Stream<Event> get onOpen => _eventSource.onOpen;
+
+ Future<void> get onConnected => _onConnected.future;
+
+ /// Add messages to this [StreamSink] to send them to the server.
+ ///
+  /// Messages added to the sink must be JSON-encodable. Messages that fail
+  /// to encode are logged through a [Logger].
+ @override
+ StreamSink<String> get sink => _outgoingController.sink;
+
+ /// [Stream] of messages sent from the server to this client.
+ ///
+ /// A message is a decoded JSON object.
+ @override
+ Stream<String> get stream => _incomingController.stream;
+
+ void close() {
+ _eventSource.close();
+    // If the initial connection was never established, add a listener so
+    // that closing adds a done event to [sink].
+ if (!_onConnected.isCompleted) _outgoingController.stream.drain<void>();
+ _incomingController.close();
+ _outgoingController.close();
+ }
+
+ void _closeWithError(Object error) {
+ _incomingController.addError(error);
+ close();
+ if (!_onConnected.isCompleted) {
+ // This call must happen after the call to close() which checks
+ // whether the completer was completed earlier.
+ _onConnected.completeError(error);
+ }
+ }
+
+ void _onIncomingControlMessage(Event message) {
+ var data = (message as MessageEvent).data;
+ if (data.dartify() == 'close') {
+ close();
+ } else {
+ throw UnsupportedError('[$_clientId] Illegal Control Message "$data"');
+ }
+ }
+
+ void _onIncomingMessage(Event message) {
+ var decoded =
+ jsonDecode(((message as MessageEvent).data as JSString).toDart);
+ _incomingController.add(decoded as String);
+ }
+
+ void _onOutgoingDone() {
+ close();
+ }
+
+ void _onOutgoingMessage(String? message) async {
+ String? encodedMessage;
+ await _requestPool.withResource(() async {
+ try {
+ encodedMessage = jsonEncode(message);
+ // ignore: avoid_catching_errors
+ } on JsonUnsupportedObjectError catch (e) {
+ _logger.warning('[$_clientId] Unable to encode outgoing message: $e');
+ // ignore: avoid_catching_errors
+ } on ArgumentError catch (e) {
+ _logger.warning('[$_clientId] Invalid argument: $e');
+ }
+ try {
+ final url = '$_serverUrl&messageId=${++_lastMessageId}';
+ await _fetch(
+ url,
+ RequestInit(
+ method: 'POST',
+ body: encodedMessage?.toJS,
+ credentials: 'include'));
+ } catch (error) {
+ final augmentedError =
+ '[$_clientId] SSE client failed to send $message:\n $error';
+ _logger.severe(augmentedError);
+ _closeWithError(augmentedError);
+ }
+ });
+ }
+}
+
+Future<void> _fetch(String resourceUrl, RequestInit options) =>
+ window.fetch(resourceUrl.toJS, options).toDart;
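The `_requestPool` comment above describes capping concurrent outgoing POSTs with a `Pool` from `package:pool`. Below is a standalone sketch of that throttling pattern, separate from the client itself; the pool size and the `fakeSend` stand-in are illustrative, not part of the package:
```
import 'package:pool/pool.dart';

// Stand-in for the real POST request; only the throttling pattern matters.
Future<void> fakeSend(int i) =>
    Future<void>.delayed(const Duration(milliseconds: 10));

Future<void> main() async {
  // At most 10 sends run at once; the rest queue until a slot frees up.
  // (The SSE client uses a much larger limit of 1000.)
  final pool = Pool(10);
  await Future.wait([
    for (var i = 0; i < 100; i++) pool.withResource(() => fakeSend(i)),
  ]);
  await pool.close();
}
```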
diff --git a/pkgs/sse/lib/server/sse_handler.dart b/pkgs/sse/lib/server/sse_handler.dart
new file mode 100644
index 0000000..bfed935
--- /dev/null
+++ b/pkgs/sse/lib/server/sse_handler.dart
@@ -0,0 +1,5 @@
+// Copyright (c) 2019, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+export 'package:sse/src/server/sse_handler.dart' show SseConnection, SseHandler;
diff --git a/pkgs/sse/lib/src/server/sse_handler.dart b/pkgs/sse/lib/src/server/sse_handler.dart
new file mode 100644
index 0000000..376fe27
--- /dev/null
+++ b/pkgs/sse/lib/src/server/sse_handler.dart
@@ -0,0 +1,299 @@
+// Copyright (c) 2019, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+import 'dart:convert';
+import 'dart:io';
+
+import 'package:async/async.dart';
+import 'package:collection/collection.dart';
+import 'package:logging/logging.dart';
+import 'package:shelf/shelf.dart' as shelf;
+import 'package:stream_channel/stream_channel.dart';
+
+// RFC 2616 requires carriage return delimiters.
+String _sseHeaders(String? origin) => 'HTTP/1.1 200 OK\r\n'
+ 'Content-Type: text/event-stream\r\n'
+ 'Cache-Control: no-cache\r\n'
+ 'Connection: keep-alive\r\n'
+ 'Access-Control-Allow-Credentials: true\r\n'
+ "${origin != null ? 'Access-Control-Allow-Origin: $origin\r\n' : ''}"
+ '\r\n\r\n';
+
+class _SseMessage {
+ final int id;
+ final String message;
+ _SseMessage(this.id, this.message);
+}
+
+/// A bi-directional SSE connection between server and browser.
+class SseConnection extends StreamChannelMixin<String> {
+ /// Incoming messages from the Browser client.
+ final _incomingController = StreamController<String>();
+
+ /// Outgoing messages to the Browser client.
+ final _outgoingController = StreamController<String>();
+
+ Sink _sink;
+
+ /// How long to wait after a connection drops before considering it closed.
+ final Duration? _keepAlive;
+
+  /// A timer counting down the keep-alive period (null if the connection
+  /// hasn't disconnected).
+ Timer? _keepAliveTimer;
+
+ /// Whether this connection is currently in the KeepAlive timeout period.
+ bool get isInKeepAlivePeriod => _keepAliveTimer?.isActive ?? false;
+
+ /// The id of the last processed incoming message.
+ int _lastProcessedId = -1;
+
+ /// Incoming messages that have yet to be processed.
+ final _pendingMessages =
+ HeapPriorityQueue<_SseMessage>((a, b) => a.id.compareTo(b.id));
+
+ final _closedCompleter = Completer<void>();
+
+  /// Wraps `_outgoingController.stream` to buffer events, enabling
+  /// keep-alive.
+ late StreamQueue _outgoingStreamQueue;
+
+ /// Creates an [SseConnection] for the supplied [_sink].
+ ///
+ /// If [keepAlive] is supplied, the connection will remain active for this
+ /// period after a disconnect and can be reconnected transparently. If there
+ /// is no reconnect within that period, the connection will be closed
+ /// normally.
+ ///
+ /// If [keepAlive] is not supplied, the connection will be closed immediately
+ /// after a disconnect.
+ SseConnection(this._sink, {Duration? keepAlive}) : _keepAlive = keepAlive {
+ _outgoingStreamQueue = StreamQueue(_outgoingController.stream);
+ unawaited(_setUpListener());
+ _outgoingController.onCancel = _close;
+ _incomingController.onCancel = _close;
+ }
+
+ Future<void> _setUpListener() async {
+ while (
+ !_outgoingController.isClosed && await _outgoingStreamQueue.hasNext) {
+ // If we're in a KeepAlive timeout, there's nowhere to send messages so
+ // wait a short period and check again.
+ if (isInKeepAlivePeriod) {
+ await Future<void>.delayed(const Duration(milliseconds: 200));
+ continue;
+ }
+
+ // Peek the data so we don't remove it from the stream if we're unable to
+ // send it.
+ final data = await _outgoingStreamQueue.peek;
+
+ // Ignore outgoing messages since the connection may have closed while
+ // waiting for the keep alive.
+ if (_closedCompleter.isCompleted) break;
+
+ try {
+ // JSON encode the message to escape new lines.
+ _sink.add('data: ${json.encode(data)}\n');
+ _sink.add('\n');
+ await _outgoingStreamQueue.next; // Consume from stream if no errors.
+ } catch (e) {
+ if ((e is StateError || e is SocketException) &&
+ (_keepAlive != null && !_closedCompleter.isCompleted)) {
+ // If we got here then the sink may have closed but the stream.onDone
+ // hasn't fired yet, so pause the subscription and skip calling
+ // `next` so the message remains in the queue to try again.
+ _handleDisconnect();
+ } else {
+ rethrow;
+ }
+ }
+ }
+ }
+
+  /// The message added to the sink has to be JSON-encodable.
+ @override
+ StreamSink<String> get sink => _outgoingController.sink;
+
+  /// [Stream] of messages sent from the browser client to this connection.
+  ///
+  /// A message is a decoded JSON object.
+ @override
+ Stream<String> get stream => _incomingController.stream;
+
+ /// Adds an incoming [message] to the [stream].
+ ///
+ /// This will buffer messages to guarantee order.
+ void _addIncomingMessage(int id, String message) {
+ _pendingMessages.add(_SseMessage(id, message));
+ while (_pendingMessages.isNotEmpty) {
+ var pendingMessage = _pendingMessages.first;
+ // Only process the next incremental message.
+ if (pendingMessage.id - _lastProcessedId <= 1) {
+ _incomingController.sink.add(pendingMessage.message);
+ _lastProcessedId = pendingMessage.id;
+ _pendingMessages.removeFirst();
+ } else {
+ // A message came out of order. Wait until we receive the previous
+ // messages to process.
+ break;
+ }
+ }
+ }
+
+ void _acceptReconnection(Sink sink) {
+ _keepAliveTimer?.cancel();
+ _sink = sink;
+ }
+
+ void _handleDisconnect() {
+ if (_keepAlive == null) {
+ // Close immediately if we're not keeping alive.
+ _close();
+ } else if (!isInKeepAlivePeriod && !_closedCompleter.isCompleted) {
+ // Otherwise if we didn't already have an active timer and we've not
+ // already been completely closed, set a timer to close after the timeout
+ // period.
+ // If the connection comes back, this will be cancelled and all messages
+ // left in the queue tried again.
+ _keepAliveTimer = Timer(_keepAlive, _close);
+ }
+ }
+
+ void _close() {
+ if (!_closedCompleter.isCompleted) {
+ _closedCompleter.complete();
+ // Cancel any existing timer in case we were told to explicitly shut down
+ // to avoid keeping the process alive.
+ _keepAliveTimer?.cancel();
+ _sink.close();
+ if (!_outgoingController.isClosed) {
+ _outgoingStreamQueue.cancel(immediate: true);
+ _outgoingController.close();
+ }
+ if (!_incomingController.isClosed) _incomingController.close();
+ }
+ }
+
+ /// Immediately close the connection, ignoring any keepAlive period.
+ void shutdown() {
+ _close();
+ }
+}
+
+/// [SseHandler] handles requests on a user-defined path to create
+/// two-way communication of JSON-encodable data between the server and
+/// clients.
+///
+/// The server sends messages to a client through an SSE channel, while
+/// a client sends messages to the server through HTTP POST requests.
+class SseHandler {
+ final _logger = Logger('SseHandler');
+ final Uri _uri;
+ final Duration? _keepAlive;
+ final _connections = <String?, SseConnection>{};
+ final _connectionController = StreamController<SseConnection>();
+
+ StreamQueue<SseConnection>? _connectionsStream;
+
+ /// [_uri] is the URL under which the server is listening for
+ /// incoming bi-directional SSE connections.
+ ///
+ /// If [keepAlive] is supplied, connections will remain active for this
+ /// period after a disconnect and can be reconnected transparently. If there
+ /// is no reconnect within that period, the connection will be closed
+ /// normally.
+ ///
+ /// If [keepAlive] is not supplied, connections will be closed immediately
+ /// after a disconnect.
+ SseHandler(this._uri, {Duration? keepAlive}) : _keepAlive = keepAlive;
+
+ StreamQueue<SseConnection> get connections =>
+ _connectionsStream ??= StreamQueue(_connectionController.stream);
+
+ shelf.Handler get handler => _handle;
+
+ int get numberOfClients => _connections.length;
+
+ shelf.Response _createSseConnection(shelf.Request req, String path) {
+ req.hijack((channel) async {
+ var sink = utf8.encoder.startChunkedConversion(channel.sink);
+ sink.add(_sseHeaders(req.headers['origin']));
+ var clientId = req.url.queryParameters['sseClientId'];
+
+      // Check whether we already have a connection for this ID that is in
+      // the process of timing out, in which case we can reconnect it
+      // transparently.
+ if (_connections[clientId] != null &&
+ _connections[clientId]!.isInKeepAlivePeriod) {
+ _connections[clientId]!._acceptReconnection(sink);
+ } else {
+ var connection = SseConnection(sink, keepAlive: _keepAlive);
+ _connections[clientId] = connection;
+ unawaited(connection._closedCompleter.future.then((_) {
+ _connections.remove(clientId);
+ }));
+ _connectionController.add(connection);
+ }
+ // Remove connection when it is remotely closed or the stream is
+ // cancelled.
+ channel.stream.listen((_) {
+ // SSE is unidirectional. Responses are handled through POST requests.
+ }, onDone: () {
+ _connections[clientId]?._handleDisconnect();
+ });
+ });
+ }
+
+ String _getOriginalPath(shelf.Request req) => req.requestedUri.path;
+
+ Future<shelf.Response> _handle(shelf.Request req) async {
+ var path = _getOriginalPath(req);
+ if (_uri.path != path) {
+ return shelf.Response.notFound('');
+ }
+
+ if (req.headers['accept'] == 'text/event-stream' && req.method == 'GET') {
+ return _createSseConnection(req, path);
+ }
+
+ if (req.headers['accept'] != 'text/event-stream' && req.method == 'POST') {
+ return _handleIncomingMessage(req, path);
+ }
+
+ return shelf.Response.notFound('');
+ }
+
+ Future<shelf.Response> _handleIncomingMessage(
+ shelf.Request req, String path) async {
+ String? clientId;
+ try {
+ clientId = req.url.queryParameters['sseClientId'];
+ var messageId = int.parse(req.url.queryParameters['messageId'] ?? '0');
+ var message = await req.readAsString();
+ var jsonObject = json.decode(message) as String;
+ _connections[clientId]?._addIncomingMessage(messageId, jsonObject);
+ } catch (e, st) {
+ _logger.fine('[$clientId] Failed to handle incoming message. $e $st');
+ }
+ return shelf.Response.ok('', headers: {
+ 'access-control-allow-credentials': 'true',
+ 'access-control-allow-origin': _originFor(req),
+ });
+ }
+
+ String _originFor(shelf.Request req) =>
+ // Firefox does not set header "origin".
+ // https://bugzilla.mozilla.org/show_bug.cgi?id=1508661
+ req.headers['origin'] ?? req.headers['host']!;
+
+ /// Immediately close all connections, ignoring any keepAlive periods.
+ void shutdown() {
+ for (final connection in _connections.values) {
+ connection.shutdown();
+ }
+ }
+}
+
+void closeSink(SseConnection connection) => connection._sink.close();
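The `keepAlive` parameter documented on `SseConnection` and `SseHandler` above lets a dropped connection be resumed transparently within a grace period, with outgoing messages buffered in the meantime. A hedged configuration sketch (the path, port, and 30-second duration are illustrative):
```
import 'package:shelf/shelf_io.dart' as io;
import 'package:sse/server/sse_handler.dart';

Future<void> main() async {
  // Keep a dropped connection around for 30 seconds so a briefly
  // disconnected client can resume on the same SseConnection.
  final handler = SseHandler(
    Uri.parse('/sseHandler'),
    keepAlive: const Duration(seconds: 30),
  );
  await io.serve(handler.handler, 'localhost', 8080);
  final connection = await handler.connections.next;
  // Messages added during a keep-alive window are buffered and delivered
  // once the client reconnects.
  connection.sink.add('hello');
  connection.stream.listen(print);
}
```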
diff --git a/pkgs/sse/lib/src/util/uuid.dart b/pkgs/sse/lib/src/util/uuid.dart
new file mode 100644
index 0000000..a1aa398
--- /dev/null
+++ b/pkgs/sse/lib/src/util/uuid.dart
@@ -0,0 +1,32 @@
+// Copyright (c) 2020, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:math' show Random;
+
+/// Returns a unique ID in the format:
+///
+/// f47ac10b-58cc-4372-a567-0e02b2c3d479
+///
+/// The generated UUIDs are 128-bit numbers encoded in a specific string format.
+/// For more information, see
+/// [en.wikipedia.org/wiki/Universally_unique_identifier](http://en.wikipedia.org/wiki/Universally_unique_identifier).
+String generateUuidV4() {
+ final random = Random();
+
+ int generateBits(int bitCount) => random.nextInt(1 << bitCount);
+
+ String printDigits(int value, int count) =>
+ value.toRadixString(16).padLeft(count, '0');
+ String bitsDigits(int bitCount, int digitCount) =>
+ printDigits(generateBits(bitCount), digitCount);
+
+ // Generate xxxxxxxx-xxxx-4xxx-yxxx-xxxxxxxxxxxx / 8-4-4-4-12.
+ var special = 8 + random.nextInt(4);
+
+ return '${bitsDigits(16, 4)}${bitsDigits(16, 4)}-'
+ '${bitsDigits(16, 4)}-'
+ '4${bitsDigits(12, 3)}-'
+ '${printDigits(special, 1)}${bitsDigits(12, 3)}-'
+ '${bitsDigits(16, 4)}${bitsDigits(16, 4)}${bitsDigits(16, 4)}';
+}
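As a quick sanity check of the 8-4-4-4-12 layout described in the doc comment above (`generateUuidV4` lives in an internal `src/` library, so the import and the regex here are for illustration only, not part of the package's public API):
```
import 'package:sse/src/util/uuid.dart';

void main() {
  final id = generateUuidV4();
  // Version-4 layout: 8-4-4-4-12 lowercase hex digits, with '4' leading the
  // third group and one of 8/9/a/b leading the fourth.
  final v4 = RegExp(
      r'^[0-9a-f]{8}-[0-9a-f]{4}-4[0-9a-f]{3}-'
      r'[89ab][0-9a-f]{3}-[0-9a-f]{12}$');
  print('$id matches v4 layout: ${v4.hasMatch(id)}'); // expected: true
}
```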
diff --git a/pkgs/sse/pubspec.yaml b/pkgs/sse/pubspec.yaml
new file mode 100644
index 0000000..bd70f74
--- /dev/null
+++ b/pkgs/sse/pubspec.yaml
@@ -0,0 +1,25 @@
+name: sse
+version: 4.1.7
+description: >-
+ Provides client and server functionality for setting up bi-directional
+ communication through Server Sent Events (SSE) and corresponding POST
+ requests.
+repository: https://github.com/dart-lang/tools/tree/main/pkgs/sse
+
+environment:
+ sdk: ^3.3.0
+
+dependencies:
+ async: ^2.0.8
+ collection: ^1.0.0
+ logging: ^1.0.0
+ pool: ^1.5.0
+ shelf: ^1.1.0
+ stream_channel: ^2.0.0
+ web: '>=0.5.0 <2.0.0'
+
+dev_dependencies:
+ dart_flutter_team_lints: ^3.0.0
+ shelf_static: ^1.0.0
+ test: ^1.16.6
+ webdriver: ^3.0.0
diff --git a/pkgs/sse/test/sse_test.dart b/pkgs/sse/test/sse_test.dart
new file mode 100644
index 0000000..0455baa
--- /dev/null
+++ b/pkgs/sse/test/sse_test.dart
@@ -0,0 +1,270 @@
+// Copyright (c) 2019, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+@TestOn('vm')
+library;
+
+import 'dart:async';
+import 'dart:io';
+
+import 'package:async/async.dart';
+import 'package:shelf/shelf.dart' as shelf;
+import 'package:shelf/shelf_io.dart' as io;
+import 'package:shelf_static/shelf_static.dart';
+import 'package:sse/server/sse_handler.dart';
+import 'package:sse/src/server/sse_handler.dart' show closeSink;
+import 'package:test/test.dart';
+import 'package:webdriver/async_io.dart';
+
+void main() {
+ late HttpServer server;
+ late WebDriver webdriver;
+ late SseHandler handler;
+ late Process chromeDriver;
+
+ setUpAll(() async {
+ try {
+ chromeDriver = await Process.start(
+ 'chromedriver', ['--port=4444', '--url-base=wd/hub']);
+ } catch (e) {
+ throw StateError(
+ 'Could not start ChromeDriver. Is it installed?\nError: $e');
+ }
+ });
+
+ tearDownAll(() {
+ chromeDriver.kill();
+ });
+
+ group('SSE', () {
+ setUp(() async {
+ handler = SseHandler(Uri.parse('/test'));
+
+ var cascade = shelf.Cascade()
+ .add(handler.handler)
+ .add(_faviconHandler)
+ .add(createStaticHandler('test/web',
+ listDirectories: true, defaultDocument: 'index.html'));
+
+ server = await io.serve(cascade.handler, 'localhost', 0);
+ var capabilities = Capabilities.chrome
+ ..addAll({
+ Capabilities.chromeOptions: {
+ 'args': ['--headless']
+ }
+ });
+ webdriver = await createDriver(desired: capabilities);
+ });
+
+ tearDown(() async {
+ await webdriver.quit();
+ await server.close();
+ });
+
+ test('can round trip messages', () async {
+ await webdriver.get('http://localhost:${server.port}');
+ var connection = await handler.connections.next;
+ connection.sink.add('blah');
+ expect(await connection.stream.first, 'blah');
+ });
+
+ test('can send a significant number of requests', () async {
+ await webdriver.get('http://localhost:${server.port}');
+ var connection = await handler.connections.next;
+ var limit = 7000;
+ for (var i = 0; i < limit; i++) {
+ connection.sink.add('$i');
+ }
+ await connection.stream.take(limit).drain<void>();
+ });
+
+ test('messages arrive in-order', () async {
+ expect(handler.numberOfClients, 0);
+ await webdriver.get('http://localhost:${server.port}');
+ var connection = await handler.connections.next;
+ expect(handler.numberOfClients, 1);
+
+ var expected = <String>[];
+ var count = 100;
+ for (var i = 0; i < count; i++) {
+ expected.add(i.toString());
+ }
+ connection.sink.add('send $count');
+
+ expect(await connection.stream.take(count).toList(), equals(expected));
+ });
+
+ test('multiple clients can connect', () async {
+ var connections = handler.connections;
+ await webdriver.get('http://localhost:${server.port}');
+ await connections.next;
+ await webdriver.get('http://localhost:${server.port}');
+ await connections.next;
+ });
+
+ test('routes data correctly', () async {
+ var connections = handler.connections;
+ await webdriver.get('http://localhost:${server.port}');
+ var connectionA = await connections.next;
+ connectionA.sink.add('foo');
+ expect(await connectionA.stream.first, 'foo');
+
+ await webdriver.get('http://localhost:${server.port}');
+ var connectionB = await connections.next;
+ connectionB.sink.add('bar');
+ expect(await connectionB.stream.first, 'bar');
+ });
+
+ test('can close from the server', () async {
+ expect(handler.numberOfClients, 0);
+ await webdriver.get('http://localhost:${server.port}');
+ var connection = await handler.connections.next;
+ expect(handler.numberOfClients, 1);
+ await connection.sink.close();
+ await pumpEventQueue();
+ expect(handler.numberOfClients, 0);
+ });
+
+ test('client reconnects after being disconnected', () async {
+ expect(handler.numberOfClients, 0);
+ await webdriver.get('http://localhost:${server.port}');
+ var connection = await handler.connections.next;
+ expect(handler.numberOfClients, 1);
+ await connection.sink.close();
+ await pumpEventQueue();
+ expect(handler.numberOfClients, 0);
+
+ // Ensure the client reconnects
+ await handler.connections.next;
+ });
+
+ test('can close from the client-side', () async {
+ expect(handler.numberOfClients, 0);
+ await webdriver.get('http://localhost:${server.port}');
+ var connection = await handler.connections.next;
+ expect(handler.numberOfClients, 1);
+
+ var closeButton = await webdriver.findElement(const By.tagName('button'));
+ await closeButton.click();
+
+ // Should complete since the connection is closed.
+ await connection.stream.drain<void>();
+ expect(handler.numberOfClients, 0);
+ });
+
+ test('cancelling the listener closes the connection', () async {
+ expect(handler.numberOfClients, 0);
+ await webdriver.get('http://localhost:${server.port}');
+ var connection = await handler.connections.next;
+ expect(handler.numberOfClients, 1);
+
+ var sub = connection.stream.listen((_) {});
+ await sub.cancel();
+ await pumpEventQueue();
+ expect(handler.numberOfClients, 0);
+ });
+
+ test('disconnects when navigating away', () async {
+ await webdriver.get('http://localhost:${server.port}');
+ expect(handler.numberOfClients, 1);
+
+ await webdriver.get('chrome://version/');
+ expect(handler.numberOfClients, 0);
+ });
+ });
+
+ group('SSE with server keep-alive', () {
+ setUp(() async {
+ handler =
+ SseHandler(Uri.parse('/test'), keepAlive: const Duration(seconds: 5));
+
+ var cascade = shelf.Cascade()
+ .add(handler.handler)
+ .add(_faviconHandler)
+ .add(createStaticHandler('test/web',
+ listDirectories: true, defaultDocument: 'index.html'));
+
+ server = await io.serve(cascade.handler, 'localhost', 0);
+ var capabilities = Capabilities.chrome
+ ..addAll({
+ Capabilities.chromeOptions: {
+ 'args': ['--headless']
+ }
+ });
+ webdriver = await createDriver(desired: capabilities);
+ });
+
+ tearDown(() async {
+ await webdriver.quit();
+ await server.close();
+ });
+
+    test('client reconnects use the same connection', () async {
+ expect(handler.numberOfClients, 0);
+ await webdriver.get('http://localhost:${server.port}');
+ var connection = await handler.connections.next;
+ expect(handler.numberOfClients, 1);
+
+ // Close the underlying connection.
+ closeSink(connection);
+ // Ensure we can still round-trip data on the original connection and that
+ // the connection is no longer marked keep-alive once it's reconnected.
+ connection.sink.add('bar');
+ var queue = StreamQueue(connection.stream);
+ expect(await queue.next, 'bar');
+
+ // Now check that we can reconnect multiple times.
+ closeSink(connection);
+ connection.sink.add('bar');
+ expect(await queue.next, 'bar');
+ expect(handler.numberOfClients, 1);
+ });
+
+ test('messages sent during disconnect arrive in-order', () async {
+ expect(handler.numberOfClients, 0);
+ await webdriver.get('http://localhost:${server.port}');
+ var connection = await handler.connections.next;
+ expect(handler.numberOfClients, 1);
+
+ // Close the underlying connection.
+ closeSink(connection);
+ connection.sink.add('one');
+ connection.sink.add('two');
+ await pumpEventQueue();
+
+ // Ensure there's still a connection.
+ expect(handler.numberOfClients, 1);
+
+ // Ensure messages arrive in the same order
+ expect(await connection.stream.take(2).toList(), equals(['one', 'two']));
+ });
+
+ test('explicit shutdown does not wait for keepAlive', () async {
+ expect(handler.numberOfClients, 0);
+ await webdriver.get('http://localhost:${server.port}');
+ await handler.connections.next;
+ expect(handler.numberOfClients, 1);
+
+ // Close the underlying connection.
+ handler.shutdown();
+
+      // Wait for a short period to allow the connection to close, but not
+      // long enough that the keep-alive period may have expired.
+ var maxPumps = 50;
+ while (handler.numberOfClients > 0 && maxPumps-- > 0) {
+ await pumpEventQueue(times: 1);
+ }
+
+      // Ensure there are no connected clients.
+ expect(handler.numberOfClients, 0);
+ });
+ }, timeout: const Timeout(Duration(seconds: 120)));
+}
+
+FutureOr<shelf.Response> _faviconHandler(shelf.Request request) {
+ if (request.url.path.endsWith('favicon.ico')) {
+ return shelf.Response.ok('');
+ }
+ return shelf.Response.notFound('');
+}
diff --git a/pkgs/sse/test/web/index.dart b/pkgs/sse/test/web/index.dart
new file mode 100644
index 0000000..c4d78cd
--- /dev/null
+++ b/pkgs/sse/test/web/index.dart
@@ -0,0 +1,25 @@
+// Copyright (c) 2019, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:sse/client/sse_client.dart';
+import 'package:web/web.dart';
+
+void main() {
+ var channel = SseClient('/test');
+
+ document.querySelector('button')!.onClick.listen((_) {
+ channel.sink.close();
+ });
+
+ channel.stream.listen((s) {
+ if (s.startsWith('send ')) {
+ var count = int.parse(s.split(' ').last);
+ for (var i = 0; i < count; i++) {
+ channel.sink.add('$i');
+ }
+ } else {
+ channel.sink.add(s);
+ }
+ });
+}
diff --git a/pkgs/sse/test/web/index.dart.js b/pkgs/sse/test/web/index.dart.js
new file mode 100644
index 0000000..e1b37b9
--- /dev/null
+++ b/pkgs/sse/test/web/index.dart.js
@@ -0,0 +1,8851 @@
+// Generated by dart2js (NullSafetyMode.sound, csp, intern-composite-values), the Dart to JavaScript compiler version: 3.4.0-157.0.dev.
+// The code supports the following hooks:
+// dartPrint(message):
+// if this function is defined it is called instead of the Dart [print]
+// method.
+//
+// dartMainRunner(main, args):
+// if this function is defined, the Dart [main] method will not be invoked
+// directly. Instead, a closure that will invoke [main], and its arguments
+// [args] is passed to [dartMainRunner].
+//
+// dartDeferredLibraryLoader(uri, successCallback, errorCallback, loadId, loadPriority):
+// if this function is defined, it will be called when a deferred library
+// is loaded. It should load and eval the javascript of `uri`, and call
+// successCallback. If it fails to do so, it should call errorCallback with
+// an error. The loadId argument is the deferred import that resulted in
+// this uri being loaded. The loadPriority argument is the priority the
+// library should be loaded with as specified in the code via the
+// load-priority annotation (0: normal, 1: high).
+// dartDeferredLibraryMultiLoader(uris, successCallback, errorCallback, loadId, loadPriority):
+// if this function is defined, it will be called when a deferred library
+// is loaded. It should load and eval the javascript of every URI in `uris`,
+// and call successCallback. If it fails to do so, it should call
+// errorCallback with an error. The loadId argument is the deferred import
+// that resulted in this uri being loaded. The loadPriority argument is the
+// priority the library should be loaded with as specified in the code via
+// the load-priority annotation (0: normal, 1: high).
+//
+// dartCallInstrumentation(id, qualifiedName):
+// if this function is defined, it will be called at each entry of a
+// method or constructor. Used only when compiling programs with
+// --experiment-call-instrumentation.
+(function dartProgram() {
+ function copyProperties(from, to) {
+ var keys = Object.keys(from);
+ for (var i = 0; i < keys.length; i++) {
+ var key = keys[i];
+ to[key] = from[key];
+ }
+ }
+ function mixinPropertiesHard(from, to) {
+ var keys = Object.keys(from);
+ for (var i = 0; i < keys.length; i++) {
+ var key = keys[i];
+ if (!to.hasOwnProperty(key)) {
+ to[key] = from[key];
+ }
+ }
+ }
+ function mixinPropertiesEasy(from, to) {
+ Object.assign(to, from);
+ }
+ var supportsDirectProtoAccess = function() {
+ var cls = function() {
+ };
+ cls.prototype = {p: {}};
+ var object = new cls();
+ if (!(Object.getPrototypeOf(object) && Object.getPrototypeOf(object).p === cls.prototype.p))
+ return false;
+ try {
+ if (typeof navigator != "undefined" && typeof navigator.userAgent == "string" && navigator.userAgent.indexOf("Chrome/") >= 0)
+ return true;
+ if (typeof version == "function" && version.length == 0) {
+ var v = version();
+ if (/^\d+\.\d+\.\d+\.\d+$/.test(v))
+ return true;
+ }
+ } catch (_) {
+ }
+ return false;
+ }();
+ function inherit(cls, sup) {
+ cls.prototype.constructor = cls;
+ cls.prototype["$is" + cls.name] = cls;
+ if (sup != null) {
+ if (supportsDirectProtoAccess) {
+ Object.setPrototypeOf(cls.prototype, sup.prototype);
+ return;
+ }
+ var clsPrototype = Object.create(sup.prototype);
+ copyProperties(cls.prototype, clsPrototype);
+ cls.prototype = clsPrototype;
+ }
+ }
+ function inheritMany(sup, classes) {
+ for (var i = 0; i < classes.length; i++) {
+ inherit(classes[i], sup);
+ }
+ }
+ function mixinEasy(cls, mixin) {
+ mixinPropertiesEasy(mixin.prototype, cls.prototype);
+ cls.prototype.constructor = cls;
+ }
+ function mixinHard(cls, mixin) {
+ mixinPropertiesHard(mixin.prototype, cls.prototype);
+ cls.prototype.constructor = cls;
+ }
+ function lazy(holder, name, getterName, initializer) {
+ var uninitializedSentinel = holder;
+ holder[name] = uninitializedSentinel;
+ holder[getterName] = function() {
+ if (holder[name] === uninitializedSentinel) {
+ holder[name] = initializer();
+ }
+ holder[getterName] = function() {
+ return this[name];
+ };
+ return holder[name];
+ };
+ }
+ function lazyFinal(holder, name, getterName, initializer) {
+ var uninitializedSentinel = holder;
+ holder[name] = uninitializedSentinel;
+ holder[getterName] = function() {
+ if (holder[name] === uninitializedSentinel) {
+ var value = initializer();
+ if (holder[name] !== uninitializedSentinel) {
+ A.throwLateFieldADI(name);
+ }
+ holder[name] = value;
+ }
+ var finalValue = holder[name];
+ holder[getterName] = function() {
+ return finalValue;
+ };
+ return finalValue;
+ };
+ }
+ function makeConstList(list) {
+ list.immutable$list = Array;
+ list.fixed$length = Array;
+ return list;
+ }
+ function convertToFastObject(properties) {
+ function t() {
+ }
+ t.prototype = properties;
+ new t();
+ return properties;
+ }
+ function convertAllToFastObject(arrayOfObjects) {
+ for (var i = 0; i < arrayOfObjects.length; ++i) {
+ convertToFastObject(arrayOfObjects[i]);
+ }
+ }
+ var functionCounter = 0;
+ function instanceTearOffGetter(isIntercepted, parameters) {
+ var cache = null;
+ return isIntercepted ? function(receiver) {
+ if (cache === null)
+ cache = A.closureFromTearOff(parameters);
+ return new cache(receiver, this);
+ } : function() {
+ if (cache === null)
+ cache = A.closureFromTearOff(parameters);
+ return new cache(this, null);
+ };
+ }
+ function staticTearOffGetter(parameters) {
+ var cache = null;
+ return function() {
+ if (cache === null)
+ cache = A.closureFromTearOff(parameters).prototype;
+ return cache;
+ };
+ }
+ var typesOffset = 0;
+ function tearOffParameters(container, isStatic, isIntercepted, requiredParameterCount, optionalParameterDefaultValues, callNames, funsOrNames, funType, applyIndex, needsDirectAccess) {
+ if (typeof funType == "number") {
+ funType += typesOffset;
+ }
+ return {co: container, iS: isStatic, iI: isIntercepted, rC: requiredParameterCount, dV: optionalParameterDefaultValues, cs: callNames, fs: funsOrNames, fT: funType, aI: applyIndex || 0, nDA: needsDirectAccess};
+ }
+ function installStaticTearOff(holder, getterName, requiredParameterCount, optionalParameterDefaultValues, callNames, funsOrNames, funType, applyIndex) {
+ var parameters = tearOffParameters(holder, true, false, requiredParameterCount, optionalParameterDefaultValues, callNames, funsOrNames, funType, applyIndex, false);
+ var getterFunction = staticTearOffGetter(parameters);
+ holder[getterName] = getterFunction;
+ }
+ function installInstanceTearOff(prototype, getterName, isIntercepted, requiredParameterCount, optionalParameterDefaultValues, callNames, funsOrNames, funType, applyIndex, needsDirectAccess) {
+ isIntercepted = !!isIntercepted;
+ var parameters = tearOffParameters(prototype, false, isIntercepted, requiredParameterCount, optionalParameterDefaultValues, callNames, funsOrNames, funType, applyIndex, !!needsDirectAccess);
+ var getterFunction = instanceTearOffGetter(isIntercepted, parameters);
+ prototype[getterName] = getterFunction;
+ }
+ function setOrUpdateInterceptorsByTag(newTags) {
+ var tags = init.interceptorsByTag;
+ if (!tags) {
+ init.interceptorsByTag = newTags;
+ return;
+ }
+ copyProperties(newTags, tags);
+ }
+ function setOrUpdateLeafTags(newTags) {
+ var tags = init.leafTags;
+ if (!tags) {
+ init.leafTags = newTags;
+ return;
+ }
+ copyProperties(newTags, tags);
+ }
+ function updateTypes(newTypes) {
+ var types = init.types;
+ var length = types.length;
+ types.push.apply(types, newTypes);
+ return length;
+ }
+ function updateHolder(holder, newHolder) {
+ copyProperties(newHolder, holder);
+ return holder;
+ }
+ var hunkHelpers = function() {
+ var mkInstance = function(isIntercepted, requiredParameterCount, optionalParameterDefaultValues, callNames, applyIndex) {
+ return function(container, getterName, name, funType) {
+ return installInstanceTearOff(container, getterName, isIntercepted, requiredParameterCount, optionalParameterDefaultValues, callNames, [name], funType, applyIndex, false);
+ };
+ },
+ mkStatic = function(requiredParameterCount, optionalParameterDefaultValues, callNames, applyIndex) {
+ return function(container, getterName, name, funType) {
+ return installStaticTearOff(container, getterName, requiredParameterCount, optionalParameterDefaultValues, callNames, [name], funType, applyIndex);
+ };
+ };
+ return {inherit: inherit, inheritMany: inheritMany, mixin: mixinEasy, mixinHard: mixinHard, installStaticTearOff: installStaticTearOff, installInstanceTearOff: installInstanceTearOff, _instance_0u: mkInstance(0, 0, null, ["call$0"], 0), _instance_1u: mkInstance(0, 1, null, ["call$1"], 0), _instance_2u: mkInstance(0, 2, null, ["call$2"], 0), _instance_0i: mkInstance(1, 0, null, ["call$0"], 0), _instance_1i: mkInstance(1, 1, null, ["call$1"], 0), _instance_2i: mkInstance(1, 2, null, ["call$2"], 0), _static_0: mkStatic(0, null, ["call$0"], 0), _static_1: mkStatic(1, null, ["call$1"], 0), _static_2: mkStatic(2, null, ["call$2"], 0), makeConstList: makeConstList, lazy: lazy, lazyFinal: lazyFinal, updateHolder: updateHolder, convertToFastObject: convertToFastObject, updateTypes: updateTypes, setOrUpdateInterceptorsByTag: setOrUpdateInterceptorsByTag, setOrUpdateLeafTags: setOrUpdateLeafTags};
+ }();
+ function initializeDeferredHunk(hunk) {
+ typesOffset = init.types.length;
+ hunk(hunkHelpers, init, holders, $);
+ }
+ var J = {
+ makeDispatchRecord(interceptor, proto, extension, indexability) {
+ return {i: interceptor, p: proto, e: extension, x: indexability};
+ },
+ getNativeInterceptor(object) {
+ var proto, objectProto, $constructor, interceptor, t1,
+ record = object[init.dispatchPropertyName];
+ if (record == null)
+ if ($.initNativeDispatchFlag == null) {
+ A.initNativeDispatch();
+ record = object[init.dispatchPropertyName];
+ }
+ if (record != null) {
+ proto = record.p;
+ if (false === proto)
+ return record.i;
+ if (true === proto)
+ return object;
+ objectProto = Object.getPrototypeOf(object);
+ if (proto === objectProto)
+ return record.i;
+ if (record.e === objectProto)
+ throw A.wrapException(A.UnimplementedError$("Return interceptor for " + A.S(proto(object, record))));
+ }
+ $constructor = object.constructor;
+ if ($constructor == null)
+ interceptor = null;
+ else {
+ t1 = $._JS_INTEROP_INTERCEPTOR_TAG;
+ if (t1 == null)
+ t1 = $._JS_INTEROP_INTERCEPTOR_TAG = init.getIsolateTag("_$dart_js");
+ interceptor = $constructor[t1];
+ }
+ if (interceptor != null)
+ return interceptor;
+ interceptor = A.lookupAndCacheInterceptor(object);
+ if (interceptor != null)
+ return interceptor;
+ if (typeof object == "function")
+ return B.JavaScriptFunction_methods;
+ proto = Object.getPrototypeOf(object);
+ if (proto == null)
+ return B.PlainJavaScriptObject_methods;
+ if (proto === Object.prototype)
+ return B.PlainJavaScriptObject_methods;
+ if (typeof $constructor == "function") {
+ t1 = $._JS_INTEROP_INTERCEPTOR_TAG;
+ if (t1 == null)
+ t1 = $._JS_INTEROP_INTERCEPTOR_TAG = init.getIsolateTag("_$dart_js");
+ Object.defineProperty($constructor, t1, {value: B.UnknownJavaScriptObject_methods, enumerable: false, writable: true, configurable: true});
+ return B.UnknownJavaScriptObject_methods;
+ }
+ return B.UnknownJavaScriptObject_methods;
+ },
+ JSArray_JSArray$fixed($length, $E) {
+ if ($length < 0 || $length > 4294967295)
+ throw A.wrapException(A.RangeError$range($length, 0, 4294967295, "length", null));
+ return J.JSArray_JSArray$markFixed(new Array($length), $E);
+ },
+ JSArray_JSArray$growable($length, $E) {
+ if ($length < 0)
+ throw A.wrapException(A.ArgumentError$("Length must be a non-negative integer: " + $length, null));
+ return A._setArrayType(new Array($length), $E._eval$1("JSArray<0>"));
+ },
+ JSArray_JSArray$markFixed(allocation, $E) {
+ return J.JSArray_markFixedList(A._setArrayType(allocation, $E._eval$1("JSArray<0>")), $E);
+ },
+ JSArray_markFixedList(list, $T) {
+ list.fixed$length = Array;
+ return list;
+ },
+ JSArray_markUnmodifiableList(list) {
+ list.fixed$length = Array;
+ list.immutable$list = Array;
+ return list;
+ },
+ getInterceptor$(receiver) {
+ if (typeof receiver == "number") {
+ if (Math.floor(receiver) == receiver)
+ return J.JSInt.prototype;
+ return J.JSNumNotInt.prototype;
+ }
+ if (typeof receiver == "string")
+ return J.JSString.prototype;
+ if (receiver == null)
+ return J.JSNull.prototype;
+ if (typeof receiver == "boolean")
+ return J.JSBool.prototype;
+ if (Array.isArray(receiver))
+ return J.JSArray.prototype;
+ if (typeof receiver != "object") {
+ if (typeof receiver == "function")
+ return J.JavaScriptFunction.prototype;
+ if (typeof receiver == "symbol")
+ return J.JavaScriptSymbol.prototype;
+ if (typeof receiver == "bigint")
+ return J.JavaScriptBigInt.prototype;
+ return receiver;
+ }
+ if (receiver instanceof A.Object)
+ return receiver;
+ return J.getNativeInterceptor(receiver);
+ },
+ getInterceptor$asx(receiver) {
+ if (typeof receiver == "string")
+ return J.JSString.prototype;
+ if (receiver == null)
+ return receiver;
+ if (Array.isArray(receiver))
+ return J.JSArray.prototype;
+ if (typeof receiver != "object") {
+ if (typeof receiver == "function")
+ return J.JavaScriptFunction.prototype;
+ if (typeof receiver == "symbol")
+ return J.JavaScriptSymbol.prototype;
+ if (typeof receiver == "bigint")
+ return J.JavaScriptBigInt.prototype;
+ return receiver;
+ }
+ if (receiver instanceof A.Object)
+ return receiver;
+ return J.getNativeInterceptor(receiver);
+ },
+ getInterceptor$ax(receiver) {
+ if (receiver == null)
+ return receiver;
+ if (Array.isArray(receiver))
+ return J.JSArray.prototype;
+ if (typeof receiver != "object") {
+ if (typeof receiver == "function")
+ return J.JavaScriptFunction.prototype;
+ if (typeof receiver == "symbol")
+ return J.JavaScriptSymbol.prototype;
+ if (typeof receiver == "bigint")
+ return J.JavaScriptBigInt.prototype;
+ return receiver;
+ }
+ if (receiver instanceof A.Object)
+ return receiver;
+ return J.getNativeInterceptor(receiver);
+ },
+ getInterceptor$s(receiver) {
+ if (typeof receiver == "string")
+ return J.JSString.prototype;
+ if (receiver == null)
+ return receiver;
+ if (!(receiver instanceof A.Object))
+ return J.UnknownJavaScriptObject.prototype;
+ return receiver;
+ },
+ get$hashCode$(receiver) {
+ return J.getInterceptor$(receiver).get$hashCode(receiver);
+ },
+ get$iterator$ax(receiver) {
+ return J.getInterceptor$ax(receiver).get$iterator(receiver);
+ },
+ get$length$asx(receiver) {
+ return J.getInterceptor$asx(receiver).get$length(receiver);
+ },
+ get$runtimeType$(receiver) {
+ return J.getInterceptor$(receiver).get$runtimeType(receiver);
+ },
+ $eq$(receiver, a0) {
+ if (receiver == null)
+ return a0 == null;
+ if (typeof receiver != "object")
+ return a0 != null && receiver === a0;
+ return J.getInterceptor$(receiver).$eq(receiver, a0);
+ },
+ matchAsPrefix$2$s(receiver, a0, a1) {
+ return J.getInterceptor$s(receiver).matchAsPrefix$2(receiver, a0, a1);
+ },
+ noSuchMethod$1$(receiver, a0) {
+ return J.getInterceptor$(receiver).noSuchMethod$1(receiver, a0);
+ },
+ toString$0$(receiver) {
+ return J.getInterceptor$(receiver).toString$0(receiver);
+ },
+ Interceptor: function Interceptor() {
+ },
+ JSBool: function JSBool() {
+ },
+ JSNull: function JSNull() {
+ },
+ JavaScriptObject: function JavaScriptObject() {
+ },
+ LegacyJavaScriptObject: function LegacyJavaScriptObject() {
+ },
+ PlainJavaScriptObject: function PlainJavaScriptObject() {
+ },
+ UnknownJavaScriptObject: function UnknownJavaScriptObject() {
+ },
+ JavaScriptFunction: function JavaScriptFunction() {
+ },
+ JavaScriptBigInt: function JavaScriptBigInt() {
+ },
+ JavaScriptSymbol: function JavaScriptSymbol() {
+ },
+ JSArray: function JSArray(t0) {
+ this.$ti = t0;
+ },
+ JSUnmodifiableArray: function JSUnmodifiableArray(t0) {
+ this.$ti = t0;
+ },
+ ArrayIterator: function ArrayIterator(t0, t1, t2) {
+ var _ = this;
+ _._iterable = t0;
+ _._length = t1;
+ _._index = 0;
+ _._current = null;
+ _.$ti = t2;
+ },
+ JSNumber: function JSNumber() {
+ },
+ JSInt: function JSInt() {
+ },
+ JSNumNotInt: function JSNumNotInt() {
+ },
+ JSString: function JSString() {
+ }
+ },
+ A = {JS_CONST: function JS_CONST() {
+ },
+ checkNotNullable(value, $name, $T) {
+ return value;
+ },
+ isToStringVisiting(object) {
+ var t1, i;
+ for (t1 = $.toStringVisiting.length, i = 0; i < t1; ++i)
+ if (object === $.toStringVisiting[i])
+ return true;
+ return false;
+ },
+ IterableElementError_noElement() {
+ return new A.StateError("No element");
+ },
+ IterableElementError_tooFew() {
+ return new A.StateError("Too few elements");
+ },
+ LateError: function LateError(t0) {
+ this._message = t0;
+ },
+ nullFuture_closure: function nullFuture_closure() {
+ },
+ EfficientLengthIterable: function EfficientLengthIterable() {
+ },
+ ListIterable: function ListIterable() {
+ },
+ ListIterator: function ListIterator(t0, t1, t2) {
+ var _ = this;
+ _.__internal$_iterable = t0;
+ _.__internal$_length = t1;
+ _.__internal$_index = 0;
+ _.__internal$_current = null;
+ _.$ti = t2;
+ },
+ FixedLengthListMixin: function FixedLengthListMixin() {
+ },
+ Symbol: function Symbol(t0) {
+ this._name = t0;
+ },
+ unminifyOrTag(rawClassName) {
+ var preserved = init.mangledGlobalNames[rawClassName];
+ if (preserved != null)
+ return preserved;
+ return rawClassName;
+ },
+ isJsIndexable(object, record) {
+ var result;
+ if (record != null) {
+ result = record.x;
+ if (result != null)
+ return result;
+ }
+ return type$.JavaScriptIndexingBehavior_dynamic._is(object);
+ },
+ S(value) {
+ var result;
+ if (typeof value == "string")
+ return value;
+ if (typeof value == "number") {
+ if (value !== 0)
+ return "" + value;
+ } else if (true === value)
+ return "true";
+ else if (false === value)
+ return "false";
+ else if (value == null)
+ return "null";
+ result = J.toString$0$(value);
+ return result;
+ },
+ Primitives_objectHashCode(object) {
+ var hash,
+ property = $.Primitives__identityHashCodeProperty;
+ if (property == null)
+ property = $.Primitives__identityHashCodeProperty = Symbol("identityHashCode");
+ hash = object[property];
+ if (hash == null) {
+ hash = Math.random() * 0x3fffffff | 0;
+ object[property] = hash;
+ }
+ return hash;
+ },
+ Primitives_parseInt(source, radix) {
+ var decimalMatch, maxCharCode, digitsPart, t1, i, _null = null,
+ match = /^\s*[+-]?((0x[a-f0-9]+)|(\d+)|([a-z0-9]+))\s*$/i.exec(source);
+ if (match == null)
+ return _null;
+ if (3 >= match.length)
+ return A.ioore(match, 3);
+ decimalMatch = match[3];
+ if (radix == null) {
+ if (decimalMatch != null)
+ return parseInt(source, 10);
+ if (match[2] != null)
+ return parseInt(source, 16);
+ return _null;
+ }
+ if (radix < 2 || radix > 36)
+ throw A.wrapException(A.RangeError$range(radix, 2, 36, "radix", _null));
+ if (radix === 10 && decimalMatch != null)
+ return parseInt(source, 10);
+ if (radix < 10 || decimalMatch == null) {
+ maxCharCode = radix <= 10 ? 47 + radix : 86 + radix;
+ digitsPart = match[1];
+ for (t1 = digitsPart.length, i = 0; i < t1; ++i)
+ if ((digitsPart.charCodeAt(i) | 32) > maxCharCode)
+ return _null;
+ }
+ return parseInt(source, radix);
+ },
+ Primitives_objectTypeName(object) {
+ return A.Primitives__objectTypeNameNewRti(object);
+ },
+ Primitives__objectTypeNameNewRti(object) {
+ var interceptor, dispatchName, $constructor, constructorName;
+ if (object instanceof A.Object)
+ return A._rtiToString(A.instanceType(object), null);
+ interceptor = J.getInterceptor$(object);
+ if (interceptor === B.Interceptor_methods || interceptor === B.JavaScriptObject_methods || type$.UnknownJavaScriptObject._is(object)) {
+ dispatchName = B.C_JS_CONST(object);
+ if (dispatchName !== "Object" && dispatchName !== "")
+ return dispatchName;
+ $constructor = object.constructor;
+ if (typeof $constructor == "function") {
+ constructorName = $constructor.name;
+ if (typeof constructorName == "string" && constructorName !== "Object" && constructorName !== "")
+ return constructorName;
+ }
+ }
+ return A._rtiToString(A.instanceType(object), null);
+ },
+ Primitives_safeToString(object) {
+ if (typeof object == "number" || A._isBool(object))
+ return J.toString$0$(object);
+ if (typeof object == "string")
+ return JSON.stringify(object);
+ if (object instanceof A.Closure)
+ return object.toString$0(0);
+ return "Instance of '" + A.Primitives_objectTypeName(object) + "'";
+ },
+ Primitives_stringFromCharCode(charCode) {
+ var bits;
+ if (0 <= charCode) {
+ if (charCode <= 65535)
+ return String.fromCharCode(charCode);
+ if (charCode <= 1114111) {
+ bits = charCode - 65536;
+ return String.fromCharCode((B.JSInt_methods._shrOtherPositive$1(bits, 10) | 55296) >>> 0, bits & 1023 | 56320);
+ }
+ }
+ throw A.wrapException(A.RangeError$range(charCode, 0, 1114111, null, null));
+ },
+ Primitives_lazyAsJsDate(receiver) {
+ if (receiver.date === void 0)
+ receiver.date = new Date(receiver._value);
+ return receiver.date;
+ },
+ Primitives_getYear(receiver) {
+ return receiver.isUtc ? A.Primitives_lazyAsJsDate(receiver).getUTCFullYear() + 0 : A.Primitives_lazyAsJsDate(receiver).getFullYear() + 0;
+ },
+ Primitives_getMonth(receiver) {
+ return receiver.isUtc ? A.Primitives_lazyAsJsDate(receiver).getUTCMonth() + 1 : A.Primitives_lazyAsJsDate(receiver).getMonth() + 1;
+ },
+ Primitives_getDay(receiver) {
+ return receiver.isUtc ? A.Primitives_lazyAsJsDate(receiver).getUTCDate() + 0 : A.Primitives_lazyAsJsDate(receiver).getDate() + 0;
+ },
+ Primitives_getHours(receiver) {
+ return receiver.isUtc ? A.Primitives_lazyAsJsDate(receiver).getUTCHours() + 0 : A.Primitives_lazyAsJsDate(receiver).getHours() + 0;
+ },
+ Primitives_getMinutes(receiver) {
+ return receiver.isUtc ? A.Primitives_lazyAsJsDate(receiver).getUTCMinutes() + 0 : A.Primitives_lazyAsJsDate(receiver).getMinutes() + 0;
+ },
+ Primitives_getSeconds(receiver) {
+ return receiver.isUtc ? A.Primitives_lazyAsJsDate(receiver).getUTCSeconds() + 0 : A.Primitives_lazyAsJsDate(receiver).getSeconds() + 0;
+ },
+ Primitives_getMilliseconds(receiver) {
+ return receiver.isUtc ? A.Primitives_lazyAsJsDate(receiver).getUTCMilliseconds() + 0 : A.Primitives_lazyAsJsDate(receiver).getMilliseconds() + 0;
+ },
+ Primitives_functionNoSuchMethod($function, positionalArguments, namedArguments) {
+ var $arguments, namedArgumentList, t1 = {};
+ t1.argumentCount = 0;
+ $arguments = [];
+ namedArgumentList = [];
+ t1.argumentCount = positionalArguments.length;
+ B.JSArray_methods.addAll$1($arguments, positionalArguments);
+ t1.names = "";
+ if (namedArguments != null && namedArguments.__js_helper$_length !== 0)
+ namedArguments.forEach$1(0, new A.Primitives_functionNoSuchMethod_closure(t1, namedArgumentList, $arguments));
+ return J.noSuchMethod$1$($function, new A.JSInvocationMirror(B.Symbol_call, 0, $arguments, namedArgumentList, 0));
+ },
+ Primitives_applyFunction($function, positionalArguments, namedArguments) {
+ var t1, argumentCount, jsStub;
+ if (Array.isArray(positionalArguments))
+ t1 = namedArguments == null || namedArguments.__js_helper$_length === 0;
+ else
+ t1 = false;
+ if (t1) {
+ argumentCount = positionalArguments.length;
+ if (argumentCount === 0) {
+ if (!!$function.call$0)
+ return $function.call$0();
+ } else if (argumentCount === 1) {
+ if (!!$function.call$1)
+ return $function.call$1(positionalArguments[0]);
+ } else if (argumentCount === 2) {
+ if (!!$function.call$2)
+ return $function.call$2(positionalArguments[0], positionalArguments[1]);
+ } else if (argumentCount === 3) {
+ if (!!$function.call$3)
+ return $function.call$3(positionalArguments[0], positionalArguments[1], positionalArguments[2]);
+ } else if (argumentCount === 4) {
+ if (!!$function.call$4)
+ return $function.call$4(positionalArguments[0], positionalArguments[1], positionalArguments[2], positionalArguments[3]);
+ } else if (argumentCount === 5)
+ if (!!$function.call$5)
+ return $function.call$5(positionalArguments[0], positionalArguments[1], positionalArguments[2], positionalArguments[3], positionalArguments[4]);
+ jsStub = $function["call" + "$" + argumentCount];
+ if (jsStub != null)
+ return jsStub.apply($function, positionalArguments);
+ }
+ return A.Primitives__generalApplyFunction($function, positionalArguments, namedArguments);
+ },
+ Primitives__generalApplyFunction($function, positionalArguments, namedArguments) {
+ var defaultValuesClosure, t1, defaultValues, interceptor, jsFunction, maxArguments, missingDefaults, keys, _i, defaultValue, used, key,
+ $arguments = Array.isArray(positionalArguments) ? positionalArguments : A.List_List$of(positionalArguments, true, type$.dynamic),
+ argumentCount = $arguments.length,
+ requiredParameterCount = $function.$requiredArgCount;
+ if (argumentCount < requiredParameterCount)
+ return A.Primitives_functionNoSuchMethod($function, $arguments, namedArguments);
+ defaultValuesClosure = $function.$defaultValues;
+ t1 = defaultValuesClosure == null;
+ defaultValues = !t1 ? defaultValuesClosure() : null;
+ interceptor = J.getInterceptor$($function);
+ jsFunction = interceptor["call*"];
+ if (typeof jsFunction == "string")
+ jsFunction = interceptor[jsFunction];
+ if (t1) {
+ if (namedArguments != null && namedArguments.__js_helper$_length !== 0)
+ return A.Primitives_functionNoSuchMethod($function, $arguments, namedArguments);
+ if (argumentCount === requiredParameterCount)
+ return jsFunction.apply($function, $arguments);
+ return A.Primitives_functionNoSuchMethod($function, $arguments, namedArguments);
+ }
+ if (Array.isArray(defaultValues)) {
+ if (namedArguments != null && namedArguments.__js_helper$_length !== 0)
+ return A.Primitives_functionNoSuchMethod($function, $arguments, namedArguments);
+ maxArguments = requiredParameterCount + defaultValues.length;
+ if (argumentCount > maxArguments)
+ return A.Primitives_functionNoSuchMethod($function, $arguments, null);
+ if (argumentCount < maxArguments) {
+ missingDefaults = defaultValues.slice(argumentCount - requiredParameterCount);
+ if ($arguments === positionalArguments)
+ $arguments = A.List_List$of($arguments, true, type$.dynamic);
+ B.JSArray_methods.addAll$1($arguments, missingDefaults);
+ }
+ return jsFunction.apply($function, $arguments);
+ } else {
+ if (argumentCount > requiredParameterCount)
+ return A.Primitives_functionNoSuchMethod($function, $arguments, namedArguments);
+ if ($arguments === positionalArguments)
+ $arguments = A.List_List$of($arguments, true, type$.dynamic);
+ keys = Object.keys(defaultValues);
+ if (namedArguments == null)
+ for (t1 = keys.length, _i = 0; _i < keys.length; keys.length === t1 || (0, A.throwConcurrentModificationError)(keys), ++_i) {
+ defaultValue = defaultValues[A._asString(keys[_i])];
+ if (B.C__Required === defaultValue)
+ return A.Primitives_functionNoSuchMethod($function, $arguments, namedArguments);
+ B.JSArray_methods.add$1($arguments, defaultValue);
+ }
+ else {
+ for (t1 = keys.length, used = 0, _i = 0; _i < keys.length; keys.length === t1 || (0, A.throwConcurrentModificationError)(keys), ++_i) {
+ key = A._asString(keys[_i]);
+ if (namedArguments.containsKey$1(key)) {
+ ++used;
+ B.JSArray_methods.add$1($arguments, namedArguments.$index(0, key));
+ } else {
+ defaultValue = defaultValues[key];
+ if (B.C__Required === defaultValue)
+ return A.Primitives_functionNoSuchMethod($function, $arguments, namedArguments);
+ B.JSArray_methods.add$1($arguments, defaultValue);
+ }
+ }
+ if (used !== namedArguments.__js_helper$_length)
+ return A.Primitives_functionNoSuchMethod($function, $arguments, namedArguments);
+ }
+ return jsFunction.apply($function, $arguments);
+ }
+ },
+ ioore(receiver, index) {
+ if (receiver == null)
+ J.get$length$asx(receiver);
+ throw A.wrapException(A.diagnoseIndexError(receiver, index));
+ },
+ diagnoseIndexError(indexable, index) {
+ var $length, _s5_ = "index";
+ if (!A._isInt(index))
+ return new A.ArgumentError(true, index, _s5_, null);
+ $length = A._asInt(J.get$length$asx(indexable));
+ if (index < 0 || index >= $length)
+ return A.IndexError$withLength(index, $length, indexable, null, _s5_);
+ return A.RangeError$value(index, _s5_);
+ },
+ wrapException(ex) {
+ return A.initializeExceptionWrapper(new Error(), ex);
+ },
+ initializeExceptionWrapper(wrapper, ex) {
+ var t1;
+ if (ex == null)
+ ex = new A.TypeError();
+ wrapper.dartException = ex;
+ t1 = A.toStringWrapper;
+ if ("defineProperty" in Object) {
+ Object.defineProperty(wrapper, "message", {get: t1});
+ wrapper.name = "";
+ } else
+ wrapper.toString = t1;
+ return wrapper;
+ },
+ toStringWrapper() {
+ return J.toString$0$(this.dartException);
+ },
+ throwExpression(ex) {
+ throw A.wrapException(ex);
+ },
+ throwExpressionWithWrapper(ex, wrapper) {
+ throw A.initializeExceptionWrapper(wrapper, ex);
+ },
+ throwConcurrentModificationError(collection) {
+ throw A.wrapException(A.ConcurrentModificationError$(collection));
+ },
+ TypeErrorDecoder_extractPattern(message) {
+ var match, $arguments, argumentsExpr, expr, method, receiver;
+ message = A.quoteStringForRegExp(message.replace(String({}), "$receiver$"));
+ match = message.match(/\\\$[a-zA-Z]+\\\$/g);
+ if (match == null)
+ match = A._setArrayType([], type$.JSArray_String);
+ $arguments = match.indexOf("\\$arguments\\$");
+ argumentsExpr = match.indexOf("\\$argumentsExpr\\$");
+ expr = match.indexOf("\\$expr\\$");
+ method = match.indexOf("\\$method\\$");
+ receiver = match.indexOf("\\$receiver\\$");
+ return new A.TypeErrorDecoder(message.replace(new RegExp("\\\\\\$arguments\\\\\\$", "g"), "((?:x|[^x])*)").replace(new RegExp("\\\\\\$argumentsExpr\\\\\\$", "g"), "((?:x|[^x])*)").replace(new RegExp("\\\\\\$expr\\\\\\$", "g"), "((?:x|[^x])*)").replace(new RegExp("\\\\\\$method\\\\\\$", "g"), "((?:x|[^x])*)").replace(new RegExp("\\\\\\$receiver\\\\\\$", "g"), "((?:x|[^x])*)"), $arguments, argumentsExpr, expr, method, receiver);
+ },
+ TypeErrorDecoder_provokeCallErrorOn(expression) {
+ return function($expr$) {
+ var $argumentsExpr$ = "$arguments$";
+ try {
+ $expr$.$method$($argumentsExpr$);
+ } catch (e) {
+ return e.message;
+ }
+ }(expression);
+ },
+ TypeErrorDecoder_provokePropertyErrorOn(expression) {
+ return function($expr$) {
+ try {
+ $expr$.$method$;
+ } catch (e) {
+ return e.message;
+ }
+ }(expression);
+ },
+ JsNoSuchMethodError$(_message, match) {
+ var t1 = match == null,
+ t2 = t1 ? null : match.method;
+ return new A.JsNoSuchMethodError(_message, t2, t1 ? null : match.receiver);
+ },
+ unwrapException(ex) {
+ var t1;
+ if (ex == null)
+ return new A.NullThrownFromJavaScriptException(ex);
+ if (ex instanceof A.ExceptionAndStackTrace) {
+ t1 = ex.dartException;
+ return A.saveStackTrace(ex, t1 == null ? type$.Object._as(t1) : t1);
+ }
+ if (typeof ex !== "object")
+ return ex;
+ if ("dartException" in ex)
+ return A.saveStackTrace(ex, ex.dartException);
+ return A._unwrapNonDartException(ex);
+ },
+ saveStackTrace(ex, error) {
+ if (type$.Error._is(error))
+ if (error.$thrownJsError == null)
+ error.$thrownJsError = ex;
+ return error;
+ },
+ _unwrapNonDartException(ex) {
+ var message, number, ieErrorCode, nsme, notClosure, nullCall, nullLiteralCall, undefCall, undefLiteralCall, nullProperty, undefProperty, undefLiteralProperty, match;
+ if (!("message" in ex))
+ return ex;
+ message = ex.message;
+ if ("number" in ex && typeof ex.number == "number") {
+ number = ex.number;
+ ieErrorCode = number & 65535;
+ if ((B.JSInt_methods._shrOtherPositive$1(number, 16) & 8191) === 10)
+ switch (ieErrorCode) {
+ case 438:
+ return A.saveStackTrace(ex, A.JsNoSuchMethodError$(A.S(message) + " (Error " + ieErrorCode + ")", null));
+ case 445:
+ case 5007:
+ A.S(message);
+ return A.saveStackTrace(ex, new A.NullError());
+ }
+ }
+ if (ex instanceof TypeError) {
+ nsme = $.$get$TypeErrorDecoder_noSuchMethodPattern();
+ notClosure = $.$get$TypeErrorDecoder_notClosurePattern();
+ nullCall = $.$get$TypeErrorDecoder_nullCallPattern();
+ nullLiteralCall = $.$get$TypeErrorDecoder_nullLiteralCallPattern();
+ undefCall = $.$get$TypeErrorDecoder_undefinedCallPattern();
+ undefLiteralCall = $.$get$TypeErrorDecoder_undefinedLiteralCallPattern();
+ nullProperty = $.$get$TypeErrorDecoder_nullPropertyPattern();
+ $.$get$TypeErrorDecoder_nullLiteralPropertyPattern();
+ undefProperty = $.$get$TypeErrorDecoder_undefinedPropertyPattern();
+ undefLiteralProperty = $.$get$TypeErrorDecoder_undefinedLiteralPropertyPattern();
+ match = nsme.matchTypeError$1(message);
+ if (match != null)
+ return A.saveStackTrace(ex, A.JsNoSuchMethodError$(A._asString(message), match));
+ else {
+ match = notClosure.matchTypeError$1(message);
+ if (match != null) {
+ match.method = "call";
+ return A.saveStackTrace(ex, A.JsNoSuchMethodError$(A._asString(message), match));
+ } else if (nullCall.matchTypeError$1(message) != null || nullLiteralCall.matchTypeError$1(message) != null || undefCall.matchTypeError$1(message) != null || undefLiteralCall.matchTypeError$1(message) != null || nullProperty.matchTypeError$1(message) != null || nullLiteralCall.matchTypeError$1(message) != null || undefProperty.matchTypeError$1(message) != null || undefLiteralProperty.matchTypeError$1(message) != null) {
+ A._asString(message);
+ return A.saveStackTrace(ex, new A.NullError());
+ }
+ }
+ return A.saveStackTrace(ex, new A.UnknownJsTypeError(typeof message == "string" ? message : ""));
+ }
+ if (ex instanceof RangeError) {
+ if (typeof message == "string" && message.indexOf("call stack") !== -1)
+ return new A.StackOverflowError();
+ message = function(ex) {
+ try {
+ return String(ex);
+ } catch (e) {
+ }
+ return null;
+ }(ex);
+ return A.saveStackTrace(ex, new A.ArgumentError(false, null, null, typeof message == "string" ? message.replace(/^RangeError:\s*/, "") : message));
+ }
+ if (typeof InternalError == "function" && ex instanceof InternalError)
+ if (typeof message == "string" && message === "too much recursion")
+ return new A.StackOverflowError();
+ return ex;
+ },
+ getTraceFromException(exception) {
+ var trace;
+ if (exception instanceof A.ExceptionAndStackTrace)
+ return exception.stackTrace;
+ if (exception == null)
+ return new A._StackTrace(exception);
+ trace = exception.$cachedTrace;
+ if (trace != null)
+ return trace;
+ trace = new A._StackTrace(exception);
+ if (typeof exception === "object")
+ exception.$cachedTrace = trace;
+ return trace;
+ },
+ objectHashCode(object) {
+ if (object == null)
+ return J.get$hashCode$(object);
+ if (typeof object == "object")
+ return A.Primitives_objectHashCode(object);
+ return J.get$hashCode$(object);
+ },
+ _invokeClosure(closure, numberOfArguments, arg1, arg2, arg3, arg4) {
+ type$.Function._as(closure);
+ switch (A._asInt(numberOfArguments)) {
+ case 0:
+ return closure.call$0();
+ case 1:
+ return closure.call$1(arg1);
+ case 2:
+ return closure.call$2(arg1, arg2);
+ case 3:
+ return closure.call$3(arg1, arg2, arg3);
+ case 4:
+ return closure.call$4(arg1, arg2, arg3, arg4);
+ }
+ throw A.wrapException(new A._Exception("Unsupported number of arguments for wrapped closure"));
+ },
+ convertDartClosureToJS(closure, arity) {
+ var $function = closure.$identity;
+ if (!!$function)
+ return $function;
+ $function = A.convertDartClosureToJSUncached(closure, arity);
+ closure.$identity = $function;
+ return $function;
+ },
+ convertDartClosureToJSUncached(closure, arity) {
+ var entry;
+ switch (arity) {
+ case 0:
+ entry = closure.call$0;
+ break;
+ case 1:
+ entry = closure.call$1;
+ break;
+ case 2:
+ entry = closure.call$2;
+ break;
+ case 3:
+ entry = closure.call$3;
+ break;
+ case 4:
+ entry = closure.call$4;
+ break;
+ default:
+ entry = null;
+ }
+ if (entry != null)
+ return entry.bind(closure);
+ return function(closure, arity, invoke) {
+ return function(a1, a2, a3, a4) {
+ return invoke(closure, arity, a1, a2, a3, a4);
+ };
+ }(closure, arity, A._invokeClosure);
+ },
+ Closure_fromTearOff(parameters) {
+ var $prototype, $constructor, t2, trampoline, applyTrampoline, i, stub, stub0, stubName, stubCallName,
+ container = parameters.co,
+ isStatic = parameters.iS,
+ isIntercepted = parameters.iI,
+ needsDirectAccess = parameters.nDA,
+ applyTrampolineIndex = parameters.aI,
+ funsOrNames = parameters.fs,
+ callNames = parameters.cs,
+ $name = funsOrNames[0],
+ callName = callNames[0],
+ $function = container[$name],
+ t1 = parameters.fT;
+ t1.toString;
+ $prototype = isStatic ? Object.create(new A.StaticClosure().constructor.prototype) : Object.create(new A.BoundClosure(null, null).constructor.prototype);
+ $prototype.$initialize = $prototype.constructor;
+ $constructor = isStatic ? function static_tear_off() {
+ this.$initialize();
+ } : function tear_off(a, b) {
+ this.$initialize(a, b);
+ };
+ $prototype.constructor = $constructor;
+ $constructor.prototype = $prototype;
+ $prototype.$_name = $name;
+ $prototype.$_target = $function;
+ t2 = !isStatic;
+ if (t2)
+ trampoline = A.Closure_forwardCallTo($name, $function, isIntercepted, needsDirectAccess);
+ else {
+ $prototype.$static_name = $name;
+ trampoline = $function;
+ }
+ $prototype.$signature = A.Closure__computeSignatureFunctionNewRti(t1, isStatic, isIntercepted);
+ $prototype[callName] = trampoline;
+ for (applyTrampoline = trampoline, i = 1; i < funsOrNames.length; ++i) {
+ stub = funsOrNames[i];
+ if (typeof stub == "string") {
+ stub0 = container[stub];
+ stubName = stub;
+ stub = stub0;
+ } else
+ stubName = "";
+ stubCallName = callNames[i];
+ if (stubCallName != null) {
+ if (t2)
+ stub = A.Closure_forwardCallTo(stubName, stub, isIntercepted, needsDirectAccess);
+ $prototype[stubCallName] = stub;
+ }
+ if (i === applyTrampolineIndex)
+ applyTrampoline = stub;
+ }
+ $prototype["call*"] = applyTrampoline;
+ $prototype.$requiredArgCount = parameters.rC;
+ $prototype.$defaultValues = parameters.dV;
+ return $constructor;
+ },
+ Closure__computeSignatureFunctionNewRti(functionType, isStatic, isIntercepted) {
+ if (typeof functionType == "number")
+ return functionType;
+ if (typeof functionType == "string") {
+ if (isStatic)
+ throw A.wrapException("Cannot compute signature for static tearoff.");
+ return function(recipe, evalOnReceiver) {
+ return function() {
+ return evalOnReceiver(this, recipe);
+ };
+ }(functionType, A.BoundClosure_evalRecipe);
+ }
+ throw A.wrapException("Error in functionType of tearoff");
+ },
+ Closure_cspForwardCall(arity, needsDirectAccess, stubName, $function) {
+ var getReceiver = A.BoundClosure_receiverOf;
+ switch (needsDirectAccess ? -1 : arity) {
+ case 0:
+ return function(entry, receiverOf) {
+ return function() {
+ return receiverOf(this)[entry]();
+ };
+ }(stubName, getReceiver);
+ case 1:
+ return function(entry, receiverOf) {
+ return function(a) {
+ return receiverOf(this)[entry](a);
+ };
+ }(stubName, getReceiver);
+ case 2:
+ return function(entry, receiverOf) {
+ return function(a, b) {
+ return receiverOf(this)[entry](a, b);
+ };
+ }(stubName, getReceiver);
+ case 3:
+ return function(entry, receiverOf) {
+ return function(a, b, c) {
+ return receiverOf(this)[entry](a, b, c);
+ };
+ }(stubName, getReceiver);
+ case 4:
+ return function(entry, receiverOf) {
+ return function(a, b, c, d) {
+ return receiverOf(this)[entry](a, b, c, d);
+ };
+ }(stubName, getReceiver);
+ case 5:
+ return function(entry, receiverOf) {
+ return function(a, b, c, d, e) {
+ return receiverOf(this)[entry](a, b, c, d, e);
+ };
+ }(stubName, getReceiver);
+ default:
+ return function(f, receiverOf) {
+ return function() {
+ return f.apply(receiverOf(this), arguments);
+ };
+ }($function, getReceiver);
+ }
+ },
+ Closure_forwardCallTo(stubName, $function, isIntercepted, needsDirectAccess) {
+ if (isIntercepted)
+ return A.Closure_forwardInterceptedCallTo(stubName, $function, needsDirectAccess);
+ return A.Closure_cspForwardCall($function.length, needsDirectAccess, stubName, $function);
+ },
+ Closure_cspForwardInterceptedCall(arity, needsDirectAccess, stubName, $function) {
+ var getReceiver = A.BoundClosure_receiverOf,
+ getInterceptor = A.BoundClosure_interceptorOf;
+ switch (needsDirectAccess ? -1 : arity) {
+ case 0:
+ throw A.wrapException(new A.RuntimeError("Intercepted function with no arguments."));
+ case 1:
+ return function(entry, interceptorOf, receiverOf) {
+ return function() {
+ return interceptorOf(this)[entry](receiverOf(this));
+ };
+ }(stubName, getInterceptor, getReceiver);
+ case 2:
+ return function(entry, interceptorOf, receiverOf) {
+ return function(a) {
+ return interceptorOf(this)[entry](receiverOf(this), a);
+ };
+ }(stubName, getInterceptor, getReceiver);
+ case 3:
+ return function(entry, interceptorOf, receiverOf) {
+ return function(a, b) {
+ return interceptorOf(this)[entry](receiverOf(this), a, b);
+ };
+ }(stubName, getInterceptor, getReceiver);
+ case 4:
+ return function(entry, interceptorOf, receiverOf) {
+ return function(a, b, c) {
+ return interceptorOf(this)[entry](receiverOf(this), a, b, c);
+ };
+ }(stubName, getInterceptor, getReceiver);
+ case 5:
+ return function(entry, interceptorOf, receiverOf) {
+ return function(a, b, c, d) {
+ return interceptorOf(this)[entry](receiverOf(this), a, b, c, d);
+ };
+ }(stubName, getInterceptor, getReceiver);
+ case 6:
+ return function(entry, interceptorOf, receiverOf) {
+ return function(a, b, c, d, e) {
+ return interceptorOf(this)[entry](receiverOf(this), a, b, c, d, e);
+ };
+ }(stubName, getInterceptor, getReceiver);
+ default:
+ return function(f, interceptorOf, receiverOf) {
+ return function() {
+ var a = [receiverOf(this)];
+ Array.prototype.push.apply(a, arguments);
+ return f.apply(interceptorOf(this), a);
+ };
+ }($function, getInterceptor, getReceiver);
+ }
+ },
+ Closure_forwardInterceptedCallTo(stubName, $function, needsDirectAccess) {
+ var arity, t1;
+ if ($.BoundClosure__interceptorFieldNameCache == null)
+ $.BoundClosure__interceptorFieldNameCache = A.BoundClosure__computeFieldNamed("interceptor");
+ if ($.BoundClosure__receiverFieldNameCache == null)
+ $.BoundClosure__receiverFieldNameCache = A.BoundClosure__computeFieldNamed("receiver");
+ arity = $function.length;
+ t1 = A.Closure_cspForwardInterceptedCall(arity, needsDirectAccess, stubName, $function);
+ return t1;
+ },
+ closureFromTearOff(parameters) {
+ return A.Closure_fromTearOff(parameters);
+ },
+ BoundClosure_evalRecipe(closure, recipe) {
+ return A._Universe_evalInEnvironment(init.typeUniverse, A.instanceType(closure._receiver), recipe);
+ },
+ BoundClosure_receiverOf(closure) {
+ return closure._receiver;
+ },
+ BoundClosure_interceptorOf(closure) {
+ return closure._interceptor;
+ },
+ BoundClosure__computeFieldNamed(fieldName) {
+ var t1, i, $name,
+ template = new A.BoundClosure("receiver", "interceptor"),
+ names = J.JSArray_markFixedList(Object.getOwnPropertyNames(template), type$.nullable_Object);
+ for (t1 = names.length, i = 0; i < t1; ++i) {
+ $name = names[i];
+ if (template[$name] === fieldName)
+ return $name;
+ }
+ throw A.wrapException(A.ArgumentError$("Field name " + fieldName + " not found.", null));
+ },
+ throwCyclicInit(staticName) {
+ throw A.wrapException(new A._CyclicInitializationError(staticName));
+ },
+ getIsolateAffinityTag($name) {
+ return init.getIsolateTag($name);
+ },
+ lookupAndCacheInterceptor(obj) {
+ var interceptor, interceptorClass, altTag, mark, t1,
+ tag = A._asString($.getTagFunction.call$1(obj)),
+ record = $.dispatchRecordsForInstanceTags[tag];
+ if (record != null) {
+ Object.defineProperty(obj, init.dispatchPropertyName, {value: record, enumerable: false, writable: true, configurable: true});
+ return record.i;
+ }
+ interceptor = $.interceptorsForUncacheableTags[tag];
+ if (interceptor != null)
+ return interceptor;
+ interceptorClass = init.interceptorsByTag[tag];
+ if (interceptorClass == null) {
+ altTag = A._asStringQ($.alternateTagFunction.call$2(obj, tag));
+ if (altTag != null) {
+ record = $.dispatchRecordsForInstanceTags[altTag];
+ if (record != null) {
+ Object.defineProperty(obj, init.dispatchPropertyName, {value: record, enumerable: false, writable: true, configurable: true});
+ return record.i;
+ }
+ interceptor = $.interceptorsForUncacheableTags[altTag];
+ if (interceptor != null)
+ return interceptor;
+ interceptorClass = init.interceptorsByTag[altTag];
+ tag = altTag;
+ }
+ }
+ if (interceptorClass == null)
+ return null;
+ interceptor = interceptorClass.prototype;
+ mark = tag[0];
+ if (mark === "!") {
+ record = A.makeLeafDispatchRecord(interceptor);
+ $.dispatchRecordsForInstanceTags[tag] = record;
+ Object.defineProperty(obj, init.dispatchPropertyName, {value: record, enumerable: false, writable: true, configurable: true});
+ return record.i;
+ }
+ if (mark === "~") {
+ $.interceptorsForUncacheableTags[tag] = interceptor;
+ return interceptor;
+ }
+ if (mark === "-") {
+ t1 = A.makeLeafDispatchRecord(interceptor);
+ Object.defineProperty(Object.getPrototypeOf(obj), init.dispatchPropertyName, {value: t1, enumerable: false, writable: true, configurable: true});
+ return t1.i;
+ }
+ if (mark === "+")
+ return A.patchInteriorProto(obj, interceptor);
+ if (mark === "*")
+ throw A.wrapException(A.UnimplementedError$(tag));
+ if (init.leafTags[tag] === true) {
+ t1 = A.makeLeafDispatchRecord(interceptor);
+ Object.defineProperty(Object.getPrototypeOf(obj), init.dispatchPropertyName, {value: t1, enumerable: false, writable: true, configurable: true});
+ return t1.i;
+ } else
+ return A.patchInteriorProto(obj, interceptor);
+ },
+ patchInteriorProto(obj, interceptor) {
+ var proto = Object.getPrototypeOf(obj);
+ Object.defineProperty(proto, init.dispatchPropertyName, {value: J.makeDispatchRecord(interceptor, proto, null, null), enumerable: false, writable: true, configurable: true});
+ return interceptor;
+ },
+ makeLeafDispatchRecord(interceptor) {
+ return J.makeDispatchRecord(interceptor, false, null, !!interceptor.$isJavaScriptIndexingBehavior);
+ },
+ makeDefaultDispatchRecord(tag, interceptorClass, proto) {
+ var interceptor = interceptorClass.prototype;
+ if (init.leafTags[tag] === true)
+ return A.makeLeafDispatchRecord(interceptor);
+ else
+ return J.makeDispatchRecord(interceptor, proto, null, null);
+ },
+ initNativeDispatch() {
+ if (true === $.initNativeDispatchFlag)
+ return;
+ $.initNativeDispatchFlag = true;
+ A.initNativeDispatchContinue();
+ },
+ initNativeDispatchContinue() {
+ var map, tags, fun, i, tag, proto, record, interceptorClass;
+ $.dispatchRecordsForInstanceTags = Object.create(null);
+ $.interceptorsForUncacheableTags = Object.create(null);
+ A.initHooks();
+ map = init.interceptorsByTag;
+ tags = Object.getOwnPropertyNames(map);
+ if (typeof window != "undefined") {
+ window;
+ fun = function() {
+ };
+ for (i = 0; i < tags.length; ++i) {
+ tag = tags[i];
+ proto = $.prototypeForTagFunction.call$1(tag);
+ if (proto != null) {
+ record = A.makeDefaultDispatchRecord(tag, map[tag], proto);
+ if (record != null) {
+ Object.defineProperty(proto, init.dispatchPropertyName, {value: record, enumerable: false, writable: true, configurable: true});
+ fun.prototype = proto;
+ }
+ }
+ }
+ }
+ for (i = 0; i < tags.length; ++i) {
+ tag = tags[i];
+ if (/^[A-Za-z_]/.test(tag)) {
+ interceptorClass = map[tag];
+ map["!" + tag] = interceptorClass;
+ map["~" + tag] = interceptorClass;
+ map["-" + tag] = interceptorClass;
+ map["+" + tag] = interceptorClass;
+ map["*" + tag] = interceptorClass;
+ }
+ }
+ },
+ initHooks() {
+ var transformers, i, transformer, getTag, getUnknownTag, prototypeForTag,
+ hooks = B.C_JS_CONST0();
+ hooks = A.applyHooksTransformer(B.C_JS_CONST1, A.applyHooksTransformer(B.C_JS_CONST2, A.applyHooksTransformer(B.C_JS_CONST3, A.applyHooksTransformer(B.C_JS_CONST3, A.applyHooksTransformer(B.C_JS_CONST4, A.applyHooksTransformer(B.C_JS_CONST5, A.applyHooksTransformer(B.C_JS_CONST6(B.C_JS_CONST), hooks)))))));
+ if (typeof dartNativeDispatchHooksTransformer != "undefined") {
+ transformers = dartNativeDispatchHooksTransformer;
+ if (typeof transformers == "function")
+ transformers = [transformers];
+ if (Array.isArray(transformers))
+ for (i = 0; i < transformers.length; ++i) {
+ transformer = transformers[i];
+ if (typeof transformer == "function")
+ hooks = transformer(hooks) || hooks;
+ }
+ }
+ getTag = hooks.getTag;
+ getUnknownTag = hooks.getUnknownTag;
+ prototypeForTag = hooks.prototypeForTag;
+ $.getTagFunction = new A.initHooks_closure(getTag);
+ $.alternateTagFunction = new A.initHooks_closure0(getUnknownTag);
+ $.prototypeForTagFunction = new A.initHooks_closure1(prototypeForTag);
+ },
+ applyHooksTransformer(transformer, hooks) {
+ return transformer(hooks) || hooks;
+ },
+ createRecordTypePredicate(shape, fieldRtis) {
+ var $length = fieldRtis.length,
+ $function = init.rttc["" + $length + ";" + shape];
+ if ($function == null)
+ return null;
+ if ($length === 0)
+ return $function;
+ if ($length === $function.length)
+ return $function.apply(null, fieldRtis);
+ return $function(fieldRtis);
+ },
+ quoteStringForRegExp(string) {
+ if (/[[\]{}()*+?.\\^$|]/.test(string))
+ return string.replace(/[[\]{}()*+?.\\^$|]/g, "\\$&");
+ return string;
+ },
+ ConstantMapView: function ConstantMapView(t0, t1) {
+ this._collection$_map = t0;
+ this.$ti = t1;
+ },
+ ConstantMap: function ConstantMap() {
+ },
+ ConstantStringMap: function ConstantStringMap(t0, t1, t2) {
+ this._jsIndex = t0;
+ this._values = t1;
+ this.$ti = t2;
+ },
+ JSInvocationMirror: function JSInvocationMirror(t0, t1, t2, t3, t4) {
+ var _ = this;
+ _._memberName = t0;
+ _.__js_helper$_kind = t1;
+ _._arguments = t2;
+ _._namedArgumentNames = t3;
+ _._typeArgumentCount = t4;
+ },
+ Primitives_functionNoSuchMethod_closure: function Primitives_functionNoSuchMethod_closure(t0, t1, t2) {
+ this._box_0 = t0;
+ this.namedArgumentList = t1;
+ this.$arguments = t2;
+ },
+ TypeErrorDecoder: function TypeErrorDecoder(t0, t1, t2, t3, t4, t5) {
+ var _ = this;
+ _._pattern = t0;
+ _._arguments = t1;
+ _._argumentsExpr = t2;
+ _._expr = t3;
+ _._method = t4;
+ _._receiver = t5;
+ },
+ NullError: function NullError() {
+ },
+ JsNoSuchMethodError: function JsNoSuchMethodError(t0, t1, t2) {
+ this.__js_helper$_message = t0;
+ this._method = t1;
+ this._receiver = t2;
+ },
+ UnknownJsTypeError: function UnknownJsTypeError(t0) {
+ this.__js_helper$_message = t0;
+ },
+ NullThrownFromJavaScriptException: function NullThrownFromJavaScriptException(t0) {
+ this._irritant = t0;
+ },
+ ExceptionAndStackTrace: function ExceptionAndStackTrace(t0, t1) {
+ this.dartException = t0;
+ this.stackTrace = t1;
+ },
+ _StackTrace: function _StackTrace(t0) {
+ this._exception = t0;
+ this._trace = null;
+ },
+ Closure: function Closure() {
+ },
+ Closure0Args: function Closure0Args() {
+ },
+ Closure2Args: function Closure2Args() {
+ },
+ TearOffClosure: function TearOffClosure() {
+ },
+ StaticClosure: function StaticClosure() {
+ },
+ BoundClosure: function BoundClosure(t0, t1) {
+ this._receiver = t0;
+ this._interceptor = t1;
+ },
+ _CyclicInitializationError: function _CyclicInitializationError(t0) {
+ this.variableName = t0;
+ },
+ RuntimeError: function RuntimeError(t0) {
+ this.message = t0;
+ },
+ _Required: function _Required() {
+ },
+ JsLinkedHashMap: function JsLinkedHashMap(t0) {
+ var _ = this;
+ _.__js_helper$_length = 0;
+ _._last = _._first = _.__js_helper$_rest = _._nums = _._strings = null;
+ _._modifications = 0;
+ _.$ti = t0;
+ },
+ LinkedHashMapCell: function LinkedHashMapCell(t0, t1) {
+ this.hashMapCellKey = t0;
+ this.hashMapCellValue = t1;
+ this._next = null;
+ },
+ LinkedHashMapKeyIterable: function LinkedHashMapKeyIterable(t0, t1) {
+ this._map = t0;
+ this.$ti = t1;
+ },
+ LinkedHashMapKeyIterator: function LinkedHashMapKeyIterator(t0, t1, t2) {
+ var _ = this;
+ _._map = t0;
+ _._modifications = t1;
+ _.__js_helper$_current = _._cell = null;
+ _.$ti = t2;
+ },
+ initHooks_closure: function initHooks_closure(t0) {
+ this.getTag = t0;
+ },
+ initHooks_closure0: function initHooks_closure0(t0) {
+ this.getUnknownTag = t0;
+ },
+ initHooks_closure1: function initHooks_closure1(t0) {
+ this.prototypeForTag = t0;
+ },
+ StringMatch: function StringMatch(t0, t1) {
+ this.start = t0;
+ this.pattern = t1;
+ },
+ _checkValidIndex(index, list, $length) {
+ if (index >>> 0 !== index || index >= $length)
+ throw A.wrapException(A.diagnoseIndexError(list, index));
+ },
+ NativeByteBuffer: function NativeByteBuffer() {
+ },
+ NativeTypedData: function NativeTypedData() {
+ },
+ NativeByteData: function NativeByteData() {
+ },
+ NativeTypedArray: function NativeTypedArray() {
+ },
+ NativeTypedArrayOfDouble: function NativeTypedArrayOfDouble() {
+ },
+ NativeTypedArrayOfInt: function NativeTypedArrayOfInt() {
+ },
+ NativeFloat32List: function NativeFloat32List() {
+ },
+ NativeFloat64List: function NativeFloat64List() {
+ },
+ NativeInt16List: function NativeInt16List() {
+ },
+ NativeInt32List: function NativeInt32List() {
+ },
+ NativeInt8List: function NativeInt8List() {
+ },
+ NativeUint16List: function NativeUint16List() {
+ },
+ NativeUint32List: function NativeUint32List() {
+ },
+ NativeUint8ClampedList: function NativeUint8ClampedList() {
+ },
+ NativeUint8List: function NativeUint8List() {
+ },
+ _NativeTypedArrayOfDouble_NativeTypedArray_ListMixin: function _NativeTypedArrayOfDouble_NativeTypedArray_ListMixin() {
+ },
+ _NativeTypedArrayOfDouble_NativeTypedArray_ListMixin_FixedLengthListMixin: function _NativeTypedArrayOfDouble_NativeTypedArray_ListMixin_FixedLengthListMixin() {
+ },
+ _NativeTypedArrayOfInt_NativeTypedArray_ListMixin: function _NativeTypedArrayOfInt_NativeTypedArray_ListMixin() {
+ },
+ _NativeTypedArrayOfInt_NativeTypedArray_ListMixin_FixedLengthListMixin: function _NativeTypedArrayOfInt_NativeTypedArray_ListMixin_FixedLengthListMixin() {
+ },
+ Rti__getQuestionFromStar(universe, rti) {
+ var question = rti._precomputed1;
+ return question == null ? rti._precomputed1 = A._Universe__lookupQuestionRti(universe, rti._primary, true) : question;
+ },
+ Rti__getFutureFromFutureOr(universe, rti) {
+ var future = rti._precomputed1;
+ return future == null ? rti._precomputed1 = A._Universe__lookupInterfaceRti(universe, "Future", [rti._primary]) : future;
+ },
+ Rti__isUnionOfFunctionType(rti) {
+ var kind = rti._kind;
+ if (kind === 6 || kind === 7 || kind === 8)
+ return A.Rti__isUnionOfFunctionType(rti._primary);
+ return kind === 12 || kind === 13;
+ },
+ Rti__getCanonicalRecipe(rti) {
+ return rti._canonicalRecipe;
+ },
+ findType(recipe) {
+ return A._Universe_eval(init.typeUniverse, recipe, false);
+ },
+ _substitute(universe, rti, typeArguments, depth) {
+ var baseType, substitutedBaseType, interfaceTypeArguments, substitutedInterfaceTypeArguments, base, substitutedBase, $arguments, substitutedArguments, t1, fields, substitutedFields, returnType, substitutedReturnType, functionParameters, substitutedFunctionParameters, bounds, substitutedBounds, index, argument,
+ kind = rti._kind;
+ switch (kind) {
+ case 5:
+ case 1:
+ case 2:
+ case 3:
+ case 4:
+ return rti;
+ case 6:
+ baseType = rti._primary;
+ substitutedBaseType = A._substitute(universe, baseType, typeArguments, depth);
+ if (substitutedBaseType === baseType)
+ return rti;
+ return A._Universe__lookupStarRti(universe, substitutedBaseType, true);
+ case 7:
+ baseType = rti._primary;
+ substitutedBaseType = A._substitute(universe, baseType, typeArguments, depth);
+ if (substitutedBaseType === baseType)
+ return rti;
+ return A._Universe__lookupQuestionRti(universe, substitutedBaseType, true);
+ case 8:
+ baseType = rti._primary;
+ substitutedBaseType = A._substitute(universe, baseType, typeArguments, depth);
+ if (substitutedBaseType === baseType)
+ return rti;
+ return A._Universe__lookupFutureOrRti(universe, substitutedBaseType, true);
+ case 9:
+ interfaceTypeArguments = rti._rest;
+ substitutedInterfaceTypeArguments = A._substituteArray(universe, interfaceTypeArguments, typeArguments, depth);
+ if (substitutedInterfaceTypeArguments === interfaceTypeArguments)
+ return rti;
+ return A._Universe__lookupInterfaceRti(universe, rti._primary, substitutedInterfaceTypeArguments);
+ case 10:
+ base = rti._primary;
+ substitutedBase = A._substitute(universe, base, typeArguments, depth);
+ $arguments = rti._rest;
+ substitutedArguments = A._substituteArray(universe, $arguments, typeArguments, depth);
+ if (substitutedBase === base && substitutedArguments === $arguments)
+ return rti;
+ return A._Universe__lookupBindingRti(universe, substitutedBase, substitutedArguments);
+ case 11:
+ t1 = rti._primary;
+ fields = rti._rest;
+ substitutedFields = A._substituteArray(universe, fields, typeArguments, depth);
+ if (substitutedFields === fields)
+ return rti;
+ return A._Universe__lookupRecordRti(universe, t1, substitutedFields);
+ case 12:
+ returnType = rti._primary;
+ substitutedReturnType = A._substitute(universe, returnType, typeArguments, depth);
+ functionParameters = rti._rest;
+ substitutedFunctionParameters = A._substituteFunctionParameters(universe, functionParameters, typeArguments, depth);
+ if (substitutedReturnType === returnType && substitutedFunctionParameters === functionParameters)
+ return rti;
+ return A._Universe__lookupFunctionRti(universe, substitutedReturnType, substitutedFunctionParameters);
+ case 13:
+ bounds = rti._rest;
+ depth += bounds.length;
+ substitutedBounds = A._substituteArray(universe, bounds, typeArguments, depth);
+ base = rti._primary;
+ substitutedBase = A._substitute(universe, base, typeArguments, depth);
+ if (substitutedBounds === bounds && substitutedBase === base)
+ return rti;
+ return A._Universe__lookupGenericFunctionRti(universe, substitutedBase, substitutedBounds, true);
+ case 14:
+ index = rti._primary;
+ if (index < depth)
+ return rti;
+ argument = typeArguments[index - depth];
+ if (argument == null)
+ return rti;
+ return argument;
+ default:
+ throw A.wrapException(A.AssertionError$("Attempted to substitute unexpected RTI kind " + kind));
+ }
+ },
+ _substituteArray(universe, rtiArray, typeArguments, depth) {
+ var changed, i, rti, substitutedRti,
+ $length = rtiArray.length,
+ result = A._Utils_newArrayOrEmpty($length);
+ for (changed = false, i = 0; i < $length; ++i) {
+ rti = rtiArray[i];
+ substitutedRti = A._substitute(universe, rti, typeArguments, depth);
+ if (substitutedRti !== rti)
+ changed = true;
+ result[i] = substitutedRti;
+ }
+ return changed ? result : rtiArray;
+ },
+ _substituteNamed(universe, namedArray, typeArguments, depth) {
+ var changed, i, t1, t2, rti, substitutedRti,
+ $length = namedArray.length,
+ result = A._Utils_newArrayOrEmpty($length);
+ for (changed = false, i = 0; i < $length; i += 3) {
+ t1 = namedArray[i];
+ t2 = namedArray[i + 1];
+ rti = namedArray[i + 2];
+ substitutedRti = A._substitute(universe, rti, typeArguments, depth);
+ if (substitutedRti !== rti)
+ changed = true;
+ result.splice(i, 3, t1, t2, substitutedRti);
+ }
+ return changed ? result : namedArray;
+ },
+ _substituteFunctionParameters(universe, functionParameters, typeArguments, depth) {
+ var result,
+ requiredPositional = functionParameters._requiredPositional,
+ substitutedRequiredPositional = A._substituteArray(universe, requiredPositional, typeArguments, depth),
+ optionalPositional = functionParameters._optionalPositional,
+ substitutedOptionalPositional = A._substituteArray(universe, optionalPositional, typeArguments, depth),
+ named = functionParameters._named,
+ substitutedNamed = A._substituteNamed(universe, named, typeArguments, depth);
+ if (substitutedRequiredPositional === requiredPositional && substitutedOptionalPositional === optionalPositional && substitutedNamed === named)
+ return functionParameters;
+ result = new A._FunctionParameters();
+ result._requiredPositional = substitutedRequiredPositional;
+ result._optionalPositional = substitutedOptionalPositional;
+ result._named = substitutedNamed;
+ return result;
+ },
+ _setArrayType(target, rti) {
+ target[init.arrayRti] = rti;
+ return target;
+ },
+ closureFunctionType(closure) {
+ var signature = closure.$signature;
+ if (signature != null) {
+ if (typeof signature == "number")
+ return A.getTypeFromTypesTable(signature);
+ return closure.$signature();
+ }
+ return null;
+ },
+ instanceOrFunctionType(object, testRti) {
+ var rti;
+ if (A.Rti__isUnionOfFunctionType(testRti))
+ if (object instanceof A.Closure) {
+ rti = A.closureFunctionType(object);
+ if (rti != null)
+ return rti;
+ }
+ return A.instanceType(object);
+ },
+ instanceType(object) {
+ if (object instanceof A.Object)
+ return A._instanceType(object);
+ if (Array.isArray(object))
+ return A._arrayInstanceType(object);
+ return A._instanceTypeFromConstructor(J.getInterceptor$(object));
+ },
+ _arrayInstanceType(object) {
+ var rti = object[init.arrayRti],
+ defaultRti = type$.JSArray_dynamic;
+ if (rti == null)
+ return defaultRti;
+ if (rti.constructor !== defaultRti.constructor)
+ return defaultRti;
+ return rti;
+ },
+ _instanceType(object) {
+ var rti = object.$ti;
+ return rti != null ? rti : A._instanceTypeFromConstructor(object);
+ },
+ _instanceTypeFromConstructor(instance) {
+ var $constructor = instance.constructor,
+ probe = $constructor.$ccache;
+ if (probe != null)
+ return probe;
+ return A._instanceTypeFromConstructorMiss(instance, $constructor);
+ },
+ _instanceTypeFromConstructorMiss(instance, $constructor) {
+ var effectiveConstructor = instance instanceof A.Closure ? Object.getPrototypeOf(Object.getPrototypeOf(instance)).constructor : $constructor,
+ rti = A._Universe_findErasedType(init.typeUniverse, effectiveConstructor.name);
+ $constructor.$ccache = rti;
+ return rti;
+ },
+ getTypeFromTypesTable(index) {
+ var rti,
+ table = init.types,
+ type = table[index];
+ if (typeof type == "string") {
+ rti = A._Universe_eval(init.typeUniverse, type, false);
+ table[index] = rti;
+ return rti;
+ }
+ return type;
+ },
+ getRuntimeTypeOfDartObject(object) {
+ return A.createRuntimeType(A._instanceType(object));
+ },
+ _structuralTypeOf(object) {
+ var functionRti = object instanceof A.Closure ? A.closureFunctionType(object) : null;
+ if (functionRti != null)
+ return functionRti;
+ if (type$.TrustedGetRuntimeType._is(object))
+ return J.get$runtimeType$(object)._rti;
+ if (Array.isArray(object))
+ return A._arrayInstanceType(object);
+ return A.instanceType(object);
+ },
+ createRuntimeType(rti) {
+ var t1 = rti._cachedRuntimeType;
+ return t1 == null ? rti._cachedRuntimeType = A._createRuntimeType(rti) : t1;
+ },
+ _createRuntimeType(rti) {
+ var starErasedRti, t1,
+ s = rti._canonicalRecipe,
+ starErasedRecipe = s.replace(/\*/g, "");
+ if (starErasedRecipe === s)
+ return rti._cachedRuntimeType = new A._Type(rti);
+ starErasedRti = A._Universe_eval(init.typeUniverse, starErasedRecipe, true);
+ t1 = starErasedRti._cachedRuntimeType;
+ return t1 == null ? starErasedRti._cachedRuntimeType = A._createRuntimeType(starErasedRti) : t1;
+ },
+ typeLiteral(recipe) {
+ return A.createRuntimeType(A._Universe_eval(init.typeUniverse, recipe, false));
+ },
+ _installSpecializedIsTest(object) {
+ var t1, unstarred, unstarredKind, isFn, $name, predicate, testRti = this;
+ if (testRti === type$.Object)
+ return A._finishIsFn(testRti, object, A._isObject);
+ if (!A.isSoundTopType(testRti))
+ t1 = testRti === type$.legacy_Object;
+ else
+ t1 = true;
+ if (t1)
+ return A._finishIsFn(testRti, object, A._isTop);
+ t1 = testRti._kind;
+ if (t1 === 7)
+ return A._finishIsFn(testRti, object, A._generalNullableIsTestImplementation);
+ if (t1 === 1)
+ return A._finishIsFn(testRti, object, A._isNever);
+ unstarred = t1 === 6 ? testRti._primary : testRti;
+ unstarredKind = unstarred._kind;
+ if (unstarredKind === 8)
+ return A._finishIsFn(testRti, object, A._isFutureOr);
+ if (unstarred === type$.int)
+ isFn = A._isInt;
+ else if (unstarred === type$.double || unstarred === type$.num)
+ isFn = A._isNum;
+ else if (unstarred === type$.String)
+ isFn = A._isString;
+ else
+ isFn = unstarred === type$.bool ? A._isBool : null;
+ if (isFn != null)
+ return A._finishIsFn(testRti, object, isFn);
+ if (unstarredKind === 9) {
+ $name = unstarred._primary;
+ if (unstarred._rest.every(A.isDefinitelyTopType)) {
+ testRti._specializedTestResource = "$is" + $name;
+ if ($name === "List")
+ return A._finishIsFn(testRti, object, A._isListTestViaProperty);
+ return A._finishIsFn(testRti, object, A._isTestViaProperty);
+ }
+ } else if (unstarredKind === 11) {
+ predicate = A.createRecordTypePredicate(unstarred._primary, unstarred._rest);
+ return A._finishIsFn(testRti, object, predicate == null ? A._isNever : predicate);
+ }
+ return A._finishIsFn(testRti, object, A._generalIsTestImplementation);
+ },
+ _finishIsFn(testRti, object, isFn) {
+ testRti._is = isFn;
+ return testRti._is(object);
+ },
+ _installSpecializedAsCheck(object) {
+ var t1, testRti = this,
+ asFn = A._generalAsCheckImplementation;
+ if (!A.isSoundTopType(testRti))
+ t1 = testRti === type$.legacy_Object;
+ else
+ t1 = true;
+ if (t1)
+ asFn = A._asTop;
+ else if (testRti === type$.Object)
+ asFn = A._asObject;
+ else {
+ t1 = A.isNullable(testRti);
+ if (t1)
+ asFn = A._generalNullableAsCheckImplementation;
+ }
+ testRti._as = asFn;
+ return testRti._as(object);
+ },
+ _nullIs(testRti) {
+ var t1,
+ kind = testRti._kind;
+ if (!A.isSoundTopType(testRti))
+ if (!(testRti === type$.legacy_Object))
+ if (!(testRti === type$.legacy_Never))
+ if (kind !== 7)
+ if (!(kind === 6 && A._nullIs(testRti._primary)))
+ t1 = kind === 8 && A._nullIs(testRti._primary) || testRti === type$.Null || testRti === type$.JSNull;
+ else
+ t1 = true;
+ else
+ t1 = true;
+ else
+ t1 = true;
+ else
+ t1 = true;
+ else
+ t1 = true;
+ return t1;
+ },
+ _generalIsTestImplementation(object) {
+ var testRti = this;
+ if (object == null)
+ return A._nullIs(testRti);
+ return A.isSubtype(init.typeUniverse, A.instanceOrFunctionType(object, testRti), testRti);
+ },
+ _generalNullableIsTestImplementation(object) {
+ if (object == null)
+ return true;
+ return this._primary._is(object);
+ },
+ _isTestViaProperty(object) {
+ var tag, testRti = this;
+ if (object == null)
+ return A._nullIs(testRti);
+ tag = testRti._specializedTestResource;
+ if (object instanceof A.Object)
+ return !!object[tag];
+ return !!J.getInterceptor$(object)[tag];
+ },
+ _isListTestViaProperty(object) {
+ var tag, testRti = this;
+ if (object == null)
+ return A._nullIs(testRti);
+ if (typeof object != "object")
+ return false;
+ if (Array.isArray(object))
+ return true;
+ tag = testRti._specializedTestResource;
+ if (object instanceof A.Object)
+ return !!object[tag];
+ return !!J.getInterceptor$(object)[tag];
+ },
+ _generalAsCheckImplementation(object) {
+ var testRti = this;
+ if (object == null) {
+ if (A.isNullable(testRti))
+ return object;
+ } else if (testRti._is(object))
+ return object;
+ A._failedAsCheck(object, testRti);
+ },
+ _generalNullableAsCheckImplementation(object) {
+ var testRti = this;
+ if (object == null)
+ return object;
+ else if (testRti._is(object))
+ return object;
+ A._failedAsCheck(object, testRti);
+ },
+ _failedAsCheck(object, testRti) {
+ throw A.wrapException(A._TypeError$fromMessage(A._Error_compose(object, A._rtiToString(testRti, null))));
+ },
+ _Error_compose(object, checkedTypeDescription) {
+ return A.Error_safeToString(object) + ": type '" + A._rtiToString(A._structuralTypeOf(object), null) + "' is not a subtype of type '" + checkedTypeDescription + "'";
+ },
+ _TypeError$fromMessage(message) {
+ return new A._TypeError("TypeError: " + message);
+ },
+ _TypeError__TypeError$forType(object, type) {
+ return new A._TypeError("TypeError: " + A._Error_compose(object, type));
+ },
+ _isFutureOr(object) {
+ var testRti = this,
+ unstarred = testRti._kind === 6 ? testRti._primary : testRti;
+ return unstarred._primary._is(object) || A.Rti__getFutureFromFutureOr(init.typeUniverse, unstarred)._is(object);
+ },
+ _isObject(object) {
+ return object != null;
+ },
+ _asObject(object) {
+ if (object != null)
+ return object;
+ throw A.wrapException(A._TypeError__TypeError$forType(object, "Object"));
+ },
+ _isTop(object) {
+ return true;
+ },
+ _asTop(object) {
+ return object;
+ },
+ _isNever(object) {
+ return false;
+ },
+ _isBool(object) {
+ return true === object || false === object;
+ },
+ _asBool(object) {
+ if (true === object)
+ return true;
+ if (false === object)
+ return false;
+ throw A.wrapException(A._TypeError__TypeError$forType(object, "bool"));
+ },
+ _asBoolS(object) {
+ if (true === object)
+ return true;
+ if (false === object)
+ return false;
+ if (object == null)
+ return object;
+ throw A.wrapException(A._TypeError__TypeError$forType(object, "bool"));
+ },
+ _asBoolQ(object) {
+ if (true === object)
+ return true;
+ if (false === object)
+ return false;
+ if (object == null)
+ return object;
+ throw A.wrapException(A._TypeError__TypeError$forType(object, "bool?"));
+ },
+ _asDouble(object) {
+ if (typeof object == "number")
+ return object;
+ throw A.wrapException(A._TypeError__TypeError$forType(object, "double"));
+ },
+ _asDoubleS(object) {
+ if (typeof object == "number")
+ return object;
+ if (object == null)
+ return object;
+ throw A.wrapException(A._TypeError__TypeError$forType(object, "double"));
+ },
+ _asDoubleQ(object) {
+ if (typeof object == "number")
+ return object;
+ if (object == null)
+ return object;
+ throw A.wrapException(A._TypeError__TypeError$forType(object, "double?"));
+ },
+ _isInt(object) {
+ return typeof object == "number" && Math.floor(object) === object;
+ },
+ _asInt(object) {
+ if (typeof object == "number" && Math.floor(object) === object)
+ return object;
+ throw A.wrapException(A._TypeError__TypeError$forType(object, "int"));
+ },
+ _asIntS(object) {
+ if (typeof object == "number" && Math.floor(object) === object)
+ return object;
+ if (object == null)
+ return object;
+ throw A.wrapException(A._TypeError__TypeError$forType(object, "int"));
+ },
+ _asIntQ(object) {
+ if (typeof object == "number" && Math.floor(object) === object)
+ return object;
+ if (object == null)
+ return object;
+ throw A.wrapException(A._TypeError__TypeError$forType(object, "int?"));
+ },
+ _isNum(object) {
+ return typeof object == "number";
+ },
+ _asNum(object) {
+ if (typeof object == "number")
+ return object;
+ throw A.wrapException(A._TypeError__TypeError$forType(object, "num"));
+ },
+ _asNumS(object) {
+ if (typeof object == "number")
+ return object;
+ if (object == null)
+ return object;
+ throw A.wrapException(A._TypeError__TypeError$forType(object, "num"));
+ },
+ _asNumQ(object) {
+ if (typeof object == "number")
+ return object;
+ if (object == null)
+ return object;
+ throw A.wrapException(A._TypeError__TypeError$forType(object, "num?"));
+ },
+ _isString(object) {
+ return typeof object == "string";
+ },
+ _asString(object) {
+ if (typeof object == "string")
+ return object;
+ throw A.wrapException(A._TypeError__TypeError$forType(object, "String"));
+ },
+ _asStringS(object) {
+ if (typeof object == "string")
+ return object;
+ if (object == null)
+ return object;
+ throw A.wrapException(A._TypeError__TypeError$forType(object, "String"));
+ },
+ _asStringQ(object) {
+ if (typeof object == "string")
+ return object;
+ if (object == null)
+ return object;
+ throw A.wrapException(A._TypeError__TypeError$forType(object, "String?"));
+ },
+ _rtiArrayToString(array, genericContext) {
+ var s, sep, i;
+ for (s = "", sep = "", i = 0; i < array.length; ++i, sep = ", ")
+ s += sep + A._rtiToString(array[i], genericContext);
+ return s;
+ },
+ _recordRtiToString(recordType, genericContext) {
+ var fieldCount, names, namesIndex, s, comma, i,
+ partialShape = recordType._primary,
+ fields = recordType._rest;
+ if ("" === partialShape)
+ return "(" + A._rtiArrayToString(fields, genericContext) + ")";
+ fieldCount = fields.length;
+ names = partialShape.split(",");
+ namesIndex = names.length - fieldCount;
+ for (s = "(", comma = "", i = 0; i < fieldCount; ++i, comma = ", ") {
+ s += comma;
+ if (namesIndex === 0)
+ s += "{";
+ s += A._rtiToString(fields[i], genericContext);
+ if (namesIndex >= 0)
+ s += " " + names[namesIndex];
+ ++namesIndex;
+ }
+ return s + "})";
+ },
+ _functionRtiToString(functionType, genericContext, bounds) {
+ var boundsLength, outerContextLength, offset, i, t1, t2, typeParametersText, typeSep, t3, t4, boundRti, kind, parameters, requiredPositional, requiredPositionalLength, optionalPositional, optionalPositionalLength, named, namedLength, returnTypeText, argumentsText, sep, _s2_ = ", ";
+ if (bounds != null) {
+ boundsLength = bounds.length;
+ if (genericContext == null) {
+ genericContext = A._setArrayType([], type$.JSArray_String);
+ outerContextLength = null;
+ } else
+ outerContextLength = genericContext.length;
+ offset = genericContext.length;
+ for (i = boundsLength; i > 0; --i)
+ B.JSArray_methods.add$1(genericContext, "T" + (offset + i));
+ for (t1 = type$.nullable_Object, t2 = type$.legacy_Object, typeParametersText = "<", typeSep = "", i = 0; i < boundsLength; ++i, typeSep = _s2_) {
+ t3 = genericContext.length;
+ t4 = t3 - 1 - i;
+ if (!(t4 >= 0))
+ return A.ioore(genericContext, t4);
+ typeParametersText = B.JSString_methods.$add(typeParametersText + typeSep, genericContext[t4]);
+ boundRti = bounds[i];
+ kind = boundRti._kind;
+ if (!(kind === 2 || kind === 3 || kind === 4 || kind === 5 || boundRti === t1))
+ t3 = boundRti === t2;
+ else
+ t3 = true;
+ if (!t3)
+ typeParametersText += " extends " + A._rtiToString(boundRti, genericContext);
+ }
+ typeParametersText += ">";
+ } else {
+ typeParametersText = "";
+ outerContextLength = null;
+ }
+ t1 = functionType._primary;
+ parameters = functionType._rest;
+ requiredPositional = parameters._requiredPositional;
+ requiredPositionalLength = requiredPositional.length;
+ optionalPositional = parameters._optionalPositional;
+ optionalPositionalLength = optionalPositional.length;
+ named = parameters._named;
+ namedLength = named.length;
+ returnTypeText = A._rtiToString(t1, genericContext);
+ for (argumentsText = "", sep = "", i = 0; i < requiredPositionalLength; ++i, sep = _s2_)
+ argumentsText += sep + A._rtiToString(requiredPositional[i], genericContext);
+ if (optionalPositionalLength > 0) {
+ argumentsText += sep + "[";
+ for (sep = "", i = 0; i < optionalPositionalLength; ++i, sep = _s2_)
+ argumentsText += sep + A._rtiToString(optionalPositional[i], genericContext);
+ argumentsText += "]";
+ }
+ if (namedLength > 0) {
+ argumentsText += sep + "{";
+ for (sep = "", i = 0; i < namedLength; i += 3, sep = _s2_) {
+ argumentsText += sep;
+ if (named[i + 1])
+ argumentsText += "required ";
+ argumentsText += A._rtiToString(named[i + 2], genericContext) + " " + named[i];
+ }
+ argumentsText += "}";
+ }
+ if (outerContextLength != null) {
+ genericContext.toString;
+ genericContext.length = outerContextLength;
+ }
+ return typeParametersText + "(" + argumentsText + ") => " + returnTypeText;
+ },
+ _rtiToString(rti, genericContext) {
+ var questionArgument, s, argumentKind, $name, $arguments, t1, t2,
+ kind = rti._kind;
+ if (kind === 5)
+ return "erased";
+ if (kind === 2)
+ return "dynamic";
+ if (kind === 3)
+ return "void";
+ if (kind === 1)
+ return "Never";
+ if (kind === 4)
+ return "any";
+ if (kind === 6)
+ return A._rtiToString(rti._primary, genericContext);
+ if (kind === 7) {
+ questionArgument = rti._primary;
+ s = A._rtiToString(questionArgument, genericContext);
+ argumentKind = questionArgument._kind;
+ return (argumentKind === 12 || argumentKind === 13 ? "(" + s + ")" : s) + "?";
+ }
+ if (kind === 8)
+ return "FutureOr<" + A._rtiToString(rti._primary, genericContext) + ">";
+ if (kind === 9) {
+ $name = A._unminifyOrTag(rti._primary);
+ $arguments = rti._rest;
+ return $arguments.length > 0 ? $name + ("<" + A._rtiArrayToString($arguments, genericContext) + ">") : $name;
+ }
+ if (kind === 11)
+ return A._recordRtiToString(rti, genericContext);
+ if (kind === 12)
+ return A._functionRtiToString(rti, genericContext, null);
+ if (kind === 13)
+ return A._functionRtiToString(rti._primary, genericContext, rti._rest);
+ if (kind === 14) {
+ t1 = rti._primary;
+ t2 = genericContext.length;
+ t1 = t2 - 1 - t1;
+ if (!(t1 >= 0 && t1 < t2))
+ return A.ioore(genericContext, t1);
+ return genericContext[t1];
+ }
+ return "?";
+ },
+ _unminifyOrTag(rawClassName) {
+ var preserved = init.mangledGlobalNames[rawClassName];
+ if (preserved != null)
+ return preserved;
+ return rawClassName;
+ },
+ _Universe_findRule(universe, targetType) {
+ var rule = universe.tR[targetType];
+ for (; typeof rule == "string";)
+ rule = universe.tR[rule];
+ return rule;
+ },
+ _Universe_findErasedType(universe, cls) {
+ var $length, erased, $arguments, i, $interface,
+ t1 = universe.eT,
+ probe = t1[cls];
+ if (probe == null)
+ return A._Universe_eval(universe, cls, false);
+ else if (typeof probe == "number") {
+ $length = probe;
+ erased = A._Universe__lookupTerminalRti(universe, 5, "#");
+ $arguments = A._Utils_newArrayOrEmpty($length);
+ for (i = 0; i < $length; ++i)
+ $arguments[i] = erased;
+ $interface = A._Universe__lookupInterfaceRti(universe, cls, $arguments);
+ t1[cls] = $interface;
+ return $interface;
+ } else
+ return probe;
+ },
+ _Universe_addRules(universe, rules) {
+ return A._Utils_objectAssign(universe.tR, rules);
+ },
+ _Universe_addErasedTypes(universe, types) {
+ return A._Utils_objectAssign(universe.eT, types);
+ },
+ _Universe_eval(universe, recipe, normalize) {
+ var rti,
+ t1 = universe.eC,
+ probe = t1.get(recipe);
+ if (probe != null)
+ return probe;
+ rti = A._Parser_parse(A._Parser_create(universe, null, recipe, normalize));
+ t1.set(recipe, rti);
+ return rti;
+ },
+ _Universe_evalInEnvironment(universe, environment, recipe) {
+ var probe, rti,
+ cache = environment._evalCache;
+ if (cache == null)
+ cache = environment._evalCache = new Map();
+ probe = cache.get(recipe);
+ if (probe != null)
+ return probe;
+ rti = A._Parser_parse(A._Parser_create(universe, environment, recipe, true));
+ cache.set(recipe, rti);
+ return rti;
+ },
+ _Universe_bind(universe, environment, argumentsRti) {
+ var argumentsRecipe, probe, rti,
+ cache = environment._bindCache;
+ if (cache == null)
+ cache = environment._bindCache = new Map();
+ argumentsRecipe = argumentsRti._canonicalRecipe;
+ probe = cache.get(argumentsRecipe);
+ if (probe != null)
+ return probe;
+ rti = A._Universe__lookupBindingRti(universe, environment, argumentsRti._kind === 10 ? argumentsRti._rest : [argumentsRti]);
+ cache.set(argumentsRecipe, rti);
+ return rti;
+ },
+ _Universe__installTypeTests(universe, rti) {
+ rti._as = A._installSpecializedAsCheck;
+ rti._is = A._installSpecializedIsTest;
+ return rti;
+ },
+ _Universe__lookupTerminalRti(universe, kind, key) {
+ var rti, t1,
+ probe = universe.eC.get(key);
+ if (probe != null)
+ return probe;
+ rti = new A.Rti(null, null);
+ rti._kind = kind;
+ rti._canonicalRecipe = key;
+ t1 = A._Universe__installTypeTests(universe, rti);
+ universe.eC.set(key, t1);
+ return t1;
+ },
+ _Universe__lookupStarRti(universe, baseType, normalize) {
+ var t1,
+ key = baseType._canonicalRecipe + "*",
+ probe = universe.eC.get(key);
+ if (probe != null)
+ return probe;
+ t1 = A._Universe__createStarRti(universe, baseType, key, normalize);
+ universe.eC.set(key, t1);
+ return t1;
+ },
+ _Universe__createStarRti(universe, baseType, key, normalize) {
+ var baseKind, t1, rti;
+ if (normalize) {
+ baseKind = baseType._kind;
+ if (!A.isSoundTopType(baseType))
+ t1 = baseType === type$.Null || baseType === type$.JSNull || baseKind === 7 || baseKind === 6;
+ else
+ t1 = true;
+ if (t1)
+ return baseType;
+ }
+ rti = new A.Rti(null, null);
+ rti._kind = 6;
+ rti._primary = baseType;
+ rti._canonicalRecipe = key;
+ return A._Universe__installTypeTests(universe, rti);
+ },
+ _Universe__lookupQuestionRti(universe, baseType, normalize) {
+ var t1,
+ key = baseType._canonicalRecipe + "?",
+ probe = universe.eC.get(key);
+ if (probe != null)
+ return probe;
+ t1 = A._Universe__createQuestionRti(universe, baseType, key, normalize);
+ universe.eC.set(key, t1);
+ return t1;
+ },
+ _Universe__createQuestionRti(universe, baseType, key, normalize) {
+ var baseKind, t1, starArgument, rti;
+ if (normalize) {
+ baseKind = baseType._kind;
+ if (!A.isSoundTopType(baseType))
+ if (!(baseType === type$.Null || baseType === type$.JSNull))
+ if (baseKind !== 7)
+ t1 = baseKind === 8 && A.isNullable(baseType._primary);
+ else
+ t1 = true;
+ else
+ t1 = true;
+ else
+ t1 = true;
+ if (t1)
+ return baseType;
+ else if (baseKind === 1 || baseType === type$.legacy_Never)
+ return type$.Null;
+ else if (baseKind === 6) {
+ starArgument = baseType._primary;
+ if (starArgument._kind === 8 && A.isNullable(starArgument._primary))
+ return starArgument;
+ else
+ return A.Rti__getQuestionFromStar(universe, baseType);
+ }
+ }
+ rti = new A.Rti(null, null);
+ rti._kind = 7;
+ rti._primary = baseType;
+ rti._canonicalRecipe = key;
+ return A._Universe__installTypeTests(universe, rti);
+ },
+ _Universe__lookupFutureOrRti(universe, baseType, normalize) {
+ var t1,
+ key = baseType._canonicalRecipe + "/",
+ probe = universe.eC.get(key);
+ if (probe != null)
+ return probe;
+ t1 = A._Universe__createFutureOrRti(universe, baseType, key, normalize);
+ universe.eC.set(key, t1);
+ return t1;
+ },
+ _Universe__createFutureOrRti(universe, baseType, key, normalize) {
+ var t1, rti;
+ if (normalize) {
+ t1 = baseType._kind;
+ if (A.isSoundTopType(baseType) || baseType === type$.Object || baseType === type$.legacy_Object)
+ return baseType;
+ else if (t1 === 1)
+ return A._Universe__lookupInterfaceRti(universe, "Future", [baseType]);
+ else if (baseType === type$.Null || baseType === type$.JSNull)
+ return type$.nullable_Future_Null;
+ }
+ rti = new A.Rti(null, null);
+ rti._kind = 8;
+ rti._primary = baseType;
+ rti._canonicalRecipe = key;
+ return A._Universe__installTypeTests(universe, rti);
+ },
+ _Universe__lookupGenericFunctionParameterRti(universe, index) {
+ var rti, t1,
+ key = "" + index + "^",
+ probe = universe.eC.get(key);
+ if (probe != null)
+ return probe;
+ rti = new A.Rti(null, null);
+ rti._kind = 14;
+ rti._primary = index;
+ rti._canonicalRecipe = key;
+ t1 = A._Universe__installTypeTests(universe, rti);
+ universe.eC.set(key, t1);
+ return t1;
+ },
+ _Universe__canonicalRecipeJoin($arguments) {
+ var s, sep, i,
+ $length = $arguments.length;
+ for (s = "", sep = "", i = 0; i < $length; ++i, sep = ",")
+ s += sep + $arguments[i]._canonicalRecipe;
+ return s;
+ },
+ _Universe__canonicalRecipeJoinNamed($arguments) {
+ var s, sep, i, t1, nameSep,
+ $length = $arguments.length;
+ for (s = "", sep = "", i = 0; i < $length; i += 3, sep = ",") {
+ t1 = $arguments[i];
+ nameSep = $arguments[i + 1] ? "!" : ":";
+ s += sep + t1 + nameSep + $arguments[i + 2]._canonicalRecipe;
+ }
+ return s;
+ },
+ _Universe__lookupInterfaceRti(universe, $name, $arguments) {
+ var probe, rti, t1,
+ s = $name;
+ if ($arguments.length > 0)
+ s += "<" + A._Universe__canonicalRecipeJoin($arguments) + ">";
+ probe = universe.eC.get(s);
+ if (probe != null)
+ return probe;
+ rti = new A.Rti(null, null);
+ rti._kind = 9;
+ rti._primary = $name;
+ rti._rest = $arguments;
+ if ($arguments.length > 0)
+ rti._precomputed1 = $arguments[0];
+ rti._canonicalRecipe = s;
+ t1 = A._Universe__installTypeTests(universe, rti);
+ universe.eC.set(s, t1);
+ return t1;
+ },
+ _Universe__lookupBindingRti(universe, base, $arguments) {
+ var newBase, newArguments, key, probe, rti, t1;
+ if (base._kind === 10) {
+ newBase = base._primary;
+ newArguments = base._rest.concat($arguments);
+ } else {
+ newArguments = $arguments;
+ newBase = base;
+ }
+ key = newBase._canonicalRecipe + (";<" + A._Universe__canonicalRecipeJoin(newArguments) + ">");
+ probe = universe.eC.get(key);
+ if (probe != null)
+ return probe;
+ rti = new A.Rti(null, null);
+ rti._kind = 10;
+ rti._primary = newBase;
+ rti._rest = newArguments;
+ rti._canonicalRecipe = key;
+ t1 = A._Universe__installTypeTests(universe, rti);
+ universe.eC.set(key, t1);
+ return t1;
+ },
+ _Universe__lookupRecordRti(universe, partialShapeTag, fields) {
+ var rti, t1,
+ key = "+" + (partialShapeTag + "(" + A._Universe__canonicalRecipeJoin(fields) + ")"),
+ probe = universe.eC.get(key);
+ if (probe != null)
+ return probe;
+ rti = new A.Rti(null, null);
+ rti._kind = 11;
+ rti._primary = partialShapeTag;
+ rti._rest = fields;
+ rti._canonicalRecipe = key;
+ t1 = A._Universe__installTypeTests(universe, rti);
+ universe.eC.set(key, t1);
+ return t1;
+ },
+ _Universe__lookupFunctionRti(universe, returnType, parameters) {
+ var sep, key, probe, rti, t1,
+ s = returnType._canonicalRecipe,
+ requiredPositional = parameters._requiredPositional,
+ requiredPositionalLength = requiredPositional.length,
+ optionalPositional = parameters._optionalPositional,
+ optionalPositionalLength = optionalPositional.length,
+ named = parameters._named,
+ namedLength = named.length,
+ recipe = "(" + A._Universe__canonicalRecipeJoin(requiredPositional);
+ if (optionalPositionalLength > 0) {
+ sep = requiredPositionalLength > 0 ? "," : "";
+ recipe += sep + "[" + A._Universe__canonicalRecipeJoin(optionalPositional) + "]";
+ }
+ if (namedLength > 0) {
+ sep = requiredPositionalLength > 0 ? "," : "";
+ recipe += sep + "{" + A._Universe__canonicalRecipeJoinNamed(named) + "}";
+ }
+ key = s + (recipe + ")");
+ probe = universe.eC.get(key);
+ if (probe != null)
+ return probe;
+ rti = new A.Rti(null, null);
+ rti._kind = 12;
+ rti._primary = returnType;
+ rti._rest = parameters;
+ rti._canonicalRecipe = key;
+ t1 = A._Universe__installTypeTests(universe, rti);
+ universe.eC.set(key, t1);
+ return t1;
+ },
+ _Universe__lookupGenericFunctionRti(universe, baseFunctionType, bounds, normalize) {
+ var t1,
+ key = baseFunctionType._canonicalRecipe + ("<" + A._Universe__canonicalRecipeJoin(bounds) + ">"),
+ probe = universe.eC.get(key);
+ if (probe != null)
+ return probe;
+ t1 = A._Universe__createGenericFunctionRti(universe, baseFunctionType, bounds, key, normalize);
+ universe.eC.set(key, t1);
+ return t1;
+ },
+ _Universe__createGenericFunctionRti(universe, baseFunctionType, bounds, key, normalize) {
+ var $length, typeArguments, count, i, bound, substitutedBase, substitutedBounds, rti;
+ if (normalize) {
+ $length = bounds.length;
+ typeArguments = A._Utils_newArrayOrEmpty($length);
+ for (count = 0, i = 0; i < $length; ++i) {
+ bound = bounds[i];
+ if (bound._kind === 1) {
+ typeArguments[i] = bound;
+ ++count;
+ }
+ }
+ if (count > 0) {
+ substitutedBase = A._substitute(universe, baseFunctionType, typeArguments, 0);
+ substitutedBounds = A._substituteArray(universe, bounds, typeArguments, 0);
+ return A._Universe__lookupGenericFunctionRti(universe, substitutedBase, substitutedBounds, bounds !== substitutedBounds);
+ }
+ }
+ rti = new A.Rti(null, null);
+ rti._kind = 13;
+ rti._primary = baseFunctionType;
+ rti._rest = bounds;
+ rti._canonicalRecipe = key;
+ return A._Universe__installTypeTests(universe, rti);
+ },
+ _Parser_create(universe, environment, recipe, normalize) {
+ return {u: universe, e: environment, r: recipe, s: [], p: 0, n: normalize};
+ },
+ _Parser_parse(parser) {
+ var t2, i, ch, t3, array, end, item,
+ source = parser.r,
+ t1 = parser.s;
+ for (t2 = source.length, i = 0; i < t2;) {
+ ch = source.charCodeAt(i);
+ if (ch >= 48 && ch <= 57)
+ i = A._Parser_handleDigit(i + 1, ch, source, t1);
+ else if ((((ch | 32) >>> 0) - 97 & 65535) < 26 || ch === 95 || ch === 36 || ch === 124)
+ i = A._Parser_handleIdentifier(parser, i, source, t1, false);
+ else if (ch === 46)
+ i = A._Parser_handleIdentifier(parser, i, source, t1, true);
+ else {
+ ++i;
+ switch (ch) {
+ case 44:
+ break;
+ case 58:
+ t1.push(false);
+ break;
+ case 33:
+ t1.push(true);
+ break;
+ case 59:
+ t1.push(A._Parser_toType(parser.u, parser.e, t1.pop()));
+ break;
+ case 94:
+ t1.push(A._Universe__lookupGenericFunctionParameterRti(parser.u, t1.pop()));
+ break;
+ case 35:
+ t1.push(A._Universe__lookupTerminalRti(parser.u, 5, "#"));
+ break;
+ case 64:
+ t1.push(A._Universe__lookupTerminalRti(parser.u, 2, "@"));
+ break;
+ case 126:
+ t1.push(A._Universe__lookupTerminalRti(parser.u, 3, "~"));
+ break;
+ case 60:
+ t1.push(parser.p);
+ parser.p = t1.length;
+ break;
+ case 62:
+ A._Parser_handleTypeArguments(parser, t1);
+ break;
+ case 38:
+ A._Parser_handleExtendedOperations(parser, t1);
+ break;
+ case 42:
+ t3 = parser.u;
+ t1.push(A._Universe__lookupStarRti(t3, A._Parser_toType(t3, parser.e, t1.pop()), parser.n));
+ break;
+ case 63:
+ t3 = parser.u;
+ t1.push(A._Universe__lookupQuestionRti(t3, A._Parser_toType(t3, parser.e, t1.pop()), parser.n));
+ break;
+ case 47:
+ t3 = parser.u;
+ t1.push(A._Universe__lookupFutureOrRti(t3, A._Parser_toType(t3, parser.e, t1.pop()), parser.n));
+ break;
+ case 40:
+ t1.push(-3);
+ t1.push(parser.p);
+ parser.p = t1.length;
+ break;
+ case 41:
+ A._Parser_handleArguments(parser, t1);
+ break;
+ case 91:
+ t1.push(parser.p);
+ parser.p = t1.length;
+ break;
+ case 93:
+ array = t1.splice(parser.p);
+ A._Parser_toTypes(parser.u, parser.e, array);
+ parser.p = t1.pop();
+ t1.push(array);
+ t1.push(-1);
+ break;
+ case 123:
+ t1.push(parser.p);
+ parser.p = t1.length;
+ break;
+ case 125:
+ array = t1.splice(parser.p);
+ A._Parser_toTypesNamed(parser.u, parser.e, array);
+ parser.p = t1.pop();
+ t1.push(array);
+ t1.push(-2);
+ break;
+ case 43:
+ end = source.indexOf("(", i);
+ t1.push(source.substring(i, end));
+ t1.push(-4);
+ t1.push(parser.p);
+ parser.p = t1.length;
+ i = end + 1;
+ break;
+ default:
+ throw "Bad character " + ch;
+ }
+ }
+ }
+ item = t1.pop();
+ return A._Parser_toType(parser.u, parser.e, item);
+ },
+ _Parser_handleDigit(i, digit, source, stack) {
+ var t1, ch,
+ value = digit - 48;
+ for (t1 = source.length; i < t1; ++i) {
+ ch = source.charCodeAt(i);
+ if (!(ch >= 48 && ch <= 57))
+ break;
+ value = value * 10 + (ch - 48);
+ }
+ stack.push(value);
+ return i;
+ },
+ _Parser_handleIdentifier(parser, start, source, stack, hasPeriod) {
+ var t1, ch, t2, string, environment, recipe,
+ i = start + 1;
+ for (t1 = source.length; i < t1; ++i) {
+ ch = source.charCodeAt(i);
+ if (ch === 46) {
+ if (hasPeriod)
+ break;
+ hasPeriod = true;
+ } else {
+ if (!((((ch | 32) >>> 0) - 97 & 65535) < 26 || ch === 95 || ch === 36 || ch === 124))
+ t2 = ch >= 48 && ch <= 57;
+ else
+ t2 = true;
+ if (!t2)
+ break;
+ }
+ }
+ string = source.substring(start, i);
+ if (hasPeriod) {
+ t1 = parser.u;
+ environment = parser.e;
+ if (environment._kind === 10)
+ environment = environment._primary;
+ recipe = A._Universe_findRule(t1, environment._primary)[string];
+ if (recipe == null)
+ A.throwExpression('No "' + string + '" in "' + A.Rti__getCanonicalRecipe(environment) + '"');
+ stack.push(A._Universe_evalInEnvironment(t1, environment, recipe));
+ } else
+ stack.push(string);
+ return i;
+ },
+ _Parser_handleTypeArguments(parser, stack) {
+ var base,
+ t1 = parser.u,
+ $arguments = A._Parser_collectArray(parser, stack),
+ head = stack.pop();
+ if (typeof head == "string")
+ stack.push(A._Universe__lookupInterfaceRti(t1, head, $arguments));
+ else {
+ base = A._Parser_toType(t1, parser.e, head);
+ switch (base._kind) {
+ case 12:
+ stack.push(A._Universe__lookupGenericFunctionRti(t1, base, $arguments, parser.n));
+ break;
+ default:
+ stack.push(A._Universe__lookupBindingRti(t1, base, $arguments));
+ break;
+ }
+ }
+ },
+ _Parser_handleArguments(parser, stack) {
+ var optionalPositional, named, requiredPositional, returnType, parameters, _null = null,
+ t1 = parser.u,
+ head = stack.pop();
+ if (typeof head == "number")
+ switch (head) {
+ case -1:
+ optionalPositional = stack.pop();
+ named = _null;
+ break;
+ case -2:
+ named = stack.pop();
+ optionalPositional = _null;
+ break;
+ default:
+ stack.push(head);
+ named = _null;
+ optionalPositional = named;
+ break;
+ }
+ else {
+ stack.push(head);
+ named = _null;
+ optionalPositional = named;
+ }
+ requiredPositional = A._Parser_collectArray(parser, stack);
+ head = stack.pop();
+ switch (head) {
+ case -3:
+ head = stack.pop();
+ if (optionalPositional == null)
+ optionalPositional = t1.sEA;
+ if (named == null)
+ named = t1.sEA;
+ returnType = A._Parser_toType(t1, parser.e, head);
+ parameters = new A._FunctionParameters();
+ parameters._requiredPositional = requiredPositional;
+ parameters._optionalPositional = optionalPositional;
+ parameters._named = named;
+ stack.push(A._Universe__lookupFunctionRti(t1, returnType, parameters));
+ return;
+ case -4:
+ stack.push(A._Universe__lookupRecordRti(t1, stack.pop(), requiredPositional));
+ return;
+ default:
+ throw A.wrapException(A.AssertionError$("Unexpected state under `()`: " + A.S(head)));
+ }
+ },
+ _Parser_handleExtendedOperations(parser, stack) {
+ var $top = stack.pop();
+ if (0 === $top) {
+ stack.push(A._Universe__lookupTerminalRti(parser.u, 1, "0&"));
+ return;
+ }
+ if (1 === $top) {
+ stack.push(A._Universe__lookupTerminalRti(parser.u, 4, "1&"));
+ return;
+ }
+ throw A.wrapException(A.AssertionError$("Unexpected extended operation " + A.S($top)));
+ },
+ _Parser_collectArray(parser, stack) {
+ var array = stack.splice(parser.p);
+ A._Parser_toTypes(parser.u, parser.e, array);
+ parser.p = stack.pop();
+ return array;
+ },
+ _Parser_toType(universe, environment, item) {
+ if (typeof item == "string")
+ return A._Universe__lookupInterfaceRti(universe, item, universe.sEA);
+ else if (typeof item == "number") {
+ environment.toString;
+ return A._Parser_indexToType(universe, environment, item);
+ } else
+ return item;
+ },
+ _Parser_toTypes(universe, environment, items) {
+ var i,
+ $length = items.length;
+ for (i = 0; i < $length; ++i)
+ items[i] = A._Parser_toType(universe, environment, items[i]);
+ },
+ _Parser_toTypesNamed(universe, environment, items) {
+ var i,
+ $length = items.length;
+ for (i = 2; i < $length; i += 3)
+ items[i] = A._Parser_toType(universe, environment, items[i]);
+ },
+ _Parser_indexToType(universe, environment, index) {
+ var typeArguments, len,
+ kind = environment._kind;
+ if (kind === 10) {
+ if (index === 0)
+ return environment._primary;
+ typeArguments = environment._rest;
+ len = typeArguments.length;
+ if (index <= len)
+ return typeArguments[index - 1];
+ index -= len;
+ environment = environment._primary;
+ kind = environment._kind;
+ } else if (index === 0)
+ return environment;
+ if (kind !== 9)
+ throw A.wrapException(A.AssertionError$("Indexed base must be an interface type"));
+ typeArguments = environment._rest;
+ if (index <= typeArguments.length)
+ return typeArguments[index - 1];
+ throw A.wrapException(A.AssertionError$("Bad index " + index + " for " + environment.toString$0(0)));
+ },
+ isSubtype(universe, s, t) {
+ var result,
+ sCache = s._isSubtypeCache;
+ if (sCache == null)
+ sCache = s._isSubtypeCache = new Map();
+ result = sCache.get(t);
+ if (result == null) {
+ result = A._isSubtype(universe, s, null, t, null, false) ? 1 : 0;
+ sCache.set(t, result);
+ }
+ if (0 === result)
+ return false;
+ if (1 === result)
+ return true;
+ return true;
+ },
+ _isSubtype(universe, s, sEnv, t, tEnv, isLegacy) {
+ var t1, sKind, leftTypeVariable, tKind, t2, sBounds, tBounds, sLength, i, sBound, tBound;
+ if (s === t)
+ return true;
+ if (!A.isSoundTopType(t))
+ t1 = t === type$.legacy_Object;
+ else
+ t1 = true;
+ if (t1)
+ return true;
+ sKind = s._kind;
+ if (sKind === 4)
+ return true;
+ if (A.isSoundTopType(s))
+ return false;
+ t1 = s._kind;
+ if (t1 === 1)
+ return true;
+ leftTypeVariable = sKind === 14;
+ if (leftTypeVariable)
+ if (A._isSubtype(universe, sEnv[s._primary], sEnv, t, tEnv, false))
+ return true;
+ tKind = t._kind;
+ t1 = s === type$.Null || s === type$.JSNull;
+ if (t1) {
+ if (tKind === 8)
+ return A._isSubtype(universe, s, sEnv, t._primary, tEnv, false);
+ return t === type$.Null || t === type$.JSNull || tKind === 7 || tKind === 6;
+ }
+ if (t === type$.Object) {
+ if (sKind === 8)
+ return A._isSubtype(universe, s._primary, sEnv, t, tEnv, false);
+ if (sKind === 6)
+ return A._isSubtype(universe, s._primary, sEnv, t, tEnv, false);
+ return sKind !== 7;
+ }
+ if (sKind === 6)
+ return A._isSubtype(universe, s._primary, sEnv, t, tEnv, false);
+ if (tKind === 6) {
+ t1 = A.Rti__getQuestionFromStar(universe, t);
+ return A._isSubtype(universe, s, sEnv, t1, tEnv, false);
+ }
+ if (sKind === 8) {
+ if (!A._isSubtype(universe, s._primary, sEnv, t, tEnv, false))
+ return false;
+ return A._isSubtype(universe, A.Rti__getFutureFromFutureOr(universe, s), sEnv, t, tEnv, false);
+ }
+ if (sKind === 7) {
+ t1 = A._isSubtype(universe, type$.Null, sEnv, t, tEnv, false);
+ return t1 && A._isSubtype(universe, s._primary, sEnv, t, tEnv, false);
+ }
+ if (tKind === 8) {
+ if (A._isSubtype(universe, s, sEnv, t._primary, tEnv, false))
+ return true;
+ return A._isSubtype(universe, s, sEnv, A.Rti__getFutureFromFutureOr(universe, t), tEnv, false);
+ }
+ if (tKind === 7) {
+ t1 = A._isSubtype(universe, s, sEnv, type$.Null, tEnv, false);
+ return t1 || A._isSubtype(universe, s, sEnv, t._primary, tEnv, false);
+ }
+ if (leftTypeVariable)
+ return false;
+ t1 = sKind !== 12;
+ if ((!t1 || sKind === 13) && t === type$.Function)
+ return true;
+ t2 = sKind === 11;
+ if (t2 && t === type$.Record)
+ return true;
+ if (tKind === 13) {
+ if (s === type$.JavaScriptFunction)
+ return true;
+ if (sKind !== 13)
+ return false;
+ sBounds = s._rest;
+ tBounds = t._rest;
+ sLength = sBounds.length;
+ if (sLength !== tBounds.length)
+ return false;
+ sEnv = sEnv == null ? sBounds : sBounds.concat(sEnv);
+ tEnv = tEnv == null ? tBounds : tBounds.concat(tEnv);
+ for (i = 0; i < sLength; ++i) {
+ sBound = sBounds[i];
+ tBound = tBounds[i];
+ if (!A._isSubtype(universe, sBound, sEnv, tBound, tEnv, false) || !A._isSubtype(universe, tBound, tEnv, sBound, sEnv, false))
+ return false;
+ }
+ return A._isFunctionSubtype(universe, s._primary, sEnv, t._primary, tEnv, false);
+ }
+ if (tKind === 12) {
+ if (s === type$.JavaScriptFunction)
+ return true;
+ if (t1)
+ return false;
+ return A._isFunctionSubtype(universe, s, sEnv, t, tEnv, false);
+ }
+ if (sKind === 9) {
+ if (tKind !== 9)
+ return false;
+ return A._isInterfaceSubtype(universe, s, sEnv, t, tEnv, false);
+ }
+ if (t2 && tKind === 11)
+ return A._isRecordSubtype(universe, s, sEnv, t, tEnv, false);
+ return false;
+ },
+ _isFunctionSubtype(universe, s, sEnv, t, tEnv, isLegacy) {
+ var sParameters, tParameters, sRequiredPositional, tRequiredPositional, sRequiredPositionalLength, tRequiredPositionalLength, requiredPositionalDelta, sOptionalPositional, tOptionalPositional, sOptionalPositionalLength, tOptionalPositionalLength, i, t1, sNamed, tNamed, sNamedLength, tNamedLength, sIndex, tIndex, tName, sName, sIsRequired;
+ if (!A._isSubtype(universe, s._primary, sEnv, t._primary, tEnv, false))
+ return false;
+ sParameters = s._rest;
+ tParameters = t._rest;
+ sRequiredPositional = sParameters._requiredPositional;
+ tRequiredPositional = tParameters._requiredPositional;
+ sRequiredPositionalLength = sRequiredPositional.length;
+ tRequiredPositionalLength = tRequiredPositional.length;
+ if (sRequiredPositionalLength > tRequiredPositionalLength)
+ return false;
+ requiredPositionalDelta = tRequiredPositionalLength - sRequiredPositionalLength;
+ sOptionalPositional = sParameters._optionalPositional;
+ tOptionalPositional = tParameters._optionalPositional;
+ sOptionalPositionalLength = sOptionalPositional.length;
+ tOptionalPositionalLength = tOptionalPositional.length;
+ if (sRequiredPositionalLength + sOptionalPositionalLength < tRequiredPositionalLength + tOptionalPositionalLength)
+ return false;
+ for (i = 0; i < sRequiredPositionalLength; ++i) {
+ t1 = sRequiredPositional[i];
+ if (!A._isSubtype(universe, tRequiredPositional[i], tEnv, t1, sEnv, false))
+ return false;
+ }
+ for (i = 0; i < requiredPositionalDelta; ++i) {
+ t1 = sOptionalPositional[i];
+ if (!A._isSubtype(universe, tRequiredPositional[sRequiredPositionalLength + i], tEnv, t1, sEnv, false))
+ return false;
+ }
+ for (i = 0; i < tOptionalPositionalLength; ++i) {
+ t1 = sOptionalPositional[requiredPositionalDelta + i];
+ if (!A._isSubtype(universe, tOptionalPositional[i], tEnv, t1, sEnv, false))
+ return false;
+ }
+ sNamed = sParameters._named;
+ tNamed = tParameters._named;
+ sNamedLength = sNamed.length;
+ tNamedLength = tNamed.length;
+ for (sIndex = 0, tIndex = 0; tIndex < tNamedLength; tIndex += 3) {
+ tName = tNamed[tIndex];
+ for (; true;) {
+ if (sIndex >= sNamedLength)
+ return false;
+ sName = sNamed[sIndex];
+ sIndex += 3;
+ if (tName < sName)
+ return false;
+ sIsRequired = sNamed[sIndex - 2];
+ if (sName < tName) {
+ if (sIsRequired)
+ return false;
+ continue;
+ }
+ t1 = tNamed[tIndex + 1];
+ if (sIsRequired && !t1)
+ return false;
+ t1 = sNamed[sIndex - 1];
+ if (!A._isSubtype(universe, tNamed[tIndex + 2], tEnv, t1, sEnv, false))
+ return false;
+ break;
+ }
+ }
+ for (; sIndex < sNamedLength;) {
+ if (sNamed[sIndex + 1])
+ return false;
+ sIndex += 3;
+ }
+ return true;
+ },
+ _isInterfaceSubtype(universe, s, sEnv, t, tEnv, isLegacy) {
+ var rule, recipes, $length, supertypeArgs, i,
+ sName = s._primary,
+ tName = t._primary;
+ for (; sName !== tName;) {
+ rule = universe.tR[sName];
+ if (rule == null)
+ return false;
+ if (typeof rule == "string") {
+ sName = rule;
+ continue;
+ }
+ recipes = rule[tName];
+ if (recipes == null)
+ return false;
+ $length = recipes.length;
+ supertypeArgs = $length > 0 ? new Array($length) : init.typeUniverse.sEA;
+ for (i = 0; i < $length; ++i)
+ supertypeArgs[i] = A._Universe_evalInEnvironment(universe, s, recipes[i]);
+ return A._areArgumentsSubtypes(universe, supertypeArgs, null, sEnv, t._rest, tEnv, false);
+ }
+ return A._areArgumentsSubtypes(universe, s._rest, null, sEnv, t._rest, tEnv, false);
+ },
+ _areArgumentsSubtypes(universe, sArgs, sVariances, sEnv, tArgs, tEnv, isLegacy) {
+ var i,
+ $length = sArgs.length;
+ for (i = 0; i < $length; ++i)
+ if (!A._isSubtype(universe, sArgs[i], sEnv, tArgs[i], tEnv, false))
+ return false;
+ return true;
+ },
+ _isRecordSubtype(universe, s, sEnv, t, tEnv, isLegacy) {
+ var i,
+ sFields = s._rest,
+ tFields = t._rest,
+ sCount = sFields.length;
+ if (sCount !== tFields.length)
+ return false;
+ if (s._primary !== t._primary)
+ return false;
+ for (i = 0; i < sCount; ++i)
+ if (!A._isSubtype(universe, sFields[i], sEnv, tFields[i], tEnv, false))
+ return false;
+ return true;
+ },
+ isNullable(t) {
+ var t1,
+ kind = t._kind;
+ if (!(t === type$.Null || t === type$.JSNull))
+ if (!A.isSoundTopType(t))
+ if (kind !== 7)
+ if (!(kind === 6 && A.isNullable(t._primary)))
+ t1 = kind === 8 && A.isNullable(t._primary);
+ else
+ t1 = true;
+ else
+ t1 = true;
+ else
+ t1 = true;
+ else
+ t1 = true;
+ return t1;
+ },
+ isDefinitelyTopType(t) {
+ var t1;
+ if (!A.isSoundTopType(t))
+ t1 = t === type$.legacy_Object;
+ else
+ t1 = true;
+ return t1;
+ },
+ isSoundTopType(t) {
+ var kind = t._kind;
+ return kind === 2 || kind === 3 || kind === 4 || kind === 5 || t === type$.nullable_Object;
+ },
+ _Utils_objectAssign(o, other) {
+ var i, key,
+ keys = Object.keys(other),
+ $length = keys.length;
+ for (i = 0; i < $length; ++i) {
+ key = keys[i];
+ o[key] = other[key];
+ }
+ },
+ _Utils_newArrayOrEmpty($length) {
+ return $length > 0 ? new Array($length) : init.typeUniverse.sEA;
+ },
+ Rti: function Rti(t0, t1) {
+ var _ = this;
+ _._as = t0;
+ _._is = t1;
+ _._cachedRuntimeType = _._specializedTestResource = _._isSubtypeCache = _._precomputed1 = null;
+ _._kind = 0;
+ _._canonicalRecipe = _._bindCache = _._evalCache = _._rest = _._primary = null;
+ },
+ _FunctionParameters: function _FunctionParameters() {
+ this._named = this._optionalPositional = this._requiredPositional = null;
+ },
+ _Type: function _Type(t0) {
+ this._rti = t0;
+ },
+ _Error: function _Error() {
+ },
+ _TypeError: function _TypeError(t0) {
+ this.__rti$_message = t0;
+ },
+ _AsyncRun__initializeScheduleImmediate() {
+ var div, span, t1 = {};
+ if (self.scheduleImmediate != null)
+ return A.async__AsyncRun__scheduleImmediateJsOverride$closure();
+ if (self.MutationObserver != null && self.document != null) {
+ div = self.document.createElement("div");
+ span = self.document.createElement("span");
+ t1.storedCallback = null;
+ new self.MutationObserver(A.convertDartClosureToJS(new A._AsyncRun__initializeScheduleImmediate_internalCallback(t1), 1)).observe(div, {childList: true});
+ return new A._AsyncRun__initializeScheduleImmediate_closure(t1, div, span);
+ } else if (self.setImmediate != null)
+ return A.async__AsyncRun__scheduleImmediateWithSetImmediate$closure();
+ return A.async__AsyncRun__scheduleImmediateWithTimer$closure();
+ },
+ _AsyncRun__scheduleImmediateJsOverride(callback) {
+ self.scheduleImmediate(A.convertDartClosureToJS(new A._AsyncRun__scheduleImmediateJsOverride_internalCallback(type$.void_Function._as(callback)), 0));
+ },
+ _AsyncRun__scheduleImmediateWithSetImmediate(callback) {
+ self.setImmediate(A.convertDartClosureToJS(new A._AsyncRun__scheduleImmediateWithSetImmediate_internalCallback(type$.void_Function._as(callback)), 0));
+ },
+ _AsyncRun__scheduleImmediateWithTimer(callback) {
+ A.Timer__createTimer(B.Duration_0, type$.void_Function._as(callback));
+ },
+ Timer__createTimer(duration, callback) {
+ return A._TimerImpl$(duration._duration / 1000 | 0, callback);
+ },
+ _TimerImpl$(milliseconds, callback) {
+ var t1 = new A._TimerImpl();
+ t1._TimerImpl$2(milliseconds, callback);
+ return t1;
+ },
+ _makeAsyncAwaitCompleter($T) {
+ return new A._AsyncAwaitCompleter(new A._Future($.Zone__current, $T._eval$1("_Future<0>")), $T._eval$1("_AsyncAwaitCompleter<0>"));
+ },
+ _asyncStartSync(bodyFunction, completer) {
+ bodyFunction.call$2(0, null);
+ completer.isSync = true;
+ return completer._future;
+ },
+ _asyncAwait(object, bodyFunction) {
+ A._awaitOnObject(object, bodyFunction);
+ },
+ _asyncReturn(object, completer) {
+ completer.complete$1(object);
+ },
+ _asyncRethrow(object, completer) {
+ completer.completeError$2(A.unwrapException(object), A.getTraceFromException(object));
+ },
+ _awaitOnObject(object, bodyFunction) {
+ var t1, future,
+ thenCallback = new A._awaitOnObject_closure(bodyFunction),
+ errorCallback = new A._awaitOnObject_closure0(bodyFunction);
+ if (object instanceof A._Future)
+ object._thenAwait$1$2(thenCallback, errorCallback, type$.dynamic);
+ else {
+ t1 = type$.dynamic;
+ if (object instanceof A._Future)
+ object.then$1$2$onError(thenCallback, errorCallback, t1);
+ else {
+ future = new A._Future($.Zone__current, type$._Future_dynamic);
+ future._state = 8;
+ future._resultOrListeners = object;
+ future._thenAwait$1$2(thenCallback, errorCallback, t1);
+ }
+ }
+ },
+ _wrapJsFunctionForAsync($function) {
+ var $protected = function(fn, ERROR) {
+ return function(errorCode, result) {
+ while (true) {
+ try {
+ fn(errorCode, result);
+ break;
+ } catch (error) {
+ result = error;
+ errorCode = ERROR;
+ }
+ }
+ };
+ }($function, 1);
+ return $.Zone__current.registerBinaryCallback$3$1(new A._wrapJsFunctionForAsync_closure($protected), type$.void, type$.int, type$.dynamic);
+ },
+ AsyncError$(error, stackTrace) {
+ var t1 = A.checkNotNullable(error, "error", type$.Object);
+ return new A.AsyncError(t1, stackTrace == null ? A.AsyncError_defaultStackTrace(error) : stackTrace);
+ },
+ AsyncError_defaultStackTrace(error) {
+ var stackTrace;
+ if (type$.Error._is(error)) {
+ stackTrace = error.get$stackTrace();
+ if (stackTrace != null)
+ return stackTrace;
+ }
+ return B._StringStackTrace_3uE;
+ },
+ Future_Future$sync(computation, $T) {
+ var result, error, stackTrace, future, replacement, t1, exception;
+ try {
+ result = computation.call$0();
+ t1 = $T._eval$1("Future<0>")._is(result) ? result : A._Future$value(result, $T);
+ return t1;
+ } catch (exception) {
+ error = A.unwrapException(exception);
+ stackTrace = A.getTraceFromException(exception);
+ future = new A._Future($.Zone__current, $T._eval$1("_Future<0>"));
+ type$.Object._as(error);
+ type$.nullable_StackTrace._as(stackTrace);
+ replacement = null;
+ if (replacement != null)
+ future._asyncCompleteError$2(replacement.get$error(), replacement.get$stackTrace());
+ else
+ future._asyncCompleteError$2(error, stackTrace);
+ return future;
+ }
+ },
+ Future_Future$value(value, $T) {
+ var t1 = value == null ? $T._as(value) : value,
+ t2 = new A._Future($.Zone__current, $T._eval$1("_Future<0>"));
+ t2._asyncComplete$1(t1);
+ return t2;
+ },
+ Completer_Completer($T) {
+ return new A._AsyncCompleter(new A._Future($.Zone__current, $T._eval$1("_Future<0>")), $T._eval$1("_AsyncCompleter<0>"));
+ },
+ _Future$value(value, $T) {
+ var t1 = new A._Future($.Zone__current, $T._eval$1("_Future<0>"));
+ $T._as(value);
+ t1._state = 8;
+ t1._resultOrListeners = value;
+ return t1;
+ },
+ _Future__chainCoreFutureSync(source, target) {
+ var t1, t2, listeners;
+ for (t1 = type$._Future_dynamic; t2 = source._state, (t2 & 4) !== 0;)
+ source = t1._as(source._resultOrListeners);
+ if ((t2 & 24) !== 0) {
+ listeners = target._removeListeners$0();
+ target._cloneResult$1(source);
+ A._Future__propagateToListeners(target, listeners);
+ } else {
+ listeners = type$.nullable__FutureListener_dynamic_dynamic._as(target._resultOrListeners);
+ target._setChained$1(source);
+ source._prependListeners$1(listeners);
+ }
+ },
+ _Future__chainCoreFutureAsync(source, target) {
+ var t2, t3, listeners, _box_0 = {},
+ t1 = _box_0.source = source;
+ for (t2 = type$._Future_dynamic; t3 = t1._state, (t3 & 4) !== 0; t1 = source) {
+ source = t2._as(t1._resultOrListeners);
+ _box_0.source = source;
+ }
+ if ((t3 & 24) === 0) {
+ listeners = type$.nullable__FutureListener_dynamic_dynamic._as(target._resultOrListeners);
+ target._setChained$1(t1);
+ _box_0.source._prependListeners$1(listeners);
+ return;
+ }
+ if ((t3 & 16) === 0 && target._resultOrListeners == null) {
+ target._cloneResult$1(t1);
+ return;
+ }
+ target._state ^= 2;
+ A._rootScheduleMicrotask(null, null, target._zone, type$.void_Function._as(new A._Future__chainCoreFutureAsync_closure(_box_0, target)));
+ },
+ _Future__propagateToListeners(source, listeners) {
+ var t2, t3, t4, _box_0, t5, t6, hasError, asyncError, nextListener, nextListener0, sourceResult, t7, zone, oldZone, result, current, _box_1 = {},
+ t1 = _box_1.source = source;
+ for (t2 = type$.AsyncError, t3 = type$.nullable__FutureListener_dynamic_dynamic, t4 = type$.Future_dynamic; true;) {
+ _box_0 = {};
+ t5 = t1._state;
+ t6 = (t5 & 16) === 0;
+ hasError = !t6;
+ if (listeners == null) {
+ if (hasError && (t5 & 1) === 0) {
+ asyncError = t2._as(t1._resultOrListeners);
+ A._rootHandleError(asyncError.error, asyncError.stackTrace);
+ }
+ return;
+ }
+ _box_0.listener = listeners;
+ nextListener = listeners._nextListener;
+ for (t1 = listeners; nextListener != null; t1 = nextListener, nextListener = nextListener0) {
+ t1._nextListener = null;
+ A._Future__propagateToListeners(_box_1.source, t1);
+ _box_0.listener = nextListener;
+ nextListener0 = nextListener._nextListener;
+ }
+ t5 = _box_1.source;
+ sourceResult = t5._resultOrListeners;
+ _box_0.listenerHasError = hasError;
+ _box_0.listenerValueOrError = sourceResult;
+ if (t6) {
+ t7 = t1.state;
+ t7 = (t7 & 1) !== 0 || (t7 & 15) === 8;
+ } else
+ t7 = true;
+ if (t7) {
+ zone = t1.result._zone;
+ if (hasError) {
+ t5 = t5._zone === zone;
+ t5 = !(t5 || t5);
+ } else
+ t5 = false;
+ if (t5) {
+ t2._as(sourceResult);
+ A._rootHandleError(sourceResult.error, sourceResult.stackTrace);
+ return;
+ }
+ oldZone = $.Zone__current;
+ if (oldZone !== zone)
+ $.Zone__current = zone;
+ else
+ oldZone = null;
+ t1 = t1.state;
+ if ((t1 & 15) === 8)
+ new A._Future__propagateToListeners_handleWhenCompleteCallback(_box_0, _box_1, hasError).call$0();
+ else if (t6) {
+ if ((t1 & 1) !== 0)
+ new A._Future__propagateToListeners_handleValueCallback(_box_0, sourceResult).call$0();
+ } else if ((t1 & 2) !== 0)
+ new A._Future__propagateToListeners_handleError(_box_1, _box_0).call$0();
+ if (oldZone != null)
+ $.Zone__current = oldZone;
+ t1 = _box_0.listenerValueOrError;
+ if (t1 instanceof A._Future) {
+ t5 = _box_0.listener.$ti;
+ t5 = t5._eval$1("Future<2>")._is(t1) || !t5._rest[1]._is(t1);
+ } else
+ t5 = false;
+ if (t5) {
+ t4._as(t1);
+ result = _box_0.listener.result;
+ if ((t1._state & 24) !== 0) {
+ current = t3._as(result._resultOrListeners);
+ result._resultOrListeners = null;
+ listeners = result._reverseListeners$1(current);
+ result._state = t1._state & 30 | result._state & 1;
+ result._resultOrListeners = t1._resultOrListeners;
+ _box_1.source = t1;
+ continue;
+ } else
+ A._Future__chainCoreFutureSync(t1, result);
+ return;
+ }
+ }
+ result = _box_0.listener.result;
+ current = t3._as(result._resultOrListeners);
+ result._resultOrListeners = null;
+ listeners = result._reverseListeners$1(current);
+ t1 = _box_0.listenerHasError;
+ t5 = _box_0.listenerValueOrError;
+ if (!t1) {
+ result.$ti._precomputed1._as(t5);
+ result._state = 8;
+ result._resultOrListeners = t5;
+ } else {
+ t2._as(t5);
+ result._state = result._state & 1 | 16;
+ result._resultOrListeners = t5;
+ }
+ _box_1.source = result;
+ t1 = result;
+ }
+ },
+ _registerErrorHandler(errorHandler, zone) {
+ var t1;
+ if (type$.dynamic_Function_Object_StackTrace._is(errorHandler))
+ return zone.registerBinaryCallback$3$1(errorHandler, type$.dynamic, type$.Object, type$.StackTrace);
+ t1 = type$.dynamic_Function_Object;
+ if (t1._is(errorHandler))
+ return t1._as(errorHandler);
+ throw A.wrapException(A.ArgumentError$value(errorHandler, "onError", string$.Error_));
+ },
+ _microtaskLoop() {
+ var entry, next;
+ for (entry = $._nextCallback; entry != null; entry = $._nextCallback) {
+ $._lastPriorityCallback = null;
+ next = entry.next;
+ $._nextCallback = next;
+ if (next == null)
+ $._lastCallback = null;
+ entry.callback.call$0();
+ }
+ },
+ _startMicrotaskLoop() {
+ $._isInCallbackLoop = true;
+ try {
+ A._microtaskLoop();
+ } finally {
+ $._lastPriorityCallback = null;
+ $._isInCallbackLoop = false;
+ if ($._nextCallback != null)
+ $.$get$_AsyncRun__scheduleImmediateClosure().call$1(A.async___startMicrotaskLoop$closure());
+ }
+ },
+ _scheduleAsyncCallback(callback) {
+ var newEntry = new A._AsyncCallbackEntry(callback),
+ lastCallback = $._lastCallback;
+ if (lastCallback == null) {
+ $._nextCallback = $._lastCallback = newEntry;
+ if (!$._isInCallbackLoop)
+ $.$get$_AsyncRun__scheduleImmediateClosure().call$1(A.async___startMicrotaskLoop$closure());
+ } else
+ $._lastCallback = lastCallback.next = newEntry;
+ },
+ _schedulePriorityAsyncCallback(callback) {
+ var entry, lastPriorityCallback, next,
+ t1 = $._nextCallback;
+ if (t1 == null) {
+ A._scheduleAsyncCallback(callback);
+ $._lastPriorityCallback = $._lastCallback;
+ return;
+ }
+ entry = new A._AsyncCallbackEntry(callback);
+ lastPriorityCallback = $._lastPriorityCallback;
+ if (lastPriorityCallback == null) {
+ entry.next = t1;
+ $._nextCallback = $._lastPriorityCallback = entry;
+ } else {
+ next = lastPriorityCallback.next;
+ entry.next = next;
+ $._lastPriorityCallback = lastPriorityCallback.next = entry;
+ if (next == null)
+ $._lastCallback = entry;
+ }
+ },
+ scheduleMicrotask(callback) {
+ var _null = null,
+ currentZone = $.Zone__current;
+ if (B.C__RootZone === currentZone) {
+ A._rootScheduleMicrotask(_null, _null, B.C__RootZone, callback);
+ return;
+ }
+ A._rootScheduleMicrotask(_null, _null, currentZone, type$.void_Function._as(currentZone.bindCallbackGuarded$1(callback)));
+ },
+ StreamIterator_StreamIterator(stream, $T) {
+ A.checkNotNullable(stream, "stream", type$.Object);
+ return new A._StreamIterator($T._eval$1("_StreamIterator<0>"));
+ },
+ StreamController_StreamController($T) {
+ var _null = null;
+ return new A._AsyncStreamController(_null, _null, _null, _null, $T._eval$1("_AsyncStreamController<0>"));
+ },
+ _runGuarded(notificationHandler) {
+ return;
+ },
+ _BufferingStreamSubscription__registerDataHandler(zone, handleData, $T) {
+ var t1 = handleData == null ? A.async___nullDataHandler$closure() : handleData;
+ return type$.$env_1_1_void._bind$1($T)._eval$1("1(2)")._as(t1);
+ },
+ _BufferingStreamSubscription__registerErrorHandler(zone, handleError) {
+ if (handleError == null)
+ handleError = A.async___nullErrorHandler$closure();
+ if (type$.void_Function_Object_StackTrace._is(handleError))
+ return zone.registerBinaryCallback$3$1(handleError, type$.dynamic, type$.Object, type$.StackTrace);
+ if (type$.void_Function_Object._is(handleError))
+ return type$.dynamic_Function_Object._as(handleError);
+ throw A.wrapException(A.ArgumentError$("handleError callback must take either an Object (the error), or both an Object (the error) and a StackTrace.", null));
+ },
+ _nullDataHandler(value) {
+ },
+ _nullErrorHandler(error, stackTrace) {
+ A._rootHandleError(type$.Object._as(error), type$.StackTrace._as(stackTrace));
+ },
+ _nullDoneHandler() {
+ },
+ _cancelAndValue(subscription, future, value) {
+ var cancelFuture = subscription.cancel$0(),
+ t1 = $.$get$Future__nullFuture();
+ if (cancelFuture !== t1)
+ cancelFuture.whenComplete$1(new A._cancelAndValue_closure(future, value));
+ else
+ future._complete$1(value);
+ },
+ Timer_Timer(duration, callback) {
+ var t1 = $.Zone__current;
+ if (t1 === B.C__RootZone)
+ return A.Timer__createTimer(duration, type$.void_Function._as(callback));
+ return A.Timer__createTimer(duration, type$.void_Function._as(t1.bindCallbackGuarded$1(callback)));
+ },
+ _rootHandleError(error, stackTrace) {
+ A._schedulePriorityAsyncCallback(new A._rootHandleError_closure(error, stackTrace));
+ },
+ _rootRun($self, $parent, zone, f, $R) {
+ var old,
+ t1 = $.Zone__current;
+ if (t1 === zone)
+ return f.call$0();
+ $.Zone__current = zone;
+ old = t1;
+ try {
+ t1 = f.call$0();
+ return t1;
+ } finally {
+ $.Zone__current = old;
+ }
+ },
+ _rootRunUnary($self, $parent, zone, f, arg, $R, $T) {
+ var old,
+ t1 = $.Zone__current;
+ if (t1 === zone)
+ return f.call$1(arg);
+ $.Zone__current = zone;
+ old = t1;
+ try {
+ t1 = f.call$1(arg);
+ return t1;
+ } finally {
+ $.Zone__current = old;
+ }
+ },
+ _rootRunBinary($self, $parent, zone, f, arg1, arg2, $R, T1, T2) {
+ var old,
+ t1 = $.Zone__current;
+ if (t1 === zone)
+ return f.call$2(arg1, arg2);
+ $.Zone__current = zone;
+ old = t1;
+ try {
+ t1 = f.call$2(arg1, arg2);
+ return t1;
+ } finally {
+ $.Zone__current = old;
+ }
+ },
+ _rootScheduleMicrotask($self, $parent, zone, f) {
+ type$.void_Function._as(f);
+ if (B.C__RootZone !== zone)
+ f = zone.bindCallbackGuarded$1(f);
+ A._scheduleAsyncCallback(f);
+ },
+ _AsyncRun__initializeScheduleImmediate_internalCallback: function _AsyncRun__initializeScheduleImmediate_internalCallback(t0) {
+ this._box_0 = t0;
+ },
+ _AsyncRun__initializeScheduleImmediate_closure: function _AsyncRun__initializeScheduleImmediate_closure(t0, t1, t2) {
+ this._box_0 = t0;
+ this.div = t1;
+ this.span = t2;
+ },
+ _AsyncRun__scheduleImmediateJsOverride_internalCallback: function _AsyncRun__scheduleImmediateJsOverride_internalCallback(t0) {
+ this.callback = t0;
+ },
+ _AsyncRun__scheduleImmediateWithSetImmediate_internalCallback: function _AsyncRun__scheduleImmediateWithSetImmediate_internalCallback(t0) {
+ this.callback = t0;
+ },
+ _TimerImpl: function _TimerImpl() {
+ this._handle = null;
+ },
+ _TimerImpl_internalCallback: function _TimerImpl_internalCallback(t0, t1) {
+ this.$this = t0;
+ this.callback = t1;
+ },
+ _AsyncAwaitCompleter: function _AsyncAwaitCompleter(t0, t1) {
+ this._future = t0;
+ this.isSync = false;
+ this.$ti = t1;
+ },
+ _awaitOnObject_closure: function _awaitOnObject_closure(t0) {
+ this.bodyFunction = t0;
+ },
+ _awaitOnObject_closure0: function _awaitOnObject_closure0(t0) {
+ this.bodyFunction = t0;
+ },
+ _wrapJsFunctionForAsync_closure: function _wrapJsFunctionForAsync_closure(t0) {
+ this.$protected = t0;
+ },
+ AsyncError: function AsyncError(t0, t1) {
+ this.error = t0;
+ this.stackTrace = t1;
+ },
+ _Completer: function _Completer() {
+ },
+ _AsyncCompleter: function _AsyncCompleter(t0, t1) {
+ this.future = t0;
+ this.$ti = t1;
+ },
+ _SyncCompleter: function _SyncCompleter(t0, t1) {
+ this.future = t0;
+ this.$ti = t1;
+ },
+ _FutureListener: function _FutureListener(t0, t1, t2, t3, t4) {
+ var _ = this;
+ _._nextListener = null;
+ _.result = t0;
+ _.state = t1;
+ _.callback = t2;
+ _.errorCallback = t3;
+ _.$ti = t4;
+ },
+ _Future: function _Future(t0, t1) {
+ var _ = this;
+ _._state = 0;
+ _._zone = t0;
+ _._resultOrListeners = null;
+ _.$ti = t1;
+ },
+ _Future__addListener_closure: function _Future__addListener_closure(t0, t1) {
+ this.$this = t0;
+ this.listener = t1;
+ },
+ _Future__prependListeners_closure: function _Future__prependListeners_closure(t0, t1) {
+ this._box_0 = t0;
+ this.$this = t1;
+ },
+ _Future__chainForeignFuture_closure: function _Future__chainForeignFuture_closure(t0) {
+ this.$this = t0;
+ },
+ _Future__chainForeignFuture_closure0: function _Future__chainForeignFuture_closure0(t0) {
+ this.$this = t0;
+ },
+ _Future__chainForeignFuture_closure1: function _Future__chainForeignFuture_closure1(t0, t1, t2) {
+ this.$this = t0;
+ this.e = t1;
+ this.s = t2;
+ },
+ _Future__chainCoreFutureAsync_closure: function _Future__chainCoreFutureAsync_closure(t0, t1) {
+ this._box_0 = t0;
+ this.target = t1;
+ },
+ _Future__asyncCompleteWithValue_closure: function _Future__asyncCompleteWithValue_closure(t0, t1) {
+ this.$this = t0;
+ this.value = t1;
+ },
+ _Future__asyncCompleteError_closure: function _Future__asyncCompleteError_closure(t0, t1, t2) {
+ this.$this = t0;
+ this.error = t1;
+ this.stackTrace = t2;
+ },
+ _Future__propagateToListeners_handleWhenCompleteCallback: function _Future__propagateToListeners_handleWhenCompleteCallback(t0, t1, t2) {
+ this._box_0 = t0;
+ this._box_1 = t1;
+ this.hasError = t2;
+ },
+ _Future__propagateToListeners_handleWhenCompleteCallback_closure: function _Future__propagateToListeners_handleWhenCompleteCallback_closure(t0) {
+ this.originalSource = t0;
+ },
+ _Future__propagateToListeners_handleValueCallback: function _Future__propagateToListeners_handleValueCallback(t0, t1) {
+ this._box_0 = t0;
+ this.sourceResult = t1;
+ },
+ _Future__propagateToListeners_handleError: function _Future__propagateToListeners_handleError(t0, t1) {
+ this._box_1 = t0;
+ this._box_0 = t1;
+ },
+ _AsyncCallbackEntry: function _AsyncCallbackEntry(t0) {
+ this.callback = t0;
+ this.next = null;
+ },
+ Stream: function Stream() {
+ },
+ Stream_length_closure: function Stream_length_closure(t0, t1) {
+ this._box_0 = t0;
+ this.$this = t1;
+ },
+ Stream_length_closure0: function Stream_length_closure0(t0, t1) {
+ this._box_0 = t0;
+ this.future = t1;
+ },
+ Stream_first_closure: function Stream_first_closure(t0) {
+ this.future = t0;
+ },
+ Stream_first_closure0: function Stream_first_closure0(t0, t1, t2) {
+ this.$this = t0;
+ this.subscription = t1;
+ this.future = t2;
+ },
+ _StreamController: function _StreamController() {
+ },
+ _StreamController__subscribe_closure: function _StreamController__subscribe_closure(t0) {
+ this.$this = t0;
+ },
+ _StreamController__recordCancel_complete: function _StreamController__recordCancel_complete(t0) {
+ this.$this = t0;
+ },
+ _AsyncStreamControllerDispatch: function _AsyncStreamControllerDispatch() {
+ },
+ _AsyncStreamController: function _AsyncStreamController(t0, t1, t2, t3, t4) {
+ var _ = this;
+ _._varData = null;
+ _._state = 0;
+ _._doneFuture = null;
+ _.onListen = t0;
+ _.onPause = t1;
+ _.onResume = t2;
+ _.onCancel = t3;
+ _.$ti = t4;
+ },
+ _ControllerStream: function _ControllerStream(t0, t1) {
+ this._controller = t0;
+ this.$ti = t1;
+ },
+ _ControllerSubscription: function _ControllerSubscription(t0, t1, t2, t3, t4, t5, t6) {
+ var _ = this;
+ _._controller = t0;
+ _._onData = t1;
+ _._onError = t2;
+ _._onDone = t3;
+ _._zone = t4;
+ _._state = t5;
+ _._pending = _._cancelFuture = null;
+ _.$ti = t6;
+ },
+ _StreamSinkWrapper: function _StreamSinkWrapper(t0, t1) {
+ this._async$_target = t0;
+ this.$ti = t1;
+ },
+ _BufferingStreamSubscription: function _BufferingStreamSubscription() {
+ },
+ _BufferingStreamSubscription_asFuture_closure: function _BufferingStreamSubscription_asFuture_closure(t0, t1) {
+ this._box_0 = t0;
+ this.result = t1;
+ },
+ _BufferingStreamSubscription_asFuture_closure0: function _BufferingStreamSubscription_asFuture_closure0(t0, t1) {
+ this.$this = t0;
+ this.result = t1;
+ },
+ _BufferingStreamSubscription_asFuture__closure: function _BufferingStreamSubscription_asFuture__closure(t0, t1, t2) {
+ this.result = t0;
+ this.error = t1;
+ this.stackTrace = t2;
+ },
+ _BufferingStreamSubscription__sendError_sendError: function _BufferingStreamSubscription__sendError_sendError(t0, t1, t2) {
+ this.$this = t0;
+ this.error = t1;
+ this.stackTrace = t2;
+ },
+ _BufferingStreamSubscription__sendDone_sendDone: function _BufferingStreamSubscription__sendDone_sendDone(t0) {
+ this.$this = t0;
+ },
+ _StreamImpl: function _StreamImpl() {
+ },
+ _DelayedEvent: function _DelayedEvent() {
+ },
+ _DelayedData: function _DelayedData(t0, t1) {
+ this.value = t0;
+ this.next = null;
+ this.$ti = t1;
+ },
+ _DelayedError: function _DelayedError(t0, t1) {
+ this.error = t0;
+ this.stackTrace = t1;
+ this.next = null;
+ },
+ _DelayedDone: function _DelayedDone() {
+ },
+ _PendingEvents: function _PendingEvents(t0) {
+ var _ = this;
+ _._state = 0;
+ _.lastPendingEvent = _.firstPendingEvent = null;
+ _.$ti = t0;
+ },
+ _PendingEvents_schedule_closure: function _PendingEvents_schedule_closure(t0, t1) {
+ this.$this = t0;
+ this.dispatch = t1;
+ },
+ _StreamIterator: function _StreamIterator(t0) {
+ this.$ti = t0;
+ },
+ _cancelAndValue_closure: function _cancelAndValue_closure(t0, t1) {
+ this.future = t0;
+ this.value = t1;
+ },
+ _Zone: function _Zone() {
+ },
+ _rootHandleError_closure: function _rootHandleError_closure(t0, t1) {
+ this.error = t0;
+ this.stackTrace = t1;
+ },
+ _RootZone: function _RootZone() {
+ },
+ _RootZone_bindCallbackGuarded_closure: function _RootZone_bindCallbackGuarded_closure(t0, t1) {
+ this.$this = t0;
+ this.f = t1;
+ },
+ _RootZone_bindUnaryCallbackGuarded_closure: function _RootZone_bindUnaryCallbackGuarded_closure(t0, t1, t2) {
+ this.$this = t0;
+ this.f = t1;
+ this.T = t2;
+ },
+ _HashMap__getTableEntry(table, key) {
+ var entry = table[key];
+ return entry === table ? null : entry;
+ },
+ _HashMap__setTableEntry(table, key, value) {
+ if (value == null)
+ table[key] = table;
+ else
+ table[key] = value;
+ },
+ _HashMap__newHashTable() {
+ var table = Object.create(null);
+ A._HashMap__setTableEntry(table, "<non-identifier-key>", table);
+ delete table["<non-identifier-key>"];
+ return table;
+ },
+ LinkedHashMap_LinkedHashMap$_empty($K, $V) {
+ return new A.JsLinkedHashMap($K._eval$1("@<0>")._bind$1($V)._eval$1("JsLinkedHashMap<1,2>"));
+ },
+ MapBase_mapToString(m) {
+ var result, t1 = {};
+ if (A.isToStringVisiting(m))
+ return "{...}";
+ result = new A.StringBuffer("");
+ try {
+ B.JSArray_methods.add$1($.toStringVisiting, m);
+ result._contents += "{";
+ t1.first = true;
+ m.forEach$1(0, new A.MapBase_mapToString_closure(t1, result));
+ result._contents += "}";
+ } finally {
+ if (0 >= $.toStringVisiting.length)
+ return A.ioore($.toStringVisiting, -1);
+ $.toStringVisiting.pop();
+ }
+ t1 = result._contents;
+ return t1.charCodeAt(0) == 0 ? t1 : t1;
+ },
+ ListQueue$($E) {
+ return new A.ListQueue(A.List_List$filled(A.ListQueue__calculateCapacity(null), null, false, $E._eval$1("0?")), $E._eval$1("ListQueue<0>"));
+ },
+ ListQueue__calculateCapacity(initialCapacity) {
+ return 8;
+ },
+ _HashMap: function _HashMap() {
+ },
+ _IdentityHashMap: function _IdentityHashMap(t0) {
+ var _ = this;
+ _._collection$_length = 0;
+ _._collection$_keys = _._collection$_rest = _._collection$_nums = _._collection$_strings = null;
+ _.$ti = t0;
+ },
+ _HashMapKeyIterable: function _HashMapKeyIterable(t0, t1) {
+ this._collection$_map = t0;
+ this.$ti = t1;
+ },
+ _HashMapKeyIterator: function _HashMapKeyIterator(t0, t1, t2) {
+ var _ = this;
+ _._collection$_map = t0;
+ _._collection$_keys = t1;
+ _._offset = 0;
+ _._collection$_current = null;
+ _.$ti = t2;
+ },
+ ListBase: function ListBase() {
+ },
+ MapBase: function MapBase() {
+ },
+ MapBase_mapToString_closure: function MapBase_mapToString_closure(t0, t1) {
+ this._box_0 = t0;
+ this.result = t1;
+ },
+ _UnmodifiableMapMixin: function _UnmodifiableMapMixin() {
+ },
+ MapView: function MapView() {
+ },
+ UnmodifiableMapView: function UnmodifiableMapView() {
+ },
+ ListQueue: function ListQueue(t0, t1) {
+ var _ = this;
+ _._table = t0;
+ _._modificationCount = _._tail = _._head = 0;
+ _.$ti = t1;
+ },
+ _ListQueueIterator: function _ListQueueIterator(t0, t1, t2, t3, t4) {
+ var _ = this;
+ _._queue = t0;
+ _._end = t1;
+ _._modificationCount = t2;
+ _._position = t3;
+ _._collection$_current = null;
+ _.$ti = t4;
+ },
+ _UnmodifiableMapView_MapView__UnmodifiableMapMixin: function _UnmodifiableMapView_MapView__UnmodifiableMapMixin() {
+ },
+ _parseJson(source, reviver) {
+ var e, exception, t1, parsed = null;
+ try {
+ parsed = JSON.parse(source);
+ } catch (exception) {
+ e = A.unwrapException(exception);
+ t1 = A.FormatException$(String(e), null, null);
+ throw A.wrapException(t1);
+ }
+ t1 = A._convertJsonToDartLazy(parsed);
+ return t1;
+ },
+ _convertJsonToDartLazy(object) {
+ var i;
+ if (object == null)
+ return null;
+ if (typeof object != "object")
+ return object;
+ if (Object.getPrototypeOf(object) !== Array.prototype)
+ return new A._JsonMap(object, Object.create(null));
+ for (i = 0; i < object.length; ++i)
+ object[i] = A._convertJsonToDartLazy(object[i]);
+ return object;
+ },
+ JsonUnsupportedObjectError$(unsupportedObject, cause, partialResult) {
+ return new A.JsonUnsupportedObjectError(unsupportedObject, cause);
+ },
+ _defaultToEncodable(object) {
+ return object.toJson$0();
+ },
+ _JsonStringStringifier$(_sink, _toEncodable) {
+ return new A._JsonStringStringifier(_sink, [], A.convert___defaultToEncodable$closure());
+ },
+ _JsonStringStringifier_stringify(object, toEncodable, indent) {
+ var t1,
+ output = new A.StringBuffer(""),
+ stringifier = A._JsonStringStringifier$(output, toEncodable);
+ stringifier.writeObject$1(object);
+ t1 = output._contents;
+ return t1.charCodeAt(0) == 0 ? t1 : t1;
+ },
+ _JsonMap: function _JsonMap(t0, t1) {
+ this._original = t0;
+ this._processed = t1;
+ this._data = null;
+ },
+ _JsonMapKeyIterable: function _JsonMapKeyIterable(t0) {
+ this._parent = t0;
+ },
+ Codec: function Codec() {
+ },
+ Converter: function Converter() {
+ },
+ JsonUnsupportedObjectError: function JsonUnsupportedObjectError(t0, t1) {
+ this.unsupportedObject = t0;
+ this.cause = t1;
+ },
+ JsonCyclicError: function JsonCyclicError(t0, t1) {
+ this.unsupportedObject = t0;
+ this.cause = t1;
+ },
+ JsonCodec: function JsonCodec() {
+ },
+ JsonEncoder: function JsonEncoder(t0) {
+ this._toEncodable = t0;
+ },
+ JsonDecoder: function JsonDecoder(t0) {
+ this._reviver = t0;
+ },
+ _JsonStringifier: function _JsonStringifier() {
+ },
+ _JsonStringifier_writeMap_closure: function _JsonStringifier_writeMap_closure(t0, t1) {
+ this._box_0 = t0;
+ this.keyValueList = t1;
+ },
+ _JsonStringStringifier: function _JsonStringStringifier(t0, t1, t2) {
+ this._sink = t0;
+ this._seen = t1;
+ this._toEncodable = t2;
+ },
+ int_parse(source, radix) {
+ var value = A.Primitives_parseInt(source, radix);
+ if (value != null)
+ return value;
+ throw A.wrapException(A.FormatException$(source, null, null));
+ },
+ Error__throw(error, stackTrace) {
+ error = A.wrapException(error);
+ if (error == null)
+ error = type$.Object._as(error);
+ error.stack = stackTrace.toString$0(0);
+ throw error;
+ throw A.wrapException("unreachable");
+ },
+ List_List$filled($length, fill, growable, $E) {
+ var i,
+ result = growable ? J.JSArray_JSArray$growable($length, $E) : J.JSArray_JSArray$fixed($length, $E);
+ if ($length !== 0 && fill != null)
+ for (i = 0; i < result.length; ++i)
+ result[i] = fill;
+ return result;
+ },
+ List_List$of(elements, growable, $E) {
+ var t1 = A.List_List$_of(elements, $E);
+ return t1;
+ },
+ List_List$_of(elements, $E) {
+ var list, t1;
+ if (Array.isArray(elements))
+ return A._setArrayType(elements.slice(0), $E._eval$1("JSArray<0>"));
+ list = A._setArrayType([], $E._eval$1("JSArray<0>"));
+ for (t1 = J.get$iterator$ax(elements); t1.moveNext$0();)
+ B.JSArray_methods.add$1(list, t1.get$current());
+ return list;
+ },
+ StringBuffer__writeAll(string, objects, separator) {
+ var iterator = J.get$iterator$ax(objects);
+ if (!iterator.moveNext$0())
+ return string;
+ if (separator.length === 0) {
+ do
+ string += A.S(iterator.get$current());
+ while (iterator.moveNext$0());
+ } else {
+ string += A.S(iterator.get$current());
+ for (; iterator.moveNext$0();)
+ string = string + separator + A.S(iterator.get$current());
+ }
+ return string;
+ },
+ NoSuchMethodError_NoSuchMethodError$withInvocation(receiver, invocation) {
+ return new A.NoSuchMethodError(receiver, invocation.get$memberName(), invocation.get$positionalArguments(), invocation.get$namedArguments());
+ },
+ StackTrace_current() {
+ return A.getTraceFromException(new Error());
+ },
+ DateTime__fourDigits(n) {
+ var absN = Math.abs(n),
+ sign = n < 0 ? "-" : "";
+ if (absN >= 1000)
+ return "" + n;
+ if (absN >= 100)
+ return sign + "0" + absN;
+ if (absN >= 10)
+ return sign + "00" + absN;
+ return sign + "000" + absN;
+ },
+ DateTime__threeDigits(n) {
+ if (n >= 100)
+ return "" + n;
+ if (n >= 10)
+ return "0" + n;
+ return "00" + n;
+ },
+ DateTime__twoDigits(n) {
+ if (n >= 10)
+ return "" + n;
+ return "0" + n;
+ },
+ Error_safeToString(object) {
+ if (typeof object == "number" || A._isBool(object) || object == null)
+ return J.toString$0$(object);
+ if (typeof object == "string")
+ return JSON.stringify(object);
+ return A.Primitives_safeToString(object);
+ },
+ Error_throwWithStackTrace(error, stackTrace) {
+ A.checkNotNullable(error, "error", type$.Object);
+ A.checkNotNullable(stackTrace, "stackTrace", type$.StackTrace);
+ A.Error__throw(error, stackTrace);
+ },
+ AssertionError$(message) {
+ return new A.AssertionError(message);
+ },
+ ArgumentError$(message, $name) {
+ return new A.ArgumentError(false, null, $name, message);
+ },
+ ArgumentError$value(value, $name, message) {
+ return new A.ArgumentError(true, value, $name, message);
+ },
+ ArgumentError$notNull($name) {
+ return new A.ArgumentError(false, null, $name, "Must not be null");
+ },
+ RangeError$(message) {
+ var _null = null;
+ return new A.RangeError(_null, _null, false, _null, _null, message);
+ },
+ RangeError$value(value, $name) {
+ return new A.RangeError(null, null, true, value, $name, "Value not in range");
+ },
+ RangeError$range(invalidValue, minValue, maxValue, $name, message) {
+ return new A.RangeError(minValue, maxValue, true, invalidValue, $name, "Invalid value");
+ },
+ RangeError_checkValidRange(start, end, $length) {
+ if (0 > start || start > $length)
+ throw A.wrapException(A.RangeError$range(start, 0, $length, "start", null));
+ if (end != null) {
+ if (start > end || end > $length)
+ throw A.wrapException(A.RangeError$range(end, start, $length, "end", null));
+ return end;
+ }
+ return $length;
+ },
+ RangeError_checkNotNegative(value, $name) {
+ if (value < 0)
+ throw A.wrapException(A.RangeError$range(value, 0, null, $name, null));
+ return value;
+ },
+ IndexError$withLength(invalidValue, $length, indexable, message, $name) {
+ return new A.IndexError($length, true, invalidValue, $name, "Index out of range");
+ },
+ UnsupportedError$(message) {
+ return new A.UnsupportedError(message);
+ },
+ UnimplementedError$(message) {
+ return new A.UnimplementedError(message);
+ },
+ StateError$(message) {
+ return new A.StateError(message);
+ },
+ ConcurrentModificationError$(modifiedObject) {
+ return new A.ConcurrentModificationError(modifiedObject);
+ },
+ FormatException$(message, source, offset) {
+ return new A.FormatException(message, source, offset);
+ },
+ Iterable_iterableToShortString(iterable, leftDelimiter, rightDelimiter) {
+ var parts, t1;
+ if (A.isToStringVisiting(iterable)) {
+ if (leftDelimiter === "(" && rightDelimiter === ")")
+ return "(...)";
+ return leftDelimiter + "..." + rightDelimiter;
+ }
+ parts = A._setArrayType([], type$.JSArray_String);
+ B.JSArray_methods.add$1($.toStringVisiting, iterable);
+ try {
+ A._iterablePartsToStrings(iterable, parts);
+ } finally {
+ if (0 >= $.toStringVisiting.length)
+ return A.ioore($.toStringVisiting, -1);
+ $.toStringVisiting.pop();
+ }
+ t1 = A.StringBuffer__writeAll(leftDelimiter, type$.Iterable_dynamic._as(parts), ", ") + rightDelimiter;
+ return t1.charCodeAt(0) == 0 ? t1 : t1;
+ },
+ Iterable_iterableToFullString(iterable, leftDelimiter, rightDelimiter) {
+ var buffer, t1;
+ if (A.isToStringVisiting(iterable))
+ return leftDelimiter + "..." + rightDelimiter;
+ buffer = new A.StringBuffer(leftDelimiter);
+ B.JSArray_methods.add$1($.toStringVisiting, iterable);
+ try {
+ t1 = buffer;
+ t1._contents = A.StringBuffer__writeAll(t1._contents, iterable, ", ");
+ } finally {
+ if (0 >= $.toStringVisiting.length)
+ return A.ioore($.toStringVisiting, -1);
+ $.toStringVisiting.pop();
+ }
+ buffer._contents += rightDelimiter;
+ t1 = buffer._contents;
+ return t1.charCodeAt(0) == 0 ? t1 : t1;
+ },
+ _iterablePartsToStrings(iterable, parts) {
+ var next, ultimateString, penultimateString, penultimate, ultimate, ultimate0, elision,
+ it = iterable.get$iterator(iterable),
+ $length = 0, count = 0;
+ while (true) {
+ if (!($length < 80 || count < 3))
+ break;
+ if (!it.moveNext$0())
+ return;
+ next = A.S(it.get$current());
+ B.JSArray_methods.add$1(parts, next);
+ $length += next.length + 2;
+ ++count;
+ }
+ if (!it.moveNext$0()) {
+ if (count <= 5)
+ return;
+ if (0 >= parts.length)
+ return A.ioore(parts, -1);
+ ultimateString = parts.pop();
+ if (0 >= parts.length)
+ return A.ioore(parts, -1);
+ penultimateString = parts.pop();
+ } else {
+ penultimate = it.get$current();
+ ++count;
+ if (!it.moveNext$0()) {
+ if (count <= 4) {
+ B.JSArray_methods.add$1(parts, A.S(penultimate));
+ return;
+ }
+ ultimateString = A.S(penultimate);
+ if (0 >= parts.length)
+ return A.ioore(parts, -1);
+ penultimateString = parts.pop();
+ $length += ultimateString.length + 2;
+ } else {
+ ultimate = it.get$current();
+ ++count;
+ for (; it.moveNext$0(); penultimate = ultimate, ultimate = ultimate0) {
+ ultimate0 = it.get$current();
+ ++count;
+ if (count > 100) {
+ while (true) {
+ if (!($length > 75 && count > 3))
+ break;
+ if (0 >= parts.length)
+ return A.ioore(parts, -1);
+ $length -= parts.pop().length + 2;
+ --count;
+ }
+ B.JSArray_methods.add$1(parts, "...");
+ return;
+ }
+ }
+ penultimateString = A.S(penultimate);
+ ultimateString = A.S(ultimate);
+ $length += ultimateString.length + penultimateString.length + 4;
+ }
+ }
+ if (count > parts.length + 2) {
+ $length += 5;
+ elision = "...";
+ } else
+ elision = null;
+ while (true) {
+ if (!($length > 80 && parts.length > 3))
+ break;
+ if (0 >= parts.length)
+ return A.ioore(parts, -1);
+ $length -= parts.pop().length + 2;
+ if (elision == null) {
+ $length += 5;
+ elision = "...";
+ }
+ }
+ if (elision != null)
+ B.JSArray_methods.add$1(parts, elision);
+ B.JSArray_methods.add$1(parts, penultimateString);
+ B.JSArray_methods.add$1(parts, ultimateString);
+ },
+ NoSuchMethodError_toString_closure: function NoSuchMethodError_toString_closure(t0, t1) {
+ this._box_0 = t0;
+ this.sb = t1;
+ },
+ DateTime: function DateTime(t0, t1) {
+ this._value = t0;
+ this.isUtc = t1;
+ },
+ Duration: function Duration(t0) {
+ this._duration = t0;
+ },
+ Error: function Error() {
+ },
+ AssertionError: function AssertionError(t0) {
+ this.message = t0;
+ },
+ TypeError: function TypeError() {
+ },
+ ArgumentError: function ArgumentError(t0, t1, t2, t3) {
+ var _ = this;
+ _._hasValue = t0;
+ _.invalidValue = t1;
+ _.name = t2;
+ _.message = t3;
+ },
+ RangeError: function RangeError(t0, t1, t2, t3, t4, t5) {
+ var _ = this;
+ _.start = t0;
+ _.end = t1;
+ _._hasValue = t2;
+ _.invalidValue = t3;
+ _.name = t4;
+ _.message = t5;
+ },
+ IndexError: function IndexError(t0, t1, t2, t3, t4) {
+ var _ = this;
+ _.length = t0;
+ _._hasValue = t1;
+ _.invalidValue = t2;
+ _.name = t3;
+ _.message = t4;
+ },
+ NoSuchMethodError: function NoSuchMethodError(t0, t1, t2, t3) {
+ var _ = this;
+ _._core$_receiver = t0;
+ _._core$_memberName = t1;
+ _._core$_arguments = t2;
+ _._namedArguments = t3;
+ },
+ UnsupportedError: function UnsupportedError(t0) {
+ this.message = t0;
+ },
+ UnimplementedError: function UnimplementedError(t0) {
+ this.message = t0;
+ },
+ StateError: function StateError(t0) {
+ this.message = t0;
+ },
+ ConcurrentModificationError: function ConcurrentModificationError(t0) {
+ this.modifiedObject = t0;
+ },
+ OutOfMemoryError: function OutOfMemoryError() {
+ },
+ StackOverflowError: function StackOverflowError() {
+ },
+ _Exception: function _Exception(t0) {
+ this.message = t0;
+ },
+ FormatException: function FormatException(t0, t1, t2) {
+ this.message = t0;
+ this.source = t1;
+ this.offset = t2;
+ },
+ Iterable: function Iterable() {
+ },
+ Null: function Null() {
+ },
+ Object: function Object() {
+ },
+ _StringStackTrace: function _StringStackTrace(t0) {
+ this._stackTrace = t0;
+ },
+ StringBuffer: function StringBuffer(t0) {
+ this._contents = t0;
+ },
+ _convertDartFunctionFast(f) {
+ var ret,
+ existing = f.$dart_jsFunction;
+ if (existing != null)
+ return existing;
+ ret = function(_call, f) {
+ return function() {
+ return _call(f, Array.prototype.slice.apply(arguments));
+ };
+ }(A._callDartFunctionFast, f);
+ ret[$.$get$DART_CLOSURE_PROPERTY_NAME()] = f;
+ f.$dart_jsFunction = ret;
+ return ret;
+ },
+ _callDartFunctionFast(callback, $arguments) {
+ type$.List_dynamic._as($arguments);
+ type$.Function._as(callback);
+ return A.Primitives_applyFunction(callback, $arguments, null);
+ },
+ allowInterop(f, $F) {
+ if (typeof f == "function")
+ return f;
+ else
+ return $F._as(A._convertDartFunctionFast(f));
+ },
+ promiseToFuture(jsPromise, $T) {
+ var t1 = new A._Future($.Zone__current, $T._eval$1("_Future<0>")),
+ completer = new A._AsyncCompleter(t1, $T._eval$1("_AsyncCompleter<0>"));
+ jsPromise.then(A.convertDartClosureToJS(new A.promiseToFuture_closure(completer, $T), 1), A.convertDartClosureToJS(new A.promiseToFuture_closure0(completer), 1));
+ return t1;
+ },
+ _noDartifyRequired(o) {
+ return o == null || typeof o === "boolean" || typeof o === "number" || typeof o === "string" || o instanceof Int8Array || o instanceof Uint8Array || o instanceof Uint8ClampedArray || o instanceof Int16Array || o instanceof Uint16Array || o instanceof Int32Array || o instanceof Uint32Array || o instanceof Float32Array || o instanceof Float64Array || o instanceof ArrayBuffer || o instanceof DataView;
+ },
+ dartify(o) {
+ if (A._noDartifyRequired(o))
+ return o;
+ return new A.dartify_convert(new A._IdentityHashMap(type$._IdentityHashMap_of_nullable_Object_and_nullable_Object)).call$1(o);
+ },
+ promiseToFuture_closure: function promiseToFuture_closure(t0, t1) {
+ this.completer = t0;
+ this.T = t1;
+ },
+ promiseToFuture_closure0: function promiseToFuture_closure0(t0) {
+ this.completer = t0;
+ },
+ dartify_convert: function dartify_convert(t0) {
+ this._convertedObjects = t0;
+ },
+ NullRejectionException: function NullRejectionException(t0) {
+ this.isUndefined = t0;
+ },
+ _JSRandom: function _JSRandom() {
+ },
+ AsyncMemoizer: function AsyncMemoizer(t0, t1) {
+ this._completer = t0;
+ this.$ti = t1;
+ },
+ Level: function Level(t0, t1) {
+ this.name = t0;
+ this.value = t1;
+ },
+ LogRecord: function LogRecord(t0, t1, t2) {
+ this.level = t0;
+ this.message = t1;
+ this.loggerName = t2;
+ },
+ Logger_Logger($name) {
+ return $.Logger__loggers.putIfAbsent$2($name, new A.Logger_Logger_closure($name));
+ },
+ Logger: function Logger(t0, t1, t2) {
+ var _ = this;
+ _.name = t0;
+ _.parent = t1;
+ _._level = null;
+ _._children = t2;
+ },
+ Logger_Logger_closure: function Logger_Logger_closure(t0) {
+ this.name = t0;
+ },
+ Pool: function Pool(t0, t1, t2, t3, t4) {
+ var _ = this;
+ _._requestedResources = t0;
+ _._onReleaseCallbacks = t1;
+ _._onReleaseCompleters = t2;
+ _._maxAllocatedResources = t3;
+ _._allocatedResources = 0;
+ _._timer = null;
+ _._closeMemo = t4;
+ },
+ Pool__runOnRelease_closure: function Pool__runOnRelease_closure(t0) {
+ this.$this = t0;
+ },
+ Pool__runOnRelease_closure0: function Pool__runOnRelease_closure0(t0) {
+ this.$this = t0;
+ },
+ PoolResource: function PoolResource(t0) {
+ this._pool = t0;
+ this._released = false;
+ },
+ SseClient$(serverUrl) {
+ var t3, t4, t5,
+ t1 = type$.String,
+ t2 = A.StreamController_StreamController(t1);
+ t1 = A.StreamController_StreamController(t1);
+ t3 = A.Logger_Logger("SseClient");
+ t4 = $.Zone__current;
+ t5 = A.generateUuidV4();
+ t1 = new A.SseClient(t5, t2, t1, t3, new A._AsyncCompleter(new A._Future(t4, type$._Future_void), type$._AsyncCompleter_void));
+ t1.SseClient$2$debugKey(serverUrl, null);
+ return t1;
+ },
+ SseClient: function SseClient(t0, t1, t2, t3, t4) {
+ var _ = this;
+ _._clientId = t0;
+ _._incomingController = t1;
+ _._outgoingController = t2;
+ _._logger = t3;
+ _._onConnected = t4;
+ _._lastMessageId = -1;
+ _.__SseClient__serverUrl_A = _.__SseClient__eventSource_A = $;
+ _._errorTimer = null;
+ },
+ SseClient_closure: function SseClient_closure(t0) {
+ this.$this = t0;
+ },
+ SseClient_closure0: function SseClient_closure0(t0) {
+ this.$this = t0;
+ },
+ SseClient_closure1: function SseClient_closure1(t0) {
+ this.$this = t0;
+ },
+ SseClient__closure: function SseClient__closure(t0, t1) {
+ this.$this = t0;
+ this.error = t1;
+ },
+ SseClient__onOutgoingMessage_closure: function SseClient__onOutgoingMessage_closure(t0, t1, t2) {
+ this._box_0 = t0;
+ this.$this = t1;
+ this.message = t2;
+ },
+ generateUuidV4() {
+ var t1 = new A.generateUuidV4_printDigits(),
+ t2 = new A.generateUuidV4_bitsDigits(t1, new A.generateUuidV4_generateBits(B.C__JSRandom)),
+ t3 = B.C__JSRandom.nextInt$1(4);
+ return A.S(t2.call$2(16, 4)) + A.S(t2.call$2(16, 4)) + "-" + A.S(t2.call$2(16, 4)) + "-4" + A.S(t2.call$2(12, 3)) + "-" + A.S(t1.call$2(8 + t3, 1)) + A.S(t2.call$2(12, 3)) + "-" + A.S(t2.call$2(16, 4)) + A.S(t2.call$2(16, 4)) + A.S(t2.call$2(16, 4));
+ },
+ generateUuidV4_generateBits: function generateUuidV4_generateBits(t0) {
+ this.random = t0;
+ },
+ generateUuidV4_printDigits: function generateUuidV4_printDigits() {
+ },
+ generateUuidV4_bitsDigits: function generateUuidV4_bitsDigits(t0, t1) {
+ this.printDigits = t0;
+ this.generateBits = t1;
+ },
+ StreamChannelMixin: function StreamChannelMixin() {
+ },
+ _EventStreamSubscription$(_target, _eventType, onData, _useCapture, $T) {
+ var t1;
+ if (onData == null)
+ t1 = null;
+ else {
+ t1 = A._wrapZone(new A._EventStreamSubscription_closure(onData), type$.JSObject);
+ t1 = t1 == null ? null : type$.JavaScriptFunction._as(A.allowInterop(t1, type$.Function));
+ }
+ t1 = new A._EventStreamSubscription(_target, _eventType, t1, false, $T._eval$1("_EventStreamSubscription<0>"));
+ t1._tryResume$0();
+ return t1;
+ },
+ _wrapZone(callback, $T) {
+ var t1 = $.Zone__current;
+ if (t1 === B.C__RootZone)
+ return callback;
+ return t1.bindUnaryCallbackGuarded$1$1(callback, $T);
+ },
+ EventStreamProvider: function EventStreamProvider(t0, t1) {
+ this._eventType = t0;
+ this.$ti = t1;
+ },
+ _EventStream: function _EventStream(t0, t1, t2, t3) {
+ var _ = this;
+ _._target = t0;
+ _._eventType = t1;
+ _._useCapture = t2;
+ _.$ti = t3;
+ },
+ _ElementEventStreamImpl: function _ElementEventStreamImpl(t0, t1, t2, t3) {
+ var _ = this;
+ _._target = t0;
+ _._eventType = t1;
+ _._useCapture = t2;
+ _.$ti = t3;
+ },
+ _EventStreamSubscription: function _EventStreamSubscription(t0, t1, t2, t3, t4) {
+ var _ = this;
+ _._target = t0;
+ _._eventType = t1;
+ _._streams$_onData = t2;
+ _._useCapture = t3;
+ _.$ti = t4;
+ },
+ _EventStreamSubscription_closure: function _EventStreamSubscription_closure(t0) {
+ this.onData = t0;
+ },
+ _EventStreamSubscription_onData_closure: function _EventStreamSubscription_onData_closure(t0) {
+ this.handleData = t0;
+ },
+ main() {
+ var t2,
+ channel = A.SseClient$("/test"),
+ t1 = type$.nullable_JSObject._as(type$.JSObject._as(self.document).querySelector("button"));
+ t1.toString;
+ t2 = type$._ElementEventStreamImpl_JSObject;
+ A._EventStreamSubscription$(t1, "click", t2._eval$1("~(1)?")._as(new A.main_closure(channel)), false, t2._precomputed1);
+ t2 = channel._incomingController;
+ new A._ControllerStream(t2, A._instanceType(t2)._eval$1("_ControllerStream<1>")).listen$1(new A.main_closure0(channel));
+ },
+ main_closure: function main_closure(t0) {
+ this.channel = t0;
+ },
+ main_closure0: function main_closure0(t0) {
+ this.channel = t0;
+ },
+ throwLateFieldNI(fieldName) {
+ A.throwExpressionWithWrapper(new A.LateError("Field '" + fieldName + "' has not been initialized."), new Error());
+ },
+ throwLateFieldADI(fieldName) {
+ A.throwExpressionWithWrapper(new A.LateError("Field '" + fieldName + "' has been assigned during initialization."), new Error());
+ }
+ },
+ B = {};
+ var holders = [A, J, B];
+ var $ = {};
+ A.JS_CONST.prototype = {};
+ J.Interceptor.prototype = {
+ $eq(receiver, other) {
+ return receiver === other;
+ },
+ get$hashCode(receiver) {
+ return A.Primitives_objectHashCode(receiver);
+ },
+ toString$0(receiver) {
+ return "Instance of '" + A.Primitives_objectTypeName(receiver) + "'";
+ },
+ noSuchMethod$1(receiver, invocation) {
+ throw A.wrapException(A.NoSuchMethodError_NoSuchMethodError$withInvocation(receiver, type$.Invocation._as(invocation)));
+ },
+ get$runtimeType(receiver) {
+ return A.createRuntimeType(A._instanceTypeFromConstructor(this));
+ }
+ };
+ J.JSBool.prototype = {
+ toString$0(receiver) {
+ return String(receiver);
+ },
+ get$hashCode(receiver) {
+ return receiver ? 519018 : 218159;
+ },
+ get$runtimeType(receiver) {
+ return A.createRuntimeType(type$.bool);
+ },
+ $isTrustedGetRuntimeType: 1,
+ $isbool: 1
+ };
+ J.JSNull.prototype = {
+ $eq(receiver, other) {
+ return null == other;
+ },
+ toString$0(receiver) {
+ return "null";
+ },
+ get$hashCode(receiver) {
+ return 0;
+ },
+ $isTrustedGetRuntimeType: 1,
+ $isNull: 1
+ };
+ J.JavaScriptObject.prototype = {$isJSObject: 1};
+ J.LegacyJavaScriptObject.prototype = {
+ get$hashCode(receiver) {
+ return 0;
+ },
+ toString$0(receiver) {
+ return String(receiver);
+ }
+ };
+ J.PlainJavaScriptObject.prototype = {};
+ J.UnknownJavaScriptObject.prototype = {};
+ J.JavaScriptFunction.prototype = {
+ toString$0(receiver) {
+ var dartClosure = receiver[$.$get$DART_CLOSURE_PROPERTY_NAME()];
+ if (dartClosure == null)
+ return this.super$LegacyJavaScriptObject$toString(receiver);
+ return "JavaScript function for " + J.toString$0$(dartClosure);
+ },
+ $isFunction: 1
+ };
+ J.JavaScriptBigInt.prototype = {
+ get$hashCode(receiver) {
+ return 0;
+ },
+ toString$0(receiver) {
+ return String(receiver);
+ }
+ };
+ J.JavaScriptSymbol.prototype = {
+ get$hashCode(receiver) {
+ return 0;
+ },
+ toString$0(receiver) {
+ return String(receiver);
+ }
+ };
+ J.JSArray.prototype = {
+ add$1(receiver, value) {
+ A._arrayInstanceType(receiver)._precomputed1._as(value);
+ if (!!receiver.fixed$length)
+ A.throwExpression(A.UnsupportedError$("add"));
+ receiver.push(value);
+ },
+ addAll$1(receiver, collection) {
+ var t1;
+ A._arrayInstanceType(receiver)._eval$1("Iterable<1>")._as(collection);
+ if (!!receiver.fixed$length)
+ A.throwExpression(A.UnsupportedError$("addAll"));
+ if (Array.isArray(collection)) {
+ this._addAllFromArray$1(receiver, collection);
+ return;
+ }
+ for (t1 = J.get$iterator$ax(collection); t1.moveNext$0();)
+ receiver.push(t1.get$current());
+ },
+ _addAllFromArray$1(receiver, array) {
+ var len, i;
+ type$.JSArray_dynamic._as(array);
+ len = array.length;
+ if (len === 0)
+ return;
+ if (receiver === array)
+ throw A.wrapException(A.ConcurrentModificationError$(receiver));
+ for (i = 0; i < len; ++i)
+ receiver.push(array[i]);
+ },
+ get$last(receiver) {
+ var t1 = receiver.length;
+ if (t1 > 0)
+ return receiver[t1 - 1];
+ throw A.wrapException(A.IterableElementError_noElement());
+ },
+ setRange$4(receiver, start, end, iterable, skipCount) {
+ var $length, otherList, t1, i;
+ A._arrayInstanceType(receiver)._eval$1("Iterable<1>")._as(iterable);
+ if (!!receiver.immutable$list)
+ A.throwExpression(A.UnsupportedError$("setRange"));
+ A.RangeError_checkValidRange(start, end, receiver.length);
+ $length = end - start;
+ if ($length === 0)
+ return;
+ A.RangeError_checkNotNegative(skipCount, "skipCount");
+ otherList = iterable;
+ t1 = J.getInterceptor$asx(otherList);
+ if (skipCount + $length > t1.get$length(otherList))
+ throw A.wrapException(A.IterableElementError_tooFew());
+ if (skipCount < start)
+ for (i = $length - 1; i >= 0; --i)
+ receiver[start + i] = t1.$index(otherList, skipCount + i);
+ else
+ for (i = 0; i < $length; ++i)
+ receiver[start + i] = t1.$index(otherList, skipCount + i);
+ },
+ get$isNotEmpty(receiver) {
+ return receiver.length !== 0;
+ },
+ toString$0(receiver) {
+ return A.Iterable_iterableToFullString(receiver, "[", "]");
+ },
+ get$iterator(receiver) {
+ return new J.ArrayIterator(receiver, receiver.length, A._arrayInstanceType(receiver)._eval$1("ArrayIterator<1>"));
+ },
+ get$hashCode(receiver) {
+ return A.Primitives_objectHashCode(receiver);
+ },
+ get$length(receiver) {
+ return receiver.length;
+ },
+ $index(receiver, index) {
+ if (!(index >= 0 && index < receiver.length))
+ throw A.wrapException(A.diagnoseIndexError(receiver, index));
+ return receiver[index];
+ },
+ $indexSet(receiver, index, value) {
+ A._arrayInstanceType(receiver)._precomputed1._as(value);
+ if (!!receiver.immutable$list)
+ A.throwExpression(A.UnsupportedError$("indexed set"));
+ if (!(index >= 0 && index < receiver.length))
+ throw A.wrapException(A.diagnoseIndexError(receiver, index));
+ receiver[index] = value;
+ },
+ $isIterable: 1,
+ $isList: 1
+ };
+ J.JSUnmodifiableArray.prototype = {};
+ J.ArrayIterator.prototype = {
+ get$current() {
+ var t1 = this._current;
+ return t1 == null ? this.$ti._precomputed1._as(t1) : t1;
+ },
+ moveNext$0() {
+ var t2, _this = this,
+ t1 = _this._iterable,
+ $length = t1.length;
+ if (_this._length !== $length) {
+ t1 = A.throwConcurrentModificationError(t1);
+ throw A.wrapException(t1);
+ }
+ t2 = _this._index;
+ if (t2 >= $length) {
+ _this.set$_current(null);
+ return false;
+ }
+ _this.set$_current(t1[t2]);
+ ++_this._index;
+ return true;
+ },
+ set$_current(_current) {
+ this._current = this.$ti._eval$1("1?")._as(_current);
+ }
+ };
+ J.JSNumber.prototype = {
+ toRadixString$1(receiver, radix) {
+ var result, t1, t2, match, exponent;
+ if (radix < 2 || radix > 36)
+ throw A.wrapException(A.RangeError$range(radix, 2, 36, "radix", null));
+ result = receiver.toString(radix);
+ t1 = result.length;
+ t2 = t1 - 1;
+ if (!(t2 >= 0))
+ return A.ioore(result, t2);
+ if (result.charCodeAt(t2) !== 41)
+ return result;
+ match = /^([\da-z]+)(?:\.([\da-z]+))?\(e\+(\d+)\)$/.exec(result);
+ if (match == null)
+ A.throwExpression(A.UnsupportedError$("Unexpected toString result: " + result));
+ t1 = match.length;
+ if (1 >= t1)
+ return A.ioore(match, 1);
+ result = match[1];
+ if (3 >= t1)
+ return A.ioore(match, 3);
+ exponent = +match[3];
+ t1 = match[2];
+ if (t1 != null) {
+ result += t1;
+ exponent -= t1.length;
+ }
+ return result + B.JSString_methods.$mul("0", exponent);
+ },
+ toString$0(receiver) {
+ if (receiver === 0 && 1 / receiver < 0)
+ return "-0.0";
+ else
+ return "" + receiver;
+ },
+ get$hashCode(receiver) {
+ var absolute, floorLog2, factor, scaled,
+ intValue = receiver | 0;
+ if (receiver === intValue)
+ return intValue & 536870911;
+ absolute = Math.abs(receiver);
+ floorLog2 = Math.log(absolute) / 0.6931471805599453 | 0;
+ factor = Math.pow(2, floorLog2);
+ scaled = absolute < 1 ? absolute / factor : factor / absolute;
+ return ((scaled * 9007199254740992 | 0) + (scaled * 3542243181176521 | 0)) * 599197 + floorLog2 * 1259 & 536870911;
+ },
+ _tdivFast$1(receiver, other) {
+ return (receiver | 0) === receiver ? receiver / other | 0 : this._tdivSlow$1(receiver, other);
+ },
+ _tdivSlow$1(receiver, other) {
+ var quotient = receiver / other;
+ if (quotient >= -2147483648 && quotient <= 2147483647)
+ return quotient | 0;
+ if (quotient > 0) {
+ if (quotient !== 1 / 0)
+ return Math.floor(quotient);
+ } else if (quotient > -1 / 0)
+ return Math.ceil(quotient);
+ throw A.wrapException(A.UnsupportedError$("Result of truncating division is " + A.S(quotient) + ": " + A.S(receiver) + " ~/ " + other));
+ },
+ _shlPositive$1(receiver, other) {
+ return other > 31 ? 0 : receiver << other >>> 0;
+ },
+ _shrOtherPositive$1(receiver, other) {
+ var t1;
+ if (receiver > 0)
+ t1 = this._shrBothPositive$1(receiver, other);
+ else {
+ t1 = other > 31 ? 31 : other;
+ t1 = receiver >> t1 >>> 0;
+ }
+ return t1;
+ },
+ _shrBothPositive$1(receiver, other) {
+ return other > 31 ? 0 : receiver >>> other;
+ },
+ get$runtimeType(receiver) {
+ return A.createRuntimeType(type$.num);
+ },
+ $isdouble: 1,
+ $isnum: 1
+ };
+ J.JSInt.prototype = {
+ get$runtimeType(receiver) {
+ return A.createRuntimeType(type$.int);
+ },
+ $isTrustedGetRuntimeType: 1,
+ $isint: 1
+ };
+ J.JSNumNotInt.prototype = {
+ get$runtimeType(receiver) {
+ return A.createRuntimeType(type$.double);
+ },
+ $isTrustedGetRuntimeType: 1
+ };
+ J.JSString.prototype = {
+ matchAsPrefix$2(receiver, string, start) {
+ var t1, t2, i, t3, _null = null;
+ if (start < 0 || start > string.length)
+ throw A.wrapException(A.RangeError$range(start, 0, string.length, _null, _null));
+ t1 = receiver.length;
+ t2 = string.length;
+ if (start + t1 > t2)
+ return _null;
+ for (i = 0; i < t1; ++i) {
+ t3 = start + i;
+ if (!(t3 >= 0 && t3 < t2))
+ return A.ioore(string, t3);
+ if (string.charCodeAt(t3) !== receiver.charCodeAt(i))
+ return _null;
+ }
+ return new A.StringMatch(start, receiver);
+ },
+ $add(receiver, other) {
+ return receiver + other;
+ },
+ endsWith$1(receiver, other) {
+ var otherLength = other.length,
+ t1 = receiver.length;
+ if (otherLength > t1)
+ return false;
+ return other === this.substring$1(receiver, t1 - otherLength);
+ },
+ startsWith$2(receiver, pattern, index) {
+ var endIndex;
+ if (index < 0 || index > receiver.length)
+ throw A.wrapException(A.RangeError$range(index, 0, receiver.length, null, null));
+ if (typeof pattern == "string") {
+ endIndex = index + pattern.length;
+ if (endIndex > receiver.length)
+ return false;
+ return pattern === receiver.substring(index, endIndex);
+ }
+ return J.matchAsPrefix$2$s(pattern, receiver, index) != null;
+ },
+ startsWith$1(receiver, pattern) {
+ return this.startsWith$2(receiver, pattern, 0);
+ },
+ substring$2(receiver, start, end) {
+ return receiver.substring(start, A.RangeError_checkValidRange(start, end, receiver.length));
+ },
+ substring$1(receiver, start) {
+ return this.substring$2(receiver, start, null);
+ },
+ $mul(receiver, times) {
+ var s, result;
+ if (0 >= times)
+ return "";
+ if (times === 1 || receiver.length === 0)
+ return receiver;
+ if (times !== times >>> 0)
+ throw A.wrapException(B.C_OutOfMemoryError);
+ for (s = receiver, result = ""; true;) {
+ if ((times & 1) === 1)
+ result = s + result;
+ times = times >>> 1;
+ if (times === 0)
+ break;
+ s += s;
+ }
+ return result;
+ },
+ padLeft$2(receiver, width, padding) {
+ var delta = width - receiver.length;
+ if (delta <= 0)
+ return receiver;
+ return this.$mul(padding, delta) + receiver;
+ },
+ lastIndexOf$2(receiver, pattern, start) {
+ var t1, t2;
+ if (start == null)
+ start = receiver.length;
+ else if (start < 0 || start > receiver.length)
+ throw A.wrapException(A.RangeError$range(start, 0, receiver.length, null, null));
+ t1 = pattern.length;
+ t2 = receiver.length;
+ if (start + t1 > t2)
+ start = t2 - t1;
+ return receiver.lastIndexOf(pattern, start);
+ },
+ lastIndexOf$1(receiver, pattern) {
+ return this.lastIndexOf$2(receiver, pattern, null);
+ },
+ toString$0(receiver) {
+ return receiver;
+ },
+ get$hashCode(receiver) {
+ var t1, hash, i;
+ for (t1 = receiver.length, hash = 0, i = 0; i < t1; ++i) {
+ hash = hash + receiver.charCodeAt(i) & 536870911;
+ hash = hash + ((hash & 524287) << 10) & 536870911;
+ hash ^= hash >> 6;
+ }
+ hash = hash + ((hash & 67108863) << 3) & 536870911;
+ hash ^= hash >> 11;
+ return hash + ((hash & 16383) << 15) & 536870911;
+ },
+ get$runtimeType(receiver) {
+ return A.createRuntimeType(type$.String);
+ },
+ get$length(receiver) {
+ return receiver.length;
+ },
+ $isTrustedGetRuntimeType: 1,
+ $isPattern: 1,
+ $isString: 1
+ };
+ A.LateError.prototype = {
+ toString$0(_) {
+ return "LateInitializationError: " + this._message;
+ }
+ };
+ A.nullFuture_closure.prototype = {
+ call$0() {
+ return A.Future_Future$value(null, type$.Null);
+ },
+ $signature: 7
+ };
+ A.EfficientLengthIterable.prototype = {};
+ A.ListIterable.prototype = {
+ get$iterator(_) {
+ var _this = this;
+ return new A.ListIterator(_this, _this.get$length(_this), A._instanceType(_this)._eval$1("ListIterator<ListIterable.E>"));
+ },
+ get$isEmpty(_) {
+ return this.get$length(this) === 0;
+ }
+ };
+ A.ListIterator.prototype = {
+ get$current() {
+ var t1 = this.__internal$_current;
+ return t1 == null ? this.$ti._precomputed1._as(t1) : t1;
+ },
+ moveNext$0() {
+ var t3, _this = this,
+ t1 = _this.__internal$_iterable,
+ t2 = J.getInterceptor$asx(t1),
+ $length = t2.get$length(t1);
+ if (_this.__internal$_length !== $length)
+ throw A.wrapException(A.ConcurrentModificationError$(t1));
+ t3 = _this.__internal$_index;
+ if (t3 >= $length) {
+ _this.set$__internal$_current(null);
+ return false;
+ }
+ _this.set$__internal$_current(t2.elementAt$1(t1, t3));
+ ++_this.__internal$_index;
+ return true;
+ },
+ set$__internal$_current(_current) {
+ this.__internal$_current = this.$ti._eval$1("1?")._as(_current);
+ }
+ };
+ A.FixedLengthListMixin.prototype = {};
+ A.Symbol.prototype = {
+ get$hashCode(_) {
+ var hash = this._hashCode;
+ if (hash != null)
+ return hash;
+ hash = 664597 * B.JSString_methods.get$hashCode(this._name) & 536870911;
+ this._hashCode = hash;
+ return hash;
+ },
+ toString$0(_) {
+ return 'Symbol("' + this._name + '")';
+ },
+ $eq(_, other) {
+ if (other == null)
+ return false;
+ return other instanceof A.Symbol && this._name === other._name;
+ },
+ $isSymbol0: 1
+ };
+ A.ConstantMapView.prototype = {};
+ A.ConstantMap.prototype = {
+ get$isEmpty(_) {
+ return this.get$length(this) === 0;
+ },
+ toString$0(_) {
+ return A.MapBase_mapToString(this);
+ },
+ $isMap: 1
+ };
+ A.ConstantStringMap.prototype = {
+ get$length(_) {
+ return this._values.length;
+ },
+ get$_keys() {
+ var keys = this.$keys;
+ if (keys == null) {
+ keys = Object.keys(this._jsIndex);
+ this.$keys = keys;
+ }
+ return keys;
+ },
+ forEach$1(_, f) {
+ var keys, values, t1, i;
+ this.$ti._eval$1("~(1,2)")._as(f);
+ keys = this.get$_keys();
+ values = this._values;
+ for (t1 = keys.length, i = 0; i < t1; ++i)
+ f.call$2(keys[i], values[i]);
+ }
+ };
+ A.JSInvocationMirror.prototype = {
+ get$memberName() {
+ var t1 = this._memberName;
+ if (t1 instanceof A.Symbol)
+ return t1;
+ return this._memberName = new A.Symbol(A._asString(t1));
+ },
+ get$positionalArguments() {
+ var t1, t2, argumentCount, list, index, _this = this;
+ if (_this.__js_helper$_kind === 1)
+ return B.List_empty;
+ t1 = _this._arguments;
+ t2 = J.getInterceptor$asx(t1);
+ argumentCount = t2.get$length(t1) - J.get$length$asx(_this._namedArgumentNames) - _this._typeArgumentCount;
+ if (argumentCount === 0)
+ return B.List_empty;
+ list = [];
+ for (index = 0; index < argumentCount; ++index)
+ list.push(t2.$index(t1, index));
+ return J.JSArray_markUnmodifiableList(list);
+ },
+ get$namedArguments() {
+ var t1, t2, namedArgumentCount, t3, t4, namedArgumentsStartIndex, map, i, _this = this;
+ if (_this.__js_helper$_kind !== 0)
+ return B.Map_empty;
+ t1 = _this._namedArgumentNames;
+ t2 = J.getInterceptor$asx(t1);
+ namedArgumentCount = t2.get$length(t1);
+ t3 = _this._arguments;
+ t4 = J.getInterceptor$asx(t3);
+ namedArgumentsStartIndex = t4.get$length(t3) - namedArgumentCount - _this._typeArgumentCount;
+ if (namedArgumentCount === 0)
+ return B.Map_empty;
+ map = new A.JsLinkedHashMap(type$.JsLinkedHashMap_Symbol_dynamic);
+ for (i = 0; i < namedArgumentCount; ++i)
+ map.$indexSet(0, new A.Symbol(A._asString(t2.$index(t1, i))), t4.$index(t3, namedArgumentsStartIndex + i));
+ return new A.ConstantMapView(map, type$.ConstantMapView_Symbol_dynamic);
+ },
+ $isInvocation: 1
+ };
+ A.Primitives_functionNoSuchMethod_closure.prototype = {
+ call$2($name, argument) {
+ var t1;
+ A._asString($name);
+ t1 = this._box_0;
+ t1.names = t1.names + "$" + $name;
+ B.JSArray_methods.add$1(this.namedArgumentList, $name);
+ B.JSArray_methods.add$1(this.$arguments, argument);
+ ++t1.argumentCount;
+ },
+ $signature: 12
+ };
+ A.TypeErrorDecoder.prototype = {
+ matchTypeError$1(message) {
+ var result, t1, _this = this,
+ match = new RegExp(_this._pattern).exec(message);
+ if (match == null)
+ return null;
+ result = Object.create(null);
+ t1 = _this._arguments;
+ if (t1 !== -1)
+ result.arguments = match[t1 + 1];
+ t1 = _this._argumentsExpr;
+ if (t1 !== -1)
+ result.argumentsExpr = match[t1 + 1];
+ t1 = _this._expr;
+ if (t1 !== -1)
+ result.expr = match[t1 + 1];
+ t1 = _this._method;
+ if (t1 !== -1)
+ result.method = match[t1 + 1];
+ t1 = _this._receiver;
+ if (t1 !== -1)
+ result.receiver = match[t1 + 1];
+ return result;
+ }
+ };
+ A.NullError.prototype = {
+ toString$0(_) {
+ return "Null check operator used on a null value";
+ }
+ };
+ A.JsNoSuchMethodError.prototype = {
+ toString$0(_) {
+ var t2, _this = this,
+ _s38_ = "NoSuchMethodError: method not found: '",
+ t1 = _this._method;
+ if (t1 == null)
+ return "NoSuchMethodError: " + _this.__js_helper$_message;
+ t2 = _this._receiver;
+ if (t2 == null)
+ return _s38_ + t1 + "' (" + _this.__js_helper$_message + ")";
+ return _s38_ + t1 + "' on '" + t2 + "' (" + _this.__js_helper$_message + ")";
+ }
+ };
+ A.UnknownJsTypeError.prototype = {
+ toString$0(_) {
+ var t1 = this.__js_helper$_message;
+ return t1.length === 0 ? "Error" : "Error: " + t1;
+ }
+ };
+ A.NullThrownFromJavaScriptException.prototype = {
+ toString$0(_) {
+ return "Throw of null ('" + (this._irritant === null ? "null" : "undefined") + "' from JavaScript)";
+ }
+ };
+ A.ExceptionAndStackTrace.prototype = {};
+ A._StackTrace.prototype = {
+ toString$0(_) {
+ var trace,
+ t1 = this._trace;
+ if (t1 != null)
+ return t1;
+ t1 = this._exception;
+ trace = t1 !== null && typeof t1 === "object" ? t1.stack : null;
+ return this._trace = trace == null ? "" : trace;
+ },
+ $isStackTrace: 1
+ };
+ A.Closure.prototype = {
+ toString$0(_) {
+ var $constructor = this.constructor,
+ $name = $constructor == null ? null : $constructor.name;
+ return "Closure '" + A.unminifyOrTag($name == null ? "unknown" : $name) + "'";
+ },
+ $isFunction: 1,
+ get$$call() {
+ return this;
+ },
+ "call*": "call$1",
+ $requiredArgCount: 1,
+ $defaultValues: null
+ };
+ A.Closure0Args.prototype = {"call*": "call$0", $requiredArgCount: 0};
+ A.Closure2Args.prototype = {"call*": "call$2", $requiredArgCount: 2};
+ A.TearOffClosure.prototype = {};
+ A.StaticClosure.prototype = {
+ toString$0(_) {
+ var $name = this.$static_name;
+ if ($name == null)
+ return "Closure of unknown static method";
+ return "Closure '" + A.unminifyOrTag($name) + "'";
+ }
+ };
+ A.BoundClosure.prototype = {
+ $eq(_, other) {
+ if (other == null)
+ return false;
+ if (this === other)
+ return true;
+ if (!(other instanceof A.BoundClosure))
+ return false;
+ return this.$_target === other.$_target && this._receiver === other._receiver;
+ },
+ get$hashCode(_) {
+ return (A.objectHashCode(this._receiver) ^ A.Primitives_objectHashCode(this.$_target)) >>> 0;
+ },
+ toString$0(_) {
+ return "Closure '" + this.$_name + "' of " + ("Instance of '" + A.Primitives_objectTypeName(this._receiver) + "'");
+ }
+ };
+ A._CyclicInitializationError.prototype = {
+ toString$0(_) {
+ return "Reading static variable '" + this.variableName + "' during its initialization";
+ }
+ };
+ A.RuntimeError.prototype = {
+ toString$0(_) {
+ return "RuntimeError: " + this.message;
+ }
+ };
+ A._Required.prototype = {};
+ A.JsLinkedHashMap.prototype = {
+ get$length(_) {
+ return this.__js_helper$_length;
+ },
+ get$isEmpty(_) {
+ return this.__js_helper$_length === 0;
+ },
+ get$keys() {
+ return new A.LinkedHashMapKeyIterable(this, A._instanceType(this)._eval$1("LinkedHashMapKeyIterable<1>"));
+ },
+ containsKey$1(key) {
+ var strings = this._strings;
+ if (strings == null)
+ return false;
+ return strings[key] != null;
+ },
+ $index(_, key) {
+ var strings, cell, t1, nums, _null = null;
+ if (typeof key == "string") {
+ strings = this._strings;
+ if (strings == null)
+ return _null;
+ cell = strings[key];
+ t1 = cell == null ? _null : cell.hashMapCellValue;
+ return t1;
+ } else if (typeof key == "number" && (key & 0x3fffffff) === key) {
+ nums = this._nums;
+ if (nums == null)
+ return _null;
+ cell = nums[key];
+ t1 = cell == null ? _null : cell.hashMapCellValue;
+ return t1;
+ } else
+ return this.internalGet$1(key);
+ },
+ internalGet$1(key) {
+ var bucket, index,
+ rest = this.__js_helper$_rest;
+ if (rest == null)
+ return null;
+ bucket = rest[this.internalComputeHashCode$1(key)];
+ index = this.internalFindBucketIndex$2(bucket, key);
+ if (index < 0)
+ return null;
+ return bucket[index].hashMapCellValue;
+ },
+ $indexSet(_, key, value) {
+ var strings, nums, rest, hash, bucket, index, _this = this,
+ t1 = A._instanceType(_this);
+ t1._precomputed1._as(key);
+ t1._rest[1]._as(value);
+ if (typeof key == "string") {
+ strings = _this._strings;
+ _this._addHashTableEntry$3(strings == null ? _this._strings = _this._newHashTable$0() : strings, key, value);
+ } else if (typeof key == "number" && (key & 0x3fffffff) === key) {
+ nums = _this._nums;
+ _this._addHashTableEntry$3(nums == null ? _this._nums = _this._newHashTable$0() : nums, key, value);
+ } else {
+ rest = _this.__js_helper$_rest;
+ if (rest == null)
+ rest = _this.__js_helper$_rest = _this._newHashTable$0();
+ hash = _this.internalComputeHashCode$1(key);
+ bucket = rest[hash];
+ if (bucket == null)
+ rest[hash] = [_this._newLinkedCell$2(key, value)];
+ else {
+ index = _this.internalFindBucketIndex$2(bucket, key);
+ if (index >= 0)
+ bucket[index].hashMapCellValue = value;
+ else
+ bucket.push(_this._newLinkedCell$2(key, value));
+ }
+ }
+ },
+ putIfAbsent$2(key, ifAbsent) {
+ var t2, value, _this = this,
+ t1 = A._instanceType(_this);
+ t1._precomputed1._as(key);
+ t1._eval$1("2()")._as(ifAbsent);
+ if (_this.containsKey$1(key)) {
+ t2 = _this.$index(0, key);
+ return t2 == null ? t1._rest[1]._as(t2) : t2;
+ }
+ value = ifAbsent.call$0();
+ _this.$indexSet(0, key, value);
+ return value;
+ },
+ forEach$1(_, action) {
+ var cell, modifications, _this = this;
+ A._instanceType(_this)._eval$1("~(1,2)")._as(action);
+ cell = _this._first;
+ modifications = _this._modifications;
+ for (; cell != null;) {
+ action.call$2(cell.hashMapCellKey, cell.hashMapCellValue);
+ if (modifications !== _this._modifications)
+ throw A.wrapException(A.ConcurrentModificationError$(_this));
+ cell = cell._next;
+ }
+ },
+ _addHashTableEntry$3(table, key, value) {
+ var cell,
+ t1 = A._instanceType(this);
+ t1._precomputed1._as(key);
+ t1._rest[1]._as(value);
+ cell = table[key];
+ if (cell == null)
+ table[key] = this._newLinkedCell$2(key, value);
+ else
+ cell.hashMapCellValue = value;
+ },
+ _newLinkedCell$2(key, value) {
+ var _this = this,
+ t1 = A._instanceType(_this),
+ cell = new A.LinkedHashMapCell(t1._precomputed1._as(key), t1._rest[1]._as(value));
+ if (_this._first == null)
+ _this._first = _this._last = cell;
+ else
+ _this._last = _this._last._next = cell;
+ ++_this.__js_helper$_length;
+ _this._modifications = _this._modifications + 1 & 1073741823;
+ return cell;
+ },
+ internalComputeHashCode$1(key) {
+ return J.get$hashCode$(key) & 1073741823;
+ },
+ internalFindBucketIndex$2(bucket, key) {
+ var $length, i;
+ if (bucket == null)
+ return -1;
+ $length = bucket.length;
+ for (i = 0; i < $length; ++i)
+ if (J.$eq$(bucket[i].hashMapCellKey, key))
+ return i;
+ return -1;
+ },
+ toString$0(_) {
+ return A.MapBase_mapToString(this);
+ },
+ _newHashTable$0() {
+ var table = Object.create(null);
+ table["<non-identifier-key>"] = table;
+ delete table["<non-identifier-key>"];
+ return table;
+ }
+ };
+ A.LinkedHashMapCell.prototype = {};
+ A.LinkedHashMapKeyIterable.prototype = {
+ get$length(_) {
+ return this._map.__js_helper$_length;
+ },
+ get$isEmpty(_) {
+ return this._map.__js_helper$_length === 0;
+ },
+ get$iterator(_) {
+ var t1 = this._map,
+ t2 = new A.LinkedHashMapKeyIterator(t1, t1._modifications, this.$ti._eval$1("LinkedHashMapKeyIterator<1>"));
+ t2._cell = t1._first;
+ return t2;
+ }
+ };
+ A.LinkedHashMapKeyIterator.prototype = {
+ get$current() {
+ return this.__js_helper$_current;
+ },
+ moveNext$0() {
+ var cell, _this = this,
+ t1 = _this._map;
+ if (_this._modifications !== t1._modifications)
+ throw A.wrapException(A.ConcurrentModificationError$(t1));
+ cell = _this._cell;
+ if (cell == null) {
+ _this.set$__js_helper$_current(null);
+ return false;
+ } else {
+ _this.set$__js_helper$_current(cell.hashMapCellKey);
+ _this._cell = cell._next;
+ return true;
+ }
+ },
+ set$__js_helper$_current(_current) {
+ this.__js_helper$_current = this.$ti._eval$1("1?")._as(_current);
+ }
+ };
+ A.initHooks_closure.prototype = {
+ call$1(o) {
+ return this.getTag(o);
+ },
+ $signature: 8
+ };
+ A.initHooks_closure0.prototype = {
+ call$2(o, tag) {
+ return this.getUnknownTag(o, tag);
+ },
+ $signature: 13
+ };
+ A.initHooks_closure1.prototype = {
+ call$1(tag) {
+ return this.prototypeForTag(A._asString(tag));
+ },
+ $signature: 14
+ };
+ A.StringMatch.prototype = {};
+ A.NativeByteBuffer.prototype = {
+ get$runtimeType(receiver) {
+ return B.Type_ByteBuffer_RkP;
+ },
+ $isTrustedGetRuntimeType: 1
+ };
+ A.NativeTypedData.prototype = {};
+ A.NativeByteData.prototype = {
+ get$runtimeType(receiver) {
+ return B.Type_ByteData_zNC;
+ },
+ $isTrustedGetRuntimeType: 1
+ };
+ A.NativeTypedArray.prototype = {
+ get$length(receiver) {
+ return receiver.length;
+ },
+ $isJavaScriptIndexingBehavior: 1
+ };
+ A.NativeTypedArrayOfDouble.prototype = {
+ $index(receiver, index) {
+ A._checkValidIndex(index, receiver, receiver.length);
+ return receiver[index];
+ },
+ $isIterable: 1,
+ $isList: 1
+ };
+ A.NativeTypedArrayOfInt.prototype = {$isIterable: 1, $isList: 1};
+ A.NativeFloat32List.prototype = {
+ get$runtimeType(receiver) {
+ return B.Type_Float32List_LB7;
+ },
+ $isTrustedGetRuntimeType: 1
+ };
+ A.NativeFloat64List.prototype = {
+ get$runtimeType(receiver) {
+ return B.Type_Float64List_LB7;
+ },
+ $isTrustedGetRuntimeType: 1
+ };
+ A.NativeInt16List.prototype = {
+ get$runtimeType(receiver) {
+ return B.Type_Int16List_uXf;
+ },
+ $index(receiver, index) {
+ A._checkValidIndex(index, receiver, receiver.length);
+ return receiver[index];
+ },
+ $isTrustedGetRuntimeType: 1
+ };
+ A.NativeInt32List.prototype = {
+ get$runtimeType(receiver) {
+ return B.Type_Int32List_O50;
+ },
+ $index(receiver, index) {
+ A._checkValidIndex(index, receiver, receiver.length);
+ return receiver[index];
+ },
+ $isTrustedGetRuntimeType: 1
+ };
+ A.NativeInt8List.prototype = {
+ get$runtimeType(receiver) {
+ return B.Type_Int8List_ekJ;
+ },
+ $index(receiver, index) {
+ A._checkValidIndex(index, receiver, receiver.length);
+ return receiver[index];
+ },
+ $isTrustedGetRuntimeType: 1
+ };
+ A.NativeUint16List.prototype = {
+ get$runtimeType(receiver) {
+ return B.Type_Uint16List_2bx;
+ },
+ $index(receiver, index) {
+ A._checkValidIndex(index, receiver, receiver.length);
+ return receiver[index];
+ },
+ $isTrustedGetRuntimeType: 1
+ };
+ A.NativeUint32List.prototype = {
+ get$runtimeType(receiver) {
+ return B.Type_Uint32List_2bx;
+ },
+ $index(receiver, index) {
+ A._checkValidIndex(index, receiver, receiver.length);
+ return receiver[index];
+ },
+ $isTrustedGetRuntimeType: 1
+ };
+ A.NativeUint8ClampedList.prototype = {
+ get$runtimeType(receiver) {
+ return B.Type_Uint8ClampedList_Jik;
+ },
+ get$length(receiver) {
+ return receiver.length;
+ },
+ $index(receiver, index) {
+ A._checkValidIndex(index, receiver, receiver.length);
+ return receiver[index];
+ },
+ $isTrustedGetRuntimeType: 1
+ };
+ A.NativeUint8List.prototype = {
+ get$runtimeType(receiver) {
+ return B.Type_Uint8List_WLA;
+ },
+ get$length(receiver) {
+ return receiver.length;
+ },
+ $index(receiver, index) {
+ A._checkValidIndex(index, receiver, receiver.length);
+ return receiver[index];
+ },
+ $isTrustedGetRuntimeType: 1
+ };
+ A._NativeTypedArrayOfDouble_NativeTypedArray_ListMixin.prototype = {};
+ A._NativeTypedArrayOfDouble_NativeTypedArray_ListMixin_FixedLengthListMixin.prototype = {};
+ A._NativeTypedArrayOfInt_NativeTypedArray_ListMixin.prototype = {};
+ A._NativeTypedArrayOfInt_NativeTypedArray_ListMixin_FixedLengthListMixin.prototype = {};
+ A.Rti.prototype = {
+ _eval$1(recipe) {
+ return A._Universe_evalInEnvironment(init.typeUniverse, this, recipe);
+ },
+ _bind$1(typeOrTuple) {
+ return A._Universe_bind(init.typeUniverse, this, typeOrTuple);
+ }
+ };
+ A._FunctionParameters.prototype = {};
+ A._Type.prototype = {
+ toString$0(_) {
+ return A._rtiToString(this._rti, null);
+ }
+ };
+ A._Error.prototype = {
+ toString$0(_) {
+ return this.__rti$_message;
+ }
+ };
+ A._TypeError.prototype = {$isTypeError: 1};
+ A._AsyncRun__initializeScheduleImmediate_internalCallback.prototype = {
+ call$1(_) {
+ var t1 = this._box_0,
+ f = t1.storedCallback;
+ t1.storedCallback = null;
+ f.call$0();
+ },
+ $signature: 4
+ };
+ A._AsyncRun__initializeScheduleImmediate_closure.prototype = {
+ call$1(callback) {
+ var t1, t2;
+ this._box_0.storedCallback = type$.void_Function._as(callback);
+ t1 = this.div;
+ t2 = this.span;
+ t1.firstChild ? t1.removeChild(t2) : t1.appendChild(t2);
+ },
+ $signature: 15
+ };
+ A._AsyncRun__scheduleImmediateJsOverride_internalCallback.prototype = {
+ call$0() {
+ this.callback.call$0();
+ },
+ $signature: 2
+ };
+ A._AsyncRun__scheduleImmediateWithSetImmediate_internalCallback.prototype = {
+ call$0() {
+ this.callback.call$0();
+ },
+ $signature: 2
+ };
+ A._TimerImpl.prototype = {
+ _TimerImpl$2(milliseconds, callback) {
+ if (self.setTimeout != null)
+ this._handle = self.setTimeout(A.convertDartClosureToJS(new A._TimerImpl_internalCallback(this, callback), 0), milliseconds);
+ else
+ throw A.wrapException(A.UnsupportedError$("`setTimeout()` not found."));
+ },
+ cancel$0() {
+ if (self.setTimeout != null) {
+ var t1 = this._handle;
+ if (t1 == null)
+ return;
+ self.clearTimeout(t1);
+ this._handle = null;
+ } else
+ throw A.wrapException(A.UnsupportedError$("Canceling a timer."));
+ },
+ $isTimer: 1
+ };
+ A._TimerImpl_internalCallback.prototype = {
+ call$0() {
+ this.$this._handle = null;
+ this.callback.call$0();
+ },
+ $signature: 0
+ };
+ A._AsyncAwaitCompleter.prototype = {
+ complete$1(value) {
+ var t2, _this = this,
+ t1 = _this.$ti;
+ t1._eval$1("1/?")._as(value);
+ if (value == null)
+ value = t1._precomputed1._as(value);
+ if (!_this.isSync)
+ _this._future._asyncComplete$1(value);
+ else {
+ t2 = _this._future;
+ if (t1._eval$1("Future<1>")._is(value))
+ t2._chainFuture$1(value);
+ else
+ t2._completeWithValue$1(value);
+ }
+ },
+ completeError$2(e, st) {
+ var t1 = this._future;
+ if (this.isSync)
+ t1._completeError$2(e, st);
+ else
+ t1._asyncCompleteError$2(e, st);
+ },
+ $isCompleter: 1
+ };
+ A._awaitOnObject_closure.prototype = {
+ call$1(result) {
+ return this.bodyFunction.call$2(0, result);
+ },
+ $signature: 3
+ };
+ A._awaitOnObject_closure0.prototype = {
+ call$2(error, stackTrace) {
+ this.bodyFunction.call$2(1, new A.ExceptionAndStackTrace(error, type$.StackTrace._as(stackTrace)));
+ },
+ $signature: 16
+ };
+ A._wrapJsFunctionForAsync_closure.prototype = {
+ call$2(errorCode, result) {
+ this.$protected(A._asInt(errorCode), result);
+ },
+ $signature: 17
+ };
+ A.AsyncError.prototype = {
+ toString$0(_) {
+ return A.S(this.error);
+ },
+ $isError: 1,
+ get$stackTrace() {
+ return this.stackTrace;
+ }
+ };
+ A._Completer.prototype = {
+ completeError$2(error, stackTrace) {
+ A.checkNotNullable(error, "error", type$.Object);
+ if ((this.future._state & 30) !== 0)
+ throw A.wrapException(A.StateError$("Future already completed"));
+ if (stackTrace == null)
+ stackTrace = A.AsyncError_defaultStackTrace(error);
+ this._completeError$2(error, stackTrace);
+ },
+ completeError$1(error) {
+ return this.completeError$2(error, null);
+ },
+ $isCompleter: 1
+ };
+ A._AsyncCompleter.prototype = {
+ complete$1(value) {
+ var t2,
+ t1 = this.$ti;
+ t1._eval$1("1/?")._as(value);
+ t2 = this.future;
+ if ((t2._state & 30) !== 0)
+ throw A.wrapException(A.StateError$("Future already completed"));
+ t2._asyncComplete$1(t1._eval$1("1/")._as(value));
+ },
+ complete$0() {
+ return this.complete$1(null);
+ },
+ _completeError$2(error, stackTrace) {
+ this.future._asyncCompleteError$2(error, stackTrace);
+ }
+ };
+ A._SyncCompleter.prototype = {
+ complete$1(value) {
+ var t2,
+ t1 = this.$ti;
+ t1._eval$1("1/?")._as(value);
+ t2 = this.future;
+ if ((t2._state & 30) !== 0)
+ throw A.wrapException(A.StateError$("Future already completed"));
+ t2._complete$1(t1._eval$1("1/")._as(value));
+ },
+ _completeError$2(error, stackTrace) {
+ this.future._completeError$2(error, stackTrace);
+ }
+ };
+ A._FutureListener.prototype = {
+ matchesErrorTest$1(asyncError) {
+ if ((this.state & 15) !== 6)
+ return true;
+ return this.result._zone.runUnary$2$2(type$.bool_Function_Object._as(this.callback), asyncError.error, type$.bool, type$.Object);
+ },
+ handleError$1(asyncError) {
+ var exception, _this = this,
+ errorCallback = _this.errorCallback,
+ result = null,
+ t1 = type$.dynamic,
+ t2 = type$.Object,
+ t3 = asyncError.error,
+ t4 = _this.result._zone;
+ if (type$.dynamic_Function_Object_StackTrace._is(errorCallback))
+ result = t4.runBinary$3$3(errorCallback, t3, asyncError.stackTrace, t1, t2, type$.StackTrace);
+ else
+ result = t4.runUnary$2$2(type$.dynamic_Function_Object._as(errorCallback), t3, t1, t2);
+ try {
+ t1 = _this.$ti._eval$1("2/")._as(result);
+ return t1;
+ } catch (exception) {
+ if (type$.TypeError._is(A.unwrapException(exception))) {
+ if ((_this.state & 1) !== 0)
+ throw A.wrapException(A.ArgumentError$("The error handler of Future.then must return a value of the returned future's type", "onError"));
+ throw A.wrapException(A.ArgumentError$("The error handler of Future.catchError must return a value of the future's type", "onError"));
+ } else
+ throw exception;
+ }
+ }
+ };
+ A._Future.prototype = {
+ _setChained$1(source) {
+ this._state = this._state & 1 | 4;
+ this._resultOrListeners = source;
+ },
+ then$1$2$onError(f, onError, $R) {
+ var currentZone, result, t2,
+ t1 = this.$ti;
+ t1._bind$1($R)._eval$1("1/(2)")._as(f);
+ currentZone = $.Zone__current;
+ if (currentZone === B.C__RootZone) {
+ if (onError != null && !type$.dynamic_Function_Object_StackTrace._is(onError) && !type$.dynamic_Function_Object._is(onError))
+ throw A.wrapException(A.ArgumentError$value(onError, "onError", string$.Error_));
+ } else {
+ $R._eval$1("@<0/>")._bind$1(t1._precomputed1)._eval$1("1(2)")._as(f);
+ if (onError != null)
+ onError = A._registerErrorHandler(onError, currentZone);
+ }
+ result = new A._Future(currentZone, $R._eval$1("_Future<0>"));
+ t2 = onError == null ? 1 : 3;
+ this._addListener$1(new A._FutureListener(result, t2, f, onError, t1._eval$1("@<1>")._bind$1($R)._eval$1("_FutureListener<1,2>")));
+ return result;
+ },
+ then$1$1(f, $R) {
+ return this.then$1$2$onError(f, null, $R);
+ },
+ _thenAwait$1$2(f, onError, $E) {
+ var result,
+ t1 = this.$ti;
+ t1._bind$1($E)._eval$1("1/(2)")._as(f);
+ result = new A._Future($.Zone__current, $E._eval$1("_Future<0>"));
+ this._addListener$1(new A._FutureListener(result, 19, f, onError, t1._eval$1("@<1>")._bind$1($E)._eval$1("_FutureListener<1,2>")));
+ return result;
+ },
+ whenComplete$1(action) {
+ var t1, result;
+ type$.dynamic_Function._as(action);
+ t1 = this.$ti;
+ result = new A._Future($.Zone__current, t1);
+ this._addListener$1(new A._FutureListener(result, 8, action, null, t1._eval$1("@<1>")._bind$1(t1._precomputed1)._eval$1("_FutureListener<1,2>")));
+ return result;
+ },
+ _setErrorObject$1(error) {
+ this._state = this._state & 1 | 16;
+ this._resultOrListeners = error;
+ },
+ _cloneResult$1(source) {
+ this._state = source._state & 30 | this._state & 1;
+ this._resultOrListeners = source._resultOrListeners;
+ },
+ _addListener$1(listener) {
+ var source, _this = this,
+ t1 = _this._state;
+ if (t1 <= 3) {
+ listener._nextListener = type$.nullable__FutureListener_dynamic_dynamic._as(_this._resultOrListeners);
+ _this._resultOrListeners = listener;
+ } else {
+ if ((t1 & 4) !== 0) {
+ source = type$._Future_dynamic._as(_this._resultOrListeners);
+ if ((source._state & 24) === 0) {
+ source._addListener$1(listener);
+ return;
+ }
+ _this._cloneResult$1(source);
+ }
+ A._rootScheduleMicrotask(null, null, _this._zone, type$.void_Function._as(new A._Future__addListener_closure(_this, listener)));
+ }
+ },
+ _prependListeners$1(listeners) {
+ var t1, existingListeners, next, cursor, next0, source, _this = this, _box_0 = {};
+ _box_0.listeners = listeners;
+ if (listeners == null)
+ return;
+ t1 = _this._state;
+ if (t1 <= 3) {
+ existingListeners = type$.nullable__FutureListener_dynamic_dynamic._as(_this._resultOrListeners);
+ _this._resultOrListeners = listeners;
+ if (existingListeners != null) {
+ next = listeners._nextListener;
+ for (cursor = listeners; next != null; cursor = next, next = next0)
+ next0 = next._nextListener;
+ cursor._nextListener = existingListeners;
+ }
+ } else {
+ if ((t1 & 4) !== 0) {
+ source = type$._Future_dynamic._as(_this._resultOrListeners);
+ if ((source._state & 24) === 0) {
+ source._prependListeners$1(listeners);
+ return;
+ }
+ _this._cloneResult$1(source);
+ }
+ _box_0.listeners = _this._reverseListeners$1(listeners);
+ A._rootScheduleMicrotask(null, null, _this._zone, type$.void_Function._as(new A._Future__prependListeners_closure(_box_0, _this)));
+ }
+ },
+ _removeListeners$0() {
+ var current = type$.nullable__FutureListener_dynamic_dynamic._as(this._resultOrListeners);
+ this._resultOrListeners = null;
+ return this._reverseListeners$1(current);
+ },
+ _reverseListeners$1(listeners) {
+ var current, prev, next;
+ for (current = listeners, prev = null; current != null; prev = current, current = next) {
+ next = current._nextListener;
+ current._nextListener = prev;
+ }
+ return prev;
+ },
+ _chainForeignFuture$1(source) {
+ var e, s, exception, _this = this;
+ _this._state ^= 2;
+ try {
+ source.then$1$2$onError(new A._Future__chainForeignFuture_closure(_this), new A._Future__chainForeignFuture_closure0(_this), type$.Null);
+ } catch (exception) {
+ e = A.unwrapException(exception);
+ s = A.getTraceFromException(exception);
+ A.scheduleMicrotask(new A._Future__chainForeignFuture_closure1(_this, e, s));
+ }
+ },
+ _complete$1(value) {
+ var listeners, _this = this,
+ t1 = _this.$ti;
+ t1._eval$1("1/")._as(value);
+ if (t1._eval$1("Future<1>")._is(value))
+ if (t1._is(value))
+ A._Future__chainCoreFutureSync(value, _this);
+ else
+ _this._chainForeignFuture$1(value);
+ else {
+ listeners = _this._removeListeners$0();
+ t1._precomputed1._as(value);
+ _this._state = 8;
+ _this._resultOrListeners = value;
+ A._Future__propagateToListeners(_this, listeners);
+ }
+ },
+ _completeWithValue$1(value) {
+ var listeners, _this = this;
+ _this.$ti._precomputed1._as(value);
+ listeners = _this._removeListeners$0();
+ _this._state = 8;
+ _this._resultOrListeners = value;
+ A._Future__propagateToListeners(_this, listeners);
+ },
+ _completeError$2(error, stackTrace) {
+ var listeners;
+ type$.Object._as(error);
+ type$.StackTrace._as(stackTrace);
+ listeners = this._removeListeners$0();
+ this._setErrorObject$1(A.AsyncError$(error, stackTrace));
+ A._Future__propagateToListeners(this, listeners);
+ },
+ _asyncComplete$1(value) {
+ var t1 = this.$ti;
+ t1._eval$1("1/")._as(value);
+ if (t1._eval$1("Future<1>")._is(value)) {
+ this._chainFuture$1(value);
+ return;
+ }
+ this._asyncCompleteWithValue$1(value);
+ },
+ _asyncCompleteWithValue$1(value) {
+ var _this = this;
+ _this.$ti._precomputed1._as(value);
+ _this._state ^= 2;
+ A._rootScheduleMicrotask(null, null, _this._zone, type$.void_Function._as(new A._Future__asyncCompleteWithValue_closure(_this, value)));
+ },
+ _chainFuture$1(value) {
+ var t1 = this.$ti;
+ t1._eval$1("Future<1>")._as(value);
+ if (t1._is(value)) {
+ A._Future__chainCoreFutureAsync(value, this);
+ return;
+ }
+ this._chainForeignFuture$1(value);
+ },
+ _asyncCompleteError$2(error, stackTrace) {
+ type$.StackTrace._as(stackTrace);
+ this._state ^= 2;
+ A._rootScheduleMicrotask(null, null, this._zone, type$.void_Function._as(new A._Future__asyncCompleteError_closure(this, error, stackTrace)));
+ },
+ $isFuture: 1
+ };
+ A._Future__addListener_closure.prototype = {
+ call$0() {
+ A._Future__propagateToListeners(this.$this, this.listener);
+ },
+ $signature: 0
+ };
+ A._Future__prependListeners_closure.prototype = {
+ call$0() {
+ A._Future__propagateToListeners(this.$this, this._box_0.listeners);
+ },
+ $signature: 0
+ };
+ A._Future__chainForeignFuture_closure.prototype = {
+ call$1(value) {
+ var error, stackTrace, exception,
+ t1 = this.$this;
+ t1._state ^= 2;
+ try {
+ t1._completeWithValue$1(t1.$ti._precomputed1._as(value));
+ } catch (exception) {
+ error = A.unwrapException(exception);
+ stackTrace = A.getTraceFromException(exception);
+ t1._completeError$2(error, stackTrace);
+ }
+ },
+ $signature: 4
+ };
+ A._Future__chainForeignFuture_closure0.prototype = {
+ call$2(error, stackTrace) {
+ this.$this._completeError$2(type$.Object._as(error), type$.StackTrace._as(stackTrace));
+ },
+ $signature: 5
+ };
+ A._Future__chainForeignFuture_closure1.prototype = {
+ call$0() {
+ this.$this._completeError$2(this.e, this.s);
+ },
+ $signature: 0
+ };
+ A._Future__chainCoreFutureAsync_closure.prototype = {
+ call$0() {
+ A._Future__chainCoreFutureSync(this._box_0.source, this.target);
+ },
+ $signature: 0
+ };
+ A._Future__asyncCompleteWithValue_closure.prototype = {
+ call$0() {
+ this.$this._completeWithValue$1(this.value);
+ },
+ $signature: 0
+ };
+ A._Future__asyncCompleteError_closure.prototype = {
+ call$0() {
+ this.$this._completeError$2(this.error, this.stackTrace);
+ },
+ $signature: 0
+ };
+ A._Future__propagateToListeners_handleWhenCompleteCallback.prototype = {
+ call$0() {
+ var e, s, t1, exception, t2, originalSource, _this = this, completeResult = null;
+ try {
+ t1 = _this._box_0.listener;
+ completeResult = t1.result._zone.run$1$1(type$.dynamic_Function._as(t1.callback), type$.dynamic);
+ } catch (exception) {
+ e = A.unwrapException(exception);
+ s = A.getTraceFromException(exception);
+ t1 = _this.hasError && type$.AsyncError._as(_this._box_1.source._resultOrListeners).error === e;
+ t2 = _this._box_0;
+ if (t1)
+ t2.listenerValueOrError = type$.AsyncError._as(_this._box_1.source._resultOrListeners);
+ else
+ t2.listenerValueOrError = A.AsyncError$(e, s);
+ t2.listenerHasError = true;
+ return;
+ }
+ if (completeResult instanceof A._Future && (completeResult._state & 24) !== 0) {
+ if ((completeResult._state & 16) !== 0) {
+ t1 = _this._box_0;
+ t1.listenerValueOrError = type$.AsyncError._as(completeResult._resultOrListeners);
+ t1.listenerHasError = true;
+ }
+ return;
+ }
+ if (completeResult instanceof A._Future) {
+ originalSource = _this._box_1.source;
+ t1 = _this._box_0;
+ t1.listenerValueOrError = completeResult.then$1$1(new A._Future__propagateToListeners_handleWhenCompleteCallback_closure(originalSource), type$.dynamic);
+ t1.listenerHasError = false;
+ }
+ },
+ $signature: 0
+ };
+ A._Future__propagateToListeners_handleWhenCompleteCallback_closure.prototype = {
+ call$1(_) {
+ return this.originalSource;
+ },
+ $signature: 18
+ };
+ A._Future__propagateToListeners_handleValueCallback.prototype = {
+ call$0() {
+ var e, s, t1, t2, t3, t4, t5, exception;
+ try {
+ t1 = this._box_0;
+ t2 = t1.listener;
+ t3 = t2.$ti;
+ t4 = t3._precomputed1;
+ t5 = t4._as(this.sourceResult);
+ t1.listenerValueOrError = t2.result._zone.runUnary$2$2(t3._eval$1("2/(1)")._as(t2.callback), t5, t3._eval$1("2/"), t4);
+ } catch (exception) {
+ e = A.unwrapException(exception);
+ s = A.getTraceFromException(exception);
+ t1 = this._box_0;
+ t1.listenerValueOrError = A.AsyncError$(e, s);
+ t1.listenerHasError = true;
+ }
+ },
+ $signature: 0
+ };
+ A._Future__propagateToListeners_handleError.prototype = {
+ call$0() {
+ var asyncError, e, s, t1, exception, t2, _this = this;
+ try {
+ asyncError = type$.AsyncError._as(_this._box_1.source._resultOrListeners);
+ t1 = _this._box_0;
+ if (t1.listener.matchesErrorTest$1(asyncError) && t1.listener.errorCallback != null) {
+ t1.listenerValueOrError = t1.listener.handleError$1(asyncError);
+ t1.listenerHasError = false;
+ }
+ } catch (exception) {
+ e = A.unwrapException(exception);
+ s = A.getTraceFromException(exception);
+ t1 = type$.AsyncError._as(_this._box_1.source._resultOrListeners);
+ t2 = _this._box_0;
+ if (t1.error === e)
+ t2.listenerValueOrError = t1;
+ else
+ t2.listenerValueOrError = A.AsyncError$(e, s);
+ t2.listenerHasError = true;
+ }
+ },
+ $signature: 0
+ };
+ A._AsyncCallbackEntry.prototype = {};
+ A.Stream.prototype = {
+ get$length(_) {
+ var t1 = {},
+ future = new A._Future($.Zone__current, type$._Future_int);
+ t1.count = 0;
+ this.listen$4$cancelOnError$onDone$onError(new A.Stream_length_closure(t1, this), true, new A.Stream_length_closure0(t1, future), future.get$_completeError());
+ return future;
+ },
+ get$first(_) {
+ var future = new A._Future($.Zone__current, A._instanceType(this)._eval$1("_Future<1>")),
+ subscription = this.listen$4$cancelOnError$onDone$onError(null, true, new A.Stream_first_closure(future), future.get$_completeError());
+ subscription.onData$1(new A.Stream_first_closure0(this, subscription, future));
+ return future;
+ }
+ };
+ A.Stream_length_closure.prototype = {
+ call$1(_) {
+ A._instanceType(this.$this)._precomputed1._as(_);
+ ++this._box_0.count;
+ },
+ $signature() {
+ return A._instanceType(this.$this)._eval$1("~(1)");
+ }
+ };
+ A.Stream_length_closure0.prototype = {
+ call$0() {
+ this.future._complete$1(this._box_0.count);
+ },
+ $signature: 0
+ };
+ A.Stream_first_closure.prototype = {
+ call$0() {
+ var e, s, t1, exception, stackTrace;
+ try {
+ t1 = A.IterableElementError_noElement();
+ throw A.wrapException(t1);
+ } catch (exception) {
+ e = A.unwrapException(exception);
+ s = A.getTraceFromException(exception);
+ t1 = e;
+ stackTrace = s;
+ if (stackTrace == null)
+ stackTrace = A.AsyncError_defaultStackTrace(t1);
+ this.future._completeError$2(t1, stackTrace);
+ }
+ },
+ $signature: 0
+ };
+ A.Stream_first_closure0.prototype = {
+ call$1(value) {
+ A._cancelAndValue(this.subscription, this.future, A._instanceType(this.$this)._precomputed1._as(value));
+ },
+ $signature() {
+ return A._instanceType(this.$this)._eval$1("~(1)");
+ }
+ };
+ A._StreamController.prototype = {
+ get$_pendingEvents() {
+ var t1, _this = this;
+ if ((_this._state & 8) === 0)
+ return A._instanceType(_this)._eval$1("_PendingEvents<1>?")._as(_this._varData);
+ t1 = A._instanceType(_this);
+ return t1._eval$1("_PendingEvents<1>?")._as(t1._eval$1("_StreamControllerAddStreamState<1>")._as(_this._varData).get$_varData());
+ },
+ _ensurePendingEvents$0() {
+ var events, t1, _this = this;
+ if ((_this._state & 8) === 0) {
+ events = _this._varData;
+ if (events == null)
+ events = _this._varData = new A._PendingEvents(A._instanceType(_this)._eval$1("_PendingEvents<1>"));
+ return A._instanceType(_this)._eval$1("_PendingEvents<1>")._as(events);
+ }
+ t1 = A._instanceType(_this);
+ events = t1._eval$1("_StreamControllerAddStreamState<1>")._as(_this._varData).get$_varData();
+ return t1._eval$1("_PendingEvents<1>")._as(events);
+ },
+ get$_subscription() {
+ var varData = this._varData;
+ if ((this._state & 8) !== 0)
+ varData = type$._StreamControllerAddStreamState_nullable_Object._as(varData).get$_varData();
+ return A._instanceType(this)._eval$1("_ControllerSubscription<1>")._as(varData);
+ },
+ _badEventState$0() {
+ if ((this._state & 4) !== 0)
+ return new A.StateError("Cannot add event after closing");
+ return new A.StateError("Cannot add event while adding a stream");
+ },
+ _ensureDoneFuture$0() {
+ var t1 = this._doneFuture;
+ if (t1 == null)
+ t1 = this._doneFuture = (this._state & 2) !== 0 ? $.$get$Future__nullFuture() : new A._Future($.Zone__current, type$._Future_void);
+ return t1;
+ },
+ add$1(_, value) {
+ var t2, _this = this,
+ t1 = A._instanceType(_this);
+ t1._precomputed1._as(value);
+ t2 = _this._state;
+ if (t2 >= 4)
+ throw A.wrapException(_this._badEventState$0());
+ if ((t2 & 1) !== 0)
+ _this._sendData$1(value);
+ else if ((t2 & 3) === 0)
+ _this._ensurePendingEvents$0().add$1(0, new A._DelayedData(value, t1._eval$1("_DelayedData<1>")));
+ },
+ close$0() {
+ var _this = this,
+ t1 = _this._state;
+ if ((t1 & 4) !== 0)
+ return _this._ensureDoneFuture$0();
+ if (t1 >= 4)
+ throw A.wrapException(_this._badEventState$0());
+ t1 = _this._state = t1 | 4;
+ if ((t1 & 1) !== 0)
+ _this._sendDone$0();
+ else if ((t1 & 3) === 0)
+ _this._ensurePendingEvents$0().add$1(0, B.C__DelayedDone);
+ return _this._ensureDoneFuture$0();
+ },
+ _subscribe$4(onData, onError, onDone, cancelOnError) {
+ var t2, t3, t4, t5, t6, t7, t8, subscription, pendingEvents, addState, _this = this,
+ t1 = A._instanceType(_this);
+ t1._eval$1("~(1)?")._as(onData);
+ type$.nullable_void_Function._as(onDone);
+ if ((_this._state & 3) !== 0)
+ throw A.wrapException(A.StateError$("Stream has already been listened to."));
+ t2 = $.Zone__current;
+ t3 = cancelOnError ? 1 : 0;
+ t4 = onError != null ? 32 : 0;
+ t5 = A._BufferingStreamSubscription__registerDataHandler(t2, onData, t1._precomputed1);
+ t6 = A._BufferingStreamSubscription__registerErrorHandler(t2, onError);
+ t7 = onDone == null ? A.async___nullDoneHandler$closure() : onDone;
+ t8 = type$.void_Function;
+ subscription = new A._ControllerSubscription(_this, t5, t6, t8._as(t7), t2, t3 | t4, t1._eval$1("_ControllerSubscription<1>"));
+ pendingEvents = _this.get$_pendingEvents();
+ t4 = _this._state |= 1;
+ if ((t4 & 8) !== 0) {
+ addState = t1._eval$1("_StreamControllerAddStreamState<1>")._as(_this._varData);
+ addState.set$_varData(subscription);
+ addState.resume$0();
+ } else
+ _this._varData = subscription;
+ subscription._setPendingEvents$1(pendingEvents);
+ t1 = t8._as(new A._StreamController__subscribe_closure(_this));
+ t2 = subscription._state;
+ subscription._state = t2 | 64;
+ t1.call$0();
+ subscription._state &= 4294967231;
+ subscription._checkState$1((t2 & 4) !== 0);
+ return subscription;
+ },
+ _recordCancel$1(subscription) {
+ var result, onCancel, cancelResult, e, s, exception, result0, _this = this,
+ t1 = A._instanceType(_this);
+ t1._eval$1("StreamSubscription<1>")._as(subscription);
+ result = null;
+ if ((_this._state & 8) !== 0)
+ result = t1._eval$1("_StreamControllerAddStreamState<1>")._as(_this._varData).cancel$0();
+ _this._varData = null;
+ _this._state = _this._state & 4294967286 | 2;
+ onCancel = _this.onCancel;
+ if (onCancel != null)
+ if (result == null)
+ try {
+ cancelResult = onCancel.call$0();
+ if (cancelResult instanceof A._Future)
+ result = cancelResult;
+ } catch (exception) {
+ e = A.unwrapException(exception);
+ s = A.getTraceFromException(exception);
+ result0 = new A._Future($.Zone__current, type$._Future_void);
+ result0._asyncCompleteError$2(e, s);
+ result = result0;
+ }
+ else
+ result = result.whenComplete$1(onCancel);
+ t1 = new A._StreamController__recordCancel_complete(_this);
+ if (result != null)
+ result = result.whenComplete$1(t1);
+ else
+ t1.call$0();
+ return result;
+ },
+ $isStreamController: 1,
+ $is_StreamControllerLifecycle: 1,
+ $is_EventDispatch: 1
+ };
+ A._StreamController__subscribe_closure.prototype = {
+ call$0() {
+ A._runGuarded(this.$this.onListen);
+ },
+ $signature: 0
+ };
+ A._StreamController__recordCancel_complete.prototype = {
+ call$0() {
+ var doneFuture = this.$this._doneFuture;
+ if (doneFuture != null && (doneFuture._state & 30) === 0)
+ doneFuture._asyncComplete$1(null);
+ },
+ $signature: 0
+ };
+ A._AsyncStreamControllerDispatch.prototype = {
+ _sendData$1(data) {
+ var t1 = this.$ti;
+ t1._precomputed1._as(data);
+ this.get$_subscription()._addPending$1(new A._DelayedData(data, t1._eval$1("_DelayedData<1>")));
+ },
+ _sendError$2(error, stackTrace) {
+ this.get$_subscription()._addPending$1(new A._DelayedError(error, stackTrace));
+ },
+ _sendDone$0() {
+ this.get$_subscription()._addPending$1(B.C__DelayedDone);
+ }
+ };
+ A._AsyncStreamController.prototype = {};
+ A._ControllerStream.prototype = {
+ get$hashCode(_) {
+ return (A.Primitives_objectHashCode(this._controller) ^ 892482866) >>> 0;
+ },
+ $eq(_, other) {
+ if (other == null)
+ return false;
+ if (this === other)
+ return true;
+ return other instanceof A._ControllerStream && other._controller === this._controller;
+ }
+ };
+ A._ControllerSubscription.prototype = {
+ _onCancel$0() {
+ return this._controller._recordCancel$1(this);
+ },
+ _onPause$0() {
+ var t1 = this._controller,
+ t2 = A._instanceType(t1);
+ t2._eval$1("StreamSubscription<1>")._as(this);
+ if ((t1._state & 8) !== 0)
+ t2._eval$1("_StreamControllerAddStreamState<1>")._as(t1._varData).pause$0();
+ A._runGuarded(t1.onPause);
+ },
+ _onResume$0() {
+ var t1 = this._controller,
+ t2 = A._instanceType(t1);
+ t2._eval$1("StreamSubscription<1>")._as(this);
+ if ((t1._state & 8) !== 0)
+ t2._eval$1("_StreamControllerAddStreamState<1>")._as(t1._varData).resume$0();
+ A._runGuarded(t1.onResume);
+ }
+ };
+ A._StreamSinkWrapper.prototype = {};
+ A._BufferingStreamSubscription.prototype = {
+ _setPendingEvents$1(pendingEvents) {
+ var _this = this;
+ A._instanceType(_this)._eval$1("_PendingEvents<1>?")._as(pendingEvents);
+ if (pendingEvents == null)
+ return;
+ _this.set$_pending(pendingEvents);
+ if (pendingEvents.lastPendingEvent != null) {
+ _this._state |= 128;
+ pendingEvents.schedule$1(_this);
+ }
+ },
+ onData$1(handleData) {
+ var t1 = A._instanceType(this);
+ this.set$_onData(A._BufferingStreamSubscription__registerDataHandler(this._zone, t1._eval$1("~(1)?")._as(handleData), t1._precomputed1));
+ },
+ cancel$0() {
+ var t1 = this._state &= 4294967279;
+ if ((t1 & 8) === 0)
+ this._cancel$0();
+ t1 = this._cancelFuture;
+ return t1 == null ? $.$get$Future__nullFuture() : t1;
+ },
+ asFuture$1$1(futureValue, $E) {
+ var result, _this = this, t1 = {};
+ t1.resultValue = null;
+ if (!$E._is(null))
+ throw A.wrapException(A.ArgumentError$notNull("futureValue"));
+ $E._as(futureValue);
+ t1.resultValue = futureValue;
+ result = new A._Future($.Zone__current, $E._eval$1("_Future<0>"));
+ _this.set$_onDone(new A._BufferingStreamSubscription_asFuture_closure(t1, result));
+ _this._state |= 32;
+ _this._onError = new A._BufferingStreamSubscription_asFuture_closure0(_this, result);
+ return result;
+ },
+ _cancel$0() {
+ var t2, _this = this,
+ t1 = _this._state |= 8;
+ if ((t1 & 128) !== 0) {
+ t2 = _this._pending;
+ if (t2._state === 1)
+ t2._state = 3;
+ }
+ if ((t1 & 64) === 0)
+ _this.set$_pending(null);
+ _this._cancelFuture = _this._onCancel$0();
+ },
+ _onPause$0() {
+ },
+ _onResume$0() {
+ },
+ _onCancel$0() {
+ return null;
+ },
+ _addPending$1($event) {
+ var t1, _this = this,
+ pending = _this._pending;
+ if (pending == null) {
+ pending = new A._PendingEvents(A._instanceType(_this)._eval$1("_PendingEvents<1>"));
+ _this.set$_pending(pending);
+ }
+ pending.add$1(0, $event);
+ t1 = _this._state;
+ if ((t1 & 128) === 0) {
+ t1 |= 128;
+ _this._state = t1;
+ if (t1 < 256)
+ pending.schedule$1(_this);
+ }
+ },
+ _sendData$1(data) {
+ var t2, _this = this,
+ t1 = A._instanceType(_this)._precomputed1;
+ t1._as(data);
+ t2 = _this._state;
+ _this._state = t2 | 64;
+ _this._zone.runUnaryGuarded$1$2(_this._onData, data, t1);
+ _this._state &= 4294967231;
+ _this._checkState$1((t2 & 4) !== 0);
+ },
+ _sendError$2(error, stackTrace) {
+ var cancelFuture, _this = this,
+ t1 = _this._state,
+ t2 = new A._BufferingStreamSubscription__sendError_sendError(_this, error, stackTrace);
+ if ((t1 & 1) !== 0) {
+ _this._state = t1 | 16;
+ _this._cancel$0();
+ cancelFuture = _this._cancelFuture;
+ if (cancelFuture != null && cancelFuture !== $.$get$Future__nullFuture())
+ cancelFuture.whenComplete$1(t2);
+ else
+ t2.call$0();
+ } else {
+ t2.call$0();
+ _this._checkState$1((t1 & 4) !== 0);
+ }
+ },
+ _sendDone$0() {
+ var cancelFuture, _this = this,
+ t1 = new A._BufferingStreamSubscription__sendDone_sendDone(_this);
+ _this._cancel$0();
+ _this._state |= 16;
+ cancelFuture = _this._cancelFuture;
+ if (cancelFuture != null && cancelFuture !== $.$get$Future__nullFuture())
+ cancelFuture.whenComplete$1(t1);
+ else
+ t1.call$0();
+ },
+ _checkState$1(wasInputPaused) {
+ var t2, isInputPaused, _this = this,
+ t1 = _this._state;
+ if ((t1 & 128) !== 0 && _this._pending.lastPendingEvent == null) {
+ t1 = _this._state = t1 & 4294967167;
+ if ((t1 & 4) !== 0)
+ if (t1 < 256) {
+ t2 = _this._pending;
+ t2 = t2 == null ? null : t2.lastPendingEvent == null;
+ t2 = t2 !== false;
+ } else
+ t2 = false;
+ else
+ t2 = false;
+ if (t2) {
+ t1 &= 4294967291;
+ _this._state = t1;
+ }
+ }
+ for (; true; wasInputPaused = isInputPaused) {
+ if ((t1 & 8) !== 0) {
+ _this.set$_pending(null);
+ return;
+ }
+ isInputPaused = (t1 & 4) !== 0;
+ if (wasInputPaused === isInputPaused)
+ break;
+ _this._state = t1 ^ 64;
+ if (isInputPaused)
+ _this._onPause$0();
+ else
+ _this._onResume$0();
+ t1 = _this._state &= 4294967231;
+ }
+ if ((t1 & 128) !== 0 && t1 < 256)
+ _this._pending.schedule$1(_this);
+ },
+ set$_onData(_onData) {
+ this._onData = A._instanceType(this)._eval$1("~(1)")._as(_onData);
+ },
+ set$_onDone(_onDone) {
+ this._onDone = type$.void_Function._as(_onDone);
+ },
+ set$_pending(_pending) {
+ this._pending = A._instanceType(this)._eval$1("_PendingEvents<1>?")._as(_pending);
+ },
+ $isStreamSubscription: 1,
+ $is_EventDispatch: 1
+ };
+ A._BufferingStreamSubscription_asFuture_closure.prototype = {
+ call$0() {
+ this.result._complete$1(this._box_0.resultValue);
+ },
+ $signature: 0
+ };
+ A._BufferingStreamSubscription_asFuture_closure0.prototype = {
+ call$2(error, stackTrace) {
+ var cancelFuture = this.$this.cancel$0(),
+ t1 = this.result;
+ if (cancelFuture !== $.$get$Future__nullFuture())
+ cancelFuture.whenComplete$1(new A._BufferingStreamSubscription_asFuture__closure(t1, error, stackTrace));
+ else
+ t1._completeError$2(error, stackTrace);
+ },
+ $signature: 5
+ };
+ A._BufferingStreamSubscription_asFuture__closure.prototype = {
+ call$0() {
+ this.result._completeError$2(this.error, this.stackTrace);
+ },
+ $signature: 2
+ };
+ A._BufferingStreamSubscription__sendError_sendError.prototype = {
+ call$0() {
+ var onError, t3, t4,
+ t1 = this.$this,
+ t2 = t1._state;
+ if ((t2 & 8) !== 0 && (t2 & 16) === 0)
+ return;
+ t1._state = t2 | 64;
+ onError = t1._onError;
+ t2 = this.error;
+ t3 = type$.Object;
+ t4 = t1._zone;
+ if (type$.void_Function_Object_StackTrace._is(onError))
+ t4.runBinaryGuarded$2$3(onError, t2, this.stackTrace, t3, type$.StackTrace);
+ else
+ t4.runUnaryGuarded$1$2(type$.void_Function_Object._as(onError), t2, t3);
+ t1._state &= 4294967231;
+ },
+ $signature: 0
+ };
+ A._BufferingStreamSubscription__sendDone_sendDone.prototype = {
+ call$0() {
+ var t1 = this.$this,
+ t2 = t1._state;
+ if ((t2 & 16) === 0)
+ return;
+ t1._state = t2 | 74;
+ t1._zone.runGuarded$1(t1._onDone);
+ t1._state &= 4294967231;
+ },
+ $signature: 0
+ };
+ A._StreamImpl.prototype = {
+ listen$4$cancelOnError$onDone$onError(onData, cancelOnError, onDone, onError) {
+ var t1 = this.$ti;
+ t1._eval$1("~(1)?")._as(onData);
+ type$.nullable_void_Function._as(onDone);
+ return this._controller._subscribe$4(t1._eval$1("~(1)?")._as(onData), onError, onDone, cancelOnError === true);
+ },
+ listen$1(onData) {
+ return this.listen$4$cancelOnError$onDone$onError(onData, null, null, null);
+ },
+ listen$2$cancelOnError(onData, cancelOnError) {
+ return this.listen$4$cancelOnError$onDone$onError(onData, cancelOnError, null, null);
+ },
+ listen$2$onDone(onData, onDone) {
+ return this.listen$4$cancelOnError$onDone$onError(onData, null, onDone, null);
+ }
+ };
+ A._DelayedEvent.prototype = {
+ set$next(next) {
+ this.next = type$.nullable__DelayedEvent_dynamic._as(next);
+ },
+ get$next() {
+ return this.next;
+ }
+ };
+ A._DelayedData.prototype = {
+ perform$1(dispatch) {
+ this.$ti._eval$1("_EventDispatch<1>")._as(dispatch)._sendData$1(this.value);
+ }
+ };
+ A._DelayedError.prototype = {
+ perform$1(dispatch) {
+ dispatch._sendError$2(this.error, this.stackTrace);
+ }
+ };
+ A._DelayedDone.prototype = {
+ perform$1(dispatch) {
+ dispatch._sendDone$0();
+ },
+ get$next() {
+ return null;
+ },
+ set$next(_) {
+ throw A.wrapException(A.StateError$("No events after a done."));
+ },
+ $is_DelayedEvent: 1
+ };
+ A._PendingEvents.prototype = {
+ schedule$1(dispatch) {
+ var t1, _this = this;
+ _this.$ti._eval$1("_EventDispatch<1>")._as(dispatch);
+ t1 = _this._state;
+ if (t1 === 1)
+ return;
+ if (t1 >= 1) {
+ _this._state = 1;
+ return;
+ }
+ A.scheduleMicrotask(new A._PendingEvents_schedule_closure(_this, dispatch));
+ _this._state = 1;
+ },
+ add$1(_, $event) {
+ var _this = this,
+ lastEvent = _this.lastPendingEvent;
+ if (lastEvent == null)
+ _this.firstPendingEvent = _this.lastPendingEvent = $event;
+ else {
+ lastEvent.set$next($event);
+ _this.lastPendingEvent = $event;
+ }
+ }
+ };
+ A._PendingEvents_schedule_closure.prototype = {
+ call$0() {
+ var t2, $event, nextEvent,
+ t1 = this.$this,
+ oldState = t1._state;
+ t1._state = 0;
+ if (oldState === 3)
+ return;
+ t2 = t1.$ti._eval$1("_EventDispatch<1>")._as(this.dispatch);
+ $event = t1.firstPendingEvent;
+ nextEvent = $event.get$next();
+ t1.firstPendingEvent = nextEvent;
+ if (nextEvent == null)
+ t1.lastPendingEvent = null;
+ $event.perform$1(t2);
+ },
+ $signature: 0
+ };
+ A._StreamIterator.prototype = {};
+ A._cancelAndValue_closure.prototype = {
+ call$0() {
+ return this.future._complete$1(this.value);
+ },
+ $signature: 0
+ };
+ A._Zone.prototype = {$isZone: 1};
+ A._rootHandleError_closure.prototype = {
+ call$0() {
+ A.Error_throwWithStackTrace(this.error, this.stackTrace);
+ },
+ $signature: 0
+ };
+ A._RootZone.prototype = {
+ runGuarded$1(f) {
+ var e, s, exception;
+ type$.void_Function._as(f);
+ try {
+ if (B.C__RootZone === $.Zone__current) {
+ f.call$0();
+ return;
+ }
+ A._rootRun(null, null, this, f, type$.void);
+ } catch (exception) {
+ e = A.unwrapException(exception);
+ s = A.getTraceFromException(exception);
+ A._rootHandleError(type$.Object._as(e), type$.StackTrace._as(s));
+ }
+ },
+ runUnaryGuarded$1$2(f, arg, $T) {
+ var e, s, exception;
+ $T._eval$1("~(0)")._as(f);
+ $T._as(arg);
+ try {
+ if (B.C__RootZone === $.Zone__current) {
+ f.call$1(arg);
+ return;
+ }
+ A._rootRunUnary(null, null, this, f, arg, type$.void, $T);
+ } catch (exception) {
+ e = A.unwrapException(exception);
+ s = A.getTraceFromException(exception);
+ A._rootHandleError(type$.Object._as(e), type$.StackTrace._as(s));
+ }
+ },
+ runBinaryGuarded$2$3(f, arg1, arg2, T1, T2) {
+ var e, s, exception;
+ T1._eval$1("@<0>")._bind$1(T2)._eval$1("~(1,2)")._as(f);
+ T1._as(arg1);
+ T2._as(arg2);
+ try {
+ if (B.C__RootZone === $.Zone__current) {
+ f.call$2(arg1, arg2);
+ return;
+ }
+ A._rootRunBinary(null, null, this, f, arg1, arg2, type$.void, T1, T2);
+ } catch (exception) {
+ e = A.unwrapException(exception);
+ s = A.getTraceFromException(exception);
+ A._rootHandleError(type$.Object._as(e), type$.StackTrace._as(s));
+ }
+ },
+ bindCallbackGuarded$1(f) {
+ return new A._RootZone_bindCallbackGuarded_closure(this, type$.void_Function._as(f));
+ },
+ bindUnaryCallbackGuarded$1$1(f, $T) {
+ return new A._RootZone_bindUnaryCallbackGuarded_closure(this, $T._eval$1("~(0)")._as(f), $T);
+ },
+ run$1$1(f, $R) {
+ $R._eval$1("0()")._as(f);
+ if ($.Zone__current === B.C__RootZone)
+ return f.call$0();
+ return A._rootRun(null, null, this, f, $R);
+ },
+ runUnary$2$2(f, arg, $R, $T) {
+ $R._eval$1("@<0>")._bind$1($T)._eval$1("1(2)")._as(f);
+ $T._as(arg);
+ if ($.Zone__current === B.C__RootZone)
+ return f.call$1(arg);
+ return A._rootRunUnary(null, null, this, f, arg, $R, $T);
+ },
+ runBinary$3$3(f, arg1, arg2, $R, T1, T2) {
+ $R._eval$1("@<0>")._bind$1(T1)._bind$1(T2)._eval$1("1(2,3)")._as(f);
+ T1._as(arg1);
+ T2._as(arg2);
+ if ($.Zone__current === B.C__RootZone)
+ return f.call$2(arg1, arg2);
+ return A._rootRunBinary(null, null, this, f, arg1, arg2, $R, T1, T2);
+ },
+ registerBinaryCallback$3$1(f, $R, T1, T2) {
+ return $R._eval$1("@<0>")._bind$1(T1)._bind$1(T2)._eval$1("1(2,3)")._as(f);
+ }
+ };
+ A._RootZone_bindCallbackGuarded_closure.prototype = {
+ call$0() {
+ return this.$this.runGuarded$1(this.f);
+ },
+ $signature: 0
+ };
+ A._RootZone_bindUnaryCallbackGuarded_closure.prototype = {
+ call$1(arg) {
+ var t1 = this.T;
+ return this.$this.runUnaryGuarded$1$2(this.f, t1._as(arg), t1);
+ },
+ $signature() {
+ return this.T._eval$1("~(0)");
+ }
+ };
+ A._HashMap.prototype = {
+ get$length(_) {
+ return this._collection$_length;
+ },
+ get$isEmpty(_) {
+ return this._collection$_length === 0;
+ },
+ get$keys() {
+ return new A._HashMapKeyIterable(this, this.$ti._eval$1("_HashMapKeyIterable<1>"));
+ },
+ containsKey$1(key) {
+ var strings, nums;
+ if (typeof key == "string" && key !== "__proto__") {
+ strings = this._collection$_strings;
+ return strings == null ? false : strings[key] != null;
+ } else if (typeof key == "number" && (key & 1073741823) === key) {
+ nums = this._collection$_nums;
+ return nums == null ? false : nums[key] != null;
+ } else
+ return this._containsKey$1(key);
+ },
+ _containsKey$1(key) {
+ var rest = this._collection$_rest;
+ if (rest == null)
+ return false;
+ return this._findBucketIndex$2(this._getBucket$2(rest, key), key) >= 0;
+ },
+ $index(_, key) {
+ var strings, t1, nums;
+ if (typeof key == "string" && key !== "__proto__") {
+ strings = this._collection$_strings;
+ t1 = strings == null ? null : A._HashMap__getTableEntry(strings, key);
+ return t1;
+ } else if (typeof key == "number" && (key & 1073741823) === key) {
+ nums = this._collection$_nums;
+ t1 = nums == null ? null : A._HashMap__getTableEntry(nums, key);
+ return t1;
+ } else
+ return this._get$1(key);
+ },
+ _get$1(key) {
+ var bucket, index,
+ rest = this._collection$_rest;
+ if (rest == null)
+ return null;
+ bucket = this._getBucket$2(rest, key);
+ index = this._findBucketIndex$2(bucket, key);
+ return index < 0 ? null : bucket[index + 1];
+ },
+ $indexSet(_, key, value) {
+ var strings, nums, rest, hash, bucket, index, _this = this,
+ t1 = _this.$ti;
+ t1._precomputed1._as(key);
+ t1._rest[1]._as(value);
+ if (typeof key == "string" && key !== "__proto__") {
+ strings = _this._collection$_strings;
+ _this._collection$_addHashTableEntry$3(strings == null ? _this._collection$_strings = A._HashMap__newHashTable() : strings, key, value);
+ } else if (typeof key == "number" && (key & 1073741823) === key) {
+ nums = _this._collection$_nums;
+ _this._collection$_addHashTableEntry$3(nums == null ? _this._collection$_nums = A._HashMap__newHashTable() : nums, key, value);
+ } else {
+ rest = _this._collection$_rest;
+ if (rest == null)
+ rest = _this._collection$_rest = A._HashMap__newHashTable();
+ hash = A.objectHashCode(key) & 1073741823;
+ bucket = rest[hash];
+ if (bucket == null) {
+ A._HashMap__setTableEntry(rest, hash, [key, value]);
+ ++_this._collection$_length;
+ _this._collection$_keys = null;
+ } else {
+ index = _this._findBucketIndex$2(bucket, key);
+ if (index >= 0)
+ bucket[index + 1] = value;
+ else {
+ bucket.push(key, value);
+ ++_this._collection$_length;
+ _this._collection$_keys = null;
+ }
+ }
+ }
+ },
+ forEach$1(_, action) {
+ var keys, $length, t2, i, key, t3, _this = this,
+ t1 = _this.$ti;
+ t1._eval$1("~(1,2)")._as(action);
+ keys = _this._computeKeys$0();
+ for ($length = keys.length, t2 = t1._precomputed1, t1 = t1._rest[1], i = 0; i < $length; ++i) {
+ key = keys[i];
+ t2._as(key);
+ t3 = _this.$index(0, key);
+ action.call$2(key, t3 == null ? t1._as(t3) : t3);
+ if (keys !== _this._collection$_keys)
+ throw A.wrapException(A.ConcurrentModificationError$(_this));
+ }
+ },
+ _computeKeys$0() {
+ var strings, names, entries, index, i, nums, rest, bucket, $length, i0, _this = this,
+ result = _this._collection$_keys;
+ if (result != null)
+ return result;
+ result = A.List_List$filled(_this._collection$_length, null, false, type$.dynamic);
+ strings = _this._collection$_strings;
+ if (strings != null) {
+ names = Object.getOwnPropertyNames(strings);
+ entries = names.length;
+ for (index = 0, i = 0; i < entries; ++i) {
+ result[index] = names[i];
+ ++index;
+ }
+ } else
+ index = 0;
+ nums = _this._collection$_nums;
+ if (nums != null) {
+ names = Object.getOwnPropertyNames(nums);
+ entries = names.length;
+ for (i = 0; i < entries; ++i) {
+ result[index] = +names[i];
+ ++index;
+ }
+ }
+ rest = _this._collection$_rest;
+ if (rest != null) {
+ names = Object.getOwnPropertyNames(rest);
+ entries = names.length;
+ for (i = 0; i < entries; ++i) {
+ bucket = rest[names[i]];
+ $length = bucket.length;
+ for (i0 = 0; i0 < $length; i0 += 2) {
+ result[index] = bucket[i0];
+ ++index;
+ }
+ }
+ }
+ return _this._collection$_keys = result;
+ },
+ _collection$_addHashTableEntry$3(table, key, value) {
+ var t1 = this.$ti;
+ t1._precomputed1._as(key);
+ t1._rest[1]._as(value);
+ if (table[key] == null) {
+ ++this._collection$_length;
+ this._collection$_keys = null;
+ }
+ A._HashMap__setTableEntry(table, key, value);
+ },
+ _getBucket$2(table, key) {
+ return table[A.objectHashCode(key) & 1073741823];
+ }
+ };
+ A._IdentityHashMap.prototype = {
+ _findBucketIndex$2(bucket, key) {
+ var $length, i, t1;
+ if (bucket == null)
+ return -1;
+ $length = bucket.length;
+ for (i = 0; i < $length; i += 2) {
+ t1 = bucket[i];
+ if (t1 == null ? key == null : t1 === key)
+ return i;
+ }
+ return -1;
+ }
+ };
+ A._HashMapKeyIterable.prototype = {
+ get$length(_) {
+ return this._collection$_map._collection$_length;
+ },
+ get$isEmpty(_) {
+ return this._collection$_map._collection$_length === 0;
+ },
+ get$iterator(_) {
+ var t1 = this._collection$_map;
+ return new A._HashMapKeyIterator(t1, t1._computeKeys$0(), this.$ti._eval$1("_HashMapKeyIterator<1>"));
+ }
+ };
+ A._HashMapKeyIterator.prototype = {
+ get$current() {
+ var t1 = this._collection$_current;
+ return t1 == null ? this.$ti._precomputed1._as(t1) : t1;
+ },
+ moveNext$0() {
+ var _this = this,
+ keys = _this._collection$_keys,
+ offset = _this._offset,
+ t1 = _this._collection$_map;
+ if (keys !== t1._collection$_keys)
+ throw A.wrapException(A.ConcurrentModificationError$(t1));
+ else if (offset >= keys.length) {
+ _this.set$_collection$_current(null);
+ return false;
+ } else {
+ _this.set$_collection$_current(keys[offset]);
+ _this._offset = offset + 1;
+ return true;
+ }
+ },
+ set$_collection$_current(_current) {
+ this._collection$_current = this.$ti._eval$1("1?")._as(_current);
+ }
+ };
+ A.ListBase.prototype = {
+ get$iterator(receiver) {
+ return new A.ListIterator(receiver, this.get$length(receiver), A.instanceType(receiver)._eval$1("ListIterator<ListBase.E>"));
+ },
+ elementAt$1(receiver, index) {
+ return this.$index(receiver, index);
+ },
+ get$isNotEmpty(receiver) {
+ return this.get$length(receiver) !== 0;
+ },
+ toString$0(receiver) {
+ return A.Iterable_iterableToFullString(receiver, "[", "]");
+ }
+ };
+ A.MapBase.prototype = {
+ forEach$1(_, action) {
+ var t2, key, t3,
+ t1 = A._instanceType(this);
+ t1._eval$1("~(MapBase.K,MapBase.V)")._as(action);
+ for (t2 = this.get$keys(), t2 = t2.get$iterator(t2), t1 = t1._eval$1("MapBase.V"); t2.moveNext$0();) {
+ key = t2.get$current();
+ t3 = this.$index(0, key);
+ action.call$2(key, t3 == null ? t1._as(t3) : t3);
+ }
+ },
+ get$length(_) {
+ var t1 = this.get$keys();
+ return t1.get$length(t1);
+ },
+ get$isEmpty(_) {
+ var t1 = this.get$keys();
+ return t1.get$isEmpty(t1);
+ },
+ toString$0(_) {
+ return A.MapBase_mapToString(this);
+ },
+ $isMap: 1
+ };
+ A.MapBase_mapToString_closure.prototype = {
+ call$2(k, v) {
+ var t2,
+ t1 = this._box_0;
+ if (!t1.first)
+ this.result._contents += ", ";
+ t1.first = false;
+ t1 = this.result;
+ t2 = A.S(k);
+ t2 = t1._contents += t2;
+ t1._contents = t2 + ": ";
+ t2 = A.S(v);
+ t1._contents += t2;
+ },
+ $signature: 10
+ };
+ A._UnmodifiableMapMixin.prototype = {};
+ A.MapView.prototype = {
+ forEach$1(_, action) {
+ this._collection$_map.forEach$1(0, A._instanceType(this)._eval$1("~(1,2)")._as(action));
+ },
+ get$isEmpty(_) {
+ return this._collection$_map.__js_helper$_length === 0;
+ },
+ get$length(_) {
+ return this._collection$_map.__js_helper$_length;
+ },
+ toString$0(_) {
+ return A.MapBase_mapToString(this._collection$_map);
+ },
+ $isMap: 1
+ };
+ A.UnmodifiableMapView.prototype = {};
+ A.ListQueue.prototype = {
+ get$iterator(_) {
+ var _this = this;
+ return new A._ListQueueIterator(_this, _this._tail, _this._modificationCount, _this._head, _this.$ti._eval$1("_ListQueueIterator<1>"));
+ },
+ get$isEmpty(_) {
+ return this._head === this._tail;
+ },
+ get$length(_) {
+ return (this._tail - this._head & this._table.length - 1) >>> 0;
+ },
+ elementAt$1(_, index) {
+ var t2, t3, _this = this,
+ t1 = _this.get$length(0);
+ if (0 > index || index >= t1)
+ A.throwExpression(A.IndexError$withLength(index, t1, _this, null, "index"));
+ t1 = _this._table;
+ t2 = t1.length;
+ t3 = (_this._head + index & t2 - 1) >>> 0;
+ if (!(t3 >= 0 && t3 < t2))
+ return A.ioore(t1, t3);
+ t3 = t1[t3];
+ return t3 == null ? _this.$ti._precomputed1._as(t3) : t3;
+ },
+ toString$0(_) {
+ return A.Iterable_iterableToFullString(this, "{", "}");
+ },
+ removeFirst$0() {
+ var t2, result, _this = this,
+ t1 = _this._head;
+ if (t1 === _this._tail)
+ throw A.wrapException(A.IterableElementError_noElement());
+ ++_this._modificationCount;
+ t2 = _this._table;
+ if (!(t1 < t2.length))
+ return A.ioore(t2, t1);
+ result = t2[t1];
+ if (result == null)
+ result = _this.$ti._precomputed1._as(result);
+ B.JSArray_methods.$indexSet(t2, t1, null);
+ _this._head = (_this._head + 1 & _this._table.length - 1) >>> 0;
+ return result;
+ },
+ _add$1(element) {
+ var t2, t3, newTable, split, _this = this,
+ t1 = _this.$ti;
+ t1._precomputed1._as(element);
+ B.JSArray_methods.$indexSet(_this._table, _this._tail, element);
+ t2 = _this._tail;
+ t3 = _this._table.length;
+ t2 = (t2 + 1 & t3 - 1) >>> 0;
+ _this._tail = t2;
+ if (_this._head === t2) {
+ newTable = A.List_List$filled(t3 * 2, null, false, t1._eval$1("1?"));
+ t1 = _this._table;
+ t2 = _this._head;
+ split = t1.length - t2;
+ B.JSArray_methods.setRange$4(newTable, 0, split, t1, t2);
+ B.JSArray_methods.setRange$4(newTable, split, split + _this._head, _this._table, 0);
+ _this._head = 0;
+ _this._tail = _this._table.length;
+ _this.set$_table(newTable);
+ }
+ ++_this._modificationCount;
+ },
+ set$_table(_table) {
+ this._table = this.$ti._eval$1("List<1?>")._as(_table);
+ },
+ $isQueue: 1
+ };
+ A._ListQueueIterator.prototype = {
+ get$current() {
+ var t1 = this._collection$_current;
+ return t1 == null ? this.$ti._precomputed1._as(t1) : t1;
+ },
+ moveNext$0() {
+ var t2, t3, _this = this,
+ t1 = _this._queue;
+ if (_this._modificationCount !== t1._modificationCount)
+ A.throwExpression(A.ConcurrentModificationError$(t1));
+ t2 = _this._position;
+ if (t2 === _this._end) {
+ _this.set$_collection$_current(null);
+ return false;
+ }
+ t3 = t1._table;
+ if (!(t2 < t3.length))
+ return A.ioore(t3, t2);
+ _this.set$_collection$_current(t3[t2]);
+ _this._position = (_this._position + 1 & t1._table.length - 1) >>> 0;
+ return true;
+ },
+ set$_collection$_current(_current) {
+ this._collection$_current = this.$ti._eval$1("1?")._as(_current);
+ }
+ };
+ A._UnmodifiableMapView_MapView__UnmodifiableMapMixin.prototype = {};
+ A._JsonMap.prototype = {
+ $index(_, key) {
+ var result,
+ t1 = this._processed;
+ if (t1 == null)
+ return this._data.$index(0, key);
+ else if (typeof key != "string")
+ return null;
+ else {
+ result = t1[key];
+ return typeof result == "undefined" ? this._process$1(key) : result;
+ }
+ },
+ get$length(_) {
+ return this._processed == null ? this._data.__js_helper$_length : this._convert$_computeKeys$0().length;
+ },
+ get$isEmpty(_) {
+ return this.get$length(0) === 0;
+ },
+ get$keys() {
+ if (this._processed == null) {
+ var t1 = this._data;
+ return new A.LinkedHashMapKeyIterable(t1, A._instanceType(t1)._eval$1("LinkedHashMapKeyIterable<1>"));
+ }
+ return new A._JsonMapKeyIterable(this);
+ },
+ forEach$1(_, f) {
+ var keys, i, key, value, _this = this;
+ type$.void_Function_String_dynamic._as(f);
+ if (_this._processed == null)
+ return _this._data.forEach$1(0, f);
+ keys = _this._convert$_computeKeys$0();
+ for (i = 0; i < keys.length; ++i) {
+ key = keys[i];
+ value = _this._processed[key];
+ if (typeof value == "undefined") {
+ value = A._convertJsonToDartLazy(_this._original[key]);
+ _this._processed[key] = value;
+ }
+ f.call$2(key, value);
+ if (keys !== _this._data)
+ throw A.wrapException(A.ConcurrentModificationError$(_this));
+ }
+ },
+ _convert$_computeKeys$0() {
+ var keys = type$.nullable_List_dynamic._as(this._data);
+ if (keys == null)
+ keys = this._data = A._setArrayType(Object.keys(this._original), type$.JSArray_String);
+ return keys;
+ },
+ _process$1(key) {
+ var result;
+ if (!Object.prototype.hasOwnProperty.call(this._original, key))
+ return null;
+ result = A._convertJsonToDartLazy(this._original[key]);
+ return this._processed[key] = result;
+ }
+ };
+ A._JsonMapKeyIterable.prototype = {
+ get$length(_) {
+ return this._parent.get$length(0);
+ },
+ elementAt$1(_, index) {
+ var t1 = this._parent;
+ if (t1._processed == null)
+ t1 = t1.get$keys().elementAt$1(0, index);
+ else {
+ t1 = t1._convert$_computeKeys$0();
+ if (!(index >= 0 && index < t1.length))
+ return A.ioore(t1, index);
+ t1 = t1[index];
+ }
+ return t1;
+ },
+ get$iterator(_) {
+ var t1 = this._parent;
+ if (t1._processed == null) {
+ t1 = t1.get$keys();
+ t1 = t1.get$iterator(t1);
+ } else {
+ t1 = t1._convert$_computeKeys$0();
+ t1 = new J.ArrayIterator(t1, t1.length, A._arrayInstanceType(t1)._eval$1("ArrayIterator<1>"));
+ }
+ return t1;
+ }
+ };
+ A.Codec.prototype = {};
+ A.Converter.prototype = {};
+ A.JsonUnsupportedObjectError.prototype = {
+ toString$0(_) {
+ var safeString = A.Error_safeToString(this.unsupportedObject);
+ return (this.cause != null ? "Converting object to an encodable object failed:" : "Converting object did not return an encodable object:") + " " + safeString;
+ }
+ };
+ A.JsonCyclicError.prototype = {
+ toString$0(_) {
+ return "Cyclic error in JSON stringify";
+ }
+ };
+ A.JsonCodec.prototype = {
+ decode$2$reviver(source, reviver) {
+ var t1 = A._parseJson(source, this.get$decoder()._reviver);
+ return t1;
+ },
+ encode$2$toEncodable(value, toEncodable) {
+ var t1 = A._JsonStringStringifier_stringify(value, this.get$encoder()._toEncodable, null);
+ return t1;
+ },
+ get$encoder() {
+ return B.JsonEncoder_null;
+ },
+ get$decoder() {
+ return B.JsonDecoder_null;
+ }
+ };
+ A.JsonEncoder.prototype = {};
+ A.JsonDecoder.prototype = {};
+ A._JsonStringifier.prototype = {
+ writeStringContent$1(s) {
+ var offset, i, charCode, t1, t2, _this = this,
+ $length = s.length;
+ for (offset = 0, i = 0; i < $length; ++i) {
+ charCode = s.charCodeAt(i);
+ if (charCode > 92) {
+ if (charCode >= 55296) {
+ t1 = charCode & 64512;
+ if (t1 === 55296) {
+ t2 = i + 1;
+ t2 = !(t2 < $length && (s.charCodeAt(t2) & 64512) === 56320);
+ } else
+ t2 = false;
+ if (!t2)
+ if (t1 === 56320) {
+ t1 = i - 1;
+ t1 = !(t1 >= 0 && (s.charCodeAt(t1) & 64512) === 55296);
+ } else
+ t1 = false;
+ else
+ t1 = true;
+ if (t1) {
+ if (i > offset)
+ _this.writeStringSlice$3(s, offset, i);
+ offset = i + 1;
+ _this.writeCharCode$1(92);
+ _this.writeCharCode$1(117);
+ _this.writeCharCode$1(100);
+ t1 = charCode >>> 8 & 15;
+ _this.writeCharCode$1(t1 < 10 ? 48 + t1 : 87 + t1);
+ t1 = charCode >>> 4 & 15;
+ _this.writeCharCode$1(t1 < 10 ? 48 + t1 : 87 + t1);
+ t1 = charCode & 15;
+ _this.writeCharCode$1(t1 < 10 ? 48 + t1 : 87 + t1);
+ }
+ }
+ continue;
+ }
+ if (charCode < 32) {
+ if (i > offset)
+ _this.writeStringSlice$3(s, offset, i);
+ offset = i + 1;
+ _this.writeCharCode$1(92);
+ switch (charCode) {
+ case 8:
+ _this.writeCharCode$1(98);
+ break;
+ case 9:
+ _this.writeCharCode$1(116);
+ break;
+ case 10:
+ _this.writeCharCode$1(110);
+ break;
+ case 12:
+ _this.writeCharCode$1(102);
+ break;
+ case 13:
+ _this.writeCharCode$1(114);
+ break;
+ default:
+ _this.writeCharCode$1(117);
+ _this.writeCharCode$1(48);
+ _this.writeCharCode$1(48);
+ t1 = charCode >>> 4 & 15;
+ _this.writeCharCode$1(t1 < 10 ? 48 + t1 : 87 + t1);
+ t1 = charCode & 15;
+ _this.writeCharCode$1(t1 < 10 ? 48 + t1 : 87 + t1);
+ break;
+ }
+ } else if (charCode === 34 || charCode === 92) {
+ if (i > offset)
+ _this.writeStringSlice$3(s, offset, i);
+ offset = i + 1;
+ _this.writeCharCode$1(92);
+ _this.writeCharCode$1(charCode);
+ }
+ }
+ if (offset === 0)
+ _this.writeString$1(s);
+ else if (offset < $length)
+ _this.writeStringSlice$3(s, offset, $length);
+ },
+ _checkCycle$1(object) {
+ var t1, t2, i, t3;
+ for (t1 = this._seen, t2 = t1.length, i = 0; i < t2; ++i) {
+ t3 = t1[i];
+ if (object == null ? t3 == null : object === t3)
+ throw A.wrapException(new A.JsonCyclicError(object, null));
+ }
+ B.JSArray_methods.add$1(t1, object);
+ },
+ writeObject$1(object) {
+ var customJson, e, t1, exception, _this = this;
+ if (_this.writeJsonValue$1(object))
+ return;
+ _this._checkCycle$1(object);
+ try {
+ customJson = _this._toEncodable.call$1(object);
+ if (!_this.writeJsonValue$1(customJson)) {
+ t1 = A.JsonUnsupportedObjectError$(object, null, _this.get$_partialResult());
+ throw A.wrapException(t1);
+ }
+ t1 = _this._seen;
+ if (0 >= t1.length)
+ return A.ioore(t1, -1);
+ t1.pop();
+ } catch (exception) {
+ e = A.unwrapException(exception);
+ t1 = A.JsonUnsupportedObjectError$(object, e, _this.get$_partialResult());
+ throw A.wrapException(t1);
+ }
+ },
+ writeJsonValue$1(object) {
+ var t1, success, _this = this;
+ if (typeof object == "number") {
+ if (!isFinite(object))
+ return false;
+ _this.writeNumber$1(object);
+ return true;
+ } else if (object === true) {
+ _this.writeString$1("true");
+ return true;
+ } else if (object === false) {
+ _this.writeString$1("false");
+ return true;
+ } else if (object == null) {
+ _this.writeString$1("null");
+ return true;
+ } else if (typeof object == "string") {
+ _this.writeString$1('"');
+ _this.writeStringContent$1(object);
+ _this.writeString$1('"');
+ return true;
+ } else if (type$.List_dynamic._is(object)) {
+ _this._checkCycle$1(object);
+ _this.writeList$1(object);
+ t1 = _this._seen;
+ if (0 >= t1.length)
+ return A.ioore(t1, -1);
+ t1.pop();
+ return true;
+ } else if (type$.Map_dynamic_dynamic._is(object)) {
+ _this._checkCycle$1(object);
+ success = _this.writeMap$1(object);
+ t1 = _this._seen;
+ if (0 >= t1.length)
+ return A.ioore(t1, -1);
+ t1.pop();
+ return success;
+ } else
+ return false;
+ },
+ writeList$1(list) {
+ var t1, i, _this = this;
+ _this.writeString$1("[");
+ t1 = J.getInterceptor$asx(list);
+ if (t1.get$isNotEmpty(list)) {
+ _this.writeObject$1(t1.$index(list, 0));
+ for (i = 1; i < t1.get$length(list); ++i) {
+ _this.writeString$1(",");
+ _this.writeObject$1(t1.$index(list, i));
+ }
+ }
+ _this.writeString$1("]");
+ },
+ writeMap$1(map) {
+ var t1, keyValueList, i, separator, t2, _this = this, _box_0 = {};
+ if (map.get$isEmpty(map)) {
+ _this.writeString$1("{}");
+ return true;
+ }
+ t1 = map.get$length(map) * 2;
+ keyValueList = A.List_List$filled(t1, null, false, type$.nullable_Object);
+ i = _box_0.i = 0;
+ _box_0.allStringKeys = true;
+ map.forEach$1(0, new A._JsonStringifier_writeMap_closure(_box_0, keyValueList));
+ if (!_box_0.allStringKeys)
+ return false;
+ _this.writeString$1("{");
+ for (separator = '"'; i < t1; i += 2, separator = ',"') {
+ _this.writeString$1(separator);
+ _this.writeStringContent$1(A._asString(keyValueList[i]));
+ _this.writeString$1('":');
+ t2 = i + 1;
+ if (!(t2 < t1))
+ return A.ioore(keyValueList, t2);
+ _this.writeObject$1(keyValueList[t2]);
+ }
+ _this.writeString$1("}");
+ return true;
+ }
+ };
+ A._JsonStringifier_writeMap_closure.prototype = {
+ call$2(key, value) {
+ var t1, t2;
+ if (typeof key != "string")
+ this._box_0.allStringKeys = false;
+ t1 = this.keyValueList;
+ t2 = this._box_0;
+ B.JSArray_methods.$indexSet(t1, t2.i++, key);
+ B.JSArray_methods.$indexSet(t1, t2.i++, value);
+ },
+ $signature: 10
+ };
+ A._JsonStringStringifier.prototype = {
+ get$_partialResult() {
+ var t1 = this._sink._contents;
+ return t1.charCodeAt(0) == 0 ? t1 : t1;
+ },
+ writeNumber$1(number) {
+ var t1 = this._sink,
+ t2 = B.JSNumber_methods.toString$0(number);
+ t1._contents += t2;
+ },
+ writeString$1(string) {
+ this._sink._contents += string;
+ },
+ writeStringSlice$3(string, start, end) {
+ this._sink._contents += B.JSString_methods.substring$2(string, start, end);
+ },
+ writeCharCode$1(charCode) {
+ var t1 = this._sink,
+ t2 = A.Primitives_stringFromCharCode(charCode);
+ t1._contents += t2;
+ }
+ };
+ A.NoSuchMethodError_toString_closure.prototype = {
+ call$2(key, value) {
+ var t1, t2, t3;
+ type$.Symbol._as(key);
+ t1 = this.sb;
+ t2 = this._box_0;
+ t3 = t1._contents += t2.comma;
+ t3 += key._name;
+ t1._contents = t3;
+ t1._contents = t3 + ": ";
+ t3 = A.Error_safeToString(value);
+ t1._contents += t3;
+ t2.comma = ", ";
+ },
+ $signature: 19
+ };
+ A.DateTime.prototype = {
+ $eq(_, other) {
+ if (other == null)
+ return false;
+ return other instanceof A.DateTime && this._value === other._value && this.isUtc === other.isUtc;
+ },
+ get$hashCode(_) {
+ var t1 = this._value;
+ return (t1 ^ B.JSInt_methods._shrOtherPositive$1(t1, 30)) & 1073741823;
+ },
+ toString$0(_) {
+ var _this = this,
+ y = A.DateTime__fourDigits(A.Primitives_getYear(_this)),
+ m = A.DateTime__twoDigits(A.Primitives_getMonth(_this)),
+ d = A.DateTime__twoDigits(A.Primitives_getDay(_this)),
+ h = A.DateTime__twoDigits(A.Primitives_getHours(_this)),
+ min = A.DateTime__twoDigits(A.Primitives_getMinutes(_this)),
+ sec = A.DateTime__twoDigits(A.Primitives_getSeconds(_this)),
+ ms = A.DateTime__threeDigits(A.Primitives_getMilliseconds(_this)),
+ t1 = y + "-" + m;
+ if (_this.isUtc)
+ return t1 + "-" + d + " " + h + ":" + min + ":" + sec + "." + ms + "Z";
+ else
+ return t1 + "-" + d + " " + h + ":" + min + ":" + sec + "." + ms;
+ }
+ };
+ A.Duration.prototype = {
+ $eq(_, other) {
+ if (other == null)
+ return false;
+ return other instanceof A.Duration && this._duration === other._duration;
+ },
+ get$hashCode(_) {
+ return B.JSInt_methods.get$hashCode(this._duration);
+ },
+ toString$0(_) {
+ var minutesPadding, seconds, secondsPadding,
+ microseconds = this._duration,
+ microseconds0 = microseconds % 3600000000,
+ minutes = B.JSInt_methods._tdivFast$1(microseconds0, 60000000);
+ microseconds0 %= 60000000;
+ minutesPadding = minutes < 10 ? "0" : "";
+ seconds = B.JSInt_methods._tdivFast$1(microseconds0, 1000000);
+ secondsPadding = seconds < 10 ? "0" : "";
+ return "" + (microseconds / 3600000000 | 0) + ":" + minutesPadding + minutes + ":" + secondsPadding + seconds + "." + B.JSString_methods.padLeft$2(B.JSInt_methods.toString$0(microseconds0 % 1000000), 6, "0");
+ }
+ };
+ A.Error.prototype = {
+ get$stackTrace() {
+ return A.getTraceFromException(this.$thrownJsError);
+ }
+ };
+ A.AssertionError.prototype = {
+ toString$0(_) {
+ var t1 = this.message;
+ if (t1 != null)
+ return "Assertion failed: " + A.Error_safeToString(t1);
+ return "Assertion failed";
+ }
+ };
+ A.TypeError.prototype = {};
+ A.ArgumentError.prototype = {
+ get$_errorName() {
+ return "Invalid argument" + (!this._hasValue ? "(s)" : "");
+ },
+ get$_errorExplanation() {
+ return "";
+ },
+ toString$0(_) {
+ var _this = this,
+ $name = _this.name,
+ nameString = $name == null ? "" : " (" + $name + ")",
+ message = _this.message,
+ messageString = message == null ? "" : ": " + A.S(message),
+ prefix = _this.get$_errorName() + nameString + messageString;
+ if (!_this._hasValue)
+ return prefix;
+ return prefix + _this.get$_errorExplanation() + ": " + A.Error_safeToString(_this.get$invalidValue());
+ },
+ get$invalidValue() {
+ return this.invalidValue;
+ }
+ };
+ A.RangeError.prototype = {
+ get$invalidValue() {
+ return A._asNumQ(this.invalidValue);
+ },
+ get$_errorName() {
+ return "RangeError";
+ },
+ get$_errorExplanation() {
+ var explanation,
+ start = this.start,
+ end = this.end;
+ if (start == null)
+ explanation = end != null ? ": Not less than or equal to " + A.S(end) : "";
+ else if (end == null)
+ explanation = ": Not greater than or equal to " + A.S(start);
+ else if (end > start)
+ explanation = ": Not in inclusive range " + A.S(start) + ".." + A.S(end);
+ else
+ explanation = end < start ? ": Valid value range is empty" : ": Only valid value is " + A.S(start);
+ return explanation;
+ }
+ };
+ A.IndexError.prototype = {
+ get$invalidValue() {
+ return A._asInt(this.invalidValue);
+ },
+ get$_errorName() {
+ return "RangeError";
+ },
+ get$_errorExplanation() {
+ if (A._asInt(this.invalidValue) < 0)
+ return ": index must not be negative";
+ var t1 = this.length;
+ if (t1 === 0)
+ return ": no indices are valid";
+ return ": index should be less than " + t1;
+ },
+ get$length(receiver) {
+ return this.length;
+ }
+ };
+ A.NoSuchMethodError.prototype = {
+ toString$0(_) {
+ var $arguments, t1, _i, t2, t3, argument, receiverText, actualParameters, _this = this, _box_0 = {},
+ sb = new A.StringBuffer("");
+ _box_0.comma = "";
+ $arguments = _this._core$_arguments;
+ for (t1 = $arguments.length, _i = 0, t2 = "", t3 = ""; _i < t1; ++_i, t3 = ", ") {
+ argument = $arguments[_i];
+ sb._contents = t2 + t3;
+ t2 = A.Error_safeToString(argument);
+ t2 = sb._contents += t2;
+ _box_0.comma = ", ";
+ }
+ _this._namedArguments.forEach$1(0, new A.NoSuchMethodError_toString_closure(_box_0, sb));
+ receiverText = A.Error_safeToString(_this._core$_receiver);
+ actualParameters = sb.toString$0(0);
+ return "NoSuchMethodError: method not found: '" + _this._core$_memberName._name + "'\nReceiver: " + receiverText + "\nArguments: [" + actualParameters + "]";
+ }
+ };
+ A.UnsupportedError.prototype = {
+ toString$0(_) {
+ return "Unsupported operation: " + this.message;
+ }
+ };
+ A.UnimplementedError.prototype = {
+ toString$0(_) {
+ return "UnimplementedError: " + this.message;
+ }
+ };
+ A.StateError.prototype = {
+ toString$0(_) {
+ return "Bad state: " + this.message;
+ }
+ };
+ A.ConcurrentModificationError.prototype = {
+ toString$0(_) {
+ var t1 = this.modifiedObject;
+ if (t1 == null)
+ return "Concurrent modification during iteration.";
+ return "Concurrent modification during iteration: " + A.Error_safeToString(t1) + ".";
+ }
+ };
+ A.OutOfMemoryError.prototype = {
+ toString$0(_) {
+ return "Out of Memory";
+ },
+ get$stackTrace() {
+ return null;
+ },
+ $isError: 1
+ };
+ A.StackOverflowError.prototype = {
+ toString$0(_) {
+ return "Stack Overflow";
+ },
+ get$stackTrace() {
+ return null;
+ },
+ $isError: 1
+ };
+ A._Exception.prototype = {
+ toString$0(_) {
+ return "Exception: " + this.message;
+ }
+ };
+ A.FormatException.prototype = {
+ toString$0(_) {
+ var t1, lineEnd, lineNum, lineStart, previousCharWasCR, i, char, end, start, prefix, postfix,
+ message = this.message,
+ report = "" !== message ? "FormatException: " + message : "FormatException",
+ offset = this.offset,
+ source = this.source;
+ if (typeof source == "string") {
+ if (offset != null)
+ t1 = offset < 0 || offset > source.length;
+ else
+ t1 = false;
+ if (t1)
+ offset = null;
+ if (offset == null) {
+ if (source.length > 78)
+ source = B.JSString_methods.substring$2(source, 0, 75) + "...";
+ return report + "\n" + source;
+ }
+ for (lineEnd = source.length, lineNum = 1, lineStart = 0, previousCharWasCR = false, i = 0; i < offset; ++i) {
+ if (!(i < lineEnd))
+ return A.ioore(source, i);
+ char = source.charCodeAt(i);
+ if (char === 10) {
+ if (lineStart !== i || !previousCharWasCR)
+ ++lineNum;
+ lineStart = i + 1;
+ previousCharWasCR = false;
+ } else if (char === 13) {
+ ++lineNum;
+ lineStart = i + 1;
+ previousCharWasCR = true;
+ }
+ }
+ report = lineNum > 1 ? report + (" (at line " + lineNum + ", character " + (offset - lineStart + 1) + ")\n") : report + (" (at character " + (offset + 1) + ")\n");
+ for (i = offset; i < lineEnd; ++i) {
+ if (!(i >= 0))
+ return A.ioore(source, i);
+ char = source.charCodeAt(i);
+ if (char === 10 || char === 13) {
+ lineEnd = i;
+ break;
+ }
+ }
+ if (lineEnd - lineStart > 78)
+ if (offset - lineStart < 75) {
+ end = lineStart + 75;
+ start = lineStart;
+ prefix = "";
+ postfix = "...";
+ } else {
+ if (lineEnd - offset < 75) {
+ start = lineEnd - 75;
+ end = lineEnd;
+ postfix = "";
+ } else {
+ start = offset - 36;
+ end = offset + 36;
+ postfix = "...";
+ }
+ prefix = "...";
+ }
+ else {
+ end = lineEnd;
+ start = lineStart;
+ prefix = "";
+ postfix = "";
+ }
+ return report + prefix + B.JSString_methods.substring$2(source, start, end) + postfix + "\n" + B.JSString_methods.$mul(" ", offset - start + prefix.length) + "^\n";
+ } else
+ return offset != null ? report + (" (at offset " + A.S(offset) + ")") : report;
+ }
+ };
+ A.Iterable.prototype = {
+ get$length(_) {
+ var count,
+ it = this.get$iterator(this);
+ for (count = 0; it.moveNext$0();)
+ ++count;
+ return count;
+ },
+ elementAt$1(_, index) {
+ var iterator, skipCount;
+ A.RangeError_checkNotNegative(index, "index");
+ iterator = this.get$iterator(this);
+ for (skipCount = index; iterator.moveNext$0();) {
+ if (skipCount === 0)
+ return iterator.get$current();
+ --skipCount;
+ }
+ throw A.wrapException(A.IndexError$withLength(index, index - skipCount, this, null, "index"));
+ },
+ toString$0(_) {
+ return A.Iterable_iterableToShortString(this, "(", ")");
+ }
+ };
+ A.Null.prototype = {
+ get$hashCode(_) {
+ return A.Object.prototype.get$hashCode.call(this, 0);
+ },
+ toString$0(_) {
+ return "null";
+ }
+ };
+ A.Object.prototype = {$isObject: 1,
+ $eq(_, other) {
+ return this === other;
+ },
+ get$hashCode(_) {
+ return A.Primitives_objectHashCode(this);
+ },
+ toString$0(_) {
+ return "Instance of '" + A.Primitives_objectTypeName(this) + "'";
+ },
+ noSuchMethod$1(_, invocation) {
+ throw A.wrapException(A.NoSuchMethodError_NoSuchMethodError$withInvocation(this, type$.Invocation._as(invocation)));
+ },
+ get$runtimeType(_) {
+ return A.getRuntimeTypeOfDartObject(this);
+ },
+ toString() {
+ return this.toString$0(this);
+ }
+ };
+ A._StringStackTrace.prototype = {
+ toString$0(_) {
+ return this._stackTrace;
+ },
+ $isStackTrace: 1
+ };
+ A.StringBuffer.prototype = {
+ get$length(_) {
+ return this._contents.length;
+ },
+ toString$0(_) {
+ var t1 = this._contents;
+ return t1.charCodeAt(0) == 0 ? t1 : t1;
+ },
+ $isStringSink: 1
+ };
+ A.promiseToFuture_closure.prototype = {
+ call$1(r) {
+ return this.completer.complete$1(this.T._eval$1("0/?")._as(r));
+ },
+ $signature: 3
+ };
+ A.promiseToFuture_closure0.prototype = {
+ call$1(e) {
+ if (e == null)
+ return this.completer.completeError$1(new A.NullRejectionException(e === undefined));
+ return this.completer.completeError$1(e);
+ },
+ $signature: 3
+ };
+ A.dartify_convert.prototype = {
+ call$1(o) {
+ var t1, millisSinceEpoch, proto, t2, dartObject, originalKeys, dartKeys, i, jsKey, dartKey, l, $length;
+ if (A._noDartifyRequired(o))
+ return o;
+ t1 = this._convertedObjects;
+ o.toString;
+ if (t1.containsKey$1(o))
+ return t1.$index(0, o);
+ if (o instanceof Date) {
+ millisSinceEpoch = o.getTime();
+ if (Math.abs(millisSinceEpoch) > 864e13)
+ A.throwExpression(A.ArgumentError$("DateTime is outside valid range: " + millisSinceEpoch, null));
+ A.checkNotNullable(true, "isUtc", type$.bool);
+ return new A.DateTime(millisSinceEpoch, true);
+ }
+ if (o instanceof RegExp)
+ throw A.wrapException(A.ArgumentError$("structured clone of RegExp", null));
+ if (typeof Promise != "undefined" && o instanceof Promise)
+ return A.promiseToFuture(o, type$.nullable_Object);
+ proto = Object.getPrototypeOf(o);
+ if (proto === Object.prototype || proto === null) {
+ t2 = type$.nullable_Object;
+ dartObject = A.LinkedHashMap_LinkedHashMap$_empty(t2, t2);
+ t1.$indexSet(0, o, dartObject);
+ originalKeys = Object.keys(o);
+ dartKeys = [];
+ for (t1 = J.getInterceptor$ax(originalKeys), t2 = t1.get$iterator(originalKeys); t2.moveNext$0();)
+ dartKeys.push(A.dartify(t2.get$current()));
+ for (i = 0; i < t1.get$length(originalKeys); ++i) {
+ jsKey = t1.$index(originalKeys, i);
+ if (!(i < dartKeys.length))
+ return A.ioore(dartKeys, i);
+ dartKey = dartKeys[i];
+ if (jsKey != null)
+ dartObject.$indexSet(0, dartKey, this.call$1(o[jsKey]));
+ }
+ return dartObject;
+ }
+ if (o instanceof Array) {
+ l = o;
+ dartObject = [];
+ t1.$indexSet(0, o, dartObject);
+ $length = A._asInt(o.length);
+ for (t1 = J.getInterceptor$asx(l), i = 0; i < $length; ++i)
+ dartObject.push(this.call$1(t1.$index(l, i)));
+ return dartObject;
+ }
+ return o;
+ },
+ $signature: 20
+ };
+ A.NullRejectionException.prototype = {
+ toString$0(_) {
+ return "Promise was rejected with a value of `" + (this.isUndefined ? "undefined" : "null") + "`.";
+ }
+ };
+ A._JSRandom.prototype = {
+ nextInt$1(max) {
+ if (max <= 0 || max > 4294967296)
+ throw A.wrapException(A.RangeError$("max must be in range 0 < max \u2264 2^32, was " + max));
+ return Math.random() * max >>> 0;
+ }
+ };
+ A.AsyncMemoizer.prototype = {};
+ A.Level.prototype = {
+ $eq(_, other) {
+ if (other == null)
+ return false;
+ return other instanceof A.Level && this.value === other.value;
+ },
+ get$hashCode(_) {
+ return this.value;
+ },
+ toString$0(_) {
+ return this.name;
+ }
+ };
+ A.LogRecord.prototype = {
+ toString$0(_) {
+ return "[" + this.level.name + "] " + this.loggerName + ": " + this.message;
+ }
+ };
+ A.Logger.prototype = {
+ get$fullName() {
+ var t1 = this.parent,
+ t2 = t1 == null ? null : t1.name.length !== 0,
+ t3 = this.name;
+ return t2 === true ? t1.get$fullName() + "." + t3 : t3;
+ },
+ get$level() {
+ var t1, effectiveLevel;
+ if (this.parent == null) {
+ t1 = this._level;
+ t1.toString;
+ effectiveLevel = t1;
+ } else {
+ t1 = $.$get$Logger_root()._level;
+ t1.toString;
+ effectiveLevel = t1;
+ }
+ return effectiveLevel;
+ },
+ log$4(logLevel, message, error, stackTrace) {
+ var record, _this = this,
+ t1 = logLevel.value;
+ if (t1 >= _this.get$level().value) {
+ if (t1 >= 2000) {
+ A.StackTrace_current();
+ logLevel.toString$0(0);
+ }
+ t1 = _this.get$fullName();
+ Date.now();
+ $.LogRecord__nextNumber = $.LogRecord__nextNumber + 1;
+ record = new A.LogRecord(logLevel, message, t1);
+ if (_this.parent == null)
+ _this._publish$1(record);
+ else
+ $.$get$Logger_root()._publish$1(record);
+ }
+ },
+ _publish$1(record) {
+ return null;
+ }
+ };
+ A.Logger_Logger_closure.prototype = {
+ call$0() {
+ var dot, $parent, t1,
+ thisName = this.name;
+ if (B.JSString_methods.startsWith$1(thisName, "."))
+ A.throwExpression(A.ArgumentError$("name shouldn't start with a '.'", null));
+ if (B.JSString_methods.endsWith$1(thisName, "."))
+ A.throwExpression(A.ArgumentError$("name shouldn't end with a '.'", null));
+ dot = B.JSString_methods.lastIndexOf$1(thisName, ".");
+ if (dot === -1)
+ $parent = thisName !== "" ? A.Logger_Logger("") : null;
+ else {
+ $parent = A.Logger_Logger(B.JSString_methods.substring$2(thisName, 0, dot));
+ thisName = B.JSString_methods.substring$1(thisName, dot + 1);
+ }
+ t1 = new A.Logger(thisName, $parent, A.LinkedHashMap_LinkedHashMap$_empty(type$.String, type$.Logger));
+ if ($parent == null)
+ t1._level = B.Level_INFO_800;
+ else
+ $parent._children.$indexSet(0, thisName, t1);
+ return t1;
+ },
+ $signature: 21
+ };
+ A.Pool.prototype = {
+ request$0() {
+ var t1, t2, _this = this;
+ if ((_this._closeMemo._completer.future._state & 30) !== 0)
+ throw A.wrapException(A.StateError$("request() may not be called on a closed Pool."));
+ t1 = _this._allocatedResources;
+ if (t1 < _this._maxAllocatedResources) {
+ _this._allocatedResources = t1 + 1;
+ return A.Future_Future$value(new A.PoolResource(_this), type$.PoolResource);
+ } else {
+ t1 = _this._onReleaseCallbacks;
+ if (!t1.get$isEmpty(0))
+ return _this._runOnRelease$1(t1.removeFirst$0());
+ else {
+ t1 = new A._Future($.Zone__current, type$._Future_PoolResource);
+ t2 = _this._requestedResources;
+ t2._add$1(t2.$ti._precomputed1._as(new A._AsyncCompleter(t1, type$._AsyncCompleter_PoolResource)));
+ _this._resetTimer$0();
+ return t1;
+ }
+ }
+ },
+ withResource$1$1(callback, $T) {
+ return this.withResource$body$Pool($T._eval$1("0/()")._as(callback), $T, $T);
+ },
+ withResource$body$Pool(callback, $T, $async$type) {
+ var $async$goto = 0,
+ $async$completer = A._makeAsyncAwaitCompleter($async$type),
+ $async$returnValue, $async$handler = 2, $async$currentError, $async$next = [], $async$self = this, resource, t1, t2;
+ var $async$withResource$1$1 = A._wrapJsFunctionForAsync(function($async$errorCode, $async$result) {
+ if ($async$errorCode === 1) {
+ $async$currentError = $async$result;
+ $async$goto = $async$handler;
+ }
+ while (true)
+ switch ($async$goto) {
+ case 0:
+ // Function start
+ if (($async$self._closeMemo._completer.future._state & 30) !== 0)
+ throw A.wrapException(A.StateError$("withResource() may not be called on a closed Pool."));
+ $async$goto = 3;
+ return A._asyncAwait($async$self.request$0(), $async$withResource$1$1);
+ case 3:
+ // returning from await.
+ resource = $async$result;
+ $async$handler = 4;
+ t1 = callback.call$0();
+ $async$goto = 7;
+ return A._asyncAwait($T._eval$1("Future<0>")._is(t1) ? t1 : A._Future$value($T._as(t1), $T), $async$withResource$1$1);
+ case 7:
+ // returning from await.
+ t1 = $async$result;
+ $async$returnValue = t1;
+ $async$next = [1];
+ // goto finally
+ $async$goto = 5;
+ break;
+ $async$next.push(6);
+ // goto finally
+ $async$goto = 5;
+ break;
+ case 4:
+ // uncaught
+ $async$next = [2];
+ case 5:
+ // finally
+ $async$handler = 2;
+ t1 = resource;
+ if (t1._released)
+ A.throwExpression(A.StateError$("A PoolResource may only be released once."));
+ t1._released = true;
+ t1 = t1._pool;
+ t1._resetTimer$0();
+ t2 = t1._requestedResources;
+ if (!t2.get$isEmpty(0))
+ t2.removeFirst$0().complete$1(new A.PoolResource(t1));
+ else {
+ t2 = --t1._allocatedResources;
+ if ((t1._closeMemo._completer.future._state & 30) !== 0 && t2 === 0)
+ null.close$0();
+ }
+ // goto the next finally handler
+ $async$goto = $async$next.pop();
+ break;
+ case 6:
+ // after finally
+ case 1:
+ // return
+ return A._asyncReturn($async$returnValue, $async$completer);
+ case 2:
+ // rethrow
+ return A._asyncRethrow($async$currentError, $async$completer);
+ }
+ });
+ return A._asyncStartSync($async$withResource$1$1, $async$completer);
+ },
+ _runOnRelease$1(onRelease) {
+ var t1 = A.Future_Future$sync(type$.dynamic_Function._as(onRelease), type$.dynamic).then$1$1(new A.Pool__runOnRelease_closure(this), type$.Null),
+ onError = new A.Pool__runOnRelease_closure0(this),
+ t2 = t1.$ti,
+ t3 = $.Zone__current;
+ if (t3 !== B.C__RootZone)
+ onError = A._registerErrorHandler(onError, t3);
+ t1._addListener$1(new A._FutureListener(new A._Future(t3, t2), 2, null, onError, t2._eval$1("@<1>")._bind$1(t2._precomputed1)._eval$1("_FutureListener<1,2>")));
+ t1 = new A._Future($.Zone__current, type$._Future_PoolResource);
+ t2 = this._onReleaseCompleters;
+ t2._add$1(t2.$ti._precomputed1._as(new A._SyncCompleter(t1, type$._SyncCompleter_PoolResource)));
+ return t1;
+ },
+ _resetTimer$0() {
+ var t2,
+ t1 = this._timer;
+ if (t1 == null)
+ return;
+ t2 = this._requestedResources;
+ if (t2._head === t2._tail)
+ t1._restartable_timer$_timer.cancel$0();
+ else {
+ t1._restartable_timer$_timer.cancel$0();
+ t1._restartable_timer$_timer = A.Timer_Timer(t1._restartable_timer$_duration, t1._callback);
+ }
+ }
+ };
+ A.Pool__runOnRelease_closure.prototype = {
+ call$1(value) {
+ var t1 = this.$this;
+ t1._onReleaseCompleters.removeFirst$0().complete$1(new A.PoolResource(t1));
+ },
+ $signature: 4
+ };
+ A.Pool__runOnRelease_closure0.prototype = {
+ call$2(error, stackTrace) {
+ type$.Object._as(error);
+ type$.StackTrace._as(stackTrace);
+ this.$this._onReleaseCompleters.removeFirst$0().completeError$2(error, stackTrace);
+ },
+ $signature: 5
+ };
+ A.PoolResource.prototype = {};
+ A.SseClient.prototype = {
+ SseClient$2$debugKey(serverUrl, debugKey) {
+ var t2, t3, _this = this,
+ t1 = serverUrl + "?sseClientId=" + _this._clientId;
+ _this.__SseClient__serverUrl_A = t1;
+ t2 = type$.JSObject;
+ t1 = t2._as(new self.EventSource(t1, {withCredentials: true}));
+ _this.__SseClient__eventSource_A = t1;
+ new A._EventStream(t1, "open", false, type$._EventStream_JSObject).get$first(0).whenComplete$1(new A.SseClient_closure(_this));
+ t1 = type$.Function;
+ t3 = type$.JavaScriptFunction;
+ _this.__SseClient__eventSource_A.addEventListener("message", t3._as(A.allowInterop(_this.get$_onIncomingMessage(), t1)));
+ _this.__SseClient__eventSource_A.addEventListener("control", t3._as(A.allowInterop(_this.get$_onIncomingControlMessage(), t1)));
+ t1 = type$.nullable_void_Function_JSObject;
+ A._EventStreamSubscription$(_this.__SseClient__eventSource_A, "open", t1._as(new A.SseClient_closure0(_this)), false, t2);
+ A._EventStreamSubscription$(_this.__SseClient__eventSource_A, "error", t1._as(new A.SseClient_closure1(_this)), false, t2);
+ },
+ close$0() {
+ var _this = this,
+ t1 = _this.__SseClient__eventSource_A;
+ t1 === $ && A.throwLateFieldNI("_eventSource");
+ t1.close();
+ if ((_this._onConnected.future._state & 30) === 0) {
+ t1 = _this._outgoingController;
+ new A._ControllerStream(t1, A._instanceType(t1)._eval$1("_ControllerStream<1>")).listen$2$cancelOnError(null, true).asFuture$1$1(null, type$.void);
+ }
+ _this._incomingController.close$0();
+ _this._outgoingController.close$0();
+ },
+ _closeWithError$1(error) {
+ var stackTrace, t2,
+ t1 = this._incomingController;
+ A.checkNotNullable(error, "error", type$.Object);
+ if (t1._state >= 4)
+ A.throwExpression(t1._badEventState$0());
+ stackTrace = A.AsyncError_defaultStackTrace(error);
+ t2 = t1._state;
+ if ((t2 & 1) !== 0)
+ t1._sendError$2(error, stackTrace);
+ else if ((t2 & 3) === 0)
+ t1._ensurePendingEvents$0().add$1(0, new A._DelayedError(error, stackTrace));
+ this.close$0();
+ t1 = this._onConnected;
+ if ((t1.future._state & 30) === 0)
+ t1.completeError$1(error);
+ },
+ _onIncomingControlMessage$1(message) {
+ var data = type$.JSObject._as(message).data;
+ if (J.$eq$(A.dartify(data), "close"))
+ this.close$0();
+ else
+ throw A.wrapException(A.UnsupportedError$("[" + this._clientId + '] Illegal Control Message "' + A.S(data) + '"'));
+ },
+ _onIncomingMessage$1(message) {
+ this._incomingController.add$1(0, A._asString(B.C_JsonCodec.decode$2$reviver(A._asString(type$.JSObject._as(message).data), null)));
+ },
+ _onOutgoingDone$0() {
+ this.close$0();
+ },
+ _onOutgoingMessage$1(message) {
+ return this._onOutgoingMessage$body$SseClient(A._asStringQ(message));
+ },
+ _onOutgoingMessage$body$SseClient(message) {
+ var $async$goto = 0,
+ $async$completer = A._makeAsyncAwaitCompleter(type$.void),
+ $async$self = this, t1;
+ var $async$_onOutgoingMessage$1 = A._wrapJsFunctionForAsync(function($async$errorCode, $async$result) {
+ if ($async$errorCode === 1)
+ return A._asyncRethrow($async$result, $async$completer);
+ while (true)
+ switch ($async$goto) {
+ case 0:
+ // Function start
+ t1 = {};
+ t1.encodedMessage = null;
+ $async$goto = 2;
+ return A._asyncAwait($.$get$_requestPool().withResource$1$1(new A.SseClient__onOutgoingMessage_closure(t1, $async$self, message), type$.Null), $async$_onOutgoingMessage$1);
+ case 2:
+ // returning from await.
+ // implicit return
+ return A._asyncReturn(null, $async$completer);
+ }
+ });
+ return A._asyncStartSync($async$_onOutgoingMessage$1, $async$completer);
+ }
+ };
+ A.SseClient_closure.prototype = {
+ call$0() {
+ var t2,
+ t1 = this.$this;
+ t1._onConnected.complete$0();
+ t2 = t1._outgoingController;
+ new A._ControllerStream(t2, A._instanceType(t2)._eval$1("_ControllerStream<1>")).listen$2$onDone(t1.get$_onOutgoingMessage(), t1.get$_onOutgoingDone());
+ },
+ $signature: 2
+ };
+ A.SseClient_closure0.prototype = {
+ call$1(_) {
+ var t1 = this.$this._errorTimer;
+ if (t1 != null)
+ t1.cancel$0();
+ },
+ $signature: 1
+ };
+ A.SseClient_closure1.prototype = {
+ call$1(error) {
+ var t1 = this.$this,
+ t2 = t1._errorTimer;
+ t2 = t2 == null ? null : t2._handle != null;
+ if (t2 !== true)
+ t1._errorTimer = A.Timer_Timer(B.Duration_5000000, new A.SseClient__closure(t1, error));
+ },
+ $signature: 1
+ };
+ A.SseClient__closure.prototype = {
+ call$0() {
+ this.$this._closeWithError$1(this.error);
+ },
+ $signature: 0
+ };
+ A.SseClient__onOutgoingMessage_closure.prototype = {
+ call$0() {
+ var $async$goto = 0,
+ $async$completer = A._makeAsyncAwaitCompleter(type$.Null),
+ $async$handler = 1, $async$currentError, $async$self = this, e, e0, url, error, augmentedError, exception, t1, t2, $async$exception;
+ var $async$call$0 = A._wrapJsFunctionForAsync(function($async$errorCode, $async$result) {
+ if ($async$errorCode === 1) {
+ $async$currentError = $async$result;
+ $async$goto = $async$handler;
+ }
+ while (true)
+ switch ($async$goto) {
+ case 0:
+ // Function start
+ try {
+ $async$self._box_0.encodedMessage = B.C_JsonCodec.encode$2$toEncodable($async$self.message, null);
+ } catch (exception) {
+ t1 = A.unwrapException(exception);
+ if (t1 instanceof A.JsonUnsupportedObjectError) {
+ e = t1;
+ t1 = $async$self.$this;
+ t1._logger.log$4(B.Level_WARNING_900, "[" + t1._clientId + "] Unable to encode outgoing message: " + A.S(e), null, null);
+ } else if (t1 instanceof A.ArgumentError) {
+ e0 = t1;
+ t1 = $async$self.$this;
+ t1._logger.log$4(B.Level_WARNING_900, "[" + t1._clientId + "] Invalid argument: " + A.S(e0), null, null);
+ } else
+ throw exception;
+ }
+ $async$handler = 3;
+ t1 = $async$self.$this;
+ t2 = t1.__SseClient__serverUrl_A;
+ t2 === $ && A.throwLateFieldNI("_serverUrl");
+ url = t2 + "&messageId=" + ++t1._lastMessageId;
+ t1 = $async$self._box_0.encodedMessage;
+ if (t1 == null)
+ t1 = null;
+ t1 = {method: "POST", body: t1, credentials: "include"};
+ t2 = type$.JSObject;
+ $async$goto = 6;
+ return A._asyncAwait(A.promiseToFuture(t2._as(t2._as(self.window).fetch(url, t1)), t2), $async$call$0);
+ case 6:
+ // returning from await.
+ $async$handler = 1;
+ // goto after finally
+ $async$goto = 5;
+ break;
+ case 3:
+ // catch
+ $async$handler = 2;
+ $async$exception = $async$currentError;
+ error = A.unwrapException($async$exception);
+ t1 = $async$self.$this;
+ augmentedError = "[" + t1._clientId + "] SSE client failed to send " + A.S($async$self.message) + ":\n " + A.S(error);
+ t1._logger.log$4(B.Level_SEVERE_1000, augmentedError, null, null);
+ t1._closeWithError$1(augmentedError);
+ // goto after finally
+ $async$goto = 5;
+ break;
+ case 2:
+ // uncaught
+ // goto rethrow
+ $async$goto = 1;
+ break;
+ case 5:
+ // after finally
+ // implicit return
+ return A._asyncReturn(null, $async$completer);
+ case 1:
+ // rethrow
+ return A._asyncRethrow($async$currentError, $async$completer);
+ }
+ });
+ return A._asyncStartSync($async$call$0, $async$completer);
+ },
+ $signature: 7
+ };
+ A.generateUuidV4_generateBits.prototype = {
+ call$1(bitCount) {
+ return this.random.nextInt$1(B.JSInt_methods._shlPositive$1(1, bitCount));
+ },
+ $signature: 23
+ };
+ A.generateUuidV4_printDigits.prototype = {
+ call$2(value, count) {
+ return B.JSString_methods.padLeft$2(B.JSInt_methods.toRadixString$1(value, 16), count, "0");
+ },
+ $signature: 11
+ };
+ A.generateUuidV4_bitsDigits.prototype = {
+ call$2(bitCount, digitCount) {
+ return this.printDigits.call$2(this.generateBits.call$1(bitCount), digitCount);
+ },
+ $signature: 11
+ };
+ A.StreamChannelMixin.prototype = {};
+ A.EventStreamProvider.prototype = {};
+ A._EventStream.prototype = {
+ listen$4$cancelOnError$onDone$onError(onData, cancelOnError, onDone, onError) {
+ var t1 = A._instanceType(this);
+ t1._eval$1("~(1)?")._as(onData);
+ type$.nullable_void_Function._as(onDone);
+ return A._EventStreamSubscription$(this._target, this._eventType, onData, false, t1._precomputed1);
+ }
+ };
+ A._ElementEventStreamImpl.prototype = {};
+ A._EventStreamSubscription.prototype = {
+ cancel$0() {
+ var _this = this,
+ emptyFuture = A.Future_Future$value(null, type$.void);
+ if (_this._target == null)
+ return emptyFuture;
+ _this._unlisten$0();
+ _this._streams$_onData = _this._target = null;
+ return emptyFuture;
+ },
+ onData$1(handleData) {
+ var t1, _this = this;
+ _this.$ti._eval$1("~(1)?")._as(handleData);
+ if (_this._target == null)
+ throw A.wrapException(A.StateError$("Subscription has been canceled."));
+ _this._unlisten$0();
+ t1 = A._wrapZone(new A._EventStreamSubscription_onData_closure(handleData), type$.JSObject);
+ t1 = t1 == null ? null : type$.JavaScriptFunction._as(A.allowInterop(t1, type$.Function));
+ _this._streams$_onData = t1;
+ _this._tryResume$0();
+ },
+ _tryResume$0() {
+ var t1 = this._streams$_onData;
+ if (t1 != null)
+ this._target.addEventListener(this._eventType, t1, false);
+ },
+ _unlisten$0() {
+ var t1 = this._streams$_onData;
+ if (t1 != null)
+ this._target.removeEventListener(this._eventType, t1, false);
+ },
+ $isStreamSubscription: 1
+ };
+ A._EventStreamSubscription_closure.prototype = {
+ call$1(e) {
+ return this.onData.call$1(type$.JSObject._as(e));
+ },
+ $signature: 1
+ };
+ A._EventStreamSubscription_onData_closure.prototype = {
+ call$1(e) {
+ return this.handleData.call$1(type$.JSObject._as(e));
+ },
+ $signature: 1
+ };
+ A.main_closure.prototype = {
+ call$1(_) {
+ this.channel._outgoingController.close$0();
+ },
+ $signature: 1
+ };
+ A.main_closure0.prototype = {
+ call$1(s) {
+ var count, t1, t2, t3, i, t4, t5, lastEvent;
+ A._asString(s);
+ if (B.JSString_methods.startsWith$1(s, "send ")) {
+ count = A.int_parse(B.JSArray_methods.get$last(s.split(" ")), null);
+ for (t1 = this.channel._outgoingController, t2 = A._instanceType(t1), t3 = t2._precomputed1, t2 = t2._eval$1("_DelayedData<1>"), i = 0; i < count; ++i) {
+ t4 = t3._as("" + i);
+ t5 = t1._state;
+ if (t5 >= 4)
+ A.throwExpression(t1._badEventState$0());
+ if ((t5 & 1) !== 0)
+ t1._sendData$1(t4);
+ else if ((t5 & 3) === 0) {
+ t5 = t1._ensurePendingEvents$0();
+ t4 = new A._DelayedData(t4, t2);
+ lastEvent = t5.lastPendingEvent;
+ if (lastEvent == null)
+ t5.firstPendingEvent = t5.lastPendingEvent = t4;
+ else {
+ lastEvent.set$next(t4);
+ t5.lastPendingEvent = t4;
+ }
+ }
+ }
+ } else {
+ t1 = this.channel._outgoingController;
+ t1.add$1(0, A._instanceType(t1)._precomputed1._as(s));
+ }
+ },
+ $signature: 24
+ };
+ (function aliases() {
+ var _ = J.LegacyJavaScriptObject.prototype;
+ _.super$LegacyJavaScriptObject$toString = _.toString$0;
+ })();
+ (function installTearOffs() {
+ var _static_1 = hunkHelpers._static_1,
+ _static_0 = hunkHelpers._static_0,
+ _static_2 = hunkHelpers._static_2,
+ _instance_2_u = hunkHelpers._instance_2u,
+ _instance_1_u = hunkHelpers._instance_1u,
+ _instance_0_u = hunkHelpers._instance_0u;
+ _static_1(A, "async__AsyncRun__scheduleImmediateJsOverride$closure", "_AsyncRun__scheduleImmediateJsOverride", 6);
+ _static_1(A, "async__AsyncRun__scheduleImmediateWithSetImmediate$closure", "_AsyncRun__scheduleImmediateWithSetImmediate", 6);
+ _static_1(A, "async__AsyncRun__scheduleImmediateWithTimer$closure", "_AsyncRun__scheduleImmediateWithTimer", 6);
+ _static_0(A, "async___startMicrotaskLoop$closure", "_startMicrotaskLoop", 0);
+ _static_1(A, "async___nullDataHandler$closure", "_nullDataHandler", 3);
+ _static_2(A, "async___nullErrorHandler$closure", "_nullErrorHandler", 9);
+ _static_0(A, "async___nullDoneHandler$closure", "_nullDoneHandler", 0);
+ _instance_2_u(A._Future.prototype, "get$_completeError", "_completeError$2", 9);
+ _static_1(A, "convert___defaultToEncodable$closure", "_defaultToEncodable", 8);
+ var _;
+ _instance_1_u(_ = A.SseClient.prototype, "get$_onIncomingControlMessage", "_onIncomingControlMessage$1", 1);
+ _instance_1_u(_, "get$_onIncomingMessage", "_onIncomingMessage$1", 1);
+ _instance_0_u(_, "get$_onOutgoingDone", "_onOutgoingDone$0", 0);
+ _instance_1_u(_, "get$_onOutgoingMessage", "_onOutgoingMessage$1", 22);
+ })();
+ (function inheritance() {
+ var _mixin = hunkHelpers.mixin,
+ _inherit = hunkHelpers.inherit,
+ _inheritMany = hunkHelpers.inheritMany;
+ _inherit(A.Object, null);
+ _inheritMany(A.Object, [A.JS_CONST, J.Interceptor, J.ArrayIterator, A.Error, A.Closure, A.Iterable, A.ListIterator, A.FixedLengthListMixin, A.Symbol, A.MapView, A.ConstantMap, A.JSInvocationMirror, A.TypeErrorDecoder, A.NullThrownFromJavaScriptException, A.ExceptionAndStackTrace, A._StackTrace, A._Required, A.MapBase, A.LinkedHashMapCell, A.LinkedHashMapKeyIterator, A.StringMatch, A.Rti, A._FunctionParameters, A._Type, A._TimerImpl, A._AsyncAwaitCompleter, A.AsyncError, A._Completer, A._FutureListener, A._Future, A._AsyncCallbackEntry, A.Stream, A._StreamController, A._AsyncStreamControllerDispatch, A._BufferingStreamSubscription, A._StreamSinkWrapper, A._DelayedEvent, A._DelayedDone, A._PendingEvents, A._StreamIterator, A._Zone, A._HashMapKeyIterator, A.ListBase, A._UnmodifiableMapMixin, A._ListQueueIterator, A.Codec, A.Converter, A._JsonStringifier, A.DateTime, A.Duration, A.OutOfMemoryError, A.StackOverflowError, A._Exception, A.FormatException, A.Null, A._StringStackTrace, A.StringBuffer, A.NullRejectionException, A._JSRandom, A.AsyncMemoizer, A.Level, A.LogRecord, A.Logger, A.Pool, A.PoolResource, A.StreamChannelMixin, A.EventStreamProvider, A._EventStreamSubscription]);
+ _inheritMany(J.Interceptor, [J.JSBool, J.JSNull, J.JavaScriptObject, J.JavaScriptBigInt, J.JavaScriptSymbol, J.JSNumber, J.JSString]);
+ _inheritMany(J.JavaScriptObject, [J.LegacyJavaScriptObject, J.JSArray, A.NativeByteBuffer, A.NativeTypedData]);
+ _inheritMany(J.LegacyJavaScriptObject, [J.PlainJavaScriptObject, J.UnknownJavaScriptObject, J.JavaScriptFunction]);
+ _inherit(J.JSUnmodifiableArray, J.JSArray);
+ _inheritMany(J.JSNumber, [J.JSInt, J.JSNumNotInt]);
+ _inheritMany(A.Error, [A.LateError, A.TypeError, A.JsNoSuchMethodError, A.UnknownJsTypeError, A._CyclicInitializationError, A.RuntimeError, A._Error, A.JsonUnsupportedObjectError, A.AssertionError, A.ArgumentError, A.NoSuchMethodError, A.UnsupportedError, A.UnimplementedError, A.StateError, A.ConcurrentModificationError]);
+ _inheritMany(A.Closure, [A.Closure0Args, A.Closure2Args, A.TearOffClosure, A.initHooks_closure, A.initHooks_closure1, A._AsyncRun__initializeScheduleImmediate_internalCallback, A._AsyncRun__initializeScheduleImmediate_closure, A._awaitOnObject_closure, A._Future__chainForeignFuture_closure, A._Future__propagateToListeners_handleWhenCompleteCallback_closure, A.Stream_length_closure, A.Stream_first_closure0, A._RootZone_bindUnaryCallbackGuarded_closure, A.promiseToFuture_closure, A.promiseToFuture_closure0, A.dartify_convert, A.Pool__runOnRelease_closure, A.SseClient_closure0, A.SseClient_closure1, A.generateUuidV4_generateBits, A._EventStreamSubscription_closure, A._EventStreamSubscription_onData_closure, A.main_closure, A.main_closure0]);
+ _inheritMany(A.Closure0Args, [A.nullFuture_closure, A._AsyncRun__scheduleImmediateJsOverride_internalCallback, A._AsyncRun__scheduleImmediateWithSetImmediate_internalCallback, A._TimerImpl_internalCallback, A._Future__addListener_closure, A._Future__prependListeners_closure, A._Future__chainForeignFuture_closure1, A._Future__chainCoreFutureAsync_closure, A._Future__asyncCompleteWithValue_closure, A._Future__asyncCompleteError_closure, A._Future__propagateToListeners_handleWhenCompleteCallback, A._Future__propagateToListeners_handleValueCallback, A._Future__propagateToListeners_handleError, A.Stream_length_closure0, A.Stream_first_closure, A._StreamController__subscribe_closure, A._StreamController__recordCancel_complete, A._BufferingStreamSubscription_asFuture_closure, A._BufferingStreamSubscription_asFuture__closure, A._BufferingStreamSubscription__sendError_sendError, A._BufferingStreamSubscription__sendDone_sendDone, A._PendingEvents_schedule_closure, A._cancelAndValue_closure, A._rootHandleError_closure, A._RootZone_bindCallbackGuarded_closure, A.Logger_Logger_closure, A.SseClient_closure, A.SseClient__closure, A.SseClient__onOutgoingMessage_closure]);
+ _inherit(A.EfficientLengthIterable, A.Iterable);
+ _inheritMany(A.EfficientLengthIterable, [A.ListIterable, A.LinkedHashMapKeyIterable, A._HashMapKeyIterable]);
+ _inherit(A._UnmodifiableMapView_MapView__UnmodifiableMapMixin, A.MapView);
+ _inherit(A.UnmodifiableMapView, A._UnmodifiableMapView_MapView__UnmodifiableMapMixin);
+ _inherit(A.ConstantMapView, A.UnmodifiableMapView);
+ _inherit(A.ConstantStringMap, A.ConstantMap);
+ _inheritMany(A.Closure2Args, [A.Primitives_functionNoSuchMethod_closure, A.initHooks_closure0, A._awaitOnObject_closure0, A._wrapJsFunctionForAsync_closure, A._Future__chainForeignFuture_closure0, A._BufferingStreamSubscription_asFuture_closure0, A.MapBase_mapToString_closure, A._JsonStringifier_writeMap_closure, A.NoSuchMethodError_toString_closure, A.Pool__runOnRelease_closure0, A.generateUuidV4_printDigits, A.generateUuidV4_bitsDigits]);
+ _inherit(A.NullError, A.TypeError);
+ _inheritMany(A.TearOffClosure, [A.StaticClosure, A.BoundClosure]);
+ _inheritMany(A.MapBase, [A.JsLinkedHashMap, A._HashMap, A._JsonMap]);
+ _inheritMany(A.NativeTypedData, [A.NativeByteData, A.NativeTypedArray]);
+ _inheritMany(A.NativeTypedArray, [A._NativeTypedArrayOfDouble_NativeTypedArray_ListMixin, A._NativeTypedArrayOfInt_NativeTypedArray_ListMixin]);
+ _inherit(A._NativeTypedArrayOfDouble_NativeTypedArray_ListMixin_FixedLengthListMixin, A._NativeTypedArrayOfDouble_NativeTypedArray_ListMixin);
+ _inherit(A.NativeTypedArrayOfDouble, A._NativeTypedArrayOfDouble_NativeTypedArray_ListMixin_FixedLengthListMixin);
+ _inherit(A._NativeTypedArrayOfInt_NativeTypedArray_ListMixin_FixedLengthListMixin, A._NativeTypedArrayOfInt_NativeTypedArray_ListMixin);
+ _inherit(A.NativeTypedArrayOfInt, A._NativeTypedArrayOfInt_NativeTypedArray_ListMixin_FixedLengthListMixin);
+ _inheritMany(A.NativeTypedArrayOfDouble, [A.NativeFloat32List, A.NativeFloat64List]);
+ _inheritMany(A.NativeTypedArrayOfInt, [A.NativeInt16List, A.NativeInt32List, A.NativeInt8List, A.NativeUint16List, A.NativeUint32List, A.NativeUint8ClampedList, A.NativeUint8List]);
+ _inherit(A._TypeError, A._Error);
+ _inheritMany(A._Completer, [A._AsyncCompleter, A._SyncCompleter]);
+ _inherit(A._AsyncStreamController, A._StreamController);
+ _inheritMany(A.Stream, [A._StreamImpl, A._EventStream]);
+ _inherit(A._ControllerStream, A._StreamImpl);
+ _inherit(A._ControllerSubscription, A._BufferingStreamSubscription);
+ _inheritMany(A._DelayedEvent, [A._DelayedData, A._DelayedError]);
+ _inherit(A._RootZone, A._Zone);
+ _inherit(A._IdentityHashMap, A._HashMap);
+ _inheritMany(A.ListIterable, [A.ListQueue, A._JsonMapKeyIterable]);
+ _inherit(A.JsonCyclicError, A.JsonUnsupportedObjectError);
+ _inherit(A.JsonCodec, A.Codec);
+ _inheritMany(A.Converter, [A.JsonEncoder, A.JsonDecoder]);
+ _inherit(A._JsonStringStringifier, A._JsonStringifier);
+ _inheritMany(A.ArgumentError, [A.RangeError, A.IndexError]);
+ _inherit(A.SseClient, A.StreamChannelMixin);
+ _inherit(A._ElementEventStreamImpl, A._EventStream);
+ _mixin(A._NativeTypedArrayOfDouble_NativeTypedArray_ListMixin, A.ListBase);
+ _mixin(A._NativeTypedArrayOfDouble_NativeTypedArray_ListMixin_FixedLengthListMixin, A.FixedLengthListMixin);
+ _mixin(A._NativeTypedArrayOfInt_NativeTypedArray_ListMixin, A.ListBase);
+ _mixin(A._NativeTypedArrayOfInt_NativeTypedArray_ListMixin_FixedLengthListMixin, A.FixedLengthListMixin);
+ _mixin(A._AsyncStreamController, A._AsyncStreamControllerDispatch);
+ _mixin(A._UnmodifiableMapView_MapView__UnmodifiableMapMixin, A._UnmodifiableMapMixin);
+ })();
+ var init = {
+ typeUniverse: {eC: new Map(), tR: {}, eT: {}, tPV: {}, sEA: []},
+ mangledGlobalNames: {int: "int", double: "double", num: "num", String: "String", bool: "bool", Null: "Null", List: "List", Object: "Object", Map: "Map"},
+ mangledNames: {},
+ types: ["~()", "~(JSObject)", "Null()", "~(@)", "Null(@)", "Null(Object,StackTrace)", "~(~())", "Future<Null>()", "@(@)", "~(Object,StackTrace)", "~(Object?,Object?)", "String(int,int)", "~(String,@)", "@(@,String)", "@(String)", "Null(~())", "Null(@,StackTrace)", "~(int,@)", "_Future<@>(@)", "~(Symbol0,@)", "Object?(Object?)", "Logger()", "~(String?)", "int(int)", "~(String)"],
+ interceptorsByTag: null,
+ leafTags: null,
+ arrayRti: Symbol("$ti")
+ };
+ A._Universe_addRules(init.typeUniverse, JSON.parse('{"PlainJavaScriptObject":"LegacyJavaScriptObject","UnknownJavaScriptObject":"LegacyJavaScriptObject","JavaScriptFunction":"LegacyJavaScriptObject","JSBool":{"bool":[],"TrustedGetRuntimeType":[]},"JSNull":{"Null":[],"TrustedGetRuntimeType":[]},"JavaScriptObject":{"JSObject":[]},"LegacyJavaScriptObject":{"JSObject":[]},"JSArray":{"List":["1"],"JSObject":[],"Iterable":["1"]},"JSUnmodifiableArray":{"JSArray":["1"],"List":["1"],"JSObject":[],"Iterable":["1"]},"JSNumber":{"double":[],"num":[]},"JSInt":{"double":[],"int":[],"num":[],"TrustedGetRuntimeType":[]},"JSNumNotInt":{"double":[],"num":[],"TrustedGetRuntimeType":[]},"JSString":{"String":[],"Pattern":[],"TrustedGetRuntimeType":[]},"LateError":{"Error":[]},"EfficientLengthIterable":{"Iterable":["1"]},"ListIterable":{"Iterable":["1"]},"Symbol":{"Symbol0":[]},"ConstantMapView":{"UnmodifiableMapView":["1","2"],"_UnmodifiableMapView_MapView__UnmodifiableMapMixin":["1","2"],"MapView":["1","2"],"_UnmodifiableMapMixin":["1","2"],"Map":["1","2"]},"ConstantMap":{"Map":["1","2"]},"ConstantStringMap":{"ConstantMap":["1","2"],"Map":["1","2"]},"JSInvocationMirror":{"Invocation":[]},"NullError":{"TypeError":[],"Error":[]},"JsNoSuchMethodError":{"Error":[]},"UnknownJsTypeError":{"Error":[]},"_StackTrace":{"StackTrace":[]},"Closure":{"Function":[]},"Closure0Args":{"Function":[]},"Closure2Args":{"Function":[]},"TearOffClosure":{"Function":[]},"StaticClosure":{"Function":[]},"BoundClosure":{"Function":[]},"_CyclicInitializationError":{"Error":[]},"RuntimeError":{"Error":[]},"JsLinkedHashMap":{"MapBase":["1","2"],"Map":["1","2"],"MapBase.K":"1","MapBase.V":"2"},"LinkedHashMapKeyIterable":{"Iterable":["1"]},"NativeByteBuffer":{"JSObject":[],"TrustedGetRuntimeType":[]},"NativeTypedData":{"JSObject":[]},"NativeByteData":{"JSObject":[],"TrustedGetRuntimeType":[]},"NativeTypedArray":{"JavaScriptIndexingBehavior":["1"],"JSObject":[]},"NativeTypedArrayOfDouble":{"ListBase":["double"],"List":["double"],"JavaScriptIndexingBehavior":["double"],"JSObject":[],"Iterable":["double"],"FixedLengthListMixin":["double"]},"NativeTypedArrayOfInt":{"ListBase":["int"],"List":["int"],"JavaScriptIndexingBehavior":["int"],"JSObject":[],"Iterable":["int"],"FixedLengthListMixin":["int"]},"NativeFloat32List":{"ListBase":["double"],"List":["double"],"JavaScriptIndexingBehavior":["double"],"JSObject":[],"Iterable":["double"],"FixedLengthListMixin":["double"],"TrustedGetRuntimeType":[],"ListBase.E":"double"},"NativeFloat64List":{"ListBase":["double"],"List":["double"],"JavaScriptIndexingBehavior":["double"],"JSObject":[],"Iterable":["double"],"FixedLengthListMixin":["double"],"TrustedGetRuntimeType":[],"ListBase.E":"double"},"NativeInt16List":{"ListBase":["int"],"List":["int"],"JavaScriptIndexingBehavior":["int"],"JSObject":[],"Iterable":["int"],"FixedLengthListMixin":["int"],"TrustedGetRuntimeType":[],"ListBase.E":"int"},"NativeInt32List":{"ListBase":["int"],"List":["int"],"JavaScriptIndexingBehavior":["int"],"JSObject":[],"Iterable":["int"],"FixedLengthListMixin":["int"],"TrustedGetRuntimeType":[],"ListBase.E":"int"},"NativeInt8List":{"ListBase":["int"],"List":["int"],"JavaScriptIndexingBehavior":["int"],"JSObject":[],"Iterable":["int"],"FixedLengthListMixin":["int"],"TrustedGetRuntimeType":[],"ListBase.E":"int"},"NativeUint16List":{"ListBase":["int"],"List":["int"],"JavaScriptIndexingBehavior":["int"],"JSObject":[],"Iterable":["int"],"FixedLengthListMixin":["int"],"TrustedGetRuntimeType":[],"ListBase.E":"int"},"NativeUint32List":{"List
Base":["int"],"List":["int"],"JavaScriptIndexingBehavior":["int"],"JSObject":[],"Iterable":["int"],"FixedLengthListMixin":["int"],"TrustedGetRuntimeType":[],"ListBase.E":"int"},"NativeUint8ClampedList":{"ListBase":["int"],"List":["int"],"JavaScriptIndexingBehavior":["int"],"JSObject":[],"Iterable":["int"],"FixedLengthListMixin":["int"],"TrustedGetRuntimeType":[],"ListBase.E":"int"},"NativeUint8List":{"ListBase":["int"],"List":["int"],"JavaScriptIndexingBehavior":["int"],"JSObject":[],"Iterable":["int"],"FixedLengthListMixin":["int"],"TrustedGetRuntimeType":[],"ListBase.E":"int"},"_Error":{"Error":[]},"_TypeError":{"TypeError":[],"Error":[]},"_Future":{"Future":["1"]},"_TimerImpl":{"Timer":[]},"_AsyncAwaitCompleter":{"Completer":["1"]},"AsyncError":{"Error":[]},"_Completer":{"Completer":["1"]},"_AsyncCompleter":{"_Completer":["1"],"Completer":["1"]},"_SyncCompleter":{"_Completer":["1"],"Completer":["1"]},"_StreamController":{"StreamController":["1"],"_StreamControllerLifecycle":["1"],"_EventDispatch":["1"]},"_AsyncStreamController":{"_AsyncStreamControllerDispatch":["1"],"_StreamController":["1"],"StreamController":["1"],"_StreamControllerLifecycle":["1"],"_EventDispatch":["1"]},"_ControllerStream":{"_StreamImpl":["1"],"Stream":["1"]},"_ControllerSubscription":{"_BufferingStreamSubscription":["1"],"StreamSubscription":["1"],"_EventDispatch":["1"]},"_BufferingStreamSubscription":{"StreamSubscription":["1"],"_EventDispatch":["1"]},"_StreamImpl":{"Stream":["1"]},"_DelayedData":{"_DelayedEvent":["1"]},"_DelayedError":{"_DelayedEvent":["@"]},"_DelayedDone":{"_DelayedEvent":["@"]},"_Zone":{"Zone":[]},"_RootZone":{"_Zone":[],"Zone":[]},"_HashMap":{"MapBase":["1","2"],"Map":["1","2"]},"_IdentityHashMap":{"_HashMap":["1","2"],"MapBase":["1","2"],"Map":["1","2"],"MapBase.K":"1","MapBase.V":"2"},"_HashMapKeyIterable":{"Iterable":["1"]},"MapBase":{"Map":["1","2"]},"MapView":{"Map":["1","2"]},"UnmodifiableMapView":{"_UnmodifiableMapView_MapView__UnmodifiableMapMixin":["1","2"],"MapView":["1","2"],"_UnmodifiableMapMixin":["1","2"],"Map":["1","2"]},"ListQueue":{"Queue":["1"],"ListIterable":["1"],"Iterable":["1"],"ListIterable.E":"1"},"_JsonMap":{"MapBase":["String","@"],"Map":["String","@"],"MapBase.K":"String","MapBase.V":"@"},"_JsonMapKeyIterable":{"ListIterable":["String"],"Iterable":["String"],"ListIterable.E":"String"},"JsonUnsupportedObjectError":{"Error":[]},"JsonCyclicError":{"Error":[]},"JsonCodec":{"Codec":["Object?","String"]},"JsonEncoder":{"Converter":["Object?","String"]},"JsonDecoder":{"Converter":["String","Object?"]},"double":{"num":[]},"int":{"num":[]},"String":{"Pattern":[]},"AssertionError":{"Error":[]},"TypeError":{"Error":[]},"ArgumentError":{"Error":[]},"RangeError":{"Error":[]},"IndexError":{"Error":[]},"NoSuchMethodError":{"Error":[]},"UnsupportedError":{"Error":[]},"UnimplementedError":{"Error":[]},"StateError":{"Error":[]},"ConcurrentModificationError":{"Error":[]},"OutOfMemoryError":{"Error":[]},"StackOverflowError":{"Error":[]},"_StringStackTrace":{"StackTrace":[]},"StringBuffer":{"StringSink":[]},"_EventStream":{"Stream":["1"]},"_ElementEventStreamImpl":{"_EventStream":["1"],"Stream":["1"]},"_EventStreamSubscription":{"StreamSubscription":["1"]},"Int8List":{"List":["int"],"Iterable":["int"]},"Uint8List":{"List":["int"],"Iterable":["int"]},"Uint8ClampedList":{"List":["int"],"Iterable":["int"]},"Int16List":{"List":["int"],"Iterable":["int"]},"Uint16List":{"List":["int"],"Iterable":["int"]},"Int32List":{"List":["int"],"Iterable":["int"]},"Uint32List":{"List":["int"],"Iterable":["
int"]},"Float32List":{"List":["double"],"Iterable":["double"]},"Float64List":{"List":["double"],"Iterable":["double"]}}'));
+ A._Universe_addErasedTypes(init.typeUniverse, JSON.parse('{"EfficientLengthIterable":1,"NativeTypedArray":1,"_DelayedEvent":1,"StreamChannelMixin":1}'));
+ var string$ = {
+ Error_: "Error handler must accept one Object or one Object and a StackTrace as arguments, and return a value of the returned future's type"
+ };
+ var type$ = (function rtii() {
+ var findType = A.findType;
+ return {
+ $env_1_1_void: findType("@<~>"),
+ AsyncError: findType("AsyncError"),
+ ConstantMapView_Symbol_dynamic: findType("ConstantMapView<Symbol0,@>"),
+ Error: findType("Error"),
+ Function: findType("Function"),
+ Future_dynamic: findType("Future<@>"),
+ Invocation: findType("Invocation"),
+ Iterable_dynamic: findType("Iterable<@>"),
+ JSArray_String: findType("JSArray<String>"),
+ JSArray_dynamic: findType("JSArray<@>"),
+ JSNull: findType("JSNull"),
+ JSObject: findType("JSObject"),
+ JavaScriptFunction: findType("JavaScriptFunction"),
+ JavaScriptIndexingBehavior_dynamic: findType("JavaScriptIndexingBehavior<@>"),
+ JsLinkedHashMap_Symbol_dynamic: findType("JsLinkedHashMap<Symbol0,@>"),
+ List_dynamic: findType("List<@>"),
+ Logger: findType("Logger"),
+ Map_dynamic_dynamic: findType("Map<@,@>"),
+ Null: findType("Null"),
+ Object: findType("Object"),
+ PoolResource: findType("PoolResource"),
+ Record: findType("Record"),
+ StackTrace: findType("StackTrace"),
+ String: findType("String"),
+ Symbol: findType("Symbol0"),
+ TrustedGetRuntimeType: findType("TrustedGetRuntimeType"),
+ TypeError: findType("TypeError"),
+ UnknownJavaScriptObject: findType("UnknownJavaScriptObject"),
+ _AsyncCompleter_PoolResource: findType("_AsyncCompleter<PoolResource>"),
+ _AsyncCompleter_void: findType("_AsyncCompleter<~>"),
+ _ElementEventStreamImpl_JSObject: findType("_ElementEventStreamImpl<JSObject>"),
+ _EventStream_JSObject: findType("_EventStream<JSObject>"),
+ _Future_PoolResource: findType("_Future<PoolResource>"),
+ _Future_dynamic: findType("_Future<@>"),
+ _Future_int: findType("_Future<int>"),
+ _Future_void: findType("_Future<~>"),
+ _IdentityHashMap_of_nullable_Object_and_nullable_Object: findType("_IdentityHashMap<Object?,Object?>"),
+ _StreamControllerAddStreamState_nullable_Object: findType("_StreamControllerAddStreamState<Object?>"),
+ _SyncCompleter_PoolResource: findType("_SyncCompleter<PoolResource>"),
+ bool: findType("bool"),
+ bool_Function_Object: findType("bool(Object)"),
+ double: findType("double"),
+ dynamic: findType("@"),
+ dynamic_Function: findType("@()"),
+ dynamic_Function_Object: findType("@(Object)"),
+ dynamic_Function_Object_StackTrace: findType("@(Object,StackTrace)"),
+ int: findType("int"),
+ legacy_Never: findType("0&*"),
+ legacy_Object: findType("Object*"),
+ nullable_Future_Null: findType("Future<Null>?"),
+ nullable_JSObject: findType("JSObject?"),
+ nullable_List_dynamic: findType("List<@>?"),
+ nullable_Object: findType("Object?"),
+ nullable_StackTrace: findType("StackTrace?"),
+ nullable__DelayedEvent_dynamic: findType("_DelayedEvent<@>?"),
+ nullable__FutureListener_dynamic_dynamic: findType("_FutureListener<@,@>?"),
+ nullable_void_Function: findType("~()?"),
+ nullable_void_Function_JSObject: findType("~(JSObject)?"),
+ num: findType("num"),
+ void: findType("~"),
+ void_Function: findType("~()"),
+ void_Function_Object: findType("~(Object)"),
+ void_Function_Object_StackTrace: findType("~(Object,StackTrace)"),
+ void_Function_String_dynamic: findType("~(String,@)")
+ };
+ })();
+ (function constants() {
+ var makeConstList = hunkHelpers.makeConstList;
+ B.Interceptor_methods = J.Interceptor.prototype;
+ B.JSArray_methods = J.JSArray.prototype;
+ B.JSInt_methods = J.JSInt.prototype;
+ B.JSNumber_methods = J.JSNumber.prototype;
+ B.JSString_methods = J.JSString.prototype;
+ B.JavaScriptFunction_methods = J.JavaScriptFunction.prototype;
+ B.JavaScriptObject_methods = J.JavaScriptObject.prototype;
+ B.PlainJavaScriptObject_methods = J.PlainJavaScriptObject.prototype;
+ B.UnknownJavaScriptObject_methods = J.UnknownJavaScriptObject.prototype;
+ B.C_JS_CONST = function getTagFallback(o) {
+ var s = Object.prototype.toString.call(o);
+ return s.substring(8, s.length - 1);
+};
+ B.C_JS_CONST0 = function() {
+ var toStringFunction = Object.prototype.toString;
+ function getTag(o) {
+ var s = toStringFunction.call(o);
+ return s.substring(8, s.length - 1);
+ }
+ function getUnknownTag(object, tag) {
+ if (/^HTML[A-Z].*Element$/.test(tag)) {
+ var name = toStringFunction.call(object);
+ if (name == "[object Object]") return null;
+ return "HTMLElement";
+ }
+ }
+ function getUnknownTagGenericBrowser(object, tag) {
+ if (object instanceof HTMLElement) return "HTMLElement";
+ return getUnknownTag(object, tag);
+ }
+ function prototypeForTag(tag) {
+ if (typeof window == "undefined") return null;
+ if (typeof window[tag] == "undefined") return null;
+ var constructor = window[tag];
+ if (typeof constructor != "function") return null;
+ return constructor.prototype;
+ }
+ function discriminator(tag) { return null; }
+ var isBrowser = typeof HTMLElement == "function";
+ return {
+ getTag: getTag,
+ getUnknownTag: isBrowser ? getUnknownTagGenericBrowser : getUnknownTag,
+ prototypeForTag: prototypeForTag,
+ discriminator: discriminator };
+};
+ B.C_JS_CONST6 = function(getTagFallback) {
+ return function(hooks) {
+ if (typeof navigator != "object") return hooks;
+ var userAgent = navigator.userAgent;
+ if (typeof userAgent != "string") return hooks;
+ if (userAgent.indexOf("DumpRenderTree") >= 0) return hooks;
+ if (userAgent.indexOf("Chrome") >= 0) {
+ function confirm(p) {
+ return typeof window == "object" && window[p] && window[p].name == p;
+ }
+ if (confirm("Window") && confirm("HTMLElement")) return hooks;
+ }
+ hooks.getTag = getTagFallback;
+ };
+};
+ B.C_JS_CONST1 = function(hooks) {
+ if (typeof dartExperimentalFixupGetTag != "function") return hooks;
+ hooks.getTag = dartExperimentalFixupGetTag(hooks.getTag);
+};
+ B.C_JS_CONST5 = function(hooks) {
+ if (typeof navigator != "object") return hooks;
+ var userAgent = navigator.userAgent;
+ if (typeof userAgent != "string") return hooks;
+ if (userAgent.indexOf("Firefox") == -1) return hooks;
+ var getTag = hooks.getTag;
+ var quickMap = {
+ "BeforeUnloadEvent": "Event",
+ "DataTransfer": "Clipboard",
+ "GeoGeolocation": "Geolocation",
+ "Location": "!Location",
+ "WorkerMessageEvent": "MessageEvent",
+ "XMLDocument": "!Document"};
+ function getTagFirefox(o) {
+ var tag = getTag(o);
+ return quickMap[tag] || tag;
+ }
+ hooks.getTag = getTagFirefox;
+};
+ B.C_JS_CONST4 = function(hooks) {
+ if (typeof navigator != "object") return hooks;
+ var userAgent = navigator.userAgent;
+ if (typeof userAgent != "string") return hooks;
+ if (userAgent.indexOf("Trident/") == -1) return hooks;
+ var getTag = hooks.getTag;
+ var quickMap = {
+ "BeforeUnloadEvent": "Event",
+ "DataTransfer": "Clipboard",
+ "HTMLDDElement": "HTMLElement",
+ "HTMLDTElement": "HTMLElement",
+ "HTMLPhraseElement": "HTMLElement",
+ "Position": "Geoposition"
+ };
+ function getTagIE(o) {
+ var tag = getTag(o);
+ var newTag = quickMap[tag];
+ if (newTag) return newTag;
+ if (tag == "Object") {
+ if (window.DataView && (o instanceof window.DataView)) return "DataView";
+ }
+ return tag;
+ }
+ function prototypeForTagIE(tag) {
+ var constructor = window[tag];
+ if (constructor == null) return null;
+ return constructor.prototype;
+ }
+ hooks.getTag = getTagIE;
+ hooks.prototypeForTag = prototypeForTagIE;
+};
+ B.C_JS_CONST2 = function(hooks) {
+ var getTag = hooks.getTag;
+ var prototypeForTag = hooks.prototypeForTag;
+ function getTagFixed(o) {
+ var tag = getTag(o);
+ if (tag == "Document") {
+ if (!!o.xmlVersion) return "!Document";
+ return "!HTMLDocument";
+ }
+ return tag;
+ }
+ function prototypeForTagFixed(tag) {
+ if (tag == "Document") return null;
+ return prototypeForTag(tag);
+ }
+ hooks.getTag = getTagFixed;
+ hooks.prototypeForTag = prototypeForTagFixed;
+};
+ B.C_JS_CONST3 = function(hooks) { return hooks; }
+;
+ B.C_JsonCodec = new A.JsonCodec();
+ B.C_OutOfMemoryError = new A.OutOfMemoryError();
+ B.C__DelayedDone = new A._DelayedDone();
+ B.C__JSRandom = new A._JSRandom();
+ B.C__Required = new A._Required();
+ B.C__RootZone = new A._RootZone();
+ B.Duration_0 = new A.Duration(0);
+ B.Duration_5000000 = new A.Duration(5000000);
+ B.JsonDecoder_null = new A.JsonDecoder(null);
+ B.JsonEncoder_null = new A.JsonEncoder(null);
+ B.Level_INFO_800 = new A.Level("INFO", 800);
+ B.Level_SEVERE_1000 = new A.Level("SEVERE", 1000);
+ B.Level_WARNING_900 = new A.Level("WARNING", 900);
+ B.List_empty = A._setArrayType(makeConstList([]), type$.JSArray_dynamic);
+ B.Object_empty = {};
+ B.Map_empty = new A.ConstantStringMap(B.Object_empty, [], A.findType("ConstantStringMap<Symbol0,@>"));
+ B.Symbol_call = new A.Symbol("call");
+ B.Type_ByteBuffer_RkP = A.typeLiteral("ByteBuffer");
+ B.Type_ByteData_zNC = A.typeLiteral("ByteData");
+ B.Type_Float32List_LB7 = A.typeLiteral("Float32List");
+ B.Type_Float64List_LB7 = A.typeLiteral("Float64List");
+ B.Type_Int16List_uXf = A.typeLiteral("Int16List");
+ B.Type_Int32List_O50 = A.typeLiteral("Int32List");
+ B.Type_Int8List_ekJ = A.typeLiteral("Int8List");
+ B.Type_Uint16List_2bx = A.typeLiteral("Uint16List");
+ B.Type_Uint32List_2bx = A.typeLiteral("Uint32List");
+ B.Type_Uint8ClampedList_Jik = A.typeLiteral("Uint8ClampedList");
+ B.Type_Uint8List_WLA = A.typeLiteral("Uint8List");
+ B._StringStackTrace_3uE = new A._StringStackTrace("");
+ })();
+ (function staticFields() {
+ $._JS_INTEROP_INTERCEPTOR_TAG = null;
+ $.toStringVisiting = A._setArrayType([], A.findType("JSArray<Object>"));
+ $.Primitives__identityHashCodeProperty = null;
+ $.BoundClosure__receiverFieldNameCache = null;
+ $.BoundClosure__interceptorFieldNameCache = null;
+ $.getTagFunction = null;
+ $.alternateTagFunction = null;
+ $.prototypeForTagFunction = null;
+ $.dispatchRecordsForInstanceTags = null;
+ $.interceptorsForUncacheableTags = null;
+ $.initNativeDispatchFlag = null;
+ $._nextCallback = null;
+ $._lastCallback = null;
+ $._lastPriorityCallback = null;
+ $._isInCallbackLoop = false;
+ $.Zone__current = B.C__RootZone;
+ $.LogRecord__nextNumber = 0;
+ $.Logger__loggers = A.LinkedHashMap_LinkedHashMap$_empty(type$.String, type$.Logger);
+ })();
+ (function lazyInitializers() {
+ var _lazyFinal = hunkHelpers.lazyFinal;
+ _lazyFinal($, "DART_CLOSURE_PROPERTY_NAME", "$get$DART_CLOSURE_PROPERTY_NAME", () => A.getIsolateAffinityTag("_$dart_dartClosure"));
+ _lazyFinal($, "nullFuture", "$get$nullFuture", () => B.C__RootZone.run$1$1(new A.nullFuture_closure(), A.findType("Future<Null>")));
+ _lazyFinal($, "TypeErrorDecoder_noSuchMethodPattern", "$get$TypeErrorDecoder_noSuchMethodPattern", () => A.TypeErrorDecoder_extractPattern(A.TypeErrorDecoder_provokeCallErrorOn({
+ toString: function() {
+ return "$receiver$";
+ }
+ })));
+ _lazyFinal($, "TypeErrorDecoder_notClosurePattern", "$get$TypeErrorDecoder_notClosurePattern", () => A.TypeErrorDecoder_extractPattern(A.TypeErrorDecoder_provokeCallErrorOn({$method$: null,
+ toString: function() {
+ return "$receiver$";
+ }
+ })));
+ _lazyFinal($, "TypeErrorDecoder_nullCallPattern", "$get$TypeErrorDecoder_nullCallPattern", () => A.TypeErrorDecoder_extractPattern(A.TypeErrorDecoder_provokeCallErrorOn(null)));
+ _lazyFinal($, "TypeErrorDecoder_nullLiteralCallPattern", "$get$TypeErrorDecoder_nullLiteralCallPattern", () => A.TypeErrorDecoder_extractPattern(function() {
+ var $argumentsExpr$ = "$arguments$";
+ try {
+ null.$method$($argumentsExpr$);
+ } catch (e) {
+ return e.message;
+ }
+ }()));
+ _lazyFinal($, "TypeErrorDecoder_undefinedCallPattern", "$get$TypeErrorDecoder_undefinedCallPattern", () => A.TypeErrorDecoder_extractPattern(A.TypeErrorDecoder_provokeCallErrorOn(void 0)));
+ _lazyFinal($, "TypeErrorDecoder_undefinedLiteralCallPattern", "$get$TypeErrorDecoder_undefinedLiteralCallPattern", () => A.TypeErrorDecoder_extractPattern(function() {
+ var $argumentsExpr$ = "$arguments$";
+ try {
+ (void 0).$method$($argumentsExpr$);
+ } catch (e) {
+ return e.message;
+ }
+ }()));
+ _lazyFinal($, "TypeErrorDecoder_nullPropertyPattern", "$get$TypeErrorDecoder_nullPropertyPattern", () => A.TypeErrorDecoder_extractPattern(A.TypeErrorDecoder_provokePropertyErrorOn(null)));
+ _lazyFinal($, "TypeErrorDecoder_nullLiteralPropertyPattern", "$get$TypeErrorDecoder_nullLiteralPropertyPattern", () => A.TypeErrorDecoder_extractPattern(function() {
+ try {
+ null.$method$;
+ } catch (e) {
+ return e.message;
+ }
+ }()));
+ _lazyFinal($, "TypeErrorDecoder_undefinedPropertyPattern", "$get$TypeErrorDecoder_undefinedPropertyPattern", () => A.TypeErrorDecoder_extractPattern(A.TypeErrorDecoder_provokePropertyErrorOn(void 0)));
+ _lazyFinal($, "TypeErrorDecoder_undefinedLiteralPropertyPattern", "$get$TypeErrorDecoder_undefinedLiteralPropertyPattern", () => A.TypeErrorDecoder_extractPattern(function() {
+ try {
+ (void 0).$method$;
+ } catch (e) {
+ return e.message;
+ }
+ }()));
+ _lazyFinal($, "_AsyncRun__scheduleImmediateClosure", "$get$_AsyncRun__scheduleImmediateClosure", () => A._AsyncRun__initializeScheduleImmediate());
+ _lazyFinal($, "Future__nullFuture", "$get$Future__nullFuture", () => A.findType("_Future<Null>")._as($.$get$nullFuture()));
+ _lazyFinal($, "Logger_root", "$get$Logger_root", () => A.Logger_Logger(""));
+ _lazyFinal($, "_requestPool", "$get$_requestPool", () => {
+ var t4,
+ t1 = A.findType("Completer<PoolResource>"),
+ t2 = A.ListQueue$(t1),
+ t3 = A.ListQueue$(type$.void_Function);
+ t1 = A.ListQueue$(t1);
+ t4 = A.Completer_Completer(type$.dynamic);
+ return new A.Pool(t2, t3, t1, 1000, new A.AsyncMemoizer(t4, A.findType("AsyncMemoizer<@>")));
+ });
+ })();
+ (function nativeSupport() {
+ !function() {
+ var intern = function(s) {
+ var o = {};
+ o[s] = 1;
+ return Object.keys(hunkHelpers.convertToFastObject(o))[0];
+ };
+ init.getIsolateTag = function(name) {
+ return intern("___dart_" + name + init.isolateTag);
+ };
+ var tableProperty = "___dart_isolate_tags_";
+ var usedProperties = Object[tableProperty] || (Object[tableProperty] = Object.create(null));
+ var rootProperty = "_ZxYxX";
+ for (var i = 0;; i++) {
+ var property = intern(rootProperty + "_" + i + "_");
+ if (!(property in usedProperties)) {
+ usedProperties[property] = 1;
+ init.isolateTag = property;
+ break;
+ }
+ }
+ init.dispatchPropertyName = init.getIsolateTag("dispatch_record");
+ }();
+ hunkHelpers.setOrUpdateInterceptorsByTag({ArrayBuffer: A.NativeByteBuffer, ArrayBufferView: A.NativeTypedData, DataView: A.NativeByteData, Float32Array: A.NativeFloat32List, Float64Array: A.NativeFloat64List, Int16Array: A.NativeInt16List, Int32Array: A.NativeInt32List, Int8Array: A.NativeInt8List, Uint16Array: A.NativeUint16List, Uint32Array: A.NativeUint32List, Uint8ClampedArray: A.NativeUint8ClampedList, CanvasPixelArray: A.NativeUint8ClampedList, Uint8Array: A.NativeUint8List});
+ hunkHelpers.setOrUpdateLeafTags({ArrayBuffer: true, ArrayBufferView: false, DataView: true, Float32Array: true, Float64Array: true, Int16Array: true, Int32Array: true, Int8Array: true, Uint16Array: true, Uint32Array: true, Uint8ClampedArray: true, CanvasPixelArray: true, Uint8Array: false});
+ A.NativeTypedArray.$nativeSuperclassTag = "ArrayBufferView";
+ A._NativeTypedArrayOfDouble_NativeTypedArray_ListMixin.$nativeSuperclassTag = "ArrayBufferView";
+ A._NativeTypedArrayOfDouble_NativeTypedArray_ListMixin_FixedLengthListMixin.$nativeSuperclassTag = "ArrayBufferView";
+ A.NativeTypedArrayOfDouble.$nativeSuperclassTag = "ArrayBufferView";
+ A._NativeTypedArrayOfInt_NativeTypedArray_ListMixin.$nativeSuperclassTag = "ArrayBufferView";
+ A._NativeTypedArrayOfInt_NativeTypedArray_ListMixin_FixedLengthListMixin.$nativeSuperclassTag = "ArrayBufferView";
+ A.NativeTypedArrayOfInt.$nativeSuperclassTag = "ArrayBufferView";
+ })();
+ Function.prototype.call$1 = function(a) {
+ return this(a);
+ };
+ Function.prototype.call$0 = function() {
+ return this();
+ };
+ Function.prototype.call$2 = function(a, b) {
+ return this(a, b);
+ };
+ Function.prototype.call$3 = function(a, b, c) {
+ return this(a, b, c);
+ };
+ Function.prototype.call$4 = function(a, b, c, d) {
+ return this(a, b, c, d);
+ };
+ Function.prototype.call$1$1 = function(a) {
+ return this(a);
+ };
+ convertAllToFastObject(holders);
+ convertToFastObject($);
+ (function(callback) {
+ if (typeof document === "undefined") {
+ callback(null);
+ return;
+ }
+ if (typeof document.currentScript != "undefined") {
+ callback(document.currentScript);
+ return;
+ }
+ var scripts = document.scripts;
+ function onLoad(event) {
+ for (var i = 0; i < scripts.length; ++i) {
+ scripts[i].removeEventListener("load", onLoad, false);
+ }
+ callback(event.target);
+ }
+ for (var i = 0; i < scripts.length; ++i) {
+ scripts[i].addEventListener("load", onLoad, false);
+ }
+ })(function(currentScript) {
+ init.currentScript = currentScript;
+ var callMain = A.main;
+ if (typeof dartMainRunner === "function") {
+ dartMainRunner(callMain, []);
+ } else {
+ callMain([]);
+ }
+ });
+})();
diff --git a/pkgs/sse/test/web/index.html b/pkgs/sse/test/web/index.html
new file mode 100644
index 0000000..be26763
--- /dev/null
+++ b/pkgs/sse/test/web/index.html
@@ -0,0 +1,13 @@
+<!DOCTYPE html>
+<html>
+
+<head>
+ <title>SSE Broadcast Channel Test</title>
+</head>
+
+<body>
+ <button type="button">Close Sink</button>
+ <script type="application/javascript" src="index.dart.js"></script>
+</body>
+
+</html>
diff --git a/pkgs/sse/tool/build_js.sh b/pkgs/sse/tool/build_js.sh
new file mode 100755
index 0000000..ef29b70
--- /dev/null
+++ b/pkgs/sse/tool/build_js.sh
@@ -0,0 +1,2 @@
+#!/bin/bash
+dart compile js --no-source-maps test/web/index.dart -o test/web/index.dart.js
diff --git a/pkgs/stack_trace/.gitignore b/pkgs/stack_trace/.gitignore
new file mode 100644
index 0000000..f023015
--- /dev/null
+++ b/pkgs/stack_trace/.gitignore
@@ -0,0 +1,6 @@
+# See https://dart.dev/guides/libraries/private-files
+# Don’t commit the following directories created by pub.
+.dart_tool/
+.packages
+.pub/
+pubspec.lock
diff --git a/pkgs/stack_trace/CHANGELOG.md b/pkgs/stack_trace/CHANGELOG.md
new file mode 100644
index 0000000..e92cf9c
--- /dev/null
+++ b/pkgs/stack_trace/CHANGELOG.md
@@ -0,0 +1,363 @@
+## 1.12.1
+
+* Move to `dart-lang/tools` monorepo.
+
+## 1.12.0
+
+* Added support for parsing Wasm frames of Chrome (V8), Firefox, and Safari.
+* Require Dart 3.4 or greater.
+
+## 1.11.1
+
+* Make use of `@pragma('vm:awaiter-link')` to make the package work better with
+  the Dart VM's built-in awaiter stack unwinding. No other changes.
+
+## 1.11.0
+
+* Added the parameter `zoneValues` to `Chain.capture` to be able to use custom
+ zone values with the `runZoned` internal calls.
+* Populate the pubspec `repository` field.
+* Require Dart 2.18 or greater.
+
+## 1.10.0
+
+* Stable release for null safety.
+* Fix broken test, `test/chain/vm_test.dart`, which incorrectly handles
+ asynchronous suspension gap markers at the end of stack traces.
+
+## 1.10.0-nullsafety.6
+
+* Fix bug parsing asynchronous suspension gap markers at the end of stack
+ traces, when parsing with `Trace.parse` and `Chain.parse`.
+* Update SDK constraints to `>=2.12.0-0 <3.0.0` based on beta release
+ guidelines.
+
+## 1.10.0-nullsafety.5
+
+* Allow prerelease versions of the 2.12 sdk.
+
+## 1.10.0-nullsafety.4
+
+* Allow the `2.10.0` stable and dev SDKs.
+
+## 1.10.0-nullsafety.3
+
+* Fix bug parsing asynchronous suspension gap markers at the end of stack
+ traces.
+
+## 1.10.0-nullsafety.2
+
+* Forward fix for a change in SDK type promotion behavior.
+
+## 1.10.0-nullsafety.1
+
+* Allow 2.10 stable and 2.11.0 dev SDK versions.
+
+## 1.10.0-nullsafety
+
+* Opt in to null safety.
+
+## 1.9.6 (backpublish)
+
+* Fix bug parsing asynchronous suspension gap markers at the end of stack
+ traces. (Also fixed separately in 1.10.0-nullsafety.3)
+* Fix bug parsing asynchronous suspension gap markers at the end of stack
+ traces, when parsing with `Trace.parse` and `Chain.parse`. (Also fixed
+ separately in 1.10.0-nullsafety.6)
+
+## 1.9.5
+
+* Parse the format for `data:` URIs that the Dart VM has used since `2.2.0`.
+
+## 1.9.4
+
+* Add support for Firefox anonymous stack traces.
+* Add support for Chrome eval stack traces without a column.
+* Change the argument type to `Chain.capture` from `Function(dynamic, Chain)` to
+ `Function(Object, Chain)`. Existing functions which take `dynamic` are still
+ fine, but new uses can have a safer type.
+
+## 1.9.3
+
+* Set max SDK version to `<3.0.0`.
+
+## 1.9.2
+
+* Fix Dart 2.0 runtime cast failure in test.
+
+## 1.9.1
+
+* Preserve the original chain for a trace to handle cases where an
+ error is rethrown.
+
+## 1.9.0
+
+* Add an `errorZone` parameter to `Chain.capture()` that makes it avoid creating
+ an error zone.
+
+## 1.8.3
+
+* `Chain.forTrace()` now returns a full stack chain for *all* `StackTrace`s
+ within `Chain.capture()`, even those that haven't been processed by
+ `dart:async` yet.
+
+* `Chain.forTrace()` now uses the Dart VM's stack chain information when called
+ synchronously within `Chain.capture()`. This matches the existing behavior
+ outside `Chain.capture()`.
+
+* `Chain.forTrace()` now trims the VM's stack chains for the innermost stack
+ trace within `Chain.capture()` (unless it's called synchronously, as above).
+ This avoids duplicated frames and makes the format of the innermost traces
+ consistent with the other traces in the chain.
+
+## 1.8.2
+
+* Update to use strong-mode clean Zone API.
+
+## 1.8.1
+
+* Use official generic function syntax.
+
+* Updated minimum SDK to 1.23.0.
+
+## 1.8.0
+
+* Add a `Trace.original` field to provide access to the original `StackTrace`s
+ from which the `Trace` was created, and a matching constructor parameter to
+ `new Trace()`.
+
+## 1.7.4
+
+* Always run `onError` callbacks for `Chain.capture()` in the parent zone.
+
+## 1.7.3
+
+* Fix broken links in the README.
+
+## 1.7.2
+
+* `Trace.foldFrames()` and `Chain.foldFrames()` now remove the outermost folded
+ frame. This matches the behavior of `.terse` with core frames.
+
+* Fix bug parsing a friendly frame with spaces in the member name.
+
+* Fix bug parsing a friendly frame where the location is a data url.
+
+## 1.7.1
+
+* Make `Trace.parse()` and `Chain.parse()` treat the VM's new causal asynchronous
+ stack traces as chains. Outside of a `Chain.capture()` block, `new
+ Chain.current()` will return a stack chain constructed from the asynchronous
+ stack traces.
+
+## 1.7.0
+
+* Add a `Chain.disable()` function that disables stack-chain tracking.
+
+* Fix a bug where `Chain.capture(..., when: false)` would throw if an error was
+ emitted without a stack trace.
+
+## 1.6.8
+
+* Add a note to the documentation of `Chain.terse` and `Trace.terse`.
+
+## 1.6.7
+
+* Fix a bug where `new Frame.caller()` returned the wrong depth of frame on
+ Dartium.
+
+## 1.6.6
+
+* `new Trace.current()` and `new Chain.current()` now skip an extra frame when
+ run in a JS context. This makes their return values match the VM context.
+
+## 1.6.5
+
+* Really fix strong mode warnings.
+
+## 1.6.4
+
+* Fix a syntax error introduced in 1.6.3.
+
+## 1.6.3
+
+* Make `Chain.capture()` generic. Its signature is now `T Chain.capture<T>(T
+ callback(), ...)`.
+
+## 1.6.2
+
+* Fix all strong mode warnings.
+
+## 1.6.1
+
+* Use `StackTrace.current` in Dart SDK 1.14 to get the current stack trace.
+
+## 1.6.0
+
+* Add a `when` parameter to `Chain.capture()`. This allows capturing to be
+ easily enabled and disabled based on whether the application is running in
+ debug/development mode or not.
+
+* Deprecate the `ChainHandler` typedef. This didn't provide any value over
+ directly annotating the function argument, and it made the documentation less
+ clear.
+
+## 1.5.1
+
+* Fix a crash in `Chain.foldFrames()` and `Chain.terse` when one of the chain's
+ traces has no frames.
+
+## 1.5.0
+
+* `new Chain.parse()` now parses all the stack trace formats supported by `new
+ Trace.parse()`. Formats other than that emitted by `Chain.toString()` will
+ produce single-element chains.
+
+* `new Trace.parse()` now parses the output of `Chain.toString()`. It produces
+ the same result as `Chain.parse().toTrace()`.
+
+## 1.4.2
+
+* Improve the display of `data:` URIs in stack traces.
+
+## 1.4.1
+
+* Fix a crashing bug in `UnparsedFrame.toString()`.
+
+## 1.4.0
+
+* `new Trace.parse()` and related constructors will no longer throw an exception
+ if they encounter an unparseable stack frame. Instead, they will generate an
+ `UnparsedFrame`, which exposes no metadata but preserves the frame's original
+ text.
+
+* Properly parse native-code V8 frames.
+
+## 1.3.5
+
+* Properly shorten library names for pathnames of folded frames on Windows.
+
+## 1.3.4
+
+* No longer say that stack chains aren't supported on dart2js now that
+ [sdk#15171][] is fixed. Note that this fix only applies to Dart 1.12.
+
+[sdk#15171]: https://github.com/dart-lang/sdk/issues/15171
+
+## 1.3.3
+
+* When a `null` stack trace is passed to a completer or stream controller in
+ nested `Chain.capture()` blocks, substitute the inner block's chain rather
+ than the outer block's.
+
+* Add support for empty chains and chains of empty traces to `Chain.parse()`.
+
+* Don't crash when parsing stack traces from Dart VM stack overflows.
+
+## 1.3.2
+
+* Don't crash when running `Trace.terse` on empty stack traces.
+
+## 1.3.1
+
+* Support more types of JavaScriptCore stack frames.
+
+## 1.3.0
+
+* Support stack traces generated by JavaScriptCore. They can be explicitly
+ parsed via `new Trace.parseJSCore` and `new Frame.parseJSCore`.
+
+## 1.2.4
+
+* Fix a type annotation in `LazyTrace`.
+
+## 1.2.3
+
+* Fix a crash in `Chain.parse`.
+
+## 1.2.2
+
+* Don't print the first folded frame of terse stack traces. This frame
+ is always just an internal isolate message handler anyway. This
+ improves the readability of stack traces, especially in stack chains.
+
+* Remove the line numbers and specific files in all terse folded frames, not
+ just those from core libraries.
+
+* Make padding consistent across all stack traces for `Chain.toString()`.
+
+## 1.2.1
+
+* Add `terse` to `LazyTrace.foldFrames()`.
+
+* Further improve stack chains when using the VM's async/await implementation.
+
+## 1.2.0
+
+* Add a `terse` argument to `Trace.foldFrames()` and `Chain.foldFrames()`. This
+ allows them to inherit the behavior of `Trace.terse` and `Chain.terse` without
+ having to duplicate the logic.
+
+## 1.1.3
+
+* Produce nicer-looking stack chains when using the VM's async/await
+ implementation.
+
+## 1.1.2
+
+* Support VM frames without line *or* column numbers, which async/await programs
+ occasionally generate.
+
+* Replace `<<anonymous closure>_async_body>` in VM frames' members with the
+ terser `<async>`.
+
+## 1.1.1
+
+* Widen the SDK constraint to include 1.7.0-dev.4.0.
+
+## 1.1.0
+
+* Unify the parsing of Safari and Firefox stack traces. This fixes an error in
+ Firefox trace parsing.
+
+* Deprecate `Trace.parseSafari6_0`, `Trace.parseSafari6_1`,
+ `Frame.parseSafari6_0`, and `Frame.parseSafari6_1`.
+
+* Add `Frame.parseSafari`.
+
+## 1.0.3
+
+* Use `Zone.errorCallback` to attach stack chains to all errors without the need
+ for `Chain.track`, which is now deprecated.
+
+## 1.0.2
+
+* Remove a workaround for [issue 17083][].
+
+[issue 17083]: https://github.com/dart-lang/sdk/issues/17083
+
+## 1.0.1
+
+* Synchronous errors in the [Chain.capture] callback are now handled correctly.
+
+## 1.0.0
+
+* No API changes, just declared stable.
+
+## 0.9.3+2
+
+* Update the dependency on path.
+
+* Improve the formatting of library URIs in stack traces.
+
+## 0.9.3+1
+
+* If an error is thrown in `Chain.capture`'s `onError` handler, that error is
+ handled by the parent zone. This matches the behavior of `runZoned` in
+ `dart:async`.
+
+## 0.9.3
+
+* Add a `Chain.foldFrames` method that parallels `Trace.foldFrames`.
+
+* Record anonymous method frames in IE10 as "<fn>".
diff --git a/pkgs/stack_trace/LICENSE b/pkgs/stack_trace/LICENSE
new file mode 100644
index 0000000..162572a
--- /dev/null
+++ b/pkgs/stack_trace/LICENSE
@@ -0,0 +1,27 @@
+Copyright 2014, the Dart project authors.
+
+Redistribution and use in source and binary forms, with or without
+modification, are permitted provided that the following conditions are
+met:
+
+ * Redistributions of source code must retain the above copyright
+ notice, this list of conditions and the following disclaimer.
+ * Redistributions in binary form must reproduce the above
+ copyright notice, this list of conditions and the following
+ disclaimer in the documentation and/or other materials provided
+ with the distribution.
+ * Neither the name of Google LLC nor the names of its
+ contributors may be used to endorse or promote products derived
+ from this software without specific prior written permission.
+
+THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
diff --git a/pkgs/stack_trace/README.md b/pkgs/stack_trace/README.md
new file mode 100644
index 0000000..b10a556
--- /dev/null
+++ b/pkgs/stack_trace/README.md
@@ -0,0 +1,169 @@
+[](https://github.com/dart-lang/tools/actions/workflows/stack_trace.yaml)
+[](https://pub.dev/packages/stack_trace)
+[](https://pub.dev/packages/stack_trace/publisher)
+
+This library provides the ability to parse, inspect, and manipulate stack traces
+produced by the underlying Dart implementation. It also provides functions to
+produce string representations of stack traces in a more readable format than
+the native [StackTrace] implementation.
+
+`Trace`s can be parsed from native [StackTrace]s using `Trace.from`, or captured
+using `Trace.current`. Native [StackTrace]s can also be directly converted to
+human-readable strings using `Trace.format`.
+
+[StackTrace]: https://api.dart.dev/stable/dart-core/StackTrace-class.html
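+
+For example, here's a minimal sketch of converting a caught `StackTrace` into a
+`Trace` (the thrown error is just for illustration):
+
+```dart
+import 'package:stack_trace/stack_trace.dart';
+
+void main() {
+  try {
+    throw StateError('oh no!');
+  } catch (error, stackTrace) {
+    // Parse the native trace; printing a Trace uses the readable format.
+    print(Trace.from(stackTrace));
+
+    // Or capture the stack at the current point of execution.
+    print(Trace.current());
+  }
+}
+```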
+
+Here's an example native stack trace from debugging this library:
+
+ #0 Object.noSuchMethod (dart:core-patch:1884:25)
+ #1 Trace.terse.<anonymous closure> (file:///usr/local/google-old/home/goog/dart/dart/pkg/stack_trace/lib/src/trace.dart:47:21)
+ #2 IterableMixinWorkaround.reduce (dart:collection:29:29)
+ #3 List.reduce (dart:core-patch:1247:42)
+ #4 Trace.terse (file:///usr/local/google-old/home/goog/dart/dart/pkg/stack_trace/lib/src/trace.dart:40:35)
+ #5 format (file:///usr/local/google-old/home/goog/dart/dart/pkg/stack_trace/lib/stack_trace.dart:24:28)
+ #6 main.<anonymous closure> (file:///usr/local/google-old/home/goog/dart/dart/test.dart:21:29)
+ #7 _CatchErrorFuture._sendError (dart:async:525:24)
+ #8 _FutureImpl._setErrorWithoutAsyncTrace (dart:async:393:26)
+ #9 _FutureImpl._setError (dart:async:378:31)
+ #10 _ThenFuture._sendValue (dart:async:490:16)
+ #11 _FutureImpl._handleValue.<anonymous closure> (dart:async:349:28)
+ #12 Timer.run.<anonymous closure> (dart:async:2402:21)
+ #13 Timer.Timer.<anonymous closure> (dart:async-patch:15:15)
+
+and its human-readable representation:
+
+ dart:core-patch 1884:25 Object.noSuchMethod
+ pkg/stack_trace/lib/src/trace.dart 47:21 Trace.terse.<fn>
+ dart:collection 29:29 IterableMixinWorkaround.reduce
+ dart:core-patch 1247:42 List.reduce
+ pkg/stack_trace/lib/src/trace.dart 40:35 Trace.terse
+ pkg/stack_trace/lib/stack_trace.dart 24:28 format
+ test.dart 21:29 main.<fn>
+ dart:async 525:24 _CatchErrorFuture._sendError
+ dart:async 393:26 _FutureImpl._setErrorWithoutAsyncTrace
+ dart:async 378:31 _FutureImpl._setError
+ dart:async 490:16 _ThenFuture._sendValue
+ dart:async 349:28 _FutureImpl._handleValue.<fn>
+ dart:async 2402:21 Timer.run.<fn>
+ dart:async-patch 15:15 Timer.Timer.<fn>
+
+You can further clean up the stack trace using `Trace.terse`. This folds
+together multiple stack frames from the Dart core libraries, so that only the
+core library method that was directly called from user code is visible. For
+example:
+
+ dart:core Object.noSuchMethod
+ pkg/stack_trace/lib/src/trace.dart 47:21 Trace.terse.<fn>
+ dart:core List.reduce
+ pkg/stack_trace/lib/src/trace.dart 40:35 Trace.terse
+ pkg/stack_trace/lib/stack_trace.dart 24:28 format
+ test.dart 21:29 main.<fn>
+
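+As a minimal sketch, the terse form can be produced with either `Trace.format`
+(which is terse by default) or the `terse` getter on a parsed `Trace` (the
+try/catch shape here is just for illustration):
+
+```dart
+import 'package:stack_trace/stack_trace.dart';
+
+void main() {
+  try {
+    throw StateError('oh no!');
+  } catch (error, stackTrace) {
+    // Roughly equivalent to Trace.from(stackTrace).terse.toString().
+    print(Trace.format(stackTrace));
+
+    // Keep a Trace around if you want to inspect individual frames.
+    final trace = Trace.from(stackTrace).terse;
+    print(trace.frames.first.member);
+  }
+}
+```
+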
+## Stack Chains
+
+This library also provides the ability to capture "stack chains" with the
+`Chain` class. When writing asynchronous code, a single stack trace isn't very
+useful, since the call stack is unwound every time something async happens. A
+stack chain tracks stack traces through asynchronous calls, so that you can see
+the full path from `main` down to the error.
+
+To use stack chains, just wrap the code that you want to track in
+`Chain.capture`. This will create a new [Zone][] in which stack traces are
+recorded and woven into chains every time an asynchronous call occurs. Zones are
+sticky, too, so any asynchronous operations started in the `Chain.capture`
+callback will have their chains tracked, as will asynchronous operations they
+start and so on.
+
+Here's an example of some code that doesn't capture its stack chains:
+
+```dart
+import 'dart:async';
+
+void main() {
+ _scheduleAsync();
+}
+
+void _scheduleAsync() {
+ Future.delayed(Duration(seconds: 1)).then((_) => _runAsync());
+}
+
+void _runAsync() {
+ throw 'oh no!';
+}
+```
+
+If we run this, it prints the following:
+
+    Unhandled exception:
+    oh no!
+    #0      _runAsync (file:///Users/kevmoo/github/stack_trace/example/example.dart:12:3)
+    #1      _scheduleAsync.<anonymous closure> (file:///Users/kevmoo/github/stack_trace/example/example.dart:8:52)
+    <asynchronous suspension>
+
+Notice how there's no mention of `main` in that stack trace. All we know is that
+the error was in `_runAsync`; we don't know why `_runAsync` was called.
+
+Now let's look at the same code with stack chains captured:
+
+```dart
+import 'dart:async';
+
+import 'package:stack_trace/stack_trace.dart';
+
+void main() {
+ Chain.capture(_scheduleAsync);
+}
+
+void _scheduleAsync() {
+ Future.delayed(Duration(seconds: 1)).then((_) => _runAsync());
+}
+
+void _runAsync() {
+ throw 'oh no!';
+}
+```
+
+Now if we run it, it prints this:
+
+    Unhandled exception:
+    oh no!
+    example/example.dart 14:3 _runAsync
+    example/example.dart 10:52 _scheduleAsync.<fn>
+    package:stack_trace/src/stack_zone_specification.dart 126:26 StackZoneSpecification._registerUnaryCallback.<fn>.<fn>
+    package:stack_trace/src/stack_zone_specification.dart 208:15 StackZoneSpecification._run
+    package:stack_trace/src/stack_zone_specification.dart 126:14 StackZoneSpecification._registerUnaryCallback.<fn>
+    dart:async/zone.dart 1406:47 _rootRunUnary
+    dart:async/zone.dart 1307:19 _CustomZone.runUnary
+    ===== asynchronous gap ===========================
+    dart:async/zone.dart 1328:19 _CustomZone.registerUnaryCallback
+    dart:async/future_impl.dart 315:23 Future.then
+    example/example.dart 10:40 _scheduleAsync
+    package:stack_trace/src/chain.dart 97:24 Chain.capture.<fn>
+    dart:async/zone.dart 1398:13 _rootRun
+    dart:async/zone.dart 1300:19 _CustomZone.run
+    dart:async/zone.dart 1803:10 _runZoned
+    dart:async/zone.dart 1746:10 runZoned
+    package:stack_trace/src/chain.dart 95:12 Chain.capture
+    example/example.dart 6:9 main
+    dart:isolate-patch/isolate_patch.dart 297:19 _delayEntrypointInvocation.<fn>
+    dart:isolate-patch/isolate_patch.dart 192:12 _RawReceivePortImpl._handleMessage
+
+That's a lot of text! If you look closely, though, you can see that `main` is
+listed in the last trace in the chain, after the asynchronous gap.
+
+Thankfully, you can call `Chain.terse` just like `Trace.terse` to get rid of all
+the frames you don't care about. The terse version of the stack chain above is
+this:
+
+    test.dart 17:3 runAsync
+    test.dart 13:28 scheduleAsync.<fn>
+    ===== asynchronous gap ===========================
+    dart:async _Future.then
+    test.dart 13:12 scheduleAsync
+    test.dart 7:18 main.<fn>
+    package:stack_trace Chain.capture
+    test.dart 6:16 main
+
+That's a lot easier to understand!
+
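+In practice, the terse chain is usually printed from the `onError` callback; a
+minimal sketch:
+
+```dart
+import 'package:stack_trace/stack_trace.dart';
+
+void main() {
+  Chain.capture(() {
+    // Any asynchronous work that may throw goes here.
+    Future<void>.delayed(
+      const Duration(seconds: 1),
+      () => throw StateError('oh no!'),
+    );
+  }, onError: (error, chain) {
+    // Chain.terse folds away frames from dart:* and package:stack_trace.
+    print('Caught $error\n${chain.terse}');
+  });
+}
+```
+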
+[Zone]: https://api.dart.dev/stable/dart-async/Zone-class.html
diff --git a/pkgs/stack_trace/analysis_options.yaml b/pkgs/stack_trace/analysis_options.yaml
new file mode 100644
index 0000000..4eb82ce
--- /dev/null
+++ b/pkgs/stack_trace/analysis_options.yaml
@@ -0,0 +1,22 @@
+# https://dart.dev/tools/analysis#the-analysis-options-file
+include: package:dart_flutter_team_lints/analysis_options.yaml
+
+analyzer:
+ language:
+ strict-casts: true
+ strict-raw-types: true
+
+linter:
+ rules:
+ - avoid_private_typedef_functions
+ - avoid_redundant_argument_values
+ - avoid_unused_constructor_parameters
+ - avoid_void_async
+ - cancel_subscriptions
+ - literal_only_boolean_expressions
+ - missing_whitespace_between_adjacent_strings
+ - no_adjacent_strings_in_list
+ - no_runtimeType_toString
+ - prefer_const_declarations
+ - unnecessary_await_in_return
+ - use_string_buffers
diff --git a/pkgs/stack_trace/example/example.dart b/pkgs/stack_trace/example/example.dart
new file mode 100644
index 0000000..d601ca4
--- /dev/null
+++ b/pkgs/stack_trace/example/example.dart
@@ -0,0 +1,15 @@
+import 'dart:async';
+
+import 'package:stack_trace/stack_trace.dart';
+
+void main() {
+ Chain.capture(_scheduleAsync);
+}
+
+void _scheduleAsync() {
+ Future<void>.delayed(const Duration(seconds: 1)).then((_) => _runAsync());
+}
+
+void _runAsync() {
+ throw StateError('oh no!');
+}
diff --git a/pkgs/stack_trace/lib/src/chain.dart b/pkgs/stack_trace/lib/src/chain.dart
new file mode 100644
index 0000000..6a815c6
--- /dev/null
+++ b/pkgs/stack_trace/lib/src/chain.dart
@@ -0,0 +1,264 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+import 'dart:math' as math;
+
+import 'frame.dart';
+import 'lazy_chain.dart';
+import 'stack_zone_specification.dart';
+import 'trace.dart';
+import 'utils.dart';
+
+/// A function that handles errors in the zone wrapped by [Chain.capture].
+@Deprecated('Will be removed in stack_trace 2.0.0.')
+typedef ChainHandler = void Function(dynamic error, Chain chain);
+
+/// An opaque key used to track the current [StackZoneSpecification].
+final _specKey = Object();
+
+/// A chain of stack traces.
+///
+/// A stack chain is a collection of one or more stack traces that collectively
+/// represent the path from `main` through nested function calls to a particular
+/// code location, usually where an error was thrown. Multiple stack traces are
+/// necessary when using asynchronous functions, since the program's stack is
+/// reset before each asynchronous callback is run.
+///
+/// Stack chains can be automatically tracked using [Chain.capture]. This sets
+/// up a new [Zone] in which the current stack chain is tracked and can be
+/// accessed using [Chain.current]. Any errors that would be top-leveled in
+/// the zone can be handled, along with their associated chains, with the
+/// `onError` callback. For example:
+///
+///     Chain.capture(() {
+///       // ...
+///     }, onError: (error, stackChain) {
+///       print("Caught error $error\n"
+///           "$stackChain");
+///     });
+class Chain implements StackTrace {
+ /// The stack traces that make up this chain.
+ ///
+ /// Like the frames in a stack trace, the traces are ordered from most local
+ /// to least local. The first one is the trace where the actual exception was
+ /// raised, the second one is where that callback was scheduled, and so on.
+ final List<Trace> traces;
+
+ /// The [StackZoneSpecification] for the current zone.
+ static StackZoneSpecification? get _currentSpec =>
+ Zone.current[_specKey] as StackZoneSpecification?;
+
+ /// If [when] is `true`, runs [callback] in a [Zone] in which the current
+ /// stack chain is tracked and automatically associated with (most) errors.
+ ///
+ /// If [when] is `false`, this does not track stack chains. Instead, it's
+ /// identical to [runZoned], except that it wraps any errors in
+ /// [Chain.forTrace]—which will only wrap the trace unless there's a different
+ /// [Chain.capture] active. This makes it easy for the caller to only capture
+ /// stack chains in debug mode or during development.
+ ///
+ /// If [onError] is passed, any error in the zone that would otherwise go
+ /// unhandled is passed to it, along with the [Chain] associated with that
+ /// error. Note that if [callback] produces multiple unhandled errors,
+ /// [onError] may be called more than once. If [onError] isn't passed, the
+ /// parent Zone's `unhandledErrorHandler` will be called with the error and
+ /// its chain.
+ ///
+ /// The zone this creates will be an error zone if either [onError] is
+ /// not `null` and [when] is false,
+ /// or if both [when] and [errorZone] are `true`.
+ /// If [errorZone] is `false`, [onError] must be `null`.
+ ///
+ /// If [callback] returns a value, it will be returned by [capture] as well.
+ ///
+ /// [zoneValues] is added to the [runZoned] calls.
+ static T capture<T>(T Function() callback,
+ {void Function(Object error, Chain)? onError,
+ bool when = true,
+ bool errorZone = true,
+ Map<Object?, Object?>? zoneValues}) {
+ if (!errorZone && onError != null) {
+ throw ArgumentError.value(
+ onError, 'onError', 'must be null if errorZone is false');
+ }
+
+ if (!when) {
+ if (onError == null) return runZoned(callback, zoneValues: zoneValues);
+ return runZonedGuarded(callback, (error, stackTrace) {
+ onError(error, Chain.forTrace(stackTrace));
+ }, zoneValues: zoneValues) as T;
+ }
+
+ var spec = StackZoneSpecification(onError, errorZone: errorZone);
+ return runZoned(() {
+ try {
+ return callback();
+ } on Object catch (error, stackTrace) {
+ // Forward synchronous errors through the async error path to match the
+ // behavior of `runZonedGuarded`.
+ Zone.current.handleUncaughtError(error, stackTrace);
+
+ // If the expected return type of capture() is not nullable, this will
+ // throw a cast exception. But the only other alternative is to throw
+ // some other exception. Casting null to T at least lets existing uses
+ // where T is a nullable type continue to work.
+ return null as T;
+ }
+ }, zoneSpecification: spec.toSpec(), zoneValues: {
+ ...?zoneValues,
+ _specKey: spec,
+ StackZoneSpecification.disableKey: false
+ });
+ }
+
+ /// If [when] is `true` and this is called within a [Chain.capture] zone, runs
+ /// [callback] in a [Zone] in which chain capturing is disabled.
+ ///
+ /// If [callback] returns a value, it will be returned by [disable] as well.
+ static T disable<T>(T Function() callback, {bool when = true}) {
+ var zoneValues =
+ when ? {_specKey: null, StackZoneSpecification.disableKey: true} : null;
+
+ return runZoned(callback, zoneValues: zoneValues);
+ }
+
+ /// Returns [futureOrStream] unmodified.
+ ///
+ /// Prior to Dart 1.7, this was necessary to ensure that stack traces for
+ /// exceptions reported with [Completer.completeError] and
+ /// [StreamController.addError] were tracked correctly.
+ @Deprecated('Chain.track is not necessary in Dart 1.7+.')
+ static dynamic track(Object? futureOrStream) => futureOrStream;
+
+ /// Returns the current stack chain.
+ ///
+ /// By default, the first frame of the first trace will be the line where
+ /// [Chain.current] is called. If [level] is passed, the first trace will
+ /// start that many frames up instead.
+ ///
+ /// If this is called outside of a [capture] zone, it just returns a
+ /// single-trace chain.
+ factory Chain.current([int level = 0]) {
+ if (_currentSpec != null) return _currentSpec!.currentChain(level + 1);
+
+ var chain = Chain.forTrace(StackTrace.current);
+ return LazyChain(() {
+ // JS includes a frame for the call to StackTrace.current, but the VM
+ // doesn't, so we skip an extra frame in a JS context.
+ var first = Trace(chain.traces.first.frames.skip(level + (inJS ? 2 : 1)),
+ original: chain.traces.first.original.toString());
+ return Chain([first, ...chain.traces.skip(1)]);
+ });
+ }
+
+ /// Returns the stack chain associated with [trace].
+ ///
+ /// The first stack trace in the returned chain will always be [trace]
+ /// (converted to a [Trace] if necessary). If there is no chain associated
+ /// with [trace] or if this is called outside of a [capture] zone, this just
+ /// returns a single-trace chain containing [trace].
+ ///
+ /// If [trace] is already a [Chain], it will be returned as-is.
+ factory Chain.forTrace(StackTrace trace) {
+ if (trace is Chain) return trace;
+ if (_currentSpec != null) return _currentSpec!.chainFor(trace);
+ if (trace is Trace) return Chain([trace]);
+ return LazyChain(() => Chain.parse(trace.toString()));
+ }
+
+ /// Parses a string representation of a stack chain.
+ ///
+ /// If [chain] is the output of a call to [Chain.toString], it will be parsed
+ /// as a full stack chain. Otherwise, it will be parsed as in [Trace.parse]
+ /// and returned as a single-trace chain.
+ factory Chain.parse(String chain) {
+ if (chain.isEmpty) return Chain([]);
+ if (chain.contains(vmChainGap)) {
+ return Chain(chain
+ .split(vmChainGap)
+ .where((line) => line.isNotEmpty)
+ .map(Trace.parseVM));
+ }
+ if (!chain.contains(chainGap)) return Chain([Trace.parse(chain)]);
+
+ return Chain(chain.split(chainGap).map(Trace.parseFriendly));
+ }
+
+ /// Returns a new [Chain] comprised of [traces].
+ Chain(Iterable<Trace> traces) : traces = List<Trace>.unmodifiable(traces);
+
+ /// Returns a terser version of this chain.
+ ///
+ /// This calls [Trace.terse] on every trace in [traces], and discards any
+  /// trace that contains only internal frames.
+ ///
+ /// This won't do anything with a raw JavaScript trace, since there's no way
+ /// to determine which frames come from which Dart libraries. However, the
+ /// [`source_map_stack_trace`](https://pub.dev/packages/source_map_stack_trace)
+ /// package can be used to convert JavaScript traces into Dart-style traces.
+ Chain get terse => foldFrames((_) => false, terse: true);
+
+ /// Returns a new [Chain] based on this chain where multiple stack frames
+ /// matching [predicate] are folded together.
+ ///
+ /// This means that whenever there are multiple frames in a row that match
+ /// [predicate], only the last one is kept. In addition, traces that are
+ /// composed entirely of frames matching [predicate] are omitted.
+ ///
+ /// This is useful for limiting the amount of library code that appears in a
+ /// stack trace by only showing user code and code that's called by user code.
+ ///
+ /// If [terse] is true, this will also fold together frames from the core
+ /// library or from this package, and simplify core library frames as in
+ /// [Trace.terse].
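+  ///
+  /// For example, to fold together runs of frames from a (hypothetical)
+  /// logging package:
+  ///
+  /// ```dart
+  /// chain.foldFrames((frame) => frame.package == 'my_logger');
+  /// ```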
+ Chain foldFrames(bool Function(Frame) predicate, {bool terse = false}) {
+ var foldedTraces =
+ traces.map((trace) => trace.foldFrames(predicate, terse: terse));
+ var nonEmptyTraces = foldedTraces.where((trace) {
+ // Ignore traces that contain only folded frames.
+ if (trace.frames.length > 1) return true;
+ if (trace.frames.isEmpty) return false;
+
+ // In terse mode, the trace may have removed an outer folded frame,
+ // leaving a single non-folded frame. We can detect a folded frame because
+ // it has no line information.
+ if (!terse) return false;
+ return trace.frames.single.line != null;
+ });
+
+ // If all the traces contain only internal processing, preserve the last
+ // (top-most) one so that the chain isn't empty.
+ if (nonEmptyTraces.isEmpty && foldedTraces.isNotEmpty) {
+ return Chain([foldedTraces.last]);
+ }
+
+ return Chain(nonEmptyTraces);
+ }
+
+ /// Converts this chain to a [Trace].
+ ///
+ /// The trace version of a chain is just the concatenation of all the traces
+ /// in the chain.
+ Trace toTrace() => Trace(traces.expand((trace) => trace.frames));
+
+ @override
+ String toString() {
+ // Figure out the longest path so we know how much to pad.
+ var longest = traces
+ .map((trace) => trace.frames
+ .map((frame) => frame.location.length)
+ .fold(0, math.max))
+ .fold(0, math.max);
+
+ // Don't call out to [Trace.toString] here because that doesn't ensure that
+ // padding is consistent across all traces.
+ return traces
+ .map((trace) => trace.frames
+ .map((frame) =>
+ '${frame.location.padRight(longest)} ${frame.member}\n')
+ .join())
+ .join(chainGap);
+ }
+}
diff --git a/pkgs/stack_trace/lib/src/frame.dart b/pkgs/stack_trace/lib/src/frame.dart
new file mode 100644
index 0000000..d4043b7
--- /dev/null
+++ b/pkgs/stack_trace/lib/src/frame.dart
@@ -0,0 +1,458 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:path/path.dart' as path;
+
+import 'trace.dart';
+import 'unparsed_frame.dart';
+
+// #1 Foo._bar (file:///home/nweiz/code/stuff.dart:42:21)
+// #1 Foo._bar (file:///home/nweiz/code/stuff.dart:42)
+// #1 Foo._bar (file:///home/nweiz/code/stuff.dart)
+final _vmFrame = RegExp(r'^#\d+\s+(\S.*) \((.+?)((?::\d+){0,2})\)$');
+
+// at Object.stringify (native)
+// at VW.call$0 (https://example.com/stuff.dart.js:560:28)
+// at VW.call$0 (eval as fn
+// (https://example.com/stuff.dart.js:560:28), efn:3:28)
+// at https://example.com/stuff.dart.js:560:28
+final _v8JsFrame =
+ RegExp(r'^\s*at (?:(\S.*?)(?: \[as [^\]]+\])? \((.*)\)|(.*))$');
+
+// https://example.com/stuff.dart.js:560:28
+// https://example.com/stuff.dart.js:560
+//
+// Group 1: URI, required
+// Group 2: line number, required
+// Group 3: column number, optional
+final _v8JsUrlLocation = RegExp(r'^(.*?):(\d+)(?::(\d+))?$|native$');
+
+// With names:
+//
+// at Error.f (wasm://wasm/0006d966:wasm-function[119]:0xbb13)
+// at g (wasm://wasm/0006d966:wasm-function[796]:0x143b4)
+//
+// Without names:
+//
+// at wasm://wasm/0005168a:wasm-function[119]:0xbb13
+// at wasm://wasm/0005168a:wasm-function[796]:0x143b4
+//
+// Matches named groups:
+//
+// - "member": optional, `Error.f` in the first example, NA in the second.
+// - "uri": `wasm://wasm/0006d966`.
+// - "index": `119`.
+// - "offset": (hex number) `bb13`.
+//
+// To avoid having multiple groups for the same part of the frame, this regex
+// matches unmatched parentheses after the member name.
+final _v8WasmFrame = RegExp(r'^\s*at (?:(?<member>.+) )?'
+ r'(?:\(?(?:(?<uri>\S+):wasm-function\[(?<index>\d+)\]'
+ r'\:0x(?<offset>[0-9a-fA-F]+))\)?)$');
+
+// eval as function (https://example.com/stuff.dart.js:560:28), efn:3:28
+// eval as function (https://example.com/stuff.dart.js:560:28)
+// eval as function (eval as otherFunction
+// (https://example.com/stuff.dart.js:560:28))
+final _v8EvalLocation =
+ RegExp(r'^eval at (?:\S.*?) \((.*)\)(?:, .*?:\d+:\d+)?$');
+
+// anonymous/<@https://example.com/stuff.js line 693 > Function:3:40
+// anonymous/<@https://example.com/stuff.js line 693 > eval:3:40
+final _firefoxEvalLocation =
+ RegExp(r'(\S+)@(\S+) line (\d+) >.* (Function|eval):\d+:\d+');
+
+// .VW.call$0@https://example.com/stuff.dart.js:560
+// .VW.call$0("arg")@https://example.com/stuff.dart.js:560
+// .VW.call$0/name<@https://example.com/stuff.dart.js:560
+// .VW.call$0@https://example.com/stuff.dart.js:560:36
+// https://example.com/stuff.dart.js:560
+final _firefoxSafariJSFrame = RegExp(r'^'
+ r'(?:' // Member description. Not present in some Safari frames.
+ r'([^@(/]*)' // The actual name of the member.
+ r'(?:\(.*\))?' // Arguments to the member, sometimes captured by Firefox.
+ r'((?:/[^/]*)*)' // Extra characters indicating a nested closure.
+ r'(?:\(.*\))?' // Arguments to the closure.
+ r'@'
+ r')?'
+ r'(.*?)' // The frame's URL.
+ r':'
+ r'(\d*)' // The line number. Empty in Safari if it's unknown.
+ r'(?::(\d*))?' // The column number. Not present in older browsers and
+ // empty in Safari if it's unknown.
+ r'$');
+
+// With names:
+//
+// g@http://localhost:8080/test.wasm:wasm-function[796]:0x143b4
+// f@http://localhost:8080/test.wasm:wasm-function[795]:0x143a8
+// main@http://localhost:8080/test.wasm:wasm-function[792]:0x14390
+//
+// Without names:
+//
+// @http://localhost:8080/test.wasm:wasm-function[796]:0x143b4
+// @http://localhost:8080/test.wasm:wasm-function[795]:0x143a8
+// @http://localhost:8080/test.wasm:wasm-function[792]:0x14390
+//
+// JSShell in the command line uses a different format, which this regex also
+// parses.
+//
+// With names:
+//
+// main@/home/user/test.mjs line 29 > WebAssembly.compile:wasm-function[792]:0x14378
+//
+// Without names:
+//
+// @/home/user/test.mjs line 29 > WebAssembly.compile:wasm-function[792]:0x14378
+//
+// Matches named groups:
+//
+// - "member": Function name, may be empty: `g`.
+// - "uri": `http://localhost:8080/test.wasm`.
+// - "index": `796`.
+// - "offset": (in hex) `143b4`.
+final _firefoxWasmFrame =
+ RegExp(r'^(?<member>.*?)@(?:(?<uri>\S+).*?:wasm-function'
+ r'\[(?<index>\d+)\]:0x(?<offset>[0-9a-fA-F]+))$');
+
+// With names:
+//
+// (Note: Lines below are literal text, e.g. <?> is not a placeholder, it's a
+// part of the stack frame.)
+//
+// <?>.wasm-function[g]@[wasm code]
+// <?>.wasm-function[f]@[wasm code]
+// <?>.wasm-function[main]@[wasm code]
+//
+// Without names:
+//
+// <?>.wasm-function[796]@[wasm code]
+// <?>.wasm-function[795]@[wasm code]
+// <?>.wasm-function[792]@[wasm code]
+//
+// Matches named group "member": `g` or `796`.
+final _safariWasmFrame =
+ RegExp(r'^.*?wasm-function\[(?<member>.*)\]@\[wasm code\]$');
+
+// foo/bar.dart 10:11 Foo._bar
+// foo/bar.dart 10:11 (anonymous function).dart.fn
+// https://dart.dev/foo/bar.dart Foo._bar
+// data:... 10:11 Foo._bar
+final _friendlyFrame = RegExp(r'^(\S+)(?: (\d+)(?::(\d+))?)?\s+([^\d].*)$');
+
+/// A regular expression that matches asynchronous member names generated by the
+/// VM.
+final _asyncBody = RegExp(r'<(<anonymous closure>|[^>]+)_async_body>');
+
+final _initialDot = RegExp(r'^\.');
+
+/// A single stack frame. Each frame points to a precise location in Dart code.
+class Frame {
+ /// The URI of the file in which the code is located.
+ ///
+ /// This URI will usually have the scheme `dart`, `file`, `http`, or `https`.
+ final Uri uri;
+
+ /// The line number on which the code location is located.
+ ///
+ /// This can be null, indicating that the line number is unknown or
+ /// unimportant.
+ final int? line;
+
+ /// The column number of the code location.
+ ///
+ /// This can be null, indicating that the column number is unknown or
+ /// unimportant.
+ final int? column;
+
+ /// The name of the member in which the code location occurs.
+ ///
+ /// Anonymous closures are represented as `<fn>` in this member string.
+ final String? member;
+
+ /// Whether this stack frame comes from the Dart core libraries.
+ bool get isCore => uri.scheme == 'dart';
+
+ /// Returns a human-friendly description of the library that this stack frame
+ /// comes from.
+ ///
+ /// This will usually be the string form of [uri], but a relative URI will be
+ /// used if possible. Data URIs will be truncated.
+ String get library {
+ if (uri.scheme == 'data') return 'data:...';
+ return path.prettyUri(uri);
+ }
+
+ /// Returns the name of the package this stack frame comes from, or `null` if
+ /// this stack frame doesn't come from a `package:` URL.
+ String? get package {
+ if (uri.scheme != 'package') return null;
+ return uri.path.split('/').first;
+ }
+
+ /// A human-friendly description of the code location.
+ String get location {
+ if (line == null) return library;
+ if (column == null) return '$library $line';
+ return '$library $line:$column';
+ }
+
+ /// Returns a single frame of the current stack.
+ ///
+ /// By default, this will return the frame above the current method. If
+ /// [level] is `0`, it will return the current method's frame; if [level] is
+ /// higher than `1`, it will return higher frames.
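+  ///
+  /// For example (illustrative):
+  ///
+  /// ```dart
+  /// // Prints the frame of whatever function called logCaller().
+  /// void logCaller() => print(Frame.caller());
+  /// ```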
+ factory Frame.caller([int level = 1]) {
+ if (level < 0) {
+ throw ArgumentError('Argument [level] must be greater than or equal '
+ 'to 0.');
+ }
+
+ return Trace.current(level + 1).frames.first;
+ }
+
+ /// Parses a string representation of a Dart VM stack frame.
+ factory Frame.parseVM(String frame) => _catchFormatException(frame, () {
+ // The VM sometimes folds multiple stack frames together and replaces
+ // them with "...".
+ if (frame == '...') {
+ return Frame(Uri(), null, null, '...');
+ }
+
+ var match = _vmFrame.firstMatch(frame);
+ if (match == null) return UnparsedFrame(frame);
+
+ // Get the pieces out of the regexp match. Function, URI and line should
+ // always be found. The column is optional.
+ var member = match[1]!
+ .replaceAll(_asyncBody, '<async>')
+ .replaceAll('<anonymous closure>', '<fn>');
+ var uri = match[2]!.startsWith('<data:')
+ ? Uri.dataFromString('')
+ : Uri.parse(match[2]!);
+
+ var lineAndColumn = match[3]!.split(':');
+ var line =
+ lineAndColumn.length > 1 ? int.parse(lineAndColumn[1]) : null;
+ var column =
+ lineAndColumn.length > 2 ? int.parse(lineAndColumn[2]) : null;
+ return Frame(uri, line, column, member);
+ });
+
+ /// Parses a string representation of a Chrome/V8 stack frame.
+ factory Frame.parseV8(String frame) => _catchFormatException(frame, () {
+ // Try to match a Wasm frame first: the Wasm frame regex won't match a
+ // JS frame but the JS frame regex may match a Wasm frame.
+ var match = _v8WasmFrame.firstMatch(frame);
+ if (match != null) {
+ final member = match.namedGroup('member');
+ final uri = _uriOrPathToUri(match.namedGroup('uri')!);
+ final functionIndex = match.namedGroup('index')!;
+ final functionOffset =
+ int.parse(match.namedGroup('offset')!, radix: 16);
+ return Frame(uri, 1, functionOffset + 1, member ?? functionIndex);
+ }
+
+ match = _v8JsFrame.firstMatch(frame);
+ if (match != null) {
+ // v8 location strings can be arbitrarily-nested, since it adds a
+ // layer of nesting for each eval performed on that line.
+ Frame parseJsLocation(String location, String member) {
+ var evalMatch = _v8EvalLocation.firstMatch(location);
+ while (evalMatch != null) {
+ location = evalMatch[1]!;
+ evalMatch = _v8EvalLocation.firstMatch(location);
+ }
+
+ if (location == 'native') {
+ return Frame(Uri.parse('native'), null, null, member);
+ }
+
+ var urlMatch = _v8JsUrlLocation.firstMatch(location);
+ if (urlMatch == null) return UnparsedFrame(frame);
+
+ final uri = _uriOrPathToUri(urlMatch[1]!);
+ final line = int.parse(urlMatch[2]!);
+ final columnMatch = urlMatch[3];
+ final column = columnMatch != null ? int.parse(columnMatch) : null;
+ return Frame(uri, line, column, member);
+ }
+
+ // V8 stack frames can be in two forms.
+ if (match[2] != null) {
+ // The first form looks like " at FUNCTION (LOCATION)". V8 proper
+ // lists anonymous functions within eval as "<anonymous>", while
+ // IE10 lists them as "Anonymous function".
+ return parseJsLocation(
+ match[2]!,
+ match[1]!
+ .replaceAll('<anonymous>', '<fn>')
+ .replaceAll('Anonymous function', '<fn>')
+ .replaceAll('(anonymous function)', '<fn>'));
+ } else {
+ // The second form looks like " at LOCATION", and is used for
+ // anonymous functions.
+ return parseJsLocation(match[3]!, '<fn>');
+ }
+ }
+
+ return UnparsedFrame(frame);
+ });
+
+ /// Parses a string representation of a JavaScriptCore stack trace.
+ factory Frame.parseJSCore(String frame) => Frame.parseV8(frame);
+
+ /// Parses a string representation of an IE stack frame.
+ ///
+ /// IE10+ frames look just like V8 frames. Prior to IE10, stack traces can't
+ /// be retrieved.
+ factory Frame.parseIE(String frame) => Frame.parseV8(frame);
+
+ /// Parses a Firefox 'eval' or 'function' stack frame.
+ ///
+ /// For example:
+ ///
+ /// ```
+ /// anonymous/<@https://example.com/stuff.js line 693 > Function:3:40
+ /// anonymous/<@https://example.com/stuff.js line 693 > eval:3:40
+ /// ```
+ factory Frame._parseFirefoxEval(String frame) =>
+ _catchFormatException(frame, () {
+ final match = _firefoxEvalLocation.firstMatch(frame);
+ if (match == null) return UnparsedFrame(frame);
+ var member = match[1]!.replaceAll('/<', '');
+ final uri = _uriOrPathToUri(match[2]!);
+ final line = int.parse(match[3]!);
+ if (member.isEmpty || member == 'anonymous') {
+ member = '<fn>';
+ }
+ return Frame(uri, line, null, member);
+ });
+
+ /// Parses a string representation of a Firefox or Safari stack frame.
+ factory Frame.parseFirefox(String frame) => _catchFormatException(frame, () {
+ var match = _firefoxSafariJSFrame.firstMatch(frame);
+ if (match != null) {
+ if (match[3]!.contains(' line ')) {
+ return Frame._parseFirefoxEval(frame);
+ }
+
+ // Normally this is a URI, but in a jsshell trace it can be a path.
+ var uri = _uriOrPathToUri(match[3]!);
+
+ var member = match[1];
+ if (member != null) {
+ member +=
+ List.filled('/'.allMatches(match[2]!).length, '.<fn>').join();
+ if (member == '') member = '<fn>';
+
+ // Some Firefox members have initial dots. We remove them for
+ // consistency with other platforms.
+ member = member.replaceFirst(_initialDot, '');
+ } else {
+ member = '<fn>';
+ }
+
+ var line = match[4] == '' ? null : int.parse(match[4]!);
+ var column =
+ match[5] == null || match[5] == '' ? null : int.parse(match[5]!);
+ return Frame(uri, line, column, member);
+ }
+
+ match = _firefoxWasmFrame.firstMatch(frame);
+ if (match != null) {
+ final member = match.namedGroup('member')!;
+ final uri = _uriOrPathToUri(match.namedGroup('uri')!);
+ final functionIndex = match.namedGroup('index')!;
+ final functionOffset =
+ int.parse(match.namedGroup('offset')!, radix: 16);
+ return Frame(uri, 1, functionOffset + 1,
+ member.isNotEmpty ? member : functionIndex);
+ }
+
+ match = _safariWasmFrame.firstMatch(frame);
+ if (match != null) {
+ final member = match.namedGroup('member')!;
+ return Frame(Uri(path: 'wasm code'), null, null, member);
+ }
+
+ return UnparsedFrame(frame);
+ });
+
+ /// Parses a string representation of a Safari 6.0 stack frame.
+ @Deprecated('Use Frame.parseSafari instead.')
+ factory Frame.parseSafari6_0(String frame) => Frame.parseFirefox(frame);
+
+ /// Parses a string representation of a Safari 6.1+ stack frame.
+ @Deprecated('Use Frame.parseSafari instead.')
+ factory Frame.parseSafari6_1(String frame) => Frame.parseFirefox(frame);
+
+ /// Parses a string representation of a Safari stack frame.
+ factory Frame.parseSafari(String frame) => Frame.parseFirefox(frame);
+
+ /// Parses this package's string representation of a stack frame.
+ factory Frame.parseFriendly(String frame) => _catchFormatException(frame, () {
+ var match = _friendlyFrame.firstMatch(frame);
+ if (match == null) {
+ throw FormatException(
+ "Couldn't parse package:stack_trace stack trace line '$frame'.");
+ }
+ // Fake truncated data urls generated by the friendly stack trace format
+ // cause Uri.parse to throw an exception so we have to special case
+ // them.
+ var uri = match[1] == 'data:...'
+ ? Uri.dataFromString('')
+ : Uri.parse(match[1]!);
+ // If there's no scheme, this is a relative URI. We should interpret it
+ // as relative to the current working directory.
+ if (uri.scheme == '') {
+ uri = path.toUri(path.absolute(path.fromUri(uri)));
+ }
+
+ var line = match[2] == null ? null : int.parse(match[2]!);
+ var column = match[3] == null ? null : int.parse(match[3]!);
+ return Frame(uri, line, column, match[4]);
+ });
+
+ /// A regular expression matching an absolute URI.
+ static final _uriRegExp = RegExp(r'^[a-zA-Z][-+.a-zA-Z\d]*://');
+
+ /// A regular expression matching a Windows path.
+ static final _windowsRegExp = RegExp(r'^([a-zA-Z]:[\\/]|\\\\)');
+
+ /// Converts [uriOrPath], which can be a URI, a Windows path, or a Posix path,
+ /// to a URI (absolute if possible).
+ static Uri _uriOrPathToUri(String uriOrPath) {
+ if (uriOrPath.contains(_uriRegExp)) {
+ return Uri.parse(uriOrPath);
+ } else if (uriOrPath.contains(_windowsRegExp)) {
+ return Uri.file(uriOrPath, windows: true);
+ } else if (uriOrPath.startsWith('/')) {
+ return Uri.file(uriOrPath, windows: false);
+ }
+
+ // As far as I've seen, Firefox and V8 both always report absolute paths in
+ // their stack frames. However, if we do get a relative path, we should
+ // handle it gracefully.
+ if (uriOrPath.contains('\\')) return path.windows.toUri(uriOrPath);
+ return Uri.parse(uriOrPath);
+ }
+
+ /// Runs [body] and returns its result.
+ ///
+ /// If [body] throws a [FormatException], returns an [UnparsedFrame] with
+ /// [text] instead.
+ static Frame _catchFormatException(String text, Frame Function() body) {
+ try {
+ return body();
+ } on FormatException catch (_) {
+ return UnparsedFrame(text);
+ }
+ }
+
+ Frame(this.uri, this.line, this.column, this.member);
+
+ @override
+ String toString() => '$location in $member';
+}
diff --git a/pkgs/stack_trace/lib/src/lazy_chain.dart b/pkgs/stack_trace/lib/src/lazy_chain.dart
new file mode 100644
index 0000000..063ed59
--- /dev/null
+++ b/pkgs/stack_trace/lib/src/lazy_chain.dart
@@ -0,0 +1,33 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'chain.dart';
+import 'frame.dart';
+import 'lazy_trace.dart';
+import 'trace.dart';
+
+/// A thunk for lazily constructing a [Chain].
+typedef ChainThunk = Chain Function();
+
+/// A wrapper around a [ChainThunk]. This works around issue 9579 by avoiding
+/// the conversion of native [StackTrace]s to strings until it's absolutely
+/// necessary.
+class LazyChain implements Chain {
+ final ChainThunk _thunk;
+ late final Chain _chain = _thunk();
+
+ LazyChain(this._thunk);
+
+ @override
+ List<Trace> get traces => _chain.traces;
+ @override
+ Chain get terse => _chain.terse;
+ @override
+ Chain foldFrames(bool Function(Frame) predicate, {bool terse = false}) =>
+ LazyChain(() => _chain.foldFrames(predicate, terse: terse));
+ @override
+ Trace toTrace() => LazyTrace(_chain.toTrace);
+ @override
+ String toString() => _chain.toString();
+}
diff --git a/pkgs/stack_trace/lib/src/lazy_trace.dart b/pkgs/stack_trace/lib/src/lazy_trace.dart
new file mode 100644
index 0000000..3ecaa2d
--- /dev/null
+++ b/pkgs/stack_trace/lib/src/lazy_trace.dart
@@ -0,0 +1,33 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'frame.dart';
+import 'trace.dart';
+
+/// A thunk for lazily constructing a [Trace].
+typedef TraceThunk = Trace Function();
+
+/// A wrapper around a [TraceThunk]. This works around issue 9579 by avoiding
+/// the conversion of native [StackTrace]s to strings until it's absolutely
+/// necessary.
+class LazyTrace implements Trace {
+ final TraceThunk _thunk;
+ late final Trace _trace = _thunk();
+
+ LazyTrace(this._thunk);
+
+ @override
+ List<Frame> get frames => _trace.frames;
+ @override
+ StackTrace get original => _trace.original;
+ @override
+ StackTrace get vmTrace => _trace.vmTrace;
+ @override
+ Trace get terse => LazyTrace(() => _trace.terse);
+ @override
+ Trace foldFrames(bool Function(Frame) predicate, {bool terse = false}) =>
+ LazyTrace(() => _trace.foldFrames(predicate, terse: terse));
+ @override
+ String toString() => _trace.toString();
+}
diff --git a/pkgs/stack_trace/lib/src/stack_zone_specification.dart b/pkgs/stack_trace/lib/src/stack_zone_specification.dart
new file mode 100644
index 0000000..901a5ee
--- /dev/null
+++ b/pkgs/stack_trace/lib/src/stack_zone_specification.dart
@@ -0,0 +1,262 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'chain.dart';
+import 'lazy_chain.dart';
+import 'lazy_trace.dart';
+import 'trace.dart';
+import 'utils.dart';
+
+/// A class encapsulating the zone specification for a [Chain.capture] zone.
+///
+/// Until they're materialized and exposed to the user, stack chains are tracked
+/// as linked lists of [Trace]s using the [_Node] class. These nodes are stored
+/// in three distinct ways:
+///
+/// * When a callback is registered, a node is created and stored as a captured
+/// local variable until the callback is run.
+///
+/// * When a callback is run, its captured node is set as the [_currentNode] so
+/// it can be available to [Chain.current] and to be linked into additional
+/// chains when more callbacks are scheduled.
+///
+/// * When a callback throws an error or a Future or Stream emits an error, the
+/// current node is associated with that error's stack trace using the
+/// [_chains] expando.
+///
+/// Since [ZoneSpecification] can't be extended or even implemented, in order to
+/// get a real [ZoneSpecification] instance it's necessary to call [toSpec].
+class StackZoneSpecification {
+ /// An opaque object used as a zone value to disable chain tracking in a given
+ /// zone.
+ ///
+ /// If `Zone.current[disableKey]` is `true`, no stack chains will be tracked.
+ static final disableKey = Object();
+
+ /// Whether chain-tracking is disabled in the current zone.
+ bool get _disabled => Zone.current[disableKey] == true;
+
+ /// The expando that associates stack chains with [StackTrace]s.
+ ///
+ /// The chains are associated with stack traces rather than errors themselves
+ /// because it's a common practice to throw strings as errors, which can't be
+ /// used with expandos.
+ ///
+ /// The chain associated with a given stack trace doesn't contain a node for
+ /// that stack trace.
+ final _chains = Expando<_Node>('stack chains');
+
+ /// The error handler for the zone.
+ ///
+ /// If this is null, that indicates that any unhandled errors should be passed
+ /// to the parent zone.
+ final void Function(Object error, Chain)? _onError;
+
+ /// The most recent node of the current stack chain.
+ _Node? _currentNode;
+
+ /// Whether this is an error zone.
+ final bool _errorZone;
+
+ StackZoneSpecification(this._onError, {bool errorZone = true})
+ : _errorZone = errorZone;
+
+ /// Converts this specification to a real [ZoneSpecification].
+ ZoneSpecification toSpec() => ZoneSpecification(
+ handleUncaughtError: _errorZone ? _handleUncaughtError : null,
+ registerCallback: _registerCallback,
+ registerUnaryCallback: _registerUnaryCallback,
+ registerBinaryCallback: _registerBinaryCallback,
+ errorCallback: _errorCallback);
+
+ /// Returns the current stack chain.
+ ///
+ /// By default, the first frame of the first trace will be the line where
+ /// [currentChain] is called. If [level] is passed, the first trace will start
+ /// that many frames up instead.
+ Chain currentChain([int level = 0]) => _createNode(level + 1).toChain();
+
+ /// Returns the stack chain associated with [trace], if one exists.
+ ///
+ /// The first stack trace in the returned chain will always be [trace]
+ /// (converted to a [Trace] if necessary). If there is no chain associated
+ /// with [trace], this just returns a single-trace chain containing [trace].
+ Chain chainFor(StackTrace? trace) {
+ if (trace is Chain) return trace;
+ trace ??= StackTrace.current;
+
+ var previous = _chains[trace] ?? _currentNode;
+ if (previous == null) {
+ // If there's no [_currentNode], we're running synchronously beneath
+ // [Chain.capture] and we should fall back to the VM's stack chaining. We
+ // can't use [Chain.from] here because it'll just call [chainFor] again.
+ if (trace is Trace) return Chain([trace]);
+ return LazyChain(() => Chain.parse(trace!.toString()));
+ } else {
+ if (trace is! Trace) {
+ var original = trace;
+ trace = LazyTrace(() => Trace.parse(_trimVMChain(original)));
+ }
+
+ return _Node(trace, previous).toChain();
+ }
+ }
+
+ /// Tracks the current stack chain so it can be set to [_currentNode] when
+ /// [f] is run.
+ ZoneCallback<R> _registerCallback<R>(
+ Zone self, ZoneDelegate parent, Zone zone, R Function() f) {
+ if (_disabled) return parent.registerCallback(zone, f);
+ var node = _createNode(1);
+ return parent.registerCallback(zone, () => _run(f, node));
+ }
+
+ /// Tracks the current stack chain so it can be set to [_currentNode] when
+ /// [f] is run.
+ ZoneUnaryCallback<R, T> _registerUnaryCallback<R, T>(
+ Zone self,
+ ZoneDelegate parent,
+ Zone zone,
+ @pragma('vm:awaiter-link') R Function(T) f) {
+ if (_disabled) return parent.registerUnaryCallback(zone, f);
+ var node = _createNode(1);
+ return parent.registerUnaryCallback(
+ zone, (arg) => _run(() => f(arg), node));
+ }
+
+ /// Tracks the current stack chain so it can be set to [_currentNode] when
+ /// [f] is run.
+ ZoneBinaryCallback<R, T1, T2> _registerBinaryCallback<R, T1, T2>(
+ Zone self, ZoneDelegate parent, Zone zone, R Function(T1, T2) f) {
+ if (_disabled) return parent.registerBinaryCallback(zone, f);
+
+ var node = _createNode(1);
+ return parent.registerBinaryCallback(
+ zone, (arg1, arg2) => _run(() => f(arg1, arg2), node));
+ }
+
+ /// Looks up the chain associated with [stackTrace] and passes it either to
+ /// [_onError] or [parent]'s error handler.
+ void _handleUncaughtError(Zone self, ZoneDelegate parent, Zone zone,
+ Object error, StackTrace stackTrace) {
+ if (_disabled) {
+ parent.handleUncaughtError(zone, error, stackTrace);
+ return;
+ }
+
+ var stackChain = chainFor(stackTrace);
+ if (_onError == null) {
+ parent.handleUncaughtError(zone, error, stackChain);
+ return;
+ }
+
+ // TODO(nweiz): Currently this copies a lot of logic from [runZoned]. Just
+ // allow [runBinary] to throw instead once issue 18134 is fixed.
+ try {
+ // TODO(rnystrom): Is the null-assertion correct here? It is nullable in
+ // Zone. Should we check for that here?
+ self.parent!.runBinary(_onError, error, stackChain);
+ } on Object catch (newError, newStackTrace) {
+ if (identical(newError, error)) {
+ parent.handleUncaughtError(zone, error, stackChain);
+ } else {
+ parent.handleUncaughtError(zone, newError, newStackTrace);
+ }
+ }
+ }
+
+ /// Attaches the current stack chain to [stackTrace], replacing it if
+ /// necessary.
+ AsyncError? _errorCallback(Zone self, ZoneDelegate parent, Zone zone,
+ Object error, StackTrace? stackTrace) {
+ if (_disabled) return parent.errorCallback(zone, error, stackTrace);
+
+ // Go up two levels to get through [_CustomZone.errorCallback].
+ if (stackTrace == null) {
+ stackTrace = _createNode(2).toChain();
+ } else {
+ if (_chains[stackTrace] == null) _chains[stackTrace] = _createNode(2);
+ }
+
+ var asyncError = parent.errorCallback(zone, error, stackTrace);
+ return asyncError ?? AsyncError(error, stackTrace);
+ }
+
+ /// Creates a [_Node] with the current stack trace and linked to
+ /// [_currentNode].
+ ///
+ /// By default, the first frame of the first trace will be the line where
+ /// [_createNode] is called. If [level] is passed, the first trace will start
+ /// that many frames up instead.
+ _Node _createNode([int level = 0]) =>
+ _Node(_currentTrace(level + 1), _currentNode);
+
+ // TODO(nweiz): use a more robust way of detecting and tracking errors when
+ // issue 15105 is fixed.
+ /// Runs [f] with [_currentNode] set to [node].
+ ///
+ /// If [f] throws an error, this associates [node] with that error's stack
+ /// trace.
+ T _run<T>(T Function() f, _Node node) {
+ var previousNode = _currentNode;
+ _currentNode = node;
+ try {
+ return f();
+ } catch (e, stackTrace) {
+ // We can see the same stack trace multiple times if it's rethrown through
+ // guarded callbacks. The innermost chain will have the most
+ // information so it should take precedence.
+ _chains[stackTrace] ??= node;
+ rethrow;
+ } finally {
+ _currentNode = previousNode;
+ }
+ }
+
+ /// Like [Trace.current], but if the current stack trace has VM chaining
+ /// enabled, this only returns the innermost sub-trace.
+ Trace _currentTrace([int? level]) {
+ var stackTrace = StackTrace.current;
+ return LazyTrace(() {
+ var text = _trimVMChain(stackTrace);
+ var trace = Trace.parse(text);
+ // JS includes a frame for the call to StackTrace.current, but the VM
+ // doesn't, so we skip an extra frame in a JS context.
+ return Trace(trace.frames.skip((level ?? 0) + (inJS ? 2 : 1)),
+ original: text);
+ });
+ }
+
+ /// Removes the VM's stack chains from the native [trace], since we're
+ /// generating our own and we don't want duplicate frames.
+ String _trimVMChain(StackTrace trace) {
+ var text = trace.toString();
+ var index = text.indexOf(vmChainGap);
+ return index == -1 ? text : text.substring(0, index);
+ }
+}
+
+/// A linked list node representing a single entry in a stack chain.
+class _Node {
+ /// The stack trace for this link of the chain.
+ final Trace trace;
+
+ /// The previous node in the chain.
+ final _Node? previous;
+
+ _Node(StackTrace trace, [this.previous]) : trace = Trace.from(trace);
+
+ /// Converts this to a [Chain].
+ Chain toChain() {
+ var nodes = <Trace>[];
+ _Node? node = this;
+ while (node != null) {
+ nodes.add(node.trace);
+ node = node.previous;
+ }
+ return Chain(nodes);
+ }
+}
diff --git a/pkgs/stack_trace/lib/src/trace.dart b/pkgs/stack_trace/lib/src/trace.dart
new file mode 100644
index 0000000..b8c62f5
--- /dev/null
+++ b/pkgs/stack_trace/lib/src/trace.dart
@@ -0,0 +1,341 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:math' as math;
+
+import 'chain.dart';
+import 'frame.dart';
+import 'lazy_trace.dart';
+import 'unparsed_frame.dart';
+import 'utils.dart';
+import 'vm_trace.dart';
+
+final _terseRegExp = RegExp(r'(-patch)?([/\\].*)?$');
+
+/// A RegExp to match V8's stack traces.
+///
+/// V8's traces start with a line that's either just "Error" or else is a
+/// description of the exception that occurred. That description can be multiple
+/// lines, so we just look for any line other than the first that begins with
+/// three or four spaces and "at".
+final _v8Trace = RegExp(r'\n    ?at ');
+
+/// A RegExp to match individual lines of V8's stack traces.
+///
+/// This is intended to filter out the leading exception details of the trace,
+/// though it is possible for the message to match this as well.
+final _v8TraceLine = RegExp(r'    ?at ');
+
+/// A RegExp to match Firefox's eval and Function stack traces.
+///
+/// https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Error/stack
+///
+/// These stack traces look like:
+///
+/// ````
+/// anonymous/<@https://example.com/stuff.js line 693 > Function:3:40
+/// anonymous/<@https://example.com/stuff.js line 693 > eval:3:40
+/// ````
+final _firefoxEvalTrace = RegExp(r'@\S+ line \d+ >.* (Function|eval):\d+:\d+');
+
+/// A RegExp to match Firefox and Safari's stack traces.
+///
+/// Firefox and Safari have very similar stack trace formats, so we use the same
+/// logic for parsing them.
+///
+/// Firefox's trace frames start with the name of the function in which the
+/// error occurred, possibly including its parameters inside `()`. For example,
+/// `.VW.call$0("arg")@https://example.com/stuff.dart.js:560`.
+///
+/// Safari traces occasionally don't include the initial method name followed by
+/// "@", and they always have both the line and column number (or just a
+/// trailing colon if no column number is available). They can also contain
+/// empty lines or lines consisting only of `[native code]`.
+final _firefoxSafariTrace = RegExp(
+ r'^'
+ r'(' // Member description. Not present in some Safari frames.
+ r'([.0-9A-Za-z_$/<]|\(.*\))*' // Member name and arguments.
+ r'@'
+ r')?'
+ r'[^\s]*' // Frame URL.
+ r':\d*' // Line or column number. Some older frames only have a line number.
+ r'$',
+ multiLine: true);
+
+/// A RegExp to match this package's stack traces.
+final _friendlyTrace =
+ RegExp(r'^[^\s<][^\s]*( \d+(:\d+)?)?[ \t]+[^\s]+$', multiLine: true);
+
+/// A stack trace, comprised of a list of stack frames.
+class Trace implements StackTrace {
+ /// The stack frames that comprise this stack trace.
+ final List<Frame> frames;
+
+ /// The original stack trace from which this trace was parsed.
+ final StackTrace original;
+
+ /// Returns a human-readable representation of [stackTrace]. If [terse] is
+ /// set, this folds together multiple stack frames from the Dart core
+ /// libraries, so that only the core library method directly called from user
+ /// code is visible (see [Trace.terse]).
+ static String format(StackTrace stackTrace, {bool terse = true}) {
+ var trace = Trace.from(stackTrace);
+ if (terse) trace = trace.terse;
+ return trace.toString();
+ }
+
+ /// Returns the current stack trace.
+ ///
+ /// By default, the first frame of this trace will be the line where
+ /// [Trace.current] is called. If [level] is passed, the trace will start that
+ /// many frames up instead.
+ factory Trace.current([int level = 0]) {
+ if (level < 0) {
+ throw ArgumentError('Argument [level] must be greater than or equal '
+ 'to 0.');
+ }
+
+ var trace = Trace.from(StackTrace.current);
+ return LazyTrace(
+ () =>
+ // JS includes a frame for the call to StackTrace.current, but the VM
+ // doesn't, so we skip an extra frame in a JS context.
+ Trace(trace.frames.skip(level + (inJS ? 2 : 1)),
+ original: trace.original.toString()),
+ );
+ }
+
+ /// Returns a new stack trace containing the same data as [trace].
+ ///
+ /// If [trace] is a native [StackTrace], its data will be parsed out; if it's
+ /// a [Trace], it will be returned as-is.
+ factory Trace.from(StackTrace trace) {
+ if (trace is Trace) return trace;
+ if (trace is Chain) return trace.toTrace();
+ return LazyTrace(() => Trace.parse(trace.toString()));
+ }
+
+ /// Parses a string representation of a stack trace.
+ ///
+ /// [trace] should be formatted in the same way as a Dart VM or browser stack
+ /// trace. If it's formatted as a stack chain, this will return the equivalent
+ /// of [Chain.toTrace].
+ factory Trace.parse(String trace) {
+ try {
+ if (trace.isEmpty) return Trace(<Frame>[]);
+ if (trace.contains(_v8Trace)) return Trace.parseV8(trace);
+ if (trace.contains('\tat ')) return Trace.parseJSCore(trace);
+ if (trace.contains(_firefoxSafariTrace) ||
+ trace.contains(_firefoxEvalTrace)) {
+ return Trace.parseFirefox(trace);
+ }
+ if (trace.contains(chainGap)) return Chain.parse(trace).toTrace();
+ if (trace.contains(_friendlyTrace)) {
+ return Trace.parseFriendly(trace);
+ }
+
+ // Default to parsing the stack trace as a VM trace. This is also hit on
+ // IE and Safari, where the stack trace is just an empty string (issue
+ // 11257).
+ return Trace.parseVM(trace);
+ } on FormatException catch (error) {
+ throw FormatException('${error.message}\nStack trace:\n$trace');
+ }
+ }
+
+ /// Parses a string representation of a Dart VM stack trace.
+ Trace.parseVM(String trace) : this(_parseVM(trace), original: trace);
+
+ static List<Frame> _parseVM(String trace) {
+ // Ignore [vmChainGap]. This matches the behavior of
+ // `Chain.parse().toTrace()`.
+ var lines = trace
+ .trim()
+ .replaceAll(vmChainGap, '')
+ .split('\n')
+ .where((line) => line.isNotEmpty);
+
+ if (lines.isEmpty) {
+ return [];
+ }
+
+ var frames = lines.take(lines.length - 1).map(Frame.parseVM).toList();
+
+ // TODO(nweiz): Remove this when issue 23614 is fixed.
+ if (!lines.last.endsWith('.da')) {
+ frames.add(Frame.parseVM(lines.last));
+ }
+
+ return frames;
+ }
+
+ /// Parses a string representation of a Chrome/V8 stack trace.
+ Trace.parseV8(String trace)
+ : this(
+ trace
+ .split('\n')
+ .skip(1)
+ // It's possible that an Exception's description contains a line
+ // that looks like a V8 trace line, which will screw this up.
+ // Unfortunately, that's impossible to detect.
+ .skipWhile((line) => !line.startsWith(_v8TraceLine))
+ .map(Frame.parseV8),
+ original: trace);
+
+ /// Parses a string representation of a JavaScriptCore stack trace.
+ Trace.parseJSCore(String trace)
+ : this(
+ trace
+ .split('\n')
+ .where((line) => line != '\tat ')
+ .map(Frame.parseV8),
+ original: trace);
+
+ /// Parses a string representation of an Internet Explorer stack trace.
+ ///
+ /// IE10+ traces look just like V8 traces. Prior to IE10, stack traces can't
+ /// be retrieved.
+ Trace.parseIE(String trace) : this.parseV8(trace);
+
+ /// Parses a string representation of a Firefox stack trace.
+ Trace.parseFirefox(String trace)
+ : this(
+ trace
+ .trim()
+ .split('\n')
+ .where((line) => line.isNotEmpty && line != '[native code]')
+ .map(Frame.parseFirefox),
+ original: trace);
+
+ /// Parses a string representation of a Safari stack trace.
+ Trace.parseSafari(String trace) : this.parseFirefox(trace);
+
+ /// Parses a string representation of a Safari 6.1+ stack trace.
+ @Deprecated('Use Trace.parseSafari instead.')
+ Trace.parseSafari6_1(String trace) : this.parseSafari(trace);
+
+ /// Parses a string representation of a Safari 6.0 stack trace.
+ @Deprecated('Use Trace.parseSafari instead.')
+ Trace.parseSafari6_0(String trace)
+ : this(
+ trace
+ .trim()
+ .split('\n')
+ .where((line) => line != '[native code]')
+ .map(Frame.parseFirefox),
+ original: trace);
+
+ /// Parses this package's string representation of a stack trace.
+ ///
+ /// This also parses string representations of [Chain]s. They parse to the
+ /// same trace that [Chain.toTrace] would return.
+ Trace.parseFriendly(String trace)
+ : this(
+ trace.isEmpty
+ ? []
+ : trace
+ .trim()
+ .split('\n')
+ // Filter out asynchronous gaps from [Chain]s.
+ .where((line) => !line.startsWith('====='))
+ .map(Frame.parseFriendly),
+ original: trace);
+
+ /// Returns a new [Trace] comprised of [frames].
+ Trace(Iterable<Frame> frames, {String? original})
+ : frames = List<Frame>.unmodifiable(frames),
+ original = StackTrace.fromString(original ?? '');
+
+ /// Returns a VM-style [StackTrace] object.
+ ///
+ /// The return value's [toString] method will always return a string
+ /// representation in the Dart VM's stack trace format, regardless of what
+ /// platform is being used.
+ StackTrace get vmTrace => VMTrace(frames);
+
+ /// Returns a terser version of this trace.
+ ///
+ /// This is accomplished by folding together multiple stack frames from the
+ /// core library or from this package, as in [foldFrames]. Remaining core
+ /// library frames have their libraries, "-patch" suffixes, and line numbers
+ /// removed. If the outermost frame of the stack trace is a core library
+ /// frame, it's removed entirely.
+ ///
+ /// This won't do anything with a raw JavaScript trace, since there's no way
+ /// to determine which frames come from which Dart libraries. However, the
+  /// [`source_map_stack_trace`](https://pub.dev/packages/source_map_stack_trace)
+ /// package can be used to convert JavaScript traces into Dart-style traces.
+ ///
+ /// For custom folding, see [foldFrames].
+ Trace get terse => foldFrames((_) => false, terse: true);
+
+ /// Returns a new [Trace] based on `this` where multiple stack frames matching
+ /// [predicate] are folded together.
+ ///
+ /// This means that whenever there are multiple frames in a row that match
+ /// [predicate], only the last one is kept. This is useful for limiting the
+ /// amount of library code that appears in a stack trace by only showing user
+ /// code and code that's called by user code.
+ ///
+ /// If [terse] is true, this will also fold together frames from the core
+ /// library or from this package, simplify core library frames, and
+ /// potentially remove the outermost frame as in [Trace.terse].
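+  ///
+  /// For example, to fold together frames from a (hypothetical) generated
+  /// helper library in addition to the terse defaults:
+  ///
+  /// ```dart
+  /// trace.foldFrames(
+  ///     (frame) => frame.library.endsWith('.g.dart'), terse: true);
+  /// ```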
+ Trace foldFrames(bool Function(Frame) predicate, {bool terse = false}) {
+ if (terse) {
+ var oldPredicate = predicate;
+ predicate = (frame) {
+ if (oldPredicate(frame)) return true;
+
+ if (frame.isCore) return true;
+ if (frame.package == 'stack_trace') return true;
+
+ // Ignore async stack frames without any line or column information.
+ // These come from the VM's async/await implementation and represent
+ // internal frames. They only ever show up in stack chains and are
+ // always surrounded by other traces that are actually useful, so we can
+ // just get rid of them.
+ // TODO(nweiz): Get rid of this logic some time after issue 22009 is
+ // fixed.
+ if (!frame.member!.contains('<async>')) return false;
+ return frame.line == null;
+ };
+ }
+
+ var newFrames = <Frame>[];
+ for (var frame in frames.reversed) {
+ if (frame is UnparsedFrame || !predicate(frame)) {
+ newFrames.add(frame);
+ } else if (newFrames.isEmpty || !predicate(newFrames.last)) {
+ newFrames.add(Frame(frame.uri, frame.line, frame.column, frame.member));
+ }
+ }
+
+ if (terse) {
+ newFrames = newFrames.map((frame) {
+ if (frame is UnparsedFrame || !predicate(frame)) return frame;
+ var library = frame.library.replaceAll(_terseRegExp, '');
+ return Frame(Uri.parse(library), null, null, frame.member);
+ }).toList();
+
+ if (newFrames.length > 1 && predicate(newFrames.first)) {
+ newFrames.removeAt(0);
+ }
+ }
+
+ return Trace(newFrames.reversed, original: original.toString());
+ }
+
+ @override
+ String toString() {
+ // Figure out the longest path so we know how much to pad.
+ var longest =
+ frames.map((frame) => frame.location.length).fold(0, math.max);
+
+ // Print out the stack trace nicely formatted.
+ return frames.map((frame) {
+ if (frame is UnparsedFrame) return '$frame\n';
+ return '${frame.location.padRight(longest)} ${frame.member}\n';
+ }).join();
+ }
+}
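
For reviewers, a short usage sketch (not part of the diff) of the folding API documented above; `package:my_logger` is a hypothetical package name used only for illustration:

```dart
import 'package:stack_trace/stack_trace.dart';

void main() {
  final trace = Trace.current();

  // `terse` keeps only the last frame of each run of dart:* /
  // package:stack_trace frames and strips their line numbers.
  print(trace.terse);

  // Custom folding: collapse consecutive frames from a hypothetical
  // package:my_logger so user code stays prominent.
  print(trace.foldFrames((frame) => frame.package == 'my_logger'));
}
```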
diff --git a/pkgs/stack_trace/lib/src/unparsed_frame.dart b/pkgs/stack_trace/lib/src/unparsed_frame.dart
new file mode 100644
index 0000000..27e97f6
--- /dev/null
+++ b/pkgs/stack_trace/lib/src/unparsed_frame.dart
@@ -0,0 +1,33 @@
+// Copyright (c) 2015, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'frame.dart';
+
+/// A frame that failed to parse.
+///
+/// The [member] property contains the original frame's contents.
+class UnparsedFrame implements Frame {
+ @override
+ final Uri uri = Uri(path: 'unparsed');
+ @override
+ final int? line = null;
+ @override
+ final int? column = null;
+ @override
+ final bool isCore = false;
+ @override
+ final String library = 'unparsed';
+ @override
+ final String? package = null;
+ @override
+ final String location = 'unparsed';
+
+ @override
+ final String member;
+
+ UnparsedFrame(this.member);
+
+ @override
+ String toString() => member;
+}
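
A brief sketch (not in the diff) of where `UnparsedFrame` shows up in practice: the parsers fall back to it for lines they can't interpret, and `toString()` round-trips the original text (see the `expectIsUnparsed` cases in `frame_test.dart` below):

```dart
import 'package:stack_trace/stack_trace.dart';

void main() {
  // A line that doesn't match the V8 stack frame format.
  final frame = Frame.parseV8('not a real stack frame');
  print(frame is UnparsedFrame); // true
  print(frame);                  // prints the original line unchanged
}
```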
diff --git a/pkgs/stack_trace/lib/src/utils.dart b/pkgs/stack_trace/lib/src/utils.dart
new file mode 100644
index 0000000..bd971fe
--- /dev/null
+++ b/pkgs/stack_trace/lib/src/utils.dart
@@ -0,0 +1,15 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+/// The line used in the string representation of stack chains to represent
+/// the gap between traces.
+const chainGap = '===== asynchronous gap ===========================\n';
+
+/// The line used in the string representation of VM stack chains to represent
+/// the gap between traces.
+final vmChainGap = RegExp(r'^<asynchronous suspension>\n?$', multiLine: true);
+
+// TODO(nweiz): When cross-platform imports work, use them to set this.
+/// Whether we're running in a JS context.
+const bool inJS = 0.0 is int;
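
The `0.0 is int` check works because dart2js represents every number as a JavaScript double, so integer-valued doubles satisfy `is int`; on the VM, `int` and `double` are distinct types. A minimal sketch of the same probe:

```dart
void main() {
  // True when compiled to JavaScript, false on the Dart VM.
  const runningAsJs = 0.0 is int;
  print(runningAsJs
      ? 'JS numbers: double and int are unified'
      : 'VM numbers: double and int are distinct');
}
```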
diff --git a/pkgs/stack_trace/lib/src/vm_trace.dart b/pkgs/stack_trace/lib/src/vm_trace.dart
new file mode 100644
index 0000000..005b7af
--- /dev/null
+++ b/pkgs/stack_trace/lib/src/vm_trace.dart
@@ -0,0 +1,32 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'frame.dart';
+
+/// An implementation of [StackTrace] that emulates the behavior of the VM's
+/// implementation.
+///
+/// In particular, when [toString] is called, this returns a string in the VM's
+/// stack trace format.
+class VMTrace implements StackTrace {
+ /// The stack frames that comprise this stack trace.
+ final List<Frame> frames;
+
+ VMTrace(this.frames);
+
+ @override
+ String toString() {
+ var i = 1;
+ return frames.map((frame) {
+ var number = '#${i++}'.padRight(8);
+ var member = frame.member!
+          .replaceAllMapped(RegExp(r'([^.]+)\.<async>'),
+ (match) => '${match[1]}.<${match[1]}_async_body>')
+ .replaceAll('<fn>', '<anonymous closure>');
+ var line = frame.line ?? 0;
+ var column = frame.column ?? 0;
+ return '$number$member (${frame.uri}:$line:$column)\n';
+ }).join();
+ }
+}
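
A small sketch (not part of the diff) of the conversion `VMTrace` provides via `Trace.vmTrace`: any parsed trace can be re-printed in the VM's `#N member (uri:line:column)` layout:

```dart
import 'package:stack_trace/stack_trace.dart';

void main() {
  final trace = Trace.parse('#0      main (file:///app.dart:10:11)');
  // Prints the frame back in VM format, renumbered from #1.
  print(trace.vmTrace);
}
```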
diff --git a/pkgs/stack_trace/lib/stack_trace.dart b/pkgs/stack_trace/lib/stack_trace.dart
new file mode 100644
index 0000000..fad30ce
--- /dev/null
+++ b/pkgs/stack_trace/lib/stack_trace.dart
@@ -0,0 +1,8 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+export 'src/chain.dart';
+export 'src/frame.dart';
+export 'src/trace.dart';
+export 'src/unparsed_frame.dart';
diff --git a/pkgs/stack_trace/pubspec.yaml b/pkgs/stack_trace/pubspec.yaml
new file mode 100644
index 0000000..4f387b1
--- /dev/null
+++ b/pkgs/stack_trace/pubspec.yaml
@@ -0,0 +1,14 @@
+name: stack_trace
+version: 1.12.1
+description: A package for manipulating stack traces and printing them readably.
+repository: https://github.com/dart-lang/tools/tree/main/pkgs/stack_trace
+
+environment:
+ sdk: ^3.4.0
+
+dependencies:
+ path: ^1.8.0
+
+dev_dependencies:
+ dart_flutter_team_lints: ^3.0.0
+ test: ^1.16.6
diff --git a/pkgs/stack_trace/test/chain/chain_test.dart b/pkgs/stack_trace/test/chain/chain_test.dart
new file mode 100644
index 0000000..d5426dd
--- /dev/null
+++ b/pkgs/stack_trace/test/chain/chain_test.dart
@@ -0,0 +1,375 @@
+// Copyright (c) 2015, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'package:path/path.dart' as p;
+import 'package:stack_trace/stack_trace.dart';
+import 'package:test/test.dart';
+
+import 'utils.dart';
+
+void main() {
+ group('Chain.parse()', () {
+ test('parses a real Chain', () async {
+ // ignore: only_throw_errors
+ final chain = await captureFuture(() => inMicrotask(() => throw 'error'));
+
+ expect(
+ Chain.parse(chain.toString()).toString(),
+ equals(chain.toString()),
+ );
+ });
+
+ test('parses an empty string', () {
+ var chain = Chain.parse('');
+ expect(chain.traces, isEmpty);
+ });
+
+ test('parses a chain containing empty traces', () {
+ var chain =
+ Chain.parse('===== asynchronous gap ===========================\n'
+ '===== asynchronous gap ===========================\n');
+ expect(chain.traces, hasLength(3));
+ expect(chain.traces[0].frames, isEmpty);
+ expect(chain.traces[1].frames, isEmpty);
+ expect(chain.traces[2].frames, isEmpty);
+ });
+
+ test('parses a chain with VM gaps', () {
+ final chain =
+ Chain.parse('#1 MyClass.run (package:my_lib.dart:134:5)\n'
+ '<asynchronous suspension>\n'
+ '#2 main (file:///my_app.dart:9:3)\n'
+ '<asynchronous suspension>\n');
+ expect(chain.traces, hasLength(2));
+ expect(chain.traces[0].frames, hasLength(1));
+ expect(chain.traces[0].frames[0].toString(),
+ equals('package:my_lib.dart 134:5 in MyClass.run'));
+ expect(chain.traces[1].frames, hasLength(1));
+ expect(
+ chain.traces[1].frames[0].toString(),
+ anyOf(
+ equals('/my_app.dart 9:3 in main'), // VM
+ equals('file:///my_app.dart 9:3 in main'), // Browser
+ ),
+ );
+ });
+ });
+
+ group('Chain.capture()', () {
+ test('with onError blocks errors', () {
+ Chain.capture(() {
+ return Future<void>.error('oh no');
+ }, onError: expectAsync2((error, chain) {
+ expect(error, equals('oh no'));
+ expect(chain, isA<Chain>());
+ })).then(expectAsync1((_) {}, count: 0),
+ onError: expectAsync2((_, __) {}, count: 0));
+ });
+
+ test('with no onError blocks errors', () {
+ runZonedGuarded(() {
+ Chain.capture(() => Future<void>.error('oh no')).then(
+ expectAsync1((_) {}, count: 0),
+ onError: expectAsync2((_, __) {}, count: 0));
+ }, expectAsync2((error, chain) {
+ expect(error, equals('oh no'));
+ expect(chain, isA<Chain>());
+ }));
+ });
+
+ test("with errorZone: false doesn't block errors", () {
+ expect(Chain.capture(() => Future<void>.error('oh no'), errorZone: false),
+ throwsA('oh no'));
+ });
+
+ test("doesn't allow onError and errorZone: false", () {
+ expect(() => Chain.capture(() {}, onError: (_, __) {}, errorZone: false),
+ throwsArgumentError);
+ });
+
+ group('with when: false', () {
+ test("with no onError doesn't block errors", () {
+ expect(Chain.capture(() => Future<void>.error('oh no'), when: false),
+ throwsA('oh no'));
+ });
+
+ test('with onError blocks errors', () {
+ Chain.capture(() {
+ return Future<void>.error('oh no');
+ }, onError: expectAsync2((error, chain) {
+ expect(error, equals('oh no'));
+ expect(chain, isA<Chain>());
+ }), when: false);
+ });
+
+ test("doesn't enable chain-tracking", () {
+ return Chain.disable(() {
+ return Chain.capture(() {
+ var completer = Completer<Chain>();
+ inMicrotask(() {
+ completer.complete(Chain.current());
+ });
+
+ return completer.future.then((chain) {
+ expect(chain.traces, hasLength(1));
+ });
+ }, when: false);
+ });
+ });
+ });
+ });
+
+ test('Chain.capture() with custom zoneValues', () {
+ return Chain.capture(() {
+ expect(Zone.current[#enabled], true);
+ }, zoneValues: {#enabled: true});
+ });
+
+ group('Chain.disable()', () {
+ test('disables chain-tracking', () {
+ return Chain.disable(() {
+ var completer = Completer<Chain>();
+ inMicrotask(() => completer.complete(Chain.current()));
+
+ return completer.future.then((chain) {
+ expect(chain.traces, hasLength(1));
+ });
+ });
+ });
+
+ test('Chain.capture() re-enables chain-tracking', () {
+ return Chain.disable(() {
+ return Chain.capture(() {
+ var completer = Completer<Chain>();
+ inMicrotask(() => completer.complete(Chain.current()));
+
+ return completer.future.then((chain) {
+ expect(chain.traces, hasLength(2));
+ });
+ });
+ });
+ });
+
+ test('preserves parent zones of the capture zone', () {
+ // The outer disable call turns off the test package's chain-tracking.
+ return Chain.disable(() {
+ return runZoned(() {
+ return Chain.capture(() {
+ expect(Chain.disable(() => Zone.current[#enabled]), isTrue);
+ });
+ }, zoneValues: {#enabled: true});
+ });
+ });
+
+ test('preserves child zones of the capture zone', () {
+ // The outer disable call turns off the test package's chain-tracking.
+ return Chain.disable(() {
+ return Chain.capture(() {
+ return runZoned(() {
+ expect(Chain.disable(() => Zone.current[#enabled]), isTrue);
+ }, zoneValues: {#enabled: true});
+ });
+ });
+ });
+
+ test("with when: false doesn't disable", () {
+ return Chain.capture(() {
+ return Chain.disable(() {
+ var completer = Completer<Chain>();
+ inMicrotask(() => completer.complete(Chain.current()));
+
+ return completer.future.then((chain) {
+ expect(chain.traces, hasLength(2));
+ });
+ }, when: false);
+ });
+ });
+ });
+
+ test('toString() ensures that all traces are aligned', () {
+ var chain = Chain([
+ Trace.parse('short 10:11 Foo.bar\n'),
+ Trace.parse('loooooooooooong 10:11 Zop.zoop')
+ ]);
+
+ expect(
+ chain.toString(),
+ equals('short 10:11 Foo.bar\n'
+ '===== asynchronous gap ===========================\n'
+ 'loooooooooooong 10:11 Zop.zoop\n'));
+ });
+
+ var userSlashCode = p.join('user', 'code.dart');
+ group('Chain.terse', () {
+ test('makes each trace terse', () {
+ var chain = Chain([
+ Trace.parse('dart:core 10:11 Foo.bar\n'
+ 'dart:core 10:11 Bar.baz\n'
+ 'user/code.dart 10:11 Bang.qux\n'
+ 'dart:core 10:11 Zip.zap\n'
+ 'dart:core 10:11 Zop.zoop'),
+ Trace.parse('user/code.dart 10:11 Bang.qux\n'
+ 'dart:core 10:11 Foo.bar\n'
+ 'package:stack_trace/stack_trace.dart 10:11 Bar.baz\n'
+ 'dart:core 10:11 Zip.zap\n'
+ 'user/code.dart 10:11 Zop.zoop')
+ ]);
+
+ expect(
+ chain.terse.toString(),
+ equals('dart:core Bar.baz\n'
+ '$userSlashCode 10:11 Bang.qux\n'
+ '===== asynchronous gap ===========================\n'
+ '$userSlashCode 10:11 Bang.qux\n'
+ 'dart:core Zip.zap\n'
+ '$userSlashCode 10:11 Zop.zoop\n'));
+ });
+
+ test('eliminates internal-only traces', () {
+ var chain = Chain([
+ Trace.parse('user/code.dart 10:11 Foo.bar\n'
+ 'dart:core 10:11 Bar.baz'),
+ Trace.parse('dart:core 10:11 Foo.bar\n'
+ 'package:stack_trace/stack_trace.dart 10:11 Bar.baz\n'
+ 'dart:core 10:11 Zip.zap'),
+ Trace.parse('user/code.dart 10:11 Foo.bar\n'
+ 'dart:core 10:11 Bar.baz')
+ ]);
+
+ expect(
+ chain.terse.toString(),
+ equals('$userSlashCode 10:11 Foo.bar\n'
+ '===== asynchronous gap ===========================\n'
+ '$userSlashCode 10:11 Foo.bar\n'));
+ });
+
+ test("doesn't return an empty chain", () {
+ var chain = Chain([
+ Trace.parse('dart:core 10:11 Foo.bar\n'
+ 'package:stack_trace/stack_trace.dart 10:11 Bar.baz\n'
+ 'dart:core 10:11 Zip.zap'),
+ Trace.parse('dart:core 10:11 A.b\n'
+ 'package:stack_trace/stack_trace.dart 10:11 C.d\n'
+ 'dart:core 10:11 E.f')
+ ]);
+
+ expect(chain.terse.toString(), equals('dart:core E.f\n'));
+ });
+
+ // Regression test for #9
+ test("doesn't crash on empty traces", () {
+ var chain = Chain([
+ Trace.parse('user/code.dart 10:11 Bang.qux'),
+ Trace([]),
+ Trace.parse('user/code.dart 10:11 Bang.qux')
+ ]);
+
+ expect(
+ chain.terse.toString(),
+ equals('$userSlashCode 10:11 Bang.qux\n'
+ '===== asynchronous gap ===========================\n'
+ '$userSlashCode 10:11 Bang.qux\n'));
+ });
+ });
+
+ group('Chain.foldFrames', () {
+ test('folds each trace', () {
+ var chain = Chain([
+ Trace.parse('a.dart 10:11 Foo.bar\n'
+ 'a.dart 10:11 Bar.baz\n'
+ 'b.dart 10:11 Bang.qux\n'
+ 'a.dart 10:11 Zip.zap\n'
+ 'a.dart 10:11 Zop.zoop'),
+ Trace.parse('a.dart 10:11 Foo.bar\n'
+ 'a.dart 10:11 Bar.baz\n'
+ 'a.dart 10:11 Bang.qux\n'
+ 'a.dart 10:11 Zip.zap\n'
+ 'b.dart 10:11 Zop.zoop')
+ ]);
+
+ var folded = chain.foldFrames((frame) => frame.library == 'a.dart');
+ expect(
+ folded.toString(),
+ equals('a.dart 10:11 Bar.baz\n'
+ 'b.dart 10:11 Bang.qux\n'
+ 'a.dart 10:11 Zop.zoop\n'
+ '===== asynchronous gap ===========================\n'
+ 'a.dart 10:11 Zip.zap\n'
+ 'b.dart 10:11 Zop.zoop\n'));
+ });
+
+ test('with terse: true, folds core frames as well', () {
+ var chain = Chain([
+ Trace.parse('a.dart 10:11 Foo.bar\n'
+ 'dart:async-patch/future.dart 10:11 Zip.zap\n'
+ 'b.dart 10:11 Bang.qux\n'
+ 'dart:core 10:11 Bar.baz\n'
+ 'a.dart 10:11 Zop.zoop'),
+ Trace.parse('a.dart 10:11 Foo.bar\n'
+ 'a.dart 10:11 Bar.baz\n'
+ 'a.dart 10:11 Bang.qux\n'
+ 'a.dart 10:11 Zip.zap\n'
+ 'b.dart 10:11 Zop.zoop')
+ ]);
+
+ var folded =
+ chain.foldFrames((frame) => frame.library == 'a.dart', terse: true);
+ expect(
+ folded.toString(),
+ equals('dart:async Zip.zap\n'
+ 'b.dart 10:11 Bang.qux\n'
+ '===== asynchronous gap ===========================\n'
+ 'a.dart Zip.zap\n'
+ 'b.dart 10:11 Zop.zoop\n'));
+ });
+
+ test('eliminates completely-folded traces', () {
+ var chain = Chain([
+ Trace.parse('a.dart 10:11 Foo.bar\n'
+ 'b.dart 10:11 Bang.qux'),
+ Trace.parse('a.dart 10:11 Foo.bar\n'
+ 'a.dart 10:11 Bang.qux'),
+ Trace.parse('a.dart 10:11 Zip.zap\n'
+ 'b.dart 10:11 Zop.zoop')
+ ]);
+
+ var folded = chain.foldFrames((frame) => frame.library == 'a.dart');
+ expect(
+ folded.toString(),
+ equals('a.dart 10:11 Foo.bar\n'
+ 'b.dart 10:11 Bang.qux\n'
+ '===== asynchronous gap ===========================\n'
+ 'a.dart 10:11 Zip.zap\n'
+ 'b.dart 10:11 Zop.zoop\n'));
+ });
+
+ test("doesn't return an empty trace", () {
+ var chain = Chain([
+ Trace.parse('a.dart 10:11 Foo.bar\n'
+ 'a.dart 10:11 Bang.qux')
+ ]);
+
+ var folded = chain.foldFrames((frame) => frame.library == 'a.dart');
+ expect(folded.toString(), equals('a.dart 10:11 Bang.qux\n'));
+ });
+ });
+
+ test('Chain.toTrace eliminates asynchronous gaps', () {
+ var trace = Chain([
+ Trace.parse('user/code.dart 10:11 Foo.bar\n'
+ 'dart:core 10:11 Bar.baz'),
+ Trace.parse('user/code.dart 10:11 Foo.bar\n'
+ 'dart:core 10:11 Bar.baz')
+ ]).toTrace();
+
+ expect(
+ trace.toString(),
+ equals('$userSlashCode 10:11 Foo.bar\n'
+ 'dart:core 10:11 Bar.baz\n'
+ '$userSlashCode 10:11 Foo.bar\n'
+ 'dart:core 10:11 Bar.baz\n'));
+ });
+}
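
To summarize what these tests pin down, a hedged usage sketch (not in the diff) of `Chain.capture`: errors thrown anywhere in the zone reach `onError` along with a `Chain` that has one trace per async step (two here: the throw plus the point where the microtask was scheduled):

```dart
import 'dart:async';

import 'package:stack_trace/stack_trace.dart';

void main() {
  Chain.capture(() {
    scheduleMicrotask(() => throw StateError('boom'));
  }, onError: (error, chain) {
    print('caught $error across ${chain.traces.length} traces');
    print(chain.terse);
  });
}
```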
diff --git a/pkgs/stack_trace/test/chain/dart2js_test.dart b/pkgs/stack_trace/test/chain/dart2js_test.dart
new file mode 100644
index 0000000..abb842d
--- /dev/null
+++ b/pkgs/stack_trace/test/chain/dart2js_test.dart
@@ -0,0 +1,337 @@
+// Copyright (c) 2015, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+// ignore_for_file: only_throw_errors
+
+// dart2js chain tests are separated out because dart2js stack traces are
+// inconsistent due to inlining and browser differences. These tests don't
+// assert anything about the content of the traces, just the number of traces in
+// a chain.
+@TestOn('js')
+library;
+
+import 'dart:async';
+
+import 'package:stack_trace/stack_trace.dart';
+import 'package:test/test.dart';
+
+import 'utils.dart';
+
+void main() {
+ group('capture() with onError catches exceptions', () {
+ test('thrown synchronously', () async {
+ var chain = await captureFuture(() => throw 'error');
+ expect(chain.traces, hasLength(1));
+ });
+
+ test('thrown in a microtask', () async {
+ var chain = await captureFuture(() => inMicrotask(() => throw 'error'));
+ expect(chain.traces, hasLength(2));
+ });
+
+ test('thrown in a one-shot timer', () async {
+ var chain =
+ await captureFuture(() => inOneShotTimer(() => throw 'error'));
+ expect(chain.traces, hasLength(2));
+ });
+
+ test('thrown in a periodic timer', () async {
+ var chain =
+ await captureFuture(() => inPeriodicTimer(() => throw 'error'));
+ expect(chain.traces, hasLength(2));
+ });
+
+ test('thrown in a nested series of asynchronous operations', () async {
+ var chain = await captureFuture(() {
+ inPeriodicTimer(() {
+ inOneShotTimer(() => inMicrotask(() => throw 'error'));
+ });
+ });
+
+ expect(chain.traces, hasLength(4));
+ });
+
+ test('thrown in a long future chain', () async {
+ var chain = await captureFuture(() => inFutureChain(() => throw 'error'));
+
+ // Despite many asynchronous operations, there's only one level of
+ // nested calls, so there should be only two traces in the chain. This
+ // is important; programmers expect stack trace memory consumption to be
+ // O(depth of program), not O(length of program).
+ expect(chain.traces, hasLength(2));
+ });
+
+ test('thrown in new Future()', () async {
+ var chain = await captureFuture(() => inNewFuture(() => throw 'error'));
+ expect(chain.traces, hasLength(3));
+ });
+
+ test('thrown in new Future.sync()', () async {
+ var chain = await captureFuture(() {
+ inMicrotask(() => inSyncFuture(() => throw 'error'));
+ });
+
+ expect(chain.traces, hasLength(3));
+ });
+
+ test('multiple times', () {
+ var completer = Completer<void>();
+ var first = true;
+
+ Chain.capture(() {
+ inMicrotask(() => throw 'first error');
+ inPeriodicTimer(() => throw 'second error');
+ }, onError: (error, chain) {
+ try {
+ if (first) {
+ expect(error, equals('first error'));
+ expect(chain.traces, hasLength(2));
+ first = false;
+ } else {
+ expect(error, equals('second error'));
+ expect(chain.traces, hasLength(2));
+ completer.complete();
+ }
+ } on Object catch (error, stackTrace) {
+ completer.completeError(error, stackTrace);
+ }
+ });
+
+ return completer.future;
+ });
+
+ test('passed to a completer', () async {
+ var trace = Trace.current();
+ var chain = await captureFuture(() {
+ inMicrotask(() => completerErrorFuture(trace));
+ });
+
+ expect(chain.traces, hasLength(3));
+
+ // The first trace is the trace that was manually reported for the
+ // error.
+ expect(chain.traces.first.toString(), equals(trace.toString()));
+ });
+
+ test('passed to a completer with no stack trace', () async {
+ var chain = await captureFuture(() {
+ inMicrotask(completerErrorFuture);
+ });
+
+ expect(chain.traces, hasLength(2));
+ });
+
+ test('passed to a stream controller', () async {
+ var trace = Trace.current();
+ var chain = await captureFuture(() {
+ inMicrotask(() => controllerErrorStream(trace).listen(null));
+ });
+
+ expect(chain.traces, hasLength(3));
+ expect(chain.traces.first.toString(), equals(trace.toString()));
+ });
+
+ test('passed to a stream controller with no stack trace', () async {
+ var chain = await captureFuture(() {
+ inMicrotask(() => controllerErrorStream().listen(null));
+ });
+
+ expect(chain.traces, hasLength(2));
+ });
+
+ test('and relays them to the parent zone', () {
+ var completer = Completer<void>();
+
+ runZonedGuarded(() {
+ Chain.capture(() {
+ inMicrotask(() => throw 'error');
+ }, onError: (error, chain) {
+ expect(error, equals('error'));
+ expect(chain.traces, hasLength(2));
+ throw error;
+ });
+ }, (error, chain) {
+ try {
+ expect(error, equals('error'));
+ expect(chain,
+ isA<Chain>().having((c) => c.traces, 'traces', hasLength(2)));
+ completer.complete();
+ } on Object catch (error, stackTrace) {
+ completer.completeError(error, stackTrace);
+ }
+ });
+
+ return completer.future;
+ });
+ });
+
+ test('capture() without onError passes exceptions to parent zone', () {
+ var completer = Completer<void>();
+
+ runZonedGuarded(() {
+ Chain.capture(() => inMicrotask(() => throw 'error'));
+ }, (error, chain) {
+ try {
+ expect(error, equals('error'));
+ expect(chain,
+ isA<Chain>().having((c) => c.traces, 'traces', hasLength(2)));
+ completer.complete();
+ } on Object catch (error, stackTrace) {
+ completer.completeError(error, stackTrace);
+ }
+ });
+
+ return completer.future;
+ });
+
+ group('current() within capture()', () {
+ test('called in a microtask', () async {
+ var completer = Completer<Chain>();
+ Chain.capture(() {
+ inMicrotask(() => completer.complete(Chain.current()));
+ });
+
+ var chain = await completer.future;
+ expect(chain.traces, hasLength(2));
+ });
+
+ test('called in a one-shot timer', () async {
+ var completer = Completer<Chain>();
+ Chain.capture(() {
+ inOneShotTimer(() => completer.complete(Chain.current()));
+ });
+
+ var chain = await completer.future;
+ expect(chain.traces, hasLength(2));
+ });
+
+ test('called in a periodic timer', () async {
+ var completer = Completer<Chain>();
+ Chain.capture(() {
+ inPeriodicTimer(() => completer.complete(Chain.current()));
+ });
+
+ var chain = await completer.future;
+ expect(chain.traces, hasLength(2));
+ });
+
+ test('called in a nested series of asynchronous operations', () async {
+ var completer = Completer<Chain>();
+ Chain.capture(() {
+ inPeriodicTimer(() {
+ inOneShotTimer(() {
+ inMicrotask(() => completer.complete(Chain.current()));
+ });
+ });
+ });
+
+ var chain = await completer.future;
+ expect(chain.traces, hasLength(4));
+ });
+
+ test('called in a long future chain', () async {
+ var completer = Completer<Chain>();
+ Chain.capture(() {
+ inFutureChain(() => completer.complete(Chain.current()));
+ });
+
+ var chain = await completer.future;
+ expect(chain.traces, hasLength(2));
+ });
+ });
+
+ test(
+ 'current() outside of capture() returns a chain wrapping the current trace',
+ () =>
+ // The test runner runs all tests with chains enabled.
+ Chain.disable(() async {
+ var completer = Completer<Chain>();
+ inMicrotask(() => completer.complete(Chain.current()));
+
+ var chain = await completer.future;
+ // Since the chain wasn't loaded within [Chain.capture], the full stack
+ // chain isn't available and it just returns the current stack when
+ // called.
+ expect(chain.traces, hasLength(1));
+ }),
+ );
+
+ group('forTrace() within capture()', () {
+ test('called for a stack trace from a microtask', () async {
+ var chain = await Chain.capture(
+ () => chainForTrace(inMicrotask, () => throw 'error'));
+
+ // Because [chainForTrace] has to set up a future chain to capture the
+ // stack trace while still showing it to the zone specification, it adds
+ // an additional level of async nesting and so an additional trace.
+ expect(chain.traces, hasLength(3));
+ });
+
+ test('called for a stack trace from a one-shot timer', () async {
+ var chain = await Chain.capture(
+ () => chainForTrace(inOneShotTimer, () => throw 'error'));
+
+ expect(chain.traces, hasLength(3));
+ });
+
+ test('called for a stack trace from a periodic timer', () async {
+ var chain = await Chain.capture(
+ () => chainForTrace(inPeriodicTimer, () => throw 'error'));
+
+ expect(chain.traces, hasLength(3));
+ });
+
+ test(
+ 'called for a stack trace from a nested series of asynchronous '
+ 'operations', () async {
+ var chain = await Chain.capture(() => chainForTrace((callback) {
+ inPeriodicTimer(() => inOneShotTimer(() => inMicrotask(callback)));
+ }, () => throw 'error'));
+
+ expect(chain.traces, hasLength(5));
+ });
+
+ test('called for a stack trace from a long future chain', () async {
+ var chain = await Chain.capture(
+ () => chainForTrace(inFutureChain, () => throw 'error'));
+
+ expect(chain.traces, hasLength(3));
+ });
+
+ test(
+ 'called for an unregistered stack trace returns a chain wrapping that '
+ 'trace', () {
+ late StackTrace trace;
+ var chain = Chain.capture(() {
+ try {
+ throw 'error';
+ } catch (_, stackTrace) {
+ trace = stackTrace;
+ return Chain.forTrace(stackTrace);
+ }
+ });
+
+ expect(chain.traces, hasLength(1));
+ expect(
+ chain.traces.first.toString(), equals(Trace.from(trace).toString()));
+ });
+ });
+
+ test(
+ 'forTrace() outside of capture() returns a chain wrapping the given '
+ 'trace', () {
+ late StackTrace trace;
+ var chain = Chain.capture(() {
+ try {
+ throw 'error';
+ } catch (_, stackTrace) {
+ trace = stackTrace;
+ return Chain.forTrace(stackTrace);
+ }
+ });
+
+ expect(chain.traces, hasLength(1));
+ expect(chain.traces.first.toString(), equals(Trace.from(trace).toString()));
+ });
+}
diff --git a/pkgs/stack_trace/test/chain/utils.dart b/pkgs/stack_trace/test/chain/utils.dart
new file mode 100644
index 0000000..27fb0e6
--- /dev/null
+++ b/pkgs/stack_trace/test/chain/utils.dart
@@ -0,0 +1,94 @@
+// Copyright (c) 2015, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'package:stack_trace/stack_trace.dart';
+import 'package:test/test.dart';
+
+/// Runs [callback] in a microtask callback.
+void inMicrotask(void Function() callback) => scheduleMicrotask(callback);
+
+/// Runs [callback] in a one-shot timer callback.
+void inOneShotTimer(void Function() callback) => Timer.run(callback);
+
+/// Runs [callback] once in a periodic timer callback.
+void inPeriodicTimer(void Function() callback) {
+ var count = 0;
+ Timer.periodic(const Duration(milliseconds: 1), (timer) {
+ count++;
+ if (count != 5) return;
+ timer.cancel();
+ callback();
+ });
+}
+
+/// Runs [callback] within a long asynchronous Future chain.
+void inFutureChain(void Function() callback) {
+ Future(() {})
+ .then((_) => Future(() {}))
+ .then((_) => Future(() {}))
+ .then((_) => Future(() {}))
+ .then((_) => Future(() {}))
+ .then((_) => callback())
+ .then((_) => Future(() {}));
+}
+
+void inNewFuture(void Function() callback) {
+ Future(callback);
+}
+
+void inSyncFuture(void Function() callback) {
+ Future.sync(callback);
+}
+
+/// Returns a Future that completes to an error using a completer.
+///
+/// If [trace] is passed, it's used as the stack trace for the error.
+Future<void> completerErrorFuture([StackTrace? trace]) {
+ var completer = Completer<void>();
+ completer.completeError('error', trace);
+ return completer.future;
+}
+
+/// Returns a Stream that emits an error using a controller.
+///
+/// If [trace] is passed, it's used as the stack trace for the error.
+Stream<void> controllerErrorStream([StackTrace? trace]) {
+ var controller = StreamController<void>();
+ controller.addError('error', trace);
+ return controller.stream;
+}
+
+/// Runs [callback] within [asyncFn], then converts any errors raised into a
+/// [Chain] with [Chain.forTrace].
+Future<Chain> chainForTrace(
+ void Function(void Function()) asyncFn, void Function() callback) {
+ var completer = Completer<Chain>();
+ asyncFn(() {
+ // We use `new Future.value().then(...)` here as opposed to [new Future] or
+ // [new Future.sync] because those methods don't pass the exception through
+ // the zone specification before propagating it, so there's no chance to
+ // attach a chain to its stack trace. See issue 15105.
+ Future<void>.value()
+ .then((_) => callback())
+ .catchError(completer.completeError);
+ });
+
+ return completer.future
+ .catchError((_, StackTrace stackTrace) => Chain.forTrace(stackTrace));
+}
+
+/// Runs [callback] in a [Chain.capture] zone and returns a Future that
+/// completes to the stack chain for an error thrown by [callback].
+///
+/// [callback] is expected to throw the string `"error"`.
+Future<Chain> captureFuture(void Function() callback) {
+ var completer = Completer<Chain>();
+ Chain.capture(callback, onError: (error, chain) {
+ expect(error, equals('error'));
+ completer.complete(chain);
+ });
+ return completer.future;
+}
diff --git a/pkgs/stack_trace/test/chain/vm_test.dart b/pkgs/stack_trace/test/chain/vm_test.dart
new file mode 100644
index 0000000..5c6c0b7
--- /dev/null
+++ b/pkgs/stack_trace/test/chain/vm_test.dart
@@ -0,0 +1,508 @@
+// Copyright (c) 2015, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+// ignore_for_file: only_throw_errors
+
+// VM chain tests can rely on stronger guarantees about the contents of the
+// stack traces than dart2js.
+@TestOn('dart-vm')
+library;
+
+import 'dart:async';
+
+import 'package:stack_trace/src/utils.dart';
+import 'package:stack_trace/stack_trace.dart';
+import 'package:test/test.dart';
+
+import '../utils.dart';
+import 'utils.dart';
+
+void main() {
+ group('capture() with onError catches exceptions', () {
+ test('thrown synchronously', () async {
+ late StackTrace vmTrace;
+ var chain = await captureFuture(() {
+ try {
+ throw 'error';
+ } catch (_, stackTrace) {
+ vmTrace = stackTrace;
+ rethrow;
+ }
+ });
+
+ // Because there's no chain context for a synchronous error, we fall back
+ // on the VM's stack chain tracking.
+ expect(
+ chain.toString(), equals(Chain.parse(vmTrace.toString()).toString()));
+ });
+
+ test('thrown in a microtask', () {
+ return captureFuture(() => inMicrotask(() => throw 'error'))
+ .then((chain) {
+ // Since there was only one asynchronous operation, there should be only
+ // two traces in the chain.
+ expect(chain.traces, hasLength(2));
+
+ // The first frame of the first trace should be the line on which the
+ // actual error was thrown.
+ expect(chain.traces[0].frames.first, frameMember(startsWith('main')));
+
+ // The second trace should describe the stack when the error callback
+ // was scheduled.
+ expect(chain.traces[1].frames,
+ contains(frameMember(startsWith('inMicrotask'))));
+ });
+ });
+
+ test('thrown in a one-shot timer', () {
+ return captureFuture(() => inOneShotTimer(() => throw 'error'))
+ .then((chain) {
+ expect(chain.traces, hasLength(2));
+ expect(chain.traces[0].frames.first, frameMember(startsWith('main')));
+ expect(chain.traces[1].frames,
+ contains(frameMember(startsWith('inOneShotTimer'))));
+ });
+ });
+
+ test('thrown in a periodic timer', () {
+ return captureFuture(() => inPeriodicTimer(() => throw 'error'))
+ .then((chain) {
+ expect(chain.traces, hasLength(2));
+ expect(chain.traces[0].frames.first, frameMember(startsWith('main')));
+ expect(chain.traces[1].frames,
+ contains(frameMember(startsWith('inPeriodicTimer'))));
+ });
+ });
+
+ test('thrown in a nested series of asynchronous operations', () {
+ return captureFuture(() {
+ inPeriodicTimer(() {
+ inOneShotTimer(() => inMicrotask(() => throw 'error'));
+ });
+ }).then((chain) {
+ expect(chain.traces, hasLength(4));
+ expect(chain.traces[0].frames.first, frameMember(startsWith('main')));
+ expect(chain.traces[1].frames,
+ contains(frameMember(startsWith('inMicrotask'))));
+ expect(chain.traces[2].frames,
+ contains(frameMember(startsWith('inOneShotTimer'))));
+ expect(chain.traces[3].frames,
+ contains(frameMember(startsWith('inPeriodicTimer'))));
+ });
+ });
+
+ test('thrown in a long future chain', () {
+ return captureFuture(() => inFutureChain(() => throw 'error'))
+ .then((chain) {
+ // Despite many asynchronous operations, there's only one level of
+ // nested calls, so there should be only two traces in the chain. This
+ // is important; programmers expect stack trace memory consumption to be
+ // O(depth of program), not O(length of program).
+ expect(chain.traces, hasLength(2));
+
+ expect(chain.traces[0].frames.first, frameMember(startsWith('main')));
+ expect(chain.traces[1].frames,
+ contains(frameMember(startsWith('inFutureChain'))));
+ });
+ });
+
+ test('thrown in new Future()', () {
+ return captureFuture(() => inNewFuture(() => throw 'error'))
+ .then((chain) {
+ expect(chain.traces, hasLength(3));
+ expect(chain.traces[0].frames.first, frameMember(startsWith('main')));
+
+ // The second trace is the one captured by
+ // [StackZoneSpecification.errorCallback]. Because that runs
+ // asynchronously within [new Future], it doesn't actually refer to the
+ // source file at all.
+ expect(chain.traces[1].frames,
+ everyElement(frameLibrary(isNot(contains('chain_test')))));
+
+ expect(chain.traces[2].frames,
+ contains(frameMember(startsWith('inNewFuture'))));
+ });
+ });
+
+ test('thrown in new Future.sync()', () {
+ return captureFuture(() {
+ inMicrotask(() => inSyncFuture(() => throw 'error'));
+ }).then((chain) {
+ expect(chain.traces, hasLength(3));
+ expect(chain.traces[0].frames.first, frameMember(startsWith('main')));
+ expect(chain.traces[1].frames,
+ contains(frameMember(startsWith('inSyncFuture'))));
+ expect(chain.traces[2].frames,
+ contains(frameMember(startsWith('inMicrotask'))));
+ });
+ });
+
+ test('multiple times', () {
+ var completer = Completer<void>();
+ var first = true;
+
+ Chain.capture(() {
+ inMicrotask(() => throw 'first error');
+ inPeriodicTimer(() => throw 'second error');
+ }, onError: (error, chain) {
+ try {
+ if (first) {
+ expect(error, equals('first error'));
+ expect(chain.traces[1].frames,
+ contains(frameMember(startsWith('inMicrotask'))));
+ first = false;
+ } else {
+ expect(error, equals('second error'));
+ expect(chain.traces[1].frames,
+ contains(frameMember(startsWith('inPeriodicTimer'))));
+ completer.complete();
+ }
+ } on Object catch (error, stackTrace) {
+ completer.completeError(error, stackTrace);
+ }
+ });
+
+ return completer.future;
+ });
+
+ test('passed to a completer', () {
+ var trace = Trace.current();
+ return captureFuture(() {
+ inMicrotask(() => completerErrorFuture(trace));
+ }).then((chain) {
+ expect(chain.traces, hasLength(3));
+
+ // The first trace is the trace that was manually reported for the
+ // error.
+ expect(chain.traces.first.toString(), equals(trace.toString()));
+
+ // The second trace is the trace that was captured when
+ // [Completer.addError] was called.
+ expect(chain.traces[1].frames,
+ contains(frameMember(startsWith('completerErrorFuture'))));
+
+ // The third trace is the automatically-captured trace from when the
+ // microtask was scheduled.
+ expect(chain.traces[2].frames,
+ contains(frameMember(startsWith('inMicrotask'))));
+ });
+ });
+
+ test('passed to a completer with no stack trace', () {
+ return captureFuture(() {
+ inMicrotask(completerErrorFuture);
+ }).then((chain) {
+ expect(chain.traces, hasLength(2));
+
+ // The first trace is the one captured when [Completer.addError] was
+ // called.
+ expect(chain.traces[0].frames,
+ contains(frameMember(startsWith('completerErrorFuture'))));
+
+ // The second trace is the automatically-captured trace from when the
+ // microtask was scheduled.
+ expect(chain.traces[1].frames,
+ contains(frameMember(startsWith('inMicrotask'))));
+ });
+ });
+
+ test('passed to a stream controller', () {
+ var trace = Trace.current();
+ return captureFuture(() {
+ inMicrotask(() => controllerErrorStream(trace).listen(null));
+ }).then((chain) {
+ expect(chain.traces, hasLength(3));
+ expect(chain.traces.first.toString(), equals(trace.toString()));
+ expect(chain.traces[1].frames,
+ contains(frameMember(startsWith('controllerErrorStream'))));
+ expect(chain.traces[2].frames,
+ contains(frameMember(startsWith('inMicrotask'))));
+ });
+ });
+
+ test('passed to a stream controller with no stack trace', () {
+ return captureFuture(() {
+ inMicrotask(() => controllerErrorStream().listen(null));
+ }).then((chain) {
+ expect(chain.traces, hasLength(2));
+ expect(chain.traces[0].frames,
+ contains(frameMember(startsWith('controllerErrorStream'))));
+ expect(chain.traces[1].frames,
+ contains(frameMember(startsWith('inMicrotask'))));
+ });
+ });
+
+ test('and relays them to the parent zone', () {
+ var completer = Completer<void>();
+
+ runZonedGuarded(() {
+ Chain.capture(() {
+ inMicrotask(() => throw 'error');
+ }, onError: (error, chain) {
+ expect(error, equals('error'));
+ expect(chain.traces[1].frames,
+ contains(frameMember(startsWith('inMicrotask'))));
+ throw error;
+ });
+ }, (error, chain) {
+ try {
+ expect(error, equals('error'));
+ expect(
+ chain,
+ isA<Chain>().having((c) => c.traces[1].frames, 'traces[1].frames',
+ contains(frameMember(startsWith('inMicrotask')))));
+ completer.complete();
+ } on Object catch (error, stackTrace) {
+ completer.completeError(error, stackTrace);
+ }
+ });
+
+ return completer.future;
+ });
+ });
+
+ test('capture() without onError passes exceptions to parent zone', () {
+ var completer = Completer<void>();
+
+ runZonedGuarded(() {
+ Chain.capture(() => inMicrotask(() => throw 'error'));
+ }, (error, chain) {
+ try {
+ expect(error, equals('error'));
+ expect(
+ chain,
+ isA<Chain>().having((c) => c.traces[1].frames, 'traces[1].frames',
+ contains(frameMember(startsWith('inMicrotask')))));
+ completer.complete();
+ } on Object catch (error, stackTrace) {
+ completer.completeError(error, stackTrace);
+ }
+ });
+
+ return completer.future;
+ });
+
+ group('current() within capture()', () {
+ test('called in a microtask', () {
+ var completer = Completer<Chain>();
+ Chain.capture(() {
+ inMicrotask(() => completer.complete(Chain.current()));
+ });
+
+ return completer.future.then((chain) {
+ expect(chain.traces, hasLength(2));
+ expect(chain.traces[0].frames.first, frameMember(startsWith('main')));
+ expect(chain.traces[1].frames,
+ contains(frameMember(startsWith('inMicrotask'))));
+ });
+ });
+
+ test('called in a one-shot timer', () {
+ var completer = Completer<Chain>();
+ Chain.capture(() {
+ inOneShotTimer(() => completer.complete(Chain.current()));
+ });
+
+ return completer.future.then((chain) {
+ expect(chain.traces, hasLength(2));
+ expect(chain.traces[0].frames.first, frameMember(startsWith('main')));
+ expect(chain.traces[1].frames,
+ contains(frameMember(startsWith('inOneShotTimer'))));
+ });
+ });
+
+ test('called in a periodic timer', () {
+ var completer = Completer<Chain>();
+ Chain.capture(() {
+ inPeriodicTimer(() => completer.complete(Chain.current()));
+ });
+
+ return completer.future.then((chain) {
+ expect(chain.traces, hasLength(2));
+ expect(chain.traces[0].frames.first, frameMember(startsWith('main')));
+ expect(chain.traces[1].frames,
+ contains(frameMember(startsWith('inPeriodicTimer'))));
+ });
+ });
+
+ test('called in a nested series of asynchronous operations', () {
+ var completer = Completer<Chain>();
+ Chain.capture(() {
+ inPeriodicTimer(() {
+ inOneShotTimer(() {
+ inMicrotask(() => completer.complete(Chain.current()));
+ });
+ });
+ });
+
+ return completer.future.then((chain) {
+ expect(chain.traces, hasLength(4));
+ expect(chain.traces[0].frames.first, frameMember(startsWith('main')));
+ expect(chain.traces[1].frames,
+ contains(frameMember(startsWith('inMicrotask'))));
+ expect(chain.traces[2].frames,
+ contains(frameMember(startsWith('inOneShotTimer'))));
+ expect(chain.traces[3].frames,
+ contains(frameMember(startsWith('inPeriodicTimer'))));
+ });
+ });
+
+ test('called in a long future chain', () {
+ var completer = Completer<Chain>();
+ Chain.capture(() {
+ inFutureChain(() => completer.complete(Chain.current()));
+ });
+
+ return completer.future.then((chain) {
+ expect(chain.traces, hasLength(2));
+ expect(chain.traces[0].frames.first, frameMember(startsWith('main')));
+ expect(chain.traces[1].frames,
+ contains(frameMember(startsWith('inFutureChain'))));
+ });
+ });
+ });
+
+ test(
+ 'current() outside of capture() returns a chain wrapping the current '
+ 'trace', () {
+ // The test runner runs all tests with chains enabled.
+ return Chain.disable(() {
+ var completer = Completer<Chain>();
+ inMicrotask(() => completer.complete(Chain.current()));
+
+ return completer.future.then((chain) {
+ // Since the chain wasn't loaded within [Chain.capture], the full stack
+ // chain isn't available and it just returns the current stack when
+ // called.
+ expect(chain.traces, hasLength(1));
+ expect(
+ chain.traces.first.frames.first, frameMember(startsWith('main')));
+ });
+ });
+ });
+
+ group('forTrace() within capture()', () {
+ test('called for a stack trace from a microtask', () {
+ return Chain.capture(() {
+ return chainForTrace(inMicrotask, () => throw 'error');
+ }).then((chain) {
+ // Because [chainForTrace] has to set up a future chain to capture the
+ // stack trace while still showing it to the zone specification, it adds
+ // an additional level of async nesting and so an additional trace.
+ expect(chain.traces, hasLength(3));
+ expect(chain.traces[0].frames.first, frameMember(startsWith('main')));
+ expect(chain.traces[1].frames,
+ contains(frameMember(startsWith('chainForTrace'))));
+ expect(chain.traces[2].frames,
+ contains(frameMember(startsWith('inMicrotask'))));
+ });
+ });
+
+ test('called for a stack trace from a one-shot timer', () {
+ return Chain.capture(() {
+ return chainForTrace(inOneShotTimer, () => throw 'error');
+ }).then((chain) {
+ expect(chain.traces, hasLength(3));
+ expect(chain.traces[0].frames.first, frameMember(startsWith('main')));
+ expect(chain.traces[1].frames,
+ contains(frameMember(startsWith('chainForTrace'))));
+ expect(chain.traces[2].frames,
+ contains(frameMember(startsWith('inOneShotTimer'))));
+ });
+ });
+
+ test('called for a stack trace from a periodic timer', () {
+ return Chain.capture(() {
+ return chainForTrace(inPeriodicTimer, () => throw 'error');
+ }).then((chain) {
+ expect(chain.traces, hasLength(3));
+ expect(chain.traces[0].frames.first, frameMember(startsWith('main')));
+ expect(chain.traces[1].frames,
+ contains(frameMember(startsWith('chainForTrace'))));
+ expect(chain.traces[2].frames,
+ contains(frameMember(startsWith('inPeriodicTimer'))));
+ });
+ });
+
+ test(
+ 'called for a stack trace from a nested series of asynchronous '
+ 'operations', () {
+ return Chain.capture(() {
+ return chainForTrace((callback) {
+ inPeriodicTimer(() => inOneShotTimer(() => inMicrotask(callback)));
+ }, () => throw 'error');
+ }).then((chain) {
+ expect(chain.traces, hasLength(5));
+ expect(chain.traces[0].frames.first, frameMember(startsWith('main')));
+ expect(chain.traces[1].frames,
+ contains(frameMember(startsWith('chainForTrace'))));
+ expect(chain.traces[2].frames,
+ contains(frameMember(startsWith('inMicrotask'))));
+ expect(chain.traces[3].frames,
+ contains(frameMember(startsWith('inOneShotTimer'))));
+ expect(chain.traces[4].frames,
+ contains(frameMember(startsWith('inPeriodicTimer'))));
+ });
+ });
+
+ test('called for a stack trace from a long future chain', () {
+ return Chain.capture(() {
+ return chainForTrace(inFutureChain, () => throw 'error');
+ }).then((chain) {
+ expect(chain.traces, hasLength(3));
+ expect(chain.traces[0].frames.first, frameMember(startsWith('main')));
+ expect(chain.traces[1].frames,
+ contains(frameMember(startsWith('chainForTrace'))));
+ expect(chain.traces[2].frames,
+ contains(frameMember(startsWith('inFutureChain'))));
+ });
+ });
+
+ test('called for an unregistered stack trace uses the current chain',
+ () async {
+ late StackTrace trace;
+ var chain = await Chain.capture(() async {
+ try {
+ throw 'error';
+ } catch (_, stackTrace) {
+ trace = stackTrace;
+ return Chain.forTrace(stackTrace);
+ }
+ });
+
+ expect(chain.traces, hasLength(greaterThan(1)));
+
+ // Assert that we've trimmed the VM's stack chains here to avoid
+ // duplication.
+ expect(chain.traces.first.toString(),
+ equals(Chain.parse(trace.toString()).traces.first.toString()));
+ });
+ });
+
+ test(
+ 'forTrace() outside of capture() returns a chain describing the VM stack '
+ 'chain', () {
+ // Disable the test package's chain-tracking.
+ return Chain.disable(() async {
+ late StackTrace trace;
+ await Chain.capture(() async {
+ try {
+ throw 'error';
+ } catch (_, stackTrace) {
+ trace = stackTrace;
+ }
+ });
+
+ final chain = Chain.forTrace(trace);
+ final traceStr = trace.toString();
+ final gaps = vmChainGap.allMatches(traceStr);
+ // If the trace ends on a gap, there's no sub-trace following the gap.
+ final expectedLength =
+ (gaps.last.end == traceStr.length) ? gaps.length : gaps.length + 1;
+ expect(chain.traces, hasLength(expectedLength));
+ expect(
+ chain.traces.first.frames, contains(frameMember(startsWith('main'))));
+ });
+ });
+}
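
Per its test name, `forTrace()` outside of `capture()` falls back to describing the VM's own stack chain; the splitting it relies on is what `Chain.parse` does with `<asynchronous suspension>` lines (the `vmChainGap` pattern from `utils.dart`). A standalone sketch mirroring the `parses a chain with VM gaps` case in `chain_test.dart`:

```dart
import 'package:stack_trace/stack_trace.dart';

void main() {
  const vmStyle = '#0      foo (file:///app.dart:10:11)\n'
      '<asynchronous suspension>\n'
      '#1      main (file:///app.dart:20:3)\n'
      '<asynchronous suspension>\n';
  final chain = Chain.parse(vmStyle);
  print(chain.traces.length); // 2: one sub-trace per gap-separated segment
}
```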
diff --git a/pkgs/stack_trace/test/frame_test.dart b/pkgs/stack_trace/test/frame_test.dart
new file mode 100644
index 0000000..a5dfc20
--- /dev/null
+++ b/pkgs/stack_trace/test/frame_test.dart
@@ -0,0 +1,729 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:path/path.dart' as path;
+import 'package:stack_trace/stack_trace.dart';
+import 'package:test/test.dart';
+
+void main() {
+ group('.parseVM', () {
+ test('parses a stack frame with column correctly', () {
+ var frame = Frame.parseVM('#1 Foo._bar '
+ '(file:///home/nweiz/code/stuff.dart:42:21)');
+ expect(
+ frame.uri, equals(Uri.parse('file:///home/nweiz/code/stuff.dart')));
+ expect(frame.line, equals(42));
+ expect(frame.column, equals(21));
+ expect(frame.member, equals('Foo._bar'));
+ });
+
+ test('parses a stack frame without column correctly', () {
+ var frame = Frame.parseVM('#1 Foo._bar '
+ '(file:///home/nweiz/code/stuff.dart:24)');
+ expect(
+ frame.uri, equals(Uri.parse('file:///home/nweiz/code/stuff.dart')));
+ expect(frame.line, equals(24));
+ expect(frame.column, null);
+ expect(frame.member, equals('Foo._bar'));
+ });
+
+ // This can happen with async stack traces. See issue 22009.
+ test('parses a stack frame without line or column correctly', () {
+ var frame = Frame.parseVM('#1 Foo._bar '
+ '(file:///home/nweiz/code/stuff.dart)');
+ expect(
+ frame.uri, equals(Uri.parse('file:///home/nweiz/code/stuff.dart')));
+ expect(frame.line, isNull);
+ expect(frame.column, isNull);
+ expect(frame.member, equals('Foo._bar'));
+ });
+
+ test('converts "<anonymous closure>" to "<fn>"', () {
+ String? parsedMember(String member) =>
+ Frame.parseVM('#0 $member (foo:0:0)').member;
+
+ expect(parsedMember('Foo.<anonymous closure>'), equals('Foo.<fn>'));
+ expect(parsedMember('<anonymous closure>.<anonymous closure>.bar'),
+ equals('<fn>.<fn>.bar'));
+ });
+
+ test('converts "<<anonymous closure>_async_body>" to "<async>"', () {
+ var frame =
+ Frame.parseVM('#0 Foo.<<anonymous closure>_async_body> (foo:0:0)');
+ expect(frame.member, equals('Foo.<async>'));
+ });
+
+ test('converts "<function_name_async_body>" to "<async>"', () {
+ var frame = Frame.parseVM('#0 Foo.<function_name_async_body> (foo:0:0)');
+ expect(frame.member, equals('Foo.<async>'));
+ });
+
+ test('parses a folded frame correctly', () {
+ var frame = Frame.parseVM('...');
+
+ expect(frame.member, equals('...'));
+ expect(frame.uri, equals(Uri()));
+ expect(frame.line, isNull);
+ expect(frame.column, isNull);
+ });
+ });
+
+ group('.parseV8', () {
+ test('returns an UnparsedFrame for malformed frames', () {
+ expectIsUnparsed(Frame.parseV8, '');
+ expectIsUnparsed(Frame.parseV8, '#1');
+ expectIsUnparsed(Frame.parseV8, '#1 Foo');
+ expectIsUnparsed(Frame.parseV8, '#1 (dart:async/future.dart:10:15)');
+ expectIsUnparsed(Frame.parseV8, 'Foo (dart:async/future.dart:10:15)');
+ });
+
+ test('parses a stack frame correctly', () {
+ var frame = Frame.parseV8(' at VW.call\$0 '
+ '(https://example.com/stuff.dart.js:560:28)');
+ expect(frame.uri, equals(Uri.parse('https://example.com/stuff.dart.js')));
+ expect(frame.line, equals(560));
+ expect(frame.column, equals(28));
+ expect(frame.member, equals('VW.call\$0'));
+ });
+
+ test('parses a stack frame with a : in the authority', () {
+ var frame = Frame.parseV8(' at VW.call\$0 '
+ '(http://localhost:8080/stuff.dart.js:560:28)');
+ expect(
+ frame.uri, equals(Uri.parse('http://localhost:8080/stuff.dart.js')));
+ expect(frame.line, equals(560));
+ expect(frame.column, equals(28));
+ expect(frame.member, equals('VW.call\$0'));
+ });
+
+ test('parses a stack frame with an absolute POSIX path correctly', () {
+ var frame = Frame.parseV8(' at VW.call\$0 '
+ '(/path/to/stuff.dart.js:560:28)');
+ expect(frame.uri, equals(Uri.parse('file:///path/to/stuff.dart.js')));
+ expect(frame.line, equals(560));
+ expect(frame.column, equals(28));
+ expect(frame.member, equals('VW.call\$0'));
+ });
+
+ test('parses a stack frame with an absolute Windows path correctly', () {
+ var frame = Frame.parseV8(' at VW.call\$0 '
+ r'(C:\path\to\stuff.dart.js:560:28)');
+ expect(frame.uri, equals(Uri.parse('file:///C:/path/to/stuff.dart.js')));
+ expect(frame.line, equals(560));
+ expect(frame.column, equals(28));
+ expect(frame.member, equals('VW.call\$0'));
+ });
+
+ test('parses a stack frame with a Windows UNC path correctly', () {
+ var frame = Frame.parseV8(' at VW.call\$0 '
+ r'(\\mount\path\to\stuff.dart.js:560:28)');
+ expect(
+ frame.uri, equals(Uri.parse('file://mount/path/to/stuff.dart.js')));
+ expect(frame.line, equals(560));
+ expect(frame.column, equals(28));
+ expect(frame.member, equals('VW.call\$0'));
+ });
+
+ test('parses a stack frame with a relative POSIX path correctly', () {
+ var frame = Frame.parseV8(' at VW.call\$0 '
+ '(path/to/stuff.dart.js:560:28)');
+ expect(frame.uri, equals(Uri.parse('path/to/stuff.dart.js')));
+ expect(frame.line, equals(560));
+ expect(frame.column, equals(28));
+ expect(frame.member, equals('VW.call\$0'));
+ });
+
+ test('parses a stack frame with a relative Windows path correctly', () {
+ var frame = Frame.parseV8(' at VW.call\$0 '
+ r'(path\to\stuff.dart.js:560:28)');
+ expect(frame.uri, equals(Uri.parse('path/to/stuff.dart.js')));
+ expect(frame.line, equals(560));
+ expect(frame.column, equals(28));
+ expect(frame.member, equals('VW.call\$0'));
+ });
+
+ test('parses an anonymous stack frame correctly', () {
+ var frame =
+ Frame.parseV8(' at https://example.com/stuff.dart.js:560:28');
+ expect(frame.uri, equals(Uri.parse('https://example.com/stuff.dart.js')));
+ expect(frame.line, equals(560));
+ expect(frame.column, equals(28));
+ expect(frame.member, equals('<fn>'));
+ });
+
+ test('parses a native stack frame correctly', () {
+ var frame = Frame.parseV8(' at Object.stringify (native)');
+ expect(frame.uri, Uri.parse('native'));
+ expect(frame.line, isNull);
+ expect(frame.column, isNull);
+ expect(frame.member, equals('Object.stringify'));
+ });
+
+ test('parses a stack frame with [as ...] correctly', () {
+ // Ignore "[as ...]", since other stack trace formats don't support a
+ // similar construct.
+ var frame = Frame.parseV8(' at VW.call\$0 [as call\$4] '
+ '(https://example.com/stuff.dart.js:560:28)');
+ expect(frame.uri, equals(Uri.parse('https://example.com/stuff.dart.js')));
+ expect(frame.line, equals(560));
+ expect(frame.column, equals(28));
+ expect(frame.member, equals('VW.call\$0'));
+ });
+
+ test('parses a basic eval stack frame correctly', () {
+ var frame = Frame.parseV8(' at eval (eval at <anonymous> '
+ '(https://example.com/stuff.dart.js:560:28))');
+ expect(frame.uri, equals(Uri.parse('https://example.com/stuff.dart.js')));
+ expect(frame.line, equals(560));
+ expect(frame.column, equals(28));
+ expect(frame.member, equals('eval'));
+ });
+
+ test('parses an IE10 eval stack frame correctly', () {
+ var frame = Frame.parseV8(' at eval (eval at Anonymous function '
+ '(https://example.com/stuff.dart.js:560:28))');
+ expect(frame.uri, equals(Uri.parse('https://example.com/stuff.dart.js')));
+ expect(frame.line, equals(560));
+ expect(frame.column, equals(28));
+ expect(frame.member, equals('eval'));
+ });
+
+ test('parses an eval stack frame with inner position info correctly', () {
+ var frame = Frame.parseV8(' at eval (eval at <anonymous> '
+ '(https://example.com/stuff.dart.js:560:28), <anonymous>:3:28)');
+ expect(frame.uri, equals(Uri.parse('https://example.com/stuff.dart.js')));
+ expect(frame.line, equals(560));
+ expect(frame.column, equals(28));
+ expect(frame.member, equals('eval'));
+ });
+
+ test('parses a nested eval stack frame correctly', () {
+ var frame = Frame.parseV8(' at eval (eval at <anonymous> '
+ '(eval at sub (https://example.com/stuff.dart.js:560:28)))');
+ expect(frame.uri, equals(Uri.parse('https://example.com/stuff.dart.js')));
+ expect(frame.line, equals(560));
+ expect(frame.column, equals(28));
+ expect(frame.member, equals('eval'));
+ });
+
+ test('converts "<anonymous>" to "<fn>"', () {
+ String? parsedMember(String member) =>
+ Frame.parseV8(' at $member (foo:0:0)').member;
+
+ expect(parsedMember('Foo.<anonymous>'), equals('Foo.<fn>'));
+ expect(
+ parsedMember('<anonymous>.<anonymous>.bar'), equals('<fn>.<fn>.bar'));
+ });
+
+ test('returns an UnparsedFrame for malformed frames', () {
+ expectIsUnparsed(Frame.parseV8, '');
+ expectIsUnparsed(Frame.parseV8, ' at');
+ expectIsUnparsed(Frame.parseV8, ' at Foo');
+ expectIsUnparsed(Frame.parseV8, ' at Foo (dart:async/future.dart)');
+ expectIsUnparsed(Frame.parseV8, ' at (dart:async/future.dart:10:15)');
+ expectIsUnparsed(Frame.parseV8, 'Foo (dart:async/future.dart:10:15)');
+ expectIsUnparsed(Frame.parseV8, ' at dart:async/future.dart');
+ expectIsUnparsed(Frame.parseV8, 'dart:async/future.dart:10:15');
+ });
+ });
+
+ group('.parseFirefox/.parseSafari', () {
+ test('parses a Firefox stack trace with anonymous function', () {
+ var trace = Trace.parse('''
+Foo._bar@https://example.com/stuff.js:18056:12
+anonymous/<@https://example.com/stuff.js line 693 > Function:3:40
+baz@https://pub.dev/buz.js:56355:55
+ ''');
+ expect(trace.frames[0].uri,
+ equals(Uri.parse('https://example.com/stuff.js')));
+ expect(trace.frames[0].line, equals(18056));
+ expect(trace.frames[0].column, equals(12));
+ expect(trace.frames[0].member, equals('Foo._bar'));
+ expect(trace.frames[1].uri,
+ equals(Uri.parse('https://example.com/stuff.js')));
+ expect(trace.frames[1].line, equals(693));
+ expect(trace.frames[1].column, isNull);
+ expect(trace.frames[1].member, equals('<fn>'));
+ expect(trace.frames[2].uri, equals(Uri.parse('https://pub.dev/buz.js')));
+ expect(trace.frames[2].line, equals(56355));
+ expect(trace.frames[2].column, equals(55));
+ expect(trace.frames[2].member, equals('baz'));
+ });
+
+ test('parses a Firefox stack trace with nested evals in anonymous function',
+ () {
+ var trace = Trace.parse('''
+ Foo._bar@https://example.com/stuff.js:18056:12
+ anonymous@file:///C:/example.html line 7 > eval line 1 > eval:1:1
+ anonymous@file:///C:/example.html line 45 > Function:1:1
+ ''');
+ expect(trace.frames[0].uri,
+ equals(Uri.parse('https://example.com/stuff.js')));
+ expect(trace.frames[0].line, equals(18056));
+ expect(trace.frames[0].column, equals(12));
+ expect(trace.frames[0].member, equals('Foo._bar'));
+ expect(trace.frames[1].uri, equals(Uri.parse('file:///C:/example.html')));
+ expect(trace.frames[1].line, equals(7));
+ expect(trace.frames[1].column, isNull);
+ expect(trace.frames[1].member, equals('<fn>'));
+ expect(trace.frames[2].uri, equals(Uri.parse('file:///C:/example.html')));
+ expect(trace.frames[2].line, equals(45));
+ expect(trace.frames[2].column, isNull);
+ expect(trace.frames[2].member, equals('<fn>'));
+ });
+
+ test('parses a simple stack frame correctly', () {
+ var frame = Frame.parseFirefox(
+ '.VW.call\$0@https://example.com/stuff.dart.js:560');
+ expect(frame.uri, equals(Uri.parse('https://example.com/stuff.dart.js')));
+ expect(frame.line, equals(560));
+ expect(frame.column, isNull);
+ expect(frame.member, equals('VW.call\$0'));
+ });
+
+ test('parses a stack frame with an absolute POSIX path correctly', () {
+ var frame = Frame.parseFirefox('.VW.call\$0@/path/to/stuff.dart.js:560');
+ expect(frame.uri, equals(Uri.parse('file:///path/to/stuff.dart.js')));
+ expect(frame.line, equals(560));
+ expect(frame.column, isNull);
+ expect(frame.member, equals('VW.call\$0'));
+ });
+
+ test('parses a stack frame with an absolute Windows path correctly', () {
+ var frame =
+ Frame.parseFirefox(r'.VW.call$0@C:\path\to\stuff.dart.js:560');
+ expect(frame.uri, equals(Uri.parse('file:///C:/path/to/stuff.dart.js')));
+ expect(frame.line, equals(560));
+ expect(frame.column, isNull);
+ expect(frame.member, equals('VW.call\$0'));
+ });
+
+ test('parses a stack frame with a Windows UNC path correctly', () {
+ var frame =
+ Frame.parseFirefox(r'.VW.call$0@\\mount\path\to\stuff.dart.js:560');
+ expect(
+ frame.uri, equals(Uri.parse('file://mount/path/to/stuff.dart.js')));
+ expect(frame.line, equals(560));
+ expect(frame.column, isNull);
+ expect(frame.member, equals('VW.call\$0'));
+ });
+
+ test('parses a stack frame with a relative POSIX path correctly', () {
+ var frame = Frame.parseFirefox('.VW.call\$0@path/to/stuff.dart.js:560');
+ expect(frame.uri, equals(Uri.parse('path/to/stuff.dart.js')));
+ expect(frame.line, equals(560));
+ expect(frame.column, isNull);
+ expect(frame.member, equals('VW.call\$0'));
+ });
+
+ test('parses a stack frame with a relative Windows path correctly', () {
+ var frame = Frame.parseFirefox(r'.VW.call$0@path\to\stuff.dart.js:560');
+ expect(frame.uri, equals(Uri.parse('path/to/stuff.dart.js')));
+ expect(frame.line, equals(560));
+ expect(frame.column, isNull);
+ expect(frame.member, equals('VW.call\$0'));
+ });
+
+ test('parses a simple anonymous stack frame correctly', () {
+ var frame = Frame.parseFirefox('@https://example.com/stuff.dart.js:560');
+ expect(frame.uri, equals(Uri.parse('https://example.com/stuff.dart.js')));
+ expect(frame.line, equals(560));
+ expect(frame.column, isNull);
+ expect(frame.member, equals('<fn>'));
+ });
+
+ test('parses a nested anonymous stack frame correctly', () {
+ var frame =
+ Frame.parseFirefox('.foo/<@https://example.com/stuff.dart.js:560');
+ expect(frame.uri, equals(Uri.parse('https://example.com/stuff.dart.js')));
+ expect(frame.line, equals(560));
+ expect(frame.column, isNull);
+ expect(frame.member, equals('foo.<fn>'));
+
+ frame = Frame.parseFirefox('.foo/@https://example.com/stuff.dart.js:560');
+ expect(frame.uri, equals(Uri.parse('https://example.com/stuff.dart.js')));
+ expect(frame.line, equals(560));
+ expect(frame.column, isNull);
+ expect(frame.member, equals('foo.<fn>'));
+ });
+
+ test('parses a named nested anonymous stack frame correctly', () {
+ var frame = Frame.parseFirefox(
+ '.foo/.name<@https://example.com/stuff.dart.js:560');
+ expect(frame.uri, equals(Uri.parse('https://example.com/stuff.dart.js')));
+ expect(frame.line, equals(560));
+ expect(frame.column, isNull);
+ expect(frame.member, equals('foo.<fn>'));
+
+ frame = Frame.parseFirefox(
+ '.foo/.name@https://example.com/stuff.dart.js:560');
+ expect(frame.uri, equals(Uri.parse('https://example.com/stuff.dart.js')));
+ expect(frame.line, equals(560));
+ expect(frame.column, isNull);
+ expect(frame.member, equals('foo.<fn>'));
+ });
+
+ test('parses a stack frame with parameters correctly', () {
+ var frame = Frame.parseFirefox(
+ '.foo(12, "@)()/<")@https://example.com/stuff.dart.js:560');
+ expect(frame.uri, equals(Uri.parse('https://example.com/stuff.dart.js')));
+ expect(frame.line, equals(560));
+ expect(frame.column, isNull);
+ expect(frame.member, equals('foo'));
+ });
+
+ test('parses a nested anonymous stack frame with parameters correctly', () {
+ var frame = Frame.parseFirefox(
+ '.foo(12, "@)()/<")/.fn<@https://example.com/stuff.dart.js:560',
+ );
+ expect(frame.uri, equals(Uri.parse('https://example.com/stuff.dart.js')));
+ expect(frame.line, equals(560));
+ expect(frame.column, isNull);
+ expect(frame.member, equals('foo.<fn>'));
+ });
+
+ test('parses a deeply-nested anonymous stack frame correctly', () {
+ var frame = Frame.parseFirefox('.convertDartClosureToJS/\$function</<@'
+ 'https://example.com/stuff.dart.js:560');
+ expect(frame.uri, equals(Uri.parse('https://example.com/stuff.dart.js')));
+ expect(frame.line, equals(560));
+ expect(frame.column, isNull);
+ expect(frame.member, equals('convertDartClosureToJS.<fn>.<fn>'));
+ });
+
+ test('returns an UnparsedFrame for malformed frames', () {
+ expectIsUnparsed(Frame.parseFirefox, '');
+ expectIsUnparsed(Frame.parseFirefox, '.foo');
+ expectIsUnparsed(Frame.parseFirefox, '.foo@dart:async/future.dart');
+ expectIsUnparsed(Frame.parseFirefox, '.foo(@dart:async/future.dart:10');
+ expectIsUnparsed(Frame.parseFirefox, '@dart:async/future.dart');
+ });
+
+ test('parses a simple stack frame correctly', () {
+ var frame =
+ Frame.parseFirefox('foo\$bar@https://dart.dev/foo/bar.dart:10:11');
+ expect(frame.uri, equals(Uri.parse('https://dart.dev/foo/bar.dart')));
+ expect(frame.line, equals(10));
+ expect(frame.column, equals(11));
+ expect(frame.member, equals('foo\$bar'));
+ });
+
+ test('parses an anonymous stack frame correctly', () {
+ var frame = Frame.parseFirefox('https://dart.dev/foo/bar.dart:10:11');
+ expect(frame.uri, equals(Uri.parse('https://dart.dev/foo/bar.dart')));
+ expect(frame.line, equals(10));
+ expect(frame.column, equals(11));
+ expect(frame.member, equals('<fn>'));
+ });
+
+ test('parses a stack frame with no line correctly', () {
+ var frame =
+ Frame.parseFirefox('foo\$bar@https://dart.dev/foo/bar.dart::11');
+ expect(frame.uri, equals(Uri.parse('https://dart.dev/foo/bar.dart')));
+ expect(frame.line, isNull);
+ expect(frame.column, equals(11));
+ expect(frame.member, equals('foo\$bar'));
+ });
+
+ test('parses a stack frame with no column correctly', () {
+ var frame =
+ Frame.parseFirefox('foo\$bar@https://dart.dev/foo/bar.dart:10:');
+ expect(frame.uri, equals(Uri.parse('https://dart.dev/foo/bar.dart')));
+ expect(frame.line, equals(10));
+ expect(frame.column, isNull);
+ expect(frame.member, equals('foo\$bar'));
+ });
+
+ test('parses a stack frame with both a line and column correctly', () {
+ var frame =
+ Frame.parseFirefox('foo\$bar@https://dart.dev/foo/bar.dart:10:11');
+ expect(frame.uri, equals(Uri.parse('https://dart.dev/foo/bar.dart')));
+ expect(frame.line, equals(10));
+ expect(frame.column, equals(11));
+ expect(frame.member, equals('foo\$bar'));
+ });
+ });
+
+ group('.parseFriendly', () {
+ test('parses a simple stack frame correctly', () {
+ var frame = Frame.parseFriendly(
+ 'https://dart.dev/foo/bar.dart 10:11 Foo.<fn>.bar');
+ expect(frame.uri, equals(Uri.parse('https://dart.dev/foo/bar.dart')));
+ expect(frame.line, equals(10));
+ expect(frame.column, equals(11));
+ expect(frame.member, equals('Foo.<fn>.bar'));
+ });
+
+ test('parses a stack frame with no line or column correctly', () {
+ var frame =
+ Frame.parseFriendly('https://dart.dev/foo/bar.dart Foo.<fn>.bar');
+ expect(frame.uri, equals(Uri.parse('https://dart.dev/foo/bar.dart')));
+ expect(frame.line, isNull);
+ expect(frame.column, isNull);
+ expect(frame.member, equals('Foo.<fn>.bar'));
+ });
+
+ test('parses a stack frame with no column correctly', () {
+ var frame =
+ Frame.parseFriendly('https://dart.dev/foo/bar.dart 10 Foo.<fn>.bar');
+ expect(frame.uri, equals(Uri.parse('https://dart.dev/foo/bar.dart')));
+ expect(frame.line, equals(10));
+ expect(frame.column, isNull);
+ expect(frame.member, equals('Foo.<fn>.bar'));
+ });
+
+ test('parses a stack frame with a relative path correctly', () {
+ var frame = Frame.parseFriendly('foo/bar.dart 10:11 Foo.<fn>.bar');
+ expect(frame.uri,
+ equals(path.toUri(path.absolute(path.join('foo', 'bar.dart')))));
+ expect(frame.line, equals(10));
+ expect(frame.column, equals(11));
+ expect(frame.member, equals('Foo.<fn>.bar'));
+ });
+
+ test('returns an UnparsedFrame for malformed frames', () {
+ expectIsUnparsed(Frame.parseFriendly, '');
+ expectIsUnparsed(Frame.parseFriendly, 'foo/bar.dart');
+ expectIsUnparsed(Frame.parseFriendly, 'foo/bar.dart 10:11');
+ });
+
+ test('parses a data url stack frame with no line or column correctly', () {
+ var frame = Frame.parseFriendly('data:... main');
+ expect(frame.uri.scheme, equals('data'));
+ expect(frame.line, isNull);
+ expect(frame.column, isNull);
+ expect(frame.member, equals('main'));
+ });
+
+ test('parses a data url stack frame correctly', () {
+ var frame = Frame.parseFriendly('data:... 10:11 main');
+ expect(frame.uri.scheme, equals('data'));
+ expect(frame.line, equals(10));
+ expect(frame.column, equals(11));
+ expect(frame.member, equals('main'));
+ });
+
+ test('parses a stack frame with spaces in the member name correctly', () {
+ var frame = Frame.parseFriendly(
+ 'foo/bar.dart 10:11 (anonymous function).dart.fn');
+ expect(frame.uri,
+ equals(path.toUri(path.absolute(path.join('foo', 'bar.dart')))));
+ expect(frame.line, equals(10));
+ expect(frame.column, equals(11));
+ expect(frame.member, equals('(anonymous function).dart.fn'));
+ });
+
+ test(
+ 'parses a stack frame with spaces in the member name and no line or '
+ 'column correctly', () {
+ var frame = Frame.parseFriendly(
+ 'https://dart.dev/foo/bar.dart (anonymous function).dart.fn');
+ expect(frame.uri, equals(Uri.parse('https://dart.dev/foo/bar.dart')));
+ expect(frame.line, isNull);
+ expect(frame.column, isNull);
+ expect(frame.member, equals('(anonymous function).dart.fn'));
+ });
+ });
+
+ test('only considers dart URIs to be core', () {
+ bool isCore(String library) =>
+ Frame.parseVM('#0 Foo ($library:0:0)').isCore;
+
+ expect(isCore('dart:core'), isTrue);
+ expect(isCore('dart:async'), isTrue);
+ expect(isCore('dart:core/uri.dart'), isTrue);
+ expect(isCore('dart:async/future.dart'), isTrue);
+ expect(isCore('bart:core'), isFalse);
+ expect(isCore('sdart:core'), isFalse);
+ expect(isCore('darty:core'), isFalse);
+ expect(isCore('bart:core/uri.dart'), isFalse);
+ });
+
+ group('.library', () {
+ test('returns the URI string for non-file URIs', () {
+ expect(Frame.parseVM('#0 Foo (dart:async/future.dart:0:0)').library,
+ equals('dart:async/future.dart'));
+ expect(
+ Frame.parseVM('#0 Foo '
+ '(https://dart.dev/stuff/thing.dart:0:0)')
+ .library,
+ equals('https://dart.dev/stuff/thing.dart'));
+ });
+
+ test('returns the relative path for file URIs', () {
+ expect(Frame.parseVM('#0 Foo (foo/bar.dart:0:0)').library,
+ equals(path.join('foo', 'bar.dart')));
+ });
+
+ test('truncates legacy data: URIs', () {
+ var frame = Frame.parseVM(
+ '#0 Foo (data:application/dart;charset=utf-8,blah:0:0)');
+ expect(frame.library, equals('data:...'));
+ });
+
+ test('truncates data: URIs', () {
+ var frame = Frame.parseVM(
+ '#0 main (<data:application/dart;charset=utf-8>:1:15)');
+ expect(frame.library, equals('data:...'));
+ });
+ });
+
+ group('.location', () {
+ test(
+ 'returns the library and line/column numbers for non-core '
+ 'libraries', () {
+ expect(
+ Frame.parseVM('#0 Foo '
+ '(https://dart.dev/thing.dart:5:10)')
+ .location,
+ equals('https://dart.dev/thing.dart 5:10'));
+ expect(Frame.parseVM('#0 Foo (foo/bar.dart:1:2)').location,
+ equals('${path.join('foo', 'bar.dart')} 1:2'));
+ });
+ });
+
+ group('.package', () {
+ test('returns null for non-package URIs', () {
+ expect(
+ Frame.parseVM('#0 Foo (dart:async/future.dart:0:0)').package, isNull);
+ expect(
+ Frame.parseVM('#0 Foo '
+ '(https://dart.dev/stuff/thing.dart:0:0)')
+ .package,
+ isNull);
+ });
+
+ test('returns the package name for package: URIs', () {
+ expect(Frame.parseVM('#0 Foo (package:foo/foo.dart:0:0)').package,
+ equals('foo'));
+ expect(Frame.parseVM('#0 Foo (package:foo/zap/bar.dart:0:0)').package,
+ equals('foo'));
+ });
+ });
+
+ group('.toString()', () {
+ test(
+ 'returns the library and line/column numbers for non-core '
+ 'libraries', () {
+ expect(
+ Frame.parseVM('#0 Foo (https://dart.dev/thing.dart:5:10)').toString(),
+ equals('https://dart.dev/thing.dart 5:10 in Foo'));
+ });
+
+ test('converts "<anonymous closure>" to "<fn>"', () {
+ expect(
+ Frame.parseVM('#0 Foo.<anonymous closure> '
+ '(dart:core/uri.dart:5:10)')
+ .toString(),
+ equals('dart:core/uri.dart 5:10 in Foo.<fn>'));
+ });
+
+ test('prints a frame without a column correctly', () {
+ expect(Frame.parseVM('#0 Foo (dart:core/uri.dart:5)').toString(),
+ equals('dart:core/uri.dart 5 in Foo'));
+ });
+
+ test('prints relative paths as relative', () {
+ var relative = path.normalize('relative/path/to/foo.dart');
+ expect(Frame.parseFriendly('$relative 5:10 Foo').toString(),
+ equals('$relative 5:10 in Foo'));
+ });
+ });
+
+ test('parses a V8 Wasm frame with a name', () {
+ var frame = Frame.parseV8(' at Error._throwWithCurrentStackTrace '
+ '(wasm://wasm/0006d966:wasm-function[119]:0xbb13)');
+ expect(frame.uri, Uri.parse('wasm://wasm/0006d966'));
+ expect(frame.line, 1);
+ expect(frame.column, 0xbb13 + 1);
+ expect(frame.member, 'Error._throwWithCurrentStackTrace');
+ });
+
+ test('parses a V8 Wasm frame with a name with spaces', () {
+ var frame = Frame.parseV8(' at main tear-off trampoline '
+ '(wasm://wasm/0017fbea:wasm-function[863]:0x23cc8)');
+ expect(frame.uri, Uri.parse('wasm://wasm/0017fbea'));
+ expect(frame.line, 1);
+ expect(frame.column, 0x23cc8 + 1);
+ expect(frame.member, 'main tear-off trampoline');
+ });
+
+ test('parses a V8 Wasm frame with a name with colons and parens', () {
+ var frame = Frame.parseV8(' at a::b::c() '
+ '(https://a.b.com/x/y/z.wasm:wasm-function[66334]:0x12c28ad)');
+ expect(frame.uri, Uri.parse('https://a.b.com/x/y/z.wasm'));
+ expect(frame.line, 1);
+ expect(frame.column, 0x12c28ad + 1);
+ expect(frame.member, 'a::b::c()');
+ });
+
+ test('parses a V8 Wasm frame without a name', () {
+ var frame =
+ Frame.parseV8(' at wasm://wasm/0006d966:wasm-function[119]:0xbb13');
+ expect(frame.uri, Uri.parse('wasm://wasm/0006d966'));
+ expect(frame.line, 1);
+ expect(frame.column, 0xbb13 + 1);
+ expect(frame.member, '119');
+ });
+
+ test('parses a Firefox Wasm frame with a name', () {
+ var frame = Frame.parseFirefox(
+ 'g@http://localhost:8080/test.wasm:wasm-function[796]:0x143b4');
+ expect(frame.uri, Uri.parse('http://localhost:8080/test.wasm'));
+ expect(frame.line, 1);
+ expect(frame.column, 0x143b4 + 1);
+ expect(frame.member, 'g');
+ });
+
+ test('parses a Firefox Wasm frame with a name with spaces', () {
+ var frame = Frame.parseFirefox(
+ 'main tear-off trampoline@http://localhost:8080/test.wasm:wasm-function[794]:0x14387');
+ expect(frame.uri, Uri.parse('http://localhost:8080/test.wasm'));
+ expect(frame.line, 1);
+ expect(frame.column, 0x14387 + 1);
+ expect(frame.member, 'main tear-off trampoline');
+ });
+
+ test('parses a Firefox Wasm frame without a name', () {
+ var frame = Frame.parseFirefox(
+ '@http://localhost:8080/test.wasm:wasm-function[796]:0x143b4');
+ expect(frame.uri, Uri.parse('http://localhost:8080/test.wasm'));
+ expect(frame.line, 1);
+ expect(frame.column, 0x143b4 + 1);
+ expect(frame.member, '796');
+ });
+
+ test('parses a Safari Wasm frame with a name', () {
+ var frame = Frame.parseSafari('<?>.wasm-function[g]@[wasm code]');
+ expect(frame.uri, Uri.parse('wasm code'));
+ expect(frame.line, null);
+ expect(frame.column, null);
+ expect(frame.member, 'g');
+ });
+
+ test('parses a Safari Wasm frame with a name with spaces', () {
+ var frame = Frame.parseSafari(
+ '<?>.wasm-function[main tear-off trampoline]@[wasm code]');
+ expect(frame.uri, Uri.parse('wasm code'));
+ expect(frame.line, null);
+ expect(frame.column, null);
+ expect(frame.member, 'main tear-off trampoline');
+ });
+
+ test('parses a Safari Wasm frame without a name', () {
+ var frame = Frame.parseSafari('<?>.wasm-function[796]@[wasm code]');
+ expect(frame.uri, Uri.parse('wasm code'));
+ expect(frame.line, null);
+ expect(frame.column, null);
+ expect(frame.member, '796');
+ });
+}
+
+void expectIsUnparsed(Frame Function(String) constructor, String text) {
+ var frame = constructor(text);
+ expect(frame, isA<UnparsedFrame>());
+ expect(frame.toString(), equals(text));
+}
diff --git a/pkgs/stack_trace/test/trace_test.dart b/pkgs/stack_trace/test/trace_test.dart
new file mode 100644
index 0000000..e09de95
--- /dev/null
+++ b/pkgs/stack_trace/test/trace_test.dart
@@ -0,0 +1,615 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:path/path.dart' as path;
+import 'package:stack_trace/stack_trace.dart';
+import 'package:test/test.dart';
+
+void main() {
+ // This just shouldn't crash.
+ test('a native stack trace is parseable', Trace.current);
+
+ group('.parse', () {
+ test('.parse parses a V8 stack trace with eval statement correctly', () {
+ var trace = Trace.parse(r'''Error
+ at Object.eval (eval at Foo (main.dart.js:588), <anonymous>:3:47)''');
+ expect(trace.frames[0].uri, Uri.parse('main.dart.js'));
+ expect(trace.frames[0].member, equals('Object.eval'));
+ expect(trace.frames[0].line, equals(588));
+ expect(trace.frames[0].column, isNull);
+ });
+
+ test('.parse parses a VM stack trace correctly', () {
+ var trace = Trace.parse(
+ '#0 Foo._bar (file:///home/nweiz/code/stuff.dart:42:21)\n'
+ '#1 zip.<anonymous closure>.zap (dart:async/future.dart:0:2)\n'
+ '#2 zip.<anonymous closure>.zap (https://pub.dev/thing.dart:1:100)',
+ );
+
+ expect(trace.frames[0].uri,
+ equals(Uri.parse('file:///home/nweiz/code/stuff.dart')));
+ expect(trace.frames[1].uri, equals(Uri.parse('dart:async/future.dart')));
+ expect(
+ trace.frames[2].uri, equals(Uri.parse('https://pub.dev/thing.dart')));
+ });
+
+ test('parses a V8 stack trace correctly', () {
+ var trace = Trace.parse('Error\n'
+ ' at Foo._bar (https://example.com/stuff.js:42:21)\n'
+ ' at https://example.com/stuff.js:0:2\n'
+ ' at zip.<anonymous>.zap '
+ '(https://pub.dev/thing.js:1:100)');
+
+ expect(trace.frames[0].uri,
+ equals(Uri.parse('https://example.com/stuff.js')));
+ expect(trace.frames[1].uri,
+ equals(Uri.parse('https://example.com/stuff.js')));
+ expect(
+ trace.frames[2].uri, equals(Uri.parse('https://pub.dev/thing.js')));
+
+ trace = Trace.parse('Exception: foo\n'
+ ' at Foo._bar (https://example.com/stuff.js:42:21)\n'
+ ' at https://example.com/stuff.js:0:2\n'
+ ' at zip.<anonymous>.zap '
+ '(https://pub.dev/thing.js:1:100)');
+
+ expect(trace.frames[0].uri,
+ equals(Uri.parse('https://example.com/stuff.js')));
+ expect(trace.frames[1].uri,
+ equals(Uri.parse('https://example.com/stuff.js')));
+ expect(
+ trace.frames[2].uri, equals(Uri.parse('https://pub.dev/thing.js')));
+
+ trace = Trace.parse('Exception: foo\n'
+ ' bar\n'
+ ' at Foo._bar (https://example.com/stuff.js:42:21)\n'
+ ' at https://example.com/stuff.js:0:2\n'
+ ' at zip.<anonymous>.zap '
+ '(https://pub.dev/thing.js:1:100)');
+
+ expect(trace.frames[0].uri,
+ equals(Uri.parse('https://example.com/stuff.js')));
+ expect(trace.frames[1].uri,
+ equals(Uri.parse('https://example.com/stuff.js')));
+ expect(
+ trace.frames[2].uri, equals(Uri.parse('https://pub.dev/thing.js')));
+
+ trace = Trace.parse('Exception: foo\n'
+ ' bar\n'
+ ' at Foo._bar (https://example.com/stuff.js:42:21)\n'
+ ' at https://example.com/stuff.js:0:2\n'
+ ' at (anonymous function).zip.zap '
+ '(https://pub.dev/thing.js:1:100)');
+
+ expect(trace.frames[0].uri,
+ equals(Uri.parse('https://example.com/stuff.js')));
+ expect(trace.frames[1].uri,
+ equals(Uri.parse('https://example.com/stuff.js')));
+ expect(trace.frames[1].member, equals('<fn>'));
+ expect(
+ trace.frames[2].uri, equals(Uri.parse('https://pub.dev/thing.js')));
+ expect(trace.frames[2].member, equals('<fn>.zip.zap'));
+ });
+
+ // JavaScriptCore traces are just like V8 traces, except that they don't
+ // have a header and each line starts with a tab rather than with spaces.
+ test('parses a JavaScriptCore stack trace correctly', () {
+ var trace =
+ Trace.parse('\tat Foo._bar (https://example.com/stuff.js:42:21)\n'
+ '\tat https://example.com/stuff.js:0:2\n'
+ '\tat zip.<anonymous>.zap '
+ '(https://pub.dev/thing.js:1:100)');
+
+ expect(trace.frames[0].uri,
+ equals(Uri.parse('https://example.com/stuff.js')));
+ expect(trace.frames[1].uri,
+ equals(Uri.parse('https://example.com/stuff.js')));
+ expect(
+ trace.frames[2].uri, equals(Uri.parse('https://pub.dev/thing.js')));
+
+ trace = Trace.parse('\tat Foo._bar (https://example.com/stuff.js:42:21)\n'
+ '\tat \n'
+ '\tat zip.<anonymous>.zap '
+ '(https://pub.dev/thing.js:1:100)');
+
+ expect(trace.frames[0].uri,
+ equals(Uri.parse('https://example.com/stuff.js')));
+ expect(
+ trace.frames[1].uri, equals(Uri.parse('https://pub.dev/thing.js')));
+ });
+
+ test('parses a Firefox/Safari stack trace correctly', () {
+ var trace = Trace.parse('Foo._bar@https://example.com/stuff.js:42\n'
+ 'zip/<@https://example.com/stuff.js:0\n'
+ 'zip.zap(12, "@)()/<")@https://pub.dev/thing.js:1');
+
+ expect(trace.frames[0].uri,
+ equals(Uri.parse('https://example.com/stuff.js')));
+ expect(trace.frames[1].uri,
+ equals(Uri.parse('https://example.com/stuff.js')));
+ expect(
+ trace.frames[2].uri, equals(Uri.parse('https://pub.dev/thing.js')));
+
+ trace = Trace.parse('zip/<@https://example.com/stuff.js:0\n'
+ 'Foo._bar@https://example.com/stuff.js:42\n'
+ 'zip.zap(12, "@)()/<")@https://pub.dev/thing.js:1');
+
+ expect(trace.frames[0].uri,
+ equals(Uri.parse('https://example.com/stuff.js')));
+ expect(trace.frames[1].uri,
+ equals(Uri.parse('https://example.com/stuff.js')));
+ expect(
+ trace.frames[2].uri, equals(Uri.parse('https://pub.dev/thing.js')));
+
+ trace = Trace.parse('zip.zap(12, "@)()/<")@https://pub.dev/thing.js:1\n'
+ 'zip/<@https://example.com/stuff.js:0\n'
+ 'Foo._bar@https://example.com/stuff.js:42');
+
+ expect(
+ trace.frames[0].uri, equals(Uri.parse('https://pub.dev/thing.js')));
+ expect(trace.frames[1].uri,
+ equals(Uri.parse('https://example.com/stuff.js')));
+ expect(trace.frames[2].uri,
+ equals(Uri.parse('https://example.com/stuff.js')));
+ });
+
+ test('parses a Firefox/Safari stack trace containing native code correctly',
+ () {
+ var trace = Trace.parse('Foo._bar@https://example.com/stuff.js:42\n'
+ 'zip/<@https://example.com/stuff.js:0\n'
+ 'zip.zap(12, "@)()/<")@https://pub.dev/thing.js:1\n'
+ '[native code]');
+
+ expect(trace.frames[0].uri,
+ equals(Uri.parse('https://example.com/stuff.js')));
+ expect(trace.frames[1].uri,
+ equals(Uri.parse('https://example.com/stuff.js')));
+ expect(
+ trace.frames[2].uri, equals(Uri.parse('https://pub.dev/thing.js')));
+ expect(trace.frames.length, equals(3));
+ });
+
+ test('parses a Firefox/Safari stack trace without a method name correctly',
+ () {
+ var trace = Trace.parse('https://example.com/stuff.js:42\n'
+ 'zip/<@https://example.com/stuff.js:0\n'
+ 'zip.zap(12, "@)()/<")@https://pub.dev/thing.js:1');
+
+ expect(trace.frames[0].uri,
+ equals(Uri.parse('https://example.com/stuff.js')));
+ expect(trace.frames[0].member, equals('<fn>'));
+ expect(trace.frames[1].uri,
+ equals(Uri.parse('https://example.com/stuff.js')));
+ expect(
+ trace.frames[2].uri, equals(Uri.parse('https://pub.dev/thing.js')));
+ });
+
+ test('parses a Firefox/Safari stack trace with an empty line correctly',
+ () {
+ var trace = Trace.parse('Foo._bar@https://example.com/stuff.js:42\n'
+ '\n'
+ 'zip/<@https://example.com/stuff.js:0\n'
+ 'zip.zap(12, "@)()/<")@https://pub.dev/thing.js:1');
+
+ expect(trace.frames[0].uri,
+ equals(Uri.parse('https://example.com/stuff.js')));
+ expect(trace.frames[1].uri,
+ equals(Uri.parse('https://example.com/stuff.js')));
+ expect(
+ trace.frames[2].uri, equals(Uri.parse('https://pub.dev/thing.js')));
+ });
+
+ test('parses a Firefox/Safari stack trace with a column number correctly',
+ () {
+ var trace = Trace.parse('Foo._bar@https://example.com/stuff.js:42:2\n'
+ 'zip/<@https://example.com/stuff.js:0\n'
+ 'zip.zap(12, "@)()/<")@https://pub.dev/thing.js:1');
+
+ expect(trace.frames[0].uri,
+ equals(Uri.parse('https://example.com/stuff.js')));
+ expect(trace.frames[0].line, equals(42));
+ expect(trace.frames[0].column, equals(2));
+ expect(trace.frames[1].uri,
+ equals(Uri.parse('https://example.com/stuff.js')));
+ expect(
+ trace.frames[2].uri, equals(Uri.parse('https://pub.dev/thing.js')));
+ });
+
+ test('parses a package:stack_trace stack trace correctly', () {
+ var trace =
+ Trace.parse('https://dart.dev/foo/bar.dart 10:11 Foo.<fn>.bar\n'
+ 'https://dart.dev/foo/baz.dart Foo.<fn>.bar');
+
+ expect(trace.frames[0].uri,
+ equals(Uri.parse('https://dart.dev/foo/bar.dart')));
+ expect(trace.frames[1].uri,
+ equals(Uri.parse('https://dart.dev/foo/baz.dart')));
+ });
+
+ test('parses a package:stack_trace stack chain correctly', () {
+ var trace =
+ Trace.parse('https://dart.dev/foo/bar.dart 10:11 Foo.<fn>.bar\n'
+ 'https://dart.dev/foo/baz.dart Foo.<fn>.bar\n'
+ '===== asynchronous gap ===========================\n'
+ 'https://dart.dev/foo/bang.dart 10:11 Foo.<fn>.bar\n'
+ 'https://dart.dev/foo/quux.dart Foo.<fn>.bar');
+
+ expect(trace.frames[0].uri,
+ equals(Uri.parse('https://dart.dev/foo/bar.dart')));
+ expect(trace.frames[1].uri,
+ equals(Uri.parse('https://dart.dev/foo/baz.dart')));
+ expect(trace.frames[2].uri,
+ equals(Uri.parse('https://dart.dev/foo/bang.dart')));
+ expect(trace.frames[3].uri,
+ equals(Uri.parse('https://dart.dev/foo/quux.dart')));
+ });
+
+ test('parses a package:stack_trace stack chain with end gap correctly', () {
+ var trace = Trace.parse(
+ 'https://dart.dev/foo/bar.dart 10:11 Foo.<fn>.bar\n'
+ 'https://dart.dev/foo/baz.dart Foo.<fn>.bar\n'
+ 'https://dart.dev/foo/bang.dart 10:11 Foo.<fn>.bar\n'
+ 'https://dart.dev/foo/quux.dart Foo.<fn>.bar===== asynchronous gap ===========================\n',
+ );
+
+ expect(trace.frames.length, 4);
+ expect(trace.frames[0].uri,
+ equals(Uri.parse('https://dart.dev/foo/bar.dart')));
+ expect(trace.frames[1].uri,
+ equals(Uri.parse('https://dart.dev/foo/baz.dart')));
+ expect(trace.frames[2].uri,
+ equals(Uri.parse('https://dart.dev/foo/bang.dart')));
+ expect(trace.frames[3].uri,
+ equals(Uri.parse('https://dart.dev/foo/quux.dart')));
+ });
+
+ test('parses a real package:stack_trace stack trace correctly', () {
+ var traceString = Trace.current().toString();
+ expect(Trace.parse(traceString).toString(), equals(traceString));
+ });
+
+ test('parses an empty string correctly', () {
+ var trace = Trace.parse('');
+ expect(trace.frames, isEmpty);
+ expect(trace.toString(), equals(''));
+ });
+
+ test('parses trace with async gap correctly', () {
+ var trace = Trace.parse('#0 bop (file:///pull.dart:42:23)\n'
+ '<asynchronous suspension>\n'
+ '#1 twist (dart:the/future.dart:0:2)\n'
+ '#2 main (dart:my/file.dart:4:6)\n');
+
+ expect(trace.frames.length, 3);
+ expect(trace.frames[0].uri, equals(Uri.parse('file:///pull.dart')));
+ expect(trace.frames[1].uri, equals(Uri.parse('dart:the/future.dart')));
+ expect(trace.frames[2].uri, equals(Uri.parse('dart:my/file.dart')));
+ });
+
+ test('parses trace with async gap at end correctly', () {
+ var trace = Trace.parse('#0 bop (file:///pull.dart:42:23)\n'
+ '#1 twist (dart:the/future.dart:0:2)\n'
+ '<asynchronous suspension>\n');
+
+ expect(trace.frames.length, 2);
+ expect(trace.frames[0].uri, equals(Uri.parse('file:///pull.dart')));
+ expect(trace.frames[1].uri, equals(Uri.parse('dart:the/future.dart')));
+ });
+
+ test('parses a V8 stack trace with Wasm frames correctly', () {
+ var trace = Trace.parse(
+ '\tat Error._throwWithCurrentStackTrace (wasm://wasm/0006d892:wasm-function[119]:0xbaf8)\n'
+ '\tat main (wasm://wasm/0006d892:wasm-function[792]:0x14378)\n'
+ '\tat main tear-off trampoline (wasm://wasm/0006d892:wasm-function[794]:0x14387)\n'
+ '\tat _invokeMain (wasm://wasm/0006d892:wasm-function[70]:0xa56c)\n'
+ '\tat InstantiatedApp.invokeMain (/home/user/test.mjs:361:37)\n'
+ '\tat main (/home/user/run_wasm.js:416:21)\n'
+ '\tat async action (/home/user/run_wasm.js:353:38)\n'
+ '\tat async eventLoop (/home/user/run_wasm.js:329:9)');
+
+ expect(trace.frames.length, 8);
+
+ for (final frame in trace.frames) {
+ expect(frame is UnparsedFrame, false);
+ }
+
+ expect(trace.frames[0].uri, Uri.parse('wasm://wasm/0006d892'));
+ expect(trace.frames[0].line, 1);
+ expect(trace.frames[0].column, 0xbaf8 + 1);
+ expect(trace.frames[0].member, 'Error._throwWithCurrentStackTrace');
+
+ expect(trace.frames[4].uri, Uri.parse('file:///home/user/test.mjs'));
+ expect(trace.frames[4].line, 361);
+ expect(trace.frames[4].column, 37);
+ expect(trace.frames[4].member, 'InstantiatedApp.invokeMain');
+
+ expect(trace.frames[5].uri, Uri.parse('file:///home/user/run_wasm.js'));
+ expect(trace.frames[5].line, 416);
+ expect(trace.frames[5].column, 21);
+ expect(trace.frames[5].member, 'main');
+ });
+
+ test('parses a Firefox stack trace with Wasm frames correctly', () {
+ var trace = Trace.parse(
+ 'Error._throwWithCurrentStackTrace@http://localhost:8080/test.wasm:wasm-function[119]:0xbaf8\n'
+ 'main@http://localhost:8080/test.wasm:wasm-function[792]:0x14378\n'
+ 'main tear-off trampoline@http://localhost:8080/test.wasm:wasm-function[794]:0x14387\n'
+ '_invokeMain@http://localhost:8080/test.wasm:wasm-function[70]:0xa56c\n'
+ 'invoke@http://localhost:8080/test.mjs:48:26');
+
+ expect(trace.frames.length, 5);
+
+ for (final frame in trace.frames) {
+ expect(frame is UnparsedFrame, false);
+ }
+
+ expect(trace.frames[0].uri, Uri.parse('http://localhost:8080/test.wasm'));
+ expect(trace.frames[0].line, 1);
+ expect(trace.frames[0].column, 0xbaf8 + 1);
+ expect(trace.frames[0].member, 'Error._throwWithCurrentStackTrace');
+
+ expect(trace.frames[4].uri, Uri.parse('http://localhost:8080/test.mjs'));
+ expect(trace.frames[4].line, 48);
+ expect(trace.frames[4].column, 26);
+ expect(trace.frames[4].member, 'invoke');
+ });
+
+ test('parses a JSShell stack trace with Wasm frames correctly', () {
+ var trace = Trace.parse(
+ 'Error._throwWithCurrentStackTrace@/home/user/test.mjs line 29 > WebAssembly.compile:wasm-function[119]:0xbaf8\n'
+ 'main@/home/user/test.mjs line 29 > WebAssembly.compile:wasm-function[792]:0x14378\n'
+ 'main tear-off trampoline@/home/user/test.mjs line 29 > WebAssembly.compile:wasm-function[794]:0x14387\n'
+ '_invokeMain@/home/user/test.mjs line 29 > WebAssembly.compile:wasm-function[70]:0xa56c\n'
+ 'invokeMain@/home/user/test.mjs:361:37\n'
+ 'main@/home/user/run_wasm.js:416:21\n'
+ 'async*action@/home/user/run_wasm.js:353:44\n'
+ 'eventLoop@/home/user/run_wasm.js:329:15\n'
+ 'self.dartMainRunner@/home/user/run_wasm.js:354:14\n'
+ '@/home/user/run_wasm.js:419:15');
+
+ expect(trace.frames.length, 10);
+
+ for (final frame in trace.frames) {
+ expect(frame is UnparsedFrame, false);
+ }
+
+ expect(trace.frames[0].uri, Uri.parse('file:///home/user/test.mjs'));
+ expect(trace.frames[0].line, 1);
+ expect(trace.frames[0].column, 0xbaf8 + 1);
+ expect(trace.frames[0].member, 'Error._throwWithCurrentStackTrace');
+
+ expect(trace.frames[4].uri, Uri.parse('file:///home/user/test.mjs'));
+ expect(trace.frames[4].line, 361);
+ expect(trace.frames[4].column, 37);
+ expect(trace.frames[4].member, 'invokeMain');
+
+ expect(trace.frames[9].uri, Uri.parse('file:///home/user/run_wasm.js'));
+ expect(trace.frames[9].line, 419);
+ expect(trace.frames[9].column, 15);
+ expect(trace.frames[9].member, '<fn>');
+ });
+
+ test('parses a Safari stack trace with Wasm frames correctly', () {
+ var trace = Trace.parse(
+ '<?>.wasm-function[Error._throwWithCurrentStackTrace]@[wasm code]\n'
+ '<?>.wasm-function[main]@[wasm code]\n'
+ '<?>.wasm-function[main tear-off trampoline]@[wasm code]\n'
+ '<?>.wasm-function[_invokeMain]@[wasm code]\n'
+ 'invokeMain@/home/user/test.mjs:361:48\n'
+ '@/home/user/run_wasm.js:416:31');
+
+ expect(trace.frames.length, 6);
+
+ for (final frame in trace.frames) {
+ expect(frame is UnparsedFrame, false);
+ }
+
+ expect(trace.frames[0].uri, Uri.parse('wasm code'));
+ expect(trace.frames[0].line, null);
+ expect(trace.frames[0].column, null);
+ expect(trace.frames[0].member, 'Error._throwWithCurrentStackTrace');
+
+ expect(trace.frames[4].uri, Uri.parse('file:///home/user/test.mjs'));
+ expect(trace.frames[4].line, 361);
+ expect(trace.frames[4].column, 48);
+ expect(trace.frames[4].member, 'invokeMain');
+
+ expect(trace.frames[5].uri, Uri.parse('file:///home/user/run_wasm.js'));
+ expect(trace.frames[5].line, 416);
+ expect(trace.frames[5].column, 31);
+ expect(trace.frames[5].member, '<fn>');
+ });
+ });
+
+ test('.toString() nicely formats the stack trace', () {
+ var trace = Trace.parse('''
+#0 Foo._bar (foo/bar.dart:42:21)
+#1 zip.<anonymous closure>.zap (dart:async/future.dart:0:2)
+#2 zip.<anonymous closure>.zap (https://pub.dev/thing.dart:1:100)
+''');
+
+ expect(trace.toString(), equals('''
+${path.join('foo', 'bar.dart')} 42:21 Foo._bar
+dart:async/future.dart 0:2 zip.<fn>.zap
+https://pub.dev/thing.dart 1:100 zip.<fn>.zap
+'''));
+ });
+
+ test('.vmTrace returns a native-style trace', () {
+ var uri = path.toUri(path.absolute('foo'));
+ var trace = Trace([
+ Frame(uri, 10, 20, 'Foo.<fn>'),
+ Frame(Uri.parse('https://dart.dev/foo.dart'), null, null, 'bar'),
+ Frame(Uri.parse('dart:async'), 15, null, 'baz'),
+ ]);
+
+ expect(
+ trace.vmTrace.toString(),
+ equals('#1 Foo.<anonymous closure> ($uri:10:20)\n'
+ '#2 bar (https://dart.dev/foo.dart:0:0)\n'
+ '#3 baz (dart:async:15:0)\n'));
+ });
+
+ group('folding', () {
+ group('.terse', () {
+ test('folds core frames together bottom-up', () {
+ var trace = Trace.parse('''
+#1 top (dart:async/future.dart:0:2)
+#2 bottom (dart:core/uri.dart:1:100)
+#0 notCore (foo.dart:42:21)
+#3 top (dart:io:5:10)
+#4 bottom (dart:async-patch/future.dart:9:11)
+#5 alsoNotCore (bar.dart:10:20)
+''');
+
+ expect(trace.terse.toString(), equals('''
+dart:core bottom
+foo.dart 42:21 notCore
+dart:async bottom
+bar.dart 10:20 alsoNotCore
+'''));
+ });
+
+ test('folds empty async frames', () {
+ var trace = Trace.parse('''
+#0 top (dart:async/future.dart:0:2)
+#1 empty.<<anonymous closure>_async_body> (bar.dart)
+#2 bottom (dart:async-patch/future.dart:9:11)
+#3 notCore (foo.dart:42:21)
+''');
+
+ expect(trace.terse.toString(), equals('''
+dart:async bottom
+foo.dart 42:21 notCore
+'''));
+ });
+
+ test('removes the bottom-most async frame', () {
+ var trace = Trace.parse('''
+#0 notCore (foo.dart:42:21)
+#1 top (dart:async/future.dart:0:2)
+#2 bottom (dart:core/uri.dart:1:100)
+#3 top (dart:io:5:10)
+#4 bottom (dart:async-patch/future.dart:9:11)
+''');
+
+ expect(trace.terse.toString(), equals('''
+foo.dart 42:21 notCore
+'''));
+ });
+
+ test("won't make a trace empty", () {
+ var trace = Trace.parse('''
+#1 top (dart:async/future.dart:0:2)
+#2 bottom (dart:core/uri.dart:1:100)
+''');
+
+ expect(trace.terse.toString(), equals('''
+dart:core bottom
+'''));
+ });
+
+ test("won't panic on an empty trace", () {
+ expect(Trace.parse('').terse.toString(), equals(''));
+ });
+ });
+
+ group('.foldFrames', () {
+ test('folds frames together bottom-up', () {
+ var trace = Trace.parse('''
+#0 notFoo (foo.dart:42:21)
+#1 fooTop (bar.dart:0:2)
+#2 fooBottom (foo.dart:1:100)
+#3 alsoNotFoo (bar.dart:10:20)
+#4 fooTop (dart:io/socket.dart:5:10)
+#5 fooBottom (dart:async-patch/future.dart:9:11)
+''');
+
+ var folded =
+ trace.foldFrames((frame) => frame.member!.startsWith('foo'));
+ expect(folded.toString(), equals('''
+foo.dart 42:21 notFoo
+foo.dart 1:100 fooBottom
+bar.dart 10:20 alsoNotFoo
+dart:async-patch/future.dart 9:11 fooBottom
+'''));
+ });
+
+ test('will never fold unparsed frames', () {
+ var trace = Trace.parse(r'''
+.g"cs$#:b";a#>sw{*{ul$"$xqwr`p
+%+j-?uppx<([j@#nu{{>*+$%x-={`{
+!e($b{nj)zs?cgr%!;bmw.+$j+pfj~
+''');
+
+ expect(trace.foldFrames((frame) => true).toString(), equals(r'''
+.g"cs$#:b";a#>sw{*{ul$"$xqwr`p
+%+j-?uppx<([j@#nu{{>*+$%x-={`{
+!e($b{nj)zs?cgr%!;bmw.+$j+pfj~
+'''));
+ });
+
+ group('with terse: true', () {
+ test('folds core frames as well', () {
+ var trace = Trace.parse('''
+#0 notFoo (foo.dart:42:21)
+#1 fooTop (bar.dart:0:2)
+#2 coreBottom (dart:async/future.dart:0:2)
+#3 alsoNotFoo (bar.dart:10:20)
+#4 fooTop (foo.dart:9:11)
+#5 coreBottom (dart:async-patch/future.dart:9:11)
+''');
+
+ var folded = trace.foldFrames(
+ (frame) => frame.member!.startsWith('foo'),
+ terse: true);
+ expect(folded.toString(), equals('''
+foo.dart 42:21 notFoo
+dart:async coreBottom
+bar.dart 10:20 alsoNotFoo
+'''));
+ });
+
+ test('shortens folded frames', () {
+ var trace = Trace.parse('''
+#0 notFoo (foo.dart:42:21)
+#1 fooTop (bar.dart:0:2)
+#2 fooBottom (package:foo/bar.dart:0:2)
+#3 alsoNotFoo (bar.dart:10:20)
+#4 fooTop (foo.dart:9:11)
+#5 fooBottom (foo/bar.dart:9:11)
+#6 againNotFoo (bar.dart:20:20)
+''');
+
+ var folded = trace.foldFrames(
+ (frame) => frame.member!.startsWith('foo'),
+ terse: true);
+ expect(folded.toString(), equals('''
+foo.dart 42:21 notFoo
+package:foo fooBottom
+bar.dart 10:20 alsoNotFoo
+foo fooBottom
+bar.dart 20:20 againNotFoo
+'''));
+ });
+
+ test('removes the bottom-most folded frame', () {
+ var trace = Trace.parse('''
+#2 fooTop (package:foo/bar.dart:0:2)
+#3 notFoo (bar.dart:10:20)
+#5 fooBottom (foo/bar.dart:9:11)
+''');
+
+ var folded = trace.foldFrames(
+ (frame) => frame.member!.startsWith('foo'),
+ terse: true);
+ expect(folded.toString(), equals('''
+package:foo fooTop
+bar.dart 10:20 notFoo
+'''));
+ });
+ });
+ });
+ });
+}
diff --git a/pkgs/stack_trace/test/utils.dart b/pkgs/stack_trace/test/utils.dart
new file mode 100644
index 0000000..98cb5ed
--- /dev/null
+++ b/pkgs/stack_trace/test/utils.dart
@@ -0,0 +1,14 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:stack_trace/stack_trace.dart';
+import 'package:test/test.dart';
+
+/// Returns a matcher that runs [matcher] against a [Frame]'s `member` field.
+Matcher frameMember(Object? matcher) =>
+ isA<Frame>().having((p0) => p0.member, 'member', matcher);
+
+/// Returns a matcher that runs [matcher] against a [Frame]'s `library` field.
+Matcher frameLibrary(Object? matcher) =>
+ isA<Frame>().having((p0) => p0.library, 'library', matcher);
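
Not part of the diff: a minimal sketch of how these shared matchers are typically used alongside the tests above, assuming the VM frame format exercised in `frame_test.dart`; the frame string below is hypothetical.

```dart
import 'package:stack_trace/stack_trace.dart';
import 'package:test/test.dart';

import 'utils.dart';

void main() {
  test("matches a Frame's member and library", () {
    // Hypothetical frame in the VM format used throughout frame_test.dart.
    var frame = Frame.parseVM('#0 Foo.bar (package:foo/foo.dart:10:11)');
    expect(frame, frameMember(equals('Foo.bar')));
    expect(frame, frameLibrary(equals('package:foo/foo.dart')));
  });
}
```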
diff --git a/pkgs/stack_trace/test/vm_test.dart b/pkgs/stack_trace/test/vm_test.dart
new file mode 100644
index 0000000..70ac014
--- /dev/null
+++ b/pkgs/stack_trace/test/vm_test.dart
@@ -0,0 +1,112 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+/// This file tests stack_trace's ability to parse live stack traces. It's a
+/// dual of dartium_test.dart, since method names can differ somewhat from
+/// platform to platform. No similar file exists for dart2js since the specific
+/// method names there are implementation details.
+@TestOn('vm')
+library;
+
+import 'package:path/path.dart' as path;
+import 'package:stack_trace/stack_trace.dart';
+import 'package:test/test.dart';
+
+// The name of this (trivial) function is verified as part of the test
+String getStackTraceString() => StackTrace.current.toString();
+
+// The name of this (trivial) function is verified as part of the test
+StackTrace getStackTraceObject() => StackTrace.current;
+
+Frame getCaller([int? level]) {
+ if (level == null) return Frame.caller();
+ return Frame.caller(level);
+}
+
+Frame nestedGetCaller(int level) => getCaller(level);
+
+Trace getCurrentTrace([int level = 0]) => Trace.current(level);
+
+Trace nestedGetCurrentTrace(int level) => getCurrentTrace(level);
+
+void main() {
+ group('Trace', () {
+ test('.parse parses a real stack trace correctly', () {
+ var string = getStackTraceString();
+ var trace = Trace.parse(string);
+ expect(path.url.basename(trace.frames.first.uri.path),
+ equals('vm_test.dart'));
+ expect(trace.frames.first.member, equals('getStackTraceString'));
+ });
+
+ test('converts from a native stack trace correctly', () {
+ var trace = Trace.from(getStackTraceObject());
+ expect(path.url.basename(trace.frames.first.uri.path),
+ equals('vm_test.dart'));
+ expect(trace.frames.first.member, equals('getStackTraceObject'));
+ });
+
+ test('.from handles a stack overflow trace correctly', () {
+ void overflow() => overflow();
+
+ late Trace? trace;
+ try {
+ overflow();
+ } catch (_, stackTrace) {
+ trace = Trace.from(stackTrace);
+ }
+
+ expect(trace!.frames.first.member, equals('main.<fn>.<fn>.overflow'));
+ });
+
+ group('.current()', () {
+ test('with no argument returns a trace starting at the current frame',
+ () {
+ var trace = Trace.current();
+ expect(trace.frames.first.member, equals('main.<fn>.<fn>.<fn>'));
+ });
+
+ test('at level 0 returns a trace starting at the current frame', () {
+ var trace = Trace.current();
+ expect(trace.frames.first.member, equals('main.<fn>.<fn>.<fn>'));
+ });
+
+ test('at level 1 returns a trace starting at the parent frame', () {
+ var trace = getCurrentTrace(1);
+ expect(trace.frames.first.member, equals('main.<fn>.<fn>.<fn>'));
+ });
+
+ test('at level 2 returns a trace starting at the grandparent frame', () {
+ var trace = nestedGetCurrentTrace(2);
+ expect(trace.frames.first.member, equals('main.<fn>.<fn>.<fn>'));
+ });
+
+ test('throws an ArgumentError for negative levels', () {
+ expect(() => Trace.current(-1), throwsArgumentError);
+ });
+ });
+ });
+
+ group('Frame.caller()', () {
+ test('with no argument returns the parent frame', () {
+ expect(getCaller().member, equals('main.<fn>.<fn>'));
+ });
+
+ test('at level 0 returns the current frame', () {
+ expect(getCaller(0).member, equals('getCaller'));
+ });
+
+ test('at level 1 returns the current frame', () {
+ expect(getCaller(1).member, equals('main.<fn>.<fn>'));
+ });
+
+ test('at level 2 returns the grandparent frame', () {
+ expect(nestedGetCaller(2).member, equals('main.<fn>.<fn>'));
+ });
+
+ test('throws an ArgumentError for negative levels', () {
+ expect(() => Frame.caller(-1), throwsArgumentError);
+ });
+ });
+}
diff --git a/pkgs/stream_channel/.gitignore b/pkgs/stream_channel/.gitignore
new file mode 100644
index 0000000..1447012
--- /dev/null
+++ b/pkgs/stream_channel/.gitignore
@@ -0,0 +1,10 @@
+.buildlog
+.dart_tool/
+.DS_Store
+.idea
+.pub/
+.settings/
+build/
+packages
+.packages
+pubspec.lock
diff --git a/pkgs/stream_channel/AUTHORS b/pkgs/stream_channel/AUTHORS
new file mode 100644
index 0000000..e8063a8
--- /dev/null
+++ b/pkgs/stream_channel/AUTHORS
@@ -0,0 +1,6 @@
+# Below is a list of people and organizations that have contributed
+# to the project. Names should be added to the list like so:
+#
+# Name/Organization <email address>
+
+Google Inc.
diff --git a/pkgs/stream_channel/CHANGELOG.md b/pkgs/stream_channel/CHANGELOG.md
new file mode 100644
index 0000000..9dd3990
--- /dev/null
+++ b/pkgs/stream_channel/CHANGELOG.md
@@ -0,0 +1,162 @@
+## 2.1.4
+
+* Fix `StreamChannelMixin` so that it can be used as a mixin again.
+
+## 2.1.3
+
+* Require Dart 3.3
+* Move to `dart-lang/tools` monorepo.
+
+## 2.1.2
+
+* Require Dart 2.19
+* Add an example.
+* Fix a race condition in `IsolateChannel.connectReceive()` where the channel
+ could hang forever if its sink was closed before the connection was established.
+
+## 2.1.1
+
+* Require Dart 2.14
+* Populate the pubspec `repository` field.
+* Handle multichannel messages where the ID element is a `double` at runtime
+ instead of an `int`. When reading an array with `dart2wasm`, numbers within
+ the array are parsed as `double`.
+
+## 2.1.0
+
+* Stable release for null safety.
+
+## 2.0.0
+
+**Breaking changes**
+
+* `IsolateChannel` requires a separate import
+ `package:stream_channel/isolate_channel.dart`.
+ `package:stream_channel/stream_channel.dart` will now not trigger any platform
+ concerns due to importing `dart:isolate`.
+* Remove `JsonDocumentTransformer` class. The `jsonDocument` top level is still
+ available.
+* Remove `StreamChannelTransformer.typed`. Use `.cast` on the transformed
+ channel instead.
+* Change `Future<dynamic>` returns to `Future<void>`.
+
+## 1.7.0
+
+* Make `IsolateChannel` available through
+ `package:stream_channel/isolate_channel.dart`. This will be the required
+ import in the next release.
+* Require `2.0.0` or newer SDK.
+* Internal style changes.
+
+## 1.6.8
+
+* Set max SDK version to `<3.0.0`, and adjust other dependencies.
+
+## 1.6.7+1
+
+* Fix Dart 2 runtime types in `IsolateChannel`.
+
+## 1.6.7
+
+* Update SDK version to 2.0.0-dev.17.0.
+* Add a type argument to `MultiChannel`.
+
+## 1.6.6
+
+* Fix a Dart 2 issue with inner stream transformation in `GuaranteeChannel`.
+
+* Fix a Dart 2 issue with `StreamChannelTransformer.fromCodec()`.
+
+## 1.6.5
+
+* Fix an issue with `JsonDocumentTransformer.bind` where it created an internal
+ stream channel which didn't get a properly inferred type for its `sink`.
+
+## 1.6.4
+
+* Fix a race condition in `MultiChannel` where messages from a remote virtual
+ channel could get dropped if the corresponding local channel wasn't registered
+ quickly enough.
+
+## 1.6.3
+
+* Use `pumpEventQueue()` from test.
+
+## 1.6.2
+
+* Declare support for `async` 2.0.0.
+
+## 1.6.1
+
+* Fix the type of `StreamChannel.transform()`. This previously inverted the
+ generic parameters, so it only really worked with transformers where both
+ generic types were identical.
+
+## 1.6.0
+
+* `Disconnector.disconnect()` now returns a future that completes when all the
+ inner `StreamSink.close()` futures have completed.
+
+## 1.5.0
+
+* Add `new StreamChannel.withCloseGuarantee()` to provide the specific guarantee
+ that closing the sink causes the stream to close before it emits any more
+ events. This is the only guarantee that isn't automatically preserved when
+ transforming a channel.
+
+* `StreamChannelTransformer`s provided by the `stream_channel` package now
+ properly provide the guarantee that closing the sink causes the stream to
+ close before it emits any more events.
+
+## 1.4.0
+
+* Add `StreamChannel.cast()`, which soundly coerces the generic type of a
+ channel.
+
+* Add `StreamChannelTransformer.typed()`, which soundly coerces the generic type
+ of a transformer.
+
+## 1.3.2
+
+* Fix all strong-mode errors and warnings.
+
+## 1.3.1
+
+* Make `IsolateChannel` slightly more efficient.
+
+* Make `MultiChannel` follow the stream channel rules.
+
+## 1.3.0
+
+* Add `Disconnector`, a transformer that allows the caller to disconnect the
+ transformed channel.
+
+## 1.2.0
+
+* Add `new StreamChannel.withGuarantees()`, which creates a channel with extra
+ wrapping to ensure that it obeys the stream channel guarantees.
+
+* Add `StreamChannelController`, which can be used to create custom
+ `StreamChannel` objects.
+
+## 1.1.1
+
+* Fix the type annotation for `StreamChannel.transform()`'s parameter.
+
+## 1.1.0
+
+* Add `StreamChannel.transformStream()`, `StreamChannel.transformSink()`,
+ `StreamChannel.changeStream()`, and `StreamChannel.changeSink()` to support
+ changing only the stream or only the sink of a channel.
+
+* Be more explicit about `JsonDocumentTransformer`'s error-handling behavior.
+
+## 1.0.1
+
+* Fix `MultiChannel`'s constructor to take a `StreamChannel`. This is
+ technically a breaking change, but since 1.0.0 was only released an hour ago,
+ we're treating it as a bug fix.
+
+## 1.0.0
+
+* Initial version
diff --git a/pkgs/stream_channel/LICENSE b/pkgs/stream_channel/LICENSE
new file mode 100644
index 0000000..dbd2843
--- /dev/null
+++ b/pkgs/stream_channel/LICENSE
@@ -0,0 +1,27 @@
+Copyright 2015, the Dart project authors.
+
+Redistribution and use in source and binary forms, with or without
+modification, are permitted provided that the following conditions are
+met:
+
+ * Redistributions of source code must retain the above copyright
+ notice, this list of conditions and the following disclaimer.
+ * Redistributions in binary form must reproduce the above
+ copyright notice, this list of conditions and the following
+ disclaimer in the documentation and/or other materials provided
+ with the distribution.
+ * Neither the name of Google LLC nor the names of its
+ contributors may be used to endorse or promote products derived
+ from this software without specific prior written permission.
+
+THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
diff --git a/pkgs/stream_channel/README.md b/pkgs/stream_channel/README.md
new file mode 100644
index 0000000..3677ccf
--- /dev/null
+++ b/pkgs/stream_channel/README.md
@@ -0,0 +1,20 @@
+[](https://github.com/dart-lang/tools/actions/workflows/stream_channel.yaml)
+[](https://pub.dev/packages/stream_channel)
+[](https://pub.dev/packages/stream_channel/publisher)
+
+This package exposes the `StreamChannel` interface, which represents a two-way
+communication channel. Each `StreamChannel` exposes a `Stream` for receiving
+data and a `StreamSink` for sending it.
+
+`StreamChannel` helps abstract communication logic away from the underlying
+protocol. For example, the [`test`][test] package re-uses its test suite
+communication protocol for both WebSocket connections to browser suites and
+Isolate connections to VM tests.
+
+[test]: https://pub.dev/packages/test
+
+This package also contains utilities for dealing with `StreamChannel`s and with
+two-way communications in general. For documentation of these utilities, see
+[the API docs][api].
+
+[api]: https://pub.dev/documentation/stream_channel/latest/
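
As a rough sketch (not part of the original README), the simplest way to get a pair of connected `StreamChannel`s from this package is a `StreamChannelController`; the variable names are illustrative only.

```dart
import 'package:stream_channel/stream_channel.dart';

void main() {
  // `local` and `foreign` are two connected channels: anything added to one
  // channel's sink comes out of the other channel's stream.
  var controller = StreamChannelController<String>();

  controller.foreign.stream.listen((message) => print('got: $message'));
  controller.local.sink.add('hello');
  controller.local.sink.close();
}
```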
diff --git a/pkgs/stream_channel/analysis_options.yaml b/pkgs/stream_channel/analysis_options.yaml
new file mode 100644
index 0000000..44cda4d
--- /dev/null
+++ b/pkgs/stream_channel/analysis_options.yaml
@@ -0,0 +1,5 @@
+include: package:dart_flutter_team_lints/analysis_options.yaml
+
+analyzer:
+ language:
+ strict-casts: true
diff --git a/pkgs/stream_channel/example/example.dart b/pkgs/stream_channel/example/example.dart
new file mode 100644
index 0000000..b41d8d9
--- /dev/null
+++ b/pkgs/stream_channel/example/example.dart
@@ -0,0 +1,110 @@
+// Copyright (c) 2023, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+import 'dart:convert';
+import 'dart:io';
+import 'dart:isolate';
+
+import 'package:stream_channel/isolate_channel.dart';
+import 'package:stream_channel/stream_channel.dart';
+
+Future<void> main() async {
+ // A StreamChannel<T> is, in simplest terms, a wrapper around a Stream<T> and
+ // a StreamSink<T>. For example, you can create a channel that wraps standard
+ // IO:
+ var stdioChannel = StreamChannel(stdin, stdout);
+ stdioChannel.sink.add('Hello!\n'.codeUnits);
+
+ // Like a Stream<T> can be transformed with a StreamTransformer<T>, a
+ // StreamChannel<T> can be transformed with a StreamChannelTransformer<T>.
+ // For example, we can handle standard input as strings:
+ var stringChannel = stdioChannel
+ .transform(StreamChannelTransformer.fromCodec(utf8))
+ .transformStream(const LineSplitter());
+ stringChannel.sink.add('world!\n');
+
+ // You can implement StreamChannel<T> by extending StreamChannelMixin<T>, but
+ // it's much easier to use a StreamChannelController<T>. A controller has two
+ // StreamChannel<T> members: `local` and `foreign`. The creator of a
+ // controller should work with the `local` channel, while the recipient should
+ // work with the `foreign` channel, and usually will not have direct access to
+ // the underlying controller.
+ var ctrl = StreamChannelController<String>();
+ ctrl.local.stream.listen((event) {
+ // Do something useful here...
+ });
+
+ // You can also pipe events from one channel to another.
+ ctrl
+ ..foreign.pipe(stringChannel)
+ ..local.sink.add('Piped!\n');
+ await ctrl.local.sink.close();
+
+ // The StreamChannel<T> interface provides several guarantees, which can be
+ // found here:
+ // https://pub.dev/documentation/stream_channel/latest/stream_channel/StreamChannel-class.html
+ //
+ // By calling `StreamChannel<T>.withGuarantees()`, you can create a
+ // StreamChannel<T> that provides all guarantees.
+ var dummyCtrl0 = StreamChannelController<String>();
+ var guaranteedChannel = StreamChannel.withGuarantees(
+ dummyCtrl0.foreign.stream, dummyCtrl0.foreign.sink);
+
+ // To close a StreamChannel, use `sink.close()`.
+ await guaranteedChannel.sink.close();
+
+ // A MultiChannel<T> multiplexes multiple virtual channels across a single
+ // underlying transport layer. For example, an application listening over
+ // standard I/O can still support multiple clients if it has a mechanism to
+ // separate events from different clients.
+ //
+ // A MultiChannel<T> splits events into numbered channels, which are
+ // instances of VirtualChannel<T>.
+ var dummyCtrl1 = StreamChannelController<String>();
+ var multiChannel = MultiChannel<String>(dummyCtrl1.foreign);
+ var channel1 = multiChannel.virtualChannel();
+ await multiChannel.sink.close();
+
+ // The client/peer should also create its own MultiChannel<T>, connect it to
+ // the underlying transport, and use the corresponding IDs to handle events
+ // in their respective channels. It is up to you how to communicate channel
+ // IDs across different endpoints.
+ var dummyCtrl2 = StreamChannelController<String>();
+ var multiChannel2 = MultiChannel<String>(dummyCtrl2.foreign);
+ var channel2 = multiChannel2.virtualChannel(channel1.id);
+ await channel2.sink.close();
+ await multiChannel2.sink.close();
+
+ // Multiple instances of a Dart application can communicate easily across
+ // `SendPort`/`ReceivePort` pairs by means of the `IsolateChannel<T>` class.
+ // Typically, one endpoint will create a `ReceivePort`, and call the
+ // `IsolateChannel.connectReceive` constructor. The other endpoint will be
+ // given the corresponding `SendPort`, and then call
+ // `IsolateChannel.connectSend`.
+ var recv = ReceivePort();
+ var recvChannel = IsolateChannel<void>.connectReceive(recv);
+ var sendChannel = IsolateChannel<void>.connectSend(recv.sendPort);
+
+ // You must manually close `IsolateChannel<T>` sinks, however.
+ await recvChannel.sink.close();
+ await sendChannel.sink.close();
+
+ // You can use the `Disconnector` transformer to cause a channel to act as
+ // though the remote end of its transport had disconnected.
+ var disconnector = Disconnector<String>();
+ var disconnectable = stringChannel.transform(disconnector);
+ disconnectable.sink.add('Still connected!');
+ await disconnector.disconnect();
+
+ // Additionally:
+ // * The `DelegatingStreamController<T>` class can be extended to build a
+ // basis for wrapping other `StreamChannel<T>` objects.
+ // * The `jsonDocument` transformer converts events to/from JSON, using
+ // the `json` codec from `dart:convert`.
+ // * `package:json_rpc_2` directly builds on top of
+ // `package:stream_channel`, so any compatible transport can be used to
+ // create interactive client/server or peer-to-peer applications (e.g.
+ // language servers, microservices, etc.).
+}
diff --git a/pkgs/stream_channel/lib/isolate_channel.dart b/pkgs/stream_channel/lib/isolate_channel.dart
new file mode 100644
index 0000000..5d9f6e1
--- /dev/null
+++ b/pkgs/stream_channel/lib/isolate_channel.dart
@@ -0,0 +1,5 @@
+// Copyright (c) 2019, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+export 'src/isolate_channel.dart' show IsolateChannel;
diff --git a/pkgs/stream_channel/lib/src/close_guarantee_channel.dart b/pkgs/stream_channel/lib/src/close_guarantee_channel.dart
new file mode 100644
index 0000000..13432d1
--- /dev/null
+++ b/pkgs/stream_channel/lib/src/close_guarantee_channel.dart
@@ -0,0 +1,91 @@
+// Copyright (c) 2016, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'package:async/async.dart';
+
+import '../stream_channel.dart';
+
+/// A [StreamChannel] that specifically enforces the stream channel guarantee
+/// that closing the sink causes the stream to close before it emits any more
+/// events.
+///
+/// This is exposed via [StreamChannel.withCloseGuarantee].
+class CloseGuaranteeChannel<T> extends StreamChannelMixin<T> {
+ @override
+ Stream<T> get stream => _stream;
+ late final _CloseGuaranteeStream<T> _stream;
+
+ @override
+ StreamSink<T> get sink => _sink;
+ late final _CloseGuaranteeSink<T> _sink;
+
+ /// The subscription to the inner stream.
+ StreamSubscription<T>? _subscription;
+
+ /// Whether the sink has closed, causing the underlying channel to disconnect.
+ bool _disconnected = false;
+
+ CloseGuaranteeChannel(Stream<T> innerStream, StreamSink<T> innerSink) {
+ _sink = _CloseGuaranteeSink<T>(innerSink, this);
+ _stream = _CloseGuaranteeStream<T>(innerStream, this);
+ }
+}
+
+/// The stream for [CloseGuaranteeChannel].
+///
+/// This wraps the inner stream to save the subscription on the channel when
+/// [listen] is called.
+class _CloseGuaranteeStream<T> extends Stream<T> {
+ /// The inner stream this is delegating to.
+ final Stream<T> _inner;
+
+ /// The [CloseGuaranteeChannel] this belongs to.
+ final CloseGuaranteeChannel<T> _channel;
+
+ _CloseGuaranteeStream(this._inner, this._channel);
+
+ @override
+ StreamSubscription<T> listen(void Function(T)? onData,
+ {Function? onError, void Function()? onDone, bool? cancelOnError}) {
+ // If the channel is already disconnected, we shouldn't dispatch anything
+ // but a done event.
+ if (_channel._disconnected) {
+ onData = null;
+ onError = null;
+ }
+
+ var subscription = _inner.listen(onData,
+ onError: onError, onDone: onDone, cancelOnError: cancelOnError);
+ if (!_channel._disconnected) {
+ _channel._subscription = subscription;
+ }
+ return subscription;
+ }
+}
+
+/// The sink for [CloseGuaranteeChannel].
+///
+/// This wraps the inner sink so that closing it stops the stream subscription
+/// from dispatching any further data or error events.
+class _CloseGuaranteeSink<T> extends DelegatingStreamSink<T> {
+ /// The [CloseGuaranteeChannel] this belongs to.
+ final CloseGuaranteeChannel<T> _channel;
+
+ _CloseGuaranteeSink(super.inner, this._channel);
+
+ @override
+ Future<void> close() {
+ var done = super.close();
+ _channel._disconnected = true;
+ var subscription = _channel._subscription;
+ if (subscription != null) {
+ // Don't dispatch anything but a done event.
+ subscription.onData(null);
+ subscription.onError(null);
+ }
+ return done;
+ }
+}
diff --git a/pkgs/stream_channel/lib/src/delegating_stream_channel.dart b/pkgs/stream_channel/lib/src/delegating_stream_channel.dart
new file mode 100644
index 0000000..4484a59
--- /dev/null
+++ b/pkgs/stream_channel/lib/src/delegating_stream_channel.dart
@@ -0,0 +1,23 @@
+// Copyright (c) 2016, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import '../stream_channel.dart';
+
+/// A simple delegating wrapper around [StreamChannel].
+///
+/// Subclasses can override individual methods, or use this to expose only
+/// [StreamChannel] methods.
+class DelegatingStreamChannel<T> extends StreamChannelMixin<T> {
+ /// The inner channel to which methods are forwarded.
+ final StreamChannel<T> _inner;
+
+ @override
+ Stream<T> get stream => _inner.stream;
+ @override
+ StreamSink<T> get sink => _inner.sink;
+
+ DelegatingStreamChannel(this._inner);
+}
diff --git a/pkgs/stream_channel/lib/src/disconnector.dart b/pkgs/stream_channel/lib/src/disconnector.dart
new file mode 100644
index 0000000..3414e9c
--- /dev/null
+++ b/pkgs/stream_channel/lib/src/disconnector.dart
@@ -0,0 +1,153 @@
+// Copyright (c) 2016, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'package:async/async.dart';
+
+import '../stream_channel.dart';
+
+/// Allows the caller to force a channel to disconnect.
+///
+/// When [disconnect] is called, the channel (or channels) transformed by this
+/// transformer will act as though the remote end had disconnected—the stream
+/// will emit a done event, and the sink will ignore future inputs. The inner
+/// sink will also be closed to notify the remote end of the disconnection.
+///
+/// If a channel is transformed after the [disconnect] has been called, it will
+/// be disconnected immediately.
+class Disconnector<T> implements StreamChannelTransformer<T, T> {
+ /// Whether [disconnect] has been called.
+ bool get isDisconnected => _disconnectMemo.hasRun;
+
+ /// The sinks for transformed channels.
+ ///
+ /// Note that we assume that transformed channels provide the stream channel
+ /// guarantees. This allows us to only track sinks, because we know closing
+ /// the underlying sink will cause the stream to emit a done event.
+ final _sinks = <_DisconnectorSink<T>>[];
+
+ /// Disconnects all channels that have been transformed.
+ ///
+ /// Returns a future that completes when all inner sinks' [StreamSink.close]
+ /// futures have completed. Note that a [StreamController]'s sink won't close
+ /// until the corresponding stream has a listener.
+ Future<void> disconnect() => _disconnectMemo.runOnce(() {
+ var futures = _sinks.map((sink) => sink._disconnect()).toList();
+ _sinks.clear();
+ return Future.wait(futures, eagerError: true);
+ });
+ final _disconnectMemo = AsyncMemoizer<List<void>>();
+
+ @override
+ StreamChannel<T> bind(StreamChannel<T> channel) {
+ return channel.changeSink((innerSink) {
+ var sink = _DisconnectorSink<T>(innerSink);
+
+ if (isDisconnected) {
+ // Ignore errors here, because otherwise there would be no way for the
+ // user to handle them gracefully.
+ sink._disconnect().catchError((_) {});
+ } else {
+ _sinks.add(sink);
+ }
+
+ return sink;
+ });
+ }
+}
+
+/// A sink wrapper that can force a disconnection.
+class _DisconnectorSink<T> implements StreamSink<T> {
+ /// The inner sink.
+ final StreamSink<T> _inner;
+
+ @override
+ Future<void> get done => _inner.done;
+
+ /// Whether [Disconnector.disconnect] has been called.
+ var _isDisconnected = false;
+
+ /// Whether the user has called [close].
+ var _closed = false;
+
+ /// The subscription to the stream passed to [addStream], if a stream is
+ /// currently being added.
+ StreamSubscription<T>? _addStreamSubscription;
+
+ /// The completer for the future returned by [addStream], if a stream is
+ /// currently being added.
+ Completer? _addStreamCompleter;
+
+ /// Whether we're currently adding a stream with [addStream].
+ bool get _inAddStream => _addStreamSubscription != null;
+
+ _DisconnectorSink(this._inner);
+
+ @override
+ void add(T data) {
+ if (_closed) throw StateError('Cannot add event after closing.');
+ if (_inAddStream) {
+ throw StateError('Cannot add event while adding stream.');
+ }
+ if (_isDisconnected) return;
+
+ _inner.add(data);
+ }
+
+ @override
+ void addError(Object error, [StackTrace? stackTrace]) {
+ if (_closed) throw StateError('Cannot add event after closing.');
+ if (_inAddStream) {
+ throw StateError('Cannot add event while adding stream.');
+ }
+ if (_isDisconnected) return;
+
+ _inner.addError(error, stackTrace);
+ }
+
+ @override
+ Future<void> addStream(Stream<T> stream) {
+ if (_closed) throw StateError('Cannot add stream after closing.');
+ if (_inAddStream) {
+ throw StateError('Cannot add stream while adding stream.');
+ }
+ if (_isDisconnected) return Future.value();
+
+ _addStreamCompleter = Completer.sync();
+ _addStreamSubscription = stream.listen(_inner.add,
+ onError: _inner.addError, onDone: _addStreamCompleter!.complete);
+ return _addStreamCompleter!.future.then((_) {
+ _addStreamCompleter = null;
+ _addStreamSubscription = null;
+ });
+ }
+
+ @override
+ Future<void> close() {
+ if (_inAddStream) {
+ throw StateError('Cannot close sink while adding stream.');
+ }
+
+ _closed = true;
+ return _inner.close();
+ }
+
+ /// Disconnects this sink.
+ ///
+ /// This closes the underlying sink and stops forwarding events. It returns
+ /// the [StreamSink.close] future for the underlying sink.
+ Future<void> _disconnect() {
+ _isDisconnected = true;
+ var future = _inner.close();
+
+ if (_inAddStream) {
+ _addStreamCompleter!.complete(_addStreamSubscription!.cancel());
+ _addStreamCompleter = null;
+ _addStreamSubscription = null;
+ }
+
+ return future;
+ }
+}
diff --git a/pkgs/stream_channel/lib/src/guarantee_channel.dart b/pkgs/stream_channel/lib/src/guarantee_channel.dart
new file mode 100644
index 0000000..30ebe2e
--- /dev/null
+++ b/pkgs/stream_channel/lib/src/guarantee_channel.dart
@@ -0,0 +1,207 @@
+// Copyright (c) 2016, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'package:async/async.dart';
+
+import '../stream_channel.dart';
+
+/// A [StreamChannel] that enforces the stream channel guarantees.
+///
+/// This is exposed via [StreamChannel.withGuarantees].
+class GuaranteeChannel<T> extends StreamChannelMixin<T> {
+ @override
+ Stream<T> get stream => _streamController.stream;
+
+ @override
+ StreamSink<T> get sink => _sink;
+ late final _GuaranteeSink<T> _sink;
+
+ /// The controller for [stream].
+ ///
+ /// This intermediate controller allows us to continue listening for a done
+ /// event even after the user has canceled their subscription, and to send our
+ /// own done event when the sink is closed.
+ late final StreamController<T> _streamController;
+
+ /// The subscription to the inner stream.
+ StreamSubscription<T>? _subscription;
+
+ /// Whether the sink has closed, causing the underlying channel to disconnect.
+ bool _disconnected = false;
+
+ GuaranteeChannel(Stream<T> innerStream, StreamSink<T> innerSink,
+ {bool allowSinkErrors = true}) {
+ _sink = _GuaranteeSink<T>(innerSink, this, allowErrors: allowSinkErrors);
+
+ // Enforce the single-subscription guarantee by changing a broadcast stream
+ // to single-subscription.
+ if (innerStream.isBroadcast) {
+ innerStream =
+ innerStream.transform(SingleSubscriptionTransformer<T, T>());
+ }
+
+ _streamController = StreamController<T>(
+ onListen: () {
+ // If the sink has disconnected, we've already called
+ // [_streamController.close].
+ if (_disconnected) return;
+
+ _subscription = innerStream.listen(_streamController.add,
+ onError: _streamController.addError, onDone: () {
+ _sink._onStreamDisconnected();
+ _streamController.close();
+ });
+ },
+ sync: true);
+ }
+
+ /// Called by [_GuaranteeSink] when the user closes it.
+ ///
+ /// The sink closing indicates that the connection is closed, so the stream
+ /// should stop emitting events.
+ void _onSinkDisconnected() {
+ _disconnected = true;
+ var subscription = _subscription;
+ if (subscription != null) subscription.cancel();
+ _streamController.close();
+ }
+}
+
+/// The sink for [GuaranteeChannel].
+///
+/// This wraps the inner sink to ignore events and cancel any in-progress
+/// [addStream] calls when the underlying channel closes.
+class _GuaranteeSink<T> implements StreamSink<T> {
+ /// The inner sink being wrapped.
+ final StreamSink<T> _inner;
+
+ /// The [GuaranteeChannel] this belongs to.
+ final GuaranteeChannel<T> _channel;
+
+ @override
+ Future<void> get done => _doneCompleter.future;
+ final _doneCompleter = Completer<void>();
+
+ /// Whether connection is disconnected.
+ ///
+ /// This can happen because the stream has emitted a done event, or because
+ /// the user added an error when [_allowErrors] is `false`.
+ bool _disconnected = false;
+
+ /// Whether the user has called [close].
+ bool _closed = false;
+
+ /// The subscription to the stream passed to [addStream], if a stream is
+ /// currently being added.
+ StreamSubscription<T>? _addStreamSubscription;
+
+ /// The completer for the future returned by [addStream], if a stream is
+ /// currently being added.
+ Completer? _addStreamCompleter;
+
+ /// Whether we're currently adding a stream with [addStream].
+ bool get _inAddStream => _addStreamSubscription != null;
+
+ /// Whether errors are passed on to the underlying sink.
+ ///
+ /// If this is `false`, any error passed to the sink is piped to [done] and
+ /// the underlying sink is closed.
+ final bool _allowErrors;
+
+ _GuaranteeSink(this._inner, this._channel, {bool allowErrors = true})
+ : _allowErrors = allowErrors;
+
+ @override
+ void add(T data) {
+ if (_closed) throw StateError('Cannot add event after closing.');
+ if (_inAddStream) {
+ throw StateError('Cannot add event while adding stream.');
+ }
+ if (_disconnected) return;
+
+ _inner.add(data);
+ }
+
+ @override
+ void addError(Object error, [StackTrace? stackTrace]) {
+ if (_closed) throw StateError('Cannot add event after closing.');
+ if (_inAddStream) {
+ throw StateError('Cannot add event while adding stream.');
+ }
+ if (_disconnected) return;
+
+ _addError(error, stackTrace);
+ }
+
+ /// Like [addError], but doesn't check to ensure that an error can be added.
+ ///
+ /// This is called from [addStream], so it shouldn't fail if a stream is being
+ /// added.
+ void _addError(Object error, [StackTrace? stackTrace]) {
+ if (_allowErrors) {
+ _inner.addError(error, stackTrace);
+ return;
+ }
+
+ _doneCompleter.completeError(error, stackTrace);
+
+ // Treat an error like both the stream and sink disconnecting.
+ _onStreamDisconnected();
+ _channel._onSinkDisconnected();
+
+    // Ignore errors from the inner sink. We're already surfacing one error,
+    // and if the user handles it we don't want them to have another
+    // top-level error.
+ _inner.close().catchError((_) {});
+ }
+
+ @override
+ Future<void> addStream(Stream<T> stream) {
+ if (_closed) throw StateError('Cannot add stream after closing.');
+ if (_inAddStream) {
+ throw StateError('Cannot add stream while adding stream.');
+ }
+ if (_disconnected) return Future.value();
+
+ _addStreamCompleter = Completer.sync();
+ _addStreamSubscription = stream.listen(_inner.add,
+ onError: _addError, onDone: _addStreamCompleter!.complete);
+ return _addStreamCompleter!.future.then((_) {
+ _addStreamCompleter = null;
+ _addStreamSubscription = null;
+ });
+ }
+
+ @override
+ Future<void> close() {
+ if (_inAddStream) {
+ throw StateError('Cannot close sink while adding stream.');
+ }
+
+ if (_closed) return done;
+ _closed = true;
+
+ if (!_disconnected) {
+ _channel._onSinkDisconnected();
+ _doneCompleter.complete(_inner.close());
+ }
+
+ return done;
+ }
+
+ /// Called by [GuaranteeChannel] when the stream emits a done event.
+ ///
+ /// The stream being done indicates that the connection is closed, so the
+ /// sink should stop forwarding events.
+ void _onStreamDisconnected() {
+ _disconnected = true;
+ if (!_doneCompleter.isCompleted) _doneCompleter.complete();
+
+ if (!_inAddStream) return;
+ _addStreamCompleter!.complete(_addStreamSubscription!.cancel());
+ _addStreamCompleter = null;
+ _addStreamSubscription = null;
+ }
+}
diff --git a/pkgs/stream_channel/lib/src/isolate_channel.dart b/pkgs/stream_channel/lib/src/isolate_channel.dart
new file mode 100644
index 0000000..15c68a4
--- /dev/null
+++ b/pkgs/stream_channel/lib/src/isolate_channel.dart
@@ -0,0 +1,115 @@
+// Copyright (c) 2016, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+import 'dart:isolate';
+
+import 'package:async/async.dart';
+
+import '../stream_channel.dart';
+
+/// A [StreamChannel] that communicates over a [ReceivePort]/[SendPort] pair,
+/// presumably with another isolate.
+///
+/// The remote endpoint doesn't necessarily need to be running an
+/// [IsolateChannel]. This can be used with any two ports, although the
+/// [StreamChannel] semantics mean that this class will treat them as being
+/// paired (for example, closing the [sink] will cause the [stream] to stop
+/// emitting events).
+///
+/// The underlying isolate ports have no notion of closing connections. This
+/// means that [stream] won't close unless [sink] is closed, and that closing
+/// [sink] won't cause the remote endpoint to close. Users should take care to
+/// ensure that they always close the [sink] of every [IsolateChannel] they use
+/// to avoid leaving dangling [ReceivePort]s.
+class IsolateChannel<T> extends StreamChannelMixin<T> {
+ @override
+ final Stream<T> stream;
+ @override
+ final StreamSink<T> sink;
+
+ /// Connects to a remote channel that was created with
+ /// [IsolateChannel.connectSend].
+ ///
+ /// These constructors establish a connection using only a single
+ /// [SendPort]/[ReceivePort] pair, as long as each side uses one of the
+ /// connect constructors.
+ ///
+ /// The connection protocol is guaranteed to remain compatible across versions
+ /// at least until the next major version release. If the protocol is
+  /// violated, the resulting channel will emit a single error on its stream
+  /// and then close.
+ factory IsolateChannel.connectReceive(ReceivePort receivePort) {
+ // We can't use a [StreamChannelCompleter] here because we need the return
+ // value to be an [IsolateChannel].
+ var isCompleted = false;
+ var streamCompleter = StreamCompleter<T>();
+ var sinkCompleter = StreamSinkCompleter<T>();
+
+ var channel = IsolateChannel<T>._(streamCompleter.stream, sinkCompleter.sink
+ .transform(StreamSinkTransformer.fromHandlers(handleDone: (sink) {
+ if (!isCompleted) {
+ receivePort.close();
+ streamCompleter.setSourceStream(const Stream.empty());
+ sinkCompleter.setDestinationSink(NullStreamSink<T>());
+ }
+ sink.close();
+ })));
+
+ // The first message across the ReceivePort should be a SendPort pointing to
+ // the remote end. If it's not, we'll make the stream emit an error
+ // complaining.
+ late StreamSubscription<dynamic> subscription;
+ subscription = receivePort.listen((message) {
+ isCompleted = true;
+ if (message is SendPort) {
+ var controller =
+ StreamChannelController<T>(allowForeignErrors: false, sync: true);
+ SubscriptionStream(subscription).cast<T>().pipe(controller.local.sink);
+ controller.local.stream
+ .listen((data) => message.send(data), onDone: receivePort.close);
+
+ streamCompleter.setSourceStream(controller.foreign.stream);
+ sinkCompleter.setDestinationSink(controller.foreign.sink);
+ return;
+ }
+
+ streamCompleter.setError(
+ StateError('Unexpected Isolate response "$message".'),
+ StackTrace.current);
+ sinkCompleter.setDestinationSink(NullStreamSink<T>());
+ subscription.cancel();
+ });
+
+ return channel;
+ }
+
+ /// Connects to a remote channel that was created with
+ /// [IsolateChannel.connectReceive].
+ ///
+ /// These constructors establish a connection using only a single
+ /// [SendPort]/[ReceivePort] pair, as long as each side uses one of the
+ /// connect constructors.
+ ///
+ /// The connection protocol is guaranteed to remain compatible across versions
+ /// at least until the next major version release.
+ factory IsolateChannel.connectSend(SendPort sendPort) {
+ var receivePort = ReceivePort();
+ sendPort.send(receivePort.sendPort);
+ return IsolateChannel(receivePort, sendPort);
+ }
+
+ /// Creates a stream channel that receives messages from [receivePort] and
+ /// sends them over [sendPort].
+ factory IsolateChannel(ReceivePort receivePort, SendPort sendPort) {
+ var controller =
+ StreamChannelController<T>(allowForeignErrors: false, sync: true);
+ receivePort.cast<T>().pipe(controller.local.sink);
+ controller.local.stream
+ .listen((data) => sendPort.send(data), onDone: receivePort.close);
+ return IsolateChannel._(controller.foreign.stream, controller.foreign.sink);
+ }
+
+ IsolateChannel._(this.stream, this.sink);
+}
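
For context on the connect constructors above, here is a minimal usage sketch (not part of the diff) of using them across a real isolate boundary. The `echoMain` entry point is made up for illustration; closing both sinks is what lets each isolate's `ReceivePort` shut down.

```dart
import 'dart:isolate';

import 'package:stream_channel/isolate_channel.dart';

// Hypothetical entry point for the spawned isolate: it connects with the
// SendPort it was handed and echoes each message back.
void echoMain(SendPort sendPort) {
  var channel = IsolateChannel<String>.connectSend(sendPort);
  channel.stream.listen((message) {
    channel.sink.add('echo: $message');
    // Closing this sink closes the spawned isolate's ReceivePort, so the
    // isolate can exit once it has replied.
    channel.sink.close();
  });
}

Future<void> main() async {
  // The spawning side creates the ReceivePort and connects to it.
  var receivePort = ReceivePort();
  var channel = IsolateChannel<String>.connectReceive(receivePort);
  await Isolate.spawn(echoMain, receivePort.sendPort);

  channel.sink.add('hello');
  print(await channel.stream.first); // echo: hello
  await channel.sink.close();
}
```
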
diff --git a/pkgs/stream_channel/lib/src/json_document_transformer.dart b/pkgs/stream_channel/lib/src/json_document_transformer.dart
new file mode 100644
index 0000000..3feda43
--- /dev/null
+++ b/pkgs/stream_channel/lib/src/json_document_transformer.dart
@@ -0,0 +1,35 @@
+// Copyright (c) 2016, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:convert';
+
+import 'package:async/async.dart';
+
+import '../stream_channel.dart';
+
+/// A [StreamChannelTransformer] that transforms JSON documents—strings that
+/// contain individual objects encoded as JSON—into decoded Dart objects.
+///
+/// This decodes JSON that's emitted by the transformed channel's stream, and
+/// encodes objects so that JSON is passed to the transformed channel's sink.
+///
+/// If the transformed channel emits invalid JSON, this emits a
+/// [FormatException]. If an unencodable object is added to the sink, it
+/// synchronously throws a [JsonUnsupportedObjectError].
+final StreamChannelTransformer<Object?, String> jsonDocument =
+ const _JsonDocument();
+
+class _JsonDocument implements StreamChannelTransformer<Object?, String> {
+ const _JsonDocument();
+
+ @override
+ StreamChannel<Object?> bind(StreamChannel<String> channel) {
+ var stream = channel.stream.map(jsonDecode);
+ var sink = StreamSinkTransformer<Object, String>.fromHandlers(
+ handleData: (data, sink) {
+ sink.add(jsonEncode(data));
+ }).bind(channel.sink);
+ return StreamChannel.withCloseGuarantee(stream, sink);
+ }
+}
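
A minimal usage sketch of `jsonDocument` (illustrative only, not part of the diff), wired to an in-memory `StreamChannelController` in place of a real string transport:

```dart
import 'package:stream_channel/stream_channel.dart';

void main() {
  var controller = StreamChannelController<String>();
  var jsonChannel = controller.foreign.transform(jsonDocument);

  // Decoded Dart objects come out of the transformed stream...
  jsonChannel.stream.listen(print); // {greeting: hello}
  controller.local.sink.add('{"greeting": "hello"}');

  // ...and objects added to the transformed sink are encoded as JSON strings.
  controller.local.stream.listen(print); // {"answer":42}
  jsonChannel.sink.add({'answer': 42});
}
```
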
diff --git a/pkgs/stream_channel/lib/src/multi_channel.dart b/pkgs/stream_channel/lib/src/multi_channel.dart
new file mode 100644
index 0000000..4894239
--- /dev/null
+++ b/pkgs/stream_channel/lib/src/multi_channel.dart
@@ -0,0 +1,274 @@
+// Copyright (c) 2016, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'package:async/async.dart';
+
+import '../stream_channel.dart';
+
+/// A class that multiplexes multiple virtual channels across a single
+/// underlying transport layer.
+///
+/// This should be connected to another [MultiChannel] on the other end of the
+/// underlying channel. It starts with a single default virtual channel,
+/// accessible via [stream] and [sink]. Additional virtual channels can be
+/// created with [virtualChannel].
+///
+/// When a virtual channel is created by one endpoint, the other must connect to
+/// it before messages may be sent through it. The first endpoint passes its
+/// [VirtualChannel.id] to the second, which then creates a channel from that id
+/// also using [virtualChannel]. For example:
+///
+/// ```dart
+/// // First endpoint
+/// var virtual = multiChannel.virtualChannel();
+/// multiChannel.sink.add({
+/// "channel": virtual.id
+/// });
+///
+/// // Second endpoint
+/// multiChannel.stream.listen((message) {
+/// var virtual = multiChannel.virtualChannel(message["channel"]);
+/// // ...
+/// });
+/// ```
+///
+/// Sending errors across a [MultiChannel] is not supported. Any errors from the
+/// underlying stream will be reported only via the default
+/// [MultiChannel.stream].
+///
+/// Each virtual channel may be closed individually. When all of them are
+/// closed, the underlying [StreamSink] is closed automatically.
+abstract class MultiChannel<T> implements StreamChannel<T> {
+ /// The default input stream.
+ ///
+ /// This connects to the remote [sink].
+ @override
+ Stream<T> get stream;
+
+ /// The default output stream.
+ ///
+ /// This connects to the remote [stream]. If this is closed, the remote
+ /// [stream] will close, but other virtual channels will remain open and new
+ /// virtual channels may be opened.
+ @override
+ StreamSink<T> get sink;
+
+ /// Creates a new [MultiChannel] that sends and receives messages over
+ /// [inner].
+ ///
+ /// The inner channel must take JSON-like objects.
+ factory MultiChannel(StreamChannel<dynamic> inner) => _MultiChannel<T>(inner);
+
+ /// Creates a new virtual channel.
+ ///
+ /// If [id] is not passed, this creates a virtual channel from scratch. Before
+ /// it's used, its [VirtualChannel.id] must be sent to the remote endpoint
+ /// where [virtualChannel] should be called with that id.
+ ///
+ /// If [id] is passed, this creates a virtual channel corresponding to the
+ /// channel with that id on the remote channel.
+ ///
+ /// Throws an [ArgumentError] if a virtual channel already exists for [id].
+ /// Throws a [StateError] if the underlying channel is closed.
+ VirtualChannel<T> virtualChannel([int? id]);
+}
+
+/// The implementation of [MultiChannel].
+///
+/// This is private so that [VirtualChannel] can inherit from [MultiChannel]
+/// without having to implement all the private members.
+class _MultiChannel<T> extends StreamChannelMixin<T>
+ implements MultiChannel<T> {
+ /// The inner channel over which all communication is conducted.
+ ///
+ /// This will be `null` if the underlying communication channel is closed.
+ StreamChannel<dynamic>? _inner;
+
+ /// The subscription to [_inner].stream.
+ StreamSubscription<dynamic>? _innerStreamSubscription;
+
+ @override
+ Stream<T> get stream => _mainController.foreign.stream;
+ @override
+ StreamSink<T> get sink => _mainController.foreign.sink;
+
+ /// The controller for this channel.
+ final _mainController = StreamChannelController<T>(sync: true);
+
+ /// A map from input IDs to [StreamChannelController]s that should be used to
+ /// communicate over those channels.
+ final _controllers = <int, StreamChannelController<T>>{};
+
+ /// Input IDs of controllers in [_controllers] that we've received messages
+ /// for but that have not yet had a local [virtualChannel] created.
+ final _pendingIds = <int>{};
+
+ /// Input IDs of virtual channels that used to exist but have since been
+ /// closed.
+ final _closedIds = <int>{};
+
+ /// The next id to use for a local virtual channel.
+ ///
+ /// Ids are used to identify virtual channels. Each message is tagged with an
+ /// id; the receiving [MultiChannel] uses this id to look up which
+ /// [VirtualChannel] the message should be dispatched to.
+ ///
+ /// The id scheme for virtual channels is somewhat complicated. This is
+ /// necessary to ensure that there are no conflicts even when both endpoints
+ /// have virtual channels with the same id; since both endpoints can send and
+ /// receive messages across each virtual channel, a naïve scheme would make it
+ /// impossible to tell whether a message was from a channel that originated in
+ /// the remote endpoint or a reply on a channel that originated in the local
+ /// endpoint.
+ ///
+ /// The trick is that each endpoint only uses odd ids for its own channels.
+ /// When sending a message over a channel that was created by the remote
+ /// endpoint, the channel's id plus one is used. This way each [MultiChannel]
+ /// knows that if an incoming message has an odd id, it's coming from a
+ /// channel that was originally created remotely, but if it has an even id,
+ /// it's coming from a channel that was originally created locally.
+ var _nextId = 1;
+
+ _MultiChannel(StreamChannel<dynamic> inner) : _inner = inner {
+ // The default connection is a special case which has id 0 on both ends.
+ // This allows it to begin connected without having to send over an id.
+ _controllers[0] = _mainController;
+ _mainController.local.stream.listen(
+ (message) => _inner!.sink.add(<Object?>[0, message]),
+ onDone: () => _closeChannel(0, 0));
+
+ _innerStreamSubscription = _inner!.stream.cast<List>().listen((message) {
+ var id = (message[0] as num).toInt();
+
+ // If the channel was closed before an incoming message was processed,
+ // ignore that message.
+ if (_closedIds.contains(id)) return;
+
+ var controller = _controllers.putIfAbsent(id, () {
+ // If we receive a message for a controller that doesn't have a local
+ // counterpart yet, create a controller for it to buffer incoming
+ // messages for when a local connection is created.
+ _pendingIds.add(id);
+ return StreamChannelController(sync: true);
+ });
+
+ if (message.length > 1) {
+ controller.local.sink.add(message[1] as T);
+ } else {
+ // A message without data indicates that the channel has been closed. We
+ // can just close the sink here without doing any more cleanup, because
+ // the sink closing will cause the stream to emit a done event which
+ // will trigger more cleanup.
+ controller.local.sink.close();
+ }
+ },
+ onDone: _closeInnerChannel,
+ onError: _mainController.local.sink.addError);
+ }
+
+ @override
+ VirtualChannel<T> virtualChannel([int? id]) {
+ int inputId;
+ int outputId;
+ if (id != null) {
+ // Since the user is passing in an id, we're connected to a remote
+ // VirtualChannel. This means messages they send over this channel will
+ // have the original odd id, but our replies will have an even id.
+ inputId = id;
+ outputId = id + 1;
+ } else {
+ // Since we're generating an id, we originated this VirtualChannel. This
+ // means messages we send over this channel will have the original odd id,
+ // but the remote channel's replies will have an even id.
+ inputId = _nextId + 1;
+ outputId = _nextId;
+ _nextId += 2;
+ }
+
+ // If the inner channel has already closed, create new virtual channels in a
+ // closed state.
+ if (_inner == null) {
+ return VirtualChannel._(
+ this, inputId, const Stream.empty(), NullStreamSink());
+ }
+
+ late StreamChannelController<T> controller;
+ if (_pendingIds.remove(inputId)) {
+ // If we've already received messages for this channel, use the controller
+ // where those messages are buffered.
+ controller = _controllers[inputId]!;
+ } else if (_controllers.containsKey(inputId) ||
+ _closedIds.contains(inputId)) {
+ throw ArgumentError('A virtual channel with id $id already exists.');
+ } else {
+ controller = StreamChannelController(sync: true);
+ _controllers[inputId] = controller;
+ }
+
+ controller.local.stream.listen(
+ (message) => _inner!.sink.add(<Object?>[outputId, message]),
+ onDone: () => _closeChannel(inputId, outputId));
+ return VirtualChannel._(
+ this, outputId, controller.foreign.stream, controller.foreign.sink);
+ }
+
+ /// Closes the virtual channel for which incoming messages have [inputId] and
+ /// outgoing messages have [outputId].
+ void _closeChannel(int inputId, int outputId) {
+ _closedIds.add(inputId);
+ var controller = _controllers.remove(inputId)!;
+ controller.local.sink.close();
+
+ if (_inner == null) return;
+
+ // A message without data indicates that the virtual channel has been
+ // closed.
+ _inner!.sink.add([outputId]);
+ if (_controllers.isEmpty) _closeInnerChannel();
+ }
+
+ /// Closes the underlying communication channel.
+ void _closeInnerChannel() {
+ _inner!.sink.close();
+ _innerStreamSubscription!.cancel();
+ _inner = null;
+
+ // Convert this to a list because the close is dispatched synchronously, and
+ // that could conceivably remove a controller from [_controllers].
+ for (var controller in _controllers.values.toList(growable: false)) {
+ controller.local.sink.close();
+ }
+ _controllers.clear();
+ }
+}
+
+/// A virtual channel created by [MultiChannel].
+///
+/// This implements [MultiChannel] for convenience.
+/// [VirtualChannel.virtualChannel] is semantically identical to the parent's
+/// [MultiChannel.virtualChannel].
+class VirtualChannel<T> extends StreamChannelMixin<T>
+ implements MultiChannel<T> {
+ /// The [MultiChannel] that created this.
+ final MultiChannel<T> _parent;
+
+ /// The identifier for this channel.
+ ///
+ /// This can be sent across the [MultiChannel] to provide the remote endpoint
+ /// a means to connect to this channel. Nothing about this is guaranteed
+ /// except that it will be JSON-serializable.
+ final int id;
+
+ @override
+ final Stream<T> stream;
+ @override
+ final StreamSink<T> sink;
+
+ VirtualChannel._(this._parent, this.id, this.stream, this.sink);
+
+ @override
+ VirtualChannel<T> virtualChannel([int? id]) => _parent.virtualChannel(id);
+}
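
To make the id handoff described above concrete, here is a small local sketch (not part of the diff) that joins two `MultiChannel`s with an in-memory `StreamChannelController` standing in for a real transport. The id is sent as a string only because the default channel here carries `String` messages.

```dart
import 'package:stream_channel/stream_channel.dart';

void main() {
  var transport = StreamChannelController();
  var multi1 = MultiChannel<String>(transport.local);
  var multi2 = MultiChannel<String>(transport.foreign);

  // Endpoint 1 opens a virtual channel and sends its id over the default
  // channel.
  var virtual1 = multi1.virtualChannel();
  multi1.sink.add('${virtual1.id}');

  // Endpoint 2 connects to the same virtual channel using that id.
  multi2.stream.listen((message) {
    var virtual2 = multi2.virtualChannel(int.parse(message));
    virtual2.stream.listen(print); // ping
    virtual2.sink.add('pong');
  });

  virtual1.stream.listen(print); // pong
  virtual1.sink.add('ping');
}
```
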
diff --git a/pkgs/stream_channel/lib/src/stream_channel_completer.dart b/pkgs/stream_channel/lib/src/stream_channel_completer.dart
new file mode 100644
index 0000000..9d007eb
--- /dev/null
+++ b/pkgs/stream_channel/lib/src/stream_channel_completer.dart
@@ -0,0 +1,74 @@
+// Copyright (c) 2016, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:async/async.dart';
+
+import '../stream_channel.dart';
+
+/// A [channel] where the source and destination are provided later.
+///
+/// The [channel] is a normal channel that can be listened to and that events
+/// can be added to immediately, but until [setChannel] is called it won't emit
+/// any events and all events added to it will be buffered.
+class StreamChannelCompleter<T> {
+ /// The completer for this channel's stream.
+ final _streamCompleter = StreamCompleter<T>();
+
+ /// The completer for this channel's sink.
+ final _sinkCompleter = StreamSinkCompleter<T>();
+
+ /// The channel for this completer.
+ StreamChannel<T> get channel => _channel;
+ late final StreamChannel<T> _channel;
+
+ /// Whether [setChannel] has been called.
+ bool _set = false;
+
+ /// Convert a `Future<StreamChannel>` to a `StreamChannel`.
+ ///
+ /// This creates a channel using a channel completer, and sets the source
+ /// channel to the result of the future when the future completes.
+ ///
+ /// If the future completes with an error, the returned channel's stream will
+ /// instead contain just that error. The sink will silently discard all
+ /// events.
+ static StreamChannel fromFuture(Future<StreamChannel> channelFuture) {
+ var completer = StreamChannelCompleter<void>();
+ channelFuture.then(completer.setChannel, onError: completer.setError);
+ return completer.channel;
+ }
+
+ StreamChannelCompleter() {
+ _channel = StreamChannel<T>(_streamCompleter.stream, _sinkCompleter.sink);
+ }
+
+ /// Set a channel as the source and destination for [channel].
+ ///
+ /// A channel may be set at most once.
+ ///
+ /// Either [setChannel] or [setError] may be called at most once. Trying to
+ /// call either of them again will fail.
+ void setChannel(StreamChannel<T> channel) {
+ if (_set) throw StateError('The channel has already been set.');
+ _set = true;
+
+ _streamCompleter.setSourceStream(channel.stream);
+ _sinkCompleter.setDestinationSink(channel.sink);
+ }
+
+ /// Indicates that there was an error connecting the channel.
+ ///
+ /// This makes the stream emit [error] and close. It makes the sink discard
+ /// all its events.
+ ///
+ /// Either [setChannel] or [setError] may be called at most once. Trying to
+ /// call either of them again will fail.
+ void setError(Object error, [StackTrace? stackTrace]) {
+ if (_set) throw StateError('The channel has already been set.');
+ _set = true;
+
+ _streamCompleter.setError(error, stackTrace);
+ _sinkCompleter.setDestinationSink(NullStreamSink());
+ }
+}
diff --git a/pkgs/stream_channel/lib/src/stream_channel_controller.dart b/pkgs/stream_channel/lib/src/stream_channel_controller.dart
new file mode 100644
index 0000000..25d5239
--- /dev/null
+++ b/pkgs/stream_channel/lib/src/stream_channel_controller.dart
@@ -0,0 +1,67 @@
+// Copyright (c) 2016, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+/// @docImport 'isolate_channel.dart';
+library;
+
+import 'dart:async';
+
+import '../stream_channel.dart';
+
+/// A controller for exposing a new [StreamChannel].
+///
+/// This exposes two connected [StreamChannel]s, [local] and [foreign]. The
+/// user's code should use [local] to emit and receive events. Then [foreign]
+/// can be returned for others to use. For example, here's a simplified version
+/// of the implementation of [IsolateChannel.new]:
+///
+/// ```dart
+/// StreamChannel isolateChannel(ReceivePort receivePort, SendPort sendPort) {
+/// var controller = new StreamChannelController(allowForeignErrors: false);
+///
+/// // Pipe all events from the receive port into the local sink...
+/// receivePort.pipe(controller.local.sink);
+///
+/// // ...and all events from the local stream into the send port.
+/// controller.local.stream.listen(sendPort.send, onDone: receivePort.close);
+///
+/// // Then return the foreign controller for your users to use.
+/// return controller.foreign;
+/// }
+/// ```
+class StreamChannelController<T> {
+ /// The local channel.
+ ///
+ /// This channel should be used directly by the creator of this
+ /// [StreamChannelController] to send and receive events.
+ StreamChannel<T> get local => _local;
+ late final StreamChannel<T> _local;
+
+ /// The foreign channel.
+ ///
+ /// This channel should be returned to external users so they can communicate
+ /// with [local].
+ StreamChannel<T> get foreign => _foreign;
+ late final StreamChannel<T> _foreign;
+
+ /// Creates a [StreamChannelController].
+ ///
+ /// If [sync] is true, events added to either channel's sink are synchronously
+ /// dispatched to the other channel's stream. This should only be done if the
+ /// source of those events is already asynchronous.
+ ///
+ /// If [allowForeignErrors] is `false`, errors are not allowed to be passed to
+ /// the foreign channel's sink. If any are, the connection will close and the
+ /// error will be forwarded to the foreign channel's [StreamSink.done] future.
+ /// This guarantees that the local stream will never emit errors.
+ StreamChannelController({bool allowForeignErrors = true, bool sync = false}) {
+ var localToForeignController = StreamController<T>(sync: sync);
+ var foreignToLocalController = StreamController<T>(sync: sync);
+ _local = StreamChannel<T>.withGuarantees(
+ foreignToLocalController.stream, localToForeignController.sink);
+ _foreign = StreamChannel<T>.withGuarantees(
+ localToForeignController.stream, foreignToLocalController.sink,
+ allowSinkErrors: allowForeignErrors);
+ }
+}
diff --git a/pkgs/stream_channel/lib/src/stream_channel_transformer.dart b/pkgs/stream_channel/lib/src/stream_channel_transformer.dart
new file mode 100644
index 0000000..cf62c76
--- /dev/null
+++ b/pkgs/stream_channel/lib/src/stream_channel_transformer.dart
@@ -0,0 +1,58 @@
+// Copyright (c) 2016, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+import 'dart:convert';
+
+import 'package:async/async.dart';
+
+import '../stream_channel.dart';
+
+/// A [StreamChannelTransformer] transforms the events being passed to and
+/// emitted by a [StreamChannel].
+///
+/// This works on the same principle as [StreamTransformer] and
+/// [StreamSinkTransformer]. Each transformer defines a [bind] method that takes
+/// in the original [StreamChannel] and returns the transformed version.
+///
+/// Transformers must be able to have [bind] called multiple times. If a
+/// subclass implements [bind] explicitly, it should be sure that the returned
+/// stream follows the second stream channel guarantee: closing the sink causes
+/// the stream to close before it emits any more events. This guarantee is
+/// invalidated when an asynchronous gap is added between the original stream's
+/// event dispatch and the returned stream's, for example by transforming it
+/// with a [StreamTransformer]. The guarantee can be easily preserved using
+/// [StreamChannel.withCloseGuarantee].
+class StreamChannelTransformer<S, T> {
+ /// The transformer to use on the channel's stream.
+ final StreamTransformer<T, S> _streamTransformer;
+
+ /// The transformer to use on the channel's sink.
+ final StreamSinkTransformer<S, T> _sinkTransformer;
+
+ /// Creates a [StreamChannelTransformer] from existing stream and sink
+ /// transformers.
+ const StreamChannelTransformer(
+ this._streamTransformer, this._sinkTransformer);
+
+ /// Creates a [StreamChannelTransformer] from a codec's encoder and decoder.
+ ///
+ /// All input to the inner channel's sink is encoded using [Codec.encoder],
+ /// and all output from its stream is decoded using [Codec.decoder].
+ StreamChannelTransformer.fromCodec(Codec<S, T> codec)
+ : this(codec.decoder,
+ StreamSinkTransformer.fromStreamTransformer(codec.encoder));
+
+ /// Transforms the events sent to and emitted by [channel].
+ ///
+ /// Creates a new channel. When events are passed to the returned channel's
+ /// sink, the transformer will transform them and pass the transformed
+ /// versions to `channel.sink`. When events are emitted from the
+  /// `channel.stream`, the transformer will transform them and pass the
+ /// transformed versions to the returned channel's stream.
+ StreamChannel<S> bind(StreamChannel<T> channel) =>
+ StreamChannel<S>.withCloseGuarantee(
+ channel.stream.transform(_streamTransformer),
+ _sinkTransformer.bind(channel.sink));
+}
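
A small sketch (not part of the diff) of `StreamChannelTransformer.fromCodec`, using the `utf8` codec from `dart:convert` to view a byte channel as a string channel:

```dart
import 'dart:convert';

import 'package:stream_channel/stream_channel.dart';

void main() {
  var controller = StreamChannelController<List<int>>();

  // utf8 decodes bytes coming out of the stream and encodes strings added to
  // the sink.
  var stringChannel =
      controller.foreign.transform(StreamChannelTransformer.fromCodec(utf8));

  controller.local.stream.listen((bytes) => print(utf8.decode(bytes))); // héllo
  stringChannel.sink.add('héllo');

  stringChannel.stream.listen(print); // wörld
  controller.local.sink.add(utf8.encode('wörld'));
}
```
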
diff --git a/pkgs/stream_channel/lib/stream_channel.dart b/pkgs/stream_channel/lib/stream_channel.dart
new file mode 100644
index 0000000..8e53096
--- /dev/null
+++ b/pkgs/stream_channel/lib/stream_channel.dart
@@ -0,0 +1,181 @@
+// Copyright (c) 2016, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'package:async/async.dart';
+
+import 'src/close_guarantee_channel.dart';
+import 'src/guarantee_channel.dart';
+import 'src/stream_channel_transformer.dart';
+
+export 'src/delegating_stream_channel.dart';
+export 'src/disconnector.dart';
+export 'src/json_document_transformer.dart';
+export 'src/multi_channel.dart';
+export 'src/stream_channel_completer.dart';
+export 'src/stream_channel_controller.dart';
+export 'src/stream_channel_transformer.dart';
+
+/// An abstract class representing a two-way communication channel.
+///
+/// Users should consider the [stream] emitting a "done" event to be the
+/// canonical indicator that the channel has closed. If they wish to close the
+/// channel, they should close the [sink]—canceling the stream subscription is
+/// not sufficient. Protocol errors may be emitted through the stream or through
+/// [sink].done, depending on their underlying cause. Note that the sink may
+/// silently drop events if the channel closes before [sink].close is called.
+///
+/// Implementations are strongly encouraged to mix in or extend
+/// [StreamChannelMixin] to get default implementations of the various instance
+/// methods. Adding new methods to this interface will not be considered a
+/// breaking change if implementations are also added to [StreamChannelMixin].
+///
+/// Implementations must provide the following guarantees:
+///
+/// * The stream is single-subscription, and must follow all the guarantees of
+/// single-subscription streams.
+///
+/// * Closing the sink causes the stream to close before it emits any more
+/// events.
+///
+/// * After the stream closes, the sink is automatically closed. If this
+/// happens, sink methods should silently drop their arguments until
+/// [sink].close is called.
+///
+/// * If the stream closes before it has a listener, the sink should silently
+/// drop events if possible.
+///
+/// * Canceling the stream's subscription has no effect on the sink. The channel
+/// must still be able to respond to the other endpoint closing the channel
+/// even after the subscription has been canceled.
+///
+/// * The sink *either* forwards errors to the other endpoint *or* closes as
+/// soon as an error is added and forwards that error to the [sink].done
+/// future.
+///
+/// These guarantees allow users to interact uniformly with all implementations,
+/// and ensure that either endpoint closing the stream produces consistent
+/// behavior.
+abstract class StreamChannel<T> {
+ /// The single-subscription stream that emits values from the other endpoint.
+ Stream<T> get stream;
+
+ /// The sink for sending values to the other endpoint.
+ StreamSink<T> get sink;
+
+ /// Creates a new [StreamChannel] that communicates over [stream] and [sink].
+ ///
+ /// Note that this stream/sink pair must provide the guarantees listed in the
+ /// [StreamChannel] documentation. If they don't do so natively,
+ /// [StreamChannel.withGuarantees] should be used instead.
+ factory StreamChannel(Stream<T> stream, StreamSink<T> sink) =>
+ _StreamChannel<T>(stream, sink);
+
+ /// Creates a new [StreamChannel] that communicates over [stream] and [sink].
+ ///
+ /// Unlike [StreamChannel.new], this enforces the guarantees listed in the
+ /// [StreamChannel] documentation. This makes it somewhat less efficient than
+ /// just wrapping a stream and a sink directly, so [StreamChannel.new] should
+ /// be used when the guarantees are provided natively.
+ ///
+ /// If [allowSinkErrors] is `false`, errors are not allowed to be passed to
+ /// [sink]. If any are, the connection will close and the error will be
+ /// forwarded to [sink].done.
+ factory StreamChannel.withGuarantees(Stream<T> stream, StreamSink<T> sink,
+ {bool allowSinkErrors = true}) =>
+ GuaranteeChannel(stream, sink, allowSinkErrors: allowSinkErrors);
+
+ /// Creates a new [StreamChannel] that communicates over [stream] and [sink].
+ ///
+ /// This specifically enforces the second guarantee: closing the sink causes
+ /// the stream to close before it emits any more events. This guarantee is
+ /// invalidated when an asynchronous gap is added between the original
+ /// stream's event dispatch and the returned stream's, for example by
+ /// transforming it with a [StreamTransformer]. This is a lighter-weight way
+ /// of preserving that guarantee in particular than
+ /// [StreamChannel.withGuarantees].
+ factory StreamChannel.withCloseGuarantee(
+ Stream<T> stream, StreamSink<T> sink) =>
+ CloseGuaranteeChannel(stream, sink);
+
+ /// Connects this to [other], so that any values emitted by either are sent
+ /// directly to the other.
+ void pipe(StreamChannel<T> other);
+
+ /// Transforms this using [transformer].
+ ///
+ /// This is identical to calling `transformer.bind(channel)`.
+ StreamChannel<S> transform<S>(StreamChannelTransformer<S, T> transformer);
+
+ /// Transforms only the [stream] component of this using [transformer].
+ StreamChannel<T> transformStream(StreamTransformer<T, T> transformer);
+
+ /// Transforms only the [sink] component of this using [transformer].
+ StreamChannel<T> transformSink(StreamSinkTransformer<T, T> transformer);
+
+ /// Returns a copy of this with [stream] replaced by [change]'s return
+ /// value.
+ StreamChannel<T> changeStream(Stream<T> Function(Stream<T>) change);
+
+ /// Returns a copy of this with [sink] replaced by [change]'s return
+ /// value.
+ StreamChannel<T> changeSink(StreamSink<T> Function(StreamSink<T>) change);
+
+ /// Returns a copy of this with the generic type coerced to [S].
+ ///
+ /// If any events emitted by [stream] aren't of type [S], they're converted
+ /// into [TypeError] events (`CastError` on some SDK versions). Similarly, if
+ /// any events are added to [sink] that aren't of type [S], a [TypeError] is
+ /// thrown.
+ StreamChannel<S> cast<S>();
+}
+
+/// An implementation of [StreamChannel] that simply takes a stream and a sink
+/// as parameters.
+///
+/// This is distinct from [StreamChannel] so that it can use
+/// [StreamChannelMixin].
+class _StreamChannel<T> extends StreamChannelMixin<T> {
+ @override
+ final Stream<T> stream;
+ @override
+ final StreamSink<T> sink;
+
+ _StreamChannel(this.stream, this.sink);
+}
+
+/// A mixin that implements the instance methods of [StreamChannel] in terms of
+/// [stream] and [sink].
+abstract mixin class StreamChannelMixin<T> implements StreamChannel<T> {
+ @override
+ void pipe(StreamChannel<T> other) {
+ stream.pipe(other.sink);
+ other.stream.pipe(sink);
+ }
+
+ @override
+ StreamChannel<S> transform<S>(StreamChannelTransformer<S, T> transformer) =>
+ transformer.bind(this);
+
+ @override
+ StreamChannel<T> transformStream(StreamTransformer<T, T> transformer) =>
+ changeStream(transformer.bind);
+
+ @override
+ StreamChannel<T> transformSink(StreamSinkTransformer<T, T> transformer) =>
+ changeSink(transformer.bind);
+
+ @override
+ StreamChannel<T> changeStream(Stream<T> Function(Stream<T>) change) =>
+ StreamChannel.withCloseGuarantee(change(stream), sink);
+
+ @override
+ StreamChannel<T> changeSink(StreamSink<T> Function(StreamSink<T>) change) =>
+ StreamChannel.withCloseGuarantee(stream, change(sink));
+
+ @override
+ StreamChannel<S> cast<S>() => StreamChannel(
+ stream.cast(), StreamController(sync: true)..stream.cast<T>().pipe(sink));
+}
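
A minimal sketch (not part of the diff) of `StreamChannel.withGuarantees`, with two plain `StreamController`s standing in for a transport that does not provide the guarantees natively; note that closing the sink closes the stream even though an event is still pending:

```dart
import 'dart:async';

import 'package:stream_channel/stream_channel.dart';

Future<void> main() async {
  // Two plain controllers stand in for a socket-like transport that does not
  // provide the stream channel guarantees on its own.
  var incoming = StreamController<String>();
  var outgoing = StreamController<String>();

  var channel =
      StreamChannel<String>.withGuarantees(incoming.stream, outgoing.sink);

  outgoing.stream.listen(print); // hello
  channel.sink.add('hello');

  // Closing the sink closes the stream before it emits any more events.
  incoming.add('never delivered');
  await channel.sink.close();
  print(await channel.stream.isEmpty); // true
}
```
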
diff --git a/pkgs/stream_channel/pubspec.yaml b/pkgs/stream_channel/pubspec.yaml
new file mode 100644
index 0000000..f050fca
--- /dev/null
+++ b/pkgs/stream_channel/pubspec.yaml
@@ -0,0 +1,16 @@
+name: stream_channel
+version: 2.1.4
+description: >-
+ An abstraction for two-way communication channels based on the Dart Stream
+ class.
+repository: https://github.com/dart-lang/tools/tree/main/pkgs/stream_channel
+
+environment:
+ sdk: ^3.3.0
+
+dependencies:
+ async: ^2.5.0
+
+dev_dependencies:
+ dart_flutter_team_lints: ^3.0.0
+ test: ^1.16.6
diff --git a/pkgs/stream_channel/test/disconnector_test.dart b/pkgs/stream_channel/test/disconnector_test.dart
new file mode 100644
index 0000000..bbba568
--- /dev/null
+++ b/pkgs/stream_channel/test/disconnector_test.dart
@@ -0,0 +1,152 @@
+// Copyright (c) 2016, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'package:async/async.dart';
+import 'package:stream_channel/stream_channel.dart';
+import 'package:test/test.dart';
+
+void main() {
+ late StreamController streamController;
+ late StreamController sinkController;
+ late Disconnector disconnector;
+ late StreamChannel channel;
+ setUp(() {
+ streamController = StreamController<void>();
+ sinkController = StreamController<void>();
+ disconnector = Disconnector();
+ channel = StreamChannel.withGuarantees(
+ streamController.stream, sinkController.sink)
+ .transform(disconnector);
+ });
+
+ group('before disconnection', () {
+ test('forwards events from the sink as normal', () {
+ channel.sink.add(1);
+ channel.sink.add(2);
+ channel.sink.add(3);
+ channel.sink.close();
+
+ expect(sinkController.stream.toList(), completion(equals([1, 2, 3])));
+ });
+
+ test('forwards events to the stream as normal', () {
+ streamController.add(1);
+ streamController.add(2);
+ streamController.add(3);
+ streamController.close();
+
+ expect(channel.stream.toList(), completion(equals([1, 2, 3])));
+ });
+
+ test("events can't be added when the sink is explicitly closed", () {
+ sinkController.stream.listen(null); // Work around sdk#19095.
+
+ expect(channel.sink.close(), completes);
+ expect(() => channel.sink.add(1), throwsStateError);
+ expect(() => channel.sink.addError('oh no'), throwsStateError);
+ expect(() => channel.sink.addStream(Stream.fromIterable([])),
+ throwsStateError);
+ });
+
+ test("events can't be added while a stream is being added", () {
+ var controller = StreamController<void>();
+ channel.sink.addStream(controller.stream);
+
+ expect(() => channel.sink.add(1), throwsStateError);
+ expect(() => channel.sink.addError('oh no'), throwsStateError);
+ expect(() => channel.sink.addStream(Stream.fromIterable([])),
+ throwsStateError);
+ expect(() => channel.sink.close(), throwsStateError);
+
+ controller.close();
+ });
+ });
+
+ test('cancels addStream when disconnected', () async {
+ var canceled = false;
+ var controller = StreamController<void>(onCancel: () {
+ canceled = true;
+ });
+ expect(channel.sink.addStream(controller.stream), completes);
+ unawaited(disconnector.disconnect());
+
+ await pumpEventQueue();
+ expect(canceled, isTrue);
+ });
+
+ test('disconnect() returns the close future from the inner sink', () async {
+ var streamController = StreamController<void>();
+ var sinkController = StreamController<void>();
+ var disconnector = Disconnector<void>();
+ var sink = _CloseCompleterSink(sinkController.sink);
+ StreamChannel.withGuarantees(streamController.stream, sink)
+ .transform(disconnector);
+
+ var disconnectFutureFired = false;
+ expect(
+ disconnector.disconnect().then((_) {
+ disconnectFutureFired = true;
+ }),
+ completes);
+
+ // Give the future time to fire early if it's going to.
+ await pumpEventQueue();
+ expect(disconnectFutureFired, isFalse);
+
+ // When the inner sink's close future completes, so should the
+ // disconnector's.
+ sink.completer.complete();
+ await pumpEventQueue();
+ expect(disconnectFutureFired, isTrue);
+ });
+
+ group('after disconnection', () {
+ setUp(() {
+ disconnector.disconnect();
+ });
+
+ test('closes the inner sink and ignores events to the outer sink', () {
+ channel.sink.add(1);
+ channel.sink.add(2);
+ channel.sink.add(3);
+ channel.sink.close();
+
+ expect(sinkController.stream.toList(), completion(isEmpty));
+ });
+
+ test('closes the stream', () {
+ expect(channel.stream.toList(), completion(isEmpty));
+ });
+
+ test('completes done', () {
+ sinkController.stream.listen(null); // Work around sdk#19095.
+ expect(channel.sink.done, completes);
+ });
+
+ test('still emits state errors after explicit close', () {
+ sinkController.stream.listen(null); // Work around sdk#19095.
+ expect(channel.sink.close(), completes);
+
+ expect(() => channel.sink.add(1), throwsStateError);
+ expect(() => channel.sink.addError('oh no'), throwsStateError);
+ });
+ });
+}
+
+/// A [StreamSink] wrapper that adds the ability to manually complete the Future
+/// returned by [close] using [completer].
+class _CloseCompleterSink extends DelegatingStreamSink {
+ /// The completer for the future returned by [close].
+ final completer = Completer<void>();
+
+ _CloseCompleterSink(super.inner);
+
+ @override
+ Future<void> close() {
+ super.close();
+ return completer.future;
+ }
+}
diff --git a/pkgs/stream_channel/test/isolate_channel_test.dart b/pkgs/stream_channel/test/isolate_channel_test.dart
new file mode 100644
index 0000000..3a8b42e
--- /dev/null
+++ b/pkgs/stream_channel/test/isolate_channel_test.dart
@@ -0,0 +1,174 @@
+// Copyright (c) 2016, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+@TestOn('vm')
+library;
+
+import 'dart:async';
+import 'dart:isolate';
+
+import 'package:stream_channel/isolate_channel.dart';
+import 'package:stream_channel/stream_channel.dart';
+import 'package:test/test.dart';
+
+void main() {
+ late ReceivePort receivePort;
+ late SendPort sendPort;
+ late StreamChannel channel;
+ setUp(() {
+ receivePort = ReceivePort();
+ var receivePortForSend = ReceivePort();
+ sendPort = receivePortForSend.sendPort;
+ channel = IsolateChannel(receivePortForSend, receivePort.sendPort);
+ });
+
+ tearDown(() {
+ receivePort.close();
+ channel.sink.close();
+ });
+
+ test('the channel can send messages', () {
+ channel.sink.add(1);
+ channel.sink.add(2);
+ channel.sink.add(3);
+
+ expect(receivePort.take(3).toList(), completion(equals([1, 2, 3])));
+ });
+
+ test('the channel can receive messages', () {
+ sendPort.send(1);
+ sendPort.send(2);
+ sendPort.send(3);
+
+ expect(channel.stream.take(3).toList(), completion(equals([1, 2, 3])));
+ });
+
+ test("events can't be added to an explicitly-closed sink", () {
+ expect(channel.sink.close(), completes);
+ expect(() => channel.sink.add(1), throwsStateError);
+ expect(() => channel.sink.addError('oh no'), throwsStateError);
+ expect(() => channel.sink.addStream(Stream.fromIterable([])),
+ throwsStateError);
+ });
+
+ test("events can't be added while a stream is being added", () {
+ var controller = StreamController<void>();
+ channel.sink.addStream(controller.stream);
+
+ expect(() => channel.sink.add(1), throwsStateError);
+ expect(() => channel.sink.addError('oh no'), throwsStateError);
+ expect(() => channel.sink.addStream(Stream.fromIterable([])),
+ throwsStateError);
+ expect(() => channel.sink.close(), throwsStateError);
+
+ controller.close();
+ });
+
+ group('stream channel rules', () {
+ test(
+ 'closing the sink causes the stream to close before it emits any more '
+ 'events', () {
+ sendPort.send(1);
+ sendPort.send(2);
+ sendPort.send(3);
+ sendPort.send(4);
+ sendPort.send(5);
+
+ channel.stream.listen(expectAsync1((message) {
+ expect(message, equals(1));
+ channel.sink.close();
+ }, count: 1));
+ });
+
+ test("cancelling the stream's subscription has no effect on the sink",
+ () async {
+ unawaited(channel.stream.listen(null).cancel());
+ await pumpEventQueue();
+
+ channel.sink.add(1);
+ channel.sink.add(2);
+ channel.sink.add(3);
+ expect(receivePort.take(3).toList(), completion(equals([1, 2, 3])));
+ });
+
+ test('the sink closes as soon as an error is added', () async {
+ channel.sink.addError('oh no');
+ channel.sink.add(1);
+ expect(channel.sink.done, throwsA('oh no'));
+
+ // Since the sink is closed, the stream should also be closed.
+ expect(channel.stream.isEmpty, completion(isTrue));
+
+ // The other end shouldn't receive the next event, since the sink was
+ // closed. Pump the event queue to give it a chance to.
+ receivePort.listen(expectAsync1((_) {}, count: 0));
+ await pumpEventQueue();
+ });
+
+ test('the sink closes as soon as an error is added via addStream',
+ () async {
+ var canceled = false;
+ var controller = StreamController<void>(onCancel: () {
+ canceled = true;
+ });
+
+ // This future shouldn't get the error, because it's sent to [Sink.done].
+ expect(channel.sink.addStream(controller.stream), completes);
+
+ controller.addError('oh no');
+ expect(channel.sink.done, throwsA('oh no'));
+ await pumpEventQueue();
+ expect(canceled, isTrue);
+
+ // Even though the sink is closed, this shouldn't throw an error because
+ // the user didn't explicitly close it.
+ channel.sink.add(1);
+ });
+ });
+
+ group('connect constructors', () {
+ late ReceivePort connectPort;
+ setUp(() {
+ connectPort = ReceivePort();
+ });
+
+ tearDown(() {
+ connectPort.close();
+ });
+
+ test('create a connected pair of channels', () async {
+ var channel1 = IsolateChannel<int>.connectReceive(connectPort);
+ var channel2 = IsolateChannel<int>.connectSend(connectPort.sendPort);
+
+ channel1.sink.add(1);
+ channel1.sink.add(2);
+ channel1.sink.add(3);
+ expect(await channel2.stream.take(3).toList(), equals([1, 2, 3]));
+
+ channel2.sink.add(4);
+ channel2.sink.add(5);
+ channel2.sink.add(6);
+ expect(await channel1.stream.take(3).toList(), equals([4, 5, 6]));
+
+ await channel2.sink.close();
+ });
+
+ test('the receiving channel produces an error if it gets the wrong message',
+ () {
+ var connectedChannel = IsolateChannel<int>.connectReceive(connectPort);
+ connectPort.sendPort.send('wrong value');
+
+ expect(connectedChannel.stream.toList(), throwsStateError);
+ expect(connectedChannel.sink.done, completes);
+ });
+
+ test('the receiving channel closes gracefully without a connection',
+ () async {
+ var connectedChannel = IsolateChannel<int>.connectReceive(connectPort);
+ await connectedChannel.sink.close();
+ await expectLater(connectedChannel.stream.toList(), completion(isEmpty));
+ await expectLater(connectedChannel.sink.done, completes);
+ });
+ });
+}
diff --git a/pkgs/stream_channel/test/json_document_transformer_test.dart b/pkgs/stream_channel/test/json_document_transformer_test.dart
new file mode 100644
index 0000000..290c4e2
--- /dev/null
+++ b/pkgs/stream_channel/test/json_document_transformer_test.dart
@@ -0,0 +1,46 @@
+// Copyright (c) 2016, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+import 'dart:convert';
+
+import 'package:stream_channel/stream_channel.dart';
+import 'package:test/test.dart';
+
+void main() {
+ late StreamController<String> streamController;
+ late StreamController<String> sinkController;
+ late StreamChannel<String> channel;
+ setUp(() {
+ streamController = StreamController<String>();
+ sinkController = StreamController<String>();
+ channel =
+ StreamChannel<String>(streamController.stream, sinkController.sink);
+ });
+
+ test('decodes JSON emitted by the channel', () {
+ var transformed = channel.transform(jsonDocument);
+ streamController.add('{"foo": "bar"}');
+ expect(transformed.stream.first, completion(equals({'foo': 'bar'})));
+ });
+
+ test('encodes objects added to the channel', () {
+ var transformed = channel.transform(jsonDocument);
+ transformed.sink.add({'foo': 'bar'});
+ expect(sinkController.stream.first,
+ completion(equals(jsonEncode({'foo': 'bar'}))));
+ });
+
+ test('emits a stream error when incoming JSON is malformed', () {
+ var transformed = channel.transform(jsonDocument);
+ streamController.add('{invalid');
+ expect(transformed.stream.first, throwsFormatException);
+ });
+
+ test('synchronously throws if an unencodable object is added', () {
+ var transformed = channel.transform(jsonDocument);
+ expect(() => transformed.sink.add(Object()),
+ throwsA(const TypeMatcher<JsonUnsupportedObjectError>()));
+ });
+}
diff --git a/pkgs/stream_channel/test/multi_channel_test.dart b/pkgs/stream_channel/test/multi_channel_test.dart
new file mode 100644
index 0000000..ee6f8d2
--- /dev/null
+++ b/pkgs/stream_channel/test/multi_channel_test.dart
@@ -0,0 +1,478 @@
+// Copyright (c) 2016, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'package:stream_channel/stream_channel.dart';
+import 'package:test/test.dart';
+
+void main() {
+ late StreamChannelController controller;
+ late MultiChannel channel1;
+ late MultiChannel channel2;
+ setUp(() {
+ controller = StreamChannelController();
+ channel1 = MultiChannel<int>(controller.local);
+ channel2 = MultiChannel<int>(controller.foreign);
+ });
+
+ group('the default virtual channel', () {
+ test('begins connected', () {
+ var first = true;
+ channel2.stream.listen(expectAsync1((message) {
+ if (first) {
+ expect(message, equals(1));
+ first = false;
+ } else {
+ expect(message, equals(2));
+ }
+ }, count: 2));
+
+ channel1.sink.add(1);
+ channel1.sink.add(2);
+ });
+
+ test('closes the remote virtual channel when it closes', () {
+ expect(channel2.stream.toList(), completion(isEmpty));
+ expect(channel2.sink.done, completes);
+
+ channel1.sink.close();
+ });
+
+ test('closes the local virtual channel when it closes', () {
+ expect(channel1.stream.toList(), completion(isEmpty));
+ expect(channel1.sink.done, completes);
+
+ channel1.sink.close();
+ });
+
+ test(
+ "doesn't closes the local virtual channel when the stream "
+ 'subscription is canceled', () {
+ channel1.sink.done.then(expectAsync1((_) {}, count: 0));
+
+ channel1.stream.listen((_) {}).cancel();
+
+ // Ensure that there's enough time for the channel to close if it's going
+ // to.
+ return pumpEventQueue();
+ });
+
+ test(
+ 'closes the underlying channel when it closes without any other '
+ 'virtual channels', () {
+ expect(controller.local.sink.done, completes);
+ expect(controller.foreign.sink.done, completes);
+
+ channel1.sink.close();
+ });
+
+ test(
+ "doesn't close the underlying channel when it closes with other "
+ 'virtual channels', () {
+ controller.local.sink.done.then(expectAsync1((_) {}, count: 0));
+ controller.foreign.sink.done.then(expectAsync1((_) {}, count: 0));
+
+ // Establish another virtual connection which should keep the underlying
+ // connection open.
+ channel2.virtualChannel(channel1.virtualChannel().id);
+ channel1.sink.close();
+
+ // Ensure that there's enough time for the underlying channel to complete
+ // if it's going to.
+ return pumpEventQueue();
+ });
+ });
+
+ group('a locally-created virtual channel', () {
+ late VirtualChannel virtual1;
+ late VirtualChannel virtual2;
+ setUp(() {
+ virtual1 = channel1.virtualChannel();
+ virtual2 = channel2.virtualChannel(virtual1.id);
+ });
+
+ test('sends messages only to the other virtual channel', () {
+ var first = true;
+ virtual2.stream.listen(expectAsync1((message) {
+ if (first) {
+ expect(message, equals(1));
+ first = false;
+ } else {
+ expect(message, equals(2));
+ }
+ }, count: 2));
+
+ // No other virtual channels should receive the message.
+ for (var i = 0; i < 10; i++) {
+ var virtual = channel2.virtualChannel(channel1.virtualChannel().id);
+ virtual.stream.listen(expectAsync1((_) {}, count: 0));
+ }
+ channel2.stream.listen(expectAsync1((_) {}, count: 0));
+
+ virtual1.sink.add(1);
+ virtual1.sink.add(2);
+ });
+
+ test('closes the remote virtual channel when it closes', () {
+ expect(virtual2.stream.toList(), completion(isEmpty));
+ expect(virtual2.sink.done, completes);
+
+ virtual1.sink.close();
+ });
+
+ test('closes the local virtual channel when it closes', () {
+ expect(virtual1.stream.toList(), completion(isEmpty));
+ expect(virtual1.sink.done, completes);
+
+ virtual1.sink.close();
+ });
+
+ test(
+ "doesn't closes the local virtual channel when the stream "
+ 'subscription is canceled', () {
+ virtual1.sink.done.then(expectAsync1((_) {}, count: 0));
+ virtual1.stream.listen((_) {}).cancel();
+
+ // Ensure that there's enough time for the channel to close if it's going
+ // to.
+ return pumpEventQueue();
+ });
+
+ test(
+ 'closes the underlying channel when it closes without any other '
+ 'virtual channels', () async {
+ // First close the default channel so we can test the new channel as the
+ // last living virtual channel.
+ unawaited(channel1.sink.close());
+
+ await channel2.stream.toList();
+ expect(controller.local.sink.done, completes);
+ expect(controller.foreign.sink.done, completes);
+
+ unawaited(virtual1.sink.close());
+ });
+
+ test(
+ "doesn't close the underlying channel when it closes with other "
+ 'virtual channels', () {
+ controller.local.sink.done.then(expectAsync1((_) {}, count: 0));
+ controller.foreign.sink.done.then(expectAsync1((_) {}, count: 0));
+
+ virtual1.sink.close();
+
+ // Ensure that there's enough time for the underlying channel to complete
+ // if it's going to.
+ return pumpEventQueue();
+ });
+
+ test("doesn't conflict with a remote virtual channel", () {
+ var virtual3 = channel2.virtualChannel();
+ var virtual4 = channel1.virtualChannel(virtual3.id);
+
+ // This is an implementation detail, but we assert it here to make sure
+ // we're properly testing two channels with the same id.
+ expect(virtual1.id, equals(virtual3.id));
+
+ virtual2.stream
+ .listen(expectAsync1((message) => expect(message, equals(1))));
+ virtual4.stream
+ .listen(expectAsync1((message) => expect(message, equals(2))));
+
+ virtual1.sink.add(1);
+ virtual3.sink.add(2);
+ });
+ });
+
+ group('a remotely-created virtual channel', () {
+ late VirtualChannel virtual1;
+ late VirtualChannel virtual2;
+ setUp(() {
+ virtual1 = channel1.virtualChannel();
+ virtual2 = channel2.virtualChannel(virtual1.id);
+ });
+
+ test('sends messages only to the other virtual channel', () {
+ var first = true;
+ virtual1.stream.listen(expectAsync1((message) {
+ if (first) {
+ expect(message, equals(1));
+ first = false;
+ } else {
+ expect(message, equals(2));
+ }
+ }, count: 2));
+
+ // No other virtual channels should receive the message.
+ for (var i = 0; i < 10; i++) {
+ var virtual = channel2.virtualChannel(channel1.virtualChannel().id);
+ virtual.stream.listen(expectAsync1((_) {}, count: 0));
+ }
+ channel1.stream.listen(expectAsync1((_) {}, count: 0));
+
+ virtual2.sink.add(1);
+ virtual2.sink.add(2);
+ });
+
+ test('closes the remote virtual channel when it closes', () {
+ expect(virtual1.stream.toList(), completion(isEmpty));
+ expect(virtual1.sink.done, completes);
+
+ virtual2.sink.close();
+ });
+
+ test('closes the local virtual channel when it closes', () {
+ expect(virtual2.stream.toList(), completion(isEmpty));
+ expect(virtual2.sink.done, completes);
+
+ virtual2.sink.close();
+ });
+
+ test(
+ "doesn't closes the local virtual channel when the stream "
+ 'subscription is canceled', () {
+ virtual2.sink.done.then(expectAsync1((_) {}, count: 0));
+ virtual2.stream.listen((_) {}).cancel();
+
+ // Ensure that there's enough time for the channel to close if it's going
+ // to.
+ return pumpEventQueue();
+ });
+
+ test(
+ 'closes the underlying channel when it closes without any other '
+ 'virtual channels', () async {
+ // First close the default channel so we can test the new channel as the
+ // last living virtual channel.
+ unawaited(channel2.sink.close());
+
+ await channel1.stream.toList();
+ expect(controller.local.sink.done, completes);
+ expect(controller.foreign.sink.done, completes);
+
+ unawaited(virtual2.sink.close());
+ });
+
+ test(
+ "doesn't close the underlying channel when it closes with other "
+ 'virtual channels', () {
+ controller.local.sink.done.then(expectAsync1((_) {}, count: 0));
+ controller.foreign.sink.done.then(expectAsync1((_) {}, count: 0));
+
+ virtual2.sink.close();
+
+ // Ensure that there's enough time for the underlying channel to complete
+ // if it's going to.
+ return pumpEventQueue();
+ });
+
+ test("doesn't allow another virtual channel with the same id", () {
+ expect(() => channel2.virtualChannel(virtual1.id), throwsArgumentError);
+ });
+
+ test('dispatches events received before the virtual channel is created',
+ () async {
+ virtual1 = channel1.virtualChannel();
+
+ virtual1.sink.add(1);
+ await pumpEventQueue();
+
+ virtual1.sink.add(2);
+ await pumpEventQueue();
+
+ expect(channel2.virtualChannel(virtual1.id).stream, emitsInOrder([1, 2]));
+ });
+
+ test(
+ 'dispatches close events received before the virtual channel is '
+ 'created', () async {
+ virtual1 = channel1.virtualChannel();
+
+ unawaited(virtual1.sink.close());
+ await pumpEventQueue();
+
+ expect(channel2.virtualChannel(virtual1.id).stream.toList(),
+ completion(isEmpty));
+ });
+ });
+
+ group('when the underlying stream', () {
+ late VirtualChannel virtual1;
+ late VirtualChannel virtual2;
+ setUp(() {
+ virtual1 = channel1.virtualChannel();
+ virtual2 = channel2.virtualChannel(virtual1.id);
+ });
+
+ test('closes, all virtual channels close', () {
+ expect(channel1.stream.toList(), completion(isEmpty));
+ expect(channel1.sink.done, completes);
+ expect(channel2.stream.toList(), completion(isEmpty));
+ expect(channel2.sink.done, completes);
+ expect(virtual1.stream.toList(), completion(isEmpty));
+ expect(virtual1.sink.done, completes);
+ expect(virtual2.stream.toList(), completion(isEmpty));
+ expect(virtual2.sink.done, completes);
+
+ controller.local.sink.close();
+ });
+
+    test('closes, virtual channels created afterwards are closed', () async {
+ unawaited(channel2.sink.close());
+ unawaited(virtual2.sink.close());
+
+ // Wait for the existing channels to emit done events.
+ await channel1.stream.toList();
+ await virtual1.stream.toList();
+
+ var virtual = channel1.virtualChannel();
+ expect(virtual.stream.toList(), completion(isEmpty));
+ expect(virtual.sink.done, completes);
+
+ virtual = channel1.virtualChannel();
+ expect(virtual.stream.toList(), completion(isEmpty));
+ expect(virtual.sink.done, completes);
+ });
+
+ test('emits an error, the error is sent only to the default channel', () {
+ channel1.stream.listen(expectAsync1((_) {}, count: 0),
+ onError: expectAsync1((error) => expect(error, equals('oh no'))));
+ virtual1.stream.listen(expectAsync1((_) {}, count: 0),
+ onError: expectAsync1((_) {}, count: 0));
+
+ controller.foreign.sink.addError('oh no');
+ });
+ });
+
+ group('stream channel rules', () {
+ group('for the main stream:', () {
+ test(
+ 'closing the sink causes the stream to close before it emits any '
+ 'more events', () {
+ channel1.sink.add(1);
+ channel1.sink.add(2);
+ channel1.sink.add(3);
+
+ channel2.stream.listen(expectAsync1((message) {
+ expect(message, equals(1));
+ channel2.sink.close();
+ }, count: 1));
+ });
+
+ test('after the stream closes, the sink ignores events', () async {
+ unawaited(channel1.sink.close());
+
+ // Wait for the done event to be delivered.
+ await channel2.stream.toList();
+ channel2.sink.add(1);
+ channel2.sink.add(2);
+ channel2.sink.add(3);
+ unawaited(channel2.sink.close());
+
+ // None of our channel.sink additions should make it to the other
+ // endpoint.
+ channel1.stream.listen(expectAsync1((_) {}, count: 0));
+ await pumpEventQueue();
+ });
+
+ test("canceling the stream's subscription has no effect on the sink",
+ () async {
+ unawaited(channel1.stream.listen(null).cancel());
+ await pumpEventQueue();
+
+ channel1.sink.add(1);
+ channel1.sink.add(2);
+ channel1.sink.add(3);
+ unawaited(channel1.sink.close());
+ expect(channel2.stream.toList(), completion(equals([1, 2, 3])));
+ });
+
+ test("canceling the stream's subscription doesn't stop a done event",
+ () async {
+ unawaited(channel1.stream.listen(null).cancel());
+ await pumpEventQueue();
+
+ unawaited(channel2.sink.close());
+ await pumpEventQueue();
+
+ channel1.sink.add(1);
+ channel1.sink.add(2);
+ channel1.sink.add(3);
+ unawaited(channel1.sink.close());
+
+ // The sink should be ignoring events because the channel closed.
+ channel2.stream.listen(expectAsync1((_) {}, count: 0));
+ await pumpEventQueue();
+ });
+ });
+
+ group('for a virtual channel:', () {
+ late VirtualChannel virtual1;
+ late VirtualChannel virtual2;
+ setUp(() {
+ virtual1 = channel1.virtualChannel();
+ virtual2 = channel2.virtualChannel(virtual1.id);
+ });
+
+ test(
+ 'closing the sink causes the stream to close before it emits any '
+ 'more events', () {
+ virtual1.sink.add(1);
+ virtual1.sink.add(2);
+ virtual1.sink.add(3);
+
+ virtual2.stream.listen(expectAsync1((message) {
+ expect(message, equals(1));
+ virtual2.sink.close();
+ }, count: 1));
+ });
+
+ test('after the stream closes, the sink ignores events', () async {
+ unawaited(virtual1.sink.close());
+
+ // Wait for the done event to be delivered.
+ await virtual2.stream.toList();
+ virtual2.sink.add(1);
+ virtual2.sink.add(2);
+ virtual2.sink.add(3);
+ unawaited(virtual2.sink.close());
+
+ // None of our virtual.sink additions should make it to the other
+ // endpoint.
+ virtual1.stream.listen(expectAsync1((_) {}, count: 0));
+ await pumpEventQueue();
+ });
+
+ test("canceling the stream's subscription has no effect on the sink",
+ () async {
+ unawaited(virtual1.stream.listen(null).cancel());
+ await pumpEventQueue();
+
+ virtual1.sink.add(1);
+ virtual1.sink.add(2);
+ virtual1.sink.add(3);
+ unawaited(virtual1.sink.close());
+ expect(virtual2.stream.toList(), completion(equals([1, 2, 3])));
+ });
+
+ test("canceling the stream's subscription doesn't stop a done event",
+ () async {
+ unawaited(virtual1.stream.listen(null).cancel());
+ await pumpEventQueue();
+
+ unawaited(virtual2.sink.close());
+ await pumpEventQueue();
+
+ virtual1.sink.add(1);
+ virtual1.sink.add(2);
+ virtual1.sink.add(3);
+ unawaited(virtual1.sink.close());
+
+ // The sink should be ignoring events because the stream closed.
+ virtual2.stream.listen(expectAsync1((_) {}, count: 0));
+ await pumpEventQueue();
+ });
+ });
+ });
+}
diff --git a/pkgs/stream_channel/test/stream_channel_completer_test.dart b/pkgs/stream_channel/test/stream_channel_completer_test.dart
new file mode 100644
index 0000000..c6fddc0
--- /dev/null
+++ b/pkgs/stream_channel/test/stream_channel_completer_test.dart
@@ -0,0 +1,120 @@
+// Copyright (c) 2016, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'package:stream_channel/stream_channel.dart';
+import 'package:test/test.dart';
+
+void main() {
+ late StreamChannelCompleter completer;
+ late StreamController streamController;
+ late StreamController sinkController;
+ late StreamChannel innerChannel;
+ setUp(() {
+ completer = StreamChannelCompleter();
+ streamController = StreamController<void>();
+ sinkController = StreamController<void>();
+ innerChannel = StreamChannel(streamController.stream, sinkController.sink);
+ });
+
+ group('when a channel is set before accessing', () {
+ test('forwards events through the stream', () {
+ completer.setChannel(innerChannel);
+ expect(completer.channel.stream.toList(), completion(equals([1, 2, 3])));
+
+ streamController.add(1);
+ streamController.add(2);
+ streamController.add(3);
+ streamController.close();
+ });
+
+ test('forwards events through the sink', () {
+ completer.setChannel(innerChannel);
+ expect(sinkController.stream.toList(), completion(equals([1, 2, 3])));
+
+ completer.channel.sink.add(1);
+ completer.channel.sink.add(2);
+ completer.channel.sink.add(3);
+ completer.channel.sink.close();
+ });
+
+ test('forwards an error through the stream', () {
+ completer.setError('oh no');
+ expect(completer.channel.stream.first, throwsA('oh no'));
+ });
+
+ test('drops sink events', () {
+ completer.setError('oh no');
+ expect(completer.channel.sink.done, completes);
+ completer.channel.sink.add(1);
+ completer.channel.sink.addError('oh no');
+ });
+ });
+
+ group('when a channel is set after accessing', () {
+ test('forwards events through the stream', () async {
+ expect(completer.channel.stream.toList(), completion(equals([1, 2, 3])));
+ await pumpEventQueue();
+
+ completer.setChannel(innerChannel);
+ streamController.add(1);
+ streamController.add(2);
+ streamController.add(3);
+ unawaited(streamController.close());
+ });
+
+ test('forwards events through the sink', () async {
+ completer.channel.sink.add(1);
+ completer.channel.sink.add(2);
+ completer.channel.sink.add(3);
+ unawaited(completer.channel.sink.close());
+ await pumpEventQueue();
+
+ completer.setChannel(innerChannel);
+ expect(sinkController.stream.toList(), completion(equals([1, 2, 3])));
+ });
+
+ test('forwards an error through the stream', () async {
+ expect(completer.channel.stream.first, throwsA('oh no'));
+ await pumpEventQueue();
+
+ completer.setError('oh no');
+ });
+
+ test('drops sink events', () async {
+ expect(completer.channel.sink.done, completes);
+ completer.channel.sink.add(1);
+ completer.channel.sink.addError('oh no');
+ await pumpEventQueue();
+
+ completer.setError('oh no');
+ });
+ });
+
+ group('forFuture', () {
+ test('forwards a StreamChannel', () {
+ var channel =
+ StreamChannelCompleter.fromFuture(Future.value(innerChannel));
+ channel.sink.add(1);
+ channel.sink.close();
+ streamController.sink.add(2);
+ streamController.sink.close();
+
+ expect(sinkController.stream.toList(), completion(equals([1])));
+ expect(channel.stream.toList(), completion(equals([2])));
+ });
+
+ test('forwards an error', () {
+ var channel = StreamChannelCompleter.fromFuture(Future.error('oh no'));
+ expect(channel.stream.toList(), throwsA('oh no'));
+ });
+ });
+
+ test("doesn't allow the channel to be set multiple times", () {
+ completer.setChannel(innerChannel);
+ expect(() => completer.setChannel(innerChannel), throwsStateError);
+ expect(() => completer.setChannel(innerChannel), throwsStateError);
+ });
+}
diff --git a/pkgs/stream_channel/test/stream_channel_controller_test.dart b/pkgs/stream_channel/test/stream_channel_controller_test.dart
new file mode 100644
index 0000000..3d661e3
--- /dev/null
+++ b/pkgs/stream_channel/test/stream_channel_controller_test.dart
@@ -0,0 +1,104 @@
+// Copyright (c) 2016, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:stream_channel/stream_channel.dart';
+import 'package:test/test.dart';
+
+void main() {
+ group('asynchronously', () {
+ late StreamChannelController controller;
+ setUp(() {
+ controller = StreamChannelController();
+ });
+
+ test('forwards events from the local sink to the foreign stream', () {
+ controller.local.sink
+ ..add(1)
+ ..add(2)
+ ..add(3)
+ ..close();
+ expect(controller.foreign.stream.toList(), completion(equals([1, 2, 3])));
+ });
+
+ test('forwards events from the foreign sink to the local stream', () {
+ controller.foreign.sink
+ ..add(1)
+ ..add(2)
+ ..add(3)
+ ..close();
+ expect(controller.local.stream.toList(), completion(equals([1, 2, 3])));
+ });
+
+ test(
+ 'with allowForeignErrors: false, shuts down the connection if an '
+ 'error is added to the foreign channel', () {
+ controller = StreamChannelController(allowForeignErrors: false);
+
+ controller.foreign.sink.addError('oh no');
+ expect(controller.foreign.sink.done, throwsA('oh no'));
+ expect(controller.foreign.stream.toList(), completion(isEmpty));
+ expect(controller.local.sink.done, completes);
+ expect(controller.local.stream.toList(), completion(isEmpty));
+ });
+ });
+
+ group('synchronously', () {
+ late StreamChannelController controller;
+ setUp(() {
+ controller = StreamChannelController(sync: true);
+ });
+
+ test(
+ 'synchronously forwards events from the local sink to the foreign '
+ 'stream', () {
+ var receivedEvent = false;
+ var receivedError = false;
+ var receivedDone = false;
+ controller.foreign.stream.listen(expectAsync1((event) {
+ expect(event, equals(1));
+ receivedEvent = true;
+ }), onError: expectAsync1((error) {
+ expect(error, equals('oh no'));
+ receivedError = true;
+ }), onDone: expectAsync0(() {
+ receivedDone = true;
+ }));
+
+ controller.local.sink.add(1);
+ expect(receivedEvent, isTrue);
+
+ controller.local.sink.addError('oh no');
+ expect(receivedError, isTrue);
+
+ controller.local.sink.close();
+ expect(receivedDone, isTrue);
+ });
+
+ test(
+ 'synchronously forwards events from the foreign sink to the local '
+ 'stream', () {
+ var receivedEvent = false;
+ var receivedError = false;
+ var receivedDone = false;
+ controller.local.stream.listen(expectAsync1((event) {
+ expect(event, equals(1));
+ receivedEvent = true;
+ }), onError: expectAsync1((error) {
+ expect(error, equals('oh no'));
+ receivedError = true;
+ }), onDone: expectAsync0(() {
+ receivedDone = true;
+ }));
+
+ controller.foreign.sink.add(1);
+ expect(receivedEvent, isTrue);
+
+ controller.foreign.sink.addError('oh no');
+ expect(receivedError, isTrue);
+
+ controller.foreign.sink.close();
+ expect(receivedDone, isTrue);
+ });
+ });
+}
diff --git a/pkgs/stream_channel/test/stream_channel_test.dart b/pkgs/stream_channel/test/stream_channel_test.dart
new file mode 100644
index 0000000..166671d
--- /dev/null
+++ b/pkgs/stream_channel/test/stream_channel_test.dart
@@ -0,0 +1,180 @@
+// Copyright (c) 2016, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+import 'dart:convert';
+
+import 'package:async/async.dart';
+import 'package:stream_channel/stream_channel.dart';
+import 'package:test/test.dart';
+
+void main() {
+ test("pipe() pipes data from each channel's stream into the other's sink",
+ () {
+ var otherStreamController = StreamController<int>();
+ var otherSinkController = StreamController<int>();
+ var otherChannel =
+ StreamChannel(otherStreamController.stream, otherSinkController.sink);
+
+ var streamController = StreamController<int>();
+ var sinkController = StreamController<int>();
+ var channel = StreamChannel(streamController.stream, sinkController.sink);
+
+ channel.pipe(otherChannel);
+
+ streamController.add(1);
+ streamController.add(2);
+ streamController.add(3);
+ streamController.close();
+ expect(otherSinkController.stream.toList(), completion(equals([1, 2, 3])));
+
+ otherStreamController.add(4);
+ otherStreamController.add(5);
+ otherStreamController.add(6);
+ otherStreamController.close();
+ expect(sinkController.stream.toList(), completion(equals([4, 5, 6])));
+ });
+
+ test('transform() transforms the channel', () async {
+ var streamController = StreamController<List<int>>();
+ var sinkController = StreamController<List<int>>();
+ var channel = StreamChannel(streamController.stream, sinkController.sink);
+
+ var transformed = channel
+ .cast<List<int>>()
+ .transform(StreamChannelTransformer.fromCodec(utf8));
+
+ streamController.add([102, 111, 111, 98, 97, 114]);
+ unawaited(streamController.close());
+ expect(await transformed.stream.toList(), equals(['foobar']));
+
+ transformed.sink.add('fblthp');
+ unawaited(transformed.sink.close());
+ expect(
+ sinkController.stream.toList(),
+ completion(equals([
+ [102, 98, 108, 116, 104, 112]
+ ])));
+ });
+
+ test('transformStream() transforms only the stream', () async {
+ var streamController = StreamController<String>();
+ var sinkController = StreamController<String>();
+ var channel = StreamChannel(streamController.stream, sinkController.sink);
+
+ var transformed =
+ channel.cast<String>().transformStream(const LineSplitter());
+
+ streamController.add('hello world');
+ streamController.add(' what\nis');
+ streamController.add('\nup');
+ unawaited(streamController.close());
+ expect(await transformed.stream.toList(),
+ equals(['hello world what', 'is', 'up']));
+
+ transformed.sink.add('fbl\nthp');
+ unawaited(transformed.sink.close());
+ expect(sinkController.stream.toList(), completion(equals(['fbl\nthp'])));
+ });
+
+ test('transformSink() transforms only the sink', () async {
+ var streamController = StreamController<String>();
+ var sinkController = StreamController<String>();
+ var channel = StreamChannel(streamController.stream, sinkController.sink);
+
+ var transformed = channel.cast<String>().transformSink(
+ const StreamSinkTransformer.fromStreamTransformer(LineSplitter()));
+
+ streamController.add('fbl\nthp');
+ unawaited(streamController.close());
+ expect(await transformed.stream.toList(), equals(['fbl\nthp']));
+
+ transformed.sink.add('hello world');
+ transformed.sink.add(' what\nis');
+ transformed.sink.add('\nup');
+ unawaited(transformed.sink.close());
+ expect(sinkController.stream.toList(),
+ completion(equals(['hello world what', 'is', 'up'])));
+ });
+
+ test('changeStream() changes the stream', () {
+ var streamController = StreamController<int>();
+ var sinkController = StreamController<int>();
+ var channel = StreamChannel(streamController.stream, sinkController.sink);
+
+ var newController = StreamController<int>();
+ var changed = channel.changeStream((stream) {
+ expect(stream, equals(channel.stream));
+ return newController.stream;
+ });
+
+ newController.add(10);
+ newController.close();
+
+ streamController.add(20);
+ streamController.close();
+
+ expect(changed.stream.toList(), completion(equals([10])));
+ });
+
+ test('changeSink() changes the sink', () {
+ var streamController = StreamController<int>();
+ var sinkController = StreamController<int>();
+ var channel = StreamChannel(streamController.stream, sinkController.sink);
+
+ var newController = StreamController<int>();
+ var changed = channel.changeSink((sink) {
+ expect(sink, equals(channel.sink));
+ return newController.sink;
+ });
+
+ expect(newController.stream.toList(), completion(equals([10])));
+ streamController.stream.listen(expectAsync1((_) {}, count: 0));
+
+ changed.sink.add(10);
+ changed.sink.close();
+ });
+
+ group('StreamChannelMixin', () {
+ test('can be used as a mixin', () async {
+ var channel = StreamChannelMixinAsMixin<int>();
+ expect(channel.stream, emitsInOrder([1, 2, 3]));
+ channel.sink
+ ..add(1)
+ ..add(2)
+ ..add(3);
+ await channel.controller.close();
+ });
+
+ test('can be extended', () async {
+ var channel = StreamChannelMixinAsSuperclass<int>();
+ expect(channel.stream, emitsInOrder([1, 2, 3]));
+ channel.sink
+ ..add(1)
+ ..add(2)
+ ..add(3);
+ await channel.controller.close();
+ });
+ });
+}
+
+class StreamChannelMixinAsMixin<T> with StreamChannelMixin<T> {
+ final controller = StreamController<T>();
+
+ @override
+ StreamSink<T> get sink => controller.sink;
+
+ @override
+ Stream<T> get stream => controller.stream;
+}
+
+class StreamChannelMixinAsSuperclass<T> extends StreamChannelMixin<T> {
+ final controller = StreamController<T>();
+
+ @override
+ StreamSink<T> get sink => controller.sink;
+
+ @override
+ Stream<T> get stream => controller.stream;
+}
diff --git a/pkgs/stream_channel/test/with_close_guarantee_test.dart b/pkgs/stream_channel/test/with_close_guarantee_test.dart
new file mode 100644
index 0000000..9c0b729
--- /dev/null
+++ b/pkgs/stream_channel/test/with_close_guarantee_test.dart
@@ -0,0 +1,69 @@
+// Copyright (c) 2016, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'package:async/async.dart';
+import 'package:stream_channel/stream_channel.dart';
+import 'package:test/test.dart';
+
+final _delayTransformer = StreamTransformer.fromHandlers(
+ handleData: (data, sink) => Future.microtask(() => sink.add(data)),
+ handleDone: (sink) => Future.microtask(() => sink.close()));
+
+final _delaySinkTransformer =
+ StreamSinkTransformer.fromStreamTransformer(_delayTransformer);
+
+void main() {
+ late StreamChannelController controller;
+ late StreamChannel channel;
+ setUp(() {
+ controller = StreamChannelController();
+
+ // Add a bunch of layers of asynchronous dispatch between the channel and
+ // the underlying controllers.
+ var stream = controller.foreign.stream;
+ var sink = controller.foreign.sink;
+ for (var i = 0; i < 10; i++) {
+ stream = stream.transform(_delayTransformer);
+ sink = _delaySinkTransformer.bind(sink);
+ }
+
+ channel = StreamChannel.withCloseGuarantee(stream, sink);
+ });
+
+ test(
+ 'closing the event sink causes the stream to close before it emits any '
+ 'more events', () async {
+ controller.local.sink.add(1);
+ controller.local.sink.add(2);
+ controller.local.sink.add(3);
+
+ expect(
+ channel.stream
+ .listen(expectAsync1((event) {
+ if (event == 2) channel.sink.close();
+ }, count: 2))
+ .asFuture<void>(),
+ completes);
+
+ await pumpEventQueue();
+ });
+
+ test(
+ 'closing the event sink before events are emitted causes the stream to '
+ 'close immediately', () async {
+ unawaited(channel.sink.close());
+ channel.stream.listen(expectAsync1((_) {}, count: 0),
+ onError: expectAsync2((_, __) {}, count: 0),
+ onDone: expectAsync0(() {}));
+
+ controller.local.sink.add(1);
+ controller.local.sink.add(2);
+ controller.local.sink.add(3);
+ unawaited(controller.local.sink.close());
+
+ await pumpEventQueue();
+ });
+}
diff --git a/pkgs/stream_channel/test/with_guarantees_test.dart b/pkgs/stream_channel/test/with_guarantees_test.dart
new file mode 100644
index 0000000..f026079
--- /dev/null
+++ b/pkgs/stream_channel/test/with_guarantees_test.dart
@@ -0,0 +1,200 @@
+// Copyright (c) 2016, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'package:stream_channel/stream_channel.dart';
+import 'package:test/test.dart';
+
+void main() {
+ late StreamController streamController;
+ late StreamController sinkController;
+ late StreamChannel channel;
+ setUp(() {
+ streamController = StreamController<void>();
+ sinkController = StreamController<void>();
+ channel = StreamChannel.withGuarantees(
+ streamController.stream, sinkController.sink);
+ });
+
+ group('with a broadcast stream', () {
+ setUp(() {
+ streamController = StreamController.broadcast();
+ channel = StreamChannel.withGuarantees(
+ streamController.stream, sinkController.sink);
+ });
+
+ test('buffers events', () async {
+ streamController.add(1);
+ streamController.add(2);
+ streamController.add(3);
+ await pumpEventQueue();
+
+ expect(channel.stream.toList(), completion(equals([1, 2, 3])));
+ unawaited(streamController.close());
+ });
+
+ test('only allows a single subscription', () {
+ channel.stream.listen(null);
+ expect(() => channel.stream.listen(null), throwsStateError);
+ });
+ });
+
+ test(
+ 'closing the event sink causes the stream to close before it emits any '
+ 'more events', () {
+ streamController.add(1);
+ streamController.add(2);
+ streamController.add(3);
+
+ expect(
+ channel.stream
+ .listen(expectAsync1((event) {
+ if (event == 2) channel.sink.close();
+ }, count: 2))
+ .asFuture<void>(),
+ completes);
+ });
+
+ test('after the stream closes, the sink ignores events', () async {
+ unawaited(streamController.close());
+
+ // Wait for the done event to be delivered.
+ await channel.stream.toList();
+ channel.sink.add(1);
+ channel.sink.add(2);
+ channel.sink.add(3);
+ unawaited(channel.sink.close());
+
+ // None of our channel.sink additions should make it to the other endpoint.
+ sinkController.stream.listen(expectAsync1((_) {}, count: 0),
+ onDone: expectAsync0(() {}, count: 0));
+ await pumpEventQueue();
+ });
+
+ test("canceling the stream's subscription has no effect on the sink",
+ () async {
+ unawaited(channel.stream.listen(null).cancel());
+ await pumpEventQueue();
+
+ channel.sink.add(1);
+ channel.sink.add(2);
+ channel.sink.add(3);
+ unawaited(channel.sink.close());
+ expect(sinkController.stream.toList(), completion(equals([1, 2, 3])));
+ });
+
+ test("canceling the stream's subscription doesn't stop a done event",
+ () async {
+ unawaited(channel.stream.listen(null).cancel());
+ await pumpEventQueue();
+
+ unawaited(streamController.close());
+ await pumpEventQueue();
+
+ channel.sink.add(1);
+ channel.sink.add(2);
+ channel.sink.add(3);
+ unawaited(channel.sink.close());
+
+ // The sink should be ignoring events because the stream closed.
+ sinkController.stream.listen(expectAsync1((_) {}, count: 0),
+ onDone: expectAsync0(() {}, count: 0));
+ await pumpEventQueue();
+ });
+
+ test('forwards errors to the other endpoint', () {
+ channel.sink.addError('error');
+ expect(sinkController.stream.first, throwsA('error'));
+ });
+
+ test('Sink.done completes once the stream is done', () {
+ channel.stream.listen(null);
+ expect(channel.sink.done, completes);
+ streamController.close();
+ });
+
+ test("events can't be added to an explicitly-closed sink", () {
+ sinkController.stream.listen(null); // Work around sdk#19095.
+
+ expect(channel.sink.close(), completes);
+ expect(() => channel.sink.add(1), throwsStateError);
+ expect(() => channel.sink.addError('oh no'), throwsStateError);
+ expect(() => channel.sink.addStream(Stream.fromIterable([])),
+ throwsStateError);
+ });
+
+ test("events can't be added while a stream is being added", () {
+ var controller = StreamController<void>();
+ channel.sink.addStream(controller.stream);
+
+ expect(() => channel.sink.add(1), throwsStateError);
+ expect(() => channel.sink.addError('oh no'), throwsStateError);
+ expect(() => channel.sink.addStream(Stream.fromIterable([])),
+ throwsStateError);
+ expect(() => channel.sink.close(), throwsStateError);
+
+ controller.close();
+ });
+
+ group('with allowSinkErrors: false', () {
+ setUp(() {
+ streamController = StreamController<void>();
+ sinkController = StreamController<void>();
+ channel = StreamChannel.withGuarantees(
+ streamController.stream, sinkController.sink,
+ allowSinkErrors: false);
+ });
+
+ test('forwards errors to Sink.done but not the stream', () {
+ channel.sink.addError('oh no');
+ expect(channel.sink.done, throwsA('oh no'));
+ sinkController.stream
+ .listen(null, onError: expectAsync1((dynamic _) {}, count: 0));
+ });
+
+ test('adding an error causes the stream to emit a done event', () {
+ expect(channel.sink.done, throwsA('oh no'));
+
+ streamController.add(1);
+ streamController.add(2);
+ streamController.add(3);
+
+ expect(
+ channel.stream
+ .listen(expectAsync1((event) {
+ if (event == 2) channel.sink.addError('oh no');
+ }, count: 2))
+ .asFuture<void>(),
+ completes);
+ });
+
+ test('adding an error closes the inner sink', () {
+ channel.sink.addError('oh no');
+ expect(channel.sink.done, throwsA('oh no'));
+ expect(sinkController.stream.toList(), completion(isEmpty));
+ });
+
+ test(
+        'adding an error via addStream causes the stream to emit a done '
+ 'event', () async {
+ var canceled = false;
+ var controller = StreamController<void>(onCancel: () {
+ canceled = true;
+ });
+
+ // This future shouldn't get the error, because it's sent to [Sink.done].
+ expect(channel.sink.addStream(controller.stream), completes);
+
+ controller.addError('oh no');
+ expect(channel.sink.done, throwsA('oh no'));
+ await pumpEventQueue();
+ expect(canceled, isTrue);
+
+ // Even though the sink is closed, this shouldn't throw an error because
+ // the user didn't explicitly close it.
+ channel.sink.add(1);
+ });
+ });
+}
diff --git a/pkgs/stream_transform/.gitignore b/pkgs/stream_transform/.gitignore
new file mode 100644
index 0000000..bfffcc6
--- /dev/null
+++ b/pkgs/stream_transform/.gitignore
@@ -0,0 +1,6 @@
+.pub/
+.dart_tool/
+build/
+packages
+pubspec.lock
+.packages
diff --git a/pkgs/stream_transform/CHANGELOG.md b/pkgs/stream_transform/CHANGELOG.md
new file mode 100644
index 0000000..b09778b
--- /dev/null
+++ b/pkgs/stream_transform/CHANGELOG.md
@@ -0,0 +1,189 @@
+## 2.1.2-wip
+
+- Require Dart 3.4 or greater.
+
+## 2.1.1
+
+- Require Dart 3.1 or greater
+- Forward errors from the `trigger` future through to the result stream in
+ `takeUntil`. Previously an error would have not closed the stream, and instead
+ raised as an unhandled async error.
+- Move to `dart-lang/tools` monorepo.
+
+## 2.1.0
+
+- Add `whereNotNull`.
+
+## 2.0.1
+
+- Require Dart 2.14 or greater.
+- Wait for the future returned from `StreamSubscription.cancel()` before
+ listening to the subsequent stream in `switchLatest` and `switchMap`.
+
+## 2.0.0
+
+- Migrate to null safety.
+- Improve tests of `switchMap` and improve documentation with links and
+ clarification.
+- Add `trailing` argument to `throttle`.
+
+## 1.2.0
+
+- Add support for emitting the "leading" event in `debounce`.
+
+## 1.1.1
+
+- Fix a bug in `asyncMapSample`, `buffer`, `combineLatest`,
+ `combineLatestAll`, `merge`, and `mergeAll` which would cause an exception
+ when cancelling a subscription after using the transformer if the original
+ stream(s) returned `null` from cancelling their subscriptions.
+
+## 1.1.0
+
+- Add `concurrentAsyncExpand` to interleave events emitted by multiple sub
+ streams created by a callback.
+
+## 1.0.0
+
+- Remove the top level methods and retain the extensions only.
+
+## 0.0.20
+
+- Add extension methods for most transformers. These should be used in place
+ of the current methods. All current implementations are deprecated and will
+ be removed in the next major version bump.
+ - Migrating typical use: Instead of
+ `stream.transform(debounce(Duration(seconds: 1)))` use
+ `stream.debounce(Duration(seconds: 1))`.
+ - To migrate a usage where a `StreamTransformer` instance is stored or
+ passed see "Getting a StreamTransformer instance" on the README.
+- The `map` and `chainTransformers` utilities are no longer useful with the
+ new patterns so they are deprecated without a replacement. If you still have
+ a need for them they can be replicated with `StreamTransformer.fromBind`:
+
+ ```
+ // Replace `map(convert)`
+ StreamTransformer.fromBind((s) => s.map(convert));
+
+ // Replace `chainTransformers(first, second)`
+ StreamTransformer.fromBind((s) => s.transform(first).transform(second));
+ ```
+
+## 0.0.19
+
+- Add `asyncMapSample` transform.
+
+## 0.0.18
+
+- Internal cleanup. Passed "trigger" streams or futures now allow a `<void>`
+  generic type rather than an implicit `<dynamic>`.
+
+## 0.0.17
+
+- Add concrete types to the `onError` callback in `tap`.
+
+## 0.0.16+1
+
+- Remove usage of Set literal which is not available before Dart 2.2.0
+
+## 0.0.16
+
+- Allow a `combine` callback to return a `FutureOr<T>` in `scan`. There are no
+ behavior changes for synchronous callbacks. **Potential breaking change** In
+ the unlikely situation where `scan` was used to produce a `Stream<Future>`
+ inference may now fail and require explicit generic type arguments.
+- Add `combineLatest`.
+- Add `combineLatestAll`.
+
+## 0.0.15
+
+- Add `whereType`.
+
+## 0.0.14+1
+
+- Allow using non-dev Dart 2 SDK.
+
+## 0.0.14
+
+- `asyncWhere` will now forward exceptions thrown by the callback through the
+ result Stream.
+- Added `concurrentAsyncMap`.
+
+## 0.0.13
+
+- `mergeAll` now accepts an `Iterable<Stream>` instead of only `List<Stream>`.
+
+## 0.0.12
+
+- Add `chainTransformers` and `map` for use cases where `StreamTransformer`
+ instances are stored as variables or passed to methods other than `transform`.
+
+## 0.0.11
+
+- Renamed `concat` as `followedBy` to match the naming of `Iterable.followedBy`.
+ `concat` is now deprecated.
+
+## 0.0.10
+
+- Updates to support Dart 2.0 core library changes (wave
+ 2.2). See [issue 31847][sdk#31847] for details.
+
+ [sdk#31847]: https://github.com/dart-lang/sdk/issues/31847
+
+## 0.0.9
+
+- Add `asyncMapBuffer`.
+
+## 0.0.8
+
+- Add `takeUntil`.
+
+## 0.0.7
+
+- Bug Fix: Streams produced with `scan` and `switchMap` now correctly report
+ `isBroadcast`.
+- Add `startWith`, `startWithMany`, and `startWithStream`.
+
+## 0.0.6
+
+- Bug Fix: Some transformers did not correctly add data to all listeners on
+ broadcast streams. Fixed for `throttle`, `debounce`, `asyncWhere` and `audit`.
+- Bug Fix: Only call the `tap` data callback once per event rather than once per
+ listener.
+- Bug Fix: Allow canceling and re-listening to broadcast streams after a
+ `merge` transform.
+- Bug Fix: Broadcast streams which are buffered using a single-subscription
+ trigger can be canceled and re-listened.
+- Bug Fix: Buffer outputs one more value if there is a pending trigger before
+ the trigger closes.
+- Bug Fix: Single-subscription streams concatted after broadcast streams are
+ handled correctly.
+- Use sync `StreamControllers` for forwarding where possible.
+
+## 0.0.5
+
+- Bug Fix: Allow compiling switchLatest with Dart2Js.
+- Add `asyncWhere`: Like `where` but allows an asynchronous predicate.
+
+## 0.0.4
+
+- Add `scan`: fold which returns intermediate values
+- Add `throttle`: block events for a duration after emitting a value
+- Add `audit`: emits the last event received after a duration
+
+## 0.0.3
+
+- Add `tap`: React to values as they pass without being a subscriber on a stream
+- Add `switchMap` and `switchLatest`: Flatten a Stream of Streams into a Stream
+ which forwards values from the most recent Stream
+
+## 0.0.2
+
+- Add `concat`: Appends streams in series
+- Add `merge` and `mergeAll`: Interleaves streams
+
+## 0.0.1
+
+- Initial release with the following utilities:
+ - `buffer`: Collects events in a `List` until a `trigger` stream fires.
+ - `debounce`, `debounceBuffer`: Collect or drop events which occur closer in
+ time than a given duration.
diff --git a/pkgs/stream_transform/LICENSE b/pkgs/stream_transform/LICENSE
new file mode 100644
index 0000000..03af64a
--- /dev/null
+++ b/pkgs/stream_transform/LICENSE
@@ -0,0 +1,27 @@
+Copyright 2017, the Dart project authors.
+
+Redistribution and use in source and binary forms, with or without
+modification, are permitted provided that the following conditions are
+met:
+
+ * Redistributions of source code must retain the above copyright
+ notice, this list of conditions and the following disclaimer.
+ * Redistributions in binary form must reproduce the above
+ copyright notice, this list of conditions and the following
+ disclaimer in the documentation and/or other materials provided
+ with the distribution.
+ * Neither the name of Google LLC nor the names of its
+ contributors may be used to endorse or promote products derived
+ from this software without specific prior written permission.
+
+THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
diff --git a/pkgs/stream_transform/README.md b/pkgs/stream_transform/README.md
new file mode 100644
index 0000000..e7049bd
--- /dev/null
+++ b/pkgs/stream_transform/README.md
@@ -0,0 +1,141 @@
+[](https://github.com/dart-lang/tools/actions/workflows/stream_transform.yaml)
+[](https://pub.dev/packages/stream_transform)
+[](https://pub.dev/packages/stream_transform/publisher)
+
+Extension methods on `Stream` adding common transform operators.
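+
+For example (a minimal sketch), debouncing a fast source so that only settled
+values come through:
+
+```dart
+import 'package:stream_transform/stream_transform.dart';
+
+void main() {
+  Stream.periodic(const Duration(milliseconds: 10), (i) => i)
+      .take(20)
+      .debounce(const Duration(milliseconds: 100))
+      .listen(print); // With the default trailing behavior, prints only 19.
+}
+```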
+
+## Operators
+
+### asyncMapBuffer, asyncMapSample, concurrentAsyncMap
+
+Alternatives to `asyncMap`. `asyncMapBuffer` prevents the callback from
+overlapping execution and collects events while it is executing.
+`asyncMapSample` prevents overlapping execution and discards events while it is
+executing. `concurrentAsyncMap` allows overlap and removes ordering guarantees
+for higher throughput.
+
+`asyncMapBuffer` is like `asyncMap`, but events are buffered in a `List` until
+previous events have been processed, rather than the callback being invoked for
+each element individually.
+
+### asyncWhere
+
+Like `where` but allows an asynchronous predicate.
+
+### audit
+
+Waits for a period of time after receiving a value and then only emits the most
+recent value.
+
+### buffer
+
+Collects values from a source stream until a `trigger` stream fires and the
+collected values are emitted.
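+
+For instance (a brief sketch, where `values` is an assumed `Stream<int>`),
+batching events between ticks of a periodic timer:
+
+```dart
+final batches =
+    values.buffer(Stream<void>.periodic(const Duration(seconds: 1)));
+// `batches` is a Stream<List<int>> of everything seen since the previous tick.
+```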
+
+### combineLatest, combineLatestAll
+
+Combine the most recent event from multiple streams through a callback or into a
+list.
+
+### debounce, debounceBuffer
+
+Prevents a source stream from emitting too frequently by dropping or collecting
+values that occur within a given duration.
+
+### followedBy
+
+Appends the values of a stream after another stream finishes.
+
+### merge, mergeAll, concurrentAsyncExpand
+
+Interleaves events from multiple streams into a single stream.
+
+### scan
+
+Scan is like fold, but instead of producing a single value it yields each
+intermediate accumulation.
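+
+For example (a quick sketch), a running sum:
+
+```dart
+final sums = Stream.fromIterable([1, 2, 3])
+    .scan<int>(0, (sum, value) => sum + value); // Emits 1, then 3, then 6.
+```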
+
+### startWith, startWithMany, startWithStream
+
+Prepend a value, an iterable, or a stream to the beginning of another stream.
+
+### switchMap, switchLatest
+
+Flatten a Stream of Streams into a Stream which forwards values from the most
+recent Stream.
+
+### takeUntil
+
+Let values through until a Future fires.
+
+### tap
+
+Taps into a single-subscriber stream to react to values as they pass, without
+being a real subscriber.
+
+### throttle
+
+Blocks events for a duration after an event is successfully emitted.
+
+### whereType
+
+Like `Iterable.whereType` for a stream.
+
+## Comparison to Rx Operators
+
+The semantics and naming in this package have some overlap, and some conflict,
+with the [ReactiveX](https://reactivex.io/) suite of libraries. Some of the
+conflict is intentional - Dart `Stream` predates `Observable` and coherence with
+the Dart ecosystem semantics and naming is a strictly higher priority than
+consistency with ReactiveX.
+
+Rx Operator Category | variation | `stream_transform`
+------------------------- | ------------------------------------------------------ | ------------------
+[`sample`][rx_sample] | `sample/throttleLast(Duration)` | `sample(Stream.periodic(Duration), longPoll: false)`
+​ | `throttleFirst(Duration)` | [`throttle`][throttle]
+​ | `sample(Observable)` | `sample(trigger, longPoll: false)`
+[`debounce`][rx_debounce] | `debounce/throttleWithTimeout(Duration)` | [`debounce`][debounce]
+​ | `debounce(Observable)` | No equivalent
+[`buffer`][rx_buffer] | `buffer(boundary)`, `bufferWithTime`,`bufferWithCount` | No equivalent
+​ | `buffer(boundaryClosingSelector)` | `buffer(trigger, longPoll: false)`
+RxJs extensions | [`audit(callback)`][rxjs_audit] | No equivalent
+​ | [`auditTime(Duration)`][rxjs_auditTime] | [`audit`][audit]
+​ | [`exhaustMap`][rxjs_exhaustMap] | No equivalent
+​ | [`throttleTime(trailing: true)`][rxjs_throttleTime] | `throttle(trailing: true)`
+​ | `throttleTime(leading: false, trailing: true)` | No equivalent
+No equivalent? | | [`asyncMapBuffer`][asyncMapBuffer]
+​ | | [`asyncMapSample`][asyncMapSample]
+​ | | [`buffer`][buffer]
+​ | | [`sample`][sample]
+​ | | [`debounceBuffer`][debounceBuffer]
+​ | | `debounce(leading: true, trailing: false)`
+​ | | `debounce(leading: true, trailing: true)`
+
+[rx_sample]:https://reactivex.io/documentation/operators/sample.html
+[rx_debounce]:https://reactivex.io/documentation/operators/debounce.html
+[rx_buffer]:https://reactivex.io/documentation/operators/buffer.html
+[rxjs_audit]:https://rxjs.dev/api/operators/audit
+[rxjs_auditTime]:https://rxjs.dev/api/operators/auditTime
+[rxjs_throttleTime]:https://rxjs.dev/api/operators/throttleTime
+[rxjs_exhaustMap]:https://rxjs.dev/api/operators/exhaustMap
+[asyncMapBuffer]:https://pub.dev/documentation/stream_transform/latest/stream_transform/AsyncMap/asyncMapBuffer.html
+[asyncMapSample]:https://pub.dev/documentation/stream_transform/latest/stream_transform/AsyncMap/asyncMapSample.html
+[audit]:https://pub.dev/documentation/stream_transform/latest/stream_transform/RateLimit/audit.html
+[buffer]:https://pub.dev/documentation/stream_transform/latest/stream_transform/RateLimit/buffer.html
+[sample]:https://pub.dev/documentation/stream_transform/latest/stream_transform/RateLimit/sample.html
+[debounceBuffer]:https://pub.dev/documentation/stream_transform/latest/stream_transform/RateLimit/debounceBuffer.html
+[debounce]:https://pub.dev/documentation/stream_transform/latest/stream_transform/RateLimit/debounce.html
+[throttle]:https://pub.dev/documentation/stream_transform/latest/stream_transform/RateLimit/throttle.html
+
+## Getting a `StreamTransformer` instance
+
+It may be useful to pass an instance of `StreamTransformer` so that it can be
+used with `stream.transform` calls rather than reference the specific operator
+in place. Any operator on `Stream` that returns a `Stream` can be modeled as a
+`StreamTransformer` using the [`fromBind` constructor][fromBind].
+
+```dart
+final debounce = StreamTransformer.fromBind(
+ (s) => s.debounce(const Duration(milliseconds: 100)));
+```
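+
+The same pattern works with explicit type arguments when the transformer will be
+applied to a typed stream. A usage sketch, assuming `values` is a `Stream<int>`:
+
+```dart
+final debounceInts = StreamTransformer<int, int>.fromBind(
+    (s) => s.debounce(const Duration(milliseconds: 100)));
+final debouncedValues = values.transform(debounceInts);
+```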
+
+[fromBind]: https://api.dart.dev/stable/dart-async/StreamTransformer/StreamTransformer.fromBind.html
diff --git a/pkgs/stream_transform/analysis_options.yaml b/pkgs/stream_transform/analysis_options.yaml
new file mode 100644
index 0000000..05f1af1
--- /dev/null
+++ b/pkgs/stream_transform/analysis_options.yaml
@@ -0,0 +1,16 @@
+include: package:dart_flutter_team_lints/analysis_options.yaml
+
+analyzer:
+ language:
+ strict-casts: true
+ strict-raw-types: true
+
+linter:
+ rules:
+ - avoid_bool_literals_in_conditional_expressions
+ - avoid_classes_with_only_static_members
+ - avoid_returning_this
+ - avoid_unused_constructor_parameters
+ - cascade_invocations
+ - join_return_with_assignment
+ - no_adjacent_strings_in_list
diff --git a/pkgs/stream_transform/example/index.html b/pkgs/stream_transform/example/index.html
new file mode 100644
index 0000000..aecdc09
--- /dev/null
+++ b/pkgs/stream_transform/example/index.html
@@ -0,0 +1,11 @@
+<html>
+ <head>
+ <script defer src="main.dart.js" type="application/javascript"></script>
+ </head>
+ <body>
+ <input id="first_input"><br>
+ <input id="second_input"><br>
+ <p id="output">
+ </p>
+ </body>
+</html>
diff --git a/pkgs/stream_transform/example/main.dart b/pkgs/stream_transform/example/main.dart
new file mode 100644
index 0000000..8224393
--- /dev/null
+++ b/pkgs/stream_transform/example/main.dart
@@ -0,0 +1,25 @@
+// Copyright (c) 2019, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:stream_transform/stream_transform.dart';
+import 'package:web/web.dart';
+
+void main() {
+ var firstInput = document.querySelector('#first_input') as HTMLInputElement;
+ var secondInput = document.querySelector('#second_input') as HTMLInputElement;
+ var output = document.querySelector('#output')!;
+
+ _inputValues(firstInput)
+ .combineLatest(_inputValues(secondInput),
+ (first, second) => 'First: $first, Second: $second')
+ .tap((v) {
+ print('Saw: $v');
+ }).forEach((v) {
+ output.text = v;
+ });
+}
+
+Stream<String?> _inputValues(HTMLInputElement element) => element.onKeyUp
+ .debounce(const Duration(milliseconds: 100))
+ .map((_) => element.value);
diff --git a/pkgs/stream_transform/lib/src/aggregate_sample.dart b/pkgs/stream_transform/lib/src/aggregate_sample.dart
new file mode 100644
index 0000000..f2ff8ed
--- /dev/null
+++ b/pkgs/stream_transform/lib/src/aggregate_sample.dart
@@ -0,0 +1,146 @@
+// Copyright (c) 2019, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'common_callbacks.dart';
+
+extension AggregateSample<T> on Stream<T> {
+ /// Computes a value based on sequences of events, then emits that value when
+ /// [trigger] emits an event.
+ ///
+ /// Every time this stream emits an event, an intermediate value is created
+ /// by combining the new event with the previous intermediate value, or with
+ /// `null` if there is no previous value, using the [aggregate] function.
+ ///
+ /// When [trigger] emits value, the returned stream emits the current
+ /// intermediate value and clears it.
+ ///
+ /// If [longPoll] is `false`, if there is no intermediate value when [trigger]
+ /// emits an event, the [onEmpty] function is called with a [Sink] which can
+ /// add events to the returned stream.
+ ///
+ /// If [longPoll] is `true`, and there is no intermediate value when [trigger]
+ /// emits one or more events, then the *next* event from this stream is
+ /// immediately put through [aggregate] and emitted on the returned stream.
+ /// Subsequent events on [trigger] while there have been no events on this
+ /// stream are ignored.
+ /// In that case, [onEmpty] is never used.
+ ///
+ /// The result stream will close as soon as there is a guarantee it will not
+ /// emit any more events. There will not be any more events emitted if:
+ /// - [trigger] is closed and there is no waiting long poll.
+ /// - Or, the source stream is closed and there are no buffered events.
+ ///
+ /// If the source stream is a broadcast stream, the result will be as well.
+ /// Errors from the source stream or the trigger are immediately forwarded to
+ /// the output.
+ Stream<S> aggregateSample<S>(
+ {required Stream<void> trigger,
+ required S Function(T, S?) aggregate,
+ required bool longPoll,
+ required void Function(Sink<S>) onEmpty}) {
+ var controller = isBroadcast
+ ? StreamController<S>.broadcast(sync: true)
+ : StreamController<S>(sync: true);
+
+ S? currentResults;
+ var hasCurrentResults = false;
+ var activeLongPoll = false;
+ var isTriggerDone = false;
+ var isValueDone = false;
+ StreamSubscription<T>? valueSub;
+ StreamSubscription<void>? triggerSub;
+
+ void emit(S results) {
+ currentResults = null;
+ hasCurrentResults = false;
+ controller.add(results);
+ }
+
+ void onValue(T value) {
+ currentResults = aggregate(value, currentResults);
+ hasCurrentResults = true;
+ if (!longPoll) return;
+
+ if (activeLongPoll) {
+ activeLongPoll = false;
+ emit(currentResults as S);
+ }
+
+ if (isTriggerDone) {
+ valueSub!.cancel();
+ controller.close();
+ }
+ }
+
+ void onValuesDone() {
+ isValueDone = true;
+ if (!hasCurrentResults) {
+ triggerSub?.cancel();
+ controller.close();
+ }
+ }
+
+ void onTrigger(_) {
+ if (hasCurrentResults) {
+ emit(currentResults as S);
+ } else if (longPoll) {
+ activeLongPoll = true;
+ } else {
+ onEmpty(controller);
+ }
+
+ if (isValueDone) {
+ triggerSub!.cancel();
+ controller.close();
+ }
+ }
+
+ void onTriggerDone() {
+ isTriggerDone = true;
+ if (!activeLongPoll) {
+ valueSub?.cancel();
+ controller.close();
+ }
+ }
+
+ controller.onListen = () {
+ assert(valueSub == null);
+ valueSub =
+ listen(onValue, onError: controller.addError, onDone: onValuesDone);
+ final priorTriggerSub = triggerSub;
+ if (priorTriggerSub != null) {
+ if (priorTriggerSub.isPaused) priorTriggerSub.resume();
+ } else {
+ triggerSub = trigger.listen(onTrigger,
+ onError: controller.addError, onDone: onTriggerDone);
+ }
+ if (!isBroadcast) {
+ controller
+ ..onPause = () {
+ valueSub?.pause();
+ triggerSub?.pause();
+ }
+ ..onResume = () {
+ valueSub?.resume();
+ triggerSub?.resume();
+ };
+ }
+ controller.onCancel = () {
+ var cancels = <Future<void>>[if (!isValueDone) valueSub!.cancel()];
+ valueSub = null;
+ if (trigger.isBroadcast || !isBroadcast) {
+ if (!isTriggerDone) cancels.add(triggerSub!.cancel());
+ triggerSub = null;
+ } else {
+ triggerSub!.pause();
+ }
+ if (cancels.isEmpty) return null;
+ return cancels.wait.then(ignoreArgument);
+ };
+ };
+ return controller.stream;
+ }
+}
diff --git a/pkgs/stream_transform/lib/src/async_expand.dart b/pkgs/stream_transform/lib/src/async_expand.dart
new file mode 100644
index 0000000..28d2f40
--- /dev/null
+++ b/pkgs/stream_transform/lib/src/async_expand.dart
@@ -0,0 +1,89 @@
+// Copyright (c) 2022, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'common_callbacks.dart';
+import 'switch.dart';
+
+/// Alternatives to [asyncExpand].
+///
+/// The built-in [asyncExpand] will not overlap the inner streams and every
+/// event will be sent to the callback individually.
+///
+/// - [concurrentAsyncExpand] allows overlap and merges inner streams without
+/// ordering guarantees.
+extension AsyncExpand<T> on Stream<T> {
+ /// Like [asyncExpand] but the [convert] callback may be called for an element
+ /// before the [Stream] emitted by the previous element has closed.
+ ///
+ /// Events on the result stream will be emitted in the order they are emitted
+ /// by the sub streams, which may not match the order of this stream.
+ ///
+ /// Errors from [convert], the source stream, or any of the sub streams are
+ /// forwarded to the result stream.
+ ///
+ /// The result stream will not close until the source stream closes and all
+ /// sub streams have closed.
+ ///
+ /// If the source stream is a broadcast stream, the result will be as well,
+ /// regardless of the types of streams created by [convert]. In this case,
+ /// some care should be taken:
+ /// - If [convert] returns a single subscription stream it may be listened to
+ /// and never canceled.
+ /// - For any period of time where there are no listeners on the result
+ /// stream, any sub streams from previously emitted events will be ignored,
+ /// regardless of whether they emit further events after a listener is added
+ /// back.
+ ///
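+  /// For example (an illustrative sketch; `fetchResults` is a hypothetical
+  /// function returning a `Stream` of results for an id):
+  ///
+  ///     ids.concurrentAsyncExpand(fetchResults).listen(print);
+  ///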
+ /// See also:
+ /// - [switchMap], which cancels subscriptions to the previous sub stream
+ /// instead of concurrently emitting events from all sub streams.
+ Stream<S> concurrentAsyncExpand<S>(Stream<S> Function(T) convert) {
+ final controller = isBroadcast
+ ? StreamController<S>.broadcast(sync: true)
+ : StreamController<S>(sync: true);
+
+ controller.onListen = () {
+ final subscriptions = <StreamSubscription<dynamic>>[];
+ final outerSubscription = map(convert).listen((inner) {
+ if (isBroadcast && !inner.isBroadcast) {
+ inner = inner.asBroadcastStream();
+ }
+ final subscription =
+ inner.listen(controller.add, onError: controller.addError);
+ subscription.onDone(() {
+ subscriptions.remove(subscription);
+ if (subscriptions.isEmpty) controller.close();
+ });
+ subscriptions.add(subscription);
+ }, onError: controller.addError);
+ outerSubscription.onDone(() {
+ subscriptions.remove(outerSubscription);
+ if (subscriptions.isEmpty) controller.close();
+ });
+ subscriptions.add(outerSubscription);
+ if (!isBroadcast) {
+ controller
+ ..onPause = () {
+ for (final subscription in subscriptions) {
+ subscription.pause();
+ }
+ }
+ ..onResume = () {
+ for (final subscription in subscriptions) {
+ subscription.resume();
+ }
+ };
+ }
+ controller.onCancel = () {
+ if (subscriptions.isEmpty) return null;
+ return [for (var s in subscriptions) s.cancel()]
+ .wait
+ .then(ignoreArgument);
+ };
+ };
+ return controller.stream;
+ }
+}
diff --git a/pkgs/stream_transform/lib/src/async_map.dart b/pkgs/stream_transform/lib/src/async_map.dart
new file mode 100644
index 0000000..094df9c
--- /dev/null
+++ b/pkgs/stream_transform/lib/src/async_map.dart
@@ -0,0 +1,136 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'aggregate_sample.dart';
+import 'common_callbacks.dart';
+import 'from_handlers.dart';
+import 'rate_limit.dart';
+
+/// Alternatives to [asyncMap].
+///
+/// The built-in [asyncMap] will not overlap execution of the passed callback,
+/// and every event will be sent to the callback individually.
+///
+/// - [asyncMapBuffer] prevents the callback from overlapping execution and
+/// collects events while it is executing to process in batches.
+/// - [asyncMapSample] prevents overlapping execution and discards events while
+/// it is executing.
+/// - [concurrentAsyncMap] allows overlap and removes ordering guarantees.
+extension AsyncMap<T> on Stream<T> {
+ /// Like [asyncMap] but events are buffered until previous events have been
+ /// processed by [convert].
+ ///
+ /// If this stream is a broadcast stream the result will be as well.
+ /// When used with a broadcast stream behavior also differs from [asyncMap] in
+ /// that the [convert] function is only called once per event, rather than
+ /// once per listener per event.
+ ///
+ /// The first event from this stream is always passed to [convert] as a
+ /// list with a single element.
+ /// After that, events are buffered until the previous Future returned from
+ /// [convert] has completed.
+ ///
+ /// Errors from this stream are forwarded directly to the result stream.
+  /// Errors during the conversion are also forwarded to the result stream and
+  /// are considered to complete that work, so the next values are let through.
+ ///
+ /// The result stream will not close until this stream closes and all pending
+ /// conversions have finished.
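+  ///
+  /// For example (an illustrative sketch; `saveAll` is a hypothetical batch
+  /// callback returning a `Future`):
+  ///
+  ///     changes.asyncMapBuffer(saveAll).listen(print);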
+ Stream<S> asyncMapBuffer<S>(Future<S> Function(List<T>) convert) {
+ var workFinished = StreamController<void>()
+ // Let the first event through.
+ ..add(null);
+ return buffer(workFinished.stream)._asyncMapThen(convert, workFinished.add);
+ }
+
+ /// Like [asyncMap] but events are discarded while work is happening in
+ /// [convert].
+ ///
+ /// If this stream is a broadcast stream the result will be as well.
+ /// When used with a broadcast stream behavior also differs from [asyncMap] in
+ /// that the [convert] function is only called once per event, rather than
+ /// once per listener per event.
+ ///
+ /// If no work is happening when an event is emitted it will be immediately
+ /// passed to [convert]. If there is ongoing work when an event is emitted it
+ /// will be held until the work is finished. New events emitted will replace a
+ /// pending event.
+ ///
+ /// Errors from this stream are forwarded directly to the result stream.
+  /// Errors during the conversion are also forwarded to the result stream and
+  /// are considered to complete that work, so the next values are let through.
+ ///
+ /// The result stream will not close until this stream closes and all pending
+ /// conversions have finished.
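+  ///
+  /// For example (an illustrative sketch; `render` is a hypothetical
+  /// asynchronous callback):
+  ///
+  ///     frames.asyncMapSample(render).listen(print);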
+ Stream<S> asyncMapSample<S>(Future<S> Function(T) convert) {
+ var workFinished = StreamController<void>()
+ // Let the first event through.
+ ..add(null);
+ return aggregateSample(
+ trigger: workFinished.stream,
+ aggregate: _dropPrevious,
+ longPoll: true,
+ onEmpty: ignoreArgument)
+ ._asyncMapThen(convert, workFinished.add);
+ }
+
+ /// Like [asyncMap] but the [convert] callback may be called for an element
+ /// before processing for the previous element is finished.
+ ///
+ /// Events on the result stream will be emitted in the order that [convert]
+ /// completed which may not match the order of this stream.
+ ///
+ /// If this stream is a broadcast stream the result will be as well.
+ /// When used with a broadcast stream behavior also differs from [asyncMap] in
+ /// that the [convert] function is only called once per event, rather than
+ /// once per listener per event. The [convert] callback won't be called for
+ /// events while a broadcast stream has no listener.
+ ///
+ /// Errors from [convert] or this stream are forwarded directly to the
+ /// result stream.
+ ///
+ /// The result stream will not close until this stream closes and all pending
+ /// conversions have finished.
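+  ///
+  /// For example (an illustrative sketch; `fetch` is a hypothetical function
+  /// returning a `Future`):
+  ///
+  ///     urls.concurrentAsyncMap(fetch).listen(print);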
+ Stream<S> concurrentAsyncMap<S>(FutureOr<S> Function(T) convert) {
+ var valuesWaiting = 0;
+ var sourceDone = false;
+ return transformByHandlers(onData: (element, sink) {
+ valuesWaiting++;
+ () async {
+ try {
+ sink.add(await convert(element));
+ } catch (e, st) {
+ sink.addError(e, st);
+ }
+ valuesWaiting--;
+ if (valuesWaiting <= 0 && sourceDone) sink.close();
+ }();
+ }, onDone: (sink) {
+ sourceDone = true;
+ if (valuesWaiting <= 0) sink.close();
+ });
+ }
+
+ /// Like [Stream.asyncMap] but the [convert] is only called once per event,
+ /// rather than once per listener, and [then] is called after completing the
+ /// work.
+ Stream<S> _asyncMapThen<S>(
+ Future<S> Function(T) convert, void Function(void) then) {
+ Future<void>? pendingEvent;
+ return transformByHandlers(onData: (event, sink) {
+ pendingEvent =
+ convert(event).then(sink.add).catchError(sink.addError).then(then);
+ }, onDone: (sink) {
+ if (pendingEvent != null) {
+ pendingEvent!.then((_) => sink.close());
+ } else {
+ sink.close();
+ }
+ });
+ }
+}
+
+T _dropPrevious<T>(T event, _) => event;
diff --git a/pkgs/stream_transform/lib/src/combine_latest.dart b/pkgs/stream_transform/lib/src/combine_latest.dart
new file mode 100644
index 0000000..f02a19e
--- /dev/null
+++ b/pkgs/stream_transform/lib/src/combine_latest.dart
@@ -0,0 +1,240 @@
+// Copyright (c) 2019, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'common_callbacks.dart';
+
+/// Utilities to combine events from multiple streams through a callback or into
+/// a list.
+extension CombineLatest<T> on Stream<T> {
+ /// Combines the latest values from this stream with the latest values from
+ /// [other] using [combine].
+ ///
+ /// No event will be emitted until both the source stream and [other] have
+ /// each emitted at least one event. If either the source stream or [other]
+ /// emit multiple events before the other emits the first event, all but the
+ /// last value will be discarded. Once both streams have emitted at least
+ /// once, the result stream will emit any time either input stream emits.
+ ///
+ /// The result stream will not close until both the source stream and [other]
+ /// have closed.
+ ///
+ /// For example:
+ ///
+ /// source.combineLatest(other, (a, b) => a + b);
+ ///
+ /// source: --1--2--------4--|
+ /// other: -------3--|
+ /// result: -------5------7--|
+ ///
+ /// Errors thrown by [combine], along with any errors on the source stream or
+ /// [other], are forwarded to the result stream.
+ ///
+ /// If the source stream is a broadcast stream, the result stream will be as
+ /// well, regardless of [other]'s type. If a single subscription stream is
+ /// combined with a broadcast stream it may never be canceled.
+ Stream<S> combineLatest<T2, S>(
+ Stream<T2> other, FutureOr<S> Function(T, T2) combine) {
+ final controller = isBroadcast
+ ? StreamController<S>.broadcast(sync: true)
+ : StreamController<S>(sync: true);
+
+ other =
+ (isBroadcast && !other.isBroadcast) ? other.asBroadcastStream() : other;
+
+ StreamSubscription<T>? sourceSubscription;
+ StreamSubscription<T2>? otherSubscription;
+
+ var sourceDone = false;
+ var otherDone = false;
+
+ late T latestSource;
+ late T2 latestOther;
+
+ var sourceStarted = false;
+ var otherStarted = false;
+
+ void emitCombined() {
+ if (!sourceStarted || !otherStarted) return;
+ FutureOr<S> result;
+ try {
+ result = combine(latestSource, latestOther);
+ } catch (e, s) {
+ controller.addError(e, s);
+ return;
+ }
+ if (result is Future<S>) {
+ sourceSubscription!.pause();
+ otherSubscription!.pause();
+ result
+ .then(controller.add, onError: controller.addError)
+ .whenComplete(() {
+ sourceSubscription!.resume();
+ otherSubscription!.resume();
+ });
+ } else {
+ controller.add(result);
+ }
+ }
+
+ controller.onListen = () {
+ assert(sourceSubscription == null);
+ sourceSubscription = listen(
+ (s) {
+ sourceStarted = true;
+ latestSource = s;
+ emitCombined();
+ },
+ onError: controller.addError,
+ onDone: () {
+ sourceDone = true;
+ if (otherDone) {
+ controller.close();
+ } else if (!sourceStarted) {
+ // Nothing can ever be emitted
+ otherSubscription!.cancel();
+ controller.close();
+ }
+ });
+ otherSubscription = other.listen(
+ (o) {
+ otherStarted = true;
+ latestOther = o;
+ emitCombined();
+ },
+ onError: controller.addError,
+ onDone: () {
+ otherDone = true;
+ if (sourceDone) {
+ controller.close();
+ } else if (!otherStarted) {
+ // Nothing can ever be emitted
+ sourceSubscription!.cancel();
+ controller.close();
+ }
+ });
+ if (!isBroadcast) {
+ controller
+ ..onPause = () {
+ sourceSubscription!.pause();
+ otherSubscription!.pause();
+ }
+ ..onResume = () {
+ sourceSubscription!.resume();
+ otherSubscription!.resume();
+ };
+ }
+ controller.onCancel = () {
+ var cancels = [
+ sourceSubscription!.cancel(),
+ otherSubscription!.cancel()
+ ];
+ sourceSubscription = null;
+ otherSubscription = null;
+ return cancels.wait.then(ignoreArgument);
+ };
+ };
+ return controller.stream;
+ }
+
+ /// Combine the latest value emitted from the source stream with the latest
+ /// values emitted from [others].
+ ///
+ /// [combineLatestAll] subscribes to the source stream and [others] and when
+ /// any one of the streams emits, the result stream will emit a [List<T>] of
+ /// the latest values emitted from all streams.
+ ///
+ /// No event will be emitted until all source streams emit at least once. If a
+ /// source stream emits multiple values before another starts emitting, all
+ /// but the last value will be discarded. Once all source streams have emitted
+ /// at least once, the result stream will emit any time any source stream
+ /// emits.
+ ///
+ /// The result stream will not close until all source streams have closed.
+ /// When a source stream closes, the result stream will continue to emit the
+ /// last value from the closed stream when the other source streams emit until
+ /// the result stream has closed. If a source stream closes without emitting
+ /// any value, the result stream will close as well.
+ ///
+ /// For example:
+ ///
+ /// final combined = first
+ /// .combineLatestAll([second, third])
+ /// .map((data) => data.join());
+ ///
+ /// first: a----b------------------c--------d---|
+ /// second: --1---------2-----------------|
+ /// third: -------&----------%---|
+ /// combined: -------b1&--b2&---b2%---c2%------d2%-|
+ ///
+ /// Errors thrown by any source stream will be forwarded to the result stream.
+ ///
+ /// If the source stream is a broadcast stream, the result stream will be as
+ /// well, regardless of the types of [others]. If a single subscription stream
+ /// is combined with a broadcast source stream, it may never be canceled.
+ Stream<List<T>> combineLatestAll(Iterable<Stream<T>> others) {
+ final controller = isBroadcast
+ ? StreamController<List<T>>.broadcast(sync: true)
+ : StreamController<List<T>>(sync: true);
+
+ final allStreams = [
+ this,
+ for (final other in others)
+ !isBroadcast || other.isBroadcast ? other : other.asBroadcastStream(),
+ ];
+
+ controller.onListen = () {
+ final subscriptions = <StreamSubscription<T>>[];
+
+ final latestData = List<T?>.filled(allStreams.length, null);
+ final hasEmitted = <int>{};
+ void handleData(int index, T data) {
+ latestData[index] = data;
+ hasEmitted.add(index);
+ if (hasEmitted.length == allStreams.length) {
+ controller.add(List.from(latestData));
+ }
+ }
+
+ var streamId = 0;
+ for (final stream in allStreams) {
+ final index = streamId;
+
+ final subscription = stream.listen((data) => handleData(index, data),
+ onError: controller.addError);
+ subscription.onDone(() {
+ assert(subscriptions.contains(subscription));
+ subscriptions.remove(subscription);
+ if (subscriptions.isEmpty || !hasEmitted.contains(index)) {
+ controller.close();
+ }
+ });
+ subscriptions.add(subscription);
+
+ streamId++;
+ }
+ if (!isBroadcast) {
+ controller
+ ..onPause = () {
+ for (final subscription in subscriptions) {
+ subscription.pause();
+ }
+ }
+ ..onResume = () {
+ for (final subscription in subscriptions) {
+ subscription.resume();
+ }
+ };
+ }
+ controller.onCancel = () {
+ if (subscriptions.isEmpty) return null;
+ return [for (var s in subscriptions) s.cancel()]
+ .wait
+ .then(ignoreArgument);
+ };
+ };
+ return controller.stream;
+ }
+}
diff --git a/pkgs/stream_transform/lib/src/common_callbacks.dart b/pkgs/stream_transform/lib/src/common_callbacks.dart
new file mode 100644
index 0000000..e211cf9
--- /dev/null
+++ b/pkgs/stream_transform/lib/src/common_callbacks.dart
@@ -0,0 +1,5 @@
+// Copyright (c) 2024, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+void ignoreArgument(Object? _) {}
diff --git a/pkgs/stream_transform/lib/src/concatenate.dart b/pkgs/stream_transform/lib/src/concatenate.dart
new file mode 100644
index 0000000..0330dd7
--- /dev/null
+++ b/pkgs/stream_transform/lib/src/concatenate.dart
@@ -0,0 +1,112 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+/// Utilities to append or prepend to a stream.
+extension Concatenate<T> on Stream<T> {
+ /// Emits all values and errors from [next] following all values and errors
+ /// from this stream.
+ ///
+ /// If this stream never finishes, the [next] stream will never get a
+ /// listener.
+ ///
+ /// If this stream is a broadcast stream, the result will be as well.
+  /// If a single-subscription stream follows a broadcast stream it may be
+  /// listened to and never canceled since there may be broadcast listeners
+  /// added later.
+ ///
+ /// If a broadcast stream follows any other stream it will miss any events or
+ /// errors which occur before this stream is done.
+ /// If a broadcast stream follows a single-subscription stream, pausing the
+ /// stream while it is listening to the second stream will cause events to be
+ /// dropped rather than buffered.
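+  ///
+  /// For example (an illustrative sketch):
+  ///
+  ///     Stream.fromIterable([1, 2])
+  ///         .followedBy(Stream.fromIterable([3, 4]))
+  ///         .listen(print); // 1, 2, 3, 4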
+ Stream<T> followedBy(Stream<T> next) {
+ var controller = isBroadcast
+ ? StreamController<T>.broadcast(sync: true)
+ : StreamController<T>(sync: true);
+
+ next = isBroadcast && !next.isBroadcast ? next.asBroadcastStream() : next;
+
+ StreamSubscription<T>? subscription;
+ var currentStream = this;
+ var thisDone = false;
+ var secondDone = false;
+
+ late void Function() currentDoneHandler;
+
+ void listen() {
+ subscription = currentStream.listen(controller.add,
+ onError: controller.addError, onDone: () => currentDoneHandler());
+ }
+
+ void onSecondDone() {
+ secondDone = true;
+ controller.close();
+ }
+
+ void onThisDone() {
+ thisDone = true;
+ currentStream = next;
+ currentDoneHandler = onSecondDone;
+ listen();
+ }
+
+ currentDoneHandler = onThisDone;
+
+ controller.onListen = () {
+ assert(subscription == null);
+ listen();
+ if (!isBroadcast) {
+ controller
+ ..onPause = () {
+ if (!thisDone || !next.isBroadcast) return subscription!.pause();
+ subscription!.cancel();
+ subscription = null;
+ }
+ ..onResume = () {
+ if (!thisDone || !next.isBroadcast) return subscription!.resume();
+ listen();
+ };
+ }
+ controller.onCancel = () {
+ if (secondDone) return null;
+ var toCancel = subscription!;
+ subscription = null;
+ return toCancel.cancel();
+ };
+ };
+ return controller.stream;
+ }
+
+  /// Emits [initial] before any values or errors from this stream.
+ ///
+ /// If this stream is a broadcast stream the result will be as well.
+ /// If this stream is a broadcast stream, the returned stream will only
+ /// contain events of this stream that are emitted after the [initial] value
+ /// has been emitted on the returned stream.
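+  ///
+  /// For example (an illustrative sketch):
+  ///
+  ///     Stream.fromIterable([2, 3]).startWith(1).listen(print); // 1, 2, 3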
+ Stream<T> startWith(T initial) =>
+ startWithStream(Future.value(initial).asStream());
+
+ /// Emits all values in [initial] before any values or errors from this
+ /// stream.
+ ///
+ /// If this stream is a broadcast stream the result will be as well.
+ /// If this stream is a broadcast stream it will miss any events which
+ /// occur before the initial values are all emitted.
+ Stream<T> startWithMany(Iterable<T> initial) =>
+ startWithStream(Stream.fromIterable(initial));
+
+ /// Emits all values and errors in [initial] before any values or errors from
+ /// this stream.
+ ///
+ /// If this stream is a broadcast stream the result will be as well.
+ /// If this stream is a broadcast stream it will miss any events which occur
+ /// before [initial] closes.
+ Stream<T> startWithStream(Stream<T> initial) {
+ if (isBroadcast && !initial.isBroadcast) {
+ initial = initial.asBroadcastStream();
+ }
+ return initial.followedBy(this);
+ }
+}
diff --git a/pkgs/stream_transform/lib/src/from_handlers.dart b/pkgs/stream_transform/lib/src/from_handlers.dart
new file mode 100644
index 0000000..1146a13
--- /dev/null
+++ b/pkgs/stream_transform/lib/src/from_handlers.dart
@@ -0,0 +1,58 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+extension TransformByHandlers<S> on Stream<S> {
+ /// Transform a stream by callbacks.
+ ///
+  /// This is similar to `transform(StreamTransformer.fromHandlers(...))` except
+ /// that the handlers are called once per event rather than called for the
+ /// same event for each listener on a broadcast stream.
+ Stream<T> transformByHandlers<T>(
+ {required void Function(S, EventSink<T>) onData,
+ void Function(Object, StackTrace, EventSink<T>)? onError,
+ void Function(EventSink<T>)? onDone}) {
+ final handleError = onError ?? _defaultHandleError;
+ final handleDone = onDone ?? _defaultHandleDone;
+
+ var controller = isBroadcast
+ ? StreamController<T>.broadcast(sync: true)
+ : StreamController<T>(sync: true);
+
+ StreamSubscription<S>? subscription;
+ controller.onListen = () {
+ assert(subscription == null);
+ var valuesDone = false;
+ subscription = listen((value) => onData(value, controller),
+ onError: (Object error, StackTrace stackTrace) {
+ handleError(error, stackTrace, controller);
+ }, onDone: () {
+ valuesDone = true;
+ handleDone(controller);
+ });
+ if (!isBroadcast) {
+ controller
+ ..onPause = subscription!.pause
+ ..onResume = subscription!.resume;
+ }
+ controller.onCancel = () {
+ var toCancel = subscription;
+ subscription = null;
+ if (!valuesDone) return toCancel!.cancel();
+ return null;
+ };
+ };
+ return controller.stream;
+ }
+
+ static void _defaultHandleError<T>(
+ Object error, StackTrace stackTrace, EventSink<T> sink) {
+ sink.addError(error, stackTrace);
+ }
+
+ static void _defaultHandleDone<T>(EventSink<T> sink) {
+ sink.close();
+ }
+}
diff --git a/pkgs/stream_transform/lib/src/merge.dart b/pkgs/stream_transform/lib/src/merge.dart
new file mode 100644
index 0000000..3bfe06c
--- /dev/null
+++ b/pkgs/stream_transform/lib/src/merge.dart
@@ -0,0 +1,102 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'common_callbacks.dart';
+
+/// Utilities to interleave events from multiple streams.
+extension Merge<T> on Stream<T> {
+ /// Merges values and errors from this stream and [other] in any order as they
+ /// arrive.
+ ///
+ /// The result stream will not close until both this stream and [other] have
+ /// closed.
+ ///
+ /// For example:
+ ///
+ /// final result = source.merge(other);
+ ///
+ /// source: 1--2-----3--|
+ /// other: ------4-------5--|
+ /// result: 1--2--4--3----5--|
+ ///
+ /// If this stream is a broadcast stream, the result stream will be as
+ /// well, regardless of [other]'s type. If a single subscription stream is
+ /// merged into a broadcast stream it may never be canceled since there may be
+ /// broadcast listeners added later.
+ ///
+ /// If a broadcast stream is merged into a single-subscription stream any
+ /// events emitted by [other] before the result stream has a subscriber will
+ /// be discarded.
+ Stream<T> merge(Stream<T> other) => mergeAll([other]);
+
+ /// Merges values and errors from this stream and any stream in [others] in
+ /// any order as they arrive.
+ ///
+ /// The result stream will not close until this stream and all streams
+ /// in [others] have closed.
+ ///
+ /// For example:
+ ///
+ /// final result = first.mergeAll([second, third]);
+ ///
+ /// first: 1--2--------3--|
+ /// second: ---------4-------5--|
+ /// third: ------6---------------7--|
+ /// result: 1--2--6--4--3----5----7--|
+ ///
+ /// If this stream is a broadcast stream, the result stream will be as
+ /// well, regardless the types of streams in [others]. If a single
+ /// subscription stream is merged into a broadcast stream it may never be
+ /// canceled since there may be broadcast listeners added later.
+ ///
+ /// If a broadcast stream is merged into a single-subscription stream any
+ /// events emitted by that stream before the result stream has a subscriber
+ /// will be discarded.
+ Stream<T> mergeAll(Iterable<Stream<T>> others) {
+ final controller = isBroadcast
+ ? StreamController<T>.broadcast(sync: true)
+ : StreamController<T>(sync: true);
+
+ final allStreams = [
+ this,
+ for (final other in others)
+ !isBroadcast || other.isBroadcast ? other : other.asBroadcastStream(),
+ ];
+
+ controller.onListen = () {
+ final subscriptions = <StreamSubscription<T>>[];
+ for (final stream in allStreams) {
+ final subscription =
+ stream.listen(controller.add, onError: controller.addError);
+ subscription.onDone(() {
+ subscriptions.remove(subscription);
+ if (subscriptions.isEmpty) controller.close();
+ });
+ subscriptions.add(subscription);
+ }
+ if (!isBroadcast) {
+ controller
+ ..onPause = () {
+ for (final subscription in subscriptions) {
+ subscription.pause();
+ }
+ }
+ ..onResume = () {
+ for (final subscription in subscriptions) {
+ subscription.resume();
+ }
+ };
+ }
+ controller.onCancel = () {
+ if (subscriptions.isEmpty) return null;
+ return [for (var s in subscriptions) s.cancel()]
+ .wait
+ .then(ignoreArgument);
+ };
+ };
+ return controller.stream;
+ }
+}
diff --git a/pkgs/stream_transform/lib/src/rate_limit.dart b/pkgs/stream_transform/lib/src/rate_limit.dart
new file mode 100644
index 0000000..299c230
--- /dev/null
+++ b/pkgs/stream_transform/lib/src/rate_limit.dart
@@ -0,0 +1,356 @@
+// Copyright (c) 2019, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'aggregate_sample.dart';
+import 'common_callbacks.dart';
+import 'from_handlers.dart';
+
+/// Utilities to rate limit events.
+///
+/// - [debounce] - emit the _first_ or _last_ event of a series of closely
+/// spaced events.
+/// - [debounceBuffer] - emit _all_ events at the _end_ of a series of closely
+/// spaced events.
+/// - [throttle] - emit the _first_ event at the _beginning_ of the period.
+/// - [audit] - emit the _last_ event at the _end_ of the period.
+/// - [buffer] - emit _all_ events on a _trigger_.
+extension RateLimit<T> on Stream<T> {
+ /// Suppresses events with less inter-event spacing than [duration].
+ ///
+ /// Events which are emitted with less than [duration] elapsed between them
+ /// are considered to be part of the same "series". If [leading] is `true`,
+ /// the first event of this series is emitted immediately. If [trailing] is
+  /// `true`, the last event of this series is emitted with a delay of at least
+  /// [duration]. By default only trailing events are emitted; to emit only
+  /// leading events, both arguments must be specified as
+  /// `leading: true, trailing: false`.
+ ///
+ /// If this stream is a broadcast stream, the result will be as well.
+ /// Errors are forwarded immediately.
+ ///
+  /// If there is a trailing event waiting during the debounce period when the
+  /// source stream closes, the returned stream will wait to emit it following
+  /// the debounce period before closing. If there is no pending debounced
+  /// event when this stream closes, the returned stream will close
+  /// immediately.
+ ///
+ /// For example:
+ ///
+ /// source.debounce(Duration(seconds: 1));
+ ///
+ /// source: 1-2-3---4---5-6-|
+ /// result: ------3---4-----6|
+ ///
+ /// source.debounce(Duration(seconds: 1), leading: true, trailing: false);
+ ///
+ /// source: 1-2-3---4---5-6-|
+ /// result: 1-------4---5---|
+ ///
+ /// source.debounce(Duration(seconds: 1), leading: true);
+ ///
+ /// source: 1-2-3---4---5-6-|
+ /// result: 1-----3-4---5---6|
+ ///
+ /// To collect values emitted during the debounce period see [debounceBuffer].
+ Stream<T> debounce(Duration duration,
+ {bool leading = false, bool trailing = true}) =>
+ _debounceAggregate(duration, _dropPrevious,
+ leading: leading, trailing: trailing);
+
+ /// Buffers values until this stream does not emit for [duration] then emits
+ /// the collected values.
+ ///
+ /// Values will always be delayed by at least [duration], and values which
+ /// come within this time will be aggregated into the same list.
+ ///
+ /// If this stream is a broadcast stream, the result will be as well.
+ /// Errors are forwarded immediately.
+ ///
+  /// If there are events waiting during the debounce period when this stream
+  /// closes, the returned stream will wait to emit them following the debounce
+  /// period before closing. If there are no pending debounced events when this
+  /// stream closes, the returned stream will close immediately.
+ ///
+ /// To keep only the most recent event during the debounce period see
+ /// [debounce].
+ Stream<List<T>> debounceBuffer(Duration duration) =>
+ _debounceAggregate(duration, _collect, leading: false, trailing: true);
+
+ /// Reduces the rate that events are emitted to at most once per [duration].
+ ///
+ /// No events will ever be emitted within [duration] of another event on the
+ /// result stream.
+ /// If this stream is a broadcast stream, the result will be as well.
+ /// Errors are forwarded immediately.
+ ///
+ /// If [trailing] is `false`, source events emitted during the [duration]
+ /// period following a result event are discarded.
+ /// The result stream will not emit an event until this stream emits an event
+ /// following the throttled period.
+ /// If this stream is consistently emitting events with less than
+ /// [duration] between events, the time between events on the result stream
+ /// may still be more than [duration].
+ /// The result stream will close immediately when this stream closes.
+ ///
+ /// If [trailing] is `true`, the latest source event emitted during the
+  /// [duration] period following a result event is held and emitted following
+ /// the period.
+ /// If this stream is consistently emitting events with less than [duration]
+ /// between events, the time between events on the result stream will be
+ /// [duration].
+ /// If this stream closes the result stream will wait to emit a pending event
+ /// before closing.
+ ///
+ /// For example:
+ ///
+ /// source.throttle(Duration(seconds: 6));
+ ///
+ /// source: 1-2-3---4-5-6---7-8-|
+ /// result: 1-------4-------7---|
+ ///
+ /// source.throttle(Duration(seconds: 6), trailing: true);
+ ///
+ /// source: 1-2-3---4-5----6--|
+ /// result: 1-----3-----5-----6|
+ ///
+ /// source.throttle(Duration(seconds: 6), trailing: true);
+ ///
+ /// source: 1-2-----------3|
+ /// result: 1-----2-------3|
+ ///
+ /// See also:
+ /// - [audit], which emits the most recent event at the end of the period.
+ /// Compared to `audit`, `throttle` will not introduce delay to forwarded
+ /// elements, except for the [trailing] events.
+ /// - [debounce], which uses inter-event spacing instead of a fixed period
+  /// from the first event in a window. Compared to `debounce`, `throttle`
+  /// cannot be starved by having events emitted continuously within [duration].
+ Stream<T> throttle(Duration duration, {bool trailing = false}) =>
+ trailing ? _throttleTrailing(duration) : _throttle(duration);
+
+ Stream<T> _throttle(Duration duration) {
+ Timer? timer;
+
+ return transformByHandlers(onData: (data, sink) {
+ if (timer == null) {
+ sink.add(data);
+ timer = Timer(duration, () {
+ timer = null;
+ });
+ }
+ });
+ }
+
+ Stream<T> _throttleTrailing(Duration duration) {
+ Timer? timer;
+ T? pending;
+ var hasPending = false;
+ var isDone = false;
+
+ return transformByHandlers(onData: (data, sink) {
+ void onTimer() {
+ if (hasPending) {
+ sink.add(pending as T);
+ if (isDone) {
+ sink.close();
+ } else {
+ timer = Timer(duration, onTimer);
+ hasPending = false;
+ pending = null;
+ }
+ } else {
+ timer = null;
+ }
+ }
+
+ if (timer == null) {
+ sink.add(data);
+ timer = Timer(duration, onTimer);
+ } else {
+ hasPending = true;
+ pending = data;
+ }
+ }, onDone: (sink) {
+ isDone = true;
+ if (hasPending) return; // Will be closed by timer.
+ sink.close();
+ timer?.cancel();
+ timer = null;
+ });
+ }
+
+ /// Audit a single event from each [duration] length period where there are
+ /// events on this stream.
+ ///
+ /// No events will ever be emitted within [duration] of another event on the
+ /// result stream.
+ /// If this stream is a broadcast stream, the result will be as well.
+ /// Errors are forwarded immediately.
+ ///
+ /// The first event will begin the audit period. At the end of the audit
+ /// period the most recent event is emitted, and the next event restarts the
+ /// audit period.
+ ///
+  /// If the event that started the period is the one that is emitted, it will
+  /// be delayed by [duration]. If a later event comes in within the period,
+  /// its delay will be shorter by the difference in arrival times.
+ ///
+  /// If there is no pending event when this stream closes, the output
+  /// stream will close immediately. If there is a pending event, the output
+  /// stream will wait to emit it before closing.
+ ///
+ /// For example:
+ ///
+ /// source.audit(Duration(seconds: 5));
+ ///
+ /// source: a------b--c----d--|
+ /// output: -----a------c--------d|
+ ///
+ /// See also:
+ /// - [throttle], which emits the _first_ event during the window, instead of
+ /// the last event in the window. Compared to `throttle`, `audit` will
+ /// introduce delay to forwarded events.
+ /// - [debounce], which only emits after the stream has not emitted for some
+  /// period. Compared to `debounce`, `audit` cannot be starved by having events
+ /// emitted continuously within [duration].
+ Stream<T> audit(Duration duration) {
+ Timer? timer;
+ var shouldClose = false;
+ T recentData;
+
+ return transformByHandlers(onData: (data, sink) {
+ recentData = data;
+ timer ??= Timer(duration, () {
+ sink.add(recentData);
+ timer = null;
+ if (shouldClose) {
+ sink.close();
+ }
+ });
+ }, onDone: (sink) {
+ if (timer != null) {
+ shouldClose = true;
+ } else {
+ sink.close();
+ }
+ });
+ }
+
+ /// Buffers the values emitted on this stream and emits them when [trigger]
+ /// emits an event.
+ ///
+  /// If [longPoll] is `false` and there are no buffered values when [trigger]
+  /// emits, an empty list is immediately emitted.
+ ///
+ /// If [longPoll] is `true`, and there are no buffered values when [trigger]
+ /// emits one or more events, then the *next* value from this stream is
+ /// immediately emitted on the returned stream as a single element list.
+ /// Subsequent events on [trigger] while there have been no events on this
+ /// stream are ignored.
+ ///
+ /// The result stream will close as soon as there is a guarantee it will not
+ /// emit any more events. There will not be any more events emitted if:
+ /// - [trigger] is closed and there is no waiting long poll.
+ /// - Or, this stream is closed and previously buffered events have been
+ /// delivered.
+ ///
+ /// If this stream is a broadcast stream, the result will be as well.
+ /// Errors from this stream or the trigger are immediately forwarded to the
+ /// output.
+ ///
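+  /// For example (an illustrative sketch; `tick` is a hypothetical periodic
+  /// trigger stream):
+  ///
+  ///     values.buffer(tick).listen(print); // Emits lists of buffered values.
+  ///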
+ /// See also:
+  /// - [sample], which uses a [trigger] stream in the same way, but keeps only
+ /// the most recent source event.
+ Stream<List<T>> buffer(Stream<void> trigger, {bool longPoll = true}) =>
+ aggregateSample(
+ trigger: trigger,
+ aggregate: _collect,
+ longPoll: longPoll,
+ onEmpty: _empty);
+
+ /// Emits the most recent new value from this stream when [trigger] emits an
+ /// event.
+ ///
+  /// If [longPoll] is `false`, an event on [trigger] when there is no pending
+  /// source event is ignored.
+  ///
+  /// If [longPoll] is `true` (the default), and there is no pending source
+  /// event when [trigger] emits one or more events, then the *next* value from
+  /// this stream immediately flows to the result stream.
+  /// Subsequent events on [trigger] while there have been no events on this
+  /// stream are ignored.
+ ///
+ /// The result stream will close as soon as there is a guarantee it will not
+ /// emit any more events. There will not be any more events emitted if:
+ /// - [trigger] is closed and there is no waiting long poll.
+ /// - Or, this source stream is closed and any pending source event has been
+ /// delivered.
+ ///
+ /// If this source stream is a broadcast stream, the result will be as well.
+ /// Errors from this source stream or the trigger are immediately forwarded to
+ /// the output.
+ ///
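+  /// For example (an illustrative sketch; `tick` is a hypothetical periodic
+  /// trigger stream):
+  ///
+  ///     readings.sample(tick).listen(print);
+  ///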
+ /// See also:
+  /// - [buffer], which uses a [trigger] stream in the same way, but keeps a
+  /// list of pending source events.
+ Stream<T> sample(Stream<void> trigger, {bool longPoll = true}) =>
+ aggregateSample(
+ trigger: trigger,
+ aggregate: _dropPrevious,
+ longPoll: longPoll,
+ onEmpty: ignoreArgument);
+
+ /// Aggregates values until this source stream does not emit for [duration],
+ /// then emits the aggregated values.
+ Stream<S> _debounceAggregate<S>(
+ Duration duration, S Function(T element, S? soFar) collect,
+ {required bool leading, required bool trailing}) {
+ Timer? timer;
+ S? soFar;
+ var hasPending = false;
+ var shouldClose = false;
+ var emittedLatestAsLeading = false;
+
+ return transformByHandlers(onData: (value, sink) {
+ void emit() {
+ sink.add(soFar as S);
+ soFar = null;
+ hasPending = false;
+ }
+
+ timer?.cancel();
+ soFar = collect(value, soFar);
+ hasPending = true;
+ if (timer == null && leading) {
+ emittedLatestAsLeading = true;
+ emit();
+ } else {
+ emittedLatestAsLeading = false;
+ }
+ timer = Timer(duration, () {
+ if (trailing && !emittedLatestAsLeading) emit();
+ if (shouldClose) sink.close();
+ timer = null;
+ });
+ }, onDone: (EventSink<S> sink) {
+ if (hasPending && trailing) {
+ shouldClose = true;
+ } else {
+ timer?.cancel();
+ sink.close();
+ }
+ });
+ }
+}
+
+T _dropPrevious<T>(T element, _) => element;
+List<T> _collect<T>(T event, List<T>? soFar) => (soFar ?? <T>[])..add(event);
+void _empty<T>(Sink<List<T>> sink) => sink.add([]);
diff --git a/pkgs/stream_transform/lib/src/scan.dart b/pkgs/stream_transform/lib/src/scan.dart
new file mode 100644
index 0000000..acd3c76
--- /dev/null
+++ b/pkgs/stream_transform/lib/src/scan.dart
@@ -0,0 +1,31 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+/// A utility similar to [fold] which emits intermediate accumulations.
+extension Scan<T> on Stream<T> {
+ /// Emits a sequence of the accumulated values from repeatedly applying
+ /// [combine].
+ ///
+ /// Like [fold], but instead of producing a single value it yields each
+ /// intermediate result.
+ ///
+ /// If [combine] returns a future it will not be called again for subsequent
+ /// events from the source until it completes, therefore [combine] is always
+ /// called for elements in order, and the result stream always maintains the
+ /// same order as this stream.
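+  ///
+  /// For example (an illustrative sketch):
+  ///
+  ///     Stream.fromIterable([1, 2, 3])
+  ///         .scan<int>(0, (sum, value) => sum + value)
+  ///         .listen(print); // 1, 3, 6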
+ Stream<S> scan<S>(
+ S initialValue, FutureOr<S> Function(S soFar, T element) combine) {
+ var accumulated = initialValue;
+ return asyncMap((value) {
+ var result = combine(accumulated, value);
+ if (result is Future<S>) {
+ return result.then((r) => accumulated = r);
+ } else {
+ return accumulated = result;
+ }
+ });
+ }
+}
diff --git a/pkgs/stream_transform/lib/src/switch.dart b/pkgs/stream_transform/lib/src/switch.dart
new file mode 100644
index 0000000..546036e
--- /dev/null
+++ b/pkgs/stream_transform/lib/src/switch.dart
@@ -0,0 +1,135 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'async_expand.dart';
+import 'common_callbacks.dart';
+
+/// A utility to take events from the most recent sub stream returned by a
+/// callback.
+extension Switch<T> on Stream<T> {
+ /// Maps events to a Stream and emits values from the most recently created
+ /// Stream.
+ ///
+ /// When the source emits a value it will be converted to a [Stream] using
+ /// [convert] and the output will switch to emitting events from that result.
+ /// Like [asyncExpand] but the [Stream] emitted by a previous element
+ /// will be ignored as soon as the source stream emits a new event.
+ ///
+ /// This means that the source stream is not paused until a sub stream
+ /// returned from the [convert] callback is done. Instead, the subscription
+ /// to the sub stream is canceled as soon as the source stream emits a new
+ /// event.
+ ///
+ /// Errors from [convert], the source stream, or any of the sub streams are
+ /// forwarded to the result stream.
+ ///
+ /// The result stream will not close until the source stream closes and
+  /// the current sub stream has closed.
+ ///
+ /// If the source stream is a broadcast stream, the result will be as well,
+ /// regardless of the types of streams created by [convert]. In this case,
+ /// some care should be taken:
+ ///
+ /// * If [convert] returns a single subscription stream it may be listened to
+ /// and never canceled.
+ ///
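+  /// For example (an illustrative sketch; `search` is a hypothetical function
+  /// returning a `Stream` of results for a query):
+  ///
+  ///     queries.switchMap(search).listen(print);
+  ///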
+ /// See also:
+ /// - [concurrentAsyncExpand], which emits events from all sub streams
+  ///   concurrently instead of canceling subscriptions to previous sub
+  ///   streams.
+ Stream<S> switchMap<S>(Stream<S> Function(T) convert) {
+ return map(convert).switchLatest();
+ }
+}
+
+/// A utility to take events from the most recent sub stream.
+extension SwitchLatest<T> on Stream<Stream<T>> {
+ /// Emits values from the most recently emitted Stream.
+ ///
+ /// When the source emits a stream, the output will switch to emitting events
+ /// from that stream.
+ ///
+ /// Whether the source stream is a single-subscription stream or a
+ /// broadcast stream, the result stream will be the same kind of stream,
+ /// regardless of the types of streams emitted.
+ Stream<T> switchLatest() {
+ var controller = isBroadcast
+ ? StreamController<T>.broadcast(sync: true)
+ : StreamController<T>(sync: true);
+
+ controller.onListen = () {
+ StreamSubscription<T>? innerSubscription;
+ var outerStreamDone = false;
+
+ void listenToInnerStream(Stream<T> innerStream) {
+ assert(innerSubscription == null);
+ var subscription = innerStream
+ .listen(controller.add, onError: controller.addError, onDone: () {
+ innerSubscription = null;
+ if (outerStreamDone) controller.close();
+ });
+ // If a pause happens during an innerSubscription.cancel,
+ // we still listen to the next stream when the cancel is done.
+ // Then we immediately pause it again here.
+ if (controller.isPaused) subscription.pause();
+ innerSubscription = subscription;
+ }
+
+ var addError = controller.addError;
+ final outerSubscription = listen(null, onError: addError, onDone: () {
+ outerStreamDone = true;
+ if (innerSubscription == null) controller.close();
+ });
+ outerSubscription.onData((innerStream) async {
+ var currentSubscription = innerSubscription;
+ if (currentSubscription == null) {
+ listenToInnerStream(innerStream);
+ return;
+ }
+ innerSubscription = null;
+ outerSubscription.pause();
+ try {
+ await currentSubscription.cancel();
+ } catch (error, stack) {
+ controller.addError(error, stack);
+ } finally {
+ if (!isBroadcast && !controller.hasListener) {
+ // Result single-subscription stream subscription was cancelled
+ // while waiting for previous innerStream cancel.
+ //
+ // Ensure that the last received stream is also listened to and
+ // cancelled, then do nothing further.
+ innerStream.listen(null).cancel().ignore();
+ } else {
+ outerSubscription.resume();
+ listenToInnerStream(innerStream);
+ }
+ }
+ });
+ if (!isBroadcast) {
+ controller
+ ..onPause = () {
+ innerSubscription?.pause();
+ outerSubscription.pause();
+ }
+ ..onResume = () {
+ innerSubscription?.resume();
+ outerSubscription.resume();
+ };
+ }
+ controller.onCancel = () {
+ var sub = innerSubscription;
+ var cancels = [
+ if (!outerStreamDone) outerSubscription.cancel(),
+ if (sub != null) sub.cancel(),
+ ];
+ if (cancels.isEmpty) return null;
+ return cancels.wait.then(ignoreArgument);
+ };
+ };
+ return controller.stream;
+ }
+}
diff --git a/pkgs/stream_transform/lib/src/take_until.dart b/pkgs/stream_transform/lib/src/take_until.dart
new file mode 100644
index 0000000..e6deaa1
--- /dev/null
+++ b/pkgs/stream_transform/lib/src/take_until.dart
@@ -0,0 +1,64 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+/// A utility to end a stream based on an external trigger.
+extension TakeUntil<T> on Stream<T> {
+ /// Takes values from this stream which are emitted before [trigger]
+ /// completes.
+ ///
+ /// Completing [trigger] differs from canceling a subscription in that values
+ /// which are emitted before the trigger, but have further asynchronous delays
+  /// in transformations following the takeUntil, will still go through.
+  /// Canceling a subscription immediately stops values.
+ ///
+ /// If [trigger] completes as an error, the error will be forwarded through
+ /// the result stream before the result stream closes.
+ ///
+ /// If [trigger] completes as a value or as an error after this stream has
+ /// already ended, the completion will be ignored.
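+  ///
+  /// For example (an illustrative sketch; `stopPressed` is a hypothetical
+  /// `Future` that completes when the user presses stop):
+  ///
+  ///     values.takeUntil(stopPressed).listen(print);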
+ Stream<T> takeUntil(Future<void> trigger) {
+ var controller = isBroadcast
+ ? StreamController<T>.broadcast(sync: true)
+ : StreamController<T>(sync: true);
+
+ StreamSubscription<T>? subscription;
+ var isDone = false;
+ trigger.then((_) {
+ if (isDone) return;
+ isDone = true;
+ subscription?.cancel();
+ controller.close();
+ }, onError: (Object error, StackTrace stackTrace) {
+ if (isDone) return;
+ isDone = true;
+ controller
+ ..addError(error, stackTrace)
+ ..close();
+ });
+
+ controller.onListen = () {
+ if (isDone) return;
+ subscription =
+ listen(controller.add, onError: controller.addError, onDone: () {
+ if (isDone) return;
+ isDone = true;
+ controller.close();
+ });
+ if (!isBroadcast) {
+ controller
+ ..onPause = subscription!.pause
+ ..onResume = subscription!.resume;
+ }
+ controller.onCancel = () {
+ if (isDone) return null;
+ var toCancel = subscription!;
+ subscription = null;
+ return toCancel.cancel();
+ };
+ };
+ return controller.stream;
+ }
+}
diff --git a/pkgs/stream_transform/lib/src/tap.dart b/pkgs/stream_transform/lib/src/tap.dart
new file mode 100644
index 0000000..4b16ab5
--- /dev/null
+++ b/pkgs/stream_transform/lib/src/tap.dart
@@ -0,0 +1,44 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+import 'from_handlers.dart';
+
+/// A utility to chain extra behavior on a stream.
+extension Tap<T> on Stream<T> {
+ /// Taps into this stream to allow additional handling on a single-subscriber
+ /// stream without first wrapping as a broadcast stream.
+ ///
+ /// The [onValue] callback will be called with every value from this stream
+ /// before it is forwarded to listeners on the resulting stream.
+ /// May be null if only [onError] or [onDone] callbacks are needed.
+ ///
+ /// The [onError] callback will be called with every error from this stream
+ /// before it is forwarded to listeners on the resulting stream.
+ ///
+ /// The [onDone] callback will be called after this stream closes and before
+ /// the resulting stream is closed.
+ ///
+ /// Errors from any of the callbacks are caught and ignored.
+ ///
+ /// The callbacks may not be called until the tapped stream has a listener,
+ /// and may not be called after the listener has canceled the subscription.
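+  ///
+  /// For example (an illustrative sketch; `handleValue` is a hypothetical
+  /// listener):
+  ///
+  ///     values.tap(print).listen(handleValue);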
+ Stream<T> tap(void Function(T)? onValue,
+ {void Function(Object, StackTrace)? onError,
+ void Function()? onDone}) =>
+ transformByHandlers(onData: (value, sink) {
+ try {
+ onValue?.call(value);
+ } catch (_) {/*Ignore*/}
+ sink.add(value);
+ }, onError: (error, stackTrace, sink) {
+ try {
+ onError?.call(error, stackTrace);
+ } catch (_) {/*Ignore*/}
+ sink.addError(error, stackTrace);
+ }, onDone: (sink) {
+ try {
+ onDone?.call();
+ } catch (_) {/*Ignore*/}
+ sink.close();
+ });
+}
diff --git a/pkgs/stream_transform/lib/src/where.dart b/pkgs/stream_transform/lib/src/where.dart
new file mode 100644
index 0000000..76aa28a
--- /dev/null
+++ b/pkgs/stream_transform/lib/src/where.dart
@@ -0,0 +1,71 @@
+// Copyright (c) 2019, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'from_handlers.dart';
+
+/// Utilities to filter events.
+extension Where<T> on Stream<T> {
+ /// Discards events from this stream that are not of type [S].
+ ///
+ /// If the source stream is a broadcast stream the result will be as well.
+ ///
+ /// Errors from the source stream are forwarded directly to the result stream.
+ ///
+  /// [S] should be a subtype of the stream's generic type; otherwise nothing
+  /// of type [S] could possibly be emitted. There is, however, no static or
+  /// runtime check that this is the case.
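+  ///
+  /// For example (an illustrative sketch):
+  ///
+  ///     Stream<Object>.fromIterable([1, 'two', 3])
+  ///         .whereType<int>()
+  ///         .listen(print); // 1, 3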
+ Stream<S> whereType<S>() => transformByHandlers(onData: (event, sink) {
+ if (event is S) sink.add(event);
+ });
+
+ /// Discards events from this stream based on an asynchronous [test] callback.
+ ///
+ /// Like [where] but allows the [test] to return a [Future].
+ ///
+ /// Events on the result stream will be emitted in the order that [test]
+ /// completes which may not match the order of this stream.
+ ///
+ /// If the source stream is a broadcast stream the result will be as well.
+ /// When used with a broadcast stream behavior also differs from [where] in
+ /// that the [test] function is only called once per event, rather than once
+ /// per listener per event.
+ ///
+ /// Errors from the source stream are forwarded directly to the result stream.
+ /// Errors from [test] are also forwarded to the result stream.
+ ///
+ /// The result stream will not close until the source stream closes and all
+ /// pending [test] calls have finished.
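+  ///
+  /// For example (an illustrative sketch; `isValid` is a hypothetical
+  /// asynchronous predicate):
+  ///
+  ///     submissions.asyncWhere(isValid).listen(print);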
+ Stream<T> asyncWhere(FutureOr<bool> Function(T) test) {
+ var valuesWaiting = 0;
+ var sourceDone = false;
+ return transformByHandlers(onData: (element, sink) {
+ valuesWaiting++;
+ () async {
+ try {
+ if (await test(element)) sink.add(element);
+ } catch (e, st) {
+ sink.addError(e, st);
+ }
+ valuesWaiting--;
+ if (valuesWaiting <= 0 && sourceDone) sink.close();
+ }();
+ }, onDone: (sink) {
+ sourceDone = true;
+ if (valuesWaiting <= 0) sink.close();
+ });
+ }
+}
+
+extension WhereNotNull<T extends Object> on Stream<T?> {
+ /// Discards `null` events from this stream.
+ ///
+ /// If the source stream is a broadcast stream the result will be as well.
+ ///
+ /// Errors from the source stream are forwarded directly to the result stream.
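+  ///
+  /// For example (an illustrative sketch):
+  ///
+  ///     Stream<int?>.fromIterable([1, null, 3])
+  ///         .whereNotNull()
+  ///         .listen(print); // 1, 3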
+ Stream<T> whereNotNull() => transformByHandlers(onData: (event, sink) {
+ if (event != null) sink.add(event);
+ });
+}
diff --git a/pkgs/stream_transform/lib/stream_transform.dart b/pkgs/stream_transform/lib/stream_transform.dart
new file mode 100644
index 0000000..edf4df9
--- /dev/null
+++ b/pkgs/stream_transform/lib/stream_transform.dart
@@ -0,0 +1,15 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+export 'src/async_expand.dart';
+export 'src/async_map.dart';
+export 'src/combine_latest.dart';
+export 'src/concatenate.dart';
+export 'src/merge.dart';
+export 'src/rate_limit.dart';
+export 'src/scan.dart';
+export 'src/switch.dart';
+export 'src/take_until.dart';
+export 'src/tap.dart';
+export 'src/where.dart';
diff --git a/pkgs/stream_transform/pubspec.yaml b/pkgs/stream_transform/pubspec.yaml
new file mode 100644
index 0000000..91840b7
--- /dev/null
+++ b/pkgs/stream_transform/pubspec.yaml
@@ -0,0 +1,14 @@
+name: stream_transform
+version: 2.1.2-wip
+description: A collection of utilities to transform and manipulate streams.
+repository: https://github.com/dart-lang/tools/tree/main/pkgs/stream_transform
+
+environment:
+ sdk: ^3.4.0
+
+dev_dependencies:
+ async: ^2.5.0
+ dart_flutter_team_lints: ^3.0.0
+ fake_async: ^1.3.0
+ test: ^1.16.0
+ web: ^1.1.0
diff --git a/pkgs/stream_transform/test/async_expand_test.dart b/pkgs/stream_transform/test/async_expand_test.dart
new file mode 100644
index 0000000..8d84300
--- /dev/null
+++ b/pkgs/stream_transform/test/async_expand_test.dart
@@ -0,0 +1,195 @@
+// Copyright (c) 2019, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'package:stream_transform/stream_transform.dart';
+import 'package:test/test.dart';
+
+import 'utils.dart';
+
+void main() {
+ test('forwards errors from the convert callback', () async {
+ var errors = <String>[];
+ var source = Stream.fromIterable([1, 2, 3]);
+ source.concurrentAsyncExpand<void>((i) {
+ // ignore: only_throw_errors
+ throw 'Error: $i';
+ }).listen((_) {}, onError: errors.add);
+ await Future<void>(() {});
+ expect(errors, ['Error: 1', 'Error: 2', 'Error: 3']);
+ });
+
+ for (var outerType in streamTypes) {
+ for (var innerType in streamTypes) {
+ group('concurrentAsyncExpand $outerType to $innerType', () {
+ late StreamController<int> outerController;
+ late bool outerCanceled;
+ late List<StreamController<String>> innerControllers;
+ late List<bool> innerCanceled;
+ late List<String> emittedValues;
+ late bool isDone;
+ late List<String> errors;
+ late Stream<String> transformed;
+ late StreamSubscription<String> subscription;
+
+ setUp(() {
+ outerController = createController(outerType)
+ ..onCancel = () {
+ outerCanceled = true;
+ };
+ outerCanceled = false;
+ innerControllers = [];
+ innerCanceled = [];
+ emittedValues = [];
+ errors = [];
+ isDone = false;
+ transformed = outerController.stream.concurrentAsyncExpand((i) {
+ var index = innerControllers.length;
+ innerCanceled.add(false);
+ innerControllers.add(createController<String>(innerType)
+ ..onCancel = () {
+ innerCanceled[index] = true;
+ });
+ return innerControllers.last.stream;
+ });
+ subscription = transformed
+ .listen(emittedValues.add, onError: errors.add, onDone: () {
+ isDone = true;
+ });
+ });
+
+ test('interleaves events from sub streams', () async {
+ outerController
+ ..add(1)
+ ..add(2);
+ await Future<void>(() {});
+ expect(emittedValues, isEmpty);
+ expect(innerControllers, hasLength(2));
+ innerControllers[0].add('First');
+ innerControllers[1].add('Second');
+ innerControllers[0].add('First again');
+ await Future<void>(() {});
+ expect(emittedValues, ['First', 'Second', 'First again']);
+ });
+
+ test('forwards errors from outer stream', () async {
+ outerController.addError('Error');
+ await Future<void>(() {});
+ expect(errors, ['Error']);
+ });
+
+ test('forwards errors from inner streams', () async {
+ outerController
+ ..add(1)
+ ..add(2);
+ await Future<void>(() {});
+ innerControllers[0].addError('Error 1');
+ innerControllers[1].addError('Error 2');
+ await Future<void>(() {});
+ expect(errors, ['Error 1', 'Error 2']);
+ });
+
+ test('can continue handling events after an error in outer stream',
+ () async {
+ outerController
+ ..addError('Error')
+ ..add(1);
+ await Future<void>(() {});
+ innerControllers[0].add('First');
+ await Future<void>(() {});
+ expect(emittedValues, ['First']);
+ expect(errors, ['Error']);
+ });
+
+ test('cancels outer subscription if output canceled', () async {
+ await subscription.cancel();
+ expect(outerCanceled, true);
+ });
+
+ if (outerType != 'broadcast' || innerType != 'single subscription') {
+ // A single subscription inner stream in a broadcast outer stream is
+ // not canceled.
+ test('cancels inner subscriptions if output canceled', () async {
+ outerController
+ ..add(1)
+ ..add(2);
+ await Future<void>(() {});
+ await subscription.cancel();
+ expect(innerCanceled, [true, true]);
+ });
+ }
+
+ test('stays open if any inner stream is still open', () async {
+ outerController.add(1);
+ await outerController.close();
+ await Future<void>(() {});
+ expect(isDone, false);
+ });
+
+ test('stays open if outer stream is still open', () async {
+ outerController.add(1);
+ await Future<void>(() {});
+ await innerControllers[0].close();
+ await Future<void>(() {});
+ expect(isDone, false);
+ });
+
+ test('closes after all inner streams and outer stream close', () async {
+ outerController.add(1);
+ await Future<void>(() {});
+ await innerControllers[0].close();
+ await outerController.close();
+ await Future<void>(() {});
+ expect(isDone, true);
+ });
+
+ if (outerType == 'broadcast') {
+ test('multiple listeners all get values', () async {
+ var otherValues = <String>[];
+ transformed.listen(otherValues.add);
+ outerController.add(1);
+ await Future<void>(() {});
+ innerControllers[0].add('First');
+ await Future<void>(() {});
+ expect(emittedValues, ['First']);
+ expect(otherValues, ['First']);
+ });
+
+ test('multiple listeners get closed', () async {
+ var otherDone = false;
+ transformed.listen(null, onDone: () => otherDone = true);
+ outerController.add(1);
+ await Future<void>(() {});
+ await innerControllers[0].close();
+ await outerController.close();
+ await Future<void>(() {});
+ expect(isDone, true);
+ expect(otherDone, true);
+ });
+
+ test('can cancel and relisten', () async {
+ outerController
+ ..add(1)
+ ..add(2);
+ await Future(() {});
+ innerControllers[0].add('First');
+ innerControllers[1].add('Second');
+ await Future(() {});
+ await subscription.cancel();
+ innerControllers[0].add('Ignored');
+ await Future(() {});
+ subscription = transformed.listen(emittedValues.add);
+ innerControllers[0].add('Also ignored');
+ outerController.add(3);
+ await Future(() {});
+ innerControllers[2].add('More');
+ await Future(() {});
+ expect(emittedValues, ['First', 'Second', 'More']);
+ });
+ }
+ });
+ }
+ }
+}
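A minimal sketch of the `concurrentAsyncExpand` behavior these tests exercise. Illustrative only, not part of the diff; it assumes the `package:stream_transform` code added in this PR.
```dart
import 'package:stream_transform/stream_transform.dart';

Future<void> main() async {
  final ids = Stream.fromIterable([1, 2, 3]);

  // Unlike asyncExpand, every inner stream is listened to as soon as its
  // source event arrives, so events from different inner streams interleave.
  final interleaved = ids.concurrentAsyncExpand(
      (id) => Stream.fromIterable(['start $id', 'end $id']));

  await for (final event in interleaved) {
    print(event); // e.g. start 1, start 2, end 1, ...
  }
}
```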
diff --git a/pkgs/stream_transform/test/async_map_buffer_test.dart b/pkgs/stream_transform/test/async_map_buffer_test.dart
new file mode 100644
index 0000000..2386217
--- /dev/null
+++ b/pkgs/stream_transform/test/async_map_buffer_test.dart
@@ -0,0 +1,204 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'package:stream_transform/stream_transform.dart';
+import 'package:test/test.dart';
+
+import 'utils.dart';
+
+void main() {
+ late StreamController<int> values;
+ late List<String> emittedValues;
+ late bool valuesCanceled;
+ late bool isDone;
+ late List<String> errors;
+ late Stream<String> transformed;
+ late StreamSubscription<String> subscription;
+
+ Completer<String>? finishWork;
+ List<int>? workArgument;
+
+ /// Represents the async `convert` function and asserts that it is only called
+ /// after the previous iteration has completed.
+ Future<String> work(List<int> values) {
+ expect(finishWork, isNull,
+ reason: 'See $values before previous work is complete');
+ workArgument = values;
+ finishWork = Completer()
+ ..future.then((_) {
+ workArgument = null;
+ finishWork = null;
+ }).catchError((_) {
+ workArgument = null;
+ finishWork = null;
+ });
+ return finishWork!.future;
+ }
+
+ for (var streamType in streamTypes) {
+ group('asyncMapBuffer for stream type: [$streamType]', () {
+ setUp(() {
+ valuesCanceled = false;
+ values = createController(streamType)
+ ..onCancel = () {
+ valuesCanceled = true;
+ };
+ emittedValues = [];
+ errors = [];
+ isDone = false;
+ finishWork = null;
+ workArgument = null;
+ transformed = values.stream.asyncMapBuffer(work);
+ subscription = transformed
+ .listen(emittedValues.add, onError: errors.add, onDone: () {
+ isDone = true;
+ });
+ });
+
+ test('does not emit before work finishes', () async {
+ values.add(1);
+ await Future(() {});
+ expect(emittedValues, isEmpty);
+ expect(workArgument, [1]);
+ finishWork!.complete('result');
+ await Future(() {});
+ expect(emittedValues, ['result']);
+ });
+
+ test('buffers values while work is ongoing', () async {
+ values.add(1);
+ await Future(() {});
+ values
+ ..add(2)
+ ..add(3);
+ await Future(() {});
+ finishWork!.complete('');
+ await Future(() {});
+ expect(workArgument, [2, 3]);
+ });
+
+ test('forwards errors without waiting for work', () async {
+ values.add(1);
+ await Future(() {});
+ values.addError('error');
+ await Future(() {});
+ expect(errors, ['error']);
+ });
+
+ test('forwards errors which occur during the work', () async {
+ values.add(1);
+ await Future(() {});
+ finishWork!.completeError('error');
+ await Future(() {});
+ expect(errors, ['error']);
+ });
+
+ test('can continue handling events after an error', () async {
+ values.add(1);
+ await Future(() {});
+ finishWork!.completeError('error');
+ values.add(2);
+ await Future(() {});
+ expect(workArgument, [2]);
+ finishWork!.completeError('another');
+ await Future(() {});
+ expect(errors, ['error', 'another']);
+ });
+
+ test('does not start next work early due to an error in values',
+ () async {
+ values.add(1);
+ await Future(() {});
+ values
+ ..addError('error')
+ ..add(2);
+ await Future(() {});
+ expect(errors, ['error']);
+ // [work] will assert that the second iteration is not called because
+ // the first has not completed.
+ });
+
+ test('cancels value subscription when output canceled', () async {
+ expect(valuesCanceled, false);
+ await subscription.cancel();
+ expect(valuesCanceled, true);
+ });
+
+ test('closes when values end if no work is pending', () async {
+ expect(isDone, false);
+ await values.close();
+ await Future(() {});
+ expect(isDone, true);
+ });
+
+ test('waits for pending work when values close', () async {
+ values.add(1);
+ await Future(() {});
+ expect(isDone, false);
+ values.add(2);
+ await values.close();
+ expect(isDone, false);
+ finishWork!.complete('');
+ await Future(() {});
+ // Still a pending value
+ expect(isDone, false);
+ finishWork!.complete('');
+ await Future(() {});
+ expect(isDone, true);
+ });
+
+ test('forwards errors from values', () async {
+ values.addError('error');
+ await Future(() {});
+ expect(errors, ['error']);
+ });
+
+ if (streamType == 'broadcast') {
+ test('multiple listeners all get values', () async {
+ var otherValues = <String>[];
+ transformed.listen(otherValues.add);
+ values.add(1);
+ await Future(() {});
+ finishWork!.complete('result');
+ await Future(() {});
+ expect(emittedValues, ['result']);
+ expect(otherValues, ['result']);
+ });
+
+ test('multiple listeners get done when values end', () async {
+ var otherDone = false;
+ transformed.listen(null, onDone: () => otherDone = true);
+ values.add(1);
+ await Future(() {});
+ await values.close();
+ expect(isDone, false);
+ expect(otherDone, false);
+ finishWork!.complete('');
+ await Future(() {});
+ expect(isDone, true);
+ expect(otherDone, true);
+ });
+
+ test('can cancel and relisten', () async {
+ values.add(1);
+ await Future(() {});
+ finishWork!.complete('first');
+ await Future(() {});
+ await subscription.cancel();
+ values.add(2);
+ await Future(() {});
+ subscription = transformed.listen(emittedValues.add);
+ values.add(3);
+ await Future(() {});
+ expect(workArgument, [3]);
+ finishWork!.complete('second');
+ await Future(() {});
+ expect(emittedValues, ['first', 'second']);
+ });
+ }
+ });
+ }
+}
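A sketch of the `asyncMapBuffer` pattern these tests cover: events that arrive while the previous `convert` call is still running are buffered and handed to the next call as a single list. Illustrative only; the `save` helper and its timings are made up, and the snippet assumes the package code from this PR.
```dart
import 'package:stream_transform/stream_transform.dart';

Future<void> main() async {
  final events =
      Stream.periodic(const Duration(milliseconds: 10), (i) => i).take(10);

  // Each save takes longer than the gap between events, so later calls
  // receive several buffered events at once.
  Future<String> save(List<int> batch) async {
    await Future<void>.delayed(const Duration(milliseconds: 35));
    return 'saved $batch';
  }

  await for (final result in events.asyncMapBuffer(save)) {
    print(result); // e.g. saved [0], saved [1, 2, 3], ...
  }
}
```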
diff --git a/pkgs/stream_transform/test/async_map_sample_test.dart b/pkgs/stream_transform/test/async_map_sample_test.dart
new file mode 100644
index 0000000..62b1b92
--- /dev/null
+++ b/pkgs/stream_transform/test/async_map_sample_test.dart
@@ -0,0 +1,209 @@
+// Copyright (c) 2019, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'package:stream_transform/stream_transform.dart';
+import 'package:test/test.dart';
+
+import 'utils.dart';
+
+void main() {
+ late StreamController<int> values;
+ late List<String> emittedValues;
+ late bool valuesCanceled;
+ late bool isDone;
+ late List<String> errors;
+ late Stream<String> transformed;
+ late StreamSubscription<String> subscription;
+
+ Completer<String>? finishWork;
+ int? workArgument;
+
+ /// Represents the async `convert` function and asserts that it is only called
+ /// after the previous iteration has completed.
+ Future<String> work(int value) {
+ expect(finishWork, isNull,
+ reason: 'See $values before previous work is complete');
+ workArgument = value;
+ finishWork = Completer()
+ ..future.then((_) {
+ workArgument = null;
+ finishWork = null;
+ }).catchError((_) {
+ workArgument = null;
+ finishWork = null;
+ });
+ return finishWork!.future;
+ }
+
+ for (var streamType in streamTypes) {
+ group('asyncMapSample for stream type: [$streamType]', () {
+ setUp(() {
+ valuesCanceled = false;
+ values = createController(streamType)
+ ..onCancel = () {
+ valuesCanceled = true;
+ };
+ emittedValues = [];
+ errors = [];
+ isDone = false;
+ finishWork = null;
+ workArgument = null;
+ transformed = values.stream.asyncMapSample(work);
+ subscription = transformed
+ .listen(emittedValues.add, onError: errors.add, onDone: () {
+ isDone = true;
+ });
+ });
+
+ test('does not emit before work finishes', () async {
+ values.add(1);
+ await Future(() {});
+ expect(emittedValues, isEmpty);
+ expect(workArgument, 1);
+ finishWork!.complete('result');
+ await Future(() {});
+ expect(emittedValues, ['result']);
+ });
+
+ test('buffers values while work is ongoing', () async {
+ values.add(1);
+ await Future(() {});
+ values
+ ..add(2)
+ ..add(3);
+ await Future(() {});
+ finishWork!.complete('');
+ await Future(() {});
+ expect(workArgument, 3);
+ });
+
+ test('forwards errors without waiting for work', () async {
+ values.add(1);
+ await Future(() {});
+ values.addError('error');
+ await Future(() {});
+ expect(errors, ['error']);
+ });
+
+ test('forwards errors which occur during the work', () async {
+ values.add(1);
+ await Future(() {});
+ finishWork!.completeError('error');
+ await Future(() {});
+ expect(errors, ['error']);
+ });
+
+ test('can continue handling events after an error', () async {
+ values.add(1);
+ await Future(() {});
+ finishWork!.completeError('error');
+ values.add(2);
+ await Future(() {});
+ expect(workArgument, 2);
+ finishWork!.completeError('another');
+ await Future(() {});
+ expect(errors, ['error', 'another']);
+ });
+
+ test('does not start next work early due to an error in values',
+ () async {
+ values.add(1);
+ await Future(() {});
+ values
+ ..addError('error')
+ ..add(2);
+ await Future(() {});
+ expect(errors, ['error']);
+ // [work] will assert that the second iteration is not called because
+ // the first has not completed.
+ });
+
+ test('cancels value subscription when output canceled', () async {
+ expect(valuesCanceled, false);
+ await subscription.cancel();
+ expect(valuesCanceled, true);
+ });
+
+ test('closes when values end if no work is pending', () async {
+ expect(isDone, false);
+ await values.close();
+ await Future(() {});
+ expect(isDone, true);
+ });
+
+ test('waits for pending work when values close', () async {
+ values.add(1);
+ await Future(() {});
+ expect(isDone, false);
+ values.add(2);
+ await values.close();
+ expect(isDone, false);
+ finishWork!.complete('');
+ await Future(() {});
+ // Still a pending value
+ expect(isDone, false);
+ finishWork!.complete('');
+ await Future(() {});
+ expect(isDone, true);
+ });
+
+ test('forwards errors from values', () async {
+ values.addError('error');
+ await Future(() {});
+ expect(errors, ['error']);
+ });
+
+ if (streamType == 'broadcast') {
+ test('multiple listeners all get values', () async {
+ var otherValues = <String>[];
+ transformed.listen(otherValues.add);
+ values.add(1);
+ await Future(() {});
+ finishWork!.complete('result');
+ await Future(() {});
+ expect(emittedValues, ['result']);
+ expect(otherValues, ['result']);
+ });
+
+ test('multiple listeners get done when values end', () async {
+ var otherDone = false;
+ transformed.listen(null, onDone: () => otherDone = true);
+ values.add(1);
+ await Future(() {});
+ await values.close();
+ expect(isDone, false);
+ expect(otherDone, false);
+ finishWork!.complete('');
+ await Future(() {});
+ expect(isDone, true);
+ expect(otherDone, true);
+ });
+
+ test('can cancel and relisten', () async {
+ values.add(1);
+ await Future(() {});
+ finishWork!.complete('first');
+ await Future(() {});
+ await subscription.cancel();
+ values.add(2);
+ await Future(() {});
+ subscription = transformed.listen(emittedValues.add);
+ values.add(3);
+ await Future(() {});
+ expect(workArgument, 3);
+ finishWork!.complete('second');
+ await Future(() {});
+ expect(emittedValues, ['first', 'second']);
+ });
+ }
+ });
+ }
+
+ test('allows nulls', () async {
+ var stream = Stream<int?>.value(null);
+ await stream.asyncMapSample(expectAsync1((_) async {})).drain<void>();
+ });
+}
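For comparison with `asyncMapBuffer` above, `asyncMapSample` keeps only the most recent event while `convert` is running. A hedged sketch, not part of the diff, with an invented `lookUp` helper:
```dart
import 'package:stream_transform/stream_transform.dart';

Future<void> main() async {
  final queries =
      Stream.periodic(const Duration(milliseconds: 5), (i) => 'q$i').take(20);

  // While a lookup is in flight, newer queries overwrite the pending one, so
  // only the latest value is passed to the next call.
  Future<String> lookUp(String query) async {
    await Future<void>.delayed(const Duration(milliseconds: 30));
    return 'results for $query';
  }

  await for (final result in queries.asyncMapSample(lookUp)) {
    print(result); // e.g. results for q0, results for q6, ...
  }
}
```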
diff --git a/pkgs/stream_transform/test/async_where_test.dart b/pkgs/stream_transform/test/async_where_test.dart
new file mode 100644
index 0000000..6ea4e76
--- /dev/null
+++ b/pkgs/stream_transform/test/async_where_test.dart
@@ -0,0 +1,90 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+import 'dart:async';
+
+import 'package:stream_transform/stream_transform.dart';
+import 'package:test/test.dart';
+
+void main() {
+ test('forwards only events that pass the predicate', () async {
+ var values = Stream.fromIterable([1, 2, 3, 4]);
+ var filtered = values.asyncWhere((e) async => e > 2);
+ expect(await filtered.toList(), [3, 4]);
+ });
+
+ test('allows predicates that go through event loop', () async {
+ var values = Stream.fromIterable([1, 2, 3, 4]);
+ var filtered = values.asyncWhere((e) async {
+ await Future(() {});
+ return e > 2;
+ });
+ expect(await filtered.toList(), [3, 4]);
+ });
+
+ test('allows synchronous predicate', () async {
+ var values = Stream.fromIterable([1, 2, 3, 4]);
+ var filtered = values.asyncWhere((e) => e > 2);
+ expect(await filtered.toList(), [3, 4]);
+ });
+
+ test('can result in empty stream', () async {
+ var values = Stream.fromIterable([1, 2, 3, 4]);
+ var filtered = values.asyncWhere((e) => e > 4);
+ expect(await filtered.isEmpty, true);
+ });
+
+ test('forwards values to multiple listeners', () async {
+ var values = StreamController<int>.broadcast();
+ var filtered = values.stream.asyncWhere((e) async => e > 2);
+ var firstValues = <int>[];
+ var secondValues = <int>[];
+ filtered
+ ..listen(firstValues.add)
+ ..listen(secondValues.add);
+ values
+ ..add(1)
+ ..add(2)
+ ..add(3)
+ ..add(4);
+ await Future(() {});
+ expect(firstValues, [3, 4]);
+ expect(secondValues, [3, 4]);
+ });
+
+ test('closes streams with multiple listeners', () async {
+ var values = StreamController<int>.broadcast();
+ var predicate = Completer<bool>();
+ var filtered = values.stream.asyncWhere((_) => predicate.future);
+ var firstDone = false;
+ var secondDone = false;
+ filtered
+ ..listen(null, onDone: () => firstDone = true)
+ ..listen(null, onDone: () => secondDone = true);
+ values.add(1);
+ await values.close();
+ expect(firstDone, false);
+ expect(secondDone, false);
+
+ predicate.complete(true);
+ await Future(() {});
+ expect(firstDone, true);
+ expect(secondDone, true);
+ });
+
+ test('forwards errors emitted by the test callback', () async {
+ var errors = <Object>[];
+ var emitted = <Object>[];
+ var values = Stream.fromIterable([1, 2, 3, 4]);
+ var filtered = values.asyncWhere((e) async {
+ await Future(() {});
+ if (e.isEven) throw Exception('$e');
+ return true;
+ });
+ var done = Completer<Object?>();
+ filtered.listen(emitted.add, onError: errors.add, onDone: done.complete);
+ await done.future;
+ expect(emitted, [1, 3]);
+ expect(errors.map((e) => '$e'), ['Exception: 2', 'Exception: 4']);
+ });
+}
diff --git a/pkgs/stream_transform/test/audit_test.dart b/pkgs/stream_transform/test/audit_test.dart
new file mode 100644
index 0000000..28537db
--- /dev/null
+++ b/pkgs/stream_transform/test/audit_test.dart
@@ -0,0 +1,140 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'package:fake_async/fake_async.dart';
+import 'package:stream_transform/stream_transform.dart';
+import 'package:test/test.dart';
+
+import 'utils.dart';
+
+void main() {
+ for (var streamType in streamTypes) {
+ group('Stream type [$streamType]', () {
+ late StreamController<int> values;
+ late List<int> emittedValues;
+ late bool valuesCanceled;
+ late bool isDone;
+ late List<String> errors;
+ late Stream<int> transformed;
+ late StreamSubscription<int> subscription;
+
+ group('audit', () {
+ setUp(() {
+ valuesCanceled = false;
+ values = createController(streamType)
+ ..onCancel = () {
+ valuesCanceled = true;
+ };
+ emittedValues = [];
+ errors = [];
+ isDone = false;
+ transformed = values.stream.audit(const Duration(milliseconds: 6));
+ });
+
+ void listen() {
+ subscription = transformed
+ .listen(emittedValues.add, onError: errors.add, onDone: () {
+ isDone = true;
+ });
+ }
+
+ test('cancels values', () async {
+ listen();
+ await subscription.cancel();
+ expect(valuesCanceled, true);
+ });
+
+ test('swallows values that come faster than duration', () {
+ fakeAsync((async) {
+ listen();
+ values
+ ..add(1)
+ ..add(2)
+ ..close();
+ async.elapse(const Duration(milliseconds: 6));
+ expect(emittedValues, [2]);
+ });
+ });
+
+ test('outputs multiple values spaced further than duration', () {
+ fakeAsync((async) {
+ listen();
+ values.add(1);
+ async.elapse(const Duration(milliseconds: 6));
+ values.add(2);
+ async.elapse(const Duration(milliseconds: 6));
+ expect(emittedValues, [1, 2]);
+ });
+ });
+
+ test('waits for pending value to close', () {
+ fakeAsync((async) {
+ listen();
+ values
+ ..add(1)
+ ..close();
+ expect(isDone, false);
+ async.elapse(const Duration(milliseconds: 6));
+ expect(isDone, true);
+ });
+ });
+
+ test('closes output if there are no pending values', () {
+ fakeAsync((async) {
+ listen();
+ values.add(1);
+ async.elapse(const Duration(milliseconds: 6));
+ values
+ ..add(2)
+ ..close();
+ expect(isDone, false);
+ expect(emittedValues, [1]);
+ async.elapse(const Duration(milliseconds: 6));
+ expect(isDone, true);
+ expect(emittedValues, [1, 2]);
+ });
+ });
+
+ test('does not starve output if many values come closer than duration',
+ () {
+ fakeAsync((async) {
+ listen();
+ values.add(1);
+ async.elapse(const Duration(milliseconds: 3));
+ values.add(2);
+ async.elapse(const Duration(milliseconds: 3));
+ values.add(3);
+ async.elapse(const Duration(milliseconds: 6));
+ expect(emittedValues, [2, 3]);
+ });
+ });
+
+ if (streamType == 'broadcast') {
+ test('multiple listeners all get the values', () {
+ fakeAsync((async) {
+ listen();
+ values.add(1);
+ async.elapse(const Duration(milliseconds: 3));
+ values.add(2);
+ var otherValues = <int>[];
+ transformed.listen(otherValues.add);
+ values.add(3);
+ async.elapse(const Duration(milliseconds: 3));
+ values.add(4);
+ async.elapse(const Duration(milliseconds: 3));
+ values
+ ..add(5)
+ ..close();
+ async.elapse(const Duration(milliseconds: 6));
+ expect(emittedValues, [3, 5]);
+ expect(otherValues, [3, 5]);
+ });
+ });
+ }
+ });
+ });
+ }
+}
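A sketch of the `audit` behavior under test: after an event arrives, the stream waits for the given duration and then emits the most recent event seen, discarding the ones in between. Illustrative only, not part of the diff.
```dart
import 'package:stream_transform/stream_transform.dart';

Future<void> main() async {
  final positions =
      Stream.periodic(const Duration(milliseconds: 2), (i) => i).take(30);

  // At most one event per 10ms window is let through, always the latest one.
  await for (final position
      in positions.audit(const Duration(milliseconds: 10))) {
    print(position); // e.g. 4, 9, 14, ...
  }
}
```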
diff --git a/pkgs/stream_transform/test/buffer_test.dart b/pkgs/stream_transform/test/buffer_test.dart
new file mode 100644
index 0000000..830f555
--- /dev/null
+++ b/pkgs/stream_transform/test/buffer_test.dart
@@ -0,0 +1,305 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+import 'dart:async';
+
+import 'package:stream_transform/stream_transform.dart';
+import 'package:test/test.dart';
+
+import 'utils.dart';
+
+void main() {
+ late StreamController<void> trigger;
+ late StreamController<int> values;
+ late List<List<int>> emittedValues;
+ late bool valuesCanceled;
+ late bool triggerCanceled;
+ late bool triggerPaused;
+ late bool isDone;
+ late List<String> errors;
+ late Stream<List<int>> transformed;
+ late StreamSubscription<List<int>> subscription;
+
+ void setUpForStreamTypes(String triggerType, String valuesType,
+ {required bool longPoll}) {
+ valuesCanceled = false;
+ triggerCanceled = false;
+ triggerPaused = false;
+ trigger = createController(triggerType)
+ ..onCancel = () {
+ triggerCanceled = true;
+ };
+ if (triggerType == 'single subscription') {
+ trigger.onPause = () {
+ triggerPaused = true;
+ };
+ }
+ values = createController(valuesType)
+ ..onCancel = () {
+ valuesCanceled = true;
+ };
+ emittedValues = [];
+ errors = [];
+ isDone = false;
+ transformed = values.stream.buffer(trigger.stream, longPoll: longPoll);
+ subscription =
+ transformed.listen(emittedValues.add, onError: errors.add, onDone: () {
+ isDone = true;
+ });
+ }
+
+ for (var triggerType in streamTypes) {
+ for (var valuesType in streamTypes) {
+ group('Trigger type: [$triggerType], Values type: [$valuesType]', () {
+ group('general behavior', () {
+ setUp(() {
+ setUpForStreamTypes(triggerType, valuesType, longPoll: true);
+ });
+
+ test('does not emit before `trigger`', () async {
+ values.add(1);
+ await Future(() {});
+ expect(emittedValues, isEmpty);
+ trigger.add(null);
+ await Future(() {});
+ expect(emittedValues, [
+ [1]
+ ]);
+ });
+
+ test('groups values between trigger', () async {
+ values
+ ..add(1)
+ ..add(2);
+ await Future(() {});
+ trigger.add(null);
+ values
+ ..add(3)
+ ..add(4);
+ await Future(() {});
+ trigger.add(null);
+ await Future(() {});
+ expect(emittedValues, [
+ [1, 2],
+ [3, 4]
+ ]);
+ });
+
+ test('cancels value subscription when output canceled', () async {
+ expect(valuesCanceled, false);
+ await subscription.cancel();
+ expect(valuesCanceled, true);
+ });
+
+ test('closes when trigger ends', () async {
+ expect(isDone, false);
+ await trigger.close();
+ await Future(() {});
+ expect(isDone, true);
+ });
+
+ test('closes after outputting final values when source closes',
+ () async {
+ expect(isDone, false);
+ values.add(1);
+ await values.close();
+ expect(isDone, false);
+ trigger.add(null);
+ await Future(() {});
+ expect(emittedValues, [
+ [1]
+ ]);
+ expect(isDone, true);
+ });
+
+ test('closes when source closes and there are no buffered values', () async {
+ expect(isDone, false);
+ await values.close();
+ await Future(() {});
+ expect(isDone, true);
+ });
+
+ test('forwards errors from trigger', () async {
+ trigger.addError('error');
+ await Future(() {});
+ expect(errors, ['error']);
+ });
+
+ test('forwards errors from values', () async {
+ values.addError('error');
+ await Future(() {});
+ expect(errors, ['error']);
+ });
+ });
+
+ group('long polling', () {
+ setUp(() {
+ setUpForStreamTypes(triggerType, valuesType, longPoll: true);
+ });
+
+ test('emits immediately if trigger emits before a value', () async {
+ trigger.add(null);
+ await Future(() {});
+ expect(emittedValues, isEmpty);
+ values.add(1);
+ await Future(() {});
+ expect(emittedValues, [
+ [1]
+ ]);
+ });
+
+ test('two triggers in a row - emit buffered then emit next value',
+ () async {
+ values
+ ..add(1)
+ ..add(2);
+ await Future(() {});
+ trigger
+ ..add(null)
+ ..add(null);
+ await Future(() {});
+ values.add(3);
+ await Future(() {});
+ expect(emittedValues, [
+ [1, 2],
+ [3]
+ ]);
+ });
+
+ test('pre-emptive trigger then trigger after values', () async {
+ trigger.add(null);
+ await Future(() {});
+ values
+ ..add(1)
+ ..add(2);
+ await Future(() {});
+ trigger.add(null);
+ await Future(() {});
+ expect(emittedValues, [
+ [1],
+ [2]
+ ]);
+ });
+
+ test('multiple pre-emptive triggers, only emits first value',
+ () async {
+ trigger
+ ..add(null)
+ ..add(null);
+ await Future(() {});
+ values
+ ..add(1)
+ ..add(2);
+ await Future(() {});
+ expect(emittedValues, [
+ [1]
+ ]);
+ });
+
+ test('closes if there is no waiting long poll when source closes',
+ () async {
+ expect(isDone, false);
+ values.add(1);
+ trigger.add(null);
+ await values.close();
+ await Future(() {});
+ expect(isDone, true);
+ });
+
+ test('waits to emit if there is a waiting long poll when trigger closes',
+ () async {
+ trigger.add(null);
+ await trigger.close();
+ expect(isDone, false);
+ values.add(1);
+ await Future(() {});
+ expect(emittedValues, [
+ [1]
+ ]);
+ expect(isDone, true);
+ });
+ });
+
+ group('immediate polling', () {
+ setUp(() {
+ setUpForStreamTypes(triggerType, valuesType, longPoll: false);
+ });
+
+ test('emits empty list before values', () async {
+ trigger.add(null);
+ await Future(() {});
+ expect(emittedValues, [<int>[]]);
+ });
+
+ test('emits empty list after emitting values', () async {
+ values
+ ..add(1)
+ ..add(2);
+ await Future(() {});
+ trigger
+ ..add(null)
+ ..add(null);
+ await Future(() {});
+ expect(emittedValues, [
+ [1, 2],
+ <int>[]
+ ]);
+ });
+ });
+ });
+ }
+ }
+
+ test('always cancels trigger if values is single subscription', () async {
+ setUpForStreamTypes('broadcast', 'single subscription', longPoll: true);
+ expect(triggerCanceled, false);
+ await subscription.cancel();
+ expect(triggerCanceled, true);
+
+ setUpForStreamTypes('single subscription', 'single subscription',
+ longPoll: true);
+ expect(triggerCanceled, false);
+ await subscription.cancel();
+ expect(triggerCanceled, true);
+ });
+
+ test('cancels trigger if trigger is broadcast', () async {
+ setUpForStreamTypes('broadcast', 'broadcast', longPoll: true);
+ expect(triggerCanceled, false);
+ await subscription.cancel();
+ expect(triggerCanceled, true);
+ });
+
+ test('pauses single subscription trigger for broadcast values', () async {
+ setUpForStreamTypes('single subscription', 'broadcast', longPoll: true);
+ expect(triggerCanceled, false);
+ expect(triggerPaused, false);
+ await subscription.cancel();
+ expect(triggerCanceled, false);
+ expect(triggerPaused, true);
+ });
+
+ for (var triggerType in streamTypes) {
+ test('cancel and relisten with [$triggerType] trigger', () async {
+ setUpForStreamTypes(triggerType, 'broadcast', longPoll: true);
+ values.add(1);
+ trigger.add(null);
+ await Future(() {});
+ expect(emittedValues, [
+ [1]
+ ]);
+ await subscription.cancel();
+ values.add(2);
+ trigger.add(null);
+ await Future(() {});
+ subscription = transformed.listen(emittedValues.add);
+ values.add(3);
+ trigger.add(null);
+ await Future(() {});
+ expect(emittedValues, [
+ [1],
+ [3]
+ ]);
+ });
+ }
+}
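A sketch of `buffer` as exercised above: values are collected until the trigger stream emits, then flushed as a single list. Illustrative only; the periodic streams and timings are invented, and the snippet assumes the package code from this PR.
```dart
import 'package:stream_transform/stream_transform.dart';

Future<void> main() async {
  final values =
      Stream.periodic(const Duration(milliseconds: 2), (i) => i).take(10);
  final trigger = Stream<void>.periodic(const Duration(milliseconds: 7));

  // Each trigger event flushes everything buffered since the previous one.
  await for (final batch in values.buffer(trigger)) {
    print(batch); // e.g. [0, 1, 2], [3, 4, 5], ...
  }
}
```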
diff --git a/pkgs/stream_transform/test/combine_latest_all_test.dart b/pkgs/stream_transform/test/combine_latest_all_test.dart
new file mode 100644
index 0000000..f4b719c
--- /dev/null
+++ b/pkgs/stream_transform/test/combine_latest_all_test.dart
@@ -0,0 +1,166 @@
+// Copyright (c) 2019, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'package:stream_transform/stream_transform.dart';
+import 'package:test/test.dart';
+
+Future<void> tick() => Future(() {});
+
+void main() {
+ group('combineLatestAll', () {
+ test('emits latest values', () async {
+ final first = StreamController<String>();
+ final second = StreamController<String>();
+ final third = StreamController<String>();
+ final combined = first.stream.combineLatestAll(
+ [second.stream, third.stream]).map((data) => data.join());
+
+ // first: a----b------------------c--------d---|
+ // second: --1---------2-----------------|
+ // third: -------&----------%---|
+ // combined: -------b1&--b2&---b2%---c2%------d2%-|
+
+ expect(combined,
+ emitsInOrder(['b1&', 'b2&', 'b2%', 'c2%', 'd2%', emitsDone]));
+
+ first.add('a');
+ await tick();
+ second.add('1');
+ await tick();
+ first.add('b');
+ await tick();
+ third.add('&');
+ await tick();
+ second.add('2');
+ await tick();
+ third.add('%');
+ await tick();
+ await third.close();
+ await tick();
+ first.add('c');
+ await tick();
+ await second.close();
+ await tick();
+ first.add('d');
+ await tick();
+ await first.close();
+ });
+
+ test('ends if a Stream closes without ever emitting a value', () async {
+ final first = StreamController<String>();
+ final second = StreamController<String>();
+ final combined = first.stream.combineLatestAll([second.stream]);
+
+ // first: -a------b-------|
+ // second: -----|
+ // combined: -----|
+
+ expect(combined, emits(emitsDone));
+
+ first.add('a');
+ await tick();
+ await second.close();
+ await tick();
+ first.add('b');
+ });
+
+ test('forwards errors', () async {
+ final first = StreamController<String>();
+ final second = StreamController<String>();
+ final combined = first.stream
+ .combineLatestAll([second.stream]).map((data) => data.join());
+
+ // first: -a---------|
+ // second: ----1---#
+ // combined: ----a1--#
+
+ expect(combined, emitsThrough(emitsError('doh')));
+
+ first.add('a');
+ await tick();
+ second.add('1');
+ await tick();
+ second.addError('doh');
+ });
+
+ test('ends after both streams have ended', () async {
+ final first = StreamController<String>();
+ final second = StreamController<String>();
+
+ var done = false;
+ first.stream.combineLatestAll([second.stream]).listen(null,
+ onDone: () => done = true);
+
+ // first: -a---|
+ // second: --------1--|
+ // combined: --------a1-|
+
+ first.add('a');
+ await tick();
+ await first.close();
+ await tick();
+
+ expect(done, isFalse);
+
+ second.add('1');
+ await tick();
+ await second.close();
+ await tick();
+
+ expect(done, isTrue);
+ });
+
+ group('broadcast source', () {
+ test('can cancel and relisten to broadcast stream', () async {
+ final first = StreamController<String>.broadcast();
+ final second = StreamController<String>.broadcast();
+ final combined = first.stream
+ .combineLatestAll([second.stream]).map((data) => data.join());
+
+ // first: a------b----------------c------d----e---|
+ // second: --1---------2---3---4------5-|
+ // combined: --a1---b1---b2--b3--b4-----c5--d5---e5--|
+ // sub1: ^-----------------!
+ // sub2: ----------------------^-----------------|
+
+ expect(combined.take(4), emitsInOrder(['a1', 'b1', 'b2', 'b3']));
+
+ first.add('a');
+ await tick();
+ second.add('1');
+ await tick();
+ first.add('b');
+ await tick();
+ second.add('2');
+ await tick();
+ second.add('3');
+ await tick();
+
+ // First subscription is canceled here by .take(4)
+ expect(first.hasListener, isFalse);
+ expect(second.hasListener, isFalse);
+
+ // This emit is thrown away because there are no subscribers
+ second.add('4');
+ await tick();
+
+ expect(combined, emitsInOrder(['c5', 'd5', 'e5', emitsDone]));
+
+ first.add('c');
+ await tick();
+ second.add('5');
+ await tick();
+ await second.close();
+ await tick();
+ first.add('d');
+ await tick();
+ first.add('e');
+ await tick();
+ await first.close();
+ });
+ });
+ });
+}
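A sketch of `combineLatestAll`: once every stream has emitted at least once, each new event produces a list holding the latest value from each source. Illustrative only, not part of the diff.
```dart
import 'package:stream_transform/stream_transform.dart';

Future<void> main() async {
  final a = Stream.periodic(const Duration(milliseconds: 100), (i) => 'a$i');
  final b = Stream.periodic(const Duration(milliseconds: 150), (i) => 'b$i');
  final c = Stream.periodic(const Duration(milliseconds: 250), (i) => 'c$i');

  // Nothing is emitted until a, b, and c have all produced a value.
  await for (final latest in a.combineLatestAll([b, c]).take(5)) {
    print(latest); // e.g. [a1, b0, c0], [a2, b0, c0], ...
  }
}
```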
diff --git a/pkgs/stream_transform/test/combine_latest_test.dart b/pkgs/stream_transform/test/combine_latest_test.dart
new file mode 100644
index 0000000..1985c75
--- /dev/null
+++ b/pkgs/stream_transform/test/combine_latest_test.dart
@@ -0,0 +1,179 @@
+// Copyright (c) 2019, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'package:stream_transform/stream_transform.dart';
+import 'package:test/test.dart';
+
+void main() {
+ group('combineLatest', () {
+ test('flows through combine callback', () async {
+ var source = StreamController<int>();
+ var other = StreamController<int>();
+ int sum(int a, int b) => a + b;
+
+ var results = <int>[];
+ unawaited(
+ source.stream.combineLatest(other.stream, sum).forEach(results.add));
+
+ source.add(1);
+ await Future(() {});
+ expect(results, isEmpty);
+
+ other.add(2);
+ await Future(() {});
+ expect(results, [3]);
+
+ source.add(3);
+ await Future(() {});
+ expect(results, [3, 5]);
+
+ source.add(4);
+ await Future(() {});
+ expect(results, [3, 5, 6]);
+
+ other.add(5);
+ await Future(() {});
+ expect(results, [3, 5, 6, 9]);
+ });
+
+ test('can combine different typed streams', () async {
+ var source = StreamController<String>();
+ var other = StreamController<int>();
+ String times(String a, int b) => a * b;
+
+ var results = <String>[];
+ unawaited(source.stream
+ .combineLatest(other.stream, times)
+ .forEach(results.add));
+
+ source
+ ..add('a')
+ ..add('b');
+ await Future(() {});
+ expect(results, isEmpty);
+
+ other.add(2);
+ await Future(() {});
+ expect(results, ['bb']);
+
+ other.add(3);
+ await Future(() {});
+ expect(results, ['bb', 'bbb']);
+
+ source.add('c');
+ await Future(() {});
+ expect(results, ['bb', 'bbb', 'ccc']);
+ });
+
+ test('ends after both streams have ended', () async {
+ var source = StreamController<int>();
+ var other = StreamController<int>();
+ int sum(int a, int b) => a + b;
+
+ var done = false;
+ source.stream
+ .combineLatest(other.stream, sum)
+ .listen(null, onDone: () => done = true);
+
+ source.add(1);
+
+ await source.close();
+ await Future(() {});
+ expect(done, false);
+
+ await other.close();
+ await Future(() {});
+ expect(done, true);
+ });
+
+ test('ends if source stream closes without ever emitting a value',
+ () async {
+ var source = const Stream<int>.empty();
+ var other = StreamController<int>();
+
+ int sum(int a, int b) => a + b;
+
+ var done = false;
+ source
+ .combineLatest(other.stream, sum)
+ .listen(null, onDone: () => done = true);
+
+ await Future(() {});
+ // Nothing can ever be emitted on the result, may as well close.
+ expect(done, true);
+ });
+
+ test('ends if other stream closes without ever emitting a value', () async {
+ var source = StreamController<int>();
+ var other = const Stream<int>.empty();
+
+ int sum(int a, int b) => a + b;
+
+ var done = false;
+ source.stream
+ .combineLatest(other, sum)
+ .listen(null, onDone: () => done = true);
+
+ await Future(() {});
+ // Nothing can ever be emitted on the result, may as well close.
+ expect(done, true);
+ });
+
+ test('forwards errors', () async {
+ var source = StreamController<int>();
+ var other = StreamController<int>();
+ int sum(int a, int b) => throw _NumberedException(3);
+
+ var errors = <Object>[];
+ source.stream
+ .combineLatest(other.stream, sum)
+ .listen(null, onError: errors.add);
+
+ source.addError(_NumberedException(1));
+ other.addError(_NumberedException(2));
+
+ source.add(1);
+ other.add(2);
+
+ await Future(() {});
+
+ expect(errors, [_isException(1), _isException(2), _isException(3)]);
+ });
+
+ group('broadcast source', () {
+ test('can cancel and relisten to broadcast stream', () async {
+ var source = StreamController<int>.broadcast();
+ var other = StreamController<int>();
+ int combine(int a, int b) => a + b;
+
+ var emittedValues = <int>[];
+ var transformed = source.stream.combineLatest(other.stream, combine);
+
+ var subscription = transformed.listen(emittedValues.add);
+
+ source.add(1);
+ other.add(2);
+ await Future(() {});
+ expect(emittedValues, [3]);
+
+ await subscription.cancel();
+
+ subscription = transformed.listen(emittedValues.add);
+ source.add(3);
+ await Future(() {});
+ expect(emittedValues, [3, 5]);
+ });
+ });
+ });
+}
+
+class _NumberedException implements Exception {
+ final int id;
+ _NumberedException(this.id);
+}
+
+Matcher _isException(int id) =>
+ const TypeMatcher<_NumberedException>().having((n) => n.id, 'id', id);
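A sketch of the two-stream variant, `combineLatest`, which applies a combine callback to the most recent value from each stream. Illustrative only; the example streams are invented.
```dart
import 'package:stream_transform/stream_transform.dart';

Future<void> main() async {
  final prices = Stream.fromIterable([10, 11, 12]);
  final quantities = Stream.fromIterable([2, 3]);

  // Emits a combined value whenever either stream emits, once both have
  // emitted at least one value.
  final totals = prices.combineLatest(quantities, (int p, int q) => p * q);
  print(await totals.toList()); // exact ordering depends on interleaving
}
```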
diff --git a/pkgs/stream_transform/test/concurrent_async_map_test.dart b/pkgs/stream_transform/test/concurrent_async_map_test.dart
new file mode 100644
index 0000000..1807f9f
--- /dev/null
+++ b/pkgs/stream_transform/test/concurrent_async_map_test.dart
@@ -0,0 +1,157 @@
+// Copyright (c) 2018, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'package:stream_transform/stream_transform.dart';
+import 'package:test/test.dart';
+
+import 'utils.dart';
+
+void main() {
+ late StreamController<int> controller;
+ late List<String> emittedValues;
+ late bool valuesCanceled;
+ late bool isDone;
+ late List<String> errors;
+ late Stream<String> transformed;
+ late StreamSubscription<String> subscription;
+
+ late List<Completer<String>> finishWork;
+ late List<dynamic> values;
+
+ Future<String> convert(int value) {
+ values.add(value);
+ var completer = Completer<String>();
+ finishWork.add(completer);
+ return completer.future;
+ }
+
+ for (var streamType in streamTypes) {
+ group('concurrentAsyncMap for stream type: [$streamType]', () {
+ setUp(() {
+ valuesCanceled = false;
+ controller = createController(streamType)
+ ..onCancel = () {
+ valuesCanceled = true;
+ };
+ emittedValues = [];
+ errors = [];
+ isDone = false;
+ finishWork = [];
+ values = [];
+ transformed = controller.stream.concurrentAsyncMap(convert);
+ subscription = transformed
+ .listen(emittedValues.add, onError: errors.add, onDone: () {
+ isDone = true;
+ });
+ });
+
+ test('does not emit before convert finishes', () async {
+ controller.add(1);
+ await Future(() {});
+ expect(emittedValues, isEmpty);
+ expect(values, [1]);
+ finishWork.first.complete('result');
+ await Future(() {});
+ expect(emittedValues, ['result']);
+ });
+
+ test('allows calls to convert before the last one finished', () async {
+ controller
+ ..add(1)
+ ..add(2)
+ ..add(3);
+ await Future(() {});
+ expect(values, [1, 2, 3]);
+ });
+
+ test('forwards errors directly without waiting for previous convert',
+ () async {
+ controller.add(1);
+ await Future(() {});
+ controller.addError('error');
+ await Future(() {});
+ expect(errors, ['error']);
+ });
+
+ test('forwards errors which occur during the convert', () async {
+ controller.add(1);
+ await Future(() {});
+ finishWork.first.completeError('error');
+ await Future(() {});
+ expect(errors, ['error']);
+ });
+
+ test('can continue handling events after an error', () async {
+ controller.add(1);
+ await Future(() {});
+ finishWork[0].completeError('error');
+ controller.add(2);
+ await Future(() {});
+ expect(values, [1, 2]);
+ finishWork[1].completeError('another');
+ await Future(() {});
+ expect(errors, ['error', 'another']);
+ });
+
+ test('cancels value subscription when output canceled', () async {
+ expect(valuesCanceled, false);
+ await subscription.cancel();
+ expect(valuesCanceled, true);
+ });
+
+ test('closes when values end if no conversion is pending', () async {
+ expect(isDone, false);
+ await controller.close();
+ await Future(() {});
+ expect(isDone, true);
+ });
+
+ if (streamType == 'broadcast') {
+ test('multiple listeners all get values', () async {
+ var otherValues = <String>[];
+ transformed.listen(otherValues.add);
+ controller.add(1);
+ await Future(() {});
+ finishWork.first.complete('result');
+ await Future(() {});
+ expect(emittedValues, ['result']);
+ expect(otherValues, ['result']);
+ });
+
+ test('multiple listeners get done when values end', () async {
+ var otherDone = false;
+ transformed.listen(null, onDone: () => otherDone = true);
+ controller.add(1);
+ await Future(() {});
+ await controller.close();
+ expect(isDone, false);
+ expect(otherDone, false);
+ finishWork.first.complete('');
+ await Future(() {});
+ expect(isDone, true);
+ expect(otherDone, true);
+ });
+
+ test('can cancel and relisten', () async {
+ controller.add(1);
+ await Future(() {});
+ finishWork.first.complete('first');
+ await Future(() {});
+ await subscription.cancel();
+ controller.add(2);
+ await Future(() {});
+ subscription = transformed.listen(emittedValues.add);
+ controller.add(3);
+ await Future(() {});
+ expect(values, [1, 3]);
+ finishWork[1].complete('second');
+ await Future(() {});
+ expect(emittedValues, ['first', 'second']);
+ });
+ }
+ });
+ }
+}
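A sketch of `concurrentAsyncMap`: conversions may overlap, and results are emitted in the order the conversions complete rather than in source order. Illustrative only; the `fetch` helper and its delays are made up.
```dart
import 'package:stream_transform/stream_transform.dart';

Future<void> main() async {
  final ids = Stream.fromIterable([1, 2, 3]);

  // The slowest conversion is for id 1, so it is emitted last even though it
  // was the first source event.
  Future<String> fetch(int id) async {
    await Future<void>.delayed(Duration(milliseconds: (4 - id) * 10));
    return 'item $id';
  }

  print(await ids.concurrentAsyncMap(fetch).toList());
  // e.g. [item 3, item 2, item 1]
}
```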
diff --git a/pkgs/stream_transform/test/debounce_test.dart b/pkgs/stream_transform/test/debounce_test.dart
new file mode 100644
index 0000000..19de055
--- /dev/null
+++ b/pkgs/stream_transform/test/debounce_test.dart
@@ -0,0 +1,310 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'package:fake_async/fake_async.dart';
+import 'package:stream_transform/stream_transform.dart';
+import 'package:test/test.dart';
+
+import 'utils.dart';
+
+void main() {
+ for (var streamType in streamTypes) {
+ group('Stream type [$streamType]', () {
+ group('debounce - trailing', () {
+ late StreamController<int> values;
+ late List<int> emittedValues;
+ late bool valuesCanceled;
+ late bool isDone;
+ late List<String> errors;
+ late StreamSubscription<int> subscription;
+ late Stream<int> transformed;
+
+ setUp(() async {
+ valuesCanceled = false;
+ values = createController(streamType)
+ ..onCancel = () {
+ valuesCanceled = true;
+ };
+ emittedValues = [];
+ errors = [];
+ isDone = false;
+ transformed = values.stream.debounce(const Duration(milliseconds: 5));
+ });
+
+ void listen() {
+ subscription = transformed
+ .listen(emittedValues.add, onError: errors.add, onDone: () {
+ isDone = true;
+ });
+ }
+
+ test('cancels values', () async {
+ listen();
+ await subscription.cancel();
+ expect(valuesCanceled, true);
+ });
+
+ test('swallows values that come faster than duration', () {
+ fakeAsync((async) {
+ listen();
+ values
+ ..add(1)
+ ..add(2)
+ ..close();
+ async.elapse(const Duration(milliseconds: 6));
+ expect(emittedValues, [2]);
+ });
+ });
+
+ test('outputs multiple values spaced further than duration', () {
+ fakeAsync((async) {
+ listen();
+ values.add(1);
+ async.elapse(const Duration(milliseconds: 6));
+ values.add(2);
+ async.elapse(const Duration(milliseconds: 6));
+ expect(emittedValues, [1, 2]);
+ });
+ });
+
+ test('waits for pending value to close', () {
+ fakeAsync((async) {
+ listen();
+ values.add(1);
+ async.elapse(const Duration(milliseconds: 6));
+ values.close();
+ async.flushMicrotasks();
+ expect(isDone, true);
+ });
+ });
+
+ test('closes output if there are no pending values', () {
+ fakeAsync((async) {
+ listen();
+ values.add(1);
+ async.elapse(const Duration(milliseconds: 6));
+ values
+ ..add(2)
+ ..close();
+ async.flushMicrotasks();
+ expect(isDone, false);
+ async.elapse(const Duration(milliseconds: 6));
+ expect(isDone, true);
+ });
+ });
+
+ if (streamType == 'broadcast') {
+ test('multiple listeners all get values', () {
+ fakeAsync((async) {
+ listen();
+ var otherValues = <int>[];
+ transformed.listen(otherValues.add);
+ values
+ ..add(1)
+ ..add(2);
+ async.elapse(const Duration(milliseconds: 6));
+ expect(emittedValues, [2]);
+ expect(otherValues, [2]);
+ });
+ });
+ }
+ });
+
+ group('debounce - leading', () {
+ late StreamController<int> values;
+ late List<int> emittedValues;
+ late Stream<int> transformed;
+ late bool isDone;
+
+ setUp(() async {
+ values = createController(streamType);
+ emittedValues = [];
+ isDone = false;
+ transformed = values.stream.debounce(const Duration(milliseconds: 5),
+ leading: true, trailing: false);
+ });
+
+ void listen() {
+ transformed.listen(emittedValues.add, onDone: () {
+ isDone = true;
+ });
+ }
+
+ test('swallows values that come faster than duration', () async {
+ listen();
+ values
+ ..add(1)
+ ..add(2);
+ await values.close();
+ expect(emittedValues, [1]);
+ });
+
+ test('outputs multiple values spaced further than duration', () {
+ fakeAsync((async) {
+ listen();
+ values.add(1);
+ async.elapse(const Duration(milliseconds: 6));
+ values.add(2);
+ async.elapse(const Duration(milliseconds: 6));
+ expect(emittedValues, [1, 2]);
+ });
+ });
+
+ if (streamType == 'broadcast') {
+ test('multiple listeners all get values', () {
+ fakeAsync((async) {
+ listen();
+ var otherValues = <int>[];
+ transformed.listen(otherValues.add);
+ values
+ ..add(1)
+ ..add(2);
+ async.elapse(const Duration(milliseconds: 6));
+ expect(emittedValues, [1]);
+ expect(otherValues, [1]);
+ });
+ });
+ }
+
+ test('closes output immediately if not waiting for trailing value',
+ () async {
+ listen();
+ values.add(1);
+ await values.close();
+ expect(isDone, true);
+ });
+ });
+
+ group('debounce - leading and trailing', () {
+ late StreamController<int> values;
+ late List<int> emittedValues;
+ late Stream<int> transformed;
+
+ setUp(() async {
+ values = createController(streamType);
+ emittedValues = [];
+ transformed = values.stream.debounce(const Duration(milliseconds: 5),
+ leading: true, trailing: true);
+ });
+ void listen() {
+ transformed.listen(emittedValues.add);
+ }
+
+ test('swallows values that come faster than duration', () {
+ fakeAsync((async) {
+ listen();
+ values
+ ..add(1)
+ ..add(2)
+ ..add(3)
+ ..close();
+ async.elapse(const Duration(milliseconds: 6));
+ expect(emittedValues, [1, 3]);
+ });
+ });
+
+ test('outputs multiple values spaced further than duration', () {
+ fakeAsync((async) {
+ listen();
+ values.add(1);
+ async.elapse(const Duration(milliseconds: 6));
+ values.add(2);
+ async.elapse(const Duration(milliseconds: 6));
+ expect(emittedValues, [1, 2]);
+ });
+ });
+
+ if (streamType == 'broadcast') {
+ test('multiple listeners all get values', () {
+ fakeAsync((async) {
+ listen();
+ var otherValues = <int>[];
+ transformed.listen(otherValues.add);
+ values
+ ..add(1)
+ ..add(2);
+ async.elapse(const Duration(milliseconds: 6));
+ expect(emittedValues, [1, 2]);
+ expect(otherValues, [1, 2]);
+ });
+ });
+ }
+ });
+
+ group('debounceBuffer', () {
+ late StreamController<int> values;
+ late List<List<int>> emittedValues;
+ late List<String> errors;
+ late Stream<List<int>> transformed;
+
+ setUp(() async {
+ values = createController(streamType);
+ emittedValues = [];
+ errors = [];
+ transformed =
+ values.stream.debounceBuffer(const Duration(milliseconds: 5));
+ });
+ void listen() {
+ transformed.listen(emittedValues.add, onError: errors.add);
+ }
+
+ test('Emits all values as a list', () {
+ fakeAsync((async) {
+ listen();
+ values
+ ..add(1)
+ ..add(2)
+ ..close();
+ async.elapse(const Duration(milliseconds: 6));
+ expect(emittedValues, [
+ [1, 2]
+ ]);
+ });
+ });
+
+ test('separate lists for multiple values spaced further than duration',
+ () {
+ fakeAsync((async) {
+ listen();
+ values.add(1);
+ async.elapse(const Duration(milliseconds: 6));
+ values.add(2);
+ async.elapse(const Duration(milliseconds: 6));
+ expect(emittedValues, [
+ [1],
+ [2]
+ ]);
+ });
+ });
+
+ if (streamType == 'broadcast') {
+ test('multiple listeners all get values', () {
+ fakeAsync((async) {
+ listen();
+ var otherValues = <List<int>>[];
+ transformed.listen(otherValues.add);
+ values
+ ..add(1)
+ ..add(2);
+ async.elapse(const Duration(milliseconds: 6));
+ expect(emittedValues, [
+ [1, 2]
+ ]);
+ expect(otherValues, [
+ [1, 2]
+ ]);
+ });
+ });
+ }
+ });
+ });
+ }
+ test('allows nulls', () async {
+ final values = Stream<int?>.fromIterable([null]);
+ final transformed = values.debounce(const Duration(milliseconds: 1));
+ expect(await transformed.toList(), [null]);
+ });
+}
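A sketch of the default (trailing) `debounce` behavior tested above: a rapid burst of events collapses to its final value once the source has been quiet for the full duration. Illustrative only, not part of the diff.
```dart
import 'package:stream_transform/stream_transform.dart';

Future<void> main() async {
  final keystrokes =
      Stream.periodic(const Duration(milliseconds: 10), (i) => i).take(10);

  // Events arrive every 10ms, faster than the 50ms window, so only the last
  // one survives once the stream goes quiet.
  final settled = keystrokes.debounce(const Duration(milliseconds: 50));
  print(await settled.toList()); // [9]
}
```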
diff --git a/pkgs/stream_transform/test/followd_by_test.dart b/pkgs/stream_transform/test/followd_by_test.dart
new file mode 100644
index 0000000..d600d13
--- /dev/null
+++ b/pkgs/stream_transform/test/followd_by_test.dart
@@ -0,0 +1,159 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+import 'dart:async';
+
+import 'package:stream_transform/stream_transform.dart';
+import 'package:test/test.dart';
+
+import 'utils.dart';
+
+void main() {
+ for (var firstType in streamTypes) {
+ for (var secondType in streamTypes) {
+ group('followedBy [$firstType] with [$secondType]', () {
+ late StreamController<int> first;
+ late StreamController<int> second;
+
+ late List<int> emittedValues;
+ late bool firstCanceled;
+ late bool secondCanceled;
+ late bool secondListened;
+ late bool isDone;
+ late List<String> errors;
+ late Stream<int> transformed;
+ late StreamSubscription<int> subscription;
+
+ setUp(() async {
+ firstCanceled = false;
+ secondCanceled = false;
+ secondListened = false;
+ first = createController(firstType)
+ ..onCancel = () {
+ firstCanceled = true;
+ };
+ second = createController(secondType)
+ ..onCancel = () {
+ secondCanceled = true;
+ }
+ ..onListen = () {
+ secondListened = true;
+ };
+ emittedValues = [];
+ errors = [];
+ isDone = false;
+ transformed = first.stream.followedBy(second.stream);
+ subscription = transformed
+ .listen(emittedValues.add, onError: errors.add, onDone: () {
+ isDone = true;
+ });
+ });
+
+ test('adds all values from both streams', () async {
+ first
+ ..add(1)
+ ..add(2);
+ await first.close();
+ await Future(() {});
+ second
+ ..add(3)
+ ..add(4);
+ await Future(() {});
+ expect(emittedValues, [1, 2, 3, 4]);
+ });
+
+ test('Does not listen to second stream before first stream finishes',
+ () async {
+ expect(secondListened, false);
+ await first.close();
+ expect(secondListened, true);
+ });
+
+ test('closes stream after both inputs close', () async {
+ await first.close();
+ await second.close();
+ expect(isDone, true);
+ });
+
+ test('cancels any type of first stream on cancel', () async {
+ await subscription.cancel();
+ expect(firstCanceled, true);
+ });
+
+ if (firstType == 'single subscription') {
+ test(
+ 'cancels any type of second stream on cancel if first is '
+ 'broadcast', () async {
+ await first.close();
+ await subscription.cancel();
+ expect(secondCanceled, true);
+ });
+
+ if (secondType == 'broadcast') {
+ test('can pause and resume during second stream - dropping values',
+ () async {
+ await first.close();
+ subscription.pause();
+ second.add(1);
+ await Future(() {});
+ subscription.resume();
+ second.add(2);
+ await Future(() {});
+ expect(emittedValues, [2]);
+ });
+ } else {
+ test('can pause and resume during second stream - buffering values',
+ () async {
+ await first.close();
+ subscription.pause();
+ second.add(1);
+ await Future(() {});
+ subscription.resume();
+ second.add(2);
+ await Future(() {});
+ expect(emittedValues, [1, 2]);
+ });
+ }
+ }
+
+ if (firstType == 'broadcast') {
+ test('can cancel and relisten during first stream', () async {
+ await subscription.cancel();
+ first.add(1);
+ subscription = transformed.listen(emittedValues.add);
+ first.add(2);
+ await Future(() {});
+ expect(emittedValues, [2]);
+ });
+
+ test('can cancel and relisten during second stream', () async {
+ await first.close();
+ await subscription.cancel();
+ second.add(2);
+ await Future(() {});
+ subscription = transformed.listen(emittedValues.add);
+ second.add(3);
+ await Future(() {});
+ expect(emittedValues, [3]);
+ });
+
+ test('forwards values to multiple listeners', () async {
+ var otherValues = <int>[];
+ transformed.listen(otherValues.add);
+ first.add(1);
+ await first.close();
+ second.add(2);
+ await Future(() {});
+ var thirdValues = <int>[];
+ transformed.listen(thirdValues.add);
+ second.add(3);
+ await Future(() {});
+ expect(emittedValues, [1, 2, 3]);
+ expect(otherValues, [1, 2, 3]);
+ expect(thirdValues, [3]);
+ });
+ }
+ });
+ }
+ }
+}
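A sketch of `followedBy`, which emits all events from the first stream and then continues with the next stream once the first one closes. Illustrative only; the stream names are invented.
```dart
import 'package:stream_transform/stream_transform.dart';

Future<void> main() async {
  final cached = Stream.fromIterable(['cached result']);
  final fresh = Stream.fromIterable(['fresh result']);

  // All of `cached` first, then everything from `fresh`.
  print(await cached.followedBy(fresh).toList());
  // [cached result, fresh result]
}
```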
diff --git a/pkgs/stream_transform/test/from_handlers_test.dart b/pkgs/stream_transform/test/from_handlers_test.dart
new file mode 100644
index 0000000..694199c
--- /dev/null
+++ b/pkgs/stream_transform/test/from_handlers_test.dart
@@ -0,0 +1,183 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'package:stream_transform/src/from_handlers.dart';
+import 'package:test/test.dart';
+
+void main() {
+ late StreamController<int> values;
+ late List<int> emittedValues;
+ late bool valuesCanceled;
+ late bool isDone;
+ late List<String> errors;
+ late Stream<int> transformed;
+ late StreamSubscription<int> subscription;
+
+ void setUpForController(StreamController<int> controller,
+ Stream<int> Function(Stream<int>) transform) {
+ valuesCanceled = false;
+ values = controller
+ ..onCancel = () {
+ valuesCanceled = true;
+ };
+ emittedValues = [];
+ errors = [];
+ isDone = false;
+ transformed = transform(values.stream);
+ subscription =
+ transformed.listen(emittedValues.add, onError: errors.add, onDone: () {
+ isDone = true;
+ });
+ }
+
+ group('default from_handlers', () {
+ group('Single subscription stream', () {
+ setUp(() {
+ setUpForController(StreamController(),
+ (s) => s.transformByHandlers(onData: (e, sink) => sink.add(e)));
+ });
+
+ test('has correct stream type', () {
+ expect(transformed.isBroadcast, false);
+ });
+
+ test('forwards values', () async {
+ values
+ ..add(1)
+ ..add(2);
+ await Future(() {});
+ expect(emittedValues, [1, 2]);
+ });
+
+ test('forwards errors', () async {
+ values.addError('error');
+ await Future(() {});
+ expect(errors, ['error']);
+ });
+
+ test('forwards done', () async {
+ await values.close();
+ expect(isDone, true);
+ });
+
+ test('forwards cancel', () async {
+ await subscription.cancel();
+ expect(valuesCanceled, true);
+ });
+ });
+
+    group('broadcast stream with multiple listeners', () {
+ late List<int> emittedValues2;
+ late List<String> errors2;
+ late bool isDone2;
+ late StreamSubscription<int> subscription2;
+
+ setUp(() {
+ setUpForController(StreamController.broadcast(),
+ (s) => s.transformByHandlers(onData: (e, sink) => sink.add(e)));
+ emittedValues2 = [];
+ errors2 = [];
+ isDone2 = false;
+ subscription2 = transformed
+ .listen(emittedValues2.add, onError: errors2.add, onDone: () {
+ isDone2 = true;
+ });
+ });
+
+ test('has correct stream type', () {
+ expect(transformed.isBroadcast, true);
+ });
+
+ test('forwards values', () async {
+ values
+ ..add(1)
+ ..add(2);
+ await Future(() {});
+ expect(emittedValues, [1, 2]);
+ expect(emittedValues2, [1, 2]);
+ });
+
+ test('forwards errors', () async {
+ values.addError('error');
+ await Future(() {});
+ expect(errors, ['error']);
+ expect(errors2, ['error']);
+ });
+
+ test('forwards done', () async {
+ await values.close();
+ expect(isDone, true);
+ expect(isDone2, true);
+ });
+
+ test('forwards cancel', () async {
+ await subscription.cancel();
+ expect(valuesCanceled, false);
+ await subscription2.cancel();
+ expect(valuesCanceled, true);
+ });
+ });
+ });
+
+ group('custom handlers', () {
+ group('single subscription', () {
+ setUp(() async {
+ setUpForController(
+ StreamController(),
+ (s) => s.transformByHandlers(onData: (value, sink) {
+ sink.add(value + 1);
+ }));
+ });
+ test('uses transform from handleData', () async {
+ values
+ ..add(1)
+ ..add(2);
+ await Future(() {});
+ expect(emittedValues, [2, 3]);
+ });
+ });
+
+ group('broadcast stream with multiple listeners', () {
+ late int dataCallCount;
+ late int doneCallCount;
+ late int errorCallCount;
+
+ setUp(() async {
+ dataCallCount = 0;
+ doneCallCount = 0;
+ errorCallCount = 0;
+ setUpForController(
+ StreamController.broadcast(),
+ (s) => s.transformByHandlers(onData: (value, sink) {
+ dataCallCount++;
+ }, onError: (error, stackTrace, sink) {
+ errorCallCount++;
+ sink.addError(error, stackTrace);
+ }, onDone: (sink) {
+ doneCallCount++;
+ }));
+ transformed.listen((_) {}, onError: (_, __) {});
+ });
+
+ test('handles data once', () async {
+ values.add(1);
+ await Future(() {});
+ expect(dataCallCount, 1);
+ });
+
+ test('handles done once', () async {
+ await values.close();
+ expect(doneCallCount, 1);
+ });
+
+ test('handles errors once', () async {
+ values.addError('error');
+ await Future(() {});
+ expect(errorCallCount, 1);
+ });
+ });
+ });
+}
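
For orientation, the file above exercises `transformByHandlers`, which builds a transformed stream from `onData`/`onError`/`onDone` callbacks that write to an `EventSink`; when `onError`/`onDone` are omitted, errors and done events are forwarded unchanged (the behavior the default groups verify). A minimal usage sketch, assuming the same package-internal import the test uses:

```dart
// Sketch only: mirrors the call shape used in from_handlers_test.dart above.
import 'package:stream_transform/src/from_handlers.dart';

Future<void> main() async {
  // Emit each incoming value twice; errors and done pass through untouched.
  final Stream<int> doubled =
      Stream.fromIterable([1, 2, 3]).transformByHandlers(onData: (value, sink) {
    sink
      ..add(value)
      ..add(value);
  });
  print(await doubled.toList()); // [1, 1, 2, 2, 3, 3]
}
```
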
diff --git a/pkgs/stream_transform/test/merge_test.dart b/pkgs/stream_transform/test/merge_test.dart
new file mode 100644
index 0000000..ecbf97f
--- /dev/null
+++ b/pkgs/stream_transform/test/merge_test.dart
@@ -0,0 +1,140 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'package:stream_transform/stream_transform.dart';
+import 'package:test/test.dart';
+
+void main() {
+ group('merge', () {
+ test('includes all values', () async {
+ var first = Stream.fromIterable([1, 2, 3]);
+ var second = Stream.fromIterable([4, 5, 6]);
+ var allValues = await first.merge(second).toList();
+ expect(allValues, containsAllInOrder([1, 2, 3]));
+ expect(allValues, containsAllInOrder([4, 5, 6]));
+ expect(allValues, hasLength(6));
+ });
+
+ test('cancels both sources', () async {
+ var firstCanceled = false;
+ var first = StreamController<int>()
+ ..onCancel = () {
+ firstCanceled = true;
+ };
+ var secondCanceled = false;
+ var second = StreamController<int>()
+ ..onCancel = () {
+ secondCanceled = true;
+ };
+ var subscription = first.stream.merge(second.stream).listen((_) {});
+ await subscription.cancel();
+ expect(firstCanceled, true);
+ expect(secondCanceled, true);
+ });
+
+ test('completes when both sources complete', () async {
+ var first = StreamController<int>();
+ var second = StreamController<int>();
+ var isDone = false;
+ first.stream.merge(second.stream).listen((_) {}, onDone: () {
+ isDone = true;
+ });
+ await first.close();
+ expect(isDone, false);
+ await second.close();
+ expect(isDone, true);
+ });
+
+ test('can cancel and relisten to broadcast stream', () async {
+ var first = StreamController<int>.broadcast();
+ var second = StreamController<int>();
+ var emittedValues = <int>[];
+ var transformed = first.stream.merge(second.stream);
+ var subscription = transformed.listen(emittedValues.add);
+ first.add(1);
+ second.add(2);
+ await Future(() {});
+ expect(emittedValues, contains(1));
+ expect(emittedValues, contains(2));
+ await subscription.cancel();
+ emittedValues = [];
+ subscription = transformed.listen(emittedValues.add);
+ first.add(3);
+ second.add(4);
+ await Future(() {});
+ expect(emittedValues, contains(3));
+ expect(emittedValues, contains(4));
+ });
+ });
+
+ group('mergeAll', () {
+ test('includes all values', () async {
+ var first = Stream.fromIterable([1, 2, 3]);
+ var second = Stream.fromIterable([4, 5, 6]);
+ var third = Stream.fromIterable([7, 8, 9]);
+ var allValues = await first.mergeAll([second, third]).toList();
+ expect(allValues, containsAllInOrder([1, 2, 3]));
+ expect(allValues, containsAllInOrder([4, 5, 6]));
+ expect(allValues, containsAllInOrder([7, 8, 9]));
+ expect(allValues, hasLength(9));
+ });
+
+ test('handles mix of broadcast and single-subscription', () async {
+ var firstCanceled = false;
+ var first = StreamController<int>.broadcast()
+ ..onCancel = () {
+ firstCanceled = true;
+ };
+ var secondBroadcastCanceled = false;
+ var secondBroadcast = StreamController<int>.broadcast()
+ ..onCancel = () {
+ secondBroadcastCanceled = true;
+ };
+ var secondSingleCanceled = false;
+ var secondSingle = StreamController<int>()
+ ..onCancel = () {
+ secondSingleCanceled = true;
+ };
+
+ var merged =
+ first.stream.mergeAll([secondBroadcast.stream, secondSingle.stream]);
+
+ var firstListenerValues = <int>[];
+ var secondListenerValues = <int>[];
+
+ var firstSubscription = merged.listen(firstListenerValues.add);
+ var secondSubscription = merged.listen(secondListenerValues.add);
+
+ first.add(1);
+ secondBroadcast.add(2);
+ secondSingle.add(3);
+
+ await Future(() {});
+ await firstSubscription.cancel();
+
+ expect(firstCanceled, false);
+ expect(secondBroadcastCanceled, false);
+ expect(secondSingleCanceled, false);
+
+ first.add(4);
+ secondBroadcast.add(5);
+ secondSingle.add(6);
+
+ await Future(() {});
+ await secondSubscription.cancel();
+
+ await Future(() {});
+ expect(firstCanceled, true);
+ expect(secondBroadcastCanceled, true);
+ expect(secondSingleCanceled, false,
+ reason: 'Single subscription streams merged into broadcast streams '
+ 'are not canceled');
+
+ expect(firstListenerValues, [1, 2, 3]);
+ expect(secondListenerValues, [1, 2, 3, 4, 5, 6]);
+ });
+ });
+}
diff --git a/pkgs/stream_transform/test/sample_test.dart b/pkgs/stream_transform/test/sample_test.dart
new file mode 100644
index 0000000..66ca09d
--- /dev/null
+++ b/pkgs/stream_transform/test/sample_test.dart
@@ -0,0 +1,291 @@
+// Copyright (c) 2022, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'package:stream_transform/stream_transform.dart';
+import 'package:test/test.dart';
+
+import 'utils.dart';
+
+void main() {
+ late StreamController<void> trigger;
+ late StreamController<int> values;
+ late List<int> emittedValues;
+ late bool valuesCanceled;
+ late bool triggerCanceled;
+ late bool triggerPaused;
+ late bool isDone;
+ late List<String> errors;
+ late Stream<int> transformed;
+ late StreamSubscription<int> subscription;
+
+ void setUpForStreamTypes(String triggerType, String valuesType,
+ {required bool longPoll}) {
+ valuesCanceled = false;
+ triggerCanceled = false;
+ triggerPaused = false;
+ trigger = createController(triggerType)
+ ..onCancel = () {
+ triggerCanceled = true;
+ };
+ if (triggerType == 'single subscription') {
+ trigger.onPause = () {
+ triggerPaused = true;
+ };
+ }
+ values = createController(valuesType)
+ ..onCancel = () {
+ valuesCanceled = true;
+ };
+ emittedValues = [];
+ errors = [];
+ isDone = false;
+ transformed = values.stream.sample(trigger.stream, longPoll: longPoll);
+ subscription =
+ transformed.listen(emittedValues.add, onError: errors.add, onDone: () {
+ isDone = true;
+ });
+ }
+
+ for (var triggerType in streamTypes) {
+ for (var valuesType in streamTypes) {
+ group('Trigger type: [$triggerType], Values type: [$valuesType]', () {
+ group('general behavior', () {
+ setUp(() {
+ setUpForStreamTypes(triggerType, valuesType, longPoll: true);
+ });
+
+ test('does not emit before `trigger`', () async {
+ values.add(1);
+ await Future(() {});
+ expect(emittedValues, isEmpty);
+ trigger.add(null);
+ await Future(() {});
+ expect(emittedValues, [1]);
+ });
+
+ test('keeps most recent event between triggers', () async {
+ values
+ ..add(1)
+ ..add(2);
+ await Future(() {});
+ trigger.add(null);
+ values
+ ..add(3)
+ ..add(4);
+ await Future(() {});
+ trigger.add(null);
+ await Future(() {});
+ expect(emittedValues, [2, 4]);
+ });
+
+ test('cancels value subscription when output canceled', () async {
+ expect(valuesCanceled, false);
+ await subscription.cancel();
+ expect(valuesCanceled, true);
+ });
+
+ test('closes when trigger ends', () async {
+ expect(isDone, false);
+ await trigger.close();
+ await Future(() {});
+ expect(isDone, true);
+ });
+
+ test('closes after outputting final values when source closes',
+ () async {
+ expect(isDone, false);
+ values.add(1);
+ await values.close();
+ expect(isDone, false);
+ trigger.add(null);
+ await Future(() {});
+ expect(emittedValues, [1]);
+ expect(isDone, true);
+ });
+
+ test('closes when source closes and there is no pending', () async {
+ expect(isDone, false);
+ await values.close();
+ await Future(() {});
+ expect(isDone, true);
+ });
+
+ test('forwards errors from trigger', () async {
+ trigger.addError('error');
+ await Future(() {});
+ expect(errors, ['error']);
+ });
+
+ test('forwards errors from values', () async {
+ values.addError('error');
+ await Future(() {});
+ expect(errors, ['error']);
+ });
+ });
+
+ group('long polling', () {
+ setUp(() {
+ setUpForStreamTypes(triggerType, valuesType, longPoll: true);
+ });
+
+ test('emits immediately if trigger emits before a value', () async {
+ trigger.add(null);
+ await Future(() {});
+ expect(emittedValues, isEmpty);
+ values.add(1);
+ await Future(() {});
+ expect(emittedValues, [1]);
+ });
+
+        test('two triggers in a row - emit buffered value then next value',
+ () async {
+ values
+ ..add(1)
+ ..add(2);
+ await Future(() {});
+ trigger
+ ..add(null)
+ ..add(null);
+ await Future(() {});
+ values.add(3);
+ await Future(() {});
+ expect(emittedValues, [2, 3]);
+ });
+
+ test('pre-emptive trigger then trigger after values', () async {
+ trigger.add(null);
+ await Future(() {});
+ values
+ ..add(1)
+ ..add(2);
+ await Future(() {});
+ trigger.add(null);
+ await Future(() {});
+ expect(emittedValues, [1, 2]);
+ });
+
+ test('multiple pre-emptive triggers, only emits first value',
+ () async {
+ trigger
+ ..add(null)
+ ..add(null);
+ await Future(() {});
+ values
+ ..add(1)
+ ..add(2);
+ await Future(() {});
+ expect(emittedValues, [1]);
+ });
+
+ test('closes if there is no waiting long poll when source closes',
+ () async {
+ expect(isDone, false);
+ values.add(1);
+ trigger.add(null);
+ await values.close();
+ await Future(() {});
+ expect(isDone, true);
+ });
+
+        test(
+            'waits to emit if there is a waiting long poll when trigger '
+            'closes', () async {
+ trigger.add(null);
+ await trigger.close();
+ expect(isDone, false);
+ values.add(1);
+ await Future(() {});
+ expect(emittedValues, [1]);
+ expect(isDone, true);
+ });
+ });
+
+ group('immediate polling', () {
+ setUp(() {
+ setUpForStreamTypes(triggerType, valuesType, longPoll: false);
+ });
+
+ test('ignores trigger before values', () async {
+ trigger.add(null);
+ await Future(() {});
+ values
+ ..add(1)
+ ..add(2);
+ await Future(() {});
+ trigger.add(null);
+ await Future(() {});
+ expect(emittedValues, [2]);
+ });
+
+ test('ignores trigger if no pending values', () async {
+ values
+ ..add(1)
+ ..add(2);
+ await Future(() {});
+ trigger
+ ..add(null)
+ ..add(null);
+ await Future(() {});
+ values
+ ..add(3)
+ ..add(4);
+ await Future(() {});
+ trigger.add(null);
+ await Future(() {});
+ expect(emittedValues, [2, 4]);
+ });
+ });
+ });
+ }
+ }
+
+  test('always cancels trigger if values is single subscription', () async {
+ setUpForStreamTypes('broadcast', 'single subscription', longPoll: true);
+ expect(triggerCanceled, false);
+ await subscription.cancel();
+ expect(triggerCanceled, true);
+
+ setUpForStreamTypes('single subscription', 'single subscription',
+ longPoll: true);
+ expect(triggerCanceled, false);
+ await subscription.cancel();
+ expect(triggerCanceled, true);
+ });
+
+ test('cancels trigger if trigger is broadcast', () async {
+ setUpForStreamTypes('broadcast', 'broadcast', longPoll: true);
+ expect(triggerCanceled, false);
+ await subscription.cancel();
+ expect(triggerCanceled, true);
+ });
+
+ test('pauses single subscription trigger for broadcast values', () async {
+ setUpForStreamTypes('single subscription', 'broadcast', longPoll: true);
+ expect(triggerCanceled, false);
+ expect(triggerPaused, false);
+ await subscription.cancel();
+ expect(triggerCanceled, false);
+ expect(triggerPaused, true);
+ });
+
+ for (var triggerType in streamTypes) {
+ test('cancel and relisten with [$triggerType] trigger', () async {
+ setUpForStreamTypes(triggerType, 'broadcast', longPoll: true);
+ values.add(1);
+ trigger.add(null);
+ await Future(() {});
+ expect(emittedValues, [1]);
+ await subscription.cancel();
+ values.add(2);
+ trigger.add(null);
+ await Future(() {});
+ subscription = transformed.listen(emittedValues.add);
+ values.add(3);
+ trigger.add(null);
+ await Future(() {});
+ expect(emittedValues, [1, 3]);
+ });
+ }
+}
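
As the matrix of tests above shows, `sample` emits the most recent source value each time the trigger stream fires, and with `longPoll: true` a trigger that arrives early waits for the next value instead of being ignored. A rough usage sketch, with assumed `Stream.periodic` stand-ins for the controllers the tests use:

```dart
// Sketch only: report the latest reading once per second.
import 'package:stream_transform/stream_transform.dart';

void main() {
  final readings = Stream<int>.periodic(
      const Duration(milliseconds: 100), (count) => count);
  final ticks = Stream<void>.periodic(const Duration(seconds: 1), (_) {});

  readings
      .sample(ticks, longPoll: true)
      .take(3)
      .listen(print); // latest reading per tick, e.g. 9, 19, 29 (timing-dependent)
}
```
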
diff --git a/pkgs/stream_transform/test/scan_test.dart b/pkgs/stream_transform/test/scan_test.dart
new file mode 100644
index 0000000..3c749e7
--- /dev/null
+++ b/pkgs/stream_transform/test/scan_test.dart
@@ -0,0 +1,109 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'package:stream_transform/stream_transform.dart';
+import 'package:test/test.dart';
+
+void main() {
+ group('Scan', () {
+ test('produces intermediate values', () async {
+ var source = Stream.fromIterable([1, 2, 3, 4]);
+ int sum(int x, int y) => x + y;
+ var result = await source.scan(0, sum).toList();
+
+ expect(result, [1, 3, 6, 10]);
+ });
+
+ test('can create a broadcast stream', () {
+ var source = StreamController<int>.broadcast();
+
+ var transformed = source.stream.scan(null, (_, __) {});
+
+ expect(transformed.isBroadcast, true);
+ });
+
+ test('forwards errors from source', () async {
+ var source = StreamController<int>();
+
+ int sum(int x, int y) => x + y;
+
+ var errors = <Object>[];
+
+ source.stream.scan(0, sum).listen(null, onError: errors.add);
+
+ source.addError(StateError('fail'));
+ await Future(() {});
+
+ expect(errors, [isStateError]);
+ });
+
+ group('with async combine', () {
+ test('returns a Stream of non-futures', () async {
+ var source = Stream.fromIterable([1, 2, 3, 4]);
+ Future<int> sum(int x, int y) async => x + y;
+ var result = await source.scan(0, sum).toList();
+
+ expect(result, [1, 3, 6, 10]);
+ });
+
+ test('can return a Stream of futures when specified', () async {
+ var source = Stream.fromIterable([1, 2]);
+ Future<int> sum(Future<int> x, int y) async => (await x) + y;
+ var result =
+ await source.scan<Future<int>>(Future.value(0), sum).toList();
+
+ expect(result, [
+ const TypeMatcher<Future<void>>(),
+ const TypeMatcher<Future<void>>()
+ ]);
+ expect(await result.wait, [1, 3]);
+ });
+
+ test('does not call for subsequent values while waiting', () async {
+ var source = StreamController<int>();
+
+ var calledWith = <int>[];
+ var block = Completer<void>();
+ Future<int> combine(int x, int y) async {
+ calledWith.add(y);
+ await block.future;
+ return x + y;
+ }
+
+ var results = <int>[];
+
+ unawaited(source.stream.scan(0, combine).forEach(results.add));
+
+ source
+ ..add(1)
+ ..add(2);
+ await Future(() {});
+ expect(calledWith, [1]);
+ expect(results, isEmpty);
+
+ block.complete();
+ await Future(() {});
+ expect(calledWith, [1, 2]);
+ expect(results, [1, 3]);
+ });
+
+ test('forwards async errors', () async {
+ var source = StreamController<int>();
+
+ Future<int> combine(int x, int y) async => throw StateError('fail');
+
+ var errors = <Object>[];
+
+ source.stream.scan(0, combine).listen(null, onError: errors.add);
+
+ source.add(1);
+ await Future(() {});
+
+ expect(errors, [isStateError]);
+ });
+ });
+ });
+}
diff --git a/pkgs/stream_transform/test/start_with_test.dart b/pkgs/stream_transform/test/start_with_test.dart
new file mode 100644
index 0000000..35f0330
--- /dev/null
+++ b/pkgs/stream_transform/test/start_with_test.dart
@@ -0,0 +1,167 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'package:stream_transform/stream_transform.dart';
+import 'package:test/test.dart';
+
+import 'utils.dart';
+
+void main() {
+ late StreamController<int> values;
+ late Stream<int> transformed;
+ late StreamSubscription<int> subscription;
+
+ late List<int> emittedValues;
+ late bool isDone;
+
+ void setupForStreamType(
+ String streamType, Stream<int> Function(Stream<int>) transform) {
+ emittedValues = [];
+ isDone = false;
+ values = createController(streamType);
+ transformed = transform(values.stream);
+ subscription =
+ transformed.listen(emittedValues.add, onDone: () => isDone = true);
+ }
+
+ for (var streamType in streamTypes) {
+ group('startWith then [$streamType]', () {
+ setUp(() => setupForStreamType(streamType, (s) => s.startWith(1)));
+
+ test('outputs all values', () async {
+ values
+ ..add(2)
+ ..add(3);
+ await Future(() {});
+ expect(emittedValues, [1, 2, 3]);
+ });
+
+ test('outputs initial when followed by empty stream', () async {
+ await values.close();
+ expect(emittedValues, [1]);
+ });
+
+ test('closes with values', () async {
+ expect(isDone, false);
+ await values.close();
+ expect(isDone, true);
+ });
+
+ if (streamType == 'broadcast') {
+ test('can cancel and relisten', () async {
+ values.add(2);
+ await Future(() {});
+ await subscription.cancel();
+ subscription = transformed.listen(emittedValues.add);
+ values.add(3);
+ await Future(() {});
+ await Future(() {});
+ expect(emittedValues, [1, 2, 3]);
+ });
+ }
+ });
+
+ group('startWithMany then [$streamType]', () {
+ setUp(() async {
+ setupForStreamType(streamType, (s) => s.startWithMany([1, 2]));
+ // Ensure all initial values go through
+ await Future(() {});
+ });
+
+ test('outputs all values', () async {
+ values
+ ..add(3)
+ ..add(4);
+ await Future(() {});
+ expect(emittedValues, [1, 2, 3, 4]);
+ });
+
+ test('outputs initial when followed by empty stream', () async {
+ await values.close();
+ expect(emittedValues, [1, 2]);
+ });
+
+ test('closes with values', () async {
+ expect(isDone, false);
+ await values.close();
+ expect(isDone, true);
+ });
+
+ if (streamType == 'broadcast') {
+ test('can cancel and relisten', () async {
+ values.add(3);
+ await Future(() {});
+ await subscription.cancel();
+ subscription = transformed.listen(emittedValues.add);
+ values.add(4);
+ await Future(() {});
+ expect(emittedValues, [1, 2, 3, 4]);
+ });
+ }
+ });
+
+ for (var startingStreamType in streamTypes) {
+ group('startWithStream [$startingStreamType] then [$streamType]', () {
+ late StreamController<int> starting;
+ setUp(() async {
+ starting = createController(startingStreamType);
+ setupForStreamType(
+ streamType, (s) => s.startWithStream(starting.stream));
+ });
+
+ test('outputs all values', () async {
+ starting
+ ..add(1)
+ ..add(2);
+ await starting.close();
+ values
+ ..add(3)
+ ..add(4);
+ await Future(() {});
+ expect(emittedValues, [1, 2, 3, 4]);
+ });
+
+ test('closes with values', () async {
+ expect(isDone, false);
+ await starting.close();
+ expect(isDone, false);
+ await values.close();
+ expect(isDone, true);
+ });
+
+ if (streamType == 'broadcast') {
+ test('can cancel and relisten during starting', () async {
+ starting.add(1);
+ await Future(() {});
+ await subscription.cancel();
+ subscription = transformed.listen(emittedValues.add);
+ starting.add(2);
+ await starting.close();
+ values
+ ..add(3)
+ ..add(4);
+ await Future(() {});
+ expect(emittedValues, [1, 2, 3, 4]);
+ });
+
+ test('can cancel and relisten during values', () async {
+ starting
+ ..add(1)
+ ..add(2);
+ await starting.close();
+ values.add(3);
+ await Future(() {});
+ await subscription.cancel();
+ subscription = transformed.listen(emittedValues.add);
+ values.add(4);
+ await Future(() {});
+ expect(emittedValues, [1, 2, 3, 4]);
+ });
+ }
+ });
+ }
+ }
+}
diff --git a/pkgs/stream_transform/test/switch_test.dart b/pkgs/stream_transform/test/switch_test.dart
new file mode 100644
index 0000000..9e70c08
--- /dev/null
+++ b/pkgs/stream_transform/test/switch_test.dart
@@ -0,0 +1,229 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'package:stream_transform/stream_transform.dart';
+import 'package:test/test.dart';
+
+import 'utils.dart';
+
+void main() {
+ for (var outerType in streamTypes) {
+ for (var innerType in streamTypes) {
+ group('Outer type: [$outerType], Inner type: [$innerType]', () {
+ late StreamController<int> first;
+ late StreamController<int> second;
+ late StreamController<int> third;
+ late StreamController<Stream<int>> outer;
+
+ late List<int> emittedValues;
+ late bool firstCanceled;
+ late bool outerCanceled;
+ late bool isDone;
+ late List<String> errors;
+ late StreamSubscription<int> subscription;
+
+ setUp(() async {
+ firstCanceled = false;
+ outerCanceled = false;
+ outer = createController(outerType)
+ ..onCancel = () {
+ outerCanceled = true;
+ };
+ first = createController(innerType)
+ ..onCancel = () {
+ firstCanceled = true;
+ };
+ second = createController(innerType);
+ third = createController(innerType);
+ emittedValues = [];
+ errors = [];
+ isDone = false;
+ subscription = outer.stream
+ .switchLatest()
+ .listen(emittedValues.add, onError: errors.add, onDone: () {
+ isDone = true;
+ });
+ });
+
+ test('forwards events', () async {
+ outer.add(first.stream);
+ await Future(() {});
+ first
+ ..add(1)
+ ..add(2);
+ await Future(() {});
+
+ outer.add(second.stream);
+ await Future(() {});
+ second
+ ..add(3)
+ ..add(4);
+ await Future(() {});
+
+ expect(emittedValues, [1, 2, 3, 4]);
+ });
+
+ test('forwards errors from outer Stream', () async {
+ outer.addError('error');
+ await Future(() {});
+ expect(errors, ['error']);
+ });
+
+ test('forwards errors from inner Stream', () async {
+ outer.add(first.stream);
+ await Future(() {});
+ first.addError('error');
+ await Future(() {});
+ expect(errors, ['error']);
+ });
+
+ test('closes when final stream is done', () async {
+ outer.add(first.stream);
+ await Future(() {});
+
+ outer.add(second.stream);
+ await Future(() {});
+
+ await outer.close();
+ expect(isDone, false);
+
+ await second.close();
+ expect(isDone, true);
+ });
+
+ test(
+ 'closes when outer stream closes if latest inner stream already '
+ 'closed', () async {
+ outer.add(first.stream);
+ await Future(() {});
+ await first.close();
+ expect(isDone, false);
+
+ await outer.close();
+ expect(isDone, true);
+ });
+
+ test('cancels listeners on previous streams', () async {
+ outer.add(first.stream);
+ await Future(() {});
+
+ outer.add(second.stream);
+ await Future(() {});
+ expect(firstCanceled, true);
+ });
+
+ if (innerType != 'broadcast') {
+ test('waits for cancel before listening to subsequent stream',
+ () async {
+ var cancelWork = Completer<void>();
+ first.onCancel = () => cancelWork.future;
+ outer.add(first.stream);
+ await Future(() {});
+
+ var cancelDone = false;
+ second.onListen = expectAsync0(() {
+ expect(cancelDone, true);
+ });
+ outer.add(second.stream);
+ await Future(() {});
+ cancelWork.complete();
+ cancelDone = true;
+ });
+
+ test('all streams are listened to, even while cancelling', () async {
+ var cancelWork = Completer<void>();
+ first.onCancel = () => cancelWork.future;
+ outer.add(first.stream);
+ await Future(() {});
+
+ var cancelDone = false;
+ second.onListen = expectAsync0(() {
+ expect(cancelDone, true);
+ });
+ third.onListen = expectAsync0(() {
+ expect(cancelDone, true);
+ });
+ outer
+ ..add(second.stream)
+ ..add(third.stream);
+ await Future(() {});
+ cancelWork.complete();
+ cancelDone = true;
+ });
+ }
+
+ if (outerType != 'broadcast' && innerType != 'broadcast') {
+ test('pausing while cancelling an inner stream is respected',
+ () async {
+ var cancelWork = Completer<void>();
+ first.onCancel = () => cancelWork.future;
+ outer.add(first.stream);
+ await Future(() {});
+
+ var cancelDone = false;
+ second.onListen = expectAsync0(() {
+ expect(cancelDone, true);
+ });
+ outer.add(second.stream);
+ await Future(() {});
+ subscription.pause();
+ cancelWork.complete();
+ cancelDone = true;
+ await Future(() {});
+ expect(second.isPaused, true);
+ subscription.resume();
+ });
+ }
+
+ test('cancels listener on current and outer stream on cancel',
+ () async {
+ outer.add(first.stream);
+ await Future(() {});
+ await subscription.cancel();
+
+ await Future(() {});
+ expect(outerCanceled, true);
+ expect(firstCanceled, true);
+ });
+ });
+ }
+ }
+
+ group('switchMap', () {
+ test('uses map function', () async {
+ var outer = StreamController<List<int>>();
+
+ var values = <int>[];
+ outer.stream.switchMap(Stream.fromIterable).listen(values.add);
+
+ outer.add([1, 2, 3]);
+ await Future(() {});
+ outer.add([4, 5, 6]);
+ await Future(() {});
+ expect(values, [1, 2, 3, 4, 5, 6]);
+ });
+
+ test('can create a broadcast stream', () async {
+ var outer = StreamController<int>.broadcast();
+
+ var transformed =
+ outer.stream.switchMap((_) => const Stream<int>.empty());
+
+ expect(transformed.isBroadcast, true);
+ });
+
+ test('forwards errors from the convert callback', () async {
+ var errors = <String>[];
+ var source = Stream.fromIterable([1, 2, 3]);
+ source.switchMap<int>((i) {
+ // ignore: only_throw_errors
+ throw 'Error: $i';
+ }).listen((_) {}, onError: errors.add);
+ await Future<void>(() {});
+ expect(errors, ['Error: 1', 'Error: 2', 'Error: 3']);
+ });
+ });
+}
diff --git a/pkgs/stream_transform/test/take_until_test.dart b/pkgs/stream_transform/test/take_until_test.dart
new file mode 100644
index 0000000..982b3da
--- /dev/null
+++ b/pkgs/stream_transform/test/take_until_test.dart
@@ -0,0 +1,135 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'package:stream_transform/stream_transform.dart';
+import 'package:test/test.dart';
+
+import 'utils.dart';
+
+void main() {
+ for (var streamType in streamTypes) {
+ group('takeUntil on Stream type [$streamType]', () {
+ late StreamController<int> values;
+ late List<int> emittedValues;
+ late bool valuesCanceled;
+ late bool isDone;
+ late List<String> errors;
+ late Stream<int> transformed;
+ late StreamSubscription<int> subscription;
+ late Completer<void> closeTrigger;
+
+ setUp(() {
+ valuesCanceled = false;
+ values = createController(streamType)
+ ..onCancel = () {
+ valuesCanceled = true;
+ };
+ emittedValues = [];
+ errors = [];
+ isDone = false;
+ closeTrigger = Completer();
+ transformed = values.stream.takeUntil(closeTrigger.future);
+ subscription = transformed
+ .listen(emittedValues.add, onError: errors.add, onDone: () {
+ isDone = true;
+ });
+ });
+
+ test('forwards cancellation', () async {
+ await subscription.cancel();
+ expect(valuesCanceled, true);
+ });
+
+ test('lets values through before trigger', () async {
+ values
+ ..add(1)
+ ..add(2);
+ await Future(() {});
+ expect(emittedValues, [1, 2]);
+ });
+
+ test('forwards errors', () async {
+ values.addError('error');
+ await Future(() {});
+ expect(errors, ['error']);
+ });
+
+      test('sends done if original stream ends', () async {
+ await values.close();
+ expect(isDone, true);
+ });
+
+ test('sends done when trigger fires', () async {
+ closeTrigger.complete();
+ await Future(() {});
+ expect(isDone, true);
+ });
+
+ test('forwards errors from the close trigger', () async {
+ closeTrigger.completeError('sad');
+ await Future(() {});
+ expect(errors, ['sad']);
+ expect(isDone, true);
+ });
+
+ test('ignores errors from the close trigger after stream closed',
+ () async {
+ await values.close();
+ closeTrigger.completeError('sad');
+ await Future(() {});
+ expect(errors, <Object>[]);
+ });
+
+ test('cancels value subscription when trigger fires', () async {
+ closeTrigger.complete();
+ await Future(() {});
+ expect(valuesCanceled, true);
+ });
+
+ if (streamType == 'broadcast') {
+ test('multiple listeners all get values', () async {
+ var otherValues = <Object>[];
+ transformed.listen(otherValues.add);
+ values
+ ..add(1)
+ ..add(2);
+ await Future(() {});
+ expect(emittedValues, [1, 2]);
+ expect(otherValues, [1, 2]);
+ });
+
+ test('multiple listeners get done when trigger fires', () async {
+ var otherDone = false;
+ transformed.listen(null, onDone: () => otherDone = true);
+ closeTrigger.complete();
+ await Future(() {});
+ expect(otherDone, true);
+ expect(isDone, true);
+ });
+
+ test('multiple listeners get done when values end', () async {
+ var otherDone = false;
+ transformed.listen(null, onDone: () => otherDone = true);
+ await values.close();
+ expect(otherDone, true);
+ expect(isDone, true);
+ });
+
+ test('can cancel and relisten before trigger fires', () async {
+ values.add(1);
+ await Future(() {});
+ await subscription.cancel();
+ values.add(2);
+ await Future(() {});
+ subscription = transformed.listen(emittedValues.add);
+ values.add(3);
+ await Future(() {});
+ expect(emittedValues, [1, 3]);
+ });
+ }
+ });
+ }
+}
diff --git a/pkgs/stream_transform/test/tap_test.dart b/pkgs/stream_transform/test/tap_test.dart
new file mode 100644
index 0000000..f2b4346
--- /dev/null
+++ b/pkgs/stream_transform/test/tap_test.dart
@@ -0,0 +1,116 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'package:stream_transform/stream_transform.dart';
+import 'package:test/test.dart';
+
+void main() {
+ test('calls function for values', () async {
+ var valuesSeen = <int>[];
+ var stream = Stream.fromIterable([1, 2, 3]);
+ await stream.tap(valuesSeen.add).last;
+ expect(valuesSeen, [1, 2, 3]);
+ });
+
+ test('forwards values', () async {
+ var stream = Stream.fromIterable([1, 2, 3]);
+ var values = await stream.tap((_) {}).toList();
+ expect(values, [1, 2, 3]);
+ });
+
+ test('calls function for errors', () async {
+ dynamic error;
+ var source = StreamController<int>();
+ source.stream.tap((_) {}, onError: (e, st) {
+ error = e;
+ }).listen((_) {}, onError: (_) {});
+ source.addError('error');
+ await Future(() {});
+ expect(error, 'error');
+ });
+
+ test('forwards errors', () async {
+ dynamic error;
+ var source = StreamController<int>();
+ source.stream.tap((_) {}, onError: (e, st) {}).listen((_) {},
+ onError: (Object e) {
+ error = e;
+ });
+ source.addError('error');
+ await Future(() {});
+ expect(error, 'error');
+ });
+
+ test('calls function on done', () async {
+ var doneCalled = false;
+ var source = StreamController<int>();
+ source.stream.tap((_) {}, onDone: () {
+ doneCalled = true;
+ }).listen((_) {});
+ await source.close();
+ expect(doneCalled, true);
+ });
+
+ test('forwards only once with multiple listeners on a broadcast stream',
+ () async {
+ var dataCallCount = 0;
+ var source = StreamController<int>.broadcast();
+ source.stream.tap((_) {
+ dataCallCount++;
+ })
+ ..listen((_) {})
+ ..listen((_) {});
+ source.add(1);
+ await Future(() {});
+ expect(dataCallCount, 1);
+ });
+
+ test(
+ 'forwards errors only once with multiple listeners on a broadcast stream',
+ () async {
+ var errorCallCount = 0;
+ var source = StreamController<int>.broadcast();
+ source.stream.tap((_) {}, onError: (_, __) {
+ errorCallCount++;
+ })
+ ..listen((_) {}, onError: (_, __) {})
+ ..listen((_) {}, onError: (_, __) {});
+ source.addError('error');
+ await Future(() {});
+ expect(errorCallCount, 1);
+ });
+
+ test('calls onDone only once with multiple listeners on a broadcast stream',
+ () async {
+ var doneCallCount = 0;
+ var source = StreamController<int>.broadcast();
+ source.stream.tap((_) {}, onDone: () {
+ doneCallCount++;
+ })
+ ..listen((_) {})
+ ..listen((_) {});
+ await source.close();
+ expect(doneCallCount, 1);
+ });
+
+ test('forwards values to multiple listeners', () async {
+ var source = StreamController<int>.broadcast();
+ var emittedValues1 = <int>[];
+ var emittedValues2 = <int>[];
+ source.stream.tap((_) {})
+ ..listen(emittedValues1.add)
+ ..listen(emittedValues2.add);
+ source.add(1);
+ await Future(() {});
+ expect(emittedValues1, [1]);
+ expect(emittedValues2, [1]);
+ });
+
+ test('allows null callback', () async {
+ var stream = Stream.fromIterable([1, 2, 3]);
+ await stream.tap(null).last;
+ });
+}
diff --git a/pkgs/stream_transform/test/throttle_test.dart b/pkgs/stream_transform/test/throttle_test.dart
new file mode 100644
index 0000000..07f607a
--- /dev/null
+++ b/pkgs/stream_transform/test/throttle_test.dart
@@ -0,0 +1,193 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'package:fake_async/fake_async.dart';
+import 'package:stream_transform/stream_transform.dart';
+import 'package:test/test.dart';
+
+import 'utils.dart';
+
+void main() {
+ for (var streamType in streamTypes) {
+ group('Stream type [$streamType]', () {
+ late StreamController<int> values;
+ late List<int> emittedValues;
+ late bool valuesCanceled;
+ late bool isDone;
+ late Stream<int> transformed;
+ late StreamSubscription<int> subscription;
+
+ group('throttle - trailing: false', () {
+ setUp(() async {
+ valuesCanceled = false;
+ values = createController(streamType)
+ ..onCancel = () {
+ valuesCanceled = true;
+ };
+ emittedValues = [];
+ isDone = false;
+ transformed = values.stream.throttle(const Duration(milliseconds: 5));
+ });
+
+ void listen() {
+ subscription = transformed.listen(emittedValues.add, onDone: () {
+ isDone = true;
+ });
+ }
+
+ test('cancels values', () async {
+ listen();
+ await subscription.cancel();
+ expect(valuesCanceled, true);
+ });
+
+ test('swallows values that come faster than duration', () {
+ fakeAsync((async) {
+ listen();
+ values
+ ..add(1)
+ ..add(2)
+ ..close();
+ async.elapse(const Duration(milliseconds: 6));
+ expect(emittedValues, [1]);
+ });
+ });
+
+ test('outputs multiple values spaced further than duration', () {
+ fakeAsync((async) {
+ listen();
+ values.add(1);
+ async.elapse(const Duration(milliseconds: 6));
+ values.add(2);
+ async.elapse(const Duration(milliseconds: 6));
+ expect(emittedValues, [1, 2]);
+ async.elapse(const Duration(milliseconds: 6));
+ });
+ });
+
+ test('closes output immediately', () {
+ fakeAsync((async) {
+ listen();
+ values.add(1);
+ async.elapse(const Duration(milliseconds: 6));
+ values
+ ..add(2)
+ ..close();
+ async.flushMicrotasks();
+ expect(isDone, true);
+ });
+ });
+
+ if (streamType == 'broadcast') {
+ test('multiple listeners all get values', () {
+ fakeAsync((async) {
+ listen();
+ var otherValues = <int>[];
+ transformed.listen(otherValues.add);
+ values.add(1);
+ async.flushMicrotasks();
+ expect(emittedValues, [1]);
+ expect(otherValues, [1]);
+ });
+ });
+ }
+ });
+
+ group('throttle - trailing: true', () {
+ setUp(() async {
+ valuesCanceled = false;
+ values = createController(streamType)
+ ..onCancel = () {
+ valuesCanceled = true;
+ };
+ emittedValues = [];
+ isDone = false;
+ transformed = values.stream
+ .throttle(const Duration(milliseconds: 5), trailing: true);
+ });
+ void listen() {
+ subscription = transformed.listen(emittedValues.add, onDone: () {
+ isDone = true;
+ });
+ }
+
+ test('emits both first and last in a period', () {
+ fakeAsync((async) {
+ listen();
+ values
+ ..add(1)
+ ..add(2)
+ ..close();
+ async.elapse(const Duration(milliseconds: 6));
+ expect(emittedValues, [1, 2]);
+ });
+ });
+
+ test('swallows values that are not the latest in a period', () {
+ fakeAsync((async) {
+ listen();
+ values
+ ..add(1)
+ ..add(2)
+ ..add(3)
+ ..close();
+ async.elapse(const Duration(milliseconds: 6));
+ expect(emittedValues, [1, 3]);
+ });
+ });
+
+ test('waits to output the last value even if the stream closes',
+ () async {
+ fakeAsync((async) {
+ listen();
+ values
+ ..add(1)
+ ..add(2)
+ ..close();
+ async.flushMicrotasks();
+ expect(isDone, false);
+ expect(emittedValues, [1],
+ reason: 'Should not be emitted until after duration');
+ async.elapse(const Duration(milliseconds: 6));
+ expect(emittedValues, [1, 2]);
+ expect(isDone, true);
+ async.elapse(const Duration(milliseconds: 6));
+ });
+ });
+
+ test('closes immediately if there is no pending value', () {
+ fakeAsync((async) {
+ listen();
+ values
+ ..add(1)
+ ..close();
+ async.flushMicrotasks();
+ expect(isDone, true);
+ });
+ });
+
+ if (streamType == 'broadcast') {
+ test('multiple listeners all get values', () {
+ fakeAsync((async) {
+ listen();
+ var otherValues = <int>[];
+ transformed.listen(otherValues.add);
+ values
+ ..add(1)
+ ..add(2);
+ async.flushMicrotasks();
+ expect(emittedValues, [1]);
+ expect(otherValues, [1]);
+ async.elapse(const Duration(milliseconds: 6));
+ expect(emittedValues, [1, 2]);
+ expect(otherValues, [1, 2]);
+ });
+ });
+ }
+ });
+ });
+ }
+}
diff --git a/pkgs/stream_transform/test/utils.dart b/pkgs/stream_transform/test/utils.dart
new file mode 100644
index 0000000..42d9613
--- /dev/null
+++ b/pkgs/stream_transform/test/utils.dart
@@ -0,0 +1,19 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+StreamController<T> createController<T>(String streamType) {
+ switch (streamType) {
+ case 'single subscription':
+ return StreamController<T>();
+ case 'broadcast':
+ return StreamController<T>.broadcast();
+ default:
+ throw ArgumentError.value(
+ streamType, 'streamType', 'Must be one of $streamTypes');
+ }
+}
+
+const streamTypes = ['single subscription', 'broadcast'];
diff --git a/pkgs/stream_transform/test/where_not_null_test.dart b/pkgs/stream_transform/test/where_not_null_test.dart
new file mode 100644
index 0000000..c9af794
--- /dev/null
+++ b/pkgs/stream_transform/test/where_not_null_test.dart
@@ -0,0 +1,56 @@
+// Copyright (c) 2022, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'package:stream_transform/stream_transform.dart';
+import 'package:test/test.dart';
+
+void main() {
+ test('forwards only events that match the type', () async {
+ var values = Stream.fromIterable([null, 'a', null, 'b']);
+ var filtered = values.whereNotNull();
+ expect(await filtered.toList(), ['a', 'b']);
+ });
+
+ test('can result in empty stream', () async {
+ var values = Stream<Object?>.fromIterable([null, null]);
+ var filtered = values.whereNotNull();
+ expect(await filtered.isEmpty, true);
+ });
+
+ test('forwards values to multiple listeners', () async {
+ var values = StreamController<Object?>.broadcast();
+ var filtered = values.stream.whereNotNull();
+ var firstValues = <Object>[];
+ var secondValues = <Object>[];
+ filtered
+ ..listen(firstValues.add)
+ ..listen(secondValues.add);
+ values
+ ..add(null)
+ ..add('a')
+ ..add(null)
+ ..add('b');
+ await Future(() {});
+ expect(firstValues, ['a', 'b']);
+ expect(secondValues, ['a', 'b']);
+ });
+
+ test('closes streams with multiple listeners', () async {
+ var values = StreamController<Object?>.broadcast();
+ var filtered = values.stream.whereNotNull();
+ var firstDone = false;
+ var secondDone = false;
+ filtered
+ ..listen(null, onDone: () => firstDone = true)
+ ..listen(null, onDone: () => secondDone = true);
+ values
+ ..add(null)
+ ..add('a');
+ await values.close();
+ expect(firstDone, true);
+ expect(secondDone, true);
+ });
+}
diff --git a/pkgs/stream_transform/test/where_type_test.dart b/pkgs/stream_transform/test/where_type_test.dart
new file mode 100644
index 0000000..4cbea37
--- /dev/null
+++ b/pkgs/stream_transform/test/where_type_test.dart
@@ -0,0 +1,56 @@
+// Copyright (c) 2018, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'package:stream_transform/stream_transform.dart';
+import 'package:test/test.dart';
+
+void main() {
+ test('forwards only events that match the type', () async {
+ var values = Stream.fromIterable([1, 'a', 2, 'b']);
+ var filtered = values.whereType<String>();
+ expect(await filtered.toList(), ['a', 'b']);
+ });
+
+ test('can result in empty stream', () async {
+ var values = Stream.fromIterable([1, 2, 3, 4]);
+ var filtered = values.whereType<String>();
+ expect(await filtered.isEmpty, true);
+ });
+
+ test('forwards values to multiple listeners', () async {
+ var values = StreamController<Object>.broadcast();
+ var filtered = values.stream.whereType<String>();
+ var firstValues = <Object>[];
+ var secondValues = <Object>[];
+ filtered
+ ..listen(firstValues.add)
+ ..listen(secondValues.add);
+ values
+ ..add(1)
+ ..add('a')
+ ..add(2)
+ ..add('b');
+ await Future(() {});
+ expect(firstValues, ['a', 'b']);
+ expect(secondValues, ['a', 'b']);
+ });
+
+ test('closes streams with multiple listeners', () async {
+ var values = StreamController<Object>.broadcast();
+ var filtered = values.stream.whereType<String>();
+ var firstDone = false;
+ var secondDone = false;
+ filtered
+ ..listen(null, onDone: () => firstDone = true)
+ ..listen(null, onDone: () => secondDone = true);
+ values
+ ..add(1)
+ ..add('a');
+ await values.close();
+ expect(firstDone, true);
+ expect(secondDone, true);
+ });
+}
diff --git a/pkgs/string_scanner/.github/dependabot.yml b/pkgs/string_scanner/.github/dependabot.yml
new file mode 100644
index 0000000..a19a66a
--- /dev/null
+++ b/pkgs/string_scanner/.github/dependabot.yml
@@ -0,0 +1,16 @@
+# Set update schedule for GitHub Actions
+# See https://docs.github.com/en/free-pro-team@latest/github/administering-a-repository/keeping-your-actions-up-to-date-with-dependabot
+
+version: 2
+updates:
+
+- package-ecosystem: github-actions
+ directory: /
+ schedule:
+ interval: monthly
+ labels:
+ - autosubmit
+ groups:
+ github-actions:
+ patterns:
+ - "*"
diff --git a/pkgs/string_scanner/.github/workflows/publish.yaml b/pkgs/string_scanner/.github/workflows/publish.yaml
new file mode 100644
index 0000000..27157a0
--- /dev/null
+++ b/pkgs/string_scanner/.github/workflows/publish.yaml
@@ -0,0 +1,17 @@
+# A CI configuration to auto-publish pub packages.
+
+name: Publish
+
+on:
+ pull_request:
+ branches: [ master ]
+ push:
+ tags: [ 'v[0-9]+.[0-9]+.[0-9]+' ]
+
+jobs:
+ publish:
+ if: ${{ github.repository_owner == 'dart-lang' }}
+ uses: dart-lang/ecosystem/.github/workflows/publish.yaml@main
+ permissions:
+ id-token: write # Required for authentication using OIDC
+ pull-requests: write # Required for writing the pull request note
diff --git a/pkgs/string_scanner/.github/workflows/test-package.yml b/pkgs/string_scanner/.github/workflows/test-package.yml
new file mode 100644
index 0000000..c60f710
--- /dev/null
+++ b/pkgs/string_scanner/.github/workflows/test-package.yml
@@ -0,0 +1,64 @@
+name: Dart CI
+
+on:
+ # Run on PRs and pushes to the default branch.
+ push:
+ branches: [ master ]
+ pull_request:
+ branches: [ master ]
+ schedule:
+ - cron: "0 0 * * 0"
+
+env:
+ PUB_ENVIRONMENT: bot.github
+
+jobs:
+ # Check code formatting and static analysis on a single OS (linux)
+ # against Dart dev.
+ analyze:
+ runs-on: ubuntu-latest
+ strategy:
+ fail-fast: false
+ matrix:
+ sdk: [dev]
+ steps:
+ - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
+ - uses: dart-lang/setup-dart@e630b99d28a3b71860378cafdc2a067c71107f94
+ with:
+ sdk: ${{ matrix.sdk }}
+ - id: install
+ name: Install dependencies
+ run: dart pub get
+ - name: Check formatting
+ run: dart format --output=none --set-exit-if-changed .
+ if: always() && steps.install.outcome == 'success'
+ - name: Analyze code
+ run: dart analyze --fatal-infos
+ if: always() && steps.install.outcome == 'success'
+
+ # Run tests on a matrix consisting of two dimensions:
+ # 1. OS: ubuntu-latest, (macos-latest, windows-latest)
+ # 2. release channel: dev
+ test:
+ needs: analyze
+ runs-on: ${{ matrix.os }}
+ strategy:
+ fail-fast: false
+ matrix:
+ # Add macos-latest and/or windows-latest if relevant for this package.
+ os: [ubuntu-latest]
+ sdk: [3.1, dev]
+ steps:
+ - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
+ - uses: dart-lang/setup-dart@e630b99d28a3b71860378cafdc2a067c71107f94
+ with:
+ sdk: ${{ matrix.sdk }}
+ - id: install
+ name: Install dependencies
+ run: dart pub get
+ - name: Run VM tests
+ run: dart test --platform vm
+ if: always() && steps.install.outcome == 'success'
+ - name: Run Chrome tests
+ run: dart test --platform chrome
+ if: always() && steps.install.outcome == 'success'
diff --git a/pkgs/string_scanner/.gitignore b/pkgs/string_scanner/.gitignore
new file mode 100644
index 0000000..fb97bde
--- /dev/null
+++ b/pkgs/string_scanner/.gitignore
@@ -0,0 +1,5 @@
+# Don’t commit the following directories created by pub.
+.dart_tool/
+.pub/
+.packages
+pubspec.lock
diff --git a/pkgs/string_scanner/CHANGELOG.md b/pkgs/string_scanner/CHANGELOG.md
new file mode 100644
index 0000000..082e9f2
--- /dev/null
+++ b/pkgs/string_scanner/CHANGELOG.md
@@ -0,0 +1,175 @@
+## 1.4.1
+
+* Move to `dart-lang/tools` monorepo.
+
+## 1.4.0
+
+* Fix `LineScanner`'s handling of `\r\n` to prevent errors when scanning
+  zero-length matches between the CR and the LF. A CR is treated as a newline
+  only if it is not immediately followed by an LF.
+* Fix `LineScanner`'s updating of `column` when setting `position` if the
+ current position is not `0`.
+
+## 1.3.0
+
+* Require Dart 3.1.0
+
+* Add a `SpanScanner.spanFromPosition()` method which takes raw code units
+ rather than `SpanScanner.spanFrom()`'s `LineScannerState`s.
+
+## 1.2.0
+
+* Require Dart 2.18.0
+
+* Add better support for reading code points in the Unicode supplementary plane:
+
+ * Added `StringScanner.readCodePoint()`, which consumes an entire Unicode code
+ point even if it's represented by two UTF-16 code units.
+
+ * Added `StringScanner.peekCodePoint()`, which returns an entire Unicode code
+ point even if it's represented by two UTF-16 code units.
+
+ * `StringScanner.scanChar()` and `StringScanner.expectChar()` will now
+ properly consume two UTF-16 code units if they're passed Unicode code points
+ in the supplementary plane.
+
+## 1.1.1
+
+* Populate the pubspec `repository` field.
+* Switch to `package:lints`.
+* Remove a dependency on `package:charcode`.
+
+## 1.1.0
+
+* Stable release for null safety.
+
+## 1.1.0-nullsafety.3
+
+* Update SDK constraints to `>=2.12.0-0 <3.0.0` based on beta release
+ guidelines.
+
+## 1.1.0-nullsafety.2
+
+* Allow prerelease versions of the 2.12 sdk.
+
+## 1.1.0-nullsafety.1
+
+- Allow 2.10 stable and 2.11.0 dev SDK versions.
+
+## 1.1.0-nullsafety
+
+- Migrate to null safety.
+
+## 1.0.5
+
+- Added an example.
+
+- Update Dart SDK constraint to `>=2.0.0 <3.0.0`.
+
+## 1.0.4
+
+* Add @alwaysThrows annotation to error method.
+
+## 1.0.3
+
+* Set max SDK version to `<3.0.0`, and adjust other dependencies.
+
+## 1.0.2
+
+* `SpanScanner` no longer crashes when creating a span that contains a UTF-16
+ surrogate pair.
+
+## 1.0.1
+
+* Fix the error text emitted by `StringScanner.expectChar()`.
+
+## 1.0.0
+
+* **Breaking change**: `StringScanner.error()`'s `length` argument now defaults
+ to `0` rather than `1` when no match data is available.
+
+* **Breaking change**: `StringScanner.lastMatch` and related methods are now
+ reset when the scanner's position changes without producing a new match.
+
+**Note**: While the changes in `1.0.0` are user-visible, they're unlikely to
+actually break any code in practice. Unless you know that your package is
+incompatible with 0.1.x, consider using 0.1.5 as your lower bound rather
+than 1.0.0. For example, `string_scanner: ">=0.1.5 <2.0.0"`.
+
+## 0.1.5
+
+* Add `new SpanScanner.within()`, which scans within an existing `FileSpan`.
+
+* Add `StringScanner.scanChar()` and `StringScanner.expectChar()`.
+
+## 0.1.4+1
+
+* Remove the dependency on `path`, since we don't actually import it.
+
+## 0.1.4
+
+* Add `new SpanScanner.eager()` for creating a `SpanScanner` that eagerly
+ computes its current line and column numbers.
+
+## 0.1.3+2
+
+* Fix `LineScanner`'s handling of carriage returns to match that of
+ `SpanScanner`.
+
+## 0.1.3+1
+
+* Fixed the homepage URL.
+
+## 0.1.3
+
+* Add an optional `endState` argument to `SpanScanner.spanFrom`.
+
+## 0.1.2
+
+* Add `StringScanner.substring`, which returns a substring of the source string.
+
+## 0.1.1
+
+* Declare `SpanScanner`'s exposed `SourceSpan`s and `SourceLocation`s to be
+ `FileSpan`s and `FileLocation`s. They always were underneath, but callers may
+ now rely on it.
+
+* Add `SpanScanner.location`, which returns the scanner's current
+ `SourceLocation`.
+
+## 0.1.0
+
+* Switch from `source_maps`' `Span` class to `source_span`'s `SourceSpan` class.
+
+* `new StringScanner()`'s `sourceUrl` parameter is now named to make it clear
+ that it can be safely `null`.
+
+* `new StringScannerException()` takes different arguments in a different order
+ to match `SpanFormatException`.
+
+* `StringScannerException.string` has been renamed to
+ `StringScannerException.source` to match the `FormatException` interface.
+
+## 0.0.3
+
+* Make `StringScannerException` inherit from source_map's `SpanFormatException`.
+
+## 0.0.2
+
+* `new StringScanner()` now takes an optional `sourceUrl` argument that provides
+ the URL of the source file. This is used for error reporting.
+
+* Add `StringScanner.readChar()` and `StringScanner.peekChar()` methods for
+ doing character-by-character scanning.
+
+* Scanners now throw `StringScannerException`s which provide more detailed
+ access to information about the errors that were thrown and can provide
+ terminal-colored messages.
+
+* Add a `LineScanner` subclass of `StringScanner` that automatically tracks line
+ and column information of the text being scanned.
+
+* Add a `SpanScanner` subclass of `LineScanner` that exposes matched ranges as
+ [source map][] `Span` objects.
+
+[source map]: https://pub.dev/packages/source_maps
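
To make the 1.2.0 and 1.4.0 entries above concrete, here is a small illustrative sketch (not taken from the package docs; the printed line and column values assume the `\r\n` handling described in 1.4.0):

```dart
import 'package:string_scanner/string_scanner.dart';

void main() {
  // readCodePoint() consumes a full code point even when it is encoded as a
  // UTF-16 surrogate pair (two code units).
  final scanner = StringScanner('😀!');
  print(scanner.readCodePoint().toRadixString(16)); // 1f600
  print(String.fromCharCode(scanner.readChar())); // !

  // LineScanner tracks (zero-based) line and column while scanning; a CR
  // immediately followed by an LF counts as a single newline.
  final lines = LineScanner('a\r\nb');
  lines.expect('a\r\n');
  print('${lines.line}:${lines.column}'); // 1:0
  lines.expect('b');
  print('${lines.line}:${lines.column}'); // 1:1
}
```
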
diff --git a/pkgs/string_scanner/LICENSE b/pkgs/string_scanner/LICENSE
new file mode 100644
index 0000000..000cd7b
--- /dev/null
+++ b/pkgs/string_scanner/LICENSE
@@ -0,0 +1,27 @@
+Copyright 2014, the Dart project authors.
+
+Redistribution and use in source and binary forms, with or without
+modification, are permitted provided that the following conditions are
+met:
+
+ * Redistributions of source code must retain the above copyright
+ notice, this list of conditions and the following disclaimer.
+ * Redistributions in binary form must reproduce the above
+ copyright notice, this list of conditions and the following
+ disclaimer in the documentation and/or other materials provided
+ with the distribution.
+ * Neither the name of Google LLC nor the names of its
+ contributors may be used to endorse or promote products derived
+ from this software without specific prior written permission.
+
+THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
diff --git a/pkgs/string_scanner/README.md b/pkgs/string_scanner/README.md
new file mode 100644
index 0000000..e06e325
--- /dev/null
+++ b/pkgs/string_scanner/README.md
@@ -0,0 +1,41 @@
+[](https://github.com/dart-lang/string_scanner/actions/workflows/test-package.yml)
+[](https://pub.dev/packages/string_scanner)
+[](https://pub.dev/packages/string_scanner/publisher)
+
+This package exposes a `StringScanner` type that makes it easy to parse a string
+using a series of `Pattern`s. For example:
+
+```dart
+import 'dart:math' as math;
+
+import 'package:string_scanner/string_scanner.dart';
+
+num parseNumber(String source) {
+ // Scan a number ("1", "1.5", "-3").
+ final scanner = StringScanner(source);
+
+ // [Scanner.scan] tries to consume a [Pattern] and returns whether or not it
+ // succeeded. It will move the scan pointer past the end of the pattern.
+ final negative = scanner.scan('-');
+
+  // [Scanner.expect] consumes a [Pattern] and throws a [FormatException] if it
+ // fails. Like [Scanner.scan], it will move the scan pointer forward.
+ scanner.expect(RegExp(r'\d+'));
+
+  // [Scanner.lastMatch] holds the [Match] for the most recent call to
+ // [Scanner.scan], [Scanner.expect], or [Scanner.matches].
+ var number = num.parse(scanner.lastMatch![0]!);
+
+ if (scanner.scan('.')) {
+ scanner.expect(RegExp(r'\d+'));
+ final decimal = scanner.lastMatch![0]!;
+ number += int.parse(decimal) / math.pow(10, decimal.length);
+ }
+
+  // [Scanner.expectDone] will throw a [FormatException] if there's any input
+ // hasn't yet been consumed.
+ scanner.expectDone();
+
+ return (negative ? -1 : 1) * number;
+}
+```
diff --git a/pkgs/string_scanner/analysis_options.yaml b/pkgs/string_scanner/analysis_options.yaml
new file mode 100644
index 0000000..59f763a
--- /dev/null
+++ b/pkgs/string_scanner/analysis_options.yaml
@@ -0,0 +1,32 @@
+# https://dart.dev/guides/language/analysis-options
+include: package:dart_flutter_team_lints/analysis_options.yaml
+
+analyzer:
+ language:
+ strict-casts: true
+ strict-inference: true
+ strict-raw-types: true
+
+linter:
+ rules:
+ - avoid_bool_literals_in_conditional_expressions
+ - avoid_classes_with_only_static_members
+ - avoid_private_typedef_functions
+ - avoid_redundant_argument_values
+ - avoid_returning_this
+ - avoid_unused_constructor_parameters
+ - avoid_void_async
+ - cancel_subscriptions
+ - join_return_with_assignment
+ - literal_only_boolean_expressions
+ - missing_whitespace_between_adjacent_strings
+ - no_adjacent_strings_in_list
+ - no_runtimeType_toString
+ - prefer_const_declarations
+ - prefer_expression_function_bodies
+ - prefer_final_locals
+ - unnecessary_await_in_return
+ - unnecessary_raw_strings
+ - use_if_null_to_convert_nulls_to_bools
+ - use_raw_strings
+ - use_string_buffers
diff --git a/pkgs/string_scanner/example/example.dart b/pkgs/string_scanner/example/example.dart
new file mode 100644
index 0000000..ec9dd76
--- /dev/null
+++ b/pkgs/string_scanner/example/example.dart
@@ -0,0 +1,40 @@
+// Copyright (c) 2019, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:math' as math;
+
+import 'package:string_scanner/string_scanner.dart';
+
+void main(List<String> args) {
+ print(parseNumber(args.single));
+}
+
+num parseNumber(String source) {
+ // Scan a number ("1", "1.5", "-3").
+ final scanner = StringScanner(source);
+
+  // [StringScanner.scan] tries to consume a [Pattern] and returns whether or
+  // not it succeeded. It will move the scan pointer past the end of the
+  // pattern.
+ final negative = scanner.scan('-');
+
+  // [StringScanner.expect] consumes a [Pattern] and throws a [FormatException]
+  // if it fails. Like [StringScanner.scan], it will move the scan pointer
+  // forward.
+ scanner.expect(RegExp(r'\d+'));
+
+  // [StringScanner.lastMatch] holds the [Match] for the most recent call to
+  // [StringScanner.scan], [StringScanner.expect], or [StringScanner.matches].
+ var number = num.parse(scanner.lastMatch![0]!);
+
+ if (scanner.scan('.')) {
+ scanner.expect(RegExp(r'\d+'));
+ final decimal = scanner.lastMatch![0]!;
+ number += int.parse(decimal) / math.pow(10, decimal.length);
+ }
+
+  // [StringScanner.expectDone] will throw a [FormatException] if there's any
+  // input that hasn't yet been consumed.
+ scanner.expectDone();
+
+ return (negative ? -1 : 1) * number;
+}
diff --git a/pkgs/string_scanner/lib/src/charcode.dart b/pkgs/string_scanner/lib/src/charcode.dart
new file mode 100644
index 0000000..d157749
--- /dev/null
+++ b/pkgs/string_scanner/lib/src/charcode.dart
@@ -0,0 +1,24 @@
+// Copyright (c) 2020, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+/// Character '\'.
+const int $backslash = 0x5C;
+
+/// "Carriage return" control character.
+const int $cr = 0x0D;
+
+/// Character '"'.
+const int $doubleQuote = 0x22;
+
+/// Character 'f'.
+const int $f = 0x66;
+
+/// "Line feed" control character.
+const int $lf = 0x0A;
+
+/// Space character.
+const int $space = 0x20;
+
+/// Character 'x'.
+const int $x = 0x78;
diff --git a/pkgs/string_scanner/lib/src/eager_span_scanner.dart b/pkgs/string_scanner/lib/src/eager_span_scanner.dart
new file mode 100644
index 0000000..1ccc746
--- /dev/null
+++ b/pkgs/string_scanner/lib/src/eager_span_scanner.dart
@@ -0,0 +1,133 @@
+// Copyright (c) 2015, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'charcode.dart';
+import 'line_scanner.dart';
+import 'span_scanner.dart';
+import 'utils.dart';
+
+// TODO(nweiz): Currently this duplicates code in line_scanner.dart. Once
+// sdk#23770 is fully complete, we should move the shared code into a mixin.
+
+/// A regular expression matching newlines across platforms.
+final _newlineRegExp = RegExp(r'\r\n?|\n');
+
+/// A [SpanScanner] that tracks the line and column eagerly, like [LineScanner].
+class EagerSpanScanner extends SpanScanner {
+ @override
+ int get line => _line;
+ int _line = 0;
+
+ @override
+ int get column => _column;
+ int _column = 0;
+
+ @override
+ LineScannerState get state =>
+ _EagerSpanScannerState(this, position, line, column);
+
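+  /// Whether the current position is between a CR character and an LF
+  /// character.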
+ bool get _betweenCRLF => peekChar(-1) == $cr && peekChar() == $lf;
+
+ @override
+ set state(LineScannerState state) {
+ if (state is! _EagerSpanScannerState || !identical(state._scanner, this)) {
+ throw ArgumentError('The given LineScannerState was not returned by '
+ 'this LineScanner.');
+ }
+
+ super.position = state.position;
+ _line = state.line;
+ _column = state.column;
+ }
+
+ @override
+ set position(int newPosition) {
+ final oldPosition = position;
+ super.position = newPosition;
+
+ if (newPosition > oldPosition) {
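+      // Moving forward: count the newlines that were skipped over and, if
+      // any, measure the new column from the last of them.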
+ final newlines = _newlinesIn(string.substring(oldPosition, newPosition));
+ _line += newlines.length;
+ if (newlines.isEmpty) {
+ _column += newPosition - oldPosition;
+ } else {
+        // The matches are relative to the substring, so account for where it
+        // started in the string (as [LineScanner] does).
+        _column = newPosition - (oldPosition + newlines.last.end);
+ }
+ } else {
+ final newlines = _newlinesIn(string.substring(newPosition, oldPosition));
+ if (_betweenCRLF) newlines.removeLast();
+
+ _line -= newlines.length;
+ if (newlines.isEmpty) {
+ _column -= oldPosition - newPosition;
+ } else {
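+        // Measure the column from the last newline before the new position.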
+ _column =
+ newPosition - string.lastIndexOf(_newlineRegExp, newPosition) - 1;
+ }
+ }
+ }
+
+ EagerSpanScanner(super.string, {super.sourceUrl, super.position});
+
+ @override
+ bool scanChar(int character) {
+ if (!super.scanChar(character)) return false;
+ _adjustLineAndColumn(character);
+ return true;
+ }
+
+ @override
+ int readChar() {
+ final character = super.readChar();
+ _adjustLineAndColumn(character);
+ return character;
+ }
+
+ /// Adjusts [_line] and [_column] after having consumed [character].
+ void _adjustLineAndColumn(int character) {
+ if (character == $lf || (character == $cr && peekChar() != $lf)) {
+ _line += 1;
+ _column = 0;
+ } else {
+ _column += inSupplementaryPlane(character) ? 2 : 1;
+ }
+ }
+
+ @override
+ bool scan(Pattern pattern) {
+ if (!super.scan(pattern)) return false;
+ final firstMatch = lastMatch![0]!;
+
+ final newlines = _newlinesIn(firstMatch);
+ _line += newlines.length;
+ if (newlines.isEmpty) {
+ _column += firstMatch.length;
+ } else {
+ _column = firstMatch.length - newlines.last.end;
+ }
+
+ return true;
+ }
+
+ /// Returns a list of [Match]es describing all the newlines in [text], which
+ /// is assumed to end at [position].
+ List<Match> _newlinesIn(String text) {
+ final newlines = _newlineRegExp.allMatches(text).toList();
+ if (_betweenCRLF) newlines.removeLast();
+ return newlines;
+ }
+}
+
+/// A class representing the state of an [EagerSpanScanner].
+class _EagerSpanScannerState implements LineScannerState {
+ final EagerSpanScanner _scanner;
+ @override
+ final int position;
+ @override
+ final int line;
+ @override
+ final int column;
+
+ _EagerSpanScannerState(this._scanner, this.position, this.line, this.column);
+}
diff --git a/pkgs/string_scanner/lib/src/exception.dart b/pkgs/string_scanner/lib/src/exception.dart
new file mode 100644
index 0000000..57af541
--- /dev/null
+++ b/pkgs/string_scanner/lib/src/exception.dart
@@ -0,0 +1,21 @@
+// Copyright (c) 2014, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:source_span/source_span.dart';
+
+import 'string_scanner.dart';
+
+/// An exception thrown by a [StringScanner] that failed to parse a string.
+class StringScannerException extends SourceSpanFormatException {
+ @override
+ String get source => super.source as String;
+
+ /// The URL of the source file being parsed.
+ ///
+ /// This may be `null`, indicating that the source URL is unknown.
+ Uri? get sourceUrl => span?.sourceUrl;
+
+ StringScannerException(
+ super.message, SourceSpan super.span, String super.source);
+}
diff --git a/pkgs/string_scanner/lib/src/line_scanner.dart b/pkgs/string_scanner/lib/src/line_scanner.dart
new file mode 100644
index 0000000..b18d610
--- /dev/null
+++ b/pkgs/string_scanner/lib/src/line_scanner.dart
@@ -0,0 +1,183 @@
+// Copyright (c) 2014, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'charcode.dart';
+import 'string_scanner.dart';
+import 'utils.dart';
+
+// Note that much of this code is duplicated in eager_span_scanner.dart.
+
+/// A regular expression matching newlines. A newline is either a `\n`, a `\r\n`
+/// or a `\r` that is not immediately followed by a `\n`.
+final _newlineRegExp = RegExp(r'\n|\r\n|\r(?!\n)');
+
+/// A subclass of [StringScanner] that tracks line and column information.
+class LineScanner extends StringScanner {
+ /// The scanner's current (zero-based) line number.
+ int get line => _line;
+ int _line = 0;
+
+ /// The scanner's current (zero-based) column number.
+ int get column => _column;
+ int _column = 0;
+
+ /// The scanner's state, including line and column information.
+ ///
+ /// This can be used to efficiently save and restore the state of the scanner
+ /// when backtracking. A given [LineScannerState] is only valid for the
+ /// [LineScanner] that created it.
+ ///
+ /// This does not include the scanner's match information.
+ LineScannerState get state =>
+ LineScannerState._(this, position, line, column);
+
+ /// Whether the current position is between a CR character and an LF
+  /// character.
+ bool get _betweenCRLF => peekChar(-1) == $cr && peekChar() == $lf;
+
+ set state(LineScannerState state) {
+ if (!identical(state._scanner, this)) {
+ throw ArgumentError('The given LineScannerState was not returned by '
+ 'this LineScanner.');
+ }
+
+ super.position = state.position;
+ _line = state.line;
+ _column = state.column;
+ }
+
+ @override
+ set position(int newPosition) {
+ if (newPosition == position) {
+ return;
+ }
+
+ final oldPosition = position;
+ super.position = newPosition;
+
+ if (newPosition == 0) {
+ _line = 0;
+ _column = 0;
+ } else if (newPosition > oldPosition) {
+ final newlines = _newlinesIn(string.substring(oldPosition, newPosition),
+ endPosition: newPosition);
+ _line += newlines.length;
+ if (newlines.isEmpty) {
+ _column += newPosition - oldPosition;
+ } else {
+ // The regex got a substring, so we need to account for where it started
+ // in the string.
+ final offsetOfLastNewline = oldPosition + newlines.last.end;
+ _column = newPosition - offsetOfLastNewline;
+ }
+ } else if (newPosition < oldPosition) {
+ final newlines = _newlinesIn(string.substring(newPosition, oldPosition),
+ endPosition: oldPosition);
+
+ _line -= newlines.length;
+ if (newlines.isEmpty) {
+ _column -= oldPosition - newPosition;
+ } else {
+ // To compute the new column, we need to locate the last newline before
+ // the new position. When searching, we must exclude the CR if we're
+ // between a CRLF because it's not considered a newline.
+ final crOffset = _betweenCRLF ? -1 : 0;
+ // Additionally, if we use newPosition as the end of the search and the
+ // character at that position itself (the next character) is a newline
+ // we should not use it, so also offset to account for that.
+ const currentCharOffset = -1;
+ final lastNewline = string.lastIndexOf(
+ _newlineRegExp, newPosition + currentCharOffset + crOffset);
+
+ // Now we need to know the offset after the newline. This is the index
+        // above plus the length of the newline (e.g. two if we found `\r\n`).
+        // However, if no newline was found, that offset is 0.
+ final offsetAfterLastNewline = lastNewline == -1
+ ? 0
+ : string[lastNewline] == '\r' && string[lastNewline + 1] == '\n'
+ ? lastNewline + 2
+ : lastNewline + 1;
+
+ _column = newPosition - offsetAfterLastNewline;
+ }
+ }
+ }
+
+ LineScanner(super.string, {super.sourceUrl, super.position});
+
+ @override
+ bool scanChar(int character) {
+ if (!super.scanChar(character)) return false;
+ _adjustLineAndColumn(character);
+ return true;
+ }
+
+ @override
+ int readChar() {
+ final character = super.readChar();
+ _adjustLineAndColumn(character);
+ return character;
+ }
+
+ /// Adjusts [_line] and [_column] after having consumed [character].
+ void _adjustLineAndColumn(int character) {
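+    // A lone CR counts as a newline, but for a CR LF pair only the LF
+    // advances the line, so the pair is counted exactly once.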
+ if (character == $lf || (character == $cr && peekChar() != $lf)) {
+ _line += 1;
+ _column = 0;
+ } else {
+ _column += inSupplementaryPlane(character) ? 2 : 1;
+ }
+ }
+
+ @override
+ bool scan(Pattern pattern) {
+ if (!super.scan(pattern)) return false;
+
+ final newlines = _newlinesIn(lastMatch![0]!, endPosition: position);
+ _line += newlines.length;
+ if (newlines.isEmpty) {
+ _column += lastMatch![0]!.length;
+ } else {
+ _column = lastMatch![0]!.length - newlines.last.end;
+ }
+
+ return true;
+ }
+
+ /// Returns a list of [Match]es describing all the newlines in [text], which
+ /// ends at [endPosition].
+ ///
+ /// If [text] ends with `\r`, it will only be treated as a newline if the next
+ /// character at [position] is not a `\n`.
+ List<Match> _newlinesIn(String text, {required int endPosition}) {
+ final newlines = _newlineRegExp.allMatches(text).toList();
+ // If the last character is a `\r` it will have been treated as a newline,
+ // but this is only valid if the next character is not a `\n`.
+ if (endPosition < string.length &&
+ text.endsWith('\r') &&
+ string[endPosition] == '\n') {
+ // newlines should never be empty here, because if `text` ends with `\r`
+ // it would have matched `\r(?!\n)` in the newline regex.
+ newlines.removeLast();
+ }
+ return newlines;
+ }
+}
+
+/// A class representing the state of a [LineScanner].
+class LineScannerState {
+ /// The [LineScanner] that created this.
+ final LineScanner _scanner;
+
+ /// The position of the scanner in this state.
+ final int position;
+
+ /// The zero-based line number of the scanner in this state.
+ final int line;
+
+ /// The zero-based column number of the scanner in this state.
+ final int column;
+
+ LineScannerState._(this._scanner, this.position, this.line, this.column);
+}
diff --git a/pkgs/string_scanner/lib/src/relative_span_scanner.dart b/pkgs/string_scanner/lib/src/relative_span_scanner.dart
new file mode 100644
index 0000000..cd9af0e
--- /dev/null
+++ b/pkgs/string_scanner/lib/src/relative_span_scanner.dart
@@ -0,0 +1,132 @@
+// Copyright (c) 2016, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:source_span/source_span.dart';
+
+import 'exception.dart';
+import 'line_scanner.dart';
+import 'span_scanner.dart';
+import 'string_scanner.dart';
+import 'utils.dart';
+
+/// A [SpanScanner] that scans within an existing [FileSpan].
+///
+/// This re-implements chunks of [SpanScanner] rather than using a dummy span or
+/// inheritance because scanning is often a performance-critical operation, so
+/// it's important to avoid adding extra overhead when relative scanning isn't
+/// needed.
+class RelativeSpanScanner extends StringScanner implements SpanScanner {
+ /// The source of the scanner.
+ ///
+ /// This caches line break information and is used to generate [SourceSpan]s.
+ final SourceFile _sourceFile;
+
+ /// The start location of the span within which this scanner is scanning.
+ ///
+ /// This is used to convert between span-relative and file-relative fields.
+ final FileLocation _startLocation;
+
+ @override
+ int get line =>
+ _sourceFile.getLine(_startLocation.offset + position) -
+ _startLocation.line;
+
+ @override
+ int get column {
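+    // Only the first line of the span shares a file line with text before
+    // the span, so the span's start column is subtracted only there.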
+ final line = _sourceFile.getLine(_startLocation.offset + position);
+ final column =
+ _sourceFile.getColumn(_startLocation.offset + position, line: line);
+ return line == _startLocation.line
+ ? column - _startLocation.column
+ : column;
+ }
+
+ @override
+ LineScannerState get state => _SpanScannerState(this, position);
+
+ @override
+ set state(LineScannerState state) {
+ if (state is! _SpanScannerState || !identical(state._scanner, this)) {
+ throw ArgumentError('The given LineScannerState was not returned by '
+ 'this LineScanner.');
+ }
+
+ position = state.position;
+ }
+
+ @override
+ FileSpan? get lastSpan => _lastSpan;
+ FileSpan? _lastSpan;
+
+ @override
+ FileLocation get location =>
+ _sourceFile.location(_startLocation.offset + position);
+
+ @override
+ FileSpan get emptySpan => location.pointSpan();
+
+ RelativeSpanScanner(FileSpan span)
+ : _sourceFile = span.file,
+ _startLocation = span.start,
+ super(span.text, sourceUrl: span.sourceUrl);
+
+ @override
+ FileSpan spanFrom(LineScannerState startState, [LineScannerState? endState]) {
+ final endPosition = endState == null ? position : endState.position;
+ return _sourceFile.span(_startLocation.offset + startState.position,
+ _startLocation.offset + endPosition);
+ }
+
+ @override
+ FileSpan spanFromPosition(int startPosition, [int? endPosition]) {
+ RangeError.checkValidRange(
+ startPosition,
+ endPosition,
+ _sourceFile.length - _startLocation.offset,
+ 'startPosition',
+ 'endPosition');
+ return _sourceFile.span(_startLocation.offset + startPosition,
+ _startLocation.offset + (endPosition ?? position));
+ }
+
+ @override
+ bool matches(Pattern pattern) {
+ if (!super.matches(pattern)) {
+ _lastSpan = null;
+ return false;
+ }
+
+ _lastSpan = _sourceFile.span(_startLocation.offset + position,
+ _startLocation.offset + lastMatch!.end);
+ return true;
+ }
+
+ @override
+ Never error(String message, {Match? match, int? position, int? length}) {
+ validateErrorArgs(string, match, position, length);
+
+ if (match == null && position == null && length == null) match = lastMatch;
+ position ??= match == null ? this.position : match.start;
+ length ??= match == null ? 1 : match.end - match.start;
+
+ final span = _sourceFile.span(_startLocation.offset + position,
+ _startLocation.offset + position + length);
+ throw StringScannerException(message, span, string);
+ }
+}
+
+/// A class representing the state of a [SpanScanner].
+class _SpanScannerState implements LineScannerState {
+ /// The [SpanScanner] that created this.
+ final RelativeSpanScanner _scanner;
+
+ @override
+ final int position;
+ @override
+ int get line => _scanner._sourceFile.getLine(position);
+ @override
+ int get column => _scanner._sourceFile.getColumn(position);
+
+ _SpanScannerState(this._scanner, this.position);
+}
diff --git a/pkgs/string_scanner/lib/src/span_scanner.dart b/pkgs/string_scanner/lib/src/span_scanner.dart
new file mode 100644
index 0000000..509cf60
--- /dev/null
+++ b/pkgs/string_scanner/lib/src/span_scanner.dart
@@ -0,0 +1,142 @@
+// Copyright (c) 2014, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:source_span/source_span.dart';
+
+import 'eager_span_scanner.dart';
+import 'exception.dart';
+import 'line_scanner.dart';
+import 'relative_span_scanner.dart';
+import 'string_scanner.dart';
+import 'utils.dart';
+
+/// A subclass of [LineScanner] that exposes matched ranges as source map
+/// [FileSpan]s.
+class SpanScanner extends StringScanner implements LineScanner {
+ /// The source of the scanner.
+ ///
+ /// This caches line break information and is used to generate [FileSpan]s.
+ final SourceFile _sourceFile;
+
+ @override
+ int get line => _sourceFile.getLine(position);
+ @override
+ int get column => _sourceFile.getColumn(position);
+
+ @override
+ LineScannerState get state => _SpanScannerState(this, position);
+
+ @override
+ set state(LineScannerState state) {
+ if (state is! _SpanScannerState || !identical(state._scanner, this)) {
+ throw ArgumentError('The given LineScannerState was not returned by '
+ 'this LineScanner.');
+ }
+
+ position = state.position;
+ }
+
+ /// The [FileSpan] for [lastMatch].
+ ///
+ /// This is the span for the entire match. There's no way to get spans for
+ /// subgroups since [Match] exposes no information about their positions.
+ FileSpan? get lastSpan {
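+    // [lastMatch] is lazily invalidated when the position changes, so clear
+    // the cached span whenever the match is no longer valid.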
+ if (lastMatch == null) _lastSpan = null;
+ return _lastSpan;
+ }
+
+ FileSpan? _lastSpan;
+
+ /// The current location of the scanner.
+ FileLocation get location => _sourceFile.location(position);
+
+ /// Returns an empty span at the current location.
+ FileSpan get emptySpan => location.pointSpan();
+
+ /// Creates a new [SpanScanner] that starts scanning from [position].
+ ///
+ /// [sourceUrl] is used as [SourceLocation.sourceUrl] for the returned
+ /// [FileSpan]s as well as for error reporting. It can be a [String], a
+ /// [Uri], or `null`.
+ SpanScanner(super.string, {super.sourceUrl, super.position})
+ : _sourceFile = SourceFile.fromString(string, url: sourceUrl);
+
+ /// Creates a new [SpanScanner] that eagerly computes line and column numbers.
+ ///
+ /// In general [SpanScanner.new] will be more efficient, since it avoids extra
+ /// computation on every scan. However, eager scanning can be useful for
+ /// situations where the normal course of parsing frequently involves
+ /// accessing the current line and column numbers.
+ ///
+ /// Note that *only* the `line` and `column` fields on the `SpanScanner`
+ /// itself and its `LineScannerState` are eagerly computed. To limit their
+ /// memory footprint, returned spans and locations will still lazily compute
+ /// their line and column numbers.
+ factory SpanScanner.eager(String string, {sourceUrl, int? position}) =
+ EagerSpanScanner;
+
+ /// Creates a new [SpanScanner] that scans within [span].
+ ///
+  /// This scans through [span]`.text`, but emits new spans from
+  /// [span]`.file` in their appropriate relative positions. The [string]
+  /// field contains only [span]`.text`, and [position], [line], and [column]
+  /// are all relative to the span.
+ factory SpanScanner.within(FileSpan span) = RelativeSpanScanner;
+
+ /// Creates a [FileSpan] representing the source range between [startState]
+ /// and the current position.
+ FileSpan spanFrom(LineScannerState startState, [LineScannerState? endState]) {
+ final endPosition = endState == null ? position : endState.position;
+ return _sourceFile.span(startState.position, endPosition);
+ }
+
+ /// Creates a [FileSpan] representing the source range between [startPosition]
+ /// and [endPosition], or the current position if [endPosition] is null.
+ ///
+ /// Each position should be a code unit offset into the string being scanned,
+ /// with the same conventions as [StringScanner.position].
+ ///
+ /// Throws a [RangeError] if [startPosition] or [endPosition] aren't within
+ /// this source file.
+ FileSpan spanFromPosition(int startPosition, [int? endPosition]) =>
+ _sourceFile.span(startPosition, endPosition ?? position);
+
+ @override
+ bool matches(Pattern pattern) {
+ if (!super.matches(pattern)) {
+ _lastSpan = null;
+ return false;
+ }
+
+ _lastSpan = _sourceFile.span(position, lastMatch!.end);
+ return true;
+ }
+
+ @override
+ Never error(String message, {Match? match, int? position, int? length}) {
+ validateErrorArgs(string, match, position, length);
+
+ if (match == null && position == null && length == null) match = lastMatch;
+ position ??= match == null ? this.position : match.start;
+ length ??= match == null ? 0 : match.end - match.start;
+
+ final span = _sourceFile.span(position, position + length);
+ throw StringScannerException(message, span, string);
+ }
+}
+
+/// A class representing the state of a [SpanScanner].
+class _SpanScannerState implements LineScannerState {
+ /// The [SpanScanner] that created this.
+ final SpanScanner _scanner;
+
+ @override
+ final int position;
+ @override
+ int get line => _scanner._sourceFile.getLine(position);
+ @override
+ int get column => _scanner._sourceFile.getColumn(position);
+
+ _SpanScannerState(this._scanner, this.position);
+}
diff --git a/pkgs/string_scanner/lib/src/string_scanner.dart b/pkgs/string_scanner/lib/src/string_scanner.dart
new file mode 100644
index 0000000..1466944
--- /dev/null
+++ b/pkgs/string_scanner/lib/src/string_scanner.dart
@@ -0,0 +1,272 @@
+// Copyright (c) 2014, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:source_span/source_span.dart';
+
+import 'charcode.dart';
+import 'exception.dart';
+import 'utils.dart';
+
+/// A class that scans through a string using [Pattern]s.
+class StringScanner {
+ /// The URL of the source of the string being scanned.
+ ///
+ /// This is used for error reporting. It may be `null`, indicating that the
+ /// source URL is unknown or unavailable.
+ final Uri? sourceUrl;
+
+ /// The string being scanned through.
+ final String string;
+
+ /// The current position of the scanner in the string, in characters.
+ int get position => _position;
+ set position(int position) {
+ if (position.isNegative || position > string.length) {
+ throw ArgumentError('Invalid position $position');
+ }
+
+ _position = position;
+ _lastMatch = null;
+ }
+
+ int _position = 0;
+
+ /// The data about the previous match made by the scanner.
+ ///
+ /// If the last match failed, this will be `null`.
+ Match? get lastMatch {
+ // Lazily unset [_lastMatch] so that we avoid extra assignments in
+ // character-by-character methods that are used in core loops.
+ if (_position != _lastMatchPosition) _lastMatch = null;
+ return _lastMatch;
+ }
+
+ Match? _lastMatch;
+ int? _lastMatchPosition;
+
+ /// The portion of the string that hasn't yet been scanned.
+ String get rest => string.substring(position);
+
+ /// Whether the scanner has completely consumed [string].
+ bool get isDone => position == string.length;
+
+ /// Creates a new [StringScanner] that starts scanning from [position].
+ ///
+ /// [position] defaults to 0, the beginning of the string. [sourceUrl] is the
+ /// URL of the source of the string being scanned, if available. It can be
+ /// a [String], a [Uri], or `null`.
+ StringScanner(this.string, {Object? sourceUrl, int? position})
+ : sourceUrl = sourceUrl == null
+ ? null
+ : sourceUrl is String
+ ? Uri.parse(sourceUrl)
+ : sourceUrl as Uri {
+ if (position != null) this.position = position;
+ }
+
+ /// Consumes a single character and returns its character code.
+ ///
+ /// This throws a [FormatException] if the string has been fully consumed. It
+ /// doesn't affect [lastMatch].
+ int readChar() {
+ if (isDone) _fail('more input');
+ return string.codeUnitAt(_position++);
+ }
+
+ /// Returns the character code of the character [offset] away from [position].
+ ///
+ /// [offset] defaults to zero, and may be negative to inspect already-consumed
+ /// characters.
+ ///
+ /// This returns `null` if [offset] points outside the string. It doesn't
+ /// affect [lastMatch].
+ int? peekChar([int? offset]) {
+ offset ??= 0;
+ final index = position + offset;
+ if (index < 0 || index >= string.length) return null;
+ return string.codeUnitAt(index);
+ }
+
+ /// If the next character in the string is [character], consumes it.
+ ///
+ /// If [character] is a Unicode code point in a supplementary plane, this will
+ /// consume two code units. Dart's string representation is UTF-16, which
+  /// represents supplementary-plane code points as two code units.
+ ///
+ /// Returns whether or not [character] was consumed.
+ bool scanChar(int character) {
+ if (inSupplementaryPlane(character)) {
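+      // The character is outside the Basic Multilingual Plane, so it occupies
+      // two code units; match the high and low surrogates individually.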
+ if (_position + 1 >= string.length ||
+ string.codeUnitAt(_position) != highSurrogate(character) ||
+ string.codeUnitAt(_position + 1) != lowSurrogate(character)) {
+ return false;
+ } else {
+ _position += 2;
+ return true;
+ }
+ } else {
+ if (isDone) return false;
+ if (string.codeUnitAt(_position) != character) return false;
+ _position++;
+ return true;
+ }
+ }
+
+ /// If the next character in the string is [character], consumes it.
+ ///
+ /// If [character] is a Unicode code point in a supplementary plane, this will
+ /// consume two code units. Dart's string representation is UTF-16, which
+  /// represents supplementary-plane code points as two code units.
+ ///
+ /// If [character] could not be consumed, throws a [FormatException]
+ /// describing the position of the failure. [name] is used in this error as
+ /// the expected name of the character being matched; if it's `null`, the
+ /// character itself is used instead.
+ void expectChar(int character, {String? name}) {
+ if (scanChar(character)) return;
+
+ if (name == null) {
+ if (character == $backslash) {
+ name = r'"\"';
+ } else if (character == $doubleQuote) {
+ name = r'"\""';
+ } else {
+ name = '"${String.fromCharCode(character)}"';
+ }
+ }
+
+ _fail(name);
+ }
+
+  /// Consumes a single Unicode code point and returns it.
+ ///
+ /// This works like [readChar], except that it automatically handles UTF-16
+ /// surrogate pairs. Specifically, if the next two code units form a surrogate
+ /// pair, consumes them both and returns the corresponding Unicode code point.
+ ///
+  /// If the next two characters are not a surrogate pair, the next code unit
+  /// is returned as-is, even if it's an unpaired surrogate.
+ int readCodePoint() {
+ final first = readChar();
+ if (!isHighSurrogate(first)) return first;
+
+ final next = peekChar();
+ if (next == null || !isLowSurrogate(next)) return first;
+
+ readChar();
+ return decodeSurrogatePair(first, next);
+ }
+
+ /// Returns the Unicode code point immediately after [position].
+ ///
+ /// This works like [peekChar], except that it automatically handles UTF-16
+ /// surrogate pairs. Specifically, if the next two code units form a surrogate
+ /// pair, returns the corresponding Unicode code point.
+ ///
+  /// If the next two characters are not a surrogate pair, the next code unit
+  /// is returned as-is, even if it's an unpaired surrogate.
+ int? peekCodePoint() {
+ final first = peekChar();
+ if (first == null || !isHighSurrogate(first)) return first;
+
+ final next = peekChar(1);
+ if (next == null || !isLowSurrogate(next)) return first;
+
+ return decodeSurrogatePair(first, next);
+ }
+
+ /// If [pattern] matches at the current position of the string, scans forward
+ /// until the end of the match.
+ ///
+ /// Returns whether or not [pattern] matched.
+ bool scan(Pattern pattern) {
+ final success = matches(pattern);
+ if (success) {
+ _position = _lastMatch!.end;
+ _lastMatchPosition = _position;
+ }
+ return success;
+ }
+
+ /// If [pattern] matches at the current position of the string, scans forward
+ /// until the end of the match.
+ ///
+ /// If [pattern] did not match, throws a [FormatException] describing the
+ /// position of the failure. [name] is used in this error as the expected name
+ /// of the pattern being matched; if it's `null`, the pattern itself is used
+ /// instead.
+ void expect(Pattern pattern, {String? name}) {
+ if (scan(pattern)) return;
+
+ if (name == null) {
+ if (pattern is RegExp) {
+ final source = pattern.pattern;
+ name = '/$source/';
+ } else {
+ name =
+ pattern.toString().replaceAll(r'\', r'\\').replaceAll('"', r'\"');
+ name = '"$name"';
+ }
+ }
+ _fail(name);
+ }
+
+ /// If the string has not been fully consumed, this throws a
+ /// [FormatException].
+ void expectDone() {
+ if (isDone) return;
+ _fail('no more input');
+ }
+
+ /// Returns whether or not [pattern] matches at the current position of the
+ /// string.
+ ///
+ /// This doesn't move the scan pointer forward.
+ bool matches(Pattern pattern) {
+ _lastMatch = pattern.matchAsPrefix(string, position);
+ _lastMatchPosition = _position;
+ return _lastMatch != null;
+ }
+
+ /// Returns the substring of [string] between [start] and [end].
+ ///
+ /// Unlike [String.substring], [end] defaults to [position] rather than the
+ /// end of the string.
+ String substring(int start, [int? end]) {
+ end ??= position;
+ return string.substring(start, end);
+ }
+
+ /// Throws a [FormatException] with [message] as well as a detailed
+ /// description of the location of the error in the string.
+ ///
+ /// [match] is the match information for the span of the string with which the
+ /// error is associated. This should be a match returned by this scanner's
+ /// [lastMatch] property. By default, the error is associated with the last
+ /// match.
+ ///
+ /// If [position] and/or [length] are passed, they are used as the error span
+ /// instead. If only [length] is passed, [position] defaults to the current
+ /// position; if only [position] is passed, [length] defaults to 0.
+ ///
+ /// It's an error to pass [match] at the same time as [position] or [length].
+ Never error(String message, {Match? match, int? position, int? length}) {
+ validateErrorArgs(string, match, position, length);
+
+ if (match == null && position == null && length == null) match = lastMatch;
+ position ??= match == null ? this.position : match.start;
+ length ??= match == null ? 0 : match.end - match.start;
+
+ final sourceFile = SourceFile.fromString(string, url: sourceUrl);
+ final span = sourceFile.span(position, position + length);
+ throw StringScannerException(message, span, string);
+ }
+
+ // TODO(nweiz): Make this handle long lines more gracefully.
+ /// Throws a [FormatException] describing that [name] is expected at the
+ /// current position in the string.
+ Never _fail(String name) {
+ error('expected $name.', position: position, length: 0);
+ }
+}
diff --git a/pkgs/string_scanner/lib/src/utils.dart b/pkgs/string_scanner/lib/src/utils.dart
new file mode 100644
index 0000000..39891a1
--- /dev/null
+++ b/pkgs/string_scanner/lib/src/utils.dart
@@ -0,0 +1,95 @@
+// Copyright (c) 2014, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'string_scanner.dart';
+
+/// Validates the arguments passed to [StringScanner.error].
+void validateErrorArgs(
+ String string, Match? match, int? position, int? length) {
+ if (match != null && (position != null || length != null)) {
+ throw ArgumentError("Can't pass both match and position/length.");
+ }
+
+ if (position != null) {
+ if (position < 0) {
+ throw RangeError('position must be greater than or equal to 0.');
+ } else if (position > string.length) {
+ throw RangeError('position must be less than or equal to the '
+ 'string length.');
+ }
+ }
+
+ if (length != null && length < 0) {
+ throw RangeError('length must be greater than or equal to 0.');
+ }
+
+ if (position != null && length != null && position + length > string.length) {
+ throw RangeError('position plus length must not go beyond the end of '
+ 'the string.');
+ }
+}
+
+// See https://en.wikipedia.org/wiki/UTF-16#Code_points_from_U+010000_to_U+10FFFF
+// for documentation on how UTF-16 encoding works and definitions of various
+// related terms.
+
+/// The inclusive lower bound of Unicode's supplementary plane.
+const _supplementaryPlaneLowerBound = 0x10000;
+
+/// The inclusive upper bound of Unicode's supplementary plane.
+const _supplementaryPlaneUpperBound = 0x10FFFF;
+
+/// The inclusive lower bound of the UTF-16 high surrogate block.
+const _highSurrogateLowerBound = 0xD800;
+
+/// The inclusive lower bound of the UTF-16 low surrogate block.
+const _lowSurrogateLowerBound = 0xDC00;
+
+/// The number of low bits in each code unit of a surrogate pair that goes into
+/// determining which code point it encodes.
+const _surrogateBits = 10;
+
+/// A bit mask that covers the lower [_surrogateBits] of a code point, which can
+/// be used to extract the value of a surrogate or the low surrogate value of a
+/// code unit.
+const _surrogateValueMask = (1 << _surrogateBits) - 1;
+
+/// Returns whether [codePoint] is in the Unicode supplementary plane, and thus
+/// must be represented as a surrogate pair in UTF-16.
+bool inSupplementaryPlane(int codePoint) =>
+ codePoint >= _supplementaryPlaneLowerBound &&
+ codePoint <= _supplementaryPlaneUpperBound;
+
+/// Returns whether [codeUnit] is a UTF-16 high surrogate.
+bool isHighSurrogate(int codeUnit) =>
+ (codeUnit & ~_surrogateValueMask) == _highSurrogateLowerBound;
+
+/// Returns whether [codeUnit] is a UTF-16 low surrogate.
+bool isLowSurrogate(int codeUnit) =>
+ (codeUnit >> _surrogateBits) == (_lowSurrogateLowerBound >> _surrogateBits);
+
+/// Returns the high surrogate needed to encode the supplementary-plane
+/// [codePoint].
+int highSurrogate(int codePoint) {
+ assert(inSupplementaryPlane(codePoint));
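+  // For example, highSurrogate(0x1F46D) == 0xD83D:
+  // ((0x1F46D - 0x10000) >> 10) + 0xD800 == 0x3D + 0xD800.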
+ return ((codePoint - _supplementaryPlaneLowerBound) >> _surrogateBits) +
+ _highSurrogateLowerBound;
+}
+
+/// Returns the low surrogate needed to encode the supplementary-plane
+/// [codePoint].
+int lowSurrogate(int codePoint) {
+ assert(inSupplementaryPlane(codePoint));
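+  // For example, lowSurrogate(0x1F46D) == 0xDC6D:
+  // ((0x1F46D - 0x10000) & 0x3FF) + 0xDC00 == 0x6D + 0xDC00.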
+ return ((codePoint - _supplementaryPlaneLowerBound) & _surrogateValueMask) +
+ _lowSurrogateLowerBound;
+}
+
+/// Converts a UTF-16 surrogate pair into the Unicode code point it
+/// represents.
+int decodeSurrogatePair(int highSurrogate, int lowSurrogate) {
+ assert(isHighSurrogate(highSurrogate));
+ assert(isLowSurrogate(lowSurrogate));
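+  // For example, 0xD83D 0xDC6D decodes to 0x1F46D:
+  // ((0x3D << 10) | 0x6D) + 0x10000 == 0x1F46D.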
+ return _supplementaryPlaneLowerBound +
+ (((highSurrogate & _surrogateValueMask) << _surrogateBits) |
+ (lowSurrogate & _surrogateValueMask));
+}
diff --git a/pkgs/string_scanner/lib/string_scanner.dart b/pkgs/string_scanner/lib/string_scanner.dart
new file mode 100644
index 0000000..e641ae7
--- /dev/null
+++ b/pkgs/string_scanner/lib/string_scanner.dart
@@ -0,0 +1,11 @@
+// Copyright (c) 2014, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+/// A library for parsing strings using a sequence of patterns.
+library;
+
+export 'src/exception.dart';
+export 'src/line_scanner.dart';
+export 'src/span_scanner.dart';
+export 'src/string_scanner.dart';
diff --git a/pkgs/string_scanner/pubspec.yaml b/pkgs/string_scanner/pubspec.yaml
new file mode 100644
index 0000000..9b259cf
--- /dev/null
+++ b/pkgs/string_scanner/pubspec.yaml
@@ -0,0 +1,14 @@
+name: string_scanner
+version: 1.4.1
+description: A class for parsing strings using a sequence of patterns.
+repository: https://github.com/dart-lang/tools/tree/main/pkgs/string_scanner
+
+environment:
+ sdk: ^3.1.0
+
+dependencies:
+ source_span: ^1.8.0
+
+dev_dependencies:
+ dart_flutter_team_lints: ^3.0.0
+ test: ^1.16.6
diff --git a/pkgs/string_scanner/test/error_test.dart b/pkgs/string_scanner/test/error_test.dart
new file mode 100644
index 0000000..1f98c32
--- /dev/null
+++ b/pkgs/string_scanner/test/error_test.dart
@@ -0,0 +1,143 @@
+// Copyright (c) 2014, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:string_scanner/string_scanner.dart';
+import 'package:test/test.dart';
+
+import 'utils.dart';
+
+void main() {
+ test('defaults to the last match', () {
+ final scanner = StringScanner('foo bar baz');
+ scanner.expect('foo ');
+ scanner.expect('bar');
+ expect(() => scanner.error('oh no!'), throwsStringScannerException('bar'));
+ });
+
+ group('with match', () {
+ test('supports an earlier match', () {
+ final scanner = StringScanner('foo bar baz');
+ scanner.expect('foo ');
+ final match = scanner.lastMatch;
+ scanner.expect('bar');
+ expect(() => scanner.error('oh no!', match: match),
+ throwsStringScannerException('foo '));
+ });
+
+ test('supports a match on a previous line', () {
+ final scanner = StringScanner('foo bar baz\ndo re mi\nearth fire water');
+ scanner.expect('foo bar baz\ndo ');
+ scanner.expect('re');
+ final match = scanner.lastMatch;
+ scanner.expect(' mi\nearth ');
+ expect(() => scanner.error('oh no!', match: match),
+ throwsStringScannerException('re'));
+ });
+
+ test('supports a multiline match', () {
+ final scanner = StringScanner('foo bar baz\ndo re mi\nearth fire water');
+ scanner.expect('foo bar ');
+ scanner.expect('baz\ndo');
+ final match = scanner.lastMatch;
+ scanner.expect(' re mi');
+ expect(() => scanner.error('oh no!', match: match),
+ throwsStringScannerException('baz\ndo'));
+ });
+
+ test('supports a match after position', () {
+ final scanner = StringScanner('foo bar baz');
+ scanner.expect('foo ');
+ scanner.expect('bar');
+ final match = scanner.lastMatch;
+ scanner.position = 0;
+ expect(() => scanner.error('oh no!', match: match),
+ throwsStringScannerException('bar'));
+ });
+ });
+
+ group('with position and/or length', () {
+ test('defaults to length 0', () {
+ final scanner = StringScanner('foo bar baz');
+ scanner.expect('foo ');
+ expect(() => scanner.error('oh no!', position: 1),
+ throwsStringScannerException(''));
+ });
+
+ test('defaults to the current position', () {
+ final scanner = StringScanner('foo bar baz');
+ scanner.expect('foo ');
+ expect(() => scanner.error('oh no!', length: 3),
+ throwsStringScannerException('bar'));
+ });
+
+ test('supports an earlier position', () {
+ final scanner = StringScanner('foo bar baz');
+ scanner.expect('foo ');
+ expect(() => scanner.error('oh no!', position: 1, length: 2),
+ throwsStringScannerException('oo'));
+ });
+
+ test('supports a position on a previous line', () {
+ final scanner = StringScanner('foo bar baz\ndo re mi\nearth fire water');
+ scanner.expect('foo bar baz\ndo re mi\nearth');
+ expect(() => scanner.error('oh no!', position: 15, length: 2),
+ throwsStringScannerException('re'));
+ });
+
+ test('supports a multiline length', () {
+ final scanner = StringScanner('foo bar baz\ndo re mi\nearth fire water');
+ scanner.expect('foo bar baz\ndo re mi\nearth');
+ expect(() => scanner.error('oh no!', position: 8, length: 8),
+ throwsStringScannerException('baz\ndo r'));
+ });
+
+ test('supports a position after the current one', () {
+ final scanner = StringScanner('foo bar baz');
+ expect(() => scanner.error('oh no!', position: 4, length: 3),
+ throwsStringScannerException('bar'));
+ });
+
+ test('supports a length of zero', () {
+ final scanner = StringScanner('foo bar baz');
+ expect(() => scanner.error('oh no!', position: 4, length: 0),
+ throwsStringScannerException(''));
+ });
+ });
+
+ group('argument errors', () {
+ late StringScanner scanner;
+ setUp(() {
+ scanner = StringScanner('foo bar baz');
+ scanner.scan('foo');
+ });
+
+ test('if match is passed with position', () {
+ expect(
+ () => scanner.error('oh no!', match: scanner.lastMatch, position: 1),
+ throwsArgumentError);
+ });
+
+ test('if match is passed with length', () {
+ expect(() => scanner.error('oh no!', match: scanner.lastMatch, length: 1),
+ throwsArgumentError);
+ });
+
+ test('if position is negative', () {
+ expect(() => scanner.error('oh no!', position: -1), throwsArgumentError);
+ });
+
+ test('if position is outside the string', () {
+ expect(() => scanner.error('oh no!', position: 100), throwsArgumentError);
+ });
+
+ test('if position + length is outside the string', () {
+ expect(() => scanner.error('oh no!', position: 7, length: 7),
+ throwsArgumentError);
+ });
+
+ test('if length is negative', () {
+ expect(() => scanner.error('oh no!', length: -1), throwsArgumentError);
+ });
+ });
+}
diff --git a/pkgs/string_scanner/test/line_scanner_test.dart b/pkgs/string_scanner/test/line_scanner_test.dart
new file mode 100644
index 0000000..1af5c36
--- /dev/null
+++ b/pkgs/string_scanner/test/line_scanner_test.dart
@@ -0,0 +1,465 @@
+// Copyright (c) 2014, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:string_scanner/src/charcode.dart';
+import 'package:string_scanner/string_scanner.dart';
+import 'package:test/test.dart';
+
+void main() {
+ late LineScanner scanner;
+ setUp(() {
+ scanner = LineScanner('foo\nbar\r\nbaz');
+ });
+
+ test('begins with line and column 0', () {
+ expect(scanner.line, equals(0));
+ expect(scanner.column, equals(0));
+ });
+
+ group('scan()', () {
+ test('consuming no newlines increases the column but not the line', () {
+ scanner.expect('foo');
+ expect(scanner.line, equals(0));
+ expect(scanner.column, equals(3));
+ });
+
+ test('consuming a LF resets the column and increases the line', () {
+ scanner.expect('foo\nba');
+ expect(scanner.line, equals(1));
+ expect(scanner.column, equals(2));
+ });
+
+ test('consuming multiple LFs resets the column and increases the line', () {
+ scanner.expect('foo\nbar\r\nb');
+ expect(scanner.line, equals(2));
+ expect(scanner.column, equals(1));
+ });
+
+ test('consuming a CR LF increases the line only after the LF', () {
+ scanner.expect('foo\nbar\r');
+ expect(scanner.line, equals(1));
+ expect(scanner.column, equals(4));
+
+ scanner.expect('\nb');
+ expect(scanner.line, equals(2));
+ expect(scanner.column, equals(1));
+ });
+
+ test('consuming a CR not followed by LF increases the line', () {
+ scanner = LineScanner('foo\nbar\rbaz');
+ scanner.expect('foo\nbar\r');
+ expect(scanner.line, equals(2));
+ expect(scanner.column, equals(0));
+
+ scanner.expect('b');
+ expect(scanner.line, equals(2));
+ expect(scanner.column, equals(1));
+ });
+
+ test('consuming a CR at the end increases the line', () {
+ scanner = LineScanner('foo\nbar\r');
+ scanner.expect('foo\nbar\r');
+ expect(scanner.line, equals(2));
+ expect(scanner.column, equals(0));
+ expect(scanner.isDone, isTrue);
+ });
+
+ test('consuming a mix of CR, LF, CR+LF increases the line', () {
+ scanner = LineScanner('0\n1\r2\r\n3');
+ scanner.expect('0\n1\r2\r\n3');
+ expect(scanner.line, equals(3));
+ expect(scanner.column, equals(1));
+ });
+
+ test('scanning a zero length match between CR LF does not fail', () {
+ scanner.expect('foo\nbar\r');
+ expect(scanner.line, equals(1));
+ expect(scanner.column, equals(4));
+ scanner.expect(RegExp('(?!x)'));
+ expect(scanner.line, equals(1));
+ expect(scanner.column, equals(4));
+ });
+ });
+
+ group('readChar()', () {
+ test('on a non-newline character increases the column but not the line',
+ () {
+ scanner.readChar();
+ expect(scanner.line, equals(0));
+ expect(scanner.column, equals(1));
+ });
+
+ test('consuming a LF resets the column and increases the line', () {
+ scanner.expect('foo');
+ expect(scanner.line, equals(0));
+ expect(scanner.column, equals(3));
+
+ scanner.readChar();
+ expect(scanner.line, equals(1));
+ expect(scanner.column, equals(0));
+ });
+
+ test('consuming a CR LF increases the line only after the LF', () {
+ scanner = LineScanner('foo\r\nbar');
+ scanner.expect('foo');
+ expect(scanner.line, equals(0));
+ expect(scanner.column, equals(3));
+
+ scanner.readChar();
+ expect(scanner.line, equals(0));
+ expect(scanner.column, equals(4));
+
+ scanner.readChar();
+ expect(scanner.line, equals(1));
+ expect(scanner.column, equals(0));
+ });
+
+ test('consuming a CR not followed by a LF increases the line', () {
+ scanner = LineScanner('foo\nbar\rbaz');
+ scanner.expect('foo\nbar');
+ expect(scanner.line, equals(1));
+ expect(scanner.column, equals(3));
+
+ scanner.readChar();
+ expect(scanner.line, equals(2));
+ expect(scanner.column, equals(0));
+ });
+
+ test('consuming a CR at the end increases the line', () {
+ scanner = LineScanner('foo\nbar\r');
+ scanner.expect('foo\nbar');
+ expect(scanner.line, equals(1));
+ expect(scanner.column, equals(3));
+
+ scanner.readChar();
+ expect(scanner.line, equals(2));
+ expect(scanner.column, equals(0));
+ });
+
+ test('consuming a mix of CR, LF, CR+LF increases the line', () {
+ scanner = LineScanner('0\n1\r2\r\n3');
+ for (var i = 0; i < scanner.string.length; i++) {
+ scanner.readChar();
+ }
+
+ expect(scanner.line, equals(3));
+ expect(scanner.column, equals(1));
+ });
+ });
+
+ group('readCodePoint()', () {
+ test('on a non-newline character increases the column but not the line',
+ () {
+ scanner.readCodePoint();
+ expect(scanner.line, equals(0));
+ expect(scanner.column, equals(1));
+ });
+
+ test('consuming a newline resets the column and increases the line', () {
+ scanner.expect('foo');
+ expect(scanner.line, equals(0));
+ expect(scanner.column, equals(3));
+
+ scanner.readCodePoint();
+ expect(scanner.line, equals(1));
+ expect(scanner.column, equals(0));
+ });
+
+ test("consuming halfway through a CR LF doesn't count as a line", () {
+ scanner.expect('foo\nbar');
+ expect(scanner.line, equals(1));
+ expect(scanner.column, equals(3));
+
+ scanner.readCodePoint();
+ expect(scanner.line, equals(1));
+ expect(scanner.column, equals(4));
+
+ scanner.readCodePoint();
+ expect(scanner.line, equals(2));
+ expect(scanner.column, equals(0));
+ });
+ });
+
+ group('scanChar()', () {
+ test('on a non-newline character increases the column but not the line',
+ () {
+ scanner.scanChar($f);
+ expect(scanner.line, equals(0));
+ expect(scanner.column, equals(1));
+ });
+
+ test('consuming a LF resets the column and increases the line', () {
+ scanner.expect('foo');
+ expect(scanner.line, equals(0));
+ expect(scanner.column, equals(3));
+
+ scanner.scanChar($lf);
+ expect(scanner.line, equals(1));
+ expect(scanner.column, equals(0));
+ });
+
+ test('consuming a CR LF increases the line only after the LF', () {
+ scanner.expect('foo\nbar');
+ expect(scanner.line, equals(1));
+ expect(scanner.column, equals(3));
+
+ scanner.scanChar($cr);
+ expect(scanner.line, equals(1));
+ expect(scanner.column, equals(4));
+
+ scanner.scanChar($lf);
+ expect(scanner.line, equals(2));
+ expect(scanner.column, equals(0));
+ });
+
+ test('consuming a CR not followed by LF increases the line', () {
+ scanner = LineScanner('foo\rbar');
+ scanner.expect('foo');
+ expect(scanner.line, equals(0));
+ expect(scanner.column, equals(3));
+
+ scanner.scanChar($cr);
+ expect(scanner.line, equals(1));
+ expect(scanner.column, equals(0));
+ });
+
+ test('consuming a CR at the end increases the line', () {
+ scanner = LineScanner('foo\r');
+ scanner.expect('foo');
+ expect(scanner.line, equals(0));
+ expect(scanner.column, equals(3));
+
+ scanner.scanChar($cr);
+ expect(scanner.line, equals(1));
+ expect(scanner.column, equals(0));
+ });
+
+ test('consuming a mix of CR, LF, CR+LF increases the line', () {
+ scanner = LineScanner('0\n1\r2\r\n3');
+ for (var i = 0; i < scanner.string.length; i++) {
+ scanner.scanChar(scanner.string[i].codeUnits.single);
+ }
+
+ expect(scanner.line, equals(3));
+ expect(scanner.column, equals(1));
+ });
+ });
+
+ group('before a surrogate pair', () {
+ final codePoint = '\uD83D\uDC6D'.runes.first;
+ const highSurrogate = 0xD83D;
+
+ late LineScanner scanner;
+ setUp(() {
+ scanner = LineScanner('foo: \uD83D\uDC6D');
+ expect(scanner.scan('foo: '), isTrue);
+ });
+
+ test('readChar returns the high surrogate and moves into the pair', () {
+ expect(scanner.readChar(), equals(highSurrogate));
+ expect(scanner.line, equals(0));
+ expect(scanner.column, equals(6));
+ expect(scanner.position, equals(6));
+ });
+
+    test('readCodePoint returns the code point and moves past the pair', () {
+ expect(scanner.readCodePoint(), equals(codePoint));
+ expect(scanner.line, equals(0));
+ expect(scanner.column, equals(7));
+ expect(scanner.position, equals(7));
+ });
+
+ test('scanChar with the high surrogate moves into the pair', () {
+ expect(scanner.scanChar(highSurrogate), isTrue);
+ expect(scanner.line, equals(0));
+ expect(scanner.column, equals(6));
+ expect(scanner.position, equals(6));
+ });
+
+ test('scanChar with the code point moves past the pair', () {
+ expect(scanner.scanChar(codePoint), isTrue);
+ expect(scanner.line, equals(0));
+ expect(scanner.column, equals(7));
+ expect(scanner.position, equals(7));
+ });
+
+ test('expectChar with the high surrogate moves into the pair', () {
+ scanner.expectChar(highSurrogate);
+ expect(scanner.line, equals(0));
+ expect(scanner.column, equals(6));
+ expect(scanner.position, equals(6));
+ });
+
+ test('expectChar with the code point moves past the pair', () {
+ scanner.expectChar(codePoint);
+ expect(scanner.line, equals(0));
+ expect(scanner.column, equals(7));
+ expect(scanner.position, equals(7));
+ });
+ });
+
+ group('position=', () {
+ test('forward through LFs sets the line and column', () {
+ scanner = LineScanner('foo\nbar\nbaz');
+ scanner.position = 9; // "foo\nbar\nb"
+ expect(scanner.line, equals(2));
+ expect(scanner.column, equals(1));
+ });
+
+ test('forward from non-zero character through LFs sets the line and column',
+ () {
+ scanner = LineScanner('foo\nbar\nbaz');
+ scanner.expect('fo');
+ scanner.position = 9; // "foo\nbar\nb"
+ expect(scanner.line, equals(2));
+ expect(scanner.column, equals(1));
+ });
+
+ test('forward through CR LFs sets the line and column', () {
+ scanner = LineScanner('foo\r\nbar\r\nbaz');
+ scanner.position = 11; // "foo\r\nbar\r\nb"
+ expect(scanner.line, equals(2));
+ expect(scanner.column, equals(1));
+ });
+
+ test('forward through CR not followed by LFs sets the line and column', () {
+ scanner = LineScanner('foo\rbar\rbaz');
+ scanner.position = 9; // "foo\rbar\rb"
+ expect(scanner.line, equals(2));
+ expect(scanner.column, equals(1));
+ });
+
+ test('forward through CR at end sets the line and column', () {
+ scanner = LineScanner('foo\rbar\r');
+ scanner.position = 8; // "foo\rbar\r"
+ expect(scanner.line, equals(2));
+ expect(scanner.column, equals(0));
+ });
+
+ test('forward through a mix of CR, LF, CR+LF sets the line and column', () {
+ scanner = LineScanner('0\n1\r2\r\n3');
+ scanner.position = scanner.string.length;
+
+ expect(scanner.line, equals(3));
+ expect(scanner.column, equals(1));
+ });
+
+ test('forward through no newlines sets the column', () {
+ scanner.position = 2; // "fo"
+ expect(scanner.line, equals(0));
+ expect(scanner.column, equals(2));
+ });
+
+ test('backward through LFs sets the line and column', () {
+ scanner = LineScanner('foo\nbar\nbaz');
+ scanner.expect('foo\nbar\nbaz');
+ scanner.position = 2; // "fo"
+ expect(scanner.line, equals(0));
+ expect(scanner.column, equals(2));
+ });
+
+ test('backward through CR LFs sets the line and column', () {
+ scanner = LineScanner('foo\r\nbar\r\nbaz');
+ scanner.expect('foo\r\nbar\r\nbaz');
+ scanner.position = 2; // "fo"
+ expect(scanner.line, equals(0));
+ expect(scanner.column, equals(2));
+ });
+
+ test('backward through CR not followed by LFs sets the line and column',
+ () {
+ scanner = LineScanner('foo\rbar\rbaz');
+ scanner.expect('foo\rbar\rbaz');
+ scanner.position = 2; // "fo"
+ expect(scanner.line, equals(0));
+ expect(scanner.column, equals(2));
+ });
+
+ test('backward through CR at end sets the line and column', () {
+ scanner = LineScanner('foo\rbar\r');
+ scanner.expect('foo\rbar\r');
+ scanner.position = 2; // "fo"
+ expect(scanner.line, equals(0));
+ expect(scanner.column, equals(2));
+ });
+
+ test('backward through a mix of CR, LF, CR+LF sets the line and column',
+ () {
+ scanner = LineScanner('0\n1\r2\r\n3');
+ scanner.expect(scanner.string);
+
+ scanner.position = 1;
+ expect(scanner.line, equals(0));
+ expect(scanner.column, equals(1));
+ });
+
+ test('backward through no newlines sets the column', () {
+ scanner.expect('foo\nbar\r\nbaz');
+ scanner.position = 10; // "foo\nbar\r\nb"
+ expect(scanner.line, equals(2));
+ expect(scanner.column, equals(1));
+ });
+
+ test("forward halfway through a CR LF doesn't count as a line", () {
+ scanner.position = 8; // "foo\nbar\r"
+ expect(scanner.line, equals(1));
+ expect(scanner.column, equals(4));
+ });
+
+ test('forward from halfway through a CR LF counts as a line', () {
+ scanner.expect('foo\nbar\r');
+ scanner.position = 11; // "foo\nbar\r\nba"
+ expect(scanner.line, equals(2));
+ expect(scanner.column, equals(2));
+ });
+
+ test('backward to between CR LF', () {
+ scanner.expect('foo\nbar\r\nbaz');
+ scanner.position = 8; // "foo\nbar\r"
+ expect(scanner.line, equals(1));
+ expect(scanner.column, equals(4));
+ });
+
+ test('backward from between CR LF', () {
+ scanner.expect('foo\nbar\r');
+ expect(scanner.line, equals(1));
+ expect(scanner.column, equals(4));
+ scanner.position = 5; // "foo\nb"
+ expect(scanner.line, equals(1));
+ expect(scanner.column, equals(1));
+ });
+
+ test('backward to after CR LF', () {
+ scanner.expect('foo\nbar\r\nbaz');
+ scanner.position = 9; // "foo\nbar\r\n"
+ expect(scanner.line, equals(2));
+ expect(scanner.column, equals(0));
+ });
+
+ test('backward to before CR LF', () {
+ scanner.expect('foo\nbar\r\nbaz');
+ scanner.position = 7; // "foo\nbar"
+ expect(scanner.line, equals(1));
+ expect(scanner.column, equals(3));
+ });
+ });
+
+ test('state= restores the line, column, and position', () {
+ scanner.expect('foo\nb');
+ final state = scanner.state;
+
+ scanner.scan('ar\nba');
+ scanner.state = state;
+ expect(scanner.rest, equals('ar\r\nbaz'));
+ expect(scanner.line, equals(1));
+ expect(scanner.column, equals(1));
+ });
+
+ test('state= rejects a foreign state', () {
+ scanner.scan('foo\nb');
+
+ expect(() => LineScanner(scanner.string).state = scanner.state,
+ throwsArgumentError);
+ });
+}
diff --git a/pkgs/string_scanner/test/span_scanner_test.dart b/pkgs/string_scanner/test/span_scanner_test.dart
new file mode 100644
index 0000000..93d9c47
--- /dev/null
+++ b/pkgs/string_scanner/test/span_scanner_test.dart
@@ -0,0 +1,238 @@
+// Copyright (c) 2014, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:source_span/source_span.dart';
+import 'package:string_scanner/string_scanner.dart';
+import 'package:test/test.dart';
+
+import 'utils.dart';
+
+void main() {
+ testForImplementation(
+ 'lazy',
+ ([String? string]) =>
+ SpanScanner(string ?? 'foo\nbar\nbaz', sourceUrl: 'source'));
+
+ testForImplementation(
+ 'eager',
+ ([String? string]) =>
+ SpanScanner.eager(string ?? 'foo\nbar\nbaz', sourceUrl: 'source'));
+
+ group('within', () {
+ const text = 'first\nbefore: foo\nbar\nbaz :after\nlast';
+ final startOffset = text.indexOf('foo');
+
+ late SpanScanner scanner;
+ setUp(() {
+ final file = SourceFile.fromString(text, url: 'source');
+ scanner =
+ SpanScanner.within(file.span(startOffset, text.indexOf(' :after')));
+ });
+
+ test('string only includes the span text', () {
+ expect(scanner.string, equals('foo\nbar\nbaz'));
+ });
+
+ test('line and column are span-relative', () {
+ expect(scanner.line, equals(0));
+ expect(scanner.column, equals(0));
+
+ scanner.scan('foo');
+ expect(scanner.line, equals(0));
+ expect(scanner.column, equals(3));
+
+ scanner.scan('\n');
+ expect(scanner.line, equals(1));
+ expect(scanner.column, equals(0));
+ });
+
+ test('tracks the span for the last match', () {
+ scanner.scan('fo');
+ scanner.scan('o\nba');
+
+ final span = scanner.lastSpan!;
+ expect(span.start.offset, equals(startOffset + 2));
+ expect(span.start.line, equals(1));
+ expect(span.start.column, equals(10));
+ expect(span.start.sourceUrl, equals(Uri.parse('source')));
+
+ expect(span.end.offset, equals(startOffset + 6));
+ expect(span.end.line, equals(2));
+ expect(span.end.column, equals(2));
+ expect(span.start.sourceUrl, equals(Uri.parse('source')));
+
+ expect(span.text, equals('o\nba'));
+ });
+
+ test('.spanFrom() returns a span from a previous state', () {
+ scanner.scan('fo');
+ final state = scanner.state;
+ scanner.scan('o\nba');
+ scanner.scan('r\nba');
+
+ final span = scanner.spanFrom(state);
+ expect(span.text, equals('o\nbar\nba'));
+ });
+
+ test('.spanFromPosition() returns a span between the given positions', () {
+ scanner.scan('fo');
+ final start = scanner.position;
+ scanner.scan('o\nba');
+ scanner.scan('r\nba');
+
+ final span = scanner.spanFromPosition(start + 2, start + 5);
+ expect(span.text, equals('bar'));
+ });
+
+ test('.emptySpan returns an empty span at the current location', () {
+ scanner.scan('foo\nba');
+
+ final span = scanner.emptySpan;
+ expect(span.start.offset, equals(startOffset + 6));
+ expect(span.start.line, equals(2));
+ expect(span.start.column, equals(2));
+ expect(span.start.sourceUrl, equals(Uri.parse('source')));
+
+ expect(span.end.offset, equals(startOffset + 6));
+ expect(span.end.line, equals(2));
+ expect(span.end.column, equals(2));
+ expect(span.start.sourceUrl, equals(Uri.parse('source')));
+
+ expect(span.text, equals(''));
+ });
+
+ test('.error() uses an absolute span', () {
+ scanner.expect('foo');
+ expect(
+ () => scanner.error('oh no!'), throwsStringScannerException('foo'));
+ });
+
+ test('.isDone returns true at the end of the span', () {
+ scanner.expect('foo\nbar\nbaz');
+ expect(scanner.isDone, isTrue);
+ });
+ });
+}
+
+void testForImplementation(
+ String name, SpanScanner Function([String string]) create) {
+ group('for a $name scanner', () {
+ late SpanScanner scanner;
+ setUp(() => scanner = create());
+
+ test('tracks the span for the last match', () {
+ scanner.scan('fo');
+ scanner.scan('o\nba');
+
+ final span = scanner.lastSpan!;
+ expect(span.start.offset, equals(2));
+ expect(span.start.line, equals(0));
+ expect(span.start.column, equals(2));
+ expect(span.start.sourceUrl, equals(Uri.parse('source')));
+
+ expect(span.end.offset, equals(6));
+ expect(span.end.line, equals(1));
+ expect(span.end.column, equals(2));
+ expect(span.start.sourceUrl, equals(Uri.parse('source')));
+
+ expect(span.text, equals('o\nba'));
+ });
+
+ test('.spanFrom() returns a span from a previous state', () {
+ scanner.scan('fo');
+ final state = scanner.state;
+ scanner.scan('o\nba');
+ scanner.scan('r\nba');
+
+ final span = scanner.spanFrom(state);
+ expect(span.text, equals('o\nbar\nba'));
+ });
+
+ test('.spanFromPosition() returns a span between the given positions', () {
+ scanner.scan('fo');
+ final start = scanner.position;
+ scanner.scan('o\nba');
+ scanner.scan('r\nba');
+
+ final span = scanner.spanFromPosition(start + 2, start + 5);
+ expect(span.text, equals('bar'));
+ });
+
+ test('.emptySpan returns an empty span at the current location', () {
+ scanner.scan('foo\nba');
+
+ final span = scanner.emptySpan;
+ expect(span.start.offset, equals(6));
+ expect(span.start.line, equals(1));
+ expect(span.start.column, equals(2));
+ expect(span.start.sourceUrl, equals(Uri.parse('source')));
+
+ expect(span.end.offset, equals(6));
+ expect(span.end.line, equals(1));
+ expect(span.end.column, equals(2));
+ expect(span.start.sourceUrl, equals(Uri.parse('source')));
+
+ expect(span.text, equals(''));
+ });
+
+ group('before a surrogate pair', () {
+ final codePoint = '\uD83D\uDC6D'.runes.first;
+ const highSurrogate = 0xD83D;
+
+ late SpanScanner scanner;
+ setUp(() {
+ scanner = create('foo: \uD83D\uDC6D bar');
+ expect(scanner.scan('foo: '), isTrue);
+ });
+
+ test('readChar returns the high surrogate and moves into the pair', () {
+ expect(scanner.readChar(), equals(highSurrogate));
+ expect(scanner.line, equals(0));
+ expect(scanner.column, equals(6));
+ expect(scanner.position, equals(6));
+ });
+
+ test('readCodePoint returns the code point and moves past the pair', () {
+ expect(scanner.readCodePoint(), equals(codePoint));
+ expect(scanner.line, equals(0));
+ expect(scanner.column, equals(7));
+ expect(scanner.position, equals(7));
+ });
+
+ test('scanChar with the high surrogate moves into the pair', () {
+ expect(scanner.scanChar(highSurrogate), isTrue);
+ expect(scanner.line, equals(0));
+ expect(scanner.column, equals(6));
+ expect(scanner.position, equals(6));
+ });
+
+ test('scanChar with the code point moves past the pair', () {
+ expect(scanner.scanChar(codePoint), isTrue);
+ expect(scanner.line, equals(0));
+ expect(scanner.column, equals(7));
+ expect(scanner.position, equals(7));
+ });
+
+ test('expectChar with the high surrogate moves into the pair', () {
+ scanner.expectChar(highSurrogate);
+ expect(scanner.line, equals(0));
+ expect(scanner.column, equals(6));
+ expect(scanner.position, equals(6));
+ });
+
+ test('expectChar with the code point moves past the pair', () {
+ scanner.expectChar(codePoint);
+ expect(scanner.line, equals(0));
+ expect(scanner.column, equals(7));
+ expect(scanner.position, equals(7));
+ });
+
+ test('spanFrom covers the surrogate pair', () {
+ final state = scanner.state;
+ scanner.scan('\uD83D\uDC6D b');
+ expect(scanner.spanFrom(state).text, equals('\uD83D\uDC6D b'));
+ });
+ });
+ });
+}
diff --git a/pkgs/string_scanner/test/string_scanner_test.dart b/pkgs/string_scanner/test/string_scanner_test.dart
new file mode 100644
index 0000000..36a737e
--- /dev/null
+++ b/pkgs/string_scanner/test/string_scanner_test.dart
@@ -0,0 +1,564 @@
+// Copyright (c) 2014, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:string_scanner/src/charcode.dart';
+import 'package:string_scanner/string_scanner.dart';
+import 'package:test/test.dart';
+
+void main() {
+ group('with an empty string', () {
+ late StringScanner scanner;
+ setUp(() {
+ scanner = StringScanner('');
+ });
+
+ test('is done', () {
+ expect(scanner.isDone, isTrue);
+ expect(scanner.expectDone, isNot(throwsFormatException));
+ });
+
+ test('rest is empty', () {
+ expect(scanner.rest, isEmpty);
+ });
+
+ test('lastMatch is null', () {
+ expect(scanner.lastMatch, isNull);
+ });
+
+ test('position is zero', () {
+ expect(scanner.position, equals(0));
+ });
+
+ test("readChar fails and doesn't change the state", () {
+ expect(scanner.readChar, throwsFormatException);
+ expect(scanner.lastMatch, isNull);
+ expect(scanner.position, equals(0));
+ });
+
+ test("readCodePoint fails and doesn't change the state", () {
+ expect(scanner.readCodePoint, throwsFormatException);
+ expect(scanner.lastMatch, isNull);
+ expect(scanner.position, equals(0));
+ });
+
+ test("peekChar returns null and doesn't change the state", () {
+ expect(scanner.peekChar(), isNull);
+ expect(scanner.lastMatch, isNull);
+ expect(scanner.position, equals(0));
+ });
+
+ test("peekCodePoint returns null and doesn't change the state", () {
+ expect(scanner.peekCodePoint(), isNull);
+ expect(scanner.lastMatch, isNull);
+ expect(scanner.position, equals(0));
+ });
+
+ test("scanChar returns false and doesn't change the state", () {
+ expect(scanner.scanChar($f), isFalse);
+ expect(scanner.lastMatch, isNull);
+ expect(scanner.position, equals(0));
+ });
+
+ test("expectChar fails and doesn't change the state", () {
+ expect(() => scanner.expectChar($f), throwsFormatException);
+ expect(scanner.lastMatch, isNull);
+ expect(scanner.position, equals(0));
+ });
+
+ test("scan returns false and doesn't change the state", () {
+ expect(scanner.scan(RegExp('.')), isFalse);
+ expect(scanner.lastMatch, isNull);
+ expect(scanner.position, equals(0));
+ });
+
+ test("expect throws a FormatException and doesn't change the state", () {
+ expect(() => scanner.expect(RegExp('.')), throwsFormatException);
+ expect(scanner.lastMatch, isNull);
+ expect(scanner.position, equals(0));
+ });
+
+ test("matches returns false and doesn't change the state", () {
+ expect(scanner.matches(RegExp('.')), isFalse);
+ expect(scanner.lastMatch, isNull);
+ expect(scanner.position, equals(0));
+ });
+
+ test('substring returns the empty string', () {
+ expect(scanner.substring(0), isEmpty);
+ });
+
+ test('setting position to 1 throws an ArgumentError', () {
+ expect(() {
+ scanner.position = 1;
+ }, throwsArgumentError);
+ });
+
+ test('setting position to -1 throws an ArgumentError', () {
+ expect(() {
+ scanner.position = -1;
+ }, throwsArgumentError);
+ });
+ });
+
+ group('at the beginning of a string', () {
+ late StringScanner scanner;
+ setUp(() {
+ scanner = StringScanner('foo bar');
+ });
+
+ test('is not done', () {
+ expect(scanner.isDone, isFalse);
+ expect(scanner.expectDone, throwsFormatException);
+ });
+
+ test('rest is the whole string', () {
+ expect(scanner.rest, equals('foo bar'));
+ });
+
+ test('lastMatch is null', () {
+ expect(scanner.lastMatch, isNull);
+ });
+
+ test('position is zero', () {
+ expect(scanner.position, equals(0));
+ });
+
+ test('readChar returns the first character and moves forward', () {
+ expect(scanner.readChar(), equals(0x66));
+ expect(scanner.lastMatch, isNull);
+ expect(scanner.position, equals(1));
+ });
+
+ test('readCodePoint returns the first character and moves forward', () {
+ expect(scanner.readCodePoint(), equals(0x66));
+ expect(scanner.lastMatch, isNull);
+ expect(scanner.position, equals(1));
+ });
+
+ test('peekChar returns the first character', () {
+ expect(scanner.peekChar(), equals(0x66));
+ expect(scanner.lastMatch, isNull);
+ expect(scanner.position, equals(0));
+ });
+
+ test('peekChar with an argument returns the nth character', () {
+ expect(scanner.peekChar(4), equals(0x62));
+ expect(scanner.lastMatch, isNull);
+ expect(scanner.position, equals(0));
+ });
+
+ test('peekCodePoint returns the first character', () {
+ expect(scanner.peekCodePoint(), equals(0x66));
+ expect(scanner.lastMatch, isNull);
+ expect(scanner.position, equals(0));
+ });
+
+ test('a matching scanChar returns true and moves forward', () {
+ expect(scanner.scanChar($f), isTrue);
+ expect(scanner.lastMatch, isNull);
+ expect(scanner.position, equals(1));
+ });
+
+ test('a non-matching scanChar returns false and does nothing', () {
+ expect(scanner.scanChar($x), isFalse);
+ expect(scanner.lastMatch, isNull);
+ expect(scanner.position, equals(0));
+ });
+
+ test('a matching expectChar moves forward', () {
+ scanner.expectChar($f);
+ expect(scanner.lastMatch, isNull);
+ expect(scanner.position, equals(1));
+ });
+
+ test('a non-matching expectChar fails', () {
+ expect(() => scanner.expectChar($x), throwsFormatException);
+ expect(scanner.lastMatch, isNull);
+ expect(scanner.position, equals(0));
+ });
+
+ test('a matching scan returns true and changes the state', () {
+ expect(scanner.scan(RegExp('f(..)')), isTrue);
+ expect(scanner.lastMatch![1], equals('oo'));
+ expect(scanner.position, equals(3));
+ expect(scanner.rest, equals(' bar'));
+ });
+
+ test('a non-matching scan returns false and sets lastMatch to null', () {
+ expect(scanner.matches(RegExp('f(..)')), isTrue);
+ expect(scanner.lastMatch, isNotNull);
+
+ expect(scanner.scan(RegExp('b(..)')), isFalse);
+ expect(scanner.lastMatch, isNull);
+ expect(scanner.position, equals(0));
+ expect(scanner.rest, equals('foo bar'));
+ });
+
+ test('a matching expect changes the state', () {
+ scanner.expect(RegExp('f(..)'));
+ expect(scanner.lastMatch![1], equals('oo'));
+ expect(scanner.position, equals(3));
+ expect(scanner.rest, equals(' bar'));
+ });
+
+ test(
+ 'a non-matching expect throws a FormatException and sets lastMatch to '
+ 'null', () {
+ expect(scanner.matches(RegExp('f(..)')), isTrue);
+ expect(scanner.lastMatch, isNotNull);
+
+ expect(() => scanner.expect(RegExp('b(..)')), throwsFormatException);
+ expect(scanner.lastMatch, isNull);
+ expect(scanner.position, equals(0));
+ expect(scanner.rest, equals('foo bar'));
+ });
+
+ test('a matching matches returns true and only changes lastMatch', () {
+ expect(scanner.matches(RegExp('f(..)')), isTrue);
+ expect(scanner.lastMatch![1], equals('oo'));
+ expect(scanner.position, equals(0));
+ expect(scanner.rest, equals('foo bar'));
+ });
+
+ test("a non-matching matches returns false and doesn't change the state",
+ () {
+ expect(scanner.matches(RegExp('b(..)')), isFalse);
+ expect(scanner.lastMatch, isNull);
+ expect(scanner.position, equals(0));
+ expect(scanner.rest, equals('foo bar'));
+ });
+
+ test('substring from the beginning returns the empty string', () {
+ expect(scanner.substring(0), isEmpty);
+ });
+
+ test('substring with a custom end returns the substring', () {
+ expect(scanner.substring(0, 3), equals('foo'));
+ });
+
+ test('substring with the string length returns the whole string', () {
+ expect(scanner.substring(0, 7), equals('foo bar'));
+ });
+
+ test('setting position to 1 moves the cursor forward', () {
+ scanner.position = 1;
+ expect(scanner.position, equals(1));
+ expect(scanner.rest, equals('oo bar'));
+
+ expect(scanner.scan(RegExp('oo.')), isTrue);
+ expect(scanner.lastMatch![0], equals('oo '));
+ expect(scanner.position, equals(4));
+ expect(scanner.rest, equals('bar'));
+ });
+
+ test('setting position beyond the string throws an ArgumentError', () {
+ expect(() {
+ scanner.position = 8;
+ }, throwsArgumentError);
+ });
+
+ test('setting position to -1 throws an ArgumentError', () {
+ expect(() {
+ scanner.position = -1;
+ }, throwsArgumentError);
+ });
+
+ test('scan accepts any Pattern', () {
+ expect(scanner.scan('foo'), isTrue);
+ expect(scanner.lastMatch![0], equals('foo'));
+ expect(scanner.position, equals(3));
+ expect(scanner.rest, equals(' bar'));
+ });
+
+ test('scans multiple times', () {
+ expect(scanner.scan(RegExp('f(..)')), isTrue);
+ expect(scanner.lastMatch![1], equals('oo'));
+ expect(scanner.position, equals(3));
+ expect(scanner.rest, equals(' bar'));
+
+ expect(scanner.scan(RegExp(' b(..)')), isTrue);
+ expect(scanner.lastMatch![1], equals('ar'));
+ expect(scanner.position, equals(7));
+ expect(scanner.rest, equals(''));
+ expect(scanner.isDone, isTrue);
+ expect(scanner.expectDone, isNot(throwsFormatException));
+ });
+ });
+
+ group('after a scan', () {
+ late StringScanner scanner;
+ setUp(() {
+ scanner = StringScanner('foo bar');
+ expect(scanner.scan('foo'), isTrue);
+ });
+
+ test('readChar returns the first character and unsets the last match', () {
+ expect(scanner.readChar(), equals($space));
+ expect(scanner.lastMatch, isNull);
+ expect(scanner.position, equals(4));
+ });
+
+ test('readCodePoint returns the first character and unsets the last match',
+ () {
+ expect(scanner.readCodePoint(), equals($space));
+ expect(scanner.lastMatch, isNull);
+ expect(scanner.position, equals(4));
+ });
+
+ test('a matching scanChar returns true and unsets the last match', () {
+ expect(scanner.scanChar($space), isTrue);
+ expect(scanner.lastMatch, isNull);
+ expect(scanner.position, equals(4));
+ });
+
+ test('a matching expectChar returns true and unsets the last match', () {
+ scanner.expectChar($space);
+ expect(scanner.lastMatch, isNull);
+ expect(scanner.position, equals(4));
+ });
+ });
+
+ group('at the end of a string', () {
+ late StringScanner scanner;
+ setUp(() {
+ scanner = StringScanner('foo bar');
+ expect(scanner.scan('foo bar'), isTrue);
+ });
+
+ test('is done', () {
+ expect(scanner.isDone, isTrue);
+ expect(scanner.expectDone, isNot(throwsFormatException));
+ });
+
+ test('rest is empty', () {
+ expect(scanner.rest, isEmpty);
+ });
+
+ test('position is the length of the string', () {
+ expect(scanner.position, equals(7));
+ });
+
+ test("readChar fails and doesn't change the state", () {
+ expect(scanner.readChar, throwsFormatException);
+ expect(scanner.lastMatch, isNotNull);
+ expect(scanner.position, equals(7));
+ });
+
+ test("readCodePoint fails and doesn't change the state", () {
+ expect(scanner.readCodePoint, throwsFormatException);
+ expect(scanner.lastMatch, isNotNull);
+ expect(scanner.position, equals(7));
+ });
+
+ test("peekChar returns null and doesn't change the state", () {
+ expect(scanner.peekChar(), isNull);
+ expect(scanner.lastMatch, isNotNull);
+ expect(scanner.position, equals(7));
+ });
+
+ test("peekCodePoint returns null and doesn't change the state", () {
+ expect(scanner.peekCodePoint(), isNull);
+ expect(scanner.lastMatch, isNotNull);
+ expect(scanner.position, equals(7));
+ });
+
+ test("scanChar returns false and doesn't change the state", () {
+ expect(scanner.scanChar($f), isFalse);
+ expect(scanner.lastMatch, isNotNull);
+ expect(scanner.position, equals(7));
+ });
+
+ test("expectChar fails and doesn't change the state", () {
+ expect(() => scanner.expectChar($f), throwsFormatException);
+ expect(scanner.lastMatch, isNotNull);
+ expect(scanner.position, equals(7));
+ });
+
+ test('scan returns false and sets lastMatch to null', () {
+ expect(scanner.scan(RegExp('.')), isFalse);
+ expect(scanner.lastMatch, isNull);
+ expect(scanner.position, equals(7));
+ });
+
+ test('expect throws a FormatException and sets lastMatch to null', () {
+ expect(() => scanner.expect(RegExp('.')), throwsFormatException);
+ expect(scanner.lastMatch, isNull);
+ expect(scanner.position, equals(7));
+ });
+
+ test('matches returns false and sets lastMatch to null', () {
+ expect(scanner.matches(RegExp('.')), isFalse);
+ expect(scanner.lastMatch, isNull);
+ expect(scanner.position, equals(7));
+ });
+
+ test('substring from the beginning returns the whole string', () {
+ expect(scanner.substring(0), equals('foo bar'));
+ });
+
+ test('substring with a custom start returns a substring from there', () {
+ expect(scanner.substring(4), equals('bar'));
+ });
+
+ test('substring with a custom start and end returns that substring', () {
+ expect(scanner.substring(3, 5), equals(' b'));
+ });
+
+ test('setting position to 1 moves the cursor backward', () {
+ scanner.position = 1;
+ expect(scanner.position, equals(1));
+ expect(scanner.rest, equals('oo bar'));
+
+ expect(scanner.scan(RegExp('oo.')), isTrue);
+ expect(scanner.lastMatch![0], equals('oo '));
+ expect(scanner.position, equals(4));
+ expect(scanner.rest, equals('bar'));
+ });
+
+ test('setting and resetting position clears lastMatch', () {
+ final oldPosition = scanner.position;
+ scanner.position = 1;
+ scanner.position = oldPosition;
+ expect(scanner.lastMatch, isNull);
+ });
+
+ test('setting position beyond the string throws an ArgumentError', () {
+ expect(() {
+ scanner.position = 8;
+ }, throwsArgumentError);
+ });
+
+ test('setting position to -1 throws an ArgumentError', () {
+ expect(() {
+ scanner.position = -1;
+ }, throwsArgumentError);
+ });
+ });
+
+ group('before a surrogate pair', () {
+ final codePoint = '\uD83D\uDC6D'.runes.first;
+ const highSurrogate = 0xD83D;
+
+ late StringScanner scanner;
+ setUp(() {
+ scanner = StringScanner('foo: \uD83D\uDC6D');
+ expect(scanner.scan('foo: '), isTrue);
+ });
+
+ test('readChar returns the high surrogate and moves into the pair', () {
+ expect(scanner.readChar(), equals(highSurrogate));
+ expect(scanner.position, equals(6));
+ });
+
+ test('readCodePoint returns the code point and moves past the pair', () {
+ expect(scanner.readCodePoint(), equals(codePoint));
+ expect(scanner.position, equals(7));
+ });
+
+ test('peekChar returns the high surrogate', () {
+ expect(scanner.peekChar(), equals(highSurrogate));
+ expect(scanner.position, equals(5));
+ });
+
+ test('peekCodePoint returns the code unit', () {
+ expect(scanner.peekCodePoint(), equals(codePoint));
+ expect(scanner.position, equals(5));
+ });
+
+ test('scanChar with the high surrogate moves into the pair', () {
+ expect(scanner.scanChar(highSurrogate), isTrue);
+ expect(scanner.position, equals(6));
+ });
+
+ test('scanChar with the code point moves past the pair', () {
+ expect(scanner.scanChar(codePoint), isTrue);
+ expect(scanner.position, equals(7));
+ });
+
+ test('expectChar with the high surrogate moves into the pair', () {
+ scanner.expectChar(highSurrogate);
+ expect(scanner.position, equals(6));
+ });
+
+ test('expectChar with the code point moves past the pair', () {
+ scanner.expectChar(codePoint);
+ expect(scanner.position, equals(7));
+ });
+ });
+
+ group('before an invalid surrogate pair', () {
+ // This surrogate pair is invalid because U+E000 is just outside the range
+ // of low surrogates. If it were interpreted as a surrogate pair anyway, the
+ // value would be U+110000, which is outside of the Unicode gamut.
+ const codePoint = 0x110000;
+ const highSurrogate = 0xD800;
+
+ late StringScanner scanner;
+ setUp(() {
+ scanner = StringScanner('foo: \uD800\uE000');
+ expect(scanner.scan('foo: '), isTrue);
+ });
+
+ test('readChar returns the high surrogate and moves into the pair', () {
+ expect(scanner.readChar(), equals(highSurrogate));
+ expect(scanner.position, equals(6));
+ });
+
+ test('readCodePoint returns the high surrogate and moves into the pair',
+ () {
+ expect(scanner.readCodePoint(), equals(highSurrogate));
+ expect(scanner.position, equals(6));
+ });
+
+ test('peekChar returns the high surrogate', () {
+ expect(scanner.peekChar(), equals(highSurrogate));
+ expect(scanner.position, equals(5));
+ });
+
+ test('peekCodePoint returns the high surrogate', () {
+ expect(scanner.peekCodePoint(), equals(highSurrogate));
+ expect(scanner.position, equals(5));
+ });
+
+ test('scanChar with the high surrogate moves into the pair', () {
+ expect(scanner.scanChar(highSurrogate), isTrue);
+ expect(scanner.position, equals(6));
+ });
+
+ test('scanChar with the fake code point returns false', () {
+ expect(scanner.scanChar(codePoint), isFalse);
+ expect(scanner.position, equals(5));
+ });
+
+ test('expectChar with the high surrogate moves into the pair', () {
+ scanner.expectChar(highSurrogate);
+ expect(scanner.position, equals(6));
+ });
+
+ test('expectChar with the fake code point fails', () {
+ expect(() => scanner.expectChar(codePoint), throwsRangeError);
+ });
+ });
+
+ group('a scanner constructed with a custom position', () {
+ test('starts scanning from that position', () {
+ final scanner = StringScanner('foo bar', position: 1);
+ expect(scanner.position, equals(1));
+ expect(scanner.rest, equals('oo bar'));
+
+ expect(scanner.scan(RegExp('oo.')), isTrue);
+ expect(scanner.lastMatch![0], equals('oo '));
+ expect(scanner.position, equals(4));
+ expect(scanner.rest, equals('bar'));
+ });
+
+ test('throws an ArgumentError if the position is -1', () {
+ expect(() => StringScanner('foo bar', position: -1), throwsArgumentError);
+ });
+
+ test('throws an ArgumentError if the position is beyond the string', () {
+ expect(() => StringScanner('foo bar', position: 8), throwsArgumentError);
+ });
+ });
+}
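The surrogate-pair groups above turn on the difference between reading a single UTF-16 code unit and a whole code point. Below is a minimal standalone sketch of the decoding arithmetic those tests exercise, using only core Dart; `combineSurrogates` is a hypothetical helper for illustration, not an API of `package:string_scanner`.

```dart
// Standard UTF-16 surrogate-pair decoding: a high surrogate in 0xD800-0xDBFF
// followed by a low surrogate in 0xDC00-0xDFFF encodes one supplementary
// code point.
int combineSurrogates(int high, int low) =>
    0x10000 + ((high - 0xD800) << 10) + (low - 0xDC00);

void main() {
  const high = 0xD83D; // high surrogate of U+1F46D
  const low = 0xDC6D; // low surrogate of U+1F46D

  // readChar would return just `high`; readCodePoint returns the combination.
  print(combineSurrogates(high, low).toRadixString(16)); // 1f46d
  print('\uD83D\uDC6D'.runes.first.toRadixString(16)); // 1f46d
}
```

For the invalid pair in the second-to-last group, `\uE000` fails the low-surrogate range check, which is why `readCodePoint` ends up returning the lone high surrogate rather than a combined value.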
diff --git a/pkgs/string_scanner/test/utils.dart b/pkgs/string_scanner/test/utils.dart
new file mode 100644
index 0000000..ca03c06
--- /dev/null
+++ b/pkgs/string_scanner/test/utils.dart
@@ -0,0 +1,12 @@
+// Copyright (c) 2014, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:string_scanner/string_scanner.dart';
+import 'package:test/test.dart';
+
+/// Returns a matcher that asserts that a closure throws a
+/// [StringScannerException] with the given [text].
+Matcher throwsStringScannerException(String text) =>
+ throwsA(const TypeMatcher<StringScannerException>()
+ .having((e) => e.span!.text, 'span.text', text));
diff --git a/pkgs/term_glyph/.gitignore b/pkgs/term_glyph/.gitignore
new file mode 100644
index 0000000..01d42c0
--- /dev/null
+++ b/pkgs/term_glyph/.gitignore
@@ -0,0 +1,4 @@
+.dart_tool/
+.pub/
+.packages
+pubspec.lock
diff --git a/pkgs/term_glyph/AUTHORS b/pkgs/term_glyph/AUTHORS
new file mode 100644
index 0000000..e8063a8
--- /dev/null
+++ b/pkgs/term_glyph/AUTHORS
@@ -0,0 +1,6 @@
+# Below is a list of people and organizations that have contributed
+# to the project. Names should be added to the list like so:
+#
+# Name/Organization <email address>
+
+Google Inc.
diff --git a/pkgs/term_glyph/CHANGELOG.md b/pkgs/term_glyph/CHANGELOG.md
new file mode 100644
index 0000000..bf8eb79
--- /dev/null
+++ b/pkgs/term_glyph/CHANGELOG.md
@@ -0,0 +1,33 @@
+## 1.2.3-wip
+
+## 1.2.2
+
+* Require Dart 3.1
+* Move to `dart-lang/tools` monorepo.
+
+## 1.2.1
+
+* Migrate to `package:lints`.
+* Populate the pubspec `repository` field.
+
+## 1.2.0
+
+* Stable release for null safety.
+* Update SDK constraints to `>=2.12.0-0 <3.0.0` based on beta release
+ guidelines.
+
+## 1.1.0
+
+* Add a `GlyphSet` class that can be used to easily choose which set of glyphs
+ to use for a particular chunk of code.
+
+* Add `asciiGlyphs`, `unicodeGlyphs`, and `glyphs` getters that provide access
+ to `GlyphSet`s.
+
+## 1.0.1
+
+* Set max SDK version to `<3.0.0`.
+
+## 1.0.0
+
+* Initial version.
diff --git a/pkgs/term_glyph/LICENSE b/pkgs/term_glyph/LICENSE
new file mode 100644
index 0000000..03af64a
--- /dev/null
+++ b/pkgs/term_glyph/LICENSE
@@ -0,0 +1,27 @@
+Copyright 2017, the Dart project authors.
+
+Redistribution and use in source and binary forms, with or without
+modification, are permitted provided that the following conditions are
+met:
+
+ * Redistributions of source code must retain the above copyright
+ notice, this list of conditions and the following disclaimer.
+ * Redistributions in binary form must reproduce the above
+ copyright notice, this list of conditions and the following
+ disclaimer in the documentation and/or other materials provided
+ with the distribution.
+ * Neither the name of Google LLC nor the names of its
+ contributors may be used to endorse or promote products derived
+ from this software without specific prior written permission.
+
+THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
diff --git a/pkgs/term_glyph/README.md b/pkgs/term_glyph/README.md
new file mode 100644
index 0000000..75039aa
--- /dev/null
+++ b/pkgs/term_glyph/README.md
@@ -0,0 +1,47 @@
+[](https://github.com/dart-lang/tools/actions/workflows/term_glyph.yaml)
+[](https://pub.dev/packages/term_glyph)
+[](https://pub.dev/packages/term_glyph/publisher)
+
+This library contains getters for useful Unicode glyphs as well as plain ASCII
+alternatives. It's intended to be used in command-line applications that may run
+in places where Unicode isn't well-supported and libraries that may be used by
+those applications.
+
+We recommend that you import this library with the prefix "glyph". For example:
+
+```dart
+import 'package:term_glyph/term_glyph.dart' as glyph;
+
+/// Formats [items] into a bulleted list, with one item per line.
+String bulletedList(List<String> items) =>
+ items.map((item) => "${glyph.bullet} $item").join("\n");
+```
+
+## ASCII Mode
+
+Some shells are unable to display Unicode characters, so this package is able to
+transparently switch its glyphs to ASCII alternatives by setting [the `ascii`
+attribute][ascii]. When this attribute is `true`, all glyphs use ASCII
+characters instead. It currently defaults to `false`, although in the future it
+may default to `true` for applications running on the Dart VM on Windows. For
+example:
+
+[ascii]: https://pub.dev/documentation/term_glyph/latest/term_glyph/ascii.html
+
+```dart
+import 'dart:io';
+
+import 'package:term_glyph/term_glyph.dart' as glyph;
+
+void main() {
+ glyph.ascii = Platform.isWindows;
+
+ // Prints "Unicode => ASCII" on Windows, "Unicode ━▶ ASCII" everywhere else.
+ print("Unicode ${glyph.rightArrow} ASCII");
+}
+```
+
+All ASCII glyphs are guaranteed to be the same number of characters as the
+corresponding Unicode glyphs, so that they line up properly when printed on a
+terminal. The specific ASCII text for a given Unicode glyph may change over
+time; this is not considered a breaking change.
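The width guarantee in the last paragraph is easy to check directly. A minimal sketch, assuming only the package's top-level `ascii` flag and `longRightArrow` getter (both of which appear elsewhere in this diff):

```dart
import 'package:term_glyph/term_glyph.dart' as glyph;

void main() {
  // Unicode form of the long arrow: '━▶' (two code units).
  glyph.ascii = false;
  final unicodeWidth = glyph.longRightArrow.length;

  // ASCII fallback: '=>' (also two characters), so columns stay aligned
  // no matter which mode is active.
  glyph.ascii = true;
  final asciiWidth = glyph.longRightArrow.length;

  print(unicodeWidth == asciiWidth); // true
}
```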
diff --git a/pkgs/term_glyph/analysis_options.yaml b/pkgs/term_glyph/analysis_options.yaml
new file mode 100644
index 0000000..6d74ee9
--- /dev/null
+++ b/pkgs/term_glyph/analysis_options.yaml
@@ -0,0 +1,32 @@
+# https://dart.dev/guides/language/analysis-options
+include: package:dart_flutter_team_lints/analysis_options.yaml
+
+analyzer:
+ language:
+ strict-casts: true
+ strict-inference: true
+ strict-raw-types: true
+
+linter:
+ rules:
+ - avoid_bool_literals_in_conditional_expressions
+ - avoid_classes_with_only_static_members
+ - avoid_private_typedef_functions
+ - avoid_redundant_argument_values
+ - avoid_returning_this
+ - avoid_unused_constructor_parameters
+ - avoid_void_async
+ - cancel_subscriptions
+ - join_return_with_assignment
+ - literal_only_boolean_expressions
+ - missing_whitespace_between_adjacent_strings
+ - no_adjacent_strings_in_list
+ - no_runtimeType_toString
+ - prefer_const_declarations
+ - prefer_expression_function_bodies
+ - prefer_final_locals
+ - unnecessary_await_in_return
+ - unnecessary_breaks
+ - use_if_null_to_convert_nulls_to_bools
+ - use_raw_strings
+ - use_string_buffers
diff --git a/pkgs/term_glyph/data.csv b/pkgs/term_glyph/data.csv
new file mode 100644
index 0000000..92a72f7
--- /dev/null
+++ b/pkgs/term_glyph/data.csv
@@ -0,0 +1,85 @@
+# Miscellaneous
+bullet,•,*,A bullet point.
+
+# Arrows
+leftArrow,←,<,"A left-pointing arrow.
+
+Note that the Unicode arrow glyphs may overlap with adjacent characters in some
+terminal fonts, and should generally be surrounded by spaces."
+rightArrow,→,>,"A right-pointing arrow.
+
+Note that the Unicode arrow glyphs may overlap with adjacent characters in some
+terminal fonts, and should generally be surrounded by spaces."
+upArrow,↑,^,An upwards-pointing arrow.
+downArrow,↓,v,A downwards-pointing arrow.
+longLeftArrow,◀━,<=,A two-character left-pointing arrow.
+longRightArrow,━▶,=>,A two-character right-pointing arrow.
+
+# Box drawing characters
+
+## Normal
+horizontalLine,─,-,A horizontal line that can be used to draw a box.
+verticalLine,│,|,A vertical line that can be used to draw a box.
+topLeftCorner,┌,",",The upper left-hand corner of a box.
+topRightCorner,┐,",",The upper right-hand corner of a box.
+bottomLeftCorner,└,',The lower left-hand corner of a box.
+bottomRightCorner,┘,',The lower right-hand corner of a box.
+cross,┼,+,An intersection of vertical and horizontal box lines.
+teeUp,┴,+,A horizontal box line with a vertical line going up from the middle.
+teeDown,┬,+,A horizontal box line with a vertical line going down from the middle.
+teeLeft,┤,+,A vertical box line with a horizontal line going left from the middle.
+teeRight,├,+,A vertical box line with a horizontal line going right from the middle.
+upEnd,╵,',The top half of a vertical box line.
+downEnd,╷,",",The bottom half of a vertical box line.
+leftEnd,╴,-,The left half of a horizontal box line.
+rightEnd,╶,-,The right half of a horizontal box line.
+
+## Bold
+horizontalLineBold,━,=,A bold horizontal line that can be used to draw a box.
+verticalLineBold,┃,|,A bold vertical line that can be used to draw a box.
+topLeftCornerBold,┏,",",The bold upper left-hand corner of a box.
+topRightCornerBold,┓,",",The bold upper right-hand corner of a box.
+bottomLeftCornerBold,┗,',The bold lower left-hand corner of a box.
+bottomRightCornerBold,┛,',The bold lower right-hand corner of a box.
+crossBold,╋,+,An intersection of bold vertical and horizontal box lines.
+teeUpBold,┻,+,A bold horizontal box line with a vertical line going up from the middle.
+teeDownBold,┳,+,A bold horizontal box line with a vertical line going down from the middle.
+teeLeftBold,┫,+,A bold vertical box line with a horizontal line going left from the middle.
+teeRightBold,┣,+,A bold vertical box line with a horizontal line going right from the middle.
+upEndBold,╹,',The top half of a bold vertical box line.
+downEndBold,╻,",",The bottom half of a bold vertical box line.
+leftEndBold,╸,-,The left half of a bold horizontal box line.
+rightEndBold,╺,-,The right half of a bold horizontal box line.
+
+## Double
+horizontalLineDouble,═,=,A double horizontal line that can be used to draw a box.
+verticalLineDouble,║,|,A double vertical line that can be used to draw a box.
+topLeftCornerDouble,╔,",",The double upper left-hand corner of a box.
+topRightCornerDouble,╗,",",The double upper right-hand corner of a box.
+bottomLeftCornerDouble,╚,"""",The double lower left-hand corner of a box.
+bottomRightCornerDouble,╝,"""",The double lower right-hand corner of a box.
+crossDouble,╬,+,An intersection of double vertical and horizontal box lines.
+teeUpDouble,╩,+,A double horizontal box line with a vertical line going up from the middle.
+teeDownDouble,╦,+,A double horizontal box line with a vertical line going down from the middle.
+teeLeftDouble,╣,+,A double vertical box line with a horizontal line going left from the middle.
+teeRightDouble,╠,+,A double vertical box line with a horizontal line going right from the middle.
+
+## Dashed
+
+### Double
+horizontalLineDoubleDash,╌,-,A dashed horizontal line that can be used to draw a box.
+horizontalLineDoubleDashBold,╍,-,A bold dashed horizontal line that can be used to draw a box.
+verticalLineDoubleDash,╎,|,A dashed vertical line that can be used to draw a box.
+verticalLineDoubleDashBold,╏,|,A bold dashed vertical line that can be used to draw a box.
+
+### Triple
+horizontalLineTripleDash,┄,-,A dashed horizontal line that can be used to draw a box.
+horizontalLineTripleDashBold,┅,-,A bold dashed horizontal line that can be used to draw a box.
+verticalLineTripleDash,┆,|,A dashed vertical line that can be used to draw a box.
+verticalLineTripleDashBold,┇,|,A bold dashed vertical line that can be used to draw a box.
+
+### Quadruple
+horizontalLineQuadrupleDash,┈,-,A dashed horizontal line that can be used to draw a box.
+horizontalLineQuadrupleDashBold,┉,-,A bold dashed horizontal line that can be used to draw a box.
+verticalLineQuadrupleDash,┊,|,A dashed vertical line that can be used to draw a box.
+verticalLineQuadrupleDashBold,┋,|,A bold dashed vertical line that can be used to draw a box.
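Each data row above has the shape `name,unicodeGlyph,asciiAlternative,documentation`, and the generated files below mirror it one-to-one (for example `bullet,•,*,A bullet point.` becomes the `bullet` getters). A rough, hypothetical sketch of that mapping for a simple row; the real generator, `tool/generate.dart`, is not part of this diff and also handles the quoted, multi-line description fields:

```dart
void main() {
  const row = 'bullet,•,*,A bullet point.';
  final parts = row.split(',');

  final name = parts[0]; // getter name: bullet
  final unicode = parts[1]; // glyph used when `ascii` is false: •
  final ascii = parts[2]; // fallback used when `ascii` is true: *
  final doc = parts.sublist(3).join(','); // doc comment text

  // Matches the generated AsciiGlyphSet member shown later in this diff.
  print('/// $doc');
  print("String get $name => '$ascii'; // Unicode form: $unicode");
}
```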
diff --git a/pkgs/term_glyph/lib/src/generated/ascii_glyph_set.dart b/pkgs/term_glyph/lib/src/generated/ascii_glyph_set.dart
new file mode 100644
index 0000000..08534a0
--- /dev/null
+++ b/pkgs/term_glyph/lib/src/generated/ascii_glyph_set.dart
@@ -0,0 +1,139 @@
+// Copyright (c) 2018, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+// Don't modify this file by hand! It's generated by tool/generate.dart.
+
+// ignore_for_file: lines_longer_than_80_chars
+
+import 'glyph_set.dart';
+
+/// A [GlyphSet] that includes only ASCII glyphs.
+class AsciiGlyphSet implements GlyphSet {
+ const AsciiGlyphSet();
+
+ /// Returns [glyph] if `this` supports Unicode glyphs and [alternative]
+ /// otherwise.
+ @override
+ String glyphOrAscii(String glyph, String alternative) => alternative;
+ @override
+ String get bullet => '*';
+ @override
+ String get leftArrow => '<';
+ @override
+ String get rightArrow => '>';
+ @override
+ String get upArrow => '^';
+ @override
+ String get downArrow => 'v';
+ @override
+ String get longLeftArrow => '<=';
+ @override
+ String get longRightArrow => '=>';
+ @override
+ String get horizontalLine => '-';
+ @override
+ String get verticalLine => '|';
+ @override
+ String get topLeftCorner => ',';
+ @override
+ String get topRightCorner => ',';
+ @override
+ String get bottomLeftCorner => "'";
+ @override
+ String get bottomRightCorner => "'";
+ @override
+ String get cross => '+';
+ @override
+ String get teeUp => '+';
+ @override
+ String get teeDown => '+';
+ @override
+ String get teeLeft => '+';
+ @override
+ String get teeRight => '+';
+ @override
+ String get upEnd => "'";
+ @override
+ String get downEnd => ',';
+ @override
+ String get leftEnd => '-';
+ @override
+ String get rightEnd => '-';
+ @override
+ String get horizontalLineBold => '=';
+ @override
+ String get verticalLineBold => '|';
+ @override
+ String get topLeftCornerBold => ',';
+ @override
+ String get topRightCornerBold => ',';
+ @override
+ String get bottomLeftCornerBold => "'";
+ @override
+ String get bottomRightCornerBold => "'";
+ @override
+ String get crossBold => '+';
+ @override
+ String get teeUpBold => '+';
+ @override
+ String get teeDownBold => '+';
+ @override
+ String get teeLeftBold => '+';
+ @override
+ String get teeRightBold => '+';
+ @override
+ String get upEndBold => "'";
+ @override
+ String get downEndBold => ',';
+ @override
+ String get leftEndBold => '-';
+ @override
+ String get rightEndBold => '-';
+ @override
+ String get horizontalLineDouble => '=';
+ @override
+ String get verticalLineDouble => '|';
+ @override
+ String get topLeftCornerDouble => ',';
+ @override
+ String get topRightCornerDouble => ',';
+ @override
+ String get bottomLeftCornerDouble => '"';
+ @override
+ String get bottomRightCornerDouble => '"';
+ @override
+ String get crossDouble => '+';
+ @override
+ String get teeUpDouble => '+';
+ @override
+ String get teeDownDouble => '+';
+ @override
+ String get teeLeftDouble => '+';
+ @override
+ String get teeRightDouble => '+';
+ @override
+ String get horizontalLineDoubleDash => '-';
+ @override
+ String get horizontalLineDoubleDashBold => '-';
+ @override
+ String get verticalLineDoubleDash => '|';
+ @override
+ String get verticalLineDoubleDashBold => '|';
+ @override
+ String get horizontalLineTripleDash => '-';
+ @override
+ String get horizontalLineTripleDashBold => '-';
+ @override
+ String get verticalLineTripleDash => '|';
+ @override
+ String get verticalLineTripleDashBold => '|';
+ @override
+ String get horizontalLineQuadrupleDash => '-';
+ @override
+ String get horizontalLineQuadrupleDashBold => '-';
+ @override
+ String get verticalLineQuadrupleDash => '|';
+ @override
+ String get verticalLineQuadrupleDashBold => '|';
+}
diff --git a/pkgs/term_glyph/lib/src/generated/glyph_set.dart b/pkgs/term_glyph/lib/src/generated/glyph_set.dart
new file mode 100644
index 0000000..c8cc4a9
--- /dev/null
+++ b/pkgs/term_glyph/lib/src/generated/glyph_set.dart
@@ -0,0 +1,222 @@
+// Copyright (c) 2018, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+// Don't modify this file by hand! It's generated by tool/generate.dart.
+
+// ignore_for_file: lines_longer_than_80_chars
+
+/// A class that provides access to every configurable glyph.
+///
+/// This is provided as a class so that individual chunks of code can choose
+/// between `ascii` and `unicode` glyphs. For example:
+///
+/// ```dart
+/// import 'package:term_glyph/term_glyph.dart' as glyph;
+///
+/// /// Adds a vertical line to the left of [text].
+/// ///
+/// /// If [unicode] is `true`, this uses Unicode for the line. If it's
+/// /// `false`, this uses plain ASCII characters. If it's `null`, it
+/// /// defaults to [glyph.ascii].
+/// String addVerticalLine(String text, {bool? unicode}) {
+/// var glyphs =
+/// (unicode ?? !glyph.ascii) ? glyph.unicodeGlyphs : glyph.asciiGlyphs;
+///
+/// return text
+/// .split('\n')
+/// .map((line) => '${glyphs.verticalLine} $line')
+/// .join('\n');
+/// }
+/// ```
+abstract class GlyphSet {
+ /// Returns [glyph] if `this` supports Unicode glyphs and [alternative]
+ /// otherwise.
+ String glyphOrAscii(String glyph, String alternative);
+
+ /// A bullet point.
+ String get bullet;
+
+ /// A left-pointing arrow.
+ ///
+ /// Note that the Unicode arrow glyphs may overlap with adjacent characters in some
+ /// terminal fonts, and should generally be surrounded by spaces.
+ String get leftArrow;
+
+ /// A right-pointing arrow.
+ ///
+ /// Note that the Unicode arrow glyphs may overlap with adjacent characters in some
+ /// terminal fonts, and should generally be surrounded by spaces.
+ String get rightArrow;
+
+ /// An upwards-pointing arrow.
+ String get upArrow;
+
+ /// A downwards-pointing arrow.
+ String get downArrow;
+
+ /// A two-character left-pointing arrow.
+ String get longLeftArrow;
+
+ /// A two-character right-pointing arrow.
+ String get longRightArrow;
+
+ /// A horizontal line that can be used to draw a box.
+ String get horizontalLine;
+
+ /// A vertical line that can be used to draw a box.
+ String get verticalLine;
+
+ /// The upper left-hand corner of a box.
+ String get topLeftCorner;
+
+ /// The upper right-hand corner of a box.
+ String get topRightCorner;
+
+ /// The lower left-hand corner of a box.
+ String get bottomLeftCorner;
+
+ /// The lower right-hand corner of a box.
+ String get bottomRightCorner;
+
+ /// An intersection of vertical and horizontal box lines.
+ String get cross;
+
+ /// A horizontal box line with a vertical line going up from the middle.
+ String get teeUp;
+
+ /// A horizontal box line with a vertical line going down from the middle.
+ String get teeDown;
+
+ /// A vertical box line with a horizontal line going left from the middle.
+ String get teeLeft;
+
+ /// A vertical box line with a horizontal line going right from the middle.
+ String get teeRight;
+
+ /// The top half of a vertical box line.
+ String get upEnd;
+
+ /// The bottom half of a vertical box line.
+ String get downEnd;
+
+ /// The left half of a horizontal box line.
+ String get leftEnd;
+
+ /// The right half of a horizontal box line.
+ String get rightEnd;
+
+ /// A bold horizontal line that can be used to draw a box.
+ String get horizontalLineBold;
+
+ /// A bold vertical line that can be used to draw a box.
+ String get verticalLineBold;
+
+ /// The bold upper left-hand corner of a box.
+ String get topLeftCornerBold;
+
+ /// The bold upper right-hand corner of a box.
+ String get topRightCornerBold;
+
+ /// The bold lower left-hand corner of a box.
+ String get bottomLeftCornerBold;
+
+ /// The bold lower right-hand corner of a box.
+ String get bottomRightCornerBold;
+
+ /// An intersection of bold vertical and horizontal box lines.
+ String get crossBold;
+
+ /// A bold horizontal box line with a vertical line going up from the middle.
+ String get teeUpBold;
+
+ /// A bold horizontal box line with a vertical line going down from the middle.
+ String get teeDownBold;
+
+ /// A bold vertical box line with a horizontal line going left from the middle.
+ String get teeLeftBold;
+
+ /// A bold vertical box line with a horizontal line going right from the middle.
+ String get teeRightBold;
+
+ /// The top half of a bold vertical box line.
+ String get upEndBold;
+
+ /// The bottom half of a bold vertical box line.
+ String get downEndBold;
+
+ /// The left half of a bold horizontal box line.
+ String get leftEndBold;
+
+ /// The right half of a bold horizontal box line.
+ String get rightEndBold;
+
+ /// A double horizontal line that can be used to draw a box.
+ String get horizontalLineDouble;
+
+ /// A double vertical line that can be used to draw a box.
+ String get verticalLineDouble;
+
+ /// The double upper left-hand corner of a box.
+ String get topLeftCornerDouble;
+
+ /// The double upper right-hand corner of a box.
+ String get topRightCornerDouble;
+
+ /// The double lower left-hand corner of a box.
+ String get bottomLeftCornerDouble;
+
+ /// The double lower right-hand corner of a box.
+ String get bottomRightCornerDouble;
+
+ /// An intersection of double vertical and horizontal box lines.
+ String get crossDouble;
+
+ /// A double horizontal box line with a vertical line going up from the middle.
+ String get teeUpDouble;
+
+ /// A double horizontal box line with a vertical line going down from the middle.
+ String get teeDownDouble;
+
+ /// A double vertical box line with a horizontal line going left from the middle.
+ String get teeLeftDouble;
+
+ /// A double vertical box line with a horizontal line going right from the middle.
+ String get teeRightDouble;
+
+ /// A dashed horizontal line that can be used to draw a box.
+ String get horizontalLineDoubleDash;
+
+ /// A bold dashed horizontal line that can be used to draw a box.
+ String get horizontalLineDoubleDashBold;
+
+ /// A dashed vertical line that can be used to draw a box.
+ String get verticalLineDoubleDash;
+
+ /// A bold dashed vertical line that can be used to draw a box.
+ String get verticalLineDoubleDashBold;
+
+ /// A dashed horizontal line that can be used to draw a box.
+ String get horizontalLineTripleDash;
+
+ /// A bold dashed horizontal line that can be used to draw a box.
+ String get horizontalLineTripleDashBold;
+
+ /// A dashed vertical line that can be used to draw a box.
+ String get verticalLineTripleDash;
+
+ /// A bold dashed vertical line that can be used to draw a box.
+ String get verticalLineTripleDashBold;
+
+ /// A dashed horizontal line that can be used to draw a box.
+ String get horizontalLineQuadrupleDash;
+
+ /// A bold dashed horizontal line that can be used to draw a box.
+ String get horizontalLineQuadrupleDashBold;
+
+ /// A dashed vertical line that can be used to draw a box.
+ String get verticalLineQuadrupleDash;
+
+ /// A bold dashed vertical line that can be used to draw a box.
+ String get verticalLineQuadrupleDashBold;
+}
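Because `GlyphSet` is just an interface over the same getters, a caller can pin one set for a chunk of output rather than consulting the global `ascii` flag at every call site. A small usage sketch, assuming the `asciiGlyphs`, `unicodeGlyphs`, and `glyphs` getters mentioned in the CHANGELOG above; the `rule` helper itself is illustrative, not part of the package:

```dart
import 'package:term_glyph/term_glyph.dart' as glyph;

/// Draws a horizontal rule, letting the caller force ASCII output.
String rule(int width, {bool ascii = false}) {
  // Pick a concrete GlyphSet once and use it for every glyph in this chunk.
  final glyphs = ascii ? glyph.asciiGlyphs : glyph.unicodeGlyphs;
  return glyphs.leftEnd + glyphs.horizontalLine * (width - 2) + glyphs.rightEnd;
}

void main() {
  print(rule(10)); // ╴────────╶
  print(rule(10, ascii: true)); // ----------

  // glyphOrAscii covers one-off glyphs with no dedicated getter: the Unicode
  // set returns its first argument, the ASCII set the second.
  print(glyph.glyphs.glyphOrAscii('✓', '+'));
}
```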
diff --git a/pkgs/term_glyph/lib/src/generated/top_level.dart b/pkgs/term_glyph/lib/src/generated/top_level.dart
new file mode 100644
index 0000000..848ef6d
--- /dev/null
+++ b/pkgs/term_glyph/lib/src/generated/top_level.dart
@@ -0,0 +1,382 @@
+// Copyright (c) 2018, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+// Don't modify this file by hand! It's generated by tool/generate.dart.
+
+// ignore_for_file: lines_longer_than_80_chars
+
+import '../../term_glyph.dart' as glyph;
+
+/// A bullet point.
+///
+/// If [glyph.ascii] is `false`, this is "•". If it's `true`, this is
+/// "*" instead.
+String get bullet => glyph.glyphs.bullet;
+
+/// A left-pointing arrow.
+///
+/// Note that the Unicode arrow glyphs may overlap with adjacent characters in some
+/// terminal fonts, and should generally be surrounded by spaces.
+///
+/// If [glyph.ascii] is `false`, this is "←". If it's `true`, this is
+/// "<" instead.
+String get leftArrow => glyph.glyphs.leftArrow;
+
+/// A right-pointing arrow.
+///
+/// Note that the Unicode arrow glyphs may overlap with adjacent characters in some
+/// terminal fonts, and should generally be surrounded by spaces.
+///
+/// If [glyph.ascii] is `false`, this is "→". If it's `true`, this is
+/// ">" instead.
+String get rightArrow => glyph.glyphs.rightArrow;
+
+/// An upwards-pointing arrow.
+///
+/// If [glyph.ascii] is `false`, this is "↑". If it's `true`, this is
+/// "^" instead.
+String get upArrow => glyph.glyphs.upArrow;
+
+/// A downwards-pointing arrow.
+///
+/// If [glyph.ascii] is `false`, this is "↓". If it's `true`, this is
+/// "v" instead.
+String get downArrow => glyph.glyphs.downArrow;
+
+/// A two-character left-pointing arrow.
+///
+/// If [glyph.ascii] is `false`, this is "◀━". If it's `true`, this is
+/// "<=" instead.
+String get longLeftArrow => glyph.glyphs.longLeftArrow;
+
+/// A two-character right-pointing arrow.
+///
+/// If [glyph.ascii] is `false`, this is "━▶". If it's `true`, this is
+/// "=>" instead.
+String get longRightArrow => glyph.glyphs.longRightArrow;
+
+/// A horizontal line that can be used to draw a box.
+///
+/// If [glyph.ascii] is `false`, this is "─". If it's `true`, this is
+/// "-" instead.
+String get horizontalLine => glyph.glyphs.horizontalLine;
+
+/// A vertical line that can be used to draw a box.
+///
+/// If [glyph.ascii] is `false`, this is "│". If it's `true`, this is
+/// "|" instead.
+String get verticalLine => glyph.glyphs.verticalLine;
+
+/// The upper left-hand corner of a box.
+///
+/// If [glyph.ascii] is `false`, this is "┌". If it's `true`, this is
+/// "," instead.
+String get topLeftCorner => glyph.glyphs.topLeftCorner;
+
+/// The upper right-hand corner of a box.
+///
+/// If [glyph.ascii] is `false`, this is "┐". If it's `true`, this is
+/// "," instead.
+String get topRightCorner => glyph.glyphs.topRightCorner;
+
+/// The lower left-hand corner of a box.
+///
+/// If [glyph.ascii] is `false`, this is "└". If it's `true`, this is
+/// "'" instead.
+String get bottomLeftCorner => glyph.glyphs.bottomLeftCorner;
+
+/// The lower right-hand corner of a box.
+///
+/// If [glyph.ascii] is `false`, this is "┘". If it's `true`, this is
+/// "'" instead.
+String get bottomRightCorner => glyph.glyphs.bottomRightCorner;
+
+/// An intersection of vertical and horizontal box lines.
+///
+/// If [glyph.ascii] is `false`, this is "┼". If it's `true`, this is
+/// "+" instead.
+String get cross => glyph.glyphs.cross;
+
+/// A horizontal box line with a vertical line going up from the middle.
+///
+/// If [glyph.ascii] is `false`, this is "┴". If it's `true`, this is
+/// "+" instead.
+String get teeUp => glyph.glyphs.teeUp;
+
+/// A horizontal box line with a vertical line going down from the middle.
+///
+/// If [glyph.ascii] is `false`, this is "┬". If it's `true`, this is
+/// "+" instead.
+String get teeDown => glyph.glyphs.teeDown;
+
+/// A vertical box line with a horizontal line going left from the middle.
+///
+/// If [glyph.ascii] is `false`, this is "┤". If it's `true`, this is
+/// "+" instead.
+String get teeLeft => glyph.glyphs.teeLeft;
+
+/// A vertical box line with a horizontal line going right from the middle.
+///
+/// If [glyph.ascii] is `false`, this is "├". If it's `true`, this is
+/// "+" instead.
+String get teeRight => glyph.glyphs.teeRight;
+
+/// The top half of a vertical box line.
+///
+/// If [glyph.ascii] is `false`, this is "╵". If it's `true`, this is
+/// "'" instead.
+String get upEnd => glyph.glyphs.upEnd;
+
+/// The bottom half of a vertical box line.
+///
+/// If [glyph.ascii] is `false`, this is "╷". If it's `true`, this is
+/// "," instead.
+String get downEnd => glyph.glyphs.downEnd;
+
+/// The left half of a horizontal box line.
+///
+/// If [glyph.ascii] is `false`, this is "╴". If it's `true`, this is
+/// "-" instead.
+String get leftEnd => glyph.glyphs.leftEnd;
+
+/// The right half of a horizontal box line.
+///
+/// If [glyph.ascii] is `false`, this is "╶". If it's `true`, this is
+/// "-" instead.
+String get rightEnd => glyph.glyphs.rightEnd;
+
+/// A bold horizontal line that can be used to draw a box.
+///
+/// If [glyph.ascii] is `false`, this is "━". If it's `true`, this is
+/// "=" instead.
+String get horizontalLineBold => glyph.glyphs.horizontalLineBold;
+
+/// A bold vertical line that can be used to draw a box.
+///
+/// If [glyph.ascii] is `false`, this is "┃". If it's `true`, this is
+/// "|" instead.
+String get verticalLineBold => glyph.glyphs.verticalLineBold;
+
+/// The bold upper left-hand corner of a box.
+///
+/// If [glyph.ascii] is `false`, this is "┏". If it's `true`, this is
+/// "," instead.
+String get topLeftCornerBold => glyph.glyphs.topLeftCornerBold;
+
+/// The bold upper right-hand corner of a box.
+///
+/// If [glyph.ascii] is `false`, this is "┓". If it's `true`, this is
+/// "," instead.
+String get topRightCornerBold => glyph.glyphs.topRightCornerBold;
+
+/// The bold lower left-hand corner of a box.
+///
+/// If [glyph.ascii] is `false`, this is "┗". If it's `true`, this is
+/// "'" instead.
+String get bottomLeftCornerBold => glyph.glyphs.bottomLeftCornerBold;
+
+/// The bold lower right-hand corner of a box.
+///
+/// If [glyph.ascii] is `false`, this is "┛". If it's `true`, this is
+/// "'" instead.
+String get bottomRightCornerBold => glyph.glyphs.bottomRightCornerBold;
+
+/// An intersection of bold vertical and horizontal box lines.
+///
+/// If [glyph.ascii] is `false`, this is "╋". If it's `true`, this is
+/// "+" instead.
+String get crossBold => glyph.glyphs.crossBold;
+
+/// A bold horizontal box line with a vertical line going up from the middle.
+///
+/// If [glyph.ascii] is `false`, this is "┻". If it's `true`, this is
+/// "+" instead.
+String get teeUpBold => glyph.glyphs.teeUpBold;
+
+/// A bold horizontal box line with a vertical line going down from the middle.
+///
+/// If [glyph.ascii] is `false`, this is "┳". If it's `true`, this is
+/// "+" instead.
+String get teeDownBold => glyph.glyphs.teeDownBold;
+
+/// A bold vertical box line with a horizontal line going left from the middle.
+///
+/// If [glyph.ascii] is `false`, this is "┫". If it's `true`, this is
+/// "+" instead.
+String get teeLeftBold => glyph.glyphs.teeLeftBold;
+
+/// A bold vertical box line with a horizontal line going right from the middle.
+///
+/// If [glyph.ascii] is `false`, this is "┣". If it's `true`, this is
+/// "+" instead.
+String get teeRightBold => glyph.glyphs.teeRightBold;
+
+/// The top half of a bold vertical box line.
+///
+/// If [glyph.ascii] is `false`, this is "╹". If it's `true`, this is
+/// "'" instead.
+String get upEndBold => glyph.glyphs.upEndBold;
+
+/// The bottom half of a bold vertical box line.
+///
+/// If [glyph.ascii] is `false`, this is "╻". If it's `true`, this is
+/// "," instead.
+String get downEndBold => glyph.glyphs.downEndBold;
+
+/// The left half of a bold horizontal box line.
+///
+/// If [glyph.ascii] is `false`, this is "╸". If it's `true`, this is
+/// "-" instead.
+String get leftEndBold => glyph.glyphs.leftEndBold;
+
+/// The right half of a bold horizontal box line.
+///
+/// If [glyph.ascii] is `false`, this is "╺". If it's `true`, this is
+/// "-" instead.
+String get rightEndBold => glyph.glyphs.rightEndBold;
+
+/// A double horizontal line that can be used to draw a box.
+///
+/// If [glyph.ascii] is `false`, this is "═". If it's `true`, this is
+/// "=" instead.
+String get horizontalLineDouble => glyph.glyphs.horizontalLineDouble;
+
+/// A double vertical line that can be used to draw a box.
+///
+/// If [glyph.ascii] is `false`, this is "║". If it's `true`, this is
+/// "|" instead.
+String get verticalLineDouble => glyph.glyphs.verticalLineDouble;
+
+/// The double upper left-hand corner of a box.
+///
+/// If [glyph.ascii] is `false`, this is "╔". If it's `true`, this is
+/// "," instead.
+String get topLeftCornerDouble => glyph.glyphs.topLeftCornerDouble;
+
+/// The double upper right-hand corner of a box.
+///
+/// If [glyph.ascii] is `false`, this is "╗". If it's `true`, this is
+/// "," instead.
+String get topRightCornerDouble => glyph.glyphs.topRightCornerDouble;
+
+/// The double lower left-hand corner of a box.
+///
+/// If [glyph.ascii] is `false`, this is "╚". If it's `true`, this is
+/// """ instead.
+String get bottomLeftCornerDouble => glyph.glyphs.bottomLeftCornerDouble;
+
+/// The double lower right-hand corner of a box.
+///
+/// If [glyph.ascii] is `false`, this is "╝". If it's `true`, this is
+/// """ instead.
+String get bottomRightCornerDouble => glyph.glyphs.bottomRightCornerDouble;
+
+/// An intersection of double vertical and horizontal box lines.
+///
+/// If [glyph.ascii] is `false`, this is "╬". If it's `true`, this is
+/// "+" instead.
+String get crossDouble => glyph.glyphs.crossDouble;
+
+/// A double horizontal box line with a vertical line going up from the middle.
+///
+/// If [glyph.ascii] is `false`, this is "╩". If it's `true`, this is
+/// "+" instead.
+String get teeUpDouble => glyph.glyphs.teeUpDouble;
+
+/// A double horizontal box line with a vertical line going down from the middle.
+///
+/// If [glyph.ascii] is `false`, this is "╦". If it's `true`, this is
+/// "+" instead.
+String get teeDownDouble => glyph.glyphs.teeDownDouble;
+
+/// A double vertical box line with a horizontal line going left from the middle.
+///
+/// If [glyph.ascii] is `false`, this is "╣". If it's `true`, this is
+/// "+" instead.
+String get teeLeftDouble => glyph.glyphs.teeLeftDouble;
+
+/// A double vertical box line with a horizontal line going right from the middle.
+///
+/// If [glyph.ascii] is `false`, this is "╠". If it's `true`, this is
+/// "+" instead.
+String get teeRightDouble => glyph.glyphs.teeRightDouble;
+
+/// A dashed horizontal line that can be used to draw a box.
+///
+/// If [glyph.ascii] is `false`, this is "╌". If it's `true`, this is
+/// "-" instead.
+String get horizontalLineDoubleDash => glyph.glyphs.horizontalLineDoubleDash;
+
+/// A bold dashed horizontal line that can be used to draw a box.
+///
+/// If [glyph.ascii] is `false`, this is "╍". If it's `true`, this is
+/// "-" instead.
+String get horizontalLineDoubleDashBold =>
+ glyph.glyphs.horizontalLineDoubleDashBold;
+
+/// A dashed vertical line that can be used to draw a box.
+///
+/// If [glyph.ascii] is `false`, this is "╎". If it's `true`, this is
+/// "|" instead.
+String get verticalLineDoubleDash => glyph.glyphs.verticalLineDoubleDash;
+
+/// A bold dashed vertical line that can be used to draw a box.
+///
+/// If [glyph.ascii] is `false`, this is "╏". If it's `true`, this is
+/// "|" instead.
+String get verticalLineDoubleDashBold =>
+ glyph.glyphs.verticalLineDoubleDashBold;
+
+/// A dashed horizontal line that can be used to draw a box.
+///
+/// If [glyph.ascii] is `false`, this is "┄". If it's `true`, this is
+/// "-" instead.
+String get horizontalLineTripleDash => glyph.glyphs.horizontalLineTripleDash;
+
+/// A bold dashed horizontal line that can be used to draw a box.
+///
+/// If [glyph.ascii] is `false`, this is "┅". If it's `true`, this is
+/// "-" instead.
+String get horizontalLineTripleDashBold =>
+ glyph.glyphs.horizontalLineTripleDashBold;
+
+/// A dashed vertical line that can be used to draw a box.
+///
+/// If [glyph.ascii] is `false`, this is "┆". If it's `true`, this is
+/// "|" instead.
+String get verticalLineTripleDash => glyph.glyphs.verticalLineTripleDash;
+
+/// A bold dashed vertical line that can be used to draw a box.
+///
+/// If [glyph.ascii] is `false`, this is "┇". If it's `true`, this is
+/// "|" instead.
+String get verticalLineTripleDashBold =>
+ glyph.glyphs.verticalLineTripleDashBold;
+
+/// A dashed horizontal line that can be used to draw a box.
+///
+/// If [glyph.ascii] is `false`, this is "┈". If it's `true`, this is
+/// "-" instead.
+String get horizontalLineQuadrupleDash =>
+ glyph.glyphs.horizontalLineQuadrupleDash;
+
+/// A bold dashed horizontal line that can be used to draw a box.
+///
+/// If [glyph.ascii] is `false`, this is "┉". If it's `true`, this is
+/// "-" instead.
+String get horizontalLineQuadrupleDashBold =>
+ glyph.glyphs.horizontalLineQuadrupleDashBold;
+
+/// A dashed vertical line that can be used to draw a box.
+///
+/// If [glyph.ascii] is `false`, this is "┊". If it's `true`, this is
+/// "|" instead.
+String get verticalLineQuadrupleDash => glyph.glyphs.verticalLineQuadrupleDash;
+
+/// A bold dashed vertical line that can be used to draw a box.
+///
+/// If [glyph.ascii] is `false`, this is "┋". If it's `true`, this is
+/// "|" instead.
+String get verticalLineQuadrupleDashBold =>
+ glyph.glyphs.verticalLineQuadrupleDashBold;
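
To make the relationship between these generated getters and the global `ascii` flag concrete, here is a minimal sketch (not part of the diff; the `box` helper is hypothetical) that draws a box with the top-level getters:

```dart
import 'package:term_glyph/term_glyph.dart' as glyph;

/// Wraps [text] in a box drawn with the currently selected glyph set.
String box(String text) {
  final bar = glyph.horizontalLine * (text.length + 2);
  return '${glyph.topLeftCorner}$bar${glyph.topRightCorner}\n'
      '${glyph.verticalLine} $text ${glyph.verticalLine}\n'
      '${glyph.bottomLeftCorner}$bar${glyph.bottomRightCorner}';
}

void main() {
  print(box('hi')); // Unicode box-drawing characters by default.
  glyph.ascii = true;
  print(box('hi')); // The same box drawn with ',', '-', '|', and "'".
}
```
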
diff --git a/pkgs/term_glyph/lib/src/generated/unicode_glyph_set.dart b/pkgs/term_glyph/lib/src/generated/unicode_glyph_set.dart
new file mode 100644
index 0000000..7264e6d
--- /dev/null
+++ b/pkgs/term_glyph/lib/src/generated/unicode_glyph_set.dart
@@ -0,0 +1,139 @@
+// Copyright (c) 2018, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+// Don't modify this file by hand! It's generated by tool/generate.dart.
+
+// ignore_for_file: lines_longer_than_80_chars
+
+import 'glyph_set.dart';
+
+/// A [GlyphSet] that includes only Unicode glyphs.
+class UnicodeGlyphSet implements GlyphSet {
+ const UnicodeGlyphSet();
+
+ /// Returns [glyph] if `this` supports Unicode glyphs and [alternative]
+ /// otherwise.
+ @override
+ String glyphOrAscii(String glyph, String alternative) => glyph;
+ @override
+ String get bullet => '•';
+ @override
+ String get leftArrow => '←';
+ @override
+ String get rightArrow => '→';
+ @override
+ String get upArrow => '↑';
+ @override
+ String get downArrow => '↓';
+ @override
+ String get longLeftArrow => '◀━';
+ @override
+ String get longRightArrow => '━▶';
+ @override
+ String get horizontalLine => '─';
+ @override
+ String get verticalLine => '│';
+ @override
+ String get topLeftCorner => '┌';
+ @override
+ String get topRightCorner => '┐';
+ @override
+ String get bottomLeftCorner => '└';
+ @override
+ String get bottomRightCorner => '┘';
+ @override
+ String get cross => '┼';
+ @override
+ String get teeUp => '┴';
+ @override
+ String get teeDown => '┬';
+ @override
+ String get teeLeft => '┤';
+ @override
+ String get teeRight => '├';
+ @override
+ String get upEnd => '╵';
+ @override
+ String get downEnd => '╷';
+ @override
+ String get leftEnd => '╴';
+ @override
+ String get rightEnd => '╶';
+ @override
+ String get horizontalLineBold => '━';
+ @override
+ String get verticalLineBold => '┃';
+ @override
+ String get topLeftCornerBold => '┏';
+ @override
+ String get topRightCornerBold => '┓';
+ @override
+ String get bottomLeftCornerBold => '┗';
+ @override
+ String get bottomRightCornerBold => '┛';
+ @override
+ String get crossBold => '╋';
+ @override
+ String get teeUpBold => '┻';
+ @override
+ String get teeDownBold => '┳';
+ @override
+ String get teeLeftBold => '┫';
+ @override
+ String get teeRightBold => '┣';
+ @override
+ String get upEndBold => '╹';
+ @override
+ String get downEndBold => '╻';
+ @override
+ String get leftEndBold => '╸';
+ @override
+ String get rightEndBold => '╺';
+ @override
+ String get horizontalLineDouble => '═';
+ @override
+ String get verticalLineDouble => '║';
+ @override
+ String get topLeftCornerDouble => '╔';
+ @override
+ String get topRightCornerDouble => '╗';
+ @override
+ String get bottomLeftCornerDouble => '╚';
+ @override
+ String get bottomRightCornerDouble => '╝';
+ @override
+ String get crossDouble => '╬';
+ @override
+ String get teeUpDouble => '╩';
+ @override
+ String get teeDownDouble => '╦';
+ @override
+ String get teeLeftDouble => '╣';
+ @override
+ String get teeRightDouble => '╠';
+ @override
+ String get horizontalLineDoubleDash => '╌';
+ @override
+ String get horizontalLineDoubleDashBold => '╍';
+ @override
+ String get verticalLineDoubleDash => '╎';
+ @override
+ String get verticalLineDoubleDashBold => '╏';
+ @override
+ String get horizontalLineTripleDash => '┄';
+ @override
+ String get horizontalLineTripleDashBold => '┅';
+ @override
+ String get verticalLineTripleDash => '┆';
+ @override
+ String get verticalLineTripleDashBold => '┇';
+ @override
+ String get horizontalLineQuadrupleDash => '┈';
+ @override
+ String get horizontalLineQuadrupleDashBold => '┉';
+ @override
+ String get verticalLineQuadrupleDash => '┊';
+ @override
+ String get verticalLineQuadrupleDashBold => '┋';
+}
diff --git a/pkgs/term_glyph/lib/term_glyph.dart b/pkgs/term_glyph/lib/term_glyph.dart
new file mode 100644
index 0000000..9f2b422
--- /dev/null
+++ b/pkgs/term_glyph/lib/term_glyph.dart
@@ -0,0 +1,37 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'src/generated/ascii_glyph_set.dart';
+import 'src/generated/glyph_set.dart';
+import 'src/generated/unicode_glyph_set.dart';
+
+export 'src/generated/glyph_set.dart';
+export 'src/generated/top_level.dart';
+
+/// A [GlyphSet] that always returns ASCII glyphs.
+const GlyphSet asciiGlyphs = AsciiGlyphSet();
+
+/// A [GlyphSet] that always returns Unicode glyphs.
+const GlyphSet unicodeGlyphs = UnicodeGlyphSet();
+
+/// Returns [asciiGlyphs] if [ascii] is `true` or [unicodeGlyphs] otherwise.
+///
+/// Returns [unicodeGlyphs] by default.
+GlyphSet get glyphs => _glyphs;
+GlyphSet _glyphs = unicodeGlyphs;
+
+/// Whether the glyph getters return plain ASCII, as opposed to Unicode
+/// characters or sequences.
+///
+/// Defaults to `false`.
+bool get ascii => glyphs == asciiGlyphs;
+
+set ascii(bool value) {
+ _glyphs = value ? asciiGlyphs : unicodeGlyphs;
+}
+
+/// Returns [glyph] if Unicode glyphs are allowed, and [alternative] if they
+/// aren't.
+String glyphOrAscii(String glyph, String alternative) =>
+ glyphs.glyphOrAscii(glyph, alternative);
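
As a quick illustration of this entry point (a sketch, not part of the diff): the global `ascii` flag drives both the top-level `glyphOrAscii` helper and the `glyphs` getter, while `asciiGlyphs` and `unicodeGlyphs` can always be used directly:

```dart
import 'package:term_glyph/term_glyph.dart' as glyph;

void main() {
  glyph.ascii = false;
  print(glyph.glyphOrAscii('✓', 'OK')); // ✓
  glyph.ascii = true;
  print(glyph.glyphOrAscii('✓', 'OK')); // OK

  // The concrete glyph sets ignore the global flag entirely.
  print(glyph.asciiGlyphs.topLeftCorner);   // ,
  print(glyph.unicodeGlyphs.topLeftCorner); // ┌
}
```
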
diff --git a/pkgs/term_glyph/pubspec.yaml b/pkgs/term_glyph/pubspec.yaml
new file mode 100644
index 0000000..bac16f1
--- /dev/null
+++ b/pkgs/term_glyph/pubspec.yaml
@@ -0,0 +1,12 @@
+name: term_glyph
+version: 1.2.3-wip
+description: Useful Unicode glyphs and ASCII substitutes.
+repository: https://github.com/dart-lang/tools/tree/main/pkgs/term_glyph
+
+environment:
+ sdk: ^3.1.0
+
+dev_dependencies:
+ csv: ^6.0.0
+ dart_flutter_team_lints: ^3.0.0
+ test: ^1.16.6
diff --git a/pkgs/term_glyph/test/symbol_test.dart b/pkgs/term_glyph/test/symbol_test.dart
new file mode 100644
index 0000000..b3b4d09
--- /dev/null
+++ b/pkgs/term_glyph/test/symbol_test.dart
@@ -0,0 +1,60 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:term_glyph/term_glyph.dart' as glyph;
+import 'package:test/test.dart';
+
+void main() {
+ group('with ascii = false', () {
+ setUpAll(() {
+ glyph.ascii = false;
+ });
+
+ test('glyph getters return Unicode versions', () {
+ expect(glyph.topLeftCorner, equals('┌'));
+ expect(glyph.teeUpBold, equals('┻'));
+ expect(glyph.longLeftArrow, equals('◀━'));
+ });
+
+ test('glyphOrAscii returns the first argument', () {
+ expect(glyph.glyphOrAscii('A', 'B'), equals('A'));
+ });
+
+ test('glyphs returns unicodeGlyphs', () {
+ expect(glyph.glyphs, equals(glyph.unicodeGlyphs));
+ });
+
+ test('asciiGlyphs still returns ASCII characters', () {
+ expect(glyph.asciiGlyphs.topLeftCorner, equals(','));
+ expect(glyph.asciiGlyphs.teeUpBold, equals('+'));
+ expect(glyph.asciiGlyphs.longLeftArrow, equals('<='));
+ });
+ });
+
+ group('with ascii = true', () {
+ setUpAll(() {
+ glyph.ascii = true;
+ });
+
+ test('glyphs return ASCII versions', () {
+ expect(glyph.topLeftCorner, equals(','));
+ expect(glyph.teeUpBold, equals('+'));
+ expect(glyph.longLeftArrow, equals('<='));
+ });
+
+ test('glyphOrAscii returns the second argument', () {
+ expect(glyph.glyphOrAscii('A', 'B'), equals('B'));
+ });
+
+ test('glyphs returns asciiGlyphs', () {
+ expect(glyph.glyphs, equals(glyph.asciiGlyphs));
+ });
+
+ test('unicodeGlyphs still returns Unicode characters', () {
+ expect(glyph.unicodeGlyphs.topLeftCorner, equals('┌'));
+ expect(glyph.unicodeGlyphs.teeUpBold, equals('┻'));
+ expect(glyph.unicodeGlyphs.longLeftArrow, equals('◀━'));
+ });
+ });
+}
diff --git a/pkgs/term_glyph/tool/generate.dart b/pkgs/term_glyph/tool/generate.dart
new file mode 100644
index 0000000..b96b7bd
--- /dev/null
+++ b/pkgs/term_glyph/tool/generate.dart
@@ -0,0 +1,156 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:io';
+
+import 'package:csv/csv.dart';
+
+void main() {
+ final csv = CsvCodec(eol: '\n');
+ final data = csv.decoder.convert(File('data.csv').readAsStringSync());
+
+ // Remove comments and empty lines.
+ data.removeWhere((row) => row.length < 3);
+
+ Directory(_generatedDir).createSync(recursive: true);
+
+ _writeGlyphSetInterface(data);
+ _writeGlyphSet(data, ascii: false);
+ _writeGlyphSet(data, ascii: true);
+ _writeTopLevel(data);
+
+ final result = Process.runSync(Platform.resolvedExecutable, [
+ 'format',
+ _generatedDir,
+ ]);
+ print(result.stderr);
+ exit(result.exitCode);
+}
+
+const _generatedDir = 'lib/src/generated';
+
+/// Writes `lib/src/generated/glyph_set.dart`.
+void _writeGlyphSetInterface(List<List<dynamic>> data) {
+ final file =
+ File('$_generatedDir/glyph_set.dart').openSync(mode: FileMode.write);
+ file.writeStringSync(_header);
+ file.writeStringSync(r'''
+
+ /// A class that provides access to every configurable glyph.
+ ///
+ /// This is provided as a class so that individual chunks of code can choose
+ /// between `ascii` and `unicode` glyphs. For example:
+ ///
+ /// ```dart
+ /// import 'package:term_glyph/term_glyph.dart' as glyph;
+ ///
+ /// /// Adds a vertical line to the left of [text].
+ /// ///
+ /// /// If [unicode] is `true`, this uses Unicode for the line. If it's
+ /// /// `false`, this uses plain ASCII characters. If it's `null`, it
+ /// /// defaults to [glyph.ascii].
+ /// void addVerticalLine(String text, {bool unicode}) {
+ /// var glyphs =
+ /// (unicode ?? !glyph.ascii) ? glyph.unicodeGlyphs : glyph.asciiGlyphs;
+ ///
+ /// return text
+ /// .split('\n')
+ /// .map((line) => '${glyphs.verticalLine} $line')
+ /// .join('\n');
+ /// }
+ /// ```
+ abstract class GlyphSet {
+ /// Returns [glyph] if `this` supports Unicode glyphs and [alternative]
+ /// otherwise.
+ String glyphOrAscii(String glyph, String alternative);
+ ''');
+
+ for (var glyph in data) {
+ for (var line in (glyph[3] as String).split('\n')) {
+ file.writeStringSync('/// $line\n');
+ }
+
+ file.writeStringSync('String get ${glyph[0]};');
+ }
+
+ file.writeStringSync('}');
+ file.closeSync();
+}
+
+/// Writes `lib/src/generated/${prefix.toLowerCase()}_glyph_set.dart`.
+///
+/// If [ascii] is `true`, this writes the ASCII glyph set. Otherwise it writes
+/// the Unicode glyph set.
+void _writeGlyphSet(List<List<dynamic>> data, {required bool ascii}) {
+ final file =
+ File('$_generatedDir/${ascii ? "ascii" : "unicode"}_glyph_set.dart')
+ .openSync(mode: FileMode.write);
+
+ final className = '${ascii ? "Ascii" : "Unicode"}GlyphSet';
+ file.writeStringSync('''
+ $_header
+
+
+ import 'glyph_set.dart';
+
+ /// A [GlyphSet] that includes only ${ascii ? "ASCII" : "Unicode"} glyphs.
+ class $className implements GlyphSet {
+ const $className();
+ /// Returns [glyph] if `this` supports Unicode glyphs and [alternative]
+ /// otherwise.
+ @override
+ String glyphOrAscii(String glyph, String alternative) =>
+ ${ascii ? "alternative" : "glyph"};
+ ''');
+
+ final index = ascii ? 2 : 1;
+ for (var glyph in data) {
+ file.writeStringSync('''
+ @override
+ String get ${glyph[0]} => ${_quote(glyph[index] as String)};
+ ''');
+ }
+
+ file.writeStringSync('}');
+ file.closeSync();
+}
+
+/// Writes `lib/src/generated/top_level.dart`.
+void _writeTopLevel(List<List<dynamic>> data) {
+ final file =
+ File('$_generatedDir/top_level.dart').openSync(mode: FileMode.write);
+
+ file.writeStringSync('''
+ $_header
+
+ import '../../term_glyph.dart' as glyph;
+ ''');
+
+ for (var glyph in data) {
+ for (var line in (glyph[3] as String).split('\n')) {
+ file.writeStringSync('/// $line\n');
+ }
+
+ file.writeStringSync('''
+ ///
+ /// If [glyph.ascii] is `false`, this is "${glyph[1]}". If it's `true`, this is
+ /// "${glyph[2]}" instead.
+ String get ${glyph[0]} => glyph.glyphs.${glyph[0]};
+ ''');
+ }
+
+ file.closeSync();
+}
+
+String _quote(String input) => input.contains("'") ? '"$input"' : "'$input'";
+
+const _header = '''
+// Copyright (c) 2018, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+// Don't modify this file by hand! It's generated by tool/generate.dart.
+
+// ignore_for_file: lines_longer_than_80_chars
+''';
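
For context, the generator above reads a `data.csv` file (not included in this diff) whose rows it treats as identifier, Unicode glyph, ASCII fallback, and doc-comment text, and then runs `dart format` over `lib/src/generated`. Assuming `data.csv` sits at the package root, a likely way to regenerate the files is:

```
dart run tool/generate.dart
```
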
diff --git a/pkgs/test_reflective_loader/.gitignore b/pkgs/test_reflective_loader/.gitignore
new file mode 100644
index 0000000..2a2c261
--- /dev/null
+++ b/pkgs/test_reflective_loader/.gitignore
@@ -0,0 +1,11 @@
+.buildlog
+.DS_Store
+.idea
+.dart_tool/
+.pub/
+.project
+.settings/
+build/
+packages
+.packages
+pubspec.lock
diff --git a/pkgs/test_reflective_loader/AUTHORS b/pkgs/test_reflective_loader/AUTHORS
new file mode 100644
index 0000000..e8063a8
--- /dev/null
+++ b/pkgs/test_reflective_loader/AUTHORS
@@ -0,0 +1,6 @@
+# Below is a list of people and organizations that have contributed
+# to the project. Names should be added to the list like so:
+#
+# Name/Organization <email address>
+
+Google Inc.
diff --git a/pkgs/test_reflective_loader/CHANGELOG.md b/pkgs/test_reflective_loader/CHANGELOG.md
new file mode 100644
index 0000000..803eb0e
--- /dev/null
+++ b/pkgs/test_reflective_loader/CHANGELOG.md
@@ -0,0 +1,72 @@
+## 0.2.3
+
+- Require Dart `^3.1.0`.
+- Move to `dart-lang/tools` monorepo.
+
+## 0.2.2
+
+- Update to package:lints 2.0.0 and move it to a dev dependency.
+
+## 0.2.1
+
+- Use package:lints for analysis.
+- Populate the pubspec `repository` field.
+
+## 0.2.0
+
+- Stable null safety release.
+
+## 0.2.0-nullsafety.0
+
+- Migrate to the null safety language feature.
+
+## 0.1.9
+
+- Add `@SkippedTest` annotation and `skip_test` prefix.
+
+## 0.1.8
+
+- Update `FailingTest` to add named parameters `issue` and `reason`.
+
+## 0.1.7
+
+- Update documentation comments.
+- Remove `@MirrorsUsed` annotation on `dart:mirrors`.
+
+## 0.1.6
+
+- Make `FailingTest` public, with the URI of the issue that causes
+ the test to break.
+
+## 0.1.5
+
+- Set max SDK version to `<3.0.0`, and adjust other dependencies.
+
+## 0.1.3
+
+- Fix `@failingTest` to fail when the test passes.
+
+## 0.1.2
+
+- Update the pubspec `dependencies` section to include `package:test`
+
+## 0.1.1
+
+- For `@failingTest` tests, properly handle when the test fails by throwing an
+ exception in a timer task
+- Analyze this package in strong mode
+
+## 0.1.0
+
+- Switched from 'package:unittest' to 'package:test'.
+- Since 'package:test' does not define 'solo_test', in order to keep this
+ functionality, `defineReflectiveSuite` must be used to wrap all
+ `defineReflectiveTests` invocations.
+
+## 0.0.4
+
+- Added @failingTest, @assertFailingTest and @soloTest annotations.
+
+## 0.0.1
+
+- Initial version
diff --git a/pkgs/test_reflective_loader/LICENSE b/pkgs/test_reflective_loader/LICENSE
new file mode 100644
index 0000000..633672a
--- /dev/null
+++ b/pkgs/test_reflective_loader/LICENSE
@@ -0,0 +1,27 @@
+Copyright 2015, the Dart project authors.
+
+Redistribution and use in source and binary forms, with or without
+modification, are permitted provided that the following conditions are
+met:
+
+ * Redistributions of source code must retain the above copyright
+ notice, this list of conditions and the following disclaimer.
+ * Redistributions in binary form must reproduce the above
+ copyright notice, this list of conditions and the following
+ disclaimer in the documentation and/or other materials provided
+ with the distribution.
+ * Neither the name of Google LLC nor the names of its
+ contributors may be used to endorse or promote products derived
+ from this software without specific prior written permission.
+
+THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
diff --git a/pkgs/test_reflective_loader/README.md b/pkgs/test_reflective_loader/README.md
new file mode 100644
index 0000000..9b5a83d
--- /dev/null
+++ b/pkgs/test_reflective_loader/README.md
@@ -0,0 +1,28 @@
+[](https://github.com/dart-lang/tools/actions/workflows/test_reflective_loader.yaml)
+[](https://pub.dev/packages/test_reflective_loader)
+[](https://pub.dev/packages/test_reflective_loader/publisher)
+
+Support for discovering tests and test suites using reflection.
+
+This package follows the xUnit style where each class is a test suite, and each
+method with the name prefix `test_` is a single test.
+
+Methods with names starting with `test_` are run using the `test()` function with
+the corresponding name. If the class defines methods `setUp()` or `tearDown()`,
+they are executed before / after each test respectively, even if the test fails.
+
+Methods with names starting with `solo_test_` are run using the `solo_test()` function.
+
+Methods with names starting with `fail_` are expected to fail.
+
+Methods with names starting with `solo_fail_` are run using the `solo_test()` function
+and expected to fail.
+
+Methods returning `Future` instances are asynchronous, so `tearDown()` is
+executed after the returned `Future` completes.
+
+## Features and bugs
+
+Please file feature requests and bugs at the [issue tracker][tracker].
+
+[tracker]: https://github.com/dart-lang/tools/issues?q=is%3Aissue+is%3Aopen+label%3Apackage%3Atest_reflective_loader
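
A minimal sketch of the conventions described above (the class and method names here are hypothetical, not from the package):

```dart
import 'package:test/test.dart';
import 'package:test_reflective_loader/test_reflective_loader.dart';

void main() {
  defineReflectiveSuite(() {
    defineReflectiveTests(StringUtilsTest);
  });
}

@reflectiveTest
class StringUtilsTest {
  void setUp() {
    // Runs before every test_ method.
  }

  void tearDown() {
    // Runs after every test_ method, even if the test fails.
  }

  void test_trim() {
    expect(' a '.trim(), 'a');
  }

  Future<void> test_async() async {
    // Because this returns a Future, tearDown() only runs once it completes.
    await Future<void>.delayed(const Duration(milliseconds: 1));
  }
}
```
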
diff --git a/pkgs/test_reflective_loader/analysis_options.yaml b/pkgs/test_reflective_loader/analysis_options.yaml
new file mode 100644
index 0000000..ea61158
--- /dev/null
+++ b/pkgs/test_reflective_loader/analysis_options.yaml
@@ -0,0 +1,5 @@
+include: package:dart_flutter_team_lints/analysis_options.yaml
+
+linter:
+ rules:
+ - public_member_api_docs
diff --git a/pkgs/test_reflective_loader/lib/test_reflective_loader.dart b/pkgs/test_reflective_loader/lib/test_reflective_loader.dart
new file mode 100644
index 0000000..cb69bf3
--- /dev/null
+++ b/pkgs/test_reflective_loader/lib/test_reflective_loader.dart
@@ -0,0 +1,354 @@
+// Copyright (c) 2015, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+import 'dart:mirrors';
+
+import 'package:test/test.dart' as test_package;
+
+/// A marker annotation used to annotate test methods which are expected to fail
+/// when asserts are enabled.
+const Object assertFailingTest = _AssertFailingTest();
+
+/// A marker annotation used to annotate test methods which are expected to
+/// fail.
+const Object failingTest = FailingTest();
+
+/// A marker annotation used to instruct dart2js to keep reflection information
+/// for the annotated classes.
+const Object reflectiveTest = _ReflectiveTest();
+
+/// A marker annotation used to annotate test methods that should be skipped.
+const Object skippedTest = SkippedTest();
+
+/// A marker annotation used to annotate "solo" groups and tests.
+const Object soloTest = _SoloTest();
+
+final List<_Group> _currentGroups = <_Group>[];
+int _currentSuiteLevel = 0;
+String _currentSuiteName = '';
+
+/// Whether the application is running in checked mode.
+final bool _isCheckedMode = () {
+ try {
+ assert(false);
+ return false;
+ } catch (_) {
+ return true;
+ }
+}();
+
+/// Runs the [define] function, which calls [defineReflectiveTests] to add
+/// normal and "solo" tests, and also calls [defineReflectiveSuite] to create
+/// embedded suites. If the current suite is the top-level one, checks for
+/// "solo" groups and tests, and runs either all items or only the "solo" ones.
+void defineReflectiveSuite(void Function() define, {String name = ''}) {
+ var groupName = _currentSuiteName;
+ _currentSuiteLevel++;
+ try {
+ _currentSuiteName = _combineNames(_currentSuiteName, name);
+ define();
+ } finally {
+ _currentSuiteName = groupName;
+ _currentSuiteLevel--;
+ }
+ _addTestsIfTopLevelSuite();
+}
+
+/// Runs test methods existing in the given [type].
+///
+/// If there is a "solo" test method in the top-level suite, only "solo" methods
+/// are run.
+///
+/// If there is a "solo" test type, only its test methods are run.
+///
+/// Otherwise all test methods of all test types are run.
+///
+/// Each method is run with a new instance of [type].
+/// So, [type] should have a default constructor.
+///
+/// If [type] declares a method `setUp`, it will be invoked before every
+/// test method invocation.
+///
+/// If [type] declares a method `tearDown`, it will be invoked after every
+/// test method invocation. If a test method returns a [Future] to test
+/// asynchronous behavior, `tearDown` will be invoked after it completes.
+void defineReflectiveTests(Type type) {
+ var classMirror = reflectClass(type);
+ if (!classMirror.metadata.any((InstanceMirror annotation) =>
+ annotation.type.reflectedType == _ReflectiveTest)) {
+ var name = MirrorSystem.getName(classMirror.qualifiedName);
+ throw Exception('Class $name must have annotation "@reflectiveTest" '
+ 'in order to be run by runReflectiveTests.');
+ }
+
+ _Group group;
+ {
+ var isSolo = _hasAnnotationInstance(classMirror, soloTest);
+ var className = MirrorSystem.getName(classMirror.simpleName);
+ group = _Group(isSolo, _combineNames(_currentSuiteName, className));
+ _currentGroups.add(group);
+ }
+
+ classMirror.instanceMembers
+ .forEach((Symbol symbol, MethodMirror memberMirror) {
+ // we need only methods
+ if (!memberMirror.isRegularMethod) {
+ return;
+ }
+ // prepare information about the method
+ var memberName = MirrorSystem.getName(symbol);
+ var isSolo = memberName.startsWith('solo_') ||
+ _hasAnnotationInstance(memberMirror, soloTest);
+ // test_
+ if (memberName.startsWith('test_')) {
+ if (_hasSkippedTestAnnotation(memberMirror)) {
+ group.addSkippedTest(memberName);
+ } else {
+ group.addTest(isSolo, memberName, memberMirror, () {
+ if (_hasFailingTestAnnotation(memberMirror) ||
+ _isCheckedMode && _hasAssertFailingTestAnnotation(memberMirror)) {
+ return _runFailingTest(classMirror, symbol);
+ } else {
+ return _runTest(classMirror, symbol);
+ }
+ });
+ }
+ return;
+ }
+ // solo_test_
+ if (memberName.startsWith('solo_test_')) {
+ group.addTest(true, memberName, memberMirror, () {
+ return _runTest(classMirror, symbol);
+ });
+ }
+ // fail_test_
+ if (memberName.startsWith('fail_')) {
+ group.addTest(isSolo, memberName, memberMirror, () {
+ return _runFailingTest(classMirror, symbol);
+ });
+ }
+ // solo_fail_test_
+ if (memberName.startsWith('solo_fail_')) {
+ group.addTest(true, memberName, memberMirror, () {
+ return _runFailingTest(classMirror, symbol);
+ });
+ }
+ // skip_test_
+ if (memberName.startsWith('skip_test_')) {
+ group.addSkippedTest(memberName);
+ }
+ });
+
+ // Support for the case of missing enclosing [defineReflectiveSuite].
+ _addTestsIfTopLevelSuite();
+}
+
+/// If the current suite is the top-level one, add tests to the `test` package.
+void _addTestsIfTopLevelSuite() {
+ if (_currentSuiteLevel == 0) {
+ void runTests({required bool allGroups, required bool allTests}) {
+ for (var group in _currentGroups) {
+ if (allGroups || group.isSolo) {
+ for (var test in group.tests) {
+ if (allTests || test.isSolo) {
+ test_package.test(test.name, test.function,
+ timeout: test.timeout, skip: test.isSkipped);
+ }
+ }
+ }
+ }
+ }
+
+ if (_currentGroups.any((g) => g.hasSoloTest)) {
+ runTests(allGroups: true, allTests: false);
+ } else if (_currentGroups.any((g) => g.isSolo)) {
+ runTests(allGroups: false, allTests: true);
+ } else {
+ runTests(allGroups: true, allTests: true);
+ }
+ _currentGroups.clear();
+ }
+}
+
+/// Return the combination of the [base] and [addition] names.
+/// If either of the two is empty, then the other one is returned.
+String _combineNames(String base, String addition) {
+ if (base.isEmpty) {
+ return addition;
+ } else if (addition.isEmpty) {
+ return base;
+ } else {
+ return '$base | $addition';
+ }
+}
+
+Object? _getAnnotationInstance(DeclarationMirror declaration, Type type) {
+ for (var annotation in declaration.metadata) {
+ if ((annotation.reflectee as Object).runtimeType == type) {
+ return annotation.reflectee;
+ }
+ }
+ return null;
+}
+
+bool _hasAnnotationInstance(DeclarationMirror declaration, Object instance) =>
+ declaration.metadata.any((InstanceMirror annotation) =>
+ identical(annotation.reflectee, instance));
+
+bool _hasAssertFailingTestAnnotation(MethodMirror method) =>
+ _hasAnnotationInstance(method, assertFailingTest);
+
+bool _hasFailingTestAnnotation(MethodMirror method) =>
+ _hasAnnotationInstance(method, failingTest);
+
+bool _hasSkippedTestAnnotation(MethodMirror method) =>
+ _hasAnnotationInstance(method, skippedTest);
+
+Future<Object?> _invokeSymbolIfExists(
+ InstanceMirror instanceMirror, Symbol symbol) {
+ Object? invocationResult;
+ InstanceMirror? closure;
+ try {
+ closure = instanceMirror.getField(symbol);
+ // ignore: avoid_catching_errors
+ } on NoSuchMethodError {
+ // ignore
+ }
+
+ if (closure is ClosureMirror) {
+ invocationResult = closure.apply([]).reflectee;
+ }
+ return Future.value(invocationResult);
+}
+
+/// Run a test that is expected to fail, and confirm that it fails.
+///
+/// This properly handles the following cases:
+/// - The test fails by throwing an exception
+/// - The test returns a future which completes with an error.
+/// - An exception is thrown to the zone handler from a timer task.
+Future<Object?>? _runFailingTest(ClassMirror classMirror, Symbol symbol) {
+ var passed = false;
+ return runZonedGuarded(() {
+ // ignore: void_checks
+ return Future.sync(() => _runTest(classMirror, symbol)).then<void>((_) {
+ passed = true;
+ test_package.fail('Test passed - expected to fail.');
+ }).catchError((Object e) {
+ // if passed, and we call fail(), rethrow this exception
+ if (passed) {
+ // ignore: only_throw_errors
+ throw e;
+ }
+ // otherwise, an exception is not a failure for _runFailingTest
+ });
+ }, (e, st) {
+ // if passed, and we call fail(), rethrow this exception
+ if (passed) {
+ // ignore: only_throw_errors
+ throw e;
+ }
+ // otherwise, an exception is not a failure for _runFailingTest
+ });
+}
+
+Future<void> _runTest(ClassMirror classMirror, Symbol symbol) async {
+ var instanceMirror = classMirror.newInstance(const Symbol(''), []);
+ try {
+ await _invokeSymbolIfExists(instanceMirror, #setUp);
+ await instanceMirror.invoke(symbol, []).reflectee;
+ } finally {
+ await _invokeSymbolIfExists(instanceMirror, #tearDown);
+ }
+}
+
+typedef _TestFunction = dynamic Function();
+
+/// A marker annotation used to annotate test methods which are expected to
+/// fail.
+class FailingTest {
+ /// Initialize this annotation with the given arguments.
+ ///
+ /// [issue] is a full URI describing the failure and used for tracking.
+ /// [reason] is a free form textual description.
+ const FailingTest({String? issue, String? reason});
+}
+
+/// A marker annotation used to annotate test methods which are skipped.
+class SkippedTest {
+ /// Initialize this annotation with the given arguments.
+ ///
+ /// [issue] is a full URI describing the failure and used for tracking.
+ /// [reason] is a free form textual description.
+ const SkippedTest({String? issue, String? reason});
+}
+
+/// A marker annotation used to annotate test methods with additional timeout
+/// information.
+class TestTimeout {
+ final test_package.Timeout _timeout;
+
+ /// Initialize this annotation with the given timeout.
+ const TestTimeout(test_package.Timeout timeout) : _timeout = timeout;
+}
+
+/// A marker annotation used to annotate test methods which are expected to fail
+/// when asserts are enabled.
+class _AssertFailingTest {
+ const _AssertFailingTest();
+}
+
+/// Information about a type based test group.
+class _Group {
+ final bool isSolo;
+ final String name;
+ final List<_Test> tests = <_Test>[];
+
+ _Group(this.isSolo, this.name);
+
+ bool get hasSoloTest => tests.any((test) => test.isSolo);
+
+ void addSkippedTest(String name) {
+ var fullName = _combineNames(this.name, name);
+ tests.add(_Test.skipped(isSolo, fullName));
+ }
+
+ void addTest(bool isSolo, String name, MethodMirror memberMirror,
+ _TestFunction function) {
+ var fullName = _combineNames(this.name, name);
+ var timeout =
+ _getAnnotationInstance(memberMirror, TestTimeout) as TestTimeout?;
+ tests.add(_Test(isSolo, fullName, function, timeout?._timeout));
+ }
+}
+
+/// A marker annotation used to instruct dart2js to keep reflection information
+/// for the annotated classes.
+class _ReflectiveTest {
+ const _ReflectiveTest();
+}
+
+/// A marker annotation used to annotate "solo" groups and tests.
+class _SoloTest {
+ const _SoloTest();
+}
+
+/// Information about a test.
+class _Test {
+ final bool isSolo;
+ final String name;
+ final _TestFunction function;
+ final test_package.Timeout? timeout;
+
+ final bool isSkipped;
+
+ _Test(this.isSolo, this.name, this.function, this.timeout)
+ : isSkipped = false;
+
+ _Test.skipped(this.isSolo, this.name)
+ : isSkipped = true,
+ function = (() {}),
+ timeout = null;
+}
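
Besides the plain `@failingTest` and `@skippedTest` constants, the annotation classes above accept arguments; a hedged sketch of how they might be applied (the test class, method names, and issue URL are placeholders):

```dart
import 'package:test/test.dart' show Timeout;
import 'package:test_reflective_loader/test_reflective_loader.dart';

void main() {
  defineReflectiveSuite(() {
    defineReflectiveTests(ParserTest);
  });
}

@reflectiveTest
class ParserTest {
  @FailingTest(issue: 'https://example.com/issues/123')
  void test_knownBug() {
    // Expected to fail until the linked issue is fixed.
    throw StateError('not implemented yet');
  }

  @SkippedTest(reason: 'flaky on some platforms')
  void test_flaky() {}

  @TestTimeout(Timeout(Duration(minutes: 2)))
  void test_slow() {
    // Uses the per-test timeout from the annotation instead of the default.
  }
}
```
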
diff --git a/pkgs/test_reflective_loader/pubspec.yaml b/pkgs/test_reflective_loader/pubspec.yaml
new file mode 100644
index 0000000..569933f
--- /dev/null
+++ b/pkgs/test_reflective_loader/pubspec.yaml
@@ -0,0 +1,13 @@
+name: test_reflective_loader
+version: 0.2.3
+description: Support for discovering tests and test suites using reflection.
+repository: https://github.com/dart-lang/tools/tree/main/pkgs/test_reflective_loader
+
+environment:
+ sdk: ^3.1.0
+
+dependencies:
+ test: ^1.16.0
+
+dev_dependencies:
+ dart_flutter_team_lints: ^3.0.0
diff --git a/pkgs/test_reflective_loader/test/test_reflective_loader_test.dart b/pkgs/test_reflective_loader/test/test_reflective_loader_test.dart
new file mode 100644
index 0000000..fad98a5
--- /dev/null
+++ b/pkgs/test_reflective_loader/test/test_reflective_loader_test.dart
@@ -0,0 +1,48 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+// ignore_for_file: non_constant_identifier_names
+
+import 'dart:async';
+
+import 'package:test/test.dart';
+import 'package:test_reflective_loader/test_reflective_loader.dart';
+
+void main() {
+ defineReflectiveSuite(() {
+ defineReflectiveTests(TestReflectiveLoaderTest);
+ });
+}
+
+@reflectiveTest
+class TestReflectiveLoaderTest {
+ void test_passes() {
+ expect(true, true);
+ }
+
+ @failingTest
+ void test_fails() {
+ expect(false, true);
+ }
+
+ @failingTest
+ void test_fails_throws_sync() {
+ throw StateError('foo');
+ }
+
+ @failingTest
+ Future test_fails_throws_async() {
+ return Future.error('foo');
+ }
+
+ @skippedTest
+ void test_fails_but_skipped() {
+ throw StateError('foo');
+ }
+
+ @skippedTest
+ void test_times_out_but_skipped() {
+ while (true) {}
+ }
+}
diff --git a/pkgs/timing/.gitignore b/pkgs/timing/.gitignore
new file mode 100644
index 0000000..1ddf798
--- /dev/null
+++ b/pkgs/timing/.gitignore
@@ -0,0 +1,7 @@
+.packages
+/build/
+pubspec.lock
+
+# Files generated by dart tools
+.dart_tool
+doc/
diff --git a/pkgs/timing/CHANGELOG.md b/pkgs/timing/CHANGELOG.md
new file mode 100644
index 0000000..8cdb8ea
--- /dev/null
+++ b/pkgs/timing/CHANGELOG.md
@@ -0,0 +1,34 @@
+## 1.0.2
+
+- Require Dart `3.4`.
+- Move to `dart-lang/tools` monorepo.
+
+## 1.0.1
+
+- Require Dart `2.14`.
+
+## 1.0.0
+
+- Enable null safety.
+- Require Dart `2.12`.
+
+## 0.1.1+3
+
+- Allow `package:json_annotation` `'>=1.0.0 <5.0.0'`.
+
+## 0.1.1+2
+
+- Support the latest version of `package:json_annotation`.
+- Require Dart 2.2 or later.
+
+## 0.1.1+1
+
+- Support the latest version of `package:json_annotation`.
+
+## 0.1.1
+
+- Add JSON serialization
+
+## 0.1.0
+
+- Initial release
diff --git a/pkgs/timing/LICENSE b/pkgs/timing/LICENSE
new file mode 100644
index 0000000..9972f6e
--- /dev/null
+++ b/pkgs/timing/LICENSE
@@ -0,0 +1,27 @@
+Copyright 2018, the Dart project authors.
+
+Redistribution and use in source and binary forms, with or without
+modification, are permitted provided that the following conditions are
+met:
+
+ * Redistributions of source code must retain the above copyright
+ notice, this list of conditions and the following disclaimer.
+ * Redistributions in binary form must reproduce the above
+ copyright notice, this list of conditions and the following
+ disclaimer in the documentation and/or other materials provided
+ with the distribution.
+ * Neither the name of Google LLC nor the names of its
+ contributors may be used to endorse or promote products derived
+ from this software without specific prior written permission.
+
+THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
diff --git a/pkgs/timing/README.md b/pkgs/timing/README.md
new file mode 100644
index 0000000..9dab7cc
--- /dev/null
+++ b/pkgs/timing/README.md
@@ -0,0 +1,30 @@
+[](https://github.com/dart-lang/tools/actions/workflows/timing.yaml)
+[](https://pub.dev/packages/timing)
+[](https://pub.dev/packages/timing/publisher)
+
+Timing is a simple package for tracking the performance of both async and sync actions.
+
+## Usage
+
+```dart
+var tracker = AsyncTimeTracker();
+await tracker.track(() async {
+ // some async code here
+});
+
+// Use results
+print('${tracker.duration} ${tracker.innerDuration} ${tracker.slices}');
+```
+
+## Building
+
+Use the following command to re-generate the `lib/src/timing.g.dart` file:
+
+```bash
+dart pub run build_runner build
+```
+
+## Publishing automation
+
+For information about our publishing automation and release process, see
+https://github.com/dart-lang/ecosystem/wiki/Publishing-automation.
diff --git a/pkgs/timing/analysis_options.yaml b/pkgs/timing/analysis_options.yaml
new file mode 100644
index 0000000..396236d
--- /dev/null
+++ b/pkgs/timing/analysis_options.yaml
@@ -0,0 +1,2 @@
+# https://dart.dev/tools/analysis#the-analysis-options-file
+include: package:dart_flutter_team_lints/analysis_options.yaml
diff --git a/pkgs/timing/lib/src/clock.dart b/pkgs/timing/lib/src/clock.dart
new file mode 100644
index 0000000..6a9d295
--- /dev/null
+++ b/pkgs/timing/lib/src/clock.dart
@@ -0,0 +1,20 @@
+// Copyright (c) 2017, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+/// A function that returns the current [DateTime].
+typedef _Clock = DateTime Function();
+DateTime _defaultClock() => DateTime.now();
+
+const _zoneKey = #timing_Clock;
+
+/// Returns the current [DateTime].
+///
+/// May be overridden for tests using [scopeClock].
+DateTime now() => (Zone.current[_zoneKey] as _Clock? ?? _defaultClock)();
+
+/// Runs [f], with [clock] scoped whenever [now] is called.
+T scopeClock<T>(DateTime Function() clock, T Function() f) =>
+ runZoned(f, zoneValues: {_zoneKey: clock});
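
A small sketch of how `scopeClock` makes timings deterministic (the same pattern `timing_test.dart` further down uses; importing `src/clock.dart` directly mirrors what that test does):

```dart
import 'package:timing/src/clock.dart';
import 'package:timing/timing.dart';

void main() {
  var time = DateTime(2017);
  final tracker = SyncTimeTracker();

  // Inside scopeClock, now() returns whatever the fake clock says, so the
  // tracked duration is fully controlled by the caller.
  scopeClock(() => time, () {
    tracker.track(() {
      time = time.add(const Duration(seconds: 5));
    });
  });

  print(tracker.duration); // 0:00:05.000000
}
```
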
diff --git a/pkgs/timing/lib/src/timing.dart b/pkgs/timing/lib/src/timing.dart
new file mode 100644
index 0000000..049ba81
--- /dev/null
+++ b/pkgs/timing/lib/src/timing.dart
@@ -0,0 +1,338 @@
+// Copyright (c) 2018, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'package:json_annotation/json_annotation.dart';
+
+import 'clock.dart';
+
+part 'timing.g.dart';
+
+/// The timings of an operation, including its [startTime], [stopTime], and
+/// [duration].
+@JsonSerializable()
+class TimeSlice {
+ /// The total duration of this operation, equivalent to taking the difference
+ /// between [stopTime] and [startTime].
+ Duration get duration => stopTime.difference(startTime);
+
+ final DateTime startTime;
+
+ final DateTime stopTime;
+
+ TimeSlice(this.startTime, this.stopTime);
+
+ factory TimeSlice.fromJson(Map<String, dynamic> json) =>
+ _$TimeSliceFromJson(json);
+
+ Map<String, dynamic> toJson() => _$TimeSliceToJson(this);
+
+ @override
+ String toString() => '($startTime + $duration)';
+}
+
+/// The timings of an async operation, consisting of several sync [slices]
+/// and including the total [startTime], [stopTime], and [duration].
+@JsonSerializable()
+class TimeSliceGroup implements TimeSlice {
+ final List<TimeSlice> slices;
+
+ @override
+ DateTime get startTime => slices.first.startTime;
+
+ @override
+ DateTime get stopTime => slices.last.stopTime;
+
+ /// The total duration of this operation, equivalent to taking the difference
+ /// between [stopTime] and [startTime].
+ @override
+ Duration get duration => stopTime.difference(startTime);
+
+ /// Sum of [duration]s of all [slices].
+ ///
+  /// If some of the slices implement [TimeSliceGroup], their [innerDuration]
+  /// is used to compute the sum.
+ Duration get innerDuration => slices.fold(
+ Duration.zero,
+ (duration, slice) =>
+ duration +
+ (slice is TimeSliceGroup ? slice.innerDuration : slice.duration));
+
+ TimeSliceGroup(this.slices);
+
+ /// Constructs TimeSliceGroup from JSON representation
+ factory TimeSliceGroup.fromJson(Map<String, dynamic> json) =>
+ _$TimeSliceGroupFromJson(json);
+
+ @override
+ Map<String, dynamic> toJson() => _$TimeSliceGroupToJson(this);
+
+ @override
+ String toString() => slices.toString();
+}
+
+abstract class TimeTracker implements TimeSlice {
+ /// Whether tracking is active.
+ ///
+ /// Tracking is only active after `isStarted` and before `isFinished`.
+ bool get isTracking;
+
+ /// Whether tracking is finished.
+ ///
+  /// The tracker can't be used as a [TimeSlice] before it is finished.
+ bool get isFinished;
+
+ /// Whether tracking was started.
+ ///
+ /// Equivalent of `isTracking || isFinished`
+ bool get isStarted;
+
+ T track<T>(T Function() action);
+}
+
+/// Tracks only sync actions
+class SyncTimeTracker implements TimeTracker {
+ /// When this operation started, call [_start] to set this.
+ @override
+ DateTime get startTime => _startTime!;
+ DateTime? _startTime;
+
+ /// When this operation stopped, call [_stop] to set this.
+ @override
+ DateTime get stopTime => _stopTime!;
+ DateTime? _stopTime;
+
+ /// Start tracking this operation, must only be called once, before [_stop].
+ void _start() {
+ assert(_startTime == null && _stopTime == null);
+ _startTime = now();
+ }
+
+ /// Stop tracking this operation, must only be called once, after [_start].
+ void _stop() {
+ assert(_startTime != null && _stopTime == null);
+ _stopTime = now();
+ }
+
+ /// Splits tracker into two slices.
+ ///
+  /// Returns a new [TimeSlice] that starts at [startTime] and ends now, and
+  /// moves the tracker's [startTime] to the current point in time.
+  ///
+  /// Doesn't change the state of the tracker. Can only be called while
+  /// [isTracking], and the tracker will still be tracking after the call.
+ TimeSlice _split() {
+ if (!isTracking) {
+ throw StateError('Can be only called while tracking');
+ }
+ final splitPoint = now();
+ final prevSlice = TimeSlice(_startTime!, splitPoint);
+ _startTime = splitPoint;
+ return prevSlice;
+ }
+
+ @override
+ T track<T>(T Function() action) {
+ if (isStarted) {
+ throw StateError('Can not be tracked twice');
+ }
+ _start();
+ try {
+ return action();
+ } finally {
+ _stop();
+ }
+ }
+
+ @override
+ bool get isStarted => _startTime != null;
+
+ @override
+ bool get isTracking => _startTime != null && _stopTime == null;
+
+ @override
+ bool get isFinished => _startTime != null && _stopTime != null;
+
+ @override
+ Duration get duration => _stopTime!.difference(_startTime!);
+
+ /// Converts to JSON representation
+ ///
+ /// Can't be used before [isFinished]
+ @override
+ Map<String, dynamic> toJson() => _$TimeSliceToJson(this);
+}
+
+/// Async actions returning a [Future] will be tracked as a single sync time
+/// span from the beginning of execution until the future completes.
+class SimpleAsyncTimeTracker extends SyncTimeTracker {
+ @override
+ T track<T>(T Function() action) {
+ if (isStarted) {
+ throw StateError('Can not be tracked twice');
+ }
+ T result;
+ _start();
+ try {
+ result = action();
+ } catch (_) {
+ _stop();
+ rethrow;
+ }
+ if (result is Future) {
+ return result.whenComplete(_stop) as T;
+ } else {
+ _stop();
+ return result;
+ }
+ }
+}
+
+/// No-op implementation of [TimeTracker] that does nothing.
+class NoOpTimeTracker implements TimeTracker {
+ static final sharedInstance = NoOpTimeTracker();
+
+ @override
+ Duration get duration =>
+ throw UnsupportedError('Unsupported in no-op implementation');
+
+ @override
+ DateTime get startTime =>
+ throw UnsupportedError('Unsupported in no-op implementation');
+
+ @override
+ DateTime get stopTime =>
+ throw UnsupportedError('Unsupported in no-op implementation');
+
+ @override
+ bool get isStarted =>
+ throw UnsupportedError('Unsupported in no-op implementation');
+
+ @override
+ bool get isTracking =>
+ throw UnsupportedError('Unsupported in no-op implementation');
+
+ @override
+ bool get isFinished =>
+ throw UnsupportedError('Unsupported in no-op implementation');
+
+ @override
+ T track<T>(T Function() action) => action();
+
+ @override
+ Map<String, dynamic> toJson() =>
+ throw UnsupportedError('Unsupported in no-op implementation');
+}
+
+/// Tracks all async execution as disjoint time [slices] in ascending order.
+///
+/// Can [track] both async and sync actions.
+/// Can exclude the time spent in nested trackers.
+///
+/// If the tracked action spawns dangling async executions, the behavior isn't
+/// defined: the tracker might or might not track the time of such executions.
+class AsyncTimeTracker extends TimeSliceGroup implements TimeTracker {
+ final bool trackNested;
+
+ static const _zoneKey = #timing_AsyncTimeTracker;
+
+ AsyncTimeTracker({this.trackNested = true}) : super([]);
+
+ T _trackSyncSlice<T>(ZoneDelegate parent, Zone zone, T Function() action) {
+ // Ignore dangling runs after tracker completes
+ if (isFinished) {
+ return action();
+ }
+
+ final isNestedRun = slices.isNotEmpty &&
+ slices.last is SyncTimeTracker &&
+ (slices.last as SyncTimeTracker).isTracking;
+ final isExcludedNestedTrack = !trackNested && zone[_zoneKey] != this;
+
+ // Exclude nested sync tracks
+ if (isNestedRun && isExcludedNestedTrack) {
+ final timer = slices.last as SyncTimeTracker;
+      // Split the already tracked time into a new slice.
+      // Replace the tracker in slices.last with the split slice, to indicate
+      // to recursive calls that we are not tracking.
+ slices.last = parent.run(zone, timer._split);
+ try {
+ return action();
+ } finally {
+ // Split tracker again and discard slice from nested tracker
+ parent.run(zone, timer._split);
+ // Add tracker back to list of slices and continue tracking
+ slices.add(timer);
+ }
+ }
+
+ // Exclude nested async tracks
+ if (isExcludedNestedTrack) {
+ return action();
+ }
+
+ // Split time slices in nested sync runs
+ if (isNestedRun) {
+ return action();
+ }
+
+ final timer = SyncTimeTracker();
+ slices.add(timer);
+
+ // Pass to parent zone, in case of overwritten clock
+ return parent.runUnary(zone, timer.track, action);
+ }
+
+ static final asyncTimeTrackerZoneSpecification = ZoneSpecification(
+ run: <R>(Zone self, ZoneDelegate parent, Zone zone, R Function() f) {
+ final tracker = self[_zoneKey] as AsyncTimeTracker;
+ return tracker._trackSyncSlice(parent, zone, () => parent.run(zone, f));
+ },
+ runUnary: <R, T>(Zone self, ZoneDelegate parent, Zone zone, R Function(T) f,
+ T arg) {
+ final tracker = self[_zoneKey] as AsyncTimeTracker;
+ return tracker._trackSyncSlice(
+ parent, zone, () => parent.runUnary(zone, f, arg));
+ },
+ runBinary: <R, T1, T2>(Zone self, ZoneDelegate parent, Zone zone,
+ R Function(T1, T2) f, T1 arg1, T2 arg2) {
+ final tracker = self[_zoneKey] as AsyncTimeTracker;
+ return tracker._trackSyncSlice(
+ parent, zone, () => parent.runBinary(zone, f, arg1, arg2));
+ },
+ );
+
+ @override
+ T track<T>(T Function() action) {
+ if (isStarted) {
+ throw StateError('Can not be tracked twice');
+ }
+ _tracking = true;
+ final result = runZoned(action,
+ zoneSpecification: asyncTimeTrackerZoneSpecification,
+ zoneValues: {_zoneKey: this});
+ if (result is Future) {
+ return result
+ // Break possible sync processing of future completion, so slice
+ // trackers can be finished
+ .whenComplete(Future.value)
+ .whenComplete(() => _tracking = false) as T;
+ } else {
+ _tracking = false;
+ return result;
+ }
+ }
+
+ bool? _tracking;
+
+ @override
+ bool get isStarted => _tracking != null;
+
+ @override
+ bool get isFinished => _tracking == false;
+
+ @override
+ bool get isTracking => _tracking == true;
+}
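
To illustrate the `trackNested` flag above, here is a hedged sketch using the same fake-clock pattern as the tests: with `trackNested: false`, slices recorded while a nested tracker is active are left out of `innerDuration`, while the wall-clock `duration` still spans them.

```dart
import 'package:timing/src/clock.dart';
import 'package:timing/timing.dart';

Future<void> main() async {
  var time = DateTime(2017);
  final outer = AsyncTimeTracker(trackNested: false);
  final inner = AsyncTimeTracker();

  // Drive the clock by hand so the durations below are exact.
  await scopeClock(() => time, () {
    return outer.track(() async {
      time = time.add(const Duration(seconds: 1));
      await inner.track(() async {
        time = time.add(const Duration(seconds: 2));
      });
      time = time.add(const Duration(seconds: 4));
    });
  });

  print(outer.duration);      // 7 s: the whole span, nested part included.
  print(outer.innerDuration); // 5 s: the nested tracker's 2 s are excluded.
  print(inner.duration);      // 2 s.
}
```
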
diff --git a/pkgs/timing/lib/src/timing.g.dart b/pkgs/timing/lib/src/timing.g.dart
new file mode 100644
index 0000000..679c082
--- /dev/null
+++ b/pkgs/timing/lib/src/timing.g.dart
@@ -0,0 +1,29 @@
+// GENERATED CODE - DO NOT MODIFY BY HAND
+
+part of 'timing.dart';
+
+// **************************************************************************
+// JsonSerializableGenerator
+// **************************************************************************
+
+TimeSlice _$TimeSliceFromJson(Map<String, dynamic> json) => TimeSlice(
+ DateTime.parse(json['startTime'] as String),
+ DateTime.parse(json['stopTime'] as String),
+ );
+
+Map<String, dynamic> _$TimeSliceToJson(TimeSlice instance) => <String, dynamic>{
+ 'startTime': instance.startTime.toIso8601String(),
+ 'stopTime': instance.stopTime.toIso8601String(),
+ };
+
+TimeSliceGroup _$TimeSliceGroupFromJson(Map<String, dynamic> json) =>
+ TimeSliceGroup(
+ (json['slices'] as List<dynamic>)
+ .map((e) => TimeSlice.fromJson(e as Map<String, dynamic>))
+ .toList(),
+ );
+
+Map<String, dynamic> _$TimeSliceGroupToJson(TimeSliceGroup instance) =>
+ <String, dynamic>{
+ 'slices': instance.slices,
+ };
diff --git a/pkgs/timing/lib/timing.dart b/pkgs/timing/lib/timing.dart
new file mode 100644
index 0000000..5cb16d4
--- /dev/null
+++ b/pkgs/timing/lib/timing.dart
@@ -0,0 +1,13 @@
+// Copyright (c) 2018, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+export 'src/timing.dart'
+ show
+ AsyncTimeTracker,
+ NoOpTimeTracker,
+ SimpleAsyncTimeTracker,
+ SyncTimeTracker,
+ TimeSlice,
+ TimeSliceGroup,
+ TimeTracker;
diff --git a/pkgs/timing/pubspec.yaml b/pkgs/timing/pubspec.yaml
new file mode 100644
index 0000000..891a8af
--- /dev/null
+++ b/pkgs/timing/pubspec.yaml
@@ -0,0 +1,18 @@
+name: timing
+version: 1.0.2
+description: >-
+ A simple package for tracking the performance of synchronous and asynchronous
+ actions.
+repository: https://github.com/dart-lang/tools/tree/main/pkgs/timing
+
+environment:
+ sdk: ^3.4.0
+
+dependencies:
+ json_annotation: ^4.9.0
+
+dev_dependencies:
+ build_runner: ^2.0.6
+ dart_flutter_team_lints: ^3.0.0
+ json_serializable: ^6.0.0
+ test: ^1.17.10
diff --git a/pkgs/timing/test/timing_test.dart b/pkgs/timing/test/timing_test.dart
new file mode 100644
index 0000000..b5836d9
--- /dev/null
+++ b/pkgs/timing/test/timing_test.dart
@@ -0,0 +1,416 @@
+// Copyright (c) 2018, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+// ignore_for_file: only_throw_errors, inference_failure_on_instance_creation
+
+import 'dart:async';
+
+import 'package:test/test.dart';
+import 'package:timing/src/clock.dart';
+import 'package:timing/src/timing.dart';
+
+void _noop() {}
+
+void main() {
+ late DateTime time;
+ final startTime = DateTime(2017);
+ DateTime fakeClock() => time;
+
+ late TimeTracker tracker;
+ late TimeTracker nestedTracker;
+
+ T scopedTrack<T>(T Function() f) =>
+ scopeClock(fakeClock, () => tracker.track(f));
+
+ setUp(() {
+ time = startTime;
+ });
+
+ void canHandleSync([void Function() additionalExpects = _noop]) {
+ test('Can track sync code', () {
+ expect(tracker.isStarted, false);
+ expect(tracker.isTracking, false);
+ expect(tracker.isFinished, false);
+ scopedTrack(() {
+ expect(tracker.isStarted, true);
+ expect(tracker.isTracking, true);
+ expect(tracker.isFinished, false);
+ time = time.add(const Duration(seconds: 5));
+ });
+ expect(tracker.isStarted, true);
+ expect(tracker.isTracking, false);
+ expect(tracker.isFinished, true);
+ expect(tracker.startTime, startTime);
+ expect(tracker.stopTime, time);
+ expect(tracker.duration, const Duration(seconds: 5));
+ additionalExpects();
+ });
+
+ test('Can track handled sync exceptions', () async {
+ scopedTrack(() {
+ try {
+ time = time.add(const Duration(seconds: 4));
+ throw 'error';
+ } on String {
+ time = time.add(const Duration(seconds: 1));
+ }
+ });
+ expect(tracker.isFinished, true);
+ expect(tracker.startTime, startTime);
+ expect(tracker.stopTime, time);
+ expect(tracker.duration, const Duration(seconds: 5));
+ additionalExpects();
+ });
+
+ test('Can track in case of unhandled sync exceptions', () async {
+ expect(
+ () => scopedTrack(() {
+ time = time.add(const Duration(seconds: 5));
+ throw 'error';
+ }),
+ throwsA(const TypeMatcher<String>()));
+ expect(tracker.startTime, startTime);
+ expect(tracker.stopTime, time);
+ expect(tracker.duration, const Duration(seconds: 5));
+ additionalExpects();
+ });
+
+ test('Can be nested sync', () {
+ scopedTrack(() {
+ time = time.add(const Duration(seconds: 1));
+ nestedTracker.track(() {
+ time = time.add(const Duration(seconds: 2));
+ });
+ time = time.add(const Duration(seconds: 4));
+ });
+ expect(tracker.isFinished, true);
+ expect(tracker.startTime, startTime);
+ expect(tracker.stopTime, time);
+ expect(tracker.duration, const Duration(seconds: 7));
+ expect(nestedTracker.startTime.isAfter(startTime), true);
+ expect(nestedTracker.stopTime.isBefore(time), true);
+ expect(nestedTracker.duration, const Duration(seconds: 2));
+ additionalExpects();
+ });
+ }
+
+ void canHandleAsync([void Function() additionalExpects = _noop]) {
+ test('Can track async code', () async {
+ expect(tracker.isStarted, false);
+ expect(tracker.isTracking, false);
+ expect(tracker.isFinished, false);
+ await scopedTrack(() => Future(() {
+ expect(tracker.isStarted, true);
+ expect(tracker.isTracking, true);
+ expect(tracker.isFinished, false);
+ time = time.add(const Duration(seconds: 5));
+ }));
+ expect(tracker.isStarted, true);
+ expect(tracker.isTracking, false);
+ expect(tracker.isFinished, true);
+ expect(tracker.startTime, startTime);
+ expect(tracker.stopTime, time);
+ expect(tracker.duration, const Duration(seconds: 5));
+ additionalExpects();
+ });
+
+ test('Can track handled async exceptions', () async {
+ await scopedTrack(() {
+ time = time.add(const Duration(seconds: 1));
+ return Future(() {
+ time = time.add(const Duration(seconds: 2));
+ throw 'error';
+ }).then((_) {
+ time = time.add(const Duration(seconds: 4));
+ }).catchError((error, stack) {
+ time = time.add(const Duration(seconds: 8));
+ });
+ });
+ expect(tracker.isFinished, true);
+ expect(tracker.startTime, startTime);
+ expect(tracker.stopTime, time);
+ expect(tracker.duration, const Duration(seconds: 11));
+ additionalExpects();
+ });
+
+ test('Can track in case of unhandled async exceptions', () async {
+ final future = scopedTrack(() {
+ time = time.add(const Duration(seconds: 1));
+ return Future(() {
+ time = time.add(const Duration(seconds: 2));
+ throw 'error';
+ }).then((_) {
+ time = time.add(const Duration(seconds: 4));
+ });
+ });
+ await expectLater(future, throwsA(const TypeMatcher<String>()));
+ expect(tracker.isFinished, true);
+ expect(tracker.startTime, startTime);
+ expect(tracker.stopTime, time);
+ expect(tracker.duration, const Duration(seconds: 3));
+ additionalExpects();
+ });
+
+ test('Can be nested async', () async {
+ await scopedTrack(() async {
+ time = time.add(const Duration(milliseconds: 1));
+ await Future.value();
+ time = time.add(const Duration(milliseconds: 2));
+ await nestedTracker.track(() async {
+ time = time.add(const Duration(milliseconds: 4));
+ await Future.value();
+ time = time.add(const Duration(milliseconds: 8));
+ await Future.value();
+ time = time.add(const Duration(milliseconds: 16));
+ });
+ time = time.add(const Duration(milliseconds: 32));
+ await Future.value();
+ time = time.add(const Duration(milliseconds: 64));
+ });
+ expect(tracker.isFinished, true);
+ expect(tracker.startTime, startTime);
+ expect(tracker.stopTime, time);
+ expect(tracker.duration, const Duration(milliseconds: 127));
+ expect(nestedTracker.startTime.isAfter(startTime), true);
+ expect(nestedTracker.stopTime.isBefore(time), true);
+ expect(nestedTracker.duration, const Duration(milliseconds: 28));
+ additionalExpects();
+ });
+ }
+
+ group('SyncTimeTracker', () {
+ setUp(() {
+ tracker = SyncTimeTracker();
+ nestedTracker = SyncTimeTracker();
+ });
+
+ canHandleSync();
+
+ test('Can not track async code', () async {
+ await scopedTrack(() => Future(() {
+ time = time.add(const Duration(seconds: 5));
+ }));
+ expect(tracker.isFinished, true);
+ expect(tracker.startTime, startTime);
+ expect(tracker.stopTime, startTime);
+ expect(tracker.duration, const Duration(seconds: 0));
+ });
+ });
+
+ group('AsyncTimeTracker.simple', () {
+ setUp(() {
+ tracker = SimpleAsyncTimeTracker();
+ nestedTracker = SimpleAsyncTimeTracker();
+ });
+
+ canHandleSync();
+
+ canHandleAsync();
+
+ test('Can not distinguish own async code', () async {
+ final future = scopedTrack(() => Future(() {
+ time = time.add(const Duration(seconds: 5));
+ }));
+ time = time.add(const Duration(seconds: 10));
+ await future;
+ expect(tracker.isFinished, true);
+ expect(tracker.startTime, startTime);
+ expect(tracker.stopTime, time);
+ expect(tracker.duration, const Duration(seconds: 15));
+ });
+ });
+
+ group('AsyncTimeTracker', () {
+ late AsyncTimeTracker asyncTracker;
+ late AsyncTimeTracker nestedAsyncTracker;
+ setUp(() {
+ tracker = asyncTracker = AsyncTimeTracker();
+ nestedTracker = nestedAsyncTracker = AsyncTimeTracker();
+ });
+
+ canHandleSync(() {
+ expect(asyncTracker.innerDuration, asyncTracker.duration);
+ expect(asyncTracker.slices.length, 1);
+ });
+
+ canHandleAsync(() {
+ expect(asyncTracker.innerDuration, asyncTracker.duration);
+ expect(asyncTracker.slices.length, greaterThan(1));
+ });
+
+ test('Can track complex async innerDuration', () async {
+ final completer = Completer();
+ final future = scopedTrack(() async {
+ time = time.add(const Duration(seconds: 1)); // Tracked sync
+ await Future.value();
+ time = time.add(const Duration(seconds: 2)); // Tracked async
+ await completer.future;
+ time = time.add(const Duration(seconds: 4)); // Tracked async, delayed
+ }).then((_) {
+ time = time.add(const Duration(seconds: 8)); // Async, after tracking
+ });
+ time = time.add(const Duration(seconds: 16)); // Sync, between slices
+
+ await Future(() {
+ // Async, between slices
+ time = time.add(const Duration(seconds: 32));
+ completer.complete();
+ });
+ await future;
+ expect(asyncTracker.isFinished, true);
+ expect(asyncTracker.startTime, startTime);
+ expect(asyncTracker.stopTime.isBefore(time), true);
+ expect(asyncTracker.duration, const Duration(seconds: 55));
+ expect(asyncTracker.innerDuration, const Duration(seconds: 7));
+ expect(asyncTracker.slices.length, greaterThan(1));
+ });
+
+ test('Can exclude nested sync', () {
+ tracker = asyncTracker = AsyncTimeTracker(trackNested: false);
+ scopedTrack(() {
+ time = time.add(const Duration(seconds: 1));
+ nestedAsyncTracker.track(() {
+ time = time.add(const Duration(seconds: 2));
+ });
+ time = time.add(const Duration(seconds: 4));
+ });
+ expect(asyncTracker.isFinished, true);
+ expect(asyncTracker.startTime, startTime);
+ expect(asyncTracker.stopTime, time);
+ expect(asyncTracker.duration, const Duration(seconds: 7));
+ expect(asyncTracker.innerDuration, const Duration(seconds: 5));
+ expect(asyncTracker.slices.length, greaterThan(1));
+ expect(nestedAsyncTracker.startTime.isAfter(startTime), true);
+ expect(nestedAsyncTracker.stopTime.isBefore(time), true);
+ expect(nestedAsyncTracker.duration, const Duration(seconds: 2));
+ expect(nestedAsyncTracker.innerDuration, const Duration(seconds: 2));
+ expect(nestedAsyncTracker.slices.length, 1);
+ });
+
+ test('Can exclude complex nested sync', () {
+ tracker = asyncTracker = AsyncTimeTracker(trackNested: false);
+ nestedAsyncTracker = AsyncTimeTracker(trackNested: false);
+ final nestedAsyncTracker2 = AsyncTimeTracker(trackNested: false);
+ scopedTrack(() {
+ time = time.add(const Duration(seconds: 1));
+ nestedAsyncTracker.track(() {
+ time = time.add(const Duration(seconds: 2));
+ nestedAsyncTracker2.track(() {
+ time = time.add(const Duration(seconds: 4));
+ });
+ time = time.add(const Duration(seconds: 8));
+ });
+ time = time.add(const Duration(seconds: 16));
+ });
+ expect(asyncTracker.isFinished, true);
+ expect(asyncTracker.startTime, startTime);
+ expect(asyncTracker.stopTime, time);
+ expect(asyncTracker.duration, const Duration(seconds: 31));
+ expect(asyncTracker.innerDuration, const Duration(seconds: 17));
+ expect(asyncTracker.slices.length, greaterThan(1));
+ expect(nestedAsyncTracker.startTime.isAfter(startTime), true);
+ expect(nestedAsyncTracker.stopTime.isBefore(time), true);
+ expect(nestedAsyncTracker.duration, const Duration(seconds: 14));
+ expect(nestedAsyncTracker.innerDuration, const Duration(seconds: 10));
+ expect(nestedAsyncTracker.slices.length, greaterThan(1));
+ expect(nestedAsyncTracker2.startTime.isAfter(startTime), true);
+ expect(nestedAsyncTracker2.stopTime.isBefore(time), true);
+ expect(nestedAsyncTracker2.duration, const Duration(seconds: 4));
+ expect(nestedAsyncTracker2.innerDuration, const Duration(seconds: 4));
+ expect(nestedAsyncTracker2.slices.length, 1);
+ });
+
+ test(
+ 'Can track all on grand-parent level and '
+ 'exclude grand-childrens from parent', () {
+ tracker = asyncTracker = AsyncTimeTracker(trackNested: true);
+ nestedAsyncTracker = AsyncTimeTracker(trackNested: false);
+ final nestedAsyncTracker2 = AsyncTimeTracker();
+ scopedTrack(() {
+ time = time.add(const Duration(seconds: 1));
+ nestedAsyncTracker.track(() {
+ time = time.add(const Duration(seconds: 2));
+ nestedAsyncTracker2.track(() {
+ time = time.add(const Duration(seconds: 4));
+ });
+ time = time.add(const Duration(seconds: 8));
+ });
+ time = time.add(const Duration(seconds: 16));
+ });
+ expect(asyncTracker.isFinished, true);
+ expect(asyncTracker.startTime, startTime);
+ expect(asyncTracker.stopTime, time);
+ expect(asyncTracker.duration, const Duration(seconds: 31));
+ expect(asyncTracker.innerDuration, const Duration(seconds: 31));
+ expect(asyncTracker.slices.length, 1);
+ expect(nestedAsyncTracker.startTime.isAfter(startTime), true);
+ expect(nestedAsyncTracker.stopTime.isBefore(time), true);
+ expect(nestedAsyncTracker.duration, const Duration(seconds: 14));
+ expect(nestedAsyncTracker.innerDuration, const Duration(seconds: 10));
+ expect(nestedAsyncTracker.slices.length, greaterThan(1));
+ expect(nestedAsyncTracker2.startTime.isAfter(startTime), true);
+ expect(nestedAsyncTracker2.stopTime.isBefore(time), true);
+ expect(nestedAsyncTracker2.duration, const Duration(seconds: 4));
+ expect(nestedAsyncTracker2.innerDuration, const Duration(seconds: 4));
+ expect(nestedAsyncTracker2.slices.length, 1);
+ });
+
+ test('Can exclude nested async', () async {
+ tracker = asyncTracker = AsyncTimeTracker(trackNested: false);
+ await scopedTrack(() async {
+ time = time.add(const Duration(seconds: 1));
+ await nestedAsyncTracker.track(() async {
+ time = time.add(const Duration(seconds: 2));
+ await Future.value();
+ time = time.add(const Duration(seconds: 4));
+ await Future.value();
+ time = time.add(const Duration(seconds: 8));
+ });
+ time = time.add(const Duration(seconds: 16));
+ });
+ expect(asyncTracker.isFinished, true);
+ expect(asyncTracker.startTime, startTime);
+ expect(asyncTracker.stopTime, time);
+ expect(asyncTracker.duration, const Duration(seconds: 31));
+ expect(asyncTracker.innerDuration, const Duration(seconds: 17));
+ expect(asyncTracker.slices.length, greaterThan(1));
+ expect(nestedAsyncTracker.startTime.isAfter(startTime), true);
+ expect(nestedAsyncTracker.stopTime.isBefore(time), true);
+ expect(nestedAsyncTracker.duration, const Duration(seconds: 14));
+ expect(nestedAsyncTracker.innerDuration, const Duration(seconds: 14));
+ expect(nestedAsyncTracker.slices.length, greaterThan(1));
+ });
+
+ test('Can handle callbacks in excluded nested async', () async {
+ tracker = asyncTracker = AsyncTimeTracker(trackNested: false);
+ await scopedTrack(() async {
+ time = time.add(const Duration(seconds: 1));
+ final completer = Completer();
+ final future = completer.future.then((_) {
+ time = time.add(const Duration(seconds: 2));
+ });
+ await nestedAsyncTracker.track(() async {
+ time = time.add(const Duration(seconds: 4));
+ await Future.value();
+ time = time.add(const Duration(seconds: 8));
+ completer.complete();
+ await future;
+ time = time.add(const Duration(seconds: 16));
+ });
+ time = time.add(const Duration(seconds: 32));
+ });
+ expect(asyncTracker.isFinished, true);
+ expect(asyncTracker.startTime, startTime);
+ expect(asyncTracker.stopTime, time);
+ expect(asyncTracker.duration, const Duration(seconds: 63));
+ expect(asyncTracker.innerDuration, const Duration(seconds: 35));
+ expect(asyncTracker.slices.length, greaterThan(1));
+ expect(nestedAsyncTracker.startTime.isAfter(startTime), true);
+ expect(nestedAsyncTracker.stopTime.isBefore(time), true);
+ expect(nestedAsyncTracker.duration, const Duration(seconds: 30));
+ expect(nestedAsyncTracker.innerDuration, const Duration(seconds: 28));
+ expect(nestedAsyncTracker.slices.length, greaterThan(1));
+ });
+ });
+}
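As context for the tests above, here is a minimal sketch of the tracker API they exercise. It assumes the `AsyncTimeTracker` type from the internal `package:timing/src/timing.dart` import used by the tests; that path may not be part of the package's public surface.

```dart
// Minimal sketch of the tracker API exercised by timing_test.dart above.
// Assumes AsyncTimeTracker from the internal src/timing.dart import used in
// the tests; the public package surface may differ.
import 'package:timing/src/timing.dart';

Future<void> main() async {
  final tracker = AsyncTimeTracker();
  await tracker.track(() async {
    // Only work inside this callback (and its async continuations) is
    // attributed to the tracker's slices.
    await Future<void>.delayed(const Duration(milliseconds: 50));
  });
  print('total: ${tracker.duration}');
  print('inner: ${tracker.innerDuration} across ${tracker.slices.length} slices');
}
```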
diff --git a/pkgs/watcher/.gitignore b/pkgs/watcher/.gitignore
new file mode 100644
index 0000000..ac98e87
--- /dev/null
+++ b/pkgs/watcher/.gitignore
@@ -0,0 +1,4 @@
+# Don’t commit the following directories created by pub.
+.dart_tool
+.packages
+pubspec.lock
diff --git a/pkgs/watcher/.test_config b/pkgs/watcher/.test_config
new file mode 100644
index 0000000..531426a
--- /dev/null
+++ b/pkgs/watcher/.test_config
@@ -0,0 +1,5 @@
+{
+ "test_package": {
+ "platforms": ["vm"]
+ }
+}
\ No newline at end of file
diff --git a/pkgs/watcher/CHANGELOG.md b/pkgs/watcher/CHANGELOG.md
new file mode 100644
index 0000000..ef3a7e2
--- /dev/null
+++ b/pkgs/watcher/CHANGELOG.md
@@ -0,0 +1,130 @@
+## 1.1.1
+
+- Ensure `PollingFileWatcher.ready` completes for files that do not exist.
+- Require Dart SDK `^3.1.0`
+- Move to `dart-lang/tools` monorepo.
+
+## 1.1.0
+
+- Require Dart SDK >= 3.0.0
+- Remove usage of redundant `ConstructableFileSystemEvent` classes.
+
+## 1.0.3-dev
+
+- Require Dart SDK >= 2.19
+
+## 1.0.2
+
+- Require Dart SDK >= 2.14
+- Ensure `DirectoryWatcher.ready` completes even when errors occur that close the watcher.
+- Add markdown badges to the readme.
+
+## 1.0.1
+
+* Drop package:pedantic and use package:lints instead.
+
+## 1.0.0
+
+* Require Dart SDK >= 2.12
+* Add the ability to create custom Watcher types for specific file paths.
+
+## 0.9.7+15
+
+* Fix a bug on Mac where modifying a directory with a path exactly matching a
+ prefix of a modified file would suppress change events for that file.
+
+## 0.9.7+14
+
+* Prepare for a breaking change in the SDK where modification times for
+  not-found files become meaningless instead of null.
+
+## 0.9.7+13
+
+* Catch & forward `FileSystemException` from unexpectedly closed file watchers
+  on Windows; the watcher will also be automatically restarted when this occurs.
+
+## 0.9.7+12
+
+* Catch `FileSystemException` during `existsSync()` on Windows.
+* Internal cleanup.
+
+## 0.9.7+11
+
+* Fix an analysis hint.
+
+## 0.9.7+10
+
+* Set max SDK version to `<3.0.0`, and adjust other dependencies.
+
+## 0.9.7+9
+
+* Internal changes only.
+
+## 0.9.7+8
+
+* Fix Dart 2.0 type issues on Mac and Windows.
+
+## 0.9.7+7
+
+* Updates to support Dart 2.0 core library changes (wave 2.2).
+ See [issue 31847][sdk#31847] for details.
+
+ [sdk#31847]: https://github.com/dart-lang/sdk/issues/31847
+
+
+## 0.9.7+6
+
+* Internal changes only, namely removing the dependency on `scheduled_test`.
+
+## 0.9.7+5
+
+* Fix an analysis warning.
+
+## 0.9.7+4
+
+* Declare support for `async` 2.0.0.
+
+## 0.9.7+3
+
+* Fix a crashing bug on Linux.
+
+## 0.9.7+2
+
+* Narrow the constraint on `async` to reflect the APIs this package is actually
+ using.
+
+## 0.9.7+1
+
+* Fix all strong-mode warnings.
+
+## 0.9.7
+
+* Fix a bug in `FileWatcher` where events could be added after watchers were
+ closed.
+
+## 0.9.6
+
+* Add a `Watcher` interface that encompasses watching both files and
+ directories.
+
+* Add `FileWatcher` and `PollingFileWatcher` classes for watching changes to
+ individual files.
+
+* Deprecate `DirectoryWatcher.directory`. Use `DirectoryWatcher.path` instead.
+
+## 0.9.5
+
+* Fix bugs where events could be added after watchers were closed.
+
+## 0.9.4
+
+* Treat add events for known files as modifications instead of discarding them
+ on Mac OS.
+
+## 0.9.3
+
+* Improved support for Windows via `WindowsDirectoryWatcher`.
+
+* Simplified `PollingDirectoryWatcher`.
+
+* Fixed bugs in `MacOSDirectoryWatcher`.
diff --git a/pkgs/watcher/LICENSE b/pkgs/watcher/LICENSE
new file mode 100644
index 0000000..000cd7b
--- /dev/null
+++ b/pkgs/watcher/LICENSE
@@ -0,0 +1,27 @@
+Copyright 2014, the Dart project authors.
+
+Redistribution and use in source and binary forms, with or without
+modification, are permitted provided that the following conditions are
+met:
+
+ * Redistributions of source code must retain the above copyright
+ notice, this list of conditions and the following disclaimer.
+ * Redistributions in binary form must reproduce the above
+ copyright notice, this list of conditions and the following
+ disclaimer in the documentation and/or other materials provided
+ with the distribution.
+ * Neither the name of Google LLC nor the names of its
+ contributors may be used to endorse or promote products derived
+ from this software without specific prior written permission.
+
+THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
diff --git a/pkgs/watcher/README.md b/pkgs/watcher/README.md
new file mode 100644
index 0000000..83a0324
--- /dev/null
+++ b/pkgs/watcher/README.md
@@ -0,0 +1,10 @@
+[](https://github.com/dart-lang/tools/actions/workflows/watcher.yaml)
+[](https://pub.dev/packages/watcher)
+[](https://pub.dev/packages/watcher/publisher)
+
+A file system watcher.
+
+## What's this?
+
+`package:watcher` monitors changes to the contents of directories and sends
+notifications when files have been added, removed, or modified.
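A minimal usage sketch to go with the README above. The path is purely illustrative; `FileWatcher`, `WatchEvent`, and `ready` are taken from the package sources and changelog in this diff.

```dart
import 'package:watcher/watcher.dart';

Future<void> main() async {
  // Watch a single file for changes; the path is hypothetical.
  final watcher = FileWatcher('pubspec.yaml');
  watcher.events.listen((event) => print('${event.type} ${event.path}'));

  // Completes once the watcher has taken its initial snapshot and is
  // emitting events.
  await watcher.ready;
}
```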
diff --git a/pkgs/watcher/analysis_options.yaml b/pkgs/watcher/analysis_options.yaml
new file mode 100644
index 0000000..d978f81
--- /dev/null
+++ b/pkgs/watcher/analysis_options.yaml
@@ -0,0 +1 @@
+include: package:dart_flutter_team_lints/analysis_options.yaml
diff --git a/pkgs/watcher/benchmark/path_set.dart b/pkgs/watcher/benchmark/path_set.dart
new file mode 100644
index 0000000..e7929d8
--- /dev/null
+++ b/pkgs/watcher/benchmark/path_set.dart
@@ -0,0 +1,158 @@
+// Copyright (c) 2015, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+/// Benchmarks for the PathSet class.
+library;
+
+import 'dart:io';
+import 'dart:math' as math;
+
+import 'package:benchmark_harness/benchmark_harness.dart';
+import 'package:path/path.dart' as p;
+import 'package:watcher/src/path_set.dart';
+
+final String root = Platform.isWindows ? r'C:\root' : '/root';
+
+/// Base class for benchmarks on [PathSet].
+abstract class PathSetBenchmark extends BenchmarkBase {
+ PathSetBenchmark(String method) : super('PathSet.$method');
+
+ final PathSet pathSet = PathSet(root);
+
+  /// Use a fixed [math.Random] with a constant seed to ensure the benchmarks
+  /// are deterministic.
+ final math.Random random = math.Random(1234);
+
+ /// Walks over a virtual directory [depth] levels deep invoking [callback]
+ /// for each "file".
+ ///
+ /// Each virtual directory contains ten entries: either subdirectories or
+ /// files.
+ void walkTree(int depth, void Function(String) callback) {
+ void recurse(String path, int remainingDepth) {
+ for (var i = 0; i < 10; i++) {
+ var padded = i.toString().padLeft(2, '0');
+ if (remainingDepth == 0) {
+ callback(p.join(path, 'file_$padded.txt'));
+ } else {
+ var subdir = p.join(path, 'subdirectory_$padded');
+ recurse(subdir, remainingDepth - 1);
+ }
+ }
+ }
+
+ recurse(root, depth);
+ }
+}
+
+class AddBenchmark extends PathSetBenchmark {
+ AddBenchmark() : super('add()');
+
+ final List<String> paths = [];
+
+ @override
+ void setup() {
+ // Make a bunch of paths in about the same order we expect to get them from
+ // Directory.list().
+ walkTree(3, paths.add);
+ }
+
+ @override
+ void run() {
+ for (var path in paths) {
+ pathSet.add(path);
+ }
+ }
+}
+
+class ContainsBenchmark extends PathSetBenchmark {
+ ContainsBenchmark() : super('contains()');
+
+ final List<String> paths = [];
+
+ @override
+ void setup() {
+ // Add a bunch of paths to the set.
+ walkTree(3, (path) {
+ pathSet.add(path);
+ paths.add(path);
+ });
+
+ // Add some non-existent paths to test the false case.
+ for (var i = 0; i < 100; i++) {
+ paths.addAll([
+ '/nope',
+ '/root/nope',
+ '/root/subdirectory_04/nope',
+ '/root/subdirectory_04/subdirectory_04/nope',
+ '/root/subdirectory_04/subdirectory_04/subdirectory_04/nope',
+ '/root/subdirectory_04/subdirectory_04/subdirectory_04/nope/file_04.txt',
+ ]);
+ }
+ }
+
+ @override
+ void run() {
+ var contained = 0;
+ for (var path in paths) {
+ if (pathSet.contains(path)) contained++;
+ }
+
+ if (contained != 10000) throw StateError('Wrong result: $contained');
+ }
+}
+
+class PathsBenchmark extends PathSetBenchmark {
+ PathsBenchmark() : super('toSet()');
+
+ @override
+ void setup() {
+ walkTree(3, pathSet.add);
+ }
+
+ @override
+ void run() {
+ var count = 0;
+ for (var _ in pathSet.paths) {
+ count++;
+ }
+
+ if (count != 10000) throw StateError('Wrong result: $count');
+ }
+}
+
+class RemoveBenchmark extends PathSetBenchmark {
+ RemoveBenchmark() : super('remove()');
+
+ final List<String> paths = [];
+
+ @override
+ void setup() {
+ // Make a bunch of paths. Do this here so that we don't spend benchmarked
+ // time synthesizing paths.
+ walkTree(3, (path) {
+ pathSet.add(path);
+ paths.add(path);
+ });
+
+ // Shuffle the paths so that we delete them in a random order that
+ // hopefully mimics real-world file system usage. Do the shuffling here so
+ // that we don't spend benchmarked time shuffling.
+ paths.shuffle(random);
+ }
+
+ @override
+ void run() {
+ for (var path in paths) {
+ pathSet.remove(path);
+ }
+ }
+}
+
+void main() {
+ AddBenchmark().report();
+ ContainsBenchmark().report();
+ PathsBenchmark().report();
+ RemoveBenchmark().report();
+}
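For reference, a small sketch of the `PathSet` behavior the benchmarks above rely on. `PathSet` lives under `lib/src/`, so this import is internal rather than public API, and the paths are illustrative; `containsDir` and the removal semantics are inferred from how the directory watchers later in this diff use the class.

```dart
import 'package:watcher/src/path_set.dart';

void main() {
  final paths = PathSet('/root');
  paths.add('/root/subdirectory_00/file_00.txt');

  // contains() checks a single file; containsDir() checks whether any known
  // file lives under the given directory.
  print(paths.contains('/root/subdirectory_00/file_00.txt')); // true
  print(paths.containsDir('/root/subdirectory_00')); // true

  // Removing a directory removes everything beneath it and returns the
  // removed paths, which is how the watchers emit REMOVE events.
  final removed = paths.remove('/root/subdirectory_00');
  print(removed.length); // 1
}
```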
diff --git a/pkgs/watcher/example/watch.dart b/pkgs/watcher/example/watch.dart
new file mode 100644
index 0000000..37931d3
--- /dev/null
+++ b/pkgs/watcher/example/watch.dart
@@ -0,0 +1,19 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+/// Watches the given directory and prints each modification to it.
+library;
+
+import 'package:path/path.dart' as p;
+import 'package:watcher/watcher.dart';
+
+void main(List<String> arguments) {
+ if (arguments.length != 1) {
+ print('Usage: watch <directory path>');
+ return;
+ }
+
+ var watcher = DirectoryWatcher(p.absolute(arguments[0]));
+ watcher.events.listen(print);
+}
diff --git a/pkgs/watcher/lib/src/async_queue.dart b/pkgs/watcher/lib/src/async_queue.dart
new file mode 100644
index 0000000..f6c76a9
--- /dev/null
+++ b/pkgs/watcher/lib/src/async_queue.dart
@@ -0,0 +1,70 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:collection';
+
+typedef ItemProcessor<T> = Future<void> Function(T item);
+
+/// A queue of items that are sequentially, asynchronously processed.
+///
+/// Unlike [Stream.map] or [Stream.forEach], the callback used to process each
+/// item returns a [Future], and the queue will not advance to the next item
+/// until the current item has finished processing.
+///
+/// Items can be added at any point in time and processing will be started as
+/// needed. When all items are processed, it stops processing until more items
+/// are added.
+class AsyncQueue<T> {
+ final _items = Queue<T>();
+
+  /// Whether the queue is currently waiting on a processing future to
+  /// complete.
+ bool _isProcessing = false;
+
+ /// The callback to invoke on each queued item.
+ ///
+ /// The next item in the queue will not be processed until the [Future]
+ /// returned by this completes.
+ final ItemProcessor<T> _processor;
+
+ /// The handler for errors thrown during processing.
+ ///
+  /// Used to keep asynchronous errors from escaping to the top level unhandled.
+ final void Function(Object, StackTrace) _errorHandler;
+
+ AsyncQueue(this._processor,
+ {required void Function(Object, StackTrace) onError})
+ : _errorHandler = onError;
+
+ /// Enqueues [item] to be processed and starts asynchronously processing it
+ /// if a process isn't already running.
+ void add(T item) {
+ _items.add(item);
+
+ // Start up the asynchronous processing if not already running.
+ if (_isProcessing) return;
+ _isProcessing = true;
+
+ _processNextItem().catchError(_errorHandler);
+ }
+
+ /// Removes all remaining items to be processed.
+ void clear() {
+ _items.clear();
+ }
+
+ /// Pulls the next item off [_items] and processes it.
+ ///
+ /// When complete, recursively calls itself to continue processing unless
+ /// the process was cancelled.
+ Future<void> _processNextItem() async {
+ var item = _items.removeFirst();
+ await _processor(item);
+ if (_items.isNotEmpty) return _processNextItem();
+
+ // We have drained the queue, stop processing and wait until something
+ // has been enqueued.
+ _isProcessing = false;
+ }
+}
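A minimal sketch of how the queue added above might be driven. The import is the internal `src/` path from this diff, not a public API, and the file names are illustrative.

```dart
import 'package:watcher/src/async_queue.dart';

void main() {
  // Items are processed one at a time, in insertion order; errors thrown by
  // the processor are routed to onError instead of becoming unhandled.
  final queue = AsyncQueue<String>(
    (path) async {
      await Future<void>.delayed(const Duration(milliseconds: 10));
      print('processed $path');
    },
    onError: (error, stackTrace) => print('error: $error'),
  );

  queue.add('a.txt');
  queue.add('b.txt'); // Starts only after a.txt has finished processing.
}
```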
diff --git a/pkgs/watcher/lib/src/custom_watcher_factory.dart b/pkgs/watcher/lib/src/custom_watcher_factory.dart
new file mode 100644
index 0000000..fc4e3fb
--- /dev/null
+++ b/pkgs/watcher/lib/src/custom_watcher_factory.dart
@@ -0,0 +1,88 @@
+// Copyright (c) 2020, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import '../watcher.dart';
+
+/// A factory to produce custom watchers for specific file paths.
+class _CustomWatcherFactory {
+ final String id;
+ final DirectoryWatcher? Function(String path, {Duration? pollingDelay})
+ createDirectoryWatcher;
+ final FileWatcher? Function(String path, {Duration? pollingDelay})
+ createFileWatcher;
+
+ _CustomWatcherFactory(
+ this.id, this.createDirectoryWatcher, this.createFileWatcher);
+}
+
+/// Registers a custom watcher.
+///
+/// Each custom watcher must have a unique [id], and the same watcher may not
+/// be registered more than once.
+/// [createDirectoryWatcher] and [createFileWatcher] should return watchers
+/// for the file paths they are able to handle; if a custom watcher cannot
+/// handle a path, it should return `null` for it. The paths handled by the
+/// registered custom watchers may not overlap: at most one custom watcher may
+/// return a non-null watcher for a given path.
+///
+/// When a file or directory watcher is created, the path is checked against
+/// each registered custom watcher, and if exactly one custom watcher matches,
+/// it is used instead of the default implementation.
+void registerCustomWatcher(
+ String id,
+ DirectoryWatcher? Function(String path, {Duration? pollingDelay})?
+ createDirectoryWatcher,
+ FileWatcher? Function(String path, {Duration? pollingDelay})?
+ createFileWatcher,
+) {
+ if (_customWatcherFactories.containsKey(id)) {
+ throw ArgumentError('A custom watcher with id `$id` '
+ 'has already been registered');
+ }
+ _customWatcherFactories[id] = _CustomWatcherFactory(
+ id,
+ createDirectoryWatcher ?? (_, {pollingDelay}) => null,
+ createFileWatcher ?? (_, {pollingDelay}) => null);
+}
+
+/// Tries to create a custom [DirectoryWatcher] and returns it.
+///
+/// Returns `null` if no custom watcher was applicable and throws a [StateError]
+/// if more than one was.
+DirectoryWatcher? createCustomDirectoryWatcher(String path,
+ {Duration? pollingDelay}) {
+ DirectoryWatcher? customWatcher;
+ String? customFactoryId;
+ for (var watcherFactory in _customWatcherFactories.values) {
+ if (customWatcher != null) {
+ throw StateError('Two `CustomWatcherFactory`s applicable: '
+ '`$customFactoryId` and `${watcherFactory.id}` for `$path`');
+ }
+ customWatcher =
+ watcherFactory.createDirectoryWatcher(path, pollingDelay: pollingDelay);
+ customFactoryId = watcherFactory.id;
+ }
+ return customWatcher;
+}
+
+/// Tries to create a custom [FileWatcher] and returns it.
+///
+/// Returns `null` if no custom watcher was applicable and throws a [StateError]
+/// if more than one was.
+FileWatcher? createCustomFileWatcher(String path, {Duration? pollingDelay}) {
+ FileWatcher? customWatcher;
+ String? customFactoryId;
+ for (var watcherFactory in _customWatcherFactories.values) {
+ if (customWatcher != null) {
+ throw StateError('Two `CustomWatcherFactory`s applicable: '
+ '`$customFactoryId` and `${watcherFactory.id}` for `$path`');
+ }
+ customWatcher =
+ watcherFactory.createFileWatcher(path, pollingDelay: pollingDelay);
+ customFactoryId = watcherFactory.id;
+ }
+ return customWatcher;
+}
+
+final _customWatcherFactories = <String, _CustomWatcherFactory>{};
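A hedged sketch of registering a custom watcher through `registerCustomWatcher`, as described in the doc comment above. It assumes that function and the polling watchers are exported from `package:watcher/watcher.dart`; the `/virtual/` prefix and the ten-second delay are purely illustrative.

```dart
import 'package:watcher/watcher.dart';

void main() {
  // Route paths under /virtual/ to polling watchers with a long delay and
  // return null for everything else so the default watchers are used.
  registerCustomWatcher(
    'virtual-polling',
    (path, {pollingDelay}) => path.startsWith('/virtual/')
        ? PollingDirectoryWatcher(path, pollingDelay: const Duration(seconds: 10))
        : null,
    (path, {pollingDelay}) => path.startsWith('/virtual/')
        ? PollingFileWatcher(path, pollingDelay: const Duration(seconds: 10))
        : null,
  );

  // Later constructions consult the registry before the platform defaults.
  final watcher = DirectoryWatcher('/virtual/data');
  watcher.events.listen(print);
}
```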
diff --git a/pkgs/watcher/lib/src/directory_watcher.dart b/pkgs/watcher/lib/src/directory_watcher.dart
new file mode 100644
index 0000000..158b86b
--- /dev/null
+++ b/pkgs/watcher/lib/src/directory_watcher.dart
@@ -0,0 +1,41 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:io';
+
+import '../watcher.dart';
+import 'custom_watcher_factory.dart';
+import 'directory_watcher/linux.dart';
+import 'directory_watcher/mac_os.dart';
+import 'directory_watcher/windows.dart';
+
+/// Watches the contents of a directory and emits [WatchEvent]s when something
+/// in the directory has changed.
+abstract class DirectoryWatcher implements Watcher {
+ /// The directory whose contents are being monitored.
+ @Deprecated('Expires in 1.0.0. Use DirectoryWatcher.path instead.')
+ String get directory;
+
+ /// Creates a new [DirectoryWatcher] monitoring [directory].
+ ///
+ /// If a native directory watcher is available for this platform, this will
+ /// use it. Otherwise, it will fall back to a [PollingDirectoryWatcher].
+ ///
+ /// If [pollingDelay] is passed, it specifies the amount of time the watcher
+ /// will pause between successive polls of the directory contents. Making this
+ /// shorter will give more immediate feedback at the expense of doing more IO
+ /// and higher CPU usage. Defaults to one second. Ignored for non-polling
+ /// watchers.
+ factory DirectoryWatcher(String directory, {Duration? pollingDelay}) {
+ if (FileSystemEntity.isWatchSupported) {
+ var customWatcher =
+ createCustomDirectoryWatcher(directory, pollingDelay: pollingDelay);
+ if (customWatcher != null) return customWatcher;
+ if (Platform.isLinux) return LinuxDirectoryWatcher(directory);
+ if (Platform.isMacOS) return MacOSDirectoryWatcher(directory);
+ if (Platform.isWindows) return WindowsDirectoryWatcher(directory);
+ }
+ return PollingDirectoryWatcher(directory, pollingDelay: pollingDelay);
+ }
+}
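A brief sketch of the `pollingDelay` behavior described in the factory's doc comment above; the directory path is illustrative.

```dart
import 'package:watcher/watcher.dart';

Future<void> main() async {
  // On Linux, macOS, and Windows this returns a native watcher and the delay
  // is ignored; elsewhere it falls back to a PollingDirectoryWatcher that
  // polls every 500ms instead of the default one second.
  final watcher = DirectoryWatcher(
    'lib',
    pollingDelay: const Duration(milliseconds: 500),
  );
  watcher.events.listen((event) => print('${event.type} ${event.path}'));
  await watcher.ready;
}
```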
diff --git a/pkgs/watcher/lib/src/directory_watcher/linux.dart b/pkgs/watcher/lib/src/directory_watcher/linux.dart
new file mode 100644
index 0000000..cb1d077
--- /dev/null
+++ b/pkgs/watcher/lib/src/directory_watcher/linux.dart
@@ -0,0 +1,294 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+import 'dart:io';
+
+import 'package:async/async.dart';
+
+import '../directory_watcher.dart';
+import '../path_set.dart';
+import '../resubscribable.dart';
+import '../utils.dart';
+import '../watch_event.dart';
+
+/// Uses the inotify subsystem to watch for filesystem events.
+///
+/// Inotify doesn't support recursively watching subdirectories, nor does
+/// [Directory.watch] polyfill that functionality. This class polyfills it
+/// instead.
+///
+/// This class also compensates for the non-inotify-specific issues of
+/// [Directory.watch] producing multiple events for a single logical action
+/// (issue 14372) and providing insufficient information about move events
+/// (issue 14424).
+class LinuxDirectoryWatcher extends ResubscribableWatcher
+ implements DirectoryWatcher {
+ @override
+ String get directory => path;
+
+ LinuxDirectoryWatcher(String directory)
+ : super(directory, () => _LinuxDirectoryWatcher(directory));
+}
+
+class _LinuxDirectoryWatcher
+ implements DirectoryWatcher, ManuallyClosedWatcher {
+ @override
+ String get directory => _files.root;
+ @override
+ String get path => _files.root;
+
+ @override
+ Stream<WatchEvent> get events => _eventsController.stream;
+ final _eventsController = StreamController<WatchEvent>.broadcast();
+
+ @override
+ bool get isReady => _readyCompleter.isCompleted;
+
+ @override
+ Future<void> get ready => _readyCompleter.future;
+ final _readyCompleter = Completer<void>();
+
+ /// A stream group for the [Directory.watch] events of [path] and all its
+ /// subdirectories.
+ final _nativeEvents = StreamGroup<FileSystemEvent>();
+
+ /// All known files recursively within [path].
+ final PathSet _files;
+
+  /// [Directory.watch] streams for [path]'s subdirectories, indexed by path.
+ ///
+ /// A stream is in this map if and only if it's also in [_nativeEvents].
+ final _subdirStreams = <String, Stream<FileSystemEvent>>{};
+
+ /// A set of all subscriptions that this watcher subscribes to.
+ ///
+ /// These are gathered together so that they may all be canceled when the
+ /// watcher is closed.
+ final _subscriptions = <StreamSubscription>{};
+
+ _LinuxDirectoryWatcher(String path) : _files = PathSet(path) {
+ _nativeEvents.add(Directory(path)
+ .watch()
+ .transform(StreamTransformer.fromHandlers(handleDone: (sink) {
+ // Handle the done event here rather than in the call to [_listen] because
+ // [innerStream] won't close until we close the [StreamGroup]. However, if
+ // we close the [StreamGroup] here, we run the risk of new-directory
+ // events being fired after the group is closed, since batching delays
+ // those events. See b/30768513.
+ _onDone();
+ })));
+
+ // Batch the inotify changes together so that we can dedup events.
+ var innerStream = _nativeEvents.stream.batchEvents();
+ _listen(innerStream, _onBatch,
+ onError: (Object error, StackTrace stackTrace) {
+ // Guarantee that ready always completes.
+ if (!isReady) {
+ _readyCompleter.complete();
+ }
+ _eventsController.addError(error, stackTrace);
+ });
+
+ _listen(
+ Directory(path).list(recursive: true),
+ (FileSystemEntity entity) {
+ if (entity is Directory) {
+ _watchSubdir(entity.path);
+ } else {
+ _files.add(entity.path);
+ }
+ },
+ onError: _emitError,
+ onDone: () {
+ if (!isReady) {
+ _readyCompleter.complete();
+ }
+ },
+ cancelOnError: true,
+ );
+ }
+
+ @override
+ void close() {
+ for (var subscription in _subscriptions) {
+ subscription.cancel();
+ }
+
+ _subscriptions.clear();
+ _subdirStreams.clear();
+ _files.clear();
+ _nativeEvents.close();
+ _eventsController.close();
+ }
+
+ /// Watch a subdirectory of [directory] for changes.
+ void _watchSubdir(String path) {
+ // TODO(nweiz): Right now it's possible for the watcher to emit an event for
+ // a file before the directory list is complete. This could lead to the user
+ // seeing a MODIFY or REMOVE event for a file before they see an ADD event,
+ // which is bad. We should handle that.
+ //
+ // One possibility is to provide a general means (e.g.
+ // `DirectoryWatcher.eventsAndExistingFiles`) to tell a watcher to emit
+ // events for all the files that already exist. This would be useful for
+ // top-level clients such as barback as well, and could be implemented with
+ // a wrapper similar to how listening/canceling works now.
+
+ // TODO(nweiz): Catch any errors here that indicate that the directory in
+ // question doesn't exist and silently stop watching it instead of
+ // propagating the errors.
+ var stream = Directory(path).watch();
+ _subdirStreams[path] = stream;
+ _nativeEvents.add(stream);
+ }
+
+ /// The callback that's run when a batch of changes comes in.
+ void _onBatch(List<FileSystemEvent> batch) {
+ var files = <String>{};
+ var dirs = <String>{};
+ var changed = <String>{};
+
+ // inotify event batches are ordered by occurrence, so we treat them as a
+ // log of what happened to a file. We only emit events based on the
+ // difference between the state before the batch and the state after it, not
+ // the intermediate state.
+ for (var event in batch) {
+ // If the watched directory is deleted or moved, we'll get a deletion
+ // event for it. Ignore it; we handle closing [this] when the underlying
+ // stream is closed.
+ if (event.path == path) continue;
+
+ changed.add(event.path);
+
+ if (event is FileSystemMoveEvent) {
+ files.remove(event.path);
+ dirs.remove(event.path);
+
+ var destination = event.destination;
+ if (destination == null) continue;
+
+ changed.add(destination);
+ if (event.isDirectory) {
+ files.remove(destination);
+ dirs.add(destination);
+ } else {
+ files.add(destination);
+ dirs.remove(destination);
+ }
+ } else if (event is FileSystemDeleteEvent) {
+ files.remove(event.path);
+ dirs.remove(event.path);
+ } else if (event.isDirectory) {
+ files.remove(event.path);
+ dirs.add(event.path);
+ } else {
+ files.add(event.path);
+ dirs.remove(event.path);
+ }
+ }
+
+ _applyChanges(files, dirs, changed);
+ }
+
+ /// Applies the net changes computed for a batch.
+ ///
+ /// The [files] and [dirs] sets contain the files and directories that now
+ /// exist, respectively. The [changed] set contains all files and directories
+ /// that have changed (including being removed), and so is a superset of
+ /// [files] and [dirs].
+ void _applyChanges(Set<String> files, Set<String> dirs, Set<String> changed) {
+ for (var path in changed) {
+ var stream = _subdirStreams.remove(path);
+ if (stream != null) _nativeEvents.add(stream);
+
+ // Unless [path] was a file and still is, emit REMOVE events for it or its
+      // contents.
+ if (files.contains(path) && _files.contains(path)) continue;
+ for (var file in _files.remove(path)) {
+ _emitEvent(ChangeType.REMOVE, file);
+ }
+ }
+
+ for (var file in files) {
+ if (_files.contains(file)) {
+ _emitEvent(ChangeType.MODIFY, file);
+ } else {
+ _emitEvent(ChangeType.ADD, file);
+ _files.add(file);
+ }
+ }
+
+ for (var dir in dirs) {
+ _watchSubdir(dir);
+ _addSubdir(dir);
+ }
+ }
+
+ /// Emits [ChangeType.ADD] events for the recursive contents of [path].
+ void _addSubdir(String path) {
+ _listen(Directory(path).list(recursive: true), (FileSystemEntity entity) {
+ if (entity is Directory) {
+ _watchSubdir(entity.path);
+ } else {
+ _files.add(entity.path);
+ _emitEvent(ChangeType.ADD, entity.path);
+ }
+ }, onError: (Object error, StackTrace stackTrace) {
+ // Ignore an exception caused by the dir not existing. It's fine if it
+ // was added and then quickly removed.
+ if (error is FileSystemException) return;
+
+ _emitError(error, stackTrace);
+ }, cancelOnError: true);
+ }
+
+ /// Handles the underlying event stream closing, indicating that the directory
+ /// being watched was removed.
+ void _onDone() {
+ // Most of the time when a directory is removed, its contents will get
+ // individual REMOVE events before the watch stream is closed -- in that
+ // case, [_files] will be empty here. However, if the directory's removal is
+ // caused by a MOVE, we need to manually emit events.
+ if (isReady) {
+ for (var file in _files.paths) {
+ _emitEvent(ChangeType.REMOVE, file);
+ }
+ }
+
+ close();
+ }
+
+ /// Emits a [WatchEvent] with [type] and [path] if this watcher is in a state
+ /// to emit events.
+ void _emitEvent(ChangeType type, String path) {
+ if (!isReady) return;
+ if (_eventsController.isClosed) return;
+ _eventsController.add(WatchEvent(type, path));
+ }
+
+ /// Emit an error, then close the watcher.
+ void _emitError(Object error, StackTrace stackTrace) {
+ // Guarantee that ready always completes.
+ if (!isReady) {
+ _readyCompleter.complete();
+ }
+ _eventsController.addError(error, stackTrace);
+ close();
+ }
+
+ /// Like [Stream.listen], but automatically adds the subscription to
+ /// [_subscriptions] so that it can be canceled when [close] is called.
+ void _listen<T>(Stream<T> stream, void Function(T) onData,
+ {Function? onError,
+ void Function()? onDone,
+ bool cancelOnError = false}) {
+ late StreamSubscription<T> subscription;
+ subscription = stream.listen(onData, onError: onError, onDone: () {
+ _subscriptions.remove(subscription);
+ onDone?.call();
+ }, cancelOnError: cancelOnError);
+ _subscriptions.add(subscription);
+ }
+}
diff --git a/pkgs/watcher/lib/src/directory_watcher/mac_os.dart b/pkgs/watcher/lib/src/directory_watcher/mac_os.dart
new file mode 100644
index 0000000..b461383
--- /dev/null
+++ b/pkgs/watcher/lib/src/directory_watcher/mac_os.dart
@@ -0,0 +1,410 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+import 'dart:io';
+
+import 'package:path/path.dart' as p;
+
+import '../directory_watcher.dart';
+import '../path_set.dart';
+import '../resubscribable.dart';
+import '../utils.dart';
+import '../watch_event.dart';
+
+/// Uses the FSEvents subsystem to watch for filesystem events.
+///
+/// FSEvents has two main idiosyncrasies that this class works around. First, it
+/// will occasionally report events that occurred before the filesystem watch
+/// was initiated. Second, if multiple events happen to the same file in close
+/// succession, it won't report them in the order they occurred. See issue
+/// 14373.
+///
+/// This also works around issues 16003 and 14849 in the implementation of
+/// [Directory.watch].
+class MacOSDirectoryWatcher extends ResubscribableWatcher
+ implements DirectoryWatcher {
+ @override
+ String get directory => path;
+
+ MacOSDirectoryWatcher(String directory)
+ : super(directory, () => _MacOSDirectoryWatcher(directory));
+}
+
+class _MacOSDirectoryWatcher
+ implements DirectoryWatcher, ManuallyClosedWatcher {
+ @override
+ String get directory => path;
+ @override
+ final String path;
+
+ @override
+ Stream<WatchEvent> get events => _eventsController.stream;
+ final _eventsController = StreamController<WatchEvent>.broadcast();
+
+ @override
+ bool get isReady => _readyCompleter.isCompleted;
+
+ @override
+ Future<void> get ready => _readyCompleter.future;
+ final _readyCompleter = Completer<void>();
+
+ /// The set of files that are known to exist recursively within the watched
+ /// directory.
+ ///
+ /// The state of files on the filesystem is compared against this to determine
+ /// the real change that occurred when working around issue 14373. This is
+ /// also used to emit REMOVE events when subdirectories are moved out of the
+ /// watched directory.
+ final PathSet _files;
+
+ /// The subscription to the stream returned by [Directory.watch].
+ ///
+ /// This is separate from [_listSubscriptions] because this stream
+ /// occasionally needs to be resubscribed in order to work around issue 14849.
+ StreamSubscription<List<FileSystemEvent>>? _watchSubscription;
+
+ /// The subscription to the [Directory.list] call for the initial listing of
+ /// the directory to determine its initial state.
+ StreamSubscription<FileSystemEntity>? _initialListSubscription;
+
+ /// The subscriptions to [Directory.list] calls for listing the contents of a
+ /// subdirectory that was moved into the watched directory.
+ final _listSubscriptions = <StreamSubscription<FileSystemEntity>>{};
+
+ /// The timer for tracking how long we wait for an initial batch of bogus
+ /// events (see issue 14373).
+ late Timer _bogusEventTimer;
+
+ _MacOSDirectoryWatcher(this.path) : _files = PathSet(path) {
+ _startWatch();
+
+ // Before we're ready to emit events, wait for [_listDir] to complete and
+ // for enough time to elapse that if bogus events (issue 14373) would be
+ // emitted, they will be.
+ //
+ // If we do receive a batch of events, [_onBatch] will ensure that these
+ // futures don't fire and that the directory is re-listed.
+ Future.wait([_listDir(), _waitForBogusEvents()]).then((_) {
+ if (!isReady) {
+ _readyCompleter.complete();
+ }
+ });
+ }
+
+ @override
+ void close() {
+ _watchSubscription?.cancel();
+ _initialListSubscription?.cancel();
+ _watchSubscription = null;
+ _initialListSubscription = null;
+
+ for (var subscription in _listSubscriptions) {
+ subscription.cancel();
+ }
+ _listSubscriptions.clear();
+
+ _eventsController.close();
+ }
+
+ /// The callback that's run when [Directory.watch] emits a batch of events.
+ void _onBatch(List<FileSystemEvent> batch) {
+ // If we get a batch of events before we're ready to begin emitting events,
+ // it's probable that it's a batch of pre-watcher events (see issue 14373).
+ // Ignore those events and re-list the directory.
+ if (!isReady) {
+ // Cancel the timer because bogus events only occur in the first batch, so
+ // we can fire [ready] as soon as we're done listing the directory.
+ _bogusEventTimer.cancel();
+ _listDir().then((_) {
+ if (!isReady) {
+ _readyCompleter.complete();
+ }
+ });
+ return;
+ }
+
+ _sortEvents(batch).forEach((path, eventSet) {
+ var canonicalEvent = _canonicalEvent(eventSet);
+ var events = canonicalEvent == null
+ ? _eventsBasedOnFileSystem(path)
+ : [canonicalEvent];
+
+ for (var event in events) {
+ if (event is FileSystemCreateEvent) {
+ if (!event.isDirectory) {
+ // If we already know about the file, treat it like a modification.
+ // This can happen if a file is copied on top of an existing one.
+          // We'll see an ADD event for the latter file when, from the user's
+          // perspective, the file's contents just changed.
+ var type =
+ _files.contains(path) ? ChangeType.MODIFY : ChangeType.ADD;
+
+ _emitEvent(type, path);
+ _files.add(path);
+ continue;
+ }
+
+ if (_files.containsDir(path)) continue;
+
+ var stream = Directory(path).list(recursive: true);
+ var subscription = stream.listen((entity) {
+ if (entity is Directory) return;
+ if (_files.contains(path)) return;
+
+ _emitEvent(ChangeType.ADD, entity.path);
+ _files.add(entity.path);
+ }, cancelOnError: true);
+ subscription.onDone(() {
+ _listSubscriptions.remove(subscription);
+ });
+ subscription.onError(_emitError);
+ _listSubscriptions.add(subscription);
+ } else if (event is FileSystemModifyEvent) {
+ assert(!event.isDirectory);
+ _emitEvent(ChangeType.MODIFY, path);
+ } else {
+ assert(event is FileSystemDeleteEvent);
+ for (var removedPath in _files.remove(path)) {
+ _emitEvent(ChangeType.REMOVE, removedPath);
+ }
+ }
+ }
+ });
+ }
+
+ /// Sort all the events in a batch into sets based on their path.
+ ///
+ /// A single input event may result in multiple events in the returned map;
+ /// for example, a MOVE event becomes a DELETE event for the source and a
+ /// CREATE event for the destination.
+ ///
+  /// The returned events won't contain any [FileSystemMoveEvent]s, nor will
+  /// they contain any events relating to [path].
+ Map<String, Set<FileSystemEvent>> _sortEvents(List<FileSystemEvent> batch) {
+ var eventsForPaths = <String, Set<FileSystemEvent>>{};
+
+ // FSEvents can report past events, including events on the root directory
+ // such as it being created. We want to ignore these. If the directory is
+ // really deleted, that's handled by [_onDone].
+ batch = batch.where((event) => event.path != path).toList();
+
+ // Events within directories that already have events are superfluous; the
+ // directory's full contents will be examined anyway, so we ignore such
+ // events. Emitting them could cause useless or out-of-order events.
+ var directories = unionAll(batch.map((event) {
+ if (!event.isDirectory) return <String>{};
+ if (event is FileSystemMoveEvent) {
+ var destination = event.destination;
+ if (destination != null) {
+ return {event.path, destination};
+ }
+ }
+ return {event.path};
+ }));
+
+ bool isInModifiedDirectory(String path) =>
+ directories.any((dir) => path != dir && p.isWithin(dir, path));
+
+ void addEvent(String path, FileSystemEvent event) {
+ if (isInModifiedDirectory(path)) return;
+ eventsForPaths.putIfAbsent(path, () => <FileSystemEvent>{}).add(event);
+ }
+
+ for (var event in batch) {
+ // The Mac OS watcher doesn't emit move events. See issue 14806.
+ assert(event is! FileSystemMoveEvent);
+ addEvent(event.path, event);
+ }
+
+ return eventsForPaths;
+ }
+
+ /// Returns the canonical event from a batch of events on the same path, if
+ /// one exists.
+ ///
+ /// If [batch] doesn't contain any contradictory events (e.g. DELETE and
+ /// CREATE, or events with different values for `isDirectory`), this returns a
+ /// single event that describes what happened to the path in question.
+ ///
+ /// If [batch] does contain contradictory events, this returns `null` to
+ /// indicate that the state of the path on the filesystem should be checked to
+ /// determine what occurred.
+ FileSystemEvent? _canonicalEvent(Set<FileSystemEvent> batch) {
+ // An empty batch indicates that we've learned earlier that the batch is
+ // contradictory (e.g. because of a move).
+ if (batch.isEmpty) return null;
+
+ var type = batch.first.type;
+ var isDir = batch.first.isDirectory;
+ var hadModifyEvent = false;
+
+ for (var event in batch.skip(1)) {
+ // If one event reports that the file is a directory and another event
+ // doesn't, that's a contradiction.
+ if (isDir != event.isDirectory) return null;
+
+ // Modify events don't contradict either CREATE or REMOVE events. We can
+ // safely assume the file was modified after a CREATE or before the
+ // REMOVE; otherwise there will also be a REMOVE or CREATE event
+ // (respectively) that will be contradictory.
+ if (event is FileSystemModifyEvent) {
+ hadModifyEvent = true;
+ continue;
+ }
+ assert(event is FileSystemCreateEvent || event is FileSystemDeleteEvent);
+
+ // If we previously thought this was a MODIFY, we now consider it to be a
+ // CREATE or REMOVE event. This is safe for the same reason as above.
+ if (type == FileSystemEvent.modify) {
+ type = event.type;
+ continue;
+ }
+
+ // A CREATE event contradicts a REMOVE event and vice versa.
+ assert(type == FileSystemEvent.create || type == FileSystemEvent.delete);
+ if (type != event.type) return null;
+ }
+
+ // If we got a CREATE event for a file we already knew about, that comes
+ // from FSEvents reporting an add that happened prior to the watch
+ // beginning. If we also received a MODIFY event, we want to report that,
+ // but not the CREATE.
+ if (type == FileSystemEvent.create &&
+ hadModifyEvent &&
+ _files.contains(batch.first.path)) {
+ type = FileSystemEvent.modify;
+ }
+
+ switch (type) {
+ case FileSystemEvent.create:
+ // Issue 16003 means that a CREATE event for a directory can indicate
+ // that the directory was moved and then re-created.
+ // [_eventsBasedOnFileSystem] will handle this correctly by producing a
+ // DELETE event followed by a CREATE event if the directory exists.
+ if (isDir) return null;
+ return FileSystemCreateEvent(batch.first.path, false);
+ case FileSystemEvent.delete:
+ return FileSystemDeleteEvent(batch.first.path, isDir);
+ case FileSystemEvent.modify:
+ return FileSystemModifyEvent(batch.first.path, isDir, false);
+ default:
+ throw StateError('unreachable');
+ }
+ }
+
+ /// Returns one or more events that describe the change between the last known
+ /// state of [path] and its current state on the filesystem.
+ ///
+ /// This returns a list whose order should be reflected in the events emitted
+ /// to the user, unlike the batched events from [Directory.watch]. The
+ /// returned list may be empty, indicating that no changes occurred to [path]
+ /// (probably indicating that it was created and then immediately deleted).
+ List<FileSystemEvent> _eventsBasedOnFileSystem(String path) {
+ var fileExisted = _files.contains(path);
+ var dirExisted = _files.containsDir(path);
+ var fileExists = File(path).existsSync();
+ var dirExists = Directory(path).existsSync();
+
+ var events = <FileSystemEvent>[];
+ if (fileExisted) {
+ if (fileExists) {
+ events.add(FileSystemModifyEvent(path, false, false));
+ } else {
+ events.add(FileSystemDeleteEvent(path, false));
+ }
+ } else if (dirExisted) {
+ if (dirExists) {
+ // If we got contradictory events for a directory that used to exist and
+ // still exists, we need to rescan the whole thing in case it was
+ // replaced with a different directory.
+ events.add(FileSystemDeleteEvent(path, true));
+ events.add(FileSystemCreateEvent(path, true));
+ } else {
+ events.add(FileSystemDeleteEvent(path, true));
+ }
+ }
+
+ if (!fileExisted && fileExists) {
+ events.add(FileSystemCreateEvent(path, false));
+ } else if (!dirExisted && dirExists) {
+ events.add(FileSystemCreateEvent(path, true));
+ }
+
+ return events;
+ }
+
+ /// The callback that's run when the [Directory.watch] stream is closed.
+ void _onDone() {
+ _watchSubscription = null;
+
+ // If the directory still exists and we're still expecting bogus events,
+ // this is probably issue 14849 rather than a real close event. We should
+ // just restart the watcher.
+ if (!isReady && Directory(path).existsSync()) {
+ _startWatch();
+ return;
+ }
+
+ // FSEvents can fail to report the contents of the directory being removed
+ // when the directory itself is removed, so we need to manually mark the
+ // files as removed.
+ for (var file in _files.paths) {
+ _emitEvent(ChangeType.REMOVE, file);
+ }
+ _files.clear();
+ close();
+ }
+
+ /// Start or restart the underlying [Directory.watch] stream.
+ void _startWatch() {
+ // Batch the FSEvent changes together so that we can dedup events.
+ var innerStream = Directory(path).watch(recursive: true).batchEvents();
+ _watchSubscription = innerStream.listen(_onBatch,
+ onError: _eventsController.addError, onDone: _onDone);
+ }
+
+ /// Starts or restarts listing the watched directory to get an initial picture
+ /// of its state.
+ Future<void> _listDir() {
+ assert(!isReady);
+ _initialListSubscription?.cancel();
+
+ _files.clear();
+ var completer = Completer<void>();
+ var stream = Directory(path).list(recursive: true);
+ _initialListSubscription = stream.listen((entity) {
+ if (entity is! Directory) _files.add(entity.path);
+ }, onError: _emitError, onDone: completer.complete, cancelOnError: true);
+ return completer.future;
+ }
+
+ /// Wait 200ms for a batch of bogus events (issue 14373) to come in.
+ ///
+  /// 200ms is short in terms of human interaction, but longer than any Mac OS
+  /// watcher test takes on the bots, so it should be safe to assume that any
+  /// bogus events will be signaled in that time frame.
+ Future<void> _waitForBogusEvents() {
+ var completer = Completer<void>();
+ _bogusEventTimer =
+ Timer(const Duration(milliseconds: 200), completer.complete);
+ return completer.future;
+ }
+
+ /// Emit an event with the given [type] and [path].
+ void _emitEvent(ChangeType type, String path) {
+ if (!isReady) return;
+ _eventsController.add(WatchEvent(type, path));
+ }
+
+ /// Emit an error, then close the watcher.
+ void _emitError(Object error, StackTrace stackTrace) {
+ // Guarantee that ready always completes.
+ if (!isReady) {
+ _readyCompleter.complete();
+ }
+ _eventsController.addError(error, stackTrace);
+ close();
+ }
+}
diff --git a/pkgs/watcher/lib/src/directory_watcher/polling.dart b/pkgs/watcher/lib/src/directory_watcher/polling.dart
new file mode 100644
index 0000000..207679b
--- /dev/null
+++ b/pkgs/watcher/lib/src/directory_watcher/polling.dart
@@ -0,0 +1,191 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+import 'dart:io';
+
+import '../async_queue.dart';
+import '../directory_watcher.dart';
+import '../resubscribable.dart';
+import '../stat.dart';
+import '../utils.dart';
+import '../watch_event.dart';
+
+/// Periodically polls a directory for changes.
+class PollingDirectoryWatcher extends ResubscribableWatcher
+ implements DirectoryWatcher {
+ @override
+ String get directory => path;
+
+ /// Creates a new polling watcher monitoring [directory].
+ ///
+ /// If [pollingDelay] is passed, it specifies the amount of time the watcher
+ /// will pause between successive polls of the directory contents. Making this
+ /// shorter will give more immediate feedback at the expense of doing more IO
+ /// and higher CPU usage. Defaults to one second.
+ PollingDirectoryWatcher(String directory, {Duration? pollingDelay})
+ : super(directory, () {
+ return _PollingDirectoryWatcher(
+ directory, pollingDelay ?? const Duration(seconds: 1));
+ });
+}
+
+class _PollingDirectoryWatcher
+ implements DirectoryWatcher, ManuallyClosedWatcher {
+ @override
+ String get directory => path;
+ @override
+ final String path;
+
+ @override
+ Stream<WatchEvent> get events => _events.stream;
+ final _events = StreamController<WatchEvent>.broadcast();
+
+ @override
+ bool get isReady => _readyCompleter.isCompleted;
+
+ @override
+ Future<void> get ready => _readyCompleter.future;
+ final _readyCompleter = Completer<void>();
+
+ /// The amount of time the watcher pauses between successive polls of the
+ /// directory contents.
+ final Duration _pollingDelay;
+
+ /// The previous modification times of the files in the directory.
+ ///
+ /// Used to tell which files have been modified.
+ final _lastModifieds = <String, DateTime?>{};
+
+ /// The subscription used while [directory] is being listed.
+ ///
+ /// Will be `null` if a list is not currently happening.
+ StreamSubscription<FileSystemEntity>? _listSubscription;
+
+ /// The queue of files waiting to be processed to see if they have been
+ /// modified.
+ ///
+ /// Processing a file is asynchronous, as is listing the directory, so the
+ /// queue exists to let each of those proceed at their own rate. The lister
+ /// will enqueue files as quickly as it can. Meanwhile, files are dequeued
+ /// and processed sequentially.
+ late final AsyncQueue<String?> _filesToProcess =
+ AsyncQueue<String?>(_processFile, onError: (error, stackTrace) {
+ if (!_events.isClosed) _events.addError(error, stackTrace);
+ });
+
+ /// The set of files that have been seen in the current directory listing.
+ ///
+ /// Used to tell which files have been removed: files that are in
+ /// [_lastModifieds] but not in here when a poll completes have been removed.
+ final _polledFiles = <String>{};
+
+ _PollingDirectoryWatcher(this.path, this._pollingDelay) {
+ _poll();
+ }
+
+ @override
+ void close() {
+ _events.close();
+
+ // If we're in the middle of listing the directory, stop.
+ _listSubscription?.cancel();
+
+ // Don't process any remaining files.
+ _filesToProcess.clear();
+ _polledFiles.clear();
+ _lastModifieds.clear();
+ }
+
+ /// Scans the contents of the directory once to see which files have been
+ /// added, removed, and modified.
+ void _poll() {
+ _filesToProcess.clear();
+ _polledFiles.clear();
+
+ void endListing() {
+ assert(!_events.isClosed);
+ _listSubscription = null;
+
+ // Null tells the queue consumer that we're done listing.
+ _filesToProcess.add(null);
+ }
+
+ var stream = Directory(path).list(recursive: true);
+ _listSubscription = stream.listen((entity) {
+ assert(!_events.isClosed);
+
+ if (entity is! File) return;
+ _filesToProcess.add(entity.path);
+ }, onError: (Object error, StackTrace stackTrace) {
+ // Guarantee that ready always completes.
+ if (!isReady) {
+ _readyCompleter.complete();
+ }
+ if (!isDirectoryNotFoundException(error)) {
+ // It's some unknown error. Pipe it over to the event stream so the
+ // user can see it.
+ _events.addError(error, stackTrace);
+ }
+
+ // When an error occurs, we end the listing normally, which has the
+ // desired effect of marking all files that were in the directory as
+ // being removed.
+ endListing();
+ }, onDone: endListing, cancelOnError: true);
+ }
+
+ /// Processes [file] to determine if it has been modified since the last
+ /// time it was scanned.
+ Future<void> _processFile(String? file) async {
+ // `null` is the sentinel which means the directory listing is complete.
+ if (file == null) {
+ await _completePoll();
+ return;
+ }
+
+ final modified = await modificationTime(file);
+
+ if (_events.isClosed) return;
+
+ var lastModified = _lastModifieds[file];
+
+ // If its modification time hasn't changed, assume the file is unchanged.
+ if (lastModified != null && lastModified == modified) {
+ // The file is still here.
+ _polledFiles.add(file);
+ return;
+ }
+
+ if (_events.isClosed) return;
+
+ _lastModifieds[file] = modified;
+ _polledFiles.add(file);
+
+ // Only notify if we're ready to emit events.
+ if (!isReady) return;
+
+ var type = lastModified == null ? ChangeType.ADD : ChangeType.MODIFY;
+ _events.add(WatchEvent(type, file));
+ }
+
+ /// After the directory listing is complete, this determines which files were
+  /// removed and then schedules the next poll.
+ Future<void> _completePoll() async {
+ // Any files that were not seen in the last poll but that we have a
+ // status for must have been removed.
+ var removedFiles = _lastModifieds.keys.toSet().difference(_polledFiles);
+ for (var removed in removedFiles) {
+ if (isReady) _events.add(WatchEvent(ChangeType.REMOVE, removed));
+ _lastModifieds.remove(removed);
+ }
+
+ if (!isReady) _readyCompleter.complete();
+
+ // Wait and then poll again.
+ await Future<void>.delayed(_pollingDelay);
+ if (_events.isClosed) return;
+ _poll();
+ }
+}
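
For reference, a minimal usage sketch of the polling directory watcher added above; the `example` directory name and the 250ms delay are illustrative, not part of the package:

```dart
import 'package:watcher/watcher.dart';

Future<void> main() async {
  // A shorter delay trades extra IO and CPU for quicker feedback, as the
  // constructor documentation above notes.
  final watcher = PollingDirectoryWatcher(
    'example', // hypothetical directory
    pollingDelay: const Duration(milliseconds: 250),
  );

  // Events are only delivered while the stream has a subscriber.
  final subscription = watcher.events.listen((event) {
    print('${event.type} ${event.path}');
  });

  // Wait for the initial scan to finish before mutating the directory.
  await watcher.ready;

  // ... create, modify, or delete files under 'example' here ...

  await subscription.cancel();
}
```
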
diff --git a/pkgs/watcher/lib/src/directory_watcher/windows.dart b/pkgs/watcher/lib/src/directory_watcher/windows.dart
new file mode 100644
index 0000000..d1c98be
--- /dev/null
+++ b/pkgs/watcher/lib/src/directory_watcher/windows.dart
@@ -0,0 +1,437 @@
+// Copyright (c) 2014, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+// TODO(rnystrom): Merge with mac_os version.
+
+import 'dart:async';
+import 'dart:collection';
+import 'dart:io';
+
+import 'package:path/path.dart' as p;
+
+import '../directory_watcher.dart';
+import '../path_set.dart';
+import '../resubscribable.dart';
+import '../utils.dart';
+import '../watch_event.dart';
+
+class WindowsDirectoryWatcher extends ResubscribableWatcher
+ implements DirectoryWatcher {
+ @override
+ String get directory => path;
+
+ WindowsDirectoryWatcher(String directory)
+ : super(directory, () => _WindowsDirectoryWatcher(directory));
+}
+
+class _EventBatcher {
+ static const Duration _batchDelay = Duration(milliseconds: 100);
+ final List<FileSystemEvent> events = [];
+ Timer? timer;
+
+ void addEvent(FileSystemEvent event, void Function() callback) {
+ events.add(event);
+ timer?.cancel();
+ timer = Timer(_batchDelay, callback);
+ }
+
+ void cancelTimer() {
+ timer?.cancel();
+ }
+}
+
+class _WindowsDirectoryWatcher
+ implements DirectoryWatcher, ManuallyClosedWatcher {
+ @override
+ String get directory => path;
+ @override
+ final String path;
+
+ @override
+ Stream<WatchEvent> get events => _eventsController.stream;
+ final _eventsController = StreamController<WatchEvent>.broadcast();
+
+ @override
+ bool get isReady => _readyCompleter.isCompleted;
+
+ @override
+ Future<void> get ready => _readyCompleter.future;
+ final _readyCompleter = Completer<void>();
+
+ final Map<String, _EventBatcher> _eventBatchers =
+ HashMap<String, _EventBatcher>();
+
+ /// The set of files that are known to exist recursively within the watched
+ /// directory.
+ ///
+ /// The state of files on the filesystem is compared against this to determine
+ /// the real change that occurred. This is also used to emit REMOVE events
+ /// when subdirectories are moved out of the watched directory.
+ final PathSet _files;
+
+ /// The subscription to the stream returned by [Directory.watch].
+ StreamSubscription<FileSystemEvent>? _watchSubscription;
+
+ /// The subscription to the stream returned by [Directory.watch] of the
+ /// parent directory to [directory]. This is needed to detect changes to
+ /// [directory], as they are not included on Windows.
+ StreamSubscription<FileSystemEvent>? _parentWatchSubscription;
+
+ /// The subscription to the [Directory.list] call for the initial listing of
+ /// the directory to determine its initial state.
+ StreamSubscription<FileSystemEntity>? _initialListSubscription;
+
+ /// The subscriptions to the [Directory.list] calls for listing the contents
+ /// of subdirectories that were moved into the watched directory.
+ final Set<StreamSubscription<FileSystemEntity>> _listSubscriptions =
+ HashSet<StreamSubscription<FileSystemEntity>>();
+
+ _WindowsDirectoryWatcher(this.path) : _files = PathSet(path) {
+ // Before we're ready to emit events, wait for [_listDir] to complete.
+ _listDir().then((_) {
+ _startWatch();
+ _startParentWatcher();
+ if (!isReady) {
+ _readyCompleter.complete();
+ }
+ });
+ }
+
+ @override
+ void close() {
+ _watchSubscription?.cancel();
+ _parentWatchSubscription?.cancel();
+ _initialListSubscription?.cancel();
+ for (var sub in _listSubscriptions) {
+ sub.cancel();
+ }
+ _listSubscriptions.clear();
+ for (var batcher in _eventBatchers.values) {
+ batcher.cancelTimer();
+ }
+ _eventBatchers.clear();
+ _watchSubscription = null;
+ _parentWatchSubscription = null;
+ _initialListSubscription = null;
+ _eventsController.close();
+ }
+
+ /// On Windows, if [directory] is deleted, we will not receive any event.
+ ///
+  /// Instead, we add a watcher on the parent folder (if any) that can notify
+ /// us about [path]. This also includes events such as moves.
+ void _startParentWatcher() {
+ var absoluteDir = p.absolute(path);
+ var parent = p.dirname(absoluteDir);
+ // Check if [path] is already the root directory.
+ if (FileSystemEntity.identicalSync(parent, path)) return;
+ var parentStream = Directory(parent).watch(recursive: false);
+ _parentWatchSubscription = parentStream.listen((event) {
+ // Only look at events for 'directory'.
+ if (p.basename(event.path) != p.basename(absoluteDir)) return;
+ // Test if the directory is removed. FileSystemEntity.typeSync will
+      // return notFound if it's unable to determine the type, including
+ // access denied issues, which may happen when the directory is deleted.
+ // FileSystemMoveEvent and FileSystemDeleteEvent events will always mean
+ // the directory is now gone.
+ if (event is FileSystemMoveEvent ||
+ event is FileSystemDeleteEvent ||
+ (FileSystemEntity.typeSync(path) == FileSystemEntityType.notFound)) {
+ for (var path in _files.paths) {
+ _emitEvent(ChangeType.REMOVE, path);
+ }
+ _files.clear();
+ close();
+ }
+ }, onError: (error) {
+      // Ignore errors and simply close the parent stream. The user asked to
+      // watch [directory]; even if listening on the parent fails, we may
+      // still be able to listen on the requested path itself.
+ _parentWatchSubscription?.cancel();
+ _parentWatchSubscription = null;
+ });
+ }
+
+ void _onEvent(FileSystemEvent event) {
+ assert(isReady);
+ final batcher = _eventBatchers.putIfAbsent(event.path, _EventBatcher.new);
+ batcher.addEvent(event, () {
+ _eventBatchers.remove(event.path);
+ _onBatch(batcher.events);
+ });
+ }
+
+ /// The callback that's run when [Directory.watch] emits a batch of events.
+ void _onBatch(List<FileSystemEvent> batch) {
+ _sortEvents(batch).forEach((path, eventSet) {
+ var canonicalEvent = _canonicalEvent(eventSet);
+ var events = canonicalEvent == null
+ ? _eventsBasedOnFileSystem(path)
+ : [canonicalEvent];
+
+ for (var event in events) {
+ if (event is FileSystemCreateEvent) {
+ if (!event.isDirectory) {
+ if (_files.contains(path)) continue;
+
+ _emitEvent(ChangeType.ADD, path);
+ _files.add(path);
+ continue;
+ }
+
+ if (_files.containsDir(path)) continue;
+
+ var stream = Directory(path).list(recursive: true);
+ var subscription = stream.listen((entity) {
+ if (entity is Directory) return;
+ if (_files.contains(path)) return;
+
+ _emitEvent(ChangeType.ADD, entity.path);
+ _files.add(entity.path);
+ }, cancelOnError: true);
+ subscription.onDone(() {
+ _listSubscriptions.remove(subscription);
+ });
+ subscription.onError((Object e, StackTrace stackTrace) {
+ _listSubscriptions.remove(subscription);
+ _emitError(e, stackTrace);
+ });
+ _listSubscriptions.add(subscription);
+ } else if (event is FileSystemModifyEvent) {
+ if (!event.isDirectory) {
+ _emitEvent(ChangeType.MODIFY, path);
+ }
+ } else {
+ assert(event is FileSystemDeleteEvent);
+ for (var removedPath in _files.remove(path)) {
+ _emitEvent(ChangeType.REMOVE, removedPath);
+ }
+ }
+ }
+ });
+ }
+
+ /// Sort all the events in a batch into sets based on their path.
+ ///
+ /// A single input event may result in multiple events in the returned map;
+ /// for example, a MOVE event becomes a DELETE event for the source and a
+ /// CREATE event for the destination.
+ ///
+  /// The returned events won't contain any [FileSystemMoveEvent]s, nor will
+  /// they contain any events relating to [path].
+ Map<String, Set<FileSystemEvent>> _sortEvents(List<FileSystemEvent> batch) {
+ var eventsForPaths = <String, Set<FileSystemEvent>>{};
+
+ // Events within directories that already have events are superfluous; the
+ // directory's full contents will be examined anyway, so we ignore such
+ // events. Emitting them could cause useless or out-of-order events.
+ var directories = unionAll(batch.map((event) {
+ if (!event.isDirectory) return <String>{};
+ if (event is FileSystemMoveEvent) {
+ var destination = event.destination;
+ if (destination != null) {
+ return {event.path, destination};
+ }
+ }
+ return {event.path};
+ }));
+
+ bool isInModifiedDirectory(String path) =>
+ directories.any((dir) => path != dir && p.isWithin(dir, path));
+
+ void addEvent(String path, FileSystemEvent event) {
+ if (isInModifiedDirectory(path)) return;
+ eventsForPaths.putIfAbsent(path, () => <FileSystemEvent>{}).add(event);
+ }
+
+ for (var event in batch) {
+ if (event is FileSystemMoveEvent) {
+ var destination = event.destination;
+ if (destination != null) {
+ addEvent(destination, event);
+ }
+ }
+ addEvent(event.path, event);
+ }
+
+ return eventsForPaths;
+ }
+
+ /// Returns the canonical event from a batch of events on the same path, if
+ /// one exists.
+ ///
+ /// If [batch] doesn't contain any contradictory events (e.g. DELETE and
+ /// CREATE, or events with different values for `isDirectory`), this returns a
+ /// single event that describes what happened to the path in question.
+ ///
+ /// If [batch] does contain contradictory events, this returns `null` to
+ /// indicate that the state of the path on the filesystem should be checked to
+ /// determine what occurred.
+ FileSystemEvent? _canonicalEvent(Set<FileSystemEvent> batch) {
+ // An empty batch indicates that we've learned earlier that the batch is
+ // contradictory (e.g. because of a move).
+ if (batch.isEmpty) return null;
+
+ var type = batch.first.type;
+ var isDir = batch.first.isDirectory;
+
+ for (var event in batch.skip(1)) {
+ // If one event reports that the file is a directory and another event
+ // doesn't, that's a contradiction.
+ if (isDir != event.isDirectory) return null;
+
+ // Modify events don't contradict either CREATE or REMOVE events. We can
+ // safely assume the file was modified after a CREATE or before the
+ // REMOVE; otherwise there will also be a REMOVE or CREATE event
+ // (respectively) that will be contradictory.
+ if (event is FileSystemModifyEvent) continue;
+ assert(event is FileSystemCreateEvent ||
+ event is FileSystemDeleteEvent ||
+ event is FileSystemMoveEvent);
+
+ // If we previously thought this was a MODIFY, we now consider it to be a
+ // CREATE or REMOVE event. This is safe for the same reason as above.
+ if (type == FileSystemEvent.modify) {
+ type = event.type;
+ continue;
+ }
+
+ // A CREATE event contradicts a REMOVE event and vice versa.
+ assert(type == FileSystemEvent.create ||
+ type == FileSystemEvent.delete ||
+ type == FileSystemEvent.move);
+ if (type != event.type) return null;
+ }
+
+ switch (type) {
+ case FileSystemEvent.create:
+ return FileSystemCreateEvent(batch.first.path, isDir);
+ case FileSystemEvent.delete:
+ return FileSystemDeleteEvent(batch.first.path, isDir);
+ case FileSystemEvent.modify:
+ return FileSystemModifyEvent(batch.first.path, isDir, false);
+ case FileSystemEvent.move:
+ return null;
+ default:
+ throw StateError('unreachable');
+ }
+ }
+
+ /// Returns zero or more events that describe the change between the last
+ /// known state of [path] and its current state on the filesystem.
+ ///
+ /// This returns a list whose order should be reflected in the events emitted
+ /// to the user, unlike the batched events from [Directory.watch]. The
+ /// returned list may be empty, indicating that no changes occurred to [path]
+  /// (probably because it was created and then immediately deleted).
+ List<FileSystemEvent> _eventsBasedOnFileSystem(String path) {
+ var fileExisted = _files.contains(path);
+ var dirExisted = _files.containsDir(path);
+
+ bool fileExists;
+ bool dirExists;
+ try {
+ fileExists = File(path).existsSync();
+ dirExists = Directory(path).existsSync();
+ } on FileSystemException {
+ return const <FileSystemEvent>[];
+ }
+
+ var events = <FileSystemEvent>[];
+ if (fileExisted) {
+ if (fileExists) {
+ events.add(FileSystemModifyEvent(path, false, false));
+ } else {
+ events.add(FileSystemDeleteEvent(path, false));
+ }
+ } else if (dirExisted) {
+ if (dirExists) {
+ // If we got contradictory events for a directory that used to exist and
+ // still exists, we need to rescan the whole thing in case it was
+ // replaced with a different directory.
+ events.add(FileSystemDeleteEvent(path, true));
+ events.add(FileSystemCreateEvent(path, true));
+ } else {
+ events.add(FileSystemDeleteEvent(path, true));
+ }
+ }
+
+ if (!fileExisted && fileExists) {
+ events.add(FileSystemCreateEvent(path, false));
+ } else if (!dirExisted && dirExists) {
+ events.add(FileSystemCreateEvent(path, true));
+ }
+
+ return events;
+ }
+
+ /// The callback that's run when the [Directory.watch] stream is closed.
+ /// Note that this is unlikely to happen on Windows, unless the system itself
+ /// closes the handle.
+ void _onDone() {
+ _watchSubscription = null;
+
+ // Emit remove events for any remaining files.
+ for (var file in _files.paths) {
+ _emitEvent(ChangeType.REMOVE, file);
+ }
+ _files.clear();
+ close();
+ }
+
+ /// Start or restart the underlying [Directory.watch] stream.
+ void _startWatch() {
+ // Note: "watcher closed" exceptions do not get sent over the stream
+ // returned by watch, and must be caught via a zone handler.
+ runZonedGuarded(() {
+ var innerStream = Directory(path).watch(recursive: true);
+ _watchSubscription = innerStream.listen(_onEvent,
+ onError: _eventsController.addError, onDone: _onDone);
+ }, (error, stackTrace) {
+ if (error is FileSystemException &&
+ error.message.startsWith('Directory watcher closed unexpectedly')) {
+ _watchSubscription?.cancel();
+ _eventsController.addError(error, stackTrace);
+ _startWatch();
+ } else {
+ // ignore: only_throw_errors
+ throw error;
+ }
+ });
+ }
+
+ /// Starts or restarts listing the watched directory to get an initial picture
+ /// of its state.
+ Future<void> _listDir() {
+ assert(!isReady);
+ _initialListSubscription?.cancel();
+
+ _files.clear();
+ var completer = Completer<void>();
+ var stream = Directory(path).list(recursive: true);
+ void handleEntity(FileSystemEntity entity) {
+ if (entity is! Directory) _files.add(entity.path);
+ }
+
+ _initialListSubscription = stream.listen(handleEntity,
+ onError: _emitError, onDone: completer.complete, cancelOnError: true);
+ return completer.future;
+ }
+
+ /// Emit an event with the given [type] and [path].
+ void _emitEvent(ChangeType type, String path) {
+ if (!isReady) return;
+
+ _eventsController.add(WatchEvent(type, path));
+ }
+
+ /// Emit an error, then close the watcher.
+ void _emitError(Object error, StackTrace stackTrace) {
+ // Guarantee that ready always completes.
+ if (!isReady) {
+ _readyCompleter.complete();
+ }
+ _eventsController.addError(error, stackTrace);
+ close();
+ }
+}
diff --git a/pkgs/watcher/lib/src/file_watcher.dart b/pkgs/watcher/lib/src/file_watcher.dart
new file mode 100644
index 0000000..143aa31
--- /dev/null
+++ b/pkgs/watcher/lib/src/file_watcher.dart
@@ -0,0 +1,44 @@
+// Copyright (c) 2015, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:io';
+
+import '../watcher.dart';
+import 'custom_watcher_factory.dart';
+import 'file_watcher/native.dart';
+
+/// Watches a file and emits [WatchEvent]s when the file has changed.
+///
+/// Note that since each watcher only watches a single file, it will only emit
+/// [ChangeType.MODIFY] events, except when the file is deleted, at which point
+/// it will emit a single [ChangeType.REMOVE] event and then close the stream.
+///
+/// If the file is deleted and quickly replaced (when a new file is moved in its
+/// place, for example) this will emit a [ChangeType.MODIFY] event.
+abstract class FileWatcher implements Watcher {
+ /// Creates a new [FileWatcher] monitoring [file].
+ ///
+ /// If a native file watcher is available for this platform, this will use it.
+ /// Otherwise, it will fall back to a [PollingFileWatcher]. Notably, native
+ /// file watching is *not* supported on Windows.
+ ///
+ /// If [pollingDelay] is passed, it specifies the amount of time the watcher
+  /// will pause between successive polls of the file. Making this shorter
+  /// will give more immediate feedback at the expense of doing more IO and
+  /// using more CPU. Defaults to one second. Ignored for non-polling
+ /// watchers.
+ factory FileWatcher(String file, {Duration? pollingDelay}) {
+ var customWatcher =
+ createCustomFileWatcher(file, pollingDelay: pollingDelay);
+ if (customWatcher != null) return customWatcher;
+
+ // [File.watch] doesn't work on Windows, but
+ // [FileSystemEntity.isWatchSupported] is still true because directory
+ // watching does work.
+ if (FileSystemEntity.isWatchSupported && !Platform.isWindows) {
+ return NativeFileWatcher(file);
+ }
+ return PollingFileWatcher(file, pollingDelay: pollingDelay);
+ }
+}
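
A minimal sketch of the `FileWatcher` factory above; the watched file name is hypothetical, and on Windows the call silently falls back to the polling implementation:

```dart
import 'package:watcher/watcher.dart';

Future<void> main() async {
  // Hypothetical file; a native watcher is used where supported, otherwise
  // a polling watcher.
  final watcher = FileWatcher('config.yaml');

  watcher.events.listen((event) {
    // A single-file watcher only reports MODIFY, plus one final REMOVE if
    // the file is deleted.
    print('${event.type} ${event.path}');
  });

  await watcher.ready;
}
```
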
diff --git a/pkgs/watcher/lib/src/file_watcher/native.dart b/pkgs/watcher/lib/src/file_watcher/native.dart
new file mode 100644
index 0000000..502aa10
--- /dev/null
+++ b/pkgs/watcher/lib/src/file_watcher/native.dart
@@ -0,0 +1,90 @@
+// Copyright (c) 2015, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+import 'dart:io';
+
+import '../file_watcher.dart';
+import '../resubscribable.dart';
+import '../utils.dart';
+import '../watch_event.dart';
+
+/// Uses the native file system notifications to watch for filesystem events.
+///
+/// Single-file notifications are much simpler than those for multiple files, so
+/// this doesn't need to be split out into multiple OS-specific classes.
+class NativeFileWatcher extends ResubscribableWatcher implements FileWatcher {
+ NativeFileWatcher(String path) : super(path, () => _NativeFileWatcher(path));
+}
+
+class _NativeFileWatcher implements FileWatcher, ManuallyClosedWatcher {
+ @override
+ final String path;
+
+ @override
+ Stream<WatchEvent> get events => _eventsController.stream;
+ final _eventsController = StreamController<WatchEvent>.broadcast();
+
+ @override
+ bool get isReady => _readyCompleter.isCompleted;
+
+ @override
+ Future<void> get ready => _readyCompleter.future;
+ final _readyCompleter = Completer<void>();
+
+ StreamSubscription<List<FileSystemEvent>>? _subscription;
+
+ _NativeFileWatcher(this.path) {
+ _listen();
+
+ // We don't need to do any initial set-up, so we're ready immediately after
+ // being listened to.
+ _readyCompleter.complete();
+ }
+
+ void _listen() {
+ // Batch the events together so that we can dedup them.
+ _subscription = File(path)
+ .watch()
+ .batchEvents()
+ .listen(_onBatch, onError: _eventsController.addError, onDone: _onDone);
+ }
+
+ void _onBatch(List<FileSystemEvent> batch) {
+ if (batch.any((event) => event.type == FileSystemEvent.delete)) {
+ // If the file is deleted, the underlying stream will close. We handle
+ // emitting our own REMOVE event in [_onDone].
+ return;
+ }
+
+ _eventsController.add(WatchEvent(ChangeType.MODIFY, path));
+ }
+
+ void _onDone() async {
+ var fileExists = await File(path).exists();
+
+ // Check for this after checking whether the file exists because it's
+ // possible that [close] was called between [File.exists] being called and
+ // it completing.
+ if (_eventsController.isClosed) return;
+
+ if (fileExists) {
+ // If the file exists now, it was probably removed and quickly replaced;
+ // this can happen for example when another file is moved on top of it.
+ // Re-subscribe and report a modify event.
+ _eventsController.add(WatchEvent(ChangeType.MODIFY, path));
+ _listen();
+ } else {
+ _eventsController.add(WatchEvent(ChangeType.REMOVE, path));
+ close();
+ }
+ }
+
+ @override
+ void close() {
+ _subscription?.cancel();
+ _subscription = null;
+ _eventsController.close();
+ }
+}
diff --git a/pkgs/watcher/lib/src/file_watcher/polling.dart b/pkgs/watcher/lib/src/file_watcher/polling.dart
new file mode 100644
index 0000000..15ff9ab
--- /dev/null
+++ b/pkgs/watcher/lib/src/file_watcher/polling.dart
@@ -0,0 +1,106 @@
+// Copyright (c) 2015, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+import 'dart:io';
+
+import '../file_watcher.dart';
+import '../resubscribable.dart';
+import '../stat.dart';
+import '../watch_event.dart';
+
+/// Periodically polls a file for changes.
+class PollingFileWatcher extends ResubscribableWatcher implements FileWatcher {
+ PollingFileWatcher(String path, {Duration? pollingDelay})
+ : super(path, () {
+ return _PollingFileWatcher(
+ path, pollingDelay ?? const Duration(seconds: 1));
+ });
+}
+
+class _PollingFileWatcher implements FileWatcher, ManuallyClosedWatcher {
+ @override
+ final String path;
+
+ @override
+ Stream<WatchEvent> get events => _eventsController.stream;
+ final _eventsController = StreamController<WatchEvent>.broadcast();
+
+ @override
+ bool get isReady => _readyCompleter.isCompleted;
+
+ @override
+ Future<void> get ready => _readyCompleter.future;
+ final _readyCompleter = Completer<void>();
+
+ /// The timer that controls polling.
+ late final Timer _timer;
+
+ /// The previous modification time of the file.
+ ///
+ /// `null` indicates the file does not (or did not on the last poll) exist.
+ DateTime? _lastModified;
+
+ _PollingFileWatcher(this.path, Duration pollingDelay) {
+ _timer = Timer.periodic(pollingDelay, (_) => _poll());
+ _poll();
+ }
+
+ /// Checks the mtime of the file and whether it's been removed.
+ Future<void> _poll() async {
+    // We don't mark the file as removed if this is the first poll. Instead,
+    // we forward the dart:io error that comes from trying to read the mtime
+    // below.
+ var pathExists = await File(path).exists();
+ if (_eventsController.isClosed) return;
+
+ if (_lastModified != null && !pathExists) {
+ _flagReady();
+ _eventsController.add(WatchEvent(ChangeType.REMOVE, path));
+ unawaited(close());
+ return;
+ }
+
+ DateTime? modified;
+ try {
+ modified = await modificationTime(path);
+ } on FileSystemException catch (error, stackTrace) {
+ if (!_eventsController.isClosed) {
+ _flagReady();
+ _eventsController.addError(error, stackTrace);
+ await close();
+ }
+ }
+ if (_eventsController.isClosed) {
+ _flagReady();
+ return;
+ }
+
+ if (!isReady) {
+ // If this is the first poll, don't emit an event, just set the last mtime
+ // and complete the completer.
+ _lastModified = modified;
+ _flagReady();
+ return;
+ }
+
+ if (_lastModified == modified) return;
+
+ _lastModified = modified;
+ _eventsController.add(WatchEvent(ChangeType.MODIFY, path));
+ }
+
+  /// Flags this watcher as ready if it has not been flagged already.
+ void _flagReady() {
+ if (!isReady) {
+ _readyCompleter.complete();
+ }
+ }
+
+ @override
+ Future<void> close() async {
+ _timer.cancel();
+ await _eventsController.close();
+ }
+}
diff --git a/pkgs/watcher/lib/src/path_set.dart b/pkgs/watcher/lib/src/path_set.dart
new file mode 100644
index 0000000..4f41cf9
--- /dev/null
+++ b/pkgs/watcher/lib/src/path_set.dart
@@ -0,0 +1,190 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:collection';
+
+import 'package:path/path.dart' as p;
+
+/// A set of paths, organized into a directory hierarchy.
+///
+/// When a path is [add]ed, it creates an implicit directory structure above
+/// that path. Directories can be inspected using [containsDir] and removed
+/// using [remove]. If they're removed, their contents are removed as well.
+///
+/// The paths in the set are normalized so that they all begin with [root].
+class PathSet {
+ /// The root path, which all paths in the set must be under.
+ final String root;
+
+ /// The path set's directory hierarchy.
+ ///
+ /// Each entry represents a directory or file. It may be a file or directory
+ /// that was explicitly added, or a parent directory that was implicitly
+ /// added in order to add a child.
+ final _Entry _entries = _Entry();
+
+ PathSet(this.root);
+
+ /// Adds [path] to the set.
+ void add(String path) {
+ path = _normalize(path);
+
+ var parts = p.split(path);
+ var entry = _entries;
+ for (var part in parts) {
+ entry = entry.contents.putIfAbsent(part, _Entry.new);
+ }
+
+ entry.isExplicit = true;
+ }
+
+ /// Removes [path] and any paths beneath it from the set and returns the
+ /// removed paths.
+ ///
+ /// Even if [path] itself isn't in the set, if it's a directory containing
+ /// paths that are in the set those paths will be removed and returned.
+ ///
+ /// If neither [path] nor any paths beneath it are in the set, returns an
+ /// empty set.
+ Set<String> remove(String path) {
+ path = _normalize(path);
+ var parts = Queue.of(p.split(path));
+
+ // Remove the children of [dir], as well as [dir] itself if necessary.
+ //
+ // [partialPath] is the path to [dir], and a prefix of [path]; the remaining
+ // components of [path] are in [parts].
+ Set<String> recurse(_Entry dir, String partialPath) {
+ if (parts.length > 1) {
+ // If there's more than one component left in [path], recurse down to
+ // the next level.
+ var part = parts.removeFirst();
+ var entry = dir.contents[part];
+ if (entry == null || entry.contents.isEmpty) return <String>{};
+
+ partialPath = p.join(partialPath, part);
+ var paths = recurse(entry, partialPath);
+ // After removing this entry's children, if it has no more children and
+ // it's not in the set in its own right, remove it as well.
+ if (entry.contents.isEmpty && !entry.isExplicit) {
+ dir.contents.remove(part);
+ }
+ return paths;
+ }
+
+ // If there's only one component left in [path], we should remove it.
+ var entry = dir.contents.remove(parts.first);
+ if (entry == null) return <String>{};
+
+ if (entry.contents.isEmpty) {
+ return {p.join(root, path)};
+ }
+
+ var set = _explicitPathsWithin(entry, path);
+ if (entry.isExplicit) {
+ set.add(p.join(root, path));
+ }
+
+ return set;
+ }
+
+ return recurse(_entries, root);
+ }
+
+ /// Recursively lists all of the explicit paths within [dir].
+ ///
+ /// [dirPath] should be the path to [dir].
+ Set<String> _explicitPathsWithin(_Entry dir, String dirPath) {
+ var paths = <String>{};
+ void recurse(_Entry dir, String path) {
+ dir.contents.forEach((name, entry) {
+ var entryPath = p.join(path, name);
+ if (entry.isExplicit) paths.add(p.join(root, entryPath));
+
+ recurse(entry, entryPath);
+ });
+ }
+
+ recurse(dir, dirPath);
+ return paths;
+ }
+
+ /// Returns whether this set contains [path].
+ ///
+ /// This only returns true for paths explicitly added to this set.
+ /// Implicitly-added directories can be inspected using [containsDir].
+ bool contains(String path) {
+ path = _normalize(path);
+ var entry = _entries;
+
+ for (var part in p.split(path)) {
+ var child = entry.contents[part];
+ if (child == null) return false;
+ entry = child;
+ }
+
+ return entry.isExplicit;
+ }
+
+ /// Returns whether this set contains paths beneath [path].
+ bool containsDir(String path) {
+ path = _normalize(path);
+ var entry = _entries;
+
+ for (var part in p.split(path)) {
+ var child = entry.contents[part];
+ if (child == null) return false;
+ entry = child;
+ }
+
+ return entry.contents.isNotEmpty;
+ }
+
+ /// All of the paths explicitly added to this set.
+ List<String> get paths {
+ var result = <String>[];
+
+ void recurse(_Entry dir, String path) {
+ for (var mapEntry in dir.contents.entries) {
+ var entry = mapEntry.value;
+ var entryPath = p.join(path, mapEntry.key);
+ if (entry.isExplicit) result.add(entryPath);
+ recurse(entry, entryPath);
+ }
+ }
+
+ recurse(_entries, root);
+ return result;
+ }
+
+ /// Removes all paths from this set.
+ void clear() {
+ _entries.contents.clear();
+ }
+
+ /// Returns a normalized version of [path].
+ ///
+  /// This removes any extra ".." or "."s and makes the returned path relative
+  /// to [root]. It's an error if [path] isn't within [root].
+ String _normalize(String path) {
+ assert(p.isWithin(root, path));
+
+ return p.relative(p.normalize(path), from: root);
+ }
+}
+
+/// A virtual file system entity tracked by the [PathSet].
+///
+/// It may have child entries in [contents], which implies it's a directory.
+class _Entry {
+ /// The child entries contained in this directory.
+ final Map<String, _Entry> contents = {};
+
+ /// If this entry was explicitly added as a leaf file system entity, this
+ /// will be true.
+ ///
+ /// Otherwise, it represents a parent directory that was implicitly added
+  /// when one of its children was added.
+ bool isExplicit = false;
+}
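
To make the semantics of the internal `PathSet` above concrete, a small sketch; the import reaches into `src/` (fine for illustration, not a public API), all paths are illustrative, and the printed output assumes a POSIX-style path separator:

```dart
import 'package:watcher/src/path_set.dart';

void main() {
  final paths = PathSet('/root');

  paths.add('/root/dir/a.txt');
  paths.add('/root/dir/sub/b.txt');

  print(paths.contains('/root/dir/a.txt')); // true: explicitly added
  print(paths.contains('/root/dir')); // false: only an implicit parent
  print(paths.containsDir('/root/dir')); // true: it has children

  // Removing a directory removes everything beneath it and returns the
  // explicit paths that were dropped.
  print(paths.remove('/root/dir'));
  // {/root/dir/a.txt, /root/dir/sub/b.txt}
}
```
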
diff --git a/pkgs/watcher/lib/src/resubscribable.dart b/pkgs/watcher/lib/src/resubscribable.dart
new file mode 100644
index 0000000..b99e9d7
--- /dev/null
+++ b/pkgs/watcher/lib/src/resubscribable.dart
@@ -0,0 +1,79 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import '../watcher.dart';
+
+/// A wrapper for [ManuallyClosedWatcher] that encapsulates support for closing
+/// the watcher when it has no subscribers and re-opening it when it's
+/// re-subscribed.
+///
+/// It's simpler to implement watchers without worrying about this behavior.
+/// This class wraps a watcher class which can be written with the simplifying
+/// assumption that it can continue emitting events until an explicit `close`
+/// method is called, at which point it will cease emitting events entirely. The
+/// [ManuallyClosedWatcher] interface is used for these watchers.
+///
+/// This would be more cleanly implemented as a function that takes a class and
+/// emits a new class, but Dart doesn't support that sort of thing. Instead it
+/// takes a factory function that produces instances of the inner class.
+abstract class ResubscribableWatcher implements Watcher {
+ /// The factory function that produces instances of the inner class.
+ final ManuallyClosedWatcher Function() _factory;
+
+ @override
+ final String path;
+
+ @override
+ Stream<WatchEvent> get events => _eventsController.stream;
+ late StreamController<WatchEvent> _eventsController;
+
+ @override
+ bool get isReady => _readyCompleter.isCompleted;
+
+ @override
+ Future<void> get ready => _readyCompleter.future;
+ var _readyCompleter = Completer<void>();
+
+ /// Creates a new [ResubscribableWatcher] wrapping the watchers
+ /// emitted by [_factory].
+ ResubscribableWatcher(this.path, this._factory) {
+ late ManuallyClosedWatcher watcher;
+ late StreamSubscription<WatchEvent> subscription;
+
+ _eventsController = StreamController<WatchEvent>.broadcast(
+ onListen: () async {
+ watcher = _factory();
+ subscription = watcher.events.listen(_eventsController.add,
+ onError: _eventsController.addError,
+ onDone: _eventsController.close);
+
+ // It's important that we complete the value of [_readyCompleter] at
+ // the time [onListen] is called, as opposed to the value when
+ // [watcher.ready] fires. A new completer may be created by that time.
+ await watcher.ready;
+ _readyCompleter.complete();
+ },
+ onCancel: () {
+ // Cancel the subscription before closing the watcher so that the
+ // watcher's `onDone` event doesn't close [events].
+ subscription.cancel();
+ watcher.close();
+ _readyCompleter = Completer();
+ },
+ sync: true);
+ }
+}
+
+/// An interface for watchers with an explicit, manual [close] method.
+///
+/// See [ResubscribableWatcher].
+abstract class ManuallyClosedWatcher implements Watcher {
+ /// Closes the watcher.
+ ///
+ /// Subclasses should close their [events] stream and release any internal
+ /// resources.
+ void close();
+}
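
To illustrate the wrapper contract described above, a hypothetical minimal backend (`OneShotWatcher` is not part of this package): the inner class is created on first listen and closed on last cancel by `ResubscribableWatcher`, so it can assume it simply runs until `close` is called.

```dart
import 'dart:async';

import 'package:watcher/src/resubscribable.dart';
import 'package:watcher/watcher.dart';

/// A toy backend that reports a single ADD event for [path] and is otherwise
/// inert. It never has to worry about pause/resume/resubscribe semantics.
class _OneShotWatcher implements ManuallyClosedWatcher {
  @override
  final String path;

  // A plain (non-broadcast) controller buffers the event until the wrapper
  // subscribes in its onListen callback.
  final _controller = StreamController<WatchEvent>();
  final _readyCompleter = Completer<void>();

  _OneShotWatcher(this.path) {
    _controller.add(WatchEvent(ChangeType.ADD, path));
    _readyCompleter.complete();
  }

  @override
  Stream<WatchEvent> get events => _controller.stream;

  @override
  bool get isReady => _readyCompleter.isCompleted;

  @override
  Future<void> get ready => _readyCompleter.future;

  @override
  void close() => _controller.close();
}

/// The public face: resubscribable, like the real watchers in this package.
class OneShotWatcher extends ResubscribableWatcher {
  OneShotWatcher(String path) : super(path, () => _OneShotWatcher(path));
}

Future<void> main() async {
  final watcher = OneShotWatcher('demo'); // hypothetical path
  watcher.events.listen(print); // prints: add demo
  await watcher.ready;
}
```
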
diff --git a/pkgs/watcher/lib/src/stat.dart b/pkgs/watcher/lib/src/stat.dart
new file mode 100644
index 0000000..fe0f155
--- /dev/null
+++ b/pkgs/watcher/lib/src/stat.dart
@@ -0,0 +1,34 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:io';
+
+/// A function that takes a file path and returns the last modified time for
+/// the file at that path.
+typedef MockTimeCallback = DateTime? Function(String path);
+
+MockTimeCallback? _mockTimeCallback;
+
+/// Overrides the default behavior for accessing a file's modification time
+/// with [callback].
+///
+/// The OS file modification time has pretty rough granularity (like a few
+/// seconds) which can make for slow tests that rely on modtime. This lets you
+/// replace it with something you control.
+void mockGetModificationTime(MockTimeCallback callback) {
+ _mockTimeCallback = callback;
+}
+
+/// Gets the modification time for the file at [path].
+/// Completes with `null` if the file does not exist.
+Future<DateTime?> modificationTime(String path) async {
+ var mockTimeCallback = _mockTimeCallback;
+ if (mockTimeCallback != null) {
+ return mockTimeCallback(path);
+ }
+
+ final stat = await FileStat.stat(path);
+ if (stat.type == FileSystemEntityType.notFound) return null;
+ return stat.modified;
+}
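
A short sketch of the mock hook above, useful in tests where the OS mtime granularity is too coarse; the path string is hypothetical:

```dart
import 'package:watcher/src/stat.dart';

Future<void> main() async {
  // Pretend every file was last modified at a fixed instant, regardless of
  // what the real filesystem reports.
  mockGetModificationTime((path) => DateTime.utc(2024, 1, 1));

  final modified = await modificationTime('some/file.txt'); // hypothetical
  print(modified); // 2024-01-01 00:00:00.000Z
}
```
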
diff --git a/pkgs/watcher/lib/src/utils.dart b/pkgs/watcher/lib/src/utils.dart
new file mode 100644
index 0000000..c2e71b3
--- /dev/null
+++ b/pkgs/watcher/lib/src/utils.dart
@@ -0,0 +1,52 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+import 'dart:collection';
+import 'dart:io';
+
+/// Returns `true` if [error] is a [FileSystemException] for a missing
+/// directory.
+bool isDirectoryNotFoundException(Object error) {
+ if (error is! FileSystemException) return false;
+
+ // See dartbug.com/12461 and tests/standalone/io/directory_error_test.dart.
+ var notFoundCode = Platform.operatingSystem == 'windows' ? 3 : 2;
+ return error.osError?.errorCode == notFoundCode;
+}
+
+/// Returns the union of all elements in each set in [sets].
+Set<T> unionAll<T>(Iterable<Set<T>> sets) =>
+ sets.fold(<T>{}, (union, set) => union.union(set));
+
+extension BatchEvents<T> on Stream<T> {
+ /// Batches all events that are sent at the same time.
+ ///
+ /// When multiple events are synchronously added to a stream controller, the
+ /// [StreamController] implementation uses [scheduleMicrotask] to schedule the
+ /// asynchronous firing of each event. In order to recreate the synchronous
+ /// batches, this collates all the events that are received in "nearby"
+ /// microtasks.
+ Stream<List<T>> batchEvents() {
+ var batch = Queue<T>();
+ return StreamTransformer<T, List<T>>.fromHandlers(
+ handleData: (event, sink) {
+ batch.add(event);
+
+ // [Timer.run] schedules an event that runs after any microtasks that have
+ // been scheduled.
+ Timer.run(() {
+ if (batch.isEmpty) return;
+ sink.add(batch.toList());
+ batch.clear();
+ });
+ }, handleDone: (sink) {
+ if (batch.isNotEmpty) {
+ sink.add(batch.toList());
+ batch.clear();
+ }
+ sink.close();
+ }).bind(this);
+ }
+}
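
To show the batching behavior described above, a self-contained sketch using an ordinary stream controller; events added in the same synchronous block come out as one list:

```dart
import 'dart:async';

import 'package:watcher/src/utils.dart';

Future<void> main() async {
  final controller = StreamController<int>();
  controller.stream.batchEvents().listen(print);

  // Added synchronously, so they are collated into a single batch.
  controller
    ..add(1)
    ..add(2)
    ..add(3);

  // Give the flushing timer a chance to run; prints: [1, 2, 3]
  await Future<void>.delayed(const Duration(milliseconds: 10));

  await controller.close();
}
```
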
diff --git a/pkgs/watcher/lib/src/watch_event.dart b/pkgs/watcher/lib/src/watch_event.dart
new file mode 100644
index 0000000..b65afc2
--- /dev/null
+++ b/pkgs/watcher/lib/src/watch_event.dart
@@ -0,0 +1,38 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+/// An event describing a single change to the file system.
+class WatchEvent {
+ /// The manner in which the file at [path] has changed.
+ final ChangeType type;
+
+ /// The path of the file that changed.
+ final String path;
+
+ WatchEvent(this.type, this.path);
+
+ @override
+ String toString() => '$type $path';
+}
+
+/// Enum for what kind of change has happened to a file.
+class ChangeType {
+ /// A new file has been added.
+ // ignore: constant_identifier_names
+ static const ADD = ChangeType('add');
+
+ /// A file has been removed.
+ // ignore: constant_identifier_names
+ static const REMOVE = ChangeType('remove');
+
+ /// The contents of a file have changed.
+ // ignore: constant_identifier_names
+ static const MODIFY = ChangeType('modify');
+
+ final String _name;
+ const ChangeType(this._name);
+
+ @override
+ String toString() => _name;
+}
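
For completeness, a tiny sketch of how consumers typically branch on the `ChangeType` constants above; since they are canonical `const` instances rather than an enum, comparing with `==` (identity here) is the usual pattern. The `handle` function and sample event are illustrative only:

```dart
import 'package:watcher/watcher.dart';

void handle(WatchEvent event) {
  if (event.type == ChangeType.ADD) {
    print('added: ${event.path}');
  } else if (event.type == ChangeType.REMOVE) {
    print('removed: ${event.path}');
  } else {
    print('modified: ${event.path}');
  }
}

void main() => handle(WatchEvent(ChangeType.MODIFY, 'a.txt')); // sample event
```
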
diff --git a/pkgs/watcher/lib/watcher.dart b/pkgs/watcher/lib/watcher.dart
new file mode 100644
index 0000000..12a5369
--- /dev/null
+++ b/pkgs/watcher/lib/watcher.dart
@@ -0,0 +1,70 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:io';
+
+import 'src/directory_watcher.dart';
+import 'src/file_watcher.dart';
+import 'src/watch_event.dart';
+
+export 'src/custom_watcher_factory.dart' show registerCustomWatcher;
+export 'src/directory_watcher.dart';
+export 'src/directory_watcher/polling.dart';
+export 'src/file_watcher.dart';
+export 'src/file_watcher/polling.dart';
+export 'src/watch_event.dart';
+
+abstract class Watcher {
+ /// The path to the file or directory whose contents are being monitored.
+ String get path;
+
+ /// The broadcast [Stream] of events that have occurred to the watched file or
+ /// files in the watched directory.
+ ///
+ /// Changes will only be monitored while this stream has subscribers. Any
+ /// changes that occur during periods when there are no subscribers will not
+ /// be reported the next time a subscriber is added.
+ Stream<WatchEvent> get events;
+
+ /// Whether the watcher is initialized and watching for changes.
+ ///
+ /// This is true if and only if [ready] is complete.
+ bool get isReady;
+
+ /// A [Future] that completes when the watcher is initialized and watching for
+ /// changes.
+ ///
+ /// If the watcher is not currently monitoring the file or directory (because
+ /// there are no subscribers to [events]), this returns a future that isn't
+ /// complete yet. It will complete when a subscriber starts listening and the
+ /// watcher finishes any initialization work it needs to do.
+ ///
+ /// If the watcher is already monitoring, this returns an already complete
+ /// future.
+ ///
+ /// This future always completes successfully as errors are provided through
+ /// the [events] stream.
+ Future get ready;
+
+ /// Creates a new [DirectoryWatcher] or [FileWatcher] monitoring [path],
+ /// depending on whether it's a file or directory.
+ ///
+ /// If a native watcher is available for this platform, this will use it.
+ /// Otherwise, it will fall back to a polling watcher. Notably, watching
+ /// individual files is not natively supported on Windows, although watching
+ /// directories is.
+ ///
+ /// If [pollingDelay] is passed, it specifies the amount of time the watcher
+ /// will pause between successive polls of the contents of [path]. Making this
+ /// shorter will give more immediate feedback at the expense of doing more IO
+  /// and using more CPU. Defaults to one second. Ignored for non-polling
+ /// watchers.
+ factory Watcher(String path, {Duration? pollingDelay}) {
+ if (File(path).existsSync()) {
+ return FileWatcher(path, pollingDelay: pollingDelay);
+ } else {
+ return DirectoryWatcher(path, pollingDelay: pollingDelay);
+ }
+ }
+}
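
A minimal sketch of the top-level `Watcher` factory above; the path is hypothetical, and note that the file-versus-directory decision is made once, when the watcher is constructed:

```dart
import 'package:watcher/watcher.dart';

Future<void> main() async {
  // 'lib' is a directory here, so this resolves to a DirectoryWatcher; a
  // path naming an existing file would resolve to a FileWatcher instead.
  final watcher = Watcher('lib'); // hypothetical path

  final subscription = watcher.events.listen((event) {
    print('${event.type} ${event.path}');
  });

  await watcher.ready;
  // ... make changes under 'lib' ...
  await subscription.cancel();
}
```
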
diff --git a/pkgs/watcher/pubspec.yaml b/pkgs/watcher/pubspec.yaml
new file mode 100644
index 0000000..7781bd4
--- /dev/null
+++ b/pkgs/watcher/pubspec.yaml
@@ -0,0 +1,19 @@
+name: watcher
+version: 1.1.1
+description: >-
+ A file system watcher. It monitors changes to contents of directories and
+ sends notifications when files have been added, removed, or modified.
+repository: https://github.com/dart-lang/tools/tree/main/pkgs/watcher
+
+environment:
+ sdk: ^3.1.0
+
+dependencies:
+ async: ^2.5.0
+ path: ^1.8.0
+
+dev_dependencies:
+ benchmark_harness: ^2.0.0
+ dart_flutter_team_lints: ^3.0.0
+ test: ^1.16.6
+ test_descriptor: ^2.0.0
diff --git a/pkgs/watcher/test/custom_watcher_factory_test.dart b/pkgs/watcher/test/custom_watcher_factory_test.dart
new file mode 100644
index 0000000..e9d65bb
--- /dev/null
+++ b/pkgs/watcher/test/custom_watcher_factory_test.dart
@@ -0,0 +1,142 @@
+import 'dart:async';
+
+import 'package:test/test.dart';
+import 'package:watcher/watcher.dart';
+
+void main() {
+ late _MemFs memFs;
+ final memFsFactoryId = 'MemFs';
+ final noOpFactoryId = 'NoOp';
+
+ setUpAll(() {
+ memFs = _MemFs();
+ var memFsWatcherFactory = _MemFsWatcherFactory(memFs);
+ var noOpWatcherFactory = _NoOpWatcherFactory();
+ registerCustomWatcher(
+ noOpFactoryId,
+ noOpWatcherFactory.createDirectoryWatcher,
+ noOpWatcherFactory.createFileWatcher);
+ registerCustomWatcher(
+ memFsFactoryId,
+ memFsWatcherFactory.createDirectoryWatcher,
+ memFsWatcherFactory.createFileWatcher);
+ });
+
+ test('notifies for files', () async {
+ var watcher = FileWatcher('file.txt');
+
+ var completer = Completer<WatchEvent>();
+ watcher.events.listen((event) => completer.complete(event));
+ await watcher.ready;
+ memFs.add('file.txt');
+ var event = await completer.future;
+
+ expect(event.type, ChangeType.ADD);
+ expect(event.path, 'file.txt');
+ });
+
+ test('notifies for directories', () async {
+ var watcher = DirectoryWatcher('dir');
+
+ var completer = Completer<WatchEvent>();
+ watcher.events.listen((event) => completer.complete(event));
+ await watcher.ready;
+ memFs.add('dir');
+ var event = await completer.future;
+
+ expect(event.type, ChangeType.ADD);
+ expect(event.path, 'dir');
+ });
+
+ test('registering twice throws', () async {
+ expect(
+ () => registerCustomWatcher(
+ memFsFactoryId,
+ (_, {pollingDelay}) => throw UnimplementedError(),
+ (_, {pollingDelay}) => throw UnimplementedError()),
+ throwsA(isA<ArgumentError>()),
+ );
+ });
+
+ test('finding two applicable factories throws', () async {
+ // Note that _MemFsWatcherFactory always returns a watcher, so having two
+ // will always produce a conflict.
+ var watcherFactory = _MemFsWatcherFactory(memFs);
+ registerCustomWatcher('Different id', watcherFactory.createDirectoryWatcher,
+ watcherFactory.createFileWatcher);
+ expect(() => FileWatcher('file.txt'), throwsA(isA<StateError>()));
+ expect(() => DirectoryWatcher('dir'), throwsA(isA<StateError>()));
+ });
+}
+
+class _MemFs {
+ final _streams = <String, Set<StreamController<WatchEvent>>>{};
+
+ StreamController<WatchEvent> watchStream(String path) {
+ var controller = StreamController<WatchEvent>();
+ _streams
+ .putIfAbsent(path, () => <StreamController<WatchEvent>>{})
+ .add(controller);
+ return controller;
+ }
+
+ void add(String path) {
+ var controllers = _streams[path];
+ if (controllers != null) {
+ for (var controller in controllers) {
+ controller.add(WatchEvent(ChangeType.ADD, path));
+ }
+ }
+ }
+
+ void remove(String path) {
+ var controllers = _streams[path];
+ if (controllers != null) {
+ for (var controller in controllers) {
+ controller.add(WatchEvent(ChangeType.REMOVE, path));
+ }
+ }
+ }
+}
+
+class _MemFsWatcher implements FileWatcher, DirectoryWatcher, Watcher {
+ final String _path;
+ final StreamController<WatchEvent> _controller;
+
+ _MemFsWatcher(this._path, this._controller);
+
+ @override
+ String get path => _path;
+
+ @override
+ String get directory => throw UnsupportedError('directory is not supported');
+
+ @override
+ Stream<WatchEvent> get events => _controller.stream;
+
+ @override
+ bool get isReady => true;
+
+ @override
+ Future<void> get ready async {}
+}
+
+class _MemFsWatcherFactory {
+ final _MemFs _memFs;
+ _MemFsWatcherFactory(this._memFs);
+
+ DirectoryWatcher? createDirectoryWatcher(String path,
+ {Duration? pollingDelay}) =>
+ _MemFsWatcher(path, _memFs.watchStream(path));
+
+ FileWatcher? createFileWatcher(String path, {Duration? pollingDelay}) =>
+ _MemFsWatcher(path, _memFs.watchStream(path));
+}
+
+class _NoOpWatcherFactory {
+ DirectoryWatcher? createDirectoryWatcher(String path,
+ {Duration? pollingDelay}) =>
+ null;
+
+ FileWatcher? createFileWatcher(String path, {Duration? pollingDelay}) => null;
+}
diff --git a/pkgs/watcher/test/directory_watcher/linux_test.dart b/pkgs/watcher/test/directory_watcher/linux_test.dart
new file mode 100644
index 0000000..a10a72c
--- /dev/null
+++ b/pkgs/watcher/test/directory_watcher/linux_test.dart
@@ -0,0 +1,44 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+@TestOn('linux')
+library;
+
+import 'package:test/test.dart';
+import 'package:watcher/src/directory_watcher/linux.dart';
+import 'package:watcher/watcher.dart';
+
+import '../utils.dart';
+import 'shared.dart';
+
+void main() {
+ watcherFactory = LinuxDirectoryWatcher.new;
+
+ sharedTests();
+
+ test('DirectoryWatcher creates a LinuxDirectoryWatcher on Linux', () {
+ expect(DirectoryWatcher('.'), const TypeMatcher<LinuxDirectoryWatcher>());
+ });
+
+ test('emits events for many nested files moved out then immediately back in',
+ () async {
+ withPermutations(
+ (i, j, k) => writeFile('dir/sub/sub-$i/sub-$j/file-$k.txt'));
+ await startWatcher(path: 'dir');
+
+ renameDir('dir/sub', 'sub');
+ renameDir('sub', 'dir/sub');
+
+ await allowEither(() {
+ inAnyOrder(withPermutations(
+ (i, j, k) => isRemoveEvent('dir/sub/sub-$i/sub-$j/file-$k.txt')));
+
+ inAnyOrder(withPermutations(
+ (i, j, k) => isAddEvent('dir/sub/sub-$i/sub-$j/file-$k.txt')));
+ }, () {
+ inAnyOrder(withPermutations(
+ (i, j, k) => isModifyEvent('dir/sub/sub-$i/sub-$j/file-$k.txt')));
+ });
+ });
+}
diff --git a/pkgs/watcher/test/directory_watcher/mac_os_test.dart b/pkgs/watcher/test/directory_watcher/mac_os_test.dart
new file mode 100644
index 0000000..3376626
--- /dev/null
+++ b/pkgs/watcher/test/directory_watcher/mac_os_test.dart
@@ -0,0 +1,69 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+@TestOn('mac-os')
+library;
+
+import 'package:test/test.dart';
+import 'package:watcher/src/directory_watcher/mac_os.dart';
+import 'package:watcher/watcher.dart';
+
+import '../utils.dart';
+import 'shared.dart';
+
+void main() {
+ watcherFactory = MacOSDirectoryWatcher.new;
+
+ sharedTests();
+
+ test('DirectoryWatcher creates a MacOSDirectoryWatcher on Mac OS', () {
+ expect(DirectoryWatcher('.'), const TypeMatcher<MacOSDirectoryWatcher>());
+ });
+
+ test(
+ 'does not notify about the watched directory being deleted and '
+ 'recreated immediately before watching', () async {
+ createDir('dir');
+ writeFile('dir/old.txt');
+ deleteDir('dir');
+ createDir('dir');
+
+ await startWatcher(path: 'dir');
+ writeFile('dir/newer.txt');
+ await expectAddEvent('dir/newer.txt');
+ });
+
+ test('emits events for many nested files moved out then immediately back in',
+ () async {
+ withPermutations(
+ (i, j, k) => writeFile('dir/sub/sub-$i/sub-$j/file-$k.txt'));
+
+ await startWatcher(path: 'dir');
+
+ renameDir('dir/sub', 'sub');
+ renameDir('sub', 'dir/sub');
+
+ await allowEither(() {
+ inAnyOrder(withPermutations(
+ (i, j, k) => isRemoveEvent('dir/sub/sub-$i/sub-$j/file-$k.txt')));
+
+ inAnyOrder(withPermutations(
+ (i, j, k) => isAddEvent('dir/sub/sub-$i/sub-$j/file-$k.txt')));
+ }, () {
+ inAnyOrder(withPermutations(
+ (i, j, k) => isModifyEvent('dir/sub/sub-$i/sub-$j/file-$k.txt')));
+ });
+ });
+ test('does not suppress files with the same prefix as a directory', () async {
+ // Regression test for https://github.com/dart-lang/watcher/issues/83
+ writeFile('some_name.txt');
+
+ await startWatcher();
+
+ writeFile('some_name/some_name.txt');
+ deleteFile('some_name.txt');
+
+ await expectRemoveEvent('some_name.txt');
+ });
+}
diff --git a/pkgs/watcher/test/directory_watcher/polling_test.dart b/pkgs/watcher/test/directory_watcher/polling_test.dart
new file mode 100644
index 0000000..f4ec8f4
--- /dev/null
+++ b/pkgs/watcher/test/directory_watcher/polling_test.dart
@@ -0,0 +1,26 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:test/test.dart';
+import 'package:watcher/watcher.dart';
+
+import '../utils.dart';
+import 'shared.dart';
+
+void main() {
+ // Use a short delay to make the tests run quickly.
+ watcherFactory = (dir) => PollingDirectoryWatcher(dir,
+ pollingDelay: const Duration(milliseconds: 100));
+
+ sharedTests();
+
+ test('does not notify if the modification time did not change', () async {
+ writeFile('a.txt', contents: 'before');
+ writeFile('b.txt', contents: 'before');
+ await startWatcher();
+ writeFile('a.txt', contents: 'after', updateModified: false);
+ writeFile('b.txt', contents: 'after');
+ await expectModifyEvent('b.txt');
+ });
+}
diff --git a/pkgs/watcher/test/directory_watcher/shared.dart b/pkgs/watcher/test/directory_watcher/shared.dart
new file mode 100644
index 0000000..1ebc78d
--- /dev/null
+++ b/pkgs/watcher/test/directory_watcher/shared.dart
@@ -0,0 +1,344 @@
+// Copyright (c) 2012, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:test/test.dart';
+import 'package:watcher/src/utils.dart';
+
+import '../utils.dart';
+
+void sharedTests() {
+ test('does not notify for files that already exist when started', () async {
+ // Make some pre-existing files.
+ writeFile('a.txt');
+ writeFile('b.txt');
+
+ await startWatcher();
+
+ // Change one after the watcher is running.
+ writeFile('b.txt', contents: 'modified');
+
+ // We should get a modify event for the changed file, but no add events
+    // for the pre-existing files before this.
+ await expectModifyEvent('b.txt');
+ });
+
+ test('notifies when a file is added', () async {
+ await startWatcher();
+ writeFile('file.txt');
+ await expectAddEvent('file.txt');
+ });
+
+ test('notifies when a file is modified', () async {
+ writeFile('file.txt');
+ await startWatcher();
+ writeFile('file.txt', contents: 'modified');
+ await expectModifyEvent('file.txt');
+ });
+
+ test('notifies when a file is removed', () async {
+ writeFile('file.txt');
+ await startWatcher();
+ deleteFile('file.txt');
+ await expectRemoveEvent('file.txt');
+ });
+
+ test('notifies when a file is modified multiple times', () async {
+ writeFile('file.txt');
+ await startWatcher();
+ writeFile('file.txt', contents: 'modified');
+ await expectModifyEvent('file.txt');
+ writeFile('file.txt', contents: 'modified again');
+ await expectModifyEvent('file.txt');
+ });
+
+ test('notifies even if the file contents are unchanged', () async {
+ writeFile('a.txt', contents: 'same');
+ writeFile('b.txt', contents: 'before');
+ await startWatcher();
+
+ writeFile('a.txt', contents: 'same');
+ writeFile('b.txt', contents: 'after');
+ await inAnyOrder([isModifyEvent('a.txt'), isModifyEvent('b.txt')]);
+ });
+
+ test('when the watched directory is deleted, removes all files', () async {
+ writeFile('dir/a.txt');
+ writeFile('dir/b.txt');
+
+ await startWatcher(path: 'dir');
+
+ deleteDir('dir');
+ await inAnyOrder([isRemoveEvent('dir/a.txt'), isRemoveEvent('dir/b.txt')]);
+ });
+
+ test('when the watched directory is moved, removes all files', () async {
+ writeFile('dir/a.txt');
+ writeFile('dir/b.txt');
+
+ await startWatcher(path: 'dir');
+
+ renameDir('dir', 'moved_dir');
+ createDir('dir');
+ await inAnyOrder([isRemoveEvent('dir/a.txt'), isRemoveEvent('dir/b.txt')]);
+ });
+
+ // Regression test for b/30768513.
+ test(
+ "doesn't crash when the directory is moved immediately after a subdir "
+ 'is added', () async {
+ writeFile('dir/a.txt');
+ writeFile('dir/b.txt');
+
+ await startWatcher(path: 'dir');
+
+ createDir('dir/subdir');
+ renameDir('dir', 'moved_dir');
+ createDir('dir');
+ await inAnyOrder([isRemoveEvent('dir/a.txt'), isRemoveEvent('dir/b.txt')]);
+ });
+
+ group('moves', () {
+ test('notifies when a file is moved within the watched directory',
+ () async {
+ writeFile('old.txt');
+ await startWatcher();
+ renameFile('old.txt', 'new.txt');
+
+ await inAnyOrder([isAddEvent('new.txt'), isRemoveEvent('old.txt')]);
+ });
+
+ test('notifies when a file is moved from outside the watched directory',
+ () async {
+ writeFile('old.txt');
+ createDir('dir');
+ await startWatcher(path: 'dir');
+
+ renameFile('old.txt', 'dir/new.txt');
+ await expectAddEvent('dir/new.txt');
+ });
+
+ test('notifies when a file is moved outside the watched directory',
+ () async {
+ writeFile('dir/old.txt');
+ await startWatcher(path: 'dir');
+
+ renameFile('dir/old.txt', 'new.txt');
+ await expectRemoveEvent('dir/old.txt');
+ });
+
+ test('notifies when a file is moved onto an existing one', () async {
+ writeFile('from.txt');
+ writeFile('to.txt');
+ await startWatcher();
+
+ renameFile('from.txt', 'to.txt');
+ await inAnyOrder([isRemoveEvent('from.txt'), isModifyEvent('to.txt')]);
+ }, onPlatform: {
+ 'windows': const Skip('https://github.com/dart-lang/watcher/issues/125')
+ });
+ });
+
+ // Most of the time, when multiple filesystem actions happen in sequence,
+ // they'll be batched together and the watcher will see them all at once.
+  // These tests verify that the watcher normalizes and combines these events
+ // properly. However, very occasionally the events will be reported in
+ // separate batches, and the watcher will report them as though they occurred
+ // far apart in time, so each of these tests has a "backup case" to allow for
+ // that as well.
+ group('clustered changes', () {
+ test("doesn't notify when a file is created and then immediately removed",
+ () async {
+ writeFile('test.txt');
+ await startWatcher();
+ writeFile('file.txt');
+ deleteFile('file.txt');
+
+ // Backup case.
+ startClosingEventStream();
+ await allowEvents(() {
+ expectAddEvent('file.txt');
+ expectRemoveEvent('file.txt');
+ });
+ });
+
+ test(
+ 'reports a modification when a file is deleted and then immediately '
+ 'recreated', () async {
+ writeFile('file.txt');
+ await startWatcher();
+
+ deleteFile('file.txt');
+ writeFile('file.txt', contents: 're-created');
+
+ await allowEither(() {
+ expectModifyEvent('file.txt');
+ }, () {
+ // Backup case.
+ expectRemoveEvent('file.txt');
+ expectAddEvent('file.txt');
+ });
+ });
+
+ test(
+ 'reports a modification when a file is moved and then immediately '
+ 'recreated', () async {
+ writeFile('old.txt');
+ await startWatcher();
+
+ renameFile('old.txt', 'new.txt');
+ writeFile('old.txt', contents: 're-created');
+
+ await allowEither(() {
+ inAnyOrder([isModifyEvent('old.txt'), isAddEvent('new.txt')]);
+ }, () {
+ // Backup case.
+ expectRemoveEvent('old.txt');
+ expectAddEvent('new.txt');
+ expectAddEvent('old.txt');
+ });
+ });
+
+ test(
+ 'reports a removal when a file is modified and then immediately '
+ 'removed', () async {
+ writeFile('file.txt');
+ await startWatcher();
+
+ writeFile('file.txt', contents: 'modified');
+ deleteFile('file.txt');
+
+ // Backup case.
+ await allowModifyEvent('file.txt');
+
+ await expectRemoveEvent('file.txt');
+ });
+
+ test('reports an add when a file is added and then immediately modified',
+ () async {
+ await startWatcher();
+
+ writeFile('file.txt');
+ writeFile('file.txt', contents: 'modified');
+
+ await expectAddEvent('file.txt');
+
+ // Backup case.
+ startClosingEventStream();
+ await allowModifyEvent('file.txt');
+ });
+ });
+
+ group('subdirectories', () {
+ test('watches files in subdirectories', () async {
+ await startWatcher();
+ writeFile('a/b/c/d/file.txt');
+ await expectAddEvent('a/b/c/d/file.txt');
+ });
+
+ test(
+ 'notifies when a subdirectory is moved within the watched directory '
+ 'and then its contents are modified', () async {
+ writeFile('old/file.txt');
+ await startWatcher();
+
+ renameDir('old', 'new');
+ await inAnyOrder(
+ [isRemoveEvent('old/file.txt'), isAddEvent('new/file.txt')]);
+
+ writeFile('new/file.txt', contents: 'modified');
+ await expectModifyEvent('new/file.txt');
+ });
+
+ test('notifies when a file is replaced by a subdirectory', () async {
+ writeFile('new');
+ writeFile('old/file.txt');
+ await startWatcher();
+
+ deleteFile('new');
+ renameDir('old', 'new');
+ await inAnyOrder([
+ isRemoveEvent('new'),
+ isRemoveEvent('old/file.txt'),
+ isAddEvent('new/file.txt')
+ ]);
+ });
+
+ test('notifies when a subdirectory is replaced by a file', () async {
+ writeFile('old');
+ writeFile('new/file.txt');
+ await startWatcher();
+
+ renameDir('new', 'newer');
+ renameFile('old', 'new');
+ await inAnyOrder([
+ isRemoveEvent('new/file.txt'),
+ isAddEvent('newer/file.txt'),
+ isRemoveEvent('old'),
+ isAddEvent('new')
+ ]);
+ }, onPlatform: {
+ 'windows': const Skip('https://github.com/dart-lang/watcher/issues/21')
+ });
+
+ test('emits events for many nested files added at once', () async {
+ withPermutations((i, j, k) => writeFile('sub/sub-$i/sub-$j/file-$k.txt'));
+
+ createDir('dir');
+ await startWatcher(path: 'dir');
+ renameDir('sub', 'dir/sub');
+
+ await inAnyOrder(withPermutations(
+ (i, j, k) => isAddEvent('dir/sub/sub-$i/sub-$j/file-$k.txt')));
+ });
+
+ test('emits events for many nested files removed at once', () async {
+ withPermutations(
+ (i, j, k) => writeFile('dir/sub/sub-$i/sub-$j/file-$k.txt'));
+
+ createDir('dir');
+ await startWatcher(path: 'dir');
+
+ // Rename the directory rather than deleting it because native watchers
+ // report a rename as a single DELETE event for the directory, whereas
+ // they report recursive deletion with DELETE events for every file in the
+ // directory.
+ renameDir('dir/sub', 'sub');
+
+ await inAnyOrder(withPermutations(
+ (i, j, k) => isRemoveEvent('dir/sub/sub-$i/sub-$j/file-$k.txt')));
+ });
+
+ test('emits events for many nested files moved at once', () async {
+ withPermutations(
+ (i, j, k) => writeFile('dir/old/sub-$i/sub-$j/file-$k.txt'));
+
+ createDir('dir');
+ await startWatcher(path: 'dir');
+ renameDir('dir/old', 'dir/new');
+
+ await inAnyOrder(unionAll(withPermutations((i, j, k) {
+ return {
+ isRemoveEvent('dir/old/sub-$i/sub-$j/file-$k.txt'),
+ isAddEvent('dir/new/sub-$i/sub-$j/file-$k.txt')
+ };
+ })));
+ });
+
+ test(
+ 'emits events for many files added at once in a subdirectory with the '
+ 'same name as a removed file', () async {
+ writeFile('dir/sub');
+ withPermutations((i, j, k) => writeFile('old/sub-$i/sub-$j/file-$k.txt'));
+ await startWatcher(path: 'dir');
+
+ deleteFile('dir/sub');
+ renameDir('old', 'dir/sub');
+
+ var events = withPermutations(
+ (i, j, k) => isAddEvent('dir/sub/sub-$i/sub-$j/file-$k.txt'));
+ events.add(isRemoveEvent('dir/sub'));
+ await inAnyOrder(events);
+ });
+ });
+}
diff --git a/pkgs/watcher/test/directory_watcher/windows_test.dart b/pkgs/watcher/test/directory_watcher/windows_test.dart
new file mode 100644
index 0000000..499e7fb
--- /dev/null
+++ b/pkgs/watcher/test/directory_watcher/windows_test.dart
@@ -0,0 +1,23 @@
+// Copyright (c) 2014, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+@TestOn('windows')
+library;
+
+import 'package:test/test.dart';
+import 'package:watcher/src/directory_watcher/windows.dart';
+import 'package:watcher/watcher.dart';
+
+import '../utils.dart';
+import 'shared.dart';
+
+void main() {
+ watcherFactory = WindowsDirectoryWatcher.new;
+
+ group('Shared Tests:', sharedTests);
+
+ test('DirectoryWatcher creates a WindowsDirectoryWatcher on Windows', () {
+ expect(DirectoryWatcher('.'), const TypeMatcher<WindowsDirectoryWatcher>());
+ });
+}
diff --git a/pkgs/watcher/test/file_watcher/native_test.dart b/pkgs/watcher/test/file_watcher/native_test.dart
new file mode 100644
index 0000000..0d4ad63
--- /dev/null
+++ b/pkgs/watcher/test/file_watcher/native_test.dart
@@ -0,0 +1,22 @@
+// Copyright (c) 2015, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+@TestOn('linux || mac-os')
+library;
+
+import 'package:test/test.dart';
+import 'package:watcher/src/file_watcher/native.dart';
+
+import '../utils.dart';
+import 'shared.dart';
+
+void main() {
+ watcherFactory = NativeFileWatcher.new;
+
+ setUp(() {
+ writeFile('file.txt');
+ });
+
+ sharedTests();
+}
diff --git a/pkgs/watcher/test/file_watcher/polling_test.dart b/pkgs/watcher/test/file_watcher/polling_test.dart
new file mode 100644
index 0000000..861fcb2
--- /dev/null
+++ b/pkgs/watcher/test/file_watcher/polling_test.dart
@@ -0,0 +1,20 @@
+// Copyright (c) 2015, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:test/test.dart';
+import 'package:watcher/watcher.dart';
+
+import '../utils.dart';
+import 'shared.dart';
+
+void main() {
+ watcherFactory = (file) =>
+ PollingFileWatcher(file, pollingDelay: const Duration(milliseconds: 100));
+
+ setUp(() {
+ writeFile('file.txt');
+ });
+
+ sharedTests();
+}
diff --git a/pkgs/watcher/test/file_watcher/shared.dart b/pkgs/watcher/test/file_watcher/shared.dart
new file mode 100644
index 0000000..081b92e
--- /dev/null
+++ b/pkgs/watcher/test/file_watcher/shared.dart
@@ -0,0 +1,73 @@
+// Copyright (c) 2015, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:test/test.dart';
+
+import '../utils.dart';
+
+void sharedTests() {
+ test("doesn't notify if the file isn't modified", () async {
+ await startWatcher(path: 'file.txt');
+ await pumpEventQueue();
+ deleteFile('file.txt');
+ await expectRemoveEvent('file.txt');
+ });
+
+ test('notifies when a file is modified', () async {
+ await startWatcher(path: 'file.txt');
+ writeFile('file.txt', contents: 'modified');
+ await expectModifyEvent('file.txt');
+ });
+
+ test('notifies when a file is removed', () async {
+ await startWatcher(path: 'file.txt');
+ deleteFile('file.txt');
+ await expectRemoveEvent('file.txt');
+ });
+
+ test('notifies when a file is modified multiple times', () async {
+ await startWatcher(path: 'file.txt');
+ writeFile('file.txt', contents: 'modified');
+ await expectModifyEvent('file.txt');
+ writeFile('file.txt', contents: 'modified again');
+ await expectModifyEvent('file.txt');
+ });
+
+ test('notifies even if the file contents are unchanged', () async {
+ await startWatcher(path: 'file.txt');
+ writeFile('file.txt');
+ await expectModifyEvent('file.txt');
+ });
+
+ test('emits a remove event when the watched file is moved away', () async {
+ await startWatcher(path: 'file.txt');
+ renameFile('file.txt', 'new.txt');
+ await expectRemoveEvent('file.txt');
+ });
+
+ test(
+ 'emits a modify event when another file is moved on top of the watched '
+ 'file', () async {
+ writeFile('old.txt');
+ await startWatcher(path: 'file.txt');
+ renameFile('old.txt', 'file.txt');
+ await expectModifyEvent('file.txt');
+ });
+
+ // Regression test for a race condition.
+ test('closes the watcher immediately after deleting the file', () async {
+ writeFile('old.txt');
+ var watcher = createWatcher(path: 'file.txt');
+ var sub = watcher.events.listen(null);
+
+ deleteFile('file.txt');
+ await Future<void>.delayed(const Duration(milliseconds: 10));
+ await sub.cancel();
+ });
+
+ test('ready completes even if file does not exist', () async {
+ // startWatcher awaits 'ready'
+ await startWatcher(path: 'foo/bar/baz');
+ });
+}
diff --git a/pkgs/watcher/test/no_subscription/linux_test.dart b/pkgs/watcher/test/no_subscription/linux_test.dart
new file mode 100644
index 0000000..aac0810
--- /dev/null
+++ b/pkgs/watcher/test/no_subscription/linux_test.dart
@@ -0,0 +1,18 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+@TestOn('linux')
+library;
+
+import 'package:test/test.dart';
+import 'package:watcher/src/directory_watcher/linux.dart';
+
+import '../utils.dart';
+import 'shared.dart';
+
+void main() {
+ watcherFactory = LinuxDirectoryWatcher.new;
+
+ sharedTests();
+}
diff --git a/pkgs/watcher/test/no_subscription/mac_os_test.dart b/pkgs/watcher/test/no_subscription/mac_os_test.dart
new file mode 100644
index 0000000..55a8308
--- /dev/null
+++ b/pkgs/watcher/test/no_subscription/mac_os_test.dart
@@ -0,0 +1,18 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+@TestOn('mac-os')
+library;
+
+import 'package:test/test.dart';
+import 'package:watcher/src/directory_watcher/mac_os.dart';
+
+import '../utils.dart';
+import 'shared.dart';
+
+void main() {
+ watcherFactory = MacOSDirectoryWatcher.new;
+
+ sharedTests();
+}
diff --git a/pkgs/watcher/test/no_subscription/polling_test.dart b/pkgs/watcher/test/no_subscription/polling_test.dart
new file mode 100644
index 0000000..bfd2958
--- /dev/null
+++ b/pkgs/watcher/test/no_subscription/polling_test.dart
@@ -0,0 +1,14 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:watcher/watcher.dart';
+
+import '../utils.dart';
+import 'shared.dart';
+
+void main() {
+ watcherFactory = PollingDirectoryWatcher.new;
+
+ sharedTests();
+}
diff --git a/pkgs/watcher/test/no_subscription/shared.dart b/pkgs/watcher/test/no_subscription/shared.dart
new file mode 100644
index 0000000..e7a6144
--- /dev/null
+++ b/pkgs/watcher/test/no_subscription/shared.dart
@@ -0,0 +1,54 @@
+// Copyright (c) 2012, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'package:async/async.dart';
+import 'package:test/test.dart';
+import 'package:watcher/watcher.dart';
+
+import '../utils.dart';
+
+void sharedTests() {
+ test('does not notify for changes when there are no subscribers', () async {
+ // Note that this test doesn't rely as heavily on the test functions in
+ // utils.dart because it needs to be very explicit about when the event
+ // stream is and is not subscribed.
+ var watcher = createWatcher();
+ var queue = StreamQueue(watcher.events);
+ unawaited(queue.hasNext);
+
+ var future =
+ expectLater(queue, emits(isWatchEvent(ChangeType.ADD, 'file.txt')));
+ expect(queue, neverEmits(anything));
+
+ await watcher.ready;
+
+ writeFile('file.txt');
+
+ await future;
+
+ // Unsubscribe.
+ await queue.cancel(immediate: true);
+
+ // Now write a file while we aren't listening.
+ writeFile('unwatched.txt');
+
+ queue = StreamQueue(watcher.events);
+ future =
+ expectLater(queue, emits(isWatchEvent(ChangeType.ADD, 'added.txt')));
+ expect(queue, neverEmits(isWatchEvent(ChangeType.ADD, 'unwatched.txt')));
+
+ // Wait until the watcher is ready to dispatch events again.
+ await watcher.ready;
+
+ // And add a third file.
+ writeFile('added.txt');
+
+ // Wait until we get an event for the third file.
+ await future;
+
+ await queue.cancel(immediate: true);
+ });
+}
diff --git a/pkgs/watcher/test/no_subscription/windows_test.dart b/pkgs/watcher/test/no_subscription/windows_test.dart
new file mode 100644
index 0000000..9f9e5a9
--- /dev/null
+++ b/pkgs/watcher/test/no_subscription/windows_test.dart
@@ -0,0 +1,18 @@
+// Copyright (c) 2022, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+@TestOn('windows')
+library;
+
+import 'package:test/test.dart';
+import 'package:watcher/src/directory_watcher/windows.dart';
+
+import '../utils.dart';
+import 'shared.dart';
+
+void main() {
+ watcherFactory = WindowsDirectoryWatcher.new;
+
+ sharedTests();
+}
diff --git a/pkgs/watcher/test/path_set_test.dart b/pkgs/watcher/test/path_set_test.dart
new file mode 100644
index 0000000..61ab2cd
--- /dev/null
+++ b/pkgs/watcher/test/path_set_test.dart
@@ -0,0 +1,228 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:path/path.dart' as p;
+import 'package:test/test.dart';
+import 'package:watcher/src/path_set.dart';
+
+Matcher containsPath(String path) => predicate(
+ (paths) => paths is PathSet && paths.contains(path),
+ 'set contains "$path"');
+
+Matcher containsDir(String path) => predicate(
+ (paths) => paths is PathSet && paths.containsDir(path),
+ 'set contains directory "$path"');
+
+void main() {
+ late PathSet paths;
+ setUp(() => paths = PathSet('root'));
+
+ group('adding a path', () {
+ test('stores the path in the set', () {
+ paths.add('root/path/to/file');
+ expect(paths, containsPath('root/path/to/file'));
+ });
+
+ test("that's a subdir of another path keeps both in the set", () {
+ paths.add('root/path');
+ paths.add('root/path/to/file');
+ expect(paths, containsPath('root/path'));
+ expect(paths, containsPath('root/path/to/file'));
+ });
+
+ test("that's not normalized normalizes the path before storing it", () {
+ paths.add('root/../root/path/to/../to/././file');
+ expect(paths, containsPath('root/path/to/file'));
+ });
+
+ test("that's absolute normalizes the path before storing it", () {
+ paths.add(p.absolute('root/path/to/file'));
+ expect(paths, containsPath('root/path/to/file'));
+ });
+ });
+
+ group('removing a path', () {
+ test("that's in the set removes and returns that path", () {
+ paths.add('root/path/to/file');
+ expect(paths.remove('root/path/to/file'),
+ unorderedEquals([p.normalize('root/path/to/file')]));
+ expect(paths, isNot(containsPath('root/path/to/file')));
+ });
+
+ test("that's not in the set returns an empty set", () {
+ paths.add('root/path/to/file');
+ expect(paths.remove('root/path/to/nothing'), isEmpty);
+ });
+
+ test("that's a directory removes and returns all files beneath it", () {
+ paths.add('root/outside');
+ paths.add('root/path/to/one');
+ paths.add('root/path/to/two');
+ paths.add('root/path/to/sub/three');
+
+ expect(
+ paths.remove('root/path'),
+ unorderedEquals([
+ 'root/path/to/one',
+ 'root/path/to/two',
+ 'root/path/to/sub/three'
+ ].map(p.normalize)));
+
+ expect(paths, containsPath('root/outside'));
+ expect(paths, isNot(containsPath('root/path/to/one')));
+ expect(paths, isNot(containsPath('root/path/to/two')));
+ expect(paths, isNot(containsPath('root/path/to/sub/three')));
+ });
+
+ test(
+ "that's a directory in the set removes and returns it and all files "
+ 'beneath it', () {
+ paths.add('root/path');
+ paths.add('root/path/to/one');
+ paths.add('root/path/to/two');
+ paths.add('root/path/to/sub/three');
+
+ expect(
+ paths.remove('root/path'),
+ unorderedEquals([
+ 'root/path',
+ 'root/path/to/one',
+ 'root/path/to/two',
+ 'root/path/to/sub/three'
+ ].map(p.normalize)));
+
+ expect(paths, isNot(containsPath('root/path')));
+ expect(paths, isNot(containsPath('root/path/to/one')));
+ expect(paths, isNot(containsPath('root/path/to/two')));
+ expect(paths, isNot(containsPath('root/path/to/sub/three')));
+ });
+
+ test("that's not normalized removes and returns the normalized path", () {
+ paths.add('root/path/to/file');
+ expect(paths.remove('root/../root/path/to/../to/./file'),
+ unorderedEquals([p.normalize('root/path/to/file')]));
+ });
+
+ test("that's absolute removes and returns the normalized path", () {
+ paths.add('root/path/to/file');
+ expect(paths.remove(p.absolute('root/path/to/file')),
+ unorderedEquals([p.normalize('root/path/to/file')]));
+ });
+ });
+
+ group('containsPath()', () {
+ test('returns false for a non-existent path', () {
+ paths.add('root/path/to/file');
+ expect(paths, isNot(containsPath('root/path/to/nothing')));
+ });
+
+ test("returns false for a directory that wasn't added explicitly", () {
+ paths.add('root/path/to/file');
+ expect(paths, isNot(containsPath('root/path')));
+ });
+
+ test('returns true for a directory that was added explicitly', () {
+ paths.add('root/path');
+ paths.add('root/path/to/file');
+ expect(paths, containsPath('root/path'));
+ });
+
+ test('with a non-normalized path normalizes the path before looking it up',
+ () {
+ paths.add('root/path/to/file');
+ expect(paths, containsPath('root/../root/path/to/../to/././file'));
+ });
+
+ test('with an absolute path normalizes the path before looking it up', () {
+ paths.add('root/path/to/file');
+ expect(paths, containsPath(p.absolute('root/path/to/file')));
+ });
+ });
+
+ group('containsDir()', () {
+ test('returns true for a directory that was added implicitly', () {
+ paths.add('root/path/to/file');
+ expect(paths, containsDir('root/path'));
+ expect(paths, containsDir('root/path/to'));
+ });
+
+ test('returns true for a directory that was added explicitly', () {
+ paths.add('root/path');
+ paths.add('root/path/to/file');
+ expect(paths, containsDir('root/path'));
+ });
+
+ test("returns false for a directory that wasn't added", () {
+ expect(paths, isNot(containsDir('root/nothing')));
+ });
+
+ test('returns false for a non-directory path that was added', () {
+ paths.add('root/path/to/file');
+ expect(paths, isNot(containsDir('root/path/to/file')));
+ });
+
+ test(
+ 'returns false for a directory that was added implicitly and then '
+ 'removed implicitly', () {
+ paths.add('root/path/to/file');
+ paths.remove('root/path/to/file');
+ expect(paths, isNot(containsDir('root/path')));
+ });
+
+ test(
+ 'returns false for a directory that was added explicitly whose '
+ 'children were then removed', () {
+ paths.add('root/path');
+ paths.add('root/path/to/file');
+ paths.remove('root/path/to/file');
+ expect(paths, isNot(containsDir('root/path')));
+ });
+
+ test('with a non-normalized path normalizes the path before looking it up',
+ () {
+ paths.add('root/path/to/file');
+ expect(paths, containsDir('root/../root/path/to/../to/.'));
+ });
+
+ test('with an absolute path normalizes the path before looking it up', () {
+ paths.add('root/path/to/file');
+ expect(paths, containsDir(p.absolute('root/path')));
+ });
+ });
+
+ group('paths', () {
+ test('returns paths added to the set', () {
+ paths.add('root/path');
+ paths.add('root/path/to/one');
+ paths.add('root/path/to/two');
+
+ expect(
+ paths.paths,
+ unorderedEquals([
+ 'root/path',
+ 'root/path/to/one',
+ 'root/path/to/two',
+ ].map(p.normalize)));
+ });
+
+ test("doesn't return paths removed from the set", () {
+ paths.add('root/path/to/one');
+ paths.add('root/path/to/two');
+ paths.remove('root/path/to/two');
+
+ expect(paths.paths, unorderedEquals([p.normalize('root/path/to/one')]));
+ });
+ });
+
+ group('clear', () {
+ test('removes all paths from the set', () {
+ paths.add('root/path');
+ paths.add('root/path/to/one');
+ paths.add('root/path/to/two');
+
+ paths.clear();
+ expect(paths.paths, isEmpty);
+ });
+ });
+}
diff --git a/pkgs/watcher/test/ready/linux_test.dart b/pkgs/watcher/test/ready/linux_test.dart
new file mode 100644
index 0000000..aac0810
--- /dev/null
+++ b/pkgs/watcher/test/ready/linux_test.dart
@@ -0,0 +1,18 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+@TestOn('linux')
+library;
+
+import 'package:test/test.dart';
+import 'package:watcher/src/directory_watcher/linux.dart';
+
+import '../utils.dart';
+import 'shared.dart';
+
+void main() {
+ watcherFactory = LinuxDirectoryWatcher.new;
+
+ sharedTests();
+}
diff --git a/pkgs/watcher/test/ready/mac_os_test.dart b/pkgs/watcher/test/ready/mac_os_test.dart
new file mode 100644
index 0000000..55a8308
--- /dev/null
+++ b/pkgs/watcher/test/ready/mac_os_test.dart
@@ -0,0 +1,18 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+@TestOn('mac-os')
+library;
+
+import 'package:test/test.dart';
+import 'package:watcher/src/directory_watcher/mac_os.dart';
+
+import '../utils.dart';
+import 'shared.dart';
+
+void main() {
+ watcherFactory = MacOSDirectoryWatcher.new;
+
+ sharedTests();
+}
diff --git a/pkgs/watcher/test/ready/polling_test.dart b/pkgs/watcher/test/ready/polling_test.dart
new file mode 100644
index 0000000..bfd2958
--- /dev/null
+++ b/pkgs/watcher/test/ready/polling_test.dart
@@ -0,0 +1,14 @@
+// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:watcher/watcher.dart';
+
+import '../utils.dart';
+import 'shared.dart';
+
+void main() {
+ watcherFactory = PollingDirectoryWatcher.new;
+
+ sharedTests();
+}
diff --git a/pkgs/watcher/test/ready/shared.dart b/pkgs/watcher/test/ready/shared.dart
new file mode 100644
index 0000000..ab2c3e1
--- /dev/null
+++ b/pkgs/watcher/test/ready/shared.dart
@@ -0,0 +1,84 @@
+// Copyright (c) 2012, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'package:test/test.dart';
+
+import '../utils.dart';
+
+void sharedTests() {
+ test('ready does not complete until after subscription', () async {
+ var watcher = createWatcher();
+
+ var ready = false;
+ unawaited(watcher.ready.then((_) {
+ ready = true;
+ }));
+ await pumpEventQueue();
+
+ expect(ready, isFalse);
+
+ // Subscribe to the events.
+ var subscription = watcher.events.listen((event) {});
+
+ await watcher.ready;
+
+ // Should eventually be ready.
+ expect(watcher.isReady, isTrue);
+
+ await subscription.cancel();
+ });
+
+ test('ready completes immediately when already ready', () async {
+ var watcher = createWatcher();
+
+ // Subscribe to the events.
+ var subscription = watcher.events.listen((event) {});
+
+ // Allow watcher to become ready
+ await watcher.ready;
+
+ // Ensure ready completes immediately
+ expect(
+ watcher.ready.timeout(
+ const Duration(milliseconds: 0),
+ onTimeout: () => throw StateError('Does not complete immediately'),
+ ),
+ completes,
+ );
+
+ await subscription.cancel();
+ });
+
+ test('ready returns a future that does not complete after unsubscribing',
+ () async {
+ var watcher = createWatcher();
+
+ // Subscribe to the events.
+ var subscription = watcher.events.listen((event) {});
+
+ // Wait until ready.
+ await watcher.ready;
+
+ // Now unsubscribe.
+ await subscription.cancel();
+
+ // Should be back to not ready.
+ expect(watcher.ready, doesNotComplete);
+ });
+
+ test('ready completes even if directory does not exist', () async {
+ var watcher = createWatcher(path: 'does/not/exist');
+
+ // Subscribe to the events (else ready will never fire).
+ var subscription = watcher.events.listen((event) {}, onError: (error) {});
+
+ // Expect ready still completes.
+ await watcher.ready;
+
+ // Now unsubscribe.
+ await subscription.cancel();
+ });
+}
diff --git a/pkgs/watcher/test/ready/windows_test.dart b/pkgs/watcher/test/ready/windows_test.dart
new file mode 100644
index 0000000..9f9e5a9
--- /dev/null
+++ b/pkgs/watcher/test/ready/windows_test.dart
@@ -0,0 +1,18 @@
+// Copyright (c) 2022, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+@TestOn('windows')
+library;
+
+import 'package:test/test.dart';
+import 'package:watcher/src/directory_watcher/windows.dart';
+
+import '../utils.dart';
+import 'shared.dart';
+
+void main() {
+ watcherFactory = WindowsDirectoryWatcher.new;
+
+ sharedTests();
+}
diff --git a/pkgs/watcher/test/utils.dart b/pkgs/watcher/test/utils.dart
new file mode 100644
index 0000000..7867b9f
--- /dev/null
+++ b/pkgs/watcher/test/utils.dart
@@ -0,0 +1,288 @@
+// Copyright (c) 2012, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+import 'dart:io';
+
+import 'package:async/async.dart';
+import 'package:path/path.dart' as p;
+import 'package:test/test.dart';
+import 'package:test_descriptor/test_descriptor.dart' as d;
+import 'package:watcher/src/stat.dart';
+import 'package:watcher/watcher.dart';
+
+typedef WatcherFactory = Watcher Function(String directory);
+
+/// Sets the function used to create the watcher.
+set watcherFactory(WatcherFactory factory) {
+ _watcherFactory = factory;
+}
+
+/// The mock modification times (in milliseconds since epoch) for each file.
+///
+/// The actual file system has pretty coarse granularity for file modification
+/// times. This means using the real file system requires us to put delays in
+/// the tests to ensure we wait long enough between operations for the mod time
+/// to be different.
+///
+/// Instead, we'll just mock that out. Each time a file is written, we manually
+/// increment the mod time for that file instantly.
+final _mockFileModificationTimes = <String, int>{};
+
+late WatcherFactory _watcherFactory;
+
+/// Creates a new [Watcher] that watches a temporary file or directory.
+///
+/// If [path] is provided, watches a subdirectory in the sandbox with that name.
+Watcher createWatcher({String? path}) {
+ if (path == null) {
+ path = d.sandbox;
+ } else {
+ path = p.join(d.sandbox, path);
+ }
+
+ return _watcherFactory(path);
+}
+
+/// The stream of events from the watcher started with [startWatcher].
+late StreamQueue<WatchEvent> _watcherEvents;
+
+/// Whether the event stream has been closed.
+///
+/// If this is not done by a test (by calling [startClosingEventStream]) it will
+/// be done automatically via [addTearDown] in [startWatcher].
+var _hasClosedStream = true;
+
+/// Creates a new [Watcher] that watches a temporary file or directory and
+/// starts monitoring it for events.
+///
+/// If [path] is provided, watches a path in the sandbox with that name.
+Future<void> startWatcher({String? path}) async {
+ mockGetModificationTime((path) {
+ final normalized = p.normalize(p.relative(path, from: d.sandbox));
+
+ // Make sure we got a path in the sandbox.
+ assert(p.isRelative(normalized) && !normalized.startsWith('..'),
+ 'Path is not in the sandbox: $path not in ${d.sandbox}');
+
+ var mtime = _mockFileModificationTimes[normalized];
+ return mtime != null ? DateTime.fromMillisecondsSinceEpoch(mtime) : null;
+ });
+
+ // We want to wait until we're ready *after* we subscribe to the watcher's
+ // events.
+ var watcher = createWatcher(path: path);
+ _watcherEvents = StreamQueue(watcher.events);
+ // Forces a subscription to the underlying stream.
+ unawaited(_watcherEvents.hasNext);
+
+ _hasClosedStream = false;
+ addTearDown(startClosingEventStream);
+
+ await watcher.ready;
+}
+
+/// Schedule closing the watcher stream after the event queue has been pumped.
+///
+/// This is necessary when events are allowed to occur, but don't have to occur,
+/// at the end of a test. Otherwise, if they don't occur, the test will wait
+/// indefinitely because they might in the future and because the watcher is
+/// normally only closed after the test completes.
+void startClosingEventStream() async {
+ if (_hasClosedStream) return;
+ _hasClosedStream = true;
+ await pumpEventQueue();
+ await _watcherEvents.cancel(immediate: true);
+}
+
+/// A list of [StreamMatcher]s that have been collected using
+/// [_collectStreamMatcher].
+List<StreamMatcher>? _collectedStreamMatchers;
+
+/// Collects all stream matchers that are registered within [block] into a
+/// single stream matcher.
+///
+/// The returned matcher will match each of the collected matchers in order.
+StreamMatcher _collectStreamMatcher(void Function() block) {
+ var oldStreamMatchers = _collectedStreamMatchers;
+ var collectedStreamMatchers = _collectedStreamMatchers = <StreamMatcher>[];
+ try {
+ block();
+ return emitsInOrder(collectedStreamMatchers);
+ } finally {
+ _collectedStreamMatchers = oldStreamMatchers;
+ }
+}
+
+/// Either add [streamMatcher] as an expectation to [_watcherEvents], or collect
+/// it with [_collectStreamMatcher].
+///
+/// [streamMatcher] can be a [StreamMatcher], a [Matcher], or a value.
+Future _expectOrCollect(Matcher streamMatcher) {
+ var collectedStreamMatchers = _collectedStreamMatchers;
+ if (collectedStreamMatchers != null) {
+ collectedStreamMatchers.add(emits(streamMatcher));
+ return Future.sync(() {});
+ } else {
+ return expectLater(_watcherEvents, emits(streamMatcher));
+ }
+}
+
+/// Expects that [matchers] will match emitted events in any order.
+///
+/// [matchers] may be [Matcher]s or values, but not [StreamMatcher]s.
+Future inAnyOrder(Iterable matchers) {
+ matchers = matchers.toSet();
+ return _expectOrCollect(emitsInAnyOrder(matchers));
+}
+
+/// Expects that the expectations established in either [block1] or [block2]
+/// will match the emitted events.
+///
+/// If both blocks match, the one that consumed more events will be used.
+Future allowEither(void Function() block1, void Function() block2) =>
+ _expectOrCollect(emitsAnyOf(
+ [_collectStreamMatcher(block1), _collectStreamMatcher(block2)]));
+
+/// Allows the expectations established in [block] to match the emitted events.
+///
+/// If the expectations in [block] don't match, no error will be raised and no
+/// events will be consumed. If this is used at the end of a test,
+/// [startClosingEventStream] should be called before it.
+Future allowEvents(void Function() block) =>
+ _expectOrCollect(mayEmit(_collectStreamMatcher(block)));
+
+/// Returns a [Matcher] that matches a [WatchEvent] with the given [type]
+/// and [path].
+Matcher isWatchEvent(ChangeType type, String path) {
+ return predicate((e) {
+ return e is WatchEvent &&
+ e.type == type &&
+ e.path == p.join(d.sandbox, p.normalize(path));
+ }, 'is $type $path');
+}
+
+/// Returns a [Matcher] that matches a [WatchEvent] for an add event for [path].
+Matcher isAddEvent(String path) => isWatchEvent(ChangeType.ADD, path);
+
+/// Returns a [Matcher] that matches a [WatchEvent] for a modification event for
+/// [path].
+Matcher isModifyEvent(String path) => isWatchEvent(ChangeType.MODIFY, path);
+
+/// Returns a [Matcher] that matches a [WatchEvent] for a removal event for
+/// [path].
+Matcher isRemoveEvent(String path) => isWatchEvent(ChangeType.REMOVE, path);
+
+/// Expects that the next event emitted will be for an add event for [path].
+Future expectAddEvent(String path) =>
+ _expectOrCollect(isWatchEvent(ChangeType.ADD, path));
+
+/// Expects that the next event emitted will be for a modification event for
+/// [path].
+Future expectModifyEvent(String path) =>
+ _expectOrCollect(isWatchEvent(ChangeType.MODIFY, path));
+
+/// Expects that the next event emitted will be for a removal event for [path].
+Future expectRemoveEvent(String path) =>
+ _expectOrCollect(isWatchEvent(ChangeType.REMOVE, path));
+
+/// Consumes a modification event for [path] if one is emitted at this point in
+/// the schedule, but doesn't throw an error if it isn't.
+///
+/// If this is used at the end of a test, [startClosingEventStream] should be
+/// called before it.
+Future allowModifyEvent(String path) =>
+ _expectOrCollect(mayEmit(isWatchEvent(ChangeType.MODIFY, path)));
+
+/// Track a fake timestamp to be used when writing files. This always increases
+/// so that files that are deleted and re-created do not have their timestamp
+/// set back to a previously used value.
+int _nextTimestamp = 1;
+
+/// Schedules writing a file in the sandbox at [path] with [contents].
+///
+/// If [contents] is omitted, creates an empty file. If [updateModified] is
+/// `false`, the mock file modification time is not changed.
+void writeFile(String path, {String? contents, bool? updateModified}) {
+ contents ??= '';
+ updateModified ??= true;
+
+ var fullPath = p.join(d.sandbox, path);
+
+ // Create any needed subdirectories.
+ var dir = Directory(p.dirname(fullPath));
+ if (!dir.existsSync()) {
+ dir.createSync(recursive: true);
+ }
+
+ File(fullPath).writeAsStringSync(contents);
+
+ if (updateModified) {
+ path = p.normalize(path);
+
+ _mockFileModificationTimes[path] = _nextTimestamp++;
+ }
+}
+
+/// Schedules deleting a file in the sandbox at [path].
+void deleteFile(String path) {
+ File(p.join(d.sandbox, path)).deleteSync();
+
+ _mockFileModificationTimes.remove(path);
+}
+
+/// Schedules renaming a file in the sandbox from [from] to [to].
+void renameFile(String from, String to) {
+ File(p.join(d.sandbox, from)).renameSync(p.join(d.sandbox, to));
+
+ // Make sure we always use the same separator on Windows.
+ to = p.normalize(to);
+
+ _mockFileModificationTimes.update(to, (value) => value + 1,
+ ifAbsent: () => 1);
+}
+
+/// Schedules creating a directory in the sandbox at [path].
+void createDir(String path) {
+ Directory(p.join(d.sandbox, path)).createSync();
+}
+
+/// Schedules renaming a directory in the sandbox from [from] to [to].
+void renameDir(String from, String to) {
+ Directory(p.join(d.sandbox, from)).renameSync(p.join(d.sandbox, to));
+
+ // Migrate timestamps for any files in this folder.
+ final knownFilePaths = _mockFileModificationTimes.keys.toList();
+ for (final filePath in knownFilePaths) {
+ if (p.isWithin(from, filePath)) {
+ _mockFileModificationTimes[filePath.replaceAll(from, to)] =
+ _mockFileModificationTimes[filePath]!;
+ _mockFileModificationTimes.remove(filePath);
+ }
+ }
+}
+
+/// Schedules deleting a directory in the sandbox at [path].
+void deleteDir(String path) {
+ Directory(p.join(d.sandbox, path)).deleteSync(recursive: true);
+}
+
+/// Runs [callback] with every permutation of non-negative numbers for each
+/// argument less than [limit].
+///
+/// Returns a set of all values returns by [callback].
+///
+/// [limit] defaults to 3.
+Set<S> withPermutations<S>(S Function(int, int, int) callback, {int? limit}) {
+ limit ??= 3;
+ var results = <S>{};
+ for (var i = 0; i < limit; i++) {
+ for (var j = 0; j < limit; j++) {
+ for (var k = 0; k < limit; k++) {
+ results.add(callback(i, j, k));
+ }
+ }
+ }
+ return results;
+}
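Note: the matcher-collection helpers above ultimately defer to `package:test` stream matchers. As a rough standalone sketch (not part of this diff), `allowEither` boils down to `emitsAnyOf` over two `emitsInOrder` matchers; plain strings stand in for the `WatchEvent` matchers used by the real tests:

```dart
import 'package:async/async.dart';
import 'package:test/test.dart';

void main() {
  test('allowEither composes to emitsAnyOf over emitsInOrder', () async {
    // Stand-in for a watcher that reported a remove followed by an add.
    var events = StreamQueue(Stream.fromIterable(['remove a', 'add a']));

    // Roughly what
    //   allowEither(() { expectModifyEvent('a'); },
    //               () { expectRemoveEvent('a'); expectAddEvent('a'); });
    // expands to; when both branches match, the one that consumes more
    // events wins.
    await expectLater(
        events,
        emitsAnyOf([
          emitsInOrder(['modify a']),
          emitsInOrder(['remove a', 'add a']),
        ]));
  });
}
```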
diff --git a/pkgs/yaml/.gitignore b/pkgs/yaml/.gitignore
new file mode 100644
index 0000000..ab3cb76
--- /dev/null
+++ b/pkgs/yaml/.gitignore
@@ -0,0 +1,16 @@
+# Don’t commit the following directories created by pub.
+.buildlog
+.dart_tool/
+.pub/
+build/
+packages
+.packages
+
+# Or the files created by dart2js.
+*.dart.js
+*.js_
+*.js.deps
+*.js.map
+
+# Include when developing application packages.
+pubspec.lock
diff --git a/pkgs/yaml/CHANGELOG.md b/pkgs/yaml/CHANGELOG.md
new file mode 100644
index 0000000..3f9d3fd
--- /dev/null
+++ b/pkgs/yaml/CHANGELOG.md
@@ -0,0 +1,199 @@
+## 3.1.3
+
+* Require Dart 3.4
+* Fix UTF-16 surrogate pair handling in plain scalar.
+* Move to `dart-lang/tools` monorepo.
+
+## 3.1.2
+
+* Require Dart 2.19
+* Added `topics` in `pubspec.yaml`.
+
+## 3.1.1
+
+* Switch to using package:lints.
+* Populate the pubspec `repository` field.
+
+## 3.1.0
+
+* `loadYaml` and related functions now accept a `recover` flag instructing the parser
+ to attempt to recover from parse errors and may return invalid or synthetic nodes.
+ When recovering, an `ErrorListener` can also be supplied to listen for errors that
+ are recovered from.
+* Drop dependency on `package:charcode`.
+
+## 3.0.0
+
+* Stable null safety release.
+
+## 3.0.0-nullsafety.0
+
+* Updated to support 2.12.0 and null safety.
+* Allow `YamlNode`s to be wrapped with an optional `style` parameter.
+* **BREAKING** The `sourceUrl` named argument is statically typed as `Uri`
+ instead of allowing `String` or `Uri`.
+
+## 2.2.1
+
+* Update min Dart SDK to `2.4.0`.
+* Fixed span for null nodes in block lists.
+
+## 2.2.0
+
+* POSSIBLY BREAKING CHANGE: Make `YamlMap` preserve parsed key order.
+ This is breaking because some programs may rely on the
+ `HashMap` sort order.
+
+## 2.1.16
+
+* Fixed deprecated API usage in README.
+* Fixed lints that affect package score.
+
+## 2.1.15
+
+* Set max SDK version to `<3.0.0`, and adjust other dependencies.
+
+## 2.1.14
+
+* Remove use of deprecated features.
+* Updated SDK version to 2.0.0-dev.17.0
+
+## 2.1.13
+
+* Stop using comment-based generic syntax.
+
+## 2.1.12
+
+* Properly refuse mappings with duplicate keys.
+
+## 2.1.11
+
+* Fix an infinite loop when parsing some invalid documents.
+
+## 2.1.10
+
+* Support `string_scanner` 1.0.0.
+
+## 2.1.9
+
+* Fix all strong-mode warnings.
+
+## 2.1.8
+
+* Remove the dependency on `path`, since we don't actually import it.
+
+## 2.1.7
+
+* Fix more strong mode warnings.
+
+## 2.1.6
+
+* Fix two analysis issues with DDC's strong mode.
+
+## 2.1.5
+
+* Fix a bug with 2.1.4 where source span information was being discarded for
+ scalar values.
+
+## 2.1.4
+
+* Substantially improve performance.
+
+## 2.1.3
+
+* Add a hint that a colon might be missing when a mapping value is found in the
+ wrong context.
+
+## 2.1.2
+
+* Fix a crashing bug when parsing block scalars.
+
+## 2.1.1
+
+* Properly scope `SourceSpan`s for scalar values surrounded by whitespace.
+
+## 2.1.0
+
+* Rewrite the parser for a 10x speed improvement.
+
+* Support anchors and aliases (`&foo` and `*foo`).
+
+* Support explicit tags (e.g. `!!str`). Note that user-defined tags are still
+ not fully supported.
+
+* `%YAML` and `%TAG` directives are now parsed, although again user-defined tags
+ are not fully supported.
+
+* `YamlScalar`, `YamlList`, and `YamlMap` now expose the styles in which they
+ were written (for example plain vs folded, block vs flow).
+
+* A `yamlWarningCallback` field is exposed. This field can be used to customize
+ how YAML warnings are displayed.
+
+## 2.0.1+1
+
+* Fix an import in a test.
+
+* Widen the version constraint on the `collection` package.
+
+## 2.0.1
+
+* Fix a few lingering references to the old `Span` class in documentation and
+ tests.
+
+## 2.0.0
+
+* Switch from `source_maps`' `Span` class to `source_span`'s `SourceSpan` class.
+
+* For consistency with `source_span` and `string_scanner`, all `sourceName`
+ parameters have been renamed to `sourceUrl`. They now accept Urls as well as
+ Strings.
+
+## 1.1.1
+
+* Fix broken type arguments that caused breakage on dart2js.
+
+* Fix an analyzer warning in `yaml_node_wrapper.dart`.
+
+## 1.1.0
+
+* Add new publicly-accessible constructors for `YamlNode` subclasses. These
+ constructors make it possible to use the same API to access non-YAML data as
+ YAML data.
+
+* Make `YamlException` inherit from source_map's `SpanFormatException`. This
+ improves the error formatting and allows callers access to source range
+ information.
+
+## 1.0.0+1
+
+* Fix a variable name typo.
+
+## 1.0.0
+
+* **Backwards incompatibility**: The data structures returned by `loadYaml` and
+ `loadYamlStream` are now immutable.
+
+* **Backwards incompatibility**: The interface of the `YamlMap` class has
+ changed substantially in numerous ways. External users may no longer construct
+ their own instances.
+
+* Maps and lists returned by `loadYaml` and `loadYamlStream` now contain
+ information about their source locations.
+
+* A new `loadYamlNode` function returns the source location of top-level scalars
+ as well.
+
+## 0.10.0
+
+* Improve error messages when a file fails to parse.
+
+## 0.9.0+2
+
+* Ensure that maps are order-independent when used as map keys.
+
+## 0.9.0+1
+
+* The `YamlMap` class is deprecated. In a future version, maps returned by
+ `loadYaml` and `loadYamlStream` will be Dart `HashMap`s with a custom equality
+ operation.
diff --git a/pkgs/yaml/LICENSE b/pkgs/yaml/LICENSE
new file mode 100644
index 0000000..e7589cb
--- /dev/null
+++ b/pkgs/yaml/LICENSE
@@ -0,0 +1,20 @@
+Copyright (c) 2014, the Dart project authors.
+Copyright (c) 2006, Kirill Simonov.
+
+Permission is hereby granted, free of charge, to any person obtaining a copy of
+this software and associated documentation files (the "Software"), to deal in
+the Software without restriction, including without limitation the rights to
+use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies
+of the Software, and to permit persons to whom the Software is furnished to do
+so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all
+copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+SOFTWARE.
diff --git a/pkgs/yaml/README.md b/pkgs/yaml/README.md
new file mode 100644
index 0000000..ba56893
--- /dev/null
+++ b/pkgs/yaml/README.md
@@ -0,0 +1,33 @@
+[](https://github.com/dart-lang/tools/actions/workflows/yaml.yaml)
+[](https://pub.dev/packages/yaml)
+[](https://pub.dev/packages/yaml/publisher)
+
+
+A parser for [YAML](https://yaml.org/).
+
+## Usage
+
+Use `loadYaml` to load a single document, or `loadYamlStream` to load a
+stream of documents. For example:
+
+```dart
+import 'package:yaml/yaml.dart';
+
+void main() {
+ var doc = loadYaml("YAML: YAML Ain't Markup Language");
+ print(doc['YAML']);
+}
+```
+
+This library currently doesn't support dumping to YAML. You should use
+`json.encode` from `dart:convert` instead:
+
+```dart
+import 'dart:convert';
+import 'package:yaml/yaml.dart';
+
+void main() {
+ var doc = loadYaml("YAML: YAML Ain't Markup Language");
+ print(json.encode(doc));
+}
+```
diff --git a/pkgs/yaml/analysis_options.yaml b/pkgs/yaml/analysis_options.yaml
new file mode 100644
index 0000000..46e45f0
--- /dev/null
+++ b/pkgs/yaml/analysis_options.yaml
@@ -0,0 +1,18 @@
+include: package:dart_flutter_team_lints/analysis_options.yaml
+
+analyzer:
+ language:
+ strict-casts: true
+
+linter:
+ rules:
+ - avoid_private_typedef_functions
+ - avoid_redundant_argument_values
+ - avoid_unused_constructor_parameters
+ - cancel_subscriptions
+ - join_return_with_assignment
+ - missing_whitespace_between_adjacent_strings
+ - no_runtimeType_toString
+ - prefer_const_declarations
+ - prefer_expression_function_bodies
+ - use_string_buffers
diff --git a/pkgs/yaml/benchmark/benchmark.dart b/pkgs/yaml/benchmark/benchmark.dart
new file mode 100644
index 0000000..afc3c97
--- /dev/null
+++ b/pkgs/yaml/benchmark/benchmark.dart
@@ -0,0 +1,65 @@
+// Copyright (c) 2015, the Dart project authors.
+// Copyright (c) 2006, Kirill Simonov.
+//
+// Use of this source code is governed by an MIT-style
+// license that can be found in the LICENSE file or at
+// https://opensource.org/licenses/MIT.
+
+import 'dart:convert';
+import 'dart:io';
+
+import 'package:path/path.dart' as p;
+import 'package:yaml/yaml.dart';
+
+const numTrials = 100;
+const runsPerTrial = 1000;
+
+final source = _loadFile('input.yaml');
+final expected = _loadFile('output.json');
+
+void main(List<String> args) {
+ var best = double.infinity;
+
+ // Run the benchmark several times. This ensures the VM is warmed up and lets
+ // us see how much variance there is.
+ for (var i = 0; i <= numTrials; i++) {
+ var start = DateTime.now();
+
+ // For a single benchmark, convert the source multiple times.
+ Object? result;
+ for (var j = 0; j < runsPerTrial; j++) {
+ result = loadYaml(source);
+ }
+
+ var elapsed =
+ DateTime.now().difference(start).inMilliseconds / runsPerTrial;
+
+ // Keep track of the best run so far.
+ if (elapsed >= best) continue;
+ best = elapsed;
+
+ // Sanity check to make sure the output is what we expect and to make sure
+ // the VM doesn't optimize "dead" code away.
+ if (jsonEncode(result) != expected) {
+ print('Incorrect output:\n${jsonEncode(result)}');
+ exit(1);
+ }
+
+ // Don't print the first run. It's always terrible since the VM hasn't
+ // warmed up yet.
+ if (i == 0) continue;
+ _printResult("Run ${'#$i'.padLeft(3, '')}", elapsed);
+ }
+
+ _printResult('Best ', best);
+}
+
+String _loadFile(String name) {
+ var path = p.join(p.dirname(p.fromUri(Platform.script)), name);
+ return File(path).readAsStringSync();
+}
+
+void _printResult(String label, double time) {
+ print('$label: ${time.toStringAsFixed(3).padLeft(4, '0')}ms '
+ "${'=' * ((time * 100).toInt())}");
+}
diff --git a/pkgs/yaml/benchmark/input.yaml b/pkgs/yaml/benchmark/input.yaml
new file mode 100644
index 0000000..89bf9dc
--- /dev/null
+++ b/pkgs/yaml/benchmark/input.yaml
@@ -0,0 +1,48 @@
+verb: RecommendCafes
+recipe:
+ - verb: List
+ outputs: ["Cafe[]"]
+ - verb: Fetch
+ inputs: ["Cafe[]"]
+ outputs: ["CafeWithMenu[]"]
+ - verb: Flatten
+ inputs: ["CafeWithMenu[]"]
+ outputs: ["DishOffering[]"]
+ - verb: Score
+ inputs: ["DishOffering[]"]
+ outputs: ["DishOffering[]/Scored"]
+ - verb: Display
+ inputs: ["DishOffering[]/Scored"]
+tags:
+ booleans: [ true, false ]
+ dates:
+ - canonical: 2001-12-15T02:59:43.1Z
+ - iso8601: 2001-12-14t21:59:43.10-05:00
+ - spaced: 2001-12-14 21:59:43.10 -5
+ - date: 2002-12-14
+ numbers:
+ - int: 12345
+ - negative: -345
+ - floating-point: 345.678
+ - hexidecimal: 0x123abc
+ - exponential: 12.3015e+02
+ - octal: 0o14
+ strings:
+ - unicode: "Sosa did fine.\u263A"
+ - control: "\b1998\t1999\t2000\n"
+ - hex esc: "\x0d\x0a is \r\n"
+ - single: '"Howdy!" he cried.'
+ - quoted: ' # Not a ''comment''.'
+ - tie-fighter: '|\-*-/|'
+ - plain:
+ This unquoted scalar
+ spans many lines.
+
+ - quoted: "So does this
+ quoted scalar.\n"
+ - accomplishment: >
+ Mark set a major league
+ home run record in 1998.
+ - stats: |
+ 65 Home Runs
+ 0.278 Batting Average
diff --git a/pkgs/yaml/benchmark/output.json b/pkgs/yaml/benchmark/output.json
new file mode 100644
index 0000000..9e6cb84
--- /dev/null
+++ b/pkgs/yaml/benchmark/output.json
@@ -0,0 +1 @@
+{"verb":"RecommendCafes","recipe":[{"verb":"List","outputs":["Cafe[]"]},{"verb":"Fetch","inputs":["Cafe[]"],"outputs":["CafeWithMenu[]"]},{"verb":"Flatten","inputs":["CafeWithMenu[]"],"outputs":["DishOffering[]"]},{"verb":"Score","inputs":["DishOffering[]"],"outputs":["DishOffering[]/Scored"]},{"verb":"Display","inputs":["DishOffering[]/Scored"]}],"tags":{"booleans":[true,false],"dates":[{"canonical":"2001-12-15T02:59:43.1Z"},{"iso8601":"2001-12-14t21:59:43.10-05:00"},{"spaced":"2001-12-14 21:59:43.10 -5"},{"date":"2002-12-14"}],"numbers":[{"int":12345},{"negative":-345},{"floating-point":345.678},{"hexidecimal":1194684},{"exponential":1230.15},{"octal":12}],"strings":[{"unicode":"Sosa did fine.☺"},{"control":"\b1998\t1999\t2000\n"},{"hex esc":"\r\n is \r\n"},{"single":"\"Howdy!\" he cried."},{"quoted":" # Not a 'comment'."},{"tie-fighter":"|\\-*-/|"},{"plain":"This unquoted scalar spans many lines."},{"quoted":"So does this quoted scalar.\n"},{"accomplishment":"Mark set a major league home run record in 1998.\n"},{"stats":"65 Home Runs\n0.278 Batting Average\n"}]}}
\ No newline at end of file
diff --git a/pkgs/yaml/example/example.dart b/pkgs/yaml/example/example.dart
new file mode 100644
index 0000000..bb283a3
--- /dev/null
+++ b/pkgs/yaml/example/example.dart
@@ -0,0 +1,13 @@
+// Copyright (c) 2020, the Dart project authors.
+// Copyright (c) 2006, Kirill Simonov.
+//
+// Use of this source code is governed by an MIT-style
+// license that can be found in the LICENSE file or at
+// https://opensource.org/licenses/MIT.
+
+import 'package:yaml/yaml.dart';
+
+void main() {
+ var doc = loadYaml("YAML: YAML Ain't Markup Language") as Map;
+ print(doc['YAML']);
+}
diff --git a/pkgs/yaml/lib/src/charcodes.dart b/pkgs/yaml/lib/src/charcodes.dart
new file mode 100644
index 0000000..602d597
--- /dev/null
+++ b/pkgs/yaml/lib/src/charcodes.dart
@@ -0,0 +1,48 @@
+// Copyright (c) 2021, the Dart project authors.
+// Copyright (c) 2006, Kirill Simonov.
+//
+// Use of this source code is governed by an MIT-style
+// license that can be found in the LICENSE file or at
+// https://opensource.org/licenses/MIT.
+
+/// Character `+`.
+const int $plus = 0x2b;
+
+/// Character `-`.
+const int $minus = 0x2d;
+
+/// Character `.`.
+const int $dot = 0x2e;
+
+/// Character `0`.
+const int $0 = 0x30;
+
+/// Character `9`.
+const int $9 = 0x39;
+
+/// Character `F`.
+const int $F = 0x46;
+
+/// Character `N`.
+const int $N = 0x4e;
+
+/// Character `T`.
+const int $T = 0x54;
+
+/// Character `f`.
+const int $f = 0x66;
+
+/// Character `n`.
+const int $n = 0x6e;
+
+/// Character `o`.
+const int $o = 0x6f;
+
+/// Character `t`.
+const int $t = 0x74;
+
+/// Character `x`.
+const int $x = 0x78;
+
+/// Character `~`.
+const int $tilde = 0x7e;
diff --git a/pkgs/yaml/lib/src/equality.dart b/pkgs/yaml/lib/src/equality.dart
new file mode 100644
index 0000000..c833dc6
--- /dev/null
+++ b/pkgs/yaml/lib/src/equality.dart
@@ -0,0 +1,128 @@
+// Copyright (c) 2014, the Dart project authors.
+// Copyright (c) 2006, Kirill Simonov.
+//
+// Use of this source code is governed by an MIT-style
+// license that can be found in the LICENSE file or at
+// https://opensource.org/licenses/MIT.
+
+import 'dart:collection';
+
+import 'package:collection/collection.dart';
+
+import 'yaml_node.dart';
+
+/// Returns a [Map] that compares its keys based on [deepEquals].
+Map<K, V> deepEqualsMap<K, V>() =>
+ LinkedHashMap(equals: deepEquals, hashCode: deepHashCode);
+
+/// Returns whether two objects are structurally equivalent.
+///
+/// This considers `NaN` values to be equivalent, handles self-referential
+/// structures, and considers [YamlScalar]s to be equal to their values.
+bool deepEquals(Object? obj1, Object? obj2) => _DeepEquals().equals(obj1, obj2);
+
+/// A class that provides access to the list of parent objects used for loop
+/// detection.
+class _DeepEquals {
+ final _parents1 = <Object?>[];
+ final _parents2 = <Object?>[];
+
+ /// Returns whether [obj1] and [obj2] are structurally equivalent.
+ bool equals(Object? obj1, Object? obj2) {
+ if (obj1 is YamlScalar) obj1 = obj1.value;
+ if (obj2 is YamlScalar) obj2 = obj2.value;
+
+ // _parents1 and _parents2 are guaranteed to be the same size.
+ for (var i = 0; i < _parents1.length; i++) {
+ var loop1 = identical(obj1, _parents1[i]);
+ var loop2 = identical(obj2, _parents2[i]);
+ // If both structures loop in the same place, they're equal at that point
+ // in the structure. If one loops and the other doesn't, they're not
+ // equal.
+ if (loop1 && loop2) return true;
+ if (loop1 || loop2) return false;
+ }
+
+ _parents1.add(obj1);
+ _parents2.add(obj2);
+ try {
+ if (obj1 is List && obj2 is List) {
+ return _listEquals(obj1, obj2);
+ } else if (obj1 is Map && obj2 is Map) {
+ return _mapEquals(obj1, obj2);
+ } else if (obj1 is num && obj2 is num) {
+ return _numEquals(obj1, obj2);
+ } else {
+ return obj1 == obj2;
+ }
+ } finally {
+ _parents1.removeLast();
+ _parents2.removeLast();
+ }
+ }
+
+ /// Returns whether [list1] and [list2] are structurally equal.
+ bool _listEquals(List list1, List list2) {
+ if (list1.length != list2.length) return false;
+
+ for (var i = 0; i < list1.length; i++) {
+ if (!equals(list1[i], list2[i])) return false;
+ }
+
+ return true;
+ }
+
+ /// Returns whether [map1] and [map2] are structurally equal.
+ bool _mapEquals(Map map1, Map map2) {
+ if (map1.length != map2.length) return false;
+
+ for (var key in map1.keys) {
+ if (!map2.containsKey(key)) return false;
+ if (!equals(map1[key], map2[key])) return false;
+ }
+
+ return true;
+ }
+
+ /// Returns whether two numbers are equivalent.
+ ///
+ /// This differs from `n1 == n2` in that it considers `NaN` to be equal to
+ /// itself.
+ bool _numEquals(num n1, num n2) {
+ if (n1.isNaN && n2.isNaN) return true;
+ return n1 == n2;
+ }
+}
+
+/// Returns a hash code for [obj] such that structurally equivalent objects
+/// will have the same hash code.
+///
+/// This supports deep equality for maps and lists, including those with
+/// self-referential structures, and returns the same hash code for
+/// [YamlScalar]s and their values.
+int deepHashCode(Object? obj) {
+ var parents = <Object?>[];
+
+ int deepHashCodeInner(Object? value) {
+ if (parents.any((parent) => identical(parent, value))) return -1;
+
+ parents.add(value);
+ try {
+ if (value is Map) {
+ var equality = const UnorderedIterableEquality<Object?>();
+ return equality.hash(value.keys.map(deepHashCodeInner)) ^
+ equality.hash(value.values.map(deepHashCodeInner));
+ } else if (value is Iterable) {
+        return const IterableEquality<Object?>()
+            .hash(value.map(deepHashCodeInner));
+ } else if (value is YamlScalar) {
+ return (value.value as Object?).hashCode;
+ } else {
+ return value.hashCode;
+ }
+ } finally {
+ parents.removeLast();
+ }
+ }
+
+ return deepHashCodeInner(obj);
+}
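Reviewer note: a minimal sketch of how the deep-equality helpers above behave. The direct `src/` import is an assumption for illustration only; these helpers are internal to `package:yaml`.

```dart
// Sketch only: exercises the internal equality helpers added above.
import 'package:yaml/src/equality.dart';

void main() {
  // Unlike `==`, deepEquals treats NaN as equal to itself.
  print(deepEquals(double.nan, double.nan)); // true

  // Structurally identical collections compare equal and hash alike.
  print(deepEquals({'a': [1, 2]}, {'a': [1, 2]})); // true
  print(deepHashCode([1, 2]) == deepHashCode([1, 2])); // true

  // Self-referential structures are detected rather than recursing forever.
  var a = <Object?>[];
  a.add(a);
  var b = <Object?>[];
  b.add(b);
  print(deepEquals(a, b)); // true

  // deepEqualsMap lets collection-valued keys be looked up by structure.
  var map = deepEqualsMap<Object?, String>();
  map[[1, 2]] = 'pair';
  print(map[[1, 2]]); // pair
}
```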
diff --git a/pkgs/yaml/lib/src/error_listener.dart b/pkgs/yaml/lib/src/error_listener.dart
new file mode 100644
index 0000000..0498d68
--- /dev/null
+++ b/pkgs/yaml/lib/src/error_listener.dart
@@ -0,0 +1,22 @@
+// Copyright (c) 2021, the Dart project authors.
+// Copyright (c) 2006, Kirill Simonov.
+//
+// Use of this source code is governed by an MIT-style
+// license that can be found in the LICENSE file or at
+// https://opensource.org/licenses/MIT.
+
+import 'yaml_exception.dart';
+
+/// A listener that is notified of [YamlException]s during scanning/parsing.
+abstract class ErrorListener {
+ /// This method is invoked when an [error] has been found in the YAML.
+ void onError(YamlException error);
+}
+
+/// An [ErrorListener] that collects all errors into [errors].
+class ErrorCollector extends ErrorListener {
+ final List<YamlException> errors = [];
+
+ @override
+ void onError(YamlException error) => errors.add(error);
+}
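A quick sketch of the listener contract above; the recovering `Parser` added later in this PR takes an `ErrorListener` and reports each recovered error to it. Direct `src/` imports are assumed here for illustration.

```dart
// Sketch only: collects reported errors without throwing.
import 'package:yaml/src/error_listener.dart';
import 'package:yaml/src/yaml_exception.dart';

void main() {
  var collector = ErrorCollector();

  // Code that recovers from bad input reports each problem to the listener.
  collector.onError(YamlException('demo error', null));

  print(collector.errors.length); // 1
  print(collector.errors.first.message); // demo error
}
```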
diff --git a/pkgs/yaml/lib/src/event.dart b/pkgs/yaml/lib/src/event.dart
new file mode 100644
index 0000000..1476311
--- /dev/null
+++ b/pkgs/yaml/lib/src/event.dart
@@ -0,0 +1,171 @@
+// Copyright (c) 2014, the Dart project authors.
+// Copyright (c) 2006, Kirill Simonov.
+//
+// Use of this source code is governed by an MIT-style
+// license that can be found in the LICENSE file or at
+// https://opensource.org/licenses/MIT.
+
+import 'package:source_span/source_span.dart';
+
+import 'parser.dart';
+import 'style.dart';
+import 'yaml_document.dart';
+
+/// An event emitted by a [Parser].
+class Event {
+ final EventType type;
+ final FileSpan span;
+
+ Event(this.type, this.span);
+
+ @override
+ String toString() => type.toString();
+}
+
+/// An event indicating the beginning of a YAML document.
+class DocumentStartEvent implements Event {
+ @override
+ EventType get type => EventType.documentStart;
+ @override
+ final FileSpan span;
+
+ /// The document's `%YAML` directive, or `null` if there was none.
+ final VersionDirective? versionDirective;
+
+ /// The document's `%TAG` directives, if any.
+ final List<TagDirective> tagDirectives;
+
+ /// Whether the document started implicitly (that is, without an explicit
+  /// `---` sequence).
+ final bool isImplicit;
+
+ DocumentStartEvent(this.span,
+ {this.versionDirective,
+ List<TagDirective>? tagDirectives,
+ this.isImplicit = true})
+ : tagDirectives = tagDirectives ?? [];
+
+ @override
+ String toString() => 'DOCUMENT_START';
+}
+
+/// An event indicating the end of a YAML document.
+class DocumentEndEvent implements Event {
+ @override
+ EventType get type => EventType.documentEnd;
+ @override
+ final FileSpan span;
+
+ /// Whether the document ended implicitly (that is, without an explicit
+ /// `...` sequence).
+ final bool isImplicit;
+
+ DocumentEndEvent(this.span, {this.isImplicit = true});
+
+ @override
+ String toString() => 'DOCUMENT_END';
+}
+
+/// An event indicating that an alias was referenced.
+class AliasEvent implements Event {
+ @override
+ EventType get type => EventType.alias;
+ @override
+ final FileSpan span;
+
+ /// The alias name.
+ final String name;
+
+ AliasEvent(this.span, this.name);
+
+ @override
+ String toString() => 'ALIAS $name';
+}
+
+/// An event that can have associated anchor and tag properties.
+abstract class _ValueEvent implements Event {
+ /// The name of the value's anchor, or `null` if it wasn't anchored.
+ String? get anchor;
+
+ /// The text of the value's tag, or `null` if it wasn't tagged.
+ String? get tag;
+
+ @override
+ String toString() {
+ var buffer = StringBuffer('$type');
+ if (anchor != null) buffer.write(' &$anchor');
+ if (tag != null) buffer.write(' $tag');
+ return buffer.toString();
+ }
+}
+
+/// An event indicating a single scalar value.
+class ScalarEvent extends _ValueEvent {
+ @override
+ EventType get type => EventType.scalar;
+ @override
+ final FileSpan span;
+ @override
+ final String? anchor;
+ @override
+ final String? tag;
+
+ /// The contents of the scalar.
+ final String value;
+
+ /// The style of the scalar in the original source.
+ final ScalarStyle style;
+
+ ScalarEvent(this.span, this.value, this.style, {this.anchor, this.tag});
+
+ @override
+ String toString() => '${super.toString()} "$value"';
+}
+
+/// An event indicating the beginning of a sequence.
+class SequenceStartEvent extends _ValueEvent {
+ @override
+ EventType get type => EventType.sequenceStart;
+ @override
+ final FileSpan span;
+ @override
+ final String? anchor;
+ @override
+ final String? tag;
+
+ /// The style of the collection in the original source.
+ final CollectionStyle style;
+
+ SequenceStartEvent(this.span, this.style, {this.anchor, this.tag});
+}
+
+/// An event indicating the beginning of a mapping.
+class MappingStartEvent extends _ValueEvent {
+ @override
+ EventType get type => EventType.mappingStart;
+ @override
+ final FileSpan span;
+ @override
+ final String? anchor;
+ @override
+ final String? tag;
+
+ /// The style of the collection in the original source.
+ final CollectionStyle style;
+
+ MappingStartEvent(this.span, this.style, {this.anchor, this.tag});
+}
+
+/// The types of [Event] objects.
+enum EventType {
+ streamStart,
+ streamEnd,
+ documentStart,
+ documentEnd,
+ alias,
+ scalar,
+ sequenceStart,
+ sequenceEnd,
+ mappingStart,
+ mappingEnd
+}
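To make these event shapes concrete, here is a small sketch that extracts the scalar events produced by the `Parser` added later in this PR (direct `src/` imports are an assumption for illustration):

```dart
// Sketch only: prints each scalar's value and its original style.
import 'package:yaml/src/event.dart';
import 'package:yaml/src/parser.dart';

void main() {
  var parser = Parser('plain: value\nquoted: "value"\n');
  while (!parser.isDone) {
    var event = parser.parse();
    if (event is ScalarEvent) {
      print('${event.value} (${event.style})');
    }
  }
}
```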
diff --git a/pkgs/yaml/lib/src/loader.dart b/pkgs/yaml/lib/src/loader.dart
new file mode 100644
index 0000000..7cdf45a
--- /dev/null
+++ b/pkgs/yaml/lib/src/loader.dart
@@ -0,0 +1,343 @@
+// Copyright (c) 2014, the Dart project authors.
+// Copyright (c) 2006, Kirill Simonov.
+//
+// Use of this source code is governed by an MIT-style
+// license that can be found in the LICENSE file or at
+// https://opensource.org/licenses/MIT.
+
+import 'package:source_span/source_span.dart';
+
+import 'charcodes.dart';
+import 'equality.dart';
+import 'error_listener.dart';
+import 'event.dart';
+import 'parser.dart';
+import 'yaml_document.dart';
+import 'yaml_exception.dart';
+import 'yaml_node.dart';
+
+/// A loader that reads [Event]s emitted by a [Parser] and emits
+/// [YamlDocument]s.
+///
+/// This is based on the libyaml loader, available at
+/// https://github.com/yaml/libyaml/blob/master/src/loader.c. The license for
+/// that is available in ../../libyaml-license.txt.
+class Loader {
+ /// The underlying [Parser] that generates [Event]s.
+ final Parser _parser;
+
+ /// Aliases by the alias name.
+ final _aliases = <String, YamlNode>{};
+
+ /// The span of the entire stream emitted so far.
+ FileSpan get span => _span;
+ FileSpan _span;
+
+ /// Creates a loader that loads [source].
+ factory Loader(String source,
+ {Uri? sourceUrl, bool recover = false, ErrorListener? errorListener}) {
+ var parser = Parser(source,
+ sourceUrl: sourceUrl, recover: recover, errorListener: errorListener);
+ var event = parser.parse();
+ assert(event.type == EventType.streamStart);
+ return Loader._(parser, event.span);
+ }
+
+ Loader._(this._parser, this._span);
+
+ /// Loads the next document from the stream.
+ ///
+ /// If there are no more documents, returns `null`.
+ YamlDocument? load() {
+ if (_parser.isDone) return null;
+
+ var event = _parser.parse();
+ if (event.type == EventType.streamEnd) {
+ _span = _span.expand(event.span);
+ return null;
+ }
+
+ var document = _loadDocument(event as DocumentStartEvent);
+ _span = _span.expand(document.span as FileSpan);
+ _aliases.clear();
+ return document;
+ }
+
+ /// Composes a document object.
+ YamlDocument _loadDocument(DocumentStartEvent firstEvent) {
+ var contents = _loadNode(_parser.parse());
+
+ var lastEvent = _parser.parse() as DocumentEndEvent;
+ assert(lastEvent.type == EventType.documentEnd);
+
+ return YamlDocument.internal(
+ contents,
+ firstEvent.span.expand(lastEvent.span),
+ firstEvent.versionDirective,
+ firstEvent.tagDirectives,
+ startImplicit: firstEvent.isImplicit,
+ endImplicit: lastEvent.isImplicit);
+ }
+
+ /// Composes a node.
+ YamlNode _loadNode(Event firstEvent) => switch (firstEvent.type) {
+ EventType.alias => _loadAlias(firstEvent as AliasEvent),
+ EventType.scalar => _loadScalar(firstEvent as ScalarEvent),
+ EventType.sequenceStart =>
+ _loadSequence(firstEvent as SequenceStartEvent),
+ EventType.mappingStart => _loadMapping(firstEvent as MappingStartEvent),
+ _ => throw StateError('Unreachable')
+ };
+
+ /// Registers an anchor.
+ void _registerAnchor(String? anchor, YamlNode node) {
+ if (anchor == null) return;
+
+ // libyaml throws an error for duplicate anchors, but example 7.1 makes it
+ // clear that they should be overridden:
+ // http://yaml.org/spec/1.2/spec.html#id2786448.
+
+ _aliases[anchor] = node;
+ }
+
+ /// Composes a node corresponding to an alias.
+ YamlNode _loadAlias(AliasEvent event) {
+ var alias = _aliases[event.name];
+ if (alias != null) return alias;
+
+ throw YamlException('Undefined alias.', event.span);
+ }
+
+ /// Composes a scalar node.
+ YamlNode _loadScalar(ScalarEvent scalar) {
+ YamlNode node;
+ if (scalar.tag == '!') {
+ node = YamlScalar.internal(scalar.value, scalar);
+ } else if (scalar.tag != null) {
+ node = _parseByTag(scalar);
+ } else {
+ node = _parseScalar(scalar);
+ }
+
+ _registerAnchor(scalar.anchor, node);
+ return node;
+ }
+
+ /// Composes a sequence node.
+ YamlNode _loadSequence(SequenceStartEvent firstEvent) {
+ if (firstEvent.tag != '!' &&
+ firstEvent.tag != null &&
+ firstEvent.tag != 'tag:yaml.org,2002:seq') {
+ throw YamlException('Invalid tag for sequence.', firstEvent.span);
+ }
+
+ var children = <YamlNode>[];
+ var node = YamlList.internal(children, firstEvent.span, firstEvent.style);
+ _registerAnchor(firstEvent.anchor, node);
+
+ var event = _parser.parse();
+ while (event.type != EventType.sequenceEnd) {
+ children.add(_loadNode(event));
+ event = _parser.parse();
+ }
+
+ setSpan(node, firstEvent.span.expand(event.span));
+ return node;
+ }
+
+ /// Composes a mapping node.
+ YamlNode _loadMapping(MappingStartEvent firstEvent) {
+ if (firstEvent.tag != '!' &&
+ firstEvent.tag != null &&
+ firstEvent.tag != 'tag:yaml.org,2002:map') {
+ throw YamlException('Invalid tag for mapping.', firstEvent.span);
+ }
+
+ var children = deepEqualsMap<dynamic, YamlNode>();
+ var node = YamlMap.internal(children, firstEvent.span, firstEvent.style);
+ _registerAnchor(firstEvent.anchor, node);
+
+ var event = _parser.parse();
+ while (event.type != EventType.mappingEnd) {
+ var key = _loadNode(event);
+ var value = _loadNode(_parser.parse());
+ if (children.containsKey(key)) {
+ throw YamlException('Duplicate mapping key.', key.span);
+ }
+
+ children[key] = value;
+ event = _parser.parse();
+ }
+
+ setSpan(node, firstEvent.span.expand(event.span));
+ return node;
+ }
+
+ /// Parses a scalar according to its tag name.
+ YamlScalar _parseByTag(ScalarEvent scalar) {
+ switch (scalar.tag) {
+ case 'tag:yaml.org,2002:null':
+ var result = _parseNull(scalar);
+ if (result != null) return result;
+ throw YamlException('Invalid null scalar.', scalar.span);
+ case 'tag:yaml.org,2002:bool':
+ var result = _parseBool(scalar);
+ if (result != null) return result;
+ throw YamlException('Invalid bool scalar.', scalar.span);
+ case 'tag:yaml.org,2002:int':
+ var result = _parseNumber(scalar, allowFloat: false);
+ if (result != null) return result;
+ throw YamlException('Invalid int scalar.', scalar.span);
+ case 'tag:yaml.org,2002:float':
+ var result = _parseNumber(scalar, allowInt: false);
+ if (result != null) return result;
+ throw YamlException('Invalid float scalar.', scalar.span);
+ case 'tag:yaml.org,2002:str':
+ return YamlScalar.internal(scalar.value, scalar);
+ default:
+ throw YamlException('Undefined tag: ${scalar.tag}.', scalar.span);
+ }
+ }
+
+ /// Parses [scalar], which may be one of several types.
+ YamlScalar _parseScalar(ScalarEvent scalar) =>
+ _tryParseScalar(scalar) ?? YamlScalar.internal(scalar.value, scalar);
+
+ /// Tries to parse [scalar].
+ ///
+ /// If parsing fails, this returns `null`, indicating that the scalar should
+ /// be parsed as a string.
+ YamlScalar? _tryParseScalar(ScalarEvent scalar) {
+ // Quickly check for the empty string, which means null.
+ var length = scalar.value.length;
+ if (length == 0) return YamlScalar.internal(null, scalar);
+
+ // Dispatch on the first character.
+ var firstChar = scalar.value.codeUnitAt(0);
+ return switch (firstChar) {
+ $dot || $plus || $minus => _parseNumber(scalar),
+ $n || $N => length == 4 ? _parseNull(scalar) : null,
+ $t || $T => length == 4 ? _parseBool(scalar) : null,
+ $f || $F => length == 5 ? _parseBool(scalar) : null,
+ $tilde => length == 1 ? YamlScalar.internal(null, scalar) : null,
+ _ => (firstChar >= $0 && firstChar <= $9) ? _parseNumber(scalar) : null
+ };
+ }
+
+ /// Parse a null scalar.
+ ///
+ /// Returns a Dart `null` if parsing fails.
+ YamlScalar? _parseNull(ScalarEvent scalar) => switch (scalar.value) {
+ '' ||
+ 'null' ||
+ 'Null' ||
+ 'NULL' ||
+ '~' =>
+ YamlScalar.internal(null, scalar),
+ _ => null
+ };
+
+ /// Parse a boolean scalar.
+ ///
+ /// Returns `null` if parsing fails.
+ YamlScalar? _parseBool(ScalarEvent scalar) => switch (scalar.value) {
+ 'true' || 'True' || 'TRUE' => YamlScalar.internal(true, scalar),
+ 'false' || 'False' || 'FALSE' => YamlScalar.internal(false, scalar),
+ _ => null
+ };
+
+ /// Parses a numeric scalar.
+ ///
+ /// Returns `null` if parsing fails.
+ YamlScalar? _parseNumber(ScalarEvent scalar,
+ {bool allowInt = true, bool allowFloat = true}) {
+ var value = _parseNumberValue(scalar.value,
+ allowInt: allowInt, allowFloat: allowFloat);
+ return value == null ? null : YamlScalar.internal(value, scalar);
+ }
+
+ /// Parses the value of a number.
+ ///
+ /// Returns the number if it's parsed successfully, or `null` if it's not.
+ num? _parseNumberValue(String contents,
+ {bool allowInt = true, bool allowFloat = true}) {
+ assert(allowInt || allowFloat);
+
+ var firstChar = contents.codeUnitAt(0);
+ var length = contents.length;
+
+ // Quick check for single digit integers.
+ if (allowInt && length == 1) {
+ var value = firstChar - $0;
+ return value >= 0 && value <= 9 ? value : null;
+ }
+
+ var secondChar = contents.codeUnitAt(1);
+
+ // Hexadecimal or octal integers.
+ if (allowInt && firstChar == $0) {
+ // int.tryParse supports 0x natively.
+ if (secondChar == $x) return int.tryParse(contents);
+
+ if (secondChar == $o) {
+ var afterRadix = contents.substring(2);
+ return int.tryParse(afterRadix, radix: 8);
+ }
+ }
+
+ // Int or float starting with a digit or a +/- sign.
+ if ((firstChar >= $0 && firstChar <= $9) ||
+ ((firstChar == $plus || firstChar == $minus) &&
+ secondChar >= $0 &&
+ secondChar <= $9)) {
+ // Try to parse an int or, failing that, a double.
+ num? result;
+ if (allowInt) {
+ // Pass "radix: 10" explicitly to ensure that "-0x10", which is valid
+ // Dart but invalid YAML, doesn't get parsed.
+ result = int.tryParse(contents, radix: 10);
+ }
+
+ if (allowFloat) result ??= double.tryParse(contents);
+ return result;
+ }
+
+ if (!allowFloat) return null;
+
+ // Now the only possibility is to parse a float starting with a dot or a
+ // sign and a dot, or the signed/unsigned infinity values and not-a-numbers.
+ if ((firstChar == $dot && secondChar >= $0 && secondChar <= $9) ||
+ (firstChar == $minus || firstChar == $plus) && secondChar == $dot) {
+ // Starting with a . and a number or a sign followed by a dot.
+ if (length == 5) {
+ switch (contents) {
+ case '+.inf':
+ case '+.Inf':
+ case '+.INF':
+ return double.infinity;
+ case '-.inf':
+ case '-.Inf':
+ case '-.INF':
+ return -double.infinity;
+ }
+ }
+
+ return double.tryParse(contents);
+ }
+
+ if (length == 4 && firstChar == $dot) {
+ switch (contents) {
+ case '.inf':
+ case '.Inf':
+ case '.INF':
+ return double.infinity;
+ case '.nan':
+ case '.NaN':
+ case '.NAN':
+ return double.nan;
+ }
+ }
+
+ return null;
+ }
+}
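A brief sketch of driving the loader over a multi-document stream; the public `loadYaml*` entry points wrap this, and the direct `src/` import is assumed here only for illustration.

```dart
// Sketch only: loads every document from a two-document stream.
import 'package:yaml/src/loader.dart';

void main() {
  var loader = Loader('foo: bar\n---\n- 1\n- 2\n');
  for (var doc = loader.load(); doc != null; doc = loader.load()) {
    // Each document's contents is a YamlNode; `value` unwraps it to the
    // plain Dart equivalent (a map for the first document, a list for the
    // second).
    print(doc.contents.value);
  }
}
```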
diff --git a/pkgs/yaml/lib/src/null_span.dart b/pkgs/yaml/lib/src/null_span.dart
new file mode 100644
index 0000000..49e1a1c
--- /dev/null
+++ b/pkgs/yaml/lib/src/null_span.dart
@@ -0,0 +1,26 @@
+// Copyright (c) 2014, the Dart project authors.
+// Copyright (c) 2006, Kirill Simonov.
+//
+// Use of this source code is governed by an MIT-style
+// license that can be found in the LICENSE file or at
+// https://opensource.org/licenses/MIT.
+
+import 'package:source_span/source_span.dart';
+
+import 'yaml_node.dart';
+
+/// A [SourceSpan] with no location information.
+///
+/// This is used with [YamlMap.wrap] and [YamlList.wrap] to provide a means of
+/// accessing a non-YAML map that behaves transparently like a map parsed from
+/// YAML.
+class NullSpan extends SourceSpanMixin {
+ @override
+ final SourceLocation start;
+ @override
+ SourceLocation get end => start;
+ @override
+ final text = '';
+
+ NullSpan(Object? sourceUrl) : start = SourceLocation(0, sourceUrl: sourceUrl);
+}
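Where `NullSpan` shows up in practice: nodes created with the public wrapping constructors carry these spans. A minimal sketch:

```dart
// Sketch only: wrapped nodes have empty, zero-offset spans.
import 'package:yaml/yaml.dart';

void main() {
  var wrapped = YamlMap.wrap({'name': 'yaml'});
  print(wrapped.span.text.isEmpty); // true
  print(wrapped.span.start.offset); // 0
}
```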
diff --git a/pkgs/yaml/lib/src/parser.dart b/pkgs/yaml/lib/src/parser.dart
new file mode 100644
index 0000000..e924e40
--- /dev/null
+++ b/pkgs/yaml/lib/src/parser.dart
@@ -0,0 +1,805 @@
+// Copyright (c) 2014, the Dart project authors.
+// Copyright (c) 2006, Kirill Simonov.
+//
+// Use of this source code is governed by an MIT-style
+// license that can be found in the LICENSE file or at
+// https://opensource.org/licenses/MIT.
+
+// ignore_for_file: constant_identifier_names
+
+import 'package:source_span/source_span.dart';
+import 'package:string_scanner/string_scanner.dart';
+
+import 'error_listener.dart';
+import 'event.dart';
+import 'scanner.dart';
+import 'style.dart';
+import 'token.dart';
+import 'utils.dart';
+import 'yaml_document.dart';
+import 'yaml_exception.dart';
+
+/// A parser that reads [Token]s emitted by a [Scanner] and emits [Event]s.
+///
+/// This is based on the libyaml parser, available at
+/// https://github.com/yaml/libyaml/blob/master/src/parser.c. The license for
+/// that is available in ../../libyaml-license.txt.
+class Parser {
+ /// The underlying [Scanner] that generates [Token]s.
+ final Scanner _scanner;
+
+ /// The stack of parse states for nested contexts.
+ final _states = <_State>[];
+
+ /// The current parse state.
+ var _state = _State.STREAM_START;
+
+ /// The custom tag directives, by tag handle.
+ final _tagDirectives = <String, TagDirective>{};
+
+ /// Whether the parser has finished parsing.
+ bool get isDone => _state == _State.END;
+
+ /// Creates a parser that parses [source].
+ ///
+ /// If [recover] is true, will attempt to recover from parse errors and may
+ /// return invalid or synthetic nodes. If [errorListener] is also supplied,
+ /// its onError method will be called for each error recovered from. It is not
+ /// valid to provide [errorListener] if [recover] is false.
+ Parser(String source,
+ {Uri? sourceUrl, bool recover = false, ErrorListener? errorListener})
+ : assert(recover || errorListener == null),
+ _scanner = Scanner(source,
+ sourceUrl: sourceUrl,
+ recover: recover,
+ errorListener: errorListener);
+
+ /// Consumes and returns the next event.
+ Event parse() {
+ try {
+ if (isDone) throw StateError('No more events.');
+ var event = _stateMachine();
+ return event;
+ } on StringScannerException catch (error) {
+ throw YamlException(error.message, error.span);
+ }
+ }
+
+ /// Dispatches parsing based on the current state.
+ Event _stateMachine() {
+ switch (_state) {
+ case _State.STREAM_START:
+ return _parseStreamStart();
+ case _State.DOCUMENT_START:
+ return _parseDocumentStart();
+ case _State.DOCUMENT_CONTENT:
+ return _parseDocumentContent();
+ case _State.DOCUMENT_END:
+ return _parseDocumentEnd();
+ case _State.BLOCK_NODE:
+ return _parseNode(block: true);
+ case _State.BLOCK_NODE_OR_INDENTLESS_SEQUENCE:
+ return _parseNode(block: true, indentlessSequence: true);
+ case _State.FLOW_NODE:
+ return _parseNode();
+ case _State.BLOCK_SEQUENCE_FIRST_ENTRY:
+        // Scan past the `BLOCK-SEQUENCE-START` token so the next token is
+        // the first `BLOCK-ENTRY`.
+ _scanner.scan();
+ return _parseBlockSequenceEntry();
+ case _State.BLOCK_SEQUENCE_ENTRY:
+ return _parseBlockSequenceEntry();
+ case _State.INDENTLESS_SEQUENCE_ENTRY:
+ return _parseIndentlessSequenceEntry();
+ case _State.BLOCK_MAPPING_FIRST_KEY:
+        // Scan past the `BLOCK-MAPPING-START` token so the next token is
+        // the first `KEY`.
+ _scanner.scan();
+ return _parseBlockMappingKey();
+ case _State.BLOCK_MAPPING_KEY:
+ return _parseBlockMappingKey();
+ case _State.BLOCK_MAPPING_VALUE:
+ return _parseBlockMappingValue();
+ case _State.FLOW_SEQUENCE_FIRST_ENTRY:
+ return _parseFlowSequenceEntry(first: true);
+ case _State.FLOW_SEQUENCE_ENTRY:
+ return _parseFlowSequenceEntry();
+ case _State.FLOW_SEQUENCE_ENTRY_MAPPING_KEY:
+ return _parseFlowSequenceEntryMappingKey();
+ case _State.FLOW_SEQUENCE_ENTRY_MAPPING_VALUE:
+ return _parseFlowSequenceEntryMappingValue();
+ case _State.FLOW_SEQUENCE_ENTRY_MAPPING_END:
+ return _parseFlowSequenceEntryMappingEnd();
+ case _State.FLOW_MAPPING_FIRST_KEY:
+ return _parseFlowMappingKey(first: true);
+ case _State.FLOW_MAPPING_KEY:
+ return _parseFlowMappingKey();
+ case _State.FLOW_MAPPING_VALUE:
+ return _parseFlowMappingValue();
+ case _State.FLOW_MAPPING_EMPTY_VALUE:
+ return _parseFlowMappingValue(empty: true);
+ default:
+ throw StateError('Unreachable');
+ }
+ }
+
+ /// Parses the production:
+ ///
+ /// stream ::=
+ /// STREAM-START implicit_document? explicit_document* STREAM-END
+ /// ************
+ Event _parseStreamStart() {
+ var token = _scanner.scan();
+ assert(token.type == TokenType.streamStart);
+
+ _state = _State.DOCUMENT_START;
+ return Event(EventType.streamStart, token.span);
+ }
+
+ /// Parses the productions:
+ ///
+ /// implicit_document ::= block_node DOCUMENT-END*
+ /// *
+ /// explicit_document ::=
+ /// DIRECTIVE* DOCUMENT-START block_node? DOCUMENT-END*
+ /// *************************
+ Event _parseDocumentStart() {
+ var token = _scanner.peek()!;
+
+ // libyaml requires any document beyond the first in the stream to have an
+ // explicit document start indicator, but the spec allows it to be omitted
+ // as long as there was an end indicator.
+
+ // Parse extra document end indicators.
+ while (token.type == TokenType.documentEnd) {
+ token = _scanner.advance()!;
+ }
+
+ if (token.type != TokenType.versionDirective &&
+ token.type != TokenType.tagDirective &&
+ token.type != TokenType.documentStart &&
+ token.type != TokenType.streamEnd) {
+ // Parse an implicit document.
+ _processDirectives();
+ _states.add(_State.DOCUMENT_END);
+ _state = _State.BLOCK_NODE;
+ return DocumentStartEvent(token.span.start.pointSpan());
+ }
+
+ if (token.type == TokenType.streamEnd) {
+ _state = _State.END;
+ _scanner.scan();
+ return Event(EventType.streamEnd, token.span);
+ }
+
+ // Parse an explicit document.
+ var start = token.span;
+ var (versionDirective, tagDirectives) = _processDirectives();
+ token = _scanner.peek()!;
+ if (token.type != TokenType.documentStart) {
+ throw YamlException('Expected document start.', token.span);
+ }
+
+ _states.add(_State.DOCUMENT_END);
+ _state = _State.DOCUMENT_CONTENT;
+ _scanner.scan();
+ return DocumentStartEvent(start.expand(token.span),
+ versionDirective: versionDirective,
+ tagDirectives: tagDirectives,
+ isImplicit: false);
+ }
+
+ /// Parses the productions:
+ ///
+ /// explicit_document ::=
+ /// DIRECTIVE* DOCUMENT-START block_node? DOCUMENT-END*
+ /// ***********
+ Event _parseDocumentContent() {
+ var token = _scanner.peek()!;
+
+ switch (token.type) {
+ case TokenType.versionDirective:
+ case TokenType.tagDirective:
+ case TokenType.documentStart:
+ case TokenType.documentEnd:
+ case TokenType.streamEnd:
+ _state = _states.removeLast();
+ return _processEmptyScalar(token.span.start);
+ default:
+ return _parseNode(block: true);
+ }
+ }
+
+ /// Parses the productions:
+ ///
+ /// implicit_document ::= block_node DOCUMENT-END*
+ /// *************
+ /// explicit_document ::=
+ /// DIRECTIVE* DOCUMENT-START block_node? DOCUMENT-END*
+ /// *************
+ Event _parseDocumentEnd() {
+ _tagDirectives.clear();
+ _state = _State.DOCUMENT_START;
+
+ var token = _scanner.peek()!;
+ if (token.type == TokenType.documentEnd) {
+ _scanner.scan();
+ return DocumentEndEvent(token.span, isImplicit: false);
+ } else {
+ return DocumentEndEvent(token.span.start.pointSpan());
+ }
+ }
+
+ /// Parses the productions:
+ ///
+ /// block_node_or_indentless_sequence ::=
+ /// ALIAS
+ /// *****
+ /// | properties (block_content | indentless_block_sequence)?
+ /// ********** *
+ /// | block_content | indentless_block_sequence
+ /// *
+ /// block_node ::= ALIAS
+ /// *****
+ /// | properties block_content?
+ /// ********** *
+ /// | block_content
+ /// *
+ /// flow_node ::= ALIAS
+ /// *****
+ /// | properties flow_content?
+ /// ********** *
+ /// | flow_content
+ /// *
+ /// properties ::= TAG ANCHOR? | ANCHOR TAG?
+ /// *************************
+ /// block_content ::= block_collection | flow_collection | SCALAR
+ /// ******
+ /// flow_content ::= flow_collection | SCALAR
+ /// ******
+ Event _parseNode({bool block = false, bool indentlessSequence = false}) {
+ var token = _scanner.peek()!;
+
+ if (token is AliasToken) {
+ _scanner.scan();
+ _state = _states.removeLast();
+ return AliasEvent(token.span, token.name);
+ }
+
+ String? anchor;
+ TagToken? tagToken;
+ var span = token.span.start.pointSpan();
+ Token parseAnchor(AnchorToken token) {
+ anchor = token.name;
+ span = span.expand(token.span);
+ return _scanner.advance()!;
+ }
+
+ Token parseTag(TagToken token) {
+ tagToken = token;
+ span = span.expand(token.span);
+ return _scanner.advance()!;
+ }
+
+ if (token is AnchorToken) {
+ token = parseAnchor(token);
+ if (token is TagToken) token = parseTag(token);
+ } else if (token is TagToken) {
+ token = parseTag(token);
+ if (token is AnchorToken) token = parseAnchor(token);
+ }
+
+ String? tag;
+ if (tagToken != null) {
+ if (tagToken!.handle == null) {
+ tag = tagToken!.suffix;
+ } else {
+ var tagDirective = _tagDirectives[tagToken!.handle];
+ if (tagDirective == null) {
+ throw YamlException('Undefined tag handle.', tagToken!.span);
+ }
+
+ tag = tagDirective.prefix + (tagToken?.suffix ?? '');
+ }
+ }
+
+ if (indentlessSequence && token.type == TokenType.blockEntry) {
+ _state = _State.INDENTLESS_SEQUENCE_ENTRY;
+ return SequenceStartEvent(span.expand(token.span), CollectionStyle.BLOCK,
+ anchor: anchor, tag: tag);
+ }
+
+ if (token is ScalarToken) {
+ // All non-plain scalars have the "!" tag by default.
+ if (tag == null && token.style != ScalarStyle.PLAIN) tag = '!';
+
+ _state = _states.removeLast();
+ _scanner.scan();
+ return ScalarEvent(span.expand(token.span), token.value, token.style,
+ anchor: anchor, tag: tag);
+ }
+
+ if (token.type == TokenType.flowSequenceStart) {
+ _state = _State.FLOW_SEQUENCE_FIRST_ENTRY;
+ return SequenceStartEvent(span.expand(token.span), CollectionStyle.FLOW,
+ anchor: anchor, tag: tag);
+ }
+
+ if (token.type == TokenType.flowMappingStart) {
+ _state = _State.FLOW_MAPPING_FIRST_KEY;
+ return MappingStartEvent(span.expand(token.span), CollectionStyle.FLOW,
+ anchor: anchor, tag: tag);
+ }
+
+ if (block && token.type == TokenType.blockSequenceStart) {
+ _state = _State.BLOCK_SEQUENCE_FIRST_ENTRY;
+ return SequenceStartEvent(span.expand(token.span), CollectionStyle.BLOCK,
+ anchor: anchor, tag: tag);
+ }
+
+ if (block && token.type == TokenType.blockMappingStart) {
+ _state = _State.BLOCK_MAPPING_FIRST_KEY;
+ return MappingStartEvent(span.expand(token.span), CollectionStyle.BLOCK,
+ anchor: anchor, tag: tag);
+ }
+
+ if (anchor != null || tag != null) {
+ _state = _states.removeLast();
+ return ScalarEvent(span, '', ScalarStyle.PLAIN, anchor: anchor, tag: tag);
+ }
+
+ throw YamlException('Expected node content.', span);
+ }
+
+ /// Parses the productions:
+ ///
+ /// block_sequence ::=
+ /// BLOCK-SEQUENCE-START (BLOCK-ENTRY block_node?)* BLOCK-END
+ /// ******************** *********** * *********
+ Event _parseBlockSequenceEntry() {
+ var token = _scanner.peek()!;
+
+ if (token.type == TokenType.blockEntry) {
+ var start = token.span.start;
+ token = _scanner.advance()!;
+
+ if (token.type == TokenType.blockEntry ||
+ token.type == TokenType.blockEnd) {
+ _state = _State.BLOCK_SEQUENCE_ENTRY;
+ return _processEmptyScalar(start);
+ } else {
+ _states.add(_State.BLOCK_SEQUENCE_ENTRY);
+ return _parseNode(block: true);
+ }
+ }
+
+ if (token.type == TokenType.blockEnd) {
+ _scanner.scan();
+ _state = _states.removeLast();
+ return Event(EventType.sequenceEnd, token.span);
+ }
+
+ throw YamlException("While parsing a block collection, expected '-'.",
+ token.span.start.pointSpan());
+ }
+
+ /// Parses the productions:
+ ///
+ /// indentless_sequence ::= (BLOCK-ENTRY block_node?)+
+ /// *********** *
+ Event _parseIndentlessSequenceEntry() {
+ var token = _scanner.peek()!;
+
+ if (token.type != TokenType.blockEntry) {
+ _state = _states.removeLast();
+ return Event(EventType.sequenceEnd, token.span.start.pointSpan());
+ }
+
+ var start = token.span.start;
+ token = _scanner.advance()!;
+
+ if (token.type == TokenType.blockEntry ||
+ token.type == TokenType.key ||
+ token.type == TokenType.value ||
+ token.type == TokenType.blockEnd) {
+ _state = _State.INDENTLESS_SEQUENCE_ENTRY;
+ return _processEmptyScalar(start);
+ } else {
+ _states.add(_State.INDENTLESS_SEQUENCE_ENTRY);
+ return _parseNode(block: true);
+ }
+ }
+
+ /// Parses the productions:
+ ///
+ /// block_mapping ::= BLOCK-MAPPING_START
+ /// *******************
+ /// ((KEY block_node_or_indentless_sequence?)?
+ /// *** *
+ /// (VALUE block_node_or_indentless_sequence?)?)*
+ ///
+ /// BLOCK-END
+ /// *********
+ Event _parseBlockMappingKey() {
+ var token = _scanner.peek()!;
+ if (token.type == TokenType.key) {
+ var start = token.span.start;
+ token = _scanner.advance()!;
+
+ if (token.type == TokenType.key ||
+ token.type == TokenType.value ||
+ token.type == TokenType.blockEnd) {
+ _state = _State.BLOCK_MAPPING_VALUE;
+ return _processEmptyScalar(start);
+ } else {
+ _states.add(_State.BLOCK_MAPPING_VALUE);
+ return _parseNode(block: true, indentlessSequence: true);
+ }
+ }
+
+ // libyaml doesn't allow empty keys without an explicit key indicator, but
+ // the spec does. See example 8.18:
+ // http://yaml.org/spec/1.2/spec.html#id2798896.
+ if (token.type == TokenType.value) {
+ _state = _State.BLOCK_MAPPING_VALUE;
+ return _processEmptyScalar(token.span.start);
+ }
+
+ if (token.type == TokenType.blockEnd) {
+ _scanner.scan();
+ _state = _states.removeLast();
+ return Event(EventType.mappingEnd, token.span);
+ }
+
+ throw YamlException('Expected a key while parsing a block mapping.',
+ token.span.start.pointSpan());
+ }
+
+ /// Parses the productions:
+ ///
+ /// block_mapping ::= BLOCK-MAPPING_START
+ ///
+ /// ((KEY block_node_or_indentless_sequence?)?
+ ///
+ /// (VALUE block_node_or_indentless_sequence?)?)*
+ /// ***** *
+ /// BLOCK-END
+ ///
+ Event _parseBlockMappingValue() {
+ var token = _scanner.peek()!;
+
+ if (token.type != TokenType.value) {
+ _state = _State.BLOCK_MAPPING_KEY;
+ return _processEmptyScalar(token.span.start);
+ }
+
+ var start = token.span.start;
+ token = _scanner.advance()!;
+ if (token.type == TokenType.key ||
+ token.type == TokenType.value ||
+ token.type == TokenType.blockEnd) {
+ _state = _State.BLOCK_MAPPING_KEY;
+ return _processEmptyScalar(start);
+ } else {
+ _states.add(_State.BLOCK_MAPPING_KEY);
+ return _parseNode(block: true, indentlessSequence: true);
+ }
+ }
+
+ /// Parses the productions:
+ ///
+ /// flow_sequence ::= FLOW-SEQUENCE-START
+ /// *******************
+ /// (flow_sequence_entry FLOW-ENTRY)*
+ /// * **********
+ /// flow_sequence_entry?
+ /// *
+ /// FLOW-SEQUENCE-END
+ /// *****************
+ /// flow_sequence_entry ::=
+ /// flow_node | KEY flow_node? (VALUE flow_node?)?
+ /// *
+ Event _parseFlowSequenceEntry({bool first = false}) {
+ if (first) _scanner.scan();
+ var token = _scanner.peek()!;
+
+ if (token.type != TokenType.flowSequenceEnd) {
+ if (!first) {
+ if (token.type != TokenType.flowEntry) {
+ throw YamlException(
+ "While parsing a flow sequence, expected ',' or ']'.",
+ token.span.start.pointSpan());
+ }
+
+ token = _scanner.advance()!;
+ }
+
+ if (token.type == TokenType.key) {
+ _state = _State.FLOW_SEQUENCE_ENTRY_MAPPING_KEY;
+ _scanner.scan();
+ return MappingStartEvent(token.span, CollectionStyle.FLOW);
+ } else if (token.type != TokenType.flowSequenceEnd) {
+ _states.add(_State.FLOW_SEQUENCE_ENTRY);
+ return _parseNode();
+ }
+ }
+
+ _scanner.scan();
+ _state = _states.removeLast();
+ return Event(EventType.sequenceEnd, token.span);
+ }
+
+ /// Parses the productions:
+ ///
+ /// flow_sequence_entry ::=
+ /// flow_node | KEY flow_node? (VALUE flow_node?)?
+ /// *** *
+ Event _parseFlowSequenceEntryMappingKey() {
+ var token = _scanner.peek()!;
+
+ if (token.type == TokenType.value ||
+ token.type == TokenType.flowEntry ||
+ token.type == TokenType.flowSequenceEnd) {
+ // libyaml consumes the token here, but that seems like a bug, since it
+ // always causes [_parseFlowSequenceEntryMappingValue] to emit an empty
+ // scalar.
+
+ var start = token.span.start;
+ _state = _State.FLOW_SEQUENCE_ENTRY_MAPPING_VALUE;
+ return _processEmptyScalar(start);
+ } else {
+ _states.add(_State.FLOW_SEQUENCE_ENTRY_MAPPING_VALUE);
+ return _parseNode();
+ }
+ }
+
+ /// Parses the productions:
+ ///
+ /// flow_sequence_entry ::=
+ /// flow_node | KEY flow_node? (VALUE flow_node?)?
+ /// ***** *
+ Event _parseFlowSequenceEntryMappingValue() {
+ var token = _scanner.peek()!;
+
+ if (token.type == TokenType.value) {
+ token = _scanner.advance()!;
+ if (token.type != TokenType.flowEntry &&
+ token.type != TokenType.flowSequenceEnd) {
+ _states.add(_State.FLOW_SEQUENCE_ENTRY_MAPPING_END);
+ return _parseNode();
+ }
+ }
+
+ _state = _State.FLOW_SEQUENCE_ENTRY_MAPPING_END;
+ return _processEmptyScalar(token.span.start);
+ }
+
+ /// Parses the productions:
+ ///
+ /// flow_sequence_entry ::=
+ /// flow_node | KEY flow_node? (VALUE flow_node?)?
+ /// *
+ Event _parseFlowSequenceEntryMappingEnd() {
+ _state = _State.FLOW_SEQUENCE_ENTRY;
+ return Event(EventType.mappingEnd, _scanner.peek()!.span.start.pointSpan());
+ }
+
+ /// Parses the productions:
+ ///
+ /// flow_mapping ::= FLOW-MAPPING-START
+ /// ******************
+ /// (flow_mapping_entry FLOW-ENTRY)*
+ /// * **********
+ /// flow_mapping_entry?
+ /// ******************
+ /// FLOW-MAPPING-END
+ /// ****************
+ /// flow_mapping_entry ::=
+ /// flow_node | KEY flow_node? (VALUE flow_node?)?
+ /// * *** *
+ Event _parseFlowMappingKey({bool first = false}) {
+ if (first) _scanner.scan();
+ var token = _scanner.peek()!;
+
+ if (token.type != TokenType.flowMappingEnd) {
+ if (!first) {
+ if (token.type != TokenType.flowEntry) {
+ throw YamlException(
+ "While parsing a flow mapping, expected ',' or '}'.",
+ token.span.start.pointSpan());
+ }
+
+ token = _scanner.advance()!;
+ }
+
+ if (token.type == TokenType.key) {
+ token = _scanner.advance()!;
+ if (token.type != TokenType.value &&
+ token.type != TokenType.flowEntry &&
+ token.type != TokenType.flowMappingEnd) {
+ _states.add(_State.FLOW_MAPPING_VALUE);
+ return _parseNode();
+ } else {
+ _state = _State.FLOW_MAPPING_VALUE;
+ return _processEmptyScalar(token.span.start);
+ }
+ } else if (token.type != TokenType.flowMappingEnd) {
+ _states.add(_State.FLOW_MAPPING_EMPTY_VALUE);
+ return _parseNode();
+ }
+ }
+
+ _scanner.scan();
+ _state = _states.removeLast();
+ return Event(EventType.mappingEnd, token.span);
+ }
+
+ /// Parses the productions:
+ ///
+ /// flow_mapping_entry ::=
+ /// flow_node | KEY flow_node? (VALUE flow_node?)?
+ /// * ***** *
+ Event _parseFlowMappingValue({bool empty = false}) {
+ var token = _scanner.peek()!;
+
+ if (empty) {
+ _state = _State.FLOW_MAPPING_KEY;
+ return _processEmptyScalar(token.span.start);
+ }
+
+ if (token.type == TokenType.value) {
+ token = _scanner.advance()!;
+ if (token.type != TokenType.flowEntry &&
+ token.type != TokenType.flowMappingEnd) {
+ _states.add(_State.FLOW_MAPPING_KEY);
+ return _parseNode();
+ }
+ }
+
+ _state = _State.FLOW_MAPPING_KEY;
+ return _processEmptyScalar(token.span.start);
+ }
+
+ /// Generate an empty scalar event.
+ Event _processEmptyScalar(SourceLocation location) =>
+ ScalarEvent(location.pointSpan() as FileSpan, '', ScalarStyle.PLAIN);
+
+ /// Parses directives.
+ (VersionDirective?, List<TagDirective>) _processDirectives() {
+ var token = _scanner.peek()!;
+
+ VersionDirective? versionDirective;
+ var tagDirectives = <TagDirective>[];
+ while (token.type == TokenType.versionDirective ||
+ token.type == TokenType.tagDirective) {
+ if (token is VersionDirectiveToken) {
+ if (versionDirective != null) {
+ throw YamlException('Duplicate %YAML directive.', token.span);
+ }
+
+ if (token.major != 1 || token.minor == 0) {
+ throw YamlException(
+ 'Incompatible YAML document. This parser only supports YAML 1.1 '
+ 'and 1.2.',
+ token.span);
+ } else if (token.minor > 2) {
+ // TODO(nweiz): Print to stderr when issue 6943 is fixed and dart:io
+ // is available.
+ warn('Warning: this parser only supports YAML 1.1 and 1.2.',
+ token.span);
+ }
+
+ versionDirective = VersionDirective(token.major, token.minor);
+ } else if (token is TagDirectiveToken) {
+ var tagDirective = TagDirective(token.handle, token.prefix);
+ _appendTagDirective(tagDirective, token.span);
+ tagDirectives.add(tagDirective);
+ }
+
+ token = _scanner.advance()!;
+ }
+
+ _appendTagDirective(TagDirective('!', '!'), token.span.start.pointSpan(),
+ allowDuplicates: true);
+ _appendTagDirective(
+ TagDirective('!!', 'tag:yaml.org,2002:'), token.span.start.pointSpan(),
+ allowDuplicates: true);
+
+ return (versionDirective, tagDirectives);
+ }
+
+ /// Adds a tag directive to the directives stack.
+ void _appendTagDirective(TagDirective newDirective, FileSpan span,
+ {bool allowDuplicates = false}) {
+ if (_tagDirectives.containsKey(newDirective.handle)) {
+ if (allowDuplicates) return;
+ throw YamlException('Duplicate %TAG directive.', span);
+ }
+
+ _tagDirectives[newDirective.handle] = newDirective;
+ }
+}
+
+/// The possible states for the parser.
+class _State {
+ /// Expect [TokenType.streamStart].
+ static const STREAM_START = _State('STREAM_START');
+
+ /// Expect [TokenType.documentStart].
+ static const DOCUMENT_START = _State('DOCUMENT_START');
+
+ /// Expect the content of a document.
+ static const DOCUMENT_CONTENT = _State('DOCUMENT_CONTENT');
+
+ /// Expect [TokenType.documentEnd].
+ static const DOCUMENT_END = _State('DOCUMENT_END');
+
+ /// Expect a block node.
+ static const BLOCK_NODE = _State('BLOCK_NODE');
+
+ /// Expect a block node or indentless sequence.
+ static const BLOCK_NODE_OR_INDENTLESS_SEQUENCE =
+ _State('BLOCK_NODE_OR_INDENTLESS_SEQUENCE');
+
+ /// Expect a flow node.
+ static const FLOW_NODE = _State('FLOW_NODE');
+
+ /// Expect the first entry of a block sequence.
+ static const BLOCK_SEQUENCE_FIRST_ENTRY =
+ _State('BLOCK_SEQUENCE_FIRST_ENTRY');
+
+ /// Expect an entry of a block sequence.
+ static const BLOCK_SEQUENCE_ENTRY = _State('BLOCK_SEQUENCE_ENTRY');
+
+ /// Expect an entry of an indentless sequence.
+ static const INDENTLESS_SEQUENCE_ENTRY = _State('INDENTLESS_SEQUENCE_ENTRY');
+
+ /// Expect the first key of a block mapping.
+ static const BLOCK_MAPPING_FIRST_KEY = _State('BLOCK_MAPPING_FIRST_KEY');
+
+ /// Expect a block mapping key.
+ static const BLOCK_MAPPING_KEY = _State('BLOCK_MAPPING_KEY');
+
+ /// Expect a block mapping value.
+ static const BLOCK_MAPPING_VALUE = _State('BLOCK_MAPPING_VALUE');
+
+ /// Expect the first entry of a flow sequence.
+ static const FLOW_SEQUENCE_FIRST_ENTRY = _State('FLOW_SEQUENCE_FIRST_ENTRY');
+
+ /// Expect an entry of a flow sequence.
+ static const FLOW_SEQUENCE_ENTRY = _State('FLOW_SEQUENCE_ENTRY');
+
+ /// Expect a key of an ordered mapping.
+ static const FLOW_SEQUENCE_ENTRY_MAPPING_KEY =
+ _State('FLOW_SEQUENCE_ENTRY_MAPPING_KEY');
+
+ /// Expect a value of an ordered mapping.
+ static const FLOW_SEQUENCE_ENTRY_MAPPING_VALUE =
+ _State('FLOW_SEQUENCE_ENTRY_MAPPING_VALUE');
+
+  /// Expect the end of an ordered mapping entry.
+ static const FLOW_SEQUENCE_ENTRY_MAPPING_END =
+ _State('FLOW_SEQUENCE_ENTRY_MAPPING_END');
+
+ /// Expect the first key of a flow mapping.
+ static const FLOW_MAPPING_FIRST_KEY = _State('FLOW_MAPPING_FIRST_KEY');
+
+ /// Expect a key of a flow mapping.
+ static const FLOW_MAPPING_KEY = _State('FLOW_MAPPING_KEY');
+
+ /// Expect a value of a flow mapping.
+ static const FLOW_MAPPING_VALUE = _State('FLOW_MAPPING_VALUE');
+
+ /// Expect an empty value of a flow mapping.
+ static const FLOW_MAPPING_EMPTY_VALUE = _State('FLOW_MAPPING_EMPTY_VALUE');
+
+ /// Expect nothing.
+ static const END = _State('END');
+
+ final String name;
+
+ const _State(this.name);
+
+ @override
+ String toString() => name;
+}
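As an end-to-end check of the state machine above, a hedged sketch (internal `src/` imports assumed) that prints the event stream with indentation reflecting the nesting the parser reports; the same constructor also accepts the `recover` and `errorListener` options documented above.

```dart
// Sketch only: visualises parser events for a small block/flow document.
import 'package:yaml/src/event.dart';
import 'package:yaml/src/parser.dart';

void main() {
  var parser = Parser('top:\n  - a\n  - {b: c}\n');
  var depth = 0;
  while (!parser.isDone) {
    var event = parser.parse();
    // Closing events reduce the nesting level before printing.
    if (event.type == EventType.sequenceEnd ||
        event.type == EventType.mappingEnd ||
        event.type == EventType.documentEnd) {
      depth--;
    }
    print('${'  ' * depth}$event');
    // Opening events increase it afterwards.
    if (event.type == EventType.sequenceStart ||
        event.type == EventType.mappingStart ||
        event.type == EventType.documentStart) {
      depth++;
    }
  }
}
```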
diff --git a/pkgs/yaml/lib/src/scanner.dart b/pkgs/yaml/lib/src/scanner.dart
new file mode 100644
index 0000000..1cfd3af
--- /dev/null
+++ b/pkgs/yaml/lib/src/scanner.dart
@@ -0,0 +1,1695 @@
+// Copyright (c) 2014, the Dart project authors.
+// Copyright (c) 2006, Kirill Simonov.
+//
+// Use of this source code is governed by an MIT-style
+// license that can be found in the LICENSE file or at
+// https://opensource.org/licenses/MIT.
+
+// ignore_for_file: constant_identifier_names
+
+import 'package:collection/collection.dart';
+import 'package:source_span/source_span.dart';
+import 'package:string_scanner/string_scanner.dart';
+
+import 'error_listener.dart';
+import 'style.dart';
+import 'token.dart';
+import 'utils.dart';
+import 'yaml_exception.dart';
+
+/// A scanner that reads a string of Unicode characters and emits [Token]s.
+///
+/// This is based on the libyaml scanner, available at
+/// https://github.com/yaml/libyaml/blob/master/src/scanner.c. The license for
+/// that is available in ../../libyaml-license.txt.
+class Scanner {
+ static const TAB = 0x9;
+ static const LF = 0xA;
+ static const CR = 0xD;
+ static const SP = 0x20;
+ static const DOLLAR = 0x24;
+ static const LEFT_PAREN = 0x28;
+ static const RIGHT_PAREN = 0x29;
+ static const PLUS = 0x2B;
+ static const COMMA = 0x2C;
+ static const HYPHEN = 0x2D;
+ static const PERIOD = 0x2E;
+ static const QUESTION = 0x3F;
+ static const COLON = 0x3A;
+ static const SEMICOLON = 0x3B;
+ static const EQUALS = 0x3D;
+ static const LEFT_SQUARE = 0x5B;
+ static const RIGHT_SQUARE = 0x5D;
+ static const LEFT_CURLY = 0x7B;
+ static const RIGHT_CURLY = 0x7D;
+ static const HASH = 0x23;
+ static const AMPERSAND = 0x26;
+ static const ASTERISK = 0x2A;
+ static const EXCLAMATION = 0x21;
+ static const VERTICAL_BAR = 0x7C;
+ static const LEFT_ANGLE = 0x3C;
+ static const RIGHT_ANGLE = 0x3E;
+ static const SINGLE_QUOTE = 0x27;
+ static const DOUBLE_QUOTE = 0x22;
+ static const PERCENT = 0x25;
+ static const AT = 0x40;
+ static const GRAVE_ACCENT = 0x60;
+ static const TILDE = 0x7E;
+
+ static const NULL = 0x0;
+ static const BELL = 0x7;
+ static const BACKSPACE = 0x8;
+ static const VERTICAL_TAB = 0xB;
+ static const FORM_FEED = 0xC;
+ static const ESCAPE = 0x1B;
+ static const SLASH = 0x2F;
+ static const BACKSLASH = 0x5C;
+ static const UNDERSCORE = 0x5F;
+ static const NEL = 0x85;
+ static const NBSP = 0xA0;
+ static const LINE_SEPARATOR = 0x2028;
+ static const PARAGRAPH_SEPARATOR = 0x2029;
+ static const BOM = 0xFEFF;
+
+ static const NUMBER_0 = 0x30;
+ static const NUMBER_9 = 0x39;
+
+ static const LETTER_A = 0x61;
+ static const LETTER_B = 0x62;
+ static const LETTER_E = 0x65;
+ static const LETTER_F = 0x66;
+ static const LETTER_N = 0x6E;
+ static const LETTER_R = 0x72;
+ static const LETTER_T = 0x74;
+ static const LETTER_U = 0x75;
+ static const LETTER_V = 0x76;
+ static const LETTER_X = 0x78;
+ static const LETTER_Z = 0x7A;
+
+ static const LETTER_CAP_A = 0x41;
+ static const LETTER_CAP_F = 0x46;
+ static const LETTER_CAP_L = 0x4C;
+ static const LETTER_CAP_N = 0x4E;
+ static const LETTER_CAP_P = 0x50;
+ static const LETTER_CAP_U = 0x55;
+ static const LETTER_CAP_X = 0x58;
+ static const LETTER_CAP_Z = 0x5A;
+
+ /// Whether this scanner should attempt to recover when parsing invalid YAML.
+ final bool _recover;
+
+ /// A listener to report YAML errors to.
+ final ErrorListener? _errorListener;
+
+ /// The underlying [SpanScanner] used to read characters from the source text.
+ ///
+ /// This is also used to track line and column information and to generate
+ /// [SourceSpan]s.
+ final SpanScanner _scanner;
+
+ /// Whether this scanner has produced a [TokenType.streamStart] token
+ /// indicating the beginning of the YAML stream.
+ var _streamStartProduced = false;
+
+ /// Whether this scanner has produced a [TokenType.streamEnd] token
+ /// indicating the end of the YAML stream.
+ var _streamEndProduced = false;
+
+ /// The queue of tokens yet to be emitted.
+ ///
+ /// These are queued up in advance so that [TokenType.key] tokens can be
+ /// inserted once the scanner determines that a series of tokens represents a
+ /// mapping key.
+ final _tokens = QueueList<Token>();
+
+ /// The number of tokens that have been emitted.
+ ///
+ /// This doesn't count tokens in [_tokens].
+ var _tokensParsed = 0;
+
+ /// Whether the next token in [_tokens] is ready to be returned.
+ ///
+ /// It might not be ready if there may still be a [TokenType.key] inserted
+ /// before it.
+ var _tokenAvailable = false;
+
+ /// The stack of indent levels for the current nested block contexts.
+ ///
+ /// The YAML spec specifies that the initial indentation level is -1 spaces.
+ final _indents = <int>[-1];
+
+ /// Whether a simple key is allowed in this context.
+ ///
+ /// A simple key refers to any mapping key that doesn't have an explicit "?".
+ var _simpleKeyAllowed = true;
+
+ /// The stack of potential simple keys for each level of flow nesting.
+ ///
+ /// Entries in this list may be `null`, indicating that there is no valid
+ /// simple key for the associated level of nesting.
+ ///
+ /// When a ":" is parsed and there's a simple key available, a [TokenType.key]
+ /// token is inserted in [_tokens] before that key's token. This allows the
+ /// parser to tell that the key is intended to be a mapping key.
+ final _simpleKeys = <_SimpleKey?>[null];
+
+ /// The current indentation level.
+ int get _indent => _indents.last;
+
+ /// Whether the scanner's currently positioned in a block-level structure (as
+ /// opposed to flow-level).
+ bool get _inBlockContext => _simpleKeys.length == 1;
+
+ /// Whether the current character is a line break or the end of the source.
+ bool get _isBreakOrEnd => _scanner.isDone || _isBreak;
+
+ /// Whether the current character is a line break.
+ bool get _isBreak => _isBreakAt(0);
+
+ /// Whether the current character is whitespace or the end of the source.
+ bool get _isBlankOrEnd => _isBlankOrEndAt(0);
+
+ /// Whether the current character is whitespace.
+ bool get _isBlank => _isBlankAt(0);
+
+ /// Whether the current character is a valid tag name character.
+ ///
+ /// See http://yaml.org/spec/1.2/spec.html#ns-tag-name.
+ bool get _isTagChar {
+ var char = _scanner.peekChar();
+ if (char == null) return false;
+ switch (char) {
+ case HYPHEN:
+ case SEMICOLON:
+ case SLASH:
+ case COLON:
+ case AT:
+ case AMPERSAND:
+ case EQUALS:
+ case PLUS:
+ case DOLLAR:
+ case PERIOD:
+ case TILDE:
+ case QUESTION:
+ case ASTERISK:
+ case SINGLE_QUOTE:
+ case LEFT_PAREN:
+ case RIGHT_PAREN:
+ case PERCENT:
+ return true;
+ default:
+ return (char >= NUMBER_0 && char <= NUMBER_9) ||
+ (char >= LETTER_A && char <= LETTER_Z) ||
+ (char >= LETTER_CAP_A && char <= LETTER_CAP_Z);
+ }
+ }
+
+ /// Whether the current character is a valid anchor name character.
+ ///
+ /// See http://yaml.org/spec/1.2/spec.html#ns-anchor-name.
+ bool get _isAnchorChar {
+ if (!_isNonSpace) return false;
+
+ switch (_scanner.peekChar()) {
+ case COMMA:
+ case LEFT_SQUARE:
+ case RIGHT_SQUARE:
+ case LEFT_CURLY:
+ case RIGHT_CURLY:
+ return false;
+ default:
+ return true;
+ }
+ }
+
+ /// Whether the character at the current position is a decimal digit.
+ bool get _isDigit {
+ var char = _scanner.peekChar();
+ return char != null && (char >= NUMBER_0 && char <= NUMBER_9);
+ }
+
+  /// Whether the character at the current position is a hexadecimal
+ /// digit.
+ bool get _isHex {
+ var char = _scanner.peekChar();
+ if (char == null) return false;
+ return (char >= NUMBER_0 && char <= NUMBER_9) ||
+ (char >= LETTER_A && char <= LETTER_F) ||
+ (char >= LETTER_CAP_A && char <= LETTER_CAP_F);
+ }
+
+ /// Whether the character at the current position is a plain character.
+ ///
+ /// See http://yaml.org/spec/1.2/spec.html#ns-plain-char(c).
+ bool get _isPlainChar => _isPlainCharAt(0);
+
+ /// Whether the character at the current position is a printable character
+ /// other than a line break or byte-order mark.
+ ///
+ /// See http://yaml.org/spec/1.2/spec.html#nb-char.
+ bool get _isNonBreak {
+ var char = _scanner.peekChar();
+ return switch (char) {
+ null => false,
+ LF || CR || BOM => false,
+ TAB || NEL => true,
+ _ => _isStandardCharacterAt(0),
+ };
+ }
+
+ /// Whether the character at the current position is a printable character
+ /// other than whitespace.
+ ///
+ /// See http://yaml.org/spec/1.2/spec.html#nb-char.
+ bool get _isNonSpace {
+ var char = _scanner.peekChar();
+ return switch (char) {
+ null => false,
+ LF || CR || BOM || SP => false,
+ NEL => true,
+ _ => _isStandardCharacterAt(0),
+ };
+ }
+
+  /// Whether the current character begins a document indicator.
+ ///
+ /// If so, this sets the scanner's last match to that indicator.
+ bool get _isDocumentIndicator =>
+ _scanner.column == 0 &&
+ _isBlankOrEndAt(3) &&
+ (_scanner.matches('---') || _scanner.matches('...'));
+
+ /// Creates a scanner that scans [source].
+ Scanner(String source,
+ {Uri? sourceUrl, bool recover = false, ErrorListener? errorListener})
+ : _recover = recover,
+ _errorListener = errorListener,
+ _scanner = SpanScanner.eager(source, sourceUrl: sourceUrl);
+
+ /// Consumes and returns the next token.
+ Token scan() {
+ if (_streamEndProduced) throw StateError('Out of tokens.');
+ if (!_tokenAvailable) _fetchMoreTokens();
+
+ var token = _tokens.removeFirst();
+ _tokenAvailable = false;
+ _tokensParsed++;
+ _streamEndProduced = token.type == TokenType.streamEnd;
+ return token;
+ }
+
+ /// Consumes the next token and returns the one after that.
+ Token? advance() {
+ scan();
+ return peek();
+ }
+
+ /// Returns the next token without consuming it.
+ Token? peek() {
+ if (_streamEndProduced) return null;
+ if (!_tokenAvailable) _fetchMoreTokens();
+ return _tokens.first;
+ }
+
+ /// Ensures that [_tokens] contains at least one token which can be returned.
+ void _fetchMoreTokens() {
+ while (true) {
+ if (_tokens.isNotEmpty) {
+ _staleSimpleKeys();
+
+ // If there are no more tokens to fetch, break.
+ if (_tokens.last.type == TokenType.streamEnd) break;
+
+ // If the current token could be a simple key, we need to scan more
+ // tokens until we determine whether it is or not. Otherwise we might
+ // not emit the `KEY` token before we emit the value of the key.
+ if (!_simpleKeys
+ .any((key) => key != null && key.tokenNumber == _tokensParsed)) {
+ break;
+ }
+ }
+
+ _fetchNextToken();
+ }
+ _tokenAvailable = true;
+ }
+
+ /// The dispatcher for token fetchers.
+ void _fetchNextToken() {
+ if (!_streamStartProduced) {
+ _fetchStreamStart();
+ return;
+ }
+
+ _scanToNextToken();
+ _staleSimpleKeys();
+ _unrollIndent(_scanner.column);
+
+ if (_scanner.isDone) {
+ _fetchStreamEnd();
+ return;
+ }
+
+ if (_scanner.column == 0) {
+ if (_scanner.peekChar() == PERCENT) {
+ _fetchDirective();
+ return;
+ }
+
+ if (_isBlankOrEndAt(3)) {
+ if (_scanner.matches('---')) {
+ _fetchDocumentIndicator(TokenType.documentStart);
+ return;
+ }
+
+ if (_scanner.matches('...')) {
+ _fetchDocumentIndicator(TokenType.documentEnd);
+ return;
+ }
+ }
+ }
+
+ switch (_scanner.peekChar()) {
+ case LEFT_SQUARE:
+ _fetchFlowCollectionStart(TokenType.flowSequenceStart);
+ return;
+ case LEFT_CURLY:
+ _fetchFlowCollectionStart(TokenType.flowMappingStart);
+ return;
+ case RIGHT_SQUARE:
+ _fetchFlowCollectionEnd(TokenType.flowSequenceEnd);
+ return;
+ case RIGHT_CURLY:
+ _fetchFlowCollectionEnd(TokenType.flowMappingEnd);
+ return;
+ case COMMA:
+ _fetchFlowEntry();
+ return;
+ case ASTERISK:
+ _fetchAnchor(anchor: false);
+ return;
+ case AMPERSAND:
+ _fetchAnchor();
+ return;
+ case EXCLAMATION:
+ _fetchTag();
+ return;
+ case SINGLE_QUOTE:
+ _fetchFlowScalar(singleQuote: true);
+ return;
+ case DOUBLE_QUOTE:
+ _fetchFlowScalar();
+ return;
+ case VERTICAL_BAR:
+ if (!_inBlockContext) _invalidScalarCharacter();
+ _fetchBlockScalar(literal: true);
+ return;
+ case RIGHT_ANGLE:
+ if (!_inBlockContext) _invalidScalarCharacter();
+ _fetchBlockScalar();
+ return;
+ case PERCENT:
+ case AT:
+ case GRAVE_ACCENT:
+ _invalidScalarCharacter();
+ return;
+
+ // These characters may sometimes begin plain scalars.
+ case HYPHEN:
+ if (_isPlainCharAt(1)) {
+ _fetchPlainScalar();
+ } else {
+ _fetchBlockEntry();
+ }
+ return;
+ case QUESTION:
+ if (_isPlainCharAt(1)) {
+ _fetchPlainScalar();
+ } else {
+ _fetchKey();
+ }
+ return;
+ case COLON:
+ if (!_inBlockContext && _tokens.isNotEmpty) {
+ // If a colon follows a "JSON-like" value (an explicit map or list, or
+ // a quoted string) it isn't required to have whitespace after it
+ // since it unambiguously describes a map.
+ var token = _tokens.last;
+ if (token.type == TokenType.flowSequenceEnd ||
+ token.type == TokenType.flowMappingEnd ||
+ (token.type == TokenType.scalar &&
+ (token as ScalarToken).style.isQuoted)) {
+ _fetchValue();
+ return;
+ }
+ }
+
+ if (_isPlainCharAt(1)) {
+ _fetchPlainScalar();
+ } else {
+ _fetchValue();
+ }
+ return;
+ default:
+ if (!_isNonBreak) _invalidScalarCharacter();
+
+ _fetchPlainScalar();
+ return;
+ }
+ }
+
+ /// Throws an error about a disallowed character.
+ void _invalidScalarCharacter() =>
+ _scanner.error('Unexpected character.', length: 1);
+
+  /// Checks the list of potential simple keys and removes the positions that
+  /// cannot contain simple keys anymore.
+ void _staleSimpleKeys() {
+ for (var i = 0; i < _simpleKeys.length; i++) {
+ var key = _simpleKeys[i];
+ if (key == null) continue;
+
+ // libyaml requires that all simple keys be a single line and no longer
+ // than 1024 characters. However, in section 7.4.2 of the spec
+ // (http://yaml.org/spec/1.2/spec.html#id2790832), these restrictions are
+ // only applied when the curly braces are omitted. It's difficult to
+ // retain enough context to know which keys need to have the restriction
+ // placed on them, so for now we go the other direction and allow
+ // everything but multiline simple keys in a block context.
+ if (!_inBlockContext) continue;
+
+ if (key.line == _scanner.line) continue;
+
+ if (key.required) {
+ _reportError(YamlException("Expected ':'.", _scanner.emptySpan));
+ _tokens.insert(key.tokenNumber - _tokensParsed,
+ Token(TokenType.key, key.location.pointSpan() as FileSpan));
+ }
+
+ _simpleKeys[i] = null;
+ }
+ }
+
+ /// Checks if a simple key may start at the current position and saves it if
+ /// so.
+ void _saveSimpleKey() {
+ // A simple key is required at the current position if the scanner is in the
+ // block context and the current column coincides with the indentation
+ // level.
+ var required = _inBlockContext && _indent == _scanner.column;
+
+ // A simple key is required only when it is the first token in the current
+ // line. Therefore it is always allowed. But we add a check anyway.
+ assert(_simpleKeyAllowed || !required);
+
+ if (!_simpleKeyAllowed) return;
+
+ // If the current position may start a simple key, save it.
+ _removeSimpleKey();
+ _simpleKeys[_simpleKeys.length - 1] = _SimpleKey(
+ _tokensParsed + _tokens.length,
+ _scanner.line,
+ _scanner.column,
+ _scanner.location,
+ required: required);
+ }
+
+ /// Removes a potential simple key at the current flow level.
+ void _removeSimpleKey() {
+ var key = _simpleKeys.last;
+ if (key != null && key.required) {
+ throw YamlException("Could not find expected ':' for simple key.",
+ key.location.pointSpan());
+ }
+
+ _simpleKeys[_simpleKeys.length - 1] = null;
+ }
+
+ /// Increases the flow level and resizes the simple key list.
+ void _increaseFlowLevel() {
+ _simpleKeys.add(null);
+ }
+
+ /// Decreases the flow level.
+ void _decreaseFlowLevel() {
+ if (_inBlockContext) return;
+ _simpleKeys.removeLast();
+ }
+
+ /// Pushes the current indentation level to the stack and sets the new level
+ /// if [column] is greater than [_indent].
+ ///
+ /// If it is, appends or inserts the specified token into [_tokens]. If
+  /// [tokenNumber] is provided, the new token is inserted at that position;
+  /// otherwise, it's appended at the end.
+ void _rollIndent(int column, TokenType type, SourceLocation location,
+ {int? tokenNumber}) {
+ if (!_inBlockContext) return;
+ if (_indent != -1 && _indent >= column) return;
+
+ // Push the current indentation level to the stack and set the new
+ // indentation level.
+ _indents.add(column);
+
+ // Create a token and insert it into the queue.
+ var token = Token(type, location.pointSpan() as FileSpan);
+ if (tokenNumber == null) {
+ _tokens.add(token);
+ } else {
+ _tokens.insert(tokenNumber - _tokensParsed, token);
+ }
+ }
+
+ /// Pops indentation levels from [_indents] until the current level becomes
+ /// less than or equal to [column].
+ ///
+ /// For each indentation level, appends a [TokenType.blockEnd] token.
+ void _unrollIndent(int column) {
+ if (!_inBlockContext) return;
+
+ while (_indent > column) {
+ _tokens.add(Token(TokenType.blockEnd, _scanner.emptySpan));
+ _indents.removeLast();
+ }
+ }
+
+ /// Pops indentation levels from [_indents] until the current level resets to
+ /// -1.
+ ///
+ /// For each indentation level, appends a [TokenType.blockEnd] token.
+ void _resetIndent() => _unrollIndent(-1);
+
+ /// Produces a [TokenType.streamStart] token.
+ void _fetchStreamStart() {
+ // Much of libyaml's initialization logic here is done in variable
+ // initializers instead.
+ _streamStartProduced = true;
+ _tokens.add(Token(TokenType.streamStart, _scanner.emptySpan));
+ }
+
+ /// Produces a [TokenType.streamEnd] token.
+ void _fetchStreamEnd() {
+ _resetIndent();
+ _removeSimpleKey();
+ _simpleKeyAllowed = false;
+ _tokens.add(Token(TokenType.streamEnd, _scanner.emptySpan));
+ }
+
+ /// Produces a [TokenType.versionDirective] or [TokenType.tagDirective]
+ /// token.
+ void _fetchDirective() {
+ _resetIndent();
+ _removeSimpleKey();
+ _simpleKeyAllowed = false;
+ var directive = _scanDirective();
+ if (directive != null) _tokens.add(directive);
+ }
+
+ /// Produces a [TokenType.documentStart] or [TokenType.documentEnd] token.
+ void _fetchDocumentIndicator(TokenType type) {
+ _resetIndent();
+ _removeSimpleKey();
+ _simpleKeyAllowed = false;
+
+ // Consume the indicator token.
+ var start = _scanner.state;
+ _scanner.readCodePoint();
+ _scanner.readCodePoint();
+ _scanner.readCodePoint();
+
+ _tokens.add(Token(type, _scanner.spanFrom(start)));
+ }
+
+ /// Produces a [TokenType.flowSequenceStart] or
+ /// [TokenType.flowMappingStart] token.
+ void _fetchFlowCollectionStart(TokenType type) {
+ _saveSimpleKey();
+ _increaseFlowLevel();
+ _simpleKeyAllowed = true;
+ _addCharToken(type);
+ }
+
+ /// Produces a [TokenType.flowSequenceEnd] or [TokenType.flowMappingEnd]
+ /// token.
+ void _fetchFlowCollectionEnd(TokenType type) {
+ _removeSimpleKey();
+ _decreaseFlowLevel();
+ _simpleKeyAllowed = false;
+ _addCharToken(type);
+ }
+
+ /// Produces a [TokenType.flowEntry] token.
+ void _fetchFlowEntry() {
+ _removeSimpleKey();
+ _simpleKeyAllowed = true;
+ _addCharToken(TokenType.flowEntry);
+ }
+
+ /// Produces a [TokenType.blockEntry] token.
+ void _fetchBlockEntry() {
+ if (_inBlockContext) {
+ if (!_simpleKeyAllowed) {
+ throw YamlException(
+ 'Block sequence entries are not allowed here.', _scanner.emptySpan);
+ }
+
+ _rollIndent(
+ _scanner.column, TokenType.blockSequenceStart, _scanner.location);
+ } else {
+ // It is an error for the '-' indicator to occur in the flow context, but
+ // we let the Parser detect and report it because it's able to point to
+ // the context.
+ }
+
+ _removeSimpleKey();
+ _simpleKeyAllowed = true;
+ _addCharToken(TokenType.blockEntry);
+ }
+
+ /// Produces the [TokenType.key] token.
+ void _fetchKey() {
+ if (_inBlockContext) {
+ if (!_simpleKeyAllowed) {
+ throw YamlException(
+ 'Mapping keys are not allowed here.', _scanner.emptySpan);
+ }
+
+ _rollIndent(
+ _scanner.column, TokenType.blockMappingStart, _scanner.location);
+ }
+
+ // Simple keys are allowed after `?` in a block context.
+ _simpleKeyAllowed = _inBlockContext;
+ _addCharToken(TokenType.key);
+ }
+
+ /// Produces the [TokenType.value] token.
+ void _fetchValue() {
+ var simpleKey = _simpleKeys.last;
+ if (simpleKey != null) {
+      // Add a [TokenType.key] token before the first token of the simple
+ // key so the parser knows that it's part of a key/value pair.
+ _tokens.insert(simpleKey.tokenNumber - _tokensParsed,
+ Token(TokenType.key, simpleKey.location.pointSpan() as FileSpan));
+
+ // In the block context, we may need to add the
+      // [TokenType.blockMappingStart] token.
+ _rollIndent(
+ simpleKey.column, TokenType.blockMappingStart, simpleKey.location,
+ tokenNumber: simpleKey.tokenNumber);
+
+ // Remove the simple key.
+ _simpleKeys[_simpleKeys.length - 1] = null;
+
+ // A simple key cannot follow another simple key.
+ _simpleKeyAllowed = false;
+ } else if (_inBlockContext) {
+ if (!_simpleKeyAllowed) {
+ throw YamlException(
+ 'Mapping values are not allowed here. Did you miss a colon '
+ 'earlier?',
+ _scanner.emptySpan);
+ }
+
+ // If we're here, we've found the ':' indicator following a complex key.
+
+ _rollIndent(
+ _scanner.column, TokenType.blockMappingStart, _scanner.location);
+ _simpleKeyAllowed = true;
+ } else if (_simpleKeyAllowed) {
+ // If we're here, we've found the ':' indicator with an empty key. This
+ // behavior differs from libyaml, which disallows empty implicit keys.
+ _simpleKeyAllowed = false;
+ _addCharToken(TokenType.key);
+ }
+
+ _addCharToken(TokenType.value);
+ }
+
+ /// Adds a token with [type] to [_tokens].
+ ///
+ /// The span of the new token is the current character.
+ void _addCharToken(TokenType type) {
+ var start = _scanner.state;
+ _scanner.readCodePoint();
+ _tokens.add(Token(type, _scanner.spanFrom(start)));
+ }
+
+ /// Produces a [TokenType.alias] or [TokenType.anchor] token.
+ void _fetchAnchor({bool anchor = true}) {
+ _saveSimpleKey();
+ _simpleKeyAllowed = false;
+ _tokens.add(_scanAnchor(anchor: anchor));
+ }
+
+ /// Produces a [TokenType.tag] token.
+ void _fetchTag() {
+ _saveSimpleKey();
+ _simpleKeyAllowed = false;
+ _tokens.add(_scanTag());
+ }
+
+ /// Produces a [TokenType.scalar] token with style [ScalarStyle.LITERAL] or
+ /// [ScalarStyle.FOLDED].
+ void _fetchBlockScalar({bool literal = false}) {
+ _removeSimpleKey();
+ _simpleKeyAllowed = true;
+ _tokens.add(_scanBlockScalar(literal: literal));
+ }
+
+ /// Produces a [TokenType.scalar] token with style [ScalarStyle.SINGLE_QUOTED]
+ /// or [ScalarStyle.DOUBLE_QUOTED].
+ void _fetchFlowScalar({bool singleQuote = false}) {
+ _saveSimpleKey();
+ _simpleKeyAllowed = false;
+ _tokens.add(_scanFlowScalar(singleQuote: singleQuote));
+ }
+
+ /// Produces a [TokenType.scalar] token with style [ScalarStyle.PLAIN].
+ void _fetchPlainScalar() {
+ _saveSimpleKey();
+ _simpleKeyAllowed = false;
+ _tokens.add(_scanPlainScalar());
+ }
+
+ /// Eats whitespace and comments until the next token is found.
+ void _scanToNextToken() {
+ var afterLineBreak = false;
+ while (true) {
+ // Allow the BOM to start a line.
+ if (_scanner.column == 0) _scanner.scan('\uFEFF');
+
+ // Eat whitespace.
+ //
+ // libyaml disallows tabs after "-", "?", or ":", but the spec allows
+ // them. See section 6.2: http://yaml.org/spec/1.2/spec.html#id2778241.
+ while (_scanner.peekChar() == SP ||
+ ((!_inBlockContext || !afterLineBreak) &&
+ _scanner.peekChar() == TAB)) {
+ _scanner.readChar();
+ }
+
+ if (_scanner.peekChar() == TAB) {
+ _scanner.error('Tab characters are not allowed as indentation.',
+ length: 1);
+ }
+
+ // Eat a comment until a line break.
+ _skipComment();
+
+ // If we're at a line break, eat it.
+ if (_isBreak) {
+ _skipLine();
+
+ // In the block context, a new line may start a simple key.
+ if (_inBlockContext) _simpleKeyAllowed = true;
+ afterLineBreak = true;
+ } else {
+ // Otherwise we've found a token.
+ break;
+ }
+ }
+ }
+
+ /// Scans a [TokenType.versionDirective] or [TokenType.tagDirective] token.
+ ///
+ /// %YAML 1.2 # a comment \n
+ /// ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+ /// %TAG !yaml! tag:yaml.org,2002: \n
+ /// ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+ Token? _scanDirective() {
+ var start = _scanner.state;
+
+ // Eat '%'.
+ _scanner.readChar();
+
+ Token token;
+ var name = _scanDirectiveName();
+ if (name == 'YAML') {
+ token = _scanVersionDirectiveValue(start);
+ } else if (name == 'TAG') {
+ token = _scanTagDirectiveValue(start);
+ } else {
+ warn('Warning: unknown directive.', _scanner.spanFrom(start));
+
+ // libyaml doesn't support unknown directives, but the spec says to ignore
+ // them and warn: http://yaml.org/spec/1.2/spec.html#id2781147.
+ while (!_isBreakOrEnd) {
+ _scanner.readCodePoint();
+ }
+
+ return null;
+ }
+
+ // Eat the rest of the line, including any comments.
+ _skipBlanks();
+ _skipComment();
+
+ if (!_isBreakOrEnd) {
+ throw YamlException('Expected comment or line break after directive.',
+ _scanner.spanFrom(start));
+ }
+
+ _skipLine();
+ return token;
+ }
+
+ /// Scans a directive name.
+ ///
+ /// %YAML 1.2 # a comment \n
+ /// ^^^^
+ /// %TAG !yaml! tag:yaml.org,2002: \n
+ /// ^^^
+ String _scanDirectiveName() {
+ // libyaml only allows word characters in directive names, but the spec
+ // disagrees: http://yaml.org/spec/1.2/spec.html#ns-directive-name.
+ var start = _scanner.position;
+ while (_isNonSpace) {
+ _scanner.readCodePoint();
+ }
+
+ var name = _scanner.substring(start);
+ if (name.isEmpty) {
+ throw YamlException('Expected directive name.', _scanner.emptySpan);
+ } else if (!_isBlankOrEnd) {
+ throw YamlException(
+ 'Unexpected character in directive name.', _scanner.emptySpan);
+ }
+
+ return name;
+ }
+
+ /// Scans the value of a version directive.
+ ///
+ /// %YAML 1.2 # a comment \n
+ /// ^^^^^^
+ Token _scanVersionDirectiveValue(LineScannerState start) {
+ _skipBlanks();
+
+ var major = _scanVersionDirectiveNumber();
+ _scanner.expect('.');
+ var minor = _scanVersionDirectiveNumber();
+
+ return VersionDirectiveToken(_scanner.spanFrom(start), major, minor);
+ }
+
+ /// Scans the version number of a version directive.
+ ///
+ /// %YAML 1.2 # a comment \n
+ /// ^
+ /// %YAML 1.2 # a comment \n
+ /// ^
+ int _scanVersionDirectiveNumber() {
+ var start = _scanner.position;
+ while (_isDigit) {
+ _scanner.readChar();
+ }
+
+ var number = _scanner.substring(start);
+ if (number.isEmpty) {
+ throw YamlException('Expected version number.', _scanner.emptySpan);
+ }
+
+ return int.parse(number);
+ }
+
+ /// Scans the value of a tag directive.
+ ///
+ /// %TAG !yaml! tag:yaml.org,2002: \n
+ /// ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+ Token _scanTagDirectiveValue(LineScannerState start) {
+ _skipBlanks();
+
+ var handle = _scanTagHandle(directive: true);
+ if (!_isBlank) {
+ throw YamlException('Expected whitespace.', _scanner.emptySpan);
+ }
+
+ _skipBlanks();
+
+ var prefix = _scanTagUri();
+ if (!_isBlankOrEnd) {
+ throw YamlException('Expected whitespace.', _scanner.emptySpan);
+ }
+
+ return TagDirectiveToken(_scanner.spanFrom(start), handle, prefix);
+ }
+
+ /// Scans a [TokenType.anchor] token.
+ Token _scanAnchor({bool anchor = true}) {
+ var start = _scanner.state;
+
+ // Eat the indicator character.
+ _scanner.readCodePoint();
+
+ // libyaml only allows word characters in anchor names, but the spec
+ // disagrees: http://yaml.org/spec/1.2/spec.html#ns-anchor-char.
+ var startPosition = _scanner.position;
+ while (_isAnchorChar) {
+ _scanner.readCodePoint();
+ }
+ var name = _scanner.substring(startPosition);
+
+ var next = _scanner.peekChar();
+ if (name.isEmpty ||
+ (!_isBlankOrEnd &&
+ next != QUESTION &&
+ next != COLON &&
+ next != COMMA &&
+ next != RIGHT_SQUARE &&
+ next != RIGHT_CURLY &&
+ next != PERCENT &&
+ next != AT &&
+ next != GRAVE_ACCENT)) {
+ throw YamlException(
+ 'Expected alphanumeric character.', _scanner.emptySpan);
+ }
+
+ if (anchor) {
+ return AnchorToken(_scanner.spanFrom(start), name);
+ } else {
+ return AliasToken(_scanner.spanFrom(start), name);
+ }
+ }
+
+ /// Scans a [TokenType.tag] token.
+ Token _scanTag() {
+ String? handle;
+ String suffix;
+ var start = _scanner.state;
+
+ // Check if the tag is in the canonical form.
+ if (_scanner.peekChar(1) == LEFT_ANGLE) {
+ // Eat '!<'.
+ _scanner.readChar();
+ _scanner.readChar();
+
+ handle = '';
+ suffix = _scanTagUri();
+
+ _scanner.expect('>');
+ } else {
+ // The tag has either the '!suffix' or the '!handle!suffix' form.
+
+ // First, try to scan a handle.
+ handle = _scanTagHandle();
+
+ if (handle.length > 1 && handle.startsWith('!') && handle.endsWith('!')) {
+ suffix = _scanTagUri(flowSeparators: false);
+ } else {
+ suffix = _scanTagUri(head: handle, flowSeparators: false);
+
+ // There was no explicit handle.
+ if (suffix.isEmpty) {
+ // This is the special '!' tag.
+ handle = null;
+ suffix = '!';
+ } else {
+ handle = '!';
+ }
+ }
+ }
+
+ // libyaml insists on whitespace after a tag, but example 7.2 indicates
+ // that it's not required: http://yaml.org/spec/1.2/spec.html#id2786720.
+
+ return TagToken(_scanner.spanFrom(start), handle, suffix);
+ }
+
+ /// Scans a tag handle.
+ String _scanTagHandle({bool directive = false}) {
+ _scanner.expect('!');
+
+ var buffer = StringBuffer('!');
+
+ // libyaml only allows word characters in tags, but the spec disagrees:
+ // http://yaml.org/spec/1.2/spec.html#ns-tag-char.
+ var start = _scanner.position;
+ while (_isTagChar) {
+ _scanner.readChar();
+ }
+ buffer.write(_scanner.substring(start));
+
+ if (_scanner.peekChar() == EXCLAMATION) {
+ buffer.writeCharCode(_scanner.readCodePoint());
+ } else {
+ // It's either the '!' tag or not really a tag handle. If it's a %TAG
+ // directive, it's an error. If it's a tag token, it must be part of a
+ // URI.
+ if (directive && buffer.toString() != '!') _scanner.expect('!');
+ }
+
+ return buffer.toString();
+ }
+
+ /// Scans a tag URI.
+ ///
+ /// [head] is the initial portion of the tag that's already been scanned.
+ /// [flowSeparators] indicates whether the tag URI can contain flow
+ /// separators.
+ String _scanTagUri({String? head, bool flowSeparators = true}) {
+ var length = head == null ? 0 : head.length;
+ var buffer = StringBuffer();
+
+ // Copy the head if needed.
+ //
+ // Note that we don't copy the leading '!' character.
+ if (length > 1) buffer.write(head!.substring(1));
+
+ // The set of characters that may appear in URI is as follows:
+ //
+ // '0'-'9', 'A'-'Z', 'a'-'z', '_', '-', ';', '/', '?', ':', '@', '&',
+ // '=', '+', '$', ',', '.', '!', '~', '*', '\'', '(', ')', '[', ']',
+ // '%'.
+ //
+ // In a shorthand tag annotation, the flow separators ',', '[', and ']' are
+ // disallowed.
+ var start = _scanner.position;
+ var char = _scanner.peekChar();
+ while (_isTagChar ||
+ (flowSeparators &&
+ (char == COMMA || char == LEFT_SQUARE || char == RIGHT_SQUARE))) {
+ _scanner.readChar();
+ char = _scanner.peekChar();
+ }
+
+ // libyaml manually decodes the URL, but we don't have to do that.
+ return Uri.decodeFull(_scanner.substring(start));
+ }
+
+ /// Scans a block scalar.
+ Token _scanBlockScalar({bool literal = false}) {
+ var start = _scanner.state;
+
+ // Eat the indicator '|' or '>'.
+ _scanner.readCodePoint();
+
+ // Check for a chomping indicator.
+ var chomping = _Chomping.clip;
+ var increment = 0;
+ var char = _scanner.peekChar();
+ if (char == PLUS || char == HYPHEN) {
+ chomping = char == PLUS ? _Chomping.keep : _Chomping.strip;
+ _scanner.readCodePoint();
+
+ // Check for an indentation indicator.
+ if (_isDigit) {
+ // Check that the indentation is greater than 0.
+ if (_scanner.peekChar() == NUMBER_0) {
+ throw YamlException('0 may not be used as an indentation indicator.',
+ _scanner.spanFrom(start));
+ }
+
+ increment = _scanner.readCodePoint() - NUMBER_0;
+ }
+ } else if (_isDigit) {
+ // Do the same as above, but in the opposite order.
+ if (_scanner.peekChar() == NUMBER_0) {
+ throw YamlException('0 may not be used as an indentation indicator.',
+ _scanner.spanFrom(start));
+ }
+
+ increment = _scanner.readCodePoint() - NUMBER_0;
+
+ char = _scanner.peekChar();
+ if (char == PLUS || char == HYPHEN) {
+ chomping = char == PLUS ? _Chomping.keep : _Chomping.strip;
+ _scanner.readCodePoint();
+ }
+ }
+
+ // Eat whitespace and comments to the end of the line.
+ _skipBlanks();
+ _skipComment();
+
+ // Check if we're at the end of the line.
+ if (!_isBreakOrEnd) {
+ throw YamlException(
+ 'Expected comment or line break.', _scanner.emptySpan);
+ }
+
+ _skipLine();
+
+ // If the block scalar has an explicit indentation indicator, add that to
+ // the current indentation to get the indentation level for the scalar's
+ // contents.
+ var indent = 0;
+ if (increment != 0) {
+ indent = _indent >= 0 ? _indent + increment : increment;
+ }
+
+ // Scan the leading line breaks to determine the indentation level if
+ // needed.
+ var pair = _scanBlockScalarBreaks(indent);
+ indent = pair.indent;
+ var trailingBreaks = pair.trailingBreaks;
+
+ // Scan the block scalar contents.
+ var buffer = StringBuffer();
+ var leadingBreak = '';
+ var leadingBlank = false;
+ var trailingBlank = false;
+ var end = _scanner.state;
+ while (_scanner.column == indent && !_scanner.isDone) {
+ // Check for a document indicator. libyaml doesn't do this, but the spec
+ // mandates it. See example 9.5:
+ // http://yaml.org/spec/1.2/spec.html#id2801606.
+ if (_isDocumentIndicator) break;
+
+ // We are at the beginning of a non-empty line.
+
+ // Is there trailing whitespace?
+ trailingBlank = _isBlank;
+
+ // Check if we need to fold the leading line break.
+ if (!literal &&
+ leadingBreak.isNotEmpty &&
+ !leadingBlank &&
+ !trailingBlank) {
+ // Do we need to join the lines with a space?
+ if (trailingBreaks.isEmpty) buffer.writeCharCode(SP);
+ } else {
+ buffer.write(leadingBreak);
+ }
+ leadingBreak = '';
+
+ // Append the remaining line breaks.
+ buffer.write(trailingBreaks);
+
+ // Is there leading whitespace?
+ leadingBlank = _isBlank;
+
+ var startPosition = _scanner.position;
+ while (!_isBreakOrEnd) {
+ _scanner.readCodePoint();
+ }
+ buffer.write(_scanner.substring(startPosition));
+ end = _scanner.state;
+
+ // libyaml always reads a line here, but this breaks on block scalars at
+ // the end of the document that end without newlines. See example 8.1:
+ // http://yaml.org/spec/1.2/spec.html#id2793888.
+ if (!_scanner.isDone) leadingBreak = _readLine();
+
+ // Eat the following indentation and spaces.
+ var pair = _scanBlockScalarBreaks(indent);
+ indent = pair.indent;
+ trailingBreaks = pair.trailingBreaks;
+ }
+
+ // Chomp the tail.
+ if (chomping != _Chomping.strip) buffer.write(leadingBreak);
+ if (chomping == _Chomping.keep) buffer.write(trailingBreaks);
+
+ return ScalarToken(_scanner.spanFrom(start, end), buffer.toString(),
+ literal ? ScalarStyle.LITERAL : ScalarStyle.FOLDED);
+ }
+
+ /// Scans indentation spaces and line breaks for a block scalar.
+ ///
+  /// Determines the indentation level if needed. Returns the new indentation
+ /// level and the text of the line breaks.
+ ({int indent, String trailingBreaks}) _scanBlockScalarBreaks(int indent) {
+ var maxIndent = 0;
+ var breaks = StringBuffer();
+
+ while (true) {
+ while ((indent == 0 || _scanner.column < indent) &&
+ _scanner.peekChar() == SP) {
+ _scanner.readChar();
+ }
+
+ if (_scanner.column > maxIndent) maxIndent = _scanner.column;
+
+ // libyaml throws an error here if a tab character is detected, but the
+ // spec treats tabs like any other non-space character. See example 8.2:
+ // http://yaml.org/spec/1.2/spec.html#id2794311.
+
+ if (!_isBreak) break;
+ breaks.write(_readLine());
+ }
+
+ if (indent == 0) {
+ indent = maxIndent;
+ if (indent < _indent + 1) indent = _indent + 1;
+
+ // libyaml forces indent to be at least 1 here, but that doesn't seem to
+ // be supported by the spec.
+ }
+
+ return (indent: indent, trailingBreaks: breaks.toString());
+ }
+
+  /// Scans a quoted scalar.
+ Token _scanFlowScalar({bool singleQuote = false}) {
+ var start = _scanner.state;
+ var buffer = StringBuffer();
+
+ // Eat the left quote.
+ _scanner.readChar();
+
+ while (true) {
+ // Check that there are no document indicators at the beginning of the
+ // line.
+ if (_isDocumentIndicator) {
+ _scanner.error('Unexpected document indicator.');
+ }
+
+ if (_scanner.isDone) {
+ throw YamlException('Unexpected end of file.', _scanner.emptySpan);
+ }
+
+ var leadingBlanks = false;
+ while (!_isBlankOrEnd) {
+ var char = _scanner.peekChar();
+ if (singleQuote &&
+ char == SINGLE_QUOTE &&
+ _scanner.peekChar(1) == SINGLE_QUOTE) {
+ // An escaped single quote.
+ _scanner.readChar();
+ _scanner.readChar();
+ buffer.writeCharCode(SINGLE_QUOTE);
+ } else if (char == (singleQuote ? SINGLE_QUOTE : DOUBLE_QUOTE)) {
+ // The closing quote.
+ break;
+ } else if (!singleQuote && char == BACKSLASH && _isBreakAt(1)) {
+ // An escaped newline.
+ _scanner.readChar();
+ _skipLine();
+ leadingBlanks = true;
+ break;
+ } else if (!singleQuote && char == BACKSLASH) {
+ var escapeStart = _scanner.state;
+
+ // An escape sequence.
+ int? codeLength;
+ switch (_scanner.peekChar(1)) {
+ case NUMBER_0:
+ buffer.writeCharCode(NULL);
+ break;
+ case LETTER_A:
+ buffer.writeCharCode(BELL);
+ break;
+ case LETTER_B:
+ buffer.writeCharCode(BACKSPACE);
+ break;
+ case LETTER_T:
+ case TAB:
+ buffer.writeCharCode(TAB);
+ break;
+ case LETTER_N:
+ buffer.writeCharCode(LF);
+ break;
+ case LETTER_V:
+ buffer.writeCharCode(VERTICAL_TAB);
+ break;
+ case LETTER_F:
+ buffer.writeCharCode(FORM_FEED);
+ break;
+ case LETTER_R:
+ buffer.writeCharCode(CR);
+ break;
+ case LETTER_E:
+ buffer.writeCharCode(ESCAPE);
+ break;
+ case SP:
+ case DOUBLE_QUOTE:
+ case SLASH:
+ case BACKSLASH:
+ // libyaml doesn't support an escaped forward slash, but it was
+ // added in YAML 1.2. See section 5.7:
+ // http://yaml.org/spec/1.2/spec.html#id2776092
+ buffer.writeCharCode(_scanner.peekChar(1)!);
+ break;
+ case LETTER_CAP_N:
+ buffer.writeCharCode(NEL);
+ break;
+ case UNDERSCORE:
+ buffer.writeCharCode(NBSP);
+ break;
+ case LETTER_CAP_L:
+ buffer.writeCharCode(LINE_SEPARATOR);
+ break;
+ case LETTER_CAP_P:
+ buffer.writeCharCode(PARAGRAPH_SEPARATOR);
+ break;
+ case LETTER_X:
+ codeLength = 2;
+ break;
+ case LETTER_U:
+ codeLength = 4;
+ break;
+ case LETTER_CAP_U:
+ codeLength = 8;
+ break;
+ default:
+ throw YamlException(
+ 'Unknown escape character.', _scanner.spanFrom(escapeStart));
+ }
+
+ _scanner.readChar();
+ _scanner.readChar();
+
+ if (codeLength != null) {
+ var value = 0;
+ for (var i = 0; i < codeLength; i++) {
+ if (!_isHex) {
+ _scanner.readChar();
+ throw YamlException(
+                    'Expected $codeLength-digit hexadecimal number.',
+ _scanner.spanFrom(escapeStart));
+ }
+
+ value = (value << 4) + _asHex(_scanner.readChar());
+ }
+
+ // Check the value and write the character.
+ if ((value >= 0xD800 && value <= 0xDFFF) || value > 0x10FFFF) {
+ throw YamlException('Invalid Unicode character escape code.',
+ _scanner.spanFrom(escapeStart));
+ }
+
+ buffer.writeCharCode(value);
+ }
+ } else {
+ buffer.writeCharCode(_scanner.readCodePoint());
+ }
+ }
+
+ // Check if we're at the end of a scalar.
+ if (_scanner.peekChar() == (singleQuote ? SINGLE_QUOTE : DOUBLE_QUOTE)) {
+ break;
+ }
+
+ var whitespace = StringBuffer();
+ var leadingBreak = '';
+ var trailingBreaks = StringBuffer();
+ while (_isBlank || _isBreak) {
+ if (_isBlank) {
+ // Consume a space or a tab.
+ if (!leadingBlanks) {
+ whitespace.writeCharCode(_scanner.readChar());
+ } else {
+ _scanner.readChar();
+ }
+ } else {
+ // Check if it's a first line break.
+ if (!leadingBlanks) {
+ whitespace.clear();
+ leadingBreak = _readLine();
+ leadingBlanks = true;
+ } else {
+ trailingBreaks.write(_readLine());
+ }
+ }
+ }
+
+ // Join the whitespace or fold line breaks.
+ if (leadingBlanks) {
+ if (leadingBreak.isNotEmpty && trailingBreaks.isEmpty) {
+ buffer.writeCharCode(SP);
+ } else {
+ buffer.write(trailingBreaks);
+ }
+ } else {
+ buffer.write(whitespace);
+ whitespace.clear();
+ }
+ }
+
+ // Eat the right quote.
+ _scanner.readChar();
+
+ return ScalarToken(_scanner.spanFrom(start), buffer.toString(),
+ singleQuote ? ScalarStyle.SINGLE_QUOTED : ScalarStyle.DOUBLE_QUOTED);
+ }
+
+ /// Scans a plain scalar.
+ Token _scanPlainScalar() {
+ var start = _scanner.state;
+ var end = _scanner.state;
+ var buffer = StringBuffer();
+ var leadingBreak = '';
+ var trailingBreaks = '';
+ var whitespace = StringBuffer();
+ var indent = _indent + 1;
+
+ while (true) {
+ // Check for a document indicator.
+ if (_isDocumentIndicator) break;
+
+ // Check for a comment.
+ if (_scanner.peekChar() == HASH) break;
+
+ if (_isPlainChar) {
+ // Join the whitespace or fold line breaks.
+ if (leadingBreak.isNotEmpty) {
+ if (trailingBreaks.isEmpty) {
+ buffer.writeCharCode(SP);
+ } else {
+ buffer.write(trailingBreaks);
+ }
+ leadingBreak = '';
+ trailingBreaks = '';
+ } else {
+ buffer.write(whitespace);
+ whitespace.clear();
+ }
+ }
+
+ // libyaml's notion of valid identifiers differs substantially from YAML
+      // 1.2's. We use [_isPlainChar] instead of libyaml's character checks
+      // here.
+ var startPosition = _scanner.position;
+ while (_isPlainChar) {
+ _scanner.readCodePoint();
+ }
+ buffer.write(_scanner.substring(startPosition));
+ end = _scanner.state;
+
+ // Is it the end?
+ if (!_isBlank && !_isBreak) break;
+
+ while (_isBlank || _isBreak) {
+ if (_isBlank) {
+          // Check for a tab character messing up the indentation.
+ if (leadingBreak.isNotEmpty &&
+ _scanner.column < indent &&
+ _scanner.peekChar() == TAB) {
+ _scanner.error('Expected a space but found a tab.', length: 1);
+ }
+
+ if (leadingBreak.isEmpty) {
+ whitespace.writeCharCode(_scanner.readChar());
+ } else {
+ _scanner.readChar();
+ }
+ } else {
+ // Check if it's a first line break.
+ if (leadingBreak.isEmpty) {
+ leadingBreak = _readLine();
+ whitespace.clear();
+ } else {
+ trailingBreaks = _readLine();
+ }
+ }
+ }
+
+ // Check the indentation level.
+ if (_inBlockContext && _scanner.column < indent) break;
+ }
+
+ // Allow a simple key after a plain scalar with leading blanks.
+ if (leadingBreak.isNotEmpty) _simpleKeyAllowed = true;
+
+ return ScalarToken(
+ _scanner.spanFrom(start, end), buffer.toString(), ScalarStyle.PLAIN);
+ }
+
+ /// Moves past the current line break, if there is one.
+ void _skipLine() {
+ var char = _scanner.peekChar();
+ if (char != CR && char != LF) return;
+ _scanner.readChar();
+ if (char == CR && _scanner.peekChar() == LF) _scanner.readChar();
+ }
+
+  /// Moves past the current line break and returns a newline.
+ String _readLine() {
+ var char = _scanner.peekChar();
+
+ // libyaml supports NEL, PS, and LS characters as line separators, but this
+ // is explicitly forbidden in section 5.4 of the YAML spec.
+ if (char != CR && char != LF) {
+ throw YamlException('Expected newline.', _scanner.emptySpan);
+ }
+
+ _scanner.readChar();
+ // CR LF | CR | LF -> LF
+ if (char == CR && _scanner.peekChar() == LF) _scanner.readChar();
+ return '\n';
+ }
+
+ // Returns whether the character at [offset] is whitespace.
+ bool _isBlankAt(int offset) {
+ var char = _scanner.peekChar(offset);
+ return char == SP || char == TAB;
+ }
+
+ // Returns whether the character at [offset] is a line break.
+ bool _isBreakAt(int offset) {
+ // Libyaml considers NEL, LS, and PS to be line breaks as well, but that's
+ // contrary to the spec.
+ var char = _scanner.peekChar(offset);
+ return char == CR || char == LF;
+ }
+
+ // Returns whether the character at [offset] is whitespace or past the end of
+ // the source.
+ bool _isBlankOrEndAt(int offset) {
+ var char = _scanner.peekChar(offset);
+ return char == null ||
+ char == SP ||
+ char == TAB ||
+ char == CR ||
+ char == LF;
+ }
+
+ /// Returns whether the character at [offset] is a plain character.
+ ///
+ /// See http://yaml.org/spec/1.2/spec.html#ns-plain-char(c).
+ bool _isPlainCharAt(int offset) {
+ switch (_scanner.peekChar(offset)) {
+ case COLON:
+ return _isPlainSafeAt(offset + 1);
+ case HASH:
+ var previous = _scanner.peekChar(offset - 1);
+ return previous != SP && previous != TAB;
+ default:
+ return _isPlainSafeAt(offset);
+ }
+ }
+
+ /// Returns whether the character at [offset] is a plain-safe character.
+ ///
+ /// See http://yaml.org/spec/1.2/spec.html#ns-plain-safe(c).
+ bool _isPlainSafeAt(int offset) {
+ var char = _scanner.peekChar(offset);
+ return switch (char) {
+ null => false,
+ COMMA ||
+ LEFT_SQUARE ||
+ RIGHT_SQUARE ||
+ LEFT_CURLY ||
+ RIGHT_CURLY =>
+ // These characters are delimiters in a flow context and thus are only
+ // safe in a block context.
+ _inBlockContext,
+ SP || TAB || LF || CR || BOM => false,
+ NEL => true,
+ _ => _isStandardCharacterAt(offset)
+ };
+ }
+
+ bool _isStandardCharacterAt(int offset) {
+ var first = _scanner.peekChar(offset);
+ if (first == null) return false;
+
+ if (isHighSurrogate(first)) {
+ var next = _scanner.peekChar(offset + 1);
+ // A surrogate pair encodes code points from U+010000 to U+10FFFF, so it
+ // must be a standard character.
+ return next != null && isLowSurrogate(next);
+ }
+
+ return _isStandardCharacter(first);
+ }
+
+ bool _isStandardCharacter(int char) =>
+ (char >= 0x0020 && char <= 0x007E) ||
+ (char >= 0x00A0 && char <= 0xD7FF) ||
+ (char >= 0xE000 && char <= 0xFFFD);
+
+  /// Returns the hexadecimal value of [char].
+ int _asHex(int char) {
+ if (char <= NUMBER_9) return char - NUMBER_0;
+ if (char <= LETTER_CAP_F) return 10 + char - LETTER_CAP_A;
+ return 10 + char - LETTER_A;
+ }
+
+ /// Moves the scanner past any blank characters.
+ void _skipBlanks() {
+ while (_isBlank) {
+ _scanner.readChar();
+ }
+ }
+
+ /// Moves the scanner past a comment, if one starts at the current position.
+ void _skipComment() {
+ if (_scanner.peekChar() != HASH) return;
+ while (!_isBreakOrEnd) {
+ _scanner.readChar();
+ }
+ }
+
+ /// Reports a [YamlException] to [_errorListener] if [_recover] is true,
+ /// otherwise throws the exception.
+ void _reportError(YamlException exception) {
+ if (!_recover) {
+ throw exception;
+ }
+ _errorListener?.onError(exception);
+ }
+}
+
+/// A record of the location of a potential simple key.
+class _SimpleKey {
+ /// The index of the token that begins the simple key.
+ ///
+ /// This is the index relative to all tokens emitted, rather than relative to
+ /// [location].
+ final int tokenNumber;
+
+ /// The source location of the beginning of the simple key.
+ ///
+ /// This is used for error reporting and for determining when a simple key is
+ /// no longer on the current line.
+ final SourceLocation location;
+
+ /// The line on which the key appears.
+ ///
+ /// We could get this from [location], but that requires a binary search
+ /// whereas this is O(1).
+ final int line;
+
+ /// The column on which the key appears.
+ ///
+ /// We could get this from [location], but that requires a binary search
+ /// whereas this is O(1).
+ final int column;
+
+ /// Whether this key must exist for the document to be scanned.
+ final bool required;
+
+ _SimpleKey(
+ this.tokenNumber,
+ this.line,
+ this.column,
+ this.location, {
+ required this.required,
+ });
+}
+
+/// The ways to handle trailing whitespace for a block scalar.
+///
+/// See http://yaml.org/spec/1.2/spec.html#id2794534.
+enum _Chomping {
+ /// All trailing whitespace is discarded.
+ strip,
+
+ /// A single trailing newline is retained.
+ clip,
+
+ /// All trailing whitespace is preserved.
+ keep
+}
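
As context for the chomping modes defined above, here is a minimal sketch (not part of the diff) of how the `-`, default, and `+` indicators on block scalars surface through the package's public `loadYaml` API; the one-line documents are invented for illustration:

```
import 'package:yaml/yaml.dart';

void main() {
  // strip ('-'): all trailing line breaks are discarded.
  print(loadYaml('s: |-\n  text\n\n')['s']); // value is "text"

  // clip (default): exactly one trailing line break is kept.
  print(loadYaml('s: |\n  text\n\n')['s']); // value is "text\n"

  // keep ('+'): every trailing line break is preserved.
  print(loadYaml('s: |+\n  text\n\n')['s']); // value is "text\n\n"
}
```
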
diff --git a/pkgs/yaml/lib/src/style.dart b/pkgs/yaml/lib/src/style.dart
new file mode 100644
index 0000000..96c3b94
--- /dev/null
+++ b/pkgs/yaml/lib/src/style.dart
@@ -0,0 +1,79 @@
+// Copyright (c) 2014, the Dart project authors.
+// Copyright (c) 2006, Kirill Simonov.
+//
+// Use of this source code is governed by an MIT-style
+// license that can be found in the LICENSE file or at
+// https://opensource.org/licenses/MIT.
+
+// ignore_for_file: constant_identifier_names
+
+import 'yaml_node.dart';
+
+/// An enum of source scalar styles.
+class ScalarStyle {
+ /// No source style was specified.
+ ///
+ /// This usually indicates a scalar constructed with [YamlScalar.wrap].
+ static const ANY = ScalarStyle._('ANY');
+
+ /// The plain scalar style, unquoted and without a prefix.
+ ///
+ /// See http://yaml.org/spec/1.2/spec.html#style/flow/plain.
+ static const PLAIN = ScalarStyle._('PLAIN');
+
+ /// The literal scalar style, with a `|` prefix.
+ ///
+ /// See http://yaml.org/spec/1.2/spec.html#id2795688.
+ static const LITERAL = ScalarStyle._('LITERAL');
+
+ /// The folded scalar style, with a `>` prefix.
+ ///
+ /// See http://yaml.org/spec/1.2/spec.html#id2796251.
+ static const FOLDED = ScalarStyle._('FOLDED');
+
+ /// The single-quoted scalar style.
+ ///
+ /// See http://yaml.org/spec/1.2/spec.html#style/flow/single-quoted.
+ static const SINGLE_QUOTED = ScalarStyle._('SINGLE_QUOTED');
+
+ /// The double-quoted scalar style.
+ ///
+ /// See http://yaml.org/spec/1.2/spec.html#style/flow/double-quoted.
+ static const DOUBLE_QUOTED = ScalarStyle._('DOUBLE_QUOTED');
+
+ final String name;
+
+ /// Whether this is a quoted style ([SINGLE_QUOTED] or [DOUBLE_QUOTED]).
+ bool get isQuoted => this == SINGLE_QUOTED || this == DOUBLE_QUOTED;
+
+ const ScalarStyle._(this.name);
+
+ @override
+ String toString() => name;
+}
+
+/// An enum of collection styles.
+class CollectionStyle {
+ /// No source style was specified.
+ ///
+ /// This usually indicates a collection constructed with [YamlList.wrap] or
+ /// [YamlMap.wrap].
+ static const ANY = CollectionStyle._('ANY');
+
+ /// The indentation-based block style.
+ ///
+ /// See http://yaml.org/spec/1.2/spec.html#id2797293.
+ static const BLOCK = CollectionStyle._('BLOCK');
+
+  /// The delimiter-based flow style.
+ ///
+ /// See http://yaml.org/spec/1.2/spec.html#id2790088.
+ static const FLOW = CollectionStyle._('FLOW');
+
+ final String name;
+
+ const CollectionStyle._(this.name);
+
+ @override
+ String toString() => name;
+}
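
A brief, hedged illustration of how these style constants reach callers: parsed nodes expose the style they were written in. The one-line document below is made up for the example:

```
import 'package:yaml/yaml.dart';

void main() {
  var doc = loadYamlNode('key: "value"') as YamlMap;

  // The mapping uses the indentation-based block style.
  print(doc.style); // BLOCK

  // The value was written as a double-quoted scalar.
  print((doc.nodes['key'] as YamlScalar).style); // DOUBLE_QUOTED
}
```
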
diff --git a/pkgs/yaml/lib/src/token.dart b/pkgs/yaml/lib/src/token.dart
new file mode 100644
index 0000000..7d5d6bc
--- /dev/null
+++ b/pkgs/yaml/lib/src/token.dart
@@ -0,0 +1,158 @@
+// Copyright (c) 2014, the Dart project authors.
+// Copyright (c) 2006, Kirill Simonov.
+//
+// Use of this source code is governed by an MIT-style
+// license that can be found in the LICENSE file or at
+// https://opensource.org/licenses/MIT.
+
+import 'package:source_span/source_span.dart';
+
+import 'scanner.dart';
+import 'style.dart';
+
+/// A token emitted by a [Scanner].
+class Token {
+ final TokenType type;
+ final FileSpan span;
+
+ Token(this.type, this.span);
+
+ @override
+ String toString() => type.toString();
+}
+
+/// A token representing a `%YAML` directive.
+class VersionDirectiveToken implements Token {
+ @override
+ TokenType get type => TokenType.versionDirective;
+ @override
+ final FileSpan span;
+
+ /// The declared major version of the document.
+ final int major;
+
+ /// The declared minor version of the document.
+ final int minor;
+
+ VersionDirectiveToken(this.span, this.major, this.minor);
+
+ @override
+ String toString() => 'VERSION_DIRECTIVE $major.$minor';
+}
+
+/// A token representing a `%TAG` directive.
+class TagDirectiveToken implements Token {
+ @override
+ TokenType get type => TokenType.tagDirective;
+ @override
+ final FileSpan span;
+
+ /// The tag handle used in the document.
+ final String handle;
+
+ /// The tag prefix that the handle maps to.
+ final String prefix;
+
+ TagDirectiveToken(this.span, this.handle, this.prefix);
+
+ @override
+ String toString() => 'TAG_DIRECTIVE $handle $prefix';
+}
+
+/// A token representing an anchor (`&foo`).
+class AnchorToken implements Token {
+ @override
+ TokenType get type => TokenType.anchor;
+ @override
+ final FileSpan span;
+
+ final String name;
+
+ AnchorToken(this.span, this.name);
+
+ @override
+ String toString() => 'ANCHOR $name';
+}
+
+/// A token representing an alias (`*foo`).
+class AliasToken implements Token {
+ @override
+ TokenType get type => TokenType.alias;
+ @override
+ final FileSpan span;
+
+ final String name;
+
+ AliasToken(this.span, this.name);
+
+ @override
+ String toString() => 'ALIAS $name';
+}
+
+/// A token representing a tag (`!foo`).
+class TagToken implements Token {
+ @override
+ TokenType get type => TokenType.tag;
+ @override
+ final FileSpan span;
+
+ /// The tag handle for named tags.
+ final String? handle;
+
+ /// The tag suffix.
+ final String suffix;
+
+ TagToken(this.span, this.handle, this.suffix);
+
+ @override
+ String toString() => 'TAG $handle $suffix';
+}
+
+/// A scalar value.
+class ScalarToken implements Token {
+ @override
+ TokenType get type => TokenType.scalar;
+ @override
+ final FileSpan span;
+
+  /// The unparsed contents of the value.
+ final String value;
+
+ /// The style of the scalar in the original source.
+ final ScalarStyle style;
+
+ ScalarToken(this.span, this.value, this.style);
+
+ @override
+ String toString() => 'SCALAR $style "$value"';
+}
+
+/// The types of [Token] objects.
+enum TokenType {
+ streamStart,
+ streamEnd,
+
+ versionDirective,
+ tagDirective,
+ documentStart,
+ documentEnd,
+
+ blockSequenceStart,
+ blockMappingStart,
+ blockEnd,
+
+ flowSequenceStart,
+ flowSequenceEnd,
+ flowMappingStart,
+ flowMappingEnd,
+
+ blockEntry,
+ flowEntry,
+ key,
+ value,
+
+ alias,
+ anchor,
+ tag,
+ scalar
+}
diff --git a/pkgs/yaml/lib/src/utils.dart b/pkgs/yaml/lib/src/utils.dart
new file mode 100644
index 0000000..0dc132f
--- /dev/null
+++ b/pkgs/yaml/lib/src/utils.dart
@@ -0,0 +1,40 @@
+// Copyright (c) 2013, the Dart project authors.
+// Copyright (c) 2006, Kirill Simonov.
+//
+// Use of this source code is governed by an MIT-style
+// license that can be found in the LICENSE file or at
+// https://opensource.org/licenses/MIT.
+
+import 'package:source_span/source_span.dart';
+
+/// Print a warning.
+///
+/// If [span] is passed, associates the warning with that span.
+void warn(String message, [SourceSpan? span]) =>
+ yamlWarningCallback(message, span);
+
+/// A callback for emitting a warning.
+///
+/// [message] is the text of the warning. If [span] is passed, it's the portion
+/// of the document that the warning is associated with and should be included
+/// in the printed warning.
+typedef YamlWarningCallback = void Function(String message, [SourceSpan? span]);
+
+/// A callback for emitting a warning.
+///
+/// In a very few cases, the YAML spec indicates that an implementation should
+/// emit a warning. To do so, it calls this callback. The default implementation
+/// prints a message using [print].
+// ignore: prefer_function_declarations_over_variables
+YamlWarningCallback yamlWarningCallback = (message, [SourceSpan? span]) {
+ // TODO(nweiz): Print to stderr with color when issue 6943 is fixed and
+ // dart:io is available.
+ if (span != null) message = span.message(message);
+ print(message);
+};
+
+/// Whether [codeUnit] is a UTF-16 high surrogate.
+bool isHighSurrogate(int codeUnit) => codeUnit >>> 10 == 0x36;
+
+/// Whether [codeUnit] is a UTF-16 low surrogate.
+bool isLowSurrogate(int codeUnit) => codeUnit >>> 10 == 0x37;
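
Since `yamlWarningCallback` above is a mutable top-level variable, callers can swap it out. A small sketch under that assumption; the `%FOO` directive below is an arbitrary input chosen to trigger the "unknown directive" warning:

```
import 'package:yaml/yaml.dart';

void main() {
  // Collect warnings instead of printing them.
  var warnings = <String>[];
  yamlWarningCallback = (message, [span]) {
    warnings.add(span == null ? message : span.message(message));
  };

  loadYaml('%FOO bar\n--- 42');
  print(warnings.length); // 1 (the unknown %FOO directive)
}
```
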
diff --git a/pkgs/yaml/lib/src/yaml_document.dart b/pkgs/yaml/lib/src/yaml_document.dart
new file mode 100644
index 0000000..da6aa1e
--- /dev/null
+++ b/pkgs/yaml/lib/src/yaml_document.dart
@@ -0,0 +1,71 @@
+// Copyright (c) 2014, the Dart project authors.
+// Copyright (c) 2006, Kirill Simonov.
+//
+// Use of this source code is governed by an MIT-style
+// license that can be found in the LICENSE file or at
+// https://opensource.org/licenses/MIT.
+
+import 'dart:collection';
+
+import 'package:source_span/source_span.dart';
+
+import 'yaml_node.dart';
+
+/// A YAML document, complete with metadata.
+class YamlDocument {
+ /// The contents of the document.
+ final YamlNode contents;
+
+ /// The span covering the entire document.
+ final SourceSpan span;
+
+ /// The version directive for the document, if any.
+ final VersionDirective? versionDirective;
+
+ /// The tag directives for the document.
+ final List<TagDirective> tagDirectives;
+
+ /// Whether the beginning of the document was implicit (versus explicit via
+  /// `---`).
+ final bool startImplicit;
+
+ /// Whether the end of the document was implicit (versus explicit via `...`).
+ final bool endImplicit;
+
+ /// Users of the library should not use this constructor.
+ YamlDocument.internal(this.contents, this.span, this.versionDirective,
+ List<TagDirective> tagDirectives,
+ {this.startImplicit = false, this.endImplicit = false})
+ : tagDirectives = UnmodifiableListView(tagDirectives);
+
+ @override
+ String toString() => contents.toString();
+}
+
+/// A directive indicating which version of YAML a document was written to.
+class VersionDirective {
+ /// The major version number.
+ final int major;
+
+ /// The minor version number.
+ final int minor;
+
+ VersionDirective(this.major, this.minor);
+
+ @override
+ String toString() => '%YAML $major.$minor';
+}
+
+/// A directive describing a custom tag handle.
+class TagDirective {
+ /// The handle for use in the document.
+ final String handle;
+
+ /// The prefix that the handle maps to.
+ final String prefix;
+
+ TagDirective(this.handle, this.prefix);
+
+ @override
+ String toString() => '%TAG $handle $prefix';
+}
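
For orientation, a hedged sketch of how this metadata is reached via `loadYamlDocument`; the document text is invented for the example:

```
import 'package:yaml/yaml.dart';

void main() {
  var doc = loadYamlDocument('%YAML 1.2\n--- foo\n...');
  print(doc.versionDirective); // %YAML 1.2
  print(doc.startImplicit);    // false: the document starts with an explicit '---'.
  print(doc.endImplicit);      // false: the document ends with an explicit '...'.
  print(doc.contents.value);   // foo
}
```
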
diff --git a/pkgs/yaml/lib/src/yaml_exception.dart b/pkgs/yaml/lib/src/yaml_exception.dart
new file mode 100644
index 0000000..7aa5389
--- /dev/null
+++ b/pkgs/yaml/lib/src/yaml_exception.dart
@@ -0,0 +1,13 @@
+// Copyright (c) 2013, the Dart project authors.
+// Copyright (c) 2006, Kirill Simonov.
+//
+// Use of this source code is governed by an MIT-style
+// license that can be found in the LICENSE file or at
+// https://opensource.org/licenses/MIT.
+
+import 'package:source_span/source_span.dart';
+
+/// An error thrown by the YAML processor.
+class YamlException extends SourceSpanFormatException {
+ YamlException(super.message, super.span);
+}
diff --git a/pkgs/yaml/lib/src/yaml_node.dart b/pkgs/yaml/lib/src/yaml_node.dart
new file mode 100644
index 0000000..bd17b6c
--- /dev/null
+++ b/pkgs/yaml/lib/src/yaml_node.dart
@@ -0,0 +1,191 @@
+// Copyright (c) 2012, the Dart project authors.
+// Copyright (c) 2006, Kirill Simonov.
+//
+// Use of this source code is governed by an MIT-style
+// license that can be found in the LICENSE file or at
+// https://opensource.org/licenses/MIT.
+
+import 'dart:collection' as collection;
+
+import 'package:collection/collection.dart';
+import 'package:source_span/source_span.dart';
+
+import 'event.dart';
+import 'null_span.dart';
+import 'style.dart';
+import 'yaml_node_wrapper.dart';
+
+/// An interface for parsed nodes from a YAML source tree.
+///
+/// [YamlMap]s and [YamlList]s implement this interface in addition to the
+/// normal [Map] and [List] interfaces, so any maps and lists will be
+/// [YamlNode]s regardless of how they're accessed.
+///
+/// Scalar values like strings and numbers, on the other hand, don't have this
+/// interface by default. Instead, they can be accessed as [YamlScalar]s via
+/// [YamlMap.nodes] or [YamlList.nodes].
+abstract class YamlNode {
+ /// The source span for this node.
+ ///
+ /// [SourceSpan.message] can be used to produce a human-friendly message about
+ /// this node.
+ SourceSpan get span => _span;
+ SourceSpan _span;
+
+ YamlNode._(this._span);
+
+ /// The inner value of this node.
+ ///
+ /// For [YamlScalar]s, this will return the wrapped value. For [YamlMap] and
+ /// [YamlList], it will return `this`, since they already implement [Map] and
+ /// [List], respectively.
+ dynamic get value;
+}
+
+/// A read-only [Map] parsed from YAML.
+class YamlMap extends YamlNode with collection.MapMixin, UnmodifiableMapMixin {
+ /// A view of `this` where the keys and values are guaranteed to be
+ /// [YamlNode]s.
+ ///
+ /// The key type is `dynamic` to allow values to be accessed using
+ /// non-[YamlNode] keys, but [Map.keys] and [Map.forEach] will always expose
+ /// them as [YamlNode]s. For example, for `{"foo": [1, 2, 3]}` [nodes] will be
+ /// a map from a [YamlScalar] to a [YamlList], but since the key type is
+ /// `dynamic` `map.nodes["foo"]` will still work.
+ final Map<dynamic, YamlNode> nodes;
+
+ /// The style used for the map in the original document.
+ final CollectionStyle style;
+
+ @override
+ Map get value => this;
+
+ @override
+ Iterable get keys => nodes.keys.map((node) => (node as YamlNode).value);
+
+ /// Creates an empty YamlMap.
+ ///
+ /// This map's [span] won't have useful location information. However, it will
+ /// have a reasonable implementation of [SourceSpan.message]. If [sourceUrl]
+ /// is passed, it's used as the [SourceSpan.sourceUrl].
+ ///
+ /// [sourceUrl] may be either a [String], a [Uri], or `null`.
+ factory YamlMap({Object? sourceUrl}) => YamlMapWrapper(const {}, sourceUrl);
+
+ /// Wraps a Dart map so that it can be accessed (recursively) like a
+ /// [YamlMap].
+ ///
+ /// Any [SourceSpan]s returned by this map or its children will be dummies
+ /// without useful location information. However, they will have a reasonable
+ /// implementation of [SourceSpan.message]. If [sourceUrl] is
+ /// passed, it's used as the [SourceSpan.sourceUrl].
+ ///
+ /// [sourceUrl] may be either a [String], a [Uri], or `null`.
+ factory YamlMap.wrap(Map dartMap,
+ {Object? sourceUrl, CollectionStyle style = CollectionStyle.ANY}) =>
+ YamlMapWrapper(dartMap, sourceUrl, style: style);
+
+ /// Users of the library should not use this constructor.
+ YamlMap.internal(Map<dynamic, YamlNode> nodes, super.span, this.style)
+ : nodes = UnmodifiableMapView<dynamic, YamlNode>(nodes),
+ super._();
+
+ @override
+ dynamic operator [](Object? key) => nodes[key]?.value;
+}
+
+// TODO(nweiz): Use UnmodifiableListMixin when issue 18970 is fixed.
+/// A read-only [List] parsed from YAML.
+class YamlList extends YamlNode with collection.ListMixin {
+ final List<YamlNode> nodes;
+
+ /// The style used for the list in the original document.
+ final CollectionStyle style;
+
+ @override
+ List get value => this;
+
+ @override
+ int get length => nodes.length;
+
+ @override
+ set length(int index) {
+ throw UnsupportedError('Cannot modify an unmodifiable List');
+ }
+
+ /// Creates an empty YamlList.
+ ///
+ /// This list's [span] won't have useful location information. However, it
+ /// will have a reasonable implementation of [SourceSpan.message]. If
+ /// [sourceUrl] is passed, it's used as the [SourceSpan.sourceUrl].
+ ///
+ /// [sourceUrl] may be either a [String], a [Uri], or `null`.
+ factory YamlList({Object? sourceUrl}) => YamlListWrapper(const [], sourceUrl);
+
+ /// Wraps a Dart list so that it can be accessed (recursively) like a
+ /// [YamlList].
+ ///
+ /// Any [SourceSpan]s returned by this list or its children will be dummies
+ /// without useful location information. However, they will have a reasonable
+ /// implementation of [SourceSpan.message]. If [sourceUrl] is
+ /// passed, it's used as the [SourceSpan.sourceUrl].
+ ///
+ /// [sourceUrl] may be either a [String], a [Uri], or `null`.
+ factory YamlList.wrap(List dartList,
+ {Object? sourceUrl, CollectionStyle style = CollectionStyle.ANY}) =>
+ YamlListWrapper(dartList, sourceUrl, style: style);
+
+ /// Users of the library should not use this constructor.
+ YamlList.internal(List<YamlNode> nodes, super.span, this.style)
+ : nodes = UnmodifiableListView<YamlNode>(nodes),
+ super._();
+
+ @override
+ dynamic operator [](int index) => nodes[index].value;
+
+ @override
+ void operator []=(int index, Object? value) {
+ throw UnsupportedError('Cannot modify an unmodifiable List');
+ }
+}
+
+/// A wrapped scalar value parsed from YAML.
+class YamlScalar extends YamlNode {
+ @override
+ final dynamic value;
+
+ /// The style used for the scalar in the original document.
+ final ScalarStyle style;
+
+ /// Wraps a Dart value in a [YamlScalar].
+ ///
+ /// This scalar's [span] won't have useful location information. However, it
+ /// will have a reasonable implementation of [SourceSpan.message]. If
+ /// [sourceUrl] is passed, it's used as the [SourceSpan.sourceUrl].
+ ///
+ /// [sourceUrl] may be either a [String], a [Uri], or `null`.
+ YamlScalar.wrap(this.value, {Object? sourceUrl, this.style = ScalarStyle.ANY})
+ : super._(NullSpan(sourceUrl)) {
+ ArgumentError.checkNotNull(style, 'style');
+ }
+
+ /// Users of the library should not use this constructor.
+ YamlScalar.internal(this.value, ScalarEvent scalar)
+ : style = scalar.style,
+ super._(scalar.span);
+
+ /// Users of the library should not use this constructor.
+ YamlScalar.internalWithSpan(this.value, SourceSpan span)
+ : style = ScalarStyle.ANY,
+ super._(span);
+
+ @override
+ String toString() => value.toString();
+}
+
+/// Sets the source span of a [YamlNode].
+///
+/// This method is not exposed publicly.
+void setSpan(YamlNode node, SourceSpan span) {
+ node._span = span;
+}
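
As the doc comments above note, `nodes` keeps the `YamlNode` wrappers (and thus source spans) while plain indexing unwraps to Dart values. A short sketch, with an invented two-line document:

```
import 'package:yaml/yaml.dart';

void main() {
  var map = loadYamlNode('name: yaml\nversion: 3.1.3') as YamlMap;

  // Indexing unwraps to plain Dart values...
  print(map['name']); // yaml

  // ...while `nodes` exposes the YamlNode wrappers, including source spans.
  var version = map.nodes['version']!;
  print(version.span.message('version declared here'));
}
```
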
diff --git a/pkgs/yaml/lib/src/yaml_node_wrapper.dart b/pkgs/yaml/lib/src/yaml_node_wrapper.dart
new file mode 100644
index 0000000..5250844
--- /dev/null
+++ b/pkgs/yaml/lib/src/yaml_node_wrapper.dart
@@ -0,0 +1,189 @@
+// Copyright (c) 2014, the Dart project authors.
+// Copyright (c) 2006, Kirill Simonov.
+//
+// Use of this source code is governed by an MIT-style
+// license that can be found in the LICENSE file or at
+// https://opensource.org/licenses/MIT.
+
+import 'dart:collection';
+
+import 'package:collection/collection.dart' as pkg_collection;
+import 'package:source_span/source_span.dart';
+
+import 'null_span.dart';
+import 'style.dart';
+import 'yaml_node.dart';
+
+/// A wrapper that makes a normal Dart map behave like a [YamlMap].
+class YamlMapWrapper extends MapBase
+ with pkg_collection.UnmodifiableMapMixin
+ implements YamlMap {
+ @override
+ final CollectionStyle style;
+
+ final Map _dartMap;
+
+ @override
+ final SourceSpan span;
+
+ @override
+ final Map<dynamic, YamlNode> nodes;
+
+ @override
+ Map get value => this;
+
+ @override
+ Iterable get keys => _dartMap.keys;
+
+ YamlMapWrapper(Map dartMap, Object? sourceUrl,
+ {CollectionStyle style = CollectionStyle.ANY})
+ : this._(dartMap, NullSpan(sourceUrl), style: style);
+
+ YamlMapWrapper._(Map dartMap, this.span, {this.style = CollectionStyle.ANY})
+ : _dartMap = dartMap,
+ nodes = _YamlMapNodes(dartMap, span) {
+ ArgumentError.checkNotNull(style, 'style');
+ }
+
+ @override
+ dynamic operator [](Object? key) {
+ var value = _dartMap[key];
+ if (value is Map) return YamlMapWrapper._(value, span);
+ if (value is List) return YamlListWrapper._(value, span);
+ return value;
+ }
+
+ @override
+ int get hashCode => _dartMap.hashCode;
+
+ @override
+ bool operator ==(Object other) =>
+ other is YamlMapWrapper && other._dartMap == _dartMap;
+}
+
+/// The implementation of [YamlMapWrapper.nodes] as a wrapper around the Dart
+/// map.
+class _YamlMapNodes extends MapBase<dynamic, YamlNode>
+ with pkg_collection.UnmodifiableMapMixin<dynamic, YamlNode> {
+ final Map _dartMap;
+
+ final SourceSpan _span;
+
+ @override
+ Iterable get keys =>
+ _dartMap.keys.map((key) => YamlScalar.internalWithSpan(key, _span));
+
+ _YamlMapNodes(this._dartMap, this._span);
+
+ @override
+ YamlNode? operator [](Object? key) {
+    // If the key is a [YamlScalar], unwrap it so the lookup uses its value.
+ if (key is YamlScalar) key = key.value;
+ if (!_dartMap.containsKey(key)) return null;
+ return _nodeForValue(_dartMap[key], _span);
+ }
+
+ @override
+ int get hashCode => _dartMap.hashCode;
+
+ @override
+ bool operator ==(Object other) =>
+ other is _YamlMapNodes && other._dartMap == _dartMap;
+}
+
+// TODO(nweiz): Use UnmodifiableListMixin when issue 18970 is fixed.
+/// A wrapper that makes a normal Dart list behave like a [YamlList].
+class YamlListWrapper extends ListBase implements YamlList {
+ @override
+ final CollectionStyle style;
+
+ final List _dartList;
+
+ @override
+ final SourceSpan span;
+
+ @override
+ final List<YamlNode> nodes;
+
+ @override
+ List get value => this;
+
+ @override
+ int get length => _dartList.length;
+
+ @override
+ set length(int index) {
+ throw UnsupportedError('Cannot modify an unmodifiable List.');
+ }
+
+ YamlListWrapper(List dartList, Object? sourceUrl,
+ {CollectionStyle style = CollectionStyle.ANY})
+ : this._(dartList, NullSpan(sourceUrl), style: style);
+
+ YamlListWrapper._(List dartList, this.span,
+ {this.style = CollectionStyle.ANY})
+ : _dartList = dartList,
+ nodes = _YamlListNodes(dartList, span) {
+ ArgumentError.checkNotNull(style, 'style');
+ }
+
+ @override
+ dynamic operator [](int index) {
+ var value = _dartList[index];
+ if (value is Map) return YamlMapWrapper._(value, span);
+ if (value is List) return YamlListWrapper._(value, span);
+ return value;
+ }
+
+ @override
+ void operator []=(int index, Object? value) {
+ throw UnsupportedError('Cannot modify an unmodifiable List.');
+ }
+
+ @override
+ int get hashCode => _dartList.hashCode;
+
+ @override
+ bool operator ==(Object other) =>
+ other is YamlListWrapper && other._dartList == _dartList;
+}
+
+// TODO(nweiz): Use UnmodifiableListMixin when issue 18970 is fixed.
+/// The implementation of [YamlListWrapper.nodes] as a wrapper around the Dart
+/// list.
+class _YamlListNodes extends ListBase<YamlNode> {
+ final List _dartList;
+
+ final SourceSpan _span;
+
+ @override
+ int get length => _dartList.length;
+
+ @override
+ set length(int index) {
+ throw UnsupportedError('Cannot modify an unmodifiable List.');
+ }
+
+ _YamlListNodes(this._dartList, this._span);
+
+ @override
+ YamlNode operator [](int index) => _nodeForValue(_dartList[index], _span);
+
+ @override
+ void operator []=(int index, Object? value) {
+ throw UnsupportedError('Cannot modify an unmodifiable List.');
+ }
+
+ @override
+ int get hashCode => _dartList.hashCode;
+
+ @override
+ bool operator ==(Object other) =>
+ other is _YamlListNodes && other._dartList == _dartList;
+}
+
+YamlNode _nodeForValue(Object? value, SourceSpan span) {
+ if (value is Map) return YamlMapWrapper._(value, span);
+ if (value is List) return YamlListWrapper._(value, span);
+ return YamlScalar.internalWithSpan(value, span);
+}
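
A quick sketch of what these wrappers enable, per the `YamlMap.wrap` docs: an ordinary Dart structure can be traversed as if it were parsed YAML, with placeholder (`NullSpan`) spans instead of real locations:

```
import 'package:yaml/yaml.dart';

void main() {
  var wrapped = YamlMap.wrap({
    'dependencies': ['collection', 'source_span'],
  });

  // Nested collections are wrapped recursively.
  print(wrapped['dependencies'][1]);                // source_span
  print(wrapped.nodes['dependencies'] is YamlList); // true
}
```
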
diff --git a/pkgs/yaml/lib/yaml.dart b/pkgs/yaml/lib/yaml.dart
new file mode 100644
index 0000000..26cc9b8
--- /dev/null
+++ b/pkgs/yaml/lib/yaml.dart
@@ -0,0 +1,126 @@
+// Copyright (c) 2012, the Dart project authors.
+// Copyright (c) 2006, Kirill Simonov.
+//
+// Use of this source code is governed by an MIT-style
+// license that can be found in the LICENSE file or at
+// https://opensource.org/licenses/MIT.
+
+import 'src/error_listener.dart';
+import 'src/loader.dart';
+import 'src/style.dart';
+import 'src/yaml_document.dart';
+import 'src/yaml_exception.dart';
+import 'src/yaml_node.dart';
+
+export 'src/style.dart';
+export 'src/utils.dart' show YamlWarningCallback, yamlWarningCallback;
+export 'src/yaml_document.dart';
+export 'src/yaml_exception.dart';
+export 'src/yaml_node.dart' hide setSpan;
+
+/// Loads a single document from a YAML string.
+///
+/// If the string contains more than one document, this throws a
+/// [YamlException]. In future releases, this will become an [ArgumentError].
+///
+/// The return value is mostly normal Dart objects. However, since YAML mappings
+/// support some key types that the default Dart map implementation doesn't
+/// (NaN, lists, and maps), all maps in the returned document are [YamlMap]s.
+/// These have a few small behavioral differences from the default Map
+/// implementation; for details, see the [YamlMap] class.
+///
+/// If [sourceUrl] is passed, it's used as the URL from which the YAML
+/// originated for error reporting.
+///
+/// If [recover] is true, the parser will attempt to recover from parse errors
+/// and may return invalid or synthetic nodes. If [errorListener] is also
+/// supplied, its onError method will be called for each recovered error. It is
+/// not valid to provide [errorListener] if [recover] is false.
+dynamic loadYaml(String yaml,
+ {Uri? sourceUrl, bool recover = false, ErrorListener? errorListener}) =>
+ loadYamlNode(yaml,
+ sourceUrl: sourceUrl,
+ recover: recover,
+ errorListener: errorListener)
+ .value;
+
+/// Loads a single document from a YAML string as a [YamlNode].
+///
+/// This is just like [loadYaml], except that where [loadYaml] would return a
+/// normal Dart value this returns a [YamlNode] instead. This allows the caller
+/// to be confident that the return value will always be a [YamlNode].
+YamlNode loadYamlNode(String yaml,
+ {Uri? sourceUrl, bool recover = false, ErrorListener? errorListener}) =>
+ loadYamlDocument(yaml,
+ sourceUrl: sourceUrl,
+ recover: recover,
+ errorListener: errorListener)
+ .contents;
+
+/// Loads a single document from a YAML string as a [YamlDocument].
+///
+/// This is just like [loadYaml], except that where [loadYaml] would return a
+/// normal Dart value this returns a [YamlDocument] instead. This allows the
+/// caller to access document metadata.
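+///
+/// A brief illustrative sketch (not from the original docs):
+///
+/// ```dart
+/// final document = loadYamlDocument('foo: bar');
+/// print(document.contents.value); // {foo: bar}
+/// ```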
+YamlDocument loadYamlDocument(String yaml,
+ {Uri? sourceUrl, bool recover = false, ErrorListener? errorListener}) {
+ var loader = Loader(yaml,
+ sourceUrl: sourceUrl, recover: recover, errorListener: errorListener);
+ var document = loader.load();
+ if (document == null) {
+ return YamlDocument.internal(YamlScalar.internalWithSpan(null, loader.span),
+ loader.span, null, const []);
+ }
+
+ var nextDocument = loader.load();
+ if (nextDocument != null) {
+ throw YamlException('Only expected one document.', nextDocument.span);
+ }
+
+ return document;
+}
+
+/// Loads a stream of documents from a YAML string.
+///
+/// The return value is mostly normal Dart objects. However, since YAML mappings
+/// support some key types that the default Dart map implementation doesn't
+/// (NaN, lists, and maps), all maps in the returned document are [YamlMap]s.
+/// These have a few small behavioral differences from the default Map
+/// implementation; for details, see the [YamlMap] class.
+///
+/// If [sourceUrl] is passed, it's used as the URL from which the YAML
+/// originated for error reporting.
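+///
+/// A small illustrative sketch (not part of the original docs): each
+/// `---`-separated document becomes one element of the returned [YamlList].
+///
+/// ```dart
+/// final docs = loadYamlStream('- 1\n---\n- 2');
+/// print(docs.length); // 2
+/// ```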
+YamlList loadYamlStream(String yaml, {Uri? sourceUrl}) {
+ var loader = Loader(yaml, sourceUrl: sourceUrl);
+
+ var documents = <YamlDocument>[];
+ var document = loader.load();
+ while (document != null) {
+ documents.add(document);
+ document = loader.load();
+ }
+
+ // TODO(jmesserly): the type on the `document` parameter is a workaround for:
+ // https://github.com/dart-lang/dev_compiler/issues/203
+ return YamlList.internal(
+ documents.map((YamlDocument document) => document.contents).toList(),
+ loader.span,
+ CollectionStyle.ANY);
+}
+
+/// Loads a stream of documents from a YAML string.
+///
+/// This is like [loadYamlStream], except that it returns [YamlDocument]s with
+/// metadata wrapping the document contents.
+List<YamlDocument> loadYamlDocuments(String yaml, {Uri? sourceUrl}) {
+ var loader = Loader(yaml, sourceUrl: sourceUrl);
+
+ var documents = <YamlDocument>[];
+ var document = loader.load();
+ while (document != null) {
+ documents.add(document);
+ document = loader.load();
+ }
+
+ return documents;
+}
diff --git a/pkgs/yaml/pubspec.yaml b/pkgs/yaml/pubspec.yaml
new file mode 100644
index 0000000..fb37436
--- /dev/null
+++ b/pkgs/yaml/pubspec.yaml
@@ -0,0 +1,21 @@
+name: yaml
+version: 3.1.3
+description: A parser for YAML, a human-friendly data serialization standard
+repository: https://github.com/dart-lang/tools/tree/main/pkgs/yaml
+
+topics:
+ - yaml
+ - config-format
+
+environment:
+ sdk: ^3.4.0
+
+dependencies:
+ collection: ^1.15.0
+ source_span: ^1.8.0
+ string_scanner: ^1.2.0
+
+dev_dependencies:
+ dart_flutter_team_lints: ^3.0.0
+ path: ^1.8.0
+ test: ^1.16.6
diff --git a/pkgs/yaml/test/span_test.dart b/pkgs/yaml/test/span_test.dart
new file mode 100644
index 0000000..03b7f9c
--- /dev/null
+++ b/pkgs/yaml/test/span_test.dart
@@ -0,0 +1,173 @@
+// Copyright (c) 2019, the Dart project authors.
+// Copyright (c) 2006, Kirill Simonov.
+//
+// Use of this source code is governed by an MIT-style
+// license that can be found in the LICENSE file or at
+// https://opensource.org/licenses/MIT.
+
+import 'dart:convert';
+
+import 'package:source_span/source_span.dart';
+import 'package:test/test.dart';
+import 'package:yaml/yaml.dart';
+
+void _expectSpan(SourceSpan source, String expected) {
+ final result = source.message('message');
+ printOnFailure("r'''\n$result'''");
+
+ expect(result, expected);
+}
+
+void main() {
+ late YamlMap yaml;
+
+ setUpAll(() {
+ yaml = loadYaml(const JsonEncoder.withIndent(' ').convert({
+ 'num': 42,
+ 'nested': {
+ 'null': null,
+ 'num': 42,
+ },
+ 'null': null,
+ })) as YamlMap;
+ });
+
+ test('first root key', () {
+ _expectSpan(
+ yaml.nodes['num']!.span,
+ r'''
+line 2, column 9: message
+ ╷
+2 │ "num": 42,
+ │ ^^
+ ╵''',
+ );
+ });
+
+  test('last root key', () {
+ _expectSpan(
+ yaml.nodes['null']!.span,
+ r'''
+line 7, column 10: message
+ ╷
+7 │ "null": null
+ │ ^^^^
+ ╵''',
+ );
+ });
+
+ group('nested', () {
+ late YamlMap nestedMap;
+
+ setUpAll(() {
+ nestedMap = yaml.nodes['nested'] as YamlMap;
+ });
+
+    test('first nested key', () {
+ _expectSpan(
+ nestedMap.nodes['null']!.span,
+ r'''
+line 4, column 11: message
+ ╷
+4 │ "null": null,
+ │ ^^^^
+ ╵''',
+ );
+ });
+
+    test('last nested key', () {
+ _expectSpan(
+ nestedMap.nodes['num']!.span,
+ r'''
+line 5, column 10: message
+ ╷
+5 │ "num": 42
+ │ ┌──────────^
+6 │ │ },
+ │ └─^
+ ╵''',
+ );
+ });
+ });
+
+ group('block', () {
+ late YamlList list, nestedList;
+
+ setUpAll(() {
+ const yamlStr = '''
+- foo
+-
+ - one
+ -
+ - three
+ -
+ - five
+ -
+-
+ a : b
+ c : d
+- bar
+''';
+
+ list = loadYaml(yamlStr) as YamlList;
+ nestedList = list.nodes[1] as YamlList;
+ });
+
+ test('root nodes span', () {
+ _expectSpan(list.nodes[0].span, r'''
+line 1, column 3: message
+ ╷
+1 │ - foo
+ │ ^^^
+ ╵''');
+
+ _expectSpan(list.nodes[1].span, r'''
+line 3, column 3: message
+ ╷
+3 │ ┌ - one
+4 │ │ -
+5 │ │ - three
+6 │ │ -
+7 │ │ - five
+8 │ └ -
+ ╵''');
+
+ _expectSpan(list.nodes[2].span, r'''
+line 10, column 3: message
+ ╷
+10 │ ┌ a : b
+11 │ └ c : d
+ ╵''');
+
+ _expectSpan(list.nodes[3].span, r'''
+line 12, column 3: message
+ ╷
+12 │ - bar
+ │ ^^^
+ ╵''');
+ });
+
+ test('null nodes span', () {
+ _expectSpan(nestedList.nodes[1].span, r'''
+line 4, column 3: message
+ ╷
+4 │ -
+ │ ^
+ ╵''');
+
+ _expectSpan(nestedList.nodes[3].span, r'''
+line 6, column 3: message
+ ╷
+6 │ -
+ │ ^
+ ╵''');
+
+ _expectSpan(nestedList.nodes[5].span, r'''
+line 8, column 3: message
+ ╷
+8 │ -
+ │ ^
+ ╵''');
+ });
+ });
+}
diff --git a/pkgs/yaml/test/utils.dart b/pkgs/yaml/test/utils.dart
new file mode 100644
index 0000000..372440a
--- /dev/null
+++ b/pkgs/yaml/test/utils.dart
@@ -0,0 +1,95 @@
+// Copyright (c) 2014, the Dart project authors.
+// Copyright (c) 2006, Kirill Simonov.
+//
+// Use of this source code is governed by an MIT-style
+// license that can be found in the LICENSE file or at
+// https://opensource.org/licenses/MIT.
+
+import 'package:test/test.dart';
+import 'package:yaml/src/equality.dart' as equality;
+import 'package:yaml/yaml.dart';
+
+/// A matcher that validates that a closure or Future throws a [YamlException].
+final Matcher throwsYamlException = throwsA(isA<YamlException>());
+
+/// Returns a matcher that asserts that the value equals [expected].
+///
+/// This handles recursive loops and considers `NaN` to equal itself.
+Matcher deepEquals(Object? expected) => predicate(
+ (actual) => equality.deepEquals(actual, expected), 'equals $expected');
+
+/// Constructs a new map using YAML's deep-equality semantics, optionally
+/// seeded from a normal Map.
+Map deepEqualsMap([Map? from]) {
+ var map = equality.deepEqualsMap<Object?, Object?>();
+ if (from != null) map.addAll(from);
+ return map;
+}
+
+/// Asserts that an error has the given message and starts at the given line/col.
+void expectErrorAtLineCol(
+ YamlException error, String message, int line, int col) {
+ expect(error.message, equals(message));
+ expect(error.span!.start.line, equals(line));
+ expect(error.span!.start.column, equals(col));
+}
+
+/// Asserts that a string containing a single YAML document produces a given
+/// value when loaded.
+void expectYamlLoads(Object? expected, String source) {
+ var actual = loadYaml(cleanUpLiteral(source));
+ expect(actual, deepEquals(expected));
+}
+
+/// Asserts that a string containing a stream of YAML documents produces a given
+/// list of values when loaded.
+void expectYamlStreamLoads(List expected, String source) {
+ var actual = loadYamlStream(cleanUpLiteral(source));
+ expect(actual, deepEquals(expected));
+}
+
+/// Asserts that a string containing a single YAML document throws a
+/// [YamlException].
+void expectYamlFails(String source) {
+ expect(() => loadYaml(cleanUpLiteral(source)), throwsYamlException);
+}
+
+/// Removes eight spaces of leading indentation from a multiline string.
+///
+/// Note that this is very sensitive to how the literals are styled. They should
+/// be:
+/// '''
+/// Text starts on own line. Lines up with subsequent lines.
+/// Lines are indented exactly 8 characters from the left margin.
+/// Close is on the same line.'''
+///
+/// This does nothing if text is only a single line.
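+///
+/// An illustrative example (not part of the original docs):
+///
+/// ```dart
+/// // Both lines in the literal below are indented by exactly eight spaces.
+/// cleanUpLiteral('        - a\n        - b'); // => '- a\n- b'
+/// ```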
+String cleanUpLiteral(String text) {
+ var lines = text.split('\n');
+ if (lines.length <= 1) return text;
+
+ for (var j = 0; j < lines.length; j++) {
+ if (lines[j].length > 8) {
+ lines[j] = lines[j].substring(8, lines[j].length);
+ } else {
+ lines[j] = '';
+ }
+ }
+
+ return lines.join('\n');
+}
+
+/// Indents each line of [text] so that, when passed to [cleanUpLiteral], it
+/// will produce output identical to [text].
+///
+/// This is useful for literals that need to include newlines but can't be
+/// conveniently represented as multi-line strings.
+String indentLiteral(String text) {
+ var lines = text.split('\n');
+ if (lines.length <= 1) return text;
+
+ for (var i = 0; i < lines.length; i++) {
+ lines[i] = ' ${lines[i]}';
+ }
+
+ return lines.join('\n');
+}
diff --git a/pkgs/yaml/test/yaml_node_wrapper_test.dart b/pkgs/yaml/test/yaml_node_wrapper_test.dart
new file mode 100644
index 0000000..637b778
--- /dev/null
+++ b/pkgs/yaml/test/yaml_node_wrapper_test.dart
@@ -0,0 +1,235 @@
+// Copyright (c) 2012, the Dart project authors.
+// Copyright (c) 2006, Kirill Simonov.
+//
+// Use of this source code is governed by an MIT-style
+// license that can be found in the LICENSE file or at
+// https://opensource.org/licenses/MIT.
+
+// ignore_for_file: avoid_dynamic_calls
+
+import 'package:source_span/source_span.dart';
+import 'package:test/test.dart';
+import 'package:yaml/yaml.dart';
+
+void main() {
+ test('YamlMap() with no sourceUrl', () {
+ var map = YamlMap();
+ expect(map, isEmpty);
+ expect(map.nodes, isEmpty);
+ expect(map.span, isNullSpan(isNull));
+ });
+
+ test('YamlMap() with a sourceUrl', () {
+ var map = YamlMap(sourceUrl: 'source');
+ expect(map.span, isNullSpan(Uri.parse('source')));
+ });
+
+ test('YamlList() with no sourceUrl', () {
+ var list = YamlList();
+ expect(list, isEmpty);
+ expect(list.nodes, isEmpty);
+ expect(list.span, isNullSpan(isNull));
+ });
+
+ test('YamlList() with a sourceUrl', () {
+ var list = YamlList(sourceUrl: 'source');
+ expect(list.span, isNullSpan(Uri.parse('source')));
+ });
+
+ test('YamlMap.wrap() with no sourceUrl', () {
+ var map = YamlMap.wrap({
+ 'list': [1, 2, 3],
+ 'map': {
+ 'foo': 'bar',
+ 'nested': [4, 5, 6]
+ },
+ 'scalar': 'value'
+ });
+
+ expect(
+ map,
+ equals({
+ 'list': [1, 2, 3],
+ 'map': {
+ 'foo': 'bar',
+ 'nested': [4, 5, 6]
+ },
+ 'scalar': 'value'
+ }));
+
+ expect(map.span, isNullSpan(isNull));
+ expect(map['list'], isA<YamlList>());
+ expect(map['list'].nodes[0], isA<YamlScalar>());
+ expect(map['list'].span, isNullSpan(isNull));
+ expect(map['map'], isA<YamlMap>());
+ expect(map['map'].nodes['foo'], isA<YamlScalar>());
+ expect(map['map']['nested'], isA<YamlList>());
+ expect(map['map'].span, isNullSpan(isNull));
+ expect(map.nodes['scalar'], isA<YamlScalar>());
+ expect(map.nodes['scalar']!.value, 'value');
+ expect(map.nodes['scalar']!.span, isNullSpan(isNull));
+ expect(map['scalar'], 'value');
+ expect(map.keys, unorderedEquals(['list', 'map', 'scalar']));
+ expect(map.nodes.keys, everyElement(isA<YamlScalar>()));
+ expect(map.nodes[YamlScalar.wrap('list')], equals([1, 2, 3]));
+ expect(map.style, equals(CollectionStyle.ANY));
+ expect((map.nodes['list'] as YamlList).style, equals(CollectionStyle.ANY));
+ expect((map.nodes['map'] as YamlMap).style, equals(CollectionStyle.ANY));
+ expect((map['map'].nodes['nested'] as YamlList).style,
+ equals(CollectionStyle.ANY));
+ });
+
+ test('YamlMap.wrap() with a sourceUrl', () {
+ var map = YamlMap.wrap({
+ 'list': [1, 2, 3],
+ 'map': {
+ 'foo': 'bar',
+ 'nested': [4, 5, 6]
+ },
+ 'scalar': 'value'
+ }, sourceUrl: 'source');
+
+ var source = Uri.parse('source');
+ expect(map.span, isNullSpan(source));
+ expect(map['list'].span, isNullSpan(source));
+ expect(map['map'].span, isNullSpan(source));
+ expect(map.nodes['scalar']!.span, isNullSpan(source));
+ });
+
+ test('YamlMap.wrap() with a sourceUrl and style', () {
+ var map = YamlMap.wrap({
+ 'list': [1, 2, 3],
+ 'map': {
+ 'foo': 'bar',
+ 'nested': [4, 5, 6]
+ },
+ 'scalar': 'value'
+ }, sourceUrl: 'source', style: CollectionStyle.BLOCK);
+
+ expect(map.style, equals(CollectionStyle.BLOCK));
+ expect((map.nodes['list'] as YamlList).style, equals(CollectionStyle.ANY));
+ expect((map.nodes['map'] as YamlMap).style, equals(CollectionStyle.ANY));
+ expect((map['map'].nodes['nested'] as YamlList).style,
+ equals(CollectionStyle.ANY));
+ });
+
+ test('YamlList.wrap() with no sourceUrl', () {
+ var list = YamlList.wrap([
+ [1, 2, 3],
+ {
+ 'foo': 'bar',
+ 'nested': [4, 5, 6]
+ },
+ 'value'
+ ]);
+
+ expect(
+ list,
+ equals([
+ [1, 2, 3],
+ {
+ 'foo': 'bar',
+ 'nested': [4, 5, 6]
+ },
+ 'value'
+ ]));
+
+ expect(list.span, isNullSpan(isNull));
+ expect(list[0], isA<YamlList>());
+ expect(list[0].nodes[0], isA<YamlScalar>());
+ expect(list[0].span, isNullSpan(isNull));
+ expect(list[1], isA<YamlMap>());
+ expect(list[1].nodes['foo'], isA<YamlScalar>());
+ expect(list[1]['nested'], isA<YamlList>());
+ expect(list[1].span, isNullSpan(isNull));
+ expect(list.nodes[2], isA<YamlScalar>());
+ expect(list.nodes[2].value, 'value');
+ expect(list.nodes[2].span, isNullSpan(isNull));
+ expect(list[2], 'value');
+ expect(list.style, equals(CollectionStyle.ANY));
+ expect((list[0] as YamlList).style, equals(CollectionStyle.ANY));
+ expect((list[1] as YamlMap).style, equals(CollectionStyle.ANY));
+ expect((list[1]['nested'] as YamlList).style, equals(CollectionStyle.ANY));
+ });
+
+ test('YamlList.wrap() with a sourceUrl', () {
+ var list = YamlList.wrap([
+ [1, 2, 3],
+ {
+ 'foo': 'bar',
+ 'nested': [4, 5, 6]
+ },
+ 'value'
+ ], sourceUrl: 'source');
+
+ var source = Uri.parse('source');
+ expect(list.span, isNullSpan(source));
+ expect(list[0].span, isNullSpan(source));
+ expect(list[1].span, isNullSpan(source));
+ expect(list.nodes[2].span, isNullSpan(source));
+ });
+
+ test('YamlList.wrap() with a sourceUrl and style', () {
+ var list = YamlList.wrap([
+ [1, 2, 3],
+ {
+ 'foo': 'bar',
+ 'nested': [4, 5, 6]
+ },
+ 'value'
+ ], sourceUrl: 'source', style: CollectionStyle.FLOW);
+
+ expect(list.style, equals(CollectionStyle.FLOW));
+ expect((list[0] as YamlList).style, equals(CollectionStyle.ANY));
+ expect((list[1] as YamlMap).style, equals(CollectionStyle.ANY));
+ expect((list[1]['nested'] as YamlList).style, equals(CollectionStyle.ANY));
+ });
+
+ test('re-wrapped objects equal one another', () {
+ var list = YamlList.wrap([
+ [1, 2, 3],
+ {'foo': 'bar'}
+ ]);
+
+ expect(list[0] == list[0], isTrue);
+ expect(list[0].nodes == list[0].nodes, isTrue);
+ expect(list[0] == YamlList.wrap([1, 2, 3]), isFalse);
+ expect(list[1] == list[1], isTrue);
+ expect(list[1].nodes == list[1].nodes, isTrue);
+ expect(list[1] == YamlMap.wrap({'foo': 'bar'}), isFalse);
+ });
+
+ test('YamlScalar.wrap() with no sourceUrl', () {
+ var scalar = YamlScalar.wrap('foo');
+
+ expect(scalar.span, isNullSpan(isNull));
+ expect(scalar.value, 'foo');
+ expect(scalar.style, equals(ScalarStyle.ANY));
+ });
+
+ test('YamlScalar.wrap() with sourceUrl', () {
+ var scalar = YamlScalar.wrap('foo', sourceUrl: 'source');
+
+ var source = Uri.parse('source');
+ expect(scalar.span, isNullSpan(source));
+ });
+
+ test('YamlScalar.wrap() with sourceUrl and style', () {
+ var scalar = YamlScalar.wrap('foo',
+ sourceUrl: 'source', style: ScalarStyle.DOUBLE_QUOTED);
+
+ expect(scalar.style, equals(ScalarStyle.DOUBLE_QUOTED));
+ });
+}
+
+Matcher isNullSpan(Object sourceUrl) => predicate((SourceSpan span) {
+ expect(span, isA<SourceSpan>());
+ expect(span.length, equals(0));
+ expect(span.text, isEmpty);
+ expect(span.start, equals(span.end));
+ expect(span.start.offset, equals(0));
+ expect(span.start.line, equals(0));
+ expect(span.start.column, equals(0));
+ expect(span.sourceUrl, sourceUrl);
+ return true;
+ });
diff --git a/pkgs/yaml/test/yaml_test.dart b/pkgs/yaml/test/yaml_test.dart
new file mode 100644
index 0000000..3b5b77d
--- /dev/null
+++ b/pkgs/yaml/test/yaml_test.dart
@@ -0,0 +1,1921 @@
+// Copyright (c) 2012, the Dart project authors.
+// Copyright (c) 2006, Kirill Simonov.
+//
+// Use of this source code is governed by an MIT-style
+// license that can be found in the LICENSE file or at
+// https://opensource.org/licenses/MIT.
+
+// ignore_for_file: avoid_dynamic_calls
+
+import 'package:test/test.dart';
+import 'package:yaml/src/error_listener.dart';
+import 'package:yaml/yaml.dart';
+
+import 'utils.dart';
+
+void main() {
+ var infinity = double.parse('Infinity');
+ var nan = double.parse('NaN');
+
+ group('has a friendly error message for', () {
+ var tabError = predicate((e) =>
+ e.toString().contains('Tab characters are not allowed as indentation'));
+
+ test('using a tab as indentation', () {
+ expect(() => loadYaml('foo:\n\tbar'), throwsA(tabError));
+ });
+
+ test('using a tab not as indentation', () {
+ expect(() => loadYaml('''
+ "foo
+ \tbar"
+ error'''), throwsA(isNot(tabError)));
+ });
+ });
+
+ group('refuses', () {
+ // Regression test for #19.
+ test('invalid contents', () {
+ expectYamlFails('{');
+ });
+
+ test('duplicate mapping keys', () {
+ expectYamlFails('{a: 1, a: 2}');
+ });
+
+ group('documents that declare version', () {
+ test('1.0', () {
+ expectYamlFails('''
+ %YAML 1.0
+ --- text
+ ''');
+ });
+
+ test('1.3', () {
+ expectYamlFails('''
+ %YAML 1.3
+ --- text
+ ''');
+ });
+
+ test('2.0', () {
+ expectYamlFails('''
+ %YAML 2.0
+ --- text
+ ''');
+ });
+ });
+ });
+
+ group('recovers', () {
+ var collector = ErrorCollector();
+ setUp(() {
+ collector = ErrorCollector();
+ });
+
+ test('from incomplete leading keys', () {
+ final yaml = cleanUpLiteral(r'''
+ dependencies:
+ zero
+ one: any
+ ''');
+ var result = loadYaml(yaml, recover: true, errorListener: collector);
+ expect(
+ result,
+ deepEquals({
+ 'dependencies': {
+ 'zero': null,
+ 'one': 'any',
+ }
+ }));
+ expect(collector.errors.length, equals(1));
+ // These errors are reported at the start of the next token (after the
+ // whitespace/newlines).
+ expectErrorAtLineCol(collector.errors[0], "Expected ':'.", 2, 2);
+      // Skipped because this case is not currently handled. If the first key
+      // is missing its colon, the value indented on the following line causes
+      // the whole `zero\n one` to be treated as a single scalar.
+ }, skip: true);
+ test('from incomplete keys', () {
+ final yaml = cleanUpLiteral(r'''
+ dependencies:
+ one: any
+ two
+ three:
+ four
+ five:
+ 1.2.3
+ six: 5.4.3
+ ''');
+ var result = loadYaml(yaml, recover: true, errorListener: collector);
+ expect(
+ result,
+ deepEquals({
+ 'dependencies': {
+ 'one': 'any',
+ 'two': null,
+ 'three': null,
+ 'four': null,
+ 'five': '1.2.3',
+ 'six': '5.4.3',
+ }
+ }));
+
+ expect(collector.errors.length, equals(2));
+ // These errors are reported at the start of the next token (after the
+ // whitespace/newlines).
+ expectErrorAtLineCol(collector.errors[0], "Expected ':'.", 3, 2);
+ expectErrorAtLineCol(collector.errors[1], "Expected ':'.", 5, 2);
+ });
+ test('from incomplete trailing keys', () {
+ final yaml = cleanUpLiteral(r'''
+ dependencies:
+ six: 5.4.3
+ seven
+ ''');
+ var result = loadYaml(yaml, recover: true);
+ expect(
+ result,
+ deepEquals({
+ 'dependencies': {
+ 'six': '5.4.3',
+ 'seven': null,
+ }
+ }));
+ });
+ });
+
+ test('includes source span information', () {
+ var yaml = loadYamlNode(r'''
+- foo:
+ bar
+- 123
+''') as YamlList;
+
+ expect(yaml.span.start.line, equals(0));
+ expect(yaml.span.start.column, equals(0));
+ expect(yaml.span.end.line, equals(3));
+ expect(yaml.span.end.column, equals(0));
+
+ var map = yaml.nodes.first as YamlMap;
+ expect(map.span.start.line, equals(0));
+ expect(map.span.start.column, equals(2));
+ expect(map.span.end.line, equals(2));
+ expect(map.span.end.column, equals(0));
+
+ var key = map.nodes.keys.first;
+ expect(key.span.start.line, equals(0));
+ expect(key.span.start.column, equals(2));
+ expect(key.span.end.line, equals(0));
+ expect(key.span.end.column, equals(5));
+
+ var value = map.nodes.values.first;
+ expect(value.span.start.line, equals(1));
+ expect(value.span.start.column, equals(4));
+ expect(value.span.end.line, equals(1));
+ expect(value.span.end.column, equals(7));
+
+ var scalar = yaml.nodes.last;
+ expect(scalar.span.start.line, equals(2));
+ expect(scalar.span.start.column, equals(2));
+ expect(scalar.span.end.line, equals(2));
+ expect(scalar.span.end.column, equals(5));
+ });
+
+ // The following tests are all taken directly from the YAML spec
+ // (http://www.yaml.org/spec/1.2/spec.html). Most of them are code examples
+ // that are directly included in the spec, but additional tests are derived
+ // from the prose.
+
+ // A few examples from the spec are deliberately excluded, because they test
+ // features that this implementation doesn't intend to support (character
+ // encoding detection and user-defined tags). More tests are commented out,
+ // because they're intended to be supported but not yet implemented.
+
+ // Chapter 2 is just a preview of various Yaml documents. It's probably not
+ // necessary to test its examples, but it would be nice to test everything in
+ // the spec.
+ group('2.1: Collections', () {
+ test('[Example 2.1]', () {
+ expectYamlLoads(['Mark McGwire', 'Sammy Sosa', 'Ken Griffey'], '''
+ - Mark McGwire
+ - Sammy Sosa
+ - Ken Griffey''');
+ });
+
+ test('[Example 2.2]', () {
+ expectYamlLoads({'hr': 65, 'avg': 0.278, 'rbi': 147}, '''
+ hr: 65 # Home runs
+ avg: 0.278 # Batting average
+ rbi: 147 # Runs Batted In''');
+ });
+
+ test('[Example 2.3]', () {
+ expectYamlLoads({
+ 'american': ['Boston Red Sox', 'Detroit Tigers', 'New York Yankees'],
+ 'national': ['New York Mets', 'Chicago Cubs', 'Atlanta Braves'],
+ }, '''
+ american:
+ - Boston Red Sox
+ - Detroit Tigers
+ - New York Yankees
+ national:
+ - New York Mets
+ - Chicago Cubs
+ - Atlanta Braves''');
+ });
+
+ test('[Example 2.4]', () {
+ expectYamlLoads([
+ {'name': 'Mark McGwire', 'hr': 65, 'avg': 0.278},
+ {'name': 'Sammy Sosa', 'hr': 63, 'avg': 0.288},
+ ], '''
+ -
+ name: Mark McGwire
+ hr: 65
+ avg: 0.278
+ -
+ name: Sammy Sosa
+ hr: 63
+ avg: 0.288''');
+ });
+
+ test('[Example 2.5]', () {
+ expectYamlLoads([
+ ['name', 'hr', 'avg'],
+ ['Mark McGwire', 65, 0.278],
+ ['Sammy Sosa', 63, 0.288]
+ ], '''
+ - [name , hr, avg ]
+ - [Mark McGwire, 65, 0.278]
+ - [Sammy Sosa , 63, 0.288]''');
+ });
+
+ test('[Example 2.6]', () {
+ expectYamlLoads({
+ 'Mark McGwire': {'hr': 65, 'avg': 0.278},
+ 'Sammy Sosa': {'hr': 63, 'avg': 0.288}
+ }, '''
+ Mark McGwire: {hr: 65, avg: 0.278}
+ Sammy Sosa: {
+ hr: 63,
+ avg: 0.288
+ }''');
+ });
+ });
+
+ group('2.2: Structures', () {
+ test('[Example 2.7]', () {
+ expectYamlStreamLoads([
+ ['Mark McGwire', 'Sammy Sosa', 'Ken Griffey'],
+ ['Chicago Cubs', 'St Louis Cardinals']
+ ], '''
+ # Ranking of 1998 home runs
+ ---
+ - Mark McGwire
+ - Sammy Sosa
+ - Ken Griffey
+
+ # Team ranking
+ ---
+ - Chicago Cubs
+ - St Louis Cardinals''');
+ });
+
+ test('[Example 2.8]', () {
+ expectYamlStreamLoads([
+ {'time': '20:03:20', 'player': 'Sammy Sosa', 'action': 'strike (miss)'},
+ {'time': '20:03:47', 'player': 'Sammy Sosa', 'action': 'grand slam'},
+ ], '''
+ ---
+ time: 20:03:20
+ player: Sammy Sosa
+ action: strike (miss)
+ ...
+ ---
+ time: 20:03:47
+ player: Sammy Sosa
+ action: grand slam
+ ...''');
+ });
+
+ test('[Example 2.9]', () {
+ expectYamlLoads({
+ 'hr': ['Mark McGwire', 'Sammy Sosa'],
+ 'rbi': ['Sammy Sosa', 'Ken Griffey']
+ }, '''
+ ---
+ hr: # 1998 hr ranking
+ - Mark McGwire
+ - Sammy Sosa
+ rbi:
+ # 1998 rbi ranking
+ - Sammy Sosa
+ - Ken Griffey''');
+ });
+
+ test('[Example 2.10]', () {
+ expectYamlLoads({
+ 'hr': ['Mark McGwire', 'Sammy Sosa'],
+ 'rbi': ['Sammy Sosa', 'Ken Griffey']
+ }, '''
+ ---
+ hr:
+ - Mark McGwire
+ # Following node labeled SS
+ - &SS Sammy Sosa
+ rbi:
+ - *SS # Subsequent occurrence
+ - Ken Griffey''');
+ });
+
+ test('[Example 2.11]', () {
+ var doc = deepEqualsMap();
+ doc[['Detroit Tigers', 'Chicago cubs']] = ['2001-07-23'];
+ doc[['New York Yankees', 'Atlanta Braves']] = [
+ '2001-07-02',
+ '2001-08-12',
+ '2001-08-14'
+ ];
+ expectYamlLoads(doc, '''
+ ? - Detroit Tigers
+ - Chicago cubs
+ :
+ - 2001-07-23
+
+ ? [ New York Yankees,
+ Atlanta Braves ]
+ : [ 2001-07-02, 2001-08-12,
+ 2001-08-14 ]''');
+ });
+
+ test('[Example 2.12]', () {
+ expectYamlLoads([
+ {'item': 'Super Hoop', 'quantity': 1},
+ {'item': 'Basketball', 'quantity': 4},
+ {'item': 'Big Shoes', 'quantity': 1},
+ ], '''
+ ---
+ # Products purchased
+ - item : Super Hoop
+ quantity: 1
+ - item : Basketball
+ quantity: 4
+ - item : Big Shoes
+ quantity: 1''');
+ });
+ });
+
+ group('2.3: Scalars', () {
+ test('[Example 2.13]', () {
+ expectYamlLoads(cleanUpLiteral('''
+ \\//||\\/||
+ // || ||__'''), '''
+ # ASCII Art
+ --- |
+ \\//||\\/||
+ // || ||__''');
+ });
+
+ test('[Example 2.14]', () {
+ expectYamlLoads("Mark McGwire's year was crippled by a knee injury.", '''
+ --- >
+ Mark McGwire's
+ year was crippled
+ by a knee injury.''');
+ });
+
+ test('[Example 2.15]', () {
+ expectYamlLoads(cleanUpLiteral('''
+ Sammy Sosa completed another fine season with great stats.
+
+ 63 Home Runs
+ 0.288 Batting Average
+
+ What a year!'''), '''
+ >
+ Sammy Sosa completed another
+ fine season with great stats.
+
+ 63 Home Runs
+ 0.288 Batting Average
+
+ What a year!''');
+ });
+
+ test('[Example 2.16]', () {
+ expectYamlLoads({
+ 'name': 'Mark McGwire',
+ 'accomplishment': 'Mark set a major league home run record in 1998.\n',
+ 'stats': '65 Home Runs\n0.278 Batting Average'
+ }, '''
+ name: Mark McGwire
+ accomplishment: >
+ Mark set a major league
+ home run record in 1998.
+ stats: |
+ 65 Home Runs
+ 0.278 Batting Average''');
+ });
+
+ test('[Example 2.17]', () {
+ expectYamlLoads({
+ 'unicode': 'Sosa did fine.\u263A \u{1F680}',
+ 'control': '\b1998\t1999\t2000\n',
+ 'hex esc': '\r\n is \r\n',
+ 'single': '"Howdy!" he cried.',
+ 'quoted': " # Not a 'comment'.",
+ 'tie-fighter': '|\\-*-/|',
+ 'surrogate-pair': 'I \u{D83D}\u{DE03} ️Dart!',
+ 'key-\u{D83D}\u{DD11}': 'Look\u{D83D}\u{DE03}\u{D83C}\u{DF89}surprise!',
+ }, """
+ unicode: "Sosa did fine.\\u263A \\U0001F680"
+ control: "\\b1998\\t1999\\t2000\\n"
+ hex esc: "\\x0d\\x0a is \\r\\n"
+
+ single: '"Howdy!" he cried.'
+ quoted: ' # Not a ''comment''.'
+ tie-fighter: '|\\-*-/|'
+
+ surrogate-pair: I \u{D83D}\u{DE03} ️Dart!
+ key-\u{D83D}\u{DD11}: Look\u{D83D}\u{DE03}\u{D83C}\u{DF89}surprise!""");
+ });
+
+ test('[Example 2.18]', () {
+ expectYamlLoads({
+ 'plain': 'This unquoted scalar spans many lines.',
+ 'quoted': 'So does this quoted scalar.\n'
+ }, '''
+ plain:
+ This unquoted scalar
+ spans many lines.
+
+ quoted: "So does this
+ quoted scalar.\\n"''');
+ });
+ });
+
+ group('2.4: Tags', () {
+ test('[Example 2.19]', () {
+ expectYamlLoads({
+ 'canonical': 12345,
+ 'decimal': 12345,
+ 'octal': 12,
+ 'hexadecimal': 12
+ }, '''
+ canonical: 12345
+ decimal: +12345
+ octal: 0o14
+ hexadecimal: 0xC''');
+ });
+
+ test('[Example 2.20]', () {
+ expectYamlLoads({
+ 'canonical': 1230.15,
+ 'exponential': 1230.15,
+ 'fixed': 1230.15,
+ 'negative infinity': -infinity,
+ 'not a number': nan
+ }, '''
+ canonical: 1.23015e+3
+ exponential: 12.3015e+02
+ fixed: 1230.15
+ negative infinity: -.inf
+ not a number: .NaN''');
+ });
+
+ test('[Example 2.21]', () {
+ var doc = deepEqualsMap({
+ 'booleans': [true, false],
+ 'string': '012345'
+ });
+ doc[null] = null;
+ expectYamlLoads(doc, """
+ null:
+ booleans: [ true, false ]
+ string: '012345'""");
+ });
+
+ // Examples 2.22 through 2.26 test custom tag URIs, which this
+ // implementation currently doesn't plan to support.
+ });
+
+ group('2.5 Full Length Example', () {
+ // Example 2.27 tests custom tag URIs, which this implementation currently
+ // doesn't plan to support.
+
+ test('[Example 2.28]', () {
+ expectYamlStreamLoads([
+ {
+ 'Time': '2001-11-23 15:01:42 -5',
+ 'User': 'ed',
+ 'Warning': 'This is an error message for the log file'
+ },
+ {
+ 'Time': '2001-11-23 15:02:31 -5',
+ 'User': 'ed',
+ 'Warning': 'A slightly different error message.'
+ },
+ {
+ 'DateTime': '2001-11-23 15:03:17 -5',
+ 'User': 'ed',
+ 'Fatal': 'Unknown variable "bar"',
+ 'Stack': [
+ {
+ 'file': 'TopClass.py',
+ 'line': 23,
+ 'code': 'x = MoreObject("345\\n")\n'
+ },
+ {'file': 'MoreClass.py', 'line': 58, 'code': 'foo = bar'}
+ ]
+ }
+ ], '''
+ ---
+ Time: 2001-11-23 15:01:42 -5
+ User: ed
+ Warning:
+ This is an error message
+ for the log file
+ ---
+ Time: 2001-11-23 15:02:31 -5
+ User: ed
+ Warning:
+ A slightly different error
+ message.
+ ---
+ DateTime: 2001-11-23 15:03:17 -5
+ User: ed
+ Fatal:
+ Unknown variable "bar"
+ Stack:
+ - file: TopClass.py
+ line: 23
+ code: |
+ x = MoreObject("345\\n")
+ - file: MoreClass.py
+ line: 58
+ code: |-
+ foo = bar''');
+ });
+ });
+
+ // Chapter 3 just talks about the structure of loading and dumping Yaml.
+ // Chapter 4 explains conventions used in the spec.
+
+ // Chapter 5: Characters
+ group('5.1: Character Set', () {
+ void expectAllowsCharacter(int charCode) {
+ var char = String.fromCharCodes([charCode]);
+ expectYamlLoads('The character "$char" is allowed',
+ 'The character "$char" is allowed');
+ }
+
+ void expectAllowsQuotedCharacter(int charCode) {
+ var char = String.fromCharCodes([charCode]);
+ expectYamlLoads("The character '$char' is allowed",
+ '"The character \'$char\' is allowed"');
+ }
+
+ void expectDisallowsCharacter(int charCode) {
+ var char = String.fromCharCodes([charCode]);
+ expectYamlFails('The character "$char" is disallowed');
+ }
+
+ test("doesn't include C0 control characters", () {
+ expectDisallowsCharacter(0x0);
+ expectDisallowsCharacter(0x8);
+ expectDisallowsCharacter(0x1F);
+ });
+
+ test('includes TAB', () => expectAllowsCharacter(0x9));
+ test("doesn't include DEL", () => expectDisallowsCharacter(0x7F));
+
+ test("doesn't include C1 control characters", () {
+ expectDisallowsCharacter(0x80);
+ expectDisallowsCharacter(0x8A);
+ expectDisallowsCharacter(0x9F);
+ });
+
+ test('includes NEL', () => expectAllowsCharacter(0x85));
+
+ group('within quoted strings', () {
+ test('includes DEL', () => expectAllowsQuotedCharacter(0x7F));
+ test('includes C1 control characters', () {
+ expectAllowsQuotedCharacter(0x80);
+ expectAllowsQuotedCharacter(0x8A);
+ expectAllowsQuotedCharacter(0x9F);
+ });
+ });
+ });
+
+ // Skipping section 5.2 (Character Encodings), since at the moment the module
+ // assumes that the client code is providing it with a string of the proper
+ // encoding.
+
+ group('5.3: Indicator Characters', () {
+ test('[Example 5.3]', () {
+ expectYamlLoads({
+ 'sequence': ['one', 'two'],
+ 'mapping': {'sky': 'blue', 'sea': 'green'}
+ }, '''
+ sequence:
+ - one
+ - two
+ mapping:
+ ? sky
+ : blue
+ sea : green''');
+ });
+
+ test('[Example 5.4]', () {
+ expectYamlLoads({
+ 'sequence': ['one', 'two'],
+ 'mapping': {'sky': 'blue', 'sea': 'green'}
+ }, '''
+ sequence: [ one, two, ]
+ mapping: { sky: blue, sea: green }''');
+ });
+
+ test('[Example 5.5]', () => expectYamlLoads(null, '# Comment only.'));
+
+ // Skipping 5.6 because it uses an undefined tag.
+
+ test('[Example 5.7]', () {
+ expectYamlLoads({'literal': 'some\ntext\n', 'folded': 'some text\n'}, '''
+ literal: |
+ some
+ text
+ folded: >
+ some
+ text
+ ''');
+ });
+
+ test('[Example 5.8]', () {
+ expectYamlLoads({'single': 'text', 'double': 'text'}, '''
+ single: 'text'
+ double: "text"
+ ''');
+ });
+
+ test('[Example 5.9]', () {
+ expectYamlLoads('text', '''
+ %YAML 1.2
+ --- text''');
+ });
+
+ test('[Example 5.10]', () {
+ expectYamlFails('commercial-at: @text');
+ expectYamlFails('commercial-at: `text');
+ });
+ });
+
+ group('5.4: Line Break Characters', () {
+ group('include', () {
+ test('\\n', () => expectYamlLoads([1, 2], indentLiteral('- 1\n- 2')));
+ test('\\r', () => expectYamlLoads([1, 2], '- 1\r- 2'));
+ });
+
+ group('do not include', () {
+ test('form feed', () => expectYamlFails('- 1\x0C- 2'));
+ test('NEL', () => expectYamlLoads(['1\x85- 2'], '- 1\x85- 2'));
+ test('0x2028', () => expectYamlLoads(['1\u2028- 2'], '- 1\u2028- 2'));
+ test('0x2029', () => expectYamlLoads(['1\u2029- 2'], '- 1\u2029- 2'));
+ });
+
+ group('in a scalar context must be normalized', () {
+ test(
+ 'from \\r to \\n',
+ () => expectYamlLoads(
+ ['foo\nbar'], indentLiteral('- |\n foo\r bar')));
+ test(
+ 'from \\r\\n to \\n',
+ () => expectYamlLoads(
+ ['foo\nbar'], indentLiteral('- |\n foo\r\n bar')));
+ });
+
+ test('[Example 5.11]', () {
+ expectYamlLoads(cleanUpLiteral('''
+ Line break (no glyph)
+ Line break (glyphed)'''), '''
+ |
+ Line break (no glyph)
+ Line break (glyphed)''');
+ });
+ });
+
+ group('5.5: White Space Characters', () {
+ test('[Example 5.12]', () {
+ expectYamlLoads({
+ 'quoted': 'Quoted \t',
+ 'block': 'void main() {\n\tprintf("Hello, world!\\n");\n}\n'
+ }, '''
+ # Tabs and spaces
+ quoted: "Quoted \t"
+ block:\t|
+ void main() {
+ \tprintf("Hello, world!\\n");
+ }
+ ''');
+ });
+ });
+
+ group('5.7: Escaped Characters', () {
+ test('[Example 5.13]', () {
+ expectYamlLoads(
+ 'Fun with \x5C '
+ '\x22 \x07 \x08 \x1B \x0C '
+ '\x0A \x0D \x09 \x0B \x00 '
+ '\x20 \xA0 \x85 \u2028 \u2029 '
+ 'A A A',
+ '''
+ "Fun with \\\\
+ \\" \\a \\b \\e \\f \\
+ \\n \\r \\t \\v \\0 \\
+ \\ \\_ \\N \\L \\P \\
+ \\x41 \\u0041 \\U00000041"''');
+ });
+
+ test('[Example 5.14]', () {
+ expectYamlFails('Bad escape: "\\c"');
+ expectYamlFails('Bad escape: "\\xq-"');
+ });
+ });
+
+ // Chapter 6: Basic Structures
+ group('6.1: Indentation Spaces', () {
+ test('may not include TAB characters', () {
+ expectYamlFails('''
+ -
+ \t- foo
+ \t- bar''');
+ });
+
+ test('must be the same for all sibling nodes', () {
+ expectYamlFails('''
+ -
+ - foo
+ - bar''');
+ });
+
+ test('may be different for the children of sibling nodes', () {
+ expectYamlLoads([
+ ['foo'],
+ ['bar']
+ ], '''
+ -
+ - foo
+ -
+ - bar''');
+ });
+
+ test('[Example 6.1]', () {
+ expectYamlLoads({
+ 'Not indented': {
+ 'By one space': 'By four\n spaces\n',
+ 'Flow style': ['By two', 'Also by two', 'Still by two']
+ }
+ }, '''
+ # Leading comment line spaces are
+ # neither content nor indentation.
+
+ Not indented:
+ By one space: |
+ By four
+ spaces
+ Flow style: [ # Leading spaces
+ By two, # in flow style
+ Also by two, # are neither
+ \tStill by two # content nor
+ ] # indentation.''');
+ });
+
+ test('[Example 6.2]', () {
+ expectYamlLoads({
+ 'a': [
+ 'b',
+ ['c', 'd']
+ ]
+ }, '''
+ ? a
+ : -\tb
+ - -\tc
+ - d''');
+ });
+ });
+
+ group('6.2: Separation Spaces', () {
+ test('[Example 6.3]', () {
+ expectYamlLoads([
+ {'foo': 'bar'},
+ ['baz', 'baz']
+ ], '''
+ - foo:\t bar
+ - - baz
+ -\tbaz''');
+ });
+ });
+
+ group('6.3: Line Prefixes', () {
+ test('[Example 6.4]', () {
+ expectYamlLoads({
+ 'plain': 'text lines',
+ 'quoted': 'text lines',
+ 'block': 'text\n \tlines\n'
+ }, '''
+ plain: text
+ lines
+ quoted: "text
+ \tlines"
+ block: |
+ text
+ \tlines
+ ''');
+ });
+ });
+
+ group('6.4: Empty Lines', () {
+ test('[Example 6.5]', () {
+ expectYamlLoads({
+ 'Folding': 'Empty line\nas a line feed',
+ 'Chomping': 'Clipped empty lines\n',
+ }, '''
+ Folding:
+ "Empty line
+ \t
+ as a line feed"
+ Chomping: |
+ Clipped empty lines
+ ''');
+ });
+ });
+
+ group('6.5: Line Folding', () {
+ test('[Example 6.6]', () {
+ expectYamlLoads('trimmed\n\n\nas space', '''
+ >-
+ trimmed
+
+
+
+ as
+ space
+ ''');
+ });
+
+ test('[Example 6.7]', () {
+ expectYamlLoads('foo \n\n\t bar\n\nbaz\n', '''
+ >
+ foo
+
+ \t bar
+
+ baz
+ ''');
+ });
+
+ test('[Example 6.8]', () {
+ expectYamlLoads(' foo\nbar\nbaz ', '''
+ "
+ foo
+
+ \t bar
+
+ baz
+ "''');
+ });
+ });
+
+ group('6.6: Comments', () {
+ test('must be separated from other tokens by white space characters', () {
+ expectYamlLoads('foo#bar', 'foo#bar');
+ expectYamlLoads('foo:#bar', 'foo:#bar');
+ expectYamlLoads('-#bar', '-#bar');
+ });
+
+ test('[Example 6.9]', () {
+ expectYamlLoads({'key': 'value'}, '''
+ key: # Comment
+ value''');
+ });
+
+ group('outside of scalar content', () {
+ test('may appear on a line of their own', () {
+ expectYamlLoads([1, 2], '''
+ - 1
+ # Comment
+ - 2''');
+ });
+
+ test('are independent of indentation level', () {
+ expectYamlLoads([
+ [1, 2]
+ ], '''
+ -
+ - 1
+ # Comment
+ - 2''');
+ });
+
+ test('include lines containing only white space characters', () {
+ expectYamlLoads([1, 2], '''
+ - 1
+ \t
+ - 2''');
+ });
+ });
+
+ group('within scalar content', () {
+ test('may not appear on a line of their own', () {
+ expectYamlLoads(['foo\n# not comment\nbar\n'], '''
+ - |
+ foo
+ # not comment
+ bar
+ ''');
+ });
+
+ test("don't include lines containing only white space characters", () {
+ expectYamlLoads(['foo\n \t \nbar\n'], '''
+ - |
+ foo
+ \t
+ bar
+ ''');
+ });
+ });
+
+ test('[Example 6.10]', () {
+ expectYamlLoads(null, '''
+ # Comment
+
+ ''');
+ });
+
+ test('[Example 6.11]', () {
+ expectYamlLoads({'key': 'value'}, '''
+ key: # Comment
+ # lines
+ value
+ ''');
+ });
+
+ group('ending a block scalar header', () {
+ test('may not be followed by additional comment lines', () {
+ expectYamlLoads(['# not comment\nfoo\n'], '''
+ - | # comment
+ # not comment
+ foo
+ ''');
+ });
+ });
+ });
+
+ group('6.7: Separation Lines', () {
+ test('may not be used within implicit keys', () {
+ expectYamlFails('''
+ [1,
+ 2]: 3''');
+ });
+
+ test('[Example 6.12]', () {
+ var doc = deepEqualsMap();
+ doc[{'first': 'Sammy', 'last': 'Sosa'}] = {'hr': 65, 'avg': 0.278};
+ expectYamlLoads(doc, '''
+ { first: Sammy, last: Sosa }:
+ # Statistics:
+ hr: # Home runs
+ 65
+ avg: # Average
+ 0.278''');
+ });
+ });
+
+ group('6.8: Directives', () {
+ // TODO(nweiz): assert that this produces a warning
+ test('[Example 6.13]', () {
+ expectYamlLoads('foo', '''
+ %FOO bar baz # Should be ignored
+ # with a warning.
+ --- "foo"''');
+ });
+
+ // TODO(nweiz): assert that this produces a warning.
+ test('[Example 6.14]', () {
+ expectYamlLoads('foo', '''
+ %YAML 1.3 # Attempt parsing
+ # with a warning
+ ---
+ "foo"''');
+ });
+
+ test('[Example 6.15]', () {
+ expectYamlFails('''
+ %YAML 1.2
+ %YAML 1.1
+ foo''');
+ });
+
+ test('[Example 6.16]', () {
+ expectYamlLoads('foo', '''
+ %TAG !yaml! tag:yaml.org,2002:
+ ---
+ !yaml!str "foo"''');
+ });
+
+ test('[Example 6.17]', () {
+ expectYamlFails('''
+ %TAG ! !foo
+ %TAG ! !foo
+ bar''');
+ });
+
+ // Examples 6.18 through 6.22 test custom tag URIs, which this
+ // implementation currently doesn't plan to support.
+ });
+
+ group('6.9: Node Properties', () {
+ test('may be specified in any order', () {
+ expectYamlLoads(['foo', 'bar'], '''
+ - !!str &a1 foo
+ - &a2 !!str bar''');
+ });
+
+ test('[Example 6.23]', () {
+ expectYamlLoads({'foo': 'bar', 'baz': 'foo'}, '''
+ !!str &a1 "foo":
+ !!str bar
+ &a2 baz : *a1''');
+ });
+
+ // Example 6.24 tests custom tag URIs, which this implementation currently
+ // doesn't plan to support.
+
+ test('[Example 6.25]', () {
+ expectYamlFails('- !<!> foo');
+ expectYamlFails('- !<\$:?> foo');
+ });
+
+ // Examples 6.26 and 6.27 test custom tag URIs, which this implementation
+ // currently doesn't plan to support.
+
+ test('[Example 6.28]', () {
+ expectYamlLoads(['12', 12, '12'], '''
+ # Assuming conventional resolution:
+ - "12"
+ - 12
+ - ! 12''');
+ });
+
+ test('[Example 6.29]', () {
+ expectYamlLoads(
+ {'First occurrence': 'Value', 'Second occurrence': 'Value'}, '''
+ First occurrence: &anchor Value
+ Second occurrence: *anchor''');
+ });
+ });
+
+ // Chapter 7: Flow Styles
+ group('7.1: Alias Nodes', () {
+ test("must not use an anchor that doesn't previously occur", () {
+ expectYamlFails('''
+ - *anchor
+ - &anchor foo''');
+ });
+
+ test("don't have to exist for a given anchor node", () {
+ expectYamlLoads(['foo'], '- &anchor foo');
+ });
+
+ group('must not specify', () {
+ test('tag properties', () => expectYamlFails('''
+ - &anchor foo
+ - !str *anchor'''));
+
+ test('anchor properties', () => expectYamlFails('''
+ - &anchor foo
+ - &anchor2 *anchor'''));
+
+ test('content', () => expectYamlFails('''
+ - &anchor foo
+ - *anchor bar'''));
+ });
+
+ test('must preserve structural equality', () {
+ var doc = loadYaml(cleanUpLiteral('''
+ anchor: &anchor [a, b, c]
+ alias: *anchor'''));
+ var anchorList = doc['anchor'];
+ var aliasList = doc['alias'];
+ expect(anchorList, same(aliasList));
+
+ doc = loadYaml(cleanUpLiteral('''
+ ? &anchor [a, b, c]
+ : ? *anchor
+ : bar'''));
+ anchorList = doc.keys.first;
+ aliasList = doc[['a', 'b', 'c']].keys.first;
+ expect(anchorList, same(aliasList));
+ });
+
+ test('[Example 7.1]', () {
+ expectYamlLoads({
+ 'First occurrence': 'Foo',
+ 'Second occurrence': 'Foo',
+ 'Override anchor': 'Bar',
+ 'Reuse anchor': 'Bar',
+ }, '''
+ First occurrence: &anchor Foo
+ Second occurrence: *anchor
+ Override anchor: &anchor Bar
+ Reuse anchor: *anchor''');
+ });
+ });
+
+ group('7.2: Empty Nodes', () {
+ test('[Example 7.2]', () {
+ expectYamlLoads({'foo': '', '': 'bar'}, '''
+ {
+ foo : !!str,
+ !!str : bar,
+ }''');
+ });
+
+ test('[Example 7.3]', () {
+ var doc = deepEqualsMap({'foo': null});
+ doc[null] = 'bar';
+ expectYamlLoads(doc, '''
+ {
+ ? foo :,
+ : bar,
+ }''');
+ });
+ });
+
+ group('7.3: Flow Scalar Styles', () {
+ test('[Example 7.4]', () {
+ expectYamlLoads({
+ 'implicit block key': [
+ {'implicit flow key': 'value'}
+ ]
+ }, '''
+ "implicit block key" : [
+ "implicit flow key" : value,
+ ]''');
+ });
+
+ test('[Example 7.5]', () {
+ expectYamlLoads(
+ 'folded to a space,\nto a line feed, or \t \tnon-content', '''
+ "folded
+ to a space,\t
+
+ to a line feed, or \t\\
+ \\ \tnon-content"''');
+ });
+
+ test('[Example 7.6]', () {
+ expectYamlLoads(' 1st non-empty\n2nd non-empty 3rd non-empty ', '''
+ " 1st non-empty
+
+ 2nd non-empty
+ \t3rd non-empty "''');
+ });
+
+ test('[Example 7.7]', () {
+ expectYamlLoads("here's to \"quotes\"", "'here''s to \"quotes\"'");
+ });
+
+ test('[Example 7.8]', () {
+ expectYamlLoads({
+ 'implicit block key': [
+ {'implicit flow key': 'value'}
+ ]
+ }, """
+ 'implicit block key' : [
+ 'implicit flow key' : value,
+ ]""");
+ });
+
+ test('[Example 7.9]', () {
+ expectYamlLoads(' 1st non-empty\n2nd non-empty 3rd non-empty ', """
+ ' 1st non-empty
+
+ 2nd non-empty
+ \t3rd non-empty '""");
+ });
+
+ test('[Example 7.10]', () {
+ expectYamlLoads([
+ '::vector',
+ ': - ()',
+ 'Up, up, and away!',
+ -123,
+ 'http://example.com/foo#bar',
+ [
+ '::vector',
+ ': - ()',
+ 'Up, up, and away!',
+ -123,
+ 'http://example.com/foo#bar'
+ ]
+ ], '''
+ # Outside flow collection:
+ - ::vector
+ - ": - ()"
+ - Up, up, and away!
+ - -123
+ - http://example.com/foo#bar
+ # Inside flow collection:
+ - [ ::vector,
+ ": - ()",
+ "Up, up, and away!",
+ -123,
+ http://example.com/foo#bar ]''');
+ });
+
+ test('[Example 7.11]', () {
+ expectYamlLoads({
+ 'implicit block key': [
+ {'implicit flow key': 'value'}
+ ]
+ }, '''
+ implicit block key : [
+ implicit flow key : value,
+ ]''');
+ });
+
+ test('[Example 7.12]', () {
+ expectYamlLoads('1st non-empty\n2nd non-empty 3rd non-empty', '''
+ 1st non-empty
+
+ 2nd non-empty
+ \t3rd non-empty''');
+ });
+ });
+
+ group('7.4: Flow Collection Styles', () {
+ test('[Example 7.13]', () {
+ expectYamlLoads([
+ ['one', 'two'],
+ ['three', 'four']
+ ], '''
+ - [ one, two, ]
+ - [three ,four]''');
+ });
+
+ test('[Example 7.14]', () {
+ expectYamlLoads([
+ 'double quoted',
+ 'single quoted',
+ 'plain text',
+ ['nested'],
+ {'single': 'pair'}
+ ], """
+ [
+ "double
+ quoted", 'single
+ quoted',
+ plain
+ text, [ nested ],
+ single: pair,
+ ]""");
+ });
+
+ test('[Example 7.15]', () {
+ expectYamlLoads([
+ {'one': 'two', 'three': 'four'},
+ {'five': 'six', 'seven': 'eight'},
+ ], '''
+ - { one : two , three: four , }
+ - {five: six,seven : eight}''');
+ });
+
+ test('[Example 7.16]', () {
+ var doc = deepEqualsMap({'explicit': 'entry', 'implicit': 'entry'});
+ doc[null] = null;
+ expectYamlLoads(doc, '''
+ {
+ ? explicit: entry,
+ implicit: entry,
+ ?
+ }''');
+ });
+
+ test('[Example 7.17]', () {
+ var doc = deepEqualsMap({
+ 'unquoted': 'separate',
+ 'http://foo.com': null,
+ 'omitted value': null
+ });
+ doc[null] = 'omitted key';
+ expectYamlLoads(doc, '''
+ {
+ unquoted : "separate",
+ http://foo.com,
+ omitted value:,
+ : omitted key,
+ }''');
+ });
+
+ test('[Example 7.18]', () {
+ expectYamlLoads(
+ {'adjacent': 'value', 'readable': 'value', 'empty': null}, '''
+ {
+ "adjacent":value,
+ "readable": value,
+ "empty":
+ }''');
+ });
+
+ test('[Example 7.19]', () {
+ expectYamlLoads([
+ {'foo': 'bar'}
+ ], '''
+ [
+ foo: bar
+ ]''');
+ });
+
+ test('[Example 7.20]', () {
+ expectYamlLoads([
+ {'foo bar': 'baz'}
+ ], '''
+ [
+ ? foo
+ bar : baz
+ ]''');
+ });
+
+ test('[Example 7.21]', () {
+ var el1 = deepEqualsMap();
+ el1[null] = 'empty key entry';
+
+ var el2 = deepEqualsMap();
+ el2[{'JSON': 'like'}] = 'adjacent';
+
+ expectYamlLoads([
+ [
+ {'YAML': 'separate'}
+ ],
+ [el1],
+ [el2]
+ ], '''
+ - [ YAML : separate ]
+ - [ : empty key entry ]
+ - [ {JSON: like}:adjacent ]''');
+ });
+
+ // TODO(nweiz): enable this when we throw an error for long or multiline
+ // keys.
+ // test('[Example 7.22]', () {
+ // expectYamlFails(
+ // """
+ // [ foo
+ // bar: invalid ]""");
+ //
+ // var dotList = new List.filled(1024, ' ');
+ // var dots = dotList.join();
+ // expectYamlFails('[ "foo...$dots...bar": invalid ]');
+ // });
+ });
+
+ group('7.5: Flow Nodes', () {
+ test('[Example 7.23]', () {
+ expectYamlLoads([
+ ['a', 'b'],
+ {'a': 'b'},
+ 'a',
+ 'b',
+ 'c'
+ ], '''
+ - [ a, b ]
+ - { a: b }
+ - 'a'
+ - 'b'
+ - c''');
+ });
+
+ test('[Example 7.24]', () {
+ expectYamlLoads(['a', 'b', 'c', 'c', ''], '''
+ - !!str "a"
+ - 'b'
+ - &anchor "c"
+ - *anchor
+ - !!str''');
+ });
+ });
+
+ // Chapter 8: Block Styles
+ group('8.1: Block Scalar Styles', () {
+ test('[Example 8.1]', () {
+ expectYamlLoads(['literal\n', ' folded\n', 'keep\n\n', ' strip'], '''
+ - | # Empty header
+ literal
+ - >1 # Indentation indicator
+ folded
+ - |+ # Chomping indicator
+ keep
+
+ - >1- # Both indicators
+ strip''');
+ });
+
+ test('[Example 8.2]', () {
+ // Note: in the spec, the fourth element in this array is listed as
+ // "\t detected\n", not "\t\ndetected\n". However, I'm reasonably
+ // confident that "\t\ndetected\n" is correct when parsed according to the
+ // rest of the spec.
+ expectYamlLoads(
+ ['detected\n', '\n\n# detected\n', ' explicit\n', '\t\ndetected\n'],
+ '''
+ - |
+ detected
+ - >
+
+
+ # detected
+ - |1
+ explicit
+ - >
+ \t
+ detected
+ ''');
+ });
+
+ test('[Example 8.3]', () {
+ expectYamlFails('''
+ - |
+
+ text''');
+
+ expectYamlFails('''
+ - >
+ text
+ text''');
+
+ expectYamlFails('''
+ - |2
+ text''');
+ });
+
+ test('[Example 8.4]', () {
+ expectYamlLoads({'strip': 'text', 'clip': 'text\n', 'keep': 'text\n'}, '''
+ strip: |-
+ text
+ clip: |
+ text
+ keep: |+
+ text
+ ''');
+ });
+
+ test('[Example 8.5]', () {
+ // This example in the spec only includes a single newline in the "keep"
+ // value, but as far as I can tell that's not how it's supposed to be
+ // parsed according to the rest of the spec.
+ expectYamlLoads(
+ {'strip': '# text', 'clip': '# text\n', 'keep': '# text\n\n'}, '''
+ # Strip
+ # Comments:
+ strip: |-
+ # text
+
+ # Clip
+ # comments:
+
+ clip: |
+ # text
+
+ # Keep
+ # comments:
+
+ keep: |+
+ # text
+
+ # Trail
+ # comments.
+ ''');
+ });
+
+ test('[Example 8.6]', () {
+ expectYamlLoads({'strip': '', 'clip': '', 'keep': '\n'}, '''
+ strip: >-
+
+ clip: >
+
+ keep: |+
+
+ ''');
+ });
+
+ test('[Example 8.7]', () {
+ expectYamlLoads('literal\n\ttext\n', '''
+ |
+ literal
+ \ttext
+ ''');
+ });
+
+ test('[Example 8.8]', () {
+ expectYamlLoads('\n\nliteral\n \n\ntext\n', '''
+ |
+
+
+ literal
+
+
+ text
+
+ # Comment''');
+ });
+
+ test('[Example 8.9]', () {
+ expectYamlLoads('folded text\n', '''
+ >
+ folded
+ text
+ ''');
+ });
+
+ test('[Example 8.10]', () {
+ expectYamlLoads(cleanUpLiteral('''
+
+ folded line
+ next line
+ * bullet
+
+ * list
+ * lines
+
+ last line
+ '''), '''
+ >
+
+ folded
+ line
+
+ next
+ line
+ * bullet
+
+ * list
+ * lines
+
+ last
+ line
+
+ # Comment''');
+ });
+
+ // Examples 8.11 through 8.13 are duplicates of 8.10.
+ });
+
+ group('8.2: Block Collection Styles', () {
+ test('[Example 8.14]', () {
+ expectYamlLoads({
+ 'block sequence': [
+ 'one',
+ {'two': 'three'}
+ ]
+ }, '''
+ block sequence:
+ - one
+ - two : three''');
+ });
+
+ test('[Example 8.15]', () {
+ expectYamlLoads([
+ null,
+ 'block node\n',
+ ['one', 'two'],
+ {'one': 'two'}
+ ], '''
+ - # Empty
+ - |
+ block node
+ - - one # Compact
+ - two # sequence
+ - one: two # Compact mapping''');
+ });
+
+ test('[Example 8.16]', () {
+ expectYamlLoads({
+ 'block mapping': {'key': 'value'}
+ }, '''
+ block mapping:
+ key: value''');
+ });
+
+ test('[Example 8.17]', () {
+ expectYamlLoads({
+ 'explicit key': null,
+ 'block key\n': ['one', 'two']
+ }, '''
+ ? explicit key # Empty value
+ ? |
+ block key
+ : - one # Explicit compact
+ - two # block value''');
+ });
+
+ test('[Example 8.18]', () {
+ var doc = deepEqualsMap({
+ 'plain key': 'in-line value',
+ 'quoted key': ['entry']
+ });
+ doc[null] = null;
+ expectYamlLoads(doc, '''
+ plain key: in-line value
+ : # Both empty
+ "quoted key":
+ - entry''');
+ });
+
+ test('[Example 8.19]', () {
+ var el = deepEqualsMap();
+ el[{'earth': 'blue'}] = {'moon': 'white'};
+ expectYamlLoads([
+ {'sun': 'yellow'},
+ el
+ ], '''
+ - sun: yellow
+ - ? earth: blue
+ : moon: white''');
+ });
+
+ test('[Example 8.20]', () {
+ expectYamlLoads([
+ 'flow in block',
+ 'Block scalar\n',
+ {'foo': 'bar'}
+ ], '''
+ -
+ "flow in block"
+ - >
+ Block scalar
+ - !!map # Block collection
+ foo : bar''');
+ });
+
+ test('[Example 8.21]', () {
+ // The spec doesn't include a newline after "value" in the parsed map, but
+ // the block scalar is clipped so it should be retained.
+ expectYamlLoads({'literal': 'value\n', 'folded': 'value'}, '''
+ literal: |2
+ value
+ folded:
+ !!str
+ >1
+ value''');
+ });
+
+ test('[Example 8.22]', () {
+ expectYamlLoads({
+ 'sequence': [
+ 'entry',
+ ['nested']
+ ],
+ 'mapping': {'foo': 'bar'}
+ }, '''
+ sequence: !!seq
+ - entry
+ - !!seq
+ - nested
+ mapping: !!map
+ foo: bar''');
+ });
+ });
+
+ // Chapter 9: YAML Character Stream
+ group('9.1: Documents', () {
+ // Example 9.1 tests the use of a BOM, which this implementation currently
+ // doesn't plan to support.
+
+ test('[Example 9.2]', () {
+ expectYamlLoads('Document', '''
+ %YAML 1.2
+ ---
+ Document
+ ... # Suffix''');
+ });
+
+ test('[Example 9.3]', () {
+ // The spec example indicates that the comment after "%!PS-Adobe-2.0"
+ // should be stripped, which would imply that that line is not part of the
+ // literal defined by the "|". The rest of the spec is ambiguous on this
+ // point; the allowable indentation for non-indented literal content is
+ // not clearly explained. However, if both the "|" and the text were
+ // indented the same amount, the text would be part of the literal, which
+ // implies that the spec's parse of this document is incorrect.
+ expectYamlStreamLoads(
+ ['Bare document', '%!PS-Adobe-2.0 # Not the first line\n'], '''
+ Bare
+ document
+ ...
+ # No document
+ ...
+ |
+ %!PS-Adobe-2.0 # Not the first line
+ ''');
+ });
+
+ test('[Example 9.4]', () {
+ expectYamlStreamLoads([
+ {'matches %': 20},
+ null
+ ], '''
+ ---
+ { matches
+ % : 20 }
+ ...
+ ---
+ # Empty
+ ...''');
+ });
+
+ test('[Example 9.5]', () {
+ // The spec doesn't have a space between the second
+ // "YAML" and "1.2", but this seems to be a typo.
+ expectYamlStreamLoads(['%!PS-Adobe-2.0\n', null], '''
+ %YAML 1.2
+ --- |
+ %!PS-Adobe-2.0
+ ...
+ %YAML 1.2
+ ---
+ # Empty
+ ...''');
+ });
+
+ test('[Example 9.6]', () {
+ expectYamlStreamLoads([
+ 'Document',
+ null,
+ {'matches %': 20}
+ ], '''
+ Document
+ ---
+ # Empty
+ ...
+ %YAML 1.2
+ ---
+ matches %: 20''');
+ });
+ });
+
+ // Chapter 10: Recommended Schemas
+ group('10.1: Failsafe Schema', () {
+ test('[Example 10.1]', () {
+ expectYamlLoads({
+ 'Block style': {
+ 'Clark': 'Evans',
+ 'Ingy': 'döt Net',
+ 'Oren': 'Ben-Kiki'
+ },
+ 'Flow style': {'Clark': 'Evans', 'Ingy': 'döt Net', 'Oren': 'Ben-Kiki'}
+ }, '''
+ Block style: !!map
+ Clark : Evans
+ Ingy : döt Net
+ Oren : Ben-Kiki
+
+ Flow style: !!map { Clark: Evans, Ingy: döt Net, Oren: Ben-Kiki }''');
+ });
+
+ test('[Example 10.2]', () {
+ expectYamlLoads({
+ 'Block style': ['Clark Evans', 'Ingy döt Net', 'Oren Ben-Kiki'],
+ 'Flow style': ['Clark Evans', 'Ingy döt Net', 'Oren Ben-Kiki']
+ }, '''
+ Block style: !!seq
+ - Clark Evans
+ - Ingy döt Net
+ - Oren Ben-Kiki
+
+ Flow style: !!seq [ Clark Evans, Ingy döt Net, Oren Ben-Kiki ]''');
+ });
+
+ test('[Example 10.3]', () {
+ expectYamlLoads({
+ 'Block style': 'String: just a theory.',
+ 'Flow style': 'String: just a theory.'
+ }, '''
+ Block style: !!str |-
+ String: just a theory.
+
+ Flow style: !!str "String: just a theory."''');
+ });
+ });
+
+ group('10.2: JSON Schema', () {
+ test('[Example 10.4]', () {
+ var doc = deepEqualsMap({'key with null value': null});
+ doc[null] = 'value for null key';
+ expectYamlStreamLoads([doc], '''
+ !!null null: value for null key
+ key with null value: !!null null''');
+ });
+
+ test('[Example 10.5]', () {
+ expectYamlStreamLoads([
+ {'YAML is a superset of JSON': true, 'Pluto is a planet': false}
+ ], '''
+ YAML is a superset of JSON: !!bool true
+ Pluto is a planet: !!bool false''');
+ });
+
+ test('[Example 10.6]', () {
+ expectYamlStreamLoads([
+ {'negative': -12, 'zero': 0, 'positive': 34}
+ ], '''
+ negative: !!int -12
+ zero: !!int 0
+ positive: !!int 34''');
+ });
+
+ test('[Example 10.7]', () {
+ expectYamlStreamLoads([
+ {
+ 'negative': -1,
+ 'zero': 0,
+ 'positive': 23000,
+ 'infinity': infinity,
+ 'not a number': nan
+ }
+ ], '''
+ negative: !!float -1
+ zero: !!float 0
+ positive: !!float 2.3e4
+ infinity: !!float .inf
+ not a number: !!float .nan''');
+ }, skip: 'Fails for single digit float');
+
+ test('[Example 10.8]', () {
+ expectYamlStreamLoads([
+ {
+ 'A null': null,
+ 'Booleans': [true, false],
+ 'Integers': [0, -0, 3, -19],
+ 'Floats': [0, 0, 12000, -200000],
+ // Despite being invalid in the JSON schema, these values are valid in
+ // the core schema which this implementation supports.
+ 'Invalid': [true, null, 7, 0x3A, 12.3]
+ }
+ ], '''
+ A null: null
+ Booleans: [ true, false ]
+ Integers: [ 0, -0, 3, -19 ]
+ Floats: [ 0., -0.0, 12e03, -2E+05 ]
+ Invalid: [ True, Null, 0o7, 0x3A, +12.3 ]''');
+ });
+ });
+
+ group('10.3: Core Schema', () {
+ test('[Example 10.9]', () {
+ expectYamlLoads({
+ 'A null': null,
+ 'Also a null': null,
+ 'Not a null': '',
+ 'Booleans': [true, true, false, false],
+ 'Integers': [0, 7, 0x3A, -19],
+ 'Floats': [0, 0, 0.5, 12000, -200000],
+ 'Also floats': [infinity, -infinity, infinity, nan]
+ }, '''
+ A null: null
+ Also a null: # Empty
+ Not a null: ""
+ Booleans: [ true, True, false, FALSE ]
+ Integers: [ 0, 0o7, 0x3A, -19 ]
+ Floats: [ 0., -0.0, .5, +12e03, -2E+05 ]
+ Also floats: [ .inf, -.Inf, +.INF, .NAN ]''');
+ });
+ });
+
+ test('preserves key order', () {
+ const keys = ['a', 'b', 'c', 'd', 'e', 'f'];
+ var sanityCheckCount = 0;
+ for (var permutation in _generatePermutations(keys)) {
+ final yaml = permutation.map((key) => '$key: value').join('\n');
+ expect(loadYaml(yaml).keys.toList(), permutation);
+ sanityCheckCount++;
+ }
+ final expectedPermutationCount =
+ List.generate(keys.length, (i) => i + 1).reduce((n, i) => n * i);
+ expect(sanityCheckCount, expectedPermutationCount);
+ });
+}
+
+Iterable<List<String>> _generatePermutations(List<String> keys) sync* {
+ if (keys.length <= 1) {
+ yield keys;
+ return;
+ }
+ for (var i = 0; i < keys.length; i++) {
+ final first = keys[i];
+ final rest = <String>[...keys.sublist(0, i), ...keys.sublist(i + 1)];
+ for (var subPermutation in _generatePermutations(rest)) {
+ yield <String>[first, ...subPermutation];
+ }
+ }
+}
diff --git a/pkgs/yaml_edit/.gitignore b/pkgs/yaml_edit/.gitignore
new file mode 100644
index 0000000..7886c3d
--- /dev/null
+++ b/pkgs/yaml_edit/.gitignore
@@ -0,0 +1,3 @@
+/.dart_tool/
+/.packages
+/pubspec.lock
diff --git a/pkgs/yaml_edit/CHANGELOG.md b/pkgs/yaml_edit/CHANGELOG.md
new file mode 100644
index 0000000..9342e9f
--- /dev/null
+++ b/pkgs/yaml_edit/CHANGELOG.md
@@ -0,0 +1,95 @@
+## 2.2.2
+
+- Suppress warnings previously printed to `stdout` when parsing YAML internally.
+- Fix error thrown when inserting duplicate keys into different maps in the
+  same list.
+  ([#69](https://github.com/dart-lang/yaml_edit/issues/69))
+
+- Fix error thrown when inserting into a nested list using the `spliceList`
+  method.
+  ([#83](https://github.com/dart-lang/yaml_edit/issues/83))
+
+- Fix error thrown when a string has spaces and `ScalarStyle.FOLDED` is
+  applied.
+  ([#41](https://github.com/dart-lang/yaml_edit/issues/41)). Resolves
+  ([#86](https://github.com/dart-lang/yaml_edit/issues/86)).
+
+- Require Dart 3.1
+
+- Move to `dart-lang/tools` monorepo.
+
+## 2.2.1
+
+- Require Dart 3.0
+- Fix removal of the last key in a block map when the key has no value
+ ([#55](https://github.com/dart-lang/yaml_edit/issues/55)).
+
+## 2.2.0
+
+- Fix inconsistent line endings when inserting maps into a document that uses
+  `\r\n`.
+ ([#65](https://github.com/dart-lang/yaml_edit/issues/65))
+
+- `AliasError` is changed to `AliasException` and exposed in the public API.
+
+  All node-mutating methods on `YamlEditor`, i.e. `update()`, `appendToList()`,
+  `prependToList()`, `insertIntoList()`, `spliceList()`, and `remove()`, will
+  now throw an exception instead of an error when encountering an alias on the
+  path to modify.
+
+  This makes it possible to catch and handle such cases.
+
+## 2.1.1
+
+- Require Dart 2.19
+
+## 2.1.0
+
+- **Breaking** `wrapAsYamlNode(value, collectionStyle, scalarStyle)` will apply
+  `collectionStyle` and `scalarStyle` recursively when wrapping the children of
+  a `Map` or `List`.
+ While this may change the style of the YAML documents written by applications
+ that rely on the old behavior, such YAML documents should still be valid.
+ Hence, we hope it is reasonable to make this change in a minor release.
+- Fix for cases that can't be encoded correctly with
+ `scalarStyle: ScalarStyle.SINGLE_QUOTED`.
+- Fix `YamlEditor`'s `appendToList` and `insertIntoList` functions inserting
+  the new item into the next YAML item rather than at the end of the list.
+  ([#23](https://github.com/dart-lang/yaml_edit/issues/23))
+
+## 2.0.3
+
+- Updated the value of the pubspec `repository` field.
+
+## 2.0.2
+
+- Fix trailing whitespace after adding new key with block-value to map
+ ([#15](https://github.com/dart-lang/yaml_edit/issues/15)).
+- Updated `repository` and other meta-data in `pubspec.yaml`.
+
+## 2.0.1
+
+- License changed to BSD, as this package is now maintained by the Dart team.
+- Fixed minor lints.
+
+## 2.0.0
+
+- Migrated to null-safety.
+- The API will no longer return `null` in place of a `YamlNode`; instead, a
+  `YamlNode` with `YamlNode.value == null` should be used. These are easily
+  created with `wrapAsYamlNode(null)`.
+
+## 1.0.3
+
+- Fixed bug in adding an empty map as a map value.
+
+## 1.0.2
+
+- Throws an error if the final YAML after editing is not parsable.
+- Fixed bug in adding to empty map values when they are followed by other
+  content.
+
+## 1.0.1
+
+- Updated behavior surrounding list and map removal.
+- Fixed bug in dealing with empty values.
+
+## 1.0.0
+
+- Initial release.
diff --git a/pkgs/yaml_edit/LICENSE b/pkgs/yaml_edit/LICENSE
new file mode 100644
index 0000000..413ed83
--- /dev/null
+++ b/pkgs/yaml_edit/LICENSE
@@ -0,0 +1,26 @@
+Copyright 2020, the Dart project authors.
+
+Redistribution and use in source and binary forms, with or without
+modification, are permitted provided that the following conditions are
+met:
+ * Redistributions of source code must retain the above copyright
+ notice, this list of conditions and the following disclaimer.
+ * Redistributions in binary form must reproduce the above
+ copyright notice, this list of conditions and the following
+ disclaimer in the documentation and/or other materials provided
+ with the distribution.
+ * Neither the name of Google LLC nor the names of its
+ contributors may be used to endorse or promote products derived
+ from this software without specific prior written permission.
+
+THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
diff --git a/pkgs/yaml_edit/README.md b/pkgs/yaml_edit/README.md
new file mode 100644
index 0000000..f10560b
--- /dev/null
+++ b/pkgs/yaml_edit/README.md
@@ -0,0 +1,61 @@
+[](https://github.com/dart-lang/yaml_edit/actions/workflows/test-package.yml)
+[](https://pub.dev/packages/yaml_edit)
+[](https://pub.dev/packages/yaml_edit/publisher)
+[](https://coveralls.io/github/dart-lang/yaml_edit)
+
+A library for [YAML](https://yaml.org) manipulation while preserving comments.
+
+## Usage
+
+A simple usage example:
+
+```dart
+import 'package:yaml_edit/yaml_edit.dart';
+
+void main() {
+ final yamlEditor = YamlEditor('{YAML: YAML}');
+ yamlEditor.update(['YAML'], "YAML Ain't Markup Language");
+ print(yamlEditor);
+ // Expected output:
+ // {YAML: YAML Ain't Markup Language}
+}
+```
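+
+Every change made by the editor is also recorded as a `SourceEdit` (an offset,
+a length, and a replacement string). A minimal sketch of reading a value back
+with `parseAt` and inspecting the recorded edits, using only the API shown in
+this package:
+
+```dart
+import 'package:yaml_edit/yaml_edit.dart';
+
+void main() {
+  final yamlEditor = YamlEditor('{YAML: YAML}');
+  yamlEditor.update(['YAML'], "YAML Ain't Markup Language");
+
+  // Read the updated value back out of the document.
+  print(yamlEditor.parseAt(['YAML']).value);
+
+  // Each modification is recorded as a SourceEdit.
+  for (final edit in yamlEditor.edits) {
+    print('offset: ${edit.offset}, length: ${edit.length}, '
+        'replacement: ${edit.replacement}');
+  }
+}
+```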
+
+### Example: Converting JSON to YAML (block formatted)
+
+```dart
+import 'dart:convert' show json;
+
+import 'package:yaml_edit/yaml_edit.dart';
+
+void main() {
+ final jsonString = r'''
+{
+ "key": "value",
+ "list": [
+ "first",
+ "second",
+ "last entry in the list"
+ ],
+ "map": {
+ "multiline": "this is a fairly long string with\nline breaks..."
+ }
+}
+''';
+ final jsonValue = json.decode(jsonString);
+
+ // Convert jsonValue to YAML
+ final yamlEditor = YamlEditor('');
+ yamlEditor.update([], jsonValue);
+ print(yamlEditor.toString());
+}
+```
+
+## Testing
+
+Testing is done using two strategies: Unit testing (`/test/editor_test.dart`) and
+Golden testing (`/test/golden_test.dart`). More information on Golden testing
+and the input/output format can be found at `/test/testdata/README.md`.
+
+These tests are run with `dart test`.
+
+## Limitations
+
+1. Users are not allowed to define tags in the modifications.
+2. Map keys will always be added in the flow style.
diff --git a/pkgs/yaml_edit/analysis_options.yaml b/pkgs/yaml_edit/analysis_options.yaml
new file mode 100644
index 0000000..937e7fe
--- /dev/null
+++ b/pkgs/yaml_edit/analysis_options.yaml
@@ -0,0 +1,8 @@
+include: package:dart_flutter_team_lints/analysis_options.yaml
+
+analyzer:
+ errors:
+ inference_failure_on_collection_literal: ignore
+ inference_failure_on_function_invocation: ignore
+ inference_failure_on_function_return_type: ignore
+ inference_failure_on_instance_creation: ignore
diff --git a/pkgs/yaml_edit/example/example.dart b/pkgs/yaml_edit/example/example.dart
new file mode 100644
index 0000000..d49c39b
--- /dev/null
+++ b/pkgs/yaml_edit/example/example.dart
@@ -0,0 +1,12 @@
+import 'package:yaml_edit/yaml_edit.dart';
+
+void main() {
+ final doc = YamlEditor('''
+- 0 # comment 0
+- 1 # comment 1
+- 2 # comment 2
+''');
+ doc.remove([1]);
+
+ print(doc);
+}
diff --git a/pkgs/yaml_edit/example/json2yaml.dart b/pkgs/yaml_edit/example/json2yaml.dart
new file mode 100644
index 0000000..d6204d3
--- /dev/null
+++ b/pkgs/yaml_edit/example/json2yaml.dart
@@ -0,0 +1,28 @@
+// Copyright (c) 2023, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:convert' show json;
+
+import 'package:yaml_edit/yaml_edit.dart';
+
+void main() {
+ final jsonString = r'''
+{
+ "key": "value",
+ "list": [
+ "first",
+ "second",
+ "last entry in the list"
+ ],
+ "map": {
+ "multiline": "this is a fairly long string with\nline breaks..."
+ }
+}
+''';
+ final jsonValue = json.decode(jsonString);
+
+ final yamlEditor = YamlEditor('');
+ yamlEditor.update([], jsonValue);
+ print(yamlEditor.toString());
+}
diff --git a/pkgs/yaml_edit/lib/src/editor.dart b/pkgs/yaml_edit/lib/src/editor.dart
new file mode 100644
index 0000000..54775cc
--- /dev/null
+++ b/pkgs/yaml_edit/lib/src/editor.dart
@@ -0,0 +1,634 @@
+// Copyright (c) 2020, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:meta/meta.dart';
+import 'package:source_span/source_span.dart';
+import 'package:yaml/yaml.dart';
+
+import 'equality.dart';
+import 'errors.dart';
+import 'list_mutations.dart';
+import 'map_mutations.dart';
+import 'source_edit.dart';
+import 'strings.dart';
+import 'utils.dart';
+import 'wrap.dart';
+
+/// An interface for modifying [YAML][1] documents while preserving comments
+/// and whitespace.
+///
+/// YAML parsing is supported by `package:yaml`, and modifications are performed
+/// as string operations. An error will be thrown if internal assertions fail -
+/// such a situation should be extremely rare, and should only occur with
+/// degenerate formatting.
+///
+/// Most modification methods require the user to pass in an `Iterable<Object>`
+/// path that holds the keys/indices to navigate to the element.
+///
+/// **Example:**
+/// ```yaml
+/// a: 1
+/// b: 2
+/// c:
+/// - 3
+/// - 4
+/// - {e: 5, f: [6, 7]}
+/// ```
+///
+/// To get to `7`, our path will be `['c', 2, 'f', 1]`. The path for the base
+/// object is the empty array `[]`. All modification methods will throw an
+/// [ArgumentError] if the path provided is invalid. Note also that the
+/// order of elements in the path is important, and it should be arranged in
+/// order of calling, with the first element being the first key or index to be
+/// called.
+///
+/// In most modification methods, users are required to pass in a value to be
+/// used for updating the YAML tree. This value is only allowed to either be a
+/// valid scalar that is recognizable by YAML (i.e. `bool`, `String`, `List`,
+/// `Map`, `num`, `null`) or a [YamlNode]. Should the user want to specify
+/// the style to be applied to the value passed in, the user may wrap the value
+/// using [wrapAsYamlNode] while passing in the appropriate `scalarStyle` or
+/// `collectionStyle`. While we try to respect the style that is passed in,
+/// there will be instances where the formatting will not result in valid YAML,
+/// and as such we will fall back to a default formatting while preserving the
+/// content.
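+///
+/// **Example:** (an illustrative sketch of specifying a style; assumes
+/// [wrapAsYamlNode] accepts a named `collectionStyle` parameter)
+/// ```dart
+/// final doc = YamlEditor('a: 1');
+/// doc.update(
+///   ['b'],
+///   wrapAsYamlNode([1, 2, 3], collectionStyle: CollectionStyle.BLOCK),
+/// );
+/// // Roughly expected result:
+/// // a: 1
+/// // b:
+/// //   - 1
+/// //   - 2
+/// //   - 3
+/// ```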
+///
+/// To dump the YAML after all the modifications have been completed, simply
+/// call [toString()].
+///
+/// [1]: https://yaml.org/
+@sealed
+class YamlEditor {
+ final List<SourceEdit> _edits = [];
+
+ /// List of [SourceEdit]s that have been applied to [_yaml] since the creation
+ /// of this instance, in chronological order. Intended to be compatible with
+ /// `package:analysis_server`.
+ ///
+ /// The [SourceEdit] objects can be serialized to JSON using the `toJSON`
+ /// function, deserialized using [SourceEdit.fromJson], and applied to a
+ /// string using the `apply` function. Multiple [SourceEdit]s can be applied
+ /// to a string using [SourceEdit.applyAll].
+ ///
+ /// For more information, refer to the [SourceEdit] class.
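+  ///
+  /// A minimal sketch of replaying the recorded edits on the original string:
+  /// ```dart
+  /// final editor = YamlEditor('a: 1');
+  /// editor.update(['a'], 2);
+  /// // Applying the recorded edits to the original input reproduces the
+  /// // editor's current state.
+  /// print(SourceEdit.applyAll('a: 1', editor.edits)); // a: 2
+  /// ```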
+ List<SourceEdit> get edits => [..._edits];
+
+ /// Current YAML string.
+ String _yaml;
+
+ /// Root node of YAML AST.
+ YamlNode _contents;
+
+ /// Stores the list of nodes in [_contents] that are connected by aliases.
+ ///
+ /// When a node is anchored with an alias and subsequently referenced,
+ /// the full content of the anchored node is thought to be copied in the
+ /// following references.
+ ///
+ /// **Example:**
+  /// ```yaml
+ /// a: &SS Sammy Sosa
+ /// b: *SS
+ /// ```
+ ///
+ /// is equivalent to
+ ///
+  /// ```yaml
+ /// a: Sammy Sosa
+ /// b: Sammy Sosa
+ /// ```
+ ///
+ /// As such, aliased nodes have to be treated with special caution when
+ /// any modification is taking place.
+ ///
+ /// See 7.1 Alias Nodes: https://yaml.org/spec/1.2/spec.html#id2786196
+ Set<YamlNode> _aliases = {};
+
+ /// Returns the current YAML string.
+ @override
+ String toString() => _yaml;
+
+ factory YamlEditor(String yaml) => YamlEditor._(yaml);
+
+ YamlEditor._(this._yaml) : _contents = loadYamlNode(_yaml) {
+ _initialize();
+ }
+
+  /// Traverses the YAML tree to detect alias nodes.
+ void _initialize() {
+ _aliases = {};
+
+ /// Performs a DFS on [_contents] to detect alias nodes.
+ final visited = <YamlNode>{};
+ void collectAliases(YamlNode node) {
+ if (visited.add(node)) {
+ if (node is YamlMap) {
+ node.nodes.forEach((key, value) {
+ collectAliases(key as YamlNode);
+ collectAliases(value);
+ });
+ } else if (node is YamlList) {
+ node.nodes.forEach(collectAliases);
+ }
+ } else {
+ _aliases.add(node);
+ }
+ }
+
+ collectAliases(_contents);
+ }
+
+ /// Parses the document to return [YamlNode] currently present at [path].
+ ///
+ /// If no [YamlNode]s exist at [path], the result of invoking the [orElse]
+ /// function is returned.
+ ///
+  /// If [orElse] is omitted, it defaults to throwing an [ArgumentError].
+ ///
+ /// To get a default value when [path] does not point to a value in the
+ /// [YamlNode]-tree, simply pass `orElse: () => ...`.
+ ///
+ /// **Example:** (using orElse)
+ /// ```dart
+  /// final myYamlEditor = YamlEditor('{"key": "value"}');
+ /// final node = myYamlEditor.valueAt(
+ /// ['invalid', 'path'],
+ /// orElse: () => wrapAsYamlNode(null),
+ /// );
+ /// print(node.value); // null
+ /// ```
+ ///
+ /// **Example:** (common usage)
+ /// ```dart
+ /// final doc = YamlEditor('''
+ /// a: 1
+ /// b:
+ /// d: 4
+ /// e: [5, 6, 7]
+ /// c: 3
+ /// ''');
+ /// print(doc.parseAt(['b', 'e', 2])); // 7
+ /// ```
+  /// The value returned by [parseAt] is invalidated when the document is
+ /// mutated, as illustrated below:
+ ///
+ /// **Example:** (old [parseAt] value is invalidated)
+ /// ```dart
+ /// final doc = YamlEditor("YAML: YAML Ain't Markup Language");
+ /// final node = doc.parseAt(['YAML']);
+ ///
+ /// print(node.value); // Expected output: "YAML Ain't Markup Language"
+ ///
+ /// doc.update(['YAML'], 'YAML');
+ ///
+ /// final newNode = doc.parseAt(['YAML']);
+ ///
+ /// // Note that the value does not change
+ /// print(newNode.value); // "YAML"
+ /// print(node.value); // "YAML Ain't Markup Language"
+ /// ```
+ YamlNode parseAt(Iterable<Object?> path, {YamlNode Function()? orElse}) {
+ return _traverse(path, orElse: orElse);
+ }
+
+ /// Sets [value] in the [path].
+ ///
+ /// There is a subtle difference between [update] and [remove] followed by
+ /// an [insertIntoList], because [update] preserves comments at the same
+ /// level.
+ ///
+  /// Throws an [ArgumentError] if [path] is invalid.
+ ///
+ /// Throws an [AliasException] if a node on [path] is an alias or anchor.
+ ///
+ /// **Example:** (using [update])
+ /// ```dart
+ /// final doc = YamlEditor('''
+ /// - 0
+ /// - 1 # comment
+ /// - 2
+ /// ''');
+ /// doc.update([1], 'test');
+ /// ```
+ ///
+ /// **Expected Output:**
+ /// ```yaml
+ /// - 0
+ /// - test # comment
+ /// - 2
+ /// ```
+ ///
+ /// **Example:** (using [remove] and [insertIntoList])
+ /// ```dart
+ /// final doc2 = YamlEditor('''
+ /// - 0
+ /// - 1 # comment
+ /// - 2
+ /// ''');
+ /// doc2.remove([1]);
+ /// doc2.insertIntoList([], 1, 'test');
+ /// ```
+ ///
+ /// **Expected Output:**
+ /// ```yaml
+ /// - 0
+ /// - test
+ /// - 2
+ /// ```
+ void update(Iterable<Object?> path, Object? value) {
+ final valueNode = wrapAsYamlNode(value);
+
+ if (path.isEmpty) {
+ final start = _contents.span.start.offset;
+ final end = getContentSensitiveEnd(_contents);
+ final lineEnding = getLineEnding(_yaml);
+ final edit = SourceEdit(
+ start, end - start, yamlEncodeBlock(valueNode, 0, lineEnding));
+
+ return _performEdit(edit, path, valueNode);
+ }
+
+ final pathAsList = path.toList();
+ final collectionPath = pathAsList.take(path.length - 1);
+ final keyOrIndex = pathAsList.last;
+ final parentNode = _traverse(collectionPath, checkAlias: true);
+
+ if (parentNode is YamlList) {
+ if (keyOrIndex is! int) {
+ throw PathError(path, path, parentNode);
+ }
+ final expected = wrapAsYamlNode(
+ [...parentNode.nodes]..[keyOrIndex] = valueNode,
+ );
+
+ return _performEdit(updateInList(this, parentNode, keyOrIndex, valueNode),
+ collectionPath, expected);
+ }
+
+ if (parentNode is YamlMap) {
+ final expectedMap =
+ updatedYamlMap(parentNode, (nodes) => nodes[keyOrIndex] = valueNode);
+ return _performEdit(updateInMap(this, parentNode, keyOrIndex, valueNode),
+ collectionPath, expectedMap);
+ }
+
+ throw PathError.unexpected(
+ path, 'Scalar $parentNode does not have key $keyOrIndex');
+ }
+
+ /// Appends [value] to the list at [path].
+ ///
+  /// Throws an [ArgumentError] if the element at the given path is not a
+ /// [YamlList] or if the path is invalid.
+ ///
+ /// Throws an [AliasException] if a node on [path] is an alias or anchor.
+ ///
+ /// **Example:**
+ /// ```dart
+ /// final doc = YamlEditor('[0, 1]');
+ /// doc.appendToList([], 2); // [0, 1, 2]
+ /// ```
+ void appendToList(Iterable<Object?> path, Object? value) {
+ final yamlList = _traverseToList(path);
+
+ insertIntoList(path, yamlList.length, value);
+ }
+
+ /// Prepends [value] to the list at [path].
+ ///
+  /// Throws an [ArgumentError] if the element at the given path is not a
+ /// [YamlList] or if the path is invalid.
+ ///
+ /// Throws an [AliasException] if a node on [path] is an alias or anchor.
+ ///
+ /// **Example:**
+ /// ```dart
+ /// final doc = YamlEditor('[1, 2]');
+ /// doc.prependToList([], 0); // [0, 1, 2]
+ /// ```
+ void prependToList(Iterable<Object?> path, Object? value) {
+ insertIntoList(path, 0, value);
+ }
+
+ /// Inserts [value] into the list at [path].
+ ///
+ /// [index] must be non-negative and no greater than the list's length.
+ ///
+  /// Throws an [ArgumentError] if the element at the given path is not a
+ /// [YamlList] or if the path is invalid.
+ ///
+ /// Throws an [AliasException] if a node on [path] is an alias or anchor.
+ ///
+ /// **Example:**
+ /// ```dart
+ /// final doc = YamlEditor('[0, 2]');
+ /// doc.insertIntoList([], 1, 1); // [0, 1, 2]
+ /// ```
+ void insertIntoList(Iterable<Object?> path, int index, Object? value) {
+ final valueNode = wrapAsYamlNode(value);
+
+ final list = _traverseToList(path, checkAlias: true);
+ RangeError.checkValueInInterval(index, 0, list.length);
+
+ final edit = insertInList(this, list, index, valueNode);
+ final expected = wrapAsYamlNode(
+ [...list.nodes]..insert(index, valueNode),
+ );
+
+ _performEdit(edit, path, expected);
+ }
+
+ /// Changes the contents of the list at [path] by removing [deleteCount]
+ /// items at [index], and inserting [values] in-place. Returns the elements
+ /// that are deleted.
+ ///
+ /// [index] and [deleteCount] must be non-negative and [index] + [deleteCount]
+ /// must be no greater than the list's length.
+ ///
+  /// Throws an [ArgumentError] if the element at the given path is not a
+ /// [YamlList] or if the path is invalid.
+ ///
+ /// Throws an [AliasException] if a node on [path] is an alias or anchor.
+ ///
+ /// **Example:**
+ /// ```dart
+ /// final doc = YamlEditor('[Jan, March, April, June]');
+ /// doc.spliceList([], 1, 0, ['Feb']); // [Jan, Feb, March, April, June]
+ /// doc.spliceList([], 4, 1, ['May']); // [Jan, Feb, March, April, May]
+ /// ```
+ Iterable<YamlNode> spliceList(Iterable<Object?> path, int index,
+ int deleteCount, Iterable<Object?> values) {
+ final list = _traverseToList(path, checkAlias: true);
+
+ RangeError.checkValueInInterval(index, 0, list.length);
+ RangeError.checkValueInInterval(index + deleteCount, 0, list.length);
+
+ final nodesToRemove = list.nodes.getRange(index, index + deleteCount);
+
+    // Perform the addition of elements before the removal so that, where
+    // possible, a block list does not get emptied out to `[]`, which would
+    // change its collection style.
+
+ // Reverse [values] and insert them.
+ final reversedValues = values.toList().reversed;
+ for (final value in reversedValues) {
+ insertIntoList(path, index, value);
+ }
+
+ for (var i = 0; i < deleteCount; i++) {
+ remove([...path, index + values.length]);
+ }
+
+ return nodesToRemove;
+ }
+
+ /// Removes the node at [path]. Comments "belonging" to the node will be
+ /// removed while surrounding comments will be left untouched.
+ ///
+ /// Throws an [ArgumentError] if [path] is invalid.
+ ///
+ /// Throws an [AliasException] if a node on [path] is an alias or anchor.
+ ///
+ /// **Example:**
+ /// ```dart
+ /// final doc = YamlEditor('''
+ /// - 0 # comment 0
+ /// # comment A
+ /// - 1 # comment 1
+ /// # comment B
+ /// - 2 # comment 2
+ /// ''');
+ /// doc.remove([1]);
+ /// ```
+ ///
+ /// **Expected Result:**
+ /// ```dart
+ /// '''
+ /// - 0 # comment 0
+ /// # comment A
+ /// # comment B
+ /// - 2 # comment 2
+ /// '''
+ /// ```
+ YamlNode remove(Iterable<Object?> path) {
+ late SourceEdit edit;
+ late YamlNode expectedNode;
+ final nodeToRemove = _traverse(path, checkAlias: true);
+
+ if (path.isEmpty) {
+ edit = SourceEdit(0, _yaml.length, '');
+ expectedNode = wrapAsYamlNode(null);
+
+      /// Parsing an empty YAML document returns a YamlScalar with value `null`.
+ _performEdit(edit, path, expectedNode);
+ return nodeToRemove;
+ }
+
+ final pathAsList = path.toList();
+ final collectionPath = pathAsList.take(path.length - 1);
+ final keyOrIndex = pathAsList.last;
+ final parentNode = _traverse(collectionPath);
+
+ if (parentNode is YamlList) {
+ edit = removeInList(this, parentNode, keyOrIndex as int);
+ expectedNode = wrapAsYamlNode(
+ [...parentNode.nodes]..removeAt(keyOrIndex),
+ );
+ } else if (parentNode is YamlMap) {
+ edit = removeInMap(this, parentNode, keyOrIndex);
+
+ expectedNode =
+ updatedYamlMap(parentNode, (nodes) => nodes.remove(keyOrIndex));
+ }
+
+ _performEdit(edit, collectionPath, expectedNode);
+
+ return nodeToRemove;
+ }
+
+ /// Traverses down [path] to return the [YamlNode] at [path] if successful.
+ ///
+ /// If no [YamlNode]s exist at [path], the result of invoking the [orElse]
+ /// function is returned.
+ ///
+ /// If [orElse] is omitted, it defaults to throwing a [PathError].
+ ///
+  /// If [checkAlias] is `true`, an [AliasException] is thrown if an aliased
+  /// node is encountered.
+ YamlNode _traverse(Iterable<Object?> path,
+ {bool checkAlias = false, YamlNode Function()? orElse}) {
+ if (path.isEmpty) return _contents;
+
+ var currentNode = _contents;
+ final pathList = path.toList();
+
+ for (var i = 0; i < pathList.length; i++) {
+ final keyOrIndex = pathList[i];
+
+ if (checkAlias && _aliases.contains(currentNode)) {
+ throw AliasException(path, currentNode);
+ }
+
+ if (currentNode is YamlList) {
+ final list = currentNode;
+ if (!isValidIndex(keyOrIndex, list.length)) {
+ return _pathErrorOrElse(path, path.take(i + 1), list, orElse);
+ }
+
+ currentNode = list.nodes[keyOrIndex as int];
+ } else if (currentNode is YamlMap) {
+ final map = currentNode;
+
+ if (!containsKey(map, keyOrIndex)) {
+ return _pathErrorOrElse(path, path.take(i + 1), map, orElse);
+ }
+ final keyNode = getKeyNode(map, keyOrIndex);
+
+ if (checkAlias) {
+ if (_aliases.contains(keyNode)) throw AliasException(path, keyNode);
+ }
+
+ currentNode = map.nodes[keyNode]!;
+ } else {
+ return _pathErrorOrElse(path, path.take(i + 1), currentNode, orElse);
+ }
+ }
+
+ if (checkAlias) _assertNoChildAlias(path, currentNode);
+
+ return currentNode;
+ }
+
+ /// Throws a [PathError] if [orElse] is not provided, returns the result
+ /// of invoking the [orElse] function otherwise.
+ YamlNode _pathErrorOrElse(Iterable<Object?> path, Iterable<Object?> subPath,
+ YamlNode parent, YamlNode Function()? orElse) {
+ if (orElse == null) throw PathError(path, subPath, parent);
+ return orElse();
+ }
+
+  /// Asserts that [node] and none of its children are aliases.
+ void _assertNoChildAlias(Iterable<Object?> path, [YamlNode? node]) {
+ if (node == null) return _assertNoChildAlias(path, _traverse(path));
+ if (_aliases.contains(node)) throw AliasException(path, node);
+
+ if (node is YamlScalar) return;
+
+ if (node is YamlList) {
+ for (var i = 0; i < node.length; i++) {
+ final updatedPath = [...path, i];
+ _assertNoChildAlias(updatedPath, node.nodes[i]);
+ }
+ }
+
+ if (node is YamlMap) {
+ final keyList = node.keys.toList();
+ for (var i = 0; i < node.length; i++) {
+ final updatedPath = [...path, keyList[i]];
+ if (_aliases.contains(keyList[i])) {
+ throw AliasException(path, keyList[i] as YamlNode);
+ }
+ _assertNoChildAlias(updatedPath, node.nodes[keyList[i]]);
+ }
+ }
+ }
+
+ /// Traverses down the provided [path] to return the [YamlList] at [path].
+ ///
+ /// Convenience function to ensure that a [YamlList] is returned.
+ ///
+ /// Throws [ArgumentError] if the element at the given path is not a
+ /// [YamlList] or if the path is invalid. If [checkAlias] is `true`, and an
+ /// aliased node is encountered along [path], an [AliasException] will be
+ /// thrown.
+ YamlList _traverseToList(Iterable<Object?> path, {bool checkAlias = false}) {
+ final possibleList = _traverse(path, checkAlias: checkAlias);
+
+ if (possibleList is YamlList) {
+ return possibleList;
+ } else {
+ throw PathError.unexpected(
+ path, 'Path $path does not point to a YamlList!');
+ }
+ }
+
+ /// Utility method to replace the substring of [_yaml] according to [edit].
+ ///
+  /// When [_yaml] is modified with this method, the resulting string is
+  /// re-parsed and traversed down [path] to ensure that the reloaded YAML
+ /// tree is equal to our expectations by deep equality of values. Throws an
+ /// [AssertionError] if the two trees do not match.
+ void _performEdit(
+ SourceEdit edit, Iterable<Object?> path, YamlNode expectedNode) {
+ final expectedTree = _deepModify(_contents, path, [], expectedNode);
+ final initialYaml = _yaml;
+ _yaml = edit.apply(_yaml);
+
+ try {
+ _initialize();
+ } on YamlException {
+ throw createAssertionError(
+ 'Failed to produce valid YAML after modification.',
+ initialYaml,
+ _yaml);
+ }
+
+ final actualTree = withYamlWarningCallback(() => loadYamlNode(_yaml));
+ if (!deepEquals(actualTree, expectedTree)) {
+ throw createAssertionError(
+ 'Modification did not result in expected result.',
+ initialYaml,
+ _yaml);
+ }
+
+ _contents = actualTree;
+ _edits.add(edit);
+ }
+
+ /// Utility method to produce an updated YAML tree equivalent to converting
+ /// the [YamlNode] at [path] to be [expectedNode]. [subPath] holds the portion
+ /// of [path] that has been traversed thus far.
+ ///
+ /// Throws a [PathError] if path is invalid.
+ ///
+ /// When called, it creates a new [YamlNode] of the same type as [tree], and
+ /// copies its children over, except for the child that is on the path. Doing
+ /// so allows us to "update" the immutable [YamlNode] without having to clone
+ /// the whole tree.
+ ///
+ /// [SourceSpan]s in this new tree are not guaranteed to be accurate.
+ YamlNode _deepModify(YamlNode tree, Iterable<Object?> path,
+ Iterable<Object?> subPath, YamlNode expectedNode) {
+ RangeError.checkValueInInterval(subPath.length, 0, path.length);
+
+ if (path.length == subPath.length) return expectedNode;
+
+ final keyOrIndex = path.elementAt(subPath.length);
+
+ if (tree is YamlList) {
+ if (!isValidIndex(keyOrIndex, tree.length)) {
+ throw PathError(path, subPath, tree);
+ }
+
+ return wrapAsYamlNode([...tree.nodes]..[keyOrIndex as int] = _deepModify(
+ tree.nodes[keyOrIndex],
+ path,
+ path.take(subPath.length + 1),
+ expectedNode));
+ }
+
+ if (tree is YamlMap) {
+ return updatedYamlMap(
+ tree,
+ (nodes) => nodes[keyOrIndex] = _deepModify(
+ nodes[keyOrIndex] as YamlNode,
+ path,
+ path.take(subPath.length + 1),
+ expectedNode));
+ }
+
+ /// Should not ever reach here.
+ throw PathError(path, subPath, tree);
+ }
+}
diff --git a/pkgs/yaml_edit/lib/src/equality.dart b/pkgs/yaml_edit/lib/src/equality.dart
new file mode 100644
index 0000000..0c6a952
--- /dev/null
+++ b/pkgs/yaml_edit/lib/src/equality.dart
@@ -0,0 +1,116 @@
+// Copyright (c) 2020, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:collection';
+
+import 'package:collection/collection.dart';
+import 'package:yaml/yaml.dart';
+
+/// Creates a map that uses our custom [deepEquals] and [deepHashCode] functions
+/// to determine equality.
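+///
+/// For example (illustrative sketch), structurally equal keys hit the same
+/// entry even when the lookup key is a different object:
+/// ```dart
+/// final map = deepEqualsMap<Object?, String>();
+/// map[[1, 2]] = 'a';
+/// print(map[[1, 2]]); // a
+/// ```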
+Map<K, V> deepEqualsMap<K, V>() =>
+ LinkedHashMap(equals: deepEquals, hashCode: deepHashCode);
+
+/// Compares two [Object]s for deep equality. This implementation differs from
+/// `package:yaml`'s notion of deep equality by allowing for comparison of
+/// non-scalar map keys.
+bool deepEquals(dynamic obj1, dynamic obj2) {
+ if (obj1 is YamlNode) obj1 = obj1.value;
+ if (obj2 is YamlNode) obj2 = obj2.value;
+
+ if (obj1 is Map && obj2 is Map) {
+ return mapDeepEquals(obj1, obj2);
+ }
+
+ if (obj1 is List && obj2 is List) {
+ return listDeepEquals(obj1, obj2);
+ }
+
+ return obj1 == obj2;
+}
+
+/// Compares two [List]s for deep equality.
+bool listDeepEquals(List list1, List list2) {
+ if (list1.length != list2.length) return false;
+
+ if (list1 is YamlList) list1 = list1.nodes;
+ if (list2 is YamlList) list2 = list2.nodes;
+
+ for (var i = 0; i < list1.length; i++) {
+ if (!deepEquals(list1[i], list2[i])) {
+ return false;
+ }
+ }
+
+ return true;
+}
+
+/// Compares two [Map]s for deep equality. Differs from `package:yaml`'s
+/// notion of deep equality by allowing for comparison of non-scalar map keys.
+bool mapDeepEquals(Map map1, Map map2) {
+ if (map1.length != map2.length) return false;
+
+  if (map1 is YamlMap) map1 = map1.nodes;
+  if (map2 is YamlMap) map2 = map2.nodes;
+
+ return map1.keys.every((key) {
+ if (!containsKey(map2, key)) return false;
+
+    /// Two keys may be equal by deep equality, yet looking one of them up in
+    /// the other map may miss, since both maps may not be using our
+    /// [deepEqualsMap]. So find the equivalent key in [map2] explicitly.
+ final key2 = getKey(map2, key);
+
+ if (!deepEquals(map1[key], map2[key2])) {
+ return false;
+ }
+
+ return true;
+ });
+}
+
+/// Returns a hashcode for [value] such that structures that are equal by
+/// [deepEquals] will have the same hash code.
+int deepHashCode(Object? value) {
+ if (value is Map) {
+ const equality = UnorderedIterableEquality();
+ return equality.hash(value.keys.map(deepHashCode)) ^
+ equality.hash(value.values.map(deepHashCode));
+ } else if (value is Iterable) {
+ return const IterableEquality().hash(value.map(deepHashCode));
+ } else if (value is YamlScalar) {
+ return (value.value as Object?).hashCode;
+ }
+
+ return value.hashCode;
+}
+
+/// Returns the [YamlNode] corresponding to the provided [key].
+YamlNode getKeyNode(YamlMap map, Object? key) {
+ return map.nodes.keys.firstWhere((node) => deepEquals(node, key)) as YamlNode;
+}
+
+/// Returns the [YamlNode] after the [YamlNode] corresponding to the provided
+/// [key].
+YamlNode? getNextKeyNode(YamlMap map, Object? key) {
+ final keyIterator = map.nodes.keys.iterator;
+ while (keyIterator.moveNext()) {
+ if (deepEquals(keyIterator.current, key) && keyIterator.moveNext()) {
+ return keyIterator.current as YamlNode?;
+ }
+ }
+
+ return null;
+}
+
+/// Returns the key in [map] that is equal to the provided [key] by the notion
+/// of deep equality.
+Object? getKey(Map map, Object? key) {
+ return map.keys.firstWhere((k) => deepEquals(k, key));
+}
+
+/// Checks if [map] has any keys equal to the provided [key] by deep equality.
+bool containsKey(Map map, Object? key) {
+ return map.keys.where((node) => deepEquals(node, key)).isNotEmpty;
+}
diff --git a/pkgs/yaml_edit/lib/src/errors.dart b/pkgs/yaml_edit/lib/src/errors.dart
new file mode 100644
index 0000000..0e60dd8
--- /dev/null
+++ b/pkgs/yaml_edit/lib/src/errors.dart
@@ -0,0 +1,99 @@
+// Copyright (c) 2020, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:meta/meta.dart';
+import 'package:yaml/yaml.dart';
+
+/// Error thrown when a function is passed an invalid path.
+@sealed
+class PathError extends ArgumentError {
+ /// The full path that caused the error
+ final Iterable<Object?> path;
+
+ /// The subpath that caused the error
+ final Iterable<Object?> subPath;
+
+ /// The last element of [path] that could be traversed.
+ YamlNode? parent;
+
+ PathError(this.path, this.subPath, this.parent, [String? message])
+ : super.value(subPath, 'path', message);
+
+ PathError.unexpected(this.path, String message)
+ : subPath = path,
+ super(message);
+
+ @override
+ String toString() {
+ if (message == null) {
+ var errorMessage = 'Failed to traverse to subpath $subPath!';
+
+ if (subPath.isNotEmpty) {
+ errorMessage +=
+ ' Parent $parent does not contain key or index ${subPath.last}';
+ }
+
+ return 'Invalid path: $path. $errorMessage.';
+ }
+
+ return 'Invalid path: $path. $message';
+ }
+}
+
+/// Exception thrown when the path contains an alias along the way.
+///
+/// When a path contains an aliased node, the behavior becomes less well-defined
+/// because we cannot be certain if the user wishes for the change to propagate
+/// throughout all the other aliased nodes, or if the user wishes for only that
+/// particular node to be modified. As such, [AliasException] reflects the
+/// detection that our change will impact an alias, and we do not intend to
+/// support such changes for the foreseeable future.
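+///
+/// A minimal sketch of catching it:
+/// ```dart
+/// final doc = YamlEditor('a: &anchor 1\nb: *anchor\n');
+/// try {
+///   doc.remove(['b']); // 'b' refers to the anchored node.
+/// } on AliasException {
+///   // Decide here how the unsupported edit should be handled.
+/// }
+/// ```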
+@sealed
+class AliasException extends FormatException {
+ /// The path that caused the error
+ final Iterable<Object?> path;
+
+ /// The anchor node of the alias
+ final YamlNode anchor;
+
+ AliasException(this.path, this.anchor)
+ : super('Encountered an alias node along $path! '
+            'Alias nodes are nodes that refer to previously serialized '
+ 'nodes, and are denoted by either the "*" or the "&" indicators in '
+ 'the original YAML. As the resulting behavior of mutations on '
+ 'these nodes is not well-defined, the operation will not be '
+ 'supported by this library.\n\n'
+ '${anchor.span.message('The alias was first defined here.')}');
+}
+
+/// Error thrown when an assertion about the YAML fails. Extends
+/// [AssertionError] to override the [toString] method for pretty printing.
+class _YamlAssertionError extends AssertionError {
+ _YamlAssertionError(super.message);
+
+ @override
+ String toString() {
+ if (message != null) {
+ return 'Assertion failed: $message';
+ }
+ return 'Assertion failed';
+ }
+}
+
+/// Creates an [AssertionError] with the given [message], formatting
+/// [oldYaml] and [newYaml] for context.
+Error createAssertionError(String message, String oldYaml, String newYaml) {
+ return _YamlAssertionError('''
+(package:yaml_edit) $message
+
+# YAML before edit:
+> ${oldYaml.replaceAll('\n', '\n> ')}
+
+# YAML after edit:
+> ${newYaml.replaceAll('\n', '\n> ')}
+
+Please file an issue at:
+https://github.com/dart-lang/yaml_edit/issues/new?labels=bug
+''');
+}
diff --git a/pkgs/yaml_edit/lib/src/list_mutations.dart b/pkgs/yaml_edit/lib/src/list_mutations.dart
new file mode 100644
index 0000000..17da6dd
--- /dev/null
+++ b/pkgs/yaml_edit/lib/src/list_mutations.dart
@@ -0,0 +1,403 @@
+// Copyright (c) 2020, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:yaml/yaml.dart';
+
+import 'editor.dart';
+import 'source_edit.dart';
+import 'strings.dart';
+import 'utils.dart';
+import 'wrap.dart';
+
+/// Returns a [SourceEdit] describing the change to be made on [yamlEdit] to
+/// achieve the effect of setting the element at [index] to [newValue] when
+/// re-parsed.
+SourceEdit updateInList(
+ YamlEditor yamlEdit, YamlList list, int index, YamlNode newValue) {
+ RangeError.checkValueInInterval(index, 0, list.length - 1);
+
+ final currValue = list.nodes[index];
+ var offset = currValue.span.start.offset;
+ final yaml = yamlEdit.toString();
+ String valueString;
+
+  /// We do not use [_formatNewBlock] since we want to only replace the
+  /// contents of this node while preserving comments/whitespace, whereas
+  /// [_formatNewBlock] produces a string representation of a new node.
+ if (list.style == CollectionStyle.BLOCK) {
+ final listIndentation = getListIndentation(yaml, list);
+ final indentation = listIndentation + getIndentation(yamlEdit);
+ final lineEnding = getLineEnding(yaml);
+ valueString =
+ yamlEncodeBlock(wrapAsYamlNode(newValue), indentation, lineEnding);
+
+ /// We prefer the compact nested notation for collections.
+ ///
+    /// By virtue of [yamlEncodeBlock], collections automatically
+ /// have the necessary line endings.
+ if ((newValue is List && (newValue as List).isNotEmpty) ||
+ (newValue is Map && (newValue as Map).isNotEmpty)) {
+ valueString = valueString.substring(indentation);
+ } else if (currValue.collectionStyle == CollectionStyle.BLOCK) {
+ valueString += lineEnding;
+ }
+
+ var end = getContentSensitiveEnd(currValue);
+ if (end <= offset) {
+ offset++;
+ end = offset;
+ valueString = ' $valueString';
+ }
+
+ return SourceEdit(offset, end - offset, valueString);
+ } else {
+ valueString = yamlEncodeFlow(newValue);
+ return SourceEdit(offset, currValue.span.length, valueString);
+ }
+}
+
+/// Returns a [SourceEdit] describing the change to be made on [yamlEdit] to
+/// achieve the effect of appending [item] to the list.
+SourceEdit appendIntoList(YamlEditor yamlEdit, YamlList list, YamlNode item) {
+ if (list.style == CollectionStyle.FLOW) {
+ return _appendToFlowList(yamlEdit, list, item);
+ } else {
+ return _appendToBlockList(yamlEdit, list, item);
+ }
+}
+
+/// Returns a [SourceEdit] describing the change to be made on [yamlEdit] to
+/// achieve the effect of inserting [item] to the list at [index].
+SourceEdit insertInList(
+ YamlEditor yamlEdit, YamlList list, int index, YamlNode item) {
+ RangeError.checkValueInInterval(index, 0, list.length);
+
+ /// We call the append method if the user wants to append it to the end of the
+ /// list because appending requires different techniques.
+ if (index == list.length) {
+ return appendIntoList(yamlEdit, list, item);
+ } else {
+ if (list.style == CollectionStyle.FLOW) {
+ return _insertInFlowList(yamlEdit, list, index, item);
+ } else {
+ return _insertInBlockList(yamlEdit, list, index, item);
+ }
+ }
+}
+
+/// Returns a [SourceEdit] describing the change to be made on [yamlEdit] to
+/// achieve the effect of removing the element at [index] when re-parsed.
+SourceEdit removeInList(YamlEditor yamlEdit, YamlList list, int index) {
+ final nodeToRemove = list.nodes[index];
+
+ if (list.style == CollectionStyle.FLOW) {
+ return _removeFromFlowList(yamlEdit, list, nodeToRemove, index);
+ } else {
+ return _removeFromBlockList(yamlEdit, list, nodeToRemove, index);
+ }
+}
+
+/// Returns a [SourceEdit] describing the change to be made on [yamlEdit] to
+/// achieve the effect of adding [item] to [list], noting that this is a
+/// flow list.
+SourceEdit _appendToFlowList(
+ YamlEditor yamlEdit, YamlList list, YamlNode item) {
+ final valueString = _formatNewFlow(list, item, true);
+ return SourceEdit(list.span.end.offset - 1, 0, valueString);
+}
+
+/// Returns a [SourceEdit] describing the change to be made on [yamlEdit] to
+/// achieve the effect of adding [item] to [list], noting that this is a
+/// block list.
+SourceEdit _appendToBlockList(
+ YamlEditor yamlEdit, YamlList list, YamlNode item) {
+ var (indentSize, valueToIndent) = _formatNewBlock(yamlEdit, list, item);
+ var formattedValue = '${' ' * indentSize}$valueToIndent';
+
+ final yaml = yamlEdit.toString();
+ var offset = list.span.end.offset;
+
+ // Adjusts offset to after the trailing newline of the last entry, if it
+ // exists
+ if (list.isNotEmpty) {
+ final lastValueSpanEnd = list.nodes.last.span.end.offset;
+ final nextNewLineIndex = yaml.indexOf('\n', lastValueSpanEnd - 1);
+ if (nextNewLineIndex == -1) {
+ formattedValue = getLineEnding(yaml) + formattedValue;
+ } else {
+ offset = nextNewLineIndex + 1;
+ }
+ }
+
+ return SourceEdit(offset, 0, formattedValue);
+}
+
+/// Formats [item] into a new node for block lists.
+(int indentSize, String valueStringToIndent) _formatNewBlock(
+ YamlEditor yamlEdit, YamlList list, YamlNode item) {
+ final yaml = yamlEdit.toString();
+ final listIndentation = getListIndentation(yaml, list);
+ final newIndentation = listIndentation + getIndentation(yamlEdit);
+ final lineEnding = getLineEnding(yaml);
+
+ var valueString = yamlEncodeBlock(item, newIndentation, lineEnding);
+ if (isCollection(item) && !isFlowYamlCollectionNode(item) && !isEmpty(item)) {
+ valueString = valueString.substring(newIndentation);
+ }
+
+ return (listIndentation, '- $valueString$lineEnding');
+}
+
+/// Formats [item] into a new node for flow lists.
+String _formatNewFlow(YamlList list, YamlNode item, [bool isLast = false]) {
+ var valueString = yamlEncodeFlow(item);
+ if (list.isNotEmpty) {
+ if (isLast) {
+ valueString = ', $valueString';
+ } else {
+ valueString += ', ';
+ }
+ }
+
+ return valueString;
+}
+
+/// Returns a [SourceEdit] describing the change to be made on [yamlEdit] to
+/// achieve the effect of inserting [item] into [list] at [index], noting that
+/// this is a block list.
+///
+/// [index] should be non-negative and less than or equal to `list.length`.
+SourceEdit _insertInBlockList(
+ YamlEditor yamlEdit, YamlList list, int index, YamlNode item) {
+ RangeError.checkValueInInterval(index, 0, list.length);
+
+ if (index == list.length) return _appendToBlockList(yamlEdit, list, item);
+
+ var (indentSize, formattedValue) = _formatNewBlock(yamlEdit, list, item);
+
+ final currNode = list.nodes[index];
+ final currNodeStart = currNode.span.start.offset;
+ final yaml = yamlEdit.toString();
+
+ final currSequenceOffset = yaml.lastIndexOf('-', currNodeStart - 1);
+
+ final (isNested, offset) = _isNestedInBlockList(currSequenceOffset, yaml);
+
+ /// We have to get rid of the left indentation applied by default
+ if (isNested && index == 0) {
+ /// The [insertionIndex] will be equal to the start of
+ /// [currentSequenceOffset] of the element we are inserting before in most
+ /// cases.
+ ///
+ /// Example:
+ ///
+ /// - - value
+ /// ^ Inserting before this and we get rid of indent
+ ///
+ /// If not, we need to account for the space between them that is not an
+ /// indent.
+ ///
+ /// Example:
+ ///
+ /// - - value
+ /// ^ Inserting before this and we get rid of indent. But also account
+ /// for space in between
+ final leftPad = currSequenceOffset - offset;
+ final padding = ' ' * leftPad;
+
+ final indent = ' ' * (indentSize - leftPad);
+
+ // Give the indent to the first element
+ formattedValue = '$padding${formattedValue.trimLeft()}$indent';
+ } else {
+ final indent = ' ' * indentSize; // Calculate indent normally
+ formattedValue = '$indent$formattedValue';
+ }
+
+ return SourceEdit(offset, 0, formattedValue);
+}
+
+/// Determines if the list containing an element is nested within another list.
+/// The [currentSequenceOffset] indicates the index of the element's `-` and
+/// [yaml] represents the entire yaml document.
+///
+/// ```yaml
+/// # Returns true
+/// - - value
+///
+/// # Returns true
+/// - - value
+///
+/// # Returns false
+/// key:
+/// - value
+///
+/// # Returns false. Even though nested, a "\n" precedes the previous "-"
+/// -
+/// - value
+/// ```
+(bool isNested, int offset) _isNestedInBlockList(
+ int currentSequenceOffset, String yaml) {
+ final startIndex = currentSequenceOffset - 1;
+
+ /// Indicates the element we are inserting before is at index `0` of the list
+ /// at the root of the yaml
+ ///
+ /// Example:
+ ///
+ /// - foo
+ /// ^ Inserting before this
+ if (startIndex < 0) return (false, 0);
+
+ final newLineStart = yaml.lastIndexOf('\n', startIndex);
+ final seqStart = yaml.lastIndexOf('-', startIndex);
+
+ /// Indicates that a `\n` is closer to the last `-`. Meaning this list is not
+ /// nested.
+ ///
+ /// Example:
+ ///
+ /// key:
+ /// - value
+ /// ^ Inserting before this and we need to keep the indent.
+ ///
+ /// Also this list may be nested but the nested list starts its indent after
+ /// a new line.
+ ///
+ /// Example:
+ ///
+ /// -
+ /// - value
+ /// ^ Inserting before this and we need to keep the indent.
+ if (newLineStart >= seqStart) {
+ return (false, newLineStart + 1);
+ }
+
+ return (true, seqStart + 2); // Inclusive of space
+}
+
+/// Returns a [SourceEdit] describing the change to be made on [yamlEdit] to
+/// achieve the effect of inserting [item] into [list] at [index], noting that
+/// this is a flow list.
+///
+/// [index] should be non-negative and less than or equal to `list.length`.
+SourceEdit _insertInFlowList(
+ YamlEditor yamlEdit, YamlList list, int index, YamlNode item) {
+ RangeError.checkValueInInterval(index, 0, list.length);
+
+ if (index == list.length) return _appendToFlowList(yamlEdit, list, item);
+
+ final formattedValue = _formatNewFlow(list, item);
+
+ final yaml = yamlEdit.toString();
+ final currNode = list.nodes[index];
+ final currNodeStart = currNode.span.start.offset;
+ var start = yaml.lastIndexOf(RegExp(r',|\['), currNodeStart - 1) + 1;
+ if (yaml[start] == ' ') start++;
+
+ return SourceEdit(start, 0, formattedValue);
+}
+
+/// Returns a [SourceEdit] describing the change to be made on [yamlEdit] to
+/// achieve the effect of removing [nodeToRemove] from [list], noting that this
+/// is a block list.
+///
+/// [index] should be non-negative and less than `list.length`.
+SourceEdit _removeFromBlockList(
+ YamlEditor yamlEdit, YamlList list, YamlNode nodeToRemove, int index) {
+ RangeError.checkValueInInterval(index, 0, list.length - 1);
+
+ var end = getContentSensitiveEnd(nodeToRemove);
+
+  /// If we are removing the only element in a block list, convert the list
+  /// into an empty flow list.
+ if (list.length == 1) {
+ final start = list.span.start.offset;
+
+ return SourceEdit(start, end - start, '[]');
+ }
+
+ final yaml = yamlEdit.toString();
+ final span = nodeToRemove.span;
+
+ /// Adjust the end to clear the new line after the end too.
+ ///
+ /// We do this because we suspect that our users will want the inline
+ /// comments to disappear too.
+ final nextNewLine = yaml.indexOf('\n', end);
+ if (nextNewLine != -1) {
+ end = nextNewLine + 1;
+ }
+
+ /// If the value is empty
+ if (span.length == 0) {
+ var start = span.start.offset;
+ return SourceEdit(start, end - start, '');
+ }
+
+ /// -1 accounts for the fact that the content can start with a dash
+ var start = yaml.lastIndexOf('-', span.start.offset - 1);
+
+ /// Check if there is a `-` before the node
+ if (start > 0) {
+ final lastHyphen = yaml.lastIndexOf('-', start - 1);
+ final lastNewLine = yaml.lastIndexOf('\n', start - 1);
+ if (lastHyphen > lastNewLine) {
+ start = lastHyphen + 2;
+
+ /// If there is a `-` before the node, we need to check if we have
+ /// to update the indentation of the next node.
+ if (index < list.length - 1) {
+ /// Since [end] is currently set to the next new line after the current
+ /// node, check if we see a possible comment first, or a hyphen first.
+ /// Note that no actual content can appear here.
+ ///
+ /// We check this way because the start of a span in a block list is
+ /// the start of its value, and checking from the back leaves us
+ /// easily confused if there are comments that have dashes in them.
+ final nextHash = yaml.indexOf('#', end);
+ final nextHyphen = yaml.indexOf('-', end);
+ final nextNewLine = yaml.indexOf('\n', end);
+
+ /// If [end] is on the same line as the hyphen of the next node
+ if ((nextHash == -1 || nextHyphen < nextHash) &&
+ nextHyphen < nextNewLine) {
+ end = nextHyphen;
+ }
+ }
+ } else if (lastNewLine > lastHyphen) {
+ start = lastNewLine + 1;
+ }
+ }
+
+ return SourceEdit(start, end - start, '');
+}
+
+/// Returns a [SourceEdit] describing the change to be made on [yamlEdit] to
+/// achieve the effect of removing [nodeToRemove] from [list], noting that this
+/// is a flow list.
+///
+/// [index] should be non-negative and less than `list.length`.
+SourceEdit _removeFromFlowList(
+ YamlEditor yamlEdit, YamlList list, YamlNode nodeToRemove, int index) {
+ RangeError.checkValueInInterval(index, 0, list.length - 1);
+
+ final span = nodeToRemove.span;
+ final yaml = yamlEdit.toString();
+ var start = span.start.offset;
+ var end = span.end.offset;
+
+ if (index == 0) {
+ start = yaml.lastIndexOf('[', start - 1) + 1;
+ if (index == list.length - 1) {
+ end = yaml.indexOf(']', end);
+ } else {
+ end = yaml.indexOf(',', end) + 1;
+ }
+ } else {
+ start = yaml.lastIndexOf(',', start - 1);
+ }
+
+ return SourceEdit(start, end - start, '');
+}
diff --git a/pkgs/yaml_edit/lib/src/map_mutations.dart b/pkgs/yaml_edit/lib/src/map_mutations.dart
new file mode 100644
index 0000000..46e8c79
--- /dev/null
+++ b/pkgs/yaml_edit/lib/src/map_mutations.dart
@@ -0,0 +1,257 @@
+// Copyright (c) 2020, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:yaml/yaml.dart';
+
+import 'editor.dart';
+import 'equality.dart';
+import 'source_edit.dart';
+import 'strings.dart';
+import 'utils.dart';
+import 'wrap.dart';
+
+/// Performs the string operation on [yamlEdit] to achieve the effect of setting
+/// the element at [key] to [newValue] when re-parsed.
+SourceEdit updateInMap(
+ YamlEditor yamlEdit, YamlMap map, Object? key, YamlNode newValue) {
+ if (!containsKey(map, key)) {
+ final keyNode = wrapAsYamlNode(key);
+
+ if (map.style == CollectionStyle.FLOW) {
+ return _addToFlowMap(yamlEdit, map, keyNode, newValue);
+ } else {
+ return _addToBlockMap(yamlEdit, map, keyNode, newValue);
+ }
+ } else {
+ if (map.style == CollectionStyle.FLOW) {
+ return _replaceInFlowMap(yamlEdit, map, key, newValue);
+ } else {
+ return _replaceInBlockMap(yamlEdit, map, key, newValue);
+ }
+ }
+}
+
+/// Performs the string operation on [yamlEdit] to achieve the effect of
+/// removing the element at [key] when re-parsed.
+SourceEdit removeInMap(YamlEditor yamlEdit, YamlMap map, Object? key) {
+ assert(containsKey(map, key));
+ final keyNode = getKeyNode(map, key);
+ final valueNode = map.nodes[keyNode]!;
+
+ if (map.style == CollectionStyle.FLOW) {
+ return _removeFromFlowMap(yamlEdit, map, keyNode, valueNode);
+ } else {
+ return _removeFromBlockMap(yamlEdit, map, keyNode, valueNode);
+ }
+}
+
+/// Performs the string operation on [yamlEdit] to achieve the effect of adding
+/// the [key]:[newValue] pair when reparsed, bearing in mind that this is a
+/// block map.
+SourceEdit _addToBlockMap(
+ YamlEditor yamlEdit, YamlMap map, Object key, YamlNode newValue) {
+ final yaml = yamlEdit.toString();
+ final newIndentation =
+ getMapIndentation(yaml, map) + getIndentation(yamlEdit);
+ final keyString = yamlEncodeFlow(wrapAsYamlNode(key));
+ final lineEnding = getLineEnding(yaml);
+
+ var formattedValue = ' ' * getMapIndentation(yaml, map);
+ var offset = map.span.end.offset;
+
+ final insertionIndex = getMapInsertionIndex(map, keyString);
+
+ if (map.isNotEmpty) {
+ /// Adjusts offset to after the trailing newline of the last entry, if it
+ /// exists
+ if (insertionIndex == map.length) {
+ final lastValueSpanEnd = getContentSensitiveEnd(map.nodes.values.last);
+ final nextNewLineIndex = yaml.indexOf('\n', lastValueSpanEnd);
+
+ if (nextNewLineIndex != -1) {
+ offset = nextNewLineIndex + 1;
+ } else {
+ formattedValue = lineEnding + formattedValue;
+ }
+ } else {
+ final keyAtIndex = map.nodes.keys.toList()[insertionIndex] as YamlNode;
+ final keySpanStart = keyAtIndex.span.start.offset;
+ final prevNewLineIndex = yaml.lastIndexOf('\n', keySpanStart);
+
+ offset = prevNewLineIndex + 1;
+ }
+ }
+
+ var valueString = yamlEncodeBlock(newValue, newIndentation, lineEnding);
+ if (isCollection(newValue) &&
+ !isFlowYamlCollectionNode(newValue) &&
+ !isEmpty(newValue)) {
+ formattedValue += '$keyString:$lineEnding$valueString$lineEnding';
+ } else {
+ formattedValue += '$keyString: $valueString$lineEnding';
+ }
+
+ return SourceEdit(offset, 0, formattedValue);
+}
+
+/// Performs the string operation on [yamlEdit] to achieve the effect of adding
+/// the [keyNode]:[newValue] pair when reparsed, bearing in mind that this is a
+/// flow map.
+SourceEdit _addToFlowMap(
+ YamlEditor yamlEdit, YamlMap map, YamlNode keyNode, YamlNode newValue) {
+ final keyString = yamlEncodeFlow(keyNode);
+ final valueString = yamlEncodeFlow(newValue);
+
+ // The -1 accounts for the closing bracket.
+ if (map.isEmpty) {
+ return SourceEdit(map.span.end.offset - 1, 0, '$keyString: $valueString');
+ }
+
+ final insertionIndex = getMapInsertionIndex(map, keyString);
+
+ if (insertionIndex == map.length) {
+ return SourceEdit(map.span.end.offset - 1, 0, ', $keyString: $valueString');
+ }
+
+ final insertionOffset =
+ (map.nodes.keys.toList()[insertionIndex] as YamlNode).span.start.offset;
+
+ return SourceEdit(insertionOffset, 0, '$keyString: $valueString, ');
+}
+
+/// Performs the string operation on [yamlEdit] to achieve the effect of
+/// replacing the value at [key] with [newValue] when reparsed, bearing in mind
+/// that this is a block map.
+SourceEdit _replaceInBlockMap(
+ YamlEditor yamlEdit, YamlMap map, Object? key, YamlNode newValue) {
+ final yaml = yamlEdit.toString();
+ final lineEnding = getLineEnding(yaml);
+ final newIndentation =
+ getMapIndentation(yaml, map) + getIndentation(yamlEdit);
+
+ final keyNode = getKeyNode(map, key);
+ var valueAsString =
+ yamlEncodeBlock(wrapAsYamlNode(newValue), newIndentation, lineEnding);
+ if (isCollection(newValue) &&
+ !isFlowYamlCollectionNode(newValue) &&
+ !isEmpty(newValue)) {
+ valueAsString = lineEnding + valueAsString;
+ }
+
+ if (!valueAsString.startsWith(lineEnding)) {
+ // prepend whitespace to ensure there is space after colon.
+ valueAsString = ' $valueAsString';
+ }
+
+ /// +1 accounts for the colon
+  // TODO: What if there is whitespace following the key, before the colon?
+ final start = keyNode.span.end.offset + 1;
+ var end = getContentSensitiveEnd(map.nodes[key]!);
+
+ /// `package:yaml` parses empty nodes in a way where the start/end of the
+ /// empty value node is the end of the key node, so we have to adjust for
+ /// this.
+ if (end < start) end = start;
+
+ return SourceEdit(start, end - start, valueAsString);
+}
+
+/// Performs the string operation on [yamlEdit] to achieve the effect of
+/// replacing the value at [key] with [newValue] when reparsed, bearing in mind
+/// that this is a flow map.
+SourceEdit _replaceInFlowMap(
+ YamlEditor yamlEdit, YamlMap map, Object? key, YamlNode newValue) {
+ final valueSpan = map.nodes[key]!.span;
+ final valueString = yamlEncodeFlow(newValue);
+
+ return SourceEdit(valueSpan.start.offset, valueSpan.length, valueString);
+}
+
+/// Performs the string operation on [yamlEdit] to achieve the effect of
+/// removing the [keyNode] from the map, bearing in mind that this is a block
+/// map.
+SourceEdit _removeFromBlockMap(
+ YamlEditor yamlEdit, YamlMap map, YamlNode keyNode, YamlNode valueNode) {
+ final keySpan = keyNode.span;
+ var end = getContentSensitiveEnd(valueNode);
+ final yaml = yamlEdit.toString();
+ final lineEnding = getLineEnding(yaml);
+
+ if (map.length == 1) {
+ final start = map.span.start.offset;
+ final nextNewLine = yaml.indexOf(lineEnding, end);
+ if (nextNewLine != -1) {
+      // Remove everything up to the next newline; this strips comments that
+      // follow on the same line as the value we're removing.
+      // It also ensures we consume the colon when [valueNode.value] is `null`
+      // because there is no value (e.g. `key: \n`), since [valueNode.span] in
+      // such cases points to the colon `:`.
+ end = nextNewLine;
+ } else {
+ // Remove everything until the end of the document, if there is no newline
+ end = yaml.length;
+ }
+ return SourceEdit(start, end - start, '{}');
+ }
+
+ var start = keySpan.start.offset;
+
+ /// Adjust the end to clear the new line after the end too.
+ ///
+ /// We do this because we suspect that our users will want the inline
+ /// comments to disappear too.
+ final nextNewLine = yaml.indexOf(lineEnding, end);
+ if (nextNewLine != -1) {
+ end = nextNewLine + lineEnding.length;
+ } else {
+ // Remove everything until the end of the document, if there is no newline
+ end = yaml.length;
+ }
+
+ final nextNode = getNextKeyNode(map, keyNode);
+
+ if (start > 0) {
+ final lastHyphen = yaml.lastIndexOf('-', start - 1);
+ final lastNewLine = yaml.lastIndexOf(lineEnding, start - 1);
+ if (lastHyphen > lastNewLine) {
+ start = lastHyphen + 2;
+
+ /// If there is a `-` before the node, and the end is on the same line
+ /// as the next node, we need to add the necessary offset to the end to
+ /// make sure the next node has the correct indentation.
+ if (nextNode != null &&
+ nextNode.span.start.offset - end <= nextNode.span.start.column) {
+ end += nextNode.span.start.column;
+ }
+ } else if (lastNewLine > lastHyphen) {
+ start = lastNewLine + lineEnding.length;
+ }
+ }
+
+ return SourceEdit(start, end - start, '');
+}
+
+/// Performs the string operation on [yamlEdit] to achieve the effect of
+/// removing the [keyNode] from the map, bearing in mind that this is a flow
+/// map.
+SourceEdit _removeFromFlowMap(
+ YamlEditor yamlEdit, YamlMap map, YamlNode keyNode, YamlNode valueNode) {
+ var start = keyNode.span.start.offset;
+ var end = valueNode.span.end.offset;
+ final yaml = yamlEdit.toString();
+
+ if (deepEquals(keyNode, map.keys.first)) {
+ start = yaml.lastIndexOf('{', start - 1) + 1;
+
+ if (deepEquals(keyNode, map.keys.last)) {
+ end = yaml.indexOf('}', end);
+ } else {
+ end = yaml.indexOf(',', end) + 1;
+ }
+ } else {
+ start = yaml.lastIndexOf(',', start - 1);
+ }
+
+ return SourceEdit(start, end - start, '');
+}
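
A minimal sketch (not part of the diff) of the observable behaviour these block-map helpers produce through the public `YamlEditor` API; the documents and expected outputs are illustrative:

```dart
import 'package:yaml_edit/yaml_edit.dart';

void main() {
  // Removing a key from a block map also consumes the rest of its line, so an
  // inline comment following the removed value disappears with it.
  final doc = YamlEditor('a: 1 # removed with the key\nb: 2\n');
  doc.remove(['a']);
  print(doc); // b: 2

  // Removing the only key collapses the block map into an empty flow map.
  final single = YamlEditor('only: value\n');
  single.remove(['only']);
  print(single); // {}
}
```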
diff --git a/pkgs/yaml_edit/lib/src/source_edit.dart b/pkgs/yaml_edit/lib/src/source_edit.dart
new file mode 100644
index 0000000..d177a19
--- /dev/null
+++ b/pkgs/yaml_edit/lib/src/source_edit.dart
@@ -0,0 +1,133 @@
+// Copyright (c) 2020, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:meta/meta.dart';
+
+/// A class representing a change on a [String], intended to be compatible with
+/// `package:analysis_server`'s [SourceEdit].
+///
+/// For example, changing a string from
+/// ```
+/// foo: foobar
+/// ```
+/// to
+/// ```
+/// foo: barbar
+/// ```
+/// will be represented by
+/// `SourceEdit(offset: 5, length: 3, replacement: 'bar')`
+@sealed
+class SourceEdit {
+ /// The offset from the start of the string where the modification begins.
+ final int offset;
+
+ /// The length of the substring to be replaced.
+ final int length;
+
+ /// The replacement string to be used.
+ final String replacement;
+
+ /// Creates a new [SourceEdit] instance. [offset], [length] and [replacement]
+ /// must be non-null, and [offset] and [length] must be non-negative.
+ factory SourceEdit(int offset, int length, String replacement) =>
+ SourceEdit._(offset, length, replacement);
+
+ SourceEdit._(this.offset, this.length, this.replacement) {
+ RangeError.checkNotNegative(offset);
+ RangeError.checkNotNegative(length);
+ }
+
+ @override
+ bool operator ==(Object other) {
+ if (other is SourceEdit) {
+ return offset == other.offset &&
+ length == other.length &&
+ replacement == other.replacement;
+ }
+
+ return false;
+ }
+
+ @override
+ int get hashCode => offset.hashCode ^ length.hashCode ^ replacement.hashCode;
+
+ /// Constructs a SourceEdit from JSON.
+ ///
+ /// **Example:**
+ /// ```dart
+ /// final edit = {
+ /// 'offset': 1,
+ /// 'length': 2,
+ /// 'replacement': 'replacement string'
+ /// };
+ ///
+ /// final sourceEdit = SourceEdit.fromJson(edit);
+ /// ```
+ factory SourceEdit.fromJson(Map<String, dynamic> json) {
+ final offset = json['offset'];
+ final length = json['length'];
+ final replacement = json['replacement'];
+
+ if (offset is int && length is int && replacement is String) {
+ return SourceEdit(offset, length, replacement);
+ }
+
+ throw const FormatException('Invalid JSON passed to SourceEdit');
+ }
+
+ /// Encodes this object as JSON-compatible structure.
+ ///
+ /// **Example:**
+ /// ```dart
+ /// import 'dart:convert' show jsonEncode;
+ ///
+ /// final edit = SourceEdit(offset, length, 'replacement string');
+ /// final jsonString = jsonEncode(edit.toJson());
+ /// print(jsonString);
+ /// ```
+ Map<String, dynamic> toJson() {
+ return {'offset': offset, 'length': length, 'replacement': replacement};
+ }
+
+ @override
+ String toString() => 'SourceEdit($offset, $length, "$replacement")';
+
+  /// Applies a series of [SourceEdit]s to an original string and returns the
+ /// final output.
+ ///
+ /// [edits] should be in order i.e. the first [SourceEdit] in [edits] should
+ /// be the first edit applied to [original].
+ ///
+ /// **Example:**
+ /// ```dart
+ /// const original = 'YAML: YAML';
+ /// final sourceEdits = [
+ /// SourceEdit(6, 4, "YAML Ain't Markup Language"),
+ /// SourceEdit(6, 4, "YAML Ain't Markup Language"),
+ /// SourceEdit(0, 4, "YAML Ain't Markup Language")
+ /// ];
+ /// final result = SourceEdit.applyAll(original, sourceEdits);
+ /// ```
+ /// **Expected result:**
+ /// ```dart
+ /// "YAML Ain't Markup Language: YAML Ain't Markup Language Ain't Markup
+ /// Language"
+ /// ```
+ static String applyAll(String original, Iterable<SourceEdit> edits) {
+ return edits.fold(original, (current, edit) => edit.apply(current));
+ }
+
+  /// Applies one [SourceEdit] to an original string and returns the final
+ /// output.
+ ///
+ /// **Example:**
+ /// ```dart
+  /// final edit = SourceEdit(5, 3, 'bar');
+ /// final originalString = 'foo: foobar';
+ /// print(edit.apply(originalString)); // 'foo: barbar'
+ /// ```
+ String apply(String original) {
+ return original.replaceRange(offset, offset + length, replacement);
+ }
+}
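
A short usage sketch of the `SourceEdit` API above; the offsets are computed for the example strings shown and are otherwise illustrative:

```dart
import 'package:yaml_edit/yaml_edit.dart';

void main() {
  // Replace the 3 characters starting at offset 5 ('foo') with 'bar'.
  final edit = SourceEdit(5, 3, 'bar');
  print(edit.apply('foo: foobar')); // foo: barbar

  // Edits round-trip through their JSON representation.
  final restored = SourceEdit.fromJson(edit.toJson());
  print(restored == edit); // true

  // applyAll applies each edit, in order, against the evolving string.
  final expanded = SourceEdit.applyAll('YAML: YAML', [
    SourceEdit(6, 4, "YAML Ain't Markup Language"),
    SourceEdit(0, 4, "YAML Ain't Markup Language"),
  ]);
  print(expanded); // YAML Ain't Markup Language: YAML Ain't Markup Language
}
```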
diff --git a/pkgs/yaml_edit/lib/src/strings.dart b/pkgs/yaml_edit/lib/src/strings.dart
new file mode 100644
index 0000000..1b85641
--- /dev/null
+++ b/pkgs/yaml_edit/lib/src/strings.dart
@@ -0,0 +1,366 @@
+// Copyright (c) 2020, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:collection/collection.dart';
+import 'package:yaml/yaml.dart';
+
+import 'utils.dart';
+
+/// Given [value], tries to format it into a plain string recognizable by YAML.
+///
+/// Not all values can be formatted into a plain string. If the string contains
+/// an escape sequence, it can only be detected when in a double-quoted
+/// sequence. Plain strings may also be misinterpreted by the YAML parser (e.g.
+/// ' null').
+///
+/// Returns `null` if [value] cannot be encoded as a plain string.
+String? _tryYamlEncodePlain(String value) {
+  /// If it contains a dangerous character, return `null` so that the caller
+  /// falls back to double quotes, since the double-quoted style allows for
+  /// arbitrary strings with "\" escape sequences.
+ ///
+ /// See 7.3.1 Double-Quoted Style
+ /// https://yaml.org/spec/1.2/spec.html#id2787109
+ return isDangerousString(value) ? null : value;
+}
+
+/// Checks if [string] has unprintable characters according to
+/// [unprintableCharCodes].
+bool _hasUnprintableCharacters(String string) {
+ final codeUnits = string.codeUnits;
+
+ for (final key in unprintableCharCodes.keys) {
+ if (codeUnits.contains(key)) return true;
+ }
+
+ return false;
+}
+
+/// Generates a YAML-safe double-quoted string based on [string], escaping the
+/// list of characters as defined by the YAML 1.2 spec.
+///
+/// See 5.7 Escaped Characters https://yaml.org/spec/1.2/spec.html#id2776092
+String _yamlEncodeDoubleQuoted(String string) {
+ final buffer = StringBuffer();
+ for (final codeUnit in string.codeUnits) {
+ if (doubleQuoteEscapeChars[codeUnit] != null) {
+ buffer.write(doubleQuoteEscapeChars[codeUnit]);
+ } else {
+ buffer.writeCharCode(codeUnit);
+ }
+ }
+
+ return '"$buffer"';
+}
+
+/// Encodes [string] as YAML single quoted string.
+///
+/// Returns `null`, if the [string] can't be encoded as single-quoted string.
+/// This might happen if it contains line-breaks or [_hasUnprintableCharacters].
+///
+/// See: https://yaml.org/spec/1.2.2/#732-single-quoted-style
+String? _tryYamlEncodeSingleQuoted(String string) {
+ // If [string] contains a newline we'll use double quoted strings instead.
+ // Single quoted strings can represent newlines, but then we have to use an
+ // empty line (replace \n with \n\n). But since leading spaces following
+ // line breaks are ignored, we can't represent "\n ".
+ // Thus, if the string contains `\n` and we're asked to do single quoted,
+ // we'll fallback to a double quoted string.
+ if (_hasUnprintableCharacters(string) || string.contains('\n')) return null;
+
+ final result = string.replaceAll('\'', '\'\'');
+ return '\'$result\'';
+}
+
+/// Attempts to encode a [string] as a _YAML folded string_ and apply the
+/// appropriate _chomping indicator_.
+///
+/// Returns `null`, if the [string] cannot be encoded as a _YAML folded
+/// string_.
+///
+/// **Examples** of folded strings.
+/// ```yaml
+/// # With the "strip" chomping indicator
+/// key: >-
+/// my folded
+/// string
+///
+/// # With the "keep" chomping indicator
+/// key: >+
+/// my folded
+/// string
+/// ```
+///
+/// See: https://yaml.org/spec/1.2.2/#813-folded-style
+String? _tryYamlEncodeFolded(String string, int indentSize, String lineEnding) {
+ // A string that starts with space or newline followed by space can't be
+ // encoded in folded mode.
+ if (string.isEmpty || string.trim().length != string.length) return null;
+
+ if (_hasUnprintableCharacters(string)) return null;
+
+ // TODO: Are there other strings we can't encode in folded mode?
+
+ final indent = ' ' * indentSize;
+
+ /// Remove trailing `\n` & white-space to ease string folding
+ var trimmed = string.trimRight();
+ final stripped = string.substring(trimmed.length);
+
+ final trimmedSplit =
+ trimmed.replaceAll('\n', lineEnding + indent).split(lineEnding);
+
+ /// Try folding to match specification:
+ /// * https://yaml.org/spec/1.2.2/#65-line-folding
+ trimmed = trimmedSplit.reduceIndexed((index, previous, current) {
+ var updated = current;
+
+ /// If initially empty, this line holds only `\n` or white-space. This
+ /// tells us we don't need to apply an additional `\n`.
+ ///
+ /// See https://yaml.org/spec/1.2.2/#64-empty-lines
+ ///
+ /// If this line is not empty, we need to apply an additional `\n` if and
+ /// only if:
+ /// 1. The preceding line was non-empty too
+ /// 2. If the current line doesn't begin with white-space
+ ///
+ /// Such that we apply `\n` for `foo\nbar` but not `foo\n bar`.
+ if (current.trim().isNotEmpty &&
+ trimmedSplit[index - 1].trim().isNotEmpty &&
+ !current.replaceFirst(indent, '').startsWith(' ')) {
+ updated = lineEnding + updated;
+ }
+
+ /// Apply a `\n` by default.
+ return previous + lineEnding + updated;
+ });
+
+ return '>-\n'
+ '$indent$trimmed'
+ '${stripped.replaceAll('\n', lineEnding + indent)}';
+}
+
+/// Attempts to encode a [string] as a _YAML literal string_ and apply the
+/// appropriate _chomping indicator_.
+///
+/// Returns `null`, if the [string] cannot be encoded as a _YAML literal
+/// string_.
+///
+/// **Examples** of literal strings.
+/// ```yaml
+/// # With the "strip" chomping indicator
+/// key: |-
+/// my literal
+/// string
+///
+/// # Without chomping indicator
+/// key: |
+/// my literal
+/// string
+/// ```
+///
+/// See: https://yaml.org/spec/1.2.2/#812-literal-style
+String? _tryYamlEncodeLiteral(
+ String string, int indentSize, String lineEnding) {
+ if (string.isEmpty || string.trim().length != string.length) return null;
+
+ // A string that starts with space or newline followed by space can't be
+ // encoded in literal mode.
+ if (_hasUnprintableCharacters(string)) return null;
+
+ // TODO: Are there other strings we can't encode in literal mode?
+
+ final indent = ' ' * indentSize;
+
+ /// Simplest block style.
+ /// * https://yaml.org/spec/1.2.2/#812-literal-style
+ return '|-\n$indent${string.replaceAll('\n', lineEnding + indent)}';
+}
+
+/// Encodes a flow [YamlScalar] based on the provided [YamlScalar.style].
+///
+/// Falls back to [ScalarStyle.DOUBLE_QUOTED] if the [yamlScalar] cannot be
+/// encoded with the [YamlScalar.style] or with [ScalarStyle.PLAIN] when the
+/// [yamlScalar] is not a [String].
+String _yamlEncodeFlowScalar(YamlScalar yamlScalar) {
+ final YamlScalar(:value, :style) = yamlScalar;
+
+ if (value is! String) {
+ return value.toString();
+ }
+
+ switch (style) {
+ /// Only encode as double-quoted if it's a string.
+ case ScalarStyle.DOUBLE_QUOTED:
+ return _yamlEncodeDoubleQuoted(value);
+
+ case ScalarStyle.SINGLE_QUOTED:
+ return _tryYamlEncodeSingleQuoted(value) ??
+ _yamlEncodeDoubleQuoted(value);
+
+    /// Fall back to double quotes if the value can't be encoded as a plain
+    /// string, i.e. [_tryYamlEncodePlain] returns `null`.
+ default:
+ return _tryYamlEncodePlain(value) ?? _yamlEncodeDoubleQuoted(value);
+ }
+}
+
+/// Encodes a block [YamlScalar] based on the provided [YamlScalar.style].
+///
+/// Falls back to [ScalarStyle.DOUBLE_QUOTED] if the [yamlScalar] cannot be
+/// encoded with the [YamlScalar.style] provided.
+String _yamlEncodeBlockScalar(
+ YamlScalar yamlScalar,
+ int indentation,
+ String lineEnding,
+) {
+ final YamlScalar(:value, :style) = yamlScalar;
+ assertValidScalar(value);
+
+ if (value is! String) {
+ return value.toString();
+ }
+
+ switch (style) {
+ /// Prefer 'plain', fallback to "double quoted"
+ case ScalarStyle.PLAIN:
+ return _tryYamlEncodePlain(value) ?? _yamlEncodeDoubleQuoted(value);
+
+ // Prefer 'single quoted', fallback to "double quoted"
+ case ScalarStyle.SINGLE_QUOTED:
+ return _tryYamlEncodeSingleQuoted(value) ??
+ _yamlEncodeDoubleQuoted(value);
+
+ /// Prefer folded string, fallback to "double quoted"
+ case ScalarStyle.FOLDED:
+ return _tryYamlEncodeFolded(value, indentation, lineEnding) ??
+ _yamlEncodeDoubleQuoted(value);
+
+ /// Prefer literal string, fallback to "double quoted"
+ case ScalarStyle.LITERAL:
+ return _tryYamlEncodeLiteral(value, indentation, lineEnding) ??
+ _yamlEncodeDoubleQuoted(value);
+
+ /// Prefer plain, fallback to "double quoted"
+ default:
+ return _tryYamlEncodePlain(value) ?? _yamlEncodeDoubleQuoted(value);
+ }
+}
+
+/// Returns [value] with the necessary formatting applied in a flow context.
+///
+/// If [value] is a [YamlNode], we try to respect its [YamlScalar.style]
+/// parameter where possible. Certain cases make this impossible (e.g. a plain
+/// string scalar that starts with '>', a child having a block style
+/// parameters), in which case we will produce [value] with default styling
+/// options.
+String yamlEncodeFlow(YamlNode value) {
+ if (value is YamlList) {
+ final list = value.nodes;
+
+ final safeValues = list.map(yamlEncodeFlow);
+ return '[${safeValues.join(', ')}]';
+ } else if (value is YamlMap) {
+ final safeEntries = value.nodes.entries.map((entry) {
+ final safeKey = yamlEncodeFlow(entry.key as YamlNode);
+ final safeValue = yamlEncodeFlow(entry.value);
+ return '$safeKey: $safeValue';
+ });
+
+ return '{${safeEntries.join(', ')}}';
+ }
+
+ return _yamlEncodeFlowScalar(value as YamlScalar);
+}
+
+/// Returns [value] with the necessary formatting applied in a block context.
+String yamlEncodeBlock(
+ YamlNode value,
+ int indentation,
+ String lineEnding,
+) {
+ const additionalIndentation = 2;
+
+ if (!isBlockNode(value)) return yamlEncodeFlow(value);
+
+ final newIndentation = indentation + additionalIndentation;
+
+ if (value is YamlList) {
+ if (value.isEmpty) return '${' ' * indentation}[]';
+
+ Iterable<String> safeValues;
+
+ final children = value.nodes;
+
+ safeValues = children.map((child) {
+ var valueString = yamlEncodeBlock(child, newIndentation, lineEnding);
+ if (isCollection(child) && !isFlowYamlCollectionNode(child)) {
+ valueString = valueString.substring(newIndentation);
+ }
+
+ return '${' ' * indentation}- $valueString';
+ });
+
+ return safeValues.join(lineEnding);
+ } else if (value is YamlMap) {
+ if (value.isEmpty) return '${' ' * indentation}{}';
+
+ return value.nodes.entries.map((entry) {
+ final MapEntry(:key, :value) = entry;
+
+ final safeKey = yamlEncodeFlow(key as YamlNode);
+ final formattedKey = ' ' * indentation + safeKey;
+
+ final formattedValue = yamlEncodeBlock(
+ value,
+ newIndentation,
+ lineEnding,
+ );
+
+ /// Empty collections are always encoded in flow-style, so new-line must
+ /// be avoided
+ if (isCollection(value) && !isEmpty(value)) {
+ return '$formattedKey:$lineEnding$formattedValue';
+ }
+
+ return '$formattedKey: $formattedValue';
+ }).join(lineEnding);
+ }
+
+ return _yamlEncodeBlockScalar(
+ value as YamlScalar,
+ newIndentation,
+ lineEnding,
+ );
+}
+
+/// List of unprintable characters.
+///
+/// See 5.7 Escape Characters https://yaml.org/spec/1.2/spec.html#id2776092
+final Map<int, String> unprintableCharCodes = {
+ 0: '\\0', // Escaped ASCII null (#x0) character.
+ 7: '\\a', // Escaped ASCII bell (#x7) character.
+ 8: '\\b', // Escaped ASCII backspace (#x8) character.
+ 11: '\\v', // Escaped ASCII vertical tab (#xB) character.
+ 12: '\\f', // Escaped ASCII form feed (#xC) character.
+ 13: '\\r', // Escaped ASCII carriage return (#xD) character. Line Break.
+ 27: '\\e', // Escaped ASCII escape (#x1B) character.
+ 133: '\\N', // Escaped Unicode next line (#x85) character.
+ 160: '\\_', // Escaped Unicode non-breaking space (#xA0) character.
+ 8232: '\\L', // Escaped Unicode line separator (#x2028) character.
+ 8233: '\\P', // Escaped Unicode paragraph separator (#x2029) character.
+};
+
+/// List of escape characters.
+///
+/// See 5.7 Escape Characters https://yaml.org/spec/1.2/spec.html#id2776092
+final Map<int, String> doubleQuoteEscapeChars = {
+ ...unprintableCharCodes,
+ 9: '\\t', // Escaped ASCII horizontal tab (#x9) character. Printable
+ 10: '\\n', // Escaped ASCII line feed (#xA) character. Line Break.
+ 34: '\\"', // Escaped ASCII double quote (#x22).
+ 47: '\\/', // Escaped ASCII slash (#x2F), for JSON compatibility.
+ 92: '\\\\', // Escaped ASCII back slash (#x5C).
+};
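
A hedged sketch of how the encoders above surface through `YamlEditor.update`; the exact indentation of the literal block depends on the detected indentation, so the output comments are approximate:

```dart
import 'package:yaml/yaml.dart';
import 'package:yaml_edit/yaml_edit.dart';

void main() {
  // A value that would be mis-parsed as a plain scalar falls back to the
  // double-quoted style.
  final doc = YamlEditor('key: old');
  doc.update(['key'], 'hello: world');
  print(doc); // key: "hello: world"

  // Requesting a literal style via wrapAsYamlNode produces a `|-` block
  // scalar for a multi-line string.
  final literal = YamlEditor('key: old');
  literal.update(
    ['key'],
    wrapAsYamlNode('first line\nsecond line', scalarStyle: ScalarStyle.LITERAL),
  );
  print(literal);
  // key: |-
  //     first line
  //     second line
}
```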
diff --git a/pkgs/yaml_edit/lib/src/utils.dart b/pkgs/yaml_edit/lib/src/utils.dart
new file mode 100644
index 0000000..ef85526
--- /dev/null
+++ b/pkgs/yaml_edit/lib/src/utils.dart
@@ -0,0 +1,291 @@
+// Copyright (c) 2020, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:source_span/source_span.dart';
+import 'package:yaml/yaml.dart';
+
+import 'editor.dart';
+import 'wrap.dart';
+
+/// Invoke [fn] while setting [yamlWarningCallback] to [warn], and restore
+/// [yamlWarningCallback] after [fn] returns.
+///
+/// Defaults to a [warn] function that ignores all warnings.
+T withYamlWarningCallback<T>(
+ T Function() fn, {
+ YamlWarningCallback warn = _ignoreWarning,
+}) {
+ final original = yamlWarningCallback;
+ try {
+ yamlWarningCallback = warn;
+ return fn();
+ } finally {
+ yamlWarningCallback = original;
+ }
+}
+
+void _ignoreWarning(String warning, [SourceSpan? span]) {/* ignore warning */}
+
+/// Determines if [string] is dangerous by checking if parsing the plain string
+/// can return a result different from [string].
+///
+/// This function is also capable of detecting if non-printable characters are
+/// in [string].
+bool isDangerousString(String string) {
+ try {
+ final node = withYamlWarningCallback(() => loadYamlNode(string));
+ if (node.value != string) {
+ return true;
+ }
+
+ // [string] should also not contain the `[`, `]`, `,`, `{` and `}` indicator
+ // characters.
+ return string.contains(RegExp(r'\{|\[|\]|\}|,'));
+ } catch (e) {
+    /// This catch statement catches the [ArgumentError] thrown by
+    /// `loadYamlNode` when a string can be interpreted as a URI tag, as well
+    /// as other [YamlException]s.
+ return true;
+ }
+}
+
+/// Asserts that [value] is a valid scalar according to YAML.
+///
+/// A valid scalar is a number, String, boolean, or null.
+void assertValidScalar(Object? value) {
+ if (value is num || value is String || value is bool || value == null) {
+ return;
+ }
+
+ throw ArgumentError.value(value, 'value', 'Not a valid scalar type!');
+}
+
+/// Checks if [node] is a [YamlNode] with block styling.
+///
+/// [ScalarStyle.ANY] and [CollectionStyle.ANY] are considered to be block
+/// styling by default for maximum flexibility.
+bool isBlockNode(YamlNode node) {
+ if (node is YamlScalar) {
+ if (node.style == ScalarStyle.LITERAL ||
+ node.style == ScalarStyle.FOLDED ||
+ node.style == ScalarStyle.ANY) {
+ return true;
+ }
+ }
+
+ if (node is YamlList &&
+ (node.style == CollectionStyle.BLOCK ||
+ node.style == CollectionStyle.ANY)) {
+ return true;
+ }
+ if (node is YamlMap &&
+ (node.style == CollectionStyle.BLOCK ||
+ node.style == CollectionStyle.ANY)) {
+ return true;
+ }
+
+ return false;
+}
+
+/// Returns the content sensitive ending offset of [yamlNode] (i.e. where the
+/// last meaningful content happens)
+int getContentSensitiveEnd(YamlNode yamlNode) {
+ if (yamlNode is YamlList) {
+ if (yamlNode.style == CollectionStyle.FLOW) {
+ return yamlNode.span.end.offset;
+ } else {
+ return getContentSensitiveEnd(yamlNode.nodes.last);
+ }
+ } else if (yamlNode is YamlMap) {
+ if (yamlNode.style == CollectionStyle.FLOW) {
+ return yamlNode.span.end.offset;
+ } else {
+ return getContentSensitiveEnd(yamlNode.nodes.values.last);
+ }
+ }
+
+ return yamlNode.span.end.offset;
+}
+
+/// Checks if the item is a Map or a List
+bool isCollection(Object item) => item is Map || item is List;
+
+/// Checks if [index] is [int], >=0, < [length]
+bool isValidIndex(Object? index, int length) {
+ return index is int && index >= 0 && index < length;
+}
+
+/// Checks if the item is empty, if it is a List or a Map.
+///
+/// Returns `false` if [item] is not a List or Map.
+bool isEmpty(Object item) {
+ if (item is Map) return item.isEmpty;
+ if (item is List) return item.isEmpty;
+
+ return false;
+}
+
+/// Creates a [SourceSpan] from [sourceUrl] with no meaningful location
+/// information.
+///
+/// Mainly used with [wrapAsYamlNode] to allow for a reasonable
+/// implementation of [SourceSpan.message].
+SourceSpan shellSpan(Object? sourceUrl) {
+ final shellSourceLocation = SourceLocation(0, sourceUrl: sourceUrl);
+ return SourceSpanBase(shellSourceLocation, shellSourceLocation, '');
+}
+
+/// Returns if [value] is a [YamlList] or [YamlMap] with [CollectionStyle.FLOW].
+bool isFlowYamlCollectionNode(Object value) =>
+ value is YamlNode && value.collectionStyle == CollectionStyle.FLOW;
+
+/// Determines the index where [newKey] will be inserted if the keys in [map]
+/// are in alphabetical order when converted to strings.
+///
+/// Returns the length of [map] if the keys in [map] are not in alphabetical
+/// order.
+int getMapInsertionIndex(YamlMap map, Object newKey) {
+ final keys = map.nodes.keys.map((k) => k.toString()).toList();
+
+  // We can't deduce an ordering when there are fewer than two keys, so we
+  // just append.
+ if (keys.length <= 1) {
+ return map.length;
+ }
+
+ for (var i = 1; i < keys.length; i++) {
+ if (keys[i].compareTo(keys[i - 1]) < 0) {
+ return map.length;
+ }
+ }
+
+ final insertionIndex =
+ keys.indexWhere((key) => key.compareTo(newKey as String) > 0);
+
+ if (insertionIndex != -1) return insertionIndex;
+
+ return map.length;
+}
+
+/// Returns the detected indentation step used in [editor], or defaults to a
+/// value of `2` if no indentation step can be detected.
+///
+/// Indentation step is determined by the difference in indentation of the
+/// first block-styled yaml collection in the second level as compared to the
+/// top-level elements. In the case where there are multiple possible
+/// candidates, we choose the candidate closest to the start of [editor].
+int getIndentation(YamlEditor editor) {
+ final node = editor.parseAt([]);
+ Iterable<YamlNode>? children;
+ var indentation = 2;
+
+ if (node is YamlMap && node.style == CollectionStyle.BLOCK) {
+ children = node.nodes.values;
+ } else if (node is YamlList && node.style == CollectionStyle.BLOCK) {
+ children = node.nodes;
+ }
+
+ if (children != null) {
+ for (final child in children) {
+ var indent = 0;
+ if (child is YamlList) {
+ indent = getListIndentation(editor.toString(), child);
+ } else if (child is YamlMap) {
+ indent = getMapIndentation(editor.toString(), child);
+ }
+
+ if (indent != 0) indentation = indent;
+ }
+ }
+ return indentation;
+}
+
+/// Gets the indentation level of [list]. This is 0 if it is a flow list,
+/// but returns the number of spaces before the hyphen of elements for
+/// block lists.
+///
+/// Throws [UnsupportedError] if an empty block list is passed in.
+int getListIndentation(String yaml, YamlList list) {
+ if (list.style == CollectionStyle.FLOW) return 0;
+
+  /// An empty block list doesn't really exist.
+ if (list.isEmpty) {
+ throw UnsupportedError('Unable to get indentation for empty block list');
+ }
+
+ final lastSpanOffset = list.nodes.last.span.start.offset;
+ final lastHyphen = yaml.lastIndexOf('-', lastSpanOffset - 1);
+
+ if (lastHyphen == 0) return lastHyphen;
+
+ // Look for '\n' that's before hyphen
+ final lastNewLine = yaml.lastIndexOf('\n', lastHyphen - 1);
+
+ return lastHyphen - lastNewLine - 1;
+}
+
+/// Gets the indentation level of [map]. This is 0 if it is a flow map,
+/// but returns the number of spaces before the keys for block maps.
+int getMapIndentation(String yaml, YamlMap map) {
+ if (map.style == CollectionStyle.FLOW) return 0;
+
+ /// An empty block map doesn't really exist.
+ if (map.isEmpty) {
+ throw UnsupportedError('Unable to get indentation for empty block map');
+ }
+
+ /// Use the number of spaces between the last key and the newline as
+ /// indentation.
+ final lastKey = map.nodes.keys.last as YamlNode;
+ final lastSpanOffset = lastKey.span.start.offset;
+ final lastNewLine = yaml.lastIndexOf('\n', lastSpanOffset);
+ final lastQuestionMark = yaml.lastIndexOf('?', lastSpanOffset);
+
+ if (lastQuestionMark == -1) {
+ if (lastNewLine == -1) return lastSpanOffset;
+ return lastSpanOffset - lastNewLine - 1;
+ }
+
+ /// If there is a question mark, it might be a complex key. Check if it
+ /// is on the same line as the key node to verify.
+ if (lastNewLine == -1) return lastQuestionMark;
+ if (lastQuestionMark > lastNewLine) {
+ return lastQuestionMark - lastNewLine - 1;
+ }
+
+ return lastSpanOffset - lastNewLine - 1;
+}
+
+/// Returns the detected line ending used in [yaml], more specifically, whether
+/// [yaml] appears to use Windows `\r\n` or Unix `\n` line endings.
+///
+/// The heuristic is to count all `\n` in the text; if strictly more than half
+/// of them are preceded by `\r`, we report that Windows line endings
+/// are used.
+String getLineEnding(String yaml) {
+ var index = -1;
+ var unixNewlines = 0;
+ var windowsNewlines = 0;
+ while ((index = yaml.indexOf('\n', index + 1)) != -1) {
+ if (index != 0 && yaml[index - 1] == '\r') {
+ windowsNewlines++;
+ } else {
+ unixNewlines++;
+ }
+ }
+
+ return windowsNewlines > unixNewlines ? '\r\n' : '\n';
+}
+
+extension YamlNodeExtension on YamlNode {
+ /// Returns the [CollectionStyle] of `this` if `this` is [YamlMap] or
+ /// [YamlList].
+ ///
+ /// Otherwise, returns `null`.
+ CollectionStyle? get collectionStyle {
+ final me = this;
+ if (me is YamlMap) return me.style;
+ if (me is YamlList) return me.style;
+ return null;
+ }
+}
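
A sketch of the effect of the detection helpers above (line endings and indentation) as observed through the public API; the outputs shown are what the heuristics should produce and are illustrative:

```dart
import 'package:yaml_edit/yaml_edit.dart';

void main() {
  // getLineEnding: a document dominated by `\r\n` keeps `\r\n` in new edits.
  final crlf = YamlEditor('a: 1\r\nb: 2\r\n');
  crlf.update(['c'], 3);
  print(crlf.toString().contains('c: 3\r\n')); // true

  // getIndentation / getMapIndentation: a key added to a 4-space-indented
  // block map lines up with its existing siblings.
  final indented = YamlEditor('a:\n    b: 1\n');
  indented.update(['a', 'c'], 2);
  print(indented);
  // a:
  //     b: 1
  //     c: 2
}
```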
diff --git a/pkgs/yaml_edit/lib/src/wrap.dart b/pkgs/yaml_edit/lib/src/wrap.dart
new file mode 100644
index 0000000..73f7751
--- /dev/null
+++ b/pkgs/yaml_edit/lib/src/wrap.dart
@@ -0,0 +1,216 @@
+// Copyright (c) 2020, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:collection' as collection;
+
+import 'package:collection/collection.dart';
+import 'package:source_span/source_span.dart';
+import 'package:yaml/yaml.dart';
+
+import 'equality.dart';
+import 'utils.dart';
+
+/// Returns a new [YamlMap] constructed by applying [update] onto the nodes of
+/// this [YamlMap].
+YamlMap updatedYamlMap(YamlMap map, Function(Map) update) {
+ final dummyMap = deepEqualsMap();
+ dummyMap.addAll(map.nodes);
+
+ update(dummyMap);
+
+ return wrapAsYamlNode(dummyMap) as YamlMap;
+}
+
+/// Wraps [value] into a [YamlNode].
+///
+/// [Map]s, [List]s and Scalars will be wrapped as [YamlMap]s, [YamlList]s,
+/// and [YamlScalar]s respectively. If [collectionStyle]/[scalarStyle] is
+/// defined, and [value] is a collection or scalar, the wrapped [YamlNode] will
+/// have the respective style, otherwise it defaults to the ANY style.
+///
+/// If [value] is a [Map] or [List], then [wrapAsYamlNode] will be called
+/// recursively on all children, and [collectionStyle]/[scalarStyle] will be
+/// applied to any children that are not instances of [YamlNode].
+///
+/// If a [YamlNode] is passed in, no further wrapping will be done, and the
+/// [collectionStyle]/[scalarStyle] will not be applied.
+YamlNode wrapAsYamlNode(
+ Object? value, {
+ CollectionStyle collectionStyle = CollectionStyle.ANY,
+ ScalarStyle scalarStyle = ScalarStyle.ANY,
+}) {
+ if (value is YamlScalar) {
+ assertValidScalar(value.value);
+ return value;
+ } else if (value is YamlList) {
+ for (final item in value.nodes) {
+ wrapAsYamlNode(item);
+ }
+
+ return value;
+ } else if (value is YamlMap) {
+    /// Both [entry.key] and [entry.value] are guaranteed to be [YamlNode]s,
+ /// so running this will just assert that they are valid scalars.
+ for (final entry in value.nodes.entries) {
+ wrapAsYamlNode(entry.key);
+ wrapAsYamlNode(entry.value);
+ }
+
+ return value;
+ } else if (value is Map) {
+ return YamlMapWrap(
+ value,
+ collectionStyle: collectionStyle,
+ scalarStyle: scalarStyle,
+ );
+ } else if (value is List) {
+ return YamlListWrap(
+ value,
+ collectionStyle: collectionStyle,
+ scalarStyle: scalarStyle,
+ );
+ } else {
+ assertValidScalar(value);
+
+ return YamlScalarWrap(value, style: scalarStyle);
+ }
+}
+
+/// Internal class that allows us to define a constructor on [YamlScalar]
+/// which takes in [style] as an argument.
+class YamlScalarWrap implements YamlScalar {
+ /// The [ScalarStyle] to be used for the scalar.
+ @override
+ final ScalarStyle style;
+
+ @override
+ final SourceSpan span;
+
+ @override
+ final dynamic value;
+
+ YamlScalarWrap(this.value, {this.style = ScalarStyle.ANY, Object? sourceUrl})
+ : span = shellSpan(sourceUrl);
+
+ @override
+ String toString() => value.toString();
+}
+
+/// Internal class that allows us to define a constructor on [YamlMap]
+/// which takes in [style] as an argument.
+class YamlMapWrap
+ with collection.MapMixin, UnmodifiableMapMixin
+ implements YamlMap {
+ /// The [CollectionStyle] to be used for the map.
+ @override
+ final CollectionStyle style;
+
+ @override
+ final Map<dynamic, YamlNode> nodes;
+
+ @override
+ final SourceSpan span;
+
+ factory YamlMapWrap(
+ Map dartMap, {
+ CollectionStyle collectionStyle = CollectionStyle.ANY,
+ ScalarStyle scalarStyle = ScalarStyle.ANY,
+ Object? sourceUrl,
+ }) {
+ final wrappedMap = deepEqualsMap<dynamic, YamlNode>();
+
+ for (final entry in dartMap.entries) {
+ final wrappedKey = wrapAsYamlNode(
+ entry.key,
+ collectionStyle: collectionStyle,
+ scalarStyle: scalarStyle,
+ );
+ final wrappedValue = wrapAsYamlNode(
+ entry.value,
+ collectionStyle: collectionStyle,
+ scalarStyle: scalarStyle,
+ );
+ wrappedMap[wrappedKey] = wrappedValue;
+ }
+
+ return YamlMapWrap._(
+ wrappedMap,
+ style: collectionStyle,
+ sourceUrl: sourceUrl,
+ );
+ }
+
+ YamlMapWrap._(
+ this.nodes, {
+ CollectionStyle style = CollectionStyle.ANY,
+ Object? sourceUrl,
+ }) : span = shellSpan(sourceUrl),
+ style = nodes.isEmpty ? CollectionStyle.FLOW : style;
+
+ @override
+ dynamic operator [](Object? key) => nodes[key]?.value;
+
+ @override
+ Iterable get keys => nodes.keys.map((node) => (node as YamlNode).value);
+
+ @override
+ Map get value => this;
+}
+
+/// Internal class that allows us to define a constructor on [YamlList]
+/// which takes in [style] as an argument.
+class YamlListWrap with collection.ListMixin implements YamlList {
+ /// The [CollectionStyle] to be used for the list.
+ @override
+ final CollectionStyle style;
+
+ @override
+ final List<YamlNode> nodes;
+
+ @override
+ final SourceSpan span;
+
+ @override
+ int get length => nodes.length;
+
+ @override
+ set length(int index) {
+ throw UnsupportedError('Cannot modify an unmodifiable List');
+ }
+
+ factory YamlListWrap(
+ List dartList, {
+ CollectionStyle collectionStyle = CollectionStyle.ANY,
+ ScalarStyle scalarStyle = ScalarStyle.ANY,
+ Object? sourceUrl,
+ }) {
+ return YamlListWrap._(
+ dartList
+ .map((v) => wrapAsYamlNode(
+ v,
+ collectionStyle: collectionStyle,
+ scalarStyle: scalarStyle,
+ ))
+ .toList(),
+ style: collectionStyle,
+ sourceUrl: sourceUrl,
+ );
+ }
+
+ YamlListWrap._(this.nodes,
+ {CollectionStyle style = CollectionStyle.ANY, Object? sourceUrl})
+ : span = shellSpan(sourceUrl),
+ style = nodes.isEmpty ? CollectionStyle.FLOW : style;
+
+ @override
+ dynamic operator [](int index) => nodes[index].value;
+
+ @override
+ void operator []=(int index, Object? value) {
+ throw UnsupportedError('Cannot modify an unmodifiable List');
+ }
+
+ @override
+ List get value => this;
+}
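
A hedged sketch of `wrapAsYamlNode` (the only symbol from this file exported by the package) used to control the style of inserted values; the keys and values are illustrative:

```dart
import 'package:yaml/yaml.dart';
import 'package:yaml_edit/yaml_edit.dart';

void main() {
  final doc = YamlEditor('dependencies:\n  path: any\n');

  // Plain Dart collections are wrapped implicitly with the default styling.
  doc.update(['dev_dependencies'], {'test': 'any'});

  // Wrapping explicitly lets the caller force flow styling on the new value.
  doc.update(
    ['environment'],
    wrapAsYamlNode({'sdk': '^3.1.0'}, collectionStyle: CollectionStyle.FLOW),
  );

  print(doc);
}
```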
diff --git a/pkgs/yaml_edit/lib/yaml_edit.dart b/pkgs/yaml_edit/lib/yaml_edit.dart
new file mode 100644
index 0000000..49558b2
--- /dev/null
+++ b/pkgs/yaml_edit/lib/yaml_edit.dart
@@ -0,0 +1,28 @@
+// Copyright (c) 2020, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+/// YAML parsing is supported by `package:yaml`, and each time a change is
+/// made, the resulting YAML AST is compared against our expected output
+/// with deep equality to ensure that the output conforms to our expectations.
+///
+/// **Example**
+/// ```dart
+/// import 'package:yaml_edit/yaml_edit.dart';
+///
+/// void main() {
+/// final yamlEditor = YamlEditor('{YAML: YAML}');
+/// yamlEditor.update(['YAML'], "YAML Ain't Markup Language");
+/// print(yamlEditor);
+/// // Expected Output:
+/// // {YAML: YAML Ain't Markup Language}
+/// }
+/// ```
+///
+/// [1]: https://yaml.org/
+library;
+
+export 'src/editor.dart';
+export 'src/errors.dart' show AliasException;
+export 'src/source_edit.dart';
+export 'src/wrap.dart' show wrapAsYamlNode;
diff --git a/pkgs/yaml_edit/pubspec.yaml b/pkgs/yaml_edit/pubspec.yaml
new file mode 100644
index 0000000..8127a12
--- /dev/null
+++ b/pkgs/yaml_edit/pubspec.yaml
@@ -0,0 +1,25 @@
+name: yaml_edit
+version: 2.2.2
+description: >-
+ A library for YAML manipulation with comment and whitespace preservation.
+repository: https://github.com/dart-lang/tools/tree/main/pkgs/yaml_edit
+
+issue_tracker: https://github.com/dart-lang/yaml_edit/issues
+
+topics:
+ - yaml
+
+environment:
+ sdk: ^3.1.0
+
+dependencies:
+ collection: ^1.15.0
+ meta: ^1.7.0
+ source_span: ^1.8.1
+ yaml: ^3.1.0
+
+dev_dependencies:
+ coverage: any # we only need format_coverage, don't care what version
+ dart_flutter_team_lints: ^3.0.0
+ path: ^1.8.0
+ test: ^1.17.12
diff --git a/pkgs/yaml_edit/test/alias_test.dart b/pkgs/yaml_edit/test/alias_test.dart
new file mode 100644
index 0000000..acc0df7
--- /dev/null
+++ b/pkgs/yaml_edit/test/alias_test.dart
@@ -0,0 +1,139 @@
+// Copyright (c) 2020, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:test/test.dart';
+import 'package:yaml_edit/yaml_edit.dart';
+
+import 'test_utils.dart';
+
+/// This test suite is a temporary measure until we are able to better handle
+/// aliases.
+void main() {
+ group('list ', () {
+ test('removing an alias anchor results in AliasError', () {
+ final doc = YamlEditor('''
+- &SS Sammy Sosa
+- *SS
+''');
+ expect(() => doc.remove([0]), throwsAliasException);
+ });
+
+ test('removing an alias reference results in AliasError', () {
+ final doc = YamlEditor('''
+- &SS Sammy Sosa
+- *SS
+''');
+
+ expect(() => doc.remove([1]), throwsAliasException);
+ });
+
+ test('it is okay to remove a non-alias node', () {
+ final doc = YamlEditor('''
+- &SS Sammy Sosa
+- *SS
+- Sammy Sosa
+''');
+
+ doc.remove([2]);
+ expect(doc.toString(), equals('''
+- &SS Sammy Sosa
+- *SS
+'''));
+ });
+ });
+
+ group('map', () {
+ test('removing an alias anchor value results in AliasError', () {
+ final doc = YamlEditor('''
+a: &SS Sammy Sosa
+b: *SS
+''');
+
+ expect(() => doc.remove(['a']), throwsAliasException);
+ });
+
+ test('removing an alias reference value results in AliasError', () {
+ final doc = YamlEditor('''
+a: &SS Sammy Sosa
+b: *SS
+''');
+
+ expect(() => doc.remove(['b']), throwsAliasException);
+ });
+
+ test('removing an alias anchor key results in AliasError', () {
+ final doc = YamlEditor('''
+&SS Sammy Sosa: a
+b: *SS
+''');
+
+ expect(() => doc.remove(['Sammy Sosa']), throwsAliasException);
+ });
+
+ test('removing an alias reference key results in AliasError', () {
+ final doc = YamlEditor('''
+a: &SS Sammy Sosa
+*SS : b
+''');
+
+ expect(() => doc.remove(['Sammy Sosa']), throwsAliasException);
+ });
+
+ test('it is okay to remove a non-alias node', () {
+ final doc = YamlEditor('''
+a: &SS Sammy Sosa
+b: *SS
+c: Sammy Sosa
+''');
+
+ doc.remove(['c']);
+ expect(doc.toString(), equals('''
+a: &SS Sammy Sosa
+b: *SS
+'''));
+ });
+ });
+
+ group('nested alias', () {
+ test('nested list alias anchors are detected too', () {
+ final doc = YamlEditor('''
+-
+ - &SS Sammy Sosa
+- *SS
+''');
+
+ expect(() => doc.remove([0]), throwsAliasException);
+ });
+
+ test('nested list alias references are detected too', () {
+ final doc = YamlEditor('''
+- &SS Sammy Sosa
+-
+ - *SS
+''');
+
+ expect(() => doc.remove([1]), throwsAliasException);
+ });
+
+ test('removing nested map alias anchor results in AliasError', () {
+ final doc = YamlEditor('''
+a:
+ c: &SS Sammy Sosa
+b: *SS
+''');
+
+ expect(() => doc.remove(['a']), throwsAliasException);
+ });
+
+ test('removing nested map alias reference results in AliasError', () {
+ final doc = YamlEditor('''
+a: &SS Sammy Sosa
+b:
+ c: *SS
+''');
+
+ expect(() => doc.remove(['b']), throwsAliasException);
+ });
+ });
+}
diff --git a/pkgs/yaml_edit/test/append_test.dart b/pkgs/yaml_edit/test/append_test.dart
new file mode 100644
index 0000000..cb705ed
--- /dev/null
+++ b/pkgs/yaml_edit/test/append_test.dart
@@ -0,0 +1,269 @@
+// Copyright (c) 2020, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:test/test.dart';
+import 'package:yaml_edit/yaml_edit.dart';
+
+import 'test_utils.dart';
+
+void main() {
+ group('throws PathError', () {
+ test('if it is a map', () {
+ final doc = YamlEditor('a:1');
+ expect(() => doc.appendToList([], 4), throwsPathError);
+ });
+
+ test('if it is a scalar', () {
+ final doc = YamlEditor('1');
+ expect(() => doc.appendToList([], 4), throwsPathError);
+ });
+ });
+
+ group('block list', () {
+ test('(1)', () {
+ final doc = YamlEditor('''
+- 0
+- 1
+- 2
+- 3
+''');
+ doc.appendToList([], 4);
+ expect(doc.toString(), equals('''
+- 0
+- 1
+- 2
+- 3
+- 4
+'''));
+ expectYamlBuilderValue(doc, [0, 1, 2, 3, 4]);
+ });
+
+ test('null path', () {
+ final doc = YamlEditor('''
+~:
+ - 0
+ - 1
+ - 2
+ - 3
+''');
+ doc.appendToList([null], 4);
+ expect(doc.toString(), equals('''
+~:
+ - 0
+ - 1
+ - 2
+ - 3
+ - 4
+'''));
+ expectYamlBuilderValue(doc, {
+ null: [0, 1, 2, 3, 4]
+ });
+ });
+
+ test('element to simple block list ', () {
+ final doc = YamlEditor('''
+- 0
+- 1
+- 2
+- 3
+''');
+ doc.appendToList([], [4, 5, 6]);
+ expect(doc.toString(), equals('''
+- 0
+- 1
+- 2
+- 3
+- - 4
+ - 5
+ - 6
+'''));
+ expectYamlBuilderValue(doc, [
+ 0,
+ 1,
+ 2,
+ 3,
+ [4, 5, 6]
+ ]);
+ });
+
+ test('nested', () {
+ final doc = YamlEditor('''
+- 0
+- - 1
+ - 2
+''');
+ doc.appendToList([1], 3);
+ expect(doc.toString(), equals('''
+- 0
+- - 1
+ - 2
+ - 3
+'''));
+ expectYamlBuilderValue(doc, [
+ 0,
+ [1, 2, 3]
+ ]);
+ });
+
+ test('block list element to nested block list ', () {
+ final doc = YamlEditor('''
+- 0
+- - 1
+ - 2
+''');
+ doc.appendToList([1], [3, 4, 5]);
+
+ expect(doc.toString(), equals('''
+- 0
+- - 1
+ - 2
+ - - 3
+ - 4
+ - 5
+'''));
+ expectYamlBuilderValue(doc, [
+ 0,
+ [
+ 1,
+ 2,
+ [3, 4, 5]
+ ]
+ ]);
+ });
+
+ test('nested', () {
+ final yamlEditor = YamlEditor('''
+a:
+ 1:
+ - null
+ 2: null
+''');
+ yamlEditor.appendToList(['a', 1], false);
+
+ expect(yamlEditor.toString(), equals('''
+a:
+ 1:
+ - null
+ - false
+ 2: null
+'''));
+ });
+
+ test('block append (1)', () {
+ final yamlEditor = YamlEditor('''
+# comment
+- z:
+ x: 1
+ y: 2
+- z:
+ x: 3
+ y: 4
+''');
+ yamlEditor.appendToList([], {
+ 'z': {'x': 5, 'y': 6}
+ });
+
+ expect(yamlEditor.toString(), equals('''
+# comment
+- z:
+ x: 1
+ y: 2
+- z:
+ x: 3
+ y: 4
+- z:
+ x: 5
+ y: 6
+'''));
+ });
+
+ test('block append (2)', () {
+ final yamlEditor = YamlEditor('''
+# comment
+a:
+ - z:
+ x: 1
+ y: 2
+ - z:
+ x: 3
+ y: 4
+b:
+ - w:
+ m: 2
+ n: 4
+''');
+ yamlEditor.appendToList([
+ 'a'
+ ], {
+ 'z': {'x': 5, 'y': 6}
+ });
+
+ expect(yamlEditor.toString(), equals('''
+# comment
+a:
+ - z:
+ x: 1
+ y: 2
+ - z:
+ x: 3
+ y: 4
+ - z:
+ x: 5
+ y: 6
+b:
+ - w:
+ m: 2
+ n: 4
+'''));
+ });
+
+ test('block append nested and with comments', () {
+ final yamlEditor = YamlEditor('''
+a:
+ b:
+ - c:
+ d: 1
+ - c:
+ d: 2
+# comment
+ e:
+ - g:
+ e: 1
+ f: 2
+# comment
+''');
+ expect(
+ () => yamlEditor.appendToList([
+ 'a',
+ 'e'
+ ], {
+ 'g': {'e': 3, 'f': 4}
+ }),
+ returnsNormally);
+ });
+ });
+
+ group('flow list', () {
+ test('(1)', () {
+ final doc = YamlEditor('[0, 1, 2]');
+ doc.appendToList([], 3);
+ expect(doc.toString(), equals('[0, 1, 2, 3]'));
+ expectYamlBuilderValue(doc, [0, 1, 2, 3]);
+ });
+
+ test('null value', () {
+ final doc = YamlEditor('[0, 1, 2]');
+ doc.appendToList([], null);
+ expect(doc.toString(), equals('[0, 1, 2, null]'));
+ expectYamlBuilderValue(doc, [0, 1, 2, null]);
+ });
+
+ test('empty ', () {
+ final doc = YamlEditor('[]');
+ doc.appendToList([], 0);
+ expect(doc.toString(), equals('[0]'));
+ expectYamlBuilderValue(doc, [0]);
+ });
+ });
+}
diff --git a/pkgs/yaml_edit/test/editor_test.dart b/pkgs/yaml_edit/test/editor_test.dart
new file mode 100644
index 0000000..b0a0081
--- /dev/null
+++ b/pkgs/yaml_edit/test/editor_test.dart
@@ -0,0 +1,56 @@
+// Copyright (c) 2020, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:test/test.dart';
+import 'package:yaml_edit/yaml_edit.dart';
+
+void main() {
+ group('YamlEditor records edits', () {
+ test('returns empty list at start', () {
+ final yamlEditor = YamlEditor('YAML: YAML');
+
+ expect(yamlEditor.edits, []);
+ });
+
+ test('after one change', () {
+ final yamlEditor = YamlEditor('YAML: YAML');
+ yamlEditor.update(['YAML'], "YAML Ain't Markup Language");
+
+ expect(
+ yamlEditor.edits, [SourceEdit(5, 5, " YAML Ain't Markup Language")]);
+ });
+
+ test('after multiple changes', () {
+ final yamlEditor = YamlEditor('YAML: YAML');
+ yamlEditor.update(['YAML'], "YAML Ain't Markup Language");
+ yamlEditor.update(['XML'], 'Extensible Markup Language');
+ yamlEditor.remove(['YAML']);
+
+ expect(yamlEditor.edits, [
+ SourceEdit(5, 5, " YAML Ain't Markup Language"),
+ SourceEdit(32, 0, '\nXML: Extensible Markup Language\n'),
+ SourceEdit(0, 33, '')
+ ]);
+ });
+
+ test('that do not automatically update with internal list', () {
+ final yamlEditor = YamlEditor('YAML: YAML');
+ yamlEditor.update(['YAML'], "YAML Ain't Markup Language");
+
+ final firstEdits = yamlEditor.edits;
+
+ expect(firstEdits, [SourceEdit(5, 5, " YAML Ain't Markup Language")]);
+
+ yamlEditor.update(['XML'], 'Extensible Markup Language');
+ yamlEditor.remove(['YAML']);
+
+ expect(firstEdits, [SourceEdit(5, 5, " YAML Ain't Markup Language")]);
+ expect(yamlEditor.edits, [
+ SourceEdit(5, 5, " YAML Ain't Markup Language"),
+ SourceEdit(32, 0, '\nXML: Extensible Markup Language\n'),
+ SourceEdit(0, 33, '')
+ ]);
+ });
+ });
+}
diff --git a/pkgs/yaml_edit/test/golden_test.dart b/pkgs/yaml_edit/test/golden_test.dart
new file mode 100644
index 0000000..1dd6ff3
--- /dev/null
+++ b/pkgs/yaml_edit/test/golden_test.dart
@@ -0,0 +1,40 @@
+// Copyright (c) 2020, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+@TestOn('vm')
+library;
+
+import 'dart:io';
+import 'dart:isolate';
+
+import 'package:test/test.dart';
+
+import 'test_case.dart';
+
+/// This script performs snapshot testing of the inputs in the testing directory
+/// against golden files if they exist, and creates the golden files otherwise.
+///
+/// Input files should be in `test/testdata/input`, while the golden files
+/// should be in `test/testdata/output`.
+///
+/// For more information on the expected input and output, refer to the README
+/// in the testdata folder.
+Future<void> main() async {
+ final packageUri = await Isolate.resolvePackageUri(
+ Uri.parse('package:yaml_edit/yaml_edit.dart'));
+
+ final testdataUri = packageUri!.resolve('../test/testdata/');
+ final inputDirectory = Directory.fromUri(testdataUri.resolve('input/'));
+ final goldDirectoryUri = testdataUri.resolve('output/');
+
+ if (!inputDirectory.existsSync()) {
+ throw FileSystemException(
+ 'Testing Directory does not exist!', inputDirectory.path);
+ }
+
+ final testCases =
+ await TestCases.getTestCases(inputDirectory.uri, goldDirectoryUri);
+
+ testCases.test();
+}
diff --git a/pkgs/yaml_edit/test/insert_test.dart b/pkgs/yaml_edit/test/insert_test.dart
new file mode 100644
index 0000000..8c0e3b2
--- /dev/null
+++ b/pkgs/yaml_edit/test/insert_test.dart
@@ -0,0 +1,207 @@
+// Copyright (c) 2020, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:test/test.dart';
+import 'package:yaml_edit/yaml_edit.dart';
+
+import 'test_utils.dart';
+
+void main() {
+ group('throws PathError', () {
+ test('if it is a map', () {
+ final doc = YamlEditor('a:1');
+ expect(() => doc.insertIntoList([], 0, 4), throwsPathError);
+ });
+
+ test('if it is a scalar', () {
+ final doc = YamlEditor('1');
+ expect(() => doc.insertIntoList([], 0, 4), throwsPathError);
+ });
+ });
+
+ test('throws RangeError if index is out of range', () {
+ final doc = YamlEditor('[1, 2]');
+ expect(() => doc.insertIntoList([], -1, 0), throwsRangeError);
+ expect(() => doc.insertIntoList([], 3, 0), throwsRangeError);
+ });
+
+ group('block list', () {
+ test('(1)', () {
+ final doc = YamlEditor('''
+- 1
+- 2''');
+ doc.insertIntoList([], 0, 0);
+ expect(doc.toString(), equals('''
+- 0
+- 1
+- 2'''));
+ expectYamlBuilderValue(doc, [0, 1, 2]);
+ });
+
+ test('(2)', () {
+ final doc = YamlEditor('''
+- 1
+- 2''');
+ doc.insertIntoList([], 1, 3);
+ expect(doc.toString(), equals('''
+- 1
+- 3
+- 2'''));
+ expectYamlBuilderValue(doc, [1, 3, 2]);
+ });
+
+ test('(3)', () {
+ final doc = YamlEditor('''
+- 1
+- 2
+''');
+ doc.insertIntoList([], 2, 3);
+ expect(doc.toString(), equals('''
+- 1
+- 2
+- 3
+'''));
+ expectYamlBuilderValue(doc, [1, 2, 3]);
+ });
+
+ test('(4)', () {
+ final doc = YamlEditor('''
+- 1
+- 3
+''');
+ doc.insertIntoList([], 1, [4, 5, 6]);
+ expect(doc.toString(), equals('''
+- 1
+- - 4
+ - 5
+ - 6
+- 3
+'''));
+ expectYamlBuilderValue(doc, [
+ 1,
+ [4, 5, 6],
+ 3
+ ]);
+ });
+
+ test(' with comments', () {
+ final doc = YamlEditor('''
+- 0 # comment a
+- 2 # comment b
+''');
+ doc.insertIntoList([], 1, 1);
+ expect(doc.toString(), equals('''
+- 0 # comment a
+- 1
+- 2 # comment b
+'''));
+ expectYamlBuilderValue(doc, [0, 1, 2]);
+ });
+
+ for (var i = 0; i < 3; i++) {
+ test('block insert(1) at $i', () {
+ final yamlEditor = YamlEditor('''
+# comment
+- z:
+ x: 1
+ y: 2
+- z:
+ x: 3
+ y: 4
+''');
+ expect(
+ () => yamlEditor.insertIntoList(
+ [],
+ i,
+ {
+ 'z': {'x': 5, 'y': 6}
+ }),
+ returnsNormally);
+ });
+ }
+
+ for (var i = 0; i < 3; i++) {
+ test('block insert(2) at $i', () {
+ final yamlEditor = YamlEditor('''
+a:
+ - z:
+ x: 1
+ y: 2
+ - z:
+ x: 3
+ y: 4
+b:
+ - w:
+ m: 2
+ n: 4
+''');
+ expect(
+ () => yamlEditor.insertIntoList(
+ ['a'],
+ i,
+ {
+ 'z': {'x': 5, 'y': 6}
+ }),
+ returnsNormally);
+ });
+ }
+
+ for (var i = 0; i < 2; i++) {
+ test('block insert nested and with comments at $i', () {
+ final yamlEditor = YamlEditor('''
+a:
+ b:
+ - c:
+ d: 1
+ - c:
+ d: 2
+# comment
+ e:
+ - g:
+ e: 1
+ f: 2
+# comment
+''');
+ expect(
+ () => yamlEditor.insertIntoList(
+ ['a', 'b'],
+ i,
+ {
+ 'g': {'e': 3, 'f': 4}
+ }),
+ returnsNormally);
+ });
+ }
+ });
+
+ group('flow list', () {
+ test('(1)', () {
+ final doc = YamlEditor('[1, 2]');
+ doc.insertIntoList([], 0, 0);
+ expect(doc.toString(), equals('[0, 1, 2]'));
+ expectYamlBuilderValue(doc, [0, 1, 2]);
+ });
+
+ test('(2)', () {
+ final doc = YamlEditor('[1, 2]');
+ doc.insertIntoList([], 1, 3);
+ expect(doc.toString(), equals('[1, 3, 2]'));
+ expectYamlBuilderValue(doc, [1, 3, 2]);
+ });
+
+ test('(3)', () {
+ final doc = YamlEditor('[1, 2]');
+ doc.insertIntoList([], 2, 3);
+ expect(doc.toString(), equals('[1, 2, 3]'));
+ expectYamlBuilderValue(doc, [1, 2, 3]);
+ });
+
+ test('(4)', () {
+ final doc = YamlEditor('["[],", "[],"]');
+ doc.insertIntoList([], 1, 'test');
+ expect(doc.toString(), equals('["[],", test, "[],"]'));
+ expectYamlBuilderValue(doc, ['[],', 'test', '[],']);
+ });
+ });
+}
diff --git a/pkgs/yaml_edit/test/naughty_test.dart b/pkgs/yaml_edit/test/naughty_test.dart
new file mode 100644
index 0000000..533a535
--- /dev/null
+++ b/pkgs/yaml_edit/test/naughty_test.dart
@@ -0,0 +1,30 @@
+// Copyright (c) 2020, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:async';
+
+import 'package:test/test.dart';
+import 'package:yaml_edit/yaml_edit.dart';
+
+import 'problem_strings.dart';
+
+void main() {
+ for (final string in problemStrings) {
+ test('expect string $string', () {
+ final doc = YamlEditor('');
+
+ /// Using [runZoned] to hide `package:yaml`'s warnings.
+ /// Test failures and errors will still be shown.
+ runZoned(() {
+ expect(() => doc.update([], string), returnsNormally);
+ final value = doc.parseAt([]).value;
+ expect(value, isA<String>());
+ expect(value, equals(string));
+ },
+ zoneSpecification: ZoneSpecification(
+ print: (Zone self, ZoneDelegate parent, Zone zone,
+ String message) {}));
+ });
+ }
+}
diff --git a/pkgs/yaml_edit/test/parse_test.dart b/pkgs/yaml_edit/test/parse_test.dart
new file mode 100644
index 0000000..382307c
--- /dev/null
+++ b/pkgs/yaml_edit/test/parse_test.dart
@@ -0,0 +1,156 @@
+// Copyright (c) 2020, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:test/test.dart';
+import 'package:yaml/yaml.dart';
+import 'package:yaml_edit/yaml_edit.dart';
+
+import 'test_utils.dart';
+
+void main() {
+ group('throws', () {
+ test('PathError if key does not exist', () {
+ final doc = YamlEditor('{a: 4}');
+ final path = ['b'];
+
+ expect(() => doc.parseAt(path), throwsPathError);
+ });
+
+ test('PathError if path tries to go deeper into a scalar', () {
+ final doc = YamlEditor('{a: 4}');
+ final path = ['a', 'b'];
+
+ expect(() => doc.parseAt(path), throwsPathError);
+ });
+
+ test('PathError if index is out of bounds', () {
+ final doc = YamlEditor('[0,1]');
+ final path = [2];
+
+ expect(() => doc.parseAt(path), throwsPathError);
+ });
+
+ test('PathError if index is not an integer', () {
+ final doc = YamlEditor('[0,1]');
+ final path = ['2'];
+
+ expect(() => doc.parseAt(path), throwsPathError);
+ });
+ });
+
+ group('orElse provides a default value', () {
+ test('simple example with null node return ', () {
+ final doc = YamlEditor('{a: {d: 4}, c: ~}');
+ final result = doc.parseAt(['b'], orElse: () => wrapAsYamlNode(null));
+
+ expect(result.value, equals(null));
+ });
+
+ test('simple example with map return', () {
+ final doc = YamlEditor('{a: {d: 4}, c: ~}');
+ final result =
+ doc.parseAt(['b'], orElse: () => wrapAsYamlNode({'a': 42}));
+
+ expect(result, isA<YamlMap>());
+ expect(result.value, equals({'a': 42}));
+ });
+
+ test('simple example with scalar return', () {
+ final doc = YamlEditor('{a: {d: 4}, c: ~}');
+ final result = doc.parseAt(['b'], orElse: () => wrapAsYamlNode(42));
+
+ expect(result, isA<YamlScalar>());
+ expect(result.value, equals(42));
+ });
+
+ test('simple example with list return', () {
+ final doc = YamlEditor('{a: {d: 4}, c: ~}');
+ final result = doc.parseAt(['b'], orElse: () => wrapAsYamlNode([42]));
+
+ expect(result, isA<YamlList>());
+ expect(result.value, equals([42]));
+ });
+ });
+
+ group('returns a YamlNode', () {
+ test('with the correct type', () {
+ final doc = YamlEditor("YAML: YAML Ain't Markup Language");
+ final expectedYamlScalar = doc.parseAt(['YAML']);
+
+ expect(expectedYamlScalar, isA<YamlScalar>());
+ });
+
+ test('with the correct value', () {
+ final doc = YamlEditor("YAML: YAML Ain't Markup Language");
+
+ expect(doc.parseAt(['YAML']).value, "YAML Ain't Markup Language");
+ });
+
+ test('with the correct value in nested collection', () {
+ final doc = YamlEditor('''
+a: 1
+b:
+ d: 4
+ e: [5, 6, 7]
+c: 3
+''');
+
+ expect(doc.parseAt(['b', 'e', 2]).value, 7);
+ });
+
+ test('with a null value in nested collection', () {
+ final doc = YamlEditor('''
+key1:
+ key2: null
+''');
+
+ expect(doc.parseAt(['key1', 'key2']).value, null);
+ });
+
+ test('with the correct type (2)', () {
+ final doc = YamlEditor("YAML: YAML Ain't Markup Language");
+ final expectedYamlMap = doc.parseAt([]);
+
+ expect(expectedYamlMap is YamlMap, equals(true));
+ });
+
+ test('that is immutable', () {
+ final doc = YamlEditor("YAML: YAML Ain't Markup Language");
+ final expectedYamlMap = doc.parseAt([]);
+
+ expect(() => (expectedYamlMap as YamlMap)['YAML'] = 'test',
+ throwsUnsupportedError);
+ });
+
+ test('that has immutable children', () {
+ final doc = YamlEditor("YAML: ['Y', 'A', 'M', 'L']");
+ final expectedYamlMap = doc.parseAt([]);
+
+ expect(() => ((expectedYamlMap as YamlMap)['YAML'] as List)[0] = 'X',
+ throwsUnsupportedError);
+ });
+ });
+
+ test('works with map keys', () {
+ final doc = YamlEditor('{a: {{[1, 2]: 3}: 4}}');
+ expect(
+ doc.parseAt([
+ 'a',
+ {
+ [1, 2]: 3
+ }
+ ]).value,
+ equals(4));
+ });
+
+ test('works with null in path', () {
+ final doc = YamlEditor('{a: { ~: 4}}');
+ expect(doc.parseAt(['a', null]).value, equals(4));
+ });
+
+ test('works with null value', () {
+ final doc = YamlEditor('{a: null}');
+ expect(doc.parseAt(['a']).value, equals(null));
+ });
+}
diff --git a/pkgs/yaml_edit/test/prepend_test.dart b/pkgs/yaml_edit/test/prepend_test.dart
new file mode 100644
index 0000000..3112653
--- /dev/null
+++ b/pkgs/yaml_edit/test/prepend_test.dart
@@ -0,0 +1,169 @@
+// Copyright (c) 2020, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:test/test.dart';
+import 'package:yaml/yaml.dart';
+import 'package:yaml_edit/yaml_edit.dart';
+
+import 'test_utils.dart';
+
+void main() {
+ group('throws PathError', () {
+ test('if it is a map', () {
+ final doc = YamlEditor('a:1');
+ expect(() => doc.prependToList([], 4), throwsPathError);
+ });
+
+ test('if it is a scalar', () {
+ final doc = YamlEditor('1');
+ expect(() => doc.prependToList([], 4), throwsPathError);
+ });
+ });
+
+ group('flow list', () {
+ test('(1)', () {
+ final doc = YamlEditor('[1, 2]');
+ doc.prependToList([], 0);
+ expect(doc.toString(), equals('[0, 1, 2]'));
+ expectYamlBuilderValue(doc, [0, 1, 2]);
+ });
+
+ test('null value', () {
+ final doc = YamlEditor('[1, 2]');
+ doc.prependToList([], null);
+ expect(doc.toString(), equals('[null, 1, 2]'));
+ expectYamlBuilderValue(doc, [null, 1, 2]);
+ });
+
+ test('with spaces (1)', () {
+ final doc = YamlEditor('[ 1 , 2 ]');
+ doc.prependToList([], 0);
+ expect(doc.toString(), equals('[ 0, 1 , 2 ]'));
+ expectYamlBuilderValue(doc, [0, 1, 2]);
+ });
+ });
+
+ group('block list', () {
+ test('(1)', () {
+ final doc = YamlEditor('''
+- 1
+- 2''');
+ doc.prependToList([], 0);
+ expect(doc.toString(), equals('''
+- 0
+- 1
+- 2'''));
+ expectYamlBuilderValue(doc, [0, 1, 2]);
+ });
+
+    /// Regression test to ensure no trailing spaces are introduced.
+ test('(2)', () {
+ final doc = YamlEditor('''- 1
+- 2''');
+ doc.prependToList([], 0);
+ expect(doc.toString(), equals('''- 0
+- 1
+- 2'''));
+ expectYamlBuilderValue(doc, [0, 1, 2]);
+ });
+
+ test('(3)', () {
+ final doc = YamlEditor('''
+- 1
+- 2
+''');
+ doc.prependToList([], [4, 5, 6]);
+ expect(doc.toString(), equals('''
+- - 4
+ - 5
+ - 6
+- 1
+- 2
+'''));
+ expectYamlBuilderValue(doc, [
+ [4, 5, 6],
+ 1,
+ 2
+ ]);
+ });
+
+ test('(4)', () {
+ final doc = YamlEditor('''
+a:
+ - b
+ - - c
+ - d
+''');
+ doc.prependToList(
+ ['a'], wrapAsYamlNode({1: 2}, collectionStyle: CollectionStyle.FLOW));
+
+ expect(doc.toString(), equals('''
+a:
+ - {1: 2}
+ - b
+ - - c
+ - d
+'''));
+ expectYamlBuilderValue(doc, {
+ 'a': [
+ {1: 2},
+ 'b',
+ ['c', 'd']
+ ]
+ });
+ });
+
+ test('with comments ', () {
+ final doc = YamlEditor('''
+# comments
+- 1 # comments
+- 2
+''');
+ doc.prependToList([], 0);
+ expect(doc.toString(), equals('''
+# comments
+- 0
+- 1 # comments
+- 2
+'''));
+ expectYamlBuilderValue(doc, [0, 1, 2]);
+ });
+
+ test('nested in map', () {
+ final doc = YamlEditor('''
+a:
+ - 1
+ - 2
+''');
+ doc.prependToList(['a'], 0);
+ expect(doc.toString(), equals('''
+a:
+ - 0
+ - 1
+ - 2
+'''));
+ expectYamlBuilderValue(doc, {
+ 'a': [0, 1, 2]
+ });
+ });
+
+ test('nested in map with comments ', () {
+ final doc = YamlEditor('''
+a: # comments
+ - 1 # comments
+ - 2
+''');
+ doc.prependToList(['a'], 0);
+ expect(doc.toString(), equals('''
+a: # comments
+ - 0
+ - 1 # comments
+ - 2
+'''));
+ expectYamlBuilderValue(doc, {
+ 'a': [0, 1, 2]
+ });
+ });
+ });
+}
diff --git a/pkgs/yaml_edit/test/preservation_test.dart b/pkgs/yaml_edit/test/preservation_test.dart
new file mode 100644
index 0000000..a763296
--- /dev/null
+++ b/pkgs/yaml_edit/test/preservation_test.dart
@@ -0,0 +1,61 @@
+// Copyright (c) 2020, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:test/test.dart';
+
+import 'test_utils.dart';
+
+void main() {
+ group('preserves original yaml: ', () {
+ test('number', expectLoadPreservesYAML('2'));
+ test('number with leading and trailing lines', expectLoadPreservesYAML('''
+
+ 2
+
+ '''));
+ test('octal numbers', expectLoadPreservesYAML('0o14'));
+ test('negative numbers', expectLoadPreservesYAML('-345'));
+ test('hexadecimal numbers', expectLoadPreservesYAML('0x123abc'));
+ test('floating point numbers', expectLoadPreservesYAML('345.678'));
+ test('exponential numbers', expectLoadPreservesYAML('12.3015e+02'));
+ test('string', expectLoadPreservesYAML('a string'));
+ test('string with control characters',
+ expectLoadPreservesYAML('a string \\n'));
+    test('string with control characters (2)',
+ expectLoadPreservesYAML('a string \n\r'));
+ test('string with hex escapes',
+ expectLoadPreservesYAML('\\x0d\\x0a is \\r\\n'));
+ test('flow map', expectLoadPreservesYAML('{a: 2}'));
+ test('flow list', expectLoadPreservesYAML('[1, 2]'));
+ test('flow list with different types of elements',
+ expectLoadPreservesYAML('[1, a]'));
+ test('flow list with weird spaces',
+ expectLoadPreservesYAML('[ 1 , 2]'));
+ test('multiline string', expectLoadPreservesYAML('''
+ Mark set a major league
+ home run record in 1998.'''));
+ test('tilde', expectLoadPreservesYAML('~'));
+ test('false', expectLoadPreservesYAML('false'));
+
+ test('block map', expectLoadPreservesYAML('''a:
+ b: 1
+ '''));
+ test('block list', expectLoadPreservesYAML('''a:
+ - 1
+ '''));
+ test('complicated example', () {
+ expectLoadPreservesYAML('''verb: RecommendCafes
+map:
+ a:
+ b: 1
+recipe:
+ - verb: Score
+ outputs: ["DishOffering[]/Scored", "Suggestions"]
+ name: Hotpot
+ - verb: Rate
+ inputs: Dish
+ ''');
+ });
+ });
+}
diff --git a/pkgs/yaml_edit/test/problem_strings.dart b/pkgs/yaml_edit/test/problem_strings.dart
new file mode 100644
index 0000000..527a9e0
--- /dev/null
+++ b/pkgs/yaml_edit/test/problem_strings.dart
@@ -0,0 +1,91 @@
+// Copyright (c) 2020, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+// ignore_for_file: lines_longer_than_80_chars
+
+const problemStrings = [
+ '[]',
+ '{}',
+ '',
+ ',',
+ '~',
+ 'undefined',
+ 'undef',
+ 'null',
+ 'NULL',
+ '(null)',
+ 'nil',
+ 'NIL',
+ 'true',
+ 'false',
+ 'True',
+ 'False',
+ 'TRUE',
+ 'FALSE',
+ 'None',
+ '\\',
+ '\\\\',
+ '0',
+ '1',
+ '\$1.00',
+ '1/2',
+ '1E2',
+ '-\$1.00',
+ '-1/2',
+ '-1E+02',
+ '1/0',
+ '0/0',
+ '-0',
+ '+0.0',
+ '0..0',
+ '.',
+ '0.0.0',
+ '0,00',
+ ',',
+ '0.0/0',
+ '1.0/0.0',
+ '0.0/0.0',
+ '--1',
+ '-',
+ '-.',
+ '-,',
+ '999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999',
+ 'NaN',
+ 'Infinity',
+ '-Infinity',
+ 'INF',
+ '1#INF',
+ '0x0',
+ '0xffffffffffffffff',
+ "1'000.00",
+ '1,000,000.00',
+ '1.000,00',
+ "1'000,00",
+ '1.000.000,00',
+ ",./;'[]\\-=",
+ '<>?:"{}|_+',
+ '!@#\$%^&*()`~',
+ '\u0001\u0002\u0003\u0004\u0005\u0006\u0007\b\u000e\u000f\u0010\u0011\u0012\u0013\u0014\u0015\u0016\u0017\u0018\u0019\u001a\u001b\u001c\u001d\u001e\u001f',
+ '\t\u000b\f
',
+ 'ด้้้้้็็็็็้้้้้็็็็็้้้้้้้้็็็็็้้้้้็็็็็้้้้้้้้็็็็็้้้้้็็็็็้้้้้้้้็็็็็้้้้้็็็็ ด้้้้้็็็็็้้้้้็็็็็้้้้้้้้็็็็็้้้้้็็็็็้้้้้้้้็็็็็้้้้้็็็็็้้้้้้้้็็็็็้้้้้็็็็ ด้้้้้็็็็็้้้้้็็็็็้้้้้้้้็็็็็้้้้้็็็็็้้้้้้้้็็็็็้้้้้็็็็็้้้้้้้้็็็็็้้้้้็็็็',
+ "'",
+ '"',
+ "''",
+ '\'"',
+ "'\"'",
+ '社會科學院語學研究所',
+ 'Ⱥ',
+ 'ヽ༼ຈل͜ຈ༽ノ ヽ༼ຈل͜ຈ༽ノ',
+ '❤️ 💔 💌 💕 💞 💓 💗 💖 💘 💝 💟 💜 💛 💚 💙',
+ '𝕋𝕙𝕖 𝕢𝕦𝕚𝕔𝕜 𝕓𝕣𝕠𝕨𝕟 𝕗𝕠𝕩 𝕛𝕦𝕞𝕡𝕤 𝕠𝕧𝕖𝕣 𝕥𝕙𝕖 𝕝𝕒𝕫𝕪 𝕕𝕠𝕘',
+ ' ',
+ '%',
+ '%d',
+ '%s%s%s%s%s',
+ '{0}',
+ '%*.*s',
+ '%@',
+ '%n',
+ 'The quic\b\b\b\b\b\bk brown fo\u0007\u0007\u0007\u0007\u0007\u0007\u0007\u0007\u0007\u0007\u0007x... [Beeeep]',
+];
diff --git a/pkgs/yaml_edit/test/random_test.dart b/pkgs/yaml_edit/test/random_test.dart
new file mode 100644
index 0000000..85cea4a
--- /dev/null
+++ b/pkgs/yaml_edit/test/random_test.dart
@@ -0,0 +1,311 @@
+// Copyright (c) 2020, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'dart:math' show Random;
+
+import 'package:test/test.dart';
+import 'package:yaml/yaml.dart';
+import 'package:yaml_edit/yaml_edit.dart';
+
+import 'problem_strings.dart';
+import 'test_utils.dart';
+
+/// Performs naive fuzzing on a template YAML document with a seeded generator.
+///
+/// Starting with the template YAML, we randomly generate modifications and
+/// their inputs (booleans, nulls, strings, or numbers), apply them to the
+/// document, and assert that each modification completes without throwing.
+void main() {
+ final generator = _Generator(maxDepth: 5);
+
+ const roundsOfTesting = 40;
+ const modificationsPerRound = 1000;
+
+ for (var i = 0; i < roundsOfTesting; i++) {
+ test(
+ 'testing with randomly generated modifications: test $i',
+ () {
+ final editor = YamlEditor('''
+name: yaml_edit
+description: A library for YAML manipulation with comment and whitespace preservation.
+version: 0.0.1-dev
+
+environment:
+ sdk: ">=2.4.0 <3.0.0"
+
+dependencies:
+ meta: ^1.1.8
+ quiver_hashcode: ^2.0.0
+
+dev_dependencies:
+ pedantic: ^1.9.0
+ test: ^1.14.4
+''');
+
+ for (var j = 0; j < modificationsPerRound; j++) {
+ expect(
+ () => generator.performNextModification(editor, i),
+ returnsNormally,
+ );
+ }
+ },
+ );
+ }
+}
+
+/// Generates the random variables we need for fuzzing.
+class _Generator {
+ final Random r;
+
+ /// 2^32
+ static const int maxInt = 4294967296;
+
+ /// Maximum depth of random YAML collection generated.
+ final int maxDepth;
+
+ // ignore: unused_element
+ _Generator({int seed = 0, required this.maxDepth}) : r = Random(seed);
+
+ int nextInt([int max = maxInt]) => r.nextInt(max);
+
+ double nextDouble() => r.nextDouble();
+
+ bool nextBool() => r.nextBool();
+
+  /// Generates a new string by individually generating characters and
+  /// appending them to a buffer. Currently only generates characters from
+  /// ASCII codes 32 to 126.
+ String nextString() {
+ if (nextBool()) {
+ return problemStrings[nextInt(problemStrings.length)];
+ }
+
+ final length = nextInt(100);
+ final buffer = StringBuffer();
+
+ for (var i = 0; i < length; i++) {
+ final charCode = nextInt(95) + 32;
+ buffer.writeCharCode(charCode);
+ }
+
+ return buffer.toString();
+ }
+
+ /// Generates a new scalar recognizable by YAML.
+ Object? nextScalar() {
+ final typeIndex = nextInt(5);
+
+ switch (typeIndex) {
+ case 0:
+ return nextBool();
+ case 1:
+ return nextDouble();
+ case 2:
+ return nextInt();
+ case 3:
+ return null;
+ default:
+ return nextString();
+ }
+ }
+
+ YamlScalar nextYamlScalar() {
+ return wrapAsYamlNode(nextScalar(), scalarStyle: nextScalarStyle())
+ as YamlScalar;
+ }
+
+ /// Generates the next [YamlList], with the current [depth].
+ YamlList nextYamlList(int depth) {
+ final length = nextInt(9);
+ final list = [];
+
+ for (var i = 0; i < length; i++) {
+ list.add(nextYamlNode(depth + 1));
+ }
+
+ return wrapAsYamlNode(list, collectionStyle: nextCollectionStyle())
+ as YamlList;
+ }
+
+  /// Generates the next [YamlMap], with the current [depth].
+ YamlMap nextYamlMap(int depth) {
+ final length = nextInt(9);
+ final nodes = {};
+
+ for (var i = 0; i < length; i++) {
+ nodes[nextYamlNode(depth + 1)] = nextYamlScalar();
+ }
+
+ return wrapAsYamlNode(nodes, collectionStyle: nextCollectionStyle())
+ as YamlMap;
+ }
+
+  /// Returns a [YamlNode], with it being a [YamlScalar] 80% of the time, a
+  /// [YamlList] 10% of the time, and a [YamlMap] 10% of the time.
+  ///
+  /// If [depth] is greater than or equal to [maxDepth], we immediately return
+  /// a [YamlScalar] to stop the tree from growing further and keep tests fast.
+ YamlNode nextYamlNode([int depth = 0]) {
+ if (depth >= maxDepth) {
+ return nextYamlScalar();
+ }
+
+ final roll = nextInt(10);
+
+ if (roll < 8) {
+ return nextYamlScalar();
+ } else if (roll == 8) {
+ return nextYamlList(depth);
+ } else {
+ return nextYamlMap(depth);
+ }
+ }
+
+  /// Performs a random modification on [editor].
+ void performNextModification(YamlEditor editor, int count) {
+ final path = findPath(editor);
+ final node = editor.parseAt(path);
+ final initialString = editor.toString();
+ final args = [];
+ var method = YamlModificationMethod.remove;
+
+ try {
+ if (node is YamlScalar) {
+ editor.remove(path);
+ return;
+ }
+
+ if (node is YamlList) {
+ final methodIndex = nextInt(YamlModificationMethod.values.length);
+ method = YamlModificationMethod.values[methodIndex];
+
+ switch (method) {
+ case YamlModificationMethod.remove:
+ editor.remove(path);
+ break;
+ case YamlModificationMethod.update:
+ if (node.isEmpty) break;
+ final index = nextInt(node.length);
+ args.add(nextYamlNode());
+ path.add(index);
+ editor.update(path, args[0]);
+ break;
+ case YamlModificationMethod.appendTo:
+ args.add(nextYamlNode());
+ editor.appendToList(path, args[0]);
+ break;
+ case YamlModificationMethod.prependTo:
+ args.add(nextYamlNode());
+ editor.prependToList(path, args[0]);
+ break;
+ case YamlModificationMethod.insert:
+ args.add(nextInt(node.length + 1));
+ args.add(nextYamlNode());
+ editor.insertIntoList(path, args[0] as int, args[1]);
+ break;
+ case YamlModificationMethod.splice:
+ args.add(nextInt(node.length + 1));
+ args.add(nextInt(node.length + 1 - (args[0] as int)));
+ args.add(nextYamlList(0));
+ editor.spliceList(
+ path, args[0] as int, args[1] as int, args[2] as List);
+ break;
+ }
+ return;
+ }
+
+ if (node is YamlMap) {
+ final replace = nextBool();
+ method = YamlModificationMethod.update;
+
+ if (replace && node.isNotEmpty) {
+ final keyList = node.keys.toList();
+ path.add(keyList[nextInt(keyList.length)]);
+ } else {
+ path.add(nextScalar());
+ }
+ final value = nextYamlNode();
+ args.add(value);
+ editor.update(path, value);
+ return;
+ }
+ } catch (error, stacktrace) {
+      // TODO: Fix once reproducible; identify the failure pattern.
+ if (count == 20) return;
+
+ print('''
+Failed to call $method on:
+$initialString
+with the following arguments:
+$args
+and path:
+$path
+
+Error Details:
+$error
+
+$stacktrace
+''');
+ rethrow;
+ }
+
+ throw AssertionError('Got invalid node');
+ }
+
+  /// Obtains a random path by traversing [editor].
+  ///
+  /// At each step there is a 50% chance of stopping at the current node;
+  /// otherwise we descend into a random child, stopping if there are none.
+ List<Object?> findPath(YamlEditor editor) {
+ final path = <Object?>[];
+
+ // 50% chance of stopping at the collection
+ while (nextBool()) {
+ final node = editor.parseAt(path);
+
+ if (node is YamlList && node.isNotEmpty) {
+ path.add(nextInt(node.length));
+ } else if (node is YamlMap && node.isNotEmpty) {
+ final keyList = node.keys.toList();
+ path.add(keyList[nextInt(keyList.length)]);
+ } else {
+ break;
+ }
+ }
+
+ return path;
+ }
+
+ ScalarStyle nextScalarStyle() {
+ final seed = nextInt(6);
+
+ switch (seed) {
+ case 0:
+ return ScalarStyle.DOUBLE_QUOTED;
+ case 1:
+ return ScalarStyle.FOLDED;
+ case 2:
+ return ScalarStyle.LITERAL;
+ case 3:
+ return ScalarStyle.PLAIN;
+ case 4:
+ return ScalarStyle.SINGLE_QUOTED;
+ default:
+ return ScalarStyle.ANY;
+ }
+ }
+
+ CollectionStyle nextCollectionStyle() {
+ final seed = nextInt(3);
+
+ switch (seed) {
+ case 0:
+ return CollectionStyle.BLOCK;
+ case 1:
+ return CollectionStyle.FLOW;
+ default:
+ return CollectionStyle.ANY;
+ }
+ }
+}
diff --git a/pkgs/yaml_edit/test/remove_test.dart b/pkgs/yaml_edit/test/remove_test.dart
new file mode 100644
index 0000000..4742b56
--- /dev/null
+++ b/pkgs/yaml_edit/test/remove_test.dart
@@ -0,0 +1,603 @@
+// Copyright (c) 2020, the Dart project authors. Please see the AUTHORS file
+// for details. All rights reserved. Use of this source code is governed by a
+// BSD-style license that can be found in the LICENSE file.
+
+import 'package:test/test.dart';
+import 'package:yaml_edit/yaml_edit.dart';
+
+import 'test_utils.dart';
+
+void main() {
+ group('throws', () {
+ test('PathError if collectionPath points to a scalar', () {
+ final doc = YamlEditor('''
+a: 1
+b: 2
+c: 3
+''');
+
+ expect(() => doc.remove(['a', 0]), throwsPathError);
+ });
+
+ test('PathError if collectionPath is invalid', () {
+ final doc = YamlEditor('''
+a: 1
+b: 2
+c: 3
+''');
+
+ expect(() => doc.remove(['d']), throwsPathError);
+ });
+
+ test('PathError if collectionPath is invalid in nested path', () {
+ final doc = YamlEditor('''
+a:
+ b: 'foo'
+''');
+
+ expect(() => doc.remove(['d']), throwsPathError);
+ });
+
+ test('PathError if collectionPath is invalid - list', () {
+ final doc = YamlEditor('''
+[1, 2, 3]
+''');
+
+ expect(() => doc.remove([4]), throwsPathError);
+ });
+
+ test('PathError in list if using a non-integer as index', () {
+ final doc = YamlEditor("{ a: ['b', 'c'] }");
+ expect(() => doc.remove(['a', 'b']), throwsPathError);
+ });
+
+ test('PathError if path is invalid', () {
+ final doc = YamlEditor("{ a: ['b', 'c'] }");
+ expect(() => doc.remove(['a', 0, '1']), throwsPathError);
+ });
+ });
+
+ group('returns', () {
+    test('the removed node when successful', () {
+ final doc = YamlEditor('{ a: { b: foo } }');
+ final node = doc.remove(['a', 'b']);
+ expect(node.value, equals('foo'));
+ });
+
+    test('a null-value node when the doc and path are empty', () {
+ final doc = YamlEditor('');
+ final node = doc.remove([]);
+ expect(node.value, equals(null));
+ });
+ });
+
+ test('empty path should clear string', () {
+ final doc = YamlEditor('''
+a: 1
+b: 2
+c: [3, 4]
+''');
+ doc.remove([]);
+ expect(doc.toString(), equals(''));
+ });
+
+ group('block map', () {
+ test('(1)', () {
+ final doc = YamlEditor('''
+a: 1
+b: 2
+c: 3
+''');
+ doc.remove(['b']);
+ expect(doc.toString(), equals('''
+a: 1
+c: 3
+'''));
+ });
+
+ test('empty value', () {
+ final doc = YamlEditor('''
+a: 1
+b:
+c: 3
+''');
+ doc.remove(['b']);
+ expect(doc.toString(), equals('''
+a: 1
+c: 3
+'''));
+ });
+
+ test('empty value (2)', () {
+ final doc = YamlEditor('''
+- a: 1
+ b:
+ c: 3
+''');
+ doc.remove([0, 'b']);
+ expect(doc.toString(), equals('''
+- a: 1
+ c: 3
+'''));
+ });
+
+ test('empty value (3)', () {
+ final doc = YamlEditor('''
+- a: 1
+ b:
+
+ c: 3
+''');
+ doc.remove([0, 'b']);
+ expect(doc.toString(), equals('''
+- a: 1
+
+ c: 3
+'''));
+ });
+
+ test('preserves comments', () {
+ final doc = YamlEditor('''
+a: 1 # preserved 1
+# preserved 2
+b: 2
+# preserved 3
+c: 3 # preserved 4
+''');
+ doc.remove(['b']);
+ expect(doc.toString(), equals('''
+a: 1 # preserved 1
+# preserved 2
+# preserved 3
+c: 3 # preserved 4
+'''));
+ });
+
+ test('final element in map', () {
+ final doc = YamlEditor('''
+a: 1
+b: 2
+''');
+ doc.remove(['b']);
+ expect(doc.toString(), equals('''
+a: 1
+'''));
+ });
+
+ test('final element in nested map', () {
+ final doc = YamlEditor('''
+a:
+ aa: 11
+ bb: 22
+b: 2
+''');
+ doc.remove(['a', 'bb']);
+ expect(doc.toString(), equals('''
+a:
+ aa: 11
+b: 2
+'''));
+ });
+
+    test('removing the last key leaves an empty flow map', () {
+ final doc = YamlEditor('''
+a: 1
+''');
+ doc.remove(['a']);
+ expect(doc.toString(), equals('''
+{}
+'''));
+ });
+
+    test('removing the last key leaves an empty flow map (2)', () {
+ final doc = YamlEditor('''
+- a: 1
+- b: 2
+''');
+ doc.remove([0, 'a']);
+ expect(doc.toString(), equals('''
+- {}
+- b: 2
+'''));
+ });
+
+ test('nested', () {
+ final doc = YamlEditor('''
+a: 1
+b:
+ d: 4
+ e: 5
+c: 3
+''');
+ doc.remove(['b', 'd']);
+ expect(doc.toString(), equals('''
+a: 1
+b:
+ e: 5
+c: 3
+'''));
+ });
+
+    test('issue #55 reopened', () {
+ final doc = YamlEditor('''name: sample
+version: 0.1.0
+environment:
+ sdk: ^3.0.0
+dependencies:
+ retry: ^3.1.2
+dev_dependencies:
+ retry:''');
+ doc.remove(['dev_dependencies']);
+ });
+
+    test('issue #55 reopened, variant 2', () {
+ final doc = YamlEditor('''name: sample
+version: 0.1.0
+environment:
+ sdk: ^3.0.0
+dependencies:
+ retry: ^3.1.2
+dev_dependencies:
+ retry:''');
+ doc.remove(['dev_dependencies', 'retry']);
+ });
+ });
+
+ group('flow map', () {
+ test('(1)', () {
+ final doc = YamlEditor('{a: 1, b: 2, c: 3}');
+ doc.remove(['b']);
+ expect(doc.toString(), equals('{a: 1, c: 3}'));
+ });
+
+ test('(2) ', () {
+ final doc = YamlEditor('{a: 1}');
+ doc.remove(['a'