[INFO] cloning repository https://github.com/divy-goes-random/json_tokenizer
[INFO] running `Command { std: "git" "-c" "credential.helper=" "-c" "credential.helper=/workspace/cargo-home/bin/git-credential-null" "clone" "--bare" "https://github.com/divy-goes-random/json_tokenizer" "/workspace/cache/git-repos/https%3A%2F%2Fgithub.com%2Fdivy-goes-random%2Fjson_tokenizer", kill_on_drop: false }`
[INFO] [stderr] Cloning into bare repository '/workspace/cache/git-repos/https%3A%2F%2Fgithub.com%2Fdivy-goes-random%2Fjson_tokenizer'...
[INFO] [stderr] error: copy-fd: write returned: No space left on device
[INFO] [stderr] fatal: cannot copy '/usr/share/git-core/templates/hooks/post-update.sample' to '/workspace/cache/git-repos/https%3A%2F%2Fgithub.com%2Fdivy-goes-random%2Fjson_tokenizer/hooks/post-update.sample': No space left on device
[WARN] Retrying crate fetch in 3 seconds (attempt 1)
[INFO] cloning repository https://github.com/divy-goes-random/json_tokenizer
[INFO] running `Command { std: "git" "-c" "credential.helper=" "-c" "credential.helper=/workspace/cargo-home/bin/git-credential-null" "clone" "--bare" "https://github.com/divy-goes-random/json_tokenizer" "/workspace/cache/git-repos/https%3A%2F%2Fgithub.com%2Fdivy-goes-random%2Fjson_tokenizer", kill_on_drop: false }`
[INFO] [stderr] Cloning into bare repository '/workspace/cache/git-repos/https%3A%2F%2Fgithub.com%2Fdivy-goes-random%2Fjson_tokenizer'...
[INFO] running `Command { std: "git" "rev-parse" "HEAD", kill_on_drop: false }`
[INFO] [stdout] 8dd8e4fb70aefed168108dcd00523128947f2b15
[INFO] building divy-goes-random/json_tokenizer against master#44f415c1d617ebc7b931a243b7b321ef8a6ca47c for pr-142134-abi-ast-error
[INFO] running `Command { std: "git" "clone" "/workspace/cache/git-repos/https%3A%2F%2Fgithub.com%2Fdivy-goes-random%2Fjson_tokenizer" "/workspace/builds/worker-4-tc1/source", kill_on_drop: false }`
[INFO] [stderr] Cloning into '/workspace/builds/worker-4-tc1/source'...
[INFO] [stderr] done.
[INFO] validating manifest of git repo https://github.com/divy-goes-random/json_tokenizer on toolchain 44f415c1d617ebc7b931a243b7b321ef8a6ca47c
[INFO] running `Command { std: CARGO_HOME="/workspace/cargo-home" RUSTUP_HOME="/workspace/rustup-home" "/workspace/cargo-home/bin/cargo" "+44f415c1d617ebc7b931a243b7b321ef8a6ca47c" "metadata" "--manifest-path" "Cargo.toml" "--no-deps", kill_on_drop: false }`
[INFO] started tweaking git repo https://github.com/divy-goes-random/json_tokenizer
[INFO] finished tweaking git repo https://github.com/divy-goes-random/json_tokenizer
[INFO] tweaked toml for git repo https://github.com/divy-goes-random/json_tokenizer written to /workspace/builds/worker-4-tc1/source/Cargo.toml
[INFO] crate git repo https://github.com/divy-goes-random/json_tokenizer already has a lockfile, it will not be regenerated
[INFO] running `Command { std: CARGO_HOME="/workspace/cargo-home" RUSTUP_HOME="/workspace/rustup-home" "/workspace/cargo-home/bin/cargo" "+44f415c1d617ebc7b931a243b7b321ef8a6ca47c" "fetch" "--manifest-path" "Cargo.toml", kill_on_drop: false }`
[INFO] running `Command { std: "docker" "create" "-v" "/var/lib/crater-agent-workspace/builds/worker-4-tc1/target:/opt/rustwide/target:rw,Z" "-v" "/var/lib/crater-agent-workspace/builds/worker-4-tc1/source:/opt/rustwide/workdir:ro,Z" "-v" "/var/lib/crater-agent-workspace/cargo-home:/opt/rustwide/cargo-home:ro,Z" "-v" "/var/lib/crater-agent-workspace/rustup-home:/opt/rustwide/rustup-home:ro,Z" "-e" "SOURCE_DIR=/opt/rustwide/workdir" "-e" "CARGO_TARGET_DIR=/opt/rustwide/target" "-e" "CARGO_HOME=/opt/rustwide/cargo-home" "-e" "RUSTUP_HOME=/opt/rustwide/rustup-home" "-w" "/opt/rustwide/workdir" "-m" "1610612736" "--user" "0:0" "--network" "none" "ghcr.io/rust-lang/crates-build-env/linux@sha256:eea15d5475069c3ef791f10c2c6b5af2ee421ef2da1e680ecce1cba46243983b" "/opt/rustwide/cargo-home/bin/cargo" "+44f415c1d617ebc7b931a243b7b321ef8a6ca47c" "metadata" "--no-deps" "--format-version=1", kill_on_drop: false }`
[INFO] [stdout] e3e625503c767b3da466c02f20db963964d8a59da5c1a55542aed2cf4c04b5dd
[INFO] running `Command { std: "docker" "start" "-a" "e3e625503c767b3da466c02f20db963964d8a59da5c1a55542aed2cf4c04b5dd", kill_on_drop: false }`
[INFO] running `Command { std: "docker" "inspect" "e3e625503c767b3da466c02f20db963964d8a59da5c1a55542aed2cf4c04b5dd", kill_on_drop: false }`
[INFO] running `Command { std: "docker" "rm" "-f" "e3e625503c767b3da466c02f20db963964d8a59da5c1a55542aed2cf4c04b5dd", kill_on_drop: false }`
[INFO] [stdout] e3e625503c767b3da466c02f20db963964d8a59da5c1a55542aed2cf4c04b5dd
[INFO] running `Command { std: "docker" "create" "-v" "/var/lib/crater-agent-workspace/builds/worker-4-tc1/target:/opt/rustwide/target:rw,Z" "-v" "/var/lib/crater-agent-workspace/builds/worker-4-tc1/source:/opt/rustwide/workdir:ro,Z" "-v" "/var/lib/crater-agent-workspace/cargo-home:/opt/rustwide/cargo-home:ro,Z" "-v" "/var/lib/crater-agent-workspace/rustup-home:/opt/rustwide/rustup-home:ro,Z" "-e" "SOURCE_DIR=/opt/rustwide/workdir" "-e" "CARGO_TARGET_DIR=/opt/rustwide/target" "-e" "CARGO_INCREMENTAL=0" "-e" "RUST_BACKTRACE=full" "-e" "RUSTFLAGS=--cap-lints=warn" "-e" "RUSTDOCFLAGS=--cap-lints=warn" "-e" "CARGO_HOME=/opt/rustwide/cargo-home" "-e" "RUSTUP_HOME=/opt/rustwide/rustup-home" "-w" "/opt/rustwide/workdir" "-m" "1610612736" "--user" "0:0" "--network" "none" "ghcr.io/rust-lang/crates-build-env/linux@sha256:eea15d5475069c3ef791f10c2c6b5af2ee421ef2da1e680ecce1cba46243983b" "/opt/rustwide/cargo-home/bin/cargo" "+44f415c1d617ebc7b931a243b7b321ef8a6ca47c" "build" "--frozen" "--message-format=json", kill_on_drop: false }`
[INFO] [stdout] bf154a27ff59950c7016f2dee20b9b2aa340013e7c8e6912a3747449467f7690
[INFO] running `Command { std: "docker" "start" "-a" "bf154a27ff59950c7016f2dee20b9b2aa340013e7c8e6912a3747449467f7690", kill_on_drop: false }`
[INFO] [stderr] Compiling memchr v2.3.3
[INFO] [stderr] Compiling lazy_static v1.4.0
[INFO] [stderr] Compiling regex-syntax v0.6.18
[INFO] [stderr] Compiling thread_local v1.0.1
[INFO] [stderr] Compiling aho-corasick v0.7.13
[INFO] [stderr] Compiling regex v1.3.9
[INFO] [stderr] Compiling json-tokenizer v1.0.1 (/opt/rustwide/workdir)
[INFO] [stdout] warning: `...` range patterns are deprecated
[INFO] [stdout] --> src/tokenizer.rs:134:20
[INFO] [stdout] |
[INFO] [stdout] 134 | '0'...'9' => v.push(self.number_token()?),
[INFO] [stdout] | ^^^ help: use `..=` for an inclusive range
[INFO] [stdout] |
[INFO] [stdout] = warning: this is accepted in the current edition (Rust 2018) but is a hard error in Rust 2021!
[INFO] [stdout] = note: for more information, see
[INFO] [stdout] = note: `#[warn(ellipsis_inclusive_range_patterns)]` on by default
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stdout] warning: field `0` is never read
[INFO] [stdout] --> src/error.rs:7:6
[INFO] [stdout] |
[INFO] [stdout] 7 | IO(std::io::Error),
[INFO] [stdout] | -- ^^^^^^^^^^^^^^
[INFO] [stdout] | |
[INFO] [stdout] | field in this variant
[INFO] [stdout] |
[INFO] [stdout] = note: `Error` has a derived impl for the trait `Debug`, but this is intentionally ignored during dead code analysis
[INFO] [stdout] = note: `#[warn(dead_code)]` on by default
[INFO] [stdout] help: consider changing the field to be of unit type to suppress this warning while preserving the field numbering, or remove the field
[INFO] [stdout] |
[INFO] [stdout] 7 - IO(std::io::Error),
[INFO] [stdout] 7 + IO(()),
[INFO] [stdout] |
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stdout] warning: variants `Tokenize`, `Parse`, and `Other` are never constructed
[INFO] [stdout] --> src/error.rs:5:3
[INFO] [stdout] |
[INFO] [stdout] 4 | pub enum Error {
[INFO] [stdout] | ----- variants in this enum
[INFO] [stdout] 5 | Tokenize(String),
[INFO] [stdout] | ^^^^^^^^
[INFO] [stdout] 6 | Parse(String),
[INFO] [stdout] | ^^^^^
[INFO] [stdout] 7 | IO(std::io::Error),
[INFO] [stdout] 8 | Other(String),
[INFO] [stdout] | ^^^^^
[INFO] [stdout] |
[INFO] [stdout] = note: `Error` has a derived impl for the trait `Debug`, but this is intentionally ignored during dead code analysis
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stdout] warning: type alias `Result` is never used
[INFO] [stdout] --> src/result.rs:3:10
[INFO] [stdout] |
[INFO] [stdout] 3 | pub type Result = std::result::Result;
[INFO] [stdout] | ^^^^^^
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stdout] warning: enum `Token` is never used
[INFO] [stdout] --> src/tokenizer.rs:11:10
[INFO] [stdout] |
[INFO] [stdout] 11 | pub enum Token {
[INFO] [stdout] | ^^^^^
[INFO] [stdout] |
[INFO] [stdout] = note: `Token` has a derived impl for the trait `Debug`, but this is intentionally ignored during dead code analysis
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stdout] warning: field `char_stream` is never read
[INFO] [stdout] --> src/tokenizer.rs:25:5
[INFO] [stdout] |
[INFO] [stdout] 24 | pub struct Tokenizer {
[INFO] [stdout] | --------- field in this struct
[INFO] [stdout] 25 | char_stream: PeekableIter,
[INFO] [stdout] | ^^^^^^^^^^^
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stdout] warning: multiple methods are never used
[INFO] [stdout] --> src/tokenizer.rs:35:8
[INFO] [stdout] |
[INFO] [stdout] 28 | impl Tokenizer {
[INFO] [stdout] | -------------- methods in this implementation
[INFO] [stdout] ...
[INFO] [stdout] 35 | fn take_until(&mut self, predicate: fn(char) -> bool) -> Result> {
[INFO] [stdout] | ^^^^^^^^^^
[INFO] [stdout] ...
[INFO] [stdout] 50 | fn take_while(&mut self, predicate: fn(char) -> bool) -> Result> {
[INFO] [stdout] | ^^^^^^^^^^
[INFO] [stdout] ...
[INFO] [stdout] 63 | fn skip(&mut self, ch: char) -> Result<()> {
[INFO] [stdout] | ^^^^
[INFO] [stdout] ...
[INFO] [stdout] 70 | fn string_token(&mut self) -> Result {
[INFO] [stdout] | ^^^^^^^^^^^^
[INFO] [stdout] ...
[INFO] [stdout] 76 | fn number_token(&mut self) -> Result {
[INFO] [stdout] | ^^^^^^^^^^^^
[INFO] [stdout] ...
[INFO] [stdout] 85 | fn keyword_token(&mut self) -> Result {
[INFO] [stdout] | ^^^^^^^^^^^^^
[INFO] [stdout] ...
[INFO] [stdout] 102 | pub fn tokenize(&mut self) -> Result> {
[INFO] [stdout] | ^^^^^^^^
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stderr] Finished `dev` profile [unoptimized + debuginfo] target(s) in 4.70s
[INFO] running `Command { std: "docker" "inspect" "bf154a27ff59950c7016f2dee20b9b2aa340013e7c8e6912a3747449467f7690", kill_on_drop: false }`
[INFO] running `Command { std: "docker" "rm" "-f" "bf154a27ff59950c7016f2dee20b9b2aa340013e7c8e6912a3747449467f7690", kill_on_drop: false }`
[INFO] [stdout] bf154a27ff59950c7016f2dee20b9b2aa340013e7c8e6912a3747449467f7690
[INFO] running `Command { std: "docker" "create" "-v" "/var/lib/crater-agent-workspace/builds/worker-4-tc1/target:/opt/rustwide/target:rw,Z" "-v" "/var/lib/crater-agent-workspace/builds/worker-4-tc1/source:/opt/rustwide/workdir:ro,Z" "-v" "/var/lib/crater-agent-workspace/cargo-home:/opt/rustwide/cargo-home:ro,Z" "-v" "/var/lib/crater-agent-workspace/rustup-home:/opt/rustwide/rustup-home:ro,Z" "-e" "SOURCE_DIR=/opt/rustwide/workdir" "-e" "CARGO_TARGET_DIR=/opt/rustwide/target" "-e" "CARGO_INCREMENTAL=0" "-e" "RUST_BACKTRACE=full" "-e" "RUSTFLAGS=--cap-lints=warn" "-e" "RUSTDOCFLAGS=--cap-lints=warn" "-e" "CARGO_HOME=/opt/rustwide/cargo-home" "-e" "RUSTUP_HOME=/opt/rustwide/rustup-home" "-w" "/opt/rustwide/workdir" "-m" "1610612736" "--user" "0:0" "--network" "none" "ghcr.io/rust-lang/crates-build-env/linux@sha256:eea15d5475069c3ef791f10c2c6b5af2ee421ef2da1e680ecce1cba46243983b" "/opt/rustwide/cargo-home/bin/cargo" "+44f415c1d617ebc7b931a243b7b321ef8a6ca47c" "test" "--frozen" "--no-run" "--message-format=json", kill_on_drop: false }`
[INFO] [stdout] c0a4e12cb2b4a7f24f0388b3738a3329a450adcc29d0f41074eb6a019ab0d243
[INFO] running `Command { std: "docker" "start" "-a" "c0a4e12cb2b4a7f24f0388b3738a3329a450adcc29d0f41074eb6a019ab0d243", kill_on_drop: false }`
[INFO] [stdout] warning: `...` range patterns are deprecated
[INFO] [stdout] --> src/tokenizer.rs:134:20
[INFO] [stdout] |
[INFO] [stdout] 134 | '0'...'9' => v.push(self.number_token()?),
[INFO] [stdout] | ^^^ help: use `..=` for an inclusive range
[INFO] [stdout] |
[INFO] [stdout] = warning: this is accepted in the current edition (Rust 2018) but is a hard error in Rust 2021!
[INFO] [stdout] = note: for more information, see
[INFO] [stdout] = note: `#[warn(ellipsis_inclusive_range_patterns)]` on by default
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stdout] warning: field `0` is never read
[INFO] [stdout] --> src/error.rs:7:6
[INFO] [stdout] |
[INFO] [stdout] 7 | IO(std::io::Error),
[INFO] [stdout] | -- ^^^^^^^^^^^^^^
[INFO] [stdout] | |
[INFO] [stdout] | field in this variant
[INFO] [stdout] |
[INFO] [stdout] = note: `Error` has a derived impl for the trait `Debug`, but this is intentionally ignored during dead code analysis
[INFO] [stdout] = note: `#[warn(dead_code)]` on by default
[INFO] [stdout] help: consider changing the field to be of unit type to suppress this warning while preserving the field numbering, or remove the field
[INFO] [stdout] |
[INFO] [stdout] 7 - IO(std::io::Error),
[INFO] [stdout] 7 + IO(()),
[INFO] [stdout] |
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stdout] warning: variants `Tokenize`, `Parse`, and `Other` are never constructed
[INFO] [stdout] --> src/error.rs:5:3
[INFO] [stdout] |
[INFO] [stdout] 4 | pub enum Error {
[INFO] [stdout] | ----- variants in this enum
[INFO] [stdout] 5 | Tokenize(String),
[INFO] [stdout] | ^^^^^^^^
[INFO] [stdout] 6 | Parse(String),
[INFO] [stdout] | ^^^^^
[INFO] [stdout] 7 | IO(std::io::Error),
[INFO] [stdout] 8 | Other(String),
[INFO] [stdout] | ^^^^^
[INFO] [stdout] |
[INFO] [stdout] = note: `Error` has a derived impl for the trait `Debug`, but this is intentionally ignored during dead code analysis
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stdout] warning: type alias `Result` is never used
[INFO] [stdout] --> src/result.rs:3:10
[INFO] [stdout] |
[INFO] [stdout] 3 | pub type Result = std::result::Result;
[INFO] [stdout] | ^^^^^^
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stdout] warning: enum `Token` is never used
[INFO] [stdout] --> src/tokenizer.rs:11:10
[INFO] [stdout] |
[INFO] [stdout] 11 | pub enum Token {
[INFO] [stdout] | ^^^^^
[INFO] [stdout] |
[INFO] [stdout] = note: `Token` has a derived impl for the trait `Debug`, but this is intentionally ignored during dead code analysis
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stdout] warning: field `char_stream` is never read
[INFO] [stdout] --> src/tokenizer.rs:25:5
[INFO] [stdout] |
[INFO] [stdout] 24 | pub struct Tokenizer {
[INFO] [stdout] | --------- field in this struct
[INFO] [stdout] 25 | char_stream: PeekableIter,
[INFO] [stdout] | ^^^^^^^^^^^
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stderr] Compiling json-tokenizer v1.0.1 (/opt/rustwide/workdir)
[INFO] [stdout] warning: multiple methods are never used
[INFO] [stdout] --> src/tokenizer.rs:35:8
[INFO] [stdout] |
[INFO] [stdout] 28 | impl Tokenizer {
[INFO] [stdout] | -------------- methods in this implementation
[INFO] [stdout] ...
[INFO] [stdout] 35 | fn take_until(&mut self, predicate: fn(char) -> bool) -> Result> {
[INFO] [stdout] | ^^^^^^^^^^
[INFO] [stdout] ...
[INFO] [stdout] 50 | fn take_while(&mut self, predicate: fn(char) -> bool) -> Result> {
[INFO] [stdout] | ^^^^^^^^^^
[INFO] [stdout] ...
[INFO] [stdout] 63 | fn skip(&mut self, ch: char) -> Result<()> {
[INFO] [stdout] | ^^^^
[INFO] [stdout] ...
[INFO] [stdout] 70 | fn string_token(&mut self) -> Result {
[INFO] [stdout] | ^^^^^^^^^^^^
[INFO] [stdout] ...
[INFO] [stdout] 76 | fn number_token(&mut self) -> Result {
[INFO] [stdout] | ^^^^^^^^^^^^
[INFO] [stdout] ...
[INFO] [stdout] 85 | fn keyword_token(&mut self) -> Result {
[INFO] [stdout] | ^^^^^^^^^^^^^
[INFO] [stdout] ...
[INFO] [stdout] 102 | pub fn tokenize(&mut self) -> Result> {
[INFO] [stdout] | ^^^^^^^^
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stdout] warning: `...` range patterns are deprecated
[INFO] [stdout] --> src/tokenizer.rs:134:20
[INFO] [stdout] |
[INFO] [stdout] 134 | '0'...'9' => v.push(self.number_token()?),
[INFO] [stdout] | ^^^ help: use `..=` for an inclusive range
[INFO] [stdout] |
[INFO] [stdout] = warning: this is accepted in the current edition (Rust 2018) but is a hard error in Rust 2021!
[INFO] [stdout] = note: for more information, see
[INFO] [stdout] = note: `#[warn(ellipsis_inclusive_range_patterns)]` on by default
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stdout] warning: field `0` is never read
[INFO] [stdout] --> src/error.rs:5:12
[INFO] [stdout] |
[INFO] [stdout] 5 | Tokenize(String),
[INFO] [stdout] | -------- ^^^^^^
[INFO] [stdout] | |
[INFO] [stdout] | field in this variant
[INFO] [stdout] |
[INFO] [stdout] = note: `Error` has a derived impl for the trait `Debug`, but this is intentionally ignored during dead code analysis
[INFO] [stdout] = note: `#[warn(dead_code)]` on by default
[INFO] [stdout] help: consider changing the field to be of unit type to suppress this warning while preserving the field numbering, or remove the field
[INFO] [stdout] |
[INFO] [stdout] 5 - Tokenize(String),
[INFO] [stdout] 5 + Tokenize(()),
[INFO] [stdout] |
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stdout] warning: field `0` is never read
[INFO] [stdout] --> src/error.rs:7:6
[INFO] [stdout] |
[INFO] [stdout] 7 | IO(std::io::Error),
[INFO] [stdout] | -- ^^^^^^^^^^^^^^
[INFO] [stdout] | |
[INFO] [stdout] | field in this variant
[INFO] [stdout] |
[INFO] [stdout] = note: `Error` has a derived impl for the trait `Debug`, but this is intentionally ignored during dead code analysis
[INFO] [stdout] help: consider changing the field to be of unit type to suppress this warning while preserving the field numbering, or remove the field
[INFO] [stdout] |
[INFO] [stdout] 7 - IO(std::io::Error),
[INFO] [stdout] 7 + IO(()),
[INFO] [stdout] |
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stdout] warning: variants `Parse` and `Other` are never constructed
[INFO] [stdout] --> src/error.rs:6:3
[INFO] [stdout] |
[INFO] [stdout] 4 | pub enum Error {
[INFO] [stdout] | ----- variants in this enum
[INFO] [stdout] 5 | Tokenize(String),
[INFO] [stdout] 6 | Parse(String),
[INFO] [stdout] | ^^^^^
[INFO] [stdout] 7 | IO(std::io::Error),
[INFO] [stdout] 8 | Other(String),
[INFO] [stdout] | ^^^^^
[INFO] [stdout] |
[INFO] [stdout] = note: `Error` has a derived impl for the trait `Debug`, but this is intentionally ignored during dead code analysis
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stderr] Finished `test` profile [unoptimized + debuginfo] target(s) in 0.82s
[INFO] running `Command { std: "docker" "inspect" "c0a4e12cb2b4a7f24f0388b3738a3329a450adcc29d0f41074eb6a019ab0d243", kill_on_drop: false }`
[INFO] running `Command { std: "docker" "rm" "-f" "c0a4e12cb2b4a7f24f0388b3738a3329a450adcc29d0f41074eb6a019ab0d243", kill_on_drop: false }`
[INFO] [stdout] c0a4e12cb2b4a7f24f0388b3738a3329a450adcc29d0f41074eb6a019ab0d243
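Note on the recurring `ellipsis_inclusive_range_patterns` warning above: the change the compiler suggests is mechanical, replacing the deprecated `...` range pattern with `..=`. The sketch below is illustrative only; apart from the match arm quoted at src/tokenizer.rs:134, the surrounding code (the `classify` helper and its body) is assumed for demonstration and is not taken from the crate.

    // Deprecated form (warns on Rust 2018, hard error on Rust 2021):
    //     '0'...'9' => v.push(self.number_token()?),
    // Suggested form, using the inclusive range pattern syntax:
    //     '0'..='9' => v.push(self.number_token()?),

    // Standalone illustration of the same pattern change:
    fn classify(c: char) -> &'static str {
        match c {
            '0'..='9' => "digit", // `..=` replaces the old `...` inclusive range
            _ => "other",
        }
    }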