[INFO] cloning repository https://github.com/johnny-human/nlp-tokenize
[INFO] running `Command { std: "git" "-c" "credential.helper=" "-c" "credential.helper=/workspace/cargo-home/bin/git-credential-null" "clone" "--bare" "https://github.com/johnny-human/nlp-tokenize" "/workspace/cache/git-repos/https%3A%2F%2Fgithub.com%2Fjohnny-human%2Fnlp-tokenize", kill_on_drop: false }`
[INFO] [stderr] Cloning into bare repository '/workspace/cache/git-repos/https%3A%2F%2Fgithub.com%2Fjohnny-human%2Fnlp-tokenize'...
[INFO] running `Command { std: "git" "rev-parse" "HEAD", kill_on_drop: false }`
[INFO] [stdout] fa68c7ce430a43fd9b9f7983f97a58ec4c195caf
[INFO] checking johnny-human/nlp-tokenize against try#8e3afc79c11f48cb3acd1be5b3b7de98fe3f93a8 for pr-82565
[INFO] running `Command { std: "git" "clone" "/workspace/cache/git-repos/https%3A%2F%2Fgithub.com%2Fjohnny-human%2Fnlp-tokenize" "/workspace/builds/worker-13/source", kill_on_drop: false }`
[INFO] [stderr] Cloning into '/workspace/builds/worker-13/source'...
[INFO] [stderr] done.
[INFO] validating manifest of git repo https://github.com/johnny-human/nlp-tokenize on toolchain 8e3afc79c11f48cb3acd1be5b3b7de98fe3f93a8
[INFO] running `Command { std: "/workspace/cargo-home/bin/cargo" "+8e3afc79c11f48cb3acd1be5b3b7de98fe3f93a8" "metadata" "--manifest-path" "Cargo.toml" "--no-deps", kill_on_drop: false }`
[INFO] started tweaking git repo https://github.com/johnny-human/nlp-tokenize
[INFO] finished tweaking git repo https://github.com/johnny-human/nlp-tokenize
[INFO] tweaked toml for git repo https://github.com/johnny-human/nlp-tokenize written to /workspace/builds/worker-13/source/Cargo.toml
[INFO] crate git repo https://github.com/johnny-human/nlp-tokenize already has a lockfile, it will not be regenerated
[INFO] running `Command { std: "/workspace/cargo-home/bin/cargo" "+8e3afc79c11f48cb3acd1be5b3b7de98fe3f93a8" "fetch" "--locked" "--manifest-path" "Cargo.toml", kill_on_drop: false }`
[INFO] [stderr] Downloading crates ...
[INFO] [stderr] Downloaded bitflags v0.9.0
[INFO] [stderr] Downloaded filebuffer v0.1.1
[INFO] [stderr] Downloaded clap v2.24.1
[INFO] [stderr] Downloaded fst v0.1.38
[INFO] running `Command { std: "docker" "create" "-v" "/var/lib/crater-agent-workspace/builds/worker-13/target:/opt/rustwide/target:rw,Z" "-v" "/var/lib/crater-agent-workspace/builds/worker-13/source:/opt/rustwide/workdir:ro,Z" "-v" "/var/lib/crater-agent-workspace/cargo-home:/opt/rustwide/cargo-home:ro,Z" "-v" "/var/lib/crater-agent-workspace/rustup-home:/opt/rustwide/rustup-home:ro,Z" "-e" "SOURCE_DIR=/opt/rustwide/workdir" "-e" "CARGO_TARGET_DIR=/opt/rustwide/target" "-e" "CARGO_HOME=/opt/rustwide/cargo-home" "-e" "RUSTUP_HOME=/opt/rustwide/rustup-home" "-w" "/opt/rustwide/workdir" "-m" "1610612736" "--user" "0:0" "--network" "none" "rustops/crates-build-env@sha256:f2f6bcd4b43ebee4e173f653a26493129bdb64017c85f916b780ca7fbdbaa79d" "/opt/rustwide/cargo-home/bin/cargo" "+8e3afc79c11f48cb3acd1be5b3b7de98fe3f93a8" "metadata" "--no-deps" "--format-version=1", kill_on_drop: false }`
[INFO] [stdout] 579273c9250342c1bb36c5e8f86532572ea9ebeb4d03314a209db67a33f4a313
[INFO] [stderr] WARNING: Your kernel does not support swap limit capabilities or the cgroup is not mounted. Memory limited without swap.
[INFO] running `Command { std: "docker" "start" "-a" "579273c9250342c1bb36c5e8f86532572ea9ebeb4d03314a209db67a33f4a313", kill_on_drop: false }` [INFO] running `Command { std: "docker" "inspect" "579273c9250342c1bb36c5e8f86532572ea9ebeb4d03314a209db67a33f4a313", kill_on_drop: false }` [INFO] running `Command { std: "docker" "rm" "-f" "579273c9250342c1bb36c5e8f86532572ea9ebeb4d03314a209db67a33f4a313", kill_on_drop: false }` [INFO] [stdout] 579273c9250342c1bb36c5e8f86532572ea9ebeb4d03314a209db67a33f4a313 [INFO] running `Command { std: "docker" "create" "-v" "/var/lib/crater-agent-workspace/builds/worker-13/target:/opt/rustwide/target:rw,Z" "-v" "/var/lib/crater-agent-workspace/builds/worker-13/source:/opt/rustwide/workdir:ro,Z" "-v" "/var/lib/crater-agent-workspace/cargo-home:/opt/rustwide/cargo-home:ro,Z" "-v" "/var/lib/crater-agent-workspace/rustup-home:/opt/rustwide/rustup-home:ro,Z" "-e" "SOURCE_DIR=/opt/rustwide/workdir" "-e" "CARGO_TARGET_DIR=/opt/rustwide/target" "-e" "CARGO_INCREMENTAL=0" "-e" "RUST_BACKTRACE=full" "-e" "RUSTFLAGS=--cap-lints=forbid" "-e" "CARGO_HOME=/opt/rustwide/cargo-home" "-e" "RUSTUP_HOME=/opt/rustwide/rustup-home" "-w" "/opt/rustwide/workdir" "-m" "1610612736" "--user" "0:0" "--network" "none" "rustops/crates-build-env@sha256:f2f6bcd4b43ebee4e173f653a26493129bdb64017c85f916b780ca7fbdbaa79d" "/opt/rustwide/cargo-home/bin/cargo" "+8e3afc79c11f48cb3acd1be5b3b7de98fe3f93a8" "check" "--frozen" "--all" "--all-targets" "--message-format=json", kill_on_drop: false }` [INFO] [stderr] WARNING: Your kernel does not support swap limit capabilities or the cgroup is not mounted. Memory limited without swap. [INFO] [stdout] 41f5de76c3d3037751cb1f42325c10ac29be131b4308b0b9efcd81c19f576f17 [INFO] running `Command { std: "docker" "start" "-a" "41f5de76c3d3037751cb1f42325c10ac29be131b4308b0b9efcd81c19f576f17", kill_on_drop: false }` [INFO] [stderr] Checking libc v0.2.22 [INFO] [stderr] Compiling winapi-build v0.1.1 [INFO] [stderr] Checking encoding_index_tests v0.1.4 [INFO] [stderr] Checking utf8-ranges v0.1.3 [INFO] [stderr] Checking unicode-segmentation v1.1.0 [INFO] [stderr] Checking strsim v0.6.0 [INFO] [stderr] Checking vec_map v0.7.0 [INFO] [stderr] Checking bitflags v0.8.2 [INFO] [stderr] Checking byteorder v0.5.3 [INFO] [stderr] Checking unicode-width v0.1.4 [INFO] [stderr] Checking ansi_term v0.9.0 [INFO] [stderr] Checking bitflags v0.9.0 [INFO] [stderr] Checking serde v1.0.8 [INFO] [stderr] Checking encoding-index-japanese v1.20141219.5 [INFO] [stderr] Checking encoding-index-simpchinese v1.20141219.5 [INFO] [stderr] Checking encoding-index-singlebyte v1.20141219.5 [INFO] [stderr] Checking encoding-index-tradchinese v1.20141219.5 [INFO] [stderr] Checking encoding-index-korean v1.20141219.5 [INFO] [stderr] Compiling kernel32-sys v0.2.2 [INFO] [stderr] Checking fs2 v0.2.5 [INFO] [stderr] Checking term_size v0.3.0 [INFO] [stderr] Checking atty v0.2.2 [INFO] [stderr] Checking time v0.1.37 [INFO] [stderr] Checking filebuffer v0.1.1 [INFO] [stderr] Checking encoding v0.2.33 [INFO] [stderr] Checking memmap v0.4.0 [INFO] [stderr] Checking clap v2.24.1 [INFO] [stderr] Checking fst v0.1.38 [INFO] [stderr] Checking nlp v0.1.0 (/opt/rustwide/workdir) [INFO] [stdout] warning: unused import: `std::borrow::Cow` [INFO] [stdout] --> src/tokenizer_loop.rs:2:5 [INFO] [stdout] | [INFO] [stdout] 2 | use std::borrow::Cow; [INFO] [stdout] | ^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(unused_imports)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: 
unused imports: `DecoderTrap`, `Encoding` [INFO] [stdout] --> src/tokenizer_loop.rs:3:16 [INFO] [stdout] | [INFO] [stdout] 3 | use encoding::{Encoding, DecoderTrap}; [INFO] [stdout] | ^^^^^^^^ ^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `encoding::all::UTF_8` [INFO] [stdout] --> src/tokenizer_loop.rs:4:5 [INFO] [stdout] | [INFO] [stdout] 4 | use encoding::all::UTF_8; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: use of deprecated macro `try`: use the `?` operator instead [INFO] [stdout] --> src/tokenizer_loop.rs:14:1 [INFO] [stdout] | [INFO] [stdout] 14 | / bitflags! { [INFO] [stdout] 15 | | #[derive(Default)] [INFO] [stdout] 16 | | pub struct Flags: u64 { [INFO] [stdout] 17 | | const CONTROL = 0b0000000000000001; [INFO] [stdout] ... | [INFO] [stdout] 27 | | } [INFO] [stdout] 28 | | } [INFO] [stdout] | |_^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(deprecated)]` on by default [INFO] [stdout] = note: this warning originates in a macro (in Nightly builds, run with -Z macro-backtrace for more info) [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: use of deprecated macro `try`: use the `?` operator instead [INFO] [stdout] --> src/tokenizer_loop.rs:14:1 [INFO] [stdout] | [INFO] [stdout] 14 | / bitflags! { [INFO] [stdout] 15 | | #[derive(Default)] [INFO] [stdout] 16 | | pub struct Flags: u64 { [INFO] [stdout] 17 | | const CONTROL = 0b0000000000000001; [INFO] [stdout] ... | [INFO] [stdout] 27 | | } [INFO] [stdout] 28 | | } [INFO] [stdout] | |_^ [INFO] [stdout] | [INFO] [stdout] = note: this warning originates in a macro (in Nightly builds, run with -Z macro-backtrace for more info) [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: use of deprecated macro `try`: use the `?` operator instead [INFO] [stdout] --> src/tokenizer_loop.rs:14:1 [INFO] [stdout] | [INFO] [stdout] 14 | / bitflags! { [INFO] [stdout] 15 | | #[derive(Default)] [INFO] [stdout] 16 | | pub struct Flags: u64 { [INFO] [stdout] 17 | | const CONTROL = 0b0000000000000001; [INFO] [stdout] ... | [INFO] [stdout] 27 | | } [INFO] [stdout] 28 | | } [INFO] [stdout] | |_^ [INFO] [stdout] | [INFO] [stdout] = note: this warning originates in a macro (in Nightly builds, run with -Z macro-backtrace for more info) [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: trait objects without an explicit `dyn` are deprecated [INFO] [stdout] --> src/tokenizer_loop.rs:36:34 [INFO] [stdout] | [INFO] [stdout] 36 | pub fn words(bytes: Vec, f: &Fn(&Token) -> i32) -> Result, &'static Vec> { [INFO] [stdout] | ^^^^^^^^^^^^^^^^^ help: use `dyn`: `dyn Fn(&Token) -> i32` [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(bare_trait_objects)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: `...` range patterns are deprecated [INFO] [stdout] --> src/tokenizer_loop.rs:69:22 [INFO] [stdout] | [INFO] [stdout] 69 | 65 ... 90 | 97 ... 122 => { [INFO] [stdout] | ^^^ help: use `..=` for an inclusive range [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(ellipsis_inclusive_range_patterns)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: `...` range patterns are deprecated [INFO] [stdout] --> src/tokenizer_loop.rs:69:34 [INFO] [stdout] | [INFO] [stdout] 69 | 65 ... 90 | 97 ... 
122 => { [INFO] [stdout] | ^^^ help: use `..=` for an inclusive range [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: `...` range patterns are deprecated [INFO] [stdout] --> src/tokenizer_loop.rs:73:22 [INFO] [stdout] | [INFO] [stdout] 73 | 33 ... 47 | 58 ... 64 | 91 ... 96 | 123 ... 126 | 160 ... 191 => { [INFO] [stdout] | ^^^ help: use `..=` for an inclusive range [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: `...` range patterns are deprecated [INFO] [stdout] --> src/tokenizer_loop.rs:73:34 [INFO] [stdout] | [INFO] [stdout] 73 | 33 ... 47 | 58 ... 64 | 91 ... 96 | 123 ... 126 | 160 ... 191 => { [INFO] [stdout] | ^^^ help: use `..=` for an inclusive range [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: `...` range patterns are deprecated [INFO] [stdout] --> src/tokenizer_loop.rs:73:46 [INFO] [stdout] | [INFO] [stdout] 73 | 33 ... 47 | 58 ... 64 | 91 ... 96 | 123 ... 126 | 160 ... 191 => { [INFO] [stdout] | ^^^ help: use `..=` for an inclusive range [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: `...` range patterns are deprecated [INFO] [stdout] --> src/tokenizer_loop.rs:73:59 [INFO] [stdout] | [INFO] [stdout] 73 | 33 ... 47 | 58 ... 64 | 91 ... 96 | 123 ... 126 | 160 ... 191 => { [INFO] [stdout] | ^^^ help: use `..=` for an inclusive range [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: `...` range patterns are deprecated [INFO] [stdout] --> src/tokenizer_loop.rs:73:73 [INFO] [stdout] | [INFO] [stdout] 73 | 33 ... 47 | 58 ... 64 | 91 ... 96 | 123 ... 126 | 160 ... 191 => { [INFO] [stdout] | ^^^ help: use `..=` for an inclusive range [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: `...` range patterns are deprecated [INFO] [stdout] --> src/tokenizer_loop.rs:77:22 [INFO] [stdout] | [INFO] [stdout] 77 | 48 ... 57 => { [INFO] [stdout] | ^^^ help: use `..=` for an inclusive range [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: `...` range patterns are deprecated [INFO] [stdout] --> src/tokenizer_loop.rs:81:23 [INFO] [stdout] | [INFO] [stdout] 81 | 127 ... 159 | 1 ... 31 => { [INFO] [stdout] | ^^^ help: use `..=` for an inclusive range [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: `...` range patterns are deprecated [INFO] [stdout] --> src/tokenizer_loop.rs:81:35 [INFO] [stdout] | [INFO] [stdout] 81 | 127 ... 159 | 1 ... 31 => { [INFO] [stdout] | ^^^ help: use `..=` for an inclusive range [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: `...` range patterns are deprecated [INFO] [stdout] --> src/tokenizer_loop.rs:85:23 [INFO] [stdout] | [INFO] [stdout] 85 | 192 ... 
255 => { [INFO] [stdout] | ^^^ help: use `..=` for an inclusive range [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `bitflags::*` [INFO] [stdout] --> src/numbers.rs:1:5 [INFO] [stdout] | [INFO] [stdout] 1 | use bitflags::*; [INFO] [stdout] | ^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `std::borrow::Cow` [INFO] [stdout] --> src/numbers.rs:2:5 [INFO] [stdout] | [INFO] [stdout] 2 | use std::borrow::Cow; [INFO] [stdout] | ^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused imports: `DecoderTrap`, `Encoding` [INFO] [stdout] --> src/numbers.rs:3:16 [INFO] [stdout] | [INFO] [stdout] 3 | use encoding::{Encoding, DecoderTrap}; [INFO] [stdout] | ^^^^^^^^ ^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `encoding::all::UTF_8` [INFO] [stdout] --> src/numbers.rs:4:5 [INFO] [stdout] | [INFO] [stdout] 4 | use encoding::all::UTF_8; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `Duration` [INFO] [stdout] --> src/main.rs:14:25 [INFO] [stdout] | [INFO] [stdout] 14 | use time::{PreciseTime, Duration}; [INFO] [stdout] | ^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `tokenizer_peek::*` [INFO] [stdout] --> src/main.rs:16:5 [INFO] [stdout] | [INFO] [stdout] 16 | use tokenizer_peek::*; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `std::borrow::Cow` [INFO] [stdout] --> src/tokenizer_loop.rs:2:5 [INFO] [stdout] | [INFO] [stdout] 2 | use std::borrow::Cow; [INFO] [stdout] | ^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(unused_imports)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused imports: `DecoderTrap`, `Encoding` [INFO] [stdout] --> src/tokenizer_loop.rs:3:16 [INFO] [stdout] | [INFO] [stdout] 3 | use encoding::{Encoding, DecoderTrap}; [INFO] [stdout] | ^^^^^^^^ ^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `encoding::all::UTF_8` [INFO] [stdout] --> src/tokenizer_loop.rs:4:5 [INFO] [stdout] | [INFO] [stdout] 4 | use encoding::all::UTF_8; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: use of deprecated macro `try`: use the `?` operator instead [INFO] [stdout] --> src/tokenizer_loop.rs:14:1 [INFO] [stdout] | [INFO] [stdout] 14 | / bitflags! { [INFO] [stdout] 15 | | #[derive(Default)] [INFO] [stdout] 16 | | pub struct Flags: u64 { [INFO] [stdout] 17 | | const CONTROL = 0b0000000000000001; [INFO] [stdout] ... | [INFO] [stdout] 27 | | } [INFO] [stdout] 28 | | } [INFO] [stdout] | |_^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(deprecated)]` on by default [INFO] [stdout] = note: this warning originates in a macro (in Nightly builds, run with -Z macro-backtrace for more info) [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: use of deprecated macro `try`: use the `?` operator instead [INFO] [stdout] --> src/tokenizer_loop.rs:14:1 [INFO] [stdout] | [INFO] [stdout] 14 | / bitflags! { [INFO] [stdout] 15 | | #[derive(Default)] [INFO] [stdout] 16 | | pub struct Flags: u64 { [INFO] [stdout] 17 | | const CONTROL = 0b0000000000000001; [INFO] [stdout] ... 
| [INFO] [stdout] 27 | | } [INFO] [stdout] 28 | | } [INFO] [stdout] | |_^ [INFO] [stdout] | [INFO] [stdout] = note: this warning originates in a macro (in Nightly builds, run with -Z macro-backtrace for more info) [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: use of deprecated macro `try`: use the `?` operator instead [INFO] [stdout] --> src/tokenizer_loop.rs:14:1 [INFO] [stdout] | [INFO] [stdout] 14 | / bitflags! { [INFO] [stdout] 15 | | #[derive(Default)] [INFO] [stdout] 16 | | pub struct Flags: u64 { [INFO] [stdout] 17 | | const CONTROL = 0b0000000000000001; [INFO] [stdout] ... | [INFO] [stdout] 27 | | } [INFO] [stdout] 28 | | } [INFO] [stdout] | |_^ [INFO] [stdout] | [INFO] [stdout] = note: this warning originates in a macro (in Nightly builds, run with -Z macro-backtrace for more info) [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: trait objects without an explicit `dyn` are deprecated [INFO] [stdout] --> src/tokenizer_loop.rs:36:34 [INFO] [stdout] | [INFO] [stdout] 36 | pub fn words(bytes: Vec, f: &Fn(&Token) -> i32) -> Result, &'static Vec> { [INFO] [stdout] | ^^^^^^^^^^^^^^^^^ help: use `dyn`: `dyn Fn(&Token) -> i32` [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(bare_trait_objects)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: `...` range patterns are deprecated [INFO] [stdout] --> src/tokenizer_loop.rs:69:22 [INFO] [stdout] | [INFO] [stdout] 69 | 65 ... 90 | 97 ... 122 => { [INFO] [stdout] | ^^^ help: use `..=` for an inclusive range [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(ellipsis_inclusive_range_patterns)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: `...` range patterns are deprecated [INFO] [stdout] --> src/tokenizer_loop.rs:69:34 [INFO] [stdout] | [INFO] [stdout] 69 | 65 ... 90 | 97 ... 122 => { [INFO] [stdout] | ^^^ help: use `..=` for an inclusive range [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: `...` range patterns are deprecated [INFO] [stdout] --> src/tokenizer_loop.rs:73:22 [INFO] [stdout] | [INFO] [stdout] 73 | 33 ... 47 | 58 ... 64 | 91 ... 96 | 123 ... 126 | 160 ... 191 => { [INFO] [stdout] | ^^^ help: use `..=` for an inclusive range [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: `...` range patterns are deprecated [INFO] [stdout] --> src/tokenizer_loop.rs:73:34 [INFO] [stdout] | [INFO] [stdout] 73 | 33 ... 47 | 58 ... 64 | 91 ... 96 | 123 ... 126 | 160 ... 191 => { [INFO] [stdout] | ^^^ help: use `..=` for an inclusive range [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: `...` range patterns are deprecated [INFO] [stdout] --> src/tokenizer_loop.rs:73:46 [INFO] [stdout] | [INFO] [stdout] 73 | 33 ... 47 | 58 ... 64 | 91 ... 96 | 123 ... 126 | 160 ... 191 => { [INFO] [stdout] | ^^^ help: use `..=` for an inclusive range [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: `...` range patterns are deprecated [INFO] [stdout] --> src/tokenizer_loop.rs:73:59 [INFO] [stdout] | [INFO] [stdout] 73 | 33 ... 47 | 58 ... 64 | 91 ... 96 | 123 ... 126 | 160 ... 191 => { [INFO] [stdout] | ^^^ help: use `..=` for an inclusive range [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: `...` range patterns are deprecated [INFO] [stdout] --> src/tokenizer_loop.rs:73:73 [INFO] [stdout] | [INFO] [stdout] 73 | 33 ... 47 | 58 ... 64 | 91 ... 96 | 123 ... 126 | 160 ... 
191 => { [INFO] [stdout] | ^^^ help: use `..=` for an inclusive range [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: `...` range patterns are deprecated [INFO] [stdout] --> src/tokenizer_loop.rs:77:22 [INFO] [stdout] | [INFO] [stdout] 77 | 48 ... 57 => { [INFO] [stdout] | ^^^ help: use `..=` for an inclusive range [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: `...` range patterns are deprecated [INFO] [stdout] --> src/tokenizer_loop.rs:81:23 [INFO] [stdout] | [INFO] [stdout] 81 | 127 ... 159 | 1 ... 31 => { [INFO] [stdout] | ^^^ help: use `..=` for an inclusive range [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: `...` range patterns are deprecated [INFO] [stdout] --> src/tokenizer_loop.rs:81:35 [INFO] [stdout] | [INFO] [stdout] 81 | 127 ... 159 | 1 ... 31 => { [INFO] [stdout] | ^^^ help: use `..=` for an inclusive range [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: `...` range patterns are deprecated [INFO] [stdout] --> src/tokenizer_loop.rs:85:23 [INFO] [stdout] | [INFO] [stdout] 85 | 192 ... 255 => { [INFO] [stdout] | ^^^ help: use `..=` for an inclusive range [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `bitflags::*` [INFO] [stdout] --> src/numbers.rs:1:5 [INFO] [stdout] | [INFO] [stdout] 1 | use bitflags::*; [INFO] [stdout] | ^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `std::borrow::Cow` [INFO] [stdout] --> src/numbers.rs:2:5 [INFO] [stdout] | [INFO] [stdout] 2 | use std::borrow::Cow; [INFO] [stdout] | ^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused imports: `DecoderTrap`, `Encoding` [INFO] [stdout] --> src/numbers.rs:3:16 [INFO] [stdout] | [INFO] [stdout] 3 | use encoding::{Encoding, DecoderTrap}; [INFO] [stdout] | ^^^^^^^^ ^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `encoding::all::UTF_8` [INFO] [stdout] --> src/numbers.rs:4:5 [INFO] [stdout] | [INFO] [stdout] 4 | use encoding::all::UTF_8; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `Duration` [INFO] [stdout] --> src/main.rs:14:25 [INFO] [stdout] | [INFO] [stdout] 14 | use time::{PreciseTime, Duration}; [INFO] [stdout] | ^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `tokenizer_peek::*` [INFO] [stdout] --> src/main.rs:16:5 [INFO] [stdout] | [INFO] [stdout] 16 | use tokenizer_peek::*; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `token` [INFO] [stdout] --> src/tokenizer_loop.rs:97:17 [INFO] [stdout] | [INFO] [stdout] 97 | let token = Token { [INFO] [stdout] | ^^^^^ help: if this is intentional, prefix it with an underscore: `_token` [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(unused_variables)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `length` [INFO] [stdout] --> src/tokenizer_peek.rs:75:11 [INFO] [stdout] | [INFO] [stdout] 75 | let length = c.len(); [INFO] [stdout] | ^^^^^^ help: if this is intentional, prefix it with an underscore: `_length` [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `i` [INFO] [stdout] --> src/tokenizer_peek.rs:77:12 [INFO] [stdout] | [INFO] [stdout] 77 | for (i, byte) in c.iter().enumerate() { [INFO] [stdout] | ^ help: if this is intentional, prefix it with an underscore: `_i` [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: value assigned to `pos` is never read [INFO] [stdout] 
--> src/numbers.rs:11:11 [INFO] [stdout] | [INFO] [stdout] 11 | let mut pos: usize = 0; [INFO] [stdout] | ^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(unused_assignments)]` on by default [INFO] [stdout] = help: maybe it is overwritten before being read? [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `token` [INFO] [stdout] --> src/tokenizer_loop.rs:97:17 [INFO] [stdout] | [INFO] [stdout] 97 | let token = Token { [INFO] [stdout] | ^^^^^ help: if this is intentional, prefix it with an underscore: `_token` [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(unused_variables)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `length` [INFO] [stdout] --> src/tokenizer_peek.rs:75:11 [INFO] [stdout] | [INFO] [stdout] 75 | let length = c.len(); [INFO] [stdout] | ^^^^^^ help: if this is intentional, prefix it with an underscore: `_length` [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `i` [INFO] [stdout] --> src/tokenizer_peek.rs:77:12 [INFO] [stdout] | [INFO] [stdout] 77 | for (i, byte) in c.iter().enumerate() { [INFO] [stdout] | ^ help: if this is intentional, prefix it with an underscore: `_i` [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: value assigned to `pos` is never read [INFO] [stdout] --> src/numbers.rs:11:11 [INFO] [stdout] | [INFO] [stdout] 11 | let mut pos: usize = 0; [INFO] [stdout] | ^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(unused_assignments)]` on by default [INFO] [stdout] = help: maybe it is overwritten before being read? [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: variant is never constructed: `Alpha` [INFO] [stdout] --> src/tokenizer_peek.rs:16:7 [INFO] [stdout] | [INFO] [stdout] 16 | Alpha, // Pure alpha string [INFO] [stdout] | ^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(dead_code)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: variant is never constructed: `Other` [INFO] [stdout] --> src/tokenizer_peek.rs:17:7 [INFO] [stdout] | [INFO] [stdout] 17 | Other, // Unidentified elements [INFO] [stdout] | ^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: variant is never constructed: `Numeric` [INFO] [stdout] --> src/tokenizer_peek.rs:21:7 [INFO] [stdout] | [INFO] [stdout] 21 | Numeric, // Pure numeric strings 123 [INFO] [stdout] | ^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: variant is never constructed: `NonAlphaNum` [INFO] [stdout] --> src/tokenizer_peek.rs:22:7 [INFO] [stdout] | [INFO] [stdout] 22 | NonAlphaNum // Strings with only | ¦ § _ ~ ^ [INFO] [stdout] | ^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: function `U80_FF` should have a snake case name [INFO] [stdout] --> src/tokenizer_loop.rs:141:8 [INFO] [stdout] | [INFO] [stdout] 141 | pub fn U80_FF(byte: u8) -> [u8; 2] { [INFO] [stdout] | ^^^^^^ help: convert the identifier to snake case (notice the capitalization): `u80_ff` [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(non_snake_case)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: 33 warnings emitted [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: variant is never constructed: `Alpha` [INFO] [stdout] --> src/tokenizer_peek.rs:16:7 [INFO] [stdout] | [INFO] [stdout] 16 | Alpha, // Pure alpha string [INFO] [stdout] | ^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(dead_code)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: variant is never constructed: `Other` [INFO] 
[stdout] --> src/tokenizer_peek.rs:17:7 [INFO] [stdout] | [INFO] [stdout] 17 | Other, // Unidentified elements [INFO] [stdout] | ^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: variant is never constructed: `Numeric` [INFO] [stdout] --> src/tokenizer_peek.rs:21:7 [INFO] [stdout] | [INFO] [stdout] 21 | Numeric, // Pure numeric strings 123 [INFO] [stdout] | ^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: variant is never constructed: `NonAlphaNum` [INFO] [stdout] --> src/tokenizer_peek.rs:22:7 [INFO] [stdout] | [INFO] [stdout] 22 | NonAlphaNum // Strings with only | ¦ § _ ~ ^ [INFO] [stdout] | ^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: function `U80_FF` should have a snake case name [INFO] [stdout] --> src/tokenizer_loop.rs:141:8 [INFO] [stdout] | [INFO] [stdout] 141 | pub fn U80_FF(byte: u8) -> [u8; 2] { [INFO] [stdout] | ^^^^^^ help: convert the identifier to snake case (notice the capitalization): `u80_ff` [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(non_snake_case)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: 33 warnings emitted [INFO] [stdout] [INFO] [stdout] [INFO] [stderr] Finished dev [unoptimized + debuginfo] target(s) in 9.29s [INFO] running `Command { std: "docker" "inspect" "41f5de76c3d3037751cb1f42325c10ac29be131b4308b0b9efcd81c19f576f17", kill_on_drop: false }` [INFO] running `Command { std: "docker" "rm" "-f" "41f5de76c3d3037751cb1f42325c10ac29be131b4308b0b9efcd81c19f576f17", kill_on_drop: false }` [INFO] [stdout] 41f5de76c3d3037751cb1f42325c10ac29be131b4308b0b9efcd81c19f576f17
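Editor's note: the `cargo check` run above finishes successfully ("Finished dev ... in 9.29s") with 33 warn-level lints and no errors, so the crate passes the pr-82565 check; the warnings fall into a few groups, each with a mechanical fix, sketched below. For the `unused_imports` lints the fix is simply deleting the listed `use` lines (`std::borrow::Cow`, `encoding::{Encoding, DecoderTrap}`, `encoding::all::UTF_8`, `bitflags::*`, `Duration`, `tokenizer_peek::*`). For the unused-binding and unused-assignment warnings in src/tokenizer_peek.rs and src/numbers.rs, a minimal sketch of the pattern the compiler suggests follows; the function body is a placeholder (the real code is not in the log), and only the underscore prefix and the delayed initialization are the point. The unused `token` binding at src/tokenizer_loop.rs:97 takes the same `_token` prefix.

    fn scan(bytes: &[u8]) -> usize {
        // `let mut pos: usize = 0;` warned under `unused_assignments` because the 0 was
        // overwritten before being read; initialize the binding with the value actually used.
        let mut pos = bytes.iter().take_while(|b| **b == b' ').count();

        // An unused loop index triggers `unused_variables`; prefix it with `_`
        // (or drop `.enumerate()` entirely if the index is never needed).
        for (_i, byte) in bytes.iter().enumerate() {
            if *byte == b'\n' {
                pos += 1;
            }
        }
        pos
    }

    fn main() {
        assert_eq!(scan(b"  a\nb\n"), 4);
    }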
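The deprecated-`try!` warnings do not point at the crate's own code: they originate inside the `bitflags!` macro expansion of the old bitflags 0.8/0.9 releases pinned in the lockfile, so the practical fix is bumping that dependency (newer `bitflags` releases no longer expand to `try!`). For completeness, the rewrite the compiler suggests for direct `try!` call sites looks like this; the file-reading function is only an illustration, not code from the crate:

    use std::fs::File;
    use std::io::{self, Read};

    // Before (deprecated): `let mut file = try!(File::open(path));`
    // After: the `?` operator propagates the error the same way.
    fn read_file(path: &str) -> io::Result<String> {
        let mut file = File::open(path)?;
        let mut buf = String::new();
        file.read_to_string(&mut buf)?;
        Ok(buf)
    }

    fn main() {
        match read_file("Cargo.toml") {
            Ok(text) => println!("read {} bytes", text.len()),
            Err(err) => eprintln!("error: {}", err),
        }
    }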
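The `bare_trait_objects` warning on `words` in src/tokenizer_loop.rs only needs `dyn` in front of the `Fn(&Token) -> i32` bound. The generic parameters of `Vec` and `Result` were stripped from the log, so the concrete types in the sketch below are assumptions, as is the stand-in `Token` struct and the function body:

    pub struct Token {
        pub text: String, // stand-in; the real `Token` fields are not shown in the log
    }

    // Before: `f: &Fn(&Token) -> i32` (bare trait object, deprecated).
    // After:  `f: &dyn Fn(&Token) -> i32`.
    pub fn words(bytes: Vec<u8>, f: &dyn Fn(&Token) -> i32) -> Result<Vec<Token>, &'static str> {
        let token = Token {
            text: String::from_utf8(bytes).map_err(|_| "input was not valid UTF-8")?,
        };
        f(&token); // invoke the caller-supplied callback on the token
        Ok(vec![token])
    }

    fn main() {
        let score = |t: &Token| t.text.len() as i32;
        let tokens = words(b"hello".to_vec(), &score).unwrap();
        assert_eq!(tokens[0].text, "hello");
    }

On newer editions a generic parameter such as `f: impl Fn(&Token) -> i32` would avoid the trait object entirely, but the `dyn` spelling is the minimal change the lint asks for.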
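The many `ellipsis_inclusive_range_patterns` warnings all point at the byte-class `match` in src/tokenizer_loop.rs; each `...` pattern just becomes `..=`. A self-contained sketch using the ranges reported in the log, with a placeholder `ByteClass` enum standing in for whatever the real match arms produce:

    #[derive(Debug, PartialEq)]
    enum ByteClass {
        Alpha,
        Punct,
        Digit,
        Control,
        High,
        Other,
    }

    fn classify(byte: u8) -> ByteClass {
        match byte {
            // was `65 ... 90 | 97 ... 122`
            65..=90 | 97..=122 => ByteClass::Alpha,
            // was `33 ... 47 | 58 ... 64 | 91 ... 96 | 123 ... 126 | 160 ... 191`
            33..=47 | 58..=64 | 91..=96 | 123..=126 | 160..=191 => ByteClass::Punct,
            // was `48 ... 57`
            48..=57 => ByteClass::Digit,
            // was `127 ... 159 | 1 ... 31`
            127..=159 | 1..=31 => ByteClass::Control,
            // was `192 ... 255`
            192..=255 => ByteClass::High,
            _ => ByteClass::Other, // 0 and 32 (space) fall through here
        }
    }

    fn main() {
        assert_eq!(classify(b'a'), ByteClass::Alpha);
        assert_eq!(classify(b'7'), ByteClass::Digit);
        assert_eq!(classify(b'!'), ByteClass::Punct);
    }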
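The remaining lints are naming and dead-code ones: the never-constructed `Alpha`/`Other`/`Numeric`/`NonAlphaNum` variants in src/tokenizer_peek.rs can either be used or explicitly allowed, and `U80_FF` should become `u80_ff` (or keep its name behind `#[allow(non_snake_case)]`). The enum below mirrors the comments shown in the log; the body of `u80_ff` is only a guess that it widens a Latin-1 byte to its two-byte UTF-8 form, which the log does not confirm:

    // `#[allow(dead_code)]` silences the never-constructed-variant warnings until the
    // tokenizer actually produces these kinds.
    #[allow(dead_code)]
    enum TokenKind {
        Alpha,       // pure alpha string
        Other,       // unidentified elements
        Numeric,     // pure numeric strings, e.g. 123
        NonAlphaNum, // strings with only | ¦ § _ ~ ^
    }

    // Renamed from `U80_FF` to satisfy `non_snake_case`.
    pub fn u80_ff(byte: u8) -> [u8; 2] {
        // Assumed behaviour: encode a byte in 0x80..=0xFF as its two-byte UTF-8 sequence.
        [0xC0 | (byte >> 6), 0x80 | (byte & 0x3F)]
    }

    fn main() {
        // 0xE9 is 'é' in Latin-1; its UTF-8 encoding is [0xC3, 0xA9].
        assert_eq!(u80_ff(0xE9), [0xC3, 0xA9]);
    }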