[INFO] cloning repository https://github.com/P40b0s/tokenizer
[INFO] running `Command { std: "git" "-c" "credential.helper=" "-c" "credential.helper=/workspace/cargo-home/bin/git-credential-null" "clone" "--bare" "https://github.com/P40b0s/tokenizer" "/workspace/cache/git-repos/https%3A%2F%2Fgithub.com%2FP40b0s%2Ftokenizer", kill_on_drop: false }`
[INFO] [stderr] Cloning into bare repository '/workspace/cache/git-repos/https%3A%2F%2Fgithub.com%2FP40b0s%2Ftokenizer'...
[INFO] running `Command { std: "git" "rev-parse" "HEAD", kill_on_drop: false }`
[INFO] [stdout] 398fc5845e2245d459b3bd3ef5ddf076a8823089
[INFO] checking P40b0s/tokenizer/398fc5845e2245d459b3bd3ef5ddf076a8823089 against try#ccf408f4326a858c00dd845a64a86b16f360a801 for pr-129466-2
[INFO] running `Command { std: "git" "clone" "/workspace/cache/git-repos/https%3A%2F%2Fgithub.com%2FP40b0s%2Ftokenizer" "/workspace/builds/worker-5-tc2/source", kill_on_drop: false }`
[INFO] [stderr] Cloning into '/workspace/builds/worker-5-tc2/source'...
[INFO] [stderr] done.
[INFO] validating manifest of git repo https://github.com/P40b0s/tokenizer on toolchain ccf408f4326a858c00dd845a64a86b16f360a801
[INFO] running `Command { std: CARGO_HOME="/workspace/cargo-home" RUSTUP_HOME="/workspace/rustup-home" "/workspace/cargo-home/bin/cargo" "+ccf408f4326a858c00dd845a64a86b16f360a801" "metadata" "--manifest-path" "Cargo.toml" "--no-deps", kill_on_drop: false }`
[INFO] started tweaking git repo https://github.com/P40b0s/tokenizer
[INFO] finished tweaking git repo https://github.com/P40b0s/tokenizer
[INFO] tweaked toml for git repo https://github.com/P40b0s/tokenizer written to /workspace/builds/worker-5-tc2/source/Cargo.toml
[INFO] crate git repo https://github.com/P40b0s/tokenizer already has a lockfile, it will not be regenerated
[INFO] running `Command { std: CARGO_HOME="/workspace/cargo-home" RUSTUP_HOME="/workspace/rustup-home" "/workspace/cargo-home/bin/cargo" "+ccf408f4326a858c00dd845a64a86b16f360a801" "fetch" "--manifest-path" "Cargo.toml", kill_on_drop: false }`
[INFO] running `Command { std: "docker" "create" "-v" "/var/lib/crater-agent-workspace/builds/worker-5-tc2/target:/opt/rustwide/target:rw,Z" "-v" "/var/lib/crater-agent-workspace/builds/worker-5-tc2/source:/opt/rustwide/workdir:ro,Z" "-v" "/var/lib/crater-agent-workspace/cargo-home:/opt/rustwide/cargo-home:ro,Z" "-v" "/var/lib/crater-agent-workspace/rustup-home:/opt/rustwide/rustup-home:ro,Z" "-e" "SOURCE_DIR=/opt/rustwide/workdir" "-e" "CARGO_TARGET_DIR=/opt/rustwide/target" "-e" "CARGO_HOME=/opt/rustwide/cargo-home" "-e" "RUSTUP_HOME=/opt/rustwide/rustup-home" "-w" "/opt/rustwide/workdir" "-m" "1610612736" "--user" "0:0" "--network" "none" "ghcr.io/rust-lang/crates-build-env/linux@sha256:ceb6ea022f8a89cebbe621bb4987e73a935bd40dfbb726f832cfff4742a5b95a" "/opt/rustwide/cargo-home/bin/cargo" "+ccf408f4326a858c00dd845a64a86b16f360a801" "metadata" "--no-deps" "--format-version=1", kill_on_drop: false }`
[INFO] [stdout] dddf4068d9ff0f066f9803a797fb85fc81207d28e18755bac3b8682b6cd48418
[INFO] running `Command { std: "docker" "start" "-a" "dddf4068d9ff0f066f9803a797fb85fc81207d28e18755bac3b8682b6cd48418", kill_on_drop: false }`
[INFO] running `Command { std: "docker" "inspect" "dddf4068d9ff0f066f9803a797fb85fc81207d28e18755bac3b8682b6cd48418", kill_on_drop: false }`
[INFO] running `Command { std: "docker" "rm" "-f" "dddf4068d9ff0f066f9803a797fb85fc81207d28e18755bac3b8682b6cd48418", kill_on_drop: false }`
[INFO] [stdout] dddf4068d9ff0f066f9803a797fb85fc81207d28e18755bac3b8682b6cd48418
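The `Command { std: ..., kill_on_drop: false }` entries above are the Debug output of the runner's process builder (`kill_on_drop` is the async wrapper's field around `std::process::Command`). As a minimal sketch of how such an invocation is assembled and logged — the paths and toolchain hash are taken from the log above, while the variable names are illustrative, not crater's own:

use std::process::Command;

fn main() {
    // Environment is attached with .env(), arguments with .arg()/.args() —
    // exactly the pieces visible in the logged Debug output above.
    let mut cmd = Command::new("/workspace/cargo-home/bin/cargo");
    cmd.env("CARGO_HOME", "/workspace/cargo-home")
        .env("RUSTUP_HOME", "/workspace/rustup-home")
        .arg("+ccf408f4326a858c00dd845a64a86b16f360a801")
        .args(["metadata", "--manifest-path", "Cargo.toml", "--no-deps"]);

    // Debug-printing the builder shows the env assignments and quoted
    // program/args; the `Command { std: ..., kill_on_drop: ... }` wrapper
    // seen in the log comes from the async wrapper's own Debug impl.
    println!("[INFO] running `{:?}`", cmd);

    // A non-zero status here is what the log later reports as an
    // `exit status: 101` failure.
    let status = cmd.status().expect("failed to spawn cargo");
    println!("exit: {status}");
}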
running `Command { std: "docker" "create" "-v" "/var/lib/crater-agent-workspace/builds/worker-5-tc2/target:/opt/rustwide/target:rw,Z" "-v" "/var/lib/crater-agent-workspace/builds/worker-5-tc2/source:/opt/rustwide/workdir:ro,Z" "-v" "/var/lib/crater-agent-workspace/cargo-home:/opt/rustwide/cargo-home:ro,Z" "-v" "/var/lib/crater-agent-workspace/rustup-home:/opt/rustwide/rustup-home:ro,Z" "-e" "SOURCE_DIR=/opt/rustwide/workdir" "-e" "CARGO_TARGET_DIR=/opt/rustwide/target" "-e" "CARGO_INCREMENTAL=0" "-e" "RUST_BACKTRACE=full" "-e" "RUSTFLAGS=--cap-lints=forbid" "-e" "RUSTDOCFLAGS=--cap-lints=forbid" "-e" "CARGO_HOME=/opt/rustwide/cargo-home" "-e" "RUSTUP_HOME=/opt/rustwide/rustup-home" "-w" "/opt/rustwide/workdir" "-m" "1610612736" "--user" "0:0" "--network" "none" "ghcr.io/rust-lang/crates-build-env/linux@sha256:ceb6ea022f8a89cebbe621bb4987e73a935bd40dfbb726f832cfff4742a5b95a" "/opt/rustwide/cargo-home/bin/cargo" "+ccf408f4326a858c00dd845a64a86b16f360a801" "check" "--frozen" "--all" "--all-targets" "--message-format=json", kill_on_drop: false }` [INFO] [stdout] 9e7aa034345f15e8db7055c57773574952971251a5da217b0bfc641ffeef17d6 [INFO] running `Command { std: "docker" "start" "-a" "9e7aa034345f15e8db7055c57773574952971251a5da217b0bfc641ffeef17d6", kill_on_drop: false }` [INFO] [stderr] Copying to /tmp/fixit [INFO] [stderr] Running `cargo fix --edition` [INFO] [stderr] Migrating tokenizer/Cargo.toml from 2021 edition to 2024 [INFO] [stderr] Migrating tokenizer_derive/Cargo.toml from 2021 edition to 2024 [INFO] [stderr] Migrating tests/Cargo.toml from 2021 edition to 2024 [INFO] [stderr] Migrating Cargo.toml from 2021 edition to 2024 [INFO] [stderr] Compiling unicode-ident v1.0.6 [INFO] [stderr] Checking regex-syntax v0.7.4 [INFO] [stderr] Checking either v1.8.1 [INFO] [stderr] Checking tokenizerx v0.1.0 (/tmp/fixit) [INFO] [stderr] Checking aho-corasick v1.0.4 [INFO] [stderr] Migrating src/lib.rs from 2021 edition to 2024 [INFO] [stderr] Compiling proc-macro2 v1.0.66 [INFO] [stderr] Checking itertools v0.11.0 [INFO] [stderr] Compiling quote v1.0.33 [INFO] [stderr] Compiling syn v2.0.29 [INFO] [stderr] Checking regex-automata v0.3.6 [INFO] [stderr] Checking tokenizer_derive v0.10.12 (/tmp/fixit/tokenizer_derive) [INFO] [stderr] Migrating tokenizer_derive/src/lib.rs from 2021 edition to 2024 [INFO] [stderr] Fixed tokenizer_derive/src/lib.rs (2 fixes) [INFO] [stdout] warning: value assigned to `pattern` is never read [INFO] [stdout] --> tokenizer_derive/src/lib.rs:95:15 [INFO] [stdout] | [INFO] [stdout] 95 | let mut pattern: &String = &String::new(); [INFO] [stdout] | ^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = help: maybe it is overwritten before being read? [INFO] [stdout] = note: `#[warn(unused_assignments)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: 1 warning emitted [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: value assigned to `pattern` is never read [INFO] [stdout] --> tokenizer_derive/src/lib.rs:95:15 [INFO] [stdout] | [INFO] [stdout] 95 | let mut pattern: &String = &String::new(); [INFO] [stdout] | ^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = help: maybe it is overwritten before being read? 
[INFO] [stdout] = note: `#[warn(unused_assignments)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: 1 warning emitted [INFO] [stdout] [INFO] [stdout] [INFO] [stderr] Checking regex v1.9.3 [INFO] [stdout] warning: value assigned to `pattern` is never read [INFO] [stdout] --> tokenizer_derive/src/lib.rs:95:15 [INFO] [stdout] | [INFO] [stdout] 95 | let mut pattern: &String = &String::new(); [INFO] [stdout] | ^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = help: maybe it is overwritten before being read? [INFO] [stdout] = note: `#[warn(unused_assignments)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: 1 warning emitted [INFO] [stdout] [INFO] [stdout] [INFO] [stderr] Checking tokenizer v0.1.5 (/tmp/fixit/tokenizer) [INFO] [stderr] Migrating tokenizer/src/lib.rs from 2021 edition to 2024 [INFO] [stderr] Fixed tokenizer/src/token_definition.rs (2 fixes) [INFO] [stderr] Fixed tokenizer/src/token_model.rs (1 fix) [INFO] [stderr] Fixed tokenizer/src/matches.rs (3 fixes) [INFO] [stderr] Checking macro-test v0.1.0 (/tmp/fixit/tests) [INFO] [stderr] Migrating tests/src/lib.rs from 2021 edition to 2024 [INFO] [stdout] warning: enum `LtrTokens` is never used [INFO] [stdout] --> tests/src/lib.rs:36:6 [INFO] [stdout] | [INFO] [stdout] 36 | enum LtrTokens [INFO] [stdout] | ^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(dead_code)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stderr] Fixed tests/src/lib.rs (4 fixes) [INFO] [stdout] warning: 1 warning emitted [INFO] [stdout] [INFO] [stdout] [INFO] [stderr] значение паттерна: (?P123)значение паттерна: 321значение конвертера: *>абырвалгзначение паттерна: 000значение конвертера: 000>ZEROзначение очередности: 0значение конвертера: conv1>oзначение паттерна: pat1значение очередности: 1значение конвертера: conv2>1значение паттерна: pat2значение паттерна: 123[[[p][]]321значение очередности: 3значение конвертера: 123321>321значение паттерна: \[[ПИСЬМО]{6}.*\]значение паттерна: (?i)тема=([^\n\r]+)значение паттерна: (?i)автоотправка=([^\n\r]+)значение паттерна: (?i)эцп=([^\n\r]+)значение паттерна: (?i)доставлено=([^\n\r]+)значение паттерна: (?i)прочтено=([^\n\r]+)значение паттерна: (?i)дата=([^\n\r]+)значение паттерна: (?i)\[АДРЕСАТЫ\]значение паттерна: \[ФАЙЛЫ\]значение паттерна: \[ПИСЬМО.*\]значение паттерна: \[ТЕКСТ\]значение паттерна: \d=([^\n\r]+)значение паттерна: [А-Яа-я0-9_]+=(?P.*)значение очередности: 3значение паттерна: (?Pпервый)\s(?Pвторой)\s(?Pтретий)значение конвертера: конвертированное значение бдет=$three{"$message_type":"artifact","artifact":"/opt/rustwide/target/debug/deps/macro_test-21d098ec33dfe9f9.d","emit":"dep-info"} [INFO] [stderr] значение паттерна: (?P123)значение паттерна: 321значение конвертера: *>абырвалгзначение паттерна: 000значение конвертера: 000>ZEROзначение очередности: 0значение конвертера: conv1>oзначение паттерна: pat1значение очередности: 1значение конвертера: conv2>1значение паттерна: pat2значение паттерна: 123[[[p][]]321значение очередности: 3значение конвертера: 123321>321значение паттерна: \[[ПИСЬМО]{6}.*\]значение паттерна: (?i)тема=([^\n\r]+)значение паттерна: (?i)автоотправка=([^\n\r]+)значение паттерна: (?i)эцп=([^\n\r]+)значение паттерна: (?i)доставлено=([^\n\r]+)значение паттерна: (?i)прочтено=([^\n\r]+)значение паттерна: (?i)дата=([^\n\r]+)значение паттерна: (?i)\[АДРЕСАТЫ\]значение паттерна: \[ФАЙЛЫ\]значение паттерна: \[ПИСЬМО.*\]значение паттерна: \[ТЕКСТ\]значение паттерна: \d=([^\n\r]+)значение паттерна: [А-Яа-я0-9_]+=(?P.*)значение очередности: 
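The `unused_assignments` warning repeated above fires because the `&String::new()` assigned at tokenizer_derive/src/lib.rs:95 is overwritten on every path before it is ever read. The macro's surrounding code is not shown in this log, so the following is a minimal self-contained sketch of the same shape, plus one way to remove the warning — not the crate's actual code:

fn main() {
    let empty = String::new();
    let found = Some(String::from("[0-9]+"));

    // Warns like the log: the initial `&empty` is never read, because
    // both branches below overwrite `pattern` before its first use.
    let mut pattern: &String = &empty;
    if let Some(p) = &found {
        pattern = p;
    } else {
        pattern = &empty;
    }
    println!("pattern value: {pattern}");

    // One fix: bind once, after the value is chosen, so there is no
    // dead initial assignment at all.
    let pattern: &String = match &found {
        Some(p) => p,
        None => &empty,
    };
    println!("pattern value: {pattern}");
}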
[INFO] [stderr] pattern value: (?P123)
[INFO] [stderr] pattern value: 321
[INFO] [stderr] converter value: *>абырвалг
[INFO] [stderr] pattern value: 000
[INFO] [stderr] converter value: 000>ZERO
[INFO] [stderr] order value: 0
[INFO] [stderr] converter value: conv1>o
[INFO] [stderr] pattern value: pat1
[INFO] [stderr] order value: 1
[INFO] [stderr] converter value: conv2>1
[INFO] [stderr] pattern value: pat2
[INFO] [stderr] pattern value: 123[[[p][]]321
[INFO] [stderr] order value: 3
[INFO] [stderr] converter value: 123321>321
[INFO] [stderr] pattern value: \[[ПИСЬМО]{6}.*\]
[INFO] [stderr] pattern value: (?i)тема=([^\n\r]+)
[INFO] [stderr] pattern value: (?i)автоотправка=([^\n\r]+)
[INFO] [stderr] pattern value: (?i)эцп=([^\n\r]+)
[INFO] [stderr] pattern value: (?i)доставлено=([^\n\r]+)
[INFO] [stderr] pattern value: (?i)прочтено=([^\n\r]+)
[INFO] [stderr] pattern value: (?i)дата=([^\n\r]+)
[INFO] [stderr] pattern value: (?i)\[АДРЕСАТЫ\]
[INFO] [stderr] pattern value: \[ФАЙЛЫ\]
[INFO] [stderr] pattern value: \[ПИСЬМО.*\]
[INFO] [stderr] pattern value: \[ТЕКСТ\]
[INFO] [stderr] pattern value: \d=([^\n\r]+)
[INFO] [stderr] pattern value: [А-Яа-я0-9_]+=(?P.*)
[INFO] [stderr] order value: 3
[INFO] [stderr] pattern value: (?Pпервый)\s(?Pвторой)\s(?Pтретий)
[INFO] [stderr] converter value: converted value will be=$three
[INFO] [stdout] {"$message_type":"artifact","artifact":"/opt/rustwide/target/debug/deps/macro_test-21d098ec33dfe9f9.d","emit":"dep-info"}
[INFO] [stderr] pattern value: (?P123)
[INFO] [stderr] pattern value: 321
[INFO] [stderr] converter value: *>абырвалг
[INFO] [stderr] pattern value: 000
[INFO] [stderr] converter value: 000>ZERO
[INFO] [stderr] order value: 0
[INFO] [stderr] converter value: conv1>o
[INFO] [stderr] pattern value: pat1
[INFO] [stderr] order value: 1
[INFO] [stderr] converter value: conv2>1
[INFO] [stderr] pattern value: pat2
[INFO] [stderr] pattern value: 123[[[p][]]321
[INFO] [stderr] order value: 3
[INFO] [stderr] converter value: 123321>321
[INFO] [stderr] pattern value: \[[ПИСЬМО]{6}.*\]
[INFO] [stderr] pattern value: (?i)тема=([^\n\r]+)
[INFO] [stderr] pattern value: (?i)автоотправка=([^\n\r]+)
[INFO] [stderr] pattern value: (?i)эцп=([^\n\r]+)
[INFO] [stderr] pattern value: (?i)доставлено=([^\n\r]+)
[INFO] [stderr] pattern value: (?i)прочтено=([^\n\r]+)
[INFO] [stderr] pattern value: (?i)дата=([^\n\r]+)
[INFO] [stderr] pattern value: (?i)\[АДРЕСАТЫ\]
[INFO] [stderr] pattern value: \[ФАЙЛЫ\]
[INFO] [stderr] pattern value: \[ПИСЬМО.*\]
[INFO] [stderr] pattern value: \[ТЕКСТ\]
[INFO] [stderr] pattern value: \d=([^\n\r]+)
[INFO] [stderr] pattern value: [А-Яа-я0-9_]+=(?P.*)
[INFO] [stderr] order value: 3
[INFO] [stderr] pattern value: (?Pпервый)\s(?Pвторой)\s(?Pтретий)
[INFO] [stderr] converter value: converted value will be=$three
[INFO] [stdout] {"$message_type":"diagnostic","message":"failed to resolve: use of undeclared type `Lexer`","code":{"code":"E0433","explanation":"An undeclared crate, module, or type was used.\n\nErroneous code example:\n\n```compile_fail,E0433\nlet map = HashMap::new();\n// error: failed to resolve: use of undeclared type `HashMap`\n```\n\nPlease verify you didn't misspell the type/module's name or that you didn't\nforget to import it:\n\n```\nuse std::collections::HashMap; // HashMap has been imported.\nlet map: HashMap<u32, usize> = HashMap::new(); // So it can be used!\n```\n\nIf you've expected to use a crate name:\n\n```compile_fail\nuse ferris_wheel::BigO;\n// error: failed to resolve: use of undeclared crate or module `ferris_wheel`\n```\n\nMake sure the crate has been added as a dependency in `Cargo.toml`.\n\nTo use a module from your current crate, add the `crate::` prefix to the path.\n"},"level":"error","spans":[{"file_name":"tests/src/lib.rs","byte_start":2397,"byte_end":2402,"line_start":82,"line_end":82,"column_start":19,"column_end":24,"is_primary":true,"text":[{"text":"    let actions = Lexer::tokenize(text, tt);","highlight_start":19,"highlight_end":24}],"label":"use of undeclared type `Lexer`","suggested_replacement":null,"suggestion_applicability":null,"expansion":null}],"children":[{"message":"consider importing this struct","code":null,"level":"help","spans":[{"file_name":"tests/src/lib.rs","byte_start":57,"byte_end":57,"line_start":5,"line_end":5,"column_start":1,"column_end":1,"is_primary":true,"text":[{"text":"use tokenizer_derive::Tokenizer;","highlight_start":1,"highlight_end":1}],"label":null,"suggested_replacement":"use tokenizer::Lexer;\n","suggestion_applicability":"MaybeIncorrect","expansion":null}],"children":[],"rendered":null}],"rendered":"\u001b[0m\u001b[1m\u001b[38;5;9merror[E0433]\u001b[0m\u001b[0m\u001b[1m: failed to resolve: use of undeclared type `Lexer`\u001b[0m\n\u001b[0m \u001b[0m\u001b[0m\u001b[1m\u001b[38;5;12m--> \u001b[0m\u001b[0mtests/src/lib.rs:82:19\u001b[0m\n\u001b[0m \u001b[0m\u001b[0m\u001b[1m\u001b[38;5;12m|\u001b[0m\n\u001b[0m\u001b[1m\u001b[38;5;12m82\u001b[0m\u001b[0m \u001b[0m\u001b[0m\u001b[1m\u001b[38;5;12m|\u001b[0m\u001b[0m \u001b[0m\u001b[0m let actions = Lexer::tokenize(text, tt);\u001b[0m\n\u001b[0m \u001b[0m\u001b[0m\u001b[1m\u001b[38;5;12m|\u001b[0m\u001b[0m \u001b[0m\u001b[0m\u001b[1m\u001b[38;5;9m^^^^^\u001b[0m\u001b[0m \u001b[0m\u001b[0m\u001b[1m\u001b[38;5;9muse of undeclared type `Lexer`\u001b[0m\n\u001b[0m \u001b[0m\u001b[0m\u001b[1m\u001b[38;5;12m|\u001b[0m\n\u001b[0m\u001b[1m\u001b[38;5;14mhelp\u001b[0m\u001b[0m: consider importing this struct\u001b[0m\n\u001b[0m \u001b[0m\u001b[0m\u001b[1m\u001b[38;5;12m|\u001b[0m\n\u001b[0m\u001b[1m\u001b[38;5;12m5\u001b[0m\u001b[0m \u001b[0m\u001b[0m\u001b[38;5;10m+ use tokenizer::Lexer;\u001b[0m\n\u001b[0m \u001b[0m\u001b[0m\u001b[1m\u001b[38;5;12m|\u001b[0m\n\n"}
[INFO] [stdout] error[E0433]: failed to resolve: use of undeclared type `Lexer`
[INFO] [stdout]    --> tests/src/lib.rs:118:19
[INFO] [stdout]     |
[INFO] [stdout] 118 |     let actions = Lexer::tokenize(text, tt);
[INFO] [stdout]     |                   ^^^^^ use of undeclared type `Lexer`
[INFO] [stdout]     |
[INFO] [stdout] help: consider importing this struct
[INFO] [stdout]     |
[INFO] [stdout] 5   + use tokenizer::Lexer;
[INFO] [stdout]     |
[INFO] [stdout]
[INFO] [stdout] error[E0433]: failed to resolve: use of undeclared type `Lexer`
[INFO] [stdout]    --> tests/src/lib.rs:131:19
[INFO] [stdout]     |
[INFO] [stdout] 131 |     let actions = Lexer::tokenize(text, defs.unwrap());
[INFO] [stdout]     |                   ^^^^^ use of undeclared type `Lexer`
[INFO] [stdout]     |
[INFO] [stdout] help: consider importing this struct
[INFO] [stdout]     |
[INFO] [stdout] 5   + use tokenizer::Lexer;
[INFO] [stdout]     |
[INFO] [stdout]
[INFO] [stdout] error[E0433]: failed to resolve: use of undeclared type `Lexer`
[INFO] [stdout]    --> tests/src/lib.rs:161:23
[INFO] [stdout]     |
[INFO] [stdout] 161 |         let actions = Lexer::tokenize(text, defs);
[INFO] [stdout]     |                       ^^^^^ use of undeclared type `Lexer`
[INFO] [stdout]     |
[INFO] [stdout] help: consider importing this struct
[INFO] [stdout]     |
[INFO] [stdout] 5   + use tokenizer::Lexer;
[INFO] [stdout]     |
[INFO] [stdout]
[INFO] [stdout] error[E0433]: failed to resolve: use of undeclared type `Lexer`
[INFO] [stdout]    --> tests/src/lib.rs:186:23
[INFO] [stdout]     |
[INFO] [stdout] 186 |         let actions = Lexer::tokenize(text, defs);
[INFO] [stdout]     |                       ^^^^^ use of undeclared type `Lexer`
[INFO] [stdout]     |
[INFO] [stdout] help: consider importing this struct
[INFO] [stdout]     |
[INFO] [stdout] 5   + use tokenizer::Lexer;
[INFO] [stdout]     |
[INFO] [stdout]
[INFO] [stdout] error[E0433]: failed to resolve: use of undeclared type `Lexer`
[INFO] [stdout]    --> tests/src/lib.rs:209:19
[INFO] [stdout]     |
[INFO] [stdout] 209 |     let actions = Lexer::tokenize(text, defs.unwrap());
[INFO] [stdout]     |                   ^^^^^ use of undeclared type `Lexer`
[INFO] [stdout]     |
[INFO] [stdout] help: consider importing this struct
[INFO] [stdout]     |
[INFO] [stdout] 5   + use tokenizer::Lexer;
[INFO] [stdout]     |
[INFO] [stdout]
[INFO] [stdout] error[E0433]: failed to resolve: use of undeclared type `Lexer`
[INFO] [stdout]    --> tests/src/lib.rs:234:19
[INFO] [stdout]     |
[INFO] [stdout] 234 |     let actions = Lexer::tokenize(text, defs.unwrap());
[INFO] [stdout]     |                   ^^^^^ use of undeclared type `Lexer`
[INFO] [stdout]     |
[INFO] [stdout] help: consider importing this struct
[INFO] [stdout]     |
[INFO] [stdout] 5   + use tokenizer::Lexer;
[INFO] [stdout]     |
[INFO] [stdout]
[INFO] [stdout] error[E0433]: failed to resolve: use of undeclared type `Lexer`
[INFO] [stdout]    --> tests/src/lib.rs:254:19
[INFO] [stdout]     |
[INFO] [stdout] 254 |     let actions = Lexer::tokenize(text, defs.unwrap());
[INFO] [stdout]     |                   ^^^^^ use of undeclared type `Lexer`
[INFO] [stdout]     |
[INFO] [stdout] help: consider importing this struct
[INFO] [stdout]     |
[INFO] [stdout] 5   + use tokenizer::Lexer;
[INFO] [stdout]     |
[INFO] [stdout]
[INFO] [stdout] error[E0433]: failed to resolve: use of undeclared type `Lexer`
[INFO] [stdout]    --> tests/src/lib.rs:274:19
[INFO] [stdout]     |
[INFO] [stdout] 274 |     let actions = Lexer::tokenize(text, defs.unwrap());
[INFO] [stdout]     |                   ^^^^^ use of undeclared type `Lexer`
[INFO] [stdout]     |
[INFO] [stdout] help: consider importing this struct
[INFO] [stdout]     |
[INFO] [stdout] 5   + use tokenizer::Lexer;
[INFO] [stdout]     |
[INFO] [stdout]
[INFO] [stdout] warning: unused variable: `tt`
[INFO] [stdout]   --> tests/src/lib.rs:89:9
[INFO] [stdout]    |
[INFO] [stdout] 89 |     let tt : Option = None;
[INFO] [stdout]    |         ^^ help: if this is intentional, prefix it with an underscore: `_tt`
[INFO] [stdout]    |
[INFO] [stdout]    = note: `#[warn(unused_variables)]` on by default
[INFO] [stdout]
[INFO] [stdout] error: aborting due to 9 previous errors; 1 warning emitted
[INFO] [stdout]
[INFO] [stdout] For more information about this error, try `rustc --explain E0433`.
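All nine E0433 errors trace to one missing import: tests/src/lib.rs uses `Lexer` without `use tokenizer::Lexer;`, and the compiler's suggestion is tagged "MaybeIncorrect" in the JSON diagnostic above, so `cargo fix` does not apply it automatically (it only applies machine-applicable suggestions). A self-contained analogue of the failure and of both suggested fixes, using a std type since the `tokenizer` crate itself is not vendored in this log:

// Without the `use` line below, this fails exactly like the log's E0433:
//     error[E0433]: failed to resolve: use of undeclared type `HashMap`
use std::collections::HashMap;

fn main() {
    // The import brings the type into scope, so the call resolves.
    let map: HashMap<String, usize> = HashMap::new();
    println!("{} entries", map.len());

    // The log's `unused variable: tt` warning is fixed the mechanical way
    // rustc suggests: prefix the unused binding with an underscore. (The
    // generic argument elided in the log's `let tt : Option = None;` is
    // illustrative here.)
    let _tt: Option<String> = None;
}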
[INFO] [stderr] error: could not compile `macro-test` (lib test) due to 9 previous errors; 1 warning emitted
[INFO] [stderr] error: failed to migrate to next edition
[INFO] [stderr]
[INFO] [stderr] Caused by:
[INFO] [stderr]   process didn't exit successfully: `cargo fix --edition --allow-no-vcs --allow-dirty --frozen --all --all-targets --message-format=json` (exit status: 101)
[INFO] running `Command { std: "docker" "inspect" "9e7aa034345f15e8db7055c57773574952971251a5da217b0bfc641ffeef17d6", kill_on_drop: false }`
[INFO] running `Command { std: "docker" "rm" "-f" "9e7aa034345f15e8db7055c57773574952971251a5da217b0bfc641ffeef17d6", kill_on_drop: false }`
[INFO] [stdout] 9e7aa034345f15e8db7055c57773574952971251a5da217b0bfc641ffeef17d6
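The `Caused by:` line above records the exact step that failed. A minimal sketch of re-running that same step against a local clone of the repository; the flag set is copied verbatim from the log, while the checkout path is a placeholder, not a path from this run:

use std::process::Command;

fn main() {
    // Reproduce the failing migration step from the `Caused by:` line.
    let status = Command::new("cargo")
        .args([
            "fix", "--edition", "--allow-no-vcs", "--allow-dirty",
            "--frozen", "--all", "--all-targets", "--message-format=json",
        ])
        .current_dir("/path/to/tokenizer") // placeholder checkout location
        .status()
        .expect("failed to run cargo");

    // Crater treats a non-zero status (101 in this log) as a failure
    // to migrate to the next edition.
    println!("cargo fix exited with: {status}");
}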