Dec 15 05:29:21.641 INFO testing luthor-0.2.0 against try#aa49d8ef14939ddec0e34b346b60174a5673d48f for pr-56550
Dec 15 05:29:21.641 INFO running `"docker" "create" "-v" "/mnt/big/crater/work/local/target-dirs/pr-56550/worker-3/try#aa49d8ef14939ddec0e34b346b60174a5673d48f:/target:rw,Z" "-v" "/mnt/big/crater/work/local/test-source/worker-3/pr-56550/try#aa49d8ef14939ddec0e34b346b60174a5673d48f:/source:ro,Z" "-v" "/mnt/big/crater/work/local/cargo-home:/cargo-home:ro,Z" "-v" "/mnt/big/crater/work/local/rustup-home:/rustup-home:ro,Z" "-e" "USER_ID=1000" "-e" "SOURCE_DIR=/source" "-e" "USER_ID=1000" "-e" "CMD=cargo +aa49d8ef14939ddec0e34b346b60174a5673d48f-alt build --frozen" "-e" "CARGO_TARGET_DIR=/target" "-e" "CARGO_INCREMENTAL=0" "-e" "RUST_BACKTRACE=full" "-e" "RUSTFLAGS=--cap-lints=forbid" "-e" "CARGO_HOME=/cargo-home" "-e" "RUSTUP_HOME=/rustup-home" "-m" "1536M" "--network" "none" "crater"`
Dec 15 05:29:21.840 INFO [stdout] a54c17a5d40ae9ca5dee7d7b402ee7699d1f7b363f95be1955d5965b3cff93a2
Dec 15 05:29:21.842 INFO running `"docker" "start" "-a" "a54c17a5d40ae9ca5dee7d7b402ee7699d1f7b363f95be1955d5965b3cff93a2"`
Dec 15 05:29:22.407 INFO [stderr] usermod: no changes
Dec 15 05:29:22.439 INFO [stderr] Compiling luthor v0.2.0 (/source)
Dec 15 05:29:23.388 INFO [stderr] Finished dev [unoptimized + debuginfo] target(s) in 0.96s
Dec 15 05:29:23.390 INFO [stderr] su: No module specific data is present
Dec 15 05:29:23.859 INFO running `"docker" "inspect" "a54c17a5d40ae9ca5dee7d7b402ee7699d1f7b363f95be1955d5965b3cff93a2"`
Dec 15 05:29:24.031 INFO running `"docker" "rm" "-f" "a54c17a5d40ae9ca5dee7d7b402ee7699d1f7b363f95be1955d5965b3cff93a2"`
Dec 15 05:29:24.143 INFO [stdout] a54c17a5d40ae9ca5dee7d7b402ee7699d1f7b363f95be1955d5965b3cff93a2
Dec 15 05:29:24.145 INFO running `"docker" "create" "-v" "/mnt/big/crater/work/local/target-dirs/pr-56550/worker-3/try#aa49d8ef14939ddec0e34b346b60174a5673d48f:/target:rw,Z" "-v" "/mnt/big/crater/work/local/test-source/worker-3/pr-56550/try#aa49d8ef14939ddec0e34b346b60174a5673d48f:/source:ro,Z" "-v" "/mnt/big/crater/work/local/cargo-home:/cargo-home:ro,Z" "-v" "/mnt/big/crater/work/local/rustup-home:/rustup-home:ro,Z" "-e" "USER_ID=1000" "-e" "SOURCE_DIR=/source" "-e" "USER_ID=1000" "-e" "CMD=cargo +aa49d8ef14939ddec0e34b346b60174a5673d48f-alt test --frozen --no-run" "-e" "CARGO_TARGET_DIR=/target" "-e" "CARGO_INCREMENTAL=0" "-e" "RUST_BACKTRACE=full" "-e" "RUSTFLAGS=--cap-lints=forbid" "-e" "CARGO_HOME=/cargo-home" "-e" "RUSTUP_HOME=/rustup-home" "-m" "1536M" "--network" "none" "crater"`
Dec 15 05:29:24.304 INFO [stdout] fcbaefb3443ab4d0bb736001dcbf018c9d2b05a363a10bc214425e356bc4d7a6
Dec 15 05:29:24.308 INFO running `"docker" "start" "-a" "fcbaefb3443ab4d0bb736001dcbf018c9d2b05a363a10bc214425e356bc4d7a6"`
Dec 15 05:29:24.763 INFO [stderr] usermod: no changes
Dec 15 05:29:24.795 INFO [stderr] Compiling luthor v0.2.0 (/source)
Dec 15 05:29:26.336 INFO [stderr] Finished dev [unoptimized + debuginfo] target(s) in 1.52s
Dec 15 05:29:26.336 INFO [stderr] su: No module specific data is present
Dec 15 05:29:26.589 INFO running `"docker" "inspect" "fcbaefb3443ab4d0bb736001dcbf018c9d2b05a363a10bc214425e356bc4d7a6"`
Dec 15 05:29:26.744 INFO running `"docker" "rm" "-f" "fcbaefb3443ab4d0bb736001dcbf018c9d2b05a363a10bc214425e356bc4d7a6"`
Dec 15 05:29:26.969 INFO [stdout] fcbaefb3443ab4d0bb736001dcbf018c9d2b05a363a10bc214425e356bc4d7a6
Dec 15 05:29:26.973 INFO running `"docker" "create" "-v" "/mnt/big/crater/work/local/target-dirs/pr-56550/worker-3/try#aa49d8ef14939ddec0e34b346b60174a5673d48f:/target:rw,Z" "-v" "/mnt/big/crater/work/local/test-source/worker-3/pr-56550/try#aa49d8ef14939ddec0e34b346b60174a5673d48f:/source:ro,Z" "-v" "/mnt/big/crater/work/local/cargo-home:/cargo-home:ro,Z" "-v" "/mnt/big/crater/work/local/rustup-home:/rustup-home:ro,Z" "-e" "USER_ID=1000" "-e" "SOURCE_DIR=/source" "-e" "USER_ID=1000" "-e" "CMD=cargo +aa49d8ef14939ddec0e34b346b60174a5673d48f-alt test --frozen" "-e" "CARGO_TARGET_DIR=/target" "-e" "CARGO_INCREMENTAL=0" "-e" "RUST_BACKTRACE=full" "-e" "RUSTFLAGS=--cap-lints=forbid" "-e" "CARGO_HOME=/cargo-home" "-e" "RUSTUP_HOME=/rustup-home" "-m" "1536M" "--network" "none" "crater"`
Dec 15 05:29:27.271 INFO [stdout] 4f785d92c88f72dbf2e3ae316f1d15767b6782e115337864b2fa6e050076488e
Dec 15 05:29:27.271 INFO running `"docker" "start" "-a" "4f785d92c88f72dbf2e3ae316f1d15767b6782e115337864b2fa6e050076488e"`
Dec 15 05:29:27.824 INFO [stderr] usermod: no changes
Dec 15 05:29:27.867 INFO [stderr] Finished dev [unoptimized + debuginfo] target(s) in 0.01s
Dec 15 05:29:27.870 INFO [stderr] Running /target/debug/deps/luthor-33d221237ba48ff5
Dec 15 05:29:27.872 INFO [stdout]
Dec 15 05:29:27.872 INFO [stdout] running 22 tests
Dec 15 05:29:27.872 INFO [stdout] test lexers::json::tests::it_can_handle_utf8_data ... ok
Dec 15 05:29:27.873 INFO [stdout] test lexers::default::tests::it_works ... ok
Dec 15 05:29:27.873 INFO [stdout] test lexers::json::tests::it_works ... ok
Dec 15 05:29:27.874 INFO [stdout] test lexers::json::tests::it_can_handle_open_strings ... ok
Dec 15 05:29:27.874 INFO [stdout] test lexers::json::tests::it_can_handle_garbage ... ok
Dec 15 05:29:27.874 INFO [stdout] test lexers::ruby::tests::it_identifies_integers_and_operators ... ok
Dec 15 05:29:27.874 INFO [stdout] test lexers::xml::tests::it_can_handle_garbage ... ok
Dec 15 05:29:27.874 INFO [stdout] test lexers::rust::tests::it_works ... ok
Dec 15 05:29:27.874 INFO [stdout] test lexers::ruby::tests::it_works ... ok
Dec 15 05:29:27.874 INFO [stdout] test tokenizer::tests::tokenize_does_nothing_if_range_is_empty ... ok
Dec 15 05:29:27.874 INFO [stdout] test tokenizer::tests::tokenize_next_takes_at_most_what_is_left ... ok
Dec 15 05:29:27.874 INFO [stdout] test tokenizer::tests::tokenize_next_tokenizes_next_x_chars ... ok
Dec 15 05:29:27.874 INFO [stdout] test lexers::xml::tests::it_can_handle_utf8_data ... ok
Dec 15 05:29:27.874 INFO [stdout] test tokenizer::tests::tokenize_next_tokenizes_previous_data_as_text ... ok
Dec 15 05:29:27.874 INFO [stdout] test lexers::xml::tests::it_can_handle_open_strings ... ok
Dec 15 05:29:27.874 INFO [stdout] test tokenizer::tests::consume_whitespace_handles_preexisting_noncategorized_chars ... ok
Dec 15 05:29:27.874 INFO [stdout] test tokenizer::tests::current_char_returns_the_char_at_head ... ok
Dec 15 05:29:27.874 INFO [stdout] test lexers::xml::tests::it_works ... ok
Dec 15 05:29:27.874 INFO [stdout] test tokenizer::tests::current_char_returns_none_if_at_the_end ... ok
Dec 15 05:29:27.874 INFO [stdout] test tokenizer::tests::tokenize_creates_the_correct_token ... ok
Dec 15 05:29:27.887 INFO [stdout] test tokenizer::tests::tokens_returns_unprocessed_data_as_text_token ... ok
Dec 15 05:29:27.887 INFO [stderr] Doc-tests luthor
Dec 15 05:29:27.887 INFO [stdout] test tokenizer::tests::tokens_joins_advanced_data_with_unprocessed_data_as_text_token ... ok
Dec 15 05:29:27.887 INFO [stdout]
Dec 15 05:29:27.887 INFO [stdout] test result: ok. 22 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out
Dec 15 05:29:27.887 INFO [stdout]
Dec 15 05:29:28.715 INFO [stdout]
Dec 15 05:29:28.715 INFO [stdout] running 10 tests
Dec 15 05:29:30.911 INFO [stdout] test src/tokenizer.rs - tokenizer::Tokenizer<'a>::current_char (line 115) ... ok
Dec 15 05:29:32.259 INFO [stdout] test src/tokenizer.rs - tokenizer::Tokenizer<'a>::starts_with_lexeme (line 191) ... ok
Dec 15 05:29:33.751 INFO [stdout] test src/tokenizer.rs - tokenizer::Tokenizer<'a>::advance (line 90) ... ok
Dec 15 05:29:33.873 INFO [stdout] test src/tokenizer.rs - tokenizer::Tokenizer<'a>::consume_whitespace (line 307) ... ok
Dec 15 05:29:34.147 INFO [stdout] test src/tokenizer.rs - tokenizer::Tokenizer<'a>::tokenize (line 238) ... ok
Dec 15 05:29:34.355 INFO [stdout] test src/tokenizer.rs - tokenizer::Tokenizer<'a>::has_prefix (line 163) ... ok
Dec 15 05:29:34.925 INFO [stdout] test src/tokenizer.rs - tokenizer::Tokenizer<'a>::new (line 29) ... ok
Dec 15 05:29:35.119 INFO [stdout] test src/tokenizer.rs - tokenizer::Tokenizer<'a>::next_non_whitespace_char (line 139) ... ok
Dec 15 05:29:35.403 INFO [stdout] test src/tokenizer.rs - tokenizer::Tokenizer<'a>::tokenize_next (line 270) ... ok
Dec 15 05:29:36.254 INFO [stdout] test src/tokenizer.rs - tokenizer::Tokenizer<'a>::tokens (line 48) ... ok
Dec 15 05:29:36.254 INFO [stdout]
Dec 15 05:29:36.254 INFO [stdout] test result: ok. 10 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out
Dec 15 05:29:36.254 INFO [stdout]
Dec 15 05:29:36.291 INFO [stderr] su: No module specific data is present
Dec 15 05:29:36.760 INFO running `"docker" "inspect" "4f785d92c88f72dbf2e3ae316f1d15767b6782e115337864b2fa6e050076488e"`
Dec 15 05:29:37.017 INFO running `"docker" "rm" "-f" "4f785d92c88f72dbf2e3ae316f1d15767b6782e115337864b2fa6e050076488e"`
Dec 15 05:29:37.311 INFO [stdout] 4f785d92c88f72dbf2e3ae316f1d15767b6782e115337864b2fa6e050076488e