[INFO] crate rust_tokenizers 2.0.4 is already in cache
[INFO] testing rust_tokenizers-2.0.4 against try#41997647ba6a77908f6ef64401414feb00bccf16 for pr-71274
[INFO] extracting crate rust_tokenizers 2.0.4 into /workspace/builds/worker-4/source
[INFO] validating manifest of crates.io crate rust_tokenizers 2.0.4 on toolchain 41997647ba6a77908f6ef64401414feb00bccf16
[INFO] running `"/workspace/cargo-home/bin/cargo" "+41997647ba6a77908f6ef64401414feb00bccf16" "read-manifest" "--manifest-path" "Cargo.toml"`
[INFO] started tweaking crates.io crate rust_tokenizers 2.0.4
[INFO] finished tweaking crates.io crate rust_tokenizers 2.0.4
[INFO] tweaked toml for crates.io crate rust_tokenizers 2.0.4 written to /workspace/builds/worker-4/source/Cargo.toml
[INFO] crate crates.io crate rust_tokenizers 2.0.4 already has a lockfile, it will not be regenerated
[INFO] running `"/workspace/cargo-home/bin/cargo" "+41997647ba6a77908f6ef64401414feb00bccf16" "fetch" "--locked" "--manifest-path" "Cargo.toml"`
[INFO] running `"docker" "create" "-v" "/var/lib/crater-agent-workspace/builds/worker-4/target:/opt/rustwide/target:rw,Z" "-v" "/var/lib/crater-agent-workspace/builds/worker-4/source:/opt/rustwide/workdir:ro,Z" "-v" "/var/lib/crater-agent-workspace/cargo-home:/opt/rustwide/cargo-home:ro,Z" "-v" "/var/lib/crater-agent-workspace/rustup-home:/opt/rustwide/rustup-home:ro,Z" "-e" "SOURCE_DIR=/opt/rustwide/workdir" "-e" "MAP_USER_ID=0" "-e" "CARGO_TARGET_DIR=/opt/rustwide/target" "-e" "CARGO_INCREMENTAL=0" "-e" "RUST_BACKTRACE=full" "-e" "RUSTFLAGS=--cap-lints=forbid" "-e" "CARGO_HOME=/opt/rustwide/cargo-home" "-e" "RUSTUP_HOME=/opt/rustwide/rustup-home" "-w" "/opt/rustwide/workdir" "-m" "1610612736" "--network" "none" "rustops/crates-build-env" "/opt/rustwide/cargo-home/bin/cargo" "+41997647ba6a77908f6ef64401414feb00bccf16" "build" "--frozen"`
[INFO] [stderr] WARNING: Your kernel does not support swap limit capabilities or the cgroup is not mounted. Memory limited without swap.
[INFO] [stdout] 2701929c8f640e6eab4061343510bfaf4c13dd0d71cc8ec366fe4ba746be71da
[INFO] running `"docker" "start" "-a" "2701929c8f640e6eab4061343510bfaf4c13dd0d71cc8ec366fe4ba746be71da"`
[INFO] [stderr] Compiling byteorder v1.3.4
[INFO] [stderr] Compiling itertools v0.8.2
[INFO] [stderr] Compiling csv-core v0.1.10
[INFO] [stderr] Compiling crossbeam-utils v0.7.2
[INFO] [stderr] Compiling serde_json v1.0.49
[INFO] [stderr] Compiling crossbeam-epoch v0.8.2
[INFO] [stderr] Compiling crossbeam-queue v0.2.1
[INFO] [stderr] Compiling crossbeam-deque v0.7.3
[INFO] [stderr] Compiling regex-automata v0.1.9
[INFO] [stderr] Compiling rayon-core v1.7.0
[INFO] [stderr] Compiling bstr v0.2.12
[INFO] [stderr] Compiling rayon v1.3.0
[INFO] [stderr] Compiling csv v1.1.3
[INFO] [stderr] Compiling rust_tokenizers v2.0.4 (/opt/rustwide/workdir)
[INFO] [stderr] Finished dev [unoptimized + debuginfo] target(s) in 23.47s
[INFO] running `"docker" "inspect" "2701929c8f640e6eab4061343510bfaf4c13dd0d71cc8ec366fe4ba746be71da"`
[INFO] running `"docker" "rm" "-f" "2701929c8f640e6eab4061343510bfaf4c13dd0d71cc8ec366fe4ba746be71da"`
[INFO] [stdout] 2701929c8f640e6eab4061343510bfaf4c13dd0d71cc8ec366fe4ba746be71da
[INFO] running `"docker" "create" "-v" "/var/lib/crater-agent-workspace/builds/worker-4/target:/opt/rustwide/target:rw,Z" "-v" "/var/lib/crater-agent-workspace/builds/worker-4/source:/opt/rustwide/workdir:ro,Z" "-v" "/var/lib/crater-agent-workspace/cargo-home:/opt/rustwide/cargo-home:ro,Z" "-v" "/var/lib/crater-agent-workspace/rustup-home:/opt/rustwide/rustup-home:ro,Z" "-e" "SOURCE_DIR=/opt/rustwide/workdir" "-e" "MAP_USER_ID=0" "-e" "CARGO_TARGET_DIR=/opt/rustwide/target" "-e" "CARGO_INCREMENTAL=0" "-e" "RUST_BACKTRACE=full" "-e" "RUSTFLAGS=--cap-lints=forbid" "-e" "CARGO_HOME=/opt/rustwide/cargo-home" "-e" "RUSTUP_HOME=/opt/rustwide/rustup-home" "-w" "/opt/rustwide/workdir" "-m" "1610612736" "--network" "none" "rustops/crates-build-env" "/opt/rustwide/cargo-home/bin/cargo" "+41997647ba6a77908f6ef64401414feb00bccf16" "test" "--frozen" "--no-run"`
[INFO] [stderr] WARNING: Your kernel does not support swap limit capabilities or the cgroup is not mounted. Memory limited without swap.
[INFO] [stdout] cebf3f5ab2df1675e66c7d5654b477513ab609a77bc29bf860c09a6131892702
[INFO] running `"docker" "start" "-a" "cebf3f5ab2df1675e66c7d5654b477513ab609a77bc29bf860c09a6131892702"`
[INFO] [stderr] Compiling getrandom v0.1.14
[INFO] [stderr] Compiling rand_core v0.5.1
[INFO] [stderr] Compiling rand_chacha v0.2.2
[INFO] [stderr] Compiling rand v0.7.3
[INFO] [stderr] Compiling tempfile v3.1.0
[INFO] [stderr] Compiling rust_tokenizers v2.0.4 (/opt/rustwide/workdir)
[INFO] [stderr] Finished test [unoptimized + debuginfo] target(s) in 18.97s
[INFO] running `"docker" "inspect" "cebf3f5ab2df1675e66c7d5654b477513ab609a77bc29bf860c09a6131892702"`
[INFO] running `"docker" "rm" "-f" "cebf3f5ab2df1675e66c7d5654b477513ab609a77bc29bf860c09a6131892702"`
[INFO] [stdout] cebf3f5ab2df1675e66c7d5654b477513ab609a77bc29bf860c09a6131892702
[INFO] running `"docker" "create" "-v" "/var/lib/crater-agent-workspace/builds/worker-4/target:/opt/rustwide/target:rw,Z" "-v" "/var/lib/crater-agent-workspace/builds/worker-4/source:/opt/rustwide/workdir:ro,Z" "-v" "/var/lib/crater-agent-workspace/cargo-home:/opt/rustwide/cargo-home:ro,Z" "-v" "/var/lib/crater-agent-workspace/rustup-home:/opt/rustwide/rustup-home:ro,Z" "-e" "SOURCE_DIR=/opt/rustwide/workdir" "-e" "MAP_USER_ID=0" "-e" "CARGO_TARGET_DIR=/opt/rustwide/target" "-e" "CARGO_INCREMENTAL=0" "-e" "RUST_BACKTRACE=full" "-e" "RUSTFLAGS=--cap-lints=forbid" "-e" "CARGO_HOME=/opt/rustwide/cargo-home" "-e" "RUSTUP_HOME=/opt/rustwide/rustup-home" "-w" "/opt/rustwide/workdir" "-m" "1610612736" "--network" "none" "rustops/crates-build-env" "/opt/rustwide/cargo-home/bin/cargo" "+41997647ba6a77908f6ef64401414feb00bccf16" "test" "--frozen"`
[INFO] [stderr] WARNING: Your kernel does not support swap limit capabilities or the cgroup is not mounted. Memory limited without swap.
[INFO] [stdout] c9533dfbe8b4cfbcaea91dc801f05b15c9162f4e5e008653523f210d40ade224
[INFO] running `"docker" "start" "-a" "c9533dfbe8b4cfbcaea91dc801f05b15c9162f4e5e008653523f210d40ade224"`
[INFO] [stderr] Finished test [unoptimized + debuginfo] target(s) in 0.23s
[INFO] [stderr] Running /opt/rustwide/target/debug/deps/rust_tokenizers-f26fcb7251539ec9
[INFO] [stdout]
[INFO] [stdout] running 77 tests
[INFO] [stdout] test preprocessing::tokenizer::base_tokenizer::tests::test_convert_tokens_to_ids ... ok
[INFO] [stdout] test preprocessing::tokenizer::base_tokenizer::tests::test_decode ... ok
[INFO] [stdout] test preprocessing::tokenizer::base_tokenizer::tests::test_decode_skip_special_tokens ... ok
[INFO] [stdout] test preprocessing::tokenizer::base_tokenizer::tests::test_decode_clean_up_tokenization_spaces ... ok
[INFO] [stdout] test preprocessing::tokenizer::base_tokenizer::tests::test_encode_sentence_pair ... ok
[INFO] [stdout] test preprocessing::tokenizer::base_tokenizer::tests::test_encode_single_sentence ... ok
[INFO] [stdout] test preprocessing::tokenizer::base_tokenizer::tests::test_no_lower_casing ... ok
[INFO] [stdout] test preprocessing::tokenizer::bert_tokenizer::tests::test_decode ... ok
[INFO] [stdout] test preprocessing::tokenizer::base_tokenizer::tests::test_base_tokenizer ... ok
[INFO] [stdout] test preprocessing::tokenizer::bert_tokenizer::tests::test_bert_tokenizer_no_lower_casing ... ok
[INFO] [stdout] test preprocessing::tokenizer::bert_tokenizer::tests::test_decode_skip_special_tokens ... ok
[INFO] [stdout] test preprocessing::tokenizer::bert_tokenizer::tests::test_encode ... ok
[INFO] [stdout] test preprocessing::tokenizer::ctrl_tokenizer::tests::test_decode_skip_special_tokens ... ok
[INFO] [stdout] test preprocessing::tokenizer::ctrl_tokenizer::tests::test_decode ... ok
[INFO] [stdout] test preprocessing::tokenizer::bert_tokenizer::tests::test_bert_tokenizer ... ok
[INFO] [stdout] test preprocessing::tokenizer::ctrl_tokenizer::tests::test_ctrl_tokenizer_no_lower_casing ... ok
[INFO] [stdout] test preprocessing::tokenizer::openai_gpt_tokenizer::tests::test_decode ... ok
[INFO] [stdout] test preprocessing::tokenizer::openai_gpt_tokenizer::tests::test_encode ... ok
[INFO] [stdout] test preprocessing::tokenizer::openai_gpt_tokenizer::tests::test_openai_gpt_tokenizer ... ok
[INFO] [stdout] test preprocessing::tokenizer::ctrl_tokenizer::tests::test_ctrl_tokenizer ... ok
[INFO] [stdout] test preprocessing::tokenizer::bert_tokenizer::tests::test_encode_sentence_pair ... ok
[INFO] [stdout] test preprocessing::tokenizer::openai_gpt_tokenizer::tests::test_openai_gpt_tokenizer_no_lower_casing ... ok
[INFO] [stdout] test preprocessing::tokenizer::ctrl_tokenizer::tests::test_encode ... ok
[INFO] [stdout] test preprocessing::tokenizer::gpt2_tokenizer::tests::test_decode ... ok
[INFO] [stdout] test preprocessing::tokenizer::roberta_tokenizer::tests::test_decode ... ok
[INFO] [stdout] test preprocessing::tokenizer::gpt2_tokenizer::tests::test_encode ... ok
[INFO] [stdout] test preprocessing::tokenizer::tokenization_utils::tests::test_get_pair ... ok
[INFO] [stdout] test preprocessing::tokenizer::tokenization_utils::tests::test_bpe ... ok
[INFO] [stdout] test preprocessing::tokenizer::tokenization_utils::tests::test_is_cjk_char ... ok
[INFO] [stdout] test preprocessing::tokenizer::tokenization_utils::tests::test_clean_text ... ok
[INFO] [stdout] test preprocessing::tokenizer::gpt2_tokenizer::tests::test_gpt2_tokenizer_no_lower_casing ... ok
[INFO] [stdout] test preprocessing::tokenizer::tokenization_utils::tests::test_group_common_pairs ... ok
[INFO] [stdout] test preprocessing::tokenizer::tokenization_utils::tests::test_is_whitespace ... ok
[INFO] [stdout] test preprocessing::tokenizer::gpt2_tokenizer::tests::test_gpt2_tokenizer ... ok
[INFO] [stdout] test preprocessing::tokenizer::tokenization_utils::tests::test_is_control ... ok
[INFO] [stdout] test preprocessing::tokenizer::roberta_tokenizer::tests::test_encode ... ok
[INFO] [stdout] test preprocessing::tokenizer::roberta_tokenizer::tests::test_roberta_tokenizer ... ok
[INFO] [stdout] test preprocessing::tokenizer::tokenization_utils::tests::test_truncate_sentence_pair_longest_first ... ok
[INFO] [stdout] test preprocessing::tokenizer::tokenization_utils::tests::test_split_on_punct ... ok
[INFO] [stdout] test preprocessing::tokenizer::tokenization_utils::tests::test_tokenize_cjk_chars ... ok
[INFO] [stdout] test preprocessing::tokenizer::tokenization_utils::tests::test_truncate_single_sentence ... ok
[INFO] [stdout] test preprocessing::tokenizer::tokenization_utils::tests::test_strip_accents ... ok
[INFO] [stdout] test preprocessing::tokenizer::tokenization_utils::tests::test_is_punctuation ... ok
[INFO] [stdout] test preprocessing::tokenizer::tokenization_utils::tests::test_whitespace_tokenize ... ok
[INFO] [stdout] test preprocessing::tokenizer::tokenization_utils::tests::test_wordpiece_tokenizer ... ok
[INFO] [stdout] test preprocessing::tokenizer::tokenization_utils::tests::test_split_on_special_tokens ... ok
[INFO] [stdout] test preprocessing::tokenizer::tokenization_utils::tests::test_truncate_sentence_pair_second_only ... ok
[INFO] [stdout] test preprocessing::tokenizer::tokenization_utils::tests::test_truncate_sentence_pair_first_only ... ok
[INFO] [stdout] test preprocessing::vocab::base_vocab::tests::test_encode_tokens ... ok
[INFO] [stdout] test preprocessing::vocab::bert_vocab::tests::test_create_object ... ok
[INFO] [stdout] test preprocessing::tokenizer::roberta_tokenizer::tests::test_roberta_tokenizer_no_lower_casing ... ok
[INFO] [stdout] test preprocessing::vocab::base_vocab::tests::test_create_object ... ok
[INFO] [stdout] test preprocessing::vocab::bert_vocab::tests::test_encode_tokens ... ok
[INFO] [stdout] test preprocessing::vocab::base_vocab::tests::test_create_object_from_file ... ok
[INFO] [stdout] test preprocessing::vocab::bpe_vocab::tests::test_create_pair_vocab ... ok
[INFO] [stdout] test preprocessing::vocab::base_vocab::tests::test_decode_tokens ... ok
[INFO] [stdout] test preprocessing::vocab::bert_vocab::tests::test_decode_tokens ... ok
[INFO] [stdout] test preprocessing::vocab::bpe_vocab::tests::test_create_pair_vocab_from_file ... ok
[INFO] [stdout] test preprocessing::vocab::bpe_vocab::tests::test_encode_byte_pairs ... ok
[INFO] [stdout] test preprocessing::vocab::gpt2_vocab::tests::test_create_object_from_file ... ok
[INFO] [stdout] test preprocessing::vocab::gpt2_vocab::tests::test_encode_tokens ... ok
[INFO] [stdout] test preprocessing::vocab::bert_vocab::tests::test_create_object_from_file ... ok
[INFO] [stdout] test preprocessing::vocab::openai_gpt_vocab::tests::test_create_vocab ... ok
[INFO] [stdout] test preprocessing::vocab::openai_gpt_vocab::tests::test_create_object_from_file ... ok
[INFO] [stdout] test preprocessing::vocab::gpt2_vocab::tests::test_create_vocab ... ok
[INFO] [stdout] test preprocessing::vocab::gpt2_vocab::tests::test_decode_tokens ... ok
[INFO] [stdout] test preprocessing::vocab::roberta_vocab::tests::test_create_object_from_file ... ok
[INFO] [stdout] test preprocessing::vocab::roberta_vocab::tests::test_create_vocab ... ok
[INFO] [stdout] test preprocessing::vocab::roberta_vocab::tests::test_decode_tokens ... ok
[INFO] [stdout] test preprocessing::vocab::openai_gpt_vocab::tests::test_decode_tokens ... ok
[INFO] [stdout] test preprocessing::vocab::roberta_vocab::tests::test_encode_tokens ... ok
[INFO] [stdout] test preprocessing::vocab::openai_gpt_vocab::tests::test_encode_tokens ... ok
[INFO] [stdout] test preprocessing::vocab::base_vocab::tests::test_create_object_from_file_without_unknown_token ... ok
[INFO] [stdout] test preprocessing::vocab::gpt2_vocab::tests::test_create_object_from_file_without_unknown_token ... ok
[INFO] [stdout] test preprocessing::vocab::bert_vocab::tests::test_create_object_from_file_without_unknown_token ... ok
[INFO] [stdout] test preprocessing::vocab::roberta_vocab::tests::test_create_object_from_file_without_unknown_token ... ok
[INFO] [stdout] test preprocessing::vocab::openai_gpt_vocab::tests::test_create_object_from_file_without_unknown_token ... ok
[INFO] [stdout]
[INFO] [stdout] test result: ok. 77 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out
[INFO] [stdout]
[INFO] [stderr] Running /opt/rustwide/target/debug/deps/rust_tokenizers_bin-33f9fa210c0c1221
[INFO] [stdout]
[INFO] [stdout] running 0 tests
[INFO] [stdout]
[INFO] [stdout] test result: ok. 0 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out
[INFO] [stdout]
[INFO] [stderr] Doc-tests rust_tokenizers
[INFO] [stdout]
[INFO] [stdout] running 0 tests
[INFO] [stdout]
[INFO] [stdout] test result: ok. 0 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out
[INFO] [stdout]
[INFO] running `"docker" "inspect" "c9533dfbe8b4cfbcaea91dc801f05b15c9162f4e5e008653523f210d40ade224"`
[INFO] running `"docker" "rm" "-f" "c9533dfbe8b4cfbcaea91dc801f05b15c9162f4e5e008653523f210d40ade224"`
[INFO] [stdout] c9533dfbe8b4cfbcaea91dc801f05b15c9162f4e5e008653523f210d40ade224
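
Taken together, the run above amounts to: fetch dependencies against the crate's existing lockfile, then build and test inside a container with the source mounted read-only, a 1610612736-byte (1.5 GiB) memory cap, and networking disabled. The sketch below approximates a similar sandboxed check locally; it is an assumption-laden stand-in, not crater's actual invocation: `rust:latest` replaces the `rustops/crates-build-env` image plus the mounted cargo/rustup homes, `/usr/local/cargo/registry` is the registry location inside the official Rust image, the `$SRC`/`$TARGET`/`$REGISTRY` paths are hypothetical placeholders, and the image's default toolchain is used because the try toolchain 41997647ba6a77908f6ef64401414feb00bccf16 exists only in crater's workspace.

    # Hypothetical local paths: a checkout of rust_tokenizers 2.0.4 and scratch dirs.
    SRC=$PWD/rust_tokenizers-2.0.4
    TARGET=$PWD/crater-target
    REGISTRY=$PWD/crater-registry
    # Step 1: pre-fetch dependencies with network access, mirroring the
    # `cargo fetch --locked` step the log runs before creating the container.
    docker run --rm \
      -v "$SRC":/opt/rustwide/workdir:ro \
      -v "$REGISTRY":/usr/local/cargo/registry \
      -w /opt/rustwide/workdir \
      rust:latest cargo fetch --locked
    # Step 2: run the tests offline with the same environment and limits the
    # `docker create` lines show: read-only source, CARGO_INCREMENTAL=0,
    # RUSTFLAGS=--cap-lints=forbid, a memory cap, and no network.
    docker run --rm \
      -v "$SRC":/opt/rustwide/workdir:ro \
      -v "$TARGET":/opt/rustwide/target \
      -v "$REGISTRY":/usr/local/cargo/registry \
      -e CARGO_TARGET_DIR=/opt/rustwide/target \
      -e CARGO_INCREMENTAL=0 \
      -e RUST_BACKTRACE=full \
      -e RUSTFLAGS=--cap-lints=forbid \
      -w /opt/rustwide/workdir \
      -m 1610612736 --network none \
      rust:latest cargo test --frozen

Splitting fetch from build is what lets the second container run with `--network none` yet still satisfy `--frozen`: the shared registry volume plays the role of the read-only cargo-home mount in the log.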