[INFO] crate rust_tokenizers 2.0.4 is already in cache
[INFO] testing rust_tokenizers-2.0.4 against master#28742a1146f10a4f09369baad027a464acb7a766 for pr-71274
[INFO] extracting crate rust_tokenizers 2.0.4 into /workspace/builds/worker-0/source
[INFO] validating manifest of crates.io crate rust_tokenizers 2.0.4 on toolchain 28742a1146f10a4f09369baad027a464acb7a766
[INFO] running `"/workspace/cargo-home/bin/cargo" "+28742a1146f10a4f09369baad027a464acb7a766" "read-manifest" "--manifest-path" "Cargo.toml"`
[INFO] started tweaking crates.io crate rust_tokenizers 2.0.4
[INFO] finished tweaking crates.io crate rust_tokenizers 2.0.4
[INFO] tweaked toml for crates.io crate rust_tokenizers 2.0.4 written to /workspace/builds/worker-0/source/Cargo.toml
[INFO] crate crates.io crate rust_tokenizers 2.0.4 already has a lockfile, it will not be regenerated
[INFO] running `"/workspace/cargo-home/bin/cargo" "+28742a1146f10a4f09369baad027a464acb7a766" "fetch" "--locked" "--manifest-path" "Cargo.toml"`
[INFO] [stderr] Blocking waiting for file lock on package cache
[INFO] [stderr] Blocking waiting for file lock on package cache
[INFO] running `"docker" "create" "-v" "/var/lib/crater-agent-workspace/builds/worker-0/target:/opt/rustwide/target:rw,Z" "-v" "/var/lib/crater-agent-workspace/builds/worker-0/source:/opt/rustwide/workdir:ro,Z" "-v" "/var/lib/crater-agent-workspace/cargo-home:/opt/rustwide/cargo-home:ro,Z" "-v" "/var/lib/crater-agent-workspace/rustup-home:/opt/rustwide/rustup-home:ro,Z" "-e" "SOURCE_DIR=/opt/rustwide/workdir" "-e" "MAP_USER_ID=0" "-e" "CARGO_TARGET_DIR=/opt/rustwide/target" "-e" "CARGO_INCREMENTAL=0" "-e" "RUST_BACKTRACE=full" "-e" "RUSTFLAGS=--cap-lints=forbid" "-e" "CARGO_HOME=/opt/rustwide/cargo-home" "-e" "RUSTUP_HOME=/opt/rustwide/rustup-home" "-w" "/opt/rustwide/workdir" "-m" "1610612736" "--network" "none" "rustops/crates-build-env" "/opt/rustwide/cargo-home/bin/cargo" "+28742a1146f10a4f09369baad027a464acb7a766" "build" "--frozen"`
[INFO] [stderr] WARNING: Your kernel does not support swap limit capabilities or the cgroup is not mounted. Memory limited without swap.
[INFO] [stdout] e15d74f733189b8b38748714c29ac75c1703d50cdfcab230016c76a3edaef74f
[INFO] running `"docker" "start" "-a" "e15d74f733189b8b38748714c29ac75c1703d50cdfcab230016c76a3edaef74f"`
[INFO] [stderr] Compiling serde v1.0.105
[INFO] [stderr] Compiling smallvec v1.2.0
[INFO] [stderr] Compiling crossbeam-utils v0.7.2
[INFO] [stderr] Compiling memoffset v0.5.4
[INFO] [stderr] Compiling crossbeam-epoch v0.8.2
[INFO] [stderr] Compiling regex-automata v0.1.9
[INFO] [stderr] Compiling csv-core v0.1.10
[INFO] [stderr] Compiling unicode-normalization v0.1.12
[INFO] [stderr] Compiling crossbeam-queue v0.2.1
[INFO] [stderr] Compiling crossbeam-deque v0.7.3
[INFO] [stderr] Compiling rayon-core v1.7.0
[INFO] [stderr] Compiling rayon v1.3.0
[INFO] [stderr] Compiling bstr v0.2.12
[INFO] [stderr] Compiling serde_json v1.0.49
[INFO] [stderr] Compiling csv v1.1.3
[INFO] [stderr] Compiling rust_tokenizers v2.0.4 (/opt/rustwide/workdir)
[INFO] [stderr] Finished dev [unoptimized + debuginfo] target(s) in 30.34s
[INFO] running `"docker" "inspect" "e15d74f733189b8b38748714c29ac75c1703d50cdfcab230016c76a3edaef74f"`
[INFO] running `"docker" "rm" "-f" "e15d74f733189b8b38748714c29ac75c1703d50cdfcab230016c76a3edaef74f"`
[INFO] [stdout] e15d74f733189b8b38748714c29ac75c1703d50cdfcab230016c76a3edaef74f
[INFO] running `"docker" "create" "-v" "/var/lib/crater-agent-workspace/builds/worker-0/target:/opt/rustwide/target:rw,Z" "-v" "/var/lib/crater-agent-workspace/builds/worker-0/source:/opt/rustwide/workdir:ro,Z" "-v" "/var/lib/crater-agent-workspace/cargo-home:/opt/rustwide/cargo-home:ro,Z" "-v" "/var/lib/crater-agent-workspace/rustup-home:/opt/rustwide/rustup-home:ro,Z" "-e" "SOURCE_DIR=/opt/rustwide/workdir" "-e" "MAP_USER_ID=0" "-e" "CARGO_TARGET_DIR=/opt/rustwide/target" "-e" "CARGO_INCREMENTAL=0" "-e" "RUST_BACKTRACE=full" "-e" "RUSTFLAGS=--cap-lints=forbid" "-e" "CARGO_HOME=/opt/rustwide/cargo-home" "-e" "RUSTUP_HOME=/opt/rustwide/rustup-home" "-w" "/opt/rustwide/workdir" "-m" "1610612736" "--network" "none" "rustops/crates-build-env" "/opt/rustwide/cargo-home/bin/cargo" "+28742a1146f10a4f09369baad027a464acb7a766" "test" "--frozen" "--no-run"`
[INFO] [stderr] WARNING: Your kernel does not support swap limit capabilities or the cgroup is not mounted. Memory limited without swap.
[INFO] [stdout] 74c13a221122bbf78c1a38cc9b70b85accb1095d262e4954b9652b728075bee7
[INFO] running `"docker" "start" "-a" "74c13a221122bbf78c1a38cc9b70b85accb1095d262e4954b9652b728075bee7"`
[INFO] [stderr] Compiling getrandom v0.1.14
[INFO] [stderr] Compiling ppv-lite86 v0.2.6
[INFO] [stderr] Compiling rand_core v0.5.1
[INFO] [stderr] Compiling rand_chacha v0.2.2
[INFO] [stderr] Compiling rand v0.7.3
[INFO] [stderr] Compiling tempfile v3.1.0
[INFO] [stderr] Compiling rust_tokenizers v2.0.4 (/opt/rustwide/workdir)
[INFO] [stderr] Finished test [unoptimized + debuginfo] target(s) in 20.57s
[INFO] running `"docker" "inspect" "74c13a221122bbf78c1a38cc9b70b85accb1095d262e4954b9652b728075bee7"`
[INFO] running `"docker" "rm" "-f" "74c13a221122bbf78c1a38cc9b70b85accb1095d262e4954b9652b728075bee7"`
[INFO] [stdout] 74c13a221122bbf78c1a38cc9b70b85accb1095d262e4954b9652b728075bee7
[INFO] running `"docker" "create" "-v" "/var/lib/crater-agent-workspace/builds/worker-0/target:/opt/rustwide/target:rw,Z" "-v" "/var/lib/crater-agent-workspace/builds/worker-0/source:/opt/rustwide/workdir:ro,Z" "-v" "/var/lib/crater-agent-workspace/cargo-home:/opt/rustwide/cargo-home:ro,Z" "-v" "/var/lib/crater-agent-workspace/rustup-home:/opt/rustwide/rustup-home:ro,Z" "-e" "SOURCE_DIR=/opt/rustwide/workdir" "-e" "MAP_USER_ID=0" "-e" "CARGO_TARGET_DIR=/opt/rustwide/target" "-e" "CARGO_INCREMENTAL=0" "-e" "RUST_BACKTRACE=full" "-e" "RUSTFLAGS=--cap-lints=forbid" "-e" "CARGO_HOME=/opt/rustwide/cargo-home" "-e" "RUSTUP_HOME=/opt/rustwide/rustup-home" "-w" "/opt/rustwide/workdir" "-m" "1610612736" "--network" "none" "rustops/crates-build-env" "/opt/rustwide/cargo-home/bin/cargo" "+28742a1146f10a4f09369baad027a464acb7a766" "test" "--frozen"`
[INFO] [stdout] df5d4f06249eafa06a0cec02899ee1c7a4680e4f360e50d096526ddfae83d809
[INFO] [stderr] WARNING: Your kernel does not support swap limit capabilities or the cgroup is not mounted. Memory limited without swap.
[INFO] running `"docker" "start" "-a" "df5d4f06249eafa06a0cec02899ee1c7a4680e4f360e50d096526ddfae83d809"`
[INFO] [stderr] Finished test [unoptimized + debuginfo] target(s) in 0.19s
[INFO] [stderr] Running /opt/rustwide/target/debug/deps/rust_tokenizers-f26fcb7251539ec9
[INFO] [stdout]
[INFO] [stdout] running 77 tests
[INFO] [stdout] test preprocessing::tokenizer::base_tokenizer::tests::test_convert_tokens_to_ids ... ok
[INFO] [stdout] test preprocessing::tokenizer::base_tokenizer::tests::test_decode_skip_special_tokens ... ok
[INFO] [stdout] test preprocessing::tokenizer::base_tokenizer::tests::test_no_lower_casing ... ok
[INFO] [stdout] test preprocessing::tokenizer::base_tokenizer::tests::test_encode_sentence_pair ... ok
[INFO] [stdout] test preprocessing::tokenizer::base_tokenizer::tests::test_decode ... ok
[INFO] [stdout] test preprocessing::tokenizer::base_tokenizer::tests::test_decode_clean_up_tokenization_spaces ... ok
[INFO] [stdout] test preprocessing::tokenizer::bert_tokenizer::tests::test_bert_tokenizer ... ok
[INFO] [stdout] test preprocessing::tokenizer::base_tokenizer::tests::test_base_tokenizer ... ok
[INFO] [stdout] test preprocessing::tokenizer::base_tokenizer::tests::test_encode_single_sentence ... ok
[INFO] [stdout] test preprocessing::tokenizer::bert_tokenizer::tests::test_decode ... ok
[INFO] [stdout] test preprocessing::tokenizer::bert_tokenizer::tests::test_decode_skip_special_tokens ... ok
[INFO] [stdout] test preprocessing::tokenizer::ctrl_tokenizer::tests::test_ctrl_tokenizer_no_lower_casing ... ok
[INFO] [stdout] test preprocessing::tokenizer::bert_tokenizer::tests::test_encode ... ok
[INFO] [stdout] test preprocessing::tokenizer::ctrl_tokenizer::tests::test_ctrl_tokenizer ... ok
[INFO] [stdout] test preprocessing::tokenizer::bert_tokenizer::tests::test_bert_tokenizer_no_lower_casing ... ok
[INFO] [stdout] test preprocessing::tokenizer::ctrl_tokenizer::tests::test_decode_skip_special_tokens ... ok
[INFO] [stdout] test preprocessing::tokenizer::bert_tokenizer::tests::test_encode_sentence_pair ... ok
[INFO] [stdout] test preprocessing::tokenizer::openai_gpt_tokenizer::tests::test_decode ... ok
[INFO] [stdout] test preprocessing::tokenizer::openai_gpt_tokenizer::tests::test_encode ... ok
[INFO] [stdout] test preprocessing::tokenizer::openai_gpt_tokenizer::tests::test_openai_gpt_tokenizer ... ok
[INFO] [stdout] test preprocessing::tokenizer::openai_gpt_tokenizer::tests::test_openai_gpt_tokenizer_no_lower_casing ... ok
[INFO] [stdout] test preprocessing::tokenizer::ctrl_tokenizer::tests::test_decode ... ok
[INFO] [stdout] test preprocessing::tokenizer::ctrl_tokenizer::tests::test_encode ... ok
[INFO] [stdout] test preprocessing::tokenizer::gpt2_tokenizer::tests::test_decode ... ok
[INFO] [stdout] test preprocessing::tokenizer::tokenization_utils::tests::test_bpe ... ok
[INFO] [stdout] test preprocessing::tokenizer::gpt2_tokenizer::tests::test_encode ... ok
[INFO] [stdout] test preprocessing::tokenizer::tokenization_utils::tests::test_get_pair ... ok
[INFO] [stdout] test preprocessing::tokenizer::gpt2_tokenizer::tests::test_gpt2_tokenizer_no_lower_casing ... ok
[INFO] [stdout] test preprocessing::tokenizer::tokenization_utils::tests::test_clean_text ... ok
[INFO] [stdout] test preprocessing::tokenizer::tokenization_utils::tests::test_group_common_pairs ... ok
[INFO] [stdout] test preprocessing::tokenizer::tokenization_utils::tests::test_is_cjk_char ... ok
[INFO] [stdout] test preprocessing::tokenizer::tokenization_utils::tests::test_is_punctuation ... ok
[INFO] [stdout] test preprocessing::tokenizer::tokenization_utils::tests::test_is_control ... ok
[INFO] [stdout] test preprocessing::tokenizer::tokenization_utils::tests::test_is_whitespace ... ok
[INFO] [stdout] test preprocessing::tokenizer::tokenization_utils::tests::test_split_on_special_tokens ... ok
[INFO] [stdout] test preprocessing::tokenizer::tokenization_utils::tests::test_split_on_punct ... ok
[INFO] [stdout] test preprocessing::tokenizer::tokenization_utils::tests::test_tokenize_cjk_chars ... ok
[INFO] [stdout] test preprocessing::tokenizer::tokenization_utils::tests::test_strip_accents ... ok
[INFO] [stdout] test preprocessing::tokenizer::tokenization_utils::tests::test_truncate_sentence_pair_first_only ... ok
[INFO] [stdout] test preprocessing::tokenizer::roberta_tokenizer::tests::test_decode ... ok
[INFO] [stdout] test preprocessing::tokenizer::tokenization_utils::tests::test_whitespace_tokenize ... ok
[INFO] [stdout] test preprocessing::tokenizer::tokenization_utils::tests::test_truncate_single_sentence ... ok
[INFO] [stdout] test preprocessing::tokenizer::tokenization_utils::tests::test_truncate_sentence_pair_longest_first ... ok
[INFO] [stdout] test preprocessing::tokenizer::tokenization_utils::tests::test_truncate_sentence_pair_second_only ... ok
[INFO] [stdout] test preprocessing::tokenizer::tokenization_utils::tests::test_wordpiece_tokenizer ... ok
[INFO] [stdout] test preprocessing::vocab::base_vocab::tests::test_create_object ... ok
[INFO] [stdout] test preprocessing::tokenizer::roberta_tokenizer::tests::test_roberta_tokenizer_no_lower_casing ... ok
[INFO] [stdout] test preprocessing::vocab::base_vocab::tests::test_encode_tokens ... ok
[INFO] [stdout] test preprocessing::vocab::bert_vocab::tests::test_create_object ... ok
[INFO] [stdout] test preprocessing::tokenizer::gpt2_tokenizer::tests::test_gpt2_tokenizer ... ok
[INFO] [stdout] test preprocessing::vocab::base_vocab::tests::test_create_object_from_file ... ok
[INFO] [stdout] test preprocessing::vocab::bert_vocab::tests::test_create_object_from_file ... ok
[INFO] [stdout] test preprocessing::vocab::base_vocab::tests::test_decode_tokens ... ok
[INFO] [stdout] test preprocessing::vocab::bert_vocab::tests::test_decode_tokens ... ok
[INFO] [stdout] test preprocessing::tokenizer::roberta_tokenizer::tests::test_roberta_tokenizer ... ok
[INFO] [stdout] test preprocessing::tokenizer::roberta_tokenizer::tests::test_encode ... ok
[INFO] [stdout] test preprocessing::vocab::bpe_vocab::tests::test_create_pair_vocab ... ok
[INFO] [stdout] test preprocessing::vocab::gpt2_vocab::tests::test_create_vocab ... ok
[INFO] [stdout] test preprocessing::vocab::bert_vocab::tests::test_encode_tokens ... ok
[INFO] [stdout] test preprocessing::vocab::bpe_vocab::tests::test_create_pair_vocab_from_file ... ok
[INFO] [stdout] test preprocessing::vocab::gpt2_vocab::tests::test_create_object_from_file ... ok
[INFO] [stdout] test preprocessing::vocab::bpe_vocab::tests::test_encode_byte_pairs ... ok
[INFO] [stdout] test preprocessing::vocab::gpt2_vocab::tests::test_encode_tokens ... ok
[INFO] [stdout] test preprocessing::vocab::gpt2_vocab::tests::test_decode_tokens ... ok
[INFO] [stdout] test preprocessing::vocab::openai_gpt_vocab::tests::test_create_vocab ... ok
[INFO] [stdout] test preprocessing::vocab::openai_gpt_vocab::tests::test_create_object_from_file ... ok
[INFO] [stdout] test preprocessing::vocab::openai_gpt_vocab::tests::test_decode_tokens ... ok
[INFO] [stdout] test preprocessing::vocab::roberta_vocab::tests::test_create_vocab ... ok
[INFO] [stdout] test preprocessing::vocab::roberta_vocab::tests::test_decode_tokens ... ok
[INFO] [stdout] test preprocessing::vocab::roberta_vocab::tests::test_encode_tokens ... ok
[INFO] [stdout] test preprocessing::vocab::openai_gpt_vocab::tests::test_encode_tokens ... ok
[INFO] [stdout] test preprocessing::vocab::roberta_vocab::tests::test_create_object_from_file ... ok
[INFO] [stdout] test preprocessing::vocab::bert_vocab::tests::test_create_object_from_file_without_unknown_token ... ok
[INFO] [stdout] test preprocessing::vocab::gpt2_vocab::tests::test_create_object_from_file_without_unknown_token ... ok
[INFO] [stdout] test preprocessing::vocab::openai_gpt_vocab::tests::test_create_object_from_file_without_unknown_token ... ok
[INFO] [stdout] test preprocessing::vocab::roberta_vocab::tests::test_create_object_from_file_without_unknown_token ... ok
[INFO] [stdout] test preprocessing::vocab::base_vocab::tests::test_create_object_from_file_without_unknown_token ... ok
[INFO] [stdout]
[INFO] [stdout] test result: ok. 77 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out
[INFO] [stdout]
[INFO] [stderr] Running /opt/rustwide/target/debug/deps/rust_tokenizers_bin-33f9fa210c0c1221
[INFO] [stdout]
[INFO] [stdout] running 0 tests
[INFO] [stdout]
[INFO] [stdout] test result: ok. 0 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out
[INFO] [stdout]
[INFO] [stderr] Doc-tests rust_tokenizers
[INFO] [stdout]
[INFO] [stdout] running 0 tests
[INFO] [stdout]
[INFO] [stdout] test result: ok. 0 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out
[INFO] [stdout]
[INFO] running `"docker" "inspect" "df5d4f06249eafa06a0cec02899ee1c7a4680e4f360e50d096526ddfae83d809"`
[INFO] running `"docker" "rm" "-f" "df5d4f06249eafa06a0cec02899ee1c7a4680e4f360e50d096526ddfae83d809"`
[INFO] [stdout] df5d4f06249eafa06a0cec02899ee1c7a4680e4f360e50d096526ddfae83d809