[INFO] cloning repository https://github.com/enricozanardo/wall-e [INFO] running `Command { std: "git" "-c" "credential.helper=" "-c" "credential.helper=/workspace/cargo-home/bin/git-credential-null" "clone" "--bare" "https://github.com/enricozanardo/wall-e" "/workspace/cache/git-repos/https%3A%2F%2Fgithub.com%2Fenricozanardo%2Fwall-e", kill_on_drop: false }` [INFO] [stderr] Cloning into bare repository '/workspace/cache/git-repos/https%3A%2F%2Fgithub.com%2Fenricozanardo%2Fwall-e'... [INFO] running `Command { std: "git" "rev-parse" "HEAD", kill_on_drop: false }` [INFO] [stdout] 630da121b1d60a7c51b25053ee0d7523a80a9b0b [INFO] testing enricozanardo/wall-e against master#ad85bc524b1ad696e42061ad8338d382dffbdbe5 for pr-146237 [INFO] running `Command { std: "git" "clone" "/workspace/cache/git-repos/https%3A%2F%2Fgithub.com%2Fenricozanardo%2Fwall-e" "/workspace/builds/worker-4-tc1/source", kill_on_drop: false }` [INFO] [stderr] Cloning into '/workspace/builds/worker-4-tc1/source'... [INFO] [stderr] done. [INFO] [stderr] Updating files: 
99% (179/180) Updating files: 100% (180/180) Updating files: 100% (180/180), done. [INFO] started tweaking git repo https://github.com/enricozanardo/wall-e [INFO] removed 0 missing examples [INFO] finished tweaking git repo https://github.com/enricozanardo/wall-e [INFO] tweaked toml for git repo https://github.com/enricozanardo/wall-e written to /workspace/builds/worker-4-tc1/source/Cargo.toml [INFO] validating manifest of git repo https://github.com/enricozanardo/wall-e on toolchain ad85bc524b1ad696e42061ad8338d382dffbdbe5 [INFO] running `Command { std: CARGO_HOME="/workspace/cargo-home" RUSTUP_HOME="/workspace/rustup-home" "/workspace/cargo-home/bin/cargo" "+ad85bc524b1ad696e42061ad8338d382dffbdbe5" "metadata" "--manifest-path" "Cargo.toml" "--no-deps", kill_on_drop: false }` [INFO] crate git repo https://github.com/enricozanardo/wall-e already has a lockfile, it will not be regenerated [INFO] running `Command { std: CARGO_HOME="/workspace/cargo-home" RUSTUP_HOME="/workspace/rustup-home" "/workspace/cargo-home/bin/cargo" "+ad85bc524b1ad696e42061ad8338d382dffbdbe5" "fetch" "--manifest-path" "Cargo.toml", kill_on_drop: false }` [INFO] [stderr] Updating crates.io index [INFO] [stderr] Downloading crates ... [INFO] [stderr] Downloaded crossbeam-deque v0.8.6 [INFO] [stderr] Downloaded clap_lex v0.7.4 [INFO] [stderr] Downloaded number_prefix v0.4.0 [INFO] [stderr] Downloaded anstyle-query v1.1.2 [INFO] [stderr] Downloaded clap_derive v4.5.32 [INFO] [stderr] Downloaded anstream v0.6.18 [INFO] [stderr] Downloaded bytemuck v1.22.0 [INFO] [stderr] Downloaded clap v4.5.37 [INFO] [stderr] Downloaded jiff-static v0.2.13 [INFO] [stderr] Downloaded itertools v0.7.11 [INFO] [stderr] Downloaded rayon v1.10.0 [INFO] [stderr] Downloaded libm v0.2.13 [INFO] [stderr] Downloaded cc v1.2.19 [INFO] [stderr] Downloaded clap_builder v4.5.37 [INFO] [stderr] Downloaded portable-atomic v1.11.0 [INFO] [stderr] Downloaded zerocopy v0.8.24 [INFO] [stderr] Downloaded syn v2.0.100 [INFO] [stderr] Downloaded unicode-width v0.2.0 [INFO] [stderr] Downloaded rustix v1.0.5 [INFO] [stderr] Downloaded portable-atomic-util v0.2.4 [INFO] [stderr] Downloaded encode_unicode v1.0.0 [INFO] [stderr] Downloaded memmap2 v0.9.5 [INFO] [stderr] Downloaded matrixmultiply v0.1.15 [INFO] [stderr] Downloaded matrixmultiply v0.3.9 [INFO] [stderr] Downloaded zerocopy-derive v0.8.24 [INFO] [stderr] Downloaded console v0.15.11 [INFO] [stderr] Downloaded indicatif v0.17.11 [INFO] [stderr] Downloaded num-complex v0.2.4 [INFO] [stderr] Downloaded ndarray-parallel v0.9.1 [INFO] [stderr] Downloaded jiff v0.2.13 [INFO] [stderr] Downloaded num-complex v0.4.6 [INFO] [stderr] Downloaded ndarray-rand v0.14.0 [INFO] [stderr] Downloaded rand_distr v0.4.3 [INFO] [stderr] Downloaded rayon-core v1.12.1 [INFO] [stderr] Downloaded env_logger v0.11.8 [INFO] [stderr] Downloaded anstyle-wincon v3.0.7 [INFO] [stderr] Downloaded colorchoice v1.0.3 [INFO] [stderr] Downloaded ndarray v0.12.1 [INFO] [stderr] Downloaded anstyle v1.0.10 [INFO] [stderr] Downloaded env_filter v0.1.3 [INFO] [stderr] Downloaded anstyle-parse v0.2.6 [INFO] [stderr] Downloaded rawpointer v0.1.0 [INFO] running `Command { std: "docker" "create" "-v" "/var/lib/crater-agent-workspace/builds/worker-4-tc1/target:/opt/rustwide/target:rw,Z" "-v" "/var/lib/crater-agent-workspace/builds/worker-4-tc1/source:/opt/rustwide/workdir:ro,Z" "-v" "/var/lib/crater-agent-workspace/cargo-home:/opt/rustwide/cargo-home:ro,Z" "-v" "/var/lib/crater-agent-workspace/rustup-home:/opt/rustwide/rustup-home:ro,Z" "-e" 
"SOURCE_DIR=/opt/rustwide/workdir" "-e" "CARGO_TARGET_DIR=/opt/rustwide/target" "-e" "CARGO_HOME=/opt/rustwide/cargo-home" "-e" "RUSTUP_HOME=/opt/rustwide/rustup-home" "-w" "/opt/rustwide/workdir" "-m" "1610612736" "--user" "0:0" "--network" "none" "ghcr.io/rust-lang/crates-build-env/linux@sha256:e90291280db7d1fac5b66fc6dad9f9662629e7365a55743daf9bdf73ebc4ea79" "/opt/rustwide/cargo-home/bin/cargo" "+ad85bc524b1ad696e42061ad8338d382dffbdbe5" "metadata" "--no-deps" "--format-version=1", kill_on_drop: false }` [INFO] [stdout] 1638951b4190854aa6bb8c7fdabbf120af04ac5e315ba4c87f3b31196b302327 [INFO] running `Command { std: "docker" "start" "-a" "1638951b4190854aa6bb8c7fdabbf120af04ac5e315ba4c87f3b31196b302327", kill_on_drop: false }` [INFO] running `Command { std: "docker" "inspect" "1638951b4190854aa6bb8c7fdabbf120af04ac5e315ba4c87f3b31196b302327", kill_on_drop: false }` [INFO] running `Command { std: "docker" "rm" "-f" "1638951b4190854aa6bb8c7fdabbf120af04ac5e315ba4c87f3b31196b302327", kill_on_drop: false }` [INFO] [stdout] 1638951b4190854aa6bb8c7fdabbf120af04ac5e315ba4c87f3b31196b302327 [INFO] running `Command { std: "docker" "create" "-v" "/var/lib/crater-agent-workspace/builds/worker-4-tc1/target:/opt/rustwide/target:rw,Z" "-v" "/var/lib/crater-agent-workspace/builds/worker-4-tc1/source:/opt/rustwide/workdir:ro,Z" "-v" "/var/lib/crater-agent-workspace/cargo-home:/opt/rustwide/cargo-home:ro,Z" "-v" "/var/lib/crater-agent-workspace/rustup-home:/opt/rustwide/rustup-home:ro,Z" "-e" "SOURCE_DIR=/opt/rustwide/workdir" "-e" "CARGO_TARGET_DIR=/opt/rustwide/target" "-e" "CARGO_INCREMENTAL=0" "-e" "RUST_BACKTRACE=full" "-e" "RUSTFLAGS=--cap-lints=forbid" "-e" "RUSTDOCFLAGS=--cap-lints=forbid" "-e" "CARGO_HOME=/opt/rustwide/cargo-home" "-e" "RUSTUP_HOME=/opt/rustwide/rustup-home" "-w" "/opt/rustwide/workdir" "-m" "1610612736" "--user" "0:0" "--network" "none" "ghcr.io/rust-lang/crates-build-env/linux@sha256:e90291280db7d1fac5b66fc6dad9f9662629e7365a55743daf9bdf73ebc4ea79" "/opt/rustwide/cargo-home/bin/cargo" "+ad85bc524b1ad696e42061ad8338d382dffbdbe5" "build" "--frozen" "--message-format=json", kill_on_drop: false }` [INFO] [stdout] 032fcecaa2eb148f3a994e6be9e2b38e12e8e9f57ff4c4ce896e60804dd7dd64 [INFO] running `Command { std: "docker" "start" "-a" "032fcecaa2eb148f3a994e6be9e2b38e12e8e9f57ff4c4ce896e60804dd7dd64", kill_on_drop: false }` [INFO] [stderr] Compiling proc-macro2 v1.0.95 [INFO] [stderr] Compiling unicode-ident v1.0.18 [INFO] [stderr] Compiling autocfg v1.4.0 [INFO] [stderr] Compiling libc v0.2.172 [INFO] [stderr] Compiling libm v0.2.13 [INFO] [stderr] Compiling crossbeam-utils v0.8.21 [INFO] [stderr] Compiling serde v1.0.219 [INFO] [stderr] Compiling cfg-if v1.0.0 [INFO] [stderr] Compiling icu_locid_transform_data v1.5.1 [INFO] [stderr] Compiling memchr v2.7.4 [INFO] [stderr] Compiling writeable v0.5.5 [INFO] [stderr] Compiling litemap v0.7.5 [INFO] [stderr] Compiling zerocopy v0.8.24 [INFO] [stderr] Compiling icu_properties_data v1.5.1 [INFO] [stderr] Compiling rayon-core v1.12.1 [INFO] [stderr] Compiling either v1.15.0 [INFO] [stderr] Compiling icu_normalizer_data v1.5.1 [INFO] [stderr] Compiling utf8parse v0.2.2 [INFO] [stderr] Compiling is_terminal_polyfill v1.70.1 [INFO] [stderr] Compiling anstyle-parse v0.2.6 [INFO] [stderr] Compiling smallvec v1.15.0 [INFO] [stderr] Compiling anstyle-query v1.1.2 [INFO] [stderr] Compiling matrixmultiply v0.1.15 [INFO] [stderr] Compiling colorchoice v1.0.3 [INFO] [stderr] Compiling anstyle v1.0.10 [INFO] [stderr] Compiling write16 v1.0.0 [INFO] 
[stderr] Compiling utf16_iter v1.0.5 [INFO] [stderr] Compiling regex-syntax v0.8.5 [INFO] [stderr] Compiling rawpointer v0.1.0 [INFO] [stderr] Compiling getrandom v0.3.2 [INFO] [stderr] Compiling rawpointer v0.2.1 [INFO] [stderr] Compiling ndarray v0.12.1 [INFO] [stderr] Compiling anstream v0.6.18 [INFO] [stderr] Compiling aho-corasick v1.1.3 [INFO] [stderr] Compiling rustix v1.0.5 [INFO] [stderr] Compiling portable-atomic v1.11.0 [INFO] [stderr] Compiling itertools v0.7.11 [INFO] [stderr] Compiling strsim v0.11.1 [INFO] [stderr] Compiling num-traits v0.2.19 [INFO] [stderr] Compiling num-complex v0.2.4 [INFO] [stderr] Compiling matrixmultiply v0.3.9 [INFO] [stderr] Compiling percent-encoding v2.3.1 [INFO] [stderr] Compiling bitflags v2.9.0 [INFO] [stderr] Compiling serde_json v1.0.140 [INFO] [stderr] Compiling clap_lex v0.7.4 [INFO] [stderr] Compiling log v0.4.27 [INFO] [stderr] Compiling crossbeam-epoch v0.9.18 [INFO] [stderr] Compiling unicode-width v0.2.0 [INFO] [stderr] Compiling linux-raw-sys v0.9.4 [INFO] [stderr] Compiling thiserror v1.0.69 [INFO] [stderr] Compiling heck v0.5.0 [INFO] [stderr] Compiling form_urlencoded v1.2.1 [INFO] [stderr] Compiling clap_builder v4.5.37 [INFO] [stderr] Compiling quote v1.0.40 [INFO] [stderr] Compiling crossbeam-deque v0.8.6 [INFO] [stderr] Compiling syn v2.0.100 [INFO] [stderr] Compiling csv-core v0.1.12 [INFO] [stderr] Compiling jiff v0.2.13 [INFO] [stderr] Compiling number_prefix v0.4.0 [INFO] [stderr] Compiling iana-time-zone v0.1.63 [INFO] [stderr] Compiling fastrand v2.3.0 [INFO] [stderr] Compiling lazy_static v1.5.0 [INFO] [stderr] Compiling getrandom v0.2.16 [INFO] [stderr] Compiling console v0.15.11 [INFO] [stderr] Compiling rand_core v0.6.4 [INFO] [stderr] Compiling num_cpus v1.16.0 [INFO] [stderr] Compiling memmap2 v0.9.5 [INFO] [stderr] Compiling byteorder v1.5.0 [INFO] [stderr] Compiling bytemuck v1.22.0 [INFO] [stderr] Compiling rayon v1.10.0 [INFO] [stderr] Compiling indicatif v0.17.11 [INFO] [stderr] Compiling ppv-lite86 v0.2.21 [INFO] [stderr] Compiling num-complex v0.4.6 [INFO] [stderr] Compiling num-integer v0.1.46 [INFO] [stderr] Compiling rand_chacha v0.3.1 [INFO] [stderr] Compiling regex-automata v0.4.9 [INFO] [stderr] Compiling rand v0.8.5 [INFO] [stderr] Compiling tempfile v3.19.1 [INFO] [stderr] Compiling rand_distr v0.4.3 [INFO] [stderr] Compiling synstructure v0.13.1 [INFO] [stderr] Compiling ndarray-parallel v0.9.1 [INFO] [stderr] Compiling regex v1.11.1 [INFO] [stderr] Compiling zerovec-derive v0.10.3 [INFO] [stderr] Compiling displaydoc v0.2.5 [INFO] [stderr] Compiling serde_derive v1.0.219 [INFO] [stderr] Compiling icu_provider_macros v1.5.0 [INFO] [stderr] Compiling thiserror-impl v1.0.69 [INFO] [stderr] Compiling clap_derive v4.5.32 [INFO] [stderr] Compiling zerofrom-derive v0.1.6 [INFO] [stderr] Compiling yoke-derive v0.7.5 [INFO] [stderr] Compiling env_filter v0.1.3 [INFO] [stderr] Compiling env_logger v0.11.8 [INFO] [stderr] Compiling zerofrom v0.1.6 [INFO] [stderr] Compiling yoke v0.7.5 [INFO] [stderr] Compiling clap v4.5.37 [INFO] [stderr] Compiling zerovec v0.10.4 [INFO] [stderr] Compiling tinystr v0.7.6 [INFO] [stderr] Compiling icu_collections v1.5.0 [INFO] [stderr] Compiling icu_locid v1.5.0 [INFO] [stderr] Compiling icu_provider v1.5.0 [INFO] [stderr] Compiling icu_locid_transform v1.5.0 [INFO] [stderr] Compiling icu_properties v1.5.1 [INFO] [stderr] Compiling ndarray v0.15.6 [INFO] [stderr] Compiling bincode v1.3.3 [INFO] [stderr] Compiling chrono v0.4.40 [INFO] [stderr] Compiling csv v1.3.1 [INFO] 
[stderr] Compiling icu_normalizer v1.5.0 [INFO] [stderr] Compiling idna_adapter v1.2.0 [INFO] [stderr] Compiling idna v1.0.3 [INFO] [stderr] Compiling url v2.5.4 [INFO] [stderr] Compiling ndarray-rand v0.14.0 [INFO] [stderr] Compiling wall-e1 v0.1.0 (/opt/rustwide/workdir) [INFO] [stdout] warning: unused import: `Duration` [INFO] [stdout] --> src/nabla/memory_opt.rs:4:26 [INFO] [stdout] | [INFO] [stdout] 4 | use std::time::{Instant, Duration}; [INFO] [stdout] | ^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(unused_imports)]` (part of `#[warn(unused)]`) on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused imports: `Axis` and `Ix3` [INFO] [stdout] --> src/training/mod.rs:4:46 [INFO] [stdout] | [INFO] [stdout] 4 | use ndarray::{Array, Array1, Array2, Array3, Axis, Ix1, Ix2, Ix3, Ix0, IxDyn, s}; [INFO] [stdout] | ^^^^ ^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `ndarray::Array0` [INFO] [stdout] --> src/training/mod.rs:12:5 [INFO] [stdout] | [INFO] [stdout] 12 | use ndarray::Array0; [INFO] [stdout] | ^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `ndarray::array` [INFO] [stdout] --> src/training/mod.rs:13:5 [INFO] [stdout] | [INFO] [stdout] 13 | use ndarray::array; [INFO] [stdout] | ^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `std::time::Instant` [INFO] [stdout] --> src/training/mod.rs:16:5 [INFO] [stdout] | [INFO] [stdout] 16 | use std::time::Instant; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused imports: `Arc`, `Barrier`, and `Mutex` [INFO] [stdout] --> src/training/mod.rs:17:17 [INFO] [stdout] | [INFO] [stdout] 17 | use std::sync::{Mutex, Arc, Barrier}; [INFO] [stdout] | ^^^^^ ^^^ ^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused imports: `AtomicUsize` and `Ordering` [INFO] [stdout] --> src/training/mod.rs:18:25 [INFO] [stdout] | [INFO] [stdout] 18 | use std::sync::atomic::{AtomicUsize, Ordering}; [INFO] [stdout] | ^^^^^^^^^^^ ^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `crate::nabla::memory_opt` [INFO] [stdout] --> src/training/enhanced_trainer.rs:15:5 [INFO] [stdout] | [INFO] [stdout] 15 | use crate::nabla::memory_opt; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `Barrier` [INFO] [stdout] --> src/training/enhanced_trainer.rs:19:29 [INFO] [stdout] | [INFO] [stdout] 19 | use std::sync::{Arc, Mutex, Barrier}; [INFO] [stdout] | ^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused imports: `MutexGuard` and `Mutex` [INFO] [stdout] --> src/training/batch_dispatcher.rs:4:17 [INFO] [stdout] | [INFO] [stdout] 4 | use std::sync::{Mutex, MutexGuard}; [INFO] [stdout] | ^^^^^ ^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `std::thread::ThreadId` [INFO] [stdout] --> src/training/batch_dispatcher.rs:5:5 [INFO] [stdout] | [INFO] [stdout] 5 | use std::thread::ThreadId; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `v` [INFO] [stdout] --> src/nabla/memory_opt.rs:88:15 [INFO] [stdout] | [INFO] [stdout] 88 | if let Ok(v) = std::env::var("MALLOC_ARENA_MAX") { [INFO] [stdout] | ^ help: if this is intentional, prefix it with an underscore: `_v` [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(unused_variables)]` (part of `#[warn(unused)]`) on by default [INFO] 
[stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `count` [INFO] [stdout] --> src/tokenizer/word_bpe.rs:420:20 [INFO] [stdout] | [INFO] [stdout] 420 | for (word, count) in word_counts.iter().filter(|&(_, count)| *count >= min_frequency) { [INFO] [stdout] | ^^^^^ help: if this is intentional, prefix it with an underscore: `_count` [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `tokens_to_add` [INFO] [stdout] --> src/training/mod.rs:1315:13 [INFO] [stdout] | [INFO] [stdout] 1315 | let tokens_to_add = new_vocab_size - current_size; [INFO] [stdout] | ^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] help: if this is intentional, prefix it with an underscore [INFO] [stdout] | [INFO] [stdout] 1315 | let _tokens_to_add = new_vocab_size - current_size; [INFO] [stdout] | + [INFO] [stdout] help: you might have meant to pattern match on the similarly named constant `CHUNK_SIZE` [INFO] [stdout] | [INFO] [stdout] 1315 - let tokens_to_add = new_vocab_size - current_size; [INFO] [stdout] 1315 + let training::enhanced_trainer::EnhancedTrainer::learn_tokenizer_from_text::CHUNK_SIZE = new_vocab_size - current_size; [INFO] [stdout] | [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: variable `reported_progress` is assigned to, but never used [INFO] [stdout] --> src/training/generation.rs:174:17 [INFO] [stdout] | [INFO] [stdout] 174 | let mut reported_progress = false; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: consider using `_reported_progress` instead [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: value assigned to `reported_progress` is never read [INFO] [stdout] --> src/training/generation.rs:181:17 [INFO] [stdout] | [INFO] [stdout] 181 | reported_progress = true; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = help: maybe it is overwritten before being read? 
[INFO] [stdout] = note: `#[warn(unused_assignments)]` (part of `#[warn(unused)]`) on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `batch` [INFO] [stdout] --> src/training/enhanced_trainer.rs:538:34 [INFO] [stdout] | [INFO] [stdout] 538 | if let (Some(batch), Some(target)) = (inputs.get(batch_idx), targets.get(batch_idx)) { [INFO] [stdout] | ^^^^^ help: if this is intentional, prefix it with an underscore: `_batch` [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `target` [INFO] [stdout] --> src/training/enhanced_trainer.rs:538:47 [INFO] [stdout] | [INFO] [stdout] 538 | if let (Some(batch), Some(target)) = (inputs.get(batch_idx), targets.get(batch_idx)) { [INFO] [stdout] | ^^^^^^ help: if this is intentional, prefix it with an underscore: `_target` [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: variable does not need to be mutable [INFO] [stdout] --> src/training/enhanced_trainer.rs:542:29 [INFO] [stdout] | [INFO] [stdout] 542 | let mut loss_guard = loss_mutex.lock().unwrap(); [INFO] [stdout] | ----^^^^^^^^^^ [INFO] [stdout] | | [INFO] [stdout] | help: remove this `mut` [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(unused_mut)]` (part of `#[warn(unused)]`) on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `thread_id` [INFO] [stdout] --> src/training/enhanced_trainer.rs:3454:13 [INFO] [stdout] | [INFO] [stdout] 3454 | for thread_id in 0..num_threads { [INFO] [stdout] | ^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] help: if this is intentional, prefix it with an underscore [INFO] [stdout] | [INFO] [stdout] 3454 | for _thread_id in 0..num_threads { [INFO] [stdout] | + [INFO] [stdout] help: you might have meant to pattern match on the similarly named constant `CHUNK_SIZE` [INFO] [stdout] | [INFO] [stdout] 3454 - for thread_id in 0..num_threads { [INFO] [stdout] 3454 + for training::enhanced_trainer::EnhancedTrainer::learn_tokenizer_from_text::CHUNK_SIZE in 0..num_threads { [INFO] [stdout] | [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: value assigned to `progress_reported` is never read [INFO] [stdout] --> src/training/enhanced_trainer.rs:3714:13 [INFO] [stdout] | [INFO] [stdout] 3714 | progress_reported = true; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = help: maybe it is overwritten before being read? [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: function `determine_allocator_type` is never used [INFO] [stdout] --> src/nabla/memory_opt.rs:79:4 [INFO] [stdout] | [INFO] [stdout] 79 | fn determine_allocator_type() -> Option { [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(dead_code)]` (part of `#[warn(unused)]`) on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: method `get_total_elapsed` is never used [INFO] [stdout] --> src/training/enhanced_trainer.rs:165:8 [INFO] [stdout] | [INFO] [stdout] 134 | impl ThreadStateTracker { [INFO] [stdout] | ----------------------- method in this implementation [INFO] [stdout] ... [INFO] [stdout] 165 | fn get_total_elapsed(&self) -> std::time::Duration { [INFO] [stdout] | ^^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: method `run_with_timeout` is never used [INFO] [stdout] --> src/training/enhanced_trainer.rs:2001:8 [INFO] [stdout] | [INFO] [stdout] 201 | impl EnhancedTrainer { [INFO] [stdout] | -------------------- method in this implementation [INFO] [stdout] ... 
[INFO] [stdout] 2001 | fn run_with_timeout(&self, f: F, timeout: std::time::Duration) -> Option [INFO] [stdout] | ^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: constant `V0_HEADER_SIZE` is never used [INFO] [stdout] --> src/training/enhanced_trainer.rs:101:15 [INFO] [stdout] | [INFO] [stdout] 101 | pub const V0_HEADER_SIZE: usize = 32; [INFO] [stdout] | ^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: field `time` is never read [INFO] [stdout] --> src/training/batch_dispatcher.rs:44:5 [INFO] [stdout] | [INFO] [stdout] 40 | struct LockEvent { [INFO] [stdout] | --------- field in this struct [INFO] [stdout] ... [INFO] [stdout] 44 | time: Instant, [INFO] [stdout] | ^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused imports: `ProgressBar` and `ProgressStyle` [INFO] [stdout] --> src/bin/train_enhanced_model.rs:16:17 [INFO] [stdout] | [INFO] [stdout] 16 | use indicatif::{ProgressBar, ProgressStyle}; [INFO] [stdout] | ^^^^^^^^^^^ ^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(unused_imports)]` (part of `#[warn(unused)]`) on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused imports: `Arc`, `AtomicUsize`, `Mutex`, and `Ordering` [INFO] [stdout] --> src/bin/train_enhanced_model.rs:17:17 [INFO] [stdout] | [INFO] [stdout] 17 | use std::sync::{Arc, Mutex, atomic::{AtomicUsize, Ordering}}; [INFO] [stdout] | ^^^ ^^^^^ ^^^^^^^^^^^ ^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `TextGenerator` [INFO] [stdout] --> src/bin/test_multithreading.rs:5:42 [INFO] [stdout] | [INFO] [stdout] 5 | use wall_e1::training::{EnhancedTrainer, TextGenerator}; [INFO] [stdout] | ^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(unused_imports)]` (part of `#[warn(unused)]`) on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `wall_e1::nabla::tensor::Tensor` [INFO] [stdout] --> src/bin/memory_test.rs:4:5 [INFO] [stdout] | [INFO] [stdout] 4 | use wall_e1::nabla::tensor::Tensor; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(unused_imports)]` (part of `#[warn(unused)]`) on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `std::env` [INFO] [stdout] --> src/bin/diagnose_training.rs:2:5 [INFO] [stdout] | [INFO] [stdout] 2 | use std::env; [INFO] [stdout] | ^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(unused_imports)]` (part of `#[warn(unused)]`) on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `wall_e1::tokenizer::Tokenizer` [INFO] [stdout] --> src/bin/diagnose_training.rs:5:5 [INFO] [stdout] | [INFO] [stdout] 5 | use wall_e1::tokenizer::Tokenizer; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `std::fs::File` [INFO] [stdout] --> src/bin/diagnose_training.rs:7:5 [INFO] [stdout] | [INFO] [stdout] 7 | use std::fs::File; [INFO] [stdout] | ^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `std::io::Read` [INFO] [stdout] --> src/bin/diagnose_training.rs:8:5 [INFO] [stdout] | [INFO] [stdout] 8 | use std::io::Read; [INFO] [stdout] | ^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `thread_id` [INFO] [stdout] --> src/bin/train_enhanced_model.rs:47:13 [INFO] [stdout] | [INFO] [stdout] 47 | let thread_id = format!("{:?}", std::thread::current().id()); [INFO] [stdout] | 
^^^^^^^^^ help: if this is intentional, prefix it with an underscore: `_thread_id` [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(unused_variables)]` (part of `#[warn(unused)]`) on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `thread_count` [INFO] [stdout] --> src/bin/train_enhanced_model.rs:371:13 [INFO] [stdout] | [INFO] [stdout] 371 | let thread_count = if op == &"data_loading" { min_data_threads } else { pool.get_num_threads() }; [INFO] [stdout] | ^^^^^^^^^^^^ help: if this is intentional, prefix it with an underscore: `_thread_count` [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `setup_start` [INFO] [stdout] --> src/bin/train_enhanced_model.rs:428:9 [INFO] [stdout] | [INFO] [stdout] 428 | let setup_start = Instant::now(); [INFO] [stdout] | ^^^^^^^^^^^ help: if this is intentional, prefix it with an underscore: `_setup_start` [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: variable `curriculum_examples` is assigned to, but never used [INFO] [stdout] --> src/bin/train_enhanced_model.rs:462:13 [INFO] [stdout] | [INFO] [stdout] 462 | let mut curriculum_examples: usize = 2000; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: consider using `_curriculum_examples` instead [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: variable `auto_resize_vocab` is assigned to, but never used [INFO] [stdout] --> src/bin/train_enhanced_model.rs:466:13 [INFO] [stdout] | [INFO] [stdout] 466 | let mut auto_resize_vocab = false; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: consider using `_auto_resize_vocab` instead [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: variable `watchdog_timeout` is assigned to, but never used [INFO] [stdout] --> src/bin/train_enhanced_model.rs:467:13 [INFO] [stdout] | [INFO] [stdout] 467 | let mut watchdog_timeout: Option = None; [INFO] [stdout] | ^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: consider using `_watchdog_timeout` instead [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: variable `target_id_max` is assigned to, but never used [INFO] [stdout] --> src/bin/train_enhanced_model.rs:470:13 [INFO] [stdout] | [INFO] [stdout] 470 | let mut target_id_max: Option = None; [INFO] [stdout] | ^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: consider using `_target_id_max` instead [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: value assigned to `curriculum_examples` is never read [INFO] [stdout] --> src/bin/train_enhanced_model.rs:675:25 [INFO] [stdout] | [INFO] [stdout] 675 | curriculum_examples = num; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = help: maybe it is overwritten before being read? [INFO] [stdout] = note: `#[warn(unused_assignments)]` (part of `#[warn(unused)]`) on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: value assigned to `auto_resize_vocab` is never read [INFO] [stdout] --> src/bin/train_enhanced_model.rs:681:17 [INFO] [stdout] | [INFO] [stdout] 681 | auto_resize_vocab = true; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = help: maybe it is overwritten before being read? 
[INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: value assigned to `watchdog_timeout` is never read [INFO] [stdout] --> src/bin/train_enhanced_model.rs:687:25 [INFO] [stdout] | [INFO] [stdout] 687 | watchdog_timeout = Some(timeout); [INFO] [stdout] | ^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = help: maybe it is overwritten before being read? [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: value assigned to `target_id_max` is never read [INFO] [stdout] --> src/bin/train_enhanced_model.rs:711:25 [INFO] [stdout] | [INFO] [stdout] 711 | target_id_max = Some(max_id); [INFO] [stdout] | ^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = help: maybe it is overwritten before being read? [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: variable does not need to be mutable [INFO] [stdout] --> src/bin/train_enhanced_model.rs:1230:9 [INFO] [stdout] | [INFO] [stdout] 1230 | let mut path_with_dir = if !final_save_path_str.starts_with("models/") { [INFO] [stdout] | ----^^^^^^^^^^^^^ [INFO] [stdout] | | [INFO] [stdout] | help: remove this `mut` [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(unused_mut)]` (part of `#[warn(unused)]`) on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `epoch` [INFO] [stdout] --> src/bin/train_enhanced_model.rs:1340:68 [INFO] [stdout] | [INFO] [stdout] 1340 | fn train_epoch(trainer: &mut EnhancedTrainer, tokens: &Vec, epoch: usize, [INFO] [stdout] | ^^^^^ help: if this is intentional, prefix it with an underscore: `_epoch` [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `batch_inputs` [INFO] [stdout] --> src/bin/train_enhanced_model.rs:1508:10 [INFO] [stdout] | [INFO] [stdout] 1508 | let (batch_inputs, min_seq_len, truncated_inputs, batch_targets_arr, timing) = pool_result.install(|| { [INFO] [stdout] | ^^^^^^^^^^^^ help: if this is intentional, prefix it with an underscore: `_batch_inputs` [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `min_seq_len` [INFO] [stdout] --> src/bin/train_enhanced_model.rs:1508:24 [INFO] [stdout] | [INFO] [stdout] 1508 | let (batch_inputs, min_seq_len, truncated_inputs, batch_targets_arr, timing) = pool_result.install(|| { [INFO] [stdout] | ^^^^^^^^^^^ help: if this is intentional, prefix it with an underscore: `_min_seq_len` [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `k` [INFO] [stdout] --> src/bin/train_enhanced_model.rs:1779:13 [INFO] [stdout] | [INFO] [stdout] 1779 | let (m, k) = a.dim(); [INFO] [stdout] | ^ help: if this is intentional, prefix it with an underscore: `_k` [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: function `prepare_batch_parallel` is never used [INFO] [stdout] --> src/bin/train_enhanced_model.rs:1472:4 [INFO] [stdout] | [INFO] [stdout] 1472 | fn prepare_batch_parallel( [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(dead_code)]` (part of `#[warn(unused)]`) on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: function `detect_simd_features` is never used [INFO] [stdout] --> src/bin/train_enhanced_model.rs:1691:4 [INFO] [stdout] | [INFO] [stdout] 1691 | fn detect_simd_features() -> Vec { [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: function `enable_simd_optimizations` is never used [INFO] [stdout] --> src/bin/train_enhanced_model.rs:1728:4 [INFO] [stdout] | [INFO] [stdout] 1728 | fn enable_simd_optimizations(features: &[String]) -> bool { [INFO] 
[stdout] | ^^^^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: function `matrix_multiply_simd` is never used [INFO] [stdout] --> src/bin/train_enhanced_model.rs:1774:4 [INFO] [stdout] | [INFO] [stdout] 1774 | fn matrix_multiply_simd(a: &ndarray::Array2, b: &ndarray::Array2) -> ndarray::Array2 { [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: function `matrix_multiply_avx2` is never used [INFO] [stdout] --> src/bin/train_enhanced_model.rs:1844:4 [INFO] [stdout] | [INFO] [stdout] 1844 | fn matrix_multiply_avx2(a: &ndarray::Array2, b: &ndarray::Array2, result: &mut ndarray::Array2) { [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: function `matrix_multiply_scalar` is never used [INFO] [stdout] --> src/bin/train_enhanced_model.rs:1850:4 [INFO] [stdout] | [INFO] [stdout] 1850 | fn matrix_multiply_scalar(a: &ndarray::Array2, b: &ndarray::Array2, result: &mut ndarray::Array2) { [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stderr] Finished `dev` profile [unoptimized + debuginfo] target(s) in 59.44s [INFO] running `Command { std: "docker" "inspect" "032fcecaa2eb148f3a994e6be9e2b38e12e8e9f57ff4c4ce896e60804dd7dd64", kill_on_drop: false }` [INFO] running `Command { std: "docker" "rm" "-f" "032fcecaa2eb148f3a994e6be9e2b38e12e8e9f57ff4c4ce896e60804dd7dd64", kill_on_drop: false }` [INFO] [stdout] 032fcecaa2eb148f3a994e6be9e2b38e12e8e9f57ff4c4ce896e60804dd7dd64 [INFO] running `Command { std: "docker" "create" "-v" "/var/lib/crater-agent-workspace/builds/worker-4-tc1/target:/opt/rustwide/target:rw,Z" "-v" "/var/lib/crater-agent-workspace/builds/worker-4-tc1/source:/opt/rustwide/workdir:ro,Z" "-v" "/var/lib/crater-agent-workspace/cargo-home:/opt/rustwide/cargo-home:ro,Z" "-v" "/var/lib/crater-agent-workspace/rustup-home:/opt/rustwide/rustup-home:ro,Z" "-e" "SOURCE_DIR=/opt/rustwide/workdir" "-e" "CARGO_TARGET_DIR=/opt/rustwide/target" "-e" "CARGO_INCREMENTAL=0" "-e" "RUST_BACKTRACE=full" "-e" "RUSTFLAGS=--cap-lints=forbid" "-e" "RUSTDOCFLAGS=--cap-lints=forbid" "-e" "CARGO_HOME=/opt/rustwide/cargo-home" "-e" "RUSTUP_HOME=/opt/rustwide/rustup-home" "-w" "/opt/rustwide/workdir" "-m" "1610612736" "--user" "0:0" "--network" "none" "ghcr.io/rust-lang/crates-build-env/linux@sha256:e90291280db7d1fac5b66fc6dad9f9662629e7365a55743daf9bdf73ebc4ea79" "/opt/rustwide/cargo-home/bin/cargo" "+ad85bc524b1ad696e42061ad8338d382dffbdbe5" "test" "--frozen" "--no-run" "--message-format=json", kill_on_drop: false }` [INFO] [stdout] ce7a25b500ba1132c415cceabaa63fbdf741fdad094bfa1ef44af3c40a595987 [INFO] running `Command { std: "docker" "start" "-a" "ce7a25b500ba1132c415cceabaa63fbdf741fdad094bfa1ef44af3c40a595987", kill_on_drop: false }` [INFO] [stdout] warning: unused import: `Duration` [INFO] [stdout] --> src/nabla/memory_opt.rs:4:26 [INFO] [stdout] | [INFO] [stdout] 4 | use std::time::{Instant, Duration}; [INFO] [stdout] | ^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(unused_imports)]` (part of `#[warn(unused)]`) on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused imports: `Axis` and `Ix3` [INFO] [stdout] --> src/training/mod.rs:4:46 [INFO] [stdout] | [INFO] [stdout] 4 | use ndarray::{Array, Array1, Array2, Array3, Axis, Ix1, Ix2, Ix3, Ix0, IxDyn, s}; [INFO] [stdout] | ^^^^ ^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `ndarray::Array0` [INFO] [stdout] --> src/training/mod.rs:12:5 [INFO] [stdout] | 
[INFO] [stdout] 12 | use ndarray::Array0; [INFO] [stdout] | ^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `ndarray::array` [INFO] [stdout] --> src/training/mod.rs:13:5 [INFO] [stdout] | [INFO] [stdout] 13 | use ndarray::array; [INFO] [stdout] | ^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stderr] Compiling wall-e1 v0.1.0 (/opt/rustwide/workdir) [INFO] [stdout] warning: unused import: `std::time::Instant` [INFO] [stdout] --> src/training/mod.rs:16:5 [INFO] [stdout] | [INFO] [stdout] 16 | use std::time::Instant; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused imports: `Arc`, `Barrier`, and `Mutex` [INFO] [stdout] --> src/training/mod.rs:17:17 [INFO] [stdout] | [INFO] [stdout] 17 | use std::sync::{Mutex, Arc, Barrier}; [INFO] [stdout] | ^^^^^ ^^^ ^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused imports: `AtomicUsize` and `Ordering` [INFO] [stdout] --> src/training/mod.rs:18:25 [INFO] [stdout] | [INFO] [stdout] 18 | use std::sync::atomic::{AtomicUsize, Ordering}; [INFO] [stdout] | ^^^^^^^^^^^ ^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `crate::nabla::memory_opt` [INFO] [stdout] --> src/training/enhanced_trainer.rs:15:5 [INFO] [stdout] | [INFO] [stdout] 15 | use crate::nabla::memory_opt; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `Barrier` [INFO] [stdout] --> src/training/enhanced_trainer.rs:19:29 [INFO] [stdout] | [INFO] [stdout] 19 | use std::sync::{Arc, Mutex, Barrier}; [INFO] [stdout] | ^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused imports: `MutexGuard` and `Mutex` [INFO] [stdout] --> src/training/batch_dispatcher.rs:4:17 [INFO] [stdout] | [INFO] [stdout] 4 | use std::sync::{Mutex, MutexGuard}; [INFO] [stdout] | ^^^^^ ^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `std::thread::ThreadId` [INFO] [stdout] --> src/training/batch_dispatcher.rs:5:5 [INFO] [stdout] | [INFO] [stdout] 5 | use std::thread::ThreadId; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `v` [INFO] [stdout] --> src/nabla/memory_opt.rs:88:15 [INFO] [stdout] | [INFO] [stdout] 88 | if let Ok(v) = std::env::var("MALLOC_ARENA_MAX") { [INFO] [stdout] | ^ help: if this is intentional, prefix it with an underscore: `_v` [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(unused_variables)]` (part of `#[warn(unused)]`) on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `count` [INFO] [stdout] --> src/tokenizer/word_bpe.rs:420:20 [INFO] [stdout] | [INFO] [stdout] 420 | for (word, count) in word_counts.iter().filter(|&(_, count)| *count >= min_frequency) { [INFO] [stdout] | ^^^^^ help: if this is intentional, prefix it with an underscore: `_count` [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `tokens_to_add` [INFO] [stdout] --> src/training/mod.rs:1315:13 [INFO] [stdout] | [INFO] [stdout] 1315 | let tokens_to_add = new_vocab_size - current_size; [INFO] [stdout] | ^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] help: if this is intentional, prefix it with an underscore [INFO] [stdout] | [INFO] [stdout] 1315 | let _tokens_to_add = new_vocab_size - current_size; [INFO] [stdout] | + [INFO] [stdout] help: you might have meant to pattern match on the similarly named constant `CHUNK_SIZE` [INFO] [stdout] | [INFO] [stdout] 
1315 - let tokens_to_add = new_vocab_size - current_size; [INFO] [stdout] 1315 + let training::enhanced_trainer::EnhancedTrainer::learn_tokenizer_from_text::CHUNK_SIZE = new_vocab_size - current_size; [INFO] [stdout] | [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: variable `reported_progress` is assigned to, but never used [INFO] [stdout] --> src/training/generation.rs:174:17 [INFO] [stdout] | [INFO] [stdout] 174 | let mut reported_progress = false; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: consider using `_reported_progress` instead [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: value assigned to `reported_progress` is never read [INFO] [stdout] --> src/training/generation.rs:181:17 [INFO] [stdout] | [INFO] [stdout] 181 | reported_progress = true; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = help: maybe it is overwritten before being read? [INFO] [stdout] = note: `#[warn(unused_assignments)]` (part of `#[warn(unused)]`) on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `batch` [INFO] [stdout] --> src/training/enhanced_trainer.rs:538:34 [INFO] [stdout] | [INFO] [stdout] 538 | if let (Some(batch), Some(target)) = (inputs.get(batch_idx), targets.get(batch_idx)) { [INFO] [stdout] | ^^^^^ help: if this is intentional, prefix it with an underscore: `_batch` [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `target` [INFO] [stdout] --> src/training/enhanced_trainer.rs:538:47 [INFO] [stdout] | [INFO] [stdout] 538 | if let (Some(batch), Some(target)) = (inputs.get(batch_idx), targets.get(batch_idx)) { [INFO] [stdout] | ^^^^^^ help: if this is intentional, prefix it with an underscore: `_target` [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: variable does not need to be mutable [INFO] [stdout] --> src/training/enhanced_trainer.rs:542:29 [INFO] [stdout] | [INFO] [stdout] 542 | let mut loss_guard = loss_mutex.lock().unwrap(); [INFO] [stdout] | ----^^^^^^^^^^ [INFO] [stdout] | | [INFO] [stdout] | help: remove this `mut` [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(unused_mut)]` (part of `#[warn(unused)]`) on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `thread_id` [INFO] [stdout] --> src/training/enhanced_trainer.rs:3454:13 [INFO] [stdout] | [INFO] [stdout] 3454 | for thread_id in 0..num_threads { [INFO] [stdout] | ^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] help: if this is intentional, prefix it with an underscore [INFO] [stdout] | [INFO] [stdout] 3454 | for _thread_id in 0..num_threads { [INFO] [stdout] | + [INFO] [stdout] help: you might have meant to pattern match on the similarly named constant `CHUNK_SIZE` [INFO] [stdout] | [INFO] [stdout] 3454 - for thread_id in 0..num_threads { [INFO] [stdout] 3454 + for training::enhanced_trainer::EnhancedTrainer::learn_tokenizer_from_text::CHUNK_SIZE in 0..num_threads { [INFO] [stdout] | [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: value assigned to `progress_reported` is never read [INFO] [stdout] --> src/training/enhanced_trainer.rs:3714:13 [INFO] [stdout] | [INFO] [stdout] 3714 | progress_reported = true; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = help: maybe it is overwritten before being read? 
[INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: function `determine_allocator_type` is never used [INFO] [stdout] --> src/nabla/memory_opt.rs:79:4 [INFO] [stdout] | [INFO] [stdout] 79 | fn determine_allocator_type() -> Option { [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(dead_code)]` (part of `#[warn(unused)]`) on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: method `get_total_elapsed` is never used [INFO] [stdout] --> src/training/enhanced_trainer.rs:165:8 [INFO] [stdout] | [INFO] [stdout] 134 | impl ThreadStateTracker { [INFO] [stdout] | ----------------------- method in this implementation [INFO] [stdout] ... [INFO] [stdout] 165 | fn get_total_elapsed(&self) -> std::time::Duration { [INFO] [stdout] | ^^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: method `run_with_timeout` is never used [INFO] [stdout] --> src/training/enhanced_trainer.rs:2001:8 [INFO] [stdout] | [INFO] [stdout] 201 | impl EnhancedTrainer { [INFO] [stdout] | -------------------- method in this implementation [INFO] [stdout] ... [INFO] [stdout] 2001 | fn run_with_timeout(&self, f: F, timeout: std::time::Duration) -> Option [INFO] [stdout] | ^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: constant `V0_HEADER_SIZE` is never used [INFO] [stdout] --> src/training/enhanced_trainer.rs:101:15 [INFO] [stdout] | [INFO] [stdout] 101 | pub const V0_HEADER_SIZE: usize = 32; [INFO] [stdout] | ^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: field `time` is never read [INFO] [stdout] --> src/training/batch_dispatcher.rs:44:5 [INFO] [stdout] | [INFO] [stdout] 40 | struct LockEvent { [INFO] [stdout] | --------- field in this struct [INFO] [stdout] ... 
[INFO] [stdout] 44 | time: Instant, [INFO] [stdout] | ^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `wall_e1::nabla::tensor::Tensor` [INFO] [stdout] --> src/bin/memory_test.rs:4:5 [INFO] [stdout] | [INFO] [stdout] 4 | use wall_e1::nabla::tensor::Tensor; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(unused_imports)]` (part of `#[warn(unused)]`) on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `WordPieceBPETokenizer` [INFO] [stdout] --> examples/test_model.rs:6:26 [INFO] [stdout] | [INFO] [stdout] 6 | use wall_e1::tokenizer::{WordPieceBPETokenizer, Tokenizer}; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(unused_imports)]` (part of `#[warn(unused)]`) on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `ndarray::Array2` [INFO] [stdout] --> examples/test_model.rs:8:5 [INFO] [stdout] | [INFO] [stdout] 8 | use ndarray::Array2; [INFO] [stdout] | ^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused imports: `ProgressBar` and `ProgressStyle` [INFO] [stdout] --> examples/test_model.rs:10:17 [INFO] [stdout] | [INFO] [stdout] 10 | use indicatif::{ProgressBar, ProgressStyle}; [INFO] [stdout] | ^^^^^^^^^^^ ^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `rand::prelude::*` [INFO] [stdout] --> examples/test_model.rs:11:5 [INFO] [stdout] | [INFO] [stdout] 11 | use rand::prelude::*; [INFO] [stdout] | ^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `TextGenerator` [INFO] [stdout] --> src/bin/test_multithreading.rs:5:42 [INFO] [stdout] | [INFO] [stdout] 5 | use wall_e1::training::{EnhancedTrainer, TextGenerator}; [INFO] [stdout] | ^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(unused_imports)]` (part of `#[warn(unused)]`) on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: function `add_load_method_to_trainer` is never used [INFO] [stdout] --> examples/test_model.rs:14:4 [INFO] [stdout] | [INFO] [stdout] 14 | fn add_load_method_to_trainer() -> Result<(), Box> { [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(dead_code)]` (part of `#[warn(unused)]`) on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused imports: `ProgressBar` and `ProgressStyle` [INFO] [stdout] --> src/bin/train_enhanced_model.rs:16:17 [INFO] [stdout] | [INFO] [stdout] 16 | use indicatif::{ProgressBar, ProgressStyle}; [INFO] [stdout] | ^^^^^^^^^^^ ^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(unused_imports)]` (part of `#[warn(unused)]`) on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused imports: `Arc`, `AtomicUsize`, `Mutex`, and `Ordering` [INFO] [stdout] --> src/bin/train_enhanced_model.rs:17:17 [INFO] [stdout] | [INFO] [stdout] 17 | use std::sync::{Arc, Mutex, atomic::{AtomicUsize, Ordering}}; [INFO] [stdout] | ^^^ ^^^^^ ^^^^^^^^^^^ ^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `std::env` [INFO] [stdout] --> src/bin/diagnose_training.rs:2:5 [INFO] [stdout] | [INFO] [stdout] 2 | use std::env; [INFO] [stdout] | ^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(unused_imports)]` (part of `#[warn(unused)]`) on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `wall_e1::tokenizer::Tokenizer` [INFO] [stdout] --> 
src/bin/diagnose_training.rs:5:5 [INFO] [stdout] | [INFO] [stdout] 5 | use wall_e1::tokenizer::Tokenizer; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `std::fs::File` [INFO] [stdout] --> src/bin/diagnose_training.rs:7:5 [INFO] [stdout] | [INFO] [stdout] 7 | use std::fs::File; [INFO] [stdout] | ^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `std::io::Read` [INFO] [stdout] --> src/bin/diagnose_training.rs:8:5 [INFO] [stdout] | [INFO] [stdout] 8 | use std::io::Read; [INFO] [stdout] | ^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `Array2` [INFO] [stdout] --> examples/model_test.rs:6:15 [INFO] [stdout] | [INFO] [stdout] 6 | use ndarray::{Array2, s}; [INFO] [stdout] | ^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(unused_imports)]` (part of `#[warn(unused)]`) on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `wall_e1::tokenizer::basic_tokenizer::BasicTokenizer` [INFO] [stdout] --> examples/train_qa_model.rs:5:5 [INFO] [stdout] | [INFO] [stdout] 5 | use wall_e1::tokenizer::basic_tokenizer::BasicTokenizer; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(unused_imports)]` (part of `#[warn(unused)]`) on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `thread_id` [INFO] [stdout] --> src/bin/train_enhanced_model.rs:47:13 [INFO] [stdout] | [INFO] [stdout] 47 | let thread_id = format!("{:?}", std::thread::current().id()); [INFO] [stdout] | ^^^^^^^^^ help: if this is intentional, prefix it with an underscore: `_thread_id` [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(unused_variables)]` (part of `#[warn(unused)]`) on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `Duration` [INFO] [stdout] --> src/nabla/memory_opt.rs:4:26 [INFO] [stdout] | [INFO] [stdout] 4 | use std::time::{Instant, Duration}; [INFO] [stdout] | ^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(unused_imports)]` (part of `#[warn(unused)]`) on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused imports: `Axis` and `Ix3` [INFO] [stdout] --> src/training/mod.rs:4:46 [INFO] [stdout] | [INFO] [stdout] 4 | use ndarray::{Array, Array1, Array2, Array3, Axis, Ix1, Ix2, Ix3, Ix0, IxDyn, s}; [INFO] [stdout] | ^^^^ ^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `ndarray::Array0` [INFO] [stdout] --> src/training/mod.rs:12:5 [INFO] [stdout] | [INFO] [stdout] 12 | use ndarray::Array0; [INFO] [stdout] | ^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `ndarray::array` [INFO] [stdout] --> src/training/mod.rs:13:5 [INFO] [stdout] | [INFO] [stdout] 13 | use ndarray::array; [INFO] [stdout] | ^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `std::time::Instant` [INFO] [stdout] --> src/training/mod.rs:16:5 [INFO] [stdout] | [INFO] [stdout] 16 | use std::time::Instant; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused imports: `Arc`, `Barrier`, and `Mutex` [INFO] [stdout] --> src/training/mod.rs:17:17 [INFO] [stdout] | [INFO] [stdout] 17 | use std::sync::{Mutex, Arc, Barrier}; [INFO] [stdout] | ^^^^^ ^^^ ^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused imports: `AtomicUsize` and `Ordering` [INFO] [stdout] --> 
src/training/mod.rs:18:25 [INFO] [stdout] | [INFO] [stdout] 18 | use std::sync::atomic::{AtomicUsize, Ordering}; [INFO] [stdout] | ^^^^^^^^^^^ ^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `crate::nabla::memory_opt` [INFO] [stdout] --> src/training/enhanced_trainer.rs:15:5 [INFO] [stdout] | [INFO] [stdout] 15 | use crate::nabla::memory_opt; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `Barrier` [INFO] [stdout] --> src/training/enhanced_trainer.rs:19:29 [INFO] [stdout] | [INFO] [stdout] 19 | use std::sync::{Arc, Mutex, Barrier}; [INFO] [stdout] | ^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused imports: `MutexGuard` and `Mutex` [INFO] [stdout] --> src/training/batch_dispatcher.rs:4:17 [INFO] [stdout] | [INFO] [stdout] 4 | use std::sync::{Mutex, MutexGuard}; [INFO] [stdout] | ^^^^^ ^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `std::thread::ThreadId` [INFO] [stdout] --> src/training/batch_dispatcher.rs:5:5 [INFO] [stdout] | [INFO] [stdout] 5 | use std::thread::ThreadId; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `thread_count` [INFO] [stdout] --> src/bin/train_enhanced_model.rs:371:13 [INFO] [stdout] | [INFO] [stdout] 371 | let thread_count = if op == &"data_loading" { min_data_threads } else { pool.get_num_threads() }; [INFO] [stdout] | ^^^^^^^^^^^^ help: if this is intentional, prefix it with an underscore: `_thread_count` [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `setup_start` [INFO] [stdout] --> src/bin/train_enhanced_model.rs:428:9 [INFO] [stdout] | [INFO] [stdout] 428 | let setup_start = Instant::now(); [INFO] [stdout] | ^^^^^^^^^^^ help: if this is intentional, prefix it with an underscore: `_setup_start` [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: variable `curriculum_examples` is assigned to, but never used [INFO] [stdout] --> src/bin/train_enhanced_model.rs:462:13 [INFO] [stdout] | [INFO] [stdout] 462 | let mut curriculum_examples: usize = 2000; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: consider using `_curriculum_examples` instead [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: variable `auto_resize_vocab` is assigned to, but never used [INFO] [stdout] --> src/bin/train_enhanced_model.rs:466:13 [INFO] [stdout] | [INFO] [stdout] 466 | let mut auto_resize_vocab = false; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: consider using `_auto_resize_vocab` instead [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: variable `watchdog_timeout` is assigned to, but never used [INFO] [stdout] --> src/bin/train_enhanced_model.rs:467:13 [INFO] [stdout] | [INFO] [stdout] 467 | let mut watchdog_timeout: Option = None; [INFO] [stdout] | ^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: consider using `_watchdog_timeout` instead [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: variable `target_id_max` is assigned to, but never used [INFO] [stdout] --> src/bin/train_enhanced_model.rs:470:13 [INFO] [stdout] | [INFO] [stdout] 470 | let mut target_id_max: Option = None; [INFO] [stdout] | ^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: consider using `_target_id_max` instead [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: value assigned to `curriculum_examples` is never read [INFO] [stdout] --> 
src/bin/train_enhanced_model.rs:675:25 [INFO] [stdout] | [INFO] [stdout] 675 | curriculum_examples = num; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = help: maybe it is overwritten before being read? [INFO] [stdout] = note: `#[warn(unused_assignments)]` (part of `#[warn(unused)]`) on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: value assigned to `auto_resize_vocab` is never read [INFO] [stdout] --> src/bin/train_enhanced_model.rs:681:17 [INFO] [stdout] | [INFO] [stdout] 681 | auto_resize_vocab = true; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = help: maybe it is overwritten before being read? [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: value assigned to `watchdog_timeout` is never read [INFO] [stdout] --> src/bin/train_enhanced_model.rs:687:25 [INFO] [stdout] | [INFO] [stdout] 687 | watchdog_timeout = Some(timeout); [INFO] [stdout] | ^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = help: maybe it is overwritten before being read? [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: value assigned to `target_id_max` is never read [INFO] [stdout] --> src/bin/train_enhanced_model.rs:711:25 [INFO] [stdout] | [INFO] [stdout] 711 | target_id_max = Some(max_id); [INFO] [stdout] | ^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = help: maybe it is overwritten before being read? [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: variable does not need to be mutable [INFO] [stdout] --> src/bin/train_enhanced_model.rs:1230:9 [INFO] [stdout] | [INFO] [stdout] 1230 | let mut path_with_dir = if !final_save_path_str.starts_with("models/") { [INFO] [stdout] | ----^^^^^^^^^^^^^ [INFO] [stdout] | | [INFO] [stdout] | help: remove this `mut` [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(unused_mut)]` (part of `#[warn(unused)]`) on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `epoch` [INFO] [stdout] --> src/bin/train_enhanced_model.rs:1340:68 [INFO] [stdout] | [INFO] [stdout] 1340 | fn train_epoch(trainer: &mut EnhancedTrainer, tokens: &Vec, epoch: usize, [INFO] [stdout] | ^^^^^ help: if this is intentional, prefix it with an underscore: `_epoch` [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `batch_inputs` [INFO] [stdout] --> src/bin/train_enhanced_model.rs:1508:10 [INFO] [stdout] | [INFO] [stdout] 1508 | let (batch_inputs, min_seq_len, truncated_inputs, batch_targets_arr, timing) = pool_result.install(|| { [INFO] [stdout] | ^^^^^^^^^^^^ help: if this is intentional, prefix it with an underscore: `_batch_inputs` [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `min_seq_len` [INFO] [stdout] --> src/bin/train_enhanced_model.rs:1508:24 [INFO] [stdout] | [INFO] [stdout] 1508 | let (batch_inputs, min_seq_len, truncated_inputs, batch_targets_arr, timing) = pool_result.install(|| { [INFO] [stdout] | ^^^^^^^^^^^ help: if this is intentional, prefix it with an underscore: `_min_seq_len` [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `k` [INFO] [stdout] --> src/bin/train_enhanced_model.rs:1779:13 [INFO] [stdout] | [INFO] [stdout] 1779 | let (m, k) = a.dim(); [INFO] [stdout] | ^ help: if this is intentional, prefix it with an underscore: `_k` [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: function `prepare_batch_parallel` is never used [INFO] [stdout] --> src/bin/train_enhanced_model.rs:1472:4 [INFO] [stdout] | [INFO] [stdout] 1472 | fn prepare_batch_parallel( [INFO] [stdout] | 
^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(dead_code)]` (part of `#[warn(unused)]`) on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: function `detect_simd_features` is never used [INFO] [stdout] --> src/bin/train_enhanced_model.rs:1691:4 [INFO] [stdout] | [INFO] [stdout] 1691 | fn detect_simd_features() -> Vec { [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: function `enable_simd_optimizations` is never used [INFO] [stdout] --> src/bin/train_enhanced_model.rs:1728:4 [INFO] [stdout] | [INFO] [stdout] 1728 | fn enable_simd_optimizations(features: &[String]) -> bool { [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: function `matrix_multiply_simd` is never used [INFO] [stdout] --> src/bin/train_enhanced_model.rs:1774:4 [INFO] [stdout] | [INFO] [stdout] 1774 | fn matrix_multiply_simd(a: &ndarray::Array2, b: &ndarray::Array2) -> ndarray::Array2 { [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: function `matrix_multiply_avx2` is never used [INFO] [stdout] --> src/bin/train_enhanced_model.rs:1844:4 [INFO] [stdout] | [INFO] [stdout] 1844 | fn matrix_multiply_avx2(a: &ndarray::Array2, b: &ndarray::Array2, result: &mut ndarray::Array2) { [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: function `matrix_multiply_scalar` is never used [INFO] [stdout] --> src/bin/train_enhanced_model.rs:1850:4 [INFO] [stdout] | [INFO] [stdout] 1850 | fn matrix_multiply_scalar(a: &ndarray::Array2, b: &ndarray::Array2, result: &mut ndarray::Array2) { [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `v` [INFO] [stdout] --> src/nabla/memory_opt.rs:88:15 [INFO] [stdout] | [INFO] [stdout] 88 | if let Ok(v) = std::env::var("MALLOC_ARENA_MAX") { [INFO] [stdout] | ^ help: if this is intentional, prefix it with an underscore: `_v` [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(unused_variables)]` (part of `#[warn(unused)]`) on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `count` [INFO] [stdout] --> src/tokenizer/word_bpe.rs:420:20 [INFO] [stdout] | [INFO] [stdout] 420 | for (word, count) in word_counts.iter().filter(|&(_, count)| *count >= min_frequency) { [INFO] [stdout] | ^^^^^ help: if this is intentional, prefix it with an underscore: `_count` [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `tokens` [INFO] [stdout] --> src/tokenizer/word_bpe.rs:745:13 [INFO] [stdout] | [INFO] [stdout] 745 | let tokens = tokenizer.tokenize(text); [INFO] [stdout] | ^^^^^^ help: if this is intentional, prefix it with an underscore: `_tokens` [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `tokens_to_add` [INFO] [stdout] --> src/training/mod.rs:1315:13 [INFO] [stdout] | [INFO] [stdout] 1315 | let tokens_to_add = new_vocab_size - current_size; [INFO] [stdout] | ^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] help: if this is intentional, prefix it with an underscore [INFO] [stdout] | [INFO] [stdout] 1315 | let _tokens_to_add = new_vocab_size - current_size; [INFO] [stdout] | + [INFO] [stdout] help: you might have meant to pattern match on the similarly named constant `CHUNK_SIZE` [INFO] [stdout] | [INFO] [stdout] 1315 - let tokens_to_add = new_vocab_size - current_size; [INFO] [stdout] 1315 + let 
training::enhanced_trainer::EnhancedTrainer::learn_tokenizer_from_text::CHUNK_SIZE = new_vocab_size - current_size; [INFO] [stdout] | [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: variable `reported_progress` is assigned to, but never used [INFO] [stdout] --> src/training/generation.rs:174:17 [INFO] [stdout] | [INFO] [stdout] 174 | let mut reported_progress = false; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: consider using `_reported_progress` instead [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: value assigned to `reported_progress` is never read [INFO] [stdout] --> src/training/generation.rs:181:17 [INFO] [stdout] | [INFO] [stdout] 181 | reported_progress = true; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = help: maybe it is overwritten before being read? [INFO] [stdout] = note: `#[warn(unused_assignments)]` (part of `#[warn(unused)]`) on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `batch` [INFO] [stdout] --> src/training/enhanced_trainer.rs:538:34 [INFO] [stdout] | [INFO] [stdout] 538 | if let (Some(batch), Some(target)) = (inputs.get(batch_idx), targets.get(batch_idx)) { [INFO] [stdout] | ^^^^^ help: if this is intentional, prefix it with an underscore: `_batch` [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `target` [INFO] [stdout] --> src/training/enhanced_trainer.rs:538:47 [INFO] [stdout] | [INFO] [stdout] 538 | if let (Some(batch), Some(target)) = (inputs.get(batch_idx), targets.get(batch_idx)) { [INFO] [stdout] | ^^^^^^ help: if this is intentional, prefix it with an underscore: `_target` [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: variable does not need to be mutable [INFO] [stdout] --> src/training/enhanced_trainer.rs:542:29 [INFO] [stdout] | [INFO] [stdout] 542 | let mut loss_guard = loss_mutex.lock().unwrap(); [INFO] [stdout] | ----^^^^^^^^^^ [INFO] [stdout] | | [INFO] [stdout] | help: remove this `mut` [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(unused_mut)]` (part of `#[warn(unused)]`) on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `thread_id` [INFO] [stdout] --> src/training/enhanced_trainer.rs:3454:13 [INFO] [stdout] | [INFO] [stdout] 3454 | for thread_id in 0..num_threads { [INFO] [stdout] | ^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] help: if this is intentional, prefix it with an underscore [INFO] [stdout] | [INFO] [stdout] 3454 | for _thread_id in 0..num_threads { [INFO] [stdout] | + [INFO] [stdout] help: you might have meant to pattern match on the similarly named constant `CHUNK_SIZE` [INFO] [stdout] | [INFO] [stdout] 3454 - for thread_id in 0..num_threads { [INFO] [stdout] 3454 + for training::enhanced_trainer::EnhancedTrainer::learn_tokenizer_from_text::CHUNK_SIZE in 0..num_threads { [INFO] [stdout] | [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: value assigned to `progress_reported` is never read [INFO] [stdout] --> src/training/enhanced_trainer.rs:3714:13 [INFO] [stdout] | [INFO] [stdout] 3714 | progress_reported = true; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = help: maybe it is overwritten before being read? 
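All of the `unused_variables`, `unused_assignments`, and `unused_mut` diagnostics above share one root cause: a binding is written (or marked `mut`) and then never read or mutated afterwards. A minimal standalone sketch of the three fixes rustc proposes, using hypothetical code rather than the crate's own:

    // Hypothetical example of the three lint families reported above (not wall-e code).
    use std::time::Instant;

    fn main() {
        // unused_variables: prefix an intentionally unused binding with `_`.
        let _setup_start = Instant::now();

        // unused_assignments: `let mut n = 0; n = compute();` warns because the
        // initial 0 is overwritten before ever being read; initialise directly.
        let thread_count: usize = std::thread::available_parallelism()
            .map(|n| n.get())
            .unwrap_or(1);

        // unused_mut: drop `mut` from a binding that is never reassigned.
        let path_with_dir = format!("models/{}", "model.bin");

        println!("threads = {thread_count}, path = {path_with_dir}");
    }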
[INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: function `determine_allocator_type` is never used [INFO] [stdout] --> src/nabla/memory_opt.rs:79:4 [INFO] [stdout] | [INFO] [stdout] 79 | fn determine_allocator_type() -> Option { [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(dead_code)]` (part of `#[warn(unused)]`) on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: method `get_total_elapsed` is never used [INFO] [stdout] --> src/training/enhanced_trainer.rs:165:8 [INFO] [stdout] | [INFO] [stdout] 134 | impl ThreadStateTracker { [INFO] [stdout] | ----------------------- method in this implementation [INFO] [stdout] ... [INFO] [stdout] 165 | fn get_total_elapsed(&self) -> std::time::Duration { [INFO] [stdout] | ^^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: method `run_with_timeout` is never used [INFO] [stdout] --> src/training/enhanced_trainer.rs:2001:8 [INFO] [stdout] | [INFO] [stdout] 201 | impl EnhancedTrainer { [INFO] [stdout] | -------------------- method in this implementation [INFO] [stdout] ... [INFO] [stdout] 2001 | fn run_with_timeout(&self, f: F, timeout: std::time::Duration) -> Option [INFO] [stdout] | ^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: constant `V0_HEADER_SIZE` is never used [INFO] [stdout] --> src/training/enhanced_trainer.rs:101:15 [INFO] [stdout] | [INFO] [stdout] 101 | pub const V0_HEADER_SIZE: usize = 32; [INFO] [stdout] | ^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: field `time` is never read [INFO] [stdout] --> src/training/batch_dispatcher.rs:44:5 [INFO] [stdout] | [INFO] [stdout] 40 | struct LockEvent { [INFO] [stdout] | --------- field in this struct [INFO] [stdout] ... 
[INFO] [stdout] 44 | time: Instant, [INFO] [stdout] | ^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stderr] Finished `test` profile [unoptimized + debuginfo] target(s) in 7.94s [INFO] running `Command { std: "docker" "inspect" "ce7a25b500ba1132c415cceabaa63fbdf741fdad094bfa1ef44af3c40a595987", kill_on_drop: false }` [INFO] running `Command { std: "docker" "rm" "-f" "ce7a25b500ba1132c415cceabaa63fbdf741fdad094bfa1ef44af3c40a595987", kill_on_drop: false }` [INFO] [stdout] ce7a25b500ba1132c415cceabaa63fbdf741fdad094bfa1ef44af3c40a595987 [INFO] running `Command { std: "docker" "create" "-v" "/var/lib/crater-agent-workspace/builds/worker-4-tc1/target:/opt/rustwide/target:rw,Z" "-v" "/var/lib/crater-agent-workspace/builds/worker-4-tc1/source:/opt/rustwide/workdir:ro,Z" "-v" "/var/lib/crater-agent-workspace/cargo-home:/opt/rustwide/cargo-home:ro,Z" "-v" "/var/lib/crater-agent-workspace/rustup-home:/opt/rustwide/rustup-home:ro,Z" "-e" "SOURCE_DIR=/opt/rustwide/workdir" "-e" "CARGO_TARGET_DIR=/opt/rustwide/target" "-e" "CARGO_INCREMENTAL=0" "-e" "RUST_BACKTRACE=full" "-e" "RUSTFLAGS=--cap-lints=forbid" "-e" "RUSTDOCFLAGS=--cap-lints=forbid" "-e" "CARGO_HOME=/opt/rustwide/cargo-home" "-e" "RUSTUP_HOME=/opt/rustwide/rustup-home" "-w" "/opt/rustwide/workdir" "-m" "1610612736" "--user" "0:0" "--network" "none" "ghcr.io/rust-lang/crates-build-env/linux@sha256:e90291280db7d1fac5b66fc6dad9f9662629e7365a55743daf9bdf73ebc4ea79" "/opt/rustwide/cargo-home/bin/cargo" "+ad85bc524b1ad696e42061ad8338d382dffbdbe5" "test" "--frozen", kill_on_drop: false }` [INFO] [stdout] bc0c4acc05c38cedd5b4cb3b87e54371ed33308076a1a9fac86c266c35c72f97 [INFO] running `Command { std: "docker" "start" "-a" "bc0c4acc05c38cedd5b4cb3b87e54371ed33308076a1a9fac86c266c35c72f97", kill_on_drop: false }` [INFO] [stderr] warning: unused import: `Duration` [INFO] [stderr] --> src/nabla/memory_opt.rs:4:26 [INFO] [stderr] | [INFO] [stderr] 4 | use std::time::{Instant, Duration}; [INFO] [stderr] | ^^^^^^^^ [INFO] [stderr] | [INFO] [stderr] = note: `#[warn(unused_imports)]` (part of `#[warn(unused)]`) on by default [INFO] [stderr] [INFO] [stderr] warning: unused imports: `Axis` and `Ix3` [INFO] [stderr] --> src/training/mod.rs:4:46 [INFO] [stderr] | [INFO] [stderr] 4 | use ndarray::{Array, Array1, Array2, Array3, Axis, Ix1, Ix2, Ix3, Ix0, IxDyn, s}; [INFO] [stderr] | ^^^^ ^^^ [INFO] [stderr] [INFO] [stderr] warning: unused import: `ndarray::Array0` [INFO] [stderr] --> src/training/mod.rs:12:5 [INFO] [stderr] | [INFO] [stderr] 12 | use ndarray::Array0; [INFO] [stderr] | ^^^^^^^^^^^^^^^ [INFO] [stderr] [INFO] [stderr] warning: unused import: `ndarray::array` [INFO] [stderr] --> src/training/mod.rs:13:5 [INFO] [stderr] | [INFO] [stderr] 13 | use ndarray::array; [INFO] [stderr] | ^^^^^^^^^^^^^^ [INFO] [stderr] [INFO] [stderr] warning: unused import: `std::time::Instant` [INFO] [stderr] --> src/training/mod.rs:16:5 [INFO] [stderr] | [INFO] [stderr] 16 | use std::time::Instant; [INFO] [stderr] | ^^^^^^^^^^^^^^^^^^ [INFO] [stderr] [INFO] [stderr] warning: unused imports: `Arc`, `Barrier`, and `Mutex` [INFO] [stderr] --> src/training/mod.rs:17:17 [INFO] [stderr] | [INFO] [stderr] 17 | use std::sync::{Mutex, Arc, Barrier}; [INFO] [stderr] | ^^^^^ ^^^ ^^^^^^^ [INFO] [stderr] [INFO] [stderr] warning: unused imports: `AtomicUsize` and `Ordering` [INFO] [stderr] --> src/training/mod.rs:18:25 [INFO] [stderr] | [INFO] [stderr] 18 | use std::sync::atomic::{AtomicUsize, Ordering}; [INFO] [stderr] | ^^^^^^^^^^^ ^^^^^^^^ [INFO] [stderr] [INFO] [stderr] warning: unused 
import: `crate::nabla::memory_opt` [INFO] [stderr] --> src/training/enhanced_trainer.rs:15:5 [INFO] [stderr] | [INFO] [stderr] 15 | use crate::nabla::memory_opt; [INFO] [stderr] | ^^^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stderr] [INFO] [stderr] warning: unused import: `Barrier` [INFO] [stderr] --> src/training/enhanced_trainer.rs:19:29 [INFO] [stderr] | [INFO] [stderr] 19 | use std::sync::{Arc, Mutex, Barrier}; [INFO] [stderr] | ^^^^^^^ [INFO] [stderr] [INFO] [stderr] warning: unused imports: `MutexGuard` and `Mutex` [INFO] [stderr] --> src/training/batch_dispatcher.rs:4:17 [INFO] [stderr] | [INFO] [stderr] 4 | use std::sync::{Mutex, MutexGuard}; [INFO] [stderr] | ^^^^^ ^^^^^^^^^^ [INFO] [stderr] [INFO] [stderr] warning: unused import: `std::thread::ThreadId` [INFO] [stderr] --> src/training/batch_dispatcher.rs:5:5 [INFO] [stderr] | [INFO] [stderr] 5 | use std::thread::ThreadId; [INFO] [stderr] | ^^^^^^^^^^^^^^^^^^^^^ [INFO] [stderr] [INFO] [stderr] warning: unused variable: `v` [INFO] [stderr] --> src/nabla/memory_opt.rs:88:15 [INFO] [stderr] | [INFO] [stderr] 88 | if let Ok(v) = std::env::var("MALLOC_ARENA_MAX") { [INFO] [stderr] | ^ help: if this is intentional, prefix it with an underscore: `_v` [INFO] [stderr] | [INFO] [stderr] = note: `#[warn(unused_variables)]` (part of `#[warn(unused)]`) on by default [INFO] [stderr] [INFO] [stderr] warning: unused variable: `count` [INFO] [stderr] --> src/tokenizer/word_bpe.rs:420:20 [INFO] [stderr] | [INFO] [stderr] 420 | for (word, count) in word_counts.iter().filter(|&(_, count)| *count >= min_frequency) { [INFO] [stderr] | ^^^^^ help: if this is intentional, prefix it with an underscore: `_count` [INFO] [stderr] [INFO] [stderr] warning: unused variable: `tokens_to_add` [INFO] [stderr] --> src/training/mod.rs:1315:13 [INFO] [stderr] | [INFO] [stderr] 1315 | let tokens_to_add = new_vocab_size - current_size; [INFO] [stderr] | ^^^^^^^^^^^^^ [INFO] [stderr] | [INFO] [stderr] help: if this is intentional, prefix it with an underscore [INFO] [stderr] | [INFO] [stderr] 1315 | let _tokens_to_add = new_vocab_size - current_size; [INFO] [stderr] | + [INFO] [stderr] help: you might have meant to pattern match on the similarly named constant `CHUNK_SIZE` [INFO] [stderr] | [INFO] [stderr] 1315 - let tokens_to_add = new_vocab_size - current_size; [INFO] [stderr] 1315 + let training::enhanced_trainer::EnhancedTrainer::learn_tokenizer_from_text::CHUNK_SIZE = new_vocab_size - current_size; [INFO] [stderr] | [INFO] [stderr] [INFO] [stderr] warning: variable `reported_progress` is assigned to, but never used [INFO] [stderr] --> src/training/generation.rs:174:17 [INFO] [stderr] | [INFO] [stderr] 174 | let mut reported_progress = false; [INFO] [stderr] | ^^^^^^^^^^^^^^^^^ [INFO] [stderr] | [INFO] [stderr] = note: consider using `_reported_progress` instead [INFO] [stderr] [INFO] [stderr] warning: value assigned to `reported_progress` is never read [INFO] [stderr] --> src/training/generation.rs:181:17 [INFO] [stderr] | [INFO] [stderr] 181 | reported_progress = true; [INFO] [stderr] | ^^^^^^^^^^^^^^^^^ [INFO] [stderr] | [INFO] [stderr] = help: maybe it is overwritten before being read? 
[INFO] [stderr] = note: `#[warn(unused_assignments)]` (part of `#[warn(unused)]`) on by default [INFO] [stderr] [INFO] [stderr] warning: unused variable: `batch` [INFO] [stderr] --> src/training/enhanced_trainer.rs:538:34 [INFO] [stderr] | [INFO] [stderr] 538 | if let (Some(batch), Some(target)) = (inputs.get(batch_idx), targets.get(batch_idx)) { [INFO] [stderr] | ^^^^^ help: if this is intentional, prefix it with an underscore: `_batch` [INFO] [stderr] [INFO] [stderr] warning: unused variable: `target` [INFO] [stderr] --> src/training/enhanced_trainer.rs:538:47 [INFO] [stderr] | [INFO] [stderr] 538 | if let (Some(batch), Some(target)) = (inputs.get(batch_idx), targets.get(batch_idx)) { [INFO] [stderr] | ^^^^^^ help: if this is intentional, prefix it with an underscore: `_target` [INFO] [stderr] [INFO] [stderr] warning: variable does not need to be mutable [INFO] [stderr] --> src/training/enhanced_trainer.rs:542:29 [INFO] [stderr] | [INFO] [stderr] 542 | let mut loss_guard = loss_mutex.lock().unwrap(); [INFO] [stderr] | ----^^^^^^^^^^ [INFO] [stderr] | | [INFO] [stderr] | help: remove this `mut` [INFO] [stderr] | [INFO] [stderr] = note: `#[warn(unused_mut)]` (part of `#[warn(unused)]`) on by default [INFO] [stderr] [INFO] [stderr] warning: unused variable: `thread_id` [INFO] [stderr] --> src/training/enhanced_trainer.rs:3454:13 [INFO] [stderr] | [INFO] [stderr] 3454 | for thread_id in 0..num_threads { [INFO] [stderr] | ^^^^^^^^^ [INFO] [stderr] | [INFO] [stderr] help: if this is intentional, prefix it with an underscore [INFO] [stderr] | [INFO] [stderr] 3454 | for _thread_id in 0..num_threads { [INFO] [stderr] | + [INFO] [stderr] help: you might have meant to pattern match on the similarly named constant `CHUNK_SIZE` [INFO] [stderr] | [INFO] [stderr] 3454 - for thread_id in 0..num_threads { [INFO] [stderr] 3454 + for training::enhanced_trainer::EnhancedTrainer::learn_tokenizer_from_text::CHUNK_SIZE in 0..num_threads { [INFO] [stderr] | [INFO] [stderr] [INFO] [stderr] warning: value assigned to `progress_reported` is never read [INFO] [stderr] --> src/training/enhanced_trainer.rs:3714:13 [INFO] [stderr] | [INFO] [stderr] 3714 | progress_reported = true; [INFO] [stderr] | ^^^^^^^^^^^^^^^^^ [INFO] [stderr] | [INFO] [stderr] = help: maybe it is overwritten before being read? [INFO] [stderr] [INFO] [stderr] warning: function `determine_allocator_type` is never used [INFO] [stderr] --> src/nabla/memory_opt.rs:79:4 [INFO] [stderr] | [INFO] [stderr] 79 | fn determine_allocator_type() -> Option { [INFO] [stderr] | ^^^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stderr] | [INFO] [stderr] = note: `#[warn(dead_code)]` (part of `#[warn(unused)]`) on by default [INFO] [stderr] [INFO] [stderr] warning: method `get_total_elapsed` is never used [INFO] [stderr] --> src/training/enhanced_trainer.rs:165:8 [INFO] [stderr] | [INFO] [stderr] 134 | impl ThreadStateTracker { [INFO] [stderr] | ----------------------- method in this implementation [INFO] [stderr] ... [INFO] [stderr] 165 | fn get_total_elapsed(&self) -> std::time::Duration { [INFO] [stderr] | ^^^^^^^^^^^^^^^^^ [INFO] [stderr] [INFO] [stderr] warning: method `run_with_timeout` is never used [INFO] [stderr] --> src/training/enhanced_trainer.rs:2001:8 [INFO] [stderr] | [INFO] [stderr] 201 | impl EnhancedTrainer { [INFO] [stderr] | -------------------- method in this implementation [INFO] [stderr] ... 
[INFO] [stderr] 2001 | fn run_with_timeout(&self, f: F, timeout: std::time::Duration) -> Option [INFO] [stderr] | ^^^^^^^^^^^^^^^^ [INFO] [stderr] [INFO] [stderr] warning: constant `V0_HEADER_SIZE` is never used [INFO] [stderr] --> src/training/enhanced_trainer.rs:101:15 [INFO] [stderr] | [INFO] [stderr] 101 | pub const V0_HEADER_SIZE: usize = 32; [INFO] [stderr] | ^^^^^^^^^^^^^^ [INFO] [stderr] [INFO] [stderr] warning: field `time` is never read [INFO] [stderr] --> src/training/batch_dispatcher.rs:44:5 [INFO] [stderr] | [INFO] [stderr] 40 | struct LockEvent { [INFO] [stderr] | --------- field in this struct [INFO] [stderr] ... [INFO] [stderr] 44 | time: Instant, [INFO] [stderr] | ^^^^ [INFO] [stderr] [INFO] [stderr] warning: `wall-e1` (lib) generated 26 warnings (run `cargo fix --lib -p wall-e1` to apply 14 suggestions) [INFO] [stderr] warning: unused import: `wall_e1::nabla::tensor::Tensor` [INFO] [stderr] --> src/bin/memory_test.rs:4:5 [INFO] [stderr] | [INFO] [stderr] 4 | use wall_e1::nabla::tensor::Tensor; [INFO] [stderr] | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stderr] | [INFO] [stderr] = note: `#[warn(unused_imports)]` (part of `#[warn(unused)]`) on by default [INFO] [stderr] [INFO] [stderr] warning: unused import: `wall_e1::tokenizer::basic_tokenizer::BasicTokenizer` [INFO] [stderr] --> examples/train_qa_model.rs:5:5 [INFO] [stderr] | [INFO] [stderr] 5 | use wall_e1::tokenizer::basic_tokenizer::BasicTokenizer; [INFO] [stderr] | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stderr] | [INFO] [stderr] = note: `#[warn(unused_imports)]` (part of `#[warn(unused)]`) on by default [INFO] [stderr] [INFO] [stderr] warning: unused imports: `ProgressBar` and `ProgressStyle` [INFO] [stderr] --> src/bin/train_enhanced_model.rs:16:17 [INFO] [stderr] | [INFO] [stderr] 16 | use indicatif::{ProgressBar, ProgressStyle}; [INFO] [stderr] | ^^^^^^^^^^^ ^^^^^^^^^^^^^ [INFO] [stderr] | [INFO] [stderr] = note: `#[warn(unused_imports)]` (part of `#[warn(unused)]`) on by default [INFO] [stderr] [INFO] [stderr] warning: unused imports: `Arc`, `AtomicUsize`, `Mutex`, and `Ordering` [INFO] [stderr] --> src/bin/train_enhanced_model.rs:17:17 [INFO] [stderr] | [INFO] [stderr] 17 | use std::sync::{Arc, Mutex, atomic::{AtomicUsize, Ordering}}; [INFO] [stderr] | ^^^ ^^^^^ ^^^^^^^^^^^ ^^^^^^^^ [INFO] [stderr] [INFO] [stderr] warning: unused variable: `thread_id` [INFO] [stderr] --> src/bin/train_enhanced_model.rs:47:13 [INFO] [stderr] | [INFO] [stderr] 47 | let thread_id = format!("{:?}", std::thread::current().id()); [INFO] [stderr] | ^^^^^^^^^ help: if this is intentional, prefix it with an underscore: `_thread_id` [INFO] [stderr] | [INFO] [stderr] = note: `#[warn(unused_variables)]` (part of `#[warn(unused)]`) on by default [INFO] [stderr] [INFO] [stderr] warning: unused variable: `thread_count` [INFO] [stderr] --> src/bin/train_enhanced_model.rs:371:13 [INFO] [stderr] | [INFO] [stderr] 371 | let thread_count = if op == &"data_loading" { min_data_threads } else { pool.get_num_threads() }; [INFO] [stderr] | ^^^^^^^^^^^^ help: if this is intentional, prefix it with an underscore: `_thread_count` [INFO] [stderr] [INFO] [stderr] warning: unused variable: `setup_start` [INFO] [stderr] --> src/bin/train_enhanced_model.rs:428:9 [INFO] [stderr] | [INFO] [stderr] 428 | let setup_start = Instant::now(); [INFO] [stderr] | ^^^^^^^^^^^ help: if this is intentional, prefix it with an underscore: `_setup_start` [INFO] [stderr] [INFO] [stderr] warning: variable `curriculum_examples` is assigned to, but never used 
[INFO] [stderr] --> src/bin/train_enhanced_model.rs:462:13 [INFO] [stderr] | [INFO] [stderr] 462 | let mut curriculum_examples: usize = 2000; [INFO] [stderr] | ^^^^^^^^^^^^^^^^^^^ [INFO] [stderr] | [INFO] [stderr] = note: consider using `_curriculum_examples` instead [INFO] [stderr] [INFO] [stderr] warning: variable `auto_resize_vocab` is assigned to, but never used [INFO] [stderr] --> src/bin/train_enhanced_model.rs:466:13 [INFO] [stderr] | [INFO] [stderr] 466 | let mut auto_resize_vocab = false; [INFO] [stderr] | ^^^^^^^^^^^^^^^^^ [INFO] [stderr] | [INFO] [stderr] = note: consider using `_auto_resize_vocab` instead [INFO] [stderr] [INFO] [stderr] warning: variable `watchdog_timeout` is assigned to, but never used [INFO] [stderr] --> src/bin/train_enhanced_model.rs:467:13 [INFO] [stderr] | [INFO] [stderr] 467 | let mut watchdog_timeout: Option = None; [INFO] [stderr] | ^^^^^^^^^^^^^^^^ [INFO] [stderr] | [INFO] [stderr] = note: consider using `_watchdog_timeout` instead [INFO] [stderr] [INFO] [stderr] warning: variable `target_id_max` is assigned to, but never used [INFO] [stderr] --> src/bin/train_enhanced_model.rs:470:13 [INFO] [stderr] | [INFO] [stderr] 470 | let mut target_id_max: Option = None; [INFO] [stderr] | ^^^^^^^^^^^^^ [INFO] [stderr] | [INFO] [stderr] = note: consider using `_target_id_max` instead [INFO] [stderr] [INFO] [stderr] warning: value assigned to `curriculum_examples` is never read [INFO] [stderr] --> src/bin/train_enhanced_model.rs:675:25 [INFO] [stderr] | [INFO] [stderr] 675 | curriculum_examples = num; [INFO] [stderr] | ^^^^^^^^^^^^^^^^^^^ [INFO] [stderr] | [INFO] [stderr] = help: maybe it is overwritten before being read? [INFO] [stderr] = note: `#[warn(unused_assignments)]` (part of `#[warn(unused)]`) on by default [INFO] [stderr] [INFO] [stderr] warning: value assigned to `auto_resize_vocab` is never read [INFO] [stderr] --> src/bin/train_enhanced_model.rs:681:17 [INFO] [stderr] | [INFO] [stderr] 681 | auto_resize_vocab = true; [INFO] [stderr] | ^^^^^^^^^^^^^^^^^ [INFO] [stderr] | [INFO] [stderr] = help: maybe it is overwritten before being read? [INFO] [stderr] [INFO] [stderr] warning: value assigned to `watchdog_timeout` is never read [INFO] [stderr] --> src/bin/train_enhanced_model.rs:687:25 [INFO] [stderr] | [INFO] [stderr] 687 | watchdog_timeout = Some(timeout); [INFO] [stderr] | ^^^^^^^^^^^^^^^^ [INFO] [stderr] | [INFO] [stderr] = help: maybe it is overwritten before being read? [INFO] [stderr] [INFO] [stderr] warning: value assigned to `target_id_max` is never read [INFO] [stderr] --> src/bin/train_enhanced_model.rs:711:25 [INFO] [stderr] | [INFO] [stderr] 711 | target_id_max = Some(max_id); [INFO] [stderr] | ^^^^^^^^^^^^^ [INFO] [stderr] | [INFO] [stderr] = help: maybe it is overwritten before being read? 
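The four "assigned to, but never used" warnings all point at command-line options that the argument-parsing loop in train_enhanced_model.rs fills in but that nothing afterwards consumes. A hypothetical sketch of the pattern and of the usual fix, which is to actually read the parsed value rather than underscore it away (the concrete option types are not visible in this log, so `u64` below is an assumption):

    // Sketch only: a CLI loop that sets a local which later code must read,
    // otherwise rustc reports "value assigned to `...` is never read".
    fn main() {
        let mut watchdog_timeout: Option<u64> = None;
        for arg in std::env::args().skip(1) {
            if let Some(v) = arg.strip_prefix("--watchdog-timeout=") {
                watchdog_timeout = v.parse().ok();
            }
        }
        // Consuming the value (passing it on, logging it, ...) is what silences the
        // lint; an underscore prefix merely hides the unused binding.
        if let Some(secs) = watchdog_timeout {
            println!("watchdog timeout: {secs}s");
        }
    }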
[INFO] [stderr] [INFO] [stderr] warning: variable does not need to be mutable [INFO] [stderr] --> src/bin/train_enhanced_model.rs:1230:9 [INFO] [stderr] | [INFO] [stderr] 1230 | let mut path_with_dir = if !final_save_path_str.starts_with("models/") { [INFO] [stderr] | ----^^^^^^^^^^^^^ [INFO] [stderr] | | [INFO] [stderr] | help: remove this `mut` [INFO] [stderr] | [INFO] [stderr] = note: `#[warn(unused_mut)]` (part of `#[warn(unused)]`) on by default [INFO] [stderr] [INFO] [stderr] warning: unused variable: `epoch` [INFO] [stderr] --> src/bin/train_enhanced_model.rs:1340:68 [INFO] [stderr] | [INFO] [stderr] 1340 | fn train_epoch(trainer: &mut EnhancedTrainer, tokens: &Vec, epoch: usize, [INFO] [stderr] | ^^^^^ help: if this is intentional, prefix it with an underscore: `_epoch` [INFO] [stderr] [INFO] [stderr] warning: unused variable: `batch_inputs` [INFO] [stderr] --> src/bin/train_enhanced_model.rs:1508:10 [INFO] [stderr] | [INFO] [stderr] 1508 | let (batch_inputs, min_seq_len, truncated_inputs, batch_targets_arr, timing) = pool_result.install(|| { [INFO] [stderr] | ^^^^^^^^^^^^ help: if this is intentional, prefix it with an underscore: `_batch_inputs` [INFO] [stderr] [INFO] [stderr] warning: unused variable: `min_seq_len` [INFO] [stderr] --> src/bin/train_enhanced_model.rs:1508:24 [INFO] [stderr] | [INFO] [stderr] 1508 | let (batch_inputs, min_seq_len, truncated_inputs, batch_targets_arr, timing) = pool_result.install(|| { [INFO] [stderr] | ^^^^^^^^^^^ help: if this is intentional, prefix it with an underscore: `_min_seq_len` [INFO] [stderr] [INFO] [stderr] warning: unused variable: `k` [INFO] [stderr] --> src/bin/train_enhanced_model.rs:1779:13 [INFO] [stderr] | [INFO] [stderr] 1779 | let (m, k) = a.dim(); [INFO] [stderr] | ^ help: if this is intentional, prefix it with an underscore: `_k` [INFO] [stderr] [INFO] [stderr] warning: function `prepare_batch_parallel` is never used [INFO] [stderr] --> src/bin/train_enhanced_model.rs:1472:4 [INFO] [stderr] | [INFO] [stderr] 1472 | fn prepare_batch_parallel( [INFO] [stderr] | ^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stderr] | [INFO] [stderr] = note: `#[warn(dead_code)]` (part of `#[warn(unused)]`) on by default [INFO] [stderr] [INFO] [stderr] warning: function `detect_simd_features` is never used [INFO] [stderr] --> src/bin/train_enhanced_model.rs:1691:4 [INFO] [stderr] | [INFO] [stderr] 1691 | fn detect_simd_features() -> Vec { [INFO] [stderr] | ^^^^^^^^^^^^^^^^^^^^ [INFO] [stderr] [INFO] [stderr] warning: function `enable_simd_optimizations` is never used [INFO] [stderr] --> src/bin/train_enhanced_model.rs:1728:4 [INFO] [stderr] | [INFO] [stderr] 1728 | fn enable_simd_optimizations(features: &[String]) -> bool { [INFO] [stderr] | ^^^^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stderr] [INFO] [stderr] warning: function `matrix_multiply_simd` is never used [INFO] [stderr] --> src/bin/train_enhanced_model.rs:1774:4 [INFO] [stderr] | [INFO] [stderr] 1774 | fn matrix_multiply_simd(a: &ndarray::Array2, b: &ndarray::Array2) -> ndarray::Array2 { [INFO] [stderr] | ^^^^^^^^^^^^^^^^^^^^ [INFO] [stderr] [INFO] [stderr] warning: function `matrix_multiply_avx2` is never used [INFO] [stderr] --> src/bin/train_enhanced_model.rs:1844:4 [INFO] [stderr] | [INFO] [stderr] 1844 | fn matrix_multiply_avx2(a: &ndarray::Array2, b: &ndarray::Array2, result: &mut ndarray::Array2) { [INFO] [stderr] | ^^^^^^^^^^^^^^^^^^^^ [INFO] [stderr] [INFO] [stderr] warning: function `matrix_multiply_scalar` is never used [INFO] [stderr] --> src/bin/train_enhanced_model.rs:1850:4 [INFO] [stderr] | [INFO] 
[stderr] 1850 | fn matrix_multiply_scalar(a: &ndarray::Array2, b: &ndarray::Array2, result: &mut ndarray::Array2) { [INFO] [stderr] | ^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stderr] [INFO] [stderr] warning: unused import: `WordPieceBPETokenizer` [INFO] [stderr] --> examples/test_model.rs:6:26 [INFO] [stderr] | [INFO] [stderr] 6 | use wall_e1::tokenizer::{WordPieceBPETokenizer, Tokenizer}; [INFO] [stderr] | ^^^^^^^^^^^^^^^^^^^^^ [INFO] [stderr] | [INFO] [stderr] = note: `#[warn(unused_imports)]` (part of `#[warn(unused)]`) on by default [INFO] [stderr] [INFO] [stderr] warning: unused import: `ndarray::Array2` [INFO] [stderr] --> examples/test_model.rs:8:5 [INFO] [stderr] | [INFO] [stderr] 8 | use ndarray::Array2; [INFO] [stderr] | ^^^^^^^^^^^^^^^ [INFO] [stderr] [INFO] [stderr] warning: unused imports: `ProgressBar` and `ProgressStyle` [INFO] [stderr] --> examples/test_model.rs:10:17 [INFO] [stderr] | [INFO] [stderr] 10 | use indicatif::{ProgressBar, ProgressStyle}; [INFO] [stderr] | ^^^^^^^^^^^ ^^^^^^^^^^^^^ [INFO] [stderr] [INFO] [stderr] warning: unused import: `rand::prelude::*` [INFO] [stderr] --> examples/test_model.rs:11:5 [INFO] [stderr] | [INFO] [stderr] 11 | use rand::prelude::*; [INFO] [stderr] | ^^^^^^^^^^^^^^^^ [INFO] [stderr] [INFO] [stderr] warning: function `add_load_method_to_trainer` is never used [INFO] [stderr] --> examples/test_model.rs:14:4 [INFO] [stderr] | [INFO] [stderr] 14 | fn add_load_method_to_trainer() -> Result<(), Box> { [INFO] [stderr] | ^^^^^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stderr] | [INFO] [stderr] = note: `#[warn(dead_code)]` (part of `#[warn(unused)]`) on by default [INFO] [stderr] [INFO] [stderr] warning: unused import: `std::env` [INFO] [stderr] --> src/bin/diagnose_training.rs:2:5 [INFO] [stderr] | [INFO] [stderr] 2 | use std::env; [INFO] [stderr] | ^^^^^^^^ [INFO] [stderr] | [INFO] [stderr] = note: `#[warn(unused_imports)]` (part of `#[warn(unused)]`) on by default [INFO] [stderr] [INFO] [stderr] warning: unused import: `wall_e1::tokenizer::Tokenizer` [INFO] [stderr] --> src/bin/diagnose_training.rs:5:5 [INFO] [stderr] | [INFO] [stderr] 5 | use wall_e1::tokenizer::Tokenizer; [INFO] [stderr] | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stderr] [INFO] [stderr] warning: unused import: `std::fs::File` [INFO] [stderr] --> src/bin/diagnose_training.rs:7:5 [INFO] [stderr] | [INFO] [stderr] 7 | use std::fs::File; [INFO] [stderr] | ^^^^^^^^^^^^^ [INFO] [stderr] [INFO] [stderr] warning: unused import: `std::io::Read` [INFO] [stderr] --> src/bin/diagnose_training.rs:8:5 [INFO] [stderr] | [INFO] [stderr] 8 | use std::io::Read; [INFO] [stderr] | ^^^^^^^^^^^^^ [INFO] [stderr] [INFO] [stderr] warning: unused import: `Array2` [INFO] [stderr] --> examples/model_test.rs:6:15 [INFO] [stderr] | [INFO] [stderr] 6 | use ndarray::{Array2, s}; [INFO] [stderr] | ^^^^^^ [INFO] [stderr] | [INFO] [stderr] = note: `#[warn(unused_imports)]` (part of `#[warn(unused)]`) on by default [INFO] [stderr] [INFO] [stderr] warning: unused import: `TextGenerator` [INFO] [stderr] --> src/bin/test_multithreading.rs:5:42 [INFO] [stderr] | [INFO] [stderr] 5 | use wall_e1::training::{EnhancedTrainer, TextGenerator}; [INFO] [stderr] | ^^^^^^^^^^^^^ [INFO] [stderr] | [INFO] [stderr] = note: `#[warn(unused_imports)]` (part of `#[warn(unused)]`) on by default [INFO] [stderr] [INFO] [stderr] warning: unused variable: `tokens` [INFO] [stderr] --> src/tokenizer/word_bpe.rs:745:13 [INFO] [stderr] | [INFO] [stderr] 745 | let tokens = tokenizer.tokenize(text); [INFO] [stderr] | ^^^^^^ help: if this is intentional, prefix 
it with an underscore: `_tokens` [INFO] [stderr] [INFO] [stderr] warning: `wall-e1` (bin "memory_test" test) generated 1 warning (run `cargo fix --bin "memory_test" --tests` to apply 1 suggestion) [INFO] [stderr] warning: `wall-e1` (example "train_qa_model") generated 1 warning (run `cargo fix --example "train_qa_model"` to apply 1 suggestion) [INFO] [stderr] warning: `wall-e1` (bin "Wall-E" test) generated 24 warnings (run `cargo fix --bin "Wall-E" --tests` to apply 3 suggestions) [INFO] [stderr] warning: `wall-e1` (example "test_model") generated 5 warnings (run `cargo fix --example "test_model"` to apply 4 suggestions) [INFO] [stderr] warning: `wall-e1` (bin "diagnose_training" test) generated 4 warnings (run `cargo fix --bin "diagnose_training" --tests` to apply 4 suggestions) [INFO] [stderr] warning: `wall-e1` (example "model_test") generated 1 warning (run `cargo fix --example "model_test"` to apply 1 suggestion) [INFO] [stderr] warning: `wall-e1` (bin "test_multithreading" test) generated 1 warning (run `cargo fix --bin "test_multithreading" --tests` to apply 1 suggestion) [INFO] [stderr] warning: `wall-e1` (lib test) generated 27 warnings (26 duplicates) [INFO] [stderr] Finished `test` profile [unoptimized + debuginfo] target(s) in 0.24s [INFO] [stderr] Running unittests src/lib.rs (/opt/rustwide/target/debug/deps/wall_e1-afe0e2b11155a33b) [INFO] [stdout] [INFO] [stdout] running 80 tests [INFO] [stdout] test attention::encoder::tests::test_layer_norm ... ok [INFO] [stdout] test attention::feed_forward::tests::test_feed_forward_relu_activation ... ok [INFO] [stdout] test attention::multi_head_attention::tests::test_multi_head_attention_forward ... ok [INFO] [stdout] test attention::encoder::tests::test_encoder_layer_creation ... ok [INFO] [stdout] test attention::feed_forward::tests::test_feed_forward_creation ... ok [INFO] [stdout] test attention::tests::test_combined_mask ... ok [INFO] [stdout] test attention::tests::test_padding_mask ... ok [INFO] [stdout] test attention::tests::test_create_weight_matrix ... ok [INFO] [stdout] test attention::self_attention::tests::test_self_attention_with_mask ... ok [INFO] [stdout] test attention::tests::test_causal_mask ... ok [INFO] [stdout] test attention::tests::test_softmax_3d ... ok [INFO] [stdout] test attention::multi_head_attention::tests::test_multi_head_attention_with_mask ... ok [INFO] [stdout] test attention::self_attention::tests::test_self_attention_forward ... ok [INFO] [stdout] test dataset::loader::tests::test_load_csv ... ok [INFO] [stdout] test dataset::loader::tests::test_load_jsonl ... ok [INFO] [stdout] test dataset::loader::tests::test_load_text_lines ... ok [INFO] [stdout] test dataset::processor::tests::test_basic_processor ... ok [INFO] [stdout] test dataset::loader::tests::test_load_json ... ok [INFO] [stdout] test dataset::split::tests::test_split_dataset_with_invalid_ratios ... ok [INFO] [stdout] test dataset::loader::tests::test_load_text ... ok [INFO] [stdout] test dataset::streaming::tests::test_dataset_stream ... ok [INFO] [stdout] test dataset::split::tests::test_split_dataset ... ok [INFO] [stdout] test dataset::streaming::tests::test_stream_with_processor ... ok [INFO] [stdout] test dataset::split::tests::test_k_fold_split ... ok [INFO] [stdout] test attention::encoder_stack::tests::test_encoder_stack_creation ... ok [INFO] [stdout] test dataset::processor::tests::test_basic_processor_no_lowercase ... ok [INFO] [stdout] test dataset::processor::tests::test_composite_processor ... 
ok [INFO] [stdout] test attention::multi_head_attention::tests::test_multi_head_attention_creation ... ok [INFO] [stdout] test dataset::split::tests::test_k_fold_invalid_k - should panic ... ok [INFO] [stdout] test dataset::streaming::tests::test_stream_reset - should panic ... ok [INFO] [stdout] test embedding::token_embedding::tests::test_forward ... ok [INFO] [stdout] test embedding::token_embedding::tests::test_forward_batch ... ok [INFO] [stdout] test embedding::token_embedding::tests::test_get_embedding ... ok [INFO] [stdout] test embedding::token_embedding::tests::test_token_embedding_creation ... ok [INFO] [stdout] test embedding::token_embedding::tests::test_update_embedding ... ok [INFO] [stdout] test attention::feed_forward::tests::test_feed_forward_forward ... ok [INFO] [stdout] test nabla::memory_opt::tests::test_cache_blocked_matmul ... ok [INFO] [stdout] test nabla::memory_opt::tests::test_gradient_checkpointer ... ok [INFO] [stdout] test embedding::positional_embedding::tests::test_positional_embedding_formula ... ok [INFO] [stdout] test embedding::positional_embedding::tests::test_positional_embedding_creation ... ok [INFO] [stdout] test nabla::tensor::tests::test_matmul_forward_and_backward ... ok [INFO] [stdout] test nabla::tensor::tests::test_matmul_with ... ok [INFO] [stdout] test nabla::tensor::tests::test_new_tensor ... ok [INFO] [stdout] test embedding::positional_embedding::tests::test_positional_embedding_forward_batch ... ok [INFO] [stdout] test nabla::tensor::tests::test_relu_forward_and_backward ... ok [INFO] [stdout] test nabla::tensor::tests::test_thread_safety ... ok [INFO] [stdout] test embedding::positional_embedding::tests::test_positional_embedding_forward ... ok [INFO] [stdout] test embedding::positional_embedding::tests::test_positional_embedding_too_long ... ok [INFO] [stdout] test nabla::tensor::tests::test_transpose ... ok [INFO] [stdout] test tokenizer::bpe::tests::test_bpe_learn ... ok [INFO] [stdout] test tokenizer::bpe::tests::test_bpe_tokenizer_simple ... ok [INFO] [stdout] test nabla::memory_opt::tests::test_batch_matmul_3d_2d ... ok [INFO] [stdout] test tokenizer::basic_tokenizer::tests::test_basic_tokenizer_encode_decode ... ok [INFO] [stdout] test tokenizer::basic_tokenizer::tests::test_basic_tokenizer_tokenize ... ok [INFO] [stdout] test tokenizer::vocab::tests::test_vocab_add_token ... ok [INFO] [stdout] test tokenizer::basic_tokenizer::tests::test_build_vocab ... ok [INFO] [stdout] test tokenizer::word_bpe::tests::test_word_bpe_tokenizer_simple ... ok [INFO] [stdout] test training::tests::tests::test_adam_optimizer_basic ... ok [INFO] [stdout] test training::tests::tests::test_adam_optimizer_momentum ... ok [INFO] [stdout] test training::tests::tests::test_adam_optimizer_with_gradient_clipping ... ok [INFO] [stdout] test training::tests::tests::test_cross_entropy_loss_basic ... ok [INFO] [stdout] test training::tests::tests::test_cross_entropy_loss_with_padding ... ok [INFO] [stdout] test tokenizer::vocab::tests::test_vocab_special_tokens ... ok [INFO] [stdout] test tokenizer::vocab::tests::test_vocab_save_load ... ok [INFO] [stdout] test tokenizer::word_bpe::tests::test_word_bpe_learn ... ok [INFO] [stdout] test training::tests::tests::test_trainer_forward_shape ... ok [INFO] [stdout] test nabla::tensor::tests::test_add_forward_and_backward ... ok [INFO] [stdout] test tokenizer::word_bpe::tests::test_word_boundaries ... ok [INFO] [stdout] test training::tests::tests::test_trainer_initialization ... 
ok [INFO] [stdout] test training::tests::tests::test_trainer_train_step ... ok [INFO] [stdout] test embedding::tests::test_transformer_embedding_forward ... ok [INFO] [stdout] test embedding::tests::test_transformer_embedding_forward_batch ... ok [INFO] [stdout] test embedding::tests::test_transformer_embedding_creation ... ok [INFO] [stdout] test attention::feed_forward::tests::test_batched_feed_forward ... ok [INFO] [stdout] test attention::encoder::tests::test_encoder_layer_forward ... ok [INFO] [stdout] test attention::encoder::tests::test_encoder_layer_with_mask ... ok [INFO] [stdout] test attention::encoder_stack::tests::test_encoder_stack_forward ... ok [INFO] [stdout] test attention::encoder_stack::tests::test_encoder_stack_with_padding_mask ... ok [INFO] [stdout] test attention::encoder_stack::tests::test_encoder_stack_with_mask ... ok [INFO] [stdout] test nabla::memory_opt::tests::test_measure_memory_bandwidth ... ok [INFO] [stdout] [INFO] [stdout] test result: ok. 80 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 30.27s [INFO] [stdout] [INFO] [stderr] Running unittests src/bin/train_enhanced_model.rs (/opt/rustwide/target/debug/deps/Wall_E-bba3f943a83d00e2) [INFO] [stdout] [INFO] [stdout] running 0 tests [INFO] [stdout] [INFO] [stdout] test result: ok. 0 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.00s [INFO] [stdout] [INFO] [stderr] Running unittests src/bin/batch_size_test.rs (/opt/rustwide/target/debug/deps/batch_size_test-250d98d15fcb0827) [INFO] [stdout] [INFO] [stdout] running 0 tests [INFO] [stdout] [INFO] [stdout] test result: ok. 0 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.00s [INFO] [stdout] [INFO] [stderr] Running unittests src/bin/diagnose_training.rs (/opt/rustwide/target/debug/deps/diagnose_training-159c50cd65ba6668) [INFO] [stdout] [INFO] [stderr] Running unittests src/bin/memory_test.rs (/opt/rustwide/target/debug/deps/memory_test-0f0c672c5f4e7a79) [INFO] [stdout] running 0 tests [INFO] [stdout] [INFO] [stdout] test result: ok. 0 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.00s [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] running 0 tests [INFO] [stdout] [INFO] [stdout] test result: ok. 0 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.00s [INFO] [stdout] [INFO] [stderr] Running unittests src/bin/test_multithreading.rs (/opt/rustwide/target/debug/deps/test_multithreading-677acc978ff4f5ec) [INFO] [stdout] [INFO] [stdout] running 0 tests [INFO] [stdout] [INFO] [stdout] test result: ok. 0 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.00s [INFO] [stdout] [INFO] [stderr] Running unittests src/bin/train_model_test.rs (/opt/rustwide/target/debug/deps/train_model_test-dfcdeec95e7ffb66) [INFO] [stdout] [INFO] [stdout] running 0 tests [INFO] [stderr] Doc-tests wall_e1 [INFO] [stdout] [INFO] [stdout] test result: ok. 0 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.00s [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] running 48 tests [INFO] [stdout] test src/nabla/tensor.rs - nabla::tensor::Tensor::add (line 119) ... FAILED [INFO] [stdout] test src/nabla/tensor.rs - nabla::tensor::Tensor::matmul (line 175) ... FAILED [INFO] [stdout] test src/nabla/tensor.rs - nabla::tensor::Tensor::matmul_with (line 288) ... FAILED [INFO] [stdout] test src/embedding/positional_embedding.rs - embedding::positional_embedding::PositionalEmbedding (line 20) ... 
FAILED [INFO] [stdout] test src/export/mod.rs - export::save_model (line 82) ... FAILED [INFO] [stdout] test src/export/mod.rs - export::verify_model (line 321) ... FAILED [INFO] [stdout] test src/nabla/tensor.rs - nabla::tensor::Tensor::mse (line 489) ... FAILED [INFO] [stdout] test src/nabla/tensor.rs - nabla::tensor::Tensor::new_from_array (line 569) ... FAILED [INFO] [stdout] test src/export/mod.rs - export::load_model (line 209) ... FAILED [INFO] [stdout] test src/nabla/tensor.rs - nabla::tensor::Tensor::relu (line 342) ... FAILED [INFO] [stdout] test src/nabla/tensor.rs - nabla::tensor::Tensor::square (line 389) ... FAILED [INFO] [stdout] test src/nabla/tensor.rs - nabla::tensor::Tensor::sum (line 441) ... FAILED [INFO] [stdout] test src/nabla/tensor.rs - nabla::tensor::Tensor::transpose (line 234) ... FAILED [INFO] [stdout] test src/tokenizer/basic_tokenizer.rs - tokenizer::basic_tokenizer::BasicTokenizer::new (line 42) ... FAILED [INFO] [stdout] test src/embedding/mod.rs - embedding::TransformerEmbedding::from_vocab (line 120) ... ok [INFO] [stdout] test src/embedding/positional_embedding.rs - embedding::positional_embedding::PositionalEmbedding::forward_batch (line 141) ... ok [INFO] [stdout] test src/embedding/mod.rs - embedding::TransformerEmbedding::forward_batch (line 195) ... ok [INFO] [stdout] test src/embedding/token_embedding.rs - embedding::token_embedding::TokenEmbedding::new (line 36) ... ok [INFO] [stdout] test src/nabla/tensor.rs - nabla::tensor::set_num_threads (line 15) ... ok [INFO] [stdout] test src/embedding/positional_embedding.rs - embedding::positional_embedding::PositionalEmbedding::new (line 61) ... ok [INFO] [stdout] test src/tokenizer/bpe.rs - tokenizer::bpe::BPETokenizer::new (line 40) ... FAILED [INFO] [stdout] test src/embedding/positional_embedding.rs - embedding::positional_embedding::PositionalEmbedding::forward_batch_3d (line 195) ... ok [INFO] [stdout] test src/embedding/mod.rs - embedding::TransformerEmbedding::new (line 81) ... ok [INFO] [stdout] test src/embedding/mod.rs - embedding::TransformerEmbedding::forward (line 158) ... ok [INFO] [stdout] test src/tokenizer/basic_tokenizer.rs - tokenizer::basic_tokenizer::BasicTokenizer (line 13) ... FAILED [INFO] [stdout] test src/embedding/positional_embedding.rs - embedding::positional_embedding::PositionalEmbedding::embedding_matrix (line 113) ... ok [INFO] [stdout] test src/tokenizer/vocab.rs - tokenizer::vocab::Vocab::load (line 317) - compile ... ok [INFO] [stdout] test src/tokenizer/vocab.rs - tokenizer::vocab::Vocab::add_token (line 83) ... ok [INFO] [stdout] test src/tokenizer/vocab.rs - tokenizer::vocab::Vocab::id_to_token (line 173) ... ok [INFO] [stdout] test src/tokenizer/basic_tokenizer.rs - tokenizer::basic_tokenizer::BasicTokenizer::build_vocab (line 104) ... ok [INFO] [stdout] test src/embedding/positional_embedding.rs - embedding::positional_embedding::PositionalEmbedding::forward (line 248) ... ok [INFO] [stdout] test src/training/mod.rs - training::CrossEntropyLoss (line 80) ... FAILED [INFO] [stdout] test src/tokenizer/basic_tokenizer.rs - tokenizer::basic_tokenizer::BasicTokenizer::with_vocab (line 69) ... ok [INFO] [stdout] test src/tokenizer/vocab.rs - tokenizer::vocab::Vocab::add_special_token (line 120) ... ok [INFO] [stdout] test src/tokenizer/vocab.rs - tokenizer::vocab::Vocab::save (line 272) - compile ... ok [INFO] [stdout] test src/tokenizer/bpe.rs - tokenizer::bpe::BPETokenizer (line 12) ... ok [INFO] [stdout] test src/tokenizer/mod.rs - tokenizer::Tokenizer (line 18) ... 
ok [INFO] [stdout] test src/training/mod.rs - training::AdamOptimizer (line 192) ... FAILED [INFO] [stdout] test src/training/mod.rs - training::Trainer (line 374) ... FAILED [INFO] [stdout] test src/tokenizer/vocab.rs - tokenizer::vocab::Vocab::len (line 220) ... ok [INFO] [stdout] test src/tokenizer/bpe.rs - tokenizer::bpe::BPETokenizer::with_vocab (line 81) ... ok [INFO] [stdout] test src/training/mod.rs - training::ModelOutput (line 50) ... FAILED [INFO] [stdout] test src/tokenizer/vocab.rs - tokenizer::vocab::Vocab (line 14) ... ok [INFO] [stdout] test src/tokenizer/bpe.rs - tokenizer::bpe::BPETokenizer::learn_bpe (line 128) ... ok [INFO] [stdout] test src/tokenizer/vocab.rs - tokenizer::vocab::Vocab::new (line 53) ... ok [INFO] [stdout] test src/tokenizer/vocab.rs - tokenizer::vocab::Vocab::token_to_id (line 148) ... ok [INFO] [stdout] test src/tokenizer/vocab.rs - tokenizer::vocab::Vocab::is_empty (line 244) ... ok [INFO] [stdout] test src/tokenizer/vocab.rs - tokenizer::vocab::Vocab::is_special_token (line 198) ... ok [INFO] [stdout] [INFO] [stdout] failures: [INFO] [stdout] [INFO] [stdout] ---- src/nabla/tensor.rs - nabla::tensor::Tensor::add (line 119) stdout ---- [INFO] [stdout] error[E0433]: failed to resolve: use of unresolved module or unlinked crate `nabla` [INFO] [stdout] --> src/nabla/tensor.rs:120:5 [INFO] [stdout] | [INFO] [stdout] 3 | use nabla::tensor::Tensor; [INFO] [stdout] | ^^^^^ use of unresolved module or unlinked crate `nabla` [INFO] [stdout] | [INFO] [stdout] = help: if you wanted to use a crate named `nabla`, use `cargo add nabla` to add it to your `Cargo.toml` [INFO] [stdout] [INFO] [stdout] error: aborting due to 1 previous error [INFO] [stdout] [INFO] [stdout] For more information about this error, try `rustc --explain E0433`. [INFO] [stdout] Couldn't compile the test. [INFO] [stdout] ---- src/nabla/tensor.rs - nabla::tensor::Tensor::matmul (line 175) stdout ---- [INFO] [stdout] error[E0433]: failed to resolve: use of unresolved module or unlinked crate `nabla` [INFO] [stdout] --> src/nabla/tensor.rs:176:5 [INFO] [stdout] | [INFO] [stdout] 3 | use nabla::tensor::Tensor; [INFO] [stdout] | ^^^^^ use of unresolved module or unlinked crate `nabla` [INFO] [stdout] | [INFO] [stdout] = help: if you wanted to use a crate named `nabla`, use `cargo add nabla` to add it to your `Cargo.toml` [INFO] [stdout] [INFO] [stdout] error: aborting due to 1 previous error [INFO] [stdout] [INFO] [stdout] For more information about this error, try `rustc --explain E0433`. [INFO] [stdout] Couldn't compile the test. [INFO] [stdout] ---- src/nabla/tensor.rs - nabla::tensor::Tensor::matmul_with (line 288) stdout ---- [INFO] [stdout] error[E0433]: failed to resolve: use of unresolved module or unlinked crate `nabla` [INFO] [stdout] --> src/nabla/tensor.rs:289:5 [INFO] [stdout] | [INFO] [stdout] 3 | use nabla::tensor::Tensor; [INFO] [stdout] | ^^^^^ use of unresolved module or unlinked crate `nabla` [INFO] [stdout] | [INFO] [stdout] = help: if you wanted to use a crate named `nabla`, use `cargo add nabla` to add it to your `Cargo.toml` [INFO] [stdout] [INFO] [stdout] error: aborting due to 1 previous error [INFO] [stdout] [INFO] [stdout] For more information about this error, try `rustc --explain E0433`. [INFO] [stdout] Couldn't compile the test. 
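Every `E0433` doctest failure here has the same shape: rustdoc compiles each doc example as a small standalone crate that links this library under its crate name, `wall_e1`, so an import written against the internal `nabla` module path cannot resolve. A sketch of the corrected example header, assuming nothing about `Tensor` beyond the path the compiler itself suggests:

    // What rustdoc effectively compiles for each example; only the import needs
    // to change for E0433 to go away.
    use wall_e1::nabla::tensor::Tensor;   // was: use nabla::tensor::Tensor;

    fn main() {
        // Body of the doc example unchanged; referencing the type here only keeps
        // the sketch self-contained.
        println!("{}", std::any::type_name::<Tensor>());
    }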
[INFO] [stdout] ---- src/embedding/positional_embedding.rs - embedding::positional_embedding::PositionalEmbedding (line 20) stdout ---- [INFO] [stdout] error[E0599]: no method named `forward` found for struct `PositionalEmbedding` in the current scope [INFO] [stdout] --> src/embedding/positional_embedding.rs:29:39 [INFO] [stdout] | [INFO] [stdout] 12 | let positional_embeddings = embedding.forward(&token_ids); [INFO] [stdout] | ^^^^^^^ [INFO] [stdout] | [INFO] [stdout] ::: /opt/rustwide/workdir/src/embedding/mod.rs:21:8 [INFO] [stdout] | [INFO] [stdout] 21 | fn forward(&self, token_ids: &[usize]) -> Tensor; [INFO] [stdout] | ------- the method is available for `PositionalEmbedding` here [INFO] [stdout] | [INFO] [stdout] = help: items from traits can only be used if the trait is in scope [INFO] [stdout] help: there is a method `forward_batch` with a similar name, but with different arguments [INFO] [stdout] --> /opt/rustwide/workdir/src/embedding/positional_embedding.rs:150:5 [INFO] [stdout] | [INFO] [stdout] 150 | pub fn forward_batch(&self, batch_size: usize, seq_len: usize) -> Tensor { [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] help: trait `Embedding` which provides `forward` is implemented but not in scope; perhaps you want to import it [INFO] [stdout] | [INFO] [stdout] 2 + use crate::wall_e1::Embedding; [INFO] [stdout] | [INFO] [stdout] [INFO] [stdout] error: aborting due to 1 previous error [INFO] [stdout] [INFO] [stdout] For more information about this error, try `rustc --explain E0599`. [INFO] [stdout] Couldn't compile the test. [INFO] [stdout] ---- src/export/mod.rs - export::save_model (line 82) stdout ---- [INFO] [stdout] error[E0433]: failed to resolve: unresolved import [INFO] [stdout] --> src/export/mod.rs:85:12 [INFO] [stdout] | [INFO] [stdout] 5 | use crate::tokenizer::basic_tokenizer::BasicTokenizer; [INFO] [stdout] | ^^^^^^^^^ [INFO] [stdout] | | [INFO] [stdout] | unresolved import [INFO] [stdout] | help: a similar path exists: `wall_e1::tokenizer` [INFO] [stdout] [INFO] [stdout] error[E0433]: failed to resolve: unresolved import [INFO] [stdout] --> src/export/mod.rs:86:12 [INFO] [stdout] | [INFO] [stdout] 6 | use crate::nabla::tensor::Tensor; [INFO] [stdout] | ^^^^^ [INFO] [stdout] | | [INFO] [stdout] | unresolved import [INFO] [stdout] | help: a similar path exists: `wall_e1::nabla` [INFO] [stdout] [INFO] [stdout] error[E0432]: unresolved import `crate::export` [INFO] [stdout] --> src/export/mod.rs:87:12 [INFO] [stdout] | [INFO] [stdout] 7 | use crate::export::save_model; [INFO] [stdout] | ^^^^^^ [INFO] [stdout] | | [INFO] [stdout] | unresolved import [INFO] [stdout] | help: a similar path exists: `wall_e1::export` [INFO] [stdout] [INFO] [stdout] error: aborting due to 3 previous errors [INFO] [stdout] [INFO] [stdout] Some errors have detailed explanations: E0432, E0433. [INFO] [stdout] For more information about an error, try `rustc --explain E0432`. [INFO] [stdout] Couldn't compile the test. 
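The `export` doctest failures are the same resolution problem in another guise: inside a doc example, `crate::` names the throwaway doctest crate, not the library, which is why the compiler keeps pointing at the `wall_e1::` paths instead. A sketch of the corrected imports, taking only the paths from the help text above:

    // Corrected doc-example imports for src/export/mod.rs (sketch based on the
    // "help: a similar path exists" suggestions).
    use wall_e1::tokenizer::basic_tokenizer::BasicTokenizer;  // was: use crate::tokenizer::basic_tokenizer::BasicTokenizer;
    use wall_e1::nabla::tensor::Tensor;                       // was: use crate::nabla::tensor::Tensor;
    // likewise: use wall_e1::export::save_model;             // was: use crate::export::save_model;

    fn main() {
        // Referencing the imported types keeps this sketch compiling without
        // guessing any function signatures from the crate.
        println!("{} / {}", std::any::type_name::<BasicTokenizer>(), std::any::type_name::<Tensor>());
    }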
[INFO] [stdout] ---- src/export/mod.rs - export::verify_model (line 321) stdout ---- [INFO] [stdout] error[E0432]: unresolved import `crate::export` [INFO] [stdout] --> src/export/mod.rs:322:12 [INFO] [stdout] | [INFO] [stdout] 3 | use crate::export::verify_model; [INFO] [stdout] | ^^^^^^ [INFO] [stdout] | | [INFO] [stdout] | unresolved import [INFO] [stdout] | help: a similar path exists: `wall_e1::export` [INFO] [stdout] [INFO] [stdout] error: aborting due to 1 previous error [INFO] [stdout] [INFO] [stdout] For more information about this error, try `rustc --explain E0432`. [INFO] [stdout] Couldn't compile the test. [INFO] [stdout] ---- src/nabla/tensor.rs - nabla::tensor::Tensor::mse (line 489) stdout ---- [INFO] [stdout] error[E0433]: failed to resolve: use of unresolved module or unlinked crate `nabla` [INFO] [stdout] --> src/nabla/tensor.rs:490:5 [INFO] [stdout] | [INFO] [stdout] 3 | use nabla::tensor::Tensor; [INFO] [stdout] | ^^^^^ use of unresolved module or unlinked crate `nabla` [INFO] [stdout] | [INFO] [stdout] = help: if you wanted to use a crate named `nabla`, use `cargo add nabla` to add it to your `Cargo.toml` [INFO] [stdout] [INFO] [stdout] error: aborting due to 1 previous error [INFO] [stdout] [INFO] [stdout] For more information about this error, try `rustc --explain E0433`. [INFO] [stdout] Couldn't compile the test. [INFO] [stdout] ---- src/nabla/tensor.rs - nabla::tensor::Tensor::new_from_array (line 569) stdout ---- [INFO] [stdout] error[E0433]: failed to resolve: use of unresolved module or unlinked crate `nabla` [INFO] [stdout] --> src/nabla/tensor.rs:570:5 [INFO] [stdout] | [INFO] [stdout] 3 | use nabla::tensor::Tensor; [INFO] [stdout] | ^^^^^ use of unresolved module or unlinked crate `nabla` [INFO] [stdout] | [INFO] [stdout] = help: if you wanted to use a crate named `nabla`, use `cargo add nabla` to add it to your `Cargo.toml` [INFO] [stdout] [INFO] [stdout] error: aborting due to 1 previous error [INFO] [stdout] [INFO] [stdout] For more information about this error, try `rustc --explain E0433`. [INFO] [stdout] Couldn't compile the test. [INFO] [stdout] ---- src/export/mod.rs - export::load_model (line 209) stdout ---- [INFO] [stdout] error[E0433]: failed to resolve: unresolved import [INFO] [stdout] --> src/export/mod.rs:210:12 [INFO] [stdout] | [INFO] [stdout] 3 | use crate::tokenizer::basic_tokenizer::BasicTokenizer; [INFO] [stdout] | ^^^^^^^^^ [INFO] [stdout] | | [INFO] [stdout] | unresolved import [INFO] [stdout] | help: a similar path exists: `wall_e1::tokenizer` [INFO] [stdout] [INFO] [stdout] error[E0432]: unresolved import `crate::export` [INFO] [stdout] --> src/export/mod.rs:211:12 [INFO] [stdout] | [INFO] [stdout] 4 | use crate::export::load_model; [INFO] [stdout] | ^^^^^^ [INFO] [stdout] | | [INFO] [stdout] | unresolved import [INFO] [stdout] | help: a similar path exists: `wall_e1::export` [INFO] [stdout] [INFO] [stdout] error: aborting due to 2 previous errors [INFO] [stdout] [INFO] [stdout] Some errors have detailed explanations: E0432, E0433. [INFO] [stdout] For more information about an error, try `rustc --explain E0432`. [INFO] [stdout] Couldn't compile the test. 
[INFO] [stdout] ---- src/nabla/tensor.rs - nabla::tensor::Tensor::relu (line 342) stdout ---- [INFO] [stdout] error[E0433]: failed to resolve: use of unresolved module or unlinked crate `nabla` [INFO] [stdout] --> src/nabla/tensor.rs:343:5 [INFO] [stdout] | [INFO] [stdout] 3 | use nabla::tensor::Tensor; [INFO] [stdout] | ^^^^^ use of unresolved module or unlinked crate `nabla` [INFO] [stdout] | [INFO] [stdout] = help: if you wanted to use a crate named `nabla`, use `cargo add nabla` to add it to your `Cargo.toml` [INFO] [stdout] [INFO] [stdout] error: aborting due to 1 previous error [INFO] [stdout] [INFO] [stdout] For more information about this error, try `rustc --explain E0433`. [INFO] [stdout] Couldn't compile the test. [INFO] [stdout] ---- src/nabla/tensor.rs - nabla::tensor::Tensor::square (line 389) stdout ---- [INFO] [stdout] error[E0433]: failed to resolve: use of unresolved module or unlinked crate `nabla` [INFO] [stdout] --> src/nabla/tensor.rs:390:5 [INFO] [stdout] | [INFO] [stdout] 3 | use nabla::tensor::Tensor; [INFO] [stdout] | ^^^^^ use of unresolved module or unlinked crate `nabla` [INFO] [stdout] | [INFO] [stdout] = help: if you wanted to use a crate named `nabla`, use `cargo add nabla` to add it to your `Cargo.toml` [INFO] [stdout] [INFO] [stdout] error: aborting due to 1 previous error [INFO] [stdout] [INFO] [stdout] For more information about this error, try `rustc --explain E0433`. [INFO] [stdout] Couldn't compile the test. [INFO] [stdout] ---- src/nabla/tensor.rs - nabla::tensor::Tensor::sum (line 441) stdout ---- [INFO] [stdout] error[E0433]: failed to resolve: use of unresolved module or unlinked crate `nabla` [INFO] [stdout] --> src/nabla/tensor.rs:442:5 [INFO] [stdout] | [INFO] [stdout] 3 | use nabla::tensor::Tensor; [INFO] [stdout] | ^^^^^ use of unresolved module or unlinked crate `nabla` [INFO] [stdout] | [INFO] [stdout] = help: if you wanted to use a crate named `nabla`, use `cargo add nabla` to add it to your `Cargo.toml` [INFO] [stdout] [INFO] [stdout] error: aborting due to 1 previous error [INFO] [stdout] [INFO] [stdout] For more information about this error, try `rustc --explain E0433`. [INFO] [stdout] Couldn't compile the test. [INFO] [stdout] ---- src/nabla/tensor.rs - nabla::tensor::Tensor::transpose (line 234) stdout ---- [INFO] [stdout] error[E0433]: failed to resolve: use of unresolved module or unlinked crate `nabla` [INFO] [stdout] --> src/nabla/tensor.rs:235:5 [INFO] [stdout] | [INFO] [stdout] 3 | use nabla::tensor::Tensor; [INFO] [stdout] | ^^^^^ use of unresolved module or unlinked crate `nabla` [INFO] [stdout] | [INFO] [stdout] = help: if you wanted to use a crate named `nabla`, use `cargo add nabla` to add it to your `Cargo.toml` [INFO] [stdout] [INFO] [stdout] error: aborting due to 1 previous error [INFO] [stdout] [INFO] [stdout] For more information about this error, try `rustc --explain E0433`. [INFO] [stdout] Couldn't compile the test. 
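The remaining compile failures (`forward` on `PositionalEmbedding` earlier, `vocab_size` on the tokenizers next) are `E0599` in doc examples: the method is supplied by a trait (`Embedding` in src/embedding/mod.rs, `Tokenizer` in src/tokenizer/mod.rs) that the example never brings into scope, so method lookup fails even though the trait is implemented. A sketch of the fix, reusing the `wall_e1::tokenizer::Tokenizer` import that other binaries in this log already use:

    // Sketch of the E0599 fix: bring the trait that declares the method into scope
    // (or use it as a bound); the path below appears elsewhere in this log.
    use wall_e1::tokenizer::Tokenizer;

    fn print_vocab_size<T: Tokenizer>(tok: &T) {
        // With the trait in scope, `vocab_size` resolves as a trait method.
        println!("vocab size: {}", tok.vocab_size());
    }

    fn main() {
        // The failing doc examples need the equivalent one-line import; this only
        // checks that the bound is satisfied for BasicTokenizer.
        let _ = print_vocab_size::<wall_e1::tokenizer::basic_tokenizer::BasicTokenizer>;
    }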
[INFO] [stdout] ---- src/tokenizer/basic_tokenizer.rs - tokenizer::basic_tokenizer::BasicTokenizer::new (line 42) stdout ----
[INFO] [stdout] error[E0599]: no method named `vocab_size` found for struct `BasicTokenizer` in the current scope
[INFO] [stdout]   --> src/tokenizer/basic_tokenizer.rs:47:22
[INFO] [stdout]    |
[INFO] [stdout] 8  | assert_eq!(tokenizer.vocab_size(), 2);
[INFO] [stdout]    |                      ^^^^^^^^^^ method not found in `BasicTokenizer`
[INFO] [stdout]    |
[INFO] [stdout]   ::: /opt/rustwide/workdir/src/tokenizer/mod.rs:48:8
[INFO] [stdout]    |
[INFO] [stdout] 48 | fn vocab_size(&self) -> usize;
[INFO] [stdout]    |    ---------- the method is available for `BasicTokenizer` here
[INFO] [stdout]    |
[INFO] [stdout]    = help: items from traits can only be used if the trait is in scope
[INFO] [stdout] help: trait `Tokenizer` which provides `vocab_size` is implemented but not in scope; perhaps you want to import it
[INFO] [stdout]    |
[INFO] [stdout]  2 + use crate::wall_e1::Tokenizer;
[INFO] [stdout]    |
[INFO] [stdout]
[INFO] [stdout] error: aborting due to 1 previous error
[INFO] [stdout]
[INFO] [stdout] For more information about this error, try `rustc --explain E0599`.
[INFO] [stdout] Couldn't compile the test.
[INFO] [stdout] ---- src/tokenizer/bpe.rs - tokenizer::bpe::BPETokenizer::new (line 40) stdout ----
[INFO] [stdout] error[E0599]: no method named `vocab_size` found for struct `BPETokenizer` in the current scope
[INFO] [stdout]   --> src/tokenizer/bpe.rs:46:19
[INFO] [stdout]    |
[INFO] [stdout] 9  | assert!(tokenizer.vocab_size() > 0);
[INFO] [stdout]    |                   ^^^^^^^^^^ method not found in `BPETokenizer`
[INFO] [stdout]    |
[INFO] [stdout]   ::: /opt/rustwide/workdir/src/tokenizer/mod.rs:48:8
[INFO] [stdout]    |
[INFO] [stdout] 48 | fn vocab_size(&self) -> usize;
[INFO] [stdout]    |    ---------- the method is available for `BPETokenizer` here
[INFO] [stdout]    |
[INFO] [stdout]    = help: items from traits can only be used if the trait is in scope
[INFO] [stdout] help: trait `Tokenizer` which provides `vocab_size` is implemented but not in scope; perhaps you want to import it
[INFO] [stdout]    |
[INFO] [stdout]  2 + use crate::wall_e1::Tokenizer;
[INFO] [stdout]    |
[INFO] [stdout]
[INFO] [stdout] error: aborting due to 1 previous error
[INFO] [stdout]
[INFO] [stdout] For more information about this error, try `rustc --explain E0599`.
[INFO] [stdout] Couldn't compile the test.
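The two E0599 failures above are scoping problems rather than missing methods: `vocab_size` is declared on the `Tokenizer` trait in src/tokenizer/mod.rs, and a trait method can only be called when the trait is in scope. Inside the merged doctest harness rustc proposes `use crate::wall_e1::Tokenizer;`, which implies the trait is reachable directly under the crate root; written in the doctest source, the fix would look roughly like this (the exact re-export path is an assumption):

    // Sketch: bring the Tokenizer trait into scope so `vocab_size()` resolves.
    // `wall_e1::Tokenizer` follows rustc's suggestion above; if the trait is not
    // re-exported at the root, `wall_e1::tokenizer::Tokenizer` (where it is
    // defined) would be the alternative.
    use wall_e1::Tokenizer;
    use wall_e1::tokenizer::basic_tokenizer::BasicTokenizer;
    // ... construct the tokenizer as in the original example, then:
    // assert_eq!(tokenizer.vocab_size(), 2);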
[INFO] [stdout] ---- src/tokenizer/basic_tokenizer.rs - tokenizer::basic_tokenizer::BasicTokenizer (line 13) stdout ----
[INFO] [stdout] Test executable failed (exit status: 101).
[INFO] [stdout]
[INFO] [stdout] stderr:
[INFO] [stdout]
[INFO] [stdout] thread 'main' (804) panicked at src/tokenizer/basic_tokenizer.rs:16:1:
[INFO] [stdout] assertion `left == right` failed
[INFO] [stdout]   left: ["hello", "[UNK]", "world", "[UNK]"]
[INFO] [stdout]  right: ["hello", ",", "world", "!"]
[INFO] [stdout] stack backtrace:
[INFO] [stdout]    0: 0x56e54fcb8ae2 - std::backtrace_rs::backtrace::libunwind::trace::h52580dd202462214
[INFO] [stdout]                at /rustc/ad85bc524b1ad696e42061ad8338d382dffbdbe5/library/std/src/../../backtrace/src/backtrace/libunwind.rs:117:9
[INFO] [stdout]    1: 0x56e54fcb8ae2 - std::backtrace_rs::backtrace::trace_unsynchronized::hc969519abce0f52b
[INFO] [stdout]                at /rustc/ad85bc524b1ad696e42061ad8338d382dffbdbe5/library/std/src/../../backtrace/src/backtrace/mod.rs:66:14
[INFO] [stdout]    2: 0x56e54fcb8ae2 - std::sys::backtrace::_print_fmt::hfd5825900b6e0030
[INFO] [stdout]                at /rustc/ad85bc524b1ad696e42061ad8338d382dffbdbe5/library/std/src/sys/backtrace.rs:66:9
[INFO] [stdout]    3: 0x56e54fcb8ae2 - ::fmt::h427144ad75cfb218
[INFO] [stdout]                at /rustc/ad85bc524b1ad696e42061ad8338d382dffbdbe5/library/std/src/sys/backtrace.rs:39:26
[INFO] [stdout]    4: 0x56e54fcc8def - core::fmt::rt::Argument::fmt::hd5ccc9cf97cea7f7
[INFO] [stdout]                at /rustc/ad85bc524b1ad696e42061ad8338d382dffbdbe5/library/core/src/fmt/rt.rs:173:76
[INFO] [stdout]    5: 0x56e54fcc8def - core::fmt::write::h593aaf5adf0f5dae
[INFO] [stdout]                at /rustc/ad85bc524b1ad696e42061ad8338d382dffbdbe5/library/core/src/fmt/mod.rs:1468:25
[INFO] [stdout]    6: 0x56e54fc912f1 - std::io::default_write_fmt::hf108c2855e591a7c
[INFO] [stdout]                at /rustc/ad85bc524b1ad696e42061ad8338d382dffbdbe5/library/std/src/io/mod.rs:639:11
[INFO] [stdout]    7: 0x56e54fc912f1 - std::io::Write::write_fmt::h6e02e29a2bcc97c1
[INFO] [stdout]                at /rustc/ad85bc524b1ad696e42061ad8338d382dffbdbe5/library/std/src/io/mod.rs:1954:13
[INFO] [stdout]    8: 0x56e54fc985c2 - std::sys::backtrace::BacktraceLock::print::h3953113552dca3ca
[INFO] [stdout]                at /rustc/ad85bc524b1ad696e42061ad8338d382dffbdbe5/library/std/src/sys/backtrace.rs:42:9
[INFO] [stdout]    9: 0x56e54fc9c6bf - std::panicking::default_hook::{{closure}}::h486b96ad75eafbb8
[INFO] [stdout]                at /rustc/ad85bc524b1ad696e42061ad8338d382dffbdbe5/library/std/src/panicking.rs:301:27
[INFO] [stdout]   10: 0x56e54fc9c519 - std::panicking::default_hook::h99456317c5a1a20c
[INFO] [stdout]                at /rustc/ad85bc524b1ad696e42061ad8338d382dffbdbe5/library/std/src/panicking.rs:328:9
[INFO] [stdout]   11: 0x56e54fc9cdf5 - std::panicking::panic_with_hook::h67ab0df20212e4ea
[INFO] [stdout]                at /rustc/ad85bc524b1ad696e42061ad8338d382dffbdbe5/library/std/src/panicking.rs:834:13
[INFO] [stdout]   12: 0x56e54fc9cbda - std::panicking::panic_handler::{{closure}}::hc9e4a933ae92e208
[INFO] [stdout]                at /rustc/ad85bc524b1ad696e42061ad8338d382dffbdbe5/library/std/src/panicking.rs:707:13
[INFO] [stdout]   13: 0x56e54fc986f9 - std::sys::backtrace::__rust_end_short_backtrace::hbfa72df9b68c2d19
[INFO] [stdout]                at /rustc/ad85bc524b1ad696e42061ad8338d382dffbdbe5/library/std/src/sys/backtrace.rs:174:18
[INFO] [stdout]   14: 0x56e54fc85b6d - __rustc[8cce077e14951490]::rust_begin_unwind
[INFO] [stdout]                at /rustc/ad85bc524b1ad696e42061ad8338d382dffbdbe5/library/std/src/panicking.rs:698:5
[INFO] [stdout]   15: 0x56e54fcccaf0 - core::panicking::panic_fmt::h49931053d20abf41
[INFO] [stdout]                at /rustc/ad85bc524b1ad696e42061ad8338d382dffbdbe5/library/core/src/panicking.rs:75:14
[INFO] [stdout]   16: 0x56e54fccc8f3 - core::panicking::assert_failed_inner::h3eabd3c15a9e5a8c
[INFO] [stdout]                at /rustc/ad85bc524b1ad696e42061ad8338d382dffbdbe5/library/core/src/panicking.rs:439:17
[INFO] [stdout]   17: 0x56e54fc2cfbc - core::panicking::assert_failed::h9c8b4befd9aca731
[INFO] [stdout]   18: 0x56e54fc2d67f - rust_out::main::_doctest_main_src_tokenizer_basic_tokenizer_rs_13_0::h67beeb33fb87d65f
[INFO] [stdout]   19: 0x56e54fc2d436 - rust_out::main::h5d9ae37ea63e4164
[INFO] [stdout]   20: 0x56e54fc2cc43 - core::ops::function::FnOnce::call_once::h1775aadbd0a7f189
[INFO] [stdout]   21: 0x56e54fc2cbc6 - std::sys::backtrace::__rust_begin_short_backtrace::h1f0ca296c06226ef
[INFO] [stdout]   22: 0x56e54fc2cba9 - std::rt::lang_start::{{closure}}::hf5b26ba6479d0f2b
[INFO] [stdout]   23: 0x56e54fc928c0 - core::ops::function::impls:: for &F>::call_once::hb84e9e033646564e
[INFO] [stdout]                at /rustc/ad85bc524b1ad696e42061ad8338d382dffbdbe5/library/core/src/ops/function.rs:290:21
[INFO] [stdout]   24: 0x56e54fc928c0 - std::panicking::catch_unwind::do_call::hbd46cd07e9e2b512
[INFO] [stdout]                at /rustc/ad85bc524b1ad696e42061ad8338d382dffbdbe5/library/std/src/panicking.rs:590:40
[INFO] [stdout]   25: 0x56e54fc928c0 - std::panicking::catch_unwind::hd048d6861d343544
[INFO] [stdout]                at /rustc/ad85bc524b1ad696e42061ad8338d382dffbdbe5/library/std/src/panicking.rs:553:19
[INFO] [stdout]   26: 0x56e54fc928c0 - std::panic::catch_unwind::h141ca8973277f822
[INFO] [stdout]                at /rustc/ad85bc524b1ad696e42061ad8338d382dffbdbe5/library/std/src/panic.rs:359:14
[INFO] [stdout]   27: 0x56e54fc928c0 - std::rt::lang_start_internal::{{closure}}::hc1f3273d5e4e0426
[INFO] [stdout]                at /rustc/ad85bc524b1ad696e42061ad8338d382dffbdbe5/library/std/src/rt.rs:175:24
[INFO] [stdout]   28: 0x56e54fc928c0 - std::panicking::catch_unwind::do_call::hc215ba2b02ac1f8c
[INFO] [stdout]                at /rustc/ad85bc524b1ad696e42061ad8338d382dffbdbe5/library/std/src/panicking.rs:590:40
[INFO] [stdout]   29: 0x56e54fc928c0 - std::panicking::catch_unwind::h67f54bbdc14963b0
[INFO] [stdout]                at /rustc/ad85bc524b1ad696e42061ad8338d382dffbdbe5/library/std/src/panicking.rs:553:19
[INFO] [stdout]   30: 0x56e54fc928c0 - std::panic::catch_unwind::h7d01c664f3f34538
[INFO] [stdout]                at /rustc/ad85bc524b1ad696e42061ad8338d382dffbdbe5/library/std/src/panic.rs:359:14
[INFO] [stdout]   31: 0x56e54fc928c0 - std::rt::lang_start_internal::hcbf1d15fae983a0f
[INFO] [stdout]                at /rustc/ad85bc524b1ad696e42061ad8338d382dffbdbe5/library/std/src/rt.rs:171:5
[INFO] [stdout]   32: 0x56e54fc2cb91 - std::rt::lang_start::hb5324c43cbf32b6f
[INFO] [stdout]   33: 0x56e54fc2d6e5 - main
[INFO] [stdout]   34: 0x7aec9460e1ca -
[INFO] [stdout]   35: 0x7aec9460e28b - __libc_start_main
[INFO] [stdout]   36: 0x56e54fc2caa5 - _start
[INFO] [stdout]   37: 0x0 -
[INFO] [stdout]
[INFO] [stdout]
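Unlike the other failures in this run, the report above is a runtime assertion failure rather than a compile error: the doctest beginning at line 13 of src/tokenizer/basic_tokenizer.rs (the assertion is at line 16) expects the punctuation tokens `,` and `!` to survive tokenization, while `BasicTokenizer` apparently returns `[UNK]` for them. A minimal, self-contained illustration of the mismatch, built only from the two values printed in the panic message (the doctest's own code is not shown in this log):

    fn main() {
        // Values copied verbatim from the panic message above.
        let actual   = vec!["hello", "[UNK]", "world", "[UNK]"]; // left: what the tokenizer produced
        let expected = vec!["hello", ",", "world", "!"];         // right: what the doctest asserts
        assert_ne!(actual, expected); // hence the doctest's assert_eq!(left, right) panics
    }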
[INFO] [stdout] ---- src/training/mod.rs - training::CrossEntropyLoss (line 80) stdout ----
[INFO] [stdout] error[E0433]: failed to resolve: unresolved import
[INFO] [stdout]  --> src/training/mod.rs:82:12
[INFO] [stdout]   |
[INFO] [stdout] 4 | use crate::nabla::tensor::Tensor;
[INFO] [stdout]   |            ^^^^^
[INFO] [stdout]   |            |
[INFO] [stdout]   |            unresolved import
[INFO] [stdout]   |            help: a similar path exists: `wall_e1::nabla`
[INFO] [stdout]
[INFO] [stdout] error[E0432]: unresolved import `crate::training`
[INFO] [stdout]  --> src/training/mod.rs:83:12
[INFO] [stdout]   |
[INFO] [stdout] 5 | use crate::training::CrossEntropyLoss;
[INFO] [stdout]   |            ^^^^^^^^
[INFO] [stdout]   |            |
[INFO] [stdout]   |            unresolved import
[INFO] [stdout]   |            help: a similar path exists: `wall_e1::training`
[INFO] [stdout]
[INFO] [stdout] error: aborting due to 2 previous errors
[INFO] [stdout]
[INFO] [stdout] Some errors have detailed explanations: E0432, E0433.
[INFO] [stdout] For more information about an error, try `rustc --explain E0432`.
[INFO] [stdout] Couldn't compile the test.
[INFO] [stdout] ---- src/training/mod.rs - training::AdamOptimizer (line 192) stdout ----
[INFO] [stdout] error[E0433]: failed to resolve: unresolved import
[INFO] [stdout]  --> src/training/mod.rs:195:12
[INFO] [stdout]   |
[INFO] [stdout] 5 | use crate::nabla::tensor::Tensor;
[INFO] [stdout]   |            ^^^^^
[INFO] [stdout]   |            |
[INFO] [stdout]   |            unresolved import
[INFO] [stdout]   |            help: a similar path exists: `wall_e1::nabla`
[INFO] [stdout]
[INFO] [stdout] error[E0432]: unresolved import `crate::training`
[INFO] [stdout]  --> src/training/mod.rs:196:12
[INFO] [stdout]   |
[INFO] [stdout] 6 | use crate::training::AdamOptimizer;
[INFO] [stdout]   |            ^^^^^^^^
[INFO] [stdout]   |            |
[INFO] [stdout]   |            unresolved import
[INFO] [stdout]   |            help: a similar path exists: `wall_e1::training`
[INFO] [stdout]
[INFO] [stdout] error: aborting due to 2 previous errors
[INFO] [stdout]
[INFO] [stdout] Some errors have detailed explanations: E0432, E0433.
[INFO] [stdout] For more information about an error, try `rustc --explain E0432`.
[INFO] [stdout] Couldn't compile the test.
[INFO] [stdout] ---- src/training/mod.rs - training::Trainer (line 374) stdout ----
[INFO] [stdout] error[E0433]: failed to resolve: unresolved import
[INFO] [stdout]  --> src/training/mod.rs:375:12
[INFO] [stdout]   |
[INFO] [stdout] 3 | use crate::tokenizer::basic_tokenizer::BasicTokenizer;
[INFO] [stdout]   |            ^^^^^^^^^
[INFO] [stdout]   |            |
[INFO] [stdout]   |            unresolved import
[INFO] [stdout]   |            help: a similar path exists: `wall_e1::tokenizer`
[INFO] [stdout]
[INFO] [stdout] error[E0432]: unresolved import `crate::training`
[INFO] [stdout]  --> src/training/mod.rs:376:12
[INFO] [stdout]   |
[INFO] [stdout] 4 | use crate::training::Trainer;
[INFO] [stdout]   |            ^^^^^^^^
[INFO] [stdout]   |            |
[INFO] [stdout]   |            unresolved import
[INFO] [stdout]   |            help: a similar path exists: `wall_e1::training`
[INFO] [stdout]
[INFO] [stdout] error: aborting due to 2 previous errors
[INFO] [stdout]
[INFO] [stdout] Some errors have detailed explanations: E0432, E0433.
[INFO] [stdout] For more information about an error, try `rustc --explain E0432`.
[INFO] [stdout] Couldn't compile the test.
[INFO] [stdout] ---- src/training/mod.rs - training::ModelOutput (line 50) stdout ----
[INFO] [stdout] error[E0433]: failed to resolve: unresolved import
[INFO] [stdout]  --> src/training/mod.rs:52:12
[INFO] [stdout]   |
[INFO] [stdout] 4 | use crate::nabla::tensor::Tensor;
[INFO] [stdout]   |            ^^^^^
[INFO] [stdout]   |            |
[INFO] [stdout]   |            unresolved import
[INFO] [stdout]   |            help: a similar path exists: `wall_e1::nabla`
[INFO] [stdout]
[INFO] [stdout] error[E0432]: unresolved import `crate::training`
[INFO] [stdout]  --> src/training/mod.rs:53:12
[INFO] [stdout]   |
[INFO] [stdout] 5 | use crate::training::ModelOutput;
[INFO] [stdout]   |            ^^^^^^^^
[INFO] [stdout]   |            |
[INFO] [stdout]   |            unresolved import
[INFO] [stdout]   |            help: a similar path exists: `wall_e1::training`
[INFO] [stdout]
[INFO] [stdout] error: aborting due to 2 previous errors
[INFO] [stdout]
[INFO] [stdout] Some errors have detailed explanations: E0432, E0433.
[INFO] [stdout] For more information about an error, try `rustc --explain E0432`.
[INFO] [stdout] Couldn't compile the test.
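The four `training` doctest failures above are the same `crate::`-path problem seen in the `export` examples earlier in this log; rustc again points at the library's public modules (`wall_e1::nabla`, `wall_e1::training`, `wall_e1::tokenizer`). A sketch of the corrected imports, with item names taken from the failing `use` lines above and the rest of each example left as in the source:

    // Sketch: doctest imports rewritten to the paths rustc suggests.
    use wall_e1::nabla::tensor::Tensor;
    use wall_e1::tokenizer::basic_tokenizer::BasicTokenizer;
    use wall_e1::training::{AdamOptimizer, CrossEntropyLoss, ModelOutput, Trainer};
    // ... rest of each example unchanged ...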
[INFO] [stdout]
[INFO] [stdout] failures:
[INFO] [stdout]     src/embedding/positional_embedding.rs - embedding::positional_embedding::PositionalEmbedding (line 20)
[INFO] [stdout]     src/export/mod.rs - export::load_model (line 209)
[INFO] [stdout]     src/export/mod.rs - export::save_model (line 82)
[INFO] [stdout]     src/export/mod.rs - export::verify_model (line 321)
[INFO] [stdout]     src/nabla/tensor.rs - nabla::tensor::Tensor::add (line 119)
[INFO] [stdout]     src/nabla/tensor.rs - nabla::tensor::Tensor::matmul (line 175)
[INFO] [stdout]     src/nabla/tensor.rs - nabla::tensor::Tensor::matmul_with (line 288)
[INFO] [stdout]     src/nabla/tensor.rs - nabla::tensor::Tensor::mse (line 489)
[INFO] [stdout]     src/nabla/tensor.rs - nabla::tensor::Tensor::new_from_array (line 569)
[INFO] [stdout]     src/nabla/tensor.rs - nabla::tensor::Tensor::relu (line 342)
[INFO] [stdout]     src/nabla/tensor.rs - nabla::tensor::Tensor::square (line 389)
[INFO] [stdout]     src/nabla/tensor.rs - nabla::tensor::Tensor::sum (line 441)
[INFO] [stdout]     src/nabla/tensor.rs - nabla::tensor::Tensor::transpose (line 234)
[INFO] [stdout]     src/tokenizer/basic_tokenizer.rs - tokenizer::basic_tokenizer::BasicTokenizer (line 13)
[INFO] [stdout]     src/tokenizer/basic_tokenizer.rs - tokenizer::basic_tokenizer::BasicTokenizer::new (line 42)
[INFO] [stdout]     src/tokenizer/bpe.rs - tokenizer::bpe::BPETokenizer::new (line 40)
[INFO] [stdout]     src/training/mod.rs - training::AdamOptimizer (line 192)
[INFO] [stdout]     src/training/mod.rs - training::CrossEntropyLoss (line 80)
[INFO] [stdout]     src/training/mod.rs - training::ModelOutput (line 50)
[INFO] [stdout]     src/training/mod.rs - training::Trainer (line 374)
[INFO] [stdout]
[INFO] [stdout] test result: FAILED. 28 passed; 20 failed; 0 ignored; 0 measured; 0 filtered out; finished in 3.44s
[INFO] [stdout]
[INFO] [stdout] all doctests ran in 3.90s; merged doctests compilation took 0.45s
[INFO] [stderr] error: doctest failed, to rerun pass `--doc`
[INFO] running `Command { std: "docker" "inspect" "bc0c4acc05c38cedd5b4cb3b87e54371ed33308076a1a9fac86c266c35c72f97", kill_on_drop: false }`
[INFO] running `Command { std: "docker" "rm" "-f" "bc0c4acc05c38cedd5b4cb3b87e54371ed33308076a1a9fac86c266c35c72f97", kill_on_drop: false }`
[INFO] [stdout] bc0c4acc05c38cedd5b4cb3b87e54371ed33308076a1a9fac86c266c35c72f97