[INFO] fetching crate scirs2-neural 0.1.0-alpha.6...
[INFO] checking scirs2-neural-0.1.0-alpha.6 against try#c659ee110de67e82444e4b6c8407c1a9af9c2cf6 for pr-145608-1
[INFO] extracting crate scirs2-neural 0.1.0-alpha.6 into /workspace/builds/worker-4-tc2/source
[INFO] started tweaking crates.io crate scirs2-neural 0.1.0-alpha.6
[INFO] removed 0 missing examples
[INFO] finished tweaking crates.io crate scirs2-neural 0.1.0-alpha.6
[INFO] tweaked toml for crates.io crate scirs2-neural 0.1.0-alpha.6 written to /workspace/builds/worker-4-tc2/source/Cargo.toml
[INFO] validating manifest of crates.io crate scirs2-neural 0.1.0-alpha.6 on toolchain c659ee110de67e82444e4b6c8407c1a9af9c2cf6
[INFO] running `Command { std: CARGO_HOME="/workspace/cargo-home" RUSTUP_HOME="/workspace/rustup-home" "/workspace/cargo-home/bin/cargo" "+c659ee110de67e82444e4b6c8407c1a9af9c2cf6" "metadata" "--manifest-path" "Cargo.toml" "--no-deps", kill_on_drop: false }`
[INFO] crate crates.io crate scirs2-neural 0.1.0-alpha.6 already has a lockfile, it will not be regenerated
[INFO] running `Command { std: CARGO_HOME="/workspace/cargo-home" RUSTUP_HOME="/workspace/rustup-home" "/workspace/cargo-home/bin/cargo" "+c659ee110de67e82444e4b6c8407c1a9af9c2cf6" "fetch" "--manifest-path" "Cargo.toml", kill_on_drop: false }`
[INFO] running `Command { std: "docker" "create" "-v" "/var/lib/crater-agent-workspace/builds/worker-4-tc2/target:/opt/rustwide/target:rw,Z" "-v" "/var/lib/crater-agent-workspace/builds/worker-4-tc2/source:/opt/rustwide/workdir:ro,Z" "-v" "/var/lib/crater-agent-workspace/cargo-home:/opt/rustwide/cargo-home:ro,Z" "-v" "/var/lib/crater-agent-workspace/rustup-home:/opt/rustwide/rustup-home:ro,Z" "-e" "SOURCE_DIR=/opt/rustwide/workdir" "-e" "CARGO_TARGET_DIR=/opt/rustwide/target" "-e" "CARGO_HOME=/opt/rustwide/cargo-home" "-e" "RUSTUP_HOME=/opt/rustwide/rustup-home" "-w" "/opt/rustwide/workdir" "-m" "1610612736" "--user" "0:0" "--network" "none" "ghcr.io/rust-lang/crates-build-env/linux@sha256:7ad1b28ee6f5f7f699f6cf7015098d6ccdd96d6f2d78dd06228f5b4c9faf309c" "/opt/rustwide/cargo-home/bin/cargo" "+c659ee110de67e82444e4b6c8407c1a9af9c2cf6" "metadata" "--no-deps" "--format-version=1", kill_on_drop: false }`
[INFO] [stdout] 978b12fd440b094562888d061843ac35349e5e6be02f87e07c05187c288e48c9
[INFO] running `Command { std: "docker" "start" "-a" "978b12fd440b094562888d061843ac35349e5e6be02f87e07c05187c288e48c9", kill_on_drop: false }`
[INFO] running `Command { std: "docker" "inspect" "978b12fd440b094562888d061843ac35349e5e6be02f87e07c05187c288e48c9", kill_on_drop: false }`
[INFO] running `Command { std: "docker" "rm" "-f" "978b12fd440b094562888d061843ac35349e5e6be02f87e07c05187c288e48c9", kill_on_drop: false }`
[INFO] [stdout] 978b12fd440b094562888d061843ac35349e5e6be02f87e07c05187c288e48c9
[INFO] running `Command { std: "docker" "create" "-v" "/var/lib/crater-agent-workspace/builds/worker-4-tc2/target:/opt/rustwide/target:rw,Z" "-v" "/var/lib/crater-agent-workspace/builds/worker-4-tc2/source:/opt/rustwide/workdir:ro,Z" "-v" "/var/lib/crater-agent-workspace/cargo-home:/opt/rustwide/cargo-home:ro,Z" "-v" "/var/lib/crater-agent-workspace/rustup-home:/opt/rustwide/rustup-home:ro,Z" "-e" "SOURCE_DIR=/opt/rustwide/workdir" "-e" "CARGO_TARGET_DIR=/opt/rustwide/target" "-e" "CARGO_INCREMENTAL=0" "-e" "RUST_BACKTRACE=full" "-e" "RUSTFLAGS=--cap-lints=forbid" "-e" "RUSTDOCFLAGS=--cap-lints=forbid" "-e" "CARGO_HOME=/opt/rustwide/cargo-home" "-e" "RUSTUP_HOME=/opt/rustwide/rustup-home" "-w" "/opt/rustwide/workdir" "-m" "1610612736" "--user" "0:0" "--network" "none" "ghcr.io/rust-lang/crates-build-env/linux@sha256:7ad1b28ee6f5f7f699f6cf7015098d6ccdd96d6f2d78dd06228f5b4c9faf309c" "/opt/rustwide/cargo-home/bin/cargo" "+c659ee110de67e82444e4b6c8407c1a9af9c2cf6" "check" "--frozen" "--all" "--all-targets" "--message-format=json", kill_on_drop: false }`
[INFO] [stdout] b18b1c2ef3ceb0fd4cf9c2ac5cd8a3d54806b97329d02c3f8ed041b2aad79d7b
[INFO] running `Command { std: "docker" "start" "-a" "b18b1c2ef3ceb0fd4cf9c2ac5cd8a3d54806b97329d02c3f8ed041b2aad79d7b", kill_on_drop: false }`
[INFO] [stderr] Compiling syn v2.0.103
[INFO] [stderr] Checking num-integer v0.1.46
[INFO] [stderr] Compiling num-bigint v0.3.3
[INFO] [stderr] Checking approx v0.5.1
[INFO] [stderr] Compiling num-complex v0.2.4
[INFO] [stderr] Compiling num-rational v0.3.2
[INFO] [stderr] Compiling darling_core v0.14.4
[INFO] [stderr] Compiling matrixmultiply v0.1.15
[INFO] [stderr] Checking ahash v0.8.12
[INFO] [stderr] Checking lapack-sys v0.14.0
[INFO] [stderr] Checking rand_chacha v0.9.0
[INFO] [stderr] Checking rand v0.8.5
[INFO] [stderr] Compiling ndarray v0.12.1
[INFO] [stderr] Checking bytemuck v1.23.1
[INFO] [stderr] Checking rayon v1.10.0
[INFO] [stderr] Checking rawpointer v0.1.0
[INFO] [stderr] Checking crossbeam-channel v0.5.15
[INFO] [stderr] Checking rand v0.9.1
[INFO] [stderr] Checking crossbeam-queue v0.3.12
[INFO] [stderr] Checking num-iter v0.1.45
[INFO] [stderr] Checking hashbrown v0.14.5
[INFO] [stderr] Checking itertools v0.7.11
[INFO] [stderr] Checking num-complex v0.3.1
[INFO] [stderr] Checking instant v0.1.13
[INFO] [stderr] Checking cached_proc_macro_types v0.1.1
[INFO] [stderr] Checking safe_arch v0.7.4
[INFO] [stderr] Checking rmp v0.8.14
[INFO] [stderr] Checking tempfile v3.20.0
[INFO] [stderr] Checking memmap2 v0.9.5
[INFO] [stderr] Checking itertools v0.10.5
[INFO] [stderr] Checking num_cpus v1.17.0
[INFO] [stderr] Checking crossbeam v0.8.4
[INFO] [stderr] Checking matrixmultiply v0.2.4
[INFO] [stderr] Checking noisy_float v0.2.0
[INFO] [stderr] Compiling scirs2-neural v0.1.0-alpha.6 (/opt/rustwide/workdir)
[INFO] [stderr] Checking unsafe-libyaml v0.2.11
[INFO] [stderr] Checking wide v0.7.32
[INFO] [stderr] Checking rand_distr v0.4.3
[INFO] [stderr] Checking rand_distr v0.5.1
[INFO] [stderr] Compiling darling_macro v0.14.4
[INFO] [stderr] Compiling darling v0.14.4
[INFO] [stderr] Checking num v0.3.1
[INFO] [stderr] Compiling cached_proc_macro v0.19.1
[INFO] [stderr] Checking autograd v1.1.1
[INFO] [stderr] Compiling serde_derive v1.0.219
[INFO] [stderr] Compiling thiserror-impl v2.0.12
[INFO] [stderr] Compiling katexit v0.1.5
[INFO] [stderr] Compiling thiserror-impl v1.0.69
[INFO] [stderr] Checking thiserror v2.0.12
[INFO] [stderr] Checking thiserror v1.0.69
[INFO] [stderr] Checking cached v0.48.1
[INFO] [stderr] Checking serde v1.0.219
[INFO] [stderr] Checking num-complex v0.4.6
[INFO] [stderr] Checking serde_json v1.0.140
[INFO] [stderr] Checking uuid v1.17.0
[INFO] [stderr] Checking chrono v0.4.41
[INFO] [stderr] Checking bincode v1.3.3
[INFO] [stderr] Checking rmp-serde v1.3.0
[INFO] [stderr] Checking serde_cbor v0.11.2
[INFO] [stderr] Checking serde_yaml v0.9.34+deprecated
[INFO] [stderr] Checking cauchy v0.4.0
[INFO] [stderr] Checking ndarray v0.16.1
[INFO] [stderr] Checking ndarray v0.15.6
[INFO] [stderr] Checking lax v0.17.0
[INFO] [stderr] Checking ndarray-linalg v0.17.0
[INFO] [stderr] Checking ndarray-rand v0.15.0
[INFO] [stderr] Checking ndarray-stats v0.5.1
[INFO] [stderr] Checking scirs2-core v0.1.0-alpha.6
[INFO] [stdout] error[E0432]: unresolved import `criterion`
[INFO] [stdout] --> benches/neural_benchmarks.rs:1:5
[INFO] [stdout] |
[INFO] [stdout] 1 | use criterion::{black_box, criterion_group, criterion_main, BenchmarkId, Criterion, Throughput};
[INFO] [stdout] | ^^^^^^^^^ use of unresolved module or unlinked crate `criterion`
[INFO] [stdout] |
[INFO] [stdout] = help: if you wanted to use a crate named `criterion`, use `cargo add criterion` to add it to your `Cargo.toml`
[INFO] [stdout]
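The E0432 above means the bench target was built without `criterion` available; since Crater runs `cargo check --all-targets`, `benches/neural_benchmarks.rs` only compiles if `criterion` is declared under `[dev-dependencies]` (and the bench target usually sets `harness = false`). A minimal, self-contained sketch of a Criterion harness of that general shape; the benchmark name, data setup, and the inline relu stand-in are illustrative, not the crate's actual bench code:

```rust
// Hypothetical minimal Criterion harness; assumes `criterion` is listed in
// [dev-dependencies] and the bench target uses `harness = false`.
use criterion::{black_box, criterion_group, criterion_main, Criterion};

fn bench_relu(c: &mut Criterion) {
    // illustrative input data, not the crate's generator
    let data: Vec<f32> = (0..1024).map(|i| i as f32 - 512.0).collect();
    c.bench_function("relu_1024", |b| {
        b.iter(|| {
            // stand-in for the activation under test
            data.iter().map(|&x| black_box(x.max(0.0))).sum::<f32>()
        })
    });
}

criterion_group!(benches, bench_relu);
criterion_main!(benches);
```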
[INFO] [stdout] error[E0433]: failed to resolve: could not find `MLP` in `models`
[INFO] [stdout] --> benches/neural_benchmarks.rs:303:33
[INFO] [stdout] |
[INFO] [stdout] 303 | let mut model = models::MLP::new(&[*input_size, *hidden_size, *hidden_size, *output_size]);
[INFO] [stdout] | ^^^ could not find `MLP` in `models`
[INFO] [stdout]
[INFO] [stdout] error[E0423]: expected function, found module `activations::relu`
[INFO] [stdout] --> benches/neural_benchmarks.rs:45:23
[INFO] [stdout] |
[INFO] [stdout] 45 | b.iter(|| activations::relu(black_box(&data)))
[INFO] [stdout] | ^^^^^^^^^^^^^^^^^ not a function
[INFO] [stdout]
[INFO] [stdout] error[E0423]: expected function, found module `activations::sigmoid`
[INFO] [stdout] --> benches/neural_benchmarks.rs:49:23
[INFO] [stdout] |
[INFO] [stdout] 49 | b.iter(|| activations::sigmoid(black_box(&data)))
[INFO] [stdout] | ^^^^^^^^^^^^^-------
[INFO] [stdout] | |
[INFO] [stdout] | help: a unit struct with a similar name exists (notice the capitalization): `Sigmoid`
[INFO] [stdout] |
[INFO] [stdout] ::: /opt/rustwide/workdir/src/activations/sigmoid.rs:34:1
[INFO] [stdout] |
[INFO] [stdout] 34 | pub struct Sigmoid;
[INFO] [stdout] | ------------------ similarly named unit struct `Sigmoid` defined here
[INFO] [stdout]
[INFO] [stdout] error[E0423]: expected function, found module `activations::tanh`
[INFO] [stdout] --> benches/neural_benchmarks.rs:53:23
[INFO] [stdout] |
[INFO] [stdout] 53 | b.iter(|| activations::tanh(black_box(&data)))
[INFO] [stdout] | ^^^^^^^^^^^^^----
[INFO] [stdout] | |
[INFO] [stdout] | help: a unit struct with a similar name exists: `Tanh`
[INFO] [stdout] |
[INFO] [stdout] ::: /opt/rustwide/workdir/src/activations/tanh.rs:34:1
[INFO] [stdout] |
[INFO] [stdout] 34 | pub struct Tanh;
[INFO] [stdout] | --------------- similarly named unit struct `Tanh` defined here
[INFO] [stdout]
[INFO] [stdout] error[E0425]: cannot find function `leaky_relu` in module `activations`
[INFO] [stdout] --> benches/neural_benchmarks.rs:57:36
[INFO] [stdout] |
[INFO] [stdout] 57 | b.iter(|| activations::leaky_relu(black_box(&data), 0.01))
[INFO] [stdout] | ^^^^^^^^^^ not found in `activations`
[INFO] [stdout]
[INFO] [stdout] error[E0423]: expected function, found module `activations::gelu`
[INFO] [stdout] --> benches/neural_benchmarks.rs:61:23
[INFO] [stdout] |
[INFO] [stdout] 61 | b.iter(|| activations::gelu(black_box(&data)))
[INFO] [stdout] | ^^^^^^^^^^^^^^^^^ not a function
[INFO] [stdout]
[INFO] [stdout] error[E0423]: expected function, found module `activations::swish`
[INFO] [stdout] --> benches/neural_benchmarks.rs:65:23
[INFO] [stdout] |
[INFO] [stdout] 65 | b.iter(|| activations::swish(black_box(&data)))
[INFO] [stdout] | ^^^^^^^^^^^^^^^^^^ not a function
[INFO] [stdout] |
[INFO] [stdout] help: consider importing one of these functions instead
[INFO] [stdout] |
[INFO] [stdout] 1 + use crate::models::efficientnet::swish;
[INFO] [stdout] |
[INFO] [stdout] 1 + use scirs2_neural::models::efficientnet::swish;
[INFO] [stdout] |
[INFO] [stdout] help: if you import `swish`, refer to it directly
[INFO] [stdout] |
[INFO] [stdout] 65 - b.iter(|| activations::swish(black_box(&data)))
[INFO] [stdout] 65 + b.iter(|| swish(black_box(&data)))
[INFO] [stdout] |
[INFO] [stdout]
[INFO] [stdout] error[E0425]: cannot find function `mse_loss` in module `losses`
[INFO] [stdout] --> benches/neural_benchmarks.rs:198:31
[INFO] [stdout] |
[INFO] [stdout] 198 | b.iter(|| losses::mse_loss(black_box(&predictions), black_box(&targets)))
[INFO] [stdout] | ^^^^^^^^ not found in `losses`
[INFO] [stdout]
[INFO] [stdout] error[E0425]: cannot find function `mae_loss` in module `losses`
[INFO] [stdout] --> benches/neural_benchmarks.rs:202:31
[INFO] [stdout] |
[INFO] [stdout] 202 | b.iter(|| losses::mae_loss(black_box(&predictions), black_box(&targets)))
[INFO] [stdout] | ^^^^^^^^ not found in `losses`
[INFO] [stdout]
[INFO] [stdout] error[E0425]: cannot find function `cross_entropy_loss` in module `losses`
[INFO] [stdout] --> benches/neural_benchmarks.rs:210:29
[INFO] [stdout] |
[INFO] [stdout] 210 | losses::cross_entropy_loss(black_box(&pred_probs), black_box(&true_labels))
[INFO] [stdout] | ^^^^^^^^^^^^^^^^^^ not found in `losses`
[INFO] [stdout]
[INFO] [stdout] error[E0425]: cannot find function `binary_cross_entropy_loss` in module `losses`
[INFO] [stdout] --> benches/neural_benchmarks.rs:220:29
[INFO] [stdout] |
[INFO] [stdout] 220 | losses::binary_cross_entropy_loss(black_box(&predictions), black_box(&targets))
[INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^^^^^^ not found in `losses`
[INFO] [stdout]
[INFO] [stdout] error[E0425]: cannot find function `batch_norm` in module `layers`
[INFO] [stdout] --> benches/neural_benchmarks.rs:277:38
[INFO] [stdout] |
[INFO] [stdout] 277 | |b, _| b.iter(|| layers::batch_norm(black_box(&data), None, None, 1e-5)),
[INFO] [stdout] | ^^^^^^^^^^ not found in `layers`
[INFO] [stdout] |
[INFO] [stdout] help: consider importing this function
[INFO] [stdout] |
[INFO] [stdout] 1 + use scirs2_core::array_protocol::batch_norm;
[INFO] [stdout] |
[INFO] [stdout] help: if you import `batch_norm`, refer to it directly
[INFO] [stdout] |
[INFO] [stdout] 277 - |b, _| b.iter(|| layers::batch_norm(black_box(&data), None, None, 1e-5)),
[INFO] [stdout] 277 + |b, _| b.iter(|| batch_norm(black_box(&data), None, None, 1e-5)),
[INFO] [stdout] |
[INFO] [stdout]
[INFO] [stdout] error[E0425]: cannot find function `layer_norm` in module `layers`
[INFO] [stdout] --> benches/neural_benchmarks.rs:283:38
[INFO] [stdout] |
[INFO] [stdout] 283 | |b, _| b.iter(|| layers::layer_norm(black_box(&data), None, None, 1e-5)),
[INFO] [stdout] | ^^^^^^^^^^ not found in `layers`
[INFO] [stdout] |
[INFO] [stdout] help: consider importing one of these functions
[INFO] [stdout] |
[INFO] [stdout] 1 + use crate::linalg::layer_norm;
[INFO] [stdout] |
[INFO] [stdout] 1 + use scirs2_neural::linalg::layer_norm;
[INFO] [stdout] |
[INFO] [stdout] help: if you import `layer_norm`, refer to it directly
[INFO] [stdout] |
[INFO] [stdout] 283 - |b, _| b.iter(|| layers::layer_norm(black_box(&data), None, None, 1e-5)),
[INFO] [stdout] 283 + |b, _| b.iter(|| layer_norm(black_box(&data), None, None, 1e-5)),
[INFO] [stdout] |
[INFO] [stdout]
[INFO] [stdout] error[E0423]: expected function, found module `layers::dropout`
[INFO] [stdout] --> benches/neural_benchmarks.rs:289:30
[INFO] [stdout] |
[INFO] [stdout] 289 | |b, _| b.iter(|| layers::dropout(black_box(&data), 0.5, true)),
[INFO] [stdout] | ^^^^^^^^^^^^^^^ not a function
[INFO] [stdout] |
[INFO] [stdout] help: consider importing one of these functions instead
[INFO] [stdout] |
[INFO] [stdout] 1 + use crate::linalg::dropout;
[INFO] [stdout] |
[INFO] [stdout] 1 + use scirs2_core::array_protocol::dropout;
[INFO] [stdout] |
[INFO] [stdout] 1 + use scirs2_neural::linalg::dropout;
[INFO] [stdout] |
[INFO] [stdout] help: if you import `dropout`, refer to it directly
[INFO] [stdout] |
[INFO] [stdout] 289 - |b, _| b.iter(|| layers::dropout(black_box(&data), 0.5, true)),
[INFO] [stdout] 289 + |b, _| b.iter(|| dropout(black_box(&data), 0.5, true)),
[INFO] [stdout] |
[INFO] [stdout]
[INFO] [stdout] error[E0425]: cannot find function `cross_entropy_loss` in module `losses`
[INFO] [stdout] --> benches/neural_benchmarks.rs:321:40
[INFO] [stdout] |
[INFO] [stdout] 321 | let loss = losses::cross_entropy_loss(&predictions, black_box(&targets));
[INFO] [stdout] | ^^^^^^^^^^^^^^^^^^ not found in `losses`
[INFO] [stdout]
[INFO] [stdout] error[E0603]: module `relu` is private
[INFO] [stdout] --> benches/neural_benchmarks.rs:45:36
[INFO] [stdout] |
[INFO] [stdout] 45 | b.iter(|| activations::relu(black_box(&data)))
[INFO] [stdout] | ^^^^ private module
[INFO] [stdout] |
[INFO] [stdout] note: the module `relu` is defined here
[INFO] [stdout] --> /opt/rustwide/workdir/src/activations/mod.rs:200:1
[INFO] [stdout] |
[INFO] [stdout] 200 | mod relu;
[INFO] [stdout] | ^^^^^^^^
[INFO] [stdout]
[INFO] [stdout] error[E0603]: module `sigmoid` is private
[INFO] [stdout] --> benches/neural_benchmarks.rs:49:36
[INFO] [stdout] |
[INFO] [stdout] 49 | b.iter(|| activations::sigmoid(black_box(&data)))
[INFO] [stdout] | ^^^^^^^ private module
[INFO] [stdout] |
[INFO] [stdout] note: the module `sigmoid` is defined here
[INFO] [stdout] --> /opt/rustwide/workdir/src/activations/mod.rs:201:1
[INFO] [stdout] |
[INFO] [stdout] 201 | mod sigmoid;
[INFO] [stdout] | ^^^^^^^^^^^
[INFO] [stdout]
[INFO] [stdout] error[E0603]: module `tanh` is private
[INFO] [stdout] --> benches/neural_benchmarks.rs:53:36
[INFO] [stdout] |
[INFO] [stdout] 53 | b.iter(|| activations::tanh(black_box(&data)))
[INFO] [stdout] | ^^^^ private module
[INFO] [stdout] |
[INFO] [stdout] note: the module `tanh` is defined here
[INFO] [stdout] --> /opt/rustwide/workdir/src/activations/mod.rs:204:1
[INFO] [stdout] |
[INFO] [stdout] 204 | mod tanh;
[INFO] [stdout] | ^^^^^^^^
[INFO] [stdout]
[INFO] [stdout] error[E0603]: module `gelu` is private
[INFO] [stdout] --> benches/neural_benchmarks.rs:61:36
[INFO] [stdout] |
[INFO] [stdout] 61 | b.iter(|| activations::gelu(black_box(&data)))
[INFO] [stdout] | ^^^^ private module
[INFO] [stdout] |
[INFO] [stdout] note: the module `gelu` is defined here
[INFO] [stdout] --> /opt/rustwide/workdir/src/activations/mod.rs:198:1
[INFO] [stdout] |
[INFO] [stdout] 198 | mod gelu;
[INFO] [stdout] | ^^^^^^^^
[INFO] [stdout]
[INFO] [stdout] error[E0603]: module `swish` is private
[INFO] [stdout] --> benches/neural_benchmarks.rs:65:36
[INFO] [stdout] |
[INFO] [stdout] 65 | b.iter(|| activations::swish(black_box(&data)))
[INFO] [stdout] | ^^^^^ private module
[INFO] [stdout] |
[INFO] [stdout] note: the module `swish` is defined here
[INFO] [stdout] --> /opt/rustwide/workdir/src/activations/mod.rs:203:1
[INFO] [stdout] |
[INFO] [stdout] 203 | mod swish;
[INFO] [stdout] | ^^^^^^^^^
[INFO] [stdout]
[INFO] [stdout] error[E0603]: module `dropout` is private
[INFO] [stdout] --> benches/neural_benchmarks.rs:289:38
[INFO] [stdout] |
[INFO] [stdout] 289 | |b, _| b.iter(|| layers::dropout(black_box(&data), 0.5, true)),
[INFO] [stdout] | ^^^^^^^ private module
[INFO] [stdout] |
[INFO] [stdout] note: the module `dropout` is defined here
[INFO] [stdout] --> /opt/rustwide/workdir/src/layers/mod.rs:487:1
[INFO] [stdout] |
[INFO] [stdout] 487 | mod dropout;
[INFO] [stdout] | ^^^^^^^^^^^
[INFO] [stdout]
[INFO] [stdout] error[E0601]: `main` function not found in crate `neural_benchmarks`
[INFO] [stdout] --> benches/neural_benchmarks.rs:343:26
[INFO] [stdout] |
[INFO] [stdout] 343 | criterion_main!(benches);
[INFO] [stdout] | ^ consider adding a `main` function to `benches/neural_benchmarks.rs`
[INFO] [stdout]
[INFO] [stdout] warning: use of deprecated function `rand::thread_rng`: Renamed to `rng`
[INFO] [stdout] --> benches/neural_benchmarks.rs:7:25
[INFO] [stdout] |
[INFO] [stdout] 7 | let mut rng = rand::thread_rng();
[INFO] [stdout] | ^^^^^^^^^^
[INFO] [stdout] |
[INFO] [stdout] = note: `#[warn(deprecated)]` on by default
[INFO] [stdout]
[INFO] [stdout] warning: use of deprecated function `rand::thread_rng`: Renamed to `rng`
[INFO] [stdout] --> benches/neural_benchmarks.rs:17:25
[INFO] [stdout] |
[INFO] [stdout] 17 | let mut rng = rand::thread_rng();
[INFO] [stdout] | ^^^^^^^^^^
[INFO] [stdout]
[INFO] [stdout] warning: use of deprecated function `rand::thread_rng`: Renamed to `rng`
[INFO] [stdout] --> benches/neural_benchmarks.rs:27:25
[INFO] [stdout] |
[INFO] [stdout] 27 | let mut rng = rand::thread_rng();
[INFO] [stdout] | ^^^^^^^^^^
[INFO] [stdout]
[INFO] [stdout] warning: use of deprecated method `rand::Rng::gen`: Renamed to `random` to avoid conflict with the new `gen` keyword in Rust 2024.
[INFO] [stdout] --> benches/neural_benchmarks.rs:11:37
[INFO] [stdout] |
[INFO] [stdout] 11 | (0..total_size).map(|_| rng.gen::()).collect(),
[INFO] [stdout] | ^^^
[INFO] [stdout]
[INFO] [stdout] warning: use of deprecated method `rand::Rng::gen`: Renamed to `random` to avoid conflict with the new `gen` keyword in Rust 2024.
[INFO] [stdout] --> benches/neural_benchmarks.rs:21:37
[INFO] [stdout] |
[INFO] [stdout] 21 | (0..total_size).map(|_| rng.gen::()).collect(),
[INFO] [stdout] | ^^^
[INFO] [stdout]
[INFO] [stdout] warning: use of deprecated method `rand::Rng::gen`: Renamed to `random` to avoid conflict with the new `gen` keyword in Rust 2024.
[INFO] [stdout] --> benches/neural_benchmarks.rs:31:37
[INFO] [stdout] |
[INFO] [stdout] 31 | (0..total_size).map(|_| rng.gen::()).collect(),
[INFO] [stdout] | ^^^
[INFO] [stdout]
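The six deprecation warnings come from rand 0.9 renaming `thread_rng` to `rng` and `Rng::gen` to `Rng::random`. A small sketch of the replacement calls; the `f32` element type is an assumption, since the snippet in the log has its type parameter elided:

```rust
// Sketch of the rand 0.9 spellings the warnings above point at.
use rand::Rng;

fn random_vec(total_size: usize) -> Vec<f32> {
    let mut rng = rand::rng(); // was: rand::thread_rng()
    (0..total_size).map(|_| rng.random::<f32>()).collect() // was: rng.gen::<f32>()
}

fn main() {
    println!("{:?}", random_vec(4));
}
```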
[INFO] [stdout] error[E0061]: this function takes 4 arguments but 2 arguments were supplied
[INFO] [stdout] --> benches/neural_benchmarks.rs:82:25
[INFO] [stdout] |
[INFO] [stdout] 82 | let mut layer = layers::Dense::new(*input_size, *output_size);
[INFO] [stdout] | ^^^^^^^^^^^^^^^^^^--------------------------- two arguments of type `Option<&str>` and `&mut _` are missing
[INFO] [stdout] |
[INFO] [stdout] note: associated function defined here
[INFO] [stdout] --> /opt/rustwide/workdir/src/layers/dense.rs:113:12
[INFO] [stdout] |
[INFO] [stdout] 113 | pub fn new(
[INFO] [stdout] | ^^^
[INFO] [stdout] help: provide the arguments
[INFO] [stdout] |
[INFO] [stdout] 82 | let mut layer = layers::Dense::new(*input_size, *output_size, /* Option<&str> */, /* rng */);
[INFO] [stdout] | +++++++++++++++++++++++++++++++
[INFO] [stdout]
[INFO] [stdout] error[E0609]: no field `weights` on type `Result, NeuralError>`
[INFO] [stdout] --> benches/neural_benchmarks.rs:83:15
[INFO] [stdout] |
[INFO] [stdout] 83 | layer.weights = weights.clone();
[INFO] [stdout] | ^^^^^^^ unknown field
[INFO] [stdout]
[INFO] [stdout] error[E0609]: no field `bias` on type `Result, NeuralError>`
[INFO] [stdout] --> benches/neural_benchmarks.rs:84:15
[INFO] [stdout] |
[INFO] [stdout] 84 | layer.bias = bias.clone();
[INFO] [stdout] | ^^^^ unknown field
[INFO] [stdout]
[INFO] [stdout] error[E0599]: no method named `forward` found for enum `Result` in the current scope
[INFO] [stdout] --> benches/neural_benchmarks.rs:96:36
[INFO] [stdout] |
[INFO] [stdout] 96 | |b, _| b.iter(|| layer.forward(black_box(&input))),
[INFO] [stdout] | ^^^^^^^ method not found in `Result, NeuralError>`
[INFO] [stdout]
[INFO] [stdout] error[E0599]: no method named `backward` found for enum `Result` in the current scope
[INFO] [stdout] --> benches/neural_benchmarks.rs:108:36
[INFO] [stdout] |
[INFO] [stdout] 108 | |b, _| b.iter(|| layer.backward(black_box(&input), black_box(&output_grad))),
[INFO] [stdout] | ^^^^^^^^ method not found in `Result, NeuralError>`
[INFO] [stdout]
[INFO] [stdout] error[E0061]: this function takes 6 arguments but 5 arguments were supplied
[INFO] [stdout] --> benches/neural_benchmarks.rs:126:30
[INFO] [stdout] |
[INFO] [stdout] 126 | let mut conv_layer = layers::Conv2D::new(*channels, *filters, kernel_size, stride, padding);
[INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^---------------------------------------------------
[INFO] [stdout] | | |
[INFO] [stdout] | | expected `PaddingMode`, found integer
[INFO] [stdout] | argument #6 of type `&mut _` is missing
[INFO] [stdout] |
[INFO] [stdout] note: expected `(usize, usize)`, found integer
[INFO] [stdout] --> benches/neural_benchmarks.rs:126:71
[INFO] [stdout] |
[INFO] [stdout] 126 | let mut conv_layer = layers::Conv2D::new(*channels, *filters, kernel_size, stride, padding);
[INFO] [stdout] | ^^^^^^^^^^^
[INFO] [stdout] = note: expected tuple `(usize, usize)`
[INFO] [stdout] found type `{integer}`
[INFO] [stdout] note: expected `(usize, usize)`, found integer
[INFO] [stdout] --> benches/neural_benchmarks.rs:126:84
[INFO] [stdout] |
[INFO] [stdout] 126 | let mut conv_layer = layers::Conv2D::new(*channels, *filters, kernel_size, stride, padding);
[INFO] [stdout] | ^^^^^^
[INFO] [stdout] = note: expected tuple `(usize, usize)`
[INFO] [stdout] found type `{integer}`
[INFO] [stdout] note: associated function defined here
[INFO] [stdout] --> /opt/rustwide/workdir/src/layers/conv/conv2d.rs:94:12
[INFO] [stdout] |
[INFO] [stdout] 94 | pub fn new(
[INFO] [stdout] | ^^^
[INFO] [stdout] help: try wrapping the expression in `scirs2_neural::layers::PaddingMode::Custom`
[INFO] [stdout] |
[INFO] [stdout] 126 | let mut conv_layer = layers::Conv2D::new(*channels, *filters, kernel_size, stride, scirs2_neural::layers::PaddingMode::Custom(padding));
[INFO] [stdout] | +++++++++++++++++++++++++++++++++++++++++++ +
[INFO] [stdout] help: provide the argument
[INFO] [stdout] |
[INFO] [stdout] 126 - let mut conv_layer = layers::Conv2D::new(*channels, *filters, kernel_size, stride, padding);
[INFO] [stdout] 126 + let mut conv_layer = layers::Conv2D::new(*channels, *filters, /* (usize, usize) */, /* (usize, usize) */, /* PaddingMode */, /* rng */);
[INFO] [stdout] |
[INFO] [stdout]
[INFO] [stdout] error[E0599]: no method named `forward` found for enum `Result` in the current scope
[INFO] [stdout] --> benches/neural_benchmarks.rs:138:41
[INFO] [stdout] |
[INFO] [stdout] 138 | |b, _| b.iter(|| conv_layer.forward(black_box(&input))),
[INFO] [stdout] | ^^^^^^^ method not found in `Result, NeuralError>`
[INFO] [stdout]
[INFO] [stdout] error[E0061]: this function takes 3 arguments but 2 arguments were supplied
[INFO] [stdout] --> benches/neural_benchmarks.rs:153:30
[INFO] [stdout] |
[INFO] [stdout] 153 | let mut lstm_layer = layers::LSTM::new(*input_size, *hidden_size);
[INFO] [stdout] | ^^^^^^^^^^^^^^^^^--------------------------- argument #3 of type `&mut _` is missing
[INFO] [stdout] |
[INFO] [stdout] note: associated function defined here
[INFO] [stdout] --> /opt/rustwide/workdir/src/layers/recurrent/lstm.rs:116:12
[INFO] [stdout] |
[INFO] [stdout] 116 | pub fn new(input_size: usize, hidden_size: usize, rng: &mut R) -> Result {
[INFO] [stdout] | ^^^
[INFO] [stdout] help: provide the argument
[INFO] [stdout] |
[INFO] [stdout] 153 | let mut lstm_layer = layers::LSTM::new(*input_size, *hidden_size, /* rng */);
[INFO] [stdout] | +++++++++++
[INFO] [stdout]
[INFO] [stdout] error[E0061]: this function takes 3 arguments but 2 arguments were supplied
[INFO] [stdout] --> benches/neural_benchmarks.rs:154:29
[INFO] [stdout] |
[INFO] [stdout] 154 | let mut gru_layer = layers::GRU::new(*input_size, *hidden_size);
[INFO] [stdout] | ^^^^^^^^^^^^^^^^--------------------------- argument #3 of type `&mut _` is missing
[INFO] [stdout] |
[INFO] [stdout] note: associated function defined here
[INFO] [stdout] --> /opt/rustwide/workdir/src/layers/recurrent/gru.rs:104:12
[INFO] [stdout] |
[INFO] [stdout] 104 | pub fn new(input_size: usize, hidden_size: usize, rng: &mut R) -> Result {
[INFO] [stdout] | ^^^
[INFO] [stdout] help: provide the argument
[INFO] [stdout] |
[INFO] [stdout] 154 | let mut gru_layer = layers::GRU::new(*input_size, *hidden_size, /* rng */);
[INFO] [stdout] | +++++++++++
[INFO] [stdout]
[INFO] [stdout] error[E0599]: no method named `forward` found for enum `Result` in the current scope
[INFO] [stdout] --> benches/neural_benchmarks.rs:166:41
[INFO] [stdout] |
[INFO] [stdout] 166 | |b, _| b.iter(|| lstm_layer.forward(black_box(&input))),
[INFO] [stdout] | ^^^^^^^ method not found in `Result, NeuralError>`
[INFO] [stdout]
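The E0609/E0599 errors on `layer.weights`, `layer.forward`, and friends all share one cause: the layer constructors return `Result<_, NeuralError>`, and the bench uses that `Result` as if it were the layer itself. A self-contained illustration of the pattern with dummy types (deliberately not the crate's real `Dense` API), showing where the unwrap has to happen:

```rust
// Toy types only; the field names and signatures are illustrative.
struct Dense {
    weights: Vec<f32>,
}

#[derive(Debug)]
struct NeuralError;

impl Dense {
    fn new(inputs: usize, outputs: usize) -> Result<Dense, NeuralError> {
        Ok(Dense { weights: vec![0.0; inputs * outputs] })
    }

    fn forward(&self, input: &[f32]) -> Vec<f32> {
        // stand-in forward pass: just echo the input
        let _ = &self.weights;
        input.to_vec()
    }
}

fn main() {
    // Calling .forward() directly on the Result is what triggers E0599 above;
    // the Result must be unwrapped (expect, ?, or match) before the layer is usable.
    let layer = Dense::new(4, 2).expect("constructing the layer failed");
    let out = layer.forward(&[1.0, 2.0, 3.0, 4.0]);
    println!("{out:?}");
}
```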
[INFO] [stdout] error[E0599]: no method named `forward` found for enum `Result` in the current scope
[INFO] [stdout] --> benches/neural_benchmarks.rs:175:40
[INFO] [stdout] |
[INFO] [stdout] 175 | |b, _| b.iter(|| gru_layer.forward(black_box(&input))),
[INFO] [stdout] | ^^^^^^^ method not found in `Result, NeuralError>`
[INFO] [stdout]
[INFO] [stdout] error[E0061]: this function takes 1 argument but 3 arguments were supplied
[INFO] [stdout] --> benches/neural_benchmarks.rs:240:27
[INFO] [stdout] |
[INFO] [stdout] 240 | let mut rmsprop = optimizers::RMSprop::new(0.001, 0.9, 1e-8);
[INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^^^^^ --- ---- unexpected argument #3 of type `{float}`
[INFO] [stdout] | |
[INFO] [stdout] | unexpected argument #2 of type `{float}`
[INFO] [stdout] |
[INFO] [stdout] note: associated function defined here
[INFO] [stdout] --> /opt/rustwide/workdir/src/optimizers/rmsprop.rs:51:12
[INFO] [stdout] |
[INFO] [stdout] 51 | pub fn new(learning_rate: F) -> Self {
[INFO] [stdout] | ^^^
[INFO] [stdout] help: remove the extra arguments
[INFO] [stdout] |
[INFO] [stdout] 240 - let mut rmsprop = optimizers::RMSprop::new(0.001, 0.9, 1e-8);
[INFO] [stdout] 240 + let mut rmsprop = optimizers::RMSprop::new(0.001);
[INFO] [stdout] |
[INFO] [stdout]
[INFO] [stdout] error[E0599]: no method named `update` found for struct `scirs2_neural::optimizers::SGD` in the current scope
[INFO] [stdout] --> benches/neural_benchmarks.rs:247:34
[INFO] [stdout] |
[INFO] [stdout] 247 | |b, _| b.iter(|| sgd.update(black_box(&params), black_box(&gradients))),
[INFO] [stdout] | ^^^^^^ method not found in `scirs2_neural::optimizers::SGD<{float}>`
[INFO] [stdout] |
[INFO] [stdout] ::: /opt/rustwide/workdir/src/optimizers/mod.rs:23:8
[INFO] [stdout] |
[INFO] [stdout] 23 | fn update(
[INFO] [stdout] | ------ the method is available for `scirs2_neural::optimizers::SGD<{float}>` here
[INFO] [stdout] |
[INFO] [stdout] = help: items from traits can only be used if the trait is in scope
[INFO] [stdout] help: trait `Optimizer` which provides `update` is implemented but not in scope; perhaps you want to import it
[INFO] [stdout] |
[INFO] [stdout] 1 + use scirs2_neural::optimizers::Optimizer;
[INFO] [stdout] |
[INFO] [stdout]
[INFO] [stdout] error[E0599]: no method named `update` found for struct `scirs2_neural::optimizers::Adam` in the current scope
[INFO] [stdout] --> benches/neural_benchmarks.rs:253:35
[INFO] [stdout] |
[INFO] [stdout] 253 | |b, _| b.iter(|| adam.update(black_box(&params), black_box(&gradients))),
[INFO] [stdout] | ^^^^^^ method not found in `scirs2_neural::optimizers::Adam<{float}>`
[INFO] [stdout] |
[INFO] [stdout] ::: /opt/rustwide/workdir/src/optimizers/mod.rs:23:8
[INFO] [stdout] |
[INFO] [stdout] 23 | fn update(
[INFO] [stdout] | ------ the method is available for `scirs2_neural::optimizers::Adam<{float}>` here
[INFO] [stdout] |
[INFO] [stdout] = help: items from traits can only be used if the trait is in scope
[INFO] [stdout] help: trait `Optimizer` which provides `update` is implemented but not in scope; perhaps you want to import it
[INFO] [stdout] |
[INFO] [stdout] 1 + use scirs2_neural::optimizers::Optimizer;
[INFO] [stdout] |
[INFO] [stdout]
[INFO] [stdout] error[E0599]: no method named `update` found for struct `RMSprop` in the current scope
[INFO] [stdout] --> benches/neural_benchmarks.rs:259:38
[INFO] [stdout] |
[INFO] [stdout] 259 | |b, _| b.iter(|| rmsprop.update(black_box(&params), black_box(&gradients))),
[INFO] [stdout] | ^^^^^^ method not found in `RMSprop<{float}>`
[INFO] [stdout] |
[INFO] [stdout] ::: /opt/rustwide/workdir/src/optimizers/mod.rs:23:8
[INFO] [stdout] |
[INFO] [stdout] 23 | fn update(
[INFO] [stdout] | ------ the method is available for `RMSprop<{float}>` here
[INFO] [stdout] |
[INFO] [stdout] = help: items from traits can only be used if the trait is in scope
[INFO] [stdout] help: trait `Optimizer` which provides `update` is implemented but not in scope; perhaps you want to import it
[INFO] [stdout] |
[INFO] [stdout] 1 + use scirs2_neural::optimizers::Optimizer;
[INFO] [stdout] |
[INFO] [stdout]
[INFO] [stdout] Some errors have detailed explanations: E0061, E0423, E0425, E0432, E0433, E0599, E0601, E0603, E0609.
[INFO] [stdout]
[INFO] [stdout] For more information about an error, try `rustc --explain E0061`.
[INFO] [stdout]
[INFO] [stderr] error: could not compile `scirs2-neural` (bench "neural_benchmarks") due to 38 previous errors; 6 warnings emitted
[INFO] [stderr] warning: build failed, waiting for other jobs to finish...
[INFO] running `Command { std: "docker" "inspect" "b18b1c2ef3ceb0fd4cf9c2ac5cd8a3d54806b97329d02c3f8ed041b2aad79d7b", kill_on_drop: false }`
[INFO] running `Command { std: "docker" "rm" "-f" "b18b1c2ef3ceb0fd4cf9c2ac5cd8a3d54806b97329d02c3f8ed041b2aad79d7b", kill_on_drop: false }`
[INFO] [stdout] b18b1c2ef3ceb0fd4cf9c2ac5cd8a3d54806b97329d02c3f8ed041b2aad79d7b
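The three `update` errors at the end are the usual trait-not-in-scope case; the compiler's own suggestion is `use scirs2_neural::optimizers::Optimizer;`. A self-contained illustration of the same pattern with a toy optimizer (the module, struct, and field names here are hypothetical; only the import-the-trait fix mirrors the suggestion above):

```rust
// Toy module standing in for an optimizer crate; not scirs2-neural's real API.
mod optimizers {
    pub trait Optimizer {
        fn update(&mut self, params: &mut [f32], grads: &[f32]);
    }

    pub struct Sgd {
        pub learning_rate: f32,
    }

    impl Optimizer for Sgd {
        fn update(&mut self, params: &mut [f32], grads: &[f32]) {
            for (p, g) in params.iter_mut().zip(grads) {
                *p -= self.learning_rate * g;
            }
        }
    }
}

// Without this import, sgd.update(...) fails with the same E0599 shown above;
// in the bench the equivalent line is `use scirs2_neural::optimizers::Optimizer;`.
use crate::optimizers::Optimizer;

fn main() {
    let mut sgd = optimizers::Sgd { learning_rate: 0.01 };
    let mut params = vec![1.0_f32, 2.0];
    sgd.update(&mut params, &[0.5, 0.5]);
    println!("{params:?}");
}
```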