[INFO] cloning repository https://github.com/maxjeffos/rust-neural-network-experiments [INFO] running `Command { std: "git" "-c" "credential.helper=" "-c" "credential.helper=/workspace/cargo-home/bin/git-credential-null" "clone" "--bare" "https://github.com/maxjeffos/rust-neural-network-experiments" "/workspace/cache/git-repos/https%3A%2F%2Fgithub.com%2Fmaxjeffos%2Frust-neural-network-experiments", kill_on_drop: false }` [INFO] [stderr] Cloning into bare repository '/workspace/cache/git-repos/https%3A%2F%2Fgithub.com%2Fmaxjeffos%2Frust-neural-network-experiments'... [INFO] running `Command { std: "git" "rev-parse" "HEAD", kill_on_drop: false }` [INFO] [stdout] c4c2539f42a371031368d45b92d7a75516ec12f0 [INFO] testing maxjeffos/rust-neural-network-experiments against 1.84.0 for beta-1.85-1 [INFO] running `Command { std: "git" "clone" "/workspace/cache/git-repos/https%3A%2F%2Fgithub.com%2Fmaxjeffos%2Frust-neural-network-experiments" "/workspace/builds/worker-1-tc1/source", kill_on_drop: false }` [INFO] [stderr] Cloning into '/workspace/builds/worker-1-tc1/source'... [INFO] [stderr] done. [INFO] [stderr] Updating files: 79% (61/77) Updating files: 80% (62/77) Updating files: 81% (63/77) Updating files: 83% (64/77) Updating files: 84% (65/77) Updating files: 85% (66/77) Updating files: 87% (67/77) Updating files: 88% (68/77) Updating files: 89% (69/77) Updating files: 90% (70/77) Updating files: 92% (71/77) Updating files: 93% (72/77) Updating files: 94% (73/77) Updating files: 96% (74/77) Updating files: 97% (75/77) Updating files: 98% (76/77) Updating files: 100% (77/77) Updating files: 100% (77/77), done. [INFO] validating manifest of git repo https://github.com/maxjeffos/rust-neural-network-experiments on toolchain 1.84.0 [INFO] running `Command { std: CARGO_HOME="/workspace/cargo-home" RUSTUP_HOME="/workspace/rustup-home" "/workspace/cargo-home/bin/cargo" "+1.84.0" "metadata" "--manifest-path" "Cargo.toml" "--no-deps", kill_on_drop: false }` [INFO] started tweaking git repo https://github.com/maxjeffos/rust-neural-network-experiments [INFO] finished tweaking git repo https://github.com/maxjeffos/rust-neural-network-experiments [INFO] tweaked toml for git repo https://github.com/maxjeffos/rust-neural-network-experiments written to /workspace/builds/worker-1-tc1/source/Cargo.toml [INFO] crate git repo https://github.com/maxjeffos/rust-neural-network-experiments already has a lockfile, it will not be regenerated [INFO] running `Command { std: CARGO_HOME="/workspace/cargo-home" RUSTUP_HOME="/workspace/rustup-home" "/workspace/cargo-home/bin/cargo" "+1.84.0" "fetch" "--manifest-path" "Cargo.toml", kill_on_drop: false }` [INFO] [stderr] warning: virtual workspace defaulting to `resolver = "1"` despite one or more workspace members being on edition 2021 which implies `resolver = "2"` [INFO] [stderr] note: to keep the current resolver, specify `workspace.resolver = "1"` in the workspace root's manifest [INFO] [stderr] note: to use the edition 2021 resolver, specify `workspace.resolver = "2"` in the workspace root's manifest [INFO] [stderr] note: for more details see https://doc.rust-lang.org/cargo/reference/resolver.html#resolver-versions [INFO] [stderr] Updating crates.io index [INFO] [stderr] Downloading crates ... 
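Cargo's note above already states the fix for the resolver warning: declare a resolver explicitly in the workspace root's manifest. The repository's actual `Cargo.toml` is not shown in this log, so the following is only a minimal sketch of that opt-in; the member list is a placeholder subset (only `common` and `test6-nn` are confirmed workspace members by the compile lines below).

```toml
# Sketch of a workspace-root Cargo.toml, not the repository's real manifest.
# Declaring the resolver silences the "virtual workspace defaulting to
# resolver = 1" warning; "2" matches what edition-2021 members imply.
[workspace]
resolver = "2"
members = ["common", "test6-nn"]  # placeholder subset of the real member list
```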
[INFO] [stderr] Downloaded autodiff v0.4.0 [INFO] [stderr] Downloaded mnist v0.5.0 [INFO] [stderr] Downloaded ndarray v0.15.4 [INFO] [stderr] Downloaded time-test v0.2.2 [INFO] running `Command { std: "docker" "create" "-v" "/var/lib/crater-agent-workspace/builds/worker-1-tc1/target:/opt/rustwide/target:rw,Z" "-v" "/var/lib/crater-agent-workspace/builds/worker-1-tc1/source:/opt/rustwide/workdir:ro,Z" "-v" "/var/lib/crater-agent-workspace/cargo-home:/opt/rustwide/cargo-home:ro,Z" "-v" "/var/lib/crater-agent-workspace/rustup-home:/opt/rustwide/rustup-home:ro,Z" "-e" "SOURCE_DIR=/opt/rustwide/workdir" "-e" "CARGO_TARGET_DIR=/opt/rustwide/target" "-e" "CARGO_HOME=/opt/rustwide/cargo-home" "-e" "RUSTUP_HOME=/opt/rustwide/rustup-home" "-w" "/opt/rustwide/workdir" "-m" "1610612736" "--user" "0:0" "--network" "none" "ghcr.io/rust-lang/crates-build-env/linux@sha256:86ea7c7af713d31e8cfdb68a6d0db50b5cf7cbeecde3d112f9f257f747318d36" "/opt/rustwide/cargo-home/bin/cargo" "+1.84.0" "metadata" "--no-deps" "--format-version=1", kill_on_drop: false }` [INFO] [stdout] a290aa36bd6e72330a86d0cd708cbee9f5c1c8e61112d747ffd6ef9310fa434c [INFO] running `Command { std: "docker" "start" "-a" "a290aa36bd6e72330a86d0cd708cbee9f5c1c8e61112d747ffd6ef9310fa434c", kill_on_drop: false }` [INFO] running `Command { std: "docker" "inspect" "a290aa36bd6e72330a86d0cd708cbee9f5c1c8e61112d747ffd6ef9310fa434c", kill_on_drop: false }` [INFO] running `Command { std: "docker" "rm" "-f" "a290aa36bd6e72330a86d0cd708cbee9f5c1c8e61112d747ffd6ef9310fa434c", kill_on_drop: false }` [INFO] [stdout] a290aa36bd6e72330a86d0cd708cbee9f5c1c8e61112d747ffd6ef9310fa434c [INFO] running `Command { std: "docker" "create" "-v" "/var/lib/crater-agent-workspace/builds/worker-1-tc1/target:/opt/rustwide/target:rw,Z" "-v" "/var/lib/crater-agent-workspace/builds/worker-1-tc1/source:/opt/rustwide/workdir:ro,Z" "-v" "/var/lib/crater-agent-workspace/cargo-home:/opt/rustwide/cargo-home:ro,Z" "-v" "/var/lib/crater-agent-workspace/rustup-home:/opt/rustwide/rustup-home:ro,Z" "-e" "SOURCE_DIR=/opt/rustwide/workdir" "-e" "CARGO_TARGET_DIR=/opt/rustwide/target" "-e" "CARGO_INCREMENTAL=0" "-e" "RUST_BACKTRACE=full" "-e" "RUSTFLAGS=--cap-lints=warn" "-e" "RUSTDOCFLAGS=--cap-lints=warn" "-e" "CARGO_HOME=/opt/rustwide/cargo-home" "-e" "RUSTUP_HOME=/opt/rustwide/rustup-home" "-w" "/opt/rustwide/workdir" "-m" "1610612736" "--user" "0:0" "--network" "none" "ghcr.io/rust-lang/crates-build-env/linux@sha256:86ea7c7af713d31e8cfdb68a6d0db50b5cf7cbeecde3d112f9f257f747318d36" "/opt/rustwide/cargo-home/bin/cargo" "+1.84.0" "build" "--frozen" "--message-format=json", kill_on_drop: false }` [INFO] [stdout] 7065d8c6366bb1b2da788b212ebeb3d91a2682f12c0002040bc0938d5f75d8af [INFO] running `Command { std: "docker" "start" "-a" "7065d8c6366bb1b2da788b212ebeb3d91a2682f12c0002040bc0938d5f75d8af", kill_on_drop: false }` [INFO] [stderr] warning: virtual workspace defaulting to `resolver = "1"` despite one or more workspace members being on edition 2021 which implies `resolver = "2"` [INFO] [stderr] note: to keep the current resolver, specify `workspace.resolver = "1"` in the workspace root's manifest [INFO] [stderr] note: to use the edition 2021 resolver, specify `workspace.resolver = "2"` in the workspace root's manifest [INFO] [stderr] note: for more details see https://doc.rust-lang.org/cargo/reference/resolver.html#resolver-versions [INFO] [stderr] Compiling autocfg v1.0.1 [INFO] [stderr] Compiling cfg-if v1.0.0 [INFO] [stderr] Compiling libc v0.2.112 [INFO] [stderr] Compiling libm v0.2.1 
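The build output that continues below completes successfully but emits a number of lints (`dead_code`, `unused_variables`, `unused_imports`, `non_camel_case_types`, `unused_must_use`), and because the run sets `RUSTFLAGS=--cap-lints=warn` none of them can fail the build. The snippet below is not code from this repository; it is a small, self-contained illustration of those lint categories and of the fixes rustc suggests in the diagnostics further down (underscore-prefixing unused bindings, `let _ = ...` for an ignored `Result`, upper-camel-case enum variants).

```rust
// Illustrative only: mirrors the lint categories reported in this log,
// not the repository's actual items.

#[allow(dead_code)] // e.g. the never-used `e_to_the_x`, `Quadrant`, `LayerGradients`
fn helper_never_called() {}

#[derive(Debug)]
enum Color {
    Blue,   // upper camel case, as the non_camel_case_types help suggests
    Orange,
}

fn write_log() -> Result<(), std::io::Error> {
    Ok(())
}

fn main() {
    // unused_variables: prefix with an underscore when the binding is intentional.
    let _normalized_distance = 1.0_f64;

    // unused_must_use: handle the Result, or discard it explicitly with
    // `let _ = ...` as the help text in the diagnostics below shows.
    let _ = write_log();

    println!("{:?} {:?}", Color::Blue, Color::Orange);
}
```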
[INFO] [stderr] Compiling crossbeam-utils v0.8.6 [INFO] [stderr] Compiling lazy_static v1.4.0 [INFO] [stderr] Compiling crossbeam-epoch v0.9.6 [INFO] [stderr] Compiling rayon-core v1.9.1 [INFO] [stderr] Compiling scopeguard v1.1.0 [INFO] [stderr] Compiling ppv-lite86 v0.2.16 [INFO] [stderr] Compiling either v1.6.1 [INFO] [stderr] Compiling proc-macro2 v1.0.36 [INFO] [stderr] Compiling unicode-xid v0.2.2 [INFO] [stderr] Compiling serde v1.0.136 [INFO] [stderr] Compiling rawpointer v0.2.1 [INFO] [stderr] Compiling syn v1.0.85 [INFO] [stderr] Compiling serde_derive v1.0.136 [INFO] [stderr] Compiling serde_json v1.0.78 [INFO] [stderr] Compiling matrixmultiply v0.3.2 [INFO] [stderr] Compiling byteorder v1.4.3 [INFO] [stderr] Compiling ryu v1.0.9 [INFO] [stderr] Compiling itoa v1.0.1 [INFO] [stderr] Compiling anyhow v1.0.82 [INFO] [stderr] Compiling num-traits v0.2.14 [INFO] [stderr] Compiling memoffset v0.6.5 [INFO] [stderr] Compiling rayon v1.5.1 [INFO] [stderr] Compiling num-integer v0.1.44 [INFO] [stderr] Compiling mnist v0.5.0 [INFO] [stderr] Compiling crossbeam-channel v0.5.2 [INFO] [stderr] Compiling crossbeam-deque v0.8.1 [INFO] [stderr] Compiling getrandom v0.2.4 [INFO] [stderr] Compiling num_cpus v1.13.1 [INFO] [stderr] Compiling quote v1.0.14 [INFO] [stderr] Compiling rand_core v0.6.3 [INFO] [stderr] Compiling rand_chacha v0.3.1 [INFO] [stderr] Compiling num-complex v0.4.0 [INFO] [stderr] Compiling autodiff v0.4.0 [INFO] [stderr] Compiling rand v0.8.4 [INFO] [stderr] Compiling test1-autodiff v0.1.0 (/opt/rustwide/workdir/test1-autodiff) [INFO] [stdout] warning: function `e_to_the_x` is never used [INFO] [stdout] --> test1-autodiff/src/main.rs:3:4 [INFO] [stdout] | [INFO] [stdout] 3 | fn e_to_the_x(x: FT) -> FT { [INFO] [stdout] | ^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(dead_code)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stderr] Compiling ndarray v0.15.4 [INFO] [stderr] Compiling rand_distr v0.4.2 [INFO] [stderr] Compiling common v0.1.0 (/opt/rustwide/workdir/common) [INFO] [stdout] warning: elided lifetime has a name [INFO] [stdout] --> common/src/linalg/mod.rs:714:64 [INFO] [stdout] | [INFO] [stdout] 714 | pub fn iter_with<'a>(&'a self, other: &'a ColumnVector) -> IterWith { [INFO] [stdout] | -- lifetime `'a` declared here ^^^^^^^^ this elided lifetime gets resolved as `'a` [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(elided_named_lifetimes)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: enum `Quadrant` is never used [INFO] [stdout] --> common/src/point.rs:7:6 [INFO] [stdout] | [INFO] [stdout] 7 | enum Quadrant { [INFO] [stdout] | ^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(dead_code)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: function `softmax` is never used [INFO] [stdout] --> common/src/softmax.rs:1:4 [INFO] [stdout] | [INFO] [stdout] 1 | fn softmax(logits: &[f64]) -> Vec { [INFO] [stdout] | ^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: function `softmax_derivative` is never used [INFO] [stdout] --> common/src/softmax.rs:16:4 [INFO] [stdout] | [INFO] [stdout] 16 | fn softmax_derivative(logits: &[f64]) -> Vec> { [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stderr] Compiling test3-simple-linear-regression v0.1.0 (/opt/rustwide/workdir/test3-simple-linear-regression) [INFO] [stderr] Compiling test5-playing-with-matrix-ideas v0.1.0 (/opt/rustwide/workdir/test5-playing-with-matrix-ideas) [INFO] [stderr] Compiling 
test4-multivariable-regression v0.1.0 (/opt/rustwide/workdir/test4-multivariable-regression) [INFO] [stdout] warning: enum `MultivariableRegressionError` is never used [INFO] [stdout] --> test4-multivariable-regression/src/main.rs:36:6 [INFO] [stdout] | [INFO] [stdout] 36 | enum MultivariableRegressionError { [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(dead_code)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: enum `InvalidDataError` is never used [INFO] [stdout] --> test4-multivariable-regression/src/main.rs:43:6 [INFO] [stdout] | [INFO] [stdout] 43 | enum InvalidDataError { [INFO] [stdout] | ^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: function `predict` is never used [INFO] [stdout] --> test4-multivariable-regression/src/main.rs:87:4 [INFO] [stdout] | [INFO] [stdout] 87 | fn predict(theta: &[f64], independant: &[f64]) -> f64 { [INFO] [stdout] | ^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stderr] Compiling mnist-data v0.1.0 (/opt/rustwide/workdir/mnist-data) [INFO] [stderr] Compiling test2-mlp-classifier v0.1.0 (/opt/rustwide/workdir/test2-mlp-classifier) [INFO] [stdout] warning: variant `blue` should have an upper camel case name [INFO] [stdout] --> test2-mlp-classifier/src/main.rs:59:5 [INFO] [stdout] | [INFO] [stdout] 59 | blue, [INFO] [stdout] | ^^^^ help: convert the identifier to upper camel case: `Blue` [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(non_camel_case_types)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: variant `orange` should have an upper camel case name [INFO] [stdout] --> test2-mlp-classifier/src/main.rs:60:5 [INFO] [stdout] | [INFO] [stdout] 60 | orange, [INFO] [stdout] | ^^^^^^ help: convert the identifier to upper camel case (notice the capitalization): `Orange` [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `ndarray` [INFO] [stdout] --> test2-mlp-classifier/src/main.rs:5:5 [INFO] [stdout] | [INFO] [stdout] 5 | use ndarray::*; [INFO] [stdout] | ^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(unused_imports)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `i` [INFO] [stdout] --> test2-mlp-classifier/src/main.rs:80:9 [INFO] [stdout] | [INFO] [stdout] 80 | for i in 0..n { [INFO] [stdout] | ^ help: if this is intentional, prefix it with an underscore: `_i` [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(unused_variables)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: struct `MultilayerPerceptron` is never constructed [INFO] [stdout] --> test2-mlp-classifier/src/main.rs:7:8 [INFO] [stdout] | [INFO] [stdout] 7 | struct MultilayerPerceptron { [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(dead_code)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: struct `MLPArchitecture` is never constructed [INFO] [stdout] --> test2-mlp-classifier/src/main.rs:14:8 [INFO] [stdout] | [INFO] [stdout] 14 | struct MLPArchitecture { [INFO] [stdout] | ^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: associated function `new` is never used [INFO] [stdout] --> test2-mlp-classifier/src/main.rs:21:8 [INFO] [stdout] | [INFO] [stdout] 20 | impl MLPArchitecture { [INFO] [stdout] | -------------------- associated function in this implementation [INFO] [stdout] 21 | fn new(input_size: usize, hidden_layers: Vec, output_size: usize) -> Self { [INFO] [stdout] | ^^^ [INFO] 
[stdout] [INFO] [stdout] [INFO] [stderr] Compiling metrics v0.1.0 (/opt/rustwide/workdir/metrics) [INFO] [stderr] Compiling test6-nn v0.1.0 (/opt/rustwide/workdir/test6-nn) [INFO] [stderr] Compiling test7-nn-mnist-classifier v0.1.0 (/opt/rustwide/workdir/test7-nn-mnist-classifier) [INFO] [stdout] warning: unused variable: `normalized_distance` [INFO] [stdout] --> test6-nn/src/lib.rs:585:21 [INFO] [stdout] | [INFO] [stdout] 585 | let normalized_distance = euclidian_distance(&approx_gradients_big_v, &d_vec) [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^ help: if this is intentional, prefix it with an underscore: `_normalized_distance` [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(unused_variables)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `normalized_distance` [INFO] [stdout] --> test7-nn-mnist-classifier/src/lib.rs:639:21 [INFO] [stdout] | [INFO] [stdout] 639 | let normalized_distance = euclidian_distance(&approx_gradients_big_v, &d_vec) [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^ help: if this is intentional, prefix it with an underscore: `_normalized_distance` [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(unused_variables)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: struct `LayerGradients` is never constructed [INFO] [stdout] --> test6-nn/src/lib.rs:81:8 [INFO] [stdout] | [INFO] [stdout] 81 | struct LayerGradients { [INFO] [stdout] | ^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(dead_code)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: associated function `new` is never used [INFO] [stdout] --> test6-nn/src/lib.rs:87:8 [INFO] [stdout] | [INFO] [stdout] 86 | impl LayerGradients { [INFO] [stdout] | ------------------- associated function in this implementation [INFO] [stdout] 87 | fn new(weight_gradients: Matrix, bias_gradients: ColumnVector) -> Self { [INFO] [stdout] | ^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: constant `GRADIENT_CHECK_EPSILON_SQUARED` is never used [INFO] [stdout] --> test6-nn/src/lib.rs:118:7 [INFO] [stdout] | [INFO] [stdout] 118 | const GRADIENT_CHECK_EPSILON_SQUARED: f64 = GRADIENT_CHECK_EPSILON * GRADIENT_CHECK_EPSILON; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: method `unroll_gradients` is never used [INFO] [stdout] --> test6-nn/src/lib.rs:1074:8 [INFO] [stdout] | [INFO] [stdout] 146 | impl SimpleNeuralNetwork { [INFO] [stdout] | ------------------------ method in this implementation [INFO] [stdout] ... 
[INFO] [stdout] 1074 | fn unroll_gradients( [INFO] [stdout] | ^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: struct `JELU` is never constructed [INFO] [stdout] --> test6-nn/src/activation/activator/jelu.rs:4:8 [INFO] [stdout] | [INFO] [stdout] 4 | struct JELU { [INFO] [stdout] | ^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: associated function `new` is never used [INFO] [stdout] --> test6-nn/src/activation/activator/jelu.rs:12:12 [INFO] [stdout] | [INFO] [stdout] 11 | impl JELU { [INFO] [stdout] | --------- associated function in this implementation [INFO] [stdout] 12 | pub fn new(crossover_point: f64) -> JELU { [INFO] [stdout] | ^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused `Result` that must be used [INFO] [stdout] --> test6-nn/src/lib.rs:665:13 [INFO] [stdout] | [INFO] [stdout] 665 | session_logger.write_training_session_file(initial_cost, network_config, optimizer_str); [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: this `Result` may be an `Err` variant, which should be handled [INFO] [stdout] = note: `#[warn(unused_must_use)]` on by default [INFO] [stdout] help: use `let _ = ...` to ignore the resulting value [INFO] [stdout] | [INFO] [stdout] 665 | let _ = session_logger.write_training_session_file(initial_cost, network_config, optimizer_str); [INFO] [stdout] | +++++++ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: struct `LayerGradients` is never constructed [INFO] [stdout] --> test7-nn-mnist-classifier/src/lib.rs:81:8 [INFO] [stdout] | [INFO] [stdout] 81 | struct LayerGradients { [INFO] [stdout] | ^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(dead_code)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: associated function `new` is never used [INFO] [stdout] --> test7-nn-mnist-classifier/src/lib.rs:87:8 [INFO] [stdout] | [INFO] [stdout] 86 | impl LayerGradients { [INFO] [stdout] | ------------------- associated function in this implementation [INFO] [stdout] 87 | fn new(weight_gradients: Matrix, bias_gradients: ColumnVector) -> Self { [INFO] [stdout] | ^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: constant `GRADIENT_CHECK_EPSILON_SQUARED` is never used [INFO] [stdout] --> test7-nn-mnist-classifier/src/lib.rs:118:7 [INFO] [stdout] | [INFO] [stdout] 118 | const GRADIENT_CHECK_EPSILON_SQUARED: f64 = GRADIENT_CHECK_EPSILON * GRADIENT_CHECK_EPSILON; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: methods `err_output_layer` and `unroll_gradients` are never used [INFO] [stdout] --> test7-nn-mnist-classifier/src/lib.rs:328:8 [INFO] [stdout] | [INFO] [stdout] 148 | impl NeuralNetwork { [INFO] [stdout] | ------------------ methods in this implementation [INFO] [stdout] ... [INFO] [stdout] 328 | fn err_output_layer( [INFO] [stdout] | ^^^^^^^^^^^^^^^^ [INFO] [stdout] ... 
[INFO] [stdout] 1128 | fn unroll_gradients( [INFO] [stdout] | ^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused `Result` that must be used [INFO] [stdout] --> test7-nn-mnist-classifier/src/lib.rs:719:13 [INFO] [stdout] | [INFO] [stdout] 719 | session_logger.write_training_session_file(initial_cost, network_config, optimizer_str); [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: this `Result` may be an `Err` variant, which should be handled [INFO] [stdout] = note: `#[warn(unused_must_use)]` on by default [INFO] [stdout] help: use `let _ = ...` to ignore the resulting value [INFO] [stdout] | [INFO] [stdout] 719 | let _ = session_logger.write_training_session_file(initial_cost, network_config, optimizer_str); [INFO] [stdout] | +++++++ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused `Result` that must be used [INFO] [stdout] --> test7-nn-mnist-classifier/src/main.rs:99:5 [INFO] [stdout] | [INFO] [stdout] 99 | / nn.train_stochastic( [INFO] [stdout] 100 | | &training_data, [INFO] [stdout] 101 | | 10_000, [INFO] [stdout] 102 | | // &Optimizer::standard_gradient_descent(0.9), [INFO] [stdout] ... | [INFO] [stdout] 110 | | Some(session_logger), [INFO] [stdout] 111 | | ); [INFO] [stdout] | |_____^ [INFO] [stdout] | [INFO] [stdout] = note: this `Result` may be an `Err` variant, which should be handled [INFO] [stdout] = note: `#[warn(unused_must_use)]` on by default [INFO] [stdout] help: use `let _ = ...` to ignore the resulting value [INFO] [stdout] | [INFO] [stdout] 99 | let _ = nn.train_stochastic( [INFO] [stdout] | +++++++ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused `Result` that must be used [INFO] [stdout] --> test6-nn/src/main.rs:92:5 [INFO] [stdout] | [INFO] [stdout] 92 | / nn.train_stochastic( [INFO] [stdout] 93 | | &training_data, [INFO] [stdout] 94 | | 10_000, [INFO] [stdout] 95 | | // &Optimizer::standard_gradient_descent(0.9), [INFO] [stdout] ... 
| [INFO] [stdout] 103 | | Some(session_logger), [INFO] [stdout] 104 | | ); [INFO] [stdout] | |_____^ [INFO] [stdout] | [INFO] [stdout] = note: this `Result` may be an `Err` variant, which should be handled [INFO] [stdout] = note: `#[warn(unused_must_use)]` on by default [INFO] [stdout] help: use `let _ = ...` to ignore the resulting value [INFO] [stdout] | [INFO] [stdout] 92 | let _ = nn.train_stochastic( [INFO] [stdout] | +++++++ [INFO] [stdout] [INFO] [stdout] [INFO] [stderr] Finished `dev` profile [unoptimized + debuginfo] target(s) in 23.76s [INFO] running `Command { std: "docker" "inspect" "7065d8c6366bb1b2da788b212ebeb3d91a2682f12c0002040bc0938d5f75d8af", kill_on_drop: false }` [INFO] running `Command { std: "docker" "rm" "-f" "7065d8c6366bb1b2da788b212ebeb3d91a2682f12c0002040bc0938d5f75d8af", kill_on_drop: false }` [INFO] [stdout] 7065d8c6366bb1b2da788b212ebeb3d91a2682f12c0002040bc0938d5f75d8af [INFO] running `Command { std: "docker" "create" "-v" "/var/lib/crater-agent-workspace/builds/worker-1-tc1/target:/opt/rustwide/target:rw,Z" "-v" "/var/lib/crater-agent-workspace/builds/worker-1-tc1/source:/opt/rustwide/workdir:ro,Z" "-v" "/var/lib/crater-agent-workspace/cargo-home:/opt/rustwide/cargo-home:ro,Z" "-v" "/var/lib/crater-agent-workspace/rustup-home:/opt/rustwide/rustup-home:ro,Z" "-e" "SOURCE_DIR=/opt/rustwide/workdir" "-e" "CARGO_TARGET_DIR=/opt/rustwide/target" "-e" "CARGO_INCREMENTAL=0" "-e" "RUST_BACKTRACE=full" "-e" "RUSTFLAGS=--cap-lints=warn" "-e" "RUSTDOCFLAGS=--cap-lints=warn" "-e" "CARGO_HOME=/opt/rustwide/cargo-home" "-e" "RUSTUP_HOME=/opt/rustwide/rustup-home" "-w" "/opt/rustwide/workdir" "-m" "1610612736" "--user" "0:0" "--network" "none" "ghcr.io/rust-lang/crates-build-env/linux@sha256:86ea7c7af713d31e8cfdb68a6d0db50b5cf7cbeecde3d112f9f257f747318d36" "/opt/rustwide/cargo-home/bin/cargo" "+1.84.0" "test" "--frozen" "--no-run" "--message-format=json", kill_on_drop: false }` [INFO] [stdout] 8335ea116919a8d40eb57e812af2beb18def8e243edcddbd6d98b81447dc2968 [INFO] running `Command { std: "docker" "start" "-a" "8335ea116919a8d40eb57e812af2beb18def8e243edcddbd6d98b81447dc2968", kill_on_drop: false }` [INFO] [stderr] warning: virtual workspace defaulting to `resolver = "1"` despite one or more workspace members being on edition 2021 which implies `resolver = "2"` [INFO] [stderr] note: to keep the current resolver, specify `workspace.resolver = "1"` in the workspace root's manifest [INFO] [stderr] note: to use the edition 2021 resolver, specify `workspace.resolver = "2"` in the workspace root's manifest [INFO] [stderr] note: for more details see https://doc.rust-lang.org/cargo/reference/resolver.html#resolver-versions [INFO] [stderr] Compiling time v0.1.43 [INFO] [stderr] Compiling float-cmp v0.9.0 [INFO] [stderr] Compiling test1-autodiff v0.1.0 (/opt/rustwide/workdir/test1-autodiff) [INFO] [stderr] Compiling test2-mlp-classifier v0.1.0 (/opt/rustwide/workdir/test2-mlp-classifier) [INFO] [stderr] Compiling metrics v0.1.0 (/opt/rustwide/workdir/metrics) [INFO] [stdout] warning: elided lifetime has a name [INFO] [stdout] --> common/src/linalg/mod.rs:714:64 [INFO] [stdout] | [INFO] [stdout] 714 | pub fn iter_with<'a>(&'a self, other: &'a ColumnVector) -> IterWith { [INFO] [stdout] | -- lifetime `'a` declared here ^^^^^^^^ this elided lifetime gets resolved as `'a` [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(elided_named_lifetimes)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: enum `Quadrant` is never used [INFO] [stdout] --> 
common/src/point.rs:7:6 [INFO] [stdout] | [INFO] [stdout] 7 | enum Quadrant { [INFO] [stdout] | ^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(dead_code)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stderr] Compiling common v0.1.0 (/opt/rustwide/workdir/common) [INFO] [stdout] warning: function `softmax` is never used [INFO] [stdout] --> common/src/softmax.rs:1:4 [INFO] [stdout] | [INFO] [stdout] 1 | fn softmax(logits: &[f64]) -> Vec { [INFO] [stdout] | ^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stderr] Compiling test3-simple-linear-regression v0.1.0 (/opt/rustwide/workdir/test3-simple-linear-regression) [INFO] [stdout] warning: function `softmax_derivative` is never used [INFO] [stdout] --> common/src/softmax.rs:16:4 [INFO] [stdout] | [INFO] [stdout] 16 | fn softmax_derivative(logits: &[f64]) -> Vec> { [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stderr] Compiling test5-playing-with-matrix-ideas v0.1.0 (/opt/rustwide/workdir/test5-playing-with-matrix-ideas) [INFO] [stderr] Compiling mnist-data v0.1.0 (/opt/rustwide/workdir/mnist-data) [INFO] [stdout] warning: unused variable: `normalized_distance` [INFO] [stdout] --> test7-nn-mnist-classifier/src/lib.rs:639:21 [INFO] [stdout] | [INFO] [stdout] 639 | let normalized_distance = euclidian_distance(&approx_gradients_big_v, &d_vec) [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^ help: if this is intentional, prefix it with an underscore: `_normalized_distance` [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(unused_variables)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: struct `LayerGradients` is never constructed [INFO] [stdout] --> test7-nn-mnist-classifier/src/lib.rs:81:8 [INFO] [stdout] | [INFO] [stdout] 81 | struct LayerGradients { [INFO] [stdout] | ^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(dead_code)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: associated function `new` is never used [INFO] [stdout] --> test7-nn-mnist-classifier/src/lib.rs:87:8 [INFO] [stdout] | [INFO] [stdout] 86 | impl LayerGradients { [INFO] [stdout] | ------------------- associated function in this implementation [INFO] [stdout] 87 | fn new(weight_gradients: Matrix, bias_gradients: ColumnVector) -> Self { [INFO] [stdout] | ^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: constant `GRADIENT_CHECK_EPSILON_SQUARED` is never used [INFO] [stdout] --> test7-nn-mnist-classifier/src/lib.rs:118:7 [INFO] [stdout] | [INFO] [stdout] 118 | const GRADIENT_CHECK_EPSILON_SQUARED: f64 = GRADIENT_CHECK_EPSILON * GRADIENT_CHECK_EPSILON; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: methods `err_output_layer` and `unroll_gradients` are never used [INFO] [stdout] --> test7-nn-mnist-classifier/src/lib.rs:328:8 [INFO] [stdout] | [INFO] [stdout] 148 | impl NeuralNetwork { [INFO] [stdout] | ------------------ methods in this implementation [INFO] [stdout] ... [INFO] [stdout] 328 | fn err_output_layer( [INFO] [stdout] | ^^^^^^^^^^^^^^^^ [INFO] [stdout] ... 
[INFO] [stdout] 1128 | fn unroll_gradients( [INFO] [stdout] | ^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused `Result` that must be used [INFO] [stdout] --> test7-nn-mnist-classifier/src/lib.rs:719:13 [INFO] [stdout] | [INFO] [stdout] 719 | session_logger.write_training_session_file(initial_cost, network_config, optimizer_str); [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: this `Result` may be an `Err` variant, which should be handled [INFO] [stdout] = note: `#[warn(unused_must_use)]` on by default [INFO] [stdout] help: use `let _ = ...` to ignore the resulting value [INFO] [stdout] | [INFO] [stdout] 719 | let _ = session_logger.write_training_session_file(initial_cost, network_config, optimizer_str); [INFO] [stdout] | +++++++ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `normalized_distance` [INFO] [stdout] --> test6-nn/src/lib.rs:585:21 [INFO] [stdout] | [INFO] [stdout] 585 | let normalized_distance = euclidian_distance(&approx_gradients_big_v, &d_vec) [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^ help: if this is intentional, prefix it with an underscore: `_normalized_distance` [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(unused_variables)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: struct `LayerGradients` is never constructed [INFO] [stdout] --> test6-nn/src/lib.rs:81:8 [INFO] [stdout] | [INFO] [stdout] 81 | struct LayerGradients { [INFO] [stdout] | ^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(dead_code)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: associated function `new` is never used [INFO] [stdout] --> test6-nn/src/lib.rs:87:8 [INFO] [stdout] | [INFO] [stdout] 86 | impl LayerGradients { [INFO] [stdout] | ------------------- associated function in this implementation [INFO] [stdout] 87 | fn new(weight_gradients: Matrix, bias_gradients: ColumnVector) -> Self { [INFO] [stdout] | ^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: constant `GRADIENT_CHECK_EPSILON_SQUARED` is never used [INFO] [stdout] --> test6-nn/src/lib.rs:118:7 [INFO] [stdout] | [INFO] [stdout] 118 | const GRADIENT_CHECK_EPSILON_SQUARED: f64 = GRADIENT_CHECK_EPSILON * GRADIENT_CHECK_EPSILON; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: method `unroll_gradients` is never used [INFO] [stdout] --> test6-nn/src/lib.rs:1074:8 [INFO] [stdout] | [INFO] [stdout] 146 | impl SimpleNeuralNetwork { [INFO] [stdout] | ------------------------ method in this implementation [INFO] [stdout] ... 
[INFO] [stdout] 1074 | fn unroll_gradients( [INFO] [stdout] | ^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: struct `JELU` is never constructed [INFO] [stdout] --> test6-nn/src/activation/activator/jelu.rs:4:8 [INFO] [stdout] | [INFO] [stdout] 4 | struct JELU { [INFO] [stdout] | ^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: associated function `new` is never used [INFO] [stdout] --> test6-nn/src/activation/activator/jelu.rs:12:12 [INFO] [stdout] | [INFO] [stdout] 11 | impl JELU { [INFO] [stdout] | --------- associated function in this implementation [INFO] [stdout] 12 | pub fn new(crossover_point: f64) -> JELU { [INFO] [stdout] | ^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused `Result` that must be used [INFO] [stdout] --> test6-nn/src/lib.rs:665:13 [INFO] [stdout] | [INFO] [stdout] 665 | session_logger.write_training_session_file(initial_cost, network_config, optimizer_str); [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: this `Result` may be an `Err` variant, which should be handled [INFO] [stdout] = note: `#[warn(unused_must_use)]` on by default [INFO] [stdout] help: use `let _ = ...` to ignore the resulting value [INFO] [stdout] | [INFO] [stdout] 665 | let _ = session_logger.write_training_session_file(initial_cost, network_config, optimizer_str); [INFO] [stdout] | +++++++ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: variant `blue` should have an upper camel case name [INFO] [stdout] --> test2-mlp-classifier/src/main.rs:59:5 [INFO] [stdout] | [INFO] [stdout] 59 | blue, [INFO] [stdout] | ^^^^ help: convert the identifier to upper camel case: `Blue` [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(non_camel_case_types)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: variant `orange` should have an upper camel case name [INFO] [stdout] --> test2-mlp-classifier/src/main.rs:60:5 [INFO] [stdout] | [INFO] [stdout] 60 | orange, [INFO] [stdout] | ^^^^^^ help: convert the identifier to upper camel case (notice the capitalization): `Orange` [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused import: `ndarray` [INFO] [stdout] --> test2-mlp-classifier/src/main.rs:5:5 [INFO] [stdout] | [INFO] [stdout] 5 | use ndarray::*; [INFO] [stdout] | ^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(unused_imports)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `i` [INFO] [stdout] --> test2-mlp-classifier/src/main.rs:80:9 [INFO] [stdout] | [INFO] [stdout] 80 | for i in 0..n { [INFO] [stdout] | ^ help: if this is intentional, prefix it with an underscore: `_i` [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(unused_variables)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: function `e_to_the_x` is never used [INFO] [stdout] --> test1-autodiff/src/main.rs:3:4 [INFO] [stdout] | [INFO] [stdout] 3 | fn e_to_the_x(x: FT) -> FT { [INFO] [stdout] | ^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(dead_code)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: struct `MultilayerPerceptron` is never constructed [INFO] [stdout] --> test2-mlp-classifier/src/main.rs:7:8 [INFO] [stdout] | [INFO] [stdout] 7 | struct MultilayerPerceptron { [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(dead_code)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: 
struct `MLPArchitecture` is never constructed [INFO] [stdout] --> test2-mlp-classifier/src/main.rs:14:8 [INFO] [stdout] | [INFO] [stdout] 14 | struct MLPArchitecture { [INFO] [stdout] | ^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: associated function `new` is never used [INFO] [stdout] --> test2-mlp-classifier/src/main.rs:21:8 [INFO] [stdout] | [INFO] [stdout] 20 | impl MLPArchitecture { [INFO] [stdout] | -------------------- associated function in this implementation [INFO] [stdout] 21 | fn new(input_size: usize, hidden_layers: Vec, output_size: usize) -> Self { [INFO] [stdout] | ^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: elided lifetime has a name [INFO] [stdout] --> common/src/linalg/mod.rs:714:64 [INFO] [stdout] | [INFO] [stdout] 714 | pub fn iter_with<'a>(&'a self, other: &'a ColumnVector) -> IterWith { [INFO] [stdout] | -- lifetime `'a` declared here ^^^^^^^^ this elided lifetime gets resolved as `'a` [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(elided_named_lifetimes)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stderr] Compiling test4-multivariable-regression v0.1.0 (/opt/rustwide/workdir/test4-multivariable-regression) [INFO] [stdout] warning: enum `MultivariableRegressionError` is never used [INFO] [stdout] --> test4-multivariable-regression/src/main.rs:36:6 [INFO] [stdout] | [INFO] [stdout] 36 | enum MultivariableRegressionError { [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(dead_code)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: enum `InvalidDataError` is never used [INFO] [stdout] --> test4-multivariable-regression/src/main.rs:43:6 [INFO] [stdout] | [INFO] [stdout] 43 | enum InvalidDataError { [INFO] [stdout] | ^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stderr] Compiling time-test v0.2.2 [INFO] [stderr] Compiling test6-nn v0.1.0 (/opt/rustwide/workdir/test6-nn) [INFO] [stderr] Compiling test7-nn-mnist-classifier v0.1.0 (/opt/rustwide/workdir/test7-nn-mnist-classifier) [INFO] [stdout] warning: unused `Result` that must be used [INFO] [stdout] --> test7-nn-mnist-classifier/src/main.rs:99:5 [INFO] [stdout] | [INFO] [stdout] 99 | / nn.train_stochastic( [INFO] [stdout] 100 | | &training_data, [INFO] [stdout] 101 | | 10_000, [INFO] [stdout] 102 | | // &Optimizer::standard_gradient_descent(0.9), [INFO] [stdout] ... | [INFO] [stdout] 110 | | Some(session_logger), [INFO] [stdout] 111 | | ); [INFO] [stdout] | |_____^ [INFO] [stdout] | [INFO] [stdout] = note: this `Result` may be an `Err` variant, which should be handled [INFO] [stdout] = note: `#[warn(unused_must_use)]` on by default [INFO] [stdout] help: use `let _ = ...` to ignore the resulting value [INFO] [stdout] | [INFO] [stdout] 99 | let _ = nn.train_stochastic( [INFO] [stdout] | +++++++ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused `Result` that must be used [INFO] [stdout] --> test6-nn/src/main.rs:92:5 [INFO] [stdout] | [INFO] [stdout] 92 | / nn.train_stochastic( [INFO] [stdout] 93 | | &training_data, [INFO] [stdout] 94 | | 10_000, [INFO] [stdout] 95 | | // &Optimizer::standard_gradient_descent(0.9), [INFO] [stdout] ... 
| [INFO] [stdout] 103 | | Some(session_logger), [INFO] [stdout] 104 | | ); [INFO] [stdout] | |_____^ [INFO] [stdout] | [INFO] [stdout] = note: this `Result` may be an `Err` variant, which should be handled [INFO] [stdout] = note: `#[warn(unused_must_use)]` on by default [INFO] [stdout] help: use `let _ = ...` to ignore the resulting value [INFO] [stdout] | [INFO] [stdout] 92 | let _ = nn.train_stochastic( [INFO] [stdout] | +++++++ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: enum `Quadrant` is never used [INFO] [stdout] --> common/src/point.rs:7:6 [INFO] [stdout] | [INFO] [stdout] 7 | enum Quadrant { [INFO] [stdout] | ^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(dead_code)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: variable does not need to be mutable [INFO] [stdout] --> test7-nn-mnist-classifier/src/big_theta.rs:381:13 [INFO] [stdout] | [INFO] [stdout] 381 | let mut big_theta = create_big_theta_for_test(&sizes); [INFO] [stdout] | ----^^^^^^^^^ [INFO] [stdout] | | [INFO] [stdout] | help: remove this `mut` [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(unused_mut)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: variable does not need to be mutable [INFO] [stdout] --> test6-nn/src/big_theta.rs:381:13 [INFO] [stdout] | [INFO] [stdout] 381 | let mut big_theta = create_big_theta_for_test(&sizes); [INFO] [stdout] | ----^^^^^^^^^ [INFO] [stdout] | | [INFO] [stdout] | help: remove this `mut` [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(unused_mut)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `normalized_distance` [INFO] [stdout] --> test7-nn-mnist-classifier/src/lib.rs:639:21 [INFO] [stdout] | [INFO] [stdout] 639 | let normalized_distance = euclidian_distance(&approx_gradients_big_v, &d_vec) [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^ help: if this is intentional, prefix it with an underscore: `_normalized_distance` [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(unused_variables)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `normalized_distance` [INFO] [stdout] --> test6-nn/src/lib.rs:585:21 [INFO] [stdout] | [INFO] [stdout] 585 | let normalized_distance = euclidian_distance(&approx_gradients_big_v, &d_vec) [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^ help: if this is intentional, prefix it with an underscore: `_normalized_distance` [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(unused_variables)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused variable: `lr` [INFO] [stdout] --> test7-nn-mnist-classifier/src/lib.rs:2242:13 [INFO] [stdout] | [INFO] [stdout] 2242 | let lr = LeakyReLU::new(0.1); [INFO] [stdout] | ^^ help: if this is intentional, prefix it with an underscore: `_lr` [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: constant `GRADIENT_CHECK_EPSILON_SQUARED` is never used [INFO] [stdout] --> test6-nn/src/lib.rs:118:7 [INFO] [stdout] | [INFO] [stdout] 118 | const GRADIENT_CHECK_EPSILON_SQUARED: f64 = GRADIENT_CHECK_EPSILON * GRADIENT_CHECK_EPSILON; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(dead_code)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused `Result` that must be used [INFO] [stdout] --> test6-nn/src/lib.rs:665:13 [INFO] [stdout] | [INFO] [stdout] 665 | session_logger.write_training_session_file(initial_cost, network_config, optimizer_str); [INFO] [stdout] | 
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: this `Result` may be an `Err` variant, which should be handled [INFO] [stdout] = note: `#[warn(unused_must_use)]` on by default [INFO] [stdout] help: use `let _ = ...` to ignore the resulting value [INFO] [stdout] | [INFO] [stdout] 665 | let _ = session_logger.write_training_session_file(initial_cost, network_config, optimizer_str); [INFO] [stdout] | +++++++ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: constant `GRADIENT_CHECK_EPSILON_SQUARED` is never used [INFO] [stdout] --> test7-nn-mnist-classifier/src/lib.rs:118:7 [INFO] [stdout] | [INFO] [stdout] 118 | const GRADIENT_CHECK_EPSILON_SQUARED: f64 = GRADIENT_CHECK_EPSILON * GRADIENT_CHECK_EPSILON; [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: `#[warn(dead_code)]` on by default [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: method `err_output_layer` is never used [INFO] [stdout] --> test7-nn-mnist-classifier/src/lib.rs:328:8 [INFO] [stdout] | [INFO] [stdout] 148 | impl NeuralNetwork { [INFO] [stdout] | ------------------ method in this implementation [INFO] [stdout] ... [INFO] [stdout] 328 | fn err_output_layer( [INFO] [stdout] | ^^^^^^^^^^^^^^^^ [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] warning: unused `Result` that must be used [INFO] [stdout] --> test7-nn-mnist-classifier/src/lib.rs:719:13 [INFO] [stdout] | [INFO] [stdout] 719 | session_logger.write_training_session_file(initial_cost, network_config, optimizer_str); [INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stdout] | [INFO] [stdout] = note: this `Result` may be an `Err` variant, which should be handled [INFO] [stdout] = note: `#[warn(unused_must_use)]` on by default [INFO] [stdout] help: use `let _ = ...` to ignore the resulting value [INFO] [stdout] | [INFO] [stdout] 719 | let _ = session_logger.write_training_session_file(initial_cost, network_config, optimizer_str); [INFO] [stdout] | +++++++ [INFO] [stdout] [INFO] [stdout] [INFO] [stderr] Finished `test` profile [unoptimized + debuginfo] target(s) in 5.82s [INFO] running `Command { std: "docker" "inspect" "8335ea116919a8d40eb57e812af2beb18def8e243edcddbd6d98b81447dc2968", kill_on_drop: false }` [INFO] running `Command { std: "docker" "rm" "-f" "8335ea116919a8d40eb57e812af2beb18def8e243edcddbd6d98b81447dc2968", kill_on_drop: false }` [INFO] [stdout] 8335ea116919a8d40eb57e812af2beb18def8e243edcddbd6d98b81447dc2968 [INFO] running `Command { std: "docker" "create" "-v" "/var/lib/crater-agent-workspace/builds/worker-1-tc1/target:/opt/rustwide/target:rw,Z" "-v" "/var/lib/crater-agent-workspace/builds/worker-1-tc1/source:/opt/rustwide/workdir:ro,Z" "-v" "/var/lib/crater-agent-workspace/cargo-home:/opt/rustwide/cargo-home:ro,Z" "-v" "/var/lib/crater-agent-workspace/rustup-home:/opt/rustwide/rustup-home:ro,Z" "-e" "SOURCE_DIR=/opt/rustwide/workdir" "-e" "CARGO_TARGET_DIR=/opt/rustwide/target" "-e" "CARGO_INCREMENTAL=0" "-e" "RUST_BACKTRACE=full" "-e" "RUSTFLAGS=--cap-lints=warn" "-e" "RUSTDOCFLAGS=--cap-lints=warn" "-e" "CARGO_HOME=/opt/rustwide/cargo-home" "-e" "RUSTUP_HOME=/opt/rustwide/rustup-home" "-w" "/opt/rustwide/workdir" "-m" "1610612736" "--user" "0:0" "--network" "none" "ghcr.io/rust-lang/crates-build-env/linux@sha256:86ea7c7af713d31e8cfdb68a6d0db50b5cf7cbeecde3d112f9f257f747318d36" "/opt/rustwide/cargo-home/bin/cargo" "+1.84.0" "test" "--frozen", kill_on_drop: 
false }` [INFO] [stdout] 1e51216369f747a2f8788c096d99a55238aade6e1020cc791cc4de9e3d833938 [INFO] running `Command { std: "docker" "start" "-a" "1e51216369f747a2f8788c096d99a55238aade6e1020cc791cc4de9e3d833938", kill_on_drop: false }` [INFO] [stderr] warning: virtual workspace defaulting to `resolver = "1"` despite one or more workspace members being on edition 2021 which implies `resolver = "2"` [INFO] [stderr] note: to keep the current resolver, specify `workspace.resolver = "1"` in the workspace root's manifest [INFO] [stderr] note: to use the edition 2021 resolver, specify `workspace.resolver = "2"` in the workspace root's manifest [INFO] [stderr] note: for more details see https://doc.rust-lang.org/cargo/reference/resolver.html#resolver-versions [INFO] [stderr] warning: elided lifetime has a name [INFO] [stderr] --> common/src/linalg/mod.rs:714:64 [INFO] [stderr] | [INFO] [stderr] 714 | pub fn iter_with<'a>(&'a self, other: &'a ColumnVector) -> IterWith { [INFO] [stderr] | -- lifetime `'a` declared here ^^^^^^^^ this elided lifetime gets resolved as `'a` [INFO] [stderr] | [INFO] [stderr] = note: `#[warn(elided_named_lifetimes)]` on by default [INFO] [stderr] [INFO] [stderr] warning: enum `Quadrant` is never used [INFO] [stderr] --> common/src/point.rs:7:6 [INFO] [stderr] | [INFO] [stderr] 7 | enum Quadrant { [INFO] [stderr] | ^^^^^^^^ [INFO] [stderr] | [INFO] [stderr] = note: `#[warn(dead_code)]` on by default [INFO] [stderr] [INFO] [stderr] warning: function `softmax` is never used [INFO] [stderr] --> common/src/softmax.rs:1:4 [INFO] [stderr] | [INFO] [stderr] 1 | fn softmax(logits: &[f64]) -> Vec { [INFO] [stderr] | ^^^^^^^ [INFO] [stderr] [INFO] [stderr] warning: function `softmax_derivative` is never used [INFO] [stderr] --> common/src/softmax.rs:16:4 [INFO] [stderr] | [INFO] [stderr] 16 | fn softmax_derivative(logits: &[f64]) -> Vec> { [INFO] [stderr] | ^^^^^^^^^^^^^^^^^^ [INFO] [stderr] [INFO] [stderr] warning: `common` (lib) generated 4 warnings [INFO] [stderr] warning: unused variable: `normalized_distance` [INFO] [stderr] --> test7-nn-mnist-classifier/src/lib.rs:639:21 [INFO] [stderr] | [INFO] [stderr] 639 | let normalized_distance = euclidian_distance(&approx_gradients_big_v, &d_vec) [INFO] [stderr] | ^^^^^^^^^^^^^^^^^^^ help: if this is intentional, prefix it with an underscore: `_normalized_distance` [INFO] [stderr] | [INFO] [stderr] = note: `#[warn(unused_variables)]` on by default [INFO] [stderr] [INFO] [stderr] warning: struct `LayerGradients` is never constructed [INFO] [stderr] --> test7-nn-mnist-classifier/src/lib.rs:81:8 [INFO] [stderr] | [INFO] [stderr] 81 | struct LayerGradients { [INFO] [stderr] | ^^^^^^^^^^^^^^ [INFO] [stderr] | [INFO] [stderr] = note: `#[warn(dead_code)]` on by default [INFO] [stderr] [INFO] [stderr] warning: associated function `new` is never used [INFO] [stderr] --> test7-nn-mnist-classifier/src/lib.rs:87:8 [INFO] [stderr] | [INFO] [stderr] 86 | impl LayerGradients { [INFO] [stderr] | ------------------- associated function in this implementation [INFO] [stderr] 87 | fn new(weight_gradients: Matrix, bias_gradients: ColumnVector) -> Self { [INFO] [stderr] | ^^^ [INFO] [stderr] [INFO] [stderr] warning: constant `GRADIENT_CHECK_EPSILON_SQUARED` is never used [INFO] [stderr] --> test7-nn-mnist-classifier/src/lib.rs:118:7 [INFO] [stderr] | [INFO] [stderr] 118 | const GRADIENT_CHECK_EPSILON_SQUARED: f64 = GRADIENT_CHECK_EPSILON * GRADIENT_CHECK_EPSILON; [INFO] [stderr] | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stderr] [INFO] [stderr] warning: methods 
`err_output_layer` and `unroll_gradients` are never used [INFO] [stderr] --> test7-nn-mnist-classifier/src/lib.rs:328:8 [INFO] [stderr] | [INFO] [stderr] 148 | impl NeuralNetwork { [INFO] [stderr] | ------------------ methods in this implementation [INFO] [stderr] ... [INFO] [stderr] 328 | fn err_output_layer( [INFO] [stderr] | ^^^^^^^^^^^^^^^^ [INFO] [stderr] ... [INFO] [stderr] 1128 | fn unroll_gradients( [INFO] [stderr] | ^^^^^^^^^^^^^^^^ [INFO] [stderr] [INFO] [stderr] warning: unused `Result` that must be used [INFO] [stderr] --> test7-nn-mnist-classifier/src/lib.rs:719:13 [INFO] [stderr] | [INFO] [stderr] 719 | session_logger.write_training_session_file(initial_cost, network_config, optimizer_str); [INFO] [stderr] | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stderr] | [INFO] [stderr] = note: this `Result` may be an `Err` variant, which should be handled [INFO] [stderr] = note: `#[warn(unused_must_use)]` on by default [INFO] [stderr] help: use `let _ = ...` to ignore the resulting value [INFO] [stderr] | [INFO] [stderr] 719 | let _ = session_logger.write_training_session_file(initial_cost, network_config, optimizer_str); [INFO] [stderr] | +++++++ [INFO] [stderr] [INFO] [stderr] warning: `test7-nn-mnist-classifier` (lib) generated 6 warnings [INFO] [stderr] warning: unused variable: `normalized_distance` [INFO] [stderr] --> test6-nn/src/lib.rs:585:21 [INFO] [stderr] | [INFO] [stderr] 585 | let normalized_distance = euclidian_distance(&approx_gradients_big_v, &d_vec) [INFO] [stderr] | ^^^^^^^^^^^^^^^^^^^ help: if this is intentional, prefix it with an underscore: `_normalized_distance` [INFO] [stderr] | [INFO] [stderr] = note: `#[warn(unused_variables)]` on by default [INFO] [stderr] [INFO] [stderr] warning: struct `LayerGradients` is never constructed [INFO] [stderr] --> test6-nn/src/lib.rs:81:8 [INFO] [stderr] | [INFO] [stderr] 81 | struct LayerGradients { [INFO] [stderr] | ^^^^^^^^^^^^^^ [INFO] [stderr] | [INFO] [stderr] = note: `#[warn(dead_code)]` on by default [INFO] [stderr] [INFO] [stderr] warning: associated function `new` is never used [INFO] [stderr] --> test6-nn/src/lib.rs:87:8 [INFO] [stderr] | [INFO] [stderr] 86 | impl LayerGradients { [INFO] [stderr] | ------------------- associated function in this implementation [INFO] [stderr] 87 | fn new(weight_gradients: Matrix, bias_gradients: ColumnVector) -> Self { [INFO] [stderr] | ^^^ [INFO] [stderr] [INFO] [stderr] warning: constant `GRADIENT_CHECK_EPSILON_SQUARED` is never used [INFO] [stderr] --> test6-nn/src/lib.rs:118:7 [INFO] [stderr] | [INFO] [stderr] 118 | const GRADIENT_CHECK_EPSILON_SQUARED: f64 = GRADIENT_CHECK_EPSILON * GRADIENT_CHECK_EPSILON; [INFO] [stderr] | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stderr] [INFO] [stderr] warning: method `unroll_gradients` is never used [INFO] [stderr] --> test6-nn/src/lib.rs:1074:8 [INFO] [stderr] | [INFO] [stderr] 146 | impl SimpleNeuralNetwork { [INFO] [stderr] | ------------------------ method in this implementation [INFO] [stderr] ... 
[INFO] [stderr] 1074 | fn unroll_gradients( [INFO] [stderr] | ^^^^^^^^^^^^^^^^ [INFO] [stderr] [INFO] [stderr] warning: struct `JELU` is never constructed [INFO] [stderr] --> test6-nn/src/activation/activator/jelu.rs:4:8 [INFO] [stderr] | [INFO] [stderr] 4 | struct JELU { [INFO] [stderr] | ^^^^ [INFO] [stderr] [INFO] [stderr] warning: associated function `new` is never used [INFO] [stderr] --> test6-nn/src/activation/activator/jelu.rs:12:12 [INFO] [stderr] | [INFO] [stderr] 11 | impl JELU { [INFO] [stderr] | --------- associated function in this implementation [INFO] [stderr] 12 | pub fn new(crossover_point: f64) -> JELU { [INFO] [stderr] | ^^^ [INFO] [stderr] [INFO] [stderr] warning: unused `Result` that must be used [INFO] [stderr] --> test6-nn/src/lib.rs:665:13 [INFO] [stderr] | [INFO] [stderr] 665 | session_logger.write_training_session_file(initial_cost, network_config, optimizer_str); [INFO] [stderr] | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stderr] | [INFO] [stderr] = note: this `Result` may be an `Err` variant, which should be handled [INFO] [stderr] = note: `#[warn(unused_must_use)]` on by default [INFO] [stderr] help: use `let _ = ...` to ignore the resulting value [INFO] [stderr] | [INFO] [stderr] 665 | let _ = session_logger.write_training_session_file(initial_cost, network_config, optimizer_str); [INFO] [stderr] | +++++++ [INFO] [stderr] [INFO] [stderr] warning: unused `Result` that must be used [INFO] [stderr] --> test7-nn-mnist-classifier/src/main.rs:99:5 [INFO] [stderr] | [INFO] [stderr] 99 | / nn.train_stochastic( [INFO] [stderr] 100 | | &training_data, [INFO] [stderr] 101 | | 10_000, [INFO] [stderr] 102 | | // &Optimizer::standard_gradient_descent(0.9), [INFO] [stderr] ... | [INFO] [stderr] 110 | | Some(session_logger), [INFO] [stderr] 111 | | ); [INFO] [stderr] | |_____^ [INFO] [stderr] | [INFO] [stderr] = note: this `Result` may be an `Err` variant, which should be handled [INFO] [stderr] = note: `#[warn(unused_must_use)]` on by default [INFO] [stderr] help: use `let _ = ...` to ignore the resulting value [INFO] [stderr] | [INFO] [stderr] 99 | let _ = nn.train_stochastic( [INFO] [stderr] | +++++++ [INFO] [stderr] [INFO] [stderr] warning: variable does not need to be mutable [INFO] [stderr] --> test7-nn-mnist-classifier/src/big_theta.rs:381:13 [INFO] [stderr] | [INFO] [stderr] 381 | let mut big_theta = create_big_theta_for_test(&sizes); [INFO] [stderr] | ----^^^^^^^^^ [INFO] [stderr] | | [INFO] [stderr] | help: remove this `mut` [INFO] [stderr] | [INFO] [stderr] = note: `#[warn(unused_mut)]` on by default [INFO] [stderr] [INFO] [stderr] warning: unused variable: `lr` [INFO] [stderr] --> test7-nn-mnist-classifier/src/lib.rs:2242:13 [INFO] [stderr] | [INFO] [stderr] 2242 | let lr = LeakyReLU::new(0.1); [INFO] [stderr] | ^^ help: if this is intentional, prefix it with an underscore: `_lr` [INFO] [stderr] [INFO] [stderr] warning: constant `GRADIENT_CHECK_EPSILON_SQUARED` is never used [INFO] [stderr] --> test7-nn-mnist-classifier/src/lib.rs:118:7 [INFO] [stderr] | [INFO] [stderr] 118 | const GRADIENT_CHECK_EPSILON_SQUARED: f64 = GRADIENT_CHECK_EPSILON * GRADIENT_CHECK_EPSILON; [INFO] [stderr] | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stderr] | [INFO] [stderr] = note: `#[warn(dead_code)]` on by default [INFO] [stderr] [INFO] [stderr] warning: method `err_output_layer` is never used [INFO] [stderr] --> test7-nn-mnist-classifier/src/lib.rs:328:8 [INFO] [stderr] | [INFO] [stderr] 148 | impl NeuralNetwork { [INFO] [stderr] | 
------------------ method in this implementation [INFO] [stderr] ... [INFO] [stderr] 328 | fn err_output_layer( [INFO] [stderr] | ^^^^^^^^^^^^^^^^ [INFO] [stderr] [INFO] [stderr] warning: variable does not need to be mutable [INFO] [stderr] --> test6-nn/src/big_theta.rs:381:13 [INFO] [stderr] | [INFO] [stderr] 381 | let mut big_theta = create_big_theta_for_test(&sizes); [INFO] [stderr] | ----^^^^^^^^^ [INFO] [stderr] | | [INFO] [stderr] | help: remove this `mut` [INFO] [stderr] | [INFO] [stderr] = note: `#[warn(unused_mut)]` on by default [INFO] [stderr] [INFO] [stderr] warning: constant `GRADIENT_CHECK_EPSILON_SQUARED` is never used [INFO] [stderr] --> test6-nn/src/lib.rs:118:7 [INFO] [stderr] | [INFO] [stderr] 118 | const GRADIENT_CHECK_EPSILON_SQUARED: f64 = GRADIENT_CHECK_EPSILON * GRADIENT_CHECK_EPSILON; [INFO] [stderr] | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stderr] | [INFO] [stderr] = note: `#[warn(dead_code)]` on by default [INFO] [stderr] [INFO] [stderr] warning: enum `MultivariableRegressionError` is never used [INFO] [stderr] --> test4-multivariable-regression/src/main.rs:36:6 [INFO] [stderr] | [INFO] [stderr] 36 | enum MultivariableRegressionError { [INFO] [stderr] | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^ [INFO] [stderr] | [INFO] [stderr] = note: `#[warn(dead_code)]` on by default [INFO] [stderr] [INFO] [stderr] warning: enum `InvalidDataError` is never used [INFO] [stderr] --> test4-multivariable-regression/src/main.rs:43:6 [INFO] [stderr] | [INFO] [stderr] 43 | enum InvalidDataError { [INFO] [stderr] | ^^^^^^^^^^^^^^^^ [INFO] [stderr] [INFO] [stderr] warning: variant `blue` should have an upper camel case name [INFO] [stderr] --> test2-mlp-classifier/src/main.rs:59:5 [INFO] [stderr] | [INFO] [stderr] 59 | blue, [INFO] [stderr] | ^^^^ help: convert the identifier to upper camel case: `Blue` [INFO] [stderr] | [INFO] [stderr] = note: `#[warn(non_camel_case_types)]` on by default [INFO] [stderr] [INFO] [stderr] warning: variant `orange` should have an upper camel case name [INFO] [stderr] --> test2-mlp-classifier/src/main.rs:60:5 [INFO] [stderr] | [INFO] [stderr] 60 | orange, [INFO] [stderr] | ^^^^^^ help: convert the identifier to upper camel case (notice the capitalization): `Orange` [INFO] [stderr] [INFO] [stderr] warning: unused import: `ndarray` [INFO] [stderr] --> test2-mlp-classifier/src/main.rs:5:5 [INFO] [stderr] | [INFO] [stderr] 5 | use ndarray::*; [INFO] [stderr] | ^^^^^^^ [INFO] [stderr] | [INFO] [stderr] = note: `#[warn(unused_imports)]` on by default [INFO] [stderr] [INFO] [stderr] warning: unused variable: `i` [INFO] [stderr] --> test2-mlp-classifier/src/main.rs:80:9 [INFO] [stderr] | [INFO] [stderr] 80 | for i in 0..n { [INFO] [stderr] | ^ help: if this is intentional, prefix it with an underscore: `_i` [INFO] [stderr] | [INFO] [stderr] = note: `#[warn(unused_variables)]` on by default [INFO] [stderr] [INFO] [stderr] warning: struct `MultilayerPerceptron` is never constructed [INFO] [stderr] --> test2-mlp-classifier/src/main.rs:7:8 [INFO] [stderr] | [INFO] [stderr] 7 | struct MultilayerPerceptron { [INFO] [stderr] | ^^^^^^^^^^^^^^^^^^^^ [INFO] [stderr] | [INFO] [stderr] = note: `#[warn(dead_code)]` on by default [INFO] [stderr] [INFO] [stderr] warning: struct `MLPArchitecture` is never constructed [INFO] [stderr] --> test2-mlp-classifier/src/main.rs:14:8 [INFO] [stderr] | [INFO] [stderr] 14 | struct MLPArchitecture { [INFO] [stderr] | ^^^^^^^^^^^^^^^ [INFO] [stderr] [INFO] [stderr] warning: associated function `new` is never used [INFO] [stderr] --> 
[INFO] [stderr] warning: `test6-nn` (lib) generated 8 warnings
[INFO] [stderr] warning: `test7-nn-mnist-classifier` (bin "test7-nn-mnist-classifier" test) generated 1 warning
[INFO] [stderr] warning: `test7-nn-mnist-classifier` (lib test) generated 6 warnings (2 duplicates) (run `cargo fix --lib -p test7-nn-mnist-classifier --tests` to apply 1 suggestion)
[INFO] [stderr] warning: `test6-nn` (lib test) generated 4 warnings (2 duplicates) (run `cargo fix --lib -p test6-nn --tests` to apply 1 suggestion)
[INFO] [stderr] warning: `test4-multivariable-regression` (bin "test4-multivariable-regression" test) generated 2 warnings
[INFO] [stderr] warning: `test2-mlp-classifier` (bin "test2-mlp-classifier" test) generated 7 warnings
[INFO] [stderr] warning: `common` (lib test) generated 2 warnings (2 duplicates)
[INFO] [stderr] warning: unused `Result` that must be used
[INFO] [stderr] --> test6-nn/src/main.rs:92:5
[INFO] [stderr] |
[INFO] [stderr] 92 | / nn.train_stochastic(
[INFO] [stderr] 93 | | &training_data,
[INFO] [stderr] 94 | | 10_000,
[INFO] [stderr] 95 | | // &Optimizer::standard_gradient_descent(0.9),
[INFO] [stderr] ... |
[INFO] [stderr] 103 | | Some(session_logger),
[INFO] [stderr] 104 | | );
[INFO] [stderr] | |_____^
[INFO] [stderr] |
[INFO] [stderr] = note: this `Result` may be an `Err` variant, which should be handled
[INFO] [stderr] = note: `#[warn(unused_must_use)]` on by default
[INFO] [stderr] help: use `let _ = ...` to ignore the resulting value
[INFO] [stderr] |
[INFO] [stderr] 92 | let _ = nn.train_stochastic(
[INFO] [stderr] | +++++++
[INFO] [stderr]
[INFO] [stderr] warning: function `e_to_the_x` is never used
[INFO] [stderr] --> test1-autodiff/src/main.rs:3:4
[INFO] [stderr] |
[INFO] [stderr] 3 | fn e_to_the_x(x: FT) -> FT {
[INFO] [stderr] | ^^^^^^^^^^
[INFO] [stderr] |
[INFO] [stderr] = note: `#[warn(dead_code)]` on by default
[INFO] [stderr]
[INFO] [stderr] warning: `test6-nn` (bin "test6-nn" test) generated 1 warning
[INFO] [stderr] warning: `test1-autodiff` (bin "test1-autodiff" test) generated 1 warning
[INFO] [stderr] Finished `test` profile [unoptimized + debuginfo] target(s) in 0.07s
[INFO] [stderr] Running unittests src/lib.rs (/opt/rustwide/target/debug/deps/common-0f1e0a565094e000)
[INFO] [stdout]
[INFO] [stdout] running 104 tests
[INFO] [stdout] test linalg::column_vector_tests::add_mut_works ... ok
[INFO] [stdout] test linalg::column_vector_tests::add_works ... ok
[INFO] [stdout] test linalg::column_vector_tests::div_scalar_mut_works ... ok
[INFO] [stdout] test linalg::column_vector_tests::can_create_a_column_vector_and_use_from_and_into ... ok
[INFO] [stdout] test linalg::column_vector_tests::dot_product_works ... ok
[INFO] [stdout] test linalg::column_vector_tests::div_scalar_works ... ok
[INFO] [stdout] test linalg::column_vector_tests::elementwise_divide_in_place_works ... ok
[INFO] [stdout] test linalg::column_vector_tests::elementwise_divide_works ... ok
[INFO] [stdout] test linalg::column_vector_tests::empty_works ... ok
[INFO] [stdout] test linalg::column_vector_tests::fill_new_works ... ok
[INFO] [stdout] test linalg::column_vector_tests::hadamard_product_chaining_works ...
ok [INFO] [stdout] test linalg::column_vector_tests::hadamard_product_works ... ok [INFO] [stdout] test linalg::column_vector_tests::hadamard_product_in_place_works ... ok [INFO] [stdout] test linalg::column_vector_tests::into_value_works ... ok [INFO] [stdout] test linalg::column_vector_tests::mult_by_matrix_works ... ok [INFO] [stdout] test linalg::column_vector_tests::minus_in_place_works ... ok [INFO] [stdout] test linalg::column_vector_tests::multiply_by_scalar_works ... ok [INFO] [stdout] test linalg::column_vector_tests::new_zero_vector_works ... ok [INFO] [stdout] test linalg::column_vector_tests::plus_works ... ok [INFO] [stdout] test linalg::column_vector_tests::subtract_works ... ok [INFO] [stdout] test linalg::column_vector_tests::test_add_scalar_to_each_element_in_place ... ok [INFO] [stdout] test linalg::column_vector_tests::test_basics ... ok [INFO] [stdout] test linalg::column_vector_tests::test_can_iterate_over_column_vector ... ok [INFO] [stdout] test linalg::column_vector_tests::test_can_do_double_iterate_over_column_vectors ... ok [INFO] [stdout] test linalg::column_vector_tests::test_can_iterate_mutably_over_column_vector ... ok [INFO] [stdout] test linalg::column_vector_tests::test_elementwise_square_root_in_place ... ok [INFO] [stdout] test linalg::column_vector_tests::test_mult_scalar_mut ... ok [INFO] [stdout] test linalg::column_vector_tests::test_outer_product ... ok [INFO] [stdout] test linalg::column_vector_tests::test_vec_length ... ok [INFO] [stdout] test linalg::column_vector_tests::transpose_into_row_vector_matrix_works ... ok [INFO] [stdout] test linalg::column_vector_tests::transpose_works ... ok [INFO] [stdout] test linalg::columns_matrix_builder_tests::test_columns_matrix_builder_with_chaining ... ok [INFO] [stdout] test linalg::rows_matrix_builder_tests::test_row_matrix_builder_with_non_chaining ... ok [INFO] [stdout] test linalg::rows_matrix_builder_tests::test_rows_matrix_builder_with_chaining ... ok [INFO] [stdout] test linalg::tests::add_in_place_serial_works ... ok [INFO] [stdout] test linalg::tests::new_identity_matrix_works ... ok [INFO] [stdout] test linalg::test_components_in_the_module_root::test_euclidian_distance ... ok [INFO] [stdout] test linalg::columns_matrix_builder_tests::test_columns_matrix_builder_with_non_chaining ... ok [INFO] [stdout] test linalg::tests::plus_works ... ok [INFO] [stdout] test linalg::test_components_in_the_module_root::test_euclidian_length ... ok [INFO] [stdout] test linalg::tests::set_and_get_work ... ok [INFO] [stdout] test linalg::tests::subtract ... ok [INFO] [stdout] test linalg::tests::push_column_works ... ok [INFO] [stdout] test linalg::tests::test_div_scalar ... ok [INFO] [stdout] test linalg::tests::add_mut_works ... ok [INFO] [stdout] test linalg::tests::test_add_scalar_to_each_element_in_place ... ok [INFO] [stdout] test linalg::tests::can_add_rows_and_get_values_at_specified_indexes ... ok [INFO] [stdout] test linalg::tests::multiply_works ... ok [INFO] [stdout] test linalg::tests::add_in_place_par_works ... ok [INFO] [stdout] test linalg::tests::test_div_scalar_mut ... ok [INFO] [stdout] test linalg::tests::test_elementwise_divide ... ok [INFO] [stdout] test linalg::tests::test_elementwise_divide_product_in_place ... ok [INFO] [stdout] test linalg::tests::test_extract_column ... ok [INFO] [stdout] test linalg::tests::test_extract_column_vector_as_matrix ... ok [INFO] [stdout] test linalg::tests::test_hadamard_product_chaining ... ok [INFO] [stdout] test linalg::tests::test_hadamard_product ... 
ok [INFO] [stdout] test linalg::tests::test_matrix_vector_multiplication_with_column_vector_type ... ok [INFO] [stdout] test linalg::tests::test_multiply_by_scalar ... ok [INFO] [stdout] test linalg::tests::test_matrix_vector_multiplication ... ok [INFO] [stdout] test linalg::tests::test_subtract_mut ... ok [INFO] [stdout] test linalg::tests::test_mult_scalar_mut ... ok [INFO] [stdout] test linalg::tests::test_vec_length ... ok [INFO] [stdout] test linalg::tests::test_from_columns ... ok [INFO] [stdout] test old_matrix::rows_matrix_builder_tests::test_rows_matrix_builder_with_chaining ... ok [INFO] [stdout] test old_matrix::columns_matrix_builder_tests::test_columns_matrix_builder_with_chaining ... ok [INFO] [stdout] test linalg::tests::test_elementwise_square_root_in_place ... ok [INFO] [stdout] test linalg::tests::transpose_of_column_vector_mult_by_column_vector_works ... ok [INFO] [stdout] test old_matrix::tests::can_add_rows_and_get_values_at_specified_indexes ... ok [INFO] [stdout] test old_matrix::rows_matrix_builder_tests::test_row_matrix_builder_with_non_chaining ... ok [INFO] [stdout] test old_matrix::columns_matrix_builder_tests::test_columns_matrix_builder_with_non_chaining ... ok [INFO] [stdout] test linalg::tests::transpose_works ... ok [INFO] [stdout] test old_matrix::tests::test_divide_by_scalar_in_place ... ok [INFO] [stdout] test old_matrix::tests::test_extract_column ... ok [INFO] [stdout] test old_matrix::tests::test_from_columns ... ok [INFO] [stdout] test old_matrix::tests::minus_works ... ok [INFO] [stdout] test old_matrix::tests::new_identity_matrix_works ... ok [INFO] [stdout] test linalg::tests::transpose_works_for_column_vector ... ok [INFO] [stdout] test old_matrix::tests::push_column_works ... ok [INFO] [stdout] test old_matrix::tests::plus_works ... ok [INFO] [stdout] test old_matrix::tests::set_and_get_work ... ok [INFO] [stdout] test old_matrix::tests::subtract_works ... ok [INFO] [stdout] test old_matrix::tests::add_in_place_works ... ok [INFO] [stdout] test old_matrix::tests::test_hadamard_product_in_place ... ok [INFO] [stdout] test old_matrix::tests::test_div_scalar ... ok [INFO] [stdout] test old_matrix::tests::test_matrix_vector_multiplication ... ok [INFO] [stdout] test linalg::tests::test_hadamard_product_in_place ... ok [INFO] [stdout] test old_matrix::tests::test_multiply_by_scalar ... ok [INFO] [stdout] test old_matrix::tests::test_multiply_by_scalar_in_place ... ok [INFO] [stdout] test old_matrix::tests::transpose_of_column_vector_mult_by_column_vector_works ... ok [INFO] [stdout] test old_matrix::tests::test_vec_length ... ok [INFO] [stdout] test old_matrix::tests::transpose_works ... ok [INFO] [stdout] test old_matrix::tests::transpose_works_for_column_vector ... ok [INFO] [stdout] test point::tests::it_works ... ok [INFO] [stdout] test softmax::tests::test_softmax_derivative_diagonal ... ok [INFO] [stdout] test softmax::tests::test_softmax_derivative_off_diagonal ... ok [INFO] [stdout] test softmax::tests::test_softmax_derivative_matrix_size ... ok [INFO] [stdout] test softmax::tests::test_softmax_derivative_uniform_input ... ok [INFO] [stdout] test softmax::tests::test_softmax_output_range ... ok [INFO] [stdout] test softmax::tests::test_softmax_numerical_stability ... ok [INFO] [stdout] test old_matrix::tests::multiply_works ... ok [INFO] [stdout] test old_matrix::tests::test_hadamard_product ... ok [INFO] [stdout] test softmax::tests::test_softmax_sum_to_one ... ok [INFO] [stdout] test tests::dot_returns_err_if_dimentions_are_zero ... 
ok [INFO] [stdout] test tests::dot_product_works ... ok [INFO] [stdout] [INFO] [stdout] test result: ok. 104 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.04s [INFO] [stdout] [INFO] [stderr] Running unittests src/lib.rs (/opt/rustwide/target/debug/deps/metrics-ae16c0c199bcebbe) [INFO] [stdout] [INFO] [stdout] running 0 tests [INFO] [stdout] [INFO] [stdout] test result: ok. 0 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.00s [INFO] [stdout] [INFO] [stderr] Running unittests src/lib.rs (/opt/rustwide/target/debug/deps/mnist_data-ecb2e028fd7e9a4f) [INFO] [stdout] [INFO] [stdout] running 0 tests [INFO] [stdout] [INFO] [stdout] test result: ok. 0 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.00s [INFO] [stdout] [INFO] [stderr] Running unittests src/main.rs (/opt/rustwide/target/debug/deps/test1_autodiff-fe39e58d71879494) [INFO] [stderr] Running unittests src/main.rs (/opt/rustwide/target/debug/deps/test2_mlp_classifier-45b1e87d12f99ca7) [INFO] [stdout] [INFO] [stdout] running 0 tests [INFO] [stdout] [INFO] [stdout] test result: ok. 0 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.00s [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] running 0 tests [INFO] [stdout] [INFO] [stdout] test result: ok. 0 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.00s [INFO] [stdout] [INFO] [stderr] Running unittests src/main.rs (/opt/rustwide/target/debug/deps/test3_simple_linear_regression-15c303f66434f34b) [INFO] [stdout] [INFO] [stderr] Running unittests src/main.rs (/opt/rustwide/target/debug/deps/test4_multivariable_regression-118171bc57518bde) [INFO] [stdout] running 1 test [INFO] [stdout] test tests::it_yields_the_correct_result ... ok [INFO] [stdout] [INFO] [stdout] test result: ok. 1 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.00s [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] running 7 tests [INFO] [stdout] test tests::compute_partial_derivatives_v_works ... ok [INFO] [stdout] test tests::hypothesis_v_works ... ok [INFO] [stdout] test tests::it_yields_the_correct_result_for_3d_ex1 ... ok [INFO] [stdout] test tests::it_yields_the_correct_result_for_2d ... ok [INFO] [stdout] test tests::cost_fn_works_for_non_zero_cost ... ok [INFO] [stdout] test tests::cost_fn_works_for_zero_cost ... ok [INFO] [stdout] test tests::it_yields_the_correct_result_for_3d_ex2 ... ok [INFO] [stdout] [INFO] [stdout] test result: ok. 7 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.01s [INFO] [stdout] [INFO] [stderr] Running unittests src/main.rs (/opt/rustwide/target/debug/deps/test5_playing_with_matrix_ideas-c036608d79b403cc) [INFO] [stdout] [INFO] [stdout] running 0 tests [INFO] [stdout] [INFO] [stdout] test result: ok. 0 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.00s [INFO] [stdout] [INFO] [stderr] Running unittests src/lib.rs (/opt/rustwide/target/debug/deps/test6_nn-571c3d3678dd0d8d) [INFO] [stdout] [INFO] [stdout] running 60 tests [INFO] [stdout] test activation::activator::elu::tests::activate_prime_works ... ok [INFO] [stdout] test activation::activator::relu::tests::activate_prime_works ... ok [INFO] [stdout] test activation::activator::relu::tests::activate_works ... ok [INFO] [stdout] test activation::activator::tests::activate_derivative_vector_works ... ok [INFO] [stdout] test activation::jelu::tests::test_activate ... ok [INFO] [stdout] test activation::leaky_relu::tests::activate_prime_works ... 
ok [INFO] [stdout] test activation::jelu::tests::test_activate_derivative ... ok [INFO] [stdout] test activation::activator::jelu::tests::test_activate_derivative ... ok [INFO] [stdout] test activation::activator::sigmoid::tests::activate_works ... ok [INFO] [stdout] test activation::activator::jelu::tests::test_activate ... ok [INFO] [stdout] test activation::relu::tests::activate_works ... ok [INFO] [stdout] test activation::sigmoid::tests::activate_derivative_works ... ok [INFO] [stdout] test big_theta::tests::divide_scalar_works ... ok [INFO] [stdout] test activation::activator::elu::tests::activate_works ... ok [INFO] [stdout] test big_theta::tests::test_add_in_place_works ... ok [INFO] [stdout] test big_theta::tests::test_elementwise_mult_in_place_place_works ... ok [INFO] [stdout] test big_theta::tests::test_get_weights_matrix_mut ... ok [INFO] [stdout] test big_theta::tests::test_mult_scalar_return_new_works ... ok [INFO] [stdout] test big_theta::tests::test_subtract_in_place_works ... ok [INFO] [stdout] test activation::activator::sigmoid::tests::activate_derivative_works ... ok [INFO] [stdout] test activation::sigmoid::tests::activate_works ... ok [INFO] [stdout] test big_theta::tests::create_big_theta_for_test_with_scale_factor_works ... ok [INFO] [stdout] test big_theta::tests::create_big_theta_for_test_works ... ok [INFO] [stdout] test big_theta::tests::test_elementwise_divide_in_place_place_works ... ok [INFO] [stdout] test activation::leaky_relu::tests::activate_works ... ok [INFO] [stdout] test cost::tests::test_quadratic_cost_fn ... ok [INFO] [stdout] test cost::tests::test_quadratic_cost_fn_dimension_mismatch ... ok [INFO] [stdout] test builder::test_nn_builder::test_nn_builder_manual_wb_values ... ok [INFO] [stdout] test tests::feed_forward_works_simple_three_layer_using_feed_forward_capturing ... ok [INFO] [stdout] test tests::feed_forward_works_simple_two_layer ... ok [INFO] [stdout] test big_theta::tests::test_zero_from_sizes ... ok [INFO] [stdout] test tests::feed_forward_works_simple_three_layer ... ok [INFO] [stdout] test big_theta::tests::test_divide_scalar_return_new_works ... ok [INFO] [stdout] test activation::activator::tests::activate_vector_works ... ok [INFO] [stdout] test activation::elu::tests::activate_prime_works ... ok [INFO] [stdout] test activation::relu::tests::activate_prime_works ... ok [INFO] [stdout] test big_theta::tests::test_mult_scalar_in_place_works ... ok [INFO] [stdout] test tests::test_cost_single_tr_ex_multiple_output_neurons ... ok [INFO] [stdout] test tests::test_cost_single_tr_ex_single_output_neuron ... ok [INFO] [stdout] test tests::test_get_weight_matrix_shape ... ok [INFO] [stdout] test activation::elu::tests::activate_works ... ok [INFO] [stdout] test tests::test_rev_layer_indexs_computation ... ok [INFO] [stdout] test tests::test_unroll_gradients ... ok [INFO] [stdout] test tests::test_reshape_weights_and_biases ... ok [INFO] [stdout] test tests::test_unroll_weights_and_biases ... ok [INFO] [stdout] test tests::test_cost_for_training_set_iterative_impl ... ok [INFO] [stdout] test tests::test_z_vec ... ok [INFO] [stdout] test builder::test_nn_builder::panics_on_hidden_layer_with_invalid_weight_or_bias_dimensions ... ok [INFO] [stdout] test builder::test_nn_builder::cannot_add_output_layer_before_input_layer - should panic ... ok [INFO] [stdout] test builder::test_nn_builder::cannot_add_hiddlen_layer_before_input_layer - should panic ... 
ok [INFO] [stdout] test tests::test_nn_using_more_hidden_layers_with_more_neurons_with_leaky_relu_and_momentum_opt ... ok [INFO] [stdout] test tests::test_fan_in_fan_out ... ok [INFO] [stdout] test tests::test_nn_using_more_hidden_layers_with_more_neurons_with_leaky_relu_and_adam_opt ... ok [INFO] [stdout] test tests::test_nn_using_more_hidden_layers_with_more_neurons_with_relu_hidden_layers ... ok [INFO] [stdout] test tests::simple_test_to_get_elu_sorted_out ... ok [INFO] [stdout] test tests::simple_jelu_test ... ok [INFO] [stdout] test tests::simple_leaky_relu_test ... ok [INFO] [stdout] test tests::test_nn_using_more_hidden_layers_with_more_neurons ... ok [INFO] [stdout] test tests::test_nn_using_constructor_for_random_initial_weights_and_biases ... ok [INFO] [stdout] test tests::test_nn has been running for over 60 seconds [INFO] [stdout] test tests::test_nn ... ok [INFO] [stdout] [INFO] [stdout] test result: ok. 60 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 69.47s [INFO] [stdout] [INFO] [stderr] (took PT0.225631927S) (took PT0.591416501S) (took PT3.294045504S) (took PT22.074779888S) (took PT35.628493683S) (took PT69.456707451S) Running unittests src/main.rs (/opt/rustwide/target/debug/deps/test6_nn-c1877b0e8c793888) [INFO] [stdout] [INFO] [stdout] running 0 tests [INFO] [stdout] [INFO] [stdout] test result: ok. 0 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.00s [INFO] [stdout] [INFO] [stderr] Running unittests src/lib.rs (/opt/rustwide/target/debug/deps/test7_nn_mnist_classifier-477276f4f236df0f) [INFO] [stdout] [INFO] [stdout] running 45 tests [INFO] [stdout] test activation::leaky_relu::tests::activate_prime_works ... ok [INFO] [stdout] test activation::relu::tests::activate_prime_works ... ok [INFO] [stdout] test big_theta::tests::create_big_theta_for_test_with_scale_factor_works ... ok [INFO] [stdout] test big_theta::tests::test_add_in_place_works ... ok [INFO] [stdout] test big_theta::tests::test_elementwise_divide_in_place_place_works ... ok [INFO] [stdout] test big_theta::tests::test_get_weights_matrix_mut ... ok [INFO] [stdout] test big_theta::tests::test_elementwise_mult_in_place_place_works ... ok [INFO] [stdout] test activation::sigmoid::tests::activate_derivative_works ... ok [INFO] [stdout] test activation::sigmoid::tests::activate_works ... ok [INFO] [stdout] test big_theta::tests::test_subtract_in_place_works ... ok [INFO] [stdout] test big_theta::tests::create_big_theta_for_test_works ... ok [INFO] [stdout] test big_theta::tests::test_zero_from_sizes ... ok [INFO] [stdout] test activation::relu::tests::activate_works ... ok [INFO] [stdout] test builder::test_nn_builder::test_nn_builder_manual_wb_values ... ok [INFO] [stdout] test tests::feed_forward_works_simple_two_layer ... ok [INFO] [stdout] test big_theta::tests::test_divide_scalar_return_new_works ... ok [INFO] [stdout] test activation::leaky_relu::tests::activate_works ... ok [INFO] [stdout] test cost::tests::test_quadratic_cost_fn_dimension_mismatch ... ok [INFO] [stdout] test tests::feed_forward_works_simple_three_layer ... ok [INFO] [stdout] test big_theta::tests::test_mult_scalar_return_new_works ... ok [INFO] [stdout] test big_theta::tests::test_mult_scalar_in_place_works ... ok [INFO] [stdout] test big_theta::tests::divide_scalar_works ... ok [INFO] [stdout] test cost::tests::test_quadratic_cost_fn ... ok [INFO] [stdout] test tests::test_cost_single_tr_ex_single_output_neuron ... 
ok [INFO] [stdout] test tests::feed_forward_works_simple_three_layer_using_feed_forward_capturing ... ok [INFO] [stdout] test tests::test_get_weight_matrix_shape ... ok [INFO] [stdout] test tests::test_cost_single_tr_ex_multiple_output_neurons ... ok [INFO] [stdout] test tests::test_reshape_weights_and_biases ... ok [INFO] [stdout] test tests::test_unroll_weights_and_biases ... ok [INFO] [stdout] test tests::test_rev_layer_indexs_computation ... ok [INFO] [stdout] test tests::test_z_vec ... ok [INFO] [stdout] test tests::test_unroll_gradients ... ok [INFO] [stdout] test builder::test_nn_builder::cannot_add_output_layer_before_input_layer - should panic ... ok [INFO] [stdout] test tests::test_cost_for_training_set_iterative_impl ... ok [INFO] [stdout] test builder::test_nn_builder::panics_on_hidden_layer_with_invalid_weight_or_bias_dimensions ... ok [INFO] [stdout] test builder::test_nn_builder::cannot_add_hiddlen_layer_before_input_layer - should panic ... ok [INFO] [stdout] test tests::test_fan_in_fan_out ... ok [INFO] [stdout] test tests::test_nn_using_more_hidden_layers_with_more_neurons_with_leaky_relu_and_momentum_opt ... ok [INFO] [stdout] test tests::test_nn_using_more_hidden_layers_with_more_neurons_with_leaky_relu_and_adam_opt ... ok [INFO] [stdout] test tests::test_nn_using_more_hidden_layers_with_more_neurons_with_relu_hidden_layers ... ok [INFO] [stdout] test tests::simple_relu_test ... ok [INFO] [stdout] test tests::simple_leaky_relu_test ... ok [INFO] [stdout] test tests::test_nn_using_more_hidden_layers_with_more_neurons ... ok [INFO] [stdout] test tests::test_nn_using_constructor_for_random_initial_weights_and_biases ... ok [INFO] [stdout] test tests::test_nn ... ok [INFO] [stdout] [INFO] [stdout] test result: ok. 45 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 47.33s [INFO] [stdout] [INFO] [stderr] (took PT0.275630986S) (took PT0.301000076S) (took PT0.547837073S) (took PT17.996513084S) (took PT20.235362554S) (took PT47.304582074S) Running unittests src/main.rs (/opt/rustwide/target/debug/deps/test7_nn_mnist_classifier-438bfc6ff94e1345) [INFO] [stdout] [INFO] [stdout] running 0 tests [INFO] [stdout] [INFO] [stdout] test result: ok. 0 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.00s [INFO] [stdout] [INFO] [stderr] Doc-tests common [INFO] [stdout] [INFO] [stderr] Doc-tests metrics [INFO] [stdout] running 0 tests [INFO] [stdout] [INFO] [stdout] test result: ok. 0 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.00s [INFO] [stdout] [INFO] [stdout] [INFO] [stdout] running 0 tests [INFO] [stdout] [INFO] [stdout] test result: ok. 0 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.00s [INFO] [stdout] [INFO] [stderr] Doc-tests mnist_data [INFO] [stdout] [INFO] [stdout] running 0 tests [INFO] [stdout] [INFO] [stdout] test result: ok. 0 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.00s [INFO] [stdout] [INFO] [stderr] Doc-tests test6_nn [INFO] [stdout] [INFO] [stdout] running 0 tests [INFO] [stdout] [INFO] [stdout] test result: ok. 0 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.00s [INFO] [stdout] [INFO] [stderr] Doc-tests test7_nn_mnist_classifier [INFO] [stdout] [INFO] [stdout] running 0 tests [INFO] [stdout] [INFO] [stdout] test result: ok. 
0 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.00s [INFO] [stdout] [INFO] running `Command { std: "docker" "inspect" "1e51216369f747a2f8788c096d99a55238aade6e1020cc791cc4de9e3d833938", kill_on_drop: false }` [INFO] running `Command { std: "docker" "rm" "-f" "1e51216369f747a2f8788c096d99a55238aade6e1020cc791cc4de9e3d833938", kill_on_drop: false }` [INFO] [stdout] 1e51216369f747a2f8788c096d99a55238aade6e1020cc791cc4de9e3d833938
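The `common` crate's softmax tests listed earlier in this run (`test_softmax_sum_to_one`, `test_softmax_output_range`, `test_softmax_numerical_stability`, and the derivative tests) exercise the standard properties of the function. For reference, a crate-independent sketch of a numerically stable softmax in Rust (this is not the repository's implementation, only an illustration of the properties those tests check):

```rust
// Generic numerically stable softmax: subtract the maximum before
// exponentiating so large inputs do not overflow to infinity.
fn softmax(z: &[f64]) -> Vec<f64> {
    let max = z.iter().cloned().fold(f64::NEG_INFINITY, f64::max);
    let exps: Vec<f64> = z.iter().map(|v| (v - max).exp()).collect();
    let sum: f64 = exps.iter().sum();
    exps.iter().map(|e| e / sum).collect()
}

fn main() {
    // Large inputs would overflow a naive exp() implementation,
    // but the max-shifted version stays finite.
    let out = softmax(&[1000.0, 1001.0, 1002.0]);
    let total: f64 = out.iter().sum();
    assert!((total - 1.0).abs() < 1e-12); // sums to one
    assert!(out.iter().all(|p| (0.0..=1.0).contains(p))); // output range
    println!("{out:?}");
}
```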