Nov 09 12:09:32.329 INFO checking helloooooo/learn_deep_learning against master#653da4fd006c97625247acd7e076d0782cdc149b for pr-55632
Nov 09 12:09:32.330 INFO running `"docker" "create" "-v" "/mnt/big/crater/work/local/target-dirs/pr-55632/worker-6/master#653da4fd006c97625247acd7e076d0782cdc149b:/target:rw,Z" "-v" "/mnt/big/crater/work/local/test-source/worker-6/pr-55632/master#653da4fd006c97625247acd7e076d0782cdc149b:/source:ro,Z" "-v" "/mnt/big/crater/work/local/cargo-home:/cargo-home:ro,Z" "-v" "/mnt/big/crater/work/local/rustup-home:/rustup-home:ro,Z" "-e" "USER_ID=1000" "-e" "SOURCE_DIR=/source" "-e" "USER_ID=1000" "-e" "CMD=cargo +653da4fd006c97625247acd7e076d0782cdc149b-alt check --frozen --all --all-targets" "-e" "CARGO_TARGET_DIR=/target" "-e" "CARGO_INCREMENTAL=0" "-e" "RUST_BACKTRACE=full" "-e" "RUSTFLAGS=--cap-lints=forbid" "-e" "CARGO_HOME=/cargo-home" "-e" "RUSTUP_HOME=/rustup-home" "-m" "1536M" "--network" "none" "crater"`
Nov 09 12:09:32.963 INFO blam! 665f3f794316fb6657caff1255a760e27d0bc969eb63cf3186b250bb6823a3f8
Nov 09 12:09:32.975 INFO running `"docker" "start" "-a" "665f3f794316fb6657caff1255a760e27d0bc969eb63cf3186b250bb6823a3f8"`
Nov 09 12:09:34.953 INFO kablam! usermod: no changes
Nov 09 12:09:35.253 INFO kablam! Checking gnuplot v0.0.23
Nov 09 12:09:35.253 INFO kablam! Checking mnist v0.4.0
Nov 09 12:09:35.255 INFO kablam! Checking base64 v0.8.0
Nov 09 12:09:35.255 INFO kablam! Checking bytes v0.4.5
Nov 09 12:09:35.255 INFO kablam! Checking futures-cpupool v0.1.7
Nov 09 12:09:35.255 INFO kablam! Checking alga v0.5.2
Nov 09 12:09:35.259 INFO kablam! Checking chrono v0.4.0
Nov 09 12:09:35.259 INFO kablam! Checking native-tls v0.1.4
Nov 09 12:09:39.831 INFO kablam! Checking tokio-io v0.1.4
Nov 09 12:09:42.747 INFO kablam! Checking tokio-core v0.1.11
Nov 09 12:09:45.844 INFO kablam! Checking tokio-proto v0.1.1
Nov 09 12:09:45.844 INFO kablam! Checking tokio-tls v0.1.3
Nov 09 12:09:48.916 INFO kablam! Checking nalgebra v0.13.1
Nov 09 12:09:51.455 INFO kablam! Checking hyper v0.11.9
Nov 09 12:10:00.747 INFO kablam! Checking hyper-tls v0.1.2
Nov 09 12:10:03.497 INFO kablam! Checking egg-mode v0.12.0
Nov 09 12:10:29.966 INFO kablam! Checking test1 v0.1.0 (/source)
Nov 09 12:10:31.347 INFO kablam! warning: unused imports: `RefMut`, `Ref`
Nov 09 12:10:31.347 INFO kablam!  --> src/gradient.rs:5:26
Nov 09 12:10:31.347 INFO kablam!   |
Nov 09 12:10:31.347 INFO kablam! 5 | use std::cell::{RefCell, Ref, RefMut};
Nov 09 12:10:31.347 INFO kablam!   |                          ^^^  ^^^^^^
Nov 09 12:10:31.347 INFO kablam!   |
Nov 09 12:10:31.347 INFO kablam!   = note: #[warn(unused_imports)] on by default
Nov 09 12:10:31.347 INFO kablam!
Nov 09 12:10:31.347 INFO kablam! warning: unused import: `std::cell::RefCell`
Nov 09 12:10:31.347 INFO kablam!  --> src/nural.rs:5:5
Nov 09 12:10:31.347 INFO kablam!   |
Nov 09 12:10:31.347 INFO kablam! 5 | use std::cell::RefCell;
Nov 09 12:10:31.347 INFO kablam!   |     ^^^^^^^^^^^^^^^^^^
Nov 09 12:10:31.347 INFO kablam!
Nov 09 12:10:31.635 INFO kablam! warning: unused variable: `rows`
Nov 09 12:10:31.635 INFO kablam!   --> src/main.rs:21:16
Nov 09 12:10:31.635 INFO kablam!    |
Nov 09 12:10:31.635 INFO kablam! 21 |     let (size, rows, cols) = (60_000, 28, 28);
Nov 09 12:10:31.635 INFO kablam!    |                ^^^^ help: consider using `_rows` instead
Nov 09 12:10:31.635 INFO kablam!    |
Nov 09 12:10:31.635 INFO kablam!    = note: #[warn(unused_variables)] on by default
Nov 09 12:10:31.635 INFO kablam!
Nov 09 12:10:31.635 INFO kablam! warning: unused variable: `cols`
Nov 09 12:10:31.635 INFO kablam!   --> src/main.rs:21:22
Nov 09 12:10:31.635 INFO kablam!    |
Nov 09 12:10:31.635 INFO kablam! 21 |     let (size, rows, cols) = (60_000, 28, 28);
Nov 09 12:10:31.635 INFO kablam!    |                      ^^^^ help: consider using `_cols` instead
Nov 09 12:10:31.635 INFO kablam!
Nov 09 12:10:31.635 INFO kablam! warning: unused variable: `batch_size`
Nov 09 12:10:31.635 INFO kablam!   --> src/main.rs:40:9
Nov 09 12:10:31.635 INFO kablam!    |
Nov 09 12:10:31.635 INFO kablam! 40 |     let batch_size = 100;
Nov 09 12:10:31.635 INFO kablam!    |         ^^^^^^^^^^ help: consider using `_batch_size` instead
Nov 09 12:10:31.635 INFO kablam!
Nov 09 12:10:31.651 INFO kablam! warning: variable does not need to be mutable
Nov 09 12:10:31.651 INFO kablam!   --> src/main.rs:33:9
Nov 09 12:10:31.651 INFO kablam!    |
Nov 09 12:10:31.651 INFO kablam! 33 |     let mut Two_layer_network = two_layer_net::Two_layer_network {
Nov 09 12:10:31.651 INFO kablam!    |         ----^^^^^^^^^^^^^^^^^
Nov 09 12:10:31.651 INFO kablam!    |         |
Nov 09 12:10:31.651 INFO kablam!    |         help: remove this `mut`
Nov 09 12:10:31.651 INFO kablam!    |
Nov 09 12:10:31.651 INFO kablam!    = note: #[warn(unused_mut)] on by default
Nov 09 12:10:31.651 INFO kablam!
Nov 09 12:10:31.719 INFO kablam! warning: function is never used: `mean_squared_error`
Nov 09 12:10:31.719 INFO kablam!   --> src/lossfunc.rs:11:1
Nov 09 12:10:31.719 INFO kablam!    |
Nov 09 12:10:31.719 INFO kablam! 11 | pub fn mean_squared_error(y: DMatrix<f64>, t: DMatrix<f64>) -> f64 {
Nov 09 12:10:31.719 INFO kablam!    | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Nov 09 12:10:31.719 INFO kablam!    |
Nov 09 12:10:31.719 INFO kablam!    = note: #[warn(dead_code)] on by default
Nov 09 12:10:31.719 INFO kablam!
Nov 09 12:10:31.719 INFO kablam! warning: function is never used: `numerical_gradient`
Nov 09 12:10:31.719 INFO kablam!   --> src/gradient.rs:8:1
Nov 09 12:10:31.719 INFO kablam!    |
Nov 09 12:10:31.719 INFO kablam! 8  | / pub fn numerical_gradient<
Nov 09 12:10:31.719 INFO kablam! 9  | |     F: Fn(&DMatrix<f64>,
Nov 09 12:10:31.719 INFO kablam! 10 | |           &DMatrix<f64>,
Nov 09 12:10:31.719 INFO kablam! 11 | |           &DMatrix<f64>,
Nov 09 12:10:31.719 INFO kablam! ...  |
Nov 09 12:10:31.719 INFO kablam! 38 | |     grad
Nov 09 12:10:31.719 INFO kablam! 39 | | }
Nov 09 12:10:31.719 INFO kablam!    | |_^
Nov 09 12:10:31.719 INFO kablam!
Nov 09 12:10:31.719 INFO kablam! warning: function is never used: `function_2`
Nov 09 12:10:31.719 INFO kablam!   --> src/gradient.rs:40:1
Nov 09 12:10:31.719 INFO kablam!    |
Nov 09 12:10:31.719 INFO kablam! 40 | pub fn function_2(x: &mut DMatrix<f64>) -> f64 {
Nov 09 12:10:31.719 INFO kablam!    | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Nov 09 12:10:31.719 INFO kablam!
Nov 09 12:10:31.719 INFO kablam! warning: method is never used: `predict`
Nov 09 12:10:31.719 INFO kablam!   --> src/nural.rs:11:5
Nov 09 12:10:31.719 INFO kablam!    |
Nov 09 12:10:31.719 INFO kablam! 11 |     pub fn predict(self, x: &DMatrix<f64>) -> DMatrix<f64> {
Nov 09 12:10:31.719 INFO kablam!    |     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Nov 09 12:10:31.719 INFO kablam!
Nov 09 12:10:31.719 INFO kablam! warning: method is never used: `loss`
Nov 09 12:10:31.719 INFO kablam!   --> src/nural.rs:14:5
Nov 09 12:10:31.719 INFO kablam!    |
Nov 09 12:10:31.719 INFO kablam! 14 |     pub fn loss(self, x: &DMatrix<f64>, t: &DMatrix<f64>) -> f64 {
Nov 09 12:10:31.719 INFO kablam!    |     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Nov 09 12:10:31.719 INFO kablam!
Nov 09 12:10:31.719 INFO kablam! warning: method is never used: `numerical_gradient`
Nov 09 12:10:31.719 INFO kablam!   --> src/two_layer_net.rs:91:5
Nov 09 12:10:31.719 INFO kablam!    |
Nov 09 12:10:31.719 INFO kablam! 91 |     pub fn numerical_gradient(&mut self, x: &DMatrix<f64>, t: &DMatrix<f64>) -> grad {
Nov 09 12:10:31.719 INFO kablam!    |     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Nov 09 12:10:31.719 INFO kablam!
Nov 09 12:10:31.719 INFO kablam! warning: function is never used: `loss_w`
Nov 09 12:10:31.719 INFO kablam!    --> src/two_layer_net.rs:128:1
Nov 09 12:10:31.719 INFO kablam!     |
Nov 09 12:10:31.719 INFO kablam! 128 | / pub fn loss_w(
Nov 09 12:10:31.720 INFO kablam! 129 | |     param: &DMatrix<f64>,
Nov 09 12:10:31.720 INFO kablam! 130 | |     x: &DMatrix<f64>,
Nov 09 12:10:31.720 INFO kablam! 131 | |     t: &DMatrix<f64>,
Nov 09 12:10:31.720 INFO kablam! ...   |
Nov 09 12:10:31.720 INFO kablam! 135 | |     two.loss(param, x, t, &patern)
Nov 09 12:10:31.720 INFO kablam! 136 | | }
Nov 09 12:10:31.720 INFO kablam!     | |_^
Nov 09 12:10:31.720 INFO kablam!
Nov 09 12:10:31.723 INFO kablam! warning: function `axisZerosum` should have a snake case name such as `axis_zerosum`
Nov 09 12:10:31.723 INFO kablam!   --> src/nural.rs:32:1
Nov 09 12:10:31.723 INFO kablam!    |
Nov 09 12:10:31.723 INFO kablam! 32 | / pub fn axisZerosum(x: &DMatrix<f64>) -> DMatrix<f64> {
Nov 09 12:10:31.723 INFO kablam! 33 | |     let zerosum = DMatrix::<f64>::from_iterator(
Nov 09 12:10:31.723 INFO kablam! 34 | |         1,
Nov 09 12:10:31.723 INFO kablam! 35 | |         x.shape().1,
Nov 09 12:10:31.723 INFO kablam! ...  |
Nov 09 12:10:31.723 INFO kablam! 44 | |
Nov 09 12:10:31.723 INFO kablam! 45 | | }
Nov 09 12:10:31.723 INFO kablam!    | |_^
Nov 09 12:10:31.723 INFO kablam!    |
Nov 09 12:10:31.723 INFO kablam!    = note: #[warn(non_snake_case)] on by default
Nov 09 12:10:31.723 INFO kablam!
Nov 09 12:10:31.723 INFO kablam! warning: function `createVec` should have a snake case name such as `create_vec`
Nov 09 12:10:31.723 INFO kablam!   --> src/nural.rs:47:1
Nov 09 12:10:31.723 INFO kablam!    |
Nov 09 12:10:31.723 INFO kablam! 47 | / pub fn createVec(x: usize) -> Vec<usize> {
Nov 09 12:10:31.723 INFO kablam! 48 | |     let mut vec = Vec::with_capacity(x);
Nov 09 12:10:31.723 INFO kablam! 49 | |     for i in 0..x {
Nov 09 12:10:31.723 INFO kablam! 50 | |         vec.push(i);
Nov 09 12:10:31.723 INFO kablam! 51 | |     }
Nov 09 12:10:31.723 INFO kablam! 52 | |     vec
Nov 09 12:10:31.723 INFO kablam! 53 | | }
Nov 09 12:10:31.723 INFO kablam!    | |_^
Nov 09 12:10:31.723 INFO kablam!
Nov 09 12:10:31.723 INFO kablam! warning: type `grad` should have a camel case name such as `Grad`
Nov 09 12:10:31.723 INFO kablam!   --> src/two_layer_net.rs:13:1
Nov 09 12:10:31.723 INFO kablam!    |
Nov 09 12:10:31.723 INFO kablam! 13 | / pub struct grad {
Nov 09 12:10:31.723 INFO kablam! 14 | |     pub w1: DMatrix<f64>,
Nov 09 12:10:31.723 INFO kablam! 15 | |     pub b1: DMatrix<f64>,
Nov 09 12:10:31.723 INFO kablam! 16 | |     pub w2: DMatrix<f64>,
Nov 09 12:10:31.723 INFO kablam! 17 | |     pub b2: DMatrix<f64>,
Nov 09 12:10:31.723 INFO kablam! 18 | | }
Nov 09 12:10:31.723 INFO kablam!    | |_^
Nov 09 12:10:31.723 INFO kablam!    |
Nov 09 12:10:31.723 INFO kablam!    = note: #[warn(non_camel_case_types)] on by default
Nov 09 12:10:31.723 INFO kablam!
Nov 09 12:10:31.723 INFO kablam! warning: type `Two_layer_network` should have a camel case name such as `TwoLayerNetwork`
Nov 09 12:10:31.723 INFO kablam!   --> src/two_layer_net.rs:19:1
Nov 09 12:10:31.723 INFO kablam!    |
Nov 09 12:10:31.723 INFO kablam! 19 | / pub struct Two_layer_network {
Nov 09 12:10:31.723 INFO kablam! 20 | |     pub w1: Rc<RefCell<DMatrix<f64>>>,
Nov 09 12:10:31.723 INFO kablam! 21 | |     pub b1: Rc<RefCell<DMatrix<f64>>>,
Nov 09 12:10:31.723 INFO kablam! 22 | |     pub w2: Rc<RefCell<DMatrix<f64>>>,
Nov 09 12:10:31.723 INFO kablam! 23 | |     pub b2: Rc<RefCell<DMatrix<f64>>>,
Nov 09 12:10:31.723 INFO kablam! 24 | | }
Nov 09 12:10:31.723 INFO kablam!    | |_^
Nov 09 12:10:31.723 INFO kablam!
Nov 09 12:10:31.723 INFO kablam! warning: variable `Two_layer_network` should have a snake case name such as `two_layer_network`
Nov 09 12:10:31.723 INFO kablam!   --> src/main.rs:33:9
Nov 09 12:10:31.723 INFO kablam!    |
Nov 09 12:10:31.723 INFO kablam! 33 |     let mut Two_layer_network = two_layer_net::Two_layer_network {
Nov 09 12:10:31.723 INFO kablam!    |             ^^^^^^^^^^^^^^^^^
Nov 09 12:10:31.723 INFO kablam!
Nov 09 12:10:31.763 INFO kablam! warning: unused imports: `RefMut`, `Ref`
Nov 09 12:10:31.763 INFO kablam!  --> src/gradient.rs:5:26
Nov 09 12:10:31.763 INFO kablam!   |
Nov 09 12:10:31.763 INFO kablam! 5 | use std::cell::{RefCell, Ref, RefMut};
Nov 09 12:10:31.763 INFO kablam!   |                          ^^^  ^^^^^^
Nov 09 12:10:31.763 INFO kablam!   |
Nov 09 12:10:31.763 INFO kablam!   = note: #[warn(unused_imports)] on by default
Nov 09 12:10:31.763 INFO kablam!
Nov 09 12:10:31.763 INFO kablam! warning: unused import: `std::cell::RefCell`
Nov 09 12:10:31.763 INFO kablam!  --> src/nural.rs:5:5
Nov 09 12:10:31.763 INFO kablam!   |
Nov 09 12:10:31.763 INFO kablam! 5 | use std::cell::RefCell;
Nov 09 12:10:31.763 INFO kablam!   |     ^^^^^^^^^^^^^^^^^^
Nov 09 12:10:31.763 INFO kablam!
Nov 09 12:10:32.019 INFO kablam! warning: unused variable: `rows`
Nov 09 12:10:32.019 INFO kablam!   --> src/main.rs:21:16
Nov 09 12:10:32.019 INFO kablam!    |
Nov 09 12:10:32.019 INFO kablam! 21 |     let (size, rows, cols) = (60_000, 28, 28);
Nov 09 12:10:32.019 INFO kablam!    |                ^^^^ help: consider using `_rows` instead
Nov 09 12:10:32.019 INFO kablam!    |
Nov 09 12:10:32.019 INFO kablam!    = note: #[warn(unused_variables)] on by default
Nov 09 12:10:32.019 INFO kablam!
Nov 09 12:10:32.019 INFO kablam! warning: unused variable: `cols`
Nov 09 12:10:32.019 INFO kablam!   --> src/main.rs:21:22
Nov 09 12:10:32.019 INFO kablam!    |
Nov 09 12:10:32.019 INFO kablam! 21 |     let (size, rows, cols) = (60_000, 28, 28);
Nov 09 12:10:32.019 INFO kablam!    |                      ^^^^ help: consider using `_cols` instead
Nov 09 12:10:32.019 INFO kablam!
Nov 09 12:10:32.019 INFO kablam! warning: unused variable: `batch_size`
Nov 09 12:10:32.019 INFO kablam!   --> src/main.rs:40:9
Nov 09 12:10:32.019 INFO kablam!    |
Nov 09 12:10:32.019 INFO kablam! 40 |     let batch_size = 100;
Nov 09 12:10:32.019 INFO kablam!    |         ^^^^^^^^^^ help: consider using `_batch_size` instead
Nov 09 12:10:32.019 INFO kablam!
Nov 09 12:10:32.039 INFO kablam! warning: variable does not need to be mutable
Nov 09 12:10:32.039 INFO kablam!   --> src/main.rs:33:9
Nov 09 12:10:32.039 INFO kablam!    |
Nov 09 12:10:32.039 INFO kablam! 33 |     let mut Two_layer_network = two_layer_net::Two_layer_network {
Nov 09 12:10:32.039 INFO kablam!    |         ----^^^^^^^^^^^^^^^^^
Nov 09 12:10:32.039 INFO kablam!    |         |
Nov 09 12:10:32.039 INFO kablam!    |         help: remove this `mut`
Nov 09 12:10:32.039 INFO kablam!    |
Nov 09 12:10:32.039 INFO kablam!    = note: #[warn(unused_mut)] on by default
Nov 09 12:10:32.039 INFO kablam!
Nov 09 12:10:32.083 INFO kablam! warning: function is never used: `numerical_gradient`
Nov 09 12:10:32.083 INFO kablam!   --> src/gradient.rs:8:1
Nov 09 12:10:32.083 INFO kablam!    |
Nov 09 12:10:32.083 INFO kablam! 8  | / pub fn numerical_gradient<
Nov 09 12:10:32.083 INFO kablam! 9  | |     F: Fn(&DMatrix<f64>,
Nov 09 12:10:32.083 INFO kablam! 10 | |           &DMatrix<f64>,
Nov 09 12:10:32.083 INFO kablam! 11 | |           &DMatrix<f64>,
Nov 09 12:10:32.083 INFO kablam! ...  |
Nov 09 12:10:32.083 INFO kablam! 38 | |     grad
Nov 09 12:10:32.083 INFO kablam! 39 | | }
Nov 09 12:10:32.083 INFO kablam!    | |_^
Nov 09 12:10:32.083 INFO kablam!    |
Nov 09 12:10:32.083 INFO kablam!    = note: #[warn(dead_code)] on by default
Nov 09 12:10:32.083 INFO kablam!
Nov 09 12:10:32.083 INFO kablam! warning: method is never used: `numerical_gradient`
Nov 09 12:10:32.083 INFO kablam!   --> src/two_layer_net.rs:91:5
Nov 09 12:10:32.083 INFO kablam!    |
Nov 09 12:10:32.083 INFO kablam! 91 |     pub fn numerical_gradient(&mut self, x: &DMatrix<f64>, t: &DMatrix<f64>) -> grad {
Nov 09 12:10:32.083 INFO kablam!    |     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Nov 09 12:10:32.083 INFO kablam!
Nov 09 12:10:32.083 INFO kablam! warning: function is never used: `loss_w`
Nov 09 12:10:32.083 INFO kablam!    --> src/two_layer_net.rs:128:1
Nov 09 12:10:32.083 INFO kablam!     |
Nov 09 12:10:32.083 INFO kablam! 128 | / pub fn loss_w(
Nov 09 12:10:32.083 INFO kablam! 129 | |     param: &DMatrix<f64>,
Nov 09 12:10:32.083 INFO kablam! 130 | |     x: &DMatrix<f64>,
Nov 09 12:10:32.083 INFO kablam! 131 | |     t: &DMatrix<f64>,
Nov 09 12:10:32.083 INFO kablam! ...   |
Nov 09 12:10:32.083 INFO kablam! 135 | |     two.loss(param, x, t, &patern)
Nov 09 12:10:32.083 INFO kablam! 136 | | }
Nov 09 12:10:32.083 INFO kablam!     | |_^
Nov 09 12:10:32.083 INFO kablam!
Nov 09 12:10:32.091 INFO kablam! warning: function `axisZerosum` should have a snake case name such as `axis_zerosum`
Nov 09 12:10:32.091 INFO kablam!   --> src/nural.rs:32:1
Nov 09 12:10:32.091 INFO kablam!    |
Nov 09 12:10:32.091 INFO kablam! 32 | / pub fn axisZerosum(x: &DMatrix<f64>) -> DMatrix<f64> {
Nov 09 12:10:32.091 INFO kablam! 33 | |     let zerosum = DMatrix::<f64>::from_iterator(
Nov 09 12:10:32.091 INFO kablam! 34 | |         1,
Nov 09 12:10:32.091 INFO kablam! 35 | |         x.shape().1,
Nov 09 12:10:32.091 INFO kablam! ...  |
Nov 09 12:10:32.091 INFO kablam! 44 | |
Nov 09 12:10:32.091 INFO kablam! 45 | | }
Nov 09 12:10:32.091 INFO kablam!    | |_^
Nov 09 12:10:32.091 INFO kablam!    |
Nov 09 12:10:32.091 INFO kablam!    = note: #[warn(non_snake_case)] on by default
Nov 09 12:10:32.091 INFO kablam!
Nov 09 12:10:32.091 INFO kablam! warning: function `createVec` should have a snake case name such as `create_vec`
Nov 09 12:10:32.091 INFO kablam!   --> src/nural.rs:47:1
Nov 09 12:10:32.091 INFO kablam!    |
Nov 09 12:10:32.091 INFO kablam! 47 | / pub fn createVec(x: usize) -> Vec<usize> {
Nov 09 12:10:32.091 INFO kablam! 48 | |     let mut vec = Vec::with_capacity(x);
Nov 09 12:10:32.091 INFO kablam! 49 | |     for i in 0..x {
Nov 09 12:10:32.091 INFO kablam! 50 | |         vec.push(i);
Nov 09 12:10:32.091 INFO kablam! 51 | |     }
Nov 09 12:10:32.091 INFO kablam! 52 | |     vec
Nov 09 12:10:32.091 INFO kablam! 53 | | }
Nov 09 12:10:32.091 INFO kablam!    | |_^
Nov 09 12:10:32.091 INFO kablam!
Nov 09 12:10:32.091 INFO kablam! warning: type `grad` should have a camel case name such as `Grad`
Nov 09 12:10:32.091 INFO kablam!   --> src/two_layer_net.rs:13:1
Nov 09 12:10:32.091 INFO kablam!    |
Nov 09 12:10:32.091 INFO kablam! 13 | / pub struct grad {
Nov 09 12:10:32.091 INFO kablam! 14 | |     pub w1: DMatrix<f64>,
Nov 09 12:10:32.091 INFO kablam! 15 | |     pub b1: DMatrix<f64>,
Nov 09 12:10:32.091 INFO kablam! 16 | |     pub w2: DMatrix<f64>,
Nov 09 12:10:32.091 INFO kablam! 17 | |     pub b2: DMatrix<f64>,
Nov 09 12:10:32.091 INFO kablam! 18 | | }
Nov 09 12:10:32.091 INFO kablam!    | |_^
Nov 09 12:10:32.091 INFO kablam!    |
Nov 09 12:10:32.091 INFO kablam!    = note: #[warn(non_camel_case_types)] on by default
Nov 09 12:10:32.091 INFO kablam!
Nov 09 12:10:32.091 INFO kablam! warning: type `Two_layer_network` should have a camel case name such as `TwoLayerNetwork`
Nov 09 12:10:32.091 INFO kablam!   --> src/two_layer_net.rs:19:1
Nov 09 12:10:32.091 INFO kablam!    |
Nov 09 12:10:32.091 INFO kablam! 19 | / pub struct Two_layer_network {
Nov 09 12:10:32.091 INFO kablam! 20 | |     pub w1: Rc<RefCell<DMatrix<f64>>>,
Nov 09 12:10:32.091 INFO kablam! 21 | |     pub b1: Rc<RefCell<DMatrix<f64>>>,
Nov 09 12:10:32.091 INFO kablam! 22 | |     pub w2: Rc<RefCell<DMatrix<f64>>>,
Nov 09 12:10:32.091 INFO kablam! 23 | |     pub b2: Rc<RefCell<DMatrix<f64>>>,
Nov 09 12:10:32.091 INFO kablam! 24 | | }
Nov 09 12:10:32.091 INFO kablam!    | |_^
Nov 09 12:10:32.091 INFO kablam!
Nov 09 12:10:32.091 INFO kablam! warning: variable `Two_layer_network` should have a snake case name such as `two_layer_network`
Nov 09 12:10:32.091 INFO kablam!   --> src/main.rs:33:9
Nov 09 12:10:32.091 INFO kablam!    |
Nov 09 12:10:32.091 INFO kablam! 33 |     let mut Two_layer_network = two_layer_net::Two_layer_network {
Nov 09 12:10:32.091 INFO kablam!    |             ^^^^^^^^^^^^^^^^^
Nov 09 12:10:32.091 INFO kablam!
Nov 09 12:10:32.135 INFO kablam! Finished dev [unoptimized + debuginfo] target(s) in 57.13s
Nov 09 12:10:32.143 INFO kablam! su: No module specific data is present
Nov 09 12:10:32.918 INFO running `"docker" "rm" "-f" "665f3f794316fb6657caff1255a760e27d0bc969eb63cf3186b250bb6823a3f8"`
Nov 09 12:10:33.223 INFO blam! 665f3f794316fb6657caff1255a760e27d0bc969eb63cf3186b250bb6823a3f8
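
Every diagnostic above is a warning rather than an error, so the crate builds cleanly against the PR toolchain. For reference, the sketch below shows the kind of cleanup those lints suggest. It is a minimal, hypothetical example, not the crate's actual code: it keeps only the identifiers that appear in the warnings and substitutes plain `Vec<f64>` for the crate's nalgebra `DMatrix` fields so that it compiles on its own.

```rust
use std::cell::RefCell;
use std::rc::Rc;

// `grad` -> `Grad`: addresses the non_camel_case_types warning.
pub struct Grad {
    pub w1: Vec<f64>,
    pub b1: Vec<f64>,
    pub w2: Vec<f64>,
    pub b2: Vec<f64>,
}

// `Two_layer_network` -> `TwoLayerNetwork`: addresses the other non_camel_case_types warning.
pub struct TwoLayerNetwork {
    pub w1: Rc<RefCell<Vec<f64>>>,
    pub b1: Rc<RefCell<Vec<f64>>>,
    pub w2: Rc<RefCell<Vec<f64>>>,
    pub b2: Rc<RefCell<Vec<f64>>>,
}

// `createVec` -> `create_vec`: addresses the non_snake_case warning.
pub fn create_vec(x: usize) -> Vec<usize> {
    (0..x).collect()
}

fn main() {
    // Underscore-prefix the bindings the program never reads and drop the
    // needless `mut`, as the unused_variables / unused_mut notes suggest.
    let (size, _rows, _cols) = (60_000, 28, 28);
    let _batch_size = 100;
    let two_layer_network = TwoLayerNetwork {
        w1: Rc::new(RefCell::new(vec![0.0; size])),
        b1: Rc::new(RefCell::new(vec![0.0; 10])),
        w2: Rc::new(RefCell::new(vec![0.0; 10])),
        b2: Rc::new(RefCell::new(vec![0.0; 10])),
    };
    let grad = Grad {
        w1: vec![],
        b1: vec![],
        w2: vec![],
        b2: vec![],
    };
    // Touch the values so this standalone example has something to do.
    println!(
        "{} weights, {} gradient entries, {} indices",
        two_layer_network.w1.borrow().len(),
        grad.w1.len(),
        create_vec(5).len()
    );
}
```

Together with removing the unused `Ref`/`RefMut`/`RefCell` imports and either calling or `#[allow(dead_code)]`-ing the unreferenced functions, renames along these lines would likely clear the warning list without changing the crate's behavior.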