Dec 19 16:08:44.711 INFO checking helloooooo/learn_deep_learning against master#d99a320cba42f661aebfa1293b7b2ec3603dda75 for pr-56955
Dec 19 16:08:44.711 INFO running `"docker" "create" "-v" "/mnt/big/crater/work/local/target-dirs/pr-56955/worker-6/master#d99a320cba42f661aebfa1293b7b2ec3603dda75:/opt/crater/target:rw,Z" "-v" "/mnt/big/crater/work/ex/pr-56955/sources/master#d99a320cba42f661aebfa1293b7b2ec3603dda75/gh/helloooooo/learn_deep_learning:/opt/crater/workdir:ro,Z" "-v" "/mnt/big/crater/work/local/cargo-home:/opt/crater/cargo-home:ro,Z" "-v" "/mnt/big/crater/work/local/rustup-home:/opt/crater/rustup-home:ro,Z" "-e" "USER_ID=1000" "-e" "SOURCE_DIR=/opt/crater/workdir" "-e" "MAP_USER_ID=1000" "-e" "CARGO_TARGET_DIR=/opt/crater/target" "-e" "CARGO_INCREMENTAL=0" "-e" "RUST_BACKTRACE=full" "-e" "RUSTFLAGS=--cap-lints=forbid" "-e" "CARGO_HOME=/opt/crater/cargo-home" "-e" "RUSTUP_HOME=/opt/crater/rustup-home" "-w" "/opt/crater/workdir" "-m" "1536M" "--network" "none" "rustops/crates-build-env" "/opt/crater/cargo-home/bin/cargo" "+d99a320cba42f661aebfa1293b7b2ec3603dda75-alt" "check" "--frozen" "--all" "--all-targets"`
Dec 19 16:08:45.167 INFO [stdout] 4311fe7e83a8be06892638d3e78474b5516eb60fe9f2d7fd9db529604ac2a339
Dec 19 16:08:45.176 INFO running `"docker" "start" "-a" "4311fe7e83a8be06892638d3e78474b5516eb60fe9f2d7fd9db529604ac2a339"`
Dec 19 16:08:46.900 INFO [stderr] Checking gnuplot v0.0.23
Dec 19 16:08:46.900 INFO [stderr] Checking memchr v0.1.11
Dec 19 16:08:46.904 INFO [stderr] Checking mnist v0.4.0
Dec 19 16:08:46.904 INFO [stderr] Checking base64 v0.8.0
Dec 19 16:08:46.912 INFO [stderr] Checking alga v0.5.2
Dec 19 16:08:46.912 INFO [stderr] Checking generic-array v0.8.3
Dec 19 16:08:46.956 INFO [stderr] Checking thread-id v2.0.0
Dec 19 16:08:46.956 INFO [stderr] Checking tokio-proto v0.1.1
Dec 19 16:08:47.796 INFO [stderr] Checking thread_local v0.2.7
Dec 19 16:08:48.588 INFO [stderr] Checking aho-corasick v0.5.3
Dec 19 16:08:50.220 INFO [stderr] Checking regex v0.1.80
Dec 19 16:08:50.519 INFO [stderr] Checking digest v0.6.2
Dec 19 16:08:50.519 INFO [stderr] Checking crypto-mac v0.4.0
Dec 19 16:08:51.280 INFO [stderr] Checking sha-1 v0.4.1
Dec 19 16:08:51.296 INFO [stderr] Checking hmac v0.4.2
Dec 19 16:08:55.112 INFO [stderr] Checking hyper v0.11.9
Dec 19 16:08:58.768 INFO [stderr] Checking nalgebra v0.13.1
Dec 19 16:09:02.929 INFO [stderr] Checking hyper-tls v0.1.2
Dec 19 16:09:04.068 INFO [stderr] Checking egg-mode v0.12.0
Dec 19 16:09:30.320 INFO [stderr] Checking test1 v0.1.0 (/opt/crater/workdir)
Dec 19 16:09:30.872 INFO [stderr] warning: unused imports: `RefMut`, `Ref`
Dec 19 16:09:30.872 INFO [stderr] --> src/gradient.rs:5:26
Dec 19 16:09:30.872 INFO [stderr] |
Dec 19 16:09:30.872 INFO [stderr] 5 | use std::cell::{RefCell, Ref, RefMut};
Dec 19 16:09:30.872 INFO [stderr] | ^^^ ^^^^^^
Dec 19 16:09:30.872 INFO [stderr] |
Dec 19 16:09:30.872 INFO [stderr] = note: #[warn(unused_imports)] on by default
Dec 19 16:09:30.872 INFO [stderr]
Dec 19 16:09:30.872 INFO [stderr] warning: unused import: `std::cell::RefCell`
Dec 19 16:09:30.872 INFO [stderr] --> src/nural.rs:5:5
Dec 19 16:09:30.872 INFO [stderr] |
Dec 19 16:09:30.872 INFO [stderr] 5 | use std::cell::RefCell;
Dec 19 16:09:30.872 INFO [stderr] | ^^^^^^^^^^^^^^^^^^
Dec 19 16:09:30.872 INFO [stderr]
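Note: the two unused-import warnings above only ask for the import lists to be trimmed. A minimal standalone sketch of that kind of fix (hypothetical code, not the crate's actual src/gradient.rs or src/nural.rs):

    // Keep only what the module actually names; `Ref` and `RefMut` were reported unused.
    use std::cell::RefCell;

    fn main() {
        // RefCell is now genuinely used, so unused_imports stays quiet.
        let cell = RefCell::new(0_u32);
        *cell.borrow_mut() += 1;
        println!("{}", cell.borrow());
    }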
Dec 19 16:09:30.912 INFO [stderr] warning: unused imports: `RefMut`, `Ref`
Dec 19 16:09:30.912 INFO [stderr] --> src/gradient.rs:5:26
Dec 19 16:09:30.912 INFO [stderr] |
Dec 19 16:09:30.912 INFO [stderr] 5 | use std::cell::{RefCell, Ref, RefMut};
Dec 19 16:09:30.912 INFO [stderr] | ^^^ ^^^^^^
Dec 19 16:09:30.912 INFO [stderr] |
Dec 19 16:09:30.912 INFO [stderr] = note: #[warn(unused_imports)] on by default
Dec 19 16:09:30.912 INFO [stderr]
Dec 19 16:09:30.912 INFO [stderr] warning: unused import: `std::cell::RefCell`
Dec 19 16:09:30.912 INFO [stderr] --> src/nural.rs:5:5
Dec 19 16:09:30.912 INFO [stderr] |
Dec 19 16:09:30.912 INFO [stderr] 5 | use std::cell::RefCell;
Dec 19 16:09:30.912 INFO [stderr] | ^^^^^^^^^^^^^^^^^^
Dec 19 16:09:30.912 INFO [stderr]
Dec 19 16:09:31.115 INFO [stderr] warning: unused variable: `rows`
Dec 19 16:09:31.115 INFO [stderr] --> src/main.rs:21:16
Dec 19 16:09:31.115 INFO [stderr] |
Dec 19 16:09:31.115 INFO [stderr] 21 | let (size, rows, cols) = (60_000, 28, 28);
Dec 19 16:09:31.115 INFO [stderr] | ^^^^ help: consider using `_rows` instead
Dec 19 16:09:31.115 INFO [stderr] |
Dec 19 16:09:31.120 INFO [stderr] = note: #[warn(unused_variables)] on by default
Dec 19 16:09:31.120 INFO [stderr]
Dec 19 16:09:31.120 INFO [stderr] warning: unused variable: `cols`
Dec 19 16:09:31.120 INFO [stderr] --> src/main.rs:21:22
Dec 19 16:09:31.120 INFO [stderr] |
Dec 19 16:09:31.120 INFO [stderr] 21 | let (size, rows, cols) = (60_000, 28, 28);
Dec 19 16:09:31.120 INFO [stderr] | ^^^^ help: consider using `_cols` instead
Dec 19 16:09:31.120 INFO [stderr]
Dec 19 16:09:31.120 INFO [stderr] warning: unused variable: `batch_size`
Dec 19 16:09:31.120 INFO [stderr] --> src/main.rs:40:9
Dec 19 16:09:31.120 INFO [stderr] |
Dec 19 16:09:31.120 INFO [stderr] 40 | let batch_size = 100;
Dec 19 16:09:31.120 INFO [stderr] | ^^^^^^^^^^ help: consider using `_batch_size` instead
Dec 19 16:09:31.120 INFO [stderr]
Dec 19 16:09:31.136 INFO [stderr] warning: variable does not need to be mutable
Dec 19 16:09:31.136 INFO [stderr] --> src/main.rs:33:9
Dec 19 16:09:31.136 INFO [stderr] |
Dec 19 16:09:31.136 INFO [stderr] 33 | let mut Two_layer_network = two_layer_net::Two_layer_network {
Dec 19 16:09:31.136 INFO [stderr] | ----^^^^^^^^^^^^^^^^^
Dec 19 16:09:31.136 INFO [stderr] | |
Dec 19 16:09:31.136 INFO [stderr] | help: remove this `mut`
Dec 19 16:09:31.136 INFO [stderr] |
Dec 19 16:09:31.136 INFO [stderr] = note: #[warn(unused_mut)] on by default
Dec 19 16:09:31.136 INFO [stderr]
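Note: the unused_variables and unused_mut diagnostics above carry their own fix in the help text (underscore-prefix the bindings, drop the needless `mut`). A standalone sketch of what that looks like (illustrative code, not the crate's main.rs):

    fn main() {
        // unused_variables: prefix bindings that are not read yet with `_`
        let (size, _rows, _cols) = (60_000, 28, 28);
        let _batch_size = 100;

        // unused_mut: this binding is never mutated afterwards, so no `mut`
        let network = vec![0.0_f64; size];
        println!("{} values allocated", network.len());
    }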
Dec 19 16:09:31.184 INFO [stderr] warning: function is never used: `mean_squared_error`
Dec 19 16:09:31.184 INFO [stderr] --> src/lossfunc.rs:11:1
Dec 19 16:09:31.184 INFO [stderr] |
Dec 19 16:09:31.184 INFO [stderr] 11 | pub fn mean_squared_error(y: DMatrix, t: DMatrix) -> f64 {
Dec 19 16:09:31.184 INFO [stderr] | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 19 16:09:31.184 INFO [stderr] |
Dec 19 16:09:31.184 INFO [stderr] = note: #[warn(dead_code)] on by default
Dec 19 16:09:31.184 INFO [stderr]
Dec 19 16:09:31.184 INFO [stderr] warning: function is never used: `numerical_gradient`
Dec 19 16:09:31.184 INFO [stderr] --> src/gradient.rs:8:1
Dec 19 16:09:31.184 INFO [stderr] |
Dec 19 16:09:31.184 INFO [stderr] 8 | / pub fn numerical_gradient<
Dec 19 16:09:31.188 INFO [stderr] 9 | | F: Fn(&DMatrix,
Dec 19 16:09:31.188 INFO [stderr] 10 | | &DMatrix,
Dec 19 16:09:31.188 INFO [stderr] 11 | | &DMatrix,
Dec 19 16:09:31.188 INFO [stderr] ... |
Dec 19 16:09:31.188 INFO [stderr] 38 | | grad
Dec 19 16:09:31.188 INFO [stderr] 39 | | }
Dec 19 16:09:31.188 INFO [stderr] | |_^
Dec 19 16:09:31.188 INFO [stderr]
Dec 19 16:09:31.188 INFO [stderr] warning: function is never used: `function_2`
Dec 19 16:09:31.188 INFO [stderr] --> src/gradient.rs:40:1
Dec 19 16:09:31.188 INFO [stderr] |
Dec 19 16:09:31.188 INFO [stderr] 40 | pub fn function_2(x: &mut DMatrix) -> f64 {
Dec 19 16:09:31.188 INFO [stderr] | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 19 16:09:31.188 INFO [stderr]
Dec 19 16:09:31.188 INFO [stderr] warning: method is never used: `predict`
Dec 19 16:09:31.188 INFO [stderr] --> src/nural.rs:11:5
Dec 19 16:09:31.188 INFO [stderr] |
Dec 19 16:09:31.188 INFO [stderr] 11 | pub fn predict(self, x: &DMatrix) -> DMatrix {
Dec 19 16:09:31.188 INFO [stderr] | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 19 16:09:31.188 INFO [stderr]
Dec 19 16:09:31.188 INFO [stderr] warning: method is never used: `loss`
Dec 19 16:09:31.188 INFO [stderr] --> src/nural.rs:14:5
Dec 19 16:09:31.188 INFO [stderr] |
Dec 19 16:09:31.188 INFO [stderr] 14 | pub fn loss(self, x: &DMatrix, t: &DMatrix) -> f64 {
Dec 19 16:09:31.188 INFO [stderr] | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 19 16:09:31.188 INFO [stderr]
Dec 19 16:09:31.188 INFO [stderr] warning: method is never used: `numerical_gradient`
Dec 19 16:09:31.192 INFO [stderr] --> src/two_layer_net.rs:91:5
Dec 19 16:09:31.192 INFO [stderr] |
Dec 19 16:09:31.192 INFO [stderr] 91 | pub fn numerical_gradient(&mut self, x: &DMatrix, t: &DMatrix) -> grad {
Dec 19 16:09:31.192 INFO [stderr] | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 19 16:09:31.192 INFO [stderr]
Dec 19 16:09:31.192 INFO [stderr] warning: function is never used: `loss_w`
Dec 19 16:09:31.192 INFO [stderr] --> src/two_layer_net.rs:128:1
Dec 19 16:09:31.192 INFO [stderr] |
Dec 19 16:09:31.192 INFO [stderr] 128 | / pub fn loss_w(
Dec 19 16:09:31.192 INFO [stderr] 129 | | param: &DMatrix,
Dec 19 16:09:31.192 INFO [stderr] 130 | | x: &DMatrix,
Dec 19 16:09:31.192 INFO [stderr] 131 | | t: &DMatrix,
Dec 19 16:09:31.192 INFO [stderr] ... |
Dec 19 16:09:31.192 INFO [stderr] 135 | | two.loss(param, x, t, &patern)
Dec 19 16:09:31.192 INFO [stderr] 136 | | }
Dec 19 16:09:31.192 INFO [stderr] | |_^
Dec 19 16:09:31.192 INFO [stderr]
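Note: the dead_code warnings above only mean that nothing reachable from main() calls these items yet. If a helper is being kept on purpose, an allow attribute is the usual way to say so; a standalone sketch (the loss shown is the common 0.5 * sum-of-squared-errors form, assumed here rather than copied from the crate):

    // Kept intentionally even though main() does not call it yet.
    #[allow(dead_code)]
    fn mean_squared_error(y: &[f64], t: &[f64]) -> f64 {
        y.iter().zip(t).map(|(a, b)| (a - b) * (a - b)).sum::<f64>() / 2.0
    }

    fn main() {
        println!("builds without dead_code warnings");
    }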
Dec 19 16:09:31.192 INFO [stderr] warning: function `axisZerosum` should have a snake case name such as `axis_zerosum`
Dec 19 16:09:31.192 INFO [stderr] --> src/nural.rs:32:1
Dec 19 16:09:31.196 INFO [stderr] |
Dec 19 16:09:31.196 INFO [stderr] 32 | / pub fn axisZerosum(x: &DMatrix) -> DMatrix {
Dec 19 16:09:31.196 INFO [stderr] 33 | | let zerosum = DMatrix::::from_iterator(
Dec 19 16:09:31.196 INFO [stderr] 34 | | 1,
Dec 19 16:09:31.196 INFO [stderr] 35 | | x.shape().1,
Dec 19 16:09:31.196 INFO [stderr] ... |
Dec 19 16:09:31.196 INFO [stderr] 44 | |
Dec 19 16:09:31.196 INFO [stderr] 45 | | }
Dec 19 16:09:31.196 INFO [stderr] | |_^
Dec 19 16:09:31.196 INFO [stderr] |
Dec 19 16:09:31.196 INFO [stderr] = note: #[warn(non_snake_case)] on by default
Dec 19 16:09:31.196 INFO [stderr]
Dec 19 16:09:31.196 INFO [stderr] warning: function `createVec` should have a snake case name such as `create_vec`
Dec 19 16:09:31.196 INFO [stderr] --> src/nural.rs:47:1
Dec 19 16:09:31.196 INFO [stderr] |
Dec 19 16:09:31.196 INFO [stderr] 47 | / pub fn createVec(x: usize) -> Vec {
Dec 19 16:09:31.196 INFO [stderr] 48 | | let mut vec = Vec::with_capacity(x);
Dec 19 16:09:31.196 INFO [stderr] 49 | | for i in 0..x {
Dec 19 16:09:31.196 INFO [stderr] 50 | | vec.push(i);
Dec 19 16:09:31.196 INFO [stderr] 51 | | }
Dec 19 16:09:31.196 INFO [stderr] 52 | | vec
Dec 19 16:09:31.196 INFO [stderr] 53 | | }
Dec 19 16:09:31.196 INFO [stderr] | |_^
Dec 19 16:09:31.196 INFO [stderr]
Dec 19 16:09:31.196 INFO [stderr] warning: type `grad` should have a camel case name such as `Grad`
Dec 19 16:09:31.196 INFO [stderr] --> src/two_layer_net.rs:13:1
Dec 19 16:09:31.200 INFO [stderr] |
Dec 19 16:09:31.200 INFO [stderr] 13 | / pub struct grad {
Dec 19 16:09:31.200 INFO [stderr] 14 | | pub w1: DMatrix,
Dec 19 16:09:31.200 INFO [stderr] 15 | | pub b1: DMatrix,
Dec 19 16:09:31.200 INFO [stderr] 16 | | pub w2: DMatrix,
Dec 19 16:09:31.200 INFO [stderr] 17 | | pub b2: DMatrix,
Dec 19 16:09:31.200 INFO [stderr] 18 | | }
Dec 19 16:09:31.200 INFO [stderr] | |_^
Dec 19 16:09:31.200 INFO [stderr] |
Dec 19 16:09:31.200 INFO [stderr] = note: #[warn(non_camel_case_types)] on by default
Dec 19 16:09:31.200 INFO [stderr]
Dec 19 16:09:31.200 INFO [stderr] warning: type `Two_layer_network` should have a camel case name such as `TwoLayerNetwork`
Dec 19 16:09:31.200 INFO [stderr] --> src/two_layer_net.rs:19:1
Dec 19 16:09:31.200 INFO [stderr] |
Dec 19 16:09:31.200 INFO [stderr] 19 | / pub struct Two_layer_network {
Dec 19 16:09:31.200 INFO [stderr] 20 | | pub w1: Rc>>,
Dec 19 16:09:31.200 INFO [stderr] 21 | | pub b1: Rc>>,
Dec 19 16:09:31.200 INFO [stderr] 22 | | pub w2: Rc>>,
Dec 19 16:09:31.200 INFO [stderr] 23 | | pub b2: Rc>>,
Dec 19 16:09:31.200 INFO [stderr] 24 | | }
Dec 19 16:09:31.200 INFO [stderr] | |_^
Dec 19 16:09:31.200 INFO [stderr]
Dec 19 16:09:31.200 INFO [stderr] warning: variable `Two_layer_network` should have a snake case name such as `two_layer_network`
Dec 19 16:09:31.204 INFO [stderr] --> src/main.rs:33:9
Dec 19 16:09:31.204 INFO [stderr] |
Dec 19 16:09:31.204 INFO [stderr] 33 | let mut Two_layer_network = two_layer_net::Two_layer_network {
Dec 19 16:09:31.204 INFO [stderr] | ^^^^^^^^^^^^^^^^^^^^^
Dec 19 16:09:31.204 INFO [stderr]
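Note: the naming lints above are purely stylistic. A standalone sketch of the renames the compiler proposes, with the createVec loop also shown in its idiomatic range-collect form (field types are placeholders, since the generic parameters were not captured in this log):

    // was `grad` and `Two_layer_network`; fields reduced to placeholders here
    pub struct Grad {
        pub w1: Vec<f64>,
        pub b1: Vec<f64>,
    }

    pub struct TwoLayerNetwork {
        pub w1: Vec<f64>,
        pub b1: Vec<f64>,
    }

    // was `createVec`: the push loop collapses to collecting the range
    pub fn create_vec(x: usize) -> Vec<usize> {
        (0..x).collect()
    }

    fn main() {
        let g = Grad { w1: vec![0.0], b1: vec![0.0] };
        let net = TwoLayerNetwork { w1: g.w1.clone(), b1: g.b1 };
        assert_eq!(create_vec(3), vec![0, 1, 2]);
        println!("{} {}", net.w1.len(), net.b1.len());
    }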
Dec 19 16:09:31.265 INFO [stderr] warning: unused variable: `rows`
Dec 19 16:09:31.265 INFO [stderr] --> src/main.rs:21:16
Dec 19 16:09:31.265 INFO [stderr] |
Dec 19 16:09:31.265 INFO [stderr] 21 | let (size, rows, cols) = (60_000, 28, 28);
Dec 19 16:09:31.265 INFO [stderr] | ^^^^ help: consider using `_rows` instead
Dec 19 16:09:31.265 INFO [stderr] |
Dec 19 16:09:31.265 INFO [stderr] = note: #[warn(unused_variables)] on by default
Dec 19 16:09:31.265 INFO [stderr]
Dec 19 16:09:31.265 INFO [stderr] warning: unused variable: `cols`
Dec 19 16:09:31.265 INFO [stderr] --> src/main.rs:21:22
Dec 19 16:09:31.265 INFO [stderr] |
Dec 19 16:09:31.265 INFO [stderr] 21 | let (size, rows, cols) = (60_000, 28, 28);
Dec 19 16:09:31.265 INFO [stderr] | ^^^^ help: consider using `_cols` instead
Dec 19 16:09:31.265 INFO [stderr]
Dec 19 16:09:31.265 INFO [stderr] warning: unused variable: `batch_size`
Dec 19 16:09:31.265 INFO [stderr] --> src/main.rs:40:9
Dec 19 16:09:31.265 INFO [stderr] |
Dec 19 16:09:31.265 INFO [stderr] 40 | let batch_size = 100;
Dec 19 16:09:31.265 INFO [stderr] | ^^^^^^^^^^ help: consider using `_batch_size` instead
Dec 19 16:09:31.265 INFO [stderr]
Dec 19 16:09:31.288 INFO [stderr] warning: variable does not need to be mutable
Dec 19 16:09:31.288 INFO [stderr] --> src/main.rs:33:9
Dec 19 16:09:31.288 INFO [stderr] |
Dec 19 16:09:31.288 INFO [stderr] 33 | let mut Two_layer_network = two_layer_net::Two_layer_network {
Dec 19 16:09:31.288 INFO [stderr] | ----^^^^^^^^^^^^^^^^^
Dec 19 16:09:31.288 INFO [stderr] | |
Dec 19 16:09:31.288 INFO [stderr] | help: remove this `mut`
Dec 19 16:09:31.288 INFO [stderr] |
Dec 19 16:09:31.288 INFO [stderr] = note: #[warn(unused_mut)] on by default
Dec 19 16:09:31.288 INFO [stderr]
Dec 19 16:09:31.352 INFO [stderr] warning: function is never used: `numerical_gradient`
Dec 19 16:09:31.352 INFO [stderr] --> src/gradient.rs:8:1
Dec 19 16:09:31.352 INFO [stderr] |
Dec 19 16:09:31.352 INFO [stderr] 8 | / pub fn numerical_gradient<
Dec 19 16:09:31.352 INFO [stderr] 9 | | F: Fn(&DMatrix,
Dec 19 16:09:31.352 INFO [stderr] 10 | | &DMatrix,
Dec 19 16:09:31.352 INFO [stderr] 11 | | &DMatrix,
Dec 19 16:09:31.352 INFO [stderr] ... |
Dec 19 16:09:31.352 INFO [stderr] 38 | | grad
Dec 19 16:09:31.352 INFO [stderr] 39 | | }
Dec 19 16:09:31.352 INFO [stderr] | |_^
Dec 19 16:09:31.352 INFO [stderr] |
Dec 19 16:09:31.352 INFO [stderr] = note: #[warn(dead_code)] on by default
Dec 19 16:09:31.352 INFO [stderr]
Dec 19 16:09:31.352 INFO [stderr] warning: method is never used: `numerical_gradient`
Dec 19 16:09:31.352 INFO [stderr] --> src/two_layer_net.rs:91:5
Dec 19 16:09:31.352 INFO [stderr] |
Dec 19 16:09:31.352 INFO [stderr] 91 | pub fn numerical_gradient(&mut self, x: &DMatrix, t: &DMatrix) -> grad {
Dec 19 16:09:31.352 INFO [stderr] | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 19 16:09:31.352 INFO [stderr]
Dec 19 16:09:31.352 INFO [stderr] warning: function is never used: `loss_w`
Dec 19 16:09:31.352 INFO [stderr] --> src/two_layer_net.rs:128:1
Dec 19 16:09:31.352 INFO [stderr] |
Dec 19 16:09:31.352 INFO [stderr] 128 | / pub fn loss_w(
Dec 19 16:09:31.352 INFO [stderr] 129 | | param: &DMatrix,
Dec 19 16:09:31.352 INFO [stderr] 130 | | x: &DMatrix,
Dec 19 16:09:31.352 INFO [stderr] 131 | | t: &DMatrix,
Dec 19 16:09:31.352 INFO [stderr] ... |
Dec 19 16:09:31.352 INFO [stderr] 135 | | two.loss(param, x, t, &patern)
Dec 19 16:09:31.352 INFO [stderr] 136 | | }
Dec 19 16:09:31.352 INFO [stderr] | |_^
Dec 19 16:09:31.352 INFO [stderr]
Dec 19 16:09:31.353 INFO [stderr] warning: function `axisZerosum` should have a snake case name such as `axis_zerosum`
Dec 19 16:09:31.353 INFO [stderr] --> src/nural.rs:32:1
Dec 19 16:09:31.353 INFO [stderr] |
Dec 19 16:09:31.353 INFO [stderr] 32 | / pub fn axisZerosum(x: &DMatrix) -> DMatrix {
Dec 19 16:09:31.353 INFO [stderr] 33 | | let zerosum = DMatrix::::from_iterator(
Dec 19 16:09:31.353 INFO [stderr] 34 | | 1,
Dec 19 16:09:31.353 INFO [stderr] 35 | | x.shape().1,
Dec 19 16:09:31.353 INFO [stderr] ... |
Dec 19 16:09:31.353 INFO [stderr] 44 | |
Dec 19 16:09:31.353 INFO [stderr] 45 | | }
Dec 19 16:09:31.353 INFO [stderr] | |_^
Dec 19 16:09:31.353 INFO [stderr] |
Dec 19 16:09:31.353 INFO [stderr] = note: #[warn(non_snake_case)] on by default
Dec 19 16:09:31.353 INFO [stderr]
Dec 19 16:09:31.353 INFO [stderr] warning: function `createVec` should have a snake case name such as `create_vec`
Dec 19 16:09:31.353 INFO [stderr] --> src/nural.rs:47:1
Dec 19 16:09:31.353 INFO [stderr] |
Dec 19 16:09:31.353 INFO [stderr] 47 | / pub fn createVec(x: usize) -> Vec {
Dec 19 16:09:31.353 INFO [stderr] 48 | | let mut vec = Vec::with_capacity(x);
Dec 19 16:09:31.353 INFO [stderr] 49 | | for i in 0..x {
Dec 19 16:09:31.353 INFO [stderr] 50 | | vec.push(i);
Dec 19 16:09:31.353 INFO [stderr] 51 | | }
Dec 19 16:09:31.353 INFO [stderr] 52 | | vec
Dec 19 16:09:31.353 INFO [stderr] 53 | | }
Dec 19 16:09:31.353 INFO [stderr] | |_^
Dec 19 16:09:31.353 INFO [stderr]
Dec 19 16:09:31.353 INFO [stderr] warning: type `grad` should have a camel case name such as `Grad`
Dec 19 16:09:31.353 INFO [stderr] --> src/two_layer_net.rs:13:1
Dec 19 16:09:31.353 INFO [stderr] |
Dec 19 16:09:31.353 INFO [stderr] 13 | / pub struct grad {
Dec 19 16:09:31.353 INFO [stderr] 14 | | pub w1: DMatrix,
Dec 19 16:09:31.353 INFO [stderr] 15 | | pub b1: DMatrix,
Dec 19 16:09:31.353 INFO [stderr] 16 | | pub w2: DMatrix,
Dec 19 16:09:31.353 INFO [stderr] 17 | | pub b2: DMatrix,
Dec 19 16:09:31.353 INFO [stderr] 18 | | }
Dec 19 16:09:31.353 INFO [stderr] | |_^
Dec 19 16:09:31.353 INFO [stderr] |
Dec 19 16:09:31.353 INFO [stderr] = note: #[warn(non_camel_case_types)] on by default
Dec 19 16:09:31.353 INFO [stderr]
Dec 19 16:09:31.353 INFO [stderr] warning: type `Two_layer_network` should have a camel case name such as `TwoLayerNetwork`
Dec 19 16:09:31.353 INFO [stderr] --> src/two_layer_net.rs:19:1
Dec 19 16:09:31.353 INFO [stderr] |
Dec 19 16:09:31.353 INFO [stderr] 19 | / pub struct Two_layer_network {
Dec 19 16:09:31.353 INFO [stderr] 20 | | pub w1: Rc>>,
Dec 19 16:09:31.353 INFO [stderr] 21 | | pub b1: Rc>>,
Dec 19 16:09:31.353 INFO [stderr] 22 | | pub w2: Rc>>,
Dec 19 16:09:31.353 INFO [stderr] 23 | | pub b2: Rc>>,
Dec 19 16:09:31.353 INFO [stderr] 24 | | }
Dec 19 16:09:31.353 INFO [stderr] | |_^
Dec 19 16:09:31.353 INFO [stderr]
Dec 19 16:09:31.353 INFO [stderr] warning: variable `Two_layer_network` should have a snake case name such as `two_layer_network`
Dec 19 16:09:31.353 INFO [stderr] --> src/main.rs:33:9
Dec 19 16:09:31.353 INFO [stderr] |
Dec 19 16:09:31.353 INFO [stderr] 33 | let mut Two_layer_network = two_layer_net::Two_layer_network {
Dec 19 16:09:31.353 INFO [stderr] | ^^^^^^^^^^^^^^^^^^^^^
Dec 19 16:09:31.353 INFO [stderr]
Dec 19 16:09:31.396 INFO [stderr] Finished dev [unoptimized + debuginfo] target(s) in 44.72s
Dec 19 16:09:32.222 INFO running `"docker" "inspect" "4311fe7e83a8be06892638d3e78474b5516eb60fe9f2d7fd9db529604ac2a339"`
Dec 19 16:09:32.522 INFO running `"docker" "rm" "-f" "4311fe7e83a8be06892638d3e78474b5516eb60fe9f2d7fd9db529604ac2a339"`
Dec 19 16:09:32.813 INFO [stdout] 4311fe7e83a8be06892638d3e78474b5516eb60fe9f2d7fd9db529604ac2a339