[INFO] cloning repository https://github.com/duggal1/supercrawler
[INFO] running `Command { std: "git" "-c" "credential.helper=" "-c" "credential.helper=/workspace/cargo-home/bin/git-credential-null" "clone" "--bare" "https://github.com/duggal1/supercrawler" "/workspace/cache/git-repos/https%3A%2F%2Fgithub.com%2Fduggal1%2Fsupercrawler", kill_on_drop: false }`
[INFO] [stderr] Cloning into bare repository '/workspace/cache/git-repos/https%3A%2F%2Fgithub.com%2Fduggal1%2Fsupercrawler'...
[INFO] running `Command { std: "git" "rev-parse" "HEAD", kill_on_drop: false }`
[INFO] [stdout] bae472d5a0745ad3a7680d9d0de959d4df209884
[INFO] checking duggal1/supercrawler against master#292be5c7c05138d753bbd4b30db7a3f1a5c914f7 for pr-148271
[INFO] running `Command { std: "git" "clone" "/workspace/cache/git-repos/https%3A%2F%2Fgithub.com%2Fduggal1%2Fsupercrawler" "/workspace/builds/worker-6-tc1/source", kill_on_drop: false }`
[INFO] [stderr] Cloning into '/workspace/builds/worker-6-tc1/source'...
[INFO] [stderr] done.
[INFO] started tweaking git repo https://github.com/duggal1/supercrawler
[INFO] finished tweaking git repo https://github.com/duggal1/supercrawler
[INFO] tweaked toml for git repo https://github.com/duggal1/supercrawler written to /workspace/builds/worker-6-tc1/source/Cargo.toml
[INFO] validating manifest of git repo https://github.com/duggal1/supercrawler on toolchain 292be5c7c05138d753bbd4b30db7a3f1a5c914f7
[INFO] running `Command { std: CARGO_HOME="/workspace/cargo-home" RUSTUP_HOME="/workspace/rustup-home" "/workspace/cargo-home/bin/cargo" "+292be5c7c05138d753bbd4b30db7a3f1a5c914f7" "metadata" "--manifest-path" "Cargo.toml" "--no-deps", kill_on_drop: false }`
[INFO] crate git repo https://github.com/duggal1/supercrawler already has a lockfile, it will not be regenerated
[INFO] running `Command { std: CARGO_HOME="/workspace/cargo-home" RUSTUP_HOME="/workspace/rustup-home" "/workspace/cargo-home/bin/cargo" "+292be5c7c05138d753bbd4b30db7a3f1a5c914f7" "fetch" "--manifest-path" "Cargo.toml", kill_on_drop: false }`
[INFO] [stderr] Blocking waiting for file lock on package cache
[INFO] [stderr] Updating crates.io index
[INFO] [stderr] Blocking waiting for file lock on package cache
[INFO] [stderr] Downloading crates ...
[INFO] [stderr] Downloaded servo_arc v0.4.0
[INFO] [stderr] Downloaded type1-encoding-parser v0.1.0
[INFO] [stderr] Downloaded adobe-cmap-parser v0.4.1
[INFO] [stderr] Downloaded postscript v0.14.1
[INFO] [stderr] Downloaded cff-parser v0.1.0
[INFO] [stderr] Downloaded actix-web-lab-derive v0.24.0
[INFO] [stderr] Downloaded actix-cors v0.7.1
[INFO] [stderr] Downloaded jiff-static v0.2.6
[INFO] [stderr] Downloaded euclid v0.20.14
[INFO] [stderr] Downloaded pom v1.1.0
[INFO] [stderr] Downloaded pdf-extract v0.9.0
[INFO] [stderr] Downloaded jiff v0.2.6
[INFO] [stderr] Downloaded actix-web-lab v0.24.1
[INFO] running `Command { std: "docker" "create" "-v" "/var/lib/crater-agent-workspace/builds/worker-6-tc1/target:/opt/rustwide/target:rw,Z" "-v" "/var/lib/crater-agent-workspace/builds/worker-6-tc1/source:/opt/rustwide/workdir:ro,Z" "-v" "/var/lib/crater-agent-workspace/cargo-home:/opt/rustwide/cargo-home:ro,Z" "-v" "/var/lib/crater-agent-workspace/rustup-home:/opt/rustwide/rustup-home:ro,Z" "-e" "SOURCE_DIR=/opt/rustwide/workdir" "-e" "CARGO_TARGET_DIR=/opt/rustwide/target" "-e" "CARGO_HOME=/opt/rustwide/cargo-home" "-e" "RUSTUP_HOME=/opt/rustwide/rustup-home" "-w" "/opt/rustwide/workdir" "-m" "1610612736" "--user" "0:0" "--network" "none" "ghcr.io/rust-lang/crates-build-env/linux@sha256:e90291280db7d1fac5b66fc6dad9f9662629e7365a55743daf9bdf73ebc4ea79" "/opt/rustwide/cargo-home/bin/cargo" "+292be5c7c05138d753bbd4b30db7a3f1a5c914f7" "metadata" "--no-deps" "--format-version=1", kill_on_drop: false }`
[INFO] [stdout] 967ef1089fd5ed34c9af25040e54668d6ba8482b9165643da6556bf9cf0421d7
[INFO] running `Command { std: "docker" "start" "-a" "967ef1089fd5ed34c9af25040e54668d6ba8482b9165643da6556bf9cf0421d7", kill_on_drop: false }`
[INFO] running `Command { std: "docker" "inspect" "967ef1089fd5ed34c9af25040e54668d6ba8482b9165643da6556bf9cf0421d7", kill_on_drop: false }`
[INFO] running `Command { std: "docker" "rm" "-f" "967ef1089fd5ed34c9af25040e54668d6ba8482b9165643da6556bf9cf0421d7", kill_on_drop: false }`
[INFO] [stdout] 967ef1089fd5ed34c9af25040e54668d6ba8482b9165643da6556bf9cf0421d7
[INFO] running `Command { std: "docker" "create" "-v" "/var/lib/crater-agent-workspace/builds/worker-6-tc1/target:/opt/rustwide/target:rw,Z" "-v" "/var/lib/crater-agent-workspace/builds/worker-6-tc1/source:/opt/rustwide/workdir:ro,Z" "-v" "/var/lib/crater-agent-workspace/cargo-home:/opt/rustwide/cargo-home:ro,Z" "-v" "/var/lib/crater-agent-workspace/rustup-home:/opt/rustwide/rustup-home:ro,Z" "-e" "SOURCE_DIR=/opt/rustwide/workdir" "-e" "CARGO_TARGET_DIR=/opt/rustwide/target" "-e" "CARGO_INCREMENTAL=0" "-e" "RUST_BACKTRACE=full" "-e" "RUSTFLAGS=--cap-lints=forbid" "-e" "RUSTDOCFLAGS=--cap-lints=forbid" "-e" "CARGO_HOME=/opt/rustwide/cargo-home" "-e" "RUSTUP_HOME=/opt/rustwide/rustup-home" "-w" "/opt/rustwide/workdir" "-m" "1610612736" "--user" "0:0" "--network" "none" "ghcr.io/rust-lang/crates-build-env/linux@sha256:e90291280db7d1fac5b66fc6dad9f9662629e7365a55743daf9bdf73ebc4ea79" "/opt/rustwide/cargo-home/bin/cargo" "+292be5c7c05138d753bbd4b30db7a3f1a5c914f7" "check" "--frozen" "--all" "--all-targets" "--message-format=json", kill_on_drop: false }`
[INFO] [stdout] ed525899e3737587e42f01d0d07864dfd2dd3ef290871f9bc0ee169db09b3de8
[INFO] running `Command { std: "docker" "start" "-a" "ed525899e3737587e42f01d0d07864dfd2dd3ef290871f9bc0ee169db09b3de8", kill_on_drop: false }`
[INFO] [stderr] Checking memchr v2.7.4
[INFO] [stderr] Compiling jobserver v0.1.33
[INFO] [stderr] Checking mio v1.0.3
[INFO] [stderr] Checking signal-hook-registry v1.4.2
[INFO] [stderr] Checking parking_lot_core v0.9.10
[INFO] [stderr] Checking bitflags v2.9.0
[INFO] [stderr] Compiling siphasher v1.0.1
[INFO] [stderr] Checking yoke v0.7.5
[INFO] [stderr] Checking tracing v0.1.41
[INFO] [stderr] Checking getrandom v0.3.2
[INFO] [stderr] Compiling serde v1.0.219
[INFO] [stderr] Compiling tracing-core v0.1.33
[INFO] [stderr] Compiling bytestring v1.4.0
[INFO] [stderr] Compiling derive_more-impl v2.0.1
[INFO] [stderr] Checking brotli-decompressor v4.0.2
[INFO] [stderr] Compiling phf_shared v0.11.3
[INFO] [stderr] Compiling cc v1.2.18
[INFO] [stderr] Checking rand_core v0.9.3
[INFO] [stderr] Checking precomputed-hash v0.1.1
[INFO] [stderr] Checking mac v0.1.1
[INFO] [stderr] Checking zerovec v0.10.4
[INFO] [stderr] Checking time v0.3.41
[INFO] [stderr] Compiling phf_generator v0.11.3
[INFO] [stderr] Checking futf v0.1.5
[INFO] [stderr] Checking dtoa v1.0.10
[INFO] [stderr] Checking futures-util v0.3.31
[INFO] [stderr] Checking aho-corasick v1.1.3
[INFO] [stderr] Checking serde_json v1.0.140
[INFO] [stderr] Compiling phf_codegen v0.11.3
[INFO] [stderr] Checking parking_lot v0.12.3
[INFO] [stderr] Checking rand_chacha v0.9.0
[INFO] [stderr] Compiling phf_macros v0.11.3
[INFO] [stderr] Compiling string_cache_codegen v0.5.4
[INFO] [stderr] Checking dtoa-short v0.3.5
[INFO] [stderr] Checking tokio v1.44.2
[INFO] [stderr] Compiling selectors v0.26.0
[INFO] [stderr] Checking string_cache v0.8.9
[INFO] [stderr] Checking tendril v0.4.3
[INFO] [stderr] Checking rand v0.9.0
[INFO] [stderr] Checking nom v8.0.0
[INFO] [stderr] Compiling cssparser-macros v0.6.1
[INFO] [stderr] Compiling markup5ever v0.14.1
[INFO] [stderr] Compiling actix-macros v0.2.4
[INFO] [stderr] Checking pom v1.1.0
[INFO] [stderr] Checking bytecount v0.6.8
[INFO] [stderr] Checking phf v0.11.3
[INFO] [stderr] Compiling rustix v1.0.5
[INFO] [stderr] Checking brotli v7.0.0
[INFO] [stderr] Checking cssparser v0.34.0
[INFO] [stderr] Checking csv-core v0.1.12
[INFO] [stderr] Checking aes v0.8.4
[INFO] [stderr] Checking md-5 v0.10.6
[INFO] [stderr] Checking tinystr v0.7.6
[INFO] [stderr] Checking icu_collections v1.5.0
[INFO] [stderr] Checking servo_arc v0.4.0
[INFO] [stderr] Checking icu_locid v1.5.0
[INFO] [stderr] Checking getrandom v0.2.15
[INFO] [stderr] Compiling match_token v0.1.0
[INFO] [stderr] Compiling derive_more v0.99.19
[INFO] [stderr] Checking regex-automata v0.4.9
[INFO] [stderr] Checking cookie v0.16.2
[INFO] [stderr] Checking rangemap v1.5.1
[INFO] [stderr] Compiling openssl-sys v0.9.107
[INFO] [stderr] Compiling zstd-sys v2.0.15+zstd.1.5.7
[INFO] [stderr] Checking anstream v0.6.18
[INFO] [stderr] Checking getopts v0.2.21
[INFO] [stderr] Checking ahash v0.8.11
[INFO] [stderr] Checking icu_provider v1.5.0
[INFO] [stderr] Checking csv v1.3.1
[INFO] [stderr] Checking adobe-cmap-parser v0.4.1
[INFO] [stderr] Checking type1-encoding-parser v0.1.0
[INFO] [stderr] Checking html5ever v0.29.1
[INFO] [stderr] Compiling openssl v0.10.72
[INFO] [stderr] Compiling native-tls v0.2.14
[INFO] [stderr] Checking euclid v0.20.14
[INFO] [stderr] Checking serde_html_form v0.2.7
[INFO] [stderr] Checking icu_locid_transform v1.5.0
[INFO] [stderr] Compiling actix-router v0.5.3
[INFO] [stderr] Compiling actix-web-lab-derive v0.24.0
[INFO] [stderr] Checking cff-parser v0.1.0
[INFO] [stderr] Checking jiff v0.2.6
[INFO] [stderr] Checking ego-tree v0.10.0
[INFO] [stderr] Checking postscript v0.14.1
[INFO] [stderr] Checking icu_properties v1.5.1
[INFO] [stderr] Compiling actix-web-codegen v4.3.0
[INFO] [stderr] Checking arc-swap v1.7.1
[INFO] [stderr] Checking utf8-width v0.1.7
[INFO] [stderr] Checking chrono v0.4.40
[INFO] [stderr] Checking scraper v0.23.1
[INFO] [stderr] Checking html-escape v0.2.13
[INFO] [stderr] Checking derive_more v2.0.1
[INFO] [stderr] Checking tempfile v3.19.1
[INFO] [stderr] Checking nom_locate v5.0.0
[INFO] [stderr] Checking lopdf v0.36.0
[INFO] [stderr] Checking regex v1.11.1
[INFO] [stderr] Checking futures-executor v0.3.31
[INFO] [stderr] Checking futures v0.3.31
[INFO] [stderr] Checking env_filter v0.1.3
[INFO] [stderr] Checking icu_normalizer v1.5.0
[INFO] [stderr] Checking pdf-extract v0.9.0
[INFO] [stderr] Checking idna_adapter v1.2.0
[INFO] [stderr] Checking idna v1.0.3
[INFO] [stderr] Checking tokio-util v0.7.14
[INFO] [stderr] Checking actix-rt v2.10.0
[INFO] [stderr] Checking tower v0.5.2
[INFO] [stderr] Checking actix-server v2.5.1
[INFO] [stderr] Checking url v2.5.4
[INFO] [stderr] Checking actix-codec v0.5.2
[INFO] [stderr] Checking h2 v0.4.8
[INFO] [stderr] Checking h2 v0.3.26
[INFO] [stderr] Checking tokio-stream v0.1.17
[INFO] [stderr] Checking tokio-native-tls v0.3.1
[INFO] [stderr] Checking env_logger v0.11.8
[INFO] [stderr] Compiling zstd-safe v7.2.4
[INFO] [stderr] Checking zstd v0.13.3
[INFO] [stderr] Checking actix-http v3.10.0
[INFO] [stderr] Checking hyper v1.6.0
[INFO] [stderr] Checking hyper-util v0.1.11
[INFO] [stderr] Checking actix-web v4.10.2
[INFO] [stderr] Checking hyper-tls v0.6.0
[INFO] [stderr] Checking reqwest v0.12.15
[INFO] [stderr] Checking actix-web-lab v0.24.1
[INFO] [stderr] Checking actix-cors v0.7.1
[INFO] [stderr] Checking super-crawler v0.1.0 (/opt/rustwide/workdir)
[INFO] [stdout] warning: unused import: `Client`
[INFO] [stdout] --> src/yt_crawler.rs:5:15
[INFO] [stdout] |
[INFO] [stdout] 5 | use reqwest::{Client};
[INFO] [stdout] | ^^^^^^
[INFO] [stdout] |
[INFO] [stdout] = note: `#[warn(unused_imports)]` (part of `#[warn(unused)]`) on by default
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stdout] warning: unused import: `html_escape`
[INFO] [stdout] --> src/yt_crawler.rs:10:5
[INFO] [stdout] |
[INFO] [stdout] 10 | use html_escape;
[INFO] [stdout] | ^^^^^^^^^^^
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stdout] warning: unused import: `regex::Regex`
[INFO] [stdout] --> src/yt_crawler.rs:11:5
[INFO] [stdout] |
[INFO] [stdout] 11 | use regex::Regex;
[INFO] [stdout] | ^^^^^^^^^^^^
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stdout] warning: unused imports: `App`, `HttpResponse`, `HttpServer`, and `Responder`
[INFO] [stdout] --> src/supercrawler.rs:1:22
[INFO] [stdout] |
[INFO] [stdout] 1 | use actix_web::{web, App, HttpServer, HttpResponse, Responder, rt::spawn};
[INFO] [stdout] | ^^^ ^^^^^^^^^^ ^^^^^^^^^^^^ ^^^^^^^^^
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stdout] warning: unused import: `self`
[INFO] [stdout] --> src/supercrawler.rs:21:26
[INFO] [stdout] |
[INFO] [stdout] 21 | use actix_web_lab::sse::{self, Sse, Event, Data};
[INFO] [stdout] | ^^^^
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stdout] warning: unused import: `std::io::Error as IoError`
[INFO] [stdout] --> src/supercrawler.rs:24:5
[INFO] [stdout] |
[INFO] [stdout] 24 | use std::io::Error as IoError;
[INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^^^^^^
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stdout] warning: unused import: `Client`
[INFO] [stdout] --> src/yt_crawler.rs:5:15
[INFO] [stdout] |
[INFO] [stdout] 5 | use reqwest::{Client};
[INFO] [stdout] | ^^^^^^
[INFO] [stdout] |
[INFO] [stdout] = note: `#[warn(unused_imports)]` (part of `#[warn(unused)]`) on by default
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stdout] warning: unused import: `html_escape`
[INFO] [stdout] --> src/yt_crawler.rs:10:5
[INFO] [stdout] |
[INFO] [stdout] 10 | use html_escape;
[INFO] [stdout] | ^^^^^^^^^^^
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stdout] warning: unused import: `regex::Regex`
[INFO] [stdout] --> src/yt_crawler.rs:11:5
[INFO] [stdout] |
[INFO] [stdout] 11 | use regex::Regex;
[INFO] [stdout] | ^^^^^^^^^^^^
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stdout] warning: unused imports: `App`, `HttpResponse`, `HttpServer`, and `Responder`
[INFO] [stdout] --> src/supercrawler.rs:1:22
[INFO] [stdout] |
[INFO] [stdout] 1 | use actix_web::{web, App, HttpServer, HttpResponse, Responder, rt::spawn};
[INFO] [stdout] | ^^^ ^^^^^^^^^^ ^^^^^^^^^^^^ ^^^^^^^^^
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stdout] warning: unused import: `self`
[INFO] [stdout] --> src/supercrawler.rs:21:26
[INFO] [stdout] |
[INFO] [stdout] 21 | use actix_web_lab::sse::{self, Sse, Event, Data};
[INFO] [stdout] | ^^^^
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stdout] warning: unused import: `std::io::Error as IoError`
[INFO] [stdout] --> src/supercrawler.rs:24:5
[INFO] [stdout] |
[INFO] [stdout] 24 | use std::io::Error as IoError;
[INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^^^^^^
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stdout] warning: function `fetch_html` is never used
[INFO] [stdout] --> src/main.rs:630:10
[INFO] [stdout] |
[INFO] [stdout] 630 | async fn fetch_html(client: &Client, url: &str) -> Option {
[INFO] [stdout] | ^^^^^^^^^^
[INFO] [stdout] |
[INFO] [stdout] = note: `#[warn(dead_code)]` (part of `#[warn(unused)]`) on by default
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stdout] warning: field `transcript` is never read
[INFO] [stdout] --> src/yt_crawler.rs:23:9
[INFO] [stdout] |
[INFO] [stdout] 19 | pub struct Video {
[INFO] [stdout] | ----- field in this struct
[INFO] [stdout] ...
[INFO] [stdout] 23 | pub transcript: Option,
[INFO] [stdout] | ^^^^^^^^^^
[INFO] [stdout] |
[INFO] [stdout] = note: `Video` has derived impls for the traits `Clone` and `Debug`, but these are intentionally ignored during dead code analysis
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stdout] warning: function `scrape_handler` is never used
[INFO] [stdout] --> src/yt_crawler.rs:272:10
[INFO] [stdout] |
[INFO] [stdout] 272 | async fn scrape_handler(
[INFO] [stdout] | ^^^^^^^^^^^^^^
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stdout] warning: struct `CrawlRequest` is never constructed
[INFO] [stdout] --> src/supercrawler.rs:32:8
[INFO] [stdout] |
[INFO] [stdout] 32 | struct CrawlRequest {
[INFO] [stdout] | ^^^^^^^^^^^^
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stdout] warning: struct `CrawlResponse` is never constructed
[INFO] [stdout] --> src/supercrawler.rs:38:8
[INFO] [stdout] |
[INFO] [stdout] 38 | struct CrawlResponse {
[INFO] [stdout] | ^^^^^^^^^^^^^
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stdout] warning: function `fetch_and_extract_urls` is never used
[INFO] [stdout] --> src/supercrawler.rs:564:10
[INFO] [stdout] |
[INFO] [stdout] 564 | async fn fetch_and_extract_urls(client: &Client, url: &str) -> Vec {
[INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^^^
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stdout] warning: function `fetch_html` is never used
[INFO] [stdout] --> src/supercrawler.rs:623:10
[INFO] [stdout] |
[INFO] [stdout] 623 | async fn fetch_html(client: &Client, url: &str) -> Option {
[INFO] [stdout] | ^^^^^^^^^^
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stdout] warning: structure field `finalAnalysis` should have a snake case name
[INFO] [stdout] --> src/supercrawler.rs:68:5
[INFO] [stdout] |
[INFO] [stdout] 68 | finalAnalysis: Option,
[INFO] [stdout] | ^^^^^^^^^^^^^ help: convert the identifier to snake case: `final_analysis`
[INFO] [stdout] |
[INFO] [stdout] = note: `#[warn(non_snake_case)]` (part of `#[warn(nonstandard_style)]`) on by default
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stdout] warning: function `fetch_html` is never used
[INFO] [stdout] --> src/main.rs:630:10
[INFO] [stdout] |
[INFO] [stdout] 630 | async fn fetch_html(client: &Client, url: &str) -> Option {
[INFO] [stdout] | ^^^^^^^^^^
[INFO] [stdout] |
[INFO] [stdout] = note: `#[warn(dead_code)]` (part of `#[warn(unused)]`) on by default
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stdout] warning: field `transcript` is never read
[INFO] [stdout] --> src/yt_crawler.rs:23:9
[INFO] [stdout] |
[INFO] [stdout] 19 | pub struct Video {
[INFO] [stdout] | ----- field in this struct
[INFO] [stdout] ...
[INFO] [stdout] 23 | pub transcript: Option,
[INFO] [stdout] | ^^^^^^^^^^
[INFO] [stdout] |
[INFO] [stdout] = note: `Video` has derived impls for the traits `Clone` and `Debug`, but these are intentionally ignored during dead code analysis
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stdout] warning: function `scrape_handler` is never used
[INFO] [stdout] --> src/yt_crawler.rs:272:10
[INFO] [stdout] |
[INFO] [stdout] 272 | async fn scrape_handler(
[INFO] [stdout] | ^^^^^^^^^^^^^^
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stdout] warning: struct `CrawlRequest` is never constructed
[INFO] [stdout] --> src/supercrawler.rs:32:8
[INFO] [stdout] |
[INFO] [stdout] 32 | struct CrawlRequest {
[INFO] [stdout] | ^^^^^^^^^^^^
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stdout] warning: struct `CrawlResponse` is never constructed
[INFO] [stdout] --> src/supercrawler.rs:38:8
[INFO] [stdout] |
[INFO] [stdout] 38 | struct CrawlResponse {
[INFO] [stdout] | ^^^^^^^^^^^^^
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stdout] warning: function `fetch_and_extract_urls` is never used
[INFO] [stdout] --> src/supercrawler.rs:564:10
[INFO] [stdout] |
[INFO] [stdout] 564 | async fn fetch_and_extract_urls(client: &Client, url: &str) -> Vec {
[INFO] [stdout] | ^^^^^^^^^^^^^^^^^^^^^^
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stdout] warning: function `fetch_html` is never used
[INFO] [stdout] --> src/supercrawler.rs:623:10
[INFO] [stdout] |
[INFO] [stdout] 623 | async fn fetch_html(client: &Client, url: &str) -> Option {
[INFO] [stdout] | ^^^^^^^^^^
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stdout] warning: structure field `finalAnalysis` should have a snake case name
[INFO] [stdout] --> src/supercrawler.rs:68:5
[INFO] [stdout] |
[INFO] [stdout] 68 | finalAnalysis: Option,
[INFO] [stdout] | ^^^^^^^^^^^^^ help: convert the identifier to snake case: `final_analysis`
[INFO] [stdout] |
[INFO] [stdout] = note: `#[warn(non_snake_case)]` (part of `#[warn(nonstandard_style)]`) on by default
[INFO] [stdout]
[INFO] [stdout]
[INFO] [stderr] Finished `dev` profile [unoptimized + debuginfo] target(s) in 56.56s
[INFO] running `Command { std: "docker" "inspect" "ed525899e3737587e42f01d0d07864dfd2dd3ef290871f9bc0ee169db09b3de8", kill_on_drop: false }`
[INFO] running `Command { std: "docker" "rm" "-f" "ed525899e3737587e42f01d0d07864dfd2dd3ef290871f9bc0ee169db09b3de8", kill_on_drop: false }`
[INFO] [stdout] ed525899e3737587e42f01d0d07864dfd2dd3ef290871f9bc0ee169db09b3de8
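
The check finished with warnings only and no errors, so duggal1/supercrawler still builds on the pr-148271 toolchain; every diagnostic above points at the crate's own sources. The unused-import warnings go away by trimming the `use` lists rustc flags, and the `non_snake_case` warning on `finalAnalysis` can be silenced without changing any JSON wire format. A minimal sketch, assuming the struct at src/supercrawler.rs:68 is (de)serialized with serde (the log only shows serde and serde_json in the dependency tree, not how the field is used) and that the field is an Option<String> (the generic argument was stripped from the log output):

use serde::{Deserialize, Serialize};

// Hypothetical stand-in for the struct around src/supercrawler.rs:68; only the
// `finalAnalysis` field name comes from the log, everything else is assumed.
#[derive(Debug, Serialize, Deserialize)]
struct CrawlStatus {
    // Snake-case name satisfies the lint; the serde rename keeps the existing
    // camelCase key so any JSON consumers are unaffected.
    #[serde(rename = "finalAnalysis")]
    final_analysis: Option<String>,
}

fn main() {
    let status = CrawlStatus { final_analysis: Some("done".into()) };
    // Prints {"finalAnalysis":"done"} -- the serialized key is unchanged.
    println!("{}", serde_json::to_string(&status).unwrap());
}

If every field of that struct is camelCase on the wire, `#[serde(rename_all = "camelCase")]` on the struct covers them all at once instead of per-field renames.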