If you are looking for the most effective real-time object detection algorithm that is open source and free to use, then YOLO (You Only Look Once) is the perfect answer. YOLO encompasses many of the most innovative ideas coming out of the computer vision research community. Object detection has become a critical capability of autonomous vehicle technology. Kiwibot is one interesting example I have been talking about. A Kiwibot is a food delivery robot equipped with six cameras and GPS to deliver a food order at the right place and at the right time. Last year it served around 40,000 food deliveries. Only the person who placed the order can open the Kiwibot and retrieve it through the app – all powered by an object detection algorithm. Isn’t it amazing?
YOLO is a really clever convolutional neural network (CNN) for doing object detection, and in real time at that. YOLO learns generalizable representations of objects, so that when trained on natural images and tested on artwork, it outperforms other top detection methods. It is extremely fast. It sees the entire image during training and test time, so it implicitly encodes contextual information about classes as well as their appearance. Today it is widely adopted because it achieves high accuracy while also being able to run in real time. If you want to dive deep into YOLO, I recommend reading
https://arxiv.org/pdf/1506.02640v5.pdf
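If you just want to see YOLO in action before touching the Jetson board, the reference Darknet implementation ships with a simple CLI. Below is a minimal sketch that assumes you have already cloned and built Darknet (github.com/pjreddie/darknet) and are running the commands from its root directory; the sample image ships with the repository.
# download the pre-trained Tiny YOLOv3 weights
wget https://pjreddie.com/media/files/yolov3-tiny.weights
# run single-image detection; the annotated result is written to predictions.jpg
./darknet detect cfg/yolov3-tiny.cfg yolov3-tiny.weights data/dog.jpg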
Let’s talk about Pico for on-premises deployments.
Pico is an interesting project I have been working on for the past 3 months. It is all about implementing object detection and analytics (deep learning) using Docker on IoT devices like Raspberry Pi and Jetson Nano in just 3 simple steps. Imagine being able to capture live video streams, identify objects using deep learning, and then trigger actions or notifications based on the identified objects – all using Docker containers. With Pico, you can set up and run a live video capture, analysis, and alerting solution prototype. A camera surveils a particular area, streaming video over the network to a video capture client. The client samples video frames and sends them to AWS, where they are analyzed and stored along with metadata. If certain objects are detected in the analyzed video frames, SMS alerts are sent out. Once a person receives an SMS alert, they will likely want to know what caused it. For that, sampled video frames can be monitored with low latency using a web-based user interface. One limitation of this approach is that it uses a cloud deep learning service, Amazon Rekognition, which is free only for the first 5,000 API calls. Once you cross that limit, you will be charged. Hence, I started looking for an AI platform that can run modern deep learning algorithms fast on-premises.
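To give a feel for the cloud side of that pipeline, here is a minimal sketch of the Amazon Rekognition label-detection call using the AWS CLI. The bucket name, object key, and region are placeholders; each call against this service counts toward the free-tier limit mentioned above.
# detect labels in a sampled video frame stored in S3 (bucket/key/region are placeholders)
aws rekognition detect-labels \
  --image '{"S3Object":{"Bucket":"my-pico-frames","Name":"frame-0001.jpg"}}' \
  --max-labels 10 --min-confidence 75 \
  --region us-east-1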
Why YOLO on Jetson Nano?
Deep learning is a field with intense computational requirements, and your choice of GPU fundamentally determines your deep learning experience. Having a fast GPU is very important when you begin learning deep learning, because it lets you gain practical experience quickly, which is key to building the expertise needed to apply deep learning to new problems. Without this rapid feedback, it simply takes too long to learn from one’s mistakes, and it can be discouraging and frustrating to keep going. Modern libraries like TensorFlow and PyTorch are great at parallelizing recurrent and convolutional networks; for convolution, you can expect a speedup of about 1.9x/2.8x/3.5x for 2/3/4 GPUs.
At just 70 x 45 mm, the Jetson Nano module is the smallest Jetson device with AI capability. This production-ready System on Module (SOM) delivers big when it comes to deploying AI to devices at the edge across multiple industries—from smart cities to robotics. Jetson Nano delivers 472 GFLOPs for running modern AI algorithms fast. It runs multiple neural networks in parallel and processes several high-resolution sensors simultaneously, making it ideal for applications like entry-level Network Video Recorders (NVRs), home robots, and intelligent gateways with full analytics capabilities. You can experience powerful and efficient AI, computer vision, and high-performance computing at just 5 to 10 watts.
If you have ever set up YOLO on Jetson Nano, I am sure you have faced several challenges compiling Python, OpenCV and Darknet. In this blog post, I will show how object detection can be simplified by using a Docker container.
Preparing Jetson Nano
- Unboxing Jetson Nano Pack
- Preparing your microSD card
To prepare your microSD card, you’ll need a computer with Internet connection and the ability to read and write SD cards, either via a built-in SD card slot or adapter.
- Download the Jetson Nano Developer Kit SD Card Image, and note where it was saved on the computer.
- Write the image to your microSD card (at least 16 GB) by following the instructions below for the type of computer you are using: Windows, Mac, or Linux. On a Windows laptop, you can use the SDFormatter software to format your microSD card and Win32DiskImager to flash the Jetson Nano image. On a Mac, you will need the Etcher software. For Linux, a minimal command-line example is sketched a little further below.
The Jetson Nano SD card image is about 12 GB (uncompressed).
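For Linux users, flashing can be done from the command line. This is only a sketch: the image file name inside the downloaded zip varies by JetPack release, and /dev/sdX is a placeholder for your SD card device, so double-check it (for example with lsblk) before writing, because dd will overwrite whatever it points at.
# unzip the downloaded SD card image (the .img file name varies by release)
unzip jetson-nano-sd-card-image.zip
# identify your SD card device first, e.g. with: lsblk
# then write the image, replacing /dev/sdX with your SD card device
sudo dd if=sd-blob.img of=/dev/sdX bs=4M conv=fsync status=progress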
Next, it’s time to remove the tiny SD card from the card reader, plug it into the Jetson Nano board, and let it boot.
Wow! Jetson Nano comes with Docker 18.09 by default
Yes, you read that correctly. Let us verify it. First, we will check the OS version running on Jetson Nano.
Verifying OS running on Jetson Nano
jetson@jetson-desktop:~$ sudo cat /etc/os-release
NAME="Ubuntu"
VERSION="18.04.2 LTS (Bionic Beaver)"
ID=ubuntu
ID_LIKE=debian
PRETTY_NAME="Ubuntu 18.04.2 LTS"
VERSION_ID="18.04"
HOME_URL="https://www.ubuntu.com/"
SUPPORT_URL="https://help.ubuntu.com/"
BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/"
PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy"
VERSION_CODENAME=bionic
UBUNTU_CODENAME=bionic
jetson@jetson-desktop:~$
Verifying Docker
jetson@jetson-desktop:~$ sudo docker version
Client:
Version: 18.09.2
API version: 1.39
Go version: go1.10.4
Git commit: 6247962
Built: Tue Feb 26 23:51:35 2019
OS/Arch: linux/arm64
Experimental: false
Server:
Engine:
Version: 18.09.2
API version: 1.39 (minimum version 1.12)
Go version: go1.10.4
Git commit: 6247962
Built: Wed Feb 13 00:24:14 2019
OS/Arch: linux/arm64
Experimental: false
jetson@jetson-desktop:~$
Updating OS Repository
sudo apt update
Installing Docker 19.03 Binaries
You will need the curl command to upgrade Docker from 18.09 to 19.03 smoothly.
sudo apt install curl
curl -sSL https://get.docker.com/ | sh
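Optionally, if you do not want to prefix every Docker command with sudo, you can add your user to the docker group. This is standard post-install housekeeping; throughout this post I keep using sudo, so the step is purely a convenience.
# add the current user to the docker group; log out and back in (or reboot) for it to take effect
sudo usermod -aG docker $USER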
jetson@jetson-desktop:~$ sudo docker version
Client: Docker Engine - Community
Version: 19.03.2
API version: 1.40
Go version: go1.12.8
Git commit: 6a30dfc
Built: Thu Aug 29 05:32:21 2019
OS/Arch: linux/arm64
Experimental: false
Server: Docker Engine - Community
Engine:
Version: 19.03.2
API version: 1.40 (minimum version 1.12)
Go version: go1.12.8
Git commit: 6a30dfc
Built: Thu Aug 29 05:30:53 2019
OS/Arch: linux/arm64
Experimental: false
containerd:
Version: 1.2.6
GitCommit: 894b81a4b802e4eb2a91d1ce216b8817763c29fb
runc:
Version: 1.0.0-rc8
GitCommit: 425e105d5a03fabd737a126ad93d62a9eeede87f
docker-init:
Version: 0.18.0
GitCommit: fec3683
jetson@jetson-desktop:~$
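Since OpenDataCam needs GPU acceleration inside the container, it is also worth confirming that the NVIDIA container runtime is registered with the Docker daemon. Recent JetPack releases ship it preinstalled; the quick check below simply greps the daemon info and assumes a default JetPack setup.
# the output should include an "nvidia" entry in the list of runtimes
sudo docker info | grep -i runtime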
Running Jetson on 5W
Jetson Nano has two power modes, 5W and 10W. Set the power mode of the Jetson Nano to 5W by running the command below:
sudo nvpmodel -m 1
Please note that I encountered an issue while operating it at 10W: every time I started the OpenDataCam container, the board simply rebooted.
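You can confirm which power mode is currently active using nvpmodel's query flag; after the command above it should report the 5W mode.
# print the currently active power mode
sudo nvpmodel -q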
Setting up a swap file
To reduce memory pressure (and crashes), it is a good idea to set up a 6 GB swap file, since the Nano has only 4 GB of RAM.
git clone https://github.com/collabnix/installSwapfile
cd installSwapfile
chmod 777 installSwapfile.sh
./installSwapfile.sh
Don’t forget to reboot the Jetson Nano.
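After the reboot, you can verify that the swap space is active; the size reported should match what the script configured.
# list active swap areas and show overall memory usage
sudo swapon --show
free -h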
Verify that your USB camera is connected
I tested it with a Logitech webcam, and Jetson Nano already ships with the driver required for it to work.
ls /dev/video*
Output should be: /dev/video0
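Optionally, you can inspect which formats, resolutions, and frame rates the camera supports; this is handy because OpenDataCam's default usbcam GStreamer pipeline (shown in the config dump later in this post) requests a specific resolution and frame rate from /dev/video0. The v4l2-ctl tool may need to be installed first.
# install the Video4Linux command-line utilities
sudo apt install v4l-utils
# list supported pixel formats, resolutions, and frame rates for the camera
v4l2-ctl --list-formats-ext -d /dev/video0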
Running the scripts
I will be using the OpenDataCam tool for object detection. It is an open source tool that quantifies and tracks moving objects with live video analysis. It runs flawlessly on Linux and CUDA GPU-enabled hardware. Good news for NVIDIA Jetson fans: it is optimized for the NVIDIA Jetson board series.
Interestingly, OpenDataCam is shipped as a Docker container. Let us go ahead and try it out for the first time on the Jetson Nano board. Follow the steps below to pull the install script and run it for Jetson Nano.
wget -N https://raw.githubusercontent.com/opendatacam/opendatacam/v2.1.0/docker/install-opendatacam.sh
chmod 777 install-opendatacam.sh
./install-opendatacam.sh --platform nano
Listing the container
jetson@worker1:~$ sudo docker ps
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
aae5117a06c6 opendatacam/opendatacam:v2.1.0-nano "/bin/sh -c ./docker…" 15 minutes ago Up 5 minutes 0.0.0.0:8070->8070/tcp, 0.0.0.0:8080->8080/tcp, 0.0.0.0:8090->8090/tcp, 27017/tcp heuristic_bardeen
jetson@worker1:~$ sudo docker logs -f aae
2020-01-05T10:24:01.840+0000 I STORAGE [main] Max cache overflow file size custom option: 0
2020-01-05T10:24:01.845+0000 I CONTROL [main] Automatically disabling TLS 1.0, to force-enable TLS 1.0 specify --sslDisabledProtocols 'none'
2020-01-05T10:24:01.853+0000 I CONTROL [initandlisten] MongoDB starting : pid=8 port=27017 dbpath=/data/db 64-bit host=aae5117a06c6
2020-01-05T10:24:01.853+0000 I CONTROL [initandlisten] db version v4.0.12
2020-01-05T10:24:01.853+0000 I CONTROL [initandlisten] git version: 5776e3cbf9e7afe86e6b29e22520ffb6766e95d4
2020-01-05T10:24:01.853+0000 I CONTROL [initandlisten] OpenSSL version: OpenSSL 1.0.2n 7 Dec 2017
2020-01-05T10:24:01.853+0000 I CONTROL [initandlisten] allocator: tcmalloc
2020-01-05T10:24:01.853+0000 I CONTROL [initandlisten] modules: none
2020-01-05T10:24:01.853+0000 I CONTROL [initandlisten] build environment:
2020-01-05T10:24:01.853+0000 I CONTROL [initandlisten] distmod: ubuntu1604
2020-01-05T10:24:01.853+0000 I CONTROL [initandlisten] distarch: aarch64
2020-01-05T10:24:01.853+0000 I CONTROL [initandlisten] target_arch: aarch64
2020-01-05T10:24:01.853+0000 I CONTROL [initandlisten] options: {}
2020-01-05T10:24:01.854+0000 I STORAGE [initandlisten]
2020-01-05T10:24:01.854+0000 I STORAGE [initandlisten] ** WARNING: Using the XFS filesystem is strongly recommended with the WiredTiger storage engine
2020-01-05T10:24:01.854+0000 I STORAGE [initandlisten] ** See http://dochub.mongodb.org/core/prodnotes-filesystem
2020-01-05T10:24:01.854+0000 I STORAGE [initandlisten] wiredtiger_open config: create,cache_size=1470M,cache_overflow=(file_max=0M),session_max=20000,eviction=(threads_min=4,threads_max=4),config_base=false,statistics=(fast),log=(enabled=true,archive=true,path=journal,compressor=snappy),file_manager=(close_idle_time=100000),statistics_log=(wait=0),verbose=(recovery_progress),
2020-01-05T10:24:03.612+0000 I STORAGE [initandlisten] WiredTiger message [1578219843:612093][8:0x7fb6246440], txn-recover: Set global recovery timestamp: 0
2020-01-05T10:24:03.669+0000 I RECOVERY [initandlisten] WiredTiger recoveryTimestamp. Ts: Timestamp(0, 0)
2020-01-05T10:24:03.730+0000 I CONTROL [initandlisten]
2020-01-05T10:24:03.730+0000 I CONTROL [initandlisten] ** WARNING: Access control is not enabled for the database.
2020-01-05T10:24:03.730+0000 I CONTROL [initandlisten] ** Read and write access to data and configuration is unrestricted.
2020-01-05T10:24:03.730+0000 I CONTROL [initandlisten] ** WARNING: You are running this process as the root user, which is not recommended.
2020-01-05T10:24:03.730+0000 I CONTROL [initandlisten]
2020-01-05T10:24:03.731+0000 I CONTROL [initandlisten] ** WARNING: This server is bound to localhost.
2020-01-05T10:24:03.731+0000 I CONTROL [initandlisten] ** Remote systems will be unable to connect to this server.
2020-01-05T10:24:03.731+0000 I CONTROL [initandlisten] ** Start the server with --bind_ip <address> to specify which IP
2020-01-05T10:24:03.731+0000 I CONTROL [initandlisten] ** addresses it should serve responses from, or with --bind_ip_all to
2020-01-05T10:24:03.731+0000 I CONTROL [initandlisten] ** bind to all interfaces. If this behavior is desired, start the
2020-01-05T10:24:03.732+0000 I CONTROL [initandlisten] ** server with --bind_ip 127.0.0.1 to disable this warning.
2020-01-05T10:24:03.732+0000 I CONTROL [initandlisten]
2020-01-05T10:24:03.733+0000 I CONTROL [initandlisten]
2020-01-05T10:24:03.734+0000 I CONTROL [initandlisten] ** WARNING: /sys/kernel/mm/transparent_hugepage/enabled is 'always'.
2020-01-05T10:24:03.734+0000 I CONTROL [initandlisten] ** We suggest setting it to 'never'
2020-01-05T10:24:03.734+0000 I CONTROL [initandlisten]
2020-01-05T10:24:03.738+0000 I STORAGE [initandlisten] createCollection: admin.system.version with provided UUID: 2ecaac66-8c6f-403e-b789-2a69113c59fd
2020-01-05T10:24:03.802+0000 I COMMAND [initandlisten] setting featureCompatibilityVersion to 4.0
2020-01-05T10:24:03.810+0000 I STORAGE [initandlisten] createCollection: local.startup_log with generated UUID: 847e0215-cc4d-4f84-8bbe-0bccb2f9dfd3
2020-01-05T10:24:03.858+0000 I FTDC [initandlisten] Initializing full-time diagnostic data capture with directory '/data/db/diagnostic.data'
2020-01-05T10:24:03.862+0000 I NETWORK [initandlisten] waiting for connections on port 27017
2020-01-05T10:24:03.863+0000 I STORAGE [LogicalSessionCacheRefresh] createCollection: config.system.sessions with generated UUID: 1e2b3be5-a92a-4eb8-b8a5-c6d10cfaadb7
2020-01-05T10:24:03.961+0000 I INDEX [LogicalSessionCacheRefresh] build index on: config.system.sessions properties: { v: 2, key: { lastUse: 1 }, name: "lsidTTLIndex", ns: "config.system.sessions", expireAfterSeconds: 1800 }
2020-01-05T10:24:03.961+0000 I INDEX [LogicalSessionCacheRefresh] building index using bulk method; build may temporarily use up to 500 megabytes of RAM
2020-01-05T10:24:03.965+0000 I INDEX [LogicalSessionCacheRefresh] build index done. scanned 0 total records. 0 secs
2020-01-05T10:24:03.965+0000 I COMMAND [LogicalSessionCacheRefresh] command config.$cmd command: createIndexes { createIndexes: "system.sessions", indexes: [ { key: { lastUse: 1 }, name: "lsidTTLIndex", expireAfterSeconds: 1800 } ], $db: "config" } numYields:0 reslen:114 locks:{ Global: { acquireCount: { r: 2, w: 2 } }, Database: { acquireCount: { w: 2, W: 1 } }, Collection: { acquireCount: { w: 2 } } } storage:{} protocol:op_msg 102ms
> OpenDataCam@2.1.0 start /opendatacam
> PORT=8080 NODE_ENV=production node server.js
Please specify the path to the raw detections file
-----------------------------------
- Opendatacam initialized -
- Config loaded: -
{
"OPENDATACAM_VERSION": "2.1.0",
"PATH_TO_YOLO_DARKNET": "/darknet",
"VIDEO_INPUT": "usbcam",
"NEURAL_NETWORK": "yolov3-tiny",
"VIDEO_INPUTS_PARAMS": {
"file": "opendatacam_videos/demo.mp4",
"usbcam": "v4l2src device=/dev/video0 ! video/x-raw, framerate=30/1, width=640, height=360 ! videoconvert ! appsink",
"usbcam_no_gstreamer": "-c 0",
"experimental_raspberrycam_docker": "v4l2src device=/dev/video2 ! video/x-raw, framerate=30/1, width=640, height=360 ! videoconvert ! appsink",
"raspberrycam_no_docker": "nvarguscamerasrc ! video/x-raw(memory:NVMM),width=1280, height=720, framerate=30/1, format=NV12 ! nvvidconv ! video/x-raw, format=BGRx, width=640, height=360 ! videoconvert ! video/x-raw, format=BGR ! appsink",
"remote_cam": "YOUR IP CAM STREAM (can be .m3u8, MJPEG ...), anything supported by opencv"
},
"VALID_CLASSES": [
"*"
],
"DISPLAY_CLASSES": [
{
"class": "bicycle",
"icon": "1F6B2.svg"
},
{
"class": "person",
"icon": "1F6B6.svg"
},
{
"class": "truck",
"icon": "1F69B.svg"
},
{
"class": "motorbike",
"icon": "1F6F5.svg"
},
{
"class": "car",
"icon": "1F697.svg"
},
{
"class": "bus",
"icon": "1F68C.svg"
}
],
"PATHFINDER_COLORS": [
"#1f77b4",
"#ff7f0e",
"#2ca02c",
"#d62728",
"#9467bd",
"#8c564b",
"#e377c2",
"#7f7f7f",
"#bcbd22",
"#17becf"
],
"COUNTER_COLORS": {
"yellow": "#FFE700",
"turquoise": "#A3FFF4",
"green": "#a0f17f",
"purple": "#d070f0",
"red": "#AB4435"
},
"NEURAL_NETWORK_PARAMS": {
"yolov3": {
"data": "cfg/coco.data",
"cfg": "cfg/yolov3.cfg",
"weights": "yolov3.weights"
},
"yolov3-tiny": {
"data": "cfg/coco.data",
"cfg": "cfg/yolov3-tiny.cfg",
"weights": "yolov3-tiny.weights"
},
"yolov2-voc": {
"data": "cfg/voc.data",
"cfg": "cfg/yolo-voc.cfg",
"weights": "yolo-voc.weights"
}
},
"TRACKER_ACCURACY_DISPLAY": {
"nbFrameBuffer": 300,
"settings": {
"radius": 3.1,
"blur": 6.2,
"step": 0.1,
"gradient": {
"1": "red",
"0.4": "orange"
},
"canvasResolutionFactor": 0.1
}
},
"MONGODB_URL": "mongodb://127.0.0.1:27017"
}
-----------------------------------
Process YOLO initialized
2020-01-05T10:24:09.844+0000 I NETWORK [listener] connection accepted from 127.0.0.1:33770 #1 (1 connection now open)
> Ready on http://localhost:8080
> Ready on http://172.17.0.2:8080
2020-01-05T10:24:09.878+0000 I NETWORK [conn1] received client metadata from 127.0.0.1:33770 conn1: { driver: { name: "nodejs", version: "3.2.5" }, os: { type: "Linux", name: "linux", architecture: "arm64", version: "4.9.140-tegra" }, platform: "Node.js v10.16.3, LE, mongodb-core: 3.2.5" }
2020-01-05T10:24:09.915+0000 I STORAGE [conn1] createCollection: opendatacam.recordings with generated UUID: 0b545873-c40f-4232-8803-9c7c0cbd0ec4
2020-01-05T10:24:09.917+0000 I NETWORK [listener] connection accepted from 127.0.0.1:33772 #2 (2 connections now open)
Success init db
2020-01-05T10:24:09.919+0000 I NETWORK [conn2] received client metadata from 127.0.0.1:33772 conn2: { driver: { name: "nodejs", version: "3.2.5" }, os: { type: "Linux", name: "linux", architecture: "arm64", version: "4.9.140-tegra" }, platform: "Node.js v10.16.3, LE, mongodb-core: 3.2.5" }
2020-01-05T10:24:09.969+0000 I INDEX [conn1] build index on: opendatacam.recordings properties: { v: 2, key: { dateEnd: -1 }, name: "dateEnd_-1", ns: "opendatacam.recordings" }
2020-01-05T10:24:09.969+0000 I INDEX [conn1] building index using bulk method; build may temporarily use up to 500 megabytes of RAM
2020-01-05T10:24:09.971+0000 I INDEX [conn1] build index done. scanned 0 total records. 0 secs
2020-01-05T10:24:09.971+0000 I STORAGE [conn2] createCollection: opendatacam.tracker with generated UUID: 58e46bc1-6f22-4b3b-9e3f-42201351e5b4
2020-01-05T10:24:10.040+0000 I INDEX [conn2] build index on: opendatacam.tracker properties: { v: 2, key: { recordingId: 1 }, name: "recordingId_1", ns: "opendatacam.tracker" }
2020-01-05T10:24:10.040+0000 I INDEX [conn2] building index using bulk method; build may temporarily use up to 500 megabytes of RAM
2020-01-05T10:24:10.042+0000 I INDEX [conn2] build index done. scanned 0 total records. 0 secs
2020-01-05T10:24:10.043+0000 I COMMAND [conn2] command opendatacam.$cmd command: createIndexes { createIndexes: "tracker", indexes: [ { name: "recordingId_1", key: { recordingId: 1 } } ], lsid: { id: UUID("afed3446-90a2-4a09-b03b-ba2e9e3aa76f") }, $db: "opendatacam" } numYields:0 reslen:114 locks:{ Global: { acquireCount: { r: 2, w: 2 } }, Database: { acquireCount: { w: 2, W: 1 }, acquireWaitCount: { w: 1 }, timeAcquiringMicros: { w: 47016 } }, Collection: { acquireCount: { w: 2 } } } storage:{} protocol:op_msg 118ms
Process YOLO started
{ OPENDATACAM_VERSION: '2.1.0',
PATH_TO_YOLO_DARKNET: '/darknet',
VIDEO_INPUT: 'usbcam',
NEURAL_NETWORK: 'yolov3-tiny',
VIDEO_INPUTS_PARAMS:
{ file: 'opendatacam_videos/demo.mp4',
usbcam:
'v4l2src device=/dev/video0 ! video/x-raw, framerate=30/1, width=640, height=360 ! videoconvert ! appsink',
usbcam_no_gstreamer: '-c 0',
experimental_raspberrycam_docker:
'v4l2src device=/dev/video2 ! video/x-raw, framerate=30/1, width=640, height=360 ! videoconvert ! appsink',
raspberrycam_no_docker:
'nvarguscamerasrc ! video/x-raw(memory:NVMM),width=1280, height=720, framerate=30/1, format=NV12 ! nvvidconv ! video/x-raw, format=BGRx, width=640, height=360 ! videoconvert ! video/x-raw, format=BGR ! appsink',
remote_cam:
'YOUR IP CAM STREAM (can be .m3u8, MJPEG ...), anything supported by opencv' },
VALID_CLASSES: [ '*' ],
DISPLAY_CLASSES:
[ { class: 'bicycle', icon: '1F6B2.svg' },
{ class: 'person', icon: '1F6B6.svg' },
{ class: 'truck', icon: '1F69B.svg' },
{ class: 'motorbike', icon: '1F6F5.svg' },
{ class: 'car', icon: '1F697.svg' },
{ class: 'bus', icon: '1F68C.svg' } ],
PATHFINDER_COLORS:
[ '#1f77b4',
'#ff7f0e',
'#2ca02c',
'#d62728',
'#9467bd',
'#8c564b',
'#e377c2',
'#7f7f7f',
'#bcbd22',
'#17becf' ],
COUNTER_COLORS:
{ yellow: '#FFE700',
turquoise: '#A3FFF4',
green: '#a0f17f',
purple: '#d070f0',
red: '#AB4435' },
NEURAL_NETWORK_PARAMS:
{ yolov3:
{ data: 'cfg/coco.data',
cfg: 'cfg/yolov3.cfg',
weights: 'yolov3.weights' },
'yolov3-tiny':
{ data: 'cfg/coco.data',
cfg: 'cfg/yolov3-tiny.cfg',
weights: 'yolov3-tiny.weights' },
'yolov2-voc':
{ data: 'cfg/voc.data',
cfg: 'cfg/yolo-voc.cfg',
weights: 'yolo-voc.weights' } },
TRACKER_ACCURACY_DISPLAY:
{ nbFrameBuffer: 300,
settings:
{ radius: 3.1,
blur: 6.2,
step: 0.1,
gradient: [Object],
canvasResolutionFactor: 0.1 } },
MONGODB_URL: 'mongodb://127.0.0.1:27017' }
layer filters size input output
0 (node:55) [DEP0001] DeprecationWarning: OutgoingMessage.flush is deprecated. Use flushHeaders instead.
conv 16 3 x 3 / 1 416 x 416 x 3 -> 416 x 416 x 16 0.150 BF
1 max 2 x 2 / 2 416 x 416 x 16 -> 208 x 208 x 16 0.003 BF
2 conv 32 3 x 3 / 1 208 x 208 x 16 -> 208 x 208 x 32 0.399 BF
3 max 2 x 2 / 2 208 x 208 x 32 -> 104 x 104 x 32 0.001 BF
4 conv 64 3 x 3 / 1 104 x 104 x 32 -> 104 x 104 x 64 0.399 BF
5 max 2 x 2 / 2 104 x 104 x 64 -> 52 x 52 x 64 0.001 BF
6 conv 128 3 x 3 / 1 52 x 52 x 64 -> 52 x 52 x 128 0.399 BF
7 max 2 x 2 / 2 52 x 52 x 128 -> 26 x 26 x 128 0.000 BF
8 conv 256 3 x 3 / 1 26 x 26 x 128 -> 26 x 26 x 256 0.399 BF
9 max 2 x 2 / 2 26 x 26 x 256 -> 13 x 13 x 256 0.000 BF
10 conv 512 3 x 3 / 1 13 x 13 x 256 -> 13 x 13 x 512 0.399 BF
11 max 2 x 2 / 1 13 x 13 x 512 -> 13 x 13 x 512 0.000 BF
12 conv 1024 3 x 3 / 1 13 x 13 x 512 -> 13 x 13 x1024 1.595 BF
13 conv 256 1 x 1 / 1 13 x 13 x1024 -> 13 x 13 x 256 0.089 BF
14 conv 512 3 x 3 / 1 13 x 13 x 256 -> 13 x 13 x 512 0.399 BF
15 conv 255 1 x 1 / 1 13 x 13 x 512 -> 13 x 13 x 255 0.044 BF
16 yolo
By now, you should be able to access the OpenDataCam UI at http://IP-ADDRESS:8080
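If the page does not load, a quick sanity check from another machine on the same network is to confirm the web server is responding; the docker ps output above shows port 8080 published on all interfaces. Replace IP-ADDRESS with your Jetson Nano's IP.
# expect an HTTP response header block from the OpenDataCam web UI
curl -I http://IP-ADDRESS:8080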
References:
- https://github.com/opendatacam/opendatacam/blob/master/documentation/jetson/JETSON_NANO.md
- https://github.com/collabnix/pico/blob/master/onprem/yolo/README.md