<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8">
<title>Yisu Fang's FYP Poster Page</title>
<style>
/* Reset and base styles */
* {
box-sizing: border-box;
margin: 0;
padding: 0;
}
@font-face {
font-family: 'Marcellus';
src: url('Marcellus-Regular.ttf') format('truetype');
font-weight: normal;
font-style: normal;
}
@font-face {
font-family: 'Reddit Sans';
src: url('reddit-sans-latin-500-normal.ttf') format('truetype');
font-weight: normal;
font-style: normal;
}
@font-face {
font-family: 'Reddit Sans';
src: url('reddit-sans-latin-500-italic.ttf') format('truetype');
font-weight: normal;
font-style: italic;
}
@font-face {
font-family: 'Reddit Sans';
src: url('reddit-sans-latin-900-normal.ttf') format('truetype');
font-weight: bold;
font-style: normal;
}
@font-face {
font-family: 'Reddit Sans';
src: url('reddit-sans-latin-900-italic.ttf') format('truetype');
font-weight: bold;
font-style: italic;
}
body {
font-family: 'Reddit Sans', sans-serif;
line-height: 1.6;
color: #333;
background-color: #f0f0f0;
}
.responsive-image {
max-width: 100%;
height: auto;
}
/* Typography */
h1, h2, h3, h4 {
font-weight: 600;
margin-bottom: 1rem;
color: #2c3e50;
}
/* Centering the <quotation> section */
quotation {
display: block; /* Custom elements are inline by default */
margin: 0 auto; /* Center the block horizontally */
max-width: 80%; /* Adjust the percentage as needed */
padding: 1rem; /* Reduced padding */
background-color: #f5f5f5;
border-radius: 0.5rem;
box-shadow: 0 0 10px rgba(0, 0, 0, 0.1);
text-align: center; /* Added text-align: center; here */
}
/* Ensuring proper line spacing */
quotation p {
line-height: 1.5; /* Adjust the value as per your preference */
margin: 0; /* Reset margin to prevent extra space */
font-size: 16px; /* Adjust font size as needed */
}
#container {
max-width: 1024px;
margin: 0 auto;
}
/* Layout */
.container {
max-width: 1200px;
margin: 0 auto;
padding: 1rem;
}
.container img {
display: block; /* This will make the image a block-level element */
margin: 0 auto; /* This will center the image horizontally */
}
/* Top bar */
.top-bar {
background-color: #34495e;
color: #fff;
padding: 1rem;
position: sticky;
top: 0;
z-index: 1;
display: flex;
justify-content: space-between;
align-items: center;
}
.top-bar h2 {
color: #fff;
margin-bottom: 0;
}
.top-bar select {
background-color: #2c3e50;
color: #fff;
border: none;
padding: 0.5rem;
font-size: 1rem;
cursor: pointer;
}
/* Sections */
section {
margin-bottom: 2rem;
padding: 2rem;
border-radius: 5px;
background-color: #fff;
box-shadow: 0 2px 4px rgba(0, 0, 0, 0.1);
}
/* Marcellus is loaded locally via the @font-face rule above;
   an @import here would be ignored, as @import is only valid
   at the top of a stylesheet. */
h2 {
font-size: 32px;
font-family: 'Marcellus', serif;
}
section h5 {
font-size: 16px;
font-family: 'Marcellus', serif;
display: inline-block; /* This will treat the h5 elements as inline-block elements */
text-align: center; /* This will center the text within the h5 elements */
}
/* Cards */
.card {
background-color: #f9f9f9;
border-radius: 5px;
padding: 1rem;
margin-bottom: 1rem;
box-shadow: 0 2px 4px rgba(0, 0, 0, 0.1);
}
blockquote {
margin: 0 auto; /* This will center the blockquote horizontally */
max-width: 80%; /* Adjust as needed */
text-align: center; /* Center align the text within the blockquote */
}
/* Buttons */
button {
background-color: #2c3e50;
color: #fff;
border: none;
padding: 0.5rem 1rem;
font-size: 1rem;
cursor: pointer;
border-radius: 5px;
transition: background-color 0.3s;
}
button:hover {
background-color: #34495e;
}
/* Animations */
@keyframes fadeIn {
from {
opacity: 0;
}
to {
opacity: 1;
}
}
figure {
text-align: center;
margin: 0 auto;
max-width: 100%;
}
figure img {
display: block;
margin: 0 auto;
max-width: 100%;
height: auto;
}
figcaption {
text-align: center;
font-style: italic;
margin-top: 0.5rem;
}
.fade-in {
animation: fadeIn 0.5s ease-in-out;
}
.content-menu {
position: fixed;
left: 20px;
top: 200px;
width: 200px;
background-color: #f1f1f1;
padding: 20px;
border-radius: 5px;
box-shadow: 0 2px 5px rgba(0, 0, 0, 0.3);
}
.content-menu ul {
list-style-type: none;
padding: 0;
}
.content-menu li {
margin-bottom: 10px;
}
.content-menu a {
display: block;
color: #333;
text-decoration: none;
padding: 5px 10px;
border-radius: 3px;
transition: background-color 0.3s ease;
}
.content-menu a:hover {
background-color: #ddd;
}
/* Plaque */
.plaque {
display: flex;
justify-content: center;
align-items: center;
height: 100vh;
background-color: #34495e;
color: #fff;
text-align: center;
padding: 2rem;
}
.plaque-content {
max-width: 800px;
padding: 2rem;
background-color: #2c3e50;
border-radius: 10px;
box-shadow: 0 0 20px rgba(0, 0, 0, 0.3);
}
.plaque h1,
.plaque h2,
.plaque h3,
.plaque h4 {
color: #fff;
margin-bottom: 1rem;
}
.line {
height: 1px;
background-color: #fff;
margin: 1rem 0;
}
.button {
display: inline-block;
padding: 10px 20px;
background-color: #4CAF50; /* Green */
color: white;
text-decoration: none;
border: none;
border-radius: 4px;
cursor: pointer;
transition: background-color 0.3s ease;
}
.button:hover {
background-color: #45a049;
}
/* Poster Section */
.poster-section {
margin-bottom: 2rem;
padding: 2rem;
border-radius: 5px;
background-color: #f4f4f4;
box-shadow: 0 2px 4px rgba(0, 0, 0, 0.1);
}
.subsection {
margin-bottom: 1rem;
}
.subsection h3 {
font-size: 1.2rem;
margin-bottom: 0.5rem;
}
.popup {
display: none;
position: fixed;
z-index: 1;
left: 0;
top: 0;
width: 100%;
height: 100%;
overflow: auto;
background-color: rgba(0, 0, 0, 0.4);
}
.popup-content {
background-color: #fefefe;
margin: 10% auto;
padding: 20px;
border: 1px solid #888;
width: 50%;
}
.close-btn {
color: #aaa;
float: right;
font-size: 28px;
font-weight: bold;
}
.close-btn:hover,
.close-btn:focus {
color: black;
text-decoration: none;
cursor: pointer;
}
</style>
</head>
<body>
<nav class="content-menu">
<ul>
<li><a href="#section0">Credits</a></li>
<li><a href="#aest">Aesthetics</a></li>
<li><a href="#intro">Introduction</a></li>
<li><a href="#section1">CV</a></li>
<li><a href="#SLAM">SLAM</a></li>
<li><a href="#section2">WSN</a></li>
<li><a href="#section4">RFID</a></li>
<li><a href="#section3">Assembly</a></li>
<li><a href="#ack">Acknowledgements</a></li>
</ul>
</nav>
<div class="top-bar">
<h2><em>Argyrorrhyton:</em>
Autonomous Houseplant Irrigation Robot</h2>
<nav>
</nav>
</div>
<section id="section0" class="plaque">
<div class="plaque-content">
<h2>Xi'an-Jiaotong Liverpool University <br> School of Internet of Things</h2>
<div class="line"></div>
<h4>
Student:<br />
Yisu Fang <br>2033451
</h4>
<div class="line"></div>
<h4>
Supervisors:<br />
Dr. Hadyan Hafizh<br />
Dr. Muhammad Ateeq
</h4>
<div class="line"></div>
<h4>
Procurement:<br />
Yingchao Lyu
</h4>
<div class="line"></div>
<h4>
Special Thanks:<br />
Prof. Miguel B. Nunes
</h4>
<div class="line"></div>
<h4>
Dean:<br />
Dr. Matilda Isaac
</h4>
</div>
</section>
<div id="container">
<section id="etym">
<h2>Etymology</h2>
<section>
<h1><em>Argyrorrhyton</em> - the Silver Rhyton Cup</h1>
<p>Per my personal naming tradition, my software projects are codenamed in Ancient Greek or Latin, usually with liturgical connotations.</p><br>
<p>The project is the third iteration of an irrigation robot project.</p><br>
<p>It is named after the Ancient Greek word "<em>argyrorrhyton</em>" (ἀργυρόρρυτον), which is a compound word derived from two components:</p>
<p>The first part of the word, "<em>argyros</em>," is an adjective meaning "silvery" or "made of silver."</p>
<p>The second part, "<em>rhyton</em>," is a noun referring to a specific type of ancient Greek libation vessel. A <em>rhyton</em> was a distinctive container made from ceramic, metal, or even horn, with one end shaped like an animal's head or a similar design, and the other end being the opening from which the liquid was consumed.</p><br>
<p>Therefore, the word "<em>argyrorrhyton</em>" (ἀργυρόρρυτον) literally means "a silver <em>rhyton</em>" or "a <em>rhyton</em> made of silver." It refers to a particular type of ancient Greek drinking vessel crafted from silver.</p><br>
<p>The word can also be analyzed as a compound adjective of "<em>argyros</em>" (silver) and "<em>rhytos</em>" (flowing or pouring), declined in the neuter gender.</p>
<img src="photo/1714729544080.png" alt="img">
<p>The source material shown is a citation from <em>The Liddell, Scott, Jones Ancient Greek Lexicon</em> (LSJ).</p><p>The word <em>argyrorrhytos, -on</em> is attested in Euripides' play "Helen", line 386, describing a location "beside a silver stream", specifically referencing the banks of the river Hebrus.</p>
<p><a href="https://lsj.gr/wiki/%E1%BC%80%CF%81%CE%B3%CF%85%CF%81%CF%8C%CF%81%CF%81%CF%85%CF%84%CE%BF%CF%82">H. G. Liddell, R. Scott, and H. S. Jones, "ἀργυρόρρυτος," A Greek-English Lexicon.</a></p>
</section>
<section id="aest"> <h2>Aesthetics</h2>
<p>I would like to thank Ms. Longdan Chen, M.A., of Loughborough University, and Mr. Wenhao Huang of the University of Liverpool, for providing helpful feedback on my design aesthetics.</p><br>
<h1>Graphics Design</h1>
<!--<a href="art.html" class="button"></a>-->
<!--<img src="photo/aegisona.jpg" alt="img" class="responsive-image">-->
<figure>
<img src="photo/title.png" alt="img">
<figcaption>Project emblem, featuring halftone and fluid typography.</figcaption>
</figure><figure>
<img src="photo/argp2.jpg" alt="img">
<figcaption>Dedication plaque design</figcaption>
</figure><figure>
<img src="photo/argp1.jpg" alt="img">
<figcaption>Dedication plaque</figcaption>
</figure><figure>
<img src="photo/poster.jpg" alt="img">
<figcaption>The poster is designed in a mildly acid style, featuring a collage of ephemera.</figcaption>
</figure><figure>
<img src="photo/pos1.jpg" alt="img">
<figcaption>The poster is designed with Adobe Illustrator.</figcaption>
</figure><figure>
<img src="photo/pos2.jpg" alt="img">
<figcaption>The poster is designed with Adobe Photoshop.</figcaption>
</figure><figure>
<img src="photo/pos3.jpg" alt="img">
<figcaption>This webpage is designed with Adobe Dreamweaver.</figcaption>
</figure>
<br>
<h1>Industrial Design</h1>
<figure>
<img src="photo/c4d1.jpg" alt="img">
<figcaption>The robot layout is designed with Maxon Cinema 4D.</figcaption>
</figure><figure>
<img src="photo/sw1.png" alt="img">
<figcaption>The robot is partially designed with Solidworks.</figcaption>
</figure>
<br><br> <p>I could really use a B.A. degree.</p>
</section>
<section id="intro">
<h2>Introduction</h2>
<blockquote>
—'O stream!<br>
Whose source is inaccessibly profound,<br>
Whither do thy mysterious waters tend?<br>
Thou imagest my life.
<br><br>
<em>—— Alastor, or the Spirit of Solitude</em>, Percy Bysshe Shelley </blockquote><br><br>
<h1>Technology</h1>
<p>The convergence of Light Detection and Ranging (LIDAR), Simultaneous Localization and Mapping (SLAM), Robot Operating System 2 (ROS2), and lightweight machine vision like YOLOv8 is enabling a new era of cost-effective, intelligent robotic solutions. This project presents <em>"Argyrorrhyton"</em> - an autonomous robot that integrates these cutting-edge technologies to navigate indoor office environments and autonomously locate and water houseplants.</p>
<p>Harnessing LIDAR for precision mapping and SLAM algorithms for localization, <em>Argyrorrhyton</em> constructs 2D environmental maps to autonomously navigate corridors and rooms. The open-source ROS2 framework provides a modular software architecture, integrating sensor data, control algorithms, and IoT connectivity. Onboard YOLOv8 enables efficient vision-based detection of houseplants in real-life scenarios on embedded hardware. In addition, an encrypted wireless data interchange network is established between the server computer and the microcontrollers to provide electrical isolation between locomotion and control systems.</p>
<br>
<h1>Methodology</h1>
<p>The project went through three iterations.</p><br>
<figure>
<img src="photo/hydp2.jpg" alt="img">
<figcaption>The <em>Hydrophylax</em>, an IOT205 coursework project and precursor to this project. It is heavily flawed, fitted with primitive analogue vision technology, and has since been dismantled and scavenged for parts.</figcaption>
</figure>
<figure>
<img src="photo/hydp1.jpg" alt="img">
<figcaption>The <em>Hydrophylax</em>'s computer vision is processed remotely on a laptop computer, using YOLOv5.</figcaption>
</figure>
<figure>
<img src="photo/hydp3.jpg" alt="img">
<figcaption>The <em>Hydrophylax</em> is briefly seen in an XEC promotional video, photo taken in IBSS.</figcaption>
</figure>
<figure>
<img src="photo/prog.jpg" alt="img">
<figcaption>The <em>Siderorrhyton</em>, a test bed for CV and SLAM algorithms. Its software is transplanted entirely onto the Argyrorrhyton as the Ombropompon daemon. The vehicle is now dismantled and kept as a back-up to the project.</figcaption>
</figure><figure>
<img src="photo/argr1.jpg" alt="img">
<figcaption>The <em>Argyrorrhyton</em>, third iteration of an irrigation robot.</figcaption>
</figure>
<figure>
<img src="photo/rabd1.jpg" alt="img">
<figcaption>The <em>Rhabdomantis</em>, a controller unit featuring a CRT screen. It is used to manually override navigation controls and also serves as a standalone remote controller, over the Hieroglossa mesh.</figcaption>
</figure>
<figure>
<img src="photo/argr2.jpg" alt="img">
<figcaption>Presented without commentary.</figcaption>
</figure>
<figure>
<img src="photo/argr4.jpg" alt="img">
<figcaption>Presented without commentary.</figcaption>
</figure>
<figure>
<img src="photo/argr3.jpg" alt="img">
<figcaption>Presented without commentary.</figcaption>
</figure>
<figure>
<img src="photo/argr5.jpg" alt="img">
<figcaption>Presented without commentary.</figcaption>
</figure>
</section>
<!-- Sections -->
<section id="section1">
<h2>Vision System: <em>Phytomantis</em></h2>
<blockquote>
The clouds seem colourless, and even joy is rather sorrowful there; but fountains of fresh water spring out of the rocks, and the eyes of the young girls are like the green fountains in which, with their beds of waving herbs, the sky is mirrored. <br><br>
<em>—— Prayer on the Acropolis</em>, Ernest Renan </blockquote><br><br>
<h1>Implementing Computer Vision with MobileNetV2 SSD</h1>
<p>In my FYP proposal, MobileNetV2 SSD was envisioned as the computer vision model, citing its light weight as suitable for deployment on edge devices. However, a preliminary MobileNetV2 SSD inference model, trained on a dataset of 32 images courtesy of <em>Edgeimpulse.com</em>, proved inconvenient to use: it hallucinated detections, <em>Edgeimpulse</em> only provides free CPU training, and the wrapper function is written in C++ rather than Python. These restrictions discouraged me from using MobileNetV2 SSD as the vision algorithm for the project. </p>
<figure>
<img src="photo/fomo1.jpg" alt="img">
<figcaption><em>Edgeimpulse.com</em> provides online training using the MobileNetV2 SSD model.</figcaption>
</figure>
<figure>
<img src="photo/fomo2.jpg" alt="img">
<figcaption>A trained demonstration model hallucinates extensively and is deemed unsafe to use.</figcaption>
</figure>
<p><br>Instead, I resorted to YOLOv8, a potent vision model maintained by <em>Ultralytics</em> that can be trained locally on a GPU using standard YOLO datasets on the PyTorch framework.</p>
<h1>Implementing Computer Vision with YOLOv8n</h1>
<p>To enable real-time perception of houseplants for targeted irrigation, an on-board vision system utilizing the lightweight YOLOv8n object detection model was developed. A dataset of 914 houseplant images of <em>Aspidistra</em> and <em>Epipremnum</em> species was collected by extracting frames from videos recorded by a robotic vehicle in indoor environments of a local university, simulating real-life operating conditions. The dataset was augmented to 2113 images, with 1893 for training and 190 for validation, using automated labeling on <em>Roboflow.com</em> and preprocessing techniques such as resizing and noise introduction. Transfer learning was employed by initializing YOLOv8n with pre-trained weights, and the model was trained locally. However, overfitting was observed around the 60th iteration due to the limited dataset size and diversity, highlighting the need for further optimization strategies to improve generalization.</p>
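<p>A YOLO dataset of this kind is conventionally described to the trainer by a small YAML file; a minimal sketch is shown below (the paths and class names are illustrative assumptions, not the project's actual files):</p>

```yaml
# Dataset layout assumed by YOLO-style trainers (illustrative paths)
path: datasets/houseplants
train: images/train   # 1893 training images after augmentation
val: images/val       # 190 validation images
names:
  0: aspidistra
  1: epipremnum
```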
<figure>
<img src="photo/yolo1.png" alt="img">
<figcaption>Auto-labelling with <em>Roboflow.com</em></figcaption>
</figure>
<br>
<figure>
<img src="photo/yolo2.png" alt="img">
<figcaption>Manually adding label instances</figcaption>
</figure>
<br>
<figure>
<img src="photo/yolo3.png" alt="img">
<figcaption>Data augmentation and exporting with <em>Roboflow.com</em></figcaption>
</figure>
<br>
<figure>
<img src="photo/yolo4.png" alt="img">
<figcaption>Training is done on a GTX1650Ti GPU and monitored with Tensorboard. <br>Early stopping is eventually invoked to prevent overfitting.</figcaption>
</figure><br>
<figure>
<img src="photo/resized_Screenshot%20from%202024-05-07%2019-18-04.png" alt="img">
<figcaption>Real-time object recognition running on robot, on an i5-4670T CPU, at ~60ms per frame (~17FPS)</figcaption>
</figure><br>
</section>
<section id="SLAM">
<h2>Navigation System: <em>Ombropompon</em></h2>
<blockquote>
"Yet mark those trees, two miles away,<br>
All clustered in a clump:<br>
If you could trot there twice a day,<br>
Nor ever pause for rest or play.<br>
In the far future — Who can say?——<br>
You may be fit to jump." <br><br>
<em>—— Sylvie and Bruno Concluded</em>, Lewis Carroll<br><br>
</blockquote>
<h1>Implementing SLAM and Pathfinding with SLAM Toolbox and Nav2 SMAC (A* Algorithm)</h1>
<p>Precise robot localization and mapping leveraged Google Cartographer SLAM integrated with ROS2, utilizing a planar LIDAR for forward scanning. SLAM maps were visualized in Rviz2 along with the robot's trajectory for monitoring performance. The vision system's polar houseplant coordinates were transformed to Cartesian positions on the SLAM-derived maps. These global houseplant locations served as waypoints for ROS2 Nav2's SMAC planner to compute obstacle-free navigation routes across the field, dynamically adjusting with new map data, enabling autonomous precision irrigation of detected plants.</p>
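<p>The polar-to-Cartesian transform described above amounts to rotating each detection's bearing by the robot's yaw and translating by its SLAM pose. A minimal sketch (the function and parameter names are illustrative, not the project's actual code):</p>

```python
import math

def plant_to_map(range_m, bearing_rad, robot_x, robot_y, robot_yaw):
    """Project a detection given in the robot's polar frame (range, bearing)
    into Cartesian map coordinates, using the robot's SLAM pose."""
    # Rotate the bearing into the map frame, then translate by the robot pose.
    theta = robot_yaw + bearing_rad
    return (robot_x + range_m * math.cos(theta),
            robot_y + range_m * math.sin(theta))
```

<p>The resulting map coordinates can then be handed to the planner as a navigation goal.</p>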
<figure>
<br><img src="photo/slam1.png" alt="img">
<figcaption>LIDAR visualization in Rviz2</figcaption>
</figure>
<figure>
<br><img src="photo/resized_slam3.jpg" alt="img">
<figcaption>Multiple driver issues, configuration and incompatibility errors were resolved to make SLAM work as intended</figcaption>
</figure>
<figure>
<br><img src="photo/resized_Screenshot%20from%202024-05-07%2018-19-42.png" alt="img">
<figcaption>Transform node topology and SLAM</figcaption>
</figure>
<figure>
<br><img src="photo/resized_Screenshot%20from%202024-05-05%2016-08-03.png" alt="img">
<figcaption>Costmap generation using Nav2</figcaption>
</figure>
</section>
<section id="section2">
<h2>Encrypted Ad-hoc WSN Data Interchange System: <em>Hieroglossa</em></h2>
<blockquote>
By the Nine Gods he swore it,<br>
And named a trysting day,<br>
And bade his messengers ride forth,<br>
East and west and south and north,<br>
To summon his array.<br><br>
<em>—— Lays of Ancient Rome</em>, Thomas Babington Macaulay </blockquote><br><br>
<p>
To ensure secure and reliable communication between the mainframe computer and distributed microcontrollers, an encrypted wireless sensor network (WSN) architecture was implemented, codenamed <em>Hieroglossa</em>. As it is an unpublished project under active development, this section describes its functions only briefly.
</p><br>
<div class="subsection">
<h3>Architecture</h3>
<p>
The <em>Hieroglossa</em> WSN comprises the robot's mainframe computer running the daemon, its coordinator node acting as the base station, and several microcontroller nodes interfaced to sensors and actuators such as pumps and motors. This decentralized topology enables modular expansion of functionality, as well as physical isolation from single-point electrical failures. <em>Hieroglossa</em> relies on FreeRTOS running on ESP32 microcontrollers, which publish data over a WiFi mesh network to the coordinator node connected to the Linux mainframe. Individual keys are exchanged after each pairing over ECDH; the data is encrypted with AES-128 and the variables are formatted as JSON.
</p>
</div>
<div class="subsection">
<h3>The Daemon</h3>
<p>
The <em>Hieroglossa</em> daemon decrypts and parses the data packets delivered over the mesh network, reading from the coordinator node through a Linux serial port. It decodes the JSON messages, parses the variables, and broadcasts them as ROS2 topics for other daemons to use. It can also subscribe to ROS2 topics and rebroadcast them as encrypted message strings.
</p>
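<p>The daemon's parsing step can be sketched as follows; the packet fields and topic naming are illustrative assumptions, and the AES-128 decryption and actual ROS2 publishing are omitted:</p>

```python
import json

def parse_hieroglossa_packet(plaintext: str) -> dict:
    """Parse a decrypted Hieroglossa JSON payload into topic/value pairs.
    The 'node' and 'data' field names are illustrative, not the project's
    actual wire format."""
    msg = json.loads(plaintext)
    node = msg["node"]  # originating microcontroller ID
    # Each variable becomes one topic, namespaced by the originating node.
    return {f"/hieroglossa/{node}/{key}": value
            for key, value in msg["data"].items()}
```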
</div> <figure>
<br><img src="photo/hieg1.jpg" alt="img">
<figcaption>AES encryption-decryption demonstration</figcaption>
</figure>
</section>
<section id="section4">
<h2>RFID Houseplant Management Database: <em>Phytognomon</em></h2>
<blockquote>
The wither'd Misses! how they prose<br>
O'er books of travell'd seamen,<br>
And show you slips of all that grows<br>
From England to Van Diemen.
<br><br>
<em>—— Amphion</em>, Alfred Tennyson </blockquote><br><br>
<p>Phytognomon is the RFID reader and database system recording information about irrigated plant instances. The database daemon records the time, location (ROS2 odometry transform) and tag UID of each irrigated plant in CSV format. It is accompanied by a GUI to view and manipulate the data. <br>It will be further integrated with the other daemons to cooperate in decision-making.</p>
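<p>The CSV logging step can be sketched as follows; the column names and tag format are illustrative assumptions, not the project's actual schema:</p>

```python
import csv
import io
from datetime import datetime, timezone

def log_irrigation(writer, tag_uid, x, y):
    """Append one irrigation event: UTC timestamp, odometry pose, RFID tag UID."""
    writer.writerow([datetime.now(timezone.utc).isoformat(),
                     f"{x:.3f}", f"{y:.3f}", tag_uid])

# Usage: write to an in-memory buffer (a real daemon would append to a file).
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["time_utc", "x_m", "y_m", "tag_uid"])  # header row
log_irrigation(writer, "04A1B2C3", 1.25, -0.4)
```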
<figure>
<br><img src="photo/resized_phyt1.jpg" alt="img">
<figcaption>Phytognomon GUI is integrated with Phytomantis in constructing a database of irrigated plants.</figcaption>
</figure>
<figure>
<br><img src="photo/resized_Screenshot from 2024-05-08 20-17-03.png" alt="img">
<figcaption>Running Phytomantis, Ombropompon & Phytognomon daemons together onboard the robot slightly impacts CV performance (~10FPS).</figcaption>
</figure>
<figure>
<br><img src="photo/resized_phyt2.jpg" alt="img">
<figcaption>The RFID reader is connected to the Argyrorrhyton mainframe via an RS232-USB converter.</figcaption>
</figure>
</section>
<section id="section3">
<h2>Hardware Assembly</h2>
<figure>
<br><img src="photo/ag2.jpg" alt="img">
<figcaption>Siderorrhyton and Argyrorrhyton</figcaption>
</figure><figure>
<br><img src="photo/ag3.jpg" alt="img">
<figcaption>Testing the refill pump</figcaption>
</figure><figure>
<br><img src="photo/ag6.jpg" alt="img">
<figcaption>Mounting the overflow sensor</figcaption>
</figure>
<figure>
<br><img src="photo/ag4.jpg" alt="img">
<figcaption>Building the pylons and the superstructure</figcaption>
</figure><figure>
<br><img src="photo/ag5.jpg" alt="img">
<figcaption>Soldering the wires</figcaption>
</figure>
</section>
<section id="ack">
<h2>Acknowledgements</h2>
<blockquote>
Revering, supplicating for divine patronage<br>
仰 祈 靈 澤
<br><br>
<em>—— The Book of Southern Qi</em> 《南齊書》 </blockquote><br>
<br>
<p>I would like to extend my gratitude to all the people who have helped me during the writing of this dissertation.</p><br>
<p>My sincere gratitude goes first and foremost to my supervisor, Dr. Hadyan Hafizh, without whose devotion this dissertation would not have been possible. He helped me revise my writing and oversaw the experiments, especially during the winter holidays. His dedication to my project kept me confident throughout my Final-year Project.</p>
<br>
<p>I am profoundly grateful to Dr. Muhammad Ateeq for his willingness to take over the supervision of my project and guide my writing during the final crucial month. His timely intervention and invaluable mentorship following Dr. Hafizh's resignation ensured that I received the necessary support and direction to bring this dissertation to fruition.</p>
<br>
<p>I would like to extend my sincere gratitude to Prof. Miguel Baptista Nunes, the esteemed former Dean of the School of Internet of Things, for his invaluable support and contributions to my research project. His generous allocation of research grants provided the crucial financial backing needed to bring my project to fruition. His vision and support have played a pivotal role in turning my ideas into tangible reality.</p>
<br>
<p>I am forever indebted to my father, Fang Tao, and my mother, Gao Yue'e, for their unwavering financial support and emotional encouragement throughout my college years. Their sacrifices and belief in me provided the foundation for this achievement. I would also like to commemorate my late paternal grandfather, Fang Guocheng, a retired headmaster whose passion for education inspired me from an early age. His dedicated involvement in nurturing my academic pursuits since elementary school left an indelible mark. While he is no longer with us, I hope the tiding of my graduation brings him solace and pride in the Great Beyond. This milestone is a testament to the values he instilled in me. Their collective love, guidance, and provisions have made this journey possible. I am eternally grateful for my family's role in shaping who I am today.</p>
<br>
<p>I am grateful for the guidance from all the faculty members of the School of Internet of Things. I find their teachings highly practical and relevant to my project, especially:</p><br>
<ul>
<li>Dr. Dong Yuji, whose lessons helped me conceptualize the idea of an irrigation robot.</li>
<li>Dr. Karim Moussa, for his knowledge of Information Security.</li>
<li>Dr. Matilda Isaac, for her knowledge of Data Pipelines.</li>
<li>Dr. Oh Bong-Hwan, for his knowledge of Networking Protocols.</li>
<li>Dr. Zhang Wenzhang, for her knowledge of Wireless Sensor Networks.</li>
</ul>
<br>
<p>I would like to thank the Laboratory Technicians:</p><br>
<ul>
<li>Ms. Lyu Yingchao, especially, for personally taking care of my project and helping me process my requisition orders.</li>
<li>Ms. Ge Shurong, Mr. Liu Changli, Ms. Xu Jie, for helping with laboratory maintenance.</li>
</ul>
<br>
<p>I would like to thank the Teaching Assistants of the IOT School:</p><br>
<ul>
<li>Mr. Goonjur Medhav Kumar, Mr. Huang Sida, Ms. Yao Xueyan, Mr. Zhang Shiyao.</li>
</ul>
<br>
<p>I am also grateful to the janitors, electricians, property management and security officers of the XJTLU campus building, whose names I do not know.</p>
<br>
<p>In addition to the people mentioned above, I am immensely grateful for the support and companionship of my other dear friends, who have been an integral part of this journey. Their presence has been a constant source of consolation in my depths of affliction. I would like to extend my heartfelt appreciation to the following individuals (though this list is by no means exhaustive):</p><br>
<p>An Jiabao, Chen Longdan, Chen Ruiyang, Chen Zuyu, Dong Yantong, Du Yucheng, Fang Zhixian, Feng Xiangcheng, Feng Yijia, Gao Shuyue, Gao Xingrui, He Zhiqiao, Heng Zhangyan, Hu Haoqi, Hua Sicheng, Huang Wenhao, Huang Xucheng, Jia Xiao'ang, Jin Gehui, Lei Mengyuan, Lei Xiaohaoyang, Li Jiale, Li Mengyuan, Li Shuang, Li Yikun, Li Zonghan, Lin Quanfu, Liu Yiran, Liu Zhiyu, Liu Zilin, Lyu Yize, Ma Kai, Ni Jiayue, Peng Yingbin, Qi Miao, Qin Haoran, Qing Yu, Qiu Yu, Qu Pengcheng, Shi Yuwei, Song Yuyang, Su Hanxiao, Sun Wenhao, Sun Zuoyu, Tan Lige, Tang Mingyu, Tang Yuqing, Tang Ziyue, Wang Jiayue, Wang Qingshi, Wang Yi, Wang Yiming, Wei Yixuan, Wei Zixiang, Wu Jingheng, Wu Yuesiyu, Xie Hailin, Xu Baixiang, Xu Jiang, Yang Dongyi, Yuan Jiawei, Zha Siyu, Zhang Bo, Zhang Hongbin, Zhang Ziming, Zhao Juanyi, Zhao Tianshi, Zhao Yuyan, Zhou Ziyu, Zhu Lenghan, Zhu Ruilin, Zhu Xiaoqing, Zuo Hongbo.</p>
<br>
<p>And countless others who have touched my life in countless ways.</p>
<br>
<p>Finally, I extend my sincere gratitude to you, the reader, for taking the time to engage with this dissertation. Your presence and attention are truly an honor. I am deeply appreciative of the patience and dedication you have invested in exploring my work. It is a humbling experience to have my research and ideas considered by scholars, peers, and intellectually curious individuals such as yourself. Thank you for being a part of this journey and for allowing me to share the culmination of my efforts with you.</p>
</section><p>Yisu Fang <br>方 奕甦</p>
<script>
// Scroll to a section, offsetting for the sticky top bar.
function scrollToSection(sectionId) {
  if (sectionId) {
    const section = document.querySelector(sectionId);
    if (section) {
      const header = document.querySelector('.top-bar');
      const headerHeight = header.offsetHeight;
      const sectionTop = section.getBoundingClientRect().top + window.pageYOffset;
      const scrollPosition = sectionTop - headerHeight - 50;
      window.scrollTo({
        top: scrollPosition,
        behavior: 'smooth'
      });
    }
  }
}
// Wire the content menu links to the offset scroll, so section headings
// are not hidden behind the sticky top bar.
document.querySelectorAll('.content-menu a').forEach(function (link) {
  link.addEventListener('click', function (event) {
    event.preventDefault();
    scrollToSection(link.getAttribute('href'));
  });
});
</script>
</body>
</html>