Machine Vision and Navigation (eBook)
851 pages
Springer International Publishing (publisher)
978-3-030-22587-2 (ISBN)
This book presents a variety of perspectives on vision-based applications, focusing on optoelectronic sensors, 2D and 3D machine vision technologies, robot navigation, control schemes, motion controllers, intelligent algorithms, and vision systems. The authors address applications in unmanned aerial vehicles, autonomous and mobile robots, industrial inspection, and structural health monitoring. The book also covers recent advanced research in measurement and other areas where 2D and 3D machine vision and machine control play an important role, as well as surveys and reviews of vision-based applications. These topics are of interest to readers from diverse fields, including electrical, electronics, and computer engineering, as well as technologists, students, and non-specialist readers.
• Presents current research in image and signal sensors, methods, and 2D and 3D technologies in vision-based theories and applications;
• Discusses applications in everyday devices and robotics, including detection, tracking and stereoscopic vision systems, pose estimation, object avoidance, control and data exchange for navigation, and aerial imagery processing;
• Includes research contributions in scientific, industrial, and civil applications.
Preface 5
An Overview of Machine Vision and Navigation 6
Acknowledgment 15
Contents 16
Contributors 20
Abbreviations 26
Part I Image and Signal Sensors 35
1 Image and Signal Sensors for Computing and Machine Vision: Developments to Meet Future Needs 36
Acronyms 36
1.1 Introduction 37
1.1.1 Image Sensing in Machine Vision Systems 37
1.1.2 Image Capture by Digital Cameras 38
1.1.3 Performance Metrics of Image Sensor Photodiodes 40
1.2 Limitations of Current Inorganic-Based Imaging Systems 41
1.2.1 Weak Light Absorption 43
1.2.2 Low Dynamic Range 43
1.2.3 Incompatibility with Complicated Processing and Fabrication on Flexible, Miniaturized Devices 44
1.2.4 Inability to Cope with Illuminant Variation 44
1.2.5 Low Bandgap 45
1.2.6 Crosstalk 45
1.3 Overcoming Limitations of Conventional Imaging Systems Using Alternative Photosensing Materials 45
1.3.1 Organic Photodetectors in Image Sensing 46
1.3.1.1 OPDs Beyond Photodetection 48
1.3.2 Metal Halide Perovskite (MHP)/Organohalide Perovskite (OHP) Photodetectors 53
1.4 Phototransistors 54
1.5 Conclusions and Outlook 57
References 58
2 Bio-Inspired, Real-Time Passive Vision for Mobile Robots 66
Acronyms 66
2.1 Introduction 66
2.2 Related Work 69
2.3 Hardware of the Sensor 70
2.4 Basic Software and Calibration 71
2.4.1 Calibration of the Subsystems 72
2.4.2 Panoramic Images 73
2.4.3 Virtual Camera 75
2.4.4 Calibration Between the Subsystems 76
2.5 Peripheral Vision in the Hybrid Sensor 77
2.5.1 Detection of Objects 77
2.5.2 Tracking of Objects 78
2.5.3 Avoiding Obstacles 79
2.6 Central Vision in the Hybrid Sensor 82
2.7 Experimental Results 83
2.7.1 Peripheral Vision 83
2.7.2 Central Vision 85
2.8 Conclusions 88
References 89
3 Color and Depth Sensing Sensor Technologies for Robotics and Machine Vision 92
Abbreviations 92
3.1 Introduction 93
3.2 3D Image Construction 94
3.2.1 Image Sensor 95
3.2.2 Stereo Vision 95
3.2.3 Shape from Shading 99
3.2.4 Dynamic Vision 101
3.3 Active 3D Imaging 103
3.3.1 Time of Flight 104
3.3.2 Structured Light 106
3.3.3 Shape from Motion 111
3.4 Deep Learning Approaches to 3D Vision 112
3.5 Conclusion 113
References 114
4 Design and Simulation of Array Cells of Mixed Sensor Processors for Intensity Transformation and Analog-Digital Coding in Machine Vision 120
Acronyms 120
4.1 Introduction 121
4.2 Simulation of Array Cells for Image Intensity Transformations and Theoretical Mathematical Background 125
4.2.1 Substantiation of the Need to Design Devices for Parallel Nonlinear Image Intensity Transformations in Self-Learning Equivalent Convolutional Neural Structures (SLECNS) 125
4.2.2 Brief Review and Background of Mathematical Operators, Which Are Implemented by Neurons 129
4.2.3 Mathematical Models of Nonlinear Transformations of Image Intensities 130
4.2.4 Simulation of Array Cells for Image Intensity Transformation 131
4.2.4.1 Simulation of Image Intensity Transformation with Mathcad 131
4.2.4.2 Design and Simulation of Array Cells for Image Intensity Transformation Using OrCad PSpice 133
4.2.4.3 Simulation of Nonlinear Transformation in Analog 64-Input and 81-Input Neuron Equivalentor 139
4.3 Continuous-Logic (CL) Transformation and the Equivalently CL ADC 140
4.3.1 Basic Theoretical Foundations, Equivalence Models, and Their Modification for SMC_CL_ADC 140
4.3.2 Design of CL ADC CM-6 (8) (G): iv (the Iteration Variant) Based on DC-(G) (with Gray Code) 144
4.3.3 Simulating Parallel Conveyor CL_ADC (P_C) Based on Eight 8-DC-(G) with Parallel Serial Output 150
4.4 Conclusions 152
References 158
Part II Detection, Tracking and Stereoscopic Vision Systems 163
5 Image-Based Target Detection and Tracking Using Image-Assisted Robotic Total Stations 164
Abbreviations 164
5.1 Introduction 165
5.2 Principles of Robotic Image-Assisted Total Stations 167
5.2.1 Working Principles of Standard Total Station 170
5.2.1.1 Electronic Distance Measurement 171
5.2.1.2 Electronic Angle Measurement 173
5.3 Automated Reflector-Based Target Recognition and Target Tracking 175
5.3.1 Automated Target Recognition and Detection 175
5.3.1.1 Rough Pointing/Coarse Search 175
5.3.1.2 Fine Pointing/Fine Aiming 176
5.3.2 Target Tracking 178
5.3.3 Time Aspects in Object Tracking 179
5.4 Image-Based Object Recognition, Position Determination, and Tracking 179
5.4.1 Image Processing Fundamentals 180
5.4.2 Image Processing Algorithms for Feature Extraction 181
5.4.2.1 SIFT (Scale-Invariant Feature Transform) Algorithm 182
5.4.2.2 SURF (Speeded-Up Robust Feature) Algorithm 182
5.4.3 Object Recognition and Matching 184
5.4.4 Object Position Determination 186
5.4.5 Principles of Image-Based Object Tracking 187
5.5 Applications 187
5.5.1 Example of Static Object Recognition and Positioning 187
5.5.2 Example of Kinematic Image-Based Object Tracking 192
5.6 Quality Control of Total Stations in Kinematic Mode Using a Laser Tracker 193
5.7 Conclusion 198
References 199
6 The Methods of Radar Detection of Landmarks by Mobile Autonomous Robots 201
Abbreviations 201
6.1 Introduction 201
6.2 The Navigation Problem of Mobile Autonomous Robots 202
6.3 EMW Reflection from the Surrounding Area in Different Frequency Ranges 206
6.4 Mathematical Models of Random Processes Describing the Amplitudes of Echo Signals from the Distributed Objects 209
6.5 Mathematical Models of Random Processes Describing Amplitudes of Echo Signals from Concentrated Objects 217
6.6 Measurement of Amplitude Jump of Signals for Landmark Detection by Mobile Autonomous Robots 220
References 225
7 Machine Vision System for Orchard Management 227
Abbreviations 227
7.1 Introduction 228
7.2 The Machine Vision System 230
7.2.1 Scene Constraints 230
7.2.2 Image Acquisition 230
7.2.3 Image Processing 231
7.2.3.1 Preprocessing 231
7.2.3.2 Segmentation 231
7.2.3.3 Feature Extraction 231
7.2.3.4 Classification 232
7.2.4 Actuation 232
7.3 Agricultural Machine Vision Applications 232
7.3.1 Plant Identification 233
7.3.2 Process Control 234
7.3.3 Machine Guidance and Control 235
7.4 Machine Vision Development for Fruit Yield Estimation: An Example of Plant Identification Application 237
7.4.1 Image Processing for Blossom Isolation 237
7.4.1.1 Methods of Data Transformations 238
7.4.1.2 Testing Blossom Isolation 242
7.4.1.3 Tree Isolation 244
7.4.1.4 Blossom Isolation and Counting for Apple Trees 246
7.4.1.5 Blossom Isolation and Counting for Peach Trees 248
7.4.2 Results of Yield Estimation 249
7.4.2.1 Transition from Blossom Count to Yield Estimation 250
7.4.2.2 Derivation of Weight Values 251
7.4.2.3 Statistical and Probabilistic Results 256
7.4.3 General Image Processing Techniques for Other Projects 257
7.4.3.1 Potential Problems with Over-Constraining the Sample Data 257
7.5 An Alternate Method of Object Isolation 259
7.5.1 Introduction 259
7.5.2 Spatial Mapping 260
7.5.3 Stereo Camera Operation 261
7.5.4 Difficulties of Using Spatial Mapping to Isolate Objects 262
7.5.5 Object Isolation Conclusion 263
7.6 A Machine Vision for Peach Orchard Navigation 263
7.6.1 Introduction 263
7.6.2 Visual Feedback System for Navigation 264
7.6.3 Experimental Ground Vehicle Platform 266
7.7 Conclusion 267
References 268
8 Stereoscopic Vision Systems in Machine Vision, Models, and Applications 271
Acronyms 271
8.1 Introduction 272
8.2 Binocular Vision Systems 273
8.2.1 Artificial Biological Vision Model 273
8.2.2 Other Binocular Vision Model 278
8.3 Multivision Systems 280
8.3.1 Trinocular Vision Models 280
8.3.1.1 Right Triangular Model 280
8.3.1.2 Parallel Model 281
8.3.1.3 Surrounding Model 283
8.3.1.4 Divergent Model 283
8.3.1.5 Arbitrary Model 284
8.3.2 Multi-Camera Models 284
8.4 Applications 286
8.4.1 Binocular Vision System Applications 286
8.4.1.1 Artificial Biological SVS Applications 286
8.4.1.2 Other Binocular Vision Model Applications 288
8.4.2 Multivision System Applications 288
8.4.2.1 Trinocular SVS Applications 288
8.4.2.2 Multivision SVS Applications 289
8.5 Conclusion 290
References 291
9 UKF-Based Image Filtering and 3D Reconstruction 296
Acronyms 296
9.1 Introduction 297
9.2 Kalman Filter Framework-Based Probabilistic Inference 298
9.2.1 Maximum Likelihood Estimator (MLE) 299
9.2.2 Probabilistic Inference and Bayesian Rule 300
9.2.3 Bayes Filter and Belief Update 302
9.2.3.1 KF Framework 303
9.2.3.2 EKF Linearization Technique 304
9.2.3.3 UKF Stochastic Linearization Technique 304
9.3 Stereo Vision System 304
9.3.1 Perspective Projection and Collinearity Constraint 307
9.3.2 Epipolar Geometry and Coplanarity Constraint 309
9.4 Uncertainties in Stereo Vision System 311
9.5 Examples 313
9.5.1 Pose Tracking Using UKF and Stereo Vision 313
9.5.2 Localization Approach-Based 2D Landmark Map 315
9.6 Conclusion 316
References 316
Part III Pose Estimation, Avoidance of Objects, Control and Data Exchange for Navigation 319
10 Lie Algebra Method for Pose Optimization Computation 320
Acronyms 320
10.1 Introduction 320
10.2 Small Rotations and Angular Velocity 321
10.3 Exponential Expression of Rotation 323
10.4 Lie Algebra of Infinitesimal Rotations 324
10.5 Optimization of Rotation 327
10.6 Rotation Estimation by Maximum Likelihood 330
10.7 Fundamental Matrix Computation 336
10.8 Bundle Adjustment 341
10.9 Summary 345
References 345
11 Optimal Generation of Closed Trajectories over Large, Arbitrary Surfaces Based on Non-calibrated Vision 347
Acronyms 347
11.1 Introduction 347
11.2 General Aspects Associated with a Path-Generation and Tracking Maneuver 349
11.3 Camera-Space Kinematics 351
11.4 Characterization of Surface 355
11.5 Path Tracking 358
11.6 Experimental Validation 363
11.7 Conclusions 370
References 370
12 Unified Passivity-Based Visual Control for Moving Object Tracking 372
Acronym 372
12.1 Introduction 372
12.2 System Models 374
12.2.1 Dynamic Model of the Robotic Manipulator 375
12.2.2 Dynamic Model of the Mobile Robot 375
12.2.3 Dynamic Model of the Mobile Manipulator 375
12.2.4 Kinematic Model of the Vision System 376
12.3 Passivity-Based Visual Controller Design 377
12.3.1 Passivity Property of the Vision System 377
12.3.2 Design of the Kinematic-Based Controller 378
12.3.2.1 Analysis of the Kinematic Control System 380
12.3.2.2 Particular Consideration for the Mobile Manipulator 380
12.3.3 Dynamic Compensation Controller 381
12.3.4 Robustness Analysis 383
12.4 Simulation and Experimental Results 384
12.4.1 Mobile Robot 385
12.4.2 Mobile Manipulator 387
12.4.3 Robotic Manipulator 396
12.5 Conclusions 401
A.1 Appendix 1 404
A.1.1 Mobile Robot Model 405
A.1.2 Feature Selection 406
B.1 Appendix 2 409
References 410
13 Data Exchange and Task of Navigation for Robotic Group 413
Acronyms 413
13.1 Introduction 414
13.2 Swarm Robotics 414
13.2.1 Nature Swarm Adaption 414
13.2.2 Tasks of Swarm Robotics 416
13.2.3 Swarm Robotics Projects 418
13.3 Robotics Vision Systems 421
13.3.1 Technical Vision System 422
13.3.1.1 Historical Background 422
13.3.1.2 Structure and Working Principles 423
13.3.1.3 Data Reduction 424
13.4 Path Planning Methods 426
13.4.1 Path Planning Using Technical Vision System 427
13.4.2 Secondary Objectives Placement for Surface Mapping 429
13.5 Data Transferring Networks and Local Exchange of Information for Robotic Group 430
13.5.1 Spanning Tree Forming for Swarm Robotics 431
13.5.2 Leader Based Communication 432
13.5.3 Feedback Implementation 434
13.6 Surface Mapping 437
13.6.1 Simulation Frameworks 437
13.6.2 Modeling System Structure 439
13.6.3 Influence of Data Exchange on Path Planning 440
13.6.4 Objects Extraction 444
13.6.5 Effectiveness of Robotic Group 447
13.7 Conclusions 448
References 449
14 Real-Time Egocentric Navigation Using 3D Sensing 455
Acronyms 455
14.1 Introduction 456
14.2 Global Planning 458
14.2.1 Planning in Discrete Spaces 458
14.2.2 Planning in Continuous Spaces 460
14.3 Local Planning 461
14.3.1 Moving to 3D Environment Representation 463
14.4 Neuroscience Research Related to Navigation 464
14.5 PiPS: An Ego-Centric Navigation Framework 466
14.5.1 Collision Checking in Perception Space 469
14.5.2 Egocylindrical Perception Space for Enhanced Awareness 475
14.5.3 Egocircular Representation and Trajectory Scoring 478
14.5.3.1 Egocircle Trajectory-Based Cost Functions 481
14.5.4 Working with Stereo Cameras 487
14.6 Benchmarking Navigation Methods 489
14.6.1 World Synthesis 489
14.6.2 Scenario Configuration 490
14.6.3 Benchmarking 492
14.7 Navigation Experiments 493
14.7.1 Sector World with Laser Safe and Unsafe Obstacles 495
14.7.2 Campus World and Office World 496
14.7.3 Review of Outcomes 497
14.7.4 Implementation Using Stereo Camera 499
14.8 Conclusion 502
References 502
15 Autonomous Mobile Vehicle System Overview for Wheeled Ground Applications 509
Acronyms 509
15.1 Introduction 510
15.2 Fundamentals of Autonomous Mobile Vehicles 511
15.2.1 Levels of Automation 511
15.2.2 Main Components 513
15.2.3 Applications 518
15.2.3.1 Automated Storage and Retrieval System (AS/RS) 518
15.2.3.2 Mobile Industrial Robots 519
15.2.3.3 Commercial Autonomous Vehicles 519
15.2.3.4 Autonomous Vacuum Cleaners 520
15.3 Perception 521
15.3.1 Environment Sensing 521
15.3.1.1 LiDAR Sensor 521
15.3.1.2 Kinect Sensor 523
15.3.2 Obstacle Detection and Tracking 524
15.3.2.1 Probabilistic Occupancy Map 524
15.3.2.2 Digital Elevation Map 525
15.3.2.3 Scene Flow Segmentation 525
15.3.2.4 Geometry-Based Clusters 526
15.3.3 Traffic Signs 526
15.3.4 Landmarks 527
15.4 Localization and Map Building 528
15.4.1 Mapping Sensors 531
15.4.1.1 LiDAR Sensor 531
15.4.1.2 Kinect Sensor 532
15.4.2 Localization Sensors 534
15.4.2.1 Wheel Encoders 534
15.4.2.2 Global Positioning System (GPS) 536
15.4.3 Navigation Control 537
15.4.3.1 Simultaneous Localization and Mapping (SLAM) 537
15.5 Path Planning 543
15.5.1 Algorithms 543
15.5.1.1 A* Algorithm 543
15.5.1.2 Field D* Algorithm 544
15.6 Case Study: Intelligent Transportation Scheme for Autonomous Vehicles in Smart Campus 546
15.6.1 Applied Simultaneous Localization and Mapping (SLAM) 547
15.6.2 Mechanical Design and Kinematic Model 549
15.7 How Innovation in Business Models Will Change the Future of Cars 550
15.7.1 The Misuse of Expensive Vehicles 551
15.7.2 Generation Z Consumer Profile and the Future of Vehicle 551
15.7.3 Business Model Canvas for Car to Go 552
15.7.4 Autonomous Car as an Innovative Business Model 553
15.8 Conclusions 553
References 554
Part IV Aerial Imagery Processing 558
16 Methods for Ensuring the Accuracy of Radiometric and Optoelectronic Navigation Systems of Flying Robots in a Developed Infrastructure 559
Abbreviations 559
16.1 Introduction 560
16.1.1 Autonomous and Noise-Cancel FR Navigation Systems 561
16.1.2 A Formalized Basic Description Model for the Functioning Process of the FR with Correlation-Extreme Navigation Systems and Radiometric and Optoelectronic Sensors 563
16.1.2.1 The Main Objectives and Model of Signal Processing in the RM Channel CENS 563
16.1.3 Analysis of Factors That Lead to Distortions in a Decisive Function Formed by a Correlation-Extreme Navigation System 568
16.1.4 The Changes Impact Analysis in the FR Spatial Position on the CI Formation 570
16.2 Formation Features of a Crucial Function by a Radiometric CENS Channel When Navigating Low-Flying FR in Terms of Three-Dimensional Object Binding 577
16.2.1 Formation of Reference Images of Three-Dimensional Form Object Binding 577
16.2.2 Formation of Unimodal Decision Function of the Radiometric CENS 582
16.3 Features of the Formation of the CENS Decisive Function When Navigating Flying Robots in the Case of the Appearance of False Objects in the Current Image 587
16.3.1 Models of Current and Reference Images: Statement of the Task of Developing a Method for Localizing an Object Binding on Image 587
16.3.2 The Solution of the Detection Problem and Multi-Threshold Selection of the OB in a Current Image with Bright False Objects 590
16.3.3 Solution to the Problem of Forming a Unimodal Decision Function 594
16.4 Conclusions 596
References 597
17 Stabilization of Airborne Video Using Sensor Exterior Orientation with Analytical Homography Modeling 600
Acronyms 600
17.1 Introduction 600
17.1.1 Related Work 602
17.2 Feature Track Building 603
17.3 Imaging Model 603
17.4 Optimization 605
17.5 Experiments 609
17.6 Conclusions 613
References 614
18 Visual Servo Controllers for an UAV Tracking Vegetal Paths 617
Acronyms 617
18.1 Introduction 617
18.2 UAV Models 619
18.2.1 Kinematic Model 620
18.2.2 Dynamic Model 620
18.3 Vision System 621
18.3.1 Image Processing 621
18.3.2 Kinematics of the Vision System 622
18.4 Kinematic Visual Servoing Controllers 625
18.4.1 Position Based Controller 625
18.4.1.1 Controller Analysis 626
18.4.2 Image Based Controller 627
18.4.3 Passivity Based Controller 628
18.4.3.1 Controller Analysis 629
18.5 Compensation of UAV Dynamics 629
18.5.1 Controller Analysis 630
18.6 Simulation Results 632
18.7 Conclusions 639
Appendix 640
Passive Property of the UAV Dynamic Model 640
Passive Property of the Vision System 641
Passive Property of the Kinematic Passivity Based Controller 642
Robustness Analysis of the Passivity Based Controller 642
References 644
Part V Machine Vision for Scientific, Industrial and Civil Applications 646
19 Advances in Image and Video Compression Using Wavelet Transforms and Fovea Centralis 647
Acronyms 647
19.1 Introduction 648
19.2 Data Compression 650
19.2.1 Fovea Centralis 653
19.3 Wavelet Transforms 654
19.4 Image Compression 657
19.4.1 Foveated Images 659
19.5 Video Compression 660
19.6 An Approach to Image Compression Based on ROI and Fovea Centralis 661
19.6.1 FVHT Algorithm 662
19.6.2 Simulation Results 662
19.7 Wavelet-Based Coding Approaches: SPECK-Based Codec and Adaptive Wavelet/Fovea Centralis-Based Codec 664
19.7.1 Adaptive Binary Arithmetic Coding 664
19.7.2 AFV-SPECK Algorithm 666
19.7.3 Simulation Results 667
19.8 Conclusions 669
References 669
20 Stairway Detection Based on Single Camera by Motion Stereo for the Blind and Visually Impaired 674
Acronyms 674
20.1 Introduction 675
20.2 Algorithm Description 676
20.2.1 Convolutional Neural Network Model Description 676
20.2.2 Stairway Detections 677
20.3 Experimental Results 682
20.4 Conclusions 684
References 689
21 Advanced Phase Triangulation Methods for 3D Shape Measurements in Scientific and Industrial Applications 691
Abbreviations 691
21.1 Introduction 691
21.2 The Steady Method for Decoding Phase Images with Arbitrary Phase Shifts 692
21.3 Method for Nonlinearity Compensation of the Source–Receiver Path of Optical Radiation in 3D Measurements Based on Phase Triangulation 700
21.4 Comparing Methods of Structured Image Decoding at Nonlinearity of the Source–Receiver Path of Optical Radiation 706
21.5 Methods for Expanding the Dynamic Range of Phase Triangulation Measurements 716
21.6 Method for Estimating the Optimal Frequency of Spatial Modulation in Phase Triangulation Measurements 719
21.7 Conclusion 723
References 724
22 Detection and Tracking of Melt Pool in Blown Powder Deposition Through Image Processing of Infrared Camera Data 726
Abbreviations 726
22.1 Introduction 726
22.2 Influence of Feedback Systems 729
22.2.1 Energy Management System 730
22.2.2 Height Control System 732
22.2.3 Variation in the High Temperature Region 735
22.3 Melt Pool Identification 738
22.3.1 Sensitivity and Repeatability 743
22.4 Conclusions 744
References 745
23 Image Processing Filters for Machine Fault Detection and Isolation 748
Acronyms 748
23.1 Introduction 749
23.2 Image Processing Median Filters 752
23.3 Gas Path Measurement Images 753
23.3.1 Objective Function 756
23.4 Ant Colony Optimization 757
23.4.1 Ant Colony Algorithm 759
23.4.2 Filter Weight Optimization 760
23.5 Numerical Experiments 761
23.6 Conclusions 763
References 763
24 Control and Automation for Miniaturized Microwave GSG Nanoprobing 765
Acronyms 765
24.1 Introduction 766
24.1.1 Context 766
24.1.2 Short Description of the SEM 767
24.1.3 Specifications 769
24.2 Modeling and Control of a Linear Nanopositioner Using LabVIEW™ 769
24.2.1 Central Idea of This Study 769
24.2.2 Modeling 770
24.2.2.1 Identification of the Open-Loop Transfer Function of the Nanopositioner 770
24.2.2.2 Nanopositioning in Closed Loop 773
24.2.3 Control with LabVIEW™ 775
24.3 Angular Control: Feasibility Study with Matlab™ 777
24.4 Determining Set Points of the Nanopositioners on X, Y, and Z Axes 778
24.4.1 Detecting the Patterns 778
24.4.2 Detecting a Point to Reach 779
24.5 Conclusions 781
References 781
25 Development of Design and Training Application for Deep Convolutional Neural Networks and Support Vector Machines 783
Abbreviations 783
25.1 Introduction 785
25.2 Design and Training Application for DCNNs and SVMs 786
25.3 Review of Back Propagation Algorithm for Implementation 789
25.4 Design and Training Experiments of Designed DCNN 791
25.4.1 Test Trial of Design and Training of a DCNN for Binary Classification 791
25.4.2 Test Trial of Design and Training for Five Categories 794
25.5 Support Vector Machines Based on Trained DCNNs 795
25.6 Conclusions 799
References 799
26 Computer Vision-Based Monitoring of Ship Navigation for Bridge Collision Risk Assessment 801
Acronyms 801
26.1 Introduction 801
26.2 Engineering Background 804
26.2.1 Bridge Introduction 804
26.2.2 Navigation Condition 805
26.2.3 Collision Incidents Analysis 805
26.2.4 Significance of the System 807
26.3 Ship–Bridge Anti-Collision System 807
26.3.1 Monitoring and Tracking System 808
26.3.2 Risk Assessment and Early Warning System 811
26.3.2.1 Warning Area Division 811
26.3.2.2 System Warning Trigger Method 812
26.3.2.3 Early Warning Event Risk Assessment 812
26.3.3 Post Recording System 814
26.4 Field Test 814
26.4.1 System Summary 815
26.4.2 Monitoring Interface 815
26.4.3 Warning System 816
26.4.4 Ship Identification 817
26.5 Conclusions 819
References 819
About the Authors 822
Further Readings 852
Index 853
| Publication date (per publisher) | 30.9.2019 |
|---|---|
| Additional info | XXXV, 851 p. 483 illus., 362 illus. in color. |
| Language | English |
| Subject areas | Computer Science ► Theory / Studies ► Artificial Intelligence / Robotics |
| Medicine / Pharmacy | |
| Engineering ► Civil Engineering | |
| Engineering ► Electrical Engineering / Power Engineering | |
| Engineering ► Communications Engineering | |
| Keywords | 2D & 3D • Laser scanning and Machine Vision and navigation • Machine Vision and navigation • Measurements and Machine Vision and navigation • Monitoring and Machine Vision and navigation • Motion Control and Machine Vision and navigation • Optoelectronics and Machine Vision and navigation • Sensors and Machine Vision and navigation • Surface Mapping and Machine Vision and navigation • Triangulation and Machine Vision and navigation • UWB visualization and Machine Vision and navigation • Vision Algorithms and Machine Vision and navigation |
| ISBN-10 | 3-030-22587-9 / 3030225879 |
| ISBN-13 | 978-3-030-22587-2 / 9783030225872 |
DRM: digital watermark
This eBook contains a digital watermark and is therefore personalized to you. If the eBook is improperly passed on to third parties, it can be traced back to the source.
File format: PDF (Portable Document Format)
With its fixed page layout, PDF is particularly suitable for reference books with columns, tables, and figures. A PDF can be displayed on almost all devices, but is only of limited use on small displays (smartphone, eReader).
System requirements:
PC/Mac: You can read this eBook on a PC or Mac. You will need a PDF viewer, e.g. Adobe Reader or Adobe Digital Editions.
eReader: This eBook can be read on (almost) all eBook readers. However, it is not compatible with the Amazon Kindle.
Smartphone/Tablet: Whether Apple or Android, you can read this eBook. You will need a PDF viewer, e.g. the free Adobe Digital Editions app.
Buying eBooks from abroad
For tax law reasons we can sell eBooks just within Germany and Switzerland. Regrettably we cannot fulfill eBook-orders from other countries.