Review

Sensors (Basel). 2021 Jan 12;21(2):488. doi: 10.3390/s21020488.

CMOS Image Sensors in Surveillance System Applications

Susrutha Babu Sukhavasi et al.

Abstract

Recent technological advances in CMOS image sensors (CIS) enable their use in the most demanding surveillance fields: visual surveillance and intrusion detection in intelligent surveillance systems, aerial surveillance in war zones, Earth environmental surveillance by satellites in space monitoring, agricultural monitoring using wireless sensor networks and the Internet of Things, and driver assistance in the automotive field. This paper presents an overview of CMOS image sensor-based surveillance applications over the last decade, tabulating design characteristics related to image quality such as resolution, frame rate, dynamic range, and signal-to-noise ratio, as well as the processing technology. The different CMOS image sensor models used across these applications have been surveyed and tabulated by year and application.

Keywords: CMOS image sensor; dynamic range; frame rate; resolution; signal-to-noise ratio; surveillance systems.
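Two of the image-quality metrics tabulated in the survey, dynamic range and peak signal-to-noise ratio, can be estimated from a pixel's full-well capacity and read noise. A minimal sketch in Python, using hypothetical sensor values not taken from the paper:

```python
import math

def dynamic_range_db(full_well_e: float, read_noise_e: float) -> float:
    # Dynamic range: ratio of the largest signal (full well, in electrons)
    # to the smallest detectable signal (read noise floor), in dB.
    return 20 * math.log10(full_well_e / read_noise_e)

def peak_snr_db(full_well_e: float) -> float:
    # At full well, photon shot noise dominates: noise = sqrt(N) for N
    # collected electrons, so SNR = N / sqrt(N) = sqrt(N).
    return 20 * math.log10(math.sqrt(full_well_e))

# Hypothetical pixel: 10,000 e- full-well capacity, 2 e- read noise
print(round(dynamic_range_db(10_000, 2), 1))  # ~74.0 dB
print(round(peak_snr_db(10_000), 1))          # 40.0 dB
```

These shot-noise-limited figures are upper bounds; dark current and fixed-pattern noise lower them in practice.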


Conflict of interest statement

The authors declare no conflict of interest.

Figures

Figure 1
Taxonomy of CMOS image sensor applications.
Figure 2
Classification of CMOS image sensor-based applications in various fields for surveillance.
Figure 3
CMOS image sensor architecture. Adapted from [8] with permission of Elsevier, 2006.
Figure 4
(a) Passive pixel sensor; (b) Active pixel sensor. Adapted with permission from [8], Elsevier, 2006.
Figure 5
Types of CMOS image sensors.
Figure 6
Applications of CMOS image sensors as surveillance systems in various fields.
Figure 7
(a) Privacy-preserving sensor for person detection; (b) Field experiment: person detected at the right-middle and left-side positions; (c) Brightness distributions according to the position of the person. Adapted with permission from [10], Elsevier, 2010.
Figure 8
Surveillance in a low-crowd environment.
Figure 9
Visual surveillance and intrusion detection application network.
Figure 10
(a) Surveillance modes: peace mode and emergency mode; (b) Images of different resolution modes [21].
Figure 11
(a) Image captured in a classroom; (b) Captured facial expressions detected using FER technology [23].
Figure 12
(a) Captured image of traffic on the Kuwait highway; (b) Drone flight; (c) Flight locations in north and south Kuwait [24].
Figure 13
(a) CMOS camera used for nuclear radioactive signal detection; (b) Field experiment; (c) Radiation bright blotch. Adapted from [25] with permission of Elsevier, 2020.
Figure 14
Non-contact neonatal monitoring system.
Figure 15
Hybrid Object DEtection and Tracking (HODET).
Figure 16
(a) Captured image of a crop monitoring network; (b) Temperature curve changing with humidity. Adapted from [28] with permission of Elsevier, 2011.
Figure 17
(a) Sensor node for a vineyard monitoring system; (b) Vineyard monitoring by cameras in a wireless sensor network; (c) Detection of brown leaves in vines; (d) Captured images of brown leaves of different sizes taken from different distances [29].
Figure 18
On-board human monitoring system in a real ship, monitoring different emotions of the navigational officer.
Figure 19
Smart-camera-network-based surveillance system with the CITRIC mote.
Figure 20
Smart image sensor with Multi Point Tracking (MPT).
Figure 21
Flood detection and control monitoring system.
Figure 22
Home automation system.
Figure 23
Continental Urban Mobility Experience (CUbE).
Figure 24
Autonomous Micro Digital Sun Sensor (µDSS).
Figure 25
Lightning detection and imaging observation over Earth.
Figure 26
Imaging camera setup for star-tracking measurement.
Figure 27
Enhanced Engineering Camera (EECAM) using the CMOS image sensor CMV-20000, i.e., Navcam.
Figure 28
NASA Integrated Solar Array and Reflectarray Antenna (ISARA) mission.
Figure 29
Cloud monitoring camera system for imaging satellites: the INSAT satellite and the NOAA GOES satellite.
Figure 30
(a) Radiation test setup block diagram; (b) Displacement damage dose test with metal shielding (first image) and without metal shielding (second image); (c) Total ionizing dose test setup, front view; (d) Total ionizing dose setup with radiation source [42].
Figure 31
Spacecraft for the Multi Asteroid Touring (MAT) mission.
Figure 32
(a) Argus2000 spectrometer with CIS-based RGB camera; (b) Attitude determination and control subsystem (upper image) and star tracker (lower image); (c) 3U CubeSat platform and its solar cell distribution; (d) Mechanical structure of MeznSat [44].
Figure 33
Arcsecond Space Telescope Enabling Research in Astrophysics (ASTERIA).
Figure 34
Portable wireless aerial image transmission system.
Figure 35
Intelligent Portable Aerial Surveillance System (IPASS).
Figure 36
Banpil multi-spectral camera.
Figure 37
Mid-wave infrared imaging detector for missile applications.
Figure 38
Ballistics experiment using X-ray imaging with a grenade launcher and an M240 barrel gun.
Figure 39
Wireless vision sensor.
Figure 40
Concept of Catch and Release Manipulation Architecture (CARMA).
Figure 41
(a) Under Vehicle Inspection System (UVIS); (b) Real-time inspection; (c) Under view for bomb inspection [55].
Figure 42
Gun muzzle flash detection system.
Figure 43
Reconnaissance balloon critical part detection.
Figure 44
(a) Automobile lane detection using a CMOS image sensor; (b) Original captured image; (c) Image captured by the CMOS imager [58].
Figure 45
On-Screen Display (OSD).
Figure 46
Non-contact heart rate detection of the driver while the vehicle is in motion.
Figure 47
Fish-eye automotive camera for blind spot detection.
Figure 48
Visible Light Communication (VLC) in two modes of operation: vehicle-to-infrastructure (V2I-VLC) using an LED traffic light, and vehicle-to-vehicle (V2V-VLC) using LED brake lights.
Figure 49
Source identification using an image-sensor-based optical wireless communication system.
Figure 50
3D-ranging CMOS SPAD camera for advanced driver assistance systems.
Figure 51
Night vision systems.
Figure 52
(a) Traffic light detection with a camera; (b) Detection of traffic lights in day and night scenarios. Adapted with permission from [67], Elsevier, 2015.
Figure 53
TigerCENSE.
Figure 54
On-chip moving object detection and localization using a CMOS image sensor.
Figure 55
MasliNET: olive grove monitoring system using WSN.
Figure 56
Eco-hydrological monitoring using WSN.
Figure 57
(a) Trap deployment, aerial view; (b) Red Palm Weevil trap; (c) Image sensor used in the trap [72].
Figure 58
(a) Near-infrared (NIR) imaging camera with internal structure; (b) NIR captured images of a river surface obtained by applying the LPSIV method in two different spectral bands, with and without spatial high-pass filtering. Adapted with permission from [73], Elsevier, 2013.
Figure 59
TrustEYE coupled with a Raspberry Pi board running the Linux operating system.
Figure 60
Amazon rainforest wildlife monitoring using multimedia wireless sensor networks (MWSN).
Figure 61
MINLU architecture for monitoring light pollution from small UAVs.
Figure 62
(a) Minirhizotron field experiment; (b) SoilCam; (c) Root and soil analyzer software; (d) Control box; (e) 360° image of a canola plant root captured by SoilCam; (f) Multispectral images captured by SoilCam [77].
Figure 63
CMOS process technology variation [78].
Figure 64
Display of various resolutions [79].
Figure 65
Year-wise usage of CIS models according to the survey data; the x-axis represents years and the y-axis the number of CIS models.

References

- Cyr J.S., Vanderpool J., Chen Y., Li X. HODET: Hybrid object detection and tracking using mmWave radar and visual sensors. In: Sensors and Systems for Space Applications XIII. International Society for Optics and Photonics; Bellingham, WA, USA: 2020; p. 114220I.
- Turturici M., Saponara S., Fanucci L., Franchi E. Low-power embedded system for real-time correction of fish-eye automotive cameras. Proceedings of the 2012 Design, Automation & Test in Europe Conference & Exhibition (DATE); Dresden, Germany, 12–16 March 2012; pp. 340–341.
- Arima M., Kii S. Development of an Autonomous Human Monitoring System for Preventative Safety in Sea Transportation. Proceedings of the International Conference on Offshore Mechanics and Arctic Engineering; Nantes, France, 9–14 June 2013; p. V02AT02A040.
- Jallad A.-H., Marpu P., Abdul Aziz Z., Al Marar A., Awad M. MeznSat—A 3U CubeSat for Monitoring Greenhouse Gases Using Short Wave Infra-Red Spectrometry: Mission Concept and Analysis. Aerospace. 2019;6:118. doi: 10.3390/aerospace6110118.
- Blumenau A., Ishak A., Limone B., Mintz Z., Russell C., Sudol A., Linton R., Lai L., Padir T., Van Hook R. Design and implementation of an intelligent portable aerial surveillance system (IPASS). Proceedings of the 2013 IEEE Conference on Technologies for Practical Robot Applications (TePRA); Woburn, MA, USA, 22–23 April 2013; pp. 1–6.
