Eyes of Things

Oscar Deniz et al. Sensors (Basel). 2017 May 21;17(5):1173. doi: 10.3390/s17051173.

Abstract

Embedded systems control and monitor a great deal of our reality. While some "classic" features are intrinsically necessary, such as low power consumption, rugged operating ranges, fast response and low cost, these systems have evolved in the last few years to emphasize connectivity functions, thus contributing to the Internet of Things paradigm. A myriad of sensing/computing devices are being attached to everyday objects, each able to send and receive data and to act as a unique node in the Internet. Apart from the obvious necessity to process at least some data at the edge (to increase security and reduce power consumption and latency), a major breakthrough will arguably come when such devices are endowed with some level of autonomous "intelligence". Intelligent computing aims to solve problems for which no efficient exact algorithm can exist or for which we cannot conceive an exact algorithm. Central to such intelligence is Computer Vision (CV), i.e., extracting meaning from images and video. While not everything needs CV, visual information is the richest source of information about the real world: people, places and things. The possibilities of embedded CV are endless if we consider new applications and technologies, such as deep learning, drones, home robotics, intelligent surveillance, intelligent toys, wearable cameras, etc. This paper describes the Eyes of Things (EoT) platform, a versatile computer vision platform tackling those challenges and opportunities.
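The abstract argues for processing visual data on the device itself. As a toy illustration of the kind of low-level kernel an embedded vision node runs, here is a minimal pure-Python sketch of a 3×3 Sobel gradient on a tiny grayscale image. This is illustrative only, not EoT code; the platform's actual pipelines run on the Myriad 2's vector cores and hardware accelerators.

```python
# Illustrative sketch: 3x3 Sobel gradient magnitude on a small grayscale
# image represented as a 2D list of pixel intensities (0-255).
# Pure Python, no external libraries assumed.

SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal gradient kernel
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical gradient kernel

def convolve3x3(img, kernel, x, y):
    """Apply a 3x3 kernel centred on pixel (x, y)."""
    acc = 0
    for dy in range(-1, 2):
        for dx in range(-1, 2):
            acc += kernel[dy + 1][dx + 1] * img[y + dy][x + dx]
    return acc

def sobel_magnitude(img):
    """Gradient magnitude for all interior pixels; borders are left at 0."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = convolve3x3(img, SOBEL_X, x, y)
            gy = convolve3x3(img, SOBEL_Y, x, y)
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out

# A 5x5 image with a vertical step edge between columns 1 and 2:
image = [[0, 0, 255, 255, 255] for _ in range(5)]
edges = sobel_magnitude(image)
# Interior pixels on the step produce a large response; flat regions give 0.
```

On the actual platform, this class of per-pixel kernel would be vectorized on the SHAVE processors or offloaded to the SIPP hardware filters (Figures 5–7) rather than executed as a scalar loop.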

Keywords: Internet of Things; computer vision; embedded computer vision; eyes of things.


Conflict of interest statement

The authors declare no conflict of interest.

Figures

Figure 1
Development of Eyes of Things (EoT) boards. Sizes, from left to right (in mm): 200 × 180, 100 × 100, 57 × 46.
Figure 2
Top and rear views of the EoT board, showing the main components.
Figure 3
Board connected to a LiPo battery.
Figure 4
EoT block diagram.
Figure 5
Myriad 2 SoC architecture.
Figure 6
SHAVE internal architecture.
Figure 7
Block diagram of the SIPP accelerator, AMC and connections to CMX.
Figure 8
Cameras supported by EoT.
Figure 9
Main EoT software modules.
Figure 10
Flash memory layout.
Figure 11
EoT configuration application screenshot (desktop version).
Figure 12
Battery-operated EoT streaming video to an Android smartphone.
Figure 13
Push notification received on an Android smartphone. Left: prior to the user opening it; right: after touching the notification.
Figure 14
Libccv examples running on EoT. Left: Canny edge detection. Right: text detection in images.
Figure 15
QR code recognition. In this image, two QR codes are recognized.
Figure 16
Rotation-invariant face detection. In this example, the user (pictured in the image) is holding the EoT and tilting it.
Figure 17
Fathom deep learning framework.
Figure 18
EoT MicroPython remote terminal and editor showing a simple vision application.
Figure 19
Peephole demonstrator.
Figure 20
Museum audio tour demonstrator.
Figure 21
Architecture of the versatile mobile camera demonstrator.
Figure 22
Sample screens of the demonstrator’s Android app.
Figure 23
Left: defining a region of interest for motion detection. The configured region is sent to the EoT device through MQTT over WiFi. Right: access to captured images from a PC web browser. The images were previously retrieved from the EoT device and uploaded to the cloud database.
Figure 24
Smart doll with emotion recognition. Left: the camera was in the head, connected through a flex cable to the board and battery inside the body. Right: emotion recognition.
Figure 25
Emotion recognition network.
Figure 26
Four sample emotion images and the output given by the network.
Figure 27
Efficiency for the task of face emotion recognition, in images/s/Watt.
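Figure 23 describes a region of interest being sent to the device "through MQTT over WiFi". As a rough illustration of that messaging layer, the sketch below implements MQTT's standard topic-filter wildcards ('+' matches exactly one topic level, '#' matches all remaining levels), following the MQTT 3.1.1 rules. The topic names are invented for the example and are not EoT's actual topics.

```python
# Hedged sketch of MQTT topic-filter matching, as a subscriber on the
# device side might use to route incoming configuration messages.
# '+' matches one topic level; '#' (last level only) matches the rest.

def topic_matches(filter_str: str, topic: str) -> bool:
    f_parts = filter_str.split("/")
    t_parts = topic.split("/")
    for i, f in enumerate(f_parts):
        if f == "#":
            return True                 # '#' swallows everything remaining
        if i >= len(t_parts):
            return False                # topic is shorter than the filter
        if f != "+" and f != t_parts[i]:
            return False                # literal level mismatch
    return len(f_parts) == len(t_parts) # no trailing topic levels left over

# Hypothetical topics: a device subscribed to per-camera configuration.
assert topic_matches("eot/+/config", "eot/cam1/config")
assert not topic_matches("eot/+/config", "eot/cam1/status")
assert topic_matches("eot/#", "eot/cam1/images/42")
```

A real deployment would hand matched messages to a callback registered with the broker client; the matcher above only shows the filter semantics.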
