Patent 2764192 Summary

(12) Patent: (11) CA 2764192
(54) English Title: APPARATUS FOR DETECTING HUMANS ON CONVEYOR BELTS USING ONE OR MORE IMAGING DEVICES
(54) French Title: APPAREIL DE DETECTION DE PERSONNES SUR LES COURROIES D'UN CONVOYEUR A L'AIDE D'UN OU PLUSIEURS DISPOSITIFS D'IMAGERIE
Status: Granted and Issued
Bibliographic Data
(51) International Patent Classification (IPC):
  • G08B 13/194 (2006.01)
  • B65G 15/30 (2006.01)
  • G01N 21/84 (2006.01)
  • G08B 13/191 (2006.01)
  • G08B 13/196 (2006.01)
  • G08B 21/02 (2006.01)
  • H04N 7/18 (2006.01)
(72) Inventors:
  • BADAWY, WAEL (Canada)
  • SHEHATA, MOHAMED (Canada)
  • MOHAMED, TAMER (Canada)
(73) Owners:
  • INTELLIVIEW TECHNOLOGIES INC.
(71) Applicants:
  • INTELLIVIEW TECHNOLOGIES INC. (Canada)
(74) Agent: LAMBERT INTELLECTUAL PROPERTY LAW
(74) Associate agent:
(45) Issued: 2018-10-30
(22) Filed Date: 2012-01-16
(41) Open to Public Inspection: 2013-07-16
Examination requested: 2016-11-08
Availability of licence: N/A
Dedicated to the Public: N/A
(25) Language of filing: English

Patent Cooperation Treaty (PCT): No

(30) Application Priority Data: None

Abstracts

English Abstract

A system for detecting a class of objects at a location, for example humans on a conveyor belt. A thermal camera may be used to detect objects and to detect the variance of the heat distribution of objects to classify them. Objects detected in an image from one camera may be detected in an image from another camera using geometric correction. A color camera may be used to detect the number of edges and the number of colors of an object to classify it. A color camera may be used with an upright human body classifier to detect humans in an area, and blobs corresponding to the detected humans may be tracked in a thermal or color camera image to detect if a human enters an adjacent forbidden area such as a conveyor belt.


French Abstract

Un système de détection d'une classe d'objets à un emplacement, par exemple, des humains sur une courroie de convoyeur. Une caméra thermique peut être utilisée pour détecter des objets et la variance de la distribution thermique d'objets pour les classer. Les objets détectés dans une image provenant d'une caméra peuvent être détectés dans une image d'une autre caméra utilisant une correction géométrique. Une caméra couleur peut être utilisée pour détecter le nombre de bords et le nombre de couleurs d'un objet pour le classer. Une caméra de couleur peut être utilisée avec un classificateur de corps humains verticaux pour détecter les humains dans une zone, et des nuées correspondant aux humains détectés peuvent être suivies dans une image de caméra thermique ou couleur pour détecter si un humain entre dans une zone interdite adjacente comme une courroie de convoyeur.

Claims

Note: Claims are shown in the official language in which they were submitted.


THE EMBODIMENTS OF THE INVENTION IN WHICH AN EXCLUSIVE PROPERTY OR PRIVILEGE IS CLAIMED ARE DEFINED AS FOLLOWS:

1. A method of responding to the presence in a scene of a member of a specified class of objects, the specified class of objects being one of plural classes of objects, comprising: acquiring an image of the scene using a sensor; in a computer system, identifying one or more objects in the image, each object having a radiation intensity distribution, determining for each object a variance value representing a variance of a histogram of the radiation intensity distribution of the respective object, and classifying each object as a member of one of the plural classes of objects according to a comparison of the variance value determined for the respective object to one or more thresholds; and for each object, taking an action upon the respective object being classified as a member of the specified class of objects.

2. The method of claim 1 in which the sensor is a thermal imaging device, the image is a thermal image and the radiation intensity distribution is a heat intensity distribution.

3. The method of claim 2 in which the specified class of objects is humans.

4. A method of responding to the presence in a scene of a member of a specified class of objects, the specified class of objects being one of plural classes of objects, comprising: acquiring an image of the scene using a sensor; in a computer system, identifying one or more objects in the image, each object having a radiation intensity distribution, determining for each object a variance value representing a variance of a histogram of the radiation intensity distribution of the respective object, and classifying each object as a member of one of the plural classes of objects according to the variance value determined for the respective object, wherein each object is classified as human if the variance value for the respective object falls within a predetermined range intermediate between a range of variances of image intensity typical for metal objects and a range of variances of image intensity typical for fabric or plastic objects; and for each object, taking an action upon the respective object being classified as a member of the specified class of objects.

5. The method of claim 2 in which the scene is a view of a conveyor belt.

6. The method of claim 5 in which the action comprises stopping the conveyor belt.

7. The method of claim 2 in which the action comprises alerting operating personnel.

8. The method of claim 2 in which stationary heat sources are subtracted from the thermal image before identifying objects in the thermal image.

9. A method of responding to the presence in a scene of a member of a specified class of objects, comprising: acquiring an image of the scene using a sensor; in a computer system, identifying one or more objects in the image, each object having a radiation intensity distribution, determining a variance of the radiation intensity distribution for each object, and classifying each object according to the variance of the radiation intensity distribution determined for the respective object; for each object, taking an action if the respective object is classified as one of the specified class of objects; acquiring a color image of the scene using a color imaging device; and in the computer system, identifying in the color image the one or more objects identified by the computer system in the thermal image, analyzing a color histogram of each object, and classifying each object according to a quantization of colors determined from the color histogram of the respective object; in which the sensor is a thermal imaging device, the image is a thermal image and the radiation intensity distribution is a heat intensity distribution.

10. A method of responding to the presence in a scene of a member of a specified class of objects, comprising: acquiring an image of the scene using a sensor; in a computer system, identifying one or more objects in the image, each object having a radiation intensity distribution, determining a variance of the radiation intensity distribution for each object, and classifying each object according to the variance of the radiation intensity distribution determined for the respective object; for each object, taking an action if the respective object is classified as one of the specified class of objects; and in the computer system, applying an edge filter to each object and classifying each object according to a number of edge-like features of the object detected by the edge filter; in which the sensor is a thermal imaging device, the image is a thermal image and the radiation intensity distribution is a heat intensity distribution.

11. A method of responding to the entry of a member of a specified class of objects into an area, comprising: acquiring a first sequence of images of a scene with a first imaging device oriented in a manner suitable for detecting members of the specified class of objects; acquiring a second sequence of images of the scene with a second imaging device oriented in a manner suitable to detect whether blobs detected in the second sequence of images are within the area; in a computer system, detecting members of the specified class of objects in the first sequence of images, for each member of the specified class of objects detected in the first sequence of images detecting a corresponding blob in the second sequence of images, and detecting, for each blob corresponding to a member of the specified class of objects, when the respective blob enters the area; and taking an action when a blob corresponding to a member of the specified class of objects is detected to enter the area.

12. The method of claim 11 in which the specified class of objects is humans.

13. The method of claim 12 in which the humans are detected based on an upright human body classifier.

14. The method of claim 13 in which the upright human body classifier is based on a histogram of oriented gradients.

15. The method of claim 13 in which the upright human body classifier is based on optical flow patterns.

16. The method of claim 13 in which the upright human body classifier is based on covariance features.

17. The method of claim 11 in which the first imaging device is a color camera and the first sequence of images is a sequence of color images.

18. The method of claim 11 in which the second imaging device is a thermal camera and the second sequence of images is a sequence of thermal images.

19. The method of claim 11 in which the second imaging device is a color camera and the second sequence of images is a sequence of color images.

20. The method of claim 11 in which each blob is tracked using a Kalman filter.

21. The method of claim 11 in which the area is an area above a conveyor belt.

22. The method of claim 21 in which the conveyor belt is a baggage handling conveyor belt.

23. The method of claim 21 in which the action comprises stopping the conveyor belt.

24. The method of claim 11 in which the action comprises alerting operating personnel.

25. A system for responding to the presence in a scene of a member of a specified class of objects, the specified class of objects being one of plural classes of objects, comprising: a sensor for detecting an image of the scene; and a computer system for analyzing the image, the computer system being configured to identify one or more objects in the image, each object having a radiation intensity distribution, determine for each object a variance value representing a variance of a histogram of the radiation intensity distribution of the respective object, and classify each object as a member of one of the plural classes of objects according to a comparison of the variance value determined for the respective object to one or more thresholds.

26. The system of claim 25 further comprising an actuation system responsive to the computer system for taking an action if the respective object is classified as one of the specified class of objects.

27. The system of claim 26 in which the sensor is a thermal imaging device, the image is a thermal image and the radiation intensity distribution is a heat intensity distribution.

28. The system of claim 27 in which the scene is a view of a conveyor belt.

29. The system of claim 28 in which the action comprises stopping the conveyor belt.

30. A system for responding to the entry of a member of a specified class of objects into an area, comprising: an imaging system for acquiring a first sequence of images of a scene with a first imaging device oriented in a manner suitable for detecting members of the specified class of objects and for acquiring a second sequence of images of the scene with a second imaging device oriented in a manner suitable to detect whether blobs detected in the second sequence of images are within the area; a computer system configured to detect members of the specified class of objects in the first sequence of images, detect a corresponding blob in the second sequence of images for each member of the specified class of objects detected in the first sequence of images, and to detect, for each blob corresponding to a member of the specified class of objects, when the respective blob enters the area; and an actuator responsive to the computer system for taking an action when a blob corresponding to a member of the specified class of objects is detected to enter the area.

31. The method of claim 1 further comprising adjusting a threshold of the one or more thresholds based on the time of day.

32. The method of claim 1 further comprising adjusting a threshold of the one or more thresholds based on the time of year.

Description

Note: Descriptions are shown in the official language in which they were submitted.


APPARATUS FOR DETECTING HUMANS ON CONVEYOR BELTS USING ONE OR MORE IMAGING DEVICES
TECHNICAL FIELD
[0001] Detection
BACKGROUND
[0002] North American airports are employing new methods for baggage check-in and shipping. A passenger is typically able to buy the ticket and check in online. With the boarding pass in hand, the only remaining step is to handle the passenger's baggage. In the airports' attempts to reduce queues and automate the whole process of passenger check-in, passengers are now responsible for taking the baggage to a conveyor belt. The consequence of this is that a part of the conveyor belt is accessible to the public, which causes many issues including safety and security issues.
[0003] It would seem logical for each airline to supervise this process directly, because the luggage belt poses a safety risk for the passengers. In practice, however, airline staff may be engaged in other tasks and do not directly monitor the luggage loading processes.
[0004] These public and unobserved luggage belt areas have introduced several safety and security concerns.
[0005] Safety:
[0006] There are three main kinds of accidents that could lead to harm: injuries caused by the conveyor belt itself, injuries caused by the conveyor belt's steep decline behind the public area, and health risks caused by the industrial X-ray scanner.
[0007] There are two main reasons why people get onto the conveyor belt. Firstly, passengers are requested to lift and load items that may weigh 20 kg or more onto various configurations of moving luggage belts. Some people become unbalanced and fall on the belt. For example, an elderly person tripped onto the belt while attempting to put the baggage on the belt. Secondly, there have been cases of passengers deliberately climbing onto the luggage belt. For example, there have been incidents where children took joyrides on the luggage belt.
[0008] Security:
[0009] It has to be guaranteed that no unauthorized person can enter the security-sterile baggage handling area with the conveyor belt. This could interrupt and delay the baggage loading process at the airport. There are several other security ramifications arising, like theft, vandalism and terrorism.
[0010] Main challenges and problem cases:
[0011] - The algorithm has to have a very low false positive rate. Only one to two bags of the 600,000 bags transported daily are allowed to be registered as false alarms.
[0012] - Crouched humans sitting still on the conveyor belt and thus exhibiting no motion.
[0013] - Occlusion of body parts.
[0014] - Variety of expected postures due to the camera position: standing people, people lying on the conveyor belt in different orientations, crouched humans.
[0015] - "Hot bags" emitting heat similar to humans.
[0016] - Unusual movement like falling, staying still (but moving with the conveyor belt), running, walking.
[0017] - Visual noise: moving shadows, light reflections, arms adjusting the baggage, which should not trigger the alarm.
[0018] - Variety of different movement speeds: running in the same direction as the conveyor belt produces very fast motion and walking in the opposite direction produces very slow motion.
[0019] Authors who have worked on the problem of detecting humans in video include: Thome N., Ambellouis S., Bodor R., Jackson B., Papanikolopoulos N., Bertozzi M., Broggi A., Fascioli A., Graf T., Meinecke M-M., Zhou J., Hoang J., Wren CR, Azarbayejani A., Darrell T., Pentland AP, Gutta S., Brodsky T., Steffens JB, Elagin EV, Nocera LPA, Maurer T., Neven H., Chen H-P, Ozturk O., Yamasaki T., Aizawa K., Zin TT, Takahashi H., Hama H., Gilmore III ET, Frazier PD, Chouikha MF, Dalal N., Triggs B., Schmid C., Viola P., Jones M., Miezianko R., Pokrajac D., Grisleri P., Cutler R., Davis LS, Sidenbladh H., Toth D., Aach T., Lee D-Jye, Zhan P., Thomas A., Schoenberger R., Snow D., Zhu Q., Yeh M-C, Cheng K-T, Avidan S., Suard F., Rakotomamonjy A., Bensrhair A., Del Rose M., Felisa M., Yao J., Odobez JM, Tuzel O., Porikli F., Meer P., Fujimura K., Xu F., Kim HG, Ahn SC, Kim NH, Echigo T., Maeda J., Nakano H., Schwartz WR, Kembhavi A., Harwood D., Fang Y., Yamada K., Ninomiya Y., Horn BKP, Masaki I., Yun T-J, Guo Y-C, Chao G.
[0020] However, there remains a need for a system capable of distinguishing humans from other blobs in a video.
SUMMARY
[0021] A method and system is provided for responding to the presence in a scene of a member of a specified class of objects, the method comprising acquiring an image of the scene using a sensor, identifying in a computer system one or more objects in the image, each object having a radiation intensity distribution, determining a variance of the radiation intensity distribution for each object, and classifying each object according to the variance of the radiation intensity distribution determined for the respective object, and for each object, taking an action if the respective object is classified as one of the specified class of objects.
[0022] In various embodiments, there may be included any one or more of the following features: The sensor may be a thermal imaging device, the image may be a thermal image and the radiation intensity distribution may be a heat intensity distribution. Stationary heat sources may be subtracted from the thermal image before identifying objects in the thermal image. A color image of the scene may further be acquired using a color imaging device, and in the computer system, the one or more objects identified by the computer system in the thermal image may be identified in the color image, a color histogram of each object may be analyzed, and each object classified according to a quantization of colors determined from the color histogram of the respective object. The specified class of objects may be humans. Each object may be classified as human if the variance of the image intensity distribution for the respective object falls within a predetermined range intermediate between a range of variances of image intensity typical for metal objects and a range of variances of image intensity typical for fabric or plastic objects. The scene may be a view of a conveyor belt. The action may comprise stopping the conveyor belt. The action may comprise alerting operating personnel. An edge filter may be applied to each object and each object classified according to a number of edge-like features of the object detected by the edge filter.
[0023] A method is provided for responding to the entry of a member of a specified class of objects into an area, the method comprising acquiring a first sequence of images of a scene with a first imaging device oriented in a manner suitable for detecting members of the specified class of objects, acquiring a second sequence of images of the scene with a second imaging device oriented in a manner suitable to detect whether blobs detected in the second sequence of images are within the area, detecting in a computer system members of the specified class of objects in the first sequence of images, for each member of the specified class of objects detected in the first sequence of images detecting in the computer system a corresponding blob in the second sequence of images, and detecting in the computer system, for each blob corresponding to a member of the specified class of objects, when the respective blob enters the area, and taking an action when a blob corresponding to a member of the specified class of objects is detected to enter the area.
[0024] In various embodiments, there may be included any one or more of the following features: The specified class of objects may be humans. The humans may be detected based on an upright human body classifier. The upright human body classifier may be based on a histogram of oriented gradients. The upright human body classifier may be based on optical flow patterns. The upright human body classifier may be based on covariance features. The first imaging device may be a color camera and the first sequence of images may be a sequence of color images. The second imaging device may be a thermal camera and the second sequence of images may be a sequence of thermal images. The second imaging device may be a color camera and the second sequence of images may be a sequence of color images. Each blob may be tracked using a Kalman filter. The area may be an area above a conveyor belt. The conveyor belt may be a baggage handling conveyor belt. The action may comprise stopping the conveyor belt. The action may comprise alerting operating personnel.
[0025] These and other aspects of the device and method are set out in the claims.

BRIEF DESCRIPTION OF THE FIGURES
[0026] Embodiments will now be described with reference to the figures, in which like reference characters denote like elements, by way of example, and in which:
[0027] Fig. 1 is a block diagram showing the physical layer of an embodiment of an image analysis system;
[0028] Fig. 2 is an illustration of an embodiment of the analytic system using a single thermal camera;
[0029] Fig. 3 is a block diagram showing the steps of the analytics system using a single thermal camera;
[0030] Fig. 4 is a block diagram showing the steps of the heat signature analysis stage of the analytics system;
[0031] Fig. 5 illustrates an embodiment of the invention using one thermal camera and a color video camera;
[0032] Fig. 6 is a block diagram showing the steps of the analytics system when using both a thermal camera and a video camera;
[0033] Fig. 7 is an illustration of an embodiment of the analytics system using a thermal camera and a fish-eye video camera;
[0034] Fig. 8 is a block diagram showing the steps of the analytics system when using both a thermal camera and a fish-eye video camera;
[0035] Fig. 9 is an illustration of an embodiment of the analytics system using a video camera and a fish-eye video camera;
[0036] Fig. 10 is a block diagram showing the steps of the analytics system when several of the described systems are combined using a weighted majority system; and
[0037] Fig. 11 shows a block diagram of the steps of the analytics system using image fusion.
DETAILED DESCRIPTION
[0038] A system is provided for detecting a class of objects at a location, for example humans on a conveyor belt.

[0039] Any combination of the following systems can be used to increase the detection rate and reduce the error rate further by combining information from the systems, for example using majority rating.
[0040] Physical Layer
[0041] Fig. 1 is a block diagram showing the physical layer of an embodiment of an image analysis system. An imaging and analysis subsystem 100 comprises an imaging system 102, in this case consisting of one thermal camera 112, and a computer configured with a software setup 104 which is responsible for the detection of humans. The result from the software setup is passed to actuator system 106. The actuator system 106, which is responsive to the computer system 104, stops the conveyor belt if a human is detected. The computer system may also trigger alarm system 108, which informs the operating personnel about the incident. The imaging system, computer system and actuator may use any conventional communication channels for communications passed between them. The computer system may provide control signals to the imaging system and to the actuator.
[0042] Description of analytic system number 1: Single thermal camera based system
[0043] Fig. 2 is an illustration of an embodiment of the analytic system using a single thermal camera. Thermal camera 112 views the area above conveyor belt 110, in this case from above, in order to detect humans on the conveyor belt.
[0044] Analytics System
[0045] Fig. 3 is a block diagram showing the steps of the analytics system using a single thermal camera. In step 120 the analytics system 104 receives data from the single thermal camera 112. In step 122 the software setup 104 conducts background compensation, in this case by subtracting stationary heat sources, such as light sources and conveyor belt rollers, and the semi-stationary heat source, which is the conveyor belt itself. In step 124 the software setup obtains object meshes from the thermal image to detect an object. In step 126 the heat signature is analyzed, to make a classification in step 128 whether it is a human signature. If a human is detected the alarm will be triggered in step 130.
[0046] Background compensation
[0047] The first stage of the image analysis system subtracts the stationary heat sources, which are the light sources and the conveyor belt rollers, and the semi-stationary heat source, which is the conveyor belt itself.
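As a rough illustration only (not the patented method itself), background compensation of this kind can be sketched with a running-average background frame; the class name, learning rate and clipping behaviour below are assumptions made for the sketch:

```python
import numpy as np

class ThermalBackgroundModel:
    """Illustrative running-average model of stationary heat sources."""

    def __init__(self, first_frame, learning_rate=0.01):
        self.background = first_frame.astype(np.float32)
        self.learning_rate = learning_rate

    def subtract(self, frame):
        frame = frame.astype(np.float32)
        # Foreground = pixels noticeably hotter than the learned background
        # (lights, rollers and the belt surface cancel out here).
        foreground = np.clip(frame - self.background, 0, None)
        # Slowly absorb the current frame so the semi-stationary belt
        # migrates into the background instead of raising alarms.
        self.background += self.learning_rate * (frame - self.background)
        return foreground.astype(np.uint8)
```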
[0048] Detect Object X
[0049] The second stage of the image analysis system detects the presence of a foreground hot object in the area of the conveyor belt itself. The system may use the technique for object detection and tracking described in US Patent 7,616,782 B2.
[0050] The result of the algorithm is a mesh of anchor points which describes the detected object.
[0051] Analyze heat signature of X
[0052] The third stage of the detection system rejects hot objects that are not human to prevent false alarms and passes the result to the actuator, which sounds an alarm and activates an emergency shutdown of the conveyor belt.
[0053] Fig. 4 is a block diagram showing the steps of the heat signature analysis stage of the analytics system. In step 140 data is received on what objects have been detected in step 124. In step 142 it is determined if all detected objects have undergone heat signature analysis. If so, the heat signature analysis stage waits to receive more object detection data. If not, the heat signature analysis stage proceeds in step 144 to receive image data about a detected object. In step 146 the heat signature analysis stage calculates a local histogram of thermal brightness from the image data about the object. In step 148 the variance of the histogram is calculated. In step 150 it is determined if the variance is between preset thresholds. If the variance is between the preset thresholds an alarm is triggered in step 130; otherwise the system proceeds to analyze the next object.
[0054] Calculate Local Histogram hist_X
[0055] The system calculates the histogram of the heat intensity distribution, $\mathrm{hist}_X$, of the object $X$. The histogram describes the probabilities $p_i = \mathrm{hist}_X(i)$, $i = 0 \ldots 255$, of every brightness value $x \in X$.
[0056] $$\mathrm{hist}_X(i) = \frac{\lvert\{x \mid x = i,\ x \in X\}\rvert}{\lvert X \rvert}$$
[0057] Calculate Variance Var(hist_X)
[0058] The variance is a measure of the spread of values about the mean. The system calculates the variance of the local histogram, which can be written in this way:
[0059] $$\mathrm{Var}(\mathrm{hist}_X) = \sum_{i=0}^{255} \mathrm{hist}_X(i) \cdot (i - \mu)^2$$
[0060] The mean is defined as follows:
$$\mu = \sum_{i=0}^{255} \mathrm{hist}_X(i) \cdot i$$
[0061] Is variance between thresholds $t_{metal}$ and $t_{fabric}$?
[0062] Different classes of objects have characteristic heat distributions. Objects made out of metal (as in metal-case baggage) have very sharp heat intensity histograms that exhibit almost zero variance. Objects made of leather/cloth/plastic exhibit heat distributions with very large variance. Thermal images of humans are characterized by narrow heat distributions centered about 35.5 degrees Celsius.
[0063] If the variance of the object is smaller than the variance of objects made out of fabric/plastic/cloth and is greater than the variance for objects made out of metal, then the object is classified as a human:
$$t_{metal} < \mathrm{Var}(\mathrm{hist}_X) < t_{fabric}$$
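By way of illustration, the histogram, variance and threshold test above can be computed as follows; the numeric thresholds are placeholders, since the description only fixes their ordering (metal below the human range, fabric above it):

```python
import numpy as np

def classify_by_heat_variance(object_pixels, t_metal=50.0, t_fabric=2000.0):
    """Classify one detected object from the variance of its heat histogram.

    object_pixels: 1-D array of 8-bit thermal intensities belonging to the
    object X. t_metal and t_fabric are illustrative placeholder thresholds.
    """
    hist, _ = np.histogram(object_pixels, bins=256, range=(0, 256))
    hist = hist / hist.sum()              # hist_X(i) as probabilities p_i
    i = np.arange(256)
    mu = np.sum(hist * i)                 # mean of the distribution
    var = np.sum(hist * (i - mu) ** 2)    # Var(hist_X)
    return "human" if t_metal < var < t_fabric else "not-human"
```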
[0064] The thresholds can be adjusted by the following rules: The system is supplied with the latitude and longitude of the airport location, and this way it can calculate sunrise and sunset times. Based on this information, the system increases the bias towards identifying hot objects as humans when the time of the detection is in the time range from sunset time plus one hour to sunrise time plus one hour. Outside this time range, the bias is increased towards rejecting false positives. The system also rejects more hot-object occurrences during the months of the summer.
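One way these rules might look in code, purely as a sketch: the sunrise and sunset values would come from an ephemeris computed for the airport's latitude and longitude, and the widening/narrowing factors are invented for illustration:

```python
from datetime import timedelta

def adjust_thresholds(t_metal, t_fabric, now, sunrise, sunset,
                      night_widen=0.2, summer_narrow=0.1):
    """Bias the variance window by time of day and time of year (sketch)."""
    width = t_fabric - t_metal
    # Night window: sunset + 1 h through sunrise + 1 h the next morning.
    is_night = (now >= sunset + timedelta(hours=1)
                or now <= sunrise + timedelta(hours=1))
    if is_night:
        # Widen the window: bias towards identifying hot objects as humans.
        t_metal -= night_widen * width
        t_fabric += night_widen * width
    if now.month in (6, 7, 8):
        # Reject more hot-object occurrences during the summer months.
        t_metal += summer_narrow * width
        t_fabric -= summer_narrow * width
    return t_metal, t_fabric
```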
[0065] The system can be manually configured to run only during night hours when traffic is slow and the incident is more likely to happen. This further reduces the chance of false alarms without sacrificing the sensitivity of the system and risking false negatives.
[0066] Description of system number 2a: Thermal camera plus a color camera
[0067] Fig. 5 illustrates an embodiment of the invention using one thermal camera 112 and a color video camera 114. In this embodiment, both the video camera and the thermal camera look at the conveyor belt from above.
[0068] Analytics System
[0069] Fig. 6 is a block diagram showing the steps of an embodiment of the analytics system using both a thermal camera and a video camera. In step 160 image data is received from the cameras. Background compensation may be performed on the thermal data as in Fig. 3, but this is not shown in Fig. 6. In step 124 objects are detected in the thermal data. In step 162 the heat distributions of detected objects are analyzed. The same technique may be used as in step 126 in Fig. 3, shown in more detail in Fig. 4. In step 164 the number of edge-like features in the thermal data is detected. In step 166 geometric correction is performed on the objects detected in the thermal data to identify those objects in the color image data. In step 168 the color distribution of objects identified in the color image data is analyzed. In step 170 the information from steps 162, 164 and 168 is combined to make a determination whether a detected object is human. If the object is determined to be human, in step 130 the alarm is triggered.
[0070] Object Detection
[0071] The image stream from the infrared camera is used to detect moving hot objects.
[0072] The image analysis system detects the presence of a foreground hot object in the area of the conveyor belt itself. A technique for object detection and tracking is described in US Patent 7,616,782 B2.
[0073] The result of the algorithm is a mesh which describes the detected object.
[0074] Geometric correction
[0075] The detected objects are meshes which consist of several anchor points. The position of every anchor point is geometrically transformed to find the corresponding point in the color image. This transformation can be represented as a linear transformation in 3D space. A homography $H$ is a matrix that translates points from one camera plane to another plane. The matrix is computed based on 4 reference points which have to be entered manually. $d$ is the distance between the two cameras, $x_i$ is a point in the thermal image, and $x_i'$ is the corresponding point with respect to the color camera's viewpoint.
[0076] $$x_i' = H x_i + d$$
[0077] This transformation is done to find the area in the viewpoint of the color camera which corresponds to the detection area in the thermal image.
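For a (near-)planar belt surface the camera offset can be folded into a single plane-to-plane homography, which is how the sketch below treats it; the four manually entered reference point coordinates are placeholders:

```python
import numpy as np
import cv2

# Four manually entered reference points in the thermal image and their
# matches in the color image (placeholder coordinates).
thermal_pts = np.float32([[42, 31], [310, 28], [315, 230], [40, 236]])
color_pts = np.float32([[120, 90], [880, 85], [895, 640], [115, 650]])

# H maps points from the thermal camera plane to the color camera plane.
H = cv2.getPerspectiveTransform(thermal_pts, color_pts)

def thermal_to_color(anchor_points):
    """Transform an object mesh's anchor points into color-image coordinates."""
    pts = np.float32(anchor_points).reshape(-1, 1, 2)
    return cv2.perspectiveTransform(pts, H).reshape(-1, 2)
```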
[0078] Heat distribution
[0079] See "Analyze heat signature of X" above.
[0080] Edge Count
[0081] An edge filter is applied to the object and the number of edge-like features is counted. The count has to be smaller than a threshold $t_{edge}$, because baggage pieces are more likely to have edge-like features.
[0082] Typical edge filters or corner filters are, for example, the Sobel operator, the Laplace operator or SUSAN.
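A sketch of the edge count using the Sobel operator named above; the gradient magnitude cut-off and the value of t_edge are illustrative assumptions:

```python
import cv2
import numpy as np

def edge_feature_count(gray_patch, magnitude_thresh=100.0, t_edge=500):
    """Count edge-like pixels in an object patch and compare to t_edge."""
    gx = cv2.Sobel(gray_patch, cv2.CV_32F, 1, 0)
    gy = cv2.Sobel(gray_patch, cv2.CV_32F, 0, 1)
    magnitude = cv2.magnitude(gx, gy)
    count = int(np.count_nonzero(magnitude > magnitude_thresh))
    return count, count < t_edge   # True favours the "human" hypothesis
```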
[0083] Color histogram analysis
[0084] In most cases, baggage pieces don't have more than three colors. Humans can be more colourful because they have a lot of different styles of clothing and wear different pieces of clothing (e.g. clothing for the upper body and lower body). Skin and hair color is also different from human to human.
[0085] The more different colors an object has, the more likely it is to be a human.
[0086] The frequency of the quantized colors is measured and has to exceed a threshold.
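The color test could be sketched as follows; the quantization granularity (2 bits per channel) and the minimum coverage fraction are assumptions, not values from the patent:

```python
import numpy as np

def distinct_color_count(bgr_patch, bits_per_channel=2, min_fraction=0.02):
    """Count quantized colors that each cover a meaningful share of the object."""
    shift = 8 - bits_per_channel
    q = (bgr_patch.astype(np.uint8) >> shift).reshape(-1, 3).astype(np.int32)
    # Pack the three quantized channels into one color code per pixel.
    codes = ((q[:, 0] << (2 * bits_per_channel))
             | (q[:, 1] << bits_per_channel) | q[:, 2])
    _, counts = np.unique(codes, return_counts=True)
    significant = counts / codes.size >= min_fraction
    return int(significant.sum())   # higher counts favour "human"
```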
[0087] Object has human signature?
[0088] The final decision whether an object is a human or not is based on all three parameters: heat distribution, edge count and color distribution.
[0089] The three different parameters can be weighted separately.
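A sketch of one possible weighted combination; the per-test scores, weights and decision threshold are placeholders that would be tuned per installation:

```python
def human_signature(heat_score, edge_score, color_score,
                    weights=(0.5, 0.25, 0.25), t_human=0.5):
    """Combine the heat, edge and color tests into one decision (sketch).

    Each score is assumed normalized to [0, 1], where 1 is human-like.
    """
    w_heat, w_edge, w_color = weights
    score = w_heat * heat_score + w_edge * edge_score + w_color * color_score
    return score >= t_human
```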
[0090] The system can be manually configured to run only during night hours when traffic is slow and the incident is more likely to happen. This further reduces the chance of false alarms without sacrificing the sensitivity of the system and risking false negatives.
[0091] Description of the system number 2b: Thermal camera plus a color camera
[0092] Fig. 7 is an illustration of an embodiment of the analytics system using a thermal camera 112 and a fish-eye video camera 116. In this embodiment the thermal camera looks at the conveyor belt 110 from above with a field of view extending into a neighbouring area from which humans may interact with the conveyor belt. Fish-eye video camera 116 has a field of view which also extends from the belt area into the neighbouring area.
[0093] Fig. 8 is a block diagram showing the steps of the analytics system in an embodiment using both a thermal camera and a fish-eye video camera. In step 180 the system receives data from the cameras. In step 182 the system detects upright humans in the data from the fish-eye video camera. In step 184 the system applies geometric correction to the detected upright humans to detect heat blobs corresponding to the upright humans in the data from the thermal camera. In step 186 the system tracks the heat blobs detected as corresponding to upright humans, for example using a Kalman filter. In step 188 the system detects if a blob detected as corresponding to an upright human coincides with the belt area. If so, in step 130 the system triggers the alarm. If not, the system continues to perform steps 180 to 186.
[0094] Upright Human detection
[0095] The first stage in the detection is to analyze the image of the color camera and detect silhouettes of human beings based on a multi-scale upright human body classifier.
[0096] The classification of an upright human can be based on, for example, Histograms of Oriented Gradients (N. Dalal and B. Triggs), Optical Flow Patterns (H. Sidenbladh) or covariance features (O. Tuzel, F. Porikli, and P. Meer).
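OpenCV ships a Dalal-Triggs HOG pedestrian detector that can stand in for the upright human body classifier in a sketch like the following; a fish-eye frame would normally be dewarped first, a step omitted here:

```python
import cv2

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def detect_upright_humans(frame_bgr):
    """Multi-scale sweep for upright human silhouettes in a color frame."""
    rects, weights = hog.detectMultiScale(frame_bgr, winStride=(8, 8),
                                          padding=(8, 8), scale=1.05)
    return rects   # (x, y, w, h) boxes around detected silhouettes
```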
[0097] Geometric correction
[0098] The output of the classifier is geometrically corrected to find the corresponding heat blob in the heat intensity image of the thermal camera.
[0099] This is the same method as described in "Geometric correction" in relation to Fig. 6, but this time the origin is the color image and the target is the thermal image.
[00100] Heat blob tracking
[00101] The heat blob identified as human is marked and tracked by means of a Kalman filter in the view of the thermal camera. The system activates the alarm if the marked blob track starts to coincide with the belt area.
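A constant-velocity Kalman filter over the blob centroid is one conventional realization of this step; the noise covariances below are illustrative values:

```python
import cv2
import numpy as np

def make_blob_tracker(x, y):
    """Kalman filter for one heat blob; state is (x, y, vx, vy)."""
    kf = cv2.KalmanFilter(4, 2)
    kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                    [0, 1, 0, 1],
                                    [0, 0, 1, 0],
                                    [0, 0, 0, 1]], np.float32)
    kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                     [0, 1, 0, 0]], np.float32)
    kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-2
    kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1
    kf.statePost = np.array([[x], [y], [0], [0]], np.float32)
    return kf

def track_step(kf, measured_xy):
    """Advance the track by one frame and return the smoothed centroid."""
    kf.predict()
    state = kf.correct(np.float32(measured_xy).reshape(2, 1))
    return float(state[0, 0]), float(state[1, 0])
```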
[00102] Description of the system number 3: Two color cameras
[00103] Fig. 9 is an illustration of an embodiment of the analytics system using a video camera and a fish-eye video camera. The analysis for this setup may be the same as the analysis shown in Fig. 8 for the thermal camera and fish-eye camera setup shown in Fig. 7, except that the blobs tracked in step 186 are not heat blobs.
[00104] Upright human detection
[00105] Upright human detection is performed as described in relation to Fig. 8.
[00108] Geometric correction
[00109] The analysis system corrects for viewpoint and geometry and identifies the same human objects in the scene of the second color camera. This transformation can be represented as a linear transformation in 3D space (J. Han and B. Bhanu). A homography $H$ is a matrix that translates points from one camera plane to another plane. It is precomputed based on 4 reference points. $d$ is the distance between the two cameras.
[00110] $$x' = H x + d$$
[00111] Blob tracking
[00112] The positions of the upright humans are tracked in the view of the second color camera.
[00113] See "Heat blob tracking" as described in relation to Fig. 8.
[00114] Blob coincides with conveyor belt?
[00115] The system activates the alarm if the marked blob track starts to coincide with the belt area. The area of the conveyor belt is defined by a bounding box. As soon as the blob enters the bounding box the alarm will be triggered.
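The coincidence test itself reduces to a point-in-box check on the tracked centroid; the belt box coordinates below are site-specific placeholders:

```python
def blob_in_belt_area(blob_xy, belt_box=(100, 50, 500, 300)):
    """True once a tracked blob centroid enters the belt bounding box.

    belt_box = (x_min, y_min, x_max, y_max), defined once per installation.
    """
    x, y = blob_xy
    x_min, y_min, x_max, y_max = belt_box
    return x_min <= x <= x_max and y_min <= y <= y_max
```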
[00116] Combination of the described systems
[00117] Fig. 10 is a block diagram showing the steps of an embodiment of the system in which several of the described imaging and analytic systems are combined. There are multiple imaging and analysis subsystems 100, each with an imaging system 102 and an analytics system 104 (a computer configured with software for carrying out the disclosed methods). The outputs of the imaging and analysis subsystems are combined in step 190, for example using majority voting, to produce an overall decision. If the overall decision is that there is a human on the conveyor belt, actuator system 106 stops the conveyor belt, and alarm system 108 informs the operating personnel about the incident.
[00118] Majority voting
[00119] The decisions of every imaging and analysis subsystem can be combined to increase the detection rate and reduce the error rate.
[00120] $A_n \in \{1, 0\}$: decision of subsystem $n$.
[00121] Voting decision
[00122] Trigger alarm if a certain number of subsystems detect humans: $\sum_{i=0}^{n} A_i > t_{alarm}$
[00123] Simple majority
[00124] Trigger alarm if the majority of subsystems detect humans: $\sum_{i=0}^{n} A_i > \frac{n}{2}$
[00125] Weighted majority
[00126] Trigger alarm if a certain threshold of positive detections is reached, but each subsystem has a different weight $w_n$ which reflects its informative value (e.g. a thermal camera could have more weight than a normal camera because it is more suited to detect humans): $\sum_{i=0}^{n} A_i w_i > t_{alarm}$
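The three voting rules above translate directly into code; here `decisions` holds the per-subsystem values A_n, and the thresholds and weights are configuration values:

```python
def threshold_vote(decisions, t_alarm):
    """Trigger if more than t_alarm subsystems detect humans."""
    return sum(decisions) > t_alarm

def simple_majority(decisions):
    """Trigger if a strict majority of the subsystems detect humans."""
    return sum(decisions) > len(decisions) / 2

def weighted_majority(decisions, weights, t_alarm):
    """Trigger when the weighted sum of detections exceeds t_alarm,
    e.g. a thermal subsystem may carry more weight than a color camera."""
    return sum(a * w for a, w in zip(decisions, weights)) > t_alarm
```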
[00127] More techniques of combining the output of systems than majority voting may be used. For example, instead of each imaging and analysis subsystem producing a binary human-on-belt / no-human-on-belt decision, each subsystem may produce a likelihood of a human on the belt given the observed data, which may include factors such as time of day or outside air temperature, and the likelihoods produced by the subsystems may be combined to produce an overall likelihood (or combined along with a prior to produce an overall probability) which may be compared to a threshold to produce a binary human-on-belt / no-human-on-belt decision. The combination of the likelihoods may assume independence or take into account the non-independence of the systems. In a further embodiment, each subsystem produces a likelihood for each of a number of locations and the likelihoods produced at each location are combined to produce an overall decision as to whether there is a human at that location. In a still further embodiment, blobs detected by each subsystem are correlated and each subsystem produces a likelihood for each blob, and the likelihoods produced for each blob are combined to produce an overall decision as to whether the blob is human and on the belt.
[00128] Image Fusion
[00129] Fig. 11 shows a block diagram of the steps of the analytics system using image fusion. Images are received from multiple cameras or imaging systems 102. Optionally, feature detection is then performed on the received images in step 192. In step 194 the information of the images from the different sources is combined. In step 196 an analytics system processes the combined images to produce a determination as to whether there is a human on the conveyor belt. If there is, actuator system 106 stops the conveyor belt, and alarm system 108 informs the operating personnel about the incident.
[00130] Image Fusion
[00131] The process of image fusion combines the information of multiple image sources before the image is analysed. This can result in better performance. Images can also be fused after a process of feature detection such as edge detection.
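As a sketch of pixel-level fusion, assuming the frames are already registered to a common viewpoint (the blending weight is an arbitrary placeholder):

```python
import cv2

def fuse_thermal_color(thermal_gray, color_bgr, alpha=0.5):
    """Blend registered thermal and color frames into one image for analysis."""
    color_gray = cv2.cvtColor(color_bgr, cv2.COLOR_BGR2GRAY)
    # Match the thermal resolution to the color frame; shape is (h, w),
    # while cv2.resize expects (width, height).
    thermal_resized = cv2.resize(thermal_gray, color_gray.shape[::-1])
    return cv2.addWeighted(color_gray, alpha, thermal_resized, 1 - alpha, 0)
```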
[00132] While the embodiments shown detect humans on a conveyor belt, the techniques used may detect other objects on a conveyor belt, humans in places other than on a conveyor belt, or other objects in other locations. The computer used for the analysis system may be any computing device now known or later developed that is configured to carry out the processes described here. The computing devices may for example be personal computers programmed to carry out the described processes, or may be application-specific devices that are hard-wired to carry out the described processes. Communications between the various apparatus may use any suitable communication links, such as wires or wireless, that supply a sufficient data rate. The required communication links and general-purpose computing devices required for implementing the method steps described here after suitable programming are already known and do not need to be described further.
[00133] Immaterial modifications may be made to the embodiments described here without departing from what is covered by the claims. In the claims, the word "comprising" is used in its inclusive sense and does not exclude other elements being present. The indefinite articles "a" and "an" before a claim feature do not exclude more than one of the feature being present. Each one of the individual features described here may be used in one or more embodiments and is not, by virtue only of being described here, to be construed as essential to all embodiments as defined by the claims.

Administrative Status


Event History

Description Date
Inactive: IPC expired 2023-01-01
Revocation of Agent Requirements Determined Compliant 2020-04-22
Appointment of Agent Requirements Determined Compliant 2020-04-22
Common Representative Appointed 2019-10-30
Common Representative Appointed 2019-10-30
Grant by Issuance 2018-10-30
Inactive: Cover page published 2018-10-29
Pre-grant 2018-09-18
Inactive: Final fee received 2018-09-18
Notice of Allowance is Issued 2018-08-21
Letter Sent 2018-08-21
Notice of Allowance is Issued 2018-08-21
Inactive: Q2 passed 2018-08-16
Inactive: Approved for allowance (AFA) 2018-08-16
Amendment Received - Voluntary Amendment 2018-03-15
Inactive: S.30(2) Rules - Examiner requisition 2017-09-15
Inactive: Report - No QC 2017-09-13
Letter Sent 2016-11-14
Request for Examination Requirements Determined Compliant 2016-11-08
All Requirements for Examination Determined Compliant 2016-11-08
Request for Examination Received 2016-11-08
Inactive: Cover page published 2013-07-22
Application Published (Open to Public Inspection) 2013-07-16
Inactive: IPC assigned 2012-04-03
Inactive: IPC assigned 2012-02-16
Inactive: First IPC assigned 2012-02-16
Inactive: IPC assigned 2012-02-16
Inactive: IPC assigned 2012-02-16
Inactive: IPC assigned 2012-02-16
Inactive: IPC assigned 2012-02-16
Inactive: IPC assigned 2012-02-14
Inactive: IPC assigned 2012-02-14
Application Received - Regular National 2012-01-27
Filing Requirements Determined Compliant 2012-01-27
Inactive: Filing certificate - No RFE (English) 2012-01-27
Small Entity Declaration Determined Compliant 2012-01-16

Abandonment History

There is no abandonment history.

Maintenance Fee

The last payment was received on 2018-10-24


Fee History

Fee Type Anniversary Year Due Date Paid Date
Application fee - small 2012-01-16
MF (application, 2nd anniv.) - small 02 2014-01-16 2013-12-18
MF (application, 3rd anniv.) - small 03 2015-01-16 2015-01-08
MF (application, 4th anniv.) - small 04 2016-01-18 2016-01-07
Request for examination - small 2016-11-08
MF (application, 5th anniv.) - small 05 2017-01-16 2016-11-08
MF (application, 6th anniv.) - small 06 2018-01-16 2017-12-14
Final fee - small 2018-09-18
MF (application, 7th anniv.) - small 07 2019-01-16 2018-10-24
MF (patent, 8th anniv.) - small 2020-01-16 2019-10-18
MF (patent, 9th anniv.) - small 2021-01-18 2020-11-11
MF (patent, 10th anniv.) - small 2022-01-17 2021-11-09
MF (patent, 11th anniv.) - small 2023-01-16 2022-11-04
MF (patent, 12th anniv.) - small 2024-01-16 2023-11-21
Owners on Record

Note: Records showing the ownership history in alphabetical order.

Current Owners on Record
INTELLIVIEW TECHNOLOGIES INC.
Past Owners on Record
MOHAMED SHEHATA
TAMER MOHAMED
WAEL BADAWY
Documents



Document Description | Date (yyyy-mm-dd) | Number of pages | Size of Image (KB)
Description 2012-01-16 17 682
Claims 2012-01-16 5 149
Drawings 2012-01-16 8 66
Abstract 2012-01-16 1 17
Representative drawing 2013-06-18 1 6
Cover Page 2013-07-22 1 41
Description 2018-03-15 17 685
Claims 2018-03-15 6 206
Representative drawing 2018-10-01 1 5
Cover Page 2018-10-01 1 39
Filing Certificate (English) 2012-01-27 1 167
Reminder of maintenance fee due 2013-09-17 1 112
Reminder - Request for Examination 2016-09-19 1 119
Acknowledgement of Request for Examination 2016-11-14 1 175
Commissioner's Notice - Application Found Allowable 2018-08-21 1 162
Maintenance fee payment 2023-11-21 1 26
Maintenance fee payment 2018-10-24 1 25
Final fee 2018-09-18 1 32
Correspondence 2012-01-27 1 46
Request for examination 2016-11-08 1 30
Fees 2016-11-08 1 25
Examiner Requisition 2017-09-15 6 292
Maintenance fee payment 2017-12-14 1 25
Amendment / response to report 2018-03-15 12 375
Maintenance fee payment 2019-10-18 1 25
Maintenance fee payment 2020-11-11 1 26
Maintenance fee payment 2021-11-09 1 26
Maintenance fee payment 2022-11-04 1 26