MARK OF THE BEAST

By the time it arrives, it will be common technology

By now, at least a hundred books must have been written speculating about what the mark of the Beast will be. It really doesn't matter anymore. Indeed, it will not be needed for identification, for that is already possible in about seven different ways. Read the following.
I think it is about time we understand that the mark of the Beast will be
first and foremost a spiritual act, not a codifying of the human race. A
discussion of that notion will lead a lot closer to some believable prognostications,
though even that is suspect at this early date.

Caught on camera
New Scientist
http://www.newscientist.com/ns/19990925/caughtonca.html
You can run, but you can't hide. Big Brother can recognise your face even if you're in disguise, says Nick Schoon.

EVERY DAY, EVERY
MINUTE, video cameras scan the crowds in a busy shopping centre. But this is no
ordinary public surveillance system. The faces of all those passers-by are being
converted into digital code and processed by computer. Continuously, instantaneously,
the facial code of each stranger is compared with that of several dozen local
criminals. If there is a match, an alarm sounds in a control room, a human operator
double-checks the computer's assessment and alerts the police. Someone they want
to keep tabs on, or someone they want to arrest, is in town... This
is not a scene from some Orwellian future. The system has been running for a year
on the streets of Newham in East London--the first trial of its kind in the world.
And the potential of this kind of system doesn't stop there. Last month, several
large stores in Regent Street, London, began using a system that includes a database
of convicted shoplifters who operate in the area. When a store detective catches
someone they think has been stealing and takes them to the manager's office, a
camera mounted on the wall captures their face, which is automatically checked
against the database. You would also expect this technology to be leapt on by
state security services, such as those unseen watchers who scan airport crowds
for terrorists and drug smugglers. And sure enough, a face-recognition system
devised by Neurodynamics, a company based in Cambridge, England, is being tested
in secret at a British airport. Facial recognition technology creates
new opportunities for companies and government agencies that want to keep tabs
on people. But it is an infant technology which some fear may be trying to run
before it can walk. Is it reliable enough to be used in such a sensitive field
as public safety? And have our rights to privacy been adequately debated?
We are our faces. To our fellow human beings, if not to ourselves, they
are the key identifiers. Our brains have exquisite machinery for processing and
storing a particular arrangement of eyes, nose and mouth and for picking it out
from other very similar arrangements. This ability is now being passed on to computers.
True, facial recognition systems have worked for years under ideal conditions
in labs. But they are now moving out. Recent increases in processing power and
improved algorithms can give even fairly ordinary PCs the ability to capture faces
in the hustle and bustle of the real world. They can spot people who are on the
move and are not facing square on to the camera, and they can compensate for changing
light levels. Visionics of New Jersey, the company behind the trials in Newham
and Regent Street, claims that its technology is not even fooled by hats, spectacles,
beards and moustaches (see "I see you").

We know who you are

Surveillance
is not the only way this technology can be used. In fact, much of the impetus behind
it comes from a different branch of the security industry--one which wants ways
to make sure we are who we say we are. "Biometric" features such as faces, fingerprints,
irises, hand geometry and DNA are powerful identifiers, and facial recognition
offers important advantages over its rivals. It is remote, quick and convenient--the
machine equivalent of a cool glance from a stranger. Many people don't like having
to touch things that thousands of others have fingered and they are certainly
not keen on parting with bodily fluids. So, when employees in Texas
withdraw their pay cheques at hole-in-the-wall cash machines, a camera and computer
make sure that their faces and PIN codes match. On the Mexican border, people
can "fast track" their way into the US after submitting their faces to automated
scans. And in Georgia, digital images of applicants for a driving licence are
checked against the facial codes of the millions of other licence-holders. (The
system has already detected the same face on more than a dozen different licences.)
But what is so special, and slightly sinister, about facial recognition technology is that people need never know that their identity is being checked, which is
where many surveillance projects begin. The project in Newham is slightly
different in that the local council and London's Metropolitan Police want criminals
to know they're being watched. The system works by picking out as many faces as
possible from passing crowds, demonstrating that it has "acquired" a face by placing
a red ring around it on a monitor in Newham's closed circuit television control
room. If the software finds a good match between that face and one in its database
of mug shots, the ring turns green and an alarm sounds. On another
screen, a close-up of the face on the street flashes up alongside that of the
criminal. Then human judgment comes into play. Only if Newham's civilian operator
in the control room considers that the two faces are identical does she or he
phone the local police station. For the police, the system is a potentially useful
tool for gathering intelligence about local villains as much as a way to locate
wanted criminals.
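The decision flow described above--digitise each acquired face, score it against the mug-shot database, and hand anything above a threshold to a human operator--can be sketched roughly as follows. This is only an illustration: FaceIt's actual encoding, scoring metric and threshold are proprietary, so the cosine similarity, the threshold value and the function names here are all assumptions.

```python
import numpy as np

# A minimal sketch of the control-room decision flow, modelling faces
# as plain feature vectors. FaceIt's real encoding, scoring and
# threshold are proprietary; everything here is illustrative.

MATCH_THRESHOLD = 0.9  # assumed value, not FaceIt's actual setting

def similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face codes (stand-in metric)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def screen_face(probe: np.ndarray, mugshots: dict):
    """Compare one acquired face against the mug-shot database."""
    best_id = max(mugshots, key=lambda k: similarity(probe, mugshots[k]))
    if similarity(probe, mugshots[best_id]) >= MATCH_THRESHOLD:
        return best_id  # "green ring": alarm sounds, but a human
                        # operator double-checks before phoning police
    return None         # no match: the digitised face is discarded
```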
Before the Visionics system, called FaceIt, went live last October, Newham carried out tests to see if it could detect staff members
whose images had been placed on the database. They paraded before the cameras
wearing hats, glasses and other disguises, but the software still recognised them.
Since the system went live, however, it has succeeded in identifying only two
criminals on its database walking the streets of Newham. In both instances, the
police decided not to make an arrest. There are two possible explanations
for this low detection rate. The first is that villains were so alarmed when they
heard about this new crime-fighting technology that they decided to stay away
and have done so ever since. The second is that criminals are still coming into
the area but the system isn't spotting them. Bob Lack, Newham's head of security,
is hedging his bets on which explanation is correct. He is delighted that crime
in the target area has fallen and believes the system could be acting as a deterrent.
But he also accepts that the software still needs improving. Criminals do, it
seems, have a good chance of going undetected, although not surprisingly Lack
and the Metropolitan Police are reluctant to discuss this. FaceIt
was initially attached to only six of the 154 video cameras scanning Newham's
streets, and it does not acquire every face "seen" by those six. So what was originally
intended as a six-month trial has been extended, apparently indefinitely. The aim is to have it linked to more cameras and adapted for "multi-heading"--acquiring
three faces at a time from the passing crowds. Newham is seeking Home Office funding
for this expansion; in the meantime, the system remains operational.
Lack is keen to stress the multitude of measures designed to prevent the system
being abused. The control room is closely guarded and the digitised faces of men
and women in the streets are discarded as soon as the system finds no match. Furthermore,
the control room operators see only the faces of criminals on the database: they
do not see their names or records. That information stays with the police, who
will not say who is on the database or even how many offenders are included.
Nothing to fear?

An internal police committee chooses
the list of names, which probably embraces paedophiles and drug dealers as well
as violent robbers and burglars. David Holdstock, a spokesman for the Metropolitan
Police, says people convicted of trivial offences, or who were not regarded as
active criminals, would never go on the database. "It's pretty focused and pretty
targeted," he said. "If you're innocent you have nothing to fear."
Others are not convinced by this reassurance. When the project began, the Metropolitan
Police reckoned that the system made accurate matches in 80 per cent of cases,
says Liz Parratt of Liberty, Britain's council for civil liberties. "I see that
as a 20 per cent failure rate," she says. "That's an unacceptable level of error."
How many innocent people must have their collar felt for every criminal caught?
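Liberty's worry is at heart a base-rate argument, and a rough calculation shows why. The numbers below are entirely hypothetical--the article reports no per-face error rates--but they illustrate how even an apparently good accuracy figure can generate far more false alarms than genuine hits when wanted faces are rare among those scanned.

```python
# Entirely hypothetical numbers--the article gives no per-face error
# rates--showing how a rare-offender base rate magnifies false alarms.

faces_scanned = 10_000   # assumed daily crowd past the cameras
offenders_present = 5    # assumed wanted faces actually in that crowd
hit_rate = 0.80          # the Met's "80 per cent accurate" reading
false_match_rate = 0.20  # Liberty's "20 per cent failure" reading

false_alarms = (faces_scanned - offenders_present) * false_match_rate
true_hits = offenders_present * hit_rate
print(false_alarms, true_hits)  # ~1999 innocent flags for ~4 real hits
```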
There's also the more general issue of whether we are watched enough already.
Britain has become the surveillance capital of Europe, with cameras in virtually
every city and large town, and many smaller ones. "Everyone wants to see crime
fall," says Parratt, "But I'm not sure I want to live in a society in which when
I walk down the road my face is compared with a couple of hundred local criminals."
Robin Wales, the elected leader of Newham Council, sees things the other
way round. "There is a civil liberties issue at stake here," he says. "It's the
right of our citizens to live in Newham without fear of crime." He claims widespread
local support for the council's decision to introduce the technology. Britain's
Data Protection Registrar, whose task is to protect the interests of individuals
against data collectors, has scrutinised Newham's system and is satisfied with
the safeguards. But what happens next might be a problem. If the trial is eventually
pronounced a success in reducing crime then it may well be extended to other public
surveillance systems, first in Britain and then elsewhere. And if these systems
share a large, common database then criminals on it would, in effect, be "electronically
tagged". The police would know whenever they showed their faces in any place covered
by cameras. Clearly this could be a powerful deterrent to crime, but
it would take us a big step nearer to a world in which Big Brother really is continuously
watching everyone. "We would have serious concerns about such a general system,"
says Jonathan Bamford at the Office of the Data Protection Registrar. "We think
it would be inappropriate." After all, the decision to sentence offenders to be
electronically tagged, which gives the authorities a constant check on their location,
is made by the court, not the police. So the spread of this technology
raises an old but important question: who guards the guardians? The actions of
police, other state authorities and private companies in compiling and maintaining
face databases need to be kept under scrutiny. How serious a crime do you have
to commit before your face goes on a system? How long do you have to stay clear
of crime before your face is removed? Do you have the right to know that you are
on a database? In Newham, there has been no debate about these issues.
These questions are not just of interest to the criminally inclined. A small but fast-growing number of PCs are sold with an integral video camera facing the
operator. One potential use for this is to check a person's identity when they
log on to a network or try to buy something over the Internet. Chris Solomon,
who works on facial recognition at the University of Kent in Canterbury, believes
this kind of security system will be commonplace within a few years.
Visionics' rival Miros, a company based in Wellesley, Massachusetts, is already
marketing software that gives access to secure Web servers only to those who have
registered their faces. Solomon sees great advantages and a host of applications
in the technology. "But there are concerns," he says. "Like so many things in
life, it's a stick with two ends."
I see you
BOTH IN GREY MATTER and in silico, facial recognition is a two-stage process.
First you need to detect that a face is in the visual field. Then you need to
lock onto it and rapidly compare it with a database of thousands of faces in memory.
This process needs to allow for the fact that faces differ from moment to moment
and year to year. The most obvious source of variation is the angle from which a face is seen, but there are others, from changing light levels and skin tones to
expressions and facial hair.
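That two-stage structure can be sketched as a skeleton, shown below. The detector and the encoder are deliberately left as stubs, since the text goes on to describe several concrete choices for each; only the overall shape of the pipeline is taken from the description above, and the threshold is an arbitrary example.

```python
import numpy as np

# Skeleton of the two-stage process just described. The detector and
# encoder are stubs; the approaches discussed below (eigenfaces,
# element geometry) are concrete ways to fill in encode().

def detect_faces(frame: np.ndarray) -> list:
    """Stage 1: find face regions in the frame (stub detector)."""
    return []  # a real detector returns cropped, normalised face images

def encode(face: np.ndarray) -> np.ndarray:
    """Turn a face crop into a compact numeric code (stub encoding)."""
    return np.zeros(40)

def recognise(frame: np.ndarray, database: dict) -> list:
    """Stage 2: compare each detected face with the stored codes."""
    found = []
    for face in detect_faces(frame):
        code = encode(face)
        for name, stored in database.items():
            if np.linalg.norm(code - stored) < 1.0:  # example threshold
                found.append(name)
    return found
```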
Our brains overcome these confusions, allowing us to recognise a face with astonishing rapidity, along with a host of
accompanying information--favours owed or owing, character assessment and (if
we're lucky) a name. Algorithms capable of mimicking this ability, at least in part, improved through the 1990s, and increases in computing power and the falling cost of computers and video technology are starting to make facial recognition affordable.

Several different approaches to automatic
face recognition have emerged. One builds up a person's face by overlaying many
different facial types, known as eigenfaces. A computer varies the brightness
of every eigenface in turn until it finds a combination that resembles the face.
The set of numbers representing the brightnesses of the eigenfaces can then be
compared with other sets stored in memory (Sleuth City, New Scientist supplement,
4 October 1997, p1).
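As a rough illustration of the eigenface idea, the sketch below computes eigenfaces as principal components of a training set and represents each face by its projection weights--the "brightnesses" just described. The shape of the training matrix, the choice of k and the Euclidean matching are assumptions made for the example; production systems add alignment and more careful metrics.

```python
import numpy as np

# A rough eigenface sketch: eigenfaces are principal components of a
# training set of aligned grey-scale faces, and each face is stored as
# its projection weights--the "brightnesses" described above.

def build_eigenfaces(faces: np.ndarray, k: int):
    """faces: (n_images, n_pixels). Returns the mean face and k eigenfaces."""
    mean = faces.mean(axis=0)
    # right singular vectors of the centred data are the eigenfaces
    _, _, vt = np.linalg.svd(faces - mean, full_matrices=False)
    return mean, vt[:k]

def face_code(image: np.ndarray, mean: np.ndarray, eig: np.ndarray):
    """Project a flattened face onto the eigenfaces; these k weights
    form the code that is compared with the codes stored in memory."""
    return eig @ (image - mean)

def nearest(code: np.ndarray, stored: np.ndarray) -> int:
    """Index of the closest stored code (Euclidean distance)."""
    return int(np.argmin(np.linalg.norm(stored - code, axis=1)))
```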
The Visionics system, deployed in Newham, works by dividing up the image of a face into between 12 and 40 elements, such as the
outer corners of the eyes, tip of the nose and ends of the eyebrows. It then measures
the distances between these elements, and gauges the angles between lines that
are drawn between them. These numbers become the digital identity of a face. The
software can compensate for head movements and changes in expression because it
"knows" that the elements can move slightly relative to one another.
The system focuses on a triangle between the base of the nose and the outer edges
of the eyebrows, so, says the company, moustaches and beards do not confuse it.
Nor, it claims, do spectacles--unless heavy shades or glare occludes the eyes.
Faces can be turned up to 15 degrees away from the camera without any loss of
recognition performance. Once that angle is exceeded, its ability to identify
starts to deteriorate.
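A toy version of this element-geometry approach might look like the following: given landmark coordinates (eye corners, nose tip, eyebrow ends), it builds a code from scale-normalised pairwise distances, with a helper for angles such as those inside the nose-eyebrow triangle just mentioned. Landmark detection itself is assumed, and the real element set and measures are Visionics' own.

```python
import numpy as np
from itertools import combinations

# A toy element-geometry code: landmarks become a vector of pairwise
# distances, scale-normalised so the code does not change with the
# face's distance from the camera. Landmark detection is assumed.

def geometry_code(landmarks: np.ndarray) -> np.ndarray:
    """landmarks: (n_points, 2) pixel coordinates of facial elements.
    Returns all pairwise distances, divided by the largest."""
    pairs = combinations(range(len(landmarks)), 2)
    d = np.array([np.linalg.norm(landmarks[i] - landmarks[j])
                  for i, j in pairs])
    return d / d.max()

def angle_at(a: np.ndarray, b: np.ndarray, c: np.ndarray) -> float:
    """Angle in radians at vertex b between the lines b-a and b-c,
    e.g. within the nose-eyebrow triangle mentioned above."""
    u, v = a - b, c - b
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))
```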
Of course, before the work of recognising can begin, the system has to find a face within the video camera's field of view.
It does this with another set of pattern-matching algorithms that detect "head-shaped" objects. Then, once a face has been detected, it is "normalised"
to compensate for variations in lighting condition, size (or distance from the
camera) and skin colour. This is rather like taking a photograph of a crowd, enlarging
each face in it to fill the same size of frame and adjusting the contrast and
the average shade of each to standard values.
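The normalisation step lends itself to a short sketch. The version below rescales a grey-scale face crop to a fixed frame and shifts its mean shade and contrast to standard values; the frame size and target statistics are arbitrary choices for illustration, and the crude nearest-neighbour resampling simply keeps the example dependency-free.

```python
import numpy as np

# Sketch of the normalisation just described: rescale each face crop
# to a standard frame, then shift its average shade (mean) and
# contrast (standard deviation) to standard values.

def normalise(face: np.ndarray, size: int = 64,
              target_mean: float = 128.0, target_std: float = 48.0):
    """face: 2-D grey-scale crop. Returns a size x size image with a
    fixed average shade and contrast."""
    h, w = face.shape
    rows = np.arange(size) * h // size   # nearest-neighbour row picks
    cols = np.arange(size) * w // size
    small = face[np.ix_(rows, cols)].astype(float)
    small = (small - small.mean()) / (small.std() + 1e-9)
    return np.clip(small * target_std + target_mean, 0, 255)
```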
What can you do to fool a face recognition system? Sporting a bag over your head might arouse suspicion.
Wearing heavy shades and a daft expression would almost certainly throw the software
off the scent, but you'd just have to hope that you didn't bump into someone you
knew.