'Automatic Facial Recognition' Systems Have Completely Failed at This Task

Facial recognition has come into widespread use since face-unlock features were launched on smartphones. Security agencies and police in many countries have started using automatic facial recognition to catch criminals; the countries where government and private security agencies deploy it include the US and China. In India too, the use of automatic facial recognition is on the rise. Here are the full details.

The National Crime Records Bureau (NCRB) has also invited tenders from several companies for an automated facial recognition system. It will be an Internet-connected system controlled from the NCRB's data centre in New Delhi, and every police station will have access to it through the Crime and Criminal Tracking Network and Systems (CCTNS). Through this system, as soon as a person on the blacklist appears on a CCTV camera, the offender will be identified and an alert sent out.
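The article does not describe the NCRB system's internals, but the workflow it outlines (scan a face from CCTV footage, compare it against a blacklist database, and raise an alert on a match) can be sketched roughly as follows. This is a minimal, hypothetical sketch: the function names, the embedding-based cosine-similarity matching and the 0.8 threshold are illustrative assumptions, not details of the actual NCRB or CCTNS system.

```python
# Hypothetical sketch of a CCTV face-matching alert pipeline as described above:
# detect a face, compare it with a blacklist database, and alert the police when
# the match score crosses a threshold. All names and values are illustrative.
from dataclasses import dataclass
from typing import List, Optional

import numpy as np


@dataclass
class BlacklistEntry:
    name: str
    embedding: np.ndarray  # face embedding precomputed from a database photo


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def check_face(face_embedding: np.ndarray,
               blacklist: List[BlacklistEntry],
               threshold: float = 0.8) -> Optional[BlacklistEntry]:
    """Return the best-matching blacklist entry if its score passes the threshold."""
    best_entry, best_score = None, -1.0
    for entry in blacklist:
        score = cosine_similarity(face_embedding, entry.embedding)
        if score > best_score:
            best_entry, best_score = entry, score
    return best_entry if best_score >= threshold else None


def send_alert(entry: BlacklistEntry) -> None:
    # A real deployment would notify the nearest police station;
    # here we simply print the alert.
    print(f"ALERT: possible match for blacklisted person {entry.name}")
```

Whatever the real implementation looks like, the matching threshold in such a pipeline trades missed matches against false alarms, which is exactly where the accuracy problems described below come from.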

Recently, the US city of San Francisco banned the use of facial recognition by police and security agencies. Just last week, a report by the University of Essex Human Rights Centre claimed that the police's live facial recognition (LFR) technology would likely be found unlawful if challenged in court. In 2017, facial recognition led to the wrongful arrest of an innocent man during the Notting Hill Carnival in London. The Notting Hill Carnival is a music festival held every year in August; it draws about two million people, and around 9,000 police officers are deployed for security.

Facial recognition technology proved wrong in 80 per cent of cases

According to media reports, the London police's facial recognition technology has proven to be wrong in 80 per cent of cases. Researchers from the University of Essex have even said that live facial recognition is so unreliable that it should be discontinued with immediate effect. The researchers also observed a live demo of the cameras installed by police at the Westfield shopping centre in Stratford, London. During the live demo, the facial recognition technology flagged 42 people as suspects based on information in the database, of which only 8 were matched correctly. In this case, facial recognition was only 19 per cent accurate.
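The 80 per cent and 19 per cent figures come from the same numbers reported above: 8 correct matches out of 42 people flagged. A quick calculation (a sketch using only those two reported figures) shows how the two percentages relate.

```python
# Accuracy of the live trial, using only the figures reported above.
flagged = 42   # people the system flagged as suspects
correct = 8    # flags that actually matched the watch list

precision = correct / flagged        # ~0.19, i.e. about 19% correct
wrong_rate = 1 - precision           # ~0.81, i.e. roughly 80% wrong

print(f"Correct: {precision:.0%}, wrong: {wrong_rate:.0%}")
# Correct: 19%, wrong: 81%
```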

The computer could not tell the muffin from the dog!

In 2017, when the same facial recognition and artificial intelligence systems were tested with questions about photos of Chihuahua dogs and muffins, the result was shocking: the computers could not tell the muffins and the dogs apart, even though a person looking at the photos can distinguish them easily. After facial recognition failed in all these tests, the question arises as to who will be responsible if it punishes an innocent person. Let's understand this with an example.

Suppose a terrorist is about to carry out a suicide attack in a crowded place. The facial recognition system scans the terrorist's face, matches it against the photos and videos in its database, and automatically alerts the counter-terrorism squad; the police arrive, the terrorist is caught and hundreds of lives are saved. But imagine what happens if the facial recognition turns out to be wrong. There is clearly no law covering this, even though human rights law requires that the use of facial recognition be necessary in a democratic society. It is important to have a law on facial recognition that explains how it may be used.