Note: There are two previous articles in this series. Read Part One and Part Two.

Biometrics, the study of measurable biological characteristics, is here to stay. In spite of reluctance on the part of some cultures, biometrics and accompanying technologies continue to spread across the globe. Many governments are compiling massive biometric identification programs on citizens and visitors, major banks are utilizing various biometric technologies to improve security on mobile banking apps, and the U.S. military is using digital scans of faces and fingerprints to pursue and capture suspected terrorists.

There is great promise in biometrics. India's Unique ID (Aadhaar) program is one example of a nationwide roll-out, with more than 450 million citizens having had their irises scanned into a database. As a result, millions who lacked IDs now have secure access to loans, credit cards, and bank accounts. The downside is that government officials can also more easily track people for purposes of taxation.

Just like everything else IT-related, biometrics is evolving at a rapid clip, with newer and more sophisticated ways of confirming identity coming into play. Three biometric markers receiving increased attention are DNA, earlobe, and gait recognition.

DNA Recognition

Everyone’s DNA (deoxyribonucleic acid — the double helix-structured molecule that forms the basis of all living cells) is unique. To date, this individuality has been used primarily in forensics for the identification and/or exclusion of people suspected of crimes.

Even though DNA is, in the truest sense of the word, a biometric, its commercial application is still a long way off. Unlike other physiological and behavioral biometrics, DNA cannot be used for fast, automated verification and identification — it can take up to two weeks to complete a DNA analysis.

DNA recognition also differs from other biometric technologies in three key areas:

  • As actual DNA samples are used, no biometric templates are created.
  • Whereas biometric templates can be analyzed and compared in real time, DNA samples cannot.
  • No DNA images are taken (instead, actual biological samples are used).

As a biometrics technology, DNA enjoys several advantages over physiological and behavioral biometrics. DNA contains a wealth of unique information, possibly even exceeding the uniqueness of the retina by a large margin (the retina is currently considered the most unique and stable biometric).

A person's DNA doesn't change over time, and assuming use is made of all four of the nucleotides (the structural units of DNA) as well as longer DNA chains, the likelihood of duplication is extremely remote. Unfortunately, DNA verification also has several disadvantages:

  • Current technologies focus on biometrics that cannot be separated from the carrier. However, DNA can be separated via nails, hair, skin, saliva and so forth.
  • DNA samples are prone to degradation and contaminants.
  • From a privacy perspective, the use of DNA is highly controversial particularly as it pertains to the handling and storage of DNA data.
  • There is widespread fear that the sensitive data contained in DNA (hereditary conditions and medical details) could be used to deny a person insurance, employment and bank loans.
  • Implementing security protocols for a DNA-based biometric system could prove exceptionally complex and expensive. This applies to database protection and overall system confidentiality in equal measure.
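Despite these hurdles, the matching step itself is conceptually simple. The sketch below is purely illustrative and not drawn from any particular forensic system: it compares two lab-generated STR (short tandem repeat) profiles locus by locus, with made-up locus names and allele values. In practice, producing such a profile from a physical sample is the slow, wet-lab step that keeps DNA out of real-time verification.

    # Hypothetical sketch: comparing two forensic STR (short tandem repeat) profiles.
    # Locus names and allele repeat counts below are illustrative only.

    def str_profiles_match(profile_a: dict, profile_b: dict, required_loci: int = 13) -> bool:
        """True if the profiles agree at every shared locus, with enough loci in common."""
        shared = set(profile_a) & set(profile_b)
        if len(shared) < required_loci:
            return False  # too few overlapping loci to call a match
        return all(sorted(profile_a[locus]) == sorted(profile_b[locus]) for locus in shared)

    # Each locus maps to the pair of allele repeat counts observed in the sample.
    suspect = {"D8S1179": (13, 14), "D21S11": (28, 30), "TH01": (6, 9.3)}
    crime_scene = {"D8S1179": (13, 14), "D21S11": (28, 30), "TH01": (6, 9.3)}
    print(str_profiles_match(suspect, crime_scene, required_loci=3))  # True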

Earlobe Recognition

Earlobe recognition involves examining the unique geometry of the earlobe using techniques similar to those used for hand geometry recognition. The ability to examine the ear in its entirety is also being studied. While numerous studies have supported the uniqueness of the human earlobe, even among large groups of people, earlobe recognition is still in its infancy.
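As a rough illustration of what "examining the geometry" can mean in practice, the sketch below assumes a set of landmark points has already been located along the ear contour; it turns those landmarks into scale-normalized distance ratios and compares two ears by Euclidean distance. The landmark scheme is an assumption for the example, not an established ear-recognition algorithm.

    # Hypothetical geometry-based ear comparison.
    # Assumes a separate detector has already located landmark points on the ear contour.
    import math

    def ear_feature_vector(landmarks: list[tuple[float, float]]) -> list[float]:
        """Pairwise landmark distances, normalized by ear height for scale invariance."""
        ys = [y for _, y in landmarks]
        ear_height = max(ys) - min(ys)
        features = []
        for i in range(len(landmarks)):
            for j in range(i + 1, len(landmarks)):
                features.append(math.dist(landmarks[i], landmarks[j]) / ear_height)
        return features

    def ear_distance(a: list[float], b: list[float]) -> float:
        """Euclidean distance between two feature vectors (lower means more similar)."""
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))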

Scientific studies conducted to date have culminated in preliminary results only. Nevertheless, numerous projects have been initiated to evaluate the potential and feasibility of earlobe recognition. At this stage, it is widely felt that earlobe recognition will prove as effective as facial recognition.

Surprisingly, ear prints are gaining wider acceptance in crime-scene investigations. Materials present on the ear (wax, skin, oils, etc.) leave a unique "ear-print." It is common for ear prints to be lifted off the doors or windows of a crime scene, as perpetrators often press their ears to them to listen for anyone inside.

Additionally, legal systems typically require two or more different types of corroborating evidence in order to place a suspect at the scene of a crime. For this reason, ear prints are an excellent source of confirmatory data.

In order to become an established technology, however, earlobe recognition will have to leverage its strengths in three areas:

  • The position of the earlobe and the shape of the ear remain fairly constant; their structure does not change, either with time or with facial expression.
  • Unlike with gait recognition, the external environment does not significantly affect the quality of the earlobe data acquired.
  • Compared to other physiological biometrics, such as the face, the outer ear is not greatly affected by aging.

Gait Recognition

Gait recognition techniques identify a person by their gait (the way they walk). Unlike other biometric technologies, gait recognition is based on dynamic movement. In other words, the subject is, and needs to be, in motion. Unfortunately, an individual's gait is affected by a number of factors, including:

  • Clothing
  • The surface on which the subject walks
  • Fatigue or injury to the subject
  • The type of shoes the subject is wearing
  • The presence of items such as handbags, briefcases, umbrellas and the like
  • Background or extraneous noise, including lighting and changes in the external environment

Gait recognition technology is based on a model of a moving person, which is used to obtain a series of mathematical vectors. Numerous methods have been created to capture the human body in motion, including stick-figure models, volumetric (3-D) models, and so-called "blob" models. Eigenvalue-based techniques, applied in much the same way as they are in facial recognition, have also been used.
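One widely studied way to turn a walking sequence into such a vector is sketched below: average the binary silhouettes over a gait cycle into a single "gait energy image," then project it onto principal components, the same eigen-decomposition idea that eigenfaces applies to faces. The array shapes and number of components here are assumptions chosen only for illustration.

    # Minimal sketch: gait energy image + PCA ("eigen-gait"), analogous to eigenfaces.
    # Shapes and component counts are assumptions for illustration.
    import numpy as np

    def gait_energy_image(silhouettes: np.ndarray) -> np.ndarray:
        """Average a (num_frames, height, width) stack of binary silhouettes into one image."""
        return silhouettes.mean(axis=0)

    def fit_eigengaits(training_geis: np.ndarray, num_components: int = 20):
        """PCA over flattened gait energy images; returns the mean and top principal directions."""
        flat = training_geis.reshape(len(training_geis), -1)
        mean = flat.mean(axis=0)
        _, _, vt = np.linalg.svd(flat - mean, full_matrices=False)
        return mean, vt[:num_components]

    def gait_signature(gei: np.ndarray, mean: np.ndarray, components: np.ndarray) -> np.ndarray:
        """Project one gait energy image onto the eigen-gait basis for a compact feature vector."""
        return components @ (gei.ravel() - mean)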

In addition to motion-based developments, primitive templates have been created by capturing and tracking the orientation of the thigh or other physical features.

In 2002, the Defense Advanced Research Projects Agency (DARPA) funded two major gait recognition projects at the Georgia Institute of Technology under its Human ID at a Distance program. The impetus for these studies was to help identify individuals who may be carrying a concealed bomb or weapon.

One project examined gait on the basis of machine vision techniques. The machine vision project involved an examination of static body parts and stride variables, as well as the collection of distance-related data (between the feet and the pelvis, between the left and the right foot, and between the feet and the head).

This type of recognition, which is based on an ‘activity-specific state biometric’, allows a subject’s stride or leg dimensions to be determined on the basis of a single image. To obtain this data, there is no need for the subject to be in motion.
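A minimal sketch of that idea, assuming the relevant body keypoints have already been located in the image (the keypoint names and pixel coordinates below are hypothetical), might look like this:

    # Hypothetical static gait measurements from a single frame, using located keypoints.
    import math

    def static_gait_features(keypoints: dict[str, tuple[float, float]]) -> dict[str, float]:
        """Distances between head, pelvis, and feet, as in single-image stride estimation."""
        left_foot, right_foot = keypoints["left_foot"], keypoints["right_foot"]
        feet_mid = ((left_foot[0] + right_foot[0]) / 2, (left_foot[1] + right_foot[1]) / 2)
        return {
            "stride_width": math.dist(left_foot, right_foot),        # left foot to right foot
            "leg_length": math.dist(feet_mid, keypoints["pelvis"]),  # feet to pelvis
            "body_height": math.dist(feet_mid, keypoints["head"]),   # feet to head
        }

    example = {"head": (100, 20), "pelvis": (102, 180), "left_foot": (80, 340), "right_foot": (130, 345)}
    print(static_gait_features(example))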

The second project involved a radar device used to generate a computerized image of the subject. Radar offers several advantages over computer vision, in that it can be used under adverse external conditions such as fog or poor lighting.

There are several areas where gait recognition enjoys advantages over more common physical and behavioral biometrics:

  • There is no physical contact with the subject.
  • Since there is no physical contact, gait recognition is not particularly invasive. It may, therefore, be less affected by privacy-related issues. (This is not the case for biometric technologies such as facial or iris/retina recognition.)
  • Subjects can be verified without their knowledge.
  • Data can be acquired at a much greater distance compared to other biometric technologies.

The Future

Although the certification exams of today may not ask anything specifically with regard to the four nucleotides of the DNA strand, it is still important to know something about biometric security procedures. Additionally, many certifications require that you be proactive about your security education by regularly earning continuing education credits. There are many ways you can do this: teaching, writing exam questions, composing IT certification tutorials, and giving presentations.

As a technology, biometrics is convenient, secure, and cost-effective. The science has come a long way since William Herschel first fingerprinted Indians in 1858, and it continues to rapidly evolve. The market for biometrics is expected to reach $24 billion (U.S.) by 2020, and potential markers currently under study include electrocardiograms, vein scans, body odor, and yes, even derriere scans.

It is truly a brave new world.