Amid growing concerns about the cross-sectoral deployment of facial recognition technology (FRT), the NITI Aayog has sought clarity from the government on the possible storage of passenger data, compliance with the law, and the trustworthiness of the ecosystem governing Digi Yatra, a check-in and boarding system built on this technology that the government intends to use in the aviation sector in the future.
The Digi Yatra Policy envisages Digi Yatra as a completely voluntary system. If passengers register and consent to the use of Digi Yatra for check-in and boarding, this agreement would have the legal character of a voluntary agreement for the temporary collection, storage and use of their data.
“This agreement must be compatible with existing data protection laws and regulations,” suggests a discussion paper on “Responsible Artificial Intelligence For All” by the government think tank.
The discussion paper, published on November 2nd, focuses on the responsible use of artificial intelligence-based tools and technologies such as FRT.
The paper adds that the rules are currently set out under the Information Technology Act, 2000 and the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011 (“SPDI Rules”).
“Given that the Digi Yatra Foundation, which commissioned the Digi Yatra Central Ecosystem, was incorporated under the Companies Act, 2013, it would qualify as a ‘body corporate’ for the purposes of the SPDI Rules. Therefore, Digi Yatra would have to comply with the SPDI Rules.”
The Digi Yatra policy also states that the facial biometric data will be deleted from the local airport database 24 hours after the departure of the passenger flight.
However, NITI Aayog recommends that Digi Yatra’s policy clearly specify the rules for deleting other information collected from passengers, as well as any facial biometric data stored in other registries.
The Digi Yatra policy also mentions that users may consent to value-added services at the airport, for which their data may be shared with third parties such as taxi operators and other commercial entities.
In this regard, the government think tank has suggested that particular care be taken to ensure that such consent is given meaningfully and is not bundled in by default.
“In addition to cyber security audits, it is imperative to establish a mechanism to conduct algorithmic audits by independent and accredited auditors on a regular basis prior to system deployment,” the discussion paper states.
It is important to note that FRT, like other intelligent algorithms, is fundamentally a data-intensive technology. To ensure the adequacy and legality of the data processing involved in training and developing FRT systems, it is imperative to have a codified data protection regime in the country at the earliest, the discussion paper further states.
“The new Data Protection Act must retain the framework for ensuring data protection, including the obligations, enforcement mechanisms, regulator, penalties and remedies from the Personal Data Protection Bill, 2019.
“Furthermore, such a regime must not be limited to regulating data processing by private entities, but must adequately codify the protection of the fundamental right to privacy vis-à-vis state entities (including law enforcement agencies),” it added.
NITI Aayog also recommended that sensitive personal data, including biometric data such as facial images and scans, be protected under the new data protection law.
“Consequently, it is recommended that strict standards for data processing, as well as storage and retention of sensitive biometric data, be given due consideration in any proposed data protection regime to address privacy risks associated with FRT systems,” it concluded.