By Voltaire Staff

India walks back AI approval requirement after backlash



India has reversed its stance on a recent AI advisory following backlash from both local and global entrepreneurs and investors.


The Ministry of Electronics and IT on Friday issued a revised AI advisory to industry stakeholders, removing the requirement for government approval before launching or deploying AI models to users in India.


In the updated guidelines, companies are now encouraged to clearly label AI models that have not undergone thorough testing or may be unreliable, aiming to provide users with transparency about potential flaws.


Less than a year ago, the Ministry had opted not to regulate AI development, citing its critical importance to India's strategic objectives.



India's IT ministry made the revision after facing strong criticism earlier this month from prominent figures for requiring government permission before the deployment of any AI model.


Martin Casado, a partner at venture firm Andreessen Horowitz, labelled India's action as "a travesty."


Earlier this month, the ministry had clarified that while the advisory was not legally binding, it signalled the future direction of regulation and that compliance would eventually be expected.


The revised advisory, which supersedes the previous one, underscores that AI models should not facilitate the dissemination of content that is illegal under Indian law and should actively mitigate bias, discrimination, and threats to the integrity of elections.


Intermediaries are encouraged to employ "consent popups" or similar mechanisms to explicitly alert users to the potential unreliability of AI-generated output.


The Ministry maintains its focus on ensuring the easy identification of deepfakes and misinformation, advising intermediaries to label or embed content with unique metadata or identifiers.


However, the requirement for firms to develop a method for identifying the "originator" of specific messages has been removed.
