South Korea’s Artificial Intelligence Basic Act takes effect with strict transparency and risk rules

South Korea’s Artificial Intelligence Basic Act has come into force, introducing transparency duties, special rules for high-impact and high-performance systems, and extraterritorial obligations for foreign providers. Companies using Artificial Intelligence in products or services that touch the Korean market face new compliance, governance, and enforcement risks.

South Korea’s Act on the Development of Artificial Intelligence and Establishment of Trust, known as the Artificial Intelligence Basic Act, took effect on January 22, 2026, and sits alongside the European Union Artificial Intelligence Act as a comprehensive regulatory regime. The law applies to businesses that develop and provide Artificial Intelligence (“Artificial Intelligence development business operators”) and to businesses that provide products or services incorporating Artificial Intelligence (“Artificial Intelligence utilization business operators”). It defines Artificial Intelligence broadly as an electronic implementation of human intellectual abilities such as learning, reasoning, perception, decision-making and language comprehension. The statute creates a high-level framework for transparency obligations, the treatment of high-risk systems and extraterritorial reach, while delegating detailed technical and compliance rules to enforcement decrees to be finalized by South Korea’s Ministry of Science and ICT, which will develop specific requirements through existing and new government bodies.

The Artificial Intelligence Basic Act targets operators of generative Artificial Intelligence and high-impact Artificial Intelligence with additional duties, and introduces a separate category of “high-performance” Artificial Intelligence subject to heightened safety expectations. Generative Artificial Intelligence is defined as Artificial Intelligence that mimics the structure and features of its input data to produce outputs such as text, images, sound and video. Operators providing Artificial Intelligence-generated sound, images or video that are difficult to distinguish from human-created content must clearly notify users that the content is an output of Artificial Intelligence. Operators of both generative and high-impact Artificial Intelligence must also notify users in advance if a product or service is developed using Artificial Intelligence, and operators of generative Artificial Intelligence must label content that has been produced by generative Artificial Intelligence. High-impact Artificial Intelligence covers applications in critical sectors such as healthcare, energy, transportation, hiring and biometric analysis. Operators must assess whether their systems qualify as high-impact, provide a “meaningful explanation” of outcomes and key criteria, adopt user protection plans and human oversight mechanisms, document trust and safety measures, and make efforts to conduct impact assessments on fundamental rights before deployment.
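The transparency duties above turn on a small number of conditions (is the system generative, is it high-impact, could its output pass as human-created). The following minimal sketch illustrates that logic only; the field and function names are hypothetical and the rules are paraphrased from the article, so this is not a compliance tool or a statement of the Act’s official criteria.

```python
from dataclasses import dataclass

# Illustrative sketch only: hypothetical names, simplified reading of the Act's
# transparency duties as described in this article. Not legal advice.

@dataclass
class AISystem:
    is_generative: bool               # produces text, images, sound or video from input data
    is_high_impact: bool              # used in healthcare, energy, transport, hiring, biometrics, etc.
    output_hard_to_distinguish: bool  # synthetic media that could pass as human-created

def transparency_duties(system: AISystem) -> list[str]:
    """Return the transparency duties that appear to apply under the Act."""
    duties = []
    if system.is_generative or system.is_high_impact:
        duties.append("Notify users in advance that the product or service uses AI")
    if system.is_generative:
        duties.append("Label content as produced by generative AI")
    if system.is_generative and system.output_hard_to_distinguish:
        duties.append("Clearly notify that the sound/image/video is an AI output")
    return duties

if __name__ == "__main__":
    chatbot = AISystem(is_generative=True, is_high_impact=False,
                       output_hard_to_distinguish=True)
    for duty in transparency_duties(chatbot):
        print("-", duty)
```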

The enforcement decree notice designates Artificial Intelligence systems trained with a cumulative compute of at least 10²⁶ floating-point operations (FLOPs) as high-performance Artificial Intelligence. Under the Artificial Intelligence Basic Act, operators of such systems may be required to implement a risk management plan and user protection measures across the system’s life cycle and to report implementation outcomes to the ministry. The law also reaches Artificial Intelligence systems outside South Korea if they affect users or markets in the country. A foreign Artificial Intelligence business without a physical office in Korea must appoint a local agent if its total revenue exceeded one trillion KRW in the previous year, its revenue from Artificial Intelligence services exceeded 10 billion KRW in the previous year, or its average daily users in Korea exceeded one million during the three months preceding the end of the previous year. The local representative is legally responsible for responding to government inquiries and safety reports, and the ministry can issue corrective orders, including service suspensions if a system threatens safety, and impose administrative fines of up to 30 million KRW (about US$21,000) for failures such as not notifying users about Artificial Intelligence use, not appointing a domestic representative, violating corrective orders or refusing inspections. The ministry has indicated a grace period of one year before these administrative fines are imposed.

Beyond enforcement, the Artificial Intelligence Basic Act establishes a national Artificial Intelligence committee chaired by the president, an Artificial Intelligence policy center and an Artificial Intelligence safety research institute, and it mandates government support for research, data centers and smaller businesses. Companies operating in South Korea should now review their Artificial Intelligence use and prepare risk-based compliance programs aligned with the new framework.
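For orientation, the compute, revenue and user thresholds described above reduce to a few numeric comparisons. The sketch below encodes them as reported in this article; the function names and the either-or reading of the local-agent criteria are assumptions for illustration, not an official determination of scope.

```python
# Illustrative sketch only: hypothetical names and a simplified reading of the
# thresholds reported for the Act and its draft enforcement decree. Not legal advice.

HIGH_PERFORMANCE_FLOP_THRESHOLD = 1e26  # cumulative training compute in FLOPs

def is_high_performance(training_flops: float) -> bool:
    """High-performance AI designation based on cumulative training compute."""
    return training_flops >= HIGH_PERFORMANCE_FLOP_THRESHOLD

def needs_local_agent(total_revenue_krw: float,
                      ai_revenue_krw: float,
                      avg_daily_users_kr: float,
                      has_korean_office: bool) -> bool:
    """Foreign operators without a Korean office appear to need a local agent
    if any one of the revenue or user thresholds is exceeded."""
    if has_korean_office:
        return False
    return (total_revenue_krw > 1_000_000_000_000   # over 1 trillion KRW total revenue
            or ai_revenue_krw > 10_000_000_000      # over 10 billion KRW AI-service revenue
            or avg_daily_users_kr > 1_000_000)      # over 1 million average daily Korean users

if __name__ == "__main__":
    print(is_high_performance(3e26))                      # True: above the compute threshold
    print(needs_local_agent(5e11, 2e10, 200_000, False))  # True: AI-service revenue threshold met
```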
