Mapping Key Actors
- Suvi Ferraz
- Feb 11

Artificial Intelligence (AI) has the potential to innovate teaching and learning practices, helping to face today’s education challenges and to accelerate progress towards Sustainable Development Goal 4 (SDG 4). However, rapid technological developments bring risks and challenges that have outpaced policy debates and regulatory frameworks (Miao et al., 2021).
In this second blog post about the wicked problem of AI in adult education, I discuss the problem from the perspective of the social triangle model of business-government-civil society, in which each sector has different drivers for action and behavior (Dahan et al., 2015). There are different ways of regulating, but due to the fuzziness of the term governance, it is hard to see who answers to whom and who is in charge (Steurer, 2013).
State
I would map the wicked problem of AI in adult education as a hybrid issue that engages all three sectors: the state, civil society, and the market/business. Generative AI training is cost-effective, and its scalability appeals to employers, training providers, educators, and policymakers. However, more research is needed on AI in adult education: on learning competencies, on its effectiveness for learning, and on comparing AI-based solutions with non-AI alternatives. AI use in adult education and training raises ethical concerns regarding data privacy, intellectual property, security, and cheating by learners. Adult educators should implement research-based curriculum design (Storey & Wagner, 2023).
Government regulation can be hierarchical and hard, consisting of regulations and laws, or soft, using instruments such as taxes and fees to facilitate certain behaviors (Steurer, 2013). Government actors connect with the wicked problem of AI in adult education and training through hard regulations with sanctions, such as the EU General Data Protection Regulation (GDPR), which governs how the personal data of individuals in the EU may be processed and transferred. This is a hard way to regulate both technology companies and education and training providers with respect to the rights of their end users: students, teachers, and other staff members.
In June 2024 the European Parliament and the Council laid down harmonized rules on artificial intelligence and amended related regulations. The aim of the regulation is to ensure safety and compliance with fundamental rights, as well as to boost innovation and establish Europe as a leader in the field of AI.
The regulation sets obligations based on AI’s potential risk and level of impact. Systems related to education and vocational training are defined as high-risk because of their potential to harm, for example, safety, fundamental rights, the environment, or democracy. The regulation obliges such systems to assess and reduce risk, maintain user logs, and ensure transparency and accuracy with human oversight (European Parliament, 2024; EUR-Lex, 2024).
However, data is the fuel of AI and is continuously required to advance AI solutions in education. This brings us to the second actor in the wicked problem.
Market
What is the minimum necessary data to collect on learners' behavior, and how can it be used while ensuring compliance with existing regulations? In adult education, vocational institutions and higher education must adhere to these rules, but what about private training providers and corporations already using or planning to implement AI in their training systems?
The power of a government is limited geographically. Firms can relocate, for instance, for taxation reasons, for a lighter regulatory landscape, or for greater freedom (Dahan et al., 2015). AI regulations in the EU are rather new and can be hard for small businesses to interpret. Due to the data regulations in the EU area, some education technology (EdTech) businesses are interested in exploring and exploiting markets with looser data protection regulations. Complex regulations may push AI-driven education businesses to develop outside the EU, where such rules are seen as barriers to innovation and market entry. Local governments vary in their approach to AI, influencing governance and legislation to either support or restrict its adoption.
Society
An example of a poly-centered governance system is one where the government does not engage directly in non-state regulation but facilitates it; in the following example, the facilitated activity is the integration of AI into education (Steurer, 2013). In 2022 the European Commission published “Ethical Guidelines on the Use of Artificial Intelligence (AI) and Data in Teaching and Learning for Educators”. This is a soft instrument for steering educators’ behavior towards AI in teaching and learning, created with contributions from an expert group of practitioners, researchers, and representatives of international organizations such as UNICEF, UNESCO, and the OECD (European Commission, 2022). In response to civil society pressure, business partners such as large suppliers, bulk buyers, lenders, or institutional investors demand certain Corporate Social Responsibility (CSR) practices, conducted as “business-to-business self-regulation”, for instance auditing businesses’ regimes on labor conditions (Steurer, 2013). I believe that businesses in adult education, especially those operating within platform ecosystems, will adopt business-to-business self-regulation to address civil society pressures and to establish quality standards and ethical guidelines for AI use.
Partnerships
I think that in AI in adult education there is interdependence between government, business, and civil society. AI is still rather new in adult education and training, with little research on its long-term or even short-term effects on learning outcomes. Therefore, governments are using limited legal instruments and soft regulations to guide business behavior. At the same time, governments should closely monitor through research how AI-related regulations and guidelines affect adult education businesses and society.
The role of societal actors in partnerships can be: (1) mandating, (2) facilitating, (3) partnering, and/or (4) endorsing. In the facilitating role, governments search for enabling instruments for firms and citizens to move in the “right” direction. This can include procurement policies focused on particular goals, such as corporate social responsibility or national competitiveness, or setting up public schools or hospitals (Van Tulder & Pfisterer, 2013). Governments can incentivize training firms and educational institutions to adopt AI technologies by implementing procurement policies that prioritize AI-enhanced education and training.
According to Van Tulder and Pfisterer (2013), partnering spaces, that is, non-profit public-private partnerships, aim to increase participation in designing and implementing effective public policies and an adequate provision of common goods such as education and public health. I find this partnership type relevant to the wickedness of AI in adult education.
Finland offers an example of partnerships between different actors. The Ministry of Education and Culture (state) is the highest authority, responsible for all publicly funded education in Finland and for preparing and deciding on hard regulation, i.e. educational legislation. In a two-tier arrangement, the Finnish National Agency for Education, Finland’s development agency, is responsible for adult education and training, while higher education remains the responsibility of the Ministry of Education and Culture (OPH, 2024). The University of Helsinki (society) is an independent public institution serving public education and research purposes. It is mainly funded by the Finnish government but also has public-private partnerships through donations and with businesses, thereby exercising public co-regulation. In addition to the government and businesses, in the civil sector the Trade Union of Education in Finland (OAJ) (society) is a key influencer of education policy and an advocate for its members, professionals of the education, training, and research sector, addressing their concerns. The example illustrates how different actors, government, educational institutions, and civil society, can collaborate to shape effective policies and practices. This multi-stakeholder approach, involving both public and private sector partnerships, highlights the importance of co-regulation and shared responsibility in addressing the challenges of integrating AI into adult education.
My thoughts
I find the roles of government and businesses particularly interesting parts of my wicked problem. EdTech startups utilizing AI can be considered frontrunners of AI innovation in adult education, but at the same time their solutions may create new challenges, such as the risk of spreading educational biases through algorithms. Governments also finance startups, as in Finland through public grants. In comparison to researchers, who serve the public interest, businesses’ goal is profit, which can sometimes conflict with the broader societal goals of equitable and ethical education.
This leaves me with the question of who governs AI in adult education. To my understanding, it is a balancing collaboration between the state, business, and society, but there’s a tension between governments wanting to regulate AI and adult education businesses pushing for faster adoption with minimal constraints. I see that there is also a struggle between businesses seeking profit and civil society advocating for equity and fairness.
This blog post is based on a Business and Society course assignment at the Hanken School of Economics written in October 2024. It is the second post of a series of four posts discussing the grand challenge (wicked problem) of AI in education and training.
References:
Dahan, N. M., Doh, J. P., & Raelin, J. D. (2015). Pivoting the role of government in the business and society interface: A stakeholder perspective. Journal of Business Ethics, 131(3), 665-680.
Steurer, R. (2013). Disentangling governance: a synoptic view of regulation by government, business and civil society. Policy Sciences, 46(4), 387-410.
Van Tulder, R., & Pfisterer, S. (2013). Creating partnering space: Exploring the right fit for sustainable development partnerships. In Social Partnerships and Responsible Business (pp. 105-124). Routledge.
Miao, F., Holmes, W., Huang, R., & Zhang, H. (2021). AI and education: Guidance for policy-makers. UNESCO. (Accessed 16.9.2024).
European Parliament (2024). Artificial Intelligence Act: MEPs adopt landmark law. Press release 13.3.2024. https://www.europarl.europa.eu/news/en/press-room/20240308IPR19015/artificial-intelligence-act-meps-adopt-landmark-law
EUR-Lex (2024). Regulation (EU) 2024/1689 of the European Parliament and of the Council of 13 June 2024 laying down harmonised rules on artificial intelligence and amending Regulations (EC) No 300/2008, (EU) No 167/2013, (EU) No 168/2013, (EU) 2018/858, (EU) 2018/1139 and (EU) 2019/2144 and Directives 2014/90/EU, (EU) 2016/797 and (EU) 2020/1828 (Artificial Intelligence Act) (Text with EEA relevance). https://eur-lex.europa.eu/eli/reg/2024/1689/oj.
European Commission (2022). Ethical guidelines on the use of artificial intelligence (AI) and data in teaching and learning for educators. European Education Area (europa.eu), European Union publication.
Storey, V., & Wagner, A. (2023). Integrating Artificial Intelligence (AI) Into Adult Education. International Journal of Adult Education and Technology, 15(1), 1-15. DOI: 10.4018/IJAET.345921.
OPH, Finnish National Agency for Education (2024). https://www.oph.fi/en/about-us.