
From 2024 to 2025: How These GRC Trends are Reshaping the Industry

Published 02/05/2025


Originally published by Scrut Automation.

Written by Aayush Ghosh Choudhory.


According to Cybersecurity Ventures, the global cost of cybercrime is projected to hit a staggering $10.5 trillion in 2025, up from $9.5 trillion in 2024. That figure is a stark reminder of the urgent need for strong cybersecurity measures within Governance, Risk, and Compliance (GRC) frameworks, and it caps a year marked by significant shifts and innovations in the GRC landscape.

In this post, we’ll dive into the top 10 developments that stood out and explore how companies can gear up for the challenges and opportunities that 2025 will bring.


1. European Union continues its regulatory push with DSA, DORA, and EU AI Act

Beginning in 2016 with the General Data Protection Regulation (GDPR), the European Union has led the world in cybersecurity and privacy regulation. This year the trend continued with the:

  • Digital Services Act (full DSA enforcement began in February)
  • Digital Operational Resilience Act (DORA, entering into force in January 2025)
  • Artificial Intelligence Act (AI Act, passed in summer 2024)

Regardless of how this regulatory burden affects innovation, companies will need to address it. If the GDPR is any example, the follow-on regulations will likely trigger other jurisdictions to pass similar laws. And we are already seeing this as various regulations pop up globally.


2. U.S. state-level regulations expand

With data privacy and cybersecurity a relatively low priority at the federal level for both American political parties, individual states have started implementing their own rules. Bloomberg Law reported that approximately 20 states have already passed their own comprehensive data privacy laws. Some key ones seeing movement in 2024 include:

  • Washington State’s My Health My Data Act (MHMDA, enforced starting March 2024)
  • Colorado’s Artificial Intelligence Act (based on the EU AI Act and passed in May 2024)

State-level regulation is likely to continue as the federal government focuses elsewhere during the next presidential administration. And we are already seeing states like Texas propose their own AI governance laws.

On the note of federal action in cybersecurity, the outgoing Biden Administration focused its efforts on an initiative that may not end up bearing fruit.


3. Rise (and perhaps fall) of “Safe Harbor” standards for software security

Beginning with the 2023 release of the National Cybersecurity Strategy, the White House and Cybersecurity and Infrastructure Security Agency (CISA) pushed hard to establish mandatory standards for software development. However, due to the challenges of codifying such rules and the planned departure of CISA Director Jen Easterly, it’s unlikely this effort will materialize into legislation in the near future.

With that said, CISA did roll out a voluntary “Secure by Design” pledge allowing software manufacturers to commit to certain steps. These include providing features like multi-factor authentication to customers at no additional cost.


4. Security and compliance concerns slow AI adoption

Despite the buzz around AI, there is still substantial skepticism about its data security and privacy. Scale AI’s 2024 “Zeitgeist” survey reveals that 60% of respondents who have yet to adopt AI cite security concerns and a lack of expertise as the primary barriers to implementation. Similarly, LucidWorks found a nearly 3x increase in data security concerns related to generative AI from 2023 to 2024 in its Global Benchmark Study.

A middle ground clearly exists between neglecting AI’s potential and using it without restraint, which can lead to excessive risks. Effective guardrails and governance frameworks are key here. And they even facilitate leveraging AI for security and compliance tasks.


5. AI helps with security and compliance

Despite concerns about its effectiveness and data security, firms are simultaneously leveraging AI to accelerate GRC efforts. Especially for repetitive tasks like completing security questionnaires, AI has demonstrated huge potential to increase productivity for security and compliance teams.


6. Intellectual property rights blur in the age of AI

Along with the direct security considerations, there remain many unanswered questions regarding the applicability of existing intellectual property law when it comes to artificial intelligence. The U.S. Copyright Office said early in 2024 that works with AI-generated content could not be copyrighted without evidence of human contribution. While that added some clarity to the debate, it still left unresolved questions about:

  • Whether training generative AI models on public news stories constitutes “fair use”
  • If code generation tools can be trained on open-source-licensed libraries
  • The obligations of companies further down the supply chain

With no precedent to guide them, even some attorneys are unsure how to proceed. From a practical perspective, though, it makes sense to investigate the indemnification provisions that major generative AI vendors—such as OpenAI and Microsoft—offer. These can potentially provide a legal backstop if a company is challenged on intellectual property grounds.


7. No-code and low-code adds another burden to GRC teams

On top of (and combined with) AI tools are no-code and low-code offerings that put hugely powerful capabilities in the hands of non-developers. These applications let employees build fully functioning front ends and back ends, including application programming interfaces (APIs). While these systems can make productivity explode, they also present unique risks: less-trained staff members may inadvertently compromise the confidentiality, integrity, and availability of large volumes of data.

Michael Bargury’s August 2024 Black Hat presentation, “15 Ways to Break Your Copilot,” underscores this point by highlighting the security and compliance challenges of these tools, particularly when combined with AI.


8. New technology means new compliance frameworks

Thankfully, as new technologies emerge, so do new ways of securing them. 2024 saw the release of many AI-specific GRC approaches, including the:

  • Databricks AI Security Framework (DASF)
  • Open Web Application Security Project (OWASP) Top 10 risks for Large Language Models (LLM)
  • HITRUST AI Security Certification

These tools provide excellent guidance to GRC practitioners (and give them more homework to do). Combined with the first accredited certifications of companies under the ISO 42001 standard, the release of these frameworks made 2024 a big year for AI-related GRC.


9. Personal liability for leaders of breached companies

2024 was a landmark year in other ways as well. Across the globe, regulators began targeting individual executives for regulatory action related to alleged cybersecurity weaknesses in their companies. Beginning with the U.S. Securities and Exchange Commission’s (SEC) actions against SolarWinds and its Chief Information Security Officer (CISO) at the end of 2023, other countries have followed suit.

After the breach of Change Healthcare in early 2024, one U.S. Senator called for SEC and Federal Trade Commission (FTC) investigations into it and its parent company, UnitedHealth Group. Describing the companies as “negligent,” the Senator implied that personal accountability for their executives might be appropriate given the damage the breach caused.

CISOs and other risk advisors have traditionally played advisory roles in risk management, but a growing trend indicates they are increasingly being held accountable for critical decisions. For instance, regulatory developments like the New York Department of Financial Services (NYDFS) Cybersecurity Regulation (23 NYCRR §500.4), effective November 2024, require CISOs to report directly to the board and mandate heightened board oversight of cybersecurity risks. Clearly documenting decisions and supporting rationales will thus become a key part of defending against individual liability should the worst happen.


10. Compliance-as-code gets traction

A final—and positive—development from 2024 is the mainstreaming of GRC engineering (which some describe as “compliance-as-code”). A manifesto written by several practitioners in 2024, announcing the launch of their movement, focuses on making compliance more effective and efficient. It emphasizes automation, practical risk measurement, and open-source tools while prioritizing solutions that work well for users, not just GRC teams. By addressing issues earlier in processes and treating compliance as a core function, this approach aims to improve outcomes and simplify traditionally complex practices.

By embedding compliance and risk management into development pipelines and operational systems, GRC practitioners can ensure governance and security measures evolve alongside technology. Practitioners can also collaborate more closely with engineering teams to co-create solutions, fostering a culture where compliance is seen as a shared responsibility rather than a bottleneck. Leveraging data-driven insights and continuous assurance mechanisms, GRC professionals can provide real-time visibility into risk, enabling faster, more informed decision-making.
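As a minimal sketch of what compliance-as-code can look like in practice, the example below expresses controls as executable checks that run against resource configurations, the kind of check a team might wire into a CI pipeline. The control names, configuration schema, and resources are all invented for illustration:

```python
# Hypothetical compliance-as-code sketch: controls as executable checks.
# The control names and config format are illustrative assumptions, not
# any specific framework's schema.

CONTROLS = {
    "encryption_at_rest": lambda r: r.get("encrypted", False),
    "no_public_access": lambda r: not r.get("public", False),
    "mfa_required": lambda r: r.get("mfa", False),
}

def evaluate(resource: dict) -> list[str]:
    """Return the names of the controls this resource fails."""
    return [name for name, check in CONTROLS.items() if not check(resource)]

if __name__ == "__main__":
    # Example inventory; in CI this would be parsed from real config files.
    resources = [
        {"name": "customer-db", "encrypted": True, "public": False, "mfa": True},
        {"name": "legacy-share", "encrypted": False, "public": True, "mfa": False},
    ]
    for r in resources:
        failed = evaluate(r)
        status = "PASS" if not failed else f"FAIL ({', '.join(failed)})"
        print(f"{r['name']}: {status}")
```

Because the checks are code, they version alongside the systems they govern and can fail a build the moment a change drifts out of compliance, rather than waiting for a periodic audit.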


Conclusion

The year 2024 was a turning point for the GRC landscape, with a surge in regulatory activity, technological advancements, and evolving security risks reshaping how organizations approach governance, risk, and compliance. As we step into 2025, the stakes are higher than ever. Businesses must navigate an increasingly complex web of global regulations, leverage emerging technologies like AI responsibly, and proactively address challenges like personal liability and compliance gaps in new tools.

The path forward lies in embracing innovation while ensuring robust security and compliance frameworks. By treating GRC as a strategic enabler rather than an operational hurdle, organizations can not only meet evolving requirements but also build resilience and foster trust in a rapidly changing world.
