EU two or three steps ahead.
AI companies are turning to user-interface tricks to pull in more unwilling people as training subjects. These UI changes will feel like shit if they try to force adoption, buried under warnings and disclaimers.
Reminiscent of the cookie law, which many people hate, but they hate it because companies insist on tracking: if you don't track, you don't need the cookie popup.
Privacy, safety, and reliability debates are bringing awareness of dark patterns back. This is territory tech companies were trying very desperately to get out of.
I think it's also brilliant in how it answers the black-box excuse. "Oh, we cannot explain it, it's a black box." "Then explain how you made it; otherwise it's a no-go."
Ultimately, this sets the discourse straight about what AI skepticism is really about. It's not about being anti-commerce; it's about expecting companies to be good commercial actors.
Why would anyone develop AI models in the EU? You have a lot of compliance requirements, and there are fewer enterprises that will pay for them.
Because an AI model developed in the EU will, by design, be compliant with EU data and AI regulations from day one, which makes it a strong selling point for companies looking to launch AI-powered products on the EU market?
Yes, regulatory compliance is a significant concern for software product design in the EU these days. But that's a good thing - it stops a lot of hare-brained ideas and abusive business models at the drafting phase. Also, from my own observation, the rules seem annoying at first, because they tend to shut down the most exciting ideas - but after a while you notice that this is because those ideas come with bad failure modes and bad second-order effects, and regulations are forcing you to actually consider them.
Maybe for businesses like governments.
But EU citizens want good AI models, not EU-approved models, and people can use a VPN. The regulatory process kills fast-moving business, and whatever gets rubber-stamped by EU bureaucrats and the compliance-industrial complex of law firms making money selling snake-oil compliance services is already a few years behind.
Would you buy a three-year-old car as new?
> Would you buy a three-year-old car as new?
Regardless of how I feel about the price of the old car, I wouldn't buy the new car if it's not legal to drive on public roads.
Different markets are made up of different people, whose leaders may want different things…
The car market is a great example. The US market has decided it doesn't want EVs from the biggest EV producer in the world (China), so people are indeed buying cars with older technology than the new global standard, a standard set by China's successful state-sponsored EV industry.
It may very well be that the US leads globally in AI, while some markets handicap access and development for internal reasons.
I wouldn't buy a car that sometimes goes the opposite way of where I want and where the manufacturer tells me they have no idea why.
... despite collecting enough metrics to infer the shape and weight of your body. Also there's the risk of ads suddenly appearing at the discretion of the vendor.
A factor that goes into that decision is the now inherent unreliability of relying on entities in the US as a partner. For some situations it's safer to do it in the EU despite the regulations.
> the now inherent unreliability
What "unreliability" are you talking about in terms of American tech businesses?
> For some situations it's safer to do it in the EU despite the regulations
The EU has zero tech companies that rival FAANG et al here in the US. Zero. Because of its (well-intentioned but harmful) business regulations.
I have a feeling you're projecting your dissatisfaction with election results more than anything tangible...
> The EU has zero tech companies that rival FAANG et al here in the US. Zero. Because of its (well-intentioned but harmful) business regulations.
Not really; it's because the EU has 28 sets of business regulations: those of the 27 member states plus those of the EU itself. The single market is not yet all that single, especially when it comes to digital services. The now-abandoned project of the ever closer union wasn't some idealistic bs, it was the plan to gradually fix this.
Many companies see it as a positive not to have monopoly-abusing competitors able to use the government they bought to crush startups.
> What "unreliability" are you talking about in terms of American tech businesses?
https://nltimes.nl/2025/05/20/microsofts-icc-email-block-tri...
This sort of stuff.
The fact of the matter is that of the voters who care about AI, the most vocal ones vehemently oppose it. Sabotaging AI development is a feature, not a bug. This regulation might well be an attempt at appeasing the faction that wants to ban LLMs in the EU entirely.
I'm guessing that, due to regulations, the only AIs you'll be allowed to use in the EU will be the EU-developed ones that fulfill the requirements, so there's a captive market right there for local companies.
Granted, they'll probably end up performing worse than US or Chinese ones operating without restrictions, and being uncompetitive on the global free market. But when did EU leaders ever think about long-term consequences? Certainly not when they tied their economy to Russian gas and banned nuclear, certainly not when they prioritized toxic diesel engines over gasoline, certainly not when they demilitarized or when they ceded tech innovation to the US and China. But for once this will be the right call, I can feel it; this will bring the EU to the forefront of tech supremacy.
Why would you assume they would end up performing worse?
I have yet to see a company that prioritized quality over profits, unless forced to by regulation.
Globalisation is over, and the way wars are going, let's see what's left of the planet to save, if anything.
> Why would anyone develop AI models in the EU?
Not just AI models. The EU leads in regulations and nothing else, and it shows: since 2008 both China and the US have experienced enormous GDP growth while the EU has been totally stagnant (inflation adjusted).
I don't think people realize how fast the EU is falling into oblivion. It's really horrible to witness from inside the EU. Cities are getting poorer and poorer, and high-trust societies are becoming low-trust ones due to rising (imported) poverty and crime. Cities that used to be beautiful now see weekly kidnappings (Paris), and AK-47s are fired in Brussels on a weekly basis.
But we should all applaud because AI is going to be regulated and because we're going to be green. Go EU, yay!
Meanwhile, people in the EU don't even want to have kids anymore. I honestly have a hard time figuring out why young people in the EU would even want kids, given the overall atmosphere reigning here now.
So while India, Brazil, China, the US, and many other countries are still going to see growth, I fully expect the EU to keep shooting itself in the foot (like it did with its car industry, destroying it with regulations and handing over the EU EV car market to China).
Those who can, do; losers who cannot, regulate.
The EU continuing their strategy of "If we can't have homegrown tech to tax, we'll just fine the foreign tech people use instead".
If the foreigners' strategy is to make money from EU citizens, then the EU's strategy should logically be to make money from the foreigners.
How dare the EU prioritize the rights of its citizens over the rights of evil foreign megacorporations!
We are fine not having neoliberal capitalism, plus we can follow the example of other countries and erect the big European wall, for that matter.
Meanwhile at xAI: "Let's have Wolfenstein in real life".
How would this apply to open-weight models? The creators of the models cannot know who is using the model and for what.
Why would it be the creators’ responsibility? It should be the one running the model.
Just the same? If you publish a model that doesn't follow these rules, nobody in the EU could use that model in their business. You could publish unlicensed source code as well, and nobody could really use it for anything business-related either.
> If you publish a model that doesn't follow these rules
The rules require tracking outputs, which open-weight models cannot do. So I'm wondering if open-weight models have separate rules or this effectively bans releasing such models.
Of course they can. Let's assume you are using such a model in your product; this now makes tracking its output your responsibility. It is really no different from the way you would use an open-source library.
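As a rough illustration of that point (hypothetical names, not anything the AI Act itself prescribes), application-side output tracking can be as simple as wrapping the inference call and keeping an audit record of each generation:

```python
import hashlib
import time

def tracked_generate(model_fn, prompt: str, log: list) -> str:
    """Call a model and record an audit entry for its output.

    `model_fn` is whatever inference function the application uses;
    the tracking lives in the application, not in the weights.
    """
    output = model_fn(prompt)
    log.append({
        "timestamp": time.time(),
        "prompt_sha256": hashlib.sha256(prompt.encode()).hexdigest(),
        "output_sha256": hashlib.sha256(output.encode()).hexdigest(),
        "output_chars": len(output),
    })
    return output

# Usage with a stand-in "model" (a real one would load open weights):
audit_log: list = []
echo_model = lambda p: f"echo: {p}"
result = tracked_generate(echo_model, "hello", audit_log)
```

The point is that the open-weight model itself stays untouched; whoever deploys it owns the logging, just as they would own logging around an open-source library.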
That is confusing, since closed-weight models (the models, not the applications using the models) also can't track outputs. It would be weird if the rules applied to the model and not the application, because then they would literally only apply to open-weight models: closed-weight models are, by definition, never released to the public.
I'm trying to understand the rules, but they don't seem to make a clear distinction between these things. I assume they mean the applications that use the models, not the models themselves.
You ban open weight models and the problem is solved
Users of the model could be legally required to take part in the tracking.
I haven't read the directive/law, but the EU should also invite EVERYONE to submit well-documented and reproducible examples of AI/LLM failures. It's only through public exposure that improvement will be assured.
The AI vendors will NEVER fix any system flaws that can be ignored or hidden. Only a public database can force these into the open.
What is a failure?
When an AI gives you a link that it made up. (This is easily verifiable)
Telling lies basically.
In AI-speak, a "hallucination".
I wish the headline used a word such as "force", "tell", or "require".
To me, "ask" connotes that compliance is voluntary. Which in some circumstances strikes me as an intentional, rhetorical lie.
Yes, this use of 'ask' is statist propaganda.
Anti-statist not so smug when the AI finally reaches their pro-state
People say the EU is behind on AI. The EU is the only one thinking about AI long-term; everyone else is just seeing what happens.
Press release: https://ec.europa.eu/commission/presscorner/detail/en/ip_25_...
> The AI industry participated in drafting the AI Act, but some companies have recently urged the EU to delay enforcement of the law, warning that the EU may risk hampering AI innovation by placing heavy restrictions on companies.
Where YOU live you can have all the unbridled capitalism as you want - be a product for tech bros and help make some executive a billionaire - I don't care!
Where I live, I want this shit regulated. So, good stuff, EU.
Don't be surprised when all the innovation happens in the US or China and the EU is left behind, like always.
The EU leads in medical innovation, civil engineering, and mechanical engineering.
When has it ever been left behind? The WWW was invented in Europe. All modern chips are bottle necked by European EUV.
Linux is a Finnish invention
> The WWW was invented in Europe.
Built using the internet, which was invented in America.
> All modern chips are bottle necked by European EUV.
Which uses US tech. My understanding is that they even have to follow US sanctions.
Sure. However, this was _never_ about money and _always_ about good honest work.
> Where YOU live you can have all the unbridled capitalism as you want
A bit rich when all the companies mentioned are across the pond?
So what, they should slurp up all the data they can?
Uh... you know that there are offline models that are already downloadable?
LLM regulation is too late; models are already at ChatGPT-3.5 or 4 levels, which is enough to do basically anything.
You are confusing intent with ground reality. It's like saying "we banned drugs" while we still have a drug problem.
If it weren't for the cookie banners (I blame the EU, but I'm open to corrections), I'd agree with you.
I disagree, cookie banners are not the bane people make them out to be.
The alternative is much worse, which is having zero say in tracking cookies. I'll take a banner on every single website to have more control of that.
I really don't see the issue. If you really find them annoying, use uBlock with a proper cookie-banner filter or something like that.
IIUC, the infamous cookie banner law only required informing the user of cookies/client tracking; it did not require asking for consent. That came with GDPR and other privacy laws.
Most cookie banners don't seem to actually meet the requirements of the law, at least on my reading, which does say that denying cookies has to be as easy as accepting them. And the law doesn't even suggest a popup or banner, just that you can't set cookies without receiving permission; a fully opt-in design would satisfy it.
You can make as many laws as you want, but if no one cares, you are only harming the businesses that do care.
TL;DR: the cookie banners and consent forms are designed to make you blame the EU.
You don't need a cookie banner if you aren't doing anything shady. Using cookies (or other such mechanisms) does not require information or consent popups when they're necessary to make the product/service work for technical reason - the canonical example being session cookies.
The corollary here being, you only need consent popups if you're doing something shady but not strictly illegal. They're not meant to be annoying - they're there because it's illegal to do shady shit without the user agreeing to it, and "agreeing" in the EU means "informed consent". So you have to inform, and then get consent.
It's all pretty reasonable. But of course, people doing shady shit really don't want the users to understand it and risk them not consenting, and they especially hate having to ask in the first place. The industry settled on a "malicious compliance" approach to GDPR: show popups that, as much as they can get away with it, maximize the chance of people consenting just to make the popup go away, make the "informed" part as opaque as possible, and generally make the whole thing super annoying, and then tell people it's all the EU's fault, hoping enough Europeans will buy it and the public pressure will make the EU undo GDPR.
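To make the distinction above concrete, here's a minimal sketch (a hypothetical helper, not a reference implementation of any law) of the opt-in logic: strictly-necessary cookies are always allowed, while tracking cookies require recorded consent.

```python
# Cookies needed for the service to function for technical reasons
# (e.g. session cookies) are exempt from the consent requirement;
# everything else is opt-in.
STRICTLY_NECESSARY = {"session_id", "csrf_token"}

def may_set_cookie(name: str, user_consented: bool) -> bool:
    """Return True if the cookie may be set under an opt-in policy."""
    if name in STRICTLY_NECESSARY:
        return True  # no banner or consent needed
    return user_consented  # tracking/analytics cookies need informed consent
```

A site that only ever sets cookies from the strictly-necessary set never hits the consent branch at all, which is why it needs no banner.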
>TL;DR: the cookie banners and consent forms are designed to make you blame the EU.
https://european-union.europa.eu/index_en
https://gdpr.eu
If cookie banners were not designed to be required... why do the EU's own pages use them?
Not ideal, but those banners are unobtrusive, and link to clear information on what they collect and why, and which things need your consent:
https://gdpr.eu/privacy-policy/
https://european-union.europa.eu/cookies_en
Though again, not ideal IMO; based on skimming those policies, I think they could have set it up so the consent popup only shows in the specific situations that trigger the need for it. That, and I don't get why they use (a minimal build of) Google Analytics and let that data fly over to the US (which they explicitly acknowledge). That's just lazy.
From a web designer's perspective, it's a question of: "do I do it the way europa.eu does it, or try to pioneer some new other-than-banner approach to GDPR compliance?" What is the risk to me or my company, if I do it wrong, that the EU will come down and fine me?
Maintaining the same interface as europa.eu is the least risky approach and so everyone does it that way.
If one wants to say "the GDPR doesn't mandate cookie banners," then it should be the GDPR site on europa.eu that demonstrates how compliance can be done with other styles of cookie consent.
Until then, it is perfectly fair and reasonable to assert that the GDPR requires it because the GDPR site itself uses it and companies that haven't done it that way have gotten fined.
That's the effect of malicious compliance rather than the law itself
https://gdpr.eu maliciously complies with itself?
Note that's a company page that the EU pays for... the EU government's own page is even worse. https://commission.europa.eu/law/law-topic/data-protection/r...
Blame the companies not the EU. Most companies don't need to track you and thus don't need the cookie banner.
If this is your biggest complaint then I think you lack perspective.
Cookie banners themselves are not a huge problem, but they are a good example of EU regulations being prone to stupid unintended consequences.