The Senate voted overwhelmingly at the end of January in favour of a bill that would allow its use during the event, with proponents arguing that the technology would help prevent crowd crushes or terrorist attacks.
These fears are not surprising given the chaos of last year's Champions League final between Liverpool and Real Madrid at the Stade de France, where police used tear gas and some fans complained of antisocial behaviour and muggings around the stadium.
The shadow of the November 2015 coordinated terrorist attacks in Paris undoubtedly also plays a role in the government's decision, as the event could attract 13 million spectators.
However, opponents of the bill, such as human rights groups, fear that it will pose a danger to civil liberties, transforming the country into a police state.
The legislation includes plans to use AI to detect, for the first time ever in France, suspicious body language or crowd movements through CCTV cameras and drones, information which would be sent directly to the police.
Indicators of suspicious behaviour could include individuals being static, walking the wrong way or wearing some form of cover.
The technology could also be used around stadiums, on streets, and on public transport.
Another point of contention is that the bill states the cameras can be used until June 2025 during sporting, festive, or cultural events, as part of an experimental pilot.
The bill still has some hurdles to overcome before it is implemented during the Olympic Games; in March it is due to be examined by the National Assembly, before France's independent data protection authority, the CNIL, reviews the legislation.
The French Minister of Sport and the Olympic and Paralympic Games wrote on Twitter: "Adoption at first reading by @Senat of #PJLJOP. Thank you to the senators for their contributions to this text, which will promote the best possible organization of the #Paris2024 Games. The examination will continue at @AssembleeNat with the same desire for balance of @gouvernementFR."
Some French cities already use a similar form of artificial intelligence to enforce the law, such as Massy, a commune in the southern suburbs of Paris.
Régis Lebeaupin, Video Protection Manager at Massy Municipal Police, explained how algorithms are used to help police detect traffic offences.
He said: "When a vehicle parks in a prohibited space, video analysis sends us a signal that saves us time. The image comes directly to us."
Currently, this technology is highly regulated: facial recognition is prohibited in France and should remain so even after the new law is passed.
Lebeaupin added: "The French legal framework prohibits the cross-referencing of data. Of course I film faces, however, the law forbids me to link this face to an identity."
On 13 February, French digital rights group, La Quadrature du Net, launched a campaign against the use of algorithmic video surveillance.
Noémie Levain, a lawyer from the group, said: "The Olympics are a pretext. We know that it won't stop in 2025. As soon as there is an experiment, it is perpetuated. It's important to see the movement that France is taking with this law, to want to give more importance to the development of the video surveillance market than to public liberties."
Levain argued that while in Brussels regulation of such measures is being debated, in France the government "doesn't care" about civil liberties.
"In two months it has passed a law that takes the opposite path. It is the first European country to adopt such a law," she added.
Another area of concern for civil liberty groups is data retention, which has so far been set at five years, a period that would stretch far beyond the Olympic Games.