
Sam Altman Wants Your Eyeball

[Image: a red circle of light resembling a human iris over a black background.]

Last week, OpenAI's CEO Sam Altman announced in San Francisco that the World project he co-founded, formerly known as Worldcoin, is opening six stores across the United States, allowing users of the project's app to scan their eyeballs.

Simply put, the premise is this: scan your eyeball, get a biometric tag, verify yourself, buy our apps (and cryptocurrency). The scary part is that the for-profit company developing the project has now gathered millions in venture capital, lined up powerful partners, and is ready to expand and impose its Minority Report-style technology everywhere. Welcome to Dystopialand.

The World(coin) project is an initiative from the startup Tools for Humanity, co-founded by its CEO Alex Blania. Despite its friendly name, the for-profit corporation has been on many critics' radars for years. From experts to journalists to privacy commissioners around the world, not everyone shares Blania's enthusiasm for his biometric-based technology.

What is the World App?

The World project, recently rebranded from the Worldcoin project (possibly to better convey its expansionist ambitions), presented its plan for the World App to Americans last week. The project is now expanding well beyond the cryptocurrency it started from.

The World App is an everything app, providing users with a World ID that can be verified through the collection of biometric data in the form of an iris scan.

The scan is then filtered and hashed to create a unique identifier that is stored as a so-called "proof of personhood" on the World Network, a blockchain-based protocol.
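
To illustrate the principle (this is a minimal sketch, not World's proprietary pipeline; the feature-extraction step and all names here are hypothetical), a biometric scan reduced to a stable code and hashed into an identifier looks roughly like this:

```python
import hashlib

def extract_iris_code(scan: bytes) -> bytes:
    # Hypothetical stand-in for the filtering step that reduces a raw
    # iris image to a stable binary "iris code". A real pipeline uses
    # image processing tuned so the same eye always yields (nearly)
    # the same code; we fake that determinism with a plain hash.
    return hashlib.sha256(scan).digest()

def proof_of_personhood(iris_code: bytes) -> str:
    # Hashing the code again yields the unique identifier that would
    # be published to the network.
    return hashlib.sha256(iris_code).hexdigest()

scan = b"stable features extracted from one iris"
print(proof_of_personhood(extract_iris_code(scan)))
```

The determinism is the whole point: the same eye must always map to the same identifier. It is also exactly what makes that identifier permanent.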

The World App itself contains a collection of "Mini Apps", where users can manage their cryptocurrencies, chat together, play games, even receive their paycheck, and ultimately live their whole life within the closed "verified" ecosystem of the app.

For a company constantly praising decentralization, it sure looks like they want to make sure they are the center of it all.

To obtain this coveted verification code, users must be ready to share their precious eyeball data with the Orb.

The Orb is a piece of hardware designed by Tools for Humanity to perform iris scans. It is currently available in the United States at six locations, in Austin, Atlanta, Los Angeles, Miami, Nashville, and San Francisco (more to come soon), like some sort of biometric-collection ATM.

The World project's ambition is to expand its reach across the United States and install 7,500 Orbs by the end of this year, so be prepared to see this dystopian technology everywhere soon.

The San Francisco presentation last week was clearly prepared to impress investors, with its Apple-announcement vibe. The promise of a quickly growing startup that everyone will soon want to work with was repeated over and over in different flavors.

Tools for Humanity bragged about many large partnerships that should make any privacy advocate shiver in dread: the Match Group dating app conglomerate (Tinder, OkCupid, Hinge, Plenty of Fish), Stripe, and Visa are among them.

If they succeed in convincing enough people, many of us could soon have little choice but to enroll, willingly or not.

World(coin) isn't new: you might have heard of its unethical practices already

The project claims to have onboarded 26 million people already, including 12 million "users" who are verified (had their biometric data collected).

These "users" are largely located in Latin America, Africa, and Asia. This is because the company started testing for its project there a few years ago, in regions where people often have fewer legal protections.

In 2022, MIT Technology Review published an extensive investigation into the startup's debut in an article titled: Deception, exploited workers, and cash handouts: How Worldcoin recruited its first half a million test users.

The investigation revealed a collection of unethical practices used to pressure the most vulnerable populations into signing up for Worldcoin and having their eyeballs scanned in exchange for money they desperately needed.

Some participants had to provide much more personal information than the company says is required, such as emails, phone numbers, and even photos of official ID. Many people who gave their biometric data to Worldcoin were rushed and misinformed. Some who signed up didn't even have an email address and had to create one. The "Orb operators" hired to perform the scans locally were often poorly trained, poorly informed, and unable to answer participants' questions.

So much so that Kenya suspended the company's operations in 2023 over concerns about its privacy, security, and financial service practices.

Some people who signed up never received the promised money. Some officials were bribed to give participants the impression that these operations were official and supported by the government.

As Ruswandi, one of the people targeted by this early campaign, remarked: "why did Worldcoin target lower-income communities in the first place, instead of crypto enthusiasts or communities?"

Exploiting people living in poverty to test a biometric identification technology isn't a great way to start a project developed by a company called "Tools for Humanity".

Creating the problem, selling the solution

Why develop such a technology in the first place?

Sam Altman himself has expressed concern about the problem this alleged solution solves: the avalanche of fake accounts and pretend persons online caused by the new AI tools unleashed everywhere.

The proposed use of a "proof of personhood" claims to solve this problem by allocating a unique identifier to each human, a personal code supposedly impossible to duplicate or cheat. Of course, this has already been proven wrong.

No one will miss the irony of the CEO of OpenAI, responsible for creating the largest share of this problem, expressing such concern while continuing to feed the fire.

This is a classic case of creating a problem and selling the solution. Well, in this case it is more like selling the problem and selling the solution. As researcher and cryptocurrency critic Molly White pointed out in 2023:

"That's right, the guy who's going to sell us all the solution to a worsening AI-powered bot infestation of the Internet and to AI-induced mass unemployment is the same guy who's making the AI in question."

Sadly, this proposed solution also isn't really a solution, or at least it isn't a good solution. Indeed, this will create a whole collection of new problems, many much worse than a bot infestation.

The risks of sharing biometric data

Biometric data is incredibly sensitive data, because it's irrevocably attached to a person. Whether it's from a face scan, palm scan, fingerprint, keystroke pattern, or iris scan, this data is part of our bodies and cannot be changed like a password if it gets compromised.

For this reason, a growing number of privacy laws around the world now include special categories for such data, and require extra protections and supervision for its collection.

There are many dangers in collecting biometric data. First, if this data gets stolen, criminals can impersonate a victim much more convincingly, because they will have the "proof" to "verify" this is really you.

While straight-up stealing your eyeball or face might still belong to science-fiction, the risk of getting the data produced from the scan stolen is very real.

When the World project claims it is secure because biometric data isn't stored anywhere, even if that were true, the iris code derived from this data is indeed stored and processed somewhere, and it can potentially be stolen.

How hard will it be for a victim to recover an account from a biometric thief when everything is reinforcing the false narrative shared with investors that this technology can't be cheated?

Then, there is the loss of pseudonymity protections online.

If every social media account becomes tied to a unique biometric-based identifier, whether directly or indirectly, there is no pseudonymity anymore.

Further, if only one account is allowed per "verified human", then no one can create separate accounts for their work life and personal life anymore. Creating separate accounts for separate purposes is an excellent privacy-preserving practice.

Even if the identifier isn't tied to a legal name directly, accounts on different platforms using the same identifier could potentially get linked together. To be fair, it does seem Tools for Humanity worked to prevent different platforms from having access to the same code, but how well will this stand the test of time? Will platforms escalate their privacy-invasive requests from this point, like they often do?
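
The difference between a linkable and an unlinkable design fits in a few lines. This is a generic sketch of the pairwise-identifier technique, not Tools for Humanity's actual protocol; the user secret and platform names are made up:

```python
import hashlib
import hmac

# Naive design: every platform receives the same global identifier,
# so joining accounts across services is a trivial database lookup.
global_id = "one-biometric-derived-code-shared-everywhere"

# Pairwise design: derive a distinct code per platform from a secret
# held only by the user, so platforms comparing notes see nothing in
# common.
user_secret = b"key held only in the user's wallet"  # hypothetical

def pairwise_id(platform: str) -> str:
    return hmac.new(user_secret, platform.encode(), hashlib.sha256).hexdigest()

print(pairwise_id("dating-app") == pairwise_id("gaming-platform"))  # False
```

Whether World ID's implementation actually follows the second design, and keeps following it under commercial pressure, is exactly the question.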

Pseudonymity saves lives. It is an essential tool for the safety of the most vulnerable online. Killing pseudonymity by requiring unique biometric identification could endanger millions.

This is a serious problem for upcoming age verification processes as well, which World ID will soon be part of as it tests an implementation for Tinder in Japan.

Biometric data should never be used lightly. It should be reserved for the most extreme cases only.

The regions that have adopted stronger regulations for biometric data collection are moving in the right direction. But will protective legislation be enough to resist the pressure from a for-profit, VC-backed corporation valued in the billions?

Flipping the coin

Tools for Humanity seems to be well aware of its creepiness factor, and of the criticisms brought by privacy commissioners around the world.

Its recent Orb redesign, away from the previous cold (Black)mirror finish, clearly tries hard to replace creepiness with cuteness.

The company has also evidently invested a lot in presenting a pro-privacy image, likely in an attempt to reassure users (and investors).

Unfortunately, many of these privacy-preserving claims are inaccurate. Some claims promoting "features" that might sound impressive to a neophyte's ear are actually just the baseline, and others sadly are misleading at best.

While a few privacy-preserving efforts are indeed positive, most of the focus on privacy relates to marketing much more than any serious protections.

How privacy-preserving is it?

Most people are still put off by the idea of having their eyeball scanned, and the company has evidently invested a lot in promoting a "privacy-preserving" image, possibly as an attempt to reassure unconvinced humans and privacy commissioners alike.

But how much can we trust those claims?

Flawed assumption about what constitutes personal data

The largest assumption about why this technology is "privacy-preserving" seems to come from the fact that the World App doesn't collect names, official IDs (unless it does), emails (unless it does), phone numbers (unless it does), date of birth (unless it does), or other identifiers.

This assumption, however, neglects the fact that 1) even data that isn't attached to a legal name can be personal data, and 2) the iris code produced from the iris scan is itself personal data.

While there are variations, most privacy regulations have similar definitions of what constitutes personal data. The European General Data Protection Regulation (GDPR) defines it as "any information relating to an identified or identifiable natural person". An iris code derived from an iris scan of course fits this definition.

Moreover, to create a World ID, the company also collects a face image. Together, the original iris scan and face photo are referred to as Image Data. For "privacy-preserving" purposes, Image Data of course never leaves the Orb device (unless it does).

While it seems some effort has been made to protect the Image Data in some ways, the idea that data derived from the scans is no longer sensitive personal information is wrong.

If there is a way for a person to scan their iris again and generate the same code, then this data relates to an identifiable person. It also means that someone else could scan that person's iris and generate the same code.

As whistleblower Edward Snowden rightfully pointed out in a 2021 tweet:

“This looks like it produces a global (hash) database of people's iris scans (for 'fairness'), and waves away the implications by saying 'we deleted the scans!' Yeah, but you save the hashes produced by the scans. Hashes that match future scans. Don't catalogue eyeballs.”
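
Snowden's point is easy to demonstrate. In this minimal sketch (hypothetical codes, plain SHA-256 standing in for the real derivation), deleting the raw scans changes nothing as long as the derived hashes are kept, because a future scan of the same eye reproduces a matching hash:

```python
import hashlib

def identifier(iris_code: bytes) -> str:
    return hashlib.sha256(iris_code).hexdigest()

# The raw images are deleted, but the database of hashes remains...
enrolled = {identifier(b"alice-iris-code"), identifier(b"bob-iris-code")}

# ...so anyone holding it can test any future scan against it.
future_scan = b"alice-iris-code"  # the same eye, scanned years later
print(identifier(future_scan) in enrolled)  # True: the eyeball is cataloged
```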

Questionable reassurance about local data

One of the biggest reassurances relates to the claim that sensitive biometric data (Image Data) is only stored locally. But this isn't completely accurate either, and there seems to be conflicting information about it in the company's own documentation.

The World white paper specifies that:

"The Orb verifies that it sees a human, runs local fraud prevention checks, and takes pictures of both irises. The iris images are converted on the Orb hardware into the iris code. Raw biometric data does not leave the device (unless explicitly approved by the user for training purposes)."

However, according to the Biometric Data Consent Form users have to sign prior to data collection, if a user wants a fully verified World ID, this sensitive biometric data will inevitably be sent to their phone, therefore leaving the Orb.

After a user agrees to the form, they can keep the Data Custody option disabled to have their biometric data deleted from the Orb "later" and uploaded to their phone (with all the risks this entails).

The other option users have is to enable Data Custody (if allowed in the user's country) and have this sensitive data sent to both their phone and to Tools for Humanity.

This means the Orb inevitably sends this sensitive data to a mobile device. Then, this data is only as secure as the mobile device is. Which isn't so reassuring.

The documentation does maintain that this biometric data is sent as an "end-to-end encrypted data bundle", but this doesn't mean the data never leaves the Orb. It just means the data leaves encrypted (which is really just the basics) and gets copied to the user's device.
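
To make concrete what "end-to-end encrypted" does and doesn't buy here, consider this simplified sketch (a symmetric key is used for illustration; a real deployment would presumably encrypt to a key pair held by the phone):

```python
from cryptography.fernet import Fernet  # pip install cryptography

# The user's phone holds the key; the Orb encrypts the bundle to it.
phone_key = Fernet.generate_key()
bundle = Fernet(phone_key).encrypt(b"raw iris images + face image")

# The encryption protects the bundle in transit to the phone...
received = bundle

# ...but the phone decrypts it, so from this point on the biometric
# data is exactly as secure as that phone.
images = Fernet(phone_key).decrypt(received)
```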

Furthermore, future users are strongly incentivized to share their Image Data with Tools for Humanity, for algorithm improvement purposes. Pressure to opt in is even presented as a convenience option, because it would be cumbersome to have to return for another scan after every update.

As stated in the Biometric Data Consent Form:

"This will likely help you avoid some inconvenience because, if we have your Image Data, then you will not need to return to an Orb to re-verify your digital identity when we update the software."

The company continues to repeat that it has a "privacy by default and by design approach". But you can't keep your privacy-preserving cake and eat it, too.

What does the white paper say?

In tech, a white paper is usually a research-based document produced by the developers that presents more technical details on an application, product, or process. It is especially valuable for products like the Orb and the World App, where security and privacy should be paramount, and therefore examined closer.

Because it isn't an independent review, however, a white paper can also be worth little more than a marketing pamphlet.

To its credit, Tools for Humanity does warn in its white paper that this information is "intended for general informational purposes and community discussion only and do not constitute a prospectus, an offer document, an offer of securities, a solicitation for investment, or any offer to sell any product, item or asset (whether digital or otherwise)."

Furthermore, the company makes sure to specify that "circumstances may change and that the Whitepaper or the Website may become outdated as a result; and the [World] Foundation is not under any obligation to update or correct this document in connection therewith."

The document is also described as a "crypto-asset white paper".

We have been warned.

In its Privacy section, the white paper states that "no data collected, including images taken by the Orb has or will ever be sold. Nor will it be used for any other intent than to improve World ID."

However, its Privacy Notice also states that they may "share your personal information in connection with, or during negotiations concerning, any merger, sale of company assets, financing, or acquisition of all or a portion of our business by another company."

If this happens, many regretful users might find themselves in the same shoes as 23andMe users this year, when the DNA-collecting company started looking for buyers of its biometric data assets after filing for bankruptcy.

Additionally, the Face Authentication section of the white paper describes a process where encrypted facial biometrics collected from the Orb are used for authentication in the World App.

Even if this data is stored on-device, it is still biometric data getting collected by the Orb then processed by the phone app. There is no question this is sensitive and personal biometric data, and it is indeed kept outside the Orb.

Tools for Humanity lacks consistency in the various claims and statements found throughout its documentation and promotional material. It becomes difficult to know which version to trust, and whether any of it should be trusted at all.

No deletion on the blockchain

Tools for Humanity's Privacy Policy declares that the company will delete all account data (when laws allow it) one month after an account is closed, which is good. It also states it will entirely delete any account inactive for two years, and this is actually a great policy.

But what happens to the World ID, transactions, and other data stored on the blockchain?

While some thought has been put into deletion and some good mechanisms seem to have been implemented, data stored on the blockchain might unfortunately be "deletion-resistant".

There's a possibility that what happens on the blockchain stays on the blockchain, forever.

The policy notes that:

"Due to the public and immutable nature of blockchain technology, we cannot amend, erase, or control the disclosure of data that is stored on blockchains."

So that is something to keep in mind if you value your right to delete.

Data security considerations

Even if some thoughtful security features seem to have been implemented for the World App and its Orbs, nothing processing sensitive data at such a large scale should be left in the hands of a single for-profit, largely unregulated, organization.

This would be like putting 8 billion eggs in a very fragile basket, held by someone paid to make the basket pretty and convince as many people as possible to put their precious single egg in it, with no incentive whatsoever to ensure the basket doesn't break. I would not want to put my egg in there, especially with how much it costs now.

The idea of using one single for-profit app worldwide for "human verification", identity verification, age verification, money transactions, and storing official IDs (and so on and so forth) makes this application a huge target for criminals and hostile governments alike.

It's good that the app had security audits, made some code available as open source, and reportedly plans to open a bug bounty program.

However, some problems remain. For example, the phone in this case becomes a single point of failure. The easiest way to steal someone's identity and money (all at once) will be to steal their phone data (whether physically or remotely). Even without criminal intent, what happens when someone just loses their phone? Or accidentally drops it in the pool? Or steps on it?

With everything relying on a single app and a single device, risk is greatly amplified.

Outside the user's responsibility, Orb operators and Orb stores are susceptible to various attacks. This risk will only grow with the number of users, of course, as the target becomes bigger. In fact, Orb operators have already been hacked.

Then, there is the appeal of fake identities and money fraud for criminals. Already, there is a black market for iris data in China, where people buy iris data (or verified World IDs, according to World) from people in Cambodia, Kenya, and other countries for only a few dollars. The vulnerability allowing this was reportedly fixed, but it is doubtful this is the last one we will hear about.

The Orb itself is also an important potential attack surface. With Tools for Humanity's ambition to fill the world with Orbs everywhere, will Orbs become the next version of the sketchy ATM? Where you might wonder if this funny-looking Orb is trustworthy enough to pay your bar tab without risking emptying your crypto wallet?

Privacy legislators aren't on board

Despite all its privacy promotion material, the World project has failed to convince privacy commissioners around the world of their supposedly good intentions. Perhaps in this case actions speak louder than words, and privacy commissioners aren't so gullible.

With the expansion the project plans this year, we can expect even more experts will examine the company's claims and challenge its "privacy-preserving" assumptions.

There are many reasons to remain skeptical about these promises of privacy. Indeed, numerous countries have already suspended, fined, or called for investigations into the company's (mal)practices.

The company was fined for personal data violations

In 2024, the company was fined 1.1 billion Korean won for violating South Korea's Personal Information Protection Act (PIPA). The Worldcoin Foundation was also issued corrective orders and recommendations. Organizations that are truly "privacy-first" rarely reach this point.

The Data Custody feature, which allows (and encourages) users to share their biometric data with Tools for Humanity, is now unavailable in South Korea.

Brazil has banned Worldcoin in the country

In January this year, Brazil's National Data Protection Authority (ANPD) banned Worldcoin's operations in the country, after the company's appeal was rejected.

The ban stems from regulation stating that consent to process biometric data must be "free, informed, and unequivocal", which cannot be the case when the World project pays users in cryptocurrency in exchange for their iris scans. The regulator also raised data deletion concerns.

The World project tried again to appeal the decision, in vain.

Kenya and Indonesia suspended its operations

In 2023, Kenya, one of the first countries where Worldcoin was available, suspended the company's operations, citing concerns over the "authenticity and legality" of its activities related to privacy, security, and financial services.

The worst part is that, months before, the country's Office of the Data Protection Commissioner (ODPC) had ordered Tools for Humanity to stop collecting personal information from its citizens. The company simply ignored the ODPC order and continued to collect biometric data from Kenyans. It only stopped after Kenya's Ministry of Interior and Administration later issued the suspension order.

This, again, is quite far from the behavior of a company that genuinely values privacy.

More recently, on May 4th, 2025, Indonesia also suspended the World project's operations in the country over concerns related to user privacy and security. The Ministry of Communication and Digital will summon the project's local operators to clarify their operations and determine potential violations of Indonesia's electronic system regulations.

German regulator ordered GDPR compliance following investigation

In December 2024, the German regulator, the Bavarian State Office for Data Protection Supervision (BayLDA), issued an order requiring the company to demonstrate, within one month, deletion procedures that comply with the GDPR. Additionally, the BayLDA ordered the complete deletion of certain data records that had previously been collected without sufficient legal basis.

Again, the World Foundation is fighting the order and will appeal the decision. The company tries to argue the data collected was "anonymized", a common strategy to try to evade the GDPR, which does not regulate anonymized data.

Data protection authorities around the world are investigating

In 2023, France's data protection authority, the CNIL, investigated Worldcoin's activities in the country. The same year, the UK's privacy watchdog started its own inquiry into the company's operations.

In 2024, Hong Kong's Office of the Privacy Commissioner for Personal Data raided six Worldcoin offices citing personal information privacy and security concerns.

There is no doubt more countries and regions will follow with similar investigations and bans as the World project pursues its expansionist ambitions.

In the United States, the app is restricted in some states

Even in the US, where the company is headquartered, the app is restricted in some states. The announcement for its event this month carried a warning that World is “not available for distribution via World App to people, companies or organizations who are residents of, or are located or incorporated in the State of New York or other restricted territories.”

We can also expect the project will encounter roadblocks in states that have passed regulations specific to the collection of biometric data. This includes states like Illinois, Texas, Washington, and Colorado.

Some regions have special regulations for biometric data

Around the world, the number of biometric-specific regulations is growing. Even without a regulation specific to this type of data, many privacy laws have started to include special categories and requirements to govern the collection and processing of sensitive biometric data. As companies increasingly request such collection, legislation to protect users is essential.

For example, the province of Quebec in Canada has recently implemented strong protections for biometric data with its new privacy law, Law 25. Consent isn't sufficient to collect biometric data, as the law requires organizations to explicitly justify the necessity of such collection in the first place. Importantly, violations of Law 25 come with fines as hefty as the GDPR's.

More privacy laws should implement such protections quickly, as corporations collecting biometric information carelessly are multiplying fast.

Welcome to full dystopia

The most concerning part of the World project's recent expansion isn't its cryptocurrency grift as much as stepping out of it.

If cryptocurrency enthusiasts wish to share their personal data to get into a special cryptocurrency club, so be it (although privacy regulations should still protect them). But using financial coercion to recruit new users by exploiting vulnerable communities living in poverty is absolutely despicable.

Further, the fact that the World project has partnered with powerful players in the financial, gaming, and even dating sectors should terrify everyone.

Beyond cryptocurrency, if platforms everywhere start demanding that users verify they are human, and verify they are adults, through the World ID system, then everyone will soon be subjected to it.

The amount of money invested in the project means there will be an incredible pressure to spread it everywhere soon, and monetize it. There will be a strong incentive to monetize our data and to monetize our proof of humanity. This isn't trivial.

The well-known dating app Tinder has already partnered with World ID to verify the age of users in Japan. If this experiment works well, and if users comply without objection, this could soon become mandatory for all dating apps.

Let's not stop at dating apps: last week, the World project also announced it will be working with Razer to verify the humanity of online gamers. How far can this go in the age of age verification? Will every online game with mature content soon require a World ID to play?

What about social media? Tools for Humanity's team has insisted that the age of AI has made us incapable of detecting whether we are interacting with bots online. Therefore, they must valiantly come to our rescue and verify our humanity by scanning our eyeballs (which bots tragically lack). What if this human verification is expanded to all our social media accounts? Certainly, regulators pushing for authoritarian age verification online would be delighted by such a product.

Then, it comes for our money. The everything app of course offers payment and money management features. This is the app where you can keep your whole wallet, containing all your official IDs, your cryptocurrencies of all kinds, and even connect your less hyped regular bank accounts.

Imagine a single app, owned by a single for-profit corporation, that collects and processes all the data from all your transactions online, all your communications online, that you absolutely have to continue using for your other social media accounts, your gaming life, and your dating life.

There could soon be no way to escape the grasp of World's everything app. Indeed, some governments (Taiwan and Malaysia) have already started using it for official services, because why not.

The ways this could degenerate fast into full dystopia are infinite, and very real.

The company even plans to ship the Orb Mini next year, a pocket-sized personal spy device with which users will be able to scan their own eyeballs on the go!

But why stop there? Why not scan other people's eyeballs as well? Maybe all government officials could carry one? Maybe every payment terminal could have one too?

We will find out soon, in one or two years.

Tools for Humanity also bragged about the numerous uses its new technology could make possible. For example, event tickets! Order a concert ticket with your "proof of personhood", then confirm you are the owner by having your eyeballs scanned to attend a Rage Against the Machine concert?

The only fun part in this is the irony.

Tools for Humanity with its expansionist dream is without a doubt hungry enough to eat the whole World™️.

A new world of wealth inequalities

The company brings up Universal Basic Income (UBI) a few times in its documentation, and even mentions it briefly in its white paper.

While puzzling, it appears Tools for Humanity might consider its sign-up cryptocurrency bribe and subsequent token giveaways to be some form of UBI? Or perhaps this is just another facet of its ambition to control all the financial systems in the entire world. Why UBI is even mentioned at all in this context is unclear.

Regardless, it's worth mentioning that a for-profit company giving cash back in exchange for biometric data isn't UBI at all; it's just creepy membership card points, at best.

While the World project works hard to present the idea this is a tool for the people, where everyone is equal, wealth will definitely not be distributed evenly in this new World order.

Already, 11.1% of World's cryptocurrency tokens (WLD) have been allocated to the World team, 13.6% to investors, and 0.3% are reserved for Tools for Humanity. This means these entities would share 25% of the wealth between them, while the rest of the world's population (all of humanity, according to Tools for Humanity's ambition) would have to share the remaining 75%.

In the new "human" world this corporation envisions, Tools for Humanity and its investors would own 1 quarter of the entire world's wealth. There is nothing equitable or communal in a system like this.

It's important not to forget that this everything app will do everything to pressure its users into eventually using Worldcoin, its ultimate goal.

From Tinder's mandatory age verification to cryptocurrency financial ruin in one single move.

The normalization of surveillance

Even if this process was perfectly secure and perfectly private (which it is definitely not), the problem remains the normalization of surveillance.

This isn't limited to Tools for Humanity, although the way the company tries to advertise itself as a privacy-first organization makes it even more important to scrutinize.

But anyone else with a similar approach to collecting biometric data for verifying humanity, age, or legal names should be on our radar. Even more so if it's a for-profit corporation with the power to impose this technology on us everywhere in the world.

One company should never have such power.

Further, biometric data should never be used for trivial purposes like "proof of personhood" or age verification. No amount of supposedly "privacy-preserving" features can change this.

The premise itself is flawed from the start when it comes to respecting privacy rights.

While the problem of proving identity can still be an important one to solve in some contexts, the solution can never be monopolized by for-profit corporations.

Regardless of Tools for Humanity's intentions and efforts to convince us to trust them, any similar technology is just another step towards a global system of mass surveillance, where ultimately privacy rights and human rights are lost.

So, should you scan your eyeball to get a verified World ID?

No.

No, you really shouldn't.



Thank you for reading, and please consider sharing this post with your friends. Privacy Guides is an independent, nonprofit media outlet. We don't have ads or sponsors, so if you liked this work your donation would be greatly appreciated. Have a question, comment, or tip for us? You can securely contact us at @privacyguides.01 on Signal.