Crade AI Privacy Review: Should Freelancers Let It Watch Their Screen?
Crade AI launched on Product Hunt this week as a desktop assistant that watches your screen and answers questions about whatever you’re looking at. The pitch is convenient — but for solo freelancers, screen-watching AI raises a different question: when your screen contains client invoices, draft contracts, and unredacted email threads, who else gets a look? Crade processes your screen images through a third-party AI provider, keeps your chats on its servers, and retains a separate stream of anonymous logs that, by its own admission, cannot be deleted once collected. Verdict preview: use with caution, and never on screens containing client data without serious workflow changes.
What Crade AI does with your data
Crade is a Mac and Windows desktop app (with an iOS companion) that captures your screen on demand and routes those images to an AI model for analysis. Per Crade’s privacy policy, retrieved 2026-05-13 (policy last updated 2026-04-09), the data flow has three distinct streams.
First, screen and camera content. When you share your screen on web or use the camera on iOS, the images travel to Google Gemini API for analysis. Crade states it does not retain those images on its own servers, but the policy is clear that Gemini does process them and that this processing is governed by Google’s API terms — not by Crade’s policy directly.
Second, your chat history. Each question you type and each AI response is stored on Crade’s servers, tied to your account, so you can access it across devices.
Third, and this is where the story changes, anonymous product logs. Crade keeps a parallel record of message content, AI responses, the name of the application you had open (for example “Excel” or “Google Chrome”), and, when you’re in a browser, the domain you were on (for example “github.com”). The policy explicitly states these logs are not linked to any account or identifier, that Crade cannot determine which user sent which log entry, and that they may be retained indefinitely. Because nothing in the log can be tied back to a specific user, deleting your account does not remove them — and there is no path to identify or delete your contributions.
Third-party processors named in the policy include Google Gemini API (screen and chat processing), Supabase (account storage), and Apple In-App Purchase (iOS billing). Account info collected is limited to your email and credit balance.
What this means for solo freelancers
If you do bookkeeping, contract work, or anything else involving client data while Crade is active, here is what could go wrong, based on the policy as written.
Scenario one: a designer shares the screen to ask Crade about a CSS bug while the next browser tab shows a client’s logo brief, complete with company name in the URL bar. The visible portion of the screen is sent to Google Gemini. The chat exchange — including any AI-generated description of what was on screen — is stored on Crade’s servers indefinitely, tied to the account, and a parallel anonymous log captures the message content and notes that the user was in a browser on, say, “figma.com”. Identifying the specific client from the logs alone is hard, but the message content itself may include client-identifying details if Crade described them in its response or if you typed them yourself.
Scenario two: an accountant asks Crade for help reading a complex spreadsheet during a client engagement. The image of the spreadsheet — potentially including client business names, revenue figures, employee names — is sent to Gemini. Crade does not store the image, but processing happened on Google’s infrastructure under Google’s API terms, which historically allow some form of input use for safety and abuse monitoring.
Scenario three: a freelance researcher working under an NDA uses Crade to summarize a confidential PDF visible on screen. The screen image transits Gemini. The summary is stored on Crade’s servers and in anonymous logs that cannot be removed. Based on the policy as written, the freelancer has no mechanism to retroactively delete the anonymous log entries, which carries contract-breach risk if the NDA includes data destruction clauses.
For EU clients, the controller-processor chain here is layered: client → freelancer → Crade → Google Gemini → possibly further downstream within Google. Based on the policy as written, this chain creates ambiguity around who is responsible for what and lengthens the path for any data subject access request.
How to use it safely
If you still want Crade for non-sensitive personal work, the policy and product surface offer some levers.
Run Crade only on a clean profile. On macOS or Windows, create a separate user account that has no client documents, no synced cloud folders, and no auto-loading email or messaging apps. Use Crade exclusively in that profile.
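The account-creation step can be done from the command line on both platforms. The commands below are an illustrative sketch — the account name “crade-clean” is an arbitrary example, and both commands will prompt you for a password interactively:

```shell
# macOS: create a standard (non-admin) user with no access to your
# main profile's home folder. Run from an admin account.
sudo sysadminctl -addUser crade-clean -fullName "Crade Clean Profile" -password -

# Windows: from an elevated Command Prompt, create a local standard
# user. The "*" makes net user prompt for the password instead of
# leaving it in your shell history.
net user crade-clean * /add
```

After creating the account, log into it directly and confirm that no cloud-sync folders or email clients auto-start before installing Crade there.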
Close everything before sharing the screen. Crade analyzes whatever pixels you send it; the policy does not protect you from inadvertent capture of an adjacent window. Single-window mode in your screen-sharing flow is non-negotiable for any client-adjacent work.
Never share a browser window where the URL or tab title contains a client name. The anonymous logs explicitly capture the browser domain — and while the policy says they exclude the full URL and window title, the chat content itself can leak anything you type.
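A small local habit aid can make this rule mechanical rather than a matter of memory. The sketch below is hypothetical — it is not part of Crade’s product, and the helper name and blocklist entries are invented for illustration. It checks a window title and URL against a personal list of client names before you hit share:

```python
# Hypothetical pre-flight check: scan a window title and URL against a
# personal blocklist of client names before sharing the screen.
# The blocklist entries below are placeholder company names.
CLIENT_BLOCKLIST = {"acme", "globex", "initech"}  # your clients, lowercased

def is_safe_to_share(window_title: str, url: str = "") -> bool:
    """Return False if any blocklisted client term appears in the
    window title or URL (case-insensitive substring match)."""
    haystack = f"{window_title} {url}".lower()
    return not any(term in haystack for term in CLIENT_BLOCKLIST)

print(is_safe_to_share("CSS grid bug - Stack Overflow"))           # safe to share
print(is_safe_to_share("Acme Corp logo brief", "figma.com/acme"))  # blocked
```

A substring match is deliberately aggressive: false positives cost a moment of review, while a false negative puts a client name into logs you can never delete.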
Delete your account, not just your chats, when you stop using Crade. Account-linked data is deletable through the in-app settings; anonymous logs are not deletable at all, so the only mitigation is reducing what enters them in the first place.
Treat Crade as a personal productivity tool, not a client-data tool. If your work involves NDAs or regulated data (health, legal, financial), assume Crade is incompatible with those engagements until the policy provides per-log deletion and a documented Data Processing Addendum.
Privacy-friendlier alternatives
For solo freelancers who want screen-aware AI help without the third-party processing chain and the indefinitely retained anonymous logs, three alternatives map better to client-data workflows.
For local-only screen analysis, run a vision-capable model on your own machine. Tools like LM Studio (https://lmstudio.ai, free) and Ollama (https://ollama.com, free) can run open-weight vision models such as Llama 3.2 Vision and Pixtral. Screenshots stay on your hardware, no third party processes them, and there are no anonymous logs to worry about because there is no vendor doing the logging. What this gives you that Crade does not: zero outbound data flow during analysis. Target user: anyone with a recent Mac (M2 or later) or a PC with at least 16 GB RAM and a decent GPU. Pricing: free, beyond hardware you likely already own.
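To make “zero outbound data flow” concrete, here is a minimal sketch of that workflow against Ollama’s local HTTP API. It assumes you have run `ollama pull llama3.2-vision` and that the Ollama daemon is listening on its default port (11434); the screenshot filename is a placeholder. The request targets localhost, so the image never leaves your machine:

```python
# Send a screenshot to a locally running Ollama vision model.
import base64
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(image_path: str, prompt: str) -> bytes:
    """Base64-encode a screenshot and package it into the JSON body
    Ollama's /api/generate endpoint expects for vision models."""
    with open(image_path, "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode("ascii")
    return json.dumps({
        "model": "llama3.2-vision",
        "prompt": prompt,
        "images": [image_b64],
        "stream": False,   # return one JSON object instead of a stream
    }).encode("utf-8")

if __name__ == "__main__":
    body = build_request("screenshot.png", "What is shown in this spreadsheet?")
    req = urllib.request.Request(OLLAMA_URL, data=body,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read())["response"])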
For private cloud AI without screen capture, pair Proton Mail and Proton Drive (https://proton.me, free tier with paid plans from around $4 to $10 per month) with a privacy-first AI provider like Kagi Assistant (https://kagi.com, included with Kagi Ultimate at $25 per month) or Mistral Le Chat (https://chat.mistral.ai, free tier available, EU-hosted). You lose the screen-watching convenience but gain end-to-end encrypted document storage and a clearer, EU-anchored controller-processor chain. What this gives you that Crade does not: a documented Data Processing Addendum on request and named EU hosting. Target user: freelancers serving EU clients, those under NDA, or anyone who handles regulated data.
For password and credential hygiene that matches your AI workflow, 1Password (https://1password.com, around $3 per month for individuals) and Bitwarden (https://bitwarden.com, free tier with paid plans from $1 per month) both publish detailed security whitepapers and offer business plans with audit logs. What this gives you that Crade does not: transparent, audited storage with per-record deletion. Target user: any solo worker who currently relies on browser-saved passwords alongside an AI assistant.
For hardware-backed account protection on whatever cloud AI you do end up using, a YubiKey 5C NFC (Amazon, around $55, https://www.amazon.com/dp/B08DHL1YDL/?tag=aidtaskpro-20) makes account takeover dramatically harder than passwords alone. What this gives you that Crade does not: a physical key that an attacker cannot phish.
For the network layer, if you must use cloud AI on public networks, route your traffic through NordVPN (https://go.nordvpn.net/aff_c?offer_id=15&aff_id=144642&url_id=902) so that vendor-side logs do not also tie your activity to your home or coworking IP address. NordVPN runs from around $3 per month on a two-year plan and includes a kill switch that blocks traffic if the tunnel drops.
The verdict
Use with caution. Crade AI’s policy is unusually transparent about what it collects and where the data goes, which is a credit to the vendor. But two structural choices — routing screen content through Google Gemini under Google’s API terms, and keeping anonymous logs that cannot be deleted because they cannot be identified — make Crade a poor fit for any workflow involving client data, NDAs, or regulated information. For personal use and non-sensitive tasks on a clean profile, the tool can be useful; for paid client work, the data path and the irreversibility of the anonymous log retention create risks that outweigh the convenience.
Frequently asked questions
Does Crade AI train on my screen content?
The Crade privacy policy, retrieved 2026-05-13, does not state that screen content is used to train Crade’s own models. However, screen images are sent to Google Gemini API for analysis, and use of that data is governed by Google’s API terms rather than Crade’s own policy. Solo freelancers should treat any screen content sent to Crade as subject to Google’s processing terms in addition to Crade’s, which is a longer chain than a self-hosted alternative.
Can I use Crade AI for HIPAA-regulated client work?
Based on the policy as written, Crade AI does not publish a HIPAA Business Associate Agreement, does not describe HIPAA-aligned controls, and routes content through Google Gemini API without naming a BAA on that leg either. Freelancers handling protected health information should treat Crade as out of scope until and unless Crade documents a HIPAA-aligned data flow and offers a signed BAA. This is not legal advice; consult a compliance professional for your specific situation.
Is Crade AI suitable for EU clients under data protection law?
Based on the policy as written, Crade does not publish a dedicated Data Processing Addendum, does not specify EU data hosting, and uses Google Gemini API as a sub-processor without naming the hosting region. The controller-processor chain extends through Crade to Google, which adds steps to any data subject access or deletion request. Freelancers serving EU clients should request a written DPA from Crade before processing any client-attributable content.
What happens to my Crade data if I delete my account?
Per the policy retrieved 2026-05-13, deleting your Crade account removes your account-linked data, including stored chat history. It does not remove the anonymous product logs that Crade retains, because those logs are deliberately unlinked from any account identifier — Crade states it cannot determine which user originated which log. The practical effect is that some portion of your activity remains on Crade’s analytics surface indefinitely.
Does Crade encrypt my data end-to-end?
The Crade privacy policy, retrieved 2026-05-13, does not describe end-to-end encryption for screen content, chat messages, or anonymous logs. Screen images move from your device to Google Gemini for processing, which means at minimum they are decrypted at Google’s API endpoint. Freelancers who need end-to-end encrypted document workflows should look at on-device AI options (Ollama, LM Studio) or encrypted-storage providers like Proton.
Is there a way to opt out of Crade’s anonymous logs?
The Crade privacy policy, retrieved 2026-05-13, does not describe a user-facing opt-out for anonymous product logs. The only mitigation is to limit what enters the logs in the first place — by not asking Crade questions about client-identifying screen content, and by closing client tabs before sharing the screen. Based on the policy as written, the logs persist whether or not the user account remains active.
Sources
- Crade AI privacy policy, retrieved 2026-05-13, https://crade.ai/privacy (policy last updated 2026-04-09)
- Crade AI homepage and product description, retrieved 2026-05-13, https://crade.ai/
- Crade Product Hunt launch listing, retrieved 2026-05-13, https://www.producthunt.com/products/crade
Reviewed by Jérémy, founder of AidTaskPro and GreenBudgetHub. Based in central France. Privacy posture sourced from public policies and vendor documentation as of 2026-05-13.