Decoding: Yesterday's Watchtower, Today's Algorithm
Without walls, but still prisoners. A deep analysis of how Bentham's panopticon evolved into the invisible algorithmic surveillance that shapes our lives.

Without walls, but still prisoners. We live in an era where surveillance has ceased to be visible and become invisible: hidden in algorithms, digital flows, and automated decisions that silently shape every aspect of our lives.
Yesterday's Watchtower, Today's Algorithm
The first image captures a disturbing truth: the watchtower didn't disappear, it just changed form.
What was once concrete and visible - walls, bars, guards - is now abstract and invisible: code, data, probabilities. But the power remains. Control persists.
The difference? Today, we don't even realize we're being watched.
The Panopticon: From Bentham to Digital
In the late 18th century, Jeremy Bentham proposed the panopticon: a circular prison with a central watchtower from which a single guard could observe all prisoners, who could never know whether they were being watched at any given moment.
The perverse genius of the design: uncertain whether they were being observed, prisoners would internalize the surveillance and discipline themselves.
The panopticon didn't disappear. It ceased to be visible to infiltrate statistics, recommendations, metrics.
The Evolution of Surveillance
Yesterday:
- Physical and visible tower
- Guards observing directly
- Tangible walls and bars
- Surveillance you could see
Today:
- Invisible and silent algorithms
- Systems constantly collecting data
- "Bars" of calculations and probabilities
- Surveillance you don't even notice
Today there are no walls, but surveillance remains: invisible, hidden in calculations, digital flows, and automated decisions.
From Observer to Data Collector
Power no longer observes individuals directly; it collects data, predicts probabilities, and induces collective behaviors under an appearance of statistical neutrality.
This change is fundamental. The new surveillance doesn't need to see you - it only needs your data:
- Every click is recorded
- Every purchase is analyzed
- Every pause is measured
- Every movement is predicted
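What "collecting data" means in practice can be made concrete with a toy event log. The sketch below is entirely hypothetical - invented user IDs, actions, and targets - but it shows the basic pattern: every interaction, however trivial, becomes a timestamped record that can later be aggregated into a behavioral profile.

```python
import time

events = []  # in real systems, this stream is shipped to a remote server

def track(user_id: str, action: str, target: str) -> None:
    """Record one interaction as a structured data point."""
    events.append({
        "user": user_id,
        "action": action,   # click, purchase, pause, scroll...
        "target": target,
        "timestamp": time.time(),
    })

# Ordinary browsing becomes a behavioral trace:
track("u42", "click", "article/panopticon")
track("u42", "pause", "video/surveillance-doc")
track("u42", "purchase", "book/zuboff")

# The trace is then trivially queryable:
actions = [e["action"] for e in events if e["user"] == "u42"]
print(actions)  # ['click', 'pause', 'purchase']
```

No single record here is sensitive; the power lies in the accumulation.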
Kaspersky research (2023) revealed that 64% of users believe algorithms manipulate what they consume online. Most know they're being influenced - yet the influence continues.
The Illusion of Neutrality
They sell us the idea that algorithms are "objective", "impartial", based on "pure data". But this is an illusion.
Every algorithm carries:
- The values of those who programmed it
- The interests of those who finance it
- The biases of the data that trained it
- The objectives of those who control it
"Statistical neutrality" is power's new disguise.
The Tower Was Exchanged For Invisible and Silent Numbers
Power doesn't ask permission. It shapes routines discreetly:
- The feed anticipates your interests (and directs them)
- Waze changes your routes (folding you into traffic patterns)
- The score defines your possibilities (limiting your future)
Everything happens before you have a chance to reflect. The decision has already been made. You just execute.
The True Control Field
Statista data (2024) shows that 80% of people recognize algorithms' direct impact on their consumption.
There are no bars, but there are calculations that guide daily decisions, choosing for us before we can reflect.
This silent influence has become the true control field.
Do you think you choose what to watch on Netflix? Or has the algorithm already decided which options you'll see?
Algorithms Are Not Neutral
They direct attention, amplify profits and establish subtle forms of control.
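How an objective function "directs attention" can be shown with a toy ranker. The sketch below - hypothetical titles and numbers - ranks the same catalog two ways: by the user's own rating, and by predicted watch time, the kind of engagement proxy a profit-driven platform might optimize. Changing the objective changes what surfaces first, with no visible coercion.

```python
# Hypothetical catalog: (title, user_rating, predicted_watch_minutes)
catalog = [
    ("Documentary", 4.8, 50),
    ("Reality show", 3.1, 170),
    ("Drama series", 4.2, 120),
]

def rank(items, key):
    """Return titles ordered from highest to lowest score."""
    return [title for title, *_ in sorted(items, key=key, reverse=True)]

# Ranked by what the user says they value:
by_rating = rank(catalog, key=lambda item: item[1])
# Ranked by what keeps the user on the platform:
by_engagement = rank(catalog, key=lambda item: item[2])

print(by_rating)      # ['Documentary', 'Drama series', 'Reality show']
print(by_engagement)  # ['Reality show', 'Drama series', 'Documentary']
```

Same catalog, same user, opposite orderings: the "choice" presented first is a design decision, not a fact about you.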
As Shoshana Zuboff highlighted in her book "The Age of Surveillance Capitalism":
"Digital power doesn't act by force, but by capturing daily behaviors without conscious resistance."
Surveillance Capitalism
Zuboff argues we live in a new form of capitalism where:
1. Human experience is free raw material
   - Every online action is extracted as data
   - Behavior is mined as a natural resource
2. Prediction is the product sold
   - Companies don't sell products to you
   - They sell predictions about you to advertisers
3. Behavioral modification is the goal
   - Not just predicting what you'll do
   - But influencing what you'll do
4. Consent is illusory
   - You "accept" terms you didn't read
   - Alternatives are nonexistent (try living without Google/Facebook)
The watchtower became invisible infrastructure of the digital economy.
Resisting Is Not Rejecting Technology, But Questioning It
Resistance takes shape culturally and socially in practices like:
Personal Reflection
- Question your own pleasure: Why do I like this? Or did the algorithm teach me to like it?
- Rethink time: Did I choose to spend 3 hours on Instagram or was I captured?
Digital Detox
- Intentional offline periods
- Disable notifications
- Use airplane mode strategically
Critical Consumption
- Series and documentaries about digital manipulation
- Books that unmask surveillance
- Conversations about privacy and data
Active Awareness
Cultivating attention about:
- Data (what do I share?)
- Information (where does what I consume come from?)
- Attention (where am I depositing my time?)
Resistance isn't luddism. It's lucidity. It's not rejecting technology, but refusing invisible submission.
The Global Power Balance
Regulations like the European Union's AI Act already seek to limit abuses by invisible systems, recognizing their growing influence.
The Regulatory Dilemma
On one hand:
- ✅ Protects citizens from manipulation
- ✅ Makes algorithms more transparent
- ✅ Establishes ethical limits
On the other:
- ⚠️ Limits innovation
- ⚠️ Can slow technological development
- ⚠️ Creates global competitive disadvantage
Europe regulates. The US innovates without brakes. China controls everything.
Each approach weighs on the global balance of power, shaping not just technology but nations' political and economic futures.
Who Defines the Rules?
- Governments try to regulate
- Big Tech resists
- Users are pawns
- The game continues
The question is: who should have the power to decide?
Statistical Power Seems Inevitable, But Can Be Displaced
The panopticon didn't disappear. It ceased to be visible to infiltrate:
- The statistics that classify us
- The recommendations that direct us
- The metrics that evaluate us
The Convenience Trap
What seemed convenience is, often, disguised discipline:
- "Personalized recommendations" = algorithmic bubble
- "Optimized feed" = programmed addiction
- "Customized experience" = segmented manipulation
Freedom today is not measured by the absence of bars, but by the ability to see the calculations that shape us.
You're free when you choose consciously, not when you automatically follow algorithmic suggestions disguised as choices.
How to Displace Statistical Power
1. See the Invisible
Awareness is the first step:
- Perceive when you're being influenced
- Recognize manipulation patterns
- Identify moments of attention capture
2. Create Friction
Add intentional friction:
- Delete addictive apps from phone (access only on PC)
- Disable autoplay on everything
- Use tracking blockers
- Pay with cash sometimes (not just traceable card)
3. Data As Currency
Understand the value:
- Your data is worth billions (aggregated)
- Companies profit from your privacy
- Should you be paid for it?
4. Counter-hegemonic Technology
Use resistance tools:
- Privacy browsers (Brave, Firefox)
- Non-tracking search engines (DuckDuckGo)
- Encrypted messengers (Signal)
- VPNs for browsing
5. Collective Organization
Individual resistance is limited:
- Movements for digital rights
- Political pressure for regulation
- Organized boycotts
- Massive digital education
Creators and Artists Subvert Algorithms
They hack patterns, expose hidden mechanisms, and break expectations to open digital cracks.
Digital artists are:
- Creating works that confuse recognition algorithms
- Generating "noise" that pollutes surveillance databases
- Developing generative art that exposes AI biases
- Using glitch art to show system failures
This is where new paths can be found: not passive resistance, but active creative subversion.
Conclusion: Freedom in Times of Invisible Surveillance
The panopticon evolved. Surveillance became sophisticated. Control became invisible.
But it's not inevitable.
Every time we:
- Question a recommendation
- Disable a notification
- Read instead of scroll
- Choose instead of accept
We're displacing statistical power.
Freedom today is not measured by the absence of bars, but by the ability to see the calculations that shape us - and by the courage to not obey them automatically.
Power seems inevitable. But it can be displaced.
Start seeing. Keep questioning. Never stop resisting.
🔗 Continue the Conversation
Join the Sapiens Sintéticos community:
- 🌐 Website: SapiensSinteticos.com - Complete content about AI, technology and autonomy
- 📸 Instagram: @sapiensinteticos - Daily visual reflections
- 💬 WhatsApp Channel: Sapiens Sintéticos Channel - Exclusive updates and discussions
When was the last time you questioned an algorithmic "recommendation"? Share your experience in any of our channels.
Decoding is a series dedicated to unveiling the invisible mechanisms that shape our digital reality. This is the first episode. More will come.