Redefining Tech with Ethics: “Whoever writes the code dictates the rules.”

At the inaugural FPS2025 summit, one of the most captivating conversations centered on the intersection of artificial intelligence (AI), privacy, and the ethical implications of technology.

Meredith Whittaker, President of Signal and a vocal advocate for tech accountability, engaged in a thought-provoking discussion with Matteo Flora, one of Italy's foremost experts on AI and digital reputation, and among the most prominent globally. Their conversation, which took place in L’Aquila, Italy, highlighted critical issues in the tech industry today — from the dangers of surveillance capitalism to the ethical dilemmas of AI deployment.

I decided to dedicate this post to that interview, which I found incredibly important and yet not widely seen or heard.

The Code Dictates the Rules: Who’s Behind AI?

A central theme of the conversation was the idea that “whoever writes the code dictates the rules.” Whittaker emphasized the profound implications this has for our society.

With the rise of AI, we are seeing unprecedented scales of power being concentrated in the hands of a few tech giants who not only control vast amounts of data but also influence societal norms and political landscapes.

AI, she explained, is not magic but simply sophisticated code executed on powerful hardware platforms. However, these systems are often built on biased data, reflecting the interests of a small group of corporations with a profit-driven agenda. Whittaker raised concerns about the monopolization of AI development, highlighting how only a handful of companies own the infrastructure and the data that train these systems. This creates a dangerous asymmetry in power, as these companies dictate the narrative surrounding AI, often casting it as a superhuman force that society should revere.

The Ethics of AI and Data Collection

One of the most pressing issues discussed was surveillance capitalism—the business model that underpins much of the tech industry. Whittaker contrasted this model with the mission of Signal, which is to provide a secure, private communication platform that respects user privacy. She pointed out that the current tech landscape, where personal data is collected and monetized, is not designed with the user’s best interest in mind.

Signal, as a nonprofit organization, faces the challenge of maintaining its privacy-first model in an industry that thrives on data exploitation. Running such a platform at scale costs millions of dollars annually, and Whittaker explained that if Signal were a for-profit company, it would inevitably face pressure from investors to compromise its privacy standards for the sake of profit. This is why Signal must remain independent of the profit-driven business models that dominate the tech industry.

The Global Impact of Privacy

Throughout the conversation, the global implications of tech and privacy were a recurring theme. Whittaker discussed the importance of cross-border communication and how platforms like Signal empower people worldwide to connect securely. Unlike nationalized versions of tech products, which would be restricted by jurisdictional boundaries, Signal’s universal design ensures that it works seamlessly for people across different countries and cultures.

In addition to tech’s global reach, Whittaker also emphasized how AI can contribute to the homogenization of culture, creating a risk of losing linguistic diversity and cultural richness. She cautioned against the dangers of AI reflecting only the narrow perspectives of a few dominant cultures, particularly those in Silicon Valley, and warned that this could stifle creativity and unique ways of thinking.

The underlying message is this: most, if not all, large global digital platforms are designed to collect user data, which makes it nearly impossible to build similarly large, stable, and user-friendly platforms without relying on data-driven returns. Even the basic infrastructure is nearly impossible to build without enormous investment.

Building a Better Future with Tech

Despite the challenges and ethical quandaries, Whittaker remains optimistic about the potential for change. She advocates for innovating the business models behind tech, suggesting that true innovation lies not just in new technologies but in rethinking how these technologies are funded and distributed. She sees a future where privacy and fairness are not afterthoughts but foundational principles.

The conversation at FPS2025 ended with a call for a societal shift. Whittaker urged policymakers to adopt regulations that challenge the current business models and protect individual rights in the face of rapidly advancing technology.

She also urged the European Parliament to embrace a more equitable tech ecosystem, one that does not succumb to the dominant narratives set by large corporations but instead fosters a digital environment that works for everyone.
Especially now that 44 European CEOs have signed a letter asking the EU to pause the AI Act for two years, it is the time to rethink the legislation and make it suitable for innovation while ensuring parity and equal rights.

The Bottom Line: Tech for Good

Whittaker’s reflections underscore the importance of ethical tech in the modern world. As AI continues to reshape our societies, the conversation about who controls the code and how that power is used has never been more crucial.

Through platforms like Signal, Whittaker and her team are pushing back against the status quo, demonstrating that it is possible to build technology that prioritizes privacy and fairness over profit. The challenge, however, remains: will we as a society allow this vision to flourish, or will we continue to let surveillance capitalism dictate the rules?

In the end, Whittaker’s message is clear: the future of tech must be inclusive, transparent, and accountable. It’s up to us — the users, the builders, and the policymakers — to decide how we want that future to look.
