The lens, known officially as the SiLens, was an integrated contact lens implanted into the eyes of Societopia's citizens once they were plugged into the Mainframe. Operating entirely on a network of applications, the lens served as an all-in-one platform hosting apps for education, entertainment, and everything in between. It could function as a personal computer or a video game console. A recent update included a new app that allowed users to traverse the virtual world, even as they slept.
To access an app, citizens were required to say the words, “Access Me,” and so it would be. Ideally, most apps would run for only nine minutes. Those nine minutes equated to ninety years’ worth of knowledge for whoever accessed the information. Nine seconds became nine minutes, then ninety minutes, then nine hours, then nine days, and so it continued. The number nine, for reasons never fully explained, held a sacred role in this society.
Only nine apps could run simultaneously. Any more would crash the lens, and possibly the network. No one knew why. They were simply told that nine was sacred. Like a religious symbol, it was the number that allowed computers to communicate with people, much as artificial intelligence was slowly being developed to do.
It made sense, in a way, since machines spoke to one another in ones and zeroes. Nine seemed logical enough to serve as the bridge between human and machine: a universal language that translated both human-speak and machine-speak. The SiLens made that connection possible and subtly suggested that humans were not all that different from machines. The only real difference was how we were built compared to the “robot next door.”
We were told we were different. We were told to stay away. Machines became royalty and took positions of power while we, the humans, suffered from explosive population numbers. Slowly, humanity was nearing extinction. But the machines still had some use for us.
SiLens told us what to say, what to do, and how to act—while still allowing us a small illusion of “mental freedom.”
The freedom to use one’s mind became restricted to dreams, those fragile moments we had while sleeping. While awake, we were required to work. No sustenance was needed, which we found peculiar, given that the humans of old had once required food and drink to survive. Why we no longer needed it made no sense, but it was forbidden to ask such questions. SiLens would stop us, perhaps even turn us in to the authorities: the very machines our ancestors had built to transcend themselves.
We gave them too much power. They ran away with it.
Discrimination against humans was evident in how these newly empowered sentient beings treated us. We had become slaves to the machine, though we no longer remembered a time when we weren’t.
The SiLens was the first of hundreds of technologies created by ArchiTech, Inc. that utilized nines in their machine coding. When the SiLens was implanted in the first humans, we entered the halls of The Collective—a virtual cocoon where humans could connect with each other through shared dreams. It quickly became the world we preferred. When we were inside The Collective, we longed to remain there forever. It was peaceful, painless, and filled with emotions.
Emotions were rare in Societopia—absent at birth. A virus had once swept through the world, targeting the part of the brain responsible for emotion. Since then, no humans were born with the capacity to feel.
Without natural emotions, chaos ensued. Lawlessness reigned. Death and destruction followed. Without fear or empathy, people waged war against authority. A mindset of “I’m right, you’re wrong, and I refuse to be told otherwise” took hold, and violence became the norm.
Eventually, ArchiTech, Inc. developed a drug—an emotional response manipulator. It allowed us to feel again, but only briefly. For humans, the drug was administered by our mechanized superiors. It was the purest and most painless form of control: controlling one’s access to emotions.
“If we don’t have the ability to feel,” we said, “then how do we determine what’s right or wrong? Would it mean something different to them—or would we both feel the same?”
A metallic clang rang out from the front door of our pod—almost immediately after the thought had occurred.
Had the SiLens reported us? That was most likely the case.
“Subject 68-433,” a mechanized voice echoed, “has been found guilty of the crime of using thought-speech and is hereby under arrest for disobeying the law.”
“We did nothing wrong,” we pleaded, still high on Denial.
“Right or wrong is irrelevant. Law is law,” said the machine. “Comply or be removed from the Mainframe.”
There wasn’t any point in arguing. Either we obeyed the authorities, or we were removed from The Collective and the Mainframe—the very networks that connected us to everything.
Through the SiLens, everything was linked.
And if we wanted to keep that connection, we were forced to comply.
And so—we would.