Artificial Interruption

by Alexander Urbelis (alex@urbel.is)

Concerning the Importance of Empathy

It's difficult for me to write this column.  The reason, however, has nothing to do with the subject matter but is entirely physical.  I have two sore arms.  My left arm feels like it's been pummeled for several hours with a meat tenderizer on account of the vaccine booster.  My right arm and shoulder, I injured those in a cycling accident a few days ago.

I enjoy cycling in New York City.  It's considerably less stressful than taking a crowded subway car full of sniffling, coughing, and the recriminations from other passengers that go along with such involuntary biological expulsions.  It's excellent exercise and I generally take the long way into the office, heading west to the edge of the city where the Hudson River lies adjacent to a bike path that, with riders flowing north and south, similarly follows the western limit of the island of Manhattan.  I head east somewhere around 50th Street and, depending on which lights I hit, I'll then venture north to 56th Street, staying on that eastbound street until I arrive at the back entrance of my office tower.

It was on that last stretch of crosstown road, a few days ago, that the accident happened.  Traffic on 56th Street in the mornings can get backed up if there is a singular inconsiderate jerk double-parking.  Between those imbeciles and the proliferation of outdoor dining vestibules, there can be scant room to maneuver on a bicycle.  As I was cautiously doing so, a large white van belonging to a local bakery inexplicably and unforeseeably lurched from the traffic lane into the small track of road between dining vestibules and traffic, occupying the exact bit of space and time that I was about to utilize myself in mere milliseconds.

"No, no, no!" I shouted, hoping that the driver would miraculously lurch back to the space from whence he came.  No such luck.  I braced for impact and, after first connecting with the van via my front tire, I collided with the sideview mirror with my right flank and shoulder.  The velocity must have been considerable because the mirror seemed first to explode, drop, and then hang forlornly, swinging ever so pendulum-like with the vestiges of the momentum I had transferred to it in the collision.

Taking stock of what happened, and shocked that I did not end up on the ground, I found myself right next to the driver-side window.  Blood in full boil at this point, with colorful alliteration focused on the "F," I inquired of the driver about what he was thinking and demanded he pull over.  Rolling down the window, a skinny Asian man of about 30 years of age, possibly younger, began profusely apologizing.  With considerable ardor, I explained to the driver that I did not know if I was injured and that we must exchange information.  Shockingly, the driver ignored me, rolled up the window, and drove away.

This whole scene was unbelievable to me both for its gall and futility: the traffic was barely moving on account of the aforementioned inconsiderate jerks double-parking.  I snapped a quick photo of the license plate and easily returned to the driver's window.  Before I could say, or shout, anything, the driver rolled down the window and continued to apologize.  I reiterated my request to the driver to pull over.  It was clear that English did not come easy to this person, but it was similarly obvious that he knew what I requested and did not want to comply.  This failure of compliance prompted me to state that I would have no choice but to call the police if he continued to refuse.

At the mention of the police, the apologies became more profuse.  Palms together, fingers pointing upwards as in prayer, the driver pleaded with me, "Please, please, please, no police - I'm sorry, I'm so sorry."  The pleas continued, and I suspected what you, reader, are likely suspecting: that the driver was undocumented.  In broken English, he confirmed this suspicion.  By this time, my adrenaline was beginning to subside, and I did not feel like I was severely injured.  Calmly, I asked him, "Where are you from?"

"Myanmar," he replied.  Immediately, the gravity of the situation gripped me.  Here was a man who fled a country rife with violence, insecurity, human rights abuses, torture, and which has been on the brink of collapse for nearly a year following an intense military coup, a man who found a job with a bakery that probably paid very little, but for whom that very little meant a great deal because that job was very likely a means of support for his family still present in Myanmar.  And he was terrified that this singular mistake, on a crowded street in rush hour, could upend his life and force him back to Myanmar.

My wheel was bent and would likely need replacing.  Aside from some scraping on a leg, I was not sure if I was really unscathed or if the shock of the impact had masked other injuries.  In this moment, I resolved not to destroy this man's life, not to call the police, not to call his employer, but to reassure him that everything was okay and that he should not worry.  The gratitude was as profuse as the apologies were moments ago.

Front of mind during this split-second decision was the fact that Myanmar was the subject of a report that a friend and colleague - a human rights lawyer with the United Nations - had authored.  In addition, Myanmar had recently made headlines because of litigation in which it was claimed that Facebook turned a blind eye to tidal waves of misinformation that fomented human rights abuses, violence, and death in Myanmar, and, coincidentally, that litigation was in part based on the findings in this very report.

This 2019 U.N. report details the findings of the Independent International Fact-Finding Mission on Myanmar ("the Mission") and its earlier work that "Documented the extensive roles that Facebook and other social media platforms played in distributing [...] speech, including through language, cartoons, memes, or graphic content that fueled social attitudes, intolerance and violence against Rohingya," an ethnic minority in Myanmar.  Specifically, the misinformation and propaganda sought to dehumanize the Rohingya, comparing them to animals such as pigs, dogs, and maggots, claimed that the Rohingya were rapists, and encouraged their extermination.  The class action complaint against Facebook directly tracks these findings and argues that Facebook failed to police such harmful content and that the platform promoted real-world violence against Rohingya.

The appalling violence against the Rohingya was in fact inhuman, in the sense that it is hard to fathom human beings committing such atrocities against each other.  Extrajudicial mass killings, gang rapes of women, and the deliberate targeting and killing of children occurred.  There are reports of military helicopters attacking villagers, gunning down boats carrying refugees, as well as reports of children being thrown into burning homes.

The idea that technologies that humans built, when left unchecked, can encourage and foment shocking atrocities like these is in and of itself an abhorrent fact, anathema to the hacker spirit and philosophy.  To fight against technology being co-opted for evil is part of the reason why I jumped at the chance to become a founding member of the technology advisory board of Human Rights First; and being part of that board, and being aware of the abhorrent indifference to human life that happened in Myanmar, is what made me even more sensitive to the plight and situation of an undocumented person (Editor's Note: They are not 'undocumented,' they have plenty of documentation - in their own country.) in New York, terrified of me calling the police.

Preventing a recurrence is even more crucial given that a Reuters report from November 2021 found that the Myanmar military was using bogus social media accounts to engage in what it termed "information combat."  What then can be done within and without social media platforms to prevent this from ever happening again?

I would be remiss if I did not mention that Facebook has taken notice of the problem.  Facebook has put together teams of Burmese speakers, disrupted misinformation networks, and claimed that it has been monitoring the situation.  But all of this seems like too little, too late when the damage is done and lives have already been lost.  The weaponization of social media is nothing new.  Indeed, since the 2016 election in the United States, we have seen firsthand how dangerous unchecked, amplified misinformation and hate can be throughout social media platforms.

The recent Facebook whistleblower, Frances Haugen, has given us a firsthand look at the detailed research that Facebook itself conducted.  That research was surprising in that it concluded that exposure to certain content could harm children and other groups; but that research remained private, within Facebook, and would have remained so forever but for Haugen's revelations.

At this late stage in the game, with social media platforms wielding power, sparring with governmental regulators, and evading liability in court, is it not time that we demand that certain types of data be made available to qualified researchers around the globe, if not for the sake of research in and of itself, then for the simple fact that more eyes on this data can better forecast when digitally amplified hate will spill over into physical violence and death?

It may sound impressive when we hear from Facebook or Twitter that they have removed 10,000 posts containing misinformation and disrupted a dozen or so networks of hate groups.  But without engagement rates, those numbers mean nothing.  If those 10,000 posts had 10,000,000 pairs of eyes on them, then there are two further statistics I submit are more relevant: (i) the rate (including acceleration) at which this information spread from person to person, and (ii) the degree to which the platform's own algorithms had assisted the offending posts in maintaining or gaining velocity, resulting in exponential reach.
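To put those two statistics in more concrete terms, the sketch below (written in Python, with entirely hypothetical numbers and function names of my own invention, not any platform's actual metrics) shows how a researcher with access to a post's cumulative view counts could derive its spread velocity, its acceleration, and a rough amplification ratio.

    # Minimal sketch: given hourly cumulative view counts for one post,
    # estimate the spread rate (velocity) and its acceleration, and compare
    # actual reach against an assumed "organic" (unassisted) estimate.
    # All data here is hypothetical.

    def spread_metrics(cumulative_views):
        """Return per-interval velocity and acceleration of a post's reach."""
        velocity = [b - a for a, b in zip(cumulative_views, cumulative_views[1:])]
        acceleration = [b - a for a, b in zip(velocity, velocity[1:])]
        return velocity, acceleration

    def amplification_ratio(actual_views, organic_estimate):
        """Rough measure of how much platform ranking multiplied reach.

        organic_estimate would come from a model of unassisted sharing
        (e.g., follower-only distribution); here it is simply assumed.
        """
        return actual_views / organic_estimate if organic_estimate else float("inf")

    if __name__ == "__main__":
        # Hypothetical hourly cumulative views for one offending post.
        views = [100, 400, 1600, 6400, 25600]

        velocity, acceleration = spread_metrics(views)
        print("velocity per hour:", velocity)          # [300, 1200, 4800, 19200]
        print("acceleration per hour:", acceleration)  # [900, 3600, 14400]

        # Assume a follower-only model predicts 3,000 views over the same window.
        print("amplification ratio:", amplification_ratio(views[-1], 3_000))

A sustained positive acceleration, or an amplification ratio far above one, is precisely the kind of early warning signal that outside researchers could watch for, if only they had access to the underlying data.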

We need academics and independent researchers who have training in qualitative and quantitative analytical methodologies to have a right to certain types of social media data.  We need those outside, objective observers to monitor for early-stage indicators of the amplification of hate, to be looking out for their own countries, communities, and people and those of their neighbors, alerting and working symbiotically with social media platforms to stay ahead of the next wave of hate.  Hate is too amorphous, too much of a shapeshifter, for any single entity to globally conquer on its own, and yet this is exactly what we have been expecting of social media platforms because they have insisted on going it alone.

The research community is chomping at the bit to get access to social media data, without which I believe we will continually see the same pattern of violence: a spark of hate, amplified in a unique way, in a country to which platforms do not devote enough resources, that results in ever-increasing violence.  And as for accountability, I predict that the lawsuit brought against Facebook for failing to act in Myanmar will be dismissed on the grounds of the immunity that Section 230 affords platforms.

As for my shoulder, that will heal.  And as for my front tire, I'll be covering that cost myself without worry, a minor injury and a small price to pay in comparison to what was at stake for the driver of that van, and minuscule in comparison to the unnecessary suffering that unregulated technology and unchecked hate have occasioned upon the people of Myanmar.
