News — At The Edge — 5/12

Written by doch_one | Published 2018/05/11
Tech Story Tags: tech-newsletters | artificial-intelligence | economics | privacy | future


Three sets of articles that fit our brave new world

  • Post-capitalist society — AI ends capitalism — in some form seems certain.
  • Central future issue — privacy versus security — defines daily life ahead.
  • Issues bleeding into the future — disinformation war — must be controlled.

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Post-capitalist future

AI will spell the end of capitalism —

“If AI remains under the control of market forces, it will inexorably result in a super-rich oligopoly of data billionaires…[but i]f AI rationally allocates resources [it]…can supplant the imperfections of ‘the invisible hand’ while fairly sharing the vast wealth it creates….

[T]he inevitability of mass unemployment and the demand for universal welfare will drive the idea of…nationalizing AI….

‘Our responsibility is to our shareholders,’ the robot owners will say…[and they have] been able to get away with their social irresponsibility because the legal system and its loopholes…are geared to protect private property above all else….

[It is] the very pervasiveness of AI that will spell the end of market dominance…[What’s] more, regulation of private companies will become a necessity to maintain…stability in societies….Laissez-faire capitalism as we have known it can lead nowhere but to a dictatorship of AI oligarchs [and]…leading to a battle between robots for market share that will surely end [badly]….Like nuclear and biochemical weapons, as long as they exist, nothing…can ensure society’s safety….

AI increasingly enables the management of complex systems [and]…presents, for the first time, a real alternative to the market signals that…can lead the way toward this new stage of human development.” https://www.washingtonpost.com/news/theworldpost/wp/2018/05/03/end-of-capitalism/?noredirect=on&utm_term=.82710684cf90

Economists focus too little on what people really care about —

“[Many] economists often assume that prices are all anyone needs to know [which]…biases many of their conclusions, and limits their relevance….

[T]hey can pay more attention to the way focusing on ‘material well-being’, as determined by the ‘measuring rod of money’, influences and constrains their work….

Not every dollar is of equal value…[so] if two economists…bid on an apple, the winner would desire the apple more…[yet] greater wealth means that his bid is less of a sacrifice….

But the profession is surprisingly casual about its potential implications: for example…[GDP] measures of output include spending on cigarette advertisements, napalm and the like, while omitting the quality of children’s health and education….Social costs such as pollution are omitted…[and] unpaid work in…2010 would have raised its value by 26%….

Expanding the reach of markets is not just a way to satisfy preferences more efficiently. Rather, it favors market-oriented values over others….

[The] European Commission and the World Bank now publish data series presenting a more comprehensive picture of social health….

Price is a poor measure of the value of digital goods and services…. Technological progress promises to create ever more situations [that]…conflict with narrowly material ones.” https://www.economist.com/news/finance-and-economics/21741563-fourth-our-series-professions-shortcomings-economists-focus-too

Capitalism and Post-Capitalism — The Whole Truth & Nothing But (medium.com): “As a dog that regularly bites, it is fair to say we have a love-hate relationship with capitalism. This lends itself to…”

Central future issue

Personal privacy vs. public security: fight! —

“[T]hat privacy is an important part of personal security is [relatively new]…while [the idea that] the need for public security…must be guarded [is undisputed]….

Privacy has long been a luxury…so when technological security is treated as a trade-off between public security and privacy…the primacy of the former is accepted…[like] demands for ‘golden key’ back doors so that governments can access encrypted phones…[yet] even if a perfect…key with no vulnerabilities existed…[it] would still be morally complex.

Consider license plate readers that…track the locations of most cars…in near-real-time with remarkable precision….[Or] how the Golden State Killer was identified, by trawling through public genetic data…. Public security — catching criminals, preventing terror attacks — is far more important than personal privacy. Right?…

[Also] corporate security…[is] assumed to be far more important than personal privacy….

Public security is essential; privacy is nice-to-have….Except this dichotomy [is]…often promulgated by people who should know [it’s]…completely false…[because] we’re talking about the collection and use [by]…governments and corporations accumulating massive amounts of highly personal information from billions of people…[and that is itself] a massive public security problem….

  • [First] the lack of privacy has a chilling effect on dissidence and original thought…and in this era of cameras everywhere, facial recognition, gait recognition, license plate readers, Stingrays, etc., your every move can be watched…. Are we so certain that all of our laws are perfect…and that we will respond to new technologies by immediately regulating them with farsighted wisdom? I’m not….
  • [Second] privacy eradication for the masses…[but] privacy for the rich, will…perpetuate status-quo laws / standards / establishments, and encourage parasitism, corruption, and crony capitalism…[and] ‘selective enforcement of unjust laws’…[for] ‘anyone who challenges the status quo’?….
  • [Third] technology keeps getting better and better at manipulating the public based on their private data…[so] privacy is no longer a personal luxury….When constant surveillance…[systematically] dissuades people from…expressing contentious thoughts, privacy is no longer a personal luxury. And that, I fear, is the world we may live in soon enough, if we don’t already.”

https://techcrunch.com/2018/05/06/personal-privacy-vs-public-security-fight/

How Do You Control 1.4 Billion People? —

“[So] you accidentally defaulted on a phone bill [which]…affects your credit score [so]…hard to get a loan…[and] jokes about Marco Rubio on Twitter…will algorithmically define you as…likely to default on social obligations…[and when] close friends miss their student loan repayments…your social circle is now all ‘discredited’….

[China’s] ‘social credit scheme’ will become mandatory for all residents by 2020…[and] rated according to their ‘commercial sincerity,’ ‘social security,’ ‘trust breaking’ and ‘judicial credibility’…[as] a data-driven system that automatically separates the good, the bad, and the ugly…predicts ‘unprecedented’ levels of dictatorial surveillance….

[Some] activities, like late-night web browsing or buying video games, could see one’s rank downgraded for ‘irresponsible’ behavior….[It’s] a form of high-tech Stalinism…in which those who toe the line are kept doped with rewards…[while] dissenters, dropouts or deadbeats would be effectively excommunicated from mainstream society….

Social credit will align with Communist Party policy…[and] only the security services have access to…every scrap of information the state keeps on them, from exam results to their religious and political views….

[But] few have considered how vulnerable the system is to the corruption, con artistry, and incompetence…[errors] and some lenders deliberately misrepresent user information….

[Still] social credit ratings ‘will also create a greater disincentive to engage in anti-social behavior’….(So, too, will offering ‘insincere’ apologies for defaulting on loans; one must not only learn to grovel, but like it.)

To work effectively…requires Chinese citizens to place complete trust in…their unaccountable government and vast cartel-like corporations….A secretive scheme that proposes to (literally) codify credibility within a society that inherently lacks any is more likely to undermine public trust than instill it.” https://newrepublic.com/article/148121/control-14-billion-people

How Does This End Well? (medium.com): “The current encryption privacy-security debate has become scary silly. As with many polarizing issues, both sides in…”

Issues bleeding into the future

Bots Aren’t the Enemy in the Information War — We Are —

“Russian interference…[in the] 2016 election…finished off both social media’s innocence and traditional media’s authority…[so] Americans, as of now, have nowhere else to turn…[and are] stalled in the data smog that hangs over social media and search engines…[with] the sanctity of our reason…routinely violated online….

‘Computational propaganda,’ as the human-machine hybrid campaigns are known, has been described as a way of ‘hacking people’….

First, the crime is in the software…[and] the information war includes seasoned generals….The weapons are hybrids too….People can whitewash buggy bot-speak by giving it a human sheen in a retweet…that fires people up, so botnets can ratchet up the velocity of the most incendiary memes….

The content didn’t need to be accurate or fair to be effective; it just needed to seem human…[That these] campaigns involve masquerade, deception, and anthropomorphism — the disguising of robots as people — is part of why….

Americans are disinclined to see the internet and the nation as under siege. If we had swollen glands and bloody vomit, we’d accept a diagnosis of anthrax poisoning, but no one likes to see herself as cognitively vulnerable….‘No one likes to be told they’ve been duped’….[The] MIT study of false news made it clear that bots have equanimity when it comes to contested stories, while humans decisively prefer to spread lies over truth. In particular, we appear to like and share the lies that shock and disgust….

If so, there’s no way around this problem but through it. Of course, propaganda should be marked, regulated, and debunked. But at the same time, we need to understand our fragility….More even than robots, our most ancient proclivities may be our undoing.” https://www.wired.com/story/social-media-makes-us-soldiers-in-the-war-against-ourselves/

The Biology of Disinformation —

“Political campaigns have always…come down to propaganda: the leverage of social and psychological biases to promote a particular point of view…[playing] not just on the emotional makeup of the individual, but the social context in which individuals live….

Regulation, competition, and a free press are all meant to protect the public from such manipulation….But a new breed of high-tech tools for political persuasion has emerged…[with] viruses, bots, and computational propaganda to manipulate minds and sway public opinion, often in secrecy, and on a scale unimaginable to previous media watchdogs….

[Some] suggest new, good algorithms to counteract the effects of the bad ones….The specter of widespread computational propaganda that leverages memetics through persuasive technologies looms large.

Already, artificially intelligent software can evolve false political and social constructs highly targeted to sway specific audiences. Users find themselves in highly individualized, algorithmically determined news…feeds, intentionally designed to: isolate them from conflicting evidence or opinions, create self-reinforcing feedback loops of confirmation, and untether them from fact-based reality.

And these are just early days….[The] defenders of anything approaching ‘objective’ truth are woefully behind in dealing with computational propaganda….This is less a question of technological delivery systems and more a question of human vulnerability….

Since the current social media platforms enjoy near-monopolies…they would be likely to encourage regulations that only cement their power….[After] self-policing that has clearly failed…they are offering to upgrade their policing capability by using AIs and machine learning….[Since they] got into the current mess by depending on technologies they didn’t fully understand, we must be guarded about their ability to find solutions using technologies whose ramifications are even less predictable….

[A] more powerful approach is…[by] educating people about the facts around a particular issue or bringing very controversial but memetically potent issues…[yet] any efforts at education would be interpreted as partisan at best, and elitist and untrustworthy at worst….

[T]he inability to establish organic social bonds through digital media increases our suspicion of one another, not the medium….Memetic countermeasures only further weaponize the environment. Upgraded algorithmic filtering of dangerous memes can only result in a technological arms race…magnifying distrust while doing nothing to arm the human beings.” http://www.iftf.org/fileadmin/user_upload/images/DigIntel/SR-2002_IFTF_Biology_of_disinformation_tk_042618.pdf

I Want A New Drug (medium.com): “If you are not familiar with the great, fun song by Huey Lewis and the News, I want a new drug [1]: you should find it…”

Find more of my ideas on Medium at A Passion to Evolve.

Or click the Follow button below to add me to your feed.

Prefer a weekly email newsletter (no ads, no spam, and I never sell the list)? Email me at [email protected] with “add me” in the subject line.

May you live long and prosper!

Doc Huston

