
The Show So Far


October 4, 2018 by dratner

Lies, Damn Lies, and Economic Statistics

The title of this article is a minor adjustment of a favorite saw of Mark Twain’s, but I can’t imagine he’d begrudge me the use since the intent is the same as the original. Every time I read business or economic news, I become more frustrated by the ubiquitous (and to some degree iniquitous) use of three economic indicators. Here’s what they are, why they’re wrong, and simple ways to fix them:

GDP

GDP is meant to measure the overall size of the economy and, for some reason, has become a proxy for its overall health. If GDP is growing, things are good, right? Well, no. GDP can go up while the business sector is shrinking, so long as the gap is made up by government spending (even deficit spending). After all, it’s just defined by this simple formula:

GDP = C + I + G + (Ex – Im)

Where C is consumer spending, I is business investment, G is government spending, and (Ex – Im) is net exports. But could it really go up while business is in decline? Sure. GDP growth in the US over the years 2012–2015 averaged around 3.57%. Over roughly the same period, the government deficit as a percentage of GDP averaged 4.5% and the trade balance remained negative. That means that if you subtracted out the “unsustainable,” deficit-financed part of government spending, you’d end up with net negative GDP growth.
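
To make the arithmetic concrete, here’s a minimal sketch in Python. The component values are invented for illustration; only the 4.5% deficit share comes from the figures above:

```python
# Minimal sketch of the GDP identity and the "subtract the deficit" point.
# Component values are invented placeholders, not actual BEA data.

def gdp(consumption, investment, government, exports, imports):
    """GDP = C + I + G + (Ex - Im)."""
    return consumption + investment + government + (exports - imports)

# A toy economy, in billions of dollars.
this_year = gdp(consumption=14_000, investment=3_500,
                government=3_800, exports=2_300, imports=2_900)
last_year = 20_000

print(f"Reported growth: {(this_year - last_year) / last_year:.2%}")  # 3.50%

# Strip out deficit-financed government spending (4.5% of GDP) and the
# same economy shows negative growth.
adjusted = this_year - 0.045 * this_year
print(f"Growth net of deficit: {(adjusted - last_year) / last_year:.2%}")  # -1.16%
```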

But there’s an even bigger problem. GDP can go up even while wealth and income consolidate among a smaller and smaller number of people. So if it’s being used to inform policy about when stimulus is needed or how taxes should be raised, lowered, or structured, it gives a completely false impression.

Still, GDP is a single, easily accessible number, so you can see why it would be useful for sound bites and news articles, right? Maybe, but not really, since there are better single numbers that serve the same purpose. Even just using median income (not average income) would be much better if you want to know how average families are actually faring.

Unemployment

This is another frustrating one. The unemployment rate is supposed to show how easy it is for people to get jobs. Since there’s always some natural and seasonal churn in the economy, it constantly needs to be adjusted, and a rate of around 4–5% is considered “full employment.” But this one is broken, too. It counts only those actively seeking work as a percentage of the labor force, so it doesn’t take into account the long-term unemployed (who no longer file for benefits) or those who have given up looking because it’s just too hard.

A much better statistic is the labor force participation rate: the ratio of people who are working or looking for work to the overall working-age population. Admittedly, it still doesn’t capture undocumented workers, cash employment, and various other categories, but it’s a darn sight better than the unemployment rate if you only get one data point for measuring the employment situation. It would take some time to adjust, since you’d need to know what “healthy” looks like, but so did the unemployment rate (it isn’t obvious that 4% rather than 0% is “full” employment).
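
A toy example makes the difference visible; all of the population numbers below are invented purely for illustration:

```python
# Toy working-age population, in millions. Invented numbers for illustration.
employed = 120.0
actively_seeking = 6.0    # the only people the headline rate counts as unemployed
gave_up_looking = 10.0    # discouraged workers: invisible to the headline rate
working_age_pop = 200.0

labor_force = employed + actively_seeking
print(f"Unemployment rate:  {actively_seeking / labor_force:.1%}")  # 4.8%, "full employment"
print(f"Participation rate: {labor_force / working_age_pop:.1%}")   # 63.0%

# If the discouraged workers resumed searching, the headline rate would
# jump to ~11.8% even though nothing real changed in the economy.
print(f"Rate if counted:    "
      f"{(actively_seeking + gave_up_looking) / (labor_force + gave_up_looking):.1%}")
```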

The Dow Jones Industrial Average

The Dow is the most commonly reported stock market index, and it’s also the worst. No serious analyst thinks the Dow is a good measure of the market, for a few reasons.

First, it’s small. It includes only a limited number of stocks, and those are the biggest companies out there. If you believe that small business can have ups and downs independent of big business (and the data would support you if you do), the Dow provides a very incomplete view.

Second, it’s price weighted. Each component of the index is weighted by share price rather than by market capitalization. This is a problem: price per share is market cap divided by the number of shares outstanding, and the number of shares is essentially arbitrary, since a company can choose how many shares to have outstanding through splits and reverse splits. Net net, a company can have a disproportionate impact on the Dow simply because it has a high share price, even if it’s a relatively small part of the overall economy.
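
Here’s a minimal sketch of the difference using two made-up companies. (The real DJIA also divides by a special divisor; that’s omitted here since it doesn’t change the weighting logic.)

```python
# Price weighting vs. cap weighting, with invented companies.
stocks = {
    # name: (share_price, shares_outstanding_in_billions)
    "BigPriceCo": (400.0, 0.1),   # market cap  $40B
    "HugeCo":     (50.0, 10.0),   # market cap $500B
}

total_price = sum(price for price, _ in stocks.values())
total_cap = sum(price * shares for price, shares in stocks.values())

for name, (price, shares) in stocks.items():
    print(f"{name}: price weight {price / total_price:.0%}, "
          f"cap weight {price * shares / total_cap:.0%}")
# BigPriceCo: price weight 89%, cap weight 7%   <- dominates despite being tiny
# HugeCo:     price weight 11%, cap weight 93%
```

A 4-for-1 split of BigPriceCo would halve its price weight overnight without changing anything economic, which is exactly the arbitrariness described above.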

For all that, using the Dow to report on the market is probably less of a big deal than GDP or unemployment, since the Dow and broader indexes like the S&P 500 move together roughly 98.5% of the time and never, to my knowledge, move in completely different directions over any significant time frame. Having said that, it’s also the easiest to fix, since indexes like the S&P 500 are already widely reported and are much more accurate.


Filed Under: Economics, Politics Tagged With: dow, gdp, unemployment

September 27, 2018 by dratner

Kavanaugh and the Prisoner’s Dilemma

In the discussion of the Kavanaugh hearings, much attention has been paid to fairness, justice, burden of proof, and so on, but at the end of the day a confirmation process is not a court trial. While we should certainly wish for and expect our elected representatives to take collective action in the best interests of the country (as they arguably have in the majority of past court confirmations), it’s important to remember that it’s a purely political process with purely political drivers. I am not in any way making a statement about the merits of the case (which perhaps belongs in court), only suggesting a different lens for explaining the senators’ behavior beyond just the facts in front of them.

In this case, the lens is the Prisoner’s Dilemma, a classic thought problem in game theory and economics. Very briefly, it explains why two people (or organizations, or parties) might not work together even when it’s in their common interest to do so. In its classic formulation, it explores the case of two prisoners – let’s call them D and R. Each has been arrested for a crime and they are being questioned separately. If both stay silent, they both go free, since there is insufficient evidence to hold them. If D rats out R while R remains silent, D goes free and D’s testimony convicts R (and vice versa). If both accuse each other, they both go to prison, but on a lesser charge (a one-year sentence each instead of the three-year sentence the accused would have gotten if only one of them had been ratted out). It’s clearly in the interest of both to remain silent so they both go free, but, absent any coordination or trust, often one of them will rat the other out to avoid the worst case of being accused while remaining silent. (Whether the accusation is justified is irrelevant for the sake of the exercise.)

While this sounds narrow, the principle has been generalized to explain behavior as varied as international relations and sports. I think it also applies to Supreme Court picks.

I’m not suggesting that it is in the common interest of Republicans and Democrats to nominate Brett Kavanaugh, but rather that it’s in their common interest that Supreme Court appointments be fair and relatively smooth. After all, even when dealing with an opposing party’s nominee, everyone knows that at some point the shoe will be on the other foot, so why make the process toxic? Again, this goes beyond the merits of the case, since resistance was in full force even before the sexual assault allegations became known. I believe the answer lies with a previous nominee, Merrick Garland.

In a fun, approachable piece on the Prisoner’s Dilemma, NPR’s Planet Money showed how, if two parties play the game over and over again, different strategies yield different results. For example, if D is usually silent, R will learn that his safest path is to accuse D, since that guarantees avoiding the worst case of being accused while remaining silent. In fact, the strategy that wins in most simulations is “generous tit-for-tat” – retaliate most of the time, but every so often randomly forgive in order to avoid indefinite escalation.
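
For the curious, here’s a minimal simulation of the repeated game using the payoffs from the setup above (both silent: free; one accuses: the accuser goes free and the silent one gets 3 years; both accuse: 1 year each). The 10% forgiveness rate is an arbitrary choice for illustration:

```python
import random

# Years in prison for (my_move, their_move); lower is better.
# "S" = stay silent, "A" = accuse.
YEARS = {("S", "S"): 0, ("S", "A"): 3, ("A", "S"): 0, ("A", "A"): 1}

def always_silent(opponent_history):
    return "S"

def always_accuse(opponent_history):
    return "A"

def generous_tit_for_tat(opponent_history, forgiveness=0.1):
    """Retaliate if accused last round, but occasionally forgive at random."""
    if opponent_history and opponent_history[-1] == "A" and random.random() > forgiveness:
        return "A"
    return "S"

def play(strategy_d, strategy_r, rounds=1000):
    d_moves, r_moves = [], []      # each side sees only the other's past moves
    d_years = r_years = 0
    for _ in range(rounds):
        d, r = strategy_d(r_moves), strategy_r(d_moves)
        d_years += YEARS[(d, r)]
        r_years += YEARS[(r, d)]
        d_moves.append(d)
        r_moves.append(r)
    return d_years, r_years

print(play(always_silent, generous_tit_for_tat))  # (0, 0): cooperation holds
print(play(always_accuse, always_silent))         # (0, 3000): silence gets exploited
print(play(always_accuse, generous_tit_for_tat))  # ~(900, 1200): retaliation makes accusing costly
```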

How does this apply to the current Supreme Court process? Republicans denied Merrick Garland a confirmation vote for a year despite no substantial opposition to him as a nominee (the equivalent of ratting out in the Prisoner’s Dilemma). If Democrats then allowed even unobjectionable Republican candidates to be confirmed without a huge fight (the equivalent of staying silent), they’d be signaling to Republicans that they could safely continue to block Democratic nominees without repercussion. The successful models show that you have to retaliate (tit-for-tat) most of the time when you are ratted out, or the other side will take advantage.

But without a majority, Democrats had few options. They tried to make a stink over Neil Gorsuch’s nomination by invoking third-rail political issues but got little traction. Finding an issue of substance for attacking Brett Kavanaugh finally allowed what is, in the end, the only strategically rational behavior.

Now the question is which side will allow the process to return to normality. This is the equivalent of the random-forgiveness element in “generous tit-for-tat.” So long as both sides keep escalating, everyone gets badly hurt. Hopefully the next candidate for the court – Republican or Democrat – will be a reasonable choice who can be confirmed relatively easily, and both sides will allow a return to normalcy. If not, the Prisoner’s Dilemma will continue to dominate and both sides will try to slash and burn each other’s candidates into oblivion. It’s just economics.


Filed Under: News, Politics

September 25, 2018 by dratner

Facebook, Open the Doors

Today Attorney General Jeff Sessions will meet with states’ attorneys general to discuss his theory of bias among social media sites and what, if anything, should be done about it. Ostensibly at issue is whether social media companies like Facebook use their proprietary algorithms to suppress conservative viewpoints or boost liberal ones. It is virtually certain that names like Alex Jones and Diamond and Silk will feature in the conversation.

The importance of the conversation is difficult to overstate. Although Mark Zuckerberg himself famously underestimated social media’s power to put its finger on the political scale both here and abroad (a position he has since walked back), it’s clear from academic research, from Robert Mueller’s indictments, and certainly from my own experience at the Obama campaign and beyond that social media is not only key to elections but is also aggravating political polarization in the United States.

Before even digging into the merits of Sessions’ conspiracy theory, it’s easy to call BS on Republicans who have routinely argued that corporations, in keeping with Citizens United, have First Amendment rights to political speech and, therefore, spending. Facebook and Twitter are both corporations superficially much like those whose rights have been tested before. Social media companies could argue that maybe they’re biased, maybe they’re not, but they have a right to be any way they want.

But as a technologist who is neck deep in the media industry and has done my time in both the political world and Silicon Valley, I would argue that this superficial analysis should be challenged. I am not necessarily arguing that government regulation or prosecutorial discretion is the right way to handle the situation, but those are tools that may need to come into play – both for the giants to be able to justify potentially unprofitable changes to their boards and investors, and to get them to reexamine some of their entrenched positions.

At issue are the algorithms that determine what social media users actually see and how it gets prioritized. Feeds are no longer simple chronologies; they have become what the Internet portals of yesteryear once were – along with search, the jumping-off point for virtually all online activity, from news to purchases to their original purpose of keeping up with friends. They aren’t simply compilations where everything posted by the people you follow is arranged chronologically and interspersed with occasional ads. Instead, content is prioritized using myriad behavioral data (likes, shares, history of engagement with similar content, etc.) to achieve specific outcomes. Those outcomes can be seen as an improved user experience (cutting through the clutter) or an optimization of the social media site’s own KPIs (e.g. visits, opportunities to display ads). In Sessions’ view, they can also be seen as something more subversive: deliberate attempts to advance a political agenda.
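
No one outside these companies knows what the real ranking functions look like, but the general shape is roughly like the sketch below. Every field name and weight here is invented for illustration:

```python
import math
import time
from dataclasses import dataclass

@dataclass
class Post:
    author_affinity: float   # how often this viewer engages with this author, 0-1
    likes: int
    shares: int
    posted_at: float         # unix timestamp

def score(post: Post, now: float) -> float:
    # Hypothetical weights -- invented, not any company's actual model.
    age_hours = (now - post.posted_at) / 3600
    freshness = math.exp(-age_hours / 24)                 # older content fades
    engagement = math.log1p(post.likes + 3 * post.shares)
    return (0.5 * post.author_affinity + 0.5 * engagement) * freshness

def rank_feed(posts: list[Post]) -> list[Post]:
    """Order candidate posts by predicted interest, not chronology."""
    now = time.time()
    return sorted(posts, key=lambda p: score(p, now), reverse=True)
```

Whatever the real weights are, the point stands: some objective is being optimized, and the choice of objective is where both intent and accident can creep in.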

Personally, I do not believe there is deliberate liberal or progressive bias in these algorithms. Despite the perceived liberal leanings of a majority of people working in the tech industry, the reality is that Valley politics, especially at the billionaire executive level, continues to swing more and more libertarian. Is it possible that line engineers and technical staff have implemented some kind of tweaks without the blessing of their corporate overlords, a sort of Silicon Valley deep state? It’s very unlikely, since the prioritization of content is absolutely key to these companies’ revenue models and is under relentless internal scrutiny.

But is it possible that bias still exists in spite of this? Yes, for two reasons. The first is the manipulation of these systems by outside actors. Certainly marketers around the world are obsessed with how to optimize their messages for Facebook, and so are political operatives. Both Mueller and Cambridge Analytica whistleblower Chris Wylie have described ways this can be done that go well beyond marketing. And since in politics many campaigns and organizations seek to discredit each other or sow discord as much as promote their own candidate, manipulation is actually significantly easier than conventional marketing.

The second reason has to do with technology. We don’t know much about the specifics of the algorithms these companies use, but we can look at the foundational technologies they are doubtless based on and make some inferences. For example, the algorithms could be purely rules-based, but that’s very unlikely given Facebook’s lauding of the importance of behavioral targeting, the number of variables involved, and the quantity and variety of data in play. More likely it’s some combination of rules, collaborative filtering, and machine learning, the last of which is a self-proclaimed area of expertise for Facebook.
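
As one point of reference, the collaborative-filtering piece might look something like this toy user-user version – a sketch of the general technique, not anything Facebook has published:

```python
import numpy as np

# Rows = users, columns = pieces of content; 1 = engaged with it, 0 = didn't.
# Toy engagement matrix, invented for illustration.
engagement = np.array([
    [1, 1, 0, 0],
    [1, 1, 1, 0],
    [0, 0, 1, 1],
])

def recommend(user: int) -> np.ndarray:
    # Cosine similarity between this user and every other user.
    norms = np.linalg.norm(engagement, axis=1)
    sims = engagement @ engagement[user] / (norms * norms[user] + 1e-9)
    sims[user] = 0.0
    # Score each item by similarity-weighted engagement of the other users,
    # then hide items this user has already seen.
    scores = sims @ engagement
    scores[engagement[user] == 1] = -np.inf
    return np.argsort(scores)[::-1]

print(recommend(0))   # item 2 ranks first: the most similar user engaged with it
```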

While ML is certainly a good tool for creating customized experiences for billions of users, it is itself subject to hidden biases. ML models are trained on large sets of real-world data, and the output they produce is subject to the influences and selection biases of the input data. For example, you could create a machine learning model to help screen for great candidates in a pool of job applications, using the resumes of your past employees, scored by performance, as input. In theory, this should result in a model that figures out which applicants are most likely to succeed at your company. But it could also perpetuate hidden biases – even without knowing the gender of applicants, it might continue to pick majority-male candidates based on correlated signals like first name (e.g. “John”), sports, or even the biased makeup of other companies you’ve successfully poached from.
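
A tiny synthetic version of that resume example shows the mechanism. The data is invented and the model is an off-the-shelf logistic regression; the point is only that a correlated proxy can carry the bias even when the sensitive attribute is withheld:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000

# Synthetic "historical hires." Gender is never given to the model, but a
# correlated proxy (say, played football) is.
is_male = rng.random(n) < 0.5
played_football = (rng.random(n) < np.where(is_male, 0.6, 0.05)).astype(float)
skill = rng.normal(size=n)

# Biased historical labels: past managers favored men, independent of skill.
hired = (skill + 1.5 * is_male + rng.normal(size=n) > 1.0).astype(int)

X = np.column_stack([skill, played_football])   # note: no gender column
model = LogisticRegression().fit(X, hired)

print(dict(zip(["skill", "played_football"], model.coef_[0].round(2))))
# The proxy gets a large positive weight: the model has "learned" the old
# bias without ever seeing gender.
```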

ML researchers have shown that it’s crucially important to understand not just how successful algorithms are at their jobs but also how they work. For example, one team demonstrated the ability to fool a prototype autonomous car’s vision system into thinking a stop sign was a 45-mile-per-hour speed limit sign, using what are now called adversarial images. When they could manipulate the image directly, the changes weren’t even perceptible to the human eye. The point is that the car’s recognition system was accurate at detecting stop signs, but it wasn’t just looking for red octagons. It had learned something else that happened to be working for it.
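
The best-known digital variant of this attack is the fast gradient sign method (a relative of, though not identical to, the technique used in the stop-sign work). Here’s a toy version against an invented linear classifier:

```python
import numpy as np

# Toy stand-in for an image classifier: a fixed linear model over the
# pixels of an 8x8 "image." Everything here is invented for illustration.
rng = np.random.default_rng(1)
w = rng.normal(size=64)

def predict(x):
    return "stop sign" if x @ w > 0 else "speed limit 45"

x = 0.2 * w                      # an image the model classifies as a stop sign
print(predict(x))                # -> stop sign

# Fast gradient sign method: push every pixel a small, uniform step in the
# direction that lowers the "stop sign" score fastest. The gradient of
# (x @ w) with respect to x is just w.
margin = x @ w
epsilon = 1.1 * margin / np.abs(w).sum()   # just enough to cross the boundary
x_adv = x - epsilon * np.sign(w)

print(predict(x_adv))                      # -> speed limit 45
print(f"max per-pixel change: {epsilon:.3f}")
```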

It is likely that if Facebook and the others are using machine learning in their prioritization algorithms, those algorithms could well be subject to either hidden bias or direct manipulation. While the companies would certainly do their best to look for these things, outside scrutiny could be extremely helpful and would also let us know what the companies are actually optimizing for.

The companies might claim that these algorithms represent some kind of secret sauce and that exposing them would be very damaging, but that doesn’t stand up to scrutiny. Facebook is a monopoly or near monopoly because of network effects, not the power of its prioritization algorithm. In fact, most people do not have a positive perception of the product, as reflected in its net promoter score (NPS) of -21 (Twitter’s NPS of 3 is a little better, but still terrible). No competing company would emulate the algorithms with the intent of recreating a customer experience like that.
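
For reference, NPS is computed from 0–10 “would you recommend us?” survey scores: the percentage of promoters (9–10) minus the percentage of detractors (0–6). A quick sketch with invented scores:

```python
def nps(scores):
    """Net promoter score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(s >= 9 for s in scores)
    detractors = sum(s <= 6 for s in scores)
    return 100 * (promoters - detractors) / len(scores)

print(nps([10, 9, 9, 8, 7, 6, 5, 3, 2, 1]))   # -> -20.0
```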

The more likely exposure would be in how the algorithms promote ads, but even that could be taken out of the picture for this purpose. Just a public understanding of how the organic news feed is prioritized would be an enormous win for transparency.

The last significant pushback would be that opening the algorithms would enable manipulators like Cambridge Analytica. That is almost plausible, but it relies on the dubious principle of security through obscurity. The key to securing these systems lies in trust models, in identifying bad actors, and in scrutiny of trending content, not in keeping the algorithms secret.

Social media companies have resisted classification as media companies or public utilities, although they actually have a lot in common with both. Opening their algorithms for prioritizing even non-commercial content to public scrutiny could do a huge amount to restore trust, might actually improve their products, and would be one of the least invasive remedies available. I’m not minimizing how hard it would be – this isn’t just code; large samples of user data would also need to be anonymized, sanitized, and made available. But I think it’s the best way forward. If there’s nothing to hide, it would defang Jeff Sessions and the attorneys general. If there is something hidden (perhaps even from the companies themselves), it would help bring that to light.



Filed Under: News, Politics, Technology

September 24, 2018 by dratner

The News Industry in 2018

Going to the Online News Association (ONA) conference each year is a key event in Public Good’s schedule. There are few if any better places to connect with media and get a sense of the media landscape. Needless to say, a lot has happened over the last year, ranging from Facebook’s changing policies on how it treats news and how much news to serve, to the President describing the media as “fake news” and the “enemy of the people.” Behind that there’s been a backdrop of technological changes, like deep fakes and natural language generation, that have journalists concerned. Even the nature of how to report is changing – in a plenary session the speaker castigated journalists for encouraging copycat crimes by popularizing the term “incel,” while the CDC has published sadly necessary but often ignored guidelines for how to report on mass shootings. We’ve even had to revisit techniques for fighting propaganda (like the “truth sandwich”). And the ongoing saga of shrinking newsrooms and the struggle for digital revenue have not gone away.

But strangely, I left this year’s ONA more optimistic than I was last year, and there are a few reasons why.

First, the industry finally seems to be coping with and maturing in how it views social media. In the past, although I’d hear some digital producers refer to Facebook as “crack,” they’d admit that while they loved the traffic, they didn’t quite know how they were going to make money on it. This year, they’ve begun to come to grips with the fact that they have no control over that spigot and that those who become too dependent can suffer greatly (as seen in the collapsing viewership of a lot of digital-first publications). To combat their form of social media addiction, publishers are starting to think in terms of reader loyalty instead of just clicks. While that change in thinking is only beginning to wend its way into their products, it’s a crucial first step. It feels like history is repeating the transition from William Randolph Hearst’s sensationalism and selling by the headline to the New York Times’ model of selling by subscription and being accountable to readers for keeping them informed.

My second reason for optimism is the realization that although a lot of consumers are perfectly happy in their bias-reinforcing echo chambers, there’s a growing contingent of people who rely on the news to make choices that directly affect their welfare. However much a trader or bank boss might like, say, political coverage that supports their views on tax cuts, they also want reliable coverage and analysis that tells them what is actually going to happen so they can plan for it. This movement of pragmatists is starting to be felt, especially by smaller subscription-based sites and newsletters.

More than anything, though, I felt a sense of determination and purpose at the event. There was much less grandstanding than in the past, especially by newer entrants who were (sometimes briefly) experiencing rapid growth and attributing it to things other than social media savvy.


Filed Under: News Tagged With: news, ONA, ONA2018

September 21, 2018 by dratner

Machine Learning and You

On September 14th, I gave a talk at the Online News Association conference. I learned a huge amount from the journalists present, but perhaps the keenest pain point I saw was the gap in knowledge between folks who were already developing their own machine learning models and those who were still trying to understand the basics of this new technology.

In the talk I tried to introduce ideas ranging from the importance of how recommendation engines are optimized, to the sophistication of state-of-the-art natural language generation, to the emergence of deep fakes, among other topics. I tried to remain balanced between potential and threats. Enjoy the talk and feel free to post any comments. I apologize that the mic wasn’t feeding the video camera, so the audio is poor.

The slides are here.

ONA is a great organization doing important work.


Filed Under: News, Technology

September 21, 2018 by dratner

Nanotechnology: Another Look

Every so often it’s interesting to go back and reflect on some work from a while ago. I did that recently with the 2002 book on nanotechnology that Mark and I wrote, and, on the whole, we did pretty well.


Filed Under: Technology
