Category Archives: Original Articles

OLS, BLUE and the Gauss Markov Theorem

From left to right, Carl Friedrich Gauss and Andrey Markov, known for their contributions to statistical methods.

In today’s article, we will extend our knowledge of the Simple Linear Regression Model to the case where there is more than one explanatory variable. Under certain conditions, the Gauss Markov Theorem assures us that the Ordinary Least Squares (OLS) estimates of our regression coefficients are the Best Linear Unbiased Estimators, or BLUE (Wooldridge 101). However, if these underlying assumptions are violated, there are undesirable implications to the usage of OLS.

In practice, it is almost impossible to find two economic variables that share a perfect relationship captured by the Simple Linear Regression Model. For example, suppose we are interested in measuring wage for different people in Canada. While it is plausible to assume that education is a valid explanatory variable, most people would agree it is certainly not the only one. Indeed, one may include work experience (in years), age, gender or perhaps even location as regressors.

As such, suppose we have collected the data for multiple variables, x1,… xn, and y. Through a Multiple Linear Regression Model, we can estimate the relationship between y and the various regressors, x1,… xn (Wooldridge 71).

  • yi is the ith observation for the dependent variable
  • xki is the ith observation for the kth regressor
  • βk is the coefficient for the kth regressor
  • εi is the error term
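
Putting the bullets together, the Multiple Linear Regression Model with n regressors takes the form:

```latex
y_i = \beta_0 + \beta_1 x_{1i} + \beta_2 x_{2i} + \dots + \beta_n x_{ni} + \varepsilon_i
```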

As in the simple case, we can use the Ordinary Least Squares (OLS) method to derive the estimates for our coefficients in the Multiple Linear Regression Model. Recall, our goal is to minimize the sum of squared residuals (Wooldridge 73):
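
Written out for a sample of N observations, the objective is to choose the estimates that minimize:

```latex
\min_{\hat{\beta}_0, \hat{\beta}_1, \dots, \hat{\beta}_n} \sum_{i=1}^{N} \left( y_i - \hat{\beta}_0 - \hat{\beta}_1 x_{1i} - \dots - \hat{\beta}_n x_{ni} \right)^2
```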

If we take the partial derivatives of the above equation with respect to β0, β1, …, βn and set them to zero, the result is a system of n+1 equations. The solution to this system will produce the estimates for each βi.
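
The resulting system of normal equations can be solved directly. Here is a minimal sketch with simulated wage data, echoing the wage-and-education example earlier in the article (all variable names and numbers are illustrative, not from the text):

```python
import numpy as np

# Simulated wage data (illustrative only)
rng = np.random.default_rng(0)
n = 200
educ = rng.uniform(8, 20, n)    # years of education
exper = rng.uniform(0, 30, n)   # years of work experience
eps = rng.normal(0, 2, n)       # error term
wage = 5 + 1.5 * educ + 0.3 * exper + eps

# Design matrix: a column of ones for the intercept, then the regressors
X = np.column_stack([np.ones(n), educ, exper])

# Setting the partial derivatives to zero yields the normal equations
# (X'X) beta = X'y; solving them gives the OLS estimates
beta_hat = np.linalg.solve(X.T @ X, X.T @ wage)
print(beta_hat)  # close to the true values [5, 1.5, 0.3]
```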

In general, the OLS method for estimation is preferred because it is easy to use and understand. However, simplicity comes with its limitations. Ordinary Least Squares provides us with a linear estimator of the parameters in Multiple Linear Regression. In other words, we obtain a column vector of estimates for the βi that can be expressed as a linear function of the dependent variable y. Like all other linear estimators, the ultimate goal of OLS is to obtain the BLUE. Let us first agree on a formal definition of BLUE. On one hand, the term “best” means that the estimator has the lowest variance; on the other, unbiasedness refers to the expected value of the estimator being equivalent to the true value of the parameter (Wooldridge 102).

We now turn our attention to the Gauss Markov Theorem, which guarantees that the Ordinary Least Squares estimator is BLUE under certain conditions, colloquially referred to as the Gauss Markov Assumptions. It is important to note that the first four ensure the unbiasedness of the linear estimator, while the last one preserves the lowest variance (Wooldridge 105).

  1. Linearity in Parameters
  2. Random Sampling
  3. No Perfect Collinearity
  4. Exogeneity
  5. Homoscedasticity

The first two assumptions are self-explanatory; the parameters we are estimating must be linear, and our sample data is to be collected through a randomized, probabilistic mechanism. The third condition, no perfect collinearity, ensures that the regressors are not perfectly correlated with one another. An example of this is including both outcomes of a binary variable in a model. Suppose we are interested in official language preferences: if we were to add English and French as regressors, the model would exhibit perfect collinearity because someone who prefers English does not prefer French at the same time. Mathematically, if both were included as indicator variables, the two columns would always sum to one, making them perfectly collinear with the intercept, so we could not separately identify their effects. Exogeneity means that the regressors cannot be correlated with the error term. The opposite of this is endogeneity, and examples of it include omitted variable bias, reverse causality, and measurement error. The fifth and final assumption is homoscedasticity, which means the variance of the error term must be constant no matter what the values of the regressors are.
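
The dummy variable trap described above can be verified numerically. A minimal sketch with simulated data (variable names and sample size assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
english = rng.integers(0, 2, n)  # 1 if the person prefers English, else 0
french = 1 - english             # 1 if the person prefers French, else 0

# Including the intercept and BOTH dummies: english + french = 1 for every
# observation, so the columns of X are linearly dependent
X = np.column_stack([np.ones(n), english, french])

# X'X is singular, so the normal equations have no unique solution
print(np.linalg.matrix_rank(X.T @ X))  # 2, not 3: perfect collinearity

# Dropping one dummy removes the collinearity
X_fixed = np.column_stack([np.ones(n), english])
print(np.linalg.matrix_rank(X_fixed.T @ X_fixed))  # 2 = full column rank
```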

Admittedly, no one will ever walk up to you and ask “What are the conditions for the Gauss Markov Theorem?”. However, as the first article alluded to a few weeks ago, we need to use econometric models with discretion. To put the importance of these assumptions into perspective, consider this analogy. The criminal code is in place so that the citizens of our country can function well together without harming one another. A police officer will never come up to you and ask you to recite the criminal code, but when you start violating the laws, you will likely find yourself in trouble. It is important for us to identify when we are breaking the law, and find methods to avoid doing so. The same can be said using OLS. By learning the five assumptions, we know of possible issues that we may run into when performing linear regression.

In summary, let’s end the discussion of OLS with more insights on the Gauss Markov Theorem. If all of the conditions simultaneously hold, we know that OLS is BLUE. In later articles, we will discuss specific ways to mitigate violations of these conditions. For example, when endogeneity is present (the fourth assumption is violated), our OLS estimator will be biased. We will talk about methods to solve this issue, such as performing an Instrumental Variable Estimation to produce unbiased estimates.

 

REFERENCE

  1. Wooldridge, Jeffrey M. Introductory Econometrics: A Modern Approach. 5th ed. Mason, OH: South-Western Cengage Learning, 2013. Print.

Newsonomics: Trends in Competition and Bias in the News Industry

Allegedly the most empirical civilization of all time, our Information Age would no doubt serve its audience righteously in their attempts to obtain knowledge. But take a look at the Google Ngram Viewer, a tool that charts the frequency of a word in printed sources over time, for the word epistemology:

Google Ngram Viewer of ‘epistemology’

As the study of ‘how we know’, epistemology distinguishes justified beliefs from opinion. Since the dawn of the Information Age in the 1990s and the advent of the Internet, the use of this word, and implicitly its application to our lives, has been in decline. But what does this trend mean for the news media industry in terms of how news firms compete?

Firstly, considering audience trends in the US, newspapers have decreased in circulation by 7%, while the average viewership for prime-time news has increased by 8% [1]. Competition in the cable TV market has increased because of the reduction of regulatory controls during the 1980s, subsequently incentivizing news firms to enter this market [2]. This drew much praise from academics and professionals in the field who hold that the ‘persuasion game’, between firms in the market who vie for news scoops and larger readerships, will always yield the truth. Given that at least one news source propagates the truth and consumers read all sources, the truth will eventually be known by all readers, as all firms bend over time to the most empirical facts and information as presented in the truthful news source, since each firm’s reputation is on the line [3]. For example, a Democrat newspaper reveals a scandal concerning a Republican, and a Republican newspaper initially denies it. However, assuming the Democrat newspaper has the best facts, the Republican newspaper eventually concedes to some of the allegations because its reputation is at stake: its readership, who also read the Democrat newspaper, begin to learn the truth.

Now, let’s complicate our ‘persuasion game’ by introducing a bias on the supply side of the news market. Naturally, news firms are incentivized to be the first to find and publish ‘scoops’, news stories that are desirable to the public. However, a firm might be suppressed as a result of government intrusion. Consider the following variables: government bribe B, firm revenue for story circulation R, the number of firms N, and the value to the government of suppression V. For a bribe to work, it must be that B ≥ R. Further, B ≤ V/N, since the value of suppression will be distributed between the number of firms. Therefore, the suppression equilibrium is V/N ≥ R, which indicates that a greater number of firms, or increased competition, will decrease the likelihood that the story is suppressed. Additionally, as firms drop out and avoid a particular story, remaining firms have a growing incentive to publish as their potential audience grows. Human rights violations in Iraq’s Abu Ghraib prison and the leak of the ‘Pentagon Papers’ are examples of stories that were suppressed by government intrusion after their initial publication [3].
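
The suppression condition above can be sketched in code. All numbers below are illustrative, not from the article:

```python
# Variables from the text: per-firm revenue R, number of firms N,
# and the government's value of suppression V.

def story_suppressed(V, R, N):
    """Suppression is an equilibrium when the affordable per-firm bribe
    covers each firm's revenue from publishing: V / N >= R."""
    return V / N >= R

V, R = 100.0, 10.0
for N in (2, 5, 10, 20):
    print(N, story_suppressed(V, R, N))
# Once N grows past V / R = 10 firms, the bribe can no longer match
# each firm's revenue, and the story gets published.
```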

More often in our Information Age, a bias is introduced on the demand side of the news market. Consumers have a preference for news sources that confirm their prior beliefs [4]. When the main source of news was newspapers, readers could pick up multiple papers with different biases to get an objective view of all sides of an issue, hence the success of the ‘persuasion game’ in yielding truth. However, with the rise of prime-time TV news coverage, and readers turning to other sources on the Internet, like Facebook, it became simple and easy to appease one’s own bias. Given that consumers have a psychological urge to confirm and further entrench their beliefs [5] and that news quality is increasingly being associated with whether or not a belief is confirmed [3], it comes as no surprise that news firms cater to their audience by bias-targeting. Thus, considering the N-gram presented above, a decline in empiricism can be causally related to the advent of Internet news and the drinking of the Kool-Aid, en masse.

Bias-targeting is ever present in the strategy of prime-time TV news firms who hope to satisfy their audience. With the Information Age, such a formula has unfurled itself further as the news industry’s competition increases with evermore rapid forms of ingestion: websites, mobile apps and social media posts. Just in case such conveniences weren’t accommodating enough, Facebook’s news feed algorithm prioritizes what a user is likely to click on and browse through [6]. However, this may only reinforce false biases. Further, as a user’s online traffic becomes more prevalent, it paves the way for bias-targeting on a political level.

Cambridge Analytica is a Big Data company that worked for the ‘Brexit’ campaign in its early stages and for Trump’s presidential campaign [7]. Its accurate modelling of people’s digital footprints gives particular persons an edge, as they confirm those biases at the right place, at the right time, to the right people. And the irony that seeps through is that the populist movement, so unempirical and unscientific in its diatribes and nationalistic jargon, was thrust forth unto the steeple because of the modern work of statisticians and scientists of the day.

[1] http://www.journalism.org/2016/06/15/state-of-the-news-media-2016/
[2] Hamilton, James T. 2004. All the News that’s Fit to Sell. Princeton, NJ: Princeton University Press.
[3] https://web.stanford.edu/~gentzkow/research/jepmedia.pdf
[4] https://web.stanford.edu/~gentzkow/research/BiasReputation.pdf
[5] Nisbett, Richard, and Lee Ross. 1980. Human Inference: Strategies and Shortcomings of Social Judgment. Englewood Cliffs, NJ: Prentice-Hall, Inc.
[6] https://www.bloomberg.com/view/articles/2017-02-17/mark-zuckerberg-s-manifesto-for-facebook-offers-a-social-dystopia
[7] https://motherboard.vice.com/en_us/article/how-our-likes-helped-trump-win

 

Implications of a Strong USD

After Donald Trump’s surprise U.S. election victory and the Republicans’ full control of Congress, the markets have reacted and the U.S. dollar has been continually surging, catching companies and investors off guard. The new U.S. administration seems to believe that this is a sign of “global confidence in Trumpism”, but an overly strong exchange rate raises many concerns for U.S. exporters [1].

A strong dollar is defined as one that can purchase more foreign currency relative to a weak dollar. This means that U.S. consumers will pay less for imports but foreign consumers will pay more for U.S. exports [4]. This is good for U.S. consumers as the appreciation of the dollar against other currencies makes foreign goods and foreign travel cheaper, both of which American consumers enjoy. However, this negatively affects tourism as the United States becomes a less affordable travel destination [3].  

A second consideration is the impact of a rising dollar on the earnings of U.S. companies with large foreign operations [5]. In 2012, for companies in the S&P 500 that provided foreign sales details, 47% of total sales came from abroad, mainly Europe and Asia. Clearly, a stronger dollar would have a negative effect on net exports produced domestically, thus creating a drag on potential earnings. Interestingly, one can consider that “truly global” U.S.-based companies involved in exports do not produce within the U.S., but rather internationally [6]. The effects of globalization in the past decades have allowed companies to purchase materials and set up factories abroad, which means that the rising dollar does not hurt production as much as initially understood [2]. The real issue is when earnings in foreign currencies are converted back to the domestic currency, as companies will feel the full brunt of the reduced returns.
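
As a toy illustration of that conversion effect (all figures hypothetical, not from the article):

```python
# One million euros of foreign sales converted to dollars under a
# "weak dollar" and a "strong dollar" exchange rate scenario.

def usd_earnings(foreign_revenue, usd_per_unit):
    """Convert revenue booked in a foreign currency into dollars."""
    return foreign_revenue * usd_per_unit

revenue_eur = 1_000_000.0
weak = usd_earnings(revenue_eur, 1.20)    # weaker dollar: $1.20 per euro
strong = usd_earnings(revenue_eur, 1.05)  # stronger dollar: $1.05 per euro
print(strong - weak)  # about -150000: identical sales, lower reported earnings
```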

As an example, Apple, the world’s most valuable company and a company known for their international dominance, faces some of the greatest foreign exchange exposures with 22% of their sales from China and 23% from Europe [1]. In the past quarter, Apple reported its biggest hit to its margins in China, about 3% in revenue growth, due to the weakness of the Chinese RMB against the dollar. Luca Maestri, Apple’s finance chief, has suggested the company has been preparing for further dollar strength but has come to realize that “at some point, the strong dollar becomes the new normal and we need to work with that” [1].

Unemployment Rate in the United States averaged 5.81 percent from 1948 until 2017. The unemployment rate is currently at 4.8% in January 2017.

On the positive side, a higher dollar effectively transfers demand from the U.S. economy to other economies around the world [5]. The U.S. unemployment rate is currently below its 50-year average and is showing signs that it will continually decrease. By contrast, other economies, notably in Japan and emerging Asia countries, would benefit greatly from a boost to their exports as a result of a higher dollar. In the long run, this will develop a stronger and more balanced global economy [5].

The strong dollar will remain a concern in the coming years as President Trump moves to revive domestic production. As it currently stands, a stubborn stance on domestic development may harm the U.S. in the long run through reduced export potential; however, the strong exchange rate will be hugely favoured by American consumers. The rise of the dollar in 2016 will have impacts well into 2017, and those impacts should be considered positive on a global scale, both in the U.S. and around the world [5].


[1] https://www.ft.com/content/8399c6a2-aa82-11e6-ba7d-76378e4fef24
[2] http://fortune.com/2015/03/04/strong-dollar-effects/
[3] https://www.thestreet.com/story/13327355/1/3-impacts-of-a-strong-dollar-weigh-on-next-week-s-fed-meeting.html
[4] http://www.infoplease.com/cig/economics/dollar-us-economy.html
[5] http://www.barrons.com/articles/3-ways-a-strong-dollar-impacts-the-global-economy-1413236429
[6] https://hbr.org/2015/10/strong-dollar-weak-thinking

Pure vs. Mixed Strategies

The stadium lights are blinding, and the murmuring of the crowd in the stands is amplified into a deafening roar. Yet, your senses have never been more acute. The date is January 29, 2017, and you are playing to win your fifth Australian Open championship title in tennis. Millions of people are watching your every movement from across the world. You are Roger Federer. Where do you place your serves across the net?

We are often unaware of the different dimensions that everyday games are comprised of. Something seemingly as simple as a serve in tennis can be dissected into many parts, both physical and mental. In this article, we are going to explore pure and mixed strategies in game theory, using tennis as an example.

What is a pure strategy?

A pure strategy is an unconditional, defined choice that a person makes in a situation or game. For example, in the game of Rock-Paper-Scissors, if a player were to choose only scissors for each and every independent trial, regardless of the other player’s strategy, choosing scissors would be the player’s pure strategy. The probability of choosing scissors is equal to 1, and all other options (paper and rock) are chosen with probability 0. The set of all options (i.e. rock, paper, and scissors) available in this game is known as the strategy set.

What is a mixed strategy?

A mixed strategy is an assignment of probability to all choices in the strategy set. Using the example of Rock-Paper-Scissors, if a person’s probability of employing each pure strategy is equal, then the probability distribution of the strategy set would be 1/3 for each option, or approximately 33%. In other words, a person using a mixed strategy incorporates more than one pure strategy into a game.

The definition of a mixed strategy does not rule out the possibility of one or more options never being chosen (e.g. p_scissors = 0.5, p_rock = 0.5, p_paper = 0). This means that, in a way, a pure strategy can also be considered a mixed strategy at its extreme, with a binary probability assignment (setting one option to 1 and all others to 0). For this article, we shall say that pure strategies are not mixed strategies.
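
Since a mixed strategy is just a probability distribution over the strategy set, playing one amounts to drawing an action from that distribution. A minimal sketch, using the probabilities from the example above:

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

strategy_set = ["rock", "paper", "scissors"]

def play(weights):
    """Draw one action from a mixed strategy (weights sum to 1)."""
    return random.choices(strategy_set, weights=weights, k=1)[0]

# The extreme case above: p_rock = 0.5, p_paper = 0, p_scissors = 0.5.
# "paper" stays in the strategy set but is never actually played.
plays = [play([0.5, 0.0, 0.5]) for _ in range(1000)]
print(set(plays))  # only rock and scissors ever appear
```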

In the game of tennis, each point is a zero-sum game with two players (one being the server S, and the other being the returner R). In this scenario, assume each player has two strategies (forehand F, and backhand B). Observe the following hypothetical payoff matrix:

The strategies FS or BS are observed for the server when the ball is served to the side of the service box closest to the returner’s forehand or backhand, respectively. For the returner, the strategies FR and BR are observed when the returner moves to the forehand or backhand side to return the serve, respectively. This gives us the payoffs when the returner receives the serve correctly (FS,FR or BS,BR), or incorrectly (FS,BR or BS,FR). The payoffs to each player for every action are given in pure strategy payoffs, as each player is only guaranteed their payoff given the opponent’s strategy is employed 100% of the time. Given these pure strategy payoffs, we can calculate the mixed strategy payoffs by figuring out the probability each strategy is chosen by each player.

So you are Roger. It is apparent to you that a pure strategy would be exploitable. If you serve to the backhand 100% of the time, it would be easy for the opponent to catch on and return from the backhand side more often than the forehand, maximizing his expected payoff. Same goes for the serve to the forehand. But how often should you mix your strategy and serve to each side to minimize your opponent’s chances of winning? Calculating these probabilities would give us our mixed strategy Nash equilibria, or the probabilities that each strategy is used which would minimize the opponent’s expected payoff. In the following article, we will look at how to find mixed strategy Nash equilibria, and how to interpret them.
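
To make the idea concrete, here is a sketch with hypothetical payoffs (all numbers assumed for illustration, not the article’s matrix), solving a 2x2 zero-sum serving game via the indifference condition: each player mixes so that the opponent is indifferent between their two options.

```python
# Entry A[s][r] is the server's probability of winning the point when the
# server picks s (0 = F_S, 1 = B_S) and the returner guesses r (0 = F_R, 1 = B_R)
A = [[0.5, 0.8],
     [0.9, 0.4]]

# Indifference conditions for a 2x2 zero-sum game:
# p = server's probability of F_S makes the returner indifferent;
# q = returner's probability of F_R makes the server indifferent.
denom = A[0][0] - A[0][1] - A[1][0] + A[1][1]
p = (A[1][1] - A[1][0]) / denom
q = (A[1][1] - A[0][1]) / denom

# Value of the game: the server's expected payoff at the equilibrium mix
value = (A[0][0] * p * q + A[0][1] * p * (1 - q)
         + A[1][0] * (1 - p) * q + A[1][1] * (1 - p) * (1 - q))

print(p, q, value)  # p = 0.625, q = 0.5, value ≈ 0.65 for these payoffs
```

With these assumed payoffs, the server should go to the forehand 62.5% of the time; any deviation from that mix gives the returner something to exploit.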

Okanagan Apple to Serve as Litmus Test for GMOs

GMOs have been the centre of a political debate for a long time. Now, a product made in Canada will serve as a major indicator of where that debate stands in the public consciousness. This debate could open up the floodgates to the GMO market and result in a major shakeup of the food industry as a whole. If the Arctic Apple succeeds, many other products in other industries may be opened up to GMOs as well.

The Arctic Apple underwent limited release in midwestern markets on February 1. The company believes that the apple could be available in Canada in the form of slices by 2019.

The attraction of the Arctic Apple is that it will not brown. The hope of the Okanagan company is that this feature will compel people to try the product and, they hope, come to like it. In fact, they see it more as a matter of convenience than an issue of GMOs. Their argument is that every consumer will want an apple that doesn’t brown.

The idea came to the company after realizing that if baby carrots could become as popular as they did because of convenience, then apples should be able to do the same. The company also hopes it can help reverse declining apple consumption.

Historically, GMO-style products have failed in the market. GMO products have been greatly limited in specific markets like corn, wheat, and tomatoes because of efforts from anti-GMO groups.

Despite the nine years of testing, anti-GMO groups say the apple is understudied and believe that consumers will not have any interest in modified apples, citing the inability to measure the freshness of apples without natural browning.

There are major hurdles that all GMO companies must overcome. In a poll conducted by ABC News, 52% of Americans believe that GMOs are unsafe to eat [1]. That is the environment into which the Arctic Apple will walk.

It should be noted that both the World Health Organization [2] and the National Academies of Sciences [3] have said there is no danger to human health from genetic modification.

Also, after three years of review [4] by the Canadian Food Inspection Agency and Health Canada, the CFIA said “[Arctic Apples] are as safe and nutritious as traditional apples,” while Health Canada said the apple is safe to consume and has the same nutritional value.

The big test for this particular apple is whether or not the convenience of the product can overcome the negative connotations of GMOs. If the apple can overcome the negativity surrounding GMOs, it will be a major turning point for the GMO industry, which, in turn, will result in a big shakeup of the entire food industry.

References:

[1] http://abcnews.go.com/Technology/story?id=97567&page=1
[2] http://www.who.int/foodsafety/areas_work/food-technology/faq-genetically-modified-food/en/
[3] https://www.nap.edu/catalog/23395/genetically-engineered-crops-experiences-and-prospects
[4] http://www.ctvnews.ca/sci-tech/canadian-created-non-browning-arctic-apple-opens-gmo-debate-1.2295324