Facebook’s F8: It’s not just for software developers

by Arthur and Drew


You’re a small business owner, looking to stay ahead of the curve in everything you do. Say, for example, you run a local electrical service: you employ 50 technicians, and you send them to customers’ homes when something isn’t working and they call in. Like countless other small business owners, you’re likely using Facebook to advertise, but you’re probably not very tech savvy yourself, and you don’t have the time to keep up with every technology enhancement. Keeping pace can be really tough.

But … you know it’s important. Competition is relentless, and your users are on Facebook, so staying up to date on advertising practices is important to you.

Last week, Facebook held its annual developer conference, F8. Though the event is aimed at software developers, it’s important for business owners who rely on Facebook to understand the announcements and think proactively about how to take advantage of these platform upgrades for their businesses.

Below, we break down the three most important Facebook platform upgrades that concern advertising and businesses, and share some ideas for how small businesses can leverage these new tools to further their sales.

To get a basic understanding of the announcements below, you should watch either the entire keynote speech or this shortened summary.

Facebook Augmented Reality (AR)

Frame Studio and AR Studio

You know those cool filters on Snapchat that give you dog ears or turn you into a taco? Well, Facebook now lets you do the same thing. The basic idea is that you can spend some of your advertising dollars to create custom filters for users to apply, and those filters act as highly engaging advertising.

SLAM (Simultaneous Localization and Mapping)

Facebook also talked about a new technology called SLAM, which is slightly, yet importantly, different from the filters above. Whereas the filters recognize and digitally paint on a person’s face, SLAM recognizes shapes and objects in the real world. That lets you place digital items into the physical world and have them stay anchored there, as if they were subject to real-world physics (e.g., you put a digital cup on a table, and as you pan around the table, the cup stays in the same place as if it were a real cup).

Now, you might be thinking, “So what? Why is this relevant?” But imagine leveraging this form of AR to let users digitally tag things relevant to your business. Going back to our electrical service example: your customers could use their cameras to digitally circle the specific light switch that doesn’t work, mark where they left the keys under the mat so your technician can find them, or pin a note on the wall with special instructions just for the technician. This could let you offer a better service than your competitors, giving you the leg up you need to attract customers and grow your business.

Facebook Spaces

Whereas the above announcement is all about AR (Augmented Reality), this is Facebook’s VR (Virtual Reality) play. As a small business owner, imagine building a space on Facebook Spaces for your business to act as a digital customer service center. Imagine a future where everyone has a VR headset, and customers who want to learn more about your service don’t have to take the time to drive down, and they also don’t have to settle for the limitations of a phone call. Instead, they can enter Facebook’s virtual world, find your virtual store in that virtual world, and then engage with someone from your sales or customer support team through that virtual store.

Facebook Messenger

New integrations in Messenger allow for games, music sharing, and more capable bots. These bots can now be included in group chats, allowing businesses to provide useful services for drawing in new customers. Discovery of these bots has also been enhanced, making it easier for customers to find relevant bots.

For example, customers could interact with an electrical services business through the convenience of Facebook Messenger, right on their phones. (As of last week, customers can scan a QR code from a flyer and immediately be put into a conversation with the business.) On the small business side, the owner could automate the matching process, using software to dispatch customers’ requests to the appropriate electrician in their area.
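To make this concrete, here is a minimal sketch of such a dispatcher in Python. Everything in it (the fields, the ZIP codes, the first-match rule) is an illustrative assumption, not an actual Messenger integration:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Technician:
    name: str
    area: str        # service area, e.g. a ZIP code (hypothetical field)
    available: bool

def dispatch(request_area: str, technicians: List[Technician]) -> Optional[Technician]:
    """Return the first available technician covering the customer's area."""
    for tech in technicians:
        if tech.available and tech.area == request_area:
            return tech
    return None  # no match: fall back to a human dispatcher

# A request as it might arrive from a (hypothetical) Messenger bot:
techs = [Technician("Sam", "02139", True), Technician("Ana", "02140", False)]
match = dispatch("02139", techs)
print(match.name if match else "Escalate to a human")  # -> Sam
```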


Anti-Semitic Incidents in MA: A Tale of Two Data Sources

By AAAD (Arthur, Anne, Anne, Drew)

“Data-driven” is the theme of the modern age. From business decision-making to policy changes, from news stories to social impact evaluations, data is a foundational building block for many organizations and their causes. However, while we would like to think that all data represent absolute truths, the real world presents many challenges to accurate reporting.

Our team was motivated by the question: how has the prevalence of anti-Semitic incidents in Massachusetts changed over the past several years? In our exploration of this question, we relearned an old but important truth: when you see data, dive deep and make sure you understand how the data collection methods could affect the resulting numbers.

To begin our exploration of anti-Semitic incidents in MA, we looked into two sources: the Anti-Defamation League (ADL) and the Massachusetts Executive Office of Public Safety and Security (EOPSS). We immediately noticed obvious discrepancies in the annual totals of anti-Semitic incidents reported by the two sources:

Anti-Semitic incidents in Massachusetts

Year ADL EOPSS
2015 50 40
2014 47 36
2013 46 83
2012 38 90
2011 72 92
2010 64 48

Source: ADL press releases and EOPSS data from an FOI request

After seeing these discrepancies, we decided to dig deeper and try to understand what might account for the differences. We began by investigating how each source collects its data, and then compared differing statistics between the two.

EOPSS’ approach and its implications

Massachusetts passed its first hate crime legislation in 1991, but not every agency has adhered to it. According to EOPSS reports, the state did not begin tracking non-reporting agencies until 2005.

The Massachusetts Hate Crimes Reporting Act “requires that the number of hate crimes that occur in the Commonwealth be submitted to the Secretary of Public Safety annually, even if that number is zero” (M.G.L. c. 22C, § 32). Nonetheless, as late as 2014, some districts were not reporting this statistic. The FBI also compiles hate crime data, though submitting this information is voluntary; some Massachusetts agencies that have failed to report hate crime data to the FBI have stated they did not realize the FBI had even requested the information.

The accuracy of hate crime reporting can be influenced by a number of factors, including record-keeping procedures within a given agency and whether or not officers are trained to inquire about the factors that qualify crimes as hate crimes.

When agencies do not report data to the state, any hate crimes in those districts are not represented in the official state statistics. Agencies that record zero hate crimes should still report that zero to the state (these are designated “zero-reporting” agencies in official reports). Trends become further complicated when formerly non-reporting agencies begin to report nonzero counts: an apparent increase may simply reflect new reporting rather than new crimes.

Data collected by Massachusetts indicates the population covered by agencies that did not report hate crime statistics grew from roughly 66,000 in 2011 to over 300,000 in 2014.

Massachusetts has recently taken steps to increase the public’s ability to report hate crimes, setting up a hotline in November of 2016. Some police districts also have a designated Civil Rights Officer to handle hate crimes.  

The issues raised by non-reporting are far from academic. When national tragedies occur, one reaction may be an increase in hate crimes against particular populations. In these cases, hate crime statistics can provide insight into the implications for local communities.

In the wake of the 2016 presidential election, Bristol County Sheriff Thomas Hodgson called for arrest warrants to be issued for elected officials of “sanctuary cities.” This prompted Somerville Mayor Joe Curtatone to defend the legality of sanctuary cities and refer to Sheriff Hodgson as a “jack-booted thug.” He further taunted Hodgson to “come and get me.” These flare-ups between public officials indicate the tension that has formed in the public sphere around the issue of immigration.

Hate crime statistics can provide a tool to measure claims of anti-immigrant incidents and give the public a sense of whether such incidents are on the rise; the reporting hotline mentioned above is one way Massachusetts has responded to these concerns.

Official statistics from police departments and college campuses can bring clarity to the issue, but Massachusetts must both enforce its reporting mandates and provide training to local agencies to improve and standardize the reporting of these statistics.

ADL’s approach and its implications

Another source of data on Massachusetts anti-Semitic crimes comes from the Jewish NGO, the Anti-Defamation League (ADL). The ADL was founded in the United States in 1913 and aims to “stop anti-Semitism and defend the Jewish people,” according to their website.

Since 1979, the ADL has conducted an annual “Audit of Anti-Semitic Incidents.” The ADL’s data partially overlaps with official data — they use data from law enforcement — but they also collect outside information from victims and community leaders.

The ADL’s audits face the same limitations as any attempt to count anti-Semitic crimes. How the ADL handles those limitations, however, should be carefully noted, as it greatly affects the resulting numbers.

First of all, unlike the official data, the ADL also includes non-criminal acts of “harassment and intimidation” in its numbers, which encompass hate propaganda, threats, and slurs.

Another key difference from the official data is that ADL staff attempt to verify every anti-Semitic act included in the audit. While crimes against institutions are relatively easy to confirm, anonymously reported harassment against individuals poses unique verification challenges.

Additionally, some crimes are not easily identifiable as anti-Semitic even though they may be. In their annual audit, the ADL considers all types of vandalism at Jewish institutions to be anti-Semitic, even without explicit evidence of anti-Semitic intent. This includes stones thrown at synagogue windows, for example.

On the other hand, the ADL does not consider all swastikas to be anti-Semitic. Since 2009, it has stopped counting swastikas that don’t target Jews, reasoning that the swastika has in some cases become a universal symbol of hate rather than a specifically anti-Semitic one.

The ADL also does not count related incidents separately. Instead, crimes or harassment that occurred in multiple nearby places at similar times are counted as one event.

All of these choices made by the ADL greatly affect the numbers that they produce each year.

Comparing and contrasting the results of the two methodologies

Numbers can tell different stories depending on the choices and circumstances surrounding the ADL’s and EOPSS’s data collection processes. To demonstrate this, we compare some conclusions drawn from the two datasets on anti-Semitic hate crimes in Massachusetts.

Starting small: One location claim

One of the ADL’s figures for 2013 indicated that 28% (13) of the 46 total anti-Semitic incidents that year took place on a school or college campus. If we look for the same percentage in the EOPSS data, we find a similar 29% of reported 2013 incidents occurring on a school or college campus.

This single summary statistic seems to bode well for comparisons between the two datasets; however, things get a little hazier when you look at the absolute numbers. Instead of 13 out of 46 total incidents, the EOPSS data reported 24 out of 83 incidents on a school or college campus, and it’s unclear what accounts for the difference in scale.

Time trends in reports

If we look at time trends, 25% of the anti-Semitic incidents in Massachusetts reported by the ADL in 2014 occurred in July and August, while that figure was 8% for the same time period in 2013.  

The ADL’s New England Regional Director attributed that “marked increase” to the 50-day Israel-Gaza conflict that took place from July 8 to August 26, 2014: “This year’s audit provides an alarming snapshot into anti-Semitic acts connected to Operation Protective Edge over the summer as well as acts directed at Jewish institutions. The conflict last summer demonstrates how anti-Israel sentiment rapidly translates into attacks on Jewish institutions.”

If we look at EOPSS data for 2013 and 2014, however, there appears to be no sign of a marked increase in anti-Semitic incidents recorded in the summer months — in fact, in absolute numbers, both incidents in July/August and incidents in the entire year decreased from 2013 to 2014 in the EOPSS data.  

Because the ADL does not release its underlying data to the public, we can’t dig into the stories of the specific incidents from July/August 2014 to see whether they could indeed be a result of the Israel-Gaza conflict. Additionally, with reporting methodologies that are neither particularly scientific nor consistent, it’s hard to draw concrete conclusions from either of these datasets.

Incident types: Differences might be explained by differing reporting policies

Thus far, we’ve identified contradictions between the two datasets, but have not been able to discern how the two data collection methods may have specifically contributed to those contradictions.  

One topic where we can attempt to do so is the matter of vandalism:

According to the annual ADL audit, 16 of the 46 anti-Semitic incidents in Massachusetts in 2013 (35%) involved vandalism.  The same figure from the ADL for 2014 was 23 vandalism incidents out of 47 total anti-Semitic incidents in Massachusetts (49%).  In EOPSS’ numbers, however, vandalism looks like an even larger portion of anti-Semitic incidents in Massachusetts.

As discussed previously, the ADL counts all vandalism of Jewish institutions as anti-Semitic incidents, but does not count all swastika vandalism as anti-Semitic. Although not directly specified, the EOPSS datasets likely do categorize all reports of swastikas as anti-Semitic vandalism, which is one possible explanation for the large discrepancy in percentages (on top of the simpler explanation that, with numbers of this magnitude and precision, variation is inevitable).

Do Data Due-Diligence!

Investigating the discrepancies and the data collection methodologies was not merely an academic exercise: it demonstrated that this is a necessary step in understanding what conclusions you can reasonably draw from your data, and what caveats you should include when reporting or making decisions based on it.

Using only one dataset without exploring how the data was collected and digging into the details of the data could yield very different headlines:

Blindly using ADL data might yield: “Anti-Semitic hate crimes in Massachusetts increase 2% from 2013 to 2014.” (That’s just 46 incidents to 47; does calling it a 2% increase really reflect the situation?)

Blindly using EOPSS data might yield: “Massachusetts became safer for Jewish people in 2014: anti-Semitic hate crimes fell to 43% of their 2013 level.” (Is this message true, or is this “trend” due to data collection issues? Why does it paint such a different picture from the ADL data?)
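To make the contrast concrete, here is a minimal sketch of the arithmetic behind those two headlines, using the annual totals from the table above:

```python
# Annual totals of anti-Semitic incidents in MA, from the table above
adl = {2013: 46, 2014: 47}
eopss = {2013: 83, 2014: 36}

def pct_change(old, new):
    """Year-over-year percent change; negative means a decline."""
    return 100 * (new - old) / old

print(f"ADL:   {pct_change(adl[2013], adl[2014]):+.1f}%")      # +2.2%
print(f"EOPSS: {pct_change(eopss[2013], eopss[2014]):+.1f}%")  # -56.6%
# The 2014 EOPSS total is 36/83, or about 43% of the 2013 total.
```

The same question (“how did 2014 compare to 2013?”) produces a small increase in one dataset and a steep decline in the other.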

Do your data due-diligence.


MIT students, in fact, take a reasonable number of classes

It turns out the average MIT student doesn’t overload on courses as much as one might expect.

Spending time on campus or browsing MIT Confessions, however, might give you the wrong impression.

“I hear a lot of bragging… ‘Why would anyone take less than 48 units?’”

#1646: “I wish I could say the depression and anxiety so frequently found here were entirely the fault of MIT's high…” (MIT Confessions, posted March 9, 2015)

“I’m only taking 54 units this semester with no clubs or other obligations and I’m already finding it hard to keep up.”

(MIT Confessions #1395, posted October 1, 2014)

I had a friend who took over 100 units in a semester. According to MIT standards, that’s equivalent to over 100 hours of work a week just on classes.

Yet many of the most ambitious students at MIT have chosen a different path: take a small number of classes (three or four) and push yourself to your limits in extracurriculars instead. This is similarly brag-worthy, but it might yield better opportunities for your future than overloading on classes.

One successful alum, Philip Guo, now a professor of Cognitive Science at UC San Diego, even proposes capping the number of classes an undergraduate can take. Guo himself never took more than 48 units in a semester.

The data is clear: it’s a myth that most MIT undergrads take more than five classes and overload on academics.

“The median MIT student takes four classes per semester, which means most people walk around taking 48 units,” writes Danny B.D. ’15 on the MIT Admissions Blog.

An analysis I did for this blog post (see the appendix) provides a rough estimate of the average number of classes: five. As one might expect, this is larger than the median of four, because both public discourse and the mean are skewed by a few students at the top.

But perhaps more convincing than the numbers are the testimonials of those who ventured into taking six or more classes themselves.

Holden Lee took 8 classes (18.101, 18.152, 18.705, 18.712, 18.725, 18.784, 18.901, and Chinese 3) in his sophomore year. After describing this experience on Quora, he writes: “I wouldn’t recommend taking so many classes under any circumstances. While I survived the semester fine, it was a process of gradual burnout.”

“I’d blocked out almost everything else that semester to focus on work, but found there are little voices in my head that don’t want to be ignored. I liked to write stories, and never had time to pursue it seriously. I thought about that poem, ‘Dream deferred.’”

Matt Hodel writes similarly of his experience taking 6 classes in a semester: “I’m a sophomore who took 6 classes last semester and failed miserably at pulling it off.” He describes falling into periods of depression throughout the semester.

His parting advice? “So I won’t say to never take 6 classes at MIT, or at college in general for that matter. Lots of people do it and many of them can handle it. Just think long and hard about how much you can handle and what your priorities are before making that decision.”

Those who display more moderation in their course loads have seen great results. In his blog post “Below 48,” Danny describes how taking 3½ classes allowed him to pursue side projects he had been wanting to work on for a while, “breathe a bit more,” and spend more time with friends.

Guo, the professor who proposed capping units at MIT, thinks that taking fewer classes may even have increased his job opportunities. His employers never cared about how many classes he had taken; his resume lists only his GPA. With more free time, Guo argues, students can develop “deep expertise” and work on research that differentiates them from other students.

~~~

Appendix: Analysis for finding the mean number of classes taken by an MIT student

MIT doesn’t publicly release this information, or even the mean class size. They do, however, release a distribution of class sizes.

Fall 2015’s distribution is as follows:

I used this discrete distribution to estimate a continuous exponential distribution, which served as a decent model for the data.

I then wanted to find the total number of student-classes, i.e., enrollments (summing, over all students, the number of classes each takes). To do this, I computed the following integral with Wolfram Alpha:

Dividing the number of student-classes by the number of students (~4,500), we obtain an estimate for the average number of classes each student takes: 5.07.
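For readers who want to replicate this without Wolfram Alpha, here is a minimal sketch of the same calculation in Python. The binned class-size counts below are placeholders standing in for MIT’s actual Fall 2015 distribution, so only the method (not the printed number) should be taken literally:

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import curve_fit

# Placeholder bins: (class size at bin midpoint, number of classes that size).
# Substitute MIT's actual Fall 2015 distribution here.
sizes = np.array([5.0, 15.0, 25.0, 35.0, 45.0, 75.0, 150.0])
counts = np.array([900.0, 500.0, 250.0, 140.0, 80.0, 60.0, 20.0])

# Fit a continuous exponential model: count(s) ~ a * exp(-s / b)
def model(s, a, b):
    return a * np.exp(-s / b)

(a, b), _ = curve_fit(model, sizes, counts, p0=(1000.0, 30.0))

# Student-classes = integral of s * count(s) ds, since a class of
# size s contributes s enrollments.
student_classes, _ = quad(lambda s: s * model(s, a, b), 0, np.inf)

# Average classes per student, assuming ~4,500 undergraduates.
print(student_classes / 4500)
```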


Feminist activists take on the Kremlin

by Drew and Arthur

On March 8, International Women’s Day, a group of Russian feminist activists protested outside the Kremlin.

Their banner said, “200 years men in power, out with them!”

Ekaterina Nenasheva’s post accompanying the video reads:

“Moscow and St. Petersburg feminists, #CapturedKremlin, congratulate you on March 8.
UPD: Tishchenko, Orlova, and a photographer from Nova are already at the Kitay-Gorod (‘China Town’) police station.
UPD: at 14:20, all detainees were released.”

Meanwhile, Putin was congratulating the staff at the new perinatal centre in Bryansk. After all, the history of International Women’s Day is rooted in Russia.

The feminists gathered in a prominent location, Alexander Garden, right at the edge of the Kremlin’s walls:

News of the demonstration spread quickly on social media, with over 43k people watching Nenasheva’s video.

Some declared the protesters heroines.

One person wrote on Facebook in Russian: “You are still bathing in a bath with champagne, and your revolutionary friends have already taken the Kremlin.”

Not all coverage of the demonstration was positive, though.

A photo of protesters appearing to have breached the Kremlin walls turned out to be Photoshopped.

The fake photo was quickly denounced, even by the organizers, in a Facebook post that has since been deleted (but was reported on by Buzzfeed).


In that deleted post, Ekaterina Nenasheva says:

“I’m hurting right now for Russian art activism and the feminist collective, because the picture of the Arsenal tower really did turn out to be photoshop. Only a few participants knew about it, and now I know too.
I deeply respect all participants of the protest and don’t want to devalue their actions. All the other photos and videos are real. Thank you, girls!
But I also consider it absolutely unprofessional and unacceptable to have such an approach to work, in any case, the use of photoshop was not part of the original concept.”

Others used it as an opportunity to discuss the much-talked-about “fake news.”

“Fake news. Actual fake news. 😂” (Joseph Griffiths, posted March 8, 2017)


Drew’s media diary

After tracking my computer usage for 6 days, I was interested in three questions:

  1. How often do I use my computer for consuming, and how often do I use it for producing?
  2. Which are the websites I consume from the most?
  3. By studying the types of media I consume, what can I learn about myself?

Tracking all my media usage was not as difficult for me as it may have been for others in the class. I don’t use a smartphone, so I have no mobile consumption (beyond phone calls!). I occasionally read print sources, such as The Tech, but those contributed fewer than 30 minutes this past week. Almost all of my media consumption happens on my computer, which I’ve been tracking with an application called Timing.

To answer the first question, I reviewed all the time I spent on my laptop and divided it into two broad buckets: productivity and media usage. This was more art than science. I counted as productivity any application or website in which I was actively producing something (e.g., writing in LaTeX, composing emails, reading a pset on the computer while solving it on paper, buying stocks).

These tasks served as a nice benchmark for what I was really interested in: my media usage. I defined this category as anything that wasn’t “productive” as defined above, i.e., anything consisting almost entirely of consumption. This included reading news or blogs, browsing Facebook, or even reading other posts on this website!

Granted, these categories were separated by a fuzzy boundary at best. Some things, like email, were hard to classify: when was I consuming emails and newsletters, and when was I composing one? Furthermore, Timing had trouble knowing when I was actively viewing something on my computer, so all the data should be treated as having huge error bars. (Times are probably underestimates.)
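As a sketch of how this bucketing worked in practice, here is roughly the shape of the classification in Python. The category lists are abbreviated examples, not my complete rules:

```python
from collections import defaultdict

# Abbreviated example rules; the real lists were longer and fuzzier.
PRODUCTIVE = {"LaTeX", "Mail (composing)", "stellar.mit.edu"}
MEDIA = {"facebook.com", "twitter.com", "nytimes.com", "buzzfeed.com"}

def bucket(source):
    if source in PRODUCTIVE:
        return "productivity"
    if source in MEDIA:
        return "media"
    return "unclassified"  # judged case by case, as described above

# (source, minutes) pairs, as exported from Timing -- sample data only
log = [("facebook.com", 12), ("LaTeX", 95), ("nytimes.com", 25)]

totals = defaultdict(int)
for source, minutes in log:
    totals[bucket(source)] += minutes

print(dict(totals))  # e.g. {'media': 37, 'productivity': 95}
```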

Despite these caveats, I believe the data still yields interesting insights:

My media consumption is (thankfully) consistently lower than my productive use of the computer. It is still considerable, however. What surprised me was that I wasn’t spending a lot of time on any single website; rather, I was spending a little time on many websites, which added up to significant periods in “consumption mode.” (See the chart below for more context.) Consumption in the age of the Internet is incredibly distributed.

My overall usage of the computer was lower on the weekend as expected (see Feb. 18, a Saturday). However, the weekend is also when the highest percentage of my computer time is spent on media consumption.

The chart below sheds light on my second question:

Google Hangouts, which I included in media consumption (although that could be debated), took up 149 minutes because of several video chats I had this past week. (One of these was a conversation with other Americans across the country about recent political events.)

As a subscriber to the NYTimes, I’m happy to see it make my top three, and the time I spent on it is about what I’d expect.

Answering the third and final question, however, requires a more aggregated analysis:

This chart yields the most value for me.

Social websites like Facebook and Twitter are having a large impact on how I see the world, even if I don’t go there for traditional news. That’s because they make up over half the time I spend consuming media on the computer. Just by the nature of scrolling through their newsfeeds and adhering to their algorithms, I am being shaped by them.

I don’t read the mainstream media (like NYTimes, Washington Post, and WSJ) as much as I would expect. When I think about where my opinions form and where I find the facts that I reference in conversation, they usually stem from these mainstream sources. However, given that only a fifth of my media consumption comes from there, I must be weighting these sources substantially higher in my head because of the perceived credibility that comes with them.

I was happy to see that new forms of journalism (like BuzzFeed and MuckRock) are the next largest category. These sources can provide alternative perspectives in new forms. I intentionally try to seek them out, so it was reassuring to see that I’ve been somewhat successful in keeping up with them.

~

To end on a fun note, here are some fascinating tidbits that I picked out from my week’s worth of data:

  • The longest time I spent on a Wikipedia page at one time was 7 minutes. Article: “Gas constant”
  • The news websites that didn’t make it into the charts above (because I visited them for under 3 minutes) included CBS Sports, NYPost, The Independent, The Verge, Huffington Post, Fox News, and The Crimson.
  • I like to visit BuzzFeed occasionally because they have some great original reporting. However, I still don’t spend as much time reading them as I would like! See below:

Drew’s Bio

Hi there! I’m Drew, an undergraduate student at MIT studying computer science & electrical engineering, along with physics.

Motivation: I see journalism and participatory media as a great equalizer — holding powerful people and institutions accountable, while giving a voice to the voiceless. The democratization of media online these past two decades has brought more voices into the mix (a positive force), but at the cost of some veracity, responsibility, and credibility (a negative force). However, I believe there is no fundamental reason that the positive necessitates the negative — and rather that we are simply still waiting for a creative, new approach to media that will provide the benefits of journalistic democratization without sacrificing journalistic integrity.

Technology: programming since before I can remember; interned at Sony Ericsson and Khan Academy as a software developer in high school; co-founded and served as CTO of a start-up with 6 people; consulting at the World Bank on big data

Journalism: writing and editing for MIT’s school newspaper, The Tech; reported on the Boston Marathon bombing trial from the courtroom for half a year; interviewed people such as IMF Managing Director Christine Lagarde

Random life experiences: biked across the U.S.; worked in a quantum computing lab; built iOS apps in ’09; got rid of my smartphone for a dumbphone

Twitter: @drew_bent
Website: www.drewbent.com
