Anti-Semitic Incidents in MA: A Tale of Two Data Sources

By AAAD (Arthur, Anne, Anne, Drew)

“Data-driven” is the theme of the modern age. From business decision-making to policy changes, from news stories to social impact evaluations, data is a foundational building block for many organizations and their causes. However, while we would like to think that all data represents absolute truth, the real world presents many challenges to accurate reporting.

Our team was motivated by the question: How has the prevalence of anti-Semitic incidents in Massachusetts changed over the past several years? In our exploration of this question, we learned an old but important truth: when you see data, dive deep and make sure you understand how the data collection methods could affect the resulting numbers.

To begin our exploration of anti-Semitic incidents in MA, we looked into two sources: the Anti-Defamation League (ADL) and the Massachusetts Executive Office of Public Safety and Security (EOPSS). Right away, we noticed obvious discrepancies in the annual totals of anti-Semitic incidents reported by the two sources:

Anti-Semitic incidents in Massachusetts

Year    ADL    EOPSS
2015     50       40
2014     47       36
2013     46       83
2012     38       90
2011     72       92
2010     64       48

Source: ADL press releases and EOPSS data from an FOI request
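For readers who want to reproduce the comparison, here is a minimal Python sketch that tabulates the two series side by side; the numbers are copied straight from the table above.

    # Annual anti-Semitic incident totals for Massachusetts,
    # copied from the table above (ADL press releases; EOPSS FOI data).
    adl   = {2010: 64, 2011: 72, 2012: 38, 2013: 46, 2014: 47, 2015: 50}
    eopss = {2010: 48, 2011: 92, 2012: 90, 2013: 83, 2014: 36, 2015: 40}

    for year in sorted(adl):
        gap = adl[year] - eopss[year]
        print(f"{year}: ADL={adl[year]:3d}  EOPSS={eopss[year]:3d}  diff={gap:+d}")

Notice that the sign of the gap flips from year to year: neither source consistently reports more incidents than the other.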

After seeing these discrepancies, we decided to dig deeper to understand what might account for the differences. We began by investigating how each source collects its data and then compared specific statistics between the two.

EOPSS’ approach and its implications

Massachusetts passed its first hate crime legislation in 1991, but not every agency has adhered to it. According to reports from the Massachusetts Executive Office of Public Safety and Security (EOPSS), the state did not begin tracking non-reporting agencies until 2005.

The Massachusetts Hate Crimes Reporting Act “requires that the number of hate crimes that occur in the Commonwealth be submitted to the Secretary of Public Safety annually, even if that number is zero” (M.G.L. c. 22C, § 32). Nonetheless, as late as 2014, some districts were not reporting this statistic. The FBI also compiles hate crime data, though submitting this information is voluntary. Some Massachusetts agencies that have failed to report hate crime data to the FBI have stated they did not realize the FBI had even requested the information.

The accuracy of hate crime reporting data can be influenced by a number of factors, including record-keeping procedures within a given agency and whether or not officers are trained to inquire about factors that qualify crimes as hate crimes.

When agencies do not report data to the state, any hate crimes that occur in those districts are missing from the official state statistics. Agencies with zero hate crimes should still report that zero to the state (these are designated as “zero-reporting” agencies in official reports). Trends become further complicated when formerly non-reporting agencies begin reporting nonzero counts: an apparent increase in hate crimes may simply reflect more complete reporting.

Data collected by Massachusetts indicates the population covered by agencies that did not report hate crime statistics grew from roughly 66,000 in 2011 to over 300,000 in 2014.

Massachusetts has recently taken steps to increase the public’s ability to report hate crimes, setting up a hotline in November 2016. Some police districts also have a designated Civil Rights Officer to handle hate crimes.

The issues raised by non-reporting are far from academic. When national tragedies occur, one reaction may be an increase in hate crimes against particular populations. In these cases, hate crime statistics can provide insight into the implications for local communities.

In the wake of the 2016 presidential election, Bristol County Sheriff Thomas Hodgson called for arrest warrants to be issued for elected officials of “sanctuary cities.” This prompted Somerville mayor Joe Curtatone to defend the legality of sanctuary cities, call Sheriff Hodgson a “jack-booted thug,” and taunt him to “come and get me.” These flare-ups between public officials indicate the tension that has formed in the public sphere around the issue of immigration.

Hate crime reporting statistics provide a tool to measure claims of anti-immigrant incidents and give the public a sense of whether such incidents are on the rise; the reporting hotline mentioned above is one of Massachusetts’ responses to those concerns.

Official statistics from police departments and college campuses can bring clarity to the issue, but Massachusetts must both enforce its reporting mandates and provide training to local agencies to improve and standardize how these statistics are reported.

ADL’s selected approach and its implications

Another source of data on Massachusetts anti-Semitic crimes is the Anti-Defamation League (ADL), a Jewish NGO. The ADL was founded in the United States in 1913 and aims to “stop anti-Semitism and defend the Jewish people,” according to its website.

Since 1979, the ADL has conducted an annual “Audit of Anti-Semitic Incidents.” The ADL’s data partially overlaps with official data (it draws on law enforcement records), but the ADL also collects information directly from victims and community leaders.

The ADL’s audits face the same limitations as any attempt to count anti-Semitic crimes. How the ADL handles those limitations, however, should be noted carefully, as it greatly affects the resulting numbers.

First of all, unlike the official data, the ADL also includes non-criminal acts of “harassment and intimidation” in its numbers; these encompass hate propaganda, threats, and slurs.

Another key difference from the official data is that ADL staff attempt to verify every anti-Semitic act included in the audit. While crimes against institutions are relatively easy to confirm, harassment against individuals reported anonymously poses unique verification challenges.

Additionally, some crimes are not easily identifiable as anti-Semitic even though they may be. In their annual audit, the ADL considers all types of vandalism at Jewish institutions to be anti-Semitic, even without explicit evidence of anti-Semitic intent. This includes stones thrown at synagogue windows, for example.

On the other hand, the ADL does not consider all swastikas to be anti-Semitic. Since 2009, it has stopped counting swastikas that do not target Jews, reasoning that the swastika has become, in some contexts, a universal symbol of hate rather than an exclusively anti-Semitic one.

The ADL also does not count related incidents separately. Instead, crimes or harassment that occurred in multiple nearby places at similar times are counted as one event.
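The ADL does not publish the exact grouping rule it uses, but the effect of any such rule is easy to illustrate. Here is a short Python sketch with an invented rule (merging hypothetical reports that share a date and a town); both the rule and the data are placeholders, not the ADL’s actual methodology.

    # Hypothetical raw reports as (date, town) pairs. The ADL does not
    # publish its actual grouping rule; this invented one merges reports
    # that share a date and a town into a single counted event.
    reports = [
        ("2014-07-10", "Somerville"),
        ("2014-07-10", "Somerville"),  # same day, same town: merged below
        ("2014-07-10", "Brookline"),
        ("2014-08-02", "Newton"),
    ]

    events = len(set(reports))  # 4 raw reports collapse to 3 counted events
    print(f"{len(reports)} raw reports -> {events} counted events")

However the rule is defined, any grouping of this kind systematically lowers the ADL’s totals relative to a source that counts every report separately.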

All of these choices made by the ADL greatly affect the numbers that they produce each year.

Comparing and contrasting the results of the two methodologies

Numbers can tell different stories depending on the choices and circumstances surrounding the ADL’s and EOPSS’ hate crime data collection processes. To demonstrate this, we compare some of the conclusions drawn from the two datasets for anti-Semitic hate crimes in Massachusetts.

Starting small: One location claim

One of the ADL’s figures for 2013 indicated that 28% (or 13) of the 46 total anti-Semitic incidents that year took place on a school or college campus.  If we look for the same percentage in the EOPSS data, we find a similar 29% of reported 2013 incidents occurring on a school or college campus.  

This single summary seems to bode well for comparisons between the two datasets; however, things get a little hazier when you look at the absolute numbers. Instead of 13 out of 46 total incidents, the EOPSS data reported 24 out of 83 incidents on a school or college campus, and it’s unclear what accounts for the difference in scale.
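A quick back-of-the-envelope check makes the mismatch concrete. The sketch below uses only the counts published by each source: the shares agree to within a percentage point even though the underlying counts differ by nearly a factor of two.

    # Campus incidents as a share of all 2013 incidents, per source.
    adl_campus, adl_total = 13, 46      # ADL 2013 audit
    eopss_campus, eopss_total = 24, 83  # EOPSS 2013 FOI data

    print(f"ADL:   {adl_campus}/{adl_total} = {adl_campus / adl_total:.0%}")          # 28%
    print(f"EOPSS: {eopss_campus}/{eopss_total} = {eopss_campus / eopss_total:.0%}")  # 29%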

Time trends in reports

If we look at time trends, 25% of the anti-Semitic incidents in Massachusetts reported by the ADL in 2014 occurred in July and August, while that figure was 8% for the same time period in 2013.  

The ADL’s New England Regional Director attributed that “marked increase” to the 50-day Israel-Gaza conflict of July 8 to August 26, 2014: “This year’s audit provides an alarming snapshot into anti-Semitic acts connected to Operation Protective Edge over the summer as well as acts directed at Jewish institutions. The conflict last summer demonstrates how anti-Israel sentiment rapidly translates into attacks on Jewish institutions.”

If we look at EOPSS data for 2013 and 2014, however, there appears to be no sign of a marked increase in anti-Semitic incidents recorded in the summer months — in fact, in absolute numbers, both incidents in July/August and incidents in the entire year decreased from 2013 to 2014 in the EOPSS data.  
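Neither source publishes a month-by-month breakdown alongside these reports, so we can only sketch the computation itself, with invented monthly counts as placeholders:

    def summer_share(monthly_counts):
        """Fraction of a year's incidents that fall in July and August."""
        return (monthly_counts["Jul"] + monthly_counts["Aug"]) / sum(monthly_counts.values())

    # Invented monthly counts purely for illustration; neither the ADL nor
    # EOPSS publishes this breakdown alongside the reports we used.
    hypothetical_2014 = {"Jan": 3, "Feb": 2, "Mar": 4, "Apr": 3, "May": 4, "Jun": 4,
                         "Jul": 7, "Aug": 5, "Sep": 4, "Oct": 4, "Nov": 4, "Dec": 4}
    print(f"{summer_share(hypothetical_2014):.0%} of incidents fell in Jul/Aug")  # 25%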

Because the ADL does not provide its underlying data to the public, we can’t dig into the stories of the specific incidents from July/August 2014 to see whether they were indeed a result of the Israel-Gaza conflict. And with reporting methodologies that are neither particularly scientific nor consistent, it’s hard to draw concrete conclusions from either of these datasets.

Incident types: Differences might be explained by differing reporting policies

Thus far, we’ve identified contradictions between the two datasets, but have not been able to discern how the two data collection methods may have specifically contributed to those contradictions.  

One topic where we can attempt to do so is the matter of vandalism:

According to the annual ADL audit, 16 of the 46 anti-Semitic incidents in Massachusetts in 2013 (35%) involved vandalism.  The same figure from the ADL for 2014 was 23 vandalism incidents out of 47 total anti-Semitic incidents in Massachusetts (49%).  In EOPSS’ numbers, however, vandalism looks like an even larger portion of anti-Semitic incidents in Massachusetts.

As discussed previously, the ADL reports all vandalism of Jewish institutions as anti-Semitic incidents, but it does not count every swastika as anti-Semitic. Although not directly specified, the EOPSS datasets likely categorize all reported swastikas as anti-Semitic vandalism, which is one possible explanation for the large discrepancy in percentages (on top of the simpler explanation that, with numbers this small and imprecise, variation is inevitable).
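We can at least sanity-check how far a single classification rule could move these percentages. The sketch below starts from the ADL’s published 2013 figures and asks what happens if some swastika incidents the ADL excluded were counted as anti-Semitic vandalism instead; the excluded count (10) is an invented placeholder, since neither source publishes it.

    # ADL's published 2013 figures.
    vandalism, total = 16, 46
    print(f"ADL rule:   {vandalism}/{total} = {vandalism / total:.0%}")  # 35%

    # Hypothetical: suppose 10 swastika incidents that the ADL excluded
    # (because they did not target Jews) were instead counted as
    # anti-Semitic vandalism, as EOPSS plausibly does.
    extra = 10
    print(f"EOPSS-like: {vandalism + extra}/{total + extra} = "
          f"{(vandalism + extra) / (total + extra):.0%}")  # 46%

The same underlying events, classified under a different rule, shift the vandalism share from 35% to 46%: a single methodological choice can account for much of the gap.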

Do Data Due-Diligence!

Investigating the discrepancies and the data collection methodologies was not merely an academic exercise: it is a necessary step in understanding what conclusions you can reasonably draw from your data and what caveats you should include when reporting or making decisions based on it.

Using only one dataset without exploring how the data was collected and digging into the details of the data could yield very different headlines:

Blindly using ADL data might yield: “Anti-Semitic hate crimes in Massachusetts increase 2% from 2013 to 2014.” (That’s just 46 incidents to 47; does calling it a 2% increase really reflect the situation?)

Blindly using EOPSS data might yield: “Massachusetts became safer for Jewish people in 2014: anti-Semitic hate crimes dropped 57% from 2013.” (That’s 83 incidents down to 36. Is this message true, or is this “trend” due to data collection issues? Why does it paint such a different picture from the ADL data?)
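Both headlines come from the same trivial arithmetic applied to different series; a sketch, using the totals from the table at the top:

    # Year-over-year change, 2013 -> 2014, from the table at the top.
    def pct_change(old, new):
        return (new - old) / old

    print(f"ADL:   46 -> 47, {pct_change(46, 47):+.0%}")  # +2%
    print(f"EOPSS: 83 -> 36, {pct_change(83, 36):+.0%}")  # -57%

The arithmetic is identical; only the input series differs. That is exactly why the data collection story matters more than the percent sign in the headline.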

Do your data due-diligence.