Stephen – Future of News and Participatory Media
https://partnews.mit.edu
Treating newsgathering as an engineering problem... since 2012!

Stephen’s Final Project Proposal: Press Play on Newsgames
https://partnews.mit.edu/2014/04/16/stephens-final-project-proposal-press-play-on-newsgames/
Wed, 16 Apr 2014 20:30:50 +0000

Thanks to an influx of new, cheap tools and growing diversity among game developers, the video game industry is experiencing a period of significant change, especially in the indie scene. Leveraging one of the medium’s primary affordances — empathy-building — games like Cart Life, Depression Quest, and Papers, Please offer us a roadmap for how journalism can use games to tell even more compelling stories. In Cart Life, you assume the role of a street vendor, living and struggling through the monotony of menial labor — and the repetitive, tiresome gameplay reflects that.

In Depression Quest, a choose-your-own-adventure-style game, you assume the role of someone living with depression. As in any interactive fiction game, you make choices to advance the story, but there’s one catch: to tangibly convey the emotional experience of depression, the game shows you the entire range of possible actions but crosses some of them out depending on your current happiness level. The frustration of playing Depression Quest tells you far more than a clinical description of depression ever could.
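Depression Quest itself is built in Twine, but the mechanic is simple enough to sketch. Here is a minimal Python illustration, with entirely invented scene data and mood thresholds, of showing every option while gating the ones the character can’t manage:

```python
# A minimal sketch of Depression Quest's core mechanic: all options are
# shown, but those above the current mood level are rendered crossed out
# and cannot be picked. The scene data here is invented for illustration.

def strike(text):
    """Render text with a strikethrough using a combining overlay character."""
    return "".join(ch + "\u0336" for ch in text)

def present_choices(choices, mood):
    """Print every choice; cross out the ones the character can't manage."""
    for i, (label, min_mood) in enumerate(choices, 1):
        if mood >= min_mood:
            print(f"  {i}. {label}")
        else:
            print(f"  {i}. {strike(label)}")

choices = [
    ("Stay in bed", 0),                # always available
    ("Answer your friend's email", 2),
    ("Go to the party anyway", 4),     # only reachable when mood is high
]

present_choices(choices, mood=2)
```

The point of the design is that the player sees what the healthier choice would be and still can’t select it, which is exactly the frustration the paragraph above describes.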

“Dystopian document thriller” Papers, Please is a game in which you play as an immigration inspector at the border checkpoint of the fictional country of Arstotzka. While the gameplay is relatively basic (reviewing and stamping papers), the game effectively portrays a volatile political situation and exacts a real emotional toll on the player. Together, these affordances suggest that the video game medium can enable new modes of journalism.

Newsrooms haven’t missed out on this — the most-trafficked New York Times story of last year was How Y’all, Youse and You Guys Talk, a quiz (based on real research) that identifies your dialect from the words and pronunciations you use. Mother Jones has open-sourced a library that turns Google Spreadsheets into simple news quiz games. Nobody’s Facebook feed is free from the reach of the BuzzFeed quiz, a format the site continues to iterate on.
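The underlying spreadsheet-to-quiz pattern is worth spelling out: publish a sheet of questions, fetch it as CSV, and render the quiz from the rows. This is a rough Python sketch of that idea, not Mother Jones’ actual library (which is a JavaScript project); the SHEET_ID placeholder and the “question”, “options”, and “answer” column names are hypothetical:

```python
# Illustrative sketch: fetch a published Google Sheet as CSV and run it
# as a terminal quiz. The sheet ID and column names are hypothetical.
import csv
import io
import urllib.request

SHEET_CSV = "https://docs.google.com/spreadsheets/d/SHEET_ID/export?format=csv"

def load_questions(url):
    with urllib.request.urlopen(url) as resp:
        text = resp.read().decode("utf-8")
    return list(csv.DictReader(io.StringIO(text)))

def run_quiz(questions):
    score = 0
    for q in questions:
        print(q["question"])
        options = q["options"].split("|")  # options pipe-delimited in the sheet
        for i, opt in enumerate(options, 1):
            print(f"  {i}. {opt}")
        pick = options[int(input("> ")) - 1]
        if pick == q["answer"]:
            score += 1
    print(f"You got {score}/{len(questions)} right.")

run_quiz(load_questions(SHEET_CSV))
```

The appeal for newsrooms is that reporters edit only the spreadsheet; no one touches the quiz code after it ships.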

In some ways, the newsgame is the natural extension of the interactive graphic. Newsgames can distill complex concepts, as in BusinessWeek’s bitcoin-mining minigame, which I like to think of as a game version of the explainer. They can also viscerally communicate real-world data, as in ProPublica’s HeartSaver, a game in which you must rush New York heart attack victims to nearby hospitals. Now we’re starting to get into the territory of conveying emotional experience. Similarly, The New York Times’ Gauging Your Distraction puts you behind the wheel of a car, where you must juggle switching lanes with texting.

While I’m still in the exploratory phase of my project, my general idea is to create a newsgame design tool or toolkit that helps newsrooms create games for journalism, with the hope of eliminating some of the technical hurdles that have kept all but the most code-savvy newsrooms from dabbling in the field. My major concern with developing it as a tool is the balancing act between procedural generation/abstraction and the hand-crafted design elements that are so core to an effective game experience.

Highly procedural approaches like The Cartoonist are great at lowering the technical barriers to entry for newsrooms, but the resulting games (at least the ones I have been able to read about or watch online) seem pretty reductive, as the engine generates games based on user-defined relationships and verbs. Another approach would be to anticipate a few key narrative frames and have those translate directly into corresponding gameplay elements. With any of these approaches, repetitiveness across the resulting games seems to be a major problem — even if the theming or artwork has changed, players will easily recognize that they are playing the same game.
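To see why verb-driven generation trends reductive, here is a toy sketch of the approach. The data and mechanic names are entirely invented, and this is not The Cartoonist’s actual engine: the author supplies actors and a verb, and the generator slots them into one of a few stock mechanics.

```python
# Toy sketch of a Cartoonist-style generator (data and mechanic names
# invented): story-specific nouns and verbs get slotted into a small,
# fixed set of gameplay templates, which is why the output feels samey.
VERB_TO_MECHANIC = {
    "pollutes": "dodge",    # avoid falling hazards
    "regulates": "block",   # intercept moving objects
    "funds": "collect",     # gather tokens before time runs out
}

def generate_game(subject, verb, obj):
    mechanic = VERB_TO_MECHANIC.get(verb, "collect")
    return {
        "title": f"{subject} vs. {obj}",
        "mechanic": mechanic,
        "goal": f"As {obj}, {mechanic} what {subject} {verb}.",
    }

print(generate_game("The factory", "pollutes", "the river"))
# Reskinning the nouns changes the art, not the play: any two stories
# that share a verb produce the same game.
```

However large the verb table grows, the space of distinct play experiences stays as small as the set of templates, which is the repetitiveness problem described above.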

With these concerns in mind, it seems like I will focus on a more generalized tool — perhaps repurposing an existing engine like Twine specifically for the newsroom, or creating tools for playful interactive experiences (quizzes, say, rather than full “games” per se). The project might even take the form of a design toolkit that walks newsrooms through the process of conceptualizing a newsgame. I will be talking to the MIT Game Lab and other game developers to further define the scope of the project; it will also be important to get feedback from newsrooms looking to develop games and to learn what sorts of stories they see coming out of this. Finally, I welcome any suggestions and feedback from all of you on where to take this project.

Stephen’s Curated Story: Mission Bay Fire (#sffire)
https://partnews.mit.edu/2014/03/12/stephens-curated-story-sffire/
Wed, 12 Mar 2014 17:08:23 +0000

Yesterday evening, a fire broke out in an apartment building under construction in San Francisco’s Mission Bay neighborhood, near AT&T Park. The fire escalated to a six-alarm rating, and nearly half of the city’s firefighters were eventually called in to battle the blaze. Using twXplorer, Keepr, and Storify’s built-in social media navigator, I found and curated a series of tweets, Vines, and YouTube videos to tell the story:
http://storify.com/s2tephen/mission-bay-fire-sffire

Stephen’s 4 Hour Challenge: Just Another NUZ Story
https://partnews.mit.edu/2014/02/18/stephens-4-hour-challenge-just-another-nuz-story/
Wed, 19 Feb 2014 03:05:56 +0000

Photo by Tara Lee (http://www.flickr.com/photos/117913248@N08/12596590383/)

Author’s note: this piece is cross-posted from Tuesday’s issue of The Tech and can be found at http://tech.mit.edu/V134/N5/makemit.html — I only occasionally write for the news department of The Tech, referred to affectionately by its three-letter abbreviation, NUZ. I took a pretty head-on approach to the 4 Hour Challenge, using it as an opportunity to brush up on my rusty newswriting skills and actually get something published in the paper.

Mechanical engineers flock to hardware hackathon MakeMIT
50 teams, 200 students participate in MakeMIT’s first year

Approximately 200 students gathered in Lobdell Dining Hall last Saturday to participate in the first phase of MakeMIT, a hardware hackathon organized by TechX. While the past year has seen college hackathons (including TechX’s very own HackMIT) increase in both scale and number, most of the emphasis has been on software, with few options for non-computer science students to get in on the action.

“We saw there were hackathons happening basically everywhere across the U.S. Hackathons provide a great environment for Course 6 people,” said Thuan D. Doan ’15, one of the event organizers. “We said, why isn’t there something like that for hardware? For MechE’s and for 6-1’s? One of the reasons why is because there’s such a high barrier to entry. We thought this would be a great opportunity to fix that.”

MakeMIT debuted this year with about 50 teams of three to five students each. Registration opened in January.

“MakeMIT provided me with an excuse to ditch my daily monotonous life, and build something just for the fun of it,” said Emma M. Steinhardt ’16, a student in Course 2 and first-time hackathon participant. “I haven’t done a hackathon before, because they are all for software-related things, and I’m into designing mechanical things.” In fact, mechanical engineering was the best-represented course at MakeMIT, accounting for 46 percent of the hackers. Another 36 percent came from EECS, with the remaining participants drawn from a variety of majors and skill backgrounds.

While a number of the big college hackathons provide support for hardware, it’s often tacked on as a sideshow to the software projects. Skyler E. Adams ’16, one of Steinhardt’s teammates, recalled his experience working on a hardware project during HackMIT. “Doing something hardware-oriented costs money, and if you order materials it means your idea is somewhat pre-meditated. [My group] and I built these motor systems from spare parts we had lying around, but the people that looked at our project thought we had started much earlier.”

At MakeMIT, all tools and materials were provided, including 3D printers, Kinect sensors, microcontrollers, sensors and input mechanisms, motors and actuators, and more. In order to level the playing field, no outside materials were allowed. “We spent a lot of time just thinking, what is the perfect amount of materials? What materials do we get to allow teams to do everything they want to without costing too much? Having a team of ten people try to speak for 200 people is not very easy. It’s a huge risk.”

The event organizers consulted with professors and makers to create a comprehensive list of materials, but some teams were unsatisfied. “I think next time they should get some input on what materials to stock,” said Adams. His team built a physical arcade-style version of the late mobile game Flappy Bird, featuring a 3D-printed bird avoiding pipes on a scrolling LED matrix. They went in expecting individually addressable LED strips, but ended up having to hand-solder the array themselves. At a hackathon — where time is the most valuable resource — this was a huge productivity sink for the team.

Another bottleneck came when teams were allocated only a limited amount of material to laser cut. Many teams opted to 3D print their larger parts instead, which “took ages and almost DoS’d them,” said Adams, comparing the lines to a denial-of-service attack. Some projects had to be scrapped because there wasn’t enough time for 3D printing and laser cutting near the end of the hackathon.

In fact, time was even more valuable than usual for Saturday’s hackers. Typically, hackathons range from 24 to 48 hours in length, but MakeMIT was instead broken up into two one-day phases, due in part to safety concerns related to hosting an overnight event. “If we have a drilling or Dremel section, having hackers that are sleepy and tired operating those is obviously a risk to safety,” said Rachel S. Wang ’16, one of MakeMIT’s co-directors. “We really did try to push for an overnight hackathon, but [since this was] year one, we saw that it wasn’t feasible.”

At the end of Saturday’s event, teams presented a diverse array of projects, ranging from Wakey Wakey, a silent alarm clock, to ShotBot, a robotic bartender. Other projects included a relay baton that tracks split times based on handoffs and an oscilloscope probe built on a budget of under $50. One team, which included one of the developers of Tidbit, created another Bitcoin-related hack, exposing a design flaw in the official Bitcoin wallet.

These hacks were judged based on three criteria — functionality (how successful they were that day), potential for success (how much further the project could be developed), and hack factor (resourcefulness and creativity). Only the top ten teams from phase one are being invited back next weekend for the second round of MakeMIT, where they will further iterate upon their prototypes.

Taking first place and $2000 in prize money was a guitar-playing robot capable of both strumming and fretting. In second place was GoPro DataPac, an attachment for GoPro cameras that records data about action sports, such as velocity, altitude, rotation, and acceleration. LexoGlove, an exoskeleton glove that teaches the deaf-blind how to perform American Sign Language fingerings, took third place in Saturday’s competition.

Elizabeth Zhang ’16, one of the hackers who built LexoGlove, appreciated the fact that the hackathon was split over two days. “I don’t really have the stamina for [an overnight event]. I like sunlight and fresh air.” Her teammate Julia C. Canning ’16 agreed, saying that the extra week gave the team “time to refine [their] design more and time to order materials.”

The teams advancing to phase two of MakeMIT now have an opportunity to request a bill of materials. “Obviously, now they have more of a sense of what materials they want,” said Wang, “so we’re going to get those for them so we can be prepared.” Additionally, teams will have access to a machine shop and mentors, giving them a chance to develop their prototypes into more complete products.

LexoGlove currently uses a servo-based underactuated mechanism to pull the wearer’s fingers in and out to show them the sign corresponding to a particular word. This approach was sufficient for a proof of concept, but servos can be bulky. Before next Saturday, Zhang, Canning, and their third teammate, Edwin H. Zhang ’16, will think through the next iteration to make it thinner and lighter, perhaps by switching to a linear motor.

“At this point, it’s a bit less competitive,” noted Doan. “Everyone’s kind of made it, and now it’s more of a we-want-to-make-this-work kind of thing.” And the same can also be said of MakeMIT itself — despite the current trend among hackathons to “grow big,” scaling is far more difficult for a hardware-based event, where materials are paramount.

Unlike HackMIT, MakeMIT wasn’t trying to reach hackers beyond the Boston area. Nevertheless, 30 percent of participants came from outside MIT, suggesting a growing demand for hardware-based opportunities. “I definitely want MIT and other schools to do more hardware hackathons,” said Zhang. “I keep telling all my friends who do software, there’s no virtual without the physical.”

Despite a few logistical hiccups, the feedback from sponsors and participants was positive. In the near future, MakeMIT may accept a few more teams, but the organizers don’t want to expand at the cost of quality. Instead, they believe MakeMIT can inspire other schools to organize hardware hackathons. “Our hope was that MIT can lead the charge on this thing and show that it is possible. Hopefully, across the country, we’ll start seeing more of it.”

Stephen’s Media Diary: Video Gaaaaaaames
https://partnews.mit.edu/2014/02/12/stephens-media-diary/
Wed, 12 Feb 2014 06:03:11 +0000

Hello, my name is Stephen, and this is my first post on the MAS.700 blog! I have to admit, usually when I invoke phrases like “the future of news” and “the future of journalism,” it’s with a grain (or a fistful) of sarcasm. Nevertheless, I am very excited for this class and look forward to blogging here for the rest of the semester.

For our first assignment, we had to track all of our media consumption for a week and figure out a way to measure and present this information. Makes sense — to understand the future of news, we must first understand the present, most of all ourselves. “Know thyself, and you will know the universe.” Errr, the universe of media. Read on!

The method to this madness

Like many others in the class, I resorted to using RescueTime as my primary mechanism for tracking my media consumption. Given the focus of the class, I also supplemented RescueTime with a manually curated Google spreadsheet of any news content I was consuming. In any measurement-based observation, it’s important to understand the limitations of the tools you’re using to make the measurements. In the case of RescueTime:

  1. RescueTime is good for tracking your time usage on a single device, but it has trouble measuring media consumption once you introduce devices beyond your PC. While an Android app exists, there is no solution for my iPhone (which, luckily, was out of battery for most of the period) or my Xbox. Nor is there a good way to keep track of media consumed in communal spaces, such as TV shows in my hall lounge. RescueTime also failed to properly log time when I had my laptop plugged into my larger monitor, (I suspect) because the application was not “focused” on the correct display.
  2. RescueTime can create weekly reports of your time usage, but according to the dashboard, the week can only start on Sunday or Monday. Since we started this assignment in the middle of the week, this led to a very annoying issue: you could only see your weekly reports for 2/3 – 2/9 and 2/10 – 2/16 individually, with no way to specify a custom start and end point (for the purposes of this experiment, 2/6 – 2/11). In other words, there was no real way to see the data in aggregate — as a workaround, some of the class looked at the two weeks separately, focused on only one, or did a day-by-day breakdown.

I managed to solve both of these problems by using the 14-day free trial of a RescueTime Pro account, which gives you two perks. First, you can manually enter “offline” data, which I used as a workaround to log my secondary devices and any time that failed to record properly. Second, you can export your data as .csv files, which allowed me to merge the two weekly reports using OpenRefine.
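OpenRefine handled the merge for me, but the same join is only a few lines of pandas if you’d rather script it. A sketch, where the filenames and the “Date”, “Category”, and “Time Spent (seconds)” column names are my guesses at the export format, not documented RescueTime fields:

```python
# Sketch of merging two weekly RescueTime CSV exports and trimming to the
# assignment window (2/6 - 2/11). Filenames and column names are assumed.
import pandas as pd

week1 = pd.read_csv("rescuetime_2014-02-03.csv")
week2 = pd.read_csv("rescuetime_2014-02-10.csv")

merged = pd.concat([week1, week2], ignore_index=True)
merged["Date"] = pd.to_datetime(merged["Date"])
window = merged[(merged["Date"] >= "2014-02-06") & (merged["Date"] <= "2014-02-11")]

# Total hours per category over the assignment window
by_category = window.groupby("Category")["Time Spent (seconds)"].sum() / 3600
print(by_category.sort_values(ascending=False))
```

Either route gets you the aggregate view the dashboard refuses to show.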

As for my supplementary Google spreadsheet, I opted for a relatively broad definition of news content, listing any news story, article, or blog post that I read (but excluding social media posts, simply due to their sheer number and the difficulty of measurement). For each piece, I noted when I read it, what the source was, and how I was referred to the link (direct navigation in the browser, Facebook, Twitter, Reddit, etc.).

Small data

With those methodological choices in mind, let’s delve into the data. As it logs your computer usage, RescueTime tries to predict what category a particular site or application falls into, with varying degrees of accuracy. Based on this categorization, it also tags time as “productive” or “unproductive,” which is a rather lazy approach to measuring productivity: it assumes that each site (or category of sites) has a single, fixed mode of interaction, which is hardly the case. Unless otherwise specified, RescueTime automatically tags News & Opinion content as “very unproductive.” As a result, I chose not to look at the productivity data and instead looked generally at the category breakdown:

Of the 50 hours I logged this past week…

  • 18.6% of my time was spent playing video games
  • 16.6% of my time was spent watching videos, TV, or movies (YouTube, Netflix, offline)
  • 16.5% of my time was spent doing or reading about software development
  • 10.2% of my time was spent on social networks (Facebook, Twitter via Tweetbot)
  • 8.9% of my time was spent logging/reviewing the data (RescueTime, Google Docs – the observer effect for this experiment)
  • 7.3% of my time was spent reading the news/blogs
  • 4.9% of my time was spent on my email

Overwhelmingly, I spend my time on entertainment media — along with music (not pictured), it occupies a combined total of 37.3% of my time. Sublime Text 2, my preferred text editor, and localhost, my machine’s local testing server, logged significantly higher numbers than they would in a normal week — over the weekend, I participated in the 2014 QUILTBAG Jam and built a game within a day, skewing the numbers immensely. In other words, I’m even less productive than this data suggests. If we exclude those two items and Google Docs, which I used as a data-logging tool, the worst offenders were:

  1. Hearthstone (Blizzard’s online trading card game — think Magic: The Gathering)
  2. Facebook
  3. Twitter (Tweetbot)
  4. Gmail

Interestingly enough, all four have one thing in common: a very short use cycle. Hearthstone features quick games that typically take 5–10 minutes, making it the perfect way to take a quick break from work or fill the small void of time before a class. As a Facebook user, I like to pull up the news feed every now and again via the easy two-letter fb.com domain name. I quickly flick through it looking for things that pique my interest, and that’s it.

I use Twitter in very much the same way, except I check Tweetbot more often as it’s a desktop application (though, apparently, for shorter periods of time than Facebook). Email tends to work the same way — I’ll check it once in a while, resolve any important emails, gif threads, or flame wars, and then quit out. All four of these things take up small units of time per use cycle, and yet over a week’s time, quickly snowball into the biggest recipients of my time and attention.

Breaking (down) the news

Finally, on to the Google Spreadsheet data. I decided to look at two main things: the news sources I was reading, and how I was getting linked to those articles. I don’t really have any favorite publications that I exhaustively read — The Tech, maybe, but its volume of content is so low that I would hardly consider my reading exhaustive.

Instead, most of my news reading comes in via links on social networks and is largely dependent on the ebbs and flows of the week, as well as the particular times at which I check those networks. As a result, most of the 83 news articles I read were unique to their source (i.e., that was the only article I read from that website), though a handful of sites managed to get multiple hits.

Taking a step back from the top news sources, we can see which referral methods were most frequent overall. Facebook and Twitter dominated the pack — for future study, it’d be interesting to see how these referrals cascade down from Twitter (where I follow more journalists/news nerds) to Facebook (more of my IRL friends) and how these two frequencies might fluctuate depending on what time(s) of day I was looking at each network. If, for example, I’m not checking Twitter when a particular story is published and getting shared, I may see it later in the day if/when it propagates through my Facebook friend network.

Moreover, there is a stark contrast between the number of articles I navigated to directly (or hit via inlinks) and the number I accessed via outlinks. This says that I hardly ever click the external or source links within articles (or perhaps it’s a testament to the fact that publishers avoid providing those links to retain traffic?). I almost never use search to find news stories (in the one case shown here, it was a reference piece, not a news story). Unsurprisingly, the social stream remains the supreme decider of which articles I do or don’t read — thus explaining why everyone’s so keen on trying to dominate it.
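These tallies fall straight out of the log once the spreadsheet is exported as CSV. A sketch, where the filename and the “source” and “referral” column names mirror the fields described earlier but are otherwise hypothetical:

```python
# Sketch: tally referral methods and repeat news sources from the
# exported reading log. Filename and column names are assumptions
# based on the log fields described above.
import csv
from collections import Counter

with open("news_log.csv", newline="") as f:
    rows = list(csv.DictReader(f))

referrals = Counter(row["referral"] for row in rows)  # Facebook, Twitter, direct...
sources = Counter(row["source"] for row in rows)      # publication of each article

print("Referral methods:", referrals.most_common())
print("Repeat sources:", [(s, n) for s, n in sources.most_common() if n > 1])
```

A week of manual logging pays off here: two Counter passes reproduce both the referral and the repeat-source breakdowns.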

There are two more interesting statistics based on my news consumption for the week. First, I wanted to see the percentage of listicle content I was reading and the sources/referrals for them. I’m pleased to announce that only 4 of the 83 articles I read this week fell into this bucket (naturally, from BuzzFeed, Mashable, and Gizmodo, all referred via Facebook), making my news diet only 4.8% listicle.

Secondly, I wanted to see how I really latched onto a single story (as I mentioned in class last week, it was previously Philip Seymour Hoffman’s death) and read every article I could about it. This week, that story was the meteoric rise and fall of Flappy Bird, a game that seemingly came out of nowhere and invaded the public consciousness even faster than the doge meme.

This week, 8 of the 83 articles I read were exclusively about Flappy Bird. That’s nearly 10 percent, yet far less than I would have expected, given how often the game came up in my conversations, email and comment threads, and Twitter feed. Still, it’s a nontrivial portion of my news consumption, and it perhaps points to why more and more news outlets (most prominently The Atlantic, though Flappy Bird was covered everywhere from CNN to Forbes) are producing these critical/longform pieces dissecting pop and meme culture. It’s a particular subgenre of online journalism that my brother and I experimented with when we ran 21st Century Boy, and definitely one I will continue to keep my eye on as we progress through this course.
