Monday, April 1, 2013

My 2013 AFCEA San Diego Plugfest Participation - Showing off Terracotta In-Genius

A couple of weeks ago (end of January 2013), two colleagues and I participated (under our company banners, Terracotta and Software AG) in a government "plugfest"...and we won first place! Check out this other article that also talks about our win: - and if you're in a rush, jump directly to the short 3-minute video demo below.

What is that, you may ask? As explained on the AFCEA website, a plugfest is a "collaborative competitive" challenge where industry vendors, academia, and government teams work toward solving a specific set of "challenges" strictly using the RI2P industrial best practices (agile, open standards, SOA, cloud, etc.) for enterprise information system development and deployment.

The idea is to "plug" technologies together (technologies provided by the various players, not necessarily within your team) as opposed to rebuilding everything from scratch. And indeed, "plugging" is almost mandatory, since the scenario is only announced 24 hours before the event, giving the teams a mere 72 hours to create something based on the scenario provided.

Overall, it's a government effort to encourage and push for more interoperability and reuse of IT components across projects and even agencies.

This particular January 2013 plugfest was about solving a Humanitarian Assistance and Disaster Relief (HADR) use case, where technology:
  • Helps track in real-time what's happening on the ground (data streams about hazardous materials, first responders, sensors, injured civilians, etc...) and report it in an actionable, geospatially-enabled format
  • Provides real-time decision support based on pre-defined emergency protocols
  • Correlates various "BigData" streams (sensors, social feeds, etc...) to perform real-time analytics in order to predict movements and/or identify "flash mobs" / criminal hotspots taking advantage of the confusion.
The end result of what we put together was a real-time "map" dashboard that shows everything happening on the ground and provides contextual highlights to support decision-making.
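To give a flavor of the kind of stream correlation involved, here is a minimal, hypothetical sketch of grid-based hotspot detection over a stream of geotagged events. This is not the actual plugfest code (which ran inside Terracotta's CEP engine as continuous queries); the function name, cell size, and threshold are illustrative assumptions:

```python
from collections import Counter

def detect_hotspots(events, cell_deg=0.01, threshold=5):
    """Bucket geotagged events into a lat/long grid and flag any
    cell whose event count reaches `threshold` (a candidate "hotspot").
    `events` is an iterable of (lat, lon) pairs; `cell_deg` is the
    grid cell size in degrees (0.01 deg is roughly 1 km of latitude)."""
    counts = Counter(
        (int(lat / cell_deg), int(lon / cell_deg)) for lat, lon in events
    )
    return [cell for cell, n in counts.items() if n >= threshold]
```

A real CEP deployment would run the same idea incrementally over a sliding time window instead of a batch, so that a burst of social-feed or sensor events in one grid cell raises a flag within seconds.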
Here is a short 3 minute video showing the nuts and bolts of that demo:

What you'll see in the video demo:
  • Moving actors in the disaster zone (first responders, plumes of toxicity, drones, etc...). Each of these actors "broadcasts" its current geolocation (lat, long + metadata) at various time intervals using Nirvana Universal Messaging (thousands of messages per second)
  • Terracotta's complex event processing (CEP) engine performing continuous "geo" queries that identify in real-time the distance, speed, and direction of the hazardous plumes relative to the various Red Cross shelters on the map. The CEP engine automatically generates alerts if hazardous plumes are forecasted to impact shelters...providing critical decision support to the commander in charge.
  • All events and metadata stored in-memory, using Terracotta BigMemory, for faster, microsecond access and analytics.
  • The ability to drill into the moving drones, planes and responders to see a ground view in real-time.
  • Triage-based casualty tracking and available blood supply.
  • Availability of shelters, Red Cross centers, blood banks, and other supporting organizations (DoD types).
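The continuous "geo" query described above boils down to repeatedly computing the distance between each plume and each shelter and raising an alert when a plume is both close and closing in. Here is a rough, hypothetical Python sketch of that check (the real system expressed this as continuous queries inside the CEP engine; the function names, alert radius, and shelter data are illustrative assumptions):

```python
import math

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/long points, in km."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

def plume_alerts(prev_pos, curr_pos, shelters, radius_km=5.0):
    """Compare a plume's previous and current (lat, lon) positions
    against each shelter; alert when the plume is inside `radius_km`
    AND got closer since the last sample (i.e. heading toward it)."""
    alerts = []
    for name, (slat, slon) in shelters.items():
        d_prev = haversine_km(*prev_pos, slat, slon)
        d_curr = haversine_km(*curr_pos, slat, slon)
        if d_curr < radius_km and d_curr < d_prev:
            alerts.append(name)
    return alerts
```

In the demo, the same computation ran continuously against the live message stream, so an alert fired the moment a plume's forecasted track intersected a shelter's radius.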
List of what we "plugged":
Thanks to everyone who organized this event. It was a blast to participate as part of the Terracotta team, and I'm looking forward to the next "plugfest" event!
