The concept of a data node originated at the Naval Research Laboratory in Washington, DC, in 2005. The project, funded in part by the Office of Force Transformation and the Office of Naval Research, was called VMOC (Virtual Mission Operations Center). The original intent was to demonstrate significant force transformation by launching a tactical satellite and allowing direct tasking from users on the ground. Data would be made available immediately to all tactical users, and VMOC was the web portal enabling this capability anytime, anywhere in the world. Collaboration between users was also deemed critical. The emergence of social communities outside the strict DOD organizational structure was expected and was to be measured.

As the TACSAT-1 satellite was delayed (it has not been launched yet), VMOC searched for other data to serve the user community. Several other future TACSAT satellites became new requirements to support. The Office of Naval Research (ONR) was extremely interested in the technology for Maritime Awareness, and the US Coast Guard (USCG) was mentioned as a potential customer. Automatic Identification System (AIS) data coming from ships was another good source of data to be collected by sea and ground nodes. We were still lacking a satellite to task and retrieve data from. Aware of the EO-1 technology demonstration opportunity, we eventually contacted NASA about making EO-1 a space node proxy on the open Internet.

The Earth Observing-1 (EO-1) mission, part of the New Millennium Program (NMP), is a technology-driven mission. The initial baseline was mainly a hardware experiment; the emphasis has now shifted to software with the Autonomous Sciencecraft Experiment (NASA Ames Livingstone model-based diagnostics, the JPL CASPER planner, and an expert system), onboard cloud cover detection and prediction, and automated detection of volcanoes and fires with collaborating space/ground assets. The final goal is to experiment with an evolving ground/space-based software architecture to enable sensor webs and, eventually, fully autonomous mission operations systems – a perfect test bed.

It quickly became apparent that building a stovepipe was not going to solve our problems, and the concept of data nodes soon emerged. If we had documented APIs, we could hand them over to the “data” community and focus on VMOC development. Unfortunately, no such thing existed: partial solutions were available, but nothing comprehensive. Coincidentally, the Open Geospatial Consortium issued a request for an interoperability demonstration called OWS-4, intended to exercise upcoming specifications. Many of those specifications appeared to be exactly what we needed. We had internally worked out some implementations and proposed to join the effort under the SWE subtask, one of five subtasks of this ambitious demonstration. This was accepted, and EO-1 was renamed the OWS-4 satellite for the occasion.

The OGC Sensor Web Enablement (SWE) subtask is itself extremely ambitious: "The ultimate vision is of a sensor market place where users can identify, evaluate, select and request a sensor collection regardless of sensor type, platform or owner. The goal is to enable a federation of sensors, platforms and management infrastructure for a single sensor enterprise. This enterprise will enable the discovery and tasking of sensors as well as the delivery of sensor measurements regardless of sensor type and controlling organization."

Our goal is to enable data nodes to emerge quickly with a low cost of entry. We now had a vision for the standard APIs (or rather an idea of where to start). We needed a framework that we could rapidly deploy on the Internet, and the concept of GeoBliki emerged as a fusion of many web technologies. To keep the cost of entry extremely low and avoid proprietary implementations that would have raised the barriers to entry, we went with an open-source approach. With time to market being of the essence, we needed a quick prototyping environment to demonstrate the technology. We had the opportunity to start from scratch and pick the best web application stack, so we decided to go with the most promising language and framework. Ruby was selected for its succinctness and elegance; many writings affected our thinking in that area. Ruby on Rails was chosen for its quick prototyping speed, its Model-View-Controller approach, its MySQL support, and its built-in testing framework. Test-driven development was becoming our mantra at the time.
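As an illustration of the test-first style we adopted, here is a minimal, framework-free sketch of the kind of Ruby model a data node might hold, together with a Minitest case. The `DataProduct` class and its fields are hypothetical, not the actual GeoBliki schema:

```ruby
require "minitest/autorun"

# Hypothetical data-node model; names and fields are illustrative only.
class DataProduct
  attr_reader :name, :lat, :lon

  def initialize(name, lat, lon)
    raise ArgumentError, "latitude out of range"  unless (-90.0..90.0).cover?(lat)
    raise ArgumentError, "longitude out of range" unless (-180.0..180.0).cover?(lon)
    @name, @lat, @lon = name, lat, lon
  end

  # Location encoded as a GeoRSS-Simple point: "lat lon"
  def georss_point
    format("%.4f %.4f", lat, lon)
  end
end

class DataProductTest < Minitest::Test
  def test_formats_a_georss_point
    product = DataProduct.new("EO-1 scene", 37.42, -122.08)
    assert_equal "37.4200 -122.0800", product.georss_point
  end

  def test_rejects_an_out_of_range_latitude
    assert_raises(ArgumentError) { DataProduct.new("bad", 123.0, 0.0) }
  end
end
```

Writing the test alongside (or before) the model in this way is what Rails' built-in testing framework encourages out of the box.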

Several other components were quickly selected. We needed to display our geospatial data on a map within a standard browser. As developers of the Community MapBuilder (CMB) Web Map Client, this was an easy decision for us. CMB is a thin cross-browser Ajax client that supports many OGC specifications, such as WMS and WFS, and even the new GeoRSS/Atom specification.
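To make the WMS interaction concrete: the request a client such as CMB issues for a map image is an ordinary HTTP GET. Below is a sketch of building such a WMS 1.1.1 GetMap URL in Ruby; the endpoint and layer name are placeholders, not actual GeoBliki services:

```ruby
require "uri"

# Assemble an OGC WMS 1.1.1 GetMap URL from its standard parameters.
# The base endpoint and layer name are illustrative placeholders.
def wms_getmap_url(base, layer:, bbox:, width: 512, height: 512)
  params = {
    "SERVICE" => "WMS",
    "VERSION" => "1.1.1",
    "REQUEST" => "GetMap",
    "LAYERS"  => layer,
    "SRS"     => "EPSG:4326",
    "BBOX"    => bbox.join(","),   # minx,miny,maxx,maxy
    "WIDTH"   => width.to_s,
    "HEIGHT"  => height.to_s,
    "FORMAT"  => "image/png"
  }
  "#{base}?" + URI.encode_www_form(params)
end

url = wms_getmap_url("http://example.org/wms", layer: "eo1_scenes",
                     bbox: [-122.6, 37.2, -121.8, 38.0])
```

Because every parameter is a simple key-value pair, any standards-conforming WMS server accepts the same request shape, which is precisely what makes the thin Ajax client portable across data nodes.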

One of the first problems was to acquire the data and make it seamlessly available from one location. In this case, data had to be retrieved from many sites: we had to scrape information from the United States Geological Survey, the Jet Propulsion Laboratory, and a few NASA Goddard sites. Data products include images (high- and low-resolution JPEG), raw data (which can only be kept for a few days due to size constraints), mission goals and results, feasibilities and tasking, ephemeris, and more. We used various web scraping tools such as htmltools-1.09 and Rubyful Soup 1.0.4. As the data node acquires data, GeoRSS/Atom 1.0 feeds are sent to users who subscribe to that information. Further information will be presented in the OGC Services section.
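The following is a minimal sketch of the kind of Atom 1.0 entry carrying a GeoRSS-Simple point that a data node could publish when a new product is acquired, built with Ruby's bundled REXML library; the function name and all values are illustrative, not the actual feed contents:

```ruby
require "rexml/document"

# Build one Atom 1.0 entry carrying a GeoRSS-Simple point.
# Identifiers and values below are placeholders.
def georss_entry(id:, title:, updated:, lat:, lon:)
  doc = REXML::Document.new
  entry = doc.add_element("entry",
    "xmlns"        => "http://www.w3.org/2005/Atom",
    "xmlns:georss" => "http://www.georss.org/georss")
  entry.add_element("id").text      = id
  entry.add_element("title").text   = title
  entry.add_element("updated").text = updated
  # GeoRSS-Simple encodes a point as "lat lon"
  entry.add_element("georss:point").text = "#{lat} #{lon}"
  out = ""
  doc.write(out)
  out
end

xml = georss_entry(id: "urn:uuid:0000", title: "New EO-1 scene",
                   updated: "2006-10-01T12:00:00Z", lat: 37.75, lon: 14.99)
```

A subscriber polling the feed (or receiving it pushed) can then place each entry on the map directly from the `georss:point` element.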

There are many ways to present the data to the user. A chronological view is provided by a geo-blog interface; Typo 2.6 was selected because it is packed with features and includes a theme manager, which is critical for customizing new data nodes. As of this writing, Typo 4 was just released, and we hope to integrate it soon. The geo-blog view allows the local scientific community to discuss particular results or pictures: image quality, processing algorithms, and ways to improve particular results. Photo annotations will be added very soon to allow information exchange around photographic data.

Data can be arranged by topic (books) and/or grouped by area of interest by users or data librarians. This is the wiki view. Since our data is geospatially oriented, we extended the concept of a wiki to a Geo-Wiki, which allows data to be grouped along areas of interest and results to be shown on a map. We used Hieraki 2, initially written by the same developer, Tobias Luetke, and now supported by Alexander Horn. User-selected Web Map Context (WMC) documents can be saved and stored under a specific topic. Those documents can specify map features, zoom level, layers, and feeds to be visualized on the map. Discussion around that information can take place on the wiki page (given the proper access level for that user).
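Trimmed to its essentials, a saved WMC document of the kind stored under a topic looks like the sketch below; the server URL, layer name, title, and bounding box are placeholders, not actual stored values:

```xml
<ViewContext version="1.1.0" id="area_of_interest"
             xmlns="http://www.opengis.net/context"
             xmlns:xlink="http://www.w3.org/1999/xlink">
  <General>
    <Window width="512" height="512"/>
    <!-- Extent and zoom are captured by the bounding box -->
    <BoundingBox SRS="EPSG:4326" minx="14.5" miny="37.3" maxx="15.5" maxy="38.1"/>
    <Title>Mt. Etna area of interest</Title>
  </General>
  <LayerList>
    <Layer queryable="0" hidden="0">
      <Server service="OGC:WMS" version="1.1.1" title="EO-1 data node">
        <OnlineResource xlink:type="simple" xlink:href="http://example.org/wms"/>
      </Server>
      <Name>eo1_scenes</Name>
      <Title>EO-1 scenes</Title>
    </Layer>
  </LayerList>
</ViewContext>
```

Because the document is plain XML, it can be versioned, attached to a wiki topic, and reloaded by the map client to restore exactly the view a user saved.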

Forums have been implemented using Opinion, again from the same author. Forums are used for specific user communities (science, development, etc.) and provide permanent feedback to the various groups. Wildfire, an XMPP server, is integrated into the site and provides real-time instant messaging back to our customers. We are still searching for a good Ajax-based IM client that integrates into a multi-platform web browser; until then, users will have to rely on their existing clients. Gabbly is being used temporarily in the sidebar, but it has proven to be memory intensive and is page-specific.

ActiveRBAC provides our role-based access control. We can create static permissions and allocate them to roles; users are assigned a specific role at login and inherit the specified permissions. More complex situations could use the definition of user groups and super-groups. This capability is fairly critical to this application, as high-resolution imagery and satellite tasking are reserved for specific roles. Identity 2.0 is becoming a very sensitive issue for Web 2.0, and several protocols are emerging. This is a critical capability for our system, as we want to accept users from other trusted domains: we must be able to accept users with proper credentials and exchange profile information with their respective identity providers in a streamlined manner. We have selected the Yadis protocol and the OpenID 1.1 implementation. Every registered GeoBliki user automatically gets an OpenID URL that may be presented at many sites for authentication. As part of the profile exchange, static permissions must be exchanged. We will very soon add Friend of a Friend (FOAF) documents, which are becoming important for social network analysis.
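The permission model can be pictured with a small Ruby sketch; the class names, role names, and permission symbols below are illustrative only, not the actual ActiveRBAC API:

```ruby
# Illustrative role-based access control: users hold roles,
# and roles hold static permissions. Names are hypothetical.
Role = Struct.new(:name, :permissions)

class User
  attr_reader :name, :roles

  def initialize(name, roles)
    @name, @roles = name, roles
  end

  # A user may act if any of their roles carries the permission.
  def can?(permission)
    roles.any? { |role| role.permissions.include?(permission) }
  end
end

guest     = Role.new("guest",     [:view_low_res])
scientist = Role.new("scientist", [:view_low_res, :view_high_res, :task_satellite])

alice = User.new("alice", [scientist])
bob   = User.new("bob",   [guest])

alice.can?(:task_satellite)  # => true
bob.can?(:view_high_res)     # => false
```

Because permissions are static and attached to roles rather than individuals, they can be carried across domains during the profile exchange described above.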