scalable scientific processors
Developer Cloud Sandboxes
- Hosting experiments, made available by and to researchers worldwide.
- Productivity at your fingertips, for scientists wanting to boost their research agenda.
The 'Developer Cloud Sandboxes' service draws on Terradue's expertise in developing, maintaining and evolving the European Space Agency's G-POD operational environment, where specific data handling applications can be seamlessly plugged in and coupled with high-performance, scalable computing resources. It leverages technology designed for researchers with data-intensive requirements. Using 'Hadoop Streaming' for legacy application integration, scientific processors can access distributed data holdings and scale over computing clusters. The service builds on OpenNebula technology for managing virtualized computing nodes, and on standard Cloud APIs for resource provisioning and appliance deployment. All integrated: data and processors at your fingertips.
- Implement processing chains with full control of code, parameters & data flows.
- Collaborate within a shared Platform, delivered as a Service.
- Streamline development and dissemination efforts.
- Leverage seamless Cloud APIs to stage data & deploy code on ad-hoc clusters.
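To illustrate the Hadoop Streaming pattern mentioned above, here is a minimal Python sketch of the map and reduce steps a Sandbox processor follows. The file naming convention, URLs and keying logic are purely hypothetical; a real integration would invoke the legacy processor inside the mapper rather than parse file names.

```python
def map_record(line):
    """Mapper step: turn one input record (a data product URL) into
    (key, value) pairs. In a Sandbox the mapper would invoke the legacy
    processor on the staged product; here we just key on the acquisition
    year encoded in a hypothetical file naming convention."""
    name = line.strip().rsplit("/", 1)[-1]
    if not name:
        return []
    year = name.split("_")[1][:4]  # e.g. ASAR_20130417_etna.zip -> "2013"
    return [(year, 1)]

def reduce_pairs(pairs):
    """Reducer step: aggregate values per key, as Hadoop Streaming would
    after sorting the mappers' "key<TAB>value" output lines."""
    counts = {}
    for key, value in pairs:
        counts[key] = counts.get(key, 0) + value
    return counts

# Simulated input split: one product reference per line, as Hadoop
# Streaming would feed them to the mapper over stdin.
records = [
    "https://example.org/ASAR_20130417_etna.zip",
    "https://example.org/ASAR_20130503_etna.zip",
    "https://example.org/ASAR_20120901_etna.zip",
]
pairs = [pair for record in records for pair in map_record(record)]
print(reduce_pairs(pairs))
```

The same two functions, packaged as stdin/stdout scripts, would run unchanged on a cluster under Hadoop Streaming, which is what makes the approach suitable for legacy code.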
data management federations
Virtual Archives
- Enabling sensor measurements to scale within cloud infrastructures.
- Data management for eScience, designed to support not-for-profit research.
The Virtual Archives service provides a cloud-based facility for Earth Sciences data, coupling high bandwidth, large storage and integrated search. It builds on Storage-as-a-Service, coupled with user authentication and authorization, a simple OpenSearch data discovery interface, query results in ATOM, RDF or KML formats, and data staging via common web protocols. Terradue's research & development for Virtual Archives is partially funded by the European Commission's Framework Programme 7, in the context of the GENESI-DEC, GEOWOW and MED-SUV (Mediterranean Supersites Volcanoes) projects, and by the European Space Agency's SSEP (Supersites Exploitation Platform) contribution to the geohazards communities dealing with interferometry, landslides and change detection (over 50,000 SAR data products), evolved as a flagship application within the Helix Nebula Federated Cloud for science.
- Value data from sensors deployed for long-term monitoring experiments.
- Feed experiment measurements to user communities through managed services.
- Benefit from Cloud-enabled software-as-a-service and usage metrics.
- Contribute Digital science and Open Archives (e.g. GEO's Supersites).
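The OpenSearch discovery interface described above returns results as Atom feeds. The sketch below shows how a client might extract product titles and download links from such a response, using only the Python standard library; the sample feed, entry title and enclosure URL are invented for illustration and do not reflect an actual Virtual Archives response.

```python
import xml.etree.ElementTree as ET

ATOM_NS = {"atom": "http://www.w3.org/2005/Atom"}

# Illustrative Atom response, shaped like an OpenSearch query result
# (titles and URLs are made up for this sketch).
SAMPLE = """<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <title>Virtual Archives search results</title>
  <entry>
    <title>SAR product over Mount Etna</title>
    <link rel="enclosure" href="https://example.org/data/sar-etna.zip"/>
  </entry>
</feed>"""

def list_products(atom_xml):
    """Extract (title, download URL) pairs from an Atom search response.
    The rel="enclosure" link conventionally points at the data itself."""
    feed = ET.fromstring(atom_xml)
    products = []
    for entry in feed.findall("atom:entry", ATOM_NS):
        title = entry.findtext("atom:title", namespaces=ATOM_NS)
        link = entry.find("atom:link[@rel='enclosure']", ATOM_NS)
        products.append((title, link.get("href") if link is not None else None))
    return products

print(list_products(SAMPLE))
```

Because the response is plain Atom, the same parsing works whether the query came from a browser, a script, or a processing job staging its inputs.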
Data Challenge Platforms
- Addressing new research problems in a challenging and collaborative way.
- A platform for the automation of data mining and information extraction experiments.
The Data Challenge Platforms service helps promote and accelerate innovative uses of Earth Sciences data, meeting demands for greater returns on data collection investments. Organize contests where different algorithms are assessed in the service's common software environment, facilitating the comparison of approaches. The US Government's 'COMPETES' Act authorizes the use of creative techniques such as challenges and contests to spur innovations that leverage Federal data holdings. In April 2012, NASA conducted the first International Space Apps Challenge to “encourage scientists and citizens to create, build, and invent new solutions”. Early in 2013, ESA awarded Terradue a contract to evolve its G-POD infrastructure for the management of scientific contests that leverage ESA satellite data. You can already start transforming the way you do science.
- Experience multi-disciplinary collaborations, and foster new ideas in data usages.
- Automate the conduct of experiments in data mining and information extraction.
- Transition from data computation to knowledge exploration.
- Build the research continuum, from raw data to publications.
Digital Marketplaces
- Building new models for executable papers and reproducible experiments.
- Innovation as a service: shape the future of scientific publication.
The Digital Marketplaces service is a linked-data-driven approach for curating research results and value-adding services. Along with Terradue's Developer Cloud Sandboxes solution, researchers create linkages between scientific publications and fully reproducible, verifiable experiments, where measurements, models and datasets are all part of an integrated ecosystem. Relate your research results to multiple open data streams available from a variety of sources (Earth Sciences, socio-economic), and share knowledge aimed at fostering innovation and new applications. The platform's selection engine increases the value of your scientific experiments for novel applications and users. Terradue's research & development for Digital Marketplaces is partially funded by the European Commission's Framework Programme 7, in the context of the MELODIES and i-MARINE projects.
- Link measurements, algorithms and scientific papers.
- Relate experiments, parameters, and research communities.
- Create lively marketplaces of data analytics resources for Earth Sciences.
- Curate and deliver scientific information, from raw data to publication.
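Linking measurements, algorithms and papers, as above, comes down to publishing RDF statements. The sketch below emits such links in N-Triples syntax with plain Python; the resource identifiers are hypothetical, while the predicates are drawn from the public CiTO, PROV and Dublin Core vocabularies.

```python
def ntriple(subject, predicate, obj):
    """Serialize one RDF statement in N-Triples syntax: URIs in angle
    brackets, plain strings as quoted literals."""
    if obj.startswith("http"):
        return f"<{subject}> <{predicate}> <{obj}> ."
    return f'<{subject}> <{predicate}> "{obj}" .'

# Hypothetical identifiers: a publication, the experiment behind it,
# and the dataset the experiment consumed.
PAPER = "https://example.org/papers/volcano-insar-2013"
EXPERIMENT = "https://example.org/experiments/insar-run-42"
DATASET = "https://example.org/datasets/envisat-asar-etna"

triples = [
    # the paper cites the experiment as its supporting evidence
    ntriple(PAPER, "http://purl.org/spar/cito/citesAsEvidence", EXPERIMENT),
    # the experiment's provenance records which dataset it used
    ntriple(EXPERIMENT, "http://www.w3.org/ns/prov#used", DATASET),
    ntriple(DATASET, "http://purl.org/dc/terms/title",
            "Envisat ASAR scenes over Mount Etna"),
]
print("\n".join(triples))
```

Once exposed as Linked Open Data, such statements let a reader walk from a publication to the exact experiment and dataset behind it, which is the basis of the reproducibility claim.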
open web adoption
- Going beyond service integration, with scalable web services and simple interfaces.
- Data virtualization at work, to overcome the data and processor swarming challenges.
We integrate and contribute Open Source components in key business domains for Earth Sciences: Cloud provisioning & Cloud bursting APIs (e.g. leveraging the OpenNebula technology), geospatial information management APIs (e.g. leveraging the OGC's Web Processing Service and OWS Context standards), and distributed data processing APIs (e.g. Hadoop Streaming). We deploy data casting gateways to spotlight "dark data" from archives with complex query APIs, and broadcast them over the internet. We build cross-feed readers to aggregate and podcast multi-provider sources in ways meaningful to scientists, so they stage only the content that meets their requirements. Altogether, we support applications for scientific processor authoring, powered by automatic time and location clustering. Our references include engineering services for the future European ground segments of the next-generation "Sentinel" satellites.
- Publish and maintain Linked Open Data as RDF models.
- Experience the power of scalable, RESTful web services technology.
- Leverage architecture styles and patterns of distributed systems.
- Enable your resources to join the innovation and economic growth of the Web.
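The automatic time clustering mentioned above can be sketched in a few lines: acquisitions separated by less than a threshold are grouped into the same cluster. The timestamps and the six-hour gap are illustrative assumptions, not parameters of an actual Terradue component.

```python
from datetime import datetime, timedelta

def cluster_by_time(timestamps, max_gap=timedelta(hours=6)):
    """Group acquisition times into clusters: a new cluster starts
    whenever the gap to the previous acquisition exceeds max_gap."""
    clusters = []
    for t in sorted(timestamps):
        if clusters and t - clusters[-1][-1] <= max_gap:
            clusters[-1].append(t)  # close enough: same cluster
        else:
            clusters.append([t])    # gap too large: start a new cluster
    return clusters

# Three illustrative acquisitions: two on the same morning, one twelve days later.
acquisitions = [
    datetime(2013, 4, 17, 9, 0),
    datetime(2013, 4, 17, 10, 30),
    datetime(2013, 4, 29, 9, 5),
]
print([len(c) for c in cluster_by_time(acquisitions)])  # two clusters: [2, 1]
```

The same single-pass grouping applies to location clustering by swapping the time gap for a distance threshold between footprints.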
Business & Market Analysis
- Driving IT transformation, with a focus on managed services and distributed processing.
- Innovations from emerging technologies and best practices for IT procurement.
We collaborate to identify and tailor processes leading to the creation of added-value services. Our analysts help you identify the shortcomings in your user engagement channels, how you can deliver new products in the future, or partner to do so, and the system architecture choices and trade-offs needed to advance the deployment of a portfolio of on-line products and services. From global environmental analysis using massive amounts of Earth Sciences data, to regional land change detection with complex third-party algorithms, Terradue offers a specialist service for the management and distribution of very large spatial datasets, complemented with data inventory, query and processing systems, carried out collaboratively with open source projects. Terradue also brings its experience as an operator of a private Cloud infrastructure, supporting bursting to commercial Cloud APIs.
- Update in-house knowledge of data sharing policies and interoperability rules.
- Develop strategies for scientific data curation and long term data preservation.
- Compare your organization's maturity levels with state of the art technologies.
- Analyze community trends and new opportunities for your corporate roadmap.