ODPi is a nonprofit organization committed to the simplification and standardization of the big data ecosystem through common reference specifications and test suites.

Why ODPi?

Apache Hadoop, its components, and Apache Hadoop distributions are innovating quickly and in different directions. This diversity, while healthy, also slows the growth of the Big Data ecosystem.

By providing specifications for a common Apache Hadoop runtime and operations, along with reference implementations and test suites, ODPi reduces cost and complexity and accelerates the development of Big Data solutions.

The ODPi:

  • Reinforces the role of the Apache Software Foundation (ASF) in the development and governance of upstream projects.
  • Accelerates the delivery of Big Data solutions by providing a well-defined core platform to target.
  • Defines, integrates, tests, and certifies a standard “ODPi Core” of compatible versions of select Big Data open source projects.
  • Provides a stable base against which Big Data solution providers can qualify solutions.
  • Produces a set of tools and methods that enable members to create and test differentiated offerings based on the ODPi Core.
  • Contributes to ASF projects in accordance with ASF processes and Intellectual Property guidelines.
  • Supports community development and outreach activities that accelerate the rollout of modern data architectures that leverage Apache Hadoop®.
  • Helps minimize fragmentation and duplication of effort within the industry.