ODPi FAQ

March 29, 2016

Who We Are

ODPi is a nonprofit organization committed to simplification and standardization of the big data ecosystem with a common reference specification called ODPi Core.

As a shared industry effort and Linux Foundation project, ODPi is focused on promoting and advancing the state of Apache Hadoop® and big data technologies for the enterprise.

The rapid influx of digital information available to enterprises has resulted in a big data ecosystem that is challenged and slowed by fragmented, duplicated efforts. ODPi’s members aim to accelerate the adoption of Apache Hadoop and related big data technologies with the goal of making it easier to rapidly develop applications through the integration and standardization of a common reference platform called the ODPi Core.

Where We Are Today

  • ODPi currently has 26 members and more than 35 maintainers from 25 companies dedicated to its ongoing work.

  • Membership investments have nearly doubled since ODPi was announced in February 2015.

  • ODPi is open to all, with a very low hurdle for any developer or company to participate and have an impact.

Key Points

  1. ODPi provides cross-compatibility between different distributions of Hadoop and big data technologies.

    1. ODPi Core specifies how Apache components should be installed and configured and provides a set of tests for validation to make it easier to create big data solutions and data-driven applications.

    2. ODPi Core is not a distribution; it is an industry-standard deployment model on which the industry can build enterprise-class big data solutions.

  2. The fragmented Hadoop market increases ISV costs, reduces innovation, and makes delivering business value harder. By solving these problems, ODPi fills a gap in the big data ecosystem.

    1. To overcome the interoperability and fragmentation challenges this industry faces, it will take all of us working together. Linux is a great example of how open source can speed innovation and market transformation – that’s what we’re doing at ODPi.

    2. Organized to support the ASF, ODPi promotes innovation and development of upstream projects like Hadoop and Ambari.

    3. Now 10 years old, Hadoop has become a mature technology that serves hyperscale environments and can handle widely varying amounts and types of data. It’s a proven and popular platform among developers who need a technology that can power large, complex applications.

    4. Yet Hadoop components and Hadoop distributions are innovating very quickly and in many different ways. This diversity, while healthy in many ways, also slows big data ecosystem development and limits adoption.

    5. The industry now needs more open source-based big data technologies and standards so application developers and enterprises are able to more easily build data-driven applications.

  3. The ODPi Core removes cost and complexity to accelerate the development of big data solutions.

    1. ODPi helps the three key ecosystem players:

      1. Hadoop Platforms (distros): ODPi compliance guidelines enable ODPi-compatible software to run successfully on their solutions. The guidelines also allow providers to patch their customers expeditiously in emergencies.

      2. ISVs/SIs: ODPi compatibility guidelines allow them to “test once, run everywhere,” eliminating the burden and cost of certification and testing across multiple distributions. They can also count on a predictable release cadence that reduces maintenance and support costs.

      3. Enterprises (end users): Ability to run any “ODPi-compatible” big data software on any “ODPi-compliant” platform and have it work.

  4. ODPi will bring value to the market by:

    1. Standardizing the commodity work of the components of a Hadoop distribution

    2. Providing a common platform against which to certify apps, reducing the complexities of interoperability

    3. Ensuring a level of compatibility and standardization across distribution and application offerings for management and integration

FAQ: Project Scope and Roadmap

Q: How is testing administered? What is the process for becoming ODPi compliant?  

A: Testing is currently self-administered. To become ODPi compliant, vendors must submit test results for the product release they would like certified. They do not have to comply with every specification for every product release.

This GitHub repository is where vendors can commit their ODPi spec test runs to let others know when their distro is compliant. Instructions on how to report self-certification are also included.
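For illustration, here is a minimal sketch of how a vendor might sanity-check a test-results file before committing it to the repository. The file layout, field names, and script are hypothetical assumptions, not part of the ODPi specification; consult the odpi/specs repository for the actual reporting format.

```python
#!/usr/bin/env python3
"""Hypothetical pre-submission check for ODPi self-certification results.

The results-file layout and field names below are illustrative only;
the odpi/specs repository documents the actual reporting format.
"""
import json
import sys


def check_results(path):
    """Return True if every test case recorded in the results file passed."""
    with open(path) as fh:
        report = json.load(fh)

    # Assumed shape: {"distro": "...", "tests": [{"name": "...", "status": "passed"}, ...]}
    tests = report.get("tests", [])
    failed = [t.get("name", "<unnamed>") for t in tests if t.get("status") != "passed"]

    if not tests:
        print("No test results found in", path)
        return False
    if failed:
        print("Not ready to report compliance; failing tests:")
        for name in failed:
            print("  -", name)
        return False

    print("All %d recorded tests passed for %s."
          % (len(tests), report.get("distro", "unknown distro")))
    return True


if __name__ == "__main__":
    sys.exit(0 if check_results(sys.argv[1]) else 1)
```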

Q: How long is the testing process to become ODPi-certified?

A: The specification has just become available, but several members that have been planning the validation expect running the tests to take only about 20 minutes, making it a 1-2 day effort at most overall.

Q: Can you explain the ODPi release cycle?

A: ODPi will continue developing the Runtime Specification with updated releases coming every six months. After the March release, expect another in October 2016. The ODPi Operations Specification 1.0 is expected late this summer.

Q: When will the Operations Specification be published?

A: The ODPi Operations Specification is the other piece of the ODPi Core puzzle.  It will help improve installation and management of Hadoop and Hadoop-based applications and will be available in late summer.  The Operations Specification covers Apache Ambari, the ASF project for provisioning, managing, and monitoring Apache Hadoop clusters.

Q: How does ODPi complement the Apache Software Foundation (ASF)?

A: The Apache Software Foundation supports many rapidly growing open source projects. ODPi, a complementary shared-industry organization, is focused solely on easing integration and standardization within the Hadoop ecosystem.

ODPi is also contributing to ASF projects in accordance with ASF processes and Intellectual Property guidelines. ODPi will support community development and outreach activities that accelerate the rollout of modern data architectures that leverage Apache Hadoop. For example, ODPi is contributing back to projects like Ambari and BigTop, with more than half the code in the latest release of BigTop coming from ODPi.

Q: How do I get involved?

A: Membership is not a requirement to become involved with ODPi’s technology, as all development is done in the open. Visit www.github.com/odpi/specs. Get involved with the ODPi project on GitHub by signing up for our mailing list, sending pull requests, or giving us feedback at https://jira.odpi.org. Our governance model offers one member one vote equality.

Q: How is ODPi governed and managed?

A: ODPi runs under an open governance model that offers one member one vote equality. This ensures our members bring a balanced representation of the big data ecosystem with a perspective and expertise well beyond Hadoop.

Q: What is the role of The Linux Foundation with ODPi?

A: ODPi is a Linux Foundation project that is independently funded. It harnesses the power of collaborative development to fuel innovation across the big data ecosystem. By aligning with The Linux Foundation, ODPi is able to leverage the best practices for community governance, operations, and development that the organization established running Linux. www.linuxfoundation.org