Ten Atlassian Data Center Apps now available on Marketplace
Demonstrating our continued commitment to enterprise, we are launching not one but ten Atlassian-approved Data Center-ready apps.
We're thrilled to announce the launch of ten Atlassian Data Center apps in the Atlassian Marketplace. With Atlassian's Data Center platform, clients can scale with ease and benefit from increased availability.
The following apps are now available for Data Center:
- Test Management for Jira
- ScriptRunner for Confluence
- ScriptRunner for Jira
- ScriptRunner for Bitbucket
- Project Configurator
- SmartDraw for Confluence
- SmartDraw for Jira
- Forms for Confluence
- Community Forums for Confluence
- Content Formatting Macros for Confluence
Reliable products built to meet the rigorous demands of modern enterprises
It's our mission at Adaptavist to build reliable products that solve complex business problems, and our approach to preparing apps for Data Center was no different. To deliver apps successfully on this platform, we developed the necessary functionality and support for the Data Center environment, including cache management and locking operations, support for clustered databases, and appropriate handling of events across cluster nodes. We also embarked on a journey to modernise and refresh our approach to testing our apps, to ensure we could meet the rigorous Data Center standards.
Our approach to Data Center Testing
For us, Data Center readiness goes beyond compatibility. It's not just a badge of compliance, but also one of quality. To this end, we created a pipeline based on a key set of stages that can be continually executed with every release.
We also established a common testing infrastructure and suite for all apps, to capture performance parameters under multiple test configurations. This test environment can now be used to validate plugins for Data Center Readiness and continually measure performance following each new release.
Our testing stages follow these key principles:
Automation - repeatable using a script with a declarative configuration
Integration - executed regularly, at least upon each release
Comparability - a consistent output to enable execution results to be compared
Creating a testing framework and setting up a process that can be easily customised and scaled for all apps across different platforms (Jira, Confluence, and Bitbucket) was challenging. We analysed the best performance test tools available on the market to find the right fit for what we wanted to achieve and, after deliberation, decided on Gatling.
Using Gatling as the best-of-breed load-testing platform, we developed a framework consisting of a common set of reusable components and generic scripts for each platform. We also created a custom reporting engine to consume reports generated by Gatling and generate a consolidated reporting dashboard to easily compare with previous test runs.
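To give a flavour of the comparison step, here is a minimal, hedged sketch in Java of the kind of check a consolidated reporting dashboard can make: comparing per-operation mean response times from the current run against a previous baseline and flagging regressions. The class, method names, and tolerance threshold are illustrative assumptions, not Adaptavist's actual reporting engine.

```java
import java.util.List;
import java.util.Map;

// Illustrative sketch only: compare per-operation mean response times from
// two load-test runs and flag operations that regressed beyond a tolerance.
public class RunComparator {

    // True when the current mean is more than `tolerance` (e.g. 0.10 = 10%)
    // slower than the baseline mean for the same operation.
    static boolean isRegression(double baselineMs, double currentMs, double tolerance) {
        return currentMs > baselineMs * (1.0 + tolerance);
    }

    public static void main(String[] args) {
        // Hypothetical per-operation means (milliseconds) from two runs.
        Map<String, Double> baseline = Map.of("View", 300.0, "Comment", 700.0, "Edit", 1000.0);
        Map<String, Double> current  = Map.of("View", 310.0, "Comment", 950.0, "Edit", 990.0);

        for (String op : List.of("View", "Comment", "Edit")) {
            double base = baseline.get(op);
            double cur = current.get(op);
            String verdict = isRegression(base, cur, 0.10) ? "REGRESSION" : "ok";
            System.out.printf("%s: %.0f ms -> %.0f ms [%s]%n", op, base, cur, verdict);
        }
    }
}
```

A real engine would parse the statistics files Gatling generates rather than hard-coded maps, but the per-operation comparison that makes test runs comparable reduces to a check like this.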
The impressive part is how quickly we were able to roll out this process to other apps. We started with one app (Test Management for Jira) and then rapidly implemented the same process for others across multiple platforms.
Dogfooding Data Center: Using Test Management for Jira and ScriptRunner for Jira
While building out our testing framework, we were able to use some of our own apps to complete the process.
All tests were executed using a large data set for each platform and app. To dramatically speed up the process, we used ScriptRunner to generate the app- and plugin-specific data on Jira, Confluence, and Bitbucket. ScriptRunner scripts were then created and included in each app as built-in scripts with configurable parameters, so that any required volume of data can be generated.
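Actual ScriptRunner scripts are written in Groovy against the host application's APIs, but the shape of such a parameterised generator can be sketched in plain Java. Everything below (the Issue record, project keys, and counts) is a hypothetical illustration of "configurable parameters to generate any required volume of data", not the shipped script.

```java
import java.util.ArrayList;
import java.util.List;

// Hedged sketch of a parameterised test-data generator. In the real scripts,
// data is created through the host application's API, not held in memory.
public class TestDataGenerator {

    // Hypothetical stand-in for a generated Jira issue.
    record Issue(String projectKey, String summary) {}

    // Generate `issuesPerProject` issues for each project key; the project
    // list and count are the configurable parameters exposed to the tester.
    static List<Issue> generate(List<String> projectKeys, int issuesPerProject) {
        List<Issue> issues = new ArrayList<>();
        for (String key : projectKeys) {
            for (int i = 1; i <= issuesPerProject; i++) {
                issues.add(new Issue(key, "Load-test issue " + i + " in " + key));
            }
        }
        return issues;
    }

    public static void main(String[] args) {
        List<Issue> data = generate(List.of("TMJ", "SR"), 3);
        System.out.println("Generated " + data.size() + " issues"); // prints "Generated 6 issues"
    }
}
```

Scaling the count parameter up is what lets the same script produce the large data sets needed for Data Center load testing.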
The results: Increased value for all our users
When we started building this process for Test Management for Jira, we had no idea what the results would be. However, after executing the tests on the then-production version of each app, we realised that some Jira operations (View, Comment, and Edit) were reporting slow response times. The Adaptavist TM4J development team investigated the issue and made changes to the app that reduced the mean response time from approximately 1.2 seconds to 0.3 seconds for the View operation, from roughly 10 seconds to 0.7 seconds for Comment, and from roughly 10 seconds to 1 second for Edit.
With this level of testing, we were able to identify and fix performance problems. While some app vendors might complete this process once and call it a day, we've integrated it into every release, ensuring our customers continue to get high availability and optimal performance.
Why we are working with other Atlassian ecosystem vendors
As an Atlassian Platinum Solution provider, Adaptavist is keen to work with other premier ecosystem partners in their DC readiness efforts. This helps to ensure the other apps we bring to bear in customer solutions meet the same DC readiness standards as our own. For example, we worked with ALM Works, the makers of the “Structure for Jira” product family, to put their apps through our DC testing framework. Structure helps Atlassian's largest customers visualize, track and manage progress across Jira projects with adaptable, user-defined issue hierarchies—and we often recommend it to our customers.
ALM Works got an extra pair of eyes (and more DC testing) on their apps, and Adaptavist is assured that another key Jira app meets the same DC-readiness standards as our own.
The ALM Works and Adaptavist teams worked together to adapt the testing framework developed at Adaptavist for the Structure app family. ALM Works benefited from a number of upstream decisions taken by Adaptavist, such as the use of the Gatling tool and the way tests are bundled together and presented in the final report. Adaptavist also supplied ALM Works with the source code for the baseline tests, which allowed the Structure team to quickly run tests without Structure apps installed, establishing the baseline latencies and other numbers to compare against. The Structure team then modified existing tests and added new ones to fully establish the latency and scalability metrics for a Data Center instance running Structure apps under load.
Igor Sereda, CEO of ALM Works, says, “Adaptavist’s testing solution for Atlassian Apps helped us start testing for performance and scalability right away. We were glad to use their advice and tools to make sure the entire Structure product family meets the rigorous performance and scalability requirements for Jira and Confluence Data Center.”
But wait, there's more...
The Adaptavist Data Center Difference
Besides ensuring quality and scale with our new testing framework, we are offering our enterprise customers even more when they purchase or upgrade to Data Center.
Our Data Center specific features include:
Support Escalation - Each Data Center app receives prioritised support with an improved Service Level Agreement (SLA) of a 24-hour response time. We've even hired extra support engineers to accommodate this.
Dedicated Account Management - We've established a client services team at Adaptavist to help our enterprise customers get the most out of their tools. Every Data Center Atlassian Approved App customer will have a dedicated account manager to help with implementation and maintenance.