Through thoughtful research, we identified opportunities to surface Confluence to Jira users in ways that would increase new Confluence trials and active usage across the two products.
While working on Atlassian’s growth team, I led the design of a number of multivariate experiments on our sign-up, onboarding and purchasing flows to increase acquisition, activation and retention. As the growth team’s mission was to find growth opportunities within the product, the team operated with a lean mentality, building experiments on top of the product’s existing infrastructure. These experiments effectively acted as functioning prototypes to validate growth opportunities, allowing the team to move quickly without being burdened by the work that would be required of a feature development team.
While relatively quick and easy, these experiments weren’t free: there was a limited volume of customers we could experiment on, which meant we needed to focus on the ideas that could have the most impact on the business. The role of the designer on the growth team was to identify where improvements to the experience could be most impactful and to ensure that our experiments were the best possible shot at validating a particular idea.
Informed by an analysis of expansion opportunities, I worked on a project exploring how to increase the number of Jira Software customers who expand into Confluence, testing our hypotheses and ideas as a series of these experiments inside Jira. The project was supported by Atlassian’s flywheel business model, which relies on landing customers in one of our products and, over time, expanding them to others.
We knew that a large number of customers already used Jira Software and Confluence together successfully. However, we identified that some customers never discovered the additional value they could get from our integrated suite of products, and those that did had a rocky path to success, with much friction and churn along the way.
The team, comprising myself and another designer, a product manager, and the engineering team, set out to unpack the problem using a process similar to the Google Ventures Design Sprint. After the sprint, we would spend the following weeks designing and building experiences to be tested using the multivariate experimentation instrumentation built into Atlassian’s products. The sprint was facilitated by myself and Benjamin Berger, my design counterpart on the project.
To get started, the team immersed ourselves in transcripts from user interviews that Mary Trombley, the researcher on the project, had conducted with customers that had recently expanded to Confluence from Jira Software. We then used Jira Software’s team personas to facilitate an empathy mapping exercise where we ran a hypothetical retrospective for each persona to highlight their pain points. Involving engineers in these exercises helped motivate the team and build empathy for our customers and the design process. It was a pleasant surprise to see engineering pushing for a good user experience and referencing quotes from the interview transcripts later in the project.
Our big takeaway from these exercises was how distinct the benefits were for each team, and how often they were driven by the organisation’s needs rather than the team’s. Using each of the personas, we then mapped out exactly how Confluence might solve for them and how they would use it. This involved visualising the team’s current processes, maturity and tools alongside what an ideal solution using Confluence could look like.
After building an understanding of our customers and their needs, we mapped out the journey a customer would actually take to start a trial of Confluence from Jira Software. This exercise immediately highlighted clear areas for improvement. Basic ideas, like providing a way for customers to add Confluence to their existing Jira site from the Confluence product tour and eliminating a number of unnecessary steps, became painfully obvious.
We took the journey map we had formed on the wall and converted it into a more portable format, working with our data analysts to overlay analytics like drop-off and time spent on page across the journey, as well as qualitative customer feedback collected from prior work by other teams at Atlassian.
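To give a sense of the kind of analytics we overlaid on the journey, the sketch below computes per-step drop-off rates for a funnel. The step names and counts are purely illustrative, not real figures from the project.

```python
# Hypothetical funnel steps with the number of users reaching each one.
# Names and counts are illustrative placeholders, not real data.
steps = [
    ("Viewed product tour", 10_000),
    ("Clicked 'Try Confluence'", 2_400),
    ("Completed sign-up", 1_500),
    ("Started trial", 900),
]

# Drop-off at each step is the fraction of users who did not make it
# to the following step.
for (name, count), (_, next_count) in zip(steps, steps[1:]):
    drop_off = 1 - next_count / count
    print(f"{name} -> next step: {drop_off:.0%} drop-off")
```

Numbers like these, laid alongside each stage of the journey map, make it easy to see at a glance where the experience is losing the most people.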
After immersing ourselves in the research and building an understanding of the problems with the current journey, the team did a number of time-boxed sketching exercises. These sketches were presented, discussed and refined further. Everyone then picked their best idea and narrowed it down into a three-panel storyboard on a single solution poster. The team then individually voted on their favourite ideas from the exercises. This concluded the first part of the sprint; over the subsequent weeks, Benjamin and I used these ideas to inform the design of an experience to deliver as an experiment within the product.
After the sketching exercises we had plenty of ideas to work with. We now needed to shape these ideas into a cohesive experience for the team to build out as an experiment that would be seen by Jira users.
To narrow the scope of our ideas we needed to focus on our primary metric, acquisition. This meant our experiences centred on getting users to start a new trial of Confluence from Jira. Unfortunately, as our research had identified, the experience of activating a user in Confluence after they started a trial left a lot to be desired, but we knew we needed to validate the opportunity for acquiring users further up the funnel before we would invest in improving activation.
Informed by our sketches, we arrived at a number of touch points inside Jira where we thought it made sense to surface Confluence. We believed that if we could surface Confluence in the right way at the right time inside Jira, customers would be more likely to start a trial. I led the design of these touch points and the journey from discovery through to starting a trial.
One of these touch points introduced a Project documents page inside Jira, surfacing the types of page templates available in Confluence that might be useful for a team using Jira. We knew from our research that these page templates were one of the primary features customers mentioned as their reason for using Jira and Confluence together.
One of the key value propositions of Confluence is that it helps with documentation and planning. We designed an experience to surface the same page templates on the backlog, where users are planning and prioritising the work for their sprints.
Another touch point where we highlighted Confluence’s page templates was on Jira’s issue details page. One of the strongest points of integration between Jira and Confluence is the issue links feature, which lets you link a Confluence page to a Jira issue. We designed an empty state for this feature to showcase how Jira can be integrated with Confluence.
All of these touch points would lead users to start a 30-day trial of Confluence in just a few clicks, right from within the product. But there was a problem: only Jira administrators who were billing contacts were able to start a new trial. We knew from our research that a major point of friction in evaluating products was the procurement and purchasing process within many organisations. More often than not, pricing was not a barrier to evaluating new products; rather, the internal processes within these organisations prevented people from trying new software.
If we only showed our touch points to users who could start a new trial, the opportunity and audience for our experiment would be severely limited. To overcome this, we designed a new experience that would allow non-administrators to effortlessly request a trial of Confluence from their administrator right from within the product. When requested, a notification would be sent to the administrator with the steps they need to take to start a trial of Confluence. The requestor would then be notified when they were given access to Confluence.
We knew this last part of the experience would be critical to the success of the experiment, as it all hinged on a user successfully starting a trial. We tested the comprehension of multiple variants of the design in user testing to increase our confidence that users understood the information we were presenting. This was important because starting a trial had the potential to significantly impact a customer’s bill.
All of these touch points and experiences were bundled together into one experiment released to a sample cohort of Jira customers for a month. In parallel, we also measured a control cohort of users who were not exposed to the experiment. This gave us statistical confidence that our new experience was responsible for any change in our measures of success.
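The comparison between an experiment cohort and a control cohort can be sketched as a two-proportion z-test on trial-start rates. This is a generic illustration of the technique, not Atlassian's actual instrumentation, and the cohort sizes and trial counts below are made up for the example.

```python
import math

def two_proportion_z(trials_a: int, n_a: int, trials_b: int, n_b: int) -> float:
    """Z-statistic for the difference between two conversion rates."""
    p_a, p_b = trials_a / n_a, trials_b / n_b
    # Pooled rate under the null hypothesis that both cohorts convert equally.
    p_pool = (trials_a + trials_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Illustrative numbers only: experiment cohort vs control cohort.
z = two_proportion_z(trials_a=245, n_a=50_000, trials_b=100, n_b=50_000)
# |z| > 1.96 corresponds to p < 0.05 for a two-sided test.
print(f"z = {z:.2f}, significant at 95%: {abs(z) > 1.96}")
```

With cohorts of this size, even a modest absolute difference in trial starts yields a large z-statistic, which is why a month-long run against a sufficiently large sample can support a confident read on the result.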
We saw a 145% increase in Confluence trials from Jira Software in the new experience when compared to our control group. We also saw directionally positive increases in active usage, conversion and revenue. This was a massive result for the growth team and reflected an impressive opportunity for the business, which even got a mention in the September 2017 investor presentation. The outcome of this experiment was a catalyst for a new program of work focused exclusively on expansion opportunities across our products.
Because the growth team engineered their experiments for speed of execution and delivery, work was still required to implement the design as a permanent feature in the product. Shortly after the successful experiment, I moved on from growth to work on the new issue design for Jira, and was unfortunately not involved in shaping the final design of what made it into the product. For completeness, I’ve included the final designs showing how the touch points from the original experiment were updated as part of Atlassian’s new visual design language.