Three months after product launch, and with 500 companies running Oribi, what have we learned?

One of the key decisions I made when I chose to write a personal blog on startups was to write on the move. Almost all blog posts describe the stages of building a startup in retrospect, months or years later, once the results, successes and mistakes have become clear. Although some perspective may be lacking, what follows is an accurate account of how things have gone at Oribi in the months since our product launched – how we reached our first users, how we learn what works and what doesn’t, and what has really changed.

A short recap of what Oribi is all about: one of the challenges I’ve always been most drawn to is simplifying complicated things. To a large extent, the companies I appreciate most are not the ones that invented something new, but those that took an existing, complicated and usually disliked problem and managed to turn it into something simple and lovable. When I decided to found my third startup, I knew I wanted to take an ‘old’, complicated field and look for ways to make it simpler and more accessible. A field I felt very close to, and which certainly matches these criteria, is the world of data analysis – data, BI, and analytics.

The following scenario will probably be familiar to anyone who has dealt, even a little, with this field: reaching even a basic dataset requires extensive integration work and help from programmers, and you’re forced to invest an almost endless amount of work wrangling complex tables and spending hours going over information just to understand what you should actually change. Our goal at Oribi is to enable everyone – from large companies with ample resources to small and medium-sized businesses – to access their data and understand how to improve their current results, in an easy, accessible manner, with no help from programmers and no integration with other systems. We chose to start with marketing and user behavior analysis, primarily because users’ questions in this field tend to be rather similar, so even at this initial stage, with a still-basic product, we can provide a proper answer to the problem.
At present, Oribi automatically tracks primary site operations (subscriptions, downloads, purchases, etc.), analyzes how each of these operations could be optimized, tracks week-to-week changes and the reasons behind them, and analyzes the performance of each marketing channel, among other things.


What happened to the first product we launched? And why bury a successful product? 

Those who’ve kept track of this blog will remember that the first product Oribi launched was a small one that enhanced Facebook analytics, predominantly around Facebook ads. This was never intended to be the company’s main product. I chose to start with a ‘small’ product in order to get to know the market better and, among other things, the cost of reaching users. My original plan was also to leverage this product as a marketing tool that would bring users to our more complex product. Ironically, the biggest problem with this product was that it was too successful. Without much work or marketing, we reached over 5,000 users and analyzed over 1 billion USD that our users had invested in Facebook ads. The ‘small’ product grew and grew, requiring ever-increasing development team resources, in a way that significantly hampered the development of the main product.

One of the toughest decisions I’ve made so far in my career as an entrepreneur was to shut this product down. I reached a crossroads where I felt I had to choose – either turn this ‘small’ product into the company’s main product, or stop investing resources in it. On the one hand, the market clearly signaled that such a product was needed, and it gained thousands of users; on the other hand, I didn’t believe a large company could be built around it. A few months before we launched the ‘real’ Oribi, we took down Facebook Analytics and the entire team moved to working on the main product.


When is the right time to release the product?

I have to admit that I’m no longer entirely certain what the differences between an alpha, a beta and an MVP are. The idea behind an MVP sounds great – invest the minimum effort needed to estimate the viability of your product. But the MVP concept almost always assumes relatively simple B2C products. What do you do when a product requires complex technology and many features from the start? Take a product like Waze, built entirely on deep mapping technology and on analyzing the data of multiple users. Although they started in certain cities with a limited number of features, this was certainly not an MVP in the classic sense. Had they tried to launch a product after a few weeks, it would probably have had no value for users, all of whom would likely have abandoned the tool. And that certainly wouldn’t have been a proper indication that the product wasn’t needed.

At Oribi, the main question I’ve asked myself is not what the minimum product would be, but what is the minimum required to start learning – what would enable us to begin understanding what users need?

Oribi’s technological uniqueness is its ability to track key website/product operations without involving programmers. This capability served as the basis for the product’s first version.
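To make the idea of ‘programmer-free’ tracking concrete, here is a rough, hypothetical sketch – not Oribi’s actual algorithm, and every name in it is my own invention – of how a tool might guess a site’s key operations from automatically captured interaction events, using a simple keyword-and-frequency heuristic:

```python
# Hypothetical sketch of "codeless" event detection: given raw click
# events auto-captured from a site (button/link labels), guess which
# labels represent key operations such as sign-ups or purchases.
# Illustrative only; not Oribi's actual implementation.
from collections import Counter

KEY_TERMS = {"sign up", "subscribe", "download", "buy", "purchase", "checkout"}

def detect_key_events(raw_events, min_count=10):
    """Return labels that look like key operations, most frequent first.

    raw_events: iterable of button/link labels captured on the site.
    min_count: ignore labels seen fewer times than this (noise filter).
    """
    counts = Counter(label.strip().lower() for label in raw_events)
    candidates = [
        (label, n) for label, n in counts.items()
        if n >= min_count and any(term in label for term in KEY_TERMS)
    ]
    return [label for label, _ in sorted(candidates, key=lambda c: -c[1])]

# Example: "Sign Up" and "Buy Now" are flagged; "Read More" is not.
print(detect_key_events(["Sign Up"] * 50 + ["Read More"] * 200 + ["Buy Now"] * 30))
```

The point of the sketch is only that no site-specific code is needed: the heuristic works on whatever labels the tracker happens to capture.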


Getting started with the toughest users (or – forget about a product launch)

When considering the launch of a startup product, you usually think of a public announcement, coverage in TechCrunch and similar blogs, and the other activities that together constitute a product launch. The problem is that the right time to go ‘on air’ is probably while the product is still in its infancy, which is certainly not the time for an official launch. The accepted alternative is beta testing – choose 5-10 potential clients and start working with them to get initial feedback. While this is the most accessible path, the problem, as I see it, is that it does not reflect the real challenges we will have to face. A user who is familiar with us, has been instructed, values the company, and has people to turn to with any questions in no way represents the ‘real’ user we will encounter in the future.

We decided that the first users would be those who do not know us at all. Three months ago, we first allowed anyone visiting our site to register and start using Oribi. In addition, we started doing some limited Facebook advertising (for a few dollars each day), and invited the readers of our blog to try the product.

After three months, we have over 500 companies who have installed Oribi on their website or in their product.

And the best part? With this number of users, we are learning a lot. Rather than working with several ‘betas’, we have the ‘wisdom of the crowd’ at our disposal, which enables us to understand quite clearly what works for users, what doesn’t, and what they are looking for.

What about the worst part? In stories, an MVP always sounds glamorous and exciting. The truth is that it’s one of the most challenging things I’ve ever done. Being exposed to real users with a product that is still taking its first steps is somewhat like arriving at a high-end party in your pajamas. We know about the features still in the pipeline, the amazing technology and the vision, but what the user encounters is still far from that ideal.


Product, product, product

When choosing, as we did, a low-touch strategy, and when the end goal is to reach a very large user audience, the product is everything. You cannot hide behind a savvy sales team or a well-connected management. The tool itself is what gets examined and, if it isn’t good, is somewhat unclear or doesn’t perform well, users will soon stop using it. The Oribi team is currently made up of 15 people and, while most companies of this size would probably have settled for one product manager, Oribi has four people working on the product, each from a different perspective. Beyond the importance of the product itself, I firmly believe that a stronger product team leads to a more efficient development team. More people working on the product prevent efforts from being spent on unnecessary features and directions, which is usually the biggest time drain for development teams.

So, what do four people do when planning the product? Let’s start with the more standard jobs – I am the product lead when it comes to deciding on features, direction and product definitions. We have a UI/UX designer responsible for the visual aspect. And now for the less standard positions:

Customer discovery – typically, the responsibility for interviewing clients and tracking user analytics falls to the product manager. One of the best decisions I made at Oribi was to turn this from ‘just a part’ of the product manager’s job into a dedicated position we recruited for. There is an immense difference between allocating partial resources to this analysis and having a person focus on it entirely. Our understanding of the product has progressed noticeably faster since the customer discovery manager started working. The main tools are Skype calls with users who have tried the product, and analytics on how the product is used. We receive a fair amount of feedback daily, which helps improve the product – from small touches, such as making texts clearer, to our choice of the next major features to develop.

Technical product manager – in the usual setup, the product manager prepares a PRD with varying levels of detail, which passes directly to the development team. My experience is that this leads to quite a few mistakes and problems along the way. I think one of the most significant positions in a startup is the technical product manager, who is responsible for all the technical definitions, verifying that the development team is working on the right things, planning tests, and deciding when each feature should be launched. This accelerates development and reduces the work devoted to unnecessary issues.


What is our goal now?

I genuinely believe that a startup with limited resources should remain super-focused and have one primary goal at any given moment. At present, our goal is to examine and bolster user engagement. We are also examining the possibility of charging for the product from the moment it is released, but this is currently a secondary priority. In a few months, when we feel we have reached our engagement goal, we will move on to our next challenge: paying customers.


What didn’t we compromise on in the first version?

One of my most important rules is that, while the initial stages might not provide users with value, they should never worsen the existing situation. We invested significant resources in the code that runs on customer websites, to ensure we do not break or slow down any sites. Over 500 sites are currently running Oribi, without any negative effects.

  • It was clear that the version we first shipped would serve only as a basis, around which everything would change. We invested significant resources in building a relatively dynamic infrastructure, both on the data side and on the UI side. This investment has completely proven itself over the past few months, enabling us to almost completely change the interface, and significant portions of how information is handled, relatively quickly.
  • We invested substantial resources in tools that help us better understand our users and how the product behaves in different conditions. Over the past few months, the development team spent a fair amount of time supporting the ability to enable and disable features for specific users, as well as a debug mode that lets us see how Oribi looks under different data conditions and with various notifications, and even a Slack bot that reports what happens with users. Choosing to develop this kind of feature is always difficult, because it is invisible to the user, but I feel that without these tools we wouldn’t have been able to gain an in-depth understanding of the product.
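The per-user feature switches mentioned above can be sketched very simply. The snippet below is a minimal illustrative toggle store – the names and structure are my own assumptions, not Oribi’s code:

```python
# Minimal sketch of per-user feature flags: a feature has a default
# state, and individual users can be switched on without a redeploy.
# Illustrative only; not Oribi's actual implementation.

class FeatureFlags:
    def __init__(self):
        self._defaults = {}   # feature name -> default on/off
        self._enabled = {}    # feature name -> set of user ids opted in

    def set_default(self, feature, on):
        self._defaults[feature] = on

    def enable(self, feature, user_id):
        self._enabled.setdefault(feature, set()).add(user_id)

    def disable(self, feature, user_id):
        self._enabled.get(feature, set()).discard(user_id)

    def is_enabled(self, feature, user_id):
        if user_id in self._enabled.get(feature, set()):
            return True
        return self._defaults.get(feature, False)

# Example: roll a new dashboard out to one user before everyone else.
flags = FeatureFlags()
flags.set_default("new_dashboard", False)
flags.enable("new_dashboard", "user_42")
```

The value of even this trivial mechanism is exactly what the bullet describes: the team can watch how a half-finished feature behaves for a handful of real users without exposing it to all 500 companies.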


What did we compromise on?

  • Our main compromise relates to the number of features in the product. At the initial stages, we released an almost ‘bare bones’ product, and even today significant features are still missing. We are trying to develop features based on user needs, prioritizing the things that currently prevent users from better understanding how their site functions.
  • To date, we have no videos, tutorials or help on the sites. These are all materials we hope to release in the next few weeks.


What have we learned so far?


The product is changing dramatically from week to week – the design, the technology behind the scenes, the features and the overall experience. Below are the key things we’ve learned over the past few months:

The product is genuinely needed. The happiest part of releasing the product was the main piece of feedback: a simple, ‘programmer-free’ analytics product is sorely needed. This comes up in most discussions with users, and it is also reflected in the results – in the fact that over 500 companies have installed and tried a product they had never heard of.


Simple is complex. There’s nothing new in the realization that making something simple is one of the most complex things there is. At the same time, until we released Oribi, I didn’t realize just how much of a challenge it is. Unlike other products I’ve worked on, every dialogue, text line or small new feature is subject to scrutiny. We are trying to build a super-simple, ‘lean’ product without encumbering it with anything unnecessary.


Once the product is released, everything starts moving at the speed of light. After Oribi was released and was in use by a significant number of companies, everything started moving more rapidly – our understanding of the product and the technology, as well as design and marketing changes. Working with a relatively large number of users allows us to examine our assumptions quickly, get real feedback and make the necessary adjustments fast.


Users have little patience. You might expect a user who has invested time in subscribing to a new product and installing it on their site to take a few more minutes to understand the product. But this is not the case. Time and time again, we have seen how little it takes to lose a client. Instructions that are not clear enough, initial information that is not sufficiently interesting, or any glitch in implementation – all immediately lead a client to disappear.


You shouldn’t take anything for granted. When planning the product, some parts feel ‘complex’ while others seem fairly trivial – the user’s initial selection of important events, navigation inside the application, even choosing the active domain. Less time is typically invested in these parts, both in planning and in analytics. We’ve learned that the most basic things, such as moving between screens in the application, are often where you lose the most users. Today we try to think through every button, text line and dialogue – even the ones that look standard, with no room for error, can cost us customers.


Interested in trying Oribi?

If you’re a marketing manager, or run a website, you’re invited to try Oribi, or contact us here and join the beta group, which receives new features before anyone else.

Oribi is particularly suitable for sites with a fair number of visitors, rather than small or low-traffic sites.