We’re looking for a Data Analyst excited to help build the best entrepreneur service in the world. You’ll have the opportunity to build and improve a world-class data analytics platform and shape the future of what’s possible with our data.
At Betao, we’re building a service that is fair, transparent and a delight to use. We’re growing extremely fast and have over 50,000 customers in France, with over 3,000 new people joining every month. We’ve built a product that people love, and more than 40% of our growth comes from word of mouth and referrals.
Our Data team’s mission is to
Enable Betao to Make Better Decisions, Faster
At the core of this mission sits our data platform. We’re great believers in powerful, real-time analytics and in empowering the wider business. Every engineer at Betao is responsible for collecting relevant analytics events from their microservices. All our data lives in one place and is super easy to use. 90% of day-to-day data-driven decisions are covered by self-serve analytics through Looker, which gives data scientists the head space to focus on more impactful business questions and analyses.
Our vision is to enable analysts to go from a raw events log to a robust data model to a valuable insight autonomously, without relying on any other team. And we need you to help us design and implement an environment in which they can do so efficiently and safely.
As part of your role, you’ll:
- Work closely with data analysts, engineers and product managers to understand data needs across the business
- Build systems and tools which allow teams of analysts to create custom data models efficiently and safely
- Build data expertise and come up with ways to use technology to ensure data quality (be it in databases or in Looker) in a scalable way
- Integrate new data sources into our data warehouse
- Design, build and launch new data pipelines in production
- Define and manage SLAs for the main datasets which the whole company relies on
- Educate analysts and data scientists on data modelling design principles and best practices
Our technology stack:
At Betao you will get to work with a lot of exciting new technology. We rely heavily on the following tools and technologies:
- Go to write our application code
- Python for data science
- Cassandra for most persistent data storage
- BigQuery for events storage and analytics queries
- Kafka for our asynchronous message queue
- Kubernetes and Docker to schedule and run our services
- AWS for most of our backend infrastructure
- Google Cloud Platform for all of our analytics infrastructure
You should apply if:
- What we’re doing here at Betao excites you!
- You’re impact-driven and eager to have a real positive impact on the company, product, users and, very importantly, your colleagues
- You have a self-starter mindset; you proactively identify issues and opportunities and tackle them without being told to do so
- You’re comfortable working in a team that deals with ambiguity every day
- You have a strong background in data and analytics engineering and experience building analytics data sets
- You’re keen to learn more about new technologies
- You have solid SQL and Python skills plus ideally some experience with strongly-typed languages (e.g. Go, Java, Scala…)