Knihobot & Bookbot is the largest re-commerce bookstore in the Czech Republic. Every day, we put pre-read books back into circulation and help protect the environment. So far, we've processed over 15 million books. We currently operate in 9 countries and have ambitious growth plans.
Our warehouse holds over 1.5 million books, and every day we handle another 30,000 coming in and going out. To keep this running smoothly, we need to make full use of our data. The faster we get books onto our e-shop with the right contextual data and the right price, the sooner they reach the right customers.
We work fast, with minimal bureaucracy. You’ll see the results of your work daily and play an important role in driving the company forward.
What to expect
- You’ll join an internal Data team of 7 people and work closely with IT, BizDev, Marketing, and Product.
- You’ll help deliver data products used across the entire Bookbot ecosystem.
- You’ll work fully in the cloud, building data warehouses and pipelines that process tens of millions of records every day.
Example projects you’ll work on
- Analyzing how effectively users interact with contextual book data on our website to support their purchase decisions.
- Forecasting the volume of incoming books to help plan operations and improve availability across our markets.
- Integrating data from a third-party workforce-planning system used for hundreds of temporary workers.
- Segmenting customers to better understand their behavior, using our sales, behavioral, and web analytics data.
Who we’re looking for
We're looking for someone experienced and confident: someone who can not only build pipelines but also understand what should be built and why. You've already led data projects, you can turn business needs into well-designed data solutions, and you want to become a strategic partner to our Head of Data (Markéta).
You’ll fit right in if you have:
- 3+ years of experience in a similar role (Data Analyst, Data Engineer, Data Product Owner…)
- Experience building and orchestrating ETL/ELT pipelines
- Strong analytical skills; understanding of data modeling, databases, SQL/Python
- Hands-on experience with cloud infrastructure (Azure / GCP / AWS)
- Familiarity with Git/GitHub, testing, automated QA
- Independence, proactivity, and an ownership mindset
- Fluent English for professional discussions
Nice-to-have:
- AWS experience
- Experience with big data
Tech stack
- Databases: MySQL, Snowflake, BigQuery
- Visualization: Power BI, Metabase
- Data warehousing & orchestration: AWS, Keboola, Dataform, Databricks
- Analytics: SQL, Python, Google Sheets
What we offer
- A role in a fast-growing, dynamic company that has been sustainably expanding for years
- 1 extra week of vacation after your first year, 2 extra weeks after two years
- Informal environment with a pragmatic approach
- 20% discount on books, VIP selling tariff, parking at Kolbenova
- MultiSport card and a budget for personal development
- 2 days/week home office