CLOUD BIG DATA ENGINEER – SENIOR
Place of work:
Bratislava
Salary offered (gross):
From €3,500 – based on experience
Contract type:
Full-time
Start date:
ASAP
Your Responsibilities:
- Manage the Data Lake, which consists of transactional data from our booking system as well as a stream of events from our frontend.
- Write code in Python / Scala / Go to implement parts of the Data Lake, focusing on automation to the greatest extent possible; this includes implementing batch pipelines that incrementally load data as well as streaming pipelines.
- Design and implement the company Data Lake from transactional and streaming data.
- Implement ML algorithms and bring AI to production so it has a real impact on the product.
- Identify weak spots and refactor code that needs it during development.
- Optimize code and the usage of third-party services for speed and cost-effectiveness.
Requirements
Must-have Skills:
- Python
- Big Data engineering in the Cloud (ideally GCP)
- Airflow or alternatives
What Do We Expect:
- 2+ years of full-time experience in a similar position
- Strong coding skills in Python or Scala (the team uses both)
- Broad knowledge of different types of data storage engines (relational, non-relational)
- Hands-on experience with at least two of them – e.g. PostgreSQL, MySQL, Redshift, Elasticsearch
- Experience with orchestration tools (Airflow ideally)
- Experience with Big Data processing engines such as Apache Spark and Apache Beam, and their cloud runners (Dataproc/Dataflow)
- Knowledge of ML/AI algorithms such as OLS and gradient descent, and their application from linear regression to deep neural networks
- Advanced knowledge of SQL
- Experience with batch and real-time data processing
- Cloud knowledge (GCP is the best fit; alternatively AWS or Azure)
- BS/MS in Computer Science or a related field (a plus)
Contact:
E-mail: astell@astell.sk