
Principal Android Developer (Java, Kotlin) IRC271690
- Slovakia
- Permanent employment
- Full-time
Requirements:
- Familiarity with Google Cloud Platform, including BigQuery, Cloud Composer (Airflow), Compute Engine, and Monitoring
- Strong proficiency in Python, including experience in building, testing, and deploying robust ETL/ELT pipelines in production
- Experience developing data quality frameworks, including automated tests, cross-dataset validations, and anomaly detection across diverse data assets
- Advanced SQL skills, including data modeling (star/snowflake/3NF), optimization, and writing performant queries tailored to BigQuery’s slot-based execution model
- Solid working knowledge of DBT, including incremental models, testing, documentation, and advanced features like custom macros
- Demonstrated ability to optimize BigQuery workloads through partitioning, clustering, materialized views, and cost-aware development practices
- Experience working with ETL orchestration tools, ideally Cloud Composer or similar frameworks (Kestra, Dagster, etc.)
- Hands-on experience consuming and managing APIs for data extraction
- Exposure to Site Reliability Engineering (SRE) best practices, including ticket triage, incident management, and documenting runbooks/SOPs
- Familiar with Git and modern software development workflows, including pull requests and code reviews
- Comfortable working in an agile team environment, with the ability to multitask and reprioritize based on changing project needs
- Clear and effective communication skills, with the ability to engage technical and non-technical stakeholders alike
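To illustrate the kind of data quality and anomaly detection work the requirements above describe, here is a minimal sketch in plain Python. The function name, payload shape, and threshold are illustrative assumptions, not part of the posting; in practice the input counts would come from BigQuery metadata queries or DBT test results:

```python
from statistics import mean, stdev

def detect_anomalies(daily_row_counts, threshold=3.0):
    """Flag days whose row count deviates more than `threshold`
    standard deviations from the mean (a simple z-score check).

    daily_row_counts: dict mapping date string -> row count.
    Returns the list of dates flagged as anomalous.
    """
    counts = list(daily_row_counts.values())
    if len(counts) < 2:
        return []  # not enough history to judge
    mu, sigma = mean(counts), stdev(counts)
    if sigma == 0:
        return []  # perfectly stable counts, nothing to flag
    return [
        date
        for date, count in daily_row_counts.items()
        if abs(count - mu) / sigma > threshold
    ]
```

A real data quality framework would persist these checks, run them per dataset via an orchestrator, and alert on failures; the z-score rule here is only one of many possible validations.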
Responsibilities:
- Designing, building, and maintaining scalable, reliable data pipelines using Python, SQL, DBT, and Google Cloud Platform (GCP) services like BigQuery and Cloud Composer
- Contributing to the development and evolution of our data quality framework, ensuring robust automated testing and cross-dataset validation across all critical data assets
- Writing and optimizing advanced SQL to power data models, analytics, and reporting pipelines, with a focus on performance and efficiency within BigQuery
- Developing and maintaining DBT models, including testing, documentation, incremental loading strategies, and the creation of reusable macros
- Supporting day-to-day incident and ticket resolution, including root cause analysis and documentation of runbooks/SOPs to drive platform reliability
- Working with APIs and external data sources to extract, normalize, and integrate new datasets into the platform
- Participating in agile ceremonies (stand-ups, retros, planning), contributing to sprint goals and helping support the wider team objectives
- Actively contributing to code reviews, documentation, and peer learning, helping to raise engineering standards and share knowledge within the team
- Monitoring system health and pipeline performance using GCP-native tools and dashboards, and proactively identifying opportunities for cost or performance optimization
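As a sketch of the "extract, normalize, and integrate" responsibility above: the posting names no specific API, so the payload shape and field names below are hypothetical, and a real integration would handle pagination, retries, and schema drift as well:

```python
def normalize_records(payload):
    """Flatten a nested API payload into rows suitable for loading
    into a warehouse table. The payload shape is a hypothetical
    example; real APIs will differ.
    """
    rows = []
    for item in payload.get("results", []):
        rows.append({
            "id": item["id"],
            # Normalize free-text fields before loading.
            "name": item.get("name", "").strip().lower(),
            # Flatten nested objects into top-level columns.
            "country": item.get("address", {}).get("country"),
        })
    return rows
```

Rows in this shape can then be loaded into BigQuery and modeled downstream in DBT.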
Nice to have:
- GCP Data Engineer certification
- Prior experience in media, marketing, or digital advertising analytics
- Experience implementing data quality frameworks and governance tools
- Familiarity with cost optimization strategies for BigQuery and other GCP services