Data Engineering Analyst II
Takeda
- Bratislava, Bratislavský kraj
- €2,100 per month
- Contract work
- Full-time
Takeda is committed to creating an inclusive and collaborative workplace, where individuals are recognized for the backgrounds and abilities they bring to our company. We are continuously improving our collaborators' journey at Takeda, and we welcome applications from all qualified candidates. Here, you will feel welcomed, respected, and valued as an important contributor to our diverse team.

The Opportunity

As a Data Engineer, you will be responsible for building and maintaining data systems and constructing datasets that are optimized for analysis, supporting Business Intelligence requirements and downstream systems.

Responsibilities
- Develop and maintain scalable data pipelines, and build new integrations using AWS native technologies to support increasing data sources, volumes, and complexity.
- Collaborate with analytics and business teams to improve data models that enhance business intelligence tools and dashboards, fostering data-driven decision-making across the organization.
- Implement processes and systems to ensure data reconciliation, monitor data quality, and ensure production data is accurate and available for key stakeholders, downstream systems, and business processes.
- Write unit, integration, and performance test scripts, contribute to engineering documentation, and maintain the engineering wiki.
- Perform data analysis to troubleshoot and resolve data-related issues.
- Work closely with frontend and backend engineers, product managers, and analysts to deliver integrated data solutions.
- Collaborate with enterprise teams, including Enterprise Architecture, Security, and Enterprise Data Backbone Engineering, to design and develop data integration patterns and models supporting various analytics use cases.
- Partner with DevOps and the Cloud Center of Excellence to deploy data pipeline solutions in Takeda AWS environments, meeting security and performance standards.
Required Qualifications

- Bachelor’s Degree from an accredited institution in Engineering, Computer Science, or a related field.
- 3+ years of experience in data engineering, software development, data warehousing, data lakes, and analytics reporting.
- Strong expertise in data integration, data modeling, and modern database technologies (Graph, SQL, No-SQL) and AWS cloud technologies (e.g., DMS, Lambda, Databricks / PySpark, SQS, Step Functions, Data Streaming, and Visualization).
- Extensive experience in database administration, schema design, dimensional modeling, and SQL optimization.
- Excellent written and verbal communication skills, with the ability to collaborate effectively with cross-functional teams.
- Understanding of good engineering practices (DevSecOps, source-code versioning).
- Fluency in English.
- Experience with streaming technologies such as Spark Streaming or Kafka.
- Infrastructure as Code (IaC) experience, preferably with Terraform.
- Experience designing and developing API data integrations using SOAP / REST.
Benefits

- Cafeteria
- Medical: high-standard medical care
- Bridge days
- Summer Hours
- Multisport Card