Tasks
YOUR TASKS
- Data Pipeline Development: Support the design, development, and maintenance of robust data pipelines for capturing, processing, and storing data from various sources across the entire production line.
- Data Quality Assurance: Implement and monitor data quality checks to ensure the accuracy, completeness, and reliability of data throughout the pipeline.
- Automation: Develop and operate ETL pipelines based on IoT and cloud technologies such as AWS Glue, Amazon S3, Amazon Redshift, and AWS Lambda.
- Dashboards: Develop and maintain Power BI or Grafana dashboards to create value for the company from data.
- Collaboration: Work with cross-functional teams, including engineers, data scientists, analysts, and IT professionals, to understand data requirements and deliver solutions that meet them.
- Documentation: Produce clear and detailed documentation of data pipeline architecture, processes, and best practices.

The position is based in Kirchentellinsfurt.
Requirements
YOUR PROFILE
- Ongoing Master's degree in Computer Science or a comparable field
- Good programming skills in Python
- Good knowledge of developing dashboards with Power BI or Grafana
- Knowledge of data modeling, ETL processes, and data management best practices
- Excellent problem-solving skills and attention to detail
- Strong communication and teamwork skills
- Familiarity with AWS services (AWS Glue, Amazon Athena, AWS Lake Formation), an understanding of data warehousing concepts, and experience with databases (e.g., SQL, NoSQL) are desirable
- Fluent German and English skills
For further details, salary, and company information, use the apply link.