Tasks
Design, build, and maintain scalable data pipelines and data models using AWS-native services.
Develop and deliver impactful, user-friendly dashboards and visualizations in Tableau.
Manage end-to-end data and BI projects — from gathering business requirements to deployment and support.
Integrate data from diverse structured and unstructured sources, ensuring performance, reliability, and scalability of analytics solutions.
Apply best practices in data governance, quality, and security across all data processes.
Collaborate with stakeholders to translate business needs into effective technical solutions.
Stay current with emerging technologies in the cloud analytics space, with openness to adopting tools such as Azure, Microsoft Fabric, and Power BI.
Qualifications
Must-have skills:
End-to-end data pipeline and dashboard development using AWS services (e.g., EC2, Glue, Lambda, Athena, Redshift) and Tableau.
Performance optimization of large-scale Tableau dashboards and AWS-based data workflows.
Deep understanding of Tableau capabilities including LOD expressions, calculated fields, actions, parameters, and best practices for UI/UX.
Data integration from various sources (e.g., S3, RDS, APIs, external files).
Ability to independently lead the delivery of BI and data engineering solutions in AWS and Tableau environments.
Strong SQL skills for querying, joining, and transforming structured data.
Good understanding of data warehousing concepts: fact/dimension tables, star/snowflake schemas, and ETL/ELT patterns.
Familiarity with data governance, access control, and deployment processes in cloud environments.
Familiarity with agile methodologies, sprint-based delivery, and using tools like Jira or Azure DevOps.
Nice-to-have skills:
Ability to build interactive dashboards and reusable data models using Power Query and DAX.
Knowledge of Row-Level Security (RLS), bookmarks, drill-through, and performance optimization.
Willingness to support and build reports in Power BI where needed on cross-platform projects.
Experience with Python and/or PySpark for data transformation and automation.
Familiarity with modern data lake architecture (e.g., Delta Lake, Lakehouse) and orchestration tools like Apache Airflow or AWS Step Functions.
Willingness to explore other modern cloud analytics platforms if required (e.g., Databricks, Snowflake).
Things to know before you start:
Start: by arrangement; always on the 1st or 15th of the month
Working hours: full-time (40h); 27 vacation days
Employment contract: permanent
Line of work: Consulting
Language skills: business-fluent English at C1 level; German is a plus
Flexibility & willingness to travel
Other: a valid work permit
For more details, salary, and company information, use the apply link.