Location: Baden, Madrid | Workload: 80–100%
Who We Are
Axpo is driven by a single purpose: to enable a sustainable future through innovative energy solutions. As Switzerland's largest producer of renewable energy and a leading international energy trader, we leverage cutting-edge technologies to serve customers in over 30 countries. We thrive on collaboration, innovation, and a passion for driving impactful change.
About the Team
You’ll report to the Head of Development and work closely with the Chief Data & Analytics Office (CDAO) as part of a cross-functional effort to build a secure, scalable, and business-aligned data platform. Our mission is to empower Axpo’s decentralized business hubs with self-service analytics and AI capabilities, combining the strengths of engineering, governance, and business ownership.
What You Will Do
As a Databricks Solution Architect, you will play a pivotal role in Axpo’s enterprise data transformation by designing and governing scalable and secure solutions on the Databricks Lakehouse platform.
You will:
- Lead the design of performant, secure, and cost-effective Lakehouse architectures that adhere to the enterprise data governance and domain modeling standards defined by the CDAO.
- Collaborate with business stakeholders, engineers, and data scientists to design end-to-end solutions that enable innovation and data-driven decision making.
- Guide engineering teams on implementing technical best practices, ensuring alignment with CDAO-defined data models and stewardship principles.
- Collaborate with the CDAO office to implement Unity Catalog policies for access control, lineage, and metadata management.
- Support platform observability, data quality monitoring, and operational excellence in partnership with data governance stakeholders.
- Evaluate new Databricks features (e.g., Delta Sharing, governance enhancements) and lead their integration into platform capabilities.
- Establish solution review processes and mentor engineers and analysts on architectural thinking and Databricks capabilities.
- Support security, compliance, and cost-optimization efforts in close collaboration with platform and cloud teams.
What You Bring & Who You Are
You are a strategic thinker with hands-on technical expertise and a strong focus on business value. You bring:
- A degree in Computer Science, Data Engineering, Information Systems, or a related field.
- 5+ years in data engineering and 3+ years in architecture roles, with deep experience designing solutions on Databricks and Apache Spark.
- Strong grasp of Delta Lake, Lakehouse architecture, and Unity Catalog policy implementation in coordination with data governance functions.
- Expertise in Python, SQL, and optionally Scala; strong familiarity with dbt and modern ELT practices.
- Proven experience integrating Databricks with Azure services (e.g., Data Lake, Synapse, Event Hubs).
- Hands-on knowledge of CI/CD, GitOps, Terraform, and orchestration tools (e.g., Dagster, Airflow).
- Sound understanding of enterprise data architecture, data governance, and security principles (e.g., GDPR).
- Strong communication and stakeholder management skills, able to bridge technical and business domains.
- Fluency in English; other European languages a plus.
Technologies You’ll Work With
- Core: Databricks, Spark, Delta Lake, Unity Catalog, dbt, SQL, Python
- Cloud: Microsoft Azure (Data Lake, Synapse, Storage, Event Hubs)
- DevOps: Bitbucket/GitHub, Azure DevOps, Terraform
- Orchestration & Monitoring: Dagster, Airflow, Datadog, Grafana
- Visualization: Power BI
- Other: Confluence, Docker, Linux
Nice to Have
- Knowledge of Microsoft Fabric or Snowflake
- Familiarity with Dataiku or similar low-code analytics platforms
- Experience with enterprise metadata and lineage solutions
- Background in energy trading or related industries