Data Engineering Specialist (ETL, SQL, Python) - Onsite Geneva

Full-time

Company
Randstad (Schweiz) AG
Location
Geneva
Publication date
28.04.2025
Reference
4799382

Description

  • Geneva
  • Contract

Job Details

Randstad Digital Switzerland is seeking a highly skilled and motivated Data Engineering Specialist to join our team and work on an exciting project with a prestigious international organization. This role involves enhancing and developing security metrics within a large-scale IP Analytics Platform, utilizing cutting-edge cloud technologies and agile methodologies. 

Responsibilities:

  • Collaborate with the Information Security team to define and implement data quality assurance processes.  
  • Identify suitable data visualization tools and create detailed visualization and presentation layers (Tableau or Amazon QuickSight).  
  • Identify and document data structures for additional data sources and ensure ongoing data drift monitoring.  
  • Assist Information Security experts in defining and extracting relevant features for data analysis.  
  • Collaborate with business areas to improve the performance of centrally collating large datasets.
  • Develop data products, including pipelines, jobs, and transformations.  
  • Create use case diagrams, data and process flow diagrams, and user guides.
  • Design and implement Security Metrics presentation/visualization dashboards with drill-down capabilities.  
  • Ensure solution evolution in response to changing data sources and emerging technologies.  

Requirements:

  • A first-level university degree in information security, computer science, engineering, mathematics, business, or a related discipline.  
  • Expert knowledge and experience in developing and implementing ETL jobs for data warehouses using SQL and Python (an illustrative sketch follows this list).
  • Strong knowledge of software development languages and tools, including SQL, Python, and Spark.  
  • Good knowledge and experience with AWS or Azure Big Data tools (e.g., Glue, Athena, Redshift, Kinesis, Databricks, Azure analytics services, Azure Data Explorer) and cloud-based storage and functions (S3, Blob Storage, Lambda, Azure Functions).
  • Expert knowledge and experience with data engineering tools and methodologies (data warehouse, data lake, star schema).
  • Good knowledge and experience with AWS CloudFormation or Terraform.
  • Knowledge of CI/CD concepts, particularly AWS CodePipeline, CodeBuild, and CodeDeploy.
  • Knowledge of provisioning data APIs, as well as of information security concepts and terminology.
  • Excellent written and verbal communication skills, with the ability to articulate complex technical ideas to non-technical stakeholders.  
  • Confident communicator and team player with strong organizational and interpersonal skills.  
  • Personal drive, ownership, and accountability to meet deadlines and achieve results.  
  • Proficient user of Git, and familiar with Jira and Bitbucket.
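
To make the ETL requirement above concrete, here is a minimal, hypothetical sketch of such a job in Python and SQL: it aggregates raw security events into a star-schema fact table. A local SQLite database stands in for the warehouse, and every table, column, and file name is an illustrative assumption rather than part of the client's actual platform.

    import sqlite3

    def run_etl(db_path: str = "warehouse.db") -> None:
        # Hypothetical table and column names, used only for illustration.
        con = sqlite3.connect(db_path)
        try:
            # Extract + transform: aggregate raw events into daily counts per type.
            rows = con.execute(
                """
                SELECT date(event_ts) AS event_date,
                       event_type,
                       COUNT(*)       AS event_count
                FROM   raw_security_events
                GROUP  BY date(event_ts), event_type
                """
            ).fetchall()

            # Load: upsert the aggregates into the fact table of a simple star schema.
            con.execute(
                """
                CREATE TABLE IF NOT EXISTS fact_security_events (
                    event_date  TEXT,
                    event_type  TEXT,
                    event_count INTEGER,
                    PRIMARY KEY (event_date, event_type)
                )
                """
            )
            con.executemany(
                """
                INSERT INTO fact_security_events (event_date, event_type, event_count)
                VALUES (?, ?, ?)
                ON CONFLICT (event_date, event_type) DO UPDATE
                    SET event_count = excluded.event_count
                """,
                rows,
            )
            con.commit()
        finally:
            con.close()

    if __name__ == "__main__":
        run_etl()

In the actual role, this kind of pattern would more likely run as a scheduled cloud job (e.g., AWS Glue writing to Redshift), but the extract-transform-load structure stays the same.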

Desirable:

  • Additional certifications such as AWS Certified Solutions Architect and AWS Certified Data Analytics Specialty.
  • Experience in the implementation of or demonstrable familiarity with the Gartner ODM framework.
  • Experience working with Tableau and Amazon QuickSight would be an advantage.

Ideally, you will work from our client's offices in Geneva, although a hybrid working arrangement (nearshore or within Switzerland) is possible.
