Job Title: Senior Associate L2 – Data Engineering

Publicis Sapient Overview:

Publicis Sapient is at the forefront of enabling our clients to thrive in the digital age by offering expert strategies, customer-centric experience design, and top-tier product engineering. With over 20 years in the IT industry, we’ve witnessed an unprecedented need for transformation across various sectors, from finance to retail, energy, and beyond. To navigate this transformative journey, we’re seeking individuals who embody our core values of innovation, collaboration, and adaptability.

Job Summary:

As a Senior Associate L2 in Data Engineering at Publicis Sapient, you'll be instrumental in translating client requirements into technical designs and implementing components for data engineering solutions. With a deep understanding of data integration and big data design principles, you'll develop custom solutions or implement packaged solutions while independently driving design discussions to ensure the overall health of the solution. This role demands a hands-on technologist with a strong programming background in Java, Scala, or Python, along with experience in data ingestion, integration, data wrangling, computation, and analytics pipelines, as well as exposure to Hadoop ecosystem components. Hands-on knowledge of at least one cloud platform (AWS, GCP, or Azure) is also required.

Role & Responsibilities:

Your responsibilities will revolve around designing, developing, and delivering solutions, including:

  • Data Integration, Processing & Governance
  • Data Storage and Computation Frameworks, Performance Optimizations
  • Analytics & Visualizations
  • Infrastructure & Cloud Computing
  • Data Management Platforms

Experience Guidelines:

Mandatory Experience and Competencies:

  • Overall 5+ years of IT experience with a minimum of 3 years in Data-related technologies
  • Minimum 2.5 years of experience in Big Data technologies and exposure to at least one cloud platform (AWS/Azure/GCP)
  • Hands-on experience with the Hadoop stack and related ecosystem components
  • Strong programming experience in Java, Scala, or Python
  • Hands-on working knowledge of NoSQL and MPP data platforms
  • Well-versed in data platform services, IAM, and data security on at least one cloud platform

Preferred Experience and Knowledge (Good to Have):

  • Knowledge of traditional ETL tools and database technologies
  • Understanding of data governance processes and related tools
  • Familiarity with distributed messaging frameworks, search & indexing, and microservices architectures
  • Experience in performance tuning and optimization of data pipelines
  • Familiarity with CI/CD practices and cloud data specialty certifications

Personal Attributes:

  • Strong written and verbal communication and articulation skills
  • Good team player
  • Self-starter requiring minimal oversight
  • Ability to prioritize and manage multiple tasks
  • Process orientation and the ability to define and set up processes
