Publicis Sapient is renowned for empowering clients to excel in the digital age by crafting expert strategies, designing customer-centric experiences, and delivering top-tier product engineering. With over two decades of experience in the IT industry, we recognize the imperative for transformation across diverse sectors, including financial services, automotive, consumer products, retail, energy, and travel.

To embark on this transformative journey, we seek individuals who embody leadership and innovation, who are unafraid to pioneer the future and reimagine the status quo. Our ideal candidates are deeply skilled, collaborative, and flexible, with a relentless drive to push boundaries and challenge conventional norms.

As a Senior Associate L2 in Data Engineering, you will play a pivotal role in translating client requirements into technical designs and implementing cutting-edge data engineering solutions. You will leverage your expertise in data integration and big data design principles to develop custom solutions or deploy packaged solutions. This role demands a hands-on technologist with a robust programming background in languages such as Java, Scala, or Python, coupled with proficiency in data ingestion, integration, computation, and analytics pipelines.

Key Responsibilities:

  • Designing, developing, and delivering solutions encompassing data integration, processing, and governance.
  • Implementing scalable architectural models for data processing and storage.
  • Building functionality for data ingestion from diverse sources in batch and real-time modes.
  • Developing analytics, search, and aggregation functionalities.
  • Leveraging infrastructure and cloud computing capabilities for optimal performance.
  • Implementing and managing data management platforms.

Qualifications and Skills:

Mandatory:

  • Minimum 6 years of IT experience, with at least 4 years in data-related technologies.
  • Proficiency in big data technologies and experience with at least one cloud platform (AWS, Azure, GCP).
  • Hands-on experience with the Hadoop stack and related components.
  • Strong programming skills in Java, Scala, or Python.
  • Experience with NoSQL and MPP data platforms.
  • Knowledge of data platform services on at least one cloud platform.

Preferred (Good to Have):

  • Familiarity with traditional ETL tools and database technologies.
  • Understanding of data governance processes and tools.
  • Knowledge of distributed messaging frameworks and microservices architectures.
  • Experience in performance tuning and optimization of data pipelines.
  • Proficiency in CI/CD practices; cloud data specialty certifications are a plus.

Personal Attributes:

  • Excellent written and verbal communication skills.
  • Strong articulation and presentation abilities.
  • Collaborative team player with a self-starting attitude.
  • Effective prioritization and multitasking skills.
  • Process-oriented mindset with the ability to establish and optimize processes.

Join our world-class engineering team and help shape the future of digital transformation. If you embody these qualities, we invite you to apply and be part of this exciting journey.
