Publicis Sapient Overview:

At Publicis Sapient, we empower our clients to excel in the future by creating business value through expert strategies, customer-centric experience design, and world-class product engineering.

The future of business is disruptive, transformative, and deeply digital. In our two decades of experience, we have witnessed an urgent need for transformation across major industries, including finance, automotive, consumer products, retail, energy, and travel.

To navigate this transformative journey, we seek thought leaders and trailblazers who are willing to:

  • Embrace the challenge of shaping the future from the present.
  • Exhibit boundless optimism, believing there are no limits to what we can achieve.
  • Bring deep expertise along with boldness, collaboration, and adaptability.
  • Reimagine how the world works to improve people's daily lives and the world itself.

Our team thrives on:

  • Pushing boundaries.
  • Collaborating across disciplines.
  • Working in highly agile teams.
  • Harnessing the latest technologies and platforms.

If this resonates with you, we invite you to join us!

Job Summary:

As a Senior Associate L2 in Data Engineering, you will translate client requirements into technical designs and implement components for data engineering solutions. You will leverage a deep understanding of data integration and big data design principles to create custom solutions or implement packaged solutions. This role demands a hands-on technologist with a strong programming background in Java, Scala, or Python; experience in data ingestion, integration, wrangling, computation, and analytics pipelines; and exposure to Hadoop ecosystem components. You should also have hands-on knowledge of at least one of the AWS, GCP, or Azure cloud platforms.

Role & Responsibilities:

Your role focuses on designing, developing, and delivering solutions involving:

  • Data integration, processing, and governance.
  • Data storage and computation frameworks, performance optimizations.
  • Analytics and visualizations.
  • Infrastructure and cloud computing.
  • Data management platforms.

You will:

  • Implement scalable architectural models for data processing and storage.
  • Develop functionality for data ingestion from multiple heterogeneous sources in batch and real-time mode.
  • Build functionality for data analytics, search, and aggregation.
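
As a toy illustration of this kind of work, here is a minimal Python sketch of ingesting records from two heterogeneous batch sources (CSV and JSON) into a common format and aggregating them. The sources, field names, and helper functions are hypothetical; in practice this would typically be built with Spark or similar frameworks at far larger scale.

```python
import csv
import io
import json
from collections import defaultdict

def ingest_csv(text):
    """Parse CSV rows into normalized dict records."""
    return [{"user": r["user"], "amount": float(r["amount"])}
            for r in csv.DictReader(io.StringIO(text))]

def ingest_json(text):
    """Parse a JSON array into the same normalized record shape."""
    return [{"user": r["user"], "amount": float(r["amount"])}
            for r in json.loads(text)]

def aggregate(records):
    """Sum amounts per user across all ingested sources."""
    totals = defaultdict(float)
    for r in records:
        totals[r["user"]] += r["amount"]
    return dict(totals)

# Two hypothetical source payloads with different formats.
csv_src = "user,amount\nalice,10.5\nbob,3.0\n"
json_src = '[{"user": "alice", "amount": 2.5}]'

records = ingest_csv(csv_src) + ingest_json(json_src)
print(aggregate(records))  # {'alice': 13.0, 'bob': 3.0}
```

The key idea the sketch shows is normalizing each source into one record schema before aggregation, so downstream logic stays source-agnostic.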

Experience Guidelines:

Mandatory Experience and Competencies:

  1. Overall 5+ years of IT experience, with 3+ years in data-related technologies.
  2. Minimum 2.5 years of experience in Big Data technologies, with working exposure to related data services on at least one cloud platform (AWS/Azure/GCP).
  3. Hands-on experience with the Hadoop ecosystem and adjacent tools – HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow, and other components required to build end-to-end data pipelines.
  4. Strong experience in at least one programming language – Java, Scala, or Python; Java is preferred.
  5. Hands-on working knowledge of NoSQL and MPP data platforms such as HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, GCP BigQuery, etc.
  6. Working knowledge of data platform services, IAM, and data security on at least one cloud platform.

Preferred Experience and Knowledge (Good to Have):

  1. Hands-on knowledge of traditional ETL tools (Informatica, Talend, etc.) and database technologies (Oracle, MySQL, SQL Server, Postgres).
  2. Knowledge of data governance processes (security, lineage, catalog) and tools such as Collibra, Alation, etc.
  3. Knowledge of distributed messaging frameworks such as ActiveMQ, RabbitMQ, or Solace; search and indexing; and microservices architectures.
  4. Experience with performance tuning and optimization of data pipelines.
  5. CI/CD experience – cloud infrastructure provisioning, automated build and deployment pipelines, and code quality tooling.
  6. Cloud data specialty or other related Big Data technology certifications.

Personal Attributes:

  • Strong written and verbal communication and articulation skills.
  • Good team player.
  • Self-starter who requires minimal oversight.
  • Ability to prioritize and manage multiple tasks.
  • Process orientation and the ability to define and set up processes.
