Publicis Sapient Overview:
Publicis Sapient is a digital transformation partner dedicated to helping established organizations transition to their digitally enabled future. Through a fusion of strategy, consulting, and customer experience with agile engineering and problem-solving creativity, we unlock value and accelerate business growth. With a global team of over 20,000 professionals across 53 offices, we combine expertise in technology, data sciences, consulting, and customer obsession to design products and services that truly resonate with our clients’ customers.
Job Summary:
As a Senior Associate L1 in Data Engineering at Publicis Sapient, you will be responsible for the technical design and implementation of components for data engineering solutions. Leveraging a deep understanding of data integration and big data design principles, you will create custom solutions or implement packaged solutions, driving design discussions independently to ensure the overall solution’s effectiveness. This role demands a hands-on technologist with a strong programming background in Spark/PySpark and Java/Scala/Python, experience across data ingestion, integration, data wrangling, computation, and analytics pipelines, and exposure to Hadoop ecosystem components. Hands-on knowledge of GCP is also required.
Role & Responsibilities:
Your role will focus on designing, developing, and delivering solutions involving:
- Data Ingestion, Integration, and Transformation
- Data Storage and Computation Frameworks, with an emphasis on Performance Optimization
- Analytics & Visualizations
- Infrastructure & Cloud Computing
- Data Management Platforms
Experience Guidelines:
Mandatory Experience and Competencies:
- Overall 4+ years of IT experience, with at least 2 years in data-related technologies.
- At least 2 years of experience in Big Data technologies.
- Hands-on experience with the Hadoop stack and working knowledge of real-time data pipelines.
- Strong proficiency in at least one programming language among Java, Scala, or Python, with Java preferred.
- Hands-on working knowledge of NoSQL and MPP data platforms.
Preferred Experience and Knowledge (Good to Have):
- Good knowledge of traditional ETL tools and database technologies.
- Knowledge of data governance processes and tools.
- Familiarity with distributed messaging frameworks, search & indexing, and Microservices architectures.
- Experience in performance tuning and optimization of data pipelines.
- CI/CD experience, including infrastructure provisioning on cloud and code quality management.
- Working knowledge of data platform services on at least one cloud platform, along with IAM and data security.
- Cloud data specialty or other related Big Data technology certifications.
Personal Attributes:
- Strong written and verbal communication skills.
- Ability to articulate ideas clearly and concisely.
- Good team player.
- Self-starter who requires minimal oversight.
- Ability to prioritize and manage multiple tasks.
- Process orientation and the ability to define and set up processes.
More Information:
- Experience: 5-10 years