Overview:

The Data & Analytics team is seeking a Big Data Lead Engineer to spearhead our mission of maximizing the value of our data assets. In this role, you will drive innovation, streamline access to data in our Big Data repositories, and uphold standards and principles across the Big Data domain. Our Hadoop Data Warehouse underpins the insights, machine learning, and data science work that supports revenue-generating initiatives across the organization.

Role:

  • Develop high-quality, secure, and scalable data pipelines using Spark, Scala/Python/Java on Hadoop or object storage.
  • Design and architect data flows in the Hadoop environment to ensure scalability, repeatability, and efficiency.
  • Drive automation and efficiency in data ingestion, movement, and access workflows through innovation and collaboration.
  • Implement and enforce software development standards and engineering principles in the Big Data space.
  • Collaborate closely with business stakeholders and embedded engineering teams within business units to expedite the development of scalable products.
  • Embrace new technologies and methodologies to innovate with increasingly large data sets.
  • Collaborate with project teams to meet scheduled due dates, identifying emerging issues and recommending solutions.
  • Independently perform assigned tasks and address production incidents.
  • Contribute ideas to enhance standards and improve process efficiency.

All About You:

  • 10-14 years of experience in Data Warehouse projects within product or service-based organizations.
  • Expertise in data engineering, with multiple end-to-end Data Warehouse projects delivered in the Big Data Hadoop environment.
  • Experience building data pipelines with Spark and Python on Hadoop or object storage.
  • Proficiency with databases such as Oracle and Hadoop, along with strong SQL skills.
  • Experience with real-time data flow systems such as NiFi and Kafka is advantageous.
  • Familiarity with automating data flow processes in a Big Data environment.
  • Experience working in Agile teams.
  • Strong analytical skills for debugging production issues and providing root cause analysis.
  • Excellent verbal and written communication skills, with a knack for collaboration and building relationships.
  • Ability to manage multiple projects concurrently, provide technical leadership to junior team members, and interface with internal and external resources.
  • High-energy, detail-oriented, proactive, and able to work independently under pressure.
  • Quick learner able to adopt new technologies and run proofs of concept to identify optimal solutions.
  • Flexible approach to working in diverse and geographically distributed project teams within a matrix-based structure.
