NTT DATA

Offshore Pod Leads (TBD, AN, IN)

Other · India, United Kingdom, United States
Data-Engineering · Cloud-Data-Engineering · AWS-Data-Engineering · Data-Architecture · Data-Migration · Senior

Job Description

Req ID: 360093

NTT DATA strives to hire exceptional, innovative, and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking Offshore Pod Leads to join our team in TBD, Andaman and Nicobar Islands (IN-AN), India (IN).

JOB DESCRIPTION

Data Engineering Pod Lead
Databricks Lakehouse Migration Program
Two Roles: Informatica Pod Lead | AWS Glue Pod Lead

Engagement Type: Contract / Staff Augmentation or Full-Time Employee (FTE); open to either
Seniority Level: Lead / Architect, 12+ years of relevant experience
Number of Openings: 2 (one per pod)
Team Size: 4–6 Data Engineers per pod lead
Cloud Platform: AWS (Glue, Redshift, S3, Kinesis Streams, IAM, CloudWatch)
Target Platform: Databricks Lakehouse (Unity Catalog, Delta Lake, Workflows)
Program Type: Client-facing migration engagement (ETL modernization)

Program Context & Opportunity

Our client is undertaking a large-scale data platform modernization initiative: migrating from a legacy ETL ecosystem (Informatica PowerCenter, AWS Glue, and Amazon Kinesis Streams) feeding Amazon Redshift into a unified Databricks Lakehouse architecture built on Delta Lake. This is a high-impact, high-visibility program requiring experienced technical leaders who can navigate complex legacy systems, architect modern solutions, and lead skilled engineering teams through the full migration lifecycle.

We are hiring two dedicated Pod Leads, one for each legacy source domain, who will be jointly accountable for technical excellence, delivery velocity, and team development throughout the engagement.

Common Responsibilities (Both Pod Leads)

Technical Leadership & Architecture

- Own the end-to-end technical design and implementation of the migration from the respective source platform to Databricks Lakehouse (Delta Lake, Unity Catalog, Databricks Workflows).
- Conduct thorough assessments of existing ETL jobs (analyzing lineage, dependencies, transformation logic, scheduling, and data quality rules) prior to migration planning.
- Define migration patterns, reusable frameworks, and coding standards adopted across the pod.
- Architect scalable, cost-efficient pipelines using Databricks PySpark, Spark SQL, and Delta Live Tables (DLT) as appropriate; see the sketch after this list.
- Make and document key architectural decisions (ADRs) with clear rationale and trade-off analysis.
- Drive adoption of software engineering best practices: version control (Git), CI/CD, unit testing, and code review within the pod.
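
To make the target pattern concrete, here is a minimal sketch of the kind of bronze/silver Delta Live Tables pipeline described above. It is illustrative only: the feed name (orders), bucket path, columns, and the expectation rule are hypothetical placeholders, not client specifics.

    # Minimal DLT sketch: bronze ingestion via Auto Loader, silver cleansing with a
    # data quality expectation. All names and paths are hypothetical placeholders.
    import dlt
    from pyspark.sql import functions as F

    # `spark` is provided by the DLT runtime inside a Databricks pipeline notebook.

    @dlt.table(comment="Bronze: raw orders files landed in S3 by the legacy feed.")
    def orders_bronze():
        # Auto Loader incrementally discovers new files, replacing the legacy batch trigger.
        return (
            spark.readStream.format("cloudFiles")
            .option("cloudFiles.format", "json")
            .load("s3://example-bucket/landing/orders/")
        )

    @dlt.table(comment="Silver: cleansed orders with legacy transformation logic re-expressed in PySpark.")
    @dlt.expect_or_drop("valid_order_id", "order_id IS NOT NULL")  # quality rule carried over from the source ETL
    def orders_silver():
        return (
            dlt.read_stream("orders_bronze")
            .withColumn("order_ts", F.to_timestamp("order_ts"))
            .dropDuplicates(["order_id"])
        )

Running such a pipeline in a Unity Catalog-enabled workspace also yields the lineage tracking and catalog registration called out under governance below.
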
Team Leadership & Delivery Management

- Directly lead a pod of 4–6 Data Engineers, providing technical mentorship, task assignment, code reviews, and unblocking of day-to-day impediments.
- Manage sprint planning, backlog refinement, and progress tracking against migration milestones in close coordination with the Program Manager.
- Hold the team accountable for quality and velocity; proactively flag risks, scope changes, and dependencies before they become blockers.
- Conduct regular 1:1s and technical feedback sessions to support the professional growth of pod members.
- Foster a culture of ownership, collaboration, and continuous improvement within the pod.

Client & Stakeholder Communication

- Serve as the primary technical point of contact for your pod's workstream with the client.
- Translate complex technical concepts and migration trade-offs into clear, concise communications for both technical and non-technical stakeholders.
- Participate in program-level status reviews, architecture governance meetings, and client steering committees as required.
- Manage expectations around scope, timelines, and quality, escalating issues appropriately.

Quality, Governance & Documentation

- Ensure all migrated pipelines meet the data quality, SLA, and observability requirements defined by the client.
- Champion data governance best practices, including lineage tracking, catalog registration in Databricks Unity Catalog, and access control alignment.
- Produce and maintain clear technical documentation: architecture diagrams, runbooks, migration playbooks, and handover materials.
- Coordinate with QA/testing resources to validate migrated pipelines against source-system outputs.

ROLE 1: Informatica PowerCenter Pod Lead

Role Overview

The Informatica Pod Lead will own the migration of Informatica PowerCenter-based ETL jobs to the Databricks Lakehouse platform. This role demands deep expertise in Informatica's architecture, transformation logic, and metadata, paired with the ability to re-engineer complex legacy workflows into modern, cloud-native Databricks pipelines on AWS.

Role-Specific Responsibilities

- Analyze and decompose Informatica PowerCenter mappings, sessions, workflows, and worklets to understand the full transformation logic, source/target connectivity, and scheduling dependencies.
- Define and execute a structured migration methodology (assess, convert, validate) for translating Informatica logic into equivalent PySpark/Spark SQL code on Databricks; a validation sketch follows below.
- Identify opportunities to simplify or consolidate legacy transformations
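
As a purely illustrative example of the "validate" step, the sketch below reconciles a migrated Delta table against the legacy Redshift target by comparing row counts and an order-independent checksum. The JDBC URL, schema/table names, and hash columns are assumptions made for the sketch, not client details.

    # Illustrative validate-step reconciliation: legacy Redshift output vs. migrated
    # Delta table. Connection details and table names are hypothetical placeholders;
    # in practice credentials come from a Databricks secret scope, and the Redshift
    # JDBC driver must be available on the cluster.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()

    legacy = (
        spark.read.format("jdbc")
        .option("url", "jdbc:redshift://example-cluster:5439/warehouse")
        .option("dbtable", "analytics.orders")
        .option("user", "reconcile_ro")
        .option("password", "<from-secret-scope>")
        .load()
    )
    migrated = spark.read.table("main.analytics.orders")  # Unity Catalog three-level name

    def profile(df):
        # Row count plus a checksum over key columns gives a cheap first-pass
        # equivalence signal before any column-level diffing.
        return df.agg(
            F.count(F.lit(1)).alias("rows"),
            F.sum(F.hash("order_id", "amount")).alias("checksum"),
        ).first()

    src, tgt = profile(legacy), profile(migrated)
    assert src["rows"] == tgt["rows"], f"row count drift: {src['rows']} vs {tgt['rows']}"
    assert src["checksum"] == tgt["checksum"], "checksum mismatch: escalate to column-level diff"

A full validation pass would extend this with column-level comparisons and tolerance rules for types that do not round-trip exactly, such as timestamps and decimals.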