PradeepIT Consulting Services Pvt Ltd

PySpark & Databricks Certified Developer


India
Data Engineering, Big Data Development, Databricks Developer, PySpark Developer, Cloud Data Engineer, Entry-level

Job Description

About the job

Accelerate your career with PradeepIT. PradeepIT is one of the largest globally recognized IT consulting firms, connecting India's deeply vetted talent with global customers. We're headquartered in Bengaluru, the Silicon Valley of India. PradeepIT's customers include SAP Labs, Bosch, Rolls-Royce, Daikin, Daimler, J&J, and hundreds of other Fortune 500 companies and fast-growing startups. With continuous hard work, and working remotely by choice, PradeepIT is certified as a Great Place to Work. Trusted by leading brands and Fortune 500 companies from around the world, we have achieved:

- 6+ years of experience
- 580+ open-source technology consultants
- 120+ SAP consultants
- 40+ Salesforce consultants
- 60+ Adobe consultants
- 100+ mobility consultants
- 890+ clients in APAC, EMEA & USA

Our Beliefs

PradeepIT believes in connecting people across the globe and providing them an opportunity to work remotely. As a people-first organization, PradeepIT constantly looks for individuals who won't just keep up, but break new ground, work with cutting-edge technology, and ramp up their skills for free with courses created by our vertical heads and senior architects through the PradeepIT Academy.

Responsibilities

- Designing and implementing data ingestion pipelines from multiple sources using Azure Databricks
- Developing scalable and reusable frameworks for ingesting data sets
- Integrating the end-to-end data pipeline, taking data from source systems to target data repositories while ensuring data quality and consistency are maintained at all times
- Working with event-based/streaming technologies to ingest and process data
- Working with other members of the project team to support delivery of additional project components (API interfaces, search)
- Evaluating the performance and applicability of multiple tools against customer requirements

Requirements

- Bachelor's degree and at least one year of experience designing, developing, deploying, and/or supporting data pipelines using Databricks
- Expertise in designing and deploying data applications on cloud platforms such as Azure or AWS
- Hands-on experience in performance tuning and optimizing code running in a Databricks environment
- Proficiency in programming languages such as PySpark and Python
- Good understanding of SQL, T-SQL, and/or PL/SQL
- Demonstrated analytical and problem-solving skills, particularly as applied to a big data environment
- Willingness to work on-site or remotely, as needed

Originally posted on Himalayas