Deloitte US GLS India Bangalore Azure Data Engineer Posts | Bachelor’s Degree/Master’s Degree | Apply Now
Deloitte US GLS India is a leading multinational professional services firm.
In its latest job announcement, Deloitte has advertised vacancies for Azure Data Engineer posts, with the work location in Bangalore.
Under the Deloitte Bangalore Azure Data Engineer 2023 jobs, candidates with the required skills in Python and Apache Spark can apply.
Selected candidates will be offered a permanent, full-time position.
Interested and qualified candidates must apply online.
Job Designation: Azure Data Engineer.
Job Code: 55890.
Education Qualification: Bachelor’s Degree/Master’s Degree.
Experience Level: 5 to 8 years.
Job Location: Bangalore.
Apply Mode: Online.
Responsibilities:
- Designing and implementing highly performant data ingestion pipelines from multiple sources using Apache Spark and/or Azure Databricks.
- Delivering and presenting proofs of concept of key technology components to project stakeholders.
- Developing scalable and reusable frameworks for ingesting and enriching datasets.
- Integrating the end-to-end data pipeline to take data from source systems to target data repositories, ensuring the quality and consistency of data are maintained at all times.
- Working with event-based/streaming technologies to ingest and process data.
- Working with other members of the project team to support delivery of additional project components (API interfaces, Search).
- Evaluating the performance and applicability of multiple tools against customer requirements.
- Working within an Agile delivery/DevOps methodology to deliver proofs of concept and production implementations in iterative sprints.
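The ingestion responsibilities above follow the usual extract-transform-load shape. As a minimal, self-contained sketch (plain standard-library Python rather than Spark/Databricks, and all names and sample data are illustrative, not from the posting), the quality-gated flow from source to target repository might look like:

```python
import csv
import io

# Hypothetical raw extract; in a real pipeline this would arrive from a
# source system (database dump, event stream, or file drop).
RAW_CSV = """order_id,amount,region
1,100.50,south
2,,north
3,250.00,south
"""

def extract(raw: str) -> list[dict]:
    """Read raw CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[dict]:
    """Drop rows that fail a basic data-quality check, enrich the rest."""
    out = []
    for row in rows:
        if not row["amount"]:                  # quality gate: missing amount
            continue
        row["amount"] = float(row["amount"])   # type normalization
        row["region"] = row["region"].upper()  # simple enrichment
        out.append(row)
    return out

def load(rows: list[dict], target: list) -> int:
    """Append cleaned rows to the target repository (a list stands in here)."""
    target.extend(rows)
    return len(rows)

warehouse: list[dict] = []
loaded = load(transform(extract(RAW_CSV)), warehouse)
print(loaded)  # number of rows that passed the quality gate
```

In a Spark/Databricks implementation the same three stages would map to DataFrame reads, column transformations, and writes to the target store, but the structure of the pipeline is the same.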
Qualifications:
- Strong knowledge of Data Management principles
- 5 to 8 years of total experience
- Strong understanding of Python 3 concepts and fundamentals, including syntax, common data structures, testing, and object-oriented programming basics.
- Should be able to demonstrate coding and problem-solving acumen by writing working code.
- Hands-on familiarity with loops, dictionaries, arrays, modules, UDFs, classes, objects, PySpark, DataFrames, SQL, Databricks, and Azure Blob Storage.
- Experience in building ETL / data warehouse transformation processes
- Direct experience in building data pipelines using Azure Data Factory and Apache Spark (preferably Databricks).
- Experience using Apache Spark and associated design and development patterns
- Microsoft Azure Big Data Architecture certification is an advantage.
- Hands-on experience designing and delivering solutions using Azure Storage, Azure SQL Data Warehouse, Azure Data Lake, Azure Cosmos DB, Azure Stream Analytics
- Experience working in a DevOps environment with tools such as Microsoft Visual Studio Team Services, Terraform, etc.
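To illustrate the Python fundamentals the qualifications list names (loops, dictionaries, classes, and UDF-style functions), here is a short, hedged sketch of the kind of exercise such a coding round might involve; the names, rate, and data are hypothetical, and the PySpark and Azure pieces are deliberately left out so the snippet stays self-contained:

```python
from dataclasses import dataclass

@dataclass
class Order:
    """A minimal object model, the kind of small class often asked for."""
    order_id: int
    amount: float
    region: str

def tax_udf(amount: float, rate: float = 0.18) -> float:
    """A plain per-value function; in PySpark the same logic would be wrapped
    with pyspark.sql.functions.udf and applied to a DataFrame column.
    The 0.18 rate is illustrative only."""
    return round(amount * (1 + rate), 2)

orders = [Order(1, 100.0, "south"), Order(2, 200.0, "north"), Order(3, 50.0, "south")]

# Loop + dictionary: aggregate taxed totals per region.
totals: dict[str, float] = {}
for o in orders:
    totals[o.region] = totals.get(o.region, 0.0) + tax_udf(o.amount)

print(totals)
```

Being able to explain why the per-row logic lives in a separate function (testable on its own, and directly portable to a Spark UDF) is typically as important as the code itself.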
How to apply:
Interested and qualified applicants must apply online by first registering on the Deloitte careers portal and then logging in to submit an application.
Apply online: