S&P Global Hyderabad Lead Cloud Engineer Posts | Bachelor’s Degree | Apply Now
S&P Global is an American publicly traded corporation focused on financial information and analytics.
In its latest job announcement, S&P Global has advertised vacancies for the Lead Cloud Engineer post, with the work location in Hyderabad.
Under the S&P Global Hyderabad Lead Cloud Engineer 2022 jobs, candidates with the required skills in AWS and Apache Kafka can apply.
The selected candidate will be hired for a permanent, full-time position.
Interested and qualified candidates have to apply online.
Job Designation: Lead Cloud Engineer.
Job Code: 273902.
Education Qualification: Bachelor’s Degree.
Experience Level: 5 to 10 years.
Job Location: Hyderabad.
Apply Mode: Online.
- Work together with various domestic and overseas teams across the organization to ensure that the SOA-, Docker-, and Kubernetes-based container platform is reliable, operational, and performant, and that it meets business requirements as well as committed SLAs.
- Set up, configure, and monitor the API and messaging platform, and conduct routine maintenance for smooth operation with guaranteed uptime.
- Onboard applications to the platform as and when needed with high priority.
- Assist various DEV and QA teams during their development and testing following the guidelines provided.
- Work closely with the supervisor/manager on day-to-day operational activities.
- Conduct and/or assist in conducting regular capacity analysis and POCs.
- Develop and maintain the platform automation tools, dashboards, and utilities (Java and .NET C#).
- Provide support for Production, DR, and lower environments, ensuring the platform is stable and highly available.
- BS in Computer Science, Engineering, or an equivalent discipline is required.
- Minimum 5-10 years of relevant work experience managing platforms and/or infrastructure.
- Minimum 1-2 years of experience with an industry-leading container platform such as Docker Enterprise or Kubernetes is required.
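One responsibility above is developing small platform automation utilities in Java. As an illustrative sketch only (the host names are placeholders, not actual S&P Global endpoints), such a utility might probe whether a broker or API endpoint is reachable:

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

// Illustrative sketch of a tiny platform utility of the kind the
// responsibilities list describes: it checks TCP reachability of an
// endpoint (e.g. a Kafka broker or API gateway) within a timeout.
public class EndpointProbe {

    public static boolean isReachable(String host, int port, int timeoutMs) {
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            return false; // unresolved host, connection refused, or timeout
        }
    }

    public static void main(String[] args) {
        // Placeholder endpoint; a real tool would iterate over the platform's inventory.
        System.out.println("reachable=" + isReachable("localhost", 9092, 500));
    }
}
```

A real monitoring tool would feed results like this into a dashboard such as Datadog or Prometheus, which the requirements below also mention.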
- Proficient with modern DevOps tools including Jenkins and Azure DevOps (VSTS), GitHub Enterprise and Cloud based CI/CD pipelines.
- Hands-on knowledge or experience of AWS is a plus.
- Strong experience with Apache Kafka.
- Good experience with Apache Kafka tools, Schema Registry, and MirrorMaker 2.
- Solid experience with Apache Kafka broker configuration and tuning, and with the producer, consumer, and admin APIs.
- Good working experience managing and administering application platforms/appliances such as Kafka, Envoy-based Kafka filters, and Kafka connectors.
- Strong experience with application deployment and monitoring; experience with Datadog, Prometheus, or similar tools is preferable.
- Experience with Kafka operators.
- Experience deploying and managing Kafka deployments in Kubernetes.
- Experience designing async APIs for Kafka applications and event-driven applications.
- Knowledge of authentication and authorization techniques such as OAuth, JWT, and mTLS.
- An API mindset and experience driving API-based strategies.
- Ability to manage the Kafka platform and implement tools to improve existing monitoring and automation practices.
- Should be able to write good documentation, prepare architecture diagrams, and present to senior management.
- Good understanding of single-, hybrid-, and multi-cloud architectures, preferably with hands-on experience.
- Hands-on experience integrating apps and data across various environments.
- Experience with Scrum and Agile processes.
- Have excellent communication and troubleshooting skills.
- Ability to present solutions to complex problems to technical and non-technical audiences.
- Passion to learn new technologies and grow with the team.
- Good hands-on experience with Linux/Unix and Windows OS.
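Several of the requirements above center on Kafka client configuration and tuning. As a minimal sketch of what that involves (the property keys are standard Kafka producer settings; the values, host names, and keystore paths are assumptions for illustration, not S&P Global's configuration), a producer configuration with mTLS layered on top might look like this in Java:

```java
import java.util.Properties;

// Illustrative sketch: builds the kind of producer configuration a Kafka
// platform engineer tunes. Plain java.util.Properties is used so the
// snippet runs without the kafka-clients dependency; with that library on
// the classpath, these keys would be passed to new KafkaProducer<>(props).
public class ProducerConfigSketch {

    public static Properties baseConfig(String bootstrapServers) {
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", bootstrapServers); // broker list (placeholder host)
        props.setProperty("acks", "all");                 // wait for all in-sync replicas
        props.setProperty("enable.idempotence", "true");  // no duplicates on retry
        props.setProperty("compression.type", "lz4");     // CPU/bandwidth trade-off
        props.setProperty("linger.ms", "5");              // small batching window
        return props;
    }

    // Layer mTLS settings (one of the auth techniques listed above) on top
    // of the base config; paths and password are placeholders.
    public static Properties withMutualTls(Properties props) {
        props.setProperty("security.protocol", "SSL");
        props.setProperty("ssl.keystore.location", "/etc/kafka/client.keystore.jks");
        props.setProperty("ssl.keystore.password", "changeit");
        props.setProperty("ssl.truststore.location", "/etc/kafka/client.truststore.jks");
        return props;
    }

    public static void main(String[] args) {
        Properties props = withMutualTls(baseConfig("broker-1:9092"));
        System.out.println("acks=" + props.getProperty("acks"));
        System.out.println("security.protocol=" + props.getProperty("security.protocol"));
    }
}
```

Broker-side tuning (partition counts, replication factor, retention) follows the same pattern of reasoning about durability versus throughput, which is what the "broker configuration and tuning" requirement refers to.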
How to apply:
Interested and qualified applicants have to apply online, by first registering on the S&P Global career portal and then logging in to submit the application.