Siemens Pune DevOps Engineer Posts | Bachelor’s Degree/Master’s Degree | Apply Now
Siemens Ltd, a German conglomerate, focuses its services on Industry, Energy, Healthcare, and Infrastructure & Cities.
In its latest job announcement, Siemens India has advertised vacancies for DevOps Engineer posts, with the work location in Pune.
Under the Siemens Pune DevOps Engineer 2021 Jobs, candidates with the required skills in Python, C#, and AWS services can apply.
The selected candidate will be offered a permanent, full-time position.
Interested and qualified candidates must apply online.
Job Designation: DevOps Engineer.
Job Code: 261327.
Education Qualification: Bachelor’s Degree/Master’s Degree.
Experience Level: 3 to 6 years.
Job Location: Pune.
Apply Mode: Online.
Job Responsibilities:
- Take responsibility and ownership for the end-to-end operation of a data analytics environment: build and maintain the platform that hosts our data-driven services, allowing us to process and utilize the wealth of data we have. It is mainly a back-end solution, with a limited front-end component as well.
- Integrate data analytics services into global offerings by packaging Python code produced by data scientists to be compatible with our R&D infrastructure and the target products that will consume the service.
- Work closely with global, cross-functional teams, such as Architects, Data Scientists, DevOps, and Product Managers, to understand and implement the solution requirements.
- Make sure the numerous APIs are well maintained, and test and validate the environment with end users.
What Makes Me Eligible for This Role:
- Educational Qualification: A university degree in Computer Science or a comparable education; we are flexible as long as a high quality of code is ensured.
- Experience Required: 3 to 8 years.
- Our team needs an early starter who brings a few years of professional experience in software engineering, ideally with Python, or alternatively with Scala or C#.
- As you will be working with these from day one, we expect applicants to be familiar with AWS services beyond EC2 (e.g., Fargate, Batch, RDS, SageMaker).
- Initial experience with, or willingness to explore, big data pipeline and compute tooling such as Luigi, Airflow, Beam, Spark, and Databricks. When it comes to methodologies, knowledge of agile software development processes is highly valued.