S&P Global Hyderabad ML Engineer Posts | Bachelor’s Degree/Master’s Degree | Apply Now
S&P Global is an American publicly traded corporation focused on financial information and analytics.
In its latest job announcement, S&P Global has opened vacancies for ML Engineer posts, with the work location in Hyderabad.
Under the S&P Global Hyderabad ML Engineer 2025 Jobs, candidates with the required skills in Python, R, Java, or C# can apply.
Selected candidates will be offered a permanent, full-time position.
Interested and qualified candidates must apply online.
Job Designation: ML Engineer.
Job Code: 308656.
Education Qualification: Bachelor’s Degree/Master’s Degree.
Experience Level: 4+ years.
Job Location: Hyderabad.
Apply Mode: Online.
Job Responsibilities:
- To collaborate with stakeholders, including data scientists, analysts, and other engineers, to understand and refine requirements related to data processing and transformation needs.
- To design, construct, install, and maintain large-scale processing systems and other infrastructure.
- To build high-performance algorithms, prototypes, and conceptual models that enable efficient retrieval and analysis of data.
- To implement ETL processes to acquire, validate, and process incoming data from diverse sources.
- To ensure data architecture and models adhere to compliance, privacy, and security standards.
- To work in conjunction with data scientists to optimize data science and machine learning algorithms and models.
- To provide technical expertise in the resolution of data-related issues, including data quality, data lineage, and data processing errors.
- To manage the deployment of analytics solutions into production and maintain them.
- To maintain high-quality processes and deliver projects in collaborative Agile team environments.
Requirements:
- 3+ years of programming experience, particularly in Python, R, Java, or C#.
- 1+ years of experience working with SQL or NoSQL databases.
- Experience working with PySpark.
- University degree in Computer Science, Engineering, Mathematics, or related disciplines.
- Strong understanding of big data technologies such as Hadoop, Spark, or Kafka.
- Demonstrated ability to design and implement end-to-end scalable and performant data pipelines.
- Experience with workflow management platforms like Airflow.
- Strong analytical and problem-solving skills.
- Ability to collaborate and communicate effectively with both technical and non-technical stakeholders.
- Experience building solutions and working in an Agile environment.
- Experience working with Git or other source control tools.
How to apply:
Interested and qualified applicants must apply online, by first registering on the S&P Global career portal and then logging in to apply.
Apply online:
https://careers.spglobal.com/jobs/308656?lang=en-us
For more information about S&P Global vacancies, visit the S&P Global Recruitment page.