Hitachi Consulting Software Services India Pvt. Ltd Chennai Data Streaming Engineer Posts | Bachelor’s Degree/Master’s Degree | Apply Now
Hitachi Consulting Corporation is an American international management and technology consulting firm with headquarters in Dallas, Texas.
In its latest job announcement, Hitachi has announced vacancies for Data Streaming Engineer posts, with the work location in Chennai.
Under Hitachi Chennai Data Streaming Engineer 2022 Jobs, candidates with the required skills in AWS Glue, S3, Athena, SQS, Kafka brokers, ZooKeeper, KSQL, and KStream can apply.
The selected candidate will be offered a permanent, full-time position.
Interested and qualified candidates have to apply through online mode.
Job Designation: Data Streaming Engineer.
Job Code: 1019299HV.
Education Qualification: Bachelor’s Degree/Master’s Degree.
Experience Level: 4.5 years.
Job Location: Chennai.
Apply Mode: Online.
Technical experience:
- Design and recommend the approach best suited for data movement to/from different sources using Apache/Confluent Kafka.
- Good understanding of event-based architecture, messaging frameworks, and stream-processing solutions built on the Kafka messaging framework.
- Hands-on experience with Kafka Connect using Schema Registry in a high-volume environment.
- Strong knowledge of and exposure to Kafka brokers, ZooKeeper, KSQL, KStream, and Confluent Control Center.
- Good knowledge of the big data ecosystem to design and develop capabilities that deliver solutions through CI/CD pipelines.
- Skilled in writing and troubleshooting Python/PySpark scripts that extract, cleanse, conform, and deliver data for consumption (see the sketch after this list).
- Strong working knowledge of the AWS data analytics ecosystem, including AWS Glue, S3, Athena, SQS, etc.
- Good understanding of other AWS services such as CloudWatch monitoring, scheduling, and automation services.
- Good experience working with Kafka connectors such as MQ connectors, Elasticsearch connectors, JDBC connectors, FileStream connectors, and JMS source connectors, as well as tasks, workers, and converters.
- Working knowledge of the Kafka REST Proxy and experience building custom connectors using Kafka core concepts and APIs.
- Create topics, set up redundancy clusters, deploy monitoring tools and alerts, and follow best practices.
- Develop and ensure adherence to published system architectural decisions and development standards.
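To illustrate the Python/PySpark skill mentioned above, here is a minimal sketch of a Structured Streaming job that reads events from a Kafka topic, cleanses and conforms them, and writes Parquet to S3 so they can be queried through Glue/Athena. The broker address, topic name, event schema, and bucket paths are placeholders, and the sketch assumes the Spark Kafka connector package (spark-sql-kafka) is available on the cluster; the actual pipeline would depend on the project.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = (
    SparkSession.builder
    .appName("orders-stream-curation")  # hypothetical job name
    .getOrCreate()
)

# Illustrative schema of the incoming JSON events.
event_schema = StructType([
    StructField("order_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", TimestampType()),
])

# Read raw events from a Kafka topic (broker and topic are placeholders).
raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")
    .option("subscribe", "orders.raw")
    .option("startingOffsets", "latest")
    .load()
)

# Cleanse and conform: parse the JSON payload, drop malformed or invalid rows.
curated = (
    raw.select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
    .where(F.col("order_id").isNotNull() & (F.col("amount") >= 0))
)

# Deliver for consumption: write Parquet to S3, where Glue/Athena can query it.
query = (
    curated.writeStream
    .format("parquet")
    .option("path", "s3a://example-curated-bucket/orders/")                 # placeholder bucket
    .option("checkpointLocation", "s3a://example-curated-bucket/_chk/orders/")
    .trigger(processingTime="1 minute")
    .start()
)

query.awaitTermination()
```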
Good to have:
- Ability to perform data-related benchmarking, performance analysis and tuning.
- Understanding of data warehouse architecture and data modelling.
- Strong skills in in-memory applications, database design, and data integration.
- Ability to guide and mentor team members on using Kafka.
How to apply:
Interested and qualified applicants have to apply online by first registering on the Hitachi careers portal and then logging in to apply.
Apply online:
https://careers.hitachi.com/jobs/10215498-data-streaming-engineer