IBM India Private Limited Kolkata Big Data Engineer Posts | Bachelor’s Degree/Master’s Degree | Apply Now
IBM is an American multinational technology company that manufactures and markets computer hardware, middleware, and software, and provides hosting and consulting services.
In its latest job announcement, IBM has advertised vacancies for the Big Data Engineer post, with the work location in Kolkata.
Under the IBM Kolkata Big Data Engineer 2023 Jobs, candidates with the required skills in GCP Cloud, Python, ETL pipelines, data warehousing, data lakes, and data analysis can apply.
The selected candidate will be offered a permanent, full-time position. Interested and qualified candidates must apply online.
Job Designation: Big Data Engineer.
Job Code: 622288BR.
Education Qualification: Bachelor’s Degree/Master’s Degree.
Experience Level: Required.
Job Location: Kolkata.
Apply Mode: Online.
Your Role and Responsibilities:
- In this role, you’ll work in our IBM Client Innovation Center (CIC), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. These centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
- Skilled in multiple GCP services – GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer, etc.
- Must have Python and SQL work experience
- Ability to analyse data for functional/business requirements
- Excellent communication skills
- Ability to interact directly with customers
- Proactive, collaborative, and able to respond to critical situations
- Mandatory skills: GCP Cloud, Python, ETL pipelines, data warehouse, data lake, data analysis, data modelling and implementation, SQL (Oracle/MySQL, PL/SQL – primarily for understanding existing datasets), Big Data technology stack (BigQuery preferred), Agile methodology and ways of working
- Ability to translate business requirements into technical specifications
- Ability to build tactical and strategic roadmaps on the GCP platform
- Experience in defining technical architecture on GCP involving Cloud Storage, BigQuery, Dataproc, Cloud Functions, and Dataflow
- Hands-on experience migrating on-premise data pipelines and data warehouses to Google Cloud Platform
- Design and develop data warehouse / data lake solutions on Google Cloud Platform
- Design, code, and construct data models and ETL processes
Required Technical and Professional Expertise:
- Intuitive individual with an ability to manage change and proven time management skills
- Proven interpersonal skills while contributing to team effort by accomplishing related results as needed
- Maintains up-to-date technical knowledge by attending educational workshops and reviewing publications
Preferred Technical and Professional Expertise:
- Skilled in multiple GCP services – GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies
- Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work
How to apply:
Interested and qualified job applicants must apply online, by first registering on the IBM careers portal and then logging in to submit an application.