Salesforce Hyderabad Data Engineer Posts | Bachelor’s Degree/Master’s Degree | Apply Now
Salesforce is an American cloud-based software company that sells a complementary suite of enterprise applications focused on customer service, marketing automation, analytics, and application development.
In its latest job announcement, Salesforce Careers India has announced vacancies for Data Engineer posts, with the work location in Hyderabad.
Under the Salesforce Hyderabad Data Engineer 2021 hiring drive, candidates with the required skills in SQL, Python, and cloud technologies can apply.
The selected candidate will be offered a permanent, full-time position.
Interested and qualified candidates must apply online.
Job Designation: Data Engineer - Customer Intelligence.
Job Code: JR117246.
Education Qualification: Bachelor’s Degree/Master’s Degree.
Experience Level: 3 to 7 years.
Job Location: Hyderabad.
Apply Mode: Online.
Job Responsibilities:
- Design and enhance the next-generation data model to accommodate ever-changing data, and design and implement quick data solutions for business data needs
- Support data migration initiatives to Cloud platforms like Snowflake
- Build end-to-end solutions (proof of concept to production grade) using open-source ETL frameworks, document the solutions, and enable the datasets for end users
- Be flexible in terms of tools and technologies as we explore efficient and unique ways to support our diverse stakeholders
- Work closely with data scientists to build next-generation data integration capabilities that will support a variety of predictive applications
- Always be on the lookout to automate and improve existing data processes for quicker turnaround and high productivity
- Build flawless and highly efficient data pipelines
- Take pride in and complete ownership of the data pipelines built, and have a passion for high-quality data
- Have a high sense of urgency to meet milestones/dates and deliver projects
Eligibility Criteria:
- Bachelor's or Master's degree in Computer Science or a related field with 3 to 7 years of experience in a technical data organization
- 2+ years of hands-on experience in cloud technologies (GCP/AWS) is a must
- 2+ years of experience writing complex SQL (Oracle, Snowflake, Hive, etc.)
- Proficiency in Python with at least 2 years of scripting experience; shell scripting is a plus
- Experience with API development for exposing data-as-a-service
- 2+ years of experience building ETL solutions using ETL tools or custom ETL frameworks
- Experience with automation and scheduling tools like Airflow is a huge plus
- Experience building/maintaining data pipelines in a data warehouse or data lake environment, preferably on a cloud platform
- Hands-on familiarity with big data and the Hadoop ecosystem of tools is desired
- Salesforce experience is a plus but not required
How to apply:
Interested and qualified applicants must apply online by first registering on the Salesforce career portal and then logging in to submit an application.