Job Description
Batch-2020 & before
GDIA Data Tech Connected Vehicle is the business owner of the enterprise data lake for all global embedded modem, Smart Mobility experiment, and Ford Pass data. The Data Engineer role is a critical enabler of the development of Ford's Data Supply Chain, including Connected Vehicle data. The engineer will ensure that data standards, information interoperability, data quality, and data availability fully support the goals of GDIA and the enterprise.
RESPONSIBILITIES
- Work closely with the Business team to gather requirements and other required information
- Coordinate between the Business and Analytics teams and other vendors
- Analyze the data and create a design document based on the requirements for Landing
- Work hands-on with the team and other stakeholders to deliver quality software products that meet our customers' requirements and needs
- Help Product Owners understand our iterative development approach and focus on delivering a Minimum Viable Product through careful and deliberate prioritization
- Transform the data and load it into summary tables
- Document desk procedures and keep them current
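The transform-and-summarize responsibility above might look like the following minimal sketch. The record fields, VIN values, and aggregation logic are illustrative assumptions, not Ford's actual schema or pipeline:

```python
from collections import defaultdict

# Hypothetical raw connected-vehicle events (illustrative only)
raw_events = [
    {"vin": "VIN001", "signal": "odometer", "value": 120.5},
    {"vin": "VIN001", "signal": "odometer", "value": 130.0},
    {"vin": "VIN002", "signal": "odometer", "value": 45.2},
]

def summarize(events):
    """Roll per-event rows up into one summary row per vehicle (VIN)."""
    summary = defaultdict(lambda: {"count": 0, "max_value": float("-inf")})
    for e in events:
        row = summary[e["vin"]]
        row["count"] += 1
        row["max_value"] = max(row["max_value"], e["value"])
    return dict(summary)

summary_table = summarize(raw_events)
```

In a production data lake this aggregation would typically run in a distributed engine such as Spark or Hive rather than plain Python, but the shape of the work is the same: raw event rows in, one summary row per key out.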
QUALIFICATIONS
- Bachelor’s degree in Statistics, Economics, Data Science, Computer Science, Engineering, or Mathematics
- Minimum of 2 years of experience in a Data Management role including running queries and compiling data for analytics
- Minimum of 2 years of experience in data design, data architecture and data modeling (both transactional and analytic)
- Minimum of 1 year of experience in Big Data / NoSQL technologies, including Hadoop (HDFS, MapReduce, Hive, Scala, Spark, etc.), especially command-line experience with loading and manipulating files within HDFS
- Exposure to GCP technologies
Preferred Qualifications:
- Master’s degree in Statistics, Economics, Data Science, Computer Science, Engineering, or Mathematics
- 3 years of experience in a Data Management role including running queries and compiling data for analytics
- Minimum of 2 years of experience in data design, data architecture and data modeling (both transactional and analytic)
- Minimum of 1 year of experience in Big Data / NoSQL technologies, including Hadoop (HDFS, MapReduce, Hive, Scala, Spark, etc.), especially command-line experience with loading and manipulating files within HDFS
- Strong oral and written communication skills
- Experience programming and producing working models or transformations with modern programming languages
- Demonstrated experience building visualizations using Tableau/QlikView