Data Engineer (IT)
7339
At Precision Castparts (PCC), we make extraordinary products for aerospace and other industries. This is made possible by the hard work and creativity of a diverse and global workforce. We are committed to fostering a culture of inclusiveness, empowerment and respect that embraces the differences in who we are. Working together, we will continue to solve complex problems every day.
We are relentless in our dedication to being a high-quality and on-time producer, delivering the highest value to our customers while continually pursuing strategic, profitable growth.
PCC employs more than 20,000 people worldwide in over 120 plants spread across twenty-six states in the US and over a dozen countries.
The Data Engineer delivers data to internal customers to enable actionable analytics. The data delivered includes raw, transformed, and integrated data, provided in either batch or streaming mode and tailored to the various business use cases and downstream applications. To meet these needs, the Data Engineer must command a broad set of skills spanning change data capture (CDC), ETL, big data frameworks, API integration, IoT, data warehousing, business intelligence, and machine learning.
Primary Duties and Responsibilities:
- Integrate multi-plant/system data within the enterprise data lake
- Develop and maintain efficient ingestion and transformation pipelines
- Monitor data pipelines and ensure they meet quality and timeliness SLAs
- Partner with data stewards, analysts, and scientists to ensure data is fit-for-purpose
Experience and Education:
- 3+ years’ experience as a data engineer or in a similar role, such as data developer, data architect, ETL developer, or integration specialist (5+ years preferred)
- Bachelor’s degree in data engineering, computer science, or a related field (or equivalent job experience)
- Master’s degree preferred
- At least one related certification preferred, such as Certified Analytics Professional, Azure Data Engineer Associate, or AWS Big Data Specialty
Required Skills:
- Highly motivated and independent learner
- Experience building data pipelines in cloud environments
- Experience with big data frameworks (e.g., Hadoop, Cloudera, Databricks)
- Deep understanding of database systems, including relational, time-series, and NoSQL
- Experience ingesting data from various sources, such as databases, REST APIs, and IoT devices
- Fluency in SQL
- Fluency in a data preparation scripting language (Python preferred)
- Experience with a data visualization tool (Power BI preferred)
- Familiarity with DevOps, DataOps, and Agile practices
Travel:
- Occasional travel to support sites, attend team meetings, and for training
- Travel up to 15%
This requisition is closed to applications.