With a company culture rooted in collaboration, expertise and innovation, we aim to promote progress and inspire our clients, employees, investors and communities to achieve their greatest potential. Our work is the catalyst that helps others achieve their goals. In short, We Enable Possibility℠.
The Data Engineer IV (internally known as Software Engineer IV) will specialize in designing, implementing, and optimizing the Enterprise Data Warehouse platform for efficient storage, retrieval, and management of structured data. Join us as we embark on an exciting modernization journey, transitioning from our legacy platforms to cutting-edge technologies like Power BI and Snowflake. We're seeking passionate individuals to drive this transformation, leveraging their expertise to revamp our data infrastructure and propel us into the future.
*This is a hybrid role that requires attendance in the Raleigh office twice a week.
Responsibilities
- Works with the Agile Program Manager (APM), Digital Product Manager (DPM), and Business Systems Analyst (BSA) to accurately capture stakeholder requests and system specifications and translate them into engineering artifacts, which typically include design specifications, source code, test scripts and test results.
- Leads the end-to-end architecture and implementation of enterprise-grade data solutions using Snowflake.
- Designs, develops, and optimizes scalable ETL/ELT pipelines to support analytics, data science, and business intelligence functions.
- Defines and implements data modeling best practices, including Dimensional Modeling (Kimball) and Data Vault 2.0 methodologies.
- Mentors junior developers and guides them through project execution.
- Consults with enterprise architects to ensure that the engineering realization is in accordance with Enterprise Architecture principles and software development best practices.
- As part of Agile teams, completes software development work which includes application design, coding, code review and testing. Keeps Agile team and APM apprised of project status.
- Offers suggestions to stakeholders on devising effective and efficient approaches to achieve project and program objectives.
- Manages engineering risks by proactively tracking and communicating issues, devising methods to mitigate them, and collaborating with involved parties to implement solutions.
- Liaises with other project and program areas to coordinate interdependencies and resolve issues.
- Supports business units in resolving in-depth user questions and issues, following the production support process and SLAs.
- Maintains a working knowledge of new technology and software engineering standards, practices and tools.
- Provides input to the APM/DPM in the creation of the Product Roadmap and high-level estimates.
- Collaborates with IT management to define and develop documentation and engineering artifact standards, guidelines, processes, and templates.
Required Qualifications
- 8+ years of hands-on experience in data engineering, with a minimum of 3 years in a leadership or architectural role.
- Strong proficiency in Snowflake data warehouse development, architecture, and performance tuning.
- Proven experience designing and implementing data models using Kimball dimensional models and Data Vault 2.0 methodologies.
- 5-7 years of experience in developing reports, preferably using Power BI.
- Expertise in SQL, Python, and orchestration and transformation frameworks such as Airflow, dbt, or Matillion.
- Solid understanding of cloud platforms (AWS, Azure, or GCP) with direct experience integrating cloud services into data pipelines.
- Hands-on experience with DataOps and infrastructure-as-code techniques.
- Hands-on experience designing data platforms that ensure data cataloging, metadata management, lineage, data quality, and governance from day one.
- Excellent analytical, problem-solving, and organizational skills.
- Excellent communication and collaboration skills with both technical and non-technical stakeholders.
- Knowledge of production support processes such as incident and problem management techniques.
Preferred Qualifications
- Snowflake certification (SnowPro Advanced Architect or SnowPro Core).
- Experience with real-time data streaming tools (e.g., Kafka).
- Mortgage industry experience.
Education
- Required knowledge and skills would typically be acquired through a bachelor's degree in computer science, business, or a related field, plus 8 or more years of related experience.
#LI-Hybrid
#LI-ZP1
Do you like solving complex business problems and working with talented colleagues, and do you have an innovative mindset? Arch may be a great fit for you. If this job isn't the right fit but you're interested in working for Arch, create a job alert! Simply create an account and opt in to receive emails when we have job openings that meet your criteria. Join our talent community to share your preferences directly with Arch's Talent Acquisition team.
14500 Arch U.S. MI Services Inc.