Senior Data Engineer
Role Overview
Heitmeyer Consulting is working with a regional bank that is seeking a Senior Data Engineer to support the end-to-end data pipeline that extracts data from FCRM and other internal source systems and delivers it into Verafin, leveraging a Google Cloud Platform (GCP)-based data ecosystem. This role plays a key part in enabling fraud, AML, and BSA operations by ensuring high-quality, well-structured, and reliable data ingestion.
Key Responsibilities
- Design, build, and maintain end-to-end data pipelines from legacy and modern source systems into cloud platforms
- Extract data from FCRM (legacy BSA platform) and additional internal systems
- Use dbt to transform, model, and structure data for Verafin ingestion
- Develop and maintain data pipelines using Python and SQL
- Work within Arvest’s GCP-native data environment
- Ensure data accuracy, reliability, and performance in a regulated financial environment
- Collaborate with data, compliance, and fraud stakeholders to support business needs
Required Qualifications
- Senior-level experience in data engineering
- Strong hands-on expertise in:
  - Python
  - SQL (advanced, production-level usage)
  - dbt (data build tool) for transformations and modeling
  - Google Cloud Platform (GCP)
- Proven experience building data pipelines from application and source systems into cloud platforms
- Ability to work independently in a fully remote environment while collaborating across teams
Nice-to-Have Qualifications
- Experience with fraud, AML, or BSA platforms
- Prior exposure to Verafin
- Experience working with financial services or banking data
- Familiarity with:
  - Fiserv Signature (core banking platform)
  - Legacy system integrations
Location & Schedule
- Fully remote
- Must align to Central Time business hours
- Location preferences (flexible):
  - Arkansas, Missouri, Oklahoma, Kansas
  - North Carolina, Texas, Florida, Georgia
